Sample records for large scale project

  1. Managing Risk and Uncertainty in Large-Scale University Research Projects

    ERIC Educational Resources Information Center

    Moore, Sharlissa; Shangraw, R. F., Jr.

    2011-01-01

    Both publicly and privately funded research projects managed by universities are growing in size and scope. Complex, large-scale projects (over $50 million) pose new management challenges and risks for universities. This paper explores the relationship between project success and a variety of factors in large-scale university projects. First, we…

  2. Studies on combined model based on functional objectives of large scale complex engineering

    NASA Astrophysics Data System (ADS)

    Yuting, Wang; Jingchun, Feng; Jiabao, Sun

    2018-03-01

    Large-scale complex engineering involves various functions, and each function is realized through the completion of one or more projects, so the combined projects affecting each function must be identified. Based on the types of project portfolio, the relationships between projects and their functional objectives were analyzed. On that premise, portfolio-project techniques based on functional objectives were introduced, and we studied and proposed principles for portfolio-project techniques based on the functional objectives of projects. In addition, the processes of combined projects were also constructed. With the help of portfolio-project techniques based on the functional objectives of projects, our research findings lay a good foundation for the portfolio management of large-scale complex engineering.

  3. Output Control Technologies for a Large-scale PV System Considering Impacts on a Power Grid

    NASA Astrophysics Data System (ADS)

    Kuwayama, Akira

    The mega-solar demonstration project named “Verification of Grid Stabilization with Large-scale PV Power Generation Systems” was completed in March 2011 at Wakkanai, the northernmost city of Japan. The major objectives of this project were to evaluate the adverse impacts of large-scale PV power generation systems connected to the power grid and to develop output control technologies with an integrated battery storage system. This paper describes the outline and results of this project. The results show the effectiveness of the battery storage system and of the proposed output control methods in ensuring the stable operation of power grids with a large-scale PV system. NEDO, the New Energy and Industrial Technology Development Organization of Japan, conducted this project, and HEPCO, Hokkaido Electric Power Co., Inc., managed the overall project.

  4. Developing Renewable Energy Projects Larger Than 10 MWs at Federal Facilities (Book)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2013-03-01

    To accomplish Federal goals for renewable energy, sustainability, and energy security, large-scale renewable energy projects must be developed and constructed on Federal sites at a significant scale with significant private investment. The U.S. Department of Energy's Federal Energy Management Program (FEMP) helps Federal agencies meet these goals and assists agency personnel in navigating the complexities of developing such projects and attracting the necessary private capital to complete them. This guide is intended to provide a general resource that will begin to develop the Federal employee's awareness and understanding of the project developer's operating environment and the private sector's awareness and understanding of the Federal environment. Because the vast majority of the investment that is required to meet the goals for large-scale renewable energy projects will come from the private sector, this guide has been organized to match Federal processes with typical phases of commercial project development. The main purpose of this guide is to provide a project development framework to allow the Federal Government, private developers, and investors to work in a coordinated fashion on large-scale renewable energy projects. The framework includes key elements that describe a successful, financially attractive large-scale renewable energy project.

  5. Developing Renewable Energy Projects Larger Than 10 MWs at Federal Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2013-03-01

    To accomplish Federal goals for renewable energy, sustainability, and energy security, large-scale renewable energy projects must be developed and constructed on Federal sites at a significant scale with significant private investment. For the purposes of this Guide, large-scale Federal renewable energy projects are defined as renewable energy facilities larger than 10 megawatts (MW) that are sited on Federal property and lands and typically financed and owned by third parties. The U.S. Department of Energy’s Federal Energy Management Program (FEMP) helps Federal agencies meet these goals and assists agency personnel in navigating the complexities of developing such projects and attracting the necessary private capital to complete them. This Guide is intended to provide a general resource that will begin to develop the Federal employee’s awareness and understanding of the project developer’s operating environment and the private sector’s awareness and understanding of the Federal environment. Because the vast majority of the investment that is required to meet the goals for large-scale renewable energy projects will come from the private sector, this Guide has been organized to match Federal processes with typical phases of commercial project development. FEMP collaborated with the National Renewable Energy Laboratory (NREL) and professional project developers on this Guide to ensure that Federal projects have key elements recognizable to private sector developers and investors. The main purpose of this Guide is to provide a project development framework to allow the Federal Government, private developers, and investors to work in a coordinated fashion on large-scale renewable energy projects. The framework includes key elements that describe a successful, financially attractive large-scale renewable energy project. This framework begins the translation between the Federal and private sector operating environments. When viewing the overall…

  6. What Will the Neighbors Think? Building Large-Scale Science Projects Around the World

    ScienceCinema

    Jones, Craig; Mrotzek, Christian; Toge, Nobu; Sarno, Doug

    2017-12-22

    Public participation is an essential ingredient for turning the International Linear Collider into a reality. Wherever the proposed particle accelerator is sited in the world, its neighbors -- in any country -- will have something to say about hosting a 35-kilometer-long collider in their backyards. When it comes to building large-scale physics projects, almost every laboratory has a story to tell. Three case studies from Japan, Germany and the US will be presented to examine how community relations are handled in different parts of the world. How do particle physics laboratories interact with their local communities? How do neighbors react to building large-scale projects in each region? How can the lessons learned from past experiences help in building the next big project? These and other questions will be discussed to engage the audience in an active dialogue about how a large-scale project like the ILC can be a good neighbor.

  7. Designing an External Evaluation of a Large-Scale Software Development Project.

    ERIC Educational Resources Information Center

    Collis, Betty; Moonen, Jef

    This paper describes the design and implementation of the evaluation of the POCO Project, a large-scale national software project in the Netherlands which incorporates the perspective of an evaluator throughout the entire span of the project, and uses the experiences gained from it to suggest an evaluation procedure that could be applied to other…

  8. 78 FR 18348 - Submission for OMB Review; Use of Project Labor Agreements for Federal Construction Projects

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-26

    ... agreement (PLA), as they may decide appropriate, on large-scale construction projects, where the total cost... procurement. A PLA is a pre-hire collective bargaining agreement with one or more labor organizations that... the use of a project labor agreement (PLA), as they may decide appropriate, on large-scale...

  9. Project Management Life Cycle Models to Improve Management in High-rise Construction

    NASA Astrophysics Data System (ADS)

    Burmistrov, Andrey; Siniavina, Maria; Iliashenko, Oksana

    2018-03-01

    The paper describes a possibility to improve project management in high-rise building construction through the use of various Project Management Life Cycle Models (PMLC models) based on traditional and agile project management approaches. Moreover, the paper describes how splitting the whole large-scale project into a "project chain" improves the manageability of large-scale building projects and increases the efficiency of the activities of all participants in such projects.

  10. Geospatial Optimization of Siting Large-Scale Solar Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macknick, Jordan; Quinby, Ted; Caulfield, Emmet

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Gang

    Mid-latitude extreme weather events are responsible for a large part of climate-related damage. Yet large uncertainties remain in climate model projections of heat waves, droughts, and heavy rain/snow events on regional scales, limiting our ability to effectively use these projections for climate adaptation and mitigation. These uncertainties can be attributed to both the lack of spatial resolution in the models, and to the lack of a dynamical understanding of these extremes. The approach of this project is to relate the fine-scale features to the large scales in current climate simulations, seasonal re-forecasts, and climate change projections in a very wide range of models, including the atmospheric and coupled models of ECMWF over a range of horizontal resolutions (125 to 10 km), aqua-planet configuration of the Model for Prediction Across Scales and High Order Method Modeling Environments (resolutions ranging from 240 km – 7.5 km) with various physics suites, and selected CMIP5 model simulations. The large scale circulation will be quantified both on the basis of the well tested preferred circulation regime approach, and very recently developed measures, the finite amplitude Wave Activity (FAWA) and its spectrum. The fine scale structures related to extremes will be diagnosed following the latest approaches in the literature. The goal is to use the large scale measures as indicators of the probability of occurrence of the finer scale structures, and hence extreme events. These indicators will then be applied to the CMIP5 models and time-slice projections of a future climate.

  12. Psychology in an Interdisciplinary Setting: A Large-Scale Project to Improve University Teaching

    ERIC Educational Resources Information Center

    Koch, Franziska D.; Vogt, Joachim

    2015-01-01

    At a German university of technology, a large-scale project was funded as a part of the "Quality Pact for Teaching", a programme launched by the German Federal Ministry of Education and Research to improve the quality of university teaching and study conditions. The project aims at intensifying interdisciplinary networking in teaching,…

  13. Inquiry-Based Educational Design for Large-Scale High School Astronomy Projects Using Real Telescopes

    ERIC Educational Resources Information Center

    Fitzgerald, Michael; McKinnon, David H.; Danaia, Lena

    2015-01-01

    In this paper, we outline the theory behind the educational design used to implement a large-scale high school astronomy education project. This design was created in response to the realization of ineffective educational design in the initial early stages of the project. The new design follows an iterative improvement model where the materials…

  14. Intercomparison Project on Parameterizations of Large-Scale Dynamics for Simulations of Tropical Convection

    NASA Astrophysics Data System (ADS)

    Sobel, A. H.; Wang, S.; Bellon, G.; Sessions, S. L.; Woolnough, S.

    2013-12-01

    Parameterizations of large-scale dynamics have been developed in the past decade for studying the interaction between tropical convection and large-scale dynamics, based on our physical understanding of the tropical atmosphere. A principal advantage of these methods is that they offer a pathway to attack the key question of what controls large-scale variations of tropical deep convection. These methods have been used with both single column models (SCMs) and cloud-resolving models (CRMs) to study the interaction of deep convection with several kinds of environmental forcings. While much has been learned from these efforts, different groups' efforts are somewhat hard to compare. Different models, different versions of the large-scale parameterization methods, and experimental designs that differ in other ways are used. It is not obvious which choices are consequential to the scientific conclusions drawn and which are not. The methods have matured to the point that there is value in an intercomparison project. In this context, the Global Atmospheric Systems Study - Weak Temperature Gradient (GASS-WTG) project was proposed at the Pan-GASS meeting in September 2012. The weak temperature gradient approximation is one method to parameterize large-scale dynamics, and is used in the project name for historical reasons and simplicity, but another method, the damped gravity wave (DGW) method, will also be used in the project. The goal of the GASS-WTG project is to develop community understanding of the parameterization methods currently in use. Their strengths, weaknesses, and functionality in models with different physics and numerics will be explored in detail, and their utility to improve our understanding of tropical weather and climate phenomena will be further evaluated. This presentation will introduce the intercomparison project, including background, goals, and overview of the proposed experimental design. Interested groups will be invited to join (it will not be too late), and preliminary results will be presented.

  15. Multimode resource-constrained multiple project scheduling problem under fuzzy random environment and its application to a large scale hydropower construction project.

    PubMed

    Xu, Jiuping; Feng, Cuiying

    2014-01-01

    This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method.
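
The priority-based decoding that the abstract mentions is the easiest part of this machinery to illustrate. Below is a hypothetical, minimal sketch (not the paper's code, which couples such a decoder to a hybrid particle swarm optimizer and fuzzy random parameters): a serial schedule-generation scheme that turns an activity priority list into start times under a single renewable-resource constraint. All names and the example data are illustrative assumptions.

```python
# Hypothetical sketch of the priority-based decoding step only; the paper's
# method additionally evolves the priority lists with particle swarm
# optimization and handles fuzzy random parameters.
def serial_sgs(durations, demands, capacity, priorities, preds):
    """Schedule activities; a lower priority value means scheduled earlier."""
    finish, start = {}, {}
    usage = [0] * (2 * sum(durations) + 1)   # resource usage per time period
    unscheduled = set(range(len(durations)))
    while unscheduled:
        # Pick the best-priority activity whose predecessors are all done.
        eligible = [j for j in unscheduled
                    if all(p in finish for p in preds[j])]
        j = min(eligible, key=lambda a: priorities[a])
        t = max((finish[p] for p in preds[j]), default=0)
        # Shift right until the resource profile admits the activity.
        while any(usage[t + k] + demands[j] > capacity
                  for k in range(durations[j])):
            t += 1
        for k in range(durations[j]):
            usage[t + k] += demands[j]
        start[j], finish[j] = t, t + durations[j]
        unscheduled.remove(j)
    return start

# Four activities, one resource of capacity 3; activity 3 needs 1 and 2 done.
start = serial_sgs(durations=[3, 2, 2, 1], demands=[2, 1, 2, 1], capacity=3,
                   priorities=[0, 1, 2, 3],
                   preds={0: [], 1: [0], 2: [0], 3: [1, 2]})
print(start)
```

In a PSO wrapper, each particle would encode one such priority vector, and the decoder's makespan (or a cost/quality trade-off) would serve as the fitness.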

  16. Multimode Resource-Constrained Multiple Project Scheduling Problem under Fuzzy Random Environment and Its Application to a Large Scale Hydropower Construction Project

    PubMed Central

    Xu, Jiuping

    2014-01-01

    This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method. PMID:24550708

  17. Large Scale eHealth Deployment in Europe: Insights from Concurrent Use of Standards.

    PubMed

    Eichelberg, Marco; Chronaki, Catherine

    2016-01-01

    Large-scale eHealth deployment projects face a major challenge when called to select the right set of standards and tools to achieve sustainable interoperability in an ecosystem that includes both legacy systems and new systems reflecting technological trends and progress. No single standard covers all the needs of an eHealth project, and there is a multitude of overlapping and sometimes competing standards that can be employed to define document formats, terminology, and communication protocols, mirroring alternative technical approaches and schools of thought. eHealth projects need to respond to the important question of how alternative or inconsistently implemented standards and specifications can be used to ensure practical interoperability and long-term sustainability in large-scale eHealth deployment. In the eStandards project, 19 European case studies reporting from R&D and large-scale eHealth deployment and policy projects were analyzed. Although this study is not exhaustive, by reflecting on the concepts, standards, and tools for concurrent use and on the successes, failures, and lessons learned, this paper offers practical insights into how eHealth deployment projects can make the most of the available eHealth standards and tools, and how standards and profile developing organizations can serve users while embracing sustainability and technical innovation.

  18. A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project.

    PubMed

    Ewers, Robert M; Didham, Raphael K; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L; Turner, Edgar C

    2011-11-27

    Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification.

  19. A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project

    PubMed Central

    Ewers, Robert M.; Didham, Raphael K.; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D.; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L.; Turner, Edgar C.

    2011-01-01

    Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification. PMID:22006969

  20. Development of a database system for mapping insertional mutations onto the mouse genome with large-scale experimental data

    PubMed Central

    2009-01-01

    Background Insertional mutagenesis is an effective method for functional genomic studies in various organisms. It can rapidly generate easily tractable mutations. A large-scale insertional mutagenesis with the piggyBac (PB) transposon is currently being performed in mice at the Institute of Developmental Biology and Molecular Medicine (IDM), Fudan University in Shanghai, China. This project is carried out via collaborations among multiple groups overseeing interconnected experimental steps and generates a large volume of experimental data continuously. Therefore, the project calls for an efficient database system for recording, management, statistical analysis, and information exchange. Results This paper presents a database application called MP-PBmice (insertional mutation mapping system of PB Mutagenesis Information Center), which is developed to serve the on-going large-scale PB insertional mutagenesis project. A lightweight enterprise-level development framework, Struts-Spring-Hibernate, is used here to ensure constructive and flexible support to the application. The MP-PBmice database system has three major features: strict access control, efficient workflow control, and good expandability. It supports the collaboration among different groups that enter data and exchange information on a daily basis, and is capable of providing real-time progress reports for the whole project. MP-PBmice can be easily adapted for other large-scale insertional mutation mapping projects, and the source code of this software is freely available at http://www.idmshanghai.cn/PBmice. Conclusion MP-PBmice is a web-based application for large-scale insertional mutation mapping onto the mouse genome, implemented with the widely used framework Struts-Spring-Hibernate. This system is already in use by the on-going genome-wide PB insertional mutation mapping project at IDM, Fudan University. PMID:19958505

  1. Large-scale neuromorphic computing systems

    NASA Astrophysics Data System (ADS)

    Furber, Steve

    2016-10-01

    Neuromorphic computing covers a diverse range of approaches to information processing, all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering, then focus on some of the principal current large-scale projects, their main features, how their approaches are complementary and distinct, their advantages and drawbacks, and highlight the sorts of capabilities that each can deliver to neural modellers.

  2. Optical mapping and its potential for large-scale sequencing projects.

    PubMed

    Aston, C; Mishra, B; Schwartz, D C

    1999-07-01

    Physical mapping has been rediscovered as an important component of large-scale sequencing projects. Restriction maps provide landmark sequences at defined intervals, and high-resolution restriction maps can be assembled from ensembles of single molecules by optical means. Such optical maps can be constructed from both large-insert clones and genomic DNA, and are used as a scaffold for accurately aligning sequence contigs generated by shotgun sequencing.

  3. A numerical projection technique for large-scale eigenvalue problems

    NASA Astrophysics Data System (ADS)

    Gamillscheg, Ralf; Haase, Gundolf; von der Linden, Wolfgang

    2011-10-01

    We present a new numerical technique to solve large-scale eigenvalue problems. It is based on the projection technique used in strongly correlated quantum many-body systems, in which an effective approximate model of smaller complexity is first constructed by projecting out high-energy degrees of freedom, and the resulting model is then solved by a standard eigenvalue solver. Here we introduce a generalization of this idea in which both steps are performed numerically and which, in contrast to the standard projection technique, converges in principle to the exact eigenvalues. This approach is applicable not just to eigenvalue problems encountered in many-body systems but also to those arising in other areas of research that lead to large-scale eigenvalue problems for matrices which have, roughly speaking, a pronounced dominant diagonal part. We present detailed studies of the approach guided by two many-body models.
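
The two-step idea described above (construct a smaller effective model by projection, then hand it to a standard solver) can be illustrated with a plain Rayleigh-Ritz-style sketch. This is a generic illustration on assumed data, not the authors' generalized method; the matrix, subspace size, and basis choice are all assumptions.

```python
import numpy as np

# Generic projection sketch (not the authors' algorithm): for a matrix with
# a pronounced dominant diagonal, keep only the basis vectors tied to the
# smallest diagonal entries and solve the much smaller effective problem.
rng = np.random.default_rng(0)
n, k = 200, 20

off = rng.normal(scale=0.05, size=(n, n))
A = np.diag(np.arange(1.0, n + 1.0)) + (off + off.T) / 2  # dominant diagonal

idx = np.argsort(np.diag(A))[:k]     # low-"energy" degrees of freedom
V = np.eye(n)[:, idx]                # n x k projection basis

A_eff = V.T @ A @ V                  # k x k effective model
approx = np.linalg.eigvalsh(A_eff)[0]
exact = np.linalg.eigvalsh(A)[0]
print(approx, exact)                 # Ritz estimate vs. true smallest eigenvalue
```

By the variational principle, the projected estimate bounds the true smallest eigenvalue from above; the numerical generalization in the paper is designed to close that remaining gap.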

  4. Utility-Scale Solar 2014. An Empirical Analysis of Project Cost, Performance, and Pricing Trends in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolinger, Mark; Seel, Joachim

    2015-09-01

    Other than the nine Solar Energy Generation Systems (“SEGS”) parabolic trough projects built in the 1980s, virtually no large-scale or “utility-scale” solar projects – defined here to include any groundmounted photovoltaic (“PV”), concentrating photovoltaic (“CPV”), or concentrating solar thermal power (“CSP”) project larger than 5 MW AC – existed in the United States prior to 2007. By 2012 – just five years later – utility-scale had become the largest sector of the overall PV market in the United States, a distinction that was repeated in both 2013 and 2014 and that is expected to continue for at least the next few years. Over this same short period, CSP also experienced a bit of a renaissance in the United States, with a number of large new parabolic trough and power tower systems – some including thermal storage – achieving commercial operation. With this critical mass of new utility-scale projects now online and in some cases having operated for a number of years (generating not only electricity, but also empirical data that can be mined), the rapidly growing utility-scale sector is ripe for analysis. This report, the third edition in an ongoing annual series, meets this need through in-depth, annually updated, data-driven analysis of not just installed project costs or prices – i.e., the traditional realm of solar economics analyses – but also operating costs, capacity factors, and power purchase agreement (“PPA”) prices from a large sample of utility-scale solar projects in the United States. Given its current dominance in the market, utility-scale PV also dominates much of this report, though data from CPV and CSP projects are presented where appropriate.

  5. Investigating and Stimulating Primary Teachers' Attitudes Towards Science: Summary of a Large-Scale Research Project

    ERIC Educational Resources Information Center

    Walma van der Molen, Juliette; van Aalderen-Smeets, Sandra

    2013-01-01

    Attention to the attitudes of primary teachers towards science is of fundamental importance to research on primary science education. The current article describes a large-scale research project that aims to overcome three main shortcomings in attitude research, i.e. lack of a strong theoretical concept of attitude, methodological flaws in…

  6. The impact of large-scale, long-term optical surveys on pulsating star research

    NASA Astrophysics Data System (ADS)

    Soszyński, Igor

    2017-09-01

    The era of large-scale photometric variability surveys began a quarter of a century ago, when three microlensing projects - EROS, MACHO, and OGLE - started their operation. These surveys initiated a revolution in the field of variable stars and in the next years they inspired many new observational projects. Large-scale optical surveys multiplied the number of variable stars known in the Universe. The huge, homogeneous and complete catalogs of pulsating stars, such as Cepheids, RR Lyrae stars, or long-period variables, offer an unprecedented opportunity to calibrate and test the accuracy of various distance indicators, to trace the three-dimensional structure of the Milky Way and other galaxies, to discover exotic types of intrinsically variable stars, or to study previously unknown features and behaviors of pulsators. We present historical and recent findings on various types of pulsating stars obtained from the optical large-scale surveys, with particular emphasis on the OGLE project which currently offers the largest photometric database among surveys for stellar variability.

  7. Plans for Embedding ICTs into Teaching and Learning through a Large-Scale Secondary Education Reform in the Country of Georgia

    ERIC Educational Resources Information Center

    Richardson, Jayson W.; Sales, Gregory; Sentocnik, Sonja

    2015-01-01

    Integrating ICTs into international development projects is common. However, focusing on how ICTs support leading, teaching, and learning is often overlooked. This article describes a team's approach to technology integration into the design of a large-scale, five year, teacher and leader professional development project in the country of Georgia.…

  8. Analysis of central enterprise architecture elements in models of six eHealth projects.

    PubMed

    Virkanen, Hannu; Mykkänen, Juha

    2014-01-01

    Large-scale initiatives for eHealth services have been established in many countries at the regional or national level. The use of Enterprise Architecture has been suggested as a methodology to govern and support the initiation, specification and implementation of large-scale initiatives, including the governance of business changes as well as information technology. This study reports an analysis of six health IT projects in relation to Enterprise Architecture elements, focusing on central EA elements and viewpoints in different projects.

  9. CONSIDERATIONS FOR A REGULATORY FRAMEWORK FOR LARGE-SCALE GEOLOGIC SEQUESTRATION OF CARBON DIOXIDE: A NORTH AMERICAN PERSPECTIVE

    EPA Science Inventory

    Large scale geologic sequestration (GS) of carbon dioxide poses a novel set of challenges for regulators. This paper focuses on the unique needs of large scale GS projects in light of the existing regulatory regimes in the United States and Canada and identifies several differen...

  10. StePS: Stereographically Projected Cosmological Simulations

    NASA Astrophysics Data System (ADS)

    Rácz, Gábor; Szapudi, István; Csabai, István; Dobos, László

    2018-05-01

    StePS (Stereographically Projected Cosmological Simulations) compactifies the infinite spatial extent of the Universe into a finite sphere with isotropic boundary conditions to simulate the evolution of the large-scale structure. This eliminates the need for periodic boundary conditions, which are a numerical convenience unsupported by observation and which modify the law of force on large scales in an unrealistic fashion. StePS uses stereographic projection for space compactification and naive O(N²) force calculation; this arrives at a correlation function of the same quality more quickly than standard (tree or P3M) algorithms with similar spatial and mass resolution. The O(N²) force calculation is easy to adapt to modern graphics cards, hence StePS can function as a high-speed prediction tool for modern large-scale surveys.
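The direct O(N²) pairwise summation that the abstract credits with being GPU-friendly can be sketched in a few lines of NumPy; the Plummer-style softening length below is an assumed numerical detail for illustration, not taken from the StePS code.

```python
import numpy as np

def pairwise_forces(pos, mass, G=1.0, eps=1e-3):
    """Direct O(N^2) gravitational force summation.

    pos: (N, 3) positions; mass: (N,) masses. A Plummer-style
    softening eps avoids the singularity at zero separation
    (an assumed numerical detail, not part of the StePS paper).
    """
    n = len(mass)
    forces = np.zeros_like(pos)
    for i in range(n):
        dr = pos - pos[i]                         # vectors from i to every j
        r2 = np.einsum('ij,ij->i', dr, dr) + eps**2
        r2[i] = np.inf                            # exclude self-interaction
        forces[i] = G * mass[i] * np.sum(
            (mass / r2**1.5)[:, None] * dr, axis=0)
    return forces
```

By Newton's third law, the forces on an isolated pair of equal masses come out equal and opposite, which makes a convenient sanity check before porting the inner loop to a GPU kernel.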

  11. Newly invented biobased materials from low-carbon, diverted waste fibers: research methods, testing, and full-scale application in a case study structure

    Treesearch

    Julee A Herdt; John Hunt; Kellen Schauermann

    2016-01-01

    This project demonstrates newly invented, biobased construction materials developed by applying low-carbon, biomass waste sources through the authors' engineered fiber processes and technology. If manufactured and applied at large scale, the project inventions can divert large volumes of cellulose waste into high-performance, low embodied energy, environmental construction...

  12. Project BALLOTS: Bibliographic Automation of Large Library Operations Using a Time-Sharing System. Progress Report (3/27/69 - 6/26/69).

    ERIC Educational Resources Information Center

    Veaner, Allen B.

    Project BALLOTS is a large-scale library automation development project of the Stanford University Libraries which has demonstrated the feasibility of conducting on-line interactive searches of complex bibliographic files, with a large number of users working simultaneously in the same or different files. This report documents the continuing…

  13. Implementing Projects in Calculus on a Large Scale at the University of South Florida

    ERIC Educational Resources Information Center

    Fox, Gordon A.; Campbell, Scott; Grinshpan, Arcadii; Xu, Xiaoying; Holcomb, John; Bénéteau, Catherine; Lewis, Jennifer E.; Ramachandran, Kandethody

    2017-01-01

    This paper describes the development of a program of project-based learning in Calculus courses at a large urban research university. In this program, students developed research projects in consultation with a faculty advisor in their major, and supervised by their calculus instructors. Students wrote up their projects in a prescribed format…

  14. How Robust Is Your Project? From Local Failures to Global Catastrophes: A Complex Networks Approach to Project Systemic Risk.

    PubMed

    Ellinas, Christos; Allan, Neil; Durugbo, Christopher; Johansson, Anders

    2015-01-01

    Current societal requirements necessitate the effective delivery of complex projects that can do more while using less. Yet, recent large-scale project failures suggest that our ability to successfully deliver them is still in its infancy. Such failures can be seen to arise through various failure mechanisms; this work focuses on one such mechanism. Specifically, it examines the likelihood of a project sustaining a large-scale catastrophe, as triggered by a single task failure and delivered via a cascading process. To do so, an analytical model was developed and tested on an empirical dataset by means of numerical simulation. This paper makes three main contributions. First, it provides a methodology to identify the tasks most capable of impacting a project. In doing so, it is noted that a significant number of tasks induce no cascades, while a handful are capable of triggering surprisingly large ones. Secondly, it illustrates that crude task characteristics cannot aid in identifying them, highlighting the complexity of the underlying process and the utility of this approach. Thirdly, it draws parallels with systems encountered within the natural sciences by noting the emergence of self-organised criticality, commonly found within natural systems. These findings strengthen the need to account for the structural intricacies of a project's underlying task precedence structure, as they can provide the conditions upon which large-scale catastrophes materialise.
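The cascade mechanism described above can be sketched as breadth-first propagation over a task precedence graph; the toy graph and the worst-case (deterministic) propagation rule below are illustrative assumptions, not the paper's calibrated model.

```python
from collections import deque

def cascade_size(successors, seed):
    """Deterministic worst-case cascade: a failed task fails every task
    that directly or indirectly depends on it.

    successors: dict mapping task -> list of dependent tasks (a DAG).
    Returns the set of failed tasks triggered by `seed`.
    """
    failed = {seed}
    queue = deque([seed])
    while queue:
        task = queue.popleft()
        for dep in successors.get(task, ()):
            if dep not in failed:
                failed.add(dep)
                queue.append(dep)
    return failed

# Toy precedence graph: failing "A" cascades project-wide, while the
# terminal task "D" induces no cascade at all -- mirroring the paper's
# observation that only a handful of tasks can trigger large cascades.
project = {"A": ["B", "C"], "B": ["D"], "C": ["D"]}
```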

  15. Internationalization Measures in Large Scale Research Projects

    NASA Astrophysics Data System (ADS)

    Soeding, Emanuel; Smith, Nancy

    2017-04-01

    Large-scale research projects (LSRP) often serve as flagships used by universities or research institutions to demonstrate their performance and capability to stakeholders and other interested parties. As global competition among universities for the recruitment of the brightest brains has increased, effective internationalization measures have become hot topics for universities and LSRP alike. Nevertheless, most projects and universities have little experience of how to conduct these measures and make internationalization a cost-efficient and useful activity. Furthermore, such undertakings constantly have to be justified to the project PIs as important, valuable tools for improving the capacity of the project and the research location. A variety of measures are suited to supporting universities in international recruitment, including institutional partnerships, research marketing, a welcome culture, support for science mobility and an effective alumni strategy. These activities, although often conducted by different university entities, are interlocked and can be very powerful if interfaced effectively. On this poster we display a number of internationalization measures for various target groups and identify interfaces between project management, university administration, researchers and international partners to work together, exchange information and improve processes, in order to recruit, support and retain the brightest heads for a project.

  16. Talking About The Smokes: a large-scale, community-based participatory research project.

    PubMed

    Couzos, Sophia; Nicholson, Anna K; Hunt, Jennifer M; Davey, Maureen E; May, Josephine K; Bennet, Pele T; Westphal, Darren W; Thomas, David P

    2015-06-01

    To describe the Talking About The Smokes (TATS) project according to the World Health Organization guiding principles for conducting community-based participatory research (PR) involving indigenous peoples, to assist others planning large-scale PR projects. The TATS project was initiated in Australia in 2010 as part of the International Tobacco Control Policy Evaluation Project, and surveyed a representative sample of 2522 Aboriginal and Torres Strait Islander adults to assess the impact of tobacco control policies. The PR process of the TATS project, which aimed to build partnerships to create equitable conditions for knowledge production, was mapped and summarised onto a framework adapted from the WHO principles. The mapped processes covered consultation and approval, partnerships and research agreements, communication, funding, ethics and consent, and the data and benefits of the research. The TATS project involved baseline and follow-up surveys conducted in 34 Aboriginal community-controlled health services and one Torres Strait community. Consistent with the WHO PR principles, the TATS project built on community priorities and strengths through strategic partnerships from project inception, and demonstrated the value of research agreements and trusting relationships to foster shared decision making, capacity building and a commitment to Indigenous data ownership. Community-based PR methodology, by definition, needs adaptation to local settings and priorities. The TATS project demonstrates that large-scale research can be participatory, with strong Indigenous community engagement and benefits.

  17. Large-scale water projects in the developing world: Revisiting the past and looking to the future

    NASA Astrophysics Data System (ADS)

    Sivakumar, Bellie; Chen, Ji

    2014-05-01

    Over the past half-century or so, the developing world has been witnessing a significant increase in freshwater demands due to a combination of factors, including population growth, increased food demand, improved living standards, and water quality degradation. Since significant variability exists in rainfall and river flow in both space and time, large-scale storage and distribution of water has become a key means of meeting these increasing demands. In this regard, large dams and water transfer schemes (including river-linking schemes and virtual water trades) have been playing a key role. While the benefits of such large-scale projects in supplying water for domestic, irrigation, industrial, hydropower, recreational, and other uses both in the countries of their development and in other countries are undeniable, concerns about their negative impacts, such as high initial costs and damage to ecosystems (e.g. river environments and species) and the socio-economic fabric (e.g. relocation and socio-economic changes of affected people), have also been increasing in recent years. These concerns have led to serious debates on the role of large-scale water projects in the developing world and on their future, but the often one-sided nature of such debates has inevitably failed to yield fruitful outcomes thus far. The present study aims to offer a far more balanced perspective on this issue. First, it recognizes and emphasizes the need for still more large-scale water structures in the developing world in the future, due to the continuing increase in water demands, inefficiency in water use (especially in the agricultural sector), and the absence of equivalent and reliable alternatives. Next, it reviews a few important success and failure stories of large-scale water projects in the developing world (and in the developed world), in an effort to arrive at a balanced view on the future role of such projects.
Then, it discusses some major challenges in future water planning and management, with proper consideration to potential technological developments and new options. Finally, it highlights the urgent need for a broader framework that integrates the physical science-related aspects ("hard sciences") and the human science-related aspects ("soft sciences").

  18. NASA: Assessments of Selected Large-Scale Projects

    DTIC Science & Technology

    2011-03-01

    …probes designed to explore the Martian surface, to satellites equipped with advanced sensors to study the Earth, to telescopes intended to explore the…

  19. Theme II Joint Work Plan -2017 Collaboration and Knowledge Sharing on Large-scale Demonstration Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiaoliang; Stauffer, Philip H.

    This effort is designed to expedite learning from existing and planned large demonstration projects and their associated research through effective knowledge sharing among participants in the US and China.

  20. Overview of Opportunities for Co-Location of Solar Energy Technologies and Vegetation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macknick, Jordan; Beatty, Brenda; Hill, Graham

    2013-12-01

    Large-scale solar facilities have the potential to contribute significantly to national electricity production. Many solar installations are large-scale or utility-scale, with a capacity over 1 MW and connected directly to the electric grid. Large-scale solar facilities offer an opportunity to achieve economies of scale in solar deployment, yet there have been concerns about the amount of land required for solar projects and the impact of solar projects on local habitat. During the site preparation phase for utility-scale solar facilities, developers often grade land and remove all vegetation to minimize installation and operational costs, prevent plants from shading panels, and minimize potential fire or wildlife risks. However, the common site preparation practice of removing vegetation can be avoided in certain circumstances, and there have been successful examples where solar facilities have been co-located with agricultural operations or have native vegetation growing beneath the panels. In this study we outline some of the impacts that large-scale solar facilities can have on the local environment, provide examples of installations where impacts have been minimized through co-location with vegetation, characterize the types of co-location, and give an overview of the potential benefits from co-location of solar energy projects and vegetation. The varieties of co-location can be replicated or modified for site-specific use at other solar energy installations around the world. We conclude with opportunities to improve upon our understanding of ways to reduce the environmental impacts of large-scale solar installations.

  1. Collaborative Working for Large Digitisation Projects

    ERIC Educational Resources Information Center

    Yeates, Robin; Guy, Damon

    2006-01-01

    Purpose: To explore the effectiveness of large-scale consortia for disseminating local heritage via the web. To describe the creation of a large geographically based cultural heritage consortium in the South East of England and management lessons resulting from a major web site digitisation project. To encourage the improved sharing of experience…

  2. Large-Scale Aerosol Modeling and Analysis

    DTIC Science & Technology

    2009-09-30

    Modeling of Burning Emissions (FLAMBE) project, and other related parameters. Our plans to embed NAAPS inside NOGAPS may need to be put on hold… AOD, FLAMBE and FAROP at FNMOC are supported by 6.4 funding from PMW-120 for “Large-scale Atmospheric Models” and “Small-scale Atmospheric Models”.

  3. Large-Scale Optimization for Bayesian Inference in Complex Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willcox, Karen; Marzouk, Youssef

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT--Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas--Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as ``reduce then sample'' and ``sample then reduce.'' In fact, these two approaches are complementary, and can be used in conjunction with each other.
Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.

  4. Development of a large-scale transportation optimization course.

    DOT National Transportation Integrated Search

    2011-11-01

    "In this project, a course was developed to introduce transportation and logistics applications of large-scale optimization to graduate students. This report details what similar courses exist in other universities, and the methodology used to gath...

  5. Lessons from a Large-Scale Assessment: Results from Conceptual Inventories

    ERIC Educational Resources Information Center

    Thacker, Beth; Dulli, Hani; Pattillo, Dave; West, Keith

    2014-01-01

    We report conceptual inventory results of a large-scale assessment project at a large university. We studied the introduction of materials and instructional methods informed by physics education research (PER) (physics education research-informed materials) into a department where most instruction has previously been traditional and a significant…

  6. Physical habitat monitoring strategy (PHAMS) for reach-scale restoration effectiveness monitoring

    USGS Publications Warehouse

    Jones, Krista L.; O'Daniel, Scott J.; Beechie, Tim J.; Zakrajsek, John; Webster, John G.

    2015-04-14

    Habitat restoration efforts by the Confederated Tribes of the Umatilla Indian Reservation (CTUIR) have shifted from the site scale (1-10 meters) to the reach scale (100-1,000 meters). This shift was in response to the growing scientific emphasis on process-based restoration and to support from the 2007 Accords Agreement with the Bonneville Power Administration. With the increased size of restoration projects, the CTUIR and other agencies are in need of applicable monitoring methods for assessing large-scale changes in river and floodplain habitats following restoration. The goal of the Physical Habitat Monitoring Strategy is to outline methods that are useful for capturing reach-scale changes in surface and groundwater hydrology, geomorphology, hydrologic connectivity, and riparian vegetation at restoration projects. The Physical Habitat Monitoring Strategy aims to avoid duplication with existing regional effectiveness monitoring protocols by identifying complementary reach-scale metrics and methods that may improve the ability of CTUIR and others to detect instream and riparian changes at large restoration projects.

  7. The epistemic culture in an online citizen science project: Programs, antiprograms and epistemic subjects.

    PubMed

    Kasperowski, Dick; Hillman, Thomas

    2018-05-01

    In the past decade, some areas of science have begun turning to masses of online volunteers through open calls for generating and classifying very large sets of data. The purpose of this study is to investigate the epistemic culture of a large-scale online citizen science project, the Galaxy Zoo, that turns to volunteers for the classification of images of galaxies. For this task, we chose to apply the concepts of programs and antiprograms to examine the 'essential tensions' that arise in relation to the mobilizing values of a citizen science project and the epistemic subjects and cultures that are enacted by its volunteers. Our premise is that these tensions reveal central features of the epistemic subjects and distributed cognition of epistemic cultures in these large-scale citizen science projects.

  8. Evaluating Introductory Physics Classes in Light of the ABET Criteria: An Example from the SCALE-UP Project.

    ERIC Educational Resources Information Center

    Saul, Jeffery M.; Deardorff, Duane L.; Abbott, David S.; Allain, Rhett J.; Beichner, Robert J.

    The Student-Centered Activities for Large Enrollment University Physics (SCALE-UP) project at North Carolina State University (NCSU) is developing a curriculum to promote learning through in-class group activities in introductory physics classes up to 100 students. The authors are currently in Phase II of the project using a specially designed…

  9. Large-scale correlations in gas traced by Mg II absorbers around low-mass galaxies

    NASA Astrophysics Data System (ADS)

    Kauffmann, Guinevere

    2018-03-01

    The physical origin of the large-scale conformity in the colours and specific star formation rates of isolated low-mass central galaxies and their neighbours on scales in excess of 1 Mpc is still under debate. One possible scenario is that gas is heated over large scales by feedback from active galactic nuclei (AGNs), leading to coherent modulation of cooling and star formation between well-separated galaxies. In this Letter, the metal line absorption catalogue of Zhu & Ménard is used to probe gas out to large projected radii around a sample of a million galaxies with stellar masses ~10^10 M⊙ and photometric redshifts in the range 0.4 < z < 0.8 selected from Sloan Digital Sky Survey imaging data. This galaxy sample covers an effective volume of 2.2 Gpc³. A statistically significant excess of Mg II absorbers is present around the red low-mass galaxies compared to their blue counterparts out to projected radii of 10 Mpc. In addition, the equivalent width distribution function of Mg II absorbers around low-mass galaxies is shown to be strongly affected by the presence of a nearby (Rp < 2 Mpc) radio-loud AGN out to projected radii of 5 Mpc.

  10. Alternative projections of the impacts of private investment on southern forests: a comparison of two large-scale forest sector models of the United States.

    Treesearch

    Ralph Alig; Darius Adams; John Mills; Richard Haynes; Peter Ince; Robert Moulton

    2001-01-01

    The TAMM/NAPAP/ATLAS/AREACHANGE (TNAA) system and the Forest and Agriculture Sector Optimization Model (FASOM) are two large-scale forestry sector modeling systems that have been employed to analyze the U.S. forest resource situation. The TNAA system of static, spatial equilibrium models has been applied to make 50-year projections of the U.S. forest sector for more...

  11. On a Game of Large-Scale Projects Competition

    NASA Astrophysics Data System (ADS)

    Nikonov, Oleg I.; Medvedeva, Marina A.

    2009-09-01

    The paper is devoted to game-theoretical control problems motivated by economic decision making situations arising in realization of large-scale projects, such as designing and putting into operations the new gas or oil pipelines. A non-cooperative two player game is considered with payoff functions of special type for which standard existence theorems and algorithms for searching Nash equilibrium solutions are not applicable. The paper is based on and develops the results obtained in [1]-[5].

  12. Discovering Beaten Paths in Collaborative Ontology-Engineering Projects using Markov Chains

    PubMed Central

    Walk, Simon; Singer, Philipp; Strohmaier, Markus; Tudorache, Tania; Musen, Mark A.; Noy, Natalya F.

    2014-01-01

    Biomedical taxonomies, thesauri and ontologies in the form of the International Classification of Diseases as a taxonomy or the National Cancer Institute Thesaurus as an OWL-based ontology, play a critical role in acquiring, representing and processing information about human health. With increasing adoption and relevance, biomedical ontologies have also significantly increased in size. For example, the 11th revision of the International Classification of Diseases, which is currently under active development by the World Health Organization, contains nearly 50,000 classes representing a vast variety of different diseases and causes of death. This evolution in terms of size was accompanied by an evolution in the way ontologies are engineered. Because no single individual has the expertise to develop such large-scale ontologies, ontology-engineering projects have evolved from small-scale efforts involving just a few domain experts to large-scale projects that require effective collaboration between dozens or even hundreds of experts, practitioners and other stakeholders. Understanding the way these different stakeholders collaborate will enable us to improve editing environments that support such collaborations. In this paper, we uncover how large ontology-engineering projects, such as the International Classification of Diseases in its 11th revision, unfold by analyzing usage logs of five different biomedical ontology-engineering projects of varying sizes and scopes using Markov chains. We discover intriguing interaction patterns (e.g., which properties users frequently change after specific given ones) that suggest that large collaborative ontology-engineering projects are governed by a few general principles that determine and drive development. 
From our analysis, we identify commonalities and differences between different projects that have implications for project managers, ontology editors, developers and contributors working on collaborative ontology-engineering projects and tools in the biomedical domain. PMID:24953242
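A first-order Markov-chain analysis of an editing log, of the kind this abstract describes, reduces to counting action-to-action transitions and normalising each row; a minimal sketch is below, with hypothetical action names standing in for the real property edits.

```python
from collections import Counter, defaultdict

def transition_matrix(log):
    """Estimate first-order Markov transition probabilities from an
    ordered sequence of user actions (e.g. property edits)."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(log, log[1:]):
        counts[prev][nxt] += 1
    return {a: {b: c / sum(nxts.values()) for b, c in nxts.items()}
            for a, nxts in counts.items()}

# Hypothetical editing log: which property users change after which.
log = ["label", "definition", "parent", "label", "definition", "synonym"]
probs = transition_matrix(log)
# In this toy log, "label" is always followed by "definition", while
# "definition" is followed by "parent" or "synonym" half the time each.
```

Patterns such as "which properties users frequently change after specific given ones" then fall out directly from inspecting the rows of this matrix.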

  13. Discovering beaten paths in collaborative ontology-engineering projects using Markov chains.

    PubMed

    Walk, Simon; Singer, Philipp; Strohmaier, Markus; Tudorache, Tania; Musen, Mark A; Noy, Natalya F

    2014-10-01

    Biomedical taxonomies, thesauri and ontologies in the form of the International Classification of Diseases as a taxonomy or the National Cancer Institute Thesaurus as an OWL-based ontology, play a critical role in acquiring, representing and processing information about human health. With increasing adoption and relevance, biomedical ontologies have also significantly increased in size. For example, the 11th revision of the International Classification of Diseases, which is currently under active development by the World Health Organization, contains nearly 50,000 classes representing a vast variety of different diseases and causes of death. This evolution in terms of size was accompanied by an evolution in the way ontologies are engineered. Because no single individual has the expertise to develop such large-scale ontologies, ontology-engineering projects have evolved from small-scale efforts involving just a few domain experts to large-scale projects that require effective collaboration between dozens or even hundreds of experts, practitioners and other stakeholders. Understanding the way these different stakeholders collaborate will enable us to improve editing environments that support such collaborations. In this paper, we uncover how large ontology-engineering projects, such as the International Classification of Diseases in its 11th revision, unfold by analyzing usage logs of five different biomedical ontology-engineering projects of varying sizes and scopes using Markov chains. We discover intriguing interaction patterns (e.g., which properties users frequently change after specific given ones) that suggest that large collaborative ontology-engineering projects are governed by a few general principles that determine and drive development. 
From our analysis, we identify commonalities and differences between different projects that have implications for project managers, ontology editors, developers and contributors working on collaborative ontology-engineering projects and tools in the biomedical domain. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. Testing the robustness of Citizen Science projects: Evaluating the results of pilot project COMBER.

    PubMed

    Chatzigeorgiou, Giorgos; Faulwetter, Sarah; Dailianis, Thanos; Smith, Vincent Stuart; Koulouri, Panagiota; Dounas, Costas; Arvanitidis, Christos

    2016-01-01

    Citizen Science (CS) as a term covers a great variety of approaches and scopes involving many different fields of science. The number of relevant projects globally has increased significantly in recent years. Large-scale ecological questions can be answered only through extended observation networks, and CS projects can support this effort. Although the need for such projects is apparent, an important part of the scientific community casts doubt on the reliability of CS data sets. The pilot CS project COMBER was created in order to provide evidence to answer this question for coastal marine biodiversity monitoring. The results of the current analysis show that a carefully designed CS project with clear hypotheses, wide participation and data-set validation can be a valuable tool for detecting large-scale, long-term changes in marine biodiversity patterns, and therefore for relevant management and conservation issues.

  15. How uncertain are climate model projections of water availability indicators across the Middle East?

    PubMed

    Hemming, Debbie; Buontempo, Carlo; Burke, Eleanor; Collins, Mat; Kaye, Neil

    2010-11-28

    The projection of robust regional climate changes over the next 50 years presents a considerable challenge for the current generation of climate models. Water cycle changes are particularly difficult to model in this area because major uncertainties exist in the representation of processes such as large-scale and convective rainfall and their feedback with surface conditions. We present climate model projections and uncertainties in water availability indicators (precipitation, run-off and drought index) for the 1961-1990 and 2021-2050 periods. Ensembles from two global climate models (GCMs) and one regional climate model (RCM) are used to examine different elements of uncertainty. Although all three ensembles capture the general distribution of observed annual precipitation across the Middle East, the RCM is consistently wetter than observations, especially over the mountainous areas. All future projections show decreasing precipitation (ensemble median between -5 and -25%) in coastal Turkey and parts of Lebanon, Syria and Israel, with consistent run-off and drought index changes. The Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) GCM ensemble exhibits drying across the north of the region, whereas the Met Office Hadley Centre Quantifying Uncertainties in Model Projections-Atmospheric (QUMP-A) GCM and RCM ensembles show slight drying in the north and significant wetting in the south. RCM projections also show greater sensitivity (both wetter and drier) and a wider uncertainty range than QUMP-A. The nature of these uncertainties suggests that both large-scale circulation patterns, which influence region-wide drying/wetting patterns, and regional-scale processes, which affect localized water availability, are important sources of uncertainty in these projections. 
To reduce large uncertainties in water availability projections, it is suggested that efforts would be well placed to focus on the understanding and modelling of both large-scale processes and their teleconnections with Middle East climate and localized processes involved in orographic precipitation.
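    An ensemble-median change such as the one quoted above is computed member by member; a minimal sketch with hypothetical precipitation values (illustrative only, not the paper's data):

```python
# Sketch: percentage change in a water availability indicator between two
# periods, summarized as the ensemble median. Member values are invented
# for illustration; they are not the study's data.
from statistics import median

# Hypothetical annual-mean precipitation (mm/yr) per ensemble member.
baseline_1961_1990 = [520.0, 495.0, 540.0, 510.0, 530.0]
future_2021_2050 = [470.0, 460.0, 455.0, 480.0, 490.0]

# Percentage change for each member, then the median across members.
pct_change = [
    100.0 * (fut - base) / base
    for base, fut in zip(baseline_1961_1990, future_2021_2050)
]
ensemble_median = median(pct_change)
print(f"ensemble median change: {ensemble_median:.1f}%")
```

    With these made-up values the median lands in the drying range reported for the region; the per-member spread around it is one simple view of the projection uncertainty.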

  16. Philippine Academy of Rehabilitation Medicine emergency basic relief and medical aid mission project (November 2013-February 2014): the role of physiatrists in Super Typhoon Haiyan.

    PubMed

    Ganchoon, Filipinas; Bugho, Rommel; Calina, Liezel; Dy, Rochelle; Gosney, James

    2017-06-09

    Physiatrists have provided humanitarian assistance in recent large-scale global natural disasters. Super Typhoon Haiyan, the deadliest and most costly typhoon in modern Philippine history, made landfall on 8 November 2013, resulting in significant humanitarian needs. Philippine Academy of Rehabilitation Medicine (PARM) physiatrists conducted a project of 23 emergency basic relief and medical aid missions in response to Super Typhoon Haiyan from November 2013 to February 2014. The final mission was a medical aid mission to the inland rural community of Burauen, Leyte. Summary data were collected, collated, and tabulated; project and mission evaluation was performed. During the humanitarian assistance project, 31,254 basic relief kits containing a variety of food and non-food items were distributed, and medical services including consultation, treatment, and medicines were provided to 7255 patients. Of the 144 conditions evaluated in the medical aid mission to Burauen, Leyte, 85 (59%) were physical and rehabilitation medicine conditions, comprising musculoskeletal (62 [73%]), neurological (17 [20%]), and dermatological (6 [7%]) diagnoses. Post-mission and project analysis resulted in recommendations and programmatic changes to strengthen response in future disasters. Physiatrists functioned as medical providers, mission team leaders, community advocates, and in other roles. This physiatrist-led humanitarian assistance project met critical basic relief and medical aid needs of persons impacted by Super Typhoon Haiyan, demonstrating significant roles performed by physiatrists in response to a large-scale natural disaster. The resulting disaster programming changes and recommendations may inform a more effective response by PARM mission teams in the Philippines, as well as by other South-East Asian teams of rehabilitation professionals, to large-scale regional natural disasters.
Implications for rehabilitation: Large-scale natural disasters, including tropical cyclones, can have a catastrophic impact on the affected population. In response to Super Typhoon Haiyan, physiatrists representing the Philippine Academy of Rehabilitation Medicine conducted a project of 23 emergency basic relief and medical aid missions from November 2013 to February 2014. Project analysis indicates that medical mission teams responding in similar settings may expect to evaluate a significant number of physical medicine and rehabilitation conditions. Medical rehabilitation with participation by rehabilitation professionals, including rehabilitation doctors, is essential to the emergency medical response in large-scale natural disasters.

  17. Large-scale standardized phenotyping of strawberry in RosBREED

    USDA-ARS's Scientific Manuscript database

    A large, multi-institutional, international research project with the goal of bringing genomicists and plant breeders together was funded by the USDA-NIFA Specialty Crop Research Initiative. Apple, cherry, peach, and strawberry are the Rosaceous crops included in the project. Many (900+) strawberry g...

  18. SCALING-UP INFORMATION IN LAND-COVER DATA FOR LARGE-SCALE ENVIRONMENTAL ASSESSMENTS

    EPA Science Inventory

    The NLCD project provides national-scope land-cover data for the conterminous United States. The first land-cover data set was completed in 2000, and the continuing need for recent land-cover information has motivated continuation of the project to provide current and change info...

  19. Successful contracting of prevention services: fighting malnutrition in Senegal and Madagascar.

    PubMed

    Marek, T; Diallo, I; Ndiaye, B; Rakotosalama, J

    1999-12-01

    There are very few documented large-scale successes in nutrition in Africa, and virtually no consideration of contracting for preventive services. This paper describes two successful large-scale community nutrition projects in Africa as examples of what can be done in prevention using the contracting approach in rural as well as urban areas. The two case-studies are the Secaline project in Madagascar, and the Community Nutrition Project in Senegal. The article explains what is meant by 'success' in the context of these two projects, how these results were achieved, and how certain bottlenecks were avoided. Both projects are very similar in the type of service they provide, and in combining private administration with public finance. The article illustrates that contracting out is a feasible option to be seriously considered for organizing certain prevention programmes on a large scale. There are strong indications from these projects of success in terms of reducing malnutrition, replicability and scale, and community involvement. When choosing that option, a government can tap available private local human resources through contracting out, rather than delivering those services by the public sector. However, as was done in both projects studied, consideration needs to be given to using a contract management unit for execution and monitoring, which costs 13-17% of the total project's budget. Rigorous assessments of the cost-effectiveness of contracted services are not available, but improved health outcomes, targeting of the poor, and basic cost data suggest that the programmes may well be relatively cost-effective. Although the contracting approach is not presented as the panacea to solve the malnutrition problem faced by Africa, it can certainly provide an alternative in many countries to increase coverage and quality of services.

  20. Basin-Scale Hydrologic Impacts of CO2 Storage: Regulatory and Capacity Implications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Birkholzer, J.T.; Zhou, Q.

    Industrial-scale injection of CO2 into saline sedimentary basins will cause large-scale fluid pressurization and migration of native brines, which may affect valuable groundwater resources overlying the deep sequestration reservoirs. In this paper, we discuss how such basin-scale hydrologic impacts can (1) affect regulation of CO2 storage projects and (2) reduce current storage capacity estimates. Our assessment arises from a hypothetical future carbon sequestration scenario in the Illinois Basin, which involves twenty individual CO2 storage projects in a core injection area suitable for long-term storage. Each project is assumed to inject five million tonnes of CO2 per year for 50 years. A regional-scale three-dimensional simulation model was developed for the Illinois Basin that captures both the local-scale CO2-brine flow processes and the large-scale groundwater flow patterns in response to CO2 storage. The far-field pressure buildup predicted for this selected sequestration scenario suggests that (1) the area that needs to be characterized in a permitting process may comprise a very large region within the basin if reservoir pressurization is considered, and (2) permits cannot be granted on a single-site basis alone because the near- and far-field hydrologic response may be affected by interference between individual sites. Our results also support recent studies in that environmental concerns related to near-field and far-field pressure buildup may be a limiting factor on CO2 storage capacity. In other words, estimates of storage capacity, if solely based on the effective pore volume available for safe trapping of CO2, may have to be revised based on assessments of pressure perturbations and their potential impact on caprock integrity and groundwater resources, respectively. We finally discuss some of the challenges in making reliable predictions of large-scale hydrologic impacts related to CO2 sequestration projects.

  1. Telecommunications technology and rural education in the United States

    NASA Technical Reports Server (NTRS)

    Perrine, J. R.

    1975-01-01

    The rural sector of the US is examined from the point of view of whether telecommunications technology can augment the development of rural education. Migratory farm workers and American Indians were the target groups which were examined as examples of groups with special needs in rural areas. The general rural population and the target groups were examined to identify problems and to ascertain specific educational needs. Educational projects utilizing telecommunications technology in target group settings were discussed. Large scale regional ATS-6 satellite-based experimental educational telecommunications projects were described. Costs and organizational factors were also examined for large scale rural telecommunications projects.

  2. Emily Evans | NREL

    Science.gov Websites

    Emily Evans, Project Controller, Emily.Evans@nrel.gov | 303-275-3125. Emily joined NREL in 2010. As a Project Administrator in the Integrated Applications Center, Emily works with project managers and teams to develop and maintain project management excellence on large-scale, multi-year projects.

  3. Research to Real Life, 2006: Innovations in Deaf-Blindness

    ERIC Educational Resources Information Center

    Leslie, Gail, Ed.

    2006-01-01

    This publication presents several projects that support children who are deaf-blind. These projects are: (1) Learning To Learn; (2) Project SALUTE; (3) Project SPARKLE; (4) Bringing It All Back Home; (5) Project PRIIDE; and (6) Including Students With Deafblindness In Large Scale Assessment Systems. Each project lists components, key practices,…

  4. Inquiry-Based Educational Design for Large-Scale High School Astronomy Projects Using Real Telescopes

    NASA Astrophysics Data System (ADS)

    Fitzgerald, Michael; McKinnon, David H.; Danaia, Lena

    2015-12-01

    In this paper, we outline the theory behind the educational design used to implement a large-scale high school astronomy education project. This design was created in response to the realization that the educational design in the initial stages of the project was ineffective. The new design follows an iterative improvement model in which the materials and general approach can evolve in response to solicited feedback. The improvement cycle concentrates on avoiding overly positive self-evaluation, addressing relevant external school and community factors, and backward mapping from clearly set goals. Limiting factors, including time, resources, support and the potential for failure in the classroom, are dealt with as much as possible in the large-scale design, allowing teachers the best chance of successful implementation in their real-world classrooms. We also outline the actual approach adopted following the principles of this design, which has seen success in bringing real astronomical data and access to telescopes into the high school classroom.

  5. Really Large Scale Computer Graphic Projection Using Lasers and Laser Substitutes

    NASA Astrophysics Data System (ADS)

    Rother, Paul

    1989-07-01

    This paper reflects on past laser projects that displayed vector-scanned computer graphic images onto very large and irregular surfaces. Since the availability of microprocessors and high-powered visible lasers, very large scale computer graphics projection has become a reality. Due to their independence from a focusing lens, lasers easily project onto distant and irregular surfaces and have been used for amusement parks, theatrical performances, concert performances, industrial trade shows and dance clubs. Lasers have been used to project onto mountains, buildings, 360° globes, clouds of smoke and water. These methods have proven successful in installations at Epcot Theme Park in Florida; Stone Mountain Park in Georgia; the 1984 Olympics in Los Angeles; hundreds of corporate trade shows; and thousands of musical performances. Using new ColorRay™ technology, the use of costly and fragile lasers is no longer necessary. Utilizing fiber optic technology, the functionality of lasers can be duplicated for new and exciting projection possibilities. ColorRay™ technology has enjoyed worldwide recognition in conjunction with Pink Floyd's and George Michael's worldwide tours.

  6. Using Microsoft Excel[R] to Calculate Descriptive Statistics and Create Graphs

    ERIC Educational Resources Information Center

    Carr, Nathan T.

    2008-01-01

    Descriptive statistics and appropriate visual representations of scores are important for all test developers, whether they are experienced testers working on large-scale projects, or novices working on small-scale local tests. Many teachers put in charge of testing projects do not know "why" they are important, however, and are utterly convinced…

  7. Development of Affordable, Low-Carbon Hydrogen Supplies at an Industrial Scale

    ERIC Educational Resources Information Center

    Roddy, Dermot J.

    2008-01-01

    An existing industrial hydrogen generation and distribution infrastructure is described, and a number of large-scale investment projects are outlined. All of these projects have the potential to generate significant volumes of low-cost, low-carbon hydrogen. The technologies concerned range from gasification of coal with carbon capture and storage…

  8. Real-time adaptive ramp metering : phase I, MILOS proof of concept (multi-objective, integrated, large-scale, optimized system).

    DOT National Transportation Integrated Search

    2006-12-01

    Over the last several years, researchers at the University of Arizona's ATLAS Center have developed an adaptive ramp metering system referred to as MILOS (Multi-Objective, Integrated, Large-Scale, Optimized System). The goal of this project is ...

  9. Scaling up Education Reform

    ERIC Educational Resources Information Center

    Gaffney, Jon D. H.; Richards, Evan; Kustusch, Mary Bridget; Ding, Lin; Beichner, Robert J.

    2008-01-01

    The SCALE-UP (Student-Centered Activities for Large Enrollment Undergraduate Programs) project was developed to implement reforms designed for small classes into large physics classes. Over 50 schools across the country, ranging from Wake Technical Community College to Massachusetts Institute of Technology (MIT), have adopted it for classes of…

  10. United States Temperature and Precipitation Extremes: Phenomenology, Large-Scale Organization, Physical Mechanisms and Model Representation

    NASA Astrophysics Data System (ADS)

    Black, R. X.

    2017-12-01

    We summarize results from a project focusing on regional temperature and precipitation extremes over the continental United States. Our project introduces a new framework for evaluating these extremes emphasizing their (a) large-scale organization, (b) underlying physical sources (including remote-excitation and scale-interaction) and (c) representation in climate models. Results to be reported include the synoptic-dynamic behavior, seasonality and secular variability of cold waves, dry spells and heavy rainfall events in the observational record. We also study how the characteristics of such extremes are systematically related to Northern Hemisphere planetary wave structures and thus planetary- and hemispheric-scale forcing (e.g., those associated with major El Nino events and Arctic sea ice change). The underlying physics of event onset are diagnostically quantified for different categories of events. Finally, the representation of these extremes in historical coupled climate model simulations is studied and the origins of model biases are traced using new metrics designed to assess the large-scale atmospheric forcing of local extremes.

  11. Mems: Platform for Large-Scale Integrated Vacuum Electronic Circuits

    DTIC Science & Technology

    2017-03-20

    Final Report: MEMS Platform for Large-Scale Integrated Vacuum Electronic Circuits (LIVEC). Contract No. W911NF-14-C-0093; reporting period 1 July 2014 to 30 June 2015; report dated 20 March 2017; distribution unlimited. The objective of the LIVEC advanced study project was to develop a platform for large-scale integrated vacuum electronic circuits. COR: Dr. James Harvey, U.S. ARO, RTP, NC 27709-2211; phone 702-696-2533.

  12. Demonstrating a new framework for the comparison of environmental impacts from small- and large-scale hydropower and wind power projects.

    PubMed

    Bakken, Tor Haakon; Aase, Anne Guri; Hagen, Dagmar; Sundt, Håkon; Barton, David N; Lujala, Päivi

    2014-07-01

    Climate change and the needed reductions in the use of fossil fuels call for the development of renewable energy sources. However, renewable energy production, such as hydropower (both small- and large-scale) and wind power, has adverse impacts on the local environment by causing reductions in biodiversity and loss of habitats and species. This paper compares the environmental impacts of many small-scale hydropower plants with a few large-scale hydropower projects and one wind power farm, based on the same set of environmental parameters: land occupation, reduction in wilderness areas (INON), visibility and impacts on red-listed species. Our basis for comparison was similar energy volumes produced, without considering the quality of the energy services provided. The results show that small-scale hydropower performs less favourably on all parameters except land occupation. The land occupation of large hydropower and wind power is in the range of 45-50 m(2)/MWh, which is more than two times larger than that of small-scale hydropower; the large land occupation for large hydropower is explained by the extent of the reservoirs. On all three other parameters, small-scale hydropower performs more than two times worse than both large hydropower and wind power. Wind power compares similarly to large-scale hydropower regarding land occupation, much better on the reduction in INON areas, and in the same range regarding red-listed species. Our results demonstrate that the selected four parameters provide a basis for further development of a fair and consistent comparison of impacts between the analysed renewable technologies. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
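    The land-occupation metric used in the comparison reduces to simple arithmetic; a minimal sketch with illustrative figures in the ranges quoted above (the function and values are assumptions, not the paper's data):

```python
# Sketch: land occupation per unit of energy produced (m^2/MWh).
# Figures below are invented to match the ranges quoted in the abstract.

def land_occupation_m2_per_mwh(area_m2: float, annual_gwh: float,
                               years: float = 1.0) -> float:
    """Occupied area divided by energy produced over the period (MWh)."""
    return area_m2 / (annual_gwh * 1000.0 * years)

# Hypothetical projects each producing ~1000 GWh/yr.
large_hydro_m2 = 48_000_000  # reservoirs dominate the footprint
small_hydro_m2 = 20_000_000  # many small plants, no reservoirs

print(land_occupation_m2_per_mwh(large_hydro_m2, 1000))  # ~48 m2/MWh
print(land_occupation_m2_per_mwh(small_hydro_m2, 1000))  # ~20 m2/MWh
```

    Normalizing by energy produced rather than installed capacity is what makes technologies with very different capacity factors comparable.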

  13. Considerations for Managing Large-Scale Clinical Trials.

    ERIC Educational Resources Information Center

    Tuttle, Waneta C.; And Others

    1989-01-01

    Research management strategies used effectively in a large-scale clinical trial to determine the health effects of exposure to Agent Orange in Vietnam are discussed, including pre-project planning, organization according to strategy, attention to scheduling, a team approach, emphasis on guest relations, cross-training of personnel, and preparing…

  14. Raising Concerns about Sharing and Reusing Large-Scale Mathematics Classroom Observation Video Data

    ERIC Educational Resources Information Center

    Ing, Marsha; Samkian, Artineh

    2018-01-01

    There are great opportunities and challenges to sharing large-scale mathematics classroom observation data. This Research Commentary describes the methodological opportunities and challenges and provides a specific example from a mathematics education research project to illustrate how the research questions and framework drove observational…

  15. Effects of Ensemble Configuration on Estimates of Regional Climate Uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldenson, N.; Mauger, G.; Leung, L. R.

    Internal variability in the climate system can contribute substantial uncertainty to climate projections, particularly at regional scales. Internal variability can be quantified using large ensembles of simulations that are identical but for perturbed initial conditions. Here we compare methods for quantifying internal variability. Our study region spans the west coast of North America, which is strongly influenced by El Niño and other large-scale dynamics through their contribution to large-scale internal variability. Using a statistical framework to simultaneously account for multiple sources of uncertainty, we find that internal variability can be quantified consistently using a large ensemble or an ensemble of opportunity that includes small ensembles from multiple models and climate scenarios. The latter also produces estimates of uncertainty due to model differences. We conclude that projection uncertainties are best assessed using small single-model ensembles from as many model-scenario pairings as computationally feasible, which has implications for ensemble design in large modeling efforts.
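    A common way to quantify internal variability from an initial-condition ensemble is the across-member spread at each time step; a minimal sketch with hypothetical values (not the study's data or method):

```python
# Sketch: internal variability estimated as the across-member standard
# deviation of an initial-condition ensemble. Values are invented
# regional-mean temperatures, for illustration only.
from statistics import stdev

# Rows = ensemble members (same model, perturbed initial conditions),
# columns = years.
ensemble = [
    [14.1, 14.3, 14.0, 14.6],
    [14.4, 14.0, 14.2, 14.5],
    [13.9, 14.2, 14.3, 14.4],
]

# Across-member spread per year: because the members differ only in their
# initial conditions, this spread is unconfounded by model or scenario
# differences.
internal_variability = [
    stdev(member[year] for member in ensemble)
    for year in range(len(ensemble[0]))
]
print([round(s, 3) for s in internal_variability])
```

    An "ensemble of opportunity" would instead pool small ensembles from several models, so the same spread calculation then mixes internal variability with model differences unless the statistical framework separates them.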

  16. Testing the robustness of Citizen Science projects: Evaluating the results of pilot project COMBER

    PubMed Central

    Faulwetter, Sarah; Dailianis, Thanos; Smith, Vincent Stuart; Koulouri, Panagiota; Dounas, Costas; Arvanitidis, Christos

    2016-01-01

    Abstract Background Citizen Science (CS) as a term covers a great variety of approaches and scopes involving many different fields of science. The number of relevant projects globally has increased significantly in recent years. Large-scale ecological questions can be answered only through extended observation networks, and CS projects can support this effort. Although the need for such projects is apparent, an important part of the scientific community casts doubt on the reliability of CS data sets. New information The pilot CS project COMBER was created to provide evidence to answer this question for coastal marine biodiversity monitoring. The results of the current analysis show that a carefully designed CS project with clear hypotheses, wide participation and data set validation can be a valuable tool for detecting large-scale, long-term changes in marine biodiversity patterns, and therefore for relevant management and conservation issues. PMID:28174507

  17. HSTDEK: Developing a methodology for construction of large-scale, multi-use knowledge bases

    NASA Technical Reports Server (NTRS)

    Freeman, Michael S.

    1987-01-01

    The primary research objective of the Hubble Space Telescope Design/Engineering Knowledgebase (HSTDEK) is to develop a methodology for constructing and maintaining large-scale knowledge bases which can be used to support multiple applications. To ensure the validity of its results, this research is being pursued in the context of a real-world system, the Hubble Space Telescope. The HSTDEK objectives are described in detail. The history and motivation of the project are briefly described. The technical challenges faced by the project are outlined.

  18. 75 FR 13765 - Submission for OMB Review; Use of Project Labor Agreements for Federal Construction Projects

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-23

    ... a project labor agreement (PLA), as they may decide appropriate, on large-scale construction... efficiency in Federal procurement. A PLA is a pre-hire collective bargaining agreement with one or more labor...

  19. The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2002-01-01

    With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of the integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies that indicate that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technology infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will be in the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.

  20. Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghattas, Omar

    2013-10-15

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
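    The "reduce then sample" idea can be illustrated with a toy Metropolis sampler that evaluates a cheap surrogate in place of an expensive forward model. Everything below (the models, the prior, the observation) is an invented one-dimensional stand-in, not the SAGUARO implementation:

```python
# Toy "reduce then sample": Metropolis sampling of a 1-D posterior in which
# the expensive forward model is replaced by a cheap reduced-order surrogate.
import math
import random

random.seed(0)

def full_model(x):
    # Stand-in for an expensive simulation (e.g., subsurface flow solve).
    return math.sin(x) + 0.1 * x ** 2

def surrogate(x):
    # Cheap approximation of full_model, accurate near x = 0.
    return x - x ** 3 / 6 + 0.1 * x ** 2

def log_posterior(x, forward, obs=0.2, noise=0.1):
    # Gaussian likelihood on the model-data misfit plus a standard
    # normal prior on x.
    misfit = (forward(x) - obs) / noise
    return -0.5 * (misfit ** 2 + x ** 2)

def metropolis(logpost, n=5000, step=0.5):
    x, samples = 0.0, []
    lp = logpost(x)
    for _ in range(n):
        xp = x + random.gauss(0.0, step)
        lpp = logpost(xp)
        if math.log(random.random()) < lpp - lp:  # accept/reject
            x, lp = xp, lpp
        samples.append(x)
    return samples

# Sample against the surrogate; the full model would only be evaluated
# afterwards to validate the surrogate over the sampled range.
samples = metropolis(lambda x: log_posterior(x, surrogate))
print(sum(samples) / len(samples))
```

    The computational saving comes entirely from `surrogate` being much cheaper than `full_model` per evaluation; the "sample then reduce" alternative keeps the full model but uses gradient and Hessian structure to need fewer evaluations.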

  1. Effects of Large-Scale Solar Installations on Dust Mobilization and Air Quality

    NASA Astrophysics Data System (ADS)

    Pratt, J. T.; Singh, D.; Diffenbaugh, N. S.

    2012-12-01

    Large-scale solar projects are increasingly being developed worldwide, and many of these installations are located in arid, desert regions. To examine the effects of these projects on regional dust mobilization and air quality, we analyze aerosol product data from NASA's Multi-angle Imaging Spectroradiometer (MISR) at annual and seasonal time intervals near fifteen photovoltaic and solar thermal stations ranging from 5-200 MW (12-4,942 acres) in size. The stations are distributed over eight different countries and were chosen based on size, location and installation date; most of the installations are large-scale, took place in desert climates and were installed between 2006 and 2010. We also consider air quality measurements of particulate matter between 2.5 and 10 micrometers (PM10) from Environmental Protection Agency (EPA) monitoring sites near and downwind from the project installations in the U.S. We use monthly wind data from NOAA's National Centers for Environmental Prediction (NCEP) Global Reanalysis to select the stations downwind from the installations, and then perform statistical analysis on the data to identify any significant changes in these quantities. We find that fourteen of the fifteen regions have lower aerosol product values after the start of the installations, and all six PM10 monitoring stations show lower particulate matter measurements after construction commenced. However, the results fail to show any statistically significant differences in aerosol optical index or PM10 measurements before and after the large-scale solar installations. Many of the large installations are very recent, and there is insufficient data to fully understand the long-term effects on air quality. More data and higher-resolution analysis are necessary to better understand the relationship between large-scale solar, dust and air quality.
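    The before/after significance test described above can be sketched as a simple permutation test on hypothetical station means (illustrative values only; the study's actual statistical method is not specified in the abstract):

```python
# Sketch: permutation test for a before/after difference in mean PM10-like
# values at one monitoring station. All numbers are invented for
# illustration; they are not the study's data.
import random

random.seed(1)

before = [31.0, 28.5, 30.2, 29.8, 32.1, 30.5]
after = [29.4, 27.9, 30.0, 28.8, 31.0, 29.6]

observed = sum(before) / len(before) - sum(after) / len(after)

# Null hypothesis: the before/after labels are exchangeable. Shuffle the
# pooled data and count how often a difference at least as large appears.
pooled = before + after
n_before = len(before)
trials = 10_000
count = 0
for _ in range(trials):
    random.shuffle(pooled)
    diff = (sum(pooled[:n_before]) / n_before
            - sum(pooled[n_before:]) / (len(pooled) - n_before))
    if abs(diff) >= abs(observed):
        count += 1

p_value = count / trials
print(f"observed difference = {observed:.2f}, p = {p_value:.3f}")
```

    A large p-value here mirrors the study's finding: the post-installation values can be lower on average while the difference remains statistically indistinguishable from chance.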

  2. US EPA - ToxCast and the Tox21 program: perspectives

    EPA Science Inventory

    ToxCast is a large-scale project being conducted by the U.S. EPA to screen ~2000 chemicals against a large battery of in vitro high-throughput screening (HTS) assays. ToxCast is complemented by the Tox21 project being jointly carried out by the U.S. NIH Chemical Genomics Center (...

  3. ISMIP6 - initMIP: Greenland ice sheet model initialisation experiments

    NASA Astrophysics Data System (ADS)

    Goelzer, Heiko; Nowicki, Sophie; Payne, Tony; Larour, Eric; Abe Ouchi, Ayako; Gregory, Jonathan; Lipscomb, William; Seroussi, Helene; Shepherd, Andrew; Edwards, Tamsin

    2016-04-01

    Earlier large-scale Greenland ice sheet sea-level projections, e.g. those produced during the ice2sea and SeaRISE initiatives, have shown that ice sheet initialisation can have a large effect on the projections and give rise to important uncertainties. This intercomparison exercise (initMIP) aims to compare, evaluate and improve the initialisation techniques used in the ice sheet modelling community and to estimate the associated uncertainties. It is the first in a series of ice sheet model intercomparison activities within ISMIP6 (Ice Sheet Model Intercomparison Project for CMIP6). The experiments are conceived for the large-scale Greenland ice sheet and are designed to allow intercomparison between participating models of (1) the initial present-day state of the ice sheet and (2) the response in two schematic forward experiments. The latter experiments serve to evaluate the initialisation in terms of model drift (a forward run without any forcing) and the response to a large perturbation (a prescribed surface mass balance anomaly). We present and discuss first results of the intercomparison and highlight important uncertainties with respect to projections of the Greenland ice sheet sea-level contribution.

  4. Colorado State Capitol Geothermal project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shepherd, Lance

    Colorado State Capitol Geothermal Project - The final report is redacted due to space constraints. This project was an innovative large-scale ground-source heat pump (GSHP) project at the Colorado State Capitol in Denver, Colorado. The project employed two large wells on the property: one for pulling water from the aquifer and another for returning the water to the aquifer after performing the heat exchange. The two wells can work in either direction. Heat extracted from or added to the water via a heat exchanger is used to perform space conditioning in the building.

  5. Teaching English Critically to Mexican Children

    ERIC Educational Resources Information Center

    López-Gopar, Mario E.

    2014-01-01

    The purpose of this article is to present one significant part of a large-scale critical-ethnographic-action-research project (CEAR Project) carried out in Oaxaca, Mexico. The overall CEAR Project has been conducted since 2007 in different Oaxacan elementary schools serving indigenous and mestizo (mixed-race) children. In the CEAR Project, teacher…

  6. FutureGen 2.0 Oxy-combustion Large Scale Test – Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenison, LaVesta; Flanigan, Thomas; Hagerty, Gregg

    The primary objectives of the FutureGen 2.0 CO2 Oxy-Combustion Large Scale Test Project were to site, permit, design, construct, and commission an oxy-combustion boiler, gas quality control system, air separation unit, and CO2 compression and purification unit, together with the necessary supporting and interconnection utilities. The project was to demonstrate at commercial scale (168 MWe gross) the capability to cleanly produce electricity through coal combustion at a retrofitted, existing coal-fired power plant, thereby resulting in near-zero emissions of all commonly regulated air emissions, as well as 90% CO2 capture in steady-state operations. The project was to be fully integrated in terms of project management, capacity, capabilities, technical scope, cost, and schedule with the companion FutureGen 2.0 CO2 Pipeline and Storage Project, a separate but complementary project whose objective was to safely transport, permanently store, and monitor the CO2 captured by the Oxy-Combustion Power Plant Project. The FutureGen 2.0 Oxy-Combustion Large Scale Test Project successfully achieved all technical objectives, inclusive of the front-end engineering and design, and the advanced design required to accurately estimate and contract for the construction, commissioning, and start-up of a commercial-scale "ready to build" power plant using oxy-combustion technology, including full integration with the companion CO2 Pipeline and Storage Project. Ultimately the project did not proceed to construction due to insufficient time to complete the necessary EPC contract negotiations and commercial financing prior to expiration of federal co-funding, which triggered a DOE decision to close out its participation in the project. Through the work that was completed, valuable technical, commercial, and programmatic lessons were learned.
This project has significantly advanced the development of near-zero emission technology and will be helpful in plotting the course of, and successfully executing, future large demonstration projects. This Final Scientific and Technical Report describes the technology and engineering basis of the project, inclusive of process systems, performance, effluents and emissions, and controls. Further, the project cost estimate, schedule, and permitting requirements are presented, along with a project risk and opportunity assessment. Lessons learned related to these elements are summarized in this report. Companion reports further document the accomplishments and learnings of the project, including: A.01 Project Management Report, which describes what was done to coordinate the various participants and to track their performance with regard to schedule and budget; B.02 Lessons Learned - Technology Integration, Value Improvements, and Program Management, which describes the innovations and conclusions arrived at during development of the project and makes recommendations for improving future projects of a similar nature; B.03 Project Economics, which details the capital and operating costs and their basis, and illustrates the cost of power produced by the plant with certain sensitivities; B.04 Power Plant, Pipeline, and Injection Site Interfaces, which details the interfaces between the two FutureGen projects; and B.05 Contractual Mechanisms for Design, Construction, and Operation, which describes the major EPC and operations contracts required to execute the project.

  7. Final Report on DOE Project entitled Dynamic Optimized Advanced Scheduling of Bandwidth Demands for Large-Scale Science Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramamurthy, Byravamurthy

    2014-05-05

    In this project, we developed scheduling frameworks for dynamic bandwidth demands from large-scale science applications. In particular, we developed scheduling algorithms for dynamic bandwidth demands. Apart from theoretical approaches such as Integer Linear Programming, Tabu Search, and Genetic Algorithm heuristics, we utilized practical data from the ESnet OSCARS project (from our DOE lab partners) to conduct realistic simulations of our approaches. We disseminated our work through conference paper presentations, journal papers, and a book chapter. In this project we addressed the problem of scheduling lightpaths over optical wavelength-division multiplexed (WDM) networks and published several conference and journal papers on this topic. We also addressed the problem of joint allocation of computing, storage, and networking resources in Grid/Cloud networks and proposed energy-efficient mechanisms for operating optical WDM networks.
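    As a toy illustration of admitting bandwidth demands on a capacity-limited link, here is a greedy earliest-deadline heuristic. This is a simplified sketch, not the project's ILP/Tabu/Genetic formulations; the request names, time slots, and capacities are invented:

```python
def schedule(requests, capacity):
    """Greedy admission: sort by end time, admit a request only if link usage
    stays within capacity over every time slot the request occupies."""
    admitted, timeline = [], {}  # timeline: slot -> bandwidth currently in use
    for name, start, end, bw in sorted(requests, key=lambda r: r[2]):
        slots = range(start, end)
        if all(timeline.get(s, 0) + bw <= capacity for s in slots):
            for s in slots:
                timeline[s] = timeline.get(s, 0) + bw
            admitted.append(name)
    return admitted

# Hypothetical requests: (name, start slot, end slot, bandwidth in Gb/s).
reqs = [("A", 0, 4, 6), ("B", 2, 5, 5), ("C", 4, 8, 4)]
print(schedule(reqs, 10))
```

    Here request B is rejected because slots 2-3 would exceed the 10 Gb/s capacity once A is admitted; an ILP or metaheuristic of the kind the project used could instead search for a globally better admission set.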

  8. Data management strategies for multinational large-scale systems biology projects.

    PubMed

    Wruck, Wasco; Peuker, Martin; Regenbrecht, Christian R A

    2014-01-01

    Good accessibility of publicly funded research data is essential to secure an open scientific system and eventually becomes mandatory [Wellcome Trust will Penalise Scientists Who Don't Embrace Open Access. The Guardian 2012]. With the use of high-throughput methods in many research areas, from physics to systems biology, large data collections are increasingly important as raw material for research. Here, we present strategies worked out by international and national institutions targeting open access to publicly funded research data via incentives or obligations to share data. Funding organizations such as the British Wellcome Trust have therefore developed data sharing policies and request commitment to data management and sharing in grant applications. Increased citation rates are a profound argument for sharing publication data. Pre-publication sharing might be rewarded by a data citation credit system via digital object identifiers (DOIs), which have initially been in use for data objects. Besides policies and incentives, good practice in data management is indispensable. However, appropriate systems for the data management of large-scale projects, for example in systems biology, are hard to find. Here, we give an overview of a selection of open-source data management systems that have proved successful in large-scale projects.

  9. Data management strategies for multinational large-scale systems biology projects

    PubMed Central

    Peuker, Martin; Regenbrecht, Christian R.A.

    2014-01-01

    Good accessibility of publicly funded research data is essential to secure an open scientific system and eventually becomes mandatory [Wellcome Trust will Penalise Scientists Who Don't Embrace Open Access. The Guardian 2012]. With the use of high-throughput methods in many research areas, from physics to systems biology, large data collections are increasingly important as raw material for research. Here, we present strategies worked out by international and national institutions targeting open access to publicly funded research data via incentives or obligations to share data. Funding organizations such as the British Wellcome Trust have therefore developed data sharing policies and request commitment to data management and sharing in grant applications. Increased citation rates are a profound argument for sharing publication data. Pre-publication sharing might be rewarded by a data citation credit system via digital object identifiers (DOIs), which have initially been in use for data objects. Besides policies and incentives, good practice in data management is indispensable. However, appropriate systems for the data management of large-scale projects, for example in systems biology, are hard to find. Here, we give an overview of a selection of open-source data management systems that have proved successful in large-scale projects. PMID:23047157

  10. The Billboard Project

    ERIC Educational Resources Information Center

    Weaver, Victoria

    2005-01-01

    Since 1997, the author has coordinated a large-scale billboard project. Timed to coincide with the National Art Education Association's celebration of Youth Art Month, the project owes its success to strong commitments from faculty, students, administrators, public-relations liaisons, local press, radio, TV, and community businesses. The first…

  11. Architecture and Programming Models for High Performance Intensive Computation

    DTIC Science & Technology

    2016-06-29

    Applications Systems and Large-Scale-Big-Data & Large-Scale-Big-Computing (DDDAS-LS). ICCS 2015, June 2015. Reykjavík, Iceland. 2. Bo YT, Wang P, Guo ZL...The Mahali project," Communications Magazine, vol. 52, pp. 111–133, Aug 2014.

  12. The Large-Scale Biosphere-Atmosphere Experiment in Amazonia: Analyzing Regional Land Use Change Effects.

    Treesearch

    Michael Keller; Maria Assunção Silva-Dias; Daniel C. Nepstad; Meinrat O. Andreae

    2004-01-01

    The Large-Scale Biosphere-Atmosphere Experiment in Amazonia (LBA) is a multi-disciplinary, multinational scientific project led by Brazil. LBA researchers seek to understand Amazonia in its global context especially with regard to regional and global climate. Current development activities in Amazonia including deforestation, logging, cattle ranching, and agriculture...

  13. Large-scale hybrid poplar production economics: 1995 Alexandria, Minnesota establishment cost and management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Downing, M.; Langseth, D.; Stoffel, R.

    1996-12-31

    The purpose of this project was to track and monitor the costs of planting, maintaining, and monitoring large-scale commercial plantings of hybrid poplar in Minnesota. These cost data assist potential growers and purchasers of this resource in determining the ways in which supply and demand may be secured through developing markets.

  14. The global climate of December 1992-February 1993. Part 2: Large-scale variability across the tropical western Pacific during TOGA COARE

    NASA Technical Reports Server (NTRS)

    Gutzler, D. S.; Kiladis, G. N.; Meehl, G. A.; Weickmann, K. M.; Wheeler, M.

    1994-01-01

    Recently, scientists from more than a dozen countries carried out the field phase of a project called the Coupled Ocean-Atmosphere Response Experiment (COARE), devoted to describing the ocean-atmosphere system of the western Pacific near-equatorial warm pool. The project was conceived, organized, and funded under the auspices of the International Tropical Ocean Global Atmosphere (TOGA) Program. Although COARE consisted of several field phases, including a year-long atmospheric enhanced monitoring period (1 July 1992 -- 30 June 1993), the heart of COARE was its four-month Intensive Observation Period (IOP) extending from 1 Nov. 1992 through 28 Feb. 1993. An overview of large-scale variability during COARE is presented. The weather and climate observed in the IOP are placed into context with regard to large-scale, low-frequency fluctuations of the ocean-atmosphere system. Aspects of tropical variability beginning in Aug. 1992 and extending through Mar. 1993, with some sounding data for Apr. 1993, are considered. Variability over the large-scale sounding array (LSA) and the intensive flux array (IFA) is emphasized.

  15. Data integration in the era of omics: current and future challenges

    PubMed Central

    2014-01-01

    Integrating heterogeneous and large omics data constitutes not only a conceptual challenge but also a practical hurdle in the daily analysis of omics data. With the rise of novel omics technologies and through large-scale consortia projects, biological systems are being investigated at an unprecedented scale, generating heterogeneous and often large data sets. These data sets encourage researchers to develop novel data integration methodologies. In this introduction we review the definition of, and characterize current efforts on, data integration in the life sciences. We used a web survey to assess current research projects on data integration and to tap into the views, needs and challenges as currently perceived by parts of the research community. PMID:25032990

  16. More robust regional precipitation projection from selected CMIP5 models based on multiple-dimensional metrics

    NASA Astrophysics Data System (ADS)

    Qian, Y.; Wang, L.; Leung, L. R.; Lin, G.; Lu, J.; Gao, Y.; Zhang, Y.

    2017-12-01

    Projecting precipitation changes is challenging because of incomplete understanding of the climate system and biases and uncertainty in climate models. In East Asia, summer precipitation is dominantly influenced by the monsoon circulation, and the global models from the Coupled Model Intercomparison Project Phase 5 (CMIP5) give widely varying projections of precipitation change for the 21st century. It is critical for the community to know which models' projections are more reliable in response to natural and anthropogenic forcings. In this study we defined multi-dimensional metrics measuring model performance in simulating the present-day large-scale circulation, regional precipitation, and the relationship between them. The large-scale circulation features examined in this study include the lower-tropospheric southwesterly winds, the western North Pacific subtropical high, the South China Sea subtropical high, and the East Asian westerly jet in the upper troposphere. Each of these circulation features transports moisture to East Asia, enhancing the moist static energy and strengthening the Meiyu moisture front that is the primary mechanism for precipitation generation in eastern China. Based on these metrics, the 30 models in the CMIP5 ensemble are classified into three groups. Models in the top-performing group projected regional precipitation patterns that are more similar to each other than those in the middle or bottom performing groups and consistently projected statistically significant increasing trends in two of the large-scale circulation indices and in precipitation. In contrast, models in the middle or bottom performing groups projected small drying or no trends in precipitation. We also find that models that merely reproduce the observed precipitation climatology reasonably well do not guarantee a more reliable projection of future precipitation, because good simulation skill can be achieved through compensating errors from multiple sources.
Herein the potential for more robust projections of precipitation changes at regional scale is demonstrated through the use of discriminating metrics to subsample the multi-model ensemble. The results from this study provide insight into how to select models from a CMIP ensemble to project regional climate and hydrological cycle changes.
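    The subsampling idea, ranking ensemble members by a skill metric against observations and keeping the best performers, can be sketched in a few lines. This is a simplified illustration: RMSE stands in for the study's multi-dimensional metrics, and the model names and values are invented placeholders:

```python
def rmse(sim, obs):
    """Root-mean-square error of a simulated series against observations."""
    return (sum((s - o) ** 2 for s, o in zip(sim, obs)) / len(obs)) ** 0.5

# Hypothetical observed seasonal-mean precipitation and three model simulations.
obs = [5.0, 6.0, 7.5, 6.5]
models = {
    "M1": [5.2, 6.1, 7.3, 6.4],
    "M2": [4.0, 7.5, 6.0, 8.0],
    "M3": [5.5, 5.8, 7.9, 6.2],
}
ranked = sorted(models, key=lambda m: rmse(models[m], obs))
print(ranked)
```

    A real application would combine several metrics (circulation indices, precipitation pattern, and their relationship) before classifying models into performance groups, precisely to avoid rewarding skill achieved through compensating errors.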

  17. Ecological research at the Goosenest Adaptive Management Area in northeastern California

    Treesearch

    Martin W. Ritchie

    2005-01-01

    This paper describes the establishment of an interdisciplinary, large-scale ecological research project on the Goosenest Adaptive Management Area of the Klamath National Forest in northeastern California. This project is a companion to the Blacks Mountain Ecological Research Project described by Oliver (2000). The genesis for this project was the Northwest...

  18. HTS-DB: an online resource to publish and query data from functional genomics high-throughput siRNA screening projects.

    PubMed

    Saunders, Rebecca E; Instrell, Rachael; Rispoli, Rossella; Jiang, Ming; Howell, Michael

    2013-01-01

    High-throughput screening (HTS) uses technologies such as RNA interference to generate loss-of-function phenotypes on a genomic scale. As these technologies become more popular, many research institutes have established core facilities of expertise to deal with the challenges of large-scale HTS experiments. As the efforts of core facility screening projects come to fruition, focus has shifted towards managing the results of these experiments and making them available in a useful format that can be further mined for phenotypic discovery. The HTS-DB database provides a public view of data from screening projects undertaken by the HTS core facility at the CRUK London Research Institute. All projects and screens are described with comprehensive assay protocols, and datasets are provided with complete descriptions of analysis techniques. This format allows users to browse and search data from large-scale studies in an informative and intuitive way. It also provides a repository for additional measurements obtained from screens that were not the focus of the project, such as cell viability, and groups these data so that it can provide a gene-centric summary across several different cell lines and conditions. All datasets from our screens that can be made available can be viewed interactively and mined for further hit lists. We believe that in this format, the database provides researchers with rapid access to results of large-scale experiments that might facilitate their understanding of genes/compounds identified in their own research. DATABASE URL: http://hts.cancerresearchuk.org/db/public.

  19. CONSORT to community: translation of an RCT to a large-scale community intervention and learnings from evaluation of the upscaled program.

    PubMed

    Moores, Carly Jane; Miller, Jacqueline; Perry, Rebecca Anne; Chan, Lily Lai Hang; Daniels, Lynne Allison; Vidgen, Helen Anna; Magarey, Anthea Margaret

    2017-11-29

    Translation encompasses the continuum from clinical efficacy to widespread adoption within the healthcare service and, ultimately, routine clinical practice. The Parenting, Eating and Activity for Child Health (PEACH™) program has previously demonstrated clinical effectiveness in the management of child obesity and has recently been implemented as a large-scale community intervention in Queensland, Australia. This paper aims to describe the translation of the evaluation framework from a randomised controlled trial (RCT) to the large-scale community intervention (PEACH™ QLD). Tensions between the RCT paradigm and implementation research are discussed, along with lived evaluation challenges, responses to overcome these, and key learnings for future evaluation conducted at scale. The translation of evaluation from the PEACH™ RCT to the large-scale community intervention PEACH™ QLD is described. While the CONSORT Statement was used to report findings from two previous RCTs, the RE-AIM framework was more suitable for the evaluation of upscaled delivery of the PEACH™ program. Evaluation of PEACH™ QLD was undertaken during the project delivery period from 2013 to 2016. Experiential learnings from conducting the evaluation of PEACH™ QLD according to the described evaluation framework are presented for the purpose of informing the future evaluation of upscaled programs. Evaluation changes in response to real-time changes in the delivery of the PEACH™ QLD Project were necessary at stages during the project term. Key evaluation challenges encountered included the collection of complete evaluation data from a diverse and geographically dispersed workforce, and the systematic collection of process evaluation data in real time to support program changes during the project. Evaluation of large-scale community interventions in the real world is challenging and divergent from RCTs, which are rigorously evaluated within a more tightly controlled clinical research setting.
Constructs explored in an RCT are inadequate for describing the enablers and barriers of upscaled community program implementation. Methods for data collection, analysis and reporting also require consideration. We present a number of experiential reflections and suggestions, scarcely reported in the literature, for the successful evaluation of future upscaled community programs. PEACH™ QLD was retrospectively registered with the Australian New Zealand Clinical Trials Registry on 28 February 2017 (ACTRN12617000315314).

  20. UAS in the NAS Project: Large-Scale Communication Architecture Simulations with NASA GRC Gen5 Radio Model

    NASA Technical Reports Server (NTRS)

    Kubat, Gregory

    2016-01-01

    This report provides a description and performance characterization of the large-scale, Relay-architecture UAS communications simulation capability developed for the NASA GRC UAS in the NAS Project. The system uses a validated model of the GRC Gen5 CNPC flight-test radio. Contained in the report are a description of the simulation system and its model components, recent changes made to the system to improve performance, descriptions and objectives of sample simulations used for test and verification, and a sampling of results and performance data.

  1. The 80 megawatt wind power project at Kahuku Point, Hawaii

    NASA Technical Reports Server (NTRS)

    Laessig, R. R.

    1982-01-01

    Windfarms Ltd. is developing the two largest wind energy projects in the world. Designed to produce 80 megawatts at Kahuku Point, Hawaii and 350 megawatts in Solano County, California, these projects will be the prototypes for future large-scale wind energy installations throughout the world.

  2. Post-project geomorphic assessment of a large process-based river restoration project

    USGS Publications Warehouse

    Erwin, Susannah O.; Schmidt, John C.; Allred, Tyler M.

    2016-01-01

    This study describes channel changes following completion of the Provo River Restoration Project (PRRP), the largest stream restoration project in Utah and one of the largest projects in the United States in which a gravel-bed river was fully reconstructed. We summarize project objectives and the design process, and we analyze monitoring data collected during the first 7 years after project completion. Post-project channel adjustment during the study period included two phases: (i) an initial phase of rapid, but small-scale, adjustment during the first years after stream flow was introduced to the newly constructed channel and (ii) a subsequent period of more gradual topographic adjustment and channel migration. Analysis of aerial imagery and ground-survey data demonstrates that the channel has been more dynamic in the downstream 4 km, where a local source contributes a significant annual supply of bed material. Here, the channel migrates and exhibits channel adjustments that are more consistent with project objectives. The upstream 12 km of the PRRP are sediment starved, the channel has been laterally stable, and this condition may not be consistent with large-scale project objectives.

  3. Management of large-scale technology

    NASA Technical Reports Server (NTRS)

    Levine, A.

    1985-01-01

    Two major themes are addressed in this assessment of the management of large-scale NASA programs: (1) how a high-technology agency fared through a decade marked by a rapid expansion of funds and manpower in the first half and an almost as rapid contraction in the second; and (2) how NASA combined central planning and control with decentralized project execution.

  4. Development and Large-Scale Validation of an Instrument to Assess Arabic-Speaking Students' Attitudes toward Science

    ERIC Educational Resources Information Center

    Abd-El-Khalick, Fouad; Summers, Ryan; Said, Ziad; Wang, Shuai; Culbertson, Michael

    2015-01-01

    This study is part of a large-scale project focused on "Qatari students' Interest in, and Attitudes toward, Science" (QIAS). QIAS aimed to gauge Qatari student attitudes toward science in grades 3-12, examine factors that impact these attitudes, and assess the relationship between student attitudes and prevailing modes of science…

  5. Status of JUPITER Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Inoue, T.; Shirakata, K.; Kinjo, K.

    To obtain the data necessary for evaluating the nuclear design method of a large-scale fast breeder reactor, criticality tests with a large-scale homogeneous reactor were conducted as part of a joint research program by Japan and the U.S. Analyses of the tests are underway in both countries. The purpose of this paper is to describe the status of this project.

  6. AN EXAMINATION OF CITIZEN PARTICIPATION AND PROCEDURAL FAIRNESS IN LARGE-SCALE URBAN TREE PLANTING INITIATIVES IN THE UNITED STATES

    EPA Science Inventory

    This project will result in a typology of the degrees and forms of citizen participation in large-scale urban tree planting initiatives. It also will identify specific aspects of urban tree planting processes that residents perceive as fair and unfair, which will provide ad...

  7. Regional variability of the frequency distribution of daily precipitation and the synoptic characteristics of heavy precipitation events in present and future climate simulations

    NASA Astrophysics Data System (ADS)

    DeAngelis, Anthony M.

    Changes in the characteristics of daily precipitation in response to global warming may have serious impacts on human life and property. An analysis of precipitation in climate models is performed to evaluate how well the models simulate the present climate and how precipitation may change in the future. Models participating in phases 3 and 5 of the Coupled Model Intercomparison Project (CMIP3 and CMIP5) have substantial biases in their simulation of heavy precipitation intensity over parts of North America during the 20th century. Despite these biases, the large-scale atmospheric circulation accompanying heavy precipitation is either simulated realistically or the strength of the circulation is overestimated. The biases are not related to the large-scale flow in a simple way, pointing toward the importance of other model deficiencies, such as coarse horizontal resolution and convective parameterizations, for the accurate simulation of intense precipitation. Although the models may not sufficiently simulate the intensity of precipitation, their realistic portrayal of the large-scale circulation suggests that projections of future precipitation may be reliable. In the CMIP5 ensemble, the distribution of daily precipitation is projected to undergo substantial changes in response to future atmospheric warming. The regional distribution of these changes was investigated, revealing that dry days and days with heavy-to-extreme precipitation are projected to increase at the expense of light-to-moderate precipitation over much of the middle and low latitudes. Such projections have serious implications for future impacts from flood and drought events. In other places, changes in the daily precipitation distribution are characterized by a shift toward either wetter or drier conditions in the future, with heavy-to-extreme precipitation projected to increase in all but the driest subtropical subsidence regions.
Further analysis shows that increases in heavy precipitation in midlatitudes are largely explained by thermodynamics, including increases in atmospheric water vapor. However, in low latitudes and northern high latitudes, changes in vertical velocity accompanying heavy precipitation are also important. The strength of the large-scale atmospheric circulation is projected to change in accordance with vertical velocity in many places, though the circulation patterns, and therefore physical mechanisms that generate heavy precipitation, may remain the same.
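    The thermodynamic contribution mentioned above follows from the Clausius-Clapeyron relation: the atmosphere's saturation vapour pressure rises by roughly 7% per kelvin of warming. A minimal numerical check, using the standard Magnus approximation for saturation vapour pressure (a textbook formula, not taken from this study):

```python
import math

def e_sat(t_c):
    """Saturation vapour pressure (hPa) via the Magnus approximation,
    with temperature t_c in degrees Celsius."""
    return 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))

# Fractional increase in saturation vapour pressure per kelvin near 15 degrees C,
# which should land close to the ~7 %/K Clausius-Clapeyron rate.
rate = (e_sat(16.0) - e_sat(15.0)) / e_sat(15.0)
print(round(100 * rate, 1))
```

    Where heavy precipitation scales with available water vapour, this sets the thermodynamic baseline; departures from it, as the abstract notes for low and high latitudes, point to changes in vertical velocity and circulation.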

  8. Chapter 13 - Perspectives on LANDFIRE Prototype Project Accuracy Assessment

    Treesearch

    James Vogelmann; Zhiliang Zhu; Jay Kost; Brian Tolk; Donald Ohlen

    2006-01-01

    The purpose of this chapter is to provide a general overview of the many aspects of accuracy assessment pertinent to the Landscape Fire and Resource Management Planning Tools Prototype Project (LANDFIRE Prototype Project). The LANDFIRE Prototype formed a large and complex research and development project with many broad-scale data sets and products developed throughout...

  9. Results of the Greenland Ice Sheet Model Initialisation Experiments ISMIP6 - initMIP-Greenland

    NASA Astrophysics Data System (ADS)

    Goelzer, H.; Nowicki, S.; Edwards, T.; Beckley, M.; Abe-Ouchi, A.; Aschwanden, A.; Calov, R.; Gagliardini, O.; Gillet-Chaulet, F.; Golledge, N. R.; Gregory, J. M.; Greve, R.; Humbert, A.; Huybrechts, P.; Larour, E. Y.; Lipscomb, W. H.; Le clec'h, S.; Lee, V.; Kennedy, J. H.; Pattyn, F.; Payne, A. J.; Rodehacke, C. B.; Rückamp, M.; Saito, F.; Schlegel, N.; Seroussi, H. L.; Shepherd, A.; Sun, S.; van de Wal, R.; Ziemen, F. A.

    2016-12-01

    Earlier large-scale Greenland ice sheet sea-level projections, e.g. those run during the ice2sea and SeaRISE initiatives, have shown that ice sheet initialisation can have a large effect on the projections and gives rise to important uncertainties. The goal of this intercomparison exercise (initMIP-Greenland) is to compare, evaluate and improve the initialisation techniques used in the ice sheet modelling community and to estimate the associated uncertainties. It is the first in a series of ice sheet model intercomparison activities within ISMIP6 (the Ice Sheet Model Intercomparison Project for CMIP6). Two experiments for the large-scale Greenland ice sheet have been designed to allow intercomparison between participating models of 1) the initial present-day state of the ice sheet and 2) the response in two schematic forward experiments. The forward experiments serve to evaluate the initialisation in terms of model drift (a forward run without any forcing) and the response to a large perturbation (a prescribed surface mass balance anomaly). We present and discuss final results of the intercomparison and highlight important uncertainties with respect to projections of the Greenland ice sheet sea-level contribution.

  10. An efficient and scalable analysis framework for variant extraction and refinement from population-scale DNA sequence data.

    PubMed

    Jun, Goo; Wing, Mary Kate; Abecasis, Gonçalo R; Kang, Hyun Min

    2015-06-01

    The analysis of next-generation sequencing data is computationally and statistically challenging because of the massive volume of data and imperfect data quality. We present GotCloud, a pipeline for efficiently detecting and genotyping high-quality variants from large-scale sequencing data. GotCloud automates sequence alignment, sample-level quality control, variant calling, filtering of likely artifacts using machine-learning techniques, and genotype refinement using haplotype information. The pipeline can process thousands of samples in parallel and requires fewer computational resources than current alternatives. Experiments with whole-genome and exome-targeted sequence data generated by the 1000 Genomes Project show that the pipeline provides effective filtering against false-positive variants and high power to detect true variants. Our pipeline has already contributed to variant detection and genotyping in several large-scale sequencing projects, including the 1000 Genomes Project and the NHLBI Exome Sequencing Project. We hope it will now prove useful to many medical sequencing studies. © 2015 Jun et al.; Published by Cold Spring Harbor Laboratory Press.
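    To make the filtering stage concrete, here is a toy hard filter over variant records. Note that GotCloud's actual filter is machine-learning based; the field names and thresholds below are illustrative assumptions, not its real criteria:

```python
# Toy variant records: position, call quality, and read depth (invented values).
variants = [
    {"pos": 101, "qual": 50, "depth": 30},
    {"pos": 202, "qual": 12, "depth": 8},
    {"pos": 303, "qual": 40, "depth": 5},
]

def passes(v, min_qual=20, min_depth=10):
    """Keep a variant only if both call quality and depth clear the thresholds."""
    return v["qual"] >= min_qual and v["depth"] >= min_depth

kept = [v["pos"] for v in variants if passes(v)]
print(kept)
```

    A learned filter generalizes this idea by combining many such per-site features into a single artifact-likelihood score instead of fixed cutoffs.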

  11. 3 CFR 13502 - Executive Order 13502 of February 6, 2009. Use of Project Labor Agreements for Federal...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... developing by providing structure and stability to large-scale construction projects, thereby promoting the... procurement, producing labor-management stability, and ensuring compliance with laws and regulations governing... construction projects receiving Federal financial assistance, would help to promote the economical, efficient...

  12. Large-scale Meteorological Patterns Associated with Extreme Precipitation Events over Portland, OR

    NASA Astrophysics Data System (ADS)

    Aragon, C.; Loikith, P. C.; Lintner, B. R.; Pike, M.

    2017-12-01

    Extreme precipitation events can have profound impacts on human life and infrastructure, with broad implications across a range of stakeholders. Changes to extreme precipitation events are a projected outcome of climate change that warrants further study, especially at regional to local scales. While global climate models are generally capable of simulating mean climate at global to regional scales with reasonable skill, resiliency and adaptation decisions are made at local scales where most state-of-the-art climate models are limited by coarse resolution. Characterization of large-scale meteorological patterns associated with extreme precipitation events at local scales can provide climatic information without this scale limitation, thus facilitating stakeholder decision-making. This research will use synoptic climatology as a tool by which to characterize the key large-scale meteorological patterns associated with extreme precipitation events in the Portland, Oregon metro region. Composite analysis of meteorological patterns associated with extreme precipitation days, and associated watershed-specific flooding, is employed to enhance understanding of the climatic drivers behind such events. The self-organizing maps approach is then used to characterize the within-composite variability of the large-scale meteorological patterns associated with extreme precipitation events, allowing us to better understand the different types of meteorological conditions that lead to high-impact precipitation events and associated hydrologic impacts. A more comprehensive understanding of the meteorological drivers of extremes will aid in evaluating the ability of climate models to capture key patterns associated with extreme precipitation over Portland and in better interpreting projections of future climate at impact-relevant scales.
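
    The self-organizing maps step lends itself to a compact sketch. The toy 1-D SOM below is a minimal pure-Python illustration of the general technique; the node count, iteration budget, and learning schedule are arbitrary assumptions, not the study's configuration:

```python
import math
import random

def assign(nodes, x):
    """Index of the node closest (Euclidean) to vector x."""
    return min(range(len(nodes)),
               key=lambda i: sum((nodes[i][d] - x[d]) ** 2 for d in range(len(x))))

def train_som(samples, n_nodes=3, n_iter=500, lr0=0.5, seed=0):
    """Train a minimal 1-D self-organizing map on feature vectors."""
    rng = random.Random(seed)
    dim = len(samples[0])
    # random initial node weights
    nodes = [[rng.uniform(-1.0, 1.0) for _ in range(dim)] for _ in range(n_nodes)]
    sigma0 = n_nodes / 2.0
    for t in range(n_iter):
        frac = t / n_iter
        lr = lr0 * (1.0 - frac)                  # decaying learning rate
        sigma = max(sigma0 * (1.0 - frac), 0.5)  # shrinking neighborhood width
        x = rng.choice(samples)
        bmu = assign(nodes, x)                   # best-matching unit
        for i, node in enumerate(nodes):
            # nodes near the BMU on the 1-D map move most toward the sample
            h = math.exp(-((i - bmu) ** 2) / (2.0 * sigma ** 2))
            for d in range(dim):
                node[d] += lr * h * (x[d] - node[d])
    return nodes
```

    Trained on gridded circulation-anomaly vectors from extreme-precipitation days, each map node would come to represent one recurring pattern type; here any equal-length numeric vectors work.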

  13. Large scale rigidity-based flexibility analysis of biomolecules

    PubMed Central

    Streinu, Ileana

    2016-01-01

    KINematics And RIgidity (KINARI) is an on-going project for in silico flexibility analysis of proteins. The new version of the software, Kinari-2, extends the functionality of our free web server KinariWeb, incorporates advanced web technologies, emphasizes the reproducibility of its experiments, and makes substantially improved tools available to the user. It is designed specifically for large-scale experiments, in particular for (a) very large molecules, including bioassemblies with a high degree of symmetry such as viruses and crystals, (b) large collections of related biomolecules, such as those obtained through simulated dilutions, mutations, or conformational changes from various types of dynamics simulations, and (c) the large, idiosyncratic, publicly available repository of biomolecules, the Protein Data Bank, on which it is intended to work as seamlessly as possible. We describe the system design, along with the main data processing, computational, mathematical, and validation challenges underlying this phase of the KINARI project. PMID:26958583

  14. How big is too big or how many partners are needed to build a large project which still can be managed successfully?

    NASA Astrophysics Data System (ADS)

    Henkel, Daniela; Eisenhauer, Anton

    2017-04-01

    During the last decades, the number of large research projects has increased, and with it the requirement for multidisciplinary, multisectoral collaboration. Such complex, large-scale projects demand new competencies to form, manage, and use large, diverse teams as a competitive advantage. For complex projects the effort is magnified: multiple large international research consortia involve academic and non-academic partners, including big industries, NGOs, and private and public bodies, all with cultural differences and individually discrepant expectations of teamwork, while differences in the collaboration between national and multinational administrations and research organisations further challenge the organisation and management of such multi-partner research consortia. How many partners are needed to establish and conduct collaboration with a multidisciplinary and multisectoral approach? How much personnel effort, and what kinds of management techniques, are required for such projects? This presentation identifies advantages and challenges of large research projects based on the experiences made in the context of an Innovative Training Network (ITN) project within the Marie Skłodowska-Curie Actions of the European HORIZON 2020 programme. Possible strategies are discussed to circumvent and avoid conflicts from the very beginning of the project.

  15. The EMCC / DARPA Massively Parallel Electromagnetic Scattering Project

    NASA Technical Reports Server (NTRS)

    Woo, Alex C.; Hill, Kueichien C.

    1996-01-01

    The Electromagnetic Code Consortium (EMCC) was sponsored by the Advanced Research Projects Agency (ARPA) to demonstrate the effectiveness of massively parallel computing in large-scale radar signature predictions. The EMCC/ARPA project consisted of three parts.

  16. Comprehensive evaluation of transportation projects : a toolkit for sketch planning.

    DOT National Transportation Integrated Search

    2010-10-01

    A quick-response project-planning tool can be extremely valuable in anticipating the congestion, safety, emissions, and other impacts of large-scale network improvements and policy implementations. This report identifies the advantages and limita...

  17. Leveraging Resources to Address Transportation Needs: Transportation Pooled Fund Program

    DOT National Transportation Integrated Search

    2004-05-28

    This brochure describes the Transportation Pooled Fund (TPF) Program. The objectives of the TPF Program are to leverage resources, avoid duplication of effort, undertake large-scale projects, obtain greater input on project definition, achieve broade...

  18. Living the lesson: can the Lifestyle Project be used to achieve deep learning in environmental earth science?

    NASA Astrophysics Data System (ADS)

    Padden, M.; Whalen, K.

    2013-12-01

    Students in a large, second-year environmental earth science class made significant changes to their daily lives over a three-week period to learn how small-scale actions interact with global-scale issues such as water and energy supplies, waste management and agriculture. The Lifestyle Project (Kirk and Thomas, 2003) was slightly adapted to fit a large-class setting (350 students). Students made changes to their lifestyle in self-selected categories (water, home heating, transportation, waste, food) and kept journals over a three-week period as the changes increased in difficulty. The goal of this study is to gain an understanding of which aspects of the project played a pivotal role in impacting long-term learning. Content analysis of the journal entries and follow-up interviews are used to investigate whether the Lifestyle Project is having a lasting impact on the students 18 months after the initial assignment.

  19. The global gridded crop model intercomparison: Data and modeling protocols for Phase 1 (v1.0)

    DOE PAGES

    Elliott, J.; Müller, C.; Deryng, D.; ...

    2015-02-11

    We present protocols and input data for Phase 1 of the Global Gridded Crop Model Intercomparison, a project of the Agricultural Model Intercomparison and Improvement Project (AgMIP). The project consists of global simulations of yields, phenologies, and many land-surface fluxes, carried out by 12–15 modeling groups for many crops, climate forcing data sets, and scenarios over the historical period from 1948 to 2012. The primary outcomes of the project include (1) a detailed comparison of the major differences and similarities among global models commonly used for large-scale climate impact assessment, (2) an evaluation of model and ensemble hindcasting skill, (3) quantification of key uncertainties from climate input data, model choice, and other sources, and (4) a multi-model analysis of the agricultural impacts of large-scale climate extremes from the historical record.

  20. Modeling the Hydrologic Effects of Large-Scale Green Infrastructure Projects with GIS

    NASA Astrophysics Data System (ADS)

    Bado, R. A.; Fekete, B. M.; Khanbilvardi, R.

    2015-12-01

    Impervious surfaces in urban areas generate excess runoff, which in turn causes flooding, combined sewer overflows, and degradation of adjacent surface waters. Municipal environmental protection agencies have shown a growing interest in mitigating these effects with 'green' infrastructure practices that partially restore the perviousness and water holding capacity of urban centers. Assessment of the performance of current and future green infrastructure projects is hindered by the lack of adequate hydrological modeling tools; conventional techniques fail to account for the complex flow pathways of urban environments, and detailed analyses are difficult to prepare for the very large domains in which green infrastructure projects are implemented. Currently, no standard toolset exists that can rapidly and conveniently predict runoff, consequent inundations, and sewer overflows at a city-wide scale. We demonstrate how streamlined modeling techniques can be used with open-source GIS software to efficiently model runoff in large urban catchments. Hydraulic parameters and flow paths through city blocks, roadways, and sewer drains are automatically generated from GIS layers, and ultimately urban flow simulations can be executed for a variety of rainfall conditions. With this methodology, users can understand the implications of large-scale land use changes and green/gray storm water retention systems on hydraulic loading, peak flow rates, and runoff volumes.
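
    The automatic derivation of flow paths from gridded data can be illustrated with a toy D8 routing scheme: each cell drains to its steepest downslope neighbor, and accumulation counts the cells draining through each point. This is a minimal sketch of the general idea, not the project's actual GIS toolchain (it ignores ties, pit filling, and sewer linkages):

```python
def d8_flow_accumulation(elev):
    """Toy D8 flow routing on a 2-D elevation grid: each cell drains to
    its steepest strictly-lower neighbor; accumulation counts each cell
    itself plus every cell draining through it."""
    rows, cols = len(elev), len(elev[0])
    downstream = {}
    for r in range(rows):
        for c in range(cols):
            best, drop = None, 0.0
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    nr, nc = r + dr, c + dc
                    if (dr, dc) != (0, 0) and 0 <= nr < rows and 0 <= nc < cols:
                        d = elev[r][c] - elev[nr][nc]
                        if d > drop:
                            best, drop = (nr, nc), d
            downstream[(r, c)] = best  # None marks a pit or outlet
    acc = {cell: 1 for cell in downstream}
    # visit cells from highest to lowest so upstream counts are final
    for cell in sorted(downstream, key=lambda rc: -elev[rc[0]][rc[1]]):
        if downstream[cell] is not None:
            acc[downstream[cell]] += acc[cell]
    return acc
```

    On a plane tilted toward one corner, the corner cell accumulates flow from the whole grid; scaling accumulation by cell area and a runoff coefficient would give a crude hydraulic loading estimate.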

  1. Software engineering risk factors in the implementation of a small electronic medical record system: the problem of scalability.

    PubMed

    Chiang, Michael F; Starren, Justin B

    2002-01-01

    The successful implementation of clinical information systems is difficult. In examining the reasons and potential solutions for this problem, the medical informatics community may benefit from the lessons of a rich body of software engineering and management literature about the failure of software projects. Based on previous studies, we present a conceptual framework for understanding the risk factors associated with large-scale projects. However, the vast majority of existing literature is based on large, enterprise-wide systems, and it is unclear whether those results may be scaled down and applied to smaller projects such as departmental medical information systems. To examine this issue, we discuss the case study of a delayed electronic medical record implementation project in a small specialty practice at Columbia-Presbyterian Medical Center. While the factors contributing to the delay of this small project share some attributes with those found in larger organizations, there are important differences. The significance of these differences for groups implementing small medical information systems is discussed.

  2. Progress of the Photovoltaic Technology Incubator Project Towards an Enhanced U.S. Manufacturing Base: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ullal, H.; Mitchell, R.; Keyes, B.

    In this paper, we report on the major accomplishments of the U.S. Department of Energy's (DOE) Solar Energy Technologies Program (SETP) Photovoltaic (PV) Technology Incubator project. The Incubator project facilitates a company's transition from developing a solar cell or PV module prototype to pilot- and large-scale U.S. manufacturing. The project targets small businesses that have demonstrated proof-of-concept devices or processes in the laboratory. Their success supports U.S. Secretary of Energy Steven Chu's SunShot Initiative, which seeks to achieve PV technologies that are cost-competitive without subsidies at large scale with fossil-based energy sources by the end of this decade. The Incubator Project has enhanced U.S. PV manufacturing capacity and created more than 1200 clean energy jobs, resulting in an increase in American economic competitiveness. The investment raised to date by these PV Incubator companies as a result of DOE's $59 million investment totals nearly $1.3 billion.

  4. Improving Future Ecosystem Benefits through Earth Observations: the H2020 Project ECOPOTENTIAL

    NASA Astrophysics Data System (ADS)

    Provenzale, Antonello; Beierkuhnlein, Carl; Ziv, Guy

    2016-04-01

    Terrestrial and marine ecosystems provide essential goods and services to human societies. In the last decades, however, anthropogenic pressures have caused serious threats to ecosystem integrity, functions and processes, potentially leading to the loss of essential ecosystem services. ECOPOTENTIAL is a large European-funded H2020 project which focuses its activities on a targeted set of internationally recognised protected areas in Europe, European Territories and beyond, blending Earth Observations from remote sensing and field measurements, data analysis and modelling of current and future ecosystem conditions and services. The definition of future scenarios is based on climate and land-use change projections, addressing the issue of uncertainties and uncertainty propagation across the modelling chain. The ECOPOTENTIAL project addresses cross-scale geosphere-biosphere interactions and landscape-ecosystem dynamics at regional to continental scales, using geostatistical methods and the emerging approaches of Macrosystem Ecology and Earth Critical Zone studies, and tackles long-term and large-scale environmental and ecological challenges. The project started its activities in 2015 by defining a set of storylines which allow us to tackle some of the most crucial issues in the assessment of present conditions and the estimation of the future state of selected ecosystem services. In this contribution, we focus on some of the main storylines of the project and discuss the general approach, focusing on the interplay of data and models and on the estimation of projection uncertainties.

  5. Prototype solar house. Study of the scientific evaluation and feasibility of a research and development project

    NASA Astrophysics Data System (ADS)

    Bundschuh, V.; Grueter, J. W.; Kleemann, M.; Melis, M.; Stein, H. J.; Wagner, H. J.; Dittrich, A.; Pohlmann, D.

    1982-08-01

    A preliminary study was undertaken before a large-scale project for the construction and survey of about a hundred solar houses was launched. The notion of a solar house was defined, and the possible uses of solar energy (hot water preparation, room heating, swimming pool heating, or a combination of these) were examined. A coherent measuring program was set up. Advantages and disadvantages of the large-scale project were reviewed. Hot water production, evaluation of different concepts and fabrications of solar systems, coverage of the different systems, energy conservation, failure frequency and failure statistics, durability of the installation, and investment, maintenance and energy costs were retained as study parameters. Different solar hot water production systems and the heat meter used for the measurements are described.

  6. Assessing the Feasibility of Large-Scale Countercyclical Public Job-Creation. Final Report, Volume III. Selected Implications of Public Job-Creation.

    ERIC Educational Resources Information Center

    Urban Inst., Washington, DC.

    This last of a three-volume report of a study done to assess the feasibility of large-scale, countercyclical public job creation covers the findings regarding the priorities among projects, indirect employment effects, skill imbalances, and administrative issues; and summarizes the overall findings, conclusions, and recommendations. (Volume 1,…

  7. First Large-Scale Proteogenomic Study of Breast Cancer Provides Insight into Potential Therapeutic Targets | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    News Release: May 25, 2016 — Building on data from The Cancer Genome Atlas (TCGA) project, a multi-institutional team of scientists has completed the first large-scale “proteogenomic” study of breast cancer, linking DNA mutations to protein signaling and helping pinpoint the genes that drive cancer.

  8. Projecting Images of the "Good" and the "Bad School": Top Scorers in Educational Large-Scale Assessments as Reference Societies

    ERIC Educational Resources Information Center

    Waldow, Florian

    2017-01-01

    Researchers interested in the global flow of educational ideas and programmes have long been interested in the role of so-called "reference societies." The article investigates how top scorers in large-scale assessments are framed as positive or negative reference societies in the education policy-making debate in German mass media and…

  9. Infrastructure for Large-Scale Quality-Improvement Projects: Early Lessons from North Carolina Improving Performance in Practice

    ERIC Educational Resources Information Center

    Newton, Warren P.; Lefebvre, Ann; Donahue, Katrina E.; Bacon, Thomas; Dobson, Allen

    2010-01-01

    Introduction: Little is known regarding how to accomplish large-scale health care improvement. Our goal is to improve the quality of chronic disease care in all primary care practices throughout North Carolina. Methods: Methods for improvement include (1) common quality measures and shared data system; (2) rapid cycle improvement principles; (3)…

  10. Environmental impact assessment and environmental audit in large-scale public infrastructure construction: the case of the Qinghai-Tibet Railway.

    PubMed

    He, Guizhen; Zhang, Lei; Lu, Yonglong

    2009-09-01

    Large-scale public infrastructure projects have featured in China's modernization course since the early 1980s. During the early stages of China's rapid economic development, public attention focused on the economic and social impact of high-profile construction projects. In recent years, however, we have seen a shift in public concern toward the environmental and ecological effects of such projects, and today governments are required to provide valid environmental impact assessments prior to allowing large-scale construction. The official requirement for the monitoring of environmental conditions has led to an increased number of debates in recent years regarding the effectiveness of Environmental Impact Assessments (EIAs) and Governmental Environmental Audits (GEAs) as environmental safeguards in instances of large-scale construction. Although EIA and GEA are conducted by different institutions and have different goals and enforcement potential, these two practices can be closely related in terms of methodology. This article cites the construction of the Qinghai-Tibet Railway as an instance in which EIA and GEA offer complementary approaches to environmental impact management. This study concludes that the GEA approach can serve as an effective follow-up to the EIA and establishes that the EIA lays a base for conducting future GEAs. The relationship that emerges through a study of the Railway's construction calls for more deliberate institutional arrangements and cooperation if the two practices are to be used in concert to optimal effect.

  11. Small-scale response in an avian community to a large-scale thinning project in the southwestern United States

    Treesearch

    Karen E. Bagne; Deborah M. Finch

    2009-01-01

    Avian populations were monitored using point counts from 2002 to 2007, two years before and four years after a 2800 ha fuel reduction project. The study area was within a ponderosa pine forest near Santa Fe, New Mexico, USA. Adjacent unthinned areas were also monitored as a reference for population variation related to other factors. For individual bird species...

  12. How do glacier inventory data aid global glacier assessments and projections?

    NASA Astrophysics Data System (ADS)

    Hock, R.

    2017-12-01

    Large-scale glacier modeling relies heavily on datasets that are collected by many individuals across the globe, but managed and maintained in a coordinated fashion by international data centers. The Global Terrestrial Network for Glaciers (GTN-G) provides the framework for coordinating and making available a suite of data sets such as the Randolph Glacier Inventory (RGI), the Glacier Thickness Dataset or the World Glacier Inventory (WGI). These datasets have greatly increased our ability to assess global-scale glacier mass changes. These data have also been vital for projecting the mass changes of all mountain glaciers in the world outside the Greenland and Antarctic ice sheets, in total >200,000 glaciers covering an area of more than 700,000 km2. Using forcing from 8 to 15 GCMs and 4 different emission scenarios, global-scale glacier evolution models project multi-model mean net mass losses of all glaciers between 7 cm and 24 cm sea-level equivalent by the end of the 21st century. Projected mass losses vary greatly depending on the choice of the forcing climate and emission scenario. Insufficiently constrained model parameters are likely an important reason for the large differences found among these studies even when they are forced by the same emission scenario, especially on regional scales.
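
    The sea-level-equivalent figures above follow from a standard conversion: roughly 362 Gt of ice mass loss raises global mean sea level by about 1 mm (the exact factor depends on the assumed ocean area). A minimal sketch of the conversion and the multi-model averaging, with made-up per-model mass losses:

```python
# Approximate Gt of ice mass loss per mm of global mean sea-level rise.
# The value is an assumption; published studies use ~361-363 Gt/mm.
GT_PER_MM_SLE = 362.0

def mass_loss_to_sle_cm(mass_loss_gt):
    """Convert a cumulative glacier mass loss (Gt) to sea-level equivalent (cm)."""
    return mass_loss_gt / GT_PER_MM_SLE / 10.0

def multi_model_mean_sle_cm(per_model_losses_gt):
    """Multi-model mean sea-level contribution (cm) from per-model losses (Gt)."""
    sles = [mass_loss_to_sle_cm(m) for m in per_model_losses_gt]
    return sum(sles) / len(sles)
```

    For example, hypothetical per-model losses of 36,200 Gt and 72,400 Gt correspond to 10 cm and 20 cm sea-level equivalent, giving a multi-model mean of 15 cm.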

  13. Nanomanufacturing : nano-structured materials made layer-by-layer.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, James V.; Cheng, Shengfeng; Grest, Gary Stephen

    Large-scale, high-throughput production of nano-structured materials (i.e. nanomanufacturing) is a strategic area in manufacturing, with markets projected to exceed $1T by 2015. Nanomanufacturing is still in its infancy; process/product developments are costly and only touch on the potential opportunities enabled by growing nanoscience discoveries. The greatest promise for high-volume manufacturing lies in age-old coating and imprinting operations. For materials with tailored nm-scale structure, imprinting/embossing must be achieved at high speeds (roll-to-roll) and/or over large areas (batch operation) with feature sizes less than 100 nm. Dispersion coatings with nanoparticles can also tailor structure through self- or directed-assembly. Layered films structured with these processes have tremendous potential for efficient manufacturing of microelectronics, photovoltaics and other topical nano-structured devices. This project is designed to perform the requisite R and D to bring Sandia's technology base in computational mechanics to bear on this scale-up problem. Project focus is enforced by addressing a promising imprinting process currently being commercialized.

  14. Growing up and Growing out: Emerging Adults Learn Management through Service-Learning

    ERIC Educational Resources Information Center

    Fairfield, Kent D.

    2010-01-01

    This article describes a journey introducing service-learning based on large-scale projects in an undergraduate management curriculum, leading to supplementing this approach with more conventional small-group projects. It outlines some of the foundation for service-learning. Having students undertake a single class-wide project offers distinctive…

  15. Primary Teachers Conducting Inquiry Projects: Effects on Attitudes towards Teaching Science and Conducting Inquiry

    ERIC Educational Resources Information Center

    van Aalderen-Smeets, Sandra I.; Walma van der Molen, Juliette H.; van Hest, Erna G. W. C. M.; Poortman, Cindy

    2017-01-01

    This study used an experimental, pretest-posttest control group design to investigate whether participation in a large-scale inquiry project would improve primary teachers' attitudes towards teaching science and towards conducting inquiry. The inquiry project positively affected several elements of teachers' attitudes. Teachers felt less anxious…

  16. Geomorphic analysis of large alluvial rivers

    NASA Astrophysics Data System (ADS)

    Thorne, Colin R.

    2002-05-01

    Geomorphic analysis of a large river presents particular challenges and requires a systematic and organised approach because of the spatial scale and system complexity involved. This paper presents a framework and blueprint for geomorphic studies of large rivers developed in the course of basic, strategic and project-related investigations of a number of large rivers. The framework demonstrates the need to begin geomorphic studies early in the pre-feasibility stage of a river project and carry them through to implementation and post-project appraisal. The blueprint breaks down the multi-layered and multi-scaled complexity of a comprehensive geomorphic study into a number of well-defined and semi-independent topics, each of which can be performed separately to produce a clearly defined, deliverable product. Geomorphology increasingly plays a central role in multi-disciplinary river research and the importance of effective quality assurance makes it essential that audit trails and quality checks are hard-wired into study design. The structured approach presented here provides output products and production trails that can be rigorously audited, ensuring that the results of a geomorphic study can stand up to the closest scrutiny.

  17. Subsurface Monitoring of CO2 Sequestration - A Review and Look Forward

    NASA Astrophysics Data System (ADS)

    Daley, T. M.

    2012-12-01

    The injection of CO2 into subsurface formations is at least 50 years old, with large-scale utilization of CO2 for enhanced oil recovery (CO2-EOR) beginning in the 1970s. Early monitoring efforts had limited measurements in available boreholes. With growing interest in CO2 sequestration beginning in the 1990s, along with growth in geophysical reservoir monitoring, small to mid-size sequestration monitoring projects began to appear. The overall goals of a subsurface monitoring plan are to provide measurement of CO2-induced changes in subsurface properties at a range of spatial and temporal scales. The range of spatial scales allows tracking of the location and saturation of the plume with varying detail, while finer temporal sampling (up to continuous) allows better understanding of dynamic processes (e.g. multi-phase flow) and constraining of reservoir models. Early monitoring of small-scale pilots associated with CO2-EOR (e.g., the McElroy field and the Lost Hills field) developed many of the methodologies, including tomographic imaging and multi-physics measurements. Large (reservoir) scale sequestration monitoring began with the Sleipner and Weyburn projects. Typically, large-scale monitoring, such as 4D surface seismic, has limited temporal sampling due to costs. Smaller-scale pilots can allow more frequent measurements, either as individual time-lapse 'snapshots' or as continuous monitoring. Pilot monitoring examples include the Frio, Nagaoka and Otway pilots, using repeated well logging, crosswell imaging, vertical seismic profiles and CASSM (continuous active-source seismic monitoring). For saline reservoir sequestration projects, characterization and monitoring are typically integrated, since the sites are not pre-characterized resource developments (oil or gas), which reinforces the need for multi-scale measurements. As we move beyond pilot sites, we need to quantify CO2 plume and reservoir properties (e.g. pressure) over large scales, while still obtaining high resolution. Typically the high-resolution (spatial and temporal) tools are deployed in permanent or semi-permanent borehole installations, where special well design may be necessary, such as non-conductive casing for electrical surveys. Effective utilization of monitoring wells requires an approach of modular borehole monitoring (MBM) where multiple measurements can be made. An example is recent work at the Citronelle pilot injection site, where an MBM package with seismic, fluid sampling and distributed fiber sensing was deployed. For future large-scale sequestration monitoring, an adaptive borehole-monitoring program is proposed.

  18. Comparison of WinSLAMM Modeled Results with Monitored Biofiltration Data

    EPA Science Inventory

    The US EPA’s Green Infrastructure Demonstration project in Kansas City incorporates both small-scale individual biofiltration device monitoring and large-scale watershed monitoring. The test watershed (100 acres) is saturated with green infrastructure components (includin...

  19. Evaluating a collaborative IT based research and development project.

    PubMed

    Khan, Zaheer; Ludlow, David; Caceres, Santiago

    2013-10-01

    In common with all projects, evaluating an Information Technology (IT) based research and development project is necessary in order to discover whether or not the outcomes of the project are successful. However, evaluating large-scale collaborative projects is especially difficult as: (i) stakeholders from different countries are involved who, almost inevitably, have diverse technological and/or application domain backgrounds and objectives; (ii) multiple and sometimes conflicting application-specific and user-defined requirements exist; and (iii) multiple and often conflicting technological research and development objectives are apparent. In this paper, we share our experiences based on the large-scale integrated research project - the HUMBOLDT project - with a project duration of 54 months, involving contributions from 27 partner organisations plus 4 sub-contractors from 14 different European countries. In the HUMBOLDT project, a specific evaluation methodology was defined and utilised for the user evaluation of the project outcomes. The user evaluation performed on the HUMBOLDT Framework and its associated nine application scenarios from various application domains not only resulted in an evaluation of the integrated project, but also revealed the benefits and disadvantages of the evaluation methodology. This paper presents the evaluation methodology, discusses in detail the process of applying it to the HUMBOLDT project and provides an in-depth analysis of the results, which can be usefully applied to other collaborative research projects in a variety of domains. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. MIGHTEE: The MeerKAT International GHz Tiered Extragalactic Exploration

    NASA Astrophysics Data System (ADS)

    Taylor, A. Russ; Jarvis, Matt

    2017-05-01

    The MeerKAT telescope is the precursor of the Square Kilometre Array mid-frequency dish array to be deployed on the African continent later this decade. MIGHTEE is one of the MeerKAT large survey projects designed to pathfind SKA key science in cosmology and galaxy evolution. Through a tiered radio continuum deep imaging project, including several fields totaling 20 square degrees imaged to microJy sensitivities and an ultra-deep image of a single 1 square degree field of view, MIGHTEE will explore dark matter and large-scale structure, the evolution of galaxies, including AGN activity and star formation as a function of cosmic time and environment, the emergence and evolution of magnetic fields in galaxies, and the magnetic counterpart to the large-scale structure of the universe.

  1. Analysis on the Critical Rainfall Value For Predicting Large Scale Landslides Caused by Heavy Rainfall In Taiwan.

    NASA Astrophysics Data System (ADS)

    Tsai, Kuang-Jung; Chiang, Jie-Lun; Lee, Ming-Hsi; Chen, Yie-Ruey

    2017-04-01

    The accumulated rainfall brought by Typhoon Morakot in August 2009 exceeded 2,900 mm within three consecutive days. Very serious landslides and sediment-related disasters were induced by this heavy rainfall event. A satellite image analysis project conducted by the Soil and Water Conservation Bureau after the Morakot event identified more than 10,904 landslide sites with a total sliding area of 18,113 ha. At the same time, all severe sediment-related disaster areas in southern Taiwan were characterized by disaster type, scale, topography, major bedrock formations, and geologic structures during the period of extremely heavy rainfall events. The characteristics and mechanisms of large-scale landslides were collected on the basis of field investigation technology integrated with GPS/GIS/RS techniques. In order to decrease the risk of large-scale landslides on slope land, a slope-land conservation strategy and a critical rainfall database should be established and put into effect as soon as possible. Meanwhile, establishing a critical rainfall value for predicting large-scale landslides induced by heavy rainfall has become an important issue of serious concern to the government and all people living in Taiwan. 
The mechanisms of large-scale landslides, rainfall frequency analysis, sediment budget estimation, and river hydraulic analysis under the extreme climate change conditions of the past 10 years are addressed by this research. Hopefully, all results developed from this research can be used as a warning system for predicting large-scale landslides in southern Taiwan. Keywords: Heavy Rainfall, Large Scale, Landslides, Critical Rainfall Value

  2. Evaluation of Subgrid-Scale Models for Large Eddy Simulation of Compressible Flows

    NASA Technical Reports Server (NTRS)

    Blaisdell, Gregory A.

    1996-01-01

    The objective of this project was to evaluate and develop subgrid-scale (SGS) turbulence models for large eddy simulations (LES) of compressible flows. During the first phase of the project, results from LES using the dynamic SGS model were compared to those of direct numerical simulations (DNS) of compressible homogeneous turbulence. The second phase of the project involved implementing the dynamic SGS model in a NASA code for simulating supersonic flow over a flat plate. The model has been successfully coded and a series of simulations has been completed. One of the major findings of the work is that numerical errors associated with the finite differencing scheme used in the code can overwhelm the SGS model and adversely affect the LES results. Attached to this overview are three submitted papers: 'Evaluation of the Dynamic Model for Simulations of Compressible Decaying Isotropic Turbulence'; 'The effect of the formulation of nonlinear terms on aliasing errors in spectral methods'; and 'Large-Eddy Simulation of a Spatially Evolving Compressible Boundary Layer Flow'.

  3. Projection Effects of Large-scale Structures on Weak-lensing Peak Abundances

    NASA Astrophysics Data System (ADS)

    Yuan, Shuo; Liu, Xiangkun; Pan, Chuzhong; Wang, Qiao; Fan, Zuhui

    2018-04-01

    High peaks in weak lensing (WL) maps originate dominantly from the lensing effects of single massive halos. Their abundance is therefore closely related to the halo mass function and thus a powerful cosmological probe. However, besides individual massive halos, large-scale structures (LSS) along lines of sight also contribute to the peak signals. In this paper, with ray-tracing simulations, we investigate the LSS projection effects. We show that for current surveys with a large shape noise, the stochastic LSS effects are subdominant. For future WL surveys with source galaxies having a median redshift z med ∼ 1 or higher, however, they are significant. For the cosmological constraints derived from observed WL high-peak counts, severe biases can occur if the LSS effects are not taken into account properly. We extend the model of Fan et al. by incorporating the LSS projection effects into the theoretical considerations. By comparing with simulation results, we demonstrate the good performance of the improved model and its applicability in cosmological studies.

  4. A Large-Scale Inquiry-Based Astronomy Intervention Project: Impact on Students' Content Knowledge Performance and Views of Their High School Science Classroom

    ERIC Educational Resources Information Center

    Fitzgerald, Michael; McKinnon, David H.; Danaia, Lena; Deehan, James

    2016-01-01

    In this paper, we present the results from a study of the impact on students involved in a large-scale inquiry-based astronomical high school education intervention in Australia. Students in this intervention were led through an educational design allowing them to undertake an investigative approach to understanding the lifecycle of stars more…

  5. Establishment of a National Wind Energy Center at University of Houston

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Su Su

    The DOE-supported project objectives are to: establish a national wind energy center (NWEC) at University of Houston and conduct research to address critical science and engineering issues for the development of future large MW-scale wind energy production systems, especially offshore wind turbines. The goals of the project are to: (1) establish a sound scientific/technical knowledge base of solutions to critical science and engineering issues for developing future MW-scale large wind energy production systems, (2) develop a state-of-the-art wind rotor blade research facility at the University of Houston, and (3) through multi-disciplinary research, introduce technology innovations in advanced wind-turbine materials, processing/manufacturing technology, design and simulation, and testing and reliability assessment methods related to future wind turbine systems for cost-effective production of offshore wind energy. To achieve the goals of the project, the following technical tasks were planned and executed during the period from April 15, 2010 to October 31, 2014 at the University of Houston: (1) Basic research on large offshore wind turbine systems (2) Applied research on innovative wind turbine rotors for large offshore wind energy systems (3) Integration of offshore wind-turbine design, advanced materials and manufacturing technologies (4) Integrity and reliability of large offshore wind turbine blades and scaled model testing (5) Education and training of graduate and undergraduate students and post-doctoral researchers (6) Development of a national offshore wind turbine blade research facility. The research program addresses both basic science and engineering of current and future large wind turbine systems, especially offshore wind turbines, for MW-scale power generation. 
The results of the research advance current understanding of many important scientific issues and provide technical information for developing future large wind turbines with advanced designs, composite materials, integrated manufacturing, and structural reliability and integrity. The educational program has trained many graduate and undergraduate students and post-doctoral researchers in the critical science and engineering of wind energy production systems through graduate-level courses, research, and participation in the center's large multi-disciplinary research projects. These students and researchers are now employed by the wind industry, national labs, and universities, supporting the US and international wind energy industry. The national offshore wind turbine blade research facility developed in the project has supported the technical and training tasks planned in the program, and is a national asset available for use by domestic and international researchers in the wind energy arena.

  6. The topology of galaxy clustering.

    NASA Astrophysics Data System (ADS)

    Coles, P.; Plionis, M.

    The authors discuss an objective method for quantifying the topology of the galaxy distribution using only projected galaxy counts. The method is a useful complement to fully three-dimensional studies of topology based on the genus by virtue of the enormous projected data sets available. Applying the method to the Lick counts they find no evidence for large-scale non-gaussian behaviour, whereas the small-scale distribution is strongly non-gaussian, with a shift in the meatball direction.

  7. A streamlined collecting and preparation protocol for DNA barcoding of Lepidoptera as part of large-scale rapid biodiversity assessment projects, exemplified by the Indonesian Biodiversity Discovery and Information System (IndoBioSys).

    PubMed

    Schmidt, Olga; Hausmann, Axel; Cancian de Araujo, Bruno; Sutrisno, Hari; Peggie, Djunijanti; Schmidt, Stefan

    2017-01-01

    Here we present a general collecting and preparation protocol for DNA barcoding of Lepidoptera as part of large-scale rapid biodiversity assessment projects, and a comparison with alternative preserving and vouchering methods. About 98% of the sequenced specimens processed using the present collecting and preparation protocol yielded sequences with more than 500 base pairs. The study is based on the first outcomes of the Indonesian Biodiversity Discovery and Information System (IndoBioSys). IndoBioSys is a German-Indonesian research project that is conducted by the Museum für Naturkunde in Berlin and the Zoologische Staatssammlung München, in close cooperation with the Research Center for Biology - Indonesian Institute of Sciences (RCB-LIPI, Bogor).

  8. Kingsbury Bay-Grassy Point habitat restoration project: A Health Impact Assessment-oral presentation

    EPA Science Inventory

    Undertaking large-scale aquatic habitat restoration projects in prominent waterfront locations, such as city parks, provides an opportunity to both improve ecological integrity and enhance community well-being. However, to consider both opportunities simultaneously, a community-b...

  9. Setting Learning Analytics in Context: Overcoming the Barriers to Large-Scale Adoption

    ERIC Educational Resources Information Center

    Ferguson, Rebecca; Macfadyen, Leah P.; Clow, Doug; Tynan, Belinda; Alexander, Shirley; Dawson, Shane

    2014-01-01

    A core goal for most learning analytic projects is to move from small-scale research towards broader institutional implementation, but this introduces a new set of challenges because institutions are stable systems, resistant to change. To avoid failure and maximize success, implementation of learning analytics at scale requires explicit and…

  10. Multi-Scale Models for the Scale Interaction of Organized Tropical Convection

    NASA Astrophysics Data System (ADS)

    Yang, Qiu

    Assessing the upscale impact of organized tropical convection from small spatial and temporal scales is a research imperative, not only for having a better understanding of the multi-scale structures of dynamical and convective fields in the tropics, but also for eventually helping in the design of new parameterization strategies to improve the next-generation global climate models. Here self-consistent multi-scale models are derived systematically by following the multi-scale asymptotic methods and used to describe the hierarchical structures of tropical atmospheric flows. The advantages of using these multi-scale models lie in isolating the essential components of multi-scale interaction and providing assessment of the upscale impact of the small-scale fluctuations onto the large-scale mean flow through eddy flux divergences of momentum and temperature in a transparent fashion. Specifically, this thesis includes three research projects about multi-scale interaction of organized tropical convection, involving tropical flows at different scaling regimes and utilizing different multi-scale models correspondingly. Inspired by the observed variability of tropical convection on multiple temporal scales, including daily and intraseasonal time scales, the goal of the first project is to assess the intraseasonal impact of the diurnal cycle on the planetary-scale circulation such as the Hadley cell. As an extension of the first project, the goal of the second project is to assess the intraseasonal impact of the diurnal cycle over the Maritime Continent on the Madden-Julian Oscillation. In the third project, the goals are to simulate the baroclinic aspects of the ITCZ breakdown and assess its upscale impact on the planetary-scale circulation over the eastern Pacific. These simple multi-scale models should be useful to understand the scale interaction of organized tropical convection and help improve the parameterization of unresolved processes in global climate models.

  11. Northeastern Oregon bark beetle control project 1910-11.

    Treesearch

    H.E. Burke

    1990-01-01

    This history, from the memoirs of the entomologist in charge, describes the first large-scale cooperative bark beetle control project funded by Congress in the Western United States. It describes relations between the Forest Service, Bureau of Entomology, and private timber owners, how the project was organized and conducted, and results of the control measures. The...

  12. The Galics Project: Virtual Galaxy: from Cosmological N-body Simulations

    NASA Astrophysics Data System (ADS)

    Guiderdoni, B.

    The GalICS project develops extensive semi-analytic post-processing of large cosmological simulations to describe hierarchical galaxy formation. The multiwavelength statistical properties of high-redshift and local galaxies are predicted within the large-scale structures. The fake catalogs and mock images that are generated from the outputs are used for the analysis and preparation of deep surveys. The whole set of results is now available in an on-line database that can be easily queried. The GalICS project represents a first step towards a 'Virtual Observatory of virtual galaxies'.

  13. Large scale afforestation projects mitigate degradation and increase the stability of the karst ecosystems in southwest China

    NASA Astrophysics Data System (ADS)

    Yue, Y.; Tong, X.; Wang, K.; Fensholt, R.; Brandt, M.

    2017-12-01

    With the aim of combating desertification and improving the ecological environment, mega-engineering afforestation projects were launched in the karst regions of southwest China around the turn of the new millennium. A positive impact of these projects on vegetation cover has been shown; however, it remains unclear whether conservation efforts have been able to effectively restore ecosystem properties and reduce the sensitivity of the karst ecosystem to climate variations at large scales. Here we use passive microwave and optical satellite time series data combined with the ecosystem model LPJ-GUESS and show widespread increase in vegetation cover with a clear demarcation at the Chinese national border contrasting the conditions of neighboring countries. We apply a breakpoint detection to identify permanent changes in vegetation time series and assess the vegetation's sensitivity to climate before and after the breakpoints. A majority (74%) of the breakpoints were detected between 2001 and 2004 and are remarkably in line with the implementation and spatial extent of the Grain to Green project. We stratify the counties of the study area into four groups according to the extent of Grain to Green conservation areas and find distinct differences between the groups. Vegetation trends are similar prior to afforestation activities (1982-2000), but clearly diverge at a later stage, following the spatial extent of conservation areas. Moreover, vegetation cover dynamics were increasingly decoupled from climatic influence in areas of high conservation efforts. Whereas both vegetation resilience and resistance were considerably improved in areas with large conservation efforts, thereby showing an increase in ecosystem stability, ongoing degradation and an amplified sensitivity to climate variability were found in areas with limited project implementation. 
Our study concludes that large-scale conservation projects can regionally contribute to a greening Earth and are able to mitigate desertification by increasing vegetation cover and reducing the ecosystem's sensitivity to climate change; however, degradation remains a serious issue in the karst ecosystem of southwest China.
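The breakpoint detection described in this record can be illustrated with a minimal sketch: choose the split of a vegetation time series that minimizes the summed squared error of two constant-mean segments. The data below are synthetic, and this simple single-breakpoint search is a stand-in for, not a reproduction of, the study's actual detection procedure:

```python
import numpy as np

# Synthetic NDVI-like annual series: stable cover, then a jump after
# a hypothetical afforestation intervention (values invented).
series = np.array([0.30, 0.31, 0.29, 0.30, 0.32, 0.45, 0.47, 0.48, 0.46, 0.47])

def single_breakpoint(y, min_seg=2):
    """Index that best splits y into two constant-mean segments (min total SSE)."""
    best_idx, best_sse = None, np.inf
    for i in range(min_seg, len(y) - min_seg + 1):
        left, right = y[:i], y[i:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_idx, best_sse = i, sse
    return best_idx

print(single_breakpoint(series))  # 5 -> permanent change between years 4 and 5
```

Real analyses (e.g. of 1982-2015 satellite records) use more robust piecewise-trend methods, but the principle of locating the split that best explains a permanent shift is the same.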

  14. [Privacy and public benefit in using large scale health databases].

    PubMed

    Yamamoto, Ryuichi

    2014-01-01

    In Japan, large-scale health databases, such as the national health insurance claims and health checkup database (NDB) and the Japanese Sentinel project, were constructed within a few years. However, legal issues remain in striking an adequate balance between privacy and public benefit in using such databases. The NDB operates under the act on health care for elderly persons, but this act says nothing about using the database for general public benefit. Researchers who use this database are therefore forced to devote great attention to anonymization and information security, which may hamper the research work itself. The Japanese Sentinel project is a national project for detecting adverse drug reactions using large-scale distributed clinical databases of large hospitals. Although patients give broad consent to future use for the public good, the use of insufficiently anonymized data is still under discussion. Generally speaking, research for public benefit does not infringe patients' privacy, but vague and complex legal requirements on personal data protection may obstruct such research. Medical science does not progress without using clinical information; therefore, adequate legislation that is simple and clear for both researchers and patients is strongly required. In Japan, a specific act balancing privacy and public benefit is now under discussion. The author recommends that researchers, including those in the field of pharmacology, pay attention to, participate in the discussion of, and make suggestions for such acts and regulations.

  15. Big questions, big science: meeting the challenges of global ecology.

    PubMed

    Schimel, David; Keller, Michael

    2015-04-01

    Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment and larger data sets than a single investigator's or group of investigators' labs can collect, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on investment has to justify the opportunity cost: the science foregone because resources were expended on a large project rather than on a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring formal systems engineering and project management to mitigate the risk of failure. Mapping the scientific method onto formal project management requires both scientists able to work in that context and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies, under pressure from external and internal forces, are often pushed towards counterproductive project management, but a scientific community aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware, and engaged in the advocacy and governance of large ecological projects.

  16. Composites for Exploration Upper Stage

    NASA Technical Reports Server (NTRS)

    Fikes, J. C.; Jackson, J. R.; Richardson, S. W.; Thomas, A. D.; Mann, T. O.; Miller, S. G.

    2016-01-01

    The Composites for Exploration Upper Stage (CEUS) was a 3-year, level III project within the Technology Demonstration Missions program of the NASA Space Technology Mission Directorate. Studies have shown that composites provide important programmatic enhancements, including reduced weight to increase capability and accelerated expansion of exploration and science mission objectives. The CEUS project was focused on technologies that best advanced innovation, infusion, and broad applications for the inclusion of composites on future large human-rated launch vehicles and spacecraft. The benefits included near- and far-term opportunities for infusion (NASA, industry/commercial, Department of Defense), demonstrated critical technologies and technically implementable evolvable innovations, and sustained Agency experience. The initial scope of the project was to advance technologies for large composite structures applicable to the Space Launch System (SLS) Exploration Upper Stage (EUS) by focusing on the affordability and technical performance of the EUS forward and aft skirts. The project was tasked to develop and demonstrate critical composite technologies with a focus on full-scale materials, design, manufacturing, and test using NASA in-house capabilities. This would have demonstrated a major advancement in confidence and matured the large-scale composite technology to a Technology Readiness Level 6. This project would, therefore, have bridged the gap for providing composite application to SLS upgrades, enabling future exploration missions.

  17. Analyzing large-scale proteomics projects with latent semantic indexing.

    PubMed

    Klie, Sebastian; Martens, Lennart; Vizcaíno, Juan Antonio; Côté, Richard; Jones, Phil; Apweiler, Rolf; Hinneburg, Alexander; Hermjakob, Henning

    2008-01-01

    Since the advent of public data repositories for proteomics data, readily accessible results from high-throughput experiments have been accumulating steadily. Several large-scale projects in particular have contributed substantially to the amount of identifications available to the community. Despite the considerable body of information amassed, very few successful analyses have been performed and published on these data, leaving the ultimate value of these projects far below their potential. A prominent reason why published proteomics data are seldom reanalyzed lies in the heterogeneous nature of the original sample collection and the subsequent data recording and processing. To illustrate that at least part of this heterogeneity can be compensated for, we here apply a latent semantic analysis to the data contributed by the Human Proteome Organization's Plasma Proteome Project (HUPO PPP). Interestingly, despite the broad spectrum of instruments and methodologies applied in the HUPO PPP, our analysis reveals several obvious patterns that can be used to formulate concrete recommendations for optimizing proteomics project planning as well as the choice of technologies used in future experiments. It is clear from these results that the analysis of large bodies of publicly available proteomics data by noise-tolerant algorithms such as latent semantic analysis holds great promise and is currently underexploited.
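The core of latent semantic analysis is a truncated singular value decomposition of a term-document matrix; documents that share terms end up close together in the reduced latent space even when the raw data are noisy. A minimal numpy sketch on an invented toy matrix (not the actual HUPO PPP data, where rows would be identifications and columns experiments):

```python
import numpy as np

# Toy term-document matrix (rows: terms/identifications, columns: experiments).
# Columns 0-1 and columns 2-3 form two loosely overlapping groups.
X = np.array([
    [3, 2, 0, 0],
    [2, 3, 0, 1],
    [0, 0, 3, 2],
    [0, 1, 2, 3],
], dtype=float)

# Truncated SVD: keep only the k largest singular values/vectors.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
docs_latent = (np.diag(s[:k]) @ Vt[:k]).T  # each row: one document in latent space

def cosine(a, b):
    """Cosine similarity between two latent-space vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Documents 0 and 1 share terms, so they land close together;
# documents 0 and 2 share none, so they stay far apart.
print(cosine(docs_latent[0], docs_latent[1]))  # high (close to 1)
print(cosine(docs_latent[0], docs_latent[2]))  # low
```

Keeping only the top singular components is what gives the method its noise tolerance: idiosyncrasies of individual experiments fall into the discarded components.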

  18. An overview of the Hadoop/MapReduce/HBase framework and its current applications in bioinformatics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Ronald C.

    Bioinformatics researchers are increasingly confronted with analysis of ultra-large-scale data sets, a problem that will only grow at an alarming rate in coming years. Recent developments in open source software, that is, the Hadoop project and associated software, provide a foundation for scaling to petabyte-scale data warehouses on Linux clusters, providing fault-tolerant parallelized analysis of such data using a programming style named MapReduce. An overview is given of the current usage within the bioinformatics community of Hadoop, a top-level Apache Software Foundation project, and of associated open source software projects. The concepts behind Hadoop and the associated HBase project are defined, and current bioinformatics software that employs Hadoop is described. The focus is on next-generation sequencing, as the leading application area to date.
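The MapReduce programming style mentioned above can be shown with a minimal single-process sketch: plain Python standing in for the Hadoop framework, which would run the same three phases fault-tolerantly across a cluster. The word count over toy sequence reads is the canonical introductory example, not bioinformatics-grade code:

```python
from collections import defaultdict

def map_phase(records):
    """Mapper: emit a (word, 1) pair for every word in each input record."""
    for record in records:
        for word in record.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    """Shuffle: group all emitted values by key, as the framework does
    between the map and reduce phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reducer: combine the grouped values per key (here, sum the counts)."""
    return {word: sum(counts) for word, counts in grouped.items()}

reads = ["GATTACA GATTACA", "gattaca TTAG"]
counts = reduce_phase(shuffle(map_phase(reads)))
print(counts)  # {'gattaca': 3, 'ttag': 1}
```

In real Hadoop jobs the mapper and reducer are the user-supplied code, while input splitting, the shuffle, and recovery from node failures are handled by the framework; that division of labor is what makes the model scale to petabyte data sets.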

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hite, Roger

    The project site is located in Livingston Parish, Louisiana, approximately 26 miles due east of Baton Rouge. This project proposed to evaluate an early Eocene-aged Wilcox oil reservoir for permanent storage of CO2. Blackhorse Energy, LLC planned to conduct a parallel CO2 oil recovery project in the First Wilcox Sand. The primary focus of this project was to examine and prove the suitability of South Louisiana geologic formations for large-scale geologic sequestration of CO2 in association with enhanced oil recovery applications. This was to be accomplished through the focused demonstration of small-scale, permanent storage of CO2 in the First Wilcox Sand. The project was terminated at the request of Blackhorse Energy LLC on October 22, 2014.

  20. Algorithm of OMA for large-scale orthology inference

    PubMed Central

    Roth, Alexander CJ; Gonnet, Gaston H; Dessimoz, Christophe

    2008-01-01

    Background OMA is a project that aims to identify orthologs within publicly available, complete genomes. With 657 genomes analyzed to date, OMA is one of the largest projects of its kind. Results The algorithm of OMA improves upon the standard bidirectional best-hit approach in several respects: it uses evolutionary distances instead of scores, considers distance inference uncertainty, includes many-to-many orthologous relations, and accounts for differential gene losses. Herein, we describe in detail the algorithm for inference of orthology and provide the rationale for parameter selection through multiple tests. Conclusion OMA contains several novel improvements for orthology inference and provides a unique dataset of large-scale orthology assignments. PMID:19055798
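The baseline that OMA improves upon, the bidirectional best-hit approach, can be sketched in a few lines: gene a in genome A and gene b in genome B are called orthologs when each is the other's closest match. The distance values below are hypothetical, and OMA's actual algorithm goes further (distance uncertainty tolerance, many-to-many relations, handling of differential gene loss):

```python
# Toy evolutionary-distance table between genes of genome A (a1..a3)
# and genome B (b1..b3); smaller means closer. Values are invented.
dist = {
    ("a1", "b1"): 0.1, ("a1", "b2"): 0.9, ("a1", "b3"): 0.8,
    ("a2", "b1"): 0.7, ("a2", "b2"): 0.2, ("a2", "b3"): 0.6,
    ("a3", "b1"): 0.8, ("a3", "b2"): 0.3, ("a3", "b3"): 0.4,
}
genes_a = ["a1", "a2", "a3"]
genes_b = ["b1", "b2", "b3"]

def best_hit(query, targets, key):
    """Closest target gene for a query, by evolutionary distance."""
    return min(targets, key=lambda t: dist[key(query, t)])

def bidirectional_best_hits(genes_a, genes_b):
    """Pairs (a, b) where a's best hit is b AND b's best hit is a."""
    pairs = []
    for a in genes_a:
        b = best_hit(a, genes_b, lambda q, t: (q, t))
        if best_hit(b, genes_a, lambda q, t: (t, q)) == a:
            pairs.append((a, b))
    return pairs

print(bidirectional_best_hits(genes_a, genes_b))  # [('a1', 'b1'), ('a2', 'b2')]
```

Note that a3 is left unpaired: its best hit b2 prefers a2, which is exactly the kind of near-tie (0.2 vs 0.3) where OMA's tolerance for distance inference uncertainty changes the outcome relative to strict BBH.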

  1. In Defense of the National Labs and Big-Budget Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodwin, J R

    2008-07-29

    The purpose of this paper is to present the unofficial and unsanctioned opinions of a Visiting Scientist at Lawrence Livermore National Laboratory on the values of LLNL and the other National Labs. The basic founding value and goal of the National Labs is big-budget scientific research, along with smaller-budget scientific research that cannot easily be done elsewhere. The most important example in the latter category is classified defense-related research. The historical guiding light here is the Manhattan Project. This endeavor was unique in human history, and might remain so. The scientific expertise and wealth of an entire nation was tapped in a project that was huge beyond reckoning, with no advance guarantee of success. It was in many respects a clash of scientific titans, with a large supporting cast, collaborating toward a single well-defined goal. Never had scientists received so much respect, so much money, and so much intellectual freedom to pursue scientific progress. And never was the gap between theory and implementation so rapidly narrowed, with results that changed the world, completely. Enormous resources are spent at the national or international level on large-scale scientific projects. LLNL has the most powerful computer in the world, Blue Gene/L. (Oops, Los Alamos just seized the title with Roadrunner; such titles regularly change hands.) LLNL also has the largest laser in the world, the National Ignition Facility (NIF). Lawrence Berkeley National Lab (LBNL) has the most powerful microscope in the world. Not only is it beyond the resources of most large corporations to make such expenditures, but the risk exceeds the possible rewards for those corporations that could. Nor can most small countries afford to finance large scientific projects, and not even the richest can afford largess, especially if Congress is under major budget pressure. 
Some big-budget research efforts are funded by international consortiums, such as the Large Hadron Collider (LHC) at CERN, and the International Tokamak Experimental Reactor (ITER) in Cadarache, France, a magnetic-confinement fusion research project. The post-WWII histories of particle and fusion physics contain remarkable examples of both international competition, with an emphasis on secrecy, and international cooperation, with an emphasis on shared knowledge and resources. Initiatives to share sometimes came from surprising directions. Most large-scale scientific projects have potential defense applications. NIF certainly does; it is primarily designed to create small-scale fusion explosions. Blue Gene/L operates in part in service to NIF, and in part to various defense projects. The most important defense projects include stewardship of the national nuclear weapons stockpile, and the proposed redesign and replacement of those weapons with fewer, safer, more reliable, longer-lived, and less apocalyptic warheads. Many well-meaning people will consider the optimal lifetime of a nuclear weapon to be zero, but most thoughtful people, when asked how much longer they think this nation will require them, will ask for some time to think. NIF is also designed to create exothermic small-scale fusion explosions. The malapropos 'exothermic' here is a convenience to cover a profusion of complexities, but the basic idea is that the explosions will create more recoverable energy than was used to create them. One can hope that the primary future benefits of success for NIF will be in cost-effective generation of electrical power through controlled small-scale fusion reactions, rather than in improved large-scale fusion explosions. Blue Gene/L also services climate research, genomic research, materials research, and a myriad of other computational problems that become more feasible, reliable, and precise the larger the number of computational nodes employed. 
Blue Gene/L has to be sited within a security complex for obvious reasons, but its value extends to the nation and the world. There is a duality here between large-scale scientific research machines and the supercomputers used to model them. An astounding example is illustrated in a graph released by EFDA-JET, in Oxfordshire, UK, presently the largest operating magnetic-confinement fusion experiment. The graph shows plasma confinement times (an essential performance parameter) for all the major tokamaks in the international fusion program, over their existing lifetimes. The remarkable thing about the data is not so much confinement-time versus date or scale, but the fact that the data are given for both the computer model predictions and the actual experimental measurements, and the two are in phenomenal agreement over the extended range of scales. Supercomputer models, sometimes operating with the intricacy of Schroedinger's equation at quantum physical scales, have become a costly but enormously cost-saving tool.

  2. A general method for large-scale fabrication of Cu nanoislands/dragonfly wing SERS flexible substrates

    NASA Astrophysics Data System (ADS)

    Wang, Yuhong; Wang, Mingli; Shen, Lin; Zhu, Yanying; Sun, Xin; Shi, Guochao; Xu, Xiaona; Li, Ruifeng; Ma, Wanli

    2018-01-01

    Abstract not available. Project supported by the Youth Fund Project of University Science and Technology Plan of Hebei Provincial Department of Education, China (Grant No. QN2015004) and the Doctoral Fund of Yanshan University, China (Grant No. B924).

  3. Breaking barriers through collaboration: the example of the Cell Migration Consortium.

    PubMed

    Horwitz, Alan Rick; Watson, Nikki; Parsons, J Thomas

    2002-10-15

    Understanding complex integrated biological processes, such as cell migration, requires interdisciplinary approaches. The Cell Migration Consortium, funded by a Large-Scale Collaborative Project Award from the National Institute of General Medical Sciences, develops and disseminates new technologies, data, reagents, and shared information to a wide audience. The development and operation of this Consortium may provide useful insights for those who plan similarly large-scale, interdisciplinary approaches.

  4. Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism,

    DTIC Science & Technology

    1985-10-07

    Report documentation excerpt (OCR-garbled): Massachusetts Institute of Technology, Artificial Intelligence Laboratory, 545 Technology Square, Cambridge, MA; G. Agha et al.

  5. The dynamics and evolution of clusters of galaxies

    NASA Technical Reports Server (NTRS)

    Geller, Margaret; Huchra, John P.

    1987-01-01

    Research was undertaken to produce a coherent picture of the formation and evolution of large-scale structures in the universe. The program is divided into projects which examine four areas: the relationship between individual galaxies and their environment; the structure and evolution of individual rich clusters of galaxies; the nature of superclusters; and the large-scale distribution of individual galaxies. A brief review of results in each area is provided.

  6. White Paper on Dish Stirling Technology: Path Toward Commercial Deployment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andraka, Charles E.; Stechel, Ellen; Becker, Peter

    2016-07-01

    Dish Stirling energy systems have been developed for distributed and large-scale utility deployment. This report summarizes the state of the technology as of a joint 2011 project between Stirling Energy Systems, Sandia National Laboratories, and the Department of Energy. It then lays out a feasible path to large-scale deployment, including development needs and anticipated cost-reduction paths toward a viable deployment product.

  7. Large-scale road safety programmes in low- and middle-income countries: an opportunity to generate evidence.

    PubMed

    Hyder, Adnan A; Allen, Katharine A; Peters, David H; Chandran, Aruna; Bishai, David

    2013-01-01

    The growing burden of road traffic injuries, which kill over 1.2 million people yearly, falls mostly on low- and middle-income countries (LMICs). Despite this, evidence generation on the effectiveness of road safety interventions in LMIC settings remains scarce. This paper explores a scientific approach for evaluating road safety programmes in LMICs and introduces one such multi-country road safety initiative, the Road Safety in 10 Countries Project (RS-10). By building on existing evaluation frameworks, we develop a scientific approach for evaluating large-scale road safety programmes in LMIC settings. This draws on '13 lessons' of large-scale programme evaluation: defining the evaluation scope; selecting study sites; maintaining objectivity; developing an impact model; utilising multiple data sources; using multiple analytic techniques; maximising external validity; ensuring an appropriate time frame; the importance of flexibility and a stepwise approach; continuous monitoring; providing feedback to implementers and policy-makers; promoting the uptake of evaluation results; and understanding evaluation costs. The use of relatively new approaches for the evaluation of real-world programmes allows for the production of relevant knowledge. The RS-10 project affords an important opportunity to scientifically test these approaches in a real-world, large-scale road safety evaluation and generate new knowledge for the field of road safety.

  8. High Fidelity Simulations of Large-Scale Wireless Networks (Plus-Up)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onunkwo, Uzoma

    Sandia has built a strong reputation in scalable network simulation and emulation for cyber security studies to protect our nation’s critical information infrastructures. Georgia Tech has a preeminent reputation in academia for excellence in scalable discrete event simulations, with a strong emphasis on simulating cyber networks. Many of the experts in this field, such as Dr. Richard Fujimoto, Dr. George Riley, and Dr. Chris Carothers, have strong affiliations with Georgia Tech. The collaborative relationship that we intend to immediately pursue is in high fidelity simulations of practical large-scale wireless networks using the ns-3 simulator via Dr. George Riley. This project will have mutual benefits in bolstering both institutions’ expertise and reputation in the field of scalable simulation for cyber-security studies. This project promises to address high fidelity simulations of large-scale wireless networks. This proposed collaboration is directly in line with Georgia Tech’s goals for developing and expanding the Communications Systems Center, the Georgia Tech Broadband Institute, and the Georgia Tech Information Security Center along with its yearly Emerging Cyber Threats Report. At Sandia, this work benefits the defense systems and assessment area with promise for large-scale assessment of cyber security needs and vulnerabilities of our nation’s critical cyber infrastructures exposed to wireless communications.

  9. A Multi-Time Scale Morphable Software Milieu for Polymorphous Computing Architectures (PCA) - Composable, Scalable Systems

    DTIC Science & Technology

    2004-10-01

    Monitoring agencies: Defense Advanced Research Projects Agency and AFRL/IFTC, 3701 North Fairfax Drive... Cites "Scalable Parallel Libraries for Large-Scale Concurrent Applications," Technical Report UCRL-JC-109251, Lawrence Livermore National Laboratory.

  10. How much a galaxy knows about its large-scale environment?: An information theoretic perspective

    NASA Astrophysics Data System (ADS)

    Pandey, Biswajit; Sarkar, Suman

    2017-05-01

    The small-scale environment characterized by the local density is known to play a crucial role in deciding galaxy properties, but the role of the large-scale environment in galaxy formation and evolution remains less clear. We propose an information theoretic framework to investigate the influence of the large-scale environment on galaxy properties and apply it to data from the Galaxy Zoo project, which provides visual morphological classifications of ˜1 million galaxies from the Sloan Digital Sky Survey. We find a non-zero mutual information between morphology and environment that decreases with increasing length-scale but persists throughout the entire range of length-scales probed. We estimate the conditional mutual information and the interaction information between morphology and environment by conditioning the environment on different length-scales and find a synergic interaction between them that operates up to length-scales of at least ˜30 h-1 Mpc. Our analysis indicates that these interactions largely arise from the mutual information shared between the environments on different length-scales.
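    The mutual information measured between morphology and environment can be illustrated with a simple plug-in estimator over paired categorical labels. The labels and helper function below are hypothetical, not the Galaxy Zoo data or the authors' estimator; this is a minimal sketch of the quantity I(X;Y):

    ```python
    from collections import Counter
    from math import log2

    def mutual_information(xs, ys):
        """Plug-in estimate of I(X;Y) in bits from paired categorical samples."""
        n = len(xs)
        px = Counter(xs)               # marginal counts of X
        py = Counter(ys)               # marginal counts of Y
        pxy = Counter(zip(xs, ys))     # joint counts of (X, Y)
        mi = 0.0
        for (x, y), c in pxy.items():
            p_joint = c / n
            # p_joint * n * n / (px * py) equals p(x,y) / (p(x) p(y))
            mi += p_joint * log2(p_joint * n * n / (px[x] * py[y]))
        return mi

    # Toy example: a morphology label versus a coarse environment class.
    morph = ["spiral", "spiral", "elliptical", "elliptical"]
    env   = ["dense",  "dense",  "sparse",     "sparse"]
    print(mutual_information(morph, env))  # perfectly dependent pair -> 1.0 bit
    ```

    The same counting approach extends to the conditional mutual information I(X;Y|Z) by accumulating triple counts and conditioning on each value of Z.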

  11. Advanced Model for Extreme Lift and Improved Aeroacoustics (AMELIA)

    NASA Technical Reports Server (NTRS)

    Lichtwardt, Jonathan; Paciano, Eric; Jameson, Tina; Fong, Robert; Marshall, David

    2012-01-01

    With the very recent advent of NASA's Environmentally Responsible Aviation Project (ERA), which is dedicated to designing aircraft that will reduce the impact of aviation on the environment, there is a need for research and development of methodologies to minimize fuel burn and emissions and to reduce community noise produced by regional airliners. ERA tackles airframe technology, propulsion technology, and vehicle systems integration to meet performance objectives in the time frame for the aircraft to be at a Technology Readiness Level (TRL) of 4-6 by the year 2020 (deemed N+2). The preceding project that investigated similar goals to ERA was NASA's Subsonic Fixed Wing (SFW) project. SFW focused on conducting research to improve prediction methods and technologies that will produce lower-noise, lower-emission, and higher-performing subsonic aircraft for the Next Generation Air Transportation System. The work described in this investigation was performed under NASA Research Announcement (NRA) contract #NNL07AA55C, funded by Subsonic Fixed Wing. The project started in 2007 with the specific goal of conducting a large-scale wind tunnel test along with the development of new and improved predictive codes for advanced powered-lift concepts. Many of the predictive codes were used to refine the outer mold line design of the wind tunnel model. The goal of the large-scale wind tunnel test was to investigate powered-lift technologies and provide an experimental database to validate current and future modeling techniques. The powered-lift concepts investigated were a Circulation Control (CC) wing in conjunction with over-the-wing mounted engines to entrain the exhaust and further increase the lift generated by CC technologies alone. The NRA was a five-year effort; during the first year the objective was to select and refine CESTOL concepts and then to complete a preliminary design of a large-scale wind tunnel model for the test.
During the second, third, and fourth years the large-scale wind tunnel model was designed, manufactured, and calibrated. During the fifth year the large-scale wind tunnel test was conducted. This technical memo describes all phases of the Advanced Model for Extreme Lift and Improved Aeroacoustics (AMELIA) project and provides a brief summary of the background and modeling efforts involved in the NRA. The conceptual designs considered for this project and the decision process for the configuration adapted for a wind tunnel model are briefly discussed, along with the internal configuration of AMELIA and the internal measurements chosen to build a database of experimental data for future computational model validation. The external experimental techniques employed during the test, along with the large-scale wind tunnel test facility, are covered in great detail. Experimental measurements in the database include forces and moments, surface pressure distributions, local skin-friction measurements, boundary- and shear-layer velocity profiles, far-field acoustic data, and noise signatures from turbofan propulsion simulators. Results and discussion of the circulation control performance, the over-the-wing mounted engines, and the combined performance are also presented in detail.

  12. Integrated Mid-Continent Carbon Capture, Sequestration & Enhanced Oil Recovery Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brian McPherson

    2010-08-31

    A consortium of research partners led by the Southwest Regional Partnership on Carbon Sequestration and industry partners, including CAP CO2 LLC, Blue Source LLC, Coffeyville Resources, Nitrogen Fertilizers LLC, Ash Grove Cement Company, Kansas Ethanol LLC, Headwaters Clean Carbon Services, Black & Veatch, and Schlumberger Carbon Services, conducted a feasibility study of a large-scale CCS commercialization project that included large-scale CO2 sources. The overall objective of this project, entitled the 'Integrated Mid-Continent Carbon Capture, Sequestration and Enhanced Oil Recovery Project', was to design an integrated system of US mid-continent industrial CO2 sources with CO2 capture, and geologic sequestration in deep saline formations and in oil field reservoirs with concomitant EOR. Findings of this project suggest that deep saline sequestration in the mid-continent region is not feasible without major financial incentives, such as tax credits or otherwise, that do not exist at this time. However, results of the analysis suggest that enhanced oil recovery with carbon sequestration is indeed feasible and practical for specific types of geologic settings in the Midwestern U.S.

  13. Linking climate projections to performance: A yield-based decision scaling assessment of a large urban water resources system

    NASA Astrophysics Data System (ADS)

    Turner, Sean W. D.; Marlow, David; Ekström, Marie; Rhodes, Bruce G.; Kularathna, Udaya; Jeffrey, Paul J.

    2014-04-01

    Despite a decade of research into climate change impacts on water resources, the scientific community has delivered relatively few practical methodological developments for integrating uncertainty into water resources system design. This paper presents an application of the "decision scaling" methodology for assessing climate change impacts on water resources system performance and asks how such an approach might inform planning decisions. The decision scaling method reverses the conventional ethos of climate impact assessment by first establishing the climate conditions that would compel planners to intervene. Climate model projections are introduced at the end of the process to characterize climate risk in such a way that avoids the process of propagating those projections through hydrological models. Here we simulated 1000 multisite synthetic monthly streamflow traces in a model of the Melbourne bulk supply system to test the sensitivity of system performance to variations in streamflow statistics. An empirical relation was derived to convert decision-critical flow statistics to climatic units, against which 138 alternative climate projections were plotted and compared. We defined the decision threshold in terms of a system yield metric constrained by multiple performance criteria. Our approach allows for fast and simple incorporation of demand forecast uncertainty and demonstrates the reach of the decision scaling method through successful execution in a large and complex water resources system. Scope for wider application in urban water resources planning is discussed.
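    The decision-scaling ethos described above, first finding the flow conditions that breach a performance threshold and only then consulting climate projections, can be sketched with a toy single-reservoir model. The inflow trace, demand, capacity, and reliability target below are invented for illustration and bear no relation to the Melbourne system:

    ```python
    # Minimal decision-scaling sketch (illustrative only): stress a toy
    # reservoir with scaled inflows and locate the mean-flow reduction at
    # which a yield-based performance metric first breaches its threshold.

    def reliability(inflows, demand, capacity, scale):
        """Fraction of time steps in which demand is fully met."""
        storage, met = capacity, 0          # assume the reservoir starts full
        for q in inflows:
            storage = min(storage + scale * q, capacity)   # inflow, spill at capacity
            supplied = min(demand, storage)
            storage -= supplied
            met += supplied >= demand
        return met / len(inflows)

    inflows = [8, 12, 5, 15, 7, 10, 4, 14, 6, 11]   # hypothetical monthly flow trace
    demand, capacity, target = 9.0, 20.0, 0.95

    # Sweep climate stress (mean-flow multipliers) from wet to dry.
    for scale in [1.0, 0.9, 0.8, 0.7, 0.6]:
        r = reliability(inflows, demand, capacity, scale)
        if r < target:
            print(f"decision-critical scaling ~{scale}: reliability {r:.2f}")
            break
    ```

    With these made-up numbers the sweep flags a multiplier of 0.8 (reliability 0.70); climate projections would then be plotted against that threshold in climatic units rather than run through a hydrological model.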

  14. The ECE Culminating Design Experience: Analysis of ABET 2000 Compliance at Leading Academic Institutions

    DTIC Science & Technology

    2006-05-01

    a significant design project that requires development of a large-scale software project. A distinct shortcoming of Purdue ECE... 18-540: Rapid Prototyping of Computer Systems is a project-oriented course which deals with all four aspects of project development; the... instructors will develop specifications for a mobile computer to assist in inspection and maintenance. The application will be partitioned

  15. Skate Genome Project: Cyber-Enabled Bioinformatics Collaboration

    PubMed Central

    Vincent, J.

    2011-01-01

    The Skate Genome Project, a pilot project of the North East Cyberinfrastructure Consortium, aims to produce a draft genome sequence of Leucoraja erinacea, the Little Skate. The pilot project was designed to also develop expertise in large-scale collaborations across the NECC region. An overview of the bioinformatics and infrastructure challenges faced during the first year of the project will be presented. Results to date and lessons learned from the perspective of a bioinformatics core will be highlighted.

  16. Combining points and lines in rectifying satellite images

    NASA Astrophysics Data System (ADS)

    Elaksher, Ahmed F.

    2017-09-01

    Rapid advances in remote sensing technology have established the potential to gather accurate and reliable information about the Earth's surface from high-resolution satellite images. Remote sensing satellite images of less than one-meter pixel size are currently used in large-scale mapping. Rigorous photogrammetric equations are usually used to describe the relationship between image coordinates and ground coordinates. These equations require knowledge of the exterior and interior orientation parameters of the image, which might not be available. On the other hand, the parallel projection transformation can be used to represent the mathematical relationship between the image-space and object-space coordinate systems, and it provides the accuracy required for large-scale mapping using fewer ground control features. This article investigates the differences between point-based and line-based parallel projection transformation models in rectifying satellite images of different resolutions. The point-based parallel projection transformation model and its extended form are presented, and the corresponding line-based forms are developed. Results showed that the RMS errors computed using the point- and line-based transformation models are equivalent and satisfy the requirements for large-scale mapping. The differences between the transformation parameters computed using the point- and line-based models are insignificant. The results also showed a high correlation between differences in ground elevation and the RMS error.
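    As a rough illustration of the point-based case, a parallel projection can be written as an affine map from ground to image coordinates and fitted to ground control points (GCPs) by least squares; the RMS of the residuals is then the accuracy figure this kind of study reports. The parameter values and GCPs below are synthetic assumptions, not the article's data or its exact model:

    ```python
    import numpy as np

    # Illustrative least-squares fit of an 8-parameter parallel (affine)
    # projection: x = a1*X + a2*Y + a3*Z + a4,  y = a5*X + a6*Y + a7*Z + a8.
    rng = np.random.default_rng(0)
    gcp = rng.uniform(0, 1000, size=(12, 3))          # synthetic ground X, Y, Z
    true = np.array([[0.9, 0.1, 0.02, 50.0],
                     [-0.1, 0.95, 0.01, 120.0]])      # assumed "true" parameters
    img = gcp @ true[:, :3].T + true[:, 3]            # noise-free image x, y

    # Design matrix: one row [X, Y, Z, 1] per GCP; solve both image axes at once.
    A = np.hstack([gcp, np.ones((len(gcp), 1))])
    params, *_ = np.linalg.lstsq(A, img, rcond=None)

    resid = A @ params - img
    rms = np.sqrt((resid ** 2).mean())                # ~0 for noise-free data
    print("recovered parameters:\n", params.T)
    print("RMS:", rms)
    ```

    A line-based variant would instead constrain the fit with conjugate line observations, but the least-squares machinery is the same.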

  17. Reducing HIV infection among new injecting drug users in the China-Vietnam Cross Border Project.

    PubMed

    Des Jarlais, Don C; Kling, Ryan; Hammett, Theodore M; Ngu, Doan; Liu, Wei; Chen, Yi; Binh, Kieu Thanh; Friedmann, Patricia

    2007-12-01

    To assess an HIV prevention programme for injecting drug users (IDU) in the crossborder area between China and Vietnam. Serial cross-sectional surveys (0, 6, 12, 18, 24 and 36 months) of community-recruited current IDU. The project included peer educator outreach and the large-scale distribution of sterile injection equipment. Serial cross-sectional surveys with HIV testing of community-recruited IDU were conducted at baseline (before implementation) and 6, 12, 18, 24 and 36 months post-baseline. HIV prevalence and estimated HIV incidence among new injectors (individuals injecting drugs for < 3 years) in each survey wave were the primary outcome measures. The percentage of new injectors among all subjects declined across successive survey waves in both Ning Ming and Lang Son. HIV prevalence and estimated incidence fell by approximately half at the 24-month survey and by approximately three quarters at the 36-month survey in both areas (all P < 0.01). The implementation of large-scale outreach and syringe access programmes was followed by substantial reductions in HIV infection among new injectors, with no evidence of any increase in individuals beginning to inject drugs. This project may serve as a model for large-scale HIV prevention programming for IDU in China, Vietnam, and other developing/transitional countries.

  18. Fabrication of the HIAD Large-Scale Demonstration Assembly and Upcoming Mission Applications

    NASA Technical Reports Server (NTRS)

    Swanson, G. T.; Johnson, R. K.; Hughes, S. J.; DiNonno, J. M.; Cheatwood, F. M.

    2017-01-01

    Over a decade of work has been conducted in the development of NASA's Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology. This effort has included multiple ground test campaigns and flight tests culminating in the HIAD project's second-generation (Gen-2) deployable aeroshell system and associated analytical tools. NASA's HIAD project team has developed, fabricated, and tested inflatable structures (IS) integrated with a flexible thermal protection system (F-TPS), ranging in diameter from 3-6m, with cone angles of 60 and 70 deg. In 2015, United Launch Alliance (ULA) announced that they will use a HIAD (10-12m) as part of their Sensible, Modular, Autonomous Return Technology (SMART) for their upcoming Vulcan rocket. ULA expects SMART reusability, coupled with other advancements for Vulcan, will substantially reduce the cost of access to space. The first booster engine recovery via HIAD is scheduled for 2024. To meet this near-term need, as well as future NASA applications, the HIAD team is investigating taking the technology to the 10-15m diameter scale. In the last year, many significant development and fabrication efforts have been accomplished, culminating in the construction of a large-scale inflatable structure demonstration assembly. This assembly incorporated the first three tori for a 12m Mars Human-Scale Pathfinder HIAD conceptual design that was constructed with the current state-of-the-art material set. Numerous design trades and torus fabrication demonstrations preceded this effort. In 2016, three large-scale tori (0.61m cross-section) and six subscale tori (0.25m cross-section) were manufactured to demonstrate fabrication techniques using the newest candidate material sets. These tori were tested to evaluate durability and load capacity. This work led to the selection of the inflatable structure's third-generation (Gen-3) structural liner.
In late 2016, the three tori required for the large-scale demonstration assembly were fabricated, and then integrated in early 2017. The design includes provisions to add the remaining four tori necessary to complete the assembly of the 12m Human-Scale Pathfinder HIAD in the event future project funding becomes available. This presentation will discuss the HIAD large-scale demonstration assembly design and fabrication performed in the last year, including the precursor tori development and the partial-stack fabrication. Potential near-term and future 10-15m HIAD applications will also be discussed.

  19. Fabrication of the HIAD Large-Scale Demonstration Assembly

    NASA Technical Reports Server (NTRS)

    Swanson, G. T.; Johnson, R. K.; Hughes, S. J.; DiNonno, J. M.; Cheatwood, F. M.

    2017-01-01

    Over a decade of work has been conducted in the development of NASA's Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology. This effort has included multiple ground test campaigns and flight tests culminating in the HIAD project's second-generation (Gen-2) deployable aeroshell system and associated analytical tools. NASA's HIAD project team has developed, fabricated, and tested inflatable structures (IS) integrated with a flexible thermal protection system (F-TPS), ranging in diameter from 3-6m, with cone angles of 60 and 70 deg. In 2015, United Launch Alliance (ULA) announced that they will use a HIAD (10-12m) as part of their Sensible, Modular, Autonomous Return Technology (SMART) for their upcoming Vulcan rocket. ULA expects SMART reusability, coupled with other advancements for Vulcan, will substantially reduce the cost of access to space. The first booster engine recovery via HIAD is scheduled for 2024. To meet this near-term need, as well as future NASA applications, the HIAD team is investigating taking the technology to the 10-15m diameter scale. In the last year, many significant development and fabrication efforts have been accomplished, culminating in the construction of a large-scale inflatable structure demonstration assembly. This assembly incorporated the first three tori for a 12m Mars Human-Scale Pathfinder HIAD conceptual design that was constructed with the current state-of-the-art material set. Numerous design trades and torus fabrication demonstrations preceded this effort. In 2016, three large-scale tori (0.61m cross-section) and six subscale tori (0.25m cross-section) were manufactured to demonstrate fabrication techniques using the newest candidate material sets. These tori were tested to evaluate durability and load capacity. This work led to the selection of the inflatable structure's third-generation (Gen-3) structural liner.
In late 2016, the three tori required for the large-scale demonstration assembly were fabricated, and then integrated in early 2017. The design includes provisions to add the remaining four tori necessary to complete the assembly of the 12m Human-Scale Pathfinder HIAD in the event future project funding becomes available. This presentation will discuss the HIAD large-scale demonstration assembly design and fabrication performed in the last year, including the precursor tori development and the partial-stack fabrication. Potential near-term and future 10-15m HIAD applications will also be discussed.

  20. Large scale multiprocessor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gajski, D.; Kuck, D.; Lawrie, D.

    1983-03-01

    The primary goal of the Cedar project is to demonstrate that supercomputers of the future can exhibit general-purpose behavior and be easy to use. The Cedar project is based on five key developments which have reached fruition in the past year and which together offer a comprehensive solution to these problems. The author looks at this project and how its goals are being met.

  1. Forensic Schedule Analysis of Construction Delay in Military Projects in the Middle East

    DTIC Science & Technology

    This research performs forensic schedule analysis of delay factors that impacted recent large-scale military construction projects in the Middle East... The methodologies for analysis are adapted from the Professional Practice Guide to Forensic Schedule Analysis, particularly Method 3.7 Modeled

  2. CRP: Collaborative Research Project (A Mathematical Research Experience for Undergraduates)

    ERIC Educational Resources Information Center

    Parsley, Jason; Rusinko, Joseph

    2017-01-01

    The "Collaborative Research Project" ("CRP")--a mathematics research experience for undergraduates--offers a large-scale collaborative experience in research for undergraduate students. CRP seeks to widen the audience of students who participate in undergraduate research in mathematics. In 2015, the inaugural CRP had 100…

  3. Testing the DQP: What Was Learned about Learning Outcomes?

    ERIC Educational Resources Information Center

    Ickes, Jessica L.; Flowers, Daniel R.

    2015-01-01

    Through a campuswide project using the Degree Qualifications Profile (DQP) as a comparison tool that engaged students and faculty, the authors share findings and implications about learning outcomes for IR professionals and DQP authors while considering the role of IR in large-scale, campuswide projects.

  4. Book Review: Large-Scale Ecosystem Restoration: Five Case Studies from the United States

    EPA Science Inventory

    Broad-scale ecosystem restoration efforts involve a very complex set of ecological and societal components, and the success of any ecosystem restoration project rests on an integrated approach to implementation. Editors Mary Doyle and Cynthia Drew have successfully synthesized ma...

  5. Preliminary measurement of the noise from the 2/9 scale model of the Large-scale Advanced Propfan (LAP) propeller, SR-7A

    NASA Technical Reports Server (NTRS)

    Dittmar, J. H.

    1985-01-01

    Noise data on the Large-scale Advanced Propfan (LAP) propeller model SR-7A were taken in the NASA Lewis 8- by 6-Foot Wind Tunnel. The maximum blade passing tone decreases from the peak level when going to higher helical tip Mach numbers. This noise reduction points to the use of higher propeller speeds as a possible method to reduce airplane cabin noise while maintaining high flight speed and efficiency. Comparison of the SR-7A blade passing noise with the noise of the similarly designed SR-3 propeller shows good agreement, as expected. The SR-7A propeller is slightly noisier than the SR-3 model in the plane of rotation at the cruise condition. Projections of the tunnel model data are made to the full-scale LAP propeller mounted on the test bed aircraft and compared with design predictions. The prediction method is conservative in the sense that it overpredicts the projected model data.

  6. Cruise noise of the 2/9th scale model of the Large-scale Advanced Propfan (LAP) propeller, SR-7A

    NASA Technical Reports Server (NTRS)

    Dittmar, James H.; Stang, David B.

    1987-01-01

    Noise data on the Large-scale Advanced Propfan (LAP) propeller model SR-7A were taken in the NASA Lewis Research Center 8 x 6 foot Wind Tunnel. The maximum blade passing tone noise first rises with increasing helical tip Mach number to a peak level, then remains the same or decreases from its peak level when going to higher helical tip Mach numbers. This trend was observed for operation at both constant advance ratio and approximately equal thrust. This noise reduction, or leveling out, at high helical tip Mach numbers points to the use of higher propeller tip speeds as a possible method to limit airplane cabin noise while maintaining high flight speed and efficiency. Projections of the tunnel model data are made to the full-scale LAP propeller mounted on the test bed aircraft and compared with predictions. The prediction method is found to be somewhat conservative in that it slightly overpredicts the projected model data at the peak.

  7. Cruise noise of the 2/9 scale model of the Large-scale Advanced Propfan (LAP) propeller, SR-7A

    NASA Technical Reports Server (NTRS)

    Dittmar, James H.; Stang, David B.

    1987-01-01

    Noise data on the Large-scale Advanced Propfan (LAP) propeller model SR-7A were taken in the NASA Lewis Research Center 8 x 6 foot Wind Tunnel. The maximum blade passing tone noise first rises with increasing helical tip Mach number to a peak level, then remains the same or decreases from its peak level when going to higher helical tip Mach numbers. This trend was observed for operation at both constant advance ratio and approximately equal thrust. This noise reduction, or leveling out, at high helical tip Mach numbers points to the use of higher propeller tip speeds as a possible method to limit airplane cabin noise while maintaining high flight speed and efficiency. Projections of the tunnel model data are made to the full-scale LAP propeller mounted on the test bed aircraft and compared with predictions. The prediction method is found to be somewhat conservative in that it slightly overpredicts the projected model data at the peak.

  8. A streamlined collecting and preparation protocol for DNA barcoding of Lepidoptera as part of large-scale rapid biodiversity assessment projects, exemplified by the Indonesian Biodiversity Discovery and Information System (IndoBioSys)

    PubMed Central

    Hausmann, Axel; Cancian de Araujo, Bruno; Sutrisno, Hari; Peggie, Djunijanti; Schmidt, Stefan

    2017-01-01

    Abstract Here we present a general collecting and preparation protocol for DNA barcoding of Lepidoptera as part of large-scale rapid biodiversity assessment projects, and a comparison with alternative preserving and vouchering methods. About 98% of the sequenced specimens processed using the present collecting and preparation protocol yielded sequences with more than 500 base pairs. The study is based on the first outcomes of the Indonesian Biodiversity Discovery and Information System (IndoBioSys). IndoBioSys is a German-Indonesian research project that is conducted by the Museum für Naturkunde in Berlin and the Zoologische Staatssammlung München, in close cooperation with the Research Center for Biology – Indonesian Institute of Sciences (RCB-LIPI, Bogor). PMID:29134041

  9. The XChemExplorer graphical workflow tool for routine or large-scale protein-ligand structure determination.

    PubMed

    Krojer, Tobias; Talon, Romain; Pearce, Nicholas; Collins, Patrick; Douangamath, Alice; Brandao-Neto, Jose; Dias, Alexandre; Marsden, Brian; von Delft, Frank

    2017-03-01

    XChemExplorer (XCE) is a data-management and workflow tool to support large-scale simultaneous analysis of protein-ligand complexes during structure-based ligand discovery (SBLD). The user interfaces of established crystallographic software packages such as CCP4 [Winn et al. (2011), Acta Cryst. D67, 235-242] or PHENIX [Adams et al. (2010), Acta Cryst. D66, 213-221] have entrenched the paradigm that a `project' is concerned with solving one structure. This does not hold for SBLD, where many almost identical structures need to be solved and analysed quickly in one batch of work. Functionality to track progress and annotate structures is essential. XCE provides an intuitive graphical user interface which guides the user from data processing, initial map calculation, ligand identification and refinement up until data dissemination. It provides multiple entry points depending on the need of each project, enables batch processing of multiple data sets and records metadata, progress and annotations in an SQLite database. XCE is freely available and works on any Linux and Mac OS X system, and the only dependency is to have the latest version of CCP4 installed. The design and usage of this tool are described here, and its usefulness is demonstrated in the context of fragment-screening campaigns at the Diamond Light Source. It is routinely used to analyse projects comprising 1000 data sets or more, and therefore scales well to even very large ligand-design projects.
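    The batch bookkeeping XCE performs with an SQLite database can be sketched as follows. The table layout, column names, and stage labels here are illustrative inventions, not XCE's actual schema:

```python
import sqlite3

# Illustrative sketch of metadata/progress tracking for batches of
# protein-ligand data sets, in the spirit of XCE's SQLite bookkeeping.
# Table and column names are hypothetical, not XCE's actual schema.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE datasets (
        sample_id   TEXT PRIMARY KEY,
        stage       TEXT,      -- e.g. processed / refined / annotated
        resolution  REAL,
        annotation  TEXT
    )
""")

# Batch-register incoming data sets, then record progress per sample.
samples = [("x0001", "processed", 1.8, ""), ("x0002", "processed", 2.1, "")]
conn.executemany("INSERT INTO datasets VALUES (?, ?, ?, ?)", samples)
conn.execute(
    "UPDATE datasets SET stage = ?, annotation = ? WHERE sample_id = ?",
    ("refined", "ligand bound in active site", "x0001"),
)

# Progress query of the kind a status overview would issue.
counts = dict(conn.execute(
    "SELECT stage, COUNT(*) FROM datasets GROUP BY stage"
).fetchall())
print(counts)  # e.g. {'processed': 1, 'refined': 1}
```

    The point of the design is that progress and annotations live in one queryable file per project, so a campaign of 1000 data sets can be summarized with a single aggregate query.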

  10. Simulating Forest Carbon Dynamics in Response to Large-scale Fuel Reduction Treatments Under Projected Climate-fire Interactions in the Sierra Nevada Mountains, USA

    NASA Astrophysics Data System (ADS)

    Liang, S.; Hurteau, M. D.

    2016-12-01

    The interaction of a warmer, drier climate and increasingly large wildfires, coupled with increasing fire severity resulting from fire exclusion, is anticipated to undermine forest carbon (C) stock stability and C sink strength in Sierra Nevada forests. Treatments to reduce biomass and restore forest structure, including thinning and prescribed burning, have proven effective at reducing fire severity and lessening C loss when treated stands are burned by wildfire. However, the current pace and scale of treatment implementation is limited, especially given recent increases in area burned by wildfire. In this study, we used a forest landscape model (LANDIS-II) to evaluate how the timing of large-scale fuel reduction treatments influences the forest C stocks and fluxes of Sierra Nevada forests under projected climate and larger wildfires. We ran 90-year simulations using climate and wildfire projections from three general circulation models driven by the A2 emission scenario. We simulated two treatment implementation scenarios: a `distributed' scenario (treatments implemented throughout the simulation) and an `accelerated' scenario (treatments implemented during the first half century). We found that across the study area, accelerated implementation produced 0.6-10.4 Mg ha-1 higher late-century aboveground biomass (AGB) and 1.0-2.2 g C m-2 yr-1 higher mean C sink strength than the distributed scenario, depending on the specific climate-wildfire projection. Cumulative wildfire emissions over the simulation period were 0.7-3.9 Mg C ha-1 higher for distributed implementation relative to accelerated implementation. However, simulations with either implementation practice had considerably higher AGB and C sink strength, as well as lower wildfire emissions, than simulations without fuel reduction treatments. The results demonstrate the potential of large-scale fuel reduction treatments to enhance forest C stock stability and C sink strength under projected climate-wildfire interactions. Given that climate and wildfire are projected to become more stressful after mid-century, earlier management action would yield greater C benefits.

  11. Identification and Functional Prediction of Large Intergenic Noncoding RNAs (lincRNAs) in Rainbow Trout (Oncorhynchus mykiss)

    USDA-ARS?s Scientific Manuscript database

    Long noncoding RNAs (lncRNAs) have been recognized in recent years as key regulators of diverse cellular processes. Genome-wide large-scale projects have uncovered thousands of lncRNAs in many model organisms. Large intergenic noncoding RNAs (lincRNAs) are lncRNAs that are transcribed from intergeni...

  12. A theory of forest dynamics: Spatially explicit models and issues of scale

    NASA Technical Reports Server (NTRS)

    Pacala, S.

    1990-01-01

    Good progress has been made in the first year of DOE grant #FG02-90ER60933. The purpose of the project is to develop and investigate models of forest dynamics that apply across a range of spatial scales. The grant is one third of a three-part project. The second third was funded by the NSF this year and is intended to provide the empirical data necessary to calibrate and test small-scale (less than or equal to 1000 ha) models. The final third was also funded this year (NASA), and will provide data to calibrate and test the large-scale features of the models.

  13. A family of conjugate gradient methods for large-scale nonlinear equations.

    PubMed

    Feng, Dexiang; Sun, Min; Wang, Xueyong

    2017-01-01

    In this paper, we present a family of conjugate gradient projection methods for solving large-scale nonlinear equations. At each iteration, it needs low storage and the subproblem can be easily solved. Compared with the existing solution methods for solving the problem, its global convergence is established without the restriction of the Lipschitz continuity on the underlying mapping. Preliminary numerical results are reported to show the efficiency of the proposed method.
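    As an illustration of the general scheme behind such methods, the sketch below implements a derivative-free conjugate-gradient projection iteration for a monotone system F(x) = 0, using a PRP-type conjugate parameter and the standard hyperplane-projection step. The specific formulas of the family proposed in the paper may differ:

```python
import numpy as np

def cg_projection_solve(F, x0, tol=1e-8, max_iter=500, sigma=1e-4, rho=0.5):
    """Derivative-free CG projection sketch for monotone F(x) = 0.

    Uses a PRP-type conjugate parameter and a hyperplane-projection
    step; the exact formulas in the cited paper may differ.
    """
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    d = -Fx
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        # Backtracking search: find t with -F(x + t d)^T d >= sigma t ||d||^2
        t = 1.0
        while -F(x + t * d) @ d < sigma * t * (d @ d):
            t *= rho
            if t < 1e-12:          # safeguard: restart along -F(x)
                d = -Fx
                t = 1.0
        z = x + t * d
        Fz = F(z)
        # Project x onto the hyperplane separating it from the solution set
        x = x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
        F_new = F(x)
        beta = F_new @ (F_new - Fx) / (Fx @ Fx)   # PRP-type parameter
        d = -F_new + beta * d
        Fx = F_new
    return x

# Example: componentwise monotone system x + sin(x) = 0, root at 0.
x = cg_projection_solve(lambda v: v + np.sin(v), np.full(5, 3.0))
print(np.linalg.norm(x))  # close to 0
```

    Note the two properties the abstract emphasizes: each iteration stores only a few vectors, and no Jacobian or Lipschitz constant is required by the line search or the projection step.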

  14. Data management in large-scale collaborative toxicity studies: how to file experimental data for automated statistical analysis.

    PubMed

    Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette

    2013-06-01

    High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results in comparison to data analysis performed by the use of menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches into the European project ACuteTox is illustrated. Copyright © 2012 Elsevier Ltd. All rights reserved.
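    The kind of automated format check such a workflow depends on can be sketched as follows. The required column names are hypothetical placeholders, not the ACuteTox data standard:

```python
import csv
import io

# Sketch of an automated format check for concentration-response exports.
# The required columns are hypothetical, not the project's specification.
REQUIRED = ["compound", "concentration", "response"]

def check_file(text):
    """Return a list of problems found in one CSV export; empty = OK."""
    problems = []
    rows = list(csv.DictReader(io.StringIO(text)))
    if not rows:
        return ["no data rows"]
    missing = [c for c in REQUIRED if c not in rows[0]]
    if missing:
        problems.append(f"missing columns: {missing}")
        return problems
    for i, row in enumerate(rows, start=2):   # line 1 is the header
        try:
            float(row["concentration"]), float(row["response"])
        except (TypeError, ValueError):
            problems.append(f"non-numeric value on line {i}")
    return problems

good = "compound,concentration,response\nA,0.1,98.2\nA,1.0,55.0\n"
bad = "compound,concentration\nA,0.1\n"
print(check_file(good))  # []
print(check_file(bad))   # ["missing columns: ['response']"]
```

    Running such a check at file-submission time, rather than at analysis time, is what makes fully scripted downstream statistics feasible across hundreds of compounds.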

  15. The SCALE-UP Project

    NASA Astrophysics Data System (ADS)

    Beichner, Robert

    2015-03-01

    The Student Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) project was developed nearly 20 years ago as an economical way to provide collaborative, interactive instruction even for large enrollment classes. Nearly all research-based pedagogies have been designed with fairly high faculty-student ratios. The economics of introductory courses at large universities often precludes that situation, so SCALE-UP was created as a way to facilitate highly collaborative active learning with large numbers of students served by only a few faculty and assistants. It enables those students not only to acquire content, but also to practice important 21st century skills like problem solving, communication, and teamwork. The approach was initially targeted at undergraduate science and engineering students taking introductory physics courses in large enrollment sections. It has since expanded to multiple content areas, including chemistry, math, engineering, biology, business, nursing, and even the humanities. Class sizes range from 24 to over 600. Data collected from multiple sites around the world indicate highly successful implementation at more than 250 institutions. NSF support was critical for initial development and dissemination efforts. Generously supported by NSF (9752313, 9981107) and FIPSE (P116B971905, P116B000659).

  16. Last of the Monumental Book Catalogs.

    ERIC Educational Resources Information Center

    Welsh, William J.

    1981-01-01

    Reviews the history of the National Union Catalog and the publication of the Pre-1956 Imprints. The roles of the ALA and Mansell Publishing in the completion of what is probably the last large-scale nonautomated bibliographic project, editorial problems, and the role of automation in future projects are discussed. (JL)

  17. DEVELOPMENT OF A SCALABLE, LOW-COST, ULTRANANOCRYSTALLINE DIAMOND ELECTROCHEMICAL PROCESS FOR THE DESTRUCTION OF CONTAMINANTS OF EMERGING CONCERN (CECS) - PHASE II

    EPA Science Inventory

    This Small Business Innovation Research (SBIR) Phase II project will employ the large scale; highly reliable boron-doped ultrananocrystalline diamond (BD-UNCD®) electrodes developed during Phase I project to build and test Electrochemical Anodic Oxidation process (EAOP)...

  18. Workforce Development Analysis | Energy Analysis | NREL

    Science.gov Websites

    … with customer service, construction, and electrical projects … One-half of surveyed firms reported … training, and experience that will enable continued large-scale deployment of wind and solar technologies … engineers; and project managers … Standardized education and training at all levels, primary school through …

  19. Factors Affecting Intervention Fidelity of Differentiated Instruction in Kindergarten

    ERIC Educational Resources Information Center

    Dijkstra, Elma M.; Walraven, Amber; Mooij, Ton; Kirschner, Paul A.

    2017-01-01

    This paper reports on the findings in the first phase of a design-based research project as part of a large-scale intervention study in Dutch kindergartens. The project aims at enhancing differentiated instruction and evaluating its effects on children's development, in particular high-ability children. This study investigates relevant…

  20. The Comprehensive Project for Deprived Communities in Israel.

    ERIC Educational Resources Information Center

    Goldstein, Joseph

    A large-scale educational program, involving 30 settlements and neighborhoods that had been defined as suffering from deprivation, this project included a variety of reinforcement and enrichment programs. Information for a case study of the program was collected through interviews. Findings indicated that the guiding principles of the program…

  1. Strategies for Effective Dissemination of the Outcomes of Teaching and Learning Projects

    ERIC Educational Resources Information Center

    Southwell, Deborah; Gannaway, Deanne; Orrell, Janice; Chalmers, Denise; Abraham, Catherine

    2010-01-01

    This paper describes an empirical study that addresses the question of how higher education institutions can disseminate effectively the outcomes of projects that seek to achieve large-scale change in teaching and learning. Traditionally, dissemination of innovation and good practice is strongly advocated within universities, but little…

  2. Sensitivity of CEAP cropland simulations to the parameterization of the APEX model

    USDA-ARS?s Scientific Manuscript database

    For large scale applications like the U.S. National Scale Conservation Effects Assessment Project (CEAP), soil hydraulic characteristics data are not readily available and therefore need to be estimated. Field soil water properties are commonly approximated using laboratory soil water retention meas...

  3. Scaling up Psycholinguistics

    ERIC Educational Resources Information Center

    Smith, Nathaniel J.

    2011-01-01

    This dissertation contains several projects, each addressing different questions with different techniques. In chapter 1, I argue that they are unified thematically by their goal of "scaling up psycholinguistics"; they are all aimed at analyzing large data-sets using tools that reveal patterns to propose and test mechanism-neutral hypotheses about…

  4. Design of a decentralized reusable research database architecture to support data acquisition in large research projects.

    PubMed

    Iavindrasana, Jimison; Depeursinge, Adrien; Ruch, Patrick; Spahni, Stéphane; Geissbuhler, Antoine; Müller, Henning

    2007-01-01

    The diagnostic and therapeutic processes, as well as the development of new treatments, are hindered by the fragmentation of the information that underlies them. In a multi-institutional research study database, the clinical information system (CIS) contains the primary data input. In large-scale clinical studies, a substantial share of the budget is often spent on data creation and maintenance. The objective of this work is to design a decentralized, scalable, reusable database architecture with lower maintenance costs for managing and integrating the distributed heterogeneous data required as the basis for a large-scale research project. Technical and legal aspects are taken into account based on various use case scenarios. The architecture contains four layers: data storage and access, decentralized at their production source; a connector acting as a proxy between the CIS and the external world; an information mediator serving as a data access point; and the client side. The proposed design will be implemented in six clinical centers participating in the @neurIST project as part of a larger system on data integration and reuse for aneurysm treatment.
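    The connector/mediator layering can be sketched schematically as follows. The class and method names are illustrative, not the @neurIST interfaces:

```python
# Minimal sketch of the connector/mediator layering described above.
# Class and method names are illustrative, not the @neurIST interfaces.

class SiteConnector:
    """Proxy between one centre's CIS and the outside world.

    Data stay at their production source; queries are answered locally.
    """
    def __init__(self, site, records):
        self.site = site
        self._records = records          # never leaves the institution

    def query(self, field, value):
        return [dict(r, site=self.site)
                for r in self._records if r.get(field) == value]

class Mediator:
    """Single data-access point that fans a query out to all connectors."""
    def __init__(self, connectors):
        self.connectors = connectors

    def query(self, field, value):
        results = []
        for c in self.connectors:
            results.extend(c.query(field, value))
        return results

# Two hypothetical centres, queried through one access point.
m = Mediator([
    SiteConnector("centre_a", [{"patient": 1, "status": "treated"}]),
    SiteConnector("centre_b", [{"patient": 2, "status": "treated"},
                               {"patient": 3, "status": "control"}]),
])
print(len(m.query("status", "treated")))  # 2
```

    Because only the connector touches the local CIS, each institution keeps control of its data while the mediator presents researchers with a single integrated view.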

  5. Biogas utilization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moser, M.A.

    1996-01-01

    Options for successfully using biogas depend on project scale. Almost all biogas from anaerobic digesters must first go through a gas handling system that pressurizes, meters, and filters the biogas. Additional treatment, including hydrogen sulfide-mercaptan scrubbing, gas drying, and carbon dioxide removal, may be necessary for specialized uses, but these are complex and expensive processes. Thus, they can be justified only for large-scale projects that require high-quality biogas. Small-scale projects (less than 65 cfm) generally use biogas (as produced) as a boiler fuel or for fueling internal combustion engine-generators to produce electricity. If engines or boilers are selected properly, there should be no need to remove hydrogen sulfide. Small-scale combustion turbines, steam turbines, and fuel cells are not used because of their technical complexity and high capital cost. Biogas cleanup to pipeline or transportation fuel specifications is very costly, and energy economics preclude this level of treatment.

  6. Subgrid-scale models for large-eddy simulation of rotating turbulent channel flows

    NASA Astrophysics Data System (ADS)

    Silvis, Maurits H.; Bae, Hyunji Jane; Trias, F. Xavier; Abkar, Mahdi; Moin, Parviz; Verstappen, Roel

    2017-11-01

    We aim to design subgrid-scale models for large-eddy simulation of rotating turbulent flows. Rotating turbulent flows form a challenging test case for large-eddy simulation due to the presence of the Coriolis force. The Coriolis force conserves the total kinetic energy while transporting it from small to large scales of motion, leading to the formation of large-scale anisotropic flow structures. The Coriolis force may also cause partial flow laminarization and the occurrence of turbulent bursts. Many subgrid-scale models for large-eddy simulation are, however, primarily designed to parametrize the dissipative nature of turbulent flows, ignoring the specific characteristics of transport processes. We, therefore, propose a new subgrid-scale model that, in addition to the usual dissipative eddy viscosity term, contains a nondissipative nonlinear model term designed to capture transport processes, such as those due to rotation. We show that the addition of this nonlinear model term leads to improved predictions of the energy spectra of rotating homogeneous isotropic turbulence as well as of the Reynolds stress anisotropy in spanwise-rotating plane-channel flows. This work is financed by the Netherlands Organisation for Scientific Research (NWO) under Project Number 613.001.212.
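    The structure of such a model can be written schematically as follows. This is an illustrative form, with a generic nonlinear term built from the resolved strain-rate and rotation-rate tensors, rather than the authors' exact formulation:

```latex
% Schematic form: dissipative eddy-viscosity part plus a
% nondissipative nonlinear transport part (illustrative only).
\tau_{ij}^{\mathrm{mod}} =
  \underbrace{-2\,\nu_e\,\bar{S}_{ij}}_{\text{dissipative}}
  \;+\;
  \underbrace{\mu\,\bigl(\bar{S}_{ik}\bar{\Omega}_{kj}
      - \bar{\Omega}_{ik}\bar{S}_{kj}\bigr)}_{\text{nondissipative}},
\qquad
\bar{S}_{ij} = \tfrac{1}{2}\bigl(\partial_j \bar{u}_i + \partial_i \bar{u}_j\bigr),
\quad
\bar{\Omega}_{ij} = \tfrac{1}{2}\bigl(\partial_j \bar{u}_i - \partial_i \bar{u}_j\bigr).
```

    The first term drains resolved kinetic energy as in classical eddy-viscosity models; the second is traceless and does no net work on the resolved field, so it can redistribute energy and model transport effects such as those induced by rotation.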

  7. Large Scale Pedagogical Transformation as Widespread Cultural Change in Mexican Public Schools

    ERIC Educational Resources Information Center

    Rincón-Gallardo, Santiago

    2016-01-01

    This article examines how and under what conditions a new pedagogy can spread at scale using the Learning Community Project (LCP) in Mexico as a case study. Started as a small-scale, grassroots pedagogical change initiative in a handful of public schools, LCP evolved over an 8-year period into a national policy that spread its pedagogy of tutorial…

  8. Public attitudes toward programs of large-scale technological changes: Some reflections and policy prescriptions, appendix E

    NASA Technical Reports Server (NTRS)

    Shostak, A. B.

    1973-01-01

    The question of how ready the public is for the implementation of large-scale programs of technological change is considered. Four vital aspects of the issue are discussed, including: (1) the ways in which the public mis-perceives the change process, (2) the ways in which recent history impacts on public attitudes, (3) the ways in which the public divides among itself, and (4) the fundamentals of public attitudes towards change. It is concluded that nothing is so critical in the 1970s to securing public approval for large-scale planned change projects as securing change-agents' approval of the public.

  9. Overcoming Dietary Assessment Challenges in Low-Income Countries: Technological Solutions Proposed by the International Dietary Data Expansion (INDDEX) Project.

    PubMed

    Coates, Jennifer C; Colaiezzi, Brooke A; Bell, Winnie; Charrondiere, U Ruth; Leclercq, Catherine

    2017-03-16

    An increasing number of low-income countries (LICs) exhibit high rates of malnutrition coincident with rising rates of overweight and obesity. Individual-level dietary data are needed to inform effective responses, yet dietary data from large-scale surveys conducted in LICs remain extremely limited. This discussion paper first seeks to highlight the barriers to collection and use of individual-level dietary data in LICs. Second, it introduces readers to new technological developments and research initiatives to remedy this situation, led by the International Dietary Data Expansion (INDDEX) Project. Constraints to conducting large-scale dietary assessments include significant costs, time burden, technical complexity, and limited investment in dietary research infrastructure, including the necessary tools and databases required to collect individual-level dietary data in large surveys. To address existing bottlenecks, the INDDEX Project is developing a dietary assessment platform for LICs, called INDDEX24, consisting of a mobile application integrated with a web database application, which is expected to facilitate seamless data collection and processing. These tools will be subject to rigorous testing including feasibility, validation, and cost studies. To scale up dietary data collection and use in LICs, the INDDEX Project will also invest in food composition databases, an individual-level dietary data dissemination platform, and capacity development activities. Although the INDDEX Project activities are expected to improve the ability of researchers and policymakers in low-income countries to collect, process, and use dietary data, the global nutrition community is urged to commit further significant investments in order to adequately address the range and scope of challenges described in this paper.

  10. Overcoming Dietary Assessment Challenges in Low-Income Countries: Technological Solutions Proposed by the International Dietary Data Expansion (INDDEX) Project

    PubMed Central

    Coates, Jennifer C.; Colaiezzi, Brooke A.; Bell, Winnie; Charrondiere, U. Ruth; Leclercq, Catherine

    2017-01-01

    An increasing number of low-income countries (LICs) exhibit high rates of malnutrition coincident with rising rates of overweight and obesity. Individual-level dietary data are needed to inform effective responses, yet dietary data from large-scale surveys conducted in LICs remain extremely limited. This discussion paper first seeks to highlight the barriers to collection and use of individual-level dietary data in LICs. Second, it introduces readers to new technological developments and research initiatives to remedy this situation, led by the International Dietary Data Expansion (INDDEX) Project. Constraints to conducting large-scale dietary assessments include significant costs, time burden, technical complexity, and limited investment in dietary research infrastructure, including the necessary tools and databases required to collect individual-level dietary data in large surveys. To address existing bottlenecks, the INDDEX Project is developing a dietary assessment platform for LICs, called INDDEX24, consisting of a mobile application integrated with a web database application, which is expected to facilitate seamless data collection and processing. These tools will be subject to rigorous testing including feasibility, validation, and cost studies. To scale up dietary data collection and use in LICs, the INDDEX Project will also invest in food composition databases, an individual-level dietary data dissemination platform, and capacity development activities. Although the INDDEX Project activities are expected to improve the ability of researchers and policymakers in low-income countries to collect, process, and use dietary data, the global nutrition community is urged to commit further significant investments in order to adequately address the range and scope of challenges described in this paper. PMID:28300759

  11. Field-scale simulation of chemical flooding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saad, N.

    1989-01-01

    A three-dimensional compositional chemical flooding simulator (UTCHEM) has been improved. The new mathematical formulation, boundary conditions, and a description of the physicochemical models of the simulator are presented. This improved simulator has been used for the study of the low tension pilot project at the Big Muddy field near Casper, Wyoming. Both the tracer injection conducted prior to the injection of the chemical slug, and the chemical flooding stages of the pilot project, have been analyzed. Not only the oil recovery but also the tracers, polymer, alcohol and chloride histories have been successfully matched with field results. Simulation results indicate that, for this fresh water reservoir, the salinity gradient during the preflush and the resulting calcium pickup by the surfactant slug played a major role in the success of the project. In addition, analysis of the effects of the crossflow on the performance of the pilot project indicates that, for the well spacing of the pilot, crossflow does not play as important a role as it might for a large-scale project. To improve the numerical efficiency of the simulator, a third order convective differencing scheme has been applied to the simulator. This method can be used with non-uniform mesh, and therefore is suited for simulation studies of large-scale multiwell heterogeneous reservoirs. Comparison of the results with one and two dimensional analytical solutions shows that this method is effective in eliminating numerical dispersion using relatively large grid blocks. Results of one, two and three-dimensional miscible water/tracer flow, water flooding, polymer flooding, and micellar-polymer flooding test problems, and results of grid orientation studies, are presented.

  12. Raising Student Awareness of the Use of English for Specific Business Purposes in the European Context: A Staff-Student Project

    ERIC Educational Resources Information Center

    Nickerson, C.; Gerritsen, M.; Meurs, F.v.

    2005-01-01

    This Research Note reports on a large-scale staff-student project focussing on the use of English for Specific Business Purposes in a number of promotional genres (TV commercials, annual reports, corporate web-sites, print advertising) within several of the EU member states: Belgium, the Netherlands, France, Germany and Spain. The project as a…

  13. Air-quality in the mid-21st century for the city of Paris under two climate scenarios; from regional to local scale

    NASA Astrophysics Data System (ADS)

    Markakis, K.; Valari, M.; Colette, A.; Sanchez, O.; Perrussel, O.; Honore, C.; Vautard, R.; Klimont, Z.; Rao, S.

    2014-01-01

    Ozone and PM2.5 concentrations over the city of Paris are modeled with the CHIMERE air-quality model at 4 km × 4 km horizontal resolution for two future emission scenarios. A high-resolution (1 km × 1 km) emission projection until 2020 for the greater Paris region was developed by local experts (AIRPARIF) and is further extended to the year 2050 based on regional-scale emission projections developed by the Global Energy Assessment. Model evaluation is performed based on a 10 yr control simulation. Ozone is in very good agreement with measurements, while PM2.5 is underestimated by 20% over the urban area, mainly due to a large wet bias in wintertime precipitation. A significant increase of maximum ozone relative to present-time levels over Paris is modeled under the "business as usual" scenario (+7 ppb), while a more optimistic mitigation scenario leads to a moderate ozone decrease (-3.5 ppb) in the year 2050. These results differ substantially from previous regional-scale projections, in which 2050 ozone was found to decrease under both future scenarios. A sensitivity analysis showed that this difference arises because ozone formation over Paris in the current, urban-scale study is driven by VOC-limited chemistry, whereas at the regional scale ozone formation occurs under NOx-sensitive conditions. This explains why the sharp NOx reductions implemented in the future scenarios have a different effect on ozone projections at different scales. In rural areas, projections at both scales yield similar results, showing that the longer time-scale processes of emission transport and ozone formation are less sensitive to model resolution. PM2.5 concentrations decrease by 78% and 89% under the "business as usual" and "mitigation" scenarios, respectively, compared to the present-time period. The reduction is much more prominent over the urban part of the domain due to the effective reductions of road transport and residential emissions, resulting in the smoothing of the large urban increment modelled in the control simulation.

  14. Seemingly unrelated intervention time series models for effectiveness evaluation of large scale environmental remediation.

    PubMed

    Ip, Ryan H L; Li, W K; Leung, Kenneth M Y

    2013-09-15

    Large-scale environmental remediation projects applied to sea water always involve large capital investments. Rigorous effectiveness evaluations of such projects are therefore necessary and essential for policy review and future planning. This study aims at investigating the effectiveness of environmental remediation using three different Seemingly Unrelated Regression (SUR) time series models with intervention effects: Model (1) assuming no correlation within or across variables, Model (2) assuming no correlation across variables but allowing correlations within each variable across different sites, and Model (3) allowing all possible correlations among variables (i.e., an unrestricted model). The results suggested that the unrestricted SUR model is the most reliable one, consistently having the smallest variations of the estimated model parameters. We discussed our results with reference to marine water quality management in Hong Kong while bringing managerial issues into consideration. Copyright © 2013 Elsevier Ltd. All rights reserved.
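    The modelling idea can be sketched with a small synthetic example. This is a generic feasible-GLS fit of an unrestricted SUR system with a step intervention, not the authors' code, and the data are invented:

```python
import numpy as np

# Sketch of an unrestricted SUR fit (cf. Model (3)) with an intervention
# dummy, on synthetic data; estimation is feasible GLS.
rng = np.random.default_rng(0)
n = 200
step = (np.arange(n) >= 100).astype(float)     # intervention from t = 100
X = np.column_stack([np.ones(n), step])        # per-equation regressors

# Two monitoring sites with cross-correlated errors.
e = rng.multivariate_normal([0, 0], [[0.25, 0.15], [0.15, 0.25]], n)
y1 = 1.0 + 0.5 * step + e[:, 0]                # true intervention effect 0.5
y2 = 2.0 + 1.0 * step + e[:, 1]                # true intervention effect 1.0

# Step 1: equation-by-equation OLS to estimate the error covariance.
b1, *_ = np.linalg.lstsq(X, y1, rcond=None)
b2, *_ = np.linalg.lstsq(X, y2, rcond=None)
R = np.column_stack([y1 - X @ b1, y2 - X @ b2])
Sigma = R.T @ R / n

# Step 2: stacked feasible GLS using the estimated covariance.
Xb = np.zeros((2 * n, 4))
Xb[:n, :2], Xb[n:, 2:] = X, X
y = np.concatenate([y1, y2])
W = np.kron(np.linalg.inv(Sigma), np.eye(n))   # (Sigma^-1 kron I)
beta = np.linalg.solve(Xb.T @ W @ Xb, Xb.T @ W @ y)
print(beta.round(2))   # approx [1.0, 0.5, 2.0, 1.0]
```

    Exploiting the cross-site error correlation is exactly what distinguishes the unrestricted model from fitting each site separately; with correlated errors it yields more precise intervention-effect estimates.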

  15. Coordinated Parameterization Development and Large-Eddy Simulation for Marine and Arctic Cloud-Topped Boundary Layers

    NASA Technical Reports Server (NTRS)

    Bretherton, Christopher S.

    2002-01-01

    The goal of this project was to compare observations of marine and arctic boundary layers with: (1) parameterization systems used in climate and weather forecast models; and (2) two and three dimensional eddy resolving (LES) models for turbulent fluid flow. Based on this comparison, we hoped to better understand, predict, and parameterize the boundary layer structure and cloud amount, type, and thickness as functions of large scale conditions that are predicted by global climate models. The principal achievements of the project were as follows: (1) Development of a novel boundary layer parameterization for large-scale models that better represents the physical processes in marine boundary layer clouds; and (2) Comparison of column output from the ECMWF global forecast model with observations from the SHEBA experiment. Overall the forecast model did predict most of the major precipitation events and synoptic variability observed over the year of observation of the SHEBA ice camp.

  16. Evaluating large-scale health programmes at a district level in resource-limited countries.

    PubMed

    Svoronos, Theodore; Mate, Kedar S

    2011-11-01

    Recent experience in evaluating large-scale global health programmes has highlighted the need to consider contextual differences between sites implementing the same intervention. Traditional randomized controlled trials are ill-suited for this purpose, as they are designed to identify whether an intervention works, not how, when and why it works. In this paper we review several evaluation designs that attempt to account for contextual factors that contribute to intervention effectiveness. Using these designs as a base, we propose a set of principles that may help to capture information on context. Finally, we propose a tool, called a driver diagram, traditionally used in implementation that would allow evaluators to systematically monitor changing dynamics in project implementation and identify contextual variation across sites. We describe an implementation-related example from South Africa to underline the strengths of the tool. If used across multiple sites and multiple projects, the resulting driver diagrams could be pooled together to form a generalized theory for how, when and why a widely-used intervention works. Mechanisms similar to the driver diagram are urgently needed to complement existing evaluations of large-scale implementation efforts.

  17. Prospective CO2 saline resource estimation methodology: Refinement of existing US-DOE-NETL methods based on data availability

    DOE PAGES

    Goodman, Angela; Sanguinito, Sean; Levine, Jonathan S.

    2016-09-28

    Carbon storage resource estimation in subsurface saline formations plays an important role in establishing the scale of carbon capture and storage activities for governmental policy and commercial project decision-making. Prospective CO2 resource estimation of large regions or subregions, such as a basin, occurs at the initial screening stages of a project using only limited publicly available geophysical data, i.e., prior to project-specific site selection data generation. As the scale of investigation is narrowed and selected areas and formations are identified, prospective CO2 resource estimation can be refined and uncertainty narrowed when site-specific geophysical data are available. Here, we refine the United States Department of Energy – National Energy Technology Laboratory (US-DOE-NETL) methodology as the scale of investigation is narrowed from very large regional assessments down to selected areas and formations that may be developed for commercial storage. In addition, we present a new notation that explicitly identifies differences between data availability and data sources used for geologic parameters and efficiency factors as the scale of investigation is narrowed. This CO2 resource estimation method is available for screening formations in a tool called CO2-SCREEN.
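The US-DOE-NETL prospective estimate described in this abstract is, in its widely published form, a volumetric product of formation geometry, porosity, CO2 density, and a storage-efficiency factor. The sketch below is a minimal illustration of that formula only; all numeric inputs are assumed screening values, not data from the paper or from the CO2-SCREEN tool.

```python
# Minimal sketch of the US-DOE-NETL volumetric method for prospective
# CO2 storage resource estimation: G = A * h * phi * rho * E.
# All numeric inputs are illustrative assumptions, not values from the
# paper or the CO2-SCREEN tool.

def co2_storage_resource(area_m2, thickness_m, porosity, rho_co2_kg_m3, efficiency):
    """Prospective CO2 mass storage resource in kilograms."""
    return area_m2 * thickness_m * porosity * rho_co2_kg_m3 * efficiency

# Illustrative basin-scale screening inputs (assumed values):
G_kg = co2_storage_resource(
    area_m2=1.0e10,        # 10,000 km^2 formation area
    thickness_m=50.0,      # net formation thickness
    porosity=0.15,         # total porosity (fraction)
    rho_co2_kg_m3=700.0,   # CO2 density at reservoir conditions
    efficiency=0.02,       # saline storage efficiency factor
)
print(f"{G_kg / 1e12:.2f} Gt CO2")
```

In practice, screening-stage estimates would use a range of efficiency factors (for example low/most-likely/high percentiles) rather than a single value, which is part of the uncertainty narrowing the abstract describes.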

  18. Prospective CO2 saline resource estimation methodology: Refinement of existing US-DOE-NETL methods based on data availability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodman, Angela; Sanguinito, Sean; Levine, Jonathan S.

    Carbon storage resource estimation in subsurface saline formations plays an important role in establishing the scale of carbon capture and storage activities for governmental policy and commercial project decision-making. Prospective CO2 resource estimation of large regions or subregions, such as a basin, occurs at the initial screening stages of a project using only limited publicly available geophysical data, i.e., prior to project-specific site selection data generation. As the scale of investigation is narrowed and selected areas and formations are identified, prospective CO2 resource estimation can be refined and uncertainty narrowed when site-specific geophysical data are available. Here, we refine the United States Department of Energy – National Energy Technology Laboratory (US-DOE-NETL) methodology as the scale of investigation is narrowed from very large regional assessments down to selected areas and formations that may be developed for commercial storage. In addition, we present a new notation that explicitly identifies differences between data availability and data sources used for geologic parameters and efficiency factors as the scale of investigation is narrowed. This CO2 resource estimation method is available for screening formations in a tool called CO2-SCREEN.

  19. Outcomes and Process in Reading Tutoring

    ERIC Educational Resources Information Center

    Topping, K. J.; Thurston, A.; McGavock, K.; Conlin, N.

    2012-01-01

    Background: Large-scale randomised controlled trials are relatively rare in education. The present study approximates to, but is not exactly, a randomised controlled trial. It was an attempt to scale up previous small peer tutoring projects, while investing only modestly in continuing professional development for teachers. Purpose: A two-year…

  20. Report of the Fermilab ILC Citizens' Task Force

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Fermi National Accelerator Laboratory convened the ILC Citizens' Task Force to provide guidance and advice to the laboratory to ensure that community concerns and ideas are included in all public aspects of planning and design for a proposed future accelerator, the International Linear Collider. In this report, the members of the Task Force describe the process they used to gather and analyze information on all aspects of the proposed accelerator and its potential location at Fermilab in northern Illinois. They present the conclusions and recommendations they reached as a result of the learning process and their subsequent discussions and deliberations. While the Task Force was charged to provide guidance on the ILC, it became clear during the process that the high cost of the proposed accelerator made a near-term start for the project at Fermilab unlikely. Nevertheless, based on a year of extensive learning and dialogue, the Task Force developed a series of recommendations for Fermilab to consider as the laboratory develops all successor projects to the Tevatron. The Task Force recognizes that bringing a next-generation particle physics project to Fermilab will require both a large international effort and the support of the local community. While the Task Force developed its recommendations in response to the parameters of a future ILC, the principles they set forth apply directly to any large project that may be conceived at Fermilab, or at other laboratories, in the future. With this report, the Task Force fulfills its task of guiding Fermilab from the perspective of the local community on how to move forward with a large-scale project while building positive relationships with surrounding communities. The report summarizes the benefits, concerns and potential impacts of bringing a large-scale scientific project to northern Illinois.

  1. Effects of the Pacific Decadal Oscillation and global warming on drought in the US Southwest

    NASA Astrophysics Data System (ADS)

    Grossmann, I.

    2012-12-01

    Droughts are among the most expensive weather-related disasters in the US. In the semi-arid regions of the US Southwest, where average annual rainfall is already very low, multiyear droughts can have large economic, societal and ecological impacts. The US Southwest relies on annual precipitation maxima during winter and the North American Monsoon (NAM), both of which undergo considerable interannual variability associated with large-scale climate patterns, in particular ENSO, the Pacific Decadal Oscillation (PDO) and the Atlantic Multidecadal Oscillation (AMO). The region is also part of the subtropical belt projected to become more arid in a warming climate. These global-warming impacts have not previously been combined with projections of long-term variations due to natural climate patterns. This study addresses this need by deriving future projections of rainfall departures for Arizona and New Mexico with the PDO and AMO and combining these with projected global warming impacts. Depending on the precipitation dataset used, the impacts for the ongoing negative PDO phase are projected to be between 1 and 1.6 times as large as the multi-model mean projection of precipitation minus evaporation during 2020-2040 in the IPCC A1B Scenario. The projected precipitation impacts of a combined negative PDO and positive AMO phase are between 1 and 2 times as large as the A1B Scenario projection. The study also advances earlier work by addressing problems in detecting the effect of the PDO on precipitation. Given the different mechanisms with which the PDO affects precipitation during winter and the NAM season, precipitation impacts are here investigated on a monthly scale. The impacts of the PDO also vary with other climate patterns. This can be partly addressed by investigating precipitation departures conditional on those patterns.
It is further found that the long-term effect of the PDO can be more clearly separated from short-term variability by considering return periods of multi-year drought measures rather than return periods of simple drought measures.
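The distinction drawn in this abstract, return periods of multi-year drought measures versus simple annual measures, can be illustrated with an empirical plotting-position calculation. The following is a hedged sketch on synthetic data, not the study's actual method or datasets; the 3-year window and the Weibull plotting position are assumptions made for illustration only.

```python
import numpy as np

# Hedged sketch (not the paper's code): empirical return periods for a
# multi-year drought measure, here the 3-year running-mean precipitation.
# The input series is synthetic.

def running_mean(x, window):
    """Centered running mean via convolution; output is shorter by window-1."""
    return np.convolve(x, np.ones(window) / window, mode="valid")

def empirical_return_periods(values):
    """Weibull plotting-position return periods for low extremes.

    The driest (smallest) value gets the longest return period.
    """
    n = len(values)
    ranks = np.argsort(np.argsort(values)) + 1  # rank 1 = smallest (driest)
    return (n + 1) / ranks

rng = np.random.default_rng(0)
annual_precip = rng.gamma(shape=4.0, scale=80.0, size=60)  # mm/yr, synthetic
drought_measure = running_mean(annual_precip, 3)           # 58 3-yr means
T = empirical_return_periods(drought_measure)
print(T.max())  # driest 3-yr episode: return period (n+1)/1 = 59.0
```

Smoothing over multi-year windows before ranking is one simple way to separate persistent, PDO-like departures from year-to-year noise, which is the general point the abstract makes.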

  2. Use of a Modern Polymerization Pilot-Plant for Undergraduate Control Projects.

    ERIC Educational Resources Information Center

    Mendoza-Bustos, S. A.; And Others

    1991-01-01

    Described is a project where students gain experience in handling large volumes of hazardous materials, process start-up and shut-down, equipment failures, operational variations, scaling up, equipment cleaning, and run-time scheduling while working in a modern pilot plant. Included are the system design, experimental procedures, and results. (KR)

  3. Municipal Sludge Application in Forests of Northern Michigan: a Case Study.

    Treesearch

    D.G. Brockway; P.V. Nguyen

    1986-01-01

    A large-scale operational demonstration and research project was cooperatively established by the U.S. Environmental Protection Agency, Michigan Department of Natural Resources, and Michigan State University to evaluate the practice of forest land application as an option for sludge utilization. Project objectives included completing (1) a logistic and economic...

  4. What Have Researchers Learned from Project STAR?

    ERIC Educational Resources Information Center

    Schanzenbach, Diane Whitmore

    2007-01-01

    Project STAR (Student/Teacher Achievement Ratio) was a large-scale randomized trial of reduced class sizes in kindergarten through the third grade. Because of the scope of the experiment, it has been used in many policy discussions. For example, the California statewide class-size-reduction policy was justified, in part, by the successes of…

  5. Strategies for Impact: Enabling E-Learning Project Initiatives

    ERIC Educational Resources Information Center

    Csete, Josephine; Evans, Jennifer

    2013-01-01

    Purpose: The paper aims to focus on institutional initiatives to embed e-learning in a university in Hong Kong, from 2006-12, through large-scale funding of 43 e-learning projects. It outlines the guiding principles behind the university's e-learning development and discusses the significance of various procedures and practices in project…

  6. Mathematics Teachers' Take-Aways from Morning Math Problems in a Long-Term Professional Development Project

    ERIC Educational Resources Information Center

    Sevis, Serife; Cross, Dionne; Hudson, Rick

    2017-01-01

    Considering the role of mathematics-focused professional development programs in improving teachers' content knowledge and quality of teaching, we provided teachers opportunities for dealing with mathematics problems and positioning themselves as students in a large-scale long-term professional development (PD) project. In this proposal, we aimed…

  7. Multiscale socioeconomic assessment across large ecosystems: lessons from practice

    Treesearch

    Rebecca J. McLain; Ellen M. Donoghue; Jonathan Kusel; Lita Buttolph; Susan Charnley

    2008-01-01

    Implementation of ecosystem management projects has created a demand for socioeconomic assessments to predict or evaluate the impacts of ecosystem policies. Social scientists for these assessments face challenges that, although not unique to such projects, are more likely to arise than in smaller scale ones. This article summarizes lessons from our experiences with...

  8. Goal Orientations and Metacognitive Skills of Normal Technical and Normal Academic Students on Project Work

    ERIC Educational Resources Information Center

    Ee, J.; Wang, C.; Koh, C.; Tan, O.; Liu, W.

    2009-01-01

    In 2000, the Singapore Ministry of Education launched Project Work (PW) to encourage the application of knowledge across disciplines, and to develop thinking, communication, collaboration and metacognitive skills. These preliminary findings of a large-scale study examine the role of goal orientations (achievement goals and social goals) in…

  9. Network Access to Visual Information: A Study of Costs and Uses.

    ERIC Educational Resources Information Center

    Besser, Howard

    This paper summarizes a subset of the findings of a study of digital image distribution that focused on the Museum Educational Site Licensing (MESL) project--the first large-scale multi-institutional project to explore digital delivery of art images and accompanying text/metadata from disparate sources. This Mellon Foundation-sponsored study…

  10. Gaps between Beliefs, Perceptions, and Practices: The Every Teacher Project on LGBTQ-Inclusive Education in Canadian Schools

    ERIC Educational Resources Information Center

    Taylor, Catherine G.; Meyer, Elizabeth J.; Peter, Tracey; Ristock, Janice; Short, Donn; Campbell, Christopher

    2016-01-01

    The Every Teacher Project involved large-scale survey research conducted to identify the beliefs, perspectives, and practices of Kindergarten to Grade 12 educators in Canadian public schools regarding lesbian, gay, bisexual, transgender, and queer (LGBTQ)-inclusive education. Comparisons are made between LGBTQ and cisgender heterosexual…

  11. Do Sustainability Projects Stimulate Organizational Learning in Universities?

    ERIC Educational Resources Information Center

    Albrecht, Patrick; Burandt, Simon; Schaltegger, Stefan

    2007-01-01

    Purpose: The purpose of this paper is to analyze the preparation of a sustainability report and a large-scale energy-saving campaign with regards to their role for organizational learning (OL). Similar processes indicating OL were observed during the implementation of both projects. Along the lines of a theoretical framework of OL these processes…

  12. Renewable Energy Finance Tracking Initiative (REFTI) Solar Trend Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hubbell, R.; Lowder, T.; Mendelsohn, M.

    This report is a summary of the finance trends for small-scale solar photovoltaic (PV) projects (PV <1 MW), large-scale PV projects (PV greater than or equal to 1 MW), and concentrated solar power projects as reported in the National Renewable Energy Laboratory's Renewable Energy Finance Tracking Initiative (REFTI). The report presents REFTI data during the five quarterly periods from the fourth quarter of 2009 to the first half of 2011. The REFTI project relies exclusively on the voluntary participation of industry stakeholders for its data; therefore, it does not offer a comprehensive view of the technologies it tracks. Despite this limitation, REFTI is the only publicly available resource for renewable energy project financial terms. REFTI analysis offers usable inputs into the project economic evaluations of developers and investors, as well as the policy assessments of public utility commissions and others in the renewable energy industry.

  13. The Aeolus project: Science outreach through art.

    PubMed

    Drumm, Ian A; Belantara, Amanda; Dorney, Steve; Waters, Timothy P; Peris, Eulalia

    2015-04-01

    With a general decline in people's choosing to pursue science and engineering degrees there has never been a greater need to raise the awareness of lesser known fields such as acoustics. Given this context, a large-scale public engagement project, the 'Aeolus project', was created to raise awareness of acoustics science through a major collaboration between an acclaimed artist and acoustics researchers. It centred on touring the large singing sculpture Aeolus during 2011/12, though the project also included an extensive outreach programme of talks, exhibitions, community workshops and resources for schools. Described here are the motivations behind the project and the artwork itself, the ways in which scientists and an artist collaborated, and the public engagement activities designed as part of the project. Evaluation results suggest that the project achieved its goal of inspiring interest in the discipline of acoustics through the exploration of an other-worldly work of art. © The Author(s) 2013.

  14. Meta-Analyzing a Complex Correlational Dataset: A Case Study Using Correlations That Measure the Relationship between Parental Involvement and Academic Achievement

    ERIC Educational Resources Information Center

    Polanin, Joshua R.; Wilson, Sandra Jo

    2014-01-01

    The purpose of this project is to demonstrate the practical methods developed to utilize a dataset consisting of both multivariate and multilevel effect size data. The context for this project is a large-scale meta-analytic review of the predictors of academic achievement. This project is guided by three primary research questions: (1) How do we…

  15. Projected changes in precipitation intensity and frequency over complex topography: a multi-model perspective

    NASA Astrophysics Data System (ADS)

    Fischer, Andreas; Keller, Denise; Liniger, Mark; Rajczak, Jan; Schär, Christoph; Appenzeller, Christof

    2014-05-01

    Fundamental changes in the hydrological cycle are expected in a future warmer climate. This is of particular relevance for the Alpine region, as a source and reservoir of several major rivers in Europe and being prone to extreme events such as flooding. For this region, climate change assessments based on the ENSEMBLES regional climate models (RCMs) project a significant decrease in summer mean precipitation under the A1B emission scenario by the mid-to-end of this century, while winter mean precipitation is expected to slightly rise. From an impact perspective, projected changes in seasonal means, however, are often insufficient to adequately address the multifaceted challenges of climate change adaptation. In this study, we revisit the full matrix of the ENSEMBLES RCM projections regarding changes in frequency and intensity, precipitation type (convective versus stratiform) and temporal structure (wet/dry spells and transition probabilities) over Switzerland and surroundings. As proxies for rain-type changes, we rely on the model parameterized convective and large-scale precipitation components. Part of the analysis involves a Bayesian multi-model combination algorithm to infer changes from the multi-model ensemble. The analysis suggests a summer drying that evolves in an altitude-specific manner: over lowland regions it is associated with wet-day frequency decreases of convective and large-scale precipitation, while over elevated regions it is primarily associated with a decline in large-scale precipitation only. As a consequence, almost all the models project an increase in the convective fraction at elevated Alpine altitudes. The decrease in the number of wet days during summer is accompanied by decreases (increases) in multi-day wet (dry) spells. This shift in multi-day episodes also lowers the likelihood of short dry spell occurrence in all of the models. 
For spring and autumn the combined multi-model projections indicate higher mean precipitation intensity north of the Alps, while a similar tendency is expected for the winter season over most of Switzerland.

  16. Drought and heatwaves in Europe: historical reconstruction and future projections

    NASA Astrophysics Data System (ADS)

    Samaniego, Luis; Thober, Stephan; Kumar, Rohini; Rakovec, Oldrich; Wood, Eric; Sheffield, Justin; Pan, Ming; Wanders, Niko; Prudhomme, Christel

    2017-04-01

    Heat waves and droughts are creeping hydro-meteorological events that may bring societies and natural systems to their limits by inducing large famines, increasing health risks to the population, creating drinking and irrigation water shortfalls, inducing natural fires and degradation of soil and water quality, and in many cases causing large socio-economic losses. Europe, in particular, has endured large-scale drought-heat-wave events during the recent past (e.g., the 2003 European drought), which have induced enormous socio-economic losses as well as casualties. Recent studies have shown that the prediction of droughts and heatwaves is subject to large-scale forcing and parametric uncertainties that lead to considerable uncertainties in the projections of extreme characteristics such as drought magnitude/duration and area under drought, among others. Future projections are also heavily influenced by the RCP scenario uncertainty as well as the coarser spatial resolution of the models. The EDgE project funded by the Copernicus programme (C3S) provides a unique opportunity to investigate the evolution of droughts and heatwaves from 1950 until 2099 over the Pan-EU domain at a scale of 5x5 km2. In this project, high-resolution multi-model hydrologic simulations with the mHM (www.ufz.de/mhm), Noah-MP, VIC and PCR-GLOBWB have been completed for the historical period 1955-2015. Climate projections have been carried out with five CMIP-5 GCMs: GFDL-ESM2M, HadGEM2-ES, IPSL-CM5A-LR, MIROC-ESM-CHEM, NorESM1-M from 2006 to 2099 under RCP2.6 and RCP8.5. Using these unprecedented multi-model simulations, daily soil moisture index and temperature anomalies from 1955 to 2099 will be estimated. Using the procedure proposed by Samaniego et al. (2013), the probabilities of exceeding the benchmark events in the reference period 1980-2010 will be estimated for each RCP scenario. 
References http://climate.copernicus.eu/edge-end-end-demonstrator-improved-decision-making-water-sector-europe Samaniego, L., R. Kumar, and M. Zink, 2013: Implications of parameter uncertainty on soil moisture drought analysis in Germany. J. Hydrometeor., 14, 47-68, doi:10.1175/JHM-D-12-075.1. Samaniego, L., et al. 2016: Propagation of forcing and model uncertainties on to hydrological drought characteristics in a multi-model century-long experiment in large river basins. Climatic Change. 1-15.
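The soil moisture index and exceedance probabilities mentioned in this abstract are, in the Samaniego et al. (2013) line of work, percentile-based: soil moisture is mapped through a distribution estimated from a reference climatology. The sketch below uses a plain empirical CDF on synthetic numbers as a stand-in; the kernel-density estimation and the gridded, daily, multi-model setup of the actual project are not reproduced here.

```python
import numpy as np

# Hedged sketch (assumption, not EDgE project code): a soil moisture
# index (SMI) as the empirical percentile of simulated soil moisture
# relative to a reference climatology, in the spirit of quantile-based
# drought indices.

def soil_moisture_index(reference, values):
    """Map soil moisture values to [0, 1] via the reference empirical CDF."""
    ref = np.sort(np.asarray(reference))
    # Fraction of reference values at or below each query value.
    return np.searchsorted(ref, values, side="right") / len(ref)

# Synthetic reference climatology of volumetric soil moisture (m3/m3):
reference = np.linspace(0.10, 0.40, 31)
smi = soil_moisture_index(reference, [0.12, 0.25, 0.38])
# SMI below ~0.2 would flag drought under a common percentile threshold.
```

Once the index is on a common [0, 1] scale, comparing exceedance probabilities of a benchmark event across RCP scenarios reduces to counting index threshold crossings per period, which is what makes the percentile formulation convenient for multi-model work.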

  17. Transitioning a home telehealth project into a sustainable, large-scale service: a qualitative study.

    PubMed

    Wade, Victoria A; Taylor, Alan D; Kidd, Michael R; Carati, Colin

    2016-05-16

    This study was a component of the Flinders Telehealth in the Home project, which tested adding home telehealth to existing rehabilitation, palliative care and geriatric outreach services. Due to the known difficulty of transitioning telehealth projects into ongoing services, a qualitative study was conducted to produce a preferred implementation approach for sustainable and large-scale operations, and a process model that offers practical advice for achieving this goal. Initially, semi-structured interviews were conducted with senior clinicians, health service managers and policy makers, and a thematic analysis of the interview transcripts was undertaken to identify the range of options for ongoing operations, plus the factors affecting sustainability. Subsequently, the interviewees and other decision makers attended a deliberative forum in which participants were asked to select a preferred model for future implementation. Finally, all data from the study was synthesised by the researchers to produce a process model. Nineteen interviews with senior clinicians, managers, and service development staff were conducted, finding strong support for home telehealth but a wide diversity of views on governance, models of clinical care, technical infrastructure operations, and data management. The deliberative forum worked through these options and recommended a collaborative consortium approach for large-scale implementation. The process model proposes that the key factor for large-scale implementation is leadership support, which is enabled by 1) showing solutions to the problems of service demand, budgetary pressure and the relationship between hospital and primary care, 2) demonstrating how home telehealth aligns with health service policies, and 3) achieving clinician acceptance through providing evidence of benefit and developing new models of clinical care. 
Two key actions to enable change were marketing telehealth to patients, clinicians and policy-makers, and building a community of practice. The implementation of home telehealth services is still in an early stage. Change agents and a community of practice can contribute by marketing telehealth, demonstrating policy alignment and providing potential solutions for difficult health services problems. This should assist health leaders to move from trials to large-scale services.

  18. High-Resolution Climate Data Visualization through GIS- and Web-based Data Portals

    NASA Astrophysics Data System (ADS)

    WANG, X.; Huang, G.

    2017-12-01

    Sound decisions on climate change adaptation rely on an in-depth assessment of potential climate change impacts at regional and local scales, which usually requires finer-resolution climate projections at both spatial and temporal scales. However, effective downscaling of global climate projections is practically difficult due to the lack of computational resources and/or long-term reference data. Although a large volume of downscaled climate data has been made available to the public, how to understand and interpret the large-volume climate data and how to make use of the data to drive impact assessment and adaptation studies are still challenging for both impact researchers and decision makers. Such difficulties have become major barriers preventing informed climate change adaptation planning at regional scales. Therefore, this research will explore new GIS- and web-based technologies to help visualize the large-volume regional climate data with high spatiotemporal resolutions. A user-friendly public data portal, named Climate Change Data Portal (CCDP, http://ccdp.network), will be established to allow intuitive and open access to high-resolution regional climate projections at local scales. The CCDP offers functions of visual representation through geospatial maps and data downloading for a variety of climate variables (e.g., temperature, precipitation, relative humidity, solar radiation, and wind) at multiple spatial resolutions (i.e., 25-50 km) and temporal resolutions (i.e., annual, seasonal, monthly, daily, and hourly). The vast amount of information the CCDP encompasses can provide a crucial basis for assessing impacts of climate change on local communities and ecosystems and for supporting better decision making under a changing climate.

  19. A unique large-scale undergraduate research experience in molecular systems biology for non-mathematics majors.

    PubMed

    Kappler, Ulrike; Rowland, Susan L; Pedwell, Rhianna K

    2017-05-01

    Systems biology is frequently taught with an emphasis on mathematical modeling approaches. This focus effectively excludes most biology, biochemistry, and molecular biology students, who are not mathematics majors. The mathematical focus can also present a misleading picture of systems biology, which is a multi-disciplinary pursuit requiring collaboration between biochemists, bioinformaticians, and mathematicians. This article describes an authentic large-scale undergraduate research experience (ALURE) in systems biology that incorporates proteomics, bacterial genomics, and bioinformatics in one exercise. This project is designed to engage students who have a basic grounding in protein chemistry and metabolism and no mathematical modeling skills. The pedagogy around the research experience is designed to help students attack complex datasets and use their emergent metabolic knowledge to make meaning from large amounts of raw data. On completing the ALURE, participants reported a significant increase in their confidence around analyzing large datasets, while the majority of the cohort reported good or great gains in a variety of skills including "analysing data for patterns" and "conducting database or internet searches." An environmental scan shows that this ALURE is the only undergraduate-level systems-biology research project offered on a large scale in Australia; this speaks to the perceived difficulty of implementing such an opportunity for students. We argue, however, that based on the student feedback, allowing undergraduate students to complete a systems-biology project is both feasible and desirable, even if the students are not maths and computing majors. © 2016 by The International Union of Biochemistry and Molecular Biology, 45(3):235-248, 2017.

  20. Air quality in the mid-21st century for the city of Paris under two climate scenarios; from the regional to local scale

    NASA Astrophysics Data System (ADS)

    Markakis, K.; Valari, M.; Colette, A.; Sanchez, O.; Perrussel, O.; Honore, C.; Vautard, R.; Klimont, Z.; Rao, S.

    2014-07-01

    Ozone and PM2.5 concentrations over the city of Paris are modeled with the CHIMERE air-quality model at 4 km × 4 km horizontal resolution for two future emission scenarios. A high-resolution (1 km × 1 km) emission projection until 2020 for the greater Paris region is developed by local experts (AIRPARIF) and is further extended to year 2050 based on regional-scale emission projections developed by the Global Energy Assessment. Model evaluation is performed based on a 10-year control simulation. Ozone is in very good agreement with measurements while PM2.5 is underestimated by 20% over the urban area mainly due to a large wet bias in wintertime precipitation. A significant increase of maximum ozone relative to present-day levels over Paris is modeled under the "business-as-usual" scenario (+7 ppb) while a more optimistic "mitigation" scenario leads to a moderate ozone decrease (-3.5 ppb) in year 2050. These results are substantially different to previous regional-scale projections where 2050 ozone is found to decrease under both future scenarios. A sensitivity analysis showed that this difference is due to the fact that ozone formation over Paris at the current urban-scale study is driven by volatile organic compound (VOC)-limited chemistry, whereas at the regional-scale ozone formation occurs under NOx-sensitive conditions. This explains why the sharp NOx reductions implemented in the future scenarios have a different effect on ozone projections at different scales. In rural areas, projections at both scales yield similar results showing that the longer timescale processes of emission transport and ozone formation are less sensitive to model resolution. PM2.5 concentrations decrease by 78% and 89% under business-as-usual and mitigation scenarios, respectively, compared to the present-day period. 
The reduction is much more prominent over the urban part of the domain due to the effective reductions of road transport and residential emissions resulting in the smoothing of the large urban increment modeled in the control simulation.

  1. The New Big Science at the NSLS

    NASA Astrophysics Data System (ADS)

    Crease, Robert

    2016-03-01

    The term "New Big Science" refers to a phase shift in the kind of large-scale science that was carried out throughout the U.S. National Laboratory system, when large-scale materials science accelerators rather than high-energy physics accelerators became marquee projects at most major basic research laboratories in the post-Cold War era, accompanied by important changes in the character and culture of the research ecosystem at these laboratories. This talk explores some aspects of this phase shift at BNL's National Synchrotron Light Source.

  2. TARGET Publication Guidelines | Office of Cancer Genomics

    Cancer.gov

    Like other NCI large-scale genomics initiatives, TARGET is a community resource project and data are made available rapidly after validation for use by other researchers. To act in accord with the Fort Lauderdale principles and support the continued prompt public release of large-scale genomic data prior to publication, researchers who plan to prepare manuscripts containing descriptions of TARGET pediatric cancer data that would be of comparable scope to an initial TARGET disease-specific comprehensive, global analysis publication, and journal editors who receive such manuscripts, are

  3. The Navy Needs More Comprehensive Guidance for Evaluating and Supporting CostEffectiveness of LargeScale Renewable Energy Projects (REDACTED)

    DTIC Science & Technology

    2016-08-25

    Improvements’ and ‘Wind Turbine and Photovoltaic Panels’ at Fort Wainwright, Alaska,” March 7, 2011 Army A-2015-0105-IEE, “Audit of Large-Scale...for renewable energy technologies and will purchase electricity generated from renewable sources—such as solar, wind, geothermal, and biomass3—when...title 10, United States Code states maintenance and repairs of property or facilities are types of IKC. REPO personnel also stated that they have

  4. Interferometric Mapping of Perseus Outflows with MASSES

    NASA Astrophysics Data System (ADS)

    Stephens, Ian; Dunham, Michael; Myers, Philip C.; MASSES Team

    2017-01-01

    The MASSES (Mass Assembly of Stellar Systems and their Evolution with the SMA) survey, a Submillimeter Array (SMA) large-scale program, is mapping molecular lines and continuum emission about the 75 known Class 0/I sources in the Perseus Molecular Cloud. In this talk, I present some of the key results of this project, with a focus on the CO(2-1) maps of the molecular outflows. In particular, I investigate how protostars inherit their rotation axes from large-scale magnetic fields and filamentary structure.

  5. Development of a Spot-Application Tool for Rapid, High-Resolution Simulation of Wave-Driven Nearshore Hydrodynamics

    DTIC Science & Technology

    2013-09-30

    flow models, such as Delft3D, with our developed Boussinesq-type model. The vision of this project is to develop an operational tool for the...situ measurements or large-scale wave models. This information will be used to drive the offshore wave boundary condition. • Execute the Boussinesq...model to match with the Boussinesq-type theory would be one which can simulate sheared and stratified currents due to large-scale (non-wave) forcings

  6. Scale-Up: Improving Large Enrollment Physics Courses

    NASA Astrophysics Data System (ADS)

    Beichner, Robert

    1999-11-01

    The Student-Centered Activities for Large Enrollment University Physics (SCALE-UP) project is working to establish a learning environment that will promote increased conceptual understanding, improved problem-solving performance, and greater student satisfaction, while still maintaining class sizes of approximately 100. We are also addressing the new ABET engineering accreditation requirements for inquiry-based learning along with communication and team-oriented skills development. Results of studies of our latest classroom design, plans for future classroom space, and the current iteration of instructional materials will be discussed.

  7. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  8. Overview of large scale experiments performed within the LBB project in the Czech Republic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kadecka, P.; Lauerova, D.

    1997-04-01

    During several recent years NRI Rez has been performing LBB analyses of safety-significant primary circuit pipings of NPPs in the Czech and Slovak Republics. The analyses covered NPPs with reactors WWER 440 Types 230 and 213 and WWER 1000 Type 320. Within the relevant LBB projects, undertaken with the aim of proving that the LBB requirements are fulfilled, a series of large-scale experiments was performed. The goal of these experiments was to verify the properties of the selected components and to prove the quality and/or conservatism of the assessments used in the LBB analyses. In this poster, a brief overview of experiments performed in the Czech Republic under the guidance of NRI Rez is presented.

  10. Final Report for Project FG02-05ER25685

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiaosong Ma

    2009-05-07

    In this report, the PI summarizes the results and achievements of the sponsored project. Overall, the project has been very successful, producing research results in massive data-intensive computing and data management for today's large-scale supercomputers as well as open-source software products. During the project period, 14 conference/journal publications and two PhD students were supported, exclusively or in part, by this award. In addition, the PI has recently been granted tenure at NC State University.

  11. Development and Field Test of the Trial Battery for Project A. Improving the Selection, Classification and Utilization of Army Enlisted Personnel. Project A: Improving the Selection, Classification and Utilization of Army Enlisted Personnel. ARI Technical Report 739.

    ERIC Educational Resources Information Center

    Peterson, Norman G., Ed.

    As part of the United States Army's Project A, research has been conducted to develop and field test a battery of experimental tests to complement the Armed Services Vocational Aptitude Battery in predicting soldiers' job performance. Project A is the United States Army's large-scale manpower effort to improve selection, classification, and…

  12. Outcomes in a Randomised Controlled Trial of Mathematics Tutoring

    ERIC Educational Resources Information Center

    Topping, K. J.; Miller, D.; Murray, P.; Henderson, S.; Fortuna, C.; Conlin, N.

    2011-01-01

    Background: Large-scale randomised controlled trials (RCT) are relatively rare in education. The present study was an attempt to scale up previous small peer tutoring projects, while investing only modestly in continuing professional development for teachers. Purpose: A two-year RCT of peer tutoring in mathematics was undertaken in one local…

  13. NREL, California Independent System Operator, and First Solar | Energy

    Science.gov Websites

    NREL, the California Independent System Operator (CAISO), and First Solar conducted a demonstration project on a large utility-scale photovoltaic (PV) power plant in California to demonstrate essential reliability services with utility-scale solar.

  14. Large-scale visualization projects for teaching software engineering.

    PubMed

    Müller, Christoph; Reina, Guido; Burch, Michael; Weiskopf, Daniel

    2012-01-01

    The University of Stuttgart's software engineering major complements the traditional computer science major with more practice-oriented education. Two-semester software projects in various application areas offered by the university's different computer science institutes are a successful building block in the curriculum. With this realistic, complex project setting, students experience the practice of software engineering, including software development processes, technologies, and soft skills. In particular, visualization-based projects are popular with students. Such projects offer them the opportunity to gain profound knowledge that would hardly be possible with only regular lectures and homework assignments.

  15. Implementation of the Agitated Behavior Scale in the Electronic Health Record.

    PubMed

    Wilson, Helen John; Dasgupta, Kritis; Michael, Kathleen

    The purpose of the study was to implement an Agitated Behavior Scale through an electronic health record and to evaluate the usability of the scale in a brain injury unit at a rehabilitation hospital. A quality improvement project was conducted in the brain injury unit at a large rehabilitation hospital with registered nurses as participants, using convenience sampling. The project consisted of three phases and included education, implementation of the scale in the electronic health record, and administration of the survey questionnaire, which utilized the system usability scale. The Agitated Behavior Scale was found to be usable, and there was 92.2% compliance with the use of the electronic Agitated Behavior Scale. The Agitated Behavior Scale was effectively implemented in the electronic health record and was found to be usable in the assessment of agitation. Utilization of the scale through the electronic health record on a daily basis will allow for an early identification of agitation in patients with traumatic brain injury and enable prompt interventions to manage agitation.

  16. Significantly Increased Extreme Precipitation Expected in Europe and North America from Extratropical Storms

    NASA Astrophysics Data System (ADS)

    Hawcroft, M.; Hodges, K.; Walsh, E.; Zappa, G.

    2017-12-01

    For the Northern Hemisphere extratropics, changes in circulation are key to determining the impacts of climate warming. The mechanisms governing these circulation changes are complex, leading to the well documented uncertainty in projections of the future location of the mid-latitude storm tracks simulated by climate models. These storms are the primary source of precipitation for North America and Europe and generate many of the large-scale precipitation extremes associated with flooding and severe economic loss. Here, we show that in spite of the uncertainty in circulation changes, by analysing the behaviour of the storms themselves, we find entirely consistent and robust projections across an ensemble of climate models. In particular, we find that projections of change in the most intensely precipitating storms (above the present day 99th percentile) in the Northern Hemisphere are substantial and consistent across models, with large increases in the frequency of both summer (June-August, +226±68%) and winter (December-February, +186±34%) extreme storms by the end of the century. Regionally, both North America (summer +202±129%, winter +232±135%) and Europe (summer +390±148%, winter +318±114%) are projected to experience large increases in the frequency of intensely precipitating storms. These changes are thermodynamic and driven by surface warming, rather than by changes in the dynamical behaviour of the storms. Such changes in storm behaviour have the potential to have major impacts on society given intensely precipitating storms are responsible for many large-scale flooding events.

  17. Integrated water and renewable energy management: the Acheloos-Peneios region case study

    NASA Astrophysics Data System (ADS)

    Koukouvinos, Antonios; Nikolopoulos, Dionysis; Efstratiadis, Andreas; Tegos, Aristotelis; Rozos, Evangelos; Papalexiou, Simon-Michael; Dimitriadis, Panayiotis; Markonis, Yiannis; Kossieris, Panayiotis; Tyralis, Christos; Karakatsanis, Georgios; Tzouka, Katerina; Christofides, Antonis; Karavokiros, George; Siskos, Alexandros; Mamassis, Nikos; Koutsoyiannis, Demetris

    2015-04-01

    Within the ongoing research project "Combined Renewable Systems for Sustainable Energy Development" (CRESSENDO), we have developed a novel stochastic simulation framework for optimal planning and management of large-scale hybrid renewable energy systems, in which hydropower plays the dominant role. The methodology and associated computer tools are tested in two major adjacent river basins in Greece (Acheloos, Peneios) extending over 15,500 km² (12% of Greek territory). River Acheloos is characterized by very high runoff and holds ~40% of the installed hydropower capacity of Greece. On the other hand, the Thessaly plain drained by Peneios - a key agricultural region for the national economy - usually suffers from water scarcity and systematic environmental degradation. The two basins are interconnected through diversion projects, existing and planned, thus forming a unique large-scale hydrosystem whose future has been the subject of a great controversy. The study area is viewed as a hypothetically closed, energy-autonomous system, in order to evaluate the perspectives for sustainable development of its water and energy resources. In this context we seek an efficient configuration of the necessary hydraulic and renewable energy projects through integrated modelling of the water and energy balance. We investigate several scenarios of energy demand for domestic, industrial and agricultural use, assuming that part of the demand is fulfilled via wind and solar energy, while the excess or deficit of energy is regulated through large hydroelectric works that are equipped with pumping storage facilities. The overall goal is to examine under which conditions a fully renewable energy system can be technically and economically viable at such a large spatial scale.

  18. Analyzing large scale genomic data on the cloud with Sparkhit

    PubMed Central

    Huang, Liren; Krüger, Jan

    2018-01-01

    Motivation: The increasing amount of next-generation sequencing data poses a fundamental challenge for large-scale genomic analytics. Existing tools use different distributed computational platforms to scale out bioinformatics workloads, but their scalability is not efficient and they incur heavy run-time overheads when pre-processing large amounts of data. To address these limitations, we have developed Sparkhit: a distributed bioinformatics framework built on top of the Apache Spark platform. Results: Sparkhit integrates a variety of analytical methods and is implemented in the Spark extended MapReduce model. It runs 92–157 times faster than MetaSpark on metagenomic fragment recruitment and 18–32 times faster than Crossbow on data pre-processing. We analyzed 100 terabytes of data across four genomic projects in the cloud in 21 h, including the run times of cluster deployment and data downloading. Furthermore, our application on the entire Human Microbiome Project shotgun sequencing data was completed in 2 h, presenting an approach to easily associate large amounts of public datasets with reference data. Availability and implementation: Sparkhit is freely available at https://rhinempi.github.io/sparkhit/. Contact: asczyrba@cebitec.uni-bielefeld.de. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:29253074

  19. Captured metagenomics: large-scale targeting of genes based on ‘sequence capture’ reveals functional diversity in soils

    PubMed Central

    Manoharan, Lokeshwaran; Kushwaha, Sandeep K.; Hedlund, Katarina; Ahrén, Dag

    2015-01-01

    Microbial enzyme diversity is a key to understanding many ecosystem processes. Whole metagenome sequencing (WMG) obtains information on functional genes, but it is costly and inefficient due to the large amount of sequencing required. In this study, we have applied a captured metagenomics technique to functional genes in soil microorganisms, as an alternative to WMG. Large-scale targeting of functional genes, coding for enzymes related to organic matter degradation, was applied to two agricultural soil communities through captured metagenomics. Captured metagenomics uses custom-designed, hybridization-based oligonucleotide probes that enrich functional genes of interest in metagenomic libraries, where only probe-bound DNA fragments are sequenced. The captured metagenomes were highly enriched with targeted genes while maintaining their target diversity, and their taxonomic distribution correlated well with traditional ribosomal sequencing. The captured metagenomes were highly enriched with genes related to organic matter degradation; at least five times more than similar, publicly available soil WMG projects. This target enrichment technique also preserves the functional representation of the soils, thereby facilitating comparative metagenomics projects. Here, we present the first study that applies the captured metagenomics approach at a large scale, and this novel method allows deep investigations of central ecosystem processes by studying functional gene abundances. PMID:26490729

  20. Assessing and Projecting Greenhouse Gas Release due to Abrupt Permafrost Degradation

    NASA Astrophysics Data System (ADS)

    Saito, K.; Ohno, H.; Yokohata, T.; Iwahana, G.; Machiya, H.

    2017-12-01

    Permafrost is a large reservoir of frozen soil organic carbon (SOC; about half of all terrestrial storage). Its degradation (i.e., thawing) under global warming may therefore lead to the release of a substantial additional amount of greenhouse gases (GHGs). However, understanding of the processes and of the geographical distribution of such hazards, and implementation of the relevant processes in advanced climate models, remain insufficient, so that permafrost variations remain one of the largest sources of uncertainty in climatic and biogeochemical assessments and projections. Thermokarst, induced by melting of ground ice in ice-rich permafrost, leads to dynamic surface subsidence of up to 60 m, which further affects local and regional societies and ecosystems in the Arctic. It can also accelerate large-scale warming through a positive feedback between released GHGs (especially methane), atmospheric warming, and permafrost degradation. This three-year research project (2-1605, Environment Research and Technology Development Fund of the Ministry of the Environment, Japan) aims to assess and project the impacts of GHG release through dynamic permafrost degradation using in-situ and remote (e.g., satellite and airborne) observations, laboratory analysis of sampled ice and soil cores, and numerical modeling, demonstrating the vulnerability distribution and the relative impacts of large-scale versus dynamic degradation.
Our preliminary laboratory analysis of ice and soil cores sampled in 2016 at Alaskan and Siberian sites largely underlain by ice-rich permafrost shows that, although the gas volumes trapped per unit mass are more or less homogeneous among sites for both ice and soil cores, large variations are found in the methane concentration of the trapped gases, ranging from a few ppm (similar to the atmosphere) to hundreds of thousands of ppm. We will also present our numerical approach to evaluating the relative impacts of GHGs released through dynamic permafrost degradation, implementing conceptual modeling to assess and project the distribution and affected amounts of ground ice and SOC.

  1. Cost of Community Integrated Prevention Campaign for Malaria, HIV, and Diarrhea in Rural Kenya

    PubMed Central

    2011-01-01

    Background Delivery of community-based prevention services for HIV, malaria, and diarrhea is a major priority and challenge in rural Africa. Integrated delivery campaigns may offer a mechanism to achieve high coverage and efficiency. Methods We quantified the resources and costs to implement a large-scale integrated prevention campaign in Lurambi Division, Western Province, Kenya that reached 47,133 individuals (and 83% of eligible adults) in 7 days. The campaign provided HIV testing, condoms, and prevention education materials; a long-lasting insecticide-treated bed net; and a water filter. Data were obtained primarily from logistical and expenditure data maintained by implementing partners. We estimated the projected cost of a Scaled-Up Replication (SUR), assuming reliance on local managers, potential efficiencies of scale, and other adjustments. Results The cost per person served was $41.66 for the initial campaign and was projected at $31.98 for the SUR. The SUR cost included 67% for commodities (mainly water filters and bed nets) and 20% for personnel. The SUR projected unit cost per person served, by disease, was $6.27 for malaria (nets and training), $15.80 for diarrhea (filters and training), and $9.91 for HIV (test kits, counseling, condoms, and CD4 testing at each site). Conclusions A large-scale, rapidly implemented, integrated health campaign provided services to 80% of a rural Kenyan population with relatively low cost. Scaling up this design may provide similar services to larger populations at lower cost per person. PMID:22189090
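    The SUR unit-cost breakdown above is simple arithmetic; a minimal sketch (figures taken from the abstract, variable names my own) checks that the per-disease costs sum to the reported cost per person:

```python
# Unit-cost arithmetic for the campaign, using figures quoted in the abstract.
people_served = 47_133
initial_cost_per_person = 41.66  # USD, initial campaign

# Projected Scaled-Up Replication (SUR) unit costs per person, by disease.
sur_unit_costs = {
    "malaria (nets and training)": 6.27,
    "diarrhea (filters and training)": 15.80,
    "HIV (test kits, counseling, condoms, CD4 testing)": 9.91,
}

# The per-disease figures should add up to the reported SUR cost per person.
sur_total = round(sum(sur_unit_costs.values()), 2)
assert sur_total == 31.98

# Reported budget shares of the SUR cost.
commodities_share, personnel_share = 0.67, 0.20
print(f"SUR cost per person: ${sur_total:.2f}")
print(f"of which commodities: ${sur_total * commodities_share:.2f}")
```

    The check confirms the internal consistency of the reported figures: the SUR projection is about 23% cheaper per person than the initial campaign.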

  2. A Scalable Analysis Toolkit

    NASA Technical Reports Server (NTRS)

    Aiken, Alexander

    2001-01-01

    The Scalable Analysis Toolkit (SAT) project aimed to demonstrate that it is feasible and useful to statically detect software bugs in very large systems. The technical focus of the project was on a relatively new class of constraint-based techniques for analyzing software, where the desired facts about programs (e.g., the presence of a particular bug) are phrased as constraint problems to be solved. At the beginning of this project, the most successful forms of formal software analysis were limited forms of automatic theorem proving (as exemplified by the analyses used in language type systems and optimizing compilers), semi-automatic theorem proving for full verification, and model checking. With a few notable exceptions, these approaches had not been demonstrated to scale to software systems of even 50,000 lines of code. Realistic approaches to large-scale software analysis cannot hope to make every conceivable formal method scale. Thus, the SAT approach is to mix different methods in one application by using coarse and fast but still adequate methods at the largest scales, and reserving the use of more precise but also more expensive methods at smaller scales for critical aspects (that is, aspects critical to the analysis problem under consideration) of a software system. The principled method proposed for combining a heterogeneous collection of formal systems with different scalability characteristics is mixed constraints. This idea had been used previously in small-scale applications with encouraging results: using mostly coarse methods and narrowly targeted precise methods, useful information (meaning the discovery of bugs in real programs) was obtained with excellent scalability.
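    The core idea the abstract describes - phrasing facts about a program as constraint problems and solving them - can be illustrated with a toy subset-constraint solver. The tiny "program", constraint encoding, and names below are hypothetical, for illustration only, and are not SAT's actual representation:

```python
# Toy set-constraint analysis: which constants can reach which variables?
# ("const", c, v) encodes {c} ⊆ sol(v); ("flow", a, b) encodes sol(a) ⊆ sol(b).
# The constraints describe a hypothetical four-line program.
constraints = [
    ("const", 0, "x"),     # x = 0
    ("flow", "x", "y"),    # y = x
    ("const", None, "p"),  # p = NULL
    ("flow", "p", "q"),    # q = p   -> NULL may flow into q
]

def solve(constraints):
    """Propagate subset constraints to a fixed point."""
    sol = {}
    changed = True
    while changed:
        changed = False
        for kind, a, b in constraints:
            target = sol.setdefault(b, set())
            new = {a} if kind == "const" else sol.get(a, set())
            if not new <= target:      # anything not yet propagated?
                target |= new
                changed = True
    return sol

sol = solve(constraints)
assert None in sol["q"]  # the analysis flags a possible NULL reaching q
```

    Fixed-point propagation over such constraints is coarse but fast, which matches the abstract's point about reserving precise (and expensive) methods for critical parts of the system.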

  3. Using Technology to Lead a Large-Scale Data Project

    ERIC Educational Resources Information Center

    Shields, Julie

    2018-01-01

    While working in the Social and Emotional Support Services (SESS) for Montgomery County Public Schools in Maryland (MCPS), the author initiated a project to collect data on services to students with emotional needs. During the 2016-2017 school year, MCPS was the 17th largest school district in the country with nearly 160,000 students speaking 150…

  4. Education Policy, Globalization, Commercialization: An Interview with Bob Lingard by David Hursh

    ERIC Educational Resources Information Center

    Hursh, David

    2017-01-01

    In this interview with David Hursh, Bob Lingard comments on his current and/or recently completed research projects in respect to new modes of global governance in schooling and the complementarity between international large scale assessments and national testing. He also looks at a project that, in conjunction with school leaders, teachers,…

  5. Overview of the OGAP Formative Assessment Project and CPRE's Large-Scale Experimental Study of Implementation and Impacts

    ERIC Educational Resources Information Center

    Supovitz, Jonathan

    2016-01-01

    In this brief abstracted report, the author describes an ongoing partnership with the Philadelphia School District (PSD) to implement and research the Ongoing Assessment Project (OGAP). OGAP is a systematic, intentional, and iterative formative assessment system grounded in research on how students learn…

  6. Designing Professional Learning for Effecting Change: Partnerships for Local and System Networks

    ERIC Educational Resources Information Center

    Wyatt-Smith, Claire; Bridges, Susan; Hedemann, Maree; Neville, Mary

    2008-01-01

    This paper presents (i) a purpose-built conceptual model for professional learning and (ii) a leadership framework designed to support a large-scale project involving diverse sites across the state of Queensland, Australia. The project had as its focus teacher-capacity building and ways to improve literacy and numeracy outcomes for students at…

  7. Moving Knowledge Around: Strategies for Fostering Equity within Educational Systems

    ERIC Educational Resources Information Center

    Ainscow, Mel

    2012-01-01

    This paper describes and analyses the work of a large scale improvement project in England in order to find more effective ways of fostering equity within education systems. The project involved an approach based on an analysis of local context, and used processes of networking and collaboration in order to make better use of available expertise.…

  8. The battle against bark beetles in Crater Lake National Park: 1925-34

    Treesearch

    B.E. Wickman

    1987-01-01

    This history records the first large-scale bark beetle control project in a national park in the Pacific Northwest. It describes the relations between Park Service, Forest Service, and USDA Bureau of Entomology personnel; how the project was organized; the ecological implications of the outbreak; and the long-term results of direct control measures.

  9. A Multivariate Analysis of Secondary Students' Experience of Web-Based Language Acquisition

    ERIC Educational Resources Information Center

    Felix, Uschi

    2004-01-01

    This paper reports on a large-scale project designed to replicate an earlier investigation of tertiary students (Felix, 2001) in a secondary school environment. The new project was carried out in five settings, again investigating the potential of the Web as a medium of language instruction. Data was collected by questionnaires and observational…

  10. On the Brink: Activity and Resource Guide to Teaching about Massachusetts Endangered Species.

    ERIC Educational Resources Information Center

    Cervoni, Cleti, Ed.

    Project WILD is the first large-scale curriculum supplement focusing on wildlife concepts and integrated with many areas of the general school curriculum. It features decision-making processes and explores a diversity of attitudes toward wildlife. The goal of Project WILD is to prepare young people to make decisions affecting people and wildlife…

  11. Soil quality and productivity responses to watershed restoration in the Ouachita mountains of Arkansas, USA

    Treesearch

    John A. Stanturf; Daniel A. Marion; Martin Spetich; Kenneth Luckow; James M. Guldin; Hal O. Liechty; Calvin E. Meier

    2000-01-01

    The Ouachita Mountains Ecosystem Management Research Project (OEMP) is a large interdisciplinary research project designed to provide the scientific foundation for landscape management at the scale of watersheds. The OEMP has progressed through three phases: developing natural regeneration alternatives to clearcutting and planting; testing of these alternatives at the...

  12. WikiTextbooks: Designing Your Course around a Collaborative Writing Project

    ERIC Educational Resources Information Center

    Katz, Brian P.; Thoren, Elizabeth

    2014-01-01

    We have used wiki technology to support large-scale, collaborative writing projects in which the students build reference texts (called WikiTextbooks). The goal of this paper is to prepare readers to adapt this idea for their own courses. We give examples of the implementation of WikiTextbooks in a variety of courses, including lecture and…

  13. Professional Development for Culturally Responsive and Relationship-Based Pedagogy. Black Studies and Critical Thinking. Volume 24

    ERIC Educational Resources Information Center

    Sleeter, Christine E., Ed.

    2011-01-01

    The work presented here is a large-scale evaluation of a theory-driven school reform project in New Zealand, which focuses on improving the educational achievement of Maori students in public secondary schools. The project's conceptual underpinnings are based on Kaupapa Maori research, culturally responsive teaching, student voice, and…

  14. Efficient coarse simulation of a growing avascular tumor

    PubMed Central

    Kavousanakis, Michail E.; Liu, Ping; Boudouvis, Andreas G.; Lowengrub, John; Kevrekidis, Ioannis G.

    2013-01-01

    The subject of this work is the development and implementation of algorithms which accelerate the simulation of early stage tumor growth models. Among the different computational approaches used for the simulation of tumor progression, discrete stochastic models (e.g., cellular automata) have been widely used to describe processes occurring at the cell and subcell scales (e.g., cell-cell interactions and signaling processes). To describe macroscopic characteristics (e.g., morphology) of growing tumors, large numbers of interacting cells must be simulated. However, the high computational demands of stochastic models make the simulation of large-scale systems impractical. Alternatively, continuum models, which can describe behavior at the tumor scale, often rely on phenomenological assumptions in place of rigorous upscaling of microscopic models. This limits their predictive power. In this work, we circumvent the derivation of closed macroscopic equations for the growing cancer cell populations; instead, we construct, based on the so-called “equation-free” framework, a computational superstructure, which wraps around the individual-based cell-level simulator and accelerates the computations required for the study of the long-time behavior of systems involving many interacting cells. The microscopic model, e.g., a cellular automaton, which simulates the evolution of cancer cell populations, is executed for relatively short time intervals, at the end of which coarse-scale information is obtained. These coarse variables evolve on slower time scales than each individual cell in the population, enabling the application of forward projection schemes, which extrapolate their values at later times. This technique is referred to as coarse projective integration. Increasing the ratio of projection times to microscopic simulator execution times enhances the computational savings. 
Crucial accuracy issues arising for growing tumors with radial symmetry are addressed by applying the coarse projective integration scheme in a cotraveling (cogrowing) frame. As a proof of principle, we demonstrate that the application of this scheme yields highly accurate solutions, while preserving the computational savings of coarse projective integration. PMID:22587128
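    Coarse projective integration as described above can be sketched in a few lines: run the microscopic simulator for a short burst, estimate the time derivative of the coarse variable, and extrapolate it forward. The noisy growth model standing in for the cell-level simulator, and all parameter values, are purely illustrative:

```python
import random

def micro_step(n, dt=0.01, rate=0.5):
    """One noisy microscopic step for a coarse variable n (toy growth model)."""
    return n * (1 + rate * dt) + random.gauss(0, 0.01)

def projective_integration(n0, bursts=20, inner=10, dt=0.01, project=5):
    """Alternate short microscopic bursts with forward-Euler projective jumps."""
    n = n0
    for _ in range(bursts):
        start = n
        for _ in range(inner):          # short burst of microscopic simulation
            n = micro_step(n, dt)
        slope = (n - start) / (inner * dt)  # estimated coarse time derivative
        n += slope * (project * dt)         # projective jump, skipping 'project' steps
    return n

random.seed(0)
n_final = projective_integration(100.0)
assert n_final > 100.0  # growth is still captured despite the skipped steps
```

    Increasing `project` relative to `inner` raises the computational savings, exactly the trade-off the abstract describes, at the cost of extrapolation accuracy.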

  15. Existing Whole-House Solutions Case Study: Pilot Demonstration of Phased Retrofits in Florida Homes - Central and South Florida Homes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2014-08-01

    In this pilot project, the Building America Partnership for Improved Residential Construction and Florida Power and Light are collaborating to retrofit a large number of homes using a phased approach to both simple and deep retrofits. This project will provide the information necessary to significantly reduce energy use through larger community-scale projects in collaboration with utilities, program administrators and other market leader stakeholders.

  16. Reality check in the project management of EU funding

    NASA Astrophysics Data System (ADS)

    Guo, Chenbo

    2015-04-01

A talk addressing the workload, focus areas, impacts and outcomes of project management (hereinafter PM). Two FP7 projects serve as objects of investigation. In the Earth Science sector, NACLIM is a large-scale collaborative project with 18 partners from northern and western Europe. NACLIM aims at investigating and quantifying the predictability of North Atlantic/Arctic sea surface temperature, sea ice variability and change on seasonal to decadal time scales, which have a crucial impact on weather and climate in Europe. PRIMO, from Political Science, is a global PhD programme funded by the Marie Curie ITN instrument, with 11 partners from Europe, Eurasia and the BRICS countries, focusing on the rise of regional powers and its impact on international politics at large. Although the two projects are funded through different FP7 instruments, stem from different cultural backgrounds and pursue different goals, their inherent processes and the key focus of the PM are quite alike; only the operational management differs in places. From the administrative point of view, an understanding of both EU requirements and country-specific regulations is essential; it also helps identify grey areas so that the projects can be carried out more efficiently. The talk will focus on our observations of day-to-day PM flows, primarily project implementation, through a few particular cases: transparency issues (e.g., priority setting by non-research stakeholders, including conflicts in the human-resources field), end-user integration, gender issues raised during a monitoring visit, and ethical aspects of field research. Through a brief comparison of the two projects we summarize a range of dos and don'ts, an "acting instead of reacting" line of action, and the conclusion that systematic overall management is preferable to project controlling alone.
In a nutshell, the talk aims to provide the audience with a summary of the management methodologies and toolkits applied in both projects, our best practices, and the lessons learned in coordinating large international consortia.

  17. Effects of different regional climate model resolution and forcing scales on projected hydrologic changes

    NASA Astrophysics Data System (ADS)

    Mendoza, Pablo A.; Mizukami, Naoki; Ikeda, Kyoko; Clark, Martyn P.; Gutmann, Ethan D.; Arnold, Jeffrey R.; Brekke, Levi D.; Rajagopalan, Balaji

    2016-10-01

    We examine the effects of regional climate model (RCM) horizontal resolution and forcing scaling (i.e., spatial aggregation of meteorological datasets) on the portrayal of climate change impacts. Specifically, we assess how the above decisions affect: (i) historical simulation of signature measures of hydrologic behavior, and (ii) projected changes in terms of annual water balance and hydrologic signature measures. To this end, we conduct our study in three catchments located in the headwaters of the Colorado River basin. Meteorological forcings for current and a future climate projection are obtained at three spatial resolutions (4-, 12- and 36-km) from dynamical downscaling with the Weather Research and Forecasting (WRF) regional climate model, and hydrologic changes are computed using four different hydrologic model structures. These projected changes are compared to those obtained from running hydrologic simulations with current and future 4-km WRF climate outputs re-scaled to 12- and 36-km. The results show that the horizontal resolution of WRF simulations heavily affects basin-averaged precipitation amounts, propagating into large differences in simulated signature measures across model structures. The implications of re-scaled forcing datasets on historical performance were primarily observed on simulated runoff seasonality. We also found that the effects of WRF grid resolution on projected changes in mean annual runoff and evapotranspiration may be larger than the effects of hydrologic model choice, which surpasses the effects from re-scaled forcings. Scaling effects on projected variations in hydrologic signature measures were found to be generally smaller than those coming from WRF resolution; however, forcing aggregation in many cases reversed the direction of projected changes in hydrologic behavior.
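The forcing re-scaling step the study describes, spatial aggregation of a fine meteorological grid onto coarser grids, amounts to block-averaging. A minimal sketch (the field and grid sizes are hypothetical; block averaging is one common choice of aggregation operator, not necessarily the one used in the study):

```python
import numpy as np

def aggregate_forcing(field, factor):
    """Spatially aggregate a 2-D forcing field by averaging non-overlapping
    factor x factor blocks (e.g., 4-km -> 12-km uses factor=3)."""
    ny, nx = field.shape
    assert ny % factor == 0 and nx % factor == 0, "grid must tile evenly"
    return field.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))

# Hypothetical 4-km precipitation field on a 36x36 grid
rng = np.random.default_rng(1)
precip_4km = rng.gamma(shape=2.0, scale=1.5, size=(36, 36))
precip_12km = aggregate_forcing(precip_4km, 3)  # -> 12x12 grid
precip_36km = aggregate_forcing(precip_4km, 9)  # -> 4x4 grid
```

Block averaging conserves the domain-mean forcing exactly but damps spatial variance, which is one mechanism by which re-scaled forcings can alter simulated runoff seasonality.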

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolinger, Mark; Seel, Joachim

The utility-scale solar sector—defined here to include any ground-mounted photovoltaic (“PV”), concentrating photovoltaic (“CPV”), or concentrating solar power (“CSP”) project that is larger than 5 MW-AC in capacity—has led the overall U.S. solar market in terms of installed capacity since 2012. It is expected to maintain its market-leading position for at least another five years, driven in part by December 2015’s three-year extension of the 30% federal investment tax credit (“ITC”) through 2019 (coupled with a favorable switch to a “start construction” rather than a “placed in service” eligibility requirement, and a gradual phase down of the credit to 10% by 2022). In fact, in 2016 alone, the utility-scale sector is projected to install more than twice as much new capacity as it ever has previously in a single year. This unprecedented boom makes it difficult, yet more important than ever, to stay abreast of the latest utility-scale market developments and trends. This report—the fourth edition in an ongoing annual series—is intended to help meet this need, by providing in-depth, annually updated, data-driven analysis of the utility-scale solar project fleet in the United States. Drawing on empirical project-level data from a wide range of sources, this report analyzes not just installed project costs or prices—i.e., the traditional realm of most solar economic analyses—but also operating costs, capacity factors, and power purchase agreement (“PPA”) prices from a large sample of utility-scale solar projects throughout the United States. Given its current dominance in the market, utility-scale PV also dominates much of this report, though data from CPV and CSP projects are also presented where appropriate.

  19. Embarking on large-scale qualitative research: reaping the benefits of mixed methods in studying youth, clubs and drugs

    PubMed Central

    Hunt, Geoffrey; Moloney, Molly; Fazio, Adam

    2012-01-01

    Qualitative research is often conceptualized as inherently small-scale research, primarily conducted by a lone researcher enmeshed in extensive and long-term fieldwork or involving in-depth interviews with a small sample of 20 to 30 participants. In the study of illicit drugs, traditionally this has often been in the form of ethnographies of drug-using subcultures. Such small-scale projects have produced important interpretive scholarship that focuses on the culture and meaning of drug use in situated, embodied contexts. Larger-scale projects are often assumed to be solely the domain of quantitative researchers, using formalistic survey methods and descriptive or explanatory models. In this paper, however, we will discuss qualitative research done on a comparatively larger scale—with in-depth qualitative interviews with hundreds of young drug users. Although this work incorporates some quantitative elements into the design, data collection, and analysis, the qualitative dimension and approach has nevertheless remained central. Larger-scale qualitative research shares some of the challenges and promises of smaller-scale qualitative work including understanding drug consumption from an emic perspective, locating hard-to-reach populations, developing rapport with respondents, generating thick descriptions and a rich analysis, and examining the wider socio-cultural context as a central feature. However, there are additional challenges specific to the scale of qualitative research, which include data management, data overload and problems of handling large-scale data sets, time constraints in coding and analyzing data, and personnel issues including training, organizing and mentoring large research teams. 
Yet large samples can prove to be essential for enabling researchers to conduct comparative research, whether that be cross-national research within a wider European perspective undertaken by different teams or cross-cultural research looking at internal divisions and differences within diverse communities and cultures. PMID:22308079

  20. A Sensible Approach to Wireless Networking.

    ERIC Educational Resources Information Center

    Ahmed, S. Faruq

    2002-01-01

    Discusses radio frequency (R.F.) wireless technology, including industry standards, range (coverage) and throughput (data rate), wireless compared to wired networks, and considerations before embarking on a large-scale wireless project. (EV)

  1. Downscaling ocean conditions: Experiments with a quasi-geostrophic model

    NASA Astrophysics Data System (ADS)

    Katavouta, A.; Thompson, K. R.

    2013-12-01

The predictability of small-scale ocean variability, given the time history of the associated large scales, is investigated using a quasi-geostrophic model of two wind-driven gyres separated by an unstable, mid-ocean jet. Motivated by the recent theoretical study of Henshaw et al. (2003), we propose a straightforward method for assimilating information on the large scale in order to recover the small-scale details of the quasi-geostrophic circulation. The similarity of this method to the spectral nudging of limited-area atmospheric models is discussed. Results from the spectral nudging of the quasi-geostrophic model, and an independent multivariate regression-based approach, show that important features of the ocean circulation, including the position of the meandering mid-ocean jet and the associated pinch-off eddies, can be recovered from the time history of a small number of large-scale modes. We next propose a hybrid approach for assimilating both the large scales and additional observed time series from a limited number of locations that alone are too sparse to recover the small scales using traditional assimilation techniques. The hybrid approach significantly improved the recovery of the small scales. The results highlight the importance of the coupling between length scales in downscaling applications, and the value of assimilating limited point observations after the large scales have been set correctly. The application of the hybrid approach and spectral nudging to practical ocean forecasting, and to projecting changes in ocean conditions on climate time scales, is discussed briefly.
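Spectral nudging of the kind referenced here relaxes only the lowest-wavenumber modes of a model field toward a reference, leaving the small scales free to evolve. A minimal 2-D sketch under assumed doubly periodic boundaries (the wavenumber cutoff and relaxation coefficient are illustrative choices, not values from the paper):

```python
import numpy as np

def spectral_nudge(model_field, reference_field, n_modes=3, alpha=0.5):
    """Relax the lowest-wavenumber Fourier modes of a periodic 2-D field
    toward a reference; modes above the cutoff are left untouched."""
    fm = np.fft.fft2(model_field)
    fr = np.fft.fft2(reference_field)
    ny, nx = model_field.shape
    ky = np.minimum(np.arange(ny), ny - np.arange(ny))  # wavenumber magnitudes
    kx = np.minimum(np.arange(nx), nx - np.arange(nx))
    large_scale = (ky[:, None] <= n_modes) & (kx[None, :] <= n_modes)
    fm[large_scale] = (1 - alpha) * fm[large_scale] + alpha * fr[large_scale]
    return np.real(np.fft.ifft2(fm))

# Reference carries only a large-scale pattern; the model adds small-scale detail
ny = nx = 32
y, x = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
reference = np.sin(2 * np.pi * y / ny)
model = reference + 0.3 * np.sin(2 * np.pi * 10 * x / nx)
nudged = spectral_nudge(model, reference, n_modes=3, alpha=1.0)
```

With `alpha=1.0` the large-scale modes are fully replaced while the wavenumber-10 detail survives, which is the essential property exploited when downscaling from a small number of large-scale modes.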

  2. Advanced Distillation Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maddalena Fanelli; Ravi Arora; Annalee Tonkovich

    2010-03-24

The Advanced Distillation project was concluded on December 31, 2009. This U.S. Department of Energy (DOE) funded project was completed successfully and within budget during a timeline approved by DOE project managers, which included a one year extension to the initial ending date. The subject technology, Microchannel Process Technology (MPT) distillation, was expected to provide both capital and operating cost savings compared to conventional distillation technology. With efforts from Velocys and its project partners, MPT distillation was successfully demonstrated at a laboratory scale and its energy savings potential was calculated. While many objectives established at the beginning of the project were met, the project was only partially successful. At the conclusion, it appears that MPT distillation is not a good fit for the targeted separation of ethane and ethylene in large-scale ethylene production facilities, as greater advantages were seen for smaller scale distillations. Early in the project, work involved flowsheet analyses to discern the economic viability of ethane-ethylene MPT distillation and develop strategies for maximizing its impact on the economics of the process. This study confirmed that through modification to standard operating processes, MPT can enable net energy savings in excess of 20%. This advantage was used by ABB Lumus to determine the potential impact of MPT distillation on the ethane-ethylene market. The study indicated that a substantial market exists if the energy saving could be realized and if installed capital cost of MPT distillation was on par or less than conventional technology. Unfortunately, it was determined that the large number of MPT distillation units needed to perform ethane-ethylene separation for world-scale ethylene facilities, makes the targeted separation a poor fit for the technology in this application at the current state of manufacturing costs.
Over the course of the project, distillation experiments were performed with the targeted mixture, ethane-ethylene, as well as with analogous low relative volatility systems: cyclohexane-hexane and cyclopentane-pentane. Devices and test stands were specifically designed for these efforts. Development progressed from experiments and models considering sections of a full scale device to the design, fabrication, and operation of a single-channel distillation unit with integrated heat transfer. Throughout the project, analytical and numerical models and Computational Fluid Dynamics (CFD) simulations were validated with experiments in the process of developing this platform technology. Experimental trials demonstrated steady and controllable distillation for a variety of process conditions. Values of Height Equivalent to a Theoretical Plate (HETP) ranging from less than 0.5 inch to a few inches were experimentally proven, demonstrating a ten-fold performance enhancement relative to conventional distillation. This improvement, while substantial, is not sufficient for MPT distillation to displace very large scale distillation trains. Fortunately, parallel efforts in the area of business development have yielded other applications for MPT distillation, including smaller scale separations that benefit from the flowsheet flexibility offered by the technology. Talks with multiple potential partners are underway. Their outcome will also help determine the path ahead for MPT distillation.
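HETP connects packed height to separation performance: it is the column height divided by the number of theoretical stages that height achieves, so a ten-fold HETP reduction means the same separation in one-tenth the height. A back-of-envelope sketch using the Fenske total-reflux equation for a low-relative-volatility binary split (the purities and volatility below are illustrative numbers, not the project's data):

```python
import math

def n_stages_fenske(x_dist, x_bot, alpha):
    """Minimum theoretical stages at total reflux (Fenske equation) for a
    binary split: distillate purity x_dist, bottoms purity x_bot, relative
    volatility alpha."""
    return math.log((x_dist / (1 - x_dist)) * ((1 - x_bot) / x_bot)) / math.log(alpha)

def hetp(column_height, n_stages):
    """Height Equivalent to a Theoretical Plate: packed height per stage."""
    return column_height / n_stages

# A 99%/1% split at alpha = 1.2 (low volatility, as in ethane-ethylene)
# needs on the order of 50 stages even at total reflux.
n_min = n_stages_fenske(0.99, 0.01, alpha=1.2)
```

The low relative volatility is why the stage count, and hence the hardware count for microchannel devices, grows so large for world-scale ethylene service.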

  3. Success in large high-technology projects: What really works?

    NASA Astrophysics Data System (ADS)

    Crosby, P.

    2014-08-01

Despite a plethora of tools, technologies and management systems, successful execution of big science and engineering projects remains problematic. The sheer scale of globally funded projects such as the Large Hadron Collider and the Square Kilometre Array telescope means that lack of project success can impact both on national budgets, and collaborative reputations. In this paper, I explore data from contemporary literature alongside field research from several current high-technology projects in Europe and Australia, and reveal common `pressure points' that are shown to be key influencers of project control and success. I discuss how mega-science projects sit between the merely complicated and the chaotic, and explain the importance of understanding multiple dimensions of project complexity. Project manager/leader traits are briefly discussed, including capability to govern and control such enterprises. Project structures are examined, including the challenge of collaborations. I show that early attention to building project resilience, curbing optimism, and risk alertness can help prepare large high-tech projects against threats, and why project managers need to understand aspects of `the silent power of time'. Mission assurance is advanced as a critical success function, alongside the deployment of task forces and new combinations of contingency plans. I argue for increased project control through industrial-style project reviews, and show how post-project reviews are an under-used, yet invaluable avenue of personal and organisational improvement. Lastly, I discuss the avoidance of project amnesia through effective capture of project knowledge, and transfer of lessons-learned to subsequent programs and projects.

  4. Differences in flood hazard projections in Europe – their causes and consequences for decision making

    USGS Publications Warehouse

    Kundzewicz, Z. W.; Krysanova, V.; Dankers, R.; Hirabayashi, Y.; Kanae, S.; Hattermann, F. F.; Huang, S.; Milly, Paul C.D.; Stoffel, M.; Driessen, P.P.J.; Matczak, P.; Quevauviller, P.; Schellnhuber, H.-J.

    2017-01-01

This paper interprets differences in flood hazard projections over Europe and identifies likely sources of discrepancy. Further, it discusses potential implications of these differences for flood risk reduction and adaptation to climate change. The discrepancy in flood hazard projections raises caution, especially among decision makers in charge of water resources management, flood risk reduction, and climate change adaptation at regional to local scales. Because it is naïve to expect availability of trustworthy quantitative projections of future flood hazard, in order to reduce flood risk one should focus attention on mapping of current and future risks and vulnerability hotspots and improve the situation there. Although an intercomparison of flood hazard projections is done in this paper and differences are identified and interpreted, it does not seem possible to recommend which large-scale studies may be considered most credible in particular areas of Europe.

  5. Design and results of the ice sheet model initialisation experiments initMIP-Greenland: an ISMIP6 intercomparison

    NASA Astrophysics Data System (ADS)

    Goelzer, Heiko; Nowicki, Sophie; Edwards, Tamsin; Beckley, Matthew; Abe-Ouchi, Ayako; Aschwanden, Andy; Calov, Reinhard; Gagliardini, Olivier; Gillet-Chaulet, Fabien; Golledge, Nicholas R.; Gregory, Jonathan; Greve, Ralf; Humbert, Angelika; Huybrechts, Philippe; Kennedy, Joseph H.; Larour, Eric; Lipscomb, William H.; Le clec'h, Sébastien; Lee, Victoria; Morlighem, Mathieu; Pattyn, Frank; Payne, Antony J.; Rodehacke, Christian; Rückamp, Martin; Saito, Fuyuki; Schlegel, Nicole; Seroussi, Helene; Shepherd, Andrew; Sun, Sainan; van de Wal, Roderik; Ziemen, Florian A.

    2018-04-01

Earlier large-scale Greenland ice sheet sea-level projections (e.g. those run during the ice2sea and SeaRISE initiatives) have shown that ice sheet initial conditions have a large effect on the projections and give rise to important uncertainties. The goal of this initMIP-Greenland intercomparison exercise is to compare, evaluate, and improve the initialisation techniques used in the ice sheet modelling community and to estimate the associated uncertainties in modelled mass changes. initMIP-Greenland is the first in a series of ice sheet model intercomparison activities within ISMIP6 (the Ice Sheet Model Intercomparison Project for CMIP6), which is the primary activity within the Coupled Model Intercomparison Project Phase 6 (CMIP6) focusing on the ice sheets. Two experiments for the large-scale Greenland ice sheet have been designed to allow intercomparison between participating models of (1) the initial present-day state of the ice sheet and (2) the response in two idealised forward experiments. The forward experiments serve to evaluate the initialisation in terms of model drift (forward run without additional forcing) and in response to a large perturbation (prescribed surface mass balance anomaly); they should not be interpreted as sea-level projections. We present and discuss results that highlight the diversity of data sets, boundary conditions, and initialisation techniques used in the community to generate initial states of the Greenland ice sheet. We find good agreement across the ensemble for the dynamic response to surface mass balance changes in areas where the simulated ice sheets overlap, but observe differences arising from the initial size of the ice sheet. The model drift in the control experiment is reduced for models that participated in earlier intercomparison exercises.

  8. Monitoring conservation success in a large oak woodland landscape

    Treesearch

    Rich Reiner; Emma Underwood; John-O Niles

    2002-01-01

    Monitoring is essential in understanding the success or failure of a conservation project and provides the information needed to conduct adaptive management. Although there is a large body of literature on monitoring design, it fails to provide sufficient information to practitioners on how to organize and apply monitoring when implementing landscape-scale conservation...

  9. E-Learning in a Large Organization: A Study of the Critical Role of Information Sharing

    ERIC Educational Resources Information Center

    Netteland, Grete; Wasson, Barbara; Morch, Anders I

    2007-01-01

    Purpose: The purpose of this paper is to provide new insights into the implementation of large-scale learning projects; thereby better understanding the difficulties, frustrations, and obstacles encountered when implementing enterprise-wide e-learning as a tool for training and organization transformation in a complex organization.…

  10. A Navy Shore Activity Manpower Planning System for Civilians. Technical Report No. 24.

    ERIC Educational Resources Information Center

    Niehaus, R. J.; Sholtz, D.

    This report describes the U.S. Navy Shore Activity Manpower Planning System (SAMPS) advanced development research project. This effort is aimed at large-scale feasibility tests of manpower models for large Naval installations. These local planning systems are integrated with Navy-wide information systems on a data-communications network accessible…

  11. Large-Scale Atmospheric Circulation Patterns Associated with Temperature Extremes as a Basis for Model Evaluation: Methodological Overview and Results

    NASA Astrophysics Data System (ADS)

    Loikith, P. C.; Broccoli, A. J.; Waliser, D. E.; Lintner, B. R.; Neelin, J. D.

    2015-12-01

    Anomalous large-scale circulation patterns often play a key role in the occurrence of temperature extremes. For example, large-scale circulation can drive horizontal temperature advection or influence local processes that lead to extreme temperatures, such as by inhibiting moderating sea breezes, promoting downslope adiabatic warming, and affecting the development of cloud cover. Additionally, large-scale circulation can influence the shape of temperature distribution tails, with important implications for the magnitude of future changes in extremes. As a result of the prominent role these patterns play in the occurrence and character of extremes, the way in which temperature extremes change in the future will be highly influenced by if and how these patterns change. It is therefore critical to identify and understand the key patterns associated with extremes at local to regional scales in the current climate and to use this foundation as a target for climate model validation. This presentation provides an overview of recent and ongoing work aimed at developing and applying novel approaches to identifying and describing the large-scale circulation patterns associated with temperature extremes in observations and using this foundation to evaluate state-of-the-art global and regional climate models. Emphasis is given to anomalies in sea level pressure and 500 hPa geopotential height over North America using several methods to identify circulation patterns, including self-organizing maps and composite analysis. Overall, evaluation results suggest that models are able to reproduce observed patterns associated with temperature extremes with reasonable fidelity in many cases. Model skill is often highest when and where synoptic-scale processes are the dominant mechanisms for extremes, and lower where sub-grid scale processes (such as those related to topography) are important. 
Where model skill in reproducing these patterns is high, it can be inferred that extremes are being simulated for plausible physical reasons, boosting confidence in future projections of temperature extremes. Conversely, where model skill is identified to be lower, caution should be exercised in interpreting future projections.
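Composite analysis, one of the pattern-identification methods mentioned, simply averages the circulation field over the days on which a temperature index exceeds a chosen percentile. A minimal sketch with synthetic data (the ridge pattern, grid size, and 95th-percentile threshold are assumptions for illustration):

```python
import numpy as np

def composite_anomaly(fields, values, percentile=95):
    """Average a circulation field (e.g., 500 hPa height anomalies,
    shape [time, ny, nx]) over the days when a station temperature
    index exceeds the given percentile."""
    threshold = np.percentile(values, percentile)
    extreme_days = values >= threshold
    return fields[extreme_days].mean(axis=0)

# Synthetic example: a positive height anomaly (ridge) accompanies warm extremes
rng = np.random.default_rng(2)
n_days, ny, nx = 1000, 8, 8
temps = rng.normal(size=n_days)
ridge = np.ones((ny, nx))
fields = rng.normal(size=(n_days, ny, nx)) + temps[:, None, None] * ridge
comp = composite_anomaly(fields, temps)
```

Averaging over extreme days suppresses the day-to-day noise and leaves the ridge, which is the signal such composites are compared against in model evaluation.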

  12. DISRUPTION OF LARGE-SCALE NEURAL NETWORKS IN NON-FLUENT/AGRAMMATIC VARIANT PRIMARY PROGRESSIVE APHASIA ASSOCIATED WITH FRONTOTEMPORAL DEGENERATION PATHOLOGY

    PubMed Central

    Grossman, Murray; Powers, John; Ash, Sherry; McMillan, Corey; Burkholder, Lisa; Irwin, David; Trojanowski, John Q.

    2012-01-01

    Non-fluent/agrammatic primary progressive aphasia (naPPA) is a progressive neurodegenerative condition most prominently associated with slowed, effortful speech. A clinical imaging marker of naPPA is disease centered in the left inferior frontal lobe. We used multimodal imaging to assess large-scale neural networks underlying effortful expression in 15 patients with sporadic naPPA due to frontotemporal lobar degeneration (FTLD) spectrum pathology. Effortful speech in these patients is related in part to impaired grammatical processing, and to phonologic speech errors. Gray matter (GM) imaging shows frontal and anterior-superior temporal atrophy, most prominently in the left hemisphere. Diffusion tensor imaging reveals reduced fractional anisotropy in several white matter (WM) tracts mediating projections between left frontal and other GM regions. Regression analyses suggest disruption of three large-scale GM-WM neural networks in naPPA that support fluent, grammatical expression. These findings emphasize the role of large-scale neural networks in language, and demonstrate associated language deficits in naPPA. PMID:23218686

  13. Remote Imaging Applied to Schistosomiasis Control: The Anning River Project

    NASA Technical Reports Server (NTRS)

    Seto, Edmund Y. W.; Maszle, Don R.; Spear, Robert C.; Gong, Peng

    1997-01-01

The use of satellite imaging to remotely detect areas of high risk for transmission of infectious disease is an appealing prospect for large-scale monitoring of these diseases. The detection of large-scale environmental determinants of disease risk, often called landscape epidemiology, has been motivated by several authors (Pavlovsky 1966; Meade et al. 1988). The basic notion is that large-scale factors such as population density, air temperature, hydrological conditions, soil type, and vegetation can determine in a coarse fashion the local conditions contributing to disease vector abundance and human contact with disease agents. These large-scale factors can often be remotely detected by sensors or cameras mounted on satellite or aircraft platforms and can thus be used in a predictive model to mark high risk areas of transmission and to target control or monitoring efforts. Reviews of satellite technologies for this purpose were presented by Washino and Wood (1994), Hay (1997), and Hay et al. (1997).
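A predictive model of the kind described, mapping remotely sensed large-scale factors to transmission risk, can be as simple as a logistic score over per-pixel covariates. A toy sketch with assumed coefficients (the variables, weights, and threshold are hypothetical, not fitted to any real data):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def transmission_risk(ndvi, temp_c, weights=(-3.0, 2.5, 0.15)):
    """Toy landscape-epidemiology risk score: probability of suitable
    vector habitat from remotely sensed vegetation (NDVI) and air
    temperature, using assumed logistic coefficients."""
    b0, b_ndvi, b_temp = weights
    return sigmoid(b0 + b_ndvi * ndvi + b_temp * temp_c)

# Score three hypothetical pixels and flag those above 0.5 for targeted control
ndvi = np.array([0.1, 0.6, 0.8])
temp = np.array([12.0, 22.0, 25.0])
risk = transmission_risk(ndvi, temp)
high_risk = risk > 0.5
```

In practice the coefficients would be fitted against ground-truth surveys; the point of the sketch is only that coarse satellite covariates feed a per-pixel classifier that marks areas for monitoring.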

  14. Studies of Sub-Synchronous Oscillations in Large-Scale Wind Farm Integrated System

    NASA Astrophysics Data System (ADS)

    Yue, Liu; Hang, Mend

    2018-01-01

    With the rapid development of large-scale wind farms and their grid-connected operation, series-compensated AC transmission is gradually becoming the main means of delivering wind power and of improving wind power availability and grid stability. However, the integration of wind farms changes the sub-synchronous oscillation (SSO) damping characteristics of the synchronous generator system. Addressing this SSO problem, this paper focuses on doubly fed induction generator (DFIG) based wind farms and summarizes the SSO mechanisms in large-scale wind power integrated systems with series compensation, which can be classified into three types: sub-synchronous control interaction (SSCI), sub-synchronous torsional interaction (SSTI), and sub-synchronous resonance (SSR). SSO modelling and analysis methods are then categorized and compared by their areas of applicability. Furthermore, this paper summarizes the suppression measures adopted in actual SSO projects for different control objectives. Finally, research prospects in this field are explored.

  15. Fine resolution probabilistic land cover classification of landscapes in the southeastern United States

    Treesearch

    Joseph St. Peter; John Hogland; Nathaniel Anderson; Jason Drake; Paul Medley

    2018-01-01

    Land cover classification provides valuable information for prioritizing management and conservation operations across large landscapes. Current regional scale land cover geospatial products within the United States have a spatial resolution that is too coarse to provide the necessary information for operations at the local and project scales. This paper describes a...

  16. So What's New? A Survey of the Educational Policies of Orchestras and Opera Companies

    ERIC Educational Resources Information Center

    Winterson, Julia

    2010-01-01

    The creative music workshop involving professional players was intended to give direct support to school teachers and to enhance music in the classroom. However, today's large-scale, high-profile projects mounted by orchestras and opera companies appear to be developing into a full-scale industry on their own, their role in partnership with…

  17. Desalination: Status and Federal Issues

    DTIC Science & Technology

    2009-12-30

    on one side and lets purified water through. Reverse osmosis plants have fewer problems with corrosion and usually have lower energy requirements...Texas) and cities are actively researching and investigating the feasibility of large-scale desalination plants for municipal water supplies...desalination research and development, and in construction and operational costs of desalination demonstration projects and full-scale plants

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacques Hugo

    Traditional engineering methods do not make provision for the integration of human considerations, while traditional human factors methods do not scale well to the complexity of large-scale nuclear power plant projects. Although the need for up-to-date human factors engineering processes and tools is recognised widely in industry, so far no formal guidance has been developed. This article proposes such a framework.

  19. Development of a 3D printer using scanning projection stereolithography

    PubMed Central

    Lee, Michael P.; Cooper, Geoffrey J. T.; Hinkley, Trevor; Gibson, Graham M.; Padgett, Miles J.; Cronin, Leroy

    2015-01-01

    We have developed a system for the rapid fabrication of low-cost 3D devices and systems in the laboratory, with micro-scale features on cm-scale objects. Our system is inspired by maskless lithography, where a digital micromirror device (DMD) is used to project patterns with resolution up to 10 µm onto a layer of photoresist. Large-area objects can be fabricated by stitching projected images over a 5 cm2 area. The addition of a z-stage allows multiple layers to be stacked to create 3D objects, removing the need for any developing or etching steps and at the same time yielding true 3D devices which are robust, configurable and scalable. We demonstrate the applications of the system by printing a range of micro-scale objects as well as a fully functioning microfluidic droplet device, whose integrity we test by pumping dye through the channels. PMID:25906401

  20. An extended basis inexact shift-invert Lanczos for the efficient solution of large-scale generalized eigenproblems

    NASA Astrophysics Data System (ADS)

    Rewieński, M.; Lamecki, A.; Mrozowski, M.

    2013-09-01

    This paper proposes a technique, based on the Inexact Shift-Invert Lanczos (ISIL) method with Inexact Jacobi Orthogonal Component Correction (IJOCC) refinement, and a preconditioned conjugate-gradient (PCG) linear solver with multilevel preconditioner, for finding several eigenvalues for generalized symmetric eigenproblems. Several eigenvalues are found by constructing (with the ISIL process) an extended projection basis. The presented numerical experiments confirm that the technique can be effectively applied to challenging, large-scale problems characterized by very dense spectra, such as resonant cavities with spatial dimensions which are large with respect to wavelengths of the resonating electromagnetic fields. It is also shown that the proposed scheme based on inexact linear solves delivers superior performance, as compared to methods which rely on exact linear solves, indicating tremendous potential of the 'inexact solve' concept. Finally, the scheme which generates an extended projection basis is found to provide a cost-efficient alternative to classical deflation schemes when several eigenvalues are computed.
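
To illustrate the shift-invert idea underlying ISIL (this is not the authors' ISIL/IJOCC code), a generalized symmetric eigenproblem can be solved in shift-invert mode with SciPy's Lanczos-based `eigsh`. The matrices below form a standard 1D Laplacian test case, not the resonant-cavity problems of the paper.

```python
import numpy as np
from scipy.sparse import diags, identity
from scipy.sparse.linalg import eigsh

# Generalized symmetric eigenproblem A x = lambda B x:
# A is the 1D Laplacian stencil, B a (trivial) mass matrix.
n = 200
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
B = identity(n, format="csc")

# Shift-invert around sigma: eigsh iterates with (A - sigma*B)^{-1} B,
# so eigenvalues closest to sigma become dominant and the Lanczos
# iteration converges quickly even in a very dense spectrum.
vals, vecs = eigsh(A, k=4, M=B, sigma=0.0, which="LM")

# Analytic eigenvalues of this Laplacian: 2 - 2*cos(k*pi/(n+1)).
exact = np.array([2 - 2 * np.cos(k * np.pi / (n + 1)) for k in range(1, 5)])
```

Each application of the shift-inverted operator requires a linear solve; the paper's contribution is showing that these solves can be performed inexactly (with a preconditioned iterative solver) without losing convergence.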

  1. Data management for community research projects: A JGOFS case study

    NASA Technical Reports Server (NTRS)

    Lowry, Roy K.

    1992-01-01

    Since the mid-1980s, much of the marine science research effort in the United Kingdom has been focused into large-scale collaborative projects involving public sector laboratories and university departments, termed Community Research Projects. Two of these, the Biogeochemical Ocean Flux Study (BOFS) and the North Sea Project, incorporated large-scale data collection to underpin multidisciplinary modeling efforts. The challenge of providing project data sets to support the science was met by a small team within the British Oceanographic Data Centre (BODC) operating as a topical data center. The role of the data center was both to work up the data from the ships' sensors and to combine these data with sample measurements into online databases. The working up of the data was achieved by a unique symbiosis between data center staff and project scientists. The project management, programming and data processing skills of the data center were combined with the oceanographic experience of the project communities to develop a system which has produced quality-controlled, calibrated data sets from 49 research cruises in 3.5 years of operation. The data center resources required to achieve this were modest and far outweighed by the time liberated in the scientific community by the removal of the data processing burden. Two online project databases have been assembled containing a very high proportion of the data collected. As these are under the control of BODC, their long-term availability as part of the UK national data archive is assured. The success of the topical data center model for UK Community Research Project data management has been founded upon the strong working relationships forged between the data center and project scientists. These can only be established by frequent personal contact, and hence the relatively small size of the UK has been a critical factor. However, projects covering a larger, even international, scale could be successfully supported by a network of topical data centers managing online databases interconnected by object-oriented distributed data management systems over wide area networks.

  2. Urban Greening Bay Area

    EPA Pesticide Factsheets

    Information about the San Francisco Bay Water Quality Project (SFBWQP) Urban Greening Bay Area, a large-scale effort to re-envision urban landscapes to include green infrastructure (GI) making communities more livable and reducing stormwater runoff.

  3. The XChemExplorer graphical workflow tool for routine or large-scale protein–ligand structure determination

    PubMed Central

    Krojer, Tobias; Talon, Romain; Pearce, Nicholas; Douangamath, Alice; Brandao-Neto, Jose; Dias, Alexandre; Marsden, Brian

    2017-01-01

    XChemExplorer (XCE) is a data-management and workflow tool to support large-scale simultaneous analysis of protein–ligand complexes during structure-based ligand discovery (SBLD). The user interfaces of established crystallographic software packages such as CCP4 [Winn et al. (2011), Acta Cryst. D67, 235–242] or PHENIX [Adams et al. (2010), Acta Cryst. D66, 213–221] have entrenched the paradigm that a ‘project’ is concerned with solving one structure. This does not hold for SBLD, where many almost identical structures need to be solved and analysed quickly in one batch of work. Functionality to track progress and annotate structures is essential. XCE provides an intuitive graphical user interface which guides the user from data processing, initial map calculation, ligand identification and refinement through to data dissemination. It provides multiple entry points depending on the needs of each project, enables batch processing of multiple data sets, and records metadata, progress and annotations in an SQLite database. XCE is freely available and works on any Linux and Mac OS X system; the only dependency is the latest version of CCP4. The design and usage of this tool are described here, and its usefulness is demonstrated in the context of fragment-screening campaigns at the Diamond Light Source. It is routinely used to analyse projects comprising 1000 data sets or more, and therefore scales well to even very large ligand-design projects. PMID:28291762

  4. Why build a virtual brain? Large-scale neural simulations as jump start for cognitive computing

    NASA Astrophysics Data System (ADS)

    Colombo, Matteo

    2017-03-01

    Despite the impressive amount of financial resources recently invested in carrying out large-scale brain simulations, it is controversial what the pay-offs are of pursuing this project. One idea is that from designing, building, and running a large-scale neural simulation, scientists acquire knowledge about the computational performance of the simulating system, rather than about the neurobiological system represented in the simulation. It has been claimed that this knowledge may usher in a new era of neuromorphic, cognitive computing systems. This study elucidates this claim and argues that the main challenge this era is facing is not the lack of biological realism. The challenge lies in identifying general neurocomputational principles for the design of artificial systems, which could display the robust flexibility characteristic of biological intelligence.

  5. Macroweather Predictions and Climate Projections using Scaling and Historical Observations

    NASA Astrophysics Data System (ADS)

    Hébert, R.; Lovejoy, S.; Del Rio Amador, L.

    2017-12-01

    There are two fundamental time scales that are pertinent to decadal forecasts and multidecadal projections. The first is the lifetime of planetary-scale structures, about 10 days (equal to the deterministic predictability limit), and the second is, in the anthropocene, the scale at which the forced anthropogenic variability exceeds the internal variability (around 16-18 years). These two time scales define three regimes of variability: weather, macroweather and climate, characterized respectively by increasing, decreasing and then again increasing variability with scale. We discuss how macroweather temperature variability can be skilfully predicted to its theoretical stochastic predictability limits by exploiting its long-range memory with the Stochastic Seasonal and Interannual Prediction System (StocSIPS). At multi-decadal timescales, the temperature response to forcing is approximately linear, and this can be exploited to make projections with a Green's function, or Climate Response Function (CRF). To make the problem tractable, we exploit the temporal scaling symmetry and restrict our attention to global mean forcing and temperature response, using a scaling CRF characterized by the scaling exponent H and an inner scale of linearity τ. An aerosol linear scaling factor α and a non-linear volcanic damping exponent ν were introduced to account for the large uncertainty in these forcings. We estimate the model and forcing parameters by Bayesian inference using historical data, and these allow us to analytically calculate a median (and likely 66% range) for the transient climate response and for the equilibrium climate sensitivity: 1.6 K ([1.5, 1.8] K) and 2.4 K ([1.9, 3.4] K) respectively. Aerosol forcing typically has large uncertainty, and we find a modern (2005) forcing very likely range (90%) of [-1.0, -0.3] Wm-2 with median at -0.7 Wm-2. Projecting to 2100, we find that to keep the warming below 1.5 K, future emissions must undergo cuts similar to Representative Concentration Pathway (RCP) 2.6, for which the probability of remaining under 1.5 K is 48%. RCP 4.5 and RCP 8.5-like futures overshoot with very high probability. This underscores that over the next century, the state of the environment will be strongly influenced by past, present and future economic policies.
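
The CRF approach can be sketched numerically as a discrete convolution of a forcing history with a power-law response kernel. The exponent H and inner scale tau follow the abstract's notation, but the kernel form, parameter values and the synthetic forcing ramp below are illustrative assumptions, not the authors' fitted model.

```python
import numpy as np

# Scaling (power-law) Climate Response Function: G(t) ~ (t/tau)^(H-1)
# for t >= tau, zero below the inner scale of linearity tau.
# H = -0.5 and tau = 2 years are placeholder values.
def scaling_crf(t, H=-0.5, tau=2.0):
    g = np.where(t >= tau, (t / tau) ** (H - 1), 0.0)
    return g / g.sum()  # normalize so constant forcing gives a bounded response

years = np.arange(1, 151)                    # 150 annual steps
forcing = np.minimum(years / 50.0, 2.0)      # synthetic ramp-then-plateau (W m^-2)
g = scaling_crf(years.astype(float))

# Discrete convolution of forcing with the kernel yields the
# global-mean temperature response (in units of sensitivity x forcing).
response = np.convolve(forcing, g)[:len(years)]
```

Because the kernel is non-negative and normalized, the response to a nondecreasing forcing is itself nondecreasing and bounded by the peak forcing, which is the qualitative behaviour the linear-response assumption relies on.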

  6. Pro-sustainability choices and child deaths averted: from project experience to investment strategy.

    PubMed

    Sarriot, Eric G; Swedberg, Eric A; Ricca, James G

    2011-05-01

    The pursuit of the Millennium Development Goals and advancing the 'global health agenda' demand the achievement of health impact at scale through efficient investments. We have previously argued that sustainability, a necessary condition for successful expansion of programmes, can be addressed in practical terms. Based on benchmarks from actual child survival projects, we assess the expected impact of translating pro-sustainability choices into investment strategies. We review the experience of Save the Children US in Guinea in terms of investment, approach to sustainability and impact. It offers three benchmarks for impact: Entry project (21 lives saved of children under age five per US$100,000), Expansion project (37 LS/US$100k) and Continuation project (100 LS/US$100k). Extrapolating this experience, we model the impact of a traditional investment scenario against a pro-sustainability scenario and compare the deaths averted per dollar spent over five project cycles. The impact per dollar spent on a pro-sustainability strategy is 3.4 times that of a traditional one over the long run (range from 2.2 to 5.7 times in a sensitivity analysis). This large efficiency differential between two investment approaches offers a testable hypothesis for large-scale/long-term studies. The 'bang for the buck' of health programmes could be greatly increased by following a pro-sustainability investment strategy.

  7. Promoting Student Progressions in Science Classrooms: A Video Study

    ERIC Educational Resources Information Center

    Jin, Hui; Johnson, Michele E.; Shin, Hyo Jeong; Anderson, Charles W.

    2017-01-01

    This study was conducted in a large-scale environmental literacy project. In the project, we developed a Learning Progression Framework (LPF) for matter and energy in social-ecological systems; the LPF contains four achievement levels. Based on the LPF, we designed a Plant Unit to help Levels 2 and 3 students advance to Level 4 of the LPF. In the…

  8. eIFL (Electronic Information for Libraries): A Global Initiative of the Soros Foundations Network.

    ERIC Educational Resources Information Center

    Feret, Blazej; Kay, Michael

    This paper presents the history, current status, and future development of eIFL (Electronic Information for Libraries Direct)--a large-scale project run by the Soros Foundations Network and the Open Society Institute. The project aims to provide libraries in developing countries with access to a menu of electronic information resources. In 1999,…

  9. Global Economic Integration and Local Community Resilience: Road Paving and Rural Demographic Change in the Southwestern Amazon

    ERIC Educational Resources Information Center

    Perz, Stephen G.; Cabrera, Liliana; Carvalho, Lucas Araujo; Castillo, Jorge; Barnes, Grenville

    2010-01-01

    Recent years have witnessed an expansion in international investment in large-scale infrastructure projects with the goal of achieving global economic integration. We focus on one such project, the Inter-Oceanic Highway in the "MAP" region, a trinational frontier where Bolivia, Brazil, and Peru meet in the southwestern Amazon. We adopt a…

  10. Quality Daily Physical Education for the Primary School Student: A Personal Account of the Trois-Rivieres Regional Project

    ERIC Educational Resources Information Center

    Shephard, Roy J.; Trudeau, Francois

    2013-01-01

    This article offers a brief and personal account of the historical background, implementation and principal findings from the Trois-Rivieres regional project, a large-scale quasi-experimental intervention that tested the impact of providing a daily hour of specialist-taught quality physical education upon the physical and mental development of…

  11. Putting Ourselves in the Big Picture: A Sustainable Approach to Project Management for e-Learning

    ERIC Educational Resources Information Center

    Buchan, Janet

    2010-01-01

    In a case study of a large Australian university, the metaphor of panarchy is used as a means of describing and understanding the complex interrelationships of multi-scale institutional projects and the influence of a variety of factors on the potential success of e-learning initiatives. The concept of para-analysis is introduced as a management…

  12. High subsonic flow tests of a parallel pipe followed by a large area ratio diffuser

    NASA Technical Reports Server (NTRS)

    Barna, P. S.

    1975-01-01

    Experiments were performed on a pilot model duct system in order to explore its aerodynamic characteristics. The model was scaled from a design projected for the high speed operation mode of the Aircraft Noise Reduction Laboratory. The test results show that the model performed satisfactorily and therefore the projected design will most likely meet the specifications.

  13. A Practical Model of Development for China's National Quality Course Plan

    ERIC Educational Resources Information Center

    Long, Wang; Haklev, Stian

    2012-01-01

    The Chinese National Quality Course Plan is a large-scale project by the Ministry of Education, which has led to the production of more than 12,000 courses from some 700 universities since 2003. This paper describes in detail the purpose of the project and how it is organized at all levels, including how individual courses get selected at…

  14. MouseNet database: digital management of a large-scale mutagenesis project.

    PubMed

    Pargent, W; Heffner, S; Schäble, K F; Soewarto, D; Fuchs, H; Hrabé de Angelis, M

    2000-07-01

    The Munich ENU Mouse Mutagenesis Screen is a large-scale mutant production, phenotyping, and mapping project. It encompasses two animal breeding facilities and a number of screening groups located in the general area of Munich. A central database is required to manage and process the immense amount of data generated by the mutagenesis project. This database, which we named MouseNet(c), runs on a Sybase platform and will finally store and process all data from the entire project. In addition, the system comprises a portfolio of functions needed to support the workflow management of the core facility and the screening groups. MouseNet(c) will make all of the data available to the participating screening groups, and later to the international scientific community. MouseNet(c) will consist of three major software components: an Animal Management System (AMS), a Sample Tracking System (STS) and a Result Documentation System (RDS). MouseNet(c) provides the following major advantages: it is accessible from different client platforms via the Internet; it is a full-featured multi-user system (including access restriction and data locking mechanisms); it relies on a professional RDBMS (relational database management system) running on a UNIX server platform; and it supplies workflow functions and a variety of plausibility checks.
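
The three-component architecture described in this record can be caricatured as a minimal relational layout. This is purely an illustrative sketch in SQLite: the table and column names are invented, and the production system described runs on Sybase, not SQLite.

```python
import sqlite3

# Minimal tables echoing the three MouseNet components: animal
# management, sample tracking, and result documentation.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE animal (
    animal_id  INTEGER PRIMARY KEY,
    line       TEXT NOT NULL,        -- mutant line identifier (invented)
    born       TEXT NOT NULL
);
CREATE TABLE sample (
    sample_id  INTEGER PRIMARY KEY,
    animal_id  INTEGER NOT NULL REFERENCES animal(animal_id),
    kind       TEXT NOT NULL         -- e.g. blood, tissue
);
CREATE TABLE result (
    result_id  INTEGER PRIMARY KEY,
    sample_id  INTEGER NOT NULL REFERENCES sample(sample_id),
    assay      TEXT NOT NULL,
    value      REAL
);
""")
con.execute("INSERT INTO animal VALUES (1, 'ENU-001', '2000-07-01')")
con.execute("INSERT INTO sample VALUES (10, 1, 'blood')")
con.execute("INSERT INTO result VALUES (100, 10, 'glucose', 5.6)")

# A phenotyping result traced back to its animal via the sample.
row = con.execute("""
    SELECT a.line, r.assay, r.value
    FROM result r JOIN sample s USING (sample_id)
                  JOIN animal a USING (animal_id)
""").fetchone()
```

The foreign-key chain result -> sample -> animal is what makes sample tracking and result documentation auditable across facilities, which is the property the abstract emphasizes.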

  15. Managing Large Scale Project Analysis Teams through a Web Accessible Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.

    2008-01-01

    Large scale space programs analyze thousands of requirements while mitigating safety, performance, schedule, and cost risks. These efforts involve a variety of roles with interdependent use cases and goals. For example, study managers and facilitators identify ground rules and assumptions for a collection of studies required for a program or project milestone. Task leaders derive product requirements from the ground rules and assumptions and describe activities to produce needed analytical products. Discipline specialists produce the specified products and load results into a file management system. Organizational and project managers provide the personnel and funds to conduct the tasks. Each role has responsibilities to establish information linkages and provide status reports to management. Projects conduct design and analysis cycles to refine designs to meet the requirements and implement risk mitigation plans. At the program level, integrated design and analysis cycle studies are conducted to eliminate every 'to-be-determined' and develop plans to mitigate every risk. At the agency level, strategic studies analyze different approaches to exploration architectures and campaigns. This paper describes a web-accessible database developed by NASA to coordinate and manage tasks at three organizational levels. Other topics in this paper cover integration technologies and techniques for process modeling and enterprise architectures.

  16. The Challenges Of Investigating And Remediating Port Hope's Small-Scale Urban Properties - 13115

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veen, Walter van; Case, Glenn; Benson, John

    2013-07-01

    An important component of the Port Hope Project, the larger of the two projects comprising the Port Hope Area Initiative (PHAI), is the investigation of all 4,800 properties in the Municipality of Port Hope for low-level radioactive waste (LLRW) and the remediation of approximately 10% of these. Although the majority of the individual properties are not expected to involve technically sophisticated remediation programs, the large number of property owners and individually unique properties are expected to present significant logistic challenges that will require a high degree of planning, organization and communication. The protocol and lessons learned described will be of interest to those considering similar programs. Information presented herein is part of a series of papers presented by the PHAI Management Office (PHAI MO) at WM Symposium '13 describing the history of the Port Hope Project and current project status. Other papers prepared for WM Symposium '13 address the large-scale site cleanup and the construction of the long-term waste management facility (LTWMF) where all of the LLRW will be consolidated and managed within an engineered, above-ground mound. (authors)

  17. eScience for molecular-scale simulations and the eMinerals project.

    PubMed

    Salje, E K H; Artacho, E; Austen, K F; Bruin, R P; Calleja, M; Chappell, H F; Chiang, G-T; Dove, M T; Frame, I; Goodwin, A L; Kleese van Dam, K; Marmier, A; Parker, S C; Pruneda, J M; Todorov, I T; Trachenko, K; Tyer, R P; Walker, A M; White, T O H

    2009-03-13

    We review the work carried out within the eMinerals project to develop eScience solutions that facilitate a new generation of molecular-scale simulation work. Technological developments include integration of compute and data systems, development of collaborative frameworks, and new researcher-friendly tools for grid job submission, XML data representation, information delivery, metadata harvesting and metadata management. A number of diverse science applications illustrate how these tools are being used for large parameter-sweep studies, an emerging type of study for which the integration of computing, data and collaboration is essential.

  18. Developing Quality Improvement capacity and capability across the Children in Fife partnership.

    PubMed

    Morris, Craig; Alexander, Ingrid

    2016-01-01

    A Project Manager from the Fife Early Years Collaborative facilitated a large-scale Quality Improvement (QI) project to build organisational capacity and capability across the Children in Fife partnership through three separate, eight-month training cohorts. Over 18 months, the project enabled 32 practitioners to increase their skills, knowledge and experience with a variety of QI tools, including the Model for Improvement, which in turn supported the delivery of high-quality improvement projects and improved outcomes for children and families, essentially growing practitioners' confidence and capability to deliver sustainable QI. Twenty-seven improvement projects were delivered, some leading to service redesign, reduced waiting times, increased uptake of health entitlements and improved accessibility to front-line health services. Thirteen improvement projects spread or scaled beyond their initial site, and informal QI mentoring took place with peers in the respective agencies. Multiple PDSA cycles were conducted to test the most efficient and effective support mechanisms during and after training, maintaining regular contact and using social media to share progress and achievements.

  19. Large-scale gene discovery in the pea aphid Acyrthosiphon pisum (Hemiptera)

    PubMed Central

    Sabater-Muñoz, Beatriz; Legeai, Fabrice; Rispe, Claude; Bonhomme, Joël; Dearden, Peter; Dossat, Carole; Duclert, Aymeric; Gauthier, Jean-Pierre; Ducray, Danièle Giblot; Hunter, Wayne; Dang, Phat; Kambhampati, Srini; Martinez-Torres, David; Cortes, Teresa; Moya, Andrès; Nakabachi, Atsushi; Philippe, Cathy; Prunier-Leterme, Nathalie; Rahbé, Yvan; Simon, Jean-Christophe; Stern, David L; Wincker, Patrick; Tagu, Denis

    2006-01-01

    Aphids are the leading pests in agricultural crops. A large-scale sequencing of 40,904 ESTs from the pea aphid Acyrthosiphon pisum was carried out to define a catalog of 12,082 unique transcripts. A strong AT bias was found, indicating a compositional shift between Drosophila melanogaster and A. pisum. An in silico profiling analysis characterized 135 transcripts specific to pea-aphid tissues (relating to bacteriocytes and parthenogenetic embryos). This project is the first to address the genetics of the Hemiptera and of a hemimetabolous insect. PMID:16542494

  20. Regional Climate Change across the Continental U.S. Projected from Downscaling IPCC AR5 Simulations

    NASA Astrophysics Data System (ADS)

    Otte, T. L.; Nolte, C. G.; Otte, M. J.; Pinder, R. W.; Faluvegi, G.; Shindell, D. T.

    2011-12-01

    Projecting climate change scenarios to local scales is important for understanding and mitigating the effects of climate change on society and the environment. Many of the general circulation models (GCMs) that are participating in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) do not fully resolve regional-scale processes and therefore cannot capture local changes in temperature and precipitation extremes. We seek to project the GCM's large-scale climate change signal to the local scale using a regional climate model (RCM) by applying dynamical downscaling techniques. The RCM will be used to better understand the local changes of temperature and precipitation extremes that may result from a changing climate. Preliminary results from downscaling NASA/GISS ModelE simulations of the IPCC AR5 Representative Concentration Pathway (RCP) scenario 6.0 will be shown. The Weather Research and Forecasting (WRF) model will be used as the RCM to downscale decadal time slices for ca. 2000 and ca. 2030 and illustrate potential changes in regional climate for the continental U.S. that are projected by ModelE and WRF under RCP6.0.

  1. Climate change impact on streamflow in large-scale river basins: projections and their uncertainties sourced from GCMs and RCP scenarios

    NASA Astrophysics Data System (ADS)

    Nasonova, Olga N.; Gusev, Yeugeniy M.; Kovalev, Evgeny E.; Ayzel, Georgy V.

    2018-06-01

    Climate change impact on river runoff was investigated within the framework of the second phase of the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP2) using a physically based land surface model, Soil Water - Atmosphere - Plants (SWAP) (developed at the Institute of Water Problems of the Russian Academy of Sciences), and meteorological projections (for 2006-2099) simulated by five General Circulation Models (GCMs) (GFDL-ESM2M, HadGEM2-ES, IPSL-CM5A-LR, MIROC-ESM-CHEM, and NorESM1-M) for each of four Representative Concentration Pathway (RCP) scenarios (RCP2.6, RCP4.5, RCP6.0, and RCP8.5). Eleven large-scale river basins were used in this study. First, SWAP was calibrated and validated against monthly values of measured river runoff, making use of forcing data from the WATCH data set, and all GCM projections were bias-corrected to the WATCH data. Then, for each basin, 20 projections of possible changes in river runoff during the 21st century were simulated by SWAP. Analysis of the obtained hydrological projections allowed us to estimate the uncertainties resulting from the application of different GCMs and RCP scenarios. On average, the contribution of different GCMs to the uncertainty of the projected river runoff is nearly twice as large as the contribution of RCP scenarios. At the same time, the contribution of GCMs slightly decreases with time.
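
Bias correction of the kind mentioned here (GCM output adjusted toward the WATCH reference) is often done by simple distribution matching. The sketch below uses mean-and-variance scaling over a common historical period; this is one standard, simple choice, not the specific trend-preserving method used in ISI-MIP2.

```python
import numpy as np

# Mean-and-variance scaling: shift and rescale the simulated series so
# that, over a shared reference period, its mean and standard deviation
# match the observations.
def bias_correct(sim_ref, obs_ref, sim_future):
    scale = np.std(obs_ref) / np.std(sim_ref)
    return (sim_future - np.mean(sim_ref)) * scale + np.mean(obs_ref)

# Synthetic demonstration data: a "WATCH-like" reference series and a
# GCM series with a warm bias and inflated variability.
rng = np.random.default_rng(42)
obs_ref = 10.0 + 2.0 * rng.standard_normal(1000)
sim_ref = 12.0 + 3.0 * rng.standard_normal(1000)

# Correcting the reference-period simulation reproduces the observed
# mean and spread exactly (up to floating-point error).
corrected = bias_correct(sim_ref, obs_ref, sim_ref)
```

Applied to future projections (`sim_future` beyond the reference period), the same transform preserves the model's projected change signal while removing its climatological offset.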

  2. Large Scale Analyses and Visualization of Adaptive Amino Acid Changes Projects.

    PubMed

    Vázquez, Noé; Vieira, Cristina P; Amorim, Bárbara S R; Torres, André; López-Fernández, Hugo; Fdez-Riverola, Florentino; Sousa, José L R; Reboiro-Jato, Miguel; Vieira, Jorge

    2018-03-01

    When changes at few amino acid sites are the target of selection, adaptive amino acid changes in protein sequences can be identified using maximum-likelihood methods based on models of codon substitution (such as codeml). Although such methods have been employed numerous times using a variety of different organisms, the time needed to collect the data and prepare the input files means that tens or hundreds of coding regions are usually analyzed. Nevertheless, the recent availability of flexible and easy to use computer applications that collect relevant data (such as BDBM) and infer positively selected amino acid sites (such as ADOPS), means that the entire process is easier and quicker than before. However, the lack of a batch option in ADOPS, here reported, still precludes the analysis of hundreds or thousands of sequence files. Given the interest and possibility of running such large-scale projects, we have also developed a database where ADOPS projects can be stored. Therefore, this study also presents the B+ database, which is both a data repository and a convenient interface that looks at the information contained in ADOPS projects without the need to download and unzip the corresponding ADOPS project file. The ADOPS projects available at B+ can also be downloaded, unzipped, and opened using the ADOPS graphical interface. The availability of such a database ensures results repeatability, promotes data reuse with significant savings on the time needed for preparing datasets, and effortlessly allows further exploration of the data contained in ADOPS projects.

  3. Neuroscience thinks big (and collaboratively).

    PubMed

    Kandel, Eric R; Markram, Henry; Matthews, Paul M; Yuste, Rafael; Koch, Christof

    2013-09-01

    Despite cash-strapped times for research, several ambitious collaborative neuroscience projects have attracted large amounts of funding and media attention. In Europe, the Human Brain Project aims to develop a large-scale computer simulation of the brain, whereas in the United States, the Brain Activity Map is working towards establishing a functional connectome of the entire brain, and the Allen Institute for Brain Science has embarked upon a 10-year project to understand the mouse visual cortex (the MindScope project). US President Barack Obama's announcement of the BRAIN Initiative (Brain Research through Advancing Innovative Neurotechnologies Initiative) in April 2013 highlights the political commitment to neuroscience and is expected to further foster interdisciplinary collaborations, accelerate the development of new technologies and thus fuel much needed medical advances. In this Viewpoint article, five prominent neuroscientists explain the aims of the projects and how they are addressing some of the questions (and criticisms) that have arisen.

  4. Geospatial optimization of siting large-scale solar projects

    USGS Publications Warehouse

    Macknick, Jordan; Quinby, Ted; Caulfield, Emmet; Gerritsen, Margot; Diffendorfer, James E.; Haines, Seth S.

    2014-01-01

    guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.

  5. Governance of extended lifecycle in large-scale eHealth initiatives: analyzing variability of enterprise architecture elements.

    PubMed

    Mykkänen, Juha; Virkanen, Hannu; Tuomainen, Mika

    2013-01-01

    The governance of large eHealth initiatives requires traceability of many requirements and design decisions. We provide a model which we use to conceptually analyze variability of several enterprise architecture (EA) elements throughout the extended lifecycle of development goals using interrelated projects related to the national ePrescription in Finland.

  6. Multiresource inventories incorporating GIS, GPS, and database management systems

    Treesearch

    Loukas G. Arvanitis; Balaji Ramachandran; Daniel P. Brackett; Hesham Abd-El Rasol; Xuesong Du

    2000-01-01

    Large-scale natural resource inventories generate enormous data sets. Their effective handling requires a sophisticated database management system. Such a system must be robust enough to efficiently store large amounts of data and flexible enough to allow users to manipulate a wide variety of information. In a pilot project, related to a multiresource inventory of the...

  7. The Renewed Primary School in Belgium: Analysis of the Local Innovation Policy.

    ERIC Educational Resources Information Center

    Vandenberghe, Roland

    The Renewed Primary School project in Belgium is analyzed in this paper in terms of organizational response to a large-scale innovation, which is characterized by its multidimensionality, by the large number of participating schools, and by a complex support structure. Section 2 of the report presents an elaborated description of these…

  8. Learning binary code via PCA of angle projection for image retrieval

    NASA Astrophysics Data System (ADS)

    Yang, Fumeng; Ye, Zhiqiang; Wei, Xueqi; Wu, Congzhong

    2018-01-01

    With the benefits of low storage costs and high query speeds, binary code representation methods are widely researched for efficiently retrieving large-scale data. In image hashing methods, learning a hashing function that embeds high-dimensional features into Hamming space is a key step for accurate retrieval. The principal component analysis (PCA) technique is widely used in compact hashing methods: most of these methods adopt PCA projection functions to project the original data onto several dimensions of real values, and each projected dimension is then quantized into one bit by thresholding. The variances of the projected dimensions differ, and the real-valued projection introduces substantial quantization error. To avoid the large quantization error of real-valued projection, in this paper we propose using a cosine-similarity (angle) projection for each dimension; the angle projection preserves the original structure and yields a more compact representation. We combined our method with the ITQ hashing algorithm, and extensive experiments on the public CIFAR-10 and Caltech-256 datasets validate the effectiveness of the proposed method.
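    The PCA-hashing baseline that the abstract builds on (real-valued PCA projections quantized to one bit each by sign thresholding) can be sketched as follows. This is the baseline scheme only, not the authors' angle-projection method or their ITQ variant; the function name and toy data are illustrative assumptions.

    ```python
    import numpy as np

    def pca_hash(X, n_bits):
        """Baseline PCA hashing: project mean-centered data onto the top
        principal components, then quantize each projected dimension to
        one bit by sign thresholding."""
        Xc = X - X.mean(axis=0)
        # Principal directions from the SVD of the centered data matrix.
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        projected = Xc @ Vt[:n_bits].T           # real-valued projections
        return (projected > 0).astype(np.uint8)  # one bit per dimension

    # Toy usage: six 4-D feature vectors reduced to 2-bit binary codes.
    X = np.array([[1.0, 0.0, 0.2, 0.1],
                  [0.9, 0.1, 0.1, 0.0],
                  [0.0, 1.0, 0.0, 0.3],
                  [0.1, 0.9, 0.2, 0.2],
                  [1.0, 1.0, 0.1, 0.1],
                  [0.0, 0.0, 0.0, 0.0]])
    codes = pca_hash(X, n_bits=2)
    print(codes.shape)  # (6, 2)
    ```

    The quantization error the abstract refers to is exactly the information discarded at the `projected > 0` step; the paper's angle projection targets that loss.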

  9. Analysis of Decision Making Skills for Large Scale Disaster Response

    DTIC Science & Technology

    2015-08-21

    Capability to influence and collaborate Compassion Teamwork Communication Leadership Provide vision of outcome / set priorities Confidence, courage to make...project evaluates the viability of expanding the use of serious games to augment classroom training, tabletop and full-scale exercises, and actual...training, evaluation, analysis, and technology exploration. Those techniques have found successful niches, but their wider applicability faces

  10. Combined climate and carbon-cycle effects of large-scale deforestation

    PubMed Central

    Bala, G.; Caldeira, K.; Wickett, M.; Phillips, T. J.; Lobell, D. B.; Delire, C.; Mirin, A.

    2007-01-01

    The prevention of deforestation and promotion of afforestation have often been cited as strategies to slow global warming. Deforestation releases CO2 to the atmosphere, which exerts a warming influence on Earth's climate. However, biophysical effects of deforestation, which include changes in land surface albedo, evapotranspiration, and cloud cover also affect climate. Here we present results from several large-scale deforestation experiments performed with a three-dimensional coupled global carbon-cycle and climate model. These simulations were performed by using a fully three-dimensional model representing physical and biogeochemical interactions among land, atmosphere, and ocean. We find that global-scale deforestation has a net cooling influence on Earth's climate, because the warming carbon-cycle effects of deforestation are overwhelmed by the net cooling associated with changes in albedo and evapotranspiration. Latitude-specific deforestation experiments indicate that afforestation projects in the tropics would be clearly beneficial in mitigating global-scale warming, but would be counterproductive if implemented at high latitudes and would offer only marginal benefits in temperate regions. Although these results question the efficacy of mid- and high-latitude afforestation projects for climate mitigation, forests remain environmentally valuable resources for many reasons unrelated to climate. PMID:17420463

  11. Combined climate and carbon-cycle effects of large-scale deforestation.

    PubMed

    Bala, G; Caldeira, K; Wickett, M; Phillips, T J; Lobell, D B; Delire, C; Mirin, A

    2007-04-17

    The prevention of deforestation and promotion of afforestation have often been cited as strategies to slow global warming. Deforestation releases CO(2) to the atmosphere, which exerts a warming influence on Earth's climate. However, biophysical effects of deforestation, which include changes in land surface albedo, evapotranspiration, and cloud cover also affect climate. Here we present results from several large-scale deforestation experiments performed with a three-dimensional coupled global carbon-cycle and climate model. These simulations were performed by using a fully three-dimensional model representing physical and biogeochemical interactions among land, atmosphere, and ocean. We find that global-scale deforestation has a net cooling influence on Earth's climate, because the warming carbon-cycle effects of deforestation are overwhelmed by the net cooling associated with changes in albedo and evapotranspiration. Latitude-specific deforestation experiments indicate that afforestation projects in the tropics would be clearly beneficial in mitigating global-scale warming, but would be counterproductive if implemented at high latitudes and would offer only marginal benefits in temperate regions. Although these results question the efficacy of mid- and high-latitude afforestation projects for climate mitigation, forests remain environmentally valuable resources for many reasons unrelated to climate.

  12. Auroral zone electric fields from DE 1 and 2 at magnetic conjunctions

    NASA Technical Reports Server (NTRS)

    Weimer, D. R.; Goertz, C. K.; Gurnett, D. A.; Maynard, N. C.; Burch, J. L.

    1985-01-01

    Nearly simultaneous measurements of auroral zone electric fields are obtained by the Dynamics Explorer spacecraft at altitudes below 900 km and above 4,500 km during magnetic conjunctions. The measured electric fields are usually perpendicular to the magnetic field lines. The north-south meridional electric fields are projected to a common altitude by a mapping function which accounts for the convergence of the magnetic field lines. When plotted as a function of invariant latitude, graphs of the projected electric fields measured by both DE-1 and DE-2 show that the large-scale electric field is the same at both altitudes, as expected. Superimposed on the large-scale fields, however, are small-scale features with wavelengths less than 100 km which are larger in magnitude at the higher altitude. Fourier transforms of the electric fields show that the magnitudes depend on wavelength. Outside of the auroral zone the electric field spectra are nearly identical. Within the auroral zone, however, the ratio of the high- to low-altitude electric fields increases with the reciprocal of the wavelength. The small-scale electric field variations are associated with field-aligned currents. These currents are measured with both a plasma instrument and a magnetometer on DE-1.

  13. Combined Climate and Carbon-Cycle Effects of Large-Scale Deforestation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bala, G; Caldeira, K; Wickett, M

    2006-10-17

    The prevention of deforestation and promotion of afforestation have often been cited as strategies to slow global warming. Deforestation releases CO2 to the atmosphere, which exerts a warming influence on Earth's climate. However, biophysical effects of deforestation, which include changes in land surface albedo, evapotranspiration, and cloud cover also affect climate. Here we present results from several large-scale deforestation experiments performed with a three-dimensional coupled global carbon-cycle and climate model. These are the first such simulations performed using a fully three-dimensional model representing physical and biogeochemical interactions among land, atmosphere, and ocean. We find that global-scale deforestation has a net cooling influence on Earth's climate, since the warming carbon-cycle effects of deforestation are overwhelmed by the net cooling associated with changes in albedo and evapotranspiration. Latitude-specific deforestation experiments indicate that afforestation projects in the tropics would be clearly beneficial in mitigating global-scale warming, but would be counterproductive if implemented at high latitudes and would offer only marginal benefits in temperate regions. While these results question the efficacy of mid- and high-latitude afforestation projects for climate mitigation, forests remain environmentally valuable resources for many reasons unrelated to climate.

  14. The role of ethics in data governance of large neuro-ICT projects.

    PubMed

    Stahl, Bernd Carsten; Rainey, Stephen; Harris, Emma; Fothergill, B Tyr

    2018-05-14

    We describe current practices of ethics-related data governance in large neuro-ICT projects, identify gaps in current practice, and put forward recommendations on how to collaborate ethically in complex regulatory and normative contexts. We undertake a survey of published principles of data governance of large neuro-ICT projects. This grounds an approach to a normative analysis of current data governance approaches. Several ethical issues are well covered in the data governance policies of neuro-ICT projects, notably data protection and attribution of work. Projects use a set of similar policies to ensure users behave appropriately. However, many ethical issues are not covered at all. Implementation and enforcement of policies remain vague. The data governance policies we investigated indicate that the neuro-ICT research community is currently close-knit and that shared assumptions are reflected in infrastructural aspects. This explains why many ethical issues are not explicitly included in data governance policies at present. With neuro-ICT research growing in scale, scope, and international involvement, these shared assumptions should be made explicit and reflected in data governance.

  15. Multi-model projections of Indian summer monsoon climate changes under A1B scenario

    NASA Astrophysics Data System (ADS)

    Niu, X.; Wang, S.; Tang, J.

    2016-12-01

    As part of the Regional Climate Model Intercomparison Project for Asia, projections of Indian summer monsoon climate change are constructed using three global climate models (GCMs) and seven regional climate models (RCMs) for 2041-2060 under the Intergovernmental Panel on Climate Change A1B emission scenario. For the control climate of 1981-2000, most nested RCMs outperform the driving GCM, the European Centre/Hamburg Fifth Generation model (ECHAM5), in reproducing the temporal and spatial distributions of temperature and precipitation over the Indian Peninsula. Following the driving ECHAM5, most nested RCMs produce an advanced monsoon onset in the control climate. For the future climate, widespread summer warming is projected over the Indian Peninsula by all climate models, with a multi-RCM ensemble mean (MME) temperature increase of 1°C to 2.5°C and the maximum warming centered in the northern Indian Peninsula. For precipitation, a large inter-model spread is projected by the RCMs, with a wetter condition in the MME projections and a significant increase over southern India. Driven by the same GCM, most RCMs project an advanced monsoon onset, while a delayed onset is found in two Regional Climate Model (RegCM3) projections, indicating that uncertainty can be expected in the Indian summer monsoon onset. All climate models except the equal-resolution Conformal-Cubic Atmospheric Model (referred to as CCAMP) and two RegCM3 models project a stronger summer monsoon during 2041-2060. The disagreement in precipitation projections by the RCMs indicates that surface climate change on the regional scale is not only dominated by the large-scale forcing provided by the driving GCM but is also sensitive to the RCMs' internal physics.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doug Cathro

    The Lake Charles CCS Project is a large-scale industrial carbon capture and sequestration (CCS) project which will demonstrate advanced technologies that capture and sequester carbon dioxide (CO2) emissions from industrial sources into underground formations. Specifically, the Lake Charles CCS Project will accelerate commercialization of large-scale CO2 storage from industrial sources by leveraging synergy between a proposed petroleum coke to chemicals plant (the LCC Gasification Project) and the largest integrated anthropogenic CO2 capture, transport, and monitored sequestration program in the U.S. Gulf Coast Region. The Lake Charles CCS Project will promote the expansion of EOR in Texas and Louisiana and supply greater energy security by expanding domestic energy supplies. The capture, compression, pipeline, injection, and monitoring infrastructure will continue to sequester CO2 for many years after the completion of the term of the DOE agreement. The objectives of this project are expected to be fulfilled by working through two distinct phases. The overall objective of Phase 1 was to develop a fully definitive project basis for a competitive Renewal Application process to proceed into Phase 2 - Design, Construction and Operations. Phase 1 includes the studies attached hereto that will establish: the engineering design basis for the capture, compression and transportation of CO2 from the LCC Gasification Project, and the criteria and specifications for a monitoring, verification and accounting (MVA) plan at the Hastings oil field in Texas.
The overall objective of Phase 2, provided a successful competitive down-selection, is to execute design, construction and operations of three capital projects: (1) the CO2 capture and compression equipment, (2) a Connector Pipeline from the LCC Gasification Project to the Green Pipeline owned by Denbury and an affiliate of Denbury, and (3) a comprehensive MVA system at the Hastings oil field.

  17. The Joint Improvised Explosive Device Defeat Organization: DOD’s Fight Against IEDs Today and Tomorrow

    DTIC Science & Technology

    2008-11-01

    In 2004, senior military commanders called for a “Manhattan Project-like” effort against IEDs, and the Department of Defense (DOD) later...reference to the Manhattan Project by U.S. Central Command leaders was meant to convey the need for a large-scale, focused effort, combining the nation’s...of a highway in southern Iraq. USA Photo/Master Sergeant Lek Mateo. 15 JIEDDO TODAY We’ve got to have something like the Manhattan Project. General

  18. Cluster galaxy dynamics and the effects of large-scale environment

    NASA Astrophysics Data System (ADS)

    White, Martin; Cohn, J. D.; Smit, Renske

    2010-11-01

    Advances in observational capabilities have ushered in a new era of multi-wavelength, multi-physics probes of galaxy clusters and ambitious surveys are compiling large samples of cluster candidates selected in different ways. We use a high-resolution N-body simulation to study how the influence of large-scale structure in and around clusters causes correlated signals in different physical probes and discuss some implications this has for multi-physics probes of clusters (e.g. richness, lensing, Compton distortion and velocity dispersion). We pay particular attention to velocity dispersions, matching galaxies to subhaloes which are explicitly tracked in the simulation. We find that not only do haloes persist as subhaloes when they fall into a larger host, but groups of subhaloes retain their identity for long periods within larger host haloes. The highly anisotropic nature of infall into massive clusters, and their triaxiality, translates into an anisotropic velocity ellipsoid: line-of-sight galaxy velocity dispersions for any individual halo show large variance depending on viewing angle. The orientation of the velocity ellipsoid is correlated with the large-scale structure, and thus velocity outliers correlate with outliers caused by projection in other probes. We quantify this orientation uncertainty and give illustrative examples. Such a large variance suggests that velocity dispersion estimators will work better in an ensemble sense than for any individual cluster, which may inform strategies for obtaining redshifts of cluster members. We similarly find that the ability of substructure indicators to find kinematic substructures is highly viewing angle dependent. While groups of subhaloes which merge with a larger host halo can retain their identity for many Gyr, they are only sporadically picked up by substructure indicators. 
We discuss the effects of correlated scatter on scaling relations estimated through stacking, both analytically and in the simulations, showing that the strong correlation of measures with mass and the large scatter in mass at fixed observable mitigate line-of-sight projections.

  19. The Prostate, Lung, Colorectal, and Ovarian Cancer Screening Trial and Its Associated Research Resource

    PubMed Central

    2013-01-01

    The Prostate, Lung, Colorectal, and Ovarian (PLCO) Cancer Screening Trial is a large-scale research effort conducted by the National Cancer Institute. PLCO offers an example of coordinated research by both the extramural and intramural communities of the National Institutes of Health. The purpose of this article is to describe the PLCO research resource and how it is managed and to assess the productivity and the costs associated with this resource. Such an in-depth analysis of a single large-scale project can shed light on questions such as how large-scale projects should be managed, what metrics should be used to assess productivity, and how costs can be compared with productivity metrics. A comprehensive publication analysis identified 335 primary research publications resulting from research using PLCO data and biospecimens from 2000 to 2012. By the end of 2012, a total of 9679 citations (excluding self-citations) have resulted from this body of research publications, with an average of 29.7 citations per article, and an h index of 45, which is comparable with other large-scale studies, such as the Nurses’ Health Study. In terms of impact on public health, PLCO trial results have been used by the US Preventive Services Task Force in making recommendations concerning prostate and ovarian cancer screening. The overall cost of PLCO was $454 million over 20 years, adjusted to 2011 dollars, with approximately $37 million for the collection, processing, and storage of biospecimens, including blood samples, buccal cells, and pathology tissues. PMID:24115361

  20. Challenging Common Sense: Cases of School Reform for Learning Community under an International Cooperation Project in Bac Giang Province, Vietnam

    ERIC Educational Resources Information Center

    Saito, Eisuke; Tsukui, Atsushi

    2008-01-01

    This paper aims to discuss the challenges in the process of building a learning community in Vietnamese primary schools. Five lessons emerge from the cases. First, changing teachers' beliefs is time-consuming. Second, because of the reluctance of teachers to change, large-scale delivery of the educational project should be critically revisited…

  1. Training in Methods in Computational Neuroscience

    DTIC Science & Technology

    1992-08-29

    in Tritonia. Roger Traub Models with realistic neurons, with an emphasis on large-scale modeling of epileptic phenomena in hippocampus. Rodolpho...Cell Model Plan: 1) Convert some of my simulations from NEURON to GENESIS (and thus learn GENESIS). 2) Develop a realistic inhibitory model. 3) Further...General Hospital, MA Course Project: Membrane Properties of a Neostriatal Neuron and Dopamine Modulation The purpose of my project was to model the

  2. The Challenge of Implementing an ERP System in a Small and Medium Enterprise--A Teaching Case of ERP Project Management

    ERIC Educational Resources Information Center

    Xu, Hongjiang; Rondeau, Patrick J.; Mahenthiran, Sakthi

    2011-01-01

    Enterprise Resource Planning (ERP) system implementation projects are notoriously risky. While large-scale ERP cases continue to be developed, relatively few new ERP cases have been published that further ERP implementation education in small to medium size firms. This case details the implementation of a new ERP system in a medium sized…

  3. Description of a Compensatory College Education Program for the Disadvantaged and Its Associated Research and Evaluation Program.

    ERIC Educational Resources Information Center

    Spuck, Dennis W.; And Others

    This paper reports on a large-scale project of research and evaluation of a program for disadvantaged minority group students conducted by the Center for Educational Opportunity at the Claremont Colleges. The Program of Special Directed Studies for Transition to College (PSDS), a five-year experimental project, is aimed at providing a four-year,…

  4. Preschool Affects Longer Term Literacy and Numeracy: Results from a General Population Longitudinal Study in Northern Ireland

    ERIC Educational Resources Information Center

    Melhuish, Edward; Quinn, Louise; Sylva, Kathy; Sammons, Pam; Siraj-Blatchford, Iram; Taggart, Brenda

    2013-01-01

    The Effective Pre-school Provision in Northern Ireland (EPPNI) project is a longitudinal study of child development from 3 to 11 years. It is one of the first large-scale UK projects to investigate the effects of different kinds of preschool provision, and to relate experience in preschool to child development. In EPPNI, 683 children were randomly…

  5. The Maryland Large-Scale Integrated Neurocognitive Architecture

    DTIC Science & Technology

    2008-03-01

    Visual input enters the network through the lateral geniculate nucleus (LGN) and is passed forward through visual brain regions (V1, V2, and V4...University of Maryland Sponsored by Defense Advanced Research Projects Agency DARPA Order No. V029 APPROVED FOR PUBLIC RELEASE...interpreted as necessarily representing the official policies, either expressed or implied, of the Defense Advanced Research Projects Agency or the U.S

  6. Construction worker profile. Community report--Center, North Dakota

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chalmers, J.A.; Glazner, J.

    Center, North Dakota, is one of the currently affected communities included in the study to help us learn about the effects that large-scale construction projects have on small communities. The findings of the Project Survey, which was conducted at the Milton R. Young and Leland Olds power plants, along with the findings of the Household Survey and the Community Survey, are presented.

  7. The Santa Margarita River Arundo donax control project: development of methods and plant community response

    Treesearch

    Dawn M. Lawson; Jesse A. Giessow; Jason H. Giessow

    2005-01-01

    A large-scale effort to control the aggressively invasive exotic species Arundo donax in the Santa Margarita River watershed in California’s south coast ecoregion was initiated in 1997. The project was prompted by the need for Marine Corps Base Camp Pendleton to address impacts to habitat for federally-listed endangered species and wetlands regulated...

  8. Middle Atmosphere of the Southern Hemisphere (MASH) Global meteor observations system (GLOBMET) Solar Spectral Irradiance Measurements (SSIM) Global Observations and Studies of Stratospheric Aerosols (GOSSA): Progress with the MASH project

    NASA Technical Reports Server (NTRS)

    Oneill, A.

    1989-01-01

    The aim of the MASH project is to study the dynamics of the middle atmosphere in the Southern Hemisphere, emphasizing inter-hemispheric differences. Both observational data and data from simulations with numerical models are being used. It is intended that MASH will be complemented by parallel studies on the transport and photochemistry of trace species in the Southern Hemisphere. Impetus for such studies has come from the unexpected finding of a springtime ozone hole over Antarctica. A summary of recent progress with the MASH project is given. Data from polar-orbiting satellites are used to discuss the large-scale circulation found in the Southern Hemisphere at extratropical latitudes. Comparisons are made with that of the Northern Hemisphere. Particular attention is paid to the springtime final warming, the most spectacular large-scale phenomenon in the stratosphere of the Southern Hemisphere. The circulation before and after this event has to be taken into account in theories for the formation and subsequent disappearance of the ozone hole.

  9. Drought in the Horn of Africa: attribution of a damaging and repeating extreme event

    NASA Astrophysics Data System (ADS)

    Marthews, Toby; Otto, Friederike; Mitchell, Daniel; Dadson, Simon; Jones, Richard

    2015-04-01

    We have applied detection and attribution techniques to the severe drought that hit the Horn of Africa in 2014. The short rains failed in late 2013 in Kenya, South Sudan, Somalia, and southern Ethiopia, leading to a very dry growing season from January to March 2014 and subsequently to the current drought in many agricultural areas of the sub-region. We have made use of the weather@home project, which uses publicly volunteered distributed computing to provide an ensemble of simulations large enough to sample regional climate uncertainty. On this basis, we have estimated the occurrence rates of the kinds of rare and extreme events implicated in this large-scale drought. From land surface model runs based on these ensemble simulations, we have estimated the impacts of climate anomalies during this period, and therefore we can reliably identify some factors of the ongoing drought as attributable to human-induced climate change. The UNFCCC's Adaptation Fund is attempting to support projects that bring about adaptation to "the adverse effects of climate change", but in order to formulate such projects we need a much clearer way to assess how much climate change is human-induced and how much is a consequence of climate anomalies and large-scale teleconnections, an assessment that can only be provided by robust attribution techniques.

  10. Identification and measurement of shrub type vegetation on large scale aerial photography

    NASA Technical Reports Server (NTRS)

    Driscoll, R. S.

    1970-01-01

    Important range-shrub species were identified at acceptable levels of accuracy on large-scale 70 mm color and color infrared aerial photographs. Identification of individual shrubs was significantly higher, however, on color infrared. Photoscales smaller than 1:2400 had limited value except for mature individuals of relatively tall species, and then only if crown margins did not overlap and sharp contrast was evident between the species and background. Larger scale photos were required for low-growing species in dense stands. The crown cover for individual species was estimated from the aerial photos either with a measuring magnifier or a projected-scale micrometer. These crown cover measurements provide techniques for earth-resource analyses when used in conjunction with space and high-altitude remotely procured photos.

  11. A Primer on Infectious Disease Bacterial Genomics

    PubMed Central

    Petkau, Aaron; Knox, Natalie; Graham, Morag; Van Domselaar, Gary

    2016-01-01

    SUMMARY The number of large-scale genomics projects is increasing due to the availability of affordable high-throughput sequencing (HTS) technologies. The use of HTS for bacterial infectious disease research is attractive because one whole-genome sequencing (WGS) run can replace multiple assays for bacterial typing, molecular epidemiology investigations, and more in-depth pathogenomic studies. The computational resources and bioinformatics expertise required to accommodate and analyze the large amounts of data pose new challenges for researchers embarking on genomics projects for the first time. Here, we present a comprehensive overview of a bacterial genomics project from beginning to end, with a particular focus on the planning and computational requirements for HTS data, and provide a general understanding of the analytical concepts needed to develop a workflow that will meet the objectives and goals of an HTS project. PMID:28590251

  12. Large-scale, high-performance and cloud-enabled multi-model analytics experiments in the context of the Earth System Grid Federation

    NASA Astrophysics Data System (ADS)

    Fiore, S.; Płóciennik, M.; Doutriaux, C.; Blanquer, I.; Barbera, R.; Williams, D. N.; Anantharaj, V. G.; Evans, B. J. K.; Salomoni, D.; Aloisio, G.

    2017-12-01

    The increasing resolution of comprehensive Earth System Models is rapidly leading to very large climate simulation outputs that pose significant scientific data management challenges in terms of data sharing, processing, analysis, visualization, preservation, curation, and archiving. Large-scale global experiments for the Climate Model Intercomparison Projects (CMIP) have led to the development of the Earth System Grid Federation (ESGF), a federated data infrastructure which has been serving the CMIP5 experiment, providing access to 2 PB of data for the IPCC Assessment Reports. In such a context, running a multi-model data analysis experiment is very challenging, as it requires the availability of a large amount of data from multiple climate model simulations as well as scientific data management tools for large-scale data analytics. To address these challenges, a case study on climate model intercomparison data analysis has been defined and implemented in the context of the EU H2020 INDIGO-DataCloud project. The case study has been tested and validated on CMIP5 datasets in the context of a large-scale international testbed involving several ESGF sites (LLNL, ORNL, and CMCC), one orchestrator site (PSNC), and one more site hosting INDIGO PaaS services (UPV). Additional ESGF sites, such as NCI (Australia) and a couple more in Europe, are also joining the testbed. The added value of the proposed solution is summarized in the following: it implements a server-side paradigm which limits data movement; it relies on a High-Performance Data Analytics (HPDA) stack to address performance; it exploits the INDIGO PaaS layer to support flexible, dynamic, and automated deployment of software components; it provides user-friendly web access based on the INDIGO Future Gateway; and finally, it integrates, complements, and extends the support currently available through ESGF. Overall, it provides a new "tool" for climate scientists to run multi-model experiments.
At the time this contribution is being written, the proposed testbed represents the first implementation of a distributed large-scale, multi-model experiment in the ESGF/CMIP context, joining together server-side approaches for scientific data analysis, HPDA frameworks, end-to-end workflow management, and cloud computing.

  13. (New hosts and vectors for genome cloning)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The main goal of our project remains the development of new bacterial hosts and vectors for the stable propagation of human DNA clones in E. coli. During the past six months of our current budget period, we have (1) continued to develop new hosts that permit the stable maintenance of unstable features of human DNA, and (2) developed a series of vectors for (a) cloning large DNA inserts, (b) assessing the frequency of human sequences that are lethal to the growth of E. coli, and (c) assessing the stability of human sequences cloned in M13 for large-scale sequencing projects.

  14. [New hosts and vectors for genome cloning]. Progress report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The main goal of our project remains the development of new bacterial hosts and vectors for the stable propagation of human DNA clones in E. coli. During the past six months of our current budget period, we have (1) continued to develop new hosts that permit the stable maintenance of unstable features of human DNA, and (2) developed a series of vectors for (a) cloning large DNA inserts, (b) assessing the frequency of human sequences that are lethal to the growth of E. coli, and (c) assessing the stability of human sequences cloned in M13 for large-scale sequencing projects.

  15. Realtime monitoring of bridge scour using remote monitoring technology

    DOT National Transportation Integrated Search

    2011-02-01

    The research performed in this project focuses on the application of instruments, including accelerometers and tiltmeters, to monitor bridge scour. First, two large-scale laboratory experiments were performed. One experiment is the simulation of a ...

  16. Trenton Free-Fare Demonstration Project

    DOT National Transportation Integrated Search

    1978-12-01

    The "Trenton Free-Fare Demonstration" is the first large-scale test of free transit in the U.S. The New Jersey Department of Transportation, in cooperation with UMTA, Mercer County, and Mercer County Improvement Authority, is administering an Off-Pea...

  17. Iron-Air Rechargeable Battery: A Robust and Inexpensive Iron-Air Rechargeable Battery for Grid-Scale Energy Storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-10-01

    GRIDS Project: USC is developing an iron-air rechargeable battery for large-scale energy storage that could help integrate renewable energy sources into the electric grid. Iron-air batteries have the potential to store large amounts of energy at low cost—iron is inexpensive and abundant, while oxygen is freely obtained from the air we breathe. However, current iron-air battery technologies have suffered from low efficiency and short life spans. USC is working to dramatically increase the efficiency of the battery by placing chemical additives on the battery's iron-based electrode and restructuring the catalysts at the molecular level on the battery's air-based electrode. This can help the battery resist degradation and increase life span. The goal of the project is to develop a prototype iron-air battery at a cost significantly lower than today's best commercial batteries.

  18. Listening to the Deep: live monitoring of ocean noise and cetacean acoustic signals.

    PubMed

    André, M; van der Schaar, M; Zaugg, S; Houégnigan, L; Sánchez, A M; Castell, J V

    2011-01-01

    The development and broad use of passive acoustic monitoring techniques have the potential to help assess the large-scale influence of artificial noise on marine organisms and ecosystems. Deep-sea observatories have the potential to play a key role in understanding these recent acoustic changes. LIDO (Listening to the Deep Ocean Environment) is an international project that is allowing the real-time, long-term monitoring of marine ambient noise as well as marine mammal sounds at cabled and standalone observatories. Here, we present the overall development of the project and the use of passive acoustic monitoring (PAM) techniques to provide the scientific community with real-time data at large spatial and temporal scales. Special attention is given to the extraction and identification of high-frequency cetacean echolocation signals, given the relevance of detecting target species (e.g. beaked whales) in mitigation processes, e.g. during military exercises. Copyright © 2011. Published by Elsevier Ltd.

  19. Freedom and Responsibility in Synthetic Genomics: The Synthetic Yeast Project

    PubMed Central

    Sliva, Anna; Yang, Huanming; Boeke, Jef D.; Mathews, Debra J. H.

    2015-01-01

    First introduced in 2011, the Synthetic Yeast Genome (Sc2.0) Project is a large international synthetic genomics project that will culminate in the first eukaryotic cell (Saccharomyces cerevisiae) with a fully synthetic genome. With collaborators from across the globe and from a range of institutions spanning from do-it-yourself biology (DIYbio) to commercial enterprises, it is important that all scientists working on this project are cognizant of the ethical and policy issues associated with this field of research and operate under a common set of principles. In this commentary, we survey the current ethics and regulatory landscape of synthetic biology and present the Sc2.0 Statement of Ethics and Governance to which all members of the project adhere. This statement focuses on four aspects of the Sc2.0 Project: societal benefit, intellectual property, safety, and self-governance. We propose that such project-level agreements are an important, valuable, and flexible model of self-regulation for similar global, large-scale synthetic biology projects in order to maximize the benefits and minimize potential harms. PMID:26272997

  20. Freedom and Responsibility in Synthetic Genomics: The Synthetic Yeast Project.

    PubMed

    Sliva, Anna; Yang, Huanming; Boeke, Jef D; Mathews, Debra J H

    2015-08-01

    First introduced in 2011, the Synthetic Yeast Genome (Sc2.0) PROJECT is a large international synthetic genomics project that will culminate in the first eukaryotic cell (Saccharomyces cerevisiae) with a fully synthetic genome. With collaborators from across the globe and from a range of institutions spanning from do-it-yourself biology (DIYbio) to commercial enterprises, it is important that all scientists working on this project are cognizant of the ethical and policy issues associated with this field of research and operate under a common set of principles. In this commentary, we survey the current ethics and regulatory landscape of synthetic biology and present the Sc2.0 Statement of Ethics and Governance to which all members of the project adhere. This statement focuses on four aspects of the Sc2.0 PROJECT: societal benefit, intellectual property, safety, and self-governance. We propose that such project-level agreements are an important, valuable, and flexible model of self-regulation for similar global, large-scale synthetic biology projects in order to maximize the benefits and minimize potential harms. Copyright © 2015 by the Genetics Society of America.

  1. Business Architecture Development at Public Administration - Insights from Government EA Method Engineering Project in Finland

    NASA Astrophysics Data System (ADS)

    Valtonen, Katariina; Leppänen, Mauri

    Governments worldwide are concerned with the efficient production of services for customers. To improve the quality of services and to make service production more efficient, information and communication technology (ICT) is widely exploited in public administration (PA). Succeeding in this exploitation calls for large-scale planning which embraces issues from the strategic to the technological level. In this planning the notion of enterprise architecture (EA) is commonly applied. One of the sub-architectures of EA is business architecture (BA). BA planning is challenging in PA due to a large number of stakeholders, a wide set of customers, and the solid, hierarchical structures of organizations. To support EA planning in Finland, a project to engineer a government EA (GEA) method was launched. In this chapter, we analyze the discussions and outputs of the project workshops and reflect on the issues that emerged against the current e-government literature. We bring forth insights into and suggestions for government BA and its development.

  2. Lessons learned from post-wildfire monitoring and implications for land management and regional drinking water treatability in Southern Rockies of Alberta

    NASA Astrophysics Data System (ADS)

    Diiwu, J.; Silins, U.; Kevin, B.; Anderson, A.

    2008-12-01

    Like many areas of the Rocky Mountains, Alberta's forests on the eastern slopes of the Rockies have been shaped by decades of successful fire suppression. These forests are at high risk of fire and large-scale insect infestation, and climate change will continue to increase these risks. These headwaters forests provide the vast majority of usable surface water supplies to a large region of the province, and large-scale natural disasters can have dramatic effects on water quality and water availability. The population in the region has steadily increased, and this area is now the main water source for many Alberta municipalities, including the City of Calgary, which has a population of over one million. In 2003 a fire burned 21,000 ha in the southern foothills area. Government land managers were concerned about the downstream implications of the fire and salvage operations; however, there was very limited scientific information to guide decision making. This led to the establishment of the Southern Rockies Watershed Project, a partnership between Alberta Sustainable Resource Development (the provincial government department responsible for land management) and the University of Alberta. After five years of data collection, the project has produced quantitative information that was not previously available about the effects of fire and management interventions, such as salvage logging, on headwaters and regional water quality. This information can be used to make decisions on forest operations, fire suppression, and post-fire salvage operations. In the past few years this project has captured the interest of large municipalities and water treatment researchers who are keen to investigate the potential implications of large natural disturbances for large and small drinking water treatment facilities. Examples from this project will be used to highlight the challenges and successes encountered while bridging the gap between science and land management policy.

  3. Assessment of the technology required to develop photovoltaic power system for large scale national energy applications

    NASA Technical Reports Server (NTRS)

    Lutwack, R.

    1974-01-01

    A technical assessment of a program to develop photovoltaic power system technology for large-scale national energy applications was made by analyzing and judging the alternative candidate photovoltaic systems and development tasks. A program plan was constructed based on achieving the 10-year objective of establishing the practicability of large-scale terrestrial power installations using photovoltaic conversion arrays costing less than $0.50/peak W. Guidelines for the tasks of a 5-year program were derived from a set of 5-year objectives deduced from the 10-year objective. This report indicates the need for an early emphasis on the development of the single-crystal Si photovoltaic system for commercial utilization; a production goal of 5 × 10^8 peak W/year of $0.50 cells was projected for the year 1985. The development of other photovoltaic conversion systems was assigned to longer-range development roles. The status of the technology developments and the applicability of solar arrays in particular power installations, ranging from houses to central power plants, was scheduled to be verified in a series of demonstration projects. The budget recommended for the first 5-year phase of the program is $268.5M.

  4. Regional Climate Sensitivity- and Historical-Based Projections to 2100

    NASA Astrophysics Data System (ADS)

    Hébert, Raphaël; Lovejoy, Shaun

    2018-05-01

    Reliable climate projections at the regional scale are needed in order to evaluate climate change impacts and inform policy. We develop an alternative method for projections based on the transient climate sensitivity (TCS), which relies on a linear relationship between the forced temperature response and the strongly increasing anthropogenic forcing. The TCS is evaluated at the regional scale (5° by 5°), and projections are made accordingly to 2100 using the high and low Representative Concentration Pathway (RCP) emission scenarios. We find that there are large spatial discrepancies between the regional TCS from 5 historical data sets and 32 global climate model (GCM) historical runs, and furthermore that the global mean GCM TCS is about 15% too high. Given that the GCM RCP scenario runs are mostly linear with respect to their (inadequate) TCS, we conclude that historical methods of regional projection are better suited, since they are directly calibrated on the real-world (historical) climate.
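
    The TCS approach described above amounts to fitting a linear relationship between forced temperature response and forcing, then scaling a scenario forcing pathway by the fitted sensitivity. A minimal sketch of that idea follows; all numbers and function names are invented for illustration and do not reproduce the paper's data or its actual regression procedure.

```python
# Sketch of a TCS-style projection: fit T ≈ lambda * F on a historical
# period, then apply lambda to a scenario forcing pathway.
# Illustrative placeholder numbers only (W/m^2 and K).

def fit_tcs(forcing, temperature):
    """Least-squares slope through the origin for T = lambda * F."""
    num = sum(f * t for f, t in zip(forcing, temperature))
    den = sum(f * f for f in forcing)
    return num / den

def project(tcs, scenario_forcing):
    """Forced temperature response implied by a forcing pathway."""
    return [tcs * f for f in scenario_forcing]

hist_forcing = [0.5, 1.0, 1.5, 2.0]   # hypothetical historical forcing
hist_temp = [0.25, 0.52, 0.74, 1.01]  # hypothetical temperature anomaly

tcs = fit_tcs(hist_forcing, hist_temp)
scenario = project(tcs, [3.0, 4.5, 6.0, 8.5])  # hypothetical RCP-like pathway
```

    The through-origin fit reflects the abstract's premise that the forced response scales linearly with the strongly increasing anthropogenic forcing.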

  5. Assessment of village-wise groundwater draft for irrigation: a field-based study in hard-rock aquifers of central India

    NASA Astrophysics Data System (ADS)

    Ray, R. K.; Syed, T. H.; Saha, Dipankar; Sarkar, B. C.; Patre, A. K.

    2017-12-01

    Extracted groundwater, 90% of which is used for irrigated agriculture, is central to the socio-economic development of India. A lack of regulation, or of implementation of regulations, alongside unrecorded extraction, often leads to overexploitation of large-scale common-pool resources like groundwater. Inevitably, management of groundwater extraction (draft) for irrigation is critical for the sustainability of aquifers and society at large. However, existing assessments of groundwater draft, which are mostly available at large spatial scales, are inadequate for managing groundwater resources that are primarily exploited by stakeholders at much finer scales. This study presents an estimate, projection and analysis of fine-scale groundwater draft in the Seonath-Kharun interfluve of central India. Using field surveys of instantaneous discharge from irrigation wells and boreholes, annual groundwater draft for irrigation in this area is estimated to be 212 × 10^6 m^3, most of which (89%) is withdrawn during the non-monsoon season. However, the density of wells/boreholes, and the consequent extraction of groundwater, is controlled by the existing hydrogeological conditions. Based on trends in the number of abstraction structures (1982-2011), groundwater draft for the year 2020 is projected to be approximately 307 × 10^6 m^3; hence, groundwater draft for irrigation in the study area is predicted to increase by ~44% within a span of 8 years. Central to the work presented here is the approach for estimating and predicting groundwater draft at finer scales, which can be extended to critical groundwater zones of the country.
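
    The 2020 projection described above rests on extrapolating the 1982-2011 trend in the number of abstraction structures. A minimal sketch of such a trend extrapolation, using invented placeholder counts rather than the study's data:

```python
# Fit a least-squares linear trend to well counts over time and
# extrapolate to a target year. Counts below are hypothetical.

def linear_trend(years, counts):
    """Return (slope, intercept) of the least-squares line."""
    n = len(years)
    ybar = sum(years) / n
    cbar = sum(counts) / n
    num = sum((y - ybar) * (c - cbar) for y, c in zip(years, counts))
    den = sum((y - ybar) ** 2 for y in years)
    slope = num / den
    return slope, cbar - slope * ybar

years = [1982, 1992, 2002, 2011]   # hypothetical survey years
wells = [400, 700, 1050, 1400]     # hypothetical structure counts

slope, intercept = linear_trend(years, wells)
wells_2020 = slope * 2020 + intercept  # extrapolated count
```

    In the study itself the trend in structures is then converted to draft via per-well discharge, a step omitted here.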

  6. swot: Super W Of Theta

    NASA Astrophysics Data System (ADS)

    Coupon, Jean; Leauthaud, Alexie; Kilbinger, Martin; Medezinski, Elinor

    2017-07-01

    SWOT (Super W Of Theta) computes two-point statistics for very large data sets, based on “divide and conquer” algorithms: mainly, but not limited to, data storage in binary trees, approximation at large scale, parallelization (Open MPI), and bootstrap and jackknife resampling methods “on the fly”. It currently supports projected and 3D galaxy auto- and cross-correlations, galaxy-galaxy lensing, and weighted histograms.
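
    The delete-one jackknife used by two-point codes like SWOT can be sketched as follows. The statistic here is a plain mean over sub-region pair counts, chosen purely to illustrate the resampling scheme; nothing below reflects SWOT's actual internals.

```python
# Delete-one jackknife: recompute the statistic with each sub-sample
# (e.g. spatial sub-region) removed in turn, then combine the spread
# into an error estimate.

def jackknife(samples, statistic):
    n = len(samples)
    # Partial estimates, each leaving one sub-sample out.
    partials = [statistic(samples[:i] + samples[i + 1:]) for i in range(n)]
    mean = sum(partials) / n
    # Jackknife variance: (n - 1)/n times the summed squared deviations.
    var = (n - 1) / n * sum((p - mean) ** 2 for p in partials)
    return mean, var ** 0.5

# Hypothetical normalized pair counts per sub-region:
counts = [12.0, 9.5, 11.0, 10.5, 9.0]
estimate, error = jackknife(counts, lambda xs: sum(xs) / len(xs))
```

    Computing the partials "on the fly", as SWOT advertises, avoids storing all resampled catalogs at once.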

  7. Multiresolution comparison of precipitation datasets for large-scale models

    NASA Astrophysics Data System (ADS)

    Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.

    2014-12-01

    Gridded precipitation datasets are crucial for driving large-scale models used in weather forecasting and climate research. However, the quality of precipitation products is usually validated individually. Comparisons between gridded precipitation products, along with ground observations, provide another avenue for investigating how precipitation uncertainty would affect the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North American gridded products, including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Center for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin-plate spline smoothing algorithm (ANUSPLIN) and the Canadian Precipitation Analysis (CaPA). Based on verification criteria for various temporal and spatial scales, the results provide an assessment of possible applications for the various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH and ANUSPLIN have different comparative advantages in terms of resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA provides appealing spatial coherence. In addition to the product comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.

  8. The Snowmastodon Project: cutting-edge science on the blade of a bulldozer

    USGS Publications Warehouse

    Pigati, Jeffery S.; Miller, Ian M.; Johnson, Kirk R.

    2015-01-01

    Cutting-edge science happens at a variety of scales, from the individual and intimate to the large-scale and collaborative. The publication of a special issue of Quaternary Research in Nov. 2014 dedicated to the scientific findings of the “Snowmastodon Project” highlights what can be done when natural history museums, governmental agencies, and academic institutions work toward a common goal.

  9. Mapping moderate-scale land-cover over very large geographic areas within a collaborative framework: A case study of the Southwest Regional Gap Analysis Project (SWReGAP)

    USGS Publications Warehouse

    Lowry, J.; Ramsey, R.D.; Thomas, K.; Schrupp, D.; Sajwaj, T.; Kirby, J.; Waller, E.; Schrader, S.; Falzarano, S.; Langs, L.; Manis, G.; Wallace, C.; Schulz, K.; Comer, P.; Pohs, K.; Rieth, W.; Velasquez, C.; Wolk, B.; Kepner, W.; Boykin, K.; O'Brien, L.; Bradford, D.; Thompson, B.; Prior-Magee, J.

    2007-01-01

    Land-cover mapping efforts within the USGS Gap Analysis Program have traditionally been state-centered; each state had the responsibility of implementing a project design for the geographic area within its boundaries. The Southwest Regional Gap Analysis Project (SWReGAP) was the first formal GAP project designed at a regional, multi-state scale. The project area comprises the southwestern states of Arizona, Colorado, Nevada, New Mexico, and Utah. The land-cover map/dataset was generated using regionally consistent geospatial data (Landsat ETM+ imagery (1999-2001) and DEM derivatives), similar field data collection protocols, a standardized land-cover legend, and a common modeling approach (decision tree classifier). Partitioning of mapping responsibilities amongst the five collaborating states was organized around ecoregion-based "mapping zones". Over the course of two and a half field seasons, approximately 93,000 reference samples were collected directly, or obtained from other contemporary projects, for the land-cover modeling effort. The final map was made public in 2004 and contains 125 land-cover classes. An internal validation of 85 of the classes, representing 91% of the land area, was performed. Agreement between withheld samples and the validated dataset was 61% (KHAT = 0.60, n = 17,030). This paper presents an overview of the methodologies used to create the regional land-cover dataset and highlights issues associated with large-area mapping within a coordinated, multi-institutional management framework. © 2006 Elsevier Inc. All rights reserved.

  10. Financial and tax risks at implementation of "Chayanda- Lensk" section of "Sila Sibiri" gas transportation system construction project

    NASA Astrophysics Data System (ADS)

    Sharf, I. V.; Chukhareva, N. V.; Kuznetsova, L. P.

    2014-08-01

    The high social and economic importance of large-scale projects to gasify the East Siberian regions of Russia and to diversify gas exports poses the problem of a comprehensive risk analysis of the project. This article discusses the various types of risks that could significantly affect the timing of implementation and the effectiveness of the project for the construction of the first line of "Sila Sibiri", the "Chayanda-Lensk" section. Special attention is paid to the financial and tax aspects of the project. A graphical analysis of the dynamics of financial indicators reflects periods of varying effectiveness in implementing the project. The authors also discuss the possible causes and consequences of the risks.

  11. MSE wall void repair effect on corrosion of reinforcement - phase 2 : specialty fill materials.

    DOT National Transportation Integrated Search

    2015-08-01

    This project provided information and recommendations on material selection for best corrosion control of reinforcement in mechanically stabilized earth (MSE) walls with void repairs. The investigation consisted of small- and large-scale experim...

  12. EVALUATION PLAN FOR TWO LARGE-SCALE LANDFILL BIOREACTOR TECHNOLOGIES

    EPA Science Inventory

    Abstract - Waste Management, Inc., is operating two long-term bioreactor studies at the Outer Loop Landfill in Louisville, KY, including facultative landfill bioreactor and staged aerobic-anaerobic landfill bioreactor demonstrations. A Quality Assurance Project Plan (QAPP) was p...

  13. Large Scale Evaluation of Nickel Aluminide Rolls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2005-09-01

    This completed project was a joint effort between Oak Ridge National Laboratory and Bethlehem Steel (now Mittal Steel) to demonstrate the effectiveness of using nickel aluminide intermetallic alloy rolls as part of an updated, energy-efficient, commercial annealing furnace system.

  14. Statistical model of exotic rotational correlations in emergent space-time

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogan, Craig; Kwon, Ohkyung; Richardson, Jonathan

    2017-06-06

    A statistical model is formulated to compute exotic rotational correlations that arise as inertial frames and causal structure emerge on large scales from entangled Planck-scale quantum systems. Noncommutative quantum dynamics are represented by random transverse displacements that respect causal symmetry. Entanglement is represented by covariance of these displacements in Planck-scale intervals defined by future null cones of events on an observer's world line. Light that propagates in a nonradial direction inherits a projected component of the exotic rotational correlation that accumulates as a random walk in phase. A calculation of the projection and accumulation leads to exact predictions for statistical properties of exotic Planck-scale correlations in an interferometer of any configuration. The cross-covariance for two nearly co-located interferometers is shown to depart only slightly from the autocovariance. Specific examples are computed for configurations that approximate realistic experiments, and show that the model can be rigorously tested.

  15. Risk of large-scale fires in boreal forests of Finland under changing climate

    NASA Astrophysics Data System (ADS)

    Lehtonen, I.; Venäläinen, A.; Kämäräinen, M.; Peltola, H.; Gregow, H.

    2016-01-01

    The target of this work was to assess the impact of projected climate change on forest-fire activity in Finland, with special emphasis on large-scale fires. In addition, we were particularly interested in examining the inter-model variability of the projected change in fire danger. For this purpose, we utilized fire statistics covering the period 1996-2014 and consisting of almost 20,000 forest fires, as well as daily meteorological data from five global climate models under the representative concentration pathway RCP4.5 and RCP8.5 scenarios. The model data were statistically downscaled onto a high-resolution grid using the quantile-mapping method before performing the analysis. In examining the relationship between weather and fire danger, we applied the Canadian fire weather index (FWI) system. Our results suggest that the number of large forest fires may double or even triple during the present century. This would increase the risk that some of the fires could develop into real conflagrations, which have become almost extinct in Finland due to active and efficient fire suppression. However, the results reveal substantial inter-model variability in the rate of the projected increase of forest-fire danger, emphasizing the large uncertainty in the climate change signal in fire activity. We moreover showed that the majority of large fires in Finland occur within a relatively short period in May and June due to human activities, and that FWI correlates more poorly with fire activity during this time of year than later in summer, when lightning is a more important cause of fires.
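
    Quantile mapping, the downscaling step named above, replaces each model value with the observed value at the same empirical quantile of the reference climatology. A minimal empirical sketch under that assumption, with toy series rather than the study's data or exact implementation:

```python
import bisect

# Empirical quantile mapping: find the quantile of a model value
# within the model's reference distribution, then read off the
# observed distribution at that same quantile.

def quantile_map(model_ref, obs_ref, value):
    m = sorted(model_ref)
    o = sorted(obs_ref)
    # Empirical quantile of `value` in the model climatology...
    q = bisect.bisect_left(m, value) / len(m)
    # ...mapped to the same quantile of the observations.
    idx = min(int(q * len(o)), len(o) - 1)
    return o[idx]

# Toy climatologies: the model is systematically too dry by half.
model_ref = [1, 2, 3, 4]
obs_ref = [2, 4, 6, 8]
corrected = quantile_map(model_ref, obs_ref, 3)
```

    Operational implementations interpolate between quantiles and handle values outside the reference range; this sketch keeps only the core mapping idea.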

  16. Approaching the exa-scale: a real-world evaluation of rendering extremely large data sets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patchett, John M; Ahrens, James P; Lo, Li - Ta

    2010-10-15

    Extremely large-scale analysis is becoming increasingly important as supercomputers and their simulations move from petascale to exascale. The lack of dedicated hardware acceleration for rendering on today's supercomputing platforms motivates our detailed evaluation of the possibility of interactive rendering on the supercomputer. In order to facilitate our understanding of rendering on the supercomputing platform, we focus on the scalability of rendering algorithms and architectures envisioned for exascale datasets. To understand tradeoffs for dealing with extremely large datasets, we compare three different rendering algorithms for large polygonal data: software-based ray tracing, software-based rasterization and hardware-accelerated rasterization. We present a case study of strong and weak scaling of rendering extremely large data on both GPU- and CPU-based parallel supercomputers using ParaView, a parallel visualization tool. We use three different data sets: two synthetic and one from a scientific application. At an extreme scale, algorithmic rendering choices make a difference and should be considered while approaching exascale computing, visualization, and analysis. We find software-based ray tracing offers a viable approach for scalable rendering of the projected future massive data sizes.

  17. The future of management: The NASA paradigm

    NASA Technical Reports Server (NTRS)

    Harris, Philip R.

    1992-01-01

    Prototypes of 21st century management, especially for large-scale enterprises, may well be found within the aerospace industry. The space era inaugurated a number of projects of such scope and magnitude that another type of management had to be created to ensure successful achievement. The challenges will be not just technological but also human and cultural in dimension. Futurists, students of management, and those concerned with technological administration would do well to review the literature of emerging space management for its wider implications. NASA offers a paradigm, or demonstrated model, of future trends in the field of management at large. More research is needed on issues of leadership for Earth-based projects in space and for space-based programs with managers there. It must be recognized that large-scale technical enterprises, such as those undertaken in space, require a new form of management. NASA and other responsible agencies are urged to study excellence in space macromanagement, including the necessary multidisciplinary skills. Two recommended targets are the application of general living systems theory and macromanagement concepts for space stations in the 1990s.

  18. Large-scale functional models of visual cortex for remote sensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brumby, Steven P; Kenyon, Garrett; Rasmussen, Craig E

    Neuroscience has revealed many properties of neurons and of the functional organization of visual cortex that are believed to be essential to human vision but are missing in standard artificial neural networks. Equally important may be the sheer scale of visual cortex, requiring ~1 petaflop of computation. In a year, the retina delivers ~1 petapixel to the brain, leading to massively large opportunities for learning at many levels of the cortical system. We describe work at Los Alamos National Laboratory (LANL) to develop large-scale functional models of visual cortex on LANL's Roadrunner petaflop supercomputer. An initial run of a simple region V1 code achieved 1.144 petaflops during trials at the IBM facility in Poughkeepsie, NY (June 2008). Here, we present criteria for assessing when a set of learned local representations is 'complete', along with general criteria for assessing computer vision models based on their projected scaling behavior. Finally, we extend one class of biologically-inspired learning models to problems of remote sensing imagery.

  19. Estimating Sediment Delivery to The Rio Maranon, Peru Prior to Large-Scale Hydropower Developments Using High Resolution Imagery from Google Earth and a DJI Phantom 3 Drone

    NASA Astrophysics Data System (ADS)

    Goode, J. R.; Candelaria, T.; Kramer, N. R.; Hill, A. F.

    2016-12-01

    As global energy demands increase, generating hydroelectric power by constructing dams and reservoirs on large river systems is increasingly seen as a renewable alternative to fossil fuels, especially in emerging economies. Many large-scale hydropower projects are located in steep mountainous terrain, where environmental factors have the potential to conspire against the sustainability and success of such projects. As reservoir storage capacity decreases when sediment builds up behind dams, high sediment yields can limit project life expectancy and overall hydropower viability. In addition, episodically delivered sediment from landslides can make quantifying sediment loads difficult. These factors, combined with remote access, limit the critical data needed to effectively evaluate development decisions. In the summer of 2015, we conducted a basic survey to characterize the geomorphology, hydrology and ecology of 620 km of the Rio Maranon, Peru - a major tributary to the Amazon River, which flows north from the semi-arid Peruvian Andes - prior to its dissection by several large hydropower dams. Here we present one component of this larger study: a first-order analysis of potential sediment inputs to the Rio Maranon. To evaluate sediment delivery and storage in this system, we used high-resolution Google Earth imagery to delineate landslides, combined with high-resolution imagery from a DJI Phantom 3 drone flown at alluvial fan inputs to the river in the field. Because hillslope-derived sediment inputs from headwater tributaries are important to overall ecosystem health in large river systems, our study has the potential to contribute to understanding the impacts of large Andean dams on sediment connectivity to the Amazon basin.

  20. Microbiological, Geochemical and Hydrologic Processes Controlling Uranium Mobility: An Integrated Field-Scale Subsurface Research Challenge Site at Rifle, Colorado, Quality Assurance Project Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fix, N. J.

The U.S. Department of Energy (DOE) is cleaning up and/or monitoring large, dilute plumes contaminated by metals, such as uranium and chromium, whose mobility and solubility change with redox status. Field-scale experiments with acetate as the electron donor have stimulated metal-reducing bacteria to effectively remove uranium [U(VI)] from groundwater at the Uranium Mill Tailings Site in Rifle, Colorado. The Pacific Northwest National Laboratory and a multidisciplinary team of national laboratory and academic collaborators have embarked on research proposed for the Rifle site, the objective of which is to gain a comprehensive and mechanistic understanding of the microbial factors and associated geochemistry controlling uranium mobility, so that DOE can confidently remediate uranium plumes as well as support stewardship of uranium-contaminated sites. This Quality Assurance Project Plan provides the quality assurance requirements and processes that will be followed by the Rifle Integrated Field-Scale Subsurface Research Challenge Project.

  1. Accelerating Software Development through Agile Practices--A Case Study of a Small-Scale, Time-Intensive Web Development Project at a College-Level IT Competition

    ERIC Educational Resources Information Center

    Zhang, Xuesong; Dorn, Bradley

    2012-01-01

    Agile development has received increasing interest both in industry and academia due to its benefits in developing software quickly, meeting customer needs, and keeping pace with the rapidly changing requirements. However, agile practices and scrum in particular have been mainly tested in mid- to large-size projects. In this paper, we present…

  2. Integrated Database And Knowledge Base For Genomic Prospective Cohort Study In Tohoku Medical Megabank Toward Personalized Prevention And Medicine.

    PubMed

    Ogishima, Soichi; Takai, Takako; Shimokawa, Kazuro; Nagaie, Satoshi; Tanaka, Hiroshi; Nakaya, Jun

    2015-01-01

The Tohoku Medical Megabank project is a national project to revitalize the area of the Tohoku region devastated by the Great East Japan Earthquake, and it has conducted a large-scale prospective genome-cohort study. Alongside the prospective genome-cohort study, we have developed an integrated database and knowledge base that will be a key resource for realizing personalized prevention and medicine.

  3. APDA's Contribution to Current Research and Citizen Science

    NASA Astrophysics Data System (ADS)

    Barker, Thurburn; Castelaz, M. W.; Cline, J. D.; Hudec, R.

    2010-01-01

The Astronomical Photographic Data Archive (APDA) is dedicated to the collection, restoration, preservation, and digitization of astronomical photographic data that eventually can be accessed via the Internet by the global community of scientists, researchers and students. Located on the Pisgah Astronomical Research Institute campus, APDA now includes collections from North America totaling more than 100,000 photographic plates and films. Two new large-scale research projects and one citizen science project have now been developed from the archived data. One unique photographic data collection covering the southern hemisphere contains the signatures of diffuse interstellar bands (DIBs) within the stellar spectra on objective prism plates. We plan to digitize the spectra, identify the DIBs, and map out the large-scale spatial extent of DIBs. The goal is to understand the Galactic environments suitable for the DIB molecules. Another collection contains spectra with nearly the same dispersion as the GAIA Satellite low dispersion slitless spectrophotometers, BP and RP. The plates will be used to develop standards for GAIA spectra. To bring the data from APDA to the general public, we have developed the citizen science project called Stellar Classification Online - Public Exploration (SCOPE). SCOPE allows the citizen scientist to classify up to a half million stars on objective prism plates. We will present the status of each of these projects.

  4. Adapting wheat to uncertain future

    NASA Astrophysics Data System (ADS)

    Semenov, Mikhail; Stratonovitch, Pierre

    2015-04-01

This study describes integration of climate change projections from the Coupled Model Intercomparison Project Phase 5 (CMIP5) multi-model ensemble with the LARS-WG weather generator, which delivers an attractive option for downscaling of large-scale climate projections from global climate models (GCMs) to local-scale climate scenarios for impact assessments. A subset of 18 GCMs from the CMIP5 ensemble and 2 RCPs, RCP4.5 and RCP8.5, were integrated with LARS-WG. Climate sensitivity indexes for temperature and precipitation were computed for all GCMs and for 21 regions in the world. For computationally demanding impact assessments, where it is not practical to explore all possible combinations of GCM × RCP, climate sensitivity indexes could be used to select a subset of GCMs from CMIP5 with contrasting climate sensitivity. This would allow uncertainty in impacts resulting from the CMIP5 ensemble to be quantified by conducting fewer simulation experiments. As an example, the in silico design of a wheat ideotype optimised for future climate scenarios in Europe was described. Two contrasting GCMs were selected for the analysis, "hot" HadGEM2-ES and "cool" GISS-E2-R-CC, along with 2 RCPs. Despite large uncertainty in climate projections, several wheat traits were identified as beneficial for the high-yielding wheat ideotypes that could be used as targets for wheat improvement by breeders.
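The abstract does not spell out how the contrasting GCMs are picked from the computed sensitivity indexes. As a hedged sketch, one simple rule is to choose the pair of models whose (temperature, precipitation) index pairs lie farthest apart; the distance measure, the third model name, and all index values below are illustrative placeholders, not values from the study:

```python
import itertools

def most_contrasting(sens: dict) -> tuple:
    """Return the pair of GCM names whose sensitivity indexes differ most.

    `sens` maps GCM name -> (temperature index, precipitation index);
    contrast is measured here with a plain Euclidean distance.
    """
    def dist(a: str, b: str) -> float:
        (t1, p1), (t2, p2) = sens[a], sens[b]
        return ((t1 - t2) ** 2 + (p1 - p2) ** 2) ** 0.5

    return max(itertools.combinations(sens, 2), key=lambda pair: dist(*pair))

# hypothetical index values for three CMIP5 models
indexes = {
    "HadGEM2-ES": (4.5, -5.0),   # "hot", drying
    "GISS-E2-R-CC": (2.1, 3.0),  # "cool", wetting
    "MIROC5": (3.0, 0.5),        # intermediate
}
```

With these placeholder values, `most_contrasting(indexes)` returns the HadGEM2-ES / GISS-E2-R-CC pair, mirroring the paper's choice of one "hot" and one "cool" model; a real selection would use indexes computed from the full CMIP5 ensemble.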

  5. Schistosomiasis control in China: the impact of a 10-year World Bank Loan Project (1992-2001).

    PubMed Central

    Xianyi, Chen; Liying, Wang; Jiming, Cai; Xiaonong, Zhou; Jiang, Zheng; Jiagang, Guo; Xiaohua, Wu; Engels, D.; Minggang, Chen

    2005-01-01

    China has been carrying out large-scale schistosomiasis control since the mid-1950s, but in the early 1990s, schistosomiasis was still endemic in eight provinces. A World Bank Loan Project enabled further significant progress to be made during the period 1992-2001. The control strategy was focused on the large-scale use of chemotherapy -- primarily to reinforce morbidity control -- while at the same time acting on transmission with the ultimate goal of interrupting it. Chemotherapy was complemented by health education, chemical control of snails and environmental modification where appropriate. A final evaluation in 2002 showed that infection rates in humans and livestock had decreased by 55% and 50%, respectively. The number of acute infections and of individuals with advanced disease had also significantly decreased. Although snail infection rates continued to fluctuate at a low level, the densities of infected snails had decreased by more than 75% in all endemic areas. The original objectives of the China World Bank Loan Project for schistosomiasis control had all been met. One province, Zhejiang, had already fulfilled the criteria for elimination of schistosomiasis by 1995. The project was therefore a success and has provided China with a sound basis for further control. PMID:15682248

  6. Improvement of General Electric’s Chilled Ammonia Process with the use of Membrane Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muraskin, Dave; Dube, Sanjay; Baburao, Barath

General Electric Environmental Control Solutions (formerly Alstom Power Environmental Control Systems) set out to complete the Phase I award requirements for a Phase II renewal application for their project selected under DOE-FOA-0001190, "Small and Large Scale Pilots for Reducing the Cost of CO2 Capture and Compression". The project focus was to implement several improvement concepts utilizing membrane technology at the recipient's Chilled Ammonia Process (CAP) CO2 capture large-scale pilot plant, with the goal of lowering the overall cost of the technology. During the development of costs for the preliminary techno-economic assessment (TEA), it became clear that the capital and operating costs of this concept were not economically attractive. All work related to a Phase II renewal application was halted at that point, as GE decided not to submit one. Discussions with DOE resulted in a path toward extracting useful information from the design and cost work already completed on the project. With the reverse osmosis (RO) unit accounting for most of the cost issues, GE would provide a sensitivity analysis of the RO unit with respect to project cost. This information would be included with the Techno-Economic Analysis along with the Technology Gap Analysis.

  7. Hybrid methods for cybersecurity analysis :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Warren Leon,; Dunlavy, Daniel M.

    2014-01-01

Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending against the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts, and at understanding the challenges of detecting and preventing emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems of email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel.
The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and years to hours and days for the application of new modeling and analysis capabilities to emerging threats. The development and deployment framework has been generalized into the Hybrid Framework and incorporated into several LDRD, WFO, and DOE/CSL projects and proposals. Most importantly, the Hybrid project has provided Sandia security analysts with new, scalable, extensible analytic capabilities that have resulted in alerts not detectable using their previous workflow tool sets.

  8. [Additional psychometric data for the DS1K mood questionnaire. Experience from a large sample study involving parents of young children].

    PubMed

    Danis, Ildiko; Scheuring, Noemi; Papp, Eszter; Czinner, Antal

    2012-06-01

A new instrument for assessing depressive mood, the first version of the Depression Scale Questionnaire (DS1K), was published in 2008 by Halmai et al. This scale was used in our large sample study, in the framework of the For Healthy Offspring project, involving parents of young children. The original questionnaire was developed on small samples, so our aim was to assist further development of the instrument through psychometric analysis of the data in our large sample (n=1164). The DS1K scale was chosen to measure the parents' mood and mental state in the For Healthy Offspring project. The questionnaire was completed by 1063 mothers and 328 fathers, yielding a sample heterogeneous with respect to age and socio-demographic status. Analyses included the main descriptive statistics, establishing the scale's inner consistency, and some comparisons. Results were checked in both our original and multiply imputed datasets. According to our results, the reliability of the scale was much worse than in the original study (Cronbach alpha: 0.61 versus 0.88). Detailed item analysis made clear that two items contributed to the observed decrease in coherence. We assumed a problem related to misreading in the case of one of these items, and checked this assumption by cross-analysis by assumed reading level. The reliability of the scale increased in both the lower and higher education level groups when one or both of the problematic items were excluded. However, as the number of items decreased, the relative sensitivity of the scale was also reduced, with fewer persons categorized in the risk group compared to the original scale. As an alternative solution, we suggest that the authors redefine the problematic items and retest the reliability of the measurement in a sample with diverse socio-demographic characteristics.
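The reliability analysis described above (overall Cronbach's alpha, then alpha with each item deleted to flag problem items) can be sketched in a few lines. The synthetic data and function names below are illustrative assumptions, not the study's code or data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - sum_item_var / total_var)

def alpha_if_item_deleted(items: np.ndarray) -> list:
    """Alpha recomputed with each item dropped; a rise flags a weak item."""
    k = items.shape[1]
    return [cronbach_alpha(np.delete(items, j, axis=1)) for j in range(k)]

# synthetic example: four coherent items plus one noisy, off-scale item
rng = np.random.default_rng(0)
trait = rng.normal(size=(500, 1))
scores = np.hstack([trait + 0.3 * rng.normal(size=(500, 4)),
                    rng.normal(size=(500, 1))])
```

In this toy data, dropping the fifth (noise) item raises alpha above the full-scale value, the same signature the study used to identify its two problematic items.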

  9. Enterprise tools to promote interoperability: MonitoringResources.org supports design and documentation of large-scale, long-term monitoringprograms

    NASA Astrophysics Data System (ADS)

    Weltzin, J. F.; Scully, R. A.; Bayer, J.

    2016-12-01

Individual natural resource monitoring programs have evolved in response to different organizational mandates, jurisdictional needs, issues and questions. We are establishing a collaborative forum for large-scale, long-term monitoring programs to identify opportunities where collaboration could yield efficiency in monitoring design, implementation, analyses, and data sharing. We anticipate these monitoring programs will have similar requirements - e.g. survey design, standardization of protocols and methods, information management and delivery - that could be met by enterprise tools to promote sustainability, efficiency and interoperability of information across geopolitical boundaries or organizational cultures. MonitoringResources.org, a project of the Pacific Northwest Aquatic Monitoring Partnership, provides an on-line suite of enterprise tools focused on aquatic systems in the Pacific Northwest Region of the United States. We will leverage and expand this existing capacity to support continental-scale monitoring of both aquatic and terrestrial systems. The current stakeholder group is focused on programs led by bureaus within the Department of the Interior, but the tools will be readily and freely available to a broad variety of other stakeholders. Here, we report the results of two initial stakeholder workshops focused on (1) establishing a collaborative forum of large-scale monitoring programs, (2) identifying and prioritizing shared needs, (3) evaluating existing enterprise resources, (4) defining priorities for development of enhanced capacity for MonitoringResources.org, and (5) identifying a small number of pilot projects that can be used to define and test development requirements for specific monitoring programs.

  10. Internal Variability-Generated Uncertainty in East Asian Climate Projections Estimated with 40 CCSM3 Ensembles.

    PubMed

    Yao, Shuai-Lei; Luo, Jing-Jia; Huang, Gang

    2016-01-01

Regional climate projections are challenging because of large uncertainty stemming in particular from unpredictable, internal variability of the climate system. Here, we examine the internal variability-induced uncertainty in precipitation and surface air temperature (SAT) trends during 2005-2055 over East Asia based on 40-member ensemble projections of the Community Climate System Model Version 3 (CCSM3). The model ensembles are generated from a suite of different atmospheric initial conditions using the same SRES A1B greenhouse gas scenario. We find that projected precipitation trends are subject to considerably larger internal uncertainty, and hence have lower confidence, than the projected SAT trends in both boreal winter and summer. Projected SAT trends in winter have relatively higher uncertainty than those in summer. In addition, the lower-level atmospheric circulation has larger uncertainty than that at mid-levels. Based on k-means cluster analysis, we demonstrate that a substantial portion of the internally-induced precipitation and SAT trends arises from internal large-scale atmospheric circulation variability. These results highlight the importance of internal climate variability in affecting regional climate projections on multi-decadal timescales.
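The k-means step can be illustrated with a minimal sketch: flatten each ensemble member's trend field into a vector and cluster the members, so that members sharing a large-scale circulation-trend pattern fall into the same group. The implementation below (Lloyd's algorithm with a simple deterministic farthest-point initialization) and its synthetic data are an illustrative assumption, not the study's actual procedure:

```python
import numpy as np

def kmeans(X: np.ndarray, k: int, iters: int = 100):
    """Minimal Lloyd's k-means; returns (centroids, labels) for (n, d) data."""
    # farthest-point initialization: start from X[0], then repeatedly add
    # the sample farthest from all centroids chosen so far
    idx = [0]
    for _ in range(k - 1):
        d = np.min(np.linalg.norm(X[:, None] - X[idx][None], axis=2), axis=1)
        idx.append(int(d.argmax()))
    centroids = X[idx].copy()

    for _ in range(iters):
        # assign each sample to its nearest centroid
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # recompute centroids as cluster means (keep empty clusters fixed)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return centroids, labels
```

Applied to per-member trend vectors, the resulting labels partition the 40 members into a few characteristic trend patterns, and the spread across those patterns is one measure of the internal-variability uncertainty.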

  11. Large-Scale Science Observatories: Building on What We Have Learned from USArray

    NASA Astrophysics Data System (ADS)

    Woodward, R.; Busby, R.; Detrick, R. S.; Frassetto, A.

    2015-12-01

With the NSF-sponsored EarthScope USArray observatory, the Earth science community has built the operational capability and experience to tackle scientific challenges at the largest scales, such as a Subduction Zone Observatory. In the first ten years of USArray, geophysical instruments were deployed across roughly 2% of the Earth's surface. The USArray operated a rolling deployment of seismic stations that occupied ~1,700 sites across the USA, made co-located atmospheric observations, occupied hundreds of sites with magnetotelluric sensors, expanded a backbone reference network of seismic stations, and provided instruments to PI-led teams that deployed thousands of additional seismic stations. USArray included a comprehensive outreach component that directly engaged hundreds of students at over 50 colleges and universities to locate station sites and provided Earth science exposure to roughly 1,000 landowners who hosted stations. The project also included a comprehensive data management capability that received, archived and distributed data, metadata, and data products; data were acquired and distributed in real time. The USArray project was completed on time and under budget and developed a number of best practices that can inform other large-scale science initiatives that the Earth science community is contemplating. Key strategies employed by USArray included: using a survey mode of observation, rather than a hypothesis-driven one, to generate comprehensive, high quality data on a large scale for exploration and discovery; making data freely and openly available to any investigator from the very onset of the project; and using proven, commercial, off-the-shelf systems to ensure a fast start and avoid delays due to over-reliance on unproven technology or concepts. Scope was set ambitiously, but managed carefully to avoid overextending. Configuration was controlled to ensure efficient operations while providing consistent, uniform observations.
Finally, community governance structures were put in place to ensure a focus on science needs and goals, to provide an informed review of the project's results, and to carefully balance consistency of observations with technical evolution. We will summarize lessons learned from USArray and how these can be applied to future efforts such as SZO.

  12. Measuring Cosmic Expansion and Large Scale Structure with Destiny

    NASA Technical Reports Server (NTRS)

    Benford, Dominic J.; Lauer, Tod R.

    2007-01-01

Destiny is a simple, direct, low cost mission to determine the properties of dark energy by obtaining a cosmologically deep supernova (SN) type Ia Hubble diagram and by measuring the large-scale mass power spectrum over time. Its science instrument is a 1.65m space telescope, featuring a near-infrared survey camera/spectrometer with a large field of view. During its first two years, Destiny will detect, observe, and characterize approximately 3000 SN Ia events over the redshift interval 0.4 < z < 1.7; it will then survey roughly 1000 square degrees to measure the large-scale mass power spectrum. The combination of surveys is much more powerful than either technique on its own, and will have over an order of magnitude greater sensitivity than will be provided by ongoing ground-based projects.

  13. Developing A Large-Scale, Collaborative, Productive Geoscience Education Network

    NASA Astrophysics Data System (ADS)

    Manduca, C. A.; Bralower, T. J.; Egger, A. E.; Fox, S.; Ledley, T. S.; Macdonald, H.; Mcconnell, D. A.; Mogk, D. W.; Tewksbury, B. J.

    2012-12-01

Over the past 15 years, the geoscience education community has grown substantially and developed broad and deep capacity for collaboration and dissemination of ideas. While this community is best viewed as emergent from complex interactions among changing educational needs and opportunities, we highlight the role of several large projects in the development of a network within this community. In the 1990s, three NSF projects came together to build a robust web infrastructure to support the production and dissemination of on-line resources: On The Cutting Edge (OTCE), Earth Exploration Toolbook, and Starting Point: Teaching Introductory Geoscience. Along with the contemporaneous Digital Library for Earth System Education, these projects engaged geoscience educators nationwide in exploring professional development experiences that produced lasting on-line resources, collaborative authoring of resources, and models for web-based support for geoscience teaching. As a result, a culture developed in the 2000s in which geoscience educators anticipated that resources for geoscience teaching would be shared broadly and that collaborative authoring would be productive and engaging. By this time, a diverse set of examples demonstrated the power of the web infrastructure in supporting collaboration, dissemination and professional development. Building on this foundation, more recent work has expanded both the size of the network and the scope of its work. Many large research projects initiated collaborations to disseminate resources supporting educational use of their data. Research results from the rapidly expanding geoscience education research community were integrated into the Pedagogies in Action website and OTCE. Projects engaged faculty across the nation in large-scale data collection and educational research. The Climate Literacy and Energy Awareness Network and OTCE engaged community members in reviewing the expanding body of on-line resources.
Building Strong Geoscience Departments sought to create, for departments, the same type of shared information base that was supporting individual faculty. The Teach the Earth portal and its underlying web development tools were used by NSF-funded projects in education to disseminate their results. Leveraging these funded efforts, the Climate Literacy Network has expanded this geoscience education community to include individuals broadly interested in fostering climate literacy. Most recently, the InTeGrate project is implementing inter-institutional collaborative authoring, testing and evaluation of curricular materials. While these projects represent only a fraction of the activity in geoscience education, they are important drivers in the development of a large, national, coherent geoscience education network with the ability to collaborate and disseminate information effectively. Importantly, the community is open and defined by active participation. Key mechanisms for engagement have included alignment of project activities with participants' needs and goals; productive face-to-face and virtual workshops, events, and series; stipends for completion of large products; and strong supporting staff to keep projects moving and assist with product production. One measure of its success is the adoption and adaptation of resources and models by emerging projects, which results in the continued growth of the network.

  14. Simulation research on the process of large scale ship plane segmentation intelligent workshop

    NASA Astrophysics Data System (ADS)

    Xu, Peng; Liao, Liangchuang; Zhou, Chao; Xue, Rui; Fu, Wei

    2017-04-01

A large-scale ship plane segmentation intelligent workshop is a new development, with little related research at home or abroad. The mode of production must be transformed from the existing Industry 2.0 (or partial Industry 3.0) paradigm, that is, from "human brain analysis and judgment + machine manufacturing" to "machine analysis and judgment + machine manufacturing". In this transformation, a great many questions of management and technology need to be resolved, such as workshop structure evolution, development of intelligent equipment, and changes in the business model; together they amount to a reformation of the whole workshop. Process simulation in this project would verify the general layout and process flow of a large-scale ship plane section intelligent workshop and analyze the workshop's working efficiency, which is significant for the next step of the transformation of the plane segmentation intelligent workshop.

  15. A study of rotor and platform design trade-offs for large-scale floating vertical axis wind turbines

    NASA Astrophysics Data System (ADS)

    Griffith, D. Todd; Paquette, Joshua; Barone, Matthew; Goupee, Andrew J.; Fowler, Matthew J.; Bull, Diana; Owens, Brian

    2016-09-01

Vertical axis wind turbines are receiving significant attention for offshore siting. In general, offshore wind offers proximity to large population centers, a vast & more consistent wind resource, and a scale-up opportunity, to name a few beneficial characteristics. On the other hand, offshore wind suffers from high levelized cost of energy (LCOE) and in particular high balance of system (BoS) costs owing to accessibility challenges and limited project experience. To address these challenges associated with offshore wind, Sandia National Laboratories is researching large-scale (MW class) offshore floating vertical axis wind turbines (VAWTs). The motivation for this work is that floating VAWTs are a potential transformative technology solution to reduce offshore wind LCOE in deep-water locations. This paper explores performance and cost trade-offs within the design space for floating VAWTs between the configurations for the rotor and platform.

  16. REAL TIME CONTROL OF SEWERS: US EPA MANUAL

    EPA Science Inventory

The problem of sewage spills and local flooding has traditionally been addressed by large scale capital improvement programs that focus on construction alternatives such as sewer separation or construction of storage facilities. The cost of such projects is often high, especially...

  17. Documentation of validity for the AT-SAT computerized test battery. Volume 2

    DOT National Transportation Integrated Search

    2001-03-01

This document is a comprehensive report on a large-scale research project to develop and validate a computerized selection battery to hire Air Traffic Control Specialists (ATCSs) for the Federal Aviation Administration (FAA). The purpose of this ...

  18. Documentation of validity for the AT-SAT computerized test battery. Volume 1

    DOT National Transportation Integrated Search

    2001-03-01

This document is a comprehensive report on a large-scale research project to develop and validate a computerized selection battery to hire Air Traffic Control Specialists (ATCSs) for the Federal Aviation Administration (FAA). The purpose of this ...

  19. Novel Membranes and Systems for Industrial and Municipal Water Purification and Reuse

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    This factsheet describes a project that developed nano-engineered, high-permeance membrane materials with more than double the permeance of current reverse osmosis membranes as well as manufacturing technologies for large-scale production of the novel materials.

  20. The Telecommunications and Data Acquisition Report

    NASA Technical Reports Server (NTRS)

    Posner, E. C. (Editor)

    1985-01-01

Developments in programs managed by the Jet Propulsion Laboratory's Office of Telecommunications and Data Acquisition are discussed. Space communications, radio antennas, the Deep Space Network, antenna design, Project SETI, seismology, coding, very large scale integration, downlinking, and demodulation are among the topics covered.

  1. Mycotoxins: A fungal genomics perspective

    USDA-ARS?s Scientific Manuscript database

    The chemical and enzymatic diversity in the fungal kingdom is staggering. Large-scale fungal genome sequencing projects are generating a massive catalog of secondary metabolite biosynthetic genes and pathways. Fungal natural products are a boon and bane to man as valuable pharmaceuticals and harmful...

  2. De-mystifying earned value management for ground based astronomy projects, large and small

    NASA Astrophysics Data System (ADS)

    Norton, Timothy; Brennan, Patricia; Mueller, Mark

    2014-08-01

The scale and complexity of today's ground based astronomy projects have justifiably required Principal Investigators and their project teams to adopt more disciplined management processes and tools in order to achieve timely and accurate quantification of the progress and relative health of their projects. Earned Value Management (EVM) is one such tool. Developed decades ago, used extensively in the defense and construction industries, and now a requirement for NASA projects greater than $20M, EVM has gained a foothold in ground-based astronomy projects. The intent of this paper is to de-mystify EVM by discussing the fundamentals of project management, explaining how EVM fits with existing principles, and describing key concepts every project can use to implement their own EVM system. This paper also discusses pitfalls to avoid during implementation and obstacles to its success. The authors report on their organization's most recent experience implementing EVM for the GMT-Consortium Large Earth Finder (G-CLEF) project. G-CLEF is a fiber-fed, optical echelle spectrograph that has been selected as a first light instrument for the Giant Magellan Telescope (GMT), planned for construction at the Las Campanas Observatory in Chile's Atacama Desert region.
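The EVM fundamentals the paper refers to reduce to a handful of standard quantities built from the budget at completion, the planned and actual fractions of work complete, and the actual cost to date. A minimal sketch with hypothetical figures (standard textbook formulas, not G-CLEF data):

```python
from dataclasses import dataclass

@dataclass
class EvmSnapshot:
    bac: float          # budget at completion
    pct_planned: float  # fraction of work scheduled by the status date
    pct_done: float     # fraction of work actually completed
    actual_cost: float  # cost actually incurred to date

    @property
    def pv(self) -> float:   # planned value
        return self.bac * self.pct_planned

    @property
    def ev(self) -> float:   # earned value
        return self.bac * self.pct_done

    @property
    def spi(self) -> float:  # schedule performance index (<1 = behind)
        return self.ev / self.pv

    @property
    def cpi(self) -> float:  # cost performance index (<1 = over budget)
        return self.ev / self.actual_cost

    @property
    def eac(self) -> float:  # estimate at completion, assuming CPI persists
        return self.bac / self.cpi

# hypothetical status: $10M budget, 50% scheduled, 40% done, $5M spent
snap = EvmSnapshot(bac=10_000_000, pct_planned=0.5, pct_done=0.4,
                   actual_cost=5_000_000)
```

For this hypothetical snapshot, SPI = CPI = 0.8, so the project is both behind schedule and over cost, and the estimate at completion grows to $12.5M; this early, quantified warning is exactly the value the paper argues EVM brings to instrument projects.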

  3. The Pocos de Caldas International Project: An example of a large-scale radwaste isolation natural analogue study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shea, M.

    1995-09-01

The proper isolation of radioactive waste is one of today's most pressing environmental issues. Research is being carried out by many countries around the world in order to answer critical and perplexing questions regarding the safe disposal of radioactive waste. Natural analogue studies are an increasingly important facet of this international research effort. The Pocos de Caldas Project represents a major effort of the international technical and scientific community towards addressing one of modern civilization's most critical environmental issues - radioactive waste isolation.

  4. System Modeling of a large FPGA project: the SKA Tile Processing Module

    NASA Astrophysics Data System (ADS)

    Belli, C.; Comoretto, G.

Large projects like the SKA have an intrinsic complexity due to their scale, so the application of a design-management system becomes fundamental. For this purpose the SysML language, a UML customization for engineering applications, has been applied. Our work focused on the SKA Low Telescope Tile Processing Module (TPM), designing diagrams at different levels of detail. We designed a conceptual model of the TPM, focusing primarily on the main interfaces and the major data flows between product items. Functionalities are derived from use cases and allocated to hardware modules in order to guarantee the project's internal consistency and feature coverage. This model has been used both as internal documentation and as a job specification, to commit part of the design to external entities.

  5. How Large Is the "Public Domain"? A Comparative Analysis of Ringer's 1961 Copyright Renewal Study and HathiTrust CRMS Data

    ERIC Educational Resources Information Center

    Wilkin, John P.

    2017-01-01

    The 1961 Copyright Office study on renewals, authored by Barbara Ringer, has cast an outsized influence on discussions of the U.S. 1923-1963 public domain. As more concrete data emerge from initiatives such as the large-scale determination process in the Copyright Review Management System (CRMS) project, questions are raised about the reliability…

  6. Stereoscopic applications for design visualization

    NASA Astrophysics Data System (ADS)

    Gilson, Kevin J.

    2007-02-01

    Advances in display technology and 3D design visualization applications have made real-time stereoscopic visualization of architectural and engineering projects a reality. Parsons Brinckerhoff (PB) is a transportation consulting firm that has used digital visualization tools from their inception and has helped pioneer the application of those tools to large-scale infrastructure projects. PB is one of the first Architecture/Engineering/Construction (AEC) firms to implement a CAVE, an immersive presentation environment that includes stereoscopic rear-projection capability. The firm also employs a portable stereoscopic front-projection system, and shutter-glass systems for smaller groups. PB is using commercial real-time 3D applications in combination with traditional 3D modeling programs to visualize and present large AEC projects to planners, clients and decision makers in stereo. These presentations create more immersive and spatially realistic views of the proposed designs. This paper will present the basic display tools and applications, and the 3D modeling techniques PB is using to produce interactive stereoscopic content. The paper will discuss several architectural and engineering design visualizations we have produced.

  7. Large rainfall changes consistently projected over substantial areas of tropical land

    NASA Astrophysics Data System (ADS)

    Chadwick, Robin; Good, Peter; Martin, Gill; Rowell, David P.

    2016-02-01

    Many tropical countries are exceptionally vulnerable to changes in rainfall patterns, with floods or droughts often severely affecting human life and health, food and water supplies, ecosystems and infrastructure. There is widespread disagreement among climate model projections of how and where rainfall will change over tropical land at the regional scales relevant to impacts, with different models predicting the position of current tropical wet and dry regions to shift in different ways. Here we show that despite uncertainty in the location of future rainfall shifts, climate models consistently project that large rainfall changes will occur for a considerable proportion of tropical land over the twenty-first century. The area of semi-arid land affected by large changes under a higher emissions scenario is likely to be greater than during even the most extreme regional wet or dry periods of the twentieth century, such as the Sahel drought of the late 1960s to 1990s. Substantial changes are projected to occur by mid-century--earlier than previously expected--and to intensify in line with global temperature rise. Therefore, current climate projections contain quantitative, decision-relevant information on future regional rainfall changes, particularly with regard to climate change mitigation policy.

  8. Recent developments in user-job management with Ganga

    NASA Astrophysics Data System (ADS)

    Currie, R.; Elmsheuser, J.; Fay, R.; Owen, P. H.; Richards, A.; Slater, M.; Sutcliffe, W.; Williams, M.

    2015-12-01

    The Ganga project was originally developed for use by LHC experiments and has been used extensively throughout Run 1 in both LHCb and ATLAS. This document describes some of the most recent developments within the Ganga project. There have been improvements in the handling of large-scale computational tasks in the form of a new GangaTasks infrastructure. Improvements in file handling through a new IGangaFile interface make handling files largely transparent to the end user. In addition, the performance and usability of Ganga have both been addressed through the development of a new queues system that allows for parallel processing of job-related tasks.

  9. Consortium biology in immunology: the perspective from the Immunological Genome Project.

    PubMed

    Benoist, Christophe; Lanier, Lewis; Merad, Miriam; Mathis, Diane

    2012-10-01

    Although the field has a long collaborative tradition, immunology has made less use than genetics of 'consortium biology', wherein groups of investigators together tackle large integrated questions or problems. However, immunology is naturally suited to large-scale integrative and systems-level approaches, owing to the multicellular and adaptive nature of the cells it encompasses. Here, we discuss the value and drawbacks of this organization of research, in the context of the long-running 'big science' debate, and consider the opportunities that may exist for the immunology community. We position this analysis in light of our own experience, both positive and negative, as participants of the Immunological Genome Project.

  10. [New hosts and vectors for genome cloning]. Progress report, 1990--1991

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The main goal of our project remains the development of new bacterial hosts and vectors for the stable propagation of human DNA clones in E. coli. During the past six months of our current budget period, we have (1) continued to develop new hosts that permit the stable maintenance of unstable features of human DNA, and (2) developed a series of vectors for (a) cloning large DNA inserts, (b) assessing the frequency of human sequences that are lethal to the growth of E. coli, and (c) assessing the stability of human sequences cloned in M13 for large-scale sequencing projects.

  11. Spatially distributed potential evapotranspiration modeling and climate projections.

    PubMed

    Gharbia, Salem S; Smullen, Trevor; Gill, Laurence; Johnston, Paul; Pilla, Francesco

    2018-08-15

    Evapotranspiration integrates energy and mass transfer between the Earth's surface and atmosphere and is the most active mechanism linking the atmosphere, hydrosphere, lithosphere and biosphere. This study focuses on fine-resolution modeling and projection of spatially distributed potential evapotranspiration at the large catchment scale in response to climate change. Six potential evapotranspiration algorithms, systematically selected based on structured criteria and data availability, were applied and then validated against long-term mean monthly data for the Shannon River catchment with a 50 m² cell size. The best-validated algorithm was then applied to evaluate the possible effect of future climate change on potential evapotranspiration rates. Spatially distributed potential evapotranspiration projections were modeled based on climate change projections from multi-GCM ensembles for three future time intervals (2020, 2050 and 2080) using a range of Representative Concentration Pathways, producing four scenarios for each time interval. Finally, seasonal results were compared to baseline results to evaluate the impact of climate change on potential evapotranspiration and therefore on the catchment's dynamic water balance. The results present evidence that the modeled climate change scenarios would have a significant impact on future potential evapotranspiration rates. All the simulated scenarios predicted an increase in potential evapotranspiration for each modeled future time interval, which would significantly affect the dynamic catchment water balance. This study addresses the gap in the literature on using GIS-based algorithms to model fine-scale spatially distributed potential evapotranspiration for large catchment systems based on climatological observations and simulations in different climatological zones. Providing fine-scale potential evapotranspiration data is crucial for assessing the dynamic catchment water balance and setting up management scenarios for water abstractions. This study illustrates a transferable systematic method for designing GIS-based algorithms to simulate spatially distributed potential evapotranspiration for large catchment systems.
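    The abstract does not name the six algorithms; purely as an illustration, one widely used temperature-based candidate for this kind of gridded PET modeling is the Hargreaves-Samani equation, sketched below (the input values are hypothetical, not Shannon catchment data):

```python
import math

def hargreaves_pet(t_mean, t_max, t_min, ra):
    """Hargreaves-Samani potential evapotranspiration (mm/day).

    t_mean, t_max, t_min: daily air temperatures (deg C);
    ra: extraterrestrial radiation expressed as equivalent
        evaporation (mm/day).
    """
    return 0.0023 * ra * (t_mean + 17.8) * math.sqrt(t_max - t_min)

# Hypothetical mid-latitude day.
pet = hargreaves_pet(t_mean=15.0, t_max=20.0, t_min=10.0, ra=12.0)
```

    In a GIS setting the same formula is evaluated per raster cell over temperature and radiation grids, which is what makes a fine cell size computationally demanding at catchment scale.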

  12. The CELSS breadboard project: Plant production

    NASA Technical Reports Server (NTRS)

    Knott, William M.

    1990-01-01

    NASA's Breadboard Project for the Controlled Ecological Life Support System (CELSS) program is described. The simplified schematic of a CELSS is given. A modular approach is taken to building the CELSS Breadboard. Each module is researched in order to develop a data set for each one prior to its integration into the complete system. The data being obtained from the Biomass Production Module or the Biomass Production Chamber is examined. The other primary modules, food processing and resource recovery or waste management, are discussed briefly. The crew habitat module is not discussed. The primary goal of the Breadboard Project is to scale-up research data to an integrated system capable of supporting one person in order to establish feasibility for the development and operation of a CELSS. Breadboard is NASA's first attempt at developing a large scale CELSS.

  13. Anisotropic Galaxy-Galaxy Lensing in the Illustris-1 Simulation

    NASA Astrophysics Data System (ADS)

    Brainerd, Tereasa G.

    2017-06-01

    In Cold Dark Matter universes, the dark matter halos of galaxies are expected to be triaxial, leading to a surface mass density that is not circularly symmetric. In principle, this "flattening" of the dark matter halos of galaxies should be observable as an anisotropy in the weak galaxy-galaxy lensing signal. The degree to which the weak lensing signal is observed to be anisotropic, however, will depend strongly on the degree to which mass (i.e., the dark matter) is aligned with light in the lensing galaxies. That is, the anisotropy will be maximized when the major axis of the projected mass distribution is well aligned with the projected light distribution of the lens galaxies. Observational studies of anisotropic galaxy-galaxy lensing have found an anisotropic weak lensing signal around massive, red galaxies. Detecting the signal around blue, disky galaxies has, however, been more elusive. A possible explanation for this is that mass and light are well aligned within red galaxies and poorly aligned within blue galaxies (an explanation that is supported by studies of the locations of satellites of large, relatively isolated galaxies). Here we compute the weak lensing signal of isolated central galaxies in the Illustris-1 simulation. We compute the anisotropy of the weak lensing signal using two definitions of the geometry: [1] the major axis of the projected dark matter mass distribution and [2] the major axis of the projected stellar mass. On projected scales less than 15% of the virial radius, an anisotropy of order 10% is found for both definitions of the geometry. On larger scales, the anisotropy computed relative to the major axis of the projected light distribution is less than the anisotropy computed relative to the major axis of the projected dark matter. 
On projected scales of order the virial radius, the anisotropy obtained when using the major axis of the light is an order of magnitude less than the anisotropy obtained when using the major axis of the dark matter. The suppression of the anisotropy when using the major axis of the light to define the geometry is indicative of a significant misalignment of mass and light in the Illustris-1 galaxies at large physical radii.
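    The study's pipeline is not reproduced here, but the key geometric step, defining a projected major axis for either the dark matter or the stellar mass, can be sketched from mass-weighted second moments (an illustrative helper, not the authors' code):

```python
import math

def major_axis_angle(points, masses):
    """Position angle (radians, counterclockwise from the +x axis) of
    the major axis of a projected mass distribution, computed from
    its mass-weighted second moments Qxx, Qyy, Qxy."""
    m_tot = sum(masses)
    xc = sum(m * x for m, (x, y) in zip(masses, points)) / m_tot
    yc = sum(m * y for m, (x, y) in zip(masses, points)) / m_tot
    qxx = sum(m * (x - xc) ** 2 for m, (x, y) in zip(masses, points)) / m_tot
    qyy = sum(m * (y - yc) ** 2 for m, (x, y) in zip(masses, points)) / m_tot
    qxy = sum(m * (x - xc) * (y - yc) for m, (x, y) in zip(masses, points)) / m_tot
    return 0.5 * math.atan2(2.0 * qxy, qxx - qyy)

# Particles stretched along the y = x direction -> 45-degree major axis.
angle = major_axis_angle([(-1.0, -1.0), (1.0, 1.0), (2.0, 2.0)], [1.0, 1.0, 1.0])
```

    The lensing signal would then be binned in azimuth relative to this angle, once for the dark matter particles and once for the stellar particles, to compare the two definitions of the geometry.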

  14. Extreme weather: Subtropical floods and tropical cyclones

    NASA Astrophysics Data System (ADS)

    Shaevitz, Daniel A.

    Extreme weather events have a large effect on society. As such, it is important to understand these events and to project how they may change in a future, warmer climate. The aim of this thesis is to develop a deeper understanding of two types of extreme weather events: subtropical floods and tropical cyclones (TCs). In the subtropics, the latitude is high enough that quasi-geostrophic dynamics are at least qualitatively relevant, while low enough that moisture may be abundant and convection strong. Extratropical extreme precipitation events are usually associated with large-scale flow disturbances, strong ascent, and large latent heat release. In the first part of this thesis, I examine the possible triggering of convection by the large-scale dynamics and investigate the coupling between the two. Specifically, two examples of extreme precipitation events in the subtropics are analyzed: the 2010 and 2014 floods of India and Pakistan and the 2015 flood of Texas and Oklahoma. I invert the quasi-geostrophic omega equation to decompose the large-scale vertical motion profile into components due to synoptic forcing and diabatic heating. Additionally, I present model results from within the Column Quasi-Geostrophic framework. A single-column model and cloud-resolving model are forced with the large-scale forcings (other than large-scale vertical motion) computed from the quasi-geostrophic omega equation with input data from a reanalysis data set, and the large-scale vertical motion is diagnosed interactively with the simulated convection. It is found that convection was triggered primarily by mechanically forced orographic ascent over the Himalayas during the India/Pakistan flood and by upper-level Potential Vorticity disturbances during the Texas/Oklahoma flood. 
Furthermore, a climate attribution analysis was conducted for the Texas/Oklahoma flood; it found that anthropogenic climate change was responsible for a small amount of rainfall during the event, but that the intensity of such an event may be greatly increased if it occurs in a future climate. In the second part of this thesis, I examine the ability of high-resolution global atmospheric models to simulate TCs. Specifically, I present an intercomparison of several models' ability to simulate the global characteristics of TCs in the current climate. This is a necessary first step before using these models to project future changes in TCs. Overall, the models were able to reproduce the geographic distribution of TCs reasonably well, with some of the models performing remarkably well. The intensity of TCs varied widely between the models, with some of this difference being due to model resolution.

  15. A fast time-difference inverse solver for 3D EIT with application to lung imaging.

    PubMed

    Javaherian, Ashkan; Soleimani, Manuchehr; Moeller, Knut

    2016-08-01

    A class of sparse optimization techniques that require only matrix-vector products, rather than explicit access to the forward matrix and its transpose, has received much attention in the recent decade for dealing with large-scale inverse problems. This study tailors the application of the so-called Gradient Projection for Sparse Reconstruction (GPSR) algorithm to large-scale time-difference three-dimensional electrical impedance tomography (3D EIT). 3D EIT typically suffers from the need for a large number of voxels to cover the whole domain, so its application to real-time imaging, for example monitoring of lung function, remains scarce, since the large number of degrees of freedom greatly increases storage space and reconstruction time. This study shows the great potential of GPSR for large-size time-difference 3D EIT. Further studies are needed to improve its accuracy for imaging small-size anomalies.
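    GPSR works on the split x = u - v with u, v >= 0, taking projected gradient steps that touch the forward matrix A only through matrix-vector products. A toy, fixed-step sketch of that idea follows (not the authors' implementation, and far simpler than a production solver):

```python
def matvec(A, x):
    """y = A x for a dense matrix stored as a list of rows."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def rmatvec(A, y):
    """z = A^T y, again using only row access to A."""
    return [sum(A[i][j] * y[i] for i in range(len(A))) for j in range(len(A[0]))]

def gpsr_basic(A, b, tau, step=0.1, iters=500):
    """Toy GPSR-style solver for min 0.5*||Ax - b||^2 + tau*||x||_1.

    Splits x = u - v with u, v >= 0 and applies fixed-step projected
    gradient descent; A enters only via matvec/rmatvec.
    """
    n = len(A[0])
    u, v = [0.0] * n, [0.0] * n
    for _ in range(iters):
        x = [ui - vi for ui, vi in zip(u, v)]
        resid = [yi - bi for yi, bi in zip(matvec(A, x), b)]
        g = rmatvec(A, resid)  # gradient of the quadratic term
        # grad w.r.t. u is g + tau, w.r.t. v is -g + tau;
        # max(0, .) projects each step back onto the feasible set.
        u = [max(0.0, ui - step * (gi + tau)) for ui, gi in zip(u, g)]
        v = [max(0.0, vi - step * (-gi + tau)) for vi, gi in zip(v, g)]
    return [ui - vi for ui, vi in zip(u, v)]

# With A = I the minimizer is soft-thresholding of b by tau.
x = gpsr_basic([[1.0, 0.0], [0.0, 1.0]], [1.0, 0.2], tau=0.1)
```

    In the EIT setting the payoff is that A never needs to be formed or transposed explicitly, which is what makes the approach viable when the voxel count is large.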

  16. User requirements for project-oriented remote sensing

    NASA Technical Reports Server (NTRS)

    Hitchcock, H. C.; Baxter, F. P.; Cox, T. L.

    1975-01-01

    Registration of remotely sensed data to geodetic coordinates provides for overlay analysis of land use data. For aerial photographs of a large area, differences in scales, dates, and film types are reconciled, and multispectral scanner data are machine registered at the time of acquisition.

  17. Estimating ecosystem service changes as a precursor to modeling

    EPA Science Inventory

    EPA's Future Midwestern Landscapes Study will project changes in ecosystem services (ES) for alternative future policy scenarios in the Midwestern U.S. Doing so for detailed landscapes over large spatial scales will require serial application of economic and ecological models. W...

  18. UPDATE ON THE MARINA STUDY ON LAKE TEXOMA

    EPA Science Inventory

    The National Risk Management Research Laboratory (NRMRL) has instituted a program for Risk Management Research for Ecosystem Restoration in Watersheds. As part of this program a large scale project was initiated on Lake Texoma and the surrounding watershed to evaluate the assimi...

  19. The World's Largest Photovoltaic Concentrator System.

    ERIC Educational Resources Information Center

    Smith, Harry V.

    1982-01-01

    The Mississippi County Community College large-scale energy experiment, featuring the emerging high technology of solar electricity, is described. The project includes a building designed for solar electricity and a power plant consisting of a total energy photovoltaic system, and features two experimental developments. (MLW)

  20. Modeling Watershed Mercury Response to Atmospheric Loadings: Response Time and Challenges

    EPA Science Inventory

    The relationship between sources of mercury to watersheds and its fate in surface waters is invariably complex. Large scale monitoring studies, such as the METAALICUS project, have advanced current understanding of the links between atmospheric deposition of mercury and accumulat...

  1. Education for Professional Engineering Practice

    ERIC Educational Resources Information Center

    Bramhall, Mike D.; Short, Chris

    2014-01-01

    This paper reports on a funded collaborative large-scale curriculum innovation and enhancement project undertaken as part of a UK National Higher Education Science, Technology Engineering and Mathematics (STEM) programme. Its aim was to develop undergraduate curricula to teach appropriate skills for professional engineering practice more…

  2. Evaluating Green/Gray Infrastructure for CSO/Stormwater Control

    EPA Science Inventory

    The NRMRL is conducting this project to evaluate the water quality and quantity benefits of a large-scale application of green infrastructure (low-impact development/best management practices) retrofits in an entire subcatchment. It will document ORD's effort to demonstrate the e...

  3. Toward server-side, high performance climate change data analytics in the Earth System Grid Federation (ESGF) eco-system

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; Williams, Dean; Aloisio, Giovanni

    2016-04-01

    In many scientific domains such as climate, data is often n-dimensional and requires tools that support specialized data types and primitives to be properly stored, accessed, analysed and visualized. Moreover, new challenges arise in large-scale scenarios and eco-systems where petabytes (PB) of data can be available and data can be distributed and/or replicated (e.g., the Earth System Grid Federation (ESGF) serving the Coupled Model Intercomparison Project, Phase 5 (CMIP5) experiment, providing access to 2.5PB of data for the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5)). Most of the tools currently available for scientific data analysis in the climate domain fail at large scale since they: (1) are desktop based and need the data locally; (2) are sequential, so do not benefit from available multicore/parallel machines; (3) do not provide declarative languages to express scientific data analysis tasks; (4) are domain-specific, which ties their adoption to a specific domain; and (5) do not provide workflow support to enable the definition of complex "experiments". The Ophidia project aims to address most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes ("datacubes"). The project relies on a strong background in high-performance database management and OLAP systems to manage large scientific data sets. It also provides native workflow management support, to define processing chains and workflows with tens to hundreds of data analytics operators to build real scientific use cases. 
With regard to interoperability aspects, the talk will present the contribution provided both to the RDA Working Group on Array Databases, and the Earth System Grid Federation (ESGF) Compute Working Team. Also highlighted will be the results of large scale climate model intercomparison data analysis experiments, for example: (1) defined in the context of the EU H2020 INDIGO-DataCloud project; (2) implemented in a real geographically distributed environment involving CMCC (Italy) and LLNL (US) sites; (3) exploiting Ophidia as server-side, parallel analytics engine; and (4) applied on real CMIP5 data sets available through ESGF.

  4. Implementing large-scale workforce change: learning from 55 pilot sites of allied health workforce redesign in Queensland, Australia

    PubMed Central

    2013-01-01

    Background Increasingly, health workforces are undergoing high-level ‘re-engineering’ to help them better meet the needs of the population, workforce and service delivery. Queensland Health implemented a large scale 5-year workforce redesign program across more than 13 health-care disciplines. This study synthesized the findings from this program to identify and codify mechanisms associated with successful workforce redesign to help inform other large workforce projects. Methods This study used Inductive Logic Reasoning (ILR), a process that uses logic models as the primary functional tool to develop theories of change, which are subsequently validated through proposition testing. Initial theories of change were developed from a systematic review of the literature and synthesized using a logic model. These theories of change were then developed into propositions and subsequently tested empirically against documentary, interview, and survey data from 55 projects in the workforce redesign program. Results Three overarching principles were identified that optimized successful workforce redesign: (1) drivers for change need to be close to practice; (2) contexts need to be supportive both at the local levels and legislatively; and (3) mechanisms should include appropriate engagement, resources to facilitate change management, governance, and support structures. Attention to these factors was uniformly associated with success of individual projects. Conclusions ILR is a transparent and reproducible method for developing and testing theories of workforce change. Despite the heterogeneity of projects, professions, and approaches used, a consistent set of overarching principles underpinned success of workforce change interventions. These concepts have been operationalized into a workforce change checklist. PMID:24330616

  5. A Canonical Response in Rainfall Characteristics to Global Warming: Projections by IPCC CMIP5 Models

    NASA Technical Reports Server (NTRS)

    Lau, William K. M.; Wu, H. T.; Kim, K. M.

    2012-01-01

    Changes in rainfall characteristics induced by global warming are examined based on probability distribution function (PDF) analysis of outputs from 14 IPCC (Intergovernmental Panel on Climate Change) CMIP5 (5th Coupled Model Intercomparison Project) models under various scenarios of increased CO2 emissions. Results show that collectively the CMIP5 models project a robust and consistent global and regional rainfall response to CO2 warming. Globally, the models show a 1-3% increase in rainfall per degree rise in temperature, with a canonical response featuring a large increase (100-250%) in the frequency of occurrence of very heavy rain, a reduction (5-10%) of moderate rain, and an increase (10-15%) of light rain events. Regionally, even though details vary among models, a majority of the models (>10 out of 14) project a consistent large-scale response with more heavy rain events in climatologically wet regions, most pronounced in the Pacific ITCZ and the Asian monsoon. Moderate rain events are found to decrease over extensive regions of the subtropical and extratropical oceans, but increase over the extratropical land regions and the Southern Ocean. The spatial distribution of light rain resembles that of moderate rain, but mostly with opposite polarity. The majority of the models also show an increase in the number of dry events (absence or only trace amounts of rain) over subtropical and tropical land regions in both hemispheres. These results suggest that rainfall characteristics are changing, and that increased extreme rainfall events and drought occurrences are connected, as a consequence of a global adjustment of the large-scale circulation to global warming.

  6. Projected changes to precipitation extremes over the Canadian Prairies using multi-RCM ensemble

    NASA Astrophysics Data System (ADS)

    Masud, M. B.; Khaliq, M. N.; Wheater, H. S.

    2016-12-01

    Information on projected changes to precipitation extremes is needed for future planning of urban drainage infrastructure and storm water management systems, and to sustain socio-economic activities and ecosystems at local, regional and other scales of interest. This study explores projected changes to seasonal (April-October) precipitation extremes at daily, hourly and sub-hourly scales over the Canadian Prairie Provinces of Alberta, Saskatchewan, and Manitoba, based on the North American Regional Climate Change Assessment Program (NARCCAP) multi-Regional Climate Model (RCM) ensemble and regional frequency analysis. The performance of each RCM is evaluated with regard to boundary and performance errors in order to study various sources of uncertainty and the impact of large-scale driving fields. In the absence of RCM-simulated short-duration extremes, a framework is developed to derive changes to extremes of these durations. Results from this research reveal that the relative changes in sub-hourly extremes are higher than those in the hourly and daily extremes. Overall, projected changes in precipitation extremes are larger for southeastern parts of this region than southern and northern areas, and smaller for southwestern and western parts of the study area.

  7. A controlled field pilot for testing near surface CO2 detection techniques and transport models

    USGS Publications Warehouse

    Spangler, L.H.; Dobeck, L.M.; Repasky, K.; Nehrir, A.; Humphries, S.; Keith, C.; Shaw, J.; Rouse, J.; Cunningham, A.; Benson, S.; Oldenburg, C.M.; Lewicki, J.L.; Wells, A.; Diehl, R.; Strazisar, B.; Fessenden, J.; Rahn, Thomas; Amonette, J.; Barr, J.; Pickles, W.; Jacobson, J.; Silver, E.; Male, E.; Rauch, H.; Gullickson, K.; Trautz, R.; Kharaka, Y.; Birkholzer, J.; Wielopolski, L.

    2009-01-01

    A field facility has been developed to allow controlled studies of near surface CO2 transport and detection technologies. The key component of the facility is a shallow, slotted horizontal well divided into six zones. The scale and fluxes were designed to address large scale CO2 storage projects and desired retention rates for those projects. A wide variety of detection techniques were deployed by collaborators from 6 national labs, 2 universities, EPRI, and the USGS. Additionally, modeling of CO2 transport and concentrations in the saturated soil and in the vadose zone was conducted. An overview of these results will be presented.

  8. The Human Genome Project: big science transforms biology and medicine.

    PubMed

    Hood, Leroy; Rowen, Lee

    2013-01-01

    The Human Genome Project has transformed biology through its integrated big science approach to deciphering a reference human genome sequence along with the complete sequences of key model organisms. The project exemplifies the power, necessity and success of large, integrated, cross-disciplinary efforts - so-called 'big science' - directed towards complex major objectives. In this article, we discuss the ways in which this ambitious endeavor led to the development of novel technologies and analytical tools, and how it brought the expertise of engineers, computer scientists and mathematicians together with biologists. It established an open approach to data sharing and open-source software, thereby making the data resulting from the project accessible to all. The genome sequences of microbes, plants and animals have revolutionized many fields of science, including microbiology, virology, infectious disease and plant biology. Moreover, deeper knowledge of human sequence variation has begun to alter the practice of medicine. The Human Genome Project has inspired subsequent large-scale data acquisition initiatives such as the International HapMap Project, 1000 Genomes, and The Cancer Genome Atlas, as well as the recently announced Human Brain Project and the emerging Human Proteome Project.

  10. A decision support system for map projections of small scale data

    USGS Publications Warehouse

    Finn, Michael P.; Usery, E. Lynn; Posch, Stephan T.; Seong, Jeong Chang

    2004-01-01

    The use of commercial geographic information system software to process large raster datasets of terrain elevation, population, land cover, vegetation, soils, temperature, and rainfall requires both projection from spherical coordinates to plane coordinate systems and transformation from one plane system to another. Decision support systems deliver information resulting in knowledge that assists in policies, priorities, or processes. This paper presents an approach to handling the problems of raster dataset projection and transformation through the development of a Web-enabled decision support system to aid users of transformation processes with the selection of appropriate map projections based on data type, areal extent, location, and preservation properties.
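    The decision support system chooses among many candidate projections; as a self-contained illustration of the underlying spherical-to-plane step (my own toy example, not the system's code), the forward sinusoidal equal-area projection is about as simple as such a transformation gets:

```python
import math

R = 6371007.0  # assumed authalic sphere radius, metres

def sinusoidal_forward(lat_deg, lon_deg, lon0_deg=0.0):
    """Forward sinusoidal (equal-area) projection: spherical
    coordinates in degrees -> plane x, y in metres."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg - lon0_deg)
    # Meridian spacing shrinks with cos(lat), which preserves area.
    return R * lon * math.cos(lat), R * lat

x_eq, y_eq = sinusoidal_forward(0.0, 90.0)   # on the equator
x_60, y_60 = sinusoidal_forward(60.0, 90.0)  # same meridian, 60 N
```

    A real raster reprojection additionally needs the inverse transform, datum handling, and resampling; weighing those properties against data type and areal extent is precisely the choice the decision support system assists with.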

  11. Flat-plate solar array project of the US Department of Energy's National Photovoltaics Program: Ten years of progress

    NASA Technical Reports Server (NTRS)

    Christensen, Elmer

    1985-01-01

    The Flat-Plate Solar Array (FSA) Project, a Government-sponsored photovoltaics project, was initiated in January 1975 (previously named the Low-Cost Silicon Solar Array Project) to stimulate the development of PV systems for widespread use. Its goal then was to develop PV modules with 10% efficiency, a 20-year lifetime, and a selling price of $0.50 per peak watt of generating capacity (1975 dollars). It was recognized that cost reduction of PV solar-cell and module manufacturing was the key achievement needed if PV power systems were to be economically competitive for large-scale terrestrial use.

  12. Computational Everyday Life Human Behavior Model as Servicable Knowledge

    NASA Astrophysics Data System (ADS)

    Motomura, Yoichi; Nishida, Yoshifumi

    The project called 'Open life matrix' is not only a research activity but also real-world problem solving in the form of action research. This concept is realized through large-scale data collection, construction of probabilistic causal structure models, and provision of information services based on those models. One concrete outcome of the project is a childhood injury prevention activity run by a new team consisting of a hospital, government agencies, and researchers from many fields. The main result of the project is a general methodology for applying probabilistic causal structure models as serviceable knowledge for action research. In this paper, we summarize the project and discuss future directions that emphasize action research driven by artificial intelligence technology.
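As a toy illustration of what a probabilistic causal structure model can deliver as serviceable knowledge, the sketch below applies the law of total probability and Bayes' rule to a single hypothetical injury-risk factor. The variable names and all probabilities are invented for illustration; the project's actual models are far larger:

```python
# Toy two-node causal model: hazard -> injury. All numbers are invented.
p_hazard = 0.3                    # P(hazard present in the home)
p_injury_given_hazard = 0.10      # P(injury | hazard)
p_injury_given_no_hazard = 0.01   # P(injury | no hazard)

def p_injury():
    # Law of total probability over the hazard variable.
    return (p_injury_given_hazard * p_hazard
            + p_injury_given_no_hazard * (1 - p_hazard))

def p_hazard_given_injury():
    # Bayes' rule: diagnose the likely cause after an observed injury.
    return p_injury_given_hazard * p_hazard / p_injury()
```

Queries of this diagnostic form ("given an injury, which hazard was most likely responsible?") are what make such a model directly usable by prevention teams.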

  13. Health impact assessment of industrial development projects: a spatio-temporal visualization.

    PubMed

    Winkler, Mirko S; Krieger, Gary R; Divall, Mark J; Singer, Burton H; Utzinger, Jürg

    2012-05-01

    Development and implementation of large-scale industrial projects in complex eco-epidemiological settings typically require combined environmental, social and health impact assessments. We present a generic, spatio-temporal health impact assessment (HIA) visualization, which can be readily adapted to specific projects and key stakeholders, including poorly literate communities that might be affected by consequences of a project. We illustrate how the occurrence of a variety of complex events can be utilized for stakeholder communication, awareness creation, interactive learning as well as formulating HIA research and implementation questions. Methodological features are highlighted in the context of an iron ore development in a rural part of Africa.

  14. On initial Brain Activity Mapping of episodic and semantic memory code in the hippocampus.

    PubMed

    Tsien, Joe Z; Li, Meng; Osan, Remus; Chen, Guifen; Lin, Longian; Wang, Phillip Lei; Frey, Sabine; Frey, Julietta; Zhu, Dajiang; Liu, Tianming; Zhao, Fang; Kuang, Hui

    2013-10-01

    It has been widely recognized that understanding the brain code will require large-scale recording and decoding of brain activity patterns. In 2007, with support from the Georgia Research Alliance, we launched the Brain Decoding Project Initiative, based on an idea now similarly advocated by the BRAIN project and the Brain Activity Map proposal. As planning for the BRAIN project is currently underway, we share our insights and lessons from our efforts in mapping real-time episodic memory traces in the hippocampus of freely behaving mice. We show that appropriate large-scale statistical methods are essential to decipher and measure real-time memory traces and neural dynamics. We also provide an example of how carefully designed, sometimes out-of-the-box behavioral paradigms can be highly instrumental in unraveling the organizing principles of memory-coding cell assemblies in the hippocampus. Our observations to date have led us to conclude that the specific-to-general categorical and combinatorial feature-coding cell assembly mechanism represents an emergent property that enables neural networks to generate and organize not only episodic memory, but also semantic knowledge and imagination. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.

  15. Large Scale Data Mining to Improve Usability of Data: An Intelligent Archive Testbed

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram; Isaac, David; Yang, Wenli; Morse, Steve

    2005-01-01

    Research in certain scientific disciplines - including Earth science, particle physics, and astrophysics - continually faces the challenge that the volume of data needed to perform valid scientific research can at times overwhelm even a sizable research community. The desire to improve utilization of this data gave rise to the Intelligent Archives project, which seeks to make data archives active participants in a knowledge building system capable of discovering events or patterns that represent new information or knowledge. Data mining can automatically discover patterns and events, but it is generally viewed as unsuited for large-scale use in disciplines like Earth science that routinely involve very high data volumes. Dozens of research projects have shown promising uses of data mining in Earth science, but all of these are based on experiments with data subsets of a few gigabytes or less, rather than the terabytes or petabytes typically encountered in operational systems. To bridge this gap, the Intelligent Archives project is establishing a testbed with the goal of demonstrating the use of data mining techniques in an operationally-relevant environment. This paper discusses the goals of the testbed and the design choices surrounding critical issues that arose during testbed implementation.

  16. On Initial Brain Activity Mapping of Associative Memory Code in the Hippocampus

    PubMed Central

    Tsien, Joe Z.; Li, Meng; Osan, Remus; Chen, Guifen; Lin, Longian; Lei Wang, Phillip; Frey, Sabine; Frey, Julietta; Zhu, Dajiang; Liu, Tianming; Zhao, Fang; Kuang, Hui

    2013-01-01

    It has been widely recognized that understanding the brain code will require large-scale recording and decoding of brain activity patterns. In 2007, with support from the Georgia Research Alliance, we launched the Brain Decoding Project Initiative, based on an idea now similarly advocated by the BRAIN project and the Brain Activity Map proposal. As planning for the BRAIN project is currently underway, we share our insights and lessons from our efforts in mapping real-time episodic memory traces in the hippocampus of freely behaving mice. We show that appropriate large-scale statistical methods are essential to decipher and measure real-time memory traces and neural dynamics. We also provide an example of how carefully designed, sometimes out-of-the-box behavioral paradigms can be highly instrumental in unraveling the organizing principles of memory-coding cell assemblies in the hippocampus. Our observations to date have led us to conclude that the specific-to-general categorical and combinatorial feature-coding cell assembly mechanism represents an emergent property that enables neural networks to generate and organize not only episodic memory, but also semantic knowledge and imagination. PMID:23838072

  17. The Saskatchewan River Basin - a large scale observatory for water security research (Invited)

    NASA Astrophysics Data System (ADS)

    Wheater, H. S.

    2013-12-01

    The 336,000 km2 Saskatchewan River Basin (SaskRB) in Western Canada illustrates many of the issues of Water Security faced world-wide. It poses globally-important science challenges due to the diversity in its hydro-climate and ecological zones. With one of the world's more extreme climates, it embodies environments of global significance, including the Rocky Mountains (source of the major rivers in Western Canada), the Boreal Forest (representing 30% of Canada's land area) and the Prairies (home to 80% of Canada's agriculture). Management concerns include: provision of water resources to more than three million inhabitants, including indigenous communities; balancing competing needs for water between different uses, such as urban centres, industry, agriculture, hydropower and environmental flows; issues of water allocation between upstream and downstream users in the three prairie provinces; managing the risks of flood and droughts; and assessing water quality impacts of discharges from major cities and intensive agricultural production. Superimposed on these issues is the need to understand and manage uncertain water futures, including effects of economic growth and environmental change, in a highly fragmented water governance environment. Key science questions focus on understanding and predicting the effects of land and water management and environmental change on water quantity and quality. To address the science challenges, observational data are necessary across multiple scales. This requires focussed research at intensively monitored sites and small watersheds to improve process understanding and fine-scale models. To understand large-scale effects on river flows and quality, land-atmosphere feedbacks, and regional climate, integrated monitoring, modelling and analysis is needed at large basin scale. 
And to support water management, new tools are needed for operational management and scenario-based planning that can be implemented across multiple scales and multiple jurisdictions. The SaskRB has therefore been developed as a large scale observatory, now a Regional Hydroclimate Project of the World Climate Research Programme's GEWEX project, and is available to contribute to the emerging North American Water Program. State-of-the-art hydro-ecological experimental sites have been developed for the key biomes, and a river and lake biogeochemical research facility, focussed on impacts of nutrients and exotic chemicals. Data are integrated at SaskRB scale to support the development of improved large scale climate and hydrological modelling products, the development of decision support systems for local, provincial and basin-scale management, and the development of related social science research, engaging stakeholders in the research and exploring their values and priorities for water security. The observatory provides multiple scales of observation and modelling required to develop: a) new climate, hydrological and ecological science and modelling tools to address environmental change in key environments, and their integrated effects and feedbacks at large catchment scale, b) new tools needed to support river basin management under uncertainty, including anthropogenic controls on land and water management and c) the place-based focus for the development of new transdisciplinary science.

  18. FR-type radio sources in COSMOS: relation of radio structure to size, accretion modes and large-scale environment

    NASA Astrophysics Data System (ADS)

    Vardoulaki, Eleni; Faustino Jimenez Andrade, Eric; Delvecchio, Ivan; Karim, Alexander; Smolčić, Vernesa; Magnelli, Benjamin; Bertoldi, Frank; Schinnener, Eva; Sargent, Mark; Finoguenov, Alexis; VLA COSMOS Team

    2018-01-01

    The radio sources associated with active galactic nuclei (AGN) can exhibit a variety of radio structures, from simple to more complex, giving rise to a variety of classification schemes. The question which still remains open, as deeper surveys reveal new populations of radio sources, is whether this plethora of radio structures can be attributed to the physical properties of the host or to the environment. Here we present an analysis of the radio structure of radio-selected AGN from the VLA-COSMOS Large Project at 3 GHz (JVLA-COSMOS; Smolčić et al.) in relation to: 1) their projected linear size, 2) their Eddington ratio, and 3) the environment in which their hosts lie. We classify these as FRI (jet-like) and FRII (lobe-like) based on the FR-type classification scheme, and compare them to a sample of jet-less radio AGN in JVLA-COSMOS. We measure their projected linear sizes using a semi-automatic machine learning technique. Their Eddington ratios are calculated from X-ray data available for COSMOS. As environmental probes we take the X-ray groups (hundreds of kpc) and the density fields (~Mpc scale) in COSMOS. We find that FRII radio sources are on average larger than FRIs, in agreement with the literature. But contrary to past studies, we find no dichotomy between FR objects in JVLA-COSMOS in terms of their Eddington ratios, as on average they exhibit similar values. Furthermore, our results show that the large-scale environment does not explain the observed dichotomy between lobe- and jet-like FR-type objects, as both types are found in similar environments; it does, however, affect the shape of the radio structure, introducing bends for objects closer to the centre of an X-ray group.

  19. The effects of scale on the costs of targeted HIV prevention interventions among female and male sex workers, men who have sex with men and transgenders in India

    PubMed Central

    Guinness, L; Kumaranayake, L; Reddy, Bhaskar; Govindraj, Y; Vickerman, P; Alary, M

    2010-01-01

    Background The India AIDS Initiative (Avahan) project is involved in rapid scale-up of HIV-prevention interventions in high-risk populations. This study examines the cost variation of 107 non-governmental organisations (NGOs) implementing targeted interventions, over the start up (defined as period from project inception until services to the key population commenced) and first 2 years of intervention. Methods The Avahan interventions for female and male sex workers and their clients, in 62 districts of four southern states were costed for the financial years 2004/2005 and 2005/2006 using standard costing techniques. Data sources include financial and economic costs from the lead implementing partners (LPs) and subcontracted local implementing NGOs retrospectively and prospectively collected from a provider perspective. Ingredients and step-down allocation processes were used. Outcomes were measured using routinely collected project data. The average costs were estimated and a regression analysis carried out to explore causes of cost variation. Costs were calculated in US$ 2006. Results The total number of registered people was 134 391 at the end of 2 years, and 124 669 had used STI services during that period. The median average cost of Avahan programme for this period was $76 per person registered with the project. Sixty-one per cent of the cost variation could be explained by scale (positive association), number of NGOs per district (negative), number of LPs in the state (negative) and project maturity (positive) (p<0.0001). Conclusions During rapid scale-up in the initial phase of the Avahan programme, a significant reduction in average costs was observed. As full scale-up had not yet been achieved, the average cost at scale is yet to be realised and the extent of the impact of scale on costs yet to be captured. Scale effects are important to quantify for planning resource requirements of large-scale interventions. 
The average cost after 2 years is within the range of global scale-up cost estimates and other studies in India. PMID:20167740

  20. The effects of scale on the costs of targeted HIV prevention interventions among female and male sex workers, men who have sex with men and transgenders in India.

    PubMed

    Chandrashekar, S; Guinness, L; Kumaranayake, L; Reddy, Bhaskar; Govindraj, Y; Vickerman, P; Alary, M

    2010-02-01

    The India AIDS Initiative (Avahan) project is involved in rapid scale-up of HIV-prevention interventions in high-risk populations. This study examines the cost variation of 107 non-governmental organisations (NGOs) implementing targeted interventions, over the start up (defined as period from project inception until services to the key population commenced) and first 2 years of intervention. The Avahan interventions for female and male sex workers and their clients, in 62 districts of four southern states were costed for the financial years 2004/2005 and 2005/2006 using standard costing techniques. Data sources include financial and economic costs from the lead implementing partners (LPs) and subcontracted local implementing NGOs retrospectively and prospectively collected from a provider perspective. Ingredients and step-down allocation processes were used. Outcomes were measured using routinely collected project data. The average costs were estimated and a regression analysis carried out to explore causes of cost variation. Costs were calculated in US$ 2006. The total number of registered people was 134,391 at the end of 2 years, and 124,669 had used STI services during that period. The median average cost of Avahan programme for this period was $76 per person registered with the project. Sixty-one per cent of the cost variation could be explained by scale (positive association), number of NGOs per district (negative), number of LPs in the state (negative) and project maturity (positive) (p<0.0001). During rapid scale-up in the initial phase of the Avahan programme, a significant reduction in average costs was observed. As full scale-up had not yet been achieved, the average cost at scale is yet to be realised and the extent of the impact of scale on costs yet to be captured. Scale effects are important to quantify for planning resource requirements of large-scale interventions. 
The average cost after 2 years is within the range of global scale-up cost estimates and other studies in India.

  1. Exploiting Synoptic-Scale Climate Processes to Develop Nonstationary, Probabilistic Flood Hazard Projections

    NASA Astrophysics Data System (ADS)

    Spence, C. M.; Brown, C.; Doss-Gollin, J.

    2016-12-01

    Climate model projections are commonly used for water resources management and planning under nonstationarity, but they do not reliably reproduce intense short-term precipitation and are instead more skilled at broader spatial scales. To provide a credible estimate of flood trend that reflects climate uncertainty, we present a framework that exploits the connections between synoptic-scale oceanic and atmospheric patterns and local-scale flood-producing meteorological events to develop long-term flood hazard projections. We demonstrate the method for the Iowa River, where high-flow episodes have been found to correlate with tropical moisture exports associated with a pressure dipole across the eastern continental United States. We characterize the relationship between flooding on the Iowa River and this pressure dipole through a nonstationary Pareto-Poisson peaks-over-threshold probability distribution estimated from the historic record. We then combine a trend analysis of the dipole index in the historic record with a trend analysis of the dipole index as simulated by General Circulation Models (GCMs) under climate change conditions through a Bayesian framework. The resulting nonstationary posterior distribution of the dipole index, combined with the dipole-conditioned peaks-over-threshold flood frequency model, connects local flood hazard to changes in large-scale atmospheric pressure and circulation patterns that are related to flooding in a process-driven framework. The Iowa River example demonstrates that the resulting nonstationary, probabilistic flood hazard projection may be used to inform risk-based flood adaptation decisions.
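The Poisson-GPD (peaks-over-threshold) structure described above can be sketched in a few lines. The parameter values and the log-linear link between the dipole index and the exceedance rate below are illustrative assumptions; in the study, all parameters are estimated from the flood record and GCM output within a Bayesian framework:

```python
import math

# Nonstationary peaks-over-threshold sketch: the annual rate of threshold
# exceedances depends log-linearly on a climate covariate (the pressure
# dipole index), and exceedance magnitudes follow a generalized Pareto
# distribution (GPD). All parameter values here are invented.
def return_level(T, dipole, u=500.0, sigma=120.0, xi=0.1, b0=0.0, b1=0.5):
    """T-year flood level (same units as the threshold u) when the
    Poisson exceedance rate varies with the dipole index."""
    lam = math.exp(b0 + b1 * dipole)            # exceedances per year
    return u + (sigma / xi) * ((lam * T) ** xi - 1.0)
```

With a positive coefficient on the dipole index, a shift toward dipole-favorable conditions raises the diagnosed 100-year flood level, which is the kind of trend signal the framework propagates into risk-based decisions.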

  2. Environmental impacts of large-scale CSP plants in northwestern China.

    PubMed

    Wu, Zhiyong; Hou, Anping; Chang, Chun; Huang, Xiang; Shi, Duoqi; Wang, Zhifeng

    2014-01-01

    Several concentrated solar power demonstration plants are being constructed, and a few commercial plants have been announced in northwestern China. However, the mutual impacts between concentrated solar power plants and their surrounding environments have not yet been addressed comprehensively in the literature by the parties involved in these projects. In China, these projects are especially important as an increasing amount of low-carbon electricity needs to be generated in order to maintain current economic growth while simultaneously lessening pollution. In this study, the authors assess the potential environmental impacts of large-scale concentrated solar power plants. Specifically, the water use intensity, soil erosion and soil temperature are quantitatively examined. It was found that some of the impacts are favorable, some are negative relative to traditional power generation techniques, and some need further research before they can be reasonably appraised. In quantitative terms, concentrated solar power plants consume about 4000 L MW(-1) h(-1) of water if wet cooling technology is used, and the collectors lead to soil temperature changes of between 0.5 and 4 °C; however, soil erosion is dramatically alleviated. The results of this study are helpful to decision-makers in concentrated solar power site selection and regional planning. Some conclusions of this study are also valid for large-scale photovoltaic plants.
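The reported wet-cooling figure of about 4000 L per MWh lends itself to a quick back-of-envelope check; the plant capacity and capacity factor used below are hypothetical:

```python
# Sanity check on the abstract's wet-cooling water intensity figure.
WATER_L_PER_MWH = 4000.0   # from the abstract

def annual_water_m3(capacity_mw, capacity_factor, hours=8760):
    """Annual cooling-water use in cubic metres for a plant of the
    given nameplate capacity and capacity factor (both hypothetical)."""
    mwh = capacity_mw * hours * capacity_factor   # annual generation
    return mwh * WATER_L_PER_MWH / 1000.0         # litres -> m^3
```

For a hypothetical 100 MW plant at a 25% capacity factor this comes to 876,000 m³ of water per year, which makes clear why water use intensity matters in an arid region.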

  3. Response of wheat yield in Spain to large-scale patterns

    NASA Astrophysics Data System (ADS)

    Hernandez-Barrera, Sara; Rodriguez-Puebla, Concepcion

    2016-04-01

    Crops are vulnerable to extreme climate conditions such as drought, heat stress and frost risk. In a previous study we quantified the influence of these climate conditions on winter wheat in Spain (Hernandez-Barrera et al. 2015). These climate extremes respond to large-scale atmospheric and oceanic patterns, so a question emerges in our investigation: how do large-scale patterns affect wheat yield? Obtaining and understanding these relationships requires different approaches. In this study, we first obtained the leading mode of observed wheat yield variability to characterize the common variability over different provinces in Spain. Then, the wheat variability is related to different modes of mean sea level pressure, the jet stream and sea surface temperature using Partial Least Squares regression, which captures the relevant climate drivers accounting for variations in wheat yield from sowing to harvesting. We used the ERA-Interim reanalysis data and the Extended Reconstructed Sea Surface Temperature dataset (ERSST v3b). The derived model provides insight into the teleconnections between wheat yield and atmospheric and oceanic circulations, and is used to project the wheat yield trend under global warming with outputs of twelve climate models from the Coupled Model Intercomparison Project phase 5 (CMIP5). Hernandez-Barrera S., C. Rodríguez-Puebla and A.J. Challinor. Effects of diurnal temperature range and drought on wheat yield in Spain. Theoretical and Applied Climatology (submitted)

  4. Chemically intuited, large-scale screening of MOFs by machine learning techniques

    NASA Astrophysics Data System (ADS)

    Borboudakis, Giorgos; Stergiannakos, Taxiarchis; Frysali, Maria; Klontzas, Emmanuel; Tsamardinos, Ioannis; Froudakis, George E.

    2017-10-01

    A novel computational methodology for large-scale screening of MOFs is applied to gas storage with the use of machine learning technologies. This approach is a promising trade-off between the accuracy of ab initio methods and the speed of classical approaches, strategically combined with chemical intuition. The results demonstrate that the chemical properties of MOFs are indeed predictable (stochastically, not deterministically) using machine learning methods and automated analysis protocols, with the accuracy of predictions increasing with sample size. Our initial results indicate that this methodology is promising not only for gas storage in MOFs but also for many other materials science projects.
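The claim that prediction accuracy increases with sample size can be illustrated with a deliberately tiny stand-in: a 1-nearest-neighbour regressor on a smooth synthetic "property" of a one-dimensional descriptor. Everything here (the target function, the regressor, the sampling grids) is an assumption for illustration; the abstract does not specify the actual models or descriptors used:

```python
import math

def target(x):
    """Synthetic stand-in for a MOF property as a function of one descriptor."""
    return math.sin(6.0 * x)

def knn1_predict(train_x, x):
    """1-nearest-neighbour prediction: copy the value of the closest
    training point (labels come from evaluating the target function)."""
    nearest = min(train_x, key=lambda t: abs(t - x))
    return target(nearest)

def max_error(n_train, n_test=200):
    """Worst-case prediction error on a dense test grid over [0, 1]."""
    train_x = [i / (n_train - 1) for i in range(n_train)]
    test_x = [i / (n_test - 1) for i in range(n_test)]
    return max(abs(knn1_predict(train_x, x) - target(x)) for x in test_x)
```

Denser training data places every test point nearer to a labelled sample, so the worst-case error shrinks, mirroring the sample-size effect the abstract reports.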

  5. Managing large-scale workflow execution from resource provisioning to provenance tracking: The CyberShake example

    USGS Publications Warehouse

    Deelman, E.; Callaghan, S.; Field, E.; Francoeur, H.; Graves, R.; Gupta, N.; Gupta, V.; Jordan, T.H.; Kesselman, C.; Maechling, P.; Mehringer, J.; Mehta, G.; Okaya, D.; Vahi, K.; Zhao, L.

    2006-01-01

    This paper discusses the process of building an environment where large-scale, complex, scientific analysis can be scheduled onto a heterogeneous collection of computational and storage resources. The example application is the Southern California Earthquake Center (SCEC) CyberShake project, an analysis designed to compute probabilistic seismic hazard curves for sites in the Los Angeles area. We explain which software tools were used to build the system, and describe their functionality and interactions. We show the results of running the CyberShake analysis that included over 250,000 jobs using resources available through SCEC and the TeraGrid. © 2006 IEEE.

  6. Criminological research in contemporary China: challenges and lessons learned from a large-scale criminal victimization survey.

    PubMed

    Zhang, Lening; Messner, Steven F; Lu, Jianhong

    2007-02-01

    This article discusses research experience gained from a large-scale survey of criminal victimization recently conducted in Tianjin, China. The authors review some of the more important challenges that arose in the research, their responses to these challenges, and lessons learned that might be beneficial to other scholars who are interested in conducting criminological research in China. Their experience underscores the importance of understanding the Chinese political, cultural, and academic context, and the utility of collaborating with experienced and knowledgeable colleagues "on site." Although there are some special difficulties and barriers, their project demonstrates the feasibility of original criminological data collection in China.

  7. Assessment of climate change impacts on runoff in China using climate elasticity and multiple CMIP5 GCMs

    NASA Astrophysics Data System (ADS)

    Wu, C.; Hu, B. X.; Wang, P.; Xu, K.

    2017-12-01

    The occurrence of climate warming is unequivocal and is expected to alter the temporal-spatial patterns of regional water resources. Based on the long-term (1960-2012) water budget data and climate projections from 28 Global Climate Models (GCMs) of the Coupled Model Intercomparison Project Phase 5 (CMIP5), this study investigated the responses of runoff (R) to future climate variability in China at both grid and catchment scales using the Budyko-based elasticity method. Results indicate a large spatial variation in precipitation (P) elasticity (from 1.2 to 3.3) and potential evaporation (PET) elasticity (from -2.3 to -0.2) across China. The P elasticity is larger in northeast and western China than in southern China, while the opposite occurs for PET elasticity. Climate projections suggest that there is large uncertainty involved among the GCM simulations, but most project a consistent change in P (or PET) over China at the mean annual scale. During the future period of 2071-2100, the mean annual P will likely increase in most parts of China particularly the western regions, while the mean annual PET will likely increase in the whole China especially the southern regions due to future increases in temperature. Moreover, larger increases are projected for higher emission scenarios. Compared with the baseline 1971-2000, the arid regions and humid regions of China will likely become wetter and drier in the period 2071-2100, respectively.
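The Budyko-based elasticity framework underlying these results reduces, at first order, to dR/R ≈ ε_P·(dP/P) + ε_PET·(dPET/PET). A minimal sketch, with elasticity values picked from the ranges reported above and hypothetical climate perturbations:

```python
# First-order climate elasticity of runoff. The elasticity defaults are
# chosen from the ranges reported in the abstract (P elasticity 1.2-3.3,
# PET elasticity -2.3 to -0.2); the perturbations are hypothetical.
def runoff_change(dp_frac, dpet_frac, eps_p=2.0, eps_pet=-1.0):
    """Fractional change in runoff, dR/R, from fractional changes in
    precipitation (dp_frac) and potential evaporation (dpet_frac)."""
    return eps_p * dp_frac + eps_pet * dpet_frac
```

For example, a 10% increase in P combined with a 5% increase in PET would, under these assumed elasticities, yield roughly a 15% increase in runoff; with a larger PET elasticity the same warming scenario can turn the sign negative, which is the drying mechanism the abstract projects for humid southern China.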

  8. Intercomparison of methods of coupling between convection and large-scale circulation: 2. Comparison over nonuniform surface conditions

    DOE PAGES

    Daleu, C. L.; Plant, R. S.; Woolnough, S. J.; ...

    2016-03-18

    As part of an international intercomparison project, the weak temperature gradient (WTG) and damped gravity wave (DGW) methods are used to parameterize large-scale dynamics in a set of cloud-resolving models (CRMs) and single column models (SCMs). The WTG or DGW method is implemented using a configuration that couples a model to a reference state defined with profiles obtained from the same model in radiative-convective equilibrium. We investigated the sensitivity of each model to changes in SST, given a fixed reference state. We performed a systematic comparison of the WTG and DGW methods in different models, and a systematic comparison of the behavior of those models using the WTG method and the DGW method. The sensitivity to the SST depends on both the large-scale parameterization method and the choice of the cloud model. In general, SCMs display a wider range of behaviors than CRMs. All CRMs using either the WTG or DGW method show an increase of precipitation with SST, while SCMs show sensitivities which are not always monotonic. CRMs using either the WTG or DGW method show a similar relationship between mean precipitation rate and column-relative humidity, while SCMs exhibit a much wider range of behaviors. DGW simulations produce large-scale velocity profiles which are smoother and less top-heavy compared to those produced by the WTG simulations. Lastly, these large-scale parameterization methods provide a useful tool to identify the impact of parameterization differences on model behavior in the presence of two-way feedback between convection and the large-scale circulation.
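The WTG closure referred to above can be summarized in one line: the large-scale vertical velocity is diagnosed so that vertical advection relaxes the model temperature toward the reference profile over a chosen timescale, w·(∂θ/∂z) = (θ − θ_ref)/τ. A schematic sketch for a single model level, with illustrative values (the profile numbers and the 3-hour relaxation timescale are assumptions, not the intercomparison's settings):

```python
# Weak temperature gradient (WTG) closure at one model level: diagnose the
# large-scale vertical velocity w from the potential-temperature anomaly
# relative to a reference profile. Values below are illustrative only.
def wtg_velocity(theta, theta_ref, dtheta_dz, tau=3.0 * 3600.0):
    """Diagnosed large-scale w (m/s, positive upward).
    theta, theta_ref: potential temperature (K); dtheta_dz: lapse (K/m);
    tau: relaxation timescale (s)."""
    return (theta - theta_ref) / (tau * dtheta_dz)
```

A warm anomaly relative to the reference state thus drives ascent, whose adiabatic cooling opposes the anomaly, which is the two-way convection-circulation feedback the intercomparison is designed to probe.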

  9. Managing aquatic ecosystems and water resources under multiple stress--an introduction to the MARS project.

    PubMed

    Hering, Daniel; Carvalho, Laurence; Argillier, Christine; Beklioglu, Meryem; Borja, Angel; Cardoso, Ana Cristina; Duel, Harm; Ferreira, Teresa; Globevnik, Lidija; Hanganu, Jenica; Hellsten, Seppo; Jeppesen, Erik; Kodeš, Vit; Solheim, Anne Lyche; Nõges, Tiina; Ormerod, Steve; Panagopoulos, Yiannis; Schmutz, Stefan; Venohr, Markus; Birk, Sebastian

    2015-01-15

    Water resources globally are affected by a complex mixture of stressors resulting from a range of drivers, including urban and agricultural land use, hydropower generation and climate change. Understanding how stressors interfere and impact upon ecological status and ecosystem services is essential for developing effective River Basin Management Plans and shaping future environmental policy. This paper details the nature of these problems for Europe's water resources and the need to find solutions at a range of spatial scales. In terms of the latter, we describe the aims and approaches of the EU-funded project MARS (Managing Aquatic ecosystems and water Resources under multiple Stress) and the conceptual and analytical framework that it is adopting to provide this knowledge, understanding and tools needed to address multiple stressors. MARS is operating at three scales: At the water body scale, the mechanistic understanding of stressor interactions and their impact upon water resources, ecological status and ecosystem services will be examined through multi-factorial experiments and the analysis of long time-series. At the river basin scale, modelling and empirical approaches will be adopted to characterise relationships between multiple stressors and ecological responses, functions, services and water resources. The effects of future land use and mitigation scenarios in 16 European river basins will be assessed. At the European scale, large-scale spatial analysis will be carried out to identify the relationships amongst stress intensity, ecological status and service provision, with a special focus on large transboundary rivers, lakes and fish. 
The project will support managers and policy makers in the practical implementation of the Water Framework Directive (WFD), of related legislation and of the Blueprint to Safeguard Europe's Water Resources by advising the 3rd River Basin Management Planning cycle, the revision of the WFD and by developing new tools for diagnosing and predicting multiple stressors. Copyright © 2014. Published by Elsevier B.V.

  10. Changing Permafrost in the Arctic and its Global Effects in the 21st Century (PAGE21): A very large international and integrated project to measure the impact of permafrost degradation on the climate system

    NASA Astrophysics Data System (ADS)

    Lantuit, Hugues; Boike, Julia; Dahms, Melanie; Hubberten, Hans-Wolfgang

    2013-04-01

    The northern permafrost region contains approximately 50% of the estimated global below-ground organic carbon pool and more than twice as much as is contained in the current atmospheric carbon pool. The sheer size of this carbon pool, together with the large amplitude of predicted arctic climate change, implies that there is a high potential for global-scale feedbacks from arctic climate change if these carbon reservoirs are destabilized. Nonetheless, significant gaps exist in our current state of knowledge that prevent us from producing accurate assessments of the vulnerability of the arctic permafrost to climate change, or of the implications of future climate change for global greenhouse gas (GHG) emissions. Specifically: • Our understanding of the physical and biogeochemical processes at play in permafrost areas is still insufficient in some key aspects. • Size estimates for the high latitude continental carbon and nitrogen stocks vary widely between regions and research groups. • The representation of permafrost-related processes in global climate models still tends to be rudimentary, and is one reason for the frequently poor performances of climate models at high latitudes. 
The key objectives of PAGE21 are: • to improve our understanding of the processes affecting the size of the arctic permafrost carbon and nitrogen pools through detailed field studies and monitoring, in order to quantify their size and their vulnerability to climate change, • to produce, assemble and assess high-quality datasets in order to develop and evaluate representations of permafrost and related processes in global models, • to improve these models accordingly, • to use these models to reduce the uncertainties in feedbacks from arctic permafrost to global change, thereby providing the means to assess the feasibility of stabilization scenarios, and • to ensure widespread dissemination of our results in order to provide direct input into the ongoing debate on climate-change mitigation. The concept of PAGE21 is to directly address these questions through a close interaction between monitoring activities, process studies and modeling on the pertinent temporal and spatial scales. Field sites have been selected to cover a wide range of environmental conditions for the validation of large-scale models, the development of permafrost monitoring capabilities, the study of permafrost processes, and for overlap with existing monitoring programs. PAGE21 will contribute to upgrading the project sites with the objective of providing a measurement baseline, both for process studies and for modeling programs. PAGE21 is determined to break down the traditional barriers in permafrost sciences between observational and model-supported site studies and large-scale climate modeling. Our concept for the interaction between site-scale studies and large-scale modeling is to establish and maintain a direct link between these two areas for developing and evaluating, on all spatial scales, the land-surface modules of leading European global climate models taking part in the Coupled Model Intercomparison Project Phase 5 (CMIP5), designed to inform the IPCC process. 
The timing of this project is such that the main scientific results from PAGE21, and in particular the model-based assessments, will build entirely on new outputs and results from the CMIP5 Climate Model Intercomparison Project designed to inform the IPCC Fifth Assessment Report. However, PAGE21 is designed to leave a legacy that will endure beyond the lifetime of the projections that it produces. This legacy will comprise • an improved understanding of the key processes and parameters that determine the vulnerability of arctic permafrost to climate change, • the production of a suite of major European coupled climate models including detailed and validated representations of permafrost-related processes, which will reduce uncertainties in future climate projections produced well beyond the lifetime of PAGE21, and • the training of a new generation of permafrost scientists who will bridge the long-standing gap between permafrost field science and global climate modeling, for the long-term benefit of science and society.

  11. Implementation of Fiber Optic Sensing System on Sandwich Composite Cylinder Buckling Test

    NASA Technical Reports Server (NTRS)

    Pena, Francisco; Richards, W. Lance; Parker, Allen R.; Piazza, Anthony; Schultz, Marc R.; Rudd, Michelle T.; Gardner, Nathaniel W.; Hilburger, Mark W.

    2018-01-01

    The National Aeronautics and Space Administration (NASA) Engineering and Safety Center Shell Buckling Knockdown Factor Project is a multicenter project tasked with developing new analysis-based shell buckling design guidelines and design factors (i.e., knockdown factors) through high-fidelity buckling simulations and advanced test technologies. To validate these new buckling knockdown factors for future launch vehicles, the Shell Buckling Knockdown Factor Project is carrying out structural testing on a series of large-scale metallic and composite cylindrical shells at the NASA Marshall Space Flight Center (Marshall Space Flight Center, Alabama). A fiber optic sensor system was used to measure strain on a large-scale sandwich composite cylinder that was tested under multiple axial compressive loads exceeding 850,000 lb and equivalent bending loads of over 22 million in-lb. During the structural testing of the composite cylinder, strain data were collected from optical cables containing distributed fiber Bragg gratings using a custom fiber optic sensor system interrogator developed at the NASA Armstrong Flight Research Center. A total of 16 fiber-optic strands, each containing nearly 1,000 strain-measuring fiber Bragg gratings, were installed on the inner and outer cylinder surfaces to monitor the test article's global structural response through high-density real-time and post-test strain measurements. The distributed sensing system provided evidence of local epoxy failure at the attachment-ring-to-barrel interface that would not have been detected with conventional instrumentation. Results from the fiber optic sensor system were used to further refine and validate structural models for buckling of the large-scale composite structures. 
This paper discusses the techniques employed for real-time structural monitoring of the composite cylinder for structural load introduction and distributed bending-strain measurements over a large section of the cylinder by utilizing unique sensing capabilities of fiber optic sensors.
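The strain sensing described above rests on the standard fiber Bragg grating relation between reflected-wavelength shift and applied strain. The sketch below is illustrative only and is not the project's interrogator code; the 1550 nm nominal Bragg wavelength and the effective photo-elastic coefficient p_e ≈ 0.22 are typical textbook values for silica fiber, not values from the test.

```python
def bragg_shift_to_strain(delta_lambda_nm, lambda0_nm=1550.0, p_e=0.22):
    """Axial strain (dimensionless) from a Bragg wavelength shift.

    Uses the standard FBG relation dL/L0 = (1 - p_e) * strain, where p_e is
    the effective photo-elastic coefficient of the fiber (temperature effects
    are neglected in this simplified sketch).
    """
    return delta_lambda_nm / (lambda0_nm * (1.0 - p_e))

# A ~1.2 nm shift on a 1550 nm grating corresponds to roughly 990 microstrain.
eps = bragg_shift_to_strain(1.2)
print(f"{eps * 1e6:.0f} microstrain")
```

In a distributed system like the one described, this conversion would be applied per grating, turning each strand of ~1,000 gratings into a quasi-continuous strain profile along the cylinder surface.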

  12. Hazardous thunderstorm intensification over Lake Victoria

    PubMed Central

    Thiery, Wim; Davin, Edouard L.; Seneviratne, Sonia I.; Bedka, Kristopher; Lhermitte, Stef; van Lipzig, Nicole P. M.

    2016-01-01

    Weather extremes have harmful impacts on communities around Lake Victoria, where thousands of fishermen die every year because of intense night-time thunderstorms. Yet how these thunderstorms will evolve in a future warmer climate is still unknown. Here we show that Lake Victoria is projected to be a hotspot of future extreme precipitation intensification by using new satellite-based observations, a high-resolution climate projection for the African Great Lakes and coarser-scale ensemble projections. Land precipitation on the previous day exerts a control on night-time occurrence of extremes on the lake by enhancing atmospheric convergence (74%) and moisture availability (26%). The future increase in extremes over Lake Victoria is about twice as large relative to surrounding land under a high-emission scenario, as only over-lake moisture advection is high enough to sustain Clausius–Clapeyron scaling. Our results highlight a major hazard associated with climate change over East Africa and underline the need for high-resolution projections to assess local climate change. PMID:27658848
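The Clausius–Clapeyron scaling invoked in this abstract can be made concrete. The fractional growth of saturation vapour pressure with temperature follows from the Clausius–Clapeyron equation; the latent heat and temperature plugged in below are typical near-surface values chosen for illustration, not numbers from the study:

```latex
\frac{1}{e_s}\frac{\mathrm{d}e_s}{\mathrm{d}T} = \frac{L_v}{R_v T^2}
\approx \frac{2.5\times10^{6}\,\mathrm{J\,kg^{-1}}}{\left(461\,\mathrm{J\,kg^{-1}\,K^{-1}}\right)\left(288\,\mathrm{K}\right)^{2}}
\approx 0.065\,\mathrm{K^{-1}}
```

That is, roughly a 6–7% increase in atmospheric moisture-holding capacity per kelvin of warming, the benchmark rate against which the over-lake precipitation extremes are said to scale.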

  13. The Use of the Nature of Scientific Knowledge Scale as an Entrance Assessment in a Large, Online Citizen Science Project

    NASA Astrophysics Data System (ADS)

    Price, Aaron

    2010-01-01

    Citizen Sky is a new three-year astronomical citizen science project launched in June 2009 with funding from the National Science Foundation. This paper reports on early results of an assessment delivered to 1,000 participants when they first joined the project. The goal of the assessment, based on the Nature of Scientific Knowledge Scale (NSKS), is to characterize their attitudes towards the nature of scientific knowledge. The NSKS components of the assessment achieved high levels of reliability. Both reliability and overall scores fall within the range reported from other NSKS studies in the literature. Correlation analysis with other components of the assessment reveals that some factors, such as age and understanding of scientific evidence, may be reflected in scores on subscales of NSKS items. Further work will be done using online discourse analysis and interviews. Overall, we find that the NSKS can be used as an entrance assessment for an online citizen science project.

  14. Space research - At a crossroads

    NASA Technical Reports Server (NTRS)

    Mcdonald, Frank B.

    1987-01-01

    Efforts which must be expended if U.S. space research is to regain vitality in the next few years are discussed. Small-scale programs are the cornerstone for big science projects, giving both researchers and students a chance to practice the development of space missions and hardware and identify promising goals for larger projects. Small projects can be carried aloft by balloons, sounding rockets, the Shuttle and ELVs. It is recommended that NASA continue the development of remote sensing systems, and join with other government agencies to fund space-based materials science, space biology and medical research. Increased international cooperation in space projects is necessary for affording moderate to large scale missions, for political reasons, and to maximize available space resources. Finally, the establishment and funding of long-range goals in space, particularly the development of the infrastructure and technologies for the exploration and colonization of the planets, must be viewed as the normal outgrowth of the capabilities being developed for LEO operations.

  15. The Norwegian national project for ethics support in community health and care services.

    PubMed

    Magelssen, Morten; Gjerberg, Elisabeth; Pedersen, Reidar; Førde, Reidun; Lillemoen, Lillian

    2016-11-08

    Internationally, clinical ethics support (CES) has yet to be implemented systematically in community health and care services. A large-scale Norwegian project (2007-2015) attempted to increase ethical competence in community services through facilitating the implementation of ethics support activities in 241 Norwegian municipalities. The article describes the ethics project and the ethics activities that ensued. The article first gives an account of the Norwegian ethics project. Then the results of two online questionnaires are reported, characterizing the scope, activities and organization of the ethics activities in the Norwegian municipalities and the ethical topics addressed. One hundred and thirty-seven municipal contact persons answered the first survey (55% response rate), whereas 217 ethics facilitators from 48 municipalities responded to the second (33% response rate). The Norwegian ethics project is vast in scope, yet has focused on some institutions and professions (e.g., nursing homes, home-based care; nurses, nurses' aides, unskilled workers) whilst seldom reaching others (e.g., child and adolescent health care; physicians). Patients and next of kin were very seldom involved. Through the ethics project employees discussed many important ethical challenges, in particular related to patient autonomy, competence to consent, and cooperation with next of kin. The "ethics reflection group" was the most common venue for ethics deliberation. The Norwegian project is the first of its kind and scope, and other countries may learn from the Norwegian experiences. Professionals have discussed central ethical dilemmas, the handling of which arguably makes a difference for patients/users and service quality. The study indicates that large-scale (national) implementation of CES structures for the municipal health and care services is complex, yet feasible.

  16. Process for Low Cost Domestic Production of LIB Cathode Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thurston, Anthony

    The objective of the research was to determine the best low-cost method for the large-scale production of the Nickel-Cobalt-Manganese (NCM) layered cathode materials. The research and development focused on scaling up the licensed technology from Argonne National Laboratory in BASF's battery material pilot plant in Beachwood, Ohio. Since BASF did not have experience with the large-scale production of the NCM cathode materials, a significant amount of development was needed to support BASF's already existing research program. During the three-year period BASF was able to develop and validate production processes for the NCM 111, 523 and 424 materials as well as begin development of the High Energy NCM. BASF also used this time period to provide free cathode material samples to numerous manufacturers, OEMs and research companies in order to validate the materials. The success of the project can be demonstrated by the construction of the production plant in Elyria, Ohio and the successful operation of that facility. The benefit of the project to the public will begin to be apparent as soon as material from the production plant is being used in electric vehicles.

  17. Web based visualization of large climate data sets

    USGS Publications Warehouse

    Alder, Jay R.; Hostetler, Steven W.

    2015-01-01

    We have implemented the USGS National Climate Change Viewer (NCCV), which is an easy-to-use web application that displays future projections from global climate models over the United States at the state, county and watershed scales. We incorporate the NASA NEX-DCP30 statistically downscaled temperature and precipitation for 30 global climate models being used in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC), and hydrologic variables we simulated using a simple water-balance model. Our application summarizes very large, complex data sets at scales relevant to resource managers and citizens and makes climate-change projection information accessible to users of varying skill levels. Tens of terabytes of high-resolution climate and water-balance data are distilled to compact binary format summary files that are used in the application. To alleviate slow response times under high loads, we developed a map caching technique that reduces the time it takes to generate maps by several orders of magnitude. The reduced access time scales to >500 concurrent users. We provide code examples that demonstrate key aspects of data processing, data exporting/importing and the caching technique used in the NCCV.
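The caching idea described here, memoising expensive map renderings keyed by the request parameters, can be sketched generically. This is not the NCCV's actual code: the function names, the pickle-on-disk store, and the parameter set (model, variable, season, region) are illustrative assumptions.

```python
import hashlib
import os
import pickle
import tempfile

# Hypothetical on-disk cache location (a temp dir so the sketch is self-contained).
CACHE_DIR = tempfile.mkdtemp(prefix="map_cache_")

def cache_key(model, variable, season, region):
    """Map a map-request's parameters to a stable cache filename."""
    raw = f"{model}|{variable}|{season}|{region}".encode()
    return hashlib.sha256(raw).hexdigest() + ".pkl"

def get_map(model, variable, season, region, render):
    """Return a cached rendering if present; otherwise render once and store it."""
    path = os.path.join(CACHE_DIR, cache_key(model, variable, season, region))
    if os.path.exists(path):
        with open(path, "rb") as f:
            return pickle.load(f)
    result = render(model, variable, season, region)  # the expensive step
    with open(path, "wb") as f:
        pickle.dump(result, f)
    return result

# Usage: the first request renders and stores; repeat requests hit the cache.
calls = []
def fake_render(*args):
    calls.append(args)
    return {"png_bytes": b"...", "params": args}

m1 = get_map("CCSM4", "tasmax", "JJA", "OR", fake_render)
m2 = get_map("CCSM4", "tasmax", "JJA", "OR", fake_render)
assert m1 == m2 and len(calls) == 1  # second request served from cache
```

Because every distinct parameter combination maps to one file, identical requests from hundreds of concurrent users cost one render plus cheap disk reads, which is the essence of the orders-of-magnitude speedup the abstract reports.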

  18. Recovery Act: Oxy-Combustion Technology Development for Industrial-Scale Boiler Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levasseur, Armand

    2014-04-30

    Alstom Power Inc. (Alstom), under U.S. DOE/NETL Cooperative Agreement No. DE-NT0005290, is conducting a development program to generate detailed technical information needed for application of oxy-combustion technology. The program is designed to provide the necessary information and understanding for the next step of large-scale commercial demonstration of oxy-combustion in tangentially fired boilers and to accelerate the commercialization of this technology. The main project objectives include: • Design and develop an innovative oxyfuel system for existing tangentially fired boiler units that minimizes overall capital investment and operating costs. • Evaluate performance of oxyfuel tangentially fired boiler systems in pilot-scale tests at Alstom's 15 MWth tangentially fired Boiler Simulation Facility (BSF). • Address technical gaps for the design of oxyfuel commercial utility boilers by focused testing and improvement of engineering and simulation tools. • Develop the design, performance and costs for a demonstration-scale oxyfuel boiler and auxiliary systems. • Develop the design and costs for both industrial and utility commercial-scale reference oxyfuel boilers and auxiliary systems that are optimized for overall plant performance and cost. • Define key design considerations and develop general guidelines for application of results to utility and different industrial applications. The project was initiated in October 2008 and the scope extended in 2010 under an ARRA award. The project completion date was April 30, 2014. Central to the project is 15 MWth testing in the BSF, which provided in-depth understanding of oxy-combustion under boiler conditions, detailed data for improvement of design tools, and key information for application to commercial-scale oxy-fired boiler design. 
Eight comprehensive 15 MWth oxy-fired test campaigns were performed with different coals, providing detailed data on combustion, emissions, and thermal behavior over a matrix of fuels, oxy-process variables and boiler design parameters. Significant improvement of CFD modeling tools and validation against 15 MWth experimental data has been completed. Oxy-boiler demonstration and large reference designs have been developed, supported with the information and knowledge gained from the 15 MWth testing. The results from the 15 MWth testing in the BSF and complementary bench-scale testing are addressed in this volume (Volume II) of the final report. The results of the modeling efforts (Volume III) and the oxy-boiler design efforts (Volume IV) are reported in separate volumes.

  19. Measuring the payback of research activities: a feasible ex-post evaluation methodology in epidemiology and public health.

    PubMed

    Aymerich, Marta; Carrion, Carme; Gallo, Pedro; Garcia, Maria; López-Bermejo, Abel; Quesada, Miquel; Ramos, Rafel

    2012-08-01

    Most ex-post evaluations of research funding programs are based on bibliometric methods and, although this approach has been widely used, it examines only one facet of a project's impact, namely scientific productivity. More comprehensive models of payback assessment of research activities are designed for large-scale projects with extensive funding. The purpose of this study was to design and implement a methodology for the ex-post evaluation of small-scale projects that would take into account both the fulfillment of projects' stated objectives and other wider benefits to society as payback measures. We used a two-phase ex-post approach to appraise impact for 173 small-scale projects funded in 2007 and 2008 by a Spanish network center for research in epidemiology and public health. In the internal phase we used a questionnaire to query the principal investigator (PI) on the outcomes as well as the actual and potential impact of each project; in the external phase we sent a second questionnaire to external reviewers with the aim of assessing (by peer review) the performance of each individual project. Overall, 43% of the projects were rated as having completed their objectives "totally", and 40% "considerably". The research activities funded were reported by PIs as socially beneficial, with their greatest impact being on research capacity (50% of payback to society) and on knowledge translation (above 11%). The proposed method showed a good discriminating ability that makes it possible to measure, reliably, the extent to which a project's objectives were met as well as the degree to which the project contributed to enhancing the group's scientific performance and its social payback. Copyright © 2012 Elsevier Ltd. All rights reserved.

  20. Fabrication and performance analysis of 4-sq cm indium tin oxide/InP photovoltaic solar cells

    NASA Technical Reports Server (NTRS)

    Gessert, T. A.; Li, X.; Phelps, P. W.; Coutts, T. J.; Tzafaras, N.

    1991-01-01

    Large-area photovoltaic solar cells based on direct-current magnetron sputter deposition of indium tin oxide (ITO) onto single-crystal p-InP substrates have demonstrated both the radiation hardness and the high performance necessary for extraterrestrial applications. A small-scale production project was initiated in which approximately 50 ITO/InP cells are being produced. The procedures used in this small-scale production of 4-sq cm ITO/InP cells are presented and discussed. The discussion includes analyses of the performance range of all available production cells, and device performance data for the best cells thus far produced. Additionally, processing experience gained from the production of these cells is discussed, indicating other issues that may be encountered when large-scale production begins.

  1. Large-scale fortification of condiments and seasonings as a public health strategy: equity considerations for implementation.

    PubMed

    Zamora, Gerardo; Flores-Urrutia, Mónica Crissel; Mayén, Ana-Lucia

    2016-09-01

    Fortification of staple foods with vitamins and minerals is an effective approach to increase micronutrient intake and improve nutritional status. The specific use of condiments and seasonings as vehicles in large-scale fortification programs is a relatively new public health strategy. This paper underscores equity considerations for the implementation of large-scale fortification of condiments and seasonings as a public health strategy by examining nonexhaustive examples of programmatic experiences and pilot projects in various settings. An overview of conceptual elements in implementation research and equity is presented, followed by an examination of equity considerations for five implementation strategies: (1) enhancing the capabilities of the public sector, (2) improving the performance of implementing agencies, (3) strengthening the capabilities and performance of frontline workers, (4) empowering communities and individuals, and (5) supporting multiple stakeholders engaged in improving health. Finally, specific considerations related to intersectoral action are considered. Large-scale fortification of condiments and seasonings cannot be a standalone strategy and needs to be implemented with concurrent and coordinated public health strategies, which should be informed by a health equity lens. © 2016 New York Academy of Sciences.

  2. Implicitly restarted Arnoldi/Lanczos methods for large scale eigenvalue calculations

    NASA Technical Reports Server (NTRS)

    Sorensen, Danny C.

    1996-01-01

    Eigenvalues and eigenfunctions of linear operators are important to many areas of applied mathematics. The ability to approximate these quantities numerically is becoming increasingly important in a wide variety of applications. This increasing demand has fueled interest in the development of new methods and software for the numerical solution of large-scale algebraic eigenvalue problems. In turn, the existence of these new methods and software, along with the dramatically increased computational capabilities now available, has enabled the solution of problems that would not even have been posed five or ten years ago. Until very recently, software for large-scale nonsymmetric problems was virtually non-existent. Fortunately, the situation is improving rapidly. The purpose of this article is to provide an overview of the numerical solution of large-scale algebraic eigenvalue problems. The focus will be on a class of methods called Krylov subspace projection methods. The well-known Lanczos method is the premier member of this class. The Arnoldi method generalizes the Lanczos method to the nonsymmetric case. A recently developed variant of the Arnoldi/Lanczos scheme called the Implicitly Restarted Arnoldi Method is presented here in some depth. This method is highlighted because of its suitability as a basis for software development.

  3. Dynamic Factorization in Large-Scale Optimization

    DTIC Science & Technology

    1994-01-01

    and variable production charges, distribution via multiple modes, taxes, duties and duty drawback, and inventory charges. See Harrison, Arntzen and… "Capital allocation and project selection via decomposition," presented at CORS/TIMS/ORSA meeting, Vancouver, BC (1989). T.P. Harrison, B.C. Arntzen and…

  4. Macro optical projection tomography for large scale 3D imaging of plant structures and gene activity

    PubMed Central

    Lee, Karen J. I.; Calder, Grant M.; Hindle, Christopher R.; Newman, Jacob L.; Robinson, Simon N.; Avondo, Jerome J. H. Y.

    2017-01-01

    Optical projection tomography (OPT) is a well-established method for visualising gene activity in plants and animals. However, a limitation of conventional OPT is that the specimen upper size limit precludes its application to larger structures. To address this problem we constructed a macro version called Macro OPT (M-OPT). We apply M-OPT to 3D live imaging of gene activity in growing whole plants and to visualise structural morphology in large optically cleared plant and insect specimens up to 60 mm tall and 45 mm deep. We also show how M-OPT can be used to image gene expression domains in 3D within fixed tissue and to visualise gene activity in 3D in clones of growing young whole Arabidopsis plants. A further application of M-OPT is to visualise plant-insect interactions. Thus M-OPT provides an effective 3D imaging platform that allows the study of gene activity, internal plant structures and plant-insect interactions at a macroscopic scale. PMID:28025317

  5. Consolidated Laser-Induced Fluorescence Diagnostic Systems for the NASA Ames Arc Jet Facilities

    NASA Technical Reports Server (NTRS)

    Grinstead, Jay H.; Wilder, Michael C.; Porter, Barry J.; Brown, Jeffrey D.; Yeung, Dickson; Battazzo, Stephen J.; Brubaker, Timothy R.

    2016-01-01

    The spectroscopic diagnostic technique of two photon absorption laser-induced fluorescence (LIF) of atomic species for non-intrusive arc jet flow property measurement was first implemented at NASA Ames in the mid-1990s. In 2013-2014, NASA combined the agency's large-scale arc jet test capabilities at NASA Ames. Concurrent with that effort, the agency also sponsored a project to establish two comprehensive LIF diagnostic systems for the Aerodynamic Heating Facility (AHF) and Interaction Heating Facility (IHF) arc jets. The scope of the project enabled further engineering development of the existing IHF LIF system as well as the complete reconstruction of the AHF LIF system. The updated LIF systems are identical in design and capability. They represent the culmination of over 20 years of development experience in transitioning a specialized laboratory research tool into a measurement system for large-scale, high-demand test facilities. This paper will document the latest improvements of the LIF system design and demonstrations of the redeveloped AHF and IHF LIF systems.

  6. The Segmented Aperture Interferometric Nulling Testbed (SAINT) I: Overview and Air-side System Description

    NASA Technical Reports Server (NTRS)

    Hicks, Brian A.; Lyon, Richard G.; Petrone, Peter, III; Bolcar, Matthew R.; Bolognese, Jeff; Clampin, Mark; Dogoda, Peter; Dworzanski, Daniel; Helmbrecht, Michael A.; Koca, Corina

    2016-01-01

    This work presents an overview of the Segmented Aperture Interferometric Nulling Testbed (SAINT), a project that will pair an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC). SAINT will incorporate the VNC's demonstrated wavefront sensing and control system to refine and quantify the end-to-end system performance for high-contrast starlight suppression. This pathfinder system will be used as a tool to study and refine approaches to mitigating instabilities and complex diffraction expected from future large segmented-aperture telescopes.

  7. Invisible water, visible impact: groundwater use and Indian agriculture under climate change

    NASA Astrophysics Data System (ADS)

    Zaveri, Esha; Grogan, Danielle S.; Fisher-Vanden, Karen; Frolking, Steve; Lammers, Richard B.; Wrenn, Douglas H.; Prusevich, Alexander; Nicholas, Robert E.

    2016-08-01

    India is one of the world’s largest food producers, making the sustainability of its agricultural system of global significance. Groundwater irrigation underpins India’s agriculture, currently boosting crop production by enough to feed 170 million people. Groundwater overexploitation has led to drastic declines in groundwater levels, threatening to push this vital resource out of reach for millions of small-scale farmers who are the backbone of India’s food security. Historically, losing access to groundwater has decreased agricultural production and increased poverty. We take a multidisciplinary approach to assess climate change challenges facing India’s agricultural system, and to assess the effectiveness of large-scale water infrastructure projects designed to meet these challenges. We find that even in areas that experience climate change induced precipitation increases, expansion of irrigated agriculture will require increasing amounts of unsustainable groundwater. The large proposed national river linking project has limited capacity to alleviate groundwater stress. Thus, without intervention, poverty and food insecurity in rural India is likely to worsen.

  8. What makes computational open source software libraries successful?

    NASA Astrophysics Data System (ADS)

    Bangerth, Wolfgang; Heister, Timo

    2013-01-01

    Software is the backbone of scientific computing. Yet, while we regularly publish detailed accounts about the results of scientific software, and while there is a general sense of which numerical methods work well, our community is largely unaware of best practices in writing the large-scale, open source scientific software upon which our discipline rests. This is particularly apparent in the commonly held view that writing successful software packages is largely the result of simply ‘being a good programmer’ when in fact there are many other factors involved, for example the social skill of community building. In this paper, we consider what we have found to be the necessary ingredients for successful scientific software projects and, in particular, for software libraries upon which the vast majority of scientific codes are built today. In particular, we discuss the roles of code, documentation, communities, project management and licenses. We also briefly comment on the impact on academic careers of engaging in software projects.

  9. Distributed and grid computing projects with research focus in human health.

    PubMed

    Diomidous, Marianna; Zikos, Dimitrios

    2012-01-01

    Distributed systems and grid computing systems connect several computers to obtain a higher level of performance in order to solve a problem. During the last decade, projects have used the World Wide Web to aggregate individuals' CPU power for research purposes. This paper presents the existing active large-scale distributed and grid computing projects with a research focus on human health. Eleven active projects with more than 2,000 Processing Units (PUs) each were found and are presented. The research focus for most of them is molecular biology, specifically understanding or predicting protein structure through simulation, comparing proteins, genomic analysis for disease-provoking genes, and drug design. Though not in all cases explicitly stated, common target diseases include HIV, dengue, Duchenne muscular dystrophy, Parkinson's disease, various types of cancer, and influenza. Other diseases include malaria, anthrax, and Alzheimer's disease. The need for national initiatives and European collaboration for larger-scale projects is stressed, to raise citizens' awareness and encourage their participation, creating a culture of internet volunteer computing.

  10. The Power of Engaging Citizen Scientists for Scientific Progress

    PubMed Central

    Garbarino, Jeanne; Mason, Christopher E.

    2016-01-01

    Citizen science has become a powerful force for scientific inquiry, providing researchers with access to a vast array of data points while connecting nonscientists to the authentic process of science. This citizen-researcher relationship creates an incredible synergy, allowing for the creation, execution, and analysis of research projects that would otherwise prove impossible in traditional research settings, namely due to the scope of needed human or financial resources (or both). However, citizen-science projects are not without their challenges. For instance, as projects are scaled up, there is concern regarding the rigor and usability of data collected by citizens who are not formally trained in research science. While these concerns are legitimate, we have seen examples of highly successful citizen-science projects from multiple scientific disciplines that have enhanced our collective understanding of science, such as how RNA molecules fold or determining the microbial metagenomic snapshot of an entire public transportation system. These and other emerging citizen-science projects show how improved protocols for reliable, large-scale science can realize both an improvement of scientific understanding for the general public and novel views of the world around us. PMID:27047581

  11. Compactified cosmological simulations of the infinite universe

    NASA Astrophysics Data System (ADS)

    Rácz, Gábor; Szapudi, István; Csabai, István; Dobos, László

    2018-06-01

    We present a novel N-body simulation method that compactifies the infinite spatial extent of the Universe into a finite sphere with isotropic boundary conditions to follow the evolution of the large-scale structure. Our approach eliminates the need for periodic boundary conditions, a mere numerical convenience which is not supported by observation and which modifies the law of force on large scales in an unrealistic fashion. We demonstrate that our method outclasses standard simulations executed on workstation-scale hardware in dynamic range: it is balanced in following a comparable number of high and low k modes, and its fundamental geometry and topology match observations. Our approach is also capable of simulating an expanding, infinite universe in static coordinates with Newtonian dynamics. The price of these achievements is that most of the simulated volume has smoothly varying mass and spatial resolution, an approximation that carries different systematics than periodic simulations. Our initial implementation of the method is called StePS, which stands for Stereographically projected cosmological simulations. It uses stereographic projection for space compactification and a naive O(N^2) force calculation, which nevertheless arrives at a correlation function of the same quality faster than any standard (tree or P3M) algorithm with similar spatial and mass resolution. The O(N^2) force calculation is easy to adapt to modern graphics cards, hence our code can function as a high-speed prediction tool for modern large-scale surveys. To learn about the limits of the respective methods, we compare StePS with GADGET-2 running matching initial conditions.
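The naive O(N^2) direct-summation force calculation mentioned above is simple enough to sketch. The following is an illustrative implementation of softened Newtonian pairwise accelerations, not the StePS code itself; the function and parameter names are our own:

```python
import numpy as np

def direct_sum_accelerations(pos, mass, G=1.0, softening=1e-2):
    """Naive O(N^2) Newtonian accelerations by direct pairwise summation.

    a_i = G * sum_j m_j (r_j - r_i) / (|r_j - r_i|^2 + eps^2)^(3/2)

    pos: (n, 3) particle positions; mass: (n,) particle masses.
    The softening length eps regularizes close encounters.
    """
    n = pos.shape[0]
    acc = np.zeros_like(pos)
    for i in range(n):
        d = pos - pos[i]                          # vectors from particle i to all others
        r2 = np.sum(d * d, axis=1) + softening**2
        inv_r3 = r2 ** -1.5
        inv_r3[i] = 0.0                           # exclude self-interaction
        acc[i] = G * np.sum(mass[:, None] * d * inv_r3[:, None], axis=0)
    return acc
```

Because each particle's loop body is independent, this pattern maps directly onto GPU threads, which is the adaptability to graphics cards the abstract refers to.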

  12. Regional Climate Change across North America in 2030 Projected from RCP6.0

    NASA Astrophysics Data System (ADS)

    Otte, T.; Nolte, C. G.; Faluvegi, G.; Shindell, D. T.

    2012-12-01

    Projecting climate change scenarios to local scales is important for understanding and mitigating the effects of climate change on society and the environment. Many of the general circulation models (GCMs) that are participating in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) do not fully resolve regional-scale processes and therefore cannot capture local changes in temperature and precipitation extremes. We seek to project the GCM's large-scale climate change signal to the local scale using a regional climate model (RCM) by applying dynamical downscaling techniques. The RCM will be used to better understand the local changes of temperature and precipitation extremes that may result from a changing climate. In this research, downscaling techniques that we developed with historical data are now applied to GCM fields. Results from downscaling NASA/GISS ModelE2 simulations of the IPCC AR5 Representative Concentration Pathway (RCP) scenario 6.0 will be shown. The Weather Research and Forecasting (WRF) model has been used as the RCM to downscale decadal time slices for ca. 2000 and ca. 2030 over North America and illustrate potential changes in regional climate that are projected by ModelE2 and WRF under RCP6.0. The analysis focuses on regional climate fields that most strongly influence the interactions between climate change and air quality. In particular, an analysis of extreme temperature and precipitation events will be presented.

  13. XLID-Causing Mutations and Associated Genes Challenged in Light of Data From Large-Scale Human Exome Sequencing

    PubMed Central

    Piton, Amélie; Redin, Claire; Mandel, Jean-Louis

    2013-01-01

    Because of the unbalanced sex ratio (1.3–1.4 to 1) observed in intellectual disability (ID) and the identification of large ID-affected families showing X-linked segregation, much attention has been focused on the genetics of X-linked ID (XLID). Mutations causing monogenic XLID have now been reported in over 100 genes, most of which are commonly included in XLID diagnostic gene panels. Nonetheless, the boundary between true mutations and rare non-disease-causing variants often remains elusive. The sequencing of a large number of control X chromosomes, required for avoiding false-positive results, was not systematically possible in the past. Such information is now available thanks to large-scale sequencing projects such as the National Heart, Lung, and Blood Institute (NHLBI) Exome Sequencing Project, which provides variation information on 10,563 X chromosomes from the general population. We used this NHLBI cohort to systematically reassess the implication of 106 genes proposed to be involved in monogenic forms of XLID. We particularly question the implication in XLID of ten of them (AGTR2, MAGT1, ZNF674, SRPX2, ATP6AP2, ARHGEF6, NXF5, ZCCHC12, ZNF41, and ZNF81), in which truncating variants or previously published mutations are observed at a relatively high frequency within this cohort. We also highlight 15 other genes (CCDC22, CLIC2, CNKSR2, FRMPD4, HCFC1, IGBP1, KIAA2022, KLF8, MAOA, NAA10, NLGN3, RPL10, SHROOM4, ZDHHC15, and ZNF261) for which replication studies are warranted. We propose that similar reassessment of reported mutations (and genes) with the use of data from large-scale human exome sequencing would be relevant for a wide range of other genetic diseases. PMID:23871722

  14. DebugIT for patient safety - improving the treatment with antibiotics through multimedia data mining of heterogeneous clinical data.

    PubMed

    Lovis, Christian; Colaert, Dirk; Stroetmann, Veli N

    2008-01-01

    The concepts and architecture underlying a large-scale integrating project funded within the 7th EU Framework Programme (FP7) are discussed. The main objective of the project is to build a tool that will have a significant impact on the monitoring and control of infectious diseases and antimicrobial resistance in Europe. This will be realized by building a technical and semantic infrastructure able to share heterogeneous clinical data sets from different hospitals in different countries, with different languages and legislations; to analyze large amounts of this clinical data with advanced multimedia data mining; and finally to apply the obtained knowledge to clinical decisions and outcome monitoring. The project faces numerous challenges at all levels, technical, semantic, legal, and ethical, that will have to be addressed.

  15. A rapid and cost-effective method for sequencing pooled cDNA clones by using a combination of transposon insertion and Gateway technology.

    PubMed

    Morozumi, Takeya; Toki, Daisuke; Eguchi-Ogawa, Tomoko; Uenishi, Hirohide

    2011-09-01

    Large-scale cDNA-sequencing projects require an efficient strategy for mass sequencing. Here we describe a method for sequencing pooled cDNA clones using a combination of transposon insertion and Gateway technology. Our method reduces the number of shotgun clones that are unsuitable for reconstruction of cDNA sequences, and has the advantage of reducing the total costs of the sequencing project.

  16. Vegetation changes in recent large-scale ecological restoration projects and subsequent impact on water resources in China's Loess Plateau.

    PubMed

    Li, Shuai; Liang, Wei; Fu, Bojie; Lü, Yihe; Fu, Shuyi; Wang, Shuai; Su, Huimin

    2016-11-01

    Recently, the relationship between vegetation activity and temperature variability has received much attention in China. However, vegetation-induced changes in water resources through altered land surface energy balance (e.g. albedo) have not been well documented. This study investigates the underlying causes of vegetation change and the subsequent impacts on runoff for the Northern Shaanxi Loess Plateau. Results show that the satellite-derived vegetation index has exhibited a significantly increasing trend during the past three decades, especially during 2000-2012. Large-scale ecological restorations, i.e., the Natural Forest Conservation project and the Grain for Green project, are found to be the primary drivers of the vegetation increase. The increased vegetation coverage induces a decrease in surface albedo, which would result in an increase in temperature. This positive effect can be counteracted by higher evapotranspiration, and the net effect is a decrease in daytime land surface temperature. A higher evapotranspiration rate from restored vegetation is the primary reason for the reduced runoff coefficient. Other factors, including less heavy precipitation and increased water consumption by towns, industry, and agriculture, also appear to be important causes of the reduction in runoff. These two ecological restoration projects produce both positive and negative effects on overall ecosystem services; thus, long-term continuous monitoring is needed.

  17. The bungling giant: Atomic Energy Canada Limited and next-generation nuclear technology, 1980--1994

    NASA Astrophysics Data System (ADS)

    Slater, Ian James

    From 1980 to 1994, Atomic Energy of Canada Limited (AECL), the Crown Corporation responsible for the development of nuclear technology in Canada, ventured into the market for small-scale, decentralized power systems with the Slowpoke Energy System (SES), a 10MW nuclear reactor for space heating in urban and remote areas. The SES was designed to be "passively" or "inherently" safe, such that even the most catastrophic failure of the system would not result in a serious accident (e.g. a meltdown or an explosion). This Canadian initiative, a beneficiary of the National Energy Program, was the first and by far the most successful attempt at a passively safe, decentralized nuclear power system anywhere in the world. Part one uses archival documentation and interviews with project leaders to reconstruct the history of the SES. The standard explanations for the failure of the project, cheap oil, public resistance to the technology, and lack of commercial expertise, are rejected. Part two presents an alternative explanation for the failure of AECL to commercialize the SES. In short, technological momentum towards large-scale nuclear designs led to structural restrictions for the SES project. These restrictions manifested themselves internally to the company (e.g., marginalization of the SES) and externally to the company (e.g., licensing). In part three, the historical lessons of the SES are used to refine one of the central tenets of Popper's political philosophy, "piecemeal social engineering." Popper's presentation of the idea is lacking in detail; the analysis of the SES provides some empirical grounding for the concept. I argue that the institutions surrounding traditional nuclear power represent a form of utopian social engineering, leading to consequences such as the suspension of civil liberties to guarantee the security of the technology. The SES project was an example of a move from the utopian social engineering of large-scale centralized nuclear technology to the piecemeal social engineering of small-scale, safer and simpler decentralized nuclear heating.

  18. Analysis of the ability of large-scale reanalysis data to define Siberian fire danger in preparation for future fire prediction

    NASA Astrophysics Data System (ADS)

    Soja, Amber; Westberg, David; Stackhouse, Paul, Jr.; McRae, Douglas; Jin, Ji-Zhong; Sukhinin, Anatoly

    2010-05-01

    Fire is the dominant disturbance that precipitates ecosystem change in boreal regions, and fire is largely under the control of weather and climate. Fire frequency, fire severity, area burned, and fire season length are predicted to increase in boreal regions under current climate change scenarios. Therefore, changes in fire regimes have the potential to compel ecological change, moving ecosystems more quickly towards equilibrium with a new climate. The ultimate goal of this research is to assess the viability of large-scale (1°) data for defining fire weather danger and fire regimes, so that large-scale fire weather data, like that available from current Intergovernmental Panel on Climate Change (IPCC) climate change scenarios, can be confidently used to predict future fire regimes. In this talk, we intend to: (1) evaluate Fire Weather Indices (FWI) derived using reanalysis and interpolated station data; (2) discuss the advantages and disadvantages of using these distinct data sources; and (3) highlight established relationships between large-scale fire weather data, area burned, active fires, and ecosystems burned. Specifically, the Canadian Forestry Service (CFS) Fire Weather Index (FWI) will be derived using: (1) NASA Goddard Earth Observing System version 4 (GEOS-4) large-scale reanalysis and NASA Global Precipitation Climatology Project (GPCP) data; and (2) National Climatic Data Center (NCDC) surface station-interpolated data. Inputs required by the FWI are local noon surface-level air temperature, relative humidity, wind speed, and daily (noon-to-noon) rainfall. GEOS-4 reanalysis and NCDC station-interpolated fire weather indices are generally consistent spatially, temporally, and quantitatively. Additionally, increased fire activity coincides with increased FWI ratings in both data products. Relationships have been established between large-scale FWI and area burned, fire frequency, and ecosystem types, and these can be used to estimate historic and future fire regimes.

  19. NBS/NIST Gas Thermometry From 0 to 660 °C

    PubMed Central

    Schooley, J. F.

    1990-01-01

    In the NBS/NIST Gas Thermometry program, constant-volume gas thermometers, a unique mercury manometer, and a highly accurate thermal expansion apparatus have been employed to evaluate temperatures on the Kelvin Thermodynamic Temperature Scale (KTTS) that correspond to particular temperatures on the 1968 International Practical Temperature Scale (IPTS-68). In this paper, we present a summary of the NBS/NIST Gas Thermometry project, which originated with planning activities in the late 1920s and was completed by measurements of the differences t(KTTS)-t(IPTS-68) in the range 0 to 660 °C. Early results of this project were the first to demonstrate the surprisingly large inaccuracy of the IPTS-68 with respect to the KTTS above 0 °C. Advances in several different measurement techniques, development of new, specialized instruments, and two distinct sets of gas thermometry observations have resulted from the project. PMID:28179778

  20. Supersonic Retropropulsion Technology Development in NASA's Entry, Descent, and Landing Project

    NASA Technical Reports Server (NTRS)

    Edquist, Karl T.; Berry, Scott A.; Rhode, Matthew N.; Kelb, Bil; Korzun, Ashley; Dyakonov, Artem A.; Zarchi, Kerry A.; Schauerhamer, Daniel G.; Post, Ethan A.

    2012-01-01

    NASA's Entry, Descent, and Landing (EDL) space technology roadmap calls for new technologies to achieve human exploration of Mars in the coming decades [1]. One of those technologies, termed Supersonic Retropropulsion (SRP), involves initiation of propulsive deceleration at supersonic Mach numbers. The potential benefits afforded by SRP to improve payload mass and landing precision make the technology attractive for future EDL missions. NASA's EDL project spent two years advancing the technological maturity of SRP for Mars exploration [2-15]. This paper summarizes the technical accomplishments from the project and highlights challenges and recommendations for future SRP technology development programs. These challenges include: developing sufficiently large SRP engines for use on human-scale entry systems; testing and computationally modelling complex and unsteady SRP fluid dynamics; understanding the effects of SRP on entry vehicle stability and controllability; and demonstrating sub-scale SRP entry systems in Earth's atmosphere.

  1. Risk-based Prioritization of Facility Decommissioning and Environmental Restoration Projects in the National Nuclear Legacy Liabilities Program at the Chalk River Laboratory - 13564

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Jerel G.; Kruzic, Michael; Castillo, Carlos

    2013-07-01

    Chalk River Laboratory (CRL), located in Ontario, Canada, has a large number of remediation projects currently in the Nuclear Legacy Liabilities Program (NLLP), including hundreds of facility decommissioning projects and over one hundred environmental remediation projects, all to be executed over the next 70 years. Atomic Energy of Canada Limited (AECL) engaged WorleyParsons to prioritize the NLLP projects at CRL through a risk-based prioritization and ranking process, using the WorleyParsons Sequencing Unit Prioritization and Estimating Risk Model (SUPERmodel). The prioritization project made use of the SUPERmodel, which has previously been used for other large-scale site prioritization and sequencing of facilities at nuclear laboratories in the United States. The process included development and vetting of risk parameter matrices as well as confirmation/validation of project risks. Detailed sensitivity studies were also conducted to understand the impacts that risk parameter weighting and scoring had on prioritization. The repeatable process yielded an objective, risk-based, and technically defensible prioritization that gained concurrence from all stakeholders, including Natural Resources Canada (NRCan), which is responsible for oversight of the NLLP. (authors)

  2. Research on the Application of Rapid Surveying and Mapping for Large Scale Topographic Map by UAV Aerial Photography System

    NASA Astrophysics Data System (ADS)

    Gao, Z.; Song, Y.; Li, C.; Zeng, F.; Wang, F.

    2017-08-01

    A rapid acquisition and processing method for large-scale topographic map data, which relies on an Unmanned Aerial Vehicle (UAV) low-altitude aerial photogrammetry system, is studied in this paper, and the main workflow is elaborated. Key technologies of UAV photogrammetric mapping are also studied, and a rapid mapping system based on an electronic plate mapping system is developed, changing the traditional mapping mode and greatly improving mapping efficiency. Production tests and accuracy evaluation of Digital Orthophoto Map (DOM), Digital Line Graphic (DLG), and other digital products were carried out in combination with a city basic topographic map update project. The approach provides a new technique for large-scale rapid surveying and has obvious technical advantages and good application prospects.

  3. Modeling the Economic Feasibility of Large-Scale Net-Zero Water Management: A Case Study.

    PubMed

    Guo, Tianjiao; Englehardt, James D; Fallon, Howard J

      While municipal direct potable water reuse (DPR) has been recommended for consideration by the U.S. National Research Council, it is unclear how to size new closed-loop DPR plants, termed "net-zero water (NZW) plants", to minimize cost and energy demand assuming upgradient water distribution. Based on a recent model optimizing the economics of plant scale for generalized conditions, the authors evaluated the feasibility and optimal scale of NZW plants for treatment capacity expansion in Miami-Dade County, Florida. Local data on population distribution and topography were input to compare projected costs for NZW vs the current plan. Total cost was minimized at a scale of 49 NZW plants for the service population of 671,823. Total unit cost for NZW systems, which mineralize chemical oxygen demand to below normal detection limits, is projected at ~$10.83 / 1000 gal, approximately 13% above the current plan and less than rates reported for several significant U.S. cities.
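The scale trade-off the study optimizes, unit treatment cost falling with plant capacity (economy of scale) while conveyance cost grows with each plant's service area, can be illustrated with a toy model. The cost curves, coefficients, and function names below are hypothetical stand-ins, not the study's actual model:

```python
def total_unit_cost(n_plants, population, demand_per_capita=100.0,
                    a_treat=5000.0, scale_exp=0.7, convey_coeff=0.4):
    """Hypothetical total unit cost for serving `population` with n plants.

    Treatment follows an economy-of-scale power law (capital ~ capacity**scale_exp),
    so unit treatment cost rises as plants shrink; conveyance cost is proxied by
    the service radius each plant must cover, which falls as plants multiply.
    All coefficients are illustrative, not values from the Miami-Dade study.
    """
    capacity = population * demand_per_capita / n_plants          # per-plant capacity
    unit_treatment = a_treat * capacity ** (scale_exp - 1.0)      # falls with capacity
    unit_conveyance = convey_coeff * (population / n_plants) ** 0.5  # service-radius proxy
    return unit_treatment + unit_conveyance

def optimal_plant_count(population, n_max=200, **kw):
    """Brute-force search for the plant count minimizing total unit cost."""
    return min(range(1, n_max + 1),
               key=lambda n: total_unit_cost(n, population, **kw))
```

With opposing cost curves like these, the minimum falls at an intermediate plant count rather than at one giant plant or maximal decentralization, which is the qualitative shape of the result reported above.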

  4. Driving terrestrial ecosystem models from space

    NASA Technical Reports Server (NTRS)

    Waring, R. H.

    1993-01-01

    Regional air pollution, land-use conversion, and projected climate change all affect ecosystem processes at large scales. Changes in vegetation cover and growth dynamics can impact the functioning of ecosystems, carbon fluxes, and climate. As a result, there is a need to assess and monitor vegetation structure and function comprehensively at regional to global scales. To provide a test of our present understanding of how ecosystems operate at large scales we can compare model predictions of CO2, O2, and methane exchange with the atmosphere against regional measurements of interannual variation in the atmospheric concentration of these gases. Recent advances in remote sensing of the Earth's surface are beginning to provide methods for estimating important ecosystem variables at large scales. Ecologists attempting to generalize across landscapes have made extensive use of models and remote sensing technology. The success of such ventures is dependent on merging insights and expertise from two distinct fields. Ecologists must provide the understanding of how well models emulate important biological variables and their interactions; experts in remote sensing must provide the biophysical interpretation of complex optical reflectance and radar backscatter data.

  5. Developing a Framework for Seamless Prediction of Sub-Seasonal to Seasonal Extreme Precipitation Events in the United States.

    NASA Astrophysics Data System (ADS)

    Rosendahl, D. H.; Ćwik, P.; Martin, E. R.; Basara, J. B.; Brooks, H. E.; Furtado, J. C.; Homeyer, C. R.; Lazrus, H.; Mcpherson, R. A.; Mullens, E.; Richman, M. B.; Robinson-Cook, A.

    2017-12-01

    Extreme precipitation events cause significant damage to homes, businesses, infrastructure, and agriculture, as well as many injuries and fatalities as a result of fast-moving water or waterborne diseases. In the USA, these natural hazard events claimed the lives of more than 300 people during 2015 - 2016 alone, with total damage reaching $24.4 billion. Prior studies of extreme precipitation events have focused on sub-daily to sub-weekly timeframes. However, many decisions for planning, preparing, and resilience-building require sub-seasonal to seasonal timeframes (S2S; 14 to 90 days), for which adequate forecasting tools do not exist. Therefore, the goal of this newly funded project is an enhanced understanding of the large-scale forcing and dynamics of S2S extreme precipitation events in the United States, and improved capability for modeling and predicting such events. Here, we describe the project goals, objectives, and research activities that will take place over the next 5 years. In this project, a unique team of scientists and stakeholders will identify and understand weather and climate processes connected with the prediction of S2S extreme precipitation events by answering these research questions: 1) What are the synoptic patterns associated with, and characteristic of, S2S extreme precipitation events in the contiguous U.S.? 2) What role, if any, do large-scale modes of climate variability play in modulating these events? 3) How predictable are S2S extreme precipitation events across temporal scales? 4) How do we create an informative prediction of S2S extreme precipitation events for policymaking and planning? This project will use observational data, high-resolution radar composites, dynamical climate models, and workshops that engage stakeholders (water resource managers, emergency managers, and tribal environmental professionals) in co-production of knowledge. The overarching result of this project will be predictive models that reduce the societal and economic impacts of extreme precipitation events. Other outcomes will include statistical and co-production frameworks that could be applied to other meteorological extremes, across all time scales and in other parts of the world, to increase resilience to extreme meteorological events.

  6. Evolving from bioinformatics in-the-small to bioinformatics in-the-large.

    PubMed

    Parker, D Stott; Gorlick, Michael M; Lee, Christopher J

    2003-01-01

    We argue the significance of a fundamental shift in bioinformatics, from in-the-small to in-the-large. Adopting a large-scale perspective is a way to manage the problems endemic to the world of the small: constellations of incompatible tools for which the effort required to assemble an integrated system exceeds the perceived benefit of the integration. Where bioinformatics in-the-small is about data and tools, bioinformatics in-the-large is about metadata and dependencies. Dependencies represent the complexities of large-scale integration, including the requirements and assumptions governing the composition of tools. The popular make utility is a very effective system for defining and maintaining simple dependencies, and it offers a number of insights about the essence of bioinformatics in-the-large. Keeping an in-the-large perspective has been very useful to us in large bioinformatics projects. We give two fairly different examples, and extract lessons from them showing how it has helped. These examples both suggest the benefit of explicitly defining and managing knowledge flows and knowledge maps (which represent metadata regarding types, flows, and dependencies), and also suggest approaches for developing bioinformatics database systems. Generally, we argue that large-scale engineering principles can be successfully adapted from disciplines such as software engineering and data management, and that having an in-the-large perspective will be a key advantage in the next phase of bioinformatics development.
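The core idea the abstract borrows from make, ordering each target after the targets it depends on, can be sketched with a toy dependency resolver. The pipeline targets below are hypothetical illustrations, not artifacts from the authors' projects:

```python
def build_order(deps):
    """Return a rebuild order in which every target follows its dependencies.

    `deps` maps each target to the targets it depends on (make-style rules).
    Raises ValueError on a circular dependency.
    """
    order, done, in_progress = [], set(), set()

    def visit(target):
        if target in done:
            return
        if target in in_progress:
            raise ValueError(f"circular dependency involving {target!r}")
        in_progress.add(target)
        for dep in deps.get(target, ()):
            visit(dep)                 # dependencies are emitted first
        in_progress.discard(target)
        done.add(target)
        order.append(target)

    for target in deps:
        visit(target)
    return order

# Hypothetical bioinformatics pipeline: alignments and annotations are both
# derived from raw sequences, and the final report needs both.
pipeline = {
    "report": ["alignments", "annotations"],
    "alignments": ["sequences"],
    "annotations": ["sequences"],
    "sequences": [],
}
```

Declaring the graph explicitly, rather than hard-wiring tool invocation order, is exactly the metadata-and-dependencies perspective the paper advocates.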

  7. Computational nuclear quantum many-body problem: The UNEDF project

    NASA Astrophysics Data System (ADS)

    Bogner, S.; Bulgac, A.; Carlson, J.; Engel, J.; Fann, G.; Furnstahl, R. J.; Gandolfi, S.; Hagen, G.; Horoi, M.; Johnson, C.; Kortelainen, M.; Lusk, E.; Maris, P.; Nam, H.; Navratil, P.; Nazarewicz, W.; Ng, E.; Nobre, G. P. A.; Ormand, E.; Papenbrock, T.; Pei, J.; Pieper, S. C.; Quaglioni, S.; Roche, K. J.; Sarich, J.; Schunck, N.; Sosonkina, M.; Terasaki, J.; Thompson, I.; Vary, J. P.; Wild, S. M.

    2013-10-01

    The UNEDF project was a large-scale collaborative effort that applied high-performance computing to the nuclear quantum many-body problem. The primary focus of the project was on constructing, validating, and applying an optimized nuclear energy density functional, which entailed a wide range of pioneering developments in microscopic nuclear structure and reactions, algorithms, high-performance computing, and uncertainty quantification. UNEDF demonstrated that close associations among nuclear physicists, mathematicians, and computer scientists can lead to novel physics outcomes built on algorithmic innovations and computational developments. This review showcases a wide range of UNEDF science results to illustrate this interplay.

  8. OPERATION SUNBEAM, SHOT SMALL BOY. Project Officer’s Report. Project 2.9. Fallout Collection and Gross Sample Analysis

    DTIC Science & Technology

    1985-09-01

    concepts for large-scale radiological defense operations. This project participated in the single shot, Small Boy, fired on a ...inch diameter by 3 inch high NaI scintillation counter taking sample tubes up to 1-1/4 inches in diameter. The crystal was mounted on an W type ...inch high NaI scintillation counter. The crystal-source distance was 8-3/4 inches. Shielding consisted of a lead cylinder 2 inches thick, 7 inches

  9. Translation of SNOMED CT - strategies and description of a pilot project.

    PubMed

    Klein, Gunnar O; Chen, Rong

    2009-01-01

    The translation and localization of SNOMED CT (Systematized Nomenclature of Medicine - Clinical Terms) have been initiated in a few countries. In Sweden, we conducted the first evaluation of this terminology in a project called REFTERM, in which we also developed a software tool that could handle large-scale translation with a number of translators and reviewers in a web-based environment. The system makes use of existing authorized English-Swedish translations of medical terminologies such as ICD-10. The paper discusses possible strategies for a national project to translate and adapt this terminology.

  10. Inferring cortical function in the mouse visual system through large-scale systems neuroscience.

    PubMed

    Hawrylycz, Michael; Anastassiou, Costas; Arkhipov, Anton; Berg, Jim; Buice, Michael; Cain, Nicholas; Gouwens, Nathan W; Gratiy, Sergey; Iyer, Ramakrishnan; Lee, Jung Hoon; Mihalas, Stefan; Mitelut, Catalin; Olsen, Shawn; Reid, R Clay; Teeter, Corinne; de Vries, Saskia; Waters, Jack; Zeng, Hongkui; Koch, Christof

    2016-07-05

    The scientific mission of Project MindScope is to understand neocortex, the part of the mammalian brain that gives rise to perception, memory, intelligence, and consciousness. We seek to quantitatively evaluate the hypothesis that neocortex is a relatively homogeneous tissue, with smaller functional modules that perform a common computational function replicated across regions. We here focus on the mouse as a mammalian model organism with genetics, physiology, and behavior that can be readily studied and manipulated in the laboratory. We seek to describe the operation of cortical circuitry at the computational level by comprehensively cataloging and characterizing its cellular building blocks along with their dynamics and their cell type-specific connectivities. The project is also building large-scale experimental platforms (i.e., brain observatories) to record the activity of large populations of cortical neurons in behaving mice subjected to visual stimuli. A primary goal is to understand the series of operations from visual input in the retina to behavior by observing and modeling the physical transformations of signals in the corticothalamic system. We here focus on the contribution that computer modeling and theory make to this long-term effort.

  11. Using real options to evaluate the flexibility in the deployment of SMR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Locatelli, G.; Mancini, M.; Ruiz, F.

    2012-07-01

    According to recent estimates, the financial gap between Large Reactors (LRs) and Small Medium Reactors (SMRs) is not as large as the economy of scale would suggest, so SMRs are likely to be important players in the worldwide nuclear renaissance. POLIMI's INCAS model has been developed to compare investment in SMRs with investment in LRs. It provides the IRR (Internal Rate of Return), NPV (Net Present Value), LUEC (Levelized Unitary Electricity Cost), up-front investment, etc. The aim of this research is to integrate the current INCAS model, based on discounted cash flows, with real option theory to measure the investor's flexibility to expand, defer, or abandon a nuclear project under future uncertainties. The work compares the investment in a large nuclear power plant with a series of smaller, modular nuclear power plants on the same site. It weighs the benefits of the large plant, arising from the economy of scale, against the benefit of the modular project (flexibility), concluding that managerial flexibility can be measured and used by an investor to face investment risks.
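    The discounted-cash-flow comparison underlying this record can be sketched as follows. This is not the INCAS model itself: all capital costs, revenues, build times, and the discount rate are hypothetical placeholders chosen only to show the shape of the calculation (one large up-front outlay versus staggered modular deployments whose later units could be deferred or abandoned, the flexibility that real-options theory prices).

```python
# Toy NPV comparison: one large plant vs. a staggered series of small
# modular units on the same site. All figures are invented placeholders.

def npv(cash_flows, rate):
    """Discount a list of (year, cash_flow) pairs to present value."""
    return sum(cf / (1 + rate) ** year for year, cf in cash_flows)

def plant_cash_flows(capex, annual_revenue, build_years, life, start=0):
    """Level construction outlays followed by level operating revenue."""
    flows = [(start + y, -capex / build_years) for y in range(build_years)]
    flows += [(start + build_years + y, annual_revenue) for y in range(life)]
    return flows

rate = 0.08
# One large reactor: big up-front cost, economy-of-scale revenue.
large = plant_cash_flows(capex=4000, annual_revenue=450, build_years=5, life=40)
# Four small modules deployed sequentially; each later module could be
# deferred or abandoned under a real-options treatment.
small = []
for i in range(4):
    small += plant_cash_flows(capex=1200, annual_revenue=115,
                              build_years=3, life=40, start=3 * i)

print(round(npv(large, rate), 1))
print(round(npv(small, rate), 1))
```

    A real-options extension would then add the value of the deferral/abandonment options for the modular case on top of its plain NPV.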

  12. The Dockstore: enabling modular, community-focused sharing of Docker-based genomics tools and workflows

    PubMed Central

    O'Connor, Brian D.; Yuen, Denis; Chung, Vincent; Duncan, Andrew G.; Liu, Xiang Kun; Patricia, Janice; Paten, Benedict; Stein, Lincoln; Ferretti, Vincent

    2017-01-01

    As genomic datasets continue to grow, the feasibility of downloading data to a local organization and running analysis on a traditional compute environment is becoming increasingly problematic. Current large-scale projects, such as the ICGC PanCancer Analysis of Whole Genomes (PCAWG), the Data Platform for the U.S. Precision Medicine Initiative, and the NIH Big Data to Knowledge Center for Translational Genomics, are using cloud-based infrastructure to both host and perform analysis across large data sets. In PCAWG, over 5,800 whole human genomes were aligned and variant called across 14 cloud and HPC environments; the processed data was then made available on the cloud for further analysis and sharing. If run locally, an operation at this scale would have monopolized a typical academic data centre for many months, and would have presented major challenges for data storage and distribution. However, this scale is increasingly typical for genomics projects and necessitates a rethink of how analytical tools are packaged and moved to the data. For PCAWG, we embraced the use of highly portable Docker images for encapsulating and sharing complex alignment and variant calling workflows across highly variable environments. While successful, this endeavor revealed a limitation in Docker containers, namely the lack of a standardized way to describe and execute the tools encapsulated inside the container. As a result, we created the Dockstore (https://dockstore.org), a project that brings together Docker images with standardized, machine-readable ways of describing and running the tools contained within. This service greatly improves the sharing and reuse of genomics tools and promotes interoperability with similar projects through emerging web service standards developed by the Global Alliance for Genomics and Health (GA4GH). PMID:28344774
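    The core idea of pairing a Docker image with a machine-readable tool description can be sketched as below. This is a hypothetical illustration of the concept, not Dockstore's actual descriptor format (which builds on standards such as CWL and WDL); the image name, field names, and file paths are all invented.

```python
# Sketch: a machine-readable descriptor lets any environment construct the
# same container invocation for a tool, rather than relying on prose docs.
# Descriptor schema, image, and paths are invented for illustration.

descriptor = {
    "image": "example/bwa-aligner:1.0",      # hypothetical Docker image
    "baseCommand": ["bwa", "mem"],
    "inputs": [
        {"id": "reference", "type": "File"},
        {"id": "reads", "type": "File"},
    ],
}

def build_command(desc, bindings):
    """Turn a descriptor plus concrete input files into a docker run command."""
    args = [bindings[inp["id"]] for inp in desc["inputs"]]
    return (["docker", "run", "--rm", "-v", "/data:/data", desc["image"]]
            + desc["baseCommand"] + args)

cmd = build_command(descriptor, {"reference": "/data/ref.fa",
                                 "reads": "/data/reads.fq"})
print(" ".join(cmd))
```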

  14. Role of slope stability in cumulative impact assessment of hydropower development: North Cascades, Washington

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, R.R.; Staub, W.P.

    1993-08-01

    Two environmental assessments considered the potential cumulative environmental impacts of eight proposed hydropower projects in the Nooksack River Basin and 11 proposed projects in the Skagit River Basin, North Cascades, Washington. While not identified as a target resource, slope stability and the alteration of sediment supply to creeks and river mainstems significantly affect other resources. The slope stability assessment emphasized the potential for cumulative impacts under disturbed conditions (e.g., road construction and timber harvesting) and a landslide-induced pipeline rupture scenario. In the case of small-scale slides, the sluicing action of ruptured pipeline water on the fresh landslide scarp was found to be capable of eroding significantly more material than the original landslide. For large-scale landslides, sluiced material was found to be a small increment of the original landslide. These results predicted that hypothetical accidental pipeline rupture by small-scale landslides may result in potential cumulative impacts for 12 of the 19 projects with pending license applications in both river basins. 5 refs., 2 tabs.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Du, Qiang

    The rational design of materials, the development of accurate and efficient material simulation algorithms, and the determination of the response of materials to environments and loads occurring in practice all require an understanding of mechanics at disparate spatial and temporal scales. The project addresses mathematical and numerical analyses for material problems for which relevant scales range from those usually treated by molecular dynamics all the way up to those most often treated by classical elasticity. The prevalent approach towards developing a multiscale material model couples two or more well known models, e.g., molecular dynamics and classical elasticity, each of which is useful at a different scale, creating a multiscale multi-model. However, the challenges behind such a coupling are formidable and largely arise because the atomistic and continuum models employ nonlocal and local models of force, respectively. The project focuses on a multiscale analysis of the peridynamics materials model. Peridynamics can be used as a transition between molecular dynamics and classical elasticity so that the difficulties encountered when directly coupling those two models are mitigated. In addition, in some situations, peridynamics can be used all by itself as a material model that accurately and efficiently captures the behavior of materials over a wide range of spatial and temporal scales. Peridynamics is well suited to these purposes because it employs a nonlocal model of force, analogous to that of molecular dynamics; furthermore, at sufficiently large length scales and assuming smooth deformation, peridynamics can be approximated by classical elasticity. The project will extend the emerging mathematical and numerical analysis of peridynamics. One goal is to develop a peridynamics-enabled multiscale multi-model that potentially provides a new and more extensive mathematical basis for coupling classical elasticity and molecular dynamics, thus enabling next-generation atomistic-to-continuum multiscale simulations. In addition, a rigorous study of finite element discretizations of peridynamics will be considered. Using the fact that peridynamics is spatially derivative free, we will also characterize the space of admissible peridynamic solutions and carry out systematic analyses of the models, in particular rigorously showing how peridynamics encompasses fracture and other failure phenomena. Additional aspects of the project include the mathematical and numerical analysis of stochastic peridynamics models. In summary, the project will make feasible mathematically consistent multiscale models for the analysis and design of advanced materials.
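    The nonlocal force model that distinguishes peridynamics from classical elasticity can be illustrated with a minimal 1-D bond-based sketch: each node interacts with every neighbor within a finite horizon, not just its nearest ones. This is a toy illustration only; the micromodulus, horizon, and discretization are arbitrary values, not the project's actual formulation.

```python
# Toy 1-D bond-based peridynamics: internal force at each node is summed
# over all bonds within a finite horizon (the nonlocal model of force).
# Constants are illustrative placeholders.

def peridynamic_forces(x, u, horizon, c, dV):
    """Internal force density at each node from pairwise bond stretches."""
    n = len(x)
    f = [0.0] * n
    for i in range(n):
        for j in range(n):
            xi = x[j] - x[i]                  # reference bond vector
            if i == j or abs(xi) > horizon:
                continue
            eta = u[j] - u[i]                 # relative displacement
            stretch = (abs(xi + eta) - abs(xi)) / abs(xi)
            direction = 1.0 if (xi + eta) > 0 else -1.0
            f[i] += c * stretch * direction * dV
    return f

# Homogeneous 1% stretch of a small bar: bond forces on an interior node
# cancel by symmetry, while end nodes feel a net restoring force.
dx = 0.1
x = [i * dx for i in range(11)]
u = [0.01 * xi for xi in x]
f = peridynamic_forces(x, u, horizon=3 * dx, c=1.0e3, dV=dx)
print(f[0], f[5])
```

    At large horizons relative to feature size the model behaves like molecular-dynamics-style pair interactions; as the horizon shrinks under smooth deformation, the response approaches local (classical) elasticity, which is the transition role the abstract describes.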

  16. Predictors of Sustainability of Social Programs

    ERIC Educational Resources Information Center

    Savaya, Riki; Spiro, Shimon E.

    2012-01-01

    This article presents the findings of a large scale study that tested a comprehensive model of predictors of three manifestations of sustainability: continuation, institutionalization, and duration. Based on the literature the predictors were arrayed in four groups: variables pertaining to the project, the auspice organization, the community, and…

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lenee-Bluhm, P.; Rhinefrank, Ken

    The overarching project objective is to demonstrate the feasibility of using an innovative Power Take-Off (PTO) Module in Columbia Power's utility-scale wave energy converter (WEC). The PTO Module uniquely combines a large-diameter, direct-drive, rotary permanent-magnet generator; a patent-pending rail-bearing system; and a corrosion-resistant, fiber-reinforced-plastic structure.

  18. Progress toward a low budget reference grade genome assembly

    USDA-ARS?s Scientific Manuscript database

    Reference quality de novo genome assemblies were once solely the domain of large, well-funded genome projects. While next-generation short read technology removed some of the cost barriers, accurate chromosome-scale assembly remains a real challenge. Here we present efforts to de novo assemble the...

  19. Encouraging Gender Analysis in Research Practice

    ERIC Educational Resources Information Center

    Thien, Deborah

    2009-01-01

    Few resources for practical teaching or fieldwork exercises exist which address gender in geographical contexts. This paper adds to teaching and fieldwork resources by describing an experience with designing and implementing a "gender intervention" for a large-scale, multi-university, bilingual research project that brought together a group of…

  20. Large Wind Energy Converter: Growian 3 MW

    NASA Technical Reports Server (NTRS)

    Feustel, J. E.; Helm, S.; Koerber, F.

    1980-01-01

    The final report on the projected application of a large-scale wind turbine on the northern German coast is summarized. The designs of the tower, machinery housing, rotor, and rotor blades are described, and the various construction materials are examined. Rotor blade adjustment devices and auxiliary and accessory equipment are also covered.
