Sample records for large-scale demonstration projects

  1. Theme II Joint Work Plan - 2017 Collaboration and Knowledge Sharing on Large-scale Demonstration Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiaoliang; Stauffer, Philip H.

    This effort is designed to expedite learning from existing and planned large demonstration projects and their associated research through effective knowledge sharing among participants in the US and China.

  2. Output Control Technologies for a Large-scale PV System Considering Impacts on a Power Grid

    NASA Astrophysics Data System (ADS)

    Kuwayama, Akira

    The mega-solar demonstration project named “Verification of Grid Stabilization with Large-scale PV Power Generation Systems” was completed in March 2011 at Wakkanai, the northernmost city of Japan. The major objectives of this project were to evaluate the adverse impacts of large-scale PV power generation systems connected to the power grid and to develop output control technologies with an integrated battery storage system. This paper describes the outline and results of this project. The results show the effectiveness of the battery storage system and of the proposed output control methods for a large-scale PV system in ensuring stable operation of power grids. NEDO (New Energy and Industrial Technology Development Organization of Japan) conducted this project, and HEPCO (Hokkaido Electric Power Co., Inc.) managed the overall project.
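    The smoothing role of the battery described above can be illustrated with a minimal sketch: the battery charges or discharges to cover the gap between raw PV output and a moving-average target, so the grid sees the smoothed profile. This is a generic illustration under assumed names and numbers, not the NEDO project's actual control law.

    ```python
    # Hypothetical sketch of battery-based PV output smoothing.
    # The battery absorbs the difference between raw PV output and a
    # moving-average target; positive battery_kw means discharge.
    # Not the project's actual algorithm -- an illustrative assumption.

    def smooth_pv_output(pv_kw, window=3):
        """Return (grid_kw, battery_kw); battery_kw > 0 means discharge."""
        grid, battery = [], []
        for i in range(len(pv_kw)):
            recent = pv_kw[max(0, i - window + 1): i + 1]
            target = sum(recent) / len(recent)   # moving-average target
            grid.append(target)
            battery.append(target - pv_kw[i])    # battery fills the gap
        return grid, battery

    # A cloud transient: PV output dips sharply for one time step.
    pv = [100.0, 100.0, 40.0, 100.0, 100.0]
    grid, batt = smooth_pv_output(pv)
    ```

    With these made-up numbers the 60 kW dip seen by the grid shrinks to 20 kW, with the battery discharging 40 kW at the dip.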

  3. Project BALLOTS: Bibliographic Automation of Large Library Operations Using a Time-Sharing System. Progress Report (3/27/69 - 6/26/69).

    ERIC Educational Resources Information Center

    Veaner, Allen B.

    Project BALLOTS is a large-scale library automation development project of the Stanford University Libraries which has demonstrated the feasibility of conducting on-line interactive searches of complex bibliographic files, with a large number of users working simultaneously in the same or different files. This report documents the continuing…

  4. Newly invented biobased materials from low-carbon, diverted waste fibers: research methods, testing, and full-scale application in a case study structure

    Treesearch

    Julee A Herdt; John Hunt; Kellen Schauermann

    2016-01-01

    This project demonstrates newly invented, biobased construction materials developed by applying low-carbon biomass waste sources through the authors’ engineered fiber processes and technology. If manufactured and applied at large scale, the project inventions can divert large volumes of cellulose waste into high-performance, low-embodied-energy, environmental construction...

  5. Talking About The Smokes: a large-scale, community-based participatory research project.

    PubMed

    Couzos, Sophia; Nicholson, Anna K; Hunt, Jennifer M; Davey, Maureen E; May, Josephine K; Bennet, Pele T; Westphal, Darren W; Thomas, David P

    2015-06-01

    To describe the Talking About The Smokes (TATS) project according to the World Health Organization guiding principles for conducting community-based participatory research (PR) involving indigenous peoples, to assist others planning large-scale PR projects. The TATS project was initiated in Australia in 2010 as part of the International Tobacco Control Policy Evaluation Project, and surveyed a representative sample of 2522 Aboriginal and Torres Strait Islander adults to assess the impact of tobacco control policies. The PR process of the TATS project, which aimed to build partnerships to create equitable conditions for knowledge production, was mapped and summarised onto a framework adapted from the WHO principles. The processes described cover consultation and approval, partnerships and research agreements, communication, funding, ethics and consent, data, and the benefits of the research. The TATS project involved baseline and follow-up surveys conducted in 34 Aboriginal community-controlled health services and one Torres Strait community. Consistent with the WHO PR principles, the TATS project built on community priorities and strengths through strategic partnerships from project inception, and demonstrated the value of research agreements and trusting relationships in fostering shared decision making, capacity building and a commitment to Indigenous data ownership. Community-based PR methodology, by definition, needs adaptation to local settings and priorities. The TATS project demonstrates that large-scale research can be participatory, with strong Indigenous community engagement and benefits.

  6. Multimode resource-constrained multiple project scheduling problem under fuzzy random environment and its application to a large scale hydropower construction project.

    PubMed

    Xu, Jiuping; Feng, Cuiying

    2014-01-01

    This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method.
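    The defuzzification step described above can be illustrated with a small sketch. One common textbook form of the expected value operator for a triangular fuzzy variable (r1, r2, r3) with an optimistic-pessimistic index is shown below; this generic form is an assumption for illustration, not code or notation taken from the paper.

    ```python
    # Hedged sketch: expected value of a triangular fuzzy variable
    # (r1, r2, r3) under an optimistic-pessimistic index lam.
    # lam = 1 is fully optimistic, lam = 0 fully pessimistic;
    # lam = 0.5 gives the neutral value (r1 + 2*r2 + r3) / 4.
    # A common form in the fuzzy-scheduling literature, assumed here.

    def expected_value(r1, r2, r3, lam=0.5):
        return (1 - lam) * (r1 + r2) / 2 + lam * (r2 + r3) / 2

    # A fuzzy activity duration "about 10 days", between 8 and 14:
    dur_neutral = expected_value(8, 10, 14)        # neutral decision maker
    dur_optimistic = expected_value(8, 10, 14, 1.0)
    ```

    Once each fuzzy parameter is reduced to such a crisp value, the scheduling model can be handled by the hybrid particle swarm optimization described in the abstract.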

  7. Multimode Resource-Constrained Multiple Project Scheduling Problem under Fuzzy Random Environment and Its Application to a Large Scale Hydropower Construction Project

    PubMed Central

    Xu, Jiuping

    2014-01-01

    This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method. PMID:24550708

  8. Composites for Exploration Upper Stage

    NASA Technical Reports Server (NTRS)

    Fikes, J. C.; Jackson, J. R.; Richardson, S. W.; Thomas, A. D.; Mann, T. O.; Miller, S. G.

    2016-01-01

    The Composites for Exploration Upper Stage (CEUS) was a 3-year, level III project within the Technology Demonstration Missions program of the NASA Space Technology Mission Directorate. Studies have shown that composites provide important programmatic enhancements, including reduced weight to increase capability and accelerated expansion of exploration and science mission objectives. The CEUS project was focused on technologies that best advanced innovation, infusion, and broad applications for the inclusion of composites on future large human-rated launch vehicles and spacecraft. The benefits included near- and far-term opportunities for infusion (NASA, industry/commercial, Department of Defense), demonstrated critical technologies and technically implementable evolvable innovations, and sustained Agency experience. The initial scope of the project was to advance technologies for large composite structures applicable to the Space Launch System (SLS) Exploration Upper Stage (EUS) by focusing on the affordability and technical performance of the EUS forward and aft skirts. The project was tasked to develop and demonstrate critical composite technologies with a focus on full-scale materials, design, manufacturing, and test using NASA in-house capabilities. This would have demonstrated a major advancement in confidence and matured the large-scale composite technology to a Technology Readiness Level 6. This project would, therefore, have bridged the gap for providing composite application to SLS upgrades, enabling future exploration missions.

  9. Fabrication of the HIAD Large-Scale Demonstration Assembly and Upcoming Mission Applications

    NASA Technical Reports Server (NTRS)

    Swanson, G. T.; Johnson, R. K.; Hughes, S. J.; DiNonno, J. M.; Cheatwood, F. M.

    2017-01-01

    Over a decade of work has been conducted in the development of NASA's Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology. This effort has included multiple ground test campaigns and flight tests, culminating in the HIAD project's second-generation (Gen-2) deployable aeroshell system and associated analytical tools. NASA's HIAD project team has developed, fabricated, and tested inflatable structures (IS) integrated with a flexible thermal protection system (F-TPS), ranging in diameter from 3 to 6 m, with cone angles of 60 and 70 deg. In 2015, United Launch Alliance (ULA) announced that it will use a HIAD (10-12 m) as part of its Sensible, Modular, Autonomous Return Technology (SMART) for its upcoming Vulcan rocket. ULA expects that SMART reusability, coupled with other advancements for Vulcan, will substantially reduce the cost of access to space. The first booster engine recovery via HIAD is scheduled for 2024. To meet this near-term need, as well as future NASA applications, the HIAD team is investigating taking the technology to the 10-15 m diameter scale. In the last year, many significant development and fabrication efforts have been accomplished, culminating in the construction of a large-scale inflatable structure demonstration assembly. This assembly incorporated the first three tori for a 12 m Mars Human-Scale Pathfinder HIAD conceptual design, constructed with the current state-of-the-art material set. Numerous design trades and torus fabrication demonstrations preceded this effort. In 2016, three large-scale tori (0.61 m cross-section) and six subscale tori (0.25 m cross-section) were manufactured to demonstrate fabrication techniques using the newest candidate material sets. These tori were tested to evaluate durability and load capacity. This work led to the selection of the inflatable structure's third-generation (Gen-3) structural liner.
    In late 2016, the three tori required for the large-scale demonstration assembly were fabricated, and they were integrated in early 2017. The design includes provisions to add the remaining four tori necessary to complete the assembly of the 12 m Human-Scale Pathfinder HIAD in the event that future project funding becomes available. This presentation will discuss the HIAD large-scale demonstration assembly design and fabrication performed in the last year, including the precursor tori development and the partial-stack fabrication. Potential near-term and future 10-15 m HIAD applications will also be discussed.

  10. Fabrication of the HIAD Large-Scale Demonstration Assembly

    NASA Technical Reports Server (NTRS)

    Swanson, G. T.; Johnson, R. K.; Hughes, S. J.; DiNonno, J. M.; Cheatwood, F. M.

    2017-01-01

    Over a decade of work has been conducted in the development of NASA's Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology. This effort has included multiple ground test campaigns and flight tests, culminating in the HIAD project's second-generation (Gen-2) deployable aeroshell system and associated analytical tools. NASA's HIAD project team has developed, fabricated, and tested inflatable structures (IS) integrated with a flexible thermal protection system (F-TPS), ranging in diameter from 3 to 6 m, with cone angles of 60 and 70 deg. In 2015, United Launch Alliance (ULA) announced that it will use a HIAD (10-12 m) as part of its Sensible, Modular, Autonomous Return Technology (SMART) for its upcoming Vulcan rocket. ULA expects that SMART reusability, coupled with other advancements for Vulcan, will substantially reduce the cost of access to space. The first booster engine recovery via HIAD is scheduled for 2024. To meet this near-term need, as well as future NASA applications, the HIAD team is investigating taking the technology to the 10-15 m diameter scale. In the last year, many significant development and fabrication efforts have been accomplished, culminating in the construction of a large-scale inflatable structure demonstration assembly. This assembly incorporated the first three tori for a 12 m Mars Human-Scale Pathfinder HIAD conceptual design, constructed with the current state-of-the-art material set. Numerous design trades and torus fabrication demonstrations preceded this effort. In 2016, three large-scale tori (0.61 m cross-section) and six subscale tori (0.25 m cross-section) were manufactured to demonstrate fabrication techniques using the newest candidate material sets. These tori were tested to evaluate durability and load capacity. This work led to the selection of the inflatable structure's third-generation (Gen-3) structural liner.
    In late 2016, the three tori required for the large-scale demonstration assembly were fabricated, and they were integrated in early 2017. The design includes provisions to add the remaining four tori necessary to complete the assembly of the 12 m Human-Scale Pathfinder HIAD in the event that future project funding becomes available. This presentation will discuss the HIAD large-scale demonstration assembly design and fabrication performed in the last year, including the precursor tori development and the partial-stack fabrication. Potential near-term and future 10-15 m HIAD applications will also be discussed.

  11. Large-scale neuromorphic computing systems

    NASA Astrophysics Data System (ADS)

    Furber, Steve

    2016-10-01

    Neuromorphic computing covers a diverse range of approaches to information processing, all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering, then focus on some of the principal current large-scale projects: their main features, how their approaches are complementary and distinct, their advantages and drawbacks, and the sorts of capabilities that each can deliver to neural modellers.

  12. NREL, California Independent System Operator, and First Solar

    Science.gov Websites

    NREL, the California Independent System Operator (CAISO), and First Solar conducted a demonstration project on a large utility-scale photovoltaic (PV) power plant in California to demonstrate essential reliability services with utility-scale solar.

  13. The EMCC / DARPA Massively Parallel Electromagnetic Scattering Project

    NASA Technical Reports Server (NTRS)

    Woo, Alex C.; Hill, Kueichien C.

    1996-01-01

    The Electromagnetic Code Consortium (EMCC) was sponsored by the Advanced Research Projects Agency (ARPA) to demonstrate the effectiveness of massively parallel computing in large-scale radar signature predictions. The EMCC/ARPA project consisted of three parts.

  14. Comparison of WinSLAMM Modeled Results with Monitored Biofiltration Data

    EPA Science Inventory

    The US EPA’s Green Infrastructure Demonstration project in Kansas City incorporates small-scale individual biofiltration device monitoring along with large-scale watershed monitoring. The test watershed (100 acres) is saturated with green infrastructure components (includin...

  15. Trenton Free-Fare Demonstration Project

    DOT National Transportation Integrated Search

    1978-12-01

    The "Trenton Free-Fare Demonstration" is the first large-scale test of free transit in the U.S. The New Jersey Department of Transportation, in cooperation with UMTA, Mercer County, and Mercer County Improvement Authority, is administering an Off-Pea...

  16. Demonstrating a new framework for the comparison of environmental impacts from small- and large-scale hydropower and wind power projects.

    PubMed

    Bakken, Tor Haakon; Aase, Anne Guri; Hagen, Dagmar; Sundt, Håkon; Barton, David N; Lujala, Päivi

    2014-07-01

    Climate change and the needed reductions in the use of fossil fuels call for the development of renewable energy sources. However, renewable energy production, such as hydropower (both small- and large-scale) and wind power, has adverse impacts on the local environment by causing reductions in biodiversity and loss of habitats and species. This paper compares the environmental impacts of many small-scale hydropower plants with a few large-scale hydropower projects and one wind power farm, based on the same set of environmental parameters: land occupation, reduction in wilderness areas (INON), visibility, and impacts on red-listed species. Our basis for comparison was similar energy volumes produced, without considering the quality of the energy services provided. The results show that small-scale hydropower performs less favourably on all parameters except land occupation. The land occupation of large hydropower and wind power is in the range of 45-50 m²/MWh, which is more than two times larger than that of small-scale hydropower; the large land occupation for large hydropower is explained by the extent of the reservoirs. On all three other parameters, small-scale hydropower performs more than two times worse than both large hydropower and wind power. Wind power compares similarly to large-scale hydropower regarding land occupation, much better on the reduction in INON areas, and in the same range regarding red-listed species. Our results demonstrate that the selected four parameters provide a basis for further development of a fair and consistent comparison of impacts between the analysed renewable technologies. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
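    The land-occupation metric used above is simple arithmetic: plant footprint divided by annual energy production. A minimal sketch with hypothetical plant figures (the numbers below are made up for illustration, not the paper's data; only the 45-50 m²/MWh range quoted above comes from the paper):

    ```python
    # Land occupation in m^2 per MWh: footprint / annual generation.
    # The plant figures below are hypothetical, chosen only to land
    # inside the 45-50 m^2/MWh range the paper reports for large
    # hydropower and wind.

    def land_occupation_m2_per_mwh(area_m2, annual_mwh):
        return area_m2 / annual_mwh

    # Hypothetical large hydro reservoir: 9 km^2 serving 200 GWh/yr.
    hydro = land_occupation_m2_per_mwh(9_000_000, 200_000)
    ```

    With these assumed figures the result is 45 m²/MWh, illustrating how reservoir extent dominates the footprint of large hydropower.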

  17. Internationalization Measures in Large Scale Research Projects

    NASA Astrophysics Data System (ADS)

    Soeding, Emanuel; Smith, Nancy

    2017-04-01

    Large-scale research projects (LSRPs) often serve as flagships used by universities or research institutions to demonstrate their performance and capability to stakeholders and other interested parties. As global competition among universities for the recruitment of the brightest minds has increased, effective internationalization measures have become a hot topic for universities and LSRPs alike. Nevertheless, most projects and universities have little experience in conducting these measures and making internationalization a cost-efficient and useful activity. Furthermore, such undertakings must continually be justified to the project PIs as important, valuable tools for improving the capacity of the project and the research location. There is a variety of measures suited to supporting universities in international recruitment. These include, for example, institutional partnerships, research marketing, a welcome culture, support for science mobility, and an effective alumni strategy. These activities, although often conducted by different university entities, are interlocked and can be very powerful if interfaced effectively. On this poster we display a number of internationalization measures for various target groups and identify interfaces where project management, university administration, researchers, and international partners can work together, exchange information, and improve processes in order to recruit, support, and retain the brightest minds for the project.

  18. Large scale multiprocessor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gajski, D.; Kuck, D.; Lawrie, D.

    1983-03-01

    The primary goal of the Cedar project is to demonstrate that supercomputers of the future can exhibit general-purpose behavior and be easy to use. The Cedar project is based on five key developments which have reached fruition in the past year and which, taken together, offer a comprehensive solution to these problems. The authors look at this project and how its goals are being met.

  19. Municipal Sludge Application in Forests of Northern Michigan: a Case Study.

    Treesearch

    D.G. Brockway; P.V. Nguyen

    1986-01-01

    A large-scale operational demonstration and research project was cooperatively established by the U.S. Environmental Protection Agency, Michigan Department of Natural Resources, and Michigan State University to evaluate the practice of forest land application as an option for sludge utilization. Project objectives included completing (1) a logistic and economic...

  20. Experimental Demonstration of Technologies for Autonomous On-Orbit Robotic Assembly

    NASA Technical Reports Server (NTRS)

    LeMaster, Edward A.; Schaechter, David B.; Carrington, Connie K.

    2006-01-01

    The Modular Reconfigurable High Energy (MRHE) program aimed to develop technologies for the automated assembly and deployment of large-scale space structures and aggregate spacecraft. Part of the project involved creation of a terrestrial robotic testbed for validation and demonstration of these technologies and for the support of future development activities. This testbed was completed in 2005, and was thereafter used to demonstrate automated rendezvous, docking, and self-assembly tasks between a group of three modular robotic spacecraft emulators. This paper discusses the rationale for the MRHE project, describes the testbed capabilities, and presents the MRHE assembly demonstration sequence.

  1. Meta-Analyzing a Complex Correlational Dataset: A Case Study Using Correlations That Measure the Relationship between Parental Involvement and Academic Achievement

    ERIC Educational Resources Information Center

    Polanin, Joshua R.; Wilson, Sandra Jo

    2014-01-01

    The purpose of this project is to demonstrate the practical methods developed to utilize a dataset consisting of both multivariate and multilevel effect size data. The context for this project is a large-scale meta-analytic review of the predictors of academic achievement. This project is guided by three primary research questions: (1) How do we…

  2. Managing Risk and Uncertainty in Large-Scale University Research Projects

    ERIC Educational Resources Information Center

    Moore, Sharlissa; Shangraw, R. F., Jr.

    2011-01-01

    Both publicly and privately funded research projects managed by universities are growing in size and scope. Complex, large-scale projects (over $50 million) pose new management challenges and risks for universities. This paper explores the relationship between project success and a variety of factors in large-scale university projects. First, we…

  3. Large-Scale Campus Computer Technology Implementation: Lessons from the First Year.

    ERIC Educational Resources Information Center

    Nichols, Todd; Frazer, Linda H.

    The purpose of the Elementary Technology Demonstration Schools (ETDS) Project, funded by IBM and Apple, Inc., was to demonstrate the effectiveness of technology in accelerating the learning of low achieving at-risk students and enhancing the education of high achieving students. The paper begins by giving background information on the district,…

  4. Desalination: Status and Federal Issues

    DTIC Science & Technology

    2009-12-30

    ...on one side and lets purified water through. Reverse osmosis plants have fewer problems with corrosion and usually have lower energy requirements... (Texas) and cities are actively researching and investigating the feasibility of large-scale desalination plants for municipal water supplies... desalination research and development, and in construction and operational costs of desalination demonstration projects and full-scale plants...

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hite, Roger

    The project site is located in Livingston Parish, Louisiana, approximately 26 miles due east of Baton Rouge. This project proposed to evaluate an early Eocene-aged Wilcox oil reservoir for permanent storage of CO2. Blackhorse Energy, LLC planned to conduct a parallel CO2 oil recovery project in the First Wilcox Sand. The primary focus of this project was to examine and prove the suitability of South Louisiana geologic formations for large-scale geologic sequestration of CO2 in association with enhanced oil recovery applications. This was to be accomplished through the focused demonstration of small-scale, permanent storage of CO2 in the First Wilcox Sand. The project was terminated at the request of Blackhorse Energy LLC on October 22, 2014.

  6. EVALUATION PLAN FOR TWO LARGE-SCALE LANDFILL BIOREACTOR TECHNOLOGIES

    EPA Science Inventory

    Abstract - Waste Management, Inc., is operating two long-term bioreactor studies at the Outer Loop Landfill in Louisville, KY, including facultative landfill bioreactor and staged aerobic-anaerobic landfill bioreactor demonstrations. A Quality Assurance Project Plan (QAPP) was p...

  7. Large Scale Evaluation of Nickel Aluminide Rolls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2005-09-01

    This completed project was a joint effort between Oak Ridge National Laboratory and Bethlehem Steel (now Mittal Steel) to demonstrate the effectiveness of using nickel aluminide intermetallic alloy rolls as part of an updated, energy-efficient, commercial annealing furnace system.

  8. Existing Whole-House Solutions Case Study: Pilot Demonstration of Phased Retrofits in Florida Homes - Central and South Florida Homes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2014-08-01

    In this pilot project, the Building America Partnership for Improved Residential Construction and Florida Power and Light are collaborating to retrofit a large number of homes using a phased approach to both simple and deep retrofits. This project will provide the information necessary to significantly reduce energy use through larger community-scale projects in collaboration with utilities, program administrators and other market leader stakeholders.

  9. FutureGen 2.0 Oxy-combustion Large Scale Test – Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenison, LaVesta; Flanigan, Thomas; Hagerty, Gregg

    The primary objectives of the FutureGen 2.0 CO2 Oxy-Combustion Large Scale Test Project were to site, permit, design, construct, and commission an oxy-combustion boiler, gas quality control system, air separation unit, and CO2 compression and purification unit, together with the necessary supporting and interconnection utilities. The project was to demonstrate at commercial scale (168 MWe gross) the capability to cleanly produce electricity through coal combustion at a retrofitted, existing coal-fired power plant, thereby resulting in near-zero emissions of all commonly regulated air emissions, as well as 90% CO2 capture in steady-state operations. The project was to be fully integrated in terms of project management, capacity, capabilities, technical scope, cost, and schedule with the companion FutureGen 2.0 CO2 Pipeline and Storage Project, a separate but complementary project whose objective was to safely transport, permanently store, and monitor the CO2 captured by the Oxy-Combustion Power Plant Project. The FutureGen 2.0 Oxy-Combustion Large Scale Test Project successfully achieved all technical objectives, inclusive of the front-end engineering and design, and the advanced design, required to accurately estimate and contract for the construction, commissioning, and start-up of a commercial-scale, "ready to build" power plant using oxy-combustion technology, including full integration with the companion CO2 Pipeline and Storage Project. Ultimately, the project did not proceed to construction due to insufficient time to complete the necessary EPC contract negotiations and commercial financing prior to the expiration of federal co-funding, which triggered a DOE decision to close out its participation in the project. Through the work that was completed, valuable technical, commercial, and programmatic lessons were learned.
    This project has significantly advanced the development of near-zero-emission technology and will be helpful in plotting the course of, and successfully executing, future large demonstration projects. This Final Scientific and Technical Report describes the technology and engineering basis of the project, inclusive of process systems, performance, effluents and emissions, and controls. Further, the project cost estimate, schedule, and permitting requirements are presented, along with a project risk and opportunity assessment. Lessons learned related to these elements are summarized in this report. Companion oxy-combustion reports further document the accomplishments and learnings of the project, including:
    A.01 Project Management Report, which describes what was done to coordinate the various participants and to track their performance with regard to schedule and budget.
    B.02 Lessons Learned - Technology Integration, Value Improvements, and Program Management, which describes the innovations and conclusions arrived at during the development of the project and makes recommendations for the improvement of future projects of a similar nature.
    B.03 Project Economics, which details the capital and operating costs and their basis, and also illustrates the cost of power produced by the plant with certain sensitivities.
    B.04 Power Plant, Pipeline, and Injection Site Interfaces, which details the interfaces between the two FutureGen projects.
    B.05 Contractual Mechanisms for Design, Construction, and Operation, which describes the major EPC and Operations Contracts required to execute the project.

  10. Progress of the Photovoltaic Technology Incubator Project Towards an Enhanced U.S. Manufacturing Base: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ullal, H.; Mitchell, R.; Keyes, B.

    In this paper, we report on the major accomplishments of the U.S. Department of Energy's (DOE) Solar Energy Technologies Program (SETP) Photovoltaic (PV) Technology Incubator project. The Incubator project facilitates a company's transition from developing a solar cell or PV module prototype to pilot- and large-scale U.S. manufacturing. The project targets small businesses that have demonstrated proof-of-concept devices or processes in the laboratory. Their success supports U.S. Secretary of Energy Steven Chu's SunShot Initiative, which seeks to achieve PV technologies that are cost-competitive without subsidies at large scale with fossil-based energy sources by the end of this decade. The Incubator Project has enhanced U.S. PV manufacturing capacity and created more than 1200 clean energy jobs, resulting in an increase in American economic competitiveness. The investment raised to date by these PV Incubator companies as a result of DOE's $59 million investment totals nearly $1.3 billion.

  11. Progress of the PV Technology Incubator Project Towards an Enhanced U.S. Manufacturing Base

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ullal, H.; Mitchell, R.; Keyes, B.

    In this paper, we report on the major accomplishments of the U.S. Department of Energy's (DOE) Solar Energy Technologies Program (SETP) Photovoltaic (PV) Technology Incubator project. The Incubator project facilitates a company's transition from developing a solar cell or PV module prototype to pilot- and large-scale U.S. manufacturing. The project targets small businesses that have demonstrated proof-of-concept devices or processes in the laboratory. Their success supports U.S. Secretary of Energy Steven Chu's SunShot Initiative, which seeks to achieve PV technologies that are cost-competitive without subsidies at large scale with fossil-based energy sources by the end of this decade. The Incubator Project has enhanced U.S. PV manufacturing capacity and created more than 1200 clean energy jobs, resulting in an increase in American economic competitiveness. The investment raised to date by these PV Incubator companies as a result of DOE's $59 million investment totals nearly $1.3 billion.

  12. Advanced Grid-Friendly Controls Demonstration Project for Utility-Scale PV Power Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gevorgian, Vahan; O'Neill, Barbara

    A typical photovoltaic (PV) power plant consists of multiple power electronic inverters and can contribute to grid stability and reliability through sophisticated 'grid-friendly' controls. The availability and dissemination of actual test data showing the viability of advanced utility-scale PV controls among all industry stakeholders can elevate PV from a simple energy resource to a provider of additional ancillary services that range from variability smoothing and frequency regulation to power quality. Strategically partnering with a selected utility and/or PV power plant operator is a key condition for a successful demonstration project. The U.S. Department of Energy's (DOE's) Solar Energy Technologies Office selected the National Renewable Energy Laboratory (NREL) to be a principal investigator in a two-year project with goals to (1) identify a potential partner(s), (2) develop a detailed scope of work and test plan for a field project to demonstrate the grid-friendly capabilities of utility-scale PV power plants, (3) facilitate conducting actual demonstration tests, and (4) disseminate test results among industry stakeholders via a joint NREL/DOE publication and participation in relevant technical conferences. The project implementation took place in FY 2014 and FY 2015. In FY14, NREL established collaborations with AES and First Solar Electric, LLC, to conduct demonstration testing on their utility-scale PV power plants in Puerto Rico and Texas, respectively, and developed test plans for each partner. Both the Puerto Rico Electric Power Authority and the Electric Reliability Council of Texas expressed interest in this project because of the importance of such advanced controls for the reliable operation of their power systems under high penetration levels of variable renewable generation. During FY15, testing was completed on both plants, and a large amount of test data was produced and analyzed that demonstrates the ability of PV power plants to provide various types of new grid-friendly controls.
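Variability smoothing, one of the services named in this abstract, is commonly implemented as a ramp-rate limit on plant output. A minimal sketch; the 5 MW/step limit and the power trace are illustrative assumptions, not project data:

```python
# Ramp-rate limiter: clamp each step change in plant output to +/- max ramp.
def limit_ramp(power_mw, max_ramp_mw_per_step):
    out = [power_mw[0]]
    for p in power_mw[1:]:
        step = max(-max_ramp_mw_per_step, min(max_ramp_mw_per_step, p - out[-1]))
        out.append(out[-1] + step)
    return out

# Hypothetical cloud transient on a PV feed, smoothed at 5 MW per step
smoothed = limit_ramp([20, 20, 5, 25, 25], 5)  # -> [20, 20, 15, 20, 25]
```

The sharp 15 MW drop and recovery are spread over several steps, which is the behavior grid operators look for from smoothing controls.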

  13. Demonstration-scale evaluation of a novel high-solids anaerobic digestion process for converting organic wastes to fuel gas and compost.

    PubMed

    Rivard, C J; Duff, B W; Dickow, J H; Wiles, C C; Nagle, N J; Gaddy, J L; Clausen, E C

    1998-01-01

    Early evaluations of the bioconversion potential for combined wastes such as tuna sludge and sorted municipal solid waste (MSW) were conducted at laboratory scale and compared conventional low-solids, stirred-tank anaerobic systems with the novel high-solids anaerobic digester (HSAD) design. Enhanced feedstock conversion rates and yields were determined for the HSAD system. In addition, the HSAD system demonstrated superior resiliency to process failure. Utilizing relatively dry feedstocks, the HSAD system is approximately one-tenth the size of conventional low-solids systems. In addition, the HSAD system is capable of organic loading rates (OLRs) on the order of 20-25 g volatile solids per liter digester volume per day (g VS/L/d), roughly 4-5 times those of conventional systems. Current efforts involve developing a demonstration-scale (pilot-scale) HSAD system. A two-ton/d plant has been constructed in Stanton, CA, and is currently in the commissioning/startup phase. The purposes of the project are to verify laboratory- and intermediate-scale process performance; test the performance of large-scale prototype mechanical systems; demonstrate the long-term reliability of the process; and generate the process and economic data required for the design, financing, and construction of full-scale commercial systems. This study presents confirmational fermentation data obtained at intermediate scale and a snapshot of the pilot-scale project.
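The cited loading rates translate directly into digester volume. A rough sizing sketch; the daily feed rate is an assumed example, while the OLR values and the 4-5x factor come from the abstract:

```python
# Digester volume needed for a given feed, at the OLRs cited in the abstract.
feed_vs_g_per_day = 1_000_000     # 1 t/d of volatile solids (assumed feed)
olr_hsad = 22.5                   # g VS/L/d, midpoint of the cited 20-25 range
olr_low_solids = olr_hsad / 4.5   # roughly 4-5x lower for conventional systems

# volume (L) = daily VS load (g/d) / OLR (g VS per L per day)
vol_hsad_L = feed_vs_g_per_day / olr_hsad
vol_conv_L = feed_vs_g_per_day / olr_low_solids
print(f"HSAD {vol_hsad_L/1000:.0f} m3 vs conventional {vol_conv_L/1000:.0f} m3")
```

The ~4.5x volume reduction from OLR alone compounds with the drier feedstock to give the roughly one-tenth footprint claimed in the abstract.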

  14. Carbon nanotube circuit integration up to sub-20 nm channel lengths.

    PubMed

    Shulaker, Max Marcel; Van Rethy, Jelle; Wu, Tony F; Liyanage, Luckshitha Suriyasena; Wei, Hai; Li, Zuanyi; Pop, Eric; Gielen, Georges; Wong, H-S Philip; Mitra, Subhasish

    2014-04-22

    Carbon nanotube (CNT) field-effect transistors (CNFETs) are a promising emerging technology projected to achieve over an order of magnitude improvement in energy-delay product, a metric of performance and energy efficiency, compared to silicon-based circuits. However, due to substantial imperfections inherent with CNTs, the promise of CNFETs has yet to be fully realized. Techniques to overcome these imperfections have yielded promising results, but thus far only at large technology nodes (1 μm device size). Here we demonstrate the first very large scale integration (VLSI)-compatible approach to realizing CNFET digital circuits at highly scaled technology nodes, with devices ranging from 90 nm to sub-20 nm channel lengths. We demonstrate inverters functioning at 1 MHz and a fully integrated CNFET infrared light sensor and interface circuit at 32 nm channel length. This demonstrates the feasibility of realizing more complex CNFET circuits at highly scaled technology nodes.
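The energy-delay product cited above is simply energy per operation multiplied by delay per operation. A toy comparison, with all numbers invented purely to illustrate how an order-of-magnitude improvement in the metric arises:

```python
# Energy-delay product (EDP): lower is better. Values are made up for
# illustration, not measured CNFET or silicon figures.
def edp(energy_fj, delay_ps):
    return energy_fj * delay_ps

edp_silicon = edp(100.0, 10.0)   # hypothetical silicon gate
edp_cnfet = edp(40.0, 2.0)       # hypothetical CNFET gate
improvement = edp_silicon / edp_cnfet  # -> 12.5, i.e. > 10x
```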

  15. Post-project geomorphic assessment of a large process-based river restoration project

    USGS Publications Warehouse

    Erwin, Susannah O.; Schmidt, John C.; Allred, Tyler M.

    2016-01-01

    This study describes channel changes following completion of the Provo River Restoration Project (PRRP), the largest stream restoration project in Utah and one of the largest projects in the United States in which a gravel-bed river was fully reconstructed. We summarize project objectives and the design process, and we analyze monitoring data collected during the first 7 years after project completion. Post-project channel adjustment during the study period included two phases: (i) an initial phase of rapid, but small-scale, adjustment during the first years after stream flow was introduced to the newly constructed channel and (ii) a subsequent period of more gradual topographic adjustment and channel migration. Analysis of aerial imagery and ground-survey data demonstrates that the channel has been more dynamic in the downstream 4 km, where a local source contributes a significant annual supply of bed material. Here, the channel migrates and exhibits channel adjustments that are more consistent with project objectives. The upstream 12 km of the PRRP are sediment starved, the channel has been laterally stable, and this condition may not be consistent with large-scale project objectives.

  16. Projection Effects of Large-scale Structures on Weak-lensing Peak Abundances

    NASA Astrophysics Data System (ADS)

    Yuan, Shuo; Liu, Xiangkun; Pan, Chuzhong; Wang, Qiao; Fan, Zuhui

    2018-04-01

    High peaks in weak lensing (WL) maps originate dominantly from the lensing effects of single massive halos. Their abundance is therefore closely related to the halo mass function and thus a powerful cosmological probe. However, besides individual massive halos, large-scale structures (LSS) along lines of sight also contribute to the peak signals. In this paper, with ray-tracing simulations, we investigate the LSS projection effects. We show that for current surveys with a large shape noise, the stochastic LSS effects are subdominant. For future WL surveys with source galaxies having a median redshift z_med ∼ 1 or higher, however, they are significant. For the cosmological constraints derived from observed WL high-peak counts, severe biases can occur if the LSS effects are not taken into account properly. We extend the model of Fan et al. by incorporating the LSS projection effects into the theoretical considerations. By comparing with simulation results, we demonstrate the good performance of the improved model and its applicability in cosmological studies.

  17. A Scalable Analysis Toolkit

    NASA Technical Reports Server (NTRS)

    Aiken, Alexander

    2001-01-01

    The Scalable Analysis Toolkit (SAT) project aimed to demonstrate that it is feasible and useful to statically detect software bugs in very large systems. The technical focus of the project was on a relatively new class of constraint-based techniques for analyzing software, where the desired facts about programs (e.g., the presence of a particular bug) are phrased as constraint problems to be solved. At the beginning of this project, the most successful forms of formal software analysis were limited forms of automatic theorem proving (as exemplified by the analyses used in language type systems and optimizing compilers), semi-automatic theorem proving for full verification, and model checking. With a few notable exceptions, these approaches had not been demonstrated to scale to software systems of even 50,000 lines of code. Realistic approaches to large-scale software analysis cannot hope to make every conceivable formal method scale. Thus, the SAT approach is to mix different methods in one application by using coarse and fast but still adequate methods at the largest scales, and reserving the use of more precise but also more expensive methods at smaller scales for critical aspects (that is, aspects critical to the analysis problem under consideration) of a software system. The principled method proposed for combining a heterogeneous collection of formal systems with different scalability characteristics is mixed constraints. This idea had been used previously in small-scale applications with encouraging results: using mostly coarse methods and narrowly targeted precise methods, useful information (meaning the discovery of bugs in real programs) was obtained with excellent scalability.

  18. Learning through a portfolio of carbon capture and storage demonstration projects

    NASA Astrophysics Data System (ADS)

    Reiner, David M.

    2016-01-01

    Carbon dioxide capture and storage (CCS) technology is considered by many to be an essential route to meet climate mitigation targets in the power and industrial sectors. Deploying CCS technologies globally will first require a portfolio of large-scale demonstration projects. These first projects should assist learning by diversity, learning by replication, de-risking the technologies and developing viable business models. From 2005 to 2009, optimism about the pace of CCS rollout led to mutually independent efforts in the European Union, North America and Australia to assemble portfolios of projects. Since 2009, only a few of these many project proposals remain viable, but the initial rationales for demonstration have not been revisited in the face of changing circumstances. Here I argue that learning is now both more difficult and more important given the slow pace of deployment. Developing a more coordinated global portfolio will facilitate learning across projects and may determine whether CCS ever emerges from the demonstration phase.

  19. Evaluating Green/Gray Infrastructure for CSO/Stormwater Control

    EPA Science Inventory

    The NRMRL is conducting this project to evaluate the water quality and quantity benefits of a large-scale application of green infrastructure (low-impact development/best management practices) retrofits in an entire subcatchment. It will document ORD's effort to demonstrate the e...

  20. Geomorphic analysis of large alluvial rivers

    NASA Astrophysics Data System (ADS)

    Thorne, Colin R.

    2002-05-01

    Geomorphic analysis of a large river presents particular challenges and requires a systematic and organised approach because of the spatial scale and system complexity involved. This paper presents a framework and blueprint for geomorphic studies of large rivers developed in the course of basic, strategic and project-related investigations of a number of large rivers. The framework demonstrates the need to begin geomorphic studies early in the pre-feasibility stage of a river project and carry them through to implementation and post-project appraisal. The blueprint breaks down the multi-layered and multi-scaled complexity of a comprehensive geomorphic study into a number of well-defined and semi-independent topics, each of which can be performed separately to produce a clearly defined, deliverable product. Geomorphology increasingly plays a central role in multi-disciplinary river research and the importance of effective quality assurance makes it essential that audit trails and quality checks are hard-wired into study design. The structured approach presented here provides output products and production trails that can be rigorously audited, ensuring that the results of a geomorphic study can stand up to the closest scrutiny.

  1. Modeling the Hydrologic Effects of Large-Scale Green Infrastructure Projects with GIS

    NASA Astrophysics Data System (ADS)

    Bado, R. A.; Fekete, B. M.; Khanbilvardi, R.

    2015-12-01

    Impervious surfaces in urban areas generate excess runoff, which in turn causes flooding, combined sewer overflows, and degradation of adjacent surface waters. Municipal environmental protection agencies have shown a growing interest in mitigating these effects with 'green' infrastructure practices that partially restore the perviousness and water holding capacity of urban centers. Assessment of the performance of current and future green infrastructure projects is hindered by the lack of adequate hydrological modeling tools; conventional techniques fail to account for the complex flow pathways of urban environments, and detailed analyses are difficult to prepare for the very large domains in which green infrastructure projects are implemented. Currently, no standard toolset exists that can rapidly and conveniently predict runoff, consequent inundations, and sewer overflows at a city-wide scale. We demonstrate how streamlined modeling techniques can be used with open-source GIS software to efficiently model runoff in large urban catchments. Hydraulic parameters and flow paths through city blocks, roadways, and sewer drains are automatically generated from GIS layers, and ultimately urban flow simulations can be executed for a variety of rainfall conditions. With this methodology, users can understand the implications of large-scale land use changes and green/gray storm water retention systems on hydraulic loading, peak flow rates, and runoff volumes.
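The per-cell runoff estimate such a GIS toolchain might make can be sketched with the standard rational method, Q = C * i * A. All parameter values below are illustrative assumptions, not values from the project:

```python
# Rational method peak-runoff sketch for an urban block, before and after a
# hypothetical green-infrastructure retrofit that lowers the runoff coefficient.
def peak_runoff_m3s(runoff_coeff, intensity_mm_hr, area_km2):
    """Rational method: Q = C * i * A, converted to m^3/s."""
    intensity_m_s = intensity_mm_hr / 1000 / 3600
    area_m2 = area_km2 * 1e6
    return runoff_coeff * intensity_m_s * area_m2

q_before = peak_runoff_m3s(0.90, 25, 0.5)  # mostly impervious block
q_after = peak_runoff_m3s(0.55, 25, 0.5)   # partially restored perviousness
reduction_pct = 100 * (1 - q_after / q_before)
```

Lowering C from 0.90 to 0.55 cuts the peak flow by about 39% for the same storm, which is the kind of hydraulic-loading change the modeling framework is meant to quantify city-wide.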

  2. The Segmented Aperture Interferometric Nulling Testbed (SAINT) I: Overview and Air-side System Description

    NASA Technical Reports Server (NTRS)

    Hicks, Brian A.; Lyon, Richard G.; Petrone, Peter, III; Bolcar, Matthew R.; Bolognese, Jeff; Clampin, Mark; Dogoda, Peter; Dworzanski, Daniel; Helmbrecht, Michael A.; Koca, Corina; hide

    2016-01-01

    This work presents an overview of the Segmented Aperture Interferometric Nulling Testbed (SAINT), a project that will pair an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC). SAINT will incorporate the VNC's demonstrated wavefront sensing and control system to refine and quantify the end-to-end system performance for high-contrast starlight suppression. This pathfinder system will be used as a tool to study and refine approaches to mitigating instabilities and complex diffraction expected from future large segmented aperture telescopes.

  3. Large Pilot-Scale Carbon Dioxide (CO2) Capture Project Using Aminosilicone Solvent. Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hancu, Dan

    GE Global Research has developed, over the last 8 years, a platform of cost-effective CO2 capture technologies based on a non-aqueous aminosilicone solvent (GAP-1m). As demonstrated in previously funded DOE projects (DE-FE0007502 and DE-FE0013755), the GAP-1m solvent has higher CO2 working capacity and lower volatility and corrosivity than the benchmark aqueous amine technology. Performance of the GAP-1m solvent was recently demonstrated in a 0.5 MWe pilot at the National Carbon Capture Center, AL, with real flue gas for over 500 hours of operation using a Steam Stripper Column (SSC). The pilot-scale PSTU engineering data were used to (i) update the techno-economic analysis and EH&S assessment, (ii) perform a technology gap analysis, and (iii) conduct the solvent manufacturability and scale-up study.

  4. Stakeholder views on financing carbon capture and storage demonstration projects in China.

    PubMed

    Reiner, David; Liang, Xi

    2012-01-17

    Chinese stakeholders (131) from 68 key institutions in 27 provinces were consulted in spring 2009 in an online survey of their perceptions of the barriers and opportunities in financing large-scale carbon dioxide capture and storage (CCS) demonstration projects in China. The online survey was supplemented by 31 follow-up face-to-face interviews. The National Development and Reform Commission (NDRC) was widely perceived as the most important institution in authorizing the first commercial-scale CCS demonstration project, and authorization was viewed as more similar to that for a power project than a chemicals project. There were disagreements, however, on the appropriate size for a demonstration plant, the type of capture, and the type of storage. Most stakeholders believed that the international image of the Chinese Government could benefit from demonstrating commercial CCS and that such a project could also create advantages for Chinese companies investing in CCS technologies. In more detailed interviews with 16 financial officials, we found striking disagreements over the perceived risks of demonstrating CCS. The rate of return seen as appropriate for financing demonstration projects was split between stakeholders from development banks (who supported a rate of 5-8%) and those from commercial banks (12-20%). The divergence on rate alone could result in as much as a 40% difference in the cost of CO2 abatement and a 56% higher levelized cost of electricity, based on a hypothetical case study of a typical 600-MW new-build ultrasupercritical pulverized coal-fired (USCPC) power plant. To finance the extra operational costs, there were sharp divisions over which institutions should bear the brunt of financing, although, overall, more than half of the support was expected to come from foreign and Chinese governments.
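Why the 5-8% versus 12-20% split matters can be illustrated with the standard capital recovery factor used in levelized-cost calculations. The 30-year plant life is an assumed value; the rates are the midpoints of the ranges cited above:

```python
# Capital recovery factor (CRF): the annual payment per unit of capital at a
# given discount rate, CRF = r(1+r)^n / ((1+r)^n - 1).
def crf(rate, years):
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

life = 30                        # assumed plant life, years
crf_dev_bank = crf(0.065, life)  # midpoint of the 5-8% development-bank range
crf_commercial = crf(0.16, life) # midpoint of the 12-20% commercial-bank range
increase_pct = 100 * (crf_commercial / crf_dev_bank - 1)
```

The annualized capital charge roughly doubles between the two financing views, which is consistent with the abstract's finding that rate assumptions alone can move abatement cost and levelized electricity cost by tens of percent.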

  5. Philippine Academy of Rehabilitation Medicine emergency basic relief and medical aid mission project (November 2013-February 2014): the role of physiatrists in Super Typhoon Haiyan.

    PubMed

    Ganchoon, Filipinas; Bugho, Rommel; Calina, Liezel; Dy, Rochelle; Gosney, James

    2017-06-09

    Physiatrists have provided humanitarian assistance in recent large-scale global natural disasters. Super Typhoon Haiyan, the deadliest and most costly typhoon in modern Philippine history, made landfall on 8 November 2013, resulting in significant humanitarian needs. Philippine Academy of Rehabilitation Medicine (PARM) physiatrists conducted a project of 23 emergency basic relief and medical aid missions in response to Super Typhoon Haiyan from November 2013 to February 2014. The final mission was a medical aid mission to the inland rural community of Burauen, Leyte. Summary data were collected, collated, and tabulated; project and mission evaluation was performed. During the humanitarian assistance project, 31,254 basic relief kits containing a variety of food and non-food items were distributed, and medical services including consultation, treatment, and medicines were provided to 7255 patients. Of the 344 conditions evaluated in the medical aid mission to Burauen, Leyte, 85 (59%) were physical and rehabilitation medicine conditions comprising musculoskeletal (62 [73%]), neurological (17 [20%]), and dermatological (6 [7%]) diagnoses. Post-mission and project analysis resulted in recommendations and programmatic changes to strengthen response in future disasters. Physiatrists functioned as medical providers, mission team leaders, community advocates, and in other roles. This physiatrist-led humanitarian assistance project met critical basic relief and medical aid needs of persons impacted by Super Typhoon Haiyan, demonstrating the significant roles performed by physiatrists in response to a large-scale natural disaster. The resulting disaster programming changes and recommendations may inform a more effective response by PARM mission teams in the Philippines, as well as by other South-Eastern Asia teams comprising rehabilitation professionals, to large-scale regional natural disasters.
    Implications for rehabilitation: Large-scale natural disasters, including tropical cyclones, can have a catastrophic impact on the affected population. In response to Super Typhoon Haiyan, physiatrists representing the Philippine Academy of Rehabilitation Medicine conducted a project of 23 emergency basic relief and medical aid missions from November 2013 to February 2014. Project analysis indicates that medical mission teams responding in similar settings may expect to evaluate a significant number of physical medicine and rehabilitation conditions. Medical rehabilitation with participation by rehabilitation professionals, including rehabilitation doctors, is essential to the emergency medical response in large-scale natural disasters.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lenee-Bluhm, P.; Rhinefrank, Ken

    The overarching project objective is to demonstrate the feasibility of using an innovative Power Take-Off (PTO) Module in Columbia Power's utility-scale wave energy converter (WEC). The PTO Module uniquely combines a large-diameter, direct-drive, rotary permanent magnet generator; a patent-pending rail-bearing system; and a corrosion-resistant fiber-reinforced-plastic structure.

  7. Transitioning a home telehealth project into a sustainable, large-scale service: a qualitative study.

    PubMed

    Wade, Victoria A; Taylor, Alan D; Kidd, Michael R; Carati, Colin

    2016-05-16

    This study was a component of the Flinders Telehealth in the Home project, which tested adding home telehealth to existing rehabilitation, palliative care and geriatric outreach services. Due to the known difficulty of transitioning telehealth projects into ongoing services, a qualitative study was conducted to produce a preferred implementation approach for sustainable and large-scale operations, and a process model that offers practical advice for achieving this goal. Initially, semi-structured interviews were conducted with senior clinicians, health service managers and policy makers, and a thematic analysis of the interview transcripts was undertaken to identify the range of options for ongoing operations, plus the factors affecting sustainability. Subsequently, the interviewees and other decision makers attended a deliberative forum in which participants were asked to select a preferred model for future implementation. Finally, all data from the study were synthesised by the researchers to produce a process model. Nineteen interviews with senior clinicians, managers, and service development staff were conducted, finding strong support for home telehealth but a wide diversity of views on governance, models of clinical care, technical infrastructure operations, and data management. The deliberative forum worked through these options and recommended a collaborative consortium approach for large-scale implementation. The process model proposes that the key factor for large-scale implementation is leadership support, which is enabled by 1) showing solutions to the problems of service demand, budgetary pressure and the relationship between hospital and primary care, 2) demonstrating how home telehealth aligns with health service policies, and 3) achieving clinician acceptance through providing evidence of benefit and developing new models of clinical care.
Two key actions to enable change were marketing telehealth to patients, clinicians and policy-makers, and building a community of practice. The implementation of home telehealth services is still in an early stage. Change agents and a community of practice can contribute by marketing telehealth, demonstrating policy alignment and providing potential solutions for difficult health services problems. This should assist health leaders to move from trials to large-scale services.

  8. Advanced Distillation Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maddalena Fanelli; Ravi Arora; Annalee Tonkovich

    2010-03-24

    The Advanced Distillation project was concluded on December 31, 2009. This U.S. Department of Energy (DOE) funded project was completed successfully and within budget during a timeline approved by DOE project managers, which included a one-year extension to the initial ending date. The subject technology, Microchannel Process Technology (MPT) distillation, was expected to provide both capital and operating cost savings compared to conventional distillation technology. With efforts from Velocys and its project partners, MPT distillation was successfully demonstrated at a laboratory scale and its energy savings potential was calculated. While many objectives established at the beginning of the project were met, the project was only partially successful. At the conclusion, it appears that MPT distillation is not a good fit for the targeted separation of ethane and ethylene in large-scale ethylene production facilities, as greater advantages were seen for smaller scale distillations. Early in the project, work involved flowsheet analyses to discern the economic viability of ethane-ethylene MPT distillation and develop strategies for maximizing its impact on the economics of the process. This study confirmed that, through modification to standard operating processes, MPT can enable net energy savings in excess of 20%. This advantage was used by ABB Lumus to determine the potential impact of MPT distillation on the ethane-ethylene market. The study indicated that a substantial market exists if the energy savings could be realized and if the installed capital cost of MPT distillation was on par with or less than conventional technology. Unfortunately, it was determined that the large number of MPT distillation units needed to perform ethane-ethylene separation for world-scale ethylene facilities makes the targeted separation a poor fit for the technology in this application at the current state of manufacturing costs. Over the course of the project, distillation experiments were performed with the targeted mixture, ethane-ethylene, as well as with analogous low-relative-volatility systems: cyclohexane-hexane and cyclopentane-pentane. Devices and test stands were specifically designed for these efforts. Development progressed from experiments and models considering sections of a full-scale device to the design, fabrication, and operation of a single-channel distillation unit with integrated heat transfer. Throughout the project, analytical and numerical models and Computational Fluid Dynamics (CFD) simulations were validated with experiments in the process of developing this platform technology. Experimental trials demonstrated steady and controllable distillation for a variety of process conditions. Values of Height Equivalent to a Theoretical Plate (HETP) ranging from less than 0.5 inch to a few inches were experimentally proven, demonstrating a ten-fold performance enhancement relative to conventional distillation. This improvement, while substantial, is not sufficient for MPT distillation to displace very large scale distillation trains. Fortunately, parallel efforts in the area of business development have yielded other applications for MPT distillation, including smaller scale separations that benefit from the flowsheet flexibility offered by the technology. Talks with multiple potential partners are underway. Their outcome will also help determine the path ahead for MPT distillation.
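The ten-fold HETP improvement maps directly onto column height, since packed height is roughly the number of theoretical stages times HETP. The stage count below is an assumed example for a low-volatility split; the HETP values follow the abstract:

```python
# Packed-column height ~ N_stages * HETP, comparing the demonstrated MPT HETP
# against a conventional value roughly ten-fold larger.
n_stages = 100               # theoretical stages (assumed example)
hetp_mpt_in = 0.5            # demonstrated lower bound, per the abstract
hetp_conventional_in = 5.0   # ~ten-fold worse, per the cited enhancement

height_mpt_ft = n_stages * hetp_mpt_in / 12
height_conventional_ft = n_stages * hetp_conventional_in / 12
```

A 100-stage separation shrinks from roughly 42 ft to about 4 ft of packed height, which shows why the technology favors compact, smaller-scale separations even though it cannot displace world-scale trains.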

  9. Demonstration of Essential Reliability Services by a 300-MW Solar Photovoltaic Power Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loutan, Clyde; Klauer, Peter; Chowdhury, Sirajul

    The California Independent System Operator (CAISO), First Solar, and the National Renewable Energy Laboratory (NREL) conducted a demonstration project on a large utility-scale photovoltaic (PV) power plant in California to test its ability to provide essential ancillary services to the electric grid. With increasing shares of solar- and wind-generated energy on the electric grid, traditional generation resources equipped with automatic generation control (AGC) and automatic voltage regulation controls -- specifically, fossil thermal -- are being displaced. The deployment of utility-scale, grid-friendly PV power plants that incorporate advanced capabilities to support grid stability and reliability is essential for the large-scale integration of PV generation into the electric power grid, among other technical requirements. A typical PV power plant consists of multiple power electronic inverters and can contribute to grid stability and reliability through sophisticated 'grid-friendly' controls. In this way, PV power plants can be used to mitigate the impact of variability on the grid, a role typically reserved for conventional generators. In August 2016, testing was completed on First Solar's 300-MW PV power plant, and a large amount of test data was produced and analyzed that demonstrates the ability of PV power plants to use grid-friendly controls to provide essential reliability services. These data showed how the development of advanced power controls can enable PV to become a provider of a wide range of grid services, including spinning reserves, load following, voltage support, ramping, frequency response, variability smoothing, frequency regulation, and power quality. Specifically, the tests conducted included various forms of active power control such as AGC and frequency regulation; droop response; and reactive power, voltage, and power factor controls. This project demonstrated that advanced power electronics and solar generation can be controlled to contribute to system-wide reliability. It was shown that the First Solar plant can provide essential reliability services related to different forms of active and reactive power controls, including plant participation in AGC, primary frequency control, ramp rate control, and voltage regulation. For AGC participation in particular, by comparing the PV plant testing results to the typical performance of individual conventional technologies, we showed that regulation accuracy by the PV plant is 24-30 points better than fast gas turbine technologies. The plant's ability to provide volt-ampere reactive control during periods of extremely low power generation was demonstrated as well. The project team developed a pioneering demonstration concept and test plan to show how various types of active and reactive power controls can elevate PV generation from a simple variable energy resource to a resource that provides a wide range of ancillary services. With this project's approach to a holistic demonstration on an actual, large, utility-scale, operational PV power plant and dissemination of the obtained results, the team sought to close some gaps in perspectives that exist among various stakeholders in California and nationwide by providing real test data.
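Primary frequency control of the kind tested here is commonly specified as a droop characteristic: active power moves opposite to the frequency deviation, scaled by the droop setting. A minimal sketch; the 5% droop, the curtailed operating point, and the frequency excursion are illustrative assumptions, not test values from the project:

```python
# Droop response: delta_P = -(df / f_nom) / droop * P_rated, clamped to the
# plant's available range. Curtailing below rated power leaves headroom for
# under-frequency response.
def droop_power_mw(p_sched_mw, p_rated_mw, f_hz, f_nom=60.0, droop=0.05):
    delta = -(f_hz - f_nom) / f_nom / droop * p_rated_mw
    return max(0.0, min(p_rated_mw, p_sched_mw + delta))

# Hypothetical 300 MW plant curtailed to 240 MW, responding to +/-0.06 Hz
p_over = droop_power_mw(240, 300, 60.06)   # over-frequency: back off to 234 MW
p_under = droop_power_mw(240, 300, 59.94)  # under-frequency: rise to 246 MW
```

The asymmetry is the key design choice: only a curtailed (headroom-carrying) PV plant can respond upward to low frequency, which is why the demonstration tests ran the plant below its available output.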

  10. UXO Detection and Characterization in the Marine Environment

    DTIC Science & Technology

    2009-12-01

    This large-scale demonstration focused on a marine geophysical magnetometry survey of Ostrich Bay, adjacent to the Former Naval Ammunition Depot – Puget Sound (Figure 8 shows the part of Ostrich Bay adjacent to the depot during its operational period). The Puget Sound demonstration was also supported by NAVFAC Northwest, the current manager of the site. The Navy Project Manager is Mr. Mark Murphy

  11. Studies on combined model based on functional objectives of large scale complex engineering

    NASA Astrophysics Data System (ADS)

    Yuting, Wang; Jingchun, Feng; Jiabao, Sun

    2018-03-01

    Large-scale complex engineering encompasses various functions, each of which is realized through the completion of one or more projects, so the combinations of projects that affect each function need to be identified. Based on the types of project portfolio, the relationships between projects and their functional objectives were analyzed. On that premise, portfolio techniques based on the functional objectives of projects were introduced, and the principles for applying them were studied and proposed. In addition, the processes for combining projects were constructed. These findings lay a foundation for portfolio management of large-scale complex engineering.

  12. More robust regional precipitation projection from selected CMIP5 models based on multiple-dimensional metrics

    NASA Astrophysics Data System (ADS)

    Qian, Y.; Wang, L.; Leung, L. R.; Lin, G.; Lu, J.; Gao, Y.; Zhang, Y.

    2017-12-01

    Projecting precipitation changes is challenging because of incomplete understanding of the climate system and biases and uncertainty in climate models. In East Asia, where summer precipitation is dominated by the monsoon circulation, the global models from the Coupled Model Intercomparison Project Phase 5 (CMIP5) give widely varying projections of precipitation change for the 21st century. It is therefore critical for the community to know which models' projections are more reliable in response to natural and anthropogenic forcings. In this study we defined multiple-dimensional metrics measuring model performance in simulating the present-day large-scale circulation, regional precipitation, and the relationship between them. The large-scale circulation features examined in this study include the lower-tropospheric southwesterly winds, the western North Pacific subtropical high, the South China Sea subtropical high, and the East Asian westerly jet in the upper troposphere. Each of these circulation features transports moisture to East Asia, enhancing the moist static energy and strengthening the Meiyu moisture front that is the primary mechanism for precipitation generation in eastern China. Based on these metrics, the 30 models in the CMIP5 ensemble were classified into three groups. Models in the top-performing group projected regional precipitation patterns that are more similar to each other than those in the bottom- or middle-performing groups, and they consistently projected statistically significant increasing trends in two of the large-scale circulation indices and in precipitation. In contrast, models in the bottom- or middle-performing groups projected slight drying or no trend in precipitation. We also find that a model's ability to reproduce the observed precipitation climatology alone does not guarantee a more reliable projection of future precipitation, because good simulation skill can be achieved through compensating errors from multiple sources.
Herein the potential for more robust projections of precipitation changes at the regional scale is demonstrated through the use of discriminating metrics to subsample the multi-model ensemble. The results from this study provide insights into how to select models from the CMIP ensemble for projecting regional climate and hydrological cycle changes.
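
    The grouping step described above can be sketched as a simple equal-weight ranking over per-metric skill scores. This is an illustrative assumption, not the study's actual scoring scheme, and the model names are hypothetical.

    ```python
    def rank_models(scores):
        """Classify climate models into top/middle/bottom groups by the
        mean of their per-metric skill scores (each assumed in [0, 1])."""
        combined = {model: sum(s) / len(s) for model, s in scores.items()}
        ordered = sorted(combined, key=combined.get, reverse=True)
        k = len(ordered) // 3  # size of the top and middle groups
        return {"top": ordered[:k],
                "middle": ordered[k:2 * k],
                "bottom": ordered[2 * k:]}
    ```

    In practice the metrics would be weighted and normalized against observations; equal weighting is the simplest defensible default when no metric is privileged.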

  13. Recovery Act: Oxy-Combustion Technology Development for Industrial-Scale Boiler Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levasseur, Armand

    2014-04-30

    Alstom Power Inc. (Alstom), under U.S. DOE/NETL Cooperative Agreement No. DE-NT0005290, is conducting a development program to generate detailed technical information needed for application of oxy-combustion technology. The program is designed to provide the necessary information and understanding for the next step of large-scale commercial demonstration of oxy-combustion in tangentially fired boilers and to accelerate the commercialization of this technology. The main project objectives include: • Design and develop an innovative oxyfuel system for existing tangentially fired boiler units that minimizes overall capital investment and operating costs. • Evaluate performance of oxyfuel tangentially fired boiler systems in pilot-scale tests at Alstom’s 15 MWth tangentially fired Boiler Simulation Facility (BSF). • Address technical gaps for the design of oxyfuel commercial utility boilers by focused testing and improvement of engineering and simulation tools. • Develop the design, performance, and costs for a demonstration-scale oxyfuel boiler and auxiliary systems. • Develop the design and costs for both industrial and utility commercial-scale reference oxyfuel boilers and auxiliary systems that are optimized for overall plant performance and cost. • Define key design considerations and develop general guidelines for application of results to utility and different industrial applications. The project was initiated in October 2008 and the scope extended in 2010 under an ARRA award. The project completion date was April 30, 2014. Central to the project is 15 MWth testing in the BSF, which provided in-depth understanding of oxy-combustion under boiler conditions, detailed data for improvement of design tools, and key information for application to commercial-scale oxy-fired boiler design.
Eight comprehensive 15 MWth oxy-fired test campaigns were performed with different coals, providing detailed data on combustion, emissions, and thermal behavior over a matrix of fuels, oxy-process variables, and boiler design parameters. Significant improvement of CFD modeling tools and validation against the 15 MWth experimental data has been completed. Oxy-boiler demonstration and large reference designs have been developed, supported by the information and knowledge gained from the 15 MWth testing. The results from the 15 MWth testing in the BSF and complementary bench-scale testing are addressed in this volume (Volume II) of the final report. The results of the modeling efforts (Volume III) and the oxy-boiler design efforts (Volume IV) are reported in separate volumes.

  14. Linking climate projections to performance: A yield-based decision scaling assessment of a large urban water resources system

    NASA Astrophysics Data System (ADS)

    Turner, Sean W. D.; Marlow, David; Ekström, Marie; Rhodes, Bruce G.; Kularathna, Udaya; Jeffrey, Paul J.

    2014-04-01

    Despite a decade of research into climate change impacts on water resources, the scientific community has delivered relatively few practical methodological developments for integrating uncertainty into water resources system design. This paper presents an application of the "decision scaling" methodology for assessing climate change impacts on water resources system performance and asks how such an approach might inform planning decisions. The decision scaling method reverses the conventional ethos of climate impact assessment by first establishing the climate conditions that would compel planners to intervene. Climate model projections are introduced at the end of the process to characterize climate risk in such a way that avoids the process of propagating those projections through hydrological models. Here we simulated 1000 multisite synthetic monthly streamflow traces in a model of the Melbourne bulk supply system to test the sensitivity of system performance to variations in streamflow statistics. An empirical relation was derived to convert decision-critical flow statistics to climatic units, against which 138 alternative climate projections were plotted and compared. We defined the decision threshold in terms of a system yield metric constrained by multiple performance criteria. Our approach allows for fast and simple incorporation of demand forecast uncertainty and demonstrates the reach of the decision scaling method through successful execution in a large and complex water resources system. Scope for wider application in urban water resources planning is discussed.
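
    The core of decision scaling, identifying the climate perturbation at which simulated performance first falls below the planning threshold, can be sketched as below. The linear yield model and all numbers are purely illustrative assumptions, not values from the Melbourne study.

    ```python
    def decision_threshold(yield_fn, demand, scalings):
        """Return the largest tested flow-scaling factor under which the
        simulated system yield fails to meet demand (the decision-critical
        climate condition), or None if the system never fails."""
        failing = [s for s in scalings if yield_fn(s) < demand]
        return max(failing) if failing else None

    # Hypothetical stand-in for the full system simulation: yield assumed
    # proportional to mean inflow (scaling factor 1.0 = historical climate).
    def toy_yield(scaling):
        return 100.0 * scaling  # GL/year, illustrative units
    ```

    Climate projections are then overlaid on this threshold at the end of the process, rather than being propagated through hydrological models first.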

  15. Developing Renewable Energy Projects Larger Than 10 MWs at Federal Facilities (Book)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2013-03-01

    To accomplish Federal goals for renewable energy, sustainability, and energy security, large-scale renewable energy projects must be developed and constructed on Federal sites at a significant scale with significant private investment. The U.S. Department of Energy's Federal Energy Management Program (FEMP) helps Federal agencies meet these goals and assists agency personnel in navigating the complexities of developing such projects and attracting the necessary private capital to complete them. This guide is intended to provide a general resource that will begin to develop the Federal employee's awareness and understanding of the project developer's operating environment and the private sector's awareness and understanding of the Federal environment. Because the vast majority of the investment that is required to meet the goals for large-scale renewable energy projects will come from the private sector, this guide has been organized to match Federal processes with typical phases of commercial project development. The main purpose of this guide is to provide a project development framework to allow the Federal Government, private developers, and investors to work in a coordinated fashion on large-scale renewable energy projects. The framework includes key elements that describe a successful, financially attractive large-scale renewable energy project.

  16. Development of a 3D printer using scanning projection stereolithography

    PubMed Central

    Lee, Michael P.; Cooper, Geoffrey J. T.; Hinkley, Trevor; Gibson, Graham M.; Padgett, Miles J.; Cronin, Leroy

    2015-01-01

    We have developed a system for the rapid fabrication of low-cost 3D devices and systems in the laboratory, with micro-scale features on cm-scale objects. Our system is inspired by maskless lithography, in which a digital micromirror device (DMD) is used to project patterns with resolution up to 10 µm onto a layer of photoresist. Large-area objects can be fabricated by stitching projected images over a 5 cm2 area. The addition of a z-stage allows multiple layers to be stacked to create 3D objects, removing the need for any developing or etching steps while at the same time producing true 3D devices that are robust, configurable, and scalable. We demonstrate the applications of the system by printing a range of micro-scale objects as well as a fully functioning microfluidic droplet device, whose integrity we test by pumping dye through the channels. PMID:25906401

  17. Exploiting Synoptic-Scale Climate Processes to Develop Nonstationary, Probabilistic Flood Hazard Projections

    NASA Astrophysics Data System (ADS)

    Spence, C. M.; Brown, C.; Doss-Gollin, J.

    2016-12-01

    Climate model projections are commonly used for water resources management and planning under nonstationarity, but they do not reliably reproduce intense short-term precipitation and are instead more skilled at broader spatial scales. To provide a credible estimate of flood trend that reflects climate uncertainty, we present a framework that exploits the connections between synoptic-scale oceanic and atmospheric patterns and local-scale flood-producing meteorological events to develop long-term flood hazard projections. We demonstrate the method for the Iowa River, where high flow episodes have been found to correlate with tropical moisture exports that are associated with a pressure dipole across the eastern continental United States. We characterize the relationship between flooding on the Iowa River and this pressure dipole through a nonstationary Pareto-Poisson peaks-over-threshold probability distribution estimated from the historical record. We then combine the results of a trend analysis of the dipole index in the historical record with the results of a trend analysis of the dipole index as simulated by General Circulation Models (GCMs) under climate change conditions through a Bayesian framework. The resulting nonstationary posterior distribution of the dipole index, combined with the dipole-conditioned peaks-over-threshold flood frequency model, connects local flood hazard to changes in large-scale atmospheric pressure and circulation patterns that are related to flooding in a process-driven framework. The Iowa River example demonstrates that the resulting nonstationary, probabilistic flood hazard projection may be used to inform risk-based flood adaptation decisions.
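
    A stationary peaks-over-threshold fit of the kind this framework builds on can be sketched with a method-of-moments Generalized Pareto estimator. This is a generic textbook estimator, not the authors' Bayesian, dipole-conditioned model, and the numbers in the usage example are invented.

    ```python
    import math
    import statistics

    def fit_gpd_mom(exceedances):
        """Method-of-moments fit of a Generalized Pareto distribution to
        flow exceedances over a threshold; returns (shape xi, scale sigma).
        Uses mean^2/variance = 1 - 2*xi and mean = sigma / (1 - xi)."""
        m = statistics.mean(exceedances)
        r = m * m / statistics.variance(exceedances)
        xi = 0.5 * (1.0 - r)           # shape
        sigma = 0.5 * m * (1.0 + r)    # scale
        return xi, sigma

    def return_level(threshold, xi, sigma, rate, years):
        """Flood magnitude exceeded once per `years` on average, given a
        Poisson exceedance rate (events/year) above `threshold`."""
        if abs(xi) < 1e-9:  # exponential-tail limit as xi -> 0
            return threshold + sigma * math.log(rate * years)
        return threshold + (sigma / xi) * ((rate * years) ** xi - 1.0)
    ```

    The nonstationary version replaces the constant parameters with functions of the dipole index, which is what lets GCM-simulated dipole trends drive the flood hazard projection.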

  18. CONSORT to community: translation of an RCT to a large-scale community intervention and learnings from evaluation of the upscaled program.

    PubMed

    Moores, Carly Jane; Miller, Jacqueline; Perry, Rebecca Anne; Chan, Lily Lai Hang; Daniels, Lynne Allison; Vidgen, Helen Anna; Magarey, Anthea Margaret

    2017-11-29

    Translation encompasses the continuum from clinical efficacy to widespread adoption within the healthcare service and ultimately routine clinical practice. The Parenting, Eating and Activity for Child Health (PEACH™) program has previously demonstrated clinical effectiveness in the management of child obesity, and has been recently implemented as a large-scale community intervention in Queensland, Australia. This paper aims to describe the translation of the evaluation framework from a randomised controlled trial (RCT) to a large-scale community intervention (PEACH™ QLD). Tensions between the RCT paradigm and implementation research will be discussed, along with lived evaluation challenges, responses to overcome these, and key learnings for future evaluation conducted at scale. The translation of evaluation from the PEACH™ RCT to the large-scale community intervention PEACH™ QLD is described. While the CONSORT Statement was used to report findings from two previous RCTs, the RE-AIM framework was more suitable for the evaluation of upscaled delivery of the PEACH™ program. Evaluation of PEACH™ QLD was undertaken during the project delivery period from 2013 to 2016. Experiential learnings from conducting the evaluation of PEACH™ QLD within the described evaluation framework are presented for the purposes of informing the future evaluation of upscaled programs. Evaluation changes in response to real-time changes in the delivery of the PEACH™ QLD Project were necessary at stages during the project term. Key evaluation challenges encountered included the collection of complete evaluation data from a diverse and geographically dispersed workforce and the systematic collection of process evaluation data in real time to support program changes during the project. Evaluation of large-scale community interventions in the real world is challenging and divergent from RCTs, which are rigorously evaluated within a more tightly-controlled clinical research setting.
Constructs explored in an RCT are inadequate in describing the enablers and barriers of upscaled community program implementation. Methods for data collection, analysis and reporting also require consideration. We present a number of experiential reflections and suggestions for the successful evaluation of future upscaled community programs which are scarcely reported in the literature. PEACH™ QLD was retrospectively registered with the Australian New Zealand Clinical Trials Registry on 28 February 2017 (ACTRN12617000315314).

  19. DISRUPTION OF LARGE-SCALE NEURAL NETWORKS IN NON-FLUENT/AGRAMMATIC VARIANT PRIMARY PROGRESSIVE APHASIA ASSOCIATED WITH FRONTOTEMPORAL DEGENERATION PATHOLOGY

    PubMed Central

    Grossman, Murray; Powers, John; Ash, Sherry; McMillan, Corey; Burkholder, Lisa; Irwin, David; Trojanowski, John Q.

    2012-01-01

    Non-fluent/agrammatic primary progressive aphasia (naPPA) is a progressive neurodegenerative condition most prominently associated with slowed, effortful speech. A clinical imaging marker of naPPA is disease centered in the left inferior frontal lobe. We used multimodal imaging to assess large-scale neural networks underlying effortful expression in 15 patients with sporadic naPPA due to frontotemporal lobar degeneration (FTLD) spectrum pathology. Effortful speech in these patients is related in part to impaired grammatical processing, and to phonologic speech errors. Gray matter (GM) imaging shows frontal and anterior-superior temporal atrophy, most prominently in the left hemisphere. Diffusion tensor imaging reveals reduced fractional anisotropy in several white matter (WM) tracts mediating projections between left frontal and other GM regions. Regression analyses suggest disruption of three large-scale GM-WM neural networks in naPPA that support fluent, grammatical expression. These findings emphasize the role of large-scale neural networks in language, and demonstrate associated language deficits in naPPA. PMID:23218686

  20. Characterization of the Ecosole HCPV tracker and single module inverter

    NASA Astrophysics Data System (ADS)

    Carpanelli, Maurizio; Borelli, Gianni; Verdilio, Daniele; De Nardis, Davide; Migali, Fabrizio; Cancro, Carmine; Graditi, Giorgio

    2015-09-01

    BECAR, the Beghelli group's R&D company, is leading ECOSOLE (Elevated COncentration SOlar Energy), one of the largest European demonstration projects in solar photovoltaics. ECOSOLE, started in 2012, is focused on the study, design, and realization of a new HCPV generator made of high-efficiency PV modules equipped with SoG (silicone-on-glass) Fresnel lenses and III-V solar cells, and a low-cost matched solar tracker with a distributed-inverter approach. The project also covers the study and demonstration of new high-throughput methods for industrial large-scale production at very low manufacturing cost. This work describes the characterization of the tracker and the single-module inverter.

  1. Developing Renewable Energy Projects Larger Than 10 MWs at Federal Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2013-03-01

    To accomplish Federal goals for renewable energy, sustainability, and energy security, large-scale renewable energy projects must be developed and constructed on Federal sites at a significant scale with significant private investment. For the purposes of this Guide, large-scale Federal renewable energy projects are defined as renewable energy facilities larger than 10 megawatts (MW) that are sited on Federal property and lands and typically financed and owned by third parties. The U.S. Department of Energy’s Federal Energy Management Program (FEMP) helps Federal agencies meet these goals and assists agency personnel in navigating the complexities of developing such projects and attracting the necessary private capital to complete them. This Guide is intended to provide a general resource that will begin to develop the Federal employee’s awareness and understanding of the project developer’s operating environment and the private sector’s awareness and understanding of the Federal environment. Because the vast majority of the investment that is required to meet the goals for large-scale renewable energy projects will come from the private sector, this Guide has been organized to match Federal processes with typical phases of commercial project development. FEMP collaborated with the National Renewable Energy Laboratory (NREL) and professional project developers on this Guide to ensure that Federal projects have key elements recognizable to private sector developers and investors. The main purpose of this Guide is to provide a project development framework to allow the Federal Government, private developers, and investors to work in a coordinated fashion on large-scale renewable energy projects. The framework includes key elements that describe a successful, financially attractive large-scale renewable energy project. This framework begins the translation between the Federal and private sector operating environments. When viewing the overall…

  2. The XChemExplorer graphical workflow tool for routine or large-scale protein-ligand structure determination.

    PubMed

    Krojer, Tobias; Talon, Romain; Pearce, Nicholas; Collins, Patrick; Douangamath, Alice; Brandao-Neto, Jose; Dias, Alexandre; Marsden, Brian; von Delft, Frank

    2017-03-01

    XChemExplorer (XCE) is a data-management and workflow tool to support large-scale simultaneous analysis of protein-ligand complexes during structure-based ligand discovery (SBLD). The user interfaces of established crystallographic software packages such as CCP4 [Winn et al. (2011), Acta Cryst. D67, 235-242] or PHENIX [Adams et al. (2010), Acta Cryst. D66, 213-221] have entrenched the paradigm that a `project' is concerned with solving one structure. This does not hold for SBLD, where many almost identical structures need to be solved and analysed quickly in one batch of work. Functionality to track progress and annotate structures is essential. XCE provides an intuitive graphical user interface which guides the user from data processing, initial map calculation, ligand identification and refinement up until data dissemination. It provides multiple entry points depending on the need of each project, enables batch processing of multiple data sets and records metadata, progress and annotations in an SQLite database. XCE is freely available and works on any Linux and Mac OS X system, and the only dependency is to have the latest version of CCP4 installed. The design and usage of this tool are described here, and its usefulness is demonstrated in the context of fragment-screening campaigns at the Diamond Light Source. It is routinely used to analyse projects comprising 1000 data sets or more, and therefore scales well to even very large ligand-design projects.
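
    The SQLite-backed tracking of per-dataset progress that XCE performs can be sketched as follows. The table name, columns, and upsert logic here are invented for illustration and do not reflect XCE's actual schema.

    ```python
    import sqlite3

    def init_tracker(path=":memory:"):
        """Open (or create) a minimal progress-tracking database."""
        con = sqlite3.connect(path)
        con.execute("""CREATE TABLE IF NOT EXISTS datasets (
                           name TEXT PRIMARY KEY,
                           stage TEXT,          -- e.g. 'processed', 'refined'
                           annotation TEXT)""")
        return con

    def record_progress(con, name, stage, annotation=""):
        """Insert or update the pipeline stage for one data set."""
        con.execute("""INSERT INTO datasets(name, stage, annotation)
                       VALUES (?, ?, ?)
                       ON CONFLICT(name) DO UPDATE
                       SET stage = excluded.stage,
                           annotation = excluded.annotation""",
                    (name, stage, annotation))
        con.commit()
    ```

    Keying on the dataset name means re-running a stage simply overwrites its status, which is the behaviour batch reprocessing of hundreds of near-identical structures needs.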

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crossno, Patricia Joyce; Dunlavy, Daniel M.; Stanton, Eric T.

    This report is a summary of the accomplishments of the 'Scalable Solutions for Processing and Searching Very Large Document Collections' LDRD, which ran from FY08 through FY10. Our goal was to investigate scalable text analysis; specifically, methods for information retrieval and visualization that could scale to extremely large document collections. Towards that end, we designed, implemented, and demonstrated a scalable framework for text analysis - ParaText - as a major project deliverable. Further, we demonstrated the benefits of using visual analysis in text analysis algorithm development, improved performance of heterogeneous ensemble models in data classification problems, and the advantages of information-theoretic methods in user analysis and interpretation in cross-language information retrieval. The project involved 5 members of the technical staff and 3 summer interns (including one who worked two summers). It resulted in a total of 14 publications, 3 new software libraries (2 open source and 1 internal to Sandia), several new end-user software applications, and over 20 presentations. Several follow-on projects have already begun or will start in FY11, with additional projects currently in proposal.

  5. What Will the Neighbors Think? Building Large-Scale Science Projects Around the World

    ScienceCinema

    Jones, Craig; Mrotzek, Christian; Toge, Nobu; Sarno, Doug

    2017-12-22

    Public participation is an essential ingredient for turning the International Linear Collider into a reality. Wherever the proposed particle accelerator is sited in the world, its neighbors -- in any country -- will have something to say about hosting a 35-kilometer-long collider in their backyards. When it comes to building large-scale physics projects, almost every laboratory has a story to tell. Three case studies from Japan, Germany and the US will be presented to examine how community relations are handled in different parts of the world. How do particle physics laboratories interact with their local communities? How do neighbors react to building large-scale projects in each region? How can the lessons learned from past experiences help in building the next big project? These and other questions will be discussed to engage the audience in an active dialogue about how a large-scale project like the ILC can be a good neighbor.

  6. Chemically intuited, large-scale screening of MOFs by machine learning techniques

    NASA Astrophysics Data System (ADS)

    Borboudakis, Giorgos; Stergiannakos, Taxiarchis; Frysali, Maria; Klontzas, Emmanuel; Tsamardinos, Ioannis; Froudakis, George E.

    2017-10-01

    A novel computational methodology for large-scale screening of MOFs is applied to gas storage with the use of machine learning technologies. This approach offers a promising trade-off between the accuracy of ab initio methods and the speed of classical approaches, strategically combined with chemical intuition. The results demonstrate that the chemical properties of MOFs are indeed predictable (stochastically, not deterministically) using machine learning methods and automated analysis protocols, with the accuracy of predictions increasing with sample size. Our initial results indicate that this methodology is promising not only for gas storage in MOFs but also for many other materials science projects.
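
    The kind of descriptor-based prediction involved can be illustrated with a minimal k-nearest-neighbour regressor over hypothetical MOF descriptor vectors; the abstract does not specify the study's actual features or learning algorithms, so everything below is an assumed toy setup.

    ```python
    import math

    def knn_predict(train, query, k=3):
        """Predict a target property (e.g., gas uptake) for a descriptor
        vector as the mean over its k nearest neighbours in descriptor
        space; `train` is a list of (descriptor_tuple, value) pairs."""
        nearest = sorted(train, key=lambda pair: math.dist(pair[0], query))[:k]
        return sum(value for _, value in nearest) / k
    ```

    As the abstract notes for its methods, accuracy improves with sample size: with more training MOFs, the nearest neighbours lie closer to the query in descriptor space.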

  7. Criminological research in contemporary China: challenges and lessons learned from a large-scale criminal victimization survey.

    PubMed

    Zhang, Lening; Messner, Steven F; Lu, Jianhong

    2007-02-01

    This article discusses research experience gained from a large-scale survey of criminal victimization recently conducted in Tianjin, China. The authors review some of the more important challenges that arose in the research, their responses to these challenges, and lessons learned that might be beneficial to other scholars who are interested in conducting criminological research in China. Their experience underscores the importance of understanding the Chinese political, cultural, and academic context, and the utility of collaborating with experienced and knowledgeable colleagues "on site." Although there are some special difficulties and barriers, their project demonstrates the feasibility of original criminological data collection in China.

  8. Jumpstarting commercial-scale CO 2 capture and storage with ethylene production and enhanced oil recovery in the US Gulf

    DOE PAGES

    Middleton, Richard S.; Levine, Jonathan S.; Bielicki, Jeffrey M.; ...

    2015-04-27

    CO 2 capture, utilization, and storage (CCUS) technology has yet to be widely deployed at a commercial scale despite multiple high-profile demonstration projects. We suggest that developing a large-scale, visible, and financially viable CCUS network could potentially overcome many barriers to deployment and jumpstart commercial-scale CCUS. To date, substantial effort has focused on technology development to reduce the costs of CO 2 capture from coal-fired power plants. Here, we propose that near-term investment could focus on implementing CO 2 capture on facilities that produce high-value chemicals/products. These facilities can absorb the expected impact of the marginal increase in the cost of production on the price of their product, due to the addition of CO 2 capture, more readily than coal-fired power plants. A financially viable demonstration of a large-scale CCUS network requires offsetting the costs of CO 2 capture by using the CO 2 as an input to the production of market-viable products. As a result, we demonstrate this alternative development path with the example of an integrated CCUS system where CO 2 is captured from ethylene producers and used for enhanced oil recovery in the U.S. Gulf Coast region.

  9. Building and measuring a high performance network architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kramer, William T.C.; Toole, Timothy; Fisher, Chuck

    2001-04-20

    Once a year, the SC conferences present a unique opportunity to create and build one of the most complex and highest performance networks in the world. At SC2000, large-scale and complex local and wide area networking connections were demonstrated, including large-scale distributed applications running on different architectures. This project was designed to use the unique opportunity presented at SC2000 to create a testbed network environment and then use that network to demonstrate and evaluate high performance computational and communication applications. This testbed was designed to incorporate many interoperable systems and services and was designed for measurement from the very beginning. The end results were key insights into how to use novel, high-performance networking technologies, along with a body of measurements that gives insight into the networks of the future.

  10. Computational nuclear quantum many-body problem: The UNEDF project

    NASA Astrophysics Data System (ADS)

    Bogner, S.; Bulgac, A.; Carlson, J.; Engel, J.; Fann, G.; Furnstahl, R. J.; Gandolfi, S.; Hagen, G.; Horoi, M.; Johnson, C.; Kortelainen, M.; Lusk, E.; Maris, P.; Nam, H.; Navratil, P.; Nazarewicz, W.; Ng, E.; Nobre, G. P. A.; Ormand, E.; Papenbrock, T.; Pei, J.; Pieper, S. C.; Quaglioni, S.; Roche, K. J.; Sarich, J.; Schunck, N.; Sosonkina, M.; Terasaki, J.; Thompson, I.; Vary, J. P.; Wild, S. M.

    2013-10-01

    The UNEDF project was a large-scale collaborative effort that applied high-performance computing to the nuclear quantum many-body problem. The primary focus of the project was on constructing, validating, and applying an optimized nuclear energy density functional, which entailed a wide range of pioneering developments in microscopic nuclear structure and reactions, algorithms, high-performance computing, and uncertainty quantification. UNEDF demonstrated that close associations among nuclear physicists, mathematicians, and computer scientists can lead to novel physics outcomes built on algorithmic innovations and computational developments. This review showcases a wide range of UNEDF science results to illustrate this interplay.

  11. 77 FR 52754 - Draft Midwest Wind Energy Multi-Species Habitat Conservation Plan Within Eight-State Planning Area

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-30

    ... include new and existing small-scale wind energy facilities, such as single-turbine demonstration projects, as well as large, multi-turbine commercial wind facilities. Covered Species The planning partners are...-FF03E00000] Draft Midwest Wind Energy Multi-Species Habitat Conservation Plan Within Eight-State Planning...

  12. Diving in Head First: Finding the Volume of Norris lake

    ERIC Educational Resources Information Center

    Foster, Drew W.

    2008-01-01

    This article allows students to apply their knowledge and experience of area and volume to find the volume of Norris Lake, a large reservoir lake in Tennessee. Students have the opportunity to demonstrate their skills in using maps and scales as well as to incorporate the use of technology in developing the solution. This project satisfied the…

  13. Designing an External Evaluation of a Large-Scale Software Development Project.

    ERIC Educational Resources Information Center

    Collis, Betty; Moonen, Jef

    This paper describes the design and implementation of the evaluation of the POCO Project, a large-scale national software project in the Netherlands which incorporates the perspective of an evaluator throughout the entire span of the project, and uses the experiences gained from it to suggest an evaluation procedure that could be applied to other…

  14. 78 FR 18348 - Submission for OMB Review; Use of Project Labor Agreements for Federal Construction Projects

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-26

    ... agreement (PLA), as they may decide appropriate, on large-scale construction projects, where the total cost... procurement. A PLA is a pre-hire collective bargaining agreement with one or more labor organizations that... the use of a project labor agreement (PLA), as they may decide appropriate, on large-scale...

  15. Platform Chemicals from an Oilseed Biorefinery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tupy, Mike; Schrodi, Yann

    2006-11-06

    The US chemical industry is $460 billion in size; a $150 billion segment of it consists of non-oxygenated chemicals that are sourced today from petroleum but are addressable by a renewable feedstock if one considers a more chemically reduced feedstock such as vegetable oil. Vegetable oil, due to its chemical functionality, provides a largely untapped opportunity as a renewable chemical source to replace petroleum-derived chemicals and to produce platform chemicals unavailable today. This project examined the fertile intersection between the rich building blocks provided by vegetable oils and the enhanced chemical modification capability provided by metathesis chemistry. The technology advanced in this study is the process of ethylene cross-metathesis (referred to as ethenolysis) with vegetable oil and vegetable oil derivatives to manufacture the platform chemical 9-decenoic acid (9DA) and olefin co-products. The project team met its goals of demonstrating several-fold improvements in catalyst efficiency, deepening the mechanistic understanding of metathesis, synthesizing and screening dozens of new catalysts, designing and modeling commercial processes, and estimating production costs. One demonstrable result of the study, reported here, was a step-change improvement in catalyst turnover number in the ethenolysis of methyl oleate. We met our key measurable of producing 100 lbs of 9DA at the pilot scale, which demonstrated the ability to scale up ethenolysis. DOE project funding had a significant positive impact on the development of metathetically modified vegetable oils more broadly: the Cargill/Materia partnership, initiated primarily because of DOE funding, has succeeded in commercializing products, validating metathesis as a platform technology, and expanding a diverse product portfolio in high-value and large-volume markets.
    Opportunities have expanded, and business development has gained considerable momentum, enabling further expansion of the Materia/Cargill relationship. This project exceeded expectations and is having an immediate impact on DOE success by replacing petroleum products with renewables in a large-volume application today.

  16. Project Management Life Cycle Models to Improve Management in High-rise Construction

    NASA Astrophysics Data System (ADS)

    Burmistrov, Andrey; Siniavina, Maria; Iliashenko, Oksana

    2018-03-01

    The paper describes a possibility to improve project management in high-rise building construction through the use of various Project Management Life Cycle Models (PMLC models) based on traditional and agile project management approaches. Moreover, the paper describes how splitting the whole large-scale project into a "project chain" makes large-scale building projects more manageable and increases the efficiency of the activities of all participants in such projects.

  17. Practical recipes for the model order reduction, dynamical simulation and compressive sampling of large-scale open quantum systems

    NASA Astrophysics Data System (ADS)

    Sidles, John A.; Garbini, Joseph L.; Harrell, Lee E.; Hero, Alfred O.; Jacky, Jonathan P.; Malcomb, Joseph R.; Norman, Anthony G.; Williamson, Austin M.

    2009-06-01

    Practical recipes are presented for simulating high-temperature and nonequilibrium quantum spin systems that are continuously measured and controlled. The notion of a spin system is broadly conceived, in order to encompass macroscopic test masses as the limiting case of large-j spins. The simulation technique has three stages: first the deliberate introduction of noise into the simulation, then the conversion of that noise into an equivalent continuous measurement and control process, and finally, projection of the trajectory onto state-space manifolds having reduced dimensionality and possessing a Kähler potential of multilinear algebraic form. These state-spaces can be regarded as ruled algebraic varieties upon which a projective quantum model order reduction (MOR) is performed. The Riemannian sectional curvature of ruled Kählerian varieties is analyzed, and proved to be non-positive upon all sections that contain a rule. These manifolds are shown to contain Slater determinants as a special case and their identity with Grassmannian varieties is demonstrated. The resulting simulation formalism is used to construct a positive P-representation for the thermal density matrix. Single-spin detection by magnetic resonance force microscopy (MRFM) is simulated, and the data statistics are shown to be those of a random telegraph signal with additive white noise. Larger-scale spin-dust models are simulated, having no spatial symmetry and no spatial ordering; the high-fidelity projection of numerically computed quantum trajectories onto low dimensionality Kähler state-space manifolds is demonstrated. The reconstruction of quantum trajectories from sparse random projections is demonstrated, the onset of Donoho-Stodden breakdown at the Candès-Tao sparsity limit is observed, a deterministic construction for sampling matrices is given and methods for quantum state optimization by Dantzig selection are given.
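    The random-telegraph-with-additive-white-noise statistics reported for the simulated single-spin MRFM signal can be reproduced with a toy generator; the switching probability and noise level below are illustrative assumptions, not values from the paper:

    ```python
    import numpy as np

    # Two-state telegraph process: the spin orientation flips with probability
    # p_flip per time step; the measurement adds Gaussian white noise on top.
    rng = np.random.default_rng(0)
    n_steps, p_flip, noise_sigma = 10_000, 0.01, 0.5

    state = np.empty(n_steps)
    s = 1.0
    for i in range(n_steps):
        if rng.random() < p_flip:
            s = -s                    # telegraph jump between +1 and -1
        state[i] = s

    # Observed record: telegraph component plus additive white noise
    signal = state + noise_sigma * rng.normal(size=n_steps)
    ```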

  18. Geospatial Optimization of Siting Large-Scale Solar Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macknick, Jordan; Quinby, Ted; Caulfield, Emmet

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
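    The multi-criteria evaluation at the heart of such a tool reduces to normalizing each criterion raster and combining them with user-supplied weights. The two criteria, their values, and the weights below are hypothetical, not the report's:

    ```python
    import numpy as np

    # Hypothetical 2x2 criterion rasters over candidate cells.
    criteria = {
        # solar resource (kWh/m^2/day), higher is better
        "solar":     np.array([[5.5, 6.0], [7.2, 6.8]]),
        # distance to transmission (km), lower is better
        "grid_dist": np.array([[10.0, 3.0], [25.0, 8.0]]),
    }
    weights = {"solar": 0.7, "grid_dist": 0.3}   # user-defined priorities

    def normalize(x, higher_is_better=True):
        # Min-max rescale each criterion to [0, 1]
        z = (x - x.min()) / (x.max() - x.min())
        return z if higher_is_better else 1.0 - z

    # Weighted sum of normalized criteria gives a per-cell suitability score
    score = (weights["solar"] * normalize(criteria["solar"])
             + weights["grid_dist"] * normalize(criteria["grid_dist"], False))

    best_cell = np.unravel_index(np.argmax(score), score.shape)
    ```

    A real siting tool would also apply hard exclusions (protected lands, slope limits) before scoring; this sketch shows only the weighted-combination step.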

  19. Office of Electricity Delivery and Energy Reliability (OE) National Energy Technology Laboratory (NETL) American Recovery and Reinvestment Act 2009 United States Department of Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Mohit; Grape, Ulrik

    2014-07-29

    The purpose of this project was for Seeo to deliver the first ever large-scale or grid-scale prototype of a new class of advanced lithium-ion rechargeable batteries, combining unprecedented energy density, lifetime, safety, and cost-effectiveness. The goal was to demonstrate Seeo's entirely new class of lithium-based batteries built on Seeo's proprietary nanostructured polymer electrolyte. This technology can enable widespread deployment in Smart Grid applications, and it was demonstrated through the development and testing of a 10 kilowatt-hour (kWh) prototype battery system. This development effort, supported by the United States Department of Energy (DOE), enabled Seeo to pursue and validate the transformational performance advantages of its technology for grid-tied energy storage applications. The focus of the project was to address utility market needs for energy storage systems, especially for residential and commercial customers tied to solar photovoltaic installations. In addition to grid energy storage opportunities, Seeo's technology has been tested with automotive drive cycles and is seen as equally applicable to battery packs for electric vehicles. The goals of the project were pursued through a series of specific tasks encompassing materials development, scaling up of cells, demonstrating the performance of the cells, designing, building and demonstrating a pack prototype, and providing an economic and environmental assessment. Nearly all of the tasks were achieved over the duration of the program; only the full demonstration of the battery system and a complete economic and environmental analysis were not fully completed. A timeline over the duration of the program is shown in figure 1.
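    The pack-level arithmetic behind a 10 kWh prototype is simple series/parallel bookkeeping. The cell voltage, capacity, and string counts below are illustrative assumptions, not Seeo cell specifications:

    ```python
    # Hypothetical cell parameters and pack topology for a ~10 kWh battery.
    cell_voltage = 3.3         # nominal cell voltage, V (assumed)
    cell_capacity_ah = 20.0    # cell capacity, Ah (assumed)
    cells_in_series = 50       # one series string (assumed)
    strings_in_parallel = 3    # parallel strings (assumed)

    # Series cells add voltage; parallel strings add capacity.
    pack_voltage = cells_in_series * cell_voltage
    pack_energy_kwh = (pack_voltage * cell_capacity_ah
                       * strings_in_parallel) / 1000.0
    ```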

  20. Fabrication and performance analysis of 4-sq cm indium tin oxide/InP photovoltaic solar cells

    NASA Technical Reports Server (NTRS)

    Gessert, T. A.; Li, X.; Phelps, P. W.; Coutts, T. J.; Tzafaras, N.

    1991-01-01

    Large-area photovoltaic solar cells based on direct-current magnetron sputter deposition of indium tin oxide (ITO) onto single-crystal p-InP substrates have demonstrated both the radiation hardness and the high performance necessary for extraterrestrial applications. A small-scale production project was initiated in which approximately 50 ITO/InP cells are being produced. The procedures used in this small-scale production of 4-sq cm ITO/InP cells are presented and discussed. The discussion includes analyses of the performance range of all available production cells and device performance data for the best cells produced thus far. Additionally, processing experience gained from the production of these cells is discussed, indicating issues that may be encountered when large-scale production begins.
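    Cell performance for space applications is typically reported as efficiency under the AM0 spectrum, computed as maximum output power divided by incident power on the cell area. The output-power value below is an illustrative assumption, not a measured ITO/InP result:

    ```python
    # Efficiency arithmetic for a 4 cm^2 cell under AM0 illumination.
    area_cm2 = 4.0
    irradiance_w_cm2 = 0.1367      # AM0 solar constant, ~136.7 mW/cm^2
    p_max_w = 0.085                # assumed maximum-power-point output, W

    # efficiency = P_max / (irradiance * area)
    efficiency = p_max_w / (irradiance_w_cm2 * area_cm2)
    ```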

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Gang

    Mid-latitude extreme weather events are responsible for a large part of climate-related damage. Yet large uncertainties remain in climate model projections of heat waves, droughts, and heavy rain/snow events on regional scales, limiting our ability to effectively use these projections for climate adaptation and mitigation. These uncertainties can be attributed both to the lack of spatial resolution in the models and to the lack of a dynamical understanding of these extremes. The approach of this project is to relate the fine-scale features to the large scales in current climate simulations, seasonal re-forecasts, and climate change projections in a very wide range of models, including the atmospheric and coupled models of ECMWF over a range of horizontal resolutions (125 to 10 km), aqua-planet configurations of the Model for Prediction Across Scales and High Order Method Modeling Environments (resolutions ranging from 240 km to 7.5 km) with various physics suites, and selected CMIP5 model simulations. The large-scale circulation will be quantified both with the well-tested preferred-circulation-regime approach and with recently developed measures, the finite-amplitude wave activity (FAWA) and its spectrum. The fine-scale structures related to extremes will be diagnosed following the latest approaches in the literature. The goal is to use the large-scale measures as indicators of the probability of occurrence of the finer-scale structures, and hence of extreme events. These indicators will then be applied to the CMIP5 models and to time-slice projections of a future climate.

  2. Management tools for R&D engineering projects: Coordination perspective for large international consortium (NeXOS)

    NASA Astrophysics Data System (ADS)

    Castro, Ayoze; Memè, Simone; Quevedo, Eduardo; Waldmann, Christoph; Pearlman, Jay; Delory, Eric; Llinás, Octavio

    2017-04-01

    NeXOS is a cross-functional and multidisciplinary project funded under the EU FP7 Programme, involving 21 organizations from six European countries with different backgrounds, interests, business models and perspectives. To be successful, NeXOS applied an internationally recognized management methodology tailored to the specific project environment and conditions, with an explicit structure based on defined roles and responsibilities for the people involved and a means for effective communication between them (Fig. 1). Dividing the project into stages of requirements, design, integration, and validation and demonstration allows clearer monitoring of progress, comparison of the level of achievement against the plan, and earlier detection of problems, leading to less disruptive but still effective corrective actions. NeXOS is following an ambitious plan to develop innovative sensor systems with a high degree of modularity and interoperability, starting with requirements definition and continuing through the validation and demonstration phase. To make this integrative approach possible, a management development strategy incorporating systems-engineering methods has been used (Fig. 2). Although this is standard practice in software development and in large-scale systems such as aircraft production, it is still new in the ocean hardware business, and NeXOS was therefore a test case for this development concept. The question is one of scale, as ocean observation systems are typically built in small numbers by co-located teams. With a system of diverse technologies (optical, acoustic, platform interfaces), there are cultural differences that must be bridged. The greatest challenge lies in implementation and in the willingness of different teams to work with an engineering process that may help ultimate system integration but may place additional burdens on individual participants.
This presentation will address approaches for effective operations in this environment.

  3. Status of EPA's (Environmental Protection Agency's) LIMB (Limestone Injection Multistage Burner) demonstration program at Ohio Edison's Edgewater Unit 4. Report for September-December 1986

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendriks, R.V.; Nolan, P.S.

    1987-01-01

    The paper describes and discusses the key design features of the retrofit of EPA's Limestone Injection Multistage Burner (LIMB) system to an operating, wall-fired utility boiler at Ohio Edison's Edgewater Station. It further describes results of the pertinent projects in EPA's LIMB program and shows how these results were used as the basis for the design of the system. The full-scale demonstration is expected to prove the effectiveness and cost of the LIMB concept for use on large-scale utility boilers. The equipment is now being installed at Edgewater, with system start-up scheduled for May 1987.

  4. Large Pilot-Scale Testing of Linde/BASF Post-Combustion CO2 Capture Technology at the Abbott Coal-Fired Power Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Brien, Kevin C.

    The work summarized in this report is the first step towards a project that will re-train and create jobs for personnel in the coal industry and continue regional economic development to benefit regions impacted by previous downturns. The larger project is aimed at capturing ~300 tons/day (272 metric tonnes/day) of CO2 at a 90% capture rate from existing coal-fired boilers at the Abbott Power Plant on the campus of the University of Illinois (UI). It will employ the Linde-BASF novel amine-based advanced CO2 capture technology, which has already shown the potential to be cost-effective, energy-efficient, and compact at the 0.5-1.5 MWe pilot scale. The overall objective of the project is to design and install a scaled-up system of nominal 15 MWe size, integrate it with the Abbott Power Plant flue gas, steam, and other utility systems, and demonstrate the viability of continuous operation under realistic conditions with high efficiency and capacity. The project will also begin to build a workforce that understands how to operate and maintain the capture plants by including students from regional community colleges and universities in the operation and evaluation of the capture system. This project will also lay the groundwork for follow-on projects that pilot utilization of the captured CO2 from coal-fired power plants. The net impact will be to demonstrate a replicable means to (1) use a standardized procedure to evaluate power plants for their ability to be retrofitted with a pilot capture unit; (2) design and construct reliable capture systems based on the Linde-BASF technology; (3) operate and maintain these systems; (4) implement training programs with local community colleges and universities to establish a workforce to operate and maintain the systems; and (5) prepare to evaluate, at the large pilot scale, various methods to utilize the resulting captured CO2.
    Towards the larger project goal, the UI-led team, together with Linde, has completed a preliminary design for the carbon capture pilot plant with basic engineering and cost estimates, established permitting needs, identified approaches to address Environmental, Health, and Safety concerns related to pilot plant installation and operation, developed approaches for long-term use of the captured carbon, and established strategies for workforce development and job creation that will re-train coal operators to operate carbon capture plants. This report describes Phase 1 accomplishments and demonstrates that the project team is well-prepared for full implementation of Phase 2: to design, build, and operate the carbon capture pilot plant.

  5. Simulating Forest Carbon Dynamics in Response to Large-scale Fuel Reduction Treatments Under Projected Climate-fire Interactions in the Sierra Nevada Mountains, USA

    NASA Astrophysics Data System (ADS)

    Liang, S.; Hurteau, M. D.

    2016-12-01

    The interaction of warmer, drier climate and increasing large wildfires, coupled with increasing fire severity resulting from fire-exclusion are anticipated to undermine forest carbon (C) stock stability and C sink strength in the Sierra Nevada forests. Treatments, including thinning and prescribed burning, to reduce biomass and restore forest structure have proven effective at reducing fire severity and lessening C loss when treated stands are burned by wildfire. However, the current pace and scale of treatment implementation is limited, especially given recent increases in area burned by wildfire. In this study, we used a forest landscape model (LANDIS-II) to evaluate the role of implementation timing of large-scale fuel reduction treatments in influencing forest C stock and fluxes of Sierra Nevada forests with projected climate and larger wildfires. We ran 90-year simulations using climate and wildfire projections from three general circulation models driven by the A2 emission scenario. We simulated two different treatment implementation scenarios: a `distributed' (treatments implemented throughout the simulation) and an `accelerated' (treatments implemented during the first half century) scenario. We found that across the study area, accelerated implementation had 0.6-10.4 Mg ha-1 higher late-century aboveground biomass (AGB) and 1.0-2.2 g C m-2 yr-1 higher mean C sink strength than the distributed scenario, depending on specific climate-wildfire projections. Cumulative wildfire emissions over the simulation period were 0.7-3.9 Mg C ha-1 higher for distributed implementation relative to accelerated implementation. However, simulations with both implementation practices have considerably higher AGB and C sink strength as well as lower wildfire emission than simulations in the absence of fuel reduction treatments. 
    The results demonstrate the potential for implementing large-scale fuel reduction treatments to enhance forest C stock stability and C sink strength under projected climate-wildfire interactions. Given that climate and wildfire are projected to become more stressful after mid-century, earlier management action would yield greater C benefits.

  6. The future of management: The NASA paradigm

    NASA Technical Reports Server (NTRS)

    Harris, Philip R.

    1992-01-01

    Prototypes of 21st-century management, especially for large-scale enterprises, may well be found within the aerospace industry. The space era inaugurated a number of projects of such scope and magnitude that another type of management had to be created to ensure successful achievement. The challenges will be not just technological and managerial, but also human and cultural in dimension. Futurists, students of management, and those concerned with technological administration would do well to review the literature of emerging space management for its wider implications. NASA offers a paradigm, or demonstrated model, of future trends in the field of management at large. More research is needed on issues of leadership for Earth-based projects in space and for space-based programs with managers in space, and on the recognition that large-scale technical enterprises, such as those undertaken in space, require a new form of management. NASA and other responsible agencies are urged to study excellence in space macromanagement, including the necessary multidisciplinary skills. Two recommended targets are the application of general living systems theory and of macromanagement concepts for space stations in the 1990s.

  7. Psychology in an Interdisciplinary Setting: A Large-Scale Project to Improve University Teaching

    ERIC Educational Resources Information Center

    Koch, Franziska D.; Vogt, Joachim

    2015-01-01

    At a German university of technology, a large-scale project was funded as a part of the "Quality Pact for Teaching", a programme launched by the German Federal Ministry of Education and Research to improve the quality of university teaching and study conditions. The project aims at intensifying interdisciplinary networking in teaching,…

  8. Internal Variability-Generated Uncertainty in East Asian Climate Projections Estimated with 40 CCSM3 Ensembles.

    PubMed

    Yao, Shuai-Lei; Luo, Jing-Jia; Huang, Gang

    2016-01-01

    Regional climate projections are challenging because of large uncertainty, particularly that stemming from unpredictable internal variability of the climate system. Here, we examine the internal variability-induced uncertainty in precipitation and surface air temperature (SAT) trends during 2005-2055 over East Asia based on 40-member ensemble projections of the Community Climate System Model Version 3 (CCSM3). The model ensembles are generated from a suite of different atmospheric initial conditions using the same SRES A1B greenhouse gas scenario. We find that projected precipitation trends are subject to considerably larger internal uncertainty, and hence have lower confidence, than the projected SAT trends in both boreal winter and summer. Projected SAT trends in winter have relatively higher uncertainty than those in summer. In addition, the lower-level atmospheric circulation has larger uncertainty than the mid-level circulation. Based on k-means cluster analysis, we demonstrate that a substantial portion of the internally induced precipitation and SAT trends arises from internal large-scale atmospheric circulation variability. These results highlight the importance of internal climate variability in regional climate projections on multi-decadal timescales.
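    The core diagnostic in such an initial-condition ensemble is to treat the ensemble-mean trend as the forced signal and the across-member spread of trends as the internal-variability uncertainty. The synthetic series below illustrates the bookkeeping; the trend and noise values are invented, not CCSM3 output:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_members, n_years = 40, 51          # 40-member ensemble, 2005-2055
    years = np.arange(n_years)

    forced_trend = 0.03                  # assumed forced response (deg C / yr)
    trends = np.empty(n_members)
    for m in range(n_members):
        # Each member: same forcing, different internal-variability realization
        series = forced_trend * years + rng.normal(0.0, 0.3, n_years)
        trends[m] = np.polyfit(years, series, 1)[0]   # per-member linear trend

    signal = trends.mean()               # ensemble-mean trend ~ forced signal
    noise = trends.std(ddof=1)           # member spread ~ internal uncertainty
    ```

    Precipitation having a lower signal-to-noise ratio than SAT in this framework is exactly the "lower confidence" result the abstract reports.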

  9. Simulating the impact of the large-scale circulation on the 2-m temperature and precipitation climatology

    NASA Astrophysics Data System (ADS)

    Bowden, Jared H.; Nolte, Christopher G.; Otte, Tanya L.

    2013-04-01

    The impact of the simulated large-scale atmospheric circulation on the regional climate is examined using the Weather Research and Forecasting (WRF) model as a regional climate model. The purpose is to understand the potential need for interior grid nudging for dynamical downscaling of global climate model (GCM) output for air quality applications under a changing climate. In this study we downscale the NCEP-Department of Energy Atmospheric Model Intercomparison Project (AMIP-II) Reanalysis using three continuous 20-year WRF simulations: one simulation without interior grid nudging and two using different interior grid nudging methods. The biases in 2-m temperature and precipitation for the simulation without interior grid nudging are unreasonably large with respect to the North American Regional Reanalysis (NARR) over the eastern half of the contiguous United States (CONUS) during the summer when air quality concerns are most relevant. This study examines how these differences arise from errors in predicting the large-scale atmospheric circulation. It is demonstrated that the Bermuda high, which strongly influences the regional climate for much of the eastern half of the CONUS during the summer, is poorly simulated without interior grid nudging. In particular, two summers when the Bermuda high was west (1993) and east (2003) of its climatological position are chosen to illustrate problems in the large-scale atmospheric circulation anomalies. For both summers, WRF without interior grid nudging fails to simulate the placement of the upper-level anticyclonic (1993) and cyclonic (2003) circulation anomalies. The displacement of the large-scale circulation impacts the lower atmosphere moisture transport and precipitable water, affecting the convective environment and precipitation. 
Using interior grid nudging improves the large-scale circulation aloft and moisture transport/precipitable water anomalies, thereby improving the simulated 2-m temperature and precipitation. The results demonstrate that constraining the RCM to the large-scale features in the driving fields improves the overall accuracy of the simulated regional climate, and suggest that in the absence of such a constraint, the RCM will likely misrepresent important large-scale shifts in the atmospheric circulation under a future climate.
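    Interior (analysis) nudging of the kind evaluated above is, at its core, Newtonian relaxation: a term is added to each nudged variable's tendency that pulls the model state toward the driving large-scale field on a timescale tau. The numbers below are illustrative, not WRF's configuration:

    ```python
    # Newtonian relaxation toward a driving-field value.
    dt = 600.0       # model time step, s (assumed)
    tau = 3600.0     # nudging relaxation timescale, s (assumed)
    target = 290.0   # driving-field value, e.g. reanalysis temperature, K
    x = 280.0        # model value that has drifted from the large scale

    for _ in range(100):
        tendency_model = 0.0                      # physics/dynamics omitted
        # dx/dt = (model tendencies) + (target - x) / tau
        x += dt * (tendency_model + (target - x) / tau)
    ```

    In a real RCM the nudging term is applied only to selected variables and scales so that fine-scale features can still develop freely.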

  10. Extreme Cost Reductions with Multi-Megawatt Centralized Inverter Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwabe, Ulrich; Fishman, Oleg

    2015-03-20

    The objective of this project was to fully develop, demonstrate, and commercialize a new type of utility-scale PV system. Based on patented technology, this includes the development of a truly centralized inverter system with capacities up to 100 MW and a high-voltage, distributed harvesting approach. This system promises to greatly improve the energy yield of large-scale PV systems, by reducing losses and increasing yield from mismatched arrays, and to reduce overall system costs through very cost-effective conversion and the BOS cost reductions enabled by higher-voltage operation.

  11. Low-cost flywheel demonstration program

    NASA Astrophysics Data System (ADS)

    Rabenhorst, D. W.; Small, T. R.; Wilkinson, W. O.

    1980-04-01

    All primary objectives were successfully achieved as follows: demonstration of a full-size, 1 kWh flywheel having an estimated cost in large-volume production of approximately $50/kWh; development of a ball-bearing system having losses comparable to the losses in a totally magnetic suspension system; successful and repeated demonstration of the low-cost flywheel in a complete flywheel energy-storage system based on the use of ordinary house voltage and frequency; and application of the experience gained in the hardware program to project the system design into a complete, full-scale, 30 kWh home-type flywheel energy-storage system.
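    The rotor sizing behind a 1 kWh flywheel follows from the stored kinetic energy E = 1/2 I w^2, with I = 1/2 m r^2 for a solid disk. The mass, radius, and speed below are illustrative assumptions, not the program's design values:

    ```python
    import math

    m = 60.0         # rotor mass, kg (assumed)
    r = 0.30         # rotor radius, m (assumed)
    rpm = 15_000.0   # spin speed, rev/min (assumed)

    I = 0.5 * m * r**2                      # solid-disk moment of inertia, kg m^2
    omega = rpm * 2.0 * math.pi / 60.0      # angular speed, rad/s
    E_joules = 0.5 * I * omega**2           # stored kinetic energy, J
    E_kwh = E_joules / 3.6e6                # ~0.9 kWh for these assumptions
    ```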

  12. The XChemExplorer graphical workflow tool for routine or large-scale protein–ligand structure determination

    PubMed Central

    Krojer, Tobias; Talon, Romain; Pearce, Nicholas; Douangamath, Alice; Brandao-Neto, Jose; Dias, Alexandre; Marsden, Brian

    2017-01-01

    XChemExplorer (XCE) is a data-management and workflow tool to support large-scale simultaneous analysis of protein–ligand complexes during structure-based ligand discovery (SBLD). The user interfaces of established crystallographic software packages such as CCP4 [Winn et al. (2011), Acta Cryst. D67, 235–242] or PHENIX [Adams et al. (2010), Acta Cryst. D66, 213–221] have entrenched the paradigm that a ‘project’ is concerned with solving one structure. This does not hold for SBLD, where many almost identical structures need to be solved and analysed quickly in one batch of work. Functionality to track progress and annotate structures is essential. XCE provides an intuitive graphical user interface which guides the user from data processing, initial map calculation, ligand identification and refinement up until data dissemination. It provides multiple entry points depending on the need of each project, enables batch processing of multiple data sets and records metadata, progress and annotations in an SQLite database. XCE is freely available and works on any Linux and Mac OS X system, and the only dependency is to have the latest version of CCP4 installed. The design and usage of this tool are described here, and its usefulness is demonstrated in the context of fragment-screening campaigns at the Diamond Light Source. It is routinely used to analyse projects comprising 1000 data sets or more, and therefore scales well to even very large ligand-design projects. PMID:28291762
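    The bookkeeping pattern described, tracking metadata, progress, and annotations for many near-identical data sets in an SQLite database, can be sketched with the standard library. The table and column names here are illustrative, not XCE's actual schema:

    ```python
    import sqlite3

    # One row per data set, recording its processing stage and annotation.
    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE datasets (
                     sample     TEXT PRIMARY KEY,
                     stage      TEXT,      -- e.g. processed / refined / deposited
                     resolution REAL,
                     annotation TEXT)""")
    rows = [("xtal_001", "refined",   1.8, "ligand bound"),
            ("xtal_002", "processed", 2.1, ""),
            ("xtal_003", "refined",   1.6, "alternate conformer")]
    con.executemany("INSERT INTO datasets VALUES (?, ?, ?, ?)", rows)
    con.commit()

    # Progress query: which samples have reached the refinement stage?
    refined = [s for (s,) in con.execute(
        "SELECT sample FROM datasets WHERE stage = 'refined' ORDER BY sample")]
    ```

    Keeping state in a single database file rather than per-structure project files is what lets one GUI drive a batch of a thousand data sets.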

  13. NBS/NIST Gas Thermometry From 0 to 660 °C

    PubMed Central

    Schooley, J. F.

    1990-01-01

    In the NBS/NIST Gas Thermometry program, constant-volume gas thermometers, a unique mercury manometer, and a highly accurate thermal expansion apparatus have been employed to evaluate temperatures on the Kelvin Thermodynamic Temperature Scale (KTTS) that correspond to particular temperatures on the 1968 International Practical Temperature Scale (IPTS-68). In this paper, we present a summary of the NBS/NIST Gas Thermometry project, which originated with planning activities in the late 1920s and was completed by measurements of the differences t(KTTS)-t(IPTS-68) in the range 0 to 660 °C. Early results of this project were the first to demonstrate the surprisingly large inaccuracy of the IPTS-68 with respect to the KTTS above 0 °C. Advances in several different measurement techniques, development of new, specialized instruments, and two distinct sets of gas thermometry observations have resulted from the project. PMID:28179778
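    The principle of the constant-volume gas thermometer is that, at fixed volume, an ideal gas obeys P/T = const, so an unknown thermodynamic temperature follows from a pressure ratio against a known reference point. The pressure values below are illustrative assumptions; the real program applied corrections for gas non-ideality and bulb thermal expansion:

    ```python
    # Ideal-gas constant-volume thermometry: T_meas / T_ref = P_meas / P_ref.
    T_ref = 273.16          # triple point of water, K
    P_ref = 101_325.0       # manometer reading at T_ref, Pa (assumed)
    P_meas = 151_987.5      # reading at the unknown temperature, Pa (assumed)

    T_meas = T_ref * P_meas / P_ref     # thermodynamic-temperature estimate, K
    t_celsius = T_meas - 273.15
    ```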

  14. Supersonic Retropropulsion Technology Development in NASA's Entry, Descent, and Landing Project

    NASA Technical Reports Server (NTRS)

    Edquist, Karl T.; Berry, Scott A.; Rhode, Matthew N.; Kleb, Bil; Korzun, Ashley; Dyakonov, Artem A.; Zarchi, Kerry A.; Schauerhamer, Daniel G.; Post, Ethan A.

    2012-01-01

NASA's Entry, Descent, and Landing (EDL) space technology roadmap calls for new technologies to achieve human exploration of Mars in the coming decades [1]. One of those technologies, termed Supersonic Retropropulsion (SRP), involves initiation of propulsive deceleration at supersonic Mach numbers. The potential benefits afforded by SRP to improve payload mass and landing precision make the technology attractive for future EDL missions. NASA's EDL project spent two years advancing the technological maturity of SRP for Mars exploration [2-15]. This paper summarizes the technical accomplishments from the project and highlights challenges and recommendations for future SRP technology development programs. These challenges include: developing sufficiently large SRP engines for use on human-scale entry systems; testing and computational modeling of complex, unsteady SRP fluid dynamics; understanding the effects of SRP on entry vehicle stability and controllability; and demonstrating sub-scale SRP entry systems in Earth's atmosphere.

  15. Assessment of the technology required to develop photovoltaic power system for large scale national energy applications

    NASA Technical Reports Server (NTRS)

    Lutwack, R.

    1974-01-01

A technical assessment of a program to develop photovoltaic power system technology for large-scale national energy applications was made by analyzing and judging the alternative candidate photovoltaic systems and development tasks. A program plan was constructed based on achieving the 10-year objective of establishing the practicability of large-scale terrestrial power installations using photovoltaic conversion arrays costing less than $0.50/peak W. Guidelines for the tasks of a 5-year program were derived from a set of 5-year objectives deduced from the 10-year objective. This report indicates the need for an early emphasis on the development of the single-crystal Si photovoltaic system for commercial utilization; a production goal of 5 × 10^8 peak W/year of $0.50 cells was projected for the year 1985. The development of other photovoltaic conversion systems was assigned to longer-range roles. The status of the technology developments and the applicability of solar arrays to particular power installations, ranging from houses to central power plants, were scheduled to be verified in a series of demonstration projects. The budget recommended for the first 5-year phase of the program is $268.5M.

  16. Historical evidence of importance to the industrialization of flat-plate silicon photovoltaic systems. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Smith, J. L.

    1978-01-01

An analysis is given of the Low-Cost Silicon Solar Array Project plans for the industrialization of new production technologies expected to be forthcoming as a result of the project's technology development efforts. In particular, LSSA's mandate to ensure an annual production capability of 500 MW peak for the photovoltaic supply industry by 1986 is critically examined. The examination focuses on one of the concerns behind this goal -- timely development of industrial capacity to supply anticipated demand. Some of the conclusions include: (1) construction of small-scale pilot plants should be undertaken only for purposes of technology development; (2) large-scale demonstrations should be undertaken only when the technology is well in hand; (3) commercial-scale production should be left to the private sector; (4) the 500-MW annual output goal should be shifted to Program Headquarters.

  17. Inquiry-Based Educational Design for Large-Scale High School Astronomy Projects Using Real Telescopes

    ERIC Educational Resources Information Center

    Fitzgerald, Michael; McKinnon, David H.; Danaia, Lena

    2015-01-01

    In this paper, we outline the theory behind the educational design used to implement a large-scale high school astronomy education project. This design was created in response to the realization of ineffective educational design in the initial early stages of the project. The new design follows an iterative improvement model where the materials…

  18. Consolidated Laser-Induced Fluorescence Diagnostic Systems for the NASA Ames Arc Jet Facilities

    NASA Technical Reports Server (NTRS)

    Grinstead, Jay H.; Wilder, Michael C.; Porter, Barry J.; Brown, Jeffrey D.; Yeung, Dickson; Battazzo, Stephen J.; Brubaker, Timothy R.

    2016-01-01

The spectroscopic diagnostic technique of two-photon absorption laser-induced fluorescence (LIF) of atomic species for non-intrusive arc jet flow property measurement was first implemented at NASA Ames in the mid-1990s. In 2013-2014, NASA consolidated the agency's large-scale arc jet test capabilities at NASA Ames. Concurrent with that effort, the agency also sponsored a project to establish two comprehensive LIF diagnostic systems for the Aerodynamic Heating Facility (AHF) and Interaction Heating Facility (IHF) arc jets. The scope of the project enabled further engineering development of the existing IHF LIF system as well as the complete reconstruction of the AHF LIF system. The updated LIF systems are identical in design and capability. They represent the culmination of over 20 years of experience in transitioning a specialized laboratory research tool into a measurement system for large-scale, high-demand test facilities. This paper documents the latest improvements of the LIF system design and demonstrations of the redeveloped AHF and IHF LIF systems.

  19. Proposed roadmap for overcoming legal and financial obstacles to carbon capture and sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacobs, Wendy; Cohen, Leah; Kostakidis-Lianos, Leah

Many existing proposals either lack sufficient concreteness to make carbon capture and geological sequestration (CCGS) operational or fail to focus on a comprehensive, long-term framework for its regulation, thus failing to account adequately for the urgency of the issue, the need to develop immediate experience with large-scale demonstration projects, or the financial and other incentives required to launch early demonstration projects. We aim to help fill this void by proposing a roadmap to commercial deployment of CCGS in the United States. This roadmap focuses on the legal and financial incentives necessary for rapid demonstration of geological sequestration in the absence of national restrictions on CO2 emissions. It weaves together existing federal programs and financing opportunities into a set of recommendations for achieving commercial viability of geological sequestration.

  20. Assessing and Projecting Greenhouse Gas Release due to Abrupt Permafrost Degradation

    NASA Astrophysics Data System (ADS)

    Saito, K.; Ohno, H.; Yokohata, T.; Iwahana, G.; Machiya, H.

    2017-12-01

Permafrost is a large reservoir of frozen soil organic carbon (SOC; about half of all terrestrial storage), so its degradation (i.e., thawing) under global warming may release a substantial additional amount of greenhouse gases (GHGs). However, understanding of the processes, the geographical distribution of such hazards, and the implementation of the relevant processes in advanced climate models remain insufficient, so that variation in permafrost remains one of the largest sources of uncertainty in climatic and biogeochemical assessments and projections. Thermokarst, induced by melting of ground ice in ice-rich permafrost, leads to dynamic surface subsidence of up to 60 m, which affects local and regional societies and ecosystems in the Arctic. It can also accelerate large-scale warming through a positive feedback between released GHGs (especially methane), atmospheric warming and permafrost degradation. This three-year research project (2-1605, Environment Research and Technology Development Fund of the Ministry of the Environment, Japan) aims to assess and project the impacts of GHG release through dynamic permafrost degradation using in-situ and remote (e.g., satellite and airborne) observations, laboratory analysis of sampled ice and soil cores, and numerical modeling, by demonstrating the vulnerability distribution and the relative impacts of large-scale versus dynamic degradation. 
Our preliminary laboratory analysis of ice and soil cores sampled in 2016 at Alaskan and Siberian sites largely underlain by ice-rich permafrost shows that, although the gas volume trapped per unit mass is roughly homogeneous among sites for both ice and soil cores, methane concentrations in the trapped gases vary widely, from a few ppm (similar to the atmosphere) to hundreds of thousands of ppm. We will also present our numerical approach to evaluating the relative impacts of GHGs released through dynamic permafrost degradation, implementing conceptual models to assess and project the distribution and affected amounts of ground ice and SOC.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodford, William

This document is the final technical report from 24M Technologies on the project titled: Low Cost, Structurally Advanced Novel Electrode and Cell Manufacturing. All of the program milestones and deliverables were completed during the performance of the award. Specific accomplishments are: 1) 24M demonstrated the processability and electrochemical performance of semi-solid electrodes with active volume contents increased by 10% relative to the program baseline; 2) electrode-level metrics, quality, and yield were demonstrated at an 80 cm^2 electrode footprint; 3) these electrodes were integrated into cells with consistent capacities and impedances, including cells delivered to Argonne National Laboratory for independent testing; 4) those processes were scaled to a large-format (>260 cm^2) electrode footprint and quality and yield were demonstrated; 5) a high-volume manufacturing approach for large-format electrode fabrication was demonstrated; and 6) large-format cells (>100 Ah capacity) were prototyped with consistent capacity and impedance, including cells which were delivered to Argonne National Laboratory for independent testing.

  2. Efficient coarse simulation of a growing avascular tumor

    PubMed Central

    Kavousanakis, Michail E.; Liu, Ping; Boudouvis, Andreas G.; Lowengrub, John; Kevrekidis, Ioannis G.

    2013-01-01

    The subject of this work is the development and implementation of algorithms which accelerate the simulation of early stage tumor growth models. Among the different computational approaches used for the simulation of tumor progression, discrete stochastic models (e.g., cellular automata) have been widely used to describe processes occurring at the cell and subcell scales (e.g., cell-cell interactions and signaling processes). To describe macroscopic characteristics (e.g., morphology) of growing tumors, large numbers of interacting cells must be simulated. However, the high computational demands of stochastic models make the simulation of large-scale systems impractical. Alternatively, continuum models, which can describe behavior at the tumor scale, often rely on phenomenological assumptions in place of rigorous upscaling of microscopic models. This limits their predictive power. In this work, we circumvent the derivation of closed macroscopic equations for the growing cancer cell populations; instead, we construct, based on the so-called “equation-free” framework, a computational superstructure, which wraps around the individual-based cell-level simulator and accelerates the computations required for the study of the long-time behavior of systems involving many interacting cells. The microscopic model, e.g., a cellular automaton, which simulates the evolution of cancer cell populations, is executed for relatively short time intervals, at the end of which coarse-scale information is obtained. These coarse variables evolve on slower time scales than each individual cell in the population, enabling the application of forward projection schemes, which extrapolate their values at later times. This technique is referred to as coarse projective integration. Increasing the ratio of projection times to microscopic simulator execution times enhances the computational savings. 
Crucial accuracy issues arising for growing tumors with radial symmetry are addressed by applying the coarse projective integration scheme in a cotraveling (cogrowing) frame. As a proof of principle, we demonstrate that the application of this scheme yields highly accurate solutions, while preserving the computational savings of coarse projective integration. PMID:22587128
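The coarse projective integration scheme described in this abstract can be illustrated generically: run the microscopic simulator for a short burst, estimate the time derivative of a coarse variable from the end of the burst, then extrapolate forward over a longer interval. In the minimal sketch below, the "microscopic" model is a toy logistic growth rule standing in for a cellular automaton; the model, step sizes, and projection factor are assumptions for illustration only:

```python
def microscopic_step(x, dt=0.01):
    # Toy stand-in for an individual-based simulator whose coarse
    # variable x follows logistic growth (an illustrative assumption).
    return x + dt * x * (1.0 - x)

def coarse_projective_integration(x0, n_bursts=20, burst_steps=10,
                                  dt=0.01, projection_factor=5.0):
    """Alternate short microscopic bursts with forward extrapolation."""
    x, t = x0, 0.0
    history = [(t, x)]
    for _ in range(n_bursts):
        # 1) Run the microscopic simulator for a short burst.
        xs = [x]
        for _ in range(burst_steps):
            xs.append(microscopic_step(xs[-1], dt))
        # 2) Estimate the coarse time derivative from the burst.
        slope = (xs[-1] - xs[-2]) / dt
        # 3) Project the coarse variable forward over a larger interval,
        #    skipping projection_factor times the burst duration.
        leap = projection_factor * burst_steps * dt
        t += burst_steps * dt + leap
        x = xs[-1] + slope * leap
        history.append((t, x))
    return history

traj = coarse_projective_integration(0.05)
```

The computational saving comes from the ratio of projected time to simulated time (here 5:1); the paper's cotraveling-frame refinement addresses accuracy for radially growing tumors, which this flat sketch does not attempt.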

  3. Geothermal projects funded under the NER 300 programme - current state of development and knowledge gained

    NASA Astrophysics Data System (ADS)

    Shortall, Ruth; Uihlein, Andreas

    2017-04-01

Introduction The NER 300 programme, managed by the European Commission, is one of the largest funding programmes for innovative low-carbon energy demonstration projects. NER 300 is so called because it is funded from the sale of 300 million emission allowances from the new entrants' reserve (NER) set up for the third phase of the EU emissions trading system (ETS). The programme aims to successfully demonstrate environmentally safe carbon capture and storage (CCS) and innovative renewable energy (RES) technologies on a commercial scale, with a view to scaling up production of low-carbon technologies in the EU. Consequently, it supports a wide range of CCS and RES technologies (bioenergy, concentrated solar power, photovoltaics, geothermal, wind, ocean, hydropower, and smart grids). Funded projects and the role of geothermal projects for the programme In total, about EUR 2.1 billion has been awarded through the programme's two calls for proposals (the first awarded in December 2012, the second in July 2014). The programme has awarded around EUR 70 million in funding to three geothermal projects in Hungary, Croatia and France. The Croatian geothermal project will enter into operation during 2017, the Hungarian in 2018, and the French in 2020. Knowledge Sharing Knowledge-sharing requirements are built into the legal basis of the programme as a critical tool to lower risks in bridging the transition to large-scale production of innovative renewable energy and CCS deployment. Projects have to submit annually to the European Commission relevant knowledge gained during that year in the implementation of their project. The relevant knowledge is aggregated and disseminated by the European Commission to industry, research, government, NGO and other interest groups and associations in order to provide a better understanding of the practical challenges that arise in the important step of scaling up technologies and operating them at commercial scale.
The knowledge sharing of the NER 300 programme should lead to better planning and faster introduction of low carbon technologies in the future. Content of the presentation The presentation will introduce the geothermal projects that have been awarded funding (see Annex), including their state-of-play. Insights and knowledge gained from the projects that have entered into operation will be shown and discussed. Furthermore, the presentation will provide an overview of the NER 300 programme.

  4. Program Correctness, Verification and Testing for Exascale (Corvette)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sen, Koushik; Iancu, Costin; Demmel, James W

The goal of this project is to provide tools to assess the correctness of parallel programs written using hybrid parallelism. There is a dire lack of both theoretical and engineering know-how in the area of finding bugs in hybrid or large-scale parallel programs, which our research aims to change. In the project we have demonstrated novel approaches in several areas: 1) low-overhead, automated and precise detection of concurrency bugs at scale; 2) using low-overhead bug detection tools to guide speculative program transformations for performance; 3) techniques to reduce the concurrency required to reproduce a bug using partial program restart/replay; 4) techniques to provide reproducible execution of floating-point programs; and 5) techniques for tuning the floating-point precision used in codes.

  5. Intercomparison Project on Parameterizations of Large-Scale Dynamics for Simulations of Tropical Convection

    NASA Astrophysics Data System (ADS)

    Sobel, A. H.; Wang, S.; Bellon, G.; Sessions, S. L.; Woolnough, S.

    2013-12-01

    Parameterizations of large-scale dynamics have been developed in the past decade for studying the interaction between tropical convection and large-scale dynamics, based on our physical understanding of the tropical atmosphere. A principal advantage of these methods is that they offer a pathway to attack the key question of what controls large-scale variations of tropical deep convection. These methods have been used with both single column models (SCMs) and cloud-resolving models (CRMs) to study the interaction of deep convection with several kinds of environmental forcings. While much has been learned from these efforts, different groups' efforts are somewhat hard to compare. Different models, different versions of the large-scale parameterization methods, and experimental designs that differ in other ways are used. It is not obvious which choices are consequential to the scientific conclusions drawn and which are not. The methods have matured to the point that there is value in an intercomparison project. In this context, the Global Atmospheric Systems Study - Weak Temperature Gradient (GASS-WTG) project was proposed at the Pan-GASS meeting in September 2012. The weak temperature gradient approximation is one method to parameterize large-scale dynamics, and is used in the project name for historical reasons and simplicity, but another method, the damped gravity wave (DGW) method, will also be used in the project. The goal of the GASS-WTG project is to develop community understanding of the parameterization methods currently in use. Their strengths, weaknesses, and functionality in models with different physics and numerics will be explored in detail, and their utility to improve our understanding of tropical weather and climate phenomena will be further evaluated. This presentation will introduce the intercomparison project, including background, goals, and overview of the proposed experimental design. 
Interested groups will be invited to join (it will not be too late), and preliminary results will be presented.
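As a rough illustration of the weak temperature gradient (WTG) method named above, the large-scale pressure velocity can be diagnosed so that adiabatic cooling or warming relaxes the column's potential temperature toward a reference profile over a fixed timescale. This is a generic textbook-style formulation, not the GASS-WTG project's experimental specification; the profiles and the 2-hour relaxation timescale are assumptions:

```python
import numpy as np

def wtg_omega(theta, theta_ref, p, tau=7200.0):
    """Diagnose large-scale pressure velocity omega (Pa/s) from
        omega * d(theta_ref)/dp = (theta - theta_ref) / tau
    theta, theta_ref : potential temperature profiles (K)
    p                : pressure levels (Pa), decreasing with height
    tau              : relaxation timescale (s); 2 h is an assumption.
    """
    dtheta_dp = np.gradient(theta_ref, p)  # static stability in p-coordinates
    # Guard against vanishing stability (e.g. a well-mixed boundary layer).
    dtheta_dp = np.where(np.abs(dtheta_dp) < 1e-6, np.nan, dtheta_dp)
    return (theta - theta_ref) / (tau * dtheta_dp)

# Illustrative synthetic profiles: a column uniformly 1 K warmer than
# its reference environment.
p = np.linspace(100000.0, 20000.0, 40)                # Pa
theta_ref = 300.0 + 40.0 * (100000.0 - p) / 80000.0   # K, increasing upward
theta = theta_ref + 1.0
omega = wtg_omega(theta, theta_ref, p)
# A warm anomaly with d(theta)/dp < 0 yields omega < 0: large-scale ascent.
```

The damped gravity wave (DGW) method mentioned in the abstract couples the anomaly to a parameterized wave response instead of this direct relaxation, and is not sketched here.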

  6. Large Scale Data Mining to Improve Usability of Data: An Intelligent Archive Testbed

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram; Isaac, David; Yang, Wenli; Morse, Steve

    2005-01-01

    Research in certain scientific disciplines - including Earth science, particle physics, and astrophysics - continually faces the challenge that the volume of data needed to perform valid scientific research can at times overwhelm even a sizable research community. The desire to improve utilization of this data gave rise to the Intelligent Archives project, which seeks to make data archives active participants in a knowledge building system capable of discovering events or patterns that represent new information or knowledge. Data mining can automatically discover patterns and events, but it is generally viewed as unsuited for large-scale use in disciplines like Earth science that routinely involve very high data volumes. Dozens of research projects have shown promising uses of data mining in Earth science, but all of these are based on experiments with data subsets of a few gigabytes or less, rather than the terabytes or petabytes typically encountered in operational systems. To bridge this gap, the Intelligent Archives project is establishing a testbed with the goal of demonstrating the use of data mining techniques in an operationally-relevant environment. This paper discusses the goals of the testbed and the design choices surrounding critical issues that arose during testbed implementation.

  7. Final Technical Progress Report: Development of Low-Cost Suspension Heliostat; December 7, 2011 - December 6, 2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bender, W.

    2013-01-01

Final technical progress report of SunShot Incubator awardee Solaflect Energy. The project succeeded in demonstrating that the Solaflect Suspension Heliostat design is viable for large-scale CSP installations. Canting accuracy is acceptable and continues to improve as Solaflect refines its understanding of the design. Cost-reduction initiatives were successful, and many opportunities remain for further development and further cost reduction.

  8. Process for Low Cost Domestic Production of LIB Cathode Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thurston, Anthony

The objective of the research was to determine the best low-cost method for the large-scale production of nickel-cobalt-manganese (NCM) layered cathode materials. The research and development focused on scaling up the technology licensed from Argonne National Laboratory in BASF's battery material pilot plant in Beachwood, Ohio. Since BASF did not have experience with the large-scale production of NCM cathode materials, a significant amount of development was needed to support BASF's existing research program. During the three-year period BASF was able to develop and validate production processes for the NCM 111, 523 and 424 materials as well as begin development of the High Energy NCM. BASF also used this time period to provide free cathode material samples to numerous manufacturers, OEMs and research companies in order to validate the materials. The success of the project is demonstrated by the construction of the production plant in Elyria, Ohio and the successful operation of that facility. The benefit of the project to the public will begin to be apparent as soon as material from the production plant is used in electric vehicles.

  9. Web based visualization of large climate data sets

    USGS Publications Warehouse

    Alder, Jay R.; Hostetler, Steven W.

    2015-01-01

    We have implemented the USGS National Climate Change Viewer (NCCV), which is an easy-to-use web application that displays future projections from global climate models over the United States at the state, county and watershed scales. We incorporate the NASA NEX-DCP30 statistically downscaled temperature and precipitation for 30 global climate models being used in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC), and hydrologic variables we simulated using a simple water-balance model. Our application summarizes very large, complex data sets at scales relevant to resource managers and citizens and makes climate-change projection information accessible to users of varying skill levels. Tens of terabytes of high-resolution climate and water-balance data are distilled to compact binary format summary files that are used in the application. To alleviate slow response times under high loads, we developed a map caching technique that reduces the time it takes to generate maps by several orders of magnitude. The reduced access time scales to >500 concurrent users. We provide code examples that demonstrate key aspects of data processing, data exporting/importing and the caching technique used in the NCCV.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doug Cathro

The Lake Charles CCS Project is a large-scale industrial carbon capture and sequestration (CCS) project which will demonstrate advanced technologies that capture and sequester carbon dioxide (CO2) emissions from industrial sources into underground formations. Specifically, the Lake Charles CCS Project will accelerate commercialization of large-scale CO2 storage from industrial sources by leveraging synergy between a proposed petroleum coke to chemicals plant (the LCC Gasification Project) and the largest integrated anthropogenic CO2 capture, transport, and monitored sequestration program in the U.S. Gulf Coast Region. The Lake Charles CCS Project will promote the expansion of EOR in Texas and Louisiana and supply greater energy security by expanding domestic energy supplies. The capture, compression, pipeline, injection, and monitoring infrastructure will continue to sequester CO2 for many years after the completion of the term of the DOE agreement. The objectives of this project are expected to be fulfilled by working through two distinct phases. The overall objective of Phase 1 was to develop a fully definitive project basis for a competitive Renewal Application process to proceed into Phase 2 - Design, Construction and Operations. Phase 1 includes the studies attached hereto that will establish: the engineering design basis for the capture, compression and transportation of CO2 from the LCC Gasification Project, and the criteria and specifications for a monitoring, verification and accounting (MVA) plan at the Hastings oil field in Texas. 
The overall objective of Phase 2, provided a successful competitive down-selection, is to execute design, construction and operations of three capital projects: (1) the CO2 capture and compression equipment, (2) a Connector Pipeline from the LCC Gasification Project to the Green Pipeline owned by Denbury and an affiliate of Denbury, and (3) a comprehensive MVA system at the Hastings oil field.

  11. Large Scale eHealth Deployment in Europe: Insights from Concurrent Use of Standards.

    PubMed

    Eichelberg, Marco; Chronaki, Catherine

    2016-01-01

Large-scale eHealth deployment projects face a major challenge when called to select the right set of standards and tools to achieve sustainable interoperability in an ecosystem that includes both legacy systems and new systems reflecting technological trends and progress. No single standard covers all the needs of an eHealth project, and there is a multitude of overlapping and sometimes competing standards that can be employed to define document formats, terminology and communication protocols, mirroring alternative technical approaches and schools of thought. eHealth projects need to respond to the important question of how alternative or inconsistently implemented standards and specifications can be used to ensure practical interoperability and long-term sustainability in large-scale eHealth deployment. In the eStandards project, 19 European case studies reporting from R&D and large-scale eHealth deployment and policy projects were analyzed. Although this study is not exhaustive, by reflecting on the concepts, standards, and tools for concurrent use and on the successes, failures, and lessons learned, this paper offers practical insights into how eHealth deployment projects can make the most of the available eHealth standards and tools, and how standards- and profile-developing organizations can serve users while embracing sustainability and technical innovation.

  12. A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project.

    PubMed

    Ewers, Robert M; Didham, Raphael K; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L; Turner, Edgar C

    2011-11-27

    Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification.

  13. A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project

    PubMed Central

    Ewers, Robert M.; Didham, Raphael K.; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D.; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L.; Turner, Edgar C.

    2011-01-01

    Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification. PMID:22006969

  14. Drought and heatwaves in Europe: historical reconstruction and future projections

    NASA Astrophysics Data System (ADS)

    Samaniego, Luis; Thober, Stephan; Kumar, Rohini; Rakovec, Olda; Wood, Eric; Sheffield, Justin; Pan, Ming; Wanders, Niko; Prudhomme, Christel

    2017-04-01

Heat waves and droughts are creeping hydro-meteorological events that may bring societies and natural systems to their limits by inducing large famines, increasing health risks to the population, creating drinking and irrigation water shortfalls, inducing natural fires and degradation of soil and water quality, and in many cases causing large socio-economic losses. Europe, in particular, has endured large-scale drought-heatwave events in the recent past (e.g., the 2003 European drought), which induced enormous socio-economic losses as well as casualties. Recent studies showed that the prediction of droughts and heatwaves is subject to large-scale forcing and parametric uncertainties that lead to considerable uncertainties in the projections of extreme characteristics such as drought magnitude/duration and area under drought, among others. Future projections are also heavily influenced by RCP scenario uncertainty as well as the coarser spatial resolution of the models. The EDgE project funded by the Copernicus programme (C3S) provides a unique opportunity to investigate the evolution of droughts and heatwaves from 1950 until 2099 over the Pan-EU domain at a scale of 5x5 km2. In this project, high-resolution multi-model hydrologic simulations with mHM (www.ufz.de/mhm), Noah-MP, VIC and PCR-GLOBWB have been completed for the historical period 1955-2015. Climate projections have been carried out with five CMIP5 GCMs (GFDL-ESM2M, HadGEM2-ES, IPSL-CM5A-LR, MIROC-ESM-CHEM and NorESM1-M) from 2006 to 2099 under RCP2.6 and RCP8.5. Using these unprecedented multi-model simulations, daily soil moisture indices and temperature anomalies from 1955 until 2099 will be estimated. Following the procedure proposed by Samaniego et al. (2013), the probabilities of exceeding the benchmark events of the reference period 1980-2010 will be estimated for each RCP scenario. 
    References: http://climate.copernicus.eu/edge-end-end-demonstrator-improved-decision-making-water-sector-europe. Samaniego, L., R. Kumar, and M. Zink, 2013: Implications of parameter uncertainty on soil moisture drought analysis in Germany. J. Hydrometeor., 14, 47-68, doi:10.1175/JHM-D-12-075.1. Samaniego, L., et al., 2016: Propagation of forcing and model uncertainties onto hydrological drought characteristics in a multi-model century-long experiment in large river basins. Climatic Change, 1-15.
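
    The exceedance-probability step can be sketched with a simple empirical soil moisture index. This is a simplified stand-in for the kernel-density SMI of Samaniego et al. (2013); the percentile ranking, the 0.2 drought threshold, and the synthetic data are illustrative assumptions:

    ```python
    import numpy as np

    def soil_moisture_index(sm, ref):
        """Empirical soil moisture index: the quantile of each value within
        a reference-period distribution (0 = driest, 1 = wettest).
        Simplified stand-in for the kernel-density estimate of
        Samaniego et al. (2013)."""
        ref_sorted = np.sort(ref)
        # Rank each value against the reference sample (Weibull plotting position)
        ranks = np.searchsorted(ref_sorted, sm, side="right")
        return ranks / (len(ref_sorted) + 1)

    def drought_exceedance_prob(smi, threshold=0.2):
        """Fraction of time steps under drought (SMI below threshold)."""
        smi = np.asarray(smi)
        return float((smi < threshold).mean())

    rng = np.random.default_rng(0)
    ref = rng.normal(0.3, 0.05, 360)      # synthetic monthly soil moisture, 1980-2010
    future = rng.normal(0.27, 0.05, 360)  # a slightly drier projection
    smi = soil_moisture_index(future, ref)
    print(drought_exceedance_prob(smi))
    ```

    The same ranking applied to each RCP scenario's simulated soil moisture would yield the scenario-specific probabilities of exceeding the 1980-2010 benchmark events.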

  15. Development of a database system for mapping insertional mutations onto the mouse genome with large-scale experimental data

    PubMed Central

    2009-01-01

    Background Insertional mutagenesis is an effective method for functional genomic studies in various organisms. It can rapidly generate easily tractable mutations. A large-scale insertional mutagenesis with the piggyBac (PB) transposon is currently performed in mice at the Institute of Developmental Biology and Molecular Medicine (IDM), Fudan University in Shanghai, China. This project is carried out via collaborations among multiple groups overseeing interconnected experimental steps and generates a large volume of experimental data continuously. Therefore, the project calls for an efficient database system for recording, management, statistical analysis, and information exchange. Results This paper presents a database application called MP-PBmice (insertional mutation mapping system of PB Mutagenesis Information Center), which is developed to serve the on-going large-scale PB insertional mutagenesis project. The lightweight enterprise-level Struts-Spring-Hibernate development framework is used to provide robust and flexible support for the application. The MP-PBmice database system has three major features: strict access control, efficient workflow control, and good expandability. It supports the collaboration among different groups that enter data and exchange information on a daily basis, and is capable of providing real-time progress reports for the whole project. MP-PBmice can be easily adapted for other large-scale insertional mutation mapping projects, and the source code of this software is freely available at http://www.idmshanghai.cn/PBmice. Conclusion MP-PBmice is a web-based application for large-scale insertional mutation mapping onto the mouse genome, implemented with the widely used Struts-Spring-Hibernate framework. This system is already in use by the on-going genome-wide PB insertional mutation mapping project at IDM, Fudan University. PMID:19958505

  16. The value of demonstration projects for new interventions: The case of human papillomavirus vaccine introduction in low- and middle-income countries.

    PubMed

    Howard, N; Mounier-Jack, S; Gallagher, K E; Kabakama, S; Griffiths, U K; Feletto, M; LaMontagne, D S; Burchett, H E D; Watson-Jones, D

    2016-09-01

    Demonstration projects or pilots of new public health interventions aim to build learning and capacity to inform country-wide implementation. Authors examined the value of HPV vaccination demonstration projects and initial national programmes in low-income and lower-middle-income countries, including potential drawbacks and how value for national scale-up might be increased. Data from a systematic review and key informant interviews, analyzed thematically, covered 55 demonstration projects and 8 national programmes implemented between 2007 and 2015 (a combined 89 years' experience). Initial demonstration projects quickly provided consistent lessons. Value would increase if projects were designed to inform sustainable national scale-up. Well-designed projects can test multiple delivery strategies, implementation for challenging areas and populations, and integration with national systems. Introduction of vaccines or other health interventions, particularly those involving new target groups or delivery strategies, needs flexible funding approaches to address specific questions of scalability and sustainability, including learning lessons through phased national expansion.

  17. Guide to Documenting and Managing Cost and Performance Information for Remediation Projects - Revised Version

    EPA Pesticide Factsheets

    This Guide to Documenting and Managing Cost and Performance Information for Remediation Projects provides the recommended procedures for documenting the results of completed and on-going full-scale and demonstration-scale remediation projects.

  18. Optical mapping and its potential for large-scale sequencing projects.

    PubMed

    Aston, C; Mishra, B; Schwartz, D C

    1999-07-01

    Physical mapping has been rediscovered as an important component of large-scale sequencing projects. Restriction maps provide landmark sequences at defined intervals, and high-resolution restriction maps can be assembled from ensembles of single molecules by optical means. Such optical maps can be constructed from both large-insert clones and genomic DNA, and are used as a scaffold for accurately aligning sequence contigs generated by shotgun sequencing.

  19. A numerical projection technique for large-scale eigenvalue problems

    NASA Astrophysics Data System (ADS)

    Gamillscheg, Ralf; Haase, Gundolf; von der Linden, Wolfgang

    2011-10-01

    We present a new numerical technique to solve large-scale eigenvalue problems. It is based on the projection technique used in strongly correlated quantum many-body systems, where an effective approximate model of smaller complexity is first constructed by projecting out high-energy degrees of freedom, and the resulting model is then solved with a standard eigenvalue solver. Here we introduce a generalization of this idea, in which both steps are performed numerically and which, in contrast to the standard projection technique, converges in principle to the exact eigenvalues. This approach is applicable not only to eigenvalue problems encountered in many-body systems but also to those arising in other areas of research that lead to large-scale eigenvalue problems for matrices which have, roughly speaking, a pronounced dominant diagonal part. We present detailed studies of the approach guided by two many-body models.
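
    A minimal sketch of the projection idea, assuming a symmetric matrix with a dominant diagonal: keep only the k basis states with the lowest diagonal ("energy") entries and diagonalize the projected block. By eigenvalue interlacing, the reduced estimate bounds the true lowest eigenvalue from above; the paper's contribution is refining such a projection numerically toward the exact values, which this toy version does not do.

    ```python
    import numpy as np

    def projected_lowest_eigenvalue(H, k):
        """Approximate the lowest eigenvalue of a (nearly) diagonally
        dominant symmetric matrix H by projecting onto the k basis states
        with the smallest diagonal entries, then solving the small
        reduced problem exactly."""
        keep = np.argsort(np.diag(H))[:k]   # low-"energy" subspace
        H_red = H[np.ix_(keep, keep)]       # projected matrix
        return np.linalg.eigvalsh(H_red)[0]

    rng = np.random.default_rng(1)
    n = 200
    # Dominant diagonal plus weak symmetric off-diagonal coupling
    H = np.diag(np.linspace(0.0, 10.0, n)) + 0.01 * rng.standard_normal((n, n))
    H = (H + H.T) / 2
    approx = projected_lowest_eigenvalue(H, 30)
    exact = np.linalg.eigvalsh(H)[0]
    print(abs(approx - exact))
    ```

    The reduced problem here is 30 x 30 instead of 200 x 200, yet captures the lowest eigenvalue closely because the discarded states couple only weakly to the low-energy subspace.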

  20. Handling Big Data in Medical Imaging: Iterative Reconstruction with Large-Scale Automated Parallel Computation

    PubMed Central

    Lee, Jae H.; Yao, Yushu; Shrestha, Uttam; Gullberg, Grant T.; Seo, Youngho

    2014-01-01

    The primary goal of this project is to implement an iterative statistical image reconstruction algorithm, in this case maximum-likelihood expectation maximization (MLEM) as used for dynamic cardiac single photon emission computed tomography, on Spark/GraphX. This involves porting the algorithm to run on large-scale parallel computing systems. Spark is an easy-to-program software platform that can handle large amounts of data in parallel. GraphX is a graph analytics system running on top of Spark to handle graph and sparse linear algebra operations in parallel. The main advantage of implementing the MLEM algorithm in Spark/GraphX is that it allows users to parallelize such computation without any expertise in parallel computing or prior knowledge of computer science. In this paper we demonstrate a successful implementation of MLEM in Spark/GraphX and present the performance gains, with the goal of eventually making it usable in a clinical setting. PMID:27081299

  1. Handling Big Data in Medical Imaging: Iterative Reconstruction with Large-Scale Automated Parallel Computation.

    PubMed

    Lee, Jae H; Yao, Yushu; Shrestha, Uttam; Gullberg, Grant T; Seo, Youngho

    2014-11-01

    The primary goal of this project is to implement an iterative statistical image reconstruction algorithm, in this case maximum-likelihood expectation maximization (MLEM) as used for dynamic cardiac single photon emission computed tomography, on Spark/GraphX. This involves porting the algorithm to run on large-scale parallel computing systems. Spark is an easy-to-program software platform that can handle large amounts of data in parallel. GraphX is a graph analytics system running on top of Spark to handle graph and sparse linear algebra operations in parallel. The main advantage of implementing the MLEM algorithm in Spark/GraphX is that it allows users to parallelize such computation without any expertise in parallel computing or prior knowledge of computer science. In this paper we demonstrate a successful implementation of MLEM in Spark/GraphX and present the performance gains, with the goal of eventually making it usable in a clinical setting.
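
    The MLEM update itself is a pair of matrix-vector products per iteration, which is what makes it amenable to distributed sparse linear algebra on Spark/GraphX. A dense single-machine sketch of the standard update, with a made-up toy system matrix and geometry (not the paper's SPECT model):

    ```python
    import numpy as np

    def mlem(A, y, n_iter=50):
        """Maximum-likelihood expectation maximization for emission
        tomography: x <- x / (A^T 1) * A^T (y / (A x)).
        Dense NumPy sketch; a Spark/GraphX version distributes the same
        forward/back projections as sparse matrix-vector products."""
        x = np.ones(A.shape[1])               # uniform initial image
        sens = A.T @ np.ones(A.shape[0])      # sensitivity image (A^T 1)
        for _ in range(n_iter):
            proj = A @ x                      # forward projection
            x = x / sens * (A.T @ (y / proj)) # multiplicative update
        return x

    # Tiny synthetic example: 2-pixel image, 3 detector bins (assumed geometry)
    A = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
    x_true = np.array([2.0, 4.0])
    y = A @ x_true                            # noise-free measurement data
    x_hat = mlem(A, y, n_iter=200)
    print(x_hat)
    ```

    With noise-free data the multiplicative update converges to the true intensities; the positivity of the image is preserved automatically because every factor in the update is non-negative.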

  2. Environmental impacts of large-scale CSP plants in northwestern China.

    PubMed

    Wu, Zhiyong; Hou, Anping; Chang, Chun; Huang, Xiang; Shi, Duoqi; Wang, Zhifeng

    2014-01-01

    Several concentrated solar power demonstration plants are being constructed, and a few commercial plants have been announced in northwestern China. However, the mutual impacts between the concentrated solar power plants and their surrounding environments have not yet been addressed comprehensively in the literature by the parties involved in these projects. In China, these projects are especially important as an increasing amount of low-carbon electricity needs to be generated in order to maintain the current economic growth while simultaneously lessening pollution. In this study, the authors assess the potential environmental impacts of large-scale concentrated solar power plants. Specifically, the water use intensity, soil erosion and soil temperature are quantitatively examined. It was found that some of the impacts are favorable, while some impacts are negative in relation to traditional power generation techniques and some need further research before they can be reasonably appraised. In quantitative terms, concentrated solar power plants consume about 4000 L per MWh of water if wet cooling technology is used, and the collectors lead to soil temperature changes of between 0.5 and 4 °C; however, it was found that soil erosion is dramatically alleviated. The results of this study are helpful to decision-makers in concentrated solar power site selection and regional planning. Some conclusions of this study are also valid for large-scale photovoltaic plants.

  3. Utility-Scale Solar 2014. An Empirical Analysis of Project Cost, Performance, and Pricing Trends in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolinger, Mark; Seel, Joachim

    2015-09-01

    Other than the nine Solar Energy Generation Systems (“SEGS”) parabolic trough projects built in the 1980s, virtually no large-scale or “utility-scale” solar projects – defined here to include any ground-mounted photovoltaic (“PV”), concentrating photovoltaic (“CPV”), or concentrating solar thermal power (“CSP”) project larger than 5 MW AC – existed in the United States prior to 2007. By 2012 – just five years later – utility-scale solar had become the largest sector of the overall PV market in the United States, a distinction that was repeated in both 2013 and 2014 and that is expected to continue for at least the next few years. Over this same short period, CSP also experienced a bit of a renaissance in the United States, with a number of large new parabolic trough and power tower systems – some including thermal storage – achieving commercial operation. With this critical mass of new utility-scale projects now online and in some cases having operated for a number of years (generating not only electricity, but also empirical data that can be mined), the rapidly growing utility-scale sector is ripe for analysis. This report, the third edition in an ongoing annual series, meets this need through in-depth, annually updated, data-driven analysis of not just installed project costs or prices – i.e., the traditional realm of solar economics analyses – but also operating costs, capacity factors, and power purchase agreement (“PPA”) prices from a large sample of utility-scale solar projects in the United States. Given its current dominance in the market, utility-scale PV also dominates much of this report, though data from CPV and CSP projects are presented where appropriate.

  4. Fluorescence guided lymph node biopsy in large animals using direct image projection device

    NASA Astrophysics Data System (ADS)

    Ringhausen, Elizabeth; Wang, Tylon; Pitts, Jonathan; Akers, Walter J.

    2016-03-01

    The use of fluorescence imaging for aiding oncologic surgery is a fast-growing field in biomedical imaging, revolutionizing open and minimally invasive surgery practices. We have designed, constructed, and tested a system for fluorescence image acquisition and direct display on the surgical field for fluorescence guided surgery. The system uses a near-infrared sensitive CMOS camera for image acquisition, a near-infrared LED light source for excitation, and a DLP digital projector for projection of fluorescence image data onto the operating field in real time. Instrument control was implemented in Matlab for image capture, processing of acquired data and alignment of image parameters with the projected pattern. Accuracy of alignment was evaluated statistically to demonstrate sensitivity to small objects and alignment throughout the imaging field. After verification of accurate alignment, feasibility for clinical application was demonstrated in large animal models of sentinel lymph node biopsy. Indocyanine green was injected subcutaneously in Yorkshire pigs at various locations to model sentinel lymph node biopsy in gynecologic cancers, head and neck cancer, and melanoma. Fluorescence was detected by the camera system during operations and projected onto the imaging field, accurately identifying tissues containing the fluorescent tracer at up to 15 frames per second. Fluorescence information was projected as binary green regions after thresholding and denoising raw intensity data. This initial clinical-scale prototype provided encouraging results for the feasibility of optical projection of acquired luminescence during open oncologic surgeries.

  5. Reducing Risk in CO2 Sequestration: A Framework for Integrated Monitoring of Basin Scale Injection

    NASA Astrophysics Data System (ADS)

    Seto, C. J.; Haidari, A. S.; McRae, G. J.

    2009-12-01

    Geological sequestration of CO2 is an option for stabilization of atmospheric CO2 concentrations. Technical ability to safely store CO2 in the subsurface has been demonstrated through pilot projects and a long history of enhanced oil recovery and acid gas disposal operations. To address climate change, current injection operations must be scaled up by a factor of 100, raising issues of safety and security. Monitoring and verification is an essential component in ensuring safe operations and managing risk. Monitoring provides assurance that CO2 is securely stored in the subsurface, and that the mechanisms governing transport and storage are well understood. It also provides an early warning mechanism for identification of anomalies in performance, and a means for intervention and remediation through the ability to locate the CO2. Through theoretical studies, bench-scale experiments and pilot tests, a number of technologies have demonstrated their ability to monitor CO2 at the surface and in the subsurface. Because the focus of these studies has been to demonstrate feasibility, individual techniques have not been integrated to provide a more robust method for monitoring. Considering the large volumes required for injection, the size of the potential footprint, the length of time a project must be monitored, and the attendant uncertainty, operational considerations of cost and risk must be balanced against safety and security. Integration of multiple monitoring techniques will reduce uncertainty in monitoring injected CO2, thereby reducing risk. We present a framework for risk management of large-scale injection through model-based monitoring network design. This framework is applied to monitoring CO2 in a synthetic reservoir where there is uncertainty in the underlying permeability field controlling fluid migration. Deformation and seismic data are used to track plume migration.
A modified Ensemble Kalman filter approach is used to estimate flow properties by jointly assimilating flow and geomechanical observations. Issues of risk, cost and uncertainty are considered.
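
    A textbook Ensemble Kalman filter analysis step, of the kind such a modified approach builds on, can be sketched as follows. The tiny two-component state and linear observation operator are illustrative assumptions, not the paper's reservoir model:

    ```python
    import numpy as np

    def enkf_update(ensemble, obs, obs_op, obs_err_std, rng):
        """One EnKF analysis step: update an ensemble of state vectors
        (e.g. permeability parameters) with perturbed observations
        (e.g. deformation/seismic data). Generic stochastic-EnKF form."""
        n_ens = ensemble.shape[1]
        Hx = obs_op(ensemble)                                # predicted observations
        X = ensemble - ensemble.mean(axis=1, keepdims=True)  # state anomalies
        Y = Hx - Hx.mean(axis=1, keepdims=True)              # observation anomalies
        R = (obs_err_std ** 2) * np.eye(len(obs))
        # Kalman gain: cov(x,Hx) [cov(Hx,Hx) + R]^-1, anomalies absorb 1/(N-1)
        K = (X @ Y.T) @ np.linalg.inv(Y @ Y.T + (n_ens - 1) * R)
        perturbed = obs[:, None] + obs_err_std * rng.standard_normal((len(obs), n_ens))
        return ensemble + K @ (perturbed - Hx)

    rng = np.random.default_rng(2)
    H = np.array([[1.0, 0.0]])                 # observe only the first component
    truth = np.array([2.0, -1.0])
    prior = truth[:, None] + rng.standard_normal((2, 200))   # 200-member prior
    obs = H @ truth + 0.1 * rng.standard_normal(1)           # one noisy observation
    post = enkf_update(prior, obs, lambda E: H @ E, 0.1, rng)
    print(post.mean(axis=1))
    ```

    Joint assimilation of flow and geomechanical data works the same way: both data types are stacked into the observation vector, and the cross-covariances estimated from the ensemble spread the update into the unobserved flow properties.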

  6. Geothermal projects funded under the NER 300 programme - current state of development and knowledge gained

    NASA Astrophysics Data System (ADS)

    Uihlein, Andreas; Salto Saura, Lourdes; Sigfusson, Bergur; Lichtenvort, Kerstin; Gagliardi, Filippo

    2015-04-01

    Introduction The NER 300 programme, managed by the European Commission, is one of the largest funding programmes for innovative low-carbon energy demonstration projects. NER 300 is so called because it is funded from the sale of 300 million emission allowances from the new entrants' reserve (NER) set up for the third phase of the EU emissions trading system (ETS). The programme aims to successfully demonstrate environmentally safe carbon capture and storage (CCS) and innovative renewable energy (RES) technologies on a commercial scale with a view to scaling up production of low-carbon technologies in the EU. Consequently, it supports a wide range of CCS and RES technologies (bioenergy, concentrated solar power, photovoltaics, geothermal, wind, ocean, hydropower, and smart grids). Funded projects and the role of geothermal projects for the programme In total, about EUR 2.1 billion has been awarded to 39 projects through the programme's two calls for proposals (the first awarded in December 2012, the second in July 2014). The programme has awarded around EUR 70 million in funding to three geothermal projects in Hungary, Croatia and France (see Annex). The Hungarian geothermal project awarded funding under the first call will enter into operation at the end of 2015, and the others are expected to start in 2016 (HR) and 2018 (FR), respectively. Knowledge Sharing Knowledge sharing requirements are built into the legal basis of the programme as a critical tool to lower risks in bridging the transition to large-scale production of innovative renewable energy and CCS deployment. Projects have to submit annually to the European Commission relevant knowledge gained during that year in the implementation of their project.
The relevant knowledge is aggregated and disseminated by the European Commission to industry, research, government, NGO and other interest groups and associations in order to provide a better understanding of the practical challenges that arise in the important step of scaling up technologies and operating them at commercial scale. The knowledge sharing of the NER 300 programme should lead to better planning and faster introduction of low carbon technologies in the future. Content of the presentation The presentation will introduce the geothermal projects that have been awarded funding, including their state-of-play. Insights and knowledge gained from the projects that have entered into operation will be shown and discussed. Furthermore, the presentation will provide an overview of the NER 300 programme.

  7. Investigating and Stimulating Primary Teachers' Attitudes Towards Science: Summary of a Large-Scale Research Project

    ERIC Educational Resources Information Center

    Walma van der Molen, Juliette; van Aalderen-Smeets, Sandra

    2013-01-01

    Attention to the attitudes of primary teachers towards science is of fundamental importance to research on primary science education. The current article describes a large-scale research project that aims to overcome three main shortcomings in attitude research, i.e. lack of a strong theoretical concept of attitude, methodological flaws in…

  8. Model projections of atmospheric steering of Sandy-like superstorms

    PubMed Central

    Barnes, Elizabeth A.; Polvani, Lorenzo M.; Sobel, Adam H.

    2013-01-01

    Superstorm Sandy ravaged the eastern seaboard of the United States, costing a great number of lives and billions of dollars in damage. Whether events like Sandy will become more frequent as anthropogenic greenhouse gases continue to increase remains an open and complex question. Here we consider whether the persistent large-scale atmospheric patterns that steered Sandy onto the coast will become more frequent in the coming decades. Using the Coupled Model Intercomparison Project, phase 5 multimodel ensemble, we demonstrate that climate models consistently project a decrease in the frequency and persistence of the westward flow that led to Sandy’s unprecedented track, implying that future atmospheric conditions are less likely than at present to propel storms westward into the coast. PMID:24003129

  9. Manufacturing Demonstration Facility: Roll-to-Roll Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Datskos, Panos G; Joshi, Pooran C; List III, Frederick Alyious

    The Manufacturing Demonstration Facility (MDF) roll-to-roll processing effort described in this report provided an excellent opportunity to investigate a number of advanced manufacturing approaches to achieve a path to low-cost devices and sensors. Critical to this effort is the ability to deposit thin films at low temperatures using nanomaterials derived from nanofermentation. The overarching goal of this project was to develop roll-to-roll manufacturing processes of thin film deposition on low-cost flexible substrates for electronics and sensor applications. This project utilized ORNL's unique Pulse Thermal Processing (PTP) technologies coupled with non-vacuum low-temperature deposition techniques, ORNL's clean room facility, slot-die coating, drop casting, spin coating, screen printing, and several other pieces of equipment, including a Dimatix ink jet printer and a large-scale Kyocera ink jet printer. The roll-to-roll processing project had three main tasks: 1) develop and demonstrate Zn-based opto-electronic sensors using low-cost nanoparticulate structures manufactured in a related MDF project using nanofermentation techniques, 2) evaluate the use of silver-based conductive inks developed by project partner NovaCentrix for electronic device fabrication, and 3) demonstrate a suite of low-cost printed sensors developed using non-vacuum deposition techniques, which involved the integration of metal and semiconductor layers to establish a diverse sensor platform technology.

  10. Models projecting the fate of fish populations under climate change need to be based on valid physiological mechanisms.

    PubMed

    Lefevre, Sjannie; McKenzie, David J; Nilsson, Göran E

    2017-09-01

    Some recent modelling papers projecting smaller fish sizes and catches in a warmer future are based on erroneous assumptions regarding (i) the scaling of gills with body mass and (ii) the energetic cost of 'maintenance'. Assumption (i) posits that insurmountable geometric constraints prevent respiratory surface areas from growing as fast as body volume. It is argued that these constraints explain allometric scaling of energy metabolism, whereby larger fishes have relatively lower mass-specific metabolic rates. Assumption (ii) concludes that when fishes reach a certain size, basal oxygen demands will not be met, because of assumption (i). We here demonstrate unequivocally, by applying accepted physiological principles with reference to the existing literature, that these assumptions are not valid. Gills are folded surfaces, where the scaling of surface area to volume is not constrained by spherical geometry. The gill surface area can, in fact, increase linearly in proportion to gill volume and body mass. We cite the large body of evidence demonstrating that respiratory surface areas in fishes reflect metabolic needs, not vice versa, which explains the large interspecific variation in scaling of gill surface areas. Finally, we point out that future studies basing their predictions on models should incorporate factors for scaling of metabolic rate and for temperature effects on metabolism, which agree with measured values, and should account for interspecific variation in scaling and temperature effects. It is possible that some fishes will become smaller in the future, but to make reliable predictions the underlying mechanisms need to be identified and sought elsewhere than in geometric constraints on gill surface area. Furthermore, to ensure that useful information is conveyed to the public and policymakers about the possible effects of climate change, it is necessary to improve communication and congruity between fish physiologists and fisheries scientists. 
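
    The geometric point about folded surfaces can be made concrete with a toy calculation: a sphere's area grows only as volume^(2/3), while a stack of lamellae at fixed spacing has area proportional to volume. The spacing value below is arbitrary, chosen purely for illustration:

    ```python
    import math

    def sphere_area(V):
        """Surface area of a sphere of volume V: scales as V**(2/3),
        so area per unit volume falls as the body grows."""
        r = (3 * V / (4 * math.pi)) ** (1 / 3)
        return 4 * math.pi * r ** 2

    def folded_area(V, lamella_spacing=0.01):
        """Folded surface (stacked lamellae at fixed spacing): area is
        directly proportional to volume, so area/volume stays constant.
        Spacing value is an arbitrary illustrative constant."""
        return 2 * V / lamella_spacing

    for V in (1.0, 8.0, 64.0):
        print(V, sphere_area(V) / V, folded_area(V) / V)
    ```

    Doubling every linear dimension multiplies a sphere's area by 4 but its volume by 8, whereas the folded surface keeps pace with volume, which is the abstract's argument that gill area is not geometrically capped below linear scaling with body mass.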

  11. Subsurface Characterization and Seismic Monitoring for the Southwest Partnerships Phase III Demonstration Project at Farnsworth Field, TX

    NASA Astrophysics Data System (ADS)

    Will, R. A.; Balch, R. S.

    2015-12-01

    The Southwest Partnership on Carbon Sequestration is performing seismic-based characterization and monitoring activities at an active CO2 EOR project at Farnsworth Field, Texas. CO2 is anthropogenically sourced from a fertilizer plant and an ethanol plant. The field has 13 CO2 injectors and has sequestered 302,982 metric tonnes of CO2 since October 2013. The field site provides an excellent laboratory for testing a range of monitoring technologies in an operating CO2 flood, since planned development is sequential and allows for multiple opportunities to record zero-CO2 baseline data, mid-flood data, and fully flooded data. The project is comparing and contrasting several scales of seismic technologies in order to determine best practices for large-scale commercial sequestration projects. Characterization efforts include an 85 km2 3D surface seismic survey, baseline and repeat 3D VSP surveys centered on injection wells, cross-well tomography baseline and repeat surveys between injector/producer pairs, and a borehole passive seismic array to monitor induced seismicity. All surveys have contributed to detailed geologic models, which were then used for fluid flow and risk assessment simulations. 3D VSP and cross-well data with repeat surveys have allowed for direct comparisons of the reservoir prior to CO2 injection and at eight months into injection, with a goal of imaging the CO2 plume as it moves away from injection wells. Additional repeat surveys at regular intervals will continue to refine the image of the plume. The goal of this work is to demonstrate seismic-based technologies to monitor CO2 sequestration projects, and to contribute to best practices manuals for commercial-scale CO2 sequestration projects. In this talk the seismic plan will be outlined, progress towards goals enumerated, and preliminary results from baseline and repeat seismic data will be discussed. Funding for this project is provided by the U.S. Department of Energy under Award No. DE-FC26-05NT42591.

  12. The impact of large-scale, long-term optical surveys on pulsating star research

    NASA Astrophysics Data System (ADS)

    Soszyński, Igor

    2017-09-01

    The era of large-scale photometric variability surveys began a quarter of a century ago, when three microlensing projects - EROS, MACHO, and OGLE - started their operation. These surveys initiated a revolution in the field of variable stars, and in the following years they inspired many new observational projects. Large-scale optical surveys multiplied the number of variable stars known in the Universe. The huge, homogeneous and complete catalogs of pulsating stars, such as Cepheids, RR Lyrae stars, or long-period variables, offer an unprecedented opportunity to calibrate and test the accuracy of various distance indicators, to trace the three-dimensional structure of the Milky Way and other galaxies, to discover exotic types of intrinsically variable stars, or to study previously unknown features and behaviors of pulsators. We present historical and recent findings on various types of pulsating stars obtained from the optical large-scale surveys, with particular emphasis on the OGLE project, which currently offers the largest photometric database among surveys for stellar variability.

  13. SUBTASK 2.19 – OPERATIONAL FLEXIBILITY OF CO2 TRANSPORT AND STORAGE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jensen, Melanie; Schlasner, Steven; Sorensen, James

    2014-12-31

    Carbon dioxide (CO2) is produced in large quantities during electricity generation and by industrial processes. These CO2 streams vary in terms of both composition and mass flow rate, sometimes substantially. The impact of a varying CO2 stream on pipeline and storage operation is not fully understood in terms of either operability or infrastructure robustness. This study was performed to summarize basic background from the literature on the topic of operational flexibility of CO2 transport and storage, but the primary focus was on compiling real-world lessons learned about flexible operation of CO2 pipelines and storage from both large-scale field demonstrations and commercial operating experience. Modeling and pilot-scale results of research in this area were included to illustrate some of the questions that exist relative to operation of carbon capture and storage (CCS) projects with variable CO2 streams. It is hoped that this report’s real-world findings provide readers with useful information on the topic of transport and storage of variable CO2 streams. The real-world results were obtained from two sources. The first source consisted of five full-scale, commercial transport–storage projects: Sleipner, Snøhvit, In Salah, Weyburn, and Illinois Basin–Decatur. These scenarios were reviewed to determine the information that is available about CO2 stream variability/intermittency on these demonstration-scale projects. The five projects all experienced mass flow variability or an interruption in flow. In each case, pipeline and/or injection engineers were able to accommodate any issues that arose. Significant variability in composition has not been an issue at these five sites. The second source of real-world results was telephone interviews conducted with experts in CO2 pipeline transport, injection, and storage, during which commercial anecdotal information was acquired to augment that found during the literature search of the five full-scale projects.
    The experts represented a range of disciplines and hailed from North America and Europe. A major finding of the study is that experience with compression and transport of CO2 for enhanced oil recovery (EOR) in the United States has shown that impurities are not likely to cause transport problems if CO2 stream composition standards are maintained and pressures are kept at 10.3 MPa or higher. Cyclic, or otherwise intermittent, CO2 supplies historically have not impacted in-field distribution pipeline networks, wellbore integrity, or reservoir conditions. The U.S. EOR industry has demonstrated that it is possible to adapt to variability and intermittency in CO2 supply through flexible operation of the pipeline and geologic storage facility. This CO2 transport and injection experience represents knowledge that can be applied in future CCS projects. A number of gaps in knowledge were identified that may benefit from future research and development, further enhancing the possibility for widespread application of CCS. This project was funded through the Energy & Environmental Research Center–U.S. Department of Energy Joint Program on Research and Development for Fossil Energy-Related Resources Cooperative Agreement No. DE-FC26-08NT43291. Nonfederal funding was provided by the IEA Greenhouse Gas R&D Programme.

  14. Using Markov chains of nucleotide sequences as a possible precursor to predict functional roles of human genome: a case study on inactive chromatin regions.

    PubMed

    Lee, K-E; Lee, E-J; Park, H-S

    2016-08-30

    Recent advances in computational epigenetics have provided new opportunities to evaluate n-gram probabilistic language models. In this paper, we describe a systematic genome-wide approach for predicting functional roles in inactive chromatin regions by using a sequence-based Markovian chromatin map of the human genome. We demonstrate that Markov chains of sequences can be used as a precursor to predict functional roles in heterochromatin regions and provide an example comparing two publicly available chromatin annotations of large-scale epigenomics projects: ENCODE project consortium and Roadmap Epigenomics consortium.
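The record above builds a sequence-based Markovian map of the genome. As a minimal illustrative sketch of the underlying idea (not the authors' pipeline; the toy sequence is invented), a first-order Markov chain over nucleotides can be estimated from raw transition counts:

```python
from collections import defaultdict

def markov_transitions(seq, order=1):
    """Estimate k-mer -> next-base transition probabilities
    from raw counts in a nucleotide sequence."""
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(len(seq) - order):
        state, nxt = seq[i:i + order], seq[i + order]
        counts[state][nxt] += 1
    # Normalize each state's counts into probabilities.
    return {s: {b: n / sum(d.values()) for b, n in d.items()}
            for s, d in counts.items()}

# Toy sequence for illustration only.
probs = markov_transitions("ACGTACGTAAGGTT")
```

Higher-order models (larger `order`) correspond to the n-gram language models the abstract mentions; regions can then be scored by the log-likelihood their transitions receive under the fitted chain.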

  15. Neighborhood scale quantification of ecosystem goods and ...

    EPA Pesticide Factsheets

    Ecosystem goods and services (EGS) are those ecological structures and functions that humans can directly relate to their state of well-being. Ecosystem goods and services include, but are not limited to, a sufficient fresh water supply, fertile lands to produce agricultural products, shading, air and water of sufficient quality for designated uses, flood water retention, and places to recreate. The US Environmental Protection Agency (USEPA) Office of Research and Development’s Tampa Bay Ecosystem Services Demonstration Project (TBESDP) modeling efforts organized existing literature values for biophysical attributes and processes related to EGS. The goal was to develop a database for informing map-based EGS assessments for current and future land cover/use scenarios at multiple scales. This report serves as a demonstration of applying an EGS assessment approach at the large neighborhood scale (~1,000 acres of residential parcels plus common areas). Here, we present mapped inventories of ecosystem goods and services production at a neighborhood scale within the Tampa Bay, FL region. Comparisons of the inventory between two alternative neighborhood designs are presented as an example of how one might apply EGS concepts at this scale.

  16. The QUANTGRID Project (RO)—Quantum Security in GRID Computing Applications

    NASA Astrophysics Data System (ADS)

    Dima, M.; Dulea, M.; Petre, M.; Petre, C.; Mitrica, B.; Stoica, M.; Udrea, M.; Sterian, R.; Sterian, P.

    2010-01-01

    The QUANTGRID Project, financed through the National Center for Programme Management (CNMP-Romania), is the first attempt at using Quantum Crypted Communications (QCC) in large-scale operations, such as GRID computing, and conceivably in the years ahead in the banking sector and other security-tight communications. In collaboration with the GRID activities of the Center for Computing & Communications (Nat.'l Inst. Nucl. Phys.—IFIN-HH), the Quantum Optics Lab. (Nat.'l Inst. Plasma and Lasers—INFLPR), and the Physics Dept. (University Polytechnica—UPB), the project will build a demonstrator infrastructure for this technology. The status of the project in its incipient phase is reported, featuring tests for communications in classical security mode: socket-level communications under AES (Advanced Encryption Std.), using proprietary code implemented in C++. An outline of the planned undertaking of the project is communicated, highlighting its impact in quantum physics, coherent optics, and information technology.

  17. Incorporating variability in simulations of seasonally forced phenology using integral projection models

    DOE PAGES

    Goodsman, Devin W.; Aukema, Brian H.; McDowell, Nate G.; ...

    2017-11-26

    Phenology models are becoming increasingly important tools to accurately predict how climate change will impact the life histories of organisms. We propose a class of integral projection phenology models derived from stochastic individual-based models of insect development and demography. Our derivation, which is based on the rate summation concept, produces integral projection models that capture the effect of phenotypic rate variability on insect phenology, but which are typically more computationally frugal than equivalent individual-based phenology models. We demonstrate our approach using a temperature-dependent model of the demography of the mountain pine beetle (Dendroctonus ponderosae Hopkins), an insect that kills mature pine trees. This work illustrates how a wide range of stochastic phenology models can be reformulated as integral projection models. Due to their computational efficiency, these integral projection models are suitable for deployment in large-scale simulations, such as studies of altered pest distributions under climate change.
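The rate summation concept the abstract refers to can be sketched in a few lines. In the sketch below, the development threshold, temperature-rate function, and phenotype multipliers are all hypothetical illustrations, not parameters from the paper:

```python
import numpy as np

def development_fraction(temps, rate_fn, multipliers, weights):
    """Rate summation: temperature-dependent development rates are
    accumulated over time, and an individual completes the stage once
    its accumulated rate reaches 1.  Phenotypic rate variability enters
    as a discretized distribution of rate multipliers with weights."""
    total = sum(rate_fn(t) for t in temps)   # shared rate accumulation
    done = multipliers * total >= 1.0        # per-phenotype completion
    return float(np.dot(weights, done))      # weighted fraction completed

# Hypothetical linear rate above an 8 degC threshold (illustration only).
rate = lambda t: max(0.0, 0.01 * (t - 8.0))
temps = [15.0] * 20                          # 20 days at a constant 15 degC
mult = np.array([0.5, 1.0, 1.5])             # slow / median / fast phenotypes
w = np.array([0.25, 0.5, 0.25])              # kernel weights (sum to 1)
frac = development_fraction(temps, rate, mult, w)
```

Replacing the three-point multiplier distribution with a continuous kernel integrated over phenotypes gives the integral projection form, which avoids simulating every individual separately.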

  18. The United States Department Of Agriculture Northeast Area-wide Tick Control Project: history and protocol.

    PubMed

    Pound, Joe Mathews; Miller, John Allen; George, John E; Fish, Durland

    2009-08-01

    The Northeast Area-wide Tick Control Project (NEATCP) was funded by the United States Department of Agriculture (USDA) as a large-scale cooperative demonstration project of the USDA-Agricultural Research Service (ARS)-patented 4-Poster tick control technology (Pound et al. 1994) involving the USDA-ARS and a consortium of universities, state agencies, and a consulting firm at research locations in the five states of Connecticut (CT), Maryland (MD), New Jersey (NJ), New York (NY), and Rhode Island (RI). The stated objective of the project was "A community-based field trial of ARS-patented tick control technology designed to reduce the risk of Lyme disease in northeastern states." Here we relate the rationale and history of the technology, a chronological listing of events leading to implementation of the project, the original protocol for selecting treatment and control sites, and protocols for deployment of treatments, sampling, assays, data analyses, and estimates of efficacy.

  19. Incorporating variability in simulations of seasonally forced phenology using integral projection models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodsman, Devin W.; Aukema, Brian H.; McDowell, Nate G.

    Phenology models are becoming increasingly important tools to accurately predict how climate change will impact the life histories of organisms. We propose a class of integral projection phenology models derived from stochastic individual-based models of insect development and demography. Our derivation, which is based on the rate summation concept, produces integral projection models that capture the effect of phenotypic rate variability on insect phenology, but which are typically more computationally frugal than equivalent individual-based phenology models. We demonstrate our approach using a temperature-dependent model of the demography of the mountain pine beetle (Dendroctonus ponderosae Hopkins), an insect that kills mature pine trees. This work illustrates how a wide range of stochastic phenology models can be reformulated as integral projection models. Due to their computational efficiency, these integral projection models are suitable for deployment in large-scale simulations, such as studies of altered pest distributions under climate change.

  20. Commercial-scale biotherapeutics manufacturing facility for plant-made pharmaceuticals.

    PubMed

    Holtz, Barry R; Berquist, Brian R; Bennett, Lindsay D; Kommineni, Vally J M; Munigunti, Ranjith K; White, Earl L; Wilkerson, Don C; Wong, Kah-Yat I; Ly, Lan H; Marcel, Sylvain

    2015-10-01

    The need for rapid, large-scale manufacture of medical countermeasures can be uniquely met by the plant-made-pharmaceutical platform technology. As a participant in the Defense Advanced Research Projects Agency (DARPA) Blue Angel project, the Caliber Biotherapeutics facility was designed, constructed, and commissioned, and it released a therapeutic target (an H1N1 influenza subunit vaccine) in <18 months from groundbreaking. As of 2015, this facility was one of the world's largest plant-based manufacturing facilities, with the capacity to process over 3500 kg of plant biomass per week in an automated multilevel growing environment using proprietary LED lighting. The facility can commission additional plant grow rooms that are already built to double this capacity. In addition to the commercial-scale manufacturing facility, a pilot production facility was designed based on the large-scale manufacturing specifications as a way to integrate product development and technology transfer. The primary research, development and manufacturing system employs vacuum-infiltrated Nicotiana benthamiana plants grown in a fully contained, hydroponic system for transient expression of recombinant proteins. This expression platform has been linked to a downstream process system, analytical characterization, and assessment of biological activity. This integrated approach has demonstrated rapid, high-quality production of therapeutic monoclonal antibody targets, including a panel of rituximab biosimilar/biobetter molecules and antiviral antibodies against influenza and dengue fever. © 2015 Society for Experimental Biology, Association of Applied Biologists and John Wiley & Sons Ltd.

  1. Plans for Embedding ICTs into Teaching and Learning through a Large-Scale Secondary Education Reform in the Country of Georgia

    ERIC Educational Resources Information Center

    Richardson, Jayson W.; Sales, Gregory; Sentocnik, Sonja

    2015-01-01

    Integrating ICTs into international development projects is common. However, focusing on how ICTs support leading, teaching, and learning is often overlooked. This article describes a team's approach to technology integration into the design of a large-scale, five year, teacher and leader professional development project in the country of Georgia.…

  2. Using microstructure observations to quantify fracture properties and improve reservoir simulations. Final report, September 1998

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laubach, S.E.; Marrett, R.; Rossen, W.

    The research for this project provides new technology to understand and successfully characterize, predict, and simulate reservoir-scale fractures. Such fractures have worldwide importance because of their influence on successful extraction of resources. The scope of this project includes creation and testing of new methods to measure, interpret, and simulate reservoir fractures that overcome the challenge of inadequate sampling. The key to these methods is the use of microstructures as guides to the attributes of the large fractures that control reservoir behavior. One accomplishment of the project research is a demonstration that these microstructures can be reliably and inexpensively sampled. Specific goals of this project were to: create and test new methods of measuring attributes of reservoir-scale fractures, particularly as fluid conduits, and test the methods on samples from reservoirs; extrapolate structural attributes to the reservoir scale through rigorous mathematical techniques and help build accurate and useful 3-D models of the interwell region; and design new ways to incorporate geological and geophysical information into reservoir simulation and verify the accuracy by comparison with production data. New analytical methods developed in the project are leading to a more realistic characterization of fractured reservoir rocks. Testing diagnostic and predictive approaches was an integral part of the research, and several tests were successfully completed.

  3. Analysis of central enterprise architecture elements in models of six eHealth projects.

    PubMed

    Virkanen, Hannu; Mykkänen, Juha

    2014-01-01

    Large-scale initiatives for eHealth services have been established in many countries on regional or national level. The use of Enterprise Architecture has been suggested as a methodology to govern and support the initiation, specification and implementation of large-scale initiatives including the governance of business changes as well as information technology. This study reports an analysis of six health IT projects in relation to Enterprise Architecture elements, focusing on central EA elements and viewpoints in different projects.

  4. Fishermen's Energy Atlantic City Wind Farm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wissemann, Chris

    This final report for the Fishermen's Energy Atlantic City Wind Farm, prepared under the US DOE Advanced Technology Demonstration project, documents achievements in developing a demonstration-scale offshore wind project off the coast of New Jersey.

  5. CONSIDERATIONS FOR A REGULATORY FRAMEWORK FOR LARGE-SCALE GEOLOGIC SEQUESTRATION OF CARBON DIOXIDE: A NORTH AMERICAN PERSPECTIVE

    EPA Science Inventory

    Large scale geologic sequestration (GS) of carbon dioxide poses a novel set of challenges for regulators. This paper focuses on the unique needs of large scale GS projects in light of the existing regulatory regimes in the United States and Canada and identifies several differen...

  6. Energy Efficient Engine acoustic supporting technology report

    NASA Technical Reports Server (NTRS)

    Lavin, S. P.; Ho, P. Y.

    1985-01-01

    The acoustic development of the Energy Efficient Engine combined testing and analysis using scale model rigs and an integrated Core/Low Spool demonstration engine. The scale model tests show that a cut-on blade/vane ratio fan with a large spacing (S/C = 2.3) is as quiet as a cut-off blade/vane ratio with a tighter spacing (S/C = 1.27). Scale model mixer tests show that separate flow nozzles are the noisiest, conic nozzles the quietest, with forced mixers in between. Based on projections of ICLS data the Energy Efficient Engine (E3) has FAR 36 margins of 3.7 EPNdB at approach, 4.5 EPNdB at full power takeoff, and 7.2 EPNdB at sideline conditions.

  7. StePS: Stereographically Projected Cosmological Simulations

    NASA Astrophysics Data System (ADS)

    Rácz, Gábor; Szapudi, István; Csabai, István; Dobos, László

    2018-05-01

    StePS (Stereographically Projected Cosmological Simulations) compactifies the infinite spatial extent of the Universe into a finite sphere with isotropic boundary conditions to simulate the evolution of the large-scale structure. This eliminates the need for periodic boundary conditions, which are a numerical convenience unsupported by observation and which modify the law of force on large scales in an unrealistic fashion. StePS uses stereographic projection for space compactification and a naive O(N^2) force calculation; this arrives at a correlation function of the same quality more quickly than standard (tree or P3M) algorithms with similar spatial and mass resolution. The N^2 force calculation is easy to adapt to modern graphics cards, hence StePS can function as a high-speed prediction tool for modern large-scale surveys.

  8. Compactified cosmological simulations of the infinite universe

    NASA Astrophysics Data System (ADS)

    Rácz, Gábor; Szapudi, István; Csabai, István; Dobos, László

    2018-06-01

    We present a novel N-body simulation method that compactifies the infinite spatial extent of the Universe into a finite sphere with isotropic boundary conditions to follow the evolution of the large-scale structure. Our approach eliminates the need for periodic boundary conditions, a mere numerical convenience which is not supported by observation and which modifies the law of force on large scales in an unrealistic fashion. We demonstrate that our method outclasses standard simulations executed on workstation-scale hardware in dynamic range, is balanced in following a comparable number of high and low k modes, and has fundamental geometry and topology that match observations. Our approach is also capable of simulating an expanding, infinite universe in static coordinates with Newtonian dynamics. The price of these achievements is that most of the simulated volume has smoothly varying mass and spatial resolution, an approximation that carries different systematics than periodic simulations. Our initial implementation of the method is called StePS, which stands for Stereographically projected cosmological simulations. It uses stereographic projection for space compactification and a naive O(N^2) force calculation, which nevertheless arrives at a correlation function of the same quality faster than any standard (tree or P3M) algorithm with similar spatial and mass resolution. The N^2 force calculation is easy to adapt to modern graphics cards, hence our code can function as a high-speed prediction tool for modern large-scale surveys. To learn about the limits of the respective methods, we compare StePS with GADGET-2 running matching initial conditions.
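The naive O(N^2) force calculation mentioned in both StePS records is a direct pairwise sum over all particles. A minimal NumPy sketch (with an assumed Plummer softening parameter, which is a standard N-body device rather than a detail taken from the papers) looks like:

```python
import numpy as np

def direct_accelerations(pos, mass, G=1.0, soft=1e-3):
    """Naive O(N^2) gravitational accelerations by direct summation:
    every particle interacts with every other particle, with Plummer
    softening to avoid divergence at small separations."""
    n = len(mass)
    acc = np.zeros_like(pos)
    for i in range(n):
        d = pos - pos[i]                       # vectors from particle i
        r2 = (d ** 2).sum(axis=1) + soft ** 2  # softened squared distances
        r2[i] = np.inf                         # exclude self-interaction
        acc[i] = (G * mass[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)
    return acc

# Two unit masses one unit apart (illustration): equal and opposite pull.
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
acc = direct_accelerations(pos, np.array([1.0, 1.0]))
```

The inner loop body is embarrassingly parallel over `i`, which is why the abstracts note the N^2 sum maps so readily onto graphics cards.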

  9. Columbia University's Informatics for Diabetes Education and Telemedicine (IDEATel) Project

    PubMed Central

    Starren, Justin; Hripcsak, George; Sengupta, Soumitra; Abbruscato, C.R.; Knudson, Paul E.; Weinstock, Ruth S.; Shea, Steven

    2002-01-01

    The Columbia University Informatics for Diabetes Education and Telemedicine (IDEATel) project is a four-year demonstration project funded by the Centers for Medicare and Medicaid Services with the overall goal of evaluating the feasibility, acceptability, effectiveness, and cost-effectiveness of telemedicine. The focal point of the intervention is the home telemedicine unit (HTU), which provides four functions: synchronous videoconferencing over standard telephone lines, electronic transmission of fingerstick glucose and blood pressure readings, secure Web-based messaging and clinical data review, and access to Web-based educational materials. The HTU must be usable by elderly patients with no prior computer experience. Providing these functions through the HTU requires tight integration of six components: the HTU itself, case management software, a clinical information system, Web-based educational material, data security, and networking and telecommunications. These six components were integrated through a variety of interfaces, providing a system that works well for patients and providers. With more than 400 HTUs installed, IDEATel has demonstrated the feasibility of large-scale home telemedicine. PMID:11751801

  10. Effects of national ecological restoration projects on carbon sequestration in China from 2001 to 2010.

    PubMed

    Lu, Fei; Hu, Huifeng; Sun, Wenjuan; Zhu, Jiaojun; Liu, Guobin; Zhou, Wangming; Zhang, Quanfa; Shi, Peili; Liu, Xiuping; Wu, Xing; Zhang, Lu; Wei, Xiaohua; Dai, Limin; Zhang, Kerong; Sun, Yirong; Xue, Sha; Zhang, Wanjun; Xiong, Dingpeng; Deng, Lei; Liu, Bojie; Zhou, Li; Zhang, Chao; Zheng, Xiao; Cao, Jiansheng; Huang, Yao; He, Nianpeng; Zhou, Guoyi; Bai, Yongfei; Xie, Zongqiang; Tang, Zhiyao; Wu, Bingfang; Fang, Jingyun; Liu, Guohua; Yu, Guirui

    2018-04-17

    The long-term stressful utilization of forests and grasslands has led to ecosystem degradation and C loss. Since the late 1970s, China has launched six key national ecological restoration projects to protect its environment and restore degraded ecosystems. Here, we conducted a large-scale field investigation and a literature survey of biomass and soil C in China's forest, shrubland, and grassland ecosystems across the regions where the six projects were implemented (∼16% of the country's land area). We investigated the changes in the C stocks of these ecosystems to evaluate the contributions of the projects to the country's C sink between 2001 and 2010. Over this decade, we estimated that the total annual C sink in the project region was 132 Tg C per y (1 Tg = 10^12 g), over half of which (74 Tg C per y, 56%) was attributed to the implementation of the projects. Our results demonstrate that these restoration projects have substantially contributed to CO2 mitigation in China.

  11. Scale-up of mild gasification to be a process development unit mildgas 24 ton/day PDU design report. Final report, November 1991--July 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    From November 1991 to April 1996, Kerr McGee Coal Corporation (K-M Coal) led a project to develop the Institute of Gas Technology (IGT) Mild Gasification (MILDGAS) process for near-term commercialization. The specific objectives of the program were to: design, construct, and operate a 24-tons/day adiabatic process development unit (PDU) to obtain process performance data suitable for further design scale-up; obtain large batches of coal-derived co-products for industrial evaluation; prepare a detailed design of a demonstration unit; and develop technical and economic plans for commercialization of the MILDGAS process. The project team for the PDU development program consisted of: K-M Coal, IGT, Bechtel Corporation, Southern Illinois University at Carbondale (SIUC), General Motors (GM), Pellet Technology Corporation (PTC), LTV Steel, Armco Steel, Reilly Industries, and Auto Research.

  12. Development and Demonstration of an Aerial Imagery Assessment Method to Monitor Changes in Restored Stream Condition

    NASA Astrophysics Data System (ADS)

    Fong, L. S.; Ambrose, R. F.

    2017-12-01

    Remote sensing is an excellent way to assess the changing condition of streams and wetlands. Several studies have measured large-scale changes in riparian condition indicators, but few have remotely applied multi-metric assessments on a finer scale to measure changes, such as those caused by restoration, in the condition of small riparian areas. We developed an aerial imagery assessment method (AIAM) that combines landscape, hydrology, and vegetation observations into one index describing overall ecological condition of non-confined streams. Verification of AIAM demonstrated that sites in good condition (as assessed on-site by the California Rapid Assessment Method) received high AIAM scores. (AIAM was not verified with poor condition sites.) Spearman rank correlation tests comparing AIAM and the field-based California Rapid Assessment Method (CRAM) results revealed that some components of the two methods were highly correlated. The application of AIAM is illustrated with time-series restoration trajectories of three southern California stream restoration projects aged 15 to 21 years. The trajectories indicate that the projects improved in condition in years following their restoration, with vegetation showing the most dynamic change over time. AIAM restoration trajectories also overlapped to different degrees with CRAM chronosequence restoration performance curves that demonstrate the hypothetical development of high-performing projects. AIAM has high potential as a remote ecological assessment method and effective tool to determine restoration trajectories. Ultimately, this tool could be used to further improve stream and wetland restoration management.

  13. Implementing Projects in Calculus on a Large Scale at the University of South Florida

    ERIC Educational Resources Information Center

    Fox, Gordon A.; Campbell, Scott; Grinshpan, Arcadii; Xu, Xiaoying; Holcomb, John; Bénéteau, Catherine; Lewis, Jennifer E.; Ramachandran, Kandethody

    2017-01-01

    This paper describes the development of a program of project-based learning in Calculus courses at a large urban research university. In this program, students developed research projects in consultation with a faculty advisor in their major, and supervised by their calculus instructors. Students wrote up their projects in a prescribed format…

  14. How Robust Is Your Project? From Local Failures to Global Catastrophes: A Complex Networks Approach to Project Systemic Risk.

    PubMed

    Ellinas, Christos; Allan, Neil; Durugbo, Christopher; Johansson, Anders

    2015-01-01

    Current societal requirements necessitate the effective delivery of complex projects that can do more while using less. Yet, recent large-scale project failures suggest that our ability to successfully deliver them is still in its infancy. Such failures can arise through various failure mechanisms; this work focuses on one such mechanism. Specifically, it examines the likelihood of a project sustaining a large-scale catastrophe, as triggered by a single task failure and delivered via a cascading process. To do so, an analytical model was developed and tested on an empirical dataset by means of numerical simulation. This paper makes three main contributions. First, it provides a methodology to identify the tasks most capable of impacting a project. In doing so, it is noted that a significant number of tasks induce no cascades, while a handful are capable of triggering surprisingly large ones. Secondly, it illustrates that crude task characteristics cannot aid in identifying them, highlighting the complexity of the underlying process and the utility of this approach. Thirdly, it draws parallels with systems encountered within the natural sciences by noting the emergence of self-organised criticality, commonly found within natural systems. These findings strengthen the need to account for the structural intricacies of a project's underlying task precedence structure, as they can provide the conditions upon which large-scale catastrophes materialise.
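The single-task-failure cascade studied in this record can be illustrated on a toy precedence network. The graph and the deterministic failure rule below are simplified assumptions for illustration, not the paper's calibrated model:

```python
from collections import defaultdict, deque

def cascade_size(edges, seed):
    """Breadth-first cascade on a task precedence graph: a task fails
    if any task it depends on fails.  Returns the total number of
    failed tasks, including the seed."""
    dependents = defaultdict(list)
    for pre, post in edges:          # edge (pre, post): post depends on pre
        dependents[pre].append(post)
    failed, queue = {seed}, deque([seed])
    while queue:
        for nxt in dependents[queue.popleft()]:
            if nxt not in failed:
                failed.add(nxt)
                queue.append(nxt)
    return len(failed)

# Hypothetical precedence network: task 0 feeds a chain, task 4 is terminal.
edges = [(0, 1), (1, 2), (1, 3), (2, 4), (3, 4)]
sizes = {t: cascade_size(edges, t) for t in range(5)}
```

Even in this toy graph, most tasks trigger small cascades while the root task brings down the entire project, echoing the abstract's observation that a handful of tasks can trigger surprisingly large cascades.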

  15. Efficient data management in a large-scale epidemiology research project.

    PubMed

    Meyer, Jens; Ostrzinski, Stefan; Fredrich, Daniel; Havemann, Christoph; Krafczyk, Janina; Hoffmann, Wolfgang

    2012-09-01

    This article describes the concept of a "Central Data Management" (CDM) and its implementation within the large-scale population-based medical research project "Personalized Medicine". The CDM can be summarized as a conjunction of data capturing, data integration, data storage, data refinement, and data transfer. A wide spectrum of reliable "Extract Transform Load" (ETL) software for automatic integration of data, as well as "electronic Case Report Forms" (eCRFs), was developed in order to integrate decentralized and heterogeneously captured data. Due to the high sensitivity of the captured data, high system resource availability, data privacy, data security, and quality assurance are of utmost importance. A complex data model was developed and implemented using an Oracle database in high-availability cluster mode in order to integrate different types of participant-related data. Intelligent data capturing and storage mechanisms improve the quality of data. Data privacy is ensured by a multi-layered role/right system for access control and de-identification of identifying data. A well-defined backup process prevents data loss. Over a period of one and a half years, the CDM captured a wide variety of data in the magnitude of approximately 5 terabytes without experiencing any critical incidents of system breakdown or loss of data. The aim of this article is to demonstrate one possible way of establishing a Central Data Management in large-scale medical and epidemiological studies. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
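De-identification of identifying data, as mentioned in this record, is commonly implemented by pseudonymization with a keyed hash, which keeps records linkable across tables without exposing identifiers. The sketch below is a generic illustration of that technique, not the CDM's actual mechanism; the record fields and key are invented:

```python
import hashlib
import hmac

def pseudonymize(record, id_fields, key):
    """Replace directly identifying fields with a truncated keyed hash
    (HMAC-SHA256) so records remain linkable without exposing
    the original identifiers."""
    out = dict(record)
    for f in id_fields:
        digest = hmac.new(key, str(record[f]).encode(), hashlib.sha256)
        out[f] = digest.hexdigest()[:16]
    return out

# Hypothetical participant record (illustration only).
rec = {"name": "Jane Doe", "dob": "1970-01-01", "hba1c": 6.8}
safe = pseudonymize(rec, ["name", "dob"], key=b"secret-study-key")
```

Because the hash is deterministic under a fixed key, the same participant maps to the same pseudonym in every table, while the key itself can be held by a separate trusted party as one layer of a role/right access scheme.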

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schauder, C.

    This subcontract report was completed under the auspices of the NREL/SCE High-Penetration Photovoltaic (PV) Integration Project, which is co-funded by the U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE) and the California Solar Initiative (CSI) Research, Development, Demonstration, and Deployment (RD&D) program funded by the California Public Utility Commission (CPUC) and managed by Itron. This project is focused on modeling, quantifying, and mitigating the impacts of large utility-scale PV systems (generally 1-5 MW in size) that are interconnected to the distribution system. This report discusses the concerns utilities have when interconnecting large PV systems that use PV inverters (a specific application of frequency converters). Additionally, a number of capabilities of PV inverters are described that could be implemented to mitigate the distribution system-level impacts of high-penetration PV integration. Finally, the main issues that need to be addressed to ease the interconnection of large PV systems to the distribution system are presented.

  17. Grid-Scale Energy Storage Demonstration of Ancillary Services Using the UltraBattery Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seasholtz, Jeff

    2015-08-20

    The collaboration described in this document is being done as part of a cooperative research agreement under the Department of Energy’s Smart Grid Demonstration Program. This document represents the Final Technical Performance Report, from July 2012 through April 2015, for the East Penn Manufacturing Smart Grid Program demonstration project. This Smart Grid Demonstration project demonstrates Distributed Energy Storage for Grid Support, in particular the economic and technical viability of a grid-scale, advanced energy storage system using UltraBattery® technology for frequency regulation ancillary services and demand management services. This project entailed the construction of a dedicated facility on the East Penn campus in Lyon Station, PA that is being used as a working demonstration to provide regulation ancillary services to PJM and demand management services to Metropolitan Edison (Met-Ed).

  18. The Segmented Aperture Interferometric Nulling Testbed (SAINT) I: overview and air-side system description

    NASA Astrophysics Data System (ADS)

    Hicks, Brian A.; Lyon, Richard G.; Petrone, Peter; Ballard, Marlin; Bolcar, Matthew R.; Bolognese, Jeff; Clampin, Mark; Dogoda, Peter; Dworzanski, Daniel; Helmbrecht, Michael A.; Koca, Corina; Shiri, Ron

    2016-07-01

    This work presents an overview of the Segmented Aperture Interferometric Nulling Testbed (SAINT), a project that will pair an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC). SAINT will incorporate the VNC's demonstrated wavefront sensing and control system to refine and quantify end-to-end high-contrast starlight suppression performance. This pathfinder testbed will be used as a tool to study and refine approaches to mitigating instabilities and complex diffraction expected from future large segmented aperture telescopes.

  19. Real World Cognitive Multi-Tasking and Problem Solving: A Large Scale Cognitive Architecture Simulation Through High Performance Computing-Project Casie

    DTIC Science & Technology

    2008-03-01

    computational version of the CASIE architecture serves to demonstrate the functionality of our primary theories. However, implementation of several other...following facts. First, based on Theorem 3 and Theorem 5, the objective function is non-increasing under updating rule (6); second, by the criteria for...reassignment in updating rule (7), it is trivial to show that the objective function is non-increasing under updating rule (7). A Unified View to Graph

  20. Large-scale water projects in the developing world: Revisiting the past and looking to the future

    NASA Astrophysics Data System (ADS)

    Sivakumar, Bellie; Chen, Ji

    2014-05-01

    During the past half a century or so, the developing world has been witnessing a significant increase in freshwater demands due to a combination of factors, including population growth, increased food demand, improved living standards, and water quality degradation. Since there exists significant variability in rainfall and river flow in both space and time, large-scale storage and distribution of water has become a key means to meet these increasing demands. In this regard, large dams and water transfer schemes (including river-linking schemes and virtual water trades) have been playing a key role. While the benefits of such large-scale projects in supplying water for domestic, irrigation, industrial, hydropower, recreational, and other uses both in the countries of their development and in other countries are undeniable, concerns about their negative impacts, such as high initial costs and damages to our ecosystems (e.g. river environment and species) and socio-economic fabric (e.g. relocation and socio-economic changes of affected people) have also been increasing in recent years. These have led to serious debates on the role of large-scale water projects in the developing world and on their future, but the often one-sided nature of such debates has inevitably failed to yield fruitful outcomes thus far. The present study aims to offer a far more balanced perspective on this issue. First, it recognizes and emphasizes the need for still additional large-scale water structures in the developing world in the future, due to the continuing increase in water demands, inefficiency in water use (especially in the agricultural sector), and absence of equivalent and reliable alternatives. Next, it reviews a few important success and failure stories of large-scale water projects in the developing world (and in the developed world), in an effort to arrive at a balanced view on the future role of such projects.
Then, it discusses some major challenges in future water planning and management, with proper consideration to potential technological developments and new options. Finally, it highlights the urgent need for a broader framework that integrates the physical science-related aspects ("hard sciences") and the human science-related aspects ("soft sciences").

  1. NASA: Assessments of Selected Large-Scale Projects

    DTIC Science & Technology

    2011-03-01

REPORT DATE: MAR 2011. DATES COVERED: 00-00-2011 to 00-00-2011. TITLE AND SUBTITLE: Assessments of Selected Large-Scale Projects...Volatile EvolutioN; MEP: Mars Exploration Program; MIB: Mishap Investigation Board; MMRTG: Multi-Mission Radioisotope Thermoelectric Generator; MMS: Magnetospheric...probes designed to explore the Martian surface, to satellites equipped with advanced sensors to study the Earth, to telescopes intended to explore the

  2. Carbon dioxide (CO2) sequestration in deep saline aquifers and formations: Chapter 3

    USGS Publications Warehouse

    Rosenbauer, Robert J.; Thomas, Burt

    2010-01-01

Carbon dioxide (CO2) capture and sequestration in geologic media is one among many emerging strategies to reduce atmospheric emissions of anthropogenic CO2. This chapter looks at the potential of deep saline aquifers – based on their capacity and close proximity to large point sources of CO2 – as repositories for the geologic sequestration of CO2. The petrochemical characteristics that bear on the suitability of saline aquifers for CO2 sequestration, and the role of coupled geochemical transport models and numerical tools in evaluating site feasibility, are also examined. The full-scale commercial CO2 sequestration project at Sleipner is described, together with ongoing pilot and demonstration projects.

  3. Status of HiLASE project: High average power pulsed DPSSL systems for research and industry

    NASA Astrophysics Data System (ADS)

    Mocek, T.; Divoky, M.; Smrz, M.; Sawicka, M.; Chyla, M.; Sikocinski, P.; Vohnikova, H.; Severova, P.; Lucianetti, A.; Novak, J.; Rus, B.

    2013-11-01

We introduce the Czech national R&D project HiLASE, which focuses on strategic development of advanced high-repetition-rate, diode-pumped solid state laser (DPSSL) systems that may find use in research, high-tech industry, and future European large-scale facilities such as HiPER and ELI. Within HiLASE we explore two major concepts: thin-disk and cryogenically cooled multislab amplifiers capable of delivering average output powers above the 1 kW level in the picosecond-to-nanosecond pulsed regime. In particular, we have started a programme of technology development to demonstrate the scalability of the multislab concept up to the kJ level at repetition rates of 1-10 Hz.

  4. The HALNA project: Diode-pumped solid-state laser for inertial fusion energy

    NASA Astrophysics Data System (ADS)

    Kawashima, T.; Ikegawa, T.; Kawanaka, J.; Miyanaga, N.; Nakatsuka, M.; Izawa, Y.; Matsumoto, O.; Yasuhara, R.; Kurita, T.; Sekine, T.; Miyamoto, M.; Kan, H.; Furukawa, H.; Motokoshi, S.; Kanabe, T.

    2006-06-01

The high-energy, repetition-rated, diode-pumped solid-state laser (DPSSL) is one of the leading candidates for an inertial fusion energy (IFE) driver and related laser-driven high-field applications. The project for the development of an IFE laser driver in Japan, HALNA (High Average-power Laser for Nuclear Fusion Application) at ILE, Osaka University, aims to demonstrate 100-J pulse energy at a 10 Hz repetition rate with a beam quality of five times the diffraction limit. In this article, the advanced solid-state laser technologies for a one-half-scale HALNA system (50 J, 10 Hz) are presented, including a thermally managed Nd:phosphate glass slab amplifier with zig-zag optical geometry, and uniform, large-area diode pumping.

  5. Anthropogenic aerosols and the distribution of past large-scale precipitation change

    DOE PAGES

    Wang, Chien

    2015-12-28

The climate response of precipitation to the effects of anthropogenic aerosols is a critical yet not fully understood aspect of climate science. Results of selected models that participated in the Coupled Model Intercomparison Project Phase 5 and data from the Twentieth Century Reanalysis Project suggest that, throughout the tropics and also in the extratropical Northern Hemisphere, aerosols largely dominated the distribution of precipitation changes relative to the preindustrial era in the second half of the last century. Aerosol-induced cooling has offset some of the warming caused by greenhouse gases from the tropics to the Arctic and thus formed the gradients of surface temperature anomaly that enable the revealed precipitation change patterns to occur. Improved representation of aerosol-cloud interaction has been demonstrated to be the key factor for models to reproduce distributions of past precipitation change consistent with the reanalysis data.

  6. Portable parallel stochastic optimization for the design of aeropropulsion components

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Rhodes, G. S.

    1994-01-01

This report presents the results of Phase 1 research to develop a methodology for performing large-scale Multi-disciplinary Stochastic Optimization (MSO) for the design of aerospace systems ranging from aeropropulsion components to complete aircraft configurations. The current research recognizes that such design optimization problems are computationally expensive and require the use of either massively parallel or multiple-processor computers. The methodology also recognizes that many operational and performance parameters are uncertain, and that uncertainty must be considered explicitly to achieve optimum performance and cost. The objective of this Phase 1 research was to initiate the development of an MSO methodology that is portable to a wide variety of hardware platforms, while achieving efficient, large-scale parallelism when multiple processors are available. The first effort in the project was a literature review of available computer hardware, as well as of portable parallel programming environments. The second effort was to implement the MSO methodology for a problem using the portable parallel programming system Parallel Virtual Machine (PVM). The third and final effort was to demonstrate the example on a variety of computers, including a distributed-memory multiprocessor, a distributed-memory network of workstations, and a single-processor workstation. Results indicate the MSO methodology applies well to large-scale aerospace design problems. Nearly perfect linear speedup was demonstrated for computation of optimization sensitivity coefficients on both a 128-node distributed-memory multiprocessor (the Intel iPSC/860) and a network of workstations (speedups of almost 19 times achieved for 20 workstations). Very high parallel efficiencies (75 percent for 31 processors and 60 percent for 50 processors) were also achieved for computation of aerodynamic influence coefficients on the Intel. 
Finally, the multi-level parallelization strategy that will be needed for large-scale MSO problems was demonstrated to be highly efficient. The same parallel code instructions were used on both platforms, demonstrating portability. There are many applications to which MSO can be applied, including NASA's High-Speed Civil Transport and advanced propulsion systems. The use of MSO will reduce design and development time and testing costs dramatically.
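The speedup and efficiency figures quoted above follow from the standard definitions: speedup S = T1/Tp and parallel efficiency E = S/p. A minimal sketch of the arithmetic; the timings are hypothetical, with only the 19x-on-20-workstations ratio echoing the report:

```python
def speedup(t_serial, t_parallel):
    """Speedup S = T1 / Tp."""
    return t_serial / t_parallel

def efficiency(s, p):
    """Parallel efficiency E = S / p."""
    return s / p

# Hypothetical timings: a 200 s serial run reduced to ~10.5 s on 20 workstations.
s = speedup(200.0, 200.0 / 19.0)
print(round(s, 1), round(efficiency(s, 20), 2))  # 19.0 0.95
```

Near-linear speedup corresponds to efficiency close to 1.0; the 75 and 60 percent figures above are this same ratio at 31 and 50 processors.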

7. Lightweight and Statistical Techniques for Petascale Debugging: Correctness on Petascale Systems (CoPS) Preliminary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Supinski, B R; Miller, B P; Liblit, B

    2011-09-13

Petascale platforms with O(10^5) and O(10^6) processing cores are driving advancements in a wide range of scientific disciplines. These large systems create unprecedented application development challenges. Scalable correctness tools are critical to shorten the time-to-solution on these systems. Currently, many DOE application developers use primitive manual debugging based on printf or traditional debuggers such as TotalView or DDT. This paradigm breaks down beyond a few thousand cores, yet bugs often arise above that scale. Programmers must reproduce problems in smaller runs to analyze them with traditional tools, or else perform repeated runs at scale using only primitive techniques. Even when traditional tools run at scale, the approach wastes substantial effort and computation cycles. Continued scientific progress demands new paradigms for debugging large-scale applications. The Correctness on Petascale Systems (CoPS) project is developing a revolutionary debugging scheme that will reduce the debugging problem to a scale that human developers can comprehend. The scheme can provide precise diagnoses of the root causes of failure, including suggestions of the location and the type of errors down to the level of code regions or even a single execution point. Our fundamentally new strategy combines and expands three relatively new complementary debugging approaches. The Stack Trace Analysis Tool (STAT), a 2011 R&D 100 Award Winner, identifies behavior equivalence classes in MPI jobs and highlights behavior when elements of the class demonstrate divergent behavior, often the first indicator of an error. The Cooperative Bug Isolation (CBI) project has developed statistical techniques for isolating programming errors in widely deployed code that we will adapt to large-scale parallel applications. Finally, we are developing a new approach to parallelizing expensive correctness analyses, such as analysis of memory usage in the Memgrind tool. 
In the first two years of the project, we have successfully extended STAT to determine the relative progress of different MPI processes. We have shown that STAT, which is now included in the debugging tools distributed by Cray with their large-scale systems, substantially reduces the scale at which traditional debugging techniques are applied. We have extended CBI to large-scale systems and developed new compiler-based analyses that reduce its instrumentation overhead. Our results demonstrate that CBI can identify the source of errors in large-scale applications. Finally, we have developed MPIecho, a new technique that will reduce the time required to perform key correctness analyses, such as the detection of writes to unallocated memory. Overall, our research results are the foundations for new debugging paradigms that will improve application scientist productivity by reducing the time to determine which package or module contains the root cause of a problem that arises at all scales of our high-end systems. While we have made substantial progress in the first two years of CoPS research, significant work remains. While STAT provides scalable debugging assistance for incorrect application runs, we could apply its techniques to assertions in order to observe deviations from expected behavior. Further, we must continue to refine STAT's techniques to represent behavioral equivalence classes efficiently as we expect systems with millions of threads in the next year. We are exploring new CBI techniques that can assess the likelihood that execution deviations from past behavior are the source of erroneous execution. Finally, we must develop usable correctness analyses that apply the MPIecho parallelization strategy in order to locate coding errors. We expect to make substantial progress on these directions in the next year but anticipate that significant work will remain to provide usable, scalable debugging paradigms.
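STAT's core idea of grouping processes into behavior equivalence classes by their call stacks can be illustrated in a few lines. This is an illustrative sketch, not STAT's implementation; the stack traces and MPI ranks are made up:

```python
from collections import defaultdict

def equivalence_classes(stacks):
    """Group MPI ranks by identical call stacks. Small or singleton
    classes are often the first place to look for a divergent process."""
    classes = defaultdict(list)
    for rank, stack in stacks.items():
        classes[tuple(stack)].append(rank)
    return dict(classes)

# Hypothetical snapshot: rank 2 is stuck in a different call path.
stacks = {
    0: ["main", "solve", "MPI_Allreduce"],
    1: ["main", "solve", "MPI_Allreduce"],
    2: ["main", "io_dump", "MPI_Recv"],
    3: ["main", "solve", "MPI_Allreduce"],
}
for stack, ranks in equivalence_classes(stacks).items():
    print(len(ranks), "rank(s):", " > ".join(stack))
```

The payoff at scale is that a million-process job typically collapses into a handful of classes, and the outlier class points at the likely bug.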

  8. Overview of Opportunities for Co-Location of Solar Energy Technologies and Vegetation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macknick, Jordan; Beatty, Brenda; Hill, Graham

    2013-12-01

Large-scale solar facilities have the potential to contribute significantly to national electricity production. Many solar installations are large-scale or utility-scale, with a capacity over 1 MW and connected directly to the electric grid. Large-scale solar facilities offer an opportunity to achieve economies of scale in solar deployment, yet there have been concerns about the amount of land required for solar projects and the impact of solar projects on local habitat. During the site preparation phase for utility-scale solar facilities, developers often grade land and remove all vegetation to minimize installation and operational costs, prevent plants from shading panels, and minimize potential fire or wildlife risks. However, the common site preparation practice of removing vegetation can be avoided in certain circumstances, and there have been successful examples where solar facilities have been co-located with agricultural operations or have native vegetation growing beneath the panels. In this study we outline some of the impacts that large-scale solar facilities can have on the local environment, provide examples of installations where impacts have been minimized through co-location with vegetation, characterize the types of co-location, and give an overview of the potential benefits from co-location of solar energy projects and vegetation. The varieties of co-location can be replicated or modified for site-specific use at other solar energy installations around the world. We conclude with opportunities to improve upon our understanding of ways to reduce the environmental impacts of large-scale solar installations.

  9. An integrated model for assessing both crop productivity and agricultural water resources at a large scale

    NASA Astrophysics Data System (ADS)

    Okada, M.; Sakurai, G.; Iizumi, T.; Yokozawa, M.

    2012-12-01

Agricultural production utilizes regional resources (e.g. river water and groundwater) as well as local resources (e.g. temperature, rainfall, solar energy). Future climate changes and increasing demand due to population growth and economic development would strongly affect the availability of water resources for agricultural production. While many studies have assessed the impacts of climate change on agriculture, few studies dynamically account for changes in water resources and crop production. This study proposes an integrated model for assessing both crop productivity and agricultural water resources at a large scale. Irrigation management in response to subseasonal variability in weather and crop conditions also differs by region and crop; to deal with such variations, we used the Markov Chain Monte Carlo technique to quantify region-specific parameters associated with crop growth and irrigation water estimation. We coupled a large-scale crop model (Sakurai et al. 2012) with a global water resources model, H08 (Hanasaki et al. 2008). The integrated model consists of five sub-models for the following processes: land surface, crop growth, river routing, reservoir operation, and anthropogenic water withdrawal. The land surface sub-model is based on a watershed hydrology model, SWAT (Neitsch et al. 2009). Surface and subsurface runoff simulated by the land surface sub-model is input to the river routing sub-model of the H08 model. The portion of regional water resources available for agriculture, simulated by the H08 model, is input as irrigation water to the land surface sub-model. The timing and amount of irrigation water are simulated at a daily step. The integrated model reproduced the observed streamflow in an individual watershed. Additionally, the model accurately reproduced the trends and interannual variations of crop yields. 
To demonstrate the usefulness of the integrated model, we compared two types of climate change impact assessment on crop productivity in a watershed. The first was carried out by the large-scale crop model alone; the second by the integrated model coupling the large-scale crop model with the H08 model. The former projected that changes in temperature and precipitation due to future climate change would increase water stress in crops. The latter, however, projected that the increasing amount of agricultural water resources in the watershed would supply sufficient water for irrigation, consequently reducing the water stress. The integrated model demonstrated the importance of taking into account the water circulation in the watershed when predicting regional crop production.
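The MCMC calibration step described above can be sketched with a random-walk Metropolis sampler fitting a single crop-growth parameter to observed yields. The toy linear model and the data points are hypothetical; the actual study estimates many region-specific parameters against real yield records:

```python
import math
import random

random.seed(0)

def yield_model(temp_sum, alpha):
    """Toy crop model: yield responds linearly to accumulated temperature."""
    return alpha * temp_sum

def log_likelihood(alpha, data, sigma=0.5):
    """Gaussian log-likelihood (up to a constant) of observed yields."""
    return sum(-0.5 * ((y - yield_model(t, alpha)) / sigma) ** 2
               for t, y in data)

# Synthetic (temperature sum, yield) observations generated near alpha = 2.0.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 7.8)]

# Random-walk Metropolis with a flat prior on alpha.
alpha, samples = 1.0, []
for _ in range(5000):
    proposal = alpha + random.gauss(0.0, 0.1)
    if math.log(random.random()) < (log_likelihood(proposal, data)
                                    - log_likelihood(alpha, data)):
        alpha = proposal  # accept
    samples.append(alpha)

post_mean = sum(samples[1000:]) / len(samples[1000:])
print(round(post_mean, 2))  # posterior mean near 2.0
```

Discarding the first 1000 draws as burn-in, the posterior mean recovers the parameter that generated the synthetic data; the real calibration does the same thing per region and per crop.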

  10. Collaborative Working for Large Digitisation Projects

    ERIC Educational Resources Information Center

    Yeates, Robin; Guy, Damon

    2006-01-01

    Purpose: To explore the effectiveness of large-scale consortia for disseminating local heritage via the web. To describe the creation of a large geographically based cultural heritage consortium in the South East of England and management lessons resulting from a major web site digitisation project. To encourage the improved sharing of experience…

  11. Large-Scale Aerosol Modeling and Analysis

    DTIC Science & Technology

    2009-09-30

Modeling of Burning Emissions (FLAMBE) project, and other related parameters. Our plans to embed NAAPS inside NOGAPS may need to be put on hold...AOD, FLAMBE and FAROP at FNMOC are supported by 6.4 funding from PMW-120 for "Large-scale Atmospheric Models", "Small-scale Atmospheric Models

  12. Large-Scale Optimization for Bayesian Inference in Complex Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willcox, Karen; Marzouk, Youssef

    2013-11-12

The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower-dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
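The "reduce then sample" strategy rests on building a cheap reduced-order model from full-order snapshots. A minimal proper orthogonal decomposition (POD) sketch of that reduction step, on a synthetic snapshot matrix; the sizes and data are hypothetical, and the project's actual reductions are far more sophisticated:

```python
import numpy as np

rng = np.random.default_rng(0)

# Snapshot matrix: each column is one full-order solution (dimension 100)
# computed at a different parameter value. The snapshots here are built from
# two underlying modes, so a rank-2 reduced basis captures almost all of them.
modes = rng.standard_normal((100, 2))
snapshots = modes @ rng.standard_normal((2, 30))
snapshots += 1e-6 * rng.standard_normal(snapshots.shape)  # tiny perturbation

# POD basis = leading left singular vectors of the snapshot matrix.
U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
V = U[:, :2]  # reduced basis: dimension 2 instead of 100

# A new full-order state is represented by just 2 reduced coordinates.
x = modes @ np.array([0.3, -1.2])
x_rec = V @ (V.T @ x)  # project, then reconstruct
rel_err = float(np.linalg.norm(x - x_rec) / np.linalg.norm(x))
print(rel_err < 1e-3)  # small error: the basis reconstructs the state well
```

Sampling then evaluates the posterior with the 2-dimensional surrogate instead of the 100-dimensional model, which is where the "very large computational savings" come from.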

  13. Surface Contamination Monitor and Survey Information Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-02-01

Shonka Research Associates, Inc.'s (SRA) Surface Contamination Monitor and Survey Information Management System (SCM/SIMS) is designed to perform alpha and beta radiation surveys of floors and surfaces and document the measured data. The SRA-SCM/SIMS technology can be applied to routine operational surveys, characterization surveys, and free release and site closure surveys. Any large nuclear site can make use of this technology. This report describes a demonstration of the SRA-SCM/SIMS technology. This demonstration is part of the Chicago Pile-5 (CP-5) Large-Scale Demonstration Project (LSDP) sponsored by the US Department of Energy (DOE), Office of Science and Technology (ST), Deactivation and Decommissioning Focus Area (DDFA). The objective of the LSDP is to select and demonstrate potentially beneficial technologies at the Argonne National Laboratory-East's (ANL) CP-5 Research Reactor Facility. The purpose of the LSDP is to demonstrate that by using innovative and improved deactivation and decommissioning (D and D) technologies from various sources, significant benefits can be achieved when compared to baseline D and D technologies.

  14. Development of a large-scale transportation optimization course.

    DOT National Transportation Integrated Search

    2011-11-01

    "In this project, a course was developed to introduce transportation and logistics applications of large-scale optimization to graduate students. This report details what : similar courses exist in other universities, and the methodology used to gath...

  15. Seismic data restoration with a fast L1 norm trust region method

    NASA Astrophysics Data System (ADS)

    Cao, Jingjie; Wang, Yanfei

    2014-08-01

Seismic data restoration is a major strategy for providing a reliable wavefield when field data do not satisfy the Shannon sampling theorem. Recovery by sparsity-promoting inversion seeks sparse solutions of seismic data in a transformed domain; however, most methods for sparsity-promoting inversion are line-search methods, which are efficient but inclined to converge to local solutions. Using a trust region method, which can provide globally convergent solutions, is a good way to overcome this shortcoming. A trust region method for sparse inversion has been proposed; however, its efficiency must be improved to suit large-scale computation. In this paper, a new L1 norm trust region model is proposed for seismic data restoration, and a robust gradient projection method is used to solve the sub-problem. Numerical results on synthetic and field data demonstrate that the proposed trust region method achieves excellent computational speed and is a viable alternative for large-scale computation.

  16. Decadal opportunities for space architects

    NASA Astrophysics Data System (ADS)

    Sherwood, Brent

    2012-12-01

A significant challenge for the new field of space architecture is the dearth of project opportunities. Yet every year more young professionals express interest to enter the field. This paper derives projections that bound the number, type, and range of global development opportunities that may be reasonably expected over the next few decades for human space flight (HSF) systems, so those interested in the field can benchmark their goals. Four categories of HSF activity are described: human Exploration of solar system bodies; human Servicing of space-based assets; large-scale development of space Resources; and Breakout of self-sustaining human societies into the solar system. A progressive sequence of capabilities for each category starts with its earliest feasible missions and leads toward its full expression. The four sequences are compared in scale, distance from Earth, and readiness. Scenarios hybridize the most synergistic features from the four sequences for comparison to status quo, government-funded HSF program plans. Finally, qualitative, decadal, order-of-magnitude estimates are derived for system development needs, and hence opportunities for space architects. Government investment towards human planetary exploration is the weakest generator of space architecture work. Conversely, the strongest generator is a combination of three market drivers: (1) commercial passenger travel in low Earth orbit; (2) in parallel, government extension of HSF capability to GEO; both followed by (3) scale-up demonstration of end-to-end solar power satellites in GEO. The rich end of this scale affords space architecture opportunities that are more diverse, complex, large-scale, and sociologically challenging than traditional exploration vehicle cabins and habitats.

  17. Lessons from a Large-Scale Assessment: Results from Conceptual Inventories

    ERIC Educational Resources Information Center

    Thacker, Beth; Dulli, Hani; Pattillo, Dave; West, Keith

    2014-01-01

    We report conceptual inventory results of a large-scale assessment project at a large university. We studied the introduction of materials and instructional methods informed by physics education research (PER) (physics education research-informed materials) into a department where most instruction has previously been traditional and a significant…

  18. Estimating unbiased economies of scale of HIV prevention projects: a case study of Avahan.

    PubMed

    Lépine, Aurélia; Vassall, Anna; Chandrashekar, Sudha; Blanc, Elodie; Le Nestour, Alexis

    2015-04-01

Governments and donors are investing considerable resources in HIV prevention in order to scale up these services rapidly. Given the current economic climate, providers of HIV prevention services increasingly need to demonstrate that these investments offer good 'value for money'. One of the primary routes to efficiency is to take advantage of economies of scale (a reduction in the average cost of a health service as provision scales up), yet empirical evidence on economies of scale is scarce. Methodologically, the estimation of economies of scale is hampered by several statistical issues that prevent causal inference and thus make the estimation complex. In order to estimate unbiased economies of scale when scaling up HIV prevention services, we apply our analysis to one of the few HIV prevention programmes delivered globally at a large scale: the Indian Avahan initiative. We costed the project by collecting data from the 138 Avahan NGOs and the supporting partners in the first four years of its scale-up, between 2004 and 2007. We develop a parsimonious empirical model and apply system Generalized Method of Moments (GMM) and fixed-effects Instrumental Variable (IV) estimators to estimate unbiased economies of scale. At the programme level, we find that, after controlling for the endogeneity of scale, the scale-up of Avahan has generated high economies of scale. Our findings suggest that average cost reductions per person reached are achievable when scaling up HIV prevention in low- and middle-income countries. Copyright © 2015 Elsevier Ltd. All rights reserved.
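Economies of scale as estimated here correspond to a negative coefficient in a log-log cost regression: ln(average cost) falling as ln(scale) rises. A deliberately simplified OLS sketch on synthetic data illustrates the quantity being estimated; the study itself uses system GMM and fixed-effects IV estimators precisely because naive OLS is biased when scale is endogenous:

```python
import random

random.seed(42)

# Synthetic NGO-level data: ln(average cost) = a + b*ln(scale) + noise,
# with b < 0 indicating economies of scale. All numbers hypothetical.
n = 200
ln_scale = [random.uniform(4.0, 10.0) for _ in range(n)]
ln_ac = [3.0 - 0.25 * s + random.gauss(0.0, 0.2) for s in ln_scale]

# OLS slope: b = cov(x, y) / var(x)
mx = sum(ln_scale) / n
my = sum(ln_ac) / n
b = (sum((x - mx) * (y - my) for x, y in zip(ln_scale, ln_ac))
     / sum((x - mx) ** 2 for x in ln_scale))
print(round(b, 2))  # negative slope, i.e. economies of scale
```

Here the slope recovers the -0.25 used to generate the data; with real programme data, the endogeneity of scale would bias this estimate, which is what the GMM and IV machinery corrects.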

  19. Physical habitat monitoring strategy (PHAMS) for reach-scale restoration effectiveness monitoring

    USGS Publications Warehouse

    Jones, Krista L.; O'Daniel, Scott J.; Beechie, Tim J.; Zakrajsek, John; Webster, John G.

    2015-04-14

Habitat restoration efforts by the Confederated Tribes of the Umatilla Indian Reservation (CTUIR) have shifted from the site scale (1-10 meters) to the reach scale (100-1,000 meters). This shift was in response to the growing scientific emphasis on process-based restoration and to support from the 2007 Accords Agreement with the Bonneville Power Administration. With the increased size of restoration projects, the CTUIR and other agencies are in need of applicable monitoring methods for assessing large-scale changes in river and floodplain habitats following restoration. The goal of the Physical Habitat Monitoring Strategy is to outline methods that are useful for capturing reach-scale changes in surface and groundwater hydrology, geomorphology, hydrologic connectivity, and riparian vegetation at restoration projects. The Physical Habitat Monitoring Strategy aims to avoid duplication with existing regional effectiveness monitoring protocols by identifying complementary reach-scale metrics and methods that may improve the ability of CTUIR and others to detect instream and riparian changes at large restoration projects.

  20. Innovative Remediation Technologies: Field-Scale Demonstration Projects in North America, 2nd Edition

    EPA Pesticide Factsheets

This report consolidates key reference information in a matrix that allows project managers to quickly identify new technologies that may answer their cleanup needs and contacts for obtaining technology demonstration results and other information.

  1. Towards a Scalable and Adaptive Application Support Platform for Large-Scale Distributed E-Sciences in High-Performance Network Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Chase Qishi; Zhu, Michelle Mengxia

The advent of large-scale collaborative scientific applications has demonstrated the potential for broad scientific communities to pool globally distributed resources to produce unprecedented data acquisition, movement, and analysis. System resources including supercomputers, data repositories, computing facilities, network infrastructures, storage systems, and display devices have been increasingly deployed at national laboratories and academic institutes. These resources are typically shared by large communities of users over the Internet or dedicated networks and hence exhibit an inherent dynamic nature in their availability, accessibility, capacity, and stability. Scientific applications using either experimental facilities or computation-based simulations with various physical, chemical, climatic, and biological models feature diverse scientific workflows as simple as linear pipelines or as complex as directed acyclic graphs, which must be executed and supported over wide-area networks with massively distributed resources. Application users oftentimes need to manually configure their computing tasks over networks in an ad hoc manner, significantly limiting the productivity of scientists and constraining the utilization of resources. The success of these large-scale distributed applications requires a highly adaptive and massively scalable workflow platform that provides automated and optimized computing and networking services. This project is to design and develop a generic Scientific Workflow Automation and Management Platform (SWAMP), which contains a web-based user interface specially tailored for a target application, a set of user libraries, and several easy-to-use computing and networking toolkits for application scientists to conveniently assemble, execute, monitor, and control complex computing workflows in heterogeneous high-performance network environments. 
SWAMP will enable the automation and management of the entire process of scientific workflows with the convenience of a few mouse clicks while hiding the implementation and technical details from end users. Particularly, we will consider two types of applications with distinct performance requirements: data-centric and service-centric applications. For data-centric applications, the main workflow task involves large-volume data generation, catalog, storage, and movement, typically from supercomputers or experimental facilities to a team of geographically distributed users; while for service-centric applications, the main focus of workflow is on data archiving, preprocessing, filtering, synthesis, visualization, and other application-specific analysis. We will conduct a comprehensive comparison of existing workflow systems and choose the best suited one with open-source code, a flexible system structure, and a large user base as the starting point for our development. Based on the chosen system, we will develop and integrate new components including a black box design of computing modules, performance monitoring and prediction, and workflow optimization and reconfiguration, which are missing from existing workflow systems. A modular design for separating specification, execution, and monitoring aspects will be adopted to establish a common generic infrastructure suited for a wide spectrum of science applications. We will further design and develop efficient workflow mapping and scheduling algorithms to optimize the workflow performance in terms of minimum end-to-end delay, maximum frame rate, and highest reliability. We will develop and demonstrate the SWAMP system in a local environment, the grid network, and the 100 Gbps Advanced Network Initiative (ANI) testbed. The demonstration will target scientific applications in climate modeling and high energy physics, and the functions to be demonstrated include workflow deployment, execution, steering, and reconfiguration. 
Throughout the project period, we will work closely with the science communities in the fields of climate modeling and high-energy physics, including the Spallation Neutron Source (SNS) and Large Hadron Collider (LHC) projects, to mature the system for production use.
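    The mapping objective above (minimum end-to-end delay over a workflow DAG) can be illustrated with a textbook critical-path computation. This is a generic sketch, not SWAMP code; the task names and costs are hypothetical:

```python
from collections import defaultdict, deque

def end_to_end_delay(tasks, deps):
    """Earliest-finish-time pass over a workflow DAG.

    tasks: {name: execution_cost}; deps: list of (upstream, downstream)
    edges. Returns the minimum end-to-end delay (critical-path length),
    assuming unlimited parallel resources.
    """
    succ = defaultdict(list)
    indeg = {t: 0 for t in tasks}
    for u, v in deps:
        succ[u].append(v)
        indeg[v] += 1
    finish = {t: 0 for t in tasks}          # max finish time of predecessors
    ready = deque(t for t, d in indeg.items() if d == 0)
    while ready:
        u = ready.popleft()
        finish[u] += tasks[u]               # own cost on top of predecessors
        for v in succ[u]:
            finish[v] = max(finish[v], finish[u])
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)
    return max(finish.values())

# Hypothetical climate-modeling workflow: generate -> (filter, visualize) -> archive
tasks = {"generate": 5, "filter": 3, "visualize": 2, "archive": 1}
deps = [("generate", "filter"), ("generate", "visualize"),
        ("filter", "archive"), ("visualize", "archive")]
print(end_to_end_delay(tasks, deps))  # 9
```

    With unlimited resources, the minimum end-to-end delay equals the longest cost-weighted path; a real scheduler must additionally account for limited compute nodes and data-transfer times.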

  2. The epistemic culture in an online citizen science project: Programs, antiprograms and epistemic subjects.

    PubMed

    Kasperowski, Dick; Hillman, Thomas

    2018-05-01

    In the past decade, some areas of science have begun turning to masses of online volunteers through open calls for generating and classifying very large sets of data. The purpose of this study is to investigate the epistemic culture of a large-scale online citizen science project, the Galaxy Zoo, that turns to volunteers for the classification of images of galaxies. For this task, we chose to apply the concepts of programs and antiprograms to examine the 'essential tensions' that arise in relation to the mobilizing values of a citizen science project and the epistemic subjects and cultures that are enacted by its volunteers. Our premise is that these tensions reveal central features of the epistemic subjects and distributed cognition of epistemic cultures in these large-scale citizen science projects.

  3. Evaluating Introductory Physics Classes in Light of the ABET Criteria: An Example from the SCALE-UP Project.

    ERIC Educational Resources Information Center

    Saul, Jeffery M.; Deardorff, Duane L.; Abbott, David S.; Allain, Rhett J.; Beichner, Robert J.

    The Student-Centered Activities for Large Enrollment University Physics (SCALE-UP) project at North Carolina State University (NCSU) is developing a curriculum to promote learning through in-class group activities in introductory physics classes of up to 100 students. The authors are currently in Phase II of the project, using a specially designed…

  4. Renewable Enhanced Feedstocks for Advanced Biofuels and Bioproducts (REFABB)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peoples, Oliver; Snell, Kristi

    The basic concept of the REFABB project was that by genetically engineering the biomass crop switchgrass to produce the natural polymer PHB, which is readily broken down by heating (thermolysis) into the chemical building block crotonic acid, sufficient additional economic value would be added for the grower and processor to make it an attractive business at small scale. Processes for using thermolysis to upgrade biomass to densified pellets (char) or bio-oil are well known and require low capital investment, similar to a corn ethanol facility. Several smaller thermolysis plants would then supply the densified biomass, which is easier to handle and transport, to a centralized biorefinery where it would be used as the feedstock. Crotonic acid is not by itself a large-volume commodity chemical; however, the project demonstrated that it can be used as a feedstock to produce a number of large-volume chemicals including butanol, which is itself a biofuel target. In effect, the project would try to address three key technology barriers: feedstock logistics, feedstock supply, and cost-effective biomass conversion. This project adds to our understanding of the potential for future biomass biorefineries in two main areas. The first, addressed in Task A, was the importance and potential of developing an advanced value-added biomass feedstock crop. In this task, several novel genetic engineering technologies were demonstrated for the first time. One important outcome was the identification of three novel genes which, when re-introduced into the switchgrass plants, had a remarkable impact on increasing the biomass yield by dramatically increasing photosynthesis. These genes also turned out to be critical to increasing the levels of PHB in switchgrass by enabling the plants to fix carbon fast enough to support both plant growth and higher levels of the polymer.
Challenges in the critical objective of Task B, demonstrating conversion of the PHB in biomass to crotonic acid at over 90% yield, revealed the need to consider up front the limitations of adapting existing equipment to a task for which subsequent basic research studies indicated it was not suitable. New information was developed in the most complex of the chemical conversions studied, advanced catalysis to make acrylic acid, a chemical used widely to make paints, and this was published in a scientific journal. In regard to technical effectiveness, the crop science aspects were for the most part remarkably effective in addressing the underlying objectives, indicating the soundness of the technical approach. With time, it should be possible to fully develop the advanced biomass biorefinery feedstock. Challenges within the thermolysis step to recover crotonic acid meant that by the end of the project we were not able to demonstrate an economic case based on data from scaled-up equipment. Solving this will take further research and development work. As a general statement, the broadest public good is in demonstrating the value of funding a unique approach to the complex problem of enabling large-scale biomass biorefineries, which resulted in significant progress towards the ultimate goal and a clearer understanding of the technical hurdles remaining. Perhaps not surprisingly, some of the broader benefits to the public come from the use of the REFABB project innovations in areas unrelated to the initial objective. It is worth highlighting the breakthrough developments in identifying three single global regulator genes which can be engineered into plants to dramatically increase photosynthesis and carbon-capturing ability.
These genes have tremendous potential for use in major food crops, in particular corn, to enhance grain yield and, based on recent findings, to increase root density, a critical key to increasing carbon sequestration in agriculture and improving the sustainability of global food and biofuel production.

  5. Incorporating variability in simulations of seasonally forced phenology using integral projection models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodsman, Devin W.; Aukema, Brian H.; McDowell, Nate G.

    Phenology models are becoming increasingly important tools to accurately predict how climate change will impact the life histories of organisms. We propose a class of integral projection phenology models derived from stochastic individual-based models of insect development and demography. Our derivation, which is based on the rate-summation concept, produces integral projection models that capture the effect of phenotypic rate variability on insect phenology, but which are typically more computationally frugal than equivalent individual-based phenology models. We demonstrate our approach using a temperature-dependent model of the demography of the mountain pine beetle (Dendroctonus ponderosae Hopkins), an insect that kills mature pine trees. This work illustrates how a wide range of stochastic phenology models can be reformulated as integral projection models. Due to their computational efficiency, these integral projection models are suitable for deployment in large-scale simulations, such as studies of altered pest distributions under climate change.
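    The rate-summation idea underlying these models can be sketched as a simple simulation: a Monte Carlo stand-in for the individual-based models from which the integral projection models are derived. The rate function, development threshold, and temperatures below are hypothetical, not the authors' parameterization:

```python
import numpy as np

def rate(temp_c):
    # Hypothetical linear degree-day development rate above a 10 C threshold.
    return max(temp_c - 10.0, 0.0) / 100.0

def project_phenology(daily_temps, cv=0.2, n=10000, seed=0):
    """Rate summation with phenotypic rate variability.

    Each simulated individual carries a lognormal multiplier (mean 1,
    coefficient of variation cv) on the common temperature-dependent
    rate; development completes when the accumulated rate reaches 1.
    Returns the fraction of the cohort emerged after each day.
    """
    rng = np.random.default_rng(seed)
    sigma = np.sqrt(np.log(1 + cv**2))
    mult = rng.lognormal(-sigma**2 / 2, sigma, n)  # mean-1 multipliers
    accum = np.zeros(n)
    emerged = []
    for t in daily_temps:
        accum += mult * rate(t)                    # rate summation step
        emerged.append(float(np.mean(accum >= 1.0)))
    return emerged

temps = [15, 18, 20, 22, 25, 25, 24, 22, 20, 18]
curve = project_phenology(temps)
```

    An integral projection model replaces this per-individual bookkeeping with a projection of the development-stage distribution through a kernel, which is what makes the approach cheaper at large scale.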

  6. Effective Integration of Earth Observation Data and Flood Modeling for Rapid Disaster Response: The Texas 2015 Case

    NASA Astrophysics Data System (ADS)

    Schumann, G.

    2016-12-01

    Routinely obtaining real-time 2-D inundation patterns of a flood event at a meaningful spatial resolution and over large scales is at the moment only feasible with either operational aircraft flights or satellite imagery. Having model simulations of floodplain inundation available to complement the remote sensing data is, of course, highly desirable, both for event re-analysis and for forecasting event inundation. Using the Texas 2015 flood disaster, we demonstrate the value of multi-scale EO data for large-scale 2-D floodplain inundation modeling and forecasting. A dynamic re-analysis of the Texas 2015 flood disaster was run using a 2-D flood model developed for accurate large-scale simulations. We simulated the major rivers entering the Gulf of Mexico and used flood maps produced from both optical and SAR satellite imagery to examine regional model sensitivities and assess associated performance. It was demonstrated that satellite flood maps can complement model simulations and add value, although this is largely dependent on a number of important factors, such as image availability, regional landscape topology, and model uncertainty. In the favorable case where model uncertainty is high, landscape topology is complex (i.e. an urbanized coastal area), and satellite flood maps are available (from SAR, for instance), satellite data can significantly reduce model uncertainty by identifying the "best possible" model parameter set. More often, however, model uncertainty is low and spatially contiguous flooding can be mapped from satellites easily enough, such as in large rural inland river floodplains; in such cases satellites add little value. Nevertheless, where a large number of flood maps are available, model credibility can be increased substantially. In the case presented here this was true for at least 60% of the many thousands of kilometers of simulated river flow length for which satellite flood maps existed.
The next step of this project is to employ a technique termed the "targeted observation" approach: an assimilation-based procedure that quantifies the impact observations have on model predictions, both at the local scale and along the entire river system, when assimilated with the model at specific "overpass" locations.
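    The "best possible" parameter set mentioned above is typically identified by scoring each simulation against a satellite flood map with a binary skill measure. A common choice is the critical success index; this sketch is illustrative, not the author's code, and the toy maps are hypothetical:

```python
import numpy as np

def flood_fit(model, observed):
    """Critical success index between binary flood maps (1 = wet).

    A standard skill score for comparing a modeled inundation extent
    with a satellite-derived flood map:
    hits / (hits + misses + false alarms).
    """
    model = np.asarray(model, dtype=bool)
    observed = np.asarray(observed, dtype=bool)
    hits = np.logical_and(model, observed).sum()
    union = np.logical_or(model, observed).sum()
    return hits / union if union else 1.0

# Toy 1-D transect: the model over-predicts one wet cell.
model = [1, 1, 1, 0]
obs = [1, 1, 0, 0]
print(flood_fit(model, obs))  # 2/3
```

    Running the model over an ensemble of parameter sets and keeping the one with the highest score is the simplest form of the parameter identification described in the abstract.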

  7. The “Wireless Sensor Networks for City-Wide Ambient Intelligence (WISE-WAI)” Project

    PubMed Central

    Casari, Paolo; Castellani, Angelo P.; Cenedese, Angelo; Lora, Claudio; Rossi, Michele; Schenato, Luca; Zorzi, Michele

    2009-01-01

    This paper gives a detailed technical overview of some of the activities carried out in the context of the “Wireless Sensor networks for city-Wide Ambient Intelligence (WISE-WAI)” project, funded by the Cassa di Risparmio di Padova e Rovigo Foundation, Italy. The main aim of the project is to demonstrate the feasibility of large-scale wireless sensor network deployments, whereby tiny objects integrating one or more environmental sensors (humidity, temperature, light intensity), a microcontroller and a wireless transceiver are deployed over a large area, which in this case involves the buildings of the Department of Information Engineering at the University of Padova. We will describe how the network is organized to provide full-scale automated functions, and which services and applications it is configured to provide. These applications include long-term environmental monitoring, alarm event detection and propagation, single-sensor interrogation, localization and tracking of objects, assisted navigation, as well as fast data dissemination services to be used, e.g., to rapidly re-program all sensors over-the-air. The organization of such a large testbed requires notable efforts in terms of communication protocols and strategies, whose design must pursue scalability, energy efficiency (while sensors are connected through USB cables for logging and debugging purposes, most of them will be battery-operated), as well as the capability to support applications with diverse requirements. These efforts, the description of a subset of the results obtained so far, and of the final objectives to be met are the scope of the present paper. PMID:22408513

  8. Large-scale correlations in gas traced by Mg II absorbers around low-mass galaxies

    NASA Astrophysics Data System (ADS)

    Kauffmann, Guinevere

    2018-03-01

    The physical origin of the large-scale conformity in the colours and specific star formation rates of isolated low-mass central galaxies and their neighbours on scales in excess of 1 Mpc is still under debate. One possible scenario is that gas is heated over large scales by feedback from active galactic nuclei (AGNs), leading to coherent modulation of cooling and star formation between well-separated galaxies. In this Letter, the metal line absorption catalogue of Zhu & Ménard is used to probe gas out to large projected radii around a sample of a million galaxies with stellar masses ∼10¹⁰ M⊙ and photometric redshifts in the range 0.4 < z < 0.8 selected from Sloan Digital Sky Survey imaging data. This galaxy sample covers an effective volume of 2.2 Gpc³. A statistically significant excess of Mg II absorbers is present around the red low-mass galaxies compared to their blue counterparts out to projected radii of 10 Mpc. In addition, the equivalent width distribution function of Mg II absorbers around low-mass galaxies is shown to be strongly affected by the presence of a nearby (Rp < 2 Mpc) radio-loud AGN out to projected radii of 5 Mpc.

  9. Space and time scales in human-landscape systems.

    PubMed

    Kondolf, G Mathias; Podolak, Kristen

    2014-01-01

    Exploring spatial and temporal scales provides a way to understand human alteration of landscape processes and human responses to these processes. We address three topics relevant to human-landscape systems: (1) scales of human impacts on geomorphic processes, (2) spatial and temporal scales in river restoration, and (3) time scales of natural disasters and behavioral and institutional responses. Studies showing dramatic recent change in sediment yields from uplands to the ocean via rivers illustrate the increasingly vast spatial extent and quick rate of human landscape change in the last two millennia, but especially in the second half of the twentieth century. Recent river restoration efforts are typically small in spatial and temporal scale compared to the historical human changes to ecosystem processes, but the cumulative effectiveness of multiple small restoration projects in achieving large ecosystem goals has yet to be demonstrated. The mismatch between infrequent natural disasters and individual risk perception, media coverage, and institutional response to natural disasters results in unpreparedness and unsustainable land use and building practices.

  10. Five Kilowatt Solid Oxide Fuel Cell/Diesel Reformer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dennis Witmer; Thomas Johnson

    2008-12-31

    Reducing fossil fuel consumption, both for energy security and for reduction in global greenhouse emissions, has been a major goal of energy research in the US for many years. Fuel cells have been proposed as a technology that can address both of these issues: as devices that convert the energy of a fuel directly into electrical energy, they offer low emissions and high efficiencies. These advantages are of particular interest to remote power users, where grid-connected power is unavailable and most electrical power comes from diesel electric generators. Diesel fuel is the fuel of choice because it can be easily transported and stored in quantities large enough to supply energy for small communities for extended periods of time. This project aimed to demonstrate the operation of a solid oxide fuel cell on diesel fuel and to measure the resulting efficiency. Results from this project have been somewhat encouraging, with a laboratory breadboard integration of a small-scale diesel reformer and a solid oxide fuel cell demonstrated in the first 18 months of the project. This initial demonstration was conducted at INEEL in the spring of 2005 using a small-scale diesel reformer provided by SOFCo and a fuel cell provided by Acumentrics. However, attempts to integrate and automate the available technology have not yet proved successful, due both to the lack of movement on the fuel processing side and to the rather poor stack lifetimes exhibited by the fuel cells. Commercial product is still unavailable, and precommercial devices are both extremely expensive and require extensive field support.

  11. Alternative projections of the impacts of private investment on southern forests: a comparison of two large-scale forest sector models of the United States.

    Treesearch

    Ralph Alig; Darius Adams; John Mills; Richard Haynes; Peter Ince; Robert Moulton

    2001-01-01

    The TAMM/NAPAP/ATLAS/AREACHANGE (TNAA) system and the Forest and Agriculture Sector Optimization Model (FASOM) are two large-scale forestry sector modeling systems that have been employed to analyze the U.S. forest resource situation. The TNAA system of static, spatial equilibrium models has been applied to make 50-year projections of the U.S. forest sector for more...

  12. On a Game of Large-Scale Projects Competition

    NASA Astrophysics Data System (ADS)

    Nikonov, Oleg I.; Medvedeva, Marina A.

    2009-09-01

    The paper is devoted to game-theoretical control problems motivated by economic decision-making situations arising in the realization of large-scale projects, such as designing and putting into operation new gas or oil pipelines. A non-cooperative two-player game is considered with payoff functions of a special type, for which standard existence theorems and algorithms for searching for Nash equilibrium solutions are not applicable. The paper is based on and develops the results obtained in [1]-[5].
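    For contrast with the special payoff structure studied in the paper, the standard brute-force search for pure-strategy Nash equilibria in a finite bimatrix game looks like this (the 2x2 payoffs below are hypothetical, not the pipeline game from the paper):

```python
import numpy as np

def pure_nash(A, B):
    """Brute-force pure-strategy Nash equilibria of a bimatrix game.

    A[i, j] and B[i, j] are the payoffs to players 1 and 2 when they
    play strategies i and j. A profile (i, j) is an equilibrium if
    neither player can gain by deviating unilaterally.
    """
    eqs = []
    for i in range(A.shape[0]):
        for j in range(A.shape[1]):
            if A[i, j] >= A[:, j].max() and B[i, j] >= B[i, :].max():
                eqs.append((i, j))
    return eqs

# Hypothetical 2x2 "build / don't build" pipeline-competition payoffs.
A = np.array([[1, 3], [4, 2]])
B = np.array([[1, 4], [3, 2]])
print(pure_nash(A, B))  # [(0, 1), (1, 0)]
```

    The payoff functions considered in the paper fall outside the settings where such standard enumeration (or the usual existence theorems) applies, which is what motivates the specialized analysis.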

  13. Concentrating Solar Power Central Receiver Panel Component Fabrication and Testing FINAL REPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDowell, Michael W; Miner, Kris

    The objective of this project is to complete a design of an advanced concentrating solar receiver panel, demonstrate the manufacturability of key components, and then confirm the operation of those components under prototypic solar flux conditions. This work is an important step in reducing the levelized cost of energy (LCOE) from a central receiver solar power plant. The key technical risk in building larger power towers is building the larger receiver systems. This project therefore includes the design of an advanced molten salt prototypic sub-scale receiver panel that can be incorporated into a large receiver system, followed by the fabrication and testing of key components of the receiver design that will be used to validate it. This project will have a significant impact on solar thermal power plant design. Receiver panels of suitable size for utility-scale plants are a key element of a solar power tower plant. Many subtle and complex manufacturing processes are involved in producing a reliable, robust receiver panel. Given the substantial size difference between receiver panels manufactured in the past and those needed for large plant designs, the manufacture and demonstration of prototype receiver panel components with representative features of a full-sized panel will be important to improving the build process for commercial success. Given the thermal flux limitations of the test facility, the panel components cannot be rendered full size. Significant changes occurred in the project's technical strategies between project initiation and the accomplishments described herein. The initial strategy was to define cost improvements for the receiver, then design, build, and test a scale prototype receiver on sun with a molten salt heat transport system. DOE had committed to constructing a molten salt heat transport loop to support receiver testing at the top of the NSTTF tower. Because of funding constraints, this did not happen.
A subsequent plan to test the scale prototype receiver off sun, but at temperature, in a molten salt loop at ground level adjacent to the tower also had to be abandoned. Thus, no test facility existed for a molten salt receiver test. As a result, PWR completed the prototype receiver design and then fabricated key components for testing instead of fabricating the complete prototype receiver. A number of innovative design ideas have been developed, and key features of the receiver panel have been identified. This evaluation includes input from Solar 2, the personal experience of people working on these programs, and meetings with Sandia. Key components of the receiver design, and key processes used to fabricate a receiver, have been selected for further evaluation. The test plan, "Concentrated Solar Power Receiver In Cooperation with the Department of Energy and Sandia National Laboratory," was written to define the scope of the testing to be completed and to provide details related to the hardware, instrumentation, and data acquisition. The document contains a list of test objectives, a test matrix, and an associated test box showing the operating points to be tested. The test objectives were to: (1) demonstrate low-cost manufacturability; (2) demonstrate robustness of two different tube base materials; (3) collect temperature data during on-sun operation; (4) demonstrate long-term repeated daily operation of heat shields; (5) complete pinhole tube weld repairs; and (6) anchor thermal models. This report discusses the tests performed, the results, and implications for design improvements and LCOE reduction.

  14. Status of Technology Development to enable Large Stable UVOIR Space Telescopes

    NASA Astrophysics Data System (ADS)

    Stahl, H. Philip; MSFC AMTD Team

    2017-01-01

    NASA MSFC has two funded Strategic Astrophysics Technology projects to develop technology for potential future large missions: AMTD and PTC. The Advanced Mirror Technology Development (AMTD) project is developing technology to make mechanically stable mirrors for a 4-meter or larger UVOIR space telescope. AMTD is demonstrating this technology by making a 1.5-meter diameter x 200 mm thick ULE(C) mirror that is one-third scale of a full-size 4-m mirror. AMTD is characterizing the mechanical and thermal performance of this mirror and of a 1.2-meter Zerodur(R) mirror to validate integrated modeling tools. Additionally, AMTD has developed integrated modeling tools which are being used to evaluate primary mirror systems for a potential Habitable Exoplanet Mission and has analyzed the interaction between optical telescope wavefront stability and coronagraph contrast leakage. The Predictive Thermal Control (PTC) project is developing technology to enable highly stable thermal wavefront performance by using integrated modeling tools to predict and actively control the thermal environment of a 4-m or larger UVOIR space telescope.

  15. Effects of national ecological restoration projects on carbon sequestration in China from 2001 to 2010

    PubMed Central

    Lu, Fei; Hu, Huifeng; Sun, Wenjuan; Zhu, Jiaojun; Liu, Guobin; Zhou, Wangming; Zhang, Quanfa; Shi, Peili; Liu, Xiuping; Wu, Xing; Zhang, Lu; Wei, Xiaohua; Dai, Limin; Zhang, Kerong; Sun, Yirong; Xue, Sha; Zhang, Wanjun; Xiong, Dingpeng; Deng, Lei; Liu, Bojie; Zhou, Li; Zhang, Chao; Cao, Jiansheng; Huang, Yao; Zhou, Guoyi; Bai, Yongfei; Xie, Zongqiang; Wu, Bingfang; Fang, Jingyun; Liu, Guohua; Yu, Guirui

    2018-01-01

    The long-term stressful utilization of forests and grasslands has led to ecosystem degradation and C loss. Since the late 1970s, China has launched six key national ecological restoration projects to protect its environment and restore degraded ecosystems. Here, we conducted a large-scale field investigation and a literature survey of biomass and soil C in China’s forest, shrubland, and grassland ecosystems across the regions where the six projects were implemented (∼16% of the country’s land area). We investigated the changes in the C stocks of these ecosystems to evaluate the contributions of the projects to the country’s C sink between 2001 and 2010. Over this decade, we estimated that the total annual C sink in the project region was 132 Tg C per y (1 Tg = 10¹² g), over half of which (74 Tg C per y, 56%) was attributed to the implementation of the projects. Our results demonstrate that these restoration projects have substantially contributed to CO2 mitigation in China. PMID:29666317

  16. Programming in a proposed 9X distributed Ada

    NASA Technical Reports Server (NTRS)

    Waldrop, Raymond S.; Volz, Richard A.; Goldsack, Stephen J.

    1990-01-01

    The proposed Ada 9X constructs for distribution were studied. The goal was to select suitable test cases to help in the evaluation of the proposed constructs. The examples were to be considered according to the following requirements: real-time operation; fault tolerance at several different levels; demonstration of both distributed and massively parallel operation; reflection of realistic NASA programs; illustration of the issues of configuration, compilation, linking, and loading; indication of the consequences of using the proposed revisions for large-scale programs; and coverage of the spectrum of communication patterns such as predictable, bursty, small, and large messages. The first month was spent identifying possible examples and judging their suitability for the project.

  17. Discovering Beaten Paths in Collaborative Ontology-Engineering Projects using Markov Chains

    PubMed Central

    Walk, Simon; Singer, Philipp; Strohmaier, Markus; Tudorache, Tania; Musen, Mark A.; Noy, Natalya F.

    2014-01-01

    Biomedical taxonomies, thesauri and ontologies in the form of the International Classification of Diseases as a taxonomy or the National Cancer Institute Thesaurus as an OWL-based ontology, play a critical role in acquiring, representing and processing information about human health. With increasing adoption and relevance, biomedical ontologies have also significantly increased in size. For example, the 11th revision of the International Classification of Diseases, which is currently under active development by the World Health Organization, contains nearly 50,000 classes representing a vast variety of different diseases and causes of death. This evolution in terms of size was accompanied by an evolution in the way ontologies are engineered. Because no single individual has the expertise to develop such large-scale ontologies, ontology-engineering projects have evolved from small-scale efforts involving just a few domain experts to large-scale projects that require effective collaboration between dozens or even hundreds of experts, practitioners and other stakeholders. Understanding the way these different stakeholders collaborate will enable us to improve editing environments that support such collaborations. In this paper, we uncover how large ontology-engineering projects, such as the International Classification of Diseases in its 11th revision, unfold by analyzing usage logs of five different biomedical ontology-engineering projects of varying sizes and scopes using Markov chains. We discover intriguing interaction patterns (e.g., which properties users frequently change after specific given ones) that suggest that large collaborative ontology-engineering projects are governed by a few general principles that determine and drive development.
From our analysis, we identify commonalities and differences between different projects that have implications for project managers, ontology editors, developers and contributors working on collaborative ontology-engineering projects and tools in the biomedical domain. PMID:24953242
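    The core of such an analysis is estimating a first-order Markov transition matrix from session logs. A minimal sketch of that step (the event names below are hypothetical, not drawn from the actual ontology-engineering logs):

```python
from collections import Counter, defaultdict

def transition_probs(sessions):
    """First-order Markov transition probabilities from event logs.

    sessions: list of action sequences, one per editing session.
    Returns {state: {next_state: probability}}, estimated from the
    observed counts of consecutive action pairs.
    """
    counts = defaultdict(Counter)
    for actions in sessions:
        for a, b in zip(actions, actions[1:]):
            counts[a][b] += 1
    return {a: {b: n / sum(c.values()) for b, n in c.items()}
            for a, c in counts.items()}

# Hypothetical ontology-editing sessions (property-change events).
logs = [
    ["create_class", "edit_label", "edit_definition", "edit_label"],
    ["edit_label", "edit_definition", "add_synonym"],
]
probs = transition_probs(logs)
print(probs["edit_label"])  # {'edit_definition': 1.0}
```

    Patterns such as "which properties users frequently change after specific given ones" then fall out directly as the high-probability rows of this matrix.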

  18. Discovering beaten paths in collaborative ontology-engineering projects using Markov chains.

    PubMed

    Walk, Simon; Singer, Philipp; Strohmaier, Markus; Tudorache, Tania; Musen, Mark A; Noy, Natalya F

    2014-10-01

    Biomedical taxonomies, thesauri and ontologies in the form of the International Classification of Diseases as a taxonomy or the National Cancer Institute Thesaurus as an OWL-based ontology, play a critical role in acquiring, representing and processing information about human health. With increasing adoption and relevance, biomedical ontologies have also significantly increased in size. For example, the 11th revision of the International Classification of Diseases, which is currently under active development by the World Health Organization, contains nearly 50,000 classes representing a vast variety of different diseases and causes of death. This evolution in terms of size was accompanied by an evolution in the way ontologies are engineered. Because no single individual has the expertise to develop such large-scale ontologies, ontology-engineering projects have evolved from small-scale efforts involving just a few domain experts to large-scale projects that require effective collaboration between dozens or even hundreds of experts, practitioners and other stakeholders. Understanding the way these different stakeholders collaborate will enable us to improve editing environments that support such collaborations. In this paper, we uncover how large ontology-engineering projects, such as the International Classification of Diseases in its 11th revision, unfold by analyzing usage logs of five different biomedical ontology-engineering projects of varying sizes and scopes using Markov chains. We discover intriguing interaction patterns (e.g., which properties users frequently change after specific given ones) that suggest that large collaborative ontology-engineering projects are governed by a few general principles that determine and drive development.
From our analysis, we identify commonalities and differences between different projects that have implications for project managers, ontology editors, developers and contributors working on collaborative ontology-engineering projects and tools in the biomedical domain. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Composite Technology for Exploration

    NASA Technical Reports Server (NTRS)

    Fikes, John

    2017-01-01

    The CTE (Composite Technology for Exploration) Project will develop and demonstrate critical composites technologies with a focus on joints that utilize NASA expertise and capabilities. The project will advance composite technologies providing lightweight structures to support future NASA exploration missions. The CTE project will demonstrate weight-saving, performance-enhancing bonded joint technology for Space Launch System (SLS)-scale composite hardware.

  20. Overview of ERA Integrated Technology Demonstration (ITD) 51A Ultra-High Bypass (UHB) Integration for Hybrid Wing Body (HWB)

    NASA Technical Reports Server (NTRS)

    Flamm, Jeffrey D.; James, Kevin D.; Bonet, John T.

    2016-01-01

    The NASA Environmentally Responsible Aviation (ERA) Project was a five-year project broken into two phases. In Phase II, high N+2 Technology Readiness Level demonstrations were grouped into Integrated Technology Demonstrations (ITDs). This paper describes the work done on ITD-51A: the Vehicle Systems Integration, Engine Airframe Integration Demonstration. Refinement of a Hybrid Wing Body (HWB) aircraft from the possible candidates developed in ERA Phase I was continued. Scaled powered and unpowered wind-tunnel testing, with and without acoustics, in the NASA LaRC 14- by 22-Foot Subsonic Tunnel, the NASA ARC Unitary Plan Wind Tunnel, and the 40- by 80-foot test section of the National Full-Scale Aerodynamics Complex (NFAC), in conjunction with very closely coupled computational fluid dynamics, was used to demonstrate the fuel burn and acoustic milestone targets of the ERA Project.

  1. Testing the robustness of Citizen Science projects: Evaluating the results of pilot project COMBER.

    PubMed

    Chatzigeorgiou, Giorgos; Faulwetter, Sarah; Dailianis, Thanos; Smith, Vincent Stuart; Koulouri, Panagiota; Dounas, Costas; Arvanitidis, Christos

    2016-01-01

    Citizen Science (CS) as a term covers a great variety of approaches and scopes involving many different fields of science. The number of relevant projects globally has increased significantly in recent years. Large-scale ecological questions can be answered only through extended observation networks, and CS projects can support this effort. Although the need for such projects is apparent, an important part of the scientific community casts doubt on the reliability of CS data sets. The pilot CS project COMBER was created in order to provide evidence to answer this question for coastal marine biodiversity monitoring. The results of the current analysis show that a carefully designed CS project with clear hypotheses, wide participation and data-set validation can be a valuable tool for detecting large-scale, long-term changes in marine biodiversity patterns, and therefore for relevant management and conservation issues.

  2. Large-Scale Spacecraft Fire Safety Experiments in ISS Resupply Vehicles

    NASA Technical Reports Server (NTRS)

    Ruff, Gary A.; Urban, David

    2013-01-01

    Our understanding of the fire safety risk in manned spacecraft has been limited by the small scale of the testing we have been able to conduct in low-gravity. Fire growth and spread cannot be expected to scale linearly with sample size, so we cannot make accurate predictions of the behavior of realistic scale fires in spacecraft based on the limited low-g testing to date. As a result, spacecraft fire safety protocols are necessarily very conservative and costly. Future crewed missions outside of low-Earth orbit are expected to be longer in duration than previous exploration missions and accordingly more complex in terms of operations, logistics, and safety. This will increase the challenge of ensuring a fire-safe environment for the crew throughout the mission. Given our fundamental uncertainty about the behavior of fires in low-gravity, the need for realistic scale testing at reduced gravity has been demonstrated. To address this concern, a spacecraft fire safety research project is underway to reduce the uncertainty and risk in the design of spacecraft fire safety systems by testing at nearly full scale in low-gravity. This project is supported by the NASA Advanced Exploration Systems Program Office in the Human Exploration and Operations Mission Directorate. The activity of this project is supported by an international topical team of fire experts from other space agencies to maximize the utility of the data and to ensure the widest possible scrutiny of the concept. The large-scale space flight experiment will be conducted on three missions, each in an Orbital Sciences Corporation Cygnus vehicle after it has deberthed from the ISS. Although the experiment will need to meet rigorous safety requirements to ensure the carrier vehicle does not sustain damage, the absence of a crew allows the fire products to be released into the cabin.
The tests will be fully automated with the data downlinked at the conclusion of the test before the Cygnus vehicle reenters the atmosphere. The international topical team is collaborating with the NASA team in the definition of the experiment requirements and performing supporting analysis, experimentation and technology development.

  3. SDN-NGenIA, a software defined next generation integrated architecture for HEP and data intensive science

    NASA Astrophysics Data System (ADS)

    Balcas, J.; Hendricks, T. W.; Kcira, D.; Mughal, A.; Newman, H.; Spiropulu, M.; Vlimant, J. R.

    2017-10-01

    The SDN Next Generation Integrated Architecture (SDN-NGenIA) project addresses some of the key challenges facing the present and next generations of science programs in HEP, astrophysics, and other fields, whose potential discoveries depend on their ability to distribute, process and analyze globally distributed petascale to exascale datasets. The SDN-NGenIA system under development by Caltech and partner HEP and network teams focuses on the coordinated use of network, computing and storage infrastructures. It builds on the experience gained in previous and recently completed projects that use dynamic circuits with bandwidth guarantees to support major network flows, as demonstrated across the LHC Open Network Environment [1] and in large-scale demonstrations over the last three years, and recently integrated with the PhEDEx and Asynchronous Stage Out data management applications of the CMS experiment at the Large Hadron Collider. In addition to the general program goal of supporting the network needs of the LHC and other science programs with similar needs, a recent focus is the use of the Argonne Leadership Computing Facility (ALCF) for data-intensive applications.

  4. Opera: reconstructing optimal genomic scaffolds with high-throughput paired-end sequences.

    PubMed

    Gao, Song; Sung, Wing-Kin; Nagarajan, Niranjan

    2011-11-01

    Scaffolding, the problem of ordering and orienting contigs, typically using paired-end reads, is a crucial step in the assembly of high-quality draft genomes. Even as sequencing technologies and mate-pair protocols have improved significantly, scaffolding programs still rely on heuristics, with no guarantees on the quality of the solution. In this work, we explored the feasibility of an exact solution for scaffolding and present a first tractable solution for this problem (Opera). We also describe a graph contraction procedure that allows the solution to scale to large scaffolding problems and demonstrate this by scaffolding several large real and synthetic datasets. In comparisons with existing scaffolders, Opera simultaneously produced longer and more accurate scaffolds demonstrating the utility of an exact approach. Opera also incorporates an exact quadratic programming formulation to precisely compute gap sizes (Availability: http://sourceforge.net/projects/operasf/ ).
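    The abstract notes that Opera computes gap sizes with an exact quadratic programming formulation. As a rough illustration of the idea only (this is not Opera's code, and the function name is invented), the sketch below estimates a single gap between two adjacent contigs by minimizing the sum of squared deviations of the implied insert sizes from the library mean, which for one gap has a closed-form solution:

```python
# Illustrative gap-size estimation between two adjacent contigs from
# mate pairs -- a sketch of the quadratic idea, NOT Opera's actual code.
# For a single gap, minimizing sum_i (gap + span_i - insert_size)^2
# over gap yields the mean of the per-pair gap estimates.

def estimate_gap(insert_size, pair_spans):
    """insert_size: expected mate-pair insert length.
    pair_spans: for each spanning pair, the sequence covered on the two
    contigs (distance from each read to its facing contig end)."""
    estimates = [insert_size - span for span in pair_spans]
    return sum(estimates) / len(estimates)

# Three spanning pairs with a 500 bp library:
print(estimate_gap(500, [320, 300, 310]))  # mean of 180, 200, 190 -> 190.0
```

    With many contigs the per-gap problems couple through shared pairs, which is what motivates a full quadratic program rather than this one-gap closed form.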

  5. Opera: Reconstructing Optimal Genomic Scaffolds with High-Throughput Paired-End Sequences

    PubMed Central

    Gao, Song; Sung, Wing-Kin

    2011-01-01

    Scaffolding, the problem of ordering and orienting contigs, typically using paired-end reads, is a crucial step in the assembly of high-quality draft genomes. Even as sequencing technologies and mate-pair protocols have improved significantly, scaffolding programs still rely on heuristics, with no guarantees on the quality of the solution. In this work, we explored the feasibility of an exact solution for scaffolding and present a first tractable solution for this problem (Opera). We also describe a graph contraction procedure that allows the solution to scale to large scaffolding problems and demonstrate this by scaffolding several large real and synthetic datasets. In comparisons with existing scaffolders, Opera simultaneously produced longer and more accurate scaffolds demonstrating the utility of an exact approach. Opera also incorporates an exact quadratic programming formulation to precisely compute gap sizes (Availability: http://sourceforge.net/projects/operasf/). PMID:21929371

  6. How uncertain are climate model projections of water availability indicators across the Middle East?

    PubMed

    Hemming, Debbie; Buontempo, Carlo; Burke, Eleanor; Collins, Mat; Kaye, Neil

    2010-11-28

    The projection of robust regional climate changes over the next 50 years presents a considerable challenge for the current generation of climate models. Water cycle changes are particularly difficult to model in this area because major uncertainties exist in the representation of processes such as large-scale and convective rainfall and their feedback with surface conditions. We present climate model projections and uncertainties in water availability indicators (precipitation, run-off and drought index) for the 1961-1990 and 2021-2050 periods. Ensembles from two global climate models (GCMs) and one regional climate model (RCM) are used to examine different elements of uncertainty. Although all three ensembles capture the general distribution of observed annual precipitation across the Middle East, the RCM is consistently wetter than observations, especially over the mountainous areas. All future projections show decreasing precipitation (ensemble median between -5 and -25%) in coastal Turkey and parts of Lebanon, Syria and Israel and consistent run-off and drought index changes. The Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) GCM ensemble exhibits drying across the north of the region, whereas the Met Office Hadley Centre Quantifying Uncertainties in Model Projections-Atmospheric (QUMP-A) GCM and RCM ensembles show slight drying in the north and significant wetting in the south. RCM projections also show greater sensitivity (both wetter and drier) and a wider uncertainty range than QUMP-A. The nature of these uncertainties suggests that both large-scale circulation patterns, which influence region-wide drying/wetting patterns, and regional-scale processes, which affect localized water availability, are important sources of uncertainty in these projections.
To reduce large uncertainties in water availability projections, it is suggested that efforts would be well placed to focus on understanding and modelling both large-scale processes and their teleconnections with Middle East climate, and localized processes involved in orographic precipitation.

  7. Image processing in biodosimetry: A proposal of a generic free software platform.

    PubMed

    Dumpelmann, Matthias; Cadena da Matta, Mariel; Pereira de Lemos Pinto, Marcela Maria; de Salazar E Fernandes, Thiago; Borges da Silva, Edvane; Amaral, Ademir

    2015-08-01

    The scoring of chromosome aberrations is the most reliable biological method for evaluating individual exposure to ionizing radiation. However, microscopic analysis of human metaphase chromosomes, generally employed to identify aberrations, mainly dicentrics (chromosomes with two centromeres), is a laborious task. This method is time consuming, and its application in biological dosimetry would be almost impossible in the case of a large-scale radiation incident. In this project, generic software for automatic chromosome image processing was enhanced from a framework originally developed for the European Union Framework V project Simbio, for applications in the area of source localization from electroencephalographic signals. The platform's capability is demonstrated by a study comparing automatic segmentation strategies for chromosomes from microscopic images.
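    For context on how dicentric counts feed into dose evaluation, the standard cytogenetic approach fits a linear-quadratic dose-response Y = c + αD + βD² to the observed yield Y (dicentrics per cell) and inverts it for the dose D. The sketch below is illustrative only; the coefficient values are hypothetical placeholders, not calibration data from this paper.

```python
import math

# Dose estimation from dicentric yield via the linear-quadratic model
# Y = c + alpha*D + beta*D^2 standard in cytogenetic biodosimetry.
# Coefficients here are illustrative placeholders, not calibrated values.

def estimate_dose(yield_per_cell, c=0.001, alpha=0.03, beta=0.06):
    """Solve beta*D^2 + alpha*D + (c - Y) = 0 for the positive root D (Gy)."""
    disc = alpha ** 2 - 4 * beta * (c - yield_per_cell)
    if disc < 0:
        raise ValueError("observed yield is below background")
    return (-alpha + math.sqrt(disc)) / (2 * beta)

# With these coefficients, a yield of 0.301 dicentrics/cell corresponds
# to 2 Gy, since 0.001 + 0.03*2 + 0.06*4 = 0.301.
print(round(estimate_dose(0.301), 3))  # -> 2.0
```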

  8. Research and Development of Large Capacity CFB Boilers in TPRI

    NASA Astrophysics Data System (ADS)

    Xianbin, Sun; Minhua, Jiang

    This paper presents an overview of advances in circulating fluidized bed (CFB) technology at the Thermal Power Research Institute (TPRI), including key technologies, boiler configuration, and progress in scaling up. To develop large CFB boilers, CFB combustion test facilities were established, the key technologies of large-capacity CFB boilers were researched systematically, and 100 MW to 330 MW CFB boilers were developed and manufactured. The first domestically designed 100 MW and 210 MW CFB boilers have been put into commercial operation and show good operating performance. A domestic 330 MW CFB boiler demonstration project, an H-type CFB boiler with a compact heat exchanger, has also been put into commercial operation; this is China's largest CFB boiler. The technical plan for a domestic 600 MW supercritical CFB boiler is also briefly introduced.

  9. Large-scale standardized phenotyping of strawberry in RosBREED

    USDA-ARS?s Scientific Manuscript database

    A large, multi-institutional, international, research project with the goal of bringing genomicists and plant breeders together was funded by USDA-NIFA Specialty Crop Research Initiative. Apple, cherry, peach, and strawberry are the Rosaceous crops included in the project. Many (900+) strawberry g...

  10. Developing A Large-Scale, Collaborative, Productive Geoscience Education Network

    NASA Astrophysics Data System (ADS)

    Manduca, C. A.; Bralower, T. J.; Egger, A. E.; Fox, S.; Ledley, T. S.; Macdonald, H.; Mcconnell, D. A.; Mogk, D. W.; Tewksbury, B. J.

    2012-12-01

    Over the past 15 years, the geoscience education community has grown substantially and developed broad and deep capacity for collaboration and dissemination of ideas. While this community is best viewed as emergent from complex interactions among changing educational needs and opportunities, we highlight the role of several large projects in the development of a network within this community. In the 1990s, three NSF projects came together to build a robust web infrastructure to support the production and dissemination of on-line resources: On The Cutting Edge (OTCE), Earth Exploration Toolbook, and Starting Point: Teaching Introductory Geoscience. Along with the contemporaneous Digital Library for Earth System Education, these projects engaged geoscience educators nationwide in exploring professional development experiences that produced lasting on-line resources, collaborative authoring of resources, and models for web-based support for geoscience teaching. As a result, a culture developed in the 2000s in which geoscience educators anticipated that resources for geoscience teaching would be shared broadly and that collaborative authoring would be productive and engaging. By this time, a diverse set of examples demonstrated the power of the web infrastructure in supporting collaboration, dissemination, and professional development. Building on this foundation, more recent work has expanded both the size of the network and the scope of its work. Many large research projects initiated collaborations to disseminate resources supporting educational use of their data. Research results from the rapidly expanding geoscience education research community were integrated into the Pedagogies in Action website and OTCE. Projects engaged faculty across the nation in large-scale data collection and educational research. The Climate Literacy and Energy Awareness Network and OTCE engaged community members in reviewing the expanding body of on-line resources.
Building Strong Geoscience Departments sought to create, for departments, the same type of shared information base that was supporting individual faculty. The Teach the Earth portal and its underlying web development tools were used by NSF-funded projects in education to disseminate their results. Leveraging these funded efforts, the Climate Literacy Network has expanded this geoscience education community to include individuals broadly interested in fostering climate literacy. Most recently, the InTeGrate project is implementing inter-institutional collaborative authoring, testing and evaluation of curricular materials. While these projects represent only a fraction of the activity in geoscience education, they are important drivers in the development of a large, national, coherent geoscience education network with the ability to collaborate and disseminate information effectively. Importantly, the community is open and defined by active participation. Key mechanisms for engagement have included alignment of project activities with participants' needs and goals; productive face-to-face and virtual workshops, events, and series; stipends for completion of large products; and strong supporting staff to keep projects moving and assist with product production. One measure of its success is the adoption and adaptation of resources and models by emerging projects, which results in the continued growth of the network.

  11. SCALING-UP INFORMATION IN LAND-COVER DATA FOR LARGE-SCALE ENVIRONMENTAL ASSESSMENTS

    EPA Science Inventory

    The NLCD project provides national-scope land-cover data for the conterminous United States. The first land-cover data set was completed in 2000, and the continuing need for recent land-cover information has motivated continuation of the project to provide current and change info...

  12. Vulnerability-based evaluation of water supply design under climate change

    NASA Astrophysics Data System (ADS)

    Umit Taner, Mehmet; Ray, Patrick; Brown, Casey

    2015-04-01

    Long-lived water supply infrastructures are strategic investments in the developing world, serving the purpose of balancing water deficits compounded by both population growth and socio-economic development. Robust infrastructure design under climate change is compelling, and often addressed by focusing on the outcomes of climate model projections ('scenario-led' planning), or by identifying design options that are less vulnerable to a wide range of plausible futures ('vulnerability-based' planning). The decision-scaling framework combines these two approaches by first applying a climate stress test on the system to explore vulnerabilities across many traces of the future, and then employing climate projections to inform the decision-making process. In this work, we develop decision scaling's nascent risk management concepts further, directing actions on vulnerabilities identified during the climate stress test. In the process, we present a new way to inform the climate vulnerability space using climate projections, and demonstrate the use of multiple decision criteria to guide a final design recommendation. The concepts are demonstrated for a water supply project in the Mombasa Province of Kenya, planned to provide domestic and irrigation supply. Six storage design capacities (from 40 to 140 million cubic meters) are explored through a stress test, under a large number of climate traces representing both natural climate variability and plausible climate changes. Design outcomes are simulated over a 40-year planning period with a coupled hydrologic-water resources systems model and using standard reservoir operation rules. Resulting performance is expressed in terms of water supply reliability and economic efficiency. Ensemble climate projections are used for assigning conditional likelihoods to the climate traces using a statistical distance measure.
The final design recommendations are presented and discussed for the decision criteria of expected regret, satisficing, and conditional value-at-risk (CVaR).
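    The final step described above, ranking designs under projection-weighted climate traces using expected regret, satisficing, and CVaR, can be sketched with toy numbers. This is an illustrative sketch only, not the paper's model; the design names, costs, and weights below are hypothetical.

```python
# Toy sketch of decision criteria applied to stress-test output.
# cost[design][trace]: e.g. a cost measure under each climate trace.
cost = {
    "40MCM":  [5.0, 9.0, 14.0],
    "100MCM": [6.0, 7.0,  8.0],
    "140MCM": [8.0, 8.5,  9.0],
}
weights = [0.5, 0.3, 0.2]  # projection-informed likelihoods of traces

def expected_regret(cost, weights):
    """Likelihood-weighted regret vs. the best design in each trace."""
    traces = range(len(weights))
    best = [min(cost[d][t] for d in cost) for t in traces]
    return {d: sum(weights[t] * (cost[d][t] - best[t]) for t in traces)
            for d in cost}

def satisficing(values, weights, threshold):
    """Weighted share of traces in which cost stays below a threshold."""
    return sum(w for v, w in zip(values, weights) if v <= threshold)

def cvar(values, weights, alpha=0.5):
    """Weighted mean cost of the worst (1 - alpha) tail of the traces."""
    pairs = sorted(zip(values, weights), reverse=True)
    tail, acc = [], 0.0
    for v, w in pairs:
        take = min(w, (1 - alpha) - acc)
        if take <= 0:
            break
        tail.append((v, take))
        acc += take
    return sum(v * w for v, w in tail) / acc

reg = expected_regret(cost, weights)
print(min(reg, key=reg.get))                     # -> 100MCM
print(satisficing(cost["40MCM"], weights, 8.0))  # -> 0.5
print(round(cvar(cost["100MCM"], weights), 2))   # -> 7.4
```

    Different criteria can favor different designs, which is why the paper discusses a portfolio of criteria rather than a single objective.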

  13. Sustainable urban water systems in rich and poor cities--steps towards a new approach.

    PubMed

    Newman, P

    2001-01-01

    The 'big pipes in, big pipes out' approach to urban water management was developed in the 19th century for a particular linear urban form. Large, sprawling car-dependent cities are pushing this approach to new limits in rich cities, and it has never worked in poor cities. An alternative, which uses new small-scale technology and is more community-based, is suggested for both rich and poor countries. The Sydney Olympics and a demonstration project in Java show that the approach can work.

  14. Successful contracting of prevention services: fighting malnutrition in Senegal and Madagascar.

    PubMed

    Marek, T; Diallo, I; Ndiaye, B; Rakotosalama, J

    1999-12-01

    There are very few documented large-scale successes in nutrition in Africa, and virtually no consideration of contracting for preventive services. This paper describes two successful large-scale community nutrition projects in Africa as examples of what can be done in prevention using the contracting approach in rural as well as urban areas. The two case-studies are the Secaline project in Madagascar, and the Community Nutrition Project in Senegal. The article explains what is meant by 'success' in the context of these two projects, how these results were achieved, and how certain bottlenecks were avoided. Both projects are very similar in the type of service they provide, and in combining private administration with public finance. The article illustrates that contracting out is a feasible option to be seriously considered for organizing certain prevention programmes on a large scale. There are strong indications from these projects of success in terms of reducing malnutrition, replicability and scale, and community involvement. When choosing that option, a government can tap available private local human resources through contracting out, rather than delivering those services by the public sector. However, as was done in both projects studied, consideration needs to be given to using a contract management unit for execution and monitoring, which costs 13-17% of the total project's budget. Rigorous assessments of the cost-effectiveness of contracted services are not available, but improved health outcomes, targeting of the poor, and basic cost data suggest that the programmes may well be relatively cost-effective. Although the contracting approach is not presented as the panacea to solve the malnutrition problem faced by Africa, it can certainly provide an alternative in many countries to increase coverage and quality of services.

  15. Changing vessel routes could significantly reduce the cost of future offshore wind projects.

    PubMed

    Samoteskul, Kateryna; Firestone, Jeremy; Corbett, James; Callahan, John

    2014-08-01

    With the recent emphasis on offshore wind energy, Coastal and Marine Spatial Planning (CMSP) has become one of the main frameworks used to plan and manage the increasingly complex web of ocean and coastal uses. As wind development becomes more prevalent, existing users of the ocean space, such as commercial shippers, will be compelled to share their historically open-access waters with these projects. Here, we demonstrate the utility of using cost-effectiveness analysis (CEA) to support siting decisions within a CMSP framework. In this study, we assume that large-scale offshore wind development will take place in the US Mid-Atlantic within the next decades. We then evaluate whether building projects nearshore or far from shore would be more cost-effective. Building projects nearshore is assumed to require rerouting of the commercial vessel traffic traveling between the US Mid-Atlantic ports by an average of 18.5 km per trip. We focus on fewer than 1500 transits by large deep-draft vessels. We estimate that over the 29 years of the study, commercial shippers would incur an additional $0.2 billion (in 2012$) in direct and indirect costs. Building wind projects closer to shore where vessels used to transit would generate approximately $13.4 billion (in 2012$) in savings. Considering the large cost savings, modifying areas where vessels transit needs to be included in the portfolio of policies used to support the growth of the offshore wind industry in the US. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hung, Cheng-Hung

    The main objective of this project was to develop a low-cost integrated substrate for rigid OLED solid-state lighting produced at a manufacturing scale. The integrated substrates could include combinations of a soda-lime glass substrate, a light extraction layer, and an anode layer (i.e., Transparent Conductive Oxide, TCO). Over the 3+ year course of the project, the scope of work was revised to focus on the development of glass substrates with an internal light extraction (IEL) layer. A manufacturing-scale float-glass on-line particle embedding process capable of producing an IEL glass substrate having a thickness of less than 1.7 mm and an area larger than 500 mm x 400 mm was demonstrated. Substrates measuring 470 mm x 370 mm were used in the OLED manufacturing process for fabricating OLED lighting panels in single-pixel devices as large as 120.5 mm x 120.5 mm. The measured light extraction efficiency (calculated as external quantum efficiency, EQE) for on-line produced IEL samples (>50%) met the project's initial goal.

  17. Integrating human and machine intelligence in galaxy morphology classification tasks

    NASA Astrophysics Data System (ADS)

    Beck, Melanie R.; Scarlata, Claudia; Fortson, Lucy F.; Lintott, Chris J.; Simmons, B. D.; Galloway, Melanie A.; Willett, Kyle W.; Dickinson, Hugh; Masters, Karen L.; Marshall, Philip J.; Wright, Darryl

    2018-06-01

    Quantifying galaxy morphology is a challenging yet scientifically rewarding task. As the scale of data continues to increase with upcoming surveys, traditional classification methods will struggle to handle the load. We present a solution through an integration of visual and automated classifications, preserving the best features of both human and machine. We demonstrate the effectiveness of such a system through a re-analysis of visual galaxy morphology classifications collected during the Galaxy Zoo 2 (GZ2) project. We reprocess the top-level question of the GZ2 decision tree with a Bayesian classification aggregation algorithm dubbed SWAP, originally developed for the Space Warps gravitational lens project. Through a simple binary classification scheme, we increase the classification rate nearly 5-fold, classifying 226 124 galaxies in 92 d of GZ2 project time while reproducing labels derived from GZ2 classification data with 95.7 per cent accuracy. We next combine this with a Random Forest machine learning algorithm that learns on a suite of non-parametric morphology indicators widely used for automated morphologies. We develop a decision engine that delegates tasks between human and machine and demonstrate that the combined system provides at least a factor of 8 increase in the classification rate, classifying 210 803 galaxies in just 32 d of GZ2 project time with 93.1 per cent accuracy. As the Random Forest algorithm requires a minimal amount of computational cost, this result has important implications for galaxy morphology identification tasks in the era of Euclid and other large-scale surveys.
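    The core of a SWAP-style aggregation is a Bayesian update of each subject's label probability, weighted by each volunteer's estimated skill. The sketch below illustrates that update for a binary "smooth / not smooth" question; it is not the actual Galaxy Zoo or Space Warps code, and the skill values are invented.

```python
# Minimal SWAP-style Bayesian vote aggregation (illustrative only).
# Each volunteer has an estimated confusion matrix:
#   p_true_pos = P(votes "smooth" | galaxy truly smooth)
#   p_true_neg = P(votes "not"    | galaxy truly not smooth)
# Each vote updates the posterior that the galaxy is smooth.

def update(prior, vote, p_true_pos, p_true_neg):
    """One Bayes update. vote=True means the volunteer said 'smooth'."""
    if vote:
        like_smooth, like_not = p_true_pos, 1 - p_true_neg
    else:
        like_smooth, like_not = 1 - p_true_pos, p_true_neg
    num = prior * like_smooth
    return num / (num + (1 - prior) * like_not)

# Three volunteers of varying skill all vote "smooth":
p = 0.5
for tp, tn in [(0.9, 0.8), (0.7, 0.7), (0.6, 0.9)]:
    p = update(p, True, tp, tn)
print(round(p, 3))  # -> 0.984
```

    In the full system, a subject is retired once its posterior crosses an acceptance or rejection threshold, which is what drives the large speed-up over collecting a fixed number of votes per galaxy.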

  18. Basin-Scale Hydrologic Impacts of CO2 Storage: Regulatory and Capacity Implications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Birkholzer, J.T.; Zhou, Q.

    Industrial-scale injection of CO{sub 2} into saline sedimentary basins will cause large-scale fluid pressurization and migration of native brines, which may affect valuable groundwater resources overlying the deep sequestration reservoirs. In this paper, we discuss how such basin-scale hydrologic impacts can (1) affect regulation of CO{sub 2} storage projects and (2) may reduce current storage capacity estimates. Our assessment arises from a hypothetical future carbon sequestration scenario in the Illinois Basin, which involves twenty individual CO{sub 2} storage projects in a core injection area suitable for long-term storage. Each project is assumed to inject five million tonnes of CO{sub 2} per year for 50 years. A regional-scale three-dimensional simulation model was developed for the Illinois Basin that captures both the local-scale CO{sub 2}-brine flow processes and the large-scale groundwater flow patterns in response to CO{sub 2} storage. The far-field pressure buildup predicted for this selected sequestration scenario suggests that (1) the area that needs to be characterized in a permitting process may comprise a very large region within the basin if reservoir pressurization is considered, and (2) permits cannot be granted on a single-site basis alone because the near- and far-field hydrologic response may be affected by interference between individual sites. Our results also support recent studies in that environmental concerns related to near-field and far-field pressure buildup may be a limiting factor on CO{sub 2} storage capacity. In other words, estimates of storage capacity, if solely based on the effective pore volume available for safe trapping of CO{sub 2}, may have to be revised based on assessments of pressure perturbations and their potential impact on caprock integrity and groundwater resources, respectively.
We finally discuss some of the challenges in making reliable predictions of large-scale hydrologic impacts related to CO{sub 2} sequestration projects.

  19. Final Technical Report, City of Brockton Solar Brightfield: Deploying a Solar Array on a Brockton Brownfield

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ribeiro, Lori

    The City of Brockton, Massachusetts sought to install New England's largest solar array at a remediated brownfield site on Grove Street. The 425-kilowatt solar photovoltaic array, or "Brightfield", was installed in an urban park setting along with interpretive displays to maximize the educational opportunities. The "Brightfield" project included 1,395 310-Watt solar panels connected in "strings" that span the otherwise unusable 3.7-acre site. The project demonstrated that it is both technically and economically feasible to install utility-scale solar photovoltaics on a capped landfill site. The US Department of Energy conceived the Brightfields program in 2000, and Brockton's Brightfield is the largest such installation nationwide. Brockton's project demonstrated that while it was both technically and economically feasible to perform such a project, the implementation was extremely challenging due to state policy barriers, difficulty obtaining grant funding, and the level of sophistication required to arrange the financing and secure required state approvals. Key outcomes are that: 1) this demonstration project can be used as a model for other communities that wish to implement "Brownfields to Brightfields" projects; 2) implementing utility-scale solar creates economies of scale that can help to decrease costs of photovoltaics; 3) the project is an aesthetic, environmental, educational and economic asset for the City of Brockton.

  20. Telecommunications technology and rural education in the United States

    NASA Technical Reports Server (NTRS)

    Perrine, J. R.

    1975-01-01

    The rural sector of the US is examined from the point of view of whether telecommunications technology can augment the development of rural education. Migratory farm workers and American Indians were the target groups which were examined as examples of groups with special needs in rural areas. The general rural population and the target groups were examined to identify problems and to ascertain specific educational needs. Educational projects utilizing telecommunications technology in target group settings were discussed. Large scale regional ATS-6 satellite-based experimental educational telecommunications projects were described. Costs and organizational factors were also examined for large scale rural telecommunications projects.

  1. Building continental-scale 3D subsurface layers in the Digital Crust project: constrained interpolation and uncertainty estimation.

    NASA Astrophysics Data System (ADS)

    Yulaeva, E.; Fan, Y.; Moosdorf, N.; Richard, S. M.; Bristol, S.; Peters, S. E.; Zaslavsky, I.; Ingebritsen, S.

    2015-12-01

    The Digital Crust EarthCube building block creates a framework for integrating disparate 3D/4D information from multiple sources into a comprehensive model of the structure and composition of the Earth's upper crust, and demonstrates the utility of this model in several research scenarios. One such scenario is the estimation of various crustal properties related to fluid dynamics (e.g. permeability and porosity) at each node of any arbitrary unstructured 3D grid to support continental-scale numerical models of fluid flow and transport. Starting from Macrostrat, an existing 4D database of 33,903 chronostratigraphic units, and employing GeoDeepDive, a software system for extracting structured information from unstructured documents, we construct 3D gridded fields of sediment/rock porosity, permeability and geochemistry for large sedimentary basins of North America, which will be used to improve our understanding of large-scale fluid flow, chemical weathering rates, and geochemical fluxes into the ocean. In this talk, we discuss the methods, data gaps (particularly in geologically complex terrain), and various physical and geological constraints on interpolation and uncertainty estimation.

  2. Emily Evans | NREL

    Science.gov Websites

    Evans Emily Evans Project Controller Emily.Evans@nrel.gov | 303-275-3125 Emily joined NREL in 2010. As a Project Administrator in the Integrated Applications Center, Emily works with project managers and teams to develop and maintain project management excellence on large-scale, multi-year projects

  3. Research to Real Life, 2006: Innovations in Deaf-Blindness

    ERIC Educational Resources Information Center

    Leslie, Gail, Ed.

    2006-01-01

    This publication presents several projects that support children who are deaf-blind. These projects are: (1) Learning To Learn; (2) Project SALUTE; (3) Project SPARKLE; (4) Bringing It All Back Home; (5) Project PRIIDE; and (6) Including Students With Deafblindness In Large Scale Assessment Systems. Each project lists components, key practices,…

  4. Inquiry-Based Educational Design for Large-Scale High School Astronomy Projects Using Real Telescopes

    NASA Astrophysics Data System (ADS)

    Fitzgerald, Michael; McKinnon, David H.; Danaia, Lena

    2015-12-01

    In this paper, we outline the theory behind the educational design used to implement a large-scale high school astronomy education project. This design was created in response to the realization of ineffective educational design in the initial early stages of the project. The new design follows an iterative improvement model in which the materials and general approach can evolve in response to solicited feedback. The improvement cycle concentrates on avoiding overly positive self-evaluation, addressing relevant external school and community factors, and backward mapping from clearly set goals. Limiting factors, including time, resources, support and the potential for failure in the classroom, are dealt with as much as possible in the large-scale design, allowing teachers the best chance of successful implementation in their real-world classrooms. The actual approach adopted following the principles of this design is also outlined; it has seen success in bringing real astronomical data and access to telescopes into the high school classroom.

  5. Evaluating synoptic systems in the CMIP5 climate models over the Australian region

    NASA Astrophysics Data System (ADS)

    Gibson, Peter B.; Uotila, Petteri; Perkins-Kirkpatrick, Sarah E.; Alexander, Lisa V.; Pitman, Andrew J.

    2016-10-01

    Climate models are our principal tool for generating the projections used to inform climate change policy. Our confidence in projections depends, in part, on how realistically they simulate present-day climate and the associated variability over a range of time scales. Climate models are, however, less commonly assessed at time scales relevant to daily weather systems. Here we explore the utility of a self-organizing map (SOM) procedure for evaluating the frequency, persistence and transitions of daily synoptic systems in the Australian region simulated by state-of-the-art global climate models. In terms of skill in simulating the climatological frequency of synoptic systems, large spread was observed between models. A positive association between all metrics was found, implying that relative skill in simulating the persistence and transitions of systems is related to skill in simulating the climatological frequency. Considering all models and metrics collectively, model performance was found to be related to model horizontal resolution but unrelated to vertical resolution or representation of the stratosphere. In terms of the SOM procedure, the timespan over which evaluation was performed had some influence on model performance skill measures, as did the number of circulation types examined. These findings have implications for selecting the models most useful for future projections over the Australian region, particularly for projections related to synoptic-scale processes and phenomena. More broadly, this study has demonstrated the utility of the SOM procedure in providing a process-based evaluation of climate models.
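
    The SOM procedure classifies each day's circulation field to its best-matching node, so model skill can be scored by comparing node frequencies against reanalysis. A minimal SOM sketch on synthetic data (real studies use dedicated SOM packages and gridded reanalysis fields; the grid size and decay schedules here are illustrative assumptions):

```python
import numpy as np

def train_som(data, rows=3, cols=4, iters=2000, seed=0):
    """Train a minimal self-organizing map on (n_days, n_features) data.
    Returns node weight vectors of shape (rows*cols, n_features)."""
    rng = np.random.default_rng(seed)
    n, _ = data.shape
    weights = data[rng.choice(n, rows * cols, replace=False)].astype(float)
    # Node coordinates on the 2-D map, used for neighbourhood decay.
    grid = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
    for t in range(iters):
        frac = t / iters
        lr = 0.5 * (1.0 - frac)                          # decaying learning rate
        sigma = max(rows, cols) / 2 * (1.0 - frac) + 0.5  # decaying radius
        x = data[rng.integers(n)]
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
        d2 = ((grid - grid[bmu]) ** 2).sum(axis=1)
        h = np.exp(-d2 / (2 * sigma ** 2))                # Gaussian neighbourhood
        weights += lr * h[:, None] * (x - weights)
    return weights

def node_frequencies(data, weights):
    """Climatological frequency of each synoptic type (SOM node)."""
    bmus = np.argmin(((data[:, None, :] - weights[None]) ** 2).sum(-1), axis=1)
    return np.bincount(bmus, minlength=len(weights)) / len(data)

# Toy example: 500 synthetic "days" of a 10-point pressure field.
rng = np.random.default_rng(1)
days = rng.normal(size=(500, 10))
w = train_som(days)
freq = node_frequencies(days, w)
```

    Comparing `freq` between a model and a reference dataset, node by node, gives the frequency-skill metric; persistence and transitions come from the day-to-day sequence of best-matching units.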

  6. Really Large Scale Computer Graphic Projection Using Lasers and Laser Substitutes

    NASA Astrophysics Data System (ADS)

    Rother, Paul

    1989-07-01

    This paper reflects on past laser projects that displayed vector-scanned computer graphic images onto very large and irregular surfaces. Since the availability of microprocessors and high-powered visible lasers, very large scale computer graphics projection has become a reality. Due to their independence from a focusing lens, lasers easily project onto distant and irregular surfaces and have been used for amusement parks, theatrical performances, concerts, industrial trade shows and dance clubs. Lasers have been used to project onto mountains, buildings, 360° globes, clouds of smoke and water. These methods have proven successful in installations at Epcot Theme Park in Florida; Stone Mountain Park in Georgia; the 1984 Olympics in Los Angeles; hundreds of corporate trade shows; and thousands of musical performances. Using new ColorRay™ technology, costly and fragile lasers are no longer necessary: utilizing fiber optic technology, the functionality of lasers can be duplicated for new and exciting projection possibilities. ColorRay™ technology has enjoyed worldwide recognition in conjunction with Pink Floyd's and George Michael's world tours.

  7. Adjoint-Based Methodology for Time-Dependent Optimal Control (AMTOC)

    NASA Technical Reports Server (NTRS)

    Yamaleev, Nail; Diskin, Boris; Nishikawa, Hiroaki

    2012-01-01

    During the five years of this project, the AMTOC team developed an adjoint-based methodology for design and optimization of complex time-dependent flows, implemented AMTOC in a testbed environment, directly assisted in the implementation of this methodology in NASA's state-of-the-art unstructured CFD code FUN3D, and successfully demonstrated applications of this methodology to large-scale optimization of several supersonic and other aerodynamic systems, such as fighter jet, subsonic aircraft, rotorcraft, high-lift, wind-turbine, and flapping-wing configurations. In the course of this project, the AMTOC team published 13 refereed journal articles, 21 refereed conference papers, and 2 NIA reports. The AMTOC team presented the results of this research at 36 international and national conferences, meetings and seminars, including the International Conference on CFD and numerous AIAA conferences and meetings. Selected publications that include the major results of the AMTOC project are enclosed in this report.

  8. Using Microsoft Excel[R] to Calculate Descriptive Statistics and Create Graphs

    ERIC Educational Resources Information Center

    Carr, Nathan T.

    2008-01-01

    Descriptive statistics and appropriate visual representations of scores are important for all test developers, whether they are experienced testers working on large-scale projects, or novices working on small-scale local tests. Many teachers put in charge of testing projects do not know "why" they are important, however, and are utterly convinced…

  9. Development of Affordable, Low-Carbon Hydrogen Supplies at an Industrial Scale

    ERIC Educational Resources Information Center

    Roddy, Dermot J.

    2008-01-01

    An existing industrial hydrogen generation and distribution infrastructure is described, and a number of large-scale investment projects are outlined. All of these projects have the potential to generate significant volumes of low-cost, low-carbon hydrogen. The technologies concerned range from gasification of coal with carbon capture and storage…

  10. Real-time adaptive ramp metering : phase I, MILOS proof of concept (multi-objective, integrated, large-scale, optimized system).

    DOT National Transportation Integrated Search

    2006-12-01

    Over the last several years, researchers at the University of Arizona's ATLAS Center have developed an adaptive ramp metering system referred to as MILOS (Multi-Objective, Integrated, Large-Scale, Optimized System). The goal of this project is ...

  11. Scaling up Education Reform

    ERIC Educational Resources Information Center

    Gaffney, Jon D. H.; Richards, Evan; Kustusch, Mary Bridget; Ding, Lin; Beichner, Robert J.

    2008-01-01

    The SCALE-UP (Student-Centered Activities for Large Enrollment Undergraduate Programs) project was developed to implement reforms designed for small classes into large physics classes. Over 50 schools across the country, ranging from Wake Technical Community College to the Massachusetts Institute of Technology (MIT), have adopted it for classes of…

  12. United States Temperature and Precipitation Extremes: Phenomenology, Large-Scale Organization, Physical Mechanisms and Model Representation

    NASA Astrophysics Data System (ADS)

    Black, R. X.

    2017-12-01

    We summarize results from a project focusing on regional temperature and precipitation extremes over the continental United States. Our project introduces a new framework for evaluating these extremes emphasizing their (a) large-scale organization, (b) underlying physical sources (including remote-excitation and scale-interaction) and (c) representation in climate models. Results to be reported include the synoptic-dynamic behavior, seasonality and secular variability of cold waves, dry spells and heavy rainfall events in the observational record. We also study how the characteristics of such extremes are systematically related to Northern Hemisphere planetary wave structures and thus planetary- and hemispheric-scale forcing (e.g., those associated with major El Nino events and Arctic sea ice change). The underlying physics of event onset are diagnostically quantified for different categories of events. Finally, the representation of these extremes in historical coupled climate model simulations is studied and the origins of model biases are traced using new metrics designed to assess the large-scale atmospheric forcing of local extremes.

  13. The Large Scale Distribution of Water Ice in the Polar Regions of the Moon

    NASA Astrophysics Data System (ADS)

    Jordan, A.; Wilson, J. K.; Schwadron, N.; Spence, H. E.

    2017-12-01

    For in situ resource utilization, one must know where water ice is on the Moon. Many datasets have revealed both surface deposits of water ice and subsurface deposits of hydrogen near the lunar poles, but it has proved difficult to resolve the differences among the locations of these deposits. Despite these datasets disagreeing on how deposits are distributed on small scales, we show that most of these datasets do agree on the large scale distribution of water ice. We present data from the Cosmic Ray Telescope for the Effects of Radiation (CRaTER) on the Lunar Reconnaissance Orbiter (LRO), LRO's Lunar Exploration Neutron Detector (LEND), the Neutron Spectrometer on Lunar Prospector (LPNS), LRO's Lyman Alpha Mapping Project (LAMP), LRO's Lunar Orbiter Laser Altimeter (LOLA), and Chandrayaan-1's Moon Mineralogy Mapper (M3). All, including those that show clear evidence for water ice, reveal surprisingly similar trends with latitude, suggesting that both surface and subsurface datasets are measuring ice. All show that water ice increases towards the poles, and most demonstrate that its signature appears at about ±70° latitude and increases poleward. This is consistent with simulations of how surface and subsurface cold traps are distributed with latitude. This large scale agreement constrains the origin of the ice, suggesting that an ancient cometary impact (or impacts) created a large scale deposit that has been rendered locally heterogeneous by subsequent impacts. Furthermore, it also shows that water ice may be available down to ±70°—latitudes that are more accessible than the poles for landing.

  14. Mems: Platform for Large-Scale Integrated Vacuum Electronic Circuits

    DTIC Science & Technology

    2017-03-20

    Final Report: MEMS Platform for Large-Scale Integrated Vacuum Electronic Circuits (LIVEC), 20-03-2017, covering the reporting period 1-Jul-2014 to 30-Jun-2015. Contract No: W911NF-14-C-0093; COR: Dr. James Harvey, U.S. ARO, RTP, NC 27709-2211; phone: 702-696-2533. The objective of the LIVEC advanced study project was to develop a platform for large-scale integrated vacuum electronic circuits. Distribution Unlimited.

  15. High-efficiency Thin-film Fe2SiS4 and Fe2GeS4-based Solar Cells Prepared from Low-Cost Solution Precursors. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radu, Daniela Rodica; Liu, Mimi; Hwang, Po-yu

    The project aimed to provide solar energy education to students from underrepresented groups and to develop a novel, nano-scale approach to utilizing Fe2SiS4 and Fe2GeS4 materials as precursors to the absorber layer in photovoltaic thin-film devices. The objectives of the project were as follows: 1. Develop and implement one solar-related course at Delaware State University and train two graduate students in solar research. 2. Fabricate and characterize high-efficiency (larger than 7%) Fe2SiS4 and Fe2GeS4-based solar devices. The project was successful in the educational component, implementing the solar course at DSU, as well as in developing multiple routes to prepare Fe2GeS4 with high purity and in large quantities. The project did not meet the efficiency objective; however, a functional solar device was demonstrated.

  16. CERAPP: Collaborative Estrogen Receptor Activity Prediction Project

    EPA Pesticide Factsheets

    Data from a large-scale modeling project called CERAPP (Collaborative Estrogen Receptor Activity Prediction Project) demonstrating the use of predictive computational models on high-throughput screening data to screen thousands of chemicals against the estrogen receptor. This dataset is associated with the following publication: Mansouri, K., A. Abdelaziz, A. Rybacka, A. Roncaglioni, A. Tropsha, A. Varnek, A. Zakharov, A. Worth, A. Richard, C. Grulke, D. Trisciuzzi, D. Fourches, D. Horvath, E. Benfenati, E. Muratov, E.B. Wedebye, F. Grisoni, G.F. Mangiatordi, G.M. Incisivo, H. Hong, H.W. Ng, I.V. Tetko, I. Balabin, J. Kancherla, J. Shen, J. Burton, M. Nicklaus, M. Cassotti, N.G. Nikolov, O. Nicolotti, P.L. Andersson, Q. Zang, R. Politi, R.D. Beger, R. Todeschini, R. Huang, S. Farag, S.A. Rosenberg, S. Slavov, X. Hu, and R. Judson. CERAPP: Collaborative Estrogen Receptor Activity Prediction Project. ENVIRONMENTAL HEALTH PERSPECTIVES. National Institute of Environmental Health Sciences (NIEHS), Research Triangle Park, NC, USA, 1-49, (2016).

  17. A Fourier-based compressed sensing technique for accelerated CT image reconstruction using first-order methods.

    PubMed

    Choi, Kihwan; Li, Ruijiang; Nam, Haewon; Xing, Lei

    2014-06-21

    As a solution to iterative CT image reconstruction, first-order methods are prominent for their large-scale capability and fast theoretical convergence rate. In practice, a CT system matrix with a large condition number may lead to slow convergence despite the theoretically promising upper bound. The aim of this study is to develop a Fourier-based scaling technique to enhance the convergence speed of first-order methods applied to CT image reconstruction. Instead of working in the projection domain, we transform the projection data and construct a data fidelity model in Fourier space. Inspired by the filtered backprojection formalism, the data are appropriately weighted in Fourier space. We formulate an optimization problem based on weighted least-squares in Fourier space and total-variation (TV) regularization in image space for parallel-beam, fan-beam and cone-beam CT geometries. To achieve maximum computational speed, the optimization problem is solved using a fast iterative shrinkage-thresholding algorithm with backtracking line search and a GPU implementation of projection/backprojection. The performance of the proposed algorithm is demonstrated through a series of digital simulation and experimental phantom studies. The results are compared with existing TV-regularized techniques based on statistics-based weighted least-squares as well as the basic algebraic reconstruction technique. The proposed Fourier-based compressed sensing (CS) method significantly improves both the image quality and the convergence rate compared to the existing CS techniques.
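
    The accelerated first-order machinery referred to here is FISTA (fast iterative shrinkage-thresholding). A minimal sketch follows, using an l1 soft-thresholding proximal step on a generic sparse-recovery problem as a simple stand-in for the paper's Fourier-weighted least-squares plus TV formulation; the momentum scheme is the same:

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t*||x||_1 (elementwise shrinkage)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista(A, b, lam, iters=200):
    """FISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    A fixed step 1/L is used instead of backtracking for brevity."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(iters):
        grad = A.T @ (A @ y - b)
        x_new = soft_threshold(y - grad / L, lam / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + (t - 1.0) / t_new * (x_new - x)   # momentum extrapolation
        x, t = x_new, t_new
    return x

# Toy example: recover a sparse signal from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.normal(size=(60, 100))
x_true = np.zeros(100)
x_true[[5, 40, 77]] = [2.0, -1.5, 3.0]
b = A @ x_true + 0.01 * rng.normal(size=60)
x_hat = fista(A, b, lam=0.5)
```

    The paper's Fourier-space weighting plays the role of a preconditioner on the quadratic term, which shrinks the effective condition number and hence speeds up exactly this iteration.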

  18. A Scale Model of Cation Exchange for Classroom Demonstration.

    ERIC Educational Resources Information Center

    Guertal, E. A.; Hattey, J. A.

    1996-01-01

    Describes a project that developed a scale model of cation exchange that can be used for a classroom demonstration. The model uses kaolinite clay, nails, plywood, and foam balls to enable students to gain a better understanding of the exchange complex of soil clays. (DDR)

  19. Considerations for Managing Large-Scale Clinical Trials.

    ERIC Educational Resources Information Center

    Tuttle, Waneta C.; And Others

    1989-01-01

    Research management strategies used effectively in a large-scale clinical trial to determine the health effects of exposure to Agent Orange in Vietnam are discussed, including pre-project planning, organization according to strategy, attention to scheduling, a team approach, emphasis on guest relations, cross-training of personnel, and preparing…

  20. Raising Concerns about Sharing and Reusing Large-Scale Mathematics Classroom Observation Video Data

    ERIC Educational Resources Information Center

    Ing, Marsha; Samkian, Artineh

    2018-01-01

    There are great opportunities and challenges to sharing large-scale mathematics classroom observation data. This Research Commentary describes the methodological opportunities and challenges and provides a specific example from a mathematics education research project to illustrate how the research questions and framework drove observational…

  1. Wavelet-based time series bootstrap model for multidecadal streamflow simulation using climate indicators

    NASA Astrophysics Data System (ADS)

    Erkyihun, Solomon Tassew; Rajagopalan, Balaji; Zagona, Edith; Lall, Upmanu; Nowak, Kenneth

    2016-05-01

    A model to generate stochastic streamflow projections conditioned on quasi-oscillatory climate indices such as the Pacific Decadal Oscillation (PDO) and the Atlantic Multi-decadal Oscillation (AMO) is presented. Recognizing that each climate index has underlying band-limited components that contribute most of the energy of the signal, we first pursue a wavelet decomposition of the signals to identify and reconstruct these features from annually resolved historical data and proxy-based paleo-reconstructions of each climate index covering the period from 1650 to 2012. A K-nearest-neighbor block bootstrap approach is then developed to simulate the total signal of each of these climate index series while preserving its time-frequency structure and marginal distribution. Finally, given the simulated climate signal time series, a K-nearest-neighbor bootstrap is used to simulate annual streamflow series conditional on the joint state space defined by the simulated climate indices for each year. We demonstrate this method by applying it to the simulation of streamflow at the Lees Ferry gauge on the Colorado River using indices of two large-scale climate forcings, the PDO and the AMO, which are known to modulate Colorado River Basin (CRB) hydrology at multidecadal time scales. Skill in stochastic simulation of multidecadal projections of flow using this approach is demonstrated.
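
    The conditional K-nearest-neighbor bootstrap step can be sketched in one dimension: for each simulated index value, resample a flow from the k nearest historical years, with nearer neighbors weighted more heavily via a 1/rank kernel. This is a simplified, hypothetical illustration (the paper conditions on the joint PDO/AMO state space, not a single index):

```python
import numpy as np

def knn_bootstrap_flow(index_sim, index_hist, flow_hist, k=5, seed=0):
    """Simulate annual flows conditioned on a simulated climate index.
    For each simulated index value, the k nearest historical index
    values are found and one of their flows is resampled with
    probability proportional to 1/rank."""
    rng = np.random.default_rng(seed)
    w = 1.0 / np.arange(1, k + 1)
    w /= w.sum()                                        # discrete 1/rank kernel
    flows = np.empty(len(index_sim))
    for i, z in enumerate(index_sim):
        order = np.argsort(np.abs(index_hist - z))[:k]  # k nearest years
        flows[i] = flow_hist[rng.choice(order, p=w)]
    return flows

# Toy example: wetter years when the index is positive.
rng = np.random.default_rng(2)
idx_hist = rng.normal(size=300)
flow_hist = 15.0 + 3.0 * idx_hist + rng.normal(scale=0.5, size=300)
sim = knn_bootstrap_flow(np.full(1000, 1.0), idx_hist, flow_hist)
```

    Because flows are resampled rather than modeled parametrically, the historical marginal distribution and the flow-index dependence are preserved by construction.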

  2. Effects of Ensemble Configuration on Estimates of Regional Climate Uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldenson, N.; Mauger, G.; Leung, L. R.

    Internal variability in the climate system can contribute substantial uncertainty in climate projections, particularly at regional scales. Internal variability can be quantified using large ensembles of simulations that are identical but for perturbed initial conditions. Here we compare methods for quantifying internal variability. Our study region spans the west coast of North America, which is strongly influenced by El Niño and other large-scale dynamics through their contribution to large-scale internal variability. Using a statistical framework to simultaneously account for multiple sources of uncertainty, we find that internal variability can be quantified consistently using a large ensemble or an ensemble of opportunity that includes small ensembles from multiple models and climate scenarios. The latter also produces estimates of uncertainty due to model differences. We conclude that projection uncertainties are best assessed using small single-model ensembles from as many model-scenario pairings as computationally feasible, which has implications for ensemble design in large modeling efforts.

  3. Development and Applications of a Modular Parallel Process for Large Scale Fluid/Structures Problems

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.; Kwak, Dochan (Technical Monitor)

    2002-01-01

    A modular process that can efficiently solve large scale multidisciplinary problems using massively parallel supercomputers is presented. The process integrates disciplines with diverse physical characteristics by retaining the efficiency of individual disciplines. Computational domain independence of individual disciplines is maintained using a meta programming approach. The process integrates disciplines without affecting the combined performance. Results are demonstrated for large scale aerospace problems on several supercomputers. The super scalability and portability of the approach are demonstrated on several parallel computers.

  4. Development and Applications of a Modular Parallel Process for Large Scale Fluid/Structures Problems

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.; Byun, Chansup; Kwak, Dochan (Technical Monitor)

    2001-01-01

    A modular process that can efficiently solve large scale multidisciplinary problems using massively parallel supercomputers is presented. The process integrates disciplines with diverse physical characteristics by retaining the efficiency of individual disciplines. Computational domain independence of individual disciplines is maintained using a meta programming approach. The process integrates disciplines without affecting the combined performance. Results are demonstrated for large scale aerospace problems on several supercomputers. The super scalability and portability of the approach are demonstrated on several parallel computers.

  5. Higher climatological temperature sensitivity of soil carbon in cold than warm climates

    NASA Astrophysics Data System (ADS)

    Koven, Charles D.; Hugelius, Gustaf; Lawrence, David M.; Wieder, William R.

    2017-11-01

    The projected loss of soil carbon to the atmosphere resulting from climate change is a potentially large but highly uncertain feedback to warming. The magnitude of this feedback is poorly constrained by observations and theory, and is disparately represented in Earth system models (ESMs). To assess the climatological temperature sensitivity of soil carbon, we calculate apparent soil carbon turnover times that reflect long-term and broad-scale rates of decomposition. Here, we show that the climatological temperature control on carbon turnover in the top metre of global soils is more sensitive in cold climates than in warm climates and argue that it is critical to capture this emergent ecosystem property in global-scale models. We present a simplified model that explains the observed high cold-climate sensitivity using only the physical scaling of soil freeze-thaw state across climate gradients. Current ESMs fail to capture this pattern, except in an ESM that explicitly resolves vertical gradients in soil climate and carbon turnover. An observed weak tropical temperature sensitivity emerges in a different model that explicitly resolves mineralogical control on decomposition. These results support projections of strong carbon-climate feedbacks from northern soils and demonstrate a method for ESMs to capture this emergent behaviour.

  6. Contribution of the infrasound technology to characterize large scale atmospheric disturbances and impact on infrasound monitoring

    NASA Astrophysics Data System (ADS)

    Blanc, Elisabeth; Le Pichon, Alexis; Ceranna, Lars; Pilger, Christoph; Charlton Perez, Andrew; Smets, Pieter

    2016-04-01

    The International Monitoring System (IMS) developed for the verification of the Comprehensive nuclear-Test-Ban Treaty (CTBT) provides a unique global description of atmospheric disturbances generating infrasound, such as extreme events (e.g. meteors, volcanoes, earthquakes, and severe weather) or human activity (e.g. explosions and supersonic airplanes). The analysis of the detected signals, recorded at global scales and over nearly 15 years at some stations, demonstrates that large-scale atmospheric disturbances strongly affect infrasound propagation. Their time scales vary from several tens of minutes to hours and days. Their effects are on average well resolved by current model predictions; however, accurate spatial and temporal description is lacking in both weather and climate models. This study reviews recent results using the infrasound technology to characterize these large-scale disturbances, including (i) wind fluctuations induced by gravity waves, generating infrasound partial reflections and modifications of the infrasound waveguide; (ii) convection from thunderstorms and mountain waves, generating gravity waves; (iii) stratospheric warming events, which yield wind inversions in the stratosphere; and (iv) planetary waves, which control the global atmospheric circulation. Improved knowledge of these disturbances and their assimilation in future models is an important objective of the ARISE (Atmospheric dynamics Research InfraStructure in Europe) project. This is essential in the context of the future verification of the CTBT, as enhanced atmospheric models are necessary to assess the IMS network performance at higher resolution, reduce source location errors, and improve characterization methods.

  7. Low-cost solar array project task 1: Silicon material. Gaseous melt replenishment system

    NASA Technical Reports Server (NTRS)

    Jewett, D. N.; Bates, H. E.; Hill, D. M.

    1980-01-01

    The operation of a silicon production technique was demonstrated. The essentials of the method comprise chemical vapor deposition of silicon, by hydrogen reduction of chlorosilanes, on the inside of a quartz reaction vessel having a large internal surface area. The system was designed to allow successive deposition-melting cycles, with silicon removal accomplished by discharging the molten silicon. The liquid product would be suitable for transfer to a crystal growth process, casting into solid form, or production of shot. A scaled-down prototype reactor demonstrated a single-pass conversion efficiency of 20 percent, with deposition rates and energy consumption better than conventional Siemens reactors: deposition rates of 365 microns/hr and electrical consumption of 35 kWh/kg of silicon produced.

  8. Visualizing the universe, part 2

    NASA Technical Reports Server (NTRS)

    Falco, Emilio E.; Kurtz, Michael J.; Bajuk, Mark

    1992-01-01

    It is now possible to create animated views of the universe that are realistic, physically relevant, and breathtaking. To demonstrate the point, we describe our efforts to navigate the CfA redshift survey. For our project, we selected several CCD images of spiral and elliptical galaxies, and placed them at their observed positions in redshift space. We demonstrate how, by choreographing aesthetically pleasing trajectories, we are able to develop our own and the viewer's intuition about the large-scale structures found in the CfA redshift survey. We show for instance that three-dimensional motion enhances significantly our perception of voids and sheets in the distribution of galaxies. Such sophistication happily has become possible with the 'coming of age' of observational cosmology, as data have grown to drive the field.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hinzman, Larry D.; Bolton, William Robert; Young-Robertson, Jessica

    This project improves meso-scale hydrologic modeling in the boreal forest by: (1) demonstrating the importance of capturing the heterogeneity of the landscape using small-scale datasets for parameterization of both small and large basins; (2) demonstrating that in drier parts of the landscape, and as the boreal forest dries with climate change, modeling approaches must consider the sensitivity of simulations to soil hydraulic parameters, such as residual water content, that are usually held constant; thus, variability/flexibility in residual water content must be considered for accurate simulation of hydrologic processes in the boreal forest; (3) demonstrating that assessing climate change impacts on boreal forest hydrology through multiple model integration must account for direct effects of climate change (temperature and precipitation) and indirect effects from climate impacts on landscape characteristics (permafrost and vegetation distribution); simulations demonstrated that climate change will increase runoff, but will increase ET to a greater extent and result in a drying of the landscape; and (4) demonstrating that vegetation plays a significant role in boreal hydrologic processes in permafrost-free areas that have deciduous trees; this landscape type results in a decoupling of ET and precipitation, a tight coupling of ET and temperature, low runoff, and overall soil drying.

  10. Testing the robustness of Citizen Science projects: Evaluating the results of pilot project COMBER

    PubMed Central

    Faulwetter, Sarah; Dailianis, Thanos; Smith, Vincent Stuart; Koulouri, Panagiota; Dounas, Costas; Arvanitidis, Christos

    2016-01-01

    Abstract Background Citizen Science (CS) as a term covers a great range of approaches and scopes involving many different fields of science. The number of relevant projects globally has increased significantly in recent years. Large-scale ecological questions can be answered only through extended observation networks, and CS projects can support this effort. Although the need for such projects is apparent, an important part of the scientific community casts doubt on the reliability of CS data sets. New information The pilot CS project COMBER was created to provide evidence to answer this question for coastal marine biodiversity monitoring. The results of the current analysis show that a carefully designed CS project with clear hypotheses, wide participation and data set validation can be a valuable tool for detecting large-scale, long-term changes in marine biodiversity patterns, and therefore for relevant management and conservation issues. PMID:28174507

  11. HSTDEK: Developing a methodology for construction of large-scale, multi-use knowledge bases

    NASA Technical Reports Server (NTRS)

    Freeman, Michael S.

    1987-01-01

    The primary research objectives of the Hubble Space Telescope Design/Engineering Knowledgebase (HSTDEK) are to develop a methodology for constructing and maintaining large-scale knowledge bases which can be used to support multiple applications. To ensure the validity of its results, this research is being pursued in the context of a real-world system, the Hubble Space Telescope. The HSTDEK objectives are described in detail. The history and motivation of the project are briefly described. The technical challenges faced by the project are outlined.

  12. 75 FR 13765 - Submission for OMB Review; Use of Project Labor Agreements for Federal Construction Projects

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-23

    ... a project labor agreement (PLA), as they may decide appropriate, on large-scale construction... efficiency in Federal procurement. A PLA is a pre-hire collective bargaining agreement with one or more labor...

  13. KSC-2009-6449

    NASA Image and Video Library

    2009-11-19

    CAPE CANAVERAL, Fla. – NASA's first large-scale solar power generation facility is unveiled at NASA's Kennedy Space Center in Florida. Representatives from NASA, Florida Power & Light Company, or FPL, and SunPower Corporation formally commissioned the one-megawatt facility and announced plans to pursue a new research, development and demonstration project at Kennedy to advance America's use of renewable energy. The facility is the first element of a major renewable energy project currently under construction at Kennedy. The completed system features a fixed-tilt, ground-mounted solar power system designed and built by SunPower, along with SunPower solar panels. A 10-megawatt solar farm, which SunPower is building on nearby Kennedy property, will supply power to FPL's customers when it is completed in April 2010. Photo credit: NASA/Jim Grossmann

  14. KSC-2009-6457

    NASA Image and Video Library

    2009-11-19

    CAPE CANAVERAL, Fla. – NASA's first large-scale solar power generation facility is ready for operation at NASA's Kennedy Space Center in Florida. Representatives from NASA, Florida Power & Light Company, or FPL, and SunPower Corporation formally commissioned the one-megawatt facility and announced plans to pursue a new research, development and demonstration project at Kennedy to advance America's use of renewable energy. The facility is the first element of a major renewable energy project currently under construction at Kennedy. The completed system features a fixed-tilt, ground-mounted solar power system designed and built by SunPower, along with SunPower solar panels. A 10-megawatt solar farm, which SunPower is building on nearby Kennedy property, will supply power to FPL's customers when it is completed in April 2010. Photo credit: NASA/Jim Grossmann

  15. KSC-2009-6450

    NASA Image and Video Library

    2009-11-19

    CAPE CANAVERAL, Fla. – NASA's first large-scale solar power generation facility opens at NASA's Kennedy Space Center in Florida. Representatives from NASA, Florida Power & Light Company, or FPL, and SunPower Corporation formally commissioned the one-megawatt facility and announced plans to pursue a new research, development and demonstration project at Kennedy to advance America's use of renewable energy. The facility is the first element of a major renewable energy project currently under construction at Kennedy. The completed system features a fixed-tilt, ground-mounted solar power system designed and built by SunPower, along with SunPower solar panels. A 10-megawatt solar farm, which SunPower is building on nearby Kennedy property, will supply power to FPL's customers when it is completed in April 2010. Photo credit: NASA/Jim Grossmann

  16. NASA/Drexel program. [research effort in large-scale technical programs management for application to urban problems]

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The results are reported of the NASA/Drexel research effort, which was conducted in two separate phases. The initial phase stressed exploration of the problem from the point of view of three primary research areas and the building of a multidisciplinary team. The final phase consisted of a clinical demonstration program in which the research associates consulted with the County Executive of New Castle County, Delaware, to aid in solving actual problems confronting the County Government. The three primary research areas of the initial phase are identified as technology, management science, and behavioral science. Five specific projects which made up the research effort are treated separately. A final section contains the conclusions drawn from the total research effort as well as from the specific projects.

  17. A Hybrid Approach to Protect Palmprint Templates

    PubMed Central

    Liu, Hailun; Sun, Dongmei; Xiong, Ke; Qiu, Zhengding

    2014-01-01

    Biometric template protection is indispensable for protecting personal privacy in large-scale deployments of biometric systems. Accuracy, changeability, and security are three critical requirements for template protection algorithms; however, existing algorithms cannot satisfy all three well. In this paper, we propose a hybrid approach that combines random projection and fuzzy vault to improve performance on all three criteria. A heterogeneous space is designed to combine random projection and fuzzy vault properly in the hybrid scheme. A new chaff point generation method is also proposed to enhance the security of the heterogeneous vault. Theoretical analyses of the proposed hybrid approach in terms of accuracy, changeability, and security are given in this paper. Experimental results on a palmprint database support the theoretical analyses and demonstrate the effectiveness of the proposed hybrid approach. PMID:24982977

  18. A hybrid approach to protect palmprint templates.

    PubMed

    Liu, Hailun; Sun, Dongmei; Xiong, Ke; Qiu, Zhengding

    2014-01-01

    Biometric template protection is indispensable for protecting personal privacy in large-scale deployments of biometric systems. Accuracy, changeability, and security are three critical requirements for template protection algorithms; however, existing algorithms cannot satisfy all three well. In this paper, we propose a hybrid approach that combines random projection and fuzzy vault to improve performance on all three criteria. A heterogeneous space is designed to combine random projection and fuzzy vault properly in the hybrid scheme. A new chaff point generation method is also proposed to enhance the security of the heterogeneous vault. Theoretical analyses of the proposed hybrid approach in terms of accuracy, changeability, and security are given in this paper. Experimental results on a palmprint database support the theoretical analyses and demonstrate the effectiveness of the proposed hybrid approach.
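    The random-projection half of the hybrid scheme above can be sketched as follows. This is a minimal illustration of key-seeded random projection for changeable (cancelable) templates only; it is not the authors' heterogeneous-space or chaff-point construction, and all names, dimensions, and parameters are hypothetical.

```python
import numpy as np

def random_projection_template(feature_vec, user_key, out_dim=64):
    """Project a palmprint feature vector into a user-specific subspace.

    Changeability: a compromised template is revoked by issuing a new
    user_key, which deterministically changes the projection matrix.
    """
    rng = np.random.default_rng(user_key)  # key-seeded randomness
    d = feature_vec.shape[0]
    # Gaussian random projection matrix, scaled to roughly preserve norms
    R = rng.normal(0.0, 1.0 / np.sqrt(out_dim), size=(out_dim, d))
    return R @ feature_vec

# Same key -> reproducible template; new key -> renewed template.
feat = np.arange(256, dtype=float)
t1 = random_projection_template(feat, user_key=42)
t2 = random_projection_template(feat, user_key=42)
t3 = random_projection_template(feat, user_key=43)
```

In a full scheme along the lines of the paper, the projected vector would then be bound to a key inside a fuzzy vault rather than stored directly.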

  19. Ultrasonic humidification for telecommunications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Longo, F.

    1994-03-01

    This article examines two installations which demonstrate that ultrasonic humidification is an excellent option for large-scale commercial installations. Many existing telephone switching centers constructed 20 to 30 years ago were equipped with electro-mechanical switching equipment that was not sensitive to humidity. Today's sophisticated solid-state telecommunications equipment requires specific levels of relative humidity to operate properly. Over the last several years, Einhorn Yaffee Prescott (formerly Rose Beaton + Rose) designed two of the largest ultrasonic humidification systems at telecommunications buildings located in Cheshire, Conn., and White Plains, N.Y. The Cheshire project was a retrofit to the existing system in a 1960s building; the White Plains project involved an upgrade to a totally new air handling system, including an ultrasonic humidification component, in a 1950s building.

  20. The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2002-01-01

    With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of the integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies that indicate that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technology infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will take the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.

  1. Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghattas, Omar

    2013-10-15

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.

  2. Effects of Large-Scale Solar Installations on Dust Mobilization and Air Quality

    NASA Astrophysics Data System (ADS)

    Pratt, J. T.; Singh, D.; Diffenbaugh, N. S.

    2012-12-01

    Large-scale solar projects are increasingly being developed worldwide, and many of these installations are located in arid, desert regions. To examine the effects of these projects on regional dust mobilization and air quality, we analyze aerosol product data from NASA's Multi-angle Imaging Spectroradiometer (MISR) at annual and seasonal time intervals near fifteen photovoltaic and solar thermal stations ranging from 5-200 MW (12-4,942 acres) in size. The stations are distributed over eight different countries and were chosen based on size, location, and installation date; most of the installations are large-scale, are located in desert climates, and were installed between 2006 and 2010. We also consider air quality measurements of particulate matter between 2.5 and 10 micrometers (PM10) from the Environmental Protection Agency (EPA) monitoring sites near and downwind from the project installations in the U.S. We use monthly wind data from NOAA's National Center for Atmospheric Prediction (NCEP) Global Reanalysis to select the stations downwind from the installations, and then perform statistical analysis on the data to identify any significant changes in these quantities. We find that fourteen of the fifteen regions have lower aerosol product after the start of the installations, and all six PM10 monitoring stations show lower particulate matter measurements after construction commenced. However, the results fail to show any statistically significant differences in aerosol optical index or PM10 measurements before and after the large-scale solar installations. Many of the large installations are very recent, and there is insufficient data to fully understand the long-term effects on air quality. More data and higher-resolution analysis are necessary to better understand the relationship between large-scale solar, dust, and air quality.
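    The before/after comparison described above can be sketched with a simple two-sample test. The abstract does not specify which statistical test the authors used, and the data below are synthetic placeholders; this is only an illustration of testing pre/post-installation differences in a monitored quantity.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical monthly mean PM10 readings (ug/m3) at one downwind station,
# three years before and three years after construction commenced.
pm10_before = rng.normal(32.0, 6.0, size=36)
pm10_after = rng.normal(30.5, 6.0, size=36)

# Welch's t-test (unequal variances): is the pre/post difference significant?
t_stat, p_value = stats.ttest_ind(pm10_before, pm10_after, equal_var=False)
significant = p_value < 0.05
```

A lower post-installation mean with a large p-value would match the pattern the abstract reports: a consistent decrease that is nonetheless not statistically significant.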

  3. US EPA - ToxCast and the Tox21 program: perspectives

    EPA Science Inventory

    ToxCast is a large-scale project being conducted by the U.S. EPA to screen ~2000 chemicals against a large battery of in vitro high-throughput screening (HTS) assays. ToxCast is complemented by the Tox21 project being jointly carried out by the U.S. NIH Chemical Genomics Center (...

  4. Demonstration of three gorges archaeological relics based on 3D-visualization technology

    NASA Astrophysics Data System (ADS)

    Xu, Wenli

    2015-12-01

    This paper mainly focuses on the digital demonstration of Three Gorges archaeological relics to exhibit the achievements of the protective measures. A novel and effective method based on 3D-visualization technology, which includes large-scale landscape reconstruction, virtual studio, and virtual panoramic roaming, etc., is proposed to create a digitized interactive demonstration system. The method comprises three stages: pre-processing, 3D modeling, and integration. First, abundant archaeological information is classified according to its historical and geographical context. Second, a 3D model library is built up using digital image processing and 3D modeling technology. Third, virtual reality technology is used to display the archaeological scenes and cultural relics vividly and realistically. The present work promotes the application of virtual reality to digital projects and enriches the content of digital archaeology.

  5. ISMIP6 - initMIP: Greenland ice sheet model initialisation experiments

    NASA Astrophysics Data System (ADS)

    Goelzer, Heiko; Nowicki, Sophie; Payne, Tony; Larour, Eric; Abe Ouchi, Ayako; Gregory, Jonathan; Lipscomb, William; Seroussi, Helene; Shepherd, Andrew; Edwards, Tamsin

    2016-04-01

    Earlier large-scale Greenland ice sheet sea-level projections, e.g. those run during the ice2sea and SeaRISE initiatives, have shown that ice sheet initialisation can have a large effect on the projections and give rise to important uncertainties. This intercomparison exercise (initMIP) aims at comparing, evaluating, and improving the initialisation techniques used in the ice sheet modeling community and at estimating the associated uncertainties. It is the first in a series of ice sheet model intercomparison activities within ISMIP6 (Ice Sheet Model Intercomparison Project for CMIP6). The experiments are conceived for the large-scale Greenland ice sheet and are designed to allow intercomparison between participating models of 1) the initial present-day state of the ice sheet and 2) the response in two schematic forward experiments. The latter experiments serve to evaluate the initialisation in terms of model drift (forward run without any forcing) and response to a large perturbation (prescribed surface mass balance anomaly). We present and discuss first results of the intercomparison and highlight important uncertainties with respect to projections of the Greenland ice sheet sea-level contribution.

  6. Colorado State Capitol Geothermal project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shepherd, Lance

    Colorado State Capitol Geothermal Project - Final report is redacted due to space constraints. This project was an innovative large-scale ground-source heat pump (GSHP) project at the Colorado State Capitol in Denver, Colorado. The project employed two large wells on the property: one for pulling water from the aquifer, and another for returning the water to the aquifer after the heat exchange. The two wells can work in either direction. Heat extracted from or added to the water via a heat exchanger is used for space conditioning in the building.

  7. Teaching English Critically to Mexican Children

    ERIC Educational Resources Information Center

    López-Gopar, Mario E.

    2014-01-01

    The purpose of this article is to present one significant part of a large-scale critical-ethnographic-action-research project (CEAR Project) carried out in Oaxaca, Mexico. The overall CEAR Project has been conducted since 2007 in different Oaxacan elementary schools serving indigenous and mestizo (mixed-race) children. In the CEAR Project, teacher…

  8. New High Performance Water Vapor Membranes to Improve Fuel Cell Balance of Plant Efficiency and Lower Costs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wagener, Earl; Topping, Chris; Morgan, Brad

    Hydrogen fuel cells are currently one of the more promising long-term alternative energy options, and among the range of fuel cell technologies under development, proton exchange membranes (PEMs) have the advantage of being able to deliver high power density at relatively low operating temperatures. This is essential for systems such as fuel cell vehicles (FCV) and many stationary applications that undergo frequent on/off cycling. One of the biggest challenges for PEM systems is the need to maintain a high level of hydration in the cell to enable efficient conduction of protons from the anode to the cathode. In addition to significant power loss, low humidity conditions lead to increased stress on the membranes, which can result in both physical and chemical degradation. Therefore, an effective fuel cell humidifier can be critical for the efficient operation and durability of the system under high load and low humidity conditions. The most common types of water vapor transport (WVT) devices are based on water-permeable membrane separators. Successful membranes must effectively permeate water vapor while restricting crossover of air, and be robust to the temperature and humidity fluctuations experienced in fuel cell systems. DOE-sponsored independent evaluations indicate that balance-of-plant components, including humidification devices, make up more than half of the cost of current automotive fuel cell systems. Despite its relatively widespread use in other applications, the current industry standard perfluorosulfonic acid based Nafion® remains expensive compared with non-perfluorinated polymer membranes. During Phase II of this project, we demonstrated the improved performance of our semi-fluorinated perfluorocyclobutyl polymer based membranes compared with the current industry standard perfluorosulfonic acid based Nafion®, at ~50% lower cost.
    Building on this work, highlights of our Phase IIB developments, in close collaboration with leading global automotive component supplier Dana Holding Corporation, include:
    • Development of a lower-cost series of ionomers, with reduced synthetic steps and purification requirements and improved scale-ability, while maintaining the performance advantages over Nafion® demonstrated during Phase II.
    • Demonstration of efficient, continuous production of down-selected WVT membrane configurations at commercial continuous roll coating facilities. We see no major issues producing Tetramer supported WVT membranes on a large commercial scale.
    • Following the production and testing of three prototype humidifier stacks, a full-size humidifier unit was manufactured and successfully tested by an automotive customer for performance and durability.
    • Assuming the availability of a reasonably priced support, our cost projections for mid- to large-scale production of Tetramer WVT membranes are within the acceptable range of the leading automotive manufacturers, and at a large scale our calculations based on bulk sourcing of raw materials indicate we can achieve the project goal of $25/m2.

  9. ProteinInferencer: Confident protein identification and multiple experiment comparison for large scale proteomics projects.

    PubMed

    Zhang, Yaoyang; Xu, Tao; Shan, Bing; Hart, Jonathan; Aslanian, Aaron; Han, Xuemei; Zong, Nobel; Li, Haomin; Choi, Howard; Wang, Dong; Acharya, Lipi; Du, Lisa; Vogt, Peter K; Ping, Peipei; Yates, John R

    2015-11-03

    Shotgun proteomics generates valuable information from large-scale and target protein characterizations, including protein expression, protein quantification, protein post-translational modifications (PTMs), protein localization, and protein-protein interactions. Typically, peptides derived from proteolytic digestion, rather than intact proteins, are analyzed by mass spectrometers because peptides are more readily separated, ionized, and fragmented. The amino acid sequences of peptides can be interpreted by matching the observed tandem mass spectra to theoretical spectra derived from a protein sequence database. Identified peptides serve as surrogates for their proteins and are often used to establish what proteins were present in the original mixture and to quantify protein abundance. Two major issues exist for assigning peptides to their originating protein. The first is maintaining a desired false discovery rate (FDR) when comparing or combining multiple large datasets generated by shotgun analysis, and the second is properly assigning peptides to proteins when homologous proteins are present in the database. Herein we demonstrate a new computational tool, ProteinInferencer, which can be used for protein inference with both small- and large-scale data sets to produce a well-controlled protein FDR. In addition, ProteinInferencer introduces confidence scoring for individual proteins, which makes protein identifications evaluable. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015. Published by Elsevier B.V.
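    The abstract does not describe ProteinInferencer's internal scoring, but the FDR control it refers to is conventionally estimated with the target-decoy strategy. The sketch below shows that standard estimate (decoy hits divided by target hits above a score cutoff); function names and the score model are illustrative, not ProteinInferencer's API.

```python
def fdr_at_threshold(scores, threshold):
    """Estimate FDR among identifications scoring >= threshold.

    scores: iterable of (score, is_decoy) pairs from a target-decoy search.
    The FDR is estimated as (#decoy hits) / (#target hits) above threshold.
    """
    targets = sum(1 for s, d in scores if s >= threshold and not d)
    decoys = sum(1 for s, d in scores if s >= threshold and d)
    return decoys / targets if targets else 0.0

def threshold_for_fdr(scores, max_fdr=0.01):
    """Lowest score cutoff whose estimated FDR stays within max_fdr."""
    for cut in sorted({s for s, _ in scores}):
        if fdr_at_threshold(scores, cut) <= max_fdr:
            return cut
    return None  # no cutoff achieves the requested FDR
```

Combining several experiments inflates the accumulated decoy count, which is why a tool operating across datasets must re-derive the cutoff on the combined evidence rather than reuse per-run thresholds.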

  10. Redox Flow Batteries, Hydrogen and Distributed Storage.

    PubMed

    Dennison, C R; Vrubel, Heron; Amstutz, Véronique; Peljo, Pekka; Toghill, Kathryn E; Girault, Hubert H

    2015-01-01

    Social, economic, and political pressures are causing a shift in the global energy mix, with a preference toward renewable energy sources. In order to realize widespread implementation of these resources, large-scale storage of renewable energy is needed. Among the proposed energy storage technologies, redox flow batteries offer many unique advantages. The primary limitation of these systems, however, is their limited energy density which necessitates very large installations. In order to enhance the energy storage capacity of these systems, we have developed a unique dual-circuit architecture which enables two levels of energy storage; first in the conventional electrolyte, and then through the formation of hydrogen. Moreover, we have begun a pilot-scale demonstration project to investigate the scalability and technical readiness of this approach. This combination of conventional energy storage and hydrogen production is well aligned with the current trajectory of modern energy and mobility infrastructure. The combination of these two means of energy storage enables the possibility of an energy economy dominated by renewable resources.

  11. Distributed 3D Information Visualization - Towards Integration of the Dynamic 3D Graphics and Web Services

    NASA Astrophysics Data System (ADS)

    Vucinic, Dean; Deen, Danny; Oanta, Emil; Batarilo, Zvonimir; Lacor, Chris

    This paper focuses on visualization and manipulation of graphical content in distributed network environments. The developed graphical middleware and 3D desktop prototypes were specialized for situational awareness. This research was done in the LArge Scale COllaborative decision support Technology (LASCOT) project, which explored and combined software technologies to support a human-centred decision support system for crisis management (earthquake, tsunami, flooding, airplane or oil-tanker incidents, chemical, radioactive or other pollutant spreading, etc.). The state-of-the-art review performed did not identify any publicly available large-scale distributed application of this kind; existing proprietary solutions rely on conventional technologies and 2D representations. Our challenge was to apply the "latest" available technologies, such as Java3D, X3D and SOAP, compatible with average computer graphics hardware. The selected technologies are integrated, and we demonstrate: the flow of data, which originates from heterogeneous data sources; interoperability across different operating systems; and 3D visual representations to enhance the end-users' interactions.

  12. Homogeneity of the coefficient of linear thermal expansion of ZERODUR: a review of a decade of evaluations

    NASA Astrophysics Data System (ADS)

    Jedamzik, Ralf; Westerhoff, Thomas

    2017-09-01

    The coefficient of thermal expansion (CTE) and its spatial homogeneity from small to large formats are the most important properties of ZERODUR. For more than a decade, SCHOTT has documented the excellent CTE homogeneity. It started with reviews of past astronomical telescope projects like the VLT, Keck and GTC mirror blanks and continued with dedicated evaluations of the production. In recent years, extensive CTE measurements on samples cut from randomly selected single ZERODUR parts in meter size and formats of arbitrary shape, large production boules, and even 4 m sized blanks have demonstrated the excellent CTE homogeneity in production. The published homogeneity data show single-ppb/K peak-to-valley CTE variations on medium spatial scales of several cm down to small spatial scales of only a few mm, mostly at the limit of the measurement reproducibility. This review paper summarizes the results, also with respect to the increased CTE measurement accuracy over the last decade of ZERODUR production.

  13. System identification of the Large-Angle Magnetic Suspension Test Facility (LAMSTF)

    NASA Technical Reports Server (NTRS)

    Huang, Jen-Kuang

    1993-01-01

    The Large-Angle Magnetic Suspension Test Facility (LAMSTF), a laboratory-scale research project to demonstrate the magnetic suspension of objects over wide ranges of attitudes, has been developed. This system represents a scaled model of a planned Large-Gap Magnetic Suspension System (LGMSS). The LAMSTF system consists of a planar array of five copper electromagnets which actively suspend a small cylindrical permanent magnet. The cylinder is a rigid body and can be controlled to move in five independent degrees of freedom. Five position variables are sensed indirectly by using infra-red light-emitting diodes and light-receiving phototransistors. The motion of the suspended cylinder is in general nonlinear, and hence only the linear, time-invariant perturbed motion about an equilibrium state is considered. One of the main challenges in this project is the control of the suspended element over a wide range of orientations. An accurate dynamic model plays an essential role in controller design. The analytical model of the LAMSTF system includes highly unstable real poles (about 10 Hz) and low-frequency flexible modes (about 0.16 Hz). Projection filters are proposed to identify the state-space model from closed-loop test data in the time domain. A canonical transformation matrix is also derived to transform the identified state-space model into physical coordinates. The LAMSTF system is stabilized by using a linear quadratic regulator (LQR) feedback controller. The rate information is obtained by calculating the backward difference of the sensed position signals. The reference inputs contain five uncorrelated random signals. This control input and the system response are recorded as input/output data to identify the system directly from the projection filters. The sampling time is 4 ms, and the model is fairly accurate in predicting the step responses for different controllers, while the analytical model has a deficiency in the pitch axis.
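    The control setup described above (unstable ~10 Hz real poles, LQR feedback, 4 ms sampling) can be illustrated with a toy single-axis model. This is not the five-degree-of-freedom LAMSTF model; the dynamics, weights, and numbers below are illustrative assumptions only.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# Toy model of a single unstable suspension axis: continuous-time poles at
# +/- a rad/s (a ~ 2*pi*10, i.e. an ~10 Hz unstable real pole), forward-Euler
# discretized at the 4 ms sampling time mentioned in the abstract.
Ts = 0.004
a = 2.0 * np.pi * 10.0
A = np.array([[1.0, Ts], [a * a * Ts, 1.0]])  # state: [position, rate]
B = np.array([[0.0], [Ts]])

Q = np.diag([100.0, 1.0])  # penalize position error most
R = np.array([[0.1]])      # control-effort weight

# Discrete-time LQR gain, u = -K x, from the discrete algebraic Riccati eq.
P = solve_discrete_are(A, B, Q, R)
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

# Closed-loop poles must lie inside the unit circle for stability.
open_loop_poles = np.linalg.eigvals(A)
closed_loop_poles = np.linalg.eigvals(A - B @ K)
```

In the facility itself the rate state would not be measured directly; as in the abstract, it would be approximated by a backward difference of the sensed position.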

  14. Final Report on DOE Project entitled Dynamic Optimized Advanced Scheduling of Bandwidth Demands for Large-Scale Science Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramamurthy, Byravamurthy

    2014-05-05

    In this project, we developed scheduling frameworks and algorithms for the dynamic bandwidth demands of large-scale science applications. Apart from theoretical approaches such as Integer Linear Programming, Tabu Search, and Genetic Algorithm heuristics, we utilized practical data from the ESnet OSCARS project (from our DOE lab partners) to conduct realistic simulations of our approaches. We disseminated our work through conference paper presentations, journal papers, and a book chapter. In this project we addressed the problem of scheduling lightpaths over optical wavelength division multiplexed (WDM) networks and published several conference and journal papers on this topic. We also addressed the problem of joint allocation of computing, storage, and networking resources in Grid/Cloud networks and proposed energy-efficient mechanisms for operating optical WDM networks.

  15. Longitudinal impact of the project PATHS on adolescent risk behavior: what happened after five years?

    PubMed

    Shek, Daniel T L; Yu, Lu

    2012-01-01

    The present study investigated the longitudinal impact of the Project PATHS, a large-scale curriculum-based positive youth development program in Hong Kong, on the development of adolescents' risk behavior over a period of five years. Using a longitudinal randomized controlled design, eight waves of data were collected from 19 experimental schools in which students participated in the Project PATHS (N = 2,850 at Wave 8) and 24 control schools not joining the Project PATHS (N = 3,640 at Wave 8). At each wave, students responded to measures assessing their current risk behaviors, including delinquency and use of different types of drugs, and their intentions of participating in risk behaviors in the future. Results demonstrated that adolescents receiving the program exhibited significantly slower increases in delinquent behaviors and substance use as compared to the control participants. During the two years after the completion of the program, differences in youth risk behaviors between the two groups still existed. These results suggest that the Project PATHS has a long-term effect in preventing adolescent problem behavior by promoting positive youth development.

  16. A stable and high-order accurate discontinuous Galerkin based splitting method for the incompressible Navier-Stokes equations

    NASA Astrophysics Data System (ADS)

    Piatkowski, Marian; Müthing, Steffen; Bastian, Peter

    2018-03-01

    In this paper we consider discontinuous Galerkin (DG) methods for the incompressible Navier-Stokes equations in the framework of projection methods. In particular, we employ symmetric interior penalty DG methods within the second-order rotational incremental pressure correction scheme. The major focus of the paper is threefold: i) We propose a modified upwind scheme based on the Vijayasundaram numerical flux that has favourable properties in the context of DG. ii) We present a novel postprocessing technique in the Helmholtz projection step based on H(div) reconstruction of the pressure correction that is computed locally, is a projection in the discrete setting, and ensures that the projected velocity satisfies the discrete continuity equation exactly. As a consequence it also provides local mass conservation of the projected velocity. iii) Numerical results demonstrate the properties of the scheme for different polynomial degrees applied to two-dimensional problems with known solution as well as large-scale three-dimensional problems. In particular, we address second-order convergence in time of the splitting scheme as well as its long-time stability.
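    For orientation, the rotational incremental pressure-correction scheme named above has, in its standard second-order (BDF2, Guermond-Shen) form, roughly the following shape; the paper's DG-specific fluxes and H(div) postprocessing are not reproduced here, and this sketch may differ in detail from the authors' exact variant.

```latex
% Step 1: momentum solve with the old pressure (tentative velocity)
\frac{3\tilde{\mathbf{u}}^{n+1} - 4\mathbf{u}^{n} + \mathbf{u}^{n-1}}{2\Delta t}
  - \nu \Delta \tilde{\mathbf{u}}^{n+1}
  + (\mathbf{u}^{\ast}\!\cdot\!\nabla)\,\tilde{\mathbf{u}}^{n+1}
  + \nabla p^{n} = \mathbf{f}^{n+1}
% Step 2: pressure-correction Poisson problem (Helmholtz projection)
\Delta \phi = \frac{3}{2\Delta t}\,\nabla\!\cdot\!\tilde{\mathbf{u}}^{n+1},
  \qquad \partial_{n}\phi = 0 \ \text{on}\ \partial\Omega
% Step 3: velocity projection and rotational pressure update
\mathbf{u}^{n+1} = \tilde{\mathbf{u}}^{n+1} - \frac{2\Delta t}{3}\nabla\phi,
  \qquad
p^{n+1} = p^{n} + \phi - \nu\,\nabla\!\cdot\!\tilde{\mathbf{u}}^{n+1}
```

The $-\nu\,\nabla\!\cdot\!\tilde{\mathbf{u}}^{n+1}$ term in the pressure update is what makes the scheme "rotational" and removes the artificial pressure boundary layer of the non-rotational variant.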

  17. CanvasDB: a local database infrastructure for analysis of targeted- and whole genome re-sequencing projects

    PubMed Central

    Ameur, Adam; Bunikis, Ignas; Enroth, Stefan; Gyllensten, Ulf

    2014-01-01

    CanvasDB is an infrastructure for management and analysis of genetic variants from massively parallel sequencing (MPS) projects. The system stores SNP and indel calls in a local database, designed to handle very large datasets, to allow for rapid analysis using simple commands in R. Functional annotations are included in the system, making it suitable for direct identification of disease-causing mutations in human exome- (WES) or whole-genome sequencing (WGS) projects. The system has a built-in filtering function implemented to simultaneously take into account variant calls from all individual samples. This enables advanced comparative analysis of variant distribution between groups of samples, including detection of candidate causative mutations within family structures and genome-wide association by sequencing. In most cases, these analyses are executed within just a matter of seconds, even when there are several hundreds of samples and millions of variants in the database. We demonstrate the scalability of canvasDB by importing the individual variant calls from all 1092 individuals present in the 1000 Genomes Project into the system, over 4.4 billion SNPs and indels in total. Our results show that canvasDB makes it possible to perform advanced analyses of large-scale WGS projects on a local server. Database URL: https://github.com/UppsalaGenomeCenter/CanvasDB PMID:25281234
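    The group-wise filtering idea described above can be illustrated with a toy relational sketch. This is not canvasDB's actual schema or its R API (neither is shown here); the table and column names below are invented for illustration only:

```python
import sqlite3

# Hypothetical minimal schema illustrating the canvasDB idea: per-sample
# variant calls in a local database, filtered across sample groups with one
# SQL query. Table/column names are assumptions, not canvasDB's real schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE calls (variant_id TEXT, sample_id TEXT)")
conn.execute("CREATE TABLE samples (sample_id TEXT PRIMARY KEY, grp TEXT)")
conn.executemany("INSERT INTO samples VALUES (?, ?)",
                 [("s1", "case"), ("s2", "case"), ("s3", "control")])
conn.executemany("INSERT INTO calls VALUES (?, ?)",
                 [("chr1:1000:A>T", "s1"), ("chr1:1000:A>T", "s2"),
                  ("chr2:2000:G>C", "s1"), ("chr2:2000:G>C", "s3")])

# Variants present in every 'case' sample and absent from all 'control' samples
query = """
SELECT variant_id FROM calls JOIN samples s USING (sample_id)
GROUP BY variant_id
HAVING SUM(s.grp = 'case') = (SELECT COUNT(*) FROM samples WHERE grp = 'case')
   AND SUM(s.grp = 'control') = 0
"""
hits = [row[0] for row in conn.execute(query)]
print(hits)  # only the variant shared by both cases and missing in the control
```

    Because the filter is a single set-based query over indexed tables, this style of comparative analysis scales to the "hundreds of samples, millions of variants" regime the abstract describes.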

  18. CanvasDB: a local database infrastructure for analysis of targeted- and whole genome re-sequencing projects.

    PubMed

    Ameur, Adam; Bunikis, Ignas; Enroth, Stefan; Gyllensten, Ulf

    2014-01-01

    CanvasDB is an infrastructure for management and analysis of genetic variants from massively parallel sequencing (MPS) projects. The system stores SNP and indel calls in a local database, designed to handle very large datasets, to allow for rapid analysis using simple commands in R. Functional annotations are included in the system, making it suitable for direct identification of disease-causing mutations in human exome- (WES) or whole-genome sequencing (WGS) projects. The system has a built-in filtering function implemented to simultaneously take into account variant calls from all individual samples. This enables advanced comparative analysis of variant distribution between groups of samples, including detection of candidate causative mutations within family structures and genome-wide association by sequencing. In most cases, these analyses are executed within just a matter of seconds, even when there are several hundreds of samples and millions of variants in the database. We demonstrate the scalability of canvasDB by importing the individual variant calls from all 1092 individuals present in the 1000 Genomes Project into the system, over 4.4 billion SNPs and indels in total. Our results show that canvasDB makes it possible to perform advanced analyses of large-scale WGS projects on a local server. Database URL: https://github.com/UppsalaGenomeCenter/CanvasDB. © The Author(s) 2014. Published by Oxford University Press.

  19. Data management strategies for multinational large-scale systems biology projects.

    PubMed

    Wruck, Wasco; Peuker, Martin; Regenbrecht, Christian R A

    2014-01-01

    Good accessibility of publicly funded research data is essential to secure an open scientific system and eventually becomes mandatory [Wellcome Trust will Penalise Scientists Who Don't Embrace Open Access. The Guardian 2012]. By the use of high-throughput methods in many research areas from physics to systems biology, large data collections are increasingly important as raw material for research. Here, we present strategies worked out by international and national institutions targeting open access to publicly funded research data via incentives or obligations to share data. Funding organizations such as the British Wellcome Trust have therefore developed data sharing policies and request commitment to data management and sharing in grant applications. Increased citation rates are a profound argument for sharing publication data. Pre-publication sharing might be rewarded by a data citation credit system via digital object identifiers (DOIs), which have initially been in use for data objects. Besides policies and incentives, good practice in data management is indispensable. However, appropriate systems for data management of large-scale projects, for example in systems biology, are hard to find. Here, we give an overview of a selection of open-source data management systems that have proved successful in large-scale projects.

  20. Data management strategies for multinational large-scale systems biology projects

    PubMed Central

    Peuker, Martin; Regenbrecht, Christian R.A.

    2014-01-01

    Good accessibility of publicly funded research data is essential to secure an open scientific system and eventually becomes mandatory [Wellcome Trust will Penalise Scientists Who Don’t Embrace Open Access. The Guardian 2012]. By the use of high-throughput methods in many research areas from physics to systems biology, large data collections are increasingly important as raw material for research. Here, we present strategies worked out by international and national institutions targeting open access to publicly funded research data via incentives or obligations to share data. Funding organizations such as the British Wellcome Trust have therefore developed data sharing policies and request commitment to data management and sharing in grant applications. Increased citation rates are a profound argument for sharing publication data. Pre-publication sharing might be rewarded by a data citation credit system via digital object identifiers (DOIs), which have initially been in use for data objects. Besides policies and incentives, good practice in data management is indispensable. However, appropriate systems for data management of large-scale projects, for example in systems biology, are hard to find. Here, we give an overview of a selection of open-source data management systems that have proved successful in large-scale projects. PMID:23047157

  1. High Quality, Low Cost Bulk Gallium Nitride Substrates Grown by the Electrochemical Solution Growth Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seacrist, Michael

    The objective of this project was to develop the Electrochemical Solution Growth (ESG) method conceived / patented at Sandia National Laboratory into a commercially viable bulk gallium nitride (GaN) growth process that can be scaled to low cost, high quality, and large area GaN wafer substrate manufacturing. The goal was to advance the ESG growth technology by demonstrating rotating seed growth at the lab scale and then transitioning the process to a prototype commercial system, while validating the GaN material and electronic / optical device quality. The desired outcome of the project was a prototype commercial process for US-based manufacturing of high quality, large area, and lower cost GaN substrates that can drive widespread deployment of energy efficient GaN-based power electronic and optical devices. In year 1 of the project (Sept 2012 – Dec 2013) the overall objective was to demonstrate crystalline GaN growth > 100 µm on a GaN seed crystal. The development plan included tasks to demonstrate and implement a method for purifying reagent grade salts, develop the reactor 1 process for rotating seed Electrochemical Solution Growth (ESG) of GaN, grow and characterize ESG GaN films, develop a fluid flow and reaction chemistry model for GaN film growth, and design / build an improved growth reactor capable of scaling to 50mm seed diameter. The first year’s project objectives were met in some task areas including salt purification, film characterization, modeling, and reactor 2 design / fabrication. However, the key project objective of the growth of a crystalline GaN film on the seed template was not achieved. Amorphous film growth on the order of a few tenths of a micron was detected, with a film composition including Ga and N plus several other impurities originating from the process solution and hardware. The presence of these impurities, particularly the oxygen, inhibited the demonstration of crystalline GaN film growth on the seed template.
However, the presence of both Ga and N at the growth surface indicates that the reactor hardware is functioning properly; achieving film growth is a matter of controlling the chemistry at the interface. The impurities originating from the hardware are expected to be straightforward to eliminate. Activities were defined for an extension of budget period 1 to eliminate the undesired impurities originating from the reactor hardware and interfering with crystalline GaN film growth. The budget period 1 extension was negotiated during the first half of 2014 and spanned approximately August 2014 to August 2015. The project objective for this extension period was to demonstrate at least a 0.5 µm crystalline GaN film on a GaN seed in the lab scale reactor. The focus of the extension period was to eliminate oxygen contamination interfering with GaN film growth. The team procured the highest-purity, lowest-oxygen salt for testing. Low oxygen crucible materials such as silicon carbide were installed and evaluated in the laboratory reactor. Growth experiments were performed with high purity salt, high purity hardware, and optimized oxide removal from the seed surface. Experiments were characterized with methods including UV inspection, profilometry, x-ray diffraction (XRD) to determine crystalline structure, optical and scanning electron microscopy, photoluminescence, x-ray photoelectron spectroscopy (XPS), transmission electron microscopy (TEM), and secondary ion mass spectroscopy (SIMS). Despite successfully integrating the low oxygen materials in the laboratory reactor, the goal of depositing 0.5 µm of crystalline GaN on the MOCVD GaN seed was not met. Very thin (ca. 10 nm) cubic phase GaN deposition was observed on the hexagonal MOCVD GaN seeds.
However, a competing etching reaction was also observed, thought to be related to the presence of metallic lithium, a byproduct of the LiCl-KCl salt used as the process medium. The etching reaction could potentially be addressed by alternate salts not containing lithium, but this would necessitate restarting the reactor and process design from scratch. Further, controlling the reaction of Ga and N in the bulk salt to favor deposition on the seed has proved very difficult and is unlikely to be solved within the scope of this project in a manner consistent with the original objective of wafer- or crystal-scale thickness for GaN deposition on a GaN seed. Upon completion of the budget 1 extension period in August 2015, the project partners and DOE agreed to stop work on the project.

  2. The Billboard Project

    ERIC Educational Resources Information Center

    Weaver, Victoria

    2005-01-01

    Since 1997, the author coordinated a large-scale billboard project. Coordinated to coincide with the National Art Education Association's celebration of Youth Art Month, strong commitments from faculty, students, administrators, public-relations liaisons, local press, radio, TV, and community businesses have made this project a success. The first…

  3. CSP ELEMENTS: High-Temperature Thermochemical Storage with Redox-Stable Perovskites for Concentrating Solar Power

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, Gregory S; Braun, Robert J; Ma, Zhiwen

    This project was motivated by the potential of reducible perovskite oxides for high-temperature, thermochemical energy storage (TCES) to provide dispatchable renewable heat for concentrating solar power (CSP) plants. This project sought to identify and characterize perovskites from earth-abundant cations with high reducibility below 1000 °C for coupling TCES of solar energy to super-critical CO2 (s-CO2) plants that operate above the temperature limits (< 600 °C) of current molten-salt storage. Specific TCES > 750 kJ/kg for storage cycles between 500 and 900 °C was targeted with a system cost goal of $15/kWhth. To realize feasibility of TCES systems based on reducible perovskites, our team focused on designing and testing a lab-scale concentrating solar receiver, wherein perovskite particles capture solar energy by fast O2 release and sensible heating at a thermal efficiency of 90% and wall temperatures below 1100 °C. System-level models of the receiver and reoxidation reactor coupled to validated thermochemical materials models can assess approaches to scale up a full TCES system based on reduction/oxidation cycles of perovskite oxides at large scales. After characterizing many Ca-based perovskites for TCES, our team identified strontium-doped calcium manganite Ca1-xSrxMnO3-δ (with x ≤ 0.1) as a composition with adequate stability and specific TCES capacity (> 750 kJ/kg for Ca0.95Sr0.05MnO3-δ) for cycling between air at 500 °C and low-PO2 (10⁻⁴ bar) N2 at 900 °C. Substantial kinetic tests demonstrated that residence times of several minutes in low-PO2 gas were needed for these materials to reach the specific TCES goals with particles of reasonable size for large-scale transport (diameter dp > 200 μm). On the other hand, fast reoxidation kinetics in air enables subsequent rapid heat release in a fluidized reoxidation reactor/heat-recovery unit for driving s-CO2 power plants.
Validated material thermochemistry coupled to radiation and convective particle-gas transport models facilitated full TCES system analysis for CSP, and results showed that receiver efficiencies approaching 85% were feasible with wall-to-particle heat transfer coefficients observed in laboratory experiments. Coupling these reactive particle-gas transport models to external SolTrace and CFD models drove design of a reactive-particle receiver with indirect heating through flux spreading. A lab-scale receiver using Ca0.9Sr0.1MnO3-δ was demonstrated at NREL’s High Flux Solar Furnace with particle temperatures reaching 900 °C while wall temperatures remained below 1100 °C, achieving approximately 200 kJ/kg of chemical energy storage. These first demonstrations of on-sun perovskite reduction and the robust modeling tools from this program provide a basis for going forward with improved receiver designs to increase heat fluxes and solar-energy capture efficiencies. Measurements and modeling tools from this project provide the foundations for advancing TCES for CSP and other applications using reducible perovskite oxides from low-cost, earth-abundant elements. A perovskite composition has been identified that has the thermodynamic potential to meet the targeted TCES capacity of 750 kJ/kg over a range of temperatures amenable for integration with s-CO2 cycles. Further research needs to explore ways of accelerating effective particle kinetics through variations in composition and/or reactor/receiver design. Initial demonstrations of on-sun particle reduction for TCES show a need for testing at larger scales with reduced heat losses and improved particle-wall heat transfer. The gained insight into particle-gas transport and reactor design can launch future development of cost-effective, large-scale particle-based TCES as a technology for enabling increased renewable energy penetration.
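    The targeted specific storage capacity can be sanity-checked with a back-of-the-envelope estimate combining a chemical (reduction enthalpy) term and a sensible-heat term. Every parameter value below is an illustrative assumption for the sketch, not a measured value from this project:

```python
# Rough specific TCES estimate for a reducible perovskite: E = E_chem + E_sens.
# All parameter values are illustrative assumptions, not project data.
M = 0.143             # kg/mol, approximate molar mass of CaMnO3
delta_change = 0.15   # assumed change in oxygen non-stoichiometry per formula unit
dH_red = 350.0        # kJ/mol-O, assumed reduction enthalpy
cp = 0.80             # kJ/(kg*K), assumed average specific heat
dT = 400.0            # K, cycling between 500 and 900 degC

E_chem = delta_change * dH_red / M   # chemical storage term, kJ/kg
E_sens = cp * dT                     # sensible storage term, kJ/kg
E_total = E_chem + E_sens
print(f"chemical: {E_chem:.0f} kJ/kg, sensible: {E_sens:.0f} kJ/kg, "
      f"total: {E_total:.0f} kJ/kg")
```

    With these assumed inputs the sensible term contributes nearly half of the total, which is consistent with the abstract's choice of a wide 500-900 °C cycling window rather than relying on reduction enthalpy alone.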

  4. Architecture and Programming Models for High Performance Intensive Computation

    DTIC Science & Technology

    2016-06-29

    Applications Systems and Large-Scale-Big-Data & Large-Scale-Big-Computing (DDDAS-LS). ICCS 2015, June 2015, Reykjavík, Iceland. 2. Bo YT, Wang P, Guo ZL...“The Mahali project,” Communications Magazine, vol. 52, pp. 111–133, Aug 2014.

  5. The Large-Scale Biosphere-Atmosphere Experiment in Amazonia: Analyzing Regional Land Use Change Effects.

    Treesearch

    Michael Keller; Maria Assunção Silva-Dias; Daniel C. Nepstad; Meinrat O. Andreae

    2004-01-01

    The Large-Scale Biosphere-Atmosphere Experiment in Amazonia (LBA) is a multi-disciplinary, multinational scientific project led by Brazil. LBA researchers seek to understand Amazonia in its global context especially with regard to regional and global climate. Current development activities in Amazonia including deforestation, logging, cattle ranching, and agriculture...

  6. Large-scale hybrid poplar production economics: 1995 Alexandria, Minnesota establishment cost and management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Downing, M.; Langseth, D.; Stoffel, R.

    1996-12-31

    The purpose of this project was to track and monitor the costs of planting, maintaining, and monitoring large-scale commercial plantings of hybrid poplar in Minnesota. These costs assist potential growers and purchasers of this resource in determining how supply and demand may be secured through developing markets.

  7. The global climate of December 1992-February 1993. Part 2: Large-scale variability across the tropical western Pacific during TOGA COARE

    NASA Technical Reports Server (NTRS)

    Gutzler, D. S.; Kiladis, G. N.; Meehl, G. A.; Weickmann, K. M.; Wheeler, M.

    1994-01-01

    Recently, scientists from more than a dozen countries carried out the field phase of a project called the Coupled Ocean-Atmosphere Response Experiment (COARE), devoted to describing the ocean-atmosphere system of the western Pacific near-equatorial warm pool. The project was conceived, organized, and funded under the auspices of the International Tropical Ocean Global Atmosphere (TOGA) Program. Although COARE consisted of several field phases, including a year-long atmospheric enhanced monitoring period (1 July 1992 – 30 June 1993), the heart of COARE was its four-month Intensive Observation Period (IOP) extending from 1 Nov. 1992 through 28 Feb. 1993. An overview of large-scale variability during COARE is presented. The weather and climate observed in the IOP are placed into context with regard to large-scale, low-frequency fluctuations of the ocean-atmosphere system. Aspects of tropical variability beginning in Aug. 1992 and extending through Mar. 1993, with some sounding data for Apr. 1993, are considered. Variability over the large-scale sounding array (LSA) and the intensive flux array (IFA) is emphasized.

  8. Wave Power Demonstration Project at Reedsport, Oregon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mekhiche, Mike; Downie, Bruce

    2013-10-21

    Ocean wave power can be a significant source of large-scale, renewable energy for the US electrical grid. The Electric Power Research Institute (EPRI) conservatively estimated that 20% of all US electricity could be generated by wave energy. Ocean Power Technologies, Inc. (OPT), with funding from private sources and the US Navy, developed the PowerBuoy to generate renewable energy from the readily available power in ocean waves. OPT's PowerBuoy converts the energy in ocean waves to electricity using the rise and fall of waves to move the buoy up and down (mechanical stroking), which drives an electric generator. This electricity is then conditioned and transmitted ashore as high-voltage power via underwater cable. OPT's wave power generation system includes sophisticated techniques to automatically tune the system for efficient conversion of random wave energy into low cost green electricity, to disconnect the system in large waves for hardware safety and protection, and to automatically restore operation when wave conditions normalize. As the first utility scale wave power project in the US, the Wave Power Demonstration Project at Reedsport, OR, will consist of 10 PowerBuoys located 2.5 miles off the coast. This U.S. Department of Energy grant funding, along with funding from PNGC Power, an Oregon-based electric power cooperative, was utilized for the design completion, fabrication, assembly and factory testing of the first PowerBuoy for the Reedsport project. At this time, the design and fabrication of this first PowerBuoy and factory testing of the power take-off subsystem are complete; additionally, the power take-off subsystem has been successfully integrated into the spar.
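    The scale of the resource behind such estimates can be illustrated with the standard deep-water wave-energy-flux formula from linear wave theory; the sea-state numbers below are illustrative assumptions, not Reedsport site data or OPT device ratings:

```python
import math

# Deep-water wave energy flux per metre of wave-crest length:
#   P = rho * g^2 * Hs^2 * Te / (64 * pi)
# (linear wave theory; Hs = significant wave height, Te = energy period)
rho = 1025.0   # kg/m^3, seawater density
g = 9.81       # m/s^2, gravitational acceleration
Hs = 2.5       # m, assumed significant wave height
Te = 8.0       # s, assumed energy period

P = rho * g**2 * Hs**2 * Te / (64.0 * math.pi)   # W per metre of crest
print(f"{P / 1e3:.1f} kW per metre of crest")
```

    In such a sea state the flux is on the order of 25 kW per metre of crest, which illustrates why a small array of point-absorber buoys can serve as a utility-scale demonstration.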

  9. An evolving effective stress approach to anisotropic distortional hardening

    DOE PAGES

    Lester, B. T.; Scherzinger, W. M.

    2018-03-11

    A new yield surface with an evolving effective stress definition is proposed for consistently and efficiently describing anisotropic distortional hardening. Specifically, a new internal state variable is introduced to capture the thermodynamic evolution between different effective stress definitions. The corresponding yield surface and evolution equations of the internal variables are derived from thermodynamic considerations enabling satisfaction of the second law. A closest point projection return mapping algorithm for the proposed model is formulated and implemented for use in finite element analyses. Finally, select constitutive and larger scale boundary value problems are solved to explore the capabilities of the model and examine the impact of distortional hardening on constitutive and structural responses. Importantly, these simulations demonstrate the tractability of the proposed formulation in investigating large-scale problems of interest.
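    The closest point projection return mapping mentioned above can be illustrated with its simplest classical instance: radial return onto a von Mises surface with linear isotropic hardening. This is a generic sketch of the algorithmic idea, not the paper's distortional-hardening model:

```python
import numpy as np

def radial_return(s_trial, sigma_y, H, mu):
    """Closest point projection of a trial deviatoric stress onto a von Mises
    yield surface with linear isotropic hardening (classical radial return).
    Returns the mapped deviatoric stress and the plastic multiplier increment."""
    norm = np.linalg.norm(s_trial)                    # ||s_trial|| (Frobenius)
    f_trial = norm - np.sqrt(2.0 / 3.0) * sigma_y     # trial yield function
    if f_trial <= 0.0:
        return s_trial, 0.0                           # elastic: trial state admissible
    dgamma = f_trial / (2.0 * mu + 2.0 * H / 3.0)     # from the consistency condition
    n = s_trial / norm                                # radial return direction
    return s_trial - 2.0 * mu * dgamma * n, dgamma

# Illustrative numbers (MPa): a deviatoric trial state outside the yield surface
s_trial = np.diag([200.0, -100.0, -100.0])
s_new, dgamma = radial_return(s_trial, sigma_y=250.0, H=10000.0, mu=80000.0)
print(np.linalg.norm(s_new), dgamma)  # mapped state lies on the hardened surface
```

    The evolving-effective-stress model of the paper generalizes the fixed von Mises norm used here, but the algorithmic skeleton, trial elastic predictor followed by a plastic corrector that projects back to the updated surface, is the same.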

  10. An evolving effective stress approach to anisotropic distortional hardening

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lester, B. T.; Scherzinger, W. M.

    A new yield surface with an evolving effective stress definition is proposed for consistently and efficiently describing anisotropic distortional hardening. Specifically, a new internal state variable is introduced to capture the thermodynamic evolution between different effective stress definitions. The corresponding yield surface and evolution equations of the internal variables are derived from thermodynamic considerations enabling satisfaction of the second law. A closest point projection return mapping algorithm for the proposed model is formulated and implemented for use in finite element analyses. Finally, select constitutive and larger scale boundary value problems are solved to explore the capabilities of the model and examine the impact of distortional hardening on constitutive and structural responses. Importantly, these simulations demonstrate the tractability of the proposed formulation in investigating large-scale problems of interest.

  11. Problematic Social Media Use: Results from a Large-Scale Nationally Representative Adolescent Sample

    PubMed Central

    Bányai, Fanni; Zsila, Ágnes; Király, Orsolya; Maraz, Aniko; Elekes, Zsuzsanna; Griffiths, Mark D.; Andreassen, Cecilie Schou

    2017-01-01

    Despite social media use being one of the most popular activities among adolescents, prevalence estimates of (problematic) social media use among teenage samples are lacking in the field. The present study surveyed a nationally representative Hungarian sample comprising 5,961 adolescents as part of the European School Survey Project on Alcohol and Other Drugs (ESPAD). Using the Bergen Social Media Addiction Scale (BSMAS) and based on latent profile analysis, 4.5% of the adolescents belonged to the at-risk group, and reported low self-esteem, a high level of depression symptoms, and elevated social media use. Results also demonstrated that the BSMAS has appropriate psychometric properties. It is concluded that adolescents at risk of problematic social media use should be targeted by school-based prevention and intervention programs. PMID:28068404

  12. Problematic Social Media Use: Results from a Large-Scale Nationally Representative Adolescent Sample.

    PubMed

    Bányai, Fanni; Zsila, Ágnes; Király, Orsolya; Maraz, Aniko; Elekes, Zsuzsanna; Griffiths, Mark D; Andreassen, Cecilie Schou; Demetrovics, Zsolt

    2017-01-01

    Despite social media use being one of the most popular activities among adolescents, prevalence estimates of (problematic) social media use among teenage samples are lacking in the field. The present study surveyed a nationally representative Hungarian sample comprising 5,961 adolescents as part of the European School Survey Project on Alcohol and Other Drugs (ESPAD). Using the Bergen Social Media Addiction Scale (BSMAS) and based on latent profile analysis, 4.5% of the adolescents belonged to the at-risk group, and reported low self-esteem, a high level of depression symptoms, and elevated social media use. Results also demonstrated that the BSMAS has appropriate psychometric properties. It is concluded that adolescents at risk of problematic social media use should be targeted by school-based prevention and intervention programs.

  13. Data integration in the era of omics: current and future challenges

    PubMed Central

    2014-01-01

    To integrate heterogeneous and large omics data constitutes not only a conceptual challenge but also a practical hurdle in the daily analysis of omics data. With the rise of novel omics technologies and through large-scale consortia projects, biological systems are being further investigated at an unprecedented scale, generating heterogeneous and often large data sets. These datasets encourage researchers to develop novel data integration methodologies. In this introduction we review the definition of data integration and characterize current efforts in the life sciences. We have used a web survey to assess current research projects on data integration and to tap into the views, needs and challenges as currently perceived by parts of the research community. PMID:25032990

  14. Hanford's Supplemental Treatment Project: Full-Scale Integrated Testing of In-Container-Vitrification and a 10,000-Liter Dryer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Witwer, K.S.; Dysland, E.J.; Garfield, J.S.

    2008-07-01

    The GeoMelt® In-Container Vitrification™ (ICV™) process was selected by the U.S. Department of Energy (DOE) in 2004 for further evaluation as the supplemental treatment technology for Hanford's low-activity waste (LAW). Also referred to as 'bulk vitrification', this process combines glass forming minerals, LAW, and chemical amendments; dries the mixture; and then vitrifies the material in a refractory-lined steel container. AMEC Nuclear Ltd. (AMEC) is adapting its GeoMelt ICV™ technology for this application with technical and analytical support from Pacific Northwest National Laboratory (PNNL). The DBVS project is funded by the DOE Office of River Protection and administered by CH2M HILL Hanford Group, Inc. The Demonstration Bulk Vitrification Project (DBVS) was initiated to engineer, construct, and operate a full-scale bulk vitrification pilot-plant to treat up to 750,000 liters of LAW from Waste Tank 241-S-109 at the DOE Hanford Site. Since the beginning of the DBVS project in 2004, testing has used laboratory, crucible-scale, and engineering-scale equipment to help establish process limitations of selected glass formulations and identify operational issues. Full-scale testing has provided critical design verification of the ICV™ process before operating the Hanford pilot-plant. In 2007, the project's fifth full-scale test, called FS-38D (also known as the Integrated Dryer Melter Test, or IDMT), was performed. This test had three primary objectives: 1) Demonstrate the simultaneous and integrated operation of the ICV™ melter with a 10,000-liter dryer; 2) Demonstrate the effectiveness of a new feed reformulation and change in process methodology towards reducing the production and migration of molten ionic salts (MIS); and 3) Demonstrate that an acceptable glass product is produced under these conditions. Testing was performed from August 8 to 17, 2007.
Process and analytical results demonstrated that the primary test objectives, along with a dozen supporting objectives, were successfully met. Glass performance exceeded all disposal performance criteria. A previous issue with MIS containment was successfully resolved in FS-38D, and the ICV™ melter was integrated with a full-scale, 10,000-liter dryer. This paper describes the rationale for performing the test, the purpose and outcome of the scale-up tests preceding it, and the performance and outcome of FS-38D. (authors)

  15. Ecological research at the Goosenest Adaptive Management Area in northeastern California

    Treesearch

    Martin W. Ritchie

    2005-01-01

    This paper describes the establishment of an interdisciplinary, large-scale ecological research project on the Goosenest Adaptive Management Area of the Klamath National Forest in northeastern California. This project is a companion to the Blacks Mountain Ecological Research Project described by Oliver (2000). The genesis for this project was the Northwest...

  16. Success in health information exchange projects: solving the implementation puzzle.

    PubMed

    Sicotte, Claude; Paré, Guy

    2010-04-01

    Interest in health information exchange (HIE), defined as the use of information technology to support the electronic transfer of clinical information across health care organizations, continues to grow among those pursuing greater patient safety and health care accessibility and efficiency. In this paper, we present the results of a longitudinal multiple-case study of two large-scale HIE implementation projects carried out in real time over 3-year and 2-year periods in Québec, Canada. Data were primarily collected through semi-structured interviews (n=52) with key informants, namely implementation team members and targeted users. These were supplemented with non-participant observation of team meetings and by the analysis of organizational documents. The cross-case comparison was particularly relevant given that project circumstances led to contrasting outcomes: while one project failed, the other was a success. A risk management analysis was performed taking a process view in order to capture the complexity of project implementations as evolving phenomena that are affected by interdependent pre-existing and emergent risks that tend to change over time. The longitudinal case analysis clearly demonstrates that the risk factors were closely intertwined. Systematic ripple effects from one risk factor to another were observed. This risk interdependence evolved dynamically over time, with a snowball effect that rendered a change of path progressively more difficult as time passed. The results of the cross-case analysis demonstrate a direct relationship between the quality of an implementation strategy and project outcomes. Copyright 2010 Elsevier Ltd. All rights reserved.

  17. HTS-DB: an online resource to publish and query data from functional genomics high-throughput siRNA screening projects.

    PubMed

    Saunders, Rebecca E; Instrell, Rachael; Rispoli, Rossella; Jiang, Ming; Howell, Michael

    2013-01-01

    High-throughput screening (HTS) uses technologies such as RNA interference to generate loss-of-function phenotypes on a genomic scale. As these technologies become more popular, many research institutes have established core facilities of expertise to deal with the challenges of large-scale HTS experiments. As the efforts of core facility screening projects come to fruition, focus has shifted towards managing the results of these experiments and making them available in a useful format that can be further mined for phenotypic discovery. The HTS-DB database provides a public view of data from screening projects undertaken by the HTS core facility at the CRUK London Research Institute. All projects and screens are described with comprehensive assay protocols, and datasets are provided with complete descriptions of analysis techniques. This format allows users to browse and search data from large-scale studies in an informative and intuitive way. It also provides a repository for additional measurements obtained from screens that were not the focus of the project, such as cell viability, and groups these data so that it can provide a gene-centric summary across several different cell lines and conditions. All datasets from our screens that can be made available can be viewed interactively and mined for further hit lists. We believe that in this format, the database provides researchers with rapid access to results of large-scale experiments that might facilitate their understanding of genes/compounds identified in their own research. DATABASE URL: http://hts.cancerresearchuk.org/db/public.

  18. Intelligent monitoring system for real-time geologic CO2 storage, optimization and reservoir management

    NASA Astrophysics Data System (ADS)

    Dou, S.; Commer, M.; Ajo Franklin, J. B.; Freifeld, B. M.; Robertson, M.; Wood, T.; McDonald, S.

    2017-12-01

    Archer Daniels Midland Company's (ADM) world-scale agricultural processing and biofuels production complex located in Decatur, Illinois, is host to two industrial-scale carbon capture and storage projects. The first operation within the Illinois Basin-Decatur Project (IBDP) is a large-scale pilot that injected 1,000,000 metric tons of CO2 over a three-year period (2011-2014) in order to validate the Illinois Basin's capacity to permanently store CO2. Injection for the second operation, the Illinois Industrial Carbon Capture and Storage Project (ICCS), started in April 2017, with the purpose of demonstrating the integration of carbon capture and storage (CCS) technology at an ethanol plant. The capacity to store over 1,000,000 metric tons of CO2 per year is anticipated. The latter project is accompanied by the development of an intelligent monitoring system (IMS) that will, among other tasks, perform hydrogeophysical joint analysis of pressure, temperature and seismic reflection data. Using a preliminary radial model assumption, we carry out synthetic joint inversion studies of these data combinations. We validate the history-matching process to be applied to field data once CO2 breakthrough at observation wells occurs. This process will aid the estimation of permeability and porosity for a reservoir model that best matches monitoring observations. The reservoir model will further be used for forecasting studies in order to evaluate different leakage scenarios and develop appropriate early-warning mechanisms. Both the inversion and forecasting studies aim at building an IMS that will use the seismic and pressure-temperature data feeds for providing continuous model calibration and reservoir status updates.

  19. Search for subgrid scale parameterization by projection pursuit regression

    NASA Technical Reports Server (NTRS)

    Meneveau, C.; Lund, T. S.; Moin, Parviz

    1992-01-01

    The dependence of subgrid-scale stresses on variables of the resolved field is studied using direct numerical simulations of isotropic turbulence, homogeneous shear flow, and channel flow. The projection pursuit algorithm, a promising new regression tool for high-dimensional data, is used to systematically search through a large collection of resolved variables, such as components of the strain rate, vorticity, velocity gradients at neighboring grid points, etc. For the case of isotropic turbulence, the search algorithm recovers the linear dependence on the rate of strain (which is necessary to transfer energy to subgrid scales) but is unable to determine any other more complex relationship. For shear flows, however, new systematic relations beyond eddy viscosity are found. For the homogeneous shear flow, the results suggest that products of the mean rotation rate tensor with both the fluctuating strain rate and fluctuating rotation rate tensors are important quantities in parameterizing the subgrid-scale stresses. A model incorporating these terms is proposed. When evaluated with direct numerical simulation data, this model significantly increases the correlation between the modeled and exact stresses, as compared with the Smagorinsky model. In the case of channel flow, the stresses are found to correlate with products of the fluctuating strain and rotation rate tensors. The mean rates of rotation or strain do not appear to be important in this case, and the model determined for homogeneous shear flow does not perform well when tested with channel flow data. Many questions remain about the physical mechanisms underlying these findings, about possible Reynolds number dependence, and, given the low level of correlations, about their impact on modeling. Nevertheless, demonstration of the existence of causal relations between SGS stresses and large-scale characteristics of turbulent shear flows, in addition to those necessary for energy transfer, provides important insight into the relation between scales in turbulent flows.
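    The search procedure this abstract describes — scanning candidate projections of the resolved variables for the one whose smooth ridge function best predicts the subgrid-scale target — can be illustrated with a deliberately simplified sketch. The single-term search below (random candidate directions plus a polynomial ridge fit, on synthetic data) is an illustrative toy, not the regression machinery the authors used:

```python
import numpy as np

def ppr_one_term(X, y, n_dirs=500, degree=3, seed=0):
    """One ridge-function stage of projection pursuit regression (toy version).

    Scans random unit directions w and fits a polynomial g to the
    projection X @ w, keeping the direction that best explains y.
    """
    rng = np.random.default_rng(seed)
    best = (None, None, -np.inf)
    for _ in range(n_dirs):
        w = rng.normal(size=X.shape[1])
        w /= np.linalg.norm(w)
        t = X @ w
        coefs = np.polyfit(t, y, degree)          # ridge function g(t)
        pred = np.polyval(coefs, t)
        r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
        if r2 > best[2]:
            best = (w, coefs, r2)
    return best  # (direction, polynomial coefficients, R^2)

# Synthetic check: y depends nonlinearly on a single hidden direction.
rng = np.random.default_rng(1)
X = rng.normal(size=(400, 5))
w_true = np.array([1.0, -1.0, 0.0, 0.0, 0.0]) / np.sqrt(2)
y = (X @ w_true) ** 2 + 0.05 * rng.normal(size=400)
w_hat, coefs, r2 = ppr_one_term(X, y)
print(f"R^2 of best single projection: {r2:.2f}")
```

    The recovered direction should align closely with the hidden one, mirroring how the algorithm in the paper recovered the strain-rate dependence for isotropic turbulence.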

  20. An assessment of people's satisfaction with the public hearing on the Yadana Natural Gas Pipeline project.

    PubMed

    Ogunlana, S; Yotsinsak, T; Yisa, S

    2001-11-01

    Many public and large-scale construction projects in Thailand have faced environmental and social conflict. The major cause is that project sponsors do not properly address the concerns of the public during the EIA study. The Yadana Natural Gas Pipeline (YNGP) project is an example of a project which suffered the effects of public demonstration. A public hearing, one technique of public participation, is a mechanism for resolving conflict in a non-violent way, and one the Thai Government usually adopts to settle conflict in construction projects. In the case of the YNGP, even after the conflict was 'resolved', hostility towards the project was not eliminated, as the opponents were not satisfied with the decision. Therefore, this article examines the hearing on the YNGP project. The study found that the hearing was held too late to allow any significant changes to the project, and most respondents were not satisfied with the project. In other words, the hearing did not improve their perception of the environmental soundness of the project. The study showed that the project's impact on the environment was not properly addressed, and that the project sponsors provided insufficient publicity for the meeting and conducted the hearing at too late a stage. Suggestions are made for improving participation in future hearings.

  1. Visualizing phylogenetic tree landscapes.

    PubMed

    Wilgenbusch, James C; Huang, Wen; Gallivan, Kyle A

    2017-02-02

    Genomic-scale sequence alignments are increasingly used to infer phylogenies in order to better understand the processes and patterns of evolution. Different partitions within these new alignments (e.g., genes, codon positions, and structural features) often favor hundreds if not thousands of competing phylogenies. Summarizing and comparing phylogenies obtained from multi-source data sets using current consensus tree methods discards valuable information and can disguise potential methodological problems. Discovery of efficient and accurate dimensionality reduction methods used to display at once in 2 or 3 dimensions the relationship among these competing phylogenies will help practitioners diagnose the limits of current evolutionary models and potential problems with phylogenetic reconstruction methods when analyzing large multi-source data sets. We introduce several dimensionality reduction methods to visualize in 2- and 3-dimensions the relationship among competing phylogenies obtained from gene partitions found in three mid- to large-size mitochondrial genome alignments. We test the performance of these dimensionality reduction methods by applying several goodness-of-fit measures. The intrinsic dimensionality of each data set is also estimated to determine whether projections in 2- and 3-dimensions can be expected to reveal meaningful relationships among trees from different data partitions. Several new approaches to aid in the comparison of different phylogenetic landscapes are presented. Curvilinear Components Analysis (CCA) and a stochastic gradient descent (SGD) optimization method give the best representation of the original tree-to-tree distance matrix for each of the three mitochondrial genome alignments and greatly outperformed the method currently used to visualize tree landscapes. The CCA + SGD method converged at least as fast as previously applied methods for visualizing tree landscapes. We demonstrate for all three mtDNA alignments that 3D projections significantly increase the fit between the tree-to-tree distances and can facilitate the interpretation of the relationship among phylogenetic trees. We demonstrate that the choice of dimensionality reduction method can significantly influence the spatial relationship among a large set of competing phylogenetic trees. We highlight the importance of selecting a dimensionality reduction method to visualize large multi-locus phylogenetic landscapes and demonstrate that 3D projections of mitochondrial tree landscapes better capture the relationship among the trees being compared.
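    The workflow the abstract describes — embed a tree-to-tree distance matrix in 2 or 3 dimensions, then score the embedding with a goodness-of-fit measure — can be sketched with classical (Torgerson) MDS as a simple stand-in for the CCA + SGD method the authors evaluate. The toy distance matrix and the stress measure below are illustrative assumptions:

```python
import numpy as np

def classical_mds(D, dim):
    """Classical (Torgerson) MDS: embed a distance matrix D into `dim` dimensions."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J               # double-centred Gram matrix
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:dim]      # keep the largest eigenvalues
    L = np.sqrt(np.clip(vals[order], 0, None))
    return vecs[:, order] * L

def stress(D, X):
    """Normalised residual between original and embedded pairwise distances."""
    E = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    return np.sqrt(np.sum((D - E) ** 2) / np.sum(D ** 2))

# Toy "tree-to-tree" distances: points genuinely spread in 3-D,
# so a 3-D embedding should fit the distances better than a 2-D one.
rng = np.random.default_rng(0)
P = rng.normal(size=(40, 3))
D = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=-1)
s2, s3 = stress(D, classical_mds(D, 2)), stress(D, classical_mds(D, 3))
print(s3 < s2)  # the extra dimension reduces stress
```

    The same pattern — lower stress in 3D than in 2D — is the quantitative sense in which the paper argues that 3D projections better capture the relationships among competing trees.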

  2. UAS in the NAS Project: Large-Scale Communication Architecture Simulations with NASA GRC Gen5 Radio Model

    NASA Technical Reports Server (NTRS)

    Kubat, Gregory

    2016-01-01

    This report describes and characterizes the performance of the large-scale, relay-architecture UAS communications simulation capability developed for the NASA GRC UAS in the NAS Project. The system uses a validated model of the GRC Gen5 CNPC flight-test radio. The report contains a description of the simulation system and its model components, recent changes made to the system to improve performance, descriptions and objectives of sample simulations used for test and verification, and a sampling of results and performance data with observations.

  3. Climate Drivers of Alaska Summer Stream Temperature

    NASA Astrophysics Data System (ADS)

    Bieniek, P.; Bhatt, U. S.; Plumb, E. W.; Thoman, R.; Trammell, E. J.

    2016-12-01

    The temperature of the water in lakes, rivers and streams has wide-ranging impacts, from local water quality and fish habitats to global climate change. Salmon fisheries in Alaska, a critical source of food in many subsistence communities, are sensitive to large-scale climate variability, and river and stream temperatures have also been linked with salmon production in Alaska. Given current and projected climate change, understanding the mechanisms that link the large-scale climate and river and stream temperatures is essential to better understand the changes that may occur with aquatic life in Alaska's waterways on which subsistence users depend. An analysis of Alaska stream temperatures in the context of reanalysis, downscaled, station and other climate data is undertaken in this study to fill that need. Preliminary analysis identified eight stream observation sites with sufficiently long (>15 years) data available for climate-scale analysis in Alaska, with one station, Terror Creek in Kodiak, having a 30-year record. Cross-correlations of summer (June-August) water temperatures between the stations are generally high even though the stations are spread over a large geographic region. Correlation analysis of the Terror Creek summer observations with seasonal sea surface temperatures (SSTs) in the North Pacific broadly resembles the SST anomaly fields typically associated with the Pacific Decadal Oscillation (PDO). A similar result was found for the remaining stations, and in both cases PDO-like correlation patterns also occurred in the preceding spring. These preliminary results demonstrate that there is potential to diagnose the mechanisms that link the large-scale climate system and Alaska stream temperatures.
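    The kind of analysis described here — correlating a station's summer water-temperature record with gridded North Pacific SST anomalies — can be sketched as follows. The data are synthetic and the function name is illustrative; the study itself used reanalysis, downscaled, and station observations:

```python
import numpy as np

def correlation_map(series, field):
    """Pearson correlation of a 1-D time series with every grid cell of a
    (time, lat, lon) anomaly field -- the kind of map used to compare a
    stream-temperature record with North Pacific SST patterns."""
    s = (series - series.mean()) / series.std()   # standardise the station record
    f = field - field.mean(axis=0)                # centre each grid cell in time
    f_std = field.std(axis=0)
    return (s[:, None, None] * f).mean(axis=0) / np.where(f_std == 0, np.nan, f_std)

# Synthetic demo: one grid cell shares the station's signal, the rest are noise.
rng = np.random.default_rng(0)
years, ny, nx = 30, 4, 5
pdo_like = rng.normal(size=years)                 # a shared low-frequency mode
sst = rng.normal(size=(years, ny, nx))
sst[:, 2, 3] += 2.0 * pdo_like                    # cell driven by the same mode
station = pdo_like + 0.3 * rng.normal(size=years)
r = correlation_map(station, sst)
print(r.shape)  # one correlation value per grid cell
```

    A map of these correlations is what reveals the PDO-like spatial pattern the abstract reports: cells sharing the station's driving mode stand out against the background noise.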

  4. The 80 megawatt wind power project at Kahuku Point, Hawaii

    NASA Technical Reports Server (NTRS)

    Laessig, R. R.

    1982-01-01

    Windfarms Ltd. is developing the two largest wind energy projects in the world. Designed to produce 80 megawatts at Kahuku Point, Hawaii and 350 megawatts in Solano County, California, these projects will be the prototypes for future large-scale wind energy installations throughout the world.

  5. Exposing and deposing hyper-economized school science

    NASA Astrophysics Data System (ADS)

    Bencze, John Lawrence

    2010-06-01

    Despite indications of the problematic nature of laissez faire capitalism, such as the convictions of corporate leaders and the global financial crisis that appeared to largely stem from a de-regulated financial services industry, it seems clear that societies and environments continue to be strongly influenced by hyper-economized worldviews and practices. Given the importance of societal acceptance of a potentially dominant ideological perspective, it is logical to assume that it would be critical for students to be prepared to function in niches prioritizing unrestricted for-profit commodity exchanges. Indeed, in their article in this issue, Lyn Carter and Ranjith Dediwalage appear to support this claim in their analyses of the large-scale and expensive Australian curriculum and instruction project, Sustainability by the Bay. More specifically, they effectively demonstrate that this project manifests several characteristics that would suggest neoliberal and neoconservative influences—ideological perspectives that they argue are largely fundamental to the functioning of the global economic system. In this forum article, possible adverse effects of neoliberalism and neoconservatism on school science are discussed—with further justification for Carter and Dediwalage's concerns. Additionally, however, this article raises the possibility of subverting neoliberalism and neoconservatism in science education through application of communitarian ideals.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shull, H.E.

    The objective of the project was to investigate the economic feasibility of converting potato waste to fuel alcohol. The source of potato starch was Troyer Farms Potato Chips. Experimental work was carried out at both the laboratory scale and the larger pilot-scale batch operation at a decommissioned waste water treatment building on campus. The laboratory-scale work was considerably more extensive than originally planned, resulting in much improved scientific work. The pilot-scale facility has been completed and operated successfully. In contrast, the analysis of the economic feasibility of commercial production has not yet been completed. The project was brought to a close with the successful demonstration of the fermentation and distillation using the large-scale facilities described previously. Two batches of mash were cooked using the procedures established in support of the laboratory-scale work. One of the batches was fermented using the optimum values of the seven controlled factors as predicted by the laboratory-scale application of the Box-Wilson design. The other batch was fermented under conditions derived from Mr. Rouse's interpretation of his long sequence of laboratory results. He was gratified to find that his commitment to the Box-Wilson experiments was justified: the productivity of the Box-Wilson design was greater. The difference between the performance of the two fermentors (one stirred, one not) has not yet been established. Both batches were then distilled together, demonstrating the satisfactory performance of the column still. 4 references.

  7. Developing a monitoring and verification plan with reference to the Australian Otway CO2 pilot project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dodds, K.; Daley, T.; Freifeld, B.

    2009-05-01

    The Australian Cooperative Research Centre for Greenhouse Gas Technologies (CO2CRC) is currently injecting 100,000 tons of CO{sub 2} in a large-scale test of storage technology in a pilot project in southeastern Australia called the CO2CRC Otway Project. The Otway Basin, with its natural CO{sub 2} accumulations and many depleted gas fields, offers an appropriate site for such a pilot project. An 80% CO{sub 2} stream is produced from a well (Buttress) near the depleted gas reservoir (Naylor) used for storage (Figure 1). The goal of this project is to demonstrate that CO{sub 2} can be safely transported, stored underground, and its behavior tracked and monitored. The monitoring and verification framework has been developed to monitor for the presence and behavior of CO{sub 2} in the subsurface reservoir, near surface, and atmosphere. This monitoring framework addresses areas, identified by a rigorous risk assessment, to verify conformance to clearly identifiable performance criteria. These criteria have been agreed with the regulatory authorities to manage the project through all phases, addressing responsibilities and liabilities, and to assure the public of safe storage.

  8. The Mesaba Energy Project: Clean Coal Power Initiative, Round 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stone, Richard; Gray, Gordon; Evans, Robert

    2014-07-31

    The Mesaba Energy Project is a nominal 600 MW integrated gasification combined cycle power project located in northeastern Minnesota. It was selected to receive financial assistance pursuant to the Code of Federal Regulations ("CFR"), 10 CFR 600, through a competitive solicitation under Round 2 of the Department of Energy's Clean Coal Power Initiative, which had two stated goals: (1) to demonstrate advanced coal-based technologies that can be commercialized at electric utility scale, and (2) to accelerate the likelihood of deploying demonstrated technologies for widespread commercial use in the electric power sector. The Project was selected in 2004 to receive a total of $36 million. The DOE portion that was equally cost shared in Budget Period 1 amounted to about $22.5 million. Budget Period 1 activities focused on the Project Definition Phase and included: project development, preliminary engineering, environmental permitting, regulatory approvals and financing to reach financial close and start of construction. The Project is based on ConocoPhillips' E-Gas™ Technology and is designed to be fuel flexible, with the ability to process sub-bituminous coal, a blend of sub-bituminous coal and petroleum coke, and Illinois #6 bituminous coal. Major objectives include the establishment of a reference plant design for Integrated Gasification Combined Cycle ("IGCC") technology featuring advanced full slurry quench, multiple train gasification, integration of the air separation unit, and the demonstration of 90% operational availability and improved thermal efficiency relative to previous demonstration projects. In addition, the Project would demonstrate substantial environmental benefits, as compared with conventional technology, through dramatically lower emissions of sulfur dioxide, nitrogen oxides, volatile organic compounds, carbon monoxide, particulate matter and mercury. Major milestones achieved in support of fulfilling the above goals include obtaining Site, High Voltage Transmission Line Route, and Natural Gas Pipeline Route Permits for a Large Electric Power Generating Plant to be located in Taconite, Minnesota. In addition, major pre-construction permit applications have been filed requesting authorization for the Project to i) appropriate water sufficient to accommodate its worst-case needs, ii) operate a major stationary source in compliance with regulations established to protect public health and welfare, and iii) physically alter the geographical setting to accommodate its construction. As of the current date, the Water Appropriation Permits have been obtained.

  9. Imaging spectroscopy links aspen genotype with below-ground processes at landscape scales

    PubMed Central

    Madritch, Michael D.; Kingdon, Clayton C.; Singh, Aditya; Mock, Karen E.; Lindroth, Richard L.; Townsend, Philip A.

    2014-01-01

    Fine-scale biodiversity is increasingly recognized as important to ecosystem-level processes. Remote sensing technologies have great potential to estimate both biodiversity and ecosystem function over large spatial scales. Here, we demonstrate the capacity of imaging spectroscopy to discriminate among genotypes of Populus tremuloides (trembling aspen), one of the most genetically diverse and widespread forest species in North America. We combine imaging spectroscopy (AVIRIS) data with genetic, phytochemical, microbial and biogeochemical data to determine how intraspecific plant genetic variation influences below-ground processes at landscape scales. We demonstrate that both canopy chemistry and below-ground processes vary over large spatial scales (continental) according to aspen genotype. Imaging spectrometer data distinguish aspen genotypes through variation in canopy spectral signature. In addition, foliar spectral variation correlates well with variation in canopy chemistry, especially condensed tannins. Variation in aspen canopy chemistry, in turn, is correlated with variation in below-ground processes. Variation in spectra also correlates well with variation in soil traits. These findings indicate that forest tree species can create spatial mosaics of ecosystem functioning across large spatial scales and that these patterns can be quantified via remote sensing techniques. Moreover, they demonstrate the utility of using optical properties as proxies for fine-scale measurements of biodiversity over large spatial scales. PMID:24733949

  10. Hanford’s Supplemental Treatment Project: Full-Scale Integrated Testing of In-Container-Vitrification and a 10,000-Liter Dryer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Witwer, Keith S.; Dysland, Eric J.; Garfield, J. S.

    2008-02-22

    The GeoMelt® In-Container Vitrification™ (ICV™) process was selected by the U.S. Department of Energy (DOE) in 2004 for further evaluation as the supplemental treatment technology for Hanford’s low-activity waste (LAW). Also referred to as “bulk vitrification,” this process combines glass-forming minerals, LAW, and chemical amendments; dries the mixture; and then vitrifies the material in a refractory-lined steel container. AMEC Nuclear Ltd. (AMEC) is adapting its GeoMelt ICV™ technology for this application with technical and analytical support from Pacific Northwest National Laboratory (PNNL). The project is funded by the DOE Office of River Protection and administered by CH2M HILL Hanford Group, Inc. The Demonstration Bulk Vitrification Project (DBVS) was initiated to engineer, construct, and operate a full-scale bulk vitrification pilot plant to treat up to 750,000 liters of LAW from Waste Tank 241-S-109 at the DOE Hanford Site. Since the beginning of the DBVS project in 2004, testing has used laboratory, crucible-scale, and engineering-scale equipment to help establish process limitations of selected glass formulations and identify operational issues. Full-scale testing has provided critical design verification of the ICV™ process before operating the Hanford pilot plant. In 2007, the project’s fifth full-scale test, called FS-38D (also known as the Integrated Dryer Melter Test, or IDMT), was performed. This test had three primary objectives: 1) demonstrate the simultaneous and integrated operation of the ICV™ melter with a 10,000-liter dryer; 2) demonstrate the effectiveness of a new feed reformulation and a change in process methodology for reducing the production and migration of molten ionic salts (MIS); and 3) demonstrate that an acceptable glass product is produced under these conditions. Testing was performed from August 8 to 17, 2007. Process and analytical results demonstrated that the primary test objectives, along with a dozen supporting objectives, were successfully met. Glass performance exceeded all disposal performance criteria. A previous issue with MIS containment was successfully resolved in FS-38D, and the ICV™ melter was integrated with a full-scale, 10,000-liter dryer. This paper describes the rationale for performing the test, the purpose and outcome of the scale-up tests preceding it, and the performance and outcome of FS-38D.

  11. Management of large-scale technology

    NASA Technical Reports Server (NTRS)

    Levine, A.

    1985-01-01

    Two major themes are addressed in this assessment of the management of large-scale NASA programs: (1) how a high-technology agency fared during a decade marked by a rapid expansion of funds and manpower in the first half and an almost equally rapid contraction in the second; and (2) how NASA combined central planning and control with decentralized project execution.

  12. Development and Large-Scale Validation of an Instrument to Assess Arabic-Speaking Students' Attitudes toward Science

    ERIC Educational Resources Information Center

    Abd-El-Khalick, Fouad; Summers, Ryan; Said, Ziad; Wang, Shuai; Culbertson, Michael

    2015-01-01

    This study is part of a large-scale project focused on "Qatari students' Interest in, and Attitudes toward, Science" (QIAS). QIAS aimed to gauge Qatari student attitudes toward science in grades 3-12, examine factors that impact these attitudes, and assess the relationship between student attitudes and prevailing modes of science…

  13. Status of JUPITER Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Inoue, T.; Shirakata, K.; Kinjo, K.

    To obtain the data necessary for evaluating the nuclear design method of a large-scale fast breeder reactor, criticality tests with a large-scale homogeneous reactor were conducted as part of a joint research program by Japan and the U.S. Analyses of the tests are underway in both countries. The purpose of this paper is to describe the status of this project.

  14. AN EXAMINATION OF CITIZEN PARTICIPATION AND PROCEDURAL FAIRNESS IN LARGE-SCALE URBAN TREE PLANTING INITIATIVES IN THE UNITED STATES

    EPA Science Inventory

    This project will result in a typology of the degrees and forms of citizen participation in large-scale urban tree planting initiatives. It also will identify specific aspects of urban tree planting processes that residents perceive as fair and unfair, which will provide ad...

  15. Technology demonstration for reusable launchers

    NASA Astrophysics Data System (ADS)

    Baiocco, P.; Bonnal, Ch.

    2016-03-01

    Reusable launchers have been studied under CNES contracts for more than 30 years, with early concepts such as STS-2000 or Oriflamme; more recently with very significant efforts devoted to Liquid Fly Back Boosters, as with the Bargouzin project led with Tsniimash; TSTO with the Everest concept studied by Airbus-DS as prime contractor; and the RFS Reusable First Stage concept of a large first stage associated with a cryotechnic second stage. These investigations, summarized in the first part of the paper, enabled CNES to clearly identify the technology requirements associated with reusability, as well as cost efficiency, through detailed analysis of non-recurring and mission costs. In parallel, CNES set in place a development logic for sub-systems and equipment based on demonstrators: hardware test benches enabling maturation of technologies up to a TRL such that an actual development can be decided with limited risk. This philosophy has been applied so far to a large number of cases, such as TPTech and TPX for the hydrogen turbopump, GGPX as a demonstrator of an innovative gas generator, and HX as a demonstrator of a modern cryotechnic upper stage with a dozen different objectives (thermal protection, 20 K helium storage, measurements …). This virtuous approach, "learn as you test", is currently applied in the phased approach towards a scaled-down reusable booster stage, whose possibility to be used as the first stage of a microlaunch vehicle is under investigation. The selected technologies pave the way towards reusable booster stages for Ariane 6 evolutions or a main reusable stage for a further generation of heavy launchers. The paper describes the logic behind this project, together with the demonstration objectives set for the various sub-systems as well as operations.

  16. Development of superconductor magnetic suspension and balance prototype facility for studying the feasibility of applying this technique to large scale aerodynamic testing

    NASA Technical Reports Server (NTRS)

    Zapata, R. N.; Humphris, R. R.; Henderson, K. C.

    1975-01-01

    The unique design and operational characteristics of a prototype magnetic suspension and balance facility which utilizes superconductor technology are described and discussed from the point of view of scalability to large sizes. The successful experimental demonstration of the feasibility of this new magnetic suspension concept at the University of Virginia, together with the success of the cryogenic wind-tunnel concept developed at Langley Research Center, appear to have finally opened the way to clean-tunnel, high-Re aerodynamic testing. Results of calculations corresponding to a two-step design extrapolation from the observed performance of the prototype magnetic suspension system to a system compatible with the projected cryogenic transonic research tunnel are presented to give an order-of-magnitude estimate of expected performance characteristics. Research areas where progress should lead to improved design and performance of large facilities are discussed.

  17. Advanced Flue Gas Desulfurization (AFGD) Demonstration Project, A DOE Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    National Energy Technology Laboratory

    2001-08-31

    The AFGD process as demonstrated by Pure Air at the Bailly Station offers a reliable and cost-effective means of achieving a high degree of SO{sub 2} emissions reduction when burning high-sulfur coals. Many innovative features have been successfully incorporated in this process, and it is ready for widespread commercial use. The system uses a single-loop cocurrent scrubbing process with in-situ oxidation to produce wallboard-grade gypsum instead of wet sludge. A novel wastewater evaporation system minimizes effluents. The advanced scrubbing process uses a common absorber to serve multiple boilers, thereby saving on capital through economies of scale. Major results of the project are: (1) SO{sub 2} removal of over 94 percent was achieved over the three-year demonstration period, with a system availability exceeding 99.5 percent; (2) a large, single absorber handled the combined flue gas of boilers generating 528 MWe of power, and no spares were required; (3) direct injection of pulverized limestone into the absorber was successful; (4) wastewater evaporation eliminated the need for liquid waste disposal; and (5) the gypsum by-product was used directly for wallboard manufacture, eliminating the need to dispose of waste sludge.

  18. Parlin Creek large woody debris placement project

    Treesearch

    Barry W. Collins

    1999-01-01

    In August 1996 the Jackson Demonstration State Forest (JDSF) completed a fish habitat rehabilitation project in a 2.5-mile reach of Parlin Creek, a tributary to the Noyo River in Mendocino County, California. The purpose of the project was to introduce large woody material to the stream channel to determine if higher quality habitat could be produced for anadromous...

  19. Regional variability of the frequency distribution of daily precipitation and the synoptic characteristics of heavy precipitation events in present and future climate simulations

    NASA Astrophysics Data System (ADS)

    DeAngelis, Anthony M.

    Changes in the characteristics of daily precipitation in response to global warming may have serious impacts on human life and property. An analysis of precipitation in climate models is performed to evaluate how well the models simulate the present climate and how precipitation may change in the future. Models participating in phases 3 and 5 of the Coupled Model Intercomparison Project (CMIP3 and CMIP5) have substantial biases in their simulation of heavy precipitation intensity over parts of North America during the 20th century. Despite these biases, the large-scale atmospheric circulation accompanying heavy precipitation is either simulated realistically or the strength of the circulation is overestimated. The biases are not related to the large-scale flow in a simple way, pointing toward the importance of other model deficiencies, such as coarse horizontal resolution and convective parameterizations, for the accurate simulation of intense precipitation. Although the models may not sufficiently simulate the intensity of precipitation, their realistic portrayal of the large-scale circulation suggests that projections of future precipitation may be reliable. In the CMIP5 ensemble, the distribution of daily precipitation is projected to undergo substantial changes in response to future atmospheric warming. The regional distribution of these changes was investigated, revealing that dry days and days with heavy-extreme precipitation are projected to increase at the expense of light-moderate precipitation over much of the middle and low latitudes. Such projections have serious implications for future impacts from flood and drought events. In other places, changes in the daily precipitation distribution are characterized by a shift toward either wetter or drier conditions in the future, with heavy-extreme precipitation projected to increase in all but the driest subtropical subsidence regions. Further analysis shows that increases in heavy precipitation in midlatitudes are largely explained by thermodynamics, including increases in atmospheric water vapor. However, in low latitudes and northern high latitudes, changes in vertical velocity accompanying heavy precipitation are also important. The strength of the large-scale atmospheric circulation is projected to change in accordance with vertical velocity in many places, though the circulation patterns, and therefore physical mechanisms that generate heavy precipitation, may remain the same.

  20. Deep Space Habitat Concept Demonstrator

    NASA Technical Reports Server (NTRS)

    Bookout, Paul S.; Smitherman, David

    2015-01-01

This project will develop, integrate, test, and evaluate Habitation Systems that will be utilized as technology testbeds and will advance NASA's understanding of alternative deep space mission architectures, requirements, and operations concepts. Rapid prototyping and existing hardware will be utilized to develop full-scale habitat demonstrators. FY 2014 focused on the development of a large volume Space Launch System (SLS) class habitat (Skylab Gen 2) based on the SLS hydrogen tank components. Similar to the original Skylab, a tank section of the SLS rocket can be outfitted with a deep space habitat configuration and launched as a payload on an SLS rocket. This concept can be used to support an extended stay in lunar distant retrograde orbit to support the Asteroid Retrieval Mission and provide a habitat suitable for human missions to Mars.

  1. KSC-2009-6452

    NASA Image and Video Library

    2009-11-19

    CAPE CANAVERAL, Fla. – NASA Kennedy Space Center Director Bob Cabana addresses the audience on hand for the unveiling of NASA's first large-scale solar power generation facility at Kennedy in Florida. Representatives from NASA, Florida Power & Light Company, or FPL, and SunPower Corporation formally commissioned the one-megawatt facility and announced plans to pursue a new research, development and demonstration project at Kennedy to advance America's use of renewable energy. The facility is the first element of a major renewable energy project currently under construction at Kennedy. The completed system features a fixed-tilt, ground-mounted solar power system designed and built by SunPower, along with SunPower solar panels. A 10-megawatt solar farm, which SunPower is building on nearby Kennedy property, will supply power to FPL's customers when it is completed in April 2010. Photo credit: NASA/Jim Grossmann

  2. Projections of Flood Risk using Credible Climate Signals in the Ohio River Basin

    NASA Astrophysics Data System (ADS)

    Schlef, K.; Robertson, A. W.; Brown, C.

    2017-12-01

    Estimating future hydrologic flood risk under non-stationary climate is a key challenge to the design of long-term water resources infrastructure and flood management strategies. In this work, we demonstrate how projections of large-scale climate patterns can be credibly used to create projections of long-term flood risk. Our study area is the northwest region of the Ohio River Basin in the United States Midwest. In the region, three major teleconnections have been previously demonstrated to affect synoptic patterns that influence extreme precipitation and streamflow: the El Nino Southern Oscillation, the Pacific North American pattern, and the Pacific Decadal Oscillation. These teleconnections are strongest during the winter season (January-March), which also experiences the greatest number of peak flow events. For this reason, flood events are defined as the maximum daily streamflow to occur in the winter season. For each gage in the region, the location parameter of a log Pearson type 3 distribution is conditioned on the first principal component of the three teleconnections to create a statistical model of flood events. Future projections of flood risk are created by forcing the statistical model with projections of the teleconnections from general circulation models selected for skill. We compare the results of our method to the results of two other methods: the traditional model chain (i.e., general circulation model projections to downscaling method to hydrologic model to flood frequency analysis) and that of using the historic trend. We also discuss the potential for developing credible projections of flood events for the continental United States.
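    The conditioning step described above can be sketched compactly. The following is a minimal illustration using synthetic stand-in indices and flows rather than the study's data: it extracts the first principal component of three standardized teleconnection indices by power iteration, then fits a linear model for the location (log-space mean) of winter-maximum streamflow as a function of that component.

```python
# Hedged sketch of conditioning a flood distribution's location parameter
# on the first principal component (PC1) of teleconnection indices.
# All series, coefficients, and noise levels are synthetic illustrations.
import math
import random

random.seed(42)
n_years = 60

# Synthetic standardized winter-mean teleconnection indices
# (stand-ins for ENSO, PNA, and PDO; PNA and PDO correlate with ENSO).
enso = [random.gauss(0, 1) for _ in range(n_years)]
pna = [0.5 * e + random.gauss(0, 0.8) for e in enso]
pdo = [0.3 * e + random.gauss(0, 0.9) for e in enso]

def standardize(x):
    m = sum(x) / len(x)
    s = math.sqrt(sum((v - m) ** 2 for v in x) / (len(x) - 1))
    return [(v - m) / s for v in x]

X = [standardize(v) for v in (enso, pna, pdo)]

# First principal component via power iteration on the 3x3 correlation matrix.
C = [[sum(X[i][t] * X[j][t] for t in range(n_years)) / (n_years - 1)
      for j in range(3)] for i in range(3)]
w = [1.0, 1.0, 1.0]
for _ in range(100):
    w = [sum(C[i][j] * w[j] for j in range(3)) for i in range(3)]
    norm = math.sqrt(sum(v * v for v in w))
    w = [v / norm for v in w]
pc1 = [sum(w[i] * X[i][t] for i in range(3)) for t in range(n_years)]

# Synthetic winter-maximum streamflow whose log-space mean shifts with PC1
# (the "location parameter conditioned on the first PC" idea).
logq = [3.0 + 0.25 * p + random.gauss(0, 0.2) for p in pc1]

# Least-squares fit: location(PC1) = a + b * PC1.
mx = sum(pc1) / n_years
my = sum(logq) / n_years
b = (sum((x - mx) * (y - my) for x, y in zip(pc1, logq))
     / sum((x - mx) ** 2 for x in pc1))
a = my - b * mx
print(f"fitted location model: mu(PC1) = {a:.2f} + {b:.2f} * PC1")
```

    In the study itself the conditioned parameter belongs to a log-Pearson type 3 distribution; the linear-in-PC1 location shown here is only the conditioning mechanism, not the full flood-frequency model.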

  3. Utilization of Integrated Assessment Modeling for determining geologic CO2 storage security

    NASA Astrophysics Data System (ADS)

    Pawar, R.

    2017-12-01

Geologic storage of carbon dioxide (CO2) has been extensively studied as a potential technology to mitigate atmospheric concentration of CO2. Multiple international research & development efforts and large-scale demonstration and commercial projects are helping advance the technology. One of the critical areas of active investigation is prediction of long-term CO2 storage security and risks. A quantitative methodology for predicting a storage site's long-term performance is critical for making key decisions necessary for successful deployment of commercial-scale projects, where projects will require quantitative assessments of potential long-term liabilities. These predictions are challenging given that they require simulating CO2 and in-situ fluid movements, as well as interactions, through the primary storage reservoir, potential leakage pathways (such as wellbores, faults, etc.) and shallow resources such as groundwater aquifers. They need to take into account the inherent variability and uncertainties at geologic sites. This talk will provide an overview of an approach based on integrated assessment modeling (IAM) to predict long-term performance of a geologic storage site, including the storage reservoir, potential leakage pathways and shallow groundwater aquifers. The approach utilizes reduced order models (ROMs) that capture the complex physical/chemical interactions resulting from CO2 movement while remaining computationally extremely efficient. Applicability of the approach will be demonstrated through examples focused on key storage security questions, such as: What is the probability of leakage of CO2 from a storage reservoir? How does storage security vary for different geologic environments and operational conditions? How do site parameter variability and uncertainties affect storage security?
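    The reduced-order-model idea can be illustrated with a toy surrogate. Everything below (the stand-in "simulator", the quadratic response surface, the uniform parameter distribution, the 50% threshold) is a hypothetical sketch of the workflow, not a model from the talk: sample an expensive simulator at a few design points, fit a cheap surrogate, and propagate parameter uncertainty through the surrogate by Monte Carlo.

```python
# Sketch of the ROM workflow: fit a cheap surrogate to an "expensive"
# simulator, then estimate a leakage probability by Monte Carlo on the ROM.
import math
import random

random.seed(1)

def expensive_simulator(perm):
    # Toy stand-in for a full reservoir/wellbore simulation:
    # leakage fraction grows smoothly with wellbore permeability `perm`.
    return 1.0 - math.exp(-3.0 * perm)

def fit_quadratic(xs, ys):
    # Normal equations for y ~ c0 + c1*x + c2*x^2,
    # solved by Gaussian elimination with partial pivoting.
    A = [[sum(x ** (i + j) for x in xs) for j in range(3)] for i in range(3)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(3)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    c = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        c[r] = (b[r] - sum(A[r][k] * c[k] for k in range(r + 1, 3))) / A[r][r]
    return c

# Build the ROM from ten runs of the expensive model.
design = [i / 9 for i in range(10)]
y = [expensive_simulator(p) for p in design]
c0, c1, c2 = fit_quadratic(design, y)
rom = lambda p: c0 + c1 * p + c2 * p * p

# Monte Carlo through the cheap surrogate: probability that the leakage
# fraction exceeds 50% given an uncertain (uniform) permeability.
samples = [random.random() for _ in range(20000)]
p_exceed = sum(rom(p) > 0.5 for p in samples) / len(samples)
print(f"P(leakage fraction > 0.5) = {p_exceed:.2f}")
```

    The point of the design is in the last four lines: 20,000 evaluations of the surrogate cost almost nothing, whereas 20,000 runs of the full simulator would be prohibitive.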

  4. Chapter 13 - Perspectives on LANDFIRE Prototype Project Accuracy Assessment

    Treesearch

    James Vogelmann; Zhiliang Zhu; Jay Kost; Brian Tolk; Donald Ohlen

    2006-01-01

    The purpose of this chapter is to provide a general overview of the many aspects of accuracy assessment pertinent to the Landscape Fire and Resource Management Planning Tools Prototype Project (LANDFIRE Prototype Project). The LANDFIRE Prototype formed a large and complex research and development project with many broad-scale data sets and products developed throughout...

  5. Results of the Greenland Ice Sheet Model Initialisation Experiments ISMIP6 - initMIP-Greenland

    NASA Astrophysics Data System (ADS)

Goelzer, H.; Nowicki, S.; Edwards, T.; Beckley, M.; Abe-Ouchi, A.; Aschwanden, A.; Calov, R.; Gagliardini, O.; Gillet-Chaulet, F.; Golledge, N. R.; Gregory, J. M.; Greve, R.; Humbert, A.; Huybrechts, P.; Larour, E. Y.; Lipscomb, W. H.; Le clec'h, S.; Lee, V.; Kennedy, J. H.; Pattyn, F.; Payne, A. J.; Rodehacke, C. B.; Rückamp, M.; Saito, F.; Schlegel, N.; Seroussi, H. L.; Shepherd, A.; Sun, S.; van de Wal, R.; Ziemen, F. A.

    2016-12-01

Earlier large-scale Greenland ice sheet sea-level projections, e.g. those run during the ice2sea and SeaRISE initiatives, have shown that ice sheet initialisation can have a large effect on the projections and gives rise to important uncertainties. The goal of this intercomparison exercise (initMIP-Greenland) is to compare, evaluate and improve the initialisation techniques used in the ice sheet modelling community and to estimate the associated uncertainties. It is the first in a series of ice sheet model intercomparison activities within ISMIP6 (the Ice Sheet Model Intercomparison Project for CMIP6). Two experiments for the large-scale Greenland ice sheet have been designed to allow intercomparison between participating models of (1) the initial present-day state of the ice sheet and (2) the response in two schematic forward experiments. The forward experiments serve to evaluate the initialisation in terms of model drift (a forward run without any forcing) and response to a large perturbation (a prescribed surface mass balance anomaly). We present and discuss final results of the intercomparison and highlight important uncertainties with respect to projections of the Greenland ice sheet sea-level contribution.

  6. An efficient and scalable analysis framework for variant extraction and refinement from population-scale DNA sequence data.

    PubMed

    Jun, Goo; Wing, Mary Kate; Abecasis, Gonçalo R; Kang, Hyun Min

    2015-06-01

The analysis of next-generation sequencing data is computationally and statistically challenging because of the massive volume of data and imperfect data quality. We present GotCloud, a pipeline for efficiently detecting and genotyping high-quality variants from large-scale sequencing data. GotCloud automates sequence alignment, sample-level quality control, variant calling, filtering of likely artifacts using machine-learning techniques, and genotype refinement using haplotype information. The pipeline can process thousands of samples in parallel and requires fewer computational resources than current alternatives. Experiments with whole-genome and exome-targeted sequence data generated by the 1000 Genomes Project show that the pipeline provides effective filtering against false positive variants and high power to detect true variants. Our pipeline has already contributed to variant detection and genotyping in several large-scale sequencing projects, including the 1000 Genomes Project and the NHLBI Exome Sequencing Project. We hope it will now prove useful to many medical sequencing studies. © 2015 Jun et al.; Published by Cold Spring Harbor Laboratory Press.

  7. Pretreatment Engineering Platform Phase 1 Final Test Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurath, Dean E.; Hanson, Brady D.; Minette, Michael J.

    2009-12-23

Pacific Northwest National Laboratory (PNNL) was tasked by Bechtel National Inc. (BNI) on the River Protection Project, Hanford Tank Waste Treatment and Immobilization Plant (RPP-WTP) project to conduct testing to demonstrate the performance of the WTP Pretreatment Facility (PTF) leaching and ultrafiltration processes at an engineering scale. In addition to the demonstration, the testing was to address specific technical issues identified in the Issue Response Plan for Implementation of External Flowsheet Review Team (EFRT) Recommendations - M12, Undemonstrated Leaching Processes. Testing was conducted in a 1/4.5-scale mock-up of the PTF ultrafiltration system, the Pretreatment Engineering Platform (PEP). Parallel laboratory testing was conducted in various PNNL laboratories to allow direct comparison of process performance at an engineering scale and a laboratory scale. This report presents and discusses the results of those tests.

  8. 3 CFR 13502 - Executive Order 13502 of February 6, 2009. Use of Project Labor Agreements for Federal...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... developing by providing structure and stability to large-scale construction projects, thereby promoting the... procurement, producing labor-management stability, and ensuring compliance with laws and regulations governing... construction projects receiving Federal financial assistance, would help to promote the economical, efficient...

  9. Large-scale Meteorological Patterns Associated with Extreme Precipitation Events over Portland, OR

    NASA Astrophysics Data System (ADS)

    Aragon, C.; Loikith, P. C.; Lintner, B. R.; Pike, M.

    2017-12-01

Extreme precipitation events can have profound impacts on human life and infrastructure, with broad implications across a range of stakeholders. Changes to extreme precipitation events are a projected outcome of climate change that warrants further study, especially at regional to local scales. While global climate models are generally capable of simulating mean climate at global to regional scales with reasonable skill, resiliency and adaptation decisions are made at local scales, where most state-of-the-art climate models are limited by coarse resolution. Characterization of large-scale meteorological patterns associated with extreme precipitation events at local scales can provide climatic information without this scale limitation, thus facilitating stakeholder decision-making. This research will use synoptic climatology as a tool by which to characterize the key large-scale meteorological patterns associated with extreme precipitation events in the Portland, Oregon metro region. Composite analysis of meteorological patterns associated with extreme precipitation days, and associated watershed-specific flooding, is employed to enhance understanding of the climatic drivers behind such events. The self-organizing maps approach is then used to characterize the within-composite variability of the large-scale meteorological patterns associated with extreme precipitation events, allowing us to better understand the different types of meteorological conditions that lead to high-impact precipitation events and associated hydrologic impacts. A more comprehensive understanding of the meteorological drivers of extremes will aid in evaluating the ability of climate models to capture key patterns associated with extreme precipitation over Portland and in interpreting projections of future climate at impact-relevant scales.
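    The self-organizing maps approach mentioned above can be sketched in miniature. This toy version, with synthetic 2-D "patterns" standing in for gridded meteorological fields, trains a small 1-D SOM so that similar inputs map to nearby nodes; the data, grid size, and schedules are illustrative assumptions only.

```python
# Minimal self-organizing map (SOM) sketch: competitive learning with a
# decaying Gaussian neighborhood on a 1-D grid of nodes.
import math
import random

random.seed(7)

# Synthetic "patterns": two clusters standing in for two circulation types.
data = [(random.gauss(-2, 0.5), random.gauss(0, 0.5)) for _ in range(200)] + \
       [(random.gauss(2, 0.5), random.gauss(0, 0.5)) for _ in range(200)]

# A 1-D SOM with 4 nodes, randomly initialized.
n_nodes = 4
nodes = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(n_nodes)]

def dist2(a, b):
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

n_epochs = 30
for epoch in range(n_epochs):
    lr = 0.5 * (1 - epoch / n_epochs)              # decaying learning rate
    radius = max(1.0, 2.0 * (1 - epoch / n_epochs))  # decaying neighborhood
    for x in random.sample(data, len(data)):       # shuffled pass over data
        # Best-matching unit: node closest to the input pattern.
        bmu = min(range(n_nodes), key=lambda i: dist2(nodes[i], x))
        for i in range(n_nodes):
            # Gaussian neighborhood on the 1-D node grid: the BMU moves
            # most, grid-adjacent nodes move less.
            h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
            nodes[i] = (nodes[i][0] + lr * h * (x[0] - nodes[i][0]),
                        nodes[i][1] + lr * h * (x[1] - nodes[i][1]))

# After training, the end nodes should sit near the two cluster centers.
print("node x-positions:", sorted(round(n[0], 2) for n in nodes))
```

    In the synoptic-climatology application, each "pattern" would be a flattened anomaly field (e.g. sea-level pressure on extreme-precipitation days) and each trained node a characteristic circulation type.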

  10. Large scale rigidity-based flexibility analysis of biomolecules

    PubMed Central

    Streinu, Ileana

    2016-01-01

KINematics And RIgidity (KINARI) is an ongoing project for in silico flexibility analysis of proteins. The new version of the software, KINARI-2, extends the functionality of our free web server KinariWeb, incorporates advanced web technologies, emphasizes the reproducibility of its experiments, and makes substantially improved tools available to the user. It is designed specifically for large-scale experiments, in particular for (a) very large molecules, including bioassemblies with a high degree of symmetry such as viruses and crystals; (b) large collections of related biomolecules, such as those obtained through simulated dilutions, mutations, or conformational changes from various types of dynamics simulations; and (c) working as seamlessly as possible on the large, idiosyncratic, publicly available repository of biomolecules, the Protein Data Bank. We describe the system design, along with the main data processing, computational, mathematical, and validation challenges underlying this phase of the KINARI project. PMID:26958583

  11. How big is too big or how many partners are needed to build a large project which still can be managed successfully?

    NASA Astrophysics Data System (ADS)

    Henkel, Daniela; Eisenhauer, Anton

    2017-04-01

During the last decades, the number of large research projects has increased, and with it the requirement for multidisciplinary, multisectoral collaboration. Such complex, large-scale projects demand new competencies in forming, managing, and using large, diverse teams as a competitive advantage. For complex projects the effort is magnified: multiple large international research consortia involving academic and non-academic partners (big industries, NGOs, private and public bodies), all with cultural differences, individually discrepant expectations of teamwork, and differences in collaboration between national and multi-national administrations and research organisations, challenge the organisation and management of such multi-partner research consortia. How many partners are needed to establish and conduct collaboration with a multidisciplinary and multisectoral approach? How much personnel effort and what kinds of management techniques are required for such projects? This presentation identifies advantages and challenges of large research projects based on experiences made in the context of an Innovative Training Network (ITN) project within the Marie Skłodowska-Curie Actions of the European HORIZON 2020 programme. Possible strategies are discussed to circumvent and avoid conflicts already at the beginning of the project.

  12. 1366 Project Silicon: Reclaiming US Silicon PV Leadership

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lorenz, Adam

1366 Technologies’ Project Silicon addresses two of the major goals of the DOE’s PV Manufacturing Initiative Part 2 program: 1) how to reclaim a strong silicon PV manufacturing presence and 2) how to lower the levelized cost of electricity (“LCOE”) for solar to $0.05-$0.07/kWh, enabling wide-scale U.S. market adoption. To achieve these two goals, US companies must commercialize disruptive, high-value technologies that are capable of rapid scaling, defensible from foreign competition, and suited for US manufacturing. These are the aims of 1366 Technologies’ Direct Wafer™ process. The research conducted during Project Silicon led to the first industrial scaling of 1366’s Direct Wafer™ process – an innovative, US-friendly (efficient, low-labor content) manufacturing process that destroys the main cost barrier limiting silicon PV cost reductions: the 35-year-old grand challenge of making quality wafers (40% of the cost of modules) without the cost and waste of sawing. The SunPath program made it possible for 1366 Technologies to build its demonstration factory, a critical step in the Company’s evolution. The demonstration factory allowed 1366 to build every step of the process flow at production size, eliminating potential risk and ensuring the success of the Company’s subsequent scaling for a 1 GW factory to be constructed in Western New York in 2016 and 2017. Moreover, the commercial viability of the Direct Wafer process and its resulting wafers were established as 1366 formed key strategic partnerships, gained entry into the $8B/year multi-Si wafer market, and installed modules featuring Direct Wafer products – the veritable proving grounds for the technology. The program also contributed to the development of three Generation 3 Direct Wafer furnaces. These furnaces are the platform for copying intelligently and preparing our supply chain – large-scale expansion will not require a bigger machine but more machines.
SunPath filled the crucial development step between the original research effort in Lexington and the GW factory scheduled to be online before the end of the decade. At the conclusion of the project, it is clear that the Direct Wafer™ technology will have a dramatic impact on the entire silicon photovoltaic supply chain by effectively doubling existing silicon capacity (by reducing silicon usage by 50%) and reducing supply chain capital costs by 35%. The technology, when fully scaled in the US, will also lead to significant job growth, with the eventual creation of 1,000 jobs in Western New York.

  13. Early Implementation of Large Scale Carbon Dioxide Removal Projects through the Cement Industry

    NASA Astrophysics Data System (ADS)

    Zeman, F. S.

    2014-12-01

The development of large-scale carbon dioxide reduction projects requires high-purity CO2 and a reactive cation source. A project seeking to provide both of these requirements will likely face cost barriers at current carbon prices. The cement industry is a suitable early implementation site for such projects by virtue of the properties of its exhaust gases and those of waste concrete. Cement plants are the second largest source of industrial CO2 emissions globally. Cement is also the second most consumed commodity after water, has no ready substitute, and is literally the foundation of society. Finally, half of the CO2 emissions originate from process reactions rather than fossil fuel combustion, resulting in higher flue gas CO2 concentrations. These properties, with the co-benefits of oxygen combustion, create a favorable environment for spatially suitable projects. Oxygen combustion involves substituting produced oxygen for air in a combustion reaction. The absence of gaseous N2 necessitates the recirculation of exhaust gases (flue gas recirculation, FGR) to maintain kiln temperatures, which increases the CO2 concentration from 28% to 80% or more. Gas exit temperatures are also elevated (>300 °C) and can reach higher temperatures if the multi-stage pre-heater towers that recover heat are re-designed in light of FGR. A ready source of cations can be found in waste concrete, a by-product of construction and demolition activities. These wastes can be processed to remove cations and then reacted with atmospheric CO2 to produce carbonate minerals. While not carbon negative, they represent a demonstration opportunity for binding atmospheric CO2 while producing a saleable product, precipitated calcium carbonate (PCC). This paper will present experimental results on PCC production from waste concrete along with modeling results for oxygen combustion at cement facilities. The results will be presented with a view to mineral sequestration process design and implementation.

  14. Comprehensive evaluation of transportation projects : a toolkit for sketch planning.

    DOT National Transportation Integrated Search

    2010-10-01

    A quick-response project-planning tool can be extremely valuable in anticipating the congestion, safety, : emissions, and other impacts of large-scale network improvements and policy implementations. This report : identifies the advantages and limita...

  15. Leveraging Resources to Address Transportation Needs: Transportation Pooled Fund Program

    DOT National Transportation Integrated Search

    2004-05-28

    This brochure describes the Transportation Pooled Fund (TPF) Program. The objectives of the TPF Program are to leverage resources, avoid duplication of effort, undertake large-scale projects, obtain greater input on project definition, achieve broade...

  16. Living the lesson: can the Lifestyle Project be used to achieve deep learning in environmental earth science?

    NASA Astrophysics Data System (ADS)

    Padden, M.; Whalen, K.

    2013-12-01

Students in a large, second-year environmental earth science class made significant changes to their daily lives over a three-week period to learn how small-scale actions interact with global-scale issues such as water and energy supplies, waste management and agriculture. The Lifestyle Project (Kirk and Thomas, 2003) was slightly adapted to fit a large-class setting (350 students). Students made changes to their lifestyle in self-selected categories (water, home heating, transportation, waste, food) and kept journals over a three-week period as the changes increased in difficulty. The goal of this study is to understand which aspects of the project played a pivotal role in long-term learning. Content analysis of the journal entries and follow-up interviews are used to investigate whether the Lifestyle Project is having a lasting impact on the students 18 months after the initial assignment.

  17. The global gridded crop model intercomparison: Data and modeling protocols for Phase 1 (v1.0)

    DOE PAGES

    Elliott, J.; Müller, C.; Deryng, D.; ...

    2015-02-11

We present protocols and input data for Phase 1 of the Global Gridded Crop Model Intercomparison, a project of the Agricultural Model Intercomparison and Improvement Project (AgMIP). The project consists of global simulations of yields, phenologies, and many land-surface fluxes by 12–15 modeling groups for many crops, climate forcing data sets, and scenarios over the historical period from 1948 to 2012. The primary outcomes of the project include (1) a detailed comparison of the major differences and similarities among global models commonly used for large-scale climate impact assessment, (2) an evaluation of model and ensemble hindcasting skill, (3) quantification of key uncertainties from climate input data, model choice, and other sources, and (4) a multi-model analysis of the agricultural impacts of large-scale climate extremes from the historical record.
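    Outcome (2), hindcast skill of individual models versus the multi-model ensemble, can be illustrated with a toy experiment. The "observations" and five "models" below are synthetic stand-ins, not AgMIP data; the sketch shows the standard effect that averaging partially independent model errors lowers the ensemble-mean RMSE.

```python
# Toy ensemble-hindcast comparison: per-model RMSE vs. ensemble-mean RMSE.
import math
import random

random.seed(11)

years = 30
# Synthetic "observed" yields: a slow signal plus observational noise.
obs = [3.0 + 0.5 * math.sin(t / 3) + random.gauss(0, 0.2) for t in range(years)]

# Five "models": each tracks the signal with its own bias and noise.
models = []
for _ in range(5):
    bias = random.gauss(0, 0.3)
    models.append([o + bias + random.gauss(0, 0.4) for o in obs])

def rmse(pred, truth):
    return math.sqrt(sum((p - t) ** 2 for p, t in zip(pred, truth)) / len(truth))

single = [rmse(m, obs) for m in models]
ens_mean = [sum(vals) / len(vals) for vals in zip(*models)]

print("individual RMSEs:", [round(r, 2) for r in single])
print("ensemble-mean RMSE:", round(rmse(ens_mean, obs), 2))
```

    Because the per-model biases and noise are partially independent, they tend to cancel in the ensemble mean, which is one reason multi-model intercomparisons report ensemble skill alongside individual-model skill.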

  18. The Agricultural Model Intercomparison and Improvement Project (AgMIP): Protocols and Pilot Studies

    NASA Technical Reports Server (NTRS)

Rosenzweig, C.; Jones, J. W.; Hatfield, J. L.; Ruane, A. C.; Boote, K. J.; Thorburn, P.; Antle, J. M.; Nelson, G. C.; Porter, C.; Janssen, S.; et al.

    2012-01-01

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) is a major international effort linking the climate, crop, and economic modeling communities with cutting-edge information technology to produce improved crop and economic models and the next generation of climate impact projections for the agricultural sector. The goals of AgMIP are to improve substantially the characterization of world food security due to climate change and to enhance adaptation capacity in both developing and developed countries. Analyses of the agricultural impacts of climate variability and change require a transdisciplinary effort to consistently link state-of-the-art climate scenarios to crop and economic models. Crop model outputs are aggregated as inputs to regional and global economic models to determine regional vulnerabilities, changes in comparative advantage, price effects, and potential adaptation strategies in the agricultural sector. Climate, Crop Modeling, Economics, and Information Technology Team Protocols are presented to guide coordinated climate, crop modeling, economics, and information technology research activities around the world, along with AgMIP Cross-Cutting Themes that address uncertainty, aggregation and scaling, and the development of Representative Agricultural Pathways (RAPs) to enable testing of climate change adaptations in the context of other regional and global trends. The organization of research activities by geographic region and specific crops is described, along with project milestones. Pilot results demonstrate AgMIP's role in assessing climate impacts with explicit representation of uncertainties in climate scenarios and simulations using crop and economic models. 
An intercomparison of wheat model simulations near Obregón, Mexico reveals inter-model differences in yield sensitivity to [CO2] with model uncertainty holding approximately steady as concentrations rise, while uncertainty related to choice of crop model increases with rising temperatures. Wheat model simulations with midcentury climate scenarios project a slight decline in absolute yields that is more sensitive to selection of crop model than to global climate model, emissions scenario, or climate scenario downscaling method. A comparison of regional and national-scale economic simulations finds a large sensitivity of projected yield changes to the simulations' resolved scales. Finally, a global economic model intercomparison example demonstrates that improvements in the understanding of agriculture futures arise from integration of the range of uncertainty in crop, climate, and economic modeling results in multi-model assessments.

  19. Introducing Vi polysaccharide typhoid fever vaccine to primary school children in North Jakarta, Indonesia, via an existent school-based vaccination platform.

    PubMed

    Agtini, M D; Ochiai, R L; Soeharno, R; Lee, H J; Sundoro, J; Hadinegoro, S R; Han, O P; Tana, L; Halim, F X S; Ghani, L; Delima; Lestari, W; Sintawati, F X; Kusumawardani, N; Malik, R; Santoso, T S; Nadjib, M; Soeroso, S; Wangsasaputra, F; Ali, M; Ivanoff, B; Galindo, C M; Pang, T; Clemens, J D; Suwandono, A; Acosta, C J

    2006-11-01

To report results on coverage, safety and logistics of a large-scale, school-based Vi polysaccharide immunization campaign in North Jakarta. Of 443 primary schools in North Jakarta, Indonesia, 18 public schools were randomly selected for this study. Exclusion criteria were fever of 37.5 °C or higher at the time of vaccination or a known history of hypersensitivity to any vaccine. Adverse events were monitored and recorded for 1 month after immunization. Because this was a pilot programme, resource use was tracked in detail. During the February 2004 vaccination campaign, 4828 students were immunized (91% of the target population); another 394 students (7%) were vaccinated during mop-up programmes. Informed consent was obtained for 98% of the target population. In all, 34 adverse events were reported, corresponding to seven events per 1000 doses injected; none was serious. The manufacturer-recommended cold chain was maintained throughout the programme. This demonstration project in two sub-districts of North Jakarta shows that a large-scale, school-based typhoid fever Vi polysaccharide vaccination campaign is logistically feasible, safe and minimally disruptive to regular school activities when used in the context of an existing successful immunization platform. The project had high parental acceptance. Nonetheless, policy-relevant questions still need to be answered before implementing a widespread Vi polysaccharide vaccine programme in Indonesia.

  20. Overview of physical dosimetry methods for triage application integrated in the new European network RENEB.

    PubMed

    Trompier, François; Burbidge, Christopher; Bassinet, Céline; Baumann, Marion; Bortolin, Emanuela; De Angelis, Cinzia; Eakins, Jonathan; Della Monaca, Sara; Fattibene, Paola; Quattrini, Maria Cristina; Tanner, Rick; Wieser, Albrecht; Woda, Clemens

    2017-01-01

In the EC-funded project RENEB (Realizing the European Network in Biodosimetry), physical methods applied to fortuitous dosimetric materials are used to complement biological dosimetry, to increase dose assessment capacity for large-scale radiation/nuclear accidents. This paper describes the work performed to implement optically stimulated luminescence (OSL) and electron paramagnetic resonance (EPR) dosimetry techniques. OSL is applied to electronic components and EPR to touch-screen glass from mobile phones. To implement these new approaches, several blind tests and inter-laboratory comparisons (ILCs) were organized for each assay. OSL systems have shown good performance. EPR systems also show good performance in controlled conditions, but the ILCs have also demonstrated that post-irradiation exposure to sunlight increases the complexity of the EPR signal analysis. Physically based dosimetry techniques offer new, high-capacity possibilities for accident dosimetry, especially in the case of large-scale events. Some of the techniques applied can be considered operational (e.g. OSL on surface-mount devices [SMDs]) and provide a large increase in measurement capacity for existing networks. Other techniques and devices currently undergoing validation or development in Europe could lead to considerable increases in the capacity of the RENEB accident dosimetry network.

  1. KSC-2009-6456

    NASA Image and Video Library

    2009-11-19

CAPE CANAVERAL, Fla. – At NASA's Kennedy Space Center in Florida, Kennedy Director Bob Cabana, left, congratulates Eric Silagy, Florida Power & Light Company vice president and chief development officer, for his part in the construction of NASA's first large-scale solar power generation facility as Roderick Roche, senior manager, Project Management Office of North America, SunPower Corporation, looks on. Representatives from NASA, Florida Power & Light Company, or FPL, and SunPower Corporation formally commissioned the one-megawatt facility and announced plans to pursue a new research, development and demonstration project at Kennedy to advance America's use of renewable energy. The facility is the first element of a major renewable energy project currently under construction at Kennedy. The completed system features a fixed-tilt, ground-mounted solar power system designed and built by SunPower, along with SunPower solar panels. A 10-megawatt solar farm, which SunPower is building on nearby Kennedy property, will supply power to FPL's customers when it is completed in April 2010. Photo credit: NASA/Jim Grossmann

  2. KSC-2009-6455

    NASA Image and Video Library

    2009-11-19

    CAPE CANAVERAL, Fla. – At NASA's Kennedy Space Center in Florida, Kennedy Director Bob Cabana, left, congratulates Roderick Roche, senior manager, Project Management Office of North America, SunPower Corporation, for his part in the construction of NASA's first large-scale solar power generation facility as Eric Silagy, Florida Power & Light Company vice president and chief development officer, looks on. Representatives from NASA, Florida Power & Light Company, or FPL, and SunPower Corporation formally commissioned the one-megawatt facility and announced plans to pursue a new research, development and demonstration project at Kennedy to advance America's use of renewable energy. The facility is the first element of a major renewable energy project currently under construction at Kennedy. The completed system features a fixed-tilt, ground-mounted solar power system designed and built by SunPower, along with SunPower solar panels. A 10-megawatt solar farm, which SunPower is building on nearby Kennedy property, will supply power to FPL's customers when it is completed in April 2010. Photo credit: NASA/Jim Grossmann

  3. Cross-indexing of binary SIFT codes for large-scale image search.

    PubMed

    Liu, Zhen; Li, Houqiang; Zhang, Liyan; Zhou, Wengang; Tian, Qi

    2014-05-01

    In recent years, there has been growing interest in mapping visual features into compact binary codes for applications on large-scale image collections. Encoding high-dimensional data as compact binary codes reduces the memory cost for storage. It also benefits computational efficiency, since similarity can be efficiently measured by Hamming distance. In this paper, we propose a novel flexible scale invariant feature transform (SIFT) binarization (FSB) algorithm for large-scale image search. The FSB algorithm explores the magnitude patterns of the SIFT descriptor. It is unsupervised, and the generated binary codes are demonstrated to be distance-preserving. In addition, we propose a new search strategy to find target features based on cross-indexing in the binary SIFT space and the original SIFT space. We evaluate our approach on two publicly released data sets. The experiments on a large-scale partial-duplicate image retrieval system demonstrate the effectiveness and efficiency of the proposed algorithm.
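    The efficiency claim above rests on a generic mechanism worth making concrete: once descriptors are binarized, similarity search reduces to XOR and bit-counting. The sketch below is a hypothetical illustration of that Hamming-distance matching over toy 8-bit codes, not the FSB algorithm itself.

```python
def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two binary codes stored as ints."""
    return bin(a ^ b).count("1")

def nearest_code(query: int, database: list[int]) -> int:
    """Index of the database code with the smallest Hamming distance to the query."""
    return min(range(len(database)),
               key=lambda i: hamming_distance(query, database[i]))

# Toy 8-bit codes standing in for binarized SIFT descriptors.
codes = [0b10010100, 0b10110111, 0b01001011]
query = 0b10110110
print(nearest_code(query, codes))  # → 1 (differs from codes[1] in one bit)
```

    Real systems use 128- or 256-bit codes and accelerate the scan with inverted files or multi-index hashing, but the core distance computation is exactly this XOR-and-popcount.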

  4. Solar Technology Acceleration Center (SolarTAC): Solar Resource & Meteorological Assessment Project (SOLRMAP)

    DOE Data Explorer

    Andreas, Afshin; Wilcox, Steve

    2016-03-14

    Located in Colorado, near Denver International Airport, SolarTAC is a private, member-based, 74-acre outdoor facility where the solar industry tests, validates, and demonstrates advanced solar technologies. SolarTAC was launched in 2008 by a public-private consortium, including Midwest Research Institute (MRI). As a supporting member of SolarTAC, the U.S. Department of Energy National Renewable Energy Laboratory (NREL) has established a high-quality solar and meteorological measurement station at this location. This Solar Resource and Meteorological Assessment Project (SOLRMAP) provides high-quality measurements to support deployment of power projects in the United States. The no-funds-exchanged collaboration brings NREL solar resource assessment expertise together with industry needs for measurements. The end result is high-quality data sets to support the financing, design, and monitoring of large-scale solar-powered projects for industry, in addition to research-quality data for NREL model development. NREL provides consultation for instrumentation and station deployment, along with instrument calibrations, data acquisition, quality assessment, data distribution, and summary reports. Industry participants provide equipment, infrastructure, and station maintenance.

  5. Commercial-scale demonstration of the Liquid Phase Methanol (LPMEOH(TM)) process. Third quarterly report, 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The Liquid Phase Methanol (LPMEOH)(TM) demonstration project at Kingsport, Tennessee, is a $213.7 million cooperative agreement between the U.S. Department of Energy (DOE) and Air Products Liquid Phase Conversion Company, L.P. (the Partnership). A demonstration unit producing 80,000 gallons per day (260 TPD) of methanol is being designed and constructed at a site located at the Eastman Chemical Company (Eastman) complex in Kingsport. The Partnership will own and operate the facility for the four-year demonstration period. This project is sponsored under the DOE's Clean Coal Technology Program, and its primary objective is to 'demonstrate the production of methanol using the LPMEOH(TM) Process in conjunction with an integrated coal gasification facility.' The project will also demonstrate the suitability of the methanol produced for use as a chemical feedstock or as a low-sulfur dioxide, low-nitrogen oxides alternative fuel in stationary and transportation applications. The project may also demonstrate the production of dimethyl ether (DME) as a mixed coproduct with methanol, if laboratory- and pilot-scale research and market verification studies show promising results. If implemented, the DME would be produced during the last six months of the four-year demonstration period. The LPMEOH(TM) process is the product of a cooperative development effort by Air Products and the DOE in a program that started in 1981. It was successfully piloted at a 10-TPD rate in the DOE-owned experimental unit at Air Products' LaPorte, Texas, site. This demonstration project is the culmination of that extensive cooperative development effort.

  6. Demonstration of Active Power Controls by Utility-Scale PV Power Plant in an Island Grid: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gevorgian, Vahan; O'Neill, Barbara

    The National Renewable Energy Laboratory (NREL), AES, and the Puerto Rico Electric Power Authority conducted a demonstration project on a utility-scale photovoltaic (PV) plant to test the viability of providing important ancillary services from this facility. As solar generation increases globally, there is a need for innovation and increased operational flexibility. A typical PV power plant consists of multiple power electronic inverters and can contribute to grid stability and reliability through sophisticated 'grid-friendly' controls. In this way, it may mitigate the impact of its variability on the grid and contribute to important system requirements more like traditional generators. In 2015, testing was completed on a 20-MW AES plant in Puerto Rico, and a large amount of test data was produced and analyzed that demonstrates the ability of PV power plants to provide various types of new grid-friendly controls. This data showed how active power controls can expand PV's value from simply an intermittent energy resource to a provider of additional ancillary services for an isolated island grid. Specifically, the tests conducted included PV plant participation in automatic generation control, provision of droop response, and fast frequency response.

  7. Software engineering risk factors in the implementation of a small electronic medical record system: the problem of scalability.

    PubMed

    Chiang, Michael F; Starren, Justin B

    2002-01-01

    The successful implementation of clinical information systems is difficult. In examining the reasons and potential solutions for this problem, the medical informatics community may benefit from the lessons of a rich body of software engineering and management literature about the failure of software projects. Based on previous studies, we present a conceptual framework for understanding the risk factors associated with large-scale projects. However, the vast majority of the existing literature is based on large, enterprise-wide systems, and it is unclear whether those results may be scaled down and applied to smaller projects such as departmental medical information systems. To examine this issue, we discuss the case study of a delayed electronic medical record implementation project in a small specialty practice at Columbia-Presbyterian Medical Center. While the factors contributing to the delay of this small project share some attributes with those found in larger organizations, there are important differences. The significance of these differences for groups implementing small medical information systems is discussed.

  8. Improving Future Ecosystem Benefits through Earth Observations: the H2020 Project ECOPOTENTIAL

    NASA Astrophysics Data System (ADS)

    Provenzale, Antonello; Beierkuhnlein, Carl; Ziv, Guy

    2016-04-01

    Terrestrial and marine ecosystems provide essential goods and services to human societies. In recent decades, however, anthropogenic pressures have caused serious threats to ecosystem integrity, functions and processes, potentially leading to the loss of essential ecosystem services. ECOPOTENTIAL is a large European-funded H2020 project that focuses its activities on a targeted set of internationally recognised protected areas in Europe, European Territories and beyond, blending Earth Observations from remote sensing and field measurements, data analysis and modelling of current and future ecosystem conditions and services. The definition of future scenarios is based on climate and land-use change projections, addressing the issue of uncertainties and uncertainty propagation across the modelling chain. The ECOPOTENTIAL project addresses cross-scale geosphere-biosphere interactions and landscape-ecosystem dynamics at regional to continental scales, using geostatistical methods and emerging approaches in Macrosystem Ecology and Earth Critical Zone studies to address long-term and large-scale environmental and ecological challenges. The project started its activities in 2015 by defining a set of storylines that make it possible to tackle some of the most crucial issues in the assessment of present conditions and the estimation of the future state of selected ecosystem services. In this contribution, we focus on some of the main storylines of the project and discuss the general approach, focusing on the interplay of data and models and on the estimation of projection uncertainties.

  9. The PLX-α project: demonstrating the viability of spherically imploding plasma liners as an MIF driver

    NASA Astrophysics Data System (ADS)

    Hsu, S. C.; Witherspoon, F. D.; Cassibry, J. T.; Gilmore, M.; Samulyak, R.; Stoltz, P.; the PLX-α Team

    2015-11-01

    Under ARPA-E's ALPHA program, the Plasma Liner Experiment-ALPHA (PLX-α) project aims to demonstrate the viability and scalability of spherically imploding plasma liners as a standoff, high-implosion-velocity magneto-inertial-fusion (MIF) driver that is potentially compatible with both low- and high-β targets. The project has three major objectives: (1) advancing existing contoured-gap coaxial-gun technology to achieve higher operational reliability/precision and better control/reproducibility of plasma-jet properties and profiles; (2) conducting ~π/2-solid-angle plasma-liner experiments with 9 guns to demonstrate (along with extrapolations from modeling) that the jet-merging process leads to Mach-number degradation and liner uniformity that are acceptable for MIF; and (3) conducting 4π experiments with up to 60 guns to demonstrate the formation of an imploding spherical plasma liner for the first time, and to provide empirical ram-pressure and uniformity scaling data for benchmarking our codes and informing us whether the scalings justify further development beyond ALPHA. This talk will provide an overview of the PLX-α project as well as key research results to date. Supported by ARPA-E's ALPHA program; original PLX construction supported by DOE Fusion Energy Sciences.

  10. On large-scale dynamo action at high magnetic Reynolds number

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cattaneo, F.; Tobias, S. M., E-mail: smt@maths.leeds.ac.uk

    2014-07-01

    We consider the generation of magnetic activity—dynamo waves—in the astrophysical limit of very large magnetic Reynolds number. We consider kinematic dynamo action for a system consisting of helical flow and large-scale shear. We demonstrate that large-scale dynamo waves persist at high Rm if the helical flow is characterized by a narrow band of spatial scales and the shear is large enough. However, for a wide band of scales the dynamo becomes small scale with a further increase of Rm, with dynamo waves re-emerging only if the shear is then increased. We show that at high Rm, the key effect of the shear is to suppress small-scale dynamo action, allowing large-scale dynamo action to be observed. We conjecture that this supports a general 'suppression principle': large-scale dynamo action can only be observed if there is a mechanism that suppresses the small-scale fluctuations.

  11. Monitoring Ephemeral Streams Using Airborne Very High Resolution Multispectral Remote Sensing in Arid Environments

    NASA Astrophysics Data System (ADS)

    Hamada, Y.; O'Connor, B. L.

    2012-12-01

    Development in arid environments often results in the loss and degradation of the ephemeral streams that provide habitat and critical ecosystem functions such as water delivery, sediment transport, and groundwater recharge. Quantification of these ecosystem functions is challenging because of the episodic nature of runoff events in desert landscapes and the large spatial scale of watersheds that potentially can be impacted by large-scale development. Low-impact development guidelines and regulatory protection of ephemeral streams are often lacking due to the difficulty of accurately mapping and quantifying the critical functions of ephemeral streams at scales larger than individual reaches. Renewable energy development in arid regions has the potential to disturb ephemeral streams at the watershed scale, and it is necessary to develop environmental monitoring applications for ephemeral streams to help inform land management and regulatory actions aimed at protecting and mitigating for impacts related to large-scale land disturbances. This study focuses on developing remote sensing methodologies to identify and monitor impacts on ephemeral streams resulting from the land disturbance associated with utility-scale solar energy development in the desert southwest of the United States. Airborne very high resolution (VHR) multispectral imagery is used to produce stereoscopic, three-dimensional landscape models that can be used to (1) identify and map ephemeral stream channel networks, and (2) support analyses and models of hydrologic and sediment transport processes that pertain to the critical functionality of ephemeral streams. Spectral and statistical analyses are being developed to extract information about ephemeral channel location and extent, micro-topography, riparian vegetation, and soil moisture characteristics. 
This presentation will demonstrate initial results and provide a framework for future work associated with this project, for developing the field measurements necessary to verify remote sensing landscape models, and for generating hydrologic models and analyses.

  12. Prototype solar house. Study of the scientific evaluation and feasibility of a research and development project

    NASA Astrophysics Data System (ADS)

    Bundschuh, V.; Grueter, J. W.; Kleemann, M.; Melis, M.; Stein, H. J.; Wagner, H. J.; Dittrich, A.; Pohlmann, D.

    1982-08-01

    A preliminary study was undertaken before a large-scale project for the construction and survey of about a hundred solar houses was launched. The notion of a solar house was defined, and the uses of solar energy (hot water preparation, heating of rooms, heating of swimming pools, or a combination of these possibilities) were examined. A coherent measuring program was set up. Advantages and inconveniences of the large-scale project were reviewed. Production of hot water, evaluation of different concepts and different fabrications of solar systems, coverage of the different systems, conservation of energy, failure frequency and failure statistics, durability of the installation, and investment, maintenance, and energy costs were retained as study parameters. Different solar hot-water production systems and the heat counter used for measurements are described.

  13. Assessing the Feasibility of Large-Scale Countercyclical Public Job-Creation. Final Report, Volume III. Selected Implications of Public Job-Creation.

    ERIC Educational Resources Information Center

    Urban Inst., Washington, DC.

    This last of a three-volume report of a study done to assess the feasibility of large-scale, countercyclical public job creation covers the findings regarding the priorities among projects, indirect employment effects, skill imbalances, and administrative issues; and summarizes the overall findings, conclusions, and recommendations. (Volume 1,…

  14. First Large-Scale Proteogenomic Study of Breast Cancer Provides Insight into Potential Therapeutic Targets | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    News Release: May 25, 2016 — Building on data from The Cancer Genome Atlas (TCGA) project, a multi-institutional team of scientists has completed the first large-scale “proteogenomic” study of breast cancer, linking DNA mutations to protein signaling and helping pinpoint the genes that drive cancer.

  15. Projecting Images of the "Good" and the "Bad School": Top Scorers in Educational Large-Scale Assessments as Reference Societies

    ERIC Educational Resources Information Center

    Waldow, Florian

    2017-01-01

    Researchers interested in the global flow of educational ideas and programmes have long been interested in the role of so-called "reference societies." The article investigates how top scorers in large-scale assessments are framed as positive or negative reference societies in the education policy-making debate in German mass media and…

  16. Infrastructure for Large-Scale Quality-Improvement Projects: Early Lessons from North Carolina Improving Performance in Practice

    ERIC Educational Resources Information Center

    Newton, Warren P.; Lefebvre, Ann; Donahue, Katrina E.; Bacon, Thomas; Dobson, Allen

    2010-01-01

    Introduction: Little is known regarding how to accomplish large-scale health care improvement. Our goal is to improve the quality of chronic disease care in all primary care practices throughout North Carolina. Methods: Methods for improvement include (1) common quality measures and shared data system; (2) rapid cycle improvement principles; (3)…

  17. Environmental impact assessment and environmental audit in large-scale public infrastructure construction: the case of the Qinghai-Tibet Railway.

    PubMed

    He, Guizhen; Zhang, Lei; Lu, Yonglong

    2009-09-01

    Large-scale public infrastructure projects have featured in China's modernization course since the early 1980s. During the early stages of China's rapid economic development, public attention focused on the economic and social impact of high-profile construction projects. In recent years, however, we have seen a shift in public concern toward the environmental and ecological effects of such projects, and today governments are required to provide valid environmental impact assessments prior to allowing large-scale construction. The official requirement for the monitoring of environmental conditions has led to an increased number of debates in recent years regarding the effectiveness of Environmental Impact Assessments (EIAs) and Governmental Environmental Audits (GEAs) as environmental safeguards in instances of large-scale construction. Although EIA and GEA are conducted by different institutions and have different goals and enforcement potential, these two practices can be closely related in terms of methodology. This article cites the construction of the Qinghai-Tibet Railway as an instance in which EIA and GEA offer complementary approaches to environmental impact management. This study concludes that the GEA approach can serve as an effective follow-up to the EIA and establishes that the EIA lays a base for conducting future GEAs. The relationship that emerges through a study of the Railway's construction calls for more deliberate institutional arrangements and cooperation if the two practices are to be used in concert to optimal effect.

  18. Climatological temperature sensitivity of soil carbon turnover: Observations, simple scaling models, and ESMs

    NASA Astrophysics Data System (ADS)

    Koven, C. D.; Hugelius, G.; Lawrence, D. M.; Wieder, W. R.

    2016-12-01

    The projected loss of soil carbon to the atmosphere resulting from climate change is a potentially large but highly uncertain feedback to warming. The magnitude of this feedback is poorly constrained by observations and theory, and is disparately represented in Earth system models. To assess the likely long-term response of soils to climate change, spatial gradients in soil carbon turnover times can identify broad-scale and long-term controls on the rate of carbon cycling as a function of climate and other factors. Here we show that the climatological temperature control on carbon turnover in the top meter of global soils is more sensitive in cold climates than in warm ones. We present a simplified model that explains the high cold-climate sensitivity using only the physical scaling of soil freeze-thaw state across climate gradients. Critically, current Earth system models (ESMs) fail to capture this pattern; however, it emerges from an ESM that explicitly resolves vertical gradients in soil climate and turnover. The weak tropical temperature sensitivity emerges from a different model that explicitly resolves mineralogical control on decomposition. These results support projections of strong future carbon-climate feedbacks from northern soils and demonstrate a method for ESMs to capture this emergent behavior.

  19. Risk management in a large-scale CO2 geosequestration pilot project, Illinois, USA

    USGS Publications Warehouse

    Hnottavange-Telleen, K.; Chabora, E.; Finley, R.J.; Greenberg, S.E.; Marsteller, S.

    2011-01-01

    Like most large-scale infrastructure projects, carbon dioxide (CO2) geological sequestration (GS) projects have multiple success criteria and multiple stakeholders. In this context "risk evaluation" encompasses multiple scales, yet a risk management program aims to maximize the chance of project success by assessing, monitoring, and minimizing all risks in a consistent framework. The 150,000-km2 Illinois Basin underlies much of the state of Illinois, USA, and parts of adjacent Kentucky and Indiana. Its potential for CO2 storage is first-rate among basins in North America, an impression that has been strengthened by early testing of the injection well of the Midwest Geological Sequestration Consortium's (MGSC's) Phase III large-scale demonstration project, the Illinois Basin - Decatur Project (IBDP). The IBDP, funded by the U.S. Department of Energy's National Energy Technology Laboratory (NETL), represents a key trial of GS technologies and project-management techniques. Though risks are specific to each site and project, IBDP risk management methodologies provide valuable experience for future GS projects. IBDP views risk as the potential for negative impact to any of these five values: health and safety, environment, financial, advancing the viability and public acceptability of a GS industry, and research. Research goals include monitoring one million metric tonnes of injected CO2 in the subsurface. Risk management responds to the ways in which any values are at risk: for example, monitoring is designed to reduce uncertainties in parameter values that are important for research and system control, and is also designed to provide public assurance. Identified risks are the primary basis for risk-reduction measures: risks linked to uncertainty in geologic parameters guide further characterization work and guide simulations applied to performance evaluation.
Formally, industry defines risk (more precisely risk criticality) as the product L*S, the Likelihood multiplied by the Severity of negative impact. L and S are each evaluated on five-point scales, yielding a theoretical spread in risk values of 1 through 25. So defined, these judgment-based values are categorical and ordinal - they do not represent physically measurable quantities, but are nonetheless useful for comparison and therefore decision support. The "risk entities" first evaluated are FEPs - conceptual Features, Events, and Processes based on the list published by Quintessa Ltd. After concrete scenarios are generated based on selected FEPs, scenarios become the critical entities whose associated risks are evaluated and tracked. In IBDP workshops, L and S values for 123 FEPs were generated through expert elicitation. About 30 experts in the project or in GS in general were assigned among six facilitated working groups, and each group was charged to envision risks within a sphere of project operations. Working groups covered FEPs with strong spatial characteristics - such as those related to the injection wellbore and simulated plume footprint - and "nonspatial" FEPs related to finance, regulations, legal, and stakeholder issues. Within these working groups, experts shared information, examined assumptions, refined and extended the FEP list, calibrated responses, and provided initial L and S values by consensus. Individual rankings were collected in a follow-up process via emailed spreadsheets. For each of L and S, three values were collected: Lower Bound, Best Guess, and Upper Bound. The Lower-Upper Bound ranges and the spreads among experts can be interpreted to yield rough confidence measures. Based on experts' responses, FEPs were ranked in terms of their L*S risk levels. FEP rankings were determined from individual (not consensus or averaged) results, so that no high-risk responses were damped out.
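The L*S scheme described above lends itself to a compact sketch. The FEP names and scores below are hypothetical illustrations; following the text, each expert scores Likelihood and Severity on 1-5 scales, criticality is their product (range 1-25), and FEPs are ranked on individual rather than averaged responses so that no single high-risk judgment is damped out.

```python
def criticality(likelihood: int, severity: int) -> int:
    """Risk criticality L*S on the five-point scales described in the text."""
    assert 1 <= likelihood <= 5 and 1 <= severity <= 5
    return likelihood * severity

# Each (hypothetical) FEP maps to a list of (L, S) pairs, one per expert.
fep_scores = {
    "wellbore seal degradation": [(2, 4), (3, 5)],
    "induced seismicity": [(1, 5), (2, 3)],
    "stakeholder opposition": [(3, 2), (2, 2)],
}

# Rank by the maximum criticality any single expert assigned, so that one
# high-risk response is never averaged away.
ranked = sorted(
    fep_scores,
    key=lambda fep: max(criticality(l, s) for l, s in fep_scores[fep]),
    reverse=True,
)
print(ranked[0])  # → wellbore seal degradation (max criticality 3*5 = 15)
```

Extending this to the Lower Bound / Best Guess / Upper Bound triples from the elicitation is a matter of carrying three (L, S) products per expert instead of one.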
The higher-risk FEPs were used to generate one or more concrete, well defined risk-bearing scenarios for each FEP. Any FEP scored by any expert as having associated risk of

  20. Off-farm applications of solar energy in agriculture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berry, R.E.

    1980-01-01

    Food processing applications make up almost all present off-farm studies of solar energy in agriculture. Research, development, and demonstration projects on solar food processing have shown significant progress over the past 3 years. Projects have included computer simulation and mathematical models, hardware and process development for removing moisture from horticultural or animal products, integration of energy conservation with solar energy augmentation in conventional processes, and commercial-scale demonstrations. The demonstration projects include solar-heated air for drying prunes and raisins, soybeans, and onions/garlic; and solar-generated steam for orange juice pasteurization. Several new and planned projects hold considerable promise for commercial exploitation in future food processes.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leboeuf, C.; Taylor, R.W.; Corbus, D.

    A cooperative renewable energy project is underway between the U.S. Department of Energy (through the National Renewable Energy Laboratory, NREL) and the Federal Republic of Brazil (through the Centro de Pesquisas de Energia Eletrica, CEPEL). The objectives of this joint US/Brazilian program are to establish technical, institutional, and economic confidence in using renewable energy systems to meet the needs of the people of rural Brazil, to build ongoing partnerships beneficial to both countries, and to demonstrate the potential for large-scale rural electrification through the use of renewable energy systems. Phase 1 of this program resulted in the deployment of more than 700 photovoltaic (PV) electric lighting systems in the Brazilian states of Pernambuco and Ceara. Phase 2 of the program extends the pilot project into six additional Brazilian states and demonstrates a wider variety of stand-alone end uses, including the use of wind electric power generation for selected sites and applications. Phase 2 also includes the development of two hybrid village power systems, including one comprising PV, wind, battery, and diesel power sources. This paper focuses on this hybrid system, which is located in the Amazon River delta.

  2. Small-scale response in an avian community to a large-scale thinning project in the southwestern United States

    Treesearch

    Karen E. Bagne; Deborah M. Finch

    2009-01-01

    Avian populations were monitored using point counts from 2002 to 2007, two years before and four years after a 2800 ha fuel reduction project. The study area was within a ponderosa pine forest near Santa Fe, New Mexico, USA. Adjacent unthinned areas were also monitored as a reference for population variation related to other factors. For individual bird species...

  3. How do glacier inventory data aid global glacier assessments and projections?

    NASA Astrophysics Data System (ADS)

    Hock, R.

    2017-12-01

    Large-scale glacier modeling relies heavily on datasets that are collected by many individuals across the globe but managed and maintained in a coordinated fashion by international data centers. The Global Terrestrial Network for Glaciers (GTN-G) provides the framework for coordinating and making available a suite of data sets such as the Randolph Glacier Inventory (RGI), the Glacier Thickness Dataset or the World Glacier Inventory (WGI). These datasets have greatly increased our ability to assess global-scale glacier mass changes. These data have also been vital for projecting the mass changes of all mountain glaciers in the world outside the Greenland and Antarctic ice sheets, a total of more than 200,000 glaciers covering an area of more than 700,000 km2. Using forcing from 8 to 15 GCMs and 4 different emission scenarios, global-scale glacier evolution models project multi-model mean net mass losses of all glaciers between 7 cm and 24 cm sea-level equivalent by the end of the 21st century. Projected mass losses vary greatly depending on the choice of the forcing climate and emission scenario. Insufficiently constrained model parameters are likely an important reason for the large differences found among these studies even when forced by the same emission scenario, especially on regional scales.

  4. Correction of projective distortion in long-image-sequence mosaics without prior information

    NASA Astrophysics Data System (ADS)

    Yang, Chenhui; Mao, Hongwei; Abousleman, Glen; Si, Jennie

    2010-04-01

    Image mosaicking is the process of piecing together multiple video frames or still images from a moving camera to form a wide-area or panoramic view of the scene being imaged. Mosaics have widespread applications in many areas such as security surveillance, remote sensing, geographical exploration, agricultural field surveillance, virtual reality, digital video, and medical image analysis, among others. When mosaicking a large number of still images or video frames, the quality of the resulting mosaic is compromised by projective distortion. That is, during the mosaicking process, the image frames that are transformed and pasted to the mosaic become significantly scaled down and appear out of proportion with respect to the mosaic. As more frames continue to be transformed, important target information in the frames can be lost since the transformed frames become too small, which eventually makes it impossible to continue. Some projective distortion correction techniques make use of prior information such as GPS information embedded within the image, or camera internal and external parameters. Alternatively, this paper proposes a new algorithm to reduce the projective distortion without using any prior information whatsoever. Based on an analysis of the projective distortion, we approximate the projective matrix that describes the transformation between image frames using an affine model. Using singular value decomposition, we can deduce the affine model's scaling factor, which is usually very close to 1. By resetting the image scale of the affine model to 1, the transformed image size remains unchanged. Even though the proposed correction introduces some error in the image matching, this error is typically acceptable and, more importantly, the final mosaic preserves the original image size after transformation. We demonstrate the effectiveness of this new correction algorithm on two real-world unmanned air vehicle (UAV) sequences. The proposed method is shown to be effective and suitable for real-time implementation.
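    The scale-reset idea described in the abstract can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation: the function name and the use of the geometric mean of the singular values as the overall scale estimate are our assumptions.

    ```python
    import numpy as np

    def normalize_affine_scale(H):
        """Reset the overall scale of a 2x3 affine transform to 1.

        Sketch of the abstract's idea: approximate the inter-frame
        projective transform by an affine model, estimate its scale
        factor from the singular values of the 2x2 linear part, and
        divide it out so the transformed frame keeps its original size.
        """
        A = H[:2, :2]
        # The singular values give the axis scalings of the affine map;
        # their geometric mean serves as the overall scale factor.
        s = np.linalg.svd(A, compute_uv=False)
        scale = float(np.sqrt(s[0] * s[1]))
        H_out = H.astype(float).copy()
        H_out[:2, :2] = A / scale  # scale reset to 1; rotation/shear kept
        return H_out, scale

    # Example: an affine map that shrinks by 0.8 and rotates by 10 degrees
    theta = np.deg2rad(10.0)
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    H = np.zeros((2, 3))
    H[:2, :2] = 0.8 * R
    H_norm, scale = normalize_affine_scale(H)
    print(round(scale, 3))  # recovers the 0.8 shrink for this example
    ```

    After normalization, the linear part of `H_norm` is a pure rotation, so pasting the frame into the mosaic no longer shrinks it.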

  5. Nanomanufacturing : nano-structured materials made layer-by-layer.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, James V.; Cheng, Shengfeng; Grest, Gary Stephen

    Large-scale, high-throughput production of nano-structured materials (i.e., nanomanufacturing) is a strategic area in manufacturing, with markets projected to exceed $1T by 2015. Nanomanufacturing is still in its infancy; process/product developments are costly and only touch on potential opportunities enabled by growing nanoscience discoveries. The greatest promise for high-volume manufacturing lies in age-old coating and imprinting operations. For materials with tailored nm-scale structure, imprinting/embossing must be achieved at high speeds (roll-to-roll) and/or over large areas (batch operation) with feature sizes less than 100 nm. Dispersion coatings with nanoparticles can also tailor structure through self- or directed-assembly. Layered films structured with these processes have tremendous potential for efficient manufacturing of microelectronics, photovoltaics and other topical nano-structured devices. This project is designed to perform the requisite R&D to bring Sandia's technology base in computational mechanics to bear on this scale-up problem. Project focus is enforced by addressing a promising imprinting process currently being commercialized.

  6. Large-Scale Hybrid Motor Testing. Chapter 10

    NASA Technical Reports Server (NTRS)

    Story, George

    2006-01-01

    Hybrid rocket motors can be successfully demonstrated at a small scale virtually anywhere. There have been many suitcase-sized portable test stands assembled for demonstration of hybrids, which show the safety of hybrid rockets to audiences. These small show motors and small laboratory-scale motors can give comparative burn rate data for development of different fuel/oxidizer combinations. However, the questions that are always asked when hybrids are mentioned for large-scale applications are: how do they scale, and has it been shown in a large motor? To answer those questions, large-scale motor testing is required to verify the hybrid motor at its true size. The necessity to conduct large-scale hybrid rocket motor tests to validate the burn rate from the small motors to application size has been documented in several places. Comparison of small-scale hybrid data to that of larger-scale data indicates that the fuel burn rate goes down with increasing port size, even with the same oxidizer flux. This trend holds for conventional hybrid motors with forward oxidizer injection and HTPB-based fuels. While the reason this occurs would make a great paper, study, or thesis, it is not thoroughly understood at this time. Potential causes include the fact that, since hybrid combustion is boundary-layer driven, the larger port sizes reduce the interaction (radiation, mixing and heat transfer) from the core region of the port. This chapter focuses on some of the large, prototype-sized testing of hybrid motors. The largest motors tested have been AMROC's 250K-lbf thrust motor at Edwards Air Force Base and the Hybrid Propulsion Demonstration Program's 250K-lbf thrust motor at Stennis Space Center. Numerous smaller tests were performed to support the burn rate, stability and scaling concepts that went into the development of those large motors.

  7. Bulk silicon as photonic dynamic infrared scene projector

    NASA Astrophysics Data System (ADS)

    Malyutenko, V. K.; Bogatyrenko, V. V.; Malyutenko, O. Yu.

    2013-04-01

    A Si-based fast (frame rate >1 kHz), large-scale (scene area 100 cm²), broadband (3-12 μm), dynamic contactless infrared (IR) scene projector is demonstrated. An IR movie appears on a scene because of the conversion of a visible scenario projected at a scene kept at elevated temperature. Light down-conversion comes as a result of free-carrier generation in a bulk Si scene, followed by modulation of its thermal emission output in the spectral band of free-carrier absorption. The experimental setup, an IR movie, figures of merit, and the process's advantages in comparison to other projector technologies are discussed.

  8. Citizens unite for computational immunology!

    PubMed

    Belden, Orrin S; Baker, Sarah Catherine; Baker, Brian M

    2015-07-01

    Recruiting volunteers who can provide computational time, programming expertise, or puzzle-solving talent has emerged as a powerful tool for biomedical research. Recent projects demonstrate the potential for such 'crowdsourcing' efforts in immunology. Tools for developing applications, new funding opportunities, and an eager public make crowdsourcing a serious option for creative solutions for computationally-challenging problems. Expanded uses of crowdsourcing in immunology will allow for more efficient large-scale data collection and analysis. It will also involve, inspire, educate, and engage the public in a variety of meaningful ways. The benefits are real - it is time to jump in! Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Growing up and Growing out: Emerging Adults Learn Management through Service-Learning

    ERIC Educational Resources Information Center

    Fairfield, Kent D.

    2010-01-01

    This article describes a journey introducing service-learning based on large-scale projects in an undergraduate management curriculum, leading to supplementing this approach with more conventional small-group projects. It outlines some of the foundation for service-learning. Having students undertake a single class-wide project offers distinctive…

  10. Primary Teachers Conducting Inquiry Projects: Effects on Attitudes towards Teaching Science and Conducting Inquiry

    ERIC Educational Resources Information Center

    van Aalderen-Smeets, Sandra I.; Walma van der Molen, Juliette H.; van Hest, Erna G. W. C. M.; Poortman, Cindy

    2017-01-01

    This study used an experimental, pretest-posttest control group design to investigate whether participation in a large-scale inquiry project would improve primary teachers' attitudes towards teaching science and towards conducting inquiry. The inquiry project positively affected several elements of teachers' attitudes. Teachers felt less anxious…

  11. Dynamic system simulation of small satellite projects

    NASA Astrophysics Data System (ADS)

    Raif, Matthias; Walter, Ulrich; Bouwmeester, Jasper

    2010-11-01

    A prerequisite to accomplish a system simulation is to have a system model holding all necessary project information in a centralized repository that can be accessed and edited by all parties involved. At the Institute of Astronautics of the Technische Universitaet Muenchen, a modular approach for modeling and dynamic simulation of satellite systems, called dynamic system simulation (DySyS), has been developed. DySyS is based on the platform-independent description language SysML to model a small satellite project with respect to system composition and dynamic behavior. A library of specific building blocks and possible relations between these blocks has been developed. From this library, a system model of the satellite of interest can be created. A mapping into a C++ simulation allows the creation of an executable system model with which simulations are performed to observe the dynamic behavior of the satellite. In this paper, DySyS is used to model and simulate the dynamic behavior of small satellites, because small satellite projects can act as a precursor to demonstrate the feasibility of a system model, since they are less complex than a large-scale satellite project.

  12. The CODATwins Project: The Cohort Description of Collaborative Project of Development of Anthropometrical Measures in Twins to Study Macro-Environmental Variation in Genetic and Environmental Effects on Anthropometric Traits.

    PubMed

    Silventoinen, Karri; Jelenkovic, Aline; Sund, Reijo; Honda, Chika; Aaltonen, Sari; Yokoyama, Yoshie; Tarnoki, Adam D; Tarnoki, David L; Ning, Feng; Ji, Fuling; Pang, Zengchang; Ordoñana, Juan R; Sánchez-Romera, Juan F; Colodro-Conde, Lucia; Burt, S Alexandra; Klump, Kelly L; Medland, Sarah E; Montgomery, Grant W; Kandler, Christian; McAdams, Tom A; Eley, Thalia C; Gregory, Alice M; Saudino, Kimberly J; Dubois, Lise; Boivin, Michel; Haworth, Claire M A; Plomin, Robert; Öncel, Sevgi Y; Aliev, Fazil; Stazi, Maria A; Fagnani, Corrado; D'Ippolito, Cristina; Craig, Jeffrey M; Saffery, Richard; Siribaddana, Sisira H; Hotopf, Matthew; Sumathipala, Athula; Spector, Timothy; Mangino, Massimo; Lachance, Genevieve; Gatz, Margaret; Butler, David A; Bayasgalan, Gombojav; Narandalai, Danshiitsoodol; Freitas, Duarte L; Maia, José Antonio; Harden, K Paige; Tucker-Drob, Elliot M; Christensen, Kaare; Skytthe, Axel; Kyvik, Kirsten O; Hong, Changhee; Chong, Youngsook; Derom, Catherine A; Vlietinck, Robert F; Loos, Ruth J F; Cozen, Wendy; Hwang, Amie E; Mack, Thomas M; He, Mingguang; Ding, Xiaohu; Chang, Billy; Silberg, Judy L; Eaves, Lindon J; Maes, Hermine H; Cutler, Tessa L; Hopper, John L; Aujard, Kelly; Magnusson, Patrik K E; Pedersen, Nancy L; Aslan, Anna K Dahl; Song, Yun-Mi; Yang, Sarah; Lee, Kayoung; Baker, Laura A; Tuvblad, Catherine; Bjerregaard-Andersen, Morten; Beck-Nielsen, Henning; Sodemann, Morten; Heikkilä, Kauko; Tan, Qihua; Zhang, Dongfeng; Swan, Gary E; Krasnow, Ruth; Jang, Kerry L; Knafo-Noam, Ariel; Mankuta, David; Abramson, Lior; Lichtenstein, Paul; Krueger, Robert F; McGue, Matt; Pahlen, Shandell; Tynelius, Per; Duncan, Glen E; Buchwald, Dedra; Corley, Robin P; Huibregtse, Brooke M; Nelson, Tracy L; Whitfield, Keith E; Franz, Carol E; Kremen, William S; Lyons, Michael J; Ooki, Syuichi; Brandt, Ingunn; Nilsen, Thomas Sevenius; Inui, Fujio; Watanabe, Mikio; Bartels, Meike; van Beijsterveldt, Toos C E M; Wardle, Jane; Llewellyn, Clare H; Fisher, Abigail; Rebato, 
Esther; Martin, Nicholas G; Iwatani, Yoshinori; Hayakawa, Kazuo; Rasmussen, Finn; Sung, Joohon; Harris, Jennifer R; Willemsen, Gonneke; Busjahn, Andreas; Goldberg, Jack H; Boomsma, Dorret I; Hur, Yoon-Mi; Sørensen, Thorkild I A; Kaprio, Jaakko

    2015-08-01

    For over 100 years, the genetics of human anthropometric traits has attracted scientific interest. In particular, height and body mass index (BMI, calculated as kg/m2) have been under intensive genetic research. However, it is still largely unknown whether and how heritability estimates vary between human populations. Opportunities to address this question have increased recently because of the establishment of many new twin cohorts and the increasing accumulation of data in established twin cohorts. We started a new research project to systematically analyze (1) the variation of heritability estimates of height, BMI and their trajectories over the life course between birth cohorts, ethnicities and countries, and (2) the effects of birth-related factors, education and smoking on these anthropometric traits and whether these effects vary between twin cohorts. We identified 67 twin projects, including both monozygotic (MZ) and dizygotic (DZ) twins, using various sources. We asked for individual level data on height and weight including repeated measurements, birth related traits, background variables, education and smoking. By the end of 2014, 48 projects participated. Together, we have 893,458 height and weight measures (52% females) from 434,723 twin individuals, including 201,192 complete twin pairs (40% monozygotic, 40% same-sex dizygotic and 20% opposite-sex dizygotic) representing 22 countries. This project demonstrates that large-scale international twin studies are feasible and can promote the use of existing data for novel research purposes.
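    The abstract does not specify an estimator, but the reason MZ/DZ comparisons identify heritability can be illustrated with the classical Falconer decomposition. This is a textbook back-of-envelope sketch, not the CODATwins analysis (which fits full variance-component twin models); the correlation values below are made up for illustration.

    ```python
    def falconer_heritability(r_mz, r_dz):
        """Classical Falconer estimate: h^2 = 2 * (r_MZ - r_DZ).

        MZ twins share ~100% of segregating genes and DZ twins ~50%,
        so doubling the excess MZ pair correlation isolates the
        additive-genetic share of trait variance. Illustrative only.
        """
        h2 = 2.0 * (r_mz - r_dz)   # additive genetic component
        c2 = r_mz - h2             # shared-environment component
        e2 = 1.0 - r_mz            # unique-environment component
        return h2, c2, e2

    # Hypothetical height-like pair correlations, not CODATwins results
    print(falconer_heritability(0.86, 0.45))
    ```

    Comparing such estimates across birth cohorts and countries is exactly the kind of macro-environmental variation the project is set up to study.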

  13. The CODAtwins project: the cohort description of COllaborative project of Development of Anthropometrical measures in Twins to study macro-environmental variation in genetic and environmental effects on anthropometric traits

    PubMed Central

    Silventoinen, Karri; Jelenkovic, Aline; Sund, Reijo; Honda, Chika; Aaltonen, Sari; Yokoyama, Yoshie; Tarnoki, Adam D; Tarnoki, David L; Ning, Feng; Ji, Fuling; Pang, Zengchang; Ordoñana, Juan R; Sánchez-Romera, Juan F; Colodro-Conde, Lucia; Burt, S Alexandra; Klump, Kelly L; Medland, Sarah E; Montgomery, Grant W; Kandler, Christian; McAdams, Tom A; Eley, Thalia C; Gregory, Alice M; Saudino, Kimberly J; Dubois, Lise; Boivin, Michel; Haworth, Claire MA; Plomin, Robert; Öncel, Sevgi Y; Aliev, Fazil; Stazi, Maria A; Fagnani, Corrado; D'Ippolito, Cristina; Craig, Jeffrey M; Saffery, Richard; Siribaddana, Sisira H; Hotopf, Matthew; Sumathipala, Athula; Spector, Timothy; Mangino, Massimo; Lachance, Genevieve; Gatz, Margaret; Butler, David A; Bayasgalan, Gombojav; Narandalai, Danshiitsoodol; Freitas, Duarte L; Maia, José Antonio; Harden, K Paige; Tucker-Drob, Elliot M; Christensen, Kaare; Skytthe, Axel; Kyvik, Kirsten O; Hong, Changhee; Chong, Youngsook; Derom, Catherine A; Vlietinck, Robert F; Loos, Ruth JF; Cozen, Wendy; Hwang, Amie E; Mack, Thomas M; He, Mingguang; Ding, Xiaohu; Chang, Billy; Silberg, Judy L; Eaves, Lindon J; Maes, Hermine H; Cutler, Tessa L; Hopper, John L; Aujard, Kelly; Magnusson, Patrik KE; Pedersen, Nancy L; Dahl-Aslan, Anna K; Song, Yun-Mi; Yang, Sarah; Lee, Kayoung; Baker, Laura A; Tuvblad, Catherine; Bjerregaard-Andersen, Morten; Beck-Nielsen, Henning; Sodemann, Morten; Heikkilä, Kauko; Tan, Qihua; Zhang, Dongfeng; Swan, Gary E; Krasnow, Ruth; Jang, Kerry L; Knafo-Noam, Ariel; Mankuta, David; Abramson, Lior; Lichtenstein, Paul; Krueger, Robert F; McGue, Matt; Pahlen, Shandell; Tynelius, Per; Duncan, Glen E; Buchwald, Dedra; Corley, Robin P; Huibregtse, Brooke M; Nelson, Tracy L; Whitfield, Keith E; Franz, Carol E; Kremen, William S; Lyons, Michael J; Ooki, Syuichi; Brandt, Ingunn; Nilsen, Thomas Sevenius; Inui, Fujio; Watanabe, Mikio; Bartels, Meike; van Beijsterveldt, Toos CEM; Wardle, Jane; Llewellyn, Clare H; Fisher, Abigail; Rebato, 
Esther; Martin, Nicholas G; Iwatani, Yoshinori; Hayakawa, Kazuo; Rasmussen, Finn; Sung, Joohon; Harris, Jennifer R; Willemsen, Gonneke; Busjahn, Andreas; Goldberg, Jack H; Boomsma, Dorret I; Hur, Yoon-Mi; Sørensen, Thorkild IA; Kaprio, Jaakko

    2015-01-01

    For over one hundred years, the genetics of human anthropometric traits has attracted scientific interest. In particular, height and body mass index (BMI, calculated as kg/m2) have been under intensive genetic research. However, it is still largely unknown whether and how heritability estimates vary between human populations. Opportunities to address this question have increased recently because of the establishment of many new twin cohorts and the increasing accumulation of data in established twin cohorts. We started a new research project to systematically analyze (1) the variation of heritability estimates of height, BMI and their trajectories over the life course between birth cohorts, ethnicities and countries, and (2) the effects of birth related factors, education and smoking on these anthropometric traits and whether these effects vary between twin cohorts. We identified 67 twin projects including both monozygotic and dizygotic twins using various sources. We asked for individual level data on height and weight including repeated measurements, birth related traits, background variables, education and smoking. By the end of 2014, 48 projects participated. Together, we have 893,458 height and weight measures (52% females) from 434,723 twin individuals, including 201,192 complete twin pairs (40% monozygotic, 40% same-sex dizygotic and 20% opposite-sex dizygotic) representing 22 countries. This project demonstrates that large-scale international twin studies are feasible and can promote the use of existing data for novel research purposes. PMID:26014041

  14. Subsurface Monitoring of CO2 Sequestration - A Review and Look Forward

    NASA Astrophysics Data System (ADS)

    Daley, T. M.

    2012-12-01

    The injection of CO2 into subsurface formations is at least 50 years old, with large-scale utilization of CO2 for enhanced oil recovery (CO2-EOR) beginning in the 1970s. Early monitoring efforts had limited measurements in available boreholes. With growing interest in CO2 sequestration beginning in the 1990s, along with growth in geophysical reservoir monitoring, small to mid-size sequestration monitoring projects began to appear. The overall goals of a subsurface monitoring plan are to provide measurement of CO2-induced changes in subsurface properties at a range of spatial and temporal scales. The range of spatial scales allows tracking of the location and saturation of the plume with varying detail, while finer temporal sampling (up to continuous) allows better understanding of dynamic processes (e.g. multi-phase flow) and constraining of reservoir models. Early monitoring of small-scale pilots associated with CO2-EOR (e.g., the McElroy field and the Lost Hills field) developed many of the methodologies, including tomographic imaging and multi-physics measurements. Large (reservoir) scale sequestration monitoring began with the Sleipner and Weyburn projects. Typically, large-scale monitoring, such as 4D surface seismic, has limited temporal sampling due to costs. Smaller-scale pilots can allow more frequent measurements, either as individual time-lapse 'snapshots' or as continuous monitoring. Pilot monitoring examples include the Frio, Nagaoka and Otway pilots using repeated well logging, crosswell imaging, vertical seismic profiles and CASSM (continuous active-source seismic monitoring). For saline reservoir sequestration projects, there is typically integration of characterization and monitoring, since the sites are not pre-characterized resource developments (oil or gas), which reinforces the need for multi-scale measurements. As we move beyond pilot sites, we need to quantify CO2 plume and reservoir properties (e.g. pressure) over large scales, while still obtaining high resolution. Typically the high-resolution (spatial and temporal) tools are deployed in permanent or semi-permanent borehole installations, where special well design may be necessary, such as non-conductive casing for electrical surveys. Effective utilization of monitoring wells requires an approach of modular borehole monitoring (MBM) where multiple measurements can be made. An example is recent work at the Citronelle pilot injection site, where an MBM package with seismic, fluid sampling and distributed fiber sensing was deployed. For future large-scale sequestration monitoring, an adaptive borehole-monitoring program is proposed.

  15. Modelling Contributions of the Local and Regional Groundwater Flow of Managed Aquifer Recharge Activities at Querença-Silves Aquifer System.

    NASA Astrophysics Data System (ADS)

    Costa, Luís; Monteiro, José Paulo; Oliveira, Manuel; Mota, Rogério; Lobo-Ferreira, João Paulo; Martins de Carvalho, José; Martins de Carvalho, Tiago; Agostinho, Rui; Hugman, Rui

    2015-04-01

    The Querença-Silves (QS) aquifer system is one of the most important natural groundwater reservoirs in the Algarve region of southern Portugal. With a surface area of 324 km², this karst aquifer system is the main source of supply for irrigation as well as an important source of water for the urban supply. Due to the importance given to the QS aquifer system by both governmental actors and end users, ongoing research during the last two decades at the University of Algarve has attempted to provide a better understanding of its hydrogeology and hydraulic behavior, which has resulted in the development of regional-scale numerical models. The most recent hydrogeological data have been acquired during the ongoing MARSOL project (MARSOL-GA-2013-619120), which aims to demonstrate that Managed Aquifer Recharge (MAR) is a sound, safe and sustainable strategy that can be applied with great confidence in finding solutions to water scarcity in Southern Europe. Within the scope of the project, large-diameter well injection tests (with and without tracers) as well as geophysical surveys have been carried out in order to determine the infiltration capacity and aquifer properties, the results of which allowed the use of analytical methods to determine local-scale values of hydraulic parameters (e.g. hydraulic conductivity and storage coefficient). These values will be compared with results from pre-existing numerical flow and transport models in order to obtain complementary solutions to the problem at local and regional scales. This analysis will contribute to the selection of the most appropriate methods to interpret, reproduce and model the impacts of MAR activities planned within the scope of the MARSOL project. Subsequent to the planned injection tests and with the support of modelling efforts, the capacity of infiltration of rejected water from water treatment plants or surface storage dams in the large-diameter well will be assessed.

  16. Energy transfers in large-scale and small-scale dynamos

    NASA Astrophysics Data System (ADS)

    Samtaney, Ravi; Kumar, Rohit; Verma, Mahendra

    2015-11-01

    We present the energy transfers, mainly energy fluxes and shell-to-shell energy transfers, in small-scale dynamo (SSD) and large-scale dynamo (LSD) using numerical simulations of MHD turbulence for Pm = 20 (SSD) and Pm = 0.2 (LSD) on a 1024³ grid. For SSD, we demonstrate that the magnetic energy growth is caused by nonlocal energy transfers from the large-scale or forcing-scale velocity field to the small-scale magnetic field. The peak of these energy transfers moves towards lower wavenumbers as the dynamo evolves, which is the reason for the growth of the magnetic fields at the large scales. The energy transfers U2U (velocity to velocity) and B2B (magnetic to magnetic) are forward and local. For LSD, we show that the magnetic energy growth takes place via energy transfers from the large-scale velocity field to the large-scale magnetic field. We observe forward U2U and B2B energy fluxes, similar to SSD.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winkler, Mirko S., E-mail: mirko.winkler@unibas.c; NewFields, LLC, Pretoria 0062; Divall, Mark J., E-mail: mdivall@newfields.co

    In the developing world, large-scale projects in the extractive industry and natural resources sectors are often controversial and associated with long-term adverse health consequences to local communities. In many industrialised countries, health impact assessment (HIA) has been institutionalized for the mitigation of anticipated negative health effects while enhancing the benefits of projects, programmes and policies. However, in developing country settings, relatively few HIAs have been performed. Hence, more HIAs with a focus on low- and middle-income countries are needed to advance and refine tools and methods for impact assessment and subsequent mitigation measures. We present a promising HIA approach, developed within the frame of a large gold-mining project in the Democratic Republic of the Congo. The articulation of environmental health areas, the spatial delineation of potentially affected communities and the use of a diversity of sources to obtain quality baseline health data are utilized for risk profiling. We demonstrate how these tools and data are fed into a risk analysis matrix, which facilitates ranking of potential health impacts for subsequent prioritization of mitigation strategies. The outcomes encapsulate a multitude of environmental and health determinants in a systematic manner, and will assist decision-makers in the development of mitigation measures that minimize potential adverse health effects and enhance positive ones.

  18. Universal distribution of component frequencies in biological and technological systems

    PubMed Central

    Pang, Tin Yau; Maslov, Sergei

    2013-01-01

    Bacterial genomes and large-scale computer software projects both consist of a large number of components (genes or software packages) connected via a network of mutual dependencies. Components can be easily added or removed from individual systems, and their use frequencies vary over many orders of magnitude. We study this frequency distribution in genomes of ∼500 bacterial species and in over 2 million Linux computers and find that in both cases it is described by the same scale-free power-law distribution with an additional peak near the tail of the distribution corresponding to nearly universal components. We argue that the existence of a power law distribution of frequencies of components is a general property of any modular system with a multilayered dependency network. We demonstrate that the frequency of a component is positively correlated with its dependency degree given by the total number of upstream components whose operation directly or indirectly depends on the selected component. The observed frequency/dependency degree distributions are reproduced in a simple mathematically tractable model introduced and analyzed in this study. PMID:23530195
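    The abstract's "dependency degree" of a component is the total number of components whose operation directly or indirectly depends on it. A minimal sketch on a toy dependency graph (the component names and graph are invented for illustration, not from the paper's data):

    ```python
    # deps maps each component to the components it directly requires.
    deps = {
        "app1": {"libA"},
        "app2": {"libA", "libB"},
        "libA": {"core"},
        "libB": {"core"},
        "core": set(),
    }

    def dependency_degree(deps):
        """Count, for each component, how many components depend on it
        directly or indirectly (the abstract's dependency degree)."""
        # Reverse the graph: users[c] = components that directly require c.
        users = {c: set() for c in deps}
        for c, required in deps.items():
            for r in required:
                users[r].add(c)

        def dependents(c, seen=None):
            # Walk the reversed graph transitively.
            seen = set() if seen is None else seen
            for u in users[c]:
                if u not in seen:
                    seen.add(u)
                    dependents(u, seen)
            return seen

        return {c: len(dependents(c)) for c in deps}

    print(dependency_degree(deps))
    # "core" is (indirectly) required by every other component, so it has
    # the highest dependency degree -- a near-universal component of the
    # kind that produces the peak in the tail of the frequency distribution.
    ```

    In the paper's picture, components with high dependency degree (like `core` here) are the ones installed nearly everywhere, while the long power-law body is made of components few others depend on.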

  19. Evaluating a collaborative IT based research and development project.

    PubMed

    Khan, Zaheer; Ludlow, David; Caceres, Santiago

    2013-10-01

    In common with all projects, evaluating an Information Technology (IT) based research and development project is necessary in order to discover whether or not the outcomes of the project are successful. However, evaluating large-scale collaborative projects is especially difficult as: (i) stakeholders from different countries are involved who, almost inevitably, have diverse technological and/or application domain backgrounds and objectives; (ii) multiple and sometimes conflicting application-specific and user-defined requirements exist; and (iii) multiple and often conflicting technological research and development objectives are apparent. In this paper, we share our experiences based on the large-scale integrated research project - the HUMBOLDT project - with a project duration of 54 months, involving contributions from 27 partner organisations plus 4 sub-contractors from 14 different European countries. In the HUMBOLDT project, a specific evaluation methodology was defined and utilised for the user evaluation of the project outcomes. The user evaluation performed on the HUMBOLDT Framework and its associated nine application scenarios from various application domains not only resulted in an evaluation of the integrated project, but also revealed the benefits and disadvantages of the evaluation methodology. This paper presents the evaluation methodology, discusses in detail the process of applying it to the HUMBOLDT project and provides an in-depth analysis of the results, which can be usefully applied to other collaborative research projects in a variety of domains. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. MIGHTEE: The MeerKAT International GHz Tiered Extragalactic Exploration

    NASA Astrophysics Data System (ADS)

    Taylor, A. Russ; Jarvis, Matt

    2017-05-01

    The MeerKAT telescope is the precursor of the Square Kilometre Array mid-frequency dish array to be deployed later this decade on the African continent. MIGHTEE is one of the MeerKAT large survey projects designed to pathfind SKA key science in cosmology and galaxy evolution. Through a tiered radio continuum deep imaging project, including several fields totaling 20 square degrees to microJy sensitivities and an ultra-deep image of a single 1-square-degree field of view, MIGHTEE will explore dark matter and large-scale structure, the evolution of galaxies, including AGN activity and star formation as a function of cosmic time and environment, the emergence and evolution of magnetic fields in galaxies, and the magnetic counterpart to the large-scale structure of the universe.

  1. Analysis on the Critical Rainfall Value For Predicting Large Scale Landslides Caused by Heavy Rainfall In Taiwan.

    NASA Astrophysics Data System (ADS)

    Tsai, Kuang-Jung; Chiang, Jie-Lun; Lee, Ming-Hsi; Chen, Yie-Ruey

    2017-04-01

    More than 2,900 mm of accumulated rainfall was recorded within 3 consecutive days during Typhoon Morakot in August 2009. Very serious landslides and sediment-related disasters were induced by this heavy rainfall event. The satellite image analysis project conducted by the Soil and Water Conservation Bureau after the Morakot event identified more than 10,904 landslide sites with a total sliding area of 18,113 ha. At the same time, all severe sediment-related disaster areas were characterized based on their disaster type, scale, topography, major bedrock formations and geologic structures during the period of extremely heavy rainfall events that occurred in southern Taiwan. Characteristics and mechanisms of large-scale landslides are collected on the basis of field investigation technology integrated with GPS/GIS/RS techniques. In order to decrease the risk of large-scale landslides on slope land, a slope-land conservation strategy and a critical rainfall database should be set up and executed as soon as possible. Meanwhile, establishing a critical rainfall value for predicting large-scale landslides induced by heavy rainfall has become an important issue of serious concern to the government and the people of Taiwan. The mechanism of large-scale landslides, rainfall frequency analysis, sediment budget estimation and river hydraulic analysis under the conditions of extreme climate change during the past 10 years are recognized as required issues by this research. Hopefully, the results developed from this research can be used as a warning system for predicting large-scale landslides in southern Taiwan. Keywords: heavy rainfall, large-scale landslides, critical rainfall value

  2. Evaluation of Subgrid-Scale Models for Large Eddy Simulation of Compressible Flows

    NASA Technical Reports Server (NTRS)

    Blaisdell, Gregory A.

    1996-01-01

    The objective of this project was to evaluate and develop subgrid-scale (SGS) turbulence models for large eddy simulations (LES) of compressible flows. During the first phase of the project results from LES using the dynamic SGS model were compared to those of direct numerical simulations (DNS) of compressible homogeneous turbulence. The second phase of the project involved implementing the dynamic SGS model in a NASA code for simulating supersonic flow over a flat-plate. The model has been successfully coded and a series of simulations has been completed. One of the major findings of the work is that numerical errors associated with the finite differencing scheme used in the code can overwhelm the SGS model and adversely affect the LES results. Attached to this overview are three submitted papers: 'Evaluation of the Dynamic Model for Simulations of Compressible Decaying Isotropic Turbulence'; 'The effect of the formulation of nonlinear terms on aliasing errors in spectral methods'; and 'Large-Eddy Simulation of a Spatially Evolving Compressible Boundary Layer Flow'.

  3. Linear time algorithms to construct populations fitting multiple constraint distributions at genomic scales.

    PubMed

    Siragusa, Enrico; Haiminen, Niina; Utro, Filippo; Parida, Laxmi

    2017-10-09

    Computer simulations can be used to study population genetic methods, models and parameters, as well as to predict potential outcomes. For example, in plant populations, predicting the outcome of breeding operations can be studied using simulations. In-silico construction of populations with pre-specified characteristics is an important task in breeding optimization and other population genetic studies. We present two linear time Simulation using Best-fit Algorithms (SimBA) for two classes of problems, where each co-fits two distributions: SimBA-LD fits linkage disequilibrium and minimum allele frequency distributions, while SimBA-hap fits founder-haplotype and polyploid allele dosage distributions. An incremental gap-filling version of the previously introduced SimBA-LD is here demonstrated to accurately fit the target distributions, allowing efficient large-scale simulations. SimBA-hap accuracy and efficiency are demonstrated by simulating tetraploid populations with varying numbers of founder haplotypes; we evaluate both a linear time greedy algorithm and an optimal solution based on mixed-integer programming. SimBA is available on http://researcher.watson.ibm.com/project/5669.

  4. A Large-Scale Inquiry-Based Astronomy Intervention Project: Impact on Students' Content Knowledge Performance and Views of Their High School Science Classroom

    ERIC Educational Resources Information Center

    Fitzgerald, Michael; McKinnon, David H.; Danaia, Lena; Deehan, James

    2016-01-01

    In this paper, we present the results from a study of the impact on students involved in a large-scale inquiry-based astronomical high school education intervention in Australia. Students in this intervention were led through an educational design allowing them to undertake an investigative approach to understanding the lifecycle of stars more…

  5. Applying a framework for assessing the health system challenges to scaling up mHealth in South Africa

    PubMed Central

    2012-01-01

    Background Mobile phone technology has demonstrated the potential to improve health service delivery, but there is little guidance to inform decisions about acquiring and implementing mHealth technology at scale in health systems. Using the case of community-based health services (CBS) in South Africa, we apply a framework to appraise the opportunities and challenges to effective implementation of mHealth at scale in health systems. Methods A qualitative study reviewed the benefits and challenges of mHealth in community-based services in South Africa, through a combination of key informant interviews, site visits to local projects and document reviews. Using a framework adapted from three approaches to reviewing sustainable information and communication technology (ICT), the lessons from local experience and elsewhere formed the basis of a wider consideration of scale-up challenges in South Africa. Results Four key system dimensions were identified and assessed: government stewardship and the organisational, technological and financial systems. In South Africa, the opportunities for successful implementation of mHealth include the high prevalence of mobile phones, a supportive policy environment for eHealth, successful use of mHealth for CBS in a number of projects and a well-developed ICT industry. However, there are weaknesses in other key health systems areas such as organisational culture and capacity for using health information for management, and the poor availability and use of ICT in primary health care. The technological challenges include the complexity of ensuring interoperability and integration of information systems and securing privacy of information. Finally, there are the challenges of sustainable financing required for large-scale use of mobile phone technology in resource-limited settings.
Conclusion Against a background of a health system with a weak ICT environment and limited implementation capacity, it remains uncertain that the potential benefits of mHealth for CBS would be retained with immediate large-scale implementation. Applying a health systems framework facilitated a systematic appraisal of potential challenges to scaling up mHealth for CBS in South Africa and may be useful for policy and practice decision-making in other low- and middle-income settings. PMID:23126370

  6. Effectiveness and cost-effectiveness of telehealthcare for chronic obstructive pulmonary disease: study protocol for a cluster randomized controlled trial.

    PubMed

    Udsen, Flemming Witt; Lilholt, Pernille Heyckendorff; Hejlesen, Ole; Ehlers, Lars Holger

    2014-05-21

    Several feasibility studies show promising results of telehealthcare on health outcomes and health-related quality of life for patients suffering from chronic obstructive pulmonary disease, and some of these studies show that telehealthcare may even lower healthcare costs. However, the only large-scale trial so far - the Whole System Demonstrator Project in England - has raised doubts about these results, since it concluded that telehealthcare as a supplement to usual care is not likely to be cost-effective compared with usual care alone. The present study, known as 'TeleCare North' in Denmark, seeks to address these doubts by implementing a large-scale, pragmatic, cluster-randomized trial with nested economic evaluation. The purpose of the study is to assess the effectiveness and the cost-effectiveness of a telehealth solution for patients suffering from chronic obstructive pulmonary disease compared to usual practice. General practitioners will be responsible for recruiting eligible participants (1,200 participants are expected) for the trial in the geographical area of the North Denmark Region. Twenty-six municipality districts in the region define the randomization clusters. The primary outcomes are changes in health-related quality of life and the incremental cost-effectiveness ratio measured from baseline to follow-up at 12 months. Secondary outcomes are changes in mortality and physiological indicators (diastolic and systolic blood pressure, pulse, oxygen saturation, and weight). There has been a call for large-scale clinical trials with rigorous cost-effectiveness assessments in telehealthcare research. This study is meant to improve the international evidence base for the effectiveness and cost-effectiveness of telehealthcare for patients suffering from chronic obstructive pulmonary disease by implementing a large-scale pragmatic cluster-randomized clinical trial. ClinicalTrials.gov NCT01984840, November 14, 2013.

  7. Large Spatial Scale Variability in Bathyal Macrobenthos Abundance, Biomass, α- and β-Diversity along the Mediterranean Continental Margin

    PubMed Central

    Baldrighi, Elisa; Lavaleye, Marc; Aliani, Stefano; Conversi, Alessandra; Manini, Elena

    2014-01-01

    The large-scale deep-sea biodiversity distribution of the benthic fauna was explored in the Mediterranean Sea, which can be seen as a miniature model of the oceans of the world. Within the framework of the BIOFUN project (“Biodiversity and Ecosystem Functioning in Contrasting Southern European Deep-sea Environments: from viruses to megafauna”), we investigated the large spatial scale variability (over >1,000 km) of the bathyal macrofauna communities that inhabit the Mediterranean basin, and their relationships with the environmental variables. The macrofauna abundance, biomass, community structure and functional diversity were analysed and the α-diversity and β-diversity were estimated across six selected slope areas at different longitudes and along three main depths. The macrobenthic standing stock and α-diversity were lower in the deep-sea sediments of the eastern Mediterranean basin, compared to the western and central basins. The macrofaunal standing stock and diversity decreased significantly from the upper bathyal to the lower bathyal slope stations. The major changes in the community composition of the higher taxa and in the trophic (functional) structure occurred at different longitudes, rather than at increasing water depth. For the β-diversity, very high dissimilarities emerged at all levels: (i) between basins; (ii) between slopes within the same basin; and (iii) between stations at different depths; this therefore demonstrates the high macrofaunal diversity of the Mediterranean basins at large spatial scales. Overall, the food sources (i.e., quantity and quality) that characterised the west, central and eastern Mediterranean basins, as well as sediment grain size, appear to influence the macrobenthic standing stock and the biodiversity along the different slope areas. PMID:25225909

  8. Large spatial scale variability in bathyal macrobenthos abundance, biomass, α- and β-diversity along the Mediterranean continental margin.

    PubMed

    Baldrighi, Elisa; Lavaleye, Marc; Aliani, Stefano; Conversi, Alessandra; Manini, Elena

    2014-01-01

    The large-scale deep-sea biodiversity distribution of the benthic fauna was explored in the Mediterranean Sea, which can be seen as a miniature model of the oceans of the world. Within the framework of the BIOFUN project ("Biodiversity and Ecosystem Functioning in Contrasting Southern European Deep-sea Environments: from viruses to megafauna"), we investigated the large spatial scale variability (over >1,000 km) of the bathyal macrofauna communities that inhabit the Mediterranean basin, and their relationships with the environmental variables. The macrofauna abundance, biomass, community structure and functional diversity were analysed and the α-diversity and β-diversity were estimated across six selected slope areas at different longitudes and along three main depths. The macrobenthic standing stock and α-diversity were lower in the deep-sea sediments of the eastern Mediterranean basin, compared to the western and central basins. The macrofaunal standing stock and diversity decreased significantly from the upper bathyal to the lower bathyal slope stations. The major changes in the community composition of the higher taxa and in the trophic (functional) structure occurred at different longitudes, rather than at increasing water depth. For the β-diversity, very high dissimilarities emerged at all levels: (i) between basins; (ii) between slopes within the same basin; and (iii) between stations at different depths; this therefore demonstrates the high macrofaunal diversity of the Mediterranean basins at large spatial scales. Overall, the food sources (i.e., quantity and quality) that characterised the west, central and eastern Mediterranean basins, as well as sediment grain size, appear to influence the macrobenthic standing stock and the biodiversity along the different slope areas.

  9. Bench-Scale Development of a Non-Aqueous Solvent (NAS) CO2 Capture Process for Coal-Fired Power Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lail, Marty

    The project aimed to advance RTI’s non-aqueous amine solvent technology by improving the solvent to reduce volatility, demonstrating long-term continuous operation at lab- (0.5 liters solvent) and bench-scale (~120 liters solvent), showing low reboiler heat duty measured during bench-scale testing, evaluating degradation products, building a rate-based process model, and evaluating the techno-economic performance of the process. The project team (RTI, SINTEF, Linde Engineering) and the technology performed well in each area of advancement. The modifications incorporated throughout the project enabled the attainment of target absorber and regenerator conditions for the process. Reboiler duties below 2,000 kJt/kg CO2 were observed in a bench-scale test unit operated at RTI.

  10. Los Angeles-Gateway Freight Advanced Traveler Information System : prototype development and small-scale demonstrations for FRATIS.

    DOT National Transportation Integrated Search

    2013-06-01

    This Demonstration Plan has been prepared to provide guidance and a common definition to all parties of the testing program that will be conducted for the LA-Gateway FRATIS Demonstration Project. More specifically, this document provides: Plannin...

  11. Establishment of a National Wind Energy Center at University of Houston

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Su Su

    The DOE-supported project objectives are to: establish a national wind energy center (NWEC) at University of Houston and conduct research to address critical science and engineering issues for the development of future large MW-scale wind energy production systems, especially offshore wind turbines. The goals of the project are to: (1) establish a sound scientific/technical knowledge base of solutions to critical science and engineering issues for developing future MW-scale large wind energy production systems, (2) develop a state-of-the-art wind rotor blade research facility at the University of Houston, and (3) through multi-disciplinary research, introduce technology innovations in advanced wind-turbine materials, processing/manufacturing technology, design and simulation, and testing and reliability assessment methods related to future wind turbine systems for cost-effective production of offshore wind energy. To achieve the goals of the project, the following technical tasks were planned and executed during the period from April 15, 2010 to October 31, 2014 at the University of Houston: (1) basic research on large offshore wind turbine systems; (2) applied research on innovative wind turbine rotors for large offshore wind energy systems; (3) integration of offshore wind-turbine design, advanced materials and manufacturing technologies; (4) integrity and reliability of large offshore wind turbine blades and scaled model testing; (5) education and training of graduate and undergraduate students and post-doctoral researchers; and (6) development of a national offshore wind turbine blade research facility. The research program addresses both basic science and engineering of current and future large wind turbine systems, especially offshore wind turbines, for MW-scale power generation. The results of the research advance current understanding of many important scientific issues and provide technical information for developing future large wind turbines with advanced design, composite materials, integrated manufacturing, and structural reliability and integrity. The educational program has trained many graduate and undergraduate students and post-doctoral researchers in the critical science and engineering of wind energy production systems through graduate-level courses, research, and participation in various projects in the center’s large multi-disciplinary research effort. These students and researchers are now employed by the wind industry, national labs and universities to support the US and international wind energy industry. The national offshore wind turbine blade research facility developed in the project has been used to support the technical and training tasks planned in the program, and it is a national asset available for use by domestic and international researchers in the wind energy arena.

  12. The topology of galaxy clustering.

    NASA Astrophysics Data System (ADS)

    Coles, P.; Plionis, M.

    The authors discuss an objective method for quantifying the topology of the galaxy distribution using only projected galaxy counts. The method is a useful complement to fully three-dimensional studies of topology based on the genus by virtue of the enormous projected data sets available. Applying the method to the Lick counts they find no evidence for large-scale non-gaussian behaviour, whereas the small-scale distribution is strongly non-gaussian, with a shift in the meatball direction.

  13. Distant Influence of Kuroshio Eddies on North Pacific Weather Patterns?

    PubMed

    Ma, Xiaohui; Chang, Ping; Saravanan, R; Montuoro, Raffaele; Hsieh, Jen-Shan; Wu, Dexing; Lin, Xiaopei; Wu, Lixin; Jing, Zhao

    2015-12-04

    High-resolution satellite measurements of surface winds and sea-surface temperature (SST) reveal strong coupling between meso-scale ocean eddies and near-surface atmospheric flow over eddy-rich oceanic regions, such as the Kuroshio and Gulf Stream, highlighting the importance of meso-scale oceanic features in forcing the atmospheric planetary boundary layer (PBL). Here, we present high-resolution regional climate modeling results, supported by observational analyses, demonstrating that meso-scale SST variability, largely confined in the Kuroshio-Oyashio confluence region (KOCR), can further exert a significant distant influence on winter rainfall variability along the U.S. Northern Pacific coast. The presence of meso-scale SST anomalies enhances the diabatic conversion of latent heat energy to transient eddy energy, intensifying winter cyclogenesis via moist baroclinic instability, which in turn leads to an equivalent barotropic downstream anticyclone anomaly with reduced rainfall. The finding points to the potential of improving forecasts of extratropical winter cyclones and storm systems and projections of their response to future climate change, which are known to have major social and economic impacts, by improving the representation of ocean eddy-atmosphere interaction in forecast and climate models.

  14. CPV plants data analysis. ISFOC and NACIR projects results

    NASA Astrophysics Data System (ADS)

    Martínez, M.; Rubio, F.; Sala, G.; Pachón, D.; Bett, A.; Siefer, G.; Vetter, M.; Schies, A.; Wachtel, J.; Gombert, A.; Wüllner, J.; Díaz, V.; Vázquez, M. A.; Abulfotuh, F.; Fetyan, K.; el Moussaoui, A.; Mansouri, S.; Loudiyi, K.; Darhmaoui, H.; Mrabti, T.

    2012-10-01

    Now is the moment for CPV to become a reliable solution for large-scale electricity generation: it is one of the technologies with the highest efficiency, and it still has margin for improvement. To continue this development, it is important to incorporate into the design of the installations all the lessons learned during the operation of pilot plants. This paper presents the operation results obtained at the ISFOC pilot plants during the first three and a half years of operation, and from the NACIR project. The CPV technology is showing no signs of degradation that would reduce its high capability of transforming light into electricity. Valuable information is obtained from the operation issues in order to improve the design, turning CPV prototypes into an industrialized product ready to compete with other technologies, with a strong focus on reducing installation costs.

  15. KSC-2009-6454

    NASA Image and Video Library

    2009-11-19

    CAPE CANAVERAL, Fla. – At NASA's Kennedy Space Center in Florida, recipients of a NASA Team Award for their parts in the successful construction of NASA's first large-scale solar power generation facility pose for a group portrait. Representatives from NASA, Florida Power & Light Company, or FPL, and SunPower Corporation formally commissioned the one-megawatt facility and announced plans to pursue a new research, development and demonstration project at Kennedy to advance America's use of renewable energy. The facility is the first element of a major renewable energy project currently under construction at Kennedy. The completed system features a fixed-tilt, ground-mounted solar power system designed and built by SunPower, along with SunPower solar panels. A 10-megawatt solar farm, which SunPower is building on nearby Kennedy property, will supply power to FPL's customers when it is completed in April 2010. Photo credit: NASA/Jim Grossmann

  16. KSC-2009-6451

    NASA Image and Video Library

    2009-11-19

    CAPE CANAVERAL, Fla. – Florida Power & Light Company Vice President and Chief Development Officer Eric Silagy, left, and NASA Kennedy Space Center Director Bob Cabana, center, examine one of the solar panels at the unveiling of NASA's first large-scale solar power generation facility at Kennedy in Florida. Representatives from NASA, Florida Power & Light Company, or FPL, and SunPower Corporation formally commissioned the one-megawatt facility and announced plans to pursue a new research, development and demonstration project at Kennedy to advance America's use of renewable energy. The facility is the first element of a major renewable energy project currently under construction at Kennedy. The completed system features a fixed-tilt, ground-mounted solar power system designed and built by SunPower, along with SunPower solar panels. A 10-megawatt solar farm, which SunPower is building on nearby Kennedy property, will supply power to FPL's customers when it is completed in April 2010. Photo credit: NASA/Jim Grossmann

  17. Improvements on non-equilibrium and transport Green function techniques: The next-generation TRANSIESTA

    NASA Astrophysics Data System (ADS)

    Papior, Nick; Lorente, Nicolás; Frederiksen, Thomas; García, Alberto; Brandbyge, Mads

    2017-03-01

    We present novel methods implemented within the non-equilibrium Green function (NEGF) code TRANSIESTA based on density functional theory (DFT). Our flexible, next-generation DFT-NEGF code handles devices with one or multiple electrodes (Ne ≥ 1) with individual chemical potentials and electronic temperatures. We describe its novel methods for electrostatic gating, contour optimizations, and assertion of charge conservation, as well as the newly implemented algorithms for optimized and scalable matrix inversion, performance-critical pivoting, and hybrid parallelization. Additionally, a generic NEGF "post-processing" code (TBTRANS/PHTRANS) for electron and phonon transport is presented with several novelties such as Hamiltonian interpolations, Ne ≥ 1 electrode capability, bond-currents, a generalized interface for user-defined tight-binding transport, transmission projection using eigenstates of a projected Hamiltonian, and fast inversion algorithms for large-scale simulations easily exceeding 10^6 atoms on workstation computers. The new features of both codes are demonstrated and benchmarked for relevant test systems.
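    The transmission function at the heart of such NEGF transport codes can be illustrated in a few lines. The sketch below evaluates the standard Landauer expression T(E) = Tr[Γ_L G Γ_R G†] for a toy single-orbital device with wide-band-limit electrodes coupled to the end sites; this is a generic textbook construction, not TRANSIESTA's actual implementation or API:

```python
import numpy as np

def transmission(H, E, gamma_L, gamma_R, eta=1e-6):
    """Landauer transmission T(E) = Tr[Gamma_L G Gamma_R G^dagger] for a toy
    device Hamiltonian H, with wide-band-limit self-energies -i*gamma/2 on
    the first (left) and last (right) sites."""
    n = H.shape[0]
    Sigma_L = np.zeros((n, n), complex)
    Sigma_L[0, 0] = -0.5j * gamma_L
    Sigma_R = np.zeros((n, n), complex)
    Sigma_R[-1, -1] = -0.5j * gamma_R
    # retarded Green function of the coupled device
    G = np.linalg.inv((E + 1j * eta) * np.eye(n) - H - Sigma_L - Sigma_R)
    # broadening matrices from the anti-Hermitian part of the self-energies
    Gamma_L = 1j * (Sigma_L - Sigma_L.conj().T)
    Gamma_R = 1j * (Sigma_R - Sigma_R.conj().T)
    return float(np.real(np.trace(Gamma_L @ G @ Gamma_R @ G.conj().T)))
```

    For a symmetric single level on resonance (E equal to the level energy, gamma_L = gamma_R), this reproduces the expected unit transmission.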

  18. USDOE/Russian Ministry of Fuel and Energy joint collaboration for renewable energy resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Touryan, K.

    1997-12-01

    This paper describes a joint collaboration between the US and Russia to develop renewable energy resources. There are five main goals of the project. First is to establish Intersolarcenter as a sister organization to NREL for joint R&D activities, and to provide training to the staff. Second is to install demonstration systems in parks and selected locations around Moscow. Third is to install pilot projects: a wind/diesel hybrid system at 21 sites in the northern territories, and a 500 kW biomass power plant in the Arkhangelsk Region. Fourth is to assist in the start-up operations of a 2 MW/yr Triple Junction amorphous-Si manufacturing facility in Moscow using US technology. Fifth is to explore the possibilities of financing large-scale wind/hybrid and biomass power systems for the northern territories (possibly 900 sites).

  19. Thermal Performance Comparison of Glass Microsphere and Perlite Insulation Systems for Liquid Hydrogen Storage Tanks

    NASA Astrophysics Data System (ADS)

    Sass, J. P.; Fesmire, J. E.; Nagy, Z. F.; Sojourner, S. J.; Morris, D. L.; Augustynowicz, S. D.

    2008-03-01

    A technology demonstration test project was conducted by the Cryogenics Test Laboratory at the Kennedy Space Center (KSC) to provide comparative thermal performance data for glass microspheres, referred to as bubbles, and perlite insulation for liquid hydrogen tank applications. Two identical 1/15th scale versions of the 3,200,000 liter spherical liquid hydrogen tanks at Launch Complex 39 at KSC were custom designed and built to serve as test articles for this test project. Evaporative (boil-off) calorimeter test protocols, including liquid nitrogen and liquid hydrogen, were established to provide tank test conditions characteristic of the large storage tanks that support the Space Shuttle launch operations. This paper provides comparative thermal performance test results for bubbles and perlite for a wide range of conditions. Thermal performance as a function of cryogenic commodity (nitrogen and hydrogen), vacuum pressure, insulation fill level, tank liquid level, and thermal cycles will be presented.

  20. Kedalion: NASA's Adaptable and Agile Hardware/Software Integration and Test Lab

    NASA Technical Reports Server (NTRS)

    Mangieri, Mark L.; Vice, Jason

    2011-01-01

    NASA's Kedalion engineering analysis lab at Johnson Space Center is on the forefront of validating and using many contemporary avionics hardware/software development and integration techniques, which represent new paradigms to heritage NASA culture. Kedalion has validated many of the Orion hardware/software engineering techniques borrowed from the adjacent commercial aircraft avionics solution space, with the intention to build upon such techniques to better align with today's aerospace market. Using agile techniques, commercial products, early rapid prototyping, in-house expertise and tools, and customer collaboration, Kedalion has demonstrated that cost-effective contemporary paradigms hold the promise to serve future NASA endeavors within a diverse range of system domains. Kedalion provides a readily adaptable solution for medium/large scale integration projects. The Kedalion lab is currently serving as an in-line resource for the project and the Multipurpose Crew Vehicle (MPCV) program.

  1. Asynchronous Two-Level Checkpointing Scheme for Large-Scale Adjoints in the Spectral-Element Solver Nek5000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schanen, Michel; Marin, Oana; Zhang, Hong

    Adjoints are an important computational tool for large-scale sensitivity evaluation, uncertainty quantification, and derivative-based optimization. An essential component of their performance is the storage/recomputation balance in which efficient checkpointing methods play a key role. We introduce a novel asynchronous two-level adjoint checkpointing scheme for multistep numerical time discretizations targeted at large-scale numerical simulations. The checkpointing scheme combines bandwidth-limited disk checkpointing and binomial memory checkpointing. Based on assumptions about the target petascale systems, which we later demonstrate to be realistic on the IBM Blue Gene/Q system Mira, we create a model of the expected performance of our checkpointing approach and validate it using the highly scalable Navier-Stokes spectral-element solver Nek5000 on small to moderate subsystems of the Mira supercomputer. In turn, this allows us to predict optimal algorithmic choices when using all of Mira. We also demonstrate that two-level checkpointing is significantly superior to single-level checkpointing when adjoining a large number of time integration steps. To our knowledge, this is the first time two-level checkpointing has been designed, implemented, tuned, and demonstrated on fluid dynamics codes at a large scale of 50k+ cores.
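    The storage/recomputation trade-off behind adjoint checkpointing can be sketched with a single-level variant: checkpoint the forward state every few steps, then during the reverse sweep recompute each intermediate state from the nearest stored checkpoint. The paper's scheme layers an asynchronous disk level and binomial in-memory checkpointing on top of this basic idea; the function below is an illustrative sketch, not the Nek5000 implementation:

```python
def reverse_with_checkpoints(step, state0, n_steps, every):
    """Run `n_steps` forward applications of `step`, storing a checkpoint
    every `every` steps, then recover all states in reverse order by
    recomputing from the nearest checkpoint (single-level sketch)."""
    # forward sweep: store sparse checkpoints instead of every state
    ckpts = {0: state0}
    s = state0
    for i in range(1, n_steps + 1):
        s = step(s)
        if i % every == 0:
            ckpts[i] = s
    # reverse sweep: rebuild state i from the last checkpoint at or before i
    states_reversed = []
    for i in range(n_steps, -1, -1):
        base = (i // every) * every
        s = ckpts[base]
        for _ in range(i - base):
            s = step(s)  # recomputation replaces storage
        states_reversed.append(s)
    return states_reversed
```

    Memory drops from O(n_steps) stored states to O(n_steps / every), at the cost of at most `every - 1` extra forward steps per reversed state.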

  2. Exposing the Science in Citizen Science: Fitness to Purpose and Intentional Design.

    PubMed

    Parrish, Julia K; Burgess, Hillary; Weltzin, Jake F; Fortson, Lucy; Wiggins, Andrea; Simmons, Brooke

    2018-05-21

    Citizen science is a growing phenomenon. With millions of people involved and billions of in-kind dollars contributed annually, this broad extent, fine grain approach to data collection should be garnering enthusiastic support in the mainstream science and higher education communities. However, many academic researchers demonstrate distinct biases against the use of citizen science as a source of rigorous information. To engage the public in scientific research, and the research community in the practice of citizen science, a mutual understanding is needed of accepted quality standards in science, and the corresponding specifics of project design and implementation when working with a broad public base. We define a science-based typology focused on the degree to which projects deliver the type(s) and quality of data/work needed to produce valid scientific outcomes directly useful in science and natural resource management. Where project intent includes direct contribution to science and the public is actively involved either virtually or hands-on, we examine the measures of quality assurance (methods to increase data quality during the design and implementation phases of a project) and quality control (post hoc methods to increase the quality of scientific outcomes). We suggest that high quality science can be produced with massive, largely one-off, participation if data collection is simple and quality control includes algorithm voting, statistical pruning and/or computational modeling. Small to mid-scale projects engaging participants in repeated, often complex, sampling can advance quality through expert-led training and well-designed materials, and through independent verification. Both approaches - simplification at scale and complexity with care - generate more robust science outcomes.
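    The "algorithm voting" style of quality control mentioned in the abstract can be sketched as a majority vote over volunteer classifications with a retention threshold that prunes low-agreement items; the function name and the 0.6 threshold are illustrative choices, not values taken from the paper:

```python
from collections import Counter

def vote(labels, min_agreement=0.6):
    """Aggregate volunteer classifications of one item by majority vote.
    Returns the winning label when its share of votes meets the retention
    threshold, otherwise None (item pruned as too ambiguous)."""
    counts = Counter(labels)
    label, n = counts.most_common(1)[0]
    return label if n / len(labels) >= min_agreement else None
```

    In a real pipeline the pruned items would typically be routed to expert review rather than discarded outright.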

  3. A streamlined collecting and preparation protocol for DNA barcoding of Lepidoptera as part of large-scale rapid biodiversity assessment projects, exemplified by the Indonesian Biodiversity Discovery and Information System (IndoBioSys).

    PubMed

    Schmidt, Olga; Hausmann, Axel; Cancian de Araujo, Bruno; Sutrisno, Hari; Peggie, Djunijanti; Schmidt, Stefan

    2017-01-01

    Here we present a general collecting and preparation protocol for DNA barcoding of Lepidoptera as part of large-scale rapid biodiversity assessment projects, and a comparison with alternative preserving and vouchering methods. About 98% of the sequenced specimens processed using the present collecting and preparation protocol yielded sequences with more than 500 base pairs. The study is based on the first outcomes of the Indonesian Biodiversity Discovery and Information System (IndoBioSys). IndoBioSys is a German-Indonesian research project that is conducted by the Museum für Naturkunde in Berlin and the Zoologische Staatssammlung München, in close cooperation with the Research Center for Biology - Indonesian Institute of Sciences (RCB-LIPI, Bogor).

  4. Kingsbury Bay-Grassy Point habitat restoration project: A Health Impact Assessment-oral presentation

    EPA Science Inventory

    Undertaking large-scale aquatic habitat restoration projects in prominent waterfront locations, such as city parks, provides an opportunity to both improve ecological integrity and enhance community well-being. However, to consider both opportunities simultaneously, a community-b...

  5. Setting Learning Analytics in Context: Overcoming the Barriers to Large-Scale Adoption

    ERIC Educational Resources Information Center

    Ferguson, Rebecca; Macfadyen, Leah P.; Clow, Doug; Tynan, Belinda; Alexander, Shirley; Dawson, Shane

    2014-01-01

    A core goal for most learning analytics projects is to move from small-scale research towards broader institutional implementation, but this introduces a new set of challenges because institutions are stable systems, resistant to change. To avoid failure and maximize success, implementation of learning analytics at scale requires explicit and…

  6. Bayesian hierarchical model for large-scale covariance matrix estimation.

    PubMed

    Zhu, Dongxiao; Hero, Alfred O

    2007-12-01

    Many bioinformatics problems implicitly depend on estimating a large-scale covariance matrix. The traditional approaches tend to give rise to high variance and low accuracy due to "overfitting." We cast the large-scale covariance matrix estimation problem into the Bayesian hierarchical model framework, and introduce dependency between covariance parameters. We demonstrate the advantages of our approaches over the traditional approaches using simulations and OMICS data analysis.
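    The authors' Bayesian hierarchical model is not reproduced here; as a minimal illustration of how regularizing a covariance estimate combats the overfitting the abstract describes, the sketch below shrinks the sample covariance toward a scaled identity target (a generic shrinkage estimator, with an illustrative fixed shrinkage weight):

```python
import numpy as np

def shrink_cov(X, alpha=0.2):
    """Shrink the sample covariance of data X (rows = samples) toward a
    scaled identity target; a generic variance-reduction illustration,
    not the paper's Bayesian hierarchical model."""
    S = np.cov(X, rowvar=False)
    # target preserves the average variance but removes noisy off-diagonals
    target = np.trace(S) / S.shape[0] * np.eye(S.shape[0])
    return (1 - alpha) * S + alpha * target
```

    When the number of variables approaches or exceeds the number of samples, the raw estimate S becomes ill-conditioned or singular, and blending with a well-conditioned target trades a little bias for a large reduction in variance.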

  7. Intercomparison of regional-scale hydrological models and climate change impacts projected for 12 large river basins worldwide—a synthesis

    NASA Astrophysics Data System (ADS)

    Krysanova, Valentina; Vetter, Tobias; Eisner, Stephanie; Huang, Shaochun; Pechlivanidis, Ilias; Strauch, Michael; Gelfan, Alexander; Kumar, Rohini; Aich, Valentin; Arheimer, Berit; Chamorro, Alejandro; van Griensven, Ann; Kundu, Dipangkar; Lobanova, Anastasia; Mishra, Vimal; Plötner, Stefan; Reinhardt, Julia; Seidou, Ousmane; Wang, Xiaoyan; Wortmann, Michel; Zeng, Xiaofan; Hattermann, Fred F.

    2017-10-01

    An intercomparison of climate change impacts projected by nine regional-scale hydrological models for 12 large river basins on all continents was performed, and sources of uncertainty were quantified in the framework of the ISIMIP project. The models ECOMAG, HBV, HYMOD, HYPE, mHM, SWAT, SWIM, VIC and WaterGAP3 were applied in the following basins: Rhine and Tagus in Europe, Niger and Blue Nile in Africa, Ganges, Lena, Upper Yellow and Upper Yangtze in Asia, Upper Mississippi, MacKenzie and Upper Amazon in America, and Darling in Australia. The model calibration and validation was done using WATCH climate data for the period 1971-2000. The results, evaluated with 14 criteria, are mostly satisfactory, except for the low flow. Climate change impacts were analyzed using projections from five global climate models under four representative concentration pathways. Trends in the period 2070-2099 in relation to the reference period 1975-2004 were evaluated for three variables: the long-term mean annual flow and high and low flow percentiles Q 10 and Q 90, as well as for flows in three months high- and low-flow periods denoted as HF and LF. For three river basins: the Lena, MacKenzie and Tagus strong trends in all five variables were found (except for Q 10 in the MacKenzie); trends with moderate certainty for three to five variables were confirmed for the Rhine, Ganges and Upper Mississippi; and increases in HF and LF were found for the Upper Amazon, Upper Yangtze and Upper Yellow. The analysis of projected streamflow seasonality demonstrated increasing streamflow volumes during the high-flow period in four basins influenced by monsoonal precipitation (Ganges, Upper Amazon, Upper Yangtze and Upper Yellow), an amplification of the snowmelt flood peaks in the Lena and MacKenzie, and a substantial decrease of discharge in the Tagus (all months). 
The overall average fractions of uncertainty for the annual mean flow projections in the multi-model ensemble applied for all basins were 57% for GCMs, 27% for RCPs, and 16% for hydrological models.
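
    The GCM/RCP/hydrological-model split of ensemble uncertainty can be illustrated with a small ANOVA-style variance decomposition. The sketch below uses synthetic numbers, not the study's data; the factor counts and spread magnitudes are assumptions chosen only so that the GCM factor dominates, mirroring the qualitative result.

```python
import numpy as np

# Hypothetical ensemble of projected mean-flow changes, indexed by
# (GCM, RCP, hydrological model). All spreads are invented.
rng = np.random.default_rng(42)
n_gcm, n_rcp, n_hm = 5, 4, 9
ens = (rng.normal(0.0, 3.0, (n_gcm, 1, 1))    # GCM-to-GCM spread (largest)
       + rng.normal(0.0, 1.5, (1, n_rcp, 1))  # RCP-to-RCP spread
       + rng.normal(0.0, 1.0, (1, 1, n_hm)))  # model-to-model spread

# Main-effect variance of each factor: variance of its marginal means.
v = {
    "GCM": ens.mean(axis=(1, 2)).var(),
    "RCP": ens.mean(axis=(0, 2)).var(),
    "HM":  ens.mean(axis=(0, 1)).var(),
}
total = sum(v.values())
fractions = {k: val / total for k, val in v.items()}
print({k: round(f, 2) for k, f in fractions.items()})
```

    With real ensembles the same decomposition would be applied per basin and per variable; interaction terms, ignored in this sketch, can also be attributed.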

  8. Multi-Scale Models for the Scale Interaction of Organized Tropical Convection

    NASA Astrophysics Data System (ADS)

    Yang, Qiu

    Assessing the upscale impact of organized tropical convection from small spatial and temporal scales is a research imperative, not only for having a better understanding of the multi-scale structures of dynamical and convective fields in the tropics, but also for eventually helping in the design of new parameterization strategies to improve the next-generation global climate models. Here self-consistent multi-scale models are derived systematically by following the multi-scale asymptotic methods and used to describe the hierarchical structures of tropical atmospheric flows. The advantages of using these multi-scale models lie in isolating the essential components of multi-scale interaction and providing assessment of the upscale impact of the small-scale fluctuations onto the large-scale mean flow through eddy flux divergences of momentum and temperature in a transparent fashion. Specifically, this thesis includes three research projects about multi-scale interaction of organized tropical convection, involving tropical flows at different scaling regimes and utilizing different multi-scale models correspondingly. Inspired by the observed variability of tropical convection on multiple temporal scales, including daily and intraseasonal time scales, the goal of the first project is to assess the intraseasonal impact of the diurnal cycle on the planetary-scale circulation such as the Hadley cell. As an extension of the first project, the goal of the second project is to assess the intraseasonal impact of the diurnal cycle over the Maritime Continent on the Madden-Julian Oscillation. In the third project, the goals are to simulate the baroclinic aspects of the ITCZ breakdown and assess its upscale impact on the planetary-scale circulation over the eastern Pacific. These simple multi-scale models should be useful to understand the scale interaction of organized tropical convection and help improve the parameterization of unresolved processes in global climate models.

  9. Impact of Data Placement on Resilience in Large-Scale Object Storage Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carns, Philip; Harms, Kevin; Jenkins, John

    Distributed object storage architectures have become the de facto standard for high-performance storage in big data, cloud, and HPC computing. Object storage deployments using commodity hardware to reduce costs often employ object replication as a method to achieve data resilience. Repairing object replicas after failure is a daunting task for systems with thousands of servers and billions of objects, however, and it is increasingly difficult to evaluate such scenarios at scale on real-world systems. Resilience and availability are both compromised if objects are not repaired in a timely manner. In this work we leverage a high-fidelity discrete-event simulation model to investigate replica reconstruction on large-scale object storage systems with thousands of servers, billions of objects, and petabytes of data. We evaluate the behavior of CRUSH, a well-known object placement algorithm, and identify configuration scenarios in which aggregate rebuild performance is constrained by object placement policies. After determining the root cause of this bottleneck, we then propose enhancements to CRUSH and the usage policies atop it to enable scalable replica reconstruction. We use these methods to demonstrate a simulated aggregate rebuild rate of 410 GiB/s (within 5% of projected ideal linear scaling) on a 1,024-node commodity storage system. We also uncover an unexpected phenomenon in rebuild performance based on the characteristics of the data stored on the system.
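
    As a rough illustration of how a deterministic placement policy maps objects to replica sets, and hence constrains which servers share rebuild work after a failure, here is a minimal rendezvous-hashing sketch. This is not the CRUSH algorithm itself (CRUSH adds hierarchical, weighted bucket selection), and the server names are hypothetical.

```python
import hashlib

def place(obj_id, servers, n_replicas=3):
    """Pick the n_replicas servers with the highest hash(object, server)
    score; any node can recompute placements without a central table."""
    score = lambda s: hashlib.sha256(f"{obj_id}:{s}".encode()).digest()
    return sorted(servers, key=score, reverse=True)[:n_replicas]

servers = [f"osd{i}" for i in range(8)]
before = place("object-42", servers)

# If one replica's server fails, only objects mapped to it move, and the
# replacement replicas are drawn from across the surviving servers.
survivors = [s for s in servers if s != before[0]]
after = place("object-42", survivors)
print(before, after)
```

    A placement policy like this determines the rebuild fan-out: the more distinct survivor sets that inherit a failed server's objects, the more parallel the reconstruction can be, which is exactly the property the paper's simulations stress-test at scale.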

  10. Northeastern Oregon bark beetle control project 1910-11.

    Treesearch

    H.E. Burke

    1990-01-01

    This history, from the memoirs of the entomologist in charge, describes the first large-scale cooperative bark beetle control project funded by Congress in the Western United States. It describes relations between the Forest Service, Bureau of Entomology, and private timber owners, how the project was organized and conducted, and results of the control measures. The...

  11. Beowulf Distributed Processing and the United States Geological Survey

    USGS Publications Warehouse

    Maddox, Brian G.

    2002-01-01

    Introduction In recent years, the United States Geological Survey's (USGS) National Mapping Discipline (NMD) has expanded its scientific and research activities. Work is being conducted in areas such as emergency response research, scientific visualization, urban prediction, and other simulation activities. Custom-produced digital data have become essential for these types of activities. High-resolution, remotely sensed datasets are also seeing increased use. Unfortunately, the NMD is also finding that it lacks the resources required to perform some of these activities. Many of these projects require large amounts of computer processing resources. Complex urban-prediction simulations, for example, involve large amounts of processor-intensive calculations on large amounts of input data. This project was undertaken to learn and understand the concepts of distributed processing. Experience was needed in developing these types of applications. The idea was that this type of technology could significantly aid the needs of the NMD scientific and research programs. Porting a numerically intensive application currently being used by an NMD science program to run in a distributed fashion would demonstrate the usefulness of this technology. There are several benefits that this type of technology can bring to the USGS's research programs. Projects can be performed that were previously impossible due to a lack of computing resources. Other projects can be performed on a larger scale than previously possible. For example, distributed processing can enable urban dynamics research to perform simulations on larger areas without making huge sacrifices in resolution. The processing can also be done in a more reasonable amount of time than with traditional single-threaded methods (a scaled version of Chester County, Pennsylvania, took about fifty days to finish its first calibration phase with a single-threaded program). 
This paper has several goals regarding distributed processing technology. It will describe the benefits of the technology. Real data about a distributed application will be presented as an example of the benefits that this technology can bring to USGS scientific programs. Finally, some of the issues with distributed processing that relate to USGS work will be discussed.

  12. The Galics Project: Virtual Galaxy: from Cosmological N-body Simulations

    NASA Astrophysics Data System (ADS)

    Guiderdoni, B.

    The GalICS project develops extensive semi-analytic post-processing of large cosmological simulations to describe hierarchical galaxy formation. The multiwavelength statistical properties of high-redshift and local galaxies are predicted within the large-scale structures. The fake catalogs and mock images that are generated from the outputs are used for the analysis and preparation of deep surveys. The whole set of results is now available in an on-line database that can be easily queried. The GalICS project represents a first step towards a 'Virtual Observatory of virtual galaxies'.

  13. Large scale afforestation projects mitigate degradation and increase the stability of the karst ecosystems in southwest China

    NASA Astrophysics Data System (ADS)

    Yue, Y.; Tong, X.; Wang, K.; Fensholt, R.; Brandt, M.

    2017-12-01

    With the aim to combat desertification and improve the ecological environment, mega-engineering afforestation projects have been launched in the karst regions of southwest China around the turn of the new millennium. A positive impact of these projects on vegetation cover has been shown; however, it remains unclear whether conservation efforts have been able to effectively restore ecosystem properties and reduce the sensitivity of the karst ecosystem to climate variations at large scales. Here we use passive microwave and optical satellite time series data combined with the ecosystem model LPJ-GUESS and show a widespread increase in vegetation cover with a clear demarcation at the Chinese national border, contrasting with the conditions of neighboring countries. We apply breakpoint detection to identify permanent changes in the vegetation time series and assess the vegetation's sensitivity to climate before and after the breakpoints. A majority (74%) of the breakpoints were detected between 2001 and 2004 and are remarkably in line with the implementation and spatial extent of the Grain to Green project. We stratify the counties of the study area into four groups according to the extent of Grain to Green conservation areas and find distinct differences between the groups. Vegetation trends are similar prior to afforestation activities (1982-2000) but clearly diverge at a later stage, following the spatial extent of conservation areas. Moreover, vegetation cover dynamics were increasingly decoupled from climatic influence in areas of high conservation effort. Whereas both vegetation resilience and resistance were considerably improved in areas with large conservation efforts, showing an increase in ecosystem stability, ongoing degradation and an amplified sensitivity to climate variability were found in areas with limited project implementation. 
Our study concludes that large scale conservation projects can regionally contribute to a greening Earth and are able to mitigate desertification by increasing vegetation cover and reducing the ecosystem's sensitivity to climate change; however, degradation remains a serious issue in the karst ecosystems of southwest China.
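
    The breakpoint-detection step can be illustrated with a minimal least-squares changepoint detector. The series below is synthetic (a level shift at 2002 standing in for the Grain to Green rollout); the study's actual method and data are more elaborate.

```python
import numpy as np

def single_breakpoint(y):
    """Return the split index minimizing the summed within-segment
    squared error (a one-changepoint least-squares detector)."""
    best_t, best_sse = None, np.inf
    for t in range(2, len(y) - 2):
        sse = ((y[:t] - y[:t].mean()) ** 2).sum() \
            + ((y[t:] - y[t:].mean()) ** 2).sum()
        if sse < best_sse:
            best_t, best_sse = t, sse
    return best_t

# Synthetic vegetation-cover series: a step increase from 2002 onwards.
years = np.arange(1982, 2016)
cover = np.where(years < 2002, 0.30, 0.45)
bp = single_breakpoint(cover)
print(years[bp])  # → 2002
```

    Real vegetation series are noisy and trended rather than stepped, which is why operational breakpoint methods fit piecewise trend-plus-seasonality models instead of piecewise means.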

  14. [Privacy and public benefit in using large scale health databases].

    PubMed

    Yamamoto, Ryuichi

    2014-01-01

    In Japan, large-scale health databases have been constructed within a few years, such as the national claims and health checkup database (NDB) and the Japanese Sentinel project. However, there are legal issues in striking an adequate balance between privacy and public benefit when using such databases. The NDB operates under the act on health care for elderly persons, but this act says nothing about using the database for the general public benefit. Researchers who use it are therefore forced to devote great attention to anonymization and information security, which may hamper the research work itself. The Japanese Sentinel project is a national project for detecting adverse drug reactions using large-scale distributed clinical databases of large hospitals. Although patients give broad future consent to such use for the public good, the use of insufficiently anonymized data is still under discussion. Generally speaking, research conducted for the public benefit will not infringe patients' privacy, but vague and complex legislative requirements for personal data protection may impede that research. Medical science does not progress without the use of clinical information, so legislation that is simple and clear for both researchers and patients is strongly required. In Japan, a specific act for balancing privacy and public benefit is now under discussion. The author recommends that researchers, including those in the field of pharmacology, pay attention to, participate in the discussion of, and make suggestions for such acts and regulations.

  15. A Spanish ''Power Tower'' solar system: Project CESA-1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Torralbo, A.M.; Gonzalvez, M.; Lacal, J.A.

    1984-02-01

    Like many other countries and organizations, Spain has been developing a program to investigate the economic viability of new sources of energy. Among these, notably, are large solar power systems. Within this investigation program, ''Centro de Estudios de la Energia'', an organization dependent on ''Ministerio de Industria y Energia'', is carrying out the CESA-1 Project, which consists of the design, construction, start-up, and operation of a 1.2-MW Pilot Solar Power Plant. If the current technical uncertainties are removed and the power tower concept demonstrates its economic viability, Spain will be one of the most appropriate countries in the world for a full-scale implementation of this technology. For this reason, the ''Ministerio de Industria y Energia'' reached the conclusion in mid-1977 that it would be of interest to explore this technology using the domestic industrial potential. The project was approved by the Council of Ministers in June 1977 and began in early 1978. The management of the project is the direct responsibility of ''El Centro de Estudios de la Energia'', helped by the engineering firms Initec and Sener to attain the adequate organization to carry out the project.

  16. Sample Identification at Scale - Implementing IGSN in a Research Agency

    NASA Astrophysics Data System (ADS)

    Klump, J. F.; Golodoniuc, P.; Wyborn, L. A.; Devaraju, A.; Fraser, R.

    2015-12-01

    Earth sciences are largely observational and rely on natural samples, the types of which vary significantly between science disciplines. Sharing and referencing samples in the scientific literature and across the Web requires globally unique identifiers, which are essential for disambiguation. This practice is very common in other fields, e.g. ISBN in publishing and DOI in the scientific literature. In the Earth sciences, however, it is still often done in an ad hoc manner without unique identifiers. The International Geo Sample Number (IGSN) system provides a persistent, globally unique label for identifying environmental samples. As an IGSN allocating agency, CSIRO implements the IGSN registration service at the organisational scale with contributions from multiple research groups. The Capricorn Distal Footprints project is one of the first pioneers and early adopters of the technology in Australia. For this project, IGSN provides a mechanism for identifying new and legacy samples, as well as derived sub-samples. It will ensure transparency and reproducibility in geochemical sampling campaigns that involve a diversity of sampling methods, so that diverse geochemical and isotopic results can be linked back to the parent sample, particularly where multiple children of that sample have also been analysed. The IGSN integration for this project is still in its early stages and requires further consultation on the governance mechanisms that need to be put in place to allow efficient collaboration within CSIRO and with partners on the project, including naming conventions, service interfaces, etc. In this work, we present the results of the initial implementation of IGSN in the context of the Capricorn Distal Footprints project. The study has so far demonstrated the effectiveness of the proposed approach, while maintaining the flexibility to adapt to various media types, which is critical in a multi-disciplinary project.

  17. Past and present cosmic structure in the SDSS DR7 main sample

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jasche, J.; Leclercq, F.; Wandelt, B.D., E-mail: jasche@iap.fr, E-mail: florent.leclercq@polytechnique.org, E-mail: wandelt@iap.fr

    2015-01-01

    We present a chrono-cosmography project, aiming at the inference of the four-dimensional formation history of the observed large scale structure from its origin to the present epoch. To do so, we perform a full-scale Bayesian analysis of the northern galactic cap of the Sloan Digital Sky Survey (SDSS) Data Release 7 main galaxy sample, relying on a fully probabilistic, physical model of the non-linearly evolved density field. Besides inferring initial conditions from observations, our methodology naturally and accurately reconstructs non-linear features at the present epoch, such as walls and filaments, corresponding to high-order correlation functions generated by late-time structure formation. Our inference framework self-consistently accounts for typical observational systematic and statistical uncertainties such as noise, survey geometry and selection effects. We further account for luminosity dependent galaxy biases and automatic noise calibration within a fully Bayesian approach. As a result, this analysis provides highly-detailed and accurate reconstructions of the present density field on scales larger than ∼ 3 Mpc/h, constrained by SDSS observations. This approach also leads to the first quantitative inference of plausible formation histories of the dynamic large scale structure underlying the observed galaxy distribution. The results described in this work constitute the first full Bayesian non-linear analysis of the cosmic large scale structure with the demonstrated capability of uncertainty quantification. Some of these results will be made publicly available along with this work. The level of detail of inferred results and the high degree of control on observational uncertainties pave the path towards high precision chrono-cosmography, the subject of simultaneously studying the dynamics and the morphology of the inhomogeneous Universe.

  18. Big questions, big science: meeting the challenges of global ecology.

    PubMed

    Schimel, David; Keller, Michael

    2015-04-01

    Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets than can be collected by a single investigator's or a group of investigators' labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost: the science foregone because resources were expended on a large project rather than supporting a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring formal systems engineering and project management to mitigate the risk of failure. Mapping the scientific method onto formal project management requires both scientists able to work in that context and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies, under external and internal forces, experience many pressures that push them towards counterproductive project management, but a scientific community aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware, and engaged in the advocacy and governance of large ecological projects.

  19. Analyzing large-scale proteomics projects with latent semantic indexing.

    PubMed

    Klie, Sebastian; Martens, Lennart; Vizcaíno, Juan Antonio; Côté, Richard; Jones, Phil; Apweiler, Rolf; Hinneburg, Alexander; Hermjakob, Henning

    2008-01-01

    Since the advent of public data repositories for proteomics data, readily accessible results from high-throughput experiments have been accumulating steadily. Several large-scale projects in particular have contributed substantially to the amount of identifications available to the community. Despite the considerable body of information amassed, very few successful analyses have been performed and published on this data, keeping the ultimate value of these projects far below their potential. A prominent reason published proteomics data is seldom reanalyzed lies in the heterogeneous nature of the original sample collection and the subsequent data recording and processing. To illustrate that at least part of this heterogeneity can be compensated for, we here apply a latent semantic analysis to the data contributed by the Human Proteome Organization's Plasma Proteome Project (HUPO PPP). Interestingly, despite the broad spectrum of instruments and methodologies applied in the HUPO PPP, our analysis reveals several obvious patterns that can be used to formulate concrete recommendations for optimizing proteomics project planning as well as the choice of technologies used in future experiments. It is clear from these results that the analysis of large bodies of publicly available proteomics data by noise-tolerant algorithms such as latent semantic analysis holds great promise and is currently underexploited.
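
    At its core, latent semantic analysis is a truncated singular value decomposition of an occurrence matrix. A toy numpy sketch follows; the matrix values are invented and stand in for, say, identification counts per experiment, not actual HUPO PPP data.

```python
import numpy as np

# Rows: identified items (e.g. proteins); columns: experiments/labs.
X = np.array([
    [3., 0., 1., 0.],
    [2., 0., 0., 1.],
    [0., 4., 0., 2.],
    [0., 3., 1., 3.],
])

U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2                                   # keep the two strongest latent factors
X_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Experiment coordinates in the latent space: experiments with similar
# identification patterns land close together despite instrument-level noise.
coords = np.diag(s[:k]) @ Vt[:k, :]
print(coords.shape)  # → (2, 4)
```

    Truncating to k factors is what makes the method noise-tolerant: idiosyncratic, low-variance differences between labs are discarded while the shared structure is retained.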

  20. Impacts of large scale afforestation on regional climate: a case study in the Kubuqi Desert, Inner Mongolia based on WRF model

    NASA Astrophysics Data System (ADS)

    Wang, L.; Lin, G.; Feng, D.; Chen, S.; Schultz, N. M.; Fu, C.; Wei, Z.; Yin, C.; Wang, W.; Lee, X.

    2017-12-01

    To better design climate mitigation strategies, it is important to understand the response of regional climatic indicators and related biophysical forcings to large scale afforestation projects. The response of surface temperature (Ts) to afforestation activities in the Kubuqi Desert, Inner Mongolia, China is simulated with the Weather Research and Forecasting (WRF) model, and the temperature changes (ΔTs) are decomposed into contributions from changes in surface albedo, surface roughness, Bowen ratio and ground heat flux using the intrinsic biophysical mechanism (IBPM) method. The 30-m resolution land cover maps of the Kubuqi Desert corresponding to 2000 and 2010 conditions are analyzed, and the major land use changes are found to be an increase in the area of grassland (6%) and shrubland (15%) and a decrease in the area of bare land (21%), owing to the aerial seeding afforestation activities organized by Elion Resources Group, Co. and local government agencies. Our WRF simulations show that during winter, the increased vegetation cover mainly has a daytime warming effect (0.38 K) due to the changes in albedo (0.24 K) and Bowen ratio (0.15 K). In the nighttime, the vegetation has a slight warming effect (0.2 K) mainly caused by energy redistribution associated with the roughness change (0.2 K), as vegetation-induced turbulence brings heat from aloft to the surface. Although both the roughness change (-0.35 K) and the Bowen ratio change (-0.35 K) have cooling effects during summer days, the warming effect caused by radiative forcing (0.93 K) dominates ΔTs. During summer nights, the change in surface temperature is not significant. Our findings demonstrate that the large-scale afforestation project in the Kubuqi Desert has altered the regional surface temperature within a decade, and the analysis of biophysical forcing changes using WRF simulations provides useful information for developing climate change mitigation strategies in semi-arid and arid regions.

  1. An overview of the Hadoop/MapReduce/HBase framework and its current applications in bioinformatics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Ronald C.

    Bioinformatics researchers are increasingly confronted with analysis of ultra large-scale data sets, a problem that will only increase at an alarming rate in coming years. Recent developments in open source software, that is, the Hadoop project and associated software, provide a foundation for scaling to petabyte scale data warehouses on Linux clusters, providing fault-tolerant parallelized analysis on such data using a programming style named MapReduce. An overview is given of the current usage within the bioinformatics community of Hadoop, a top-level Apache Software Foundation project, and of associated open source software projects. The concepts behind Hadoop and the associated HBase project are defined, and current bioinformatics software that employs Hadoop is described. The focus is on next-generation sequencing, as the leading application area to date.
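
    The MapReduce style that Hadoop popularized can be sketched in a few lines of plain Python, here counting 3-mers in toy sequencing reads. This shows only the programming model (map, shuffle, reduce), not Hadoop's actual API, and the reads are invented.

```python
from collections import defaultdict
from itertools import chain

def map_phase(read):
    # Emit (k-mer, 1) pairs for every 3-mer in the read.
    for i in range(len(read) - 2):
        yield read[i:i + 3], 1

def shuffle(pairs):
    # Group values by key, as the framework does between map and reduce.
    groups = defaultdict(list)
    for k, v in pairs:
        groups[k].append(v)
    return groups

def reduce_phase(groups):
    # Fold each group; here the fold is a simple sum.
    return {k: sum(vs) for k, vs in groups.items()}

reads = ["GATTACA", "TTACAGA"]
counts = reduce_phase(shuffle(chain.from_iterable(map(map_phase, reads))))
print(counts["TAC"])  # → 2 ("TAC" occurs once in each read)
```

    Hadoop's contribution is running the map and reduce phases fault-tolerantly across a cluster, with the shuffle handled by the framework; the user supplies only the two functions.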

  2. Algorithm of OMA for large-scale orthology inference

    PubMed Central

    Roth, Alexander CJ; Gonnet, Gaston H; Dessimoz, Christophe

    2008-01-01

    Background OMA is a project that aims to identify orthologs within publicly available, complete genomes. With 657 genomes analyzed to date, OMA is one of the largest projects of its kind. Results The algorithm of OMA improves upon the standard bidirectional best-hit approach in several respects: it uses evolutionary distances instead of scores, considers distance inference uncertainty, includes many-to-many orthologous relations, and accounts for differential gene losses. Herein, we describe the algorithm for inference of orthology in detail and provide the rationale for parameter selection through multiple tests. Conclusion OMA contains several novel improvements for orthology inference and provides a unique dataset of large-scale orthology assignments. PMID:19055798
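
    The bidirectional best-hit baseline that OMA refines can be sketched directly. The gene names and distances below are hypothetical, and OMA's real algorithm adds distance-uncertainty handling, many-to-many relations, and corrections for differential gene loss on top of this idea.

```python
def bbh_pairs(genome_a, genome_b, dist):
    """Keep (a, b) only when each gene is the other's closest match
    in the opposite genome (smaller distance = closer)."""
    pairs = []
    for a in genome_a:
        b = min(genome_b, key=lambda g: dist(a, g))
        if min(genome_a, key=lambda g: dist(g, b)) == a:
            pairs.append((a, b))
    return pairs

# Toy symmetric evolutionary distances (PAM-like units, invented).
D = {frozenset(p): d for p, d in [
    (("a1", "b1"), 10), (("a1", "b2"), 80),
    (("a2", "b1"), 70), (("a2", "b2"), 15),
    (("a3", "b1"), 12), (("a3", "b2"), 90),
]}
dist = lambda x, y: D[frozenset((x, y))]

# a3's best hit is b1, but b1's best hit is a1, so a3 is left unpaired.
print(bbh_pairs(["a1", "a2", "a3"], ["b1", "b2"], dist))  # → [('a1', 'b1'), ('a2', 'b2')]
```

    The unpaired a3 illustrates why a plain BBH misses orthologs after duplications or gene losses, which is precisely the gap the OMA algorithm addresses.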

  3. In Defense of the National Labs and Big-Budget Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodwin, J R

    2008-07-29

    The purpose of this paper is to present the unofficial and unsanctioned opinions of a Visiting Scientist at Lawrence Livermore National Laboratory on the values of LLNL and the other National Labs. The basic founding value and goal of the National Labs is big-budget scientific research, along with smaller-budget scientific research that cannot easily be done elsewhere. The most important example in the latter category is classified defense-related research. The historical guiding light here is the Manhattan Project. This endeavor was unique in human history, and might remain so. The scientific expertise and wealth of an entire nation was tapped in a project that was huge beyond reckoning, with no advance guarantee of success. It was in many respects a clash of scientific titans, with a large supporting cast, collaborating toward a single well-defined goal. Never had scientists received so much respect, so much money, and so much intellectual freedom to pursue scientific progress. And never was the gap between theory and implementation so rapidly narrowed, with results that changed the world, completely. Enormous resources are spent at the national or international level on large-scale scientific projects. LLNL has the most powerful computer in the world, Blue Gene/L. (Oops, Los Alamos just seized the title with Roadrunner; such titles regularly change hands.) LLNL also has the largest laser in the world, the National Ignition Facility (NIF). Lawrence Berkeley National Lab (LBNL) has the most powerful microscope in the world. Not only is it beyond the resources of most large corporations to make such expenditures, but the risk exceeds the possible rewards for those corporations that could. Nor can most small countries afford to finance large scientific projects, and not even the richest can afford largess, especially if Congress is under major budget pressure. 
Some big-budget research efforts are funded by international consortiums, such as the Large Hadron Collider (LHC) at CERN, and the International Thermonuclear Experimental Reactor (ITER) in Cadarache, France, a magnetic-confinement fusion research project. The post-WWII histories of particle and fusion physics contain remarkable examples of both international competition, with an emphasis on secrecy, and international cooperation, with an emphasis on shared knowledge and resources. Initiatives to share sometimes came from surprising directions. Most large-scale scientific projects have potential defense applications. NIF certainly does; it is primarily designed to create small-scale fusion explosions. Blue Gene/L operates in part in service to NIF, and in part to various defense projects. The most important defense projects include stewardship of the national nuclear weapons stockpile, and the proposed redesign and replacement of those weapons with fewer, safer, more reliable, longer-lived, and less apocalyptic warheads. Many well-meaning people will consider the optimal lifetime of a nuclear weapon to be zero, but most thoughtful people, when asked how much longer they think this nation will require them, will ask for some time to think. NIF is also designed to create exothermic small-scale fusion explosions. The malapropos 'exothermic' here is a convenience to cover a profusion of complexities, but the basic idea is that the explosions will create more recoverable energy than was used to create them. One can hope that the primary future benefits of success for NIF will be in cost-effective generation of electrical power through controlled small-scale fusion reactions, rather than in improved large-scale fusion explosions. Blue Gene/L also services climate research, genomic research, materials research, and a myriad of other computational problems that become more feasible, reliable, and precise the larger the number of computational nodes employed. 
Blue Gene/L has to be sited within a security complex for obvious reasons, but its value extends to the nation and the world. There is a duality here between large-scale scientific research machines and the supercomputers used to model them. An astounding example is illustrated in a graph released by EFDA-JET, in Oxfordshire, UK, presently the largest operating magnetic-confinement fusion experiment. The graph shows plasma confinement times (an essential performance parameter) for all the major tokamaks in the international fusion program, over their existing lifetimes. The remarkable thing about the data is not so much confinement-time versus date or scale, but the fact that the data are given for both the computer model predictions and the actual experimental measurements, and the two are in phenomenal agreement over the extended range of scales. Supercomputer models, sometimes operating with the intricacy of Schroedinger's equation at quantum physical scales, have become a costly but enormously cost-saving tool.

  4. Mean-field dynamo in a turbulence with shear and kinetic helicity fluctuations.

    PubMed

    Kleeorin, Nathan; Rogachevskii, Igor

    2008-03-01

    We study the effects of kinetic helicity fluctuations in a turbulence with large-scale shear using two different approaches: the spectral tau approximation and the second-order correlation approximation (or first-order smoothing approximation). These two approaches demonstrate that homogeneous kinetic helicity fluctuations alone with zero mean value in a sheared homogeneous turbulence cannot cause a large-scale dynamo. A mean-field dynamo is possible when the kinetic helicity fluctuations are inhomogeneous, which causes a nonzero mean alpha effect in a sheared turbulence. On the other hand, the shear-current effect can generate a large-scale magnetic field even in a homogeneous nonhelical turbulence with large-scale shear. This effect was investigated previously for large hydrodynamic and magnetic Reynolds numbers. In this study we examine the threshold required for the shear-current dynamo versus Reynolds number. We demonstrate that there is no need for a developed inertial range in order to maintain the shear-current dynamo (e.g., the threshold in the Reynolds number is of the order of 1).

  5. A general method for large-scale fabrication of Cu nanoislands/dragonfly wing SERS flexible substrates

    NASA Astrophysics Data System (ADS)

    Wang, Yuhong; Wang, Mingli; Shen, Lin; Zhu, Yanying; Sun, Xin; Shi, Guochao; Xu, Xiaona; Li, Ruifeng; Ma, Wanli

    2018-01-01

    Abstract not available. Project supported by the Youth Fund Project of the University Science and Technology Plan of Hebei Provincial Department of Education, China (Grant No. QN2015004) and the Doctoral Fund of Yanshan University, China (Grant No. B924).

  6. Breaking barriers through collaboration: the example of the Cell Migration Consortium.

    PubMed

    Horwitz, Alan Rick; Watson, Nikki; Parsons, J Thomas

    2002-10-15

Understanding complex integrated biological processes, such as cell migration, requires interdisciplinary approaches. The Cell Migration Consortium, funded by a Large-Scale Collaborative Project Award from the National Institute of General Medical Sciences, develops and disseminates new technologies, data, reagents, and shared information to a wide audience. The development and operation of this Consortium may provide useful insights for those who plan similarly large-scale, interdisciplinary approaches.

  7. Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism,

    DTIC Science & Technology

    1985-10-07

OCR-damaged record; recoverable details: G. Agha et al., "Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism," MIT Artificial Intelligence Laboratory, 545 Technology Square, Cambridge, MA.

  8. The dynamics and evolution of clusters of galaxies

    NASA Technical Reports Server (NTRS)

    Geller, Margaret; Huchra, John P.

    1987-01-01

    Research was undertaken to produce a coherent picture of the formation and evolution of large-scale structures in the universe. The program is divided into projects which examine four areas: the relationship between individual galaxies and their environment; the structure and evolution of individual rich clusters of galaxies; the nature of superclusters; and the large-scale distribution of individual galaxies. A brief review of results in each area is provided.

  9. White Paper on Dish Stirling Technology: Path Toward Commercial Deployment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andraka, Charles E.; Stechel, Ellen; Becker, Peter

    2016-07-01

    Dish Stirling energy systems have been developed for distributed and large-scale utility deployment. This report summarizes the state of the technology in a joint project between Stirling Energy Systems, Sandia National Laboratories, and the Department of Energy in 2011. It then lays out a feasible path to large scale deployment, including development needs and anticipated cost reduction paths that will make a viable deployment product.

  10. Multi-Scale Ordered Cell Structure for Cost Effective Production of Hydrogen by HTWS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elangovan, Elango; Rao, Ranjeet; Colella, Whitney

Production of hydrogen using an electrochemical device provides for large scale, high efficiency conversion and storage of electrical energy. When renewable electricity is used for conversion of steam to hydrogen, a low-cost and low emissions pathway to hydrogen production emerges. This project was intended to demonstrate a high efficiency High Temperature Water Splitting (HTWS) stack for the electrochemical production of low cost H2. The innovations investigated address the limitations of the state of the art through the use of a novel architecture that introduces macro-features to provide mechanical support of a thin electrolyte, and micro-features of the electrodes to lower polarization losses. The approach also utilizes a combination of unique sets of fabrication options that are scalable to achieve manufacturing cost objectives. The development of the HTWS process and device is guided by techno-economic and life cycle analyses.

  11. Non-contact XUV metrology of Ru/B4C multilayer optics by means of Hartmann wavefront analysis.

    PubMed

    Ruiz-Lopez, Mabel; Dacasa, Hugo; Mahieu, Benoit; Lozano, Magali; Li, Lu; Zeitoun, Philippe; Bleiner, Davide

    2018-02-20

Short-wavelength imaging, spectroscopy, and lithography scale down the characteristic length-scale to nanometers. This poses tight constraints on the optics' finishing tolerances, which are often difficult to characterize. Indeed, even a tiny surface defect degrades the reflectivity and spatial projection of such optics. In this study, we demonstrate experimentally that a Hartmann wavefront sensor for extreme ultraviolet (XUV) wavelengths is an effective non-contact analytical method for inspecting the surface of multilayer optics. The experiment was carried out in a tabletop laboratory using high-order harmonic generation as an XUV source. The wavefront sensor was used to measure the wavefront errors after reflection of the XUV beam from a spherical Ru/B4C multilayer mirror, scanning a large surface of approximately 40 mm in diameter. The results showed that the technique detects aberrations in the nanometer range.

  12. Future potential distribution of the emerging amphibian chytrid fungus under anthropogenic climate change.

    PubMed

    Rödder, Dennis; Kielgast, Jos; Lötters, Stefan

    2010-11-01

    Anthropogenic climate change poses a major threat to global biodiversity with a potential to alter biological interactions at all spatial scales. Amphibians are the most threatened vertebrates and have been subject to increasing conservation attention over the past decade. A particular concern is the pandemic emergence of the parasitic chytrid fungus Batrachochytrium dendrobatidis, which has been identified as the cause of extremely rapid large-scale declines and species extinctions. Experimental and observational studies have demonstrated that the host-pathogen system is strongly influenced by climatic parameters and thereby potentially affected by climate change. Herein we project a species distribution model of the pathogen onto future climatic scenarios generated by the IPCC to examine their potential implications on the pandemic. Results suggest that predicted anthropogenic climate change may reduce the geographic range of B. dendrobatidis and its potential influence on amphibian biodiversity.

  13. Large-scale road safety programmes in low- and middle-income countries: an opportunity to generate evidence.

    PubMed

    Hyder, Adnan A; Allen, Katharine A; Peters, David H; Chandran, Aruna; Bishai, David

    2013-01-01

The growing burden of road traffic injuries, which kill over 1.2 million people yearly, falls mostly on low- and middle-income countries (LMICs). Despite this, evidence generation on the effectiveness of road safety interventions in LMIC settings remains scarce. This paper explores a scientific approach for evaluating road safety programmes in LMICs and introduces such a road safety multi-country initiative, the Road Safety in 10 Countries Project (RS-10). By building on existing evaluation frameworks, we develop a scientific approach for evaluating large-scale road safety programmes in LMIC settings. This approach also draws on 13 lessons of large-scale programme evaluation: defining the evaluation scope; selecting study sites; maintaining objectivity; developing an impact model; utilising multiple data sources; using multiple analytic techniques; maximising external validity; ensuring an appropriate time frame; the importance of flexibility and a stepwise approach; continuous monitoring; providing feedback to implementers and policy-makers; promoting the uptake of evaluation results; and understanding evaluation costs. The use of relatively new approaches for evaluation of real-world programmes allows for the production of relevant knowledge. The RS-10 project affords an important opportunity to scientifically test these approaches for a real-world, large-scale road safety evaluation and generate new knowledge for the field of road safety.

  14. High Fidelity Simulations of Large-Scale Wireless Networks (Plus-Up)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onunkwo, Uzoma

Sandia has built a strong reputation in scalable network simulation and emulation for cyber security studies to protect our nation’s critical information infrastructures. Georgia Tech has a preeminent reputation in academia for excellence in scalable discrete event simulations, with strong emphasis on simulating cyber networks. Many of the experts in this field, such as Dr. Richard Fujimoto, Dr. George Riley, and Dr. Chris Carothers, have strong affiliations with Georgia Tech. The collaborative relationship that we intend to immediately pursue is in high fidelity simulations of practical large-scale wireless networks using the ns-3 simulator via Dr. George Riley. This project will have mutual benefits in bolstering both institutions’ expertise and reputation in the field of scalable simulation for cyber-security studies. This project promises to address high fidelity simulations of large-scale wireless networks. This proposed collaboration is directly in line with Georgia Tech’s goals for developing and expanding the Communications Systems Center, the Georgia Tech Broadband Institute, and the Georgia Tech Information Security Center along with its yearly Emerging Cyber Threats Report. At Sandia, this work benefits the defense systems and assessment area with promise for large-scale assessment of cyber security needs and vulnerabilities of our nation’s critical cyber infrastructures exposed to wireless communications.

  15. A Multi-Time Scale Morphable Software Milieu for Polymorphous Computing Architectures (PCA) - Composable, Scalable Systems

    DTIC Science & Technology

    2004-10-01

MONITORING AGENCY NAME(S) AND ADDRESS(ES) Defense Advanced Research Projects Agency AFRL/IFTC 3701 North Fairfax Drive..."Scalable Parallel Libraries for Large-Scale Concurrent Applications," Technical Report UCRL-JC-109251, Lawrence Livermore National Laboratory

  16. Sunlight-Driven Hydrogen Formation by Membrane-Supported Photoelectrochemical Water Splitting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, Nathan S.

    2014-03-26

This report describes the significant advances in the development of the polymer-supported photoelectrochemical water-splitting system that was proposed under DOE grant number DE-FG02-05ER15754. We developed Si microwire-array photoelectrodes, demonstrated control over the material and light-absorption properties of the microwire-array photoelectrodes, developed inexpensive processes for synthesizing the arrays, and doped the arrays p-type for use as photocathodes. We also developed techniques for depositing metal-nanoparticle catalysts of the hydrogen-evolution reaction (HER) on the wire arrays, investigated the stability and catalytic performance of the nanoparticles, and demonstrated that Ni-Mo alloys are promising earth-abundant catalysts of the HER. We also developed methods that allow reuse of the single-crystalline Si substrates used for microwire growth and methods of embedding the microwire photocathodes in plastic to enable large-scale processing and deployment of the technology. Furthermore, we developed techniques for controlling the structure of WO3 films, and demonstrated that structural control can improve the quantum yield of photoanodes. Thus, by the conclusion of this project, we demonstrated significant advances in the development of all components of a sunlight-driven membrane-supported photoelectrochemical water-splitting system. This final report provides descriptions of some of the scientific accomplishments that were achieved under the support of this project and also provides references to the peer-reviewed publications that resulted from this effort.

  17. How much a galaxy knows about its large-scale environment?: An information theoretic perspective

    NASA Astrophysics Data System (ADS)

    Pandey, Biswajit; Sarkar, Suman

    2017-05-01

The small-scale environment characterized by the local density is known to play a crucial role in deciding galaxy properties, but the role of the large-scale environment in galaxy formation and evolution remains less clear. We propose an information theoretic framework to investigate the influence of the large-scale environment on galaxy properties and apply it to data from the Galaxy Zoo project, which provides visual morphological classifications of ˜1 million galaxies from the Sloan Digital Sky Survey. We find a non-zero mutual information between morphology and environment that decreases with increasing length-scale but persists throughout the entire range of length-scales probed. We estimate the conditional mutual information and the interaction information between morphology and environment by conditioning the environment on different length-scales and find a synergic interaction between them that operates up to length-scales of at least ˜30 h-1 Mpc. Our analysis indicates that these interactions largely arise due to the mutual information shared between the environments on different length-scales.
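The mutual information the abstract relies on can be illustrated with a minimal plug-in estimator over paired discrete labels. This is a sketch only: the morphology labels and environment bins below are invented stand-ins, not the Galaxy Zoo data or the authors' estimator.

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired discrete labels:
    I(X;Y) = sum over (x,y) of p(x,y) * log2[ p(x,y) / (p(x) p(y)) ]."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))          # joint counts
    px, py = Counter(xs), Counter(ys)   # marginal counts
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Toy data: hypothetical morphology classes vs. binned environment density.
morph = ["spiral", "spiral", "elliptical", "elliptical"]
env_dependent = ["low", "low", "high", "high"]      # fully determined by morphology
env_independent = ["low", "high", "low", "high"]    # unrelated to morphology
```

With the dependent pairing the estimator returns the full entropy of the morphology labels (1 bit here); with the independent pairing it returns 0, which is the qualitative behavior the abstract's length-scale analysis probes.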

  18. Advanced Model for Extreme Lift and Improved Aeroacoustics (AMELIA)

    NASA Technical Reports Server (NTRS)

    Lichtwardt, Jonathan; Paciano, Eric; Jameson, Tina; Fong, Robert; Marshall, David

    2012-01-01

With the very recent advent of NASA's Environmentally Responsible Aviation Project (ERA), which is dedicated to designing aircraft that will reduce the impact of aviation on the environment, there is a need for research and development of methodologies to minimize fuel burn and emissions and reduce community noise produced by regional airliners. ERA tackles airframe technology, propulsion technology, and vehicle systems integration to meet performance objectives in the time frame for the aircraft to be at a Technology Readiness Level (TRL) of 4-6 by the year 2020 (deemed N+2). The preceding project that investigated similar goals to ERA was NASA's Subsonic Fixed Wing (SFW) project. SFW focused on conducting research to improve prediction methods and technologies that will produce lower noise, lower emissions, and higher performing subsonic aircraft for the Next Generation Air Transportation System. The work provided in this investigation was a NASA Research Announcement (NRA) contract, #NNL07AA55C, funded by Subsonic Fixed Wing. The project started in 2007 with the specific goal of conducting a large-scale wind tunnel test along with the development of new and improved predictive codes for advanced powered-lift concepts. Many of the predictive codes were used to refine the wind tunnel model outer mold line design. The goal of the large-scale wind tunnel test was to investigate powered-lift technologies and provide an experimental database to validate current and future modeling techniques. The powered-lift concepts investigated were a Circulation Control (CC) wing in conjunction with over-the-wing mounted engines to entrain the exhaust and further increase the lift generated by CC technologies alone. The NRA was a five-year effort; during the first year the objective was to select and refine CESTOL concepts and then to complete a preliminary design of a large-scale wind tunnel model for the large-scale test.
During the second, third, and fourth years the large-scale wind tunnel model was designed, manufactured, and calibrated. During the fifth year the large-scale wind tunnel test was conducted. This technical memo describes all phases of the Advanced Model for Extreme Lift and Improved Aeroacoustics (AMELIA) project and provides a brief summary of the background and modeling efforts involved in the NRA. The conceptual designs considered for this project and the decision process for the configuration adapted for a wind tunnel model are briefly discussed, along with the internal configuration of AMELIA and the internal measurements chosen to satisfy the requirement of obtaining a database of experimental data for future computational model validations. The external experimental techniques that were employed during the test, along with the large-scale wind tunnel test facility, are covered in detail. Experimental measurements in the database include forces and moments, surface pressure distributions, local skin friction measurements, boundary and shear layer velocity profiles, far-field acoustic data, and noise signatures from turbofan propulsion simulators. Results and discussion of the circulation control performance, the over-the-wing mounted engines, and the combined performance are also presented in detail.

  19. Integrated Mid-Continent Carbon Capture, Sequestration & Enhanced Oil Recovery Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brian McPherson

    2010-08-31

A consortium of research partners led by the Southwest Regional Partnership on Carbon Sequestration and industry partners, including CAP CO2 LLC, Blue Source LLC, Coffeyville Resources, Nitrogen Fertilizers LLC, Ash Grove Cement Company, Kansas Ethanol LLC, Headwaters Clean Carbon Services, Black & Veatch, and Schlumberger Carbon Services, conducted a feasibility study of a large-scale CCS commercialization project that included large-scale CO2 sources. The overall objective of this project, entitled the 'Integrated Mid-Continent Carbon Capture, Sequestration and Enhanced Oil Recovery Project', was to design an integrated system of US mid-continent industrial CO2 sources with CO2 capture, and geologic sequestration in deep saline formations and in oil field reservoirs with concomitant EOR. Findings of this project suggest that deep saline sequestration in the mid-continent region is not feasible without major financial incentives, such as tax credits or otherwise, that do not exist at this time. However, results of the analysis suggest that enhanced oil recovery with carbon sequestration is indeed feasible and practical for specific types of geologic settings in the Midwestern U.S.

  20. CACTUS: Command and Control Training Using Knowledge-Based Simulations

    ERIC Educational Resources Information Center

    Hartley, Roger; Ravenscroft, Andrew; Williams, R. J.

    2008-01-01

    The CACTUS project was concerned with command and control training of large incidents where public order may be at risk, such as large demonstrations and marches. The training requirements and objectives of the project are first summarized justifying the use of knowledge-based computer methods to support and extend conventional training…

  1. Using Compliance Analysis for PPP to bridge the gap between SEA and EIA: Lessons from the Turcot Interchange reconstruction in Montréal, Québec

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Undiné-Celeste, E-mail: undine_t@hotmail.com; Marsan, Jean-François, E-mail: jfmarsan@hotmail.com; Fournier-Peyresblanques, Bastien, E-mail: bastien.fp@gmail.com

    2013-09-15

There is increasing concern about the disjunct between the intent of higher level government goals and actual projects “on the ground” in Canada. Although strategic environmental assessment (SEA) and a wide variety of plans, policies and programmes (PPP) contain and promote goals that envision a movement towards social, economic and environmental sustainability, these goals are not necessarily upheld by large-scale projects and their environmental impact assessments (EIAs). This disconnect is often illustrated through anecdotal observations. However, to be able to overcome this disjunct it is imperative to come to a clearer understanding of the degree of sustainability or unsustainability of large-scale developments and the way in which they “measure up” in terms of the goals when compared to alternative options. This article proposes a Compliance Analysis method for investigating the level of harmonization between SEA, PPP and proposed projects and their possible alternatives (CAPPP). This method is quantified through a Likert scale which allows for comparison of alternatives for decision making and analytical purposes. The 2009 proposal for the Turcot Interchange redevelopment in Montréal, Québec, put forward by the Ministry of Transport of Québec (MTQ), as well as two alternative proposals, were utilized as a case study to demonstrate the CAPPP methodology and its applicability. The approved plan for the Turcot redevelopment proposed by MTQ was found to be in poor compliance with the majority of the 178 goals in the six sectors that were examined (air quality, climate change, health, noise, socioeconomic, transport), while alternative proposals were found to be in greater accordance with the intentions of governmental SEA and PPP.
Synthesis and applications: The CAPPP methodology is a versatile “watchdog” tool for examining the level of compliance between stated goals for regions, industrial sectors, or governments and the EIAs of concrete projects “on the ground”. CAPPP can be used as a tool for comparative analysis in decision-making situations at various scales. CAPPP is a fairly straightforward method that can be used by policy makers, EIA experts, and members of the general public alike. Highlights: ► We investigated the level of harmonization between SEA, plans, policies and programmes and EIA projects. ► We created a new methodology: the goal compliance analysis (GCA). ► We tested it on an ongoing project, the Turcot Interchange in Montreal, Canada. ► The method is straightforward and can be used by policy makers, EIA experts, and members of the general public alike.
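The abstract does not reproduce the CAPPP scoring arithmetic. Under the assumption that Likert-scale goal scores are aggregated per alternative for comparison, a minimal sketch (proposal names, sector names, and scores all invented) might look like:

```python
# Hypothetical Likert scores (1 = poor compliance ... 5 = full compliance)
# for two illustrative proposals against a handful of sector goals.
scores = {
    "approved plan": {"air quality": 2, "noise": 1, "transport": 3},
    "alternative A": {"air quality": 4, "noise": 4, "transport": 4},
}

def mean_compliance(scorecard):
    """Average Likert score per proposal, enabling cross-alternative comparison."""
    return {alt: sum(g.values()) / len(g) for alt, g in scorecard.items()}

# Rank alternatives from highest to lowest mean compliance.
ranking = sorted(mean_compliance(scores).items(), key=lambda kv: kv[1], reverse=True)
```

A real CAPPP application would score all 178 goals across the six sectors; the aggregation step, however, is just this kind of per-alternative averaging and ranking.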

  2. The ECE Culminating Design Experience: Analysis of ABET 2000 Compliance at Leading Academic Institutions

    DTIC Science & Technology

    2006-05-01

a significant design project that requires development of a large-scale software project. A distinct shortcoming of Purdue ECE...18-540: Rapid Prototyping of Computer Systems This is a project-oriented course which will deal with all four aspects of project development; the...instructors, will develop specifications for a mobile computer to assist in inspection and maintenance. The application will be partitioned

  3. Sodium-cutting: a new top-down approach to cut open nanostructures on nonplanar surfaces on a large scale.

    PubMed

    Chen, Wei; Deng, Da

    2014-11-11

We report a new, low-cost and simple top-down approach, "sodium-cutting", to cut and open nanostructures deposited on a nonplanar surface on a large scale. The feasibility of sodium-cutting was demonstrated by successfully cutting open ∼100% of carbon nanospheres into nanobowls on a large scale from Sn@C nanospheres for the first time.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rhinefrank, Kenneth; Lamb, Bradford; Prudell, Joseph

This Project aims to satisfy objectives of the DOE’s Water Power Program by completing a system detailed design (SDD) and other important activities in the first phase of a utility-scale grid-connected ocean wave energy demonstration. In early 2012, Columbia Power (CPwr) had determined that further cost and performance optimization was necessary in order to commercialize its StingRAY wave energy converter (WEC). CPwr’s progress toward commercialization, and the requisite technology development path, were focused on transitioning toward a commercial-scale demonstration. This path required significant investment to be successful, and the justification for this investment required improved annual energy production (AEP) and lower capital costs. Engineering solutions were developed to address these technical and cost challenges, incorporated into a proposal to the US Department of Energy (DOE), and then adapted to form the technical content and statement of project objectives of the resulting Project (DE-EE0005930). Through Project cost-sharing and technical collaboration between DOE and CPwr, and technical collaboration with Oregon State University (OSU), National Renewable Energy Lab (NREL) and other Project partners, we have demonstrated experimentally that these conceptual improvements have merit and made significant progress towards a certified WEC system design at a selected and contracted deployment site at the Wave Energy Test Site (WETS) at the Marine Corps Base in Oahu, HI (MCBH).

  5. Lake Charles CCS Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leib, Thomas; Cole, Dan

In late September 2014, development of the Lake Charles Clean Energy (LCCE) Plant was abandoned, resulting in termination of the Lake Charles Carbon Capture and Sequestration (CCS) Project, which was a subset of the LCCE Plant. As a result, the project was only funded through Phase 2A (Design) and did not enter Phase 2B (Construction) or Phase 2C (Operations). This report was prepared relying on information prepared and provided by engineering companies which were engaged by Leucadia Energy, LLC to prepare or review Front End Engineering and Design (FEED) for the Lake Charles Clean Energy Project, which includes the Carbon Capture and Sequestration (CCS) Project in Lake Charles, Louisiana. The Lake Charles Carbon Capture and Sequestration (CCS) Project was to be a large-scale industrial CCS project intended to demonstrate advanced technologies that capture and sequester carbon dioxide (CO2) emissions from industrial sources into underground formations. The scope of work was divided into two discrete sections: 1) Capture and Compression, prepared by the Recipient, Leucadia Energy, LLC, and 2) Transport and Sequestration, prepared by the sub-Recipient, Denbury Onshore, LLC. Capture and Compression: The Lake Charles CCS Project Final Technical Report describes the systems and equipment that would be necessary to capture CO2 generated in a large industrial gasification process and sequester the CO2 into underground formations. The purpose of each system is defined along with a description of its equipment and operation. Criteria for selection of major equipment are provided and ancillary utilities necessary for safe and reliable operation in compliance with environmental regulations are described. Construction considerations are described including a general arrangement of the CCS process units within the overall gasification project.
A cost estimate is provided, delineated by system area, with a cost breakdown showing equipment, piping and materials, construction labor, engineering, and other costs. The CCS Project Final Technical Report is based on a Front End Engineering and Design (FEED) study prepared by SK E&C, completed in [June] 2014. Subsequently, Fluor Enterprises completed a FEED validation study in mid-September 2014. The design analyses indicated that the FEED package was sufficient and as expected. However, Fluor considered the construction risk of a stick-build approach to be unacceptable; construction risk would be substantially mitigated through modular construction, where site labor and schedule uncertainty are minimized. Fluor’s estimate of the overall EPC project cost utilizing the revised construction plan was comparable to SK E&C’s value after reflecting Fluor’s assessment of project scope and risk characteristics. Development was halted upon conclusion of the Phase 2A FEED and the project was not constructed. Transport and Sequestration: The overall objective of the pipeline project was to construct a pipeline to transport captured CO2 from the Lake Charles Clean Energy project to the existing Denbury Green Line and then to the Hastings Field in Southeast Texas to demonstrate effective geologic sequestration of captured CO2 through commercial EOR operations. The overall objective of the MVA portion of the project was to demonstrate effective geologic sequestration of captured CO2 through commercial Enhanced Oil Recovery (EOR) operations in order to evaluate costs, operational processes, and technical performance. The DOE target for the project was to capture and implement a research MVA program to demonstrate the sequestration through EOR of approximately one million tons of CO2 per year as an integral component of commercial operations.

  6. First Pass Annotation of Promoters on Human Chromosome 22

    PubMed Central

    Scherf, Matthias; Klingenhoff, Andreas; Frech, Kornelie; Quandt, Kerstin; Schneider, Ralf; Grote, Korbinian; Frisch, Matthias; Gailus-Durner, Valérie; Seidel, Alexander; Brack-Werner, Ruth; Werner, Thomas

    2001-01-01

    The publication of the first almost complete sequence of a human chromosome (chromosome 22) is a major milestone in human genomics. Together with the sequence, an excellent annotation of genes was published which certainly will serve as an information resource for numerous future projects. We noted that the annotation did not cover regulatory regions; in particular, no promoter annotation has been provided. Here we present an analysis of the complete published chromosome 22 sequence for promoters. A recent breakthrough in specific in silico prediction of promoter regions enabled us to attempt large-scale prediction of promoter regions on chromosome 22. Scanning of sequence databases revealed only 20 experimentally verified promoters, of which 10 were correctly predicted by our approach. Nearly 40% of our 465 predicted promoter regions are supported by the currently available gene annotation. Promoter finding also provides a biologically meaningful method for “chromosomal scaffolding”, by which long genomic sequences can be divided into segments starting with a gene. As one example, the combination of promoter region prediction with exon/intron structure predictions greatly enhances the specificity of de novo gene finding. The present study demonstrates that it is possible to identify promoters in silico on the chromosomal level with sufficient reliability for experimental planning and indicates that a wealth of information about regulatory regions can be extracted from current large-scale (megabase) sequencing projects. Results are available on-line at http://genomatix.gsf.de/chr22/. PMID:11230158

  7. First scintillating bolometer tests of a CLYMENE R&D on Li2MoO4 scintillators towards a large-scale double-beta decay experiment

    NASA Astrophysics Data System (ADS)

    Buşe, G.; Giuliani, A.; de Marcillac, P.; Marnieros, S.; Nones, C.; Novati, V.; Olivieri, E.; Poda, D. V.; Redon, T.; Sand, J.-B.; Veber, P.; Velázquez, M.; Zolotarova, A. S.

    2018-05-01

A new R&D effort on lithium molybdate scintillators has begun within the CLYMENE project (Czochralski growth of Li2MoO4 crYstals for the scintillating boloMeters used in the rare EveNts sEarches). One of the main goals of CLYMENE is the realization of a Li2MoO4 crystal growth line complementary to the one recently developed by LUMINEU, in view of a mass production capacity for CUPID, a next-generation tonne-scale bolometric experiment to search for neutrinoless double-beta decay. In the present paper we report the investigation of the performance and radiopurity of 158-g and 13.5-g scintillating bolometers based on a first large-mass (230 g) Li2MoO4 crystal scintillator developed within the CLYMENE project. In particular, a good energy resolution (2-7 keV FWHM in the energy range of 0.2-5 MeV), one of the highest light yields (0.97 keV/MeV) among Li2MoO4 scintillating bolometers, efficient alpha-particle discrimination (10 σ), and potentially low internal radioactive contamination (below 0.2-0.3 mBq/kg of U/Th, but 1.4 mBq/kg of 210Po) demonstrate the prospects of CLYMENE for the development of high-quality and radiopure Li2MoO4 scintillators for CUPID.

  8. GPP Webinar: Beyond Demonstration Projects: How Universities Can Use Mid-Scale Solar

    EPA Pesticide Factsheets

The Green Power Partnership hosts webinars on a regular basis that explore a variety of topics. This webinar provided a forum for universities to learn about new, not-yet-built renewable energy projects that may align with their energy, environmental, and financial objectives.

  9. Capital and Operating Costs of Small Arsenic Removal Adsorptive Media Systems

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) conducted 50 full-scale demonstration projects on treatment systems removing arsenic from drinking water in 26 states throughout the U.S. The projects were conducted to evaluate the performance, reliability, and cost of arsenic remo...

  10. Skate Genome Project: Cyber-Enabled Bioinformatics Collaboration

    PubMed Central

    Vincent, J.

    2011-01-01

    The Skate Genome Project, a pilot project of the North East Cyberinfrastructure Consortium (NECC), aims to produce a draft genome sequence of Leucoraja erinacea, the Little Skate. The pilot project was also designed to develop expertise in large-scale collaborations across the NECC region. An overview of the bioinformatics and infrastructure challenges faced during the first year of the project will be presented. Results to date and lessons learned from the perspective of a bioinformatics core will be highlighted.

  11. Combining points and lines in rectifying satellite images

    NASA Astrophysics Data System (ADS)

    Elaksher, Ahmed F.

    2017-09-01

    The rapid advance of remote sensing technologies has established the potential to gather accurate and reliable information about the Earth's surface from high-resolution satellite images. Remote sensing satellite images with a pixel size of less than one meter are currently used in large-scale mapping. Rigorous photogrammetric equations are usually used to describe the relationship between image coordinates and ground coordinates. These equations require knowledge of the exterior and interior orientation parameters of the image, which might not be available. On the other hand, the parallel projection transformation can represent the mathematical relationship between the image-space and object-space coordinate systems and provides the accuracy required for large-scale mapping using fewer ground control features. This article investigates the differences between point-based and line-based parallel projection transformation models in rectifying satellite images of different resolutions. The point-based parallel projection transformation model and its extended form are presented, and the corresponding line-based forms are developed. Results showed that the RMS errors computed using the point- and line-based transformation models are equivalent and satisfy the requirements of large-scale mapping. The differences between the transformation parameters computed using the point- and line-based models are insignificant. The results also showed a high correlation between differences in ground elevation and the RMS error.
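
The point-based parallel projection model referred to above is commonly written as a pair of linear equations in the ground coordinates, x = A1·X + A2·Y + A3·Z + A4 and y = A5·X + A6·Y + A7·Z + A8. A minimal sketch of recovering the eight parameters from ground control points by least squares, using synthetic points (the paper's extended and line-based forms are not reproduced here):

```python
# Sketch: fit the 8-parameter parallel (affine) projection model
#   x = A1*X + A2*Y + A3*Z + A4,   y = A5*X + A6*Y + A7*Z + A8
# from ground control points by least squares. Synthetic data only.

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_parallel_projection(ground, image):
    """Least-squares fit of the 8 parameters from (X,Y,Z) -> (x,y) pairs."""
    params = []
    for coord in (0, 1):                      # x-equation, then y-equation
        G = [[X, Y, Z, 1.0] for (X, Y, Z) in ground]   # design rows [X, Y, Z, 1]
        b = [img[coord] for img in image]
        # normal equations G^T G a = G^T b
        GtG = [[sum(G[r][i] * G[r][j] for r in range(len(G))) for j in range(4)]
               for i in range(4)]
        Gtb = [sum(G[r][i] * b[r] for r in range(len(G))) for i in range(4)]
        params += solve(GtG, Gtb)
    return params

# Synthetic check: recover known parameters from five control points.
true_p = [0.5, -0.1, 0.02, 100.0, 0.1, 0.6, -0.03, 200.0]
ground = [(0, 0, 0), (100, 0, 10), (0, 100, 20), (100, 100, 5), (50, 30, 15)]
image = [(true_p[0]*X + true_p[1]*Y + true_p[2]*Z + true_p[3],
          true_p[4]*X + true_p[5]*Y + true_p[6]*Z + true_p[7])
         for (X, Y, Z) in ground]
est = fit_parallel_projection(ground, image)
print([round(v, 6) for v in est])
```

In practice each ground control point contributes one row to each of the two independent 4-unknown systems, so four well-distributed, non-coplanar points already determine the model; more points give a least-squares estimate.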

  12. Reducing HIV infection among new injecting drug users in the China-Vietnam Cross Border Project.

    PubMed

    Des Jarlais, Don C; Kling, Ryan; Hammett, Theodore M; Ngu, Doan; Liu, Wei; Chen, Yi; Binh, Kieu Thanh; Friedmann, Patricia

    2007-12-01

    To assess an HIV prevention programme for injecting drug users (IDU) in the cross-border area between China and Vietnam. Serial cross-sectional surveys (0, 6, 12, 18, 24 and 36 months) of community-recruited current IDU. The project included peer educator outreach and the large-scale distribution of sterile injection equipment. Serial cross-sectional surveys with HIV testing of community-recruited IDU were conducted at baseline (before implementation) and 6, 12, 18, 24 and 36 months post-baseline. HIV prevalence and estimated HIV incidence among new injectors (individuals injecting drugs for < 3 years) in each survey wave were the primary outcome measures. The percentage of new injectors among all subjects declined across survey waves in both Ning Ming and Lang Son. HIV prevalence and estimated incidence fell by approximately half at the 24-month survey and by approximately three quarters at the 36-month survey in both areas (all P < 0.01). The implementation of large-scale outreach and syringe access programmes was followed by substantial reductions in HIV infection among new injectors, with no evidence of any increase in the number of individuals beginning to inject drugs. This project may serve as a model for large-scale HIV prevention programming for IDU in China, Vietnam and other developing/transitional countries.

  13. Close Range Calibration of Long Focal Length Lenses in a Changing Environment

    NASA Astrophysics Data System (ADS)

    Robson, Stuart; MacDonald, Lindsay; Kyle, Stephen; Shortis, Mark R.

    2016-06-01

    University College London is currently developing a large-scale multi-camera system for dimensional control tasks in manufacturing, including part machining, assembly and tracking, as part of the Light Controlled Factory project funded by the UK Engineering and Physical Sciences Research Council. In parallel, as part of the EU LUMINAR project funded by the European Association of National Metrology Institutes, refraction models of the atmosphere in factory environments are being developed with the intent of modelling and eliminating the effects of temperature and other variations. The accuracy requirements for both projects are extremely demanding, so improvements in the modelling of both camera imaging and the measurement environment are essential. At the junction of these two projects lies close range camera calibration. The accurate and reliable calibration of cameras across a realistic range of atmospheric conditions in the factory environment is vital in order to eliminate systematic errors. This paper demonstrates the challenge of experimentally isolating environmental effects at the level of a few tens of microns. Longer lines of sight promote the use and calibration of a near-perfect perspective projection from a Kern 75 mm lens with maximum radial distortion of the order of 0.5 µm. Coordination of a reference target array, representing a manufactured part, is achieved to better than 0.1 mm at a standoff of 8 m. More widely, the results contribute to better sensor understanding, improved mathematical modelling of factory environments and more reliable coordination of targets to 0.1 mm and better over large volumes.

  14. Pilot-Scale Biorefinery: Sustainable Transport Fuels from Biomass via Integrated Pyrolysis and Catalytic Hydroconversion - Wastewater Cleanup by Catalytic Hydrothermal Gasification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elliott, Douglas C.; Olarte, Mariefel V.; Hart, Todd R.

    2015-06-19

    DOE-EE Bioenergy Technologies Office has set forth several goals to increase the use of bioenergy and bioproducts derived from renewable resources. One of these goals is to facilitate the implementation of the biorefinery. The biorefinery will include the production of liquid fuels, power and, in some cases, products. The integrated biorefinery should stand alone from an economic perspective, with fuels and power driving the economy of scale while the economics/profitability of the facility depends on existing market conditions. UOP LLC proposed to demonstrate a fast pyrolysis based integrated biorefinery. Pacific Northwest National Laboratory (PNNL) has expertise in an important technology area of interest to UOP for use in their pyrolysis-based biorefinery. This CRADA project provides the supporting technology development and demonstration to allow incorporation of this technology into the biorefinery. PNNL developed catalytic hydrothermal gasification (CHG) for use with aqueous streams within the pyrolysis biorefinery. These aqueous streams included the aqueous phase separated from the fast pyrolysis bio-oil and the aqueous byproduct streams formed in the hydroprocessing of the bio-oil to finished products. The purpose of this project was to demonstrate a technically and economically viable technology for converting renewable biomass feedstocks to sustainable and fungible transportation fuels. To demonstrate the technology, UOP constructed and operated a pilot-scale biorefinery that processed one dry ton per day of biomass using fast pyrolysis. The anticipated outcomes of the project were a validated process technology, a range of validated feedstocks, product property and life-cycle data, and technical and operating data upon which to base the design of a full-scale biorefinery. The anticipated long-term outcomes from successful commercialization of the technology were: (1) the replacement of a significant fraction of petroleum-based fuels with advanced biofuels, leading to increased energy security and a decreased carbon footprint; and (2) the establishment of a new biofuel industry segment, leading to the creation of U.S. engineering, manufacturing, construction, operations and agricultural jobs. PNNL development of CHG progressed at two levels. Initial tests were made in the laboratory in both mini-scale and bench-scale continuous-flow reactor systems. Following positive results, the next level of evaluation was in the scaled-up engineering development system, which was operated at PNNL.

  15. Dual Arm Work Platform teleoperated robotics system. Innovative technology summary report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The US Department of Energy (DOE) and the Federal Energy Technology Center (FETC) have developed a Large Scale Demonstration Project (LSDP) at the Chicago Pile-5 Research Reactor (CP-5) at Argonne National Laboratory-East (ANL). The objective of the LSDP is to demonstrate potentially beneficial Deactivation and Decommissioning (D and D) technologies in comparison with current baseline technologies. The Dual Arm Work Platform (DAWP) demonstration focused on the use of the DAWP to segment and dismantle the CP-5 reactor tank and surrounding bio-shield components (including the graphite block reflector and lead and boral sheeting) and on performing some minor tasks best suited to teleoperated robotics that were not evaluated in this demonstration. The DAWP system is not a commercially available product at this time; the CP-5 implementation was its first D and D application. The demonstration of the DAWP was intended to determine the areas in which improvements must be made to make this technology commercially viable. The results of the demonstration are included in this greenbook. It is the intention of the developers to incorporate lessons learned at this demonstration, along with current technological advancements in robotics, into the next generation of the DAWP.

  16. Forensic Schedule Analysis of Construction Delay in Military Projects in the Middle East

    DTIC Science & Technology

    This research performs forensic schedule analysis of delay factors that impacted recent large-scale military construction projects in the Middle East...The methodologies for analysis are adapted from the Professional Practice Guide to Forensic Schedule Analysis, particularly Method 3.7 Modeled

  17. CRP: Collaborative Research Project (A Mathematical Research Experience for Undergraduates)

    ERIC Educational Resources Information Center

    Parsley, Jason; Rusinko, Joseph

    2017-01-01

    The "Collaborative Research Project" ("CRP")--a mathematics research experience for undergraduates--offers a large-scale collaborative experience in research for undergraduate students. CRP seeks to widen the audience of students who participate in undergraduate research in mathematics. In 2015, the inaugural CRP had 100…

  18. Testing the DQP: What Was Learned about Learning Outcomes?

    ERIC Educational Resources Information Center

    Ickes, Jessica L.; Flowers, Daniel R.

    2015-01-01

    Through a campuswide project using the Degree Qualifications Profile (DQP) as a comparison tool that engaged students and faculty, the authors share findings and implications about learning outcomes for IR professionals and DQP authors while considering the role of IR in large-scale, campuswide projects.

  19. Book Review: Large-Scale Ecosystem Restoration: Five Case Studies from the United States

    EPA Science Inventory

    Broad-scale ecosystem restoration efforts involve a very complex set of ecological and societal components, and the success of any ecosystem restoration project rests on an integrated approach to implementation. Editors Mary Doyle and Cynthia Drew have successfully synthesized ma...

  20. A sparsity-based iterative algorithm for reconstruction of micro-CT images from highly undersampled projection datasets obtained with a synchrotron X-ray source

    NASA Astrophysics Data System (ADS)

    Melli, S. Ali; Wahid, Khan A.; Babyn, Paul; Cooper, David M. L.; Gopi, Varun P.

    2016-12-01

    Synchrotron X-ray Micro Computed Tomography (Micro-CT) is an imaging technique increasingly used for non-invasive in vivo preclinical imaging. However, it often requires a large number of projections from many different angles to reconstruct high-quality images, leading to high radiation doses and long scan times. To utilize this imaging technique further for in vivo imaging, we need to design reconstruction algorithms that reduce the radiation dose and scan time without degrading reconstructed image quality. This research is focused on combining gradient-based Douglas-Rachford splitting with discrete wavelet packet shrinkage denoising to design an algorithm for reconstruction of large-scale reduced-view synchrotron Micro-CT images with acceptable quality metrics. These quality metrics are computed by comparing the reconstructed images with a high-dose reference image reconstructed from 1800 equally spaced projections spanning 180°. Visual and quantitative performance assessment of a synthetic head phantom and a femoral cortical bone sample, imaged at the biomedical imaging and therapy bending magnet beamline at the Canadian Light Source, demonstrates that the proposed algorithm is superior to existing reconstruction algorithms. Using the proposed reconstruction algorithm to reduce the number of projections in synchrotron Micro-CT is an effective way to reduce the overall radiation dose and scan time, improving in vivo imaging protocols.
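
The wavelet-shrinkage denoising step in sparsity-based reconstructions of this kind is commonly implemented as soft-thresholding of transform coefficients (the proximal operator of the l1 penalty). A minimal sketch of that operator alone; the Douglas-Rachford splitting and the CT forward model are omitted:

```python
def soft_threshold(coeffs, t):
    """Soft-thresholding (shrinkage): shrink each transform coefficient
    toward zero by t, setting small coefficients exactly to zero.
    This is the proximal operator of the l1 sparsity penalty."""
    out = []
    for c in coeffs:
        m = abs(c) - t
        out.append(m * (1.0 if c > 0 else -1.0) if m > 0 else 0.0)
    return out

# Small coefficients (noise) vanish; large ones are kept, slightly shrunk.
print(soft_threshold([3.0, -0.4, 0.0, -2.5, 0.1], t=0.5))
```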

  1. Preliminary measurement of the noise from the 2/9 scale model of the Large-scale Advanced Propfan (LAP) propeller, SR-7A

    NASA Technical Reports Server (NTRS)

    Dittmar, J. H.

    1985-01-01

    Noise data on the Large-scale Advanced Propfan (LAP) propeller model SR-7A were taken in the NASA Lewis 8- by 6-Foot Wind Tunnel. The maximum blade passing tone decreases from its peak level when going to higher helical tip Mach numbers. This noise reduction points to the use of higher propeller speeds as a possible method of reducing airplane cabin noise while maintaining high flight speed and efficiency. Comparison of the SR-7A blade passing noise with the noise of the similarly designed SR-3 propeller shows good agreement, as expected. The SR-7A propeller is slightly noisier than the SR-3 model in the plane of rotation at the cruise condition. Projections of the tunnel model data are made to the full-scale LAP propeller mounted on the test bed aircraft and compared with design predictions. The prediction method is conservative in the sense that it overpredicts the projected model data.

  2. Cruise noise of the 2/9th scale model of the Large-scale Advanced Propfan (LAP) propeller, SR-7A

    NASA Technical Reports Server (NTRS)

    Dittmar, James H.; Stang, David B.

    1987-01-01

    Noise data on the Large-scale Advanced Propfan (LAP) propeller model SR-7A were taken in the NASA Lewis Research Center 8- by 6-Foot Wind Tunnel. The maximum blade passing tone noise first rises with increasing helical tip Mach number to a peak level, then remains the same or decreases from that peak level when going to higher helical tip Mach numbers. This trend was observed for operation at both constant advance ratio and approximately equal thrust. This noise reduction, or leveling out at high helical tip Mach numbers, points to the use of higher propeller tip speeds as a possible method of limiting airplane cabin noise while maintaining high flight speed and efficiency. Projections of the tunnel model data are made to the full-scale LAP propeller mounted on the test bed aircraft and compared with predictions. The prediction method is found to be somewhat conservative in that it slightly overpredicts the projected model data at the peak.

  4. Large-scale serum protein biomarker discovery in Duchenne muscular dystrophy.

    PubMed

    Hathout, Yetrib; Brody, Edward; Clemens, Paula R; Cripe, Linda; DeLisle, Robert Kirk; Furlong, Pat; Gordish-Dressman, Heather; Hache, Lauren; Henricson, Erik; Hoffman, Eric P; Kobayashi, Yvonne Monique; Lorts, Angela; Mah, Jean K; McDonald, Craig; Mehler, Bob; Nelson, Sally; Nikrad, Malti; Singer, Britta; Steele, Fintan; Sterling, David; Sweeney, H Lee; Williams, Steve; Gold, Larry

    2015-06-09

    Serum biomarkers in Duchenne muscular dystrophy (DMD) may provide deeper insights into disease pathogenesis, suggest new therapeutic approaches, serve as acute read-outs of drug effects, and be useful as surrogate outcome measures to predict later clinical benefit. In this study a large-scale biomarker discovery was performed on serum samples from patients with DMD and age-matched healthy volunteers using a modified aptamer-based proteomics technology. Levels of 1,125 proteins were quantified in serum samples from two independent DMD cohorts: cohort 1 (The Parent Project Muscular Dystrophy-Cincinnati Children's Hospital Medical Center), 42 patients with DMD and 28 age-matched normal volunteers; and cohort 2 (The Cooperative International Neuromuscular Research Group, Duchenne Natural History Study), 51 patients with DMD and 17 age-matched normal volunteers. Forty-four proteins showed significant differences that were consistent in both cohorts when comparing DMD patients and healthy volunteers at a 1% false-discovery rate, a large number of significant protein changes for such a small study. These biomarkers can be classified by known cellular processes and by age-dependent changes in protein concentration. Our findings demonstrate both the utility of this unbiased biomarker discovery approach and suggest potential new diagnostic and therapeutic avenues for ameliorating the burden of DMD and, we hope, other rare and devastating diseases.
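
The 1% false-discovery-rate screen described above is typically carried out with the Benjamini-Hochberg procedure; a minimal sketch, with illustrative p-values that are not the study's data:

```python
def benjamini_hochberg(pvalues, q):
    """Return indices of hypotheses rejected at false-discovery rate q.

    Sort p-values ascending, find the largest rank k with
    p_(k) <= k/m * q, and reject the k smallest p-values.
    """
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank / m * q:
            k = rank
    return sorted(order[:k])

# Illustrative p-values for 8 proteins, screened at q = 0.05.
pvals = [0.205, 0.001, 0.039, 0.008, 0.074, 0.041, 0.06, 0.042]
print(benjamini_hochberg(pvals, q=0.05))   # indices of "discoveries"
```

The study's screen is the same idea at q = 0.01 over 1,125 proteins, with the added requirement that a protein pass in both cohorts.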

  5. A streamlined collecting and preparation protocol for DNA barcoding of Lepidoptera as part of large-scale rapid biodiversity assessment projects, exemplified by the Indonesian Biodiversity Discovery and Information System (IndoBioSys)

    PubMed Central

    Hausmann, Axel; Cancian de Araujo, Bruno; Sutrisno, Hari; Peggie, Djunijanti; Schmidt, Stefan

    2017-01-01

    Here we present a general collecting and preparation protocol for DNA barcoding of Lepidoptera as part of large-scale rapid biodiversity assessment projects, and a comparison with alternative preservation and vouchering methods. About 98% of the sequenced specimens processed using the present collecting and preparation protocol yielded sequences with more than 500 base pairs. The study is based on the first outcomes of the Indonesian Biodiversity Discovery and Information System (IndoBioSys). IndoBioSys is a German-Indonesian research project conducted by the Museum für Naturkunde in Berlin and the Zoologische Staatssammlung München, in close cooperation with the Research Center for Biology – Indonesian Institute of Sciences (RCB-LIPI, Bogor). PMID:29134041

  6. Large laser projection displays utilizing all-solid-state RGB lasers

    NASA Astrophysics Data System (ADS)

    Xu, Zuyan; Bi, Yong

    2005-01-01

    RGB laser projection displays have the advantages of a large color triangle, high color saturation and high image resolution. In this report, an RGB laser projection display system based on diode-pumped solid-state lasers is developed, with more than 4 W of white light synthesized from red (671 nm), green (532 nm) and blue (473 nm) lasers, and the display of brilliant and vivid DVD dynamic pictures on a 60-inch screen is demonstrated.

  7. Projecting future impacts of hurricanes on the carbon balance of eastern U.S. forests

    NASA Astrophysics Data System (ADS)

    Fisk, J. P.; Hurtt, G. C.; Chambers, J. Q.; Zeng, H.; Dolan, K.; Flanagan, S.; Rourke, O.; Negron Juarez, R. I.

    2011-12-01

    In U.S. Atlantic coastal areas, hurricanes are a principal agent of catastrophic wind damage, with dramatic impacts on the structure and functioning of forests. Substantial recent progress has been made to estimate the biomass loss and resulting carbon emissions caused by hurricanes impacting the U.S. Additionally, efforts to evaluate the net effects of hurricanes on the regional carbon balance have demonstrated the importance of viewing large disturbance events in the broader context of recovery from a mosaic of past events. Viewed over sufficiently long time scales and large spatial scales, regrowth from previous storms may largely offset new emissions; however, changes in number, strength or spatial distribution of extreme disturbance events will result in changes to the equilibrium state of the ecosystem and have the potential to result in a lasting carbon source or sink. Many recent studies have linked climate change to changes in the frequency and intensity of hurricanes. In this study, we use a mechanistic ecosystem model, the Ecosystem Demography (ED) model, driven by scenarios of future hurricane activity based on historic activity and future climate projections, to evaluate how changes in hurricane frequency, intensity and spatial distribution could affect regional carbon storage and flux over the coming century. We find a non-linear response where increased storm activity reduces standing biomass stocks reducing the impacts of future events. This effect is highly dependent on the spatial pattern and repeat interval of future hurricane activity. Developing this kind of predictive modeling capability that tracks disturbance events and recovery is key to our understanding and ability to predict the carbon balance of forests.

  8. Integrative, multimodal analysis of glioblastoma using TCGA molecular data, pathology images, and clinical outcomes.

    PubMed

    Kong, Jun; Cooper, Lee A D; Wang, Fusheng; Gutman, David A; Gao, Jingjing; Chisolm, Candace; Sharma, Ashish; Pan, Tony; Van Meir, Erwin G; Kurc, Tahsin M; Moreno, Carlos S; Saltz, Joel H; Brat, Daniel J

    2011-12-01

    Multimodal, multiscale data synthesis is becoming increasingly critical for successful translational biomedical research. In this letter, we present a large-scale investigative initiative on glioblastoma, a high-grade brain tumor, with complementary data types using in silico approaches. We integrate and analyze data from The Cancer Genome Atlas Project on glioblastoma that includes novel nuclear phenotypic data derived from microscopic slides, genotypic signatures described by transcriptional class and genetic alterations, and clinical outcomes defined by response to therapy and patient survival. Our preliminary results demonstrate numerous clinically and biologically significant correlations across multiple data types, revealing the power of in silico multimodal data integration for cancer research.

  9. Telecommunications model for continuing education of health professionals: the Royal Brompton case.

    PubMed

    Kotis, Takis

    2003-01-01

    Telemedicine is said to be helpful to both patients and providers, but real-world examples are needed to demonstrate its effectiveness. This paper presents such an example. Royal Brompton, under the Teleremedy Program of EC Telecom, conducted a project with the Children's Hospital of Athens, Greece, to provide remote diagnosis, management and continuing education for heart disease, using European ISDN technology. Preliminary results showed that, when carried out in a large-scale multi-site environment, the Teleremedy program significantly reduced geographic and socio-economic isolation for the patient and professional isolation for the physician. Comparison of original vs. transmitted data revealed no significant differences, with a diagnostic accuracy of 100%.

  10. Wind energy - A utility perspective

    NASA Astrophysics Data System (ADS)

    Fung, K. T.; Scheffler, R. L.; Stolpe, J.

    1981-03-01

    Broad consideration is given to the siting, demand, capital and operating cost, and wind turbine design factors involved in a utility company's incorporation of wind-powered electrical generation into existing grids. With the requirements of the Southern California Edison service region in mind, it is concluded that although the economic and legal climate for major investments in wind power is favorable, the continued development of large wind turbine machines (on the scale of NASA's 2.5 MW Mod-2 design) is imperative in order to reduce manpower and maintenance costs. Stress is also put on the use of demonstration projects for both vertical- and horizontal-axis devices, in order to build up operational experience and confidence.

  11. WET-NZ Multi-Mode Wave Energy Converter Advancement Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kopf, Steven

    2013-10-15

    The overall objective of the project was to verify the ocean wavelength functionality of the WET-NZ through targeted hydrodynamic testing at wave tank scale and controlled open-sea deployment of a 1/2 scale (1:2) experimental device. This objective was accomplished through a series of tasks designed to achieve four specific goals: wave tank testing to characterize hydrodynamic characteristics; open-sea testing of a new 1:2 scale experimental model; synthesis and analysis to demonstrate and confirm TRL5/6 status; and market impact and competitor analysis, business plan and commercialization strategy.

  12. Coronal mass ejection and solar flare initiation processes without appreciable

    NASA Astrophysics Data System (ADS)

    Veselovsky, I.

    TRACE and SOHO/EIT movies clearly show cases of coronal mass ejection and solar flare initiation without noticeable large-scale topology modifications in the observed features. Instead, new intermediate scales often appear throughout the erupting region structures while the overall configuration is preserved. Examples of this kind are presented and discussed in the light of existing magnetic field reconnection paradigms. It is demonstrated that spurious large-scale reconnections and detachments are often produced by projection effects in poorly resolved images of twisted loops and sheared arcades, especially when deformed parts of them are underexposed and not seen in the images for that reason alone. Other parts, which are normally exposed or overexposed, can create the illusion of "islands" or detached elements in these situations, though in reality they preserve the initial magnetic connectivity. Spurious "islands" of this kind could be wrongly interpreted as signatures of topological transitions in the large-scale magnetic fields in many instances described in the vast literature of the past, based mainly on fuzzy YOHKOH images, which resulted in the myth of universal solar flare models and the scenario of detached magnetic island formation with new null points in the large-scale magnetic field. Better visualization, with higher resolution and sensitivity limits, has made it possible to clarify this confusion and avoid this unjustified interpretation. It is concluded that topological changes obviously can happen in coronal magnetic fields, but such changes are not necessary ingredients of all coronal mass ejections and solar flares. The scenario of magnetic field opening is not universal for all ejections. Alternatively, expanding ejections with closed magnetic configurations can be produced by fast E×B drifts in strong inductive electric fields, which appear due to the emergence of new magnetic flux. Corresponding theoretical models are presented and discussed.

  13. High-fidelity operations in microfabricated surface ion traps

    NASA Astrophysics Data System (ADS)

    Maunz, Peter

    2017-04-01

    Trapped ion systems can be used to implement quantum computation as well as quantum simulation. To scale these systems to the number of qubits required to solve interesting problems in quantum chemistry or solid state physics, the use of large multi-zone ion traps has been proposed. Microfabrication enables the realization of surface-electrode ion traps with complex electrode structures. While these traps may enable the scaling of trapped-ion quantum information processing (QIP), microfabricated ion traps also pose several technical challenges. Here, we present Sandia's trap fabrication capabilities and characterize trap properties and shuttling operations in our most recent high optical access trap (HOA-2). To demonstrate the viability of Sandia's microfabricated ion traps for QIP we realize robust single- and two-qubit gates and characterize them using gate set tomography (GST). In this way we are able to demonstrate the first single-qubit gates with a diamond norm of less than 1.7×10⁻⁴, below a rigorous fault-tolerance threshold for general noise of 6.7×10⁻⁴. Furthermore, we realize Mølmer-Sørensen two-qubit gates with a process fidelity of 99.58(6)%, also characterized by GST. These results demonstrate the viability of microfabricated surface traps for state-of-the-art quantum information processing demonstrations. This research was funded, in part, by the Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects Activity (IARPA).

  14. Identification and Functional Prediction of Large Intergenic Noncoding RNAs (lincRNAs) in Rainbow Trout (Oncorhynchus mykiss)

    USDA-ARS?s Scientific Manuscript database

    Long noncoding RNAs (lncRNAs) have been recognized in recent years as key regulators of diverse cellular processes. Genome-wide large-scale projects have uncovered thousands of lncRNAs in many model organisms. Large intergenic noncoding RNAs (lincRNAs) are lncRNAs that are transcribed from intergeni...

  15. Design of dual multiple aperture devices for dynamical fluence field modulated CT.

    PubMed

    Mathews, Aswin John; Tilley, Steven; Gang, Grace; Kawamoto, Satomi; Zbijewski, Wojciech; Siewerdsen, Jeffrey H; Levinson, Reuven; Webster Stayman, J

    2016-07-01

    A Multiple Aperture Device (MAD) is a novel x-ray beam modulator that uses binary filtration on a fine scale to spatially modulate an x-ray beam. Using two MADs in series enables a large variety of fluence profiles to be produced by shifting the MADs relative to each other. This work details the design and control of dual MADs for a specific class of desired fluence patterns. Specifically, models of MAD operation are integrated into a best-fit objective followed by CMA-ES optimization. To illustrate this framework we demonstrate the design process for an abdominal phantom with the goal of uniform detected signal. Achievable fluence profiles show good agreement with target fluence profiles, and the ability to flatten projections when a phantom is scanned is demonstrated. Simulated data reconstructions using traditional tube current modulation (TCM) and MAD filtering with TCM are investigated, with the dual MAD system demonstrating more uniformity in noise and illustrating the potential for dose reduction under a maximum noise level constraint.
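
The core idea, two binary patterns in series whose relative shift selects a fluence profile, can be sketched with a toy model; the patterns, target, and brute-force shift search below are hypothetical simplifications (the paper's MAD geometry and CMA-ES optimization are not reproduced):

```python
# Toy sketch of the dual-MAD idea: two binary (open/blocked) filter
# patterns in series give a combined transmission that depends on their
# relative shift; searching over shifts picks the profile closest to a
# target. All values below are hypothetical, not from the paper.

def combined_transmission(mad1, mad2, shift):
    """Elementwise product of the two binary patterns, mad2 displaced by
    `shift` channels (circularly, for simplicity)."""
    n = len(mad1)
    return [mad1[i] * mad2[(i + shift) % n] for i in range(n)]

def best_shift(mad1, mad2, target):
    """Brute-force the relative shift minimizing squared error to target."""
    def err(s):
        t = combined_transmission(mad1, mad2, s)
        return sum((a - b) ** 2 for a, b in zip(t, target))
    return min(range(len(mad1)), key=err)

mad1 = [1, 1, 1, 0, 1, 0, 1, 1]
mad2 = [1, 0, 1, 1, 1, 1, 0, 1]
target = [1, 1, 1, 0, 1, 0, 0, 1]
s = best_shift(mad1, mad2, target)
print(s, combined_transmission(mad1, mad2, s))
```

A real design works with continuous attenuation and spectra over many degrees of freedom, which is why the authors use a best-fit objective with CMA-ES rather than an exhaustive search.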

  16. Final test results for the ground operations demonstration unit for liquid hydrogen

    NASA Astrophysics Data System (ADS)

    Notardonato, W. U.; Swanger, A. M.; Fesmire, J. E.; Jumper, K. M.; Johnson, W. L.; Tomsik, T. M.

    2017-12-01

    Described herein is a comprehensive project: a large-scale test of an integrated refrigeration and storage system called the Ground Operations and Demonstration Unit for Liquid Hydrogen (GODU LH2), sponsored by the Advanced Exploration Systems Program and constructed at Kennedy Space Center. A commercial cryogenic refrigerator was interfaced with a 125,000 L liquid hydrogen tank and auxiliary systems in a manner that enabled control of the propellant state by extracting heat via a closed-loop Brayton-cycle refrigerator coupled to a novel internal heat exchanger. The three primary objectives were demonstrating zero-loss storage and transfer, liquefaction of gaseous hydrogen, and propellant densification. Testing was performed at three different liquid hydrogen fill levels. Data were collected on tank pressure, internal tank temperature profiles, mass flow in and out of the system, and refrigeration system performance. All test objectives were successfully achieved during approximately two years of testing. A summary of the final results is presented in this paper.

  17. Strengthening Software Authentication with the ROSE Software Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, G

    2006-06-15

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, and C++ languages, with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects.

  18. Precision calculations of the cosmic shear power spectrum projection

    NASA Astrophysics Data System (ADS)

    Kilbinger, Martin; Heymans, Catherine; Asgari, Marika; Joudaki, Shahab; Schneider, Peter; Simon, Patrick; Van Waerbeke, Ludovic; Harnois-Déraps, Joachim; Hildebrandt, Hendrik; Köhlinger, Fabian; Kuijken, Konrad; Viola, Massimo

    2017-12-01

    We compute the spherical-sky weak-lensing power spectrum of the shear and convergence. We discuss various approximations, such as flat-sky, and first- and second-order Limber equations for the projection. We find that the impact of adopting these approximations is negligible when constraining cosmological parameters from current weak-lensing surveys. This is demonstrated using data from the Canada-France-Hawaii Telescope Lensing Survey. We find that the reported tension with Planck cosmic microwave background temperature anisotropy results cannot be alleviated. For future large-scale surveys with unprecedented precision, we show that the spherical second-order Limber approximation will provide sufficient accuracy. In this case, the cosmic-shear power spectrum is shown to be in agreement with the full projection at the sub-percent level for ℓ > 3, with the corresponding errors an order of magnitude below cosmic variance for all ℓ. When computing the two-point shear correlation function, we show that the flat-sky fast Hankel transformation results in errors below two percent compared to the full spherical transformation. In the spirit of reproducible research, our numerical implementation of all approximations and the full projection are publicly available within the package NICAEA at http://www.cosmostat.org/software/nicaea.
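    The first-order (extended) Limber approximation discussed above replaces the exact spherical projection with a single line-of-sight integral, C_ℓ ≈ ∫ dχ q(χ)²/χ² P_δ(k = (ℓ + 1/2)/χ). The sketch below evaluates that integral numerically; the lensing kernel, the smooth toy power spectrum, and the source distance are illustrative assumptions, not fits to any survey, and only the form of the projection integral follows the standard Limber expression.

```python
import math

CHI_S = 3000.0  # assumed comoving distance to a single source plane [Mpc/h]

def lensing_kernel(chi):
    """Toy single-source-plane lensing efficiency (up to constants)."""
    return chi * (1.0 - chi / CHI_S) if 0.0 < chi < CHI_S else 0.0

def matter_power(k):
    """Toy smooth power spectrum with a turnover, arbitrary units."""
    k0 = 0.02
    return k / (1.0 + (k / k0) ** 2) ** 2

def c_ell_limber(ell, n_steps=2000):
    """Midpoint-rule evaluation of the first-order Limber projection:
    C_ell ~ integral of q(chi)^2 / chi^2 * P(k=(ell+1/2)/chi) d chi."""
    dchi = CHI_S / n_steps
    total = 0.0
    for i in range(n_steps):
        chi = (i + 0.5) * dchi
        k = (ell + 0.5) / chi            # first-order Limber wavenumber
        total += lensing_kernel(chi) ** 2 / chi ** 2 * matter_power(k) * dchi
    return total

c_ells = {ell: c_ell_limber(ell) for ell in (10, 100, 1000)}
```

    The paper's point is that for ℓ > 3 this second-order-corrected form of the projection already agrees with the full spherical computation at the sub-percent level, well inside cosmic variance.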

  19. Regeneration of Full Scale Adsorptive Media Systems - Update

    EPA Science Inventory

    Presentation provides a short summary of the USEPA arsenic demonstration program followed by some results of lab and pilot tests on the regeneration of a number of exhausted media products collected from several demonstration projects. Following this short introduction, the pres...

  20. Bench Scale Process for Low Cost CO2 Capture Using a Phase-Changing Absorbent: Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Westendorf, Tiffany; Buddle, Stanlee; Caraher, Joel

    The objective of this project is to design and build a bench-scale process for a novel phase-changing aminosilicone-based CO2-capture solvent. The project will establish scalability and technical and economic feasibility of using a phase-changing CO2-capture absorbent for post-combustion capture of CO2 from coal-fired power plants. The U.S. Department of Energy's goal for Transformational Carbon Capture Technologies is the development of technologies available for demonstration by 2025 that can capture 90% of emitted CO2 with at least 95% CO2 purity for less than $40/tonne of CO2 captured. In the first budget period of the project, the bench-scale phase-changing CO2 capture process was designed using data and operating experience generated under a previous project (ARPA-E project DE-AR0000084). Sizing and specification of all major unit operations were completed, including detailed process and instrumentation diagrams. The system was designed to operate over a wide range of operating conditions to allow for exploration of the effect of process variables on CO2 capture performance. In the second budget period of the project, individual bench-scale unit operations were tested to determine the performance of each unit. Solids production was demonstrated in dry simulated flue gas across a wide range of absorber operating conditions, with single-stage CO2 conversion rates up to 75 mol%. Desorber operation was demonstrated in batch mode, resulting in desorption performance consistent with the equilibrium isotherms for the GAP-0/CO2 reaction. Important risks associated with the impact of gas humidity on solids consistency and of desorber temperature on thermal degradation were explored, and adjustments to the bench-scale process were made to address those effects. Corrosion experiments were conducted to support selection of suitable materials of construction for the major unit operations in the process.
The bench-scale unit operations were assembled into a continuous system to support steady-state system testing. In the third budget period of the project, continuous system testing was conducted, including closed-loop operation of the absorber and desorber systems. Slurries of GAP-0/GAP-0 carbamate/water mixtures produced in the absorber were pumped successfully to the desorber unit, and regenerated solvent was returned to the absorber. A techno-economic analysis, EH&S risk assessment, and solvent manufacturability study were completed.

  1. A theory of forest dynamics: Spatially explicit models and issues of scale

    NASA Technical Reports Server (NTRS)

    Pacala, S.

    1990-01-01

    Good progress has been made in the first year of DOE grant #FG02-90ER60933. The purpose of the project is to develop and investigate models of forest dynamics that apply across a range of spatial scales. The grant is one third of a three-part project. The second third was funded by the NSF this year and is intended to provide the empirical data necessary to calibrate and test small-scale (less than or equal to 1000 ha) models. The final third was also funded this year (NASA), and will provide data to calibrate and test the large-scale features of the models.

  2. A family of conjugate gradient methods for large-scale nonlinear equations.

    PubMed

    Feng, Dexiang; Sun, Min; Wang, Xueyong

    2017-01-01

    In this paper, we present a family of conjugate gradient projection methods for solving large-scale nonlinear equations. At each iteration, the method requires little storage and the subproblem can be solved easily. Compared with existing solution methods for the problem, its global convergence is established without requiring Lipschitz continuity of the underlying mapping. Preliminary numerical results are reported to show the efficiency of the proposed method.
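    A minimal sketch of the class of methods described, assuming a monotone mapping F: a conjugate gradient search direction, a derivative-free backtracking line search, and a hyperplane projection step in the Solodov-Svaiter style. The PRP-like beta formula, the restart safeguard, and the line-search constants are illustrative choices, not the authors' exact family.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return math.sqrt(dot(u, u))

def cg_projection_solve(F, x0, tol=1e-8, max_iter=2000, rho=0.5, sigma=1e-4):
    """Solve the monotone nonlinear equation F(x) = 0 without derivatives."""
    x = list(x0)
    Fx = F(x)
    d = [-g for g in Fx]                      # initial steepest-like direction
    for _ in range(max_iter):
        if norm(Fx) < tol:
            return x
        # Backtracking line search: accept t once
        #   -F(x + t d) . d >= sigma * t * ||d||^2
        t = 1.0
        while True:
            z = [a + t * di for a, di in zip(x, d)]
            Fz = F(z)
            if -dot(Fz, d) >= sigma * t * dot(d, d) or t < 1e-12:
                break
            t *= rho
        if norm(Fz) < tol:                    # trial point already solves F = 0
            return z
        # Hyperplane projection step onto {y : F(z).(y - z) = 0}
        step = dot(Fz, [a - b for a, b in zip(x, z)]) / dot(Fz, Fz)
        x_new = [a - step * g for a, g in zip(x, Fz)]
        F_new = F(x_new)
        # PRP-like conjugate gradient update (one member of the family)
        y = [a - b for a, b in zip(F_new, Fx)]
        beta = dot(F_new, y) / dot(Fx, Fx)
        d = [-g + beta * di for g, di in zip(F_new, d)]
        if -dot(F_new, d) <= 0:               # safeguard: restart if not descent
            d = [-g for g in F_new]
        x, Fx = x_new, F_new
    return x

# Monotone test problem: F(x) = A x - b with A symmetric positive definite;
# the exact root is x* = (0.2, 0.4).
A = [[3.0, 1.0], [1.0, 2.0]]
b = [1.0, 1.0]
F = lambda x: [sum(A[i][j] * x[j] for j in range(2)) - b[i] for i in range(2)]
root = cg_projection_solve(F, [0.0, 0.0])
```

    The projection step is what removes the need for Lipschitz continuity: each iterate is the projection of the current point onto a hyperplane separating it from the solution set, so the distance to any solution is non-increasing.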

  3. The African Genome Variation Project shapes medical genetics in Africa

    NASA Astrophysics Data System (ADS)

    Gurdasani, Deepti; Carstensen, Tommy; Tekola-Ayele, Fasil; Pagani, Luca; Tachmazidou, Ioanna; Hatzikotoulas, Konstantinos; Karthikeyan, Savita; Iles, Louise; Pollard, Martin O.; Choudhury, Ananyo; Ritchie, Graham R. S.; Xue, Yali; Asimit, Jennifer; Nsubuga, Rebecca N.; Young, Elizabeth H.; Pomilla, Cristina; Kivinen, Katja; Rockett, Kirk; Kamali, Anatoli; Doumatey, Ayo P.; Asiki, Gershim; Seeley, Janet; Sisay-Joof, Fatoumatta; Jallow, Muminatou; Tollman, Stephen; Mekonnen, Ephrem; Ekong, Rosemary; Oljira, Tamiru; Bradman, Neil; Bojang, Kalifa; Ramsay, Michele; Adeyemo, Adebowale; Bekele, Endashaw; Motala, Ayesha; Norris, Shane A.; Pirie, Fraser; Kaleebu, Pontiano; Kwiatkowski, Dominic; Tyler-Smith, Chris; Rotimi, Charles; Zeggini, Eleftheria; Sandhu, Manjinder S.

    2015-01-01

    Given the importance of Africa to studies of human origins and disease susceptibility, detailed characterization of African genetic diversity is needed. The African Genome Variation Project provides a resource with which to design, implement and interpret genomic studies in sub-Saharan Africa and worldwide. The African Genome Variation Project represents dense genotypes from 1,481 individuals and whole-genome sequences from 320 individuals across sub-Saharan Africa. Using this resource, we find novel evidence of complex, regionally distinct hunter-gatherer and Eurasian admixture across sub-Saharan Africa. We identify new loci under selection, including loci related to malaria susceptibility and hypertension. We show that modern imputation panels (sets of reference genotypes from which unobserved or missing genotypes in study sets can be inferred) can identify association signals at highly differentiated loci across populations in sub-Saharan Africa. Using whole-genome sequencing, we demonstrate further improvements in imputation accuracy, strengthening the case for large-scale sequencing efforts of diverse African haplotypes. Finally, we present an efficient genotype array design capturing common genetic variation in Africa.

  4. KSC-2009-6453

    NASA Image and Video Library

    2009-11-19

    CAPE CANAVERAL, Fla. – A ceremonial "flipping of the switch" officially begins operation of NASA's first large-scale solar power generation facility at NASA's Kennedy Space Center in Florida. Flipping the four-foot-tall light switch in unison are, from left, Bob Cabana, Kennedy center director; Roderick Roche, senior manager, Project Management Office of North America, SunPower Corporation; and Eric Silagy, Florida Power & Light Company vice president and chief development officer. Representatives from NASA, Florida Power & Light Company, or FPL, and SunPower Corporation formally commissioned the one-megawatt facility and announced plans to pursue a new research, development and demonstration project at Kennedy to advance America's use of renewable energy. The facility is the first element of a major renewable energy project currently under construction at Kennedy. The completed system features a fixed-tilt, ground-mounted solar power system designed and built by SunPower, along with SunPower solar panels. A 10-megawatt solar farm, which SunPower is building on nearby Kennedy property, will supply power to FPL's customers when it is completed in April 2010. Photo credit: NASA/Jim Grossmann

  5. The African Genome Variation Project shapes medical genetics in Africa.

    PubMed

    Gurdasani, Deepti; Carstensen, Tommy; Tekola-Ayele, Fasil; Pagani, Luca; Tachmazidou, Ioanna; Hatzikotoulas, Konstantinos; Karthikeyan, Savita; Iles, Louise; Pollard, Martin O; Choudhury, Ananyo; Ritchie, Graham R S; Xue, Yali; Asimit, Jennifer; Nsubuga, Rebecca N; Young, Elizabeth H; Pomilla, Cristina; Kivinen, Katja; Rockett, Kirk; Kamali, Anatoli; Doumatey, Ayo P; Asiki, Gershim; Seeley, Janet; Sisay-Joof, Fatoumatta; Jallow, Muminatou; Tollman, Stephen; Mekonnen, Ephrem; Ekong, Rosemary; Oljira, Tamiru; Bradman, Neil; Bojang, Kalifa; Ramsay, Michele; Adeyemo, Adebowale; Bekele, Endashaw; Motala, Ayesha; Norris, Shane A; Pirie, Fraser; Kaleebu, Pontiano; Kwiatkowski, Dominic; Tyler-Smith, Chris; Rotimi, Charles; Zeggini, Eleftheria; Sandhu, Manjinder S

    2015-01-15

    Given the importance of Africa to studies of human origins and disease susceptibility, detailed characterization of African genetic diversity is needed. The African Genome Variation Project provides a resource with which to design, implement and interpret genomic studies in sub-Saharan Africa and worldwide. The African Genome Variation Project represents dense genotypes from 1,481 individuals and whole-genome sequences from 320 individuals across sub-Saharan Africa. Using this resource, we find novel evidence of complex, regionally distinct hunter-gatherer and Eurasian admixture across sub-Saharan Africa. We identify new loci under selection, including loci related to malaria susceptibility and hypertension. We show that modern imputation panels (sets of reference genotypes from which unobserved or missing genotypes in study sets can be inferred) can identify association signals at highly differentiated loci across populations in sub-Saharan Africa. Using whole-genome sequencing, we demonstrate further improvements in imputation accuracy, strengthening the case for large-scale sequencing efforts of diverse African haplotypes. Finally, we present an efficient genotype array design capturing common genetic variation in Africa.

  6. Systems Engineering Provides Successful High Temperature Steam Electrolysis Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charles V. Park; Emmanuel Ohene Opare, Jr.

    2011-06-01

    This paper describes two Systems Engineering Studies completed at the Idaho National Laboratory (INL) to support development of the High Temperature Steam Electrolysis (HTSE) process. HTSE produces hydrogen from water using nuclear power and was selected by the Department of Energy (DOE) for integration with the Next Generation Nuclear Plant (NGNP). The first study was a reliability, availability and maintainability (RAM) analysis to identify critical areas for technology development based on available information regarding expected component performance. An HTSE process baseline flowsheet at commercial scale was used as a basis. The NGNP project also established a process and capability to perform future RAM analyses. The analysis identified which components had the greatest impact on HTSE process availability and indicated that the HTSE process could achieve over 90% availability. The second study developed a series of life-cycle cost estimates for the various scale-ups required to demonstrate the HTSE process. Both studies were useful in identifying near- and long-term efforts necessary for successful HTSE process deployment. The size of demonstrations to support scale-up was refined, which is essential to estimate near- and long-term cost and schedule. The life-cycle funding profile, with high-level allocations, was identified as the program transitions from experiment-scale R&D to engineering-scale demonstration.

  7. Gram-scale metafluids and large-area tunable metamaterials: design, fabrication, and nano-optical tomographic characterization (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Dionne, Jennifer A.

    2016-09-01

    Advances in metamaterials and metasurfaces have enabled unprecedented control of light-matter interactions. Metamaterial constituents support high-frequency electric and magnetic dipoles, which can be used as building blocks for new materials capable of negative refraction, electromagnetic cloaking, strong visible-frequency circular dichroism, and enhanced magnetic or chiral transitions in ions and molecules. However, most metamaterials to date have been limited to solid-state, static, narrow-band, and/or small-area structures. Here, we introduce the design, fabrication, and three-dimensional nano-optical characterization of large-area, dynamically-tunable metamaterials and gram-scale metafluids. First, we use transformation optics to design a broadband metamaterial constituent - a metallo-dielectric nanocrescent - characterized by degenerate electric and magnetic dipoles. A periodic array of nanocrescents exhibits large positive and negative refractive indices at optical frequencies, confirmed through simulations of plane wave refraction through a metamaterial prism. Simulations also reveal that the metamaterial optical properties are largely insensitive to the wavelength, orientation and polarization of incident light. Then, we introduce a new tomographic technique, cathodoluminescence (CL) spectroscopic tomography, to probe light-matter interactions in individual nanocrescents with nanometer-scale resolution. Two-dimensional CL maps of the three-dimensional nanostructure are obtained at various orientations, while a filtered back projection is used to reconstruct the CL intensity at each wavelength. The resulting tomograms allow us to locate regions of efficient cathodoluminescence in three dimensions across visible and near-infrared wavelengths, with contributions from material luminescence and radiative decay of electromagnetic eigenmodes. 
Finally, we demonstrate the fabrication of dynamically tunable large-area metamaterials and gram-scale metafluids, using a combination of colloidal synthesis, protein-directed assembly, self-assembly, etching, and stamping. The electric and magnetic response of the bulk metamaterial and metafluid are directly probed with optical scattering and spectroscopy. Using chemical swelling, these metamaterials exhibit reversible, unity-order refractive index changes that may provide a foundation for new adaptive optical materials in sensing, solar, and display applications.

  8. Servant leadership behaviors of aerospace and defense project managers and their relation to project success

    NASA Astrophysics Data System (ADS)

    Dominik, Michael T.

    The success of a project is dependent in part on the skills, knowledge, and behavior of its leader, the project manager. Despite advances in project manager certifications and professional development, the aerospace and defense industry has continued to see highly visible and expensive project failures partially attributable to failures in leadership. Servant leadership is an emerging leadership theory whose practitioners embrace empowerment, authenticity, humility, accountability, forgiveness, courage, standing back, and stewardship, but it has not yet been fully examined in the context of the project manager as leader. The objective of this study was to examine the relationship between servant leadership behaviors demonstrated by aerospace and defense project managers and the resulting success of their projects. Study participants were drawn from aerospace- and defense-oriented affinity groups on the LinkedIn social media platform. The participants rated their project managers using a 30-item servant leadership scale, and rated the success of their project using a 12-item project success scale. One hundred and fifteen valid responses were analyzed from 231 collected samples from persons who had worked for a project manager on an aerospace and defense project within the past year. The results of the study demonstrated statistically significant levels of positive correlation to project success for all eight servant leadership factors independently evaluated. Using multiple linear regression methods, the servant leadership factors of empowerment and authenticity were determined to be substantial and statistically significant predictors of project success. The study results established the potential application of servant leadership as a valid approach for improving outcomes of projects.

  9. Data management in large-scale collaborative toxicity studies: how to file experimental data for automated statistical analysis.

    PubMed

    Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette

    2013-06-01

    High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results in comparison to data analysis performed by the use of menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches into the European project ACuteTox is illustrated. Copyright © 2012 Elsevier Ltd. All rights reserved.
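    The workflow the abstract argues for can be sketched as a validation step: every concentration-response experiment is filed in one standardized table so that an automated analysis script can consume all files without per-study reformatting. The column names below, and the use of CSV as a self-contained stand-in for the project's Excel files, are assumptions for illustration, not the ACuteTox project's actual template.

```python
import csv
import io

# Standardized schema: one row per measured concentration-response point.
REQUIRED = ["compound", "endpoint", "concentration_uM", "response_pct"]

def validate_and_load(stream):
    """Check a standardized data file and return its rows as dicts.
    Raises ValueError on schema violations, so malformed files are
    caught before the automated statistical analysis runs."""
    reader = csv.DictReader(stream)
    if reader.fieldnames != REQUIRED:
        raise ValueError(f"unexpected columns: {reader.fieldnames}")
    rows = []
    for lineno, row in enumerate(reader, start=2):   # header is line 1
        try:
            row["concentration_uM"] = float(row["concentration_uM"])
            row["response_pct"] = float(row["response_pct"])
        except ValueError:
            raise ValueError(f"non-numeric value on line {lineno}")
        rows.append(row)
    return rows

# Example file in the standardized format (compound name is illustrative).
sample = io.StringIO(
    "compound,endpoint,concentration_uM,response_pct\n"
    "compound_A,viability,0.1,98.2\n"
    "compound_A,viability,10.0,41.5\n"
)
rows = validate_and_load(sample)
```

    The point of failing loudly on schema violations is the one the paper makes: automated downstream analysis is only as reliable as the consistency of the filed data.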

  10. The SCALE-UP Project

    NASA Astrophysics Data System (ADS)

    Beichner, Robert

    2015-03-01

    The Student Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) project was developed nearly 20 years ago as an economical way to provide collaborative, interactive instruction even for large enrollment classes. Nearly all research-based pedagogies have been designed with fairly high faculty-student ratios. The economics of introductory courses at large universities often precludes that situation, so SCALE-UP was created as a way to facilitate highly collaborative active learning with large numbers of students served by only a few faculty and assistants. It enables those students to learn and succeed not only in acquiring content, but also to practice important 21st century skills like problem solving, communication, and teamsmanship. The approach was initially targeted at undergraduate science and engineering students taking introductory physics courses in large enrollment sections. It has since expanded to multiple content areas, including chemistry, math, engineering, biology, business, nursing, and even the humanities. Class sizes range from 24 to over 600. Data collected from multiple sites around the world indicates highly successful implementation at more than 250 institutions. NSF support was critical for initial development and dissemination efforts. Generously supported by NSF (9752313, 9981107) and FIPSE (P116B971905, P116B000659).

  11. Last of the Monumental Book Catalogs.

    ERIC Educational Resources Information Center

    Welsh, William J.

    1981-01-01

    Reviews the history of the National Union Catalog and the publication of the Pre-1956 Imprints. The roles of the ALA and Mansell Publishing in the completion of what is probably the last large-scale nonautomated bibliographic project, editorial problems, and the role of automation in future projects are discussed. (JL)

  12. DEVELOPMENT OF A SCALABLE, LOW-COST, ULTRANANOCRYSTALLINE DIAMOND ELECTROCHEMICAL PROCESS FOR THE DESTRUCTION OF CONTAMINANTS OF EMERGING CONCERN (CECS) - PHASE II

    EPA Science Inventory

    This Small Business Innovation Research (SBIR) Phase II project will employ the large-scale, highly reliable boron-doped ultrananocrystalline diamond (BD-UNCD®) electrodes developed during the Phase I project to build and test an Electrochemical Anodic Oxidation process (EAOP)...

  13. Workforce Development Analysis | Energy Analysis | NREL

    Science.gov Websites

    …with customer service, construction, and electrical projects… One-half of surveyed firms reported… training, and experience that will enable continued large-scale deployment of wind and solar technologies… engineers; and project managers… Standardized education and training at all levels, primary school through…

  14. Factors Affecting Intervention Fidelity of Differentiated Instruction in Kindergarten

    ERIC Educational Resources Information Center

    Dijkstra, Elma M.; Walraven, Amber; Mooij, Ton; Kirschner, Paul A.

    2017-01-01

    This paper reports on the findings in the first phase of a design-based research project as part of a large-scale intervention study in Dutch kindergartens. The project aims at enhancing differentiated instruction and evaluating its effects on children's development, in particular high-ability children. This study investigates relevant…

  15. The Comprehensive Project for Deprived Communities in Israel.

    ERIC Educational Resources Information Center

    Goldstein, Joseph

    A large-scale educational program, involving 30 settlements and neighborhoods that had been defined as suffering from deprivation, this project included a variety of reinforcement and enrichment programs. Information for a case study of the program was collected through interviews. Findings indicated that the guiding principles of the program…

  16. Strategies for Effective Dissemination of the Outcomes of Teaching and Learning Projects

    ERIC Educational Resources Information Center

    Southwell, Deborah; Gannaway, Deanne; Orrell, Janice; Chalmers, Denise; Abraham, Catherine

    2010-01-01

    This paper describes an empirical study that addresses the question of how higher education institutions can disseminate effectively the outcomes of projects that seek to achieve large-scale change in teaching and learning. Traditionally, dissemination of innovation and good practice is strongly advocated within universities, but little…

  17. Sensitivity of CEAP cropland simulations to the parameterization of the APEX model

    USDA-ARS's Scientific Manuscript database

    For large scale applications like the U.S. National Scale Conservation Effects Assessment Project (CEAP), soil hydraulic characteristics data are not readily available and therefore need to be estimated. Field soil water properties are commonly approximated using laboratory soil water retention meas...

  18. Scaling up Psycholinguistics

    ERIC Educational Resources Information Center

    Smith, Nathaniel J.

    2011-01-01

    This dissertation contains several projects, each addressing different questions with different techniques. In chapter 1, I argue that they are unified thematically by their goal of "scaling up psycholinguistics"; they are all aimed at analyzing large data-sets using tools that reveal patterns to propose and test mechanism-neutral hypotheses about…

  19. Design of a decentralized reusable research database architecture to support data acquisition in large research projects.

    PubMed

    Iavindrasana, Jimison; Depeursinge, Adrien; Ruch, Patrick; Spahni, Stéphane; Geissbuhler, Antoine; Müller, Henning

    2007-01-01

    The diagnostic and therapeutic processes, as well as the development of new treatments, are hindered by the fragmentation of the information which underlies them. In a multi-institutional research study database, the clinical information system (CIS) contains the primary data input. A substantial portion of the budget of a large-scale clinical study is often spent on data creation and maintenance. The objective of this work is to design a decentralized, scalable, reusable database architecture with lower maintenance costs for managing and integrating the distributed heterogeneous data required as the basis for a large-scale research project. Technical and legal aspects are taken into account based on various use case scenarios. The architecture contains four layers: decentralized data storage and access at the production source, a connector serving as a proxy between the CIS and the external world, an information mediator acting as the data access point, and the client side. The proposed design will be implemented at six clinical centers participating in the @neurIST project as part of a larger system for data integration and reuse in aneurysm treatment.
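    The layered design described above can be sketched in a few lines: data stays at each clinical center, a connector exposes it to the outside, and an information mediator gives clients a single access point. The class names and the query interface are invented for illustration; the actual @neurIST components are far richer (authentication, pseudonymization, legal constraints).

```python
class Connector:
    """Proxy between a center's clinical information system and the
    outside world; here backed by a simple in-memory dict."""
    def __init__(self, center, records):
        self.center = center
        self._records = records          # data stays at its production source
    def query(self, patient_id):
        rec = self._records.get(patient_id)
        return {"center": self.center, **rec} if rec else None

class Mediator:
    """Single data-access point that fans a query out to all connectors
    and merges whatever each center returns."""
    def __init__(self, connectors):
        self.connectors = list(connectors)
    def query(self, patient_id):
        hits = (c.query(patient_id) for c in self.connectors)
        return [h for h in hits if h is not None]

# Example: two centers, queried through one mediator endpoint.
mediator = Mediator([
    Connector("center_a", {"p1": {"age": 54}}),
    Connector("center_b", {"p2": {"age": 61}}),
])
result = mediator.query("p2")
```

    Because clients talk only to the mediator, each center can change its internal storage without breaking the research database, which is where the claimed maintenance-cost savings come from.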

  20. HARD CHROME POLLUTION PREVENTION DEMONSTRATION PROJECT - INTERIM REPORT

    EPA Science Inventory

    In the project, five chromium emission prevention/control devices were tested that cover the spectrum of prevention/control techniques currently in use in small- and large-size hard chromium electroplating job shops. The project results show that some of the tested devices had ch...
