Climate@Home: Crowdsourcing Climate Change Research
NASA Astrophysics Data System (ADS)
Xu, C.; Yang, C.; Li, J.; Sun, M.; Bambacus, M.
2011-12-01
Climate change deeply impacts human wellbeing. Significant resources have been invested in building supercomputers capable of running advanced climate models, which help scientists understand climate change mechanisms and predict its trends. Although climate change influences all human beings, the general public is largely excluded from the research. On the other hand, scientists are eagerly seeking communication media for effectively informing the public about climate change and its consequences. The Climate@Home project is devoted to connecting the two with an innovative solution: crowdsourcing climate computing to the general public by harvesting volunteered computing resources from participants. A distributed web-based computing platform will be built to support climate computing, and members of the general public can 'plug in' their personal computers to participate in the research. People contribute the spare computing power of their computers to run a climate model that scientists use to predict climate change. Traditionally, only supercomputers could handle such a large processing load. By orchestrating massive numbers of personal computers to perform atomized data processing tasks, the investment in new supercomputers, the energy they consume, and the carbon they release are all reduced. Meanwhile, the platform forms a social network of climate researchers and the general public, which may be leveraged to raise climate awareness among participants. A portal is to be built as the gateway to the Climate@Home project. Three types of roles and their corresponding functionalities are designed and supported; the end users include citizen participants, climate scientists, and project managers. Citizen participants connect their computing resources to the platform by downloading and installing a computing engine on their personal computers. Climate models are defined on the server side.
Climate scientists configure model parameters through the portal user interface. After model configuration, scientists launch the computing task. Next, data are atomized and distributed to the computing engines running on citizen participants' computers. Scientists receive notifications on the completion of computing tasks and examine modeling results via the portal's visualization modules. Computing tasks, computing resources, and participants are managed by project managers via portal tools. A portal prototype has been built as a proof of concept. Three forums have been set up for different groups of users to share information on the science, technology, and educational outreach aspects of the project. A Facebook account has been set up to distribute messages via the most popular social networking platform, and new threads are synchronized from the forums to Facebook. A mapping tool displays the geographic locations of participants and the status of tasks on each client node. A group of users has been invited to test functions such as forums, blogs, and computing resource monitoring.
Ocean Modeling and Visualization on Massively Parallel Computer
NASA Technical Reports Server (NTRS)
Chao, Yi; Li, P. Peggy; Wang, Ping; Katz, Daniel S.; Cheng, Benny N.
1997-01-01
Climate modeling is one of the grand challenges of computational science, and ocean modeling plays an important role in both understanding the current climatic conditions and predicting future climate change.
Climate Ocean Modeling on Parallel Computers
NASA Technical Reports Server (NTRS)
Wang, P.; Cheng, B. N.; Chao, Y.
1998-01-01
Ocean modeling plays an important role in both understanding the current climatic conditions and predicting future climate change. However, modeling the ocean circulation at various spatial and temporal scales is a very challenging computational task.
A Computing Infrastructure for Supporting Climate Studies
NASA Astrophysics Data System (ADS)
Yang, C.; Bambacus, M.; Freeman, S. M.; Huang, Q.; Li, J.; Sun, M.; Xu, C.; Wojcik, G. S.; Cahalan, R. F.; NASA Climate @ Home Project Team
2011-12-01
Climate change is one of the major challenges facing Earth in the 21st century. Scientists build many models to simulate the past and predict climate change over the coming decades or century. Most of the models run at low resolution, with some targeting high resolution in connection with practical climate change preparedness. To calibrate and validate the models, millions of model runs are needed to find the best simulation and configuration. This paper introduces the NASA Climate@Home project, an effort to build a supercomputer based on advanced computing technologies such as cloud computing and grid computing. The Climate@Home computing infrastructure includes several aspects: 1) a cloud computing platform is utilized to manage potential spikes in access to the centralized components, such as the grid computing server that dispatches model runs and collects their results; 2) a grid computing engine, developed based on MapReduce, dispatches models and model configurations and collects simulation results and contribution statistics; 3) a portal serves as the entry point to the project, providing management, sharing, and data exploration for end users; 4) scientists can access customized tools to configure model runs and visualize model results; and 5) the public can follow the project's Twitter and Facebook accounts for the latest news. This paper introduces the latest progress of the project and demonstrates the operational system during the AGU Fall Meeting. It also discusses how this technology can become a trailblazer for other climate studies and relevant sciences, and shares how the challenges in computation and software integration were solved.
NASA Technical Reports Server (NTRS)
Druyan, Leonard M.
2012-01-01
Climate modeling is a very broad topic, so a single volume can only offer a small sampling of relevant research activities. This volume of 14 chapters includes descriptions of a variety of modeling studies for a variety of geographic regions by an international roster of authors. The climate research community generally uses the rubric 'climate models' to refer to organized sets of computer instructions that produce simulations of climate evolution. The code is based on physical relationships that describe the shared variability of meteorological parameters such as temperature, humidity, precipitation rate, circulation, radiation fluxes, etc. Three-dimensional climate models are integrated over time in order to compute the temporal and spatial variations of these parameters. Model domains can be global or regional, and the horizontal and vertical resolutions of the computational grid vary from model to model. Considering the entire climate system requires accounting for interactions between solar insolation and atmospheric, oceanic, and continental processes, the latter including land hydrology and vegetation. Model simulations may concentrate on one or more of these components, but the most sophisticated models will estimate the mutual interactions of all of these environments. Advances in computer technology have prompted investments in more complex model configurations that consider more phenomena interactions than were possible with yesterday's computers. However, not every attempt to add to the computational layers is rewarded by better model performance. Extensive research is required to test and document any advantages gained by greater sophistication in model formulation. One purpose for publishing climate model research results is to present purported advances for evaluation by the scientific community.
Majda, Andrew J; Abramov, Rafail; Gershgorin, Boris
2010-01-12
Climate change science focuses on predicting the coarse-grained, planetary-scale, long-time changes in the climate system due to either changes in external forcing or internal variability, such as the impact of increased carbon dioxide. The predictions of climate change science are carried out through comprehensive computational atmospheric and oceanic simulation models, which necessarily parameterize physical features such as clouds and sea ice cover. Recently, it has been suggested that there is irreducible imprecision in such climate models that manifests itself as structural instability in climate statistics and that can significantly hamper the skill of computer models for climate change. A systematic approach to dealing with this irreducible imprecision is advocated through algorithms based on the Fluctuation Dissipation Theorem (FDT). There are important practical and computational advantages for climate change science when a skillful FDT algorithm is established. The FDT response operator can be applied directly to multiple climate change scenarios, multiple changes in forcing, and other parameters such as damping, as well as to inverse modeling, without the need to run the complex climate model in each individual case. The high skill of FDT in predicting climate change, despite structural instability, is developed in an unambiguous fashion using mathematical theory as a guideline in three different test models: a generic class of analytical models mimicking the dynamical core of computer climate models, reduced stochastic models for low-frequency variability, and models with a significant new type of irreducible imprecision involving many fast, unstable modes.
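In schematic form, the FDT-based linear response described above can be written as follows (this is the standard general statement of the theorem, given here for orientation; the paper's test models and algorithms are more elaborate). For dynamics du/dt = F(u) perturbed by a small forcing of the form \(\hat{f}(u)\,\delta f(t)\), the change in the mean of an observable A is

```latex
% Linear response of observable A to a small forcing perturbation:
\delta\langle A\rangle(t) \;\approx\; \int_0^t R(t-s)\,\delta f(s)\,ds,
\qquad
R(\tau) \;=\; \big\langle A\big(u(\tau)\big)\, B\big(u(0)\big)\big\rangle_{\mathrm{eq}},
\qquad
B(u) \;=\; -\,\frac{\nabla_u \cdot \big(p_{\mathrm{eq}}(u)\,\hat{f}(u)\big)}{p_{\mathrm{eq}}(u)},
```

where \(p_{\mathrm{eq}}\) is the equilibrium (climate) distribution. Once the response operator \(R\) is computed from unperturbed statistics, it can be reused for multiple forcing scenarios without rerunning the full model, which is the practical advantage the abstract emphasizes.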
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jablonowski, Christiane
The research investigates and advances strategies for bridging the scale discrepancies between local, regional, and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway for modeling these interactions effectively with advanced computational tools, such as the Chombo AMR library developed at Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing, and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The investigations have focused on the characteristics of both static mesh adaptations and dynamically adaptive grids that can capture flow fields of interest, such as tropical cyclones. Six research themes were chosen: (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies, and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling.
The results of this research project demonstrate significant advances in all six research areas. The major conclusions are that statically adaptive variable-resolution modeling is currently becoming mature in the climate sciences, and that AMR holds outstanding promise for future-generation weather and climate models on high-performance computing architectures.
Combining Statistics and Physics to Improve Climate Downscaling
NASA Astrophysics Data System (ADS)
Gutmann, E. D.; Eidhammer, T.; Arnold, J.; Nowak, K.; Clark, M. P.
2017-12-01
Getting useful information from climate models is an ongoing problem that has plagued climate science and hydrologic prediction for decades. While it is possible to develop statistical corrections for climate models that mimic the current climate almost perfectly, this does not necessarily guarantee that future changes are portrayed correctly. In contrast, convection-permitting regional climate models (RCMs) have begun to provide an excellent representation of the regional climate system purely from first principles, providing greater confidence in their change signal. However, the computational cost of such RCMs prohibits the generation of ensembles of simulations or long time periods, thus limiting their applicability for hydrologic applications. Here we discuss a new approach combining statistical corrections with physical relationships for a modest computational cost. We have developed the Intermediate Complexity Atmospheric Research model (ICAR) to provide a climate and weather downscaling option that is based primarily on physics for a fraction of the computational requirements of a traditional regional climate model. ICAR also enables the incorporation of statistical adjustments directly within the model. We demonstrate that applying even simple corrections to precipitation while the model is running can improve the simulation of land-atmosphere feedbacks in ICAR. For example, by incorporating statistical corrections earlier in the modeling chain, we permit the model physics to better represent the effect of mountain snowpack on air temperature changes.
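One common form of statistical correction that could be applied inside a running model, as described above, is empirical quantile mapping: a simulated value is located within the model's own climatological distribution and replaced by the observed value at the same quantile. The following is a minimal sketch of that idea, not the specific correction used in ICAR; the training arrays are illustrative placeholders.

```python
import numpy as np

def quantile_map(x, model_climatology, obs_climatology):
    """Map a simulated value x onto the observed distribution by matching
    its empirical quantile in the model climatology to the same quantile
    in the observed climatology. Applied per timestep, the corrected field
    can feed back into downstream model physics (e.g. snowpack).
    Illustrative sketch; not ICAR's actual correction scheme."""
    # Quantile of x within the sorted model climatology
    p = np.searchsorted(np.sort(model_climatology), x) / len(model_climatology)
    # Observed value at that same quantile
    return float(np.interp(p, np.linspace(0.0, 1.0, len(obs_climatology)),
                           np.sort(obs_climatology)))
```

Applying such a mapping before the land-surface step, rather than as post-processing, is what lets the corrected precipitation influence the simulated feedbacks.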
Ravazzani, Giovanni; Ghilardi, Matteo; Mendlik, Thomas; Gobiet, Andreas; Corbari, Chiara; Mancini, Marco
2014-01-01
Assessing the future effects of climate change on water availability requires an understanding of how precipitation and evapotranspiration rates will respond to changes in atmospheric forcing. Use of simplified hydrological models is required because of the lack of meteorological forcings with the high space and time resolutions required to model hydrological processes in mountain river basins, and the necessity of reducing computational costs. The main objective of this study was to quantify the differences, when simulating the impact of climate change, between a simplified hydrological model, which uses only precipitation and temperature to compute the hydrological balance, and an enhanced version of the model, which solves the energy balance to compute the actual evapotranspiration. For the meteorological forcing of the future scenario, at-site bias-corrected time series based on two regional climate models were used. A quantile-based error-correction approach was used to downscale the regional climate model simulations to a point scale and to reduce their error characteristics. The study shows that a simple temperature-based approach for computing the evapotranspiration is sufficiently accurate for performing hydrological impact investigations of climate change for the Alpine river basin that was studied. PMID:25285917
CPMIP: measurements of real computational performance of Earth system models in CMIP6
NASA Astrophysics Data System (ADS)
Balaji, Venkatramani; Maisonnave, Eric; Zadeh, Niki; Lawrence, Bryan N.; Biercamp, Joachim; Fladrich, Uwe; Aloisio, Giovanni; Benson, Rusty; Caubel, Arnaud; Durachta, Jeffrey; Foujols, Marie-Alice; Lister, Grenville; Mocavero, Silvia; Underwood, Seth; Wright, Garrett
2017-01-01
A climate model represents a multitude of processes on a variety of timescales and space scales: a canonical example of multi-physics, multi-scale modeling. The underlying climate system is physically characterized by sensitive dependence on initial conditions and natural stochastic variability, so very long integrations are needed to extract signals of climate change. Algorithms generally possess weak scaling and can be I/O- and/or memory-bound. Such weak-scaling, I/O-, and memory-bound multi-physics codes present particular challenges to computational performance. Traditional metrics of computational efficiency, such as performance counters and scaling curves, do not tell us enough about real sustained performance from climate models on different machines, nor do they provide a satisfactory basis for comparative information across models. We introduce a set of metrics that can be used for the study of computational performance of climate (and Earth system) models. These measures do not require specialized software or specific hardware counters, and should be accessible to anyone. They are independent of platform and underlying parallel programming models. We show how these metrics can be used to measure actually attained performance of Earth system models on different machines, and to identify the most fruitful areas of research and development for performance engineering. We present results for these measures for a diverse suite of models from several modeling centers, and propose to use these measures as the basis for CPMIP, a computational performance model intercomparison project (MIP).
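Two widely quoted measures of this kind are sustained throughput (simulated years per wallclock day) and cost (core-hours per simulated year); both require only job accounting data, no hardware counters. A minimal sketch (the exact CPMIP definitions may differ in detail):

```python
def sypd(simulated_years, wallclock_days):
    """Simulated Years Per Day: sustained throughput of a climate run,
    computed from the model time simulated and the wallclock time used."""
    return simulated_years / wallclock_days

def chsy(cores, wallclock_hours, simulated_years):
    """Core-Hours per Simulated Year: computational cost of the run,
    computed from core count, wallclock hours, and model years simulated."""
    return cores * wallclock_hours / simulated_years
```

For example, a run that advances 10 model years in 2 days of wallclock time sustains 5 SYPD, regardless of platform, which is what makes such measures comparable across machines and models.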
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryan, Frank; Dennis, John; MacCready, Parker
This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation. The main computational objectives were: 1. To develop computationally efficient, but physically based, parameterizations of estuary and continental shelf mixing processes for use in an Earth System Model (CESM). 2. To develop a two-way nested regional modeling framework in order to dynamically downscale the climate response of particular coastal ocean regions and to upscale the impact of the regional coastal processes to the global climate in an Earth System Model (CESM). 3. To develop computational infrastructure to enhance the efficiency of data transfer between specific sources and destinations, i.e., a point-to-point communication capability (used in objective 1), within POP, the ocean component of CESM.
A European Flagship Programme on Extreme Computing and Climate
NASA Astrophysics Data System (ADS)
Palmer, Tim
2017-04-01
In 2016, an outline proposal for a (c. 1 billion euro) flagship project on exascale computing and high-resolution global climate modelling, co-authored by a number of leading climate modelling scientists from around Europe, was sent to the EU via its Future and Emerging Technologies Flagship Programme. The project is formally entitled "A Flagship European Programme on Extreme Computing and Climate (EPECC)". In this talk I will outline the reasons why I believe such a project is needed and describe its current status. I will leave time for discussion.
Model falsifiability and climate slow modes
NASA Astrophysics Data System (ADS)
Essex, Christopher; Tsonis, Anastasios A.
2018-07-01
The most advanced climate models are actually modified meteorological models that attempt to capture climate in meteorological terms. This seems a straightforward matter of applying raw computing power to large enough sources of current data. Some believe that models have succeeded in capturing climate in this manner. But have they? This paper outlines difficulties with this picture that derive from the finite representation of numbers in our computers and from the fundamental unavailability of future data. It suggests that alternative windows onto multi-decadal timescales are necessary in order to overcome the issues raised for practical problems of prediction.
Vezér, Martin A
2016-04-01
To study climate change, scientists employ computer models, which approximate target systems with various levels of skill. Given the imperfection of climate models, how do scientists use simulations to generate knowledge about the causes of observed climate change? Addressing a similar question in the context of biological modelling, Levins (1966) proposed an account grounded in robustness analysis. Recent philosophical discussions dispute the confirmatory power of robustness, raising the question of how the results of computer modelling studies contribute to the body of evidence supporting hypotheses about climate change. Expanding on Staley's (2004) distinction between evidential strength and security, and Lloyd's (2015) argument connecting variety-of-evidence inferences and robustness analysis, I address this question with respect to recent challenges to the epistemology of robustness analysis. Applying this epistemology to case studies of climate change, I argue that, despite imperfections in climate models and epistemic constraints on variety-of-evidence reasoning and robustness analysis, this framework accounts for the strength and security of evidence supporting climatological inferences, including the finding that global warming is occurring and its primary causes are anthropogenic.
Developing Models for Predictive Climate Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drake, John B; Jones, Philip W
2007-01-01
The Community Climate System Model results from a multi-agency collaboration designed to construct cutting-edge climate science simulation models for a broad research community. Predictive climate simulations are currently being prepared for the petascale computers of the near future. Modeling capabilities are continuously being improved in order to provide better answers to critical questions about Earth's climate. Climate change and its implications are front-page news in today's world. Could global warming be responsible for the July 2006 heat waves in Europe and the United States? Should more resources be devoted to preparing for an increase in the frequency of strong tropical storms and hurricanes like Katrina? Will coastal cities be flooded due to a rise in sea level? The National Climatic Data Center (NCDC), which archives all weather data for the nation, reports that global surface temperatures have increased over the last century, and that the rate of increase is three times greater since 1976. Will temperatures continue to climb at this rate, will they decline again, or will the rate of increase become even steeper? To address such a flurry of questions, scientists must adopt a systematic approach and develop a predictive framework. With responsibility for advising on energy and technology strategies, the DOE is dedicated to advancing climate research in order to elucidate the causes of climate change, including the role of carbon loading from fossil fuel use. Thus, climate science--which by nature involves advanced computing technology and methods--has been the focus of a number of DOE's SciDAC research projects. Dr. John Drake (ORNL) and Dr. Philip Jones (LANL) served as principal investigators on the SciDAC project, 'Collaborative Design and Development of the Community Climate System Model for Terascale Computers.'
The Community Climate System Model (CCSM) is a fully-coupled global system that provides state-of-the-art computer simulations of the Earth's past, present, and future climate states. The collaborative SciDAC team--including over a dozen researchers at institutions around the country--developed, validated, documented, and optimized the performance of CCSM using the latest software engineering approaches, computational technology, and scientific knowledge. Many of the factors that must be accounted for in a comprehensive model of the climate system are illustrated in figure 1.
Constructing Scientific Arguments Using Evidence from Dynamic Computational Climate Models
ERIC Educational Resources Information Center
Pallant, Amy; Lee, Hee-Sun
2015-01-01
Modeling and argumentation are two important scientific practices students need to develop throughout school years. In this paper, we investigated how middle and high school students (N = 512) construct a scientific argument based on evidence from computational models with which they simulated climate change. We designed scientific argumentation…
Estimation of the fractional coverage of rainfall in climate models
NASA Technical Reports Server (NTRS)
Eltahir, E. A. B.; Bras, R. L.
1993-01-01
The fraction of the grid cell area covered by rainfall, mu, is an essential parameter in descriptions of land surface hydrology in climate models. A simple procedure is presented for estimating this fraction, based on extensive observations of storm areas and rainfall volumes. Storm area and rainfall volume are often linearly related; this relation can be used to compute the storm area from the volume of rainfall simulated by a climate model. A formula is developed for computing mu, which describes the dependence of the fractional coverage of rainfall on the season of the year, the geographical region, rainfall volume, and the spatial and temporal resolution of the model. The new formula is applied in computing mu over the Amazon region. Significant temporal variability in the fractional coverage of rainfall is demonstrated. The implications of this variability for the modeling of land surface hydrology in climate models are discussed.
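The core of the procedure above is the approximately linear relation between storm area and rainfall volume, which lets mu be computed from the rainfall volume a climate model simulates for a grid cell. A minimal sketch follows; the slope value is an illustrative placeholder, not a coefficient from the paper, which makes mu depend on season, region, and model resolution.

```python
def fractional_coverage(rain_volume_m3, cell_area_m2, slope_per_m=2.0e-3):
    """Estimate mu, the fraction of a grid cell covered by rainfall.

    Uses the (approximately linear) storm-area vs. rainfall-volume relation
    described in the abstract: storm_area ~ slope * rain_volume. The slope
    here is a hypothetical placeholder; in practice it would be calibrated
    from storm observations for a given season, region, and resolution.
    """
    storm_area_m2 = slope_per_m * rain_volume_m3
    # mu is a fraction of the cell, so clamp to [0, 1]
    return min(max(storm_area_m2 / cell_area_m2, 0.0), 1.0)
```

A land-surface scheme would then apply the simulated rain over the fraction mu of the cell rather than spreading it uniformly.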
Statistical Emulation of Climate Model Projections Based on Precomputed GCM Runs*
Castruccio, Stefano; McInerney, David J.; Stein, Michael L.; ...
2014-02-24
The authors describe a new approach for emulating the output of a fully coupled climate model under arbitrary forcing scenarios that is based on a small set of precomputed runs from the model. Temperature and precipitation are expressed as simple functions of the past trajectory of atmospheric CO2 concentrations, and a statistical model is fit using a limited set of training runs. The approach is demonstrated to be a useful and computationally efficient alternative to pattern scaling and captures the nonlinear evolution of spatial patterns of climate anomalies inherent in transient climates. The approach does as well as pattern scaling in all circumstances and substantially better in many; it is not computationally demanding; and, once the statistical model is fit, it produces emulated climate output effectively instantaneously. It may therefore find wide application in climate impacts assessments and other policy analyses requiring rapid climate projections.
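The essence of such an emulator is a regression from features of the past CO2 trajectory (for example, current and lagged concentrations) to a climate response, fit on a few training runs and then evaluated instantly for new scenarios. The sketch below uses a plain least-squares fit as a stand-in; the paper's actual statistical model and choice of trajectory features are more sophisticated.

```python
import numpy as np

def fit_emulator(trajectory_features, responses):
    """Fit a linear statistical emulator mapping features of a past CO2
    trajectory (rows of trajectory_features, e.g. current and lagged
    concentrations) to a climate response such as regional temperature.
    Least-squares on training runs; an illustrative stand-in for the
    paper's statistical model, not its actual formulation."""
    X = np.column_stack([np.ones(len(trajectory_features)), trajectory_features])
    coeffs, *_ = np.linalg.lstsq(X, responses, rcond=None)
    return coeffs

def emulate(coeffs, trajectory_features):
    """Evaluate the fitted emulator for one new scenario: effectively
    instantaneous compared with running the coupled model."""
    x = np.concatenate([[1.0], np.atleast_1d(trajectory_features)])
    return x @ coeffs
```

Once `coeffs` is fit, projecting a new forcing scenario is a single dot product, which is what makes this class of emulator attractive for impacts assessment.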
The origins of computer weather prediction and climate modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lynch, Peter
2008-03-20
Numerical simulation of an ever-increasing range of geophysical phenomena is adding enormously to our understanding of complex processes in the Earth system. The consequences for mankind of ongoing climate change will be far-reaching. Earth System Models are capable of replicating climate regimes of past millennia and are the best means we have of predicting the future of our climate. The basic ideas of numerical forecasting and climate modeling were developed about a century ago, long before the first electronic computer was constructed. There were several major practical obstacles to be overcome before numerical prediction could be put into practice. A fuller understanding of atmospheric dynamics allowed the development of simplified systems of equations; regular radiosonde observations of the free atmosphere and, later, satellite data, provided the initial conditions; stable finite difference schemes were developed; and powerful electronic computers provided a practical means of carrying out the prodigious calculations required to predict the changes in the weather. Progress in weather forecasting and in climate modeling over the past 50 years has been dramatic. In this presentation, we will trace the history of computer forecasting through the ENIAC integrations to the present day. The useful range of deterministic prediction is increasing by about one day each decade, and our understanding of climate change is growing rapidly as Earth System Models of ever-increasing sophistication are developed.
Theory and Programs for Dynamic Modeling of Tree Rings from Climate
Paul C. van Deusen; Jennifer Koretz
1988-01-01
Computer programs written in GAUSS(TM) for IBM compatible personal computers are described that perform dynamic tree ring modeling with climate data; the underlying theory is also described. The programs and a separate users manual are available from the authors, although users must have the GAUSS software package on their personal computer. An example application of...
NASA Astrophysics Data System (ADS)
Hartin, C.; Lynch, C.; Kravitz, B.; Link, R. P.; Bond-Lamberty, B. P.
2017-12-01
Typically, uncertainty quantification of internal variability relies on large ensembles of climate model runs under multiple forcing scenarios or perturbations in a parameter space. Computationally efficient, standard pattern scaling techniques only generate one realization and do not capture the complicated dynamics of the climate system (i.e., stochastic variations with a frequency-domain structure). In this study, we generate large ensembles of climate data with spatially and temporally coherent variability across a subselection of Coupled Model Intercomparison Project Phase 5 (CMIP5) models. First, for each CMIP5 model we apply a pattern emulation approach to derive the model response to external forcing. We take all the spatial and temporal variability that isn't explained by the emulator and decompose it into non-physically based structures through use of empirical orthogonal functions (EOFs). Then, we perform a Fourier decomposition of the EOF projection coefficients to capture the input fields' temporal autocorrelation so that our new emulated patterns reproduce the proper timescales of climate response and "memory" in the climate system. Through this 3-step process, we derive computationally efficient climate projections consistent with CMIP5 model trends and modes of variability, which address a number of deficiencies inherent in the ability of pattern scaling to reproduce complex climate model behavior.
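The 3-step process described above (remove the emulated forced response, decompose the residual into EOFs, build Fourier surrogates of the projection coefficients) can be sketched with synthetic data. Everything below — the field sizes, the residual construction, and the phase-randomization details — is an illustrative assumption, not the authors' exact procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic residual field (time x space): the variability left over after an
# emulated forced response has been removed; sizes are illustrative.
nt, nx = 240, 50
resid = rng.standard_normal((nt, nx)) @ (0.1 * rng.standard_normal((nx, nx)))

# Step 1: EOF decomposition of the residual via SVD
anom = resid - resid.mean(axis=0)
U, s, Vt = np.linalg.svd(anom, full_matrices=False)
pcs = U * s     # principal-component (projection coefficient) time series
eofs = Vt       # spatial patterns

# Step 2: Fourier surrogate of each PC series -- keep the amplitude spectrum
# (and hence the temporal autocorrelation), randomize the phases
def phase_randomize(x, rng):
    X = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, len(X))
    phases[0] = phases[-1] = 0.0     # leave mean and Nyquist bins untouched
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))

# Step 3: recombine surrogate PCs with the EOFs into a new realization that
# shares the original field's spatial patterns and timescales
new_pcs = np.column_stack([phase_randomize(pcs[:, k], rng)
                           for k in range(pcs.shape[1])])
emulated = new_pcs @ eofs
```

Because only the Fourier phases change, each surrogate PC series keeps the power spectrum (and thus the autocovariance) of the original, which is what lets the emulated field reproduce the proper timescales of variability.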
Software Simplifies the Sharing of Numerical Models
NASA Technical Reports Server (NTRS)
2014-01-01
To ease the sharing of climate models with university students, Goddard Space Flight Center awarded SBIR funding to Reston, Virginia-based Parabon Computation Inc., a company that specializes in cloud computing. The firm developed a software program capable of running climate models over the Internet, and also created an online environment for people to collaborate on developing such models.
Multi-objective optimization of GENIE Earth system models.
Price, Andrew R; Myerscough, Richard J; Voutchkov, Ivan I; Marsh, Robert; Cox, Simon J
2009-07-13
The tuning of parameters in climate models is essential to provide reliable long-term forecasts of Earth system behaviour. We apply a multi-objective optimization algorithm to the problem of parameter estimation in climate models. This optimization process involves the iterative evaluation of response surface models (RSMs), followed by the execution of multiple Earth system simulations. These computations require an infrastructure that provides high-performance computing for building and searching the RSMs and high-throughput computing for the concurrent evaluation of a large number of models. Grid computing technology is therefore essential to make this algorithm practical for members of the GENIE project.
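The iterate-fit-search loop described above can be sketched with a cheap stand-in for an Earth system simulation. The toy objectives, the quadratic response surfaces, the random scalarization, and the batch size are all assumptions for illustration, not GENIE's actual setup:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for an expensive Earth system simulation: two competing error
# metrics as functions of two tunable parameters in [0, 1]^2.
def simulate(p):
    x, y = p
    return np.array([(x - 0.2) ** 2 + (y - 0.7) ** 2,   # objective 1
                     (x - 0.8) ** 2 + (y - 0.3) ** 2])  # objective 2

def fit_quadratic_rsm(P, f):
    """Least-squares quadratic response surface model of one objective."""
    x, y = P[:, 0], P[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coef, *_ = np.linalg.lstsq(A, f, rcond=None)
    return lambda q: np.column_stack(
        [np.ones(len(q)), q[:, 0], q[:, 1], q[:, 0] * q[:, 1],
         q[:, 0]**2, q[:, 1]**2]) @ coef

# Initial design, then iterative RSM refinement
P = rng.uniform(0, 1, (12, 2))
F = np.array([simulate(p) for p in P])
for _ in range(5):
    rsm = [fit_quadratic_rsm(P, F[:, j]) for j in range(2)]
    cand = rng.uniform(0, 1, (500, 2))       # cheap search on the surrogates
    w = rng.uniform(0, 1)                    # random scalarization weight
    score = w * rsm[0](cand) + (1 - w) * rsm[1](cand)
    best = cand[np.argsort(score)[:4]]       # batch of promising candidates
    P = np.vstack([P, best])                 # (evaluated concurrently in GENIE)
    F = np.vstack([F, [simulate(p) for p in best]])

# Non-dominated (Pareto) subset of all evaluated parameter sets
pareto = [i for i in range(len(F))
          if not any(np.all(F[j] <= F[i]) and np.any(F[j] < F[i])
                     for j in range(len(F)))]
```

The structure mirrors the abstract's division of labor: fitting and searching the RSMs is the high-performance part, while the batch of true-model evaluations per iteration is the high-throughput part.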
Quantifying uncertainty in climate change science through empirical information theory.
Majda, Andrew J; Gershgorin, Boris
2010-08-24
Quantifying the uncertainty for the present climate and the predictions of climate change in the suite of imperfect Atmosphere Ocean Science (AOS) computer models is a central issue in climate change science. Here, a systematic approach to these issues with firm mathematical underpinning is developed through empirical information theory. An information metric to quantify AOS model errors in the climate is proposed here which incorporates both coarse-grained mean model errors as well as covariance ratios in a transformation invariant fashion. The subtle behavior of model errors with this information metric is quantified in an instructive statistically exactly solvable test model with direct relevance to climate change science including the prototype behavior of tracer gases such as CO2. Formulas for identifying the most sensitive climate change directions using statistics of the present climate or an AOS model approximation are developed here; these formulas just involve finding the eigenvector associated with the largest eigenvalue of a quadratic form computed through suitable unperturbed climate statistics. These climate change concepts are illustrated on a statistically exactly solvable one-dimensional stochastic model with relevance for low frequency variability of the atmosphere. Viable algorithms for implementation of these concepts are discussed throughout the paper.
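The eigenvector recipe in the abstract can be sketched numerically. Everything below — the four-variable state and the placeholder quadratic form Q built from a sample covariance — is an illustrative assumption, not the paper's actual information metric:

```python
import numpy as np

rng = np.random.default_rng(2)

# Unperturbed "climate" statistics: a sample covariance over four hypothetical
# state variables (all numbers are invented for the sketch).
samples = rng.standard_normal((1000, 4)) * np.array([3.0, 1.0, 0.5, 0.2])
C = np.cov(samples, rowvar=False)

# The paper reduces finding the most sensitive climate change direction to an
# eigenproblem: the eigenvector of the largest eigenvalue of a quadratic form
# computed from unperturbed climate statistics (here a placeholder form).
Q = C @ C.T                           # symmetric positive semi-definite
eigvals, eigvecs = np.linalg.eigh(Q)  # eigenvalues in ascending order
most_sensitive = eigvecs[:, -1]       # unit vector maximizing v^T Q v / v^T v
```

The key property is that the top eigenvector maximizes the Rayleigh quotient of Q over all directions, so no search is needed: a single symmetric eigendecomposition identifies the most sensitive direction.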
EdGCM: Research Tools for Training the Climate Change Generation
NASA Astrophysics Data System (ADS)
Chandler, M. A.; Sohl, L. E.; Zhou, J.; Sieber, R.
2011-12-01
Climate scientists employ complex computer simulations of the Earth's physical systems to prepare climate change forecasts, study the physical mechanisms of climate, and test scientific hypotheses and model parameterizations. The Intergovernmental Panel on Climate Change 4th Assessment Report (2007) demonstrates unequivocally that policy makers rely heavily on such Global Climate Models (GCMs) to assess the impacts of potential economic and emissions scenarios. However, true climate modeling capabilities are not disseminated to the majority of world governments or U.S. researchers - let alone to the educators who will train the students soon to inherit a world full of climate change stakeholders. The goal of broad dissemination is not entirely quixotic; in fact, by the mid-1990's prominent climate scientists were predicting with certainty that schools and politicians would "soon" be running GCMs on laptops [Randall, 1996]. For a variety of reasons this goal was never achieved (nor even really attempted). However, around the same time NASA and the National Science Foundation supported a small pilot project at Columbia University to show the potential of putting sophisticated computer climate models - not just "demos" or "toy models" - into the hands of non-specialists. The Educational Global Climate Modeling Project (EdGCM) gave users access to a real global climate model and provided them with the opportunity to experience the details of climate model setup, model operation, post-processing and scientific visualization. EdGCM was designed for use in both research and education - it is a full-blown research GCM, but the ultimate goal is to develop a capability to embed these crucial technologies across disciplines, networks, platforms, and even across academia and industry. With this capability in place we can begin training the skilled workforce that is necessary to deal with the multitude of climate impacts that will occur over the coming decades. 
To further increase the educational potential of climate models, the EdGCM project has also created "EZgcm". Through a joint venture of NASA, Columbia University and McGill University, EZgcm moves the focus toward a greater use of Web 1.0 and Web 2.0-based technologies. It shifts the educational objectives towards a greater emphasis on teaching students how science is conducted and what role science plays in assessing climate change. That is, students learn about the steps of the scientific process as conveyed by climate modeling research: constructing a hypothesis, designing an experiment, running a computer model, using scientific visualization to support analysis, communicating the results of that analysis, and role playing the scientific peer review process. This is in stark contrast to what they learn from the political debate over climate change, which they often confuse with a scientific debate.
A computational approach to climate science education with CLIMLAB
NASA Astrophysics Data System (ADS)
Rose, B. E. J.
2017-12-01
CLIMLAB is a Python-based software toolkit for interactive, process-oriented climate modeling for use in education and research. It is motivated by the need for simpler tools and more reproducible workflows with which to "fill in the gaps" between blackboard-level theory and the results of comprehensive climate models. With CLIMLAB you can interactively mix and match physical model components, or combine simpler process models together into a more comprehensive model. I use CLIMLAB in the classroom to put models in the hands of students (undergraduate and graduate), and emphasize a hierarchical, process-oriented approach to understanding the key emergent properties of the climate system. CLIMLAB is equally a tool for climate research, where the same needs exist for more robust, process-based understanding and reproducible computational results. I will give an overview of CLIMLAB and an update on recent developments, including: a full-featured, well-documented, interactive implementation of a widely-used radiation model (RRTM); packaging with conda-forge for compiler-free (and hassle-free!) installation on Mac, Windows and Linux; interfacing with xarray for I/O and graphics with gridded model data; and a rich and growing collection of examples and self-computing lecture notes in Jupyter notebook format.
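CLIMLAB's process-oriented composition can be illustrated with a toy sketch. This is not the real climlab API; the two processes and all parameter values are invented for the example. Each physical process supplies a temperature tendency, and a "model" is just a list of processes stepped forward together:

```python
import numpy as np

class Radiation:
    """Relaxation toward a prescribed radiative-equilibrium profile."""
    def __init__(self, Teq, tau=30.0):
        self.Teq, self.tau = Teq, tau
    def tendency(self, T):
        return (self.Teq - T) / self.tau

class MeridionalDiffusion:
    """Down-gradient heat transport between latitude bands (no-flux poles)."""
    def __init__(self, D=0.3):
        self.D = D
    def tendency(self, T):
        lap = np.zeros_like(T)
        lap[1:-1] = T[2:] - 2 * T[1:-1] + T[:-2]
        lap[0] = T[1] - T[0]
        lap[-1] = T[-2] - T[-1]
        return self.D * lap

def integrate(T, processes, dt=1.0, steps=500):
    """Step all processes together: the sum of tendencies drives the state."""
    for _ in range(steps):
        T = T + dt * sum(p.tendency(T) for p in processes)
    return T

lats = np.linspace(-80, 80, 9)
Teq = 30.0 - 40.0 * np.sin(np.radians(lats)) ** 2   # warm equator, cold poles
T = integrate(np.full(9, 10.0), [Radiation(Teq), MeridionalDiffusion()])
```

At equilibrium the diffusion process flattens the radiative-equilibrium profile, cooling the equator and warming the poles relative to Teq — exactly the kind of emergent behavior that a process hierarchy is meant to expose to students.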
A Power Efficient Exaflop Computer Design for Global Cloud System Resolving Climate Models.
NASA Astrophysics Data System (ADS)
Wehner, M. F.; Oliker, L.; Shalf, J.
2008-12-01
Exascale computers would allow routine ensemble modeling of the global climate system at the cloud system resolving scale. Power and cost requirements of traditional architecture systems are likely to delay such capability for many years. We present an alternative route to the exascale using embedded processor technology to design a system optimized for ultra high resolution climate modeling. These power efficient processors, used in consumer electronic devices such as mobile phones, portable music players, cameras, etc., can be tailored to the specific needs of scientific computing. We project that a system capable of integrating a kilometer scale climate model a thousand times faster than real time could be designed and built in a five year time scale for US$75M with a power consumption of 3MW. This is cheaper, more power efficient and sooner than any other existing technology.
Bridging the Gap Between the iLEAPS and GEWEX Land-Surface Modeling Communities
NASA Technical Reports Server (NTRS)
Bonan, Gordon; Santanello, Joseph A., Jr.
2013-01-01
Models of Earth's weather and climate require fluxes of momentum, energy, and moisture across the land-atmosphere interface to solve the equations of atmospheric physics and dynamics. Just as atmospheric models can, and do, differ between weather and climate applications - mostly over issues of scale, resolved or parameterised physics, and computational requirements - so too can the land models that provide the required surface fluxes differ between weather and climate models. Here, however, the issue is less one of scale-dependent parameterisations. Computational demands can influence other minor land model differences, especially with respect to initialisation, data assimilation, and forecast skill. However, the distinction among land models (and their development and application) is largely driven by the different science and research needs of the weather and climate communities.
NASA Astrophysics Data System (ADS)
Chen, Xiuhong; Huang, Xianglei; Jiao, Chaoyi; Flanner, Mark G.; Raeker, Todd; Palen, Brock
2017-01-01
The suites of numerical models used for simulating climate of our planet are usually run on dedicated high-performance computing (HPC) resources. This study investigates an alternative to the usual approach, i.e. carrying out climate model simulations on commercially available cloud computing environment. We test the performance and reliability of running the CESM (Community Earth System Model), a flagship climate model in the United States developed by the National Center for Atmospheric Research (NCAR), on Amazon Web Service (AWS) EC2, the cloud computing environment by Amazon.com, Inc. StarCluster is used to create virtual computing cluster on the AWS EC2 for the CESM simulations. The wall-clock time for one year of CESM simulation on the AWS EC2 virtual cluster is comparable to the time spent for the same simulation on a local dedicated high-performance computing cluster with InfiniBand connections. The CESM simulation can be efficiently scaled with the number of CPU cores on the AWS EC2 virtual cluster environment up to 64 cores. For the standard configuration of the CESM at a spatial resolution of 1.9° latitude by 2.5° longitude, increasing the number of cores from 16 to 64 reduces the wall-clock running time by more than 50% and the scaling is nearly linear. Beyond 64 cores, the communication latency starts to outweigh the benefit of distributed computing and the parallel speedup becomes nearly unchanged.
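The scaling behavior reported above (near-linear to 64 cores, flat beyond) translates into speedup and parallel-efficiency numbers as in this sketch. The wall-clock times below are invented for illustration, not measurements from the study:

```python
# Hypothetical wall-clock times (hours per simulated year) for a CESM-like
# run at increasing core counts; the numbers are made up for the sketch but
# mimic the reported pattern: near-linear scaling to 64 cores, then a plateau
# as communication latency outweighs the benefit of more cores.
times = {16: 8.0, 32: 4.3, 64: 2.4, 128: 2.2}

base_cores = min(times)
for cores, t in sorted(times.items()):
    speedup = times[base_cores] / t
    efficiency = speedup / (cores / base_cores)
    print(f"{cores:4d} cores: speedup {speedup:4.2f}x, "
          f"parallel efficiency {efficiency:.0%}")
```

Going from 16 to 64 cores more than halves the wall-clock time (speedup above 2x), while the 128-core row shows parallel efficiency collapsing — the plateau the abstract describes.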
ClimateSpark: An in-memory distributed computing framework for big climate data analytics
NASA Astrophysics Data System (ADS)
Hu, Fei; Yang, Chaowei; Schnase, John L.; Duffy, Daniel Q.; Xu, Mengchao; Bowen, Michael K.; Lee, Tsengdar; Song, Weiwei
2018-06-01
The unprecedented growth of climate data creates new opportunities for climate studies, and yet big climate data pose a grand challenge to climatologists to efficiently manage and analyze big data. The complexity of climate data content and analytical algorithms increases the difficulty of implementing algorithms on high performance computing systems. This paper proposes an in-memory, distributed computing framework, ClimateSpark, to facilitate complex big data analytics and time-consuming computational tasks. Chunking data structure improves parallel I/O efficiency, while a spatiotemporal index is built for the chunks to avoid unnecessary data reading and preprocessing. An integrated, multi-dimensional, array-based data model (ClimateRDD) and ETL operations are developed to address big climate data variety by integrating the processing components of the climate data lifecycle. ClimateSpark utilizes Spark SQL and Apache Zeppelin to develop a web portal to facilitate the interaction among climatologists, climate data, analytic operations and computing resources (e.g., using SQL query and Scala/Python notebook). Experimental results show that ClimateSpark conducts different spatiotemporal data queries/analytics with high efficiency and data locality. ClimateSpark is easily adaptable to other big multiple-dimensional, array-based datasets in various geoscience domains.
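The chunk-plus-spatiotemporal-index idea can be sketched in a few lines. This is not ClimateSpark's actual code; the chunk sizes and the query are illustrative. Each chunk of an array-based climate file is indexed by its time and space bounds, so a query touches only the chunks its bounding box overlaps:

```python
# Build a toy index: chunk_id -> (t0, t1, lat0, lat1, lon0, lon1)
chunk_index = {}
cid = 0
for t in range(0, 120, 30):                 # 30-step time chunks
    for lat in range(-90, 90, 45):          # 45-degree latitude tiles
        for lon in range(-180, 180, 90):    # 90-degree longitude tiles
            chunk_index[cid] = (t, t + 30, lat, lat + 45, lon, lon + 90)
            cid += 1

def query(t0, t1, lat0, lat1, lon0, lon1):
    """Return ids of chunks overlapping the spatiotemporal bounding box."""
    hits = []
    for cid, (ct0, ct1, cla0, cla1, clo0, clo1) in chunk_index.items():
        if ct0 < t1 and t0 < ct1 and cla0 < lat1 and lat0 < cla1 \
           and clo0 < lon1 and lon0 < clo1:
            hits.append(cid)
    return hits

# A small regional query reads a handful of chunks instead of the whole file
hits = query(0, 30, 10, 40, -10, 20)
```

This is the mechanism behind "avoid unnecessary data reading and preprocessing": only the chunks returned by the index lookup are read and assigned to workers.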
NASA Technical Reports Server (NTRS)
Schoeberl, Mark; Rood, Richard B.; Hildebrand, Peter; Raymond, Carol
2003-01-01
The Earth System Model is the natural evolution of current climate models and will be the ultimate embodiment of our geophysical understanding of the planet. These models are constructed from components - atmosphere, ocean, ice, land, chemistry, solid earth, etc. - merged together through a coupling program which is responsible for the exchange of data among the components. Climate models and future Earth system models will have standardized modules, and these standards are now being developed by the ESMF project funded by NASA. The Earth System Model will have a variety of uses beyond climate prediction. The model can be used to build climate data records, making it the core of an assimilation system, and it can be used in observing system simulation experiments (OSSEs) to evaluate proposed observing systems. The computing and storage requirements for the ESM appear to be daunting. However, the theoretical computing capability of the Japanese Earth Simulator is already within 20% of the minimum requirements needed for some 2010 climate model applications. Thus it seems very possible that a focused effort to build an Earth System Model will achieve success.
Development of a Cloud Resolving Model for Heterogeneous Supercomputers
NASA Astrophysics Data System (ADS)
Sreepathi, S.; Norman, M. R.; Pal, A.; Hannah, W.; Ponder, C.
2017-12-01
A cloud resolving climate model is needed to reduce major systematic errors in climate simulations due to structural uncertainty in numerical treatments of convection - such as convective storm systems. This research describes the porting effort to enable SAM (System for Atmosphere Modeling) cloud resolving model on heterogeneous supercomputers using GPUs (Graphical Processing Units). We have isolated a standalone configuration of SAM that is targeted to be integrated into the DOE ACME (Accelerated Climate Modeling for Energy) Earth System model. We have identified key computational kernels from the model and offloaded them to a GPU using the OpenACC programming model. Furthermore, we are investigating various optimization strategies intended to enhance GPU utilization including loop fusion/fission, coalesced data access and loop refactoring to a higher abstraction level. We will present early performance results, lessons learned as well as optimization strategies. The computational platform used in this study is the Summitdev system, an early testbed that is one generation removed from Summit, the next leadership class supercomputer at Oak Ridge National Laboratory. The system contains 54 nodes wherein each node has 2 IBM POWER8 CPUs and 4 NVIDIA Tesla P100 GPUs. This work is part of a larger project, ACME-MMF component of the U.S. Department of Energy(DOE) Exascale Computing Project. The ACME-MMF approach addresses structural uncertainty in cloud processes by replacing traditional parameterizations with cloud resolving "superparameterization" within each grid cell of global climate model. Super-parameterization dramatically increases arithmetic intensity, making the MMF approach an ideal strategy to achieve good performance on emerging exascale computing architectures. The goal of the project is to integrate superparameterization into ACME, and explore its full potential to scientifically and computationally advance climate simulation and prediction.
Accelerating Climate and Weather Simulations through Hybrid Computing
NASA Technical Reports Server (NTRS)
Zhou, Shujia; Cruz, Carlos; Duffy, Daniel; Tucker, Robert; Purcell, Mark
2011-01-01
Unconventional multi- and many-core processors (e.g. IBM (R) Cell B.E.(TM) and NVIDIA (R) GPU) have emerged as effective accelerators in trial climate and weather simulations. Yet these climate and weather models typically run on parallel computers with conventional processors (e.g. Intel, AMD, and IBM) using Message Passing Interface. To address challenges involved in efficiently and easily connecting accelerators to parallel computers, we investigated using IBM's Dynamic Application Virtualization (TM) (IBM DAV) software in a prototype hybrid computing system with representative climate and weather model components. The hybrid system comprises two Intel blades and two IBM QS22 Cell B.E. blades, connected with both InfiniBand(R) (IB) and 1-Gigabit Ethernet. The system significantly accelerates a solar radiation model component by offloading compute-intensive calculations to the Cell blades. Systematic tests show that IBM DAV can seamlessly offload compute-intensive calculations from Intel blades to Cell B.E. blades in a scalable, load-balanced manner. However, noticeable communication overhead was observed, mainly due to IP over the IB protocol. Full utilization of IB Sockets Direct Protocol and the lower latency production version of IBM DAV will reduce this overhead.
The Finer Details: Climate Modeling
NASA Technical Reports Server (NTRS)
2000-01-01
If you want to know whether you will need sunscreen or an umbrella for tomorrow's picnic, you can simply read the local weather report. However, if you are calculating the impact of gas combustion on global temperatures, or anticipating next year's rainfall levels to set water conservation policy, you must conduct a more comprehensive investigation. Such complex matters require long-range modeling techniques that predict broad trends in climate development rather than day-to-day details. Climate models are built from equations that calculate the progression of weather-related conditions over time. Based on the laws of physics, climate model equations have been developed to predict a number of environmental factors, for example: 1. Amount of solar radiation that hits the Earth. 2. Varying proportions of gases that make up the air. 3. Temperature at the Earth's surface. 4. Circulation of ocean and wind currents. 5. Development of cloud cover. Numerical modeling of the climate can improve our understanding of both the past and the future. A model can confirm the accuracy of environmental measurements taken in the past and can even fill in gaps in those records. In addition, by quantifying the relationship between different aspects of climate, scientists can estimate how a future change in one aspect may alter the rest of the world. For example, could an increase in the temperature of the Pacific Ocean somehow set off a drought on the other side of the world? A computer simulation could lead to an answer for this and other questions. Quantifying the chaotic, nonlinear activities that shape our climate is no easy matter. You cannot run these simulations on your desktop computer and expect results by the time you have finished checking your morning e-mail. Efficient and accurate climate modeling requires powerful computers that can process billions of mathematical calculations in a single second. The NCCS exists to provide this degree of vast computing capability.
Accelerating Climate Simulations Through Hybrid Computing
NASA Technical Reports Server (NTRS)
Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark
2009-01-01
Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPUs) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for connection, we identified two challenges: (1) identical MPI implementation is required in both systems, and; (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors, two IBM QS22 Cell blades, connected with Infiniband), allowing for seamlessly offloading compute-intensive functions to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approximately 10% network overhead.
The computational future for climate and Earth system models: on the path to petaflop and beyond.
Washington, Warren M; Buja, Lawrence; Craig, Anthony
2009-03-13
The development of the climate and Earth system models has had a long history, starting with the building of individual atmospheric, ocean, sea ice, land vegetation, biogeochemical, glacial and ecological model components. The early researchers were much aware of the long-term goal of building the Earth system models that would go beyond what is usually included in the climate models by adding interactive biogeochemical interactions. In the early days, the progress was limited by computer capability, as well as by our knowledge of the physical and chemical processes. Over the last few decades, there has been much improved knowledge, better observations for validation and more powerful supercomputer systems that are increasingly meeting the new challenges of comprehensive models. Some of the climate model history will be presented, along with some of the successes and difficulties encountered with present-day supercomputer systems.
Parallel computing method for simulating hydrological processesof large rivers under climate change
NASA Astrophysics Data System (ADS)
Wang, H.; Chen, Y.
2016-12-01
Climate change is one of the most widely recognized global environmental problems. It has altered the spatial and temporal distribution of watershed hydrological processes, especially in the world's large rivers. Watershed hydrological process simulation based on physically based distributed hydrological models can yield better results than lumped models. However, such simulation involves a large amount of calculation, especially for large rivers, and thus needs huge computing resources that may not be steadily available to researchers, or only at high expense; this has seriously restricted research and application. Current parallel methods mostly parallelize in the space and time dimensions: based on a distributed hydrological model, they calculate natural features in order, by grid (unit or sub-basin), from upstream to downstream. This article proposes a high-performance computing method for hydrological process simulation with a high speedup ratio and parallel efficiency. It combines the spatial and temporal runoff characteristics of a distributed hydrological model with distributed data storage, an in-memory database, distributed computing, and parallel computing based on computing power units. The method has strong adaptability and extensibility: it makes full use of computing and storage resources when those resources are limited, and its computing efficiency improves nearly linearly as computing resources are added. This method can satisfy the parallel computing requirements of hydrological process simulation in small, medium, and large rivers.
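The upstream-to-downstream ordering described above is what makes parallelism possible: sub-basins whose upstream neighbors are finished can be simulated concurrently. A minimal sketch, with an invented five-basin routing network and a stand-in runoff function (none of this is the paper's actual method):

```python
from concurrent.futures import ThreadPoolExecutor

upstream = {                      # sub-basin -> its upstream sub-basins
    "A": [], "B": [], "C": ["A", "B"], "D": [], "E": ["C", "D"],
}

def simulate_subbasin(name, inflow):
    """Stand-in for runoff generation: local runoff plus routed inflow."""
    local_runoff = {"A": 2.0, "B": 1.0, "C": 0.5, "D": 3.0, "E": 0.25}[name]
    return local_runoff + inflow

outflow, done = {}, set()
with ThreadPoolExecutor() as pool:
    while len(done) < len(upstream):
        # all sub-basins whose upstream neighbors are finished form one level
        level = [b for b in upstream if b not in done
                 and all(u in done for u in upstream[b])]
        inflows = [sum(outflow[u] for u in upstream[b]) for b in level]
        # each level's sub-basins are simulated concurrently
        for b, q in zip(level, pool.map(simulate_subbasin, level, inflows)):
            outflow[b] = q
        done.update(level)
```

Here A, B and D run in parallel first, then C, then E; the outlet E accumulates the whole network's runoff (2 + 1 + 0.5 + 3 + 0.25 = 6.75).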
ClimateSpark: An In-memory Distributed Computing Framework for Big Climate Data Analytics
NASA Astrophysics Data System (ADS)
Hu, F.; Yang, C. P.; Duffy, D.; Schnase, J. L.; Li, Z.
2016-12-01
Massive array-based climate data is being generated from global surveillance systems and model simulations. These data are widely used to analyze environmental problems, such as climate change, natural hazards, and public health. However, extracting the underlying information from these big climate datasets is challenging due to both data- and computing-intensive issues in data processing and analysis. To tackle these challenges, this paper proposes ClimateSpark, an in-memory distributed computing framework to support big climate data processing. In ClimateSpark, a spatiotemporal index is developed to enable Apache Spark to treat array-based climate data (e.g. netCDF4, HDF4) as native formats, stored in the Hadoop Distributed File System (HDFS) without any preprocessing. Based on the index, spatiotemporal query services are provided to retrieve datasets according to a defined geospatial and temporal bounding box. The data subsets are read out, and a data partition strategy is applied to split the queried data equally across the computing nodes, storing them in memory as climateRDDs for processing. By leveraging Spark SQL and user-defined functions (UDFs), climate data analysis operations can be conducted in the intuitive SQL language. ClimateSpark is evaluated in two use cases using the NASA Modern-Era Retrospective Analysis for Research and Applications (MERRA) climate reanalysis dataset: one conducts spatiotemporal queries and visualizes the subset results in an animation; the other compares different climate model outputs using a Taylor-diagram service. Experimental results show that ClimateSpark can significantly accelerate data query and processing, and enable complex analysis services served in a SQL-style fashion.
System and Method for Providing a Climate Data Persistence Service
NASA Technical Reports Server (NTRS)
Schnase, John L. (Inventor); Ripley, III, William David (Inventor); Duffy, Daniel Q. (Inventor); Thompson, John H. (Inventor); Strong, Savannah L. (Inventor); McInerney, Mark (Inventor); Sinno, Scott (Inventor); Tamkin, Glenn S. (Inventor); Nadeau, Denis (Inventor)
2018-01-01
A system, method and computer-readable storage devices for providing a climate data persistence service. A system configured to provide the service can include a climate data server that performs data and metadata storage and management functions for climate data objects, a compute-storage platform that provides the resources needed to support a climate data server, provisioning software that allows climate data server instances to be deployed as virtual climate data servers in a cloud computing environment, and a service interface, wherein persistence service capabilities are invoked by software applications running on a client device. The climate data objects can be in various formats, such as International Organization for Standards (ISO) Open Archival Information System (OAIS) Reference Model Submission Information Packages, Archive Information Packages, and Dissemination Information Packages. The climate data server can enable scalable, federated storage, management, discovery, and access, and can be tailored for particular use cases.
A Simple Climate Model Program for High School Education
NASA Astrophysics Data System (ADS)
Dommenget, D.
2012-04-01
The future climate change projections of the IPCC AR4 are based on GCM simulations, which give a distinct global warming pattern, with an arctic winter amplification, an equilibrium land-sea contrast and an inter-hemispheric warming gradient. While these simulations are the most important tool of the IPCC predictions, a conceptual understanding of the predicted structures of climate change is very difficult to reach if based only on these highly complex GCM simulations, and they are not accessible to ordinary people. In the study presented here we introduce a very simple gridded, globally resolved energy balance model based on strongly simplified physical processes, which is capable of simulating the main characteristics of global warming. The model provides a bridge between the 1-dimensional energy balance models and the fully coupled 4-dimensional complex GCMs. It runs on standard PCs, computing a globally resolved climate simulation at 2 yrs per second, or 100,000 yrs per day. The program can compute typical global warming scenarios in a few minutes on a standard PC. The computer code is only 730 lines long, with very simple formulations that high school students should be able to understand. The simple model's climate sensitivity and the spatial structure of its warming pattern are within the uncertainties of the IPCC AR4 model simulations. It is capable of simulating the arctic winter amplification, the equilibrium land-sea contrast and the inter-hemispheric warming gradient with good agreement to the IPCC AR4 models in amplitude and structure. The program can be used for sensitivity studies in which students change something (e.g. reduce the solar radiation, take away the clouds or make snow black) and see how it affects the climate or the climate response to changes in greenhouse gases. This program is available to everyone and could be the basis for high school education. Partners for a high school project are wanted!
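The gridded model itself is not reproduced here, but its zero-dimensional relative shows the kind of energy-balance reasoning a 730-line program builds on. The emissivity values below are illustrative tuning choices, not the author's parameters:

```python
# Zero-dimensional energy balance: absorbed solar = (1 - albedo) * S0 / 4
# must equal emitted longwave = epsilon * sigma * T^4.
sigma = 5.670e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0           # solar constant, W m^-2
albedo = 0.30         # planetary albedo

def equilibrium_T(epsilon):
    """Surface temperature balancing absorbed solar against emission."""
    return ((1 - albedo) * S0 / (4 * epsilon * sigma)) ** 0.25

T_now = equilibrium_T(0.612)    # effective emissivity tuned to give ~288 K
T_warm = equilibrium_T(0.605)   # slightly stronger greenhouse effect
print(f"{T_now:.1f} K -> {T_warm:.1f} K (+{T_warm - T_now:.2f} K)")
```

Lowering the effective emissivity (a crude stand-in for adding greenhouse gases) warms the equilibrium temperature; the gridded model extends this same balance with spatially resolved albedo, heat transport and feedbacks.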
Kahnert, Michael; Nousiainen, Timo; Lindqvist, Hannakaisa; Ebert, Martin
2012-04-23
Light scattering by light absorbing carbon (LAC) aggregates encapsulated into sulfate shells is computed by use of the discrete dipole method. Computations are performed for a UV, visible, and IR wavelength, different particle sizes, and volume fractions. Reference computations are compared to three classes of simplified model particles that have been proposed for climate modeling purposes. None of the three models matches the reference results sufficiently well. Remarkably, more realistic core-shell geometries fall behind homogeneous mixture models. An extended model based on a core-shell-shell geometry is proposed and tested. Good agreement is found for total optical cross sections and the asymmetry parameter. © 2012 Optical Society of America
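One family of simplified treatments compared above is the homogeneous mixture model, in which the two-component particle is replaced by a single effective medium; a common mixing rule for such models is Maxwell Garnett. The refractive indices and volume fraction below are illustrative assumptions, not the paper's values:

```python
def maxwell_garnett(eps_incl, eps_host, f):
    """Maxwell Garnett effective permittivity for inclusions of volume
    fraction f embedded in a host medium."""
    d = eps_incl - eps_host
    return eps_host * (eps_incl + 2 * eps_host + 2 * f * d) / \
                      (eps_incl + 2 * eps_host - f * d)

# Assumed (illustrative) complex refractive indices; permittivity = m**2
m_soot, m_sulfate = 1.95 + 0.79j, 1.43 + 0.0j
f = 0.1                                   # soot volume fraction

eps_eff = maxwell_garnett(m_soot**2, m_sulfate**2, f)
m_eff = eps_eff ** 0.5                    # effective refractive index
```

The resulting effective index lies between the two components (with a small absorbing imaginary part contributed by the soot inclusions), which is exactly the simplification whose optical fidelity the reference DDA computations put to the test.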
Progress in fast, accurate multi-scale climate simulations
Collins, W. D.; Johansen, H.; Evans, K. J.; ...
2015-06-01
We present a survey of physical and computational techniques that have the potential to contribute to the next generation of high-fidelity, multi-scale climate simulations. Examples of the climate science problems that can be investigated with more depth with these computational improvements include the capture of remote forcings of localized hydrological extreme events, an accurate representation of cloud features over a range of spatial and temporal scales, and parallel, large ensembles of simulations to more effectively explore model sensitivities and uncertainties. Numerical techniques, such as adaptive mesh refinement, implicit time integration, and separate treatment of fast physical time scales are enabling improved accuracy and fidelity in simulation of dynamics and allowing more complete representations of climate features at the global scale. At the same time, partnerships with computer science teams have focused on taking advantage of evolving computer architectures such as many-core processors and GPUs. As a result, approaches which were previously considered prohibitively costly have become both more efficient and scalable. In combination, progress in these three critical areas is poised to transform climate modeling in the coming decades.
NASA Astrophysics Data System (ADS)
Wang, Lunche; Kisi, Ozgur; Zounemat-Kermani, Mohammad; Li, Hui
2017-01-01
Pan evaporation (Ep) plays an important role in agricultural water resources management. One of the basic challenges is modeling Ep from limited climatic parameters, because many factors affect the evaporation rate. This study investigated the abilities of six different soft computing methods, multi-layer perceptron (MLP), generalized regression neural network (GRNN), fuzzy genetic (FG), least square support vector machine (LSSVM), multivariate adaptive regression spline (MARS), and adaptive neuro-fuzzy inference systems with grid partition (ANFIS-GP), and two regression methods, multiple linear regression (MLR) and the Stephens and Stewart model (SS), in predicting monthly Ep. Long-term climatic data from various sites spanning a wide range of climates during 1961-2000 were used for model development and validation. The results showed that the models have different accuracies in different climates, and the MLP model was superior to the other models in predicting monthly Ep at most stations using local input combinations (for example, the mean absolute error (MAE), root mean square error (RMSE), and determination coefficient (R2) were 0.314 mm/day, 0.405 mm/day, and 0.988, respectively, for the HEB station), while the GRNN model performed better on the Tibetan Plateau (MAE, RMSE, and R2 of 0.459 mm/day, 0.592 mm/day, and 0.932, respectively). The accuracies of the above models ranked as follows: MLP, GRNN, LSSVM, FG, ANFIS-GP, MARS, and MLR. The overall results indicated that the soft computing techniques generally performed better than the regression methods, but the MLR and SS models may be preferred over complex nonlinear models in some climatic zones, for example, at the BJ (Beijing), CQ (Chongqing), and HK (Haikou) stations. It can therefore be concluded that Ep can be successfully predicted using the above models in hydrological modeling studies.
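The MLR baseline in the abstract above is ordinary least squares on a handful of climatic predictors. A minimal sketch of that baseline, using synthetic data (the predictor names, ranges, and coefficients here are hypothetical stand-ins for the station records used in the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly predictors standing in for the station records used in
# the study (names and ranges are hypothetical): air temperature (deg C),
# wind speed (m/s), relative humidity (%), sunshine hours.
n = 240                                        # 20 years of monthly values
X = np.column_stack([
    rng.uniform(-5, 35, n),                    # temperature
    rng.uniform(0, 10, n),                     # wind speed
    rng.uniform(20, 95, n),                    # relative humidity
    rng.uniform(0, 12, n),                     # sunshine hours
])
# Toy "true" Ep: rises with temperature, wind, and sunshine; falls with humidity.
y = (0.08 * X[:, 0] + 0.15 * X[:, 1] - 0.02 * X[:, 2] + 0.10 * X[:, 3]
     + rng.normal(0.0, 0.2, n))

# Multiple linear regression (the MLR baseline) by ordinary least squares.
A = np.column_stack([np.ones(n), X])           # prepend an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef

# The same skill scores reported in the study: MAE, RMSE, R2.
mae = np.mean(np.abs(pred - y))
rmse = np.sqrt(np.mean((pred - y) ** 2))
r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"MAE={mae:.3f} mm/day  RMSE={rmse:.3f} mm/day  R2={r2:.3f}")
```

The soft computing methods in the study (MLP, GRNN, LSSVM, and so on) replace the linear map `A @ coef` with nonlinear function approximators, but are scored with the same MAE/RMSE/R2 metrics.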
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hutchings, Jennifer; Joseph, Renu
2013-09-14
The goal of this project is to develop an eddy resolving ocean model (POP) with tides coupled to a sea ice model (CICE) within the Regional Arctic System Model (RASM) to investigate the importance of ocean tides and mesoscale eddies in arctic climate simulations and quantify biases associated with these processes and how their relative contribution may improve decadal to centennial arctic climate predictions. Ocean, sea ice and coupled arctic climate response to these small scale processes will be evaluated with regard to their influence on mass, momentum and property exchange between oceans, shelf-basin, ice-ocean, and ocean-atmosphere. The project will facilitate the future routine inclusion of polar tides and eddies in Earth System Models when computing power allows. As such, the proposed research addresses the science in support of the BER's Climate and Environmental Sciences Division Long Term Measure as it will improve the ocean and sea ice model components as well as the fully coupled RASM and Community Earth System Model (CESM) and it will make them more accurate and computationally efficient.
Ontological and Epistemological Issues Regarding Climate Models and Computer Experiments
NASA Astrophysics Data System (ADS)
Vezer, M. A.
2010-12-01
Recent philosophical discussions (Parker 2009; Frigg and Reiss 2009; Winsberg, 2009; Morgan 2002, 2003, 2005; Gula 2002) about the ontology of computer simulation experiments and the epistemology of inferences drawn from them are of particular relevance to climate science as computer modeling and analysis are instrumental in understanding climatic systems. How do computer simulation experiments compare with traditional experiments? Is there an ontological difference between these two methods of inquiry? Are there epistemological considerations that result in one type of inference being more reliable than the other? What are the implications of these questions with respect to climate studies that rely on computer simulation analysis? In this paper, I examine these philosophical questions within the context of climate science, instantiating concerns in the philosophical literature with examples found in analysis of global climate change. I concentrate on Wendy Parker's (2009) account of computer simulation studies, which offers a treatment of these and other questions relevant to investigations of climate change involving such modelling. Two theses at the center of Parker's account will be the focus of this paper. The first is that computer simulation experiments ought to be regarded as straightforward material experiments; which is to say, there is no significant ontological difference between computer and traditional experimentation. Parker's second thesis is that some of the emphasis on the epistemological importance of materiality has been misplaced. I examine both of these claims. First, I inquire as to whether viewing computer and traditional experiments as ontologically similar in the way she does implies that there is no proper distinction between abstract experiments (such as 'thought experiments' as well as computer experiments) and traditional 'concrete' ones.
Second, I examine the notion of materiality (i.e., the material commonality between object and target systems) and some arguments for the claim that materiality entails some inferential advantage to traditional experimentation. I maintain that Parker’s account of the ontology of computer simulations has some interesting though potentially problematic implications regarding conventional distinctions between abstract and concrete methods of inquiry. With respect to her account of materiality, I outline and defend an alternative account, posited by Mary Morgan (2002, 2003, 2005), which holds that ontological similarity between target and object systems confers some epistemological advantage to traditional forms of experimental inquiry.
The report gives results of a series of computer runs using the DOE-2.1E building energy model, simulating a small office in a hot, humid climate (Miami). These simulations assessed the energy and relative humidity (RH) penalties when the outdoor air (OA) ventilation rate is inc...
Computing and Systems Applied in Support of Coordinated Energy, Environmental, and Climate Planning
This talk focuses on how Dr. Loughlin is applying Computing and Systems models, tools and methods to more fully understand the linkages among energy systems, environmental quality, and climate change. Dr. Loughlin will highlight recent and ongoing research activities, including: ...
A flexible tool for diagnosing water, energy, and entropy budgets in climate models
NASA Astrophysics Data System (ADS)
Lembo, Valerio; Lucarini, Valerio
2017-04-01
We have developed new flexible software for studying the global energy budget, the hydrological cycle, and the material entropy production of global climate models. The program receives as input radiative, latent, and sensible energy fluxes, with the requirement that the variable names agree with the Climate and Forecast (CF) conventions for NetCDF datasets. Annual mean maps, meridional sections, and time series are computed by means of the Climate Data Operators (CDO) collection of command-line operators developed at the Max Planck Institute for Meteorology (MPI-M). If a land-sea mask is provided, the program also computes the required quantities separately over continents and oceans. Depending on the user's choice, the program can also call MATLAB to compute meridional heat transports and the locations and intensities of their peaks in the two hemispheres. We are currently planning to adapt the program for inclusion in the Earth System Model eValuation Tool (ESMValTool) community diagnostics.
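The core step behind any such budget diagnostic is an area-weighted global mean of flux fields. A minimal sketch of that step, with synthetic flux fields in place of model output (in the actual tool these come from CF-compliant NetCDF files processed with CDO):

```python
import numpy as np

# Hypothetical 2 x 2 degree flux fields (W/m^2); in the real tool these are
# read from CF-compliant NetCDF output of a climate model.
lat = np.arange(-89.0, 90.0, 2.0)
lon = np.arange(0.0, 360.0, 2.0)
rng = np.random.default_rng(1)
net_toa = rng.normal(0.8, 5.0, (lat.size, lon.size))    # net TOA radiative flux
latent = rng.normal(80.0, 10.0, (lat.size, lon.size))   # surface latent heat flux

# Area weighting: on a regular lat-lon grid, cell area scales with cos(latitude).
w = np.cos(np.deg2rad(lat))
w2d = np.broadcast_to(w[:, None], net_toa.shape)

def global_mean(field, weights):
    """Area-weighted global mean of a lat-lon field."""
    return float(np.sum(field * weights) / np.sum(weights))

print("global-mean TOA imbalance :", round(global_mean(net_toa, w2d), 2), "W/m^2")
print("global-mean latent heating:", round(global_mean(latent, w2d), 2), "W/m^2")
```

Meridional sections and transport peaks follow the same pattern, averaging over longitude only and then cumulating the zonal-mean budget residual with latitude.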
The BRIDGE HadCM3 family of climate models: HadCM3@Bristol v1.0
NASA Astrophysics Data System (ADS)
Valdes, Paul J.; Armstrong, Edward; Badger, Marcus P. S.; Bradshaw, Catherine D.; Bragg, Fran; Crucifix, Michel; Davies-Barnard, Taraka; Day, Jonathan J.; Farnsworth, Alex; Gordon, Chris; Hopcroft, Peter O.; Kennedy, Alan T.; Lord, Natalie S.; Lunt, Dan J.; Marzocchi, Alice; Parry, Louise M.; Pope, Vicky; Roberts, William H. G.; Stone, Emma J.; Tourte, Gregory J. L.; Williams, Jonny H. T.
2017-10-01
Understanding natural and anthropogenic climate change processes involves using computational models that represent the main components of the Earth system: the atmosphere, ocean, sea ice, and land surface. These models have become increasingly computationally expensive as resolution is increased and more complex process representations are included. However, to gain robust insight into how climate may respond to a given forcing, and to meaningfully quantify the associated uncertainty, it is often necessary to use ensemble approaches, very long integrations, or both. For this reason, more computationally efficient models can be very valuable tools. Here we provide a comprehensive overview of the suite of climate models based around the HadCM3 coupled general circulation model. This model was developed at the UK Met Office and has been heavily used during the last 15 years for a range of future (and past) climate change studies, but has now been largely superseded for many scientific studies by more recently developed models. However, it continues to be extensively used by various institutions, including the BRIDGE (Bristol Research Initiative for the Dynamic Global Environment) research group at the University of Bristol, who have made modest adaptations to the base HadCM3 model over time. These adaptations mean that the original documentation is not entirely representative, and several other relatively undocumented configurations are in use. We therefore describe the key features of a number of configurations of the HadCM3 climate model family, which together make up HadCM3@Bristol version 1.0. In order to differentiate variants that have undergone development at BRIDGE, we have introduced the letter B into the model nomenclature. We include descriptions of the atmosphere-only model (HadAM3B), the coupled model with a low-resolution ocean (HadCM3BL), the high-resolution atmosphere-only model (HadAM3BH), and the regional model (HadRM3B).
These also include three versions of the land surface scheme. By comparing with observational datasets, we show that these models produce a good representation of many aspects of the climate system, including the land and sea surface temperatures, precipitation, ocean circulation, and vegetation. This evaluation, combined with the relatively fast computational speed (up to 1000 times faster than some CMIP6 models), motivates continued development and scientific use of the HadCM3B family of coupled climate models, predominantly for quantifying uncertainty and for long multi-millennial-scale simulations.
Warren, Jeff; Iversen, Colleen; Brooks, Jonathan; Ricciuto, Daniel
2018-02-13
Using dogwood trees, Oak Ridge National Laboratory researchers are gaining a better understanding of the role photosynthesis and respiration play in the atmospheric carbon dioxide cycle. Their findings will aid computer modelers in improving the accuracy of climate simulations.
A Columnar Storage Strategy with Spatiotemporal Index for Big Climate Data
NASA Astrophysics Data System (ADS)
Hu, F.; Bowen, M. K.; Li, Z.; Schnase, J. L.; Duffy, D.; Lee, T. J.; Yang, C. P.
2015-12-01
Large collections of observational, reanalysis, and climate model output data may grow to as large as 100 PB in the coming years, placing climate data firmly in the Big Data domain, and various distributed computing frameworks have been utilized to address the challenges of big climate data analysis. However, because of the binary data formats (NetCDF, HDF) with high spatial and temporal dimensionality, the computing frameworks in the Apache Hadoop ecosystem are not natively suited to big climate data. In order to make these frameworks support big climate data directly, we propose a columnar storage format with a spatiotemporal index for climate data, which will support any project in the Apache Hadoop ecosystem (e.g., MapReduce, Spark, Hive, Impala). With this approach, the climate data are converted into Parquet, a binary columnar storage format, and a spatial and temporal index is built and appended to the end of each Parquet file to enable real-time data queries. Climate data in the Parquet format are then available to any computing framework in the Hadoop ecosystem. The proposed approach is evaluated using the NASA Modern-Era Retrospective Analysis for Research and Applications (MERRA) climate reanalysis dataset. Experimental results show that this approach efficiently bridges the gap between big climate data and the distributed computing frameworks, and that the spatiotemporal index significantly accelerates data querying and processing.
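The speedup from such an index comes from min/max statistics per row group, which let a query skip whole blocks of data without reading them. A toy sketch of that pruning idea in plain Python (mimicking Parquet's row-group layout rather than using Parquet itself; record layout and group size are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy climate records (time step, lat, lon), stored in "row groups" of 1000
# records each, mimicking Parquet's columnar row-group layout.
n = 10_000
time = np.sort(rng.integers(0, 365, n))       # sorted by time, as in reanalysis
lat = rng.uniform(-90.0, 90.0, n)
lon = rng.uniform(-180.0, 180.0, n)

group_size = 1000
groups = [slice(i, i + group_size) for i in range(0, n, group_size)]

# Spatiotemporal index: min/max of each dimension per row group, the kind of
# footer statistics the proposed format appends to each Parquet file.
index = [
    {"t": (time[g].min(), time[g].max()),
     "lat": (lat[g].min(), lat[g].max()),
     "lon": (lon[g].min(), lon[g].max())}
    for g in groups
]

def query_time_range(t0, t1):
    """Count records in [t0, t1], scanning only row groups whose index overlaps."""
    matched, scanned = 0, 0
    for g, idx in zip(groups, index):
        lo, hi = idx["t"]
        if hi < t0 or lo > t1:
            continue                          # pruned without touching the data
        scanned += 1
        matched += int(((time[g] >= t0) & (time[g] <= t1)).sum())
    return matched, scanned

matched, scanned = query_time_range(100, 110)
print(f"matched {matched} records, scanned {scanned}/{len(groups)} row groups")
```

Spatial pruning works identically against the `lat`/`lon` bounds, and the same statistics can be consumed by Spark or Hive predicate pushdown once stored in the file footer.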
Climate Science Performance, Data and Productivity on Titan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayer, Benjamin W; Worley, Patrick H; Gaddis, Abigail L
2015-01-01
Climate science models are flagship codes for the largest high performance computing (HPC) resources, both in visibility, with the newly launched Department of Energy (DOE) Accelerated Climate Model for Energy (ACME) effort, and in terms of significant fractions of system usage. The performance of the DOE ACME model is captured with application-level timers and examined through a sizeable run archive. Performance and variability of compute time, queue time, and ancillary services are examined. As climate science advances in its use of HPC resources, there has been an increase in the human and data systems required to achieve program goals. A description of current workflow processes (hardware, software, human) and planned automation of the workflow, along with historical and projected data-in-motion and data-at-rest usage, is detailed. The combination of these two topics motivates a description of future system requirements for DOE climate modeling efforts, focusing on the growth in data storage and in the network and disk bandwidth required to handle data at an acceptable rate.
Real-Time Climate Simulations in the Interactive 3D Game Universe Sandbox ²
NASA Astrophysics Data System (ADS)
Goldenson, N. L.
2014-12-01
Exploration in an open-ended computer game is an engaging way to explore climate and climate change. Everyone can explore physical models with real-time visualization in the educational simulator Universe Sandbox ² (universesandbox.com/2), which includes basic climate simulations on planets. I have implemented a time-dependent, one-dimensional meridional heat-transport energy balance model that runs, and can be adjusted, in real time in the midst of a larger simulated system. Universe Sandbox ² is based on the original game - at its core a gravity simulator - with other new physically-based content for stellar evolution and handling collisions between bodies. Existing users are mostly science enthusiasts in informal settings. We believe that this is the first climate simulation to be implemented in a professionally developed computer game with modern 3D graphical output in real time. The type of simple climate model we've adopted helps us depict the seasonal cycle and the more drastic changes that come from changing the orbit or other external forcings. Users can alter the climate as the simulation is running by altering the star(s) in the simulation, dragging to change orbits and obliquity, adjusting the climate simulation parameters directly, or changing other properties like CO2 concentration that affect the model parameters in representative ways. Ongoing visuals of the expansion and contraction of sea ice and snow cover respond to the temperature calculations, making it accessible to explore a variety of scenarios and intuitive to understand the output. Variables like temperature can also be graphed in real time. We balance computational constraints with the ability to capture the physical phenomena we wish to visualize, giving everyone access to a simple open-ended meridional energy balance climate simulation to explore and experiment with.
The software lends itself to labs at a variety of levels about climate concepts including seasons, the Greenhouse effect, reservoirs and flows, albedo feedback, Snowball Earth, climate sensitivity, and model experiment design. Climate calculations are extended to Mars with some modifications to the Earth climate component, and could be used in lessons about the Mars atmosphere, and exploring scenarios of Mars climate history.
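A one-dimensional meridional energy balance model of the kind described above can be written in a few dozen lines. The sketch below integrates the classic Budyko/Sellers-North form to equilibrium; all parameter values are standard textbook choices for illustration, not the values used in Universe Sandbox ²:

```python
import numpy as np

# One-dimensional meridional energy balance model (Budyko/Sellers-North type):
#   C dT/dt = S(x)(1 - albedo) - (A + B T) + D d/dx[(1 - x^2) dT/dx]
# with x = sin(latitude). Parameter values are illustrative textbook choices.
nx = 46
x = np.linspace(-0.989, 0.989, nx)            # sin(latitude) at cell centres
dx = x[1] - x[0]

S0 = 340.0                                     # global-mean insolation (W/m^2)
S = S0 * (1 - 0.482 * 0.5 * (3 * x**2 - 1))    # annual-mean insolation (P2 form)
albedo = 0.3
A, B = 210.0, 2.0                              # OLR linearization: A + B*T
D = 0.55                                       # heat-transport diffusivity (W/m^2/K)
C = 10.0                                       # heat capacity (W yr m^-2 K^-1)
dt = 0.01                                      # time step (years)

T = np.zeros(nx)                               # initial temperature (deg C)
for _ in range(20_000):                        # integrate ~200 model years
    # Diffusive heat-transport flux at cell interfaces: (1 - x^2) dT/dx.
    flux = (1 - (0.5 * (x[:-1] + x[1:]))**2) * np.diff(T) / dx
    div = np.empty(nx)
    div[1:-1] = np.diff(flux) / dx             # flux divergence in the interior
    div[0] = flux[0] / dx                      # no-flux condition at the poles
    div[-1] = -flux[-1] / dx
    T += dt / C * (S * (1 - albedo) - (A + B * T) + D * div)

print(f"equator: {T[nx // 2]:.1f} deg C, pole: {T[0]:.1f} deg C")
```

Changing `albedo`, the CO2-dependent OLR parameters `A` and `B`, or the insolation profile `S` as the loop runs reproduces the kind of live experimentation (Greenhouse effect, albedo feedback, Snowball Earth) the labs above describe.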
NASA Astrophysics Data System (ADS)
Iglesias, A.; Quiroga, S.; Garrote, L.; Cunningham, R.
2012-04-01
This paper provides monetary estimates of the effects of agricultural adaptation to climate change in Europe. The model computes spatial crop productivity changes as a response to climate change, linking biophysical and socioeconomic components. It combines available data sets of crop productivity changes under climate change (Iglesias et al 2011, Ciscar et al 2011), statistical functions of productivity response to water and nitrogen inputs, catchment-level water availability, and environmental policy scenarios. Future global change scenarios are derived from several socio-economic futures of representative concentration pathways and regional climate models. The economic valuation is conducted using the GTAP general equilibrium model. The marginal productivity changes have been used as input to the economic general equilibrium model in order to analyse the worldwide economic impact of the agricultural changes induced by climate change. The study also includes the analysis of an adaptive capacity index computed using the socio-economic results of GTAP. The results are combined to prioritize agricultural adaptation policy needs in Europe.
High resolution global climate modelling; the UPSCALE project, a large simulation campaign
NASA Astrophysics Data System (ADS)
Mizielinski, M. S.; Roberts, M. J.; Vidale, P. L.; Schiemann, R.; Demory, M.-E.; Strachan, J.; Edwards, T.; Stephens, A.; Lawrence, B. N.; Pritchard, M.; Chiu, P.; Iwi, A.; Churchill, J.; del Cano Novales, C.; Kettleborough, J.; Roseblade, W.; Selwood, P.; Foster, M.; Glover, M.; Malcolm, A.
2014-01-01
The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project constructed and ran an ensemble of HadGEM3 (Hadley Centre Global Environment Model 3) atmosphere-only global climate simulations over the period 1985-2011, at resolutions of N512 (25 km), N216 (60 km) and N96 (130 km) as used in current global weather forecasting, seasonal prediction and climate modelling respectively. Alongside these present climate simulations a parallel ensemble looking at extremes of future climate was run, using a time-slice methodology to consider conditions at the end of this century. These simulations were primarily performed using a 144 million core hour, single year grant of computing time from PRACE (the Partnership for Advanced Computing in Europe) in 2012, with additional resources supplied by the Natural Environment Research Council (NERC) and the Met Office. Almost 400 terabytes of simulation data were generated on the HERMIT supercomputer at the High Performance Computing Center Stuttgart (HLRS), and transferred to the JASMIN super-data cluster provided by the Science and Technology Facilities Council Centre for Data Archival (STFC CEDA) for analysis and storage. In this paper we describe the implementation of the project, present the technical challenges in terms of optimisation, data output, transfer and storage that such a project involves and include details of the model configuration and the composition of the UPSCALE dataset. This dataset is available for scientific analysis to allow assessment of the value of model resolution in both present and potential future climate conditions.
Advances in Cross-Cutting Ideas for Computational Climate Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ng, Esmond; Evans, Katherine J.; Caldwell, Peter
This report presents results from the DOE-sponsored workshop titled ``Advancing X-Cutting Ideas for Computational Climate Science Workshop,'' known as AXICCS, held on September 12--13, 2016 in Rockville, MD. The workshop brought together experts in climate science, computational climate science, computer science, and mathematics to discuss interesting but unsolved science questions regarding climate modeling and simulation, promoted collaboration among the diverse scientists in attendance, and brainstormed about possible tools and capabilities that could be developed to help address them. Several research opportunities that the group felt could advance climate science significantly emerged from the discussions. These include (1) process-resolving models to provide insight into important processes and features of interest and inform the development of advanced physical parameterizations, (2) a community effort to develop and provide integrated model credibility, (3) including, organizing, and managing increasingly connected model components that increase model fidelity but also complexity, and (4) treating Earth system models as one interconnected organism without numerical or data-based boundaries that limit interactions. The group also identified several cross-cutting advances in mathematics, computer science, and computational science that would be needed to enable one or more of these big ideas. It is critical to address the need for organized, verified, and optimized software, which enables the models to grow and continue to provide solutions in which the community can have confidence. Effectively utilizing the newest computer hardware enables simulation efficiency and the ability to handle output from increasingly complex and detailed models. This will be accomplished through hierarchical multiscale algorithms in tandem with new strategies for data handling, analysis, and storage.
These big ideas and cross-cutting technologies for enabling breakthrough climate simulation advancements also need the "glue" of outreach and learning across the scientific domains to be successful. The workshop identified several strategies to allow productive, continuous engagement across those who have a broad knowledge of the various angles of the problem. Specific ideas to foster education and tools to make material progress were discussed. Examples include follow-on cross-cutting meetings that enable unstructured discussions of the types this workshop fostered. A concerted effort to recruit undergraduate and graduate students from all relevant domains and provide them experience, training, and networking across their immediate expertise is needed. This will broaden and expand their exposure to the future needs and solutions, and provide a pipeline of scientists with a diversity of knowledge and know-how. Providing real-world experience with subject matter experts from multiple angles may also motivate the students to attack these problems and even come up with the missing solutions.
Shorebird Migration Patterns in Response to Climate Change: A Modeling Approach
NASA Technical Reports Server (NTRS)
Smith, James A.
2010-01-01
The availability of satellite remote sensing observations at multiple spatial and temporal scales, coupled with advances in climate modeling and information technologies, offers new opportunities for the application of mechanistic models to predict how continental-scale bird migration patterns may change in response to environmental change. In earlier studies, we explored the phenotypic plasticity of a migratory population of Pectoral sandpipers by simulating the movement patterns of an ensemble of 10,000 individual birds in response to changes in stopover locations, as an indicator of the impacts of wetland loss and inter-annual variability on the fitness of migratory shorebirds. We used an individual-based, biophysical migration model driven by remotely sensed land surface data, climate data, and biological field data. Mean stopover durations and stopover frequency with latitude predicted from our model for nominal cases were consistent with results reported in the literature and available field data. In this study, we take advantage of new computing capabilities enabled by recent GP-GPU computing paradigms and commodity hardware (general-purpose computing on graphics processing units). Several aspects of our individual-based (agent modeling) approach lend themselves well to GP-GPU computing. We have been able to allocate compute-intensive tasks to the graphics processing units, and now simulate ensembles of 400,000 birds at varying spatial resolutions along the central North American flyway. We are incorporating additional, species-specific, mechanistic processes to better reflect the processes underlying bird phenotypic plasticity responses to different climate change scenarios in the central U.S.
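Agent ensembles map well onto GPUs because every bird's state can be held in flat arrays and updated with the same rule in parallel. A toy sketch of that data-parallel layout with NumPy (the migration rule, thresholds, and rates here are invented for illustration, not the study's biophysical model):

```python
import numpy as np

rng = np.random.default_rng(3)

# Vectorized state for an ensemble of simulated birds: one array element per
# bird, the data-parallel layout that maps naturally onto GP-GPU hardware.
n_birds = 400_000
lat = np.full(n_birds, 30.0)                   # start latitude (degrees N)
fuel = rng.uniform(0.2, 1.0, n_birds)          # normalized fat load

FLIGHT_COST = 0.15                              # fuel spent per flight (invented)
REFUEL_RATE = 0.05                              # fuel gained per stopover day
DEPART_THRESHOLD = 0.5                          # minimum fuel to depart

for day in range(60):
    # Departure decision for all birds at once, no per-agent loop.
    ready = fuel >= DEPART_THRESHOLD
    hop = rng.uniform(2.0, 4.0, n_birds)        # degrees latitude per flight
    lat = np.where(ready, np.minimum(lat + hop, 70.0), lat)
    fuel = np.where(ready, fuel - FLIGHT_COST,
                    np.minimum(fuel + REFUEL_RATE, 1.0))

print(f"mean latitude after 60 days: {lat.mean():.1f} N")
```

On a GPU the same code pattern runs with CuPy or CUDA kernels in place of NumPy; the study's model additionally conditions the departure and refueling rules on remotely sensed land surface and climate data at each stopover.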
Efficient and Flexible Climate Analysis with Python in a Cloud-Based Distributed Computing Framework
NASA Astrophysics Data System (ADS)
Gannon, C.
2017-12-01
As climate models become progressively more advanced, and spatial resolution further improved through various downscaling projects, climate projections at a local level are increasingly insightful and valuable. However, the raw size of climate datasets presents numerous hurdles for analysts wishing to develop customized climate risk metrics or perform site-specific statistical analysis. Four Twenty Seven, a climate risk consultancy, has implemented a Python-based distributed framework to analyze large climate datasets in the cloud. With the freedom afforded by efficiently processing these datasets, we are able to customize and continually develop new climate risk metrics using the most up-to-date data. Here we outline our process for using Python packages such as XArray and Dask to evaluate netCDF files in a distributed framework, StarCluster to operate in a cluster-computing environment, cloud computing services to access publicly hosted datasets, and how this setup is particularly valuable for generating climate change indicators and performing localized statistical analysis.
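The workflow described above is a map-over-chunks pattern: split a large gridded dataset, compute an indicator per chunk in parallel, and reassemble. A minimal sketch of that pattern with the standard library and NumPy (the real stack uses XArray and Dask on netCDF in the cloud; the "hot days" indicator and grid here are hypothetical):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

rng = np.random.default_rng(4)

# Hypothetical daily maximum temperatures (deg C) for one year on a small grid;
# in the real pipeline these would be Dask-backed XArray chunks read from netCDF.
tmax = rng.normal(25.0, 8.0, (365, 60, 60))    # (day, lat, lon)

def hot_days(chunk, threshold=35.0):
    """Climate indicator: number of days per year above a temperature threshold."""
    return (chunk > threshold).sum(axis=0)

# Split along latitude and process the chunks in parallel, then reassemble --
# the same map-over-chunks scheme that Dask automates over a cluster.
chunks = np.array_split(tmax, 4, axis=1)
with ThreadPoolExecutor(max_workers=4) as pool:
    parts = list(pool.map(hot_days, chunks))
indicator = np.concatenate(parts, axis=0)

print("indicator grid:", indicator.shape, " max hot days:", int(indicator.max()))
```

Because the indicator is computed independently per grid cell, the chunked result is identical to computing it on the whole array at once, which is what makes the distributed version safe.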
NASA Astrophysics Data System (ADS)
Lin, S. J.
2015-12-01
The NOAA/Geophysical Fluid Dynamics Laboratory has been developing a unified regional-global modeling system with variable resolution capabilities that can be used for severe weather predictions (e.g., tornado outbreak events and cat-5 hurricanes) and ultra-high-resolution (1-km) regional climate simulations within a consistent global modeling framework. The fundation of this flexible regional-global modeling system is the non-hydrostatic extension of the vertically Lagrangian dynamical core (Lin 2004, Monthly Weather Review) known in the community as FV3 (finite-volume on the cubed-sphere). Because of its flexability and computational efficiency, the FV3 is one of the final candidates of NOAA's Next Generation Global Prediction System (NGGPS). We have built into the modeling system a stretched (single) grid capability, a two-way (regional-global) multiple nested grid capability, and the combination of the stretched and two-way nests, so as to make convection-resolving regional climate simulation within a consistent global modeling system feasible using today's High Performance Computing System. One of our main scientific goals is to enable simulations of high impact weather phenomena (such as tornadoes, thunderstorms, category-5 hurricanes) within an IPCC-class climate modeling system previously regarded as impossible. In this presentation I will demonstrate that it is computationally feasible to simulate not only super-cell thunderstorms, but also the subsequent genesis of tornadoes using a global model that was originally designed for century long climate simulations. As a unified weather-climate modeling system, we evaluated the performance of the model with horizontal resolution ranging from 1 km to as low as 200 km. 
In particular, for downscaling studies, we have developed various tests to ensure that the large-scale circulation within the global variable-resolution system is well simulated while, at the same time, small-scale features are accurately captured within the targeted high-resolution region.
NASA Astrophysics Data System (ADS)
Leutwyler, D.; Fuhrer, O.; Ban, N.; Lapillonne, X.; Lüthi, D.; Schar, C.
2017-12-01
The representation of moist convection in climate models is a major challenge due to the small scales involved. Regional climate simulations using horizontal resolutions of O(1 km) allow deep convection to be explicitly resolved, leading to an improved representation of the water cycle. However, due to their extremely demanding computational requirements, such simulations have so far been limited to short periods and/or small computational domains. A new version of the Consortium for Small-Scale Modeling weather and climate model (COSMO) is capable of exploiting new supercomputer architectures employing GPU accelerators, and allows convection-resolving climate simulations on computational domains spanning continents and time periods of up to one decade. We present results from a decade-long, convection-resolving climate simulation on a European-scale computational domain. The simulation has a grid spacing of 2.2 km and 1536x1536x60 grid points, covers the period 1999-2008, and is driven by the ERA-Interim reanalysis. Specifically, we present an evaluation of hourly rainfall using a wide range of data sets, including several rain-gauge networks and a remotely sensed lightning data set. Substantial improvements are found in terms of the diurnal cycles of precipitation amount, wet-hour frequency, and the all-hour 99th percentile. However, the results also reveal substantial differences between regions with and without strong orographic forcing. Furthermore, we present an index for deep-convective activity based on the statistics of vertical motion. Comparison of the index with lightning data shows that the convection-resolving climate simulations are able to reproduce important features of the annual cycle of deep convection in Europe. Leutwyler, D., D. Lüthi, N. Ban, O. Fuhrer, and C. Schär (2017): Evaluation of the Convection-Resolving Climate Modeling Approach on Continental Scales, J. Geophys. Res. Atmos., 122, doi:10.1002/2016JD026013.
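Two of the evaluation metrics named above, wet-hour frequency and the all-hour 99th percentile, are straightforward to compute from an hourly rainfall series. A minimal sketch on synthetic data (the 0.1 mm wet-hour threshold is a common convention assumed here, not necessarily the paper's exact definition):

```python
import numpy as np

def rainfall_metrics(hourly_mm, wet_threshold=0.1):
    """Return (wet-hour frequency, all-hour 99th percentile) for an hourly series."""
    hourly_mm = np.asarray(hourly_mm, dtype=float)
    wet_freq = float(np.mean(hourly_mm >= wet_threshold))  # fraction of wet hours
    p99 = float(np.percentile(hourly_mm, 99.0))            # includes dry hours
    return wet_freq, p99

# One year of synthetic hourly rain: ~8% of hours wet, exponential intensities.
rng = np.random.default_rng(0)
rain = np.where(rng.random(8760) < 0.08, rng.exponential(1.5, 8760), 0.0)
freq, p99 = rainfall_metrics(rain)
```

Computing the percentile over all hours (rather than wet hours only) matters: it rewards models for getting both intensity and intermittency right.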
A multiscale climate emulator for long-term morphodynamics (MUSCLE-morpho)
NASA Astrophysics Data System (ADS)
Antolínez, José Antonio A.; Méndez, Fernando J.; Camus, Paula; Vitousek, Sean; González, E. Mauricio; Ruggiero, Peter; Barnard, Patrick
2016-01-01
Interest in understanding long-term coastal morphodynamics has recently increased as climate change impacts become perceptible and accelerated. Multiscale, behavior-oriented and process-based models, or hybrids of the two, are typically applied with deterministic approaches which require considerable computational effort. In order to reduce the computational cost of modeling large spatial and temporal scales, input reduction and morphological acceleration techniques have been developed. Here we introduce a general framework for reducing dimensionality of wave-driver inputs to morphodynamic models. The proposed framework seeks to account for dependencies with global atmospheric circulation fields and deals simultaneously with seasonality, interannual variability, long-term trends, and autocorrelation of wave height, wave period, and wave direction. The model is also able to reproduce future wave climate time series accounting for possible changes in the global climate system. An application of long-term shoreline evolution is presented by comparing the performance of the real and the simulated wave climate using a one-line model. This article was corrected on 2 FEB 2016. See the end of the full text for details.
Biomes computed from simulated climatologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Claussen, M.; Esch, M.
1994-01-01
The biome model of Prentice et al. is used to predict global patterns of potential natural plant formations, or biomes, from climatologies simulated by ECHAM, a model used for climate simulations at the Max-Planck-Institut fuer Meteorologie. This study was undertaken to show the advantage of this biome model in diagnosing the performance of a climate model and in assessing the effects of past and future climate changes predicted by a climate model. Good overall agreement is found between global patterns of biomes computed from observed and simulated data for the present climate, but there are also major discrepancies, indicated by differences in biomes in Australia, in the Kalahari Desert, and in the Middle West of North America. These discrepancies can be traced back to errors in simulated rainfall as well as in summer or winter temperatures. Global patterns of biomes computed from an ice-age simulation reveal that North America, Europe, and Siberia should have been covered largely by tundra and taiga, whereas only small differences are found for the tropical rain forests. A potential northeast shift of biomes is expected from a simulation with enhanced CO2 concentration according to the IPCC Scenario A; little change is seen in the tropical rain forest and the Sahara. Since the biome model used is not capable of predicting changes in vegetation patterns due to a rapid climate change, the latter simulation is to be taken as a prediction of changes in conditions favourable for the existence of certain biomes, not as a prediction of the future distribution of biomes. 15 refs., 8 figs., 2 tabs.
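The idea of diagnosing a climate model through biomes can be caricatured with a rule-based classifier on climatological inputs. The thresholds below are invented for illustration only; the actual Prentice et al. model uses bioclimatic variables such as growing degree-days and a moisture index rather than raw annual means.

```python
def classify_biome(mean_temp_c, annual_precip_mm):
    """Toy biome assignment from annual-mean temperature and precipitation.
    Thresholds are illustrative, not those of Prentice et al."""
    if annual_precip_mm < 250:
        return "desert"
    if mean_temp_c < -5:
        return "tundra"
    if mean_temp_c < 3:
        return "taiga"
    if mean_temp_c > 20 and annual_precip_mm > 1500:
        return "tropical rain forest"
    return "temperate forest/grassland"
```

Applying such a classifier to both observed and simulated climatologies, grid cell by grid cell, and differencing the two biome maps is exactly the kind of diagnostic the study performs: a biome mismatch localizes where the climate model's rainfall or temperature fields go wrong.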
Investigation of models for large-scale meteorological prediction experiments
NASA Technical Reports Server (NTRS)
Spar, J.
1981-01-01
An attempt is made to compute the contributions of various surface boundary conditions to the monthly mean states generated by the 7-layer, 8 x 10 GISS climate model (Hansen et al., 1980), and also to examine the influence of initial conditions on the model climate simulations. Obvious climatic controls, such as the shape and rotation of the Earth, the solar radiation, and the dry composition of the atmosphere, are fixed, and only the surface boundary conditions are altered in the various climate simulations.
SciSpark: Highly Interactive and Scalable Model Evaluation and Climate Metrics
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Mattmann, C. A.; Waliser, D. E.; Kim, J.; Loikith, P.; Lee, H.; McGibbney, L. J.; Whitehall, K. D.
2014-12-01
Remote sensing data and climate model output are multi-dimensional arrays of massive size locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF), making it difficult to perform multi-stage, iterative science processing since each stage requires writing and reading data to and from disk. We are developing a lightning-fast Big Data technology called SciSpark based on Apache Spark. Spark implements the map-reduce paradigm for parallel computing on a cluster, but emphasizes in-memory computation, "spilling" to disk only as needed; it thus outperforms the disk-based Apache Hadoop by 100x in memory and by 10x on disk, and makes iterative algorithms feasible. SciSpark will enable scalable model evaluation by executing large-scale comparisons of A-Train satellite observations to model grids on a cluster of 100 to 1000 compute nodes. This second-generation capability for NASA's Regional Climate Model Evaluation System (RCMES) will compute simple climate metrics at interactive speeds, and extend to quite sophisticated iterative algorithms such as machine-learning (ML) based clustering of temperature PDFs, and even graph-based algorithms for searching for Mesoscale Convective Complexes. The goals of SciSpark are to: (1) decrease the time to compute comparison statistics and plots from minutes to seconds; (2) allow for interactive exploration of time-series properties over seasons and years; (3) decrease the time for satellite data ingestion into RCMES to hours; (4) allow for Level-2 comparisons with higher-order statistics or PDFs in minutes to hours; and (5) move RCMES into a near-real-time decision-making platform.
We will report on: the architecture and design of SciSpark, our efforts to integrate climate science algorithms in Python and Scala, parallel ingest and partitioning (sharding) of A-Train satellite observations from HDF files and model grids from netCDF files, first parallel runs to compute comparison statistics and PDFs, and first metrics quantifying parallel speedups and memory and disk usage.
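The comparison statistics SciSpark targets are associative reductions, which is what makes them map-reduce friendly: each partition of (observation, model) pairs yields a partial (count, sum-of-differences, sum-of-squared-differences) triple, and triples merge in any order. A pure-Python sketch of the pattern (in Spark itself this would be roughly `rdd.map(partial_stats).reduce(merge)`; the data here are invented):

```python
from functools import reduce

def partial_stats(pairs):
    """(count, sum of diffs, sum of squared diffs) for one data partition."""
    n = s = ss = 0
    for obs, model in pairs:
        d = model - obs
        n += 1
        s += d
        ss += d * d
    return (n, s, ss)

def merge(a, b):
    """Associative merge of two partial triples; order does not matter."""
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

# Three "partitions" of (observation, model) pairs, as a cluster would hold them.
partitions = [[(1.0, 1.5), (2.0, 1.8)], [(3.0, 3.1)], [(0.5, 0.4)]]
n, s, ss = reduce(merge, map(partial_stats, partitions))
bias = s / n                 # mean model-minus-obs difference
rmse = (ss / n) ** 0.5       # root-mean-square error
```

Because only three numbers travel between nodes per partition, the reduction scales to arbitrarily large satellite/model comparisons.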
Progress in Earth System Modeling since the ENIAC Calculation
NASA Astrophysics Data System (ADS)
Fung, I.
2009-05-01
The success of the first numerical weather prediction experiment on the ENIAC computer in 1950 hinged on the expansion of the meteorological observing network, which led to theoretical advances in atmospheric dynamics and subsequently to the implementation of the simplified equations on the computer. This paper briefly reviews the progress in Earth System Modeling and climate observations, and suggests a strategy to sustain and expand the observations needed to advance climate science and prediction.
NASA Astrophysics Data System (ADS)
Markauskaite, Lina; Kelly, Nick; Jacobson, Michael J.
2017-12-01
This paper gives a grounded cognition account of model-based learning of complex scientific knowledge related to socio-scientific issues, such as climate change. It draws on the results from a study of high school students learning about the carbon cycle through computational agent-based models and investigates two questions: First, how do students ground their understanding about the phenomenon when they learn and solve problems with computer models? Second, what are common sources of mistakes in students' reasoning with computer models? Results show that students ground their understanding in computer models in five ways: direct observation, straight abstraction, generalisation, conceptualisation, and extension. Students also incorporate into their reasoning their knowledge and experiences that extend beyond phenomena represented in the models, such as attitudes about unsustainable carbon emission rates, human agency, external events, and the nature of computational models. The most common difficulties of the students relate to seeing the modelled scientific phenomenon and connecting results from the observations with other experiences and understandings about the phenomenon in the outside world. An important contribution of this study is the constructed coding scheme for establishing different ways of grounding, which helps to understand some challenges that students encounter when they learn about complex phenomena with agent-based computer models.
High performance computing (HPC) requirements for the new generation of variable grid resolution (VGR) global climate models differ from those of traditional global models. A VGR global model with 15 km grids over the CONUS, stretching to 60 km grids elsewhere, will have ~2.5 tim...
A personal perspective on modelling the climate system.
Palmer, T N
2016-04-01
Given their increasing relevance for society, I suggest that the climate science community itself does not treat the development of error-free ab initio models of the climate system with sufficient urgency. With increasing levels of difficulty, I discuss a number of proposals for speeding up such development. Firstly, I believe that climate science should make better use of the pool of post-PhD talent in mathematics and physics, for developing next-generation climate models. Secondly, I believe there is more scope for the development of modelling systems which link weather and climate prediction more seamlessly. Finally, here in Europe, I call for a new European Programme on Extreme Computing and Climate to advance our ability to simulate climate extremes, and understand the drivers of such extremes. A key goal for such a programme is the development of a 1 km global climate system model to run on the first exascale supercomputers in the early 2020s.
NASA Technical Reports Server (NTRS)
Shen, Bo-Wen; Tao, Wei-Kuo; Chern, Jiun-Dar
2007-01-01
Improving our understanding of hurricane inter-annual variability and of the impact of climate change (e.g., doubling CO2 and/or global warming) on hurricanes brings both scientific and computational challenges to researchers. As hurricane dynamics involves multiscale interactions among synoptic-scale flows, mesoscale vortices, and small-scale cloud motions, an ideal numerical model suitable for hurricane studies should demonstrate its capability to simulate these interactions. The newly developed multiscale modeling framework (MMF, Tao et al., 2007) and the substantial computing power provided by the NASA Columbia supercomputer show promise for pursuing the related studies, as the MMF inherits the advantages of two NASA state-of-the-art modeling components: the GEOS4/fvGCM and 2D GCEs. This article focuses on the computational issues and proposes a revised methodology to improve the MMF's performance and scalability. It is shown that this prototype implementation enables 12-fold performance improvements with 364 CPUs, thereby making it more feasible to study hurricane climate.
Statistical surrogate models for prediction of high-consequence climate change.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Constantine, Paul; Field, Richard V., Jr.; Boslough, Mark Bruce Elrick
2011-09-01
In safety engineering, performance metrics are defined using probabilistic risk assessments focused on the low-probability, high-consequence tail of the distribution of possible events, as opposed to best estimates based on central tendencies. We frame the climate change problem and its associated risks in a similar manner. Properly exploring the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We therefore propose the use of specialized statistical surrogate models (SSMs) for the purpose of exploring the probability law of various climate variables of interest. An SSM differs from a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field. The SSM can be calibrated to available spatial and temporal data from existing climate databases, e.g., the Program for Climate Model Diagnosis and Intercomparison (PCMDI), or to a collection of outputs from a General Circulation Model (GCM), e.g., the Community Earth System Model (CESM) and its predecessors. Because of its reduced size and complexity, realizing a large number of independent model outputs from an SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework is developed to provide quantitative measures of confidence, via Bayesian credible intervals, in the use of the proposed approach to assess these risks.
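The cheap-sampling idea behind an SSM can be illustrated with the simplest space/time random field: a Gaussian field with an assumed exponential covariance on a 1-D grid. All parameters below are invented for illustration (a real SSM would calibrate mean and covariance to climate-database output), but the point survives: once the field is specified, thousands of realizations cost almost nothing, so tail probabilities become estimable.

```python
import numpy as np

# A climate variable on a 1-D spatial grid, modeled as a Gaussian random
# field with exponential covariance. mu, sigma, corr_len are assumptions
# standing in for values calibrated to GCM output or observations.
x = np.linspace(0.0, 10.0, 50)
mu, sigma, corr_len = 15.0, 1.5, 2.0
cov = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

rng = np.random.default_rng(42)
draws = rng.multivariate_normal(np.full(50, mu), cov, size=5000)

# Low-probability, high-consequence event: field maximum exceeds mu + 3*sigma.
# Estimating this by direct GCM sampling would need thousands of runs.
p_exceed = float(np.mean(draws.max(axis=1) > mu + 3 * sigma))
```

The spatial correlation matters: with corr_len > 0 the 50 grid points are far from independent, so the exceedance probability of the maximum is much smaller than 50 independent trials would suggest.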
ERIC Educational Resources Information Center
Carey, Cayelan C.; Gougis, Rebekka Darner
2017-01-01
Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing)…
Mesoscale Climate Evaluation Using Grid Computing
NASA Astrophysics Data System (ADS)
Campos Velho, H. F.; Freitas, S. R.; Souto, R. P.; Charao, A. S.; Ferraz, S.; Roberti, D. R.; Streck, N.; Navaux, P. O.; Maillard, N.; Collischonn, W.; Diniz, G.; Radin, B.
2012-04-01
The CLIMARS project aims to establish an operational environment for seasonal climate prediction for the state of Rio Grande do Sul, Brazil. The dynamical downscaling will be performed using several software platforms and hardware infrastructures to investigate the mesoscale impacts of global change. Grid computing takes advantage of geographically distributed computer systems, connected through the internet, to enhance computational power. Ensemble climate prediction is an appropriate application for grid computing, because the integration of each ensemble member does not depend on information from the other ensemble members. Grid processing is employed to compute the 20-year climatology and the long-range simulations under the ensemble methodology. BRAMS (Brazilian Regional Atmospheric Model) is a mesoscale model developed from a version of RAMS (from Colorado State University - CSU, USA). The BRAMS model is the tool for carrying out the dynamical downscaling from the IPCC scenarios. Long-range BRAMS simulations will provide data for climate analysis and supply input for the numerical integration of several models: (a) regime of extreme events for temperature and precipitation fields: statistical analysis will be applied to the BRAMS output; (b) CCATT-BRAMS (Coupled Chemistry Aerosol Tracer Transport - BRAMS), an environmental prediction system, will be used to evaluate whether the new patterns of temperature, rainfall regime, and wind field have a significant impact on pollutant dispersion in the analyzed regions; (c) MGB-IPH (Portuguese acronym for the Large Basin Model, developed by the Hydraulic Research Institute (IPH) of the Federal University of Rio Grande do Sul (UFRGS), Brazil) will be employed to simulate the alteration of river flows under new climate patterns.
The most important meteorological input variables for MGB-IPH are precipitation (the most relevant), temperature, and wind field, all provided by BRAMS. The Uruguay river basin will be analyzed within the scope of this proposal; (d) INFOCROP: this crop model has been calibrated for Southern Brazil, and three agricultural crops will be analyzed: rice, soybean, and corn.
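The property that makes ensemble prediction grid-friendly, that members integrate independently, means the workload is an embarrassingly parallel map. A toy sketch of the dispatch pattern (the "simulation" is a stand-in function, not BRAMS; a grid middleware would replace the local executor):

```python
from concurrent.futures import ThreadPoolExecutor
import random

def run_member(seed):
    """Stand-in for one ensemble-member integration: each member differs only
    in its perturbation (here, a seeded random offset to a base state)."""
    rng = random.Random(seed)
    return 288.0 + rng.gauss(0.0, 0.5)   # toy "mean temperature" in Kelvin

# Members share no state, so they can be farmed out to any pool of workers,
# local threads here, geographically scattered grid nodes in CLIMARS.
with ThreadPoolExecutor(max_workers=4) as pool:
    members = list(pool.map(run_member, range(8)))

ensemble_mean = sum(members) / len(members)
```

Because there is no inter-member communication, adding nodes scales throughput almost linearly, which is exactly the regime where loosely coupled internet-connected machines are competitive with a dedicated cluster.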
Adapting wheat to uncertain future
NASA Astrophysics Data System (ADS)
Semenov, Mikhail; Stratonovitch, Pierre
2015-04-01
This study describes the integration of climate change projections from the Coupled Model Intercomparison Project Phase 5 (CMIP5) multi-model ensemble with the LARS-WG weather generator, which delivers an attractive option for downscaling large-scale climate projections from global climate models (GCMs) to local-scale climate scenarios for impact assessments. A subset of 18 GCMs from the CMIP5 ensemble and 2 RCPs, RCP4.5 and RCP8.5, were integrated with LARS-WG. Climate sensitivity indexes for temperature and precipitation were computed for all GCMs and for 21 regions of the world. For computationally demanding impact assessments, where it is not practical to explore all possible combinations of GCM × RCP, climate sensitivity indexes can be used to select a subset of GCMs from CMIP5 with contrasting climate sensitivity. This allows the uncertainty in impacts resulting from the CMIP5 ensemble to be quantified with fewer simulation experiments. As an example, an in silico design of a wheat ideotype optimised for future climate scenarios in Europe is described. Two contrasting GCMs were selected for the analysis, the "hot" HadGEM2-ES and the "cool" GISS-E2-R-CC, along with 2 RCPs. Despite large uncertainty in climate projections, several wheat traits were identified as beneficial for the high-yielding wheat ideotypes that could be used as targets for wheat improvement by breeders.
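The "select contrasting GCMs by sensitivity index" step can be sketched with a deliberately simplified index, mean regional warming between a baseline and a future period. The GCM names are real but the numbers are invented, and LARS-WG's actual indexes (defined per region for temperature and precipitation) are richer than this:

```python
def sensitivity_index(baseline_temps, future_temps):
    """Toy climate sensitivity index: mean warming (future minus baseline, degC).
    A stand-in for the per-region LARS-WG indexes described in the study."""
    return sum(f - b for b, f in zip(baseline_temps, future_temps)) / len(baseline_temps)

# Illustrative regional temperature samples per GCM (numbers invented).
gcms = {
    "HadGEM2-ES":   ([14.1, 15.0, 13.8], [18.0, 18.9, 17.5]),
    "GISS-E2-R-CC": ([14.1, 15.0, 13.8], [15.2, 16.0, 14.9]),
    "MIROC5":       ([14.1, 15.0, 13.8], [16.6, 17.3, 16.2]),
}

# Rank by sensitivity and keep the two extremes for the impact experiments.
ranked = sorted(gcms, key=lambda g: sensitivity_index(*gcms[g]))
coolest, hottest = ranked[0], ranked[-1]
```

Running the impact model only for the coolest and hottest members brackets the ensemble's spread at a fraction of the cost of the full GCM × RCP matrix.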
Knowledge Discovery from Climate Data using Graph-Based Methods
NASA Astrophysics Data System (ADS)
Steinhaeuser, K.
2012-04-01
Climate and Earth sciences have recently experienced a rapid transformation from a historically data-poor to a data-rich environment, thus bringing them into the realm of the Fourth Paradigm of scientific discovery - a term coined by the late Jim Gray (Hey et al. 2009), the other three paradigms being theory, experimentation, and computer simulation. In particular, climate-related observations from remote sensors on satellites and weather radars, in situ sensors and sensor networks, as well as outputs of climate or Earth system models from large-scale simulations, provide terabytes of spatio-temporal data. These massive and information-rich datasets offer a significant opportunity for advancing climate science and our understanding of the global climate system, yet current analysis techniques are not able to fully realize their potential benefits. We describe a class of computational approaches, specifically from the data mining and machine learning domains, which may be novel to the climate science domain and can assist in the analysis process. Computer scientists have been developing spatial and spatio-temporal analysis techniques for a number of years now, and many of them may be applicable and/or adaptable to problems in climate science. We describe a large-scale, NSF-funded project aimed at addressing climate science questions using computational analysis methods; team members include computer scientists, statisticians, and climate scientists from various backgrounds. One of the major thrusts is the development of graph-based methods, and several illustrative examples of recent work in this area will be presented.
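A standard graph-based construction in this literature is the climate "correlation network": grid points become nodes, and an edge links any pair whose time series correlate above a threshold, so that hubs reveal teleconnection-like structure. A minimal sketch on synthetic data (the threshold and data are illustrative, not from the project described above):

```python
import numpy as np

# Ten "grid point" time series: the first six share a common signal
# (a stand-in for a teleconnected region), the last four are independent noise.
rng = np.random.default_rng(1)
base = rng.standard_normal(200)
series = np.stack(
    [base + 0.5 * rng.standard_normal(200) for _ in range(6)]
    + [rng.standard_normal(200) for _ in range(4)]
)

corr = np.corrcoef(series)                           # 10 x 10 correlation matrix
adj = (np.abs(corr) > 0.6) & ~np.eye(10, dtype=bool) # threshold -> adjacency
degree = adj.sum(axis=1)                             # node connectivity
```

In a real analysis the adjacency matrix would feed graph algorithms (community detection, centrality) to locate climatologically coherent regions; here the six correlated nodes form a visible clique while the noise nodes stay isolated.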
Computational data sciences for assessment and prediction of climate extremes
NASA Astrophysics Data System (ADS)
Ganguly, A. R.
2011-12-01
Climate extremes may be defined inclusively as severe weather events or large shifts in global or regional weather patterns which may be caused or exacerbated by natural climate variability or climate change. This area of research arguably represents one of the largest knowledge gaps in climate science relevant for informing resource managers and policy makers. While physics-based climate models are essential in view of non-stationary and nonlinear dynamical processes, their current pace of uncertainty reduction may not be adequate for urgent stakeholder needs. The structure of the models may in some cases preclude reduction of uncertainty for critical processes at the scales or for the extremes of interest. On the other hand, methods based on complex networks, extreme value statistics, machine learning, and space-time data mining have demonstrated significant promise to improve scientific understanding and generate enhanced predictions. When combined with conceptual process understanding at multiple spatiotemporal scales and designed to handle massive data, interdisciplinary data science methods and algorithms may complement or supplement physics-based models. Specific examples from the prior literature and our ongoing work suggest how data-guided improvements may be possible, for example, in the context of ocean meteorology, climate oscillators, teleconnections, and atmospheric process understanding, which in turn can improve projections of regional climate, precipitation extremes, and tropical cyclones in a useful and interpretable fashion. A community-wide effort is motivated to develop and adapt computational data science tools for translating climate model simulations into information relevant for adaptation and policy, as well as for improving our scientific understanding of climate extremes from both observed and model-simulated data.
Data management and analysis for the Earth System Grid
NASA Astrophysics Data System (ADS)
Williams, D. N.; Ananthakrishnan, R.; Bernholdt, D. E.; Bharathi, S.; Brown, D.; Chen, M.; Chervenak, A. L.; Cinquini, L.; Drach, R.; Foster, I. T.; Fox, P.; Hankin, S.; Henson, V. E.; Jones, P.; Middleton, D. E.; Schwidder, J.; Schweitzer, R.; Schuler, R.; Shoshani, A.; Siebenlist, F.; Sim, A.; Strand, W. G.; Wilhelmi, N.; Su, M.
2008-07-01
The international climate community is expected to generate hundreds of petabytes of simulation data within the next five to seven years. This data must be accessed and analyzed by thousands of analysts worldwide in order to provide accurate and timely estimates of the likely impact of climate change on physical, biological, and human systems. Climate change is thus not only a scientific challenge of the first order but also a major technological challenge. In order to address this technological challenge, the Earth System Grid Center for Enabling Technologies (ESG-CET) has been established within the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC)-2 program, with support from the offices of Advanced Scientific Computing Research and Biological and Environmental Research. ESG-CET's mission is to provide climate researchers worldwide with access to the data, information, models, analysis tools, and computational capabilities required to make sense of enormous climate simulation datasets. Its specific goals are to (1) make data more useful to climate researchers by developing Grid technology that enhances data usability; (2) meet specific distributed database, data access, and data movement needs of national and international climate projects; (3) provide a universal and secure web-based data access portal for broad multi-model data collections; and (4) provide a wide range of Grid-enabled climate data analysis tools and diagnostic methods to international climate centers and U.S. government agencies. Building on the successes of the previous Earth System Grid (ESG) project, which has enabled thousands of researchers to access tens of terabytes of data from a small number of ESG sites, ESG-CET is working to integrate a far larger number of distributed data providers, high-bandwidth wide-area networks, and remote computers in a highly collaborative problem-solving environment.
USDA-ARS?s Scientific Manuscript database
With enhanced data availability, distributed watershed models for large areas with high spatial and temporal resolution are increasingly used to understand water budgets and examine effects of human activities and climate change/variability on water resources. Developing parallel computing software...
SciSpark: Highly Interactive and Scalable Model Evaluation and Climate Metrics
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Palamuttam, R. S.; Mogrovejo, R. M.; Whitehall, K. D.; Mattmann, C. A.; Verma, R.; Waliser, D. E.; Lee, H.
2015-12-01
Remote sensing data and climate model output are multi-dimensional arrays of massive size locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF), making it difficult to perform multi-stage, iterative science processing since each stage requires writing and reading data to and from disk. We are developing a lightning-fast Big Data technology called SciSpark based on Apache Spark under a NASA AIST grant (PI Mattmann). Spark implements the map-reduce paradigm for parallel computing on a cluster, but emphasizes in-memory computation, "spilling" to disk only as needed, and so outperforms the disk-based Apache Hadoop by 100x in memory and by 10x on disk. SciSpark will enable scalable model evaluation by executing large-scale comparisons of A-Train satellite observations to model grids on a cluster of 10 to 1000 compute nodes. This second-generation capability for NASA's Regional Climate Model Evaluation System (RCMES) will compute simple climate metrics at interactive speeds, and extend to quite sophisticated iterative algorithms such as machine-learning based clustering of temperature PDFs, and even graph-based algorithms for searching for Mesoscale Convective Complexes. We have implemented a parallel data ingest capability in which the user specifies desired variables (arrays) as several time-sorted lists of URLs (i.e., using OPeNDAP model.nc?varname, or local files). The specified variables are partitioned by time/space and then each Spark node pulls its bundle of arrays into memory to begin a computation pipeline. We also investigated the performance of several N-dimensional array libraries (scala breeze, java jblas & netlib-java, and ND4J). We are currently developing science codes using ND4J and studying memory behavior on the JVM. On the pyspark side, many of our science codes already use the numpy and SciPy ecosystems.
The talk will cover: the architecture of SciSpark, the design of the scientific RDD (sRDD) data structure, our efforts to integrate climate science algorithms in Python and Scala, parallel ingest and partitioning of A-Train satellite observations from HDF files and model grids from netCDF files, first parallel runs to compute comparison statistics and PDFs, and first metrics quantifying parallel speedups and memory and disk usage.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Shujia; Duffy, Daniel; Clune, Thomas
The call for ever-increasing model resolutions and physical processes in climate and weather models demands a continual increase in computing power. The IBM Cell processor's order-of-magnitude peak-performance increase over conventional processors makes it very attractive for fulfilling this requirement. However, the Cell's characteristics, 256 KB of local memory per SPE and a new low-level communication mechanism, make porting an application very challenging. As a trial, we selected the solar radiation component of the NASA GEOS-5 climate model, which (1) is representative of column-physics components (half the total computational time), (2) has an extremely high computational intensity, i.e., ratio of computational load to main-memory transfers, and (3) exhibits embarrassingly parallel column computations. In this paper, we converted the baseline code (single-precision Fortran) to C and ported it to an IBM BladeCenter QS20. For performance, we manually SIMDized four independent columns and included several unrolling optimizations. Our results show that, compared with the baseline implementation running on one core of Intel's Xeon Woodcrest, Dempsey, and Itanium2, the Cell is approximately 8.8x, 11.6x, and 12.8x faster, respectively. Our preliminary analysis shows that the Cell can also accelerate the dynamics component (~25% of total computational time). We believe these dramatic performance improvements make the Cell processor very competitive as an accelerator.
Report for Oregon State University Reporting Period: June 2016 to June 2017
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hutchings, Jennifer
The goal of this project is to develop an eddy-resolving ocean model (POP) with tides, coupled to a sea ice model (CICE) within the Regional Arctic System Model (RASM), to investigate the importance of ocean tides and mesoscale eddies in arctic climate simulations, quantify the biases associated with these processes, and assess how their inclusion may improve decadal to centennial arctic climate predictions. The ocean, sea ice, and coupled arctic climate response to these small-scale processes will be evaluated with regard to their influence on mass, momentum, and property exchange between oceans, shelf and basin, ice and ocean, and ocean and atmosphere. The project will facilitate the future routine inclusion of polar tides and eddies in Earth System Models when computing power allows. As such, the proposed research addresses the science in support of BER's Climate and Environmental Sciences Division Long Term Measure, as it will improve the ocean and sea ice model components as well as the fully coupled RASM and Community Earth System Model (CESM), making them more accurate and computationally efficient.
Ocean-Atmosphere Coupled Model Simulations of Precipitation in the Central Andes
NASA Technical Reports Server (NTRS)
Nicholls, Stephen D.; Mohr, Karen I.
2015-01-01
The meridional extent and complex orography of the South American continent contribute to a wide diversity of climate regimes, ranging from hyper-arid deserts to tropical rainforests to sub-polar highland regions. In addition, South American meteorology and climate are further complicated by ENSO, a powerful coupled ocean-atmosphere phenomenon. Modelling studies in this region have typically resorted to either atmospheric mesoscale models or atmosphere-ocean coupled global climate models. The former offer full physics and high spatial resolution, but are computationally inefficient and typically lack an interactive ocean, whereas the latter offer high computational efficiency and ocean-atmosphere coupling, but lack the spatial and temporal resolution needed to resolve the complex orography and explicitly simulate precipitation. Explicit simulation of precipitation is vital in the Central Andes, where rainfall rates are light (0.5-5 mm hr-1), seasonality is strong, and most precipitation is associated with weakly organized mesoscale convection. Recent increases in both computational power and model development have led to the advent of coupled ocean-atmosphere mesoscale models for both weather and climate study applications. These modelling systems, while computationally expensive, include two-way ocean-atmosphere coupling, high resolution, and explicit simulation of precipitation. In this study, we use the Coupled Ocean-Atmosphere-Wave-Sediment Transport (COAWST) model, a fully coupled mesoscale atmosphere-ocean modeling system. Previous work has shown COAWST to reasonably simulate the entire 2003-2004 wet season (Dec-Feb), as validated against both satellite and model analysis data, when ECMWF interim analysis data were used for boundary conditions on a 27-9-km grid configuration (outer grid extent: 60.4S to 17.7N and 118.6W to 17.4W).
Evaluating the utility of dynamical downscaling in agricultural impacts projections
Glotter, Michael; Elliott, Joshua; McInerney, David; Best, Neil; Foster, Ian; Moyer, Elisabeth J.
2014-01-01
Interest in estimating the potential socioeconomic costs of climate change has led to the increasing use of dynamical downscaling—nested modeling in which regional climate models (RCMs) are driven with general circulation model (GCM) output—to produce fine-spatial-scale climate projections for impacts assessments. We evaluate here whether this computationally intensive approach significantly alters projections of agricultural yield, one of the greatest concerns under climate change. Our results suggest that it does not. We simulate US maize yields under current and future CO2 concentrations with the widely used Decision Support System for Agrotechnology Transfer crop model, driven by a variety of climate inputs including two GCMs, each in turn downscaled by two RCMs. We find that no climate model output can reproduce yields driven by observed climate unless a bias correction is first applied. Once a bias correction is applied, GCM- and RCM-driven US maize yields are essentially indistinguishable in all scenarios (<10% discrepancy, equivalent to error from observations). Although RCMs correct some GCM biases related to fine-scale geographic features, errors in yield are dominated by broad-scale (100s of kilometers) GCM systematic errors that RCMs cannot compensate for. These results support previous suggestions that the benefits for impacts assessments of dynamically downscaling raw GCM output may not be sufficient to justify its computational demands. Progress on fidelity of yield projections may benefit more from continuing efforts to understand and minimize systematic error in underlying climate projections. PMID:24872455
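The bias-correction step that proved essential above can be illustrated with a minimal mean-shift sketch; the function and the convention of an additive correction for temperature versus a multiplicative one for precipitation are generic illustrations, not the specific correction procedure used in the study.

```python
import numpy as np

def bias_correct(model_hist, model_fut, obs_hist, multiplicative=False):
    """Simple mean-shift bias correction of a future climate series.

    model_hist, obs_hist: model and observed series over a common
    reference period; model_fut: raw future model series.
    multiplicative=True is typical for precipitation, False (additive)
    for temperature.
    """
    if multiplicative:
        # Scale future values by the ratio of observed to modeled means.
        factor = obs_hist.mean() / model_hist.mean()
        return model_fut * factor
    # Shift future values by the reference-period mean bias.
    offset = obs_hist.mean() - model_hist.mean()
    return model_fut + offset
```

Corrections of this kind remove the broad-scale systematic error that, per the study, dominates yield projections driven by raw GCM or RCM output.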
Prein, Andreas; Langhans, Wolfgang; Fosser, Giorgia; ...
2015-05-27
Regional climate modeling using convection-permitting models (CPMs) is emerging as a promising framework to provide more reliable climate information on regional to local scales compared to traditionally used large-scale models (LSMs). CPMs do not use convection parameterization schemes, known as a major source of errors and uncertainties, and have more accurate surface and orography fields. The drawback of CPMs is their high demand on computational resources; for this reason, CPM climate simulations only appeared a decade ago. In this study we aim to provide a common basis for CPM climate simulations by giving a holistic review of the topic. The most important components in CPMs, such as physical parameterizations and dynamical formulations, are discussed, and an outlook is given on required future developments and on computer architectures that would support the application of CPMs. Most importantly, this review presents the consolidated outcome of studies that addressed the added value of CPM climate simulations compared to LSMs. Most improvements are found for processes related to deep convection (e.g., precipitation during summer), for mountainous regions, and for soil-vegetation-atmosphere interactions. The climate change signals of CPM simulations reveal increases in short and extreme rainfall events and an increased ratio of liquid precipitation at the surface (a decrease of hail), potentially leading to more frequent flash floods. In conclusion, CPMs are a very promising tool for future climate research; however, coordinated modeling programs are crucially needed to assess their full potential and support their development.
Pinatubo global cooling on target
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerr, R.A.
1993-01-29
When Pinatubo blasted millions of tons of debris into the stratosphere in June 1991, Hansen of NASA's Goddard Institute for Space Studies used his computer climate model to predict that the shade cast by the debris would cool the globe by about half a degree C. Year-end temperature reports for 1992 are now showing that the prediction was on target, confirming the tentative belief that volcanoes can temporarily cool the climate and validating at least one component of the computer models predicting a greenhouse warming.
The Monash University Interactive Simple Climate Model
NASA Astrophysics Data System (ADS)
Dommenget, D.
2013-12-01
The Monash University Interactive Simple Climate Model is a web-based interface that allows students and the general public to explore the physical simulation of the climate system with a real global climate model. It is based on the Globally Resolved Energy Balance (GREB) model, published by Dommenget and Floeter [2011] in the peer-reviewed journal Climate Dynamics. The model simulates most of the main physical processes in the climate system in a very simplistic way and therefore allows very fast and simple climate model simulations on an ordinary PC. Despite its simplicity, the model simulates the climate response to external forcings, such as a doubling of the CO2 concentration, very realistically (similar to state-of-the-art climate models). The Monash simple climate model web interface allows you to study the results of more than 2000 different model experiments in an interactive way, to work through a number of tutorials on the interactions of physical processes in the climate system, and to solve some puzzles. By switching physical processes off and on, you can deconstruct the climate and learn how the different processes interact to generate the observed climate, and how they interact to generate the IPCC-predicted climate change for an anthropogenic CO2 increase. The presentation will illustrate how this web-based tool works and what the possibilities for teaching students with it are.
An introduction to three-dimensional climate modeling
NASA Technical Reports Server (NTRS)
Washington, W. M.; Parkinson, C. L.
1986-01-01
The development and use of three-dimensional computer models of the earth's climate are discussed. The processes and interactions of the atmosphere, oceans, and sea ice are examined. The basic theory of climate simulation which includes the fundamental equations, models, and numerical techniques for simulating the atmosphere, oceans, and sea ice is described. Simulated wind, temperature, precipitation, ocean current, and sea ice distribution data are presented and compared to observational data. The responses of the climate to various environmental changes, such as variations in solar output or increases in atmospheric carbon dioxide, are modeled. Future developments in climate modeling are considered. Information is also provided on the derivation of the energy equation, the finite difference barotropic forecast model, the spectral transform technique, and the finite difference shallow water wave equation model.
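As a toy illustration of the kind of finite difference shallow water model mentioned above, a single Lax-Friedrichs time step of the linearized one-dimensional equations on a periodic grid can be written as follows; the scheme choice, grid, and parameter values are illustrative assumptions, not the book's actual formulation.

```python
import numpy as np

def step_shallow_water(h, u, dt, dx, H=10.0, g=9.81):
    """One Lax-Friedrichs step of the linearized 1-D shallow water
    equations on a periodic grid: dh/dt = -H du/dx, du/dt = -g dh/dx.

    h: surface height anomaly; u: velocity; H: mean depth.
    The scheme is stable for sqrt(g*H)*dt/dx <= 1 (CFL condition).
    """
    hp, hm = np.roll(h, -1), np.roll(h, 1)   # h[i+1], h[i-1] (periodic)
    up, um = np.roll(u, -1), np.roll(u, 1)
    h_new = 0.5 * (hp + hm) - dt / (2 * dx) * H * (up - um)
    u_new = 0.5 * (up + um) - dt / (2 * dx) * g * (hp - hm)
    return h_new, u_new
```

On a periodic grid the centered differences sum to zero, so total mass (the sum of h) is conserved by each step, one of the basic checks applied to such models.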
NASA Astrophysics Data System (ADS)
Gordov, Evgeny; Lykosov, Vasily; Krupchatnikov, Vladimir; Okladnikov, Igor; Titov, Alexander; Shulgina, Tamara
2013-04-01
Analysis of the growing volume of climate change-related data from sensors and model outputs requires collaborative multidisciplinary efforts of researchers. To do this in a timely and reliable way, one needs a modern information-computational infrastructure supporting integrated studies in the field of environmental sciences. The recently developed experimental software and hardware platform Climate (http://climate.scert.ru/) provides the required environment for regional climate change investigations. The platform combines a modern Web 2.0 approach, GIS functionality, and capabilities to run climate and meteorological models, process large geophysical datasets, and support relevant analysis. It also supports joint software development by distributed research groups and the organization of thematic education for students and post-graduate students. In particular, the platform software includes dedicated modules for numerical processing of regional and global modeling results for subsequent analysis and visualization. The platform also supports running the integrated WRF and Planet Simulator models, as well as preprocessing and visualization of modeling results. All functions of the platform are accessible to a user through a web portal using a common graphical web browser, in the form of an interactive graphical user interface that provides, in particular, capabilities for selecting a geographical region of interest (pan and zoom), manipulating data layers (order, enable/disable, feature extraction), and visualizing results. The platform provides users with capabilities for heterogeneous geophysical data analysis, including high-resolution data, and for discovering tendencies in climatic and ecosystem changes in the framework of different multidisciplinary research efforts.
Using it, even an unskilled user without specific knowledge can perform reliable computational processing and visualization of large meteorological, climatic, and satellite monitoring datasets through a unified graphical web interface. Partial support from RF Ministry of Education and Science grant 8345, SB RAS Program VIII.80.2, Projects 69, 131, and 140, and the APN CBA2012-16NSY project is acknowledged.
Development of a system emulating the global carbon cycle in Earth system models
NASA Astrophysics Data System (ADS)
Tachiiri, K.; Hargreaves, J. C.; Annan, J. D.; Oka, A.; Abe-Ouchi, A.; Kawamiya, M.
2010-08-01
Recent studies have indicated that the uncertainty in the global carbon cycle may have a significant impact on the climate. Since state-of-the-art models are too computationally expensive to allow anything approaching a comprehensive exploration of their parametric uncertainty, we have developed a simplified system for investigating this problem. By combining the strong points of general circulation models (GCMs), which contain detailed and complex processes, and Earth system models of intermediate complexity (EMICs), which are quick and capable of large ensembles, we have developed a loosely coupled model (LCM) which can represent the outputs of a GCM-based Earth system model using much smaller computational resources. We address the relatively poor representation of precipitation within our EMIC, which prevents us from directly coupling it to a vegetation model, by coupling it to a precomputed transient simulation from a full GCM. The LCM consists of three components: an EMIC (MIROC-lite), which consists of a 2-D energy balance atmosphere coupled to a low-resolution 3-D GCM ocean (COCO) including an ocean carbon cycle (an NPZD-type marine ecosystem model); a state-of-the-art vegetation model (Sim-CYCLE); and a database of daily temperature, precipitation, and other necessary climatic fields to drive Sim-CYCLE, taken from a precomputed transient simulation of a state-of-the-art AOGCM. The transient warming of the climate system is calculated by MIROC-lite, with the global temperature anomaly used to select the most appropriate annual climatic field from the precomputed AOGCM simulation, which in this case is a 1% per annum increasing CO2 concentration scenario.
By adjusting the effective climate sensitivity (equivalent to the equilibrium climate sensitivity for an energy balance model) of MIROC-lite, the transient warming of the LCM could be adjusted to closely follow the low-sensitivity (equilibrium climate sensitivity of 4.0 K) version of MIROC3.2. By tuning the physical and biogeochemical parameters it was possible to reasonably reproduce the bulk physical and biogeochemical properties of previously published CO2 stabilisation scenarios for that model. As an example of an application of the LCM, the behavior of the high-sensitivity version of MIROC3.2 (with a 6.3 K equilibrium climate sensitivity) is also demonstrated. Given the highly adjustable nature of the model, we believe that the LCM should be a very useful tool for studying uncertainty in global climate change, and we have named the model JUMP-LCM, after the name of our research group (Japan Uncertainty Modelling Project).
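The role of the effective climate sensitivity in setting the transient warming can be illustrated with a zero-dimensional energy balance model stepped forward under the 1% per year CO2 increase scenario; this is a drastic simplification of MIROC-lite, and all parameter values below are illustrative.

```python
import math

def transient_warming(sensitivity_K, years=70, heat_capacity=8.36e8):
    """Zero-dimensional energy balance model under a 1%/yr CO2 increase.

    sensitivity_K: equilibrium warming for doubled CO2 (K).
    heat_capacity: effective ocean heat capacity (J m-2 K-1);
    the default is an illustrative value (~200 m water equivalent).
    Returns the yearly global-mean temperature anomaly series (K).
    """
    f2x = 3.7                      # W m-2 forcing per CO2 doubling
    lam = f2x / sensitivity_K      # feedback parameter (W m-2 K-1)
    dt = 365.25 * 86400.0          # one year in seconds
    T, series = 0.0, []
    for yr in range(1, years + 1):
        co2_ratio = 1.01 ** yr     # 1% per year compound increase
        forcing = f2x * math.log2(co2_ratio)
        T += dt * (forcing - lam * T) / heat_capacity
        series.append(T)
    return series
```

In the LCM, the analogous anomaly is then used as a lookup key: for each simulated year, the precomputed AOGCM annual field whose global-mean anomaly is closest to the current value is selected to drive the vegetation model.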
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryan, Frank; Dennis, John; MacCready, Parker
This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation.
Network-based approaches to climate knowledge discovery
NASA Astrophysics Data System (ADS)
Budich, Reinhard; Nyberg, Per; Weigel, Tobias
2011-11-01
Climate Knowledge Discovery Workshop; Hamburg, Germany, 30 March to 1 April 2011 Do complex networks combined with semantic Web technologies offer the next generation of solutions in climate science? To address this question, a first Climate Knowledge Discovery (CKD) Workshop, hosted by the German Climate Computing Center (Deutsches Klimarechenzentrum (DKRZ)), brought together climate and computer scientists from major American and European laboratories, data centers, and universities, as well as representatives from industry, the broader academic community, and the semantic Web communities. The participants, representing six countries, were concerned with large-scale Earth system modeling and computational data analysis. The motivation for the meeting was the growing problem that climate scientists generate data faster than it can be interpreted and the need to prepare for further exponential data increases. Current analysis approaches are focused primarily on traditional methods, which are best suited for large-scale phenomena and coarse-resolution data sets. The workshop focused on the open discussion of ideas and technologies to provide the next generation of solutions to cope with the increasing data volumes in climate science.
A personal perspective on modelling the climate system
Palmer, T. N.
2016-01-01
Given their increasing relevance for society, I suggest that the climate science community itself does not treat the development of error-free ab initio models of the climate system with sufficient urgency. With increasing levels of difficulty, I discuss a number of proposals for speeding up such development. Firstly, I believe that climate science should make better use of the pool of post-PhD talent in mathematics and physics, for developing next-generation climate models. Secondly, I believe there is more scope for the development of modelling systems which link weather and climate prediction more seamlessly. Finally, here in Europe, I call for a new European Programme on Extreme Computing and Climate to advance our ability to simulate climate extremes, and understand the drivers of such extremes. A key goal for such a programme is the development of a 1 km global climate system model to run on the first exascale supercomputers in the early 2020s. PMID:27274686
Impacts of weighting climate models for hydro-meteorological climate change studies
NASA Astrophysics Data System (ADS)
Chen, Jie; Brissette, François P.; Lucas-Picher, Philippe; Caya, Daniel
2017-06-01
Weighting climate models is controversial in climate change impact studies that use an ensemble of climate simulations from different climate models. In climate science, there is a general consensus that all climate models should be considered as having equal performance, or in other words that all projections are equiprobable. On the other hand, in the impacts and adaptation community, many believe that climate models should be weighted based on their ability to better represent various metrics over a reference period. The debate appears to be partly philosophical in nature, as few studies have investigated the impact of using weights in projecting future climate changes. The present study focuses on the impact of assigning weights to climate models for hydrological climate change studies. Five methods are used to determine weights on an ensemble of 28 global climate models (GCMs) adapted from the Coupled Model Intercomparison Project Phase 5 (CMIP5) database. Using a hydrological model, streamflows are computed over a reference (1961-1990) and a future (2061-2090) period, with and without post-processing of climate model outputs. The impacts of using different weighting schemes for GCM simulations are then analyzed in terms of ensemble mean and uncertainty. The results show that weighting GCMs has a limited impact on both the projected future climate, in terms of precipitation and temperature changes, and hydrology, in terms of nine different streamflow criteria. These results apply to both raw and post-processed GCM outputs, thus supporting the view that climate models should be considered equiprobable.
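A minimal sketch of performance-based weighting, assuming a simple inverse-RMSE scheme; this form is illustrative and not necessarily one of the five methods used in the study.

```python
import numpy as np

def skill_weights(reference, simulations):
    """Weight each model by inverse RMSE against a reference climatology.

    reference: 1-D observed series over the reference period;
    simulations: 2-D array (n_models, n_times) of model series.
    Returns normalized weights summing to 1.
    """
    rmse = np.sqrt(((simulations - reference) ** 2).mean(axis=1))
    w = 1.0 / rmse
    return w / w.sum()

def weighted_projection(changes, weights):
    """Weighted ensemble-mean climate change signal.

    changes: per-model projected change; weights: from skill_weights.
    """
    return np.dot(weights, changes)
```

The study's finding is that replacing equal weights with schemes of this kind changes the projected ensemble mean and uncertainty very little.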
ERIC Educational Resources Information Center
Yarker, Morgan Brown
2013-01-01
Research suggests that scientific models and modeling should be topics covered in K-12 classrooms as part of a comprehensive science curriculum. It is especially important when talking about topics in weather and climate, where computer and forecast models are the center of attention. There are several approaches to model based inquiry, but it can…
NASA Astrophysics Data System (ADS)
Doroszkiewicz, J. M.; Romanowicz, R. J.
2016-12-01
The standard procedure of climate change impact assessment on future hydrological extremes consists of a chain of consecutive actions, starting from the choice of a GCM driven by an assumed CO2 scenario, through downscaling of climatic forcing to the catchment scale, estimation of hydrological extreme indices using hydrological modelling tools, and subsequent derivation of flood risk maps with the help of a hydraulic model. Among many possible sources of uncertainty, the main ones are the uncertainties related to future climate scenarios, climate models, downscaling techniques, and hydrological and hydraulic models. Unfortunately, we cannot directly assess the impact of these different sources of uncertainty on future flood risk due to the lack of observations of future climate realizations. The aim of this study is an assessment of the relative impact of different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the processes involved, an assessment of the total uncertainty of maps of inundation probability can be very computationally time consuming. As a way forward, we present an application of a hydraulic model simulator based on a nonlinear transfer function model for chosen locations along the river reach. The transfer function model parameters are estimated based on simulations of the hydraulic model at each of the model cross-sections. The study shows that the application of a simulator substantially reduces the computational requirements related to the derivation of flood risk maps under future climatic conditions. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to the hydraulic model are obtained using the HBV model and climate projections from the EURO-CORDEX project. The study describes a cascade of uncertainty related to the different stages of the process of deriving flood risk maps under changing climate conditions.
In this context it takes into account the uncertainty of future climate projections, the uncertainty of the flow routing model, the propagation of that uncertainty through the hydraulic model, and finally the uncertainty related to the derivation of flood risk maps.
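The simulator idea above can be sketched with a first-order transfer function fitted to hydraulic model output at a single cross-section; the study's simulator is nonlinear, so the linear form and least-squares fit below are simplifying assumptions for illustration.

```python
import numpy as np

def fit_transfer_function(inflow, level):
    """Fit the emulator y[t] = a*y[t-1] + b*x[t] by least squares.

    inflow: upstream discharge series driving the hydraulic model;
    level: water level simulated by the full model at one cross-section.
    Returns the coefficients (a, b).
    """
    A = np.column_stack([level[:-1], inflow[1:]])
    coef, *_ = np.linalg.lstsq(A, level[1:], rcond=None)
    return coef[0], coef[1]

def simulate(a, b, inflow, y0=0.0):
    """Run the fitted emulator in place of the full hydraulic model."""
    y = [y0]
    for x in inflow[1:]:
        y.append(a * y[-1] + b * x)
    return np.array(y)
```

Once fitted to a handful of full hydraulic-model runs, such an emulator can be evaluated thousands of times at negligible cost, which is what makes uncertainty cascades of this kind computationally tractable.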
NASA Astrophysics Data System (ADS)
Erfanian, A.; Fomenko, L.; Wang, G.
2016-12-01
The multi-model ensemble (MME) average is considered the most reliable approach for simulating both present-day and future climates. It has been a primary reference for drawing conclusions in major coordinated studies, e.g., the IPCC Assessment Reports and CORDEX. The biases of individual models cancel each other out in the MME average, enabling the ensemble mean to outperform individual members in simulating the mean climate. This enhancement, however, comes at tremendous computational cost, which is especially inhibiting for regional climate modeling, as model uncertainties can originate from both RCMs and the driving GCMs. Here we propose the Ensemble-based Reconstructed Forcings (ERF) approach to regional climate modeling, which achieves a similar level of bias reduction at a fraction of the cost of the conventional MME approach. The new method constructs a single set of initial and boundary conditions (IBCs) by averaging the IBCs of multiple GCMs, and drives the RCM with this ensemble average of IBCs in a single run. Using a regional climate model (RegCM4.3.4-CLM4.5), we tested the method over West Africa for multiple combinations of (up to six) GCMs. Our results indicate that the performance of the ERF method is comparable to that of the MME average in simulating the mean climate. The bias reduction seen in ERF simulations is achieved by using more realistic IBCs in solving the system of equations underlying the RCM physics and dynamics. This endows the new method with a theoretical advantage in addition to reducing computational cost: the ERF output is an unaltered solution of the RCM, as opposed to a climate state that might not be physically plausible due to the averaging of multiple solutions under the conventional MME approach. The ERF approach should be considered for use in major international efforts such as CORDEX. Key words: Multi-model ensemble, ensemble analysis, ERF, regional climate modeling
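The core of the ERF approach, averaging the IBC fields of several GCMs into a single driving set, reduces for each variable and boundary time to a grid-point mean once the fields share a common grid; the variable names below are illustrative, not those of the RegCM input files.

```python
import numpy as np

def reconstruct_forcings(ibcs_by_gcm):
    """Build one ERF-style set of driving fields from several GCMs.

    ibcs_by_gcm: list of dicts {variable_name: 2-D field}, one dict per
    GCM, with all fields already regridded to the RCM's common grid.
    Returns one dict of grid-point-averaged fields used to drive a
    single RCM run instead of one run per GCM.
    """
    variables = ibcs_by_gcm[0].keys()
    return {v: np.mean([g[v] for g in ibcs_by_gcm], axis=0)
            for v in variables}
```

Driving the RCM once with these averaged fields, rather than once per GCM, is what cuts the computational cost by roughly the ensemble size.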
NASA Technical Reports Server (NTRS)
Suarez, Max J. (Editor); Chou, Ming-Dah
1994-01-01
A detailed description is presented of a parameterization for thermal infrared radiative transfer designed specifically for use in global climate models. The parameterization includes the effects of the main absorbers of terrestrial radiation: water vapor, carbon dioxide, and ozone. While computationally efficient, the scheme computes the clear-sky fluxes and cooling rates very accurately from the Earth's surface to 0.01 mb. This combination of accuracy and speed makes the parameterization suitable for both tropospheric and middle atmospheric modeling applications. Since no transmittances are precomputed, the atmospheric layers and the vertical distribution of the absorbers may be freely specified. The scheme can also account for any vertical distribution of fractional cloudiness with arbitrary optical thickness. These features make the parameterization very flexible and extremely well suited for use in climate modeling studies. In addition, the numerics and the FORTRAN implementation have been carefully designed to conserve both memory and computer time. This code should be particularly attractive to those contemplating long-term climate simulations, wishing to model the middle atmosphere, or planning to use a large number of levels in the vertical.
NASA Astrophysics Data System (ADS)
Lintner, B. R.; Loikith, P. C.; Pike, M.; Aragon, C.
2017-12-01
Climate change information is increasingly required at impact-relevant scales. However, most state-of-the-art climate models are not of sufficiently high spatial resolution to resolve features explicitly at such scales. This challenge is particularly acute in regions of complex topography, such as the Pacific Northwest of the United States. To address this scale mismatch, we consider large-scale meteorological patterns (LSMPs), which can be resolved by climate models and related to the occurrence of local-scale climate and climate extremes. In prior work, using self-organizing maps (SOMs), we computed LSMPs over the northwestern United States (NWUS) from daily reanalysis circulation fields and related these to the occurrence of observed extreme temperatures and precipitation: SOMs were used to group LSMPs into 12 nodes, or clusters, spanning the continuum of synoptic variability over the region. Here this observational foundation is used as an evaluation target for a suite of global climate models from the Fifth Phase of the Coupled Model Intercomparison Project (CMIP5). Evaluation is performed in two primary ways. First, daily model circulation fields are assigned to one of the 12 reanalysis nodes based on minimization of the mean square error. From this, a bulk model skill score is computed measuring the similarity between the model and reanalysis nodes. Next, SOMs are applied directly to the model output and compared to the nodes obtained from reanalysis. Results reveal that many of the models have LSMPs analogous to the reanalysis, suggesting that the models reasonably capture observed daily synoptic states.
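The first evaluation step, assigning each daily model field to the nearest reanalysis-trained SOM node by mean square error, can be sketched as follows; the frequency-overlap score at the end is an illustrative stand-in, not the paper's exact skill metric.

```python
import numpy as np

def assign_nodes(daily_fields, som_nodes):
    """Assign each daily circulation field to its nearest SOM node.

    daily_fields: (n_days, n_grid) flattened circulation fields;
    som_nodes: (n_nodes, n_grid), e.g. the 12 reanalysis-trained nodes.
    Returns the best-matching node index for each day.
    """
    # MSE between every day and every node, minimized over nodes.
    mse = ((daily_fields[:, None, :] - som_nodes[None, :, :]) ** 2).mean(axis=2)
    return mse.argmin(axis=1)

def frequency_skill(model_idx, reana_idx, n_nodes=12):
    """Overlap of node-occurrence frequencies (1 = identical)."""
    fm = np.bincount(model_idx, minlength=n_nodes) / len(model_idx)
    fr = np.bincount(reana_idx, minlength=n_nodes) / len(reana_idx)
    return np.minimum(fm, fr).sum()
```

A model that visits the synoptic states with roughly the observed frequencies scores near 1, which is the sense in which the assessed CMIP5 models "reasonably capture observed daily synoptic states".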
Socio-economic and climate change impacts on agriculture: an integrated assessment, 1990–2080
Fischer, Günther; Shah, Mahendra; N. Tubiello, Francesco; van Velhuizen, Harrij
2005-01-01
A comprehensive assessment of the impacts of climate change on agro-ecosystems over this century is developed, up to 2080 and at a global level, albeit with significant regional detail. To this end an integrated ecological-economic modelling framework is employed, encompassing climate scenarios, agro-ecological zoning information, socio-economic drivers, and world food trade dynamics. Specifically, global simulations are performed using the FAO/IIASA agro-ecological zone model, in conjunction with IIASA's global food system model, using climate variables from five different general circulation models under four different socio-economic scenarios from the Intergovernmental Panel on Climate Change. First, impacts of different scenarios of climate change on bio-physical soil and crop growth determinants of yield are evaluated on a 5′×5′ latitude/longitude global grid; second, the extent of potential agricultural land and related potential crop production is computed. The detailed bio-physical results are then fed into an economic analysis to assess how climate impacts may interact with alternative development pathways, and key trends expected over this century for food demand, production, and trade, as well as key composite indices such as risk of hunger and malnutrition, are computed. This modelling approach connects the relevant bio-physical and socio-economic variables within a unified and coherent framework to produce a global assessment of food production and security under climate change. The results suggest that critical impact asymmetries due to both climate and socio-economic structures may deepen current production and consumption gaps between the developed and the developing world; it is suggested that adaptation of agricultural techniques will be central to limiting potential damages under climate change. PMID:16433094
Challenges and opportunities of cloud computing for atmospheric sciences
NASA Astrophysics Data System (ADS)
Pérez Montes, Diego A.; Añel, Juan A.; Pena, Tomás F.; Wallom, David C. H.
2016-04-01
Cloud computing is an emerging technological solution widely used in many fields. Initially developed as a flexible way of managing peak demand, it has begun to make its way into scientific research. One of the greatest advantages of cloud computing for scientific research is independence from having to fund or maintain access to a large cyberinfrastructure in order to perform a research project. Cloud computing can avoid maintenance expenses for large supercomputers and has the potential to 'democratize' access to high-performance computing, giving funding bodies flexibility in allocating budgets for the computational costs associated with a project. Two of the most challenging problems in atmospheric sciences are computational cost and uncertainty in meteorological forecasting and climate projections. Both problems are closely related: usually uncertainty can be reduced when computational resources are available to better reproduce a phenomenon or to perform a larger number of experiments. Here we present results of the application of cloud computing resources for climate modeling using cloud computing infrastructures of three major vendors and two climate models. We show how the cloud infrastructure compares in performance to traditional supercomputers and how it provides the capability to complete experiments in shorter periods of time. The associated monetary cost is also analyzed. Finally, we discuss the future potential of this technology for meteorological and climatological applications, both from the point of view of operational use and of research.
Intentions of hospital nurses to work with computers: based on the theory of planned behavior.
Shoham, Snunith; Gonen, Ayala
2008-01-01
The purpose of this study was to determine registered nurses' attitudes related to intent to use computers in the hospital setting as a predictor of their future behavior. The study was further aimed at identifying the relationship between these attitudes and selected sociological, professional, and personal factors and to describe a research model integrating these various factors. The study was based on the theory of planned behavior. A random sample of 411 registered nurses was selected from a single large medical center in Israel. The study tool was a Likert-style questionnaire. Nine different indices were used: (1) behavioral intention toward computer use; (2) general attitudes toward computer use; (3) nursing attitudes toward computer use; (4) threat involved in computer use; (5) challenge involved in computer use; (6) organizational climate; (7) departmental climate; (8) attraction to technological innovations/innovativeness; (9) self-efficacy, ability to control behavior. Strong significant positive correlations were found between the nurses' attitudes (general attitudes and nursing attitudes), self-efficacy, innovativeness, and intentions to use computers. Higher correlations were found between departmental climate and attitudes than between organizational climate and attitudes. The threat and challenge that are involved in computer use were shown as important mediating variables to the understanding of the process of predicting attitudes and intentions toward using computers.
Understanding Climate Uncertainty with an Ocean Focus
NASA Astrophysics Data System (ADS)
Tokmakian, R. T.
2009-12-01
Uncertainty in climate simulations arises from various aspects of the end-to-end process of modeling the Earth's climate. First, there is uncertainty from the structure of the climate model components (e.g. ocean/ice/atmosphere). Even the most complex models are deficient, not only in the complexity of the processes they represent, but in which processes are included in a particular model. Next, uncertainties arise from the inherent error in the initial and boundary conditions of a simulation. Initial conditions describe the state of the weather or climate at the beginning of the simulation and typically come from observations. Finally, there is the uncertainty associated with the values of parameters in the model. These parameters may represent physical constants or effects, such as ocean mixing, or non-physical aspects of modeling and computation. The uncertainty in these input parameters propagates through the non-linear model to give uncertainty in the outputs. The models in 2020 will no doubt be better than today's models, but they will still be imperfect, and the development of uncertainty analysis technology is a critical aspect of understanding model realism and prediction capability. Smith [2002] and Cox and Stephenson [2007] discuss the need for methods to quantify the uncertainties within complicated systems so that limitations or weaknesses of the climate model can be understood. In making climate predictions, we need to have available both the most reliable model or simulation and methods to quantify the reliability of a simulation. If quantitative uncertainty questions about the internal model dynamics are to be answered with complex simulations such as AOGCMs, then the only known path forward is based on model ensembles that characterize behavior with alternative parameter settings [e.g. Rougier, 2007].
The relevance and feasibility of using "Statistical Analysis of Computer Code Output" (SACCO) methods for examining uncertainty in ocean circulation due to parameter specification will be described, and early results using the ocean/ice components of the CCSM climate model in a designed-experiment framework will be shown. Cox, P., and D. Stephenson, 2007: Climate Change: A Changing Climate for Prediction. Science, 317(5835), 207, doi:10.1126/science.1145956. Rougier, J. C., 2007: Probabilistic Inference for Future Climate Using an Ensemble of Climate Model Evaluations. Climatic Change, 81, 247-264. Smith, L., 2002: What might we learn from climate forecasts? Proc. Nat'l Academy of Sciences, 99, suppl. 1, 2487-2492, doi:10.1073/pnas.012580599.
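The perturbed-parameter ensemble approach advocated above can be illustrated with a toy model. The sketch below uses a hypothetical zero-dimensional energy-balance model (not the CCSM components of the study) and propagates uniform uncertainty in two parameters, with illustrative ranges, to a spread in the model output:

```python
import random

# Toy zero-dimensional energy-balance model: equilibrium surface
# temperature as a function of planetary albedo and effective
# emissivity. An illustrative stand-in for a full climate model.
S = 1361.0          # solar constant, W m^-2
SIGMA = 5.67e-8     # Stefan-Boltzmann constant, W m^-2 K^-4

def equilibrium_temperature(albedo, emissivity):
    """Solve S(1-a)/4 = eps * sigma * T^4 for T (kelvin)."""
    return (S * (1.0 - albedo) / (4.0 * emissivity * SIGMA)) ** 0.25

def parameter_ensemble(n, seed=0):
    """Sample uncertain parameters uniformly within assumed plausible
    ranges and return the ensemble of model outputs."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n):
        albedo = rng.uniform(0.28, 0.32)      # uncertain reflectivity
        emissivity = rng.uniform(0.58, 0.64)  # uncertain greenhouse effect
        outputs.append(equilibrium_temperature(albedo, emissivity))
    return outputs

ens = parameter_ensemble(500)
mean = sum(ens) / len(ens)
spread = (sum((t - mean) ** 2 for t in ens) / len(ens)) ** 0.5
print(f"ensemble mean: {mean:.1f} K, spread: {spread:.2f} K")
```

The ensemble spread is the quantity of interest: it characterizes how parameter uncertainty maps onto output uncertainty, which a single "best" simulation cannot reveal.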
NASA Astrophysics Data System (ADS)
Perez Montes, Diego A.; Añel Cabanelas, Juan A.; Wallom, David C. H.; Arribas, Alberto; Uhe, Peter; Caderno, Pablo V.; Pena, Tomas F.
2017-04-01
Cloud Computing is a technological option that offers great possibilities for modelling in geosciences. We have studied how two different climate models, HadAM3P-HadRM3P and CESM-WACCM, can be adapted in two different ways to run on Cloud Computing environments from three different vendors: Amazon, Google and Microsoft. We have also evaluated qualitatively how the use of Cloud Computing can affect the allocation of resources by funding bodies, as well as issues related to computing security, including scientific reproducibility. Our first experiments were developed using the well-known ClimatePrediction.net (CPDN) platform, which uses BOINC, over the infrastructure of two cloud providers, namely Microsoft Azure and Amazon Web Services (hereafter AWS). For this comparison we ran a set of thirteen-month climate simulations for CPDN in Azure and AWS using a range of different virtual machines (VMs) for HadRM3P (50 km resolution over the South America CORDEX region) nested in the global atmosphere-only model HadAM3P. These simulations were run on a single processor and took between 3 and 5 days to compute depending on the VM type. The last part of our simulation experiments consisted of running WACCM on different VMs on the Google Compute Engine (GCE) and comparing it with the supercomputer (SC) Finisterrae1 from the Centro de Supercomputacion de Galicia. It was shown that GCE gives better performance than the SC for smaller numbers of cores/MPI tasks, but the model throughput clearly shows that the SC performs better beyond approximately 100 cores (related to differences in network speed and latency). From a cost point of view, Cloud Computing moves researchers from a traditional approach, in which experiments are limited by the available hardware resources, to one in which they are limited by monetary resources (how many resources can be afforded).
As there is an increasing movement towards, and recommendation for, budgeting HPC projects on this technology (budgets can be calculated in a more realistic way), we could see a shift in the trends over the next years that consolidates the Cloud as the preferred solution.
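The budgeting point can be made concrete with a back-of-the-envelope cost model. All prices and throughputs below are hypothetical placeholders, not figures measured in the study:

```python
# Back-of-the-envelope model for the cloud-vs-supercomputer cost
# trade-off: once node-hour prices and measured model throughput are
# known, the cost of a simulation campaign becomes a simple budget
# line. Numbers here are invented for illustration only.

def cost_per_simulated_year(node_hour_price, sim_years_per_day):
    """Dollars per simulated model year on a given platform."""
    node_hours_per_sim_year = 24.0 / sim_years_per_day
    return node_hour_price * node_hours_per_sim_year

# Hypothetical platforms: a single cloud VM vs. a supercomputer node
# allocation with higher throughput but a higher effective price.
cloud = cost_per_simulated_year(node_hour_price=1.50, sim_years_per_day=0.5)
sc = cost_per_simulated_year(node_hour_price=4.00, sim_years_per_day=2.0)

print(f"cloud: ${cloud:.2f}/sim-year, supercomputer: ${sc:.2f}/sim-year")
```

The design point is that on the cloud the budget, not the queue, is the binding constraint: multiplying either figure by the number of ensemble members and simulated years gives a realistic project budget up front.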
High-resolution downscaling for hydrological management
NASA Astrophysics Data System (ADS)
Ulbrich, Uwe; Rust, Henning; Meredith, Edmund; Kpogo-Nuwoklo, Komlan; Vagenas, Christos
2017-04-01
Hydrological modellers and water managers require high-resolution climate data to model regional hydrologies and how these may respond to future changes in the large-scale climate. The ability to successfully model such changes, and by extension to plan critical infrastructure, is often impeded by a lack of suitable climate data. The available data are typically too coarse: climate model output that is not sufficiently detailed in either space or time to support water management decisions and hydrological research. BINGO (Bringing INnovation in onGOing water management;
Analysis of the Relationship Between Climate and NDVI Variability at Global Scales
NASA Technical Reports Server (NTRS)
Zeng, Fan-Wei; Collatz, G. James; Pinzon, Jorge; Ivanoff, Alvaro
2011-01-01
Interannual variability in modeled (CASA) carbon flux is in part caused by interannual variability in the Normalized Difference Vegetation Index (NDVI) Fraction of Photosynthetically Active Radiation (FPAR). This study confirms a mechanism producing variability in modeled NPP: NDVI (FPAR) interannual variability is strongly driven by climate, and the climate-driven variability in NDVI (FPAR) can lead to much larger fluctuations in NPP than the NPP computed from the FPAR climatology.
Compilation of Abstracts for SC12 Conference Proceedings
NASA Technical Reports Server (NTRS)
Morello, Gina Francine (Compiler)
2012-01-01
1 A Breakthrough in Rotorcraft Prediction Accuracy Using Detached Eddy Simulation; 2 Adjoint-Based Design for Complex Aerospace Configurations; 3 Simulating Hypersonic Turbulent Combustion for Future Aircraft; 4 From a Roar to a Whisper: Making Modern Aircraft Quieter; 5 Modeling of Extended Formation Flight on High-Performance Computers; 6 Supersonic Retropropulsion for Mars Entry; 7 Validating Water Spray Simulation Models for the SLS Launch Environment; 8 Simulating Moving Valves for Space Launch System Liquid Engines; 9 Innovative Simulations for Modeling the SLS Solid Rocket Booster Ignition; 10 Solid Rocket Booster Ignition Overpressure Simulations for the Space Launch System; 11 CFD Simulations to Support the Next Generation of Launch Pads; 12 Modeling and Simulation Support for NASA's Next-Generation Space Launch System; 13 Simulating Planetary Entry Environments for Space Exploration Vehicles; 14 NASA Center for Climate Simulation Highlights; 15 Ultrascale Climate Data Visualization and Analysis; 16 NASA Climate Simulations and Observations for the IPCC and Beyond; 17 Next-Generation Climate Data Services: MERRA Analytics; 18 Recent Advances in High-Resolution Global Atmospheric Modeling; 19 Causes and Consequences of Turbulence in the Earth's Protective Shield; 20 NASA Earth Exchange (NEX): A Collaborative Supercomputing Platform; 21 Powering Deep Space Missions: Thermoelectric Properties of Complex Materials; 22 Meeting NASA's High-End Computing Goals Through Innovation; 23 Continuous Enhancements to the Pleiades Supercomputer for Maximum Uptime; 24 Live Demonstrations of 100-Gbps File Transfers Across LANs and WANs; 25 Untangling the Computing Landscape for Climate Simulations; 26 Simulating Galaxies and the Universe; 27 The Mysterious Origin of Stellar Masses; 28 Hot-Plasma Geysers on the Sun; 29 Turbulent Life of Kepler Stars; 30 Modeling Weather on the Sun; 31 Weather on Mars: The Meteorology of Gale Crater; 32 Enhancing Performance of NASA's High-End Computing Applications; 33 Designing Curiosity's Perfect Landing on Mars; 34 The Search Continues: Kepler's Quest for Habitable Earth-Sized Planets.
Potential evapotranspiration and continental drying
Milly, Paul C.D.; Dunne, Krista A.
2016-01-01
By various measures (drought area and intensity, climatic aridity index, and climatic water deficits), some observational analyses have suggested that much of the Earth’s land has been drying during recent decades, but such drying seems inconsistent with observations of dryland greening and decreasing pan evaporation. ‘Offline’ analyses of climate-model outputs from anthropogenic climate change (ACC) experiments portend continuation of putative drying through the twenty-first century, despite an expected increase in global land precipitation. A ubiquitous increase in estimates of potential evapotranspiration (PET), driven by atmospheric warming, underlies the drying trends, but may be a methodological artefact. Here we show that the PET estimator commonly used (the Penman–Monteith PET for either an open-water surface or a reference crop) severely overpredicts the changes in non-water-stressed evapotranspiration computed in the climate models themselves in ACC experiments. This overprediction is partially due to neglect of stomatal conductance reductions commonly induced by increasing atmospheric CO2 concentrations in climate models. Our findings imply that historical and future tendencies towards continental drying, as characterized by offline-computed runoff, as well as other PET-dependent metrics, may be considerably weaker and less extensive than previously thought.
Classification and Feature Selection Algorithms for Modeling Ice Storm Climatology
NASA Astrophysics Data System (ADS)
Swaminathan, R.; Sridharan, M.; Hayhoe, K.; Dobbie, G.
2015-12-01
Ice storms account for billions of dollars of winter storm loss across the continental US and Canada. In the future, increasing concentration of human populations in areas vulnerable to ice storms such as the northeastern US will only exacerbate the impacts of these extreme events on infrastructure and society. Quantifying the potential impacts of global climate change on ice storm prevalence and frequency is challenging, as ice storm climatology is driven by complex and incompletely defined atmospheric processes, processes that are in turn influenced by a changing climate. This makes the underlying atmospheric and computational modeling of ice storm climatology a formidable task. We propose a novel computational framework that uses sophisticated stochastic classification and feature selection algorithms to model ice storm climatology and quantify storm occurrences from both reanalysis and global climate model outputs. The framework is based on an objective identification of ice storm events by key variables derived from vertical profiles of temperature, humidity and geopotential height. Historical ice storm records are used to identify days with synoptic-scale upper air and surface conditions associated with ice storms. Evaluation using NARR reanalysis and historical ice storm records corresponding to the northeastern US demonstrates that an objective computational model with standard performance measures can, with a relatively high degree of accuracy, identify ice storm events based on upper-air circulation patterns and provide insights into the relationships between key climate variables associated with ice storms.
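The idea of an objective criterion over vertical temperature profiles can be sketched in a few lines. The classic freezing-rain structure assumed below (sub-freezing surface layer, warm melting layer aloft, cold snow-producing layer above) is a textbook simplification, not the exact feature set or classifier used in the study:

```python
# Simplified, illustrative objective ice-storm criterion over a
# vertical temperature profile: freezing rain requires a sub-freezing
# surface layer, a warm (>0 degC) melting layer aloft, and a
# sub-freezing (snow-producing) layer above that. The thresholds and
# structure are assumptions for illustration only.

def is_freezing_rain_profile(profile):
    """profile: list of (height_m, temp_degC) pairs."""
    temps = [t for _, t in sorted(profile)]
    if temps[0] >= 0.0:               # surface must be sub-freezing
        return False
    warm = [i for i, t in enumerate(temps) if t > 0.0]
    if not warm:                      # no melting layer aloft
        return False
    # Require a sub-freezing layer above the warm nose.
    return any(t < 0.0 for t in temps[max(warm) + 1:])

icing = [(0, -3.0), (500, -1.0), (1000, 2.5), (1500, 1.0),
         (2000, -5.0), (3000, -12.0)]
snow = [(0, -5.0), (1000, -8.0), (2000, -15.0)]
print(is_freezing_rain_profile(icing), is_freezing_rain_profile(snow))
```

In the actual framework such hand-written rules are replaced by learned classifiers and feature selection, but the inputs are of this kind: scalar features derived from the vertical structure of each candidate day.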
A Decade-Long European-Scale Convection-Resolving Climate Simulation on GPUs
NASA Astrophysics Data System (ADS)
Leutwyler, D.; Fuhrer, O.; Ban, N.; Lapillonne, X.; Lüthi, D.; Schar, C.
2016-12-01
Convection-resolving models have proven to be very useful tools in numerical weather prediction and in climate research. However, due to their extremely demanding computational requirements, they have so far been limited to short simulations and/or small computational domains. Innovations in the supercomputing domain have led to new supercomputer designs that involve conventional multi-core CPUs and accelerators such as graphics processing units (GPUs). One of the first atmospheric models that has been fully ported to GPUs is the Consortium for Small-Scale Modeling weather and climate model COSMO. This new version allows us to expand the size of the simulation domain to areas spanning continents and the time period up to one decade. We present results from a decade-long, convection-resolving climate simulation over Europe using the GPU-enabled COSMO version on a computational domain with 1536x1536x60 gridpoints. The simulation is driven by the ERA-interim reanalysis. The results illustrate how the approach allows for the representation of interactions between synoptic-scale and meso-scale atmospheric circulations at scales ranging from 1000 to 10 km. We discuss some of the advantages and prospects of using GPUs, and focus on the performance of the convection-resolving modeling approach at the European scale. Specifically, we investigate the organization of convective clouds and validate hourly rainfall distributions against various high-resolution data sets.
NASA Astrophysics Data System (ADS)
Nakagawa, T.; Tajika, E.; Kadoya, S.
2017-12-01
The impact of the evolution and dynamics of the Earth's deep interior on surface climate change has been discussed for the last few decades (see review by Ehlmann et al., 2016), and mantle volatile (particularly carbon) degassing at the mid-oceanic ridges seems to play a key role in understanding the evolutionary climate track of Earth-like planets (e.g. Kadoya and Tajika, 2015). However, since mantle degassing occurs not only at the mid-oceanic ridges but also in the wedge mantle (island-arc volcanism) and at hotspots, to incorporate a more accurate estimate of the mantle degassing flux into the climate evolution framework, we developed a coupled model of surface climate and deep-Earth evolution in numerical mantle convection simulations, combining a more accurate deep water and carbon cycle (e.g. Nakagawa and Spiegelman, 2017) with an energy-balance theory of climate change. Modeling results suggest that the evolution of planetary climate computed from the developed model is basically consistent with the evolutionary climate track of the simplified mantle degassing model (Kadoya and Tajika, 2015), but the timing of global (snowball) glaciation is strongly dependent on the mantle degassing rate associated with the activity of surface plate motions. This implies that surface plate motion driven by deep mantle dynamics would play an important role in the planetary habitability of the Earth and Earth-like planets over geologic time-scales.
Development of climate data storage and processing model
NASA Astrophysics Data System (ADS)
Okladnikov, I. G.; Gordov, E. P.; Titov, A. G.
2016-11-01
We present a storage and processing model for climate datasets elaborated in the framework of a virtual research environment (VRE) for climate and environmental monitoring and analysis of the impact of climate change on the socio-economic processes on local and regional scales. The model is based on a "shared-nothing" distributed computing architecture and assumes a computing network where each computing node is independent and self-sufficient. Each node hosts dedicated software for the processing and visualization of geospatial data, providing programming interfaces to communicate with the other nodes. The nodes are interconnected by a local network or the Internet and exchange data and control instructions via SSH connections and web services. Geospatial data are represented by collections of netCDF files stored in a hierarchy of directories within a file system. To speed up data reading and processing, three approaches are proposed: precalculation of intermediate products, distribution of data across multiple storage systems (with or without redundancy), and caching and reuse of previously obtained products. For fast search and retrieval of the required data, a metadata database is developed according to the data storage and processing model. It contains descriptions of the space-time features of the datasets available for processing, their locations, as well as descriptions and run options of the software components for data analysis and visualization. Together, the model and the metadata database will provide a reliable technological basis for the development of a high-performance virtual research environment for climatic and environmental monitoring.
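A metadata catalog of this kind reduces to a small relational problem: record each collection's variable, space-time coverage, and location, then answer overlap queries. The sketch below uses SQLite with an invented minimal schema (variable name and year range only); the actual VRE schema, table names, and paths are assumptions:

```python
import sqlite3

# Minimal sketch of a metadata database for distributed netCDF
# collections: each record describes the temporal coverage and the
# storage node/path of one collection. The schema is an illustrative
# assumption, not the one used in the VRE described above.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE datasets (
    variable TEXT, year_start INTEGER, year_end INTEGER,
    node TEXT, path TEXT)""")
db.executemany(
    "INSERT INTO datasets VALUES (?, ?, ?, ?, ?)",
    [("tas", 1950, 2000, "node-01", "/data/tas/1950-2000"),
     ("tas", 2001, 2020, "node-02", "/data/tas/2001-2020"),
     ("pr",  1950, 2020, "node-03", "/data/pr/1950-2020")])

def find(variable, year_from, year_to):
    """Return (node, path) pairs whose coverage overlaps the query
    interval: start <= query_end and end >= query_start."""
    return db.execute(
        "SELECT node, path FROM datasets "
        "WHERE variable = ? AND year_start <= ? AND year_end >= ?",
        (variable, year_to, year_from)).fetchall()

print(find("tas", 1995, 2005))  # overlaps both tas collections
```

The processing layer would then dispatch work to the returned nodes, keeping computation next to the data in the shared-nothing spirit.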
A new climate modeling framework for convection-resolving simulation at continental scale
NASA Astrophysics Data System (ADS)
Charpilloz, Christophe; di Girolamo, Salvatore; Arteaga, Andrea; Fuhrer, Oliver; Hoefler, Torsten; Schulthess, Thomas; Schär, Christoph
2017-04-01
Major uncertainties remain in our understanding of the processes that govern the water cycle in a changing climate and their representation in weather and climate models. Of particular concern are heavy precipitation events of convective origin (thunderstorms and rain showers). The aim of the crCLIM project [1] is to propose a new climate modeling framework that alleviates the I/O bottleneck in large-scale, convection-resolving climate simulations and thus to enable new analysis techniques for climate scientists. Due to their large computational costs, convection-resolving simulations are currently restricted to small computational domains or very short time scales, unless the largest available supercomputer systems, such as hybrid CPU-GPU architectures, are used [3]. Hence, the COSMO model has been adapted to run on these architectures for research and production purposes [2]. However, the amount of generated data also increases, and storing this data becomes infeasible, making the analysis of simulation results impractical. To circumvent this problem and enable high-resolution climate modeling, we propose a data-virtualization layer (DVL) that re-runs simulations on demand and transparently manages the data for analysis; that is, we trade computational effort (time) for storage (space). This approach also requires a bit-reproducible version of the COSMO model that produces identical results on different architectures (CPUs and GPUs) [4], which will be coupled with a performance model in order to enable optimal re-runs depending on the requirements of the re-run and the available resources. In this contribution, we discuss the strategy to develop the DVL, a first performance model, the challenge of bit-reproducibility and the first results of the crCLIM project. [1] http://www.c2sm.ethz.ch/research/crCLIM.html [2] O. Fuhrer, C. Osuna, X. Lapillonne, T. Gysi, M. Bianco, and T. Schulthess. "Towards gpu-accelerated operational weather forecasting." 
In The GPU Technology Conference, GTC. 2013. [3] D. Leutwyler, O. Fuhrer, X. Lapillonne, D. Lüthi, and C. Schär. "Towards European-scale convection-resolving climate simulations with GPUs: a study with COSMO 4.19." Geoscientific Model Development 9, no. 9 (2016): 3393. [4] A. Arteaga, O. Fuhrer, and T. Hoefler. "Designing bit-reproducible portable high-performance applications." In Parallel and Distributed Processing Symposium, 2014 IEEE 28th International, pp. 1235-1244. IEEE, 2014.
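The time-for-space trade behind the DVL can be illustrated with a toy "virtual store" that keeps only a run configuration plus a small cache, and re-runs a deterministic simulation on demand. The cache policy, class names, and stand-in model below are all invented for illustration; the real DVL re-runs COSMO bit-reproducibly:

```python
# Toy illustration of a data-virtualization layer: instead of storing
# every field a simulation produces, store only the configuration
# needed to reproduce it, re-run on demand, and cache recent answers.
# The "simulation" is a trivial deterministic stand-in for a
# bit-reproducible model run.

class VirtualStore:
    def __init__(self, config, cache_size=2):
        self.config = config      # enough to reproduce any output
        self.cache = {}           # step -> field value
        self.cache_size = cache_size
        self.reruns = 0           # times we paid compute to save space

    def _simulate(self, step):
        """Deterministic stand-in model: iterate a logistic map."""
        self.reruns += 1
        x = self.config["x0"]
        for _ in range(step):
            x = 3.9 * x * (1.0 - x)
        return x

    def field_at(self, step):
        """Return the field at a timestep, re-running if not cached."""
        if step not in self.cache:
            if len(self.cache) >= self.cache_size:
                self.cache.pop(next(iter(self.cache)))  # evict oldest
            self.cache[step] = self._simulate(step)
        return self.cache[step]

store = VirtualStore({"x0": 0.2})
a = store.field_at(100)
b = store.field_at(100)   # served from cache, no second re-run
print(a == b, store.reruns)
```

Bit-reproducibility is what makes this correct: only if every re-run returns identical bits can the DVL pretend the data was stored all along.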
Agent Model Development for Assessing Climate-Induced Geopolitical Instability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boslough, Mark B.; Backus, George A.
2005-12-01
We present the initial stages of development of new agent-based computational methods to generate and test hypotheses about linkages between environmental change and international instability. This report summarizes the first year's effort of an originally proposed three-year Laboratory Directed Research and Development (LDRD) project. The preliminary work focused on a set of simple agent-based models and benefited from lessons learned in previous related projects and case studies of human response to climate change and environmental scarcity. Our approach was to define a qualitative model using extremely simple cellular agent models akin to Lovelock's Daisyworld and Schelling's segregation model. Such models do not require significant computing resources, and users can modify behavior rules to gain insights. One of the difficulties in agent-based modeling is finding the right balance between model simplicity and real-world representation. Our approach was to keep agent behaviors as simple as possible during the development stage (described herein) and to ground them with a realistic geospatial Earth system model in subsequent years. This work is directed toward incorporating projected climate data--including various CO2 scenarios from the Intergovernmental Panel on Climate Change (IPCC) Third Assessment Report--and ultimately toward coupling a useful agent-based model to a general circulation model.
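Cellular models of the Schelling type cited above really are cheap to run and easy to modify. A minimal Schelling-style sketch, with grid size, density, and tolerance threshold chosen purely for illustration (not the report's models), is:

```python
import random

# Minimal Schelling-style segregation model of the kind the report
# cites as a starting point: two agent types on a periodic grid, each
# agent relocating to a random empty cell when fewer than half of its
# neighbours share its type. All parameters are illustrative.

SIZE, THRESHOLD = 20, 0.5
rng = random.Random(42)

def neighbours(grid, r, c):
    """Occupied neighbour values of cell (r, c) on a torus."""
    cells = []
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if (dr, dc) != (0, 0):
                cells.append(grid[(r + dr) % SIZE][(c + dc) % SIZE])
    return [x for x in cells if x is not None]

def mean_similarity(grid):
    """Average fraction of like-typed neighbours over all agents."""
    fracs = []
    for r in range(SIZE):
        for c in range(SIZE):
            if grid[r][c] is not None:
                nb = neighbours(grid, r, c)
                if nb:
                    fracs.append(
                        sum(1 for x in nb if x == grid[r][c]) / len(nb))
    return sum(fracs) / len(fracs)

# 45% type 0, 45% type 1, 10% empty cells, shuffled onto the grid.
cells = [0] * 180 + [1] * 180 + [None] * 40
rng.shuffle(cells)
grid = [cells[i * SIZE:(i + 1) * SIZE] for i in range(SIZE)]

before = mean_similarity(grid)
for _ in range(30):                      # 30 relocation sweeps
    for r in range(SIZE):
        for c in range(SIZE):
            agent = grid[r][c]
            if agent is None:
                continue
            nb = neighbours(grid, r, c)
            if nb and sum(1 for x in nb if x == agent) / len(nb) < THRESHOLD:
                empties = [(i, j) for i in range(SIZE) for j in range(SIZE)
                           if grid[i][j] is None]
                i, j = rng.choice(empties)
                grid[i][j], grid[r][c] = agent, None
after = mean_similarity(grid)
print(f"mean like-neighbour fraction: {before:.2f} -> {after:.2f}")
```

The emergent clustering from purely local rules is exactly the property that makes such models attractive hypothesis generators before coupling to a realistic geospatial model.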
Evolving Storage and Cyber Infrastructure at the NASA Center for Climate Simulation
NASA Technical Reports Server (NTRS)
Salmon, Ellen; Duffy, Daniel; Spear, Carrie; Sinno, Scott; Vaughan, Garrison; Bowen, Michael
2018-01-01
This talk will describe recent developments at the NASA Center for Climate Simulation, which is funded by NASA's Science Mission Directorate, and supports the specialized data storage and computational needs of weather, ocean, and climate researchers, as well as astrophysicists, heliophysicists, and planetary scientists. To meet requirements for higher-resolution, higher-fidelity simulations, the NCCS is augmenting its High Performance Computing (HPC) and storage/retrieval environment. As the petabytes of model and observational data grow, the NCCS is broadening its data services offerings and deploying and expanding virtualization resources for high performance analytics.
2012 Community Earth System Model (CESM) Tutorial - Proposal to DOE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holland, Marika; Bailey, David A
2013-03-18
The Community Earth System Model (CESM) is a fully-coupled, global climate model that provides state-of-the-art computer simulations of the Earth's past, present, and future climate states. This document provides the agenda and list of participants for the conference. Web materials for all lectures and practical sessions available from: http://www.cesm.ucar.edu/events/tutorials/073012/ .
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, William D; Johansen, Hans; Evans, Katherine J
We present a survey of physical and computational techniques that have the potential to contribute to the next generation of high-fidelity, multi-scale climate simulations. Examples of the climate science problems that can be investigated with more depth include the capture of remote forcings of localized hydrological extreme events, an accurate representation of cloud features over a range of spatial and temporal scales, and parallel, large ensembles of simulations to more effectively explore model sensitivities and uncertainties. Numerical techniques, such as adaptive mesh refinement, implicit time integration, and separate treatment of fast physical time scales, are enabling improved accuracy and fidelity in simulation of dynamics and allow more complete representations of climate features at the global scale. At the same time, partnerships with computer science teams have focused on taking advantage of evolving computer architectures, such as many-core processors and GPUs, so that these approaches which were previously considered prohibitively costly have become both more efficient and scalable. In combination, progress in these three critical areas is poised to transform climate modeling in the coming decades.
Improved Climate Simulations through a Stochastic Parameterization of Ocean Eddies
NASA Astrophysics Data System (ADS)
Williams, Paul; Howe, Nicola; Gregory, Jonathan; Smith, Robin; Joshi, Manoj
2016-04-01
In climate simulations, the impacts of the sub-grid scales on the resolved scales are conventionally represented using deterministic closure schemes, which assume that the impacts are uniquely determined by the resolved scales. Stochastic parameterization relaxes this assumption, by sampling the sub-grid variability in a computationally inexpensive manner. This presentation shows that the simulated climatological state of the ocean is improved in many respects by implementing a simple stochastic parameterization of ocean eddies into a coupled atmosphere-ocean general circulation model. Simulations from a high-resolution, eddy-permitting ocean model are used to calculate the eddy statistics needed to inject realistic stochastic noise into a low-resolution, non-eddy-permitting version of the same model. A suite of four stochastic experiments is then run to test the sensitivity of the simulated climate to the noise definition, by varying the noise amplitude and decorrelation time within reasonable limits. The addition of zero-mean noise to the ocean temperature tendency is found to have a non-zero effect on the mean climate. Specifically, in terms of the ocean temperature and salinity fields both at the surface and at depth, the noise reduces many of the biases in the low-resolution model and causes it to more closely resemble the high-resolution model. The variability of the strength of the global ocean thermohaline circulation is also improved. It is concluded that stochastic ocean perturbations can yield reductions in climate model error that are comparable to those obtained by refining the resolution, but without the increased computational cost. Therefore, stochastic parameterizations of ocean eddies have the potential to significantly improve climate simulations. Reference PD Williams, NJ Howe, JM Gregory, RS Smith, and MM Joshi (2016) Improved Climate Simulations through a Stochastic Parameterization of Ocean Eddies. Journal of Climate, under revision.
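A noise process defined by an amplitude and a decorrelation time, as varied in the four experiments above, is naturally written as a first-order autoregressive (AR(1), "red noise") process. The sketch below is a generic illustration of that construction; the amplitudes, times, and the way the noise enters the real model's temperature tendency are placeholders, not the study's settings:

```python
import math
import random

# Red (AR(1)) noise with prescribed amplitude sigma and decorrelation
# time tau, of the kind added to an ocean temperature tendency in a
# stochastic parameterization. The numbers are illustrative only.

def ar1_noise(n, dt, sigma, tau, seed=1):
    """Zero-mean AR(1) series with std sigma and decorrelation time tau."""
    rng = random.Random(seed)
    r = math.exp(-dt / tau)            # lag-1 autocorrelation
    eta, series = 0.0, []
    for _ in range(n):
        # Variance-preserving update: stationary std stays sigma.
        eta = r * eta + sigma * math.sqrt(1.0 - r * r) * rng.gauss(0.0, 1.0)
        series.append(eta)
    return series

noise = ar1_noise(n=20000, dt=1.0, sigma=1.0, tau=9.5)   # r ~ 0.9
mean = sum(noise) / len(noise)
lag1 = (sum(a * b for a, b in zip(noise, noise[1:])) /
        sum(a * a for a in noise))
print(f"sample mean {mean:+.3f}, lag-1 autocorrelation {lag1:.2f}")
```

Although each realization has (near) zero mean, adding such noise to a nonlinear model can shift the mean state, which is precisely the rectification effect the abstract reports.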
NASA Astrophysics Data System (ADS)
Anantharaj, V. G.; Venzke, J.; Lingerfelt, E.; Messer, B.
2015-12-01
Climate model simulations are used to understand the evolution and variability of Earth's climate. Unfortunately, high-resolution multi-decadal climate simulations can take days to weeks to complete. Typically, the simulation results are not analyzed until the model runs have ended. During the course of the simulation, the output may be processed periodically to ensure that the model is performing as expected, but most of the data analytics and visualization are not performed until the simulation is finished. The lengthy time period needed for the completion of the simulation constrains the productivity of climate scientists. Our implementation of near real-time data visualization analytics capabilities allows scientists to monitor the progress of their simulations while the model is running. Our analytics software executes concurrently in a co-scheduling mode, monitoring data production. When new data are generated by the simulation, a co-scheduled data analytics job is submitted to render visualization artifacts of the latest results. These visualization outputs are automatically transferred to Bellerophon's data server located at ORNL's Compute and Data Environment for Science (CADES), where they are processed and archived into Bellerophon's database. During the course of the experiment, climate scientists can then use Bellerophon's graphical user interface to view animated plots and their associated metadata. The quick turnaround from the start of the simulation until the data are analyzed permits research decisions and projections to be made days or sometimes even weeks sooner than otherwise possible. The supercomputer resources used to run the simulation are unaffected by co-scheduling the data visualization jobs, so the model runs continuously while the data are visualized. Our just-in-time data visualization software aims to increase climate scientists' productivity as climate modeling moves into the exascale era of computing.
Challenges in the development of very high resolution Earth System Models for climate science
NASA Astrophysics Data System (ADS)
Rasch, Philip J.; Xie, Shaocheng; Ma, Po-Lun; Lin, Wuyin; Wan, Hui; Qian, Yun
2017-04-01
The authors represent the 20+ members of the ACME atmosphere development team. The US Department of Energy (DOE) has, like many other organizations around the world, identified the need for an Earth System Model capable of rapid completion of decade- to century-length simulations at very high (vertical and horizontal) resolution with good climate fidelity. Two years ago DOE initiated a multi-institution effort called ACME (Accelerated Climate Modeling for Energy) to meet this extraordinary challenge, targeting a model eventually capable of running at 10-25 km horizontal and 20-400 m vertical resolution through the troposphere on exascale computational platforms at speeds sufficient to complete 5+ simulated years per day. I will outline the challenges our team has encountered in the development of the atmosphere component of this model, and the strategies we have been using for tuning and debugging a model that we can barely afford to run on today's computational platforms. These strategies include: 1) evaluation at lower resolutions; 2) ensembles of short simulations to explore parameter space and perform rough tuning and evaluation; 3) use of regionally refined versions of the model for probing high-resolution model behavior at less expense; 4) use of "auto-tuning" methodologies for model tuning; and 5) brute-force long climate simulations.
Reliable low precision simulations in land surface models
NASA Astrophysics Data System (ADS)
Dawson, Andrew; Düben, Peter D.; MacLeod, David A.; Palmer, Tim N.
2017-12-01
Weather and climate models must continue to increase in both resolution and complexity in order that forecasts become more accurate and reliable. Moving to lower numerical precision may be an essential tool for coping with the demand for ever increasing model complexity in addition to increasing computing resources. However, there have been some concerns in the weather and climate modelling community over the suitability of lower precision for climate models, particularly for representing processes that change very slowly over long time-scales. These processes are difficult to represent using low precision due to time increments being systematically rounded to zero. Idealised simulations are used to demonstrate that a model of deep soil heat diffusion that fails when run in single precision can be modified to work correctly using low precision, by splitting up the model into a small higher precision part and a low precision part. This strategy retains the computational benefits of reduced precision whilst preserving accuracy. This same technique is also applied to a full complexity land surface model, resulting in rounding errors that are significantly smaller than initial condition and parameter uncertainties. Although lower precision will present some problems for the weather and climate modelling community, many of the problems can likely be overcome using a straightforward and physically motivated application of reduced precision.
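The failure mode and the splitting remedy can both be demonstrated by emulating a reduced-significand format in software. The deep-soil stand-in, precision, increment sizes, and fold interval below are all illustrative assumptions, not the land surface model's actual numbers:

```python
import math

# Emulate reduced-precision arithmetic by rounding every result to a
# 10-bit significand (roughly half precision), then illustrate the
# splitting strategy described above: a slowly accumulating field is
# lost entirely in low precision, because each tiny increment rounds
# to zero against the large state, but a small high-precision
# reference plus a low-precision anomaly recovers the signal.

def quantize(x, bits=10):
    """Round x to a float with a `bits`-bit significand."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)
    return math.ldexp(round(m * 2 ** bits) / 2 ** bits, e)

T0, dT, steps = 300.0, 1e-4, 1000   # slow warming of a deep layer (K)

# Naive low precision: every increment is rounded away.
t_low = T0
for _ in range(steps):
    t_low = quantize(t_low + dT)

# Split scheme: high-precision reference plus low-precision anomaly,
# folded back into the reference every 100 steps.
t_ref, anom = T0, 0.0
for i in range(1, steps + 1):
    anom = quantize(anom + dT)           # increment is representable here
    if i % 100 == 0:
        t_ref, anom = t_ref + anom, 0.0  # cheap, infrequent fold
print(f"low precision: {t_low}, split scheme: {t_ref + anom}")
```

The anomaly stays small, so the tiny increments remain representable relative to it; only the infrequent fold touches higher precision, preserving most of the computational benefit.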
NASA Astrophysics Data System (ADS)
Dullo, T. T.; Gangrade, S.; Marshall, R.; Islam, S. R.; Ghafoor, S. K.; Kao, S. C.; Kalyanapu, A. J.
2017-12-01
The damage and cost of flooding are continuously increasing due to climate change and variability, which compels the development and advance of global flood hazard models. However, due to computational expense, evaluation of large-scale and high-resolution flood regimes remains a challenge. The objective of this research is to use a coupled modeling framework that consists of a dynamically downscaled suite of eleven Coupled Model Intercomparison Project Phase 5 (CMIP5) climate models, a distributed hydrologic model called DHSVM, and a computationally efficient 2-dimensional hydraulic model called Flood2D-GPU to study the impacts of climate change on the flood regime in the Alabama-Coosa-Tallapoosa (ACT) River Basin. Downscaled meteorological forcings for 40 years in the historical period (1966-2005) and 40 years in the future period (2011-2050) were used as inputs to drive the calibrated DHSVM to generate annual maximum flood hydrographs. These flood hydrographs, along with 30-m resolution digital elevation data and estimated surface roughness, were then used by Flood2D-GPU to estimate high-resolution flood depths, velocities, durations, and regimes. Preliminary results for the Conasauga River Basin (an upper subbasin within the ACT) indicate that seven of the eleven climate projections show an average increase of 25 km2 in flooded area (between historical and future projections). Future work will focus on illustrating the effects of climate change on flood duration and area for the entire ACT basin.
NASA Astrophysics Data System (ADS)
Cloern, J.
2008-12-01
Programs to ensure sustainability of coastal ecosystems and the biological diversity they harbor require ecological forecasting to assess habitat transformations from the coupled effects of climate change and human population growth. A multidisciplinary modeling project (CASCaDE) was launched in 2007 to develop 21st-century visions of the Sacramento-San Joaquin Delta and San Francisco Bay under four scenarios of climate change and increasing demand for California's water resources. The process begins with downscaled projections of daily weather from GCMs and routes these to a watershed model that computes runoff and an operations model that computes inflows to the Bay-Delta. Hydrologic and climatic outputs, including sea level rise, drive models of tidal hydrodynamics, salinity, and temperature in the Delta, as well as sediment inputs and the evolving geomorphology of San Francisco Bay. These projected habitat changes are being used to address priority questions asked by resource managers: How will changes in seasonal streamflow, salinity and water temperature, frequency of extreme weather and hydrologic events, and geomorphology influence the sustainability of native species that depend upon the Bay-Delta and the ecosystem services it provides?
Application of Local Discretization Methods in the NASA Finite-Volume General Circulation Model
NASA Technical Reports Server (NTRS)
Yeh, Kao-San; Lin, Shian-Jiann; Rood, Richard B.
2002-01-01
We present the basic ideas of the dynamics system of the finite-volume General Circulation Model developed at NASA Goddard Space Flight Center for climate simulations and other applications in meteorology. The dynamics of this model is designed with emphasis on conservative and monotonic transport, where the property of Lagrangian conservation is used to maintain the physical consistency of the computational fluid for long-term simulations. As the model benefits from the noise-free solutions of monotonic finite-volume transport schemes, the property of Lagrangian conservation also partly compensates for the loss of transport accuracy caused by the diffusive effects of enforcing monotonicity. By faithfully maintaining the fundamental laws of physics during the computation, this model is able to achieve sufficient accuracy for the global consistency of climate processes. Because the computing algorithms are based on local memory, this model has the advantage of efficiency in parallel computation with distributed memory. Further research is still desirable to reduce the diffusive effects of monotonic transport for better accuracy, and to mitigate the limitation posed by fast-moving gravity waves for better efficiency.
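The trade-off described here, monotone and conservative transport at the price of some diffusion, is visible even in the simplest finite-volume scheme. A first-order upwind sketch on a periodic 1-D grid (illustrative only; the actual dynamical core uses far more sophisticated flux-form schemes):

```python
def upwind_advect(q, u, dt, dx, steps):
    """First-order upwind finite-volume advection on a periodic 1-D grid.

    Monotone (creates no new extrema) and exactly mass-conserving, but
    diffusive: sharp edges smear out over time. Assumes constant u > 0
    and a Courant number u*dt/dx <= 1 for stability.
    """
    c = u * dt / dx
    n = len(q)
    for _ in range(steps):
        q = [q[i] - c * (q[i] - q[i - 1]) for i in range(n)]
    return q

q0 = [0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0]
q1 = upwind_advect(q0, u=1.0, dt=0.5, dx=1.0, steps=4)
```

Total mass sum(q1) equals sum(q0) to round-off, and no value leaves the original [0, 1] range, while the square pulse visibly diffuses.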
On the use of inexact, pruned hardware in atmospheric modelling
Düben, Peter D.; Joven, Jaume; Lingamneni, Avinash; McNamara, Hugh; De Micheli, Giovanni; Palem, Krishna V.; Palmer, T. N.
2014-01-01
Inexact hardware design, which advocates trading the accuracy of computations in exchange for significant savings in area, power and/or performance of computing hardware, has received increasing prominence in several error-tolerant application domains, particularly those involving perceptual or statistical end-users. In this paper, we evaluate inexact hardware for its applicability in weather and climate modelling. We expand previous studies on inexact techniques, in particular probabilistic pruning, to floating point arithmetic units and derive several simulated set-ups of pruned hardware with reasonable levels of error for applications in atmospheric modelling. The set-up is tested on the Lorenz ‘96 model, a toy model for atmospheric dynamics, using software emulation for the proposed hardware. The results show that large parts of the computation tolerate the use of pruned hardware blocks without major changes in the quality of short- and long-time diagnostics, such as forecast errors and probability density functions. This could open the door to significant savings in computational cost and to higher resolution simulations with weather and climate models. PMID:24842031
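The software emulation of inexact hardware can be approximated by rounding every intermediate result to a reduced number of significand bits. A hedged sketch (generic reduced-precision emulation, not the authors' probabilistic-pruning emulator) applied to one Euler step of Lorenz '96:

```python
import math

def round_significand(x, bits):
    """Emulate low-precision hardware by keeping `bits` significand bits."""
    if x == 0.0 or not math.isfinite(x):
        return x
    m, e = math.frexp(x)              # x = m * 2**e with 0.5 <= |m| < 1
    scale = 2.0 ** bits
    return math.ldexp(round(m * scale) / scale, e)

def lorenz96_step(x, F, dt, bits=52):
    """One Euler step of dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F,
    rounding each arithmetic result to the emulated precision."""
    r = lambda v: round_significand(v, bits)
    n = len(x)
    dxdt = [r(r(r(x[(i + 1) % n] - x[i - 2]) * x[i - 1]) - x[i] + F)
            for i in range(n)]
    return [r(x[i] + r(dt * dxdt[i])) for i in range(n)]
```

Running the model at, say, bits=10 and comparing forecast errors against the bits=52 reference mimics the paper's comparison of pruned and exact hardware.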
2017-08-01
This large repository of climate model results for North America (Wang and Kotamarthi 2013, 2014, 2015) is stored in Network Common Data Form (NetCDF) ... Network Common Data Form (NetCDF). UCAR/Unidata Program Center, Boulder, CO. Available at: http://www.unidata.ucar.edu/software/netcdf. Accessed on 6/20 ... emissions diverge from each other regarding fossil fuel use, technology, and other socioeconomic factors. As a result, the estimated emissions for each of
NASA Astrophysics Data System (ADS)
Wong, Tony E.; Bakker, Alexander M. R.; Ruckert, Kelsey; Applegate, Patrick; Slangen, Aimée B. A.; Keller, Klaus
2017-07-01
Simple models can play pivotal roles in the quantification and framing of uncertainties surrounding climate change and sea-level rise. They are computationally efficient, transparent, and easy to reproduce. These qualities also make simple models useful for the characterization of risk. Simple model codes are increasingly distributed as open source, as well as actively shared and guided. Alas, computer codes used in the geosciences can often be hard to access, run, modify (e.g., with regards to assumptions and model components), and review. Here, we describe the simple model framework BRICK (Building blocks for Relevant Ice and Climate Knowledge) v0.2 and its underlying design principles. The paper adds detail to an earlier published model setup and discusses the inclusion of a land water storage component. The framework largely builds on existing models and allows for projections of global mean temperature as well as regional sea levels and coastal flood risk. BRICK is written in R and Fortran. BRICK gives special attention to the model values of transparency, accessibility, and flexibility in order to mitigate the above-mentioned issues while maintaining a high degree of computational efficiency. We demonstrate the flexibility of this framework through simple model intercomparison experiments. Furthermore, we demonstrate that BRICK is suitable for risk assessment applications by using a didactic example in local flood risk management.
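BRICK couples simple building-block components into sea-level projections. As an illustration of that style only (a generic semi-empirical sea-level rate with made-up parameter values; BRICK's actual component equations are given in the paper), a component can be as small as:

```python
def semi_empirical_slr(temps, alpha, T_eq, dt=1.0):
    """Integrate dS/dt = alpha * (T - T_eq): sea level responds to
    warming above an equilibrium temperature T_eq.

    temps: annual global mean temperature anomalies (deg C);
    alpha: sensitivity (mm/yr per deg C); returns cumulative rise in mm.
    """
    S, out = 0.0, []
    for T in temps:
        S += alpha * (T - T_eq) * dt
        out.append(S)
    return out

# Hypothetical sensitivity: 3.4 mm/yr per degree above equilibrium.
rise = semi_empirical_slr([1.0, 1.0, 1.2], alpha=3.4, T_eq=0.0)
```

Small transparent components like this are cheap enough to embed in statistical calibration loops, which is part of why simple models suit uncertainty quantification and risk assessment.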
Coupling Climate Models and Forward-Looking Economic Models
NASA Astrophysics Data System (ADS)
Judd, K.; Brock, W. A.
2010-12-01
Authors: Dr. Kenneth L. Judd, Hoover Institution, and Prof. William A. Brock, University of Wisconsin. Current climate models range from General Circulation Models (GCMs) with millions of degrees of freedom to models with few degrees of freedom. Simple Energy Balance Climate Models (EBCMs) help us understand the dynamics of GCMs. The same is true in economics with Computable General Equilibrium Models (CGEs), where some models are infinite-dimensional differential equations but others are simple models. Nordhaus (2007, 2010) couples a simple EBCM with a simple economic model. One- and two-dimensional EBCMs do better at approximating damages across the globe and the positive and negative feedbacks from anthropogenic forcing (North et al. (1981), Wu and North (2007)). A proper coupling of climate and economic systems is crucial for arriving at effective policies. Brock and Xepapadeas (2010) have used Fourier/Legendre-based expansions to study the shape of socially optimal carbon taxes over time at the planetary level in the face of damages caused by polar ice cap melt (as discussed by Oppenheimer, 2005), but only in a "one-dimensional" EBCM. Economists have used orthogonal polynomial expansions to solve dynamic, forward-looking economic models (Judd, 1992, 1998). This presentation will couple EBCM climate models with basic forward-looking economic models, and examine the effectiveness and scaling properties of alternative solution methods. We will use a two-dimensional EBCM on the sphere (Wu and North, 2007) and a multicountry, multisector regional model of the economic system. Our aim will be to gain insights into the intertemporal shape of the optimal carbon tax schedule and its impact on global food production, as modeled by Golub and Hertel (2009). We will initially have limited computing resources and will need to focus on highly aggregated models.
However, this will be more complex than existing models with forward-looking economic modules, and the initial models will help guide the construction of more refined models that can effectively use more powerful computational environments to analyze economic policies related to climate change. REFERENCES Brock, W., Xepapadeas, A., 2010, "An Integration of Simple Dynamic Energy Balance Climate Models and Ramsey Growth Models," Department of Economics, University of Wisconsin, Madison, and University of Athens. Golub, A., Hertel, T., et al., 2009, "The opportunity cost of land use and the global potential for greenhouse gas mitigation in agriculture and forestry," Resource and Energy Economics, 31, 299-319. Judd, K., 1992, "Projection methods for solving aggregate growth models," Journal of Economic Theory, 58: 410-52. Judd, K., 1998, Numerical Methods in Economics, MIT Press, Cambridge, Mass. Nordhaus, W., 2007, A Question of Balance: Economic Models of Climate Change, Yale University Press, New Haven, CT. North, G. R., Cahalan, R., Coakley, J., 1981, "Energy balance climate models," Reviews of Geophysics and Space Physics, Vol. 19, No. 1, 91-121, February. Wu, W., North, G. R., 2007, "Thermal decay modes of a 2-D energy balance climate model," Tellus, 59A, 618-626.
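The zero-dimensional ancestor of the EBCMs discussed above balances absorbed sunlight against outgoing longwave radiation. A minimal sketch (the standard textbook formulation; the effective emissivity value is illustrative):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def ebm_equilibrium(S=1361.0, albedo=0.3, emissivity=0.61):
    """Solve S*(1 - albedo)/4 = emissivity * SIGMA * T**4 for the
    equilibrium temperature T of a zero-dimensional energy balance model.

    S: solar constant (W/m^2); emissivity: effective longwave emissivity
    standing in for the greenhouse effect (illustrative value).
    """
    return (S * (1.0 - albedo) / (4.0 * emissivity * SIGMA)) ** 0.25
```

With these illustrative values the equilibrium falls near the observed global mean surface temperature of about 288 K; one- and two-dimensional EBCMs add latitude (and longitude) structure plus heat transport to this same balance.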
NASA Astrophysics Data System (ADS)
Yarker, M. B.; Stanier, C. O.; Forbes, C.; Park, S.
2011-12-01
As atmospheric scientists, we depend on Numerical Weather Prediction (NWP) models. We use them to predict weather patterns, to understand external forcing on the atmosphere, and as evidence to make claims about atmospheric phenomena. Therefore, it is important that we adequately prepare atmospheric science students to use computer models. However, the public should also be aware of what models are in order to understand scientific claims about atmospheric issues, such as climate change. Although familiar with weather forecasts on television and the Internet, the general public does not understand the process of using computer models to generate weather and climate forecasts. As a result, the public often misunderstands claims scientists make about their daily weather as well as the state of climate change. Since computer models are the best method we have to forecast the future of our climate, scientific models and modeling should be a topic covered in K-12 classrooms as part of a comprehensive science curriculum. According to the National Science Education Standards, teachers are encouraged to incorporate science models into the classroom as a way to aid in the understanding of the nature of science. However, there is very little description of what constitutes a science model, so the term is often associated with scale models. Therefore, teachers often use drawings or scale representations of physical entities, such as DNA, the solar system, or bacteria. In other words, models used in classrooms often serve as visual representations, but the purpose of science models is often overlooked. The implementation of a model-based curriculum in the science classroom can be an effective way to prepare students to think critically, problem solve, and make informed decisions as contributing members of society. However, there are few resources available to help teachers implement science models into the science curriculum effectively.
Therefore, this research project looks at strategies middle school science teachers use to implement science models in their classrooms. The teachers in this study took part in a week-long professional development program designed to orient them towards the appropriate use of science models in a unit on weather, climate, and energy concepts. The goal of this project is to describe the professional development program and how the teachers intend to incorporate science models into their individual classrooms.
ParCAT: A Parallel Climate Analysis Toolkit
NASA Astrophysics Data System (ADS)
Haugen, B.; Smith, B.; Steed, C.; Ricciuto, D. M.; Thornton, P. E.; Shipman, G.
2012-12-01
Climate science has employed increasingly complex models and simulations to analyze the past and predict the future of our climate. The size and dimensionality of climate simulation data have been growing with the complexity of the models. This growth in data is creating a widening gap between the data being produced and the tools necessary to analyze large, high-dimensional data sets. With single-run data sets increasing into the tens, hundreds, and even thousands of gigabytes, parallel computing tools are becoming a necessity in order to analyze and compare climate simulation data. The Parallel Climate Analysis Toolkit (ParCAT) provides basic tools that efficiently use parallel computing techniques to narrow the gap between data set size and analysis tools. ParCAT was created as a collaborative effort between climate scientists and computer scientists in order to provide efficient parallel implementations of the computing tools that are of use to climate scientists. Some of the basic functionalities included in the toolkit are the ability to compute spatio-temporal means and variances, differences between two runs, and histograms of the values in a data set. ParCAT is designed to facilitate the "heavy lifting" that is required for large, multidimensional data sets. The toolkit does not focus on performing the final visualizations and presentation of results but rather on reducing large data sets to smaller, more manageable summaries. The output from ParCAT is provided in commonly used file formats (NetCDF, CSV, ASCII) to allow for simple integration with other tools. The toolkit is currently implemented as a command-line utility, but will likely also provide a C library for developers interested in tighter software integration. Elements of the toolkit are already being incorporated into projects such as UV-CDAT and CMDX. There is also an effort underway to implement portions of the CCSM Land Model Diagnostics package using ParCAT in conjunction with Python and gnuplot.
ParCAT is implemented in C to provide efficient file IO. The file IO operations in the toolkit use the parallel-netcdf library; this enables the code to use the parallel IO capabilities of modern HPC systems. Analysis that currently requires an estimated 12+ hours with the traditional CCSM Land Model Diagnostics Package can now be performed in as little as 30 minutes on a single desktop workstation and a few minutes for relatively small jobs completed on modern HPC systems such as ORNL's Jaguar.
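The parallel means and variances ParCAT computes require combining partial results from data chunks held by different processes. A sketch of the standard pairwise-merge reduction (illustrative; not ParCAT's actual C code, which also handles parallel NetCDF I/O):

```python
def chunk_stats(values):
    """Welford accumulators (count, mean, M2) for one chunk of data."""
    n, mean, M2 = 0, 0.0, 0.0
    for x in values:
        n += 1
        d = x - mean
        mean += d / n
        M2 += d * (x - mean)
    return n, mean, M2

def merge_stats(a, b):
    """Combine two chunks' accumulators: the associative reduction step
    that lets each process summarize its chunk independently."""
    na, ma, M2a = a
    nb, mb, M2b = b
    n = na + nb
    delta = mb - ma
    mean = ma + delta * nb / n
    M2 = M2a + M2b + delta * delta * na * nb / n
    return n, mean, M2   # population variance = M2 / n
```

Because the merge is associative, the combines map directly onto an MPI-style tree reduction across processes.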
Web Based Data Access to the World Data Center for Climate
NASA Astrophysics Data System (ADS)
Toussaint, F.; Lautenschlager, M.
2006-12-01
The World Data Center for Climate (WDC-Climate, www.wdc-climate.de) is hosted by the Model & Data Group (M&D) of the Max Planck Institute for Meteorology. The M&D department is financed by the German government and uses the computers and mass storage facilities of the German Climate Computing Centre (Deutsches Klimarechenzentrum, DKRZ). The WDC-Climate provides web access to 200 Terabytes of climate data; the total mass storage archive contains nearly 4 Petabytes. Although the majority of the datasets concern model output data, some satellite and observational data are accessible as well. The underlying relational database is distributed on five servers. The CERA relational data model is used to integrate catalogue data and mass data. The flexibility of the model makes it possible to store and access very different types of data and metadata. The CERA metadata catalogue provides easy access to the content of the CERA database as well as to other data on the web. Visit ceramodel.wdc-climate.de for additional information on the CERA data model. The majority of users access data via the CERA metadata catalogue, which is open without registration; however, prior to retrieving data, users are required to apply for a userid and password. The CERA metadata catalogue is servlet-based, so it is accessible worldwide through any web browser at cera.wdc-climate.de. In addition to data and metadata access via the web catalogue, WDC-Climate offers a number of other forms of web-based data access. All metadata are available via http request as xml files in various metadata formats (ISO, DC, etc.; see wini.wdc-climate.de), which allows for easy data interchange with other catalogues. Model data can be retrieved in GRIB, ASCII, NetCDF, and binary (IEEE) format. WDC-Climate serves as a data centre for various projects. Since the xml files are accessible by http, the integration of data into applications of different projects is very easy. Projects supported by WDC-Climate include, e.g.,
CEOP, IPCC, and CARIBIC. A script tool for data download (jblob) is offered on the web page to make retrieval of large data quantities more convenient.
Local air temperature tolerance: a sensible basis for estimating climate variability
NASA Astrophysics Data System (ADS)
Kärner, Olavi; Post, Piia
2016-11-01
The customary representation of climate using sample moments is generally biased due to the noticeably nonstationary behaviour of many climate series. In this study, we introduce a moment-free climate representation based on a statistical model fitted to a long-term daily air temperature anomaly series. This model allows us to separate the climate-scale and weather-scale variability in the series. As a result, the climate scale can be characterized using the mean annual cycle of the series and the local air temperature tolerance, where the latter is computed using the fitted model. The representation of weather-scale variability is specified using the frequency and range of outliers based on the tolerance. The scheme is illustrated using five long-term air temperature records observed at different European meteorological stations.
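The separation of scales can be illustrated in miniature: average the series by day of year to get the climate-scale annual cycle, then flag anomalies that exceed the tolerance. In the paper the tolerance comes from the fitted statistical model; here it is simply a given constant, and the function names are hypothetical:

```python
from collections import defaultdict

def mean_annual_cycle(doys, temps):
    """Mean temperature for each day of year: the climate-scale signal."""
    sums, counts = defaultdict(float), defaultdict(int)
    for d, t in zip(doys, temps):
        sums[d] += t
        counts[d] += 1
    return {d: sums[d] / counts[d] for d in sums}

def weather_outliers(doys, temps, tolerance):
    """Indices whose anomaly from the annual cycle exceeds the tolerance,
    i.e. the weather-scale excursions."""
    cyc = mean_annual_cycle(doys, temps)
    return [i for i, (d, t) in enumerate(zip(doys, temps))
            if abs(t - cyc[d]) > tolerance]
```

The frequency and range of the flagged excursions then characterize the weather-scale variability.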
Fossils tell of mild winters in an ancient hothouse
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerr, R.A.
Fossil evidence from the Eocene points to a warmer winter climate in the continental interior (e.g. North Dakota) than that predicted by computer models. Paleobotanists have been able to quantify approximate winter mean temperatures by using leaf characteristics. As one example, leaves from colder climates have toothed edges. Leaf structure was correlated with modern climate regimes, and these relations were then applied to Eocene fossils. They found cold-month mean temperatures of 1-8°C in Wyoming and Montana, well above model predictions. Climate models can be manipulated to reproduce these temperatures, but not without overheating the entire globe. The problem could be that the Eocene atmospheric circulation was different from today's, something not accounted for well by climate models.
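The toothed-edge correlation mentioned here is usually applied as leaf-margin analysis, a linear regression of mean annual temperature on the proportion of species with entire (untoothed) margins. A sketch using one widely cited calibration (coefficients differ between published regressions, so treat the numbers as illustrative):

```python
def leaf_margin_mat(entire_margined, total_species):
    """Estimate mean annual temperature (deg C) from leaf-margin analysis.

    Uses a commonly cited linear calibration, MAT ~= 30.6 * P + 1.14,
    where P is the proportion of woody species with entire leaf margins;
    published regressions differ in their exact coefficients.
    """
    P = entire_margined / total_species
    return 30.6 * P + 1.14
```

A flora with no entire-margined species thus implies a cold climate near 1 °C, matching the qualitative rule that toothed leaves indicate colder conditions.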
The Construction of 3-d Neutral Density for Arbitrary Data Sets
NASA Astrophysics Data System (ADS)
Riha, S.; McDougall, T. J.; Barker, P. M.
2014-12-01
The Neutral Density variable allows inference of water pathways from thermodynamic properties in the global ocean, and is therefore an essential component of global ocean circulation analysis. The widely used algorithm for the computation of Neutral Density yields accurate results for data sets which are close to the observed climatological ocean. Long-term numerical climate simulations, however, often generate a significant drift from present-day climate, which renders the existing algorithm inaccurate. To remedy this problem, new algorithms which operate on arbitrary data have been developed, which may potentially be used to compute Neutral Density during runtime of a numerical model. We review existing approaches for the construction of Neutral Density in arbitrary data sets, detail their algorithmic structure, and present an analysis of the computational cost for implementations on a single-CPU computer. We discuss possible strategies for the implementation in state-of-the-art numerical models, with a focus on distributed computing environments.
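At the heart of any neutral-surface construction is locating, on a neighbouring cast, the depth whose density matches a target value. A toy sketch using linear interpolation on a precomputed density profile (the real algorithms instead work with the full nonlinear equation of state, which is what makes drifted model climates difficult):

```python
def neutral_depth(rho_target, depths, rhos):
    """Depth at which a cast's density profile crosses rho_target,
    found by linear interpolation between the bracketing samples.

    depths: increasing depths (m); rhos: densities (kg/m^3) at those
    depths. Returns None if the profile never crosses the target.
    """
    for (z0, r0), (z1, r1) in zip(zip(depths, rhos),
                                  zip(depths[1:], rhos[1:])):
        if (r0 - rho_target) * (r1 - rho_target) <= 0 and r0 != r1:
            return z0 + (rho_target - r0) * (z1 - z0) / (r1 - r0)
    return None
```

Repeating this search cast-by-cast across a basin traces out an approximately neutral surface.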
Superensemble of a Regional Climate Model for the Western US using Climateprediction.net
NASA Astrophysics Data System (ADS)
Mote, P.; Salahuddin, A.; Allen, M.; Jones, R.
2010-12-01
For over a decade, a citizen science experiment called climateprediction.net, organized by Oxford University, has used computer time contributed by over 80,000 volunteers around the world to create superensembles of global climate simulations. A new climateprediction.net experiment built by the UK Meteorological Office and Oxford, and released in late summer 2010, brings these computing resources to bear on regional climate modeling for the western US, western Europe, and southern Africa. For the western US, the spatial resolution of 25 km permits important topographic features, mountain ranges and valleys, to be resolved and to influence simulated climate. The simulation consequently reproduces many important observed features of climate, such as the fact that California's Central Valley is hottest at the north and south ends in summer and cooler in the middle, owing to the maritime influence that leaks through the gap in the coast range in the San Francisco area. We designed the output variables to satisfy both research needs and societal and environmental impacts needs. These include atmospheric circulation on regional and global scales, surface fluxes of energy, and hydrologic variables; extremes of temperature, precipitation, and wind; and derived quantities like frost days and number of consecutive dry days. Early results from pre-release beta testing suggest that the simulated fields compare favorably with available observations, and that the model performs as well in the distributed computing environment as on a dedicated high-performance machine. The advantages of a superensemble in interpreting regional climate change will permit an unprecedented combination of statistical completeness and spatial resolution.
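The derived quantities named at the end, frost days and consecutive dry days, are simple reductions over daily model output. A sketch (the thresholds follow common climate-index conventions, 0 °C for frost and 1 mm/day for a wet day, but are assumptions here):

```python
def frost_days(tmin, threshold=0.0):
    """Count days whose minimum temperature (deg C) falls below threshold."""
    return sum(1 for t in tmin if t < threshold)

def max_consecutive_dry_days(precip, wet_threshold=1.0):
    """Longest run of days with precipitation (mm/day) below wet_threshold."""
    best = run = 0
    for p in precip:
        run = run + 1 if p < wet_threshold else 0
        best = max(best, run)
    return best
```

Computing such indices per ensemble member, rather than from ensemble means, preserves the extremes that impact studies care about.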
Software Testing and Verification in Climate Model Development
NASA Technical Reports Server (NTRS)
Clune, Thomas L.; Rood, RIchard B.
2011-01-01
Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to complex multi-disciplinary systems. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine the benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
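As an illustration of the fine-grained testing the authors advocate (a hypothetical example, not taken from any climate model), a unit test for a small numerical kernel can assert both exactness on special cases and the expected order of convergence, properties that system-level regression tests cannot isolate:

```python
import math
import unittest

def trapezoid(f, a, b, n):
    """Composite trapezoid rule: the kind of small kernel worth unit-testing."""
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

class TestTrapezoid(unittest.TestCase):
    def test_exact_for_linear_integrand(self):
        # The rule is exact for linear functions; no tolerance tuning needed.
        self.assertAlmostEqual(trapezoid(lambda x: 2.0 * x, 0.0, 1.0, 4), 1.0)

    def test_second_order_convergence(self):
        # Doubling n should shrink the error roughly fourfold.
        e1 = abs(trapezoid(math.sin, 0.0, math.pi, 16) - 2.0)
        e2 = abs(trapezoid(math.sin, 0.0, math.pi, 32) - 2.0)
        self.assertLess(e2, e1 / 3.5)
```

Run with `python -m unittest`; each test takes milliseconds, versus hours for a full-simulation regression check, and a failure points directly at the defective kernel.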
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prein, Andreas; Langhans, Wolfgang; Fosser, Giorgia
Regional climate modeling using convection-permitting models (CPMs) emerges as a promising framework to provide more reliable climate information on regional to local scales compared to traditionally used large-scale models (LSMs). CPMs do not use convection parameterization schemes, known as a major source of errors and uncertainties, and have more accurate surface and orography fields. The drawback of CPMs is their high demand on computational resources. For this reason, CPM climate simulations only appeared a decade ago. In this study we aim to provide a common basis for CPM climate simulations by giving a holistic review of the topic. The most important components in CPMs, such as physical parameterizations and dynamical formulations, are discussed, and an outlook on required future developments and computer architectures that would support the application of CPMs is given. Most importantly, this review presents the consolidated outcome of studies that addressed the added value of CPM climate simulations compared to LSMs. Most improvements are found for processes related to deep convection (e.g., precipitation during summer), for mountainous regions, and for soil-vegetation-atmosphere interactions. The climate change signals of CPM simulations reveal increases in short and extreme rainfall events and an increased ratio of liquid precipitation at the surface (a decrease of hail), potentially leading to more frequent flash floods. In conclusion, CPMs are a very promising tool for future climate research. However, coordinated modeling programs are crucially needed to assess their full potential and support their development.
Signal to noise quantification of regional climate projections
NASA Astrophysics Data System (ADS)
Li, S.; Rupp, D. E.; Mote, P.
2016-12-01
One of the biggest challenges in interpreting climate model outputs for impacts studies and adaptation planning is understanding the sources of disagreement among models (which is often used imperfectly as a stand-in for system uncertainty). Internal variability is a primary source of uncertainty in climate projections, especially for precipitation, for which models disagree about even the sign of changes in large areas like the continental US. Taking advantage of a large initial-condition ensemble of regional climate simulations, this study quantifies the magnitude of changes forced by increasing greenhouse gas concentrations relative to internal variability. Results come from a large initial-condition ensemble of regional climate model simulations generated by weather@home, a citizen science computing platform, where the western United States climate was simulated for the recent past (1985-2014) and future (2030-2059) using a 25-km horizontal resolution regional climate model (HadRM3P) nested in global atmospheric model (HadAM3P). We quantify grid point level signal-to-noise not just in temperature and precipitation responses, but also the energy and moisture flux terms that are related to temperature and precipitation responses, to provide important insights regarding uncertainty in climate change projections at local and regional scales. These results will aid modelers in determining appropriate ensemble sizes for different climate variables and help users of climate model output with interpreting climate model projections.
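At a single grid point the quantity in question reduces to the ensemble-mean change divided by the ensemble spread of that change. A minimal sketch of the definition (an illustration, not the study's code):

```python
import statistics

def signal_to_noise(past, future):
    """Forced signal relative to internal variability at one grid point.

    past / future: one climatological mean per ensemble member, in the
    same member order. The noise is estimated from the ensemble spread
    of the change itself, which reflects internal variability.
    """
    changes = [f - p for p, f in zip(past, future)]
    return statistics.mean(changes) / statistics.stdev(changes)
```

An absolute ratio well above 1 means the forced response stands out from internal variability; for precipitation, values near or below 1 over large areas are exactly the kind of disagreement described above.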
Future climate scenarios and rainfall--runoff modelling in the Upper Gallego catchment (Spain).
Bürger, C M; Kolditz, O; Fowler, H J; Blenkinsop, S
2007-08-01
Global climate change may have large impacts on water supplies, drought or flood frequencies, and flood magnitudes in local and regional hydrologic systems. Water authorities therefore rely on computer models for quantitative impact prediction. In this study we present kernel-based learning machine river flow models for the Upper Gallego catchment of the Ebro basin. Different learning machines were calibrated using daily gauge data. The models posed two major challenges: (1) estimation of the rainfall-runoff transfer function from the available time series is complicated by anthropogenic regulation and mountainous terrain, and (2) the river flow model is weak when only climate data are used, while adding antecedent flow data appeared to delay peak flow estimates. These types of models, together with the presented downscaled climate scenarios, can be used for climate change impact assessment in the Gallego, which is important for the future management of the system.
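As a flavor of kernel-based flow estimation (the paper's learning machines are trained models; this Nadaraya-Watson kernel regression on lagged rainfall features, with made-up data, only illustrates the general idea):

```python
import math

def rbf(u, v, gamma):
    """Gaussian (RBF) kernel between two feature vectors."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(u, v)))

def kernel_regress(X_train, y_train, x, gamma=1.0):
    """Predict a target as the kernel-weighted average of training targets."""
    w = [rbf(xi, x, gamma) for xi in X_train]
    return sum(wi * yi for wi, yi in zip(w, y_train)) / sum(w)

# Hypothetical features (rain_t, rain_{t-1}) in mm and flows in m^3/s.
X = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0)]
y = [1.0, 4.0, 9.0]
```

The antecedent-flow issue in challenge (2) corresponds to adding lagged flow itself as a feature: the model can then lean on persistence, which delays predicted peaks.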
Convergence in France facing Big Data era and Exascale challenges for Climate Sciences
NASA Astrophysics Data System (ADS)
Denvil, Sébastien; Dufresne, Jean-Louis; Salas, David; Meurdesoif, Yann; Valcke, Sophie; Caubel, Arnaud; Foujols, Marie-Alice; Servonnat, Jérôme; Sénési, Stéphane; Derouillat, Julien; Voury, Pascal
2014-05-01
The presentation will introduce a French national project, CONVERGENCE, that has been funded for four years. This project will tackle the big data and computational challenges faced by the climate modeling community in the HPC context. Model simulations are central to the study of complex mechanisms and feedbacks in the climate system and to providing estimates of future and past climate changes. Recent trends in climate modelling are to add more physical components to the modelled system, to increase the resolution of each individual component, and to make more systematic use of large suites of simulations to address many scientific questions. Climate simulations may therefore differ in their initial state, parameter values, representation of physical processes, spatial resolution, model complexity, and degree of realism or idealisation. In addition, there is a strong need for evaluating, improving and monitoring the performance of climate models using a large ensemble of diagnostics, and for better integration of model outputs and observational data. High performance computing is currently reaching the exascale and has the potential to produce this exponential increase in the size and number of simulations. However, post-processing, analysis, and exploration of the generated data have stalled, and there is a strong need for new tools to cope with the growing size and complexity of the underlying simulations and datasets. Exascale simulations require new scalable software tools to generate, manage and mine those simulations and data, to extract the relevant information, and to take the correct decisions. The primary purpose of this project is to develop a platform capable of running large ensembles of simulations with a suite of models, handling the complex and voluminous datasets generated, and facilitating the evaluation and validation of the models and the use of higher-resolution models.
We propose to gather interdisciplinary skills to design, using a component-based approach, a specific programming environment for scalable scientific simulations and analytics, integrating new and efficient ways of deploying and analysing the applications on High Performance Computing (HPC) systems. CONVERGENCE, gathering HPC and informatics expertise that cuts across the individual partners and the broader HPC community, will allow the national climate community to leverage information technology (IT) innovations to address its specific needs. Our methodology consists of developing an ensemble of generic elements needed to run the French climate models with different grids and different resolutions, ensuring efficient and reliable execution of these models, managing a large volume and number of data, and allowing analysis of the results and precise evaluation of the models. These elements include data structure definition and input-output (IO), code coupling and interpolation, as well as runtime and pre/post-processing environments. A common data and metadata structure will allow transferring consistent information between the various elements. All these generic elements will be open source and publicly available. The IPSL-CM and CNRM-CM climate models will make use of these elements, which will constitute a national platform for climate modelling. This platform will be used, in its entirety, to optimise and tune the next version of the IPSL-CM model and to develop a global coupled climate model with a regional grid refinement. It will also be used, at least partially, to run ensembles of the CNRM-CM model at relatively high resolution and to run a very-high-resolution prototype of this model. The climate models we developed are already involved in many international projects.
For instance, we participate in the CMIP (Coupled Model Intercomparison Project), a project that is very demanding but highly visible: its results are widely used and are in particular synthesised in the IPCC (Intergovernmental Panel on Climate Change) assessment reports. The CONVERGENCE project will constitute an invaluable step for the French climate community in preparing for and better contributing to the next phase of the CMIP project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jain, Atul K.
The overall objective of this DOE-funded project is to combine scientific and computational challenges in climate modeling by expanding our understanding of the biogeophysical-biogeochemical processes and their interactions in the northern high latitudes (NHLs) using an earth system modeling (ESM) approach, and by adopting an adaptive parallel runtime system in an ESM to achieve efficient and scalable climate simulations through improved load-balancing algorithms.
Evaluation of a Mesoscale Convective System in Variable-Resolution CESM
NASA Astrophysics Data System (ADS)
Payne, A. E.; Jablonowski, C.
2017-12-01
Warm-season precipitation over the Southern Great Plains (SGP) follows a well-observed diurnal pattern of variability, peaking at night-time, due to the eastward propagation of mesoscale convective systems that develop over the eastern slopes of the Rockies in the late afternoon. While most climate models are unable to adequately capture the organization of convection and the characteristic pattern of precipitation over this region, models with high enough resolution to explicitly resolve convection show improvement. However, high-resolution simulations are computationally expensive and, in the case of regional climate models, are subject to boundary conditions. Newly developed variable-resolution global climate models strike a balance between the high resolution of regional climate models and the large-scale dynamics of global climate models, at comparatively low computational cost. Recently developed parameterizations that are insensitive to the model grid scale provide a way to improve model performance. Here, we present an evaluation of the newly available Cloud Layers Unified by Binormals (CLUBB) parameterization scheme in a suite of variable-resolution CESM simulations with resolutions ranging from 110 km to 7 km within a regionally refined region centered over the SGP Atmospheric Radiation Measurement (ARM) site. Simulations utilize the hindcast approach developed by the Department of Energy's Cloud-Associated Parameterizations Testbed (CAPT) for the assessment of climate models. We limit our evaluation to a single mesoscale convective system that passed over the region on May 24, 2008. The effects of grid resolution on the timing and intensity of precipitation, as well as on the transition from shallow to deep convection, are assessed against ground-based observations from the SGP ARM site, satellite observations and ERA-Interim reanalysis.
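A diagnostic of the kind used in such an evaluation is the composite diurnal cycle of precipitation. The sketch below (synthetic data, not the study's model output) composites hourly precipitation over many days and locates the hour of peak rainfall:

```python
import numpy as np

rng = np.random.default_rng(0)
n_days = 30
hours = np.arange(24)

# Synthetic hourly precipitation with a nocturnal (~03 UTC) peak plus noise,
# mimicking the observed SGP diurnal cycle; units are arbitrary.
base = 1.0 + 0.8 * np.cos(2 * np.pi * (hours - 3) / 24)
precip = np.clip(base + 0.2 * rng.standard_normal((n_days, 24)), 0, None)

# Composite over days to isolate the mean diurnal cycle.
diurnal_mean = precip.mean(axis=0)
peak_hour = int(np.argmax(diurnal_mean))
```

Applied to model output at each resolution and to ARM observations, comparing `peak_hour` and the cycle's amplitude quantifies how well the timing of nocturnal convection is captured.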
NASA Astrophysics Data System (ADS)
Doroszkiewicz, Joanna; Romanowicz, Renata
2016-04-01
Uncertainty in the results of a hydraulic model is not only associated with the limitations of that model and the shortcomings of data. An important factor with a major impact on the uncertainty of flood risk assessment under changing climate conditions is the uncertainty of future climate scenarios (IPCC WG I, 2013). Future climate projections provided by global climate models are used to generate the future runoff required as an input to hydraulic models applied in the derivation of flood risk maps. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to a hydraulic model are obtained using the HBV model and climate projections obtained from the EURO-CORDEX project. The study describes a cascade of uncertainty related to different stages of the process of derivation of flood risk maps under changing climate conditions. In this context it takes into account the uncertainty of future climate projections, the uncertainty of the flow-routing model, the propagation of that uncertainty through the hydraulic model, and finally, the uncertainty related to the derivation of flood risk maps. One of the aims of this study is an assessment of the relative impact of different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the process, an assessment of the total uncertainty of maps of inundation probability might be very time-consuming computationally. As a way forward we present an application of a hydraulic model simulator based on a nonlinear transfer function model for chosen locations along the river reach. The transfer function model parameters are estimated based on the simulations of the hydraulic model at each model cross-section. The study shows that the application of the simulator substantially reduces the computational requirements related to the derivation of flood risk maps under future climatic conditions. 
Acknowledgements: This work was supported by the project CHIHE (Climate Change Impact on Hydrological Extremes), carried out in the Institute of Geophysics Polish Academy of Sciences, funded by Norway Grants (contract No. Pol-Nor/196243/80/2013). The hydro-meteorological observations were provided by the Institute of Meteorology and Water Management (IMGW), Poland.
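The emulator idea can be illustrated with a minimal sketch (synthetic stand-in data, not the study's hydraulic model): replace the expensive model with a first-order transfer function relating upstream discharge to water level at a cross-section, with parameters fitted to the model's own output:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
# Synthetic upstream discharge series (always positive).
inflow = np.abs(np.cumsum(rng.standard_normal(n))) + 1.0

# Stand-in for hydraulic-model water levels at one cross-section:
# a lagged, damped response to the inflow.
true_a, true_b = 0.85, 0.12
level = np.zeros(n)
for t in range(1, n):
    level[t] = true_a * level[t - 1] + true_b * inflow[t - 1]

# Fit the transfer function h[t] = a*h[t-1] + b*q[t-1] by least squares,
# using only input/output pairs from the "expensive" model.
X = np.column_stack([level[:-1], inflow[:-1]])
a_hat, b_hat = np.linalg.lstsq(X, level[1:], rcond=None)[0]
```

Once fitted at each cross-section, the emulator can be run over many climate-scenario discharge series at negligible cost, which is what makes the uncertainty cascade tractable.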
NASA Astrophysics Data System (ADS)
Parishani, H.; Pritchard, M. S.; Bretherton, C. S.; Wyant, M. C.; Khairoutdinov, M.; Singh, B.
2017-12-01
Biases and parameterization formulation uncertainties in the representation of boundary layer clouds remain a leading source of possible systematic error in climate projections. Here we show the first results of cloud feedback to +4K SST warming in a new experimental climate model, the "Ultra-Parameterized (UP)" Community Atmosphere Model, UPCAM. We have developed UPCAM as an unusually high-resolution implementation of cloud superparameterization (SP) in which a global set of cloud resolving arrays is embedded in a host global climate model. In UP, the cloud-resolving scale includes sufficient internal resolution to explicitly generate the turbulent eddies that form marine stratocumulus and trade cumulus clouds. This is computationally costly but complements other available approaches for studying low clouds and their climate interaction, by avoiding parameterization of the relevant scales. In a recent publication we have shown that UP, while not without its own complexity trade-offs, can produce encouraging improvements in low cloud climatology in multi-month simulations of the present climate and is a promising target for exascale computing (Parishani et al. 2017). Here we show results of its low cloud feedback to warming in multi-year simulations for the first time. References: Parishani, H., M. S. Pritchard, C. S. Bretherton, M. C. Wyant, and M. Khairoutdinov (2017), Toward low-cloud-permitting cloud superparameterization with explicit boundary layer turbulence, J. Adv. Model. Earth Syst., 9, doi:10.1002/2017MS000968.
Earth System Grid II, Turning Climate Datasets into Community Resources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Middleton, Don
2006-08-01
The Earth System Grid (ESG) II project, funded by the Department of Energy's Scientific Discovery through Advanced Computing program, has transformed climate data into community resources. ESG II has accomplished this goal by creating a virtual collaborative environment that links climate centers and users around the world to models and data via a computing Grid, which is based on the Department of Energy's supercomputing resources and the Internet. Our project's success stems from partnerships between climate researchers and computer scientists to advance basic and applied research in the terrestrial, atmospheric, and oceanic sciences. By interfacing with other climate science projects, we have learned that commonly used methods to manage and remotely distribute data among related groups lack infrastructure and under-utilize existing technologies. Knowledge and expertise gained from ESG II have helped the climate community plan strategies to manage a rapidly growing data environment more effectively. Moreover, approaches and technologies developed under the ESG project have impacted data-simulation integration in other disciplines, such as astrophysics, molecular biology and materials science.
Using a Global Climate Model in an On-line Climate Change Course
NASA Astrophysics Data System (ADS)
Randle, D. E.; Chandler, M. A.; Sohl, L. E.
2012-12-01
Seminars on Science: Climate Change is an on-line, graduate-level teacher professional development course offered by the American Museum of Natural History. It is an intensive 6-week course covering a broad range of global climate topics, from the fundamentals of the climate system, to the causes of climate change, the role of paleoclimate investigations, and a discussion of potential consequences and risks. The instructional method blends essays, videos, textbooks, and linked websites, with required participation in electronic discussion forums that are moderated by an experienced educator and a course scientist. Most weeks include additional assignments. Three of these assignments employ computer models, including two weeks spent working with a full-fledged 3D global climate model (GCM). The global climate modeling environment is supplied through a partnership with Columbia University's Educational Global Climate Modeling Project (EdGCM). The objective is to have participants gain hands-on experience with one of the most important, yet misunderstood, aspects of climate change research. Participants in the course are supplied with a USB drive that includes installers for the software and sample data. The EdGCM software includes a version of NASA's global climate model fitted with a graphical user interface and pre-loaded with several climate change simulations. Step-by-step assignments and video tutorials help walk people through these challenging exercises and the course incorporates a special assignment discussion forum to help with technical problems and questions about the NASA GCM. There are several takeaways from our first year and a half of offering this course, which has become one of the most popular out of the twelve courses offered by the Museum. Participants report a high level of satisfaction in using EdGCM. Some report frustration at the initial steps, but overwhelmingly claim that the assignments are worth the effort. 
Many of the difficulties that arise are due to a lack of computer literacy amongst participants and we have found, through iterative improvements in the materials, that breaking assignments into discrete, well-supported tasks has been key to the success.
Objective calibration of regional climate models
NASA Astrophysics Data System (ADS)
Bellprat, O.; Kotlarski, S.; Lüthi, D.; SchäR, C.
2012-12-01
Climate models are subject to high parametric uncertainty induced by poorly confined model parameters of parameterized physical processes. Uncertain model parameters are typically calibrated in order to increase the agreement of the model with available observations. The common practice is to adjust uncertain model parameters manually, often referred to as expert tuning, which lacks objectivity and transparency in the use of observations. These shortcomings often hamper model inter-comparisons and hinder the implementation of new model parameterizations. Methods that would allow systematic calibration of model parameters are unfortunately often not applicable to state-of-the-art climate models, due to computational constraints arising from the high dimensionality and non-linearity of the problem. Here we present an approach to objectively calibrate a regional climate model, using reanalysis-driven simulations and building upon a quadratic metamodel presented by Neelin et al. (2010) that serves as a computationally cheap surrogate of the model. Five model parameters originating from different parameterizations are selected for the optimization according to their influence on the model performance. The metamodel accurately estimates spatial averages of 2 m temperature, precipitation and total cloud cover, with an uncertainty of similar magnitude to the internal variability of the regional climate model. The non-linearities of the parameter perturbations are well captured, such that only a limited number of 20-50 simulations are needed to estimate optimal parameter settings. Parameter interactions are small, which allows a further reduction in the number of simulations. In comparison to an ensemble of the same model which has undergone expert tuning, the calibration yields similar optimal model configurations, while leading to an additional reduction of the model error. 
The performance range captured is much wider than sampled with the expert-tuned ensemble and the presented methodology is effective and objective. It is argued that objective calibration is an attractive tool and could become standard procedure after introducing new model implementations, or after a spatial transfer of a regional climate model. Objective calibration of parameterizations with regional models could also serve as a strategy toward improving parameterization packages of global climate models.
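The metamodel approach can be sketched in a few lines (the two-parameter "model error" function below is an invented stand-in for an RCM performance score, not the study's setup): fit a quadratic polynomial in the uncertain parameters to a small set of training runs, then minimize the cheap surrogate instead of the full model:

```python
import numpy as np

def model_error(p):
    # Stand-in for an expensive model-performance evaluation;
    # true optimum placed at p = (0.3, -0.5) for illustration.
    return 1.0 + 4.0 * (p[0] - 0.3) ** 2 + 2.0 * (p[1] + 0.5) ** 2

rng = np.random.default_rng(2)
samples = rng.uniform(-1, 1, size=(25, 2))          # 25 "training" model runs
scores = np.array([model_error(p) for p in samples])

# Quadratic design matrix: 1, p1, p2, p1^2, p2^2, p1*p2
P = samples
X = np.column_stack([np.ones(len(P)), P[:, 0], P[:, 1],
                     P[:, 0] ** 2, P[:, 1] ** 2, P[:, 0] * P[:, 1]])
coef = np.linalg.lstsq(X, scores, rcond=None)[0]

# Minimize the surrogate on a dense grid -- cheap, no further model runs.
g = np.linspace(-1, 1, 201)
G1, G2 = np.meshgrid(g, g)
surrogate = (coef[0] + coef[1] * G1 + coef[2] * G2 +
             coef[3] * G1 ** 2 + coef[4] * G2 ** 2 + coef[5] * G1 * G2)
j = np.unravel_index(np.argmin(surrogate), surrogate.shape)
p_opt = (G1[j], G2[j])
```

With five parameters the design matrix grows to 21 quadratic terms, which is why a few tens of simulations suffice, as reported in the abstract.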
Petascale Computing: Impact on Future NASA Missions
NASA Technical Reports Server (NTRS)
Brooks, Walter
2006-01-01
This slide presentation reviews NASA's use of a new supercomputer, called Columbia, capable of operating at 62 teraflops. This computer is the fourth-fastest computer in the world and will serve all mission directorates. The applications it will serve are: aerospace analysis and design, propulsion subsystem analysis, climate modeling, hurricane prediction, and astrophysics and cosmology.
Accelerating Time Integration for the Shallow Water Equations on the Sphere Using GPUs
Archibald, R.; Evans, K. J.; Salinger, A.
2015-06-01
The push towards larger and larger computational platforms has made it possible for climate simulations to resolve climate dynamics across multiple spatial and temporal scales. This direction in climate simulation has created a strong need to develop scalable timestepping methods capable of accelerating throughput on high performance computing. This study details the recent advances in the implementation of implicit time stepping of the spectral element dynamical core within the United States Department of Energy (DOE) Accelerated Climate Model for Energy (ACME) on graphics processing unit (GPU)-based machines. We demonstrate how solvers in the Trilinos project are interfaced with ACME and GPU kernels to increase the computational speed of the residual calculations in the implicit time stepping method for the atmosphere dynamics. We demonstrate the optimization gains and data structure reorganization that facilitate the performance improvements.
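The core pattern here, an implicit step whose nonlinear residual is driven to zero by an iterative solver, can be shown on a toy problem (a scalar stiff ODE with backward Euler and Newton iteration; this is purely illustrative and does not reflect the ACME/Trilinos implementation):

```python
import numpy as np

# Stiff test ODE: y' = -k*(y - cos(t)); explicit methods would need
# dt ~ 1/k for stability, while backward Euler allows much larger steps.
k, dt = 100.0, 0.1
y, t = 1.0, 0.0

for _ in range(50):
    t_new = t + dt
    y_new = y                           # initial Newton guess
    for _ in range(20):
        # residual of the backward-Euler equation at the new time level
        resid = y_new - y - dt * (-k * (y_new - np.cos(t_new)))
        jac = 1.0 + dt * k              # d(resid)/d(y_new)
        step = resid / jac
        y_new -= step
        if abs(step) < 1e-12:
            break
    y, t = y_new, t_new

# For large k the computed solution tracks the slow forcing cos(t).
err = abs(y - np.cos(t))
```

In the dynamical core, `resid` is a vector over the whole grid and the Newton solve uses Krylov methods; the residual evaluation is exactly the kernel whose GPU acceleration the abstract describes.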
Recent Naval Postgraduate School Publications.
1982-04-01
477 p. Haney, R. L., et al., eds. Ocean models for climate research: A workshop sponsored by the U.S. Committee for the Global Atmos. Res. Program. Nat... climate variability, Oceanus, vol. 21, no. 4, p. 33-39 (1978). Williams, R. T. A review of theoretical models of atmospheric frontogenesis, Chapman Conf... structure in large-scale optimization models, Symp. on Computer-Assisted Analysis and Model Simplification, Boulder, Colo., Mar. 24, 1980. Brown, G. G.
Using Web 2.0 Techniques To Bring Global Climate Modeling To More Users
NASA Astrophysics Data System (ADS)
Chandler, M. A.; Sohl, L. E.; Tortorici, S.
2012-12-01
The Educational Global Climate Model has been used for many years in undergraduate courses and professional development settings to teach the fundamentals of global climate modeling and climate change simulation to students and teachers. While course participants have reported a high level of satisfaction in these courses and overwhelmingly claim that EdGCM projects are worth the effort, there is often a high level of frustration during the initial learning stages. Many of the problems stem from issues related to installation of the software suite and to the length of time it can take to run initial experiments. Two or more days of continuous run time may be required before enough data has been gathered to begin analyses. Asking users to download existing simulation data has not been a solution because the GCM data sets are several gigabytes in size, requiring substantial bandwidth and stable dedicated internet connections. As a means of getting around these problems we have been developing a Web 2.0 utility called EzGCM ("Easy GCM") which emphasizes that participants learn the steps involved in climate modeling research: constructing a hypothesis, designing an experiment, running a computer model and assessing when an experiment has finished (reached equilibrium), using scientific visualization to support analysis, and finally communicating the results through social networking methods. We use classic climate experiments that can be "rediscovered" through exercises with EzGCM and are attempting to make this Web 2.0 tool an entry point into climate modeling for teachers with little time to cover the subject, users with limited computer skills, and for those who want an introduction to the process before tackling more complex projects with EdGCM.
NASA Astrophysics Data System (ADS)
Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.
2014-12-01
The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enables accurate climate projections. NCI makes available 10+ PB of major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale and high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. 
Future challenges will involve the further integration and analysis of this data across the social sciences to facilitate the impacts across the societal domain, including timely analysis to more accurately predict and forecast future climate and environmental state.
A computer network with scada and case tools for on-line process control in greenhouses
NASA Astrophysics Data System (ADS)
Gieling, Th. H.; van Meurs, W. Th. M.; Janssen, H. J. J.
Climate control computers in greenhouses are used to control heating and ventilation, supply water and dilute and dispense nutrients. They integrate models into optimally controlled systems. This paper describes how information technology, as in use in other sectors of industry, is applied to greenhouse control. The introduction of modern software and hardware concepts in horticulture adds power and extra opportunities to climate control in greenhouses.
A computer network with SCADA and case tools for on-line process control in greenhouses.
Gieling ThH; van Meurs WTh; Janssen, H J
1996-01-01
Climate control computers in greenhouses are used to control heating and ventilation, supply water and dilute and dispense nutrients. They integrate models into optimally controlled systems. This paper describes how information technology, as in use in other sectors of industry, is applied to greenhouse control. The introduction of modern software and hardware concepts in horticulture adds power and extra opportunities to climate control in greenhouses.
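The heating-control task such a climate computer performs can be sketched with a toy proportional control loop (all constants invented for illustration; this is not the authors' SCADA software):

```python
setpoint = 20.0        # desired air temperature, deg C
outside = 5.0          # outside temperature, deg C
temp = 12.0            # initial inside temperature, deg C
kp = 50.0              # proportional gain, W per deg C of error
loss = 30.0            # heat-loss coefficient to outside, W per deg C
capacity = 100000.0    # thermal capacity of the air volume, J per deg C
dt = 60.0              # control interval, seconds

history = []
for _ in range(200):
    error = setpoint - temp
    heater = max(0.0, min(5000.0, kp * error))      # actuator limits
    # Simple energy balance: heater input minus conductive losses.
    temp += dt * (heater - loss * (temp - outside)) / capacity
    history.append(temp)

final_temp = history[-1]
```

Note that proportional-only control settles below the setpoint (steady-state offset, here near 14.4 deg C), which is one reason real greenhouse controllers use integral action and model-based optimization, as the abstract indicates.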
NASA Astrophysics Data System (ADS)
Strassmann, Kuno M.; Joos, Fortunat
2018-05-01
The Bern Simple Climate Model (BernSCM) is a free open-source re-implementation of a reduced-form carbon cycle-climate model which has been used widely in previous scientific work and IPCC assessments. BernSCM represents the carbon cycle and climate system with a small set of equations for the heat and carbon budget, the parametrization of major nonlinearities, and the substitution of complex component systems with impulse response functions (IRFs). The IRF approach allows cost-efficient yet accurate substitution of detailed parent models of climate system components with near-linear behavior. Illustrative simulations of scenarios from previous multimodel studies show that BernSCM is broadly representative of the range of the climate-carbon cycle response simulated by more complex and detailed models. Model code (in Fortran) was written from scratch with transparency and extensibility in mind, and is provided open source. BernSCM makes scientifically sound carbon cycle-climate modeling available for many applications. Supporting up to decadal time steps with high accuracy, it is suitable for studies with high computational load and for coupling with integrated assessment models (IAMs), for example. Further applications include climate risk assessment in a business, public, or educational context and the estimation of CO2 and climate benefits of emission mitigation options.
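The IRF substitution at the heart of this model class can be sketched as follows. The pool weights and time scales below are illustrative placeholders, not BernSCM's fitted values: the airborne fraction of a CO2 pulse is approximated by a sum of exponentials, and the atmospheric burden under any emission path follows by convolution:

```python
import numpy as np

a = np.array([0.2, 0.3, 0.3, 0.2])            # pool weights (sum to 1)
tau = np.array([np.inf, 250.0, 50.0, 5.0])    # removal time scales, years

def irf(t):
    """Fraction of an emitted pulse still airborne after t years."""
    return sum(ai * (1.0 if np.isinf(ti) else np.exp(-t / ti))
               for ai, ti in zip(a, tau))

years = np.arange(200)
emissions = np.full(200, 10.0)     # constant 10 GtC/yr, illustrative

# Discrete convolution of emissions with the IRF gives the CO2 burden.
burden = np.array([sum(emissions[s] * irf(ty - s) for s in range(ty + 1))
                   for ty in years])
airborne_frac = burden[-1] / emissions.sum()
```

Because the convolution is cheap and stable at coarse (up to decadal) time steps, emulators of this form suit the IAM-coupling and risk-assessment applications listed above; BernSCM additionally parametrizes the nonlinearities (e.g. carbonate chemistry, temperature feedbacks) that a fixed IRF omits.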
NASA Astrophysics Data System (ADS)
Gil, Y.; Duffy, C.
2015-12-01
This paper proposes the concept of a "Computable Catchment" which is used to develop a collaborative platform for watershed modeling and data analysis. The object of the research is a sharable, executable document similar to a PDF, but one that includes documentation of the underlying theoretical concepts, interactive computational/numerical resources, linkage to essential data repositories and the ability for interactive model-data visualization and analysis. The executable document for each catchment is stored in the cloud with automatic provisioning and a unique identifier, allowing collaborative model and data enhancements for historical hydroclimatic reconstruction and/or future land-use or climate change scenarios to be easily reconstructed or extended. The Computable Catchment adopts metadata standards for naming all variables in the model and the data. The a priori or initial data is derived from national data sources for soils, hydrogeology, climate, and land cover available from the www.hydroterre.psu.edu data service (Leonard and Duffy, 2015). The executable document is based on Wolfram CDF or Computable Document Format with an interactive open-source reader accessible by any modern computing platform. The CDF file and contents can be uploaded to a website or simply shared as a normal document, maintaining all interactive features of the model and data. The Computable Catchment concept represents one application for Geoscience Papers of the Future, representing an extensible document that combines theory, models, data and analysis that are digitally shared, documented and reused among research collaborators, students, educators and decision makers.
Probability for Weather and Climate
NASA Astrophysics Data System (ADS)
Smith, L. A.
2013-12-01
Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. 
The aims of different ensemble strategies, and fundamental differences in ensemble design to support decision making versus advancing science, are noted. It is argued that, just as no point forecast is complete without an estimate of its accuracy, no model-based probability forecast is complete without an estimate of its own irrelevance. The same nonlinearities that made the electronic computer so valuable link the selection and assimilation of observations, the formation of ensembles, the evolution of models, the casting of model simulations back into observables, and the presentation of this information to those who use it to take action or to advance science. Timescales of interest exceed the lifetime of a climate model and the career of a climate scientist, disarming the trichotomy that led to swift advances in weather forecasting. Providing credible, informative climate services is a more difficult task. In this context, the value of comparing the forecasts of simulation models not only with each other but also with the performance of simple empirical models, whenever possible, is stressed. The credibility of meteorology is based on its ability to forecast and explain the weather. The credibility of climatology will always be based on flimsier stuff. Solid insights of climate science may be obscured if the severe limits on our ability to see the details of the future even probabilistically are not communicated clearly.
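The Monte Carlo ensemble idea referred to above can be illustrated with a toy chaotic system (the logistic map; numbers and threshold invented for illustration, not a weather model): perturb the initial condition across members, run all members forward, and read a probability off the ensemble:

```python
import numpy as np

rng = np.random.default_rng(3)
r = 3.9                                  # logistic-map parameter (chaotic)
x0 = 0.5                                 # best-guess initial condition

# 1000 ensemble members sampling initial-condition uncertainty.
members = np.clip(x0 + 0.01 * rng.standard_normal(1000), 0.001, 0.999)

x = members.copy()
for _ in range(20):                      # 20 "forecast steps"
    x = r * x * (1.0 - x)

# Forecast probability of the "event" x > 0.8 at the final step.
p_event = float(np.mean(x > 0.8))
```

Such an ensemble samples initial-condition uncertainty only; as the abstract stresses, it says nothing by itself about structural model error, which is why a model-based probability is incomplete without an estimate of its own irrelevance.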
Evaluation of the new EMAC-SWIFT chemistry climate model
NASA Astrophysics Data System (ADS)
Scheffler, Janice; Langematz, Ulrike; Wohltmann, Ingo; Rex, Markus
2016-04-01
It is well known that the representation of atmospheric ozone chemistry in weather and climate models is essential for a realistic simulation of the atmospheric state. Including atmospheric ozone chemistry into climate simulations is usually done by prescribing a climatological ozone field, by including a fast linear ozone scheme into the model or by using a climate model with complex interactive chemistry. While prescribed climatological ozone fields are often not aligned with the modelled dynamics, a linear ozone scheme may not be applicable for a wide range of climatological conditions. Although interactive chemistry provides a realistic representation of atmospheric chemistry such model simulations are computationally very expensive and hence not suitable for ensemble simulations or simulations with multiple climate change scenarios. A new approach to represent atmospheric chemistry in climate models which can cope with non-linearities in ozone chemistry and is applicable to a wide range of climatic states is the Semi-empirical Weighted Iterative Fit Technique (SWIFT) that is driven by reanalysis data and has been validated against observational satellite data and runs of a full Chemistry and Transport Model. SWIFT has recently been implemented into the ECHAM/MESSy (EMAC) chemistry climate model that uses a modular approach to climate modelling where individual model components can be switched on and off. Here, we show first results of EMAC-SWIFT simulations and validate these against EMAC simulations using the complex interactive chemistry scheme MECCA, and against observations.
NASA Astrophysics Data System (ADS)
Michaelis, A.; Nemani, R. R.; Wang, W.; Votava, P.; Hashimoto, H.
2010-12-01
Given the increasing complexity of climate modeling and analysis tools, it is often difficult and expensive to build or recreate an exact replica of the software compute environment used in past experiments. With the recent development of new technologies for hardware virtualization, an opportunity exists to create full modeling, analysis and compute environments that are "archivable", transferable and may be easily shared amongst a scientific community or presented to a bureaucratic body if the need arises. By encapsulating an entire modeling and analysis environment in a virtual machine image, others may quickly gain access to the fully built system used in past experiments, potentially easing the task and reducing the costs of reproducing and verifying past results produced by other researchers. Moreover, these virtual machine images may be used as a pedagogical tool for others who are interested in performing an academic exercise but don't yet possess the broad expertise required. We built two virtual machine images, one with the Community Earth System Model (CESM) and one with the Weather Research and Forecasting (WRF) model, then ran several small experiments to assess the feasibility, performance overhead costs, reusability, and transferability. We present a list of the pros and cons as well as lessons learned from utilizing virtualization technology in the climate and earth systems modeling domain.
Milly, Paul C.D.; Dunne, Krista A.
2011-01-01
Hydrologic models often are applied to adjust projections of hydroclimatic change that come from climate models. Such adjustment includes climate-bias correction, spatial refinement ("downscaling"), and consideration of the roles of hydrologic processes that were neglected in the climate model. Described herein is a quantitative analysis of the effects of hydrologic adjustment on the projections of runoff change associated with projected twenty-first-century climate change. In a case study including three climate models and 10 river basins in the contiguous United States, the authors find that relative (i.e., fractional or percentage) runoff change computed with hydrologic adjustment more often than not was less positive (or, equivalently, more negative) than what was projected by the climate models. The dominant contributor to this decrease in runoff was a ubiquitous change in runoff (median -11%) caused by the hydrologic model’s apparent amplification of the climate-model-implied growth in potential evapotranspiration. Analysis suggests that the hydrologic model, on the basis of the empirical, temperature-based modified Jensen–Haise formula, calculates a change in potential evapotranspiration that is typically 3 times the change implied by the climate models, which explicitly track surface energy budgets. In comparison with the amplification of potential evapotranspiration, central tendencies of other contributions from hydrologic adjustment (spatial refinement, climate-bias adjustment, and process refinement) were relatively small. The authors’ findings highlight the need for caution when projecting changes in potential evapotranspiration for use in hydrologic models or drought indices to evaluate climate-change impacts on water.
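The amplification mechanism described above can be shown with a back-of-envelope sketch. A Jensen-Haise-type formula scales PET with (T - Tx)·Rs; the coefficient, temperatures, and the assumed energy-budget sensitivity below are illustrative, not the study's calibrated values:

```python
# Temperature-based PET formula (Jensen-Haise type), radiation held fixed.
ct, tx = 0.025, -3.0        # hypothetical coefficient and base temperature (deg C)
rs = 20.0                   # solar radiation term (MJ m-2 day-1), held fixed

def pet_jensen_haise(t_mean):
    return ct * (t_mean - tx) * rs

t_now, dt_warm = 15.0, 3.0
pet_change = pet_jensen_haise(t_now + dt_warm) / pet_jensen_haise(t_now) - 1.0

# An energy-budget view ties PET growth mainly to available energy,
# often only a few percent per kelvin; ~2 %/K assumed here for comparison.
energy_budget_change = 0.02 * dt_warm
amplification = pet_change / energy_budget_change
```

Because the fractional change is dT/(T - Tx), a temperature-based formula can yield roughly three times the energy-budget-implied PET change, which is the order of amplification the study reports.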
Climate Ocean Modeling on a Beowulf-Class System
NASA Technical Reports Server (NTRS)
Cheng, B. N.; Chao, Y.; Wang, P.; Bondarenko, M.
2000-01-01
With the growing power and shrinking cost of personal computers, the availability of fast Ethernet interconnections, and public domain software packages, it is now possible to combine them to build desktop parallel computers (named Beowulf or PC clusters) at a fraction of what it would cost to buy systems of comparable power from supercomputer companies. This led us to build and assemble our own system, specifically for climate ocean modeling. In this article, we present our experience with such a system, discuss its network performance, and provide some performance comparison data with both the HP SPP2000 and Cray T3E for an ocean model used in present-day oceanographic research.
Complex networks as a unified framework for descriptive analysis and predictive modeling in climate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steinhaeuser, Karsten J K; Chawla, Nitesh; Ganguly, Auroop R
The analysis of climate data has relied heavily on hypothesis-driven statistical methods, while projections of future climate are based primarily on physics-based computational models. However, in recent years a wealth of new datasets has become available. Therefore, we take a more data-centric approach and propose a unified framework for studying climate, with an aim towards characterizing observed phenomena as well as discovering new knowledge in the climate domain. Specifically, we posit that complex networks are well-suited for both descriptive analysis and predictive modeling tasks. We show that the structural properties of climate networks have useful interpretation within the domain. Furthermore, we extract clusters from these networks and demonstrate their predictive power as climate indices. Our experimental results establish that the network clusters are statistically significantly better predictors than clusters derived using a more traditional clustering approach. Using complex networks as the data representation thus enables descriptive and predictive modeling to inform each other.
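The network construction described above can be sketched as follows: grid-point anomaly time series become nodes, and an edge connects any pair whose correlation exceeds a threshold. The data, threshold, and induced link below are synthetic and illustrative.

```python
import numpy as np

# Sketch of a climate-network construction: nodes are grid points, edges
# connect points whose anomaly time series are strongly correlated.
# The series, threshold, and induced teleconnection-like link are synthetic.
rng = np.random.default_rng(0)
n_points, n_months = 6, 120
series = rng.standard_normal((n_points, n_months))
series[1] += 0.9 * series[0]          # make points 0 and 1 co-vary

corr = np.corrcoef(series)            # pairwise Pearson correlations
adjacency = (np.abs(corr) > 0.5) & ~np.eye(n_points, dtype=bool)
degree = adjacency.sum(axis=1)        # a simple structural property per node
print(degree)
```

Structural properties such as node degree, and clusters of tightly connected nodes, are the quantities the abstract interprets as candidate climate indices.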
NASA Astrophysics Data System (ADS)
Duffy, D.; Maxwell, T. P.; Doutriaux, C.; Williams, D. N.; Chaudhary, A.; Ames, S.
2015-12-01
As remote sensing observations and model output grow, the volume of data has become overwhelming, even to many scientific experts. As societies are forced to better understand, mitigate, and adapt to climate change, the combination of Earth observation data and global climate model projections is crucial not only to scientists but also to policy makers, downstream applications, and even the public. Scientific progress on understanding climate is critically dependent on the availability of a reliable infrastructure that promotes data access, management, and provenance. The Earth System Grid Federation (ESGF) has created such an environment for the Intergovernmental Panel on Climate Change (IPCC). ESGF provides a federated global cyber infrastructure for data access and management of model outputs generated for the IPCC Assessment Reports (AR). The current generation of the ESGF federated grid allows consumers of the data to find and download data with limited capabilities for server-side processing. Since the amount of data for future ARs is expected to grow dramatically, ESGF is working on integrating server-side analytics throughout the federation. The ESGF Compute Working Team (CWT) has created a Web Processing Service (WPS) Application Programming Interface (API) to enable access to scalable computational resources. The API is the exposure point to high performance computing resources across the federation. Specifically, the API allows users to execute simple operations, such as maximum, minimum, average, and anomalies, on ESGF data without having to download the data. These operations are executed at the ESGF data node site with access to large amounts of parallel computing capabilities. This presentation will highlight the WPS API and its capabilities, provide implementation details, and discuss future developments.
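A minimal sketch of the kind of server-side reduction the API exposes (the function name and data layout are illustrative, not the actual CWT WPS interface): the operation runs where the data lives, and only the small result travels back to the user.

```python
import numpy as np

# Sketch of a server-side "anomalies" operation of the kind the ESGF CWT
# WPS API exposes (max, min, average, anomalies). The function name and
# data layout are illustrative; the point is that the reduction runs at
# the data node and only the result is returned to the client.
def anomalies(field, axis=0):
    """Deviation of each time step from the time-mean climatology."""
    climatology = field.mean(axis=axis, keepdims=True)
    return field - climatology

monthly = np.arange(24.0).reshape(24, 1)   # stand-in for (time, space) data
anom = anomalies(monthly)
print(anom.mean())                          # anomalies average to zero
```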
Goodsman, Devin W.; Aukema, Brian H.; McDowell, Nate G.; ...
2017-11-26
Phenology models are becoming increasingly important tools to accurately predict how climate change will impact the life histories of organisms. We propose a class of integral projection phenology models derived from stochastic individual-based models of insect development and demography. Our derivation, which is based on the rate summation concept, produces integral projection models that capture the effect of phenotypic rate variability on insect phenology, but which are typically more computationally frugal than equivalent individual-based phenology models. We demonstrate our approach using a temperature-dependent model of the demography of the mountain pine beetle (Dendroctonus ponderosae Hopkins), an insect that kills mature pine trees. This work illustrates how a wide range of stochastic phenology models can be reformulated as integral projection models. Due to their computational efficiency, these integral projection models are suitable for deployment in large-scale simulations, such as studies of altered pest distributions under climate change.
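The rate-summation concept the derivation builds on can be sketched in a few lines: development accumulates at a temperature-dependent rate, and a life stage completes when the accumulated total reaches one. The linear rate function and its parameters below are illustrative, not the beetle model's fitted curves.

```python
# Sketch of rate summation: development accumulates at a temperature-
# dependent rate r(T); a life stage completes when the sum reaches 1.
# The linear rate function and parameters are illustrative only.
def development_rate(temp_c, base=5.0, slope=0.01):
    """Daily development rate; zero below the base temperature."""
    return max(0.0, slope * (temp_c - base))

def days_to_complete(daily_temps):
    total = 0.0
    for day, temp in enumerate(daily_temps, start=1):
        total += development_rate(temp)
        if total >= 1.0:
            return day
    return None  # stage not completed within the record

warm = days_to_complete([20.0] * 365)   # rate 0.15/day -> day 7
cool = days_to_complete([12.0] * 365)   # rate 0.07/day -> day 15
print(warm, cool)
```

An individual-based model would simulate many such individuals with randomly drawn rates; the integral projection reformulation instead propagates the whole distribution of development stages, which is where the computational savings come from.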
Do Responses to Different Anthropogenic Forcings Add Linearly in Climate Models?
NASA Technical Reports Server (NTRS)
Marvel, Kate; Schmidt, Gavin A.; Shindell, Drew; Bonfils, Celine; LeGrande, Allegra N.; Nazarenko, Larissa; Tsigaridis, Kostas
2015-01-01
Many detection and attribution and pattern scaling studies assume that the global climate response to multiple forcings is additive: that the response over the historical period is statistically indistinguishable from the sum of the responses to individual forcings. Here, we use the NASA Goddard Institute for Space Studies (GISS) and National Center for Atmospheric Research Community Climate System Model (CCSM) simulations from the CMIP5 archive to test this assumption for multi-year trends in global-average, annual-average temperature and precipitation at multiple timescales. We find that responses in models forced by pre-computed aerosol and ozone concentrations are generally additive across forcings; however, we demonstrate that there are significant nonlinearities in precipitation responses to different forcings in a configuration of the GISS model that interactively computes these concentrations from precursor emissions. We attribute these to differences in ozone forcing arising from interactions between forcing agents. Our results suggest that attribution to specific forcings may be complicated in a model with fully interactive chemistry and may provide motivation for other modeling groups to conduct further single-forcing experiments.
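The additivity assumption being tested can be illustrated with synthetic ensemble trends standing in for CMIP5 output: compare the all-forcings trend against the sum of single-forcing trends, judging the residual against the combined ensemble spread. All numbers below are invented for illustration.

```python
import numpy as np

# Synthetic ensembles of decadal temperature trends stand in for CMIP5
# single-forcing and all-forcings runs; all numbers are illustrative.
rng = np.random.default_rng(1)
ghg_trends = 0.20 + 0.02 * rng.standard_normal(10)         # K/decade
aerosol_trends = -0.08 + 0.02 * rng.standard_normal(10)
historical_trends = 0.12 + 0.02 * rng.standard_normal(10)  # all forcings

# Additivity: historical trend ~ sum of single-forcing trends
residual = historical_trends.mean() - (ghg_trends.mean() + aerosol_trends.mean())
# Combined standard error of the three ensemble means as a rough yardstick
spread = np.sqrt(sum(t.var(ddof=1) / t.size
                     for t in (ghg_trends, aerosol_trends, historical_trends)))
print(f"residual {residual:+.3f} K/decade, spread {spread:.3f} K/decade")
```

A residual small relative to the spread is consistent with additivity; the abstract's interactive-chemistry configuration is precisely the case where the residual becomes significant.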
Zarzycki, Colin M.; Reed, Kevin A.; Bacmeister, Julio T.; ...
2016-02-25
This article discusses the sensitivity of tropical cyclone climatology to surface coupling strategy in high-resolution configurations of the Community Earth System Model. Using two supported model setups, we demonstrate that the choice of grid on which the lowest model level wind stress and surface fluxes are computed may lead to differences in cyclone strength in multi-decadal climate simulations, particularly for the most intense cyclones. Using a deterministic framework, we show that when these surface quantities are calculated on an ocean grid that is coarser than the atmosphere, the computed frictional stress is misaligned with wind vectors in individual atmospheric grid cells. This reduces the effective surface drag, and results in more intense cyclones when compared to a model configuration where the ocean and atmosphere are of equivalent resolution. Our results demonstrate that the choice of computation grid for atmosphere–ocean interactions is non-negligible when considering climate extremes at high horizontal resolution, especially when model components are on highly disparate grids.
Contrasting model complexity under a changing climate in a headwaters catchment.
NASA Astrophysics Data System (ADS)
Foster, L.; Williams, K. H.; Maxwell, R. M.
2017-12-01
Alpine, snowmelt-dominated catchments are the source of water for more than 1/6th of the world's population. These catchments are topographically complex, leading to steep weather gradients and nonlinear relationships between water and energy fluxes. Recent evidence suggests that alpine systems are more sensitive to climate warming, but these regions are vastly simplified in climate models and operational water management tools due to computational limitations. Simultaneously, point-scale observations are often extrapolated to larger regions where feedbacks can both exacerbate or mitigate locally observed changes. It is critical to determine whether projected climate impacts are robust to different methodologies, including model complexity. Using high performance computing and an integrated model of a representative headwater catchment we determined the hydrologic response from 30 projected climate changes to precipitation, temperature and vegetation for the Rocky Mountains. Simulations were run with 100m and 1km resolution, and with and without lateral subsurface flow in order to vary model complexity. We found that model complexity alters nonlinear relationships between water and energy fluxes. Higher-resolution models predicted larger changes per degree of temperature increase than lower resolution models, suggesting that reductions to snowpack, surface water, and groundwater due to warming may be underestimated in simple models. Increases in temperature were found to have a larger impact on water fluxes and stores than changes in precipitation, corroborating previous research showing that mountain systems are significantly more sensitive to temperature changes than to precipitation changes and that increases in winter precipitation are unlikely to compensate for increased evapotranspiration in a higher energy environment. 
These numerical experiments help to (1) bracket the range of uncertainty in published literature of climate change impacts on headwater hydrology; (2) characterize the role of precipitation and temperature changes on water supply for snowmelt-dominated downstream basins; and (3) identify which climate impacts depend on the scale of simulation.
Parallel computing of a climate model on the Dawn 1000 by domain decomposition method
NASA Astrophysics Data System (ADS)
Bi, Xunqiang
1997-12-01
In this paper, the parallel computing of a grid-point nine-level atmospheric general circulation model on the Dawn 1000 is introduced. The model was developed by the Institute of Atmospheric Physics (IAP), Chinese Academy of Sciences (CAS). The Dawn 1000 is a MIMD massively parallel computer made by the National Research Center for Intelligent Computer (NCIC), CAS. A two-dimensional domain decomposition method is adopted to perform the parallel computing. Potential ways to increase the speed-up ratio and exploit the resources of future massively parallel supercomputers are also discussed.
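A two-dimensional domain decomposition of the kind described can be sketched as follows; the grid and process-mesh sizes are illustrative.

```python
# Sketch of two-dimensional domain decomposition: the global lat-lon grid
# is split into a process mesh of subdomains, each handled by one
# processor. Grid and mesh sizes below are illustrative.
def decompose(n_lat, n_lon, p_rows, p_cols):
    """Return (row_slice, col_slice) for every process in a p_rows x p_cols mesh."""
    def split(n, parts):
        base, extra = divmod(n, parts)
        bounds, start = [], 0
        for i in range(parts):
            stop = start + base + (1 if i < extra else 0)
            bounds.append(slice(start, stop))
            start = stop
        return bounds
    rows, cols = split(n_lat, p_rows), split(n_lon, p_cols)
    return [(r, c) for r in rows for c in cols]

# A 64 x 128 grid on a 4 x 8 process mesh -> 32 subdomains of 16 x 16 points
subdomains = decompose(64, 128, 4, 8)
print(len(subdomains), subdomains[0])
```

Splitting in both dimensions keeps each subdomain's halo (boundary exchange) small relative to its interior, which is what improves the speed-up ratio over a one-dimensional split.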
An approach to secure weather and climate models against hardware faults
NASA Astrophysics Data System (ADS)
Düben, Peter D.; Dawson, Andrew
2017-03-01
Enabling Earth System models to run efficiently on future supercomputers is a serious challenge for model development. Many publications study efficient parallelization to allow better scaling of performance on an increasing number of computing cores. However, one of the most alarming threats for weather and climate predictions on future high performance computing architectures is widely ignored: the presence of hardware faults that will frequently hit large applications as we approach exascale supercomputing. Changes in the structure of weather and climate models that would allow them to be resilient against hardware faults are hardly discussed in the model development community. In this paper, we present an approach to secure the dynamical core of weather and climate models against hardware faults using a backup system that stores coarse resolution copies of prognostic variables. Frequent checks of the model fields on the backup grid allow the detection of severe hardware faults, and prognostic variables that are changed by hardware faults on the model grid can be restored from the backup grid to continue model simulations with no significant delay. To justify the approach, we perform model simulations with a C-grid shallow water model in the presence of frequent hardware faults. As long as the backup system is used, simulations do not crash and a high level of model quality can be maintained. The overhead due to the backup system is reasonable and additional storage requirements are small. Runtime is increased by only 13 % for the shallow water model.
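The backup-grid mechanism described above can be sketched in a few lines: keep a coarse copy of each prognostic field, flag a fault when the coarsened field departs grossly from the backup, and restore from the backup. The coarsening factor and detection threshold below are illustrative.

```python
import numpy as np

# Sketch of the backup-grid idea: store a coarse copy of each prognostic
# field, detect a fault when the fine field disagrees grossly with the
# backup, and restore from the coarse copy. The coarsening factor and
# detection threshold are illustrative.
FACTOR = 4  # backup grid is 4x coarser in each direction

def coarsen(field):
    n = field.shape[0] // FACTOR
    return field.reshape(n, FACTOR, n, FACTOR).mean(axis=(1, 3))

def check_and_restore(field, backup, tol=10.0):
    """Restore from the backup if the coarsened field departs from it
    by more than tol anywhere (a crude severe-fault detector)."""
    if np.max(np.abs(coarsen(field) - backup)) > tol:
        return np.repeat(np.repeat(backup, FACTOR, 0), FACTOR, 1), True
    return field, False

h = np.full((16, 16), 100.0)   # prognostic field, e.g. fluid height
backup = coarsen(h)
h[3, 7] = 1e30                 # simulated bit flip from a hardware fault
h, restored = check_and_restore(h, backup)
print(restored, h.max())
```

In the real scheme the restore step interpolates rather than block-replicates, and the check runs frequently enough that the simulation continues with no significant delay.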
Wildfire potential evaluation during a drought event with a regional climate model and NDVI
Y. Liu; J. Stanturf; S. Goodrick
2010-01-01
Regional climate modeling is a technique for simulating high-resolution physical processes in the atmosphere, soil and vegetation. It can be used to evaluate wildfire potential by either providing meteorological conditions for computation of fire indices or predicting soil moisture as a direct measure of fire potential. This study examines these roles using a regional...
New Gravity Wave Treatments for GISS Climate Models
NASA Technical Reports Server (NTRS)
Geller, Marvin A.; Zhou, Tiehan; Ruedy, Reto; Aleinov, Igor; Nazarenko, Larissa; Tausnev, Nikolai L.; Sun, Shan; Kelley, Maxwell; Cheng, Ye
2011-01-01
Previous versions of GISS climate models have either used formulations of Rayleigh drag to represent unresolved gravity wave interactions with the model-resolved flow or have included a rather complicated treatment of unresolved gravity waves that, while being climate interactive, involved the specification of a relatively large number of parameters that were not well constrained by observations and also was computationally very expensive. Here, the authors introduce a relatively simple and computationally efficient specification of unresolved orographic and nonorographic gravity waves and their interaction with the resolved flow. Comparisons of the GISS model winds and temperatures with no gravity wave parameterization; with only orographic gravity wave parameterization; and with both orographic and nonorographic gravity wave parameterizations are shown to illustrate how the zonal mean winds and temperatures converge toward observations. The authors also show that the specifications of orographic and nonorographic gravity waves must be different in the Northern and Southern Hemispheres. Then results are presented where the nonorographic gravity wave sources are specified to represent sources from convection in the intertropical convergence zone and spontaneous emission from jet imbalances. Finally, a strategy to include these effects in a climate-dependent manner is suggested.
An ARM data-oriented diagnostics package to evaluate the climate model simulation
NASA Astrophysics Data System (ADS)
Zhang, C.; Xie, S.
2016-12-01
A set of diagnostics that utilize long-term high frequency measurements from the DOE Atmospheric Radiation Measurement (ARM) program is developed for evaluating the regional simulation of clouds, radiation and precipitation in climate models. The diagnostics results are computed and visualized automatically in a python-based package that aims to serve as an easy entry point for evaluating climate simulations using the ARM data, as well as the CMIP5 multi-model simulations. Basic performance metrics are computed to measure the accuracy of mean state and variability of simulated regional climate. The evaluated physical quantities include vertical profiles of clouds, temperature, relative humidity, cloud liquid water path, total column water vapor, precipitation, sensible and latent heat fluxes, radiative fluxes, aerosol and cloud microphysical properties. Process-oriented diagnostics focusing on individual cloud and precipitation-related phenomena are developed for the evaluation and development of specific model physical parameterizations. Application of the ARM diagnostics package will be presented in the AGU session. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344, IM release number is: LLNL-ABS-698645.
NASA Technical Reports Server (NTRS)
Youngblut, C.
1984-01-01
Orography and geographically fixed heat sources which force a zonally asymmetric motion field are examined. An extensive space-time spectral analysis of the GLAS climate model (D130) response and observations are compared. An updated version of the model (D150) showed a remarkable improvement in the simulation of the standing waves. The main differences in the model code are an improved boundary layer flux computation and a more realistic specification of the global boundary conditions.
NASA Astrophysics Data System (ADS)
Khan, M.; Abdul-Aziz, O. I.
2016-12-01
Changes in climatic regimes and basin characteristics such as imperviousness, roughness, and land use types would lead to potential changes in the stormwater budget. In this study we quantified reference sensitivities of stormwater runoff to potential climatic and land use/cover changes by developing a large-scale, mechanistic rainfall-runoff model for the Tampa Bay Basin of Florida using the US EPA Storm Water Management Model (SWMM 5.1). Key processes of urban hydrology, its dynamic interactions with groundwater and sea level, hydro-climatic variables, and land use/cover characteristics were incorporated within the model. The model was calibrated and validated with historical streamflow data. We then computed the historical (1970-2000) and potential 2050s stormwater budgets for the Tampa Bay Basin. Climatic scenarios projected by global climate models (GCMs) and regional climate models (RCMs), along with sea level and land use/cover projections, were utilized to anticipate the future stormwater budget. The comparative assessment of current and future stormwater scenarios will aid a proactive management of stormwater runoff under a changing climate in the Tampa Bay Basin and similar urban basins around the world.
A new framework for the analysis of continental-scale convection-resolving climate simulations
NASA Astrophysics Data System (ADS)
Leutwyler, D.; Charpilloz, C.; Arteaga, A.; Ban, N.; Di Girolamo, S.; Fuhrer, O.; Hoefler, T.; Schulthess, T. C.; Schär, C.
2017-12-01
High-resolution climate simulations at horizontal resolutions of O(1-4 km) allow explicit treatment of deep convection (thunderstorms and rain showers). Explicitly treating convection with the governing equations reduces uncertainties associated with parametrization schemes and allows a model formulation closer to physical first principles [1,2]. But kilometer-scale climate simulations with long integration periods and large computational domains are expensive, and data storage becomes unbearably voluminous. Hence new approaches to performing analysis are required. In the crCLIM project we propose a new climate modeling framework that allows scientists to conduct analysis at high spatial and temporal resolution. We tackle the computational cost by using the largest available supercomputers, such as hybrid CPU-GPU architectures; for this, the COSMO model has been adapted to run on such architectures [2]. We then alleviate the I/O bottleneck by employing a simulation data virtualizer (SDaVi) that allows storage (space) to be traded off against computational effort (time). This is achieved by caching the simulation outputs and efficiently launching re-simulations in case of cache misses, all transparently to the analysis applications [3]. For the re-runs this approach requires a bit-reproducible version of COSMO, that is, a model that produces identical results on different architectures to ensure coherent recomputation of the requested data [4]. In this contribution we present a version of SDaVi, a first performance model, and a strategy to obtain bit-reproducibility across hardware architectures.
[1] N. Ban, J. Schmidli, C. Schär. Evaluation of the convection-resolving regional climate modeling approach in decade-long simulations. J. Geophys. Res. Atmos., 7889-7907, 2014. [2] D. Leutwyler, O. Fuhrer, X. Lapillonne, D. Lüthi, C. Schär. Towards European-scale convection-resolving climate simulations with GPUs: a study with COSMO 4.19. Geosci. Model Dev., 3393-3412, 2016. [3] S. Di Girolamo, P. Schmid, T. Schulthess, T. Hoefler. Virtualized Big Data: Reproducing Simulation Output on Demand. Submitted to the 23rd ACM Symposium PPoPP '18, Vienna, Austria. [4] A. Arteaga, O. Fuhrer, T. Hoefler. Designing Bit-Reproducible Portable High-Performance Applications. IEEE 28th IPDPS, 2014.
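The SDaVi idea of trading storage for recomputation can be sketched as a cache in front of a deterministic re-simulation. The class and names below are illustrative; a trivial deterministic function stands in for a bit-reproducible COSMO run.

```python
# Sketch of data virtualization: analysis requests are served from a
# cache of stored output, and a cache miss triggers a re-simulation of
# the requested data. Names are illustrative; the stand-in "model" is a
# trivial deterministic function playing the role of a bit-reproducible
# COSMO run (recomputation must match the original output exactly).
class VirtualArchive:
    def __init__(self, resimulate):
        self.cache = {}
        self.resimulate = resimulate   # must be bit-reproducible
        self.misses = 0

    def get(self, timestep):
        if timestep not in self.cache:
            self.misses += 1
            self.cache[timestep] = self.resimulate(timestep)
        return self.cache[timestep]

archive = VirtualArchive(resimulate=lambda t: t * 0.5)
first = archive.get(10)    # miss -> recompute
second = archive.get(10)   # hit  -> served from cache
print(first, second, archive.misses)
```

Bit-reproducibility is what makes the cache transparent: a value recomputed on a different architecture must be indistinguishable from one read back from disk.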
NASA Technical Reports Server (NTRS)
Collins, W. D.; Ramaswamy, V.; Schwarzkopf, M. D.; Sun, Y.; Portmann, R. W.; Fu, Q.; Casanova, S. E. B.; Dufresne, J.-L.; Fillmore, D. W.; Forster, P. M. D.;
2006-01-01
The radiative effects from increased concentrations of well-mixed greenhouse gases (WMGHGs) represent the most significant and best understood anthropogenic forcing of the climate system. The most comprehensive tools for simulating past and future climates influenced by WMGHGs are fully coupled atmosphere-ocean general circulation models (AOGCMs). Because of the importance of WMGHGs as forcing agents it is essential that AOGCMs compute the radiative forcing by these gases as accurately as possible. We present the results of a radiative transfer model intercomparison between the forcings computed by the radiative parameterizations of AOGCMs and by benchmark line-by-line (LBL) codes. The comparison is focused on forcing by CO2, CH4, N2O, CFC-11, CFC-12, and the increased H2O expected in warmer climates. The models included in the intercomparison include several LBL codes and most of the global models submitted to the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4). In general, the LBL models are in excellent agreement with each other. However, in many cases, there are substantial discrepancies among the AOGCMs and between the AOGCMs and LBL codes. In some cases this is because the AOGCMs neglect particular absorbers, in particular the near-infrared effects of CH4 and N2O, while in others it is due to the methods for modeling the radiative processes. The biases in the AOGCM forcings are generally largest at the surface level. We quantify these differences and discuss the implications for interpreting variations in forcing and response across the multimodel ensemble of AOGCM simulations assembled for the IPCC AR4.
Constructing Scientific Arguments Using Evidence from Dynamic Computational Climate Models
NASA Astrophysics Data System (ADS)
Pallant, Amy; Lee, Hee-Sun
2015-04-01
Modeling and argumentation are two important scientific practices students need to develop throughout school years. In this paper, we investigated how middle and high school students (N = 512) construct a scientific argument based on evidence from computational models with which they simulated climate change. We designed scientific argumentation tasks with three increasingly complex dynamic climate models. Each scientific argumentation task consisted of four parts: multiple-choice claim, open-ended explanation, five-point Likert scale uncertainty rating, and open-ended uncertainty rationale. We coded 1,294 scientific arguments in terms of a claim's consistency with current scientific consensus, whether explanations were model-based or knowledge-based, and the sources of uncertainty (personal vs. scientific). We used chi-square and ANOVA tests to identify significant patterns. Results indicate that (1) a majority of students incorporated models as evidence to support their claims, (2) most students used model output results shown on graphs to confirm their claim rather than to explain simulated molecular processes, (3) students' dependence on model results and their uncertainty rating diminished as the dynamic climate models became more and more complex, (4) some students' misconceptions interfered with observing and interpreting model results or simulated processes, and (5) students' uncertainty sources reflected more frequently their assessment of personal knowledge or abilities related to the tasks than their critical examination of scientific evidence resulting from models. These findings have implications for teaching and research related to the integration of scientific argumentation and modeling practices to address complex Earth systems.
Wan, Jizhong; Wang, Chunjing; Yu, Jinghua; Nie, Siming; Han, Shijie; Zu, Yuangang; Chen, Changmei; Yuan, Shusheng; Wang, Qinggui
2014-01-01
Climate change affects both habitat suitability and the genetic diversity of wild plants. Therefore, predicting and establishing the most effective and coherent conservation areas is essential for the conservation of genetic diversity in response to climate change. This is because genetic variance is a product not only of habitat suitability in conservation areas but also of efficient protection and management. Phellodendron amurense Rupr. is a tree species (family Rutaceae) that is endangered due to excessive and illegal harvesting for use in Chinese medicine. Here, we test a general computational method for the prediction of priority conservation areas (PCAs) by measuring the genetic diversity of P. amurense across the entirety of northeast China using a simple sequence repeat analysis of twenty microsatellite markers. Using computational modeling, we evaluated the geographical distribution of the species, both now and in different future climate change scenarios. Different populations were analyzed according to genetic diversity, and PCAs were identified using a spatial conservation prioritization framework. These conservation areas were optimized to account for the geographical distribution of P. amurense both now and in the future, to effectively promote gene flow, and to have a long period of validity. In situ and ex situ conservation strategies for vulnerable populations were proposed. Three populations with low genetic diversity are predicted to be negatively affected by climate change, making conservation of genetic diversity challenging due to decreasing habitat suitability. Habitat suitability was important for the assessment of genetic variability in existing nature reserves, which were found to be much smaller than the proposed PCAs. Finally, a simple set of conservation measures was established through modeling. 
This combined molecular and computational ecology approach provides a framework for planning the protection of species endangered by climate change. PMID:25165526
Intercomparison of hydrologic processes in global climate models
NASA Technical Reports Server (NTRS)
Lau, W. K.-M.; Sud, Y. C.; Kim, J.-H.
1995-01-01
In this report, we address the intercomparison of precipitation (P), evaporation (E), and surface hydrologic forcing (P-E) for 23 Atmospheric Model Intercomparison Project (AMIP) general circulation models (GCMs), including relevant observations, over a variety of spatial and temporal scales. The intercomparison includes global and hemispheric means, latitudinal profiles, and selected area means for the tropics and extratropics, over ocean and land respectively. In addition, we have computed anomaly pattern correlations among models and observations for different seasons, harmonic analysis for annual and semiannual cycles, and rain-rate frequency distributions. We also compare the joint influence of temperature and precipitation on local climate using the Koeppen climate classification scheme.
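One of the diagnostics mentioned above, the anomaly pattern correlation between a model field and observations, can be computed as follows. The centred, cosine-of-latitude area weighting shown here is a common convention and an assumption on my part, not a detail given in the abstract:

```python
import numpy as np

def pattern_correlation(model, obs, lat):
    """Centred anomaly pattern correlation between two (lat, lon) fields,
    with grid boxes weighted by the cosine of latitude."""
    w = np.cos(np.deg2rad(lat))[:, None] * np.ones_like(model)
    w = w / w.sum()
    m = model - (w * model).sum()          # remove area-weighted means
    o = obs - (w * obs).sum()
    cov = (w * m * o).sum()
    return cov / np.sqrt((w * m**2).sum() * (w * o**2).sum())

# Toy anomaly fields on a 10 x 20 latitude-longitude grid
lat = np.linspace(-90.0, 90.0, 10)
rng = np.random.default_rng(0)
model_anom = rng.normal(size=(10, 20))
obs_anom = model_anom + 0.5 * rng.normal(size=(10, 20))
r = pattern_correlation(model_anom, obs_anom, lat)
```

A value near 1 indicates that the model reproduces the observed spatial pattern of anomalies, regardless of any uniform offset.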
Climate Model Diagnostic Analyzer
NASA Technical Reports Server (NTRS)
Lee, Seungwon; Pan, Lei; Zhai, Chengxing; Tang, Benyang; Kubar, Terry; Zhang, Zia; Wang, Wei
2015-01-01
The comprehensive and innovative evaluation of climate models with newly available global observations is critically needed for the improvement of climate model current-state representation and future-state predictability. A climate model diagnostic evaluation process requires physics-based multi-variable analyses that typically involve large-volume and heterogeneous datasets, making them both computation- and data-intensive. Given the exploratory nature of climate data analyses and the explosive growth of datasets and service tools, scientists are struggling to keep track of their datasets, tools, and execution/study history, let alone sharing them with others. In response, we have developed a cloud-enabled, provenance-supported, web-service system called Climate Model Diagnostic Analyzer (CMDA). CMDA enables physics-based, multivariable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. At the same time, CMDA provides a crowd-sourcing space where scientists can organize their work efficiently and share their work with others. CMDA is empowered by many current state-of-the-art software packages in web service, provenance, and semantic search.
Towards a unified Global Weather-Climate Prediction System
NASA Astrophysics Data System (ADS)
Lin, S. J.
2016-12-01
The Geophysical Fluid Dynamics Laboratory has been developing a unified regional-global modeling system with variable-resolution capabilities that can be used for severe weather predictions and kilometer-scale regional climate simulations within a unified global modeling system. The foundation of this flexible modeling system is the nonhydrostatic Finite-Volume Dynamical Core on the Cubed-Sphere (FV3). A unique aspect of FV3 is that it is "vertically Lagrangian" (Lin 2004), essentially reducing the equation sets to two dimensions, and this is the single most important reason why FV3 outperforms other non-hydrostatic cores. Owing to its accuracy, adaptability, and computational efficiency, the FV3 has been selected as the "engine" for NOAA's Next Generation Global Prediction System (NGGPS). We have built into the modeling system a stretched grid, a two-way regional-global nested grid, and an optimal combination of the stretched and two-way nesting capabilities, making kilometer-scale regional simulations within a global modeling system feasible. Our main scientific goal is to enable simulations of high-impact weather phenomena (such as tornadoes, thunderstorms, and category-5 hurricanes) within an IPCC-class climate modeling system, something previously regarded as impossible. In this presentation I will demonstrate that, with the FV3, it is computationally feasible to simulate not only super-cell thunderstorms, but also the subsequent genesis of tornado-like vortices, using a global model that was originally designed for climate simulations. The development and tuning strategies for traditional weather and climate models are fundamentally different because they rely on different metrics. We were able to adapt and use traditional "climate" metrics or standards, such as angular momentum conservation, energy conservation, and flux balance at the top of the atmosphere, to gain insight into problems of traditional weather prediction models for medium-range weather prediction, and vice versa. 
Therefore, the unification in weather and climate models can happen not just at the algorithm or parameterization level, but also in the metric and tuning strategy used for both applications, and ultimately, with benefits to both weather and climate applications.
Extreme storm surge and wind wave climate scenario simulations at the Venetian littoral
NASA Astrophysics Data System (ADS)
Lionello, P.; Galati, M. B.; Elvini, E.
Scenario climate projections for extreme marine storms producing storm surges and wind waves are very important for the northern flat coast of the Adriatic Sea, where the area at risk includes a unique cultural and environmental heritage, and important economic activities. This study uses a shallow water model and a spectral wave model for computing the storm surge and the wind wave field, respectively, from the sea level pressure and wind fields that have been computed by the RegCM regional climate model. Simulations cover the period 1961-1990 for the present climate (control simulations) and the period 2071-2100 for the A2 and B2 scenarios. Generalized Extreme Value analysis is used for estimating values for the 10- and 100-year return periods. These modeling tools are shown to be adequate for a reliable estimation of the climate change signal without further downscaling. However, this study has mainly a methodological value, because issues such as interdecadal variability and intermodel variability cannot be addressed, since the analysis is based on single-model 30-year-long simulations. The control simulation looks reasonably accurate for extreme value analysis, though it overestimates/underestimates the frequency of high/low surge and wind wave events with respect to observations. Scenario simulations suggest a higher frequency of intense storms for the B2 scenario, but not for the A2. Likely, these differences are not the effect of climate change, but of climate multidecadal variability. Extreme storms are stronger in future scenarios, but the differences are not statistically significant. Therefore this study does not provide convincing evidence for more stormy conditions in future scenarios.
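The Generalized Extreme Value step can be illustrated with SciPy. The synthetic annual maxima and their parameters below are invented for illustration and do not reflect the study's data:

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic 30-year series of annual surge maxima (metres); the GEV
# parameters used to generate them are purely illustrative
rng = np.random.default_rng(42)
annual_maxima = genextreme.rvs(c=-0.1, loc=1.0, scale=0.3,
                               size=30, random_state=rng)

# Fit a GEV to the annual maxima, as for a 30-year control simulation
shape, loc, scale = genextreme.fit(annual_maxima)

def return_level(T):
    """Level exceeded on average once every T years."""
    return genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)

rl10, rl100 = return_level(10.0), return_level(100.0)
```

With only 30 annual maxima, the 100-year return level is an extrapolation well beyond the data, which is one reason the paper's scenario differences fail to reach statistical significance.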
ENES the European Network for Earth System modelling and its infrastructure projects IS-ENES
NASA Astrophysics Data System (ADS)
Guglielmo, Francesca; Joussaume, Sylvie; Parinet, Marie
2016-04-01
The scientific community working on climate modelling is organized within the European Network for Earth System modelling (ENES). In the past decade, several European university departments, research centres, meteorological services, computer centres, and industrial partners engaged in the creation of ENES with the purpose of working together and cooperating towards the further development of the network, by signing a Memorandum of Understanding. As of 2015, the consortium counts 47 partners. The climate modelling community, and thus ENES, faces challenges that are both science-driven, i.e. analysing the full complexity of the Earth system to improve our understanding and prediction of climate change, and have multi-faceted societal implications, as a better representation of climate change on regional scales leads to improved understanding and prediction of impacts and to the development and provision of climate services. ENES, promoting and endorsing projects and initiatives, helps develop and evaluate state-of-the-art climate and Earth system models, facilitates model intercomparison studies, encourages exchanges of software and model results, and fosters the use of high-performance computing facilities dedicated to high-resolution multi-model experiments. ENES brings together public and private partners, integrates countries underrepresented in climate modelling studies, and reaches out to different user communities, thus enhancing European expertise and competitiveness. Meeting this need for sophisticated models, world-class high-performance computers, and state-of-the-art software solutions that make efficient use of models, data, and hardware requires the constitution and maintenance of a solid infrastructure that develops and provides services to the different user communities. 
ENES has investigated the infrastructural needs and has received funding from the EU FP7 program for the IS-ENES (InfraStructure for ENES) phase I and II projects. We present here the case study of an existing network of institutions, ENES, brought together toward common goals by a non-binding agreement, and of its two IS-ENES projects. The latter will be discussed in their double role: as a means to provide and/or maintain the actual infrastructure (hardware, software, skilled human resources, services) needed to achieve ENES scientific goals, fulfilling the aims set in a strategy document, and as a way to inform the network and give it a structured way of working and of interacting with the extended community. The genesis and evolution of the network and the network/project interaction will also be analysed in terms of long-term sustainability.
NASA Astrophysics Data System (ADS)
Ercan, Mehmet Bulent
Watershed-scale hydrologic models are used for a variety of applications from flood prediction, to drought analysis, to water quality assessments. A particular challenge in applying these models is calibration of the model parameters, many of which are difficult to measure at the watershed-scale. A primary goal of this dissertation is to contribute new computational methods and tools for calibration of watershed-scale hydrologic models and the Soil and Water Assessment Tool (SWAT) model, in particular. SWAT is a physically-based, watershed-scale hydrologic model developed to predict the impact of land management practices on water quality and quantity. The dissertation follows a manuscript format meaning it is comprised of three separate but interrelated research studies. The first two research studies focus on SWAT model calibration, and the third research study presents an application of the new calibration methods and tools to study climate change impacts on water resources in the Upper Neuse Watershed of North Carolina using SWAT. The objective of the first two studies is to overcome computational challenges associated with calibration of SWAT models. The first study evaluates a parallel SWAT calibration tool built using the Windows Azure cloud environment and a parallel version of the Dynamically Dimensioned Search (DDS) calibration method modified to run in Azure. The calibration tool was tested for six model scenarios constructed using three watersheds of increasing size (the Eno, Upper Neuse, and Neuse) for both a 2 year and 10 year simulation duration. Leveraging the cloud as an on demand computing resource allowed for a significantly reduced calibration time such that calibration of the Neuse watershed went from taking 207 hours on a personal computer to only 3.4 hours using 256 cores in the Azure cloud. 
The second study aims at increasing SWAT model calibration efficiency by creating an open source, multi-objective calibration tool using the Non-Dominated Sorting Genetic Algorithm II (NSGA-II). This tool was demonstrated through an application for the Upper Neuse Watershed in North Carolina, USA. The objective functions used for the calibration were Nash-Sutcliffe (E) and Percent Bias (PB), and the objective sites were the Flat, Little, and Eno watershed outlets. The results show that the use of multi-objective calibration algorithms for SWAT calibration improved model performance especially in terms of minimizing PB compared to the single objective model calibration. The third study builds upon the first two studies by leveraging the new calibration methods and tools to study future climate impacts on the Upper Neuse watershed. Statistically downscaled outputs from eight Global Circulation Models (GCMs) were used for both low and high emission scenarios to drive a well calibrated SWAT model of the Upper Neuse watershed. The objective of the study was to understand the potential hydrologic response of the watershed, which serves as a public water supply for the growing Research Triangle Park region of North Carolina, under projected climate change scenarios. The future climate change scenarios, in general, indicate an increase in precipitation and temperature for the watershed in coming decades. The SWAT simulations using the future climate scenarios, in general, suggest an increase in soil water and water yield, and a decrease in evapotranspiration within the Upper Neuse watershed. 
In summary, this dissertation advances the field of watershed-scale hydrologic modeling by (i) providing some of the first work to apply cloud computing for the computationally demanding task of model calibration; (ii) providing a new, open source library that can be used by SWAT modelers to perform multi-objective calibration of their models; and (iii) advancing understanding of climate change impacts on water resources for an important watershed in the Research Triangle Park region of North Carolina. The third study leveraged the methodological advances presented in the first two studies. Therefore, the dissertation contains three independent but interrelated studies that collectively advance the field of watershed-scale hydrologic modeling and analysis.
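The two objective functions named in the multi-objective calibration study, Nash-Sutcliffe efficiency (E) and Percent Bias (PB), have compact standard definitions. A minimal sketch follows; note that the sign convention for PB varies in the literature, and the one used here is an assumption:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency E: 1 is a perfect fit; 0 means the model
    is no better than predicting the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def percent_bias(obs, sim):
    """Percent bias PB: 0 is ideal; positive values indicate average
    underprediction under this (common, but not universal) sign convention."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

# Toy daily streamflow (m^3/s) at a watershed outlet
obs = np.array([10.0, 12.0, 30.0, 22.0, 15.0, 11.0])
sim = np.array([9.0, 13.0, 26.0, 24.0, 14.0, 12.0])
e, pb = nash_sutcliffe(obs, sim), percent_bias(obs, sim)
```

In an NSGA-II setting, each candidate parameter set is scored on both metrics at each objective site, and the algorithm maintains the non-dominated front rather than collapsing the objectives into a single score.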
Climate Analytics as a Service. Chapter 11
NASA Technical Reports Server (NTRS)
Schnase, John L.
2016-01-01
Exascale computing, big data, and cloud computing are driving the evolution of large-scale information systems toward a model of data-proximal analysis. In response, we are developing a concept of climate analytics as a service (CAaaS) that represents a convergence of data analytics and archive management. With this approach, high-performance compute-storage implemented as an analytic system is part of a dynamic archive comprising both static and computationally realized objects. It is a system whose capabilities are framed as behaviors over a static data collection, but where queries cause results to be created, not found and retrieved. Those results can be the product of a complex analysis, but, importantly, they also can be tailored responses to the simplest of requests. NASA's MERRA Analytic Service and associated Climate Data Services API provide a real-world example of climate analytics delivered as a service in this way. Our experiences reveal several advantages to this approach, not the least of which is orders-of-magnitude time reduction in the data assembly task common to many scientific workflows.
Improved Climate Simulations through a Stochastic Parameterization of Ocean Eddies
NASA Astrophysics Data System (ADS)
Williams, Paul; Howe, Nicola; Gregory, Jonathan; Smith, Robin; Joshi, Manoj
2017-04-01
In climate simulations, the impacts of the subgrid scales on the resolved scales are conventionally represented using deterministic closure schemes, which assume that the impacts are uniquely determined by the resolved scales. Stochastic parameterization relaxes this assumption, by sampling the subgrid variability in a computationally inexpensive manner. This study shows that the simulated climatological state of the ocean is improved in many respects by implementing a simple stochastic parameterization of ocean eddies into a coupled atmosphere-ocean general circulation model. Simulations from a high-resolution, eddy-permitting ocean model are used to calculate the eddy statistics needed to inject realistic stochastic noise into a low-resolution, non-eddy-permitting version of the same model. A suite of four stochastic experiments is then run to test the sensitivity of the simulated climate to the noise definition by varying the noise amplitude and decorrelation time within reasonable limits. The addition of zero-mean noise to the ocean temperature tendency is found to have a nonzero effect on the mean climate. Specifically, in terms of the ocean temperature and salinity fields both at the surface and at depth, the noise reduces many of the biases in the low-resolution model and causes it to more closely resemble the high-resolution model. The variability of the strength of the global ocean thermohaline circulation is also improved. It is concluded that stochastic ocean perturbations can yield reductions in climate model error that are comparable to those obtained by refining the resolution, but without the increased computational cost. Therefore, stochastic parameterizations of ocean eddies have the potential to significantly improve climate simulations. Reference Williams PD, Howe NJ, Gregory JM, Smith RS, and Joshi MM (2016) Improved Climate Simulations through a Stochastic Parameterization of Ocean Eddies. Journal of Climate, 29, 8763-8781. 
http://dx.doi.org/10.1175/JCLI-D-15-0746.1
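The kind of zero-mean stochastic perturbation described above is often generated as red (AR(1)) noise with a prescribed amplitude and decorrelation time, the two knobs varied in the sensitivity suite. The sketch below is a generic illustration of that construction, not the authors' scheme:

```python
import numpy as np

def ar1_noise(n_steps, shape, sigma, tau, dt, seed=0):
    """Zero-mean AR(1) ('red') noise field with standard deviation `sigma`
    and decorrelation time `tau`, on a grid of the given `shape`."""
    rng = np.random.default_rng(seed)
    phi = np.exp(-dt / tau)                       # lag-1 autocorrelation
    eta = np.zeros((n_steps,) + shape)
    for t in range(1, n_steps):
        innovation = rng.standard_normal(shape)
        eta[t] = phi * eta[t - 1] + sigma * np.sqrt(1.0 - phi**2) * innovation
    return eta

# Perturbation added to a (toy) ocean temperature tendency each step
noise = ar1_noise(n_steps=1000, shape=(4, 8), sigma=0.05, tau=5.0, dt=1.0)
```

Although the noise has zero mean, nonlinearity in the model means its long-term effect on the simulated climate need not be zero, which is exactly the rectification effect the study exploits.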
NASA Technical Reports Server (NTRS)
Goldsmith, V.; Morris, W. D.; Byrne, R. J.; Whitlock, C. H.
1974-01-01
A computerized wave climate model is developed that applies linear wave theory and shelf depth information to predict the behavior of waves as they pass over the continental shelf, as well as the resulting wave energy distributions along the coastline. Also reviewed are the geomorphology of the Mid-Atlantic Continental Shelf, wave computations resulting from 122 wave input conditions, and a preliminary analysis of these data.
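Linear wave theory of the sort applied in this model rests on the dispersion relation omega^2 = g k tanh(k h), which must be solved numerically for the wavenumber k at each shelf depth. A minimal sketch follows; the fixed-point iteration and the example wave period and depths are illustrative choices, not details from the report:

```python
import numpy as np

def wavenumber(T, h, g=9.81, n_iter=60):
    """Solve the linear dispersion relation omega^2 = g k tanh(k h)
    for the wavenumber k, by fixed-point iteration from the deep-water
    guess k = omega^2 / g."""
    omega = 2.0 * np.pi / T
    k = omega**2 / g
    for _ in range(n_iter):
        k = omega**2 / (g * np.tanh(k * h))
    return k

# A 10 s swell shortens as it crosses onto a 10 m deep shelf
k_deep = wavenumber(10.0, 1000.0)      # essentially deep water
k_shelf = wavenumber(10.0, 10.0)
wavelength_deep = 2.0 * np.pi / k_deep
wavelength_shelf = 2.0 * np.pi / k_shelf
```

Depth-dependent changes in wavenumber like this one are what refract waves over shelf bathymetry and redistribute wave energy along the coastline.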
Energy Exascale Earth System Model (E3SM) Project Strategy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bader, D.
The E3SM project will assert and maintain an international scientific leadership position in the development of Earth system and climate models at the leading edge of scientific knowledge and computational capabilities. With its collaborators, it will demonstrate its leadership by using these models to achieve the goal of designing, executing, and analyzing climate and Earth system simulations that address the most critical scientific questions for the nation and DOE.
NASA Astrophysics Data System (ADS)
Mudelsee, Manfred
2015-04-01
The Big Data era has begun in the climate sciences as well, not only in economics or molecular biology. We measure climate at increasing spatial resolution by means of satellites and look farther back in time at increasing temporal resolution by means of natural archives and proxy data. We use powerful supercomputers to run climate models. The model output of the calculations made for the IPCC's Fifth Assessment Report amounts to ~650 TB. The 'scientific evolution' of grid computing has started, and the 'scientific revolution' of quantum computing is being prepared. This will increase computing power, and data amount, by several orders of magnitude in the future. However, more data does not automatically mean more knowledge. We need statisticians, who are at the core of transforming data into knowledge. Statisticians notably also explore the limits of our knowledge (uncertainties, that is, confidence intervals and P-values). Mudelsee (2014: Climate Time Series Analysis: Classical Statistical and Bootstrap Methods. Second edition. Springer, Cham, xxxii + 454 pp.) coined the term 'optimal estimation'. Consider the hyperspace of climate estimation. It has many, but not infinitely many, dimensions. It consists of the three subspaces Monte Carlo design, method, and measure. The Monte Carlo design describes the data-generating process. The method subspace describes the estimation and confidence interval construction. The measure subspace describes how to detect the optimal estimation method for the Monte Carlo experiment. The envisaged large increase in computing power may bring the following idea of optimal climate estimation into existence. Given a data sample, some prior information (e.g. measurement standard errors) and a set of questions (parameters to be estimated), the first task is simple: perform an initial estimation on the basis of existing knowledge and experience with such types of estimation problems. 
The second task requires the computing power: explore the hyperspace to find the suitable method, that is, the mode of estimation and uncertainty-measure determination that optimizes a selected measure for prescribed values close to the initial estimates. Here, too, intelligent exploration methods (gradient, Brent, etc.) are useful. The third task is to apply the optimal estimation method to the climate dataset. This conference paper illustrates by means of three examples that optimal estimation has the potential to shape future big climate data analysis. First, we consider various hypothesis tests to study whether climate extremes are increasing in their occurrence. Second, we compare Pearson's and Spearman's correlation measures. Third, we introduce a novel estimator of the tail index, which helps to better quantify climate-change-related risks.
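The paper's novel tail-index estimator is not specified in the abstract; as a baseline, the classical Hill estimator such a proposal would typically be compared against looks like this (the Pareto test sample is synthetic, with a known tail index for checking):

```python
import numpy as np

def hill_estimator(x, k):
    """Hill estimator of the tail index alpha from the k largest values."""
    x = np.sort(np.asarray(x, dtype=float))
    excess_logs = np.log(x[-k:] / x[-k - 1])   # log-excesses over threshold
    return 1.0 / excess_logs.mean()

# Synthetic heavy-tailed sample with known tail index alpha = 3
rng = np.random.default_rng(1)
sample = rng.pareto(3.0, size=100_000) + 1.0   # classical Pareto(3)
alpha_hat = hill_estimator(sample, k=2_000)
```

The choice of k trades bias against variance, which is precisely the kind of tuning the 'optimal estimation' hyperspace search described above is meant to automate.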
DOT National Transportation Integrated Search
2017-05-01
Climate change introduces infrastructure flooding challenges, especially for coastal regions with low topographic relief. More frequently occurring intense storms and sea level rise are two projected impacts of climate change that will lead to increa...
The use of perturbed physics ensembles and emulation in palaeoclimate reconstruction (Invited)
NASA Astrophysics Data System (ADS)
Edwards, T. L.; Rougier, J.; Collins, M.
2010-12-01
Climate is a coherent process, with correlations and dependencies across space, time, and climate variables. However, reconstructions of palaeoclimate traditionally consider individual pieces of information independently, rather than making use of this covariance structure. Such reconstructions are at risk of being unphysical or at least implausible. Climate simulators such as General Circulation Models (GCMs), on the other hand, contain climate system theory in the form of dynamical equations describing physical processes, but are imperfect and computationally expensive. These two datasets - pointwise palaeoclimate reconstructions and climate simulator evaluations - contain complementary information, and a statistical synthesis can produce a palaeoclimate reconstruction that combines them while not ignoring their limitations. We use an ensemble of simulators with perturbed parameterisations, to capture the uncertainty about the simulator variant, and our method also accounts for structural uncertainty. The resulting reconstruction contains a full expression of climate uncertainty, not just pointwise but also jointly over locations. Such joint information is crucial in determining spatially extensive features such as isotherms, or the location of the tree-line. A second outcome of the statistical analysis is a refined distribution for the simulator parameters. In this way, information from palaeoclimate observations can be used directly in quantifying uncertainty in future climate projections. The main challenge is the expense of running a large scale climate simulator: each evaluation of an atmosphere-ocean GCM takes several months of computing time. The solution is to interpret the ensemble of evaluations within an 'emulator', which is a statistical model of the simulator. 
This technique has been used fruitfully in the statistical field of Computer Models for two decades, and has recently been applied in estimating uncertainty in future climate predictions in the UKCP09 (http://ukclimateprojections.defra.gov.uk). But only in the last couple of years has it developed to the point where it can be applied to large-scale spatial fields. We construct an emulator for the mid-Holocene (6000 calendar years BP) temperature anomaly over North America, at the resolution of our simulator (2.5° latitude by 3.75° longitude). This allows us to explore the behaviour of simulator variants that we could not afford to evaluate directly. We introduce the technique of 'co-emulation' of two versions of the climate simulator: the coupled atmosphere-ocean model HadCM3, and an equivalent with a simplified ocean, HadSM3. Running two different versions of a simulator is a powerful tool for increasing the information yield from a fixed budget of computer time, but the results must be combined statistically to account for the reduced fidelity of the quicker version. Emulators provide the appropriate framework.
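An emulator in the sense used above is a statistical surrogate fitted to a handful of expensive simulator runs. A minimal Gaussian-process sketch with a squared-exponential kernel follows; the kernel choice, length-scale, and one-parameter toy 'simulator' are illustrative assumptions, far simpler than emulating a gridded GCM field:

```python
import numpy as np

def rbf_kernel(A, B, length=0.2, var=1.0):
    """Squared-exponential covariance between parameter settings."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return var * np.exp(-0.5 * d2 / length**2)

def emulate(X_train, y_train, X_new, jitter=1e-6):
    """Gaussian-process posterior mean at X_new, given simulator runs."""
    K = rbf_kernel(X_train, X_train) + jitter * np.eye(len(X_train))
    return rbf_kernel(X_new, X_train) @ np.linalg.solve(K, y_train)

# Stand-in 'simulator': one perturbed parameter -> one scalar output
X = np.linspace(0.0, 1.0, 8)[:, None]    # 8 affordable simulator runs
y = np.sin(2.0 * np.pi * X[:, 0])        # pretend each run cost months
y_mid = emulate(X, y, np.array([[0.5]]))[0]   # untried parameter setting
y_at_train = emulate(X, y, X[3:4])[0]         # should reproduce run 3
```

Once fitted, the emulator can be evaluated at thousands of parameter settings that would be unaffordable to simulate directly, which is what makes perturbed-physics uncertainty analysis tractable.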
A Climate Statistics Tool and Data Repository
NASA Astrophysics Data System (ADS)
Wang, J.; Kotamarthi, V. R.; Kuiper, J. A.; Orr, A.
2017-12-01
Researchers at Argonne National Laboratory and collaborating organizations have generated regional scale, dynamically downscaled climate model output using Weather Research and Forecasting (WRF) version 3.3.1 at a 12km horizontal spatial resolution over much of North America. The WRF model is driven by boundary conditions obtained from three independent global scale climate models and two different future greenhouse gas emission scenarios, named representative concentration pathways (RCPs). The repository of results has a temporal resolution of three hours for all the simulations, includes more than 50 variables, is stored in Network Common Data Form (NetCDF) files, and the data volume is nearly 600 TB. A condensed 800 GB set of NetCDF files was made for selected variables most useful for climate-related planning, including daily precipitation, relative humidity, solar radiation, maximum temperature, minimum temperature, and wind. The WRF model simulations are conducted for three 10-year time periods (1995-2004, 2045-2054, and 2085-2094) and two future scenarios (RCP4.5 and RCP8.5). An open-source tool was coded using Python 2.7.8 and ESRI ArcGIS 10.3.1 programming libraries to parse the NetCDF files, compute summary statistics, and output results as GIS layers. Eight sets of summary statistics were generated as examples for the contiguous U.S. states and much of Alaska, including number of days over 90°F, number of days with a heat index over 90°F, heat waves, monthly and annual precipitation, drought, extreme precipitation, multi-model averages, and model bias. This paper will provide an overview of the project to generate the main and condensed data repositories, describe the Python tool and how to use it, present the GIS results of the computed examples, and discuss some of the ways they can be used for planning. The condensed climate data, Python tool, computed GIS results, and documentation of the work are shared on the Internet.
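The core of a summary statistic such as 'number of days over 90°F' is a simple reduction over the daily axis. A minimal NumPy sketch is below; the actual tool parses NetCDF files and writes ArcGIS layers, which is omitted here, and the toy temperature grid is invented:

```python
import numpy as np

def days_over_threshold(tmax_f, threshold=90.0):
    """Count days per grid cell with daily maximum temperature above
    `threshold`. `tmax_f` has shape (days, lat, lon), in Fahrenheit."""
    return (tmax_f > threshold).sum(axis=0)

# Toy year of daily maxima for a 3 x 4 grid, in degrees Fahrenheit
rng = np.random.default_rng(7)
tmax = rng.normal(loc=75.0, scale=12.0, size=(365, 3, 4))
hot_days = days_over_threshold(tmax)
```

The same reduction pattern, applied per decade and per scenario, yields the comparison layers (e.g. 2045-2054 RCP8.5 minus 1995-2004) used for planning.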
A solar radiation model for use in climate studies
NASA Technical Reports Server (NTRS)
Chou, Ming-Dah
1992-01-01
A solar radiation routine is developed for use in climate studies that includes absorption and scattering due to ozone, water vapor, oxygen, carbon dioxide, clouds, and aerosols. Rayleigh scattering is also included. Broadband parameterization is used to compute the absorption by water vapor in a clear atmosphere, and the k-distribution method is applied to compute fluxes in a scattering atmosphere. The reflectivity and transmissivity of a scattering layer are computed analytically using the delta-four-stream discrete-ordinate approximation. The two-stream adding method is then applied to compute fluxes for a composite of clear and scattering layers. Compared to the results of high-spectral-resolution and detailed multiple-scattering calculations, fluxes and heating rates are computed accurately to within a few percent. The high accuracy of the flux and heating-rate calculations is achieved with a reasonable amount of computing time. With the UV and visible region grouped into four bands, this solar radiation routine is useful not only for climate studies but also for studies on photolysis in the upper atmosphere and photosynthesis in the biosphere.
Climate Science's Globally Distributed Infrastructure
NASA Astrophysics Data System (ADS)
Williams, D. N.
2016-12-01
The Earth System Grid Federation (ESGF) is primarily funded by the Department of Energy's (DOE's) Office of Science (the Office of Biological and Environmental Research [BER] Climate Data Informatics Program and the Office of Advanced Scientific Computing Research Next Generation Network for Science Program), the National Oceanic and Atmospheric Administration (NOAA), the National Aeronautics and Space Administration (NASA), and the National Science Foundation (NSF), the European Infrastructure for the European Network for Earth System Modeling (IS-ENES), and the Australian National University (ANU). Support also comes from other U.S. federal and international agencies. The federation works across multiple worldwide data centers and spans seven international network organizations to provide users with the ability to access, analyze, and visualize data using a globally federated collection of networks, computers, and software. Its architecture employs a series of geographically distributed peer nodes that are independently administered and united by common federation protocols and application programming interfaces (APIs). The full ESGF infrastructure has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the Coupled Model Intercomparison Project (CMIP; output used by the Intergovernmental Panel on Climate Change assessment reports), multiple model intercomparison projects (MIPs; endorsed by the World Climate Research Programme [WCRP]), and the Accelerated Climate Modeling for Energy (ACME; ESGF is included in the overarching ACME workflow process to store model output). ESGF is a successful example of integration of disparate open-source technologies into a cohesive functional system that serves the needs of the global climate science community. Data served by ESGF includes not only model output but also observational data from satellites and instruments, reanalysis, and generated images.
NASA Technical Reports Server (NTRS)
North, G. R.; Crowley, T. J.
1984-01-01
Mathematical climate modelling has matured as a discipline to the point that it is useful in paleoclimatology. As an example a new two dimensional energy balance model is described and applied to several problems of current interest. The model includes the seasonal cycle and the detailed land-sea geographical distribution. By examining the changes in the seasonal cycle when external perturbations are forced upon the climate system it is possible to construct hypotheses about the origin of midlatitude ice sheets and polar ice caps. In particular the model predicts a rather sudden potential for glaciation over large areas when the Earth's orbital elements are only slightly altered. Similarly, the drift of continents or the change of atmospheric carbon dioxide over geological time induces radical changes in continental ice cover. With the advance of computer technology and improved understanding of the individual components of the climate system, these ideas will be tested in far more realistic models in the near future.
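The sudden glaciation potential mentioned above arises from the ice-albedo feedback, which even a zero-dimensional energy balance model reproduces as multiple equilibria. The toy sketch below uses common textbook constants and a ramped albedo, not the 2-D seasonal model of the abstract:

```python
import numpy as np

def albedo(T):
    """Ramped ice-albedo feedback: fully glaciated below 250 K (albedo
    0.62), ice-free above 280 K (albedo 0.30); values are illustrative."""
    return np.interp(T, [250.0, 280.0], [0.62, 0.30])

def equilibrium_T(T0, S=1361.0, A=203.3, B=2.09, n_steps=20_000):
    """March dT/dt = S/4 (1 - albedo) - (A + B (T - 273.15)) to balance.
    A + B (T - 273.15) is a linearised outgoing longwave flux, W m^-2."""
    T = T0
    for _ in range(n_steps):
        net = S / 4.0 * (1.0 - albedo(T)) - (A + B * (T - 273.15))
        T += 0.01 * net                  # pseudo-time step toward balance
    return T

T_warm = equilibrium_T(288.0)   # starts warm: settles ice-free near 290 K
T_cold = equilibrium_T(230.0)   # starts glaciated: stays near 238 K
```

The same insolation supports two stable states; a small orbital perturbation that nudges the system across the albedo ramp can therefore trigger a large, abrupt change in ice cover, the mechanism the 2-D model explores with realistic geography.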
DOE unveils climate model in advance of global test
NASA Astrophysics Data System (ADS)
Popkin, Gabriel
2018-05-01
The world's growing collection of climate models has a high-profile new entry. Last week, after nearly 4 years of work, the U.S. Department of Energy (DOE) released computer code and initial results from an ambitious effort to simulate the Earth system. The new model is tailored to run on future supercomputers and designed to forecast not just how climate will change, but also how those changes might stress energy infrastructure. Results from an upcoming comparison of global models may show how well the new entrant works. But so far it is getting a mixed reception, with some questioning the need for another model and others saying the $80 million effort has yet to improve predictions of the future climate. Even the project's chief scientist, Ruby Leung of the Pacific Northwest National Laboratory in Richland, Washington, acknowledges that the model is not yet a leader.
Measures of GCM Performance as Functions of Model Parameters Affecting Clouds and Radiation
NASA Astrophysics Data System (ADS)
Jackson, C.; Mu, Q.; Sen, M.; Stoffa, P.
2002-05-01
This abstract is one of three related presentations at this meeting dealing with several issues surrounding optimal parameter and uncertainty estimation of model predictions of climate. Uncertainty in model predictions of climate depends in part on the uncertainty produced by model approximations or parameterizations of unresolved physics. Evaluating these uncertainties is computationally expensive because one needs to evaluate how arbitrary choices for any given combination of model parameters affect model performance. Because the computational effort grows exponentially with the number of parameters being investigated, it is important to choose parameters carefully. Evaluating whether a parameter is worth investigating depends on two considerations: 1) do reasonable choices of parameter values produce a large range in model response relative to observational uncertainty? and 2) does the model response depend non-linearly on various combinations of model parameters? We have decided to narrow our attention to parameters that affect clouds and radiation, as it is likely that these parameters will dominate uncertainties in model predictions of future climate. We present preliminary results of ~20 to 30 AMIPII-style climate model integrations using NCAR's CCM3.10 that show model performance as a function of individual parameters controlling 1) the critical relative humidity for cloud formation (RHMIN) and 2) the boundary layer critical Richardson number (RICR). We also explore various definitions of model performance that include some or all observational data sources (surface air temperature and pressure, meridional and zonal winds, clouds, long- and short-wave cloud forcings, etc.) and evaluate in a few select cases whether the model's response depends non-linearly on the parameter values we have selected.
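The parameter-scan idea in this abstract can be sketched as a grid of cost-function evaluations. The toy response surface and the observational values below are invented stand-ins for the expensive CCM3.10 integrations; only the structure of the scan reflects the abstract.

```python
import numpy as np

def cost(sim, obs, sigma_obs):
    """Observation-normalized squared misfit, as in optimal-estimation studies."""
    return np.mean(((sim - obs) / sigma_obs) ** 2)

def toy_model(rhmin, ricr):
    # Stand-in for a full GCM run: a smooth, mildly nonlinear response.
    return 288.0 + 5.0 * (rhmin - 0.9) + 3.0 * (ricr - 0.3) ** 2

obs, sigma = 288.0, 0.5
rhmin_grid = np.linspace(0.80, 0.99, 20)   # critical RH for cloud formation
ricr_grid = np.linspace(0.1, 1.0, 10)      # boundary-layer critical Richardson number

# One "model integration" per grid point; in the real study each point
# is an expensive climate simulation, hence the need to choose parameters well.
skill = np.array([[cost(toy_model(r, ri), obs, sigma)
                   for ri in ricr_grid] for r in rhmin_grid])
best = np.unravel_index(np.argmin(skill), skill.shape)
```

The exponential growth in cost with parameter count is visible here: adding a third scanned parameter with 10 values would multiply the number of runs by 10.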
Milly, P.C.D.; Dunne, K.A.
2011-01-01
Hydrologic models often are applied to adjust projections of hydroclimatic change that come from climate models. Such adjustment includes climate-bias correction, spatial refinement ("downscaling"), and consideration of the roles of hydrologic processes that were neglected in the climate model. Described herein is a quantitative analysis of the effects of hydrologic adjustment on the projections of runoff change associated with projected twenty-first-century climate change. In a case study including three climate models and 10 river basins in the contiguous United States, the authors find that relative (i.e., fractional or percentage) runoff change computed with hydrologic adjustment more often than not was less positive (or, equivalently, more negative) than what was projected by the climate models. The dominant contributor to this decrease in runoff was a ubiquitous change in runoff (median −11%) caused by the hydrologic model's apparent amplification of the climate-model-implied growth in potential evapotranspiration. Analysis suggests that the hydrologic model, on the basis of the empirical, temperature-based modified Jensen-Haise formula, calculates a change in potential evapotranspiration that is typically 3 times the change implied by the climate models, which explicitly track surface energy budgets. In comparison with the amplification of potential evapotranspiration, central tendencies of other contributions from hydrologic adjustment (spatial refinement, climate-bias adjustment, and process refinement) were relatively small. The authors' findings highlight the need for caution when projecting changes in potential evapotranspiration for use in hydrologic models or drought indices to evaluate climate-change impacts on water.
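The amplification mechanism described above arises because temperature-based PET formulas scale directly with warming. A schematic Jensen-Haise-type calculation (illustrative coefficients, not the study's modified formula) shows how a fixed temperature increase translates into a PET increase:

```python
# Schematic temperature-based potential evapotranspiration (PET).
# Jensen-Haise-type formulas are linear in temperature times incoming solar
# radiation; the coefficients ct and tx below are illustrative only.
def pet_jensen_haise_like(t_mean_c, solar_mj, ct=0.025, tx=-3.0):
    """PET (mm/day): linear-in-temperature factor times solar radiation."""
    return max(0.0, ct * (t_mean_c - tx) * solar_mj)

t_now, t_future = 15.0, 18.0   # a 3 C warming scenario
rs = 20.0                      # incoming solar radiation, MJ m^-2 day^-1

change = (pet_jensen_haise_like(t_future, rs)
          - pet_jensen_haise_like(t_now, rs))
```

Because the formula responds only to temperature and radiation, it cannot "see" the surface energy budget constraints that the climate models track explicitly, which is the root of the 3x amplification the abstract reports.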
NASA Astrophysics Data System (ADS)
Gordova, Yulia; Okladnikov, Igor; Titov, Alexander; Gordov, Evgeny
2016-04-01
While there is a strong demand for innovation in digital learning, available training programs in the environmental sciences have no time to adapt to rapid changes in the domain content. A joint group of scientists and university teachers develops and implements an educational environment for new learning experiences in the basics of climate science and its applications. This so-called virtual learning laboratory "Climate" contains educational materials and interactive training courses developed to provide undergraduate and graduate students with a profound understanding of changes in regional climate and environment. The main feature of this laboratory is that students perform their computational tasks on climate modeling and on the evaluation and assessment of climate change using the typical tools of the "Climate" information-computational system, the same tools used by real-life practitioners performing this kind of research. Students can perform computational laboratory work using the information-computational tools of the system and improve their skills in using them while mastering the subject. We did not create an artificial learning environment for the trainings. On the contrary, the main purpose of combining the educational block with the computational information system was to familiarize students with the actual technologies for monitoring and analyzing data on the state of the climate. Trainings are based on technologies and procedures typical of the Earth system sciences. Educational courses are designed to permit students to conduct their own investigations of ongoing and future climate changes in a manner that is essentially identical to the techniques used by national and international climate research organizations.
All trainings are supported by lectures devoted to the basic aspects of modern climatology, including analysis of current climate change and its possible impacts, ensuring effective links between theory and practice. Along with its usage in graduate and postgraduate education, "Climate" is used as a framework for a basic information course on climate change aimed at the general public. In this course, basic concepts and problems of modern climate change and its possible consequences are described for non-specialists. The course will also include links to relevant information resources on topical issues of the Earth sciences and a number of case studies carried out for a selected region to consolidate the acquired knowledge.
Educational and Scientific Applications of Climate Model Diagnostic Analyzer
NASA Astrophysics Data System (ADS)
Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Zhang, J.; Bao, Q.
2016-12-01
Climate Model Diagnostic Analyzer (CMDA) is a web-based information system designed for the climate modeling and model analysis community to analyze climate data from models and observations. CMDA provides tools to diagnostically analyze climate data for model validation and improvement, and to systematically manage analysis provenance for sharing results with other investigators. CMDA utilizes cloud computing resources, multi-threaded computing, machine-learning algorithms, web service technologies, and provenance-supporting technologies to address technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. As CMDA infrastructure and technology have matured, we have developed educational and scientific applications of CMDA. Educationally, CMDA has supported the summer school of the JPL Center for Climate Sciences since 2014. In the summer school, students work on group research projects for which CMDA provides datasets and analysis tools. Each student is assigned a virtual machine in Amazon Web Services with CMDA installed. A provenance management system for CMDA was developed to keep track of students' usage of CMDA and to recommend datasets and analysis tools for their research topics. The provenance system also allows students to revisit their analysis results and share them with their group. Scientifically, we have developed several science use cases of CMDA covering various topics, datasets, and analysis types. Each use case is described in terms of its scientific goal, the datasets and analysis tools used, the scientific results discovered, analysis outputs such as plots and data files, and a link to the exact analysis service call with all input arguments filled in. For example, one science use case is the evaluation of the NCAR CAM5 model against MODIS total cloud fraction.
The analysis service used is the Difference Plot Service of Two Variables, and the datasets used are the NCAR CAM total cloud fraction and the MODIS total cloud fraction. The scientific highlight of the use case is that the CAM5 model overall does a fairly decent job of simulating total cloud cover, though it simulates too few clouds, especially near and offshore of the eastern ocean basins where low clouds are dominant.
Secular trends and climate drift in coupled ocean-atmosphere general circulation models
NASA Astrophysics Data System (ADS)
Covey, Curt; Gleckler, Peter J.; Phillips, Thomas J.; Bader, David C.
2006-02-01
Coupled ocean-atmosphere general circulation models (coupled GCMs) with interactive sea ice are the primary tool for investigating possible future global warming and numerous other issues in climate science. A long-standing problem with such models is that when different components of the physical climate system are linked together, the simulated climate can drift away from observations unless constrained by ad hoc adjustments to interface fluxes. However, 11 modern coupled GCMs, including three that do not employ flux adjustments, behave much better in this respect than the older generation of models. Surface temperature trends in control run simulations (with external climate forcing such as solar brightness and atmospheric carbon dioxide held constant) are small compared with observed trends, which include 20th-century climate change due to both anthropogenic and natural factors. Sea ice changes in the models are dominated by interannual variations. Deep ocean temperature and salinity trends are small enough for model control runs to extend over 1000 simulated years or more, but trends in some regions, most notably the Arctic, differ substantially among the models and may be problematic. Methods used to initialize coupled GCMs can mitigate climate drift but cannot eliminate it. Lengthy "spin-ups" of models, made possible by increasing computer power, are one reason for the improvements this paper documents.
NASA Astrophysics Data System (ADS)
Royer, Jean-François; Chauvin, Fabrice; Daloz, Anne-Sophie
2010-05-01
The response of tropical cyclone (TC) activity to global warming has not yet reached a clear consensus in the Fourth Assessment Report (AR4) published by the Intergovernmental Panel on Climate Change (IPCC, 2007) or in the recent scientific literature. Observed series are neither long nor reliable enough for a statistically significant detection and attribution of past TC trends, and coupled climate models give widely divergent results for the future evolution of TC activity in the different ocean basins. The potential importance of the spatial structure of the future SST warming was pointed out by Chauvin et al. (2006) in simulations performed at CNRM with the ARPEGE-Climat GCM. The current presentation describes a new set of simulations performed with the ARPEGE-Climat model to try to understand the possible role of SST patterns in the TC cyclogenesis response in 15 CMIP3 coupled simulations analysed by Royer et al. (2009). The new simulations were performed with the atmospheric component of the ARPEGE-Climat GCM, forced in 10-year simulations by the SST patterns from each of the 15 CMIP3 simulations with different climate models at the end of the 21st century according to scenario A2. The TC analysis is based on the computation of a Convective Yearly Genesis Parameter (CYGP) and the Genesis Potential Index (GPI). The computed genesis indices for each of the ARPEGE-Climat forced simulations are compared with the indices computed directly from the initial coupled simulation. The influence of SST patterns can then be more easily assessed, since all the ARPEGE-Climat simulations are performed with the same atmospheric model, whereas the original simulations used models with different parameterizations and resolutions. The analysis shows that CYGP or GPI anomalies obtained with ARPEGE are as variable between each other as those obtained originally by the different IPCC models.
The variety of SST patterns used to force ARPEGE explains a large part of the dispersion, though for a given SST pattern, ARPEGE does not necessarily reproduce the anomaly produced originally by the IPCC model that produced the SST anomaly. Many factors can contribute to this discrepancy, but the most prominent seems to be the absence of coupling between the forced atmospheric ARPEGE simulation and the underlying ocean. When the atmospheric model is forced by prescribed SST anomalies, some feedbacks between cyclogenesis and the ocean are missing. There are, however, areas over the globe where models agree about the CYGP or GPI anomalies induced by global warming, such as the Indian Ocean, which shows better coherency between the coupled and forced responses. This could be an indication that the interaction between ocean and atmosphere is not as strong there as in the other basins. Details of the results for all the other ocean basins will be presented. References: Chauvin, F., J.-F. Royer, and M. Déqué, 2006: Response of hurricane-type vortices to global warming as simulated by ARPEGE-Climat at high resolution. Climate Dynamics 27(4), 377-399. IPCC [Intergovernmental Panel on Climate Change], Climate change 2007: The physical science basis, in: S. Solomon et al. (eds.), Cambridge University Press. Royer, J.-F., and F. Chauvin, 2009: Response of tropical cyclogenesis to global warming in an IPCC AR4 scenario assessed by a modified yearly genesis parameter. "Hurricanes and Climate Change", J. B. Elsner and T. H. Jagger (Eds.), Springer, ISBN: 978-0-387-09409-0, pp. 213-234.
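For reference, the Genesis Potential Index mentioned in this abstract is commonly computed in the form proposed by Emanuel and Nolan (2004); a minimal sketch, assuming that form, is:

```python
# Genesis Potential Index (GPI) in the widely used Emanuel-Nolan (2004) form.
# Inputs:
#   eta   -- 850 hPa absolute vorticity (s^-1)
#   rh    -- 600 hPa relative humidity (%)
#   vpot  -- potential intensity (m s^-1)
#   shear -- 850-200 hPa wind shear magnitude (m s^-1)
def genesis_potential_index(eta, rh, vpot, shear):
    return (abs(1e5 * eta) ** 1.5
            * (rh / 50.0) ** 3
            * (vpot / 70.0) ** 3
            * (1.0 + 0.1 * shear) ** -2)
```

The index rises with vorticity, humidity, and potential intensity and falls with vertical wind shear, which is why forced and coupled runs can disagree when the SST-driven shear response differs.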
NASA Astrophysics Data System (ADS)
Ammann, C. M.; Brown, B.; Kalb, C. P.; Bullock, R.; Buja, L.; Gutowski, W. J., Jr.; Halley-Gotway, J.; Kaatz, L.; Yates, D. N.
2017-12-01
Coordinated, multi-model climate change projection archives have already led to a flourishing of new climate impact applications. Collections and online tools for the computation of derived indicators have attracted many non-specialist users and decision-makers and made it easier for them to explore potential future weather and climate changes on their systems. Guided by a set of standardized steps and analyses, many can now use model output and determine basic model-based changes. But because each application and decision context is different, the question remains whether such a small collection of standardized tools can faithfully and comprehensively represent the critical physical context of change. We use the example of the El Niño-Southern Oscillation, the largest and most broadly recognized mode of variability in the climate system, to explore the difference in impact contexts between a quasi-blind, protocol-bound use of climate information and a flexible, scientifically guided one. More use-oriented diagnostics of the model data, as well as different strategies for getting data into decision environments, are explored.
Estimating daily climatologies for climate indices derived from climate model data and observations
Mahlstein, Irina; Spirig, Christoph; Liniger, Mark A; Appenzeller, Christof
2015-01-01
Climate indices help to describe the past, present, and future climate. They are usually more closely related to possible impacts and are therefore more illustrative to users than simple climate means. Indices are often based on daily data series and thresholds. It is shown that percentile-based thresholds are sensitive to the method of computation, and so are the climatological daily mean and the daily standard deviation, which are used for bias corrections of daily climate model data. Sample size issues in either the observed reference period or the model data lead to uncertainties in these estimations. A large number of past ensemble seasonal forecasts, called hindcasts, is used to explore these sampling uncertainties and to compare two different approaches. Based on a perfect-model approach, it is shown that a fitting approach can substantially improve the estimates of daily climatologies of percentile-based thresholds over land areas, as well as of the mean and the variability. These improvements are relevant for bias removal in long-range forecasts or predictions of climate indices based on percentile thresholds. The method also shows potential for use in climate change studies. Key points: more robust estimates of daily climate characteristics; a statistical fitting approach; based on a perfect-model approach. PMID:26042192
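The sampling issue the abstract describes can be demonstrated with synthetic data: per-day empirical percentiles from a 30-year record are noisy, while fitting a smooth annual cycle (here a single harmonic, a simple stand-in for the study's fitting approach) stabilizes them.

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, doy = 30, np.arange(365)
clim = 15.0 + 10.0 * np.sin(2 * np.pi * (doy - 120) / 365)   # "true" annual cycle
data = clim + rng.normal(0.0, 3.0, size=(n_years, 365))      # synthetic daily record

# Raw per-day empirical 90th percentile: one noisy estimate per calendar day.
raw_p90 = np.percentile(data, 90, axis=0)

# Harmonic least-squares fit of the noisy percentile series.
X = np.column_stack([np.ones(365),
                     np.sin(2 * np.pi * doy / 365),
                     np.cos(2 * np.pi * doy / 365)])
coef, *_ = np.linalg.lstsq(X, raw_p90, rcond=None)
fit_p90 = X @ coef

true_p90 = clim + 3.0 * 1.2816   # exact 90th percentile of N(clim, 3^2)
rmse_raw = np.sqrt(np.mean((raw_p90 - true_p90) ** 2))
rmse_fit = np.sqrt(np.mean((fit_p90 - true_p90) ** 2))
```

The fit pools information across the year, so its error against the true threshold is much smaller than that of the per-day empirical percentile, mirroring the improvement the study reports.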
Till, Charlotte; Haverkamp, Jamie; White, Devin; ...
2016-11-22
Climate change has the potential to displace large populations in many parts of the developed and developing world. Understanding why, how, and when environmental migrants decide to move is critical to successful strategic planning within organizations tasked with helping the affected groups and mitigating their systemic impacts. One way to support planning is through the employment of computational modeling techniques. Models can provide a window into possible futures, allowing planners and decision makers to test different scenarios in order to understand what might happen. While modeling is a powerful tool, it presents both opportunities and challenges. This paper builds a foundation for the broader community of model consumers and developers by: providing an overview of pertinent climate-induced migration research, describing some different types of models and how to select the most relevant one(s), highlighting three perspectives on obtaining data to use in said model(s), and the consequences associated with each. It concludes with two case studies based on recent research that illustrate what can happen when ambitious modeling efforts are undertaken without sufficient planning, oversight, and interdisciplinary collaboration. Lastly, we hope that the broader community can learn from our experiences and apply this knowledge to their own modeling research efforts.
NASA Astrophysics Data System (ADS)
DaPonte, John S.; Sadowski, Thomas; Thomas, Paul
2006-05-01
This paper describes a collaborative project conducted by the Computer Science Department at Southern Connecticut State University and NASA's Goddard Institute for Space Studies (GISS). Animations of output from a climate simulation model used at GISS to predict rainfall and circulation have been produced for West Africa from June to September 2002. These early results have assisted scientists at GISS in evaluating the accuracy of the RM3 climate model when compared to similar results obtained from satellite imagery. The results presented here will be refined to better meet the needs of GISS scientists and will be expanded to cover other geographic regions for a variety of time frames.
Tempest: Tools for Addressing the Needs of Next-Generation Climate Models
NASA Astrophysics Data System (ADS)
Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.
2015-12-01
Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.
A Decade-long Continental-Scale Convection-Resolving Climate Simulation on GPUs
NASA Astrophysics Data System (ADS)
Leutwyler, David; Fuhrer, Oliver; Lapillonne, Xavier; Lüthi, Daniel; Schär, Christoph
2016-04-01
The representation of moist convection in climate models represents a major challenge due to the small scales involved. Convection-resolving models have proven to be very useful tools in numerical weather prediction and in climate research. Using horizontal grid spacings of O(1 km), they allow deep convection to be explicitly resolved, leading to an improved representation of the water cycle. However, due to their extremely demanding computational requirements, they have so far been limited to short simulations and/or small computational domains. Innovations in the supercomputing domain have led to new supercomputer designs that combine conventional multicore CPUs with accelerators such as graphics processing units (GPUs). One of the first atmospheric models to be fully ported to GPUs is COSMO, the Consortium for Small-Scale Modeling weather and climate model. This new version allows us to expand the simulation domain to areas spanning continents and the simulated period to up to a decade. We present results from a decade-long, convection-resolving climate simulation using the GPU-enabled COSMO version. The simulation is driven by the ERA-Interim reanalysis. The results illustrate how the approach allows for the representation of interactions between synoptic-scale and mesoscale atmospheric circulations at scales ranging from 1000 to 10 km. We discuss the performance of the convection-resolving modeling approach on the European scale. Specifically, we focus on the annual cycle of convection in Europe, on the organization of convective clouds, and on the verification of hourly rainfall against various high-resolution datasets.
Modeling radiative transfer with the doubling and adding approach in a climate GCM setting
NASA Astrophysics Data System (ADS)
Lacis, A. A.
2017-12-01
The nonlinear dependence of multiply scattered radiation on particle size, optical depth, and solar zenith angle makes accurate treatment of multiple scattering in the climate GCM setting problematic, due primarily to computational cost. The accurate methods of calculating multiple scattering that are available are computationally far too expensive for climate GCM applications. Two-stream-type radiative transfer approximations may be fast enough, but at the cost of reduced accuracy. We describe here a parameterization of the doubling/adding method that is being used in the GISS climate GCM: an adaptation of the doubling/adding formalism configured to operate with a look-up table utilizing a single Gauss quadrature point with an extra-angle formulation. It is designed to closely reproduce the accuracy of full-angle doubling and adding for the multiple scattering effects of clouds and aerosols in a realistic atmosphere as a function of particle size, optical depth, and solar zenith angle. With an additional inverse look-up table, this single-Gauss-point doubling/adding approach can be adapted to model fractional cloud cover for any GCM grid box in the independent pixel approximation as a function of the fractional cloud particle sizes, optical depths, and solar zenith angle dependence.
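The doubling principle underlying this parameterization can be sketched in a scalar two-stream setting: the reflectance and transmittance of two identical layers are combined through the infinite series of inter-reflections, and the layer thickness doubles at each step. The thin-layer initialization below assumes conservative (non-absorbing) scattering and is purely illustrative.

```python
# Scalar sketch of the doubling method for multiple scattering.
def double_layer(r, t):
    """Combine two identical layers of reflectance r and transmittance t.

    The 1/(1 - r*r) factor sums the geometric series of photons bouncing
    back and forth between the two layers.
    """
    denom = 1.0 - r * r
    return r + t * r * t / denom, t * t / denom

def reflectance(optical_depth, n_doublings=20):
    """Build a layer of given optical depth by repeated doubling."""
    tau0 = optical_depth / 2 ** n_doublings   # start from a very thin layer
    r, t = 0.5 * tau0, 1.0 - 0.5 * tau0       # thin-layer, conservative scattering
    for _ in range(n_doublings):
        r, t = double_layer(r, t)
    return r, t
```

Energy is conserved at every step (r + t stays 1 for a non-absorbing layer), and each doubling squares the computational reach, which is what makes the method attractive as the basis for a look-up-table parameterization.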
Garcia-Castellanos, Daniel; Jiménez-Munt, Ivone
2015-01-01
How do the feedbacks between tectonics, sediment transport and climate work to shape the topographic evolution of the Earth? This question has been widely addressed via numerical models constrained with thermochronological and geomorphological data at scales ranging from local to orogenic. Here we present a novel numerical model that aims at reproducing the interaction between these processes at the continental scale. For this purpose, we combine in a single computer program: 1) a thin-sheet viscous model of continental deformation; 2) a stream-power surface-transport approach; 3) flexural isostasy allowing for the formation of large sedimentary foreland basins; and 4) an orographic precipitation model that reproduces basic climatic effects such as continentality and rain shadow. We quantify the feedbacks between these processes in a synthetic scenario inspired by the India-Asia collision and the growth of the Tibetan Plateau. We identify a feedback between erosion and crustal thickening leading locally to a <50% increase in deformation rates in places where orographic precipitation is concentrated. This climatically-enhanced deformation takes place preferentially at the upwind flank of the growing plateau, especially at the corners of the indenter (syntaxes). We hypothesize that this may provide clues for better understanding the mechanisms underlying the intriguing tectonic aneurysms documented in the Himalayas. At the continental scale, however, the overall distribution of topographic basins and ranges seems insensitive to climatic factors, even though these do have important, sometimes counterintuitive effects on the amount of sediment trapped within the continent. The dry climatic conditions that naturally develop in the interior of the continent, for example, trigger large intra-continental sediment trapping at basins similar to the Tarim Basin because they determine its endorheic/exorheic drainage.
These complex climatic-drainage-tectonic interactions make the development of steady-state topography at the continental scale unlikely.
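Component (2) of the model, the stream-power surface-transport approach, is typically written as an erosion rate proportional to drainage area and slope raised to empirical exponents. A minimal sketch, with illustrative constants (K, m, and n vary widely between studies):

```python
# Stream-power erosion law: E = K * A^m * S^n, the standard form used by
# stream-power surface-transport models. The defaults below are
# illustrative placeholders, not the values used in the study.
def stream_power_erosion(drainage_area_m2, slope, k=1e-5, m=0.5, n=1.0):
    """Erosion rate (m/yr) from drainage area (m^2) and local slope (-)."""
    return k * drainage_area_m2 ** m * slope ** n
```

Because erosion grows with both upstream area and slope, concentrating orographic precipitation (which feeds drainage area via runoff) on one flank of a plateau focuses erosion there, which is the mechanism behind the climatically enhanced deformation the abstract describes.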
NASA Astrophysics Data System (ADS)
Nicholls, S.; Mohr, K. I.
2014-12-01
The meridional extent and complex orography of the South American continent contribute to a wide diversity of climate regimes, ranging from hyper-arid deserts to tropical rainforests to sub-polar highland regions. Global climate models, although capable of resolving synoptic-scale South American climate features, are inadequate for fully resolving the strong gradients between climate regimes and the complex orography that defines the tropical Andes, given their low spatial and temporal resolution. Recent computational advances now make regional climate modeling with prognostic mesoscale atmosphere-ocean coupled models, such as the Coupled Ocean-Atmosphere-Wave-Sediment Transport (COAWST) modeling system, practical for climate research. Previous work has shown COAWST to reasonably simulate the entire 2003-2004 wet season (Dec-Feb), as validated against both satellite and model analysis data. More recently, COAWST simulations have also been shown to sensibly reproduce the entire annual cycle of rainfall (Oct 2003 - Oct 2004) with historical climate model input. Using future global climate model input for COAWST, the present work involves year-long cycles spanning October to October for the years 2031, 2059, and 2087, assuming representative concentration pathway (RCP) 6.0. COAWST output is used to investigate how the spatial distribution, precipitation rates, and diurnal cycle of precipitation patterns in the Central Andes vary under global climate change in these yearly "snapshots". Initial results show little change in precipitation coverage or its diurnal cycle; however, conditions tended to be drier over the Brazilian Plateau and wetter over the Western Amazon and Central Andes. These results suggest potential adjustments to large-scale climate features (such as the Bolivian High).
NASA Astrophysics Data System (ADS)
Lemaire, Vincent; Colette, Augustin; Menut, Laurent
2016-04-01
Because of its sensitivity to weather patterns, climate change will have an impact on air pollution, so that in the future a climate penalty could jeopardize the expected efficiency of air pollution mitigation measures. A common method to assess the impact of climate on air quality consists in implementing chemistry-transport models forced by climate projections. At present, however, such impact assessments lack multi-model ensemble approaches to address uncertainties, because of the substantial computing cost. Therefore, as a preliminary step towards exploring large climate ensembles with air quality models, we developed an ensemble exploration technique in order to point out the climate models that should be investigated in priority. Using a training dataset from a deterministic projection of climate and air quality over Europe, we identified the main meteorological drivers of air quality for 8 regions in Europe and developed statistical models that can be used to estimate future air pollutant concentrations. Applying this statistical model to the whole EuroCordex ensemble of climate projections, we find a climate penalty for six subregions out of eight (Eastern Europe, France, Iberian Peninsula, Mid Europe and Northern Italy). On the contrary, a climate benefit for PM2.5 was identified for three regions (Eastern Europe, Mid Europe and Northern Italy). The uncertainty of this statistical model limits, however, the confidence we can attribute to the associated quantitative projections. The technique nevertheless allows selecting a subset of relevant regional climate model members that should be used in priority for future deterministic projections, to propose an adequate coverage of uncertainties. We are thereby proposing a smart ensemble-exploration strategy that can also be used for other impact studies beyond air quality.
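The statistical-model step described above can be sketched as a multiple linear regression of pollutant concentrations on a few meteorological drivers, fitted on a training dataset and then applied to projected meteorology. The driver choice, coefficients, and synthetic data below are illustrative, not those of the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Synthetic "training" meteorology and a PM2.5 series that depends on it.
temp = rng.normal(15.0, 8.0, n)    # 2 m temperature, C
blh = rng.normal(800.0, 300.0, n)  # boundary-layer height, m
wind = rng.normal(4.0, 2.0, n)     # 10 m wind speed, m/s
pm25 = 20.0 + 0.4 * temp - 0.01 * blh - 1.5 * wind + rng.normal(0.0, 2.0, n)

# Ordinary least-squares fit of the statistical model.
X = np.column_stack([np.ones(n), temp, blh, wind])
coef, *_ = np.linalg.lstsq(X, pm25, rcond=None)

# Apply the fit to a +2 C projection: the temperature coefficient converts
# the warming into a concentration change (a climate penalty if positive).
delta_pm25 = coef[1] * 2.0
```

Once fitted, such a model is cheap enough to evaluate across every member of a large projection ensemble, which is exactly the role the abstract assigns to it.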
Application of empirical and dynamical closure methods to simple climate models
NASA Astrophysics Data System (ADS)
Padilla, Lauren Elizabeth
This dissertation applies empirically- and physically-based methods for closure of uncertain parameters and processes to three model systems that lie on the simple end of climate model complexity. Each model isolates one of three sources of closure uncertainty: uncertain observational data, large dimension, and wide ranging length scales. They serve as efficient test systems toward extension of the methods to more realistic climate models. The empirical approach uses the Unscented Kalman Filter (UKF) to estimate the transient climate sensitivity (TCS) parameter in a globally-averaged energy balance model. Uncertainty in climate forcing and historical temperature make TCS difficult to determine. A range of probabilistic estimates of TCS computed for various assumptions about past forcing and natural variability corroborate ranges reported in the IPCC AR4 found by different means. Also computed are estimates of how quickly uncertainty in TCS may be expected to diminish in the future as additional observations become available. For higher system dimensions the UKF approach may become prohibitively expensive. A modified UKF algorithm is developed in which the error covariance is represented by a reduced-rank approximation, substantially reducing the number of model evaluations required to provide probability densities for unknown parameters. The method estimates the state and parameters of an abstract atmospheric model, known as Lorenz 96, with accuracy close to that of a full-order UKF for 30-60% rank reduction. The physical approach to closure uses the Multiscale Modeling Framework (MMF) to demonstrate closure of small-scale, nonlinear processes that would not be resolved directly in climate models. A one-dimensional, abstract test model with a broad spatial spectrum is developed. The test model couples the Kuramoto-Sivashinsky equation to a transport equation that includes cloud formation and precipitation-like processes. 
In the test model, three main sources of MMF error are evaluated independently. Loss of nonlinear multi-scale interactions and periodic boundary conditions in closure models were the dominant sources of error. Using a reduced order modeling approach to maximize energy content allowed the closure model dimension to be reduced by up to 75% without loss in accuracy. MMF and a comparable alternative model performed equally well compared to direct numerical simulation.
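The joint state-parameter UKF approach described above can be sketched on a toy problem. Everything here is an illustrative stand-in for the dissertation's setup: a one-box energy-balance model with an unknown feedback parameter (the inverse of the sensitivity), a ramped forcing series, and noise levels chosen for the sketch.

```python
import numpy as np

def sigma_points(mean, cov, kappa=1.0):
    """Generate the 2n+1 unscented sigma points and their weights."""
    n = len(mean)
    L = np.linalg.cholesky((n + kappa) * cov)
    pts = [mean] + [mean + L[:, i] for i in range(n)] \
                 + [mean - L[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

# Toy energy-balance model: C dT/dt = F(t) - lam * T.
# Augmented state is [T, lam]; lam is the unknown feedback parameter.
C, dt = 8.0, 1.0
def step(x, F):
    T, lam = x
    return np.array([T + dt * (F - lam * T) / C, lam])

# Synthetic "historical" temperature record under a ramping forcing.
rng = np.random.default_rng(1)
true_lam = 1.2
forcing = 0.04 * np.arange(100)
T, obs = 0.0, []
for F in forcing:
    T = step(np.array([T, true_lam]), F)[0]
    obs.append(T + rng.normal(scale=0.05))

# UKF: propagate sigma points through the model (predict), then
# correct with each noisy temperature observation (update).
mean = np.array([0.0, 0.5])                 # deliberately poor lam guess
cov = np.diag([0.1, 1.0])
R, Q = 0.05 ** 2, np.diag([1e-6, 1e-6])
for F, y in zip(forcing, obs):
    pts, w = sigma_points(mean, cov)
    prop = np.array([step(p, F) for p in pts])
    mean = w @ prop
    cov = (prop - mean).T @ np.diag(w) @ (prop - mean) + Q
    yhat = w @ prop[:, 0]                   # observed component is T
    Pyy = w @ (prop[:, 0] - yhat) ** 2 + R
    Pxy = (prop - mean).T @ (w * (prop[:, 0] - yhat))
    K = Pxy / Pyy
    mean = mean + K * (y - yhat)
    cov = cov - np.outer(K, K) * Pyy
```

As observations accumulate, the posterior on `lam` tightens around the true value, mirroring how uncertainty in TCS is expected to shrink with future data.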
NASA Astrophysics Data System (ADS)
Davis, A. D.; Heimbach, P.; Marzouk, Y.
2017-12-01
We develop a Bayesian inverse modeling framework for predicting future ice sheet volume with associated formal uncertainty estimates. Marine ice sheets are drained by fast-flowing ice streams, which we simulate using a flowline model. Flowline models depend on geometric parameters (e.g., basal topography), parameterized physical processes (e.g., calving laws and basal sliding), and climate parameters (e.g., surface mass balance), most of which are unknown or uncertain. Given observations of ice surface velocity and thickness, we define a Bayesian posterior distribution over static parameters, such as basal topography. We also define a parameterized distribution over variable parameters, such as future surface mass balance, which we assume are not informed by the data. Hyperparameters are used to represent climate change scenarios, and sampling their distributions mimics internal variability. For example, a warming climate corresponds to increasing mean surface mass balance, but an individual sample may have periods of increasing or decreasing surface mass balance. We characterize the predictive distribution of ice volume by evaluating the flowline model given samples from the posterior distribution and the distribution over variable parameters. Finally, we determine the effect of climate change on future ice sheet volume by investigating how changing the hyperparameters affects the predictive distribution. We use state-of-the-art Bayesian computation to address computational feasibility. Characterizing the posterior distribution (using Markov chain Monte Carlo), sampling the full range of variable parameters, and evaluating the predictive model are prohibitively expensive. Furthermore, the required resolution of the inferred basal topography may be very high, which is often challenging for sampling methods.
Instead, we leverage regularity in the predictive distribution to build a computationally cheaper surrogate over the low dimensional quantity of interest (future ice sheet volume). Continual surrogate refinement guarantees asymptotic sampling from the predictive distribution. Directly characterizing the predictive distribution in this way allows us to assess the ice sheet's sensitivity to climate variability and change.
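The surrogate idea above can be illustrated in miniature: replace the expensive model with a cheap approximation of the scalar quantity of interest, then sample the predictive distribution through the surrogate. The quadratic "flowline model" and the Gaussian hyperparameter distribution below are invented for the sketch, not the paper's actual model.

```python
import numpy as np

# Stand-in for an expensive flowline model: maps a climate parameter
# sample (say, mean surface mass balance) to future ice volume.
# The real model is PDE-based; this is a deliberately cheap toy.
def expensive_model(smb):
    return 100.0 + 8.0 * smb - 1.5 * smb ** 2

rng = np.random.default_rng(2)

# A handful of carefully placed model runs train the surrogate over
# the low-dimensional quantity of interest (future ice volume)...
train = np.linspace(-2, 2, 7)
vols = expensive_model(train)
surrogate = np.polynomial.Polynomial.fit(train, vols, deg=2)

# ...after which the predictive distribution is characterised by many
# cheap surrogate evaluations over samples of the uncertain parameter.
smb_samples = rng.normal(loc=0.5, scale=0.3, size=10000)
predictive = surrogate(smb_samples)
```

Shifting the hyperparameters (here, the mean of `smb_samples`) and re-sampling is then essentially free, which is what makes the sensitivity study over climate scenarios tractable.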
Climate Analytics as a Service
NASA Technical Reports Server (NTRS)
Schnase, John L.; Duffy, Daniel Q.; McInerney, Mark A.; Webster, W. Phillip; Lee, Tsengdar J.
2014-01-01
Climate science is a big data domain that is experiencing unprecedented growth. In our efforts to address the big data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). CAaaS combines high-performance computing and data-proximal analytics with scalable data management, cloud computing virtualization, the notion of adaptive analytics, and a domain-harmonized API to improve the accessibility and usability of large collections of climate data. MERRA Analytic Services (MERRA/AS) provides an example of CAaaS. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of key climate variables. The effectiveness of MERRA/AS has been demonstrated in several applications. In our experience, CAaaS is providing the agility required to meet our customers' increasing and changing data management and data analysis needs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goodsman, Devin W.; Aukema, Brian H.; McDowell, Nate G.
Phenology models are becoming increasingly important tools to accurately predict how climate change will impact the life histories of organisms. We propose a class of integral projection phenology models derived from stochastic individual-based models of insect development and demography. Our derivation, which is based on the rate-summation concept, produces integral projection models that capture the effect of phenotypic rate variability on insect phenology, but which are typically more computationally frugal than equivalent individual-based phenology models. We demonstrate our approach using a temperature-dependent model of the demography of the mountain pine beetle (Dendroctonus ponderosae Hopkins), an insect that kills mature pine trees. This work illustrates how a wide range of stochastic phenology models can be reformulated as integral projection models. Due to their computational efficiency, these integral projection models are suitable for deployment in large-scale simulations, such as studies of altered pest distributions under climate change.
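A minimal sketch of the rate-summation / integral-projection idea follows: a population density over a development index is pushed forward each day by a temperature-dependent kernel whose width represents phenotypic rate variability. The development-rate function, kernel width, and temperature series are invented for illustration and are not the mountain pine beetle model.

```python
import numpy as np

# Development index runs from 0 (egg) to 1 (adult) on a discrete grid.
n = 200
ages = np.linspace(0, 1, n)

def dev_rate(temp_c):
    """Illustrative linear degree-day rate above a 5 C threshold."""
    return max(temp_c - 5.0, 0.0) / 400.0

def kernel(temp_c, sigma=0.01):
    """Gaussian projection kernel: mean advance = daily development
    rate; spread sigma = phenotypic rate variability."""
    r = dev_rate(temp_c)
    diff = ages[:, None] - (ages[None, :] + r)   # a' - (a + r)
    K = np.exp(-0.5 * (diff / sigma) ** 2)
    return K / K.sum(axis=0)                     # column-normalised

# Project a synchronised cohort through a month of daily temperatures;
# each day is one matrix-vector product, not an individual simulation.
pop = np.zeros(n); pop[0] = 1.0
for temp in 15 + 5 * np.sin(np.linspace(0, np.pi, 30)):
    pop = kernel(temp) @ pop
```

Column normalisation makes the kernel mass-conserving, so the total population is preserved while the distribution advances and spreads, which is the computational core that replaces tracking many stochastic individuals.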
Modeling long-term changes in forested landscapes and their relation to the Earth's energy balance
NASA Technical Reports Server (NTRS)
Shugart, H. H.; Emanuel, W. R.; Solomon, A. M.
1984-01-01
The dynamics of the forested parts of the Earth's surface on time scales from decades to centuries are discussed. A set of computer models developed at Oak Ridge National Laboratory and elsewhere are applied as tools. These models simulate a landscape by duplicating the dynamics of growth, death and birth of each tree living on a 0.10 ha element of the landscape. This spatial unit is generally referred to as a gap in the case of the forest models. The models were tested against and applied to a diverse array of forests and appear to provide a reasonable representation for investigating forest-cover dynamics. Because of the climate linkage, one important test is the reconstruction of paleo-landscapes. Detailed reconstructions of changes in vegetation in response to changes in climate are crucial to understanding the association of the Earth's vegetation and climate and the response of the vegetation to climate change.
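The gap-model mechanics described above (growth, death, and birth of individual trees on a ~0.1 ha patch) can be sketched as below. The growth, mortality, and recruitment constants are illustrative placeholders, not the calibrated values of the Oak Ridge models.

```python
import numpy as np

# One 0.1 ha forest gap: a list of individual tree diameters (cm).
rng = np.random.default_rng(4)
dbh = list(rng.uniform(5, 50, size=30))
D_MAX, G, MORTALITY, RECRUITS = 150.0, 1.0, 0.02, 2

for year in range(100):
    # growth: slows as diameter approaches the species maximum
    dbh = [d + G * (1 - d / D_MAX) for d in dbh]
    # death: small stochastic annual whole-tree mortality
    dbh = [d for d in dbh if rng.random() > MORTALITY]
    # birth: new saplings establish in the gap each year
    dbh += [1.0] * RECRUITS

# A stand-level summary of the patch after a century of dynamics.
basal_area = sum(np.pi * (d / 200) ** 2 for d in dbh)  # m^2 per patch
```

Averaging many such independent patches is what turns these individual-tree rules into landscape-scale forest-cover dynamics.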
NASA Astrophysics Data System (ADS)
Anantharaj, V.; Mayer, B.; Wang, F.; Hack, J.; McKenna, D.; Hartman-Baker, R.
2012-04-01
The Oak Ridge Leadership Computing Facility (OLCF) facilitates the execution of computational experiments that require tens of millions of CPU hours (typically using thousands of processors simultaneously) while generating hundreds of terabytes of data. A set of ultra high resolution climate experiments in progress, using the Community Earth System Model (CESM), will produce over 35,000 files, ranging in sizes from 21 MB to 110 GB each. The execution of the experiments will require nearly 70 Million CPU hours on the Jaguar and Titan supercomputers at OLCF. The total volume of the output from these climate modeling experiments will be in excess of 300 TB. This model output must then be archived, analyzed, distributed to the project partners in a timely manner, and also made available more broadly. Meeting this challenge would require efficient movement of the data, staging the simulation output to a large and fast file system that provides high volume access to other computational systems used to analyze the data and synthesize results. This file system also needs to be accessible via high speed networks to an archival system that can provide long term reliable storage. Ideally this archival system is itself directly available to other systems that can be used to host services making the data and analysis available to the participants in the distributed research project and to the broader climate community. The various resources available at the OLCF now support this workflow. The available systems include the new Jaguar Cray XK6 2.63 petaflops (estimated) supercomputer, the 10 PB Spider center-wide parallel file system, the Lens/EVEREST analysis and visualization system, the HPSS archival storage system, the Earth System Grid (ESG), and the ORNL Climate Data Server (CDS). The ESG features federated services, search & discovery, extensive data handling capabilities, deep storage access, and Live Access Server (LAS) integration. 
The scientific workflow enabled on these systems, and developed as part of the Ultra-High Resolution Climate Modeling Project, allows users of OLCF resources to efficiently share simulated data, often multi-terabyte in volume, as well as the results from the modeling experiments and various synthesized products derived from these simulations. The final objective in the exercise is to ensure that the simulation results and the enhanced understanding will serve the needs of a diverse group of stakeholders across the world, including our research partners in U.S. Department of Energy laboratories & universities, domain scientists, students (K-12 as well as higher education), resource managers, decision makers, and the general public.
Optimal Interpolation scheme to generate reference crop evapotranspiration
NASA Astrophysics Data System (ADS)
Tomas-Burguera, Miquel; Beguería, Santiago; Vicente-Serrano, Sergio; Maneta, Marco
2018-05-01
We used an Optimal Interpolation (OI) scheme to generate grids of reference crop evapotranspiration (ETo), of the forcing meteorological variables, and of their respective error variances for the Iberian Peninsula over the period 1989-2011. To perform the OI we used observational data from the Spanish Meteorological Agency (AEMET) and outputs from a physically-based climate model. We applied five OI schemes to generate grids of the five observed climate variables needed to compute ETo using the FAO-recommended form of the Penman-Monteith equation (FAO-PM). The granularity of the resulting grids is less sensitive to variations in the density and distribution of the observational network than that of grids generated by other interpolation methods. This is because our implementation of the OI method uses a physically-based climate model as prior background information about the spatial distribution of the climatic variables, which is critical for under-observed regions. This provides temporal consistency in the spatial variability of the climatic fields. We also show that increases in the density and improvements in the distribution of the observational network substantially reduce the uncertainty of the climatic and ETo estimates. Finally, a sensitivity analysis of observational uncertainties and network densification suggests the existence of a trade-off between the quantity and quality of observations.
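A one-variable OI analysis on a tiny 1-D grid can be sketched as follows: the model background is corrected toward station observations, and the same matrices yield the analysis error variance that accompanies the grids. The covariance model, length scale, and station layout are illustrative assumptions, not the study's configuration.

```python
import numpy as np

n = 50
x_grid = np.linspace(0, 500, n)              # km
background = 15.0 + 0.01 * x_grid            # model prior (e.g. temp)

# Gaussian background-error covariance with a 100 km length scale.
B = 1.0 ** 2 * np.exp(
    -0.5 * ((x_grid[:, None] - x_grid[None, :]) / 100.0) ** 2)

stations = [5, 20, 40]                       # grid indices with data
H = np.zeros((3, n)); H[range(3), stations] = 1.0
R = 0.25 ** 2 * np.eye(3)                    # observation-error cov.
obs = background[stations] + np.array([0.8, -0.5, 0.3])

# OI update: analysis = background + gain * innovation.
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
analysis = background + K @ (obs - H @ background)
error_var = np.diag(B - K @ H @ B)           # posterior error variance
```

Note that `error_var` shrinks near stations and relaxes back to the background variance far from them, which is exactly the behaviour that makes OI output useful in under-observed regions.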
A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Omitaomu, Olufemi A; Parish, Esther S; Nugent, Philip J
Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there is an urgent need for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, a lack of appropriate decision support tools that match local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called the Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells and then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability scores, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: a data layer (containing spatial data, socio-economic and environmental data, and analytic data), a middle layer (handling data processing, model management, and GIS operations), and an application layer (providing climate impact forecasts, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.
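The per-cell metric scoring step can be sketched generically as below. The metric names, weights, and min-max weighting scheme are assumptions made for the sketch, not the published Urban-CAT formulation.

```python
import numpy as np

# Each grid cell carries a few GIS-derived metrics (synthetic here).
rng = np.random.default_rng(5)
n_cells = 10000
metrics = {
    "impervious_fraction": rng.uniform(0, 1, n_cells),
    "elevation_m": rng.uniform(150, 400, n_cells),
    "population_density": rng.lognormal(5, 1, n_cells),
}
# Signed weights: higher elevation reduces flood vulnerability here.
weights = {"impervious_fraction": 0.5, "elevation_m": -0.2,
           "population_density": 0.3}

# Min-max normalise each metric, then combine into one score per cell.
score = np.zeros(n_cells)
for name, w in weights.items():
    v = metrics[name]
    score += w * (v - v.min()) / (v.max() - v.min())

# Rank cells so adaptation planning can target the most vulnerable.
worst_cells = np.argsort(score)[::-1][:100]
```

The same vector of per-cell scores can then feed the forecasting and optimization modules in the application layer.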
iGen: An automated generator of simplified models with provable error bounds.
NASA Astrophysics Data System (ADS)
Tang, D.; Dobbie, S.
2009-04-01
Climate models employ various simplifying assumptions and parameterisations in order to increase execution speed. However, in order to draw conclusions about the Earth's climate from the results of a climate simulation it is necessary to have information about the error that these assumptions and parameterisations introduce. A novel computer program, called iGen, is being developed which automatically generates fast, simplified models by analysing the source code of a slower, high resolution model. The resulting simplified models have provable bounds on error compared to the high resolution model and execute at speeds that are typically orders of magnitude faster. iGen's input is a definition of the prognostic variables of the simplified model, a set of bounds on acceptable error and the source code of a model that captures the behaviour of interest. In the case of an atmospheric model, for example, this would be a global cloud resolving model with very high resolution. Although such a model would execute far too slowly to be used directly in a climate model, iGen never executes it. Instead, it converts the code of the resolving model into a mathematical expression which is then symbolically manipulated and approximated to form a simplified expression. This expression is then converted back into a computer program and output as a simplified model. iGen also derives and reports formal bounds on the error of the simplified model compared to the resolving model. These error bounds are always maintained below the user-specified acceptable error. Results will be presented illustrating the success of iGen's analysis of a number of example models. These extremely encouraging results have led to work, currently underway, to analyse a cloud resolving model and so produce an efficient parameterisation of moist convection with formally bounded error.
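The "symbolically approximate, then bound the error" idea can be illustrated on a toy scale with SymPy. This is not iGen itself: the "model" is a single expression, the simplification is a truncated Taylor series, and the bound exploits the fact that the remainder of exp is monotone on the chosen interval.

```python
import sympy as sp

x = sp.symbols('x')
model = sp.exp(x)            # stand-in for the analysed model code

# Simplified model: 2nd-order Taylor expansion about 0 (cheaper to
# evaluate than the original expression).
simplified = sp.series(model, x, 0, 3).removeO()

# Formal error bound on the domain |x| <= 0.1: the remainder of exp
# is monotone increasing there, so its maximum is at the endpoint.
remainder = sp.simplify(model - simplified)
bound = float(remainder.subs(x, sp.Rational(1, 10)))
```

The pattern mirrors the abstract: the simplified expression is emitted as the fast model, and `bound` is reported alongside it and checked against the user's acceptable-error threshold.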
NASA Technical Reports Server (NTRS)
Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn
2002-01-01
One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document proposes a strawman framework design for the climate community based on the integration of Cactus, from the relativistic physics community, and the UCLA/UCB Distributed Data Broker (DDB) from the climate community. This design is the result of an extensive survey of climate models and frameworks in the climate community as well as frameworks from many other scientific communities. The design addresses fundamental development and runtime needs using Cactus, a framework with interfaces for FORTRAN and C-based languages, and high-performance model communication needs using DDB. This document also specifically explores object-oriented design issues in the context of climate modeling as well as climate modeling issues in terms of object-oriented design.
Light-weight Parallel Python Tools for Earth System Modeling Workflows
NASA Astrophysics Data System (ADS)
Mickelson, S. A.; Paul, K.; Xu, H.; Dennis, J.; Brown, D. I.
2015-12-01
With the growth in computing power over the last 30 years, earth system modeling codes have become increasingly data-intensive. As an example, it is expected that the data required for the next Intergovernmental Panel on Climate Change (IPCC) Assessment Report (AR6) will increase by more than 10x, to an expected 25 PB per climate model. Faced with this daunting challenge, developers of the Community Earth System Model (CESM) have chosen to change the format of their data for long-term storage from time-slice to time-series, in order to reduce the download bandwidth needed for later analysis and post-processing by climate scientists. Hence, efficient tools are required to (1) perform the transformation of the data from time-slice to time-series format and (2) compute climatology statistics, needed for many diagnostic computations, on the resulting time-series data. To address the first of these challenges, we have developed a parallel Python tool for converting time-slice model output to time-series format. To address the second, we have developed a parallel Python tool to perform fast time-averaging of time-series data. These tools are designed to be light-weight and easy to install, to have very few dependencies, and to be easily inserted into the Earth system modeling workflow with negligible disruption. In this work, we present the motivation, approach, and testing results of these two light-weight parallel Python tools, as well as our plans for future research and development.
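The time-slice to time-series reshaping can be sketched as follows: the model writes one file per time step containing every variable, while analysis wants one file per variable containing every time step. Here plain dicts of arrays stand in for netCDF files, and a serial loop stands in for the per-variable parallel decomposition; neither is the actual tool's API.

```python
import numpy as np

# One "file" per model time step, each holding every variable
# (synthetic temperature and surface-pressure fields).
n_steps, shape = 120, (4, 8)
rng = np.random.default_rng(6)
time_slices = [
    {"T": 280 + rng.normal(size=shape), "PS": 1e5 + rng.normal(size=shape)}
    for _ in range(n_steps)
]

# Transpose the layout: one record per variable spanning all steps.
# In the parallel tool each worker would own one variable.
time_series = {
    var: np.stack([sl[var] for sl in time_slices])
    for var in time_slices[0]
}

# Climatology step: time-averaging is now a cheap reduction along
# axis 0 of a single variable's record.
t_mean = time_series["T"].mean(axis=0)
```

The payoff is visible in the last line: a user who wants only temperature downloads and reduces one contiguous record instead of touching all 120 multi-variable files.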
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Dean N.
2011-04-02
This report summarizes work carried out by the Earth System Grid Center for Enabling Technologies (ESG-CET) from October 1, 2010 through March 31, 2011. It discusses ESG-CET highlights for the reporting period, overall progress, period goals, and collaborations, and lists papers and presentations. To learn more about our project and to find previous reports, please visit the ESG-CET Web sites: http://esg-pcmdi.llnl.gov/ and/or https://wiki.ucar.edu/display/esgcet/Home. This report will be forwarded to managers in the Department of Energy (DOE) Scientific Discovery through Advanced Computing (SciDAC) program and the Office of Biological and Environmental Research (OBER), as well as national and international collaborators and stakeholders (e.g., those involved in the Coupled Model Intercomparison Project, phase 5 (CMIP5) for the Intergovernmental Panel on Climate Change (IPCC) 5th Assessment Report (AR5); the Community Earth System Model (CESM); the Climate Science Computational End Station (CCES); SciDAC II: A Scalable and Extensible Earth System Model for Climate Change Science; the North American Regional Climate Change Assessment Program (NARCCAP); the Atmospheric Radiation Measurement (ARM) program; the National Aeronautics and Space Administration (NASA); the National Oceanic and Atmospheric Administration (NOAA)), and also to researchers working on a variety of other climate model and observation evaluation activities. The ESG-CET executive committee consists of Dean N. Williams, Lawrence Livermore National Laboratory (LLNL); Ian Foster, Argonne National Laboratory (ANL); and Don Middleton, National Center for Atmospheric Research (NCAR).
The ESG-CET team is a group of researchers and scientists with diverse domain knowledge, whose home institutions include eight laboratories and two universities: ANL, Los Alamos National Laboratory (LANL), Lawrence Berkeley National Laboratory (LBNL), LLNL, NASA/Jet Propulsion Laboratory (JPL), NCAR, Oak Ridge National Laboratory (ORNL), Pacific Marine Environmental Laboratory (PMEL)/NOAA, Rensselaer Polytechnic Institute (RPI), and University of Southern California, Information Sciences Institute (USC/ISI). All ESG-CET work is accomplished under DOE open-source guidelines and in close collaboration with the project's stakeholders, domain researchers, and scientists. Through the ESG project, the ESG-CET team has developed and delivered a production environment for climate data from multiple climate model sources (e.g., CMIP (IPCC), CESM, ocean model data (e.g., Parallel Ocean Program), observation data (e.g., Atmospheric Infrared Sounder, Microwave Limb Sounder), and analysis and visualization tools) that serves a worldwide climate research community. Data holdings are distributed across multiple sites including LANL, LBNL, LLNL, NCAR, and ORNL as well as unfunded partners sites such as the Australian National University (ANU) National Computational Infrastructure (NCI), the British Atmospheric Data Center (BADC), the Geophysical Fluid Dynamics Laboratory/NOAA, the Max Planck Institute for Meteorology (MPI-M), the German Climate Computing Centre (DKRZ), and NASA/JPL. As we transition from development activities to production and operations, the ESG-CET team is tasked with making data available to all users who want to understand it, process it, extract value from it, visualize it, and/or communicate it to others. This ongoing effort is extremely large and complex, but it will be incredibly valuable for building 'science gateways' to critical climate resources (such as CESM, CMIP5, ARM, NARCCAP, Atmospheric Infrared Sounder (AIRS), etc.) 
for processing the next IPCC assessment report. Continued ESG progress will result in a production-scale system that will empower scientists to attempt new and exciting data exchanges, which could ultimately lead to breakthrough climate science discoveries.
From the Last Interglacial to the Anthropocene: Modelling a Complete Glacial Cycle (PalMod)
NASA Astrophysics Data System (ADS)
Brücher, Tim; Latif, Mojib
2017-04-01
We will give a short overview and update on the current status of the national climate modelling initiative PalMod (Paleo Modelling, www.palmod.de). PalMod focuses on understanding the dynamics of the climate system and its variability during the last glacial cycle. The initiative is funded by the German Federal Ministry of Education and Research (BMBF) and its specific topics are: (i) to identify and quantify the relative contributions of the fundamental processes which determined the Earth's climate trajectory and variability during the last glacial cycle, (ii) to simulate with comprehensive Earth System Models (ESMs) the climate from the peak of the last interglacial - the Eemian warm period - up to the present, including the changes in the spectrum of variability, and (iii) to assess possible future climate trajectories beyond this century and into the next millennia with sophisticated ESMs tested in this way. The research is intended to be conducted over a period of 10 years, but with shorter funding cycles. PalMod kicked off in February 2016. The first phase focuses on the last deglaciation (approximately the last 23,000 years). From the ESM perspective, PalMod pushes forward model development by coupling ESMs with dynamical ice sheet models. Computer scientists work on speeding up climate models using different concepts (such as parallelisation in time), and one working group is dedicated to performing a comprehensive data synthesis to validate model performance. The envisioned approach is innovative in three respects. First, the consortium aims at simulating a full glacial cycle in transient mode with comprehensive ESMs which allow full interactions between the physical and biogeochemical components of the Earth system, including ice sheets.
Second, we shall address climate variability during the last glacial cycle on a large range of time scales, from interannual to multi-millennial, and attempt to quantify the relative contributions of external forcing and processes internal to the Earth system to climate variability at different time scales. Third, in order to achieve a higher level of understanding of natural climate variability at time scales of millennia, its governing processes and implications for the future climate, we bring together three different research communities: the Earth system modeling community, the proxy data community and the computational science community. The consortium consists of 18 partners including all major modelling centers within Germany. The funding comprises approximately 65 PostDoc positions and more than 120 scientists are involved. PalMod is coordinated at the Helmholtz Centre for Ocean Research Kiel (GEOMAR).
Potential effects of climate change on ground water in Lansing, Michigan
Croley, T.E.; Luukkonen, C.L.
2003-01-01
Computer simulations involving general circulation models, a hydrologic modeling system, and a ground water flow model indicate potential impacts of selected climate change projections on ground water levels in the Lansing, Michigan, area. General circulation models developed by the Canadian Climate Centre and the Hadley Centre generated meteorology estimates for 1961 through 1990 (as a reference condition) and for the 20 years centered on 2030 (as a changed climate condition). Using these meteorology estimates, the Great Lakes Environmental Research Laboratory's hydrologic modeling system produced corresponding period streamflow simulations. Ground water recharge was estimated from the streamflow simulations and from variables derived from the general circulation models. The U.S. Geological Survey developed a numerical ground water flow model of the Saginaw and glacial aquifers in the Tri-County region surrounding Lansing, Michigan. Model simulations, using the ground water recharge estimates, indicate changes in ground water levels. Within the Lansing area, simulated ground water levels in the Saginaw aquifer declined under the Canadian predictions and increased under the Hadley.
NASA Astrophysics Data System (ADS)
Cofino, A. S.; Fernández Quiruelas, V.; Blanco Real, J. C.; García Díez, M.; Fernández, J.
2013-12-01
Nowadays, Grid computing is a powerful computational tool that is ready to be used by the scientific community in different areas (such as biomedicine, astrophysics, climate, etc.). However, the use of these distributed computing infrastructures (DCIs) is not yet common practice in climate research, and only a few teams and applications in this area take advantage of them. Thus, the objective of the WRF4G project is to popularize the use of this technology in the atmospheric sciences. To achieve this objective, one of the most widely used applications has been taken (WRF, a limited-area model and successor of the MM5 model), which has a user community of more than 8000 researchers worldwide. This community develops its research activity in different areas and could benefit from the advantages of Grid resources (case study simulations, regional hindcasts/forecasts, sensitivity studies, etc.). The WRF model is used by many groups in the climate research community to carry out downscaling simulations, so this community will also benefit. However, Grid infrastructures have some drawbacks for the execution of applications that make intensive use of CPU and memory over long periods of time. This makes it necessary to develop a specific framework (middleware) that encapsulates the application and provides appropriate services for the monitoring and management of the simulations and the data. Thus, another objective of the WRF4G project is the development of a generic adaptation of WRF to DCIs. It should simplify access to DCIs for researchers, and also free them from the technical and computational aspects of using these infrastructures.
Finally, in order to demonstrate the ability of WRF4G to solve actual scientific challenges of interest and relevance to climate science (implying a high computational cost), we will show results from different kinds of downscaling experiments, such as ERA-Interim re-analysis, CMIP5 models, or seasonal simulations. WRF4G is being used to run WRF simulations that are contributing to the CORDEX initiative and to other projects such as SPECS and EUPORIAS. This work has been partially funded by the European Regional Development Fund (ERDF) and the Spanish National R&D Plan 2008-2011 (CGL2011-28864).
Temporal and spatial variability of groundwater recharge on Jeju Island, Korea
Mair, Alan; Hagedorn, Benjamin; Tillery, Suzanne; El-Kadi, Aly I.; Westenbroek, Stephen M.; Ha, Kyoochul; Koh, Gi-Won
2013-01-01
Estimates of the spatial and temporal variability of groundwater recharge are essential inputs to groundwater flow models that are used to test groundwater availability under different management and climate conditions. In this study, a soil water balance analysis was conducted to estimate groundwater recharge on the island of Jeju, Korea, for baseline, drought, and climate-land use change scenarios. The Soil Water Balance (SWB) computer code was used to compute groundwater recharge and other water balance components at a daily time step using a 100 m grid cell size for an 18-year baseline scenario (1992–2009). A 10-year drought scenario was selected from historical precipitation trends (1961–2009), while the climate-land use change scenario was developed using late 21st century climate projections and a change in urban land use. Mean annual recharge under the baseline, drought, and climate-land use scenarios was estimated at 884, 591, and 788 mm, respectively. Under the baseline scenario, mean annual recharge was within the range of previous estimates (825–959 mm) and only slightly lower than their mean of 902 mm. As a fraction of mean annual rainfall, mean annual recharge was computed as only 42%, less than previous estimates of 44–48%. The maximum historical reported annual pumping rate of 241 × 10⁶ m³ equates to 15% of baseline recharge, which is within the range of 14–16% computed from earlier studies. The model does not include a mechanism to account for additional sources of groundwater recharge, such as fog drip, irrigation, and artificial recharge, and may also overestimate evapotranspiration losses. Consequently, the results presented in this study represent a conservative estimate of total recharge.
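The daily bookkeeping behind a soil-water-balance recharge estimate can be sketched for a single grid cell: water entering the soil beyond its storage capacity becomes recharge. The forcing distributions and the storage capacity below are illustrative assumptions, not SWB's algorithms or Jeju's parameters.

```python
import numpy as np

# Synthetic daily forcing for one year at one cell.
rng = np.random.default_rng(7)
days = 365
precip = rng.gamma(0.3, 15.0, size=days)                 # mm/day, skewed
et = np.clip(rng.normal(3.0, 1.0, size=days), 0, None)   # mm/day

capacity = 120.0              # soil-moisture storage capacity (mm)
storage, recharge = 60.0, 0.0
for p, e in zip(precip, et):
    # ET is limited to the water actually available that day.
    storage += p - min(e, storage + p)
    # Any surplus above capacity percolates below the root zone.
    if storage > capacity:
        recharge += storage - capacity
        storage = capacity

annual_recharge = recharge    # mm/yr of recharge for this cell
```

Running this per-cell accounting over a 100 m grid and 18 years of observed forcing is, in outline, what produces the gridded recharge estimates described above.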
Daily pan evaporation modelling using a neuro-fuzzy computing technique
NASA Astrophysics Data System (ADS)
Kişi, Özgür
2006-10-01
Evaporation, as a major component of the hydrologic cycle, is important in water resources development and management. This paper investigates the ability of the neuro-fuzzy (NF) technique to improve the accuracy of daily evaporation estimation. Five different NF models comprising various combinations of daily climatic variables, that is, air temperature, solar radiation, wind speed, pressure and humidity, are developed to evaluate the degree of effect of each of these variables on evaporation. A comparison is made between the estimates provided by the NF models and by artificial neural networks (ANNs). The Stephens-Stewart (SS) method is also considered for the comparison. Various statistical measures are used to evaluate the performance of the models. Based on the comparisons, it was found that the NF computing technique could be employed successfully in modelling the evaporation process from the available climatic data. The ANN was also found to perform better than the SS method.
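The abstract does not name the statistical measures used; RMSE and MAE, assumed here for illustration, are typical choices for comparing estimated against observed daily evaporation:

```python
import math

def rmse(obs, est):
    """Root-mean-square error between observed and estimated values."""
    return math.sqrt(sum((o - e) ** 2 for o, e in zip(obs, est)) / len(obs))

def mae(obs, est):
    """Mean absolute error between observed and estimated values."""
    return sum(abs(o - e) for o, e in zip(obs, est)) / len(obs)
```

Lower values of either measure indicate that a model (NF, ANN, or SS) tracks the observed pan-evaporation series more closely.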
The Community Climate System Model.
NASA Astrophysics Data System (ADS)
Blackmon, Maurice; Boville, Byron; Bryan, Frank; Dickinson, Robert; Gent, Peter; Kiehl, Jeffrey; Moritz, Richard; Randall, David; Shukla, Jagadish; Solomon, Susan; Bonan, Gordon; Doney, Scott; Fung, Inez; Hack, James; Hunke, Elizabeth; Hurrell, James; Kutzbach, John; Meehl, Jerry; Otto-Bliesner, Bette; Saravanan, R.; Schneider, Edwin K.; Sloan, Lisa; Spall, Michael; Taylor, Karl; Tribbia, Joseph; Washington, Warren
2001-11-01
The Community Climate System Model (CCSM) has been created to represent the principal components of the climate system and their interactions. Development and applications of the model are carried out by the U.S. climate research community, thus taking advantage of both wide intellectual participation and computing capabilities beyond those available to most individual U.S. institutions. This article outlines the history of the CCSM, its current capabilities, and plans for its future development and applications, with the goal of providing a summary useful to present and future users. The initial version of the CCSM included atmosphere and ocean general circulation models, a land surface model that was grafted onto the atmosphere model, a sea-ice model, and a flux coupler that facilitates information exchanges among the component models with their differing grids. This version of the model produced a successful 300-yr simulation of the current climate without artificial flux adjustments. The model was then used to perform a coupled simulation in which the atmospheric CO2 concentration increased by 1% per year. In this version of the coupled model, the ocean salinity and deep-ocean temperature slowly drifted away from observed values. A subsequent correction to the roughness length used for sea ice significantly reduced these errors. An updated version of the CCSM was used to perform three simulations of the twentieth century's climate, and several projections of the climate of the twenty-first century. The CCSM's simulation of the tropical ocean circulation has been significantly improved by reducing the background vertical diffusivity and incorporating an anisotropic horizontal viscosity tensor. The meridional resolution of the ocean model was also refined near the equator. These changes have resulted in a greatly improved simulation of both the Pacific equatorial undercurrent and the surface countercurrents.
The interannual variability of the sea surface temperature in the central and eastern tropical Pacific is also more realistic in simulations with the updated model. Scientific challenges to be addressed with future versions of the CCSM include realistic simulation of the whole atmosphere, including the middle and upper atmosphere, as well as the troposphere; simulation of changes in the chemical composition of the atmosphere through the incorporation of an integrated chemistry model; inclusion of global, prognostic biogeochemical components for land, ocean, and atmosphere; simulations of past climates, including times of extensive continental glaciation as well as times with little or no ice; studies of natural climate variability on seasonal-to-centennial timescales; and investigations of anthropogenic climate change. In order to make such studies possible, work is under way to improve all components of the model. Plans call for a new version of the CCSM to be released in 2002. Planned studies with the CCSM will require much more computer power than is currently available.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oubeidillah, Abdoul A; Kao, Shih-Chieh; Ashfaq, Moetasim
2014-01-01
To extend geographical coverage, refine spatial resolution, and improve modeling efficiency, a computation- and data-intensive effort was conducted to organize a comprehensive hydrologic dataset with post-calibrated model parameters for hydro-climate impact assessment. Several key inputs for hydrologic simulation including meteorologic forcings, soil, land class, vegetation, and elevation were collected from multiple best-available data sources and organized for 2107 hydrologic subbasins (8-digit hydrologic units, HUC8s) in the conterminous United States at a refined 1/24° (~4 km) spatial resolution. Using high-performance computing for intensive model calibration, a high-resolution parameter dataset was prepared for the macro-scale Variable Infiltration Capacity (VIC) hydrologic model. The VIC simulation was driven by DAYMET daily meteorological forcing and was calibrated against USGS WaterWatch monthly runoff observations for each HUC8. The results showed that this new parameter dataset may help reasonably simulate runoff at most US HUC8 subbasins. Based on this exhaustive calibration effort, it is now possible to accurately estimate the resources required for further model improvement across the entire conterminous United States. We anticipate that through this hydrologic parameter dataset, the repeated effort of fundamental data processing can be lessened, so that research efforts can emphasize the more challenging task of assessing climate change impacts. The pre-organized model parameter dataset will be provided to interested parties to support further hydro-climate impact assessment.
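A common objective function for calibrating simulated against observed monthly runoff is the Nash-Sutcliffe efficiency; the abstract does not state which calibration metric was used, so NSE is an illustrative assumption:

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency for paired observed/simulated series.

    NSE = 1 is a perfect fit; NSE = 0 means the simulation performs
    no better than simply predicting the observed mean every month.
    """
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den
```

A calibration run would maximize this value per HUC8 by adjusting the VIC soil and routing parameters.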
The Mediterranean surface wave climate inferred from future scenario simulations
NASA Astrophysics Data System (ADS)
Lionello, P.; Cogo, S.; Galati, M. B.; Sanna, A.
2008-09-01
This study is based on 30-year long simulations of the wind-wave field in the Mediterranean Sea carried out with the WAM model. Wave fields have been computed for the 2071-2100 period of the A2 and B2 emission scenarios and for the 1961-1990 period of the present climate (REF). The wave model has been forced by the wind field computed by a regional climate model with 50 km resolution. The mean SWH (Significant Wave Height) field over a large fraction of the Mediterranean Sea is lower for the A2 scenario than for the present climate during winter, spring and autumn. During summer the A2 mean SWH field is also lower everywhere, except for two areas, between Greece and Northern Africa and between Spain and Algeria, where it is significantly higher. All these changes are similar, though smaller and less significant, in the B2 scenario, except during winter in the north-western Mediterranean Sea, when the B2 mean SWH field is higher than in the REF simulation. Extreme SWH values are also smaller in the future scenarios than in the present climate, and this SWH change is larger for the A2 than for the B2 scenario. The only exception is the presence of higher SWH extremes in the central Mediterranean during summer for the A2 scenario. In general, changes of SWH, wind speed and atmospheric circulation are consistent, and the results show milder marine storms in the future scenarios than in the present climate.
Climate, weather, space weather: model development in an operational context
NASA Astrophysics Data System (ADS)
Folini, Doris
2018-05-01
Aspects of operational modeling for climate, weather, and space weather forecasts are contrasted, with a particular focus on the somewhat conflicting demands of "operational stability" versus "dynamic development" of the involved models. Some common key elements are identified, indicating potential for fruitful exchange across communities. Operational model development is compelling, driven by factors that broadly fall into four categories: model skill, basic physics, advances in computer architecture, and new aspects to be covered, ranging from customer needs through physics to observational data. Evaluation of model skill as part of the operational chain goes beyond an automated skill score. Permanent interaction between "pure research" and "operational forecast" people is beneficial to both sides. This includes joint model development projects, although ultimate responsibility for the operational code remains with the forecast provider. The pace of model development reflects operational lead times. The points are illustrated with selected examples, many of which reflect the author's background and personal contacts, notably with the Swiss Weather Service and the Max Planck Institute for Meteorology, Hamburg, Germany. In view of current and future challenges, large collaborations covering a range of expertise are a must - within and across climate, weather, and space weather. To profit from and cope with the rapid progress of computer architectures, supercomputing centers must form part of the team.
NASA Technical Reports Server (NTRS)
Lin, Shian-Jiann; Atlas, Robert (Technical Monitor)
2002-01-01
The Data Assimilation Office (DAO) has been developing a new generation of ultra-high resolution General Circulation Model (GCM) that is suitable for 4-D data assimilation, numerical weather predictions, and climate simulations. These three applications have conflicting requirements. For 4-D data assimilation and weather predictions, it is highly desirable to run the model at the highest possible spatial resolution (e.g., 55 km or finer) so as to be able to resolve and predict socially and economically important weather phenomena such as tropical cyclones, hurricanes, and severe winter storms. For climate change applications, the model simulations need to be carried out for decades, if not centuries. To reduce uncertainty in climate change assessments, the next generation model would also need to be run at a fine enough spatial resolution that can at least marginally simulate the effects of intense tropical cyclones. Scientific problems (e.g., parameterization of subgrid scale moist processes) aside, all three areas of application require the model's computational performance to be dramatically improved as compared to the previous generation. In this talk, I will present the current and future developments of the "finite-volume dynamical core" at the Data Assimilation Office. This dynamical core applies modern monotonicity-preserving algorithms and is genuinely conservative by construction, not by an ad hoc fixer. The "discretization" of the conservation laws is purely local, which is clearly advantageous for resolving sharp gradient flow features. In addition, the local nature of the finite-volume discretization also has a significant advantage on distributed memory parallel computers. Together with a unique vertically Lagrangian control volume discretization that essentially reduces the dimension of the computational problem from three to two, the finite-volume dynamical core is very efficient, particularly at high resolutions. 
I will also present the computational design of the dynamical core using a hybrid distributed-shared memory programming paradigm that is portable to virtually any of today's high-end parallel super-computing clusters.
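A minimal 1D sketch (not the DAO core itself) shows why a flux-form finite-volume scheme is conservative by construction: each cell is updated by the difference of the fluxes through its two faces, so the domain total cannot change. First-order upwind fluxes, used here for brevity, are the simplest monotone choice:

```python
def upwind_advect(q, courant):
    """One step of first-order upwind, flux-form finite-volume advection
    on a periodic 1D grid (courant = u*dt/dx, assumed in [0, 1]).

    Each cell's update is flux-in minus flux-out, so sum(q) is conserved
    exactly, and the upwind flux creates no new extrema (monotone).
    """
    n = len(q)
    # flux through the left face of cell i comes from the upwind cell i-1
    flux = [courant * q[(i - 1) % n] for i in range(n)]
    return [q[i] + flux[i] - flux[(i + 1) % n] for i in range(n)]
```

With `courant = 1` the profile shifts exactly one cell per step; for any Courant number in range, the total mass is unchanged to round-off.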
CICE, The Los Alamos Sea Ice Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunke, Elizabeth; Lipscomb, William; Jones, Philip
The Los Alamos sea ice model (CICE) is the result of an effort to develop a computationally efficient sea ice component for a fully coupled atmosphere–land–ocean–ice global climate model. It was originally designed to be compatible with the Parallel Ocean Program (POP), an ocean circulation model developed at Los Alamos National Laboratory for use on massively parallel computers. CICE has several interacting components: a vertical thermodynamic model that computes local growth rates of snow and ice due to vertical conductive, radiative and turbulent fluxes, along with snowfall; an elastic-viscous-plastic model of ice dynamics, which predicts the velocity field of the ice pack based on a model of the material strength of the ice; an incremental remapping transport model that describes horizontal advection of the areal concentration, ice and snow volume and other state variables; and a ridging parameterization that transfers ice among thickness categories based on energetic balances and rates of strain. It also includes a biogeochemical model that describes evolution of the ice ecosystem. The CICE sea ice model is used for climate research as one component of complex global earth system models that include atmosphere, land, ocean and biogeochemistry components. It is also used for operational sea ice forecasting in the polar regions and in numerical weather prediction models.
NASA Astrophysics Data System (ADS)
Jang, W.; Engda, T. A.; Neff, J. C.; Herrick, J.
2017-12-01
Many crop models are increasingly used to evaluate crop yields at regional and global scales. However, implementation of these models across large areas using fine-scale grids is limited by computational time requirements. In order to facilitate global gridded crop modeling with various scenarios (i.e., different crop, management schedule, fertilizer, and irrigation) using the Environmental Policy Integrated Climate (EPIC) model, we developed a distributed parallel computing framework in Python. A local desktop with 14 cores (28 threads) was used to test the framework in Iringa, Tanzania, which has 406,839 grid cells. High-resolution soil data, SoilGrids (250 x 250 m), and climate data, AgMERRA (0.25 x 0.25 deg), were used as input to the gridded EPIC model. The framework includes a master file for parallel computing, an input database, input data formatters, EPIC model execution, and output analyzers. Through the master file, the EPIC simulation is divided into jobs across the user-defined number of CPU threads. The input data formatters convert the raw database into EPIC input data, which is passed to the EPIC simulation jobs; 28 EPIC jobs then run simultaneously, and only the result files of interest are parsed and passed to the output analyzers. We applied various scenarios with seven different slopes and twenty-four fertilizer ranges. Parallelized input generators create the different scenarios as a list for distributed parallel computing. After all simulations are completed, parallelized output analyzers are used to analyze all outputs according to the different scenarios. This saves significant computing time and resources, making it possible to conduct gridded modeling at regional to global scales with high-resolution data. 
For example, serial processing for the Iringa test case would require 113 hours, while using the framework developed in this study requires only approximately 6 hours, a nearly 95% reduction in computing time.
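The fan-out of per-cell EPIC jobs over a worker pool can be sketched as follows. All names and the fake yield formula are hypothetical, and a thread pool stands in for the framework's parallelism, which is reasonable when each real job launches the EPIC executable as an external process:

```python
from concurrent.futures import ThreadPoolExecutor

def run_epic_job(task):
    """Stand-in for one EPIC simulation job. The real framework writes
    EPIC input files for the scenario, launches the EPIC executable,
    and parses the output files of interest; here we just return a
    fabricated yield (t/ha) so the fan-out logic can be demonstrated."""
    cell_id, slope_pct, fert_kg_ha = task
    return cell_id, round(2.0 + 0.01 * fert_kg_ha - 0.05 * slope_pct, 3)

def run_gridded(tasks, n_workers=28):
    """Distribute per-cell jobs across a user-defined number of workers
    and collect results keyed by grid-cell id."""
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return dict(pool.map(run_epic_job, tasks))
```

Scaling `tasks` to the full scenario list (cells × slopes × fertilizer ranges) gives the kind of wall-clock reduction reported above.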
NASA Astrophysics Data System (ADS)
Shulgina, T. M.; Gordova, Y. E.; Martynova, Y. V.
2014-12-01
A problem of making education relevant to workplace tasks is a key challenge of higher education in the professional field of environmental sciences. To answer this challenge, several new courses for students of the "Climatology" and "Meteorology" specialties were developed and implemented at Tomsk State University, combining theoretical knowledge from up-to-date environmental sciences with computational tasks. To organize the educational process we use the open-source course management system Moodle (www.moodle.org), which gave us an opportunity to combine text and multimedia in the theoretical part of the courses. The hands-on approach is realized through innovative trainings performed within the information-computational web GIS platform "Climate" (http://climate.scert.ru/). The platform has a set of tools and databases allowing a researcher to perform climate change analysis for a selected territory. The tools are also used for student trainings, which contain practical tasks on climate modeling and climate change assessment and analysis. Laboratory exercises cover three topics: "Analysis of regional climate changes"; "Analysis of climate extreme indices on the regional scale"; and "Analysis of future climate". They are designed to consolidate students' knowledge of the discipline, to instill the skills to work independently with large amounts of geophysical data using the modern processing and analysis tools of the web-GIS platform "Climate", and to train them to present the results of laboratory work as reports with a statement of the problem, the results of calculations, and a logically justified conclusion. Thus, students are engaged in the use of modern tools of geophysical data analysis, which strengthens their professional learning. 
The approach can help to fill this gap because it is the only approach that offers hands-on experience, increases student involvement, and advances the use of modern information and communication tools. Financial support for this research from the RFBR (13-05-12034, 14-05-00502), SB RAS project VIII.80.2.1 and the grant of the President of RF (№ 181) is acknowledged.
Predicting Coupled Ocean-Atmosphere Modes with a Climate Modeling Hierarchy -- Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michael Ghil, UCLA; Andrew W. Robertson, IRI, Columbia Univ.; Sergey Kravtsov, U. of Wisconsin, Milwaukee
The goal of the project was to determine midlatitude climate predictability associated with tropical-extratropical interactions on interannual-to-interdecadal time scales. Our strategy was to develop and test a hierarchy of climate models, bringing together large GCM-based climate models with simple fluid-dynamical coupled ocean-ice-atmosphere models, through the use of advanced probabilistic network (PN) models. PN models were used to develop a new diagnostic methodology for analyzing coupled ocean-atmosphere interactions in large climate simulations made with the NCAR Parallel Climate Model (PCM), and to make these tools user-friendly and available to other researchers. We focused on interactions between the tropics and extratropics through atmospheric teleconnections (the Hadley cell, Rossby waves and nonlinear circulation regimes) over both the North Atlantic and North Pacific, and the ocean's thermohaline circulation (THC) in the Atlantic. We tested the hypothesis that variations in the strength of the THC alter sea surface temperatures in the tropical Atlantic, and that the latter influence the atmosphere in high latitudes through an atmospheric teleconnection, feeding back onto the THC. The PN model framework was used to mediate between the understanding gained with simplified primitive equations models and multi-century simulations made with the PCM. The project team is interdisciplinary and built on an existing synergy between atmospheric and ocean scientists at UCLA, computer scientists at UCI, and climate researchers at the IRI.
Selecting climate simulations for impact studies based on multivariate patterns of climate change.
Mendlik, Thomas; Gobiet, Andreas
In climate change impact research it is crucial to carefully select the meteorological input for impact models. We present a method for model selection that enables the user to shrink the ensemble to a few representative members, conserving the model spread and accounting for model similarity. This is done in three steps: First, using principal component analysis for a multitude of meteorological parameters, to find common patterns of climate change within the multi-model ensemble. Second, detecting model similarities with regard to these multivariate patterns using cluster analysis. And third, sampling models from each cluster, to generate a subset of representative simulations. We present an application based on the ENSEMBLES regional multi-model ensemble with the aim to provide input for a variety of climate impact studies. We find that the two most dominant patterns of climate change relate to temperature and humidity patterns. The ensemble can be reduced from 25 to 5 simulations while still maintaining its essential characteristics. Having such a representative subset of simulations reduces computational costs for climate impact modeling and enhances the quality of the ensemble at the same time, as it prevents double-counting of dependent simulations that would lead to biased statistics. The online version of this article (doi:10.1007/s10584-015-1582-0) contains supplementary material, which is available to authorized users.
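The three-step selection (PCA, clustering, sampling) can be sketched as follows. The use of k-means and of a closest-to-centre sampling rule are illustrative assumptions, not necessarily the paper's exact clustering and sampling choices:

```python
import numpy as np

def select_representatives(X, n_components=2, n_clusters=3, seed=0):
    """Shrink an ensemble to representative members.
    Rows of X are simulations; columns are climate-change signals
    (e.g., seasonal temperature and humidity changes per region)."""
    # Step 1: PCA -- project simulations onto the leading common patterns
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ vt[:n_components].T
    # Step 2: k-means in PC space to group similar simulations
    rng = np.random.default_rng(seed)
    centers = scores[rng.choice(len(scores), n_clusters, replace=False)]
    for _ in range(50):
        d2 = ((scores[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(axis=1)
        centers = np.array([scores[labels == k].mean(axis=0)
                            if np.any(labels == k) else centers[k]
                            for k in range(n_clusters)])
    # Step 3: sample the member closest to each cluster centre
    picks = {int(((scores - c) ** 2).sum(-1).argmin()) for c in centers}
    return sorted(picks)
```

Keeping one member per cluster preserves the ensemble spread in the leading change patterns while avoiding double-counting of near-duplicate simulations.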
Chang, Howard H.; Hao, Hua; Sarnat, Stefanie Ebelt
2014-01-01
The adverse health effects of ambient ozone are well established. Given the high sensitivity of ambient ozone concentrations to meteorological conditions, the impacts of future climate change on ozone concentrations and its associated health effects are of concern. We describe a statistical modeling framework for projecting future ozone levels and its health impacts under a changing climate. This is motivated by the continual effort to evaluate projection uncertainties to inform public health risk assessment. The proposed approach was applied to the 20-county Atlanta metropolitan area using regional climate model (RCM) simulations from the North American Regional Climate Change Assessment Program. Future ozone levels and ozone-related excesses in asthma emergency department (ED) visits were examined for the period 2041–2070. The computationally efficient approach allowed us to consider 8 sets of climate model outputs based on different combinations of 4 RCMs and 4 general circulation models. Compared to the historical period of 1999–2004, we found consistent projections across climate models of an average 11.5% higher ozone levels (range: 4.8%, 16.2%), and an average 8.3% (range: −7% to 24%) higher number of ozone exceedance days. Assuming no change in the at-risk population, this corresponds to excess ozone-related ED visits ranging from 267 to 466 visits per year. Health impact projection uncertainty was driven predominantly by uncertainty in the health effect association and climate model variability. Calibrating climate simulations with historical observations reduced differences in projections across climate models. PMID:24764746
PP-SWAT: A Python-based computing software for efficient multiobjective calibration of SWAT
USDA-ARS?s Scientific Manuscript database
With enhanced data availability, distributed watershed models for large areas with high spatial and temporal resolution are increasingly used to understand water budgets and examine effects of human activities and climate change/variability on water resources. Developing parallel computing software...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duane, Greg; Tsonis, Anastasios; Kocarev, Ljupco
This collaborative research has several components, but the main idea is that when imperfect copies of a given nonlinear dynamical system are coupled, they may synchronize for some set of coupling parameters. This idea is to be tested for several IPCC-like models, each with its own formulation and representing an "imperfect" copy of the true climate system. By computing the coupling parameters that lead the models to a synchronized state, a consensus on climate change simulations may be achieved.
Tsushima, Yoko; Manabe, Syukuro
2013-05-07
In the climate system, two types of radiative feedback are in operation. The feedback of the first kind involves the radiative damping of the vertically uniform temperature perturbation of the troposphere and Earth's surface that approximately follows the Stefan-Boltzmann law of blackbody radiation. The second kind involves the change in the vertical lapse rate of temperature, water vapor, and clouds in the troposphere and albedo of the Earth's surface. Using satellite observations of the annual variation of the outgoing flux of longwave radiation and that of reflected solar radiation at the top of the atmosphere, this study estimates the so-called "gain factor," which characterizes the strength of radiative feedback of the second kind that operates on the annually varying, global-scale perturbation of temperature at the Earth's surface. The gain factor is computed not only for all sky but also for clear sky. The gain factor of so-called "cloud radiative forcing" is then computed as the difference between the two. The gain factors thus obtained are compared with those obtained from 35 models that were used for the fourth and fifth Intergovernmental Panel on Climate Change assessment. Here, we show that the gain factors obtained from satellite observations of cloud radiative forcing are effective for identifying systematic biases of the feedback processes that control the sensitivity of simulated climate, providing useful information for validating and improving a climate model.
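The cloud-radiative-forcing gain factor, computed as the difference between the all-sky and clear-sky gains, can be sketched as a regression slope normalized by the Planck (blackbody) damping rate. The simple least-squares estimator and the 3.2 W m⁻² K⁻¹ normalization below are illustrative assumptions, not the paper's exact method:

```python
def gain_factor(ts_anom, flux_anom, lambda_planck=3.2):
    """Least-squares slope of the TOA flux anomaly (W m-2) against the
    surface-temperature anomaly (K), expressed as a nondimensional gain
    relative to the Planck damping rate lambda_planck (W m-2 K-1)."""
    n = len(ts_anom)
    mt = sum(ts_anom) / n
    mf = sum(flux_anom) / n
    slope = (sum((t - mt) * (f - mf) for t, f in zip(ts_anom, flux_anom))
             / sum((t - mt) ** 2 for t in ts_anom))
    return slope / lambda_planck

def cloud_gain(ts, flux_all, flux_clear, lambda_planck=3.2):
    """Gain of cloud radiative forcing: all-sky minus clear-sky gain."""
    return (gain_factor(ts, flux_all, lambda_planck)
            - gain_factor(ts, flux_clear, lambda_planck))
```

Applied to the annual cycle of observed and simulated fluxes, such a diagnostic lets the observed cloud gain serve as a benchmark for the 35 assessed models.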
How Unusual were Hurricane Harvey's Rains?
NASA Astrophysics Data System (ADS)
Emanuel, K.
2017-12-01
We apply an advanced technique for hurricane risk assessment to evaluate the probability of hurricane rainfall of Harvey's magnitude. The technique embeds a detailed computational hurricane model in the large-scale conditions represented by climate reanalyses and by climate models. We simulate 3700 hurricane events affecting the state of Texas, from each of three climate reanalyses spanning the period 1980-2016, and 2000 events from each of six climate models for each of two periods: the period 1981-2000 from historical simulations, and the period 2081-2100 from future simulations under Representative Concentration Pathway (RCP) 8.5. On the basis of these simulations, we estimate that hurricane rain of Harvey's magnitude in the state of Texas would have had an annual probability of 0.01 in the late twentieth century, and will have an annual probability of 0.18 by the end of this century, with remarkably small scatter among the six climate models downscaled. If the event frequency is changing linearly over time, this would yield an annual probability of 0.06 in 2017.
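The linear-in-time estimate in the final sentence can be reproduced by anchoring the two simulated annual probabilities at the midpoints of their periods (1990 and 2090, an interpretation assumed here for illustration):

```python
def interp_probability(year, y0=1990, p0=0.01, y1=2090, p1=0.18):
    """Linearly interpolate the annual exceedance probability between
    the midpoints of the late-20th-century and late-21st-century
    simulation periods."""
    return p0 + (p1 - p0) * (year - y0) / (y1 - y0)
```

Evaluated at 2017 this gives about 0.056, consistent with the quoted annual probability of 0.06.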
Benchmarking NWP Kernels on Multi- and Many-core Processors
NASA Astrophysics Data System (ADS)
Michalakes, J.; Vachharajani, M.
2008-12-01
Increased computing power for weather, climate, and atmospheric science has provided direct benefits for defense, agriculture, the economy, the environment, and public welfare and convenience. Today, very large clusters with many thousands of processors are allowing scientists to move forward with simulations of unprecedented size. But time-critical applications such as real-time forecasting or climate prediction need strong scaling: faster nodes and processors, not more of them. Moreover, the need for good cost-performance has never been greater, both in terms of performance per watt and per dollar. For these reasons, the new generations of multi- and many-core processors being mass produced for commercial IT and "graphical computing" (video games) are being scrutinized for their ability to exploit the abundant fine-grain parallelism in atmospheric models. We present results of our work to date identifying key computational kernels within the dynamics and physics of a large community NWP model, the Weather Research and Forecast (WRF) model. We benchmark and optimize these kernels on several different multi- and many-core processors. The goals are to (1) characterize and model performance of the kernels in terms of computational intensity, data parallelism, memory bandwidth pressure, memory footprint, etc. (2) enumerate and classify effective strategies for coding and optimizing for these new processors, (3) assess difficulties and opportunities for tool or higher-level language support, and (4) establish a continuing set of kernel benchmarks that can be used to measure and compare effectiveness of current and future designs of multi- and many-core processors for weather and climate applications.
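The kernel characterization in goal (1) is commonly formalized through arithmetic intensity and a roofline-style performance bound; the bound below is a standard simplification assumed here, not a method stated in the abstract:

```python
def arithmetic_intensity(flops, bytes_moved):
    """Flops per byte of memory traffic: the key ratio for deciding
    whether a kernel is compute-bound or memory-bandwidth-bound."""
    return flops / bytes_moved

def roofline(intensity, peak_flops, peak_bw):
    """Attainable performance (flops/s) under the simple roofline model:
    capped by peak compute or by memory bandwidth times intensity."""
    return min(peak_flops, intensity * peak_bw)
```

Stencil-heavy WRF dynamics kernels typically have low intensity, so their attainable performance on a given processor is set by memory bandwidth rather than peak flops.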
A New Biogeochemical Computational Framework Integrated within the Community Land Model
NASA Astrophysics Data System (ADS)
Fang, Y.; Li, H.; Liu, C.; Huang, M.; Leung, L.
2012-12-01
Terrestrial biogeochemical processes, particularly carbon cycle dynamics, have been shown to significantly influence regional and global climate changes. Modeling terrestrial biogeochemical processes within the land component of Earth System Models such as the Community Land model (CLM), however, faces three major challenges: 1) extensive efforts in modifying modeling structures and rewriting computer programs to incorporate biogeochemical processes with increasing complexity, 2) expensive computational cost to solve the governing equations due to numerical stiffness inherited from large variations in the rates of biogeochemical processes, and 3) lack of an efficient framework to systematically evaluate various mathematical representations of biogeochemical processes. To address these challenges, we introduce a new computational framework to incorporate biogeochemical processes into CLM, which consists of a new biogeochemical module with a generic algorithm and reaction database. New and updated biogeochemical processes can be incorporated into CLM without significant code modification. To address the stiffness issue, algorithms and criteria will be developed to identify fast processes, which will be replaced with algebraic equations and decoupled from slow processes. This framework can serve as a generic and user-friendly platform to test out different mechanistic process representations and datasets and gain new insight on the behavior of the terrestrial ecosystems in response to climate change in a systematic way.
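The proposed decoupling of fast processes via algebraic substitution can be illustrated with a quasi-steady-state sketch for one fast intermediate feeding one slow pool; the rate constants are illustrative, not CLM chemistry:

```python
def step_slow_qss(c_slow, dt, source=1.0, k_fast=100.0, k_out=0.1):
    """Advance the slow pool one explicit step, replacing the fast
    intermediate by its quasi-steady-state value f* = source / k_fast
    (production balances consumption), so the stable time step is set
    by the slow rate k_out rather than by the stiff rate k_fast."""
    f_qss = source / k_fast                  # algebraic, not integrated
    return c_slow + dt * (k_fast * f_qss - k_out * c_slow)
```

Integrating the fast intermediate explicitly would instead force a time step of order 1/k_fast; the algebraic substitution removes that stiffness while preserving the slow pool's steady state (source / k_out).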
The Co-evolution of Climate Models and the Intergovernmental Panel on Climate Change
NASA Astrophysics Data System (ADS)
Somerville, R. C.
2010-12-01
As recently as the 1950s, global climate models, or GCMs, did not exist, and the notion that man-made carbon dioxide might lead to significant climate change was not regarded as a serious possibility by most experts. Today, of course, the prospect or threat of exactly this type of climate change dominates the science and ranks among the most pressing issues confronting all mankind. Indeed, the prevailing scientific view throughout the first half of the twentieth century was that adding carbon dioxide to the atmosphere would have only a negligible effect on climate. The science of climate change caused by atmospheric carbon dioxide changes has thus undergone a genuine revolution. An extraordinarily rapid development of global climate models has also characterized this period, especially in the three decades since about 1980. In these three decades, the number of GCMs has greatly increased, and their physical and computational aspects have both markedly improved. Modeling progress has been enabled by many scientific advances, of course, but especially by a massive increase in available computer power, with supercomputer speeds increasing by roughly a factor of a million in the three decades from about 1980 to 2010. This technological advance has permitted a rapid increase in the physical comprehensiveness of GCMs as well as in spatial computational resolution. In short, GCMs have dramatically evolved over time, in exactly the same recent period as popular interest and scientific concern about anthropogenic climate change have markedly increased. In parallel, a unique international organization, the Intergovernmental Panel on Climate Change, or IPCC, has also recently come into being and also evolved rapidly. Today, the IPCC has become widely respected and globally influential. The IPCC was founded in 1988, and its history is thus even shorter than that of GCMs. 
Yet, its stature today is such that a series of IPCC reports assessing climate change science has already been endorsed by many leading scientific professional societies and academies of science worldwide. These reports are considered as definitive summaries of the state of the science. In 2007, in recognition of its exceptional accomplishments, the IPCC shared the Nobel Peace Prize equally with Al Gore. The present era is characterized not only by the reality and seriousness of human-caused climate change, but also by a young yet powerful science that enables us to understand much about the climate change that has occurred already and that awaits in the future. The development of GCMs is a critical part of the scientific story, and the development of the IPCC is a key factor in connecting the science to the perceptions and priorities of the global public and policymakers. GCMs and the IPCC have co-evolved and strongly influenced one another, as both scientists and the world at large have worked to confront the challenge of climate change.
NASA Technical Reports Server (NTRS)
McGalliard, James
2008-01-01
This viewgraph presentation details the science and systems environments that the NASA High End Computing program serves. It includes a discussion of the workload involved in processing for global climate modeling. The Goddard Earth Observing System Model, Version 5 (GEOS-5) is a system of models integrated using the Earth System Modeling Framework (ESMF). The GEOS-5 system was used for the benchmark tests, and the results of the tests are shown and discussed. Tests were also run for the cubed-sphere system; results for these tests are also shown.
NASA Astrophysics Data System (ADS)
Kim, S. K.; Lee, J.; Zhang, C.; Ames, S.; Williams, D. N.
2017-12-01
Deep learning techniques have been successfully applied to many problems in climate science and geoscience using massive observed and modeled datasets. For extreme climate event detection, several models based on deep neural networks have recently been proposed and attain performance that overshadows all previous hand-crafted, expert-based methods. The issue, though, is that accurate localization of events requires high-quality climate data. In this work, we propose a framework capable of detecting and localizing extreme climate events in very coarse climate data. Our framework is based on two deep neural network models: (1) convolutional neural networks (CNNs) to detect and localize extreme climate events, and (2) a pixel-recursive super-resolution model to reconstruct high-resolution climate data from low-resolution climate data. Based on our preliminary work, we present two CNNs in our framework for different purposes: detection and localization. Our results using CNNs for extreme climate event detection show that simple neural nets can capture the pattern of extreme climate events with high accuracy from very coarse reanalysis data. However, localization accuracy is relatively low due to the coarse resolution. To resolve this issue, the pixel-recursive super-resolution model enhances the resolution of the input to the localization CNNs. We present the best network using the pixel-recursive super-resolution model, which synthesizes details of tropical cyclones in ground-truth data while enhancing their resolution. This approach not only dramatically reduces human effort, but also suggests the possibility of reducing the computing cost of the downscaling process used to increase data resolution.
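As a toy illustration of the localization step only, sliding a single hand-set filter over a coarse 2-D field and taking the response maximum as the event location might look like this (a real detector is a trained multi-layer CNN; the field and filter here are synthetic):

```python
# Toy "detection by filter response": a 2-D cross-correlation followed by
# an argmax over the response map. Field and kernel values are synthetic.

def correlate2d(field, kernel):
    """Valid-mode 2-D cross-correlation of a small filter over a field."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(field) - kh + 1):
        row = []
        for j in range(len(field[0]) - kw + 1):
            s = sum(field[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        out.append(row)
    return out

def argmax2d(resp):
    """Location (row, col) of the strongest filter response."""
    best = max((v, i, j) for i, row in enumerate(resp)
               for j, v in enumerate(row))
    return best[1], best[2]

field = [[0, 0, 0, 0],
         [0, 5, 6, 0],
         [0, 7, 8, 0],
         [0, 0, 0, 0]]
kernel = [[1, 1], [1, 1]]                    # crude "blob" detector
i, j = argmax2d(correlate2d(field, kernel))  # -> (1, 1), the blob's corner
```

In the actual framework the filter weights are learned, many filters are stacked in layers, and the input is first upsampled by the super-resolution model so the argmax lands on a finer grid.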
WRF Test on IBM BG/L: Toward High Performance Application to Regional Climate Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, H S
The effects of climate change will mostly be felt on local to regional scales (Solomon et al., 2007). To develop better forecast skill in regional climate change, an integrated multi-scale modeling capability (i.e., a pair of global and regional climate models) becomes crucially important in understanding and preparing for the impacts of climate change on the temporal and spatial scales that are critical to California's and the nation's future environmental quality and economic prosperity. Accurate knowledge of the detailed local impact of climate change on the water management system requires a resolution of roughly 1 km. To this end, a high-performance computing platform at the petascale appears to be an essential tool in providing such local-scale information to formulate high-quality adaptation strategies for local and regional climate change. As a key component of this modeling system at LLNL, the Weather Research and Forecast (WRF) model is implemented and tested on the IBM BG/L machine. The objective of this study is to examine the scaling behavior of WRF on BG/L for optimal performance, and to assess the numerical accuracy of the WRF solution on BG/L.
Statistical Surrogate Models for Estimating Probability of High-Consequence Climate Change
NASA Astrophysics Data System (ADS)
Field, R.; Constantine, P.; Boslough, M.
2011-12-01
We have posed the climate change problem in a framework similar to that used in safety engineering, by acknowledging that probabilistic risk assessments focused on low-probability, high-consequence climate events are perhaps more appropriate than studies focused simply on best estimates. Properly exploring the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We have developed specialized statistical surrogate models (SSMs) that can be used to make predictions about the tails of the associated probability distributions. An SSM differs from a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field, that is, a random variable at every fixed location in the atmosphere at all times. The SSM can be calibrated to available spatial and temporal data from existing climate databases, or to a collection of outputs from general circulation models. Because of its reduced size and complexity, realizing a large number of independent model outputs from an SSM is computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework was also developed to provide quantitative measures of confidence, via Bayesian credible intervals, for assessing these risks. To illustrate the use of the SSM, we considered two collections of NCAR CCSM 3.0 output data. The first corresponds to average December surface temperature for the years 1990-1999, based on a collection of 8 different model runs obtained from the Program for Climate Model Diagnosis and Intercomparison (PCMDI). We calibrated the surrogate model to the available model data and made various point predictions. We also analyzed average precipitation rate in June, July, and August over a 54-year period assuming a cyclic Y2K ocean model.
We applied the calibrated surrogate model to study the probability that the precipitation rate falls below certain thresholds and utilized the Bayesian approach to quantify our confidence in these predictions. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000.
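The tail-probability estimate that a cheap surrogate makes feasible can be sketched with a stand-in sampler; a plain Gaussian replaces the space/time random field, and a normal approximation stands in for the full Bayesian credible interval:

```python
# Estimating a low-probability event from many cheap surrogate draws.
# A standard Gaussian stands in for the calibrated surrogate, and a normal
# approximation stands in for the Bayesian credible interval.
import math
import random

random.seed(1)
n = 100_000
threshold = -2.0  # a low-probability, high-consequence event in std units

hits = sum(1 for _ in range(n) if random.gauss(0.0, 1.0) < threshold)

p_hat = hits / n                               # Monte Carlo tail probability
se = math.sqrt(p_hat * (1 - p_hat) / n)        # sampling standard error
lo, hi = p_hat - 1.96 * se, p_hat + 1.96 * se  # ~95% interval on p_hat
```

With a full climate model, drawing 100,000 realizations would be out of reach; the whole point of the SSM is that such sample sizes become routine.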
NASA Astrophysics Data System (ADS)
Kwiatkowski, L.; Yool, A.; Allen, J. I.; Anderson, T. R.; Barciela, R.; Buitenhuis, E. T.; Butenschön, M.; Enright, C.; Halloran, P. R.; Le Quéré, C.; de Mora, L.; Racault, M.-F.; Sinha, B.; Totterdell, I. J.; Cox, P. M.
2014-07-01
Ocean biogeochemistry (OBGC) models span a wide range of complexities from highly simplified, nutrient-restoring schemes, through nutrient-phytoplankton-zooplankton-detritus (NPZD) models that crudely represent the marine biota, through to models that represent a broader trophic structure by grouping organisms as plankton functional types (PFT) based on their biogeochemical role (Dynamic Green Ocean Models; DGOM) and ecosystem models which group organisms by ecological function and trait. OBGC models are now integral components of Earth System Models (ESMs), but they compete for computing resources with higher resolution dynamical setups and with other components such as atmospheric chemistry and terrestrial vegetation schemes. As such, the choice of OBGC in ESMs needs to balance model complexity and realism alongside relative computing cost. Here, we present an inter-comparison of six OBGC models that were candidates for implementation within the next UK Earth System Model (UKESM1). The models cover a large range of biological complexity (from 7 to 57 tracers) but all include representations of at least the nitrogen, carbon, alkalinity and oxygen cycles. Each OBGC model was coupled to the Nucleus for the European Modelling of the Ocean (NEMO) ocean general circulation model (GCM), and results from physically identical hindcast simulations were compared. Model skill was evaluated for biogeochemical metrics of global-scale bulk properties using conventional statistical techniques. The computing cost of each model was also measured in standardised tests run at two resource levels. No model is shown to consistently outperform or underperform all other models across all metrics. Nonetheless, the simpler models that are easier to tune are broadly closer to observations across a number of fields, and thus offer a high-efficiency option for ESMs that prioritise high resolution climate dynamics. 
However, simpler models provide limited insight into more complex marine biogeochemical processes and ecosystem pathways, and a parallel approach of low resolution climate dynamics and high complexity biogeochemistry is desirable in order to provide additional insights into biogeochemistry-climate interactions.
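The "conventional statistical techniques" used above for bulk-property skill can be illustrated with a pattern correlation and centred RMSE between a model field and observations (the fields here are tiny synthetic lists, not UKESM1 candidate output):

```python
# Two standard skill metrics for comparing a model field against
# observations, computed on flattened 1-D lists of synthetic values.
import math

def skill(model, obs):
    """Return (pattern correlation, centred RMSE) of model vs. obs."""
    n = len(model)
    mm, mo = sum(model) / n, sum(obs) / n
    dm = [x - mm for x in model]       # anomalies from the model mean
    do = [x - mo for x in obs]         # anomalies from the observed mean
    sm = math.sqrt(sum(d * d for d in dm) / n)
    so = math.sqrt(sum(d * d for d in do) / n)
    corr = sum(a * b for a, b in zip(dm, do)) / (n * sm * so)
    crmse = math.sqrt(sum((a - b) ** 2 for a, b in zip(dm, do)) / n)
    return corr, crmse

corr, crmse = skill([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8])
```

Applied per field (nutrients, chlorophyll, oxygen, etc.) and plotted together, these two numbers are the ingredients of the Taylor-diagram style comparisons commonly used in such intercomparisons.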
NASA Astrophysics Data System (ADS)
Kwiatkowski, L.; Yool, A.; Allen, J. I.; Anderson, T. R.; Barciela, R.; Buitenhuis, E. T.; Butenschön, M.; Enright, C.; Halloran, P. R.; Le Quéré, C.; de Mora, L.; Racault, M.-F.; Sinha, B.; Totterdell, I. J.; Cox, P. M.
2014-12-01
Ocean biogeochemistry (OBGC) models span a wide variety of complexities, including highly simplified nutrient-restoring schemes, nutrient-phytoplankton-zooplankton-detritus (NPZD) models that crudely represent the marine biota, models that represent a broader trophic structure by grouping organisms as plankton functional types (PFTs) based on their biogeochemical role (dynamic green ocean models) and ecosystem models that group organisms by ecological function and trait. OBGC models are now integral components of Earth system models (ESMs), but they compete for computing resources with higher resolution dynamical setups and with other components such as atmospheric chemistry and terrestrial vegetation schemes. As such, the choice of OBGC in ESMs needs to balance model complexity and realism alongside relative computing cost. Here we present an intercomparison of six OBGC models that were candidates for implementation within the next UK Earth system model (UKESM1). The models cover a large range of biological complexity (from 7 to 57 tracers) but all include representations of at least the nitrogen, carbon, alkalinity and oxygen cycles. Each OBGC model was coupled to the ocean general circulation model Nucleus for European Modelling of the Ocean (NEMO) and results from physically identical hindcast simulations were compared. Model skill was evaluated for biogeochemical metrics of global-scale bulk properties using conventional statistical techniques. The computing cost of each model was also measured in standardised tests run at two resource levels. No model is shown to consistently outperform all other models across all metrics. Nonetheless, the simpler models are broadly closer to observations across a number of fields and thus offer a high-efficiency option for ESMs that prioritise high-resolution climate dynamics. 
However, simpler models provide limited insight into more complex marine biogeochemical processes and ecosystem pathways, and a parallel approach of low-resolution climate dynamics and high-complexity biogeochemistry is desirable in order to provide additional insights into biogeochemistry-climate interactions.
Bringing a Realistic Global Climate Modeling Experience to a Broader Audience
NASA Astrophysics Data System (ADS)
Sohl, L. E.; Chandler, M. A.; Zhou, J.
2010-12-01
EdGCM, the Educational Global Climate Model, was developed with the goal of helping students learn about climate change and climate modeling by giving them the ability to run a genuine NASA global climate model (GCM) on a desktop computer. Since EdGCM was first publicly released in January 2005, tens of thousands of users on seven continents have downloaded the software. EdGCM has been utilized by climate science educators from middle school through graduate school levels, and on occasion even by researchers who otherwise do not have ready access to climate models at national labs in the U.S. and elsewhere. The EdGCM software is designed to walk users through the same process a climate scientist would use in designing and running simulations, and in analyzing and visualizing GCM output. Although the current interface design gives users a clear view of some of the complexities involved in using a climate model, it can be daunting for users whose main focus is on climate science rather than modeling per se. As part of the work funded by NASA’s Global Climate Change Education (GCCE) program, we will begin modifications to the user interface that will improve the accessibility of EdGCM to a wider array of users, especially at the middle school and high school levels, by: 1) developing an automated approach (a “wizard”) to simplify the user experience in setting up new climate simulations; 2) producing a catalog of “rediscovery experiments” that allow users to reproduce published climate model results, and in some cases compare model projections to real-world data; and 3) enhancing distance learning and online learning opportunities through the development of a web-based interface. The prototypes for these modifications will then be presented to educators belonging to an EdGCM Users Group for feedback, so that we can further refine the EdGCM software and thus deliver the tools and materials educators want and need across a wider range of learning environments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arkin, Adam; Bader, David C.; Coffey, Richard
Understanding the fundamentals of genomic systems or the processes governing impactful weather patterns are examples of the types of simulation and modeling performed on the most advanced computing resources in America. High-performance computing and computational science together provide a necessary platform for the mission science conducted by the Biological and Environmental Research (BER) office at the U.S. Department of Energy (DOE). This report reviews BER’s computing needs and their importance for solving some of the toughest problems in BER’s portfolio. BER’s impact on science has been transformative. Mapping the human genome, including the U.S.-supported international Human Genome Project that DOE began in 1987, initiated the era of modern biotechnology and genomics-based systems biology. And since the 1950s, BER has been a core contributor to atmospheric, environmental, and climate science research, beginning with atmospheric circulation studies that were the forerunners of modern Earth system models (ESMs) and by pioneering the implementation of climate codes on high-performance computers. See http://exascaleage.org/ber/ for more information.
Software architecture and design of the web services facilitating climate model diagnostic analysis
NASA Astrophysics Data System (ADS)
Pan, L.; Lee, S.; Zhang, J.; Tang, B.; Zhai, C.; Jiang, J. H.; Wang, W.; Bao, Q.; Qi, M.; Kubar, T. L.; Teixeira, J.
2015-12-01
Climate model diagnostic analysis is a computationally- and data-intensive task because it involves multiple numerical model outputs and satellite observation data that can both be high resolution. We have built an online tool that facilitates this process, called the Climate Model Diagnostic Analyzer (CMDA). It employs web service technology and provides a web-based user interface. The benefits of these choices include: (1) No installation of any software other than a browser, making the tool platform-independent; (2) Co-location of computation and big data on the server side, with only small results and plots downloaded on the client side, hence high data efficiency; (3) Multi-threaded implementation to achieve parallel performance on multi-core servers; and (4) Cloud deployment, so each user has a dedicated virtual machine. In this presentation, we focus on the computer science aspects of this tool, namely the architectural design, the infrastructure of the web services, the implementation of the web-based user interface, the mechanism of provenance collection, the approach to virtualization, and the Amazon Cloud deployment. As an example, we describe our methodology for transforming an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). Another example is the use of Docker, a lightweight virtualization container, to distribute and deploy CMDA onto an Amazon EC2 instance. CMDA was successfully used in the 2014 Summer School hosted by the JPL Center for Climate Science. Students generally gave positive feedback, and we will report their comments. An enhanced version of CMDA with several new features, some requested by the 2014 students, will be used in the upcoming 2015 Summer School.
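A minimal sketch of the wrapping idea, using only the standard library's WSGI convention rather than CMDA's actual Flask/Gunicorn/Tornado stack; the anomaly routine is a hypothetical stand-in for a real diagnostic:

```python
# Wrapping a science routine as a web service: JSON request in, JSON out.
# Plain WSGI (the interface Flask and Gunicorn themselves speak); the
# compute_anomaly routine is a placeholder for a real diagnostic code.
import json

def compute_anomaly(values, baseline):
    """Placeholder diagnostic: deviation of each value from a baseline."""
    return [v - baseline for v in values]

def application(environ, start_response):
    """WSGI entry point: parse the JSON body, run the science code,
    and return the (small) result to the client."""
    size = int(environ.get("CONTENT_LENGTH") or 0)
    request = json.loads(environ["wsgi.input"].read(size))
    result = compute_anomaly(request["values"], request["baseline"])
    body = json.dumps({"anomaly": result}).encode()
    start_response("200 OK", [("Content-Type", "application/json")])
    return [body]
```

Because `application` is just a callable, it can be exercised directly with a synthetic `environ` dict before being handed to a real server, which mirrors the pattern of keeping the heavy computation server-side and shipping only small JSON results to the browser.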
Modeling climatic effects of anthropogenic CO2 emissions: Unknowns and uncertainties
NASA Astrophysics Data System (ADS)
Soon, W.; Baliunas, S.; Idso, S.; Kondratyev, K. Ya.; Posmentier, E. S.
2001-12-01
A likelihood of disastrous global environmental consequences has been surmised as a result of projected increases in anthropogenic greenhouse gas emissions. These estimates are based on computer climate modeling, a branch of science still in its infancy despite recent, substantial strides in knowledge. Because the expected anthropogenic climate forcings are relatively small compared to other background and forcing factors (internal and external), the credibility of the modeled global and regional responses rests on the validity of the models. We focus on this important question of climate model validation. Specifically, we review common deficiencies in general circulation model calculations of atmospheric temperature, surface temperature, precipitation and their spatial and temporal variability. These deficiencies arise from complex problems associated with parameterization of multiply-interacting climate components, forcings and feedbacks, involving especially clouds and oceans. We also review examples of expected climatic impacts from anthropogenic CO2 forcing. Given the host of uncertainties and unknowns in the difficult but important task of climate modeling, the unique attribution of observed current climate change to increased atmospheric CO2 concentration, including the relatively well-observed latest 20 years, is not possible. We further conclude that the incautious use of GCMs to make future climate projections from incomplete or unknown forcing scenarios is antithetical to the intrinsically heuristic value of models. Such uncritical application of climate models has led to the commonly-held but erroneous impression that modeling has proven or substantiated the hypothesis that CO2 added to the air has caused or will cause significant global warming. 
An assessment of the positive skills of GCMs and their use in suggesting a discernible human influence on global climate can be found in the joint World Meteorological Organisation and United Nations Environmental Programme's Intergovernmental Panel on Climate Change, IPCC, reports (1990, 1995 and 2001). Our review highlights only the enormous scientific difficulties facing the calculation of climatic effects of added atmospheric CO2 in a GCM. The purpose of such a limited review of the deficiencies of climate model physics and the use of GCMs is to illuminate areas for improvement. Our review does not disprove a significant anthropogenic influence on global climate.
Uncertainty Quantification in Climate Modeling and Projection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qian, Yun; Jackson, Charles; Giorgi, Filippo
The projection of future climate is one of the most complex problems undertaken by the scientific community. Although scientists have been striving to better understand the physical basis of the climate system and to improve climate models, the overall uncertainty in projections of future climate has not been significantly reduced (e.g., from the IPCC AR4 to AR5). With the rapid increase of complexity in Earth system models, reducing uncertainties in climate projections becomes extremely challenging. Since uncertainties always exist in climate models, interpreting the strengths and limitations of future climate projections is key to evaluating risks, and climate change information for use in Vulnerability, Impact, and Adaptation (VIA) studies should be provided with both well-characterized and well-quantified uncertainty. The workshop aimed at providing participants, many of them from developing countries, information on strategies to quantify the uncertainty in climate model projections and assess the reliability of climate change information for decision-making. The program included a mixture of lectures on fundamental concepts in Bayesian inference and sampling, applications, and hands-on computer laboratory exercises employing software packages for Bayesian inference, Markov Chain Monte Carlo methods, and global sensitivity analyses. The lectures covered a range of scientific issues underlying the evaluation of uncertainties in climate projections, such as the effects of uncertain initial and boundary conditions, uncertain physics, and limitations of observational records. Progress in quantitatively estimating uncertainties in hydrologic, land surface, and atmospheric models at both regional and global scales was also reviewed. The application of Uncertainty Quantification (UQ) concepts to coupled climate system models is still in its infancy.
The Coupled Model Intercomparison Project (CMIP) multi-model ensemble currently represents the primary data for assessing the reliability and uncertainties of climate change information. An alternative approach is to generate similar ensembles by perturbing parameters within a single-model framework. One of the workshop's objectives was to give participants a deeper understanding of these approaches within a Bayesian statistical framework. However, significant challenges remain to be resolved before UQ can be applied in a convincing way to climate models and their projections.
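A minimal Metropolis sampler of the kind used in such hands-on exercises might look like this; the one-parameter model, pseudo-observations, and error level are all synthetic:

```python
# Minimal Metropolis MCMC: sample the posterior of one uncertain model
# parameter given noisy pseudo-observations. All numbers are synthetic.
import math
import random

random.seed(0)

obs = [1.1, 0.9, 1.2, 1.0, 0.8]   # pseudo-observations of the parameter
sigma = 0.2                        # assumed known observation error

def log_post(theta):
    """Flat prior times Gaussian likelihood, up to a constant."""
    return -sum((y - theta) ** 2 for y in obs) / (2 * sigma ** 2)

theta, chain = 0.0, []
for _ in range(20_000):
    proposal = theta + random.gauss(0.0, 0.1)      # random-walk step
    if math.log(random.random()) < log_post(proposal) - log_post(theta):
        theta = proposal                            # accept
    chain.append(theta)                             # (reject keeps theta)

burned = chain[5_000:]                              # discard burn-in
posterior_mean = sum(burned) / len(burned)          # should sit near 1.0
```

The same accept/reject loop, with the likelihood swapped for an expensive model-vs-observation misfit, is how perturbed-parameter ensembles are explored in a Bayesian framework.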
NASA Astrophysics Data System (ADS)
Gordova, Yulia; Martynova, Yulia; Shulgina, Tamara
2015-04-01
The current situation with the training of specialists in the environmental sciences is complicated by the fact that the field itself is experiencing a period of rapid development. Global change has driven the development of measurement techniques and the modeling of environmental characteristics, accompanied by the expansion of the conceptual and mathematical apparatus. Understanding and forecasting processes in the Earth system requires extensive use of mathematical modeling and advanced computing technologies. As a rule, available training programs in the environmental science disciplines do not have time to adapt to such rapid changes in the domain content. As a result, graduates do not understand the processes and mechanisms of global change and have only superficial knowledge of mathematical modeling of processes in the environment. They do not have the required skills in numerical modeling, data processing, and analysis of observations and computational outputs, and are not prepared to work with meteorological data. For adequate training of future specialists in the environmental sciences, we propose the following approach, which reflects the new "research" paradigm in education. We believe that the training of such specialists should be done not in an artificial learning environment, but on the basis of actual operating information-computational systems used in environmental studies, in a so-called virtual research environment, via the development of virtual research and learning laboratories. In this report, the results of using the computational-informational web-GIS system "Climate" (http://climate.scert.ru/) as a prototype of such a laboratory are discussed. The approach is being implemented at Tomsk State University to prepare bachelors in meteorology. A student survey shows that their knowledge has become deeper and more systematic after training in the virtual learning laboratory.
The scientific team plans to assist any educators to utilize the system in earth science education. This work is partially supported by SB RAS project VIII.80.2.1, RFBR grants 13-05-12034 and 14-05-00502.
A large ozone-circulation feedback and its implications for global warming assessments.
Nowack, Peer J; Abraham, N Luke; Maycock, Amanda C; Braesicke, Peter; Gregory, Jonathan M; Joshi, Manoj M; Osprey, Annette; Pyle, John A
2015-01-01
State-of-the-art climate models now include more climate processes, simulated at higher spatial resolution than ever [1]. Nevertheless, some processes, such as atmospheric chemical feedbacks, are still computationally expensive and are often ignored in climate simulations [1,2]. Here we present evidence that how stratospheric ozone is represented in climate models can have a first-order impact on estimates of effective climate sensitivity. Using a comprehensive atmosphere-ocean chemistry-climate model, we find an increase in global mean surface warming of around 1°C (~20%) after 75 years when ozone is prescribed at pre-industrial levels compared with when it is allowed to evolve self-consistently in response to an abrupt 4×CO2 forcing. The difference is primarily attributed to changes in longwave radiative feedbacks associated with circulation-driven decreases in tropical lower stratospheric ozone and related stratospheric water vapour and cirrus cloud changes. This has important implications for global model intercomparison studies [1,2] in which participating models often use simplified treatments of atmospheric composition changes that are neither consistent with the specified greenhouse gas forcing scenario nor with the associated atmospheric circulation feedbacks [3-5].
Recommendations for diagnosing effective radiative forcing from climate models for CMIP6
NASA Astrophysics Data System (ADS)
Smith, C. J.; Forster, P.; Richardson, T.; Myhre, G.; Pincus, R.
2016-12-01
The usefulness of previous Coupled Model Intercomparison Project (CMIP) exercises has been hampered by a lack of radiative forcing information. This has made it difficult to understand the reasons for differences between model responses. Effective radiative forcing (ERF) is easier to diagnose than traditional radiative forcing in global climate models (GCMs) and is more representative of the ultimate climate response. Here we examine the different methods of computing ERF in two GCMs. We find that ERF computed from a fixed sea-surface temperature (SST) method (ERF_fSST) has much less uncertainty than regression-based methods. Thirty-year integrations are sufficient to reduce the standard error in global ERF to 0.05 W m-2. For 2×CO2 ERF, 30-year integrations are needed to ensure that the signal is larger than the standard error over more than 90% of the globe. Within the ERF_fSST method there are various options for prescribing SSTs and sea ice. We explore these and find that ERF is only weakly dependent on the methodological choices. Prescribing the model's monthly averaged, seasonally varying preindustrial climatology is recommended for its smaller random error and easier implementation. As part of CMIP6, the Radiative Forcing Model Intercomparison Project (RFMIP) asks models to conduct 30-year ERF_fSST experiments using the model's own preindustrial climatology of SST and sea ice. The Aerosol and Chemistry Model Intercomparison Project (AerChemMIP) will also mainly use this approach. We propose this as a standard method for diagnosing ERF in models and recommend that it be used across the climate modeling community to aid future comparisons.
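The quoted precision follows from the standard error of a time mean, SE = sigma / sqrt(N) for N roughly independent years; the interannual standard deviation used below is an illustrative assumption, not a value from either GCM:

```python
# Standard error of an N-year mean forcing, and the integration length
# needed to hit a target precision. The interannual standard deviation
# here is a hypothetical stand-in for a model-diagnosed value.
import math

def standard_error(sigma, n_years):
    """SE of the mean over n_years independent years."""
    return sigma / math.sqrt(n_years)

def years_needed(sigma, target_se):
    """Smallest N with sigma / sqrt(N) <= target_se."""
    return math.ceil((sigma / target_se) ** 2)

sigma = 0.27  # hypothetical interannual std dev of global ERF, W m-2
print(standard_error(sigma, 30))  # ~0.049 W m-2
print(years_needed(sigma, 0.05))  # ~30 years
```

The same arithmetic, applied gridpoint by gridpoint with the local interannual variability, is what drives the "90% of the globe" requirement for 2×CO2.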
What might we learn from climate forecasts?
Smith, Leonard A.
2002-01-01
Most climate models are large dynamical systems involving a million (or more) variables on big computers. Given that they are nonlinear and not perfect, what can we expect to learn from them about the earth's climate? How can we determine which aspects of their output might be useful and which are noise? And how should we distribute resources between making them “better,” estimating variables of true social and economic interest, and quantifying how good they are at the moment? Just as “chaos” prevents accurate weather forecasts, so model error precludes accurate forecasts of the distributions that define climate, yielding uncertainty of the second kind. Can we estimate the uncertainty in our uncertainty estimates? These questions are discussed. Ultimately, all uncertainty is quantified within a given modeling paradigm; our forecasts need never reflect the uncertainty in a physical system. PMID:11875200
An eco-hydrologic model of malaria outbreaks
NASA Astrophysics Data System (ADS)
Montosi, E.; Manzoni, S.; Porporato, A.; Montanari, A.
2012-03-01
Malaria is a geographically widespread infectious disease that is well known to be affected by climate variability at both seasonal and interannual timescales. In an effort to identify climatic factors that impact malaria dynamics, there has been considerable research focused on the development of appropriate disease models for malaria transmission and their consideration alongside climatic datasets. These analyses have focused largely on variation in temperature and rainfall as direct climatic drivers of malaria dynamics. Here, we further these efforts by additionally considering the role that soil water content may play in driving malaria incidence. Specifically, we hypothesize that hydro-climatic variability should be an important factor in controlling the availability of mosquito habitats, thereby governing mosquito growth rates. To test this hypothesis, we reduce a nonlinear eco-hydrologic model to a simple linear model through a series of consecutive assumptions and apply this model to malaria incidence data from three South African provinces. Despite the assumptions made in the reduction of the model, we show that soil water content can account for a significant portion of the variability in malaria cases beyond its seasonal patterns, whereas neither temperature nor rainfall alone can do so. Future work should therefore consider soil water content as a simple and computable variable for incorporation into climate-driven disease models of malaria and other vector-borne infectious diseases.
An ecohydrological model of malaria outbreaks
NASA Astrophysics Data System (ADS)
Montosi, E.; Manzoni, S.; Porporato, A.; Montanari, A.
2012-08-01
Malaria is a geographically widespread infectious disease that is well known to be affected by climate variability at both seasonal and interannual timescales. In an effort to identify climatic factors that impact malaria dynamics, there has been considerable research focused on the development of appropriate disease models for malaria transmission driven by climatic time series. These analyses have focused largely on variation in temperature and rainfall as direct climatic drivers of malaria dynamics. Here, we further these efforts by additionally considering the role that soil water content may play in driving malaria incidence. Specifically, we hypothesize that hydro-climatic variability should be an important factor in controlling the availability of mosquito habitats, thereby governing mosquito growth rates. To test this hypothesis, we reduce a nonlinear ecohydrological model to a simple linear model through a series of consecutive assumptions and apply this model to malaria incidence data from three South African provinces. Despite the assumptions made in the reduction of the model, we show that soil water content can account for a significant portion of the variability in malaria cases beyond its seasonal patterns, whereas neither temperature nor rainfall alone can do so. Future work should therefore consider soil water content as a simple and computable variable for incorporation into climate-driven disease models of malaria and other vector-borne infectious diseases.
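The model reduction described above ends in an ordinary linear regression of incidence anomalies on lagged soil-water-content anomalies. A self-contained sketch on synthetic data (the one-month lag, coefficients, and noise level are invented for illustration, not the study's values):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly anomalies (illustrative only): incidence partly
# tracks soil water content s with a one-month lag, as hypothesized.
n = 120
s = rng.normal(size=n)
incidence = 0.6 * np.roll(s, 1) + 0.3 * rng.normal(size=n)

# Reduced linear model: incidence(t) ~ a * s(t-1) + b
X = np.column_stack([s[:-1], np.ones(n - 1)])
y = incidence[1:]
(a, b), *_ = np.linalg.lstsq(X, y, rcond=None)
print(0.4 < a < 0.8)  # recovers a positive soil-moisture effect
```

With real data the same regression would be fitted to deseasonalized incidence and modelled soil moisture rather than synthetic series.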
Towards European-scale convection-resolving climate simulations with GPUs: a study with COSMO 4.19
NASA Astrophysics Data System (ADS)
Leutwyler, David; Fuhrer, Oliver; Lapillonne, Xavier; Lüthi, Daniel; Schär, Christoph
2016-09-01
The representation of moist convection in climate models remains a major challenge due to the small scales involved. With horizontal grid spacings of O(1 km), convection-resolving weather and climate models can explicitly resolve deep convection. However, due to their extremely demanding computational requirements, they have so far been limited to short simulations and/or small computational domains. Innovations in supercomputing have led to new hybrid node designs, mixing conventional multi-core hardware and accelerators such as graphics processing units (GPUs). One of the first atmospheric models fully ported to these architectures is the COSMO (Consortium for Small-scale Modeling) model. Here we present the convection-resolving COSMO model on continental scales, using a version of the model capable of using GPU accelerators. The verification of a week-long simulation containing winter storm Kyrill shows that, for this case, convection-parameterizing and convection-resolving simulations agree well. Furthermore, we demonstrate the applicability of the approach to longer simulations by conducting a 3-month simulation of the summer season of 2006. Its results corroborate findings from smaller domains, such as a more credible representation of the diurnal cycle of precipitation in convection-resolving models and a tendency to produce more intense hourly precipitation events. Both simulations also show how the approach allows for the representation of interactions between synoptic-scale and mesoscale atmospheric circulations at scales ranging from 1000 to 10 km. This includes the formation of sharp cold-frontal structures, convection embedded in fronts and small eddies, and the formation and organization of propagating cold pools. Finally, we assess the performance gain from using heterogeneous hardware equipped with GPUs relative to multi-core hardware.
With the COSMO model, we now use a weather and climate model that has all the necessary modules required for real-case convection-resolving regional climate simulations on GPUs.
Climate Model Diagnostic Analyzer Web Service System
NASA Astrophysics Data System (ADS)
Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.
2015-12-01
Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for comprehensive and innovative evaluation of climate models, with the synergistic use of global satellite observations, in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations of fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluation. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making it both computationally and data intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and implemented it as a web-service-based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named the Climate Model Diagnostic Analyzer (CMDA), and it is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service-based, cloud-enabled technology; and (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution functions, conditional sampling, and time-lagged correlation maps.
We have implemented the new methodology as web services and incorporated the system into the Cloud. We have also developed a provenance management system for CMDA where CMDA service semantics modeling, service search and recommendation, and service execution history management are designed and implemented.
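Among the diagnostic methods listed, the time-lagged correlation map is the easiest to sketch in isolation. A minimal stand-alone version for two time series (CMDA's gridded, map-producing machinery is omitted, and the toy data are invented):

```python
import numpy as np

def lagged_correlation(x, y, max_lag):
    """Pearson correlation of y against x for each lag in
    [-max_lag, max_lag]; a positive lag means x leads y."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[:len(x) - lag], y[lag:]
        else:
            a, b = x[-lag:], y[:lag]
        out[lag] = float(np.mean(a * b))
    return out

# Toy check: y is x delayed by two steps, so the peak sits at lag +2.
x = np.sin(np.linspace(0, 20, 300))
y = np.roll(x, 2)
corrs = lagged_correlation(x, y, 5)
print(max(corrs, key=corrs.get))
```

Applied per grid cell between an observed and a simulated variable, the lag of the correlation peak is what such a diagnostic maps.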
PRMS-IV, the precipitation-runoff modeling system, version 4
Markstrom, Steven L.; Regan, R. Steve; Hay, Lauren E.; Viger, Roland J.; Webb, Richard M.; Payn, Robert A.; LaFontaine, Jacob H.
2015-01-01
Computer models that simulate the hydrologic cycle at a watershed scale facilitate assessment of the effects of variability in climate, biota, geology, and human activities on water availability and flow. This report describes an updated version of the Precipitation-Runoff Modeling System. The Precipitation-Runoff Modeling System is a deterministic, distributed-parameter, physical-process-based modeling system developed to evaluate the effects of various combinations of climate and land use on streamflow and general watershed hydrology. Several new model components were developed, and all existing components were updated, to enhance performance and supportability. This report describes the history, application, concepts, organization, and mathematical formulation of the Precipitation-Runoff Modeling System and its model components. This updated version provides improvements in (1) system flexibility for integrated science, (2) verification of conservation of water during simulation, (3) methods for spatial distribution of climate boundary conditions, and (4) methods for simulation of soil-water flow and storage.
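Improvement (2), verification of water conservation, reduces to checking that the budget closes at each time step. A trivial sketch of such a closure check (the values are hypothetical daily totals in mm, not PRMS output):

```python
def water_balance_residual(precip, et, streamflow, d_storage):
    """Residual of P - (ET + Q + dS); ~0 when water is conserved."""
    return precip - (et + streamflow + d_storage)

# Hypothetical daily totals (mm): the budget closes exactly.
r = water_balance_residual(precip=12.0, et=3.5, streamflow=6.0, d_storage=2.5)
print(abs(r) < 1e-9)
```

A nonzero residual flags a component that creates or destroys water during the simulation.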
Machine Learning Predictions of a Multiresolution Climate Model Ensemble
NASA Astrophysics Data System (ADS)
Anderson, Gemma J.; Lucas, Donald D.
2018-05-01
Statistical models of high-resolution climate models are useful for many purposes, including sensitivity and uncertainty analyses, but building them can be computationally prohibitive. We generated a unique multiresolution perturbed parameter ensemble of a global climate model. We use a novel application of a machine learning technique known as random forests to train a statistical model on the ensemble to make high-resolution model predictions of two important quantities: global mean top-of-atmosphere energy flux and precipitation. The random forests leverage cheaper low-resolution simulations, greatly reducing the number of high-resolution simulations required to train the statistical model. We demonstrate that high-resolution predictions of these quantities can be obtained by training on an ensemble that includes only a small number of high-resolution simulations. We also find that global annually averaged precipitation is more sensitive to resolution changes than to any of the model parameters considered.
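One way to realize this leveraging is to treat resolution as just another input feature, so the forest learns the resolution dependence from many cheap low-resolution runs plus a few high-resolution ones. The sketch below is an invented toy (the parameter count, sample sizes, and response surface are all assumptions, not the paper's setup):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Many cheap low-resolution runs, few expensive high-resolution runs.
n_low, n_high = 300, 20
params = rng.uniform(size=(n_low + n_high, 2))   # perturbed parameters
res = np.r_[np.zeros(n_low), np.ones(n_high)]    # 0 = low res, 1 = high res
X = np.column_stack([params, res])

# Toy response: the quantity shifts with resolution (invented surface).
y = 2.0 * params[:, 0] + 0.5 * params[:, 1] + 0.8 * res

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Predict the high-resolution response at an unseen parameter setting.
pred = model.predict([[0.5, 0.5, 1.0]])[0]
print(1.0 < pred < 3.0)
```

Because the parameter dependence is shared across resolutions, only the resolution offset must be learned from the scarce high-resolution samples.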
OBERON: OBliquity and Energy balance Run on N-body systems
NASA Astrophysics Data System (ADS)
Forgan, Duncan H.
2016-08-01
OBERON (OBliquity and Energy balance Run on N-body systems) models the climate of Earthlike planets under the effects of an arbitrary number and arrangement of other bodies, such as stars, planets and moons. The code, written in C++, simultaneously computes N body motions using a 4th order Hermite integrator, simulates climates using a 1D latitudinal energy balance model, and evolves the orbital spin of bodies using the equations of Laskar (1986a,b).
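The 1-D latitudinal energy balance component can be sketched at its simplest: absorbed sunlight balanced against linearized longwave emission at each latitude. The coefficients below are standard textbook values, not OBERON's, and the diffusive heat-transport and orbital-coupling terms are omitted:

```python
import numpy as np

# Latitude grid and annual-mean insolation from the second Legendre
# polynomial approximation S(x) = (S0/4) * (1 + s2 * P2(x)), x = sin(lat).
lat = np.linspace(-89, 89, 60)
x = np.sin(np.radians(lat))
S = 1365.0 / 4 * (1 - 0.477 * 0.5 * (3 * x**2 - 1))

A, B = 203.3, 2.09   # linearized OLR: A + B*T, with T in deg C
albedo = 0.3

# Without transport, each latitude equilibrates where absorption
# matches emission; diffusion in a full EBM would soften this contrast.
T_eq = (S * (1 - albedo) - A) / B
print(T_eq.max() > 0 > T_eq.min())   # warm tropics, sub-freezing poles
```

In the full model the insolation term is recomputed from the evolving orbit and obliquity supplied by the N-body integrator.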
NASA Astrophysics Data System (ADS)
Girvetz, E. H.; Zganjar, C.; Raber, G. T.; Hoekstra, J.; Lawler, J. J.; Kareiva, P.
2008-12-01
Now that there is overwhelming evidence of global climate change, scientists, managers and planners (i.e. practitioners) need to assess the potential impacts of climate change on particular ecological systems, within specific geographic areas, and at spatial scales they care about, in order to make better land management, planning, and policy decisions. Unfortunately, this application of climate science to real-world decisions and planning has proceeded too slowly because we lack tools for translating cutting-edge climate science and climate-model outputs into something managers and planners can work with at local or regional scales (CCSP 2008). To help increase the accessibility of climate information, we have developed a freely available, easy-to-use, web-based climate-change analysis toolbox, called ClimateWizard, for assessing how climate has changed and is projected to change at specific geographic locations throughout the world. The ClimateWizard uses geographic information systems (GIS), web services (SOAP/XML), statistical analysis platforms (e.g. the R project), and web-based mapping services (e.g. Google Earth/Maps, KML/GML) to provide a variety of different analyses (e.g. trends and departures) and outputs (e.g. maps, graphs, tables, GIS layers). Because ClimateWizard analyzes large climate datasets stored remotely on powerful computers, users of the tool do not need fast computers or expensive software, but simply access to the internet. The analysis results are then provided to users in a Google Maps webpage tailored to the specific climate-change question being asked. The ClimateWizard is not a static product, but rather a framework to be built upon and modified to suit the purposes of specific scientific, management, and policy questions. For example, it can be expanded to include bioclimatic variables (e.g. evapotranspiration) and marine data (e.g.
sea surface temperature), as well as improved future climate projections, and climate-change impact analyses involving hydrology, vegetation, wildfire, disease, and food security. By harnessing the power of computer and web-based technologies, the ClimateWizard puts local, regional, and global climate-change analyses in the hands of a wider array of managers, planners, and scientists.
NASA Astrophysics Data System (ADS)
Velázquez, J. A.; Schmid, J.; Ricard, S.; Muerth, M. J.; Gauvin St-Denis, B.; Minville, M.; Chaumont, D.; Caya, D.; Ludwig, R.; Turcotte, R.
2012-06-01
Over recent years, several research efforts have investigated the impact of climate change on water resources in different regions of the world. The projection of future river flows is affected by different sources of uncertainty in the hydro-climatic modelling chain. One of the aims of the QBic3 project (Québec-Bavarian International Collaboration on Climate Change) is to assess the contribution of hydrological models to this uncertainty by using an ensemble of hydrological models of diverse structural complexity (i.e. lumped, semi-distributed and distributed models). The study investigates two humid, mid-latitude catchments with natural flow conditions: one located in southern Québec (Canada) and one in southern Bavaria (Germany). Daily flow is simulated with four different hydrological models, forced by outputs from regional climate models driven by several GCM members over a reference (1971-2000) and a future (2041-2070) period. The results show that the choice of hydrological model strongly affects the climate change response of selected hydrological indicators, especially those related to low flows. Indicators related to high flows seem less sensitive to the choice of hydrological model. The computationally less demanding models (usually simple, lumped and conceptual) can therefore be used with a significant level of trust for high and overall mean flows.
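The simplest class in such an ensemble, a lumped conceptual model, can be sketched as a single linear reservoir (the recession constant k below is hypothetical):

```python
def linear_reservoir(precip, k=0.2, s0=0.0):
    """One-bucket daily model: storage fills with rain, drains as q = k*s."""
    s, flows = s0, []
    for p in precip:
        s += p          # rainfall fills the store
        q = k * s       # outflow proportional to storage
        s -= q
        flows.append(q)
    return flows

# A 10 mm rain day followed by dry days gives a classic recession curve.
q = linear_reservoir([10.0, 0.0, 0.0, 0.0, 0.0])
print([round(v, 2) for v in q])
```

Semi-distributed and distributed members of the ensemble replace this single bucket with many interconnected stores over the catchment.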
Building Systems from Scratch: An Exploratory Study of Students Learning about Climate Change
ERIC Educational Resources Information Center
Puttick, Gillian; Tucker-Raymond, Eli
2018-01-01
Science and computational practices such as modeling and abstraction are critical to understanding the complex systems that are integral to climate science. Given the demonstrated affordances of game design in supporting such practices, we implemented a free 4-day intensive workshop for middle school girls that focused on using the visual…
ERIC Educational Resources Information Center
Bush, Drew; Sieber, Renee; Seiler, Gale; Chandler, Mark
2016-01-01
A gap has existed between the tools and processes of scientists working on anthropogenic global climate change (AGCC) and the technologies and curricula available to educators teaching the subject through student inquiry. Designing realistic scientific inquiry into AGCC poses a challenge because research on it relies on complex computer models,…
Contributions of the ARM Program to Radiative Transfer Modeling for Climate and Weather Applications
NASA Technical Reports Server (NTRS)
Mlawer, Eli J.; Iacono, Michael J.; Pincus, Robert; Barker, Howard W.; Oreopoulos, Lazaros; Mitchell, David L.
2016-01-01
Accurate climate and weather simulations must account for all relevant physical processes and their complex interactions. Each of these atmospheric, ocean, and land processes must be considered on an appropriate spatial and temporal scale, which imposes a substantial computational burden on these simulations. One especially critical physical process is the flow of solar and thermal radiant energy through the atmosphere, which controls planetary heating and cooling and drives the large-scale dynamics that move energy from the tropics toward the poles. Radiation calculations are therefore essential for climate and weather simulations, but are themselves quite complex even without considering the effects of variable and inhomogeneous clouds. Clear-sky radiative transfer calculations have to account for thousands of absorption lines due to water vapor, carbon dioxide, and other gases, which are irregularly distributed across the spectrum and have shapes dependent on pressure and temperature. The line-by-line (LBL) codes that treat these details have a far greater computational cost than global models can afford. Therefore, the crucial requirement for accurate radiation calculations in climate and weather prediction models must be satisfied by fast solar and thermal radiation parameterizations whose high level of accuracy has been demonstrated through extensive comparisons with LBL codes. See attachment for continuation.
NASA Astrophysics Data System (ADS)
Scherstjanoi, M.; Kaplan, J. O.; Lischke, H.
2014-02-01
To be able to simulate climate change effects on forest dynamics over the whole of Switzerland, we adapted the second-generation DGVM LPJ-GUESS to the Alpine environment. We modified model functions, tuned model parameters, and implemented new tree species to represent the potential natural vegetation of Alpine landscapes. Furthermore, we increased the computational efficiency of the model to enable area-covering simulations at a fine resolution (1 km) sufficient for the complex topography of the Alps, which resulted in more than 32 000 simulation grid cells. To this aim, we applied the recently developed method GAPPARD (Scherstjanoi et al., 2013) to LPJ-GUESS. GAPPARD derives mean output values from a combination of simulation runs without disturbances and a patch age distribution defined by the disturbance frequency. With this computationally efficient method, which increased the model's speed by a factor of approximately 8, we were able to detect shortcomings of LPJ-GUESS functions and parameters more quickly. We used the adapted LPJ-GUESS together with GAPPARD to assess the influence of one climate change scenario on the dynamics of tree species composition and biomass throughout the 21st century in Switzerland. To allow for comparison with the original model, we additionally simulated forest dynamics along a north-south transect through Switzerland. The results from this transect confirmed the high value of the GAPPARD method despite some limitations regarding extreme climatic events. For the first time, it allowed us to obtain area-wide, detailed high-resolution LPJ-GUESS simulation results for a large part of the Alpine region.
THE CLIMATIC AND HYDROLOGIC FACTORS AFFECTING THE REDISTRIBUTION OF SR-90
leaching solution present and the chemical and cation exchange properties of the soil solution; a mathematical model of movement was established...manual for using high-speed computers to compute the factors of the daily water balance was prepared; the influence of the soil solution in
Ocean modelling on the CYBER 205 at GFDL
NASA Technical Reports Server (NTRS)
Cox, M.
1984-01-01
At the Geophysical Fluid Dynamics Laboratory, research is carried out for the purpose of understanding various aspects of climate, such as its variability, predictability, stability and sensitivity. The atmosphere and oceans are modelled mathematically and their phenomenology studied by computer simulation methods. The present state of the art in the computer simulation of large-scale oceans on the CYBER 205 is discussed. While atmospheric modelling differs in some respects, the basic approach used is similar. The equations of the ocean model are presented along with a short description of the numerical techniques used to find their solution. Computational considerations and a typical solution are presented in section 4.
Long-term simulations of dissolved oxygen concentrations in Lake Trout lakes
NASA Astrophysics Data System (ADS)
Jabbari, A.; Boegman, L.; MacKay, M.; Hadley, K.; Paterson, A.; Jeziorski, A.; Nelligan, C.; Smol, J. P.
2016-02-01
Lake Trout are a rare and valuable natural resource that is threatened by multiple environmental stressors. With the added threat of climate warming, there is growing concern among resource managers that increased thermal stratification will reduce the habitat quality of deep-water Lake Trout lakes through enhanced oxygen depletion. To address this issue, a three-part study is underway, which aims to: analyze sediment cores to understand the past, develop empirical formulae to model the present, and apply computational models to forecast the future. This presentation reports on the computational modeling efforts. To this end, a simple dissolved oxygen sub-model has been embedded in the one-dimensional bulk mixed-layer thermodynamic Canadian Small Lake Model (CSLM). This model is currently being incorporated into the Canadian Land Surface Scheme (CLASS), the primary land surface component of Environment Canada's global and regional climate modelling systems. The oxygen model was calibrated and validated by hindcasting temperature and dissolved oxygen profiles from two Lake Trout lakes on the Canadian Shield. These data sets include 5 years of high-frequency (10 s to 10 min) data from Eagle Lake and 30 years of bi-weekly data from Harp Lake. Initial results show that temperature and dissolved oxygen were predicted with root mean square errors <1.5 °C and <3 mg L-1, respectively. Ongoing work is validating the model, over climate-change-relevant timescales, against dissolved oxygen reconstructions from the sediment cores and predicting future deep-water temperature and dissolved oxygen concentrations in Canadian Lake Trout lakes under future climate change scenarios. This model will provide a useful tool for managers to ensure sustainable fishery resources for future generations.
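The essence of such a sub-model, drawdown of hypolimnetic oxygen by water-column and sediment demand during stratification, can be sketched with a simple budget (the rates and depth below are hypothetical, not CSLM's calibrated values):

```python
def hypolimnetic_do(o2_init, days, wod=0.03, sod=0.5, depth=10.0):
    """Dissolved oxygen (mg/L) drawn down by water-column oxygen demand
    (wod, mg/L/day) and sediment oxygen demand (sod, g/m2/day) spread
    over the mean hypolimnion depth (m). All rates are hypothetical."""
    o2, series = o2_init, [o2_init]
    for _ in range(days):
        o2 = max(o2 - wod - sod / depth, 0.0)
        series.append(o2)
    return series

# A 90-day stratified season starting from 10 mg/L.
do = hypolimnetic_do(10.0, 90)
print(round(do[-1], 1))
```

A longer or warmer stratified season in this sketch lowers the end-of-season oxygen, which is the concern driving the study.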
Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoffman, Forest M.; Bochev, Pavel B.; Cameron-Smith, Philip J.
The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.
NASA Astrophysics Data System (ADS)
Scherstjanoi, M.; Kaplan, J. O.; Lischke, H.
2014-07-01
To be able to simulate climate change effects on forest dynamics over the whole of Switzerland, we adapted the second-generation DGVM (dynamic global vegetation model) LPJ-GUESS (Lund-Potsdam-Jena General Ecosystem Simulator) to the Alpine environment. We modified model functions, tuned model parameters, and implemented new tree species to represent the potential natural vegetation of Alpine landscapes. Furthermore, we increased the computational efficiency of the model to enable area-covering simulations at a fine resolution (1 km) sufficient for the complex topography of the Alps, which resulted in more than 32 000 simulation grid cells. To this aim, we applied the recently developed method GAPPARD (approximating GAP model results with a Probabilistic Approach to account for stand Replacing Disturbances) (Scherstjanoi et al., 2013) to LPJ-GUESS. GAPPARD derives mean output values from a combination of simulation runs without disturbances and a patch age distribution defined by the disturbance frequency. With this computationally efficient method, which increased the model's speed by a factor of approximately 8, we were able to detect the shortcomings of LPJ-GUESS functions and parameters more quickly. We used the adapted LPJ-GUESS together with GAPPARD to assess the influence of one climate change scenario on the dynamics of tree species composition and biomass throughout the 21st century in Switzerland. To allow for comparison with the original model, we additionally simulated forest dynamics along a north-south transect through Switzerland. The results from this transect confirmed the high value of the GAPPARD method despite some limitations regarding extreme climatic events. For the first time, it allowed us to obtain area-wide, detailed high-resolution LPJ-GUESS simulation results for a large part of the Alpine region.
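GAPPARD's central step, averaging an undisturbed run over the patch-age distribution implied by the disturbance frequency, can be sketched as follows. The exponential age distribution and the saturating biomass curve are simplifying assumptions for illustration, not the published formulation:

```python
import math

def gappard_mean(undisturbed, disturbance_rate):
    """Landscape mean of a quantity from one undisturbed trajectory,
    weighting each stand age by an exponential age distribution with
    the given constant annual disturbance probability (simplified)."""
    w = [disturbance_rate * math.exp(-disturbance_rate * age)
         for age in range(len(undisturbed))]
    return sum(wi * v for wi, v in zip(w, undisturbed)) / sum(w)

# Hypothetical biomass (t/ha) of an undisturbed patch, saturating with
# stand age; disturbances keep the landscape mean well below saturation.
biomass = [200.0 * (1 - math.exp(-age / 80.0)) for age in range(400)]
mean_b = gappard_mean(biomass, disturbance_rate=0.01)
print(mean_b < biomass[-1])
```

Because only one undisturbed trajectory is needed instead of many stochastic patch replicates, the averaging itself is what buys the roughly eightfold speedup reported above.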
An Improved Radiative Transfer Model for Climate Calculations
NASA Technical Reports Server (NTRS)
Bergstrom, Robert W.; Mlawer, Eli J.; Sokolik, Irina N.; Clough, Shepard A.; Toon, Owen B.
1998-01-01
This paper presents a radiative transfer model that has been developed to accurately predict the atmospheric radiant flux in both the infrared and the solar spectrum with a minimum of computational effort. The model is designed to be included in numerical climate models. To assess the accuracy of the model, the results are compared to those of other, more detailed models for several standard cases in the solar and thermal spectrum. As the thermal spectrum has been treated in other publications, we focus here on the solar part of the spectrum. We perform several example calculations focusing on the question of absorption of solar radiation by gases and aerosols.
Ensemble forecasting has been used for operational numerical weather prediction in the United States and Europe since the early 1990s. An ensemble of weather or climate forecasts is used to characterize the two main sources of uncertainty in computer models of physical systems: ...
NASA Astrophysics Data System (ADS)
Bush, D. F.; Sieber, R.; Seiler, G.; Chandler, M. A.; Chmura, G. L.
2017-12-01
Efforts to address climate change require public understanding of Earth and climate science. To meet this need, educators require instructional approaches and scientific technologies that overcome cultural barriers to impart a conceptual understanding of the work of climate scientists. We compared student inquiry learning with the now-ubiquitous toy models, data, and tools of climate education against inquiry learning with a computational global climate model (GCM) from the National Aeronautics and Space Administration (NASA). Our study at McGill University and John Abbott College in Montreal, QC sheds light on how best to teach the research processes that are important to Earth and climate scientists studying atmospheric and Earth system processes but ill-understood by those outside the scientific community. We followed a pre/post, control/treatment experimental design that enabled detailed analysis and statistically significant results. Our research found that more students succeed at understanding climate change when exposed to actual climate research processes and instruments. Inquiry-based education with a GCM resulted in significantly higher scores from pre- to post-test on diagnostic exams (quantitatively) and more complete conceptual understandings (qualitatively). We recognize the difficulty in planning and teaching inquiry with complex technology, and we also found evidence that lectures support learning geared toward assessment exams.
NASA Astrophysics Data System (ADS)
Gordov, Evgeny; Lykosov, Vasily; Krupchatnikov, Vladimir; Bogomolov, Vasily; Gordova, Yulia; Martynova, Yulia; Okladnikov, Igor; Titov, Alexander; Shulgina, Tamara
2014-05-01
Volumes of environmental data archives are growing immensely due to recent developments in models, high-performance computers, and sensors. This makes their comprehensive analysis impossible in the conventional manner, at the researcher's workplace, using in-house computing facilities, data storage, and processing software. One possible answer to this challenge is the creation of a virtual research environment (VRE), which should provide a researcher with integrated access to huge data resources, tools, and services across disciplines and user communities, and enable researchers to process structured and qualitative data in virtual workspaces. A VRE should integrate data, network, and computing resources, providing the interdisciplinary climate research community with the opportunity to gain a profound understanding of ongoing and possible future climatic changes and their consequences. We present first steps and plans for the development of a VRE prototype element aimed at regional climatic and ecological monitoring and modelling, as well as at continuous education and training support. The recently developed experimental software and hardware platform for integrated analysis of heterogeneous georeferenced data, "Climate" (http://climate.scert.ru/; Gordov et al., 2013; Shulgina et al., 2013; Okladnikov et al., 2013), is used as a VRE element prototype and approach test bench. The VRE under development will integrate, on the basis of a geoportal, distributed thematic data storage, processing and analysis systems, and a set of models of complex climatic and environmental processes run on supercomputers. VRE-specific tools are aimed at high-resolution rendering of ongoing climatic processes in Northern Eurasia and at reliable, well-founded projections of their dynamics for selected scenarios of future human activity.
Currently, the VRE element is accessible via the developed geoportal at the same link (http://climate.scert.ru/) and integrates the WRF and Planet Simulator models with basic reanalysis and instrumental measurement data, and supports in-depth statistical analysis of stored and on-demand modelled data. In particular, one can run the integrated models, preprocess the modelling results, perform analyses using dedicated numerical-processing modules, and visualize the obtained results. New functionality has recently been added to the statistical analysis tool set, aimed at detailed studies of climatic extremes occurring in northern Asia. The VRE element also supports thematic educational courses for students and postgraduate students of Tomsk State University. In particular, it allows students to perform online thematic laboratory work on the basics of analyzing current and potential future regional climate change, using the territory of Siberia as an example (Gordova et al., 2013). We plan to expand the integrated model set and add a comprehensive description of the land surface and the Arctic Ocean. The developed VRE element "Climate" provides specialists involved in multidisciplinary research projects with reliable and practical instruments for the integrated study of climate and ecosystem changes on global and regional scales. With its help, even a user without programming skills can process and visualize multidimensional observational and model data through a unified web interface in a common graphical web browser. This work is partially supported by SB RAS project VIII.80.2.1, RFBR grants 13-05-12034 and 14-05-00502, and integrated project SB RAS 131. References 1. Gordov E.P., Lykosov V.N., Krupchatnikov V.N., Okladnikov I.G., Titov A.G., Shulgina T.M. Computational and information technologies for monitoring and modeling of climate changes and their consequences. Novosibirsk: Nauka, Siberian Branch, 2013. 195 p. (in Russian) 2. T.M. Shulgina, E.P. Gordov, I.G.
Okladnikov, A.G., Titov, E.Yu. Genina, N.P. Gorbatenko, I.V. Kuzhevskaya,A.S. Akhmetshina. Software complex for a regional climate change analysis. // Vestnik NGU. Series: Information technologies. 2013. Vol. 11. Issue 1. P. 124-131. (in Russian) 3. I.G. Okladnikov, A.G. Titov, T.M. Shulgina, E.P. Gordov, V.Yu. Bogomolov, Yu.V. Martynova, S.P. Suschenko,A.V. Skvortsov. Software for analysis and visualization of climate change monitoring and forecasting data //Numerical methods and programming, 2013. Vol. 14. P. 123-131.(in Russian) 4. Yu.E. Gordova, E.Yu. Genina, V.P. Gorbatenko, E.P. Gordov, I.V. Kuzhevskaya, Yu.V. Martynova , I.G. Okladnikov, A.G. Titov, T.M. Shulgina, N.K. Barashkova Support of the educational process in modern climatology within the web-gis platform «Climate». Open and Distant Education. 2013, No 1(49)., P. 14-19.(in Russian)
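As an illustration of the kind of statistical analysis of climatic extremes such a platform offers, here is a hedged sketch in pure Python: a percentile-threshold warm-day count. The 90th-percentile definition and the function names are assumptions for illustration, not the system's actual algorithm.

```python
# Illustrative percentile-threshold extreme-temperature index (assumed
# definition; not the "Climate" platform's actual implementation).

def percentile(values, q):
    """Linear-interpolation percentile (q in [0, 100]) of a non-empty list."""
    xs = sorted(values)
    pos = (len(xs) - 1) * q / 100.0
    lo = int(pos)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (pos - lo)

def warm_day_count(daily_t, baseline_t, q=90.0):
    """Count days in daily_t exceeding the q-th percentile of a baseline period."""
    threshold = percentile(baseline_t, q)
    return sum(1 for t in daily_t if t > threshold)
```

The baseline series would typically be a multi-decade reference period for the same calendar window; the daily series is the period under study.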
NASA Astrophysics Data System (ADS)
Gomes, Sandra; Deus, Ricardo; Nogueira, Miguel; Viterbo, Pedro; Miranda, Miguel; Antunes, Sílvia; Silva, Alvaro; Miranda, Pedro
2016-04-01
The Portuguese Local Warming Website (http://portaldoclima.pt) has been developed to support society in Portugal in preparing for adaptation to the ongoing and future effects of climate change. The climate portal provides systematic and easy access to authoritative scientific data ready to be used by a vast and diverse user community from different public and private sectors, key players, and decision makers, but also by high school students, contributing to increased knowledge and awareness of climate change topics. A comprehensive set of regional climate variables and indicators is computed, explained, and graphically presented. Variables and indicators were defined in agreement with needs identified after consultation of the relevant social partners from different sectors, including agriculture, water resources, health, environment and energy, and also in direct cooperation with the Portuguese National Strategy for Climate Change Adaptation (ENAAC) group. The visual interface allows the user to dynamically interact, explore, quickly analyze, and compare, but also to download and import the data and graphics. The climate variables and indicators are computed from state-of-the-art regional climate model (RCM) simulations (e.g., from the CORDEX project) at high spatio-temporal detail, pushing the limits of the projections down to local administrative regions (NUTS3) and to monthly or seasonal periods, and promoting local adaptation strategies. The portal provides both historical data (observed and modelled for the 1971-2000 period) and future climate projections for different scenarios (modelled for the 2011-2100 period). A large effort was undertaken to quantify the impacts of the risk of extreme events, such as heavy rain and flooding, droughts, heat and cold waves, and fires.
Furthermore, the different climate scenarios and the ensemble of RCM models, with high temporal (daily) and spatial (~11 km) detail, are exploited to quantify a plausible evolution of climate impacts and its uncertainties. Clear information on the value and limitations of the data is also provided. The portal is expected to become a reference tool for the evaluation of impacts and vulnerabilities due to climate change, for increased awareness, and for the promotion of local adaptation and sustainable development in Portugal. The Portuguese Local Warming Website is part of the ADAPT programme and is co-funded by the EEA financial mechanism and the Portuguese Carbon Fund.
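Extreme-event indicators such as heat waves are typically derived from daily series by threshold-and-run rules. A minimal sketch of one such indicator follows; the 35 °C threshold and three-day minimum run length are illustrative assumptions, not the portal's actual definitions.

```python
# Illustrative heat-wave indicator: a "heat wave" is assumed here to be a run
# of at least min_run consecutive days with daily maximum above threshold.

def heat_wave_events(daily_tmax, threshold=35.0, min_run=3):
    """Return (number_of_events, longest_run) for a daily Tmax series."""
    events, longest, run = 0, 0, 0
    for t in daily_tmax:
        if t > threshold:
            run += 1
            if run == min_run:  # the run has just qualified as an event
                events += 1
            longest = max(longest, run)
        else:
            run = 0
    return events, longest
```

Applied per grid cell and per scenario, counts like these can then be aggregated to administrative regions and seasons, as the portal does for its indicators.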
Where Next for Marine Cloud Brightening Research?
NASA Astrophysics Data System (ADS)
Jenkins, A. K. L.; Forster, P.
2014-12-01
Realistic estimates of geoengineering effectiveness will be central to informed decision-making on its possible role in addressing climate change. Over the last decade, global-scale computer climate modelling of geoengineering has been developing. While these developments have allowed quantitative estimates of geoengineering effectiveness to be produced, the relative coarseness of the grid of these models (tens of kilometres) means that key practical details of the proposed geoengineering are not always realistically captured. This is particularly true for marine cloud brightening (MCB), where neither the clouds nor the tens-of-metres-scale sea-going implementation vessels can be captured in detail. Previous research using cloud-resolving modelling has shown that neglecting such details may lead to MCB effectiveness being overestimated by up to half. Estimates of MCB effectiveness will likely become more realistic through ongoing developments in the understanding and modelling of clouds. We also propose that realism can be increased via more specific improvements (see figure). A readily achievable example would be the reframing of previous MCB effectiveness estimates in light of the cloud-resolving-scale findings. Implementation details could also be incorporated, via parameterisation, into future global-scale modelling of MCB. However, as significant unknowns regarding the design of the MCB aerosol production technique remain, resource-intensive cloud-resolving computer modelling of MCB may be premature unless it is of broader benefit to the wider understanding of clouds. One of the most essential recommendations is for enhanced communication between climate scientists and MCB designers. This would facilitate the identification of potentially important design aspects necessary for realistic computer simulations. Such relationships could be mutually beneficial, with computer modelling potentially informing more efficient designs of the MCB implementation technique.
(Acknowledgment) This work is part of the Integrated Assessment of Geoengineering Proposals (IAGP) project, funded by the Engineering and Physical Sciences Research Council and the Natural Environment Research Council (EP/I014721/1).
Current problems in applied mathematics and mathematical modeling
NASA Astrophysics Data System (ADS)
Alekseev, A. S.
Papers are presented on mathematical modeling, with applications to such fields as geophysics, chemistry, atmospheric optics, and immunology. Attention is also given to models of ocean current fluxes, atmospheric and marine interactions, and atmospheric pollution. The articles include studies of catalytic reactors, models of global climate phenomena, and computer-assisted atmospheric models.
Modeling of the Global Water Cycle - Analytical Models
Yongqiang Liu; Roni Avissar
2005-01-01
Both numerical and analytical models of coupled atmosphere and its underlying ground components (land, ocean, ice) are useful tools for modeling the global and regional water cycle. Unlike complex three-dimensional climate models, which need very large computing resources and involve a large number of complicated interactions often difficult to interpret, analytical...
3ARM: A Fast, Accurate Radiative Transfer Model for Use in Climate Models
NASA Technical Reports Server (NTRS)
Bergstrom, R. W.; Kinne, S.; Sokolik, I. N.; Toon, O. B.; Mlawer, E. J.; Clough, S. A.; Ackerman, T. P.; Mather, J.
1996-01-01
A new radiative transfer model combining the efforts of three groups of researchers is discussed. The model accurately computes radiative transfer in inhomogeneous absorbing, scattering, and emitting atmospheres. As an illustration of the model, results are shown for the effects of dust on thermal radiation.
Statistical Compression for Climate Model Output
NASA Astrophysics Data System (ADS)
Hammerling, D.; Guinness, J.; Soh, Y. J.
2017-12-01
Numerical climate model simulations run at high spatial and temporal resolutions generate massive quantities of data. As our computing capabilities continue to increase, storing all of the data is not sustainable, and thus it is important to develop methods for representing full datasets by smaller compressed versions. We propose a statistical compression and decompression algorithm based on storing a set of summary statistics as well as a statistical model describing the conditional distribution of the full dataset given the summary statistics. We decompress the data by computing conditional expectations and conditional simulations from the model given the summary statistics. Conditional expectations represent our best estimate of the original data but are subject to oversmoothing in space and time. Conditional simulations introduce realistic small-scale noise so that the decompressed fields are neither too smooth nor too rough compared with the original data. Considerable attention is paid to accurately modeling the original dataset (one year of daily mean temperature data), particularly with regard to the inherent spatial nonstationarity in global fields, and to determining the statistics to be stored, so that the variation in the original data can be closely captured while allowing for fast decompression and conditional emulation on modest computers.
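The compress/decompress cycle described above can be caricatured in a few lines: store per-block summary statistics, then decompress either as conditional expectations (smooth) or as conditional simulations (small-scale noise restored). This toy assumes independent blocks with Gaussian residuals and omits the paper's careful nonstationary spatial model; all names are hypothetical.

```python
import random

def compress(series, block):
    """Summary statistics per block: (mean, residual std, block length)."""
    stats = []
    for i in range(0, len(series), block):
        chunk = series[i:i + block]
        m = sum(chunk) / len(chunk)
        var = sum((x - m) ** 2 for x in chunk) / len(chunk)
        stats.append((m, var ** 0.5, len(chunk)))
    return stats

def decompress(stats, simulate=False, rng=None):
    """Conditional expectation (means only) or conditional simulation
    (means plus Gaussian noise at the stored residual scale)."""
    rng = rng or random.Random(0)
    out = []
    for m, s, n in stats:
        for _ in range(n):
            out.append(m + (rng.gauss(0.0, s) if simulate else 0.0))
    return out
```

The expectation path illustrates the oversmoothing the abstract mentions; the simulation path restores realistic roughness at the cost of exact reproduction.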
Description of the NCAR Community Climate Model (CCM3). Technical note
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kiehl, J.T.; Hack, J.J.; Bonan, G.B.
This report presents the details of the governing equations, physical parameterizations, and numerical algorithms defining the version of the NCAR Community Climate Model designated CCM3. The material provides an overview of the major model components and the way in which they interact as the numerical integration proceeds. This version of the CCM incorporates significant improvements to the physics package, new capabilities such as the incorporation of a slab ocean component, and a number of enhancements to the implementation (e.g., the ability to integrate the model on parallel distributed-memory computational platforms).
Uncertainty of climate change impact on groundwater reserves - Application to a chalk aquifer
NASA Astrophysics Data System (ADS)
Goderniaux, Pascal; Brouyère, Serge; Wildemeersch, Samuel; Therrien, René; Dassargues, Alain
2015-09-01
Recent studies have evaluated the impact of climate change on groundwater resources for different geographical and climatic contexts. However, most studies have either not estimated the uncertainty around projected impacts or have limited the analysis to the uncertainty related to climate models. In this study, the uncertainties around impact projections from several sources (climate models, natural variability of the weather, hydrological model calibration) are calculated and compared for the Geer catchment (465 km²) in Belgium. We use a surface-subsurface integrated model implemented with the finite element code HydroGeoSphere, coupled with climate change scenarios (2010-2085) and the UCODE_2005 inverse model, to assess the uncertainty related to the calibration of the hydrological model. This integrated model provides a more realistic representation of the water exchanges between surface and subsurface domains and further constrains the calibration through the use of both surface and subsurface observations. Sensitivity and uncertainty analyses were performed on predictions. The linear uncertainty analysis is approximate for this nonlinear system, but it provides some measure of uncertainty for computationally demanding models. Results show that, for the Geer catchment, the most important uncertainty is related to calibration of the hydrological model. The total uncertainty associated with the prediction of groundwater levels remains large. By the end of the century, however, the uncertainty becomes smaller than the predicted decline in groundwater levels.
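The linear uncertainty analysis mentioned above propagates the calibrated-parameter covariance through the prediction sensitivities, var(p) ≈ J C Jᵀ. A minimal sketch with illustrative numbers (not values from the Geer catchment model):

```python
# First-order (linear) prediction-uncertainty propagation, as used by
# UCODE-style analyses: J holds the sensitivities of one prediction to the
# n calibrated parameters, C is the n x n parameter covariance matrix.

def linear_prediction_variance(J, C):
    """Return J C J^T for a single prediction (J: length-n list)."""
    n = len(J)
    v = [sum(C[i][k] * J[k] for k in range(n)) for i in range(n)]  # v = C J
    return sum(J[i] * v[i] for i in range(n))                      # J . v
```

For a nonlinear model such as HydroGeoSphere this is only a local approximation around the calibrated parameter set, which is exactly the caveat the abstract raises.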
Effects of Ensemble Configuration on Estimates of Regional Climate Uncertainties
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldenson, N.; Mauger, G.; Leung, L. R.
Internal variability in the climate system can contribute substantial uncertainty to climate projections, particularly at regional scales. Internal variability can be quantified using large ensembles of simulations that are identical except for perturbed initial conditions. Here we compare methods for quantifying internal variability. Our study region spans the west coast of North America, which is strongly influenced by El Niño and other large-scale dynamics through their contribution to large-scale internal variability. Using a statistical framework to simultaneously account for multiple sources of uncertainty, we find that internal variability can be quantified consistently using a large ensemble or an ensemble of opportunity that includes small ensembles from multiple models and climate scenarios. The latter also produces estimates of uncertainty due to model differences. We conclude that projection uncertainties are best assessed using small single-model ensembles from as many model-scenario pairings as computationally feasible, which has implications for ensemble design in large modeling efforts.
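The variance decomposition underlying such comparisons can be sketched simply: spread across initial-condition members within each model estimates internal variability, while spread of the model means estimates model uncertainty. This is a simplification of the paper's statistical framework, with hypothetical function names.

```python
# Toy decomposition of ensemble spread into internal variability and model
# uncertainty for one scenario/region (illustrative, not the paper's method).

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def decompose(ensembles):
    """ensembles: {model_name: [values from initial-condition members]}."""
    # Within-model spread, averaged over models -> internal variability.
    internal = sum(variance(v) for v in ensembles.values()) / len(ensembles)
    # Spread of the model means -> model (structural) uncertainty.
    model_means = [sum(v) / len(v) for v in ensembles.values()]
    return {"internal": internal, "model": variance(model_means)}
```

With many model-scenario pairings contributing small ensembles, both terms can be estimated at once, which is the design recommendation the abstract reaches.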
FACE-IT. A Science Gateway for Food Security Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montella, Raffaele; Kelly, David; Xiong, Wei
Progress in sustainability science is hindered by challenges in creating and managing complex data acquisition, processing, simulation, post-processing, and intercomparison pipelines. To address these challenges, we developed the Framework to Advance Climate, Economic, and Impact Investigations with Information Technology (FACE-IT) for crop and climate impact assessments. This integrated data processing and simulation framework enables data ingest from geospatial archives; data regridding, aggregation, and other processing prior to simulation; large-scale climate impact simulations with agricultural and other models, leveraging high-performance and cloud computing; and post-processing to produce aggregated yields and ensemble variables needed for statistics, for model intercomparison, and to connect biophysical models to global and regional economic models. FACE-IT leverages the capabilities of the Globus Galaxies platform to enable the capture of workflows and outputs in well-defined, reusable, and comparable forms. We describe FACE-IT and applications within the Agricultural Model Intercomparison and Improvement Project and the Center for Robust Decision-making on Climate and Energy Policy.
NASA Technical Reports Server (NTRS)
Christidis, Z. D.; Spar, J.
1980-01-01
Spherical harmonic analysis was used to analyze the observed climatological (C) fields of temperature at 850 mb, geopotential height at 500 mb, and sea level pressure. The spherical harmonic method was also applied to the corresponding "model climatological" fields (M) generated by a general circulation model, the "GISS climate model." The climate model was initialized with observed data for 1 December 1976 at 0000 GMT and allowed to generate five years of meteorological history. Monthly means of the above fields for the five years were computed and subjected to spherical harmonic analysis. Comparison of the spectral components of both sets, M and C, showed that the climate model generated reasonable 500 mb geopotential heights. The model temperature field at 850 mb exhibited a generally correct structure; however, the meridional temperature gradient was overestimated, and overheating of the continents was observed in summer.
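A spherical harmonic decomposition separates a field into zonal wavenumbers and meridional modes. As a self-contained illustration of the zonal part only (a plain discrete Fourier projection around one latitude circle, not a full spherical harmonic transform), one can project a field onto cosine/sine components and compare model and climatological coefficients wavenumber by wavenumber:

```python
import math

# Project a field sampled at n equally spaced longitudes onto zonal
# wavenumbers 0..max_wavenumber (zonal mean is k = 0). Illustrative only;
# the study's analysis uses full spherical harmonics, not just this step.

def zonal_coefficients(field, max_wavenumber):
    """Return {k: (a_k, b_k)} cosine/sine coefficients of the field."""
    n = len(field)
    coeffs = {}
    for k in range(max_wavenumber + 1):
        a = sum(field[j] * math.cos(2 * math.pi * k * j / n) for j in range(n))
        b = sum(field[j] * math.sin(2 * math.pi * k * j / n) for j in range(n))
        norm = n if k == 0 else n / 2
        coeffs[k] = (a / norm, b / norm)
    return coeffs
```

Comparing, say, the wave-1 and wave-2 amplitudes of a 500 mb height field between M and C is the one-latitude analogue of the spectral comparison described in the abstract.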
NASA Technical Reports Server (NTRS)
Wang, W.-C.; Stone, P. H.
1980-01-01
The feedback between the ice albedo and temperature is included in a one-dimensional radiative-convective climate model. The effect of this feedback on global sensitivity to changes in solar constant is studied for the current climate conditions. This ice-albedo feedback amplifies global sensitivity by 26 and 39%, respectively, for assumptions of fixed cloud altitude and fixed cloud temperature. The global sensitivity is not affected significantly if the latitudinal variations of mean solar zenith angle and cloud cover are included in the global model. The differences in global sensitivity between one-dimensional radiative-convective models and energy balance models are examined. It is shown that the models are in close agreement when the same feedback mechanisms are included. The one-dimensional radiative-convective model with ice-albedo feedback included is used to compute the equilibrium ice line as a function of solar constant.
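The amplification of sensitivity by the ice-albedo feedback can be illustrated with a zero-dimensional energy-balance toy (not the paper's one-dimensional radiative-convective model): albedo rises as the planet cools, so the equilibrium temperature responds more strongly to a solar-constant change than it does with albedo held fixed. The albedo ramp and all constants below are assumptions chosen for a stable, convergent illustration.

```python
# Zero-dimensional energy-balance sketch of the ice-albedo feedback.
# All parameter values are illustrative assumptions.

def albedo(T):
    """Assumed planetary albedo: higher when colder (more ice cover)."""
    if T >= 280.0:
        return 0.3
    if T <= 220.0:
        return 0.6
    return 0.6 - 0.3 * (T - 220.0) / 60.0

def equilibrium_T(S, fixed_albedo=None, sigma=5.67e-8):
    """Fixed-point iteration of T = [S (1 - a) / (4 sigma)]^(1/4)."""
    T = 288.0
    for _ in range(200):
        a = fixed_albedo if fixed_albedo is not None else albedo(T)
        T = (S * (1.0 - a) / (4.0 * sigma)) ** 0.25
    return T
```

Comparing equilibrium_T(1361.0) - equilibrium_T(1361.0 * 0.99) against the same difference computed with a fixed albedo shows the feedback-amplified response, the qualitative effect quantified (as 26-39%) in the abstract's more complete model.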
NASA Technical Reports Server (NTRS)
Chou, S.-H.; Curran, R. J.; Ohring, G.
1981-01-01
The effects of two different evaporation parameterizations on the sensitivity of simulated climate to solar constant variations are investigated by using a zonally averaged climate model. One parameterization is a nonlinear formulation in which the evaporation is nonlinearly proportional to the sensible heat flux, with the Bowen ratio determined by the predicted vertical temperature and humidity gradients near the earth's surface (model A). The other is the formulation of Saltzman (1968) with the evaporation linearly proportional to the sensible heat flux (model B). The computed climates of models A and B are in good agreement except for the energy partition between sensible and latent heat at the earth's surface. The difference in evaporation parameterizations causes a difference in the response of temperature lapse rate to solar constant variations and a difference in the sensitivity of longwave radiation to surface temperature which leads to a smaller sensitivity of surface temperature to solar constant variations in model A than in model B. The results of model A are qualitatively in agreement with those of the general circulation model calculations of Wetherald and Manabe (1975).
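The Bowen ratio B = H/LE links the two parameterizations discussed above: given B and the available surface energy, the sensible and latent heat fluxes follow directly. A sketch, where the cp/L closure for B from near-surface gradients is one standard choice assumed here rather than either model's exact formulation:

```python
# Bowen-ratio partition of available surface energy into sensible (H) and
# latent (LE) heat. Illustrative; not the exact closure of model A or B.

def bowen_ratio(dT, dq, cp=1004.0, L=2.5e6):
    """B = (cp / L) * dT / dq from near-surface temperature (K) and
    specific-humidity (kg/kg) gradients."""
    return (cp / L) * dT / dq

def partition_fluxes(available_energy, B):
    """With B = H / LE and H + LE = R: H = R B / (1 + B), LE = R / (1 + B)."""
    H = available_energy * B / (1.0 + B)
    LE = available_energy / (1.0 + B)
    return H, LE
```

In model A's spirit, B (and hence evaporation) responds to the predicted gradients; fixing B instead gives a linear H-E relationship closer to model B.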
Downscaling GISS ModelE Boreal Summer Climate over Africa
NASA Technical Reports Server (NTRS)
Druyan, Leonard M.; Fulakeza, Matthew
2015-01-01
The study examines the perceived added value of downscaling atmosphere-ocean global climate model simulations over Africa and adjacent oceans by a nested regional climate model. NASA/Goddard Institute for Space Studies (GISS) coupled ModelE simulations for June-September 1998-2002 are used to form lateral boundary conditions for synchronous simulations by the GISS RM3 regional climate model. The ModelE computational grid spacing is 2° latitude by 2.5° longitude, and the RM3 grid spacing is 0.44°. ModelE precipitation climatology for June-September 1998-2002 is shown to be a good proxy for 30-year means, so results based on the 5-year sample are presumed to be generally representative. Comparison with observational evidence shows several discrepancies in the ModelE configuration of the boreal summer inter-tropical convergence zone (ITCZ). One glaring shortcoming is that ModelE simulations do not advance the West African rain band northward during the summer to represent monsoon precipitation onset over the Sahel. Results for 1998-2002 show that onset simulation is an important added value produced by downscaling with RM3. ModelE computed sea-surface temperatures (SST) in the eastern South Atlantic Ocean are some 4 K warmer than reanalysis, contributing to large positive biases in overlying surface air temperatures (Tsfc). ModelE Tsfc are also too warm over most of Africa. RM3 downscaling somewhat mitigates the magnitude of Tsfc biases over the African continent; it eliminates the ModelE double ITCZ over the Atlantic, and it produces more realistic orographic precipitation maxima. Parallel ModelE and RM3 simulations with observed SST forcing (in place of the predicted ocean) lower Tsfc errors but have mixed impacts on circulation and precipitation biases. Downscaling improvements in the meridional movement of the rain band over West Africa and in the configuration of orographic precipitation maxima are realized irrespective of the SST biases.
Effects of different representations of transport in the new EMAC-SWIFT chemistry climate model
NASA Astrophysics Data System (ADS)
Scheffler, Janice; Langematz, Ulrike; Wohltmann, Ingo; Kreyling, Daniel; Rex, Markus
2017-04-01
It is well known that the representation of atmospheric ozone chemistry in weather and climate models is essential for a realistic simulation of the atmospheric state. Interactively coupled chemistry climate models (CCMs) provide a means to realistically simulate the interaction between atmospheric chemistry and dynamics. The calculation of chemistry in CCMs, however, is computationally expensive, which renders complex chemistry models unsuitable for ensemble simulations or simulations with multiple climate change scenarios. In such simulations ozone is therefore usually prescribed as a climatological field or included via a fast linear ozone scheme incorporated into the model. While prescribed climatological ozone fields are often not aligned with the modelled dynamics, a linear ozone scheme may not be applicable across a wide range of climatological conditions. An alternative approach to representing atmospheric chemistry in climate models, one that can cope with non-linearities in ozone chemistry and is applicable to a wide range of climatic states, is the Semi-empirical Weighted Iterative Fit Technique (SWIFT), which is driven by reanalysis data and has been validated against observational satellite data and against runs of a full chemistry and transport model. SWIFT has been implemented into the ECHAM/MESSy (EMAC) chemistry climate model, which uses a modular approach to climate modelling in which individual model components can be switched on and off. When using SWIFT in EMAC, there are two options for representing the effect of transport inside the polar vortex: the semi-Lagrangian transport scheme of EMAC, and a transport parameterisation that can be useful when using SWIFT in models without transport of their own. Here, we present results of equivalent simulations with different handling of transport, compare them with EMAC simulations with full interactive chemistry, and evaluate the results against observations.
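The fast linear ozone schemes mentioned above linearize the ozone tendency about a climatological state. A minimal sketch of such a Cariolle-type scheme follows; real schemes carry additional terms (e.g., overhead-column sensitivity) and coefficients fitted from a full chemistry model, so everything numeric here is an illustrative assumption.

```python
# Sketch of a linearized ozone tendency of the kind contrasted with SWIFT:
#   dO3/dt = c0 + c1*(O3 - O3_clim) + c2*(T - T_clim)
# Coefficient values passed to this function are illustrative assumptions.

def linear_ozone_step(o3, T, o3_clim, T_clim, c0, c1, c2, dt):
    """Advance the ozone mixing ratio one time step dt (seconds).

    c0: net production-minus-loss at the climatological state,
    c1: relaxation rate toward the ozone climatology (negative, 1/s),
    c2: temperature sensitivity of the net chemistry.
    """
    tendency = c0 + c1 * (o3 - o3_clim) + c2 * (T - T_clim)
    return o3 + dt * tendency
```

Such a scheme is cheap enough for ensembles but, being a linearization, it can misbehave far from the climatological state, which is the limitation SWIFT is designed to overcome.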
Crossing the chasm: how to develop weather and climate models for next generation computers?
NASA Astrophysics Data System (ADS)
Lawrence, Bryan N.; Rezny, Michael; Budich, Reinhard; Bauer, Peter; Behrens, Jörg; Carter, Mick; Deconinck, Willem; Ford, Rupert; Maynard, Christopher; Mullerworth, Steven; Osuna, Carlos; Porter, Andrew; Serradell, Kim; Valcke, Sophie; Wedi, Nils; Wilson, Simon
2018-05-01
Weather and climate models are complex pieces of software which include many individual components, each of which is evolving under pressure to exploit advances in computing to enhance some combination of a range of possible improvements (higher spatio-temporal resolution, increased fidelity in terms of resolved processes, more quantification of uncertainty, etc.). However, after many years of a relatively stable computing environment with little choice in processing architecture or programming paradigm (basically X86 processors using MPI for parallelism), the existing menu of processor choices includes significant diversity, and more is on the horizon. This computational diversity, coupled with ever increasing software complexity, leads to the very real possibility that weather and climate modelling will arrive at a chasm which will separate scientific aspiration from our ability to develop and/or rapidly adapt codes to the available hardware. In this paper we review the hardware and software trends which are leading us towards this chasm, before describing current progress in addressing some of the tools which we may be able to use to bridge the chasm. This brief introduction to current tools and plans is followed by a discussion outlining the scientific requirements for quality model codes which have satisfactory performance and portability, while simultaneously supporting productive scientific evolution. We assert that the existing method of incremental model improvements employing small steps which adjust to the changing hardware environment is likely to be inadequate for crossing the chasm between aspiration and hardware at a satisfactory pace, in part because institutions cannot have all the relevant expertise in house. 
Instead, we outline a methodology based on large community efforts in engineering and standardisation, which will depend on identifying a taxonomy of key activities - perhaps based on existing efforts to develop domain-specific languages, identify common patterns in weather and climate codes, and develop community approaches to commonly needed tools and libraries - and then collaboratively building up those key components. Such a collaborative approach will depend on institutions, projects, and individuals adopting new interdependencies and ways of working.
NASA Downscaling Project: Final Report
NASA Technical Reports Server (NTRS)
Ferraro, Robert; Waliser, Duane; Peters-Lidard, Christa
2017-01-01
A team of researchers from NASA Ames Research Center, Goddard Space Flight Center, the Jet Propulsion Laboratory, and Marshall Space Flight Center, along with university partners at UCLA, conducted an investigation to explore whether downscaling coarse resolution global climate model (GCM) predictions might provide valid insights into the regional impacts sought by decision makers. Since the computational cost of running global models at high spatial resolution for any useful climate-scale period is prohibitive, the hope for downscaling is that a coarse resolution GCM provides sufficiently accurate synoptic scale information for a regional climate model (RCM) to accurately develop fine scale features that represent the regional impacts of a changing climate. As a proxy for a prognostic climate forecast model, and so that ground truth in the form of satellite and in-situ observations could be used for evaluation, the MERRA and MERRA-2 reanalyses were used to drive the NU-WRF regional climate model and a GEOS-5 replay. This was performed at various resolutions that were at factors of 2 to 10 higher than the reanalysis forcing. A number of experiments were conducted that varied resolution, model parameterizations, and intermediate scale nudging, for simulations over the continental US during the period from 2000-2010. The results of these experiments were compared to observational datasets to evaluate the output.
NASA Astrophysics Data System (ADS)
Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander; Fazliev, Alexander
2017-04-01
Description and first results of the Russian Science Foundation project "Virtual computational information environment for analysis, evaluation and prediction of the impacts of global climate change on the environment and climate of a selected region" are presented. The project is aimed at the development of an Internet-accessible computation and information environment providing specialists unskilled in numerical modelling and software design, decision-makers, and stakeholders with reliable and easy-to-use tools for in-depth statistical analysis of climatic characteristics, and with instruments for detailed analysis, assessment and prediction of the impacts of global climate change on the environment and climate of the targeted region. In the framework of the project, approaches to "cloud" processing and analysis of large geospatial datasets will be developed on the technical platform of the leading Russian institution involved in research on climate change and its consequences. The anticipated results will create a pathway for the development and deployment of a thematic international virtual research laboratory focused on interdisciplinary environmental studies. The VRE under development will comprise the best features and functionality of the earlier developed information and computing system CLIMATE (http://climate.scert.ru/), which is widely used in Northern Eurasia environment studies. The project includes several major directions of research: 1. Preparation of geo-referenced data sets describing in detail the dynamics of current and possible future climate and environmental changes. 2. Improvement of methods of analysis of climate change. 3. Enhancement of the functionality of the VRE prototype in order to create a convenient and reliable tool for the study of regional social, economic and political consequences of climate change. 4. Using the output of the first three tasks, compilation of the VRE prototype, its validation, preparation of an applicable detailed description of climate change in Western Siberia, and dissemination of the project results. Results of the first stage of the project implementation are presented. This work is supported by Russian Science Foundation grant No. 16-19-10257.
NASA Astrophysics Data System (ADS)
Schnase, J. L.; Duffy, D.; Tamkin, G. S.; Nadeau, D.; Thompson, J. H.; Grieg, C. M.; McInerney, M.; Webster, W. P.
2013-12-01
Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics because it is the knowledge gained from our interactions with Big Data that ultimately produces societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS built on this principle. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRA/AS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRA/AS has been demonstrated in several applications.
In our experience, Cloud Computing lowers the barriers and risk to organizational change, fosters innovation and experimentation, facilitates technology transfer, and provides the agility required to meet our customers' increasing and changing needs. Cloud Computing is providing a new tier in the data services stack that helps connect earthbound, enterprise-level data and computational resources to new customers and new mobility-driven applications and modes of work. For climate science, Cloud Computing's capacity to engage communities in the construction of new capabilities is perhaps the most important link between Cloud Computing and Big Data.
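The MapReduce pattern behind MERRA/AS can be illustrated with a toy, pure-Python sketch: each "mapper" reduces its local chunk of a variable to a partial (sum, count) pair, and a "reducer" combines the pairs into a global statistic. The chunking and the values are invented for illustration; the real service runs MapReduce sequencing over the MERRA archive on a cluster, not in-process like this.

```python
from functools import reduce

# Hypothetical sketch of a MapReduce-style "temporal mean" analytic, in the
# spirit of the canonical operations MERRA/AS exposes. Data and chunking are
# illustrative only.

def map_chunk(chunk):
    """Map step: each data-proximal worker reduces its chunk of a variable
    (e.g. daily 2 m temperature in K) to a partial (sum, count) pair."""
    values = [v for v in chunk if v is not None]  # skip missing values
    return (sum(values), len(values))

def reduce_pairs(a, b):
    """Reduce step: combine partial results from the workers."""
    return (a[0] + b[0], a[1] + b[1])

def temporal_mean(chunks):
    total, count = reduce(reduce_pairs, map(map_chunk, chunks))
    return total / count

# Three "nodes", each holding part of a time series:
chunks = [[280.1, 281.3, None], [279.8, 280.4], [281.0]]
print(round(temporal_mean(chunks), 2))  # 280.52
```

The point of the pattern is that only the tiny (sum, count) pairs, not the data, cross the network, which is what makes the analytics "data proximal".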
NASA Technical Reports Server (NTRS)
Schnase, John L.; Duffy, Daniel Quinn; Tamkin, Glenn S.; Nadeau, Denis; Thompson, John H.; Grieg, Christina M.; McInerney, Mark A.; Webster, William P.
2014-01-01
Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics because it is the knowledge gained from our interactions with Big Data that ultimately produces societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS built on this principle. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global, temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRA/AS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data-proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRA/AS has been demonstrated in several applications.
In our experience, Cloud Computing lowers the barriers and risk to organizational change, fosters innovation and experimentation, facilitates technology transfer, and provides the agility required to meet our customers' increasing and changing needs. Cloud Computing is providing a new tier in the data services stack that helps connect earthbound, enterprise-level data and computational resources to new customers and new mobility-driven applications and modes of work. For climate science, Cloud Computing's capacity to engage communities in the construction of new capabilities is perhaps the most important link between Cloud Computing and Big Data.
CIELO - A GIS-integrated model for climatic and water balance simulation in island environments
NASA Astrophysics Data System (ADS)
Azevedo, E. B.; Pereira, L. S.
2003-04-01
The model CIELO (acronym for "Clima Insular à Escala Local") is a physically based model that simulates the climatic variables on an island using data from a single synoptic reference meteorological station. The reference station "knows" its position in the orographic and dynamic regime context. The domain of computation is a GIS raster grid parameterised with a digital elevation model (DEM). The grid is oriented following the direction of air mass circulation through a specific algorithm named the rotational terrain model (RTM). The model consists of two main sub-models. The first, covering the advective component, assumes the Foehn effect to reproduce the dynamic and thermodynamic processes occurring when an air mass moves over the island's orographic obstacle. This makes it possible to simulate air temperature, air humidity, cloudiness and precipitation as influenced by the orography along the air displacement. The second concerns the radiative component as affected by clouds of orographic origin and by the shadows cast by the relief. The initial state parameters are computed starting from the reference meteorological station and carried across the DEM transect down to sea level on the windward side. Then, starting from sea level, the model computes the local-scale meteorological parameters along the direction of the air displacement, which is adjusted with the RTM. Air pressure, temperature and humidity are directly calculated for each cell in the computational grid, while several algorithms are used to compute cloudiness, net radiation, evapotranspiration, and precipitation. The model presented in this paper has been calibrated and validated using data from several meteorological stations and a larger number of rainfall stations located at various elevations in the Azores Islands.
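The advective (Foehn effect) sub-model described above can be caricatured in a few lines: air cools at the dry adiabatic lapse rate up to the lifting condensation level (LCL), at the slower moist rate above it (latent heat release), and warms dry adiabatically all the way down the lee side, so the lee arrives warmer and drier. The constants and the fixed LCL are illustrative placeholders, not CIELO's actual parameterization.

```python
# Minimal sketch of the Foehn mechanism behind advective sub-models such as
# CIELO's. Lapse rates and the prescribed LCL are illustrative values.

DRY_LAPSE = 9.8e-3    # dry adiabatic lapse rate, K per m
MOIST_LAPSE = 5.0e-3  # a typical saturated (moist) lapse rate, K per m

def foehn_temperature(t_sea, lcl, crest):
    """Sea-level temperature on the lee side after crossing a ridge (degC)."""
    t_crest = t_sea - DRY_LAPSE * min(lcl, crest)   # dry ascent to the LCL
    if crest > lcl:                                 # saturated ascent above it
        t_crest -= MOIST_LAPSE * (crest - lcl)
    return t_crest + DRY_LAPSE * crest              # dry descent to sea level

# 15 degC air, LCL at 800 m, 2000 m ridge: the lee side is ~5.8 K warmer.
print(round(foehn_temperature(15.0, 800.0, 2000.0), 2))  # 20.76
```

Note that if the crest stays below the LCL no condensation occurs and the air returns at its original temperature, which is the physically expected limit.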
Quantifying Key Climate Parameter Uncertainties Using an Earth System Model with a Dynamic 3D Ocean
NASA Astrophysics Data System (ADS)
Olson, R.; Sriver, R. L.; Goes, M. P.; Urban, N.; Matthews, D.; Haran, M.; Keller, K.
2011-12-01
Climate projections hinge critically on uncertain climate model parameters such as climate sensitivity, vertical ocean diffusivity and anthropogenic sulfate aerosol forcings. Climate sensitivity is defined as the equilibrium global mean temperature response to a doubling of atmospheric CO2 concentrations. Vertical ocean diffusivity parameterizes sub-grid scale ocean vertical mixing processes. These parameters are typically estimated using Earth System Models of Intermediate Complexity (EMICs) that lack a full 3D representation of the oceans, thereby neglecting the effects of mixing on ocean dynamics and meridional overturning. We improve on these studies by employing an EMIC with a dynamic 3D ocean model to estimate these parameters. We carry out historical climate simulations with the University of Victoria Earth System Climate Model (UVic ESCM), varying parameters that affect climate sensitivity, vertical ocean mixing, and the effects of anthropogenic sulfate aerosols. We use a Bayesian approach whereby the likelihood of each parameter combination depends on how well the model simulates surface air temperature and upper ocean heat content. We use a Gaussian process emulator to interpolate the model output to an arbitrary parameter setting. We use a Markov chain Monte Carlo method to estimate the posterior probability distribution function (pdf) of these parameters. We explore the sensitivity of the results to prior assumptions about the parameters. In addition, we estimate the relative skill of different observations to constrain the parameters. We quantify the uncertainty in parameter estimates stemming from climate variability and from model and observational errors. We explore the sensitivity of key decision-relevant climate projections to these parameters. We find that climate sensitivity and vertical ocean diffusivity estimates are consistent with previously published results.
The climate sensitivity pdf is strongly affected by the prior assumptions, and by the scaling parameter for the aerosols. The estimation method is computationally fast and can be used with more complex models where climate sensitivity is diagnosed rather than prescribed. The parameter estimates can be used to create probabilistic climate projections using the UVic ESCM model in future studies.
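The estimation chain described above (model response, likelihood against observations, MCMC over parameters) can be sketched in miniature. Here a one-parameter linear "emulator" stands in both for the UVic ESCM and for a real Gaussian process emulator, and a Metropolis sampler draws from the posterior of a single climate-sensitivity-like parameter; the observation, prior bounds and response slope are all invented.

```python
import random, math

# Toy Bayesian calibration: emulator -> Gaussian likelihood -> Metropolis MCMC.
# The linear "emulator" is a stand-in, not a Gaussian process or the UVic ESCM.

def emulator(sensitivity):
    # Pretend observed warming scales linearly with climate sensitivity.
    return 0.3 * sensitivity

def log_posterior(s, obs=0.9, sigma=0.1):
    if not 0.5 <= s <= 10.0:          # uniform prior bounds (illustrative)
        return -math.inf
    return -0.5 * ((emulator(s) - obs) / sigma) ** 2

def metropolis(n=20000, step=0.5, seed=1):
    random.seed(seed)
    s, samples = 3.0, []
    for _ in range(n):
        prop = s + random.gauss(0.0, step)
        if math.log(random.random()) < log_posterior(prop) - log_posterior(s):
            s = prop                  # accept the proposal
        samples.append(s)
    return samples[n // 2:]           # discard the first half as burn-in

chain = metropolis()
print(round(sum(chain) / len(chain), 2))  # posterior mean near 0.9/0.3 = 3.0
```

In the real study the emulator is fitted to an ensemble of UVic ESCM runs, the likelihood compares multiple observational fields, and the chain explores several parameters jointly, but the control flow is the same.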
NASA Astrophysics Data System (ADS)
Moral, Francisco J.; Rebollo, Francisco J.; Paniagua, Luis L.; García, Abelardo; Honorio, Fulgencio
2016-05-01
Different climatic indices have been proposed to determine wine suitability in a region. Some are related to air temperature, but the hydric component of climate, which is in turn influenced by precipitation during the different stages of the grapevine growing and ripening periods, should also be considered. In this study, we propose using the information obtained from ten climatic indices [heliothermal index (HI), cool night index (CI), dryness index (DI), growing season temperature (GST), the Winkler index (WI), September mean thermal amplitude (MTA), annual precipitation (AP), precipitation during flowering (PDF), precipitation before flowering (PBF), and summer precipitation (SP)] as inputs to an objective and probabilistic model, the Rasch model, with the aim of integrating their individual effects, obtaining a measure that summarizes the main climatic indices that could influence wine suitability from a climate viewpoint, and utilizing the Rasch measures to generate homogeneous climatic zones. The use of the Rasch model to estimate viticultural climatic suitability constitutes a new application of great practical importance, making it possible to rationally determine locations in a region where high viticultural potential exists and to establish a ranking of the climatic indices that exert an important influence on wine suitability. Furthermore, from the measures of viticultural climatic suitability at some locations, estimates can be computed using a geostatistical algorithm, and these estimates can be utilized to map viticultural climatic zones in a region. To illustrate the process, an application to Extremadura, southwestern Spain, is shown.
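Two of the temperature-based indices named above are simple enough to compute directly from daily mean temperatures. A minimal sketch with a synthetic toy "season" (the standard definitions use a full April-October growing season in the northern hemisphere):

```python
# Illustrative computation of GST and the Winkler index from daily mean
# temperatures. The five-day "season" is synthetic toy data.

def growing_season_temperature(daily_means):
    """GST: mean of daily mean temperatures over the growing season (degC)."""
    return sum(daily_means) / len(daily_means)

def winkler_index(daily_means, base=10.0):
    """WI: growing degree-days accumulated above a 10 degC base."""
    return sum(max(0.0, t - base) for t in daily_means)

season = [12.0, 18.5, 22.0, 25.0, 9.0]
print(growing_season_temperature(season))  # 17.3
print(winkler_index(season))               # 2.0 + 8.5 + 12.0 + 15.0 = 37.5
```

Each index becomes one "item" in the Rasch analysis, which is how the model integrates thermal and hydric indicators on a common scale.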
NASA Astrophysics Data System (ADS)
Foulon, Étienne; Rousseau, Alain N.; Gagnon, Patrick
2018-02-01
Low flow conditions are governed by short-to-medium term weather conditions or long term climate conditions. This prompts the question: given climate scenarios, is it possible to assess future extreme low flow conditions from climate data indices (CDIs)? Or should we rely on the conventional approach of using outputs of climate models as inputs to a hydrological model? Several CDIs were computed using 42 climate scenarios over the years 1961-2100 for two watersheds located in Québec, Canada. The relationships between the CDIs and hydrological data indices (HDIs; 7- and 30-day low flows for two hydrological seasons) were examined through correlation analysis to identify the indices governing low flows. Results of the Mann-Kendall test, with a modification for autocorrelated data, clearly identified trends. A partial correlation analysis allowed attributing the observed trends in HDIs to trends in specific CDIs. Furthermore, results showed that, even during the spatial validation process, the methodological framework was able to assess trends in low flow series from: (i) trends in the effective drought index (EDI) computed from rainfall plus snowmelt minus potential evapotranspiration (PET) over ten to twelve months of the hydrological snow-cover season, or (ii) the cumulative difference between rainfall and PET over five months of the snow-free season. For 80% of the climate scenarios, trends in HDIs were successfully attributed to trends in CDIs. Overall, this paper introduces an efficient methodological framework to assess future trends in low flows given climate scenarios. The outcome may prove useful to municipalities concerned with source water management under changing climate conditions.
NASA Astrophysics Data System (ADS)
Prein, A. F.; Langhans, W.; Fosser, G.; Ferrone, A.; Ban, N.; Goergen, K.; Keller, M.; Tölle, M.; Gutjahr, O.; Feser, F.; Brisson, E.; Kollet, S. J.; Schmidli, J.; Van Lipzig, N. P. M.; Leung, L. R.
2015-12-01
Regional climate modeling using convection-permitting models (CPMs; horizontal grid spacing <4 km) emerges as a promising framework to provide more reliable climate information on regional to local scales compared to traditionally used large-scale models (LSMs; horizontal grid spacing >10 km). CPMs no longer rely on convection parameterization schemes, which had been identified as a major source of errors and uncertainties in LSMs. Moreover, CPMs allow for a more accurate representation of surface and orography fields. The drawback of CPMs is the high demand on computational resources. For this reason, first CPM climate simulations only appeared a decade ago. We aim to provide a common basis for CPM climate simulations by giving a holistic review of the topic. The most important components in CPMs such as physical parameterizations and dynamical formulations are discussed critically. An overview of weaknesses and an outlook on required future developments is provided. Most importantly, this review presents the consolidated outcome of studies that addressed the added value of CPM climate simulations compared to LSMs. Improvements are evident mostly for climate statistics related to deep convection, mountainous regions, or extreme events. The climate change signals of CPM simulations suggest an increase in flash floods, changes in hail storm characteristics, and reductions in the snowpack over mountains. In conclusion, CPMs are a very promising tool for future climate research. However, coordinated modeling programs are crucially needed to advance parameterizations of unresolved physics and to assess the full potential of CPMs.
Prein, Andreas F; Langhans, Wolfgang; Fosser, Giorgia; Ferrone, Andrew; Ban, Nikolina; Goergen, Klaus; Keller, Michael; Tölle, Merja; Gutjahr, Oliver; Feser, Frauke; Brisson, Erwan; Kollet, Stefan; Schmidli, Juerg; van Lipzig, Nicole P M; Leung, Ruby
2015-06-01
Regional climate modeling using convection-permitting models (CPMs; horizontal grid spacing <4 km) emerges as a promising framework to provide more reliable climate information on regional to local scales compared to traditionally used large-scale models (LSMs; horizontal grid spacing >10 km). CPMs no longer rely on convection parameterization schemes, which had been identified as a major source of errors and uncertainties in LSMs. Moreover, CPMs allow for a more accurate representation of surface and orography fields. The drawback of CPMs is the high demand on computational resources. For this reason, first CPM climate simulations only appeared a decade ago. In this study, we aim to provide a common basis for CPM climate simulations by giving a holistic review of the topic. The most important components in CPMs such as physical parameterizations and dynamical formulations are discussed critically. An overview of weaknesses and an outlook on required future developments is provided. Most importantly, this review presents the consolidated outcome of studies that addressed the added value of CPM climate simulations compared to LSMs. Improvements are evident mostly for climate statistics related to deep convection, mountainous regions, or extreme events. The climate change signals of CPM simulations suggest an increase in flash floods, changes in hail storm characteristics, and reductions in the snowpack over mountains. In conclusion, CPMs are a very promising tool for future climate research. However, coordinated modeling programs are crucially needed to advance parameterizations of unresolved physics and to assess the full potential of CPMs.
NASA Astrophysics Data System (ADS)
Machguth, H.; Paul, F.; Kotlarski, S.; Hoelzle, M.
2009-04-01
Climate model output has been applied in several studies on glacier mass balance calculation. To date, mass balance has mostly been computed at the native resolution of the climate model output, or data from individual cells were selected and statistically downscaled. Little attention has been given to the issue of downscaling entire fields of climate model output to a resolution fine enough to compute glacier mass balance in rugged high-mountain terrain. In this study we explore the use of gridded output from a regional climate model (RCM) to drive a distributed mass balance model for the perimeter of the Swiss Alps and the time frame 1979-2003. Our focus lies on the development and testing of downscaling and validation methods. The mass balance model runs at daily steps and 100 m spatial resolution, while the RCM REMO provides daily grids (approx. 18 km resolution) of dynamically downscaled re-analysis data. Interpolation techniques and sub-grid parametrizations are combined to bridge the gap in spatial resolution and to obtain daily input fields of air temperature, global radiation and precipitation. The meteorological input fields are compared to measurements at 14 high-elevation weather stations. Computed mass balances are compared to various sets of direct measurements, including stake readings and mass balances for entire glaciers. The validation procedure is performed separately for annual, winter and summer balances. Time series of mass balances for entire glaciers obtained from the model run agree well with observed time series. On the one hand, summer melt measured at stakes on several glaciers is well reproduced by the model; on the other hand, observed accumulation is either over- or underestimated. It is shown that these shifts are systematic and correlated with regional biases in the meteorological input fields.
We conclude that the gap in spatial resolution is not a large drawback, while biases in RCM output are a major limitation to model performance. The development and testing of methods to reduce regionally variable biases in entire fields of RCM output should be a focus of future studies.
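One common building block of the downscaling described above is a lapse-rate correction of air temperature for the elevation difference between the RCM's smoothed orography and the 100 m DEM cell. A minimal sketch, with an illustrative fixed free-air lapse rate standing in for the study's actual sub-grid parameterizations:

```python
# Sketch of a sub-grid elevation correction when bridging an ~18 km RCM grid
# to a 100 m mass-balance grid. The fixed lapse rate is a typical free-air
# value, used here only for illustration.

LAPSE = 6.5e-3  # K per m

def downscale_temperature(t_coarse, z_coarse, z_fine):
    """Correct an interpolated RCM temperature (degC) from the RCM's smooth
    orography elevation z_coarse to the fine-grid DEM elevation z_fine."""
    return t_coarse - LAPSE * (z_fine - z_coarse)

# RCM cell: 2.0 degC at a smoothed 1500 m; the glacier cell sits at 2800 m.
print(round(downscale_temperature(2.0, 1500.0, 2800.0), 2))  # -6.45
```

Systematic errors in the coarse field pass straight through such corrections, which is consistent with the paper's finding that regional RCM biases, not the resolution gap itself, limit the mass balance results.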
Picheny, Victor; Trépos, Ronan; Casadebaig, Pierre
2017-01-01
Accounting for interannual climatic variations is a well-known issue for simulation-based studies of environmental systems. It often requires intensive sampling (e.g., averaging the simulation outputs over many climatic series), which hinders many sequential processes, in particular optimization algorithms. We propose here an approach based on subset selection in a large basis of climatic series, using an ad hoc similarity function and clustering. A non-parametric reconstruction technique is introduced to accurately estimate the distribution of the output of interest using only the subset sampling. The proposed strategy is non-intrusive and generic (i.e., transposable to most models with climatic data inputs), and can be combined with most "off-the-shelf" optimization solvers. We apply our approach to sunflower ideotype design using the crop model SUNFLO. The underlying optimization problem is formulated as a multi-objective one to account for risk aversion. Our approach achieves good performance even for limited computational budgets, significantly outperforming standard strategies. PMID:28542198
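The subset-selection idea can be sketched as a small medoid-selection problem: summarize each climatic series by a few features and pick k actual series (medoids, not averages) that minimize the total distance to the rest. The features and the plain Euclidean distance below are stand-ins for the paper's ad hoc, crop-specific similarity function:

```python
import itertools

# Schematic subset selection over a basis of climatic series. Each series is
# summarized here by (total rainfall in mm, mean temperature in degC); the
# exhaustive medoid search is fine for a sketch-sized basis.

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def k_medoids_exhaustive(features, k):
    """Pick k series whose summed distance to all series is minimal."""
    best = min(
        itertools.combinations(range(len(features)), k),
        key=lambda idx: sum(min(dist(f, features[i]) for i in idx)
                            for f in features),
    )
    return list(best)

years = [(480, 13.1), (495, 13.0), (310, 14.8), (300, 15.0), (620, 12.2)]
print(k_medoids_exhaustive(years, 2))  # one "wet" and one "dry" year
```

Running the crop model only on the selected representative years, then reconstructing the full output distribution, is what makes repeated evaluations inside an optimization loop affordable.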
Impacts of Climate Policy on Regional Air Quality, Health, and Air Quality Regulatory Procedures
NASA Astrophysics Data System (ADS)
Thompson, T. M.; Selin, N. E.
2011-12-01
Both the changing climate, and the policy implemented to address climate change can impact regional air quality. We evaluate the impacts of potential selected climate policies on modeled regional air quality with respect to national pollution standards, human health and the sensitivity of health uncertainty ranges. To assess changes in air quality due to climate policy, we couple output from a regional computable general equilibrium economic model (the US Regional Energy Policy [USREP] model), with a regional air quality model (the Comprehensive Air Quality Model with Extensions [CAMx]). USREP uses economic variables to determine how potential future U.S. climate policy would change emissions of regional pollutants (CO, VOC, NOx, SO2, NH3, black carbon, and organic carbon) from ten emissions-heavy sectors of the economy (electricity, coal, gas, crude oil, refined oil, energy intensive industry, other industry, service, agriculture, and transportation [light duty and heavy duty]). Changes in emissions are then modeled using CAMx to determine the impact on air quality in several cities in the Northeast US. We first calculate the impact of climate policy by using regulatory procedures used to show attainment with National Ambient Air Quality Standards (NAAQS) for ozone and particulate matter. Building on previous work, we compare those results with the calculated results and uncertainties associated with human health impacts due to climate policy. This work addresses a potential disconnect between NAAQS regulatory procedures and the cost/benefit analysis required for and by the Clean Air Act.
Teaching a Model-based Climatology Using Energy Balance Simulation.
ERIC Educational Resources Information Center
Unwin, David
1981-01-01
After outlining the difficulties of teaching climatology within an undergraduate geography curriculum, the author describes and evaluates the use of a computer assisted simulation to model surface energy balance and the effects of land use changes on local climate. (AM)
Advancing Climate Change and Impacts Science Through Climate Informatics
NASA Astrophysics Data System (ADS)
Lenhardt, W.; Pouchard, L. C.; King, A. W.; Branstetter, M. L.; Kao, S.; Wang, D.
2010-12-01
This poster will outline the work to date on developing a climate informatics capability at Oak Ridge National Laboratory (ORNL). The central proposition of this effort is that the application of informatics and information science to the domain of climate change science is an essential means to bridge the realm of high performance computing (HPC) and domain science. The goal is to facilitate knowledge capture and the creation of new scientific insights. For example, a climate informatics capability will help with the understanding and use of model results in domain sciences that were not originally in the scope. From there, HPC can also benefit from feedback as the new approaches may lead to better parameterization in the models. In this poster we will summarize the challenges associated with climate change science that can benefit from the systematic application of informatics and we will highlight our work to date in creating the climate informatics capability to address these types of challenges. We have identified three areas that are particularly challenging in the context of climate change science: 1) integrating model and observational data across different spatial and temporal scales, 2) model linkages, i.e. climate models linked to other models such as hydrologic models, and 3) model diagnostics. Each of these has a methodological component and an informatics component. Our project under way at ORNL seeks to develop new approaches and tools in the context of linking climate change and water issues. We are basing our work on the following four use cases: 1) Evaluation/test of CCSM4 biases in hydrology (precipitation, soil water, runoff, river discharge) over the Rio Grande Basin. User: climate modeler. 2) Investigation of projected changes in hydrology of Rio Grande Basin using the VIC (Variable Infiltration Capacity Macroscale) Hydrologic Model. User: watershed hydrologist/modeler. 
3) Impact of climate change on agricultural productivity of the Rio Grande Basin. User: climate impact scientist, agricultural economist. 4) Renegotiation of the 1944 “Treaty for the Utilization of Waters of the Colorado and Tijuana Rivers and of the Rio Grande”. User: A US State Department analyst or their counterpart in Mexico.
A Physical Parameterization of Snow Albedo for Use in Climate Models.
NASA Astrophysics Data System (ADS)
Marshall, Susan Elaine
The albedo of a natural snowcover is highly variable, ranging from 90 percent for clean, new snow to 30 percent for old, dirty snow. This range in albedo represents a difference in surface energy absorption of 10 to 70 percent of incident solar radiation. Most general circulation models (GCMs) fail to calculate the surface snow albedo accurately, yet the results of these models are sensitive to the assumed value of the snow albedo. This study replaces the current simple empirical parameterizations of snow albedo with a physically based parameterization which is accurate (within ±3% of theoretical estimates) yet efficient to compute. The parameterization is designed as a FORTRAN subroutine (called SNOALB) which can be easily implemented into model code. The subroutine requires less than 0.02 seconds of computer time (CRAY X-MP) per call and adds only one new parameter to the model calculations, the snow grain size. The snow grain size can be calculated according to one of the two methods offered in this thesis. All other input variables to the subroutine are available from a climate model. The subroutine calculates a visible, near-infrared and solar (0.2-5 μm) snow albedo and offers a choice of two wavelengths (0.7 and 0.9 μm) at which the solar spectrum is separated into the visible and near-infrared components. The parameterization is incorporated into the National Center for Atmospheric Research (NCAR) Community Climate Model, version 1 (CCM1), and the results of a five-year, seasonal cycle, fixed hydrology experiment are compared to the current model snow albedo parameterization. The results show the SNOALB albedos to be comparable to the old CCM1 snow albedos for current climate conditions, with generally higher visible and lower near-infrared snow albedos using the new subroutine.
However, this parameterization offers a greater predictability for climate change experiments outside the range of current snow conditions because it is physically-based and not tuned to current empirical results.
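The qualitative behaviour such a grain-size-based parameterization must capture (visible albedo staying high while near-infrared albedo drops quickly as grains grow) can be mimicked with a toy function. The coefficients below are invented for illustration and are not SNOALB's fitted values:

```python
import math

# Toy grain-size-dependent snow albedo, split into visible and near-infrared
# bands as SNOALB does. Coefficients are illustrative placeholders only.

def snow_albedo(radius_um, vis_fraction=0.5):
    """Return (visible, near-infrared, broadband solar) albedo for a given
    snow grain radius in micrometres; vis_fraction weights the solar blend."""
    vis = 0.98 - 0.04 * math.sqrt(radius_um / 1000.0)   # weakly size-dependent
    nir = 0.75 - 0.25 * math.sqrt(radius_um / 1000.0)   # strongly size-dependent
    solar = vis_fraction * vis + (1.0 - vis_fraction) * nir
    return vis, nir, solar

# Fresh fine-grained snow (100 um) vs old coarse-grained snow (1000 um).
for r in (100.0, 1000.0):
    vis, nir, solar = snow_albedo(r)
    print(r, round(vis, 3), round(nir, 3), round(solar, 3))
```

Because only the grain size enters as a new input, a scheme of this shape stays cheap enough to call every time step, which is the design constraint the abstract emphasizes.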
Can beaches survive climate change?
Vitousek, Sean; Barnard, Patrick L.; Limber, Patrick W.
2017-01-01
Anthropogenic climate change is driving sea level rise, leading to numerous impacts on the coastal zone, such as increased coastal flooding, beach erosion, cliff failure, saltwater intrusion in aquifers, and groundwater inundation. Many beaches around the world are currently experiencing chronic erosion as a result of gradual, present-day rates of sea level rise (about 3 mm/year) and human-driven restrictions in sand supply (e.g., harbor dredging and river damming). Accelerated sea level rise threatens to worsen coastal erosion and challenge the very existence of natural beaches throughout the world. Understanding and predicting the rates of sea level rise and coastal erosion depends on integrating data on natural systems with computer simulations. Although many computer modeling approaches are available to simulate shoreline change, few are capable of making the reliable long-term predictions needed for full adaptation or to enhance resilience. Recent advancements have allowed convincing decadal- to centennial-scale predictions of shoreline evolution. For example, along 500 km of the Southern California coast, a new model featuring data assimilation predicts that up to 67% of beaches may completely erode by 2100 without large-scale human interventions. In spite of recent advancements, coastal evolution models must continue to improve in their theoretical framework, quantification of accuracy and uncertainty, computational efficiency, predictive capability, and integration with observed data, in order to meet the scientific and engineering challenges produced by a changing climate.
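As a much cruder back-of-envelope than the data-assimilating model above, the classic Bruun (1962) rule relates shoreline retreat R to sea level rise S through the active profile geometry, R = S·L/(B + h), with L the cross-shore profile length, B the berm height and h the closure depth. It illustrates why modest rise drives large erosion; the profile values below are made up for the example.

```python
# Back-of-envelope Bruun-rule shoreline retreat. All profile parameters here
# are illustrative; real applications fit them from surveyed beach profiles.

def bruun_retreat(sea_level_rise_m, profile_length_m, berm_m, closure_depth_m):
    """Horizontal shoreline retreat (m) for a given sea level rise (m)."""
    return sea_level_rise_m * profile_length_m / (berm_m + closure_depth_m)

# 1 m of rise over a 1000 m active profile with B = 2 m and h = 10 m:
print(round(bruun_retreat(1.0, 1000.0, 2.0, 10.0), 1))  # 83.3 m of retreat
```

The roughly two-orders-of-magnitude amplification (metres of rise to tens of metres of retreat) is the intuition behind the dramatic beach-loss projections in the abstract, even though modern models replace this rule with data-assimilating shoreline dynamics.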
NCI HPC Scaling and Optimisation in Climate, Weather, Earth system science and the Geosciences
NASA Astrophysics Data System (ADS)
Evans, B. J. K.; Bermous, I.; Freeman, J.; Roberts, D. S.; Ward, M. L.; Yang, R.
2016-12-01
The Australian National Computational Infrastructure (NCI) has a national focus in the Earth system sciences, including climate, weather, ocean, water management, environment and geophysics. NCI leads a Program across its partners from the Australian science agencies and research communities to identify priority computational models to scale up. Typically, these cases place a large overall demand on the available computer time, need to scale to higher resolutions, make heavy use of scarce resources such as large memory or bandwidth, or, in some cases, need to meet requirements for transition to a separate operational forecasting system with set time windows. The model codes include the UK Met Office Unified Model atmospheric model (UM), GFDL's Modular Ocean Model (MOM), both the UK Met Office's GC3 and Australian ACCESS coupled-climate systems (including sea ice), 4D-Var data assimilation and satellite processing, the Regional Ocean Model (ROMS), and WaveWatch3, as well as geophysics codes covering hazards, magnetotellurics, seismic inversions, and geodesy. Many of these codes use significant compute resources, both for research applications and within the operational systems. Some of these models are particularly complex, and their behaviour had not been critically analysed for effective use of the NCI supercomputer or for how they could be improved. As part of the Program, we have established a common profiling methodology that uses a suite of open source tools for performing scaling analyses. The most challenging cases are profiling multi-model coupled systems where the component models have their own complex algorithms and performance issues. We have also found issues within the current suite of profiling tools, and no single tool fully exposes the nature of the code performance.
As a result of this work, international collaborations are now in place to ensure that improvements are incorporated within the community models, and our effort can be targeted in a coordinated way. This coordination has involved user stakeholders, the model developer community, and dependent software libraries. For example, we have spent significant time characterising I/O scalability and improving the use of libraries such as NetCDF and HDF5.
Implementation of Web Processing Services (WPS) over IPSL Earth System Grid Federation (ESGF) node
NASA Astrophysics Data System (ADS)
Kadygrov, Nikolay; Denvil, Sebastien; Carenton, Nicolas; Levavasseur, Guillaume; Hempelmann, Nils; Ehbrecht, Carsten
2016-04-01
The Earth System Grid Federation (ESGF) aims to provide access to climate data for the international climate community. ESGF is a system of distributed and federated nodes that dynamically interact with each other. ESGF users may search and download climate data, geographically distributed over the world, from one common web interface and through a standardized API. With the continuous development of the climate models and the beginning of the sixth phase of the Coupled Model Intercomparison Project (CMIP6), the amount of data available from ESGF will increase continuously during the next 5 years. IPSL holds replicas of the output of different global and regional climate models, observations and reanalysis data (CMIP5, CORDEX, obs4MIPs, etc.) that are available on the IPSL ESGF node. In order to let scientists perform analysis of the models without downloading vast amounts of data, Web Processing Services (WPS) were installed at the IPSL compute node. The work is part of the CONVERGENCE project funded by the French National Research Agency (ANR). PyWPS, an implementation of the Web Processing Service standard of the Open Geospatial Consortium (OGC), is used within the framework of the Birdhouse software. The processes can be run by a user remotely through a web-based WPS client or by using a command-line tool. All the calculations are performed on the server side, close to the data. If the models/observations are not available at IPSL, they will be downloaded and cached by the WPS process from the ESGF network using the synda tool. The outputs of the WPS processes are available for download as plots, tar archives or NetCDF files. 
We present the architecture of the WPS at IPSL, along with the processes for evaluation of model performance, on-site diagnostics and post-analysis processing of model output, e.g.: regridding/interpolation/aggregation; ocgis (OpenClimateGIS)-based polygon subsetting of the data; average seasonal cycle, multimodel mean and multimodel mean bias; calculation of climate indices with the icclim library (CERFACS); and atmospheric modes of variability. In order to evaluate the performance of any new model once it becomes available in ESGF, we implement a WPS with several model diagnostics and performance metrics calculated using ESMValTool (Eyring et al., GMDD 2015). As a further step, we are developing new WPS processes and core functions to be implemented at the IPSL ESGF compute node, following the scientific community's needs.
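As a sketch of how a user might drive such a process remotely, the snippet below builds the input list for a hypothetical climate-index process. The endpoint URL, process identifier, and input names ("resource", "indice_name") are assumptions for illustration; the actual names should be taken from the node's GetCapabilities response. The live-execution part uses the real OWSLib WPS client API but is left as comments because it requires network access and the owslib package:

```python
def build_wps_inputs(dataset_url, index_name="SU"):
    """Assemble (identifier, value) input pairs for a hypothetical
    climate-index WPS process; the input names are assumed for illustration."""
    return [("resource", dataset_url), ("indice_name", index_name)]

# Executing against a live node (illustration only; requires network access
# and the owslib package from the Birdhouse/OWSLib ecosystem):
#
#   from owslib.wps import WebProcessingService, monitorExecution
#   wps = WebProcessingService("https://wps.example-ipsl.fr/wps")   # hypothetical URL
#   print([p.identifier for p in wps.processes])        # discover available processes
#   execution = wps.execute("wps_su_index",             # hypothetical identifier
#                           build_wps_inputs("http://esgf.example/tasmax.nc"))
#   monitorExecution(execution)                         # poll the asynchronous job
#   print([o.reference for o in execution.processOutputs])   # URLs of NetCDF/plots
```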
A network-based analysis of CMIP5 "historical" experiments
NASA Astrophysics Data System (ADS)
Bracco, A.; Foudalis, I.; Dovrolis, C.
2012-12-01
In computer science, "complex network analysis" refers to a set of metrics, modeling tools and algorithms commonly used in the study of complex nonlinear dynamical systems. Its main premise is that the underlying topology or network structure of a system has a strong impact on its dynamics and evolution. By allowing investigation of local and non-local statistical interactions, network analysis provides a powerful, but only marginally explored, framework to validate climate models and investigate teleconnections, assessing their strength, range, and impacts on the climate system. In this work we propose a new, fast, robust and scalable methodology to examine, quantify, and visualize climate sensitivity, while constraining general circulation model (GCM) outputs with observations. The goal of our novel approach is to uncover relations in the climate system that are not (or not fully) captured by more traditional methodologies used in climate science and often adopted from nonlinear dynamical systems analysis, and to explain known climate phenomena in terms of the network structure or its metrics. Our methodology is based on a solid theoretical framework and employs mathematical and statistical tools so far exploited only tentatively in climate research. Suitably adapted to the climate problem, these tools can assist in visualizing the trade-offs in representing global links and teleconnections among different data sets. Here we present the methodology and compare network properties for different reanalysis data sets and a suite of CMIP5 coupled GCM outputs. With an extensive model intercomparison in terms of the climate network that each model leads to, we quantify how well each model reproduces major teleconnections, rank model performances, and identify common or specific errors in comparing model outputs and observations.
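A minimal sketch of the basic construction behind such climate networks, under the common simplification that edges link grid points whose anomaly series correlate strongly (the paper's methodology is richer than a fixed-threshold correlation network):

```python
import numpy as np

def climate_network(anomalies, threshold=0.5):
    """anomalies: (n_points, n_times) array of detrended anomaly series.
    Returns a boolean adjacency matrix with no self-links."""
    corr = np.corrcoef(anomalies)        # pairwise Pearson correlations
    adj = np.abs(corr) >= threshold      # keep strong (tele)connections only
    np.fill_diagonal(adj, False)
    return adj

# Synthetic stand-in for gridded temperature anomalies: 10 points, 240 months.
rng = np.random.default_rng(0)
series = rng.standard_normal((10, 240))
adj = climate_network(series, threshold=0.3)
degree = adj.sum(axis=1)                 # node degree = number of links per point
```

Network metrics such as the degree field are then what gets compared between reanalyses and CMIP5 models.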
A continuous latitudinal energy balance model to explore non-uniform climate engineering strategies
NASA Astrophysics Data System (ADS)
Bonetti, F.; McInnes, C. R.
2016-12-01
Current concentrations of atmospheric CO2 exceed measured historical levels, largely attributed to anthropogenic forcing since the industrial revolution. The required decline in emission rates has never been achieved, leading to recent interest in climate engineering for future risk-mitigation strategies. Climate engineering aims to offset human-driven climate change. It involves techniques developed both to reduce the concentration of CO2 in the atmosphere (Carbon Dioxide Removal (CDR) methods) and to counteract the radiative forcing that it generates (Solar Radiation Management (SRM) methods). In order to investigate the effects of SRM technologies for climate engineering, an analytical model describing the main dynamics of the Earth's climate has been developed. The model is a time-dependent Energy Balance Model (EBM) with latitudinal resolution and allows for the evaluation of non-uniform climate engineering strategies. A significant disadvantage of climate engineering techniques involving the management of solar radiation is regional disparity in cooling. This model offers an analytical approach to design multi-objective strategies that counteract climate change on a regional basis: for example, to cool the Arctic while restricting undesired impacts at mid-latitudes, or to control the equator-to-pole temperature gradient. Using the Green's function approach, the resulting partial differential equation allows for the computation of the surface temperature as a function of time and latitude when a 1% per year increase in the CO2 concentration is considered. After validation of the model through comparisons with high-fidelity numerical models, it will be used to explore strategies for the injection of aerosol precursors into the stratosphere. In particular, the model includes a detailed description of the optical properties of the particles, the wash-out dynamics and an estimation of the radiative cooling they can generate.
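A crude numerical sketch of a latitudinal EBM of this general type (a North-style model with linearized longwave emission and diffusive meridional transport) is shown below. The coefficients are illustrative defaults, not the paper's calibrated values, and simple explicit time stepping stands in for the Green's function solution:

```python
import numpy as np

def step_ebm(T, lat, dt=86400.0, Q=340.0, A=210.0, B=2.0, D=1.0, albedo=0.3, C=1.0e8):
    """Advance zonal-mean surface temperature T (deg C) on a uniform
    latitude grid (degrees) by one time step dt (seconds)."""
    x = np.sin(np.radians(lat))
    S = 1.0 - 0.482 * 0.5 * (3.0 * x**2 - 1.0)   # 2nd-Legendre insolation shape
    absorbed = Q * S * (1.0 - albedo)            # absorbed shortwave, W m^-2
    olr = A + B * T                              # linearized outgoing longwave
    transport = np.zeros_like(T)                 # crude diffusive heat transport;
    transport[1:-1] = D * (T[2:] - 2.0 * T[1:-1] + T[:-2])  # D folds in grid spacing
    return T + dt * (absorbed - olr + transport) / C

# Spin up from an isothermal state toward the equilibrium equator-to-pole profile.
lat = np.linspace(-87.5, 87.5, 36)
T = np.zeros_like(lat)
for _ in range(2000):                            # ~5.5 model years of daily steps
    T = step_ebm(T, lat)
```

A non-uniform SRM strategy would then enter as a latitude-dependent reduction of `Q` or increase of `albedo`.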
NASA Astrophysics Data System (ADS)
Sinha, T.; Gangodagamage, C.; Ale, S.; Frazier, A. G.; Giambelluca, T. W.; Kumagai, T.; Nakai, T.; Sato, H.
2017-12-01
Drought-related tree mortality at a regional scale causes drastic shifts in carbon and water cycling in Southeast Asian tropical rainforests, where severe droughts are projected to occur more frequently, especially under El Niño conditions. To provide a useful tool for projecting tropical rainforest dynamics under climate change, we developed the Spatially Explicit Individual-Based (SEIB) Dynamic Global Vegetation Model (DGVM), capable of simulating mechanistic tree mortality induced by climatic impacts via individual-tree-scale ecophysiology, such as hydraulic failure and carbon starvation. In this study, we present the new model, the SEIB-originated Terrestrial Ecosystem Dynamics (S-TEDy) model, and compare its computational results with observations collected at a field site in a Bornean tropical rainforest. Furthermore, after validating the model's performance, numerical experiments addressing the future of the tropical rainforest were conducted using global climate model (GCM) simulation outputs.
An economic evaluation of solar radiation management.
Aaheim, Asbjørn; Romstad, Bård; Wei, Taoyuan; Kristjánsson, Jón Egill; Muri, Helene; Niemeier, Ulrike; Schmidt, Hauke
2015-11-01
Economic evaluations of solar radiation management (SRM) usually assume that the temperature will be stabilized, with no economic impacts of climate change, but with possible side-effects. We know from experiments with climate models, however, that unlike emission control, the spatial and temporal distributions of temperature, precipitation and wind conditions will change. Hence, SRM may have economic consequences under a stabilization of global mean temperature, even if side-effects other than those related to the climatic responses are disregarded. This paper addresses the economic impacts of implementing two SRM technologies: stratospheric sulfur injection and marine cloud brightening. Using a computable general equilibrium model, we estimate the economic impacts of climatic responses based on the results from two Earth system models, MPI-ESM and NorESM. We find that under a moderately increasing greenhouse-gas concentration path, RCP4.5, the economic benefits of implementing climate engineering are small, and may become negative. Global GDP increases in three of the four experiments, and all experiments include regions where the benefits from climate engineering are negative. Copyright © 2015 Elsevier B.V. All rights reserved.
Process Understanding of Decadal Climate Variability
NASA Astrophysics Data System (ADS)
Prömmel, Kerstin; Cubasch, Ulrich
2016-04-01
The realistic representation of decadal climate variability in models is essential for the quality of decadal climate predictions. Therefore, the understanding of the processes leading to decadal climate variability needs to be improved. Several of these processes are already included in climate models, but their importance has not yet been completely clarified. The simulation of other processes sometimes requires a higher model resolution or an extension by additional subsystems. This is addressed within one module of the German research program "MiKlip II - Decadal Climate Predictions" (http://www.fona-miklip.de/en/), with a focus on the following processes. Stratospheric processes and their impact on the troposphere are analysed regarding the climate response to aerosol perturbations caused by volcanic eruptions, and the stratospheric decadal variability due to solar forcing, climate change and ozone recovery. To account for the interaction between changing ozone concentrations and climate, a computationally efficient ozone chemistry module is developed and implemented in the MiKlip prediction system. The ocean variability and air-sea interaction are analysed with a special focus on the reduction of the North Atlantic cold bias. In addition, the predictability of the oceanic carbon uptake, with a special emphasis on the underlying mechanism, is investigated. This addresses a combination of physical, biological and chemical processes.
High-resolution regional climate model evaluation using variable-resolution CESM over California
NASA Astrophysics Data System (ADS)
Huang, X.; Rhoades, A.; Ullrich, P. A.; Zarzycki, C. M.
2015-12-01
Understanding the effect of climate change at regional scales remains a topic of intensive research. Though computational constraints remain a problem, high horizontal resolution is needed to represent topographic forcing, which is a significant driver of local climate variability. Although regional climate models (RCMs) have traditionally been used at these scales, variable-resolution global climate models (VRGCMs) have recently arisen as an alternative for studying regional weather and climate, allowing two-way interaction between these domains without the need for nudging. In this study, the recently developed variable-resolution option within the Community Earth System Model (CESM) is assessed for long-term regional climate modeling over California. Our variable-resolution simulations focus on relatively high resolutions for climate assessment, namely 28 km and 14 km regional resolution, which are much more typical for dynamically downscaled studies. For comparison with the more widely used RCM method, the Weather Research and Forecasting (WRF) model is used for simulations at 27 km and 9 km. All simulations use the AMIP (Atmospheric Model Intercomparison Project) protocols. The time period is from 1979-01-01 to 2005-12-31 (UTC), and year 1979 was discarded as spin-up time. The mean climatology across California's diverse climate zones, including temperature and precipitation, is analyzed and contrasted with the WRF model (as a traditional RCM), regional reanalysis, gridded observational datasets and uniform high-resolution CESM at 0.25 degree with the finite-volume (FV) dynamical core. The results show that variable-resolution CESM is competitive in representing regional climatology on both annual and seasonal time scales. 
This assessment adds value to the use of VRGCMs for projecting climate change over the coming century and improves our understanding of both past and future regional climate related to fine-scale processes. This assessment is also relevant for addressing the scale limitations of current RCMs and VRGCMs as next-generation model resolution increases to ~10 km and beyond.
Review of Diagnostics for Water Sources in General Circulation Models (GCMs)
NASA Technical Reports Server (NTRS)
Bosilovich, M.
2003-01-01
We will describe the use of passive tracers in GCMs to compute the geographical sources of water for precipitation. We will present a summary of recent research and show how this methodology can be applied in climate and climate change studies. We will also discuss the possibility of using passive tracers in conjunction with simulations and observations of stable isotopes.
NASA Astrophysics Data System (ADS)
de la Fuente, Alberto; Meruane, Carolina
2017-09-01
Altiplanic wetlands are unique ecosystems located in the elevated plateaus of Chile, Argentina, Peru, and Bolivia. These ecosystems are under threat due to changes in land use, groundwater extractions, and climate change that will modify the water balance through changes in precipitation and evaporation rates. Long-term prediction of the fate of aquatic ecosystems imposes computational constraints that make finding a solution impossible in some cases. In this article, we present a spectral model for long-term simulations of the thermodynamics of shallow wetlands in the limit case when the water depth tends to zero. This spectral model solves for water and sediment temperature, as well as heat, momentum, and mass exchanged with the atmosphere. The parameters of the model (water depth, thermal properties of the sediments, and surface albedo) and the atmospheric downscaling were calibrated using the MODIS product of the land surface temperature. Moreover, the performance of the daily evaporation rates predicted by the model was evaluated against daily pan evaporation data measured between 1964 and 2012. The spectral model was able to correctly represent both seasonal fluctuation and climatic trends observed in daily evaporation rates. It is concluded that the spectral model presented in this article is a suitable tool for assessing the global climate change effects on shallow wetlands whose thermodynamics is forced by heat exchanges with the atmosphere and modulated by the heat-reservoir role of the sediments.
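The spectral idea can be illustrated on a linearized surface heat budget C dT/dt = F(t) - lam*T: each Fourier mode of the periodic forcing maps analytically to a damped, phase-lagged temperature mode, so the whole response is obtained without time stepping. The constants below (areal heat capacity C, linearized exchange coefficient lam) are placeholders, not the calibrated wetland parameters:

```python
import numpy as np

def spectral_response(times, forcing, C=2.0e7, lam=20.0):
    """Return T(t) for periodic forcing sampled at uniform times (seconds).
    Each Fourier mode is divided by its transfer function (lam + i*omega*C)."""
    n = len(times)
    F = np.fft.rfft(forcing)
    freqs = np.fft.rfftfreq(n, d=times[1] - times[0])   # cycles per second
    omega = 2.0 * np.pi * freqs
    T_hat = F / (lam + 1j * omega * C)                  # per-mode analytic solution
    return np.fft.irfft(T_hat, n)

# Annual cycle of net forcing (W m^-2), sampled daily over one year.
t = np.arange(0.0, 365.0 * 86400.0, 86400.0)
F = 150.0 + 100.0 * np.cos(2.0 * np.pi * t / (365.0 * 86400.0))
T = spectral_response(t, F)    # damped and lagged relative to the forcing
```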
A virtual climate library of surface temperature over North America for 1979-2015
NASA Astrophysics Data System (ADS)
Kravtsov, Sergey; Roebber, Paul; Brazauskas, Vytaras
2017-10-01
The most comprehensive continuous-coverage modern climatic data sets, known as reanalyses, come from combining state-of-the-art numerical weather prediction (NWP) models with diverse available observations. These reanalysis products estimate the path of climate evolution that actually happened, and their use in a probabilistic context—for example, to document trends in extreme events in response to climate change—is, therefore, limited. Free runs of NWP models without data assimilation can in principle be used for the latter purpose, but such simulations are computationally expensive and are prone to systematic biases. Here we produce a high-resolution, 100-member ensemble simulation of surface atmospheric temperature over North America for the 1979-2015 period using a comprehensive spatially extended non-stationary statistical model derived from the data based on the North American Regional Reanalysis. The surrogate climate realizations generated by this model are independent from, yet nearly statistically congruent with reality. This data set provides unique opportunities for the analysis of weather-related risk, with applications in agriculture, energy development, and protection of human life.
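A toy version of the surrogate-climate concept, shrunk to a single lag-1 autoregressive series. The paper's statistical model is spatially extended and non-stationary, so this only illustrates the "independent from, yet nearly statistically congruent with" idea:

```python
import numpy as np

def fit_ar1(series):
    """Return (phi, sigma): lag-1 coefficient and innovation std-dev."""
    x = series - series.mean()
    phi = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
    resid = x[1:] - phi * x[:-1]
    return phi, resid.std()

def surrogates(series, n_members, seed=0):
    """Generate an ensemble of realizations sharing the fitted AR(1) statistics
    but driven by fresh random noise, i.e. independent of the original path."""
    phi, sigma = fit_ar1(series)
    rng = np.random.default_rng(seed)
    out = np.zeros((n_members, len(series)))
    for t in range(1, len(series)):
        out[:, t] = phi * out[:, t - 1] + rng.normal(0.0, sigma, n_members)
    return out + series.mean()
```

Extreme-event frequencies estimated across such an ensemble are then probabilistically meaningful in a way a single observed realization is not.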
Graceful Failure and Societal Resilience Analysis Via Agent-Based Modeling and Simulation
NASA Astrophysics Data System (ADS)
Schopf, P. S.; Cioffi-Revilla, C.; Rogers, J. D.; Bassett, J.; Hailegiorgis, A. B.
2014-12-01
Agent-based social modeling is opening up new methodologies for the study of societal response to weather and climate hazards, and providing measures of resiliency that can be studied in many contexts, particularly in coupled human and natural-technological systems (CHANTS). Since CHANTS are complex adaptive systems, societal resiliency may or may not occur, depending on dynamics that lack closed-form solutions. Agent-based modeling has been shown to provide a viable theoretical and methodological approach for analyzing and understanding disasters and societal resiliency in CHANTS. Our approach advances the science of societal resilience through computational modeling and simulation methods that complement earlier statistical and mathematical approaches. We present three case studies of social dynamics modeling that demonstrate the use of these agent-based models. In Central Asia, we examine multiple ensemble simulations with varying climate statistics to see how droughts and zuds affect populations, transmission of wealth across generations, and the overall structure of the social system. In Eastern Africa, we explore how successive episodes of drought affect the adaptive capacity of rural households. Human displacement, mainly rural-to-urban migration, and livelihood transitions, particularly from pastoralism to farming, are observed as rural households interact dynamically with the biophysical environment and continually adjust their behavior to accommodate changes in climate. In the far-north case, we demonstrate one of the first successful attempts to model the complete climate-permafrost-infrastructure-societal interaction network as a complex adaptive system (CHANTS), implemented as a "federated" agent-based model using evolutionary computation. Analysis of population changes resulting from extreme weather across these and other cases provides evidence for the emergence of new steady states and shifting patterns of resilience.
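A deliberately minimal sketch of the household-level dynamic described for the Central Asia case: herds grow in good years, shrink sharply in drought/zud years, and a household whose herd collapses leaves pastoralism (the livelihood-transition event). All rates and thresholds here are invented for illustration:

```python
import random

class Household:
    def __init__(self, herd=50):
        self.herd = herd
        self.pastoral = True               # becomes False after herd collapse

    def step(self, drought, rng):
        if not self.pastoral:
            return
        if drought:
            loss = rng.uniform(0.3, 0.6)   # drought/zud year: heavy losses
        else:
            loss = -rng.uniform(0.0, 0.2)  # good year: herd grows
        self.herd = int(self.herd * (1.0 - loss))
        if self.herd < 5:                  # collapse -> leave pastoralism
            self.pastoral = False

def simulate(n_households=100, years=30, drought_prob=0.2, seed=0):
    rng = random.Random(seed)
    agents = [Household() for _ in range(n_households)]
    for _ in range(years):
        drought = rng.random() < drought_prob    # shared regional climate shock
        for a in agents:
            a.step(drought, rng)
    return sum(a.pastoral for a in agents)       # households still pastoral

remaining = simulate()
```

Sweeping `drought_prob` across ensemble runs mimics the "varying climate statistics" experiments described above.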
Quantitative Assessment of Antarctic Climate Variability and Change
NASA Astrophysics Data System (ADS)
Ordonez, A.; Schneider, D. P.
2013-12-01
The Antarctic climate is both extreme and highly variable, but there are indications it may be changing. As the climate in Antarctica can affect global sea level and ocean circulation, it is important to understand and monitor its behavior. Observational and model data have been used to study climate change in Antarctica and the Southern Ocean, though observational data are sparse and models have difficulty reproducing many observed climate features. For example, a leading hypothesis that ozone depletion has been responsible for sea ice trends is challenged by the inability of ozone-forced models to reproduce the observed sea ice increase. The extent to which this data-model disagreement represents inadequate observations versus model biases is unknown. This research assessed a variety of climate change indicators to present an overview of the Antarctic climate that will allow scientists to easily access these data and compare indicators with other observational data and model output. Indicators were obtained from observational and reanalysis data for variables such as temperature, sea ice area, and zonal wind stress. Multiple datasets were used for key variables. Monthly and annual anomaly data from Antarctica and the Southern Ocean, as well as tropical indices, were plotted as time series on common axes for comparison. Trends and correlations were also computed. Zonal wind, surface temperature, and austral springtime sea ice had strong relationships and are further discussed in terms of how they may relate to climate variability and change in the Antarctic. This analysis will enable hypothesized mechanisms of Antarctic climate change to be critically evaluated.
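The indicator processing described (monthly anomalies relative to a mean seasonal cycle, followed by trend estimation) can be sketched as follows, with synthetic data standing in for, e.g., a sea-ice-area record:

```python
import numpy as np

def monthly_anomalies(series):
    """series: 1-D array whose length is a multiple of 12 (whole years).
    Subtract each calendar month's long-term mean (the climatology)."""
    monthly = series.reshape(-1, 12)
    climatology = monthly.mean(axis=0)     # mean seasonal cycle
    return (monthly - climatology).ravel()

def linear_trend(anoms):
    """Least-squares trend in units per time step."""
    t = np.arange(len(anoms))
    slope, _ = np.polyfit(t, anoms, 1)
    return slope
```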
Chemistry-Climate Models of the Stratosphere
NASA Technical Reports Server (NTRS)
Austin, J.; Shindell, D.; Bruehl, C.; Dameris, M.; Manzini, E.; Nagashima, T.; Newman, P.; Pawson, S.; Pitari, G.; Rozanov, E.;
2001-01-01
Over the last decade, improved computer power has allowed three-dimensional models of the stratosphere to be developed that can be used to simulate polar ozone levels over long periods. This paper compares the meteorology between these models, and discusses the future of polar ozone levels over the next 50 years.
Computer Assisted Vocational Math. Written for TRS-80, Model I, Level II, 16K.
ERIC Educational Resources Information Center
Daly, Judith; And Others
This computer-assisted curriculum is intended to be used to enhance a vocational mathematics/applied mathematics course. A total of 32 packets were produced to increase the basic mathematics skills of students in the following vocational programs: automotive trades, beauty culture, building trades, climate control, electrical trades,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Xingying; Rhoades, Alan M.; Ullrich, Paul A.
In this paper, the recently developed variable-resolution option within the Community Earth System Model (VR-CESM) is assessed for long-term regional climate modeling of California at 0.25° (~28 km) and 0.125° (~14 km) horizontal resolutions. The mean climatology of near-surface temperature and precipitation is analyzed and contrasted with reanalysis, gridded observational data sets, and a traditional regional climate model (RCM), the Weather Research and Forecasting (WRF) model. Statistical metrics for model evaluation and tests for differential significance have been extensively applied. VR-CESM tended to produce a warmer summer (by about 1–3°C) and overestimated overall winter precipitation (by about 25%–35%) compared to reference data sets when sea surface temperatures were prescribed. Increasing resolution from 0.25° to 0.125° did not produce a statistically significant improvement in the model results. By comparison, the analogous WRF climatology (constrained laterally and at the sea surface by ERA-Interim reanalysis) was ~1–3°C colder than the reference data sets, underestimated precipitation by ~20%–30% at 27 km resolution, and overestimated precipitation by ~65%–85% at 9 km. Overall, VR-CESM produced statistical biases comparable to WRF in key climatological quantities. Moreover, this assessment highlights the value of variable-resolution global climate models (VRGCMs) in capturing fine-scale atmospheric processes, projecting future regional climate, and addressing the computational expense of uniform-resolution global climate models.
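One simple form the "tests for differential significance" mentioned above can take is a Welch t-test asking whether two samples of seasonal means (e.g., simulated vs. observed winters over N years) differ beyond interannual noise. The statistic is implemented directly below to stay dependency-free; the numbers are synthetic, not the paper's data:

```python
import math

def welch_t(a, b):
    """Return the Welch t-statistic for two independent samples."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # unbiased sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

# e.g. 10 winters of simulated vs. observed precipitation (mm/day, made up):
sim = [3.1, 2.8, 3.5, 3.0, 3.3, 2.9, 3.4, 3.2, 3.6, 3.0]
obs = [2.5, 2.2, 2.8, 2.4, 2.6, 2.3, 2.7, 2.5, 2.9, 2.4]
t_stat = welch_t(sim, obs)   # |t| well above ~2 suggests a real bias, not noise
```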
NASA Astrophysics Data System (ADS)
Kemp, E. M.; Putman, W. M.; Gurganus, J.; Burns, R. W.; Damon, M. R.; McConaughy, G. R.; Seablom, M. S.; Wojcik, G. S.
2009-12-01
We present a regional downscaling system (RDS) suitable for high-resolution weather and climate simulations in multiple supercomputing environments. The RDS is built on the NASA Workflow Tool, a software framework for configuring, running, and managing computer models on multiple platforms with a graphical user interface. The Workflow Tool is used to run the NASA Goddard Earth Observing System Model Version 5 (GEOS-5), a global atmospheric-ocean model for weather and climate simulations down to 1/4 degree resolution; the NASA Land Information System Version 6 (LIS-6), a land surface modeling system that can simulate soil temperature and moisture profiles; and the Weather Research and Forecasting (WRF) community model, a limited-area atmospheric model for weather and climate simulations down to 1-km resolution. The Workflow Tool allows users to customize model settings to user needs; saves and organizes simulation experiments; distributes model runs across different computer clusters (e.g., the DISCOVER cluster at Goddard Space Flight Center, the Cray CX-1 Desktop Supercomputer, etc.); and handles all file transfers and network communications (e.g., scp connections). Together, the RDS is intended to aid researchers by making simulations as easy as possible to generate on the computer resources available. Initial conditions for LIS-6 and GEOS-5 are provided by Modern Era Retrospective-Analysis for Research and Applications (MERRA) reanalysis data stored on DISCOVER. The LIS-6 is first run for 2-4 years forced by MERRA atmospheric analyses, generating initial conditions for the WRF soil physics. GEOS-5 is then initialized from MERRA data and run for the period of interest. Large-scale atmospheric data, sea-surface temperatures, and sea ice coverage from GEOS-5 are used as boundary conditions for WRF, which is run for the same period of interest. 
Multiply nested grids are used for both LIS-6 and WRF, with the innermost grid run at a resolution sufficient for typical local weather features (terrain, convection, etc.). All model runs, restarts, and file transfers are coordinated by the Workflow Tool. Two use cases are being pursued. First, the RDS generates regional climate simulations down to 4 km for the Chesapeake Bay region, with WRF output provided as input to more specialized models (e.g., ocean/lake, hydrological, marine biology, and air pollution). This will allow assessment of climate impacts on local interests (e.g., changes in Bay water levels and temperatures, inundation, fish kills, etc.). Second, the RDS generates high-resolution hurricane simulations in the tropical North Atlantic. This use case will support Observing System Simulation Experiments (OSSEs) of dynamically targeted lidar observations as part of the NASA Sensor Web Simulator project. Sample results will be presented at the AGU Fall Meeting.
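The coordination role played by the Workflow Tool can be illustrated with a toy dependency-driven runner: LIS-6 spin-up and GEOS-5 must complete before WRF consumes their soil initial conditions and boundary conditions. The task names mirror the text, but the task bodies are placeholders, not the actual model launchers:

```python
def run_workflow(tasks, deps):
    """tasks: name -> callable; deps: name -> list of prerequisite names.
    Executes each task exactly once, after all of its prerequisites."""
    done, order = set(), []

    def visit(name, stack=()):
        if name in done:
            return
        if name in stack:
            raise ValueError(f"dependency cycle at {name}")
        for d in deps.get(name, []):
            visit(d, stack + (name,))
        tasks[name]()          # run the task once its inputs exist
        done.add(name)
        order.append(name)

    for name in tasks:
        visit(name)
    return order

log = []
tasks = {
    "lis6_spinup": lambda: log.append("soil ICs ready"),
    "geos5":       lambda: log.append("boundary conditions ready"),
    "wrf":         lambda: log.append("regional run done"),
}
deps = {"wrf": ["lis6_spinup", "geos5"]}
order = run_workflow(tasks, deps)   # WRF always runs after both of its inputs
```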
NASA Technical Reports Server (NTRS)
Spar, J.; Cohen, C.; Wu, P.
1981-01-01
A coarse-mesh (8 by 10) seven-layer global climate model was used to compute 15 months of meteorological history in two perpetual-January experiments on a water planet (without continents) with a zonally symmetric climatological January sea surface temperature field. In the first of the two water-planet experiments, the initial atmospheric state was a set of zonal mean values of specific humidity, temperature, and wind at each latitude. In the second experiment, the model was initialized with globally uniform mean values of specific humidity and temperature on each sigma-level surface, constant surface pressure (1010 mb), and zero wind everywhere. A comparison was made of the mean January climatic states generated by the two water-planet experiments. The first two months of each 15-month run were discarded, and 13-month averages were computed from months 3 through 15.
Toward GEOS-6, A Global Cloud System Resolving Atmospheric Model
NASA Technical Reports Server (NTRS)
Putman, William M.
2010-01-01
NASA is committed to observing and understanding the weather and climate of our home planet through the use of multi-scale modeling systems and space-based observations. Global climate models have evolved to take advantage of the influx of multi- and many-core computing technologies and the availability of large clusters of multi-core microprocessors. GEOS-6 is a next-generation cloud-system-resolving atmospheric model that will place NASA at the forefront of scientific exploration of our atmosphere and climate. Model simulations with GEOS-6 will produce a realistic representation of our atmosphere on the scale of typical satellite observations, bringing visual comprehension of model results to a new level among climate enthusiasts. In preparation for GEOS-6, the agency's flagship Earth System Modeling Framework has been enhanced to support cutting-edge high-resolution global climate and weather simulations. Improvements include a cubed-sphere grid that exposes parallelism, a non-hydrostatic finite-volume dynamical core, and algorithms designed for co-processor technologies, among others. GEOS-6 represents a fundamental advancement in the capability of global Earth system models. The ability to directly compare global simulations at the resolution of spaceborne satellite images will lead to algorithm improvements and better utilization of space-based observations within the GEOS data assimilation system.
IPSL-CM5A2: An Earth System Model designed to run long simulations for past and future climates
NASA Astrophysics Data System (ADS)
Sepulchre, Pierre; Caubel, Arnaud; Marti, Olivier; Hourdin, Frédéric; Dufresne, Jean-Louis; Boucher, Olivier
2017-04-01
The IPSL-CM5A model was developed and released in 2013 "to study the long-term response of the climate system to natural and anthropogenic forcings as part of the 5th Phase of the Coupled Model Intercomparison Project (CMIP5)" [Dufresne et al., 2013]. Although this model has also been used for numerous paleoclimate studies, a major limitation was its computation time, which averaged 10 model-years/day on 32 cores of the Curie supercomputer (at the TGCC computing center, France). Such performance was compatible with the experimental designs of intercomparison projects (e.g., CMIP, PMIP) but became limiting for modelling activities involving several multi-millennial experiments, which are typical of Quaternary or "deep-time" paleoclimate studies, in which a fully equilibrated deep ocean is mandatory. Here we present the Earth System model IPSL-CM5A2. Starting from IPSL-CM5A, technical developments have been performed both on separate components and on the coupling system in order to speed up the whole coupled model. These developments include hybrid MPI-OpenMP parallelization in the LMDz atmospheric component, a new input-output library that performs parallel asynchronous input/output by using computing cores as "IO servers", and a parallel coupling library between the ocean and atmospheric components. Running on 304 cores, the model can now simulate 55 years per day, opening the way to multi-millennial simulations. Apart from obtaining better computing performance, one aim of setting up IPSL-CM5A2 was to overcome the cold bias in global surface air temperature (t2m) depicted in IPSL-CM5A. We present the tuning strategy used to overcome this bias, as well as the main characteristics (including biases) of the pre-industrial climate simulated by IPSL-CM5A2.
Lastly, we briefly present paleoclimate simulations run with this model, for the Holocene and for deeper timescales in the Cenozoic, for which the particular continental configuration was accommodated by a new design of the ocean tripolar grid.
NASA Astrophysics Data System (ADS)
Robinson, Tyler D.; Crisp, David
2018-05-01
Solar and thermal radiation are critical aspects of planetary climate, with gradients in radiative energy fluxes driving heating and cooling. Climate models require that radiative transfer tools be versatile, computationally efficient, and accurate. Here, we describe a technique that uses an accurate full-physics radiative transfer model to generate a set of atmospheric radiative quantities which can be used to linearly adapt radiative flux profiles to changes in the atmospheric and surface state: the Linearized Flux Evolution (LiFE) approach. These radiative quantities describe how each model layer in a plane-parallel atmosphere reflects and transmits light, as well as how the layer generates diffuse radiation by thermal emission and by scattering light from the direct solar beam. By computing derivatives of these layer radiative properties with respect to dynamic elements of the atmospheric state, we can then efficiently adapt the flux profiles computed by the full-physics model to new atmospheric states. We validate the LiFE approach, and then apply this approach to Mars, Earth, and Venus, demonstrating the information contained in the layer radiative properties and their derivatives, as well as how the LiFE approach can be used to determine the thermal structure of radiative and radiative-convective equilibrium states in one-dimensional atmospheric models.
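The linear update at the heart of an approach like LiFE can be sketched as follows. This is a toy illustration, not the actual model: the baseline fluxes, state variables, and Jacobian values below are invented, and a real implementation works with layer reflection/transmission properties rather than fluxes directly.

```python
import numpy as np

def linearized_flux(F0, jacobian, x0, x_new):
    """First-order adaptation of a flux profile to a new atmospheric state."""
    return F0 + jacobian @ (x_new - x0)

# Toy setup: 3 layers, 2 state variables (e.g., a temperature and a humidity).
F0 = np.array([340.0, 250.0, 180.0])   # baseline fluxes from the full model (W m^-2)
x0 = np.array([288.0, 0.01])           # baseline state (illustrative)
J = np.array([[1.5, 200.0],            # dF_i/dx_j, precomputed by the full model
              [1.1, 120.0],
              [0.8, 60.0]])
x_new = np.array([289.0, 0.012])       # perturbed state

F_new = linearized_flux(F0, J, x0, x_new)  # no new full-physics run required
```

The expensive full-physics model is run once to produce `F0` and `J`; subsequent states are handled by the cheap matrix-vector update.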
Downscaling Global Emissions and Its Implications Derived from Climate Model Experiments
Abe, Manabu; Kinoshita, Tsuguki; Hasegawa, Tomoko; Kawase, Hiroaki; Kushida, Kazuhide; Masui, Toshihiko; Oka, Kazutaka; Shiogama, Hideo; Takahashi, Kiyoshi; Tatebe, Hiroaki; Yoshikawa, Minoru
2017-01-01
In climate change research, future scenarios of greenhouse gas and air pollutant emissions generated by integrated assessment models (IAMs) are used in climate models (CMs) and earth system models to analyze future interactions and feedback between human activities and climate. However, the spatial resolutions of IAMs and CMs differ. IAMs usually disaggregate the world into 10–30 aggregated regions, whereas CMs require a grid-based spatial resolution. Therefore, downscaling emissions data from IAMs to a finer scale is necessary before the emissions can be input into CMs. In this study, we examined whether differences in downscaling methods significantly affect climate variables such as temperature and precipitation. We tested two downscaling methods using the same regionally aggregated sulfur emissions scenario obtained from the Asian-Pacific Integrated Model/Computable General Equilibrium (AIM/CGE) model. The downscaled emissions were fed into the Model for Interdisciplinary Research on Climate (MIROC). One of the methods assumed a strong convergence of national emissions intensity (e.g., emissions per gross domestic product), while the other was based on inertia (i.e., the base-year spatial distribution remained unchanged). The emissions intensities in the downscaled spatial emissions generated from the two methods markedly differed, whereas the emissions densities (emissions per area) were similar. We investigated whether the climate change projections of temperature and precipitation would significantly differ between the two methods by applying a field significance test, and found little evidence of a significant difference between the two methods. Moreover, there was no clear evidence of a difference between the climate simulations based on these two downscaling methods. PMID:28076446
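The two downscaling ideas contrasted above can be sketched in a few lines. The numbers below are purely illustrative (not AIM/CGE output): a regional emissions total is spread over grid cells either in proportion to base-year emissions ("inertia") or in proportion to GDP, which drives emissions intensity (emissions per GDP) toward convergence.

```python
import numpy as np

E_region = 100.0                                 # future regional total (e.g., Gg S)
base_emissions = np.array([50.0, 30.0, 20.0])    # base-year emissions per cell
gdp = np.array([10.0, 60.0, 30.0])               # GDP per cell (arbitrary units)

# Inertia: keep the base-year spatial pattern of emissions.
inertia = E_region * base_emissions / base_emissions.sum()

# Intensity convergence: allocate proportionally to GDP, so emissions/GDP
# becomes uniform across cells.
convergence = E_region * gdp / gdp.sum()
```

Both allocations conserve the regional total, but they place the emissions in very different cells, which is exactly the contrast the field significance test in the study evaluates.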
Building confidence and credibility amid growing model and computing complexity
NASA Astrophysics Data System (ADS)
Evans, K. J.; Mahajan, S.; Veneziani, C.; Kennedy, J. H.
2017-12-01
As global Earth system models are developed to answer an ever-wider range of science questions, software products that provide robust verification, validation, and evaluation must evolve in tandem. Measuring the degree to which these new models capture past behavior, predict the future, and quantify the certainty of their predictions is becoming ever more challenging, for reasons that are generally well known yet still difficult to address. Two specific and divergent needs for analysis of the Accelerated Climate Model for Energy (ACME) model - addressed with a similar software philosophy - are presented to show how a model-developer focus can address analysis needs during expansive model changes, providing greater fidelity and execution on multi-petascale computing facilities. A-PRIME is a Python script-based quick-look overview of a fully coupled global model configuration, used to determine quickly whether it captures specific behavior before significant computer time and expense is invested. EVE is an ensemble-based software framework that focuses on verification of performance-based ACME model development, such as compiler or machine settings, to determine the equivalence of relevant climate statistics. The challenges and solutions for analysis of multi-petabyte output data are highlighted from the perspective of the scientist using the software, with the aim of fostering discussion and further input from the community about improving developer confidence and community credibility.
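The kind of equivalence check EVE performs can be sketched with a two-sample Kolmogorov-Smirnov statistic. Everything below is a hypothetical stand-in (synthetic ensembles and an illustrative threshold), not the actual ACME/EVE test suite: two ensembles of a climate statistic, e.g. from runs built with different compiler settings, are compared distributionally.

```python
import numpy as np

def ks_statistic(a, b):
    """Maximum distance between the empirical CDFs of two samples."""
    a, b = np.sort(a), np.sort(b)
    grid = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, grid, side="right") / len(a)
    cdf_b = np.searchsorted(b, grid, side="right") / len(b)
    return np.max(np.abs(cdf_a - cdf_b))

rng = np.random.default_rng(0)
baseline = rng.normal(287.0, 0.1, size=30)   # ensemble with compiler A (K)
modified = rng.normal(287.0, 0.1, size=30)   # ensemble with compiler B (K)

d = ks_statistic(baseline, modified)
equivalent = d < 0.35   # illustrative cutoff; a real test would use a p-value
```

A framework like EVE would repeat such tests over many variables and ensemble members before declaring two model builds statistically equivalent.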
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahowald, Natalie
Soils in natural and managed ecosystems and wetlands are well known sources of methane, nitrous oxides, and reactive nitrogen gases, but the magnitudes of gas flux to the atmosphere are still poorly constrained. Thus, the reasons for the large increases in atmospheric concentrations of methane and nitrous oxide since the preindustrial time period are not well understood. The low atmospheric concentrations of methane and nitrous oxide, despite being more potent greenhouse gases than carbon dioxide, complicate empirical studies to provide explanations. In addition to climate concerns, the emissions of reactive nitrogen gases from soils are important to the changing nitrogen balance in the earth system, subject to human management, and may change substantially in the future. Thus improved modeling of the emission fluxes of these species from the land surface is important. Currently, there are emission modules for methane and some nitrogen species in the Community Earth System Model’s Community Land Model (CLM-ME/N); however, there are large uncertainties and problems in the simulations, resulting in coarse estimates. In this proposal, we seek to improve these emission modules by combining state-of-the-art process modules for emissions, available data, and new optimization methods. In earth science problems, we often have substantial data and knowledge of processes in disparate systems, and thus we need to combine data and a general process level understanding into a model for projections of future climate that are as accurate as possible. The best methodologies for optimization of parameters in earth system models are still being developed. 
In this proposal we will develop and apply surrogate algorithms that (a) were especially developed for computationally expensive simulations like the CLM-ME/N models; (b) were demonstrated (in the earlier surrogate optimization Stochastic RBF) to perform very well, with limited numbers of simulations, on computationally expensive complex partial differential equations in earth science; and (c) will be significantly improved as part of the proposed research, by adding asynchronous parallelism and early truncation of unsuccessful simulations, and by improving both serial and parallel performance through the use of derivative and sensitivity information from global and local surrogate approximations S(x). The algorithm development and testing will be focused on the CLM-ME/N model application, but the methods are general and are expected to also perform well on parameter estimation for other climate models and on other classes of continuous multimodal optimization problems arising from complex simulation models. In addition, this proposal will compile available datasets of emissions of methane, nitrous oxides, and reactive nitrogen species and develop protocols for site-level comparisons with the CLM-ME/N. Once the model parameters are optimized against site-level data, the model will be run at the global level and compared to atmospheric concentration measurements for the current climate, and future emissions will be estimated using climate change as simulated by the CESM. This proposal combines experts in earth system modeling, optimization, computer science, and process-level understanding of soil gas emissions in an interdisciplinary team in order to improve the modeling of methane and nitrogen gas emissions. It thus meets the requirements of the SciDAC RFP by integrating state-of-the-art computer science and earth system science to build an improved earth system model.
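The core loop of RBF surrogate optimization can be sketched in one dimension. This is a minimal illustration in the spirit of such methods, not the Stochastic RBF algorithm itself: a cheap toy function stands in for an expensive CLM-ME/N calibration run, and the kernel width and sample points are arbitrary.

```python
import numpy as np

def expensive_model(x):
    return (x - 0.3) ** 2          # stand-in for a costly simulation

def fit_rbf(X, y, eps=2.0):
    """Solve for Gaussian-RBF weights that interpolate the sampled points."""
    Phi = np.exp(-eps * (X[:, None] - X[None, :]) ** 2)
    return np.linalg.solve(Phi, y)

def rbf_predict(X, w, x_new, eps=2.0):
    """Evaluate the surrogate at new points."""
    return np.exp(-eps * (x_new[:, None] - X[None, :]) ** 2) @ w

X = np.array([0.0, 0.5, 1.0])      # a few initial expensive evaluations
y = expensive_model(X)
w = fit_rbf(X, y)

# Search the cheap surrogate, then spend one expensive run on its minimizer.
candidates = np.linspace(0.0, 1.0, 101)
x_next = candidates[np.argmin(rbf_predict(X, w, candidates))]
y_next = expensive_model(x_next)
```

In a full algorithm, `(x_next, y_next)` is appended to the sample set, the surrogate is refit, and the loop repeats; the asynchronous-parallel variants in the proposal dispatch many such candidate evaluations concurrently.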
The future of climate science analysis in a coming era of exascale computing
NASA Astrophysics Data System (ADS)
Bates, S. C.; Strand, G.
2013-12-01
Projections of Community Earth System Model (CESM) output based on the growth of data archived over 2000-2012 at all of our computing sites (NCAR, NERSC, ORNL) show that we can expect to reach 1,000 PB (1 EB) sometime in the next decade or so. The current paradigms of using site-based archival systems to hold these data that are then accessed via portals or gateways, downloading the data to a local system, and then processing/analyzing the data will be irretrievably broken before then. From a climate modeling perspective, the expertise involved in making climate models themselves efficient on HPC systems will need to be applied to the data as well - providing fast parallel analysis tools co-resident in memory with the data, because disk I/O bandwidth simply will not keep up with the expected arrival of exaflop systems. The ability of scientists, analysts, stakeholders and others to use climate model output to turn these data into understanding and knowledge will require significant advances in the current typical analysis tools and packages to enable these processes for these vast volumes of data. Allowing data users to enact their own analyses on model output is virtually a requirement as well - climate modelers cannot anticipate all the possibilities for analysis that users may want to do. In addition, the expertise of data scientists, and their knowledge of the model output and of best practices in data management (metadata, curation, provenance and so on), will need to be rewarded and exploited to gain the most understanding possible from these volumes of data. In response to growing data size, demand, and future projections, the CESM output has undergone a structural evolution and the data management plan has been reevaluated and updated. The major evolution of the CESM data structure is presented here, along with the CESM experience and role within CMIP3/CMIP5.
Equilibrium and Effective Climate Sensitivity
NASA Astrophysics Data System (ADS)
Rugenstein, M.; Bloch-Johnson, J.
2016-12-01
Atmosphere-ocean general circulation models, as well as the real world, take thousands of years to equilibrate to CO2 induced radiative perturbations. Equilibrium climate sensitivity - a fully equilibrated 2xCO2 perturbation - has been used for decades as a benchmark in model intercomparisons, as a test of our understanding of the climate system and paleo proxies, and to predict or project future climate change. Computational costs and limited time lead to the widespread practice of extrapolating equilibrium conditions from just a few decades of coupled simulations. The most common workaround is the "effective climate sensitivity" - defined through an extrapolation of a 150-year abrupt2xCO2 simulation, including the assumption of linear climate feedbacks. The definitions of effective and equilibrium climate sensitivity are often mixed up and used equivalently, and it is argued that "transient climate sensitivity" is the more relevant measure for predicting the next decades. We present an ongoing model intercomparison, the "LongRunMIP", to study century and millennia time scales of AOGCM equilibration and the linearity assumptions around feedback analysis. As a true ensemble of opportunity, there is no protocol and the only condition to participate is a coupled model simulation of any stabilizing scenario simulating more than 1000 years. Many of the submitted simulations took several years to conduct. As of July 2016 the contribution comprises 27 scenario simulations of 13 different models originating from 7 modeling centers, each between 1000 and 6000 years. To contribute, please contact the authors as soon as possible. We present preliminary results, discussing differences between effective and equilibrium climate sensitivity, the usefulness of transient climate sensitivity, extrapolation methods, and the state of the coupled climate system close to equilibrium. 
Figure caption: Evolution of the temperature anomaly and radiative imbalance of 22 simulations with 12 models (color indicates the model); 20-year moving average.
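The 150-year extrapolation behind "effective climate sensitivity" is commonly done with a Gregory-style regression of top-of-atmosphere imbalance N against warming dT, assuming N = F - lambda*dT. A minimal sketch with synthetic numbers (the forcing and feedback values are illustrative, not from any CMIP model):

```python
import numpy as np

# Synthetic annual means from a hypothetical abrupt2xCO2 run, consistent
# with forcing F = 3.7 W m^-2 and feedback lambda = 1.2 W m^-2 K^-1.
dT = np.linspace(0.5, 3.0, 150)        # global-mean warming (K)
N = 3.7 - 1.2 * dT                     # TOA radiative imbalance (W m^-2)

# Linear fit N = intercept + slope*dT: intercept estimates F, slope -lambda.
slope, intercept = np.polyfit(dT, N, 1)

# Effective climate sensitivity: the warming at which N crosses zero.
ecs_effective = -intercept / slope
```

With perfectly linear feedbacks this recovers F/lambda exactly; in real models the scatter and curvature of the (dT, N) relationship are precisely what make effective and equilibrium sensitivity differ.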
NASA Technical Reports Server (NTRS)
Bergstrom, Robert W.; Mlawer, Eli J.; Sokolik, Irina N.; Clough, Shepard A.; Toon, Owen B.
1998-01-01
This paper presents a radiative transfer model that has been developed to accurately predict the atmospheric radiant flux in both the infrared and the solar spectrum with a minimum of computational effort. The model is designed to be included in numerical climate models. To assess the accuracy of the model, the results are compared to other more detailed models for several standard cases in the solar and thermal spectrum. As the thermal spectrum has been treated in other publications, we focus here on the solar part of the spectrum. We perform several example calculations focussing on the question of absorption of solar radiation by gases and aerosols.
Web-GIS platform for monitoring and forecasting of regional climate and ecological changes
NASA Astrophysics Data System (ADS)
Gordov, E. P.; Krupchatnikov, V. N.; Lykosov, V. N.; Okladnikov, I.; Titov, A. G.; Shulgina, T. M.
2012-12-01
The growing volume of environmental data from sensors and model outputs makes the development of a software infrastructure, based on modern information and telecommunication technologies, to support integrated research in the Earth sciences an urgent and important task (Gordov et al., 2012; van der Wel, 2005). The inherent heterogeneity of datasets obtained from different sources and institutions not only hampers the interchange of data and analysis results but also complicates their intercomparison, reducing the reliability of analysis results. However, modern geophysical data processing techniques allow different technological solutions to be combined in organizing such information resources. It is now generally accepted that an information-computational infrastructure should rely on the combined use of web and GIS technologies for creating applied information-computational web systems (Titov et al., 2009; Gordov et al., 2010; Gordov, Okladnikov and Titov, 2011). Using these approaches to develop internet-accessible thematic information-computational systems, and to arrange data and knowledge interchange between them, is a very promising way to create a distributed information-computational environment supporting multidisciplinary regional and global research in the Earth sciences, including analysis of climate changes and their impact on the spatial-temporal distribution and state of vegetation. 
We present an experimental software and hardware platform that supports a web-oriented production and research center for regional climate change investigations, combining a modern Web 2.0 approach, GIS functionality, and capabilities for running climate and meteorological models, processing large geophysical datasets, visualization, joint software development by distributed research groups, scientific analysis, and the education of undergraduate and graduate students. The platform software (Shulgina et al., 2012; Okladnikov et al., 2012) includes dedicated modules for numerical processing of regional and global modeling results for subsequent analysis and visualization. Data preprocessing, execution, and visualization of results for the WRF and «Planet Simulator» models integrated into the platform are also provided. All functions of the center are accessible to users through a web portal with a common graphical web browser, via an interactive graphical user interface that provides, in particular, visualization of processing results, selection of a geographical region of interest (pan and zoom), and manipulation of data layers (order, enable/disable, feature extraction). The platform provides users with capabilities for heterogeneous geophysical data analysis, including high-resolution data, and for discovering tendencies in climatic and ecosystem changes within different multidisciplinary research projects (Shulgina et al., 2011). Even an unskilled user without specific knowledge can use it to perform computational processing and visualization of large meteorological, climatological, and satellite monitoring datasets through a unified graphical web interface.
Applying an economical scale-aware PDF-based turbulence closure model in NOAA NCEP GCMs
NASA Astrophysics Data System (ADS)
Belochitski, A.; Krueger, S. K.; Moorthi, S.; Bogenschutz, P.; Pincus, R.
2016-12-01
A novel unified representation of sub-grid scale (SGS) turbulence, cloudiness, and shallow convection is being implemented into the NOAA NCEP Global Forecasting System (GFS) general circulation model. The approach, known as Simplified Higher-Order Closure (SHOC), is based on predicting a joint PDF of SGS thermodynamic variables and vertical velocity and using it to diagnose turbulent diffusion coefficients, SGS fluxes, condensation, and cloudiness. Unlike other similar methods, only one new prognostic variable, turbulent kinetic energy (TKE), needs to be introduced, making the technique computationally efficient. SHOC is now incorporated into a version of GFS, as well as into the next generation of the NCEP global model, the NOAA Environmental Modeling System (NEMS). Turbulent diffusion coefficients computed by SHOC are now used in place of those produced by the boundary layer turbulence and shallow convection parameterizations. The large-scale microphysics scheme is no longer used to calculate cloud fraction or large-scale condensation/deposition; instead, SHOC provides these variables. The radiative transfer parameterization uses cloudiness computed by SHOC. Outstanding problems include high-level tropical cloud fraction being too high in SHOC runs, possibly related to the interaction of SHOC with condensate detrained from deep convection. Future work will consist of evaluating model performance and tuning the physics if necessary, by performing medium-range NWP forecasts with prescribed initial conditions and AMIP-type climate tests with prescribed SSTs. Depending on the results, the model will be tuned or the parameterizations modified. Next, SHOC will be implemented in the NCEP CFS, and tuned and evaluated for climate applications: seasonal prediction and long coupled climate runs. The impact of the new physics on ENSO, MJO, ISO, monsoon variability, etc., will be examined.
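TKE-based closures of this family typically diagnose an eddy diffusivity from the prognostic TKE and a turbulent length scale, in a form like K = c_k * l * sqrt(TKE). The sketch below uses that generic form with made-up constants and profiles; it is not the operational SHOC/GFS formulation.

```python
import numpy as np

c_k = 0.1                                # dimensionless constant (assumed value)
l = np.array([50.0, 100.0, 150.0])       # turbulent length scale per level (m)
tke = np.array([0.25, 0.16, 0.04])       # prognostic TKE per level (m^2 s^-2)

# Diagnosed eddy diffusivity per model level (m^2 s^-1), replacing the
# coefficients formerly supplied by separate PBL/shallow-convection schemes.
K = c_k * l * np.sqrt(tke)
```

The appeal noted in the abstract is visible here: only `tke` must be carried as a new prognostic variable, while `K` and the SGS fluxes are diagnosed from it each time step.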
Application of web-GIS approach for climate change study
NASA Astrophysics Data System (ADS)
Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander; Bogomolov, Vasily; Martynova, Yuliya; Shulgina, Tamara
2013-04-01
Georeferenced datasets are currently actively used in numerous applications, including modeling, interpretation, and forecasting of climatic and ecosystem changes on various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their huge size, which may reach tens of terabytes for a single dataset, present-day studies of climate and environmental change require special software support. A dedicated web-GIS information-computational system for analysis of georeferenced climatological and meteorological data has been created. It is based on OGC standards and employs modern solutions such as an object-oriented programming model, modular composition, and JavaScript libraries based on the GeoExt library, the ExtJS framework, and OpenLayers software. The main advantage of the system lies in the possibility to perform mathematical and statistical data analysis, graphical visualization of results with GIS functionality, and preparation of binary output files with only a modern graphical web browser installed on a common desktop computer connected to the Internet. Several geophysical datasets are available for processing by the system, including two editions of the NCEP/NCAR Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, the ECMWF ERA-40 Reanalysis, the ECMWF ERA-Interim Reanalysis, the MRI/JMA APHRODITE's Water Resources Project Reanalysis, the DWD Global Precipitation Climatology Centre's data, the GMAO Modern Era-Retrospective analysis for Research and Applications, meteorological observational data for the territory of the former USSR for the 20th century, and results of modeling by global and regional climatological models; this list continues to grow. Functionality to run the WRF and "Planet Simulator" models has also been implemented in the system. 
Because many parameters are preset and the time and spatial ranges in the system are limited, these models have low computational power requirements and can be used in educational workflows for better understanding of basic climatological and meteorological processes. The web-GIS information-computational system for geophysical data analysis provides specialists involved in multidisciplinary research projects with reliable and practical instruments for complex analysis of climate and ecosystem changes on global and regional scales. Even an unskilled user without specific knowledge can use it to perform computational processing and visualization of large meteorological, climatological, and satellite monitoring datasets through a unified web interface in a common graphical web browser. This work is partially supported by the Ministry of Education and Science of the Russian Federation (contract #8345), SB RAS project VIII.80.2.1, RFBR grant #11-05-01190a, and integrated project SB RAS #131.
Generating synthetic wave climates for coastal modelling: a linear mixed modelling approach
NASA Astrophysics Data System (ADS)
Thomas, C.; Lark, R. M.
2013-12-01
Numerical coastline morphological evolution models require wave climate properties to drive morphological change through time. Wave climate properties (typically wave height, period and direction) may be temporally fixed, culled from real wave buoy data, or allowed to vary in some way defined by a Gaussian or other pdf. However, to examine sensitivity of coastline morphologies to wave climate change, it seems desirable to be able to modify wave climate time series from a current to some new state along a trajectory, but in a way consistent with, or initially conditioned by, the properties of existing data, or to generate fully synthetic data sets with realistic time series properties. For example, mean or significant wave height time series may have underlying periodicities, as revealed in numerous analyses of wave data. Our motivation is to develop a simple methodology to generate synthetic wave climate time series that can change in some stochastic way through time. We wish to use such time series in a coastline evolution model to test sensitivities of coastal landforms to changes in wave climate over decadal and centennial scales. We have worked initially on time series of significant wave height, based on data from a Waverider III buoy located off the coast of Yorkshire, England. The statistical framework for the simulation is the linear mixed model. The target variable, perhaps after transformation (Box-Cox), is modelled as a multivariate Gaussian, the mean modelled as a function of a fixed effect, and two random components, one of which is independently and identically distributed (iid) and the second of which is temporally correlated. The model was fitted to the data by likelihood methods. We considered the option of a periodic mean, the period either fixed (e.g. at 12 months) or estimated from the data. We considered two possible correlation structures for the second random effect. In one the correlation decays exponentially with time. 
In the second (spherical) model, it cuts off at a temporal range. Having fitted the model, we generated multiple realisations: the random effects were simulated by specifying a covariance matrix for the simulated values, with the estimated parameters. The Cholesky factorisation of the covariance matrix was computed, and realisations of the random component of the model were generated by pre-multiplying a vector of iid standard Gaussian variables by the lower triangular factor. The resulting random variate was added to the mean value computed from the fixed effects, and the result back-transformed to the original scale of the measurement. Realistic simulations result from the approach described above. Background exploratory data analysis was undertaken on 20-day sets of 30-minute buoy data, selected from days 5-24 of January, April, July, and October 2011, to elucidate daily to weekly variations and to keep the numerical analysis computationally tractable. Work remains to be undertaken to develop suitable models for synthetic directional data. We suggest that the general principles of the method will have applications in other geomorphological modelling endeavours requiring time series of stochastically variable environmental parameters.
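The simulation step described above can be sketched directly. The parameters below are illustrative placeholders, not the values fitted to the Waverider buoy data: an exponential covariance for the temporally correlated random effect plus an iid nugget, Cholesky-factorised, with iid standard Gaussians pre-multiplied by the lower triangular factor and added to a periodic fixed-effect mean.

```python
import numpy as np

n = 200                      # number of 30-minute time steps
sigma2, phi = 0.5, 12.0      # variance and temporal range parameter (assumed)
nugget = 0.1                 # variance of the iid random effect (assumed)

t = np.arange(n, dtype=float)
# Exponential (in time) covariance for the correlated random effect,
# plus the iid component on the diagonal.
C = sigma2 * np.exp(-np.abs(t[:, None] - t[None, :]) / phi)
C += nugget * np.eye(n)

L = np.linalg.cholesky(C)
rng = np.random.default_rng(42)
z = L @ rng.standard_normal(n)          # zero-mean correlated realisation

# Periodic fixed-effect mean (amplitude, level, and period are illustrative).
mean = 1.5 + 0.5 * np.sin(2 * np.pi * t / 48.0)
hs_sim = mean + z                        # simulated wave height on the
                                         # transformed (e.g., Box-Cox) scale
```

A back-transformation (inverse Box-Cox) would then return `hs_sim` to the original measurement scale.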
NASA Technical Reports Server (NTRS)
Lucarini, Valerio; Russell, Gary L.; Hansen, James E. (Technical Monitor)
2002-01-01
Results are presented for two greenhouse gas experiments of the Goddard Institute for Space Studies Atmosphere-Ocean Model (AOM). The computed trends of surface pressure, surface temperature, 850, 500 and 200 mb geopotential heights and related temperatures of the model for the time frame 1960-2000 are compared to those obtained from the National Centers for Environmental Prediction observations. A spatial correlation analysis and mean value comparison are performed, showing good agreement. A brief general discussion about the statistics of trend detection is presented. The domain of interest is the Northern Hemisphere (NH) because of the higher reliability of both the model results and the observations. The accuracy that this AOM has in describing the observed regional and NH climate trends makes it reliable in forecasting future climate changes.
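The trend comparison described above can be sketched with synthetic fields (the data below are random stand-ins, not the AOM or NCEP records): least-squares trends are computed at each grid point for model and observations over 1960-2000, and the two trend maps are compared via a spatial (pattern) correlation.

```python
import numpy as np

years = np.arange(1960, 2001, dtype=float)
ngrid = 50
rng = np.random.default_rng(1)

# Synthetic "truth": a different underlying trend at each grid point (K/yr),
# observed by both model and observations with independent noise.
true_trend = rng.normal(0.02, 0.01, size=ngrid)
model = true_trend[:, None] * years + rng.normal(0, 0.1, size=(ngrid, len(years)))
obs = true_trend[:, None] * years + rng.normal(0, 0.1, size=(ngrid, len(years)))

# Least-squares trend at every grid point (polyfit fits each column of y).
model_trend = np.polyfit(years, model.T, 1)[0]
obs_trend = np.polyfit(years, obs.T, 1)[0]

# Spatial (pattern) correlation between the two trend maps.
pattern_corr = np.corrcoef(model_trend, obs_trend)[0, 1]
```

A high pattern correlation is the kind of evidence the abstract cites for agreement between modeled and observed regional trends.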
ARM Data-Oriented Metrics and Diagnostics Package for Climate Model Evaluation Value-Added Product
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Chengzhu; Xie, Shaocheng
A Python-based metrics and diagnostics package is currently being developed by the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Infrastructure Team at Lawrence Livermore National Laboratory (LLNL) to facilitate the use of long-term, high-frequency measurements from the ARM Facility in evaluating the regional climate simulation of clouds, radiation, and precipitation. This metrics and diagnostics package computes climatological means of targeted climate model simulations and generates tables and plots for comparing the model simulations with ARM observational data. The Coupled Model Intercomparison Project (CMIP) model data sets are also included in the package to enable model intercomparison, as demonstrated in Zhang et al. (2017). The CMIP multi-model mean can serve as a reference for individual models. Basic performance metrics are computed to measure the accuracy of the mean state and variability of climate models. The evaluated physical quantities include cloud fraction, temperature, relative humidity, cloud liquid water path, total column water vapor, precipitation, sensible and latent heat fluxes, and radiative fluxes, with plans to extend to more fields, such as aerosol and microphysics properties. Process-oriented diagnostics focusing on individual cloud- and precipitation-related phenomena are also being developed for the evaluation and development of specific model physical parameterizations. The version 1.0 package is designed based on data collected at ARM's Southern Great Plains (SGP) Research Facility, with the plan to extend to other ARM sites. The metrics and diagnostics package is currently built upon standard Python libraries and additional Python packages developed by DOE (such as CDMS and CDAT). The ARM metrics and diagnostics package is publicly available, with the hope that it can serve as an easy entry point for climate modelers to compare their models with ARM data. 
In this report, we first present the input data, which constitute the core content of the metrics and diagnostics package (section 2), followed by a user's guide documenting the workflow and structure of the version 1.0 code, including step-by-step instructions for running the package (section 3).
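The core computation the package describes — a climatological mean of a model field plus simple mean-state performance metrics against observations — can be sketched as follows. This is a hedged illustration with plain NumPy and synthetic data, not the package's actual CDMS/CDAT-based implementation; all function names here are hypothetical.

```python
import numpy as np

def monthly_climatology(data, months):
    """Climatological mean for each calendar month.

    data   -- 1-D array of monthly values (e.g. cloud fraction)
    months -- matching array of calendar-month labels, 1..12
    """
    return np.array([data[months == m].mean() for m in range(1, 13)])

def bias_and_rmse(model_clim, obs_clim):
    """Basic mean-state metrics: mean bias and RMSE of the annual cycle."""
    diff = model_clim - obs_clim
    return diff.mean(), np.sqrt((diff ** 2).mean())

# Example: 10 years of synthetic monthly model output and ARM-like obs
rng = np.random.default_rng(0)
months = np.tile(np.arange(1, 13), 10)
obs = 0.5 + 0.2 * np.sin(2 * np.pi * (months - 1) / 12)
model = obs + 0.05 + 0.02 * rng.standard_normal(obs.size)

bias, rmse = bias_and_rmse(monthly_climatology(model, months),
                           monthly_climatology(obs, months))
```

The same pattern extends naturally to each evaluated quantity (cloud fraction, precipitation, fluxes, and so on), with the CMIP multi-model mean substituted as a second reference climatology.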
NASA Astrophysics Data System (ADS)
Sargsyan, K.; Safta, C.; Debusschere, B.; Najm, H.
2010-12-01
Uncertainty quantification in complex climate models is challenged by the sparsity of available climate model predictions due to the high computational cost of model runs. Another feature that prevents classical uncertainty analysis from being readily applicable is bifurcative behavior in climate model response with respect to certain input parameters. A typical example is the Atlantic Meridional Overturning Circulation. The predicted maximum overturning stream function exhibits discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO2 forcing. We outline a methodology for uncertainty quantification given discontinuous model response and a limited number of model runs. Our approach is two-fold. First we detect the discontinuity with Bayesian inference, thus obtaining a probabilistic representation of the discontinuity curve shape and location for arbitrarily distributed input parameter values. Then, we construct spectral representations of uncertainty, using Polynomial Chaos (PC) expansions on either side of the discontinuity curve, leading to an averaged-PC representation of the forward model that allows efficient uncertainty quantification. The approach is enabled by a Rosenblatt transformation that maps each side of the discontinuity to regular domains where desirable orthogonality properties for the spectral bases hold. We obtain PC modes by either orthogonal projection or Bayesian inference, and argue for a hybrid approach that targets a balance between the accuracy provided by the orthogonal projection and the flexibility provided by the Bayesian inference - where the latter allows obtaining reasonable expansions without extra forward model runs. The model output, and its associated uncertainty at specific design points, are then computed by taking an ensemble average over PC expansions corresponding to possible realizations of the discontinuity curve. 
The methodology is tested on synthetic examples of discontinuous model data with adjustable sharpness and structure. This work was supported by the Sandia National Laboratories Seniors’ Council LDRD (Laboratory Directed Research and Development) program. Sandia National Laboratories is a multi-program laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Company, for the U.S. Department of Energy’s National Nuclear Security Administration under contract DE-AC04-94AL85000.
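On each side of the discontinuity, the orthogonal-projection route to PC coefficients reduces, in one dimension with a uniform input, to Gauss-Legendre quadrature. The sketch below is a minimal single-domain illustration of that projection step (no Rosenblatt transform, no discontinuity handling), using a hypothetical smooth model; it is not the authors' code.

```python
import numpy as np
from numpy.polynomial import legendre

def pc_coefficients(model, order):
    """Project model(x), x ~ Uniform(-1, 1), onto Legendre polynomials by
    Gauss-Legendre quadrature: c_k = (2k+1)/2 * \int f(x) P_k(x) dx."""
    x, w = legendre.leggauss(order + 1)   # quadrature nodes and weights
    f = model(x)
    coeffs = []
    for k in range(order + 1):
        Pk = legendre.Legendre.basis(k)(x)
        coeffs.append((2 * k + 1) / 2 * np.sum(w * f * Pk))
    return np.array(coeffs)

# A smooth "model response" on one side of a hypothetical discontinuity
model = lambda x: np.exp(0.3 * x)
c = pc_coefficients(model, 6)
surrogate = legendre.Legendre(c)   # cheap PC surrogate of the model
```

In the averaged-PC setting described above, one such expansion is built per side of each realization of the discontinuity curve, and predictions are ensemble-averaged over realizations.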
Numerical modelling of climate change impacts on freshwater lenses on the North Sea Island of Borkum
NASA Astrophysics Data System (ADS)
Sulzbacher, H.; Wiederhold, H.; Siemon, B.; Grinat, M.; Igel, J.; Burschil, T.; Günther, T.; Hinsby, K.
2012-03-01
A numerical variable-density groundwater model is set up for the North Sea Island of Borkum to estimate climate change impacts on coastal aquifers, and especially on the situation of barrier islands in the Wadden Sea. The database includes information from boreholes, a seismic survey, a helicopter-borne electromagnetic (HEM) survey, monitoring of the freshwater-saltwater boundary by vertical electrode chains in two boreholes, measurements of the groundwater table, pumping and slug tests, as well as water samples. Based on a statistical analysis of borehole columns, seismic sections and HEM data, a hydrogeological model is set up. The groundwater model is developed using the finite-element programme FEFLOW. The variable-density groundwater model is calibrated on the basis of hydraulic, hydrological and geophysical data, in particular spatial HEM and local monitoring data. Verification runs with the calibrated model show good agreement between measured and computed hydraulic heads. Good agreement is also obtained between measured and computed density or total dissolved solids data, both for the entire freshwater lens on a large scale and in the area of the well fields on a small scale. For simulating future changes in this coastal groundwater system until the end of the current century, we use the climate scenario A2, specified by the Intergovernmental Panel on Climate Change, and in particular the data for the German North Sea coast. Simulation runs show progressive salinization over time beneath the well fields of the two waterworks Waterdelle and Ostland. The modelling study shows that spreading of the well fields is an appropriate protection measure against excessive salinization of the water supply until the end of the current century.
NASA Astrophysics Data System (ADS)
Sulzbacher, H.; Wiederhold, H.; Siemon, B.; Grinat, M.; Igel, J.; Burschil, T.; Günther, T.; Hinsby, K.
2012-10-01
A numerical, density-dependent groundwater model is set up for the North Sea Island of Borkum to estimate climate change impacts on coastal aquifers, and especially on the situation of barrier islands in the Wadden Sea. The database includes information from boreholes, a seismic survey, a helicopter-borne electromagnetic (HEM) survey, monitoring of the freshwater-saltwater boundary by vertical electrode chains in two boreholes, measurements of the groundwater table, pumping and slug tests, as well as water samples. Based on a statistical analysis of borehole columns, seismic sections and HEM data, a hydrogeological model is set up. The groundwater model is developed using the finite-element programme FEFLOW. The density-dependent groundwater model is calibrated on the basis of hydraulic, hydrological and geophysical data, in particular spatial HEM and local monitoring data. Verification runs with the calibrated model show good agreement between measured and computed hydraulic heads. Good agreement is also obtained between measured and computed density or total dissolved solids data, both for the entire freshwater lens on a large scale and in the area of the well fields on a small scale. For simulating future changes in this coastal groundwater system until the end of the current century, we use the climate scenario A2, specified by the Intergovernmental Panel on Climate Change, and, in particular, the data for the German North Sea coast. Simulation runs show progressive salinisation over time beneath the well fields of the two waterworks Waterdelle and Ostland. The modelling study shows that the spreading of well fields is an appropriate protection measure against excessive salinisation of the water supply until the end of the current century.
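The study relies on a full variable-density FEFLOW model, but the leading-order behaviour of a freshwater lens can be illustrated with the classical Ghyben-Herzberg sharp-interface relation, z = ρf/(ρs − ρf) · h ≈ 40 h. A minimal sketch, using typical density values; this is a back-of-envelope approximation only, not the study's method.

```python
def ghyben_herzberg_depth(head_m, rho_fresh=1000.0, rho_salt=1025.0):
    """Depth of the freshwater-saltwater interface below sea level for a
    given water-table elevation above sea level (classical sharp-interface
    approximation; real lenses require variable-density models such as
    the FEFLOW model used in the study)."""
    return rho_fresh / (rho_salt - rho_fresh) * head_m

# A 1 m water-table mound supports roughly a 40 m deep freshwater lens
depth = ghyben_herzberg_depth(1.0)
```

The 40:1 ratio explains why even modest drawdown of the water table beneath a well field can translate into substantial up-coning of the saltwater interface.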
Sparse Polynomial Chaos Surrogate for ACME Land Model via Iterative Bayesian Compressive Sensing
NASA Astrophysics Data System (ADS)
Sargsyan, K.; Ricciuto, D. M.; Safta, C.; Debusschere, B.; Najm, H. N.; Thornton, P. E.
2015-12-01
For computationally expensive climate models, Monte Carlo approaches to exploring the input parameter space are often prohibitive due to slow convergence with respect to ensemble size. To alleviate this, we build inexpensive surrogates using uncertainty quantification (UQ) methods employing Polynomial Chaos (PC) expansions that approximate the input-output relationships using as few model evaluations as possible. However, when many uncertain input parameters are present, such UQ studies suffer from the curse of dimensionality. In particular, for 50-100 input parameters, non-adaptive PC representations have infeasible numbers of basis terms. To this end, we develop and employ Weighted Iterative Bayesian Compressive Sensing to learn the most important input parameter relationships for efficient, sparse PC surrogate construction, with posterior uncertainty quantified to reflect the limited data. Besides drastic dimensionality reduction, the uncertain surrogate can efficiently replace the model in computationally intensive studies such as forward uncertainty propagation and variance-based sensitivity analysis, as well as design optimization and parameter estimation using observational data. We applied the surrogate construction and variance-based uncertainty decomposition to the Accelerated Climate Model for Energy (ACME) Land Model for several output QoIs at nearly 100 FLUXNET sites covering multiple plant functional types and climates, varying 65 input parameters over broad ranges of possible values. This work is supported by the U.S. Department of Energy, Office of Science, Biological and Environmental Research, Accelerated Climate Modeling for Energy (ACME) project. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
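The paper's Weighted Iterative Bayesian Compressive Sensing is Bayesian; a minimal non-Bayesian analogue of the underlying idea — recovering a sparse coefficient vector from fewer model runs than basis terms — is iterative soft-thresholding (ISTA) for l1-regularized least squares. The sketch below uses synthetic data, and all sizes and names are illustrative, not the authors' algorithm.

```python
import numpy as np

def ista(A, y, lam=1e-3, n_iter=3000):
    """Iterative soft-thresholding for min 0.5*||Ax - y||^2 + lam*||x||_1,
    a simple (non-Bayesian) stand-in for compressive sensing."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = x - A.T @ (A @ x - y) / L          # gradient step on the quadratic
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # shrinkage
    return x

# 40 PC basis terms, only 3 truly active, recovered from 30 "model runs"
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 40)) / 6.0        # regression (basis) matrix
x_true = np.zeros(40)
x_true[[3, 17, 32]] = [1.0, -2.0, 0.5]
x_hat = ista(A, A @ x_true)
```

The Bayesian variant used in the paper additionally yields posterior uncertainty on the recovered coefficients, which a point estimate like this cannot provide.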
A New High Resolution Climate Dataset for Climate Change Impacts Assessments in New England
NASA Astrophysics Data System (ADS)
Komurcu, M.; Huber, M.
2016-12-01
Assessing regional impacts of climate change (such as changes in extreme events, land surface hydrology, water resources, energy, ecosystems and economy) requires much higher resolution climate variables than those available from global model projections. While it is possible to run global models at higher resolution, the high computational cost associated with such simulations prevents their use in this manner. To alleviate this problem, dynamical downscaling offers a method to deliver higher resolution climate variables. As part of an NSF EPSCoR funded interdisciplinary effort to assess climate change impacts on New Hampshire ecosystems, hydrology and economy (the New Hampshire Ecosystems and Society project), we create a unique high-resolution climate dataset for New England. We dynamically downscale global model projections under a high impact emissions scenario using the Weather Research and Forecasting model (WRF) with three nested grids of 27, 9 and 3 km horizontal resolution, with the highest resolution innermost grid focused on New England. We prefer dynamical downscaling over other methods such as statistical downscaling because it employs physical equations to progressively simulate climate variables as atmospheric processes interact with surface processes, emissions, radiation, clouds, precipitation and other model components, rather than imposing fixed statistical relationships between variables. In addition to simulating mean changes in regional climate, dynamical downscaling also allows for the simulation of climate extremes that significantly alter climate change impacts. We simulate three time slices: 2006-2015, 2040-2060 and 2080-2100.
This new high-resolution climate dataset, with more than 200 variables saved at hourly intervals for the highest-resolution domain and six-hourly intervals for the outer two domains, along with the model input and restart files used in our WRF simulations, will be made publicly available to the broader scientific community to support in-depth climate change impacts assessments for New England. We present results focusing on future changes in New England extreme events.
Web-GIS approach for integrated analysis of heterogeneous georeferenced data
NASA Astrophysics Data System (ADS)
Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander; Shulgina, Tamara
2014-05-01
Georeferenced datasets are currently actively used for modeling, interpretation and forecasting of climatic and ecosystem changes on different spatial and temporal scales [1]. Due to the inherent heterogeneity of environmental datasets, as well as their huge size (up to tens of terabytes for a single dataset), dedicated software supporting studies in the climate and environmental change areas is required [2]. A dedicated information-computational system for integrated analysis of heterogeneous georeferenced climatological and meteorological data is presented. It is based on a combination of Web and GIS technologies following Open Geospatial Consortium (OGC) standards, and incorporates modern solutions such as an object-oriented programming model, modular composition, and JavaScript libraries built on the GeoExt library (http://www.geoext.org), the ExtJS framework (http://www.sencha.com/products/extjs) and the OpenLayers software (http://openlayers.org). The main advantage of the system lies in its capability to perform integrated analysis of time series of georeferenced data obtained from different sources (in-situ observations, model results, remote sensing data) and to combine the results in a single map [3, 4] as WMS and WFS layers in a web-GIS application. Analysis results are also available for download as binary files from the graphical user interface, or can be accessed directly through web mapping (WMS) and web feature (WFS) services for further processing by the user. Data processing is performed on a geographically distributed computational cluster comprising data storage systems and corresponding computational nodes.
Several geophysical datasets, represented by the NCEP/NCAR Reanalysis II, JMA/CRIEPI JRA-25 Reanalysis, ECMWF ERA-40 Reanalysis, ECMWF ERA-Interim Reanalysis, MRI/JMA APHRODITE's Water Resources Project Reanalysis, DWD Global Precipitation Climatology Centre data, GMAO Modern Era-Retrospective Analysis for Research and Applications, the reanalysis of the Monitoring Atmospheric Composition and Climate (MACC) Collaborated Project, NOAA-CIRES Twentieth Century Global Reanalysis Version II, NCEP Climate Forecast System Reanalysis (CFSR), meteorological observational data for the territory of the former USSR for the 20th century, results of modeling by global and regional climatological models, and others, are available for processing by the system. The web-GIS information-computational system for heterogeneous geophysical data analysis provides specialists involved in multidisciplinary research projects with reliable and practical instruments for integrated research of climate and ecosystem changes on global and regional scales. With its help, even a user with no programming skills can process and visualize multidimensional observational and model data through a unified web interface in a common graphical web browser. This work is partially supported by SB RAS project VIII.80.2.1, RFBR grants #13-05-12034 and #14-05-00502, and integrated project SB RAS #131. References 1. Gordov E.P., Lykosov V.N., Krupchatnikov V.N., Okladnikov I.G., Titov A.G., Shulgina T.M. Computational and information technologies for monitoring and modeling of climate changes and their consequences. - Novosibirsk: Nauka, Siberian branch, 2013. - 195 p. (in Russian) 2. Felice Frankel, Rosalind Reid. Big data: Distilling meaning from data // Nature. Vol. 455. N. 7209. P. 30. 3. T.M. Shulgina, E.P. Gordov, I.G. Okladnikov, A.G. Titov, E.Yu. Genina, N.P. Gorbatenko, I.V. Kuzhevskaya, A.S. Akhmetshina. Software complex for a regional climate change analysis. // Vestnik NGU.
Series: Information technologies. 2013. Vol. 11. Issue 1. P. 124-131 (in Russian). 4. I.G. Okladnikov, A.G. Titov, T.M. Shulgina, E.P. Gordov, V.Yu. Bogomolov, Yu.V. Martynova, S.P. Suschenko, A.V. Skvortsov. Software for analysis and visualization of climate change monitoring and forecasting data // Numerical methods and programming, 2013. Vol. 14. P. 123-131 (in Russian).
NASA Astrophysics Data System (ADS)
Scheibe, T. D.; Yang, X.; Song, X.; Chen, X.; Hammond, G. E.; Song, H. S.; Hou, Z.; Murray, C. J.; Tartakovsky, A. M.; Tartakovsky, G.; Yang, X.; Zachara, J. M.
2016-12-01
Drought-related tree mortality at a regional scale causes drastic shifts in carbon and water cycling in Southeast Asian tropical rainforests, where severe droughts are projected to occur more frequently, especially under El Niño conditions. To provide a useful tool for projecting tropical rainforest dynamics under climate change, we developed the Spatially Explicit Individual-Based (SEIB) Dynamic Global Vegetation Model (DGVM), applicable to simulating mechanistic tree mortality induced by climatic impacts via individual-tree-scale ecophysiology such as hydraulic failure and carbon starvation. In this study, we present the new model, the SEIB-originated Terrestrial Ecosystem Dynamics (S-TEDy) model, and compare its output with observations collected at a field site in a Bornean tropical rainforest. Furthermore, after validating the model's performance, we conduct numerical experiments addressing the future of the tropical rainforest using global climate model (GCM) simulation outputs.
NASA Astrophysics Data System (ADS)
Samuels, Rana
Water issues are a source of tension between Israelis and Palestinians. In the arid region of the Middle East, water supply is not just scarce but also uncertain: it is not uncommon for annual rainfall to be as little as 60% or as much as 125% of the multiannual average. This combination of scarcity and uncertainty exacerbates the already strained economy and the already tense political situation. The uncertainty could be alleviated if it were possible to better forecast water availability. Such forecasting is key not only for water planning and management, but also for economic policy and for political decision making. Water forecasts at multiple time scales are necessary for crop choice, aquifer operation and investments in desalination infrastructure. The unequivocal warming of the climate system adds another level of uncertainty as global and regional water cycles change. This makes the prediction of water availability an even greater challenge. Understanding the impact of climate change on precipitation can provide the information necessary for appropriate risk assessment and water planning. Unfortunately, current general circulation models (GCMs) are only able to predict long-term climatic evolution at large scales, not local rainfall. The statistics of local precipitation are traditionally predicted using historical rainfall data. Obviously these data cannot anticipate changes that result from climate change. It is therefore clear that integration of global information about climate evolution with local historical data is needed to provide the much needed predictions of regional water availability. Currently, there is no theoretical or computational framework that enables such integration for this region. In this dissertation, both a conceptual framework and a computational platform for such integration are introduced.
In particular, a suite of models that link forecasts of climatic evolution under different CO2 emissions scenarios to observed rainfall data from local stations is developed. These models are used to develop scenarios for local rainfall statistics such as average annual amounts, dry spells, wet spells and drought persistence. This suite of models can provide information that is not attainable from existing tools in terms of its spatial and temporal resolution. Specifically, the goal is to project the impact of established global climate change scenarios in this region, and to estimate how much of the change might be mitigated by proposed CO2 reduction strategies. A major problem in this enterprise is to find the best way to integrate global climatic information with local rainfall data. From the climatological perspective, the problem is to find the right teleconnections; that is, non-local or global measurable phenomena that influence local rainfall in a way that can be characterized and quantified statistically. From the computational perspective, the challenge is to model these subtle, nonlinear relationships and to downscale the global effects into local predictions. Climate simulations to the year 2100 under selected climate change scenarios are used. Overall, the suite of models developed and presented can be applied to answer most questions from the different water users and planners. Farmers and the irrigation community can ask "What is the probability of rain over the next week?" Policy makers can ask "How much desalination capacity will I need to meet demand 90% of the time in the climate change scenario over the next 20 years?" Aquifer managers can ask "What is the expected recharge rate of the aquifers over the next decade?" The use of climate-driven answers to these questions will help the region better prepare and adapt to future shifts in water resources and availability.
Validation of catchment models for predicting land-use and climate change impacts. 1. Method
NASA Astrophysics Data System (ADS)
Ewen, J.; Parkin, G.
1996-02-01
Computer simulation models are increasingly being proposed as tools capable of giving water resource managers accurate predictions of the impact of changes in land-use and climate. Previous validation testing of catchment models is reviewed, and it is concluded that the methods used do not clearly test a model's fitness for such a purpose. A new, generally applicable method is proposed. It involves the direct testing of fitness for purpose, uses established scientific techniques, and may be implemented within a quality-assured programme of work. The new method is applied in Part 2 of this study (Parkin et al., J. Hydrol., 175:595-613, 1996).
Forecasting conditional climate-change using a hybrid approach
Esfahani, Akbar Akbari; Friedel, Michael J.
2014-01-01
A novel approach is proposed to forecast the likelihood of climate change across spatial landscape gradients. This hybrid approach involves reconstructing past precipitation and temperature using the self-organizing map technique; determining quantile trends in the climate-change variables by quantile regression modeling; and computing conditional forecasts of climate-change variables based on self-similarity in quantile trends using the fractionally differenced auto-regressive integrated moving average (FARIMA) technique. The proposed modeling approach is applied to states (Arizona, California, Colorado, Nevada, New Mexico, and Utah) in the southwestern U.S., where conditional forecasts of climate-change variables are evaluated against recent (2012) observations, evaluated at a future time period (2030), and evaluated as future trends (2009–2059). These results have broad economic, political, and social implications because they quantify uncertainty in climate-change forecasts affecting various sectors of society. Another benefit of the proposed hybrid approach is that it can be extended to any spatiotemporal scale provided self-similarity exists.
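The fractionally differenced (FARIMA) component rests on the binomial expansion of the fractional differencing operator (1 − B)^d, whose weights follow the recursion w_0 = 1, w_k = w_{k−1}(k − 1 − d)/k. A minimal sketch of that long-memory filter follows; it is an illustration of the operator only, not the authors' full forecasting pipeline.

```python
import numpy as np

def frac_diff_weights(d, n):
    """Weights of (1 - B)^d = sum_k w_k B^k, the long-memory filter at
    the core of FARIMA models: w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k."""
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (k - 1 - d) / k)
    return np.array(w)

def frac_diff(series, d):
    """Apply (truncated) fractional differencing to a 1-D series."""
    w = frac_diff_weights(d, len(series))
    return np.array([np.dot(w[:t + 1], series[t::-1])
                     for t in range(len(series))])

w = frac_diff_weights(0.4, 5)   # 1, -0.4, -0.12, -0.064, -0.0416
```

Setting d = 1 recovers ordinary first differencing, while fractional 0 < d < 0.5 yields the slowly decaying weights that encode self-similar, long-memory behavior.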
CLIMLAB: a Python-based software toolkit for interactive, process-oriented climate modeling
NASA Astrophysics Data System (ADS)
Rose, B. E. J.
2015-12-01
Global climate is a complex emergent property of the rich interactions between simpler components of the climate system. We build scientific understanding of this system by breaking it down into component process models (e.g. radiation, large-scale dynamics, boundary layer turbulence), understanding each component, and putting them back together. Hands-on experience and the freedom to tinker with climate models (whether simple or complex) are invaluable for building physical understanding. CLIMLAB is an open-ended software engine for interactive, process-oriented climate modeling. With CLIMLAB you can interactively mix and match model components, or combine simpler process models into a more comprehensive model. It was created primarily to support classroom activities, using hands-on modeling to teach fundamentals of climate science at both undergraduate and graduate levels. CLIMLAB is written in Python and ties in with the rich ecosystem of open-source scientific Python tools for numerics and graphics. The IPython notebook format provides an elegant medium for distributing interactive example code. I will give an overview of the current capabilities of CLIMLAB, the curriculum we have developed thus far, and plans for the future. Using CLIMLAB requires some basic Python coding skills. We consider this an educational asset, as we are targeting upper-level undergraduates, and Python is an increasingly important language in STEM fields. However, CLIMLAB is also well suited to be deployed as a computational back-end for a graphical gaming environment based on earth-system modeling.
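The process-oriented idea — a climate model assembled from simple component processes — can be illustrated with a zero-dimensional energy-balance model in plain Python. This is a self-contained sketch of the concept only; the parameter values are textbook-typical and the function names are illustrative, not CLIMLAB's API.

```python
def step_ebm(T, dt_years=1.0, S0=1361.0, albedo=0.3, tau=0.61,
             sigma=5.67e-8, C=4.0e8):
    """One explicit time step of a zero-dimensional energy-balance model:
    C dT/dt = absorbed shortwave - outgoing longwave (grey atmosphere).
    Two 'process models' are combined: a shortwave and a longwave term."""
    ASR = (1 - albedo) * S0 / 4        # absorbed solar radiation, W/m^2
    OLR = tau * sigma * T ** 4         # outgoing longwave radiation, W/m^2
    dt = dt_years * 3.15e7             # seconds per year
    return T + dt * (ASR - OLR) / C    # heat capacity C in J/m^2/K

# Integrate to equilibrium from a cold start; converges near 288 K
T = 250.0
for _ in range(100):
    T = step_ebm(T)
```

Swapping in a different longwave scheme or adding a meridional diffusion term changes only one "process" while the rest of the model is untouched — the same compositional pattern CLIMLAB provides at a much richer level.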
Teleconnections in the Presence of Climate Change: A Case Study of the Annular Modes
NASA Astrophysics Data System (ADS)
Gerber, Edwin; Baldwin, Mark
2010-05-01
Long model integrations of future and past climates present a problem for defining teleconnection patterns through Empirical Orthogonal Function (EOF) or correlation analysis when trends in the underlying climate begin to dominate the covariance structure. Similar issues may soon appear in observations as the record becomes longer, especially if climate trends accelerate. The Northern and Southern Annular Modes provide a prime example, because the poleward shift of the jet streams projects strongly onto these patterns, particularly in the Southern Hemisphere. Climate forecasts of the 21st century by chemistry-climate models provide a case study. Computation of the annular modes in these long data sets with secular trends requires refinement of the standard definition of the annular mode, and a more robust procedure that allows for slowly varying trends is established and verified. The new procedure involves two key changes. First, the global mean geopotential height is removed at each time step before computing anomalies. This is particularly important high in the atmosphere, where seasonal variations in geopotential height become significant, and it filters out trends due to changes in the temperature structure of the atmosphere. Pattern definition can be very sensitive near the tropopause, as regions of the atmosphere that formerly had a more stratospheric character begin to take on tropospheric characteristics as the tropopause rises. The second change is to define anomalies relative to a slowly evolving seasonal climatology, so that the covariance structure reflects internal variability. Once these changes are accounted for, it is found that the zonal mean variability of the atmosphere stays remarkably constant, despite significant changes in the baseline climate forecast for the rest of the century. This stability of the internal variability makes it possible to relate trends in climate to teleconnections.
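The two refinements can be sketched in a few lines: remove the area-weighted global mean at each time step, subtract a slowly evolving climatology, then take the leading EOF. The implementation below is a hedged illustration on synthetic zonal-mean data; a simple running mean stands in for the slowly evolving climatology, and the function names are hypothetical.

```python
import numpy as np

def annular_mode(z, lat_weights, window=11):
    """Leading EOF of zonal-mean height anomalies, computed after
    (1) removing the weighted global mean at each time step and
    (2) removing a slowly evolving climatology (running mean in time),
    so that secular trends do not dominate the covariance."""
    w = lat_weights / lat_weights.sum()
    z = z - (z * w).sum(axis=1, keepdims=True)                  # step 1
    kernel = np.ones(window) / window
    clim = np.apply_along_axis(
        lambda col: np.convolve(col, kernel, mode="same"), 0, z)
    anom = z - clim                                             # step 2
    _, _, vt = np.linalg.svd(anom * lat_weights, full_matrices=False)
    return vt[0]                                                # leading EOF

# Synthetic test: a dipole-like mode plus a spatially uniform trend;
# the trend is removed by step 1, leaving the mode as EOF 1
rng = np.random.default_rng(2)
pattern = np.sin(np.linspace(0.0, 2.0 * np.pi, 31))
pc = rng.standard_normal(200)
trend = np.linspace(0.0, 5.0, 200)[:, None] * np.ones(31)
z = pc[:, None] * pattern + trend + 0.1 * rng.standard_normal((200, 31))
eof1 = annular_mode(z, np.ones(31))
```

Without step 1 the uniform trend would dominate the covariance and contaminate the leading EOF; with it, the internal dipole mode is recovered cleanly.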
NASA Astrophysics Data System (ADS)
Biercamp, Joachim; Adamidis, Panagiotis; Neumann, Philipp
2017-04-01
With the exa-scale era approaching, the length and time scales used for climate research on the one hand and numerical weather prediction on the other blend into each other. The Centre of Excellence in Simulation of Weather and Climate in Europe (ESiWACE) represents a European consortium comprising partners from climate, weather and HPC in their effort to address key scientific challenges that both communities have in common. A particular challenge is to reach global models with spatial resolutions that allow simulating convective clouds and small-scale ocean eddies. Such simulations would produce better predictions of trends and provide much more fidelity in the representation of high-impact regional events. However, running such models in operational mode, i.e., with sufficient throughput in ensemble mode, will clearly require exa-scale computing and data handling capability. We will discuss the ESiWACE initiative and relate it to work-in-progress on high-resolution simulations in Europe. We present recent strong-scalability measurements from ESiWACE to demonstrate current computability in weather and climate simulation. A special focus in this talk is on the Icosahedral Nonhydrostatic (ICON) model, used for a comparison of high-resolution regional and global simulations with high-quality observation data. We demonstrate that close-to-optimal parallel efficiency can be achieved in strong-scaling global-resolution experiments on Mistral/DKRZ, e.g. 94% for 5 km resolution simulations using 36k cores. Based on our scalability and high-resolution experiments, we deduce and extrapolate future capabilities for ICON that are expected for weather and climate research at exascale.
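Strong-scaling parallel efficiency of the kind quoted (94% at 36k cores) is conventionally computed relative to a smaller baseline run as E = (T_base · N_base) / (T · N). A minimal sketch with illustrative numbers, not the actual ESiWACE measurements:

```python
def strong_scaling_efficiency(base_cores, base_time, cores, time):
    """Strong-scaling parallel efficiency relative to a baseline run:
    E = (base_time * base_cores) / (time * cores). E = 1 means the
    runtime shrank exactly in proportion to the added cores."""
    return (base_time * base_cores) / (time * cores)

# Illustrative: doubling cores from 18k to 36k, runtime 200 s -> 106.4 s
eff = strong_scaling_efficiency(18000, 200.0, 36000, 106.4)   # ~0.94
```

Efficiencies this close to 1 at tens of thousands of cores are what justify the extrapolation to exa-scale capability mentioned above.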
Agricultural production and water use scenarios in Cyprus under global change
NASA Astrophysics Data System (ADS)
Bruggeman, Adriana; Zoumides, Christos; Camera, Corrado; Pashiardis, Stelios; Zomeni, Zomenia
2014-05-01
In many countries of the world, food demand exceeds the total agricultural production. In semi-arid countries, agricultural water demand often also exceeds the sustainable supply of water resources. These water-stressed countries are expected to become even drier as a result of global climate change. This will have a significant impact on the future of the agricultural sector and on food security. The aim of the AGWATER project consortium is to provide recommendations for climate change adaptation for the agricultural sector in Cyprus and the wider Mediterranean region. Gridded climate data sets with 1-km horizontal resolution were prepared for Cyprus for 1980-2010. Regional Climate Model results were statistically downscaled with the help of spatial weather generators. A new soil map was prepared using a predictive modelling and mapping technique and a large spatial database with soil and environmental parameters. Stakeholder meetings with agriculture and water stakeholders were held to develop future water prices based on energy scenarios and to identify climate-resilient production systems. Greenhouses (including hydroponic systems), grapes, potatoes, cactus pears and carob trees were the most frequently identified production systems. The green-blue-water model, based on the FAO-56 dual crop coefficient approach, has been set up to compute agricultural water demand and yields for all crop fields in Cyprus under selected future scenarios. A set of agricultural production and water use performance indicators is computed by the model, including green and blue water use, crop yield, crop water productivity, net value of crop production and economic water productivity. This work is part of the AGWATER project - AEIFORIA/GEOGRO/0311(BIE)/06 - co-financed by the European Regional Development Fund and the Republic of Cyprus through the Research Promotion Foundation.
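The FAO-56 dual crop coefficient approach at the heart of the green-blue-water model splits crop evapotranspiration into ETc = (Kcb + Ke) · ET0, with Kcb the basal (transpiration) coefficient and Ke the soil evaporation coefficient. The sketch below uses illustrative numbers and a deliberately simplified green/blue split; the project's model is considerably more detailed.

```python
def dual_kc_et(et0, kcb, ke):
    """FAO-56 dual crop coefficient: crop ET = (Kcb + Ke) * ET0, where
    Kcb covers transpiration and Ke covers soil evaporation (mm/day)."""
    return (kcb + ke) * et0

def green_blue_split(etc, effective_rain):
    """Illustrative split: green water is crop ET met by rainfall,
    blue water the remainder met by irrigation (a simplification of
    the water-balance accounting described in the abstract)."""
    green = min(etc, effective_rain)
    return green, etc - green

etc = dual_kc_et(et0=6.0, kcb=1.10, ke=0.15)        # 7.5 mm/day
green, blue = green_blue_split(etc, effective_rain=2.0)
```

Summed over a growing season and a crop area, the blue-water term yields the irrigation demand that the performance indicators (crop water productivity, economic water productivity) are built on.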
Global Warming, Africa and National Security
2008-01-15
African populations. This includes awareness from a global perspective in line with The Army Strategy for the Environment, the UN's Intergovernmental... attention. At the time, computer models did not indicate a significant issue with global warming, suggesting only a modest increase of 2°C... projected climate changes. Current Science: The science surrounding climate change and global warming was, until recently, a point of...
Educational process in modern climatology within the web-GIS platform "Climate"
NASA Astrophysics Data System (ADS)
Gordova, Yulia; Gorbatenko, Valentina; Gordov, Evgeny; Martynova, Yulia; Okladnikov, Igor; Titov, Alexander; Shulgina, Tamara
2013-04-01
The problem of training scientists, common to all fields, is exacerbated in the environmental sciences these days by the need to develop new computational and information technology skills in distributed multi-disciplinary teams. To address this and other pressing problems of the Earth system sciences, a software infrastructure for information support of integrated research in the geosciences was created on the basis of modern information and computational technologies, and the software and hardware platform "Climate" (http://climate.scert.ru/) was developed. In addition to direct analysis of geophysical data archives, the platform is aimed at teaching the basics of the study of changes in regional climate. The educational component of the platform includes a series of lectures on climate, environmental and meteorological modeling, and laboratory work cycles on the basics of analysis of current and potential future regional climate change, using the territory of Siberia as an example. The educational process within the platform is implemented using the distance learning system Moodle (www.moodle.org). This work is partially supported by the Ministry of Education and Science of the Russian Federation (contract #8345), SB RAS project VIII.80.2.1, RFBR grant #11-05-01190a, and integrated project SB RAS #131.
NASA Astrophysics Data System (ADS)
Hakkarinen, C.; Brown, D.; Callahan, J.; Hankin, S.; de Koningh, M.; Middleton-Link, D.; Wigley, T.
2001-05-01
A Web-based access system to climate model output data sets for intercomparison and analysis has been produced, using the NOAA-PMEL-developed Live Access Server software as the host server and Ferret as the data-serving and visualization engine. Called ARCAS ("ACACIA Regional Climate-data Access System"), and publicly accessible at http://dataserver.ucar.edu/arcas, the site currently serves climate model outputs from runs of the NCAR Climate System Model for the 21st century, for Business as Usual and Stabilization of Greenhouse Gas Emission scenarios. Users can select, download, and graphically display single variables or comparisons of two variables from either or both of the CSM model runs, averaged at monthly, seasonal, or annual time resolutions. The time length of the averaging period, and the geographical domain for download and display, are fully selectable by the user. A variety of arithmetic operations on the data variables can be computed "on-the-fly", as defined by the user. Expansions of the user-selectable analysis options, and of access to other DODS-compatible ("Distributed Ocean Data System"-compatible) data sets residing at locations other than the NCAR hardware server on which ARCAS operates, are planned for this year. These expansions are designed to give users quick and easy web-based access to the largest possible selection of climate model output data sets available throughout the world.
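The server-side averaging ARCAS offers (monthly, seasonal, or annual means over a user-chosen period) can be sketched in a few lines. This is an illustrative stand-in, not the Ferret/LAS implementation, and the calendar-quarter season definition is an assumption:

```python
import numpy as np

def average_periods(monthly, period="annual"):
    """Average a (12 * n_years,) monthly series to annual or seasonal means.

    `monthly` is a 1-D array of monthly values; "seasons" are taken as
    calendar quarters for simplicity (an assumption, not the ARCAS rule).
    """
    m = np.asarray(monthly, dtype=float)
    years = m.reshape(-1, 12)            # one row per year
    if period == "annual":
        return years.mean(axis=1)
    if period == "seasonal":             # four 3-month means per year
        return years.reshape(-1, 4, 3).mean(axis=2)
    if period == "monthly":
        return m
    raise ValueError(period)
```

The same reshape-then-mean pattern extends naturally to gridded (time, lat, lon) fields by averaging over the leading axis only.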
NASA Astrophysics Data System (ADS)
Li, Y.; Kurkute, S.; Chen, L.
2017-12-01
Results from General Circulation Models (GCMs) suggest more frequent and more severe extreme rain events in a climate warmer than the present. However, current GCMs cannot accurately simulate extreme rainfall events of short duration because of their coarse resolutions and parameterizations. This limitation makes it difficult to provide the detailed quantitative information needed for the development of regional adaptation and mitigation strategies. Dynamical downscaling using nested Regional Climate Models (RCMs) can capture key regional and local climate processes at an affordable computational cost. Recent studies have demonstrated that downscaling GCM results with convection-permitting mesoscale models, for example via the pseudo-global warming (PGW) technique, can be a viable and economical approach to obtaining valuable climate change information on regional scales. We have conducted a regional climate simulation with the Weather Research and Forecasting (WRF) model at 4-km resolution, with one domain covering the whole of western Canada, for a historical run (2000-2015) and a 15-year future run, representative of conditions toward 2100 and beyond, under PGW forcing. The 4-km resolution allows direct use of microphysics and resolves convection explicitly, providing convincing spatial detail. With this high-resolution simulation we are able to study the convective mechanisms, specifically the controls on convection over the Prairies, the projected changes in rainfall regimes, and the shift of convective mechanisms in a warming climate, which have never before been examined numerically at such a large scale and high resolution.
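The PGW forcing mentioned above is conceptually simple: present-day boundary conditions are perturbed by a GCM-derived climate-change delta, preserving observed variability while shifting the mean climate. A minimal sketch, with illustrative array shapes and names:

```python
import numpy as np

def pgw_boundary_condition(reanalysis, gcm_future, gcm_historical):
    """Pseudo-global-warming perturbation of boundary conditions:

        BC_future = reanalysis + (mean(GCM_future) - mean(GCM_historical))

    All inputs share a (time, lat, lon) layout; the climatological-mean
    delta is broadcast over the reanalysis time steps, so day-to-day
    weather variability is retained while the mean climate is shifted.
    """
    delta = gcm_future.mean(axis=0) - gcm_historical.mean(axis=0)
    return reanalysis + delta
```

In practice the delta is usually computed per calendar month and per variable (temperature, humidity, winds, SST); a single annual-mean delta is used here only to keep the sketch short.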
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huerta, Gabriel
The objective of the project is to develop strategies for better representing scientific sensibilities within statistical measures of model skill, which can then be used within a Bayesian statistical framework for data-driven climate model development and improved measures of model scientific uncertainty. One of the thorny issues in model evaluation is quantifying the effect of biases on climate projections. While no bias is desirable, only biases that affect feedbacks contribute to scatter in climate projections. The effort at the University of Texas is to analyze previously calculated ensembles of CAM3.1 with perturbed parameters to discover how biases affect projections of global warming. The hypothesis is that compensating errors in the control model can be identified by their effect on a combination of processes, and that developing metrics sensitive to dependencies among state variables would provide a way to select versions of climate models that may reduce scatter in climate projections. Gabriel Huerta at the University of New Mexico is responsible for developing statistical methods for evaluating these field dependencies. The UT effort will incorporate these developments into MECS, a set of Python scripts being developed at the University of Texas for managing the workflow associated with data-driven climate model development on HPC resources. This report reflects the main activities at the University of New Mexico, where the PI (Huerta) and the postdocs (Nosedal, Hattab and Karki) worked on the project.
NASA Astrophysics Data System (ADS)
Bader, D. C.
2015-12-01
The Accelerated Climate Modeling for Energy (ACME) project is concluding its first year. Supported by the Office of Science in the U.S. Department of Energy (DOE), its vision is to be "an ongoing, state-of-the-science Earth system modeling, simulation and prediction project that optimizes the use of DOE laboratory resources to meet the science needs of the nation and the mission needs of DOE." Included in the "laboratory resources" is a large investment in computational, network and information technologies that will be used both to build better and more accurate climate models and to broadly disseminate the data they generate. Current model diagnostic analysis and data dissemination technologies will not scale to the size of the simulations and the complexity of the models envisioned by ACME and other top-tier international modeling centers. In this talk, the ACME Workflow component's plans to meet these future needs will be described and early implementation examples will be highlighted.
NASA Technical Reports Server (NTRS)
Monteleoni, Claire; Schmidt, Gavin A.; Alexander, Francis J.; Niculescu-Mizil, Alexandru; Steinhaeuser, Karsten; Tippett, Michael; Banerjee, Arindam; Blumenthal, M. Benno; Ganguly, Auroop R.; Smerdon, Jason E.;
2013-01-01
The impacts of present and potential future climate change will be one of the most important scientific and societal challenges in the 21st century. Given observed changes in temperature, sea ice, and sea level, improving our understanding of the climate system is an international priority. This system is characterized by complex phenomena that are imperfectly observed and even more imperfectly simulated. But with an ever-growing supply of climate data from satellites and environmental sensors, the magnitude of data and climate model output is beginning to overwhelm the relatively simple tools currently used to analyze them. A computational approach will therefore be indispensable for these analysis challenges. This chapter introduces the fledgling research discipline climate informatics: collaborations between climate scientists and machine learning researchers in order to bridge this gap between data and understanding. We hope that the study of climate informatics will accelerate discovery in answering pressing questions in climate science.
NASA Astrophysics Data System (ADS)
Bhattacharya, D.; Forbes, C.; Roehrig, G.; Chandler, M. A.
2017-12-01
Promoting climate literacy among in-service science teachers necessitates an understanding of fundamental concepts about the Earth's climate system (USGCRP, 2009). Very few teachers report having any formal instruction in climate science (Plutzer et al., 2016); rather simple conceptions of climate systems and their variability therefore persist, with implications for students' science learning (Francies et al., 1993; Libarkin, 2005; Rebich, 2005). This study uses inferences from a NASA Innovations in Climate Education (NICE) teacher professional development program (CYCLES) to establish the need to develop an epistemological perspective among teachers. In CYCLES, 19 middle and high school teachers (male=8, female=11) were assessed for their understanding of global climate change (GCC). A qualitative analysis of their concept maps, and an alignment of their conceptions with the Essential Principles of Climate Literacy (NOAA, 2009), demonstrated that participants emphasized EPCL 1, 3, 6 and 7, which focus on the Earth system and the atmospheric, social and ecological impacts of GCC. However, EPCL 4 (variability in climate) and 5 (data-based observations and modeling) were least represented. Participants' descriptions of global climatic patterns were thus often factual rather than incorporating causation (why temperatures are increasing) or correlation (what other factors might influence global temperatures). Engaging with the epistemic dimensions of climate science, to understand the processes, tools, and norms through which climate scientists study the Earth's climate system (Huxter et al., 2013), is therefore critical for developing an in-depth conceptual understanding of climate.
CLiMES (Climate Modeling and Epistemology of Science), an NSF initiative, proposes to use EzGCM (EzGlobal Climate Model) to engage students and teachers in designing and running simulations, performing data processing activities, and analyzing computational models so that they can develop their own evidence-based claims about the Earth's climate system. We describe how epistemological investigations can be conducted using EzGCM to bring the scientific process and authentic climate science practice to middle and high school classrooms.
NASA Technical Reports Server (NTRS)
Elshorbany, Yasin F.; Duncan, Bryan N.; Strode, Sarah A.; Wang, James S.; Kouatchou, Jules
2015-01-01
We present the Efficient CH4-CO-OH Module (ECCOH), which allows for the simulation of the methane, carbon monoxide and hydroxyl radical (CH4-CO-OH) cycle within a chemistry climate model, carbon cycle model, or Earth system model. The computational efficiency of the module allows many multi-decadal sensitivity simulations of the CH4-CO-OH cycle, which primarily determines the global tropospheric oxidizing capacity. This capability is important for capturing the nonlinear feedbacks of the CH4-CO-OH system and understanding perturbations to relatively long-lived methane and the concomitant impacts on climate. We implemented the ECCOH module in the NASA GEOS-5 Atmospheric Global Circulation Model (AGCM), performed multiple sensitivity simulations of the CH4-CO-OH system over two decades, and evaluated the model output with surface and satellite datasets of methane and CO. The favorable comparison of output from the ECCOH module (as configured in the GEOS-5 AGCM) with observations demonstrates the fidelity of the module for use in scientific research.
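The nonlinearity of the CH4-CO-OH system that ECCOH is built to capture can be illustrated with a toy box model. The rate constants, units, and steady-state OH assumption below are purely illustrative, not the ECCOH parameterization:

```python
def ch4_co_oh_box(ch4, co, oh_source=1.0, k_ch4=1.0e-3, k_co=5.0e-3,
                  dt=1.0, n_steps=100):
    """Crude three-species box model of the CH4-CO-OH feedback.

    OH is assumed in instantaneous steady state: production `oh_source`
    balances loss to CH4 and CO.  CH4 oxidation is the only CO source;
    CO is lost only to OH; CH4 has no source here, so it slowly decays.
    """
    for _ in range(n_steps):
        oh = oh_source / (k_ch4 * ch4 + k_co * co)   # steady-state OH
        loss_ch4 = k_ch4 * oh * ch4
        ch4 -= dt * loss_ch4
        co += dt * (loss_ch4 - k_co * oh * co)
    return ch4, co, oh
```

The feedback shows up directly: a larger CH4 burden suppresses OH, which in turn lengthens the methane lifetime, the kind of nonlinearity that makes long sensitivity simulations valuable.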
NASA Technical Reports Server (NTRS)
Elshorbany, Yasin F.; Duncan, Bryan N.; Strode, Sarah A.; Wang, James S.; Kouatchou, Jules
2016-01-01
We present the Efficient CH4-CO-OH (ECCOH) chemistry module that allows for the simulation of the methane, carbon monoxide, and hydroxyl radical (CH4-CO-OH) system, within a chemistry climate model, carbon cycle model, or Earth system model. The computational efficiency of the module allows many multi-decadal sensitivity simulations of the CH4-CO-OH system, which primarily determines the global atmospheric oxidizing capacity. This capability is important for capturing the nonlinear feedbacks of the CH4-CO-OH system and understanding the perturbations to methane, CO, and OH, and the concomitant impacts on climate. We implemented the ECCOH chemistry module in the NASA GEOS-5 atmospheric global circulation model (AGCM), performed multiple sensitivity simulations of the CH4-CO-OH system over 2 decades, and evaluated the model output with surface and satellite data sets of methane and CO. The favorable comparison of output from the ECCOH chemistry module (as configured in the GEOS-5 AGCM) with observations demonstrates the fidelity of the module for use in scientific research.
A Bayesian Approach to Evaluating Consistency between Climate Model Output and Observations
NASA Astrophysics Data System (ADS)
Braverman, A. J.; Cressie, N.; Teixeira, J.
2010-12-01
Like other scientific and engineering problems that involve physical modeling of complex systems, climate models can be evaluated and diagnosed by comparing their output to observations of similar quantities. Though the global remote sensing data record is relatively short by climate research standards, these data offer opportunities to evaluate model predictions in new ways. For example, remote sensing data are spatially and temporally dense enough to provide distributional information that goes beyond simple moments to allow quantification of temporal and spatial dependence structures. In this talk, we propose a new method for exploiting these rich data sets using a Bayesian paradigm. For a collection of climate models, we calculate the posterior probability that each member best represents the physical system it seeks to reproduce. The posterior probability is based on the likelihood that a chosen summary statistic, computed from observations, would be obtained when the model's output is considered as a realization of a stochastic process. By exploring how posterior probabilities change with different statistics, we may paint a more quantitative and complete picture of the strengths and weaknesses of the models relative to the observations. We demonstrate our method using model output from the CMIP archive, and observations from NASA's Atmospheric Infrared Sounder.
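For a fixed summary statistic, the posterior computation described above reduces to Bayes' rule over the collection of models. A minimal sketch, assuming a uniform prior and taking the per-model log-likelihoods (which require the stochastic-process representation) as given:

```python
import numpy as np

def posterior_model_probs(log_likelihoods, prior=None):
    """Posterior probability that each model in a collection best
    reproduces an observed summary statistic.

    `log_likelihoods[i]` is log p(statistic | model i).  A uniform
    prior is assumed by default; the log-sum-exp shift keeps the
    exponentiation numerically stable.
    """
    ll = np.asarray(log_likelihoods, dtype=float)
    if prior is None:
        prior = np.full(ll.shape, 1.0 / ll.size)
    log_post = ll + np.log(prior)
    log_post -= log_post.max()          # stabilize before exponentiating
    post = np.exp(log_post)
    return post / post.sum()
```

Re-running this with different summary statistics (means, variances, spatial correlations) is exactly the sensitivity exploration the abstract proposes.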
NASA Technical Reports Server (NTRS)
Hope, W. W.; Johnson, L. P.; Obl, W.; Stewart, A.; Harris, W. C.; Craig, R. D.
2000-01-01
Faculty in the Department of Physical, Environmental and Computer Sciences strongly believe that undergraduate research and research-related activities must be integrated into the fabric of our undergraduate science and technology curricula. High-level skills, such as problem solving, reasoning, collaboration and the ability to engage in research, prepare students for advanced study in graduate school and for competing for well-paying positions in the scientific community. One goal of our academic programs is to build a pipeline of research activities from high school, to four-year college, to graduate school, based on the GISS Institute on Climate and Planets model.
NASA Astrophysics Data System (ADS)
Quiquet, Aurélien; Roche, Didier M.
2017-04-01
Comprehensive fully coupled ice sheet - climate models allowing for multi-millennia transient simulations are becoming available. They represent powerful tools to investigate ice sheet - climate interactions during the repeated retreats and advances of continental ice sheets of the Pleistocene. However, in such models the grid of the ice sheet component is usually one order of magnitude finer than that of the atmospheric component, so orography-induced precipitation is only poorly represented. In this work, we briefly present the most recent improvements of the ice sheet - climate coupling within the model of intermediate complexity iLOVECLIM. On the one hand, from the native atmospheric resolution (T21), we have included a dynamical downscaling of heat and moisture to the ice sheet model resolution (40 km x 40 km). This downscaling accounts for feedbacks of sub-grid precipitation on large-scale energy and water budgets. From the sub-grid atmospheric variables, we compute the ice sheet surface mass balance required by the ice sheet model. On the other hand, we also explicitly use oceanic temperatures to compute sub-shelf melting at a given depth. Based on palaeo evidence for rates of change of eustatic sea level, we discuss the capability of our new model to correctly simulate the last glacial inception (~116 ka BP) and the ice volume of the last glacial maximum (~21 ka BP). We show that the model performs well in certain areas (e.g. the Canadian archipelago) but that some model biases are consistent over time periods (e.g. the Kara-Barents sector). We explore various model sensitivities (e.g. initial state, vegetation, albedo) and discuss the importance of the downscaling of precipitation for ice nucleation over elevated areas and for the surface mass balance of larger ice sheets.
NASA Astrophysics Data System (ADS)
Janská, Veronika; Jiménez-Alfaro, Borja; Chytrý, Milan; Divíšek, Jan; Anenkhonov, Oleg; Korolyuk, Andrey; Lashchinskyi, Nikolai; Culek, Martin
2017-03-01
We modelled the European distribution of vegetation types at the Last Glacial Maximum (LGM) using present-day data from Siberia, a region hypothesized to be a modern analogue of European glacial climate. Distribution models were calibrated with current climate using 6274 vegetation-plot records surveyed in Siberia. Out of 22 initially used vegetation types, good or moderately good models in terms of statistical validation and expert-based evaluation were computed for 18 types, which were then projected to European climate at the LGM. The resulting distributions were generally consistent with reconstructions based on pollen records and dynamic vegetation models. Spatial predictions were most reliable for steppe, forest-steppe, taiga, tundra, fens and bogs in eastern and central Europe, which had LGM climate more similar to present-day Siberia. The models for western and southern Europe, regions with a lower degree of climatic analogy, were only reliable for mires and steppe vegetation, respectively. Modelling LGM vegetation types for the wetter and warmer regions of Europe would therefore require gathering calibration data from outside Siberia. Our approach adds value to the reconstruction of vegetation at the LGM, which is limited by scarcity of pollen and macrofossil data, suggesting where specific habitats could have occurred. Despite the uncertainties of climatic extrapolations and the difficulty of validating the projections for vegetation types, the integration of palaeodistribution modelling with other approaches has a great potential for improving our understanding of biodiversity patterns during the LGM.
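The distribution-modelling step (calibrate on present-day Siberian climate, project to LGM-Europe climate) can be caricatured with a tiny logistic-regression sketch. Real studies typically use GLMs, GAMs or ensemble methods, so this only illustrates the calibrate-then-project workflow:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=2000):
    """Minimal logistic-regression SDM: presence/absence of a vegetation
    type as a function of climate predictors (plain gradient descent)."""
    X1 = np.column_stack([np.ones(len(y)), X])   # add intercept column
    w = np.zeros(X1.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X1 @ w))
        w -= lr * X1.T @ (p - y) / len(y)
    return w

def project(w, X_new):
    """Apply the calibrated model to another region/time (e.g. LGM climate)."""
    X1 = np.column_stack([np.ones(len(X_new)), X_new])
    return 1.0 / (1.0 + np.exp(-X1 @ w))
```

The study's caveat maps directly onto this sketch: projections are only trustworthy where `X_new` stays within the climate range spanned by the calibration data, which is why the wetter, warmer parts of Europe were poorly modelled from Siberian plots.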
J. G. Isebrands; G. E. Host; K. Lenz; G. Wu; H. W. Stech
2000-01-01
Process models are powerful research tools for assessing the effects of multiple environmental stresses on forest plantations. These models are driven by interacting environmental variables and often include genetic factors necessary for assessing forest plantation growth over a range of different site, climate, and silvicultural conditions. However, process models are...
High Resolution Model Intercomparison Project (HighResMIP v1.0) for CMIP6
NASA Astrophysics Data System (ADS)
Haarsma, Reindert J.; Roberts, Malcolm J.; Vidale, Pier Luigi; Senior, Catherine A.; Bellucci, Alessio; Bao, Qing; Chang, Ping; Corti, Susanna; Fučkar, Neven S.; Guemas, Virginie; von Hardenberg, Jost; Hazeleger, Wilco; Kodama, Chihiro; Koenigk, Torben; Leung, L. Ruby; Lu, Jian; Luo, Jing-Jia; Mao, Jiafu; Mizielinski, Matthew S.; Mizuta, Ryo; Nobre, Paulo; Satoh, Masaki; Scoccimarro, Enrico; Semmler, Tido; Small, Justin; von Storch, Jin-Song
2016-11-01
Robust projections and predictions of climate variability and change, particularly at regional scales, rely on the driving processes being represented with fidelity in model simulations. The role of enhanced horizontal resolution in improved process representation in all components of the climate system is of growing interest, particularly as some recent simulations suggest both the possibility of significant changes in large-scale aspects of circulation as well as improvements in small-scale processes and extremes. However, such high-resolution global simulations at climate timescales, with resolutions of at least 50 km in the atmosphere and 0.25° in the ocean, have been performed at relatively few research centres and generally without overall coordination, primarily due to their computational cost. Assessing the robustness of the response of simulated climate to model resolution requires a large multi-model ensemble using a coordinated set of experiments. The Coupled Model Intercomparison Project 6 (CMIP6) is the ideal framework within which to conduct such a study, due to the strong link to models being developed for the CMIP DECK experiments and other model intercomparison projects (MIPs). Increases in high-performance computing (HPC) resources, as well as the revised experimental design for CMIP6, now enable a detailed investigation of the impact of increased resolution up to synoptic weather scales on the simulated mean climate and its variability. The High Resolution Model Intercomparison Project (HighResMIP) presented in this paper applies, for the first time, a multi-model approach to the systematic investigation of the impact of horizontal resolution. A coordinated set of experiments has been designed to assess both a standard and an enhanced horizontal-resolution simulation in the atmosphere and ocean. 
The set of HighResMIP experiments is divided into three tiers consisting of atmosphere-only and coupled runs and spanning the period 1950-2050, with the possibility of extending to 2100, together with some additional targeted experiments. This paper describes the experimental set-up of HighResMIP, the analysis plan, the connection with the other CMIP6 endorsed MIPs, as well as the DECK and CMIP6 historical simulations. HighResMIP thereby focuses on one of the CMIP6 broad questions, "what are the origins and consequences of systematic model biases?", but we also discuss how it addresses the World Climate Research Program (WCRP) grand challenges.
Scaling up: What coupled land-atmosphere models can tell us about critical zone processes
NASA Astrophysics Data System (ADS)
FitzGerald, K. A.; Masarik, M. T.; Rudisill, W. J.; Gelb, L.; Flores, A. N.
2017-12-01
A significant limitation to extending our knowledge of critical zone (CZ) evolution and function is a lack of hydrometeorological information at sufficiently fine spatial and temporal resolutions to resolve topo-climatic gradients, and with adequate spatial and temporal extent to capture a range of climatic conditions across ecoregions. Research at critical zone observatories (CZOs) suggests hydrometeorological stores and fluxes exert key controls on processes such as hydrologic partitioning and runoff generation, landscape evolution, soil formation, biogeochemical cycling, and vegetation dynamics. However, advancing fundamental understanding of CZ processes necessitates understanding how hydrometeorological drivers vary across space and time. As a result of recent advances in computational capabilities it has become possible, although still computationally expensive, to simulate hydrometeorological conditions via high-resolution coupled land-atmosphere models. Using the Weather Research and Forecasting (WRF) model, we developed a high spatiotemporal resolution dataset extending from water year 1987 to present for the Snake River Basin in the northwestern USA, including the Reynolds Creek and Dry Creek Experimental Watersheds, both part of the Reynolds Creek CZO, as well as a range of other ecosystems including shrubland desert, montane forests, and alpine tundra. Drawing from hypotheses generated by work at these sites and across the CZO network, we use the resulting dataset in combination with CZO observations and publicly available datasets to provide insights regarding hydrologic partitioning, vegetation distribution, and erosional processes. This dataset provides key context in interpreting and reconciling what observations obtained at particular sites reveal about underlying CZ structure and function.
While this dataset does not extend to future climates, the same modeling framework can be used to dynamically downscale coarse global climate model output to scales relevant to CZ processes. This presents an opportunity to better characterize the impact of climate change on the CZ. We also argue that opportunities exist beyond the one way flow of information and that what we learn at CZOs has the potential to contribute significantly to improved Earth system models.
NASA Astrophysics Data System (ADS)
Anantharaj, Valentine; Norman, Matthew; Evans, Katherine; Taylor, Mark; Worley, Patrick; Hack, James; Mayer, Benjamin
2014-05-01
During 2013, high-resolution climate model simulations accounted for over 100 million "core hours" using Titan at the Oak Ridge Leadership Computing Facility (OLCF). The suite of climate modeling experiments, primarily using the Community Earth System Model (CESM) at nearly 0.25 degree horizontal resolution, generated over a petabyte of data and nearly 100,000 files, ranging in sizes from 20 MB to over 100 GB. Effective utilization of leadership class resources requires careful planning and preparation. The application software, such as CESM, need to be ported, optimized and benchmarked for the target platform in order to meet the computational readiness requirements. The model configuration needs to be "tuned and balanced" for the experiments. This can be a complicated and resource intensive process, especially for high-resolution configurations using complex physics. The volume of I/O also increases with resolution; and new strategies may be required to manage I/O especially for large checkpoint and restart files that may require more frequent output for resiliency. It is also essential to monitor the application performance during the course of the simulation exercises. Finally, the large volume of data needs to be analyzed to derive the scientific results; and appropriate data and information delivered to the stakeholders. Titan is currently the largest supercomputer available for open science. The computational resources, in terms of "titan core hours" are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) and ASCR Leadership Computing Challenge (ALCC) programs, both sponsored by the U.S. Department of Energy (DOE) Office of Science. 
Titan, a Cray XK7 system capable of a theoretical peak performance of over 27 PFlop/s, consists of 18,688 compute nodes, each with an NVIDIA Kepler K20 GPU and a 16-core AMD Opteron CPU, for a total of 299,008 Opteron cores and 18,688 GPUs offering a cumulative 560,640 equivalent cores. Scientific applications, such as CESM, are also required to demonstrate a "computational readiness capability" to efficiently scale across and utilize 20% of the entire system. The 0.25° configuration of the spectral element dynamical core of the Community Atmosphere Model (CAM-SE), the atmospheric component of CESM, has been demonstrated to scale efficiently across more than 5,000 nodes (80,000 CPU cores) on Titan. The tracer transport routines of CAM-SE have also been ported to take advantage of the hybrid many-core architecture of Titan using GPUs [see EGU2014-4233], yielding over 2X speedup when transporting over 100 tracers. The high-throughput I/O in CESM, based on the Parallel IO library (PIO), is being further augmented to support even higher resolutions and enhance resiliency. The application performance of the individual runs is archived in a database and routinely analyzed to identify and rectify performance degradation during the course of the experiments. The various resources available at the OLCF now support a scientific workflow to facilitate high-resolution climate modelling. A high-speed, center-wide parallel file system called ATLAS, capable of 1 TB/s, is available on Titan as well as on the clusters used for analysis (Rhea) and visualization (Lens/EVEREST). Long-term archiving is facilitated by the HPSS storage system. The Earth System Grid (ESG), featuring search & discovery, is also used to deliver data. The end-to-end workflow allows OLCF users to efficiently share data and publish results in a timely manner.
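The quoted hardware figures are internally consistent, with each GPU counted as 14 equivalent cores (a factor inferred here from the totals, not stated in the abstract):

```python
nodes = 18_688
cpu_cores_per_node = 16
gpu_equiv_cores_per_node = 14   # inferred: (560,640 - 299,008) / 18,688

cpu_cores = nodes * cpu_cores_per_node                 # Opteron cores
equiv_cores = cpu_cores + nodes * gpu_equiv_cores_per_node
```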
NASA Astrophysics Data System (ADS)
Will, Andreas; Akhtar, Naveed; Brauch, Jennifer; Breil, Marcus; Davin, Edouard; Ho-Hagemann, Ha T. M.; Maisonnave, Eric; Thürkow, Markus; Weiher, Stefan
2017-04-01
We developed a coupled regional climate system model based on the CCLM regional climate model. Within this model system, using OASIS3-MCT as a coupler, CCLM can be coupled to two land surface models (the Community Land Model (CLM) and VEG3D), the NEMO-MED12 regional ocean model for the Mediterranean Sea, two ocean models for the North and Baltic seas (NEMO-NORDIC and TRIMNP+CICE) and the MPI-ESM Earth system model. We first present the different model components and the unified OASIS3-MCT interface which handles all couplings in a consistent way, minimising the model source code modifications and defining the physical and numerical aspects of the couplings. We also address specific coupling issues like the handling of different domains, multiple usage of the MCT library and exchange of 3-D fields. We analyse and compare the computational performance of the different couplings based on real-case simulations over Europe. The usage of the LUCIA tool implemented in OASIS3-MCT enables the quantification of the contributions of the coupled components to the overall coupling cost. These individual contributions are (1) the cost of the model(s) coupled, (2) the direct cost of coupling, including horizontal interpolation and communication between the components, (3) load imbalance, (4) the cost of the different usage of processors by CCLM in coupled and stand-alone mode and (5) the residual cost, including inter alia additional CCLM computations. Finally, a procedure for finding an optimum processor configuration for each of the couplings was developed, considering the time to solution, computing cost and parallel efficiency of the simulation. The optimum configurations are presented for sequential, concurrent and mixed (sequential+concurrent) coupling layouts. The procedure applied can be regarded as independent of the specific coupling layout and coupling details. We found that the direct cost of coupling, i.e.
communications and horizontal interpolation, in OASIS3-MCT remains below 7 % of the CCLM stand-alone cost for all couplings investigated. This is in particular true for the exchange of 450 2-D fields between CCLM and MPI-ESM. We identified remaining limitations in the coupling strategies and discuss possible future improvements of the computational efficiency.
NASA Astrophysics Data System (ADS)
Plegnière, Sabrina; Casper, Markus; Hecker, Benjamin; Müller-Fürstenberger, Georg
2014-05-01
Many models that calculate and assess climate change and its consequences are based on annual means of temperature and precipitation. This approach leads to many uncertainties, especially at the regional or local level: the results are unrealistic or too coarse. Particularly in agriculture, single events and the distribution of precipitation and temperature during the growing season have an enormous influence on plant growth. The temporal distribution of climate variables should therefore not be ignored. To this end, a high-resolution ecological-economic model was developed which combines a complex plant growth model (STICS) with an economic model. The input data of the plant growth model are daily climate values for a specific climate station calculated by the statistical climate model WETTREG. The economic model is deduced from the results of the plant growth model STICS. The chosen plant is corn, because corn is widely cultivated and used in many different ways. First, a sensitivity analysis showed that STICS is suitable for calculating, in a realistic way, the influences of different cultivation methods and climate on plant growth and yield as well as on soil fertility (e.g. through nitrate leaching). Additional simulations helped to estimate a production function, which is the key element of the economic model. The problems of using mean values of temperature and precipitation to compute a production function by linear regression are thereby pointed out. Several examples show why a linear regression based on mean climate values or a smoothed natural distribution leads to imperfect results, and why it is not possible to deduce a unique climate factor in the production function. One solution to this problem is the additional consideration of stress indices that capture the impairment of plants by water or nitrate shortage.
Thus, the resulting model takes into account not only the ecological factors (e.g. plant growth) and the economic factors as a simple monetary calculation, but also their mutual influences. Finally, the ecological-economic model enables us to perform risk assessments and evaluate adaptation strategies.
Evapotranspiration and canopy resistance at an undeveloped prairie in a humid subtropical climate
Bidlake, W.R.
2002-01-01
Reliable estimates of evapotranspiration from areas of wildland vegetation are needed for many types of water-resource investigations. However, little is known about surface fluxes from many areally important vegetation types, and relatively few comparisons have been made to examine how well evapotranspiration models can predict evapotranspiration for soil, climate, or vegetation types that differ from those under which the models were calibrated. In this investigation at a prairie site in west-central Florida, latent heat flux (λE) computed from the energy balance and alternatively by eddy covariance during a 15-month period differed by 4 percent and 7 percent on hourly and daily time scales, respectively. Annual evapotranspiration computed from the energy balance and by eddy covariance were 978 and 944 mm, respectively. An hourly Penman-Monteith (PM) evapotranspiration model with stomatal control predicated on water-vapor-pressure deficit at canopy level, incoming solar radiation intensity, and soil water deficit was developed and calibrated using surface fluxes from eddy covariance. Model-predicted λE agreed closely with λE computed from the energy balance except when moisture from dew or precipitation covered vegetation surfaces. Finally, an hourly PM model developed for an Amazonian pasture predicted λE for the Florida prairie with unexpected reliability. Additional comparisons of PM-type models that have been developed for differing types of short vegetation could aid in assessing the interchangeability of such models.
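The Penman-Monteith combination equation at the core of such models can be sketched as follows. This is a minimal illustration of the standard PM form, not the calibrated model of the study; the parameter values (aerodynamic and surface resistances, slope of the saturation vapor pressure curve, psychrometric constant) are illustrative placeholders.

```python
def penman_monteith_le(rn_minus_g, vpd, r_a, r_s,
                       delta=0.145, gamma=0.066,
                       rho_a=1.2, c_p=1004.0):
    """Latent heat flux (W m-2) from the Penman-Monteith equation.

    rn_minus_g : available energy, Rn - G (W m-2)
    vpd        : water-vapor-pressure deficit (kPa)
    r_a, r_s   : aerodynamic and bulk surface (stomatal) resistances (s m-1)
    delta      : slope of the saturation vapor pressure curve (kPa K-1)
    gamma      : psychrometric constant (kPa K-1)
    rho_a, c_p : air density (kg m-3) and specific heat of air (J kg-1 K-1)
    """
    numerator = delta * rn_minus_g + rho_a * c_p * vpd / r_a
    denominator = delta + gamma * (1.0 + r_s / r_a)
    return numerator / denominator

# Midday example with illustrative values: 400 W m-2 available energy,
# 1 kPa vapor pressure deficit, r_a = 50 s m-1, r_s = 100 s m-1.
le = penman_monteith_le(400.0, 1.0, 50.0, 100.0)
```

Calibration of such a model amounts to specifying how r_s responds to vapor pressure deficit, solar radiation, and soil water deficit, which is what the study fits against the eddy covariance fluxes.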
NASA Astrophysics Data System (ADS)
Lemaire, V. E. P.; Colette, A.; Menut, L.
2015-10-01
Because of its sensitivity to unfavorable weather patterns, air pollution is sensitive to climate change, so that in the future a climate penalty could jeopardize the expected efficiency of air pollution mitigation measures. A common method to assess the impact of climate on air quality consists of implementing chemistry-transport models forced by climate projections. However, the computational cost of this method requires optimized ensemble exploration techniques. Using a training dataset of deterministic projections of climate and air quality over Europe, we identified the main meteorological drivers of air quality for 8 regions in Europe and developed simple statistical models that can be used to predict air pollutant concentrations. The evolution of the key climate variables driving either particulate or gaseous pollution allows conclusions to be drawn about the robustness of the climate impact on air quality. The climate benefit for PM2.5 was confirmed: -0.96 (±0.18), -1.00 (±0.37), and -1.16 (±0.23) μg m-3 for Eastern Europe, Mid Europe, and Northern Italy, respectively. For the Eastern Europe, France, Iberian Peninsula, Mid Europe, and Northern Italy regions, a climate penalty on ozone was identified: 10.11 (±3.22), 8.23 (±2.06), 9.23 (±1.13), 6.41 (±2.14), and 7.43 (±2.02) μg m-3, respectively. This technique also allows the selection of a subset of relevant regional climate model members that should be used in priority for future deterministic projections.
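A statistical proxy of this kind can be as simple as a multiple linear regression of pollutant concentration on meteorological drivers. The sketch below uses synthetic data and hypothetical predictors (temperature, wind speed); the paper's actual predictors and functional form may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic meteorological drivers (illustrative units).
temperature = rng.normal(15.0, 8.0, n)   # deg C
wind_speed = rng.uniform(0.5, 12.0, n)   # m/s

# Synthetic "truth": ozone rises with temperature, falls with wind.
ozone = 40.0 + 2.0 * temperature - 1.5 * wind_speed + rng.normal(0.0, 2.0, n)

# Ordinary least squares fit: concentration ~ intercept + a*T + b*wind.
X = np.column_stack([np.ones(n), temperature, wind_speed])
coeffs, *_ = np.linalg.lstsq(X, ozone, rcond=None)

def predict(temp, wind):
    """Predict concentration from the fitted linear proxy model."""
    return coeffs[0] + coeffs[1] * temp + coeffs[2] * wind
```

Once trained on a deterministic chemistry-transport run, a model of this form can be evaluated across an ensemble of climate projections at negligible cost, which is the point of the emulation approach.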
A WPS Based Architecture for Climate Data Analytic Services (CDAS) at NASA
NASA Astrophysics Data System (ADS)
Maxwell, T. P.; McInerney, M.; Duffy, D.; Carriere, L.; Potter, G. L.; Doutriaux, C.
2015-12-01
Faced with unprecedented growth in the Big Data domain of climate science, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute trusted and tested analysis operations in a high performance environment close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using trusted climate data analysis tools (ESMF, CDAT, NCO, etc.). The framework is structured as a set of interacting modules allowing maximal flexibility in deployment choices. The current set of module managers include: Staging Manager: Runs the computation locally on the WPS server or remotely using tools such as celery or SLURM. Compute Engine Manager: Runs the computation serially or distributed over nodes using a parallelization framework such as celery or spark. Decomposition Manager: Manages strategies for distributing the data over nodes. Data Manager: Handles the import of domain data from long term storage and manages the in-memory and disk-based caching architectures. Kernel Manager: A kernel is an encapsulated computational unit which executes a processor's compute task. Each kernel is implemented in python exploiting existing analysis packages (e.g. CDAT) and is compatible with all CDAS compute engines and decompositions. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be executed using either direct web service calls, a python script or application, or a javascript-based web application. Client packages in python or javascript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. 
It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends, compare multiple reanalysis datasets, and examine variability.
Aircraft Emissions: Potential Effects on Ozone and Climate - A Review and Progress Report
1977-03-01
[Only fragments of this report survive extraction: a passage on emitted species that absorb and re-emit energy over a wide spectral range and on possible feedback effects beyond such direct effects, plus table-of-contents entries covering COMESA climatic effect studies, computed mean temperature effects of fleet operations, and appendix tables of the chemical kinetic mechanisms and rate coefficients for Models A and B.]
An Integrated Hydro-Economic Model for Economy-Wide Climate Change Impact Assessment for Zambia
NASA Astrophysics Data System (ADS)
Zhu, T.; Thurlow, J.; Diao, X.
2008-12-01
Zambia is a landlocked country in Southern Africa, with a total population of about 11 million and a total area of about 752 thousand square kilometers. Agriculture in the country depends heavily on rainfall, as the majority of cultivated land is rain-fed. Significant rainfall variability has been a major challenge to sustaining agricultural growth, an important condition for the country to meet the United Nations Millennium Development Goals. The situation is expected to become even more complex as climate change imposes additional impacts on rainwater availability and crop water requirements, among other changes. To understand the impacts of climate variability and change on agricultural production and the national economy, a soil hydrology model and a crop water production model were developed to simulate actual crop water use and yield losses under water stress, which provide annual shocks for a recursive dynamic computable general equilibrium (CGE) model developed for Zambia. Observed meteorological data from the past three decades are used in the integrated hydro-economic model for climate variability impact analysis, and as baseline climatology for climate change impact assessment together with several GCM-based climate change scenarios that cover a broad range of climate projections. We found that climate variability can explain a significant portion of the annual variations in agricultural production and GDP of Zambia in the past. Hidden beneath climate variability, climate change is found to have modest impacts on agriculture and the national economy of Zambia around 2025, but the impacts would be pronounced in the far future if appropriate adaptations are not implemented. Policy recommendations are provided based on scenario analysis.
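The link from water stress to yield loss in crop water production models is often expressed with an FAO-33 style yield response function. The sketch below is a standard textbook form, and the Ky value is an illustrative assumption, not necessarily the formulation used in this study.

```python
def relative_yield(eta, etm, ky):
    """FAO-33 style crop water production function:
    1 - Ya/Ym = Ky * (1 - ETa/ETm), clipped so that 0 <= Ya/Ym <= 1.

    eta : actual crop evapotranspiration (mm)
    etm : maximum (unstressed) crop evapotranspiration (mm)
    ky  : crop yield response factor (dimensionless)
    """
    loss = ky * (1.0 - eta / etm)
    return min(max(1.0 - loss, 0.0), 1.0)

# A 20% evapotranspiration deficit with ky = 1.25, an illustrative value
# for a water-sensitive crop, translates into a 25% yield loss.
ya_over_ym = relative_yield(400.0, 500.0, 1.25)
```

Annual relative-yield shocks of this kind are what a hydrology-driven crop module can pass year by year to a recursive dynamic CGE model.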
NASA Astrophysics Data System (ADS)
Gabaldon, Clara; Lorite, Ignacio J.; Ines Minguez, M.; Lizaso, Jon; Dosio, Alessandro; Sanchez, Enrique; Ruiz-Ramos, Margarita
2015-04-01
Extreme events of Tmax can threaten maize production in Andalusia (Ruiz-Ramos et al., 2011). The objective of this work is to quantify the effects of Tmax extreme events on the previously identified (Gabaldón et al., 2013) local adaptation strategies to climate change of irrigated maize in Andalusia for the first half of the 21st century. This study focuses on five locations in Andalusia. The local adaptation strategies identified consisted of combinations of changes in sowing dates and choice of cultivar (Gabaldón et al., 2013). Modified cultivar features were the duration of phenological phases and the grain filling rate. The phenological and yield simulations with the adaptive changes were obtained from a modelling chain: current simulated climate and future climate scenarios (2013-2050) were taken from a group of regional climate models at high resolution (25 km) from the European Project ENSEMBLES (http://www.ensembles-eu.org/). After bias correction of these data for temperature and precipitation (Dosio and Paruolo, 2011; Dosio et al., 2012), crop simulations were generated by the CERES-maize model (Jones and Kiniry, 1986) under the DSSAT platform, previously calibrated and validated. Quantification of the effects of extreme Tmax on maize yield was computed for different phenological stages following Teixeira et al. (2013). A heat stress index was computed; this index assumes that yield-damage intensity due to heat stress increases linearly from 0.0 at a critical temperature to a maximum of 1.0 at a limit temperature. The decrease in crop yield is then computed by a normalized production damage index that combines attainable yield and the heat stress index for each location. Selection of the most suitable adaptation strategy will be reviewed and discussed in light of the quantified effect on crop yield of the projected change in Tmax extreme events. 
This study will contribute to the MACSUR knowledge hub within the Joint Programming Initiative on Agriculture, Food Security and Climate Change (FACCE-JPI) of the EU and is financed by the MULCLIVAR project (CGL2012-38923-C02-02) and IFAPA project AGR6126 from Junta de Andalucía, Spain. References: Dosio A. and Paruolo P., 2011. Bias correction of the ENSEMBLES high-resolution climate change projections for use by impact models: Evaluation on the present climate. Journal of Geophysical Research, Vol. 116, D16106, doi:10.1029/2011JD015934. Dosio A., Paruolo P. and Rojas R., 2012. Bias correction of the ENSEMBLES high resolution climate change projections for use by impact models: Analysis of the climate change signal. Journal of Geophysical Research, Vol. 117, D17, doi:10.1029/2012JD017968. Gabaldón C., Lorite I.J., Mínguez M.I., Dosio A., Sánchez-Sánchez E. and Ruiz-Ramos M., 2013. Evaluation of local adaptation strategies to climate change of maize crop in Andalusia for the first half of 21st century. Geophysical Research Abstracts, Vol. 15, EGU2013-13625. EGU General Assembly 2013, April 2013, Vienna, Austria. Jones C.A. and J.R. Kiniry, 1986. CERES-Maize: A simulation model of maize growth and development. Texas A&M Univ. Press, College Station. Ruiz-Ramos M., E. Sanchez, C. Gallardo, and M.I. Minguez, 2011. Impacts of projected maximum temperature extremes for C21 by an ensemble of regional climate models on cereal cropping systems in the Iberian Peninsula. Natural Hazards and Earth System Science 11: 3275-3291. Teixeira E.I., Fischer G., van Velthuizen H., Walter C. and Ewert F., 2013. Global hotspots of heat stress on agricultural crops due to climate change. Agricultural and Forest Meteorology 170: 206-215.
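The heat stress index described above (a linear ramp between a critical and a limit temperature, after Teixeira et al., 2013) can be sketched directly. The 35 °C and 45 °C thresholds in the example are illustrative assumptions, not the calibrated values of the study.

```python
def heat_stress_index(tmax, t_crit, t_limit):
    """Damage intensity rising linearly from 0.0 at t_crit to 1.0 at t_limit."""
    if t_limit <= t_crit:
        raise ValueError("t_limit must exceed t_crit")
    return min(max((tmax - t_crit) / (t_limit - t_crit), 0.0), 1.0)

def production_damage(attainable_yield, tmax, t_crit=35.0, t_limit=45.0):
    """Yield lost to heat stress: attainable yield scaled by the stress index."""
    return attainable_yield * heat_stress_index(tmax, t_crit, t_limit)
```

Applied per phenological stage and location, the index lets a single Tmax series be translated into a normalized production damage estimate.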
The thermal influence of continents on a model-generated January climate
NASA Technical Reports Server (NTRS)
Spar, J.; Cohen, C.; Wu, P.
1981-01-01
Two climate simulations were compared. Both were initialized from the same horizontally uniform state of rest; however, one was carried out on a water planet (without continents), while the second was carried out on a planet with geographically realistic but flat (sea-level) continents. The continents in this experiment have a uniform albedo of 0.14 (except where snow accumulates), a uniform roughness height of 0.3 m, and zero water storage capacity. Both runs were carried out for a 'perpetual January' with solar declination fixed at January 15.
Biological production models as elements of coupled, atmosphere-ocean models for climate research
NASA Technical Reports Server (NTRS)
Platt, Trevor; Sathyendranath, Shubha
1991-01-01
Process models of phytoplankton production are discussed with respect to their suitability for incorporation into global-scale numerical ocean circulation models. Exact solutions are given for integrals over the mixed layer and over the day for analytic, wavelength-independent models of primary production. Within this class of model, the bias incurred by using a triangular approximation (rather than a sinusoidal one) to the variation of surface irradiance through the day is computed. Efficient computation algorithms are given for the nonspectral models. More exact calculations require a spectrally sensitive treatment. Such models exist but must be integrated numerically over depth and time. For these integrations, resolution in wavelength, depth, and time is considered, and recommendations are made for efficient computation. The extrapolation of the one-(spatial)-dimension treatment to large horizontal scales is discussed.
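For the daily irradiance integral alone, the bias of the triangular approximation can be worked out in closed form. The sketch below illustrates only that geometric part of the comparison; the paper's bias calculation concerns integrated production, which also depends on the photosynthesis-light response.

```python
import math

def daily_integral_sinusoidal(i_max, daylength):
    """Integral of I(t) = i_max * sin(pi * t / D) over the daylight
    period D, which evaluates to 2 * i_max * D / pi."""
    return 2.0 * i_max * daylength / math.pi

def daily_integral_triangular(i_max, daylength):
    """Triangle rising from 0 at sunrise to i_max at noon and back to 0
    at sunset: area = i_max * D / 2."""
    return 0.5 * i_max * daylength

# The relative bias of the triangle is pi/4 - 1 (about -21.5%),
# independent of i_max and daylength.
bias = daily_integral_triangular(1.0, 1.0) / daily_integral_sinusoidal(1.0, 1.0) - 1.0
```

Because production responds nonlinearly to irradiance, the bias in integrated production generally differs from this -21.5% figure, which is why it must be computed within the production model itself.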
NASA Astrophysics Data System (ADS)
Perkins, W. A.; Hakim, G. J.
2016-12-01
In this work, we examine the skill of a new approach to performing climate field reconstructions (CFRs) using a form of online paleoclimate data assimilation (PDA). Many previous studies have foregone climate model forecasts during assimilation due to the computational expense of running coupled global climate models (CGCMs), and the relatively low skill of these forecasts on longer timescales. Here we greatly diminish the computational costs by employing an empirical forecast model (known as a linear inverse model; LIM), which has been shown to have comparable skill to CGCMs. CFRs of annually averaged 2m air temperature anomalies are compared between the Last Millennium Reanalysis framework (no forecasting or "offline"), a persistence forecast, and four LIM forecasting experiments over the instrumental period (1850 - 2000). We test LIM calibrations for observational (Berkeley Earth), reanalysis (20th Century Reanalysis), and CMIP5 climate model (CCSM4 and MPI) data. Generally, we find that the usage of LIM forecasts for online PDA increases reconstruction agreement with the instrumental record for both spatial and global mean temperature (GMT). The detrended GMT skill metrics show the most dramatic increases in skill with coefficient of efficiency (CE) improvements over the no-forecasting benchmark averaging 57%. LIM experiments display a common pattern of spatial field increases in CE skill over northern hemisphere land areas and in the high-latitude North Atlantic - Barents Sea corridor (Figure 1). However, the non-GCM-calibrated LIMs introduce other deficiencies into the spatial skill of these reconstructions, likely due to aspects of the LIM calibration process. Overall, the CMIP5 LIMs have the best performance when considering both spatial fields and GMT. A comparison with the persistence forecast experiment suggests that improvements are associated with the usage of the LIM forecasts, and not simple persistence of temperature anomalies over time. 
These results show that the use of LIM forecasting can help add further dynamical constraint to CFRs. As we move forward, this will be an important factor in fully utilizing dynamically consistent information from the proxy record while reconstructing the past millennium.
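The linear inverse model at the heart of the forecasting step propagates an anomaly state vector forward with an operator estimated from lagged covariances, G(τ) = C(τ)C(0)^(-1). The sketch below shows that standard estimate in its minimal form; the study's actual calibration (in a reduced EOF space, with noise terms) involves more steps.

```python
import numpy as np

def lim_operator(states, lag=1):
    """Estimate the LIM propagator G(tau) = C(tau) @ C(0)^{-1} from a
    (time x variable) matrix of anomalies, so that x(t+lag) ~ G @ x(t)."""
    x0 = states[:-lag]                    # states at time t
    x1 = states[lag:]                     # states at time t + lag
    c0 = x0.T @ x0 / x0.shape[0]          # zero-lag covariance C(0)
    c_lag = x1.T @ x0 / x0.shape[0]       # lag covariance C(tau)
    return c_lag @ np.linalg.inv(c0)

def lim_forecast(g, x, steps=1):
    """Roll the propagator forward to forecast several lags ahead."""
    for _ in range(steps):
        x = g @ x
    return x
```

In an online assimilation cycle, `lim_forecast` supplies the prior for the next analysis step at a tiny fraction of the cost of a coupled GCM forecast.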
weather@home 2: validation of an improved global-regional climate modelling system
NASA Astrophysics Data System (ADS)
Guillod, Benoit P.; Jones, Richard G.; Bowery, Andy; Haustein, Karsten; Massey, Neil R.; Mitchell, Daniel M.; Otto, Friederike E. L.; Sparrow, Sarah N.; Uhe, Peter; Wallom, David C. H.; Wilson, Simon; Allen, Myles R.
2017-05-01
Extreme weather events can have large impacts on society and, in many regions, are expected to change in frequency and intensity with climate change. Owing to the relatively short observational record, climate models are useful tools as they allow for generation of a larger sample of extreme events, to attribute recent events to anthropogenic climate change, and to project changes in such events into the future. The modelling system known as weather@home, consisting of a global climate model (GCM) with a nested regional climate model (RCM) and driven by sea surface temperatures, allows one to generate a very large ensemble with the help of volunteer distributed computing, and is a key tool for understanding many aspects of extreme events. Here, a new version of the weather@home system (weather@home 2) with a higher-resolution RCM over Europe is documented and a broad validation of the climate is performed. The new model includes a more recent land-surface scheme in both GCM and RCM, where subgrid-scale land-surface heterogeneity is newly represented using tiles, and an increase in RCM resolution from 50 to 25 km. The GCM performs similarly to the previous version, with some improvements in the representation of mean climate. The European RCM temperature biases are overall reduced, in particular the warm bias over eastern Europe, but large biases remain. Precipitation is improved over the Alps in summer, with mixed changes in other regions and seasons. The model is shown to represent the main classes of regional extreme events reasonably well and shows a good sensitivity to its drivers. In particular, given the improvements in this version of the weather@home system, it is likely that more reliable statements can be made about impacts, especially at more localized scales.
Enhancement of Local Climate Analysis Tool
NASA Astrophysics Data System (ADS)
Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.
2012-12-01
The National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users, including the energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level, such as time series analysis, trend analysis, compositing, and correlation and regression techniques, with others to be incorporated as needed. LCAT applies principles of artificial intelligence to connect human and computer perspectives on the application of data and scientific techniques, and to process multiple simultaneous users' tasks. Future development includes expanding the types of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR) and NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer-term projection models, as well as plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data, to ensure there is no redundancy in the development of tools that facilitate scientific advancements and the use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open-sourcing LCAT development, in particular through the University Corporation for Atmospheric Research (UCAR).
NASA Astrophysics Data System (ADS)
Washington, W. M.
2010-12-01
The development of climate and earth system models has been regarded primarily as the making of scientific tools to study the complex nature of the Earth's climate. These models have a long history, starting with very simple physical models based on fundamental physics in the 1960s; over time they have become much more complex, with atmospheric, ocean, sea ice, land/vegetation, biogeochemical, glacial, and ecological components. In the 1960s and 1970s these models were not used as decision-making tools; rather, they were used to answer fundamental scientific questions, such as what happens when the atmospheric carbon dioxide concentration increases or is doubled. They gave insights into the various interactions and were extensively compared with observations. It was realized that models of the earlier time periods could only give first-order answers to many of the fundamental policy questions. As societal concerns about climate change rose, the policy questions of anthropogenic climate change became better defined; they were mostly concerned with the climate impacts of increasing greenhouse gases, aerosols, and land cover change. In the late 1980s, the United Nations set up the Intergovernmental Panel on Climate Change to perform assessments of the published literature. Thus, the development of climate and Earth system models became intimately linked to the need not only to improve our scientific understanding but also to answer fundamental policy questions. To meet this challenge, the models became more complex and realistic so that they could address policy-oriented science questions such as rising sea level. The presentation will discuss the past and future development of global climate and earth system models for science and policy purposes, as well as their interactions with economic integrated assessment models and with regional and specialized models such as river transport or ecological components. 
As an example of one development pathway, the NSF/Department of Energy supported Community Climate System and Earth System Models will be featured in the presentation. Computational challenges will also be part of the discussion.
Mid-21st century projections of hydroclimate in Western Himalayas and Satluj River basin
NASA Astrophysics Data System (ADS)
Tiwari, Sarita; Kar, Sarat C.; Bhatla, R.
2018-02-01
The Himalayan climate system is sensitive to global warming and climate change. Regional hydrology and the downstream water flow in rivers of Himalayan origin may change due to variations in snow and glacier melt in the region. This study examines mid-21st century climate projections over the western Himalayas from the Coupled Model Intercomparison Project Phase 5 (CMIP5) global climate models under Representative Concentration Pathway (RCP) scenarios RCP4.5 and RCP8.5. All the global climate models used in the present analysis indicate that the study region will be warmer by mid-century. The temperature trends from all the models studied here are statistically significant at the 95% confidence level. Multi-model ensemble spreads show that there are large differences among the models in their projections of future climate, with the spread in temperature ranging from about 1.5 °C to 5 °C over various areas of the western Himalayas in all seasons. The spread in precipitation projections lies between 0.3 and 1 mm/day in all seasons. A major shift in the timing of evaporation maxima and minima is noticed. The GFDL_ESM2G model products have been downscaled to the Satluj River basin using the Weather Research and Forecasting (WRF) model, and the impact of climate change on streamflow has been studied. The reduction of precipitation during JJAS is expected to be > 3-6 mm/day in RCP8.5 as compared to the present climate. It is expected that the precipitation amount will increase over the Satluj basin in the future (mid-21st century). The soil and water assessment tool (SWAT) model has been used to simulate the Satluj streamflow for the present and future climate using GFDL_ESM2G precipitation and temperature data as well as the WRF model downscaled data. The computations using the global model data show that the total annual discharge from Satluj will be less in the future than in the present climate, especially in the peak discharge season (JJAS). 
The SWAT model with downscaled output indicates that during winter and spring, more discharge shall occur in future (RCP8.5) in Satluj River.
Building Systems from Scratch: an Exploratory Study of Students Learning About Climate Change
NASA Astrophysics Data System (ADS)
Puttick, Gillian; Tucker-Raymond, Eli
2018-01-01
Science and computational practices such as modeling and abstraction are critical to understanding the complex systems that are integral to climate science. Given the demonstrated affordances of game design in supporting such practices, we implemented a free 4-day intensive workshop for middle school girls that focused on using the visual programming environment, Scratch, to design games to teach others about climate change. The experience was carefully constructed so that girls of widely differing levels of experience were able to engage in a cycle of game design. This qualitative study aimed to explore the representational choices the girls made as they took up aspects of climate change systems and modeled them in their games. Evidence points to the ways in which designing games about climate science fostered emergent systems thinking and engagement in modeling practices as learners chose what to represent in their games, grappled with the realism of their respective representations, and modeled interactions among systems components. Given the girls' levels of programming skill, parts of systems were more tractable to create than others. The educational purpose of the games was important to the girls' overall design experience, since it influenced their choice of topic, and challenged their emergent understanding of climate change as a systems problem.
Comparison of GEOS-5 AGCM planetary boundary layer depths computed with various definitions
NASA Astrophysics Data System (ADS)
McGrath-Spangler, E. L.; Molod, A.
2014-07-01
Accurate models of planetary boundary layer (PBL) processes are important for forecasting weather and climate. The present study compares seven methods of calculating PBL depth in the GEOS-5 atmospheric general circulation model (AGCM) over land. These methods depend on the eddy diffusion coefficients, bulk and local Richardson numbers, and the turbulent kinetic energy. The computed PBL depths are aggregated to the Köppen-Geiger climate classes, and some limited comparisons are made using radiosonde profiles. Most methods produce similar midday PBL depths, although in the warm, moist climate classes the bulk Richardson number method gives midday results that are lower than those given by the eddy diffusion coefficient methods. Additional analysis revealed that methods sensitive to turbulence driven by radiative cooling produce greater PBL depths, this effect being most significant during the evening transition. Nocturnal PBLs based on Richardson number methods are generally shallower than eddy diffusion coefficient based estimates. The bulk Richardson number estimate is recommended as the PBL height to inform the choice of the turbulent length scale, based on the similarity to other methods during the day, and the improved nighttime behavior.
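A common textbook form of the bulk Richardson number diagnosis can be sketched as follows. The critical value of 0.25 and the details of the formulation (surface wind handling, thermal excess terms) are assumptions of this sketch; GEOS-5's actual implementation may differ.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m s-2

def pbl_height_bulk_ri(z, theta_v, u, v, ri_crit=0.25):
    """PBL height as the first level where the bulk Richardson number,
    computed relative to the lowest level, exceeds ri_crit, with linear
    interpolation between the bracketing levels.

    z       : heights above ground (m), ascending
    theta_v : virtual potential temperature profile (K)
    u, v    : horizontal wind components (m s-1)
    """
    wind2 = u**2 + v**2
    wind2 = np.where(wind2 < 1e-6, 1e-6, wind2)  # avoid division by zero
    ri = np.zeros_like(z, dtype=float)
    ri[1:] = (G / theta_v[0]) * (theta_v[1:] - theta_v[0]) \
             * (z[1:] - z[0]) / wind2[1:]
    above = np.where(ri > ri_crit)[0]
    if above.size == 0:
        return float(z[-1])  # no stable layer found within the profile
    k = above[0]
    frac = (ri_crit - ri[k - 1]) / (ri[k] - ri[k - 1])
    return float(z[k - 1] + frac * (z[k] - z[k - 1]))
```

Because the diagnosis only needs mean-state profiles, not turbulence fields, it behaves consistently day and night, which is part of why the study recommends it for setting the turbulent length scale.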
Comparison of GEOS-5 AGCM Planetary Boundary Layer Depths Computed with Various Definitions
NASA Technical Reports Server (NTRS)
Mcgrath-Spangler, E. L.; Molod, A.
2014-01-01
Accurate models of planetary boundary layer (PBL) processes are important for forecasting weather and climate. The present study compares seven methods of calculating PBL depth in the GEOS-5 atmospheric general circulation model (AGCM) over land. These methods depend on the eddy diffusion coefficients, bulk and local Richardson numbers, and the turbulent kinetic energy. The computed PBL depths are aggregated to the Koppen climate classes, and some limited comparisons are made using radiosonde profiles. Most methods produce similar midday PBL depths, although in the warm, moist climate classes, the bulk Richardson number method gives midday results that are lower than those given by the eddy diffusion coefficient methods. Additional analysis revealed that methods sensitive to turbulence driven by radiative cooling produce greater PBL depths, this effect being most significant during the evening transition. Nocturnal PBLs based on Richardson number are generally shallower than eddy diffusion coefficient based estimates. The bulk Richardson number estimate is recommended as the PBL height to inform the choice of the turbulent length scale, based on the similarity to other methods during the day, and the improved nighttime behavior.
Comparison of GEOS-5 AGCM planetary boundary layer depths computed with various definitions
NASA Astrophysics Data System (ADS)
McGrath-Spangler, E. L.; Molod, A.
2014-03-01
Accurate models of planetary boundary layer (PBL) processes are important for forecasting weather and climate. The present study compares seven methods of calculating PBL depth in the GEOS-5 atmospheric general circulation model (AGCM) over land. These methods depend on the eddy diffusion coefficients, bulk and local Richardson numbers, and the turbulent kinetic energy. The computed PBL depths are aggregated to the Köppen climate classes, and some limited comparisons are made using radiosonde profiles. Most methods produce similar midday PBL depths, although in the warm, moist climate classes, the bulk Richardson number method gives midday results that are lower than those given by the eddy diffusion coefficient methods. Additional analysis revealed that methods sensitive to turbulence driven by radiative cooling produce greater PBL depths, this effect being most significant during the evening transition. Nocturnal PBLs based on Richardson number are generally shallower than eddy diffusion coefficient based estimates. The bulk Richardson number estimate is recommended as the PBL height to inform the choice of the turbulent length scale, based on the similarity to other methods during the day, and the improved nighttime behavior.
New methods in hydrologic modeling and decision support for culvert flood risk under climate change
NASA Astrophysics Data System (ADS)
Rosner, A.; Letcher, B. H.; Vogel, R. M.; Rees, P. S.
2015-12-01
Assessing culvert flood vulnerability under climate change poses an unusual combination of challenges. We seek a robust method of planning for an uncertain future, and therefore must consider a wide range of plausible future conditions. Culverts in our case study area, northwestern Massachusetts, USA, are predominantly found in small, ungaged basins. The need to predict flows both at numerous sites and under numerous plausible climate conditions requires a statistical model with low data and computational requirements. We present a statistical streamflow model that is driven by precipitation and temperature, allowing us to predict flows without reliance on reference gages of observed flows. The hydrological analysis is used to determine each culvert's risk of failure under current conditions. We also explore the hydrological response to a range of plausible future climate conditions. These results are used to determine the tolerance of each culvert to future increases in precipitation. In a decision support context, current flood risk as well as tolerance to potential climate changes are used to provide a robust assessment and prioritization for culvert replacements.
Determination of the Changes of Drought Occurrence in Turkey Using Regional Climate Modeling
NASA Astrophysics Data System (ADS)
Sibel Saygili, Fatma; Tufan Turp, M.; Kurnaz, M. Levent
2017-04-01
As a consequence of the negative impacts of climate change, Turkey, a country in the Mediterranean Basin, is under serious risk of increased drought conditions. This study aims to determine and compare the spatial distributions of climatological drought probabilities for Turkey. For this purpose, using the Regional Climate Model (RegCM4.4) of the Abdus Salam International Centre for Theoretical Physics (ICTP), the outputs of the MPI-ESM-MR global climate model of the Max Planck Institute for Meteorology are downscaled to 50 km over Turkey. To make the future projection over Turkey for the period 2071-2100 with respect to the reference period 1986-2005, the worst-case emission pathway RCP8.5 is used. The Palmer Drought Severity Index (PDSI) values are computed and classified according to the seven classes of the National Oceanic and Atmospheric Administration (NOAA). Finally, spatial distribution maps showing the changes in drought probabilities over Turkey are obtained in order to see the impact of climate change on Turkey's drought patterns.
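The seven-class PDSI scheme referenced above can be sketched as a simple threshold mapping. The class names and cutoffs below follow one commonly cited convention (±2, ±3, ±4); the exact labels used by NOAA in the study are an assumption.

```python
# Hedged sketch of a seven-class PDSI classification; the thresholds and class
# names below are one common convention, assumed here for illustration.
def pdsi_class(pdsi):
    """Map a PDSI value to one of seven wetness/drought classes."""
    if pdsi >= 4.0:
        return "extremely wet"
    if pdsi >= 3.0:
        return "very wet"
    if pdsi >= 2.0:
        return "moderately wet"
    if pdsi > -2.0:
        return "near normal"
    if pdsi > -3.0:
        return "moderate drought"
    if pdsi > -4.0:
        return "severe drought"
    return "extreme drought"
```

Applied grid cell by grid cell to the reference and projection periods, such a mapping yields the change-in-drought-probability maps described in the abstract.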
Consequence of climate mitigation on the risk of hunger.
Hasegawa, Tomoko; Fujimori, Shinichiro; Shin, Yonghee; Tanaka, Akemi; Takahashi, Kiyoshi; Masui, Toshihiko
2015-06-16
Climate change and mitigation measures have three major impacts on food consumption and the risk of hunger: (1) changes in crop yields caused by climate change; (2) competition for land between food crops and energy crops driven by the use of bioenergy; and (3) costs associated with mitigation measures taken to meet an emissions reduction target that keeps the global average temperature increase to 2 °C. In this study, we combined a global computable general equilibrium model and a crop model (M-GAEZ), and we quantified the three impacts on risk of hunger through 2050 based on the uncertainty range associated with 12 climate models and one economic and demographic scenario. The strong mitigation measures aimed at attaining the 2 °C target reduce the negative effects of climate change on yields but have large negative impacts on the risk of hunger due to mitigation costs in the low-income countries. We also found that in a strongly carbon-constrained world, the change in food consumption resulting from mitigation measures depends more strongly on the change in incomes than the change in food prices.
Toward a Unified Representation of Atmospheric Convection in Variable-Resolution Climate Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walko, Robert
2016-11-07
The purpose of this project was to improve the representation of convection in atmospheric weather and climate models that employ computational grids with spatially variable resolution. Specifically, our work targeted models whose grids are fine enough over selected regions that convection is resolved explicitly, while over other regions the grid is coarser and convection is represented as a subgrid-scale process. The working criterion for a successful scheme for representing convection over this range of grid resolution was that identical convective environments must produce very similar convective responses (i.e., the same precipitation amount, rate, and timing, and the same modification of the atmospheric profile) regardless of grid scale. The need for such a convective scheme has increased in recent years as more global weather and climate models have adopted variable-resolution meshes that are often extended into the range of resolving convection in selected locations.
Uncertainty quantification and global sensitivity analysis of the Los Alamos sea ice model
NASA Astrophysics Data System (ADS)
Urrego-Blanco, Jorge R.; Urban, Nathan M.; Hunke, Elizabeth C.; Turner, Adrian K.; Jeffery, Nicole
2016-04-01
Changes in the high-latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with midlatitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. We present a quantitative way to assess uncertainty in complex computer models, which is a new approach in the analysis of sea ice models. We characterize parametric uncertainty in the Los Alamos sea ice model (CICE) in a standalone configuration and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in 39 individual model parameters. Unlike common sensitivity analyses conducted in previous studies where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol' sequences are used to efficiently sample the full 39-dimensional parameter space. We implement a fast emulator of the sea ice model whose predictions of sea ice extent, area, and volume are used to compute the Sobol' sensitivity indices of the 39 parameters. Main effects and interactions among the most influential parameters are also estimated by a nonparametric regression technique based on generalized additive models. A ranking based on the sensitivity indices indicates that model predictions are most sensitive to snow parameters such as snow conductivity and grain size, and the drainage of melt ponds. It is recommended that research be prioritized toward more accurately determining these most influential parameter values by observational studies or by improving parameterizations in the sea ice model.
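The variance-based approach described above can be illustrated with a minimal pure-Python first-order Sobol' index estimator. This sketch uses the classical pick-freeze estimator with plain random sampling (the paper uses Sobol' quasi-random sequences and a CICE emulator); the three-parameter toy model is an assumption for demonstration.

```python
# Minimal sketch of variance-based (Sobol') sensitivity analysis: first-order
# indices via the pick-freeze estimator. Plain random sampling stands in for the
# Sobol' sequences used in the study, and the toy model replaces the emulator.
import random

def sobol_first_order(f, dim, n=20000, seed=0):
    """Estimate first-order Sobol' indices of f over the unit hypercube."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    yA = [f(x) for x in A]
    mA = sum(yA) / n
    varA = sum((y - mA) ** 2 for y in yA) / n
    indices = []
    for i in range(dim):
        # "freeze" coordinate i from A, resample all other coordinates from B
        yAB = [f(B[k][:i] + [A[k][i]] + B[k][i + 1:]) for k in range(n)]
        mAB = sum(yAB) / n
        cov = sum(ya * yb for ya, yb in zip(yA, yAB)) / n - mA * mAB
        indices.append(cov / varA)
    return indices

# Toy additive model: the first parameter dominates the output variance
toy = lambda x: 4.0 * x[0] + 0.5 * x[1] + 0.1 * x[2]
S = sobol_first_order(toy, 3)
```

For this linear toy model the analytic first-order indices are roughly 0.98, 0.015, and 0.0006, so the estimator should rank the first parameter as dominant, mirroring how the study ranks snow conductivity, grain size, and melt-pond drainage.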
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prather, Michael J.; Hsu, Juno; Nicolau, Alex
Atmospheric chemistry controls the abundances and hence climate forcing of important greenhouse gases including N2O, CH4, HFCs, CFCs, and O3. Attributing climate change to human activities requires, at a minimum, accurate models of the chemistry and circulation of the atmosphere that relate emissions to abundances. This DOE-funded research provided realistic, yet computationally optimized and affordable, photochemical modules to the Community Earth System Model (CESM) that augment the CESM capability to explore the uncertainty in future stratospheric-tropospheric ozone, stratospheric circulation, and thus the lifetimes of chemically controlled greenhouse gases from climate simulations. To this end, we successfully implemented the Fast-J (radiation algorithm determining key chemical photolysis rates) and Linoz v3.0 (linearized photochemistry for interactive O3, N2O, NOy, and CH4) packages in LLNL-CESM and for the first time demonstrated how a change in the O2 photolysis rate within its uncertainty range can significantly impact stratospheric climate and ozone abundances. On the UCI side, this project also helped LLNL develop a CAM-Superfast Chemistry model that was implemented for the IPCC AR5 and contributed chemical-climate simulations to CMIP5.
Heather L. Kimball; Paul C. Selmants; Alvaro Moreno; Steve W. Running; Christian P. Giardina; Benjamin Poulter
2017-01-01
Gross primary production (GPP) is the Earth's largest carbon flux into the terrestrial biosphere and plays a critical role in regulating atmospheric chemistry and global climate. The Moderate Resolution Imaging Spectroradiometer (MODIS) MOD17 data product is a widely used remote sensing-based model that provides global estimates of spatiotemporal trends in GPP. When the...
NASA Astrophysics Data System (ADS)
Castro, C. L.; Dominguez, F.; Chang, H.
2010-12-01
Current seasonal climate forecasts and climate change projections of the North American monsoon are based on coarse-scale information from general circulation models. The global models, however, have substantial difficulty resolving the regional-scale forcing mechanisms of precipitation. This is especially true during the North American monsoon in the warm season, when precipitation is driven primarily by the diurnal cycle of convection, a process that cannot be resolved in coarse-resolution global models with a relatively poor representation of terrain. Though statistical downscaling may offer a relatively expedient way to generate information more appropriate for the regional scale, and is already being used in resource decision-making processes in the Southwest U.S., its main drawback is that it cannot account for a non-stationary climate. Here we demonstrate the use of a regional climate model, specifically the Weather Research and Forecasting (WRF) model, for dynamical downscaling of the North American monsoon. To drive the WRF simulations, we use retrospective reforecasts from the Climate Forecast System (CFS) model, the operational model used at the U.S. National Centers for Environmental Prediction, and three select "well performing" IPCC AR4 models for the A2 emission scenario. Though relatively computationally expensive, the use of WRF as a regional climate model in this way adds substantial value in the representation of the North American monsoon. In both cases, the regional climate model captures a fairly realistic and reasonable monsoon, where none exists in the driving global model, and captures the dominant modes of precipitation anomalies associated with ENSO and the Pacific Decadal Oscillation (PDO). Long-term precipitation variability and trends in these simulations are considered via the standardized precipitation index (SPI), a commonly used metric to characterize long-term drought. 
Dynamically downscaled climate projection data will be integrated into future water resource projections in the state of Arizona, through a cooperative effort involving numerous water resource stakeholders.
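The SPI mentioned above standardizes accumulated precipitation against its climatology. The operational SPI fits a gamma distribution to the accumulations and maps the quantiles to a standard normal; the plain z-score below is a simplified stand-in for illustration, and the window length is an assumption.

```python
# Simplified sketch of the idea behind the standardized precipitation index
# (SPI): standardize running precipitation totals against their climatology.
# The full SPI uses a gamma fit; this z-score version is a hedged stand-in.
def spi_zscore(series, window=3):
    """Standardized anomalies of `window`-period running precipitation totals."""
    totals = [sum(series[i:i + window]) for i in range(len(series) - window + 1)]
    mean = sum(totals) / len(totals)
    var = sum((t - mean) ** 2 for t in totals) / len(totals)
    sd = var ** 0.5 or 1.0  # guard against a constant series
    return [(t - mean) / sd for t in totals]
```

Runs of strongly negative values in the output flag the long-term drought conditions that the SPI is designed to characterize.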
Modelling Climate/Global Change and Assessing Environmental Risks for Siberia
NASA Astrophysics Data System (ADS)
Lykosov, V. N.; Kabanov, M. V.; Heimann, M.; Gordov, E. P.
2009-04-01
State-of-the-art climate models are based on coupled atmosphere-ocean general circulation models. A central direction of their development is an increasingly accurate description of all physical processes participating in climate formation. In modeling global climate, it is necessary to reconstruct seasonal and monthly mean values, seasonal variability (the monsoon cycle, parameters of storm tracks, etc.), and climatic variability (its dominant modes, such as El Niño or the Arctic Oscillation). At the same time, it is now urgent to apply modern mathematical models to regional climate and ecological peculiarities, in particular those of Northern Eurasia. This is because, according to current understanding, the natural environment in the mid- and high latitudes of the Northern Hemisphere is most sensitive to the observed global climate changes. The basic tasks of regional climate modeling include the detailed reconstruction of climate characteristics, investigation of the peculiarities of the hydrological cycle, estimation of the likelihood of extreme phenomena, and investigation of the consequences of regional climate change for the environment and socio-economic relations. Changes in nature and climate in Siberia are of special interest in view of global change in the Earth system. The vast continental territory of Siberia is a substantial natural region of the Eurasian continent, characterized by various combinations of climate-forming factors. Forests, water bodies, and wetlands cover a significant part of Siberia and play a regulating role of planetary importance through the emission and accumulation of the main greenhouse gases (carbon dioxide, methane, etc.). 
Evidence of the enhanced rates of warming observed in the region, and the consequences of such warming for the natural environment, is an important reason for integrated investigations in this part of the planet. We report an overview of some risk consequences of Climate/Global Change for the Siberian environment, as follows from current scientific activity in climate monitoring and modelling. At present, the challenge facing weather and climate scientists is to improve the prediction of interactions between weather/climate and the Earth system. Taking into account significantly increased computing capacity, special attention in the report is paid to the prospects of Earth system modelling.
NASA Astrophysics Data System (ADS)
Titov, A.; Gordov, E.; Okladnikov, I.
2009-04-01
In this report we present the results of work on a working model of a software system for the storage, semantically enabled search, and retrieval, along with processing and visualization, of environmental datasets containing results of meteorological and air pollution observations and mathematical climate modeling. A specially designed metadata standard for machine-readable description of datasets in the meteorology, climate, and atmospheric pollution transport domains is introduced as one of the key system components. To provide semantic interoperability, the Resource Description Framework (RDF, http://www.w3.org/RDF/) was chosen to realize the metadata description model in the form of an RDF Schema. The final version of the RDF Schema is implemented on the basis of widely used standards, such as the Dublin Core Metadata Element Set (http://dublincore.org/), the Directory Interchange Format (DIF, http://gcmd.gsfc.nasa.gov/User/difguide/difman.html), ISO 19139, etc. At present the system is available as a web server (http://climate.risks.scert.ru/metadatabase/) based on the ATMOS web-portal engine [1], implementing dataset management functionality including SeRQL-based semantic search as well as statistical analysis and visualization of selected data archives [2,3]. The core of the system is an Apache web server in conjunction with the Tomcat Java Servlet Container (http://jakarta.apache.org/tomcat/) and a Sesame server (http://www.openrdf.org/) used as a database for RDF and RDF Schema. Statistical analysis of meteorological and climatic data with subsequent visualization of results is currently implemented for such datasets as the NCEP/NCAR Reanalysis, the NCEP/DOE AMIP-II Reanalysis, JMA/CRIEPI JRA-25, ECMWF ERA-40, and local measurements obtained from meteorological stations on the territory of Russia. This functionality is aimed primarily at finding the main characteristics of regional climate dynamics. 
The proposed system represents a step in the process of development of a distributed collaborative information-computational environment to support multidisciplinary investigations of Earth regional environment [4]. Partial support of this work by SB RAS Integration Project 34, SB RAS Basic Program Project 4.5.2.2, APN Project CBA2007-08NSY and FP6 Enviro-RISKS project (INCO-CT-2004-013427) is acknowledged. References 1. E.P. Gordov, V.N. Lykosov, and A.Z. Fazliev. Web portal on environmental sciences "ATMOS" // Advances in Geosciences. 2006. Vol. 8. p. 33 - 38. 2. Gordov E.P., Okladnikov I.G., Titov A.G. Development of elements of web based information-computational system supporting regional environment processes investigations // Journal of Computational Technologies, Vol. 12, Special Issue #3, 2007, pp. 20 - 28. 3. Okladnikov I.G., Titov A.G. Melnikova V.N., Shulgina T.M. Web-system for processing and visualization of meteorological and climatic data // Journal of Computational Technologies, Vol. 13, Special Issue #3, 2008, pp. 64 - 69. 4. Gordov E.P., Lykosov V.N. Development of information-computational infrastructure for integrated study of Siberia environment // Journal of Computational Technologies, Vol. 12, Special Issue #2, 2007, pp. 19 - 30.
Statistical models of global Langmuir mixing
NASA Astrophysics Data System (ADS)
Li, Qing; Fox-Kemper, Baylor; Breivik, Øyvind; Webb, Adrean
2017-05-01
The effect of Langmuir mixing on surface ocean mixing may be parameterized by applying an enhancement factor, which depends on the wave, wind, and ocean state, to the turbulent velocity scale in the K-Profile Parameterization. Diagnosing the appropriate enhancement factor online in global climate simulations is readily achieved by coupling with a prognostic wave model, but at significant computational and code-development expense. In this paper, two alternatives that do not require a prognostic wave model, (i) a monthly mean enhancement factor climatology, and (ii) an approximation to the enhancement factor based on empirical wave spectra, are explored and tested in a global climate model. Both appear to reproduce the Langmuir mixing effects as estimated using a prognostic wave model, with nearly identical and substantial improvements in the simulated mixed layer depth and intermediate water ventilation over control simulations, but at significantly less computational cost. Simpler approaches, such as ignoring Langmuir mixing altogether or setting a globally constant Langmuir number, are found to be deficient, indicating that the consequences of Stokes depth and of misaligned wind and waves are important.
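To make the enhancement factor concrete: as an illustration (not necessarily the formulation used in this paper), the widely cited McWilliams and Sullivan (1997) form depends only on the turbulent Langmuir number La_t, the square root of the ratio of friction velocity to surface Stokes drift.

```python
# Hedged sketch: one common form of the Langmuir enhancement factor applied to
# the KPP turbulent velocity scale (McWilliams & Sullivan 1997). Whether this
# exact form matches the paper's parameterization is an assumption.
def langmuir_enhancement(la_t):
    """Enhancement factor ~ (1 + 0.08 * La_t**-4) ** 0.5 for Langmuir number La_t."""
    return (1.0 + 0.08 * la_t ** -4) ** 0.5
```

Typical open-ocean values La_t ~ 0.3 give an enhancement of roughly 3, while large La_t (weak wave forcing) recovers an unmodified velocity scale, which is why ignoring Langmuir mixing or fixing a single global La_t loses important regional structure.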
Climate Model Diagnostic Analyzer Web Service System
NASA Astrophysics Data System (ADS)
Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.
2013-12-01
The latest Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with newly available global observations. The traditional approach to climate model evaluation, which compares a single parameter at a time, identifies symptomatic model biases and errors but fails to diagnose the model problems. The model diagnosis process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. To address these challenges, we are developing a parallel, distributed web-service system that enables the physics-based multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation and (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, and (4) the calculation of difference between two variables. 
A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA is planned to be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. The requirements of the educational tool are defined with the interaction with the school organizers, and CMDA is customized to meet the requirements accordingly. The tool needs to be production quality for 30+ simultaneous users. The summer school will thus serve as a valuable testbed for the tool development, preparing CMDA to serve the Earth-science modeling and model-analysis community at the end of the project. This work was funded by the NASA Earth Science Program called Computational Modeling Algorithms and Cyberinfrastructure (CMAC).
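Two of CMDA's analysis capabilities listed above, seasonal means and variable-to-variable correlation, can be sketched as plain functions. The data layout (a 12-element monthly climatology and paired sample lists) is an assumption; CMDA wraps such computations behind its Python web-service frameworks rather than exposing bare functions.

```python
# Minimal stand-ins for two CMDA analysis capabilities: seasonal means of a
# variable and the correlation between two variables. Data layout is assumed.
def seasonal_mean(monthly, months):
    """Mean of a 12-element monthly climatology over the given month indices."""
    return sum(monthly[m] for m in months) / len(months)

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

JJA = (5, 6, 7)  # June-July-August as 0-based month indices
```

In the web-service setting, functions like these sit behind a Python wrapper interface so that a browser request selects the variable, region, and season and receives the computed statistic.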
Fitzpatrick, Joan; Gray, Floyd; Dubiel, Russell; Langman, Jeff; Moring, J. Bruce; Norman, Laura M.; Page, William R.; Parcher, Jean W.
2013-01-01
The prediction of global climate change in response to both natural forces and human activity is one of the defining issues of our times. The unprecedented observational capacity of modern earth-orbiting satellites coupled with the development of robust computational representations (models) of the Earth’s weather and climate systems afford us the opportunity to observe and investigate how these systems work now, how they have worked in the past, and how they will work in the future when forced in specific ways. In the most recent report on global climate change by the Intergovernmental Panel on Climate Change (IPCC; Solomon and others, 2007), analyses using multiple climate models support recent observations that the Earth’s climate is changing in response to a combination of natural and human-induced causes. These changes will be significant in the United States–Mexican border region, where the process of climate change affects all of the Borderlands challenge themes discussed in the preceding chapters. The dual possibilities of both significantly-changed climate and increasing variability in climate make it challenging to take full measure of the potential effects because the Borderlands already experience a high degree of interannual variability and climatological extremes.
Theoretical Assessment of the Impact of Climatic Factors in a Vibrio Cholerae Model.
Kolaye, G; Damakoa, I; Bowong, S; Houe, R; Békollè, D
2018-05-04
A mathematical model for Vibrio cholerae (V. cholerae) in a closed environment is considered, with the aim of investigating the impact of climatic factors, which exert a direct influence on bacterial metabolism and on the bacterial reservoir capacity. We first propose a V. cholerae mathematical model in a closed environment. A sensitivity analysis using the eFAST method was performed to identify the most important parameters of the model. We then extend this V. cholerae model by taking into account climatic factors that influence the bacterial reservoir capacity. We present the theoretical analysis of the model; more precisely, we compute equilibria and study their stability using the theory of periodic cooperative systems with a concave nonlinearity. Theoretical results are supported by numerical simulations, which further suggest the necessity of implementing sanitation campaigns in aquatic environments, using suitable products against the bacteria during the periods of growth of aquatic reservoirs.
Improved pattern scaling approaches for the use in climate impact studies
NASA Astrophysics Data System (ADS)
Herger, Nadja; Sanderson, Benjamin M.; Knutti, Reto
2015-05-01
Pattern scaling is a simple way to produce climate projections beyond the scenarios run with expensive global climate models (GCMs). The simplest technique has known limitations and assumes that a spatial climate anomaly pattern obtained from a GCM can be scaled by the global mean temperature (GMT) anomaly. We propose alternatives and assess their skills and limitations. One approach which avoids scaling is to consider a period in a different scenario with the same GMT change. It is attractive as it provides patterns of any temporal resolution that are consistent across variables, and it does not distort variability. Second, we extend the traditional approach with a land-sea contrast term, which provides the largest improvements over the traditional technique. When interpolating between known bounding scenarios, the proposed methods significantly improve the accuracy of the pattern scaled scenario with little computational cost. The remaining errors are much smaller than the Coupled Model Intercomparison Project Phase 5 model spread.
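The two approaches can be sketched side by side: traditional pattern scaling multiplies a fixed spatial anomaly pattern by the global mean temperature (GMT) anomaly, while the proposed extension adds a second predictor for the land-sea contrast. The patterns and coefficient values below are illustrative assumptions, not fitted GCM output.

```python
# Sketch of traditional pattern scaling (local anomaly = pattern * GMT anomaly)
# versus the extension with a land-sea contrast term. Patterns are placeholders.
def scale_traditional(pattern, delta_gmt):
    """Traditional pattern scaling: local change = p * dGMT."""
    return [p * delta_gmt for p in pattern]

def scale_land_sea(pattern, ls_pattern, delta_gmt, delta_land_sea):
    """Two-predictor scaling: local change = p * dGMT + q * d(land-sea contrast)."""
    return [p * delta_gmt + q * delta_land_sea
            for p, q in zip(pattern, ls_pattern)]

# Illustrative grid of three points; land typically warms faster than ocean
pattern = [1.4, 1.1, 0.8]      # degC of local change per degC of GMT change
ls_pattern = [0.5, 0.2, -0.3]  # response to a unit land-sea contrast change
```

The land-sea term lets the scaled pattern respond differently over land and ocean for the same GMT change, which is where the abstract reports the largest improvement over the traditional technique.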
Uncertainty Analysis of Downscaled CMIP5 Precipitation Data for Louisiana, USA
NASA Astrophysics Data System (ADS)
Sumi, S. J.; Tamanna, M.; Chivoiu, B.; Habib, E. H.
2014-12-01
The downscaled CMIP3 and CMIP5 Climate and Hydrology Projections dataset contains fine-spatial-resolution translations of climate projections over the contiguous United States, developed using two downscaling techniques: monthly Bias Correction Spatial Disaggregation (BCSD) and daily Bias Correction Constructed Analogs (BCCA). The objective of this study is to assess the uncertainty of the CMIP5 downscaled general circulation models (GCMs). We performed an analysis of the daily, monthly, seasonal, and annual variability of precipitation downloaded from the Downscaled CMIP3 and CMIP5 Climate and Hydrology Projections website for the state of Louisiana, USA, at 0.125° x 0.125° resolution. A dataset of daily gridded precipitation observations over a rectangular boundary covering Louisiana is used to assess the validity of 21 downscaled GCMs for the 1950-1999 period. The following statistics are computed for each of the 21 models against the observed dataset: the correlation coefficient, the bias, the normalized bias, the mean absolute error (MAE), the mean absolute percentage error (MAPE), and the root mean square error (RMSE). A measure of the variability simulated by each model is computed as the ratio of its standard deviation, in both space and time, to the corresponding standard deviation of the observations. The correlation and MAPE statistics are also computed for each of the nine climate divisions of Louisiana. Some of the patterns we observed are: 1) Average annual precipitation rate shows a similar spatial distribution for all the models, within a range of 3.27 to 4.75 mm/day from northwest to southeast. 2) The standard deviation of summer (JJA) precipitation (mm/day) for the models remains lower than the observations, whereas the models have similar spatial patterns and ranges of values in winter (NDJ). 3) Correlation coefficients of annual precipitation of models against observations range from -0.48 to 0.36, with spatial distributions that vary by model. 
4) Most of the models show negative correlation coefficients in summer and positive in winter. 5) MAE shows similar spatial distribution for all the models within a range of 5.20 to 7.43 mm/day from Northwest to Southeast of Louisiana. 6) Highest values of correlation coefficients are found at seasonal scale within a range of 0.36 to 0.46.
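The validation statistics listed above have standard definitions, sketched here as plain functions over paired model/observation samples (array handling is deliberately simplified; note MAPE is undefined where observations are zero).

```python
# Standard model-validation statistics used in the study, in simplified form.
def bias(model, obs):
    """Mean model-minus-observation difference."""
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def mae(model, obs):
    """Mean absolute error."""
    return sum(abs(m - o) for m, o in zip(model, obs)) / len(obs)

def mape(model, obs):
    """Mean absolute percentage error (requires nonzero observations)."""
    return 100.0 * sum(abs((m - o) / o) for m, o in zip(model, obs)) / len(obs)

def rmse(model, obs):
    """Root mean square error."""
    return (sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs)) ** 0.5
```

Note how bias can cancel to zero while MAE and RMSE remain nonzero, which is why the study reports the full set rather than any single score.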
NASA Astrophysics Data System (ADS)
Tansey, M. K.; Flores-Lopez, F.; Young, C. A.; Huntington, J. L.
2012-12-01
Long-term planning for the management of California's water resources requires assessment of the effects of future climate changes on both water supply and demand. Considerable progress has been made in evaluating the effects of future climate changes on water supplies, but less information is available with regard to water demands. Uncertainty in future climate projections increases the difficulty of assessing climate impacts and evaluating long-range adaptation strategies. Compounding this uncertainty is the fact that most readily available downscaled climate projections lack sufficient meteorological information to compute evapotranspiration (ET) by the widely accepted ASCE Penman-Monteith (PM) method. This study addresses potential changes in future Central Valley water demands and crop yields by examining the effects of climate change on soil evaporation, plant transpiration, growth, and yield for major types of crops grown in the Central Valley of California. Five representative climate scenarios based on 112 bias-corrected, spatially downscaled CMIP3 GCM climate simulations were developed using the hybrid delta ensemble method to span a wide range of future climate uncertainty. Analysis of historical California Irrigation Management Information System meteorological data was combined with several meteorological estimation methods to compute future solar radiation, wind speed, and dew point temperatures corresponding to the GCM-projected temperatures and precipitation. Future atmospheric CO2 concentrations corresponding to the 5 representative climate projections were developed based on weighting IPCC SRES emissions scenarios. The Land, Atmosphere, and Water Simulator (LAWS) model was used to compute ET and yield changes in the early, middle, and late 21st century for 24 representative agricultural crops grown in the Sacramento, San Joaquin, and Tulare Lake basins. 
Study results indicate that changes in ET and yield vary between crops due to plant specific sensitivities to temperature, solar radiation and the vapor pressure deficits. Shifts in the growth period to earlier in the year, shortened growth period for annual crops as well as extended fall growth can also exert important influences. Projected increases in CO2 concentrations in the late 21st century exert very significant influences on ET and yield for many crops. To characterize potential impacts and the range of uncertainty, changes in total agricultural water demands and yields were computed assuming that current crop types and acreages in 21 Central Valley regional planning areas remained constant throughout the 21st century for each of the 5 representative future climate scenarios.
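The ASCE PM method named above can be sketched for the daily short-crop reference surface (constants Cn = 900, Cd = 0.34 in the standardized form). This is reference ET only; the crop- and CO2-specific adjustments made by the LAWS model are not reproduced here, and the example inputs are assumptions.

```python
# Hedged sketch of the ASCE standardized Penman-Monteith reference ET
# (daily timestep, short reference crop). Inputs below are illustrative.
import math

def sat_vapor_pressure(t_c):
    """Saturation vapor pressure (kPa) at air temperature t_c (degC)."""
    return 0.6108 * math.exp(17.27 * t_c / (t_c + 237.3))

def asce_pm_eto(t_c, rn, g, u2, ea, pressure_kpa=101.3):
    """Daily short-crop reference ET (mm/day).

    t_c: mean air temperature (degC); rn, g: net radiation and soil heat flux
    (MJ m-2 day-1); u2: wind speed at 2 m (m/s); ea: actual vapor pressure (kPa).
    """
    es = sat_vapor_pressure(t_c)
    delta = 4098.0 * es / (t_c + 237.3) ** 2  # slope of the es curve (kPa/degC)
    gamma = 0.000665 * pressure_kpa           # psychrometric constant (kPa/degC)
    num = (0.408 * delta * (rn - g)
           + gamma * 900.0 / (t_c + 273.0) * u2 * (es - ea))
    return num / (delta + gamma * (1.0 + 0.34 * u2))
```

For a typical warm clear day (25 degC, 15 MJ m-2 day-1 net radiation, 2 m/s wind, moderately dry air) the formula gives roughly 6 mm/day, a plausible summer reference ET for the Central Valley.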
Application of solar max ACRIM data to analyze solar-driven climatic variability on Earth
NASA Technical Reports Server (NTRS)
Hoffert, M. I.
1986-01-01
Terrestrial climatic effects associated with solar variability have been proposed for at least a century, but could not be assessed quantitatively owing to observational uncertainties in solar flux variations. Measurements from 1980 to 1984 by the Active Cavity Radiometer Irradiance Monitor (ACRIM), capable of resolving fluctuations above the sensible atmosphere of less than 0.1% of the solar constant, permit direct, albeit preliminary, assessments of solar forcing effects on global temperatures during this period. The global temperature response to ACRIM-measured fluctuations was computed from 1980 to 1985 using the NYU transient climate model, including thermal inertia effects of the world ocean, and the results were compared with observations of recent temperature trends. Monthly mean ACRIM-driven global surface temperature fluctuations computed with the climate model are an order of magnitude smaller than observed trends, of order 0.01 C. In contrast, global mean surface temperature observations indicate an approx. 0.1 C increase during this period. Solar variability is therefore likely to have been a minor factor in global climate change during this period compared with variations in atmospheric albedo, greenhouse gases, and internal self-induced oscillations. It was not possible to extend the applicability of the measured flux variations to longer periods, since a possible correlation of luminosity with annual solar activity is not supported by statistical analysis. Continuous monitoring of solar flux by satellite-based instruments over timescales of 20 years or more, comparable to the timescales of thermal relaxation of the oceans and of the solar cycle itself, is needed to resolve the question of long-term solar variation effects on climate.
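The kind of transient calculation described, in which ocean thermal inertia damps the response to a small forcing, can be sketched with a zero-dimensional energy balance model. The heat capacity and feedback parameter below are illustrative assumptions, not the NYU model's values.

```python
import numpy as np

# Illustrative constants (assumed, not the NYU model's):
C_ML = 4.2e8        # mixed-layer heat capacity, J m^-2 K^-1 (~100 m of ocean)
LAMBDA = 1.2        # climate feedback parameter, W m^-2 K^-1

def temperature_response(forcing, dt=86400.0, C=C_ML, lam=LAMBDA):
    """Integrate C dT/dt = F(t) - lam*T with forward Euler (daily steps)."""
    T = np.zeros(len(forcing) + 1)
    for k, F in enumerate(forcing):
        T[k + 1] = T[k] + dt * (F - lam * T[k]) / C
    return T[1:]

# A sustained 0.1% solar-constant change ~ 0.24 W m^-2 forcing at the surface
F = np.full(365 * 5, 0.24)          # five years of constant forcing
T = temperature_response(F)         # response stays well below F/lam = 0.2 K
```

Because the relaxation time C/λ is roughly a decade, five years of forcing produces only a fraction of the 0.2 K equilibrium response, which is why short-lived irradiance fluctuations yield responses of order 0.01 C.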
NASA Astrophysics Data System (ADS)
Gampe, D.; Ludwig, R.
2017-12-01
Regional Climate Models (RCMs) that downscale General Circulation Models (GCMs) are the primary tool to project future climate and serve as input to many impact models to assess the related changes and impacts under such climate conditions. Such RCMs are made available through the Coordinated Regional Climate Downscaling Experiment (CORDEX). The ensemble of models provides a range of possible future climate changes around the ensemble mean climate change signal. The model outputs, however, are prone to biases compared to regional observations. A bias correction of these deviations is a crucial step in the impact modelling chain to allow the reproduction of historic conditions of, e.g., river discharge. However, the detection and quantification of model biases are highly dependent on the selected regional reference data set. Additionally, in practice it is usually not feasible, due to computational constraints, to consider entire ensembles of climate simulations with all members as input for impact models which provide information to support decision-making. Although more and more studies focus on model selection based on the preservation of the climate model spread, a selection based on validity, i.e. the representation of historic conditions, is still a widely applied approach. In this study, several available reference data sets for precipitation are selected to detect the model bias for the reference period 1989 - 2008 over the alpine catchment of the Adige River located in Northern Italy. The reference data sets originate from various sources, such as station data or reanalysis. These data sets are remapped to the common RCM grid at 0.11° resolution, and several indicators, such as dry and wet spells, extreme precipitation and general climatology, are calculated to evaluate the capability of the RCMs to reproduce the historical conditions.
The resulting RCM spread is compared against the spread of the reference data set to determine the related uncertainties and detect potential model biases with respect to each reference data set. The RCMs are then ranked based on various statistical measures for each indicator and a score matrix is derived to select a subset of RCMs. We show the impact and importance of the reference data set with respect to the resulting climate change signal on the catchment scale.
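A rank-based score matrix of the kind described can be sketched in a few lines. The model names and bias values below are invented purely for illustration; the real study uses many indicators and statistical measures per reference data set.

```python
import numpy as np

# Hypothetical absolute biases of four RCMs for three indicators
# (columns: wet-spell length, extreme precipitation, mean climatology).
errors = {
    "RCM-A": [0.3, 0.8, 0.3],
    "RCM-B": [0.2, 2.0, 0.6],
    "RCM-C": [0.5, 0.9, 0.9],
    "RCM-D": [1.1, 0.4, 0.2],
}

names = list(errors)
E = np.array([errors[m] for m in names])

# Rank models per indicator (rank 1 = smallest bias), then sum the ranks
# into a score matrix; low total score = better overall validity.
ranks = E.argsort(axis=0).argsort(axis=0) + 1
scores = dict(zip(names, ranks.sum(axis=1)))
subset = sorted(names, key=scores.get)[:2]   # select the two best RCMs
```

Ranking per indicator rather than averaging raw biases keeps indicators with different units comparable, at the cost of discarding the magnitude of the differences.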
NASA Astrophysics Data System (ADS)
Jacobs, P.; de Mutsert, K.
2013-12-01
Paleoclimatic reconstructions, particularly from periods that may serve as an analog to the present and future greenhouse-driven warming, are increasingly being used to validate climate models as well as to provide constraints on broad impacts such as global temperature and sea level change. However, paleoclimatic data remains under-utilized in decision-making processes by stakeholders, who typically rely on scenarios produced by computer models or naive extrapolation of present trends. We hope to increase the information available to stakeholders by incorporating paleoclimatic data from the mid-Pliocene Warm Period (mPWP, ~3 Ma) into a fisheries model of the North Atlantic. North Atlantic fisheries are economically important and are expected to be sensitive to climatic change. State of the art climate models remain unable to realistically simulate the North Atlantic, both over the observational record as well as during times in the geologic past such as the mPWP. Given that the mPWP shares many of the same boundary conditions as those likely to be seen in the near future, we seek to answer the question 'What if the climate of the future looks more like the climate of the past?' relative to what state of the art computer models currently project. To that end we have created a suite of future North Atlantic Ocean scenarios using output from the CMIP3 and CMIP5 modeling experiments, as well as the PRISM group's Mid-Pliocene ocean reconstruction. We use these scenarios to drive an ecosystem-based fisheries model using the Ecopath with Ecosim (EwE) software to identify differences between the scenarios as the North Atlantic Ocean changes through time. Additionally, we examine the spatial component of these differences by using the Ecospace module of EwE. 
Whereas the Ecosim realizations are intended to capture the dynamic response to changing oceanographic parameters (SST, SSS, DO) over time, the Ecospace experiments are intended to explore the impact of different equilibrium conditions on the longer-term spatial redistribution of the fish community. By making use not only of climate model output but also of paleoclimatic data from a period that closely resembles our near future, stakeholders can make decisions informed by a more robust range of potential outcomes as greenhouse emissions warm the planet.
Insights into low-latitude cloud feedbacks from high-resolution models.
Bretherton, Christopher S
2015-11-13
Cloud feedbacks are a leading source of uncertainty in the climate sensitivity simulated by global climate models (GCMs). Low-latitude boundary-layer and cumulus cloud regimes are particularly problematic, because they are sustained by tight interactions between clouds and unresolved turbulent circulations. Turbulence-resolving models better simulate such cloud regimes and support the GCM consensus that they contribute to positive global cloud feedbacks. Large-eddy simulations using sub-100 m grid spacings over small computational domains elucidate marine boundary-layer cloud response to greenhouse warming. Four observationally supported mechanisms contribute: 'thermodynamic' cloudiness reduction from warming of the atmosphere-ocean column, 'radiative' cloudiness reduction from CO2- and H2O-induced increase in atmospheric emissivity aloft, 'stability-induced' cloud increase from increased lower tropospheric stratification, and 'dynamical' cloudiness increase from reduced subsidence. The cloudiness reduction mechanisms typically dominate, giving positive shortwave cloud feedback. Cloud-resolving models with horizontal grid spacings of a few kilometres illuminate how cumulonimbus cloud systems affect climate feedbacks. Limited-area simulations and superparameterized GCMs show upward shift and slight reduction of cloud cover in a warmer climate, implying positive cloud feedbacks. A global cloud-resolving model suggests tropical cirrus increases in a warmer climate, producing positive longwave cloud feedback, but results are sensitive to subgrid turbulence and ice microphysics schemes. © 2015 The Author(s).
Challenges in identifying sites climatically matched to the native ranges of animal invaders.
Rodda, Gordon H; Jarnevich, Catherine S; Reed, Robert N
2011-02-09
Species distribution models are often used to characterize a species' native range climate, so as to identify sites elsewhere in the world that may be climatically similar and therefore at risk of invasion by the species. This endeavor provoked intense public controversy over recent attempts to model areas at risk of invasion by the Indian Python (Python molurus). We evaluated a number of MaxEnt models on this species to assess MaxEnt's utility for vertebrate climate matching. Overall, we found MaxEnt models to be very sensitive to modeling choices and selection of input localities and background regions. As used, MaxEnt invoked minimal protections against data dredging, multi-collinearity of explanatory axes, and overfitting. As used, MaxEnt endeavored to identify a single ideal climate, whereas different climatic considerations may determine range boundaries in different parts of the native range. MaxEnt was extremely sensitive to both the choice of background locations for the python, and to selection of presence points: inclusion of just four erroneous localities was responsible for Pyron et al.'s conclusion that no additional portions of the U.S. mainland were at risk of python invasion. When used with default settings, MaxEnt overfit the realized climate space, identifying models with about 60 parameters, about five times the number of parameters justifiable when optimized on the basis of Akaike's Information Criterion. When used with default settings, MaxEnt may not be an appropriate vehicle for identifying all sites at risk of colonization. Model instability and dearth of protections against overfitting, multi-collinearity, and data dredging may combine with a failure to distinguish fundamental from realized climate envelopes to produce models of limited utility. A priori identification of biologically realistic model structure, combined with computational protections against these statistical problems, may produce more robust models of invasion risk.
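The parameter-count argument above rests on Akaike's Information Criterion, which is easy to reproduce. The log-likelihood values here are hypothetical, chosen only to show how AIC can reject a ~60-parameter fit that barely out-fits a much smaller model.

```python
def aic(log_likelihood, n_params):
    """Akaike's Information Criterion: AIC = 2k - 2 ln L; lower is better."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical fits: the overfit model attains a slightly higher
# log-likelihood, but the 2k penalty overwhelms the gain.
aic_overfit = aic(log_likelihood=-100.0, n_params=60)
aic_small = aic(log_likelihood=-108.0, n_params=12)
best = min(("overfit", aic_overfit), ("small", aic_small), key=lambda t: t[1])
```

This is the sense in which a 60-parameter MaxEnt model can be "about five times" too large: the small improvement in fit does not justify the additional parameters under AIC.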
Data-driven Climate Modeling and Prediction
NASA Astrophysics Data System (ADS)
Kondrashov, D. A.; Chekroun, M.
2016-12-01
Global climate models aim to simulate a broad range of spatio-temporal scales of climate variability, with a state vector having many millions of degrees of freedom. On the other hand, while detailed weather prediction out to a few days requires high numerical resolution, it is fairly clear that a major fraction of large-scale climate variability can be predicted in a much lower-dimensional phase space. Low-dimensional models can simulate and predict this fraction of climate variability, provided they are able to account for linear and nonlinear interactions between the modes representing large scales of climate dynamics, as well as their interactions with a much larger number of modes representing fast and small scales. This presentation will highlight several new applications of the Multilayered Stochastic Modeling (MSM) framework [Kondrashov, Chekroun and Ghil, 2015], which has abundantly proven its efficiency in the modeling and real-time forecasting of various climate phenomena. MSM is a data-driven inverse modeling technique that aims to obtain a low-order nonlinear system of prognostic equations driven by stochastic forcing, and estimates both the dynamical operator and the properties of the driving noise from multivariate time series of observations or a high-end model's simulation. MSM leads to a system of stochastic differential equations (SDEs) involving hidden (auxiliary) variables of fast-small scales ranked by layers, which interact with the macroscopic (observed) variables of large-slow scales to model the dynamics of the latter, and thus convey memory effects. New MSM climate applications focus on the development of computationally efficient low-order models using data-adaptive decomposition methods that convey memory effects by time-embedding techniques, such as Multichannel Singular Spectrum Analysis (M-SSA) [Ghil et al. 2002] and the recently developed Data-Adaptive Harmonic (DAH) decomposition method [Chekroun and Kondrashov, 2016].
In particular, new results by DAH-MSM modeling and prediction of Arctic Sea Ice, as well as decadal predictions of near-surface Earth temperatures will be presented.
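The core of data-driven inverse modelling, estimating a dynamical operator from a time series by regression, can be sketched for a linear, single-layer case. The full MSM framework adds hidden layers, nonlinearity, and memory effects that this toy deliberately omits.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generate a time series from a known damped linear stochastic system
# x_{k+1} = x_k + dt*A*x_k + noise  (Euler-Maruyama discretization).
A_true = np.array([[-0.5, 0.3], [-0.3, -0.5]])
dt, n = 0.1, 20_000
x = np.zeros((n, 2))
for k in range(n - 1):
    x[k + 1] = x[k] + dt * (A_true @ x[k]) + 0.1 * np.sqrt(dt) * rng.standard_normal(2)

# Inverse modelling step: estimate the dynamical operator by regressing
# the tendency (x_{k+1} - x_k)/dt onto the state x_k via least squares.
tend = (x[1:] - x[:-1]) / dt
A_est, *_ = np.linalg.lstsq(x[:-1], tend, rcond=None)
A_est = A_est.T   # lstsq solves x @ B = tend with B = A^T
```

The same regression idea, applied layer by layer to the residuals, is what lets MSM estimate both the operator and the driving noise from data.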
Quantitative Decision Support Requires Quantitative User Guidance
NASA Astrophysics Data System (ADS)
Smith, L. A.
2009-12-01
Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? Or in the 1930’s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US or European based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here? Or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length-scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of the communication of the uses of climate model output, targeting users and policy makers as well as other specialist adaptation scientists, are discussed. First, a brief scientific evaluation is provided of the length and time scales at which climate model output is likely to become uninformative, including a note on the applicability of the latest Bayesian methodology to current state-of-the-art general circulation model output. Second, a critical evaluation is given of the language often employed in communicating climate model output: language which accurately states that models are “better”, have “improved” and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of each climate simulation. 
And thirdly, a general approach for evaluating the relevance of quantitative climate model output for a given problem is presented. Based on climate science, meteorology, and the details of the question at hand, this approach identifies necessary (never sufficient) conditions required for the rational use of climate model output in quantitative decision support tools. Inasmuch as climate forecasting is a problem of extrapolation, there will always be harsh limits on our ability to establish where a model is fit for purpose; this does not, however, prevent us from identifying model noise as such, and thereby avoiding some cases of the misapplication and over-interpretation of model output. It is suggested that failure to clearly communicate the limits of today’s climate models in providing quantitative decision-relevant climate information to today’s users of climate information would risk the credibility of tomorrow’s climate science, and of science-based policy more generally.
NASA Astrophysics Data System (ADS)
Stainforth, D. A.; Allen, M.; Kettleborough, J.; Collins, M.; Heaps, A.; Stott, P.; Wehner, M.
2001-12-01
The climateprediction.com project is preparing to carry out the first systematic uncertainty analysis of climate forecasts using large ensembles of GCM climate simulations. This will be done by involving schools, businesses and members of the public, and utilizing the novel technology of distributed computing. Each participant will be asked to run one member of the ensemble on their PC. The model used will initially be the UK Met Office's Unified Model (UM). It will be run under Windows and software will be provided to enable those involved to view their model output as it develops. The project will use this method to carry out large perturbed physics GCM ensembles and thereby analyse the uncertainty in the forecasts from such models. Each participant/ensemble member will therefore have a version of the UM in which certain aspects of the model physics have been perturbed from their default values. Of course the non-linear nature of the system means that it will be necessary to look not just at perturbations to individual parameters in specific schemes, such as the cloud parameterization, but also to the many combinations of perturbations. This rapidly leads to the need for very large, perhaps multi-million member ensembles, which could only be undertaken using the distributed computing methodology. The status of the project will be presented and the Windows client will be demonstrated. In addition, initial results will be presented from beta test runs using a demo release for Linux PCs and Alpha workstations. Although small by comparison to the whole project, these pilot results constitute a 20-50 member perturbed physics climate ensemble with results indicating how climate sensitivity can be substantially affected by individual parameter values in the cloud scheme.
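The combinatorial growth that drives the need for distributed computing is easy to see: perturbing even a few parameters at a few levels each multiplies quickly. The parameter names and perturbation factors below are invented for illustration, not the actual Unified Model perturbations.

```python
from itertools import product

# Hypothetical perturbation levels (multiples of the default value)
# for three cloud-scheme parameters.
perturbations = {
    "entrainment_coef": [0.5, 1.0, 2.0],
    "ice_fall_speed": [0.5, 1.0, 1.5],
    "crit_rel_humidity": [0.6, 0.7, 0.8],
}

# Every combination of levels defines one perturbed-physics ensemble member.
members = [dict(zip(perturbations, combo))
           for combo in product(*perturbations.values())]
# 3 parameters x 3 levels -> 27 members; tens of parameters at several
# levels each is what pushes real ensembles toward millions of members.
```

Each dictionary in `members` would correspond to one participant's model configuration in the distributed-computing scheme.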
Compressing climate model simulations: reducing storage burden while preserving information
NASA Astrophysics Data System (ADS)
Hammerling, Dorit; Baker, Allison; Xu, Haiying; Clyne, John; Li, Samuel
2017-04-01
Climate models, which are run at high spatial and temporal resolutions, generate massive quantities of data. As our computing capabilities continue to increase, storing all of the generated data is becoming a bottleneck, which negatively affects scientific progress. It is thus important to develop methods for representing the full datasets by smaller compressed versions, which still preserve all the critical information and, as an added benefit, allow for faster read and write operations during analysis work. Traditional lossy compression algorithms, such as those used for image files, are not necessarily well suited to climate data. While visual appearance is relevant, climate data have additional critical features, such as the preservation of extreme values and of spatial and temporal gradients. Developing alternative metrics to quantify information loss in a manner that is meaningful to climate scientists is an ongoing process still in its early stages. We will provide an overview of current efforts to develop such metrics to assess existing algorithms and to guide the development of tailored compression algorithms to address this pressing challenge.
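Metrics of the sort described, going beyond visual fidelity to check extremes and gradients, might look like the following sketch. The quantization "compressor" and the toy field are stand-ins, not an actual compression algorithm or metric suite.

```python
import numpy as np

def compression_metrics(original, compressed):
    """Climate-oriented error metrics for a lossily compressed 2-D field."""
    err = compressed - original
    grad_o = np.gradient(original)
    grad_c = np.gradient(compressed)
    return {
        "max_abs_error": float(np.max(np.abs(err))),
        "rmse": float(np.sqrt(np.mean(err ** 2))),
        # how well the field maximum (an "extreme") survives compression
        "extreme_shift": float(abs(compressed.max() - original.max())),
        # how well spatial gradients survive compression
        "grad_rmse": float(np.sqrt(np.mean(
            (grad_o[0] - grad_c[0]) ** 2 + (grad_o[1] - grad_c[1]) ** 2))),
    }

# Toy smooth field and a "compressed" version quantized to 0.5-unit steps
field = np.sin(np.linspace(0, 4, 64))[:, None] * np.cos(np.linspace(0, 4, 64))
lossy = np.round(field / 0.5) * 0.5
m = compression_metrics(field, lossy)
```

A metric suite like this makes the trade-off explicit: an algorithm may score well on RMSE while badly distorting extremes or gradients, which is exactly the failure mode image-oriented compressors can hide.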
Modeling climate change impacts on water trading.
Luo, Bin; Maqsood, Imran; Gong, Yazhen
2010-04-01
This paper presents a new method of evaluating the impacts of climate change on the long-term performance of water trading programs, through designing an indicator to measure the mean of periodic water volume that can be released by trading through a water-use system. The indicator is computed with a stochastic optimization model which can reflect the random uncertainty of water availability. The developed method was demonstrated in the Swift Current Creek watershed of Prairie Canada under two future scenarios simulated by a Canadian Regional Climate Model, in which total water availabilities under future scenarios were estimated using a monthly water balance model. Frequency analysis was performed to obtain the best probability distributions for both observed and simulated water quantity data. Results from the case study indicate that the performance of a trading system is highly scenario-dependent in future climate, with trading effectiveness highly optimistic or undesirable under different future scenarios. Trading effectiveness also largely depends on trading costs, with high costs resulting in failure of the trading program. (c) 2010 Elsevier B.V. All rights reserved.
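As a schematic of the indicator idea, the mean periodic water volume released by trading under random availability, consider the following Monte Carlo sketch. The availability distribution, requirement, and cost values are invented, and the actual method solves a stochastic optimization model rather than this simple surplus rule.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical setup: random periodic water availability (lognormal),
# a fixed total allocation held by users, and a per-unit trading cost.
n_periods = 100_000
availability = rng.lognormal(mean=3.0, sigma=0.4, size=n_periods)
requirement = 18.0      # total allocated demand (arbitrary units)
trading_cost = 0.2      # fraction of traded volume lost to transaction costs

# Volume that can be released by trading in each simulated period
surplus = np.maximum(availability - requirement, 0.0)
traded = surplus * (1.0 - trading_cost)
mean_traded = traded.mean()   # the performance indicator
```

Even this crude version reproduces the paper's qualitative point: the indicator is highly sensitive to both the assumed availability distribution (the climate scenario) and the trading cost.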
NASA Technical Reports Server (NTRS)
Neeman, Binyamin U.; Ohring, George; Joseph, Joachim H.
1988-01-01
A vertically integrated formulation (VIF) model for sea ice/snow and land snow is discussed which can simulate the nonlinear effects of heat storage and transfer through the layers of snow and ice. The VIF demonstrates the accuracy of the multilayer formulation, while benefitting from the computational flexibility of linear formulations. In the second part, the model is implemented in a seasonal dynamic zonally averaged climate model. It is found that, in response to a change between extreme high and low summer insolation orbits, the winter orbital change dominates over the opposite summer change for sea ice. For snow over land, the shorter but more pronounced summer orbital change is shown to dominate.
NASA Astrophysics Data System (ADS)
Lin, J. W. B.
2015-12-01
Historically, climate models have been developed incrementally and in compiled languages like Fortran. While the use of legacy compiled languages results in fast, time-tested code, the resulting model is limited in its modularity and cannot take advantage of functionality available with modern computer languages. Here we describe an effort at using the open-source, object-oriented language Python to create more flexible climate models: the package qtcm, a Python implementation of the intermediate-level Neelin-Zeng Quasi-Equilibrium Tropical Circulation model (QTCM1) of the atmosphere. The qtcm package retains the core numerics of QTCM1, written in Fortran, to optimize model performance, but uses Python structures and utilities to wrap the QTCM1 Fortran routines and manage model execution. The resulting "mixed language" modeling package allows the order and choice of subroutine execution to be altered at run time, and model analysis and visualization to be integrated interactively with model execution. This flexibility facilitates more complex scientific analysis using less complex code than would be possible using traditional languages alone, and provides tools to transform the traditional "formulate hypothesis → write and test code → run model → analyze results" sequence into a feedback loop that can be executed automatically by the computer.
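The run-time flexibility described can be sketched with a minimal driver pattern. This is not the actual qtcm API; the two "routines" below are pure-Python stand-ins for wrapped compiled subroutines.

```python
# Stand-ins for compiled model routines (in qtcm these would be
# wrapped Fortran subroutines).
def advance_dynamics(state):
    state["u"] = state["u"] + 0.1    # placeholder dynamics update
    return state

def apply_physics(state):
    state["T"] = state["T"] - 0.05   # placeholder physics update
    return state

class MixedLanguageModel:
    """Driver whose subroutine order/choice is decided at run time."""
    def __init__(self, steps):
        self.steps = list(steps)     # reorder or swap between runs

    def run(self, state, n_steps):
        for _ in range(n_steps):
            for step in self.steps:  # analysis hooks could be inserted here
                state = step(state)
        return state

model = MixedLanguageModel([advance_dynamics, apply_physics])
out = model.run({"u": 0.0, "T": 300.0}, n_steps=10)
```

Because the step list is ordinary Python data, the call order can be altered, routines replaced, or diagnostics interleaved without recompiling, which is precisely what a static Fortran call tree cannot offer.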
Evaluation of Ten Methods for Initializing a Land Surface Model
NASA Technical Reports Server (NTRS)
Rodell, M.; Houser, P. R.; Berg, A. A.; Famiglietti, J. S.
2005-01-01
Land surface models (LSMs) are computer programs, similar to weather and climate prediction models, which simulate the stocks and fluxes of water (including soil moisture, snow, evaporation, and runoff) and energy (including the temperature of and sensible heat released from the soil) after they arrive on the land surface as precipitation and sunlight. It is not currently possible to measure all of the variables of interest everywhere on Earth with sufficient accuracy and space-time resolution. Hence LSMs have been developed to integrate the available observations with our understanding of the physical processes involved, using powerful computers, in order to map these stocks and fluxes as they change in time. The maps are used to improve weather forecasts, support water resources and agricultural applications, and study the Earth's water cycle and climate variability. NASA's Global Land Data Assimilation System (GLDAS) project facilitates testing of several different LSMs with a variety of input datasets (e.g., precipitation, plant type).
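The water-balance bookkeeping an LSM performs can be sketched with a single-column "bucket" model. The capacity, potential evaporation, and forcing values are assumed for illustration and are far simpler than any of the LSMs tested in GLDAS.

```python
# Illustrative constants (assumed, not from any GLDAS LSM):
CAPACITY = 150.0   # mm, soil water holding capacity
PET = 4.0          # mm/day, potential evaporation

def step(soil, precip_mm):
    """One day of the bucket water balance: P in, E and runoff out."""
    evap = PET * (soil / CAPACITY)          # moisture-limited evaporation
    soil = soil + precip_mm - evap
    runoff = max(soil - CAPACITY, 0.0)      # saturation excess becomes runoff
    soil = min(soil, CAPACITY)
    return max(soil, 0.0), evap, runoff

# Spin up from half-full soil under steady 3 mm/day precipitation
soil = 75.0
for day in range(30):
    soil, evap, runoff = step(soil, precip_mm=3.0)
```

Even this toy shows the characteristic LSM behavior: soil moisture relaxes toward the level at which evaporation balances precipitation, and runoff only appears once the store saturates.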
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allen, Melissa R
2013-10-01
The following pages represent the status of policy regarding adaptation of the electric grid to climate change and proposed directions for new policy development. While strides are being made to understand the current climate and to predict hazards it may present to human systems, both the science and the policy remain at present in an analytical state. The policy proposed in this document involves first continued computational modeling of outcomes which will produce a portfolio of options to be considered in light of specific region-related risks. It is proposed that the modeling continue not only until reasonable policy at various levels of jurisdiction can be derived from its outcome but also on a continuing basis, so that as improvements in the understanding of the state and trajectory of climate science along with advancements in technology arise, they can be incorporated into an appropriate and evolving policy.
Improved Analysis of Earth System Models and Observations using Simple Climate Models
NASA Astrophysics Data System (ADS)
Nadiga, B. T.; Urban, N. M.
2016-12-01
Earth system models (ESM) are the most comprehensive tools we have to study climate change and develop climate projections. However, the computational infrastructure required and the cost incurred in running such ESMs precludes direct use of such models in conjunction with a wide variety of tools that can further our understanding of climate. Here we are referring to tools that range from dynamical systems tools that give insight into underlying flow structure and topology, to tools from various applied mathematical and statistical techniques that are central to quantifying stability, sensitivity, uncertainty and predictability, to machine learning tools that are now being rapidly developed or improved. Our approach to facilitate the use of such models is to analyze output of ESM experiments (cf. CMIP) using a range of simpler models that consider integral balances of important quantities such as mass and/or energy in a Bayesian framework. We highlight the use of this approach in the context of the uptake of heat by the world oceans in the ongoing global warming. Indeed, since in excess of 90% of the anomalous radiative forcing due to greenhouse gas emissions is sequestered in the world oceans, the nature of ocean heat uptake crucially determines the surface warming that is realized (cf. climate sensitivity). Nevertheless, ESMs themselves are never run long enough to directly assess climate sensitivity. So, we consider a range of models based on integral balances--balances that have to be realized in all first-principles based models of the climate system, including the most detailed state-of-the-art climate simulations. The models range from simple models of energy balance to those that consider dynamically important ocean processes such as the conveyor-belt circulation (Meridional Overturning Circulation, MOC), North Atlantic Deep Water (NADW) formation, Antarctic Circumpolar Current (ACC) and eddy mixing. 
Results from Bayesian analysis of such models using both ESM experiments and actual observations are presented. One such result points to the importance of direct sequestration of heat below 700 m, a process that is not allowed for in the simple models that have been traditionally used to deduce climate sensitivity.
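A minimal Bayesian calibration of a zero-dimensional energy balance, with a hand-rolled Metropolis sampler on synthetic data, illustrates the approach. All numbers here are invented; the actual analysis uses far richer integral-balance models, real ESM output, and real observations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "observations" of equilibrium warming T = F / lam under
# 2xCO2-like forcing F, with observational noise; we then infer the
# feedback parameter lam, and hence climate sensitivity, by MCMC.
F = 3.7                    # W m^-2 (approx. 2xCO2 forcing)
lam_true = 1.2             # W m^-2 K^-1 (assumed "truth")
obs = F / lam_true + 0.1 * rng.standard_normal(50)

def log_post(lam):
    if lam <= 0.1 or lam > 5.0:          # flat prior on (0.1, 5]
        return -np.inf
    resid = obs - F / lam
    return -0.5 * np.sum((resid / 0.1) ** 2)

# Random-walk Metropolis sampler
lam, samples = 1.0, []
lp = log_post(lam)
for _ in range(20_000):
    prop = lam + 0.05 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        lam, lp = prop, lp_prop
    samples.append(lam)

lam_mean = np.mean(samples[5000:])       # discard burn-in
sensitivity = F / lam_mean               # posterior-mean warming per 2xCO2
```

The same machinery, applied to models with explicit ocean heat uptake terms, is what allows deep-ocean sequestration (e.g. below 700 m) to be distinguished from surface feedbacks.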
Edlund, Stefan; Davis, Matthew; Douglas, Judith V; Kershenbaum, Arik; Waraporn, Narongrit; Lessler, Justin; Kaufman, James H
2012-09-18
The role of the Anopheles vector in malaria transmission and the effect of climate on Anopheles populations are well established. Models of the impact of climate change on the global malaria burden now have access to high-resolution climate data, but malaria surveillance data tend to be less precise, making model calibration problematic. Measurement of the malaria response to fluctuations in climate variables offers a way to address these difficulties. Given the demonstrated sensitivity of malaria transmission to vector capacity, this work tests response functions to fluctuations in land surface temperature and precipitation. This study of the regional sensitivity of malaria incidence to year-to-year climate variations used an extended Macdonald-Ross compartmental disease model (to compute malaria incidence) built on top of a global Anopheles vector capacity model (based on 10 years of satellite climate data). The predicted incidence was compared with estimates from the World Health Organization and the Malaria Atlas. The models and denominator data used are freely available through the Eclipse Foundation's Spatiotemporal Epidemiological Modeller (STEM). Although the absolute scale factor relating reported malaria to absolute incidence is uncertain, there is a positive correlation between predicted and reported year-to-year variation in malaria burden, with an averaged root mean square (RMS) error of 25% comparing normalized incidence across 86 countries. Based on this, the proposed measure of sensitivity of malaria to variations in climate variables indicates locations where malaria is most likely to increase or decrease in response to specific climate factors. Bootstrapping measures the increased uncertainty in predicting malaria sensitivity when reporting is restricted to the national level and an annual basis. Results indicate a potential 20x improvement in accuracy if data were available at the level of ISO 3166-2 national subdivisions and with monthly time sampling. 
The high spatial resolution possible with state-of-the-art numerical models can identify regions most likely to require intervention due to climate changes. Higher-resolution surveillance data can provide a better understanding of how climate fluctuations affect malaria incidence and improve predictions. An open-source modelling framework, such as STEM, can be a valuable tool for the scientific community and provide a collaborative platform for developing such models.
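The incidence calculation driven by vectorial capacity can be sketched in a few lines. The functions, parameter values, and the reduced one-equation form of the Ross-Macdonald model below are illustrative assumptions for exposition, not the STEM implementation:

```python
import math

def vectorial_capacity(m, a, p, n):
    """Classic Ross-Macdonald vectorial capacity:
    m: mosquitoes per human, a: bites per mosquito per day,
    p: daily mosquito survival probability, n: extrinsic incubation (days)."""
    return m * a ** 2 * p ** n / -math.log(p)

def simulate_incidence(C, b=0.5, c=0.5, r=0.01, x0=0.01, days=365):
    """Euler integration of a reduced Ross-Macdonald prevalence equation,
    dx/dt = b*c*C*x*(1 - x) - r*x, clipped to the unit interval."""
    x = x0
    for _ in range(days):
        dx = b * c * C * x * (1.0 - x) - r * x
        x = min(max(x + dx, 0.0), 1.0)
    return x

# Warmer land surface temperature typically raises survival p and shortens
# the incubation period n, increasing C and hence equilibrium prevalence.
C_cool = vectorial_capacity(m=5, a=0.2, p=0.85, n=12)
C_warm = vectorial_capacity(m=5, a=0.2, p=0.90, n=10)
```

This is the mechanism behind the sensitivity measure: small climate-driven changes in the vector parameters propagate nonlinearly into incidence.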
NASA Astrophysics Data System (ADS)
Yang, S.; Madsen, M. S.; Rodehacke, C. B.; Svendsen, S. H.; Adalgeirsdottir, G.
2014-12-01
Recent observations show that the Greenland ice sheet (GrIS) has been losing mass at an increasing rate over the past decades. Predicting GrIS changes and their climate consequences relies on understanding the interaction of the GrIS with the climate system on both global and local scales, and requires climate model systems with an explicit and physically consistent ice sheet module. A fully coupled global climate model with a dynamical ice sheet model for the GrIS has recently been developed. The model system, EC-EARTH-PISM, consists of EC-EARTH, an atmosphere, ocean, and sea ice model system, and the Parallel Ice Sheet Model (PISM). The coupling of PISM includes a modified surface physical parameterization in EC-EARTH adapted to the land ice surface over glaciated regions in Greenland. The PISM ice sheet model is forced with the surface mass balance (SMB) computed directly inside the EC-EARTH atmospheric module, accounting for precipitation, surface evaporation, and the melting of snow and ice over land ice. PISM returns the simulated basal melt, ice discharge, and ice cover (extent and thickness) as boundary conditions to EC-EARTH. This coupled system conserves mass and energy without being constrained by any anomaly correction or flux adjustment, and hence is suitable for investigating ice sheet-climate feedbacks. Three multi-century experiments for warm climate scenarios under (1) RCP8.5 climate forcing, (2) an abrupt 4xCO2 increase, and (3) an idealized 1% per year CO2 increase are performed using the coupled model system. The experiments are compared with their counterparts from the standard CMIP5 simulations (without the interactive ice sheet) to evaluate the performance of the coupled system and to quantify the GrIS feedbacks. In particular, the evolution of the Greenland ice sheet under the warm climate and its impacts on the climate system are investigated.
Freshwater fluxes from the Greenland ice sheet melt to the Arctic and North Atlantic basin and their influence on the ocean stratification and ocean circulation are analysed. The changes in the surface climate and the atmospheric circulation associated with the impact of the Greenland ice sheet changes are quantified. The interaction between the Greenland ice sheet and Arctic sea ice is also examined.
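EC-EARTH computes the SMB from its own surface scheme; as a minimal illustration of the mass-budget bookkeeping the coupling passes to PISM, here is a hypothetical sketch that pairs the budget terms named above with a simple positive-degree-day melt estimate (the degree-day factor and function names are assumed for illustration, not the EC-EARTH parameterization):

```python
def surface_mass_balance(precip, evap, melt):
    """SMB as accumulation minus ablation: precipitation minus surface
    evaporation/sublimation minus melt, the budget terms in the abstract.
    All terms in the same units, e.g. kg m-2 per period."""
    return precip - evap - melt

def pdd_melt(daily_temps_c, ddf=0.008):
    """Positive-degree-day melt estimate: sum of above-freezing daily
    temperatures times an assumed degree-day factor ddf
    (kg m-2 K-1 day-1)."""
    pdd = sum(max(t, 0.0) for t in daily_temps_c)
    return ddf * pdd
```

In the coupled system this bookkeeping is closed: the SMB handed to PISM and the basal melt and discharge handed back are what make the exchange mass-conserving.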
Probabilistic Mapping of Storm-induced Coastal Inundation for Climate Change Adaptation
NASA Astrophysics Data System (ADS)
Li, N.; Yamazaki, Y.; Roeber, V.; Cheung, K. F.; Chock, G.
2016-02-01
Global warming is posing an imminent threat to coastal communities worldwide. Under the IPCC RCP8.5 scenario, we utilize hurricane events downscaled from a CMIP5 global climate model using the stochastic-deterministic method of Emanuel (2013, Proc. Nat. Acad. Sci.) in a pilot study to develop an inundation map with projected sea-level rise for the urban Honolulu coast. The downscaling is performed for a 20-year period from 2081 to 2100 to capture ENSO, which strongly influences hurricane activity in the Pacific. A total of 50 simulations provide a quasi-stationary dataset of 1000 years for probabilistic analysis of the flood hazards toward the end of the century. We utilize the meta-model Hakou, which is based on precomputed hurricane scenarios using ADCIRC, SWAN, and a 1D Boussinesq model (Kennedy et al., 2012, Ocean Modelling), to estimate the annual maximum inundation along the project coastline at the present sea level. Screening of the preliminary results identifies the three most severe events for detailed inundation modeling using the package of Li et al. (2014, Ocean Modelling) at the projected sea level. For each event, the third-generation spectral model WAVEWATCH III of Tolman (2008, Ocean Modelling) provides the hurricane waves and the circulation model NEOWAVE of Yamazaki et al. (2009, 2011, Int. J. Num. Meth. Fluids) computes the surge using a system of telescopic nested grids from the open ocean to the project coastline. The output defines the boundary conditions and initial still-water elevation for computation of phase-resolving surf-zone and inundation processes using the 2D Boussinesq model of Roeber and Cheung (2012, Coastal Engineering). Each computed inundation event corresponds to an annual maximum, and with 1000 years of data, has an occurrence probability of 0.1% in a given year.
Barring the tail of the distribution, aggregation of the three computed events allows delineation of the inundation zone with annual exceedance probability equal to or greater than 0.2% (equivalent to a 500-year return period). An immediate application is to assess the inventory of buildings and structures in Honolulu that would be exposed to increased flood risks due to climate change and to identify potential revisions to the building code as part of the adaptation process.
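The probabilistic reasoning above (1000 simulated years of annual maxima, each carrying a 0.1% occurrence probability) reduces to a simple empirical exceedance count; the inundation values below are invented for illustration:

```python
def annual_exceedance_prob(annual_maxima, threshold):
    """Empirical annual exceedance probability: the fraction of simulated
    years whose maximum inundation exceeds the threshold."""
    n = len(annual_maxima)
    return sum(1 for x in annual_maxima if x > threshold) / n

# With a quasi-stationary 1000-year dataset, each annual maximum carries
# a 1/1000 = 0.1% occurrence probability; a threshold exceeded in 3 of
# 1000 years has a 0.3% annual exceedance probability.
maxima = [0.5] * 997 + [2.1, 2.4, 3.0]   # illustrative depths (m)
p = annual_exceedance_prob(maxima, 2.0)
```

The 0.2% (500-year) inundation zone is then the area reached by events whose depth is exceeded in at least 2 of the 1000 simulated years.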
NASA Astrophysics Data System (ADS)
Dessens, Olivier
2016-04-01
Integrated Assessment Models (IAMs) are used as crucial inputs to policy-making on climate change. These models simulate aspects of the economy and the climate system to deliver future projections and to explore the impact of mitigation and adaptation policies. The IAMs' climate representation is extremely important, as it can greatly influence future political action. The step-function response is a simple climate model recently developed by the UK Met Office and is an alternative method of estimating the climate response to an emission trajectory directly from global climate model step simulations. Good et al. (2013) formulated a method of reconstructing the climate response of general circulation models (GCMs) to emission trajectories through an idealized experiment. This method, called the "step-response approach", is based on the results of an idealized abrupt CO2 step experiment. TIAM-UCL is a technology-rich model that belongs to the family of partial-equilibrium, bottom-up models, developed at University College London to represent a wide spectrum of energy systems in 16 regions of the globe (Anandarajah et al. 2011). The model uses optimisation functions to obtain cost-efficient solutions in meeting an exogenously defined set of energy-service demands, given certain technological and environmental constraints. Furthermore, it employs linear programming techniques, making the step-function representation of the climate change response well suited to the model's mathematical formulation. For the first time, we have introduced the "step-response approach" developed at the UK Met Office into an IAM, the TIAM-UCL energy system model, and we investigate the main consequences of this modification on the results of the model in terms of climate and energy system responses.
The main advantage of this approach (apart from the low computational cost it entails) is that its results are directly traceable to the GCM involved and closely connected to well-known methods of analysing GCMs with step experiments.
Acknowledgments: This work is supported by the FP7 HELIX project (www.helixclimate.eu)
References:
Anandarajah, G., Pye, S., Usher, W., Kesicki, F., & McGlade, C. (2011). TIAM-UCL Global model documentation. https://www.ucl.ac.uk/energy-models/models/tiam-ucl/tiam-ucl-manual
Good, P., Gregory, J. M., Lowe, J. A., & Andrews, T. (2013). Abrupt CO2 experiments as tools for predicting and understanding CMIP5 representative concentration pathway projections. Climate Dynamics, 40(3-4), 1041-1053.
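The core of the step-response approach, assuming a linear climate response, is superposition: each year's forcing increment is scaled against the forcing of the abrupt step experiment and convolved with the step-experiment temperature response. A minimal sketch with illustrative names (not the Met Office or TIAM-UCL code):

```python
def step_response_temperature(forcing, step_response, f_step):
    """Reconstruct a GCM-like temperature trajectory by linear
    superposition of scaled step responses (in the spirit of Good et
    al., 2013): T(t) = sum_s (dF(s) / F_step) * R(t - s), where R is
    the response to an abrupt forcing of size F_step."""
    n = len(forcing)
    temps = [0.0] * n
    prev_f = 0.0
    for s in range(n):
        df = forcing[s] - prev_f          # forcing increment this year
        prev_f = forcing[s]
        for t in range(s, n):
            temps[t] += (df / f_step) * step_response[t - s]
    return temps
```

A constant forcing equal to the step forcing should simply reproduce the step-experiment response, which is a handy sanity check and illustrates why the method's results are directly traceable to the GCM's step runs.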
Uncertainty Quantification and Sensitivity Analysis in the CICE v5.1 Sea Ice Model
NASA Astrophysics Data System (ADS)
Urrego-Blanco, J. R.; Urban, N. M.
2015-12-01
Changes in the high latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with mid latitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. In this work we characterize parametric uncertainty in the Los Alamos Sea Ice Model (CICE) and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in about 40 individual model parameters. Unlike the common sensitivity analyses conducted in previous studies, where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol sequences are used to efficiently sample the full 40-dimensional parameter space. This approach requires a very large number of model evaluations, which are expensive to run. A more computationally efficient approach is implemented by training and cross-validating a surrogate (emulator) of the sea ice model with model output from 400 model runs. The emulator is used to make predictions of sea ice extent, area, and volume at several model configurations, which are then used to compute the Sobol sensitivity indices of the 40 parameters. A ranking based on the sensitivity indices indicates that model output is most sensitive to snow parameters, such as conductivity and grain size, and to the drainage of melt ponds. The main effects and interactions among the most influential parameters are also estimated by a non-parametric regression technique based on generalized additive models. It is recommended that research be prioritized toward determining the values of these most influential parameters more accurately, through observational studies or by improving existing parameterizations in the sea ice model.
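The variance-based idea can be illustrated with a pick-and-freeze Monte Carlo estimator of first-order Sobol indices. In the study the cheap evaluations come from the trained emulator and the sampling uses Sobol sequences; the sketch below uses plain random sampling and an analytic toy model, so everything here is a generic illustration, not CICE or the study's emulator:

```python
import random

def sobol_first_order(model, dim, n=20000, seed=0):
    """Pick-and-freeze Monte Carlo estimate of first-order Sobol indices
    S_i = Var(E[Y|X_i]) / Var(Y), inputs uniform on the unit hypercube."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    yA = [model(x) for x in A]
    mean = sum(yA) / n
    var = sum((y - mean) ** 2 for y in yA) / n
    indices = []
    for i in range(dim):
        # C_i: sampled from B, except column i is copied from A
        yC = [model(B[j][:i] + [A[j][i]] + B[j][i + 1:]) for j in range(n)]
        cov = sum(ya * yc for ya, yc in zip(yA, yC)) / n - mean * mean
        indices.append(cov / var)
    return indices
```

For the linear toy model y = x0 + 2*x1, the exact first-order indices are 0.2 and 0.8, which the estimator recovers to within Monte Carlo error; this is the quantity used to rank the 40 CICE parameters.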
An Online Approach for Training International Climate Scientists to Use Computer Models
NASA Astrophysics Data System (ADS)
Yarker, M. B.; Mesquita, M. D.; Veldore, V.
2013-12-01
With the mounting evidence presented in the work of the IPCC (2007), climate change has been acknowledged as a significant challenge to sustainable development by the international community. It is important that scientists in developing countries have access to knowledge and tools so that well-informed decisions can be made about the mitigation of and adaptation to climate change. However, training researchers to use climate modeling techniques and data analysis has become a challenge, because current capacity building approaches train researchers to use climate models through short-term workshops, which require a large amount of funding. It has also been observed that many participants who recently completed capacity building courses still view climate and weather models as a metaphorical 'black box', where data goes in and results come out; and there is evidence that these participants lack a basic understanding of the climate system. Both of these issues limit the ability of some scientists to go beyond running a model based on rote memorization of the process. As a result, they are unable to solve problems involving run-time errors, and thus cannot determine whether or not their model simulation is reasonable. Current research in the field of science education indicates that there are effective strategies to teach learners about science models. They involve having the learner work with, experiment with, modify, and apply models in a way that is significant and informative to the learner. It has also been noted that in the case of computational models, the installation and set-up process alone can be time consuming and confusing for new users, which can hinder their ability to concentrate on using, experimenting with, and applying the model to real-world scenarios.
Therefore, developing an online version of capacity building offers an alternative to workshop-based training programs: it makes use of new technologies and allows for a long-term educational process that engages learners with the subject matter in a way that is meaningful for their region. A number of science-education courses are being conducted online within a capacity building project called 'The Future of Climate Extremes in the Caribbean (XCUBE)'. If accepted, this presentation will explore a case study related to the online training courses provided via the website m2lab.org for the XCUBE project: 'Regional Climate Modeling using WRF'. The course teaches participants how to run WRF for climate simulations using a special version of the model called e-WRF (WRF for Educational purposes). This version of WRF does not require installation, so that student learning can be focused on using the model itself. In order to explore the effectiveness of the course, data will be collected from the participants as they complete it. There are currently over 200 participants registered for the course, comprising graduate students, professors, and researchers from many different science fields. Preliminary results indicate that many students enrolled in this course have previously taken a WRF tutorial, but do not feel confident enough to use it. Despite having taken a tutorial previously, for some participants the basic design of the model was a new concept. If accepted, a statistical analysis will be performed as more students complete the course.
Comparing impacts of climate change and mitigation on global agriculture by 2050
NASA Astrophysics Data System (ADS)
van Meijl, Hans; Havlik, Petr; Lotze-Campen, Hermann; Stehfest, Elke; Witzke, Peter; Pérez Domínguez, Ignacio; Bodirsky, Benjamin Leon; van Dijk, Michiel; Doelman, Jonathan; Fellmann, Thomas; Humpenöder, Florian; Koopman, Jason F. L.; Müller, Christoph; Popp, Alexander; Tabeau, Andrzej; Valin, Hugo; van Zeist, Willem-Jan
2018-06-01
Systematic model inter-comparison helps to narrow discrepancies in the analysis of the future impact of climate change on agricultural production. This paper presents a set of alternative scenarios by five global climate and agro-economic models. Covering integrated assessment (IMAGE), partial equilibrium (CAPRI, GLOBIOM, MAgPIE) and computable general equilibrium (MAGNET) models ensures a good coverage of biophysical and economic agricultural features. These models are harmonized with respect to basic model drivers to assess the range of potential impacts of climate change on the agricultural sector by 2050. Moreover, they quantify the economic consequences of stringent global emission mitigation efforts, such as non-CO2 emission taxes and land-based mitigation options, to stabilize global warming at 2 °C by the end of the century under different Shared Socioeconomic Pathways. A key contribution of the paper is a direct comparison of the impacts of climate change with the impacts of mitigation measures. In addition, our scenario design allows assessing the impact of residual climate change on the mitigation challenge. From a global perspective, the impact of climate change on agricultural production by mid-century is negative but small. A larger negative effect on agricultural production, most pronounced for ruminant meat production, is observed when emission mitigation measures compliant with a 2 °C target are put in place. Our results indicate that a mitigation strategy that embeds residual climate change effects (RCP2.6) has a negative impact on global agricultural production relative to a no-mitigation strategy with stronger climate impacts (RCP6.0). However, this is partially due to the limited impact of the climate change scenarios by 2050. The magnitude of price changes differs amongst models due to methodological differences.
Further research to achieve a better harmonization is needed, especially regarding endogenous food and feed demand, including substitution across individual commodities, and endogenous technological change.
Hadoop for High-Performance Climate Analytics: Use Cases and Lessons Learned
NASA Technical Reports Server (NTRS)
Tamkin, Glenn
2013-01-01
Scientific data services are a critical aspect of the mission of the NASA Center for Climate Simulation (NCCS). Hadoop, via MapReduce, provides an approach to high-performance analytics that is proving to be useful for data-intensive problems in climate research. It offers an analysis paradigm that uses clusters of computers and combines distributed storage of large data sets with parallel computation. The NCCS is particularly interested in the potential of Hadoop to speed up basic operations common to a wide range of analyses. In order to evaluate this potential, we prototyped a series of canonical MapReduce operations over a test suite of observational and climate simulation datasets. The initial focus was on averaging operations over arbitrary spatial and temporal extents within Modern-Era Retrospective Analysis for Research and Applications (MERRA) data. After preliminary results suggested that this approach improves efficiencies within data-intensive analytic workflows, we invested in building a cyberinfrastructure resource for developing a new generation of climate data analysis capabilities using Hadoop. This resource is focused on reducing the time spent in the preparation of reanalysis data used in data-model inter-comparison, a long-sought goal of the climate community. This paper summarizes the related use cases and lessons learned.
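A canonical MapReduce averaging operation of the kind prototyped here can be sketched outside Hadoop as a map phase emitting (key, partial-sum) pairs and a reduce phase combining them; the record layout and monthly key below are illustrative assumptions, not the MERRA schema:

```python
from collections import defaultdict

def map_phase(records):
    """Emit (month, (value, count)) pairs from (timestamp, lat, lon,
    value) records: the 'map' half of a temporal averaging operation."""
    for ts, lat, lon, value in records:
        month = ts[:7]                    # e.g. "1981-07"
        yield month, (value, 1)

def reduce_phase(pairs):
    """Combine partial sums per key and return per-key means:
    the 'reduce' half. Partial sums make the operation combinable,
    which is what lets Hadoop parallelize it across nodes."""
    acc = defaultdict(lambda: [0.0, 0])
    for key, (s, c) in pairs:
        acc[key][0] += s
        acc[key][1] += c
    return {k: s / c for k, (s, c) in acc.items()}
```

Because the intermediate values are (sum, count) pairs rather than means, partial results from different data blocks can be merged in any order, the property that makes this pattern scale over distributed storage.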
NASA Astrophysics Data System (ADS)
Smith, B.
2015-12-01
In 2014, eight Department of Energy (DOE) national laboratories, four academic institutions, one company, and the National Center for Atmospheric Research combined forces in a project called Accelerated Climate Modeling for Energy (ACME), with the goal of accelerating Earth system model development for climate and energy. Over the planned 10-year span, the project will conduct simulations and modeling on DOE's most powerful high-performance computing systems at the Oak Ridge, Argonne, and Lawrence Berkeley computing facilities. A key component of the ACME project is the development of an interactive test bed for the advanced Earth system model. Its execution infrastructure will accelerate model development and testing cycles. The ACME Workflow Group is leading the efforts to automate labor-intensive tasks, provide intelligent support for complex tasks, and reduce duplication of effort through collaboration support. As part of this new workflow environment, we have created a diagnostic, metric, and intercomparison Python framework, called UVCMetrics, to aid in the testing-to-production execution of the ACME model. The framework exploits similarities among different diagnostics to compactly support diagnosis of new models. It presently focuses on atmosphere and land but is designed to support ocean and sea ice model components as well. This framework is built on top of the existing open-source software framework known as the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT). Because of its flexible framework design, scientists and modelers can now generate thousands of possible diagnostic outputs. These diagnostics can compare model runs, compare model vs. observation, or simply verify that a model is physically realistic. Additional diagnostics are easily integrated into the framework, and our users have already added several.
Diagnostics can be generated, viewed, and manipulated from the UV-CDAT graphical user interface, Python command line scripts and programs, and web browsers. The framework is designed to be scalable to large datasets, yet easy to use and familiar to scientists using previous tools. Integration in the ACME overall user interface facilitates data publication, further analysis, and quick feedback to model developers and scientists making component or coupled model runs.
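Two of the simplest diagnostics such a framework batches over many variables, mean bias and RMSE of a model field against observations, can be sketched as follows (a generic illustration, not UVCMetrics code):

```python
def bias_and_rmse(model, obs):
    """Mean bias and root-mean-square error between a flattened model
    field and co-located observations: the basic ingredients of a
    model-vs-observation diagnostic."""
    n = len(model)
    diffs = [m - o for m, o in zip(model, obs)]
    bias = sum(diffs) / n
    rmse = (sum(d * d for d in diffs) / n) ** 0.5
    return bias, rmse
```

A framework's value lies in applying such metrics uniformly across hundreds of variables and model configurations, so that a new model run can be diagnosed with no per-variable code.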
qtcm 0.1.2: A Python Implementation of the Neelin-Zeng Quasi-Equilibrium Tropical Circulation model
NASA Astrophysics Data System (ADS)
Lin, J. W.-B.
2008-10-01
Historically, climate models have been developed incrementally and in compiled languages like Fortran. While the use of legacy compiled languages results in fast, time-tested code, the resulting model is limited in its modularity and cannot take advantage of functionality available with modern computer languages. Here we describe an effort at using the open-source, object-oriented language Python to create more flexible climate models: the package qtcm, a Python implementation of the intermediate-level Neelin-Zeng Quasi-Equilibrium Tropical Circulation model (QTCM1) of the atmosphere. The qtcm package retains the core numerics of QTCM1, written in Fortran to optimize model performance, but uses Python structures and utilities to wrap the QTCM1 Fortran routines and manage model execution. The resulting "mixed language" modeling package allows order and choice of subroutine execution to be altered at run time, and model analysis and visualization to be integrated interactively with model execution at run time. This flexibility facilitates more complex scientific analysis using less complex code than would be possible using traditional languages alone, and provides tools to transform the traditional "formulate hypothesis → write and test code → run model → analyze results" sequence into a feedback loop that can be executed automatically by the computer.
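The run-time flexibility described, altering the order and choice of subroutine execution, follows a pattern that can be sketched with plain-Python stand-ins for the compiled Fortran kernels (the class, routine names, and state variables below are hypothetical, not qtcm's API):

```python
class MixedLanguageDriver:
    """Sketch of the mixed-language pattern: compiled kernels (here
    plain-Python stand-ins for wrapped Fortran routines) are held in a
    mutable list, so the order and choice of execution can be changed
    at run time without recompiling."""

    def __init__(self):
        self.state = {"u": 0.0, "T": 300.0}
        self.routines = [self.advect, self.diffuse]   # reorderable

    def advect(self):
        # stand-in for a Fortran advection kernel
        self.state["u"] += 0.1

    def diffuse(self):
        # stand-in for a Fortran diffusion kernel relaxing T to 290 K
        self.state["T"] -= 0.01 * (self.state["T"] - 290.0)

    def step(self):
        for routine in self.routines:
            routine()
```

Swapping or reordering entries of `routines` between steps is the kind of experiment that a pure compiled-language model would require recompilation for; here it is an ordinary list operation.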
qtcm 0.1.2: a Python implementation of the Neelin-Zeng Quasi-Equilibrium Tropical Circulation Model
NASA Astrophysics Data System (ADS)
Lin, J. W.-B.
2009-02-01
Historically, climate models have been developed incrementally and in compiled languages like Fortran. While the use of legacy compiled languages results in fast, time-tested code, the resulting model is limited in its modularity and cannot take advantage of functionality available with modern computer languages. Here we describe an effort at using the open-source, object-oriented language Python to create more flexible climate models: the package qtcm, a Python implementation of the intermediate-level Neelin-Zeng Quasi-Equilibrium Tropical Circulation model (QTCM1) of the atmosphere. The qtcm package retains the core numerics of QTCM1, written in Fortran to optimize model performance, but uses Python structures and utilities to wrap the QTCM1 Fortran routines and manage model execution. The resulting "mixed language" modeling package allows order and choice of subroutine execution to be altered at run time, and model analysis and visualization to be integrated interactively with model execution at run time. This flexibility facilitates more complex scientific analysis using less complex code than would be possible using traditional languages alone, and provides tools to transform the traditional "formulate hypothesis → write and test code → run model → analyze results" sequence into a feedback loop that can be executed automatically by the computer.
Geoengineering by cloud seeding: influence on sea ice and climate system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rasch, Philip J.; Latham, John; Chen, Chih-Chieh
2009-12-18
GCM computations using a fully coupled ocean atmosphere model indicate that increasing cloud reflectivity by seeding maritime boundary layer clouds with particles made from seawater may compensate for some of the effects on climate of increasing greenhouse gas concentrations. The chosen seeding strategy (one of many possible scenarios) can restore global averages of temperature, precipitation and sea ice to present day values, but not simultaneously. The response varies nonlinearly with the extent of the seeding, and geoengineering generates local changes to important climatic features. The global tradeoffs of restoring ice cover and cooling the planet must be assessed alongside the local changes to climate features.
Postglacial migration supplements climate in determining plant species ranges in Europe
Normand, Signe; Ricklefs, Robert E.; Skov, Flemming; Bladt, Jesper; Tackenberg, Oliver; Svenning, Jens-Christian
2011-01-01
The influence of dispersal limitation on species ranges remains controversial. Considering the dramatic impacts of the last glaciation in Europe, species might not have tracked climate changes through time and, as a consequence, their present-day ranges might be in disequilibrium with current climate. For 1016 European plant species, we assessed the relative importance of current climate and limited postglacial migration in determining species ranges using regression modelling with explanatory variables representing climate and a novel, species-specific, hindcasting-based measure of accessibility to postglacial colonization. Climate was important for all species, while postglacial colonization also constrained the ranges of more than 50 per cent of the species. On average, climate explained five times more variation in species ranges than accessibility, but accessibility was the strongest determinant for one-sixth of the species. Accessibility was particularly important for species with limited long-distance dispersal ability, for species with southern glacial ranges, for seed plants compared with ferns, and for small-range species in southern Europe. In addition, accessibility explained one-third of the variation in species' disequilibrium with climate as measured by the realized/potential range size ratio computed with niche modelling. In conclusion, we show that although climate is the dominant broad-scale determinant of European plant species ranges, constrained dispersal plays an important supplementary role. PMID:21543356
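Partitioning explained variation between a climate predictor and an accessibility predictor can be illustrated with the closed-form two-predictor multiple R²; this is a generic sketch of the variance-partitioning idea, not the study's actual regression models:

```python
def corr(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def partition_r2(y, climate, access):
    """Two-predictor variance partitioning: multiple R2 from pairwise
    correlations, plus each predictor's unique contribution (total
    minus the other predictor's simple R2)."""
    r1, r2, r12 = corr(y, climate), corr(y, access), corr(climate, access)
    total = (r1 * r1 + r2 * r2 - 2.0 * r1 * r2 * r12) / (1.0 - r12 * r12)
    return {"total": total,
            "climate_unique": total - r2 * r2,
            "access_unique": total - r1 * r1}
```

Comparing the unique contributions is the basic logic behind statements such as "climate explained five times more variation than accessibility".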
Paleoclimates: Understanding climate change past and present
Cronin, Thomas M.
2010-01-01
The field of paleoclimatology relies on physical, chemical, and biological proxies of past climate changes that have been preserved in natural archives such as glacial ice, tree rings, sediments, corals, and speleothems. Paleoclimate archives obtained through field investigations, ocean sediment coring expeditions, ice sheet coring programs, and other projects allow scientists to reconstruct climate change over much of earth's history. When combined with computer model simulations, paleoclimatic reconstructions are used to test hypotheses about the causes of climatic change, such as greenhouse gases, solar variability, earth's orbital variations, and hydrological, oceanic, and tectonic processes. This book is a comprehensive, state-of-the-art synthesis of paleoclimate research covering all geological timescales, emphasizing topics that shed light on modern trends in the earth's climate. Thomas M. Cronin discusses recent discoveries about past periods of global warmth, changes in atmospheric greenhouse gas concentrations, abrupt climate and sea-level change, natural temperature variability, and other topics directly relevant to controversies over the causes and impacts of climate change. This text is geared toward advanced undergraduate and graduate students and researchers in geology, geography, biology, glaciology, oceanography, atmospheric sciences, and climate modeling, fields that contribute to paleoclimatology. This volume can also serve as a reference for those requiring a general background on natural climate variability.
Impacts of Climate Change on the Global Invasion Potential of the African Clawed Frog Xenopus laevis
Ihlow, Flora; Courant, Julien; Secondi, Jean; Herrel, Anthony; Rebelo, Rui; Measey, G. John; Lillo, Francesco; De Villiers, F. André; Vogt, Solveig; De Busschere, Charlotte; Backeljau, Thierry; Rödder, Dennis
2016-01-01
By altering or eliminating delicate ecological relationships, non-indigenous species are considered a major threat to biodiversity, as well as a driver of environmental change. Global climate change affects ecosystems and ecological communities, leading to changes in the phenology, geographic ranges, or population abundance of several species. Thus, predicting the impacts of global climate change on the current and future distribution of invasive species is an important subject in macroecological studies. The African clawed frog (Xenopus laevis), native to South Africa, possesses a strong invasion potential and populations have become established in numerous countries across four continents. The global invasion potential of X. laevis was assessed using correlative species distribution models (SDMs). SDMs were computed based on a comprehensive set of occurrence records covering South Africa, North America, South America and Europe and a set of nine environmental predictors. Models were built using both a maximum entropy model and an ensemble approach integrating eight algorithms. The future occurrence probabilities for X. laevis were subsequently computed using bioclimatic variables for 2070 following four different IPCC scenarios. Despite minor differences between the statistical approaches, both SDMs predict the future potential distribution of X. laevis, on a global scale, to decrease across all climate change scenarios. On a continental scale, both SDMs predict decreasing potential distributions in the species’ native range in South Africa, as well as in the invaded areas in North and South America, and in Australia where the species has not been introduced. In contrast, both SDMs predict the potential range size to expand in Europe. Our results suggest that all probability classes will be equally affected by climate change. 
New regional conditions may promote new invasions or the spread of established invasive populations, especially in France and Great Britain. PMID:27248830
Ihlow, Flora; Courant, Julien; Secondi, Jean; Herrel, Anthony; Rebelo, Rui; Measey, G John; Lillo, Francesco; De Villiers, F André; Vogt, Solveig; De Busschere, Charlotte; Backeljau, Thierry; Rödder, Dennis
2016-01-01
By altering or eliminating delicate ecological relationships, non-indigenous species are considered a major threat to biodiversity, as well as a driver of environmental change. Global climate change affects ecosystems and ecological communities, leading to changes in the phenology, geographic ranges, or population abundance of several species. Thus, predicting the impacts of global climate change on the current and future distribution of invasive species is an important subject in macroecological studies. The African clawed frog (Xenopus laevis), native to South Africa, possesses a strong invasion potential and populations have become established in numerous countries across four continents. The global invasion potential of X. laevis was assessed using correlative species distribution models (SDMs). SDMs were computed based on a comprehensive set of occurrence records covering South Africa, North America, South America and Europe and a set of nine environmental predictors. Models were built using both a maximum entropy model and an ensemble approach integrating eight algorithms. The future occurrence probabilities for X. laevis were subsequently computed using bioclimatic variables for 2070 following four different IPCC scenarios. Despite minor differences between the statistical approaches, both SDMs predict the future potential distribution of X. laevis, on a global scale, to decrease across all climate change scenarios. On a continental scale, both SDMs predict decreasing potential distributions in the species' native range in South Africa, as well as in the invaded areas in North and South America, and in Australia where the species has not been introduced. In contrast, both SDMs predict the potential range size to expand in Europe. Our results suggest that all probability classes will be equally affected by climate change. New regional conditions may promote new invasions or the spread of established invasive populations, especially in France and Great Britain.
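The ensemble step, combining per-algorithm occurrence probabilities and thresholding them into a potential range size, can be sketched as follows; the committee-averaging rule, threshold, and suitability values are illustrative assumptions, not the study's ensemble settings:

```python
def ensemble_suitability(cell_predictions, weights=None):
    """Weighted mean of per-algorithm occurrence probabilities for one
    grid cell (committee averaging, one common ensemble rule)."""
    if weights is None:
        weights = [1.0] * len(cell_predictions)
    total = sum(weights)
    return sum(p * w for p, w in zip(cell_predictions, weights)) / total

def range_size(suitability_map, threshold=0.5):
    """Number of grid cells whose ensemble suitability exceeds the
    presence threshold: a simple proxy for potential range size."""
    return sum(1 for s in suitability_map if s > threshold)

# Hypothetical current vs. 2070 suitability for five cells, each scored
# by two algorithms:
current = [ensemble_suitability(p) for p in
           [[0.9, 0.8], [0.7, 0.6], [0.6, 0.55], [0.3, 0.2], [0.1, 0.2]]]
future = [ensemble_suitability(p) for p in
          [[0.6, 0.5], [0.4, 0.45], [0.3, 0.2], [0.2, 0.1], [0.1, 0.0]]]
```

Comparing range sizes computed from current-climate and 2070 bioclimatic layers is what yields conclusions such as a global decrease but a European expansion in potential distribution.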
The impact of global warming on river runoff
NASA Technical Reports Server (NTRS)
Miller, James R.; Russell, Gary L.
1992-01-01
A global atmospheric model is used to calculate the annual river runoff for 33 of the world's major rivers for the present climate and for a doubled CO2 climate. The model has a horizontal resolution of 4 x 5 deg, but the runoff from each model grid box is quartered and added to the appropriate river drainage basin on a 2 x 2.5 deg resolution. The computed runoff depends on the model's precipitation, evapotranspiration, and soil moisture storage. For the doubled CO2 climate, the runoff increased for 25 of the 33 rivers, and in most cases the increases coincide with increased rainfall within the drainage basins. There were runoff increases in all rivers in high northern latitudes, with a maximum increase of 47 percent. At low latitudes there were both increases and decreases, ranging from a 96 percent increase to a 43 percent decrease. The effect of the simplified model assumptions of land-atmosphere interactions on the results is discussed.
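The regridding step described in this abstract, quartering each coarse-grid runoff value onto the finer basin grid, can be sketched in a few lines. The basin-assignment map below is a hypothetical stand-in for the real drainage-basin dataset, and the grid indexing is an assumed convention.

```python
# Sketch of the regridding described above: runoff computed on a 4 x 5 deg
# grid is quartered onto a 2 x 2.5 deg grid and then summed per drainage
# basin. Each coarse box (i, j) is assumed to cover the four fine boxes
# (2i..2i+1, 2j..2j+1); the basin mask here is illustrative only.

def quarter_runoff(coarse, basin_of):
    """coarse: dict {(i, j): runoff} on the 4 x 5 grid.
    basin_of: dict {(fi, fj): basin_name} on the 2 x 2.5 grid.
    Returns total runoff accumulated per basin."""
    totals = {}
    for (i, j), r in coarse.items():
        for di in (0, 1):
            for dj in (0, 1):
                fine = (2 * i + di, 2 * j + dj)
                basin = basin_of.get(fine)
                if basin is not None:
                    totals[basin] = totals.get(basin, 0.0) + r / 4.0
    return totals

coarse = {(0, 0): 100.0}                        # runoff (mm) in one coarse box
basin_of = {(0, 0): "Amazon", (0, 1): "Amazon",
            (1, 0): "Orinoco", (1, 1): "Orinoco"}
print(quarter_runoff(coarse, basin_of))         # each basin receives two quarters
```

Because each quarter carries exactly one fourth of the coarse-box runoff, total water is conserved wherever every fine box falls inside some basin.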
NASA Astrophysics Data System (ADS)
Durán-Barroso, Pablo; González, Javier; Valdés, Juan B.
2016-04-01
Rainfall-runoff quantification is one of the most important tasks in both engineering and watershed management, as it makes it possible to identify, forecast and explain watershed response. For that purpose, the Natural Resources Conservation Service Curve Number method (NRCS CN) is the most widely recognized conceptual lumped model in the field of rainfall-runoff estimation. Nevertheless, there is still an ongoing discussion about the procedure used to determine the portion of rainfall retained in the watershed before runoff is generated, known as the initial abstraction. This quantity is computed as a ratio (λ) of the watershed's potential maximum soil retention S. Initially, this ratio was assumed to be 0.2, but it has since been proposed that it be revised to 0.05. However, existing procedures for converting NRCS CN model parameters obtained under different hypotheses about λ do not account for the climatic conditions of each watershed. For this reason, we propose a new simple method for computing model parameters that adapts to local conditions by taking into account regional patterns of climate. After checking the goodness of this procedure against existing ones in 34 different watersheds located in Ohio and Texas (United States), we conclude that this new methodology is the most accurate and efficient alternative for refitting the initial abstraction ratio.
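The NRCS CN relationship discussed above can be written compactly, with the initial-abstraction ratio λ left as a parameter so the classical value (0.2) and the proposed revision (0.05) can be compared. This is a minimal sketch of the standard equations; the storm depth and curve number below are illustrative.

```python
# NRCS CN direct-runoff equation (metric form):
#   S  = 25400 / CN - 254        potential maximum retention (mm)
#   Ia = lambda * S              initial abstraction
#   Q  = (P - Ia)^2 / (P - Ia + S)   for P > Ia, else Q = 0

def scs_runoff(p_mm, cn, lam=0.2):
    """Direct runoff Q (mm) for storm rainfall p_mm under curve number cn."""
    s = 25400.0 / cn - 254.0
    ia = lam * s
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Same storm and curve number under the two initial-abstraction assumptions:
print(round(scs_runoff(50.0, 75, lam=0.2), 2))
print(round(scs_runoff(50.0, 75, lam=0.05), 2))
```

The comparison makes the practical stake of the λ debate visible: for the same storm and curve number, the smaller ratio yields substantially more computed runoff, which is why parameters calibrated under one hypothesis cannot simply be reused under the other.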
Climatic Forecasting of Net Infiltration at Yucca Mountain, Using Analogue Meteorological Data
NASA Astrophysics Data System (ADS)
Faybishenko, B.
2005-12-01
Net infiltration is a key hydrologic parameter that, throughout the unsaturated zone, controls the rate of deep percolation, the groundwater recharge, radionuclide transport, and seepage into underground tunnels. Because net infiltration is largely affected by climatic conditions, future changes in climatic conditions will potentially alter net infiltration. The objectives of this presentation are to: (1) Present a conceptual model and a semi-empirical approach for regional climatic forecasting of net infiltration, based on precipitation and temperature data from analogue meteorological stations; and (2) Demonstrate the results of forecasting net infiltration for future climates - interglacial, monsoon and glacial - over the Yucca Mountain region for a period of 500,000 years. Calculations of net infiltration were performed using a modified Budyko's water-balance model, and potential evapotranspiration was evaluated from the temperature-based Thornthwaite formula. (Both Budyko's and Thornthwaite's formulae have been used broadly in hydrological studies.) The results of these calculations were used for ranking net infiltration, along with aridity and precipitation-effectiveness (P-E) indices, for future climatic scenarios. Using this approach, we determined a general trend of increasing net infiltration from the present-day (interglacial) climate to the monsoon, intermediate (glacial transition) climate, a trend that continued into the glacial climate time frame. The ranking of aridity and P-E indices is practically the same as that for net infiltration. Validation of the computed net infiltration rates yielded a good match with other field and modeling study results related to groundwater recharge and net infiltration evaluation.
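The temperature-based Thornthwaite formula mentioned above admits a short implementation. The sketch below uses the unadjusted monthly form (no day-length correction), and the twelve mean monthly temperatures are illustrative, not Yucca Mountain data.

```python
# Thornthwaite monthly PET (unadjusted form):
#   I   = sum over months with T > 0 of (T / 5)^1.514     (annual heat index)
#   a   = 6.75e-7 I^3 - 7.71e-5 I^2 + 1.792e-2 I + 0.49239
#   PET = 16 * (10 T / I)^a   mm/month   (0 for months with T <= 0)

def thornthwaite_pet(monthly_t_c):
    """Return unadjusted monthly PET (mm/month) for 12 mean temps (deg C)."""
    heat_index = sum((t / 5.0) ** 1.514 for t in monthly_t_c if t > 0)
    a = (6.75e-7 * heat_index ** 3 - 7.71e-5 * heat_index ** 2
         + 1.792e-2 * heat_index + 0.49239)
    return [0.0 if t <= 0 else 16.0 * (10.0 * t / heat_index) ** a
            for t in monthly_t_c]

temps = [2, 4, 8, 12, 17, 21, 24, 23, 19, 13, 7, 3]   # illustrative climate
pet = thornthwaite_pet(temps)
print(round(max(pet), 1))   # peak-summer monthly PET
```

Because the method needs only monthly mean temperature, it lends itself to the analogue-station approach of the abstract: substituting an analogue site's temperature record directly yields a PET series for the corresponding future climate state.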
NASA Astrophysics Data System (ADS)
Malone, A.; Pierrehumbert, R.; Insel, N.; Lowell, T. V.; Kelly, M. A.
2012-12-01
The response of the tropics to climate forcing mechanisms is poorly understood, and there is limited data regarding past tropical climate fluctuations. Past climate fluctuations often leave a detectable record of glacial response in the location of moraines. Computer reconstructions of glacial length variations can thus help constrain past climate fluctuations. Chronology and position data for Holocene moraines are available for the Quelccaya Ice Cap in the Peruvian Andes. The Quelccaya Ice Cap is the equatorial region's largest glaciated area, and given its size and the available data, it is an ideal location at which to use a computer glacier model to reconstruct past glacial extents and constrain past tropical climate fluctuations. We can reproduce the current length and shape of the glacier in the Huancane Valley of the Quelccaya Ice Cap using a 1-D mountain glacier flowline model with an orographic precipitation scheme, an energy balance model for the ablation scheme, and reasonable modern climate conditions. We conduct two experiments. First, we determine the amount of cooling necessary to reproduce the observed Holocene moraine locations by holding the precipitation profile constant and varying the mean sea surface temperature (SST) values. Second, we determine the amount of precipitation increase necessary to reproduce the observed moraine locations by holding the mean SST value constant and varying the maximum precipitation values. We find that the glacier's length is highly sensitive to changes in temperature while only weakly sensitive to changes in precipitation. In the constant precipitation experiment, a decrease in the mean SST of only 0.35 °C can reproduce the nearest Holocene moraine downslope from the current glacier terminus and a decrease in the mean SST of only 1.43 °C can reproduce the furthest Holocene moraine downslope from the current terminus. 
In the experiment with constant SST, the necessary increase in maximum precipitation is much greater. An increase in the maximum precipitation of 30% is necessary to reproduce the nearest Holocene moraine and an increase in the maximum precipitation of 130% is necessary to reproduce the furthest Holocene moraine. Our results provide a range of values for the mean SST and maximum precipitation that can reproduce the location of Holocene glacial moraines, constraining some of the climate fluctuations in the tropics during the Holocene. These constraints can be used to test hypotheses for climate forcing mechanisms during Holocene events such as the Little Ice Age and possibly provide insight into future tropical climate fluctuations given current and future forcing mechanisms.
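The experimental design described above, holding one forcing fixed and searching for the perturbation of the other that places the terminus at a target moraine, can be illustrated with a toy model. The linear length response and its sensitivity coefficients below are entirely hypothetical; the actual study used a 1-D flowline model with orographic precipitation and an energy-balance ablation scheme.

```python
# Toy illustration of the "hold one forcing, vary the other" experiments:
# bisection on the SST perturbation that moves a (hypothetical, linear)
# glacier terminus to a target moraine position downslope.

def terminus_position(d_sst, d_precip_frac, sens_t=2.0, sens_p=-0.5):
    """Toy terminus position (km downslope of the modern terminus).
    Cooling (d_sst < 0) and wetter climates both advance the glacier.
    The sensitivities are made-up numbers, not fitted values."""
    return -sens_t * d_sst - sens_p * d_precip_frac

def cooling_for_moraine(target_km, lo=-5.0, hi=0.0, tol=1e-6):
    """SST change (deg C) putting the terminus at target_km, with
    precipitation held constant (d_precip_frac = 0)."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if terminus_position(mid, 0.0) < target_km:
            hi = mid          # not enough cooling: search the colder half
        else:
            lo = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

print(round(cooling_for_moraine(1.0), 3))   # toy answer: about -0.5 deg C
```

The same search, run with SST fixed and precipitation varied, would reproduce the paired experiment; the contrast between the small temperature perturbations and the large precipitation increases reported above reflects the glacier's much stronger temperature sensitivity.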
Dugan, J.T.; Peckenpaugh, J.M.
1985-01-01
The Central Midwest aquifer system, in parts of Arkansas, Colorado, Kansas, Missouri, Nebraska, New Mexico, South Dakota, and Texas, is a region of great hydrologic diversity. This study examines the relationships between climate, vegetation, and soil that affect consumptive water use and recharge to the groundwater system. Computations of potential recharge and consumptive water use were restricted to those areas where the aquifers under consideration were the immediate underlying system. The principal method of analysis utilized a soil moisture computer model. This model requires four types of input: (1) hydrologic properties of the soils, (2) vegetation types, (3) monthly precipitation, and (4) computed monthly potential evapotranspiration (PET) values. The climatic factors that affect consumptive water use and recharge were extensively mapped for the study area. Nearly all the pertinent climatic elements confirmed the extreme diversity of the region. PET and those factors affecting it--solar radiation, temperature, and humidity--showed large regional differences; mean annual PET ranged from 36 to 70 inches in the study area. The seasonal climatic patterns indicate significant regional differences in those factors affecting seasonal consumptive water use and recharge. In the southern and western parts of the study area, consumptive water use occurred nearly the entire year; whereas, in northern parts it occurred primarily during the warm season (April through September). Results of the soil-moisture program, which added the effects of vegetation and the hydrologic characteristics of the soil to computed PET values, confirmed the significant regional differences in consumptive water use or actual evapotranspiration (AET) and potential groundwater recharge. Under two different vegetative conditions--the 1978 conditions and pre-agricultural conditions consisting of only grassland and woodland--overall differences in recharge were minimal. 
Mean annual recharge under both conditions averaged slightly more than 4.5 inches for the entire study area, but ranged from less than 0.10 inches in eastern Colorado to slightly more than 15 inches in Arkansas.
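The soil-moisture bookkeeping behind such a model can be sketched with a minimal monthly balance: precipitation first satisfies PET, a soil store bounded by an available-water capacity buffers the remainder, and any overflow becomes potential recharge. All numbers below are illustrative, and the real model's treatment of vegetation and soil hydrologic properties is far richer.

```python
# Minimal monthly soil-moisture balance in the spirit of the model above.
# Surplus months (P >= PET): AET = PET, excess refills the store, and
# overflow beyond the available-water capacity counts as recharge.
# Deficit months (P < PET): the store supplements rainfall, up to its content.

def water_balance(precip, pet, awc=150.0):
    """precip, pet: equal-length monthly lists (mm). Returns (aet, recharge)."""
    store, aet, recharge = awc, [], []     # start with a full store
    for p, e in zip(precip, pet):
        if p >= e:
            aet.append(e)
            store += p - e
            recharge.append(max(0.0, store - awc))
            store = min(store, awc)
        else:
            draw = min(store, e - p)
            aet.append(p + draw)
            store -= draw
            recharge.append(0.0)
    return aet, recharge

precip = [80, 70, 60, 40, 20, 10]   # illustrative wet-to-dry half year (mm)
pet = [20, 30, 50, 80, 110, 120]
aet, rech = water_balance(precip, pet)
print(sum(rech))                    # recharge arises only in surplus months
```

Even this toy version reproduces the qualitative pattern in the abstract: where PET dominates year-round, the store rarely overflows and recharge stays near zero, while wetter, lower-PET regions generate recharge every surplus month.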
NASA Astrophysics Data System (ADS)
Huq, E.; Abdul-Aziz, O. I.
2017-12-01
We computed historical and future storm runoff scenarios for the Shingle Creek Basin, which includes the growing urban centers of central Florida (e.g., the City of Orlando). The US EPA Storm Water Management Model (SWMM 5.1) was used to develop a mechanistic hydrologic model for the basin by incorporating components of urban hydrology, hydroclimatological variables, and land use/cover features. The model was calibrated and validated with historical streamflow of 2004-2013 near the outlet of the Shingle Creek. The calibrated model was used to compute the sensitivities of the stormwater budget to reference changes in hydroclimatological variables (rainfall and evapotranspiration) and land use/cover features (imperviousness, roughness). Basin stormwater budgets for the historical (2010s = 2004-2013) and future periods (2050s = 2030-2059; 2080s = 2070-2099) were also computed based on downscaled climatic projections of 20 GCMs-RCMs representing the Coupled Model Intercomparison Project (CMIP5), and anticipated changes in land use/cover. The sensitivity analyses indicated the dominant drivers of urban runoff in the basin. Comparative assessment of the historical and future stormwater runoff scenarios helped to locate basin areas that would be at a higher risk of future stormwater flooding. The importance of the study lies in providing valuable guidelines for managing stormwater flooding in central Florida and similar growing urban centers around the world.
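The kind of sensitivity analysis described above can be illustrated, far more crudely than SWMM, with an area-weighted runoff coefficient: runoff responds linearly to rainfall, but the coefficient itself shifts with the basin's impervious fraction. The coefficients and reference values below are assumed, not calibrated Shingle Creek numbers.

```python
# Toy runoff-coefficient sensitivity sketch (stand-in for a full SWMM run):
# composite coefficient = weighted mix of impervious and pervious responses.

def annual_runoff(rain_mm, imperv_frac, c_imperv=0.90, c_perv=0.20):
    """Runoff (mm) from an area-weighted runoff coefficient (assumed values)."""
    c = imperv_frac * c_imperv + (1.0 - imperv_frac) * c_perv
    return c * rain_mm

base = annual_runoff(1250.0, 0.35)          # hypothetical reference case
for d_rain, d_imp in [(0.10, 0.0), (0.0, 0.10), (0.10, 0.10)]:
    scenario = annual_runoff(1250.0 * (1 + d_rain), 0.35 + d_imp)
    print(round(100.0 * (scenario / base - 1.0), 1))  # % change in runoff
```

Even this sketch shows why imperviousness can dominate the sensitivity ranking: a 10-percentage-point increase in impervious fraction raises runoff by more than a 10% increase in rainfall does, because it shifts area from a low to a high runoff coefficient.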
NASA Astrophysics Data System (ADS)
Straatsma, Menno; Droogers, Peter; Brandsma, Jaïrus; Buytaert, Wouter; Karssenberg, Derek; Van Beek, Rens; Wada, Yoshihide; Sutanudjaja, Edwin; Vitolo, Claudia; Schmitz, Oliver; Meijer, Karen; Van Aalst, Maaike; Bierkens, Marc
2014-05-01
Water scarcity affects large parts of the world. Over the course of the twenty-first century, water demand is likely to increase due to population growth and associated food production, and increased economic activity, while water supply is projected to decrease in many regions due to climate change. Despite recent studies that analyze the effect of climate change on water scarcity, e.g. using climate projections under representative concentration pathways (RCP) of the fifth assessment report of the IPCC (AR5), decision support for closing the water gap between now and 2100 does not exist at a meaningful scale and with a global coverage. In this study, we aimed (i) to assess the joint impact of climatic and socio-economic change on water scarcity, (ii) to integrate impact and potential adaptation in one workflow, (iii) to prioritize adaptation options to counteract water scarcity based on their financial, regional socio-economic and environmental implications, and (iv) to deliver all this information in an integrated user-friendly web-based service. To enable the combination of global coverage with local relevance, we aggregated all results for 1604 water provinces (food producing units) delineated in this study, which is five times smaller than previous food producing units. Water supply was computed using the PCR-GLOBWB hydrological and water resources model, parameterized at 5 arcminutes for the whole globe, excluding Antarctica and Greenland. We ran PCR-GLOBWB with a daily forcing derived from five different GCMs from CMIP5 (GFDL-ESM2M, HadGEM2-ES, IPSL-CM5A-LR, MIROC-ESM-CHEM, NorESM1-M) that were bias corrected using observation-based WATCH data for 1960-1999. For each of the models all four RCPs (RCP 2.6, 4.5, 6.0, and 8.5) were run, producing the ensemble of 20 future projections. The blue water supply was aggregated per month and per water province. 
Industrial, domestic and irrigation water demands were computed for a limited number of realistic combinations of shared socio-economic pathways (SSPs) and RCPs. Our Water And Climate Adaptation Model (WatCAM) was used to compute the water gap based on reservoir capacity, water supply, and water demand. WatCAM is based on the existing ModSim (Labadie, 2010) water allocation model, and facilitated the evaluation of nine technological and infrastructural adaptation measures to assess the investments needed to bridge the future water gap. Regional environmental and socio-economic effects of these investments, such as environmental flows or downstream effects, were evaluated. A scheme was developed to evaluate the strategies on robustness and flexibility under climate change and scenario uncertainty, and each measure was linked to possibilities for investment and financing mechanisms. WatCAM is available as a web modeling service from www.water2invest.com, and enables user-specified adaptation measures and the creation of an ensemble of water gap forecasts.
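The 5-GCM by 4-RCP design described above can be enumerated directly; each pair names one of the 20 future projections that were run (model names are written here in their standard CMIP5 spellings).

```python
# Enumerating the 20-member ensemble: every GCM is run under every RCP.
from itertools import product

gcms = ["GFDL-ESM2M", "HadGEM2-ES", "IPSL-CM5A-LR",
        "MIROC-ESM-CHEM", "NorESM1-M"]
rcps = ["RCP2.6", "RCP4.5", "RCP6.0", "RCP8.5"]

runs = [f"{g}_{r}" for g, r in product(gcms, rcps)]
print(len(runs))          # 5 GCMs x 4 RCPs = 20 ensemble members
```

Spanning the full cross product, rather than a subset, is what lets the study separate scenario uncertainty (across RCPs) from model uncertainty (across GCMs) in the resulting water-gap forecasts.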
A Simplified Biosphere Model for Global Climate Studies.
NASA Astrophysics Data System (ADS)
Xue, Y.; Sellers, P. J.; Kinter, J. L.; Shukla, J.
1991-03-01
The Simple Biosphere Model (SiB) as described in Sellers et al. is a biophysically based model of land surface-atmosphere interaction. For some general circulation model (GCM) climate studies, further simplifications are desirable for greater computational efficiency and, more importantly, to consolidate the parametric representation. Three major reductions in the complexity of SiB have been achieved in the present study. The diurnal variation of surface albedo is computed in SiB by means of a comprehensive yet complex calculation. Since the diurnal cycle is quite regular for each vegetation type, this calculation can be simplified considerably. The effect of root zone soil moisture on stomatal resistance is substantial, but the computation in SiB is complicated and expensive. We have developed approximations which simulate the effects of reduced soil moisture more simply, keeping the essence of the biophysical concepts used in SiB. The surface stress and the fluxes of heat and moisture between the top of the vegetation canopy and an atmospheric reference level have been parameterized in an off-line version of SiB based upon the studies by Businger et al. and Paulson. We have developed a linear relationship between Richardson number and aerodynamic resistance. Finally, the second vegetation layer of the original model does not appear explicitly after simplification. Compared to the model of Sellers et al., we have reduced the number of input parameters from 44 to 21. A comparison of results using the reduced-parameter biosphere with those from the original formulation in a GCM and a zero-dimensional model shows the simplified version to reproduce the original results quite closely. After simplification, the computational requirement of SiB was reduced by about 55%.
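The flavor of the simplification, a neutral log-profile aerodynamic resistance modulated by a linear function of the bulk Richardson number, can be sketched as below. The linear coefficient, heights, and roughness length are hypothetical illustrations, not the fitted values of the simplified SiB.

```python
# Illustrative aerodynamic-resistance sketch: neutral log-law resistance
# times a linear Richardson-number stability factor (coefficients assumed).
import math

K = 0.4   # von Karman constant

def aero_resistance(u, z=10.0, z0=0.1, ri=0.0, b=5.0):
    """Aerodynamic resistance r_a (s/m) at wind speed u (m/s), reference
    height z, roughness length z0, bulk Richardson number ri."""
    r_neutral = math.log(z / z0) ** 2 / (K ** 2 * u)
    # linear stability adjustment: stable (ri > 0) increases resistance,
    # unstable (ri < 0) decreases it; clipped to keep r_a positive
    return r_neutral * max(0.1, 1.0 + b * ri)

print(round(aero_resistance(3.0), 1))           # neutral conditions
print(round(aero_resistance(3.0, ri=0.05), 1))  # weakly stable: larger r_a
```

Replacing the iterative Monin-Obukhov similarity calculation with a single linear factor of this kind is what buys the computational savings the abstract reports, at the cost of some accuracy in strongly non-neutral conditions.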
Are Plant Species Able to Keep Pace with the Rapidly Changing Climate?
Cunze, Sarah; Heydel, Felix; Tackenberg, Oliver
2013-01-01
Future climate change is predicted to advance faster than the postglacial warming. Migration may therefore become a key driver for future development of biodiversity and ecosystem functioning. For 140 European plant species we computed past range shifts since the last glacial maximum and future range shifts for a variety of Intergovernmental Panel on Climate Change (IPCC) scenarios and global circulation models (GCMs). Range shift rates were estimated by means of species distribution modelling (SDM). With process-based seed dispersal models we estimated species-specific migration rates for 27 dispersal modes addressing dispersal by wind (anemochory) for different wind conditions, as well as dispersal by mammals (dispersal in animals' coats – epizoochory – and dispersal by animals after feeding and digestion – endozoochory) considering different animal species. Our process-based modelled migration rates generally exceeded the postglacial range shift rates, indicating that the process-based models we used are capable of predicting migration rates that are in accordance with realized past migration. For most of the considered species, the modelled migration rates were considerably lower than the expected future climate-change-induced range shift rates. This implies that most plant species will not be entirely able to follow future climate-change-induced range shifts due to dispersal limitation. Animals with large day- and home-ranges are highly important for achieving high migration rates for many plant species, whereas anemochory is relevant for only a few species. PMID:23894290
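The comparison underlying the abstract's conclusion reduces to one inequality per species: a species keeps pace only if its modelled migration rate matches or exceeds the climate-induced range shift rate. The species names and rates below are invented for illustration.

```python
# Per-species "keeping pace" check: migration rate vs. required shift rate.
# All rates are illustrative values in metres per year, not study results.

species = {
    "wind-dispersed herb":  {"migration": 450.0,  "required_shift": 1200.0},
    "epizoochorous grass":  {"migration": 2500.0, "required_shift": 1100.0},
    "endozoochorous shrub": {"migration": 900.0,  "required_shift": 1500.0},
}

laggards = [name for name, rates in species.items()
            if rates["migration"] < rates["required_shift"]]
print(sorted(laggards))   # species unable to track their shifting range
```

Run over 140 species and 27 dispersal modes, the same test yields the abstract's headline result: under most scenarios the laggard list contains most of the flora.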
USDA-ARS?s Scientific Manuscript database
Accurate estimates of terrestrial carbon sequestration are essential for evaluating changes in the carbon cycle due to global climate change. In a recent assessment of 26 carbon assimilation models at 39 FLUXNET tower sites across the United States and Canada, all models failed to adequately compute...
Studies of Trace Gas Chemical Cycles Using Inverse Methods and Global Chemical Transport Models
NASA Technical Reports Server (NTRS)
Prinn, Ronald G.
2003-01-01
We report progress in the first year, and summarize proposed work for the second year of the three-year dynamical-chemical modeling project devoted to: (a) development, testing, and refining of inverse methods for determining regional and global transient source and sink strengths for long-lived gases important in ozone depletion and climate forcing, (b) utilization of inverse methods to determine these source/sink strengths using either MATCH (Model for Atmospheric Transport and Chemistry), which is based on analyzed observed wind fields, or back-trajectories computed from these wind fields, (c) determination of global (and perhaps regional) average hydroxyl radical concentrations using inverse methods with multiple titrating gases, and (d) computation of the lifetimes and spatially resolved destruction rates of trace gases using 3D models. Important goals include determination of regional source strengths of methane, nitrous oxide, methyl bromide, and other climatically and chemically important biogenic/anthropogenic trace gases and also of halocarbons restricted by the Montreal protocol and its follow-on agreements and hydrohalocarbons now used as alternatives to the restricted halocarbons.
Projecting Wind Energy Potential Under Climate Change with Ensemble of Climate Model Simulations
NASA Astrophysics Data System (ADS)
Jain, A.; Shashikanth, K.; Ghosh, S.; Mukherjee, P. P.
2013-12-01
Recent years have witnessed an increasing global concern over energy sustainability and security, triggered by a number of issues, such as (though not limited to): fossil fuel depletion, energy resource geopolitics, the economic efficiency versus population growth debate, environmental concerns and climate change. Wind energy is a renewable and sustainable form of energy in which wind turbines convert the kinetic energy of wind into electrical energy. Global warming and differential surface heating may significantly impact the wind velocity and hence the wind energy potential. Sustainable design of wind mills requires understanding the impacts of climate change on wind energy potential, which we evaluate here with multiple General Circulation Models (GCMs). GCMs simulate the climate variables globally considering the greenhouse emission scenarios provided as Representative Concentration Pathways (RCPs). Here we use new-generation climate model outputs obtained from the Coupled Model Intercomparison Project 5 (CMIP5). We first compute the wind energy potential with reanalysis data (NCEP/NCAR) at a spatial resolution of 2.5°, where the gridded data are fitted to a Weibull distribution and, with the Weibull parameters, the wind energy densities are computed at the different grid points. The same methodology is then applied to the CMIP5 outputs (resultant of U-wind and V-wind) of MRI, CMCC, BCC, CanESM, and INMCM4 for historical runs. This is performed separately for four seasons globally: MAM, JJA, SON and DJF. We observe that the multi-model average of wind energy density for the historic period has significant bias with respect to that of the reanalysis product. Here we develop a quantile-based superensemble approach in which GCM quantiles corresponding to selected CDF values are regressed against the reanalysis data. This regression approach addresses both the bias in the GCMs and their combination. 
With the superensemble, we observe that the historical wind energy density agrees well with the reanalysis/observed output. We apply the same approach to the future under the RCP scenarios, and observe spatially and temporally varying global change of wind energy density. The underlying assumption is that the regression relationship will also hold for the future. The results highlight the need to change the design standards of wind mills at different locations in light of climate change, and at the same time the need for height modifications to existing mills so they can produce the same energy in the future.
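Once a Weibull distribution (shape k, scale c) has been fitted to the wind speeds at a grid cell, the mean wind power density follows in closed form, which is the per-grid computation the abstract describes. The air density and the k, c values below are illustrative.

```python
# Mean wind power density for Weibull-distributed wind speeds:
#   E = 0.5 * rho * c^3 * Gamma(1 + 3/k)   (W/m^2)
# which is the expectation of 0.5 * rho * v^3 under Weibull(k, c).
import math

def wind_power_density(k, c, rho=1.225):
    """Mean wind power density (W/m^2); k dimensionless, c in m/s."""
    return 0.5 * rho * c ** 3 * math.gamma(1.0 + 3.0 / k)

print(round(wind_power_density(2.0, 7.0), 1))   # Rayleigh-like site, c = 7 m/s
```

The cubic dependence on the scale parameter is why even modest climate-driven shifts in wind speed translate into large changes in energy density, and hence into the design-standard revisions the abstract calls for.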
Challenges of Representing Sub-Grid Physics in an Adaptive Mesh Refinement Atmospheric Model
NASA Astrophysics Data System (ADS)
O'Brien, T. A.; Johansen, H.; Johnson, J. N.; Rosa, D.; Benedict, J. J.; Keen, N. D.; Collins, W.; Goodfriend, E.
2015-12-01
Some of the greatest potential impacts from future climate change are tied to extreme atmospheric phenomena that are inherently multiscale, including tropical cyclones and atmospheric rivers. Extremes are challenging to simulate in conventional climate models due to existing models' coarse resolutions relative to the native length-scales of these phenomena. Studying the weather systems of interest requires an atmospheric model with sufficient local resolution, and sufficient performance for long-duration climate-change simulations. To this end, we have developed a new global climate code with adaptive spatial and temporal resolution. The dynamics are formulated using a block-structured conservative finite volume approach suitable for moist non-hydrostatic atmospheric dynamics. By using both space- and time-adaptive mesh refinement, the solver focuses computational resources only where greater accuracy is needed to resolve critical phenomena. We explore different methods for parameterizing sub-grid physics, such as microphysics, macrophysics, turbulence, and radiative transfer. In particular, we contrast the simplified physics representation of Reed and Jablonowski (2012) with the more complex physics representation used in the System for Atmospheric Modeling of Khairoutdinov and Randall (2003). We also explore the use of a novel macrophysics parameterization that is designed to be explicitly scale-aware.
LLNL Scientists Use NERSC to Advance Global Aerosol Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bergmann, D J; Chuang, C; Rotman, D
2004-10-13
While ''greenhouse gases'' have been the focus of climate change research for a number of years, DOE's ''Aerosol Initiative'' is now examining how aerosols (small particles of approximately micron size) affect the climate on both a global and regional scale. Scientists in the Atmospheric Science Division at Lawrence Livermore National Laboratory (LLNL) are using NERSC's IBM supercomputer and LLNL's IMPACT (atmospheric chemistry) model to perform simulations showing the historic effects of sulfur aerosols at a finer spatial resolution than ever done before. Simulations were carried out for five decades, from the 1950s through the 1990s. The results clearly show the effects of the changing global pattern of sulfur emissions. Whereas in 1950 the United States emitted 41 percent of the world's sulfur aerosols, this figure had dropped to 15 percent by 1990, due to conservation and anti-pollution policies. By contrast, the fraction of total sulfur emissions of European origin has only dropped by a factor of 2 and the Asian emission fraction jumped six fold during the same time, from 7 percent in 1950 to 44 percent in 1990. Under a special allocation of computing time provided by the Office of Science INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program, Dan Bergmann, working with a team of LLNL scientists including Cathy Chuang, Philip Cameron-Smith, and Bala Govindasamy, was able to carry out a large number of calculations during the past month, making the aerosol project one of the largest users of NERSC resources. The applications ran on 128 and 256 processors. The objective was to assess the effects of anthropogenic (man-made) sulfate aerosols. The IMPACT model calculates the rate at which SO2 (a gas emitted by industrial activity) is oxidized and forms particles known as sulfate aerosols. These particles have a short lifespan in the atmosphere, often washing out in about a week. 
This means that their effects on climate tend to be more regional, occurring near the area where the SO2 is emitted. To accurately study these regional effects, Bergmann needed to run the simulations at a finer horizontal resolution, as the coarser resolution (typically 300km by 300km) of other climate models is insufficient for studying changes on a regional scale. Livermore's use of CAM3, the Community Atmospheric Model which is a high-resolution climate model developed at NCAR (with collaboration from DOE), allows a 100km by 100km grid to be applied. NERSC's terascale computing capability provided the needed computational horsepower to run the application at the finer level.
Modeling transport of nutrients & sediment loads into Lake Tahoe under climate change
Riverson, John; Coats, Robert; Costa-Cabral, Mariza; Dettinger, Mike; Reuter, John; Sahoo, Goloka; Schladow, Geoffrey
2013-01-01
The outputs from two General Circulation Models (GCMs) with two emissions scenarios were downscaled and bias-corrected to develop regional climate change projections for the Tahoe Basin. For one model—the Geophysical Fluid Dynamics Laboratory or GFDL model—the daily model results were used to drive a distributed hydrologic model. The watershed model used an energy balance approach for computing evapotranspiration and snowpack dynamics so that the processes remain a function of the climate change projections. For this study, all other aspects of the model (i.e. land use distribution, routing configuration, and parameterization) were held constant to isolate impacts of climate change projections. The results indicate that (1) precipitation falling as rain rather than snow will increase, starting at the current mean snowline, and moving towards higher elevations over time; (2) annual accumulated snowpack will be reduced; (3) snowpack accumulation will start later; and (4) snowmelt will start earlier in the year. Certain changes were masked (or counter-balanced) when summarized as basin-wide averages; however, spatial evaluation added notable resolution. While rainfall runoff increased at higher elevations, a drop in total precipitation volume decreased runoff and fine sediment load from the lower elevation meadow areas and also decreased baseflow and nitrogen loads basin-wide. This finding also highlights the important role that the meadow areas could play as high-flow buffers under climatic change. Because the watershed model accounts for elevation change and variable meteorological patterns, it provided a robust platform for evaluating the impacts of projected climate change on hydrology and water quality.
NASA Astrophysics Data System (ADS)
von Trentini, F.; Willkofer, F.; Wood, R. R.; Schmid, F. J.; Ludwig, R.
2017-12-01
The ClimEx project (Climate change and hydrological extreme events - risks and perspectives for water management in Bavaria and Québec) focuses on the effects of climate change on hydro-meteorological extreme events and their implications for water management in Bavaria and Québec. To this end, a hydro-meteorological model chain is applied. It employs the high performance computing capacity of the Leibniz Supercomputing Centre facility SuperMUC to dynamically downscale 50 members of the Global Circulation Model CanESM2 over European and Eastern North American domains using the Canadian Regional Climate Model (RCM) CRCM5. Over Europe, the unique single-model ensemble is conjointly analyzed with the latest information provided through the CORDEX initiative, to better assess the influence of natural climate variability and climatic change on the dynamics of extreme events. Furthermore, these 50 members of a single RCM will enhance extreme value statistics (extreme return periods) by exploiting the available 1500 model years for the reference period from 1981 to 2010. The RCM output is then used to drive the process-based, fully distributed, deterministic hydrological model WaSiM at high temporal (3h) and spatial (500m) resolution. WaSiM and the large ensemble are further used to derive a variety of hydro-meteorological patterns leading to severe flood events. A tool for virtual perfect prediction shall provide a combination of optimal lead time and management strategy to mitigate certain flood events following these patterns.
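The value of the 50-member ensemble for extreme-value statistics can be seen with a plain-counting estimate: pooling 50 members times 30 years yields 1500 annual maxima, so even a roughly 1-in-1000-year event is expected to appear in the sample. The Weibull plotting-position formula below is standard; the synthetic maxima are purely illustrative.

```python
# Empirical return periods from a pooled single-model large ensemble:
# 50 members x 30 years of (synthetic) annual maxima, ranked descending;
# the m-th largest value has empirical return period T = (n + 1) / m.
import random

random.seed(42)
n_members, n_years = 50, 30
annual_maxima = [random.gauss(100.0, 25.0)
                 for _ in range(n_members * n_years)]   # illustrative peaks

ranked = sorted(annual_maxima, reverse=True)
n = len(ranked)
largest_t = (n + 1) / 1.0        # return period of the single largest event
print(n, largest_t)
```

A single 30-year run supports empirical return periods only up to about 31 years; the pooled 1500 years push that to about 1501 years without any distributional extrapolation, which is precisely the advantage the abstract claims.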
Utilization of Short-Simulations for Tuning High-Resolution Climate Model
NASA Astrophysics Data System (ADS)
Lin, W.; Xie, S.; Ma, P. L.; Rasch, P. J.; Qian, Y.; Wan, H.; Ma, H. Y.; Klein, S. A.
2016-12-01
Many physical parameterizations in atmospheric models are sensitive to resolution. Tuning the models that involve a multitude of parameters at high resolution is computationally expensive, particularly when relying primarily on multi-year simulations. This work describes a complementary set of strategies for tuning high-resolution atmospheric models, using ensembles of short simulations to reduce the computational cost and elapsed time. Specifically, we utilize the hindcast approach developed through the DOE Cloud Associated Parameterization Testbed (CAPT) project for high-resolution model tuning, which is guided by a combination of short (< 10 days ) and longer ( 1 year) Perturbed Parameters Ensemble (PPE) simulations at low resolution to identify model feature sensitivity to parameter changes. The CAPT tests have been found to be effective in numerous previous studies in identifying model biases due to parameterized fast physics, and we demonstrate that it is also useful for tuning. After the most egregious errors are addressed through an initial "rough" tuning phase, longer simulations are performed to "hone in" on model features that evolve over longer timescales. We explore these strategies to tune the DOE ACME (Accelerated Climate Modeling for Energy) model. For the ACME model at 0.25° resolution, it is confirmed that, given the same parameters, major biases in global mean statistics and many spatial features are consistent between Atmospheric Model Intercomparison Project (AMIP)-type simulations and CAPT-type hindcasts, with just a small number of short-term simulations for the latter over the corresponding season. The use of CAPT hindcasts to find parameter choice for the reduction of large model biases dramatically improves the turnaround time for the tuning at high resolution. Improvement seen in CAPT hindcasts generally translates to improved AMIP-type simulations. 
An iterative CAPT-AMIP tuning approach is therefore adopted during each major tuning cycle, with the former used to survey the likely responses and narrow the parameter space, and the latter used to verify the results in a climate context, along with more detailed assessment once an educated set of parameter choices is selected. Limitations on using short-term simulations for tuning climate models are also discussed.
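The survey-and-narrow step of the tuning cycle can be sketched as a simple loop: perturb parameters, score each short hindcast against observations, and keep only the best candidates for the longer AMIP-type verification runs. The sketch below is purely illustrative; the parameter names and the toy scoring function are invented, not taken from the ACME model.

```python
import random

def run_short_hindcast(params):
    # Stand-in for a <10-day CAPT-style hindcast scored against observations;
    # here a toy bias measure around hypothetical "target" parameter values.
    return abs(params["zmconv_tau"] - 3600.0) / 3600.0 + abs(params["clubb_c1"] - 1.0)

def tune(baseline, n_members=20, seed=0):
    # Perturbed Parameter Ensemble: each member scales the baseline parameters.
    rng = random.Random(seed)
    candidates = []
    for _ in range(n_members):
        trial = {
            "zmconv_tau": baseline["zmconv_tau"] * rng.uniform(0.5, 2.0),
            "clubb_c1": baseline["clubb_c1"] * rng.uniform(0.5, 2.0),
        }
        candidates.append((run_short_hindcast(trial), trial))
    candidates.sort(key=lambda c: c[0])
    # Narrowed parameter space: the best quarter goes on to AMIP-type runs.
    return candidates[: n_members // 4]

best = tune({"zmconv_tau": 7200.0, "clubb_c1": 2.0})
```

Because each ensemble member is only a short hindcast, the whole survey costs a small fraction of one multi-year simulation, which is the source of the turnaround-time gain the abstract describes.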
NASA Astrophysics Data System (ADS)
Gallice, A.
2015-12-01
Stream temperature controls important aspects of the riverine habitat, such as the rate of spawning or death of many fish species, or the concentration of numerous dissolved substances. In the current context of accelerating climate change, the future evolution of stream temperature is regarded as uncertain, particularly in the Alps. This uncertainty fostered the development of many prediction models, which are usually classified into two categories: mechanistic models and statistical models. Based on the numerical resolution of physical conservation laws, mechanistic models are generally considered to provide more reliable long-term estimates than statistical models. However, despite their physical basis, these models are observed to differ quite significantly in some aspects of their implementation, notably (1) the routing of water in the river channel and (2) the estimation of the temperature of groundwater discharging into the stream. For each one of these two aspects, we considered several of the standard modeling approaches reported in the literature and implemented them in a new modular framework. The latter is based on the spatially-distributed snow model Alpine3D, which is essentially used in the framework to compute the amount of water infiltrating into the upper soil layer. Starting from there, different methods can be selected for the computation of the water and energy fluxes in the hillslopes and in the river network. We relied on this framework to compare the various methodologies for river channel routing and groundwater temperature modeling. We notably assessed the impact of each of these approaches on the long-term stream temperature predictions of the model under a typical climate change scenario. The case study was conducted over a high Alpine catchment in Switzerland, whose hydrological and thermal regimes are expected to be markedly affected by climate change. 
The results show that the various modeling approaches lead to significant differences in the model predictions, and that these differences may be larger than the uncertainties in future air temperature. It is also shown that the temperature of groundwater discharging into the stream has a marked impact on the modeled stream temperature at the catchment outlet.
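The modular idea described above — swapping the channel-routing and groundwater-temperature methods independently and comparing the resulting outlet temperature — might be sketched as follows. All function names, coefficients, and the mixing rule are invented for illustration; they are not the Alpine3D framework's actual components.

```python
def route_instantaneous(inflows):
    # No travel-time delay: all hillslope inflow reaches the outlet at once.
    return sum(inflows)

def route_lagged(inflows):
    # Crude one-step storage lag: half the inflow is held back.
    return 0.5 * sum(inflows)

def groundwater_temp_constant(air_temp):
    # Fixed deep-groundwater temperature, deg C (illustrative value).
    return 6.0

def groundwater_temp_air_coupled(air_temp):
    # Groundwater tracks air temperature with a fixed offset (illustrative).
    return air_temp - 4.0

ROUTING = {"instant": route_instantaneous, "lagged": route_lagged}
GW_TEMP = {"constant": groundwater_temp_constant, "air": groundwater_temp_air_coupled}

def outlet_temperature(inflows, air_temp, routing="instant", gw_model="constant"):
    discharge = ROUTING[routing](inflows)
    gw_t = GW_TEMP[gw_model](air_temp)
    # Toy mixing: the outlet relaxes from the groundwater temperature toward
    # the air temperature, more strongly at high discharge.
    weight = min(1.0, discharge / 10.0)
    return gw_t + weight * (air_temp - gw_t)

t1 = outlet_temperature([2.0, 3.0], 15.0, routing="instant", gw_model="constant")
t2 = outlet_temperature([2.0, 3.0], 15.0, routing="lagged", gw_model="air")
```

Running the same forcing through different module combinations, as above, is exactly the kind of comparison that exposes how much the choice of groundwater-temperature scheme moves the predicted outlet temperature.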
The Program for climate Model diagnosis and Intercomparison: 20-th anniversary Symposium
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potter, Gerald L; Bader, David C; Riches, Michael
Twenty years ago, W. Lawrence (Larry) Gates approached the U.S. Department of Energy (DOE) Office of Energy Research (now the Office of Science) with a plan to coordinate the comparison and documentation of climate model differences. This effort would help improve our understanding of climate change through a systematic approach to model intercomparison. Early attempts at comparing results showed a surprisingly large range in control climate from such parameters as cloud cover, precipitation, and even atmospheric temperature. The DOE agreed to fund the effort at the Lawrence Livermore National Laboratory (LLNL), in part because of the existing computing environment and because of a preexisting atmospheric science group that contained a wide variety of expertise. The project was named the Program for Climate Model Diagnosis and Intercomparison (PCMDI), and it has changed the international landscape of climate modeling over the past 20 years. In spring 2009 the DOE hosted a 1-day symposium to celebrate the twentieth anniversary of PCMDI and to honor its founder, Larry Gates. Through their personal experiences, the morning presenters painted an image of climate science in the 1970s and 1980s that generated early support from the international community for model intercomparison, thereby bringing PCMDI into existence. Four talks covered Gates's early contributions to climate research at the University of California, Los Angeles (UCLA), the RAND Corporation, and Oregon State University through the founding of PCMDI to coordinate the Atmospheric Model Intercomparison Project (AMIP). The speakers were, in order of presentation, Warren Washington [National Center for Atmospheric Research (NCAR)], Kelly Redmond (Western Regional Climate Center), George Boer (Canadian Centre for Climate Modelling and Analysis), and Lennart Bengtsson [University of Reading, former director of the European Centre for Medium-Range Weather Forecasts (ECMWF)]. 
The afternoon session emphasized the scientific ideas that are the basis of PCMDI's success, summarizing their evolution and impact. Four speakers followed the various PCMDI-supported climate model intercomparison projects, beginning with early work on cloud representations in models, presented by Robert D. Cess (Distinguished Professor Emeritus, Stony Brook University), and then the latest Cloud Feedback Model Intercomparison Projects (CFMIPs) led by Sandrine Bony (Laboratoire de Météorologie Dynamique). Benjamin Santer (LLNL) presented a review of the climate change detection and attribution (D & A) work pioneered at PCMDI, and Gerald A. Meehl (NCAR) ended the day with a look toward the future of climate change research.
Assembling Large, Multi-Sensor Climate Datasets Using the SciFlo Grid Workflow System
NASA Astrophysics Data System (ADS)
Wilson, B.; Manipon, G.; Xing, Z.; Fetzer, E.
2009-04-01
NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the "A-Train" platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over periods of years to decades. However, moving from predominantly single-instrument studies to a multi-sensor, measurement-based model for long-duration analysis of important climate variables presents serious challenges for large-scale data mining and data fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another instrument (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the "cloud scenes" from CloudSat, and repeat the entire analysis over years of AIRS data. To perform such an analysis, one must discover & access multiple datasets from remote sites, find the space/time "matchups" between instrument swaths and model grids, understand the quality flags and uncertainties for retrieved physical variables, assemble merged datasets, and compute fused products for further scientific and statistical analysis. To meet these large-scale challenges, we are utilizing a Grid computing and dataflow framework, named SciFlo, in which we are deploying a set of versatile and reusable operators for data query, access, subsetting, co-registration, mining, fusion, and advanced statistical analysis. SciFlo is a semantically-enabled ("smart") Grid Workflow system that ties together a peer-to-peer network of computers into an efficient engine for distributed computation. The SciFlo workflow engine enables scientists to do multi-instrument Earth Science by assembling remotely-invokable Web Services (SOAP or http GET URLs), native executables, command-line scripts, and Python codes into a distributed computing flow. 
A scientist visually authors the graph of operations in the VizFlow GUI, or uses a text editor to modify the simple XML workflow documents. The SciFlo client & server engines optimize the execution of such distributed workflows and allow the user to transparently find and use datasets and operators without worrying about the actual location of the Grid resources. The engine transparently moves data to the operators, and moves operators to the data (on the dozen trusted SciFlo nodes). SciFlo also deploys a variety of Data Grid services to: query datasets in space and time, locate & retrieve on-line data granules, provide on-the-fly variable and spatial subsetting, perform pairwise instrument matchups for A-Train datasets, and compute fused products. These services are combined into efficient workflows to assemble the desired large-scale, merged climate datasets. SciFlo is currently being applied in several large climate studies: comparisons of aerosol optical depth between MODIS, MISR, AERONET ground network, and U. Michigan's IMPACT aerosol transport model; characterization of long-term biases in microwave and infrared instruments (AIRS, MLS) by comparisons to GPS temperature retrievals accurate to 0.1 K; and construction of a decade-long, multi-sensor water vapor climatology stratified by classified cloud scene by bringing together datasets from AIRS/AMSU, AMSR-E, MLS, MODIS, and CloudSat (NASA MEASUREs grant, Fetzer PI). The presentation will discuss the SciFlo technologies, their application in these distributed workflows, and the many challenges encountered in assembling and analyzing these massive datasets.
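The pairwise space/time "matchup" operation at the heart of such multi-sensor workflows can be illustrated with a minimal sketch (this is not the SciFlo API): pair observations from two instruments whenever they fall within given distance and time tolerances.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points on Earth, in kilometres.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def matchup(obs_a, obs_b, max_km=50.0, max_minutes=30.0):
    # obs_* are lists of (lat, lon, time_in_minutes, value) tuples.
    # Brute-force pairing; real systems index swaths spatially first.
    pairs = []
    for a in obs_a:
        for b in obs_b:
            if (abs(a[2] - b[2]) <= max_minutes
                    and haversine_km(a[0], a[1], b[0], b[1]) <= max_km):
                pairs.append((a, b))
    return pairs

airs = [(10.0, 20.0, 0.0, 285.0)]                       # one hypothetical AIRS footprint
modis = [(10.1, 20.1, 10.0, 284.0),                      # nearby in space and time
         (40.0, 90.0, 5.0, 260.0)]                       # far away, rejected
pairs = matchup(airs, modis)
```

The quadratic loop is the conceptual core only; the abstract's point is precisely that doing this at scale, across years of swath data on distributed nodes, requires the Grid services that SciFlo provides.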
NASA Technical Reports Server (NTRS)
Russell, Philip A.; Bergstrom, Robert A.; Schmid, Beat; Livingston, John M.
2000-01-01
Aerosol effects on atmospheric radiative fluxes provide a forcing function that can change the climate in potentially significant ways. This aerosol radiative forcing is a major source of uncertainty in understanding the climate change of the past century and predicting future climate. To help reduce this uncertainty, the 1996 Tropospheric Aerosol Radiative Forcing Observational Experiment (TARFOX) and the 1997 Aerosol Characterization Experiment (ACE-2) measured the properties and radiative effects of aerosols over the Atlantic Ocean. Both experiments used remote and in situ measurements from aircraft and the surface, coordinated with overpasses by a variety of satellite radiometers. TARFOX focused on the urban-industrial haze plume flowing from the United States over the western Atlantic, whereas ACE-2 studied aerosols over the eastern Atlantic from both Europe and Africa. These aerosols often have a marked impact on satellite-measured radiances. However, accurate derivation of flux changes, or radiative forcing, from the satellite-measured radiances or retrieved aerosol optical depths (AODs) remains a difficult challenge. Here we summarize key initial results from TARFOX and ACE-2, with a focus on closure analyses that yield aerosol microphysical models for use in improved assessments of flux changes. We show how one such model gives computed radiative flux sensitivities (dF/dAOD) that agree with values measured in TARFOX and preliminary values computed for the polluted marine boundary layer in ACE-2. A companion paper uses the model to compute aerosol-induced flux changes over the North Atlantic from AVHRR-derived AOD fields.
NASA Technical Reports Server (NTRS)
Miller, Ronald L.; Garcia-Pando, Carlos Perez; Perlwitz, Jan; Ginoux, Paul
2015-01-01
Past decades have seen an accelerating increase in computing efficiency, while climate models are representing a rapidly widening set of physical processes. Yet simulations of some fundamental aspects of climate like precipitation or aerosol forcing remain highly uncertain and resistant to progress. Dust aerosol modeling of soil particles lofted by wind erosion has seen a similar conflict between increasing model sophistication and remaining uncertainty. Dust aerosols perturb the energy and water cycles by scattering radiation and acting as ice nuclei, while mediating atmospheric chemistry and marine photosynthesis (and thus the carbon cycle). These effects take place across scales from the dimensions of an ice crystal to the planetary-scale circulation that disperses dust far downwind of its parent soil. Representing this range leads to several modeling challenges. Should we limit complexity in our model, which consumes computer resources and inhibits interpretation? How do we decide if a process involving dust is worthy of inclusion within our model? Can we identify a minimal representation of a complex process that is efficient yet retains the physics relevant to climate? Answering these questions about the appropriate degree of representation is guided by model evaluation, which presents several more challenges. How do we proceed if the available observations do not directly constrain our process of interest? (This could result from competing processes that influence the observed variable and obscure the signature of our process of interest.) Examples will be presented from dust modeling, with lessons that might be more broadly applicable. The end result will be either clinical depression or the reassuring promise of continued gainful employment as the community confronts these challenges.
Recommendations for diagnosing effective radiative forcing from climate models for CMIP6
NASA Astrophysics Data System (ADS)
Forster, Piers M.; Richardson, Thomas; Maycock, Amanda C.; Smith, Christopher J.; Samset, Bjorn H.; Myhre, Gunnar; Andrews, Timothy; Pincus, Robert; Schulz, Michael
2016-10-01
The usefulness of previous Coupled Model Intercomparison Project (CMIP) exercises has been hampered by a lack of radiative forcing information. This has made it difficult to understand reasons for differences between model responses. Effective radiative forcing (ERF) is easier to diagnose than traditional radiative forcing in global climate models (GCMs) and is more representative of the eventual temperature response. Here we examine the different methods of computing ERF in two GCMs. We find that ERF computed from a fixed sea surface temperature (SST) method (ERF_fSST) has much more certainty than regression-based methods. Thirty-year integrations are sufficient to reduce the 5-95% confidence interval in global ERF_fSST to 0.1 W m-2. For 2xCO2 ERF, 30-year integrations are needed to ensure that the signal is larger than the local confidence interval over more than 90% of the globe. Within the ERF_fSST method there are various options for prescribing SSTs and sea ice. We explore these and find that ERF is only weakly dependent on the methodological choices. Prescribing the monthly averaged seasonally varying model's preindustrial climatology is recommended for its smaller random error and easier implementation. As part of CMIP6, the Radiative Forcing Model Intercomparison Project (RFMIP) asks models to conduct 30-year ERF_fSST experiments using the model's own preindustrial climatology of SST and sea ice. The Aerosol and Chemistry Model Intercomparison Project (AerChemMIP) will also mainly use this approach. We propose this as a standard method for diagnosing ERF and recommend that it be used across the climate modeling community to aid future comparisons.
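The confidence-interval argument above — longer fixed-SST integrations shrink the sampling uncertainty of the global-mean ERF — can be illustrated with a minimal sketch. The numbers are toy values, not model output, and the simple normal-theory interval assumes independent annual means.

```python
import statistics
import random

def erf_confidence_interval(annual_means):
    # 5-95% interval on the mean ERF under a normal assumption,
    # treating each annual mean as an independent sample.
    n = len(annual_means)
    mean = statistics.fmean(annual_means)
    sem = statistics.stdev(annual_means) / n ** 0.5
    half_width = 1.645 * sem  # z-value for the 5th/95th percentiles
    return mean - half_width, mean + half_width

rng = random.Random(1)
# Toy 30-year ERF_fSST time series (W m-2): a forcing of ~3.7 plus
# interannual noise from unforced variability.
series = [3.7 + rng.gauss(0.0, 0.3) for _ in range(30)]
lo, hi = erf_confidence_interval(series)
```

Because the interval half-width scales as 1/sqrt(n), quadrupling the integration length only halves the uncertainty, which is why the paper settles on a 30-year standard rather than ever-longer runs.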
Neoproterozoic 'snowball Earth' simulations with a coupled climate/ice-sheet model.
Hyde, W T; Crowley, T J; Baum, S K; Peltier, W R
2000-05-25
Ice sheets may have reached the Equator in the late Proterozoic era (600-800 Myr ago), according to geological and palaeomagnetic studies, possibly resulting in a 'snowball Earth'. But this period was a critical time in the evolution of multicellular animals, posing the question of how early life survived under such environmental stress. Here we present computer simulations of this unusual climate stage with a coupled climate/ice-sheet model. To simulate a snowball Earth, we use only a reduction in the solar constant compared to present-day conditions and we keep atmospheric CO2 concentrations near present levels. We find rapid transitions into and out of full glaciation that are consistent with the geological evidence. When we combine these results with a general circulation model, some of the simulations result in an equatorial belt of open water that may have provided a refugium for multicellular animals.
NASA Astrophysics Data System (ADS)
von Storch, Jin-Song
2014-05-01
The German consortium STORM was built to explore high-resolution climate simulations using the high-performance computer hosted at the German Climate Computing Center (DKRZ). One of the primary goals is to quantify the effect of unresolved (and parametrized) processes on climate sensitivity. We use ECHAM6/MPIOM, the coupled atmosphere-ocean model developed at the Max-Planck Institute for Meteorology. The resolution is T255L95 for the atmosphere and 1/10 degree with 80 vertical levels for the ocean. We discuss results of stand-alone runs, i.e. the ocean-only simulation driven by the NCEP/NCAR reanalysis and the atmosphere-only AMIP-type simulation. Increasing resolution leads to a redistribution of biases, even though some improvements, both in the atmosphere and in the ocean, can clearly be attributed to the increase in resolution. We also present new insights on ocean meso-scale eddies, in particular their effects on the ocean's energetics. Finally, we discuss the status and problems of the coupled high-resolution runs.
Multi-criteria evaluation of CMIP5 GCMs for climate change impact analysis
NASA Astrophysics Data System (ADS)
Ahmadalipour, Ali; Rana, Arun; Moradkhani, Hamid; Sharma, Ashish
2017-04-01
Climate change is expected to have severe impacts on the global hydrological cycle along with the food-water-energy nexus. Currently, there are many climate models used in predicting important climatic variables. Though there have been advances in the field, there are still many problems to be resolved related to reliability, uncertainty, and computing needs, among many others. In the present work, we have analyzed the performance of 20 different global climate models (GCMs) from the Climate Model Intercomparison Project Phase 5 (CMIP5) dataset over the Columbia River Basin (CRB) in the Pacific Northwest USA. We demonstrate a statistical multicriteria approach, using univariate and multivariate techniques, for selecting suitable GCMs to be used for climate change impact analysis in the region. Univariate methods include mean, standard deviation, coefficient of variation, relative change (variability), the Mann-Kendall test, and the Kolmogorov-Smirnov test (KS-test); the multivariate methods used were principal component analysis (PCA), singular value decomposition (SVD), canonical correlation analysis (CCA), and cluster analysis. The analysis is performed on raw GCM data, i.e., before bias correction, for the precipitation and temperature climatic variables for all 20 models to capture the reliability and nature of each particular model at regional scale. The analysis is based on spatially averaged datasets of GCMs and observations for the period of 1970 to 2000. A ranking is provided for each of the GCMs based on the performance evaluated against gridded observational data on various temporal scales (daily, monthly, and seasonal). The results provide insight into each of the methods, and into the various statistical properties they address, when employed in ranking GCMs. Further, evaluation was also performed for raw GCM simulations against different sets of gridded observational data in the area.
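The multi-criteria ranking idea can be sketched in a few lines: score each GCM against observations with several univariate metrics (here a mean bias, a variability difference, and a two-sample KS distance) and rank by the summed scores. The data and model names below are toy values for illustration, not the CMIP5 analysis itself, and a real application would normalise the metrics before summing.

```python
import statistics

def ks_distance(sample_a, sample_b):
    # Two-sample Kolmogorov-Smirnov statistic:
    # the maximum gap between the two empirical CDFs.
    grid = sorted(set(sample_a) | set(sample_b))
    def cdf(sample, x):
        return sum(v <= x for v in sample) / len(sample)
    return max(abs(cdf(sample_a, x) - cdf(sample_b, x)) for x in grid)

def rank_gcms(models, obs):
    # Lower combined score = closer to observations = better rank.
    scores = {}
    for name, series in models.items():
        bias = abs(statistics.fmean(series) - statistics.fmean(obs))
        variability = abs(statistics.stdev(series) - statistics.stdev(obs))
        scores[name] = bias + variability + ks_distance(series, obs)
    return sorted(scores, key=scores.get)

obs = [10.0, 11.0, 12.0, 13.0, 14.0]            # toy observed seasonal means
models = {
    "gcm_a": [10.2, 11.1, 12.0, 12.9, 14.1],    # close to observations
    "gcm_b": [15.0, 16.0, 17.0, 18.0, 19.0],    # strongly warm-biased
}
ranking = rank_gcms(models, obs)
```

The same pattern extends naturally to the paper's larger metric set: each test contributes one score per model, and the aggregation step determines the final ranking.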
Mean-state acceleration of cloud-resolving models and large eddy simulations
Jones, C. R.; Bretherton, C. S.; Pritchard, M. S.
2015-10-29
Large eddy simulations and cloud-resolving models (CRMs) are routinely used to simulate boundary layer and deep convective cloud processes, aid in the development of moist physical parameterization for global models, study cloud-climate feedbacks and cloud-aerosol interaction, and serve as the heart of superparameterized climate models. These models are computationally demanding, placing practical constraints on their use in these applications, especially for long, climate-relevant simulations. In many situations, the horizontal-mean atmospheric structure evolves slowly compared to the turnover time of the most energetic turbulent eddies. We develop a simple scheme to reduce this time scale separation to accelerate the evolution of the mean state. Using this approach we are able to accelerate the model evolution by a factor of 2–16 or more in idealized stratocumulus, shallow and deep cumulus convection without substantial loss of accuracy in simulating mean cloud statistics and their sensitivity to climate change perturbations. As a culminating test, we apply this technique to accelerate the embedded CRMs in the Superparameterized Community Atmosphere Model by a factor of 2, thereby showing that the method is robust and stable to realistic perturbations across spatial and temporal scales typical in a GCM.
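The core of mean-state acceleration can be sketched in a few lines: after each model step, amplify the tendency of the horizontal mean by a chosen factor while leaving the eddy (perturbation) field untouched. This toy version, with invented field values, is an illustration of the idea rather than the paper's implementation.

```python
def accelerate_mean_state(field_before, field_after, factor=4.0):
    # field_* are lists of grid-column values at one model level.
    # The horizontal mean evolves `factor` times faster than simulated;
    # the eddy perturbations about the mean keep their simulated values.
    n = len(field_after)
    mean_before = sum(field_before) / n
    mean_after = sum(field_after) / n
    accelerated_mean = mean_before + factor * (mean_after - mean_before)
    return [accelerated_mean + (v - mean_after) for v in field_after]

# Toy temperature field (K): the horizontal mean drifts by +0.5 in one step.
before = [300.0, 301.0, 299.0]
after = [300.5, 301.5, 299.5]
out = accelerate_mean_state(before, after, factor=4.0)
```

Because only the slowly evolving mean is amplified, the fast eddy dynamics stay on their natural time scale, which is why the method can shorten the integration needed for the mean state to equilibrate without distorting the turbulence statistics.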
Advances in Engineering Software for Lift Transportation Systems
NASA Astrophysics Data System (ADS)
Kazakoff, Alexander Borisoff
2012-03-01
In this paper an attempt is made at computer modelling of ropeway ski lift systems. The logic in these systems is based on a travel form between the two terminals, which operates with high-capacity cabins, chairs, gondolas or draw-bars. The computer codes AUTOCAD, MATLAB and Compaq Visual Fortran (version 6.6) are used in the computer modelling. The ropeway systems computer modelling is organized in two stages in this paper. The first stage is the organization of the ground relief profile and the design of the lift system as a whole, according to the terrain profile and the climatic and atmospheric conditions. The ground profile is prepared by the geodesists and is presented in an AUTOCAD view. The next step is the design of the lift itself, which is performed by programmes using the computer code MATLAB. The second stage of the computer modelling is performed after the optimization of the co-ordinates and the lift profile using the computer code MATLAB. The co-ordinates and the parameters are then inserted into a program written in Compaq Visual Fortran version 6.6, which calculates 171 lift parameters, organized in 42 tables. The objective of the work presented in this paper is the computer modelling of the design and parameter derivation of ropeway systems, together with their computational variation and optimization.
High Resolution Model Intercomparison Project (HighResMIP v1.0) for CMIP6
Haarsma, Reindert J.; Roberts, Malcolm J.; Vidale, Pier Luigi; ...
2016-11-22
Robust projections and predictions of climate variability and change, particularly at regional scales, rely on the driving processes being represented with fidelity in model simulations. The role of enhanced horizontal resolution in improved process representation in all components of the climate system is of growing interest, particularly as some recent simulations suggest both the possibility of significant changes in large-scale aspects of circulation as well as improvements in small-scale processes and extremes. However, such high-resolution global simulations at climate timescales, with resolutions of at least 50 km in the atmosphere and 0.25° in the ocean, have been performed at relatively few research centres and generally without overall coordination, primarily due to their computational cost. Assessing the robustness of the response of simulated climate to model resolution requires a large multi-model ensemble using a coordinated set of experiments. The Coupled Model Intercomparison Project 6 (CMIP6) is the ideal framework within which to conduct such a study, due to the strong link to models being developed for the CMIP DECK experiments and other model intercomparison projects (MIPs). Increases in high-performance computing (HPC) resources, as well as the revised experimental design for CMIP6, now enable a detailed investigation of the impact of increased resolution up to synoptic weather scales on the simulated mean climate and its variability. The High Resolution Model Intercomparison Project (HighResMIP) presented in this paper applies, for the first time, a multi-model approach to the systematic investigation of the impact of horizontal resolution. A coordinated set of experiments has been designed to assess both a standard and an enhanced horizontal-resolution simulation in the atmosphere and ocean. 
The set of HighResMIP experiments is divided into three tiers consisting of atmosphere-only and coupled runs and spanning the period 1950–2050, with the possibility of extending to 2100, together with some additional targeted experiments. This paper describes the experimental set-up of HighResMIP, the analysis plan, the connection with the other CMIP6-endorsed MIPs, as well as the DECK and CMIP6 historical simulations. HighResMIP thereby focuses on one of the CMIP6 broad questions, "what are the origins and consequences of systematic model biases?", but we also discuss how it addresses the World Climate Research Program (WCRP) grand challenges.