Ocean Modeling and Visualization on Massively Parallel Computer
NASA Technical Reports Server (NTRS)
Chao, Yi; Li, P. Peggy; Wang, Ping; Katz, Daniel S.; Cheng, Benny N.
1997-01-01
Climate modeling is one of the grand challenges of computational science, and ocean modeling plays an important role in both understanding the current climatic conditions and predicting future climate change.
Climate Ocean Modeling on Parallel Computers
NASA Technical Reports Server (NTRS)
Wang, P.; Cheng, B. N.; Chao, Y.
1998-01-01
Ocean modeling plays an important role in both understanding the current climatic conditions and predicting future climate change. However, modeling the ocean circulation at various spatial and temporal scales is a very challenging computational task.
Parallel computing method for simulating hydrological processes of large rivers under climate change
NASA Astrophysics Data System (ADS)
Wang, H.; Chen, Y.
2016-12-01
Climate change is one of the most widely recognized global environmental problems. It has altered the spatial and temporal distribution of watershed hydrological processes, especially in the world's large rivers. Watershed hydrological process simulation based on physically based distributed hydrological models can produce better results than lumped models. However, such simulation involves a large amount of computation, especially for large rivers, and therefore demands computing resources that may not be steadily available to researchers, or only at high expense; this has seriously restricted research and application. Existing parallel methods mostly parallelize the computation in the space and time dimensions, processing the natural features of the distributed hydrological model (grid cells, subbasins) in order from upstream to downstream. This article proposes a high-performance computing method for hydrological process simulation with a high speedup ratio and high parallel efficiency. It combines the spatial and temporal runoff characteristics of the distributed hydrological model with distributed data storage, an in-memory database, distributed computing, and parallel computing based on computing power units. The method is highly adaptable and extensible: it makes full use of the available computing and storage resources under the condition of limited computing resources, and its computing efficiency improves linearly as computing resources are added. This method can satisfy the parallel computing requirements of hydrological process simulation in small, medium, and large rivers.
ParCAT: A Parallel Climate Analysis Toolkit
NASA Astrophysics Data System (ADS)
Haugen, B.; Smith, B.; Steed, C.; Ricciuto, D. M.; Thornton, P. E.; Shipman, G.
2012-12-01
Climate science has employed increasingly complex models and simulations to analyze the past and predict the future of our climate. The size and dimensionality of climate simulation data has been growing with the complexity of the models. This growth in data is creating a widening gap between the data being produced and the tools necessary to analyze large, high dimensional data sets. With single-run data sets increasing into tens, hundreds, and even thousands of gigabytes, parallel computing tools are becoming a necessity in order to analyze and compare climate simulation data. The Parallel Climate Analysis Toolkit (ParCAT) provides basic tools that efficiently use parallel computing techniques to narrow the gap between data set size and analysis tools. ParCAT was created as a collaborative effort between climate scientists and computer scientists in order to provide efficient parallel implementations of the computing tools that are of use to climate scientists. Some of the basic functionalities included in the toolkit are the ability to compute spatio-temporal means and variances, differences between two runs, and histograms of the values in a data set. ParCAT is designed to facilitate the "heavy lifting" that is required for large, multidimensional data sets. The toolkit does not focus on performing the final visualizations and presentation of results but rather on reducing large data sets to smaller, more manageable summaries. The output from ParCAT is provided in commonly used file formats (NetCDF, CSV, ASCII) to allow for simple integration with other tools. The toolkit is currently implemented as a command-line utility, but will likely also provide a C library for developers interested in tighter software integration. Elements of the toolkit are already being incorporated into projects such as UV-CDAT and CMDX. There is also an effort underway to implement portions of the CCSM Land Model Diagnostics package using ParCAT in conjunction with Python and gnuplot. ParCAT is implemented in C to provide efficient file I/O. The file I/O operations in the toolkit use the parallel-netcdf library; this enables the code to use the parallel I/O capabilities of modern HPC systems. Analysis that currently requires an estimated 12+ hours with the traditional CCSM Land Model Diagnostics Package can now be performed in as little as 30 minutes on a single desktop workstation, and in a few minutes for relatively small jobs completed on modern HPC systems such as ORNL's Jaguar.
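ParCAT itself is written in C against the parallel-netcdf library, but the underlying pattern (each rank reads a disjoint set of timesteps, and partial results are combined by a reduction) is easy to sketch. Below is a minimal Python analogue using mpi4py and netCDF4; the file name "history.nc" and variable name "TS" are illustrative assumptions, not ParCAT's interface.

```python
# Minimal sketch (not ParCAT itself): parallel temporal mean of one variable.
# Assumes a file "history.nc" with a variable "TS" shaped (time, lat, lon).
from mpi4py import MPI
import numpy as np
from netCDF4 import Dataset

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

ds = Dataset("history.nc", "r")          # each rank opens the file read-only
ts = ds.variables["TS"]
nt = ts.shape[0]

# Partition timesteps across ranks: rank r takes every size-th step.
local_sum = np.zeros(ts.shape[1:], dtype=np.float64)
local_cnt = 0
for t in range(rank, nt, size):
    local_sum += ts[t, :, :]
    local_cnt += 1

# Reduce partial sums to rank 0 and finish the mean there.
total_sum = comm.reduce(local_sum, op=MPI.SUM, root=0)
total_cnt = comm.reduce(local_cnt, op=MPI.SUM, root=0)
if rank == 0:
    time_mean = total_sum / total_cnt    # (lat, lon) climatological mean
    print(time_mean.shape)
```

Run with, e.g., `mpiexec -n 8 python mean.py`; the same shape of computation extends to variances, differences between two runs, and histograms.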
The Parallel System for Integrating Impact Models and Sectors (pSIMS)
NASA Technical Reports Server (NTRS)
Elliott, Joshua; Kelly, David; Chryssanthacopoulos, James; Glotter, Michael; Jhunjhnuwala, Kanika; Best, Neil; Wilde, Michael; Foster, Ian
2014-01-01
We present a framework for massively parallel climate impact simulations: the parallel System for Integrating Impact Models and Sectors (pSIMS). This framework comprises a) tools for ingesting and converting large amounts of data to a versatile datatype based on a common geospatial grid; b) tools for translating this datatype into custom formats for site-based models; c) a scalable parallel framework for performing large ensemble simulations, using any one of a number of different impacts models, on clusters, supercomputers, distributed grids, or clouds; d) tools and data standards for reformatting outputs to common datatypes for analysis and visualization; and e) methodologies for aggregating these datatypes to arbitrary spatial scales such as administrative and environmental demarcations. By automating many time-consuming and error-prone aspects of large-scale climate impacts studies, pSIMS accelerates computational research, encourages model intercomparison, and enhances reproducibility of simulation results. We present the pSIMS design and use example assessments to demonstrate its multi-model, multi-scale, and multi-sector versatility.
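Of the five pSIMS components, step (e), aggregation of gridded outputs to arbitrary spatial scales, is the most self-contained to illustrate. The sketch below is not pSIMS code: the region mask, the latitude-based area weights, and all names are assumptions chosen only to show the idea of an area-weighted regional mean.

```python
# Minimal sketch of step (e): aggregating a gridded output field to regions.
# Not pSIMS code; region_ids and the cos(latitude) area weights are assumptions.
import numpy as np

def aggregate_by_region(field, region_ids, lat):
    """Area-weighted mean of `field` (lat, lon) for each region id."""
    # Grid-cell area on a regular lat-lon grid is proportional to cos(latitude).
    weights = np.cos(np.deg2rad(lat))[:, None] * np.ones_like(field)
    out = {}
    for rid in np.unique(region_ids):
        m = region_ids == rid
        out[rid] = np.average(field[m], weights=weights[m])
    return out

# Toy usage: two regions splitting a 4x8 grid down the middle.
lat = np.linspace(-60, 60, 4)
field = np.random.rand(4, 8)
region_ids = np.zeros((4, 8), dtype=int)
region_ids[:, 4:] = 1
print(aggregate_by_region(field, region_ids, lat))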
Parallel computing of a climate model on the dawn 1000 by domain decomposition method
NASA Astrophysics Data System (ADS)
Bi, Xunqiang
1997-12-01
In this paper the parallel computing of a grid-point nine-level atmospheric general circulation model on the Dawn 1000 is introduced. The model was developed by the Institute of Atmospheric Physics (IAP), Chinese Academy of Sciences (CAS). The Dawn 1000 is a MIMD massively parallel computer made by the National Research Center for Intelligent Computer (NCIC), CAS. A two-dimensional domain decomposition method is adopted to perform the parallel computing. Potential ways to increase the speed-up ratio and to exploit the resources of future massively parallel supercomputers are also discussed.
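A two-dimensional domain decomposition can be sketched compactly with MPI's Cartesian-topology helpers. The following is a minimal mpi4py illustration, not the IAP model code; the global grid size and the simple block layout are assumptions.

```python
# Minimal sketch of a 2-D domain decomposition (not the IAP model code).
# Each MPI rank is assigned one rectangular lat-lon patch of a global grid.
from mpi4py import MPI
import numpy as np

NLAT, NLON = 64, 128                     # assumed global grid size

comm = MPI.COMM_WORLD
size, rank = comm.Get_size(), comm.Get_rank()

dims = MPI.Compute_dims(size, [0, 0])    # factor the ranks into a 2-D grid
cart = comm.Create_cart(dims, periods=[False, True])  # periodic in longitude
py, px = cart.Get_coords(rank)

# Local patch bounds (simple block decomposition).
lat0, lat1 = py * NLAT // dims[0], (py + 1) * NLAT // dims[0]
lon0, lon1 = px * NLON // dims[1], (px + 1) * NLON // dims[1]
local = np.zeros((lat1 - lat0, lon1 - lon0))  # this rank's share of a field
print(f"rank {rank}: lats [{lat0},{lat1}), lons [{lon0},{lon1})")
```

Each rank then time-steps only its own patch, exchanging halo rows and columns with its Cartesian neighbors.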
USDA-ARS's Scientific Manuscript database
With enhanced data availability, distributed watershed models for large areas with high spatial and temporal resolution are increasingly used to understand water budgets and examine effects of human activities and climate change/variability on water resources. Developing parallel computing software...
SciSpark: Highly Interactive and Scalable Model Evaluation and Climate Metrics
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Palamuttam, R. S.; Mogrovejo, R. M.; Whitehall, K. D.; Mattmann, C. A.; Verma, R.; Waliser, D. E.; Lee, H.
2015-12-01
Remote sensing data and climate model output are multi-dimensional arrays of massive sizes locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF), making it difficult to perform multi-stage, iterative science processing since each stage requires writing and reading data to and from disk. We are developing a lightning-fast Big Data technology called SciSpark based on Apache™ Spark under a NASA AIST grant (PI Mattmann). Spark implements the map-reduce paradigm for parallel computing on a cluster, but emphasizes in-memory computation, "spilling" to disk only as needed, and so outperforms the disk-based Apache™ Hadoop by 100x in memory and by 10x on disk. SciSpark will enable scalable model evaluation by executing large-scale comparisons of A-Train satellite observations to model grids on a cluster of 10 to 1000 compute nodes. This 2nd-generation capability for NASA's Regional Climate Model Evaluation System (RCMES) will compute simple climate metrics at interactive speeds, and extend to quite sophisticated iterative algorithms such as machine-learning based clustering of temperature PDFs, and even graph-based algorithms for searching for Mesoscale Convective Complexes. We have implemented a parallel data ingest capability in which the user specifies desired variables (arrays) as several time-sorted lists of URLs (i.e. using OPeNDAP model.nc?varname, or local files). The specified variables are partitioned by time/space and then each Spark node pulls its bundle of arrays into memory to begin a computation pipeline. We also investigated the performance of several N-dim. array libraries (scala breeze, java jblas & netlib-java, and ND4J). We are currently developing science codes using ND4J and studying memory behavior on the JVM. On the pyspark side, many of our science codes already use the numpy and SciPy ecosystems. The talk will cover: the architecture of SciSpark, the design of the scientific RDD (sRDD) data structure, our efforts to integrate climate science algorithms in Python and Scala, parallel ingest and partitioning of A-Train satellite observations from HDF files and model grids from netCDF files, first parallel runs to compute comparison statistics and PDFs, and first metrics quantifying parallel speedups and memory & disk usage.
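The sRDD is implemented in Scala, but the ingest pattern described here (parallelize a time-sorted list of URLs or paths, then let each worker pull its arrays into memory) can be sketched in PySpark. The file names and the variable name "tas" below are illustrative assumptions, not SciSpark's interface.

```python
# Minimal PySpark sketch of partitioned ingest (not SciSpark's sRDD itself).
from pyspark import SparkContext
import numpy as np
from netCDF4 import Dataset

sc = SparkContext(appName="ingest-sketch")

# Time-sorted list of files (could equally be OPeNDAP URLs); assumed names.
paths = [f"model_{y}.nc" for y in range(2000, 2010)]

def load_mean(path):
    """Each worker reads one file's variable and returns a partial result."""
    with Dataset(path) as ds:
        arr = ds.variables["tas"][:]     # assumed variable name
    return float(np.mean(arr))

# One partition per file; the mean of means assumes equal-sized files.
means = sc.parallelize(paths, len(paths)).map(load_mean).collect()
print(sum(means) / len(means))
sc.stop()
```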
Light-weight Parallel Python Tools for Earth System Modeling Workflows
NASA Astrophysics Data System (ADS)
Mickelson, S. A.; Paul, K.; Xu, H.; Dennis, J.; Brown, D. I.
2015-12-01
With the growth in computing power over the last 30 years, earth system modeling codes have become increasingly data-intensive. As an example, it is expected that the data required for the next Intergovernmental Panel on Climate Change (IPCC) Assessment Report (AR6) will increase by more than 10x, to an expected 25 PB per climate model. Faced with this daunting challenge, developers of the Community Earth System Model (CESM) have chosen to change the format of their data for long-term storage from time-slice to time-series, in order to reduce the download bandwidth needed for later analysis and post-processing by climate scientists. Hence, efficient tools are required to (1) perform the transformation of the data from time-slice to time-series format and to (2) compute climatology statistics, needed for many diagnostic computations, on the resulting time-series data. To address the first of these two challenges, we have developed a parallel Python tool for converting time-slice model output to time-series format. To address the second, we have developed a parallel Python tool to perform fast time-averaging of time-series data. These tools are designed to be light-weight and easy to install, to have very few dependencies, and to be easily inserted into the Earth system modeling workflow with negligible disruption. In this work, we present the motivation, approach, and testing results of these two light-weight parallel Python tools, as well as our plans for future research and development.
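The time-averaging tool's core idea, computing partial sums per file in parallel and combining them into a climatology, can be sketched with the standard library alone. This is not the NCAR tool itself; the file layout (one year of monthly means per file) and the variable name are assumptions.

```python
# Minimal sketch of parallel climatology computation (not the NCAR tools).
# Computes a monthly climatology from yearly time-series files in parallel.
from multiprocessing import Pool
import numpy as np
from netCDF4 import Dataset

PATHS = [f"tas_{year}.nc" for year in range(1990, 2000)]  # assumed names

def monthly_sums(path):
    """Return (12, lat, lon) sums and per-month counts for one file."""
    with Dataset(path) as ds:
        data = ds.variables["tas"][:]        # (12, lat, lon): one year, monthly
    return data.astype(np.float64), np.ones(12)

if __name__ == "__main__":
    with Pool() as pool:
        results = pool.map(monthly_sums, PATHS)
    total = sum(r[0] for r in results)
    counts = sum(r[1] for r in results)
    climatology = total / counts[:, None, None]  # 12 monthly mean fields
    print(climatology.shape)
```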
SciSpark: Highly Interactive and Scalable Model Evaluation and Climate Metrics
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Mattmann, C. A.; Waliser, D. E.; Kim, J.; Loikith, P.; Lee, H.; McGibbney, L. J.; Whitehall, K. D.
2014-12-01
Remote sensing data and climate model output are multi-dimensional arrays of massive sizes locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF), making it difficult to perform multi-stage, iterative science processing since each stage requires writing and reading data to and from disk. We are developing a lightning-fast Big Data technology called SciSpark based on Apache™ Spark. Spark implements the map-reduce paradigm for parallel computing on a cluster, but emphasizes in-memory computation, "spilling" to disk only as needed; it outperforms the disk-based Apache™ Hadoop by 100x in memory and by 10x on disk, and makes iterative algorithms feasible. SciSpark will enable scalable model evaluation by executing large-scale comparisons of A-Train satellite observations to model grids on a cluster of 100 to 1000 compute nodes. This 2nd-generation capability for NASA's Regional Climate Model Evaluation System (RCMES) will compute simple climate metrics at interactive speeds, and extend to quite sophisticated iterative algorithms such as machine-learning (ML) based clustering of temperature PDFs, and even graph-based algorithms for searching for Mesoscale Convective Complexes. The goals of SciSpark are to: (1) decrease the time to compute comparison statistics and plots from minutes to seconds; (2) allow for interactive exploration of time-series properties over seasons and years; (3) decrease the time for satellite data ingestion into RCMES to hours; (4) allow for Level-2 comparisons with higher-order statistics or PDFs in minutes to hours; and (5) move RCMES into a near-real-time decision-making platform. We will report on: the architecture and design of SciSpark, our efforts to integrate climate science algorithms in Python and Scala, parallel ingest and partitioning (sharding) of A-Train satellite observations from HDF files and model grids from netCDF files, first parallel runs to compute comparison statistics and PDFs, and first metrics quantifying parallel speedups and memory & disk usage.
NASA Astrophysics Data System (ADS)
Jang, W.; Engda, T. A.; Neff, J. C.; Herrick, J.
2017-12-01
Many crop models are increasingly used to evaluate crop yields at regional and global scales. However, implementation of these models across large areas using fine-scale grids is limited by computational time requirements. In order to facilitate global gridded crop modeling with various scenarios (i.e., different crop, management schedule, fertilizer, and irrigation) using the Environmental Policy Integrated Climate (EPIC) model, we developed a distributed parallel computing framework in Python. Our local desktop with 14 cores (28 threads) was used to test the distributed parallel computing framework in Iringa, Tanzania, which has 406,839 grid cells. High-resolution soil data, SoilGrids (250 x 250 m), and climate data, AgMERRA (0.25 x 0.25 deg), were also used as input data for the gridded EPIC model. The framework includes a master file for parallel computing, an input database, input data formatters, EPIC model execution, and output analyzers. Through the master file, the EPIC simulation is divided into jobs across a user-defined number of CPU threads. Using the EPIC input data formatters, the raw database is formatted into EPIC input data, which then moves into the EPIC simulation jobs. The 28 EPIC jobs run simultaneously, and only the output files of interest are parsed and passed to the output analyzers. We applied various scenarios with seven different slopes and twenty-four fertilizer ranges. Parallelized input generators create the different scenarios as a list for distributed parallel computing. After all simulations are completed, parallelized output analyzers are used to analyze all outputs according to the different scenarios. This saves significant computing time and resources, making it possible to conduct gridded modeling at regional to global scales with high-resolution data. For example, serial processing for the Iringa test case would require 113 hours, while using the framework developed in this study requires only approximately 6 hours, a nearly 95% reduction in computing time.
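The job-farming structure described above (a master that partitions grid cells into jobs, and workers that format inputs, run the model executable, and parse outputs) can be sketched as follows. This is not the authors' framework: the executable name "epic", the directory layout, and the cell count are hypothetical.

```python
# Minimal sketch of the job-farming idea (not the authors' framework).
# Each worker prepares one grid cell's inputs, runs the EPIC executable,
# and reports the result. The executable name "epic" and the file layout
# are hypothetical.
from multiprocessing import Pool
import subprocess, pathlib

CELLS = range(1000)          # stand-in for the 406,839 Iringa grid cells

def run_cell(cell_id):
    workdir = pathlib.Path(f"runs/cell_{cell_id}")
    workdir.mkdir(parents=True, exist_ok=True)
    # ... write formatted soil/climate/management inputs into workdir ...
    result = subprocess.run(["epic"], cwd=workdir,
                            capture_output=True, text=True)
    return cell_id, result.returncode

if __name__ == "__main__":
    with Pool(processes=28) as pool:     # 28 threads, as in the study
        for cell_id, rc in pool.imap_unordered(run_cell, CELLS):
            if rc != 0:
                print(f"cell {cell_id} failed")
```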
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jain, Atul K.
The overall objective of this DOE-funded project is to combine scientific and computational challenges in climate modeling by expanding our understanding of the biogeophysical-biogeochemical processes and their interactions in the northern high latitudes (NHLs) using an earth system modeling (ESM) approach, and by adopting an adaptive parallel runtime system in an ESM to achieve efficient and scalable climate simulations through improved load-balancing algorithms.
Parallel community climate model: Description and user's guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drake, J.B.; Flanery, R.E.; Semeraro, B.D.
This report gives an overview of a parallel version of the NCAR Community Climate Model, CCM2, implemented for MIMD massively parallel computers using a message-passing programming paradigm. The parallel implementation was developed on an Intel iPSC/860 with 128 processors and on the Intel Delta with 512 processors, and the initial target platform for the production version of the code is the Intel Paragon with 2048 processors. Because the implementation uses standard, portable message-passing libraries, the code has been easily ported to other multiprocessors supporting a message-passing programming paradigm. The parallelization strategy used is to decompose the problem domain into geographical patches and assign each processor the computation associated with a distinct subset of the patches. With this decomposition, the physics calculations involve only grid points and data local to a processor and are performed in parallel. Using parallel algorithms developed for the semi-Lagrangian transport, the fast Fourier transform and the Legendre transform, both physics and dynamics are computed in parallel with minimal data movement and modest change to the original CCM2 source code. Sequential or parallel history tapes are written and input files (in history tape format) are read sequentially by the parallel code to promote compatibility with production use of the model on other computer systems. A validation exercise has been performed with the parallel code and is detailed along with some performance numbers on the Intel Paragon and the IBM SP2. A discussion of reproducibility of results is included. A user's guide for the PCCM2 version 2.1 on the various parallel machines completes the report. Procedures for compilation, setup and execution are given. A discussion of code internals is included for those who may wish to modify and use the program in their own research.
A global database with parallel measurements to study non-climatic changes
NASA Astrophysics Data System (ADS)
Venema, Victor; Auchmann, Renate; Aguilar, Enric
2015-04-01
In this work we introduce the rationale behind the ongoing compilation of a parallel measurements database, under the umbrella of the International Surface Temperatures Initiative (ISTI) and with the support of the World Meteorological Organization. We intend this database to become instrumental for a better understanding of inhomogeneities affecting the evaluation of long-term changes in daily climate data. Long instrumental climate records are usually affected by non-climatic changes, due to, e.g., relocations and changes in instrumentation, instrument height or data collection and manipulation procedures. These so-called inhomogeneities distort the climate signal and can hamper the assessment of trends and variability. Thus to study climatic changes we need to accurately distinguish non-climatic and climatic signals. The most direct way to study the influence of non-climatic changes on the distribution and to understand the reasons for these biases is the analysis of parallel measurements representing the old and new situation (in terms of e.g. instruments, location). According to the limited number of available studies and our understanding of the causes of inhomogeneity, we expect that they will have a strong impact on the tails of the distribution of temperatures and most likely of other climate elements. Our abilities to statistically homogenize daily data will be increased by systematically studying different causes of inhomogeneity replicated through parallel measurements. Current studies of non-climatic changes using parallel data are limited to local and regional case studies. However, the effect of specific transitions depends on the local climate, and the most interesting climatic questions are about the systematic large-scale biases produced by transitions that occurred in many regions. Important potentially biasing transitions are the adoption of Stevenson screens, efforts to reduce undercatchment of precipitation, or the move to automatic weather stations. Thus a large global parallel dataset is highly desirable as it allows for the study of systematic biases in the global record. In the ISTI Parallel Observations Science Team (POST), we will gather parallel data in their native format (to avoid undetectable conversion errors we will convert it to a standard format ourselves). We are interested in data from all climate variables at all time scales, from annual to sub-daily. High-resolution data is important for understanding the physical causes for the differences between the parallel measurements. For the same reason, we are also interested in other climate variables measured at the same station. For example, in case of parallel temperature measurements, the influencing factors are expected to be insolation, wind and cloud cover; in case of parallel precipitation measurements, wind and temperature are potentially important. Metadata that describe the parallel measurements are as important as the data itself and will be collected as well: for example, the types of the instruments, their siting, height, maintenance, etc. Because they are widely used to study moderate extremes, we will compute the indices of the Expert Team on Climate Change Detection and Indices (ETCCDI). In case the daily data cannot be shared, we would appreciate these indices from parallel measurements. For more information: http://tinyurl.com/ISTI-Parallel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smyth, Padhraic
2013-07-22
This is the final report for a DOE-funded research project describing the outcome of research on non-homogeneous hidden Markov models (NHMMs) and coupled ocean-atmosphere (O-A) intermediate-complexity models (ICMs) to identify the potentially predictable modes of climate variability, and to investigate their impacts on the regional scale. The main results consist of extensive development of the hidden Markov models for rainfall simulation and downscaling, specifically within the non-stationary climate change context, together with the development of parallelized software; application of NHMMs to downscaling of rainfall projections over India; identification and analysis of decadal climate signals in data and models; and studies of climate variability in terms of the dynamics of atmospheric flow regimes.
Accelerating Climate and Weather Simulations through Hybrid Computing
NASA Technical Reports Server (NTRS)
Zhou, Shujia; Cruz, Carlos; Duffy, Daniel; Tucker, Robert; Purcell, Mark
2011-01-01
Unconventional multi- and many-core processors (e.g. IBM® Cell B.E.™ and NVIDIA® GPUs) have emerged as effective accelerators in trial climate and weather simulations. Yet these climate and weather models typically run on parallel computers with conventional processors (e.g. Intel, AMD, and IBM) using the Message Passing Interface. To address challenges involved in efficiently and easily connecting accelerators to parallel computers, we investigated using IBM's Dynamic Application Virtualization™ (IBM DAV) software in a prototype hybrid computing system with representative climate and weather model components. The hybrid system comprises two Intel blades and two IBM QS22 Cell B.E. blades, connected with both InfiniBand® (IB) and 1-Gigabit Ethernet. The system significantly accelerates a solar radiation model component by offloading compute-intensive calculations to the Cell blades. Systematic tests show that IBM DAV can seamlessly offload compute-intensive calculations from Intel blades to Cell B.E. blades in a scalable, load-balanced manner. However, noticeable communication overhead was observed, mainly due to IP over the IB protocol. Full utilization of the IB Sockets Direct Protocol and the lower-latency production version of IBM DAV will reduce this overhead.
An Interactive Multi-Model for Consensus on Climate Change
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kocarev, Ljupco
This project develops a new scheme for forming consensus among alternative climate models, which give widely divergent projections of the details of climate change; the scheme is more intelligent than simply averaging the model outputs, or averaging with ex post facto weighting factors. The method under development effectively allows models to assimilate data from one another at run time, with weights chosen in an adaptive training phase using 20th century data, so that the models synchronize with one another as well as with reality. An alternate approach being explored in parallel is the automated combination of equations from different models in an expert-system-like framework.
A global database with parallel measurements to study non-climatic changes
NASA Astrophysics Data System (ADS)
Venema, Victor; Auchmann, Renate; Aguilar, Enric; Auer, Ingeborg; Azorin-Molina, Cesar; Brandsma, Theo; Brunetti, Michele; Dienst, Manuel; Domonkos, Peter; Gilabert, Alba; Lindén, Jenny; Milewska, Ewa; Nordli, Øyvind; Prohom, Marc; Rennie, Jared; Stepanek, Petr; Trewin, Blair; Vincent, Lucie; Willett, Kate; Wolff, Mareile
2016-04-01
In this work we introduce the rationale behind the ongoing compilation of a parallel measurements database, in the framework of the International Surface Temperatures Initiative (ISTI) and with the support of the World Meteorological Organization. We intend this database to become instrumental for a better understanding of inhomogeneities affecting the evaluation of long-term changes in daily climate data. Long instrumental climate records are usually affected by non-climatic changes, due to, e.g., (i) station relocations, (ii) instrument height changes, (iii) instrumentation changes, (iv) observing environment changes, (v) different sampling intervals or data collection procedures, among others. These so-called inhomogeneities distort the climate signal and can hamper the assessment of long-term trends and variability of climate. Thus to study climatic changes we need to accurately distinguish non-climatic and climatic signals. The most direct way to study the influence of non-climatic changes on the distribution and to understand the reasons for these biases is the analysis of parallel measurements representing the old and new situation (in terms of e.g. instruments, location, different radiation shields, etc.). According to the limited number of available studies and our understanding of the causes of inhomogeneity, we expect that they will have a strong impact on the tails of the distribution of air temperatures and most likely of other climate elements. Our abilities to statistically homogenize daily data will be increased by systematically studying different causes of inhomogeneity replicated through parallel measurements. Current studies of non-climatic changes using parallel data are limited to local and regional case studies. However, the effect of specific transitions depends on the local climate, and the most interesting climatic questions are about the systematic large-scale biases produced by transitions that occurred in many regions. Important potentially biasing transitions are the adoption of Stevenson screens, relocations (to airports), efforts to reduce undercatchment of precipitation, or the move to automatic weather stations. Thus a large global parallel dataset is highly desirable as it allows for the study of systematic biases in the global record. We are interested in data from all climate variables at all time scales, from annual to sub-daily. High-resolution data is important for understanding the physical causes for the differences between the parallel measurements. For the same reason, we are also interested in other climate variables measured at the same station. For example, in case of parallel air temperature measurements, the influencing factors are expected to be global radiation, wind, humidity and cloud cover; in case of parallel precipitation measurements, wind and wet-bulb temperature are potentially important. Metadata that describe the parallel measurements are as important as the data itself and will be collected as well: for example, the types of the instruments, their siting, height, maintenance, etc. Because they are widely used to study moderate extremes, we will compute the indices of the Expert Team on Climate Change Detection and Indices (ETCCDI). In case the daily data cannot be shared, we would appreciate contributions containing these indices from parallel measurements. For more information: http://tinyurl.com/ISTI-Parallel
Humor Climate of the Primary Schools
ERIC Educational Resources Information Center
Sahin, Ahmet
2018-01-01
The aim of this study is to determine the opinions of primary school administrators and teachers on humor climates in primary schools. The study was modeled as a convergent parallel design, one of the mixed methods. The data gathered from 253 administrator questionnaires and 651 teacher questionnaires were evaluated for the quantitative part of the…
Developing a Hadoop-based Middleware for Handling Multi-dimensional NetCDF
NASA Astrophysics Data System (ADS)
Li, Z.; Yang, C. P.; Schnase, J. L.; Duffy, D.; Lee, T. J.
2014-12-01
Climate observations and model simulations are generating vast amounts of climate data, which are ever-increasing and accumulating at a rapid pace. Effectively managing and analyzing these data is essential for climate change studies. Hadoop, a distributed storage and processing framework for large data sets, has attracted increasing attention for dealing with the Big Data challenge. The maturity of Infrastructure as a Service (IaaS) cloud computing further accelerates the adoption of Hadoop for solving Big Data problems. However, Hadoop is designed to process unstructured data such as texts, documents and web pages, and cannot effectively handle scientific data formats such as array-based NetCDF files and other binary formats. In this paper, we propose to build a Hadoop-based middleware for transparently handling big NetCDF data by 1) designing a distributed climate data storage mechanism based on a POSIX-enabled parallel file system, to enable parallel big data processing with MapReduce as well as to support data access by other systems; 2) modifying the Hadoop framework to transparently process NetCDF data in parallel without sequencing or converting the data into other file formats, or loading them into HDFS; and 3) seamlessly integrating Hadoop, cloud computing and climate data in a highly scalable and fault-tolerant framework.
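The MapReduce decomposition proposed here, mapping over time chunks of an array variable and reducing the partial results, can be illustrated without a Hadoop cluster. The sketch below uses plain Python multiprocessing to mimic the map and reduce phases; the file name, variable name, and chunk size are assumptions.

```python
# Conceptual MapReduce sketch for array data (not the Hadoop middleware).
# Map: per-time-chunk partial sums. Reduce: combine into a global time mean.
from multiprocessing import Pool
from functools import reduce
import numpy as np
from netCDF4 import Dataset

PATH, VAR, CHUNK = "climate.nc", "tas", 120   # assumed names and chunking

def map_chunk(t0):
    with Dataset(PATH) as ds:
        block = ds.variables[VAR][t0:t0 + CHUNK]
    return block.sum(axis=0), block.shape[0]     # (partial sum, count)

def reduce_pair(a, b):
    return a[0] + b[0], a[1] + b[1]

if __name__ == "__main__":
    with Dataset(PATH) as ds:
        nt = ds.variables[VAR].shape[0]
    with Pool() as pool:
        partials = pool.map(map_chunk, range(0, nt, CHUNK))
    total, count = reduce(reduce_pair, partials)
    print(total / count)                          # time-mean field
```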
TECA: A Parallel Toolkit for Extreme Climate Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prabhat; Ruebel, Oliver; Byna, Surendra
2012-03-12
We present TECA, a parallel toolkit for detecting extreme events in large climate datasets. Modern climate datasets expose parallelism across a number of dimensions: spatial locations, timesteps and ensemble members. We design TECA to exploit these modes of parallelism and demonstrate a prototype implementation for detecting and tracking three classes of extreme events: tropical cyclones, extra-tropical cyclones and atmospheric rivers. We process a modern TB-sized CAM5 simulation dataset with TECA, and demonstrate good runtime performance for the three case studies.
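TECA's per-timestep parallelism can be illustrated with a toy detector. The sketch below is not TECA: it uses a deliberately simplified criterion (a local sea-level-pressure minimum below a fixed threshold stands in for a full tropical-cyclone test), and all thresholds, names, and array shapes are assumptions.

```python
# Toy sketch of parallel event detection across timesteps (not TECA).
# Flags grid points as cyclone candidates where sea-level pressure is a
# local minimum below a threshold.
from multiprocessing import Pool
import numpy as np
from scipy.ndimage import minimum_filter

def detect(args):
    t, psl = args                         # psl: (lat, lon) field at step t
    local_min = psl == minimum_filter(psl, size=5)
    candidates = np.argwhere(local_min & (psl < 99000.0))  # threshold in Pa
    return t, candidates

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    series = [(t, 101325 + 2000 * rng.standard_normal((90, 180)))
              for t in range(365)]        # stand-in daily SLP fields
    with Pool() as pool:
        for t, cands in pool.imap(detect, series):
            if len(cands):
                print(f"step {t}: {len(cands)} candidate centers")
```

The same map over timesteps extends naturally to ensemble members and spatial subdomains, the other two modes of parallelism the abstract names.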
A global database with parallel measurements to study non-climatic changes
NASA Astrophysics Data System (ADS)
Venema, Victor; Auchman, Renate; Aguilar, Enric
2017-04-01
In this work we introduce the rationale behind the ongoing compilation of a parallel measurements database, in the framework of the International Surface Temperatures Initiative (ISTI) and with the support of the World Meteorological Organization. We intend this database to become instrumental for a better understanding of inhomogeneities affecting the evaluation of long-term changes in daily climate data. Long instrumental climate records are usually affected by non-climatic changes, due to, e.g., (i) station relocations, (ii) instrument height changes, (iii) instrumentation changes, (iv) observing environment changes, (v) different sampling intervals or data collection procedures, among others. These so-called inhomogeneities distort the climate signal and can hamper the assessment of long-term trends and variability of climate. Thus to study climatic changes we need to accurately distinguish non-climatic and climatic signals. The most direct way to study the influence of non-climatic changes on the distribution and to understand the reasons for these biases is the analysis of parallel measurements representing the old and new situation (in terms of e.g. instruments, location, different radiation shields, etc.). According to the limited number of available studies and our understanding of the causes of inhomogeneity, we expect that they will have a strong impact on the tails of the distribution of air temperatures and most likely of other climate elements. Our abilities to statistically homogenize daily data will be increased by systematically studying different causes of inhomogeneity replicated through parallel measurements. Current studies of non-climatic changes using parallel data are limited to local and regional case studies. However, the effect of specific transitions depends on the local climate, and the most interesting climatic questions are about the systematic large-scale biases produced by transitions that occurred in many regions. Important potentially biasing transitions are the adoption of Stevenson screens, relocations (to airports), efforts to reduce undercatchment of precipitation, or the move to automatic weather stations. Thus a large global parallel dataset is highly desirable as it allows for the study of systematic biases in the global record. We are interested in data from all climate variables at all time scales, from annual to sub-daily. High-resolution data is important for understanding the physical causes for the differences between the parallel measurements. For the same reason, we are also interested in other climate variables measured at the same station. For example, in case of parallel air temperature measurements, the influencing factors are expected to be global radiation, wind, humidity and cloud cover; in case of parallel precipitation measurements, wind and wet-bulb temperature are potentially important.
Thorpe, Roger S; Barlow, Axel; Malhotra, Anita; Surget-Groba, Yann
2015-03-01
Global warming will impact species in a number of ways, and it is important to know the extent to which natural populations can adapt to anthropogenic climate change by natural selection. Parallel microevolution within separate species can demonstrate natural selection, but several studies of homoplasy have not yet revealed examples of widespread parallel evolution in a generic radiation. Taking into account primary phylogeographic divisions, we investigate numerous quantitative traits (size, shape, scalation, colour pattern and hue) in anole radiations from the mountainous Lesser Antillean islands. Adaptation to climatic differences can lead to very pronounced differences between spatially close populations, with all studied traits showing some evidence of parallel evolution. Traits from shape, scalation, pattern and hue (particularly the latter) show widespread evolutionary parallels within these species in response to altitudinal climate variation greater than the extreme anthropogenic climate change predicted for 2080. This gives strong evidence of the ability to adapt to climate variation by natural selection throughout this radiation. As anoles can evolve very rapidly, it suggests anthropogenic climate change is likely to be less of a conservation threat than other factors, such as habitat loss and invasive species, in this Lesser Antillean biodiversity hot spot. © 2015 John Wiley & Sons Ltd.
Interactive Correlation Analysis and Visualization of Climate Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Kwan-Liu
The relationship between our ability to analyze and extract insights from visualization of climate model output and the capability of the available resources to make those visualizations has reached a crisis point. The large volume of data currently produced by climate models is overwhelming the current, decades-old visualization workflow. The traditional methods for visualizing climate output also have not kept pace with changes in the types of grids used, the number of variables involved, the number of different simulations performed with a climate model, or the feature-richness of high-resolution simulations. This project has developed new and faster methods for visualization in order to get the most knowledge out of the new generation of high-resolution climate models. While traditional climate images will continue to be useful, there is need for new approaches to visualization and analysis of climate data if we are to gain all the insights available in ultra-large data sets produced by high-resolution model output and ensemble integrations of climate models such as those produced for the Coupled Model Intercomparison Project. Towards that end, we have developed new visualization techniques for performing correlation analysis. We have also introduced highly scalable, parallel rendering methods for visualizing large-scale 3D data. This project was done jointly with climate scientists and visualization researchers at Argonne National Laboratory and NCAR.
NASA Astrophysics Data System (ADS)
Behrens, Jörg; Hanke, Moritz; Jahns, Thomas
2014-05-01
In this talk we present a way to facilitate efficient use of MPI communication for developers of climate models. Exploitation of the performance potential of today's highly parallel supercomputers with real world simulations is a complex task. This is partly caused by the low level nature of the MPI communication library which is the dominant communication tool at least for inter-node communication. In order to manage the complexity of the task, climate simulations with non-trivial communication patterns often use an internal abstraction layer above MPI without exploiting the benefits of communication aggregation or MPI-datatypes. The solution for the complexity and performance problem we propose is the communication library YAXT. This library is built on top of MPI and takes high level descriptions of arbitrary domain decompositions and automatically derives an efficient collective data exchange. Several exchanges can be aggregated in order to reduce latency costs. Examples are given which demonstrate the simplicity and the performance gains for selected climate applications.
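YAXT's aggregation idea, combining several logical exchanges into one message so the latency cost is paid once, can be sketched directly on top of mpi4py. The example below illustrates the principle, not YAXT's API: two fields' halo values are packed into a single buffer before a neighbor exchange.

```python
# Minimal sketch of message aggregation (not YAXT's API): two halo values
# are packed into one buffer so each neighbor exchange pays latency once.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
left, right = (rank - 1) % size, (rank + 1) % size   # periodic neighbors

n = 16
temperature = np.full(n, rank, dtype=np.float64)     # toy local fields
humidity = np.full(n, 10 * rank, dtype=np.float64)

# Aggregate both fields' boundary values into a single send buffer.
send = np.concatenate([temperature[-1:], humidity[-1:]])
recv = np.empty_like(send)
comm.Sendrecv(send, dest=right, recvbuf=recv, source=left)

halo_t, halo_q = recv[0], recv[1]
print(f"rank {rank} received halo T={halo_t}, q={halo_q}")
```

A library such as YAXT derives this packing automatically from the domain decomposition description, which is the complexity it hides from the model developer.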
Description of the NCAR Community Climate Model (CCM3). Technical note
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kiehl, J.T.; Hack, J.J.; Bonan, G.B.
This report presents the details of the governing equations, physical parameterizations, and numerical algorithms defining the version of the NCAR Community Climate Model designated CCM3. The material provides an overview of the major model components, and the way in which they interact as the numerical integration proceeds. This version of the CCM incorporates significant improvements to the physics package, new capabilities such as the incorporation of a slab ocean component, and a number of enhancements to the implementation (e.g., the ability to integrate the model on parallel distributed-memory computational platforms).
Impacts of Stratospheric Black Carbon on Agriculture
NASA Astrophysics Data System (ADS)
Xia, L.; Robock, A.; Elliott, J. W.
2017-12-01
A regional nuclear war between India and Pakistan could inject 5 Tg of soot into the stratosphere, which would absorb sunlight, decrease global surface temperature by about 1°C for 5-10 years, and have major impacts on precipitation and the amount of solar radiation reaching Earth's surface. Using two global gridded crop models forced by one global climate model simulation, we investigate the impacts on agricultural productivity in various nations. The crop model in the Community Land Model 4.5 (CLM-crop4.5) and the parallel Decision Support System for Agrotechnology Transfer (pDSSAT) in the parallel System for Integrating Impact Models and Sectors are participating in the Global Gridded Crop Model Intercomparison. We force these two crop models with output from the Whole Atmosphere Community Climate Model to characterize the global agricultural impact of climate changes due to a regional nuclear war. Crops in CLM-crop4.5 include maize, rice, soybean, cotton and sugarcane, and crops in pDSSAT include maize, rice, soybean and wheat. Although the two crop models require different time frequencies of weather input, we downscale the climate model output to provide consistent temperature, precipitation and solar radiation inputs. In general, CLM-crop4.5 simulates a larger global average reduction of maize and soybean production relative to pDSSAT. Global rice production shows negligible change with climate anomalies from a regional nuclear war. Cotton and sugarcane benefit from a regional nuclear war in the CLM-crop4.5 simulation, and global wheat production decreases significantly in the pDSSAT simulation. The regional crop yield responses to a regional nuclear conflict are different for each crop, and we present the changes in production on a national basis. These models do not include the crop responses to changes in ozone, ultraviolet radiation, or diffuse radiation, and we would like to encourage more modelers to improve crop models to account for those impacts. We present these results as a demonstration of using different crop models to study this problem, and we invite more global crop modeling groups to use the same climate forcing, which we would be happy to provide, to gain a better understanding of global agricultural responses under different future climate scenarios with stratospheric aerosols.
J. G. Isebrands; G. E. Host; K. Lenz; G. Wu; H. W. Stech
2000-01-01
Process models are powerful research tools for assessing the effects of multiple environmental stresses on forest plantations. These models are driven by interacting environmental variables and often include genetic factors necessary for assessing forest plantation growth over a range of different site, climate, and silvicultural conditions. However, process models are...
Atlas : A library for numerical weather prediction and climate modelling
NASA Astrophysics Data System (ADS)
Deconinck, Willem; Bauer, Peter; Diamantakis, Michail; Hamrud, Mats; Kühnlein, Christian; Maciel, Pedro; Mengaldo, Gianmarco; Quintino, Tiago; Raoult, Baudouin; Smolarkiewicz, Piotr K.; Wedi, Nils P.
2017-11-01
The algorithms underlying numerical weather prediction (NWP) and climate models that have been developed in the past few decades face an increasing challenge caused by the paradigm shift imposed by hardware vendors towards more energy-efficient devices. In order to provide a sustainable path to exascale High Performance Computing (HPC), applications become increasingly restricted by energy consumption. As a result, the emerging diverse and complex hardware solutions have a large impact on the programming models traditionally used in NWP software, triggering a rethink of design choices for future massively parallel software frameworks. In this paper, we present Atlas, a new software library that is currently being developed at the European Centre for Medium-Range Weather Forecasts (ECMWF), with the scope of handling data structures required for NWP applications in a flexible and massively parallel way. Atlas provides a versatile framework for the future development of efficient NWP and climate applications on emerging HPC architectures. The applications range from full Earth system models, to specific tools required for post-processing weather forecast products. The Atlas library thus constitutes a step towards affordable exascale high-performance simulations by providing the necessary abstractions that facilitate the application in heterogeneous HPC environments by promoting the co-design of NWP algorithms with the underlying hardware.
Using Clustering to Establish Climate Regimes from PCM Output
NASA Technical Reports Server (NTRS)
Oglesby, Robert; Arnold, James E. (Technical Monitor); Hoffman, Forrest; Hargrove, W. W.; Erickson, D.
2002-01-01
A multivariate statistical clustering technique, based on the k-means algorithm of Hartigan, has been used to extract patterns of climatological significance from 200 years of general circulation model (GCM) output. Originally developed and implemented on a Beowulf-style parallel computer constructed by Hoffman and Hargrove from surplus commodity desktop PCs, the high-performance parallel clustering algorithm was previously applied to the derivation of ecoregions from map stacks of 9 and 25 geophysical conditions or variables for the conterminous U.S. at a resolution of 1 sq km. Now applied both across space and through time, the clustering technique yields temporally-varying climate regimes predicted by transient runs of the Parallel Climate Model (PCM). Using a business-as-usual (BAU) scenario and clustering four fields of significance to the global water cycle (surface temperature, precipitation, soil moisture, and snow depth) from 1871 through 2098, the authors' analysis shows an increase in the spatial area occupied by the cluster or climate regime which typifies desert regions (i.e., an increase in desertification) and a decrease in the spatial area occupied by the climate regime typifying winter-time high-latitude permafrost regions. The patterns of cluster changes have been analyzed to understand the predicted variability in the water cycle on global and continental scales. In addition, representative climate regimes were determined by taking three 10-year averages of the fields 100 years apart for northern hemisphere winter (December, January, and February) and summer (June, July, and August). The result is global maps of typical seasonal climate regimes for 100 years in the past, for the present, and for 100 years into the future. Using three-dimensional data or phase space representations of these climate regimes (i.e., the cluster centroids), the authors demonstrate the portion of this phase space occupied by the land surface at all points in space and time. Any single spot on the globe will exist in one of these climate regimes at any single point in time. By incrementing time, that same spot will trace out a trajectory or orbit between and among these climate regimes (or atmospheric states) in phase (or state) space. When a geographic region enters a state it never previously visited, a climatic change is said to have occurred. Tracing out the entire trajectory of a single spot on the globe yields a 'manifold' in state space representing the shape of its predicted climate occupancy. This sort of analysis enables a researcher to more easily grasp the multivariate behavior of the climate system.
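The clustering step itself is conceptually simple: stack the standardized climate fields into a point-by-variable matrix and run k-means. The sketch below uses scikit-learn on synthetic data as a stand-in for PCM output; the grid size, the number of clusters, and the synthetic fields are assumptions, and the original Beowulf implementation is a custom parallel code rather than scikit-learn.

```python
# Minimal sketch of the climate-regime clustering idea (not the original
# parallel implementation). Each grid point is a sample described by four
# standardized climate variables; k-means assigns it to a regime.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
nlat, nlon = 45, 90

# Stand-ins for surface temperature, precipitation, soil moisture, snow depth.
fields = [rng.standard_normal((nlat, nlon)) for _ in range(4)]
X = np.column_stack([f.ravel() for f in fields])
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each variable

km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(X)
regimes = km.labels_.reshape(nlat, nlon)   # map of climate-regime membership
print(np.bincount(km.labels_))             # spatial area per regime (in cells)
```

Tracking how the per-regime cell counts change through time is exactly the desertification/permafrost-area analysis the abstract describes.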
NASA Technical Reports Server (NTRS)
Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn
2002-01-01
One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document proposes a strawman framework design for the climate community based on the integration of Cactus, from the relativistic physics community, and the UCLA/UCB Distributed Data Broker (DDB) from the climate community. This design is the result of an extensive survey of climate models and frameworks in the climate community as well as frameworks from many other scientific communities. The design addresses fundamental development and runtime needs using Cactus, a framework with interfaces for FORTRAN and C-based languages, and high-performance model communication needs using DDB. This document also specifically explores object-oriented design issues in the context of climate modeling as well as climate modeling issues in terms of object-oriented design.
Next-Generation Climate Modeling Science Challenges for Simulation, Workflow and Analysis Systems
NASA Astrophysics Data System (ADS)
Koch, D. M.; Anantharaj, V. G.; Bader, D. C.; Krishnan, H.; Leung, L. R.; Ringler, T.; Taylor, M.; Wehner, M. F.; Williams, D. N.
2016-12-01
We will present two examples of current and future high-resolution climate-modeling research that are challenging existing simulation run-time I/O, model-data movement, storage and publishing, and analysis. In each case, we will consider lessons learned as current workflow systems are broken by these large-data science challenges, as well as strategies to repair or rebuild the systems. First we consider the science and workflow challenges to be posed by the CMIP6 multi-model HighResMIP, involving around a dozen modeling groups performing quarter-degree simulations, in 3-member ensembles for 100 years, with high-frequency (1-6 hourly) diagnostics, which is expected to generate over 4PB of data. An example of science derived from these experiments will be to study how resolution affects the ability of models to capture extreme-events such as hurricanes or atmospheric rivers. Expected methods to transfer (using parallel Globus) and analyze (using parallel "TECA" software tools) HighResMIP data for such feature-tracking by the DOE CASCADE project will be presented. A second example will be from the Accelerated Climate Modeling for Energy (ACME) project, which is currently addressing challenges involving multiple century-scale coupled high resolution (quarter-degree) climate simulations on DOE Leadership Class computers. ACME is anticipating production of over 5PB of data during the next 2 years of simulations, in order to investigate the drivers of water cycle changes, sea-level-rise, and carbon cycle evolution. The ACME workflow, from simulation to data transfer, storage, analysis and publication will be presented. Current and planned methods to accelerate the workflow, including implementing run-time diagnostics, and implementing server-side analysis to avoid moving large datasets will be presented.
Predicting Coupled Ocean-Atmosphere Modes with a Climate Modeling Hierarchy -- Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michael Ghil, UCLA; Andrew W. Robertson, IRI, Columbia Univ.; Sergey Kravtsov, U. of Wisconsin, Milwaukee
The goal of the project was to determine midlatitude climate predictability associated with tropical-extratropical interactions on interannual-to-interdecadal time scales. Our strategy was to develop and test a hierarchy of climate models, bringing together large GCM-based climate models with simple fluid-dynamical coupled ocean-ice-atmosphere models, through the use of advanced probabilistic network (PN) models. PN models were used to develop a new diagnostic methodology for analyzing coupled ocean-atmosphere interactions in large climate simulations made with the NCAR Parallel Climate Model (PCM), and to make these tools user-friendly and available to other researchers. We focused on interactions between the tropics and extratropics through atmospheric teleconnections (the Hadley cell, Rossby waves and nonlinear circulation regimes) over both the North Atlantic and North Pacific, and the ocean's thermohaline circulation (THC) in the Atlantic. We tested the hypothesis that variations in the strength of the THC alter sea surface temperatures in the tropical Atlantic, and that the latter influence the atmosphere in high latitudes through an atmospheric teleconnection, feeding back onto the THC. The PN model framework was used to mediate between the understanding gained with simplified primitive equations models and multi-century simulations made with the PCM. The project team is interdisciplinary and built on an existing synergy between atmospheric and ocean scientists at UCLA, computer scientists at UCI, and climate researchers at the IRI.
CICE, The Los Alamos Sea Ice Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunke, Elizabeth; Lipscomb, William; Jones, Philip
The Los Alamos sea ice model (CICE) is the result of an effort to develop a computationally efficient sea ice component for a fully coupled atmosphere–land–ocean–ice global climate model. It was originally designed to be compatible with the Parallel Ocean Program (POP), an ocean circulation model developed at Los Alamos National Laboratory for use on massively parallel computers. CICE has several interacting components: a vertical thermodynamic model that computes local growth rates of snow and ice due to vertical conductive, radiative and turbulent fluxes, along with snowfall; an elastic-viscous-plastic model of ice dynamics, which predicts the velocity field of the ice pack based on a model of the material strength of the ice; an incremental remapping transport model that describes horizontal advection of the areal concentration, ice and snow volume and other state variables; and a ridging parameterization that transfers ice among thickness categories based on energetic balances and rates of strain. It also includes a biogeochemical model that describes evolution of the ice ecosystem. The CICE sea ice model is used for climate research as one component of complex global earth system models that include atmosphere, land, ocean and biogeochemistry components. It is also used for operational sea ice forecasting in the polar regions and in numerical weather prediction models.
NASA Technical Reports Server (NTRS)
Glotter, Michael J.; Ruane, Alex C.; Moyer, Elisabeth J.; Elliott, Joshua W.
2015-01-01
Projections of future food production necessarily rely on models, which must themselves be validated through historical assessments comparing modeled and observed yields. Reliable historical validation requires both accurate agricultural models and accurate climate inputs. Problems with either may compromise the validation exercise. Previous studies have compared the effects of different climate inputs on agricultural projections, but either incompletely or without a ground truth of observed yields that would allow distinguishing errors due to climate inputs from those intrinsic to the crop model. This study is a systematic evaluation of the reliability of a widely used crop model for simulating U.S. maize yields when driven by multiple observational data products. The parallelized Decision Support System for Agrotechnology Transfer (pDSSAT) is driven with climate inputs from multiple sources (reanalysis, reanalysis that is bias corrected with observed climate, and a control dataset) and compared with observed historical yields. The simulations show that model output is more accurate when driven by any observation-based precipitation product than when driven by non-bias-corrected reanalysis. The simulations also suggest, in contrast to previous studies, that biased precipitation distribution is significant for yields only in arid regions. Some issues persist for all choices of climate inputs: crop yields appear to be oversensitive to precipitation fluctuations but undersensitive to floods and heat waves. These results suggest that the most important issue for agricultural projections may be not climate inputs but structural limitations in the crop models themselves.
Evaluating the sensitivity of agricultural model performance to different climate inputs
Glotter, Michael J.; Moyer, Elisabeth J.; Ruane, Alex C.; Elliott, Joshua W.
2017-01-01
Projections of future food production necessarily rely on models, which must themselves be validated through historical assessments comparing modeled to observed yields. Reliable historical validation requires both accurate agricultural models and accurate climate inputs. Problems with either may compromise the validation exercise. Previous studies have compared the effects of different climate inputs on agricultural projections, but either incompletely or without a ground truth of observed yields that would allow distinguishing errors due to climate inputs from those intrinsic to the crop model. This study is a systematic evaluation of the reliability of a widely-used crop model for simulating U.S. maize yields when driven by multiple observational data products. The parallelized Decision Support System for Agrotechnology Transfer (pDSSAT) is driven with climate inputs from multiple sources – reanalysis, reanalysis bias-corrected with observed climate, and a control dataset – and compared to observed historical yields. The simulations show that model output is more accurate when driven by any observation-based precipitation product than when driven by un-bias-corrected reanalysis. The simulations also suggest, in contrast to previous studies, that biased precipitation distribution is significant for yields only in arid regions. However, some issues persist for all choices of climate inputs: crop yields appear oversensitive to precipitation fluctuations but undersensitive to floods and heat waves. These results suggest that the most important issue for agricultural projections may be not climate inputs but structural limitations in the crop models themselves. PMID:29097985
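A sketch of the validation logic common to both versions of this study: one crop model is driven by several climate forcings, and each result is scored against the same observed yields. All names and numbers below are synthetic placeholders, not the study's data.

```python
import numpy as np

# Drive one crop model with several climate forcings and score each
# against the same observed yields. Arrays here are synthetic stand-ins;
# in the study the yields come from pDSSAT runs and county records.

rng = np.random.default_rng(0)
observed = rng.normal(9.5, 1.0, size=30)           # t/ha, 30 years (synthetic)

simulated = {                                       # hypothetical forcing names
    "reanalysis":       observed + rng.normal(0.8, 0.9, 30),  # biased + noisy
    "bias_corrected":   observed + rng.normal(0.1, 0.5, 30),
    "observed_control": observed + rng.normal(0.0, 0.4, 30),
}

for name, sim in simulated.items():
    rmse = np.sqrt(np.mean((sim - observed) ** 2))
    corr = np.corrcoef(sim, observed)[0, 1]
    print(f"{name:16s}  RMSE={rmse:4.2f} t/ha  r={corr:4.2f}")
```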
Blaney-Morin-Nigeria evapotranspiration model
NASA Astrophysics Data System (ADS)
Duru, J. Obiukwu
1984-02-01
An evapotranspiration model which parallels that proposed earlier by Blaney and Morin has been developed for application in Nigeria. The model, designated the Blaney-Morin-Nigeria evapotranspiration model, predicts potential evapotranspiration with better accuracy and consistency than the Penman model under Nigerian conditions. It is suggested that the Blaney-Morin evapotranspiration concept may have similar potential elsewhere when given specific form with appropriate constants derived to reflect climatic peculiarities.
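The abstract does not reproduce the model's constants, so as a reference point here is the closely related FAO-24 Blaney-Criddle form; the Blaney-Morin family extends this temperature-based formula with a humidity term and regionally calibrated constants, which are not shown here.

```python
def blaney_criddle_pet(t_mean_c, p_daylight):
    """Reference PET (mm/day) from the FAO-24 Blaney-Criddle formula.

    t_mean_c   : mean daily air temperature, deg C
    p_daylight : mean daily percentage of annual daytime hours
    The Blaney-Morin family augments this temperature-based form with a
    relative-humidity term and regionally calibrated constants (the
    Nigeria-specific constants are not reproduced here).
    """
    return p_daylight * (0.46 * t_mean_c + 8.13)

print(blaney_criddle_pet(27.0, 0.27))  # ~5.5 mm/day for a tropical site
```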
NASA Astrophysics Data System (ADS)
Balaji, V.; Benson, Rusty; Wyman, Bruce; Held, Isaac
2016-10-01
Climate models represent a large variety of processes on a variety of timescales and space scales, a canonical example of multi-physics multi-scale modeling. Current hardware trends, such as Graphical Processing Units (GPUs) and Many Integrated Core (MIC) chips, are based on, at best, marginal increases in clock speed, coupled with vast increases in concurrency, particularly at the fine grain. Multi-physics codes face particular challenges in achieving fine-grained concurrency, as different physics and dynamics components have different computational profiles, and universal solutions are hard to come by. We propose here one approach for multi-physics codes. These codes are typically structured as components interacting via software frameworks. The component structure of a typical Earth system model consists of a hierarchical and recursive tree of components, each representing a different climate process or dynamical system. This recursive structure generally encompasses a modest level of concurrency at the highest level (e.g., atmosphere and ocean on different processor sets) with serial organization underneath. We propose to extend concurrency much further by running more and more lower- and higher-level components in parallel with each other. Each component can further be parallelized on the fine grain, potentially offering a major increase in the scalability of Earth system models. We present here first results from this approach, called coarse-grained component concurrency, or CCC. Within the Geophysical Fluid Dynamics Laboratory (GFDL) Flexible Modeling System (FMS), the atmospheric radiative transfer component has been configured to run in parallel with a composite component consisting of every other atmospheric component, including the atmospheric dynamics and all other atmospheric physics components. We will explore the algorithmic challenges involved in such an approach, and present results from such simulations. Plans to achieve even greater levels of coarse-grained concurrency by extending this approach within other components, such as the ocean, will be discussed.
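A toy sketch of the coarse-grained component concurrency idea using mpi4py (an assumed stand-in; the paper's implementation lives in FMS, not Python): the world communicator is split into a radiation component and a composite "everything else" component, which step independently and exchange fields at the coupling interval. The 1:2 rank split and the field merge are illustrative.

```python
# Run with e.g.: mpiexec -n 6 python ccc_sketch.py   (needs >= 2 ranks)
from mpi4py import MPI
import numpy as np

world = MPI.COMM_WORLD
rank, size = world.Get_rank(), world.Get_size()
n_rad = max(1, size // 3)                 # illustrative 1:2 processor split
color = 0 if rank < n_rad else 1          # 0 = radiation, 1 = everything else
comp = world.Split(color, rank)           # per-component communicator

state = np.zeros(16)
for step in range(5):
    # each component advances independently between coupling points
    state += 0.1 if color == 0 else 0.01  # stand-ins for physics tendencies
    if comp.Get_rank() == 0:              # component roots exchange fields
        peer = n_rad if color == 0 else 0
        recv = np.empty_like(state)
        world.Sendrecv(state, dest=peer, recvbuf=recv, source=peer)
        state = 0.5 * (state + recv)      # toy merge of coupled fields
    comp.Bcast(state, root=0)             # share merged fields within component
```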
Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Dean N.
2011-07-20
This report summarizes work carried out by the Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) Team for the period of January 1, 2011 through June 30, 2011. It discusses highlights, overall progress, period goals, and collaborations and lists papers and presentations. To learn more about our project, please visit our UV-CDAT website (URL: http://uv-cdat.org). This report will be forwarded to the program manager for the Department of Energy (DOE) Office of Biological and Environmental Research (BER), national and international collaborators and stakeholders, and to researchers working on a wide range of other climate model, reanalysis, and observation evaluation activities. The UV-CDAT executive committee consists of Dean N. Williams of Lawrence Livermore National Laboratory (LLNL); Dave Bader and Galen Shipman of Oak Ridge National Laboratory (ORNL); Phil Jones and James Ahrens of Los Alamos National Laboratory (LANL); Claudio Silva of Polytechnic Institute of New York University (NYU-Poly); and Berk Geveci of Kitware, Inc. The UV-CDAT team consists of researchers and scientists with diverse domain knowledge whose home institutions also include the National Aeronautics and Space Administration (NASA) and the University of Utah. All work is accomplished under DOE open-source guidelines and in close collaboration with the project's stakeholders, domain researchers, and scientists. Working directly with BER climate science analysis projects, this consortium will develop and deploy data and computational resources useful to a wide variety of stakeholders, including scientists, policymakers, and the general public. Members of this consortium already collaborate with other institutions and universities in researching data discovery, management, visualization, workflow analysis, and provenance. The UV-CDAT team will address the following high-level visualization requirements: (1) alternative parallel streaming statistics and analysis pipelines - data parallelism, task parallelism, visualization parallelism; (2) optimized parallel input/output (I/O); (3) remote interactive execution; (4) advanced intercomparison visualization; (5) data provenance processing and capture; and (6) interfaces for scientists - workflow data analysis and visualization construction tools, and visualization interfaces.
IPSL-CM5A2. An Earth System Model designed to run long simulations for past and future climates.
NASA Astrophysics Data System (ADS)
Sepulchre, Pierre; Caubel, Arnaud; Marti, Olivier; Hourdin, Frédéric; Dufresne, Jean-Louis; Boucher, Olivier
2017-04-01
The IPSL-CM5A model was developed and released in 2013 "to study the long-term response of the climate system to natural and anthropogenic forcings as part of the 5th Phase of the Coupled Model Intercomparison Project (CMIP5)" [Dufresne et al., 2013]. Although this model has also been used for numerous paleoclimate studies, a major limitation was its computation time, which averaged 10 model-years/day on 32 cores of the Curie supercomputer (at the TGCC computing centre, France). Such performance was compatible with the experimental designs of intercomparison projects (e.g. CMIP, PMIP) but became limiting for modelling activities involving several multi-millennial experiments, which are typical of Quaternary or "deep-time" paleoclimate studies, in which a fully equilibrated deep ocean is mandatory. Here we present the Earth System model IPSL-CM5A2. Starting from IPSL-CM5A, technical developments have been performed both on separate components and on the coupling system in order to speed up the whole coupled model. These developments include hybrid MPI-OpenMP parallelization in the LMDz atmospheric component, the use of a new input-output library to perform parallel asynchronous input/output by using computing cores as "IO servers", and the use of a parallel coupling library between the ocean and atmospheric components. Running on 304 cores, the model can now simulate 55 years per day, opening the way to multi-millennial simulations. Apart from obtaining better computing performance, one aim of setting up IPSL-CM5A2 was to overcome the cold bias in global surface air temperature (t2m) seen in IPSL-CM5A. We present the tuning strategy used to overcome this bias, as well as the main characteristics (including biases) of the pre-industrial climate simulated by IPSL-CM5A2. Lastly, we briefly present paleoclimate simulations run with this model, for the Holocene and for deeper timescales in the Cenozoic, for which the particular continental configuration was handled by a new design of the ocean tripolar grid.
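A quick back-of-envelope check of the throughput figures quoted above shows that the gain comes mostly from using more cores, at somewhat lower per-core efficiency:

```python
# Throughput figures from the abstract: 10 model-years/day on 32 cores
# (IPSL-CM5A) vs 55 model-years/day on 304 cores (IPSL-CM5A2).
old = 10 / 32     # model-years/day per core, IPSL-CM5A
new = 55 / 304    # model-years/day per core, IPSL-CM5A2
print(f"speedup: {55 / 10:.1f}x on {304 / 32:.1f}x the cores")
print(f"per-core throughput ratio: {new / old:.2f}")  # ~0.58
```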
A parallel direct numerical simulation of dust particles in a turbulent flow
NASA Astrophysics Data System (ADS)
Nguyen, H. V.; Yokota, R.; Stenchikov, G.; Kocurek, G.
2012-04-01
Due to their effects on radiation transport, aerosols play an important role in the global climate. Mineral dust aerosol is a predominant natural aerosol in the desert and semi-desert regions of the Middle East and North Africa (MENA). The Arabian Peninsula is one of the three predominant source regions on the planet "exporting" dust to almost the entire world. Mineral dust aerosols make up about 50% of the tropospheric aerosol mass and therefore produce a significant impact on the Earth's climate and the atmospheric environment, especially in the MENA region, which is characterized by frequent dust storms and large aerosol generation. Understanding the mechanisms of dust emission, transport and deposition is therefore essential for correctly representing dust in numerical climate prediction. In this study we present results of numerical simulations of dust particles in a turbulent flow to study the interaction between dust and the atmosphere. Homogeneous and passive dust particles in the boundary layer are entrained and advected under the influence of a turbulent flow. Currently no interactions between particles are included. Turbulence is resolved through direct numerical simulation using a parallel incompressible Navier-Stokes flow solver. Model output provides information on particle trajectories, turbulent transport of dust and effects of gravity on dust motion, which will be used for comparison with the wind tunnel experiments at the University of Texas at Austin. Results of parallel efficiency and scalability testing are provided. Future versions of the model will include air-particle momentum exchange, varying particle sizes and saltation effects. The results will be used for interpreting wind tunnel and field experiments and for improving dust generation parameterizations in meteorological models.
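A minimal passive-tracer analogue of the setup described above: particles advected by a prescribed velocity field with a gravitational settling term and no particle-particle interactions. The flow field is a toy stand-in for the resolved DNS turbulence.

```python
import numpy as np

# Passive particles advected by a prescribed 2-D velocity field with a
# settling term; no particle-particle interactions, matching the setup
# described in the abstract. The flow is a toy stand-in for DNS turbulence.

def velocity(pos, t):
    x, y = pos[:, 0], pos[:, 1]
    u = np.sin(x) * np.cos(y) + 0.1 * np.sin(t)   # toy cellular flow
    v = -np.cos(x) * np.sin(y)
    return np.column_stack([u, v])

rng = np.random.default_rng(1)
pos = rng.uniform(0, 2 * np.pi, size=(1000, 2))   # 1000 passive particles
w_settle = 0.02                                    # settling speed (toy units)
dt = 0.01
for n in range(500):                               # forward-Euler trajectories
    vel = velocity(pos, n * dt)
    vel[:, 1] -= w_settle                          # gravity acts on y
    pos += dt * vel

print(pos.mean(axis=0))   # mean drift of the particle cloud
```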
The seasonal-cycle climate model
NASA Technical Reports Server (NTRS)
Marx, L.; Randall, D. A.
1981-01-01
The seasonal cycle run, which will become the control run for comparison with runs utilizing codes and parameterizations developed by outside investigators, is discussed. The climate model currently exists in two parallel versions: one running on the Amdahl and the other on the CYBER 203. These two versions are as nearly identical as machine capability and the requirement for high-speed performance allow. Developmental changes are made on the Amdahl/CMS version for ease of testing and rapidity of turnaround. The changes are subsequently incorporated into the CYBER 203 version using vectorization techniques where speed improvement can be realized. The 400-day seasonal cycle run serves as a control run for both medium and long range climate forecasts as well as sensitivity studies.
Toward GEOS-6, A Global Cloud System Resolving Atmospheric Model
NASA Technical Reports Server (NTRS)
Putman, William M.
2010-01-01
NASA is committed to observing and understanding the weather and climate of our home planet through the use of multi-scale modeling systems and space-based observations. Global climate models have evolved to take advantage of the influx of multi- and many-core computing technologies and the availability of large clusters of multi-core microprocessors. GEOS-6 is a next-generation cloud system resolving atmospheric model that will place NASA at the forefront of scientific exploration of our atmosphere and climate. Model simulations with GEOS-6 will produce a realistic representation of our atmosphere on the scale of typical satellite observations, bringing a visual comprehension of model results to a new level among climate enthusiasts. In preparation for GEOS-6, the agency's flagship Earth System Modeling Framework has been enhanced to support cutting-edge high-resolution global climate and weather simulations. Improvements include a cubed-sphere grid that exposes parallelism, a non-hydrostatic finite-volume dynamical core, and algorithms designed for co-processor technologies, among others. GEOS-6 represents a fundamental advancement in the capability of global Earth system models. The ability to directly compare global simulations at the resolution of spaceborne satellite images will lead to algorithm improvements and better utilization of space-based observations within the GEOS data assimilation system.
PP-SWAT: A Python-based computing software for efficient multiobjective calibration of SWAT
USDA-ARS?s Scientific Manuscript database
With enhanced data availability, distributed watershed models for large areas with high spatial and temporal resolution are increasingly used to understand water budgets and examine effects of human activities and climate change/variability on water resources. Developing parallel computing software...
The Ophidia framework: toward cloud-based data analytics for climate change
NASA Astrophysics Data System (ADS)
Fiore, Sandro; D'Anca, Alessandro; Elia, Donatello; Mancini, Marco; Mariello, Andrea; Mirto, Maria; Palazzo, Cosimo; Aloisio, Giovanni
2015-04-01
The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in the climate change domain. It provides parallel (server-side) data analysis, an internal storage model and a hierarchical data organization to manage large amounts of multidimensional scientific data. The Ophidia analytics platform provides several MPI-based parallel operators to manipulate large datasets (data cubes) and array-based primitives to perform data analysis on large arrays of scientific data. The most relevant data analytics use cases implemented in national and international projects target fire danger prevention (OFIDIA), interactions between climate change and biodiversity (EUBrazilCC), climate indicators and remote data analysis (CLIP-C), sea situational awareness (TESSA), and large-scale data analytics on CMIP5 data in NetCDF format, compliant with the Climate and Forecast (CF) convention (ExArch). Two use cases regarding the EU FP7 EUBrazil Cloud Connect and the INTERREG OFIDIA projects will be presented during the talk. In the former case (EUBrazilCC), the Ophidia framework is being extended to integrate scalable VM-based solutions for the management of large volumes of scientific data (both climate and satellite data) in a cloud-based environment to study how climate change affects biodiversity. In the latter (OFIDIA), the data analytics framework is being exploited to provide operational support for processing chains devoted to fire danger prevention. To tackle the project challenges, data analytics workflows consisting of about 130 operators perform, among other tasks, parallel data analysis, metadata management, virtual file system tasks, map generation, rolling of datasets, and import/export of datasets in NetCDF format. Finally, the entire Ophidia software stack has been deployed at CMCC on 24 nodes (16 cores/node) of the Athena HPC cluster. Moreover, a cloud-based release tested with OpenNebula is also available and running in the private cloud infrastructure of the CMCC Supercomputing Centre.
NASA Astrophysics Data System (ADS)
Kanzawa, H.; Emori, S.; Nishimura, T.; Suzuki, T.; Inoue, T.; Hasumi, H.; Saito, F.; Abe-Ouchi, A.; Kimoto, M.; Sumi, A.
2002-12-01
The fastest supercomputer in the world, the Earth Simulator (total peak performance 40 TFLOPS), has recently become available for climate research in Yokohama, Japan. We are planning to conduct a series of future climate change projection experiments on the Earth Simulator with a high-resolution coupled ocean-atmosphere climate model. The main scientific aims of the experiments are to investigate 1) the change in global ocean circulation with an eddy-permitting ocean model, 2) the regional details of the climate change, including the Asian monsoon rainfall pattern, tropical cyclones and so on, and 3) the change in natural climate variability with a high-resolution model of the coupled ocean-atmosphere system. To meet these aims, an atmospheric GCM, CCSR/NIES AGCM, with T106 (~1.1°) horizontal resolution and 56 vertical layers is to be coupled with an oceanic GCM, COCO, with ~0.28° x 0.19° horizontal resolution and 48 vertical layers. This coupled ocean-atmosphere climate model, named MIROC, also includes a land-surface model, a dynamic-thermodynamic sea-ice model, and a river routing model. The poles of the oceanic model grid system are rotated from the geographic poles so that they are placed in the Greenland and Antarctic land masses to avoid the singularity of the grid system. Each of the atmospheric and oceanic parts of the model is parallelized with the Message Passing Interface (MPI) technique. The coupling of the two is to be done in a Multi Program Multi Data (MPMD) fashion. A 100-model-year integration will be possible in one actual month with 720 vector processors (only 14% of the full resources of the Earth Simulator).
Hydrologic Implications of Dynamical and Statistical Approaches to Downscaling Climate Model Outputs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, Andrew W; Leung, Lai R; Sridhar, V
Six approaches for downscaling climate model outputs for use in hydrologic simulation were evaluated, with particular emphasis on each method's ability to produce precipitation and other variables used to drive a macroscale hydrology model applied at much higher spatial resolution than the climate model. Comparisons were made on the basis of a twenty-year retrospective (1975–1995) climate simulation produced by the NCAR-DOE Parallel Climate Model (PCM), and the implications of the comparison for a future (2040–2060) PCM climate scenario were also explored. The six approaches were made up of three relatively simple statistical downscaling methods – linear interpolation (LI), spatial disaggregation (SD), and bias-correction and spatial disaggregation (BCSD) – each applied to both PCM output directly (at T42 spatial resolution), and after dynamical downscaling via a Regional Climate Model (RCM – at ½-degree spatial resolution), for downscaling the climate model outputs to the 1/8-degree spatial resolution of the hydrological model. For the retrospective climate simulation, results were compared to an observed gridded climatology of temperature and precipitation, and gridded hydrologic variables resulting from forcing the hydrologic model with observations. The most significant findings are that the BCSD method was successful in reproducing the main features of the observed hydrometeorology from the retrospective climate simulation, when applied to both PCM and RCM outputs. Linear interpolation produced better results using RCM output than PCM output, but both methods (PCM-LI and RCM-LI) led to unacceptably biased hydrologic simulations. Spatial disaggregation of the PCM output produced results similar to those achieved with the RCM interpolated output; nonetheless, neither PCM nor RCM output was useful for hydrologic simulation purposes without a bias-correction step. For the future climate scenario, only the BCSD method (using PCM or RCM) was able to produce hydrologically plausible results. With the BCSD method, the RCM-derived hydrology was more sensitive to climate change than the PCM-derived hydrology.
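A minimal sketch of the bias-correction step at the heart of BCSD, empirical quantile mapping: each model value is mapped to the observed value at the same quantile of the training-period distribution. The spatial disaggregation step and the handling of future scenarios are omitted, and the data are synthetic.

```python
import numpy as np

# Empirical quantile mapping: replace each model value by the observed
# value at the same quantile of the training-period distribution.

def quantile_map(model_train, obs_train, model_values):
    q = np.searchsorted(np.sort(model_train), model_values) / len(model_train)
    q = np.clip(q, 0.0, 1.0)
    return np.quantile(obs_train, q)

rng = np.random.default_rng(2)
obs = rng.gamma(2.0, 2.0, 7300)          # observed daily precip (synthetic)
mod = rng.gamma(1.5, 3.5, 7300)          # biased model precip (synthetic)
corrected = quantile_map(mod, obs, mod)
print(mod.mean(), corrected.mean(), obs.mean())  # corrected mean ~ observed mean
```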
NASA Technical Reports Server (NTRS)
Treinish, Lloyd A.; Gough, Michael L.; Wildenhain, W. David
1987-01-01
A capability was developed for rapidly producing visual representations of large, complex, multi-dimensional space and earth sciences data sets via the implementation of computer graphics modeling techniques on the Massively Parallel Processor (MPP), employing techniques recently developed for typically non-scientific applications. Such capabilities can provide a new and valuable tool for the understanding of complex scientific data, and a new application of parallel computing via the MPP. A prototype system with such capabilities was developed and integrated into the National Space Science Data Center's (NSSDC) Pilot Climate Data System (PCDS) data-independent environment for computer graphics data display to provide easy access to users. While developing these capabilities, several problems had to be solved independently of the actual use of the MPP, all of which are outlined.
Assessing Confidence in Pliocene Sea Surface Temperatures to Evaluate Predictive Models
NASA Technical Reports Server (NTRS)
Dowsett, Harry J.; Robinson, Marci M.; Haywood, Alan M.; Hill, Daniel J.; Dolan, Aisling. M.; Chan, Wing-Le; Abe-Ouchi, Ayako; Chandler, Mark A.; Rosenbloom, Nan A.; Otto-Bliesner, Bette L.;
2012-01-01
In light of mounting empirical evidence that planetary warming is well underway, the climate research community looks to palaeoclimate research for a ground-truthing measure with which to test the accuracy of future climate simulations. Model experiments that attempt to simulate climates of the past serve to identify both similarities and differences between two climate states and, when compared with simulations run by other models and with geological data, to identify model-specific biases. Uncertainties associated with both the data and the models must be considered in such an exercise. The most recent period of sustained global warmth similar to what is projected for the near future occurred about 3.3–3.0 million years ago, during the Pliocene epoch. Here, we present Pliocene sea surface temperature data, newly characterized in terms of level of confidence, along with initial experimental results from four climate models. We conclude that, in terms of sea surface temperature, models are in good agreement with estimates of Pliocene sea surface temperature in most regions except the North Atlantic. Our analysis indicates that the discrepancy between the Pliocene proxy data and model simulations in the mid-latitudes of the North Atlantic, where models underestimate warming shown by our highest-confidence data, may provide a new perspective and insight into the predictive abilities of these models in simulating a past warm interval in Earth history. This is important because the Pliocene has a number of parallels to present predictions of late twenty-first century climate.
Assessing confidence in Pliocene sea surface temperatures to evaluate predictive models
Dowsett, Harry J.; Robinson, Marci M.; Haywood, Alan M.; Hill, Daniel J.; Dolan, Aisling M.; Stoll, Danielle K.; Chan, Wing-Le; Abe-Ouchi, Ayako; Chandler, Mark A.; Rosenbloom, Nan A.; Otto-Bliesner, Bette L.; Bragg, Fran J.; Lunt, Daniel J.; Foley, Kevin M.; Riesselman, Christina R.
2012-01-01
In light of mounting empirical evidence that planetary warming is well underway, the climate research community looks to palaeoclimate research for a ground-truthing measure with which to test the accuracy of future climate simulations. Model experiments that attempt to simulate climates of the past serve to identify both similarities and differences between two climate states and, when compared with simulations run by other models and with geological data, to identify model-specific biases. Uncertainties associated with both the data and the models must be considered in such an exercise. The most recent period of sustained global warmth similar to what is projected for the near future occurred about 3.3–3.0 million years ago, during the Pliocene epoch. Here, we present Pliocene sea surface temperature data, newly characterized in terms of level of confidence, along with initial experimental results from four climate models. We conclude that, in terms of sea surface temperature, models are in good agreement with estimates of Pliocene sea surface temperature in most regions except the North Atlantic. Our analysis indicates that the discrepancy between the Pliocene proxy data and model simulations in the mid-latitudes of the North Atlantic, where models underestimate warming shown by our highest-confidence data, may provide a new perspective and insight into the predictive abilities of these models in simulating a past warm interval in Earth history. This is important because the Pliocene has a number of parallels to present predictions of late twenty-first century climate.
Accelerating Climate Simulations Through Hybrid Computing
NASA Technical Reports Server (NTRS)
Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark
2009-01-01
Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPUs) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for connection, we identified two challenges: (1) an identical MPI implementation is required in both systems, and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors and two IBM QS22 Cell blades, connected with InfiniBand), allowing for seamless offloading of compute-intensive functions to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approximately 10% network overhead.
Climate Ocean Modeling on a Beowulf Class System
NASA Technical Reports Server (NTRS)
Cheng, B. N.; Chao, Y.; Wang, P.; Bondarenko, M.
2000-01-01
With the growing power and shrinking cost of personal computers, the availability of fast ethernet interconnections, and public domain software packages, it is now possible to combine them to build desktop parallel computers (named Beowulf or PC clusters) at a fraction of what it would cost to buy systems of comparable power from supercomputer companies. This led us to build and assemble our own system, specifically for climate ocean modeling. In this article, we present our experience with such a system, discuss its network performance, and provide some performance comparison data with both the HP SPP2000 and Cray T3E for an ocean model used in present-day oceanographic research.
NASA Technical Reports Server (NTRS)
Shen, Bo-Wen; Cheung, Samson; Li, Jui-Lin F.; Wu, Yu-ling
2013-01-01
In this study, we discuss the performance of the parallel ensemble empirical mode decomposition (EMD) in the analysis of tropical waves that are associated with tropical cyclone (TC) formation. To efficiently analyze high-resolution, global, multiple-dimensional data sets, we first implement multilevel parallelism into the ensemble EMD (EEMD) and obtain a parallel speedup of 720 using 200 eight-core processors. We then apply the parallel EEMD (PEEMD) to extract the intrinsic mode functions (IMFs) from preselected data sets that represent (1) idealized tropical waves and (2) large-scale environmental flows associated with Hurricane Sandy (2012). Results indicate that the PEEMD is efficient and effective in revealing the major wave characteristics of the data, such as wavelengths and periods, by sifting out the dominant (wave) components. This approach has a potential for hurricane climate study by examining the statistical relationship between tropical waves and TC formation.
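The structure of the ensemble-level parallelism can be sketched as follows: each worker decomposes an independently noise-perturbed copy of the signal, and the mode estimates are ensemble-averaged. The `crude_sift` helper is a deliberately simple stand-in for a real EMD sifting routine, not the PEEMD algorithm itself.

```python
import numpy as np
from multiprocessing import Pool

def crude_sift(x, window=21):
    """Stand-in for EMD sifting: split a series into fast and slow parts."""
    kernel = np.ones(window) / window
    slow = np.convolve(x, kernel, mode="same")   # slow "residual"
    return x - slow, slow                        # (fast IMF-like part, residual)

def one_member(args):
    # each ensemble member decomposes an independently perturbed copy
    signal, noise_std, seed = args
    noisy = signal + np.random.default_rng(seed).normal(0, noise_std, len(signal))
    return crude_sift(noisy)[0]

if __name__ == "__main__":
    t = np.linspace(0, 20, 2000)
    signal = np.sin(2 * np.pi * t) + 0.5 * np.sin(0.2 * np.pi * t)
    jobs = [(signal, 0.2, s) for s in range(100)]   # 100 ensemble members
    with Pool() as pool:                             # coarse-grained parallelism
        members = pool.map(one_member, jobs)
    imf1 = np.mean(members, axis=0)                  # ensemble-mean first mode
    print(imf1[:5])
```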
Integrated approaches to climate-crop modelling: needs and challenges.
Betts, Richard A
2005-11-29
This paper discusses the need for a more integrated approach to modelling changes in climate and crops, and some of the challenges posed by this. While changes in atmospheric composition are expected to exert an increasing radiative forcing of climate change leading to further warming of global mean temperatures and shifts in precipitation patterns, these are not the only climatic processes which may influence crop production. Changes in the physical characteristics of the land cover may also affect climate; these may arise directly from land use activities and may also result from the large-scale responses of crops to seasonal, interannual and decadal changes in the atmospheric state. Climate models used to drive crop models may, therefore, need to consider changes in the land surface, either as imposed boundary conditions or as feedbacks from an interactive climate-vegetation model. Crops may also respond directly to changes in atmospheric composition, such as the concentrations of carbon dioxide (CO2), ozone (O3) and compounds of sulphur and nitrogen, so crop models should consider these processes as well as climate change. Changes in these, and the responses of the crops, may be intimately linked with meteorological processes so crop and climate models should consider synergies between climate and atmospheric chemistry. Some crop responses may occur at scales too small to significantly influence meteorology, so may not need to be included as feedbacks within climate models. However, the volume of data required to drive the appropriate crop models may be very large, especially if short-time-scale variability is important. Implementation of crop models within climate models would minimize the need to transfer large quantities of data between separate modelling systems. It should also be noted that crop responses to climate change may interact with other impacts of climate change, such as hydrological changes. For example, the availability of water for irrigation may be affected by changes in runoff as a direct consequence of climate change, and may also be affected by climate-related changes in demand for water for other uses. It is, therefore, necessary to consider the interactions between the responses of several impacts sectors to climate change. Overall, there is a strong case for a much closer coupling between models of climate, crops and hydrology, but this in itself poses challenges arising from issues of scale and errors in the models. A strategy is proposed whereby the pursuit of a fully coupled climate-chemistry-crop-hydrology model is paralleled by continued use of separate climate and land surface models but with a focus on consistency between the models.
Empirical and modeled synoptic cloud climatology of the Arctic Ocean
NASA Technical Reports Server (NTRS)
Barry, R. G.; Newell, J. P.; Schweiger, A.; Crane, R. G.
1986-01-01
A set of cloud cover data was developed for the Arctic during the climatically important spring/early summer transition months. In parallel with the determination of mean monthly cloud conditions, data for different synoptic pressure patterns were also composited as a means of evaluating the role of synoptic variability in Arctic cloud regimes. In order to carry out this analysis, a synoptic classification scheme was developed for the Arctic using an objective typing procedure. A second major objective was to analyze model output of pressure fields and cloud parameters from a control run of the Goddard Institute for Space Studies climate model for the same area and to intercompare the synoptic climatology of the model with that based on the observational data.
The importance of land cover change across urban-rural typologies for climate modeling.
Vargo, Jason; Habeeb, Dana; Stone, Brian
2013-01-15
Land cover changes affect local surface energy balances by changing the amount of solar energy reflected, the magnitude and duration over which absorbed energy is released as heat, and the amount of energy that is diverted to non-heating fluxes through evaporation. However, such local influences often are only crudely included in climate modeling exercises, if at all. A better understanding of local land conversion dynamics can serve to inform inputs for climate models and increase the role for land use planning in climate management policy. Here we present a new approach for projecting and incorporating metropolitan land cover change into mesoscale climate and other environmental assessment models. Our results demonstrate the relative contributions of different land development patterns to land cover change and conversion and suggest that regional growth management strategies serving to increase settlement densities over time can have a significant influence on the rate of deforestation per unit of population growth. Employing the approach presented herein, the impacts of land conversion on climate change and on parallel environmental systems and services, such as ground water recharge, habitat provision, and food production, may all be investigated more closely and managed through land use planning.
The Potsdam Parallel Ice Sheet Model (PISM-PIK) - Part 1: Model description
NASA Astrophysics Data System (ADS)
Winkelmann, R.; Martin, M. A.; Haseloff, M.; Albrecht, T.; Bueler, E.; Khroulev, C.; Levermann, A.
2011-09-01
We present the Potsdam Parallel Ice Sheet Model (PISM-PIK), developed at the Potsdam Institute for Climate Impact Research to be used for simulations of large-scale ice sheet-shelf systems. It is derived from the Parallel Ice Sheet Model (Bueler and Brown, 2009). Velocities are calculated by superposition of two shallow stress balance approximations within the entire ice covered region: the shallow ice approximation (SIA) is dominant in grounded regions and accounts for shear deformation parallel to the geoid. The plug-flow type shallow shelf approximation (SSA) dominates the velocity field in ice shelf regions and serves as a basal sliding velocity in grounded regions. Ice streams can be identified diagnostically as regions with a significant contribution of membrane stresses to the local momentum balance. All lateral boundaries in PISM-PIK are free to evolve, including the grounding line and ice fronts. Ice shelf margins in particular are modeled using Neumann boundary conditions for the SSA equations, reflecting a hydrostatic stress imbalance along the vertical calving face. The ice front position is modeled using a subgrid-scale representation of calving front motion (Albrecht et al., 2011) and a physically-motivated calving law based on horizontal spreading rates. The model is tested in experiments from the Marine Ice Sheet Model Intercomparison Project (MISMIP). A dynamic equilibrium simulation of Antarctica under present-day conditions is presented in Martin et al. (2011).
NASA Astrophysics Data System (ADS)
Fiore, Sandro; Williams, Dean; Aloisio, Giovanni
2016-04-01
In many scientific domains such as climate, data is often n-dimensional and requires tools that support specialized data types and primitives to be properly stored, accessed, analysed and visualized. Moreover, new challenges arise in large-scale scenarios and eco-systems where petabytes (PB) of data can be available and data can be distributed and/or replicated (e.g., the Earth System Grid Federation (ESGF) serving the Coupled Model Intercomparison Project, Phase 5 (CMIP5) experiment, providing access to 2.5 PB of data for the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5)). Most of the tools currently available for scientific data analysis in the climate domain fail at large scale since they: (1) are desktop based and need the data locally; (2) are sequential, so do not benefit from available multicore/parallel machines; (3) do not provide declarative languages to express scientific data analysis tasks; (4) are domain-specific, which ties their adoption to a specific domain; and (5) do not provide workflow support, to enable the definition of complex "experiments". The Ophidia project aims at facing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes ("datacubes"). The project relies on a strong background of high performance database management and OLAP systems to manage large scientific data sets. It also provides native workflow management support, to define processing chains and workflows with tens to hundreds of data analytics operators to build real scientific use cases. With regard to interoperability aspects, the talk will present the contribution provided both to the RDA Working Group on Array Databases and to the Earth System Grid Federation (ESGF) Compute Working Team. Also highlighted will be the results of large scale climate model intercomparison data analysis experiments, for example: (1) defined in the context of the EU H2020 INDIGO-DataCloud project; (2) implemented in a real geographically distributed environment involving CMCC (Italy) and LLNL (US) sites; (3) exploiting Ophidia as a server-side, parallel analytics engine; and (4) applied on real CMIP5 data sets available through ESGF.
Integrated Task and Data Parallel Programming
NASA Technical Reports Server (NTRS)
Grimshaw, A. S.
1998-01-01
This research investigates the combination of task and data parallel language constructs within a single programming language. There are a number of applications that exhibit properties which would be well served by such an integrated language. Examples include global climate models, aircraft design problems, and multidisciplinary design optimization problems. Our approach incorporates data parallel language constructs into an existing, object oriented, task parallel language. The language will support creation and manipulation of parallel classes and objects of both types (task parallel and data parallel). Ultimately, the language will allow data parallel and task parallel classes to be used either as building blocks or managers of parallel objects of either type, thus allowing the development of single and multi-paradigm parallel applications. 1995 Research Accomplishments: In February I presented a paper at Frontiers '95 describing the design of the data parallel language subset. During the spring I wrote and defended my dissertation proposal. Since that time I have developed a runtime model for the language subset. I have begun implementing the model and hand-coding simple examples which demonstrate the language subset. I have identified an astrophysical fluid flow application which will validate the data parallel language subset. 1996 Research Agenda: Milestones for the coming year include implementing a significant portion of the data parallel language subset over the Legion system. Using simple hand-coded methods, I plan to demonstrate (1) concurrent task and data parallel objects and (2) task parallel objects managing both task and data parallel objects. My next steps will focus on constructing a compiler and implementing the fluid flow application with the language. Concurrently, I will conduct a search for a real-world application exhibiting both task and data parallelism within the same program. Additional 1995 Activities: During the fall I collaborated with Andrew Grimshaw and Adam Ferrari to write a book chapter which will be included in Parallel Processing in C++ edited by Gregory Wilson. I also finished two courses, Compilers and Advanced Compilers, in 1995. These courses complete my class requirements at the University of Virginia. I have only my dissertation research and defense to complete.
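A loose conceptual analogue in Python of combining the two forms of parallelism (this is an illustration, not Legion or the proposed language): independent tasks run concurrently, and each task applies one vectorized operation across a whole array.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def task(seed):
    # data parallelism: one vectorized operation over a whole array
    grid = np.random.default_rng(seed).random((512, 512))
    return np.mean(np.gradient(grid)[0] ** 2)

if __name__ == "__main__":
    # task parallelism: independent tasks run concurrently
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(task, range(8)))
    print(results)
```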
Integrated Task And Data Parallel Programming: Language Design
NASA Technical Reports Server (NTRS)
Grimshaw, Andrew S.; West, Emily A.
1998-01-01
This research investigates the combination of task and data parallel language constructs within a single programming language. There are a number of applications that exhibit properties which would be well served by such an integrated language. Examples include global climate models, aircraft design problems, and multidisciplinary design optimization problems. Our approach incorporates data parallel language constructs into an existing, object oriented, task parallel language. The language will support creation and manipulation of parallel classes and objects of both types (task parallel and data parallel). Ultimately, the language will allow data parallel and task parallel classes to be used either as building blocks or managers of parallel objects of either type, thus allowing the development of single and multi-paradigm parallel applications. 1995 Research Accomplishments: In February I presented a paper at Frontiers '95 describing the design of the data parallel language subset. During the spring I wrote and defended my dissertation proposal. Since that time I have developed a runtime model for the language subset. I have begun implementing the model and hand-coding simple examples which demonstrate the language subset. I have identified an astrophysical fluid flow application which will validate the data parallel language subset. 1996 Research Agenda: Milestones for the coming year include implementing a significant portion of the data parallel language subset over the Legion system. Using simple hand-coded methods, I plan to demonstrate (1) concurrent task and data parallel objects and (2) task parallel objects managing both task and data parallel objects. My next steps will focus on constructing a compiler and implementing the fluid flow application with the language. Concurrently, I will conduct a search for a real-world application exhibiting both task and data parallelism within the same program. Additional 1995 Activities: During the fall I collaborated with Andrew Grimshaw and Adam Ferrari to write a book chapter which will be included in Parallel Processing in C++ edited by Gregory Wilson. I also finished two courses, Compilers and Advanced Compilers, in 1995. These courses complete my class requirements at the University of Virginia. I have only my dissertation research and defense to complete.
Integrated approaches to climate–crop modelling: needs and challenges
A. Betts, Richard
2005-01-01
This paper discusses the need for a more integrated approach to modelling changes in climate and crops, and some of the challenges posed by this. While changes in atmospheric composition are expected to exert an increasing radiative forcing of climate change leading to further warming of global mean temperatures and shifts in precipitation patterns, these are not the only climatic processes which may influence crop production. Changes in the physical characteristics of the land cover may also affect climate; these may arise directly from land use activities and may also result from the large-scale responses of crops to seasonal, interannual and decadal changes in the atmospheric state. Climate models used to drive crop models may, therefore, need to consider changes in the land surface, either as imposed boundary conditions or as feedbacks from an interactive climate–vegetation model. Crops may also respond directly to changes in atmospheric composition, such as the concentrations of carbon dioxide (CO2), ozone (O3) and compounds of sulphur and nitrogen, so crop models should consider these processes as well as climate change. Changes in these, and the responses of the crops, may be intimately linked with meteorological processes so crop and climate models should consider synergies between climate and atmospheric chemistry. Some crop responses may occur at scales too small to significantly influence meteorology, so may not need to be included as feedbacks within climate models. However, the volume of data required to drive the appropriate crop models may be very large, especially if short-time-scale variability is important. Implementation of crop models within climate models would minimize the need to transfer large quantities of data between separate modelling systems. It should also be noted that crop responses to climate change may interact with other impacts of climate change, such as hydrological changes. For example, the availability of water for irrigation may be affected by changes in runoff as a direct consequence of climate change, and may also be affected by climate-related changes in demand for water for other uses. It is, therefore, necessary to consider the interactions between the responses of several impacts sectors to climate change. Overall, there is a strong case for a much closer coupling between models of climate, crops and hydrology, but this in itself poses challenges arising from issues of scale and errors in the models. A strategy is proposed whereby the pursuit of a fully coupled climate–chemistry–crop–hydrology model is paralleled by continued use of separate climate and land surface models but with a focus on consistency between the models. PMID:16433093
Failure analysis of parameter-induced simulation crashes in climate models
NASA Astrophysics Data System (ADS)
Lucas, D. D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y.
2013-08-01
Simulations using IPCC (Intergovernmental Panel on Climate Change)-class climate models may fail or crash for a variety of reasons. Quantitative analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation crashes within the Parallel Ocean Program (POP2) component of the Community Climate System Model (CCSM4). About 8.5% of our CCSM4 simulations failed for numerical reasons at combinations of POP2 parameter values. We applied support vector machine (SVM) classification from machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. A committee of SVM classifiers readily predicted model failures in an independent validation ensemble, as assessed by the area under the receiver operating characteristic (ROC) curve metric (AUC > 0.96). The causes of the simulation failures were determined through a global sensitivity analysis. Combinations of 8 parameters related to ocean mixing and viscosity from three different POP2 parameterizations were the major sources of the failures. This information can be used to improve POP2 and CCSM4 by incorporating correlations across the relevant parameters. Our method can also be used to quantify, predict, and understand simulation crashes in other complex geoscientific models.
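The failure-prediction step has roughly this shape: train an SVM on parameter vectors labeled by run success or failure, then score it with ROC AUC on held-out runs. The data and failure rule below are synthetic; the study used 18 POP2 parameters and a committee of classifiers.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.uniform(0, 1, size=(500, 18))                  # 18 "parameters"
# synthetic failure rule: crashes cluster in one corner of parameter space
y = ((X[:, 0] > 0.8) & (X[:, 1] < 0.2)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", probability=True).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"validation AUC: {auc:.2f}")
```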
Software Testing and Verification in Climate Model Development
NASA Technical Reports Server (NTRS)
Clune, Thomas L.; Rood, RIchard B.
2011-01-01
Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
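An example of the fine-grained testing style being advocated: a unit test for a single numerical kernel, checked against an analytic solution with an explicit tolerance, rather than a full-model regression run. The kernel and tolerance are illustrative.

```python
import numpy as np

def centered_diff(f, dx):
    """Second-order centered first derivative of a periodic 1-D field."""
    return (np.roll(f, -1) - np.roll(f, 1)) / (2.0 * dx)

def test_centered_diff_on_sine():
    # d/dx sin(x) = cos(x); tolerance reflects the O(dx^2) truncation error
    x = np.linspace(0, 2 * np.pi, 200, endpoint=False)
    dx = x[1] - x[0]
    got = centered_diff(np.sin(x), dx)
    np.testing.assert_allclose(got, np.cos(x), atol=1e-3)

test_centered_diff_on_sine()   # pytest would discover this automatically
print("kernel test passed")
```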
Agents, Bayes, and Climatic Risks - a modular modelling approach
NASA Astrophysics Data System (ADS)
Haas, A.; Jaeger, C.
2005-08-01
When insurance firms, energy companies, governments, NGOs, and other agents strive to manage climatic risks, it is by no means clear what the aggregate outcome should and will be. As a framework for investigating this subject, we present the LAGOM model family. It is based on modules depicting learning social agents. For managing climate risks, our agents use second-order probabilities and update them by means of a Bayesian mechanism while differing in priors and risk aversion. The interactions between these modules and the aggregate outcomes of their actions are implemented using further modules. The software system is implemented as a series of parallel processes using the CIAMn approach. It is possible to couple modules irrespective of the language they are written in, the operating system under which they are run, and the physical location of the machine.
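A minimal sketch of the updating mechanism described above, assuming a conjugate Beta representation of each agent's second-order belief about an event probability; agents differ only in their priors, and risk aversion and the module interactions are omitted.

```python
import numpy as np

# Each agent holds a Beta(a, b) belief over the yearly probability of a
# damaging climate event and updates it on the same observations.

rng = np.random.default_rng(4)
events = rng.random(50) < 0.1          # 50 years, true event probability 0.1

agents = {"optimist": [1.0, 19.0],     # prior mean 0.05
          "pessimist": [6.0, 14.0]}    # prior mean 0.30

for hit in events:
    for name, (a, b) in agents.items():
        agents[name] = [a + hit, b + (not hit)]   # conjugate Beta update

for name, (a, b) in agents.items():
    print(f"{name}: posterior mean event probability = {a / (a + b):.3f}")
```

Both posteriors converge toward the observed frequency; the differing priors matter most early on, which is the point of modeling beliefs at second order.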
Benchmarking NWP Kernels on Multi- and Many-core Processors
NASA Astrophysics Data System (ADS)
Michalakes, J.; Vachharajani, M.
2008-12-01
Increased computing power for weather, climate, and atmospheric science has provided direct benefits for defense, agriculture, the economy, the environment, and public welfare and convenience. Today, very large clusters with many thousands of processors are allowing scientists to move forward with simulations of unprecedented size. But time-critical applications such as real-time forecasting or climate prediction need strong scaling: faster nodes and processors, not more of them. Moreover, the need for good cost-performance has never been greater, both in terms of performance per watt and per dollar. For these reasons, the new generations of multi- and many-core processors being mass produced for commercial IT and "graphical computing" (video games) are being scrutinized for their ability to exploit the abundant fine-grain parallelism in atmospheric models. We present results of our work to date identifying key computational kernels within the dynamics and physics of a large community NWP model, the Weather Research and Forecast (WRF) model. We benchmark and optimize these kernels on several different multi- and many-core processors. The goals are to (1) characterize and model performance of the kernels in terms of computational intensity, data parallelism, memory bandwidth pressure, memory footprint, etc., (2) enumerate and classify effective strategies for coding and optimizing for these new processors, (3) assess difficulties and opportunities for tool or higher-level language support, and (4) establish a continuing set of kernel benchmarks that can be used to measure and compare effectiveness of current and future designs of multi- and many-core processors for weather and climate applications.
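A minimal version of the benchmarking methodology: time one stencil kernel and report throughput together with a rough arithmetic intensity (flops per byte moved), the quantity that indicates whether a kernel is compute- or bandwidth-bound on a given processor. The kernel and byte-count model are illustrative.

```python
import time
import numpy as np

n, iters = 2048, 50
a = np.random.random((n, n))

t0 = time.perf_counter()
for _ in range(iters):
    # 5-point stencil: 4 adds + 1 multiply per interior point
    b = 0.25 * (a[:-2, 1:-1] + a[2:, 1:-1] + a[1:-1, :-2] + a[1:-1, 2:])
dt = time.perf_counter() - t0

flops = iters * 5 * (n - 2) ** 2
bytes_moved = iters * 8 * 5 * (n - 2) ** 2        # crude: 4 loads + 1 store
print(f"{flops / dt / 1e9:.2f} GFLOP/s, "
      f"intensity ~{flops / bytes_moved:.2f} flops/byte")
```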
NASA Astrophysics Data System (ADS)
Sorokin, V. A.; Volkov, Yu V.; Sherstneva, A. I.; Botygin, I. A.
2016-11-01
This paper presents a method of generating climate regions based on analytic signal theory. When applied to atmospheric surface-layer temperature data sets, the method forms climatic structures from the corresponding temperature changes, supporting conclusions about the uniformity of climate in an area and allowing climate changes to be traced in time by analyzing type-group shifts. The algorithm is based on the fact that the frequency spectrum of the thermal oscillation process is narrow-banded and has only one mode for most weather stations. This allows using analytic signal theory and causality conditions and introducing an oscillation phase. The annual component of the phase, being a linear function, was removed by the least squares method. The remaining phase fluctuations allow consistent study of their coordinated behavior and timing, using the Pearson correlation coefficient to evaluate dependence. This study includes program experiments to evaluate the calculation efficiency of the phase grouping task. The paper also reviews single-threaded and multi-threaded computing models. It is shown that the phase grouping algorithm for meteorological data can be parallelized and that a multi-threaded implementation leads to a 25-30% increase in performance.
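The core signal-processing steps are easy to sketch with scipy: form the analytic signal with a Hilbert transform, unwrap its phase, remove the linear annual component by least squares, and correlate the residual phase fluctuations between stations. The station series below are synthetic.

```python
import numpy as np
from scipy.signal import hilbert

def phase_residual(series):
    # analytic signal -> unwrapped phase -> remove linear annual component
    phase = np.unwrap(np.angle(hilbert(series)))
    t = np.arange(len(series))
    slope, intercept = np.polyfit(t, phase, 1)
    return phase - (slope * t + intercept)

rng = np.random.default_rng(5)
days = np.arange(10 * 365)
common = np.cumsum(rng.normal(0, 0.02, len(days)))  # shared slow phase wander

def station(noise):
    return np.sin(2 * np.pi * days / 365 + common) + rng.normal(0, noise, len(days))

r = np.corrcoef(phase_residual(station(0.1)),
                phase_residual(station(0.1)))[0, 1]
print(f"phase-fluctuation correlation between stations: {r:.2f}")
```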
ClimateSpark: An in-memory distributed computing framework for big climate data analytics
NASA Astrophysics Data System (ADS)
Hu, Fei; Yang, Chaowei; Schnase, John L.; Duffy, Daniel Q.; Xu, Mengchao; Bowen, Michael K.; Lee, Tsengdar; Song, Weiwei
2018-06-01
The unprecedented growth of climate data creates new opportunities for climate studies, and yet big climate data pose a grand challenge to climatologists to efficiently manage and analyze big data. The complexity of climate data content and analytical algorithms increases the difficulty of implementing algorithms on high performance computing systems. This paper proposes an in-memory, distributed computing framework, ClimateSpark, to facilitate complex big data analytics and time-consuming computational tasks. A chunked data structure improves parallel I/O efficiency, while a spatiotemporal index is built for the chunks to avoid unnecessary data reading and preprocessing. An integrated, multi-dimensional, array-based data model (ClimateRDD) and ETL operations are developed to address big climate data variety by integrating the processing components of the climate data lifecycle. ClimateSpark utilizes Spark SQL and Apache Zeppelin to develop a web portal to facilitate the interaction among climatologists, climate data, analytic operations and computing resources (e.g., using SQL query and Scala/Python notebook). Experimental results show that ClimateSpark conducts different spatiotemporal data queries/analytics with high efficiency and data locality. ClimateSpark is easily adaptable to other big multiple-dimensional, array-based datasets in various geoscience domains.
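The chunk-index idea can be sketched independently of Spark: keep per-chunk bounding metadata and discard chunks whose bounds miss the query before reading any data. The record layout below is illustrative, not ClimateSpark's actual index format.

```python
# Per-chunk bounding metadata lets a query skip chunks before any I/O.
chunks = [
    {"id": 0, "lat": (0, 30),  "lon": (0, 60),   "years": (1980, 1999)},
    {"id": 1, "lat": (30, 60), "lon": (0, 60),   "years": (1980, 1999)},
    {"id": 2, "lat": (0, 30),  "lon": (60, 120), "years": (2000, 2019)},
]

def overlaps(bounds, qlo, qhi):
    return bounds[0] <= qhi and qlo <= bounds[1]

query = {"lat": (10, 25), "lon": (70, 100), "years": (2005, 2010)}
hits = [c["id"] for c in chunks
        if overlaps(c["lat"], *query["lat"])
        and overlaps(c["lon"], *query["lon"])
        and overlaps(c["years"], *query["years"])]
print("chunks to read:", hits)    # -> [2]; chunks 0 and 1 are never touched
```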
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, A.W.; Ghil, M.; Kravtsov, K.
2011-04-08
This project was a continuation of previous work under DOE CCPP funding in which we developed a twin approach of non-homogeneous hidden Markov models (NHMMs) and coupled ocean-atmosphere (O-A) intermediate-complexity models (ICMs) to identify the potentially predictable modes of climate variability, and to investigate their impacts on the regional scale. We have developed a family of latent-variable NHMMs to simulate historical records of daily rainfall, and used them to downscale seasonal predictions. We have also developed empirical mode reduction (EMR) models for gaining insight into the underlying dynamics in observational data and general circulation model (GCM) simulations. Using coupled O-A ICMs, we have identified a new mechanism of interdecadal climate variability, involving the midlatitude ocean's mesoscale eddy field and nonlinear, persistent atmospheric response to the oceanic anomalies. A related decadal mode is also identified, associated with the ocean's thermohaline circulation. The goal of the continuation was to build on these ICM results and NHMM/EMR model developments and software to strengthen two key pillars of support for the development and application of climate models for climate change projections on time scales of decades to centuries, namely: (a) dynamical and theoretical understanding of decadal-to-interdecadal oscillations and their predictability; and (b) an interface from climate models to applications, in order to inform societal adaptation strategies to climate change at the regional scale, including model calibration, correction, downscaling and, most importantly, assessment and interpretation of spread and uncertainties in multi-model ensembles. Our main results from the grant consist of extensive further development of the hidden Markov models for rainfall simulation and downscaling, specifically within the non-stationary climate change context, together with the development of parallelized software; application of NHMMs to downscaling of rainfall projections over India; identification and analysis of decadal climate signals in data and models; and studies of climate variability in terms of the dynamics of atmospheric flow regimes. Each of these project components is elaborated on below, followed by a list of publications resulting from the grant.
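As a much-simplified stand-in for the rainfall modelling described above, the sketch below fits a two-state (dry/wet) first-order Markov chain to a daily occurrence series and simulates from it; the project's NHMMs add hidden states and covariate-dependent (non-homogeneous) transitions.

```python
import numpy as np

rng = np.random.default_rng(6)
obs = (rng.random(3650) < 0.3).astype(int)          # synthetic wet/dry record

# fit transition probabilities P(wet tomorrow | state today)
p_wet_after = [np.mean(obs[1:][obs[:-1] == s]) for s in (0, 1)]

sim = [int(obs[0])]
for _ in range(364):                                 # simulate one year
    sim.append(int(rng.random() < p_wet_after[sim[-1]]))
print(f"observed wet fraction {obs.mean():.2f}, simulated {np.mean(sim):.2f}")
```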
Can reducing black carbon emissions counteract global warming?
Bond, Tami C; Sun, Haolin
2005-08-15
Field measurements and model results have recently shown that aerosols may have important climatic impacts. One line of inquiry has investigated whether reducing emissions of climate-warming soot, or black carbon aerosol, can form a viable component of mitigating global warming. We review and acknowledge scientific arguments against considering aerosols and greenhouse gases in a common framework, including the differences in the physical mechanisms of climate change and the relevant time scales. We argue that such a joint consideration is consistent with the language of the United Nations Framework Convention on Climate Change. We synthesize results from published climate-modeling studies to obtain a global warming potential for black carbon relative to that of CO2 (680 on a 100-year basis). This calculation enables a discussion of cost-effectiveness for mitigating the largest sources of black carbon. We find that many emission reductions are either expensive or difficult to enact when compared with greenhouse gases, particularly in Annex I countries. Finally, we propose a role for black carbon in climate mitigation strategies that is consistent with the apparently conflicting arguments raised during our discussion. Addressing these emissions is a promising way to reduce climatic interference primarily for nations that have not yet agreed to address greenhouse gas emissions, and provides the potential for a parallel climate agreement.
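The quoted global warming potential supports quick CO2-equivalence arithmetic; a minimal sketch, where the emission tonnage is invented for illustration:

```python
# Back-of-envelope CO2-equivalence using the abstract's central estimate:
# GWP of black carbon ~ 680 relative to CO2 on a 100-year basis.
# The emission figure below is invented purely for illustration.
GWP_BC_100YR = 680.0

def co2_equivalent(mass_bc_tonnes: float) -> float:
    """CO2-equivalent (tonnes) of a black carbon emission."""
    return mass_bc_tonnes * GWP_BC_100YR

# Avoiding 1000 t of BC is, on this metric, comparable to avoiding
# 680,000 t of CO2.
print(f"{co2_equivalent(1000.0):,.0f} t CO2e")
```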
Progress in fast, accurate multi-scale climate simulations
Collins, W. D.; Johansen, H.; Evans, K. J.; ...
2015-06-01
We present a survey of physical and computational techniques that have the potential to contribute to the next generation of high-fidelity, multi-scale climate simulations. Examples of the climate science problems that can be investigated with more depth with these computational improvements include the capture of remote forcings of localized hydrological extreme events, an accurate representation of cloud features over a range of spatial and temporal scales, and parallel, large ensembles of simulations to more effectively explore model sensitivities and uncertainties. Numerical techniques, such as adaptive mesh refinement, implicit time integration, and separate treatment of fast physical time scales are enabling improved accuracy and fidelity in simulation of dynamics and allowing more complete representations of climate features at the global scale. At the same time, partnerships with computer science teams have focused on taking advantage of evolving computer architectures such as many-core processors and GPUs. As a result, approaches which were previously considered prohibitively costly have become both more efficient and scalable. In combination, progress in these three critical areas is poised to transform climate modeling in the coming decades.
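Of the numerical techniques listed, implicit time integration is the easiest to illustrate compactly. Below is a textbook backward-Euler sketch on a stiff linear decay problem; the equation, rate, and step size are illustrative and not from the paper.

```python
# Backward (implicit) Euler for the stiff test problem dy/dt = -k*y.
# Implicit methods remain stable at step sizes far beyond the explicit
# stability limit dt < 2/k, which is why dynamical cores use them for
# fast physical time scales.
k, dt, nsteps = 1.0e3, 0.1, 50            # stiff rate, deliberately large step
y_imp = 1.0
for _ in range(nsteps):
    # Solve y_new = y_old - dt*k*y_new  =>  y_new = y_old / (1 + dt*k)
    y_imp = y_imp / (1.0 + dt * k)

print(f"implicit Euler stays bounded: y = {y_imp:.3e}")
# An explicit Euler step y_new = y_old*(1 - dt*k) with dt*k = 100 would
# blow up, since |1 - dt*k| = 99 > 1.
```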
The Potsdam Parallel Ice Sheet Model (PISM-PIK) - Part 1: Model description
NASA Astrophysics Data System (ADS)
Winkelmann, R.; Martin, M. A.; Haseloff, M.; Albrecht, T.; Bueler, E.; Khroulev, C.; Levermann, A.
2010-08-01
We present the Potsdam Parallel Ice Sheet Model (PISM-PIK), developed at the Potsdam Institute for Climate Impact Research for simulations of large-scale ice sheet-shelf systems. It is derived from the Parallel Ice Sheet Model (Bueler and Brown, 2009). Velocities are calculated by superposition of two shallow stress balance approximations within the entire ice-covered region: the shallow ice approximation (SIA) is dominant in grounded regions and accounts for shear deformation parallel to the geoid, while the plug-flow-type shallow shelf approximation (SSA) dominates the velocity field in ice shelf regions and serves as a basal sliding velocity in grounded regions. Ice streams naturally emerge through this approach and can be identified diagnostically as regions with a significant contribution of membrane stresses to the local momentum balance. All lateral boundaries in PISM-PIK are free to evolve, including the grounding line and ice fronts. Ice shelf margins in particular are modeled using Neumann boundary conditions for the SSA equations, reflecting a hydrostatic stress imbalance along the vertical calving face. The ice front position is modeled using a subgrid-scale representation of calving front motion (Albrecht et al., 2010) and a physically motivated dynamic calving law based on horizontal spreading rates. The model is validated within the Marine Ice Sheet Model Intercomparison Project (MISMIP) and is used for a dynamic equilibrium simulation of Antarctica under present-day conditions in the second part of this paper (Martin et al., 2010).
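Schematically, the superposition described above can be written as the sum of the two shallow approximations; this is a simplified statement of the hybrid scheme, not the paper's full formulation.

```latex
% Schematic hybrid stress balance: total horizontal velocity as the
% superposition of the SIA (shear) and SSA (membrane/sliding) solutions.
\[
  \mathbf{u}(x,y,z) \;=\; \mathbf{u}_{\mathrm{SIA}}(x,y,z)
                    \;+\; \mathbf{u}_{\mathrm{SSA}}(x,y)
\]
% u_SIA dominates where the ice is grounded; u_SSA dominates in shelves
% and acts as the basal sliding velocity in grounded regions.
```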
Painter, Scott L.; Coon, Ethan T.; Atchley, Adam L.; ...
2016-08-11
The need to understand potential climate impacts and feedbacks in Arctic regions has prompted recent interest in modeling of permafrost dynamics in a warming climate. A new fine-scale integrated surface/subsurface thermal hydrology modeling capability is described and demonstrated in proof-of-concept simulations. The new modeling capability combines a surface energy balance model with recently developed three-dimensional subsurface thermal hydrology models and new models for nonisothermal surface water flows and snow distribution in the microtopography. Surface water flows are modeled using the diffusion wave equation extended to include energy transport and phase change of ponded water. Variation of snow depth in the microtopography, physically the result of wind scour, is also modeled heuristically with a diffusion wave equation. The multiple surface and subsurface processes are implemented by leveraging highly parallel community software. Fully integrated thermal hydrology simulations on the tilted open book catchment, an important test case for integrated surface/subsurface flow modeling, are presented. Fine-scale 100-year projections of the integrated permafrost thermal hydrological system on an ice wedge polygon at Barrow, Alaska, in a warming climate are also presented. Finally, these simulations demonstrate the feasibility of microtopography-resolving, process-rich simulations as a tool to help understand possible future evolution of the carbon-rich Arctic tundra in a warming climate.
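The diffusion-wave approximation used for the surface flows can be sketched in one dimension. The toy update below uses a Manning-type flux with invented values, and omits the energy transport and phase change that the actual model includes.

```python
# Explicit 1D diffusion-wave step for ponded-water depth h over bed z.
# Flux follows Manning's relation q = -(h^(5/3)/(n*sqrt(|S|))) * S with
# water-surface slope S = d(h+z)/dx. Values are illustrative only.
import numpy as np

n_mann, dx, dt = 0.03, 1.0, 0.05
z = np.linspace(1.0, 0.0, 21)            # tilted bed (drops 1 m over 20 m)
h = np.full_like(z, 0.1)                 # 10 cm of ponded water

for _ in range(200):                     # 200 steps of 0.05 s = 10 s
    eta = h + z                          # water-surface elevation
    S = np.diff(eta) / dx                # slope at cell faces
    h_face = 0.5 * (h[:-1] + h[1:])      # face-averaged depth
    q = -(h_face ** (5.0 / 3.0) / (n_mann * np.sqrt(np.abs(S) + 1e-8))) * S
    h[1:-1] -= dt * np.diff(q) / dx      # conservation: dh/dt = -dq/dx
    h = np.maximum(h, 0.0)               # depths stay non-negative

print(f"mean interior depth after 10 s: {h[1:-1].mean():.3f} m")
```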
High resolution global climate modelling; the UPSCALE project, a large simulation campaign
NASA Astrophysics Data System (ADS)
Mizielinski, M. S.; Roberts, M. J.; Vidale, P. L.; Schiemann, R.; Demory, M.-E.; Strachan, J.; Edwards, T.; Stephens, A.; Lawrence, B. N.; Pritchard, M.; Chiu, P.; Iwi, A.; Churchill, J.; del Cano Novales, C.; Kettleborough, J.; Roseblade, W.; Selwood, P.; Foster, M.; Glover, M.; Malcolm, A.
2014-01-01
The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project constructed and ran an ensemble of HadGEM3 (Hadley Centre Global Environment Model 3) atmosphere-only global climate simulations over the period 1985-2011, at resolutions of N512 (25 km), N216 (60 km) and N96 (130 km), as used in current global weather forecasting, seasonal prediction and climate modelling respectively. Alongside these present-climate simulations, a parallel ensemble looking at extremes of future climate was run, using a time-slice methodology to consider conditions at the end of this century. These simulations were primarily performed using a 144 million core hour, single-year grant of computing time from PRACE (the Partnership for Advanced Computing in Europe) in 2012, with additional resources supplied by the Natural Environment Research Council (NERC) and the Met Office. Almost 400 terabytes of simulation data were generated on the HERMIT supercomputer at the High Performance Computing Center Stuttgart (HLRS), and transferred to the JASMIN super-data cluster provided by the Science and Technology Facilities Council Centre for Environmental Data Archival (STFC CEDA) for analysis and storage. In this paper we describe the implementation of the project, present the technical challenges in terms of optimisation, data output, transfer and storage that such a project involves, and include details of the model configuration and the composition of the UPSCALE dataset. This dataset is available for scientific analysis to allow assessment of the value of model resolution in both present and potential future climate conditions.
NASA Technical Reports Server (NTRS)
Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn; Zukor, Dorothy (Technical Monitor)
2002-01-01
One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, in both the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document surveys numerous software frameworks for potential use in Earth science modeling. Several frameworks are evaluated in depth, including Parallel Object-Oriented Methods and Applications (POOMA), Cactus (from the relativistic physics community), Overture, the Goddard Earth Modeling System (GEMS), the National Center for Atmospheric Research Flux Coupler, and the UCLA/UCB Distributed Data Broker (DDB). Frameworks evaluated in less detail include ROOT, Parallel Application Workspace (PAWS), and the Advanced Large-Scale Integrated Computational Environment (ALICE). A host of other frameworks and related tools are referenced in this context. The frameworks are evaluated individually and also compared with each other.
CPMIP: measurements of real computational performance of Earth system models in CMIP6
NASA Astrophysics Data System (ADS)
Balaji, Venkatramani; Maisonnave, Eric; Zadeh, Niki; Lawrence, Bryan N.; Biercamp, Joachim; Fladrich, Uwe; Aloisio, Giovanni; Benson, Rusty; Caubel, Arnaud; Durachta, Jeffrey; Foujols, Marie-Alice; Lister, Grenville; Mocavero, Silvia; Underwood, Seth; Wright, Garrett
2017-01-01
A climate model represents a multitude of processes on a variety of timescales and space scales: a canonical example of multi-physics, multi-scale modeling. The underlying climate system is physically characterized by sensitive dependence on initial conditions and natural stochastic variability, so very long integrations are needed to extract signals of climate change. Algorithms generally possess weak scaling and can be I/O- and/or memory-bound; such weak-scaling, I/O-, and memory-bound multi-physics codes present particular challenges to computational performance. Traditional metrics of computational efficiency, such as performance counters and scaling curves, do not tell us enough about real sustained performance from climate models on different machines, nor do they provide a satisfactory basis for comparative information across models. We introduce a set of metrics that can be used for the study of computational performance of climate (and Earth system) models. These measures do not require specialized software or specific hardware counters, and should be accessible to anyone. They are independent of platform and underlying parallel programming models. We show how these metrics can be used to measure actually attained performance of Earth system models on different machines, and to identify the most fruitful areas of research and development for performance engineering. We present results for these measures for a diverse suite of models from several modeling centers, and propose to use them as the basis for a CPMIP, a computational performance model intercomparison project (MIP).
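The published CPMIP metric set includes simple throughput and cost measures computable from job accounting alone; assuming the standard definitions of simulated years per day (SYPD) and core-hours per simulated year (CHSY), a minimal worked example with invented job numbers:

```python
# Two CPMIP-style throughput metrics from job accounting alone:
# SYPD (simulated years per wall-clock day) and CHSY (core-hours per
# simulated year). The run below is invented.
def sypd(simulated_years: float, wallclock_hours: float) -> float:
    return simulated_years / (wallclock_hours / 24.0)

def chsy(cores: int, wallclock_hours: float, simulated_years: float) -> float:
    return cores * wallclock_hours / simulated_years

# Example: 10 simulated years in 48 h of wall-clock time on 4096 cores.
print(f"SYPD = {sypd(10, 48):.1f}")          # 5.0 simulated years/day
print(f"CHSY = {chsy(4096, 48, 10):,.0f}")   # ~19,661 core-hours/year
```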
Large-Scale, Parallel, Multi-Sensor Data Fusion in the Cloud
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Manipon, G.; Hua, H.
2012-12-01
NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the "A-Train" platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over periods of years to decades. However, moving from predominantly single-instrument studies to a multi-sensor, measurement-based model for long-duration analysis of important climate variables presents serious challenges for large-scale data mining and data fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another instrument (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the "cloud scenes" from CloudSat, and repeat the entire analysis over years of AIRS data. To perform such an analysis, one must discover and access multiple datasets from remote sites, find the space/time "matchups" between instrument swaths and model grids, understand the quality flags and uncertainties for retrieved physical variables, assemble merged datasets, and compute fused products for further scientific and statistical analysis. To efficiently assemble such decade-scale datasets in a timely manner, we are utilizing Elastic Computing in the Cloud and parallel map/reduce-based algorithms. "SciReduce" is a Hadoop-like parallel analysis system, programmed in parallel Python, that is designed from the ground up for Earth science. SciReduce executes inside VMWare images and scales to any number of nodes in the Cloud. Unlike Hadoop, in which simple tuples (keys and values) are passed between the map and reduce functions, SciReduce operates on bundles of named numeric arrays, which can be passed in memory or serialized to disk in netCDF4 or HDF5. Thus, SciReduce uses the native datatypes (geolocated grids, swaths, and points) that geo-scientists are familiar with. We are deploying within SciReduce a versatile set of Python operators for data lookup, access, subsetting, co-registration, mining, fusion, and statistical analysis. All operators take in sets of geo-located arrays and generate more arrays. Large, multi-year satellite and model datasets are automatically "sharded" by time and space across a cluster of nodes so that years of data (millions of granules) can be compared or fused in a massively parallel way. Input variables (arrays) are pulled on-demand into the Cloud using OPeNDAP or webification URLs, thereby minimizing the size of the stored input and intermediate datasets. A typical map function might assemble and quality control AIRS Level-2 water vapor profiles for a year of data in parallel, then a reduce function would average the profiles in lat/lon bins (again, in parallel), and a final reduce would aggregate the climatology and write it to output files. We are using SciReduce to automate the production of multiple versions of a multi-year water vapor climatology (AIRS & MODIS), stratified by CloudSat cloud classification, and compare it to models (ECMWF & MERRA reanalysis). We will present the architecture of SciReduce, describe the achieved "clock time" speedups in fusing huge datasets on our own nodes and in the Amazon Cloud, and discuss the Cloud cost tradeoffs for storage, compute, and data transfer.
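The map/reduce pattern sketched in the abstract (map assembles profiles, reduce averages them in lat/lon bins) can be illustrated in a few lines. This is a pure-Python stand-in with invented profiles, not SciReduce's API, which passes named numeric arrays between parallel workers.

```python
# Minimal map/reduce-style sketch of the binning step described above:
# map emits (lat/lon-bin, profile) pairs, reduce averages per bin.
from collections import defaultdict
import numpy as np

def map_profile(lat, lon, profile, bin_deg=5.0):
    """Emit a (bin_key, profile) pair for one retrieved profile."""
    key = (int(lat // bin_deg), int(lon // bin_deg))
    return key, np.asarray(profile, dtype=float)

def reduce_bins(pairs):
    """Average all profiles that fall in the same lat/lon bin."""
    sums, counts = defaultdict(lambda: 0.0), defaultdict(int)
    for key, prof in pairs:
        sums[key] = sums[key] + prof
        counts[key] += 1
    return {k: sums[k] / counts[k] for k in sums}

# Three toy water-vapor profiles (2 levels each); two share a 5-degree bin.
pairs = [map_profile(12.3, 44.0, [1.0, 2.0]),
         map_profile(13.9, 41.2, [3.0, 4.0]),
         map_profile(-8.0, 120.5, [5.0, 6.0])]
climatology = reduce_bins(pairs)
print(climatology[(2, 8)])   # mean of the two co-located profiles: [2. 3.]
```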
NASA Astrophysics Data System (ADS)
Heyn, K.; Campbell, E.
2016-12-01
The Portland Water Bureau has been studying the anticipated effects of climate change on its primary surface water source, the Bull Run Watershed, since the early 2000s. Early efforts by the bureau were almost exclusively reliant on outside expertise from climate modelers and researchers, particularly those at the Climate Impacts Group (CIG) at the University of Washington. Early work products from CIG formed the basis of the bureau's understanding of the most likely and consequential impacts to the watershed from continued GHG-caused warming. However, by mid-decade, as key supply and demand conditions for the bureau changed, it found it lacked the technical capacity and tools to conduct more refined and updated research to build on the outside analysis it had obtained. Beginning in 2010, through its participation in the Pilot Utility Modeling Applications (PUMA) project, the bureau identified and began working to address the holes in its technical and institutional capacity by embarking on a process to assess and select a hydrologic model while obtaining downscaled climate change data to utilize within it. Parallel to the development of these technical elements, the bureau made investments in qualified staff to lead the model selection, development and utilization, while working to establish productive, collegial and collaborative relationships with key climate research staff at the Oregon Climate Change Research Institute (OCCRI), the University of Washington and the University of Idaho. This presentation describes the learning process of a major metropolitan drinking water utility as its approach to addressing the complex problem of climate change evolves, matures, and begins to influence broader aspects of the organization's planning efforts.
Time variation of effective climate sensitivity in GCMs
NASA Astrophysics Data System (ADS)
Williams, K. D.; Ingram, W. J.; Gregory, J. M.
2009-04-01
Effective climate sensitivity is often assumed to be constant (if uncertain), but some previous studies of General Circulation Model (GCM) simulations have found it varying as the simulation progresses. This complicates the fitting of simple models to such simulations, as well as having implications for the estimation of climate sensitivity from observations. This study examines the evolution of the feedbacks determining the climate sensitivity in GCMs submitted to the Coupled Model Intercomparison Project. Apparent centennial-timescale variations of effective climate sensitivity during stabilisation to a forcing can be considered an artefact of using conventional forcings, which only allow for instantaneous effects and stratospheric adjustment. If the forcing is adjusted for processes occurring on timescales which are short compared to the climate stabilisation timescale, then there is little centennial-timescale evolution of effective climate sensitivity in any of the GCMs. We suggest that much of the apparent variation in effective climate sensitivity identified in previous studies is actually due to the comparatively fast forcing adjustment. Persistent differences are found in the strength of the feedbacks between the coupled atmosphere-ocean (AO) versions and their atmosphere-mixed-layer-ocean (AML) counterparts (the latter are often assumed to give the equilibrium climate sensitivity of the AOGCM). The AML model can typically only estimate the equilibrium climate sensitivity of the parallel AO version to within about 0.5 K. The adjustment to the forcing to account for comparatively fast processes varies in magnitude and sign between GCMs, as well as differing between AO and AML versions of the same model. There is evidence from one AOGCM that the forcing adjustment may take a couple of decades, with implications for observationally based estimates of equilibrium climate sensitivity. We suggest that at least some of the spread in 21st century global temperature predictions between GCMs is due to differing adjustment processes, hence work to understand these differences should be a priority.
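Effective climate sensitivity in studies like this one is commonly diagnosed from the linearized global energy budget N = F - lambda*dT. Below is a minimal sketch of that standard regression-based diagnosis; the forcing value and the data are synthetic, not taken from the paper.

```python
# Effective climate sensitivity from the linear global energy budget
# N = F - lambda*dT: regressing net TOA imbalance N on warming dT gives
# the feedback parameter lambda, and ECS_eff = F_2x / lambda. The paper's
# point is that lambda can appear to drift if F is not adjusted for fast
# processes. Data here are synthetic.
import numpy as np

F2X = 3.7                                  # W m-2, canonical 2xCO2 forcing
dT = np.array([0.5, 1.0, 1.5, 2.0, 2.5])   # K, warming through the run
N = np.array([3.1, 2.5, 1.8, 1.3, 0.7])    # W m-2, net TOA imbalance

slope, intercept = np.polyfit(dT, N, 1)    # N ~ intercept + slope*dT
lam = -slope                               # feedback parameter (W m-2 K-1)
print(f"lambda  = {lam:.2f} W m-2 K-1")
print(f"ECS_eff = {F2X / lam:.2f} K")
```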
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, Ping
Recent studies have revealed that among all the tropical oceans, the tropical Atlantic has experienced the most pronounced warming trend over the 20th century. Many extreme climate events affecting the U.S., such as hurricanes, severe precipitation and drought events, are influenced by conditions in the Gulf of Mexico and the Atlantic Ocean. It is therefore imperative to have accurate simulations of the climatic mean and variability in the Atlantic region to be able to make credible projections of future climate change affecting the U.S. and other countries adjoining the Atlantic Ocean. Unfortunately, almost all global climate models exhibit large biases in their simulations of tropical Atlantic climate. The atmospheric convection simulation errors in the Amazon region and the associated errors in the trade wind simulations are hypothesized to be a leading cause of the tropical Atlantic biases in climate models. As global climate models have resolutions that are too coarse to resolve some of the atmospheric and oceanic processes responsible for the model biases, we propose to use a high-resolution coupled regional climate model (CRCM) framework to address the tropical bias issue. We propose to combine the expertise in tropical coupled atmosphere-ocean modeling at Texas A&M University (TAMU) and the coupled land-atmosphere modeling expertise at Pacific Northwest National Laboratory (PNNL) to develop a comprehensive CRCM for the Atlantic sector within a general and flexible modeling framework. The atmospheric component of the CRCM will be the NCAR WRF model and the oceanic component will be the Rutgers/UCLA ROMS. For the land component, we will use CLM, modified at PNNL to include more detailed representations of vegetation and soil hydrology processes. The combined TAMU-PNNL CRCM will be used to simulate the Atlantic climate, and the associated land-atmosphere-ocean interactions, at a horizontal resolution of 9 km or finer. A particular focus of the model development effort will be to optimize the performance of WRF and ROMS over several thousand cores by focusing on both the parallel communication libraries and the I/O interfaces, in order to achieve the sustained throughput needed to perform simulations on such fine-resolution grids. The CRCM will be developed within the framework of the Coupler (CPL7) software that is part of the NCAR Community Earth System Model (CESM). Through efforts at PNNL and within the community, WRF and CLM have already been coupled via CPL7. Using the flux coupler approach for the whole CRCM will allow us to flexibly couple WRF, ROMS, and CLM, with each model running on its own grid at different resolutions. In addition, this framework will allow us to easily port parameterizations between CESM and the CRCM, and potentially allow partial coupling between the CESM and the CRCM. TAMU and PNNL will contribute cooperatively to this research endeavor. The TAMU team led by Chang and Saravanan has considerable experience in studying atmosphere-ocean interactions within the tropical Atlantic sector and will focus on modeling issues that relate to coupling WRF and ROMS. The PNNL team led by Leung has extensive expertise in atmosphere-land interaction and will be responsible for improving the land surface parameterization. Both teams will jointly work on integrating WRF-ROMS and WRF-CLM to couple WRF, ROMS, and CLM through CPL7. Montuoro of the TAMU Supercomputing Center will be responsible for improving the MPI and parallel I/O interfaces of the CRCM. Both teams will contribute to the design and execution of the proposed numerical experiments and jointly perform analysis of the results.
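The hub-and-spoke coupling described above, with each component on its own grid exchanging fluxes through CPL7, can be caricatured with a toy sequential coupler loop. The components below are stubs with invented physics, not WRF, ROMS, or CLM.

```python
# Toy flux-coupler loop: components see each other only through the
# exchanged flux dictionaries, mirroring the CPL7 hub-and-spoke design.
# All physics below is invented for illustration.
def atm_step(sst):                 # stand-in for the atmosphere (WRF)
    return {"wind_stress": 0.1 * sst, "heat_flux": 50.0 - sst}

def ocn_step(fluxes):              # stand-in for the ocean (ROMS)
    return 20.0 + 0.01 * fluxes["heat_flux"]

def lnd_step(fluxes):              # stand-in for the land (CLM)
    return {"evap": 0.2 * fluxes["heat_flux"]}

sst = 20.0
for step in range(3):              # coupling intervals
    fluxes = atm_step(sst)         # atmosphere sees the last ocean state
    sst = ocn_step(fluxes)         # ocean receives (regridded) atm fluxes
    lnd = lnd_step(fluxes)         # land receives the same atm fluxes
    print(f"step {step}: sst={sst:.2f}, evap={lnd['evap']:.2f}")
```

In the real system each component would run in parallel on its own processor set, with the coupler handling regridding between the different grids.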
An approach to secure weather and climate models against hardware faults
NASA Astrophysics Data System (ADS)
Düben, Peter D.; Dawson, Andrew
2017-03-01
Enabling Earth System models to run efficiently on future supercomputers is a serious challenge for model development. Many publications study efficient parallelization to allow better scaling of performance on an increasing number of computing cores. However, one of the most alarming threats for weather and climate predictions on future high performance computing architectures is widely ignored: the presence of hardware faults that will frequently hit large applications as we approach exascale supercomputing. Changes in the structure of weather and climate models that would allow them to be resilient against hardware faults are hardly discussed in the model development community. In this paper, we present an approach to secure the dynamical core of weather and climate models against hardware faults using a backup system that stores coarse resolution copies of prognostic variables. Frequent checks of the model fields on the backup grid allow the detection of severe hardware faults, and prognostic variables that are changed by hardware faults on the model grid can be restored from the backup grid to continue model simulations with no significant delay. To justify the approach, we perform model simulations with a C-grid shallow water model in the presence of frequent hardware faults. As long as the backup system is used, simulations do not crash and a high level of model quality can be maintained. The overhead due to the backup system is reasonable and additional storage requirements are small. Runtime is increased by only 13% for the shallow water model.
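A minimal sketch of the backup-grid idea, assuming 2x2 block-mean coarsening, a nearest-neighbour restore, and an invented detection threshold:

```python
# Keep a coarse copy of a prognostic field, periodically compare the
# coarsened model field against it, and restore upscaled backup values
# if a silent hardware fault has corrupted the fine grid.
import numpy as np

def coarsen(field):                      # 2x2 block means
    return field.reshape(field.shape[0] // 2, 2, -1, 2).mean(axis=(1, 3))

def refine(coarse):                      # nearest-neighbour upsampling
    return np.kron(coarse, np.ones((2, 2)))

h = np.random.default_rng(1).normal(1000.0, 1.0, (8, 8))  # prognostic field
backup = coarsen(h)                      # stored at each check interval

h[3, 5] = 1.0e9                          # simulated bit-flip corruption

if np.max(np.abs(coarsen(h) - backup)) > 100.0:   # fault detected
    h = refine(backup)                   # continue from the backup copy

print(f"field restored, max = {h.max():.1f}")
```

The restored field is only as accurate as the coarse copy, which is the quality trade-off the paper quantifies against the reported 13% runtime overhead.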
NASA Astrophysics Data System (ADS)
Yang, S.; Madsen, M. S.; Rodehacke, C. B.; Svendsen, S. H.; Adalgeirsdottir, G.
2014-12-01
Recent observations show that the Greenland ice sheet (GrIS) has been losing mass at an increasing rate during the past decades. Predicting GrIS changes and their climate consequences relies on understanding the interaction of the GrIS with the climate system on both global and local scales, and requires climate model systems with an explicit and physically consistent ice sheet module. A fully coupled global climate model with a dynamical ice sheet model for the GrIS has recently been developed. The model system, EC-EARTH-PISM, consists of EC-EARTH, an atmosphere, ocean and sea ice model system, and the Parallel Ice Sheet Model (PISM). The coupling of PISM includes a modified surface physical parameterization in EC-EARTH adapted to the land ice surface over glaciated regions in Greenland. The PISM ice sheet model is forced with the surface mass balance (SMB) computed directly inside the EC-EARTH atmospheric module, accounting for the precipitation, the surface evaporation, and the melting of snow and ice over land ice. PISM returns the simulated basal melt, ice discharge and ice cover (extent and thickness) as boundary conditions to EC-EARTH. This coupled system is mass and energy conserving without being constrained by any anomaly correction or flux adjustment, and hence is suitable for investigation of ice sheet-climate feedbacks. Three multi-century experiments for warm climate scenarios, under (1) RCP8.5 climate forcing, (2) an abrupt 4xCO2 increase, and (3) an idealized 1% per year CO2 increase, are performed using the coupled model system. The experiments are compared with their counterparts from the standard CMIP5 simulations (without the interactive ice sheet) to evaluate the performance of the coupled system and to quantify the GrIS feedbacks. In particular, the evolution of the Greenland ice sheet under the warm climate and its impacts on the climate system are investigated. Freshwater fluxes from the Greenland ice sheet melt to the Arctic and North Atlantic basins and their influence on the ocean stratification and ocean circulation are analysed. The changes in the surface climate and the atmospheric circulation associated with the impact of the Greenland ice sheet changes are quantified. The interaction between the Greenland ice sheet and Arctic sea ice is also examined.
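The SMB forcing described above is, at its simplest, a residual of the named accumulation and loss terms; a one-line sketch with invented values:

```python
# SMB as the residual of the terms named in the abstract; values are
# illustrative, in kg m-2 yr-1 (mm water equivalent per year).
def surface_mass_balance(precip, evap, melt):
    """SMB = precipitation - surface evaporation - melt."""
    return precip - evap - melt

smb = surface_mass_balance(precip=400.0, evap=50.0, melt=120.0)
print(f"SMB = {smb:.0f} kg m-2 yr-1")   # 230: net accumulation
```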
Heinrich events simulated across the glacial
NASA Astrophysics Data System (ADS)
Ziemen, F. A.; Mikolajewicz, U.
2015-12-01
Heinrich events are among the most prominent climate change events recorded in proxies across the northern hemisphere. They are the archetype of ice sheet-climate interactions on millennial time scales. Nevertheless, the exact mechanisms that cause Heinrich events are still under discussion, and their climatic consequences are far from being fully understood. We contribute to answering the open questions by studying Heinrich events in a framework coupling an ice sheet model (ISM) to an atmosphere-ocean-vegetation general circulation model (AOVGCM), where this variability occurs as part of the model-generated internal variability. The setup consists of a northern hemisphere configuration of the modified Parallel Ice Sheet Model (mPISM) coupled to the global AOVGCM ECHAM5/MPIOM/LPJ. The simulations were performed fully coupled and with transient orbital and greenhouse gas forcing. They span from several millennia before the last glacial maximum into the deglaciation. We analyze simulations where the ISM is coupled asynchronously to the AOVGCM, and simulations where the ISM and the ocean model are coupled synchronously while the atmosphere model is coupled asynchronously to them. The modeled Heinrich events show a marked influence of the ice discharge on the Atlantic circulation and heat transport.
Emissions pathways, climate change, and impacts on California
Hayhoe, K.; Cayan, D.; Field, C.B.; Frumhoff, P.C.; Maurer, E.P.; Miller, N.L.; Moser, S.C.; Schneider, S.H.; Cahill, K.N.; Cleland, E.E.; Dale, L.; Drapek, R.; Hanemann, R.M.; Kalkstein, L.S.; Lenihan, J.; Lunch, C.K.; Neilson, R.P.; Sheridan, S.C.; Verville, J.H.
2004-01-01
The magnitude of future climate change depends substantially on the greenhouse gas emission pathways we choose. Here we explore the implications of the highest and lowest Intergovernmental Panel on Climate Change emissions pathways for climate change and associated impacts in California. Based on climate projections from two state-of-the-art climate models with low and medium sensitivity (Parallel Climate Model and Hadley Centre Climate Model, version 3, respectively), we find that annual temperature increases nearly double from the lower B1 to the higher A1fi emissions scenario before 2100. Three of four simulations also show greater increases in summer temperatures as compared with winter. Extreme heat and the associated impacts on a range of temperature-sensitive sectors are substantially greater under the higher emissions scenario, with some interscenario differences apparent before midcentury. By the end of the century under the B1 scenario, heatwaves and extreme heat in Los Angeles quadruple in frequency while heat-related mortality increases two to three times; alpine/subalpine forests are reduced by 50-75%; and Sierra snowpack is reduced 30-70%. Under A1fi, heatwaves in Los Angeles are six to eight times more frequent, with heat-related excess mortality increasing five to seven times; alpine/subalpine forests are reduced by 75-90%; and snowpack declines 73-90%, with cascading impacts on runoff and streamflow that, combined with projected modest declines in winter precipitation, could fundamentally disrupt California's water rights system. Although interscenario differences in climate impacts and costs of adaptation emerge mainly in the second half of the century, they are strongly dependent on emissions from preceding decades.
NASA Technical Reports Server (NTRS)
Larson, Jay W.
1998-01-01
Atmospheric data assimilation is a method of combining actual observations with model forecasts to produce a more accurate description of the earth system than the observations or forecast alone can provide. The outputs of data assimilation, sometimes called the analysis, are regular, gridded datasets of observed and unobserved variables. Analysis plays a key role in numerical weather prediction and is becoming increasingly important for climate research. These applications, and the need for timely validation of scientific enhancements to the data assimilation system, pose computational demands that are best met by distributed parallel software. The mission of the NASA Data Assimilation Office (DAO) is to provide datasets for climate research and to support NASA satellite and aircraft missions. The system used to create these datasets is the Goddard Earth Observing System Data Assimilation System (GEOS DAS). The core components of the GEOS DAS are: the GEOS General Circulation Model (GCM), the Physical-space Statistical Analysis System (PSAS), the Observer, the on-line Quality Control (QC) system, the Coupler (which feeds analysis increments back to the GCM), and an I/O package for processing the large amounts of data the system produces (which will be described in another presentation in this session). The discussion will center on the following issues: the computational complexity of the whole GEOS DAS, assessment of the performance of the individual elements of GEOS DAS, and the parallelization strategy for some of the components of the system.
Failure analysis of parameter-induced simulation crashes in climate models
NASA Astrophysics Data System (ADS)
Lucas, D. D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y.
2013-01-01
Simulations using IPCC-class climate models can fail or crash for a variety of reasons. Quantitative analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation crashes within the Parallel Ocean Program (POP2) component of the Community Climate System Model (CCSM4). About 8.5% of our CCSM4 simulations failed for numerical reasons at particular combinations of POP2 parameter values. We apply support vector machine (SVM) classification from machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. A committee of SVM classifiers readily predicts model failures in an independent validation ensemble, as assessed by the area under the receiver operating characteristic (ROC) curve metric (AUC > 0.96). The causes of the simulation failures are determined through a global sensitivity analysis. Combinations of 8 parameters related to ocean mixing and viscosity, drawn from three different POP2 parameterizations, are the major sources of the failures. This information can be used to improve POP2 and CCSM4 by incorporating correlations across the relevant parameters. Our method can also be used to quantify, predict, and understand simulation crashes in other complex geoscientific models.
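A minimal sketch of the classification approach, assuming scikit-learn and synthetic stand-ins for the 18 scaled POP2 parameters (the toy failure region is invented):

```python
# Train a small SVM committee on parameter vectors labeled crashed (1) or
# completed (0), then score an independent set with ROC AUC.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (300, 18))                  # scaled parameter draws
y = (X[:, 0] + X[:, 3] > 1.4).astype(int)         # toy failure region

train, test = slice(0, 200), slice(200, 300)
committee = [SVC(C=c, probability=True).fit(X[train], y[train])
             for c in (0.5, 1.0, 2.0)]            # committee of classifiers
p = np.mean([m.predict_proba(X[test])[:, 1] for m in committee], axis=0)
print(f"validation AUC = {roc_auc_score(y[test], p):.2f}")
```

Averaging the committee's probabilities, as above, is one simple way to combine classifiers; the resulting probability map over parameter space is what feeds the global sensitivity analysis.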
Working with South Florida County Planners to Understand and Mitigate Uncertain Climate Risks
NASA Astrophysics Data System (ADS)
Knopman, D.; Groves, D. G.; Berg, N.
2017-12-01
This talk describes a novel approach for evaluating climate change vulnerabilities and adaptations in Southeast Florida to support long-term resilience planning. The work is unique in that it combines state-of-the-art hydrologic modeling with the region's long-term land use and transportation plans to better assess the future climate vulnerability and adaptations for the region. Addressing uncertainty in future projections is handled through the use of decisionmaking under deep uncertainty methods. Study findings, including analysis of key tradeoffs, were conveyed to the region's stakeholders through an innovative web-based decision support tool. This project leverages existing groundwater models spanning Miami-Dade and Broward Counties developed by the USGS, along with projections of land use and asset valuations for Miami-Dade and Broward County planning agencies. Model simulations are executed on virtual cloud-based servers for a highly scalable and parallelized platform. Groundwater elevations and the saltwater-freshwater interface and intrusion zones from the integrated modeling framework are analyzed under a wide range of long-term climate futures, including projected sea level rise and precipitation changes. The hydrologic hazards are then combined with current and future land use and asset valuation projections to estimate assets at risk across the range of futures. Lastly, an interactive decision support tool highlights the areas with critical climate vulnerabilities; distinguishes between vulnerability due to new development, increased climate hazards, or both; and provides guidance for adaptive management and development practices and decisionmaking in Southeast Florida.
Toward 10-km mesh global climate simulations
NASA Astrophysics Data System (ADS)
Ohfuchi, W.; Enomoto, T.; Takaya, K.; Yoshioka, M. K.
2002-12-01
An atmospheric general circulation model (AGCM) that runs very efficiently on the Earth Simulator (ES) was developed. The ES is a gigantic vector-parallel computer with a peak performance of 40 Tflops. The AGCM, named AFES (AGCM for ES), was based on version 5.4.02 of an AGCM developed jointly by the Center for Climate System Research of the University of Tokyo and the Japanese National Institute for Environmental Studies. AFES was, however, totally rewritten in Fortran 90 and MPI, while the original AGCM was written in Fortran 77 and not capable of parallel computing. AFES achieved 26 Tflops (about 65% of the peak performance of the ES) at a resolution of T1279L96 (10-km horizontal resolution and 500-m vertical resolution from the middle troposphere to the lower stratosphere). Some results of 10- to 20-day global simulations will be presented. At this moment, only short-term simulations are possible due to data storage limitations: now that tens-of-teraflops computing has been achieved, petabyte data storage is necessary to conduct climate-type simulations at this super-high global resolution. Some possibilities for future research topics in global super-high-resolution climate simulations will be discussed. Target topics include mesoscale structures and self-organization of the Baiu-Meiyu front over Japan, cyclogenesis over the North Pacific, and typhoons around Japan. Improvement in local precipitation with increasing horizontal resolution will also be demonstrated.
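The petabyte storage claim is easy to sanity-check from the grid size; a rough estimate, where the T1279 grid dimensions are approximate and the variable count and output frequency are invented:

```python
# Rough output-volume estimate for a T1279L96 run. Grid dimensions are
# approximate for a T1279 Gaussian grid; variable count and snapshot
# frequency are invented for illustration.
nlon, nlat, nlev = 3840, 1920, 96
nvars = 10                     # output variables (assumed)
bytes_per_value = 4            # single precision
snapshots_per_day = 4          # 6-hourly output (assumed)

bytes_per_snapshot = nlon * nlat * nlev * nvars * bytes_per_value
per_year = bytes_per_snapshot * snapshots_per_day * 365
print(f"{per_year / 1e12:.0f} TB per simulated year")   # ~41 TB/year
# A multi-decade climate run at this resolution thus lands in the
# petabyte range, consistent with the storage limitation noted above.
```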
Exploratory Climate Data Visualization and Analysis Using DV3D and UVCDAT
NASA Technical Reports Server (NTRS)
Maxwell, Thomas
2012-01-01
Earth system scientists are being inundated by an explosion of data generated by ever-increasing resolution in both global models and remote sensors. Advanced tools for accessing, analyzing, and visualizing very large and complex climate data are required to maintain rapid progress in Earth system research. To meet this need, NASA, in collaboration with the Ultra-scale Visualization Climate Data Analysis Tools (UVCDAT) consortium, is developing exploratory climate data analysis and visualization tools which provide data analysis capabilities for the Earth System Grid (ESG). This paper describes DV3D, a UVCDAT package that enables exploratory analysis of climate simulation and observation datasets. DV3D provides user-friendly interfaces for visualization and analysis of climate data at a level appropriate for scientists. It features workflow interfaces, interactive 4D data exploration, hyperwall and stereo visualization, automated provenance generation, and parallel task execution. DV3D's integration with CDAT's climate data management system (CDMS) and other climate data analysis tools provides a wide range of high-performance climate data analysis operations. DV3D expands the scientists' toolbox by incorporating a suite of rich new exploratory visualization and analysis methods for addressing the complexity of climate datasets.
Constraints and Opportunities in GCM Model Development
NASA Technical Reports Server (NTRS)
Schmidt, Gavin; Clune, Thomas
2010-01-01
Over the past 30 years climate models have evolved from relatively simple representations of a few atmospheric processes to complex multi-disciplinary system models which incorporate physics from bottom of the ocean to the mesopause and are used for seasonal to multi-million year timescales. Computer infrastructure over that period has gone from punchcard mainframes to modern parallel clusters. Constraints of working within an ever evolving research code mean that most software changes must be incremental so as not to disrupt scientific throughput. Unfortunately, programming methodologies have generally not kept pace with these challenges, and existing implementations now present a heavy and growing burden on further model development as well as limiting flexibility and reliability. Opportunely, advances in software engineering from other disciplines (e.g. the commercial software industry) as well as new generations of powerful development tools can be incorporated by the model developers to incrementally and systematically improve underlying implementations and reverse the long term trend of increasing development overhead. However, these methodologies cannot be applied blindly, but rather must be carefully tailored to the unique characteristics of scientific software development. We will discuss the need for close integration of software engineers and climate scientists to find the optimal processes for climate modeling.
Seasonal Climate Correlations of Simulated Tree-ring Records
NASA Astrophysics Data System (ADS)
Li, X.; St George, S.
2013-12-01
Both dendrochronological theory and regional and global networks of tree-ring width measurements indicate that trees can respond to climate variations quite differently from one location to another. To explain these geographical differences at hemispheric scale, we used a process-based model of tree-ring formation (the Vaganov-Shashkin model) to simulate tree growth at over 6000 locations across the Northern Hemisphere. We compared the seasonality and strength of climate signals in the simulated tree-ring records against a parallel analysis conducted on a hemispheric network of real tree-ring observations, tested the ability of the model to reproduce behaviors that emerge from large networks of tree-ring widths, and used the model outputs to explain why the network exhibits these behaviors. The simulated tree-ring records are consistent with observations with respect to the seasonality and relative strength of the encoded climate signals, and time-related changes in these climate signals can be predicted using the modeled relative growth rate due to temperature or soil moisture. The positive imprint of winter (DJF) precipitation is strongest in simulations from the American Southwest and northern Mexico, as well as selected locations in the Mediterranean and central Asia. Summer (JJA) precipitation has higher positive correlations with simulations in the mid-latitudes, but some high-latitude coastal sites exhibit a negative association. The influence of summer temperature is mainly positive at high-latitude or high-altitude sites and negative in the mid-latitudes. The absolute magnitudes of climate correlations are generally higher in simulations than in observations, but the pattern and geographical differences remain the same, demonstrating that the model has skill in reproducing tree-ring growth response to climate variability in the Northern Hemisphere. Because the model uses only temperature, precipitation and latitude as input and is not adjusted for species or other biological factors, the fact that the climate response of the simulations largely agrees with the observations may imply that climate, rather than biology, is the main factor that influences large-scale patterns of the climate information recorded by tree rings. Our results also suggest that the Vaganov-Shashkin model could be used to estimate the likely climate response of trees in 'frontier' areas that have not been sampled extensively.
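Per site, the seasonal-correlation analysis described above reduces to correlating an annual ring-width series against seasonal climate means; a minimal sketch with synthetic data:

```python
# Correlate a synthetic annual ring-width series against the JJA
# temperature mean, the building block of the seasonal-signal maps
# described above. The growth response here is invented.
import numpy as np

rng = np.random.default_rng(2)
years = 100
t_monthly = rng.normal(0.0, 1.0, (years, 12))        # temperature anomalies
jja = t_monthly[:, 5:8].mean(axis=1)                  # Jun-Jul-Aug mean
ring_width = 0.7 * jja + rng.normal(0.0, 0.5, years)  # toy growth response

r = np.corrcoef(ring_width, jja)[0, 1]
print(f"JJA temperature correlation: r = {r:.2f}")
```

Repeating this for each season and each of the 6000+ simulated sites yields the hemispheric correlation patterns compared against observations.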
NASA Astrophysics Data System (ADS)
Mani, N. J.; Waliser, D. E.; Jiang, X.
2014-12-01
While the boreal summer monsoon intraseasonal variability (BSISV) exerts a profound influence on the south Asian monsoon, the capability of present-day dynamical models in simulating and predicting the BSISV is still limited. The global model evaluation project on the vertical structure and diabatic processes of the Madden-Julian Oscillation (MJO) is a joint venture, coordinated by the Working Group on Numerical Experimentation (WGNE) MJO Task Force and the GEWEX Atmospheric System Study (GASS) program, for assessing model deficiencies in simulating the ISV and for improving our understanding of the underlying processes. In this study the simulation of the northward-propagating BSISV is investigated in 26 climate models, with special focus on the vertical diabatic heating structure and clouds. Following parallel lines of inquiry as the MJO Task Force has done with the eastward-propagating MJO, we utilize previously proposed and newly developed model performance metrics and process diagnostics and apply them to the global climate model simulations of the BSISV.
NASA Technical Reports Server (NTRS)
Lin, Shian-Jiann; Atlas, Robert (Technical Monitor)
2002-01-01
The Data Assimilation Office (DAO) has been developing a new generation of ultra-high resolution General Circulation Model (GCM) that is suitable for 4-D data assimilation, numerical weather predictions, and climate simulations. These three applications have conflicting requirements. For 4-D data assimilation and weather predictions, it is highly desirable to run the model at the highest possible spatial resolution (e.g., 55 km or finer) so as to be able to resolve and predict socially and economically important weather phenomena such as tropical cyclones, hurricanes, and severe winter storms. For climate change applications, the model simulations need to be carried out for decades, if not centuries. To reduce uncertainty in climate change assessments, the next generation model would also need to be run at a fine enough spatial resolution that can at least marginally simulate the effects of intense tropical cyclones. Scientific problems (e.g., parameterization of subgrid scale moist processes) aside, all three areas of application require the model's computational performance to be dramatically improved as compared to the previous generation. In this talk, I will present the current and future developments of the "finite-volume dynamical core" at the Data Assimilation Office. This dynamical core applies modern monotonicity-preserving algorithms and is genuinely conservative by construction, not by an ad hoc fixer. The "discretization" of the conservation laws is purely local, which is clearly advantageous for resolving sharp gradient flow features. In addition, the local nature of the finite-volume discretization also has a significant advantage on distributed memory parallel computers. Together with a unique vertically Lagrangian control volume discretization that essentially reduces the dimension of the computational problem from three to two, the finite-volume dynamical core is very efficient, particularly at high resolutions. I will also present the computational design of the dynamical core using a hybrid distributed-shared memory programming paradigm that is portable to virtually any of today's high-end parallel super-computing clusters.
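The flavor of a monotonicity-preserving finite-volume scheme can be shown with a textbook 1D advection step using a van Leer slope limiter; this is a generic sketch, not the DAO dynamical core.

```python
# One-dimensional finite-volume advection with a van Leer slope limiter:
# locally conservative by construction, and monotone (no new extrema).
import numpy as np

def vanleer_slope(q):
    """Limited slope per cell; collapses to zero at extrema."""
    dq_l = q - np.roll(q, 1)
    dq_r = np.roll(q, -1) - q
    prod = dq_l * dq_r
    denom = np.where(np.abs(dq_l + dq_r) > 0, dq_l + dq_r, 1.0)
    return np.where(prod > 0.0, 2.0 * prod / denom, 0.0)

def advect(q, courant):
    """One upwind finite-volume step (uniform velocity, 0 < courant < 1)."""
    s = vanleer_slope(q)
    q_face = q + 0.5 * (1.0 - courant) * s  # reconstructed upwind face value
    flux = courant * q_face                 # flux leaving each cell rightward
    return q - (flux - np.roll(flux, 1))    # conservative update

q = np.where(np.arange(64) < 32, 1.0, 0.0)  # step profile, periodic domain
for _ in range(32):
    q = advect(q, 0.5)
print(f"monotone: min={q.min():.3f}, max={q.max():.3f}")  # stays in [0, 1]
```

The limiter is what keeps the advected step free of overshoots, which is the "monotonicity preserving, conservative by construction" property the abstract emphasizes.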
NASA Astrophysics Data System (ADS)
Dettinger, M. D.; Cayan, D. R.; Meyer, M. K.
2001-12-01
Sensitivities of river basins in the Sierra Nevada of California to historical and future climate variations and changes are analyzed by simulating daily streamflow and water-balance responses to simulated climate variations over a continuous 200-year period. The coupled atmosphere-ocean-ice-land Parallel Climate Model provides the simulated climate histories, and existing hydrologic models of the Merced, Carson, and American Rivers are used to simulate the basin responses. The historical simulations yield stationary climate and hydrologic variations through the first part of the 20th Century until about 1975, when temperatures begin to warm noticeably and when snowmelt and streamflow peaks begin to occur progressively earlier within the seasonal cycle. A future climate simulated with business-as-usual increases in greenhouse-gas and aerosol radiative forcings continues those recent trends through the 21st Century with an attendant +2.5 °C warming and a hastening of snowmelt and streamflow within the seasonal cycle by almost a month. In contrast, a control simulation in which radiative forcings are held constant at 1995 levels for the 50 years following 1995 yields climate and streamflow-timing conditions much like the 1980s and 1990s throughout its duration. Long-term average totals of streamflow and other hydrologic fluxes remain similar to the historical mean in all three simulations. The various projected trends in the business-as-usual simulations become readily visible above simulated natural climatic and hydrologic variability by about 2020.
Nelson, Kären C; Palmer, Margaret A; Pizzuto, James E; Moglen, Glenn E; Angermeier, Paul L; Hilderbrand, Robert H; Dettinger, Michael; Hayhoe, Katharine
2009-01-01
Streams collect runoff, heat, and sediment from their watersheds, making them highly vulnerable to anthropogenic disturbances such as urbanization and climate change. Forecasting the effects of these disturbances using process-based models is critical to identifying the form and magnitude of likely impacts. Here, we integrate a new biotic model with four previously developed physical models (downscaled climate projections, stream hydrology, geomorphology, and water temperature) to predict how stream fish growth and reproduction will most probably respond to shifts in climate and urbanization over the next several decades. The biotic submodel couples dynamics in fish populations and habitat suitability to predict fish assemblage composition, based on readily available biotic information (preferences for habitat, temperature, and food, and characteristics of spawning) and day-to-day variability in stream conditions. We illustrate the model using Piedmont headwater streams in the Chesapeake Bay watershed of the USA, projecting ten scenarios: Baseline (low urbanization, no ongoing construction, and present-day climate); one Urbanization scenario (higher impervious surface, lower forest cover, significant construction activity); four future climate change scenarios [Hadley CM3 and Parallel Climate Models under medium-high (A2) and medium-low (B2) emissions scenarios]; and the same four climate change scenarios plus Urbanization. Urbanization alone depressed growth or reproduction of 8 of 39 species, while climate change alone depressed 22 to 29 species. Almost every recreationally important species (i.e. trouts, basses, sunfishes) and six of the ten currently most common species were predicted to be significantly stressed. The combined effect of climate change and urbanization on adult growth was sometimes large compared to the effect of either stressor alone. Thus, the model predicts considerable change in fish assemblage composition, including loss of diversity. Synthesis and applications. The interaction of climate change and urban growth may entail significant reconfiguring of headwater streams, including a loss of ecosystem structure and services, which will be more costly than climate change alone. On local scales, stakeholders cannot control climate drivers but they can mitigate stream impacts via careful land use. Therefore, to conserve stream ecosystems, we recommend that proactive measures be taken to insure against species loss or severe population declines. Delays will inevitably exacerbate the impacts of both climate change and urbanization on headwater systems. PMID:19536343
FastQuery: A Parallel Indexing System for Scientific Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chou, Jerry; Wu, Kesheng; Prabhat,
2011-07-29
Modern scientific datasets present numerous data management and analysis challenges. State-of-the-art index and query technologies such as FastBit can significantly improve accesses to these datasets by augmenting the user data with indexes and other secondary information. However, a challenge is that the indexes assume the relational data model but the scientific data generally follows the array data model. To match the two data models, we design a generic mapping mechanism and implement an efficient input and output interface for reading and writing the data and their corresponding indexes. To take advantage of the emerging many-core architectures, we also develop a parallel strategy for indexing using threading technology. This approach complements our ongoing MPI-based parallelization efforts. We demonstrate the flexibility of our software by applying it to two of the most commonly used scientific data formats, HDF5 and NetCDF. We present two case studies using data from a particle accelerator model and a global climate model. We also conducted a detailed performance study using these scientific datasets. The results show that FastQuery speeds up the query time by a factor of 2.5x to 50x, and it reduces the indexing time by a factor of 16 on 24 cores.
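The array-to-relational mapping can be made concrete with a toy value index: flatten an array variable into sorted-value order once, answer a range query by binary search, and map hits back to array coordinates. This NumPy sketch only illustrates the idea and is not FastQuery's API; names and sizes are invented.

    # Toy "index" over an array-model variable: sort once, then range-query cheaply.
    import numpy as np

    rng = np.random.default_rng(8)
    temp = rng.normal(15.0, 8.0, size=(64, 128))      # a 2-D array-model variable

    order = np.argsort(temp, axis=None)               # flat indices in value order (the index)
    values = temp.ravel()[order]                      # sorted copy of the values

    lo, hi = np.searchsorted(values, [30.0, 40.0])    # query: 30 <= temp < 40, via binary search
    hits = np.unravel_index(order[lo:hi], temp.shape) # map matches back to array coordinates
    print("matching cells:", hits[0].size)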
Simulation of the present-day climate with the climate model INMCM5
NASA Astrophysics Data System (ADS)
Volodin, E. M.; Mortikov, E. V.; Kostrykin, S. V.; Galin, V. Ya.; Lykossov, V. N.; Gritsun, A. S.; Diansky, N. A.; Gusev, A. V.; Iakovlev, N. G.
2017-12-01
In this paper we present the fifth generation of the INMCM climate model that is being developed at the Institute of Numerical Mathematics of the Russian Academy of Sciences (INMCM5). The most important changes with respect to the previous version (INMCM4) were made in the atmospheric component of the model. Its vertical resolution was increased to resolve the upper stratosphere and the lower mesosphere. A more sophisticated parameterization of condensation and cloudiness formation was introduced as well. An aerosol module was incorporated into the model. The upgraded oceanic component has a modified dynamical core optimized for better implementation on parallel computers and twice the resolution in both horizontal directions. Analysis of the present-day climatology of the INMCM5 (based on data from the historical run for 1979-2005) shows moderate improvements in reproduction of basic circulation characteristics with respect to the previous version. Biases in the near-surface temperature and precipitation are slightly reduced compared with INMCM4, as are biases in oceanic temperature, salinity and sea surface height. The most notable improvement over INMCM4 is the capability of the new model to reproduce the equatorial stratospheric quasi-biennial oscillation and statistics of sudden stratospheric warmings.
A New Attempt of 2-D Numerical Ice Flow Model to Reconstruct Paleoclimate from Mountain Glaciers
NASA Astrophysics Data System (ADS)
Candaş, Adem; Akif Sarıkaya, Mehmet
2017-04-01
A new two-dimensional (2D) numerical ice flow model was developed to simulate the steady-state glacier extent for a wide range of climate conditions. The simulation includes the flow of ice forced by the annual mass balance gradient of a valley glacier. The annual mass balance is calculated as the difference between the net accumulation and ablation of snow and/or ice. The model lets users compare the simulated and field-observed ice extents of paleoglaciers. Model results therefore constrain past climates, since the simulated ice extent is a function of the predefined climatic conditions. To predict the glacier shape and distribution in two dimensions, a time-dependent partial differential equation (PDE) is solved. A 2D glacier flow model code was written in MATLAB, and a finite difference method is used to solve this equation. In addition, the Parallel Ice Sheet Model (PISM) is used to regenerate paleoglaciers in the same area where the MATLAB code is applied. We chose Mount Dedegöl, an extensively glaciated mountain in SW Turkey, to apply both models. Model results will be presented and discussed in this presentation. This study was supported by TÜBİTAK project 114Y548.
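A common concrete realization of such a model is an explicit finite-difference update of ice thickness under the shallow-ice approximation. The 1-D sketch below is generic, with an assumed rate factor, mass-balance gradient, and equilibrium-line altitude, and is not the authors' MATLAB code; a 2-D version adds the second horizontal dimension and careful time-step control.

    # Generic 1-D shallow-ice finite-difference sketch (all constants assumed).
    import numpy as np

    rho_g, A_flow, n_glen = 910.0 * 9.81, 1.0e-16, 3   # ice density*g, rate factor (Pa^-3 yr^-1), Glen exponent
    dx, dt = 500.0, 0.02                               # grid spacing (m), small explicit step (yr) for stability
    x = np.arange(0.0, 20000.0, dx)
    bed = 3000.0 - 0.05 * x                            # idealized sloping bed (m)
    H = np.zeros_like(x)                               # ice thickness (m)
    ela = 2600.0                                       # assumed equilibrium-line altitude (m)

    for step in range(50000):                          # 1000 model years
        s = bed + H                                    # ice surface elevation
        b_dot = np.clip(0.005 * (s - ela), -2.0, 0.5)  # linear mass balance (m ice / yr)
        dsdx = np.gradient(s, dx)
        # shallow-ice diffusivity; flux q = -D ds/dx
        D = (2.0 * A_flow / (n_glen + 2)) * rho_g**n_glen * H**(n_glen + 2) * np.abs(dsdx)**(n_glen - 1)
        H += dt * (b_dot - np.gradient(-D * dsdx, dx)) # mass continuity
        H = np.maximum(H, 0.0)                         # thickness cannot go negative

    print("steady-state max thickness (m): %.0f" % H.max())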
Emissions pathways, climate change, and impacts on California
Hayhoe, Katharine; Cayan, Daniel; Field, Christopher B.; Frumhoff, Peter C.; Maurer, Edwin P.; Miller, Norman L.; Moser, Susanne C.; Schneider, Stephen H.; Cahill, Kimberly Nicholas; Cleland, Elsa E.; Dale, Larry; Drapek, Ray; Hanemann, R. Michael; Kalkstein, Laurence S.; Lenihan, James; Lunch, Claire K.; Neilson, Ronald P.; Sheridan, Scott C.; Verville, Julia H.
2004-01-01
The magnitude of future climate change depends substantially on the greenhouse gas emission pathways we choose. Here we explore the implications of the highest and lowest Intergovernmental Panel on Climate Change emissions pathways for climate change and associated impacts in California. Based on climate projections from two state-of-the-art climate models with low and medium sensitivity (Parallel Climate Model and Hadley Centre Climate Model, version 3, respectively), we find that annual temperature increases nearly double from the lower B1 to the higher A1fi emissions scenario before 2100. Three of four simulations also show greater increases in summer temperatures as compared with winter. Extreme heat and the associated impacts on a range of temperature-sensitive sectors are substantially greater under the higher emissions scenario, with some interscenario differences apparent before midcentury. By the end of the century under the B1 scenario, heatwaves and extreme heat in Los Angeles quadruple in frequency while heat-related mortality increases two to three times; alpine/subalpine forests are reduced by 50–75%; and Sierra snowpack is reduced 30–70%. Under A1fi, heatwaves in Los Angeles are six to eight times more frequent, with heat-related excess mortality increasing five to seven times; alpine/subalpine forests are reduced by 75–90%; and snowpack declines 73–90%, with cascading impacts on runoff and streamflow that, combined with projected modest declines in winter precipitation, could fundamentally disrupt California's water rights system. Although interscenario differences in climate impacts and costs of adaptation emerge mainly in the second half of the century, they are strongly dependent on emissions from preceding decades. PMID:15314227
NASA Astrophysics Data System (ADS)
Steinke, R. C.; Ogden, F. L.; Lai, W.; Moreno, H. A.; Pureza, L. G.
2014-12-01
Physics-based watershed models are useful tools for hydrologic studies, water resources management and economic analyses in the contexts of climate, land-use, and water-use changes. This poster presents a parallel implementation of a quasi-3-dimensional, physics-based, high-resolution, distributed water resources model suitable for simulating large watersheds in a massively parallel computing environment. Developing this model is one of the objectives of the NSF EPSCoR RII Track II CI-WATER project, which is joint between the Wyoming and Utah EPSCoR jurisdictions. The model, which we call ADHydro, is aimed at simulating important processes in the Rocky Mountain west, including: rainfall and infiltration, snowfall and snowmelt in complex terrain, vegetation and evapotranspiration, soil heat flux and freezing, overland flow, channel flow, groundwater flow, water management and irrigation. Model forcing is provided by the Weather Research and Forecasting (WRF) model, and ADHydro is coupled with the NOAH-MP land-surface scheme for calculating fluxes between the land and atmosphere. The ADHydro implementation uses the Charm++ parallel runtime system. Charm++ is based on location-transparent message passing between migratable C++ objects. Each object represents an entity in the model such as a mesh element. These objects can be migrated between processors or serialized to disk, allowing the Charm++ system to automatically provide capabilities such as load balancing and checkpointing. Objects interact with each other by passing messages that the Charm++ system routes to the correct destination object regardless of its current location. This poster discusses the algorithms, communication patterns, and caching strategies used to implement ADHydro with Charm++. The ADHydro model code will be released to the hydrologic community in late 2014.
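The message-driven object pattern described here can be mimicked in miniature. Charm++ itself is a C++ runtime, so the Python toy below only imitates the idea: elements interact solely through messages delivered by a router, which is what makes object migration and load balancing transparent to the model code. All names are hypothetical.

    # Toy message-driven mesh elements, loosely mimicking the Charm++ pattern.
    from collections import deque

    class Element:
        def __init__(self, eid, state, neighbors):
            self.eid, self.state, self.neighbors = eid, state, neighbors
        def step(self, send):
            for nb in self.neighbors:                        # announce state to neighbors
                send(nb, (self.eid, self.state))
        def receive(self, msg):
            src, nb_state = msg
            self.state += 0.1 * (nb_state - self.state)      # relax toward neighbor value

    elements = {i: Element(i, float(i), [(i - 1) % 4, (i + 1) % 4]) for i in range(4)}
    queue = deque()
    send = lambda dest, msg: queue.append((dest, msg))       # "router": senders never know locations

    for it in range(10):
        for e in elements.values():
            e.step(send)
        while queue:
            dest, msg = queue.popleft()
            elements[dest].receive(msg)                      # delivered wherever the object "lives"

    print("relaxed states:", [round(e.state, 3) for e in elements.values()])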
NASA Astrophysics Data System (ADS)
Sushkevich, T. A.; Strelkov, S. A.; Maksakova, S. V.
2017-11-01
We discuss national achievements of world standing in the theory of radiative transfer in the atmosphere-ocean system, and the modern scientific potential being developed in Russia, which provides an adequate methodological basis for theoretical and computational studies of radiation processes and radiation fields in natural environments, using supercomputers and massively parallel processing, for problems of remote sensing and the climate of Earth. A model of the radiation field in the "cloud cover-atmosphere-ocean" system is presented that separates the contributions of clouds, atmosphere, and ocean.
NASA Astrophysics Data System (ADS)
Yang, S.; Christensen, J. H.; Madsen, M. S.; Ringgaard, I. M.; Petersen, R. A.; Langen, P. P.
2017-12-01
The Greenland ice sheet (GrIS) has been observed undergoing rapid change in recent decades, with an increasing area of surface melting and ablation and accelerating mass loss. Predicting the GrIS changes and their climate consequences relies on understanding the interaction of the GrIS with the climate system on both global and local scales, and requires climate model systems that incorporate an explicit and physically consistent ice sheet module. In this work we study the GrIS evolution and its interaction with the climate system using a fully coupled global climate model with a dynamical ice sheet model for the GrIS. The coupled model system, EC-EARTH - PISM, consisting of the atmosphere-ocean-sea ice model system EC-EARTH and the Parallel Ice Sheet Model (PISM), has been employed for a 1400-year simulation forced by CMIP5 historical forcing from 1850 to 2005 and continued along an extended RCP8.5 scenario with the forcing peaking at 2200 and stabilized thereafter. The simulation reveals that, following the anthropogenic forcing increase, the global mean surface temperature rapidly rises about 10 °C in the 21st and 22nd centuries. After the forcing stops increasing in 2200, the temperature change slows down and eventually stabilizes at about 12.5 °C above the preindustrial level. In response to the climate warming, the GrIS starts losing mass slowly in the 21st century, but the ice retreat accelerates substantially after 2100 and ice mass loss continues thereafter at a constant rate of approximately 0.5 m sea level rise equivalent per 100 years, even as the warming rate gradually levels off. Ultimately the volume and extent of the GrIS are reduced to less than half of their preindustrial values. To understand the interaction of the GrIS with the climate system, the characteristics of atmospheric and oceanic circulation in the warm climate are analyzed. The circulation patterns associated with the negative surface mass balance that leads to GrIS retreat are investigated. The impact of the simulated surface warming on the ice flow and ice dynamics is explored.
NASA Astrophysics Data System (ADS)
Maslowski, W.; Roberts, A.; Osinski, R.; Brunke, M.; Cassano, J. J.; Clement Kinney, J. L.; Craig, A.; Duvivier, A.; Fisel, B. J.; Gutowski, W. J., Jr.; Hamman, J.; Hughes, M.; Nijssen, B.; Zeng, X.
2014-12-01
The Arctic is undergoing rapid climatic changes, which are some of the most coordinated changes currently occurring anywhere on Earth. They are exemplified by the retreat of the perennial sea ice cover, which integrates forcing by, exchanges with and feedbacks between atmosphere, ocean and land. While historical reconstructions from Global Climate and Global Earth System Models (GC/ESMs) are in broad agreement with these changes, the rate of change in the GC/ESMs remains outpaced by observations. Reasons for that stem from a combination of coarse model resolution, inadequate parameterizations, unrepresented processes and a limited knowledge of physical and other real world interactions. We demonstrate the capability of the Regional Arctic System Model (RASM) in addressing some of the GC/ESM limitations in simulating observed seasonal to decadal variability and trends in the sea ice cover and climate. RASM is a high resolution, fully coupled, pan-Arctic climate model that uses the Community Earth System Model (CESM) framework. It uses the Los Alamos Sea Ice Model (CICE) and Parallel Ocean Program (POP) configured at an eddy-permitting resolution of 1/12° as well as the Weather Research and Forecasting (WRF) and Variable Infiltration Capacity (VIC) models at 50 km resolution. All RASM components are coupled via the CESM flux coupler (CPL7) at 20-minute intervals. RASM is an example of limited-area, process-resolving, fully coupled earth system model, which due to the additional constraints from lateral boundary conditions and nudging within a regional model domain facilitates detailed comparisons with observational statistics that are not possible with GC/ESMs. In this talk, we will emphasize the utility of RASM to understand sensitivity to variable parameter space, importance of critical processes, coupled feedbacks and ultimately to reduce uncertainty in arctic climate change projections.
Enabling Efficient Climate Science Workflows in High Performance Computing Environments
NASA Astrophysics Data System (ADS)
Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.
2015-12-01
A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks provides a myriad of challenges when running on a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project we highlight a stack of tools our team utilizes and has developed to ensure that large-scale simulation and analysis work are commonplace and provide operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth Systems Grid Federation (ESGF), all while executing everything in between in a scalable environment in a task-parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases they have been applied to.
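One recurring pattern in such a stack is task parallelism over many files. The sketch below shows that pattern with mpi4py, using a made-up file list and a stand-in analysis routine; it is not the CASCADE tooling itself.

    # Static round-robin task parallelism over files with MPI.
    # Run with, e.g.: mpirun -n 4 python analyze_runs.py
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    tasks = ["run_%03d.nc" % i for i in range(32)]   # hypothetical file list
    my_tasks = tasks[rank::size]                     # each rank takes every size-th file

    def analyze(path):                               # stand-in for a real analysis routine
        return (path, len(path))

    local = [analyze(t) for t in my_tasks]
    results = comm.gather(local, root=0)             # collect per-rank results on rank 0
    if rank == 0:
        flat = [r for part in results for r in part]
        print("analyzed", len(flat), "files")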
The impacts of climate changes in the renewable energy resources in the Caribbean region
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erickson III, David J
2010-02-01
Assessment of renewable energy resources such as surface solar radiation and wind has great relevance for the development of local and regional energy policies. This paper examines the variability and availability of these resources as a function of possible climate changes for the Caribbean region. Global climate changes have been reported in recent decades, causing changes in atmospheric dynamics that affect the net solar radiation balance at the surface and the wind strength and direction. For this investigation, future climate changes for the Caribbean are predicted using the Parallel Climate Model (PCM), coupled with the Regional Atmospheric Modeling System (RAMS) to simulate changes in the solar and wind energy spatial patterns for the specific case of the island of Puerto Rico. Numerical results from PCM indicate that the Caribbean basin from 2041 to 2055 will experience a slight decrease in the net surface solar radiation (with respect to the years 1996-2010), which is more pronounced in the western Caribbean Sea. Results also indicate that the easterly winds have a tendency to increase in magnitude, especially from the years 2070 to 2098. The regional model showed that important areas for collecting solar energy are located on the eastern side of Puerto Rico, while the most intense wind speeds occur around the coast. A future climate change is expected in the Caribbean that will result in higher energy demands, but both renewable energy sources will have enough intensity to be used in the future as alternative energy resources to mitigate future climate changes.
NASA Astrophysics Data System (ADS)
Parrish, D. D.; Lamarque, J.-F.; Naik, V.; Horowitz, L.; Shindell, D. T.; Staehelin, J.; Derwent, R.; Cooper, O. R.; Tanimoto, H.; Volz-Thomas, A.; Gilge, S.; Scheel, H.-E.; Steinbacher, M.; Fröhlich, M.
2014-05-01
Two recent papers have quantified long-term ozone (O3) changes observed at northern midlatitude sites that are believed to represent baseline (here understood as representative of continental to hemispheric scales) conditions. Three chemistry-climate models (NCAR CAM-chem, GFDL-CM3, and GISS-E2-R) have calculated retrospective tropospheric O3 concentrations as part of the Atmospheric Chemistry and Climate Model Intercomparison Project and Coupled Model Intercomparison Project Phase 5 model intercomparisons. We present an approach for quantitative comparisons of model results with measurements for seasonally averaged O3 concentrations. There is considerable qualitative agreement between the measurements and the models, but there are also substantial and consistent quantitative disagreements. Most notably, models (1) overestimate absolute O3 mixing ratios, on average by 5 to 17 ppbv in the year 2000, (2) capture only 50% of O3 changes observed over the past five to six decades, and little of observed seasonal differences, and (3) capture 25 to 45% of the rate of change of the long-term changes. These disagreements are significant enough to indicate that only limited confidence can be placed on estimates of present-day radiative forcing of tropospheric O3 derived from modeled historic concentration changes and on predicted future O3 concentrations. Evidently our understanding of tropospheric O3, or the incorporation of chemistry and transport processes into current chemical climate models, is incomplete. Modeled O3 trends approximately parallel estimated trends in anthropogenic emissions of NOx, an important O3 precursor, while measured O3 changes increase more rapidly than these emission estimates.
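The quantitative comparison described here reduces, per site and season, to forming seasonal averages and estimating their long-term rate of change. A minimal sketch with synthetic monthly mixing ratios (trend and noise are invented):

    # Seasonal-mean trend estimation on synthetic monthly O3 data.
    import numpy as np

    years = np.arange(1960, 2001)
    rng = np.random.default_rng(0)
    monthly = 30.0 + 0.3 * (years[:, None] - 1960) + rng.normal(0, 2, (years.size, 12))

    summer = monthly[:, 5:8].mean(axis=1)            # JJA seasonal average (ppbv)
    slope, intercept = np.polyfit(years, summer, 1)  # least-squares long-term trend
    print("JJA trend: %.2f ppbv/decade" % (10 * slope))

The same slope computed from a model's sampled grid cell and from the station record gives the measured-versus-modeled rate-of-change comparison discussed above.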
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wan, Hui; Rasch, Philip J.; Zhang, Kai
2014-09-08
This paper explores the feasibility of an experimentation strategy for investigating sensitivities in fast components of atmospheric general circulation models. The basic idea is to replace the traditional serial-in-time long-term climate integrations by representative ensembles of shorter simulations. The key advantage of the proposed method lies in its efficiency: since fewer days of simulation are needed, the computational cost is less, and because individual realizations are independent and can be integrated simultaneously, the new dimension of parallelism can dramatically reduce the turnaround time in benchmark tests, sensitivity studies, and model tuning exercises. The strategy is not appropriate for exploring sensitivity of all model features, but it is very effective in many situations. Two examples are presented using the Community Atmosphere Model version 5. The first example demonstrates that the method is capable of characterizing the model cloud and precipitation sensitivity to time step length. A nudging technique is also applied to an additional set of simulations to help understand the contribution of physics-dynamics interaction to the detected time step sensitivity. In the second example, multiple empirical parameters related to cloud microphysics and aerosol lifecycle are perturbed simultaneously in order to explore which parameters have the largest impact on the simulated global mean top-of-atmosphere radiation balance. Results show that in both examples, short ensembles are able to correctly reproduce the main signals of model sensitivities revealed by traditional long-term climate simulations for fast processes in the climate system. The efficiency of the ensemble method makes it particularly useful for the development of high-resolution, costly and complex climate models.
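The statistical idea is easy to demonstrate on a toy fast process: an ensemble of short, independent segments estimates the same statistic as one long serial integration, and all members can run simultaneously. Below, an AR(1) series stands in for a fast model variable; all parameters are arbitrary.

    # Short independent segments vs. one long integration, on an AR(1) toy process.
    import numpy as np

    rng = np.random.default_rng(1)

    def ar1(n, phi=0.7):
        x = np.zeros(n)
        for t in range(1, n):
            x[t] = phi * x[t - 1] + rng.normal()
        return x

    long_run = ar1(100_000)                                  # "traditional" long integration
    short = np.array([ar1(400)[100:] for _ in range(300)])   # ensemble, spin-up discarded

    print("long-run variance      : %.3f" % long_run.var())  # both approach 1/(1-phi^2)
    print("short-ensemble variance: %.3f" % short.var())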
Mathieu, Jordane A; Hatté, Christine; Balesdent, Jérôme; Parent, Éric
2015-11-01
The response of soil carbon dynamics to climate and land-use change will affect both the future climate and the quality of ecosystems. Deep soil carbon (>20 cm) is the primary component of the soil carbon pool, but the dynamics of deep soil carbon remain poorly understood. Therefore, radiocarbon activity (Δ14C), which is a function of the age of carbon, may help to understand the rates of soil carbon biodegradation and stabilization. We analyzed the published 14C contents in 122 profiles of mineral soil that were well distributed in most of the large world biomes, except for the boreal zone. With a multivariate extension of a linear mixed-effects model whose inference was based on the parallel combination of two algorithms, the expectation-maximization (EM) and the Metropolis-Hastings algorithms, we expressed soil Δ14C profiles as a four-parameter function of depth. The four-parameter model produced insightful predictions of soil Δ14C as dependent on depth, soil type, climate, vegetation, land-use and date of sampling (R2=0.68). Further analysis with the model showed that the age of topsoil carbon was primarily affected by climate and cultivation. By contrast, the age of deep soil carbon was affected more by soil taxa than by climate and thus illustrated the strong dependence of soil carbon dynamics on other pedologic traits such as clay content and mineralogy. © 2015 John Wiley & Sons Ltd.
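As a hedged illustration of the Metropolis-Hastings half of the machinery, the sketch below fits a depth profile with a random-walk sampler. The exponential-in-depth functional form, the flat priors, and every constant are assumptions for illustration, not the paper's fitted four-parameter function.

    # Random-walk Metropolis fit of an assumed Delta-14C depth profile.
    import numpy as np

    rng = np.random.default_rng(2)
    z = np.linspace(0.0, 100.0, 21)                    # depth (cm)
    obs = 50.0 - 250.0 * (1 - np.exp(-z / 40.0)) + rng.normal(0, 15, z.size)

    def model(theta, z):
        a, b, c, s = theta                             # surface value, amplitude, e-folding depth, noise sd
        return a - b * (1 - np.exp(-z / c))

    def log_post(theta):                               # Gaussian likelihood, flat priors on valid region
        a, b, c, s = theta
        if c <= 0 or s <= 0:
            return -np.inf
        resid = obs - model(theta, z)
        return -0.5 * np.sum(resid**2) / s**2 - obs.size * np.log(s)

    theta, lp = np.array([0.0, 100.0, 20.0, 20.0]), None
    lp = log_post(theta)
    for it in range(20000):                            # includes burn-in; keep late draws in practice
        prop = theta + rng.normal(0.0, [5.0, 10.0, 2.0, 1.0])
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:       # Metropolis accept/reject
            theta, lp = prop, lp_prop
    print("one posterior draw:", np.round(theta, 2))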
Range-wide parallel climate-associated genomic clines in Atlantic salmon
Stanley, Ryan R. E.; Wringe, Brendan F.; Guijarro-Sabaniel, Javier; Bourret, Vincent; Bernatchez, Louis; Bentzen, Paul; Beiko, Robert G.; Gilbey, John; Clément, Marie; Bradbury, Ian R.
2017-01-01
Clinal variation across replicated environmental gradients can reveal evidence of local adaptation, providing insight into the demographic and evolutionary processes that shape intraspecific diversity. Using 1773 genome-wide single nucleotide polymorphisms we evaluated latitudinal variation in allele frequency for 134 populations of North American and European Atlantic salmon (Salmo salar). We detected 84 (4.74%) and 195 (11%) loci showing clinal patterns in North America and Europe, respectively, with 12 clinal loci in common between continents. Clinal single nucleotide polymorphisms were evenly distributed across the salmon genome and logistic regression revealed significant associations with latitude and seasonal temperatures, particularly average spring temperature in both continents. Loci displaying parallel clines were associated with several metabolic and immune functions, suggesting a potential basis for climate-associated adaptive differentiation. These climate-based clines collectively suggest evidence of large-scale environmental associated differences on either side of the North Atlantic. Our results support patterns of parallel evolution on both sides of the North Atlantic, with evidence of both similar and divergent underlying genetic architecture. The identification of climate-associated genomic clines illuminates the role of selection and demographic processes on intraspecific diversity in this species and provides a context in which to evaluate the impacts of climate change. PMID:29291123
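A per-locus cline test of this kind reduces to a binomial logistic regression of allele counts on latitude. The sketch below fits one synthetic locus by maximum likelihood; the cline slope and sample sizes are invented, and a real analysis would add multiple-testing control across loci.

    # Logistic regression of allele frequency on latitude (one synthetic locus).
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(3)
    lat = rng.uniform(44.0, 60.0, 50)                 # population latitudes
    n = rng.integers(20, 60, 50)                      # alleles sampled per population
    p_true = 1.0 / (1.0 + np.exp(-0.25 * (lat - 52.0)))
    k = rng.binomial(n, p_true)                       # derived-allele counts

    def nll(beta):                                    # binomial negative log-likelihood
        eta = np.clip(beta[0] + beta[1] * lat, -30.0, 30.0)
        p = np.clip(1.0 / (1.0 + np.exp(-eta)), 1e-12, 1 - 1e-12)
        return -np.sum(k * np.log(p) + (n - k) * np.log(1.0 - p))

    fit = minimize(nll, x0=np.zeros(2))
    print("cline slope per degree latitude: %.3f" % fit.x[1])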
Climate Model Ensemble Methodology: Rationale and Challenges
NASA Astrophysics Data System (ADS)
Vezer, M. A.; Myrvold, W.
2012-12-01
A tractable model of the Earth's atmosphere, or, indeed, any large, complex system, is inevitably unrealistic in a variety of ways. This will have an effect on the model's output. Nonetheless, we want to be able to rely on certain features of the model's output in studies aiming to detect, attribute, and project climate change. For this, we need assurance that these features reflect the target system, and are not artifacts of the unrealistic assumptions that go into the model. One technique for overcoming these limitations is to study ensembles of models which employ different simplifying assumptions and different methods of modelling. One then either takes as reliable certain outputs on which models in the ensemble agree, or takes the average of these outputs as the best estimate. Since the Intergovernmental Panel on Climate Change's Fourth Assessment Report (IPCC AR4), modellers have aimed to improve ensemble analysis by developing techniques to account for dependencies among models, and to ascribe unequal weights to models according to their performance. The goal of this paper is to present as clearly and cogently as possible the rationale for climate model ensemble methodology, the motivation of modellers to account for model dependencies, and their efforts to ascribe unequal weights to models. The method of our analysis is as follows. We will consider a simpler, well-understood case of taking the mean of a number of measurements of some quantity. Contrary to what is sometimes said, it is not a requirement of this practice that the errors of the component measurements be independent; one must, however, compensate for any lack of independence. We will also extend the usual accounts to include cases of unknown systematic error. We draw parallels between this simpler illustration and the more complex example of climate model ensembles, detailing how ensembles can provide more useful information than any of their constituent models. This account emphasizes the epistemic importance of considering degrees of model dependence, and the practice of ascribing unequal weights to models of unequal skill.
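The measurement analogy can be made concrete. Given estimates y of one quantity with error covariance Sigma, the best linear unbiased combination uses weights proportional to Sigma^-1 applied to the ones vector; this both assigns unequal weights by skill (the variances) and compensates for dependence (the covariances). A small numerical sketch with invented numbers:

    # Covariance-aware (GLS) weighted mean of three dependent estimates.
    import numpy as np

    y = np.array([1.2, 0.9, 1.1])                   # three estimates of one quantity
    Sigma = np.array([[0.04, 0.02, 0.00],           # first two share correlated errors
                      [0.02, 0.04, 0.00],
                      [0.00, 0.00, 0.09]])

    w = np.linalg.solve(Sigma, np.ones(3))
    w /= w.sum()                                    # unequal, dependence-compensating weights
    mu = w @ y
    var = 1.0 / (np.ones(3) @ np.linalg.solve(Sigma, np.ones(3)))
    print("weighted mean %.3f, variance %.4f, weights %s" % (mu, var, np.round(w, 3)))

Note how the correlated pair is jointly downweighted relative to naive inverse-variance weighting, which is exactly the compensation for non-independence discussed above.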
NASA Astrophysics Data System (ADS)
Hernandez, M.; Ummenhofer, C.; Anchukaitis, K. J.
2014-12-01
The Asian monsoon system influences the lives of over 60% of the planet's population, with widespread socioeconomic effects resulting from weakening or failure of monsoon rains. Spatially broad and temporally extended drought episodes have been known to dramatically influence human history, including the Strange Parallels Drought in the mid-18th century. Here, we explore the dynamics of sustained monsoon failure using the Monsoon Asia Drought Atlas (MADA) - a high-resolution network of hydro-climatically sensitive tree-ring records - and a 1300-year pre-industrial control run of the Community Earth System Model (CESM). Spatial drought patterns in the instrumental and model-based Palmer Drought Severity Index (PDSI) during years with extremely weakened South Asian monsoon are similar to those reconstructed during the Strange Parallels Drought in the MADA. We further explore how the large-scale Indo-Pacific climate during weakened South Asian monsoon differs between interannual and decadal timescales. The Strange Parallels Drought pattern is observed during March-April-May primarily over Southeast Asia, with decreased precipitation and reduced moisture fluxes, while anomalies in June-July-August are confined to the Indian subcontinent during both individual and decadal events. Individual years with anomalous drying exhibit canonical El Niño conditions over the eastern equatorial Pacific and associated shifts in the Walker circulation, while decadal events appear to be related to anomalous warming around the dateline in the equatorial Pacific, typical of El Niño Modoki events. The results suggest different dynamical processes influence drought at different time scales through distinct remote ocean influences.
Present, Future, and Novel Bioclimates of the San Francisco, California Region
Torregrosa, Alicia; Taylor, Maxwell D.; Flint, Lorraine E.; Flint, Alan L.
2013-01-01
Bioclimates are syntheses of climatic variables into biologically relevant categories that facilitate comparative studies of biotic responses to climate conditions. Isobioclimates, unique combinations of bioclimatic indices (continentality, ombrotype, and thermotype), were constructed for northern California coastal ranges based on the Rivas-Martinez worldwide bioclimatic classification system for the end of the 20th century climatology (1971–2000) and end of the 21st century climatology (2070–2099) using two models, Geophysical Fluid Dynamics Laboratory (GFDL) model and the Parallel Climate Model (PCM), under the medium-high A2 emission scenario. The digitally mapped results were used to 1) assess the relative redistribution of isobioclimates and their magnitude of change, 2) quantify the loss of isobioclimates into the future, 3) identify and locate novel isobioclimates projected to appear, and 4) explore compositional change in vegetation types among analog isobioclimate patches. This study used downscaled climate variables to map the isobioclimates at a fine spatial resolution (270 m grid cells). Common to both models of future climate was a large change in thermotype. Changes in ombrotype differed among the two models. The end of 20th century climatology has 83 isobioclimates covering the 63,000 km2 study area. In both future projections 51 of those isobioclimates disappear over 40,000 km2. The ordination of vegetation-bioclimate relationships shows very strong correlation of Rivas-Martinez indices with vegetation distribution and composition. Comparisons of vegetation composition among analog patches suggest that vegetation change will be a local rearrangement of species already in place rather than one requiring long distance dispersal. The digitally mapped results facilitate comparison with other Mediterranean regions. Major remaining challenges include predicting vegetation composition of novel isobioclimates and developing metrics to compare differences in climate space. PMID:23526985
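For orientation, the indices named here are simple functions of monthly climate normals. The sketch below computes simplified stand-ins for continentality and the ombrothermic index from invented monthly means; the full Rivas-Martinez system defines these more precisely and adds thermotype classes.

    # Simplified bioclimatic indices from invented monthly normals.
    import numpy as np

    t = np.array([8, 9, 11, 13, 15, 17, 19, 19, 18, 15, 11, 8], dtype=float)   # deg C
    p = np.array([110, 95, 80, 40, 20, 8, 2, 3, 10, 40, 80, 110], dtype=float) # mm

    Ic = t.max() - t.min()            # continentality: annual range of monthly mean temperature
    Tp = 10.0 * t[t > 0].sum()        # positive temperature, tenths of a degree
    Pp = p[t > 0].sum()               # precipitation of months with mean T > 0
    Io = 10.0 * Pp / Tp               # ombrothermic index, binned into ombrotypes
    print("continentality Ic = %.1f, ombrothermic Io = %.2f" % (Ic, Io))

An isobioclimate is then just the joint class label of such indices per grid cell, which is what makes the mapping exercise above a cell-by-cell classification of downscaled climatologies.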
ISI-MIP: The Inter-Sectoral Impact Model Intercomparison Project
NASA Astrophysics Data System (ADS)
Huber, V.; Dahlemann, S.; Frieler, K.; Piontek, F.; Schewe, J.; Serdeczny, O.; Warszawski, L.
2013-12-01
The Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP) aims to synthesize the state-of-the-art knowledge of climate change impacts at different levels of global warming. The project's experimental design is formulated to distinguish the uncertainty introduced by the impact models themselves, from the inherent uncertainty in the climate projections and the variety of plausible socio-economic futures. The unique cross-sectoral scope of the project provides the opportunity to study cascading effects of impacts in interacting sectors and to identify regional 'hot spots' where multiple sectors experience extreme impacts. Another emphasis lies on the development of novel metrics to describe societal impacts of a warmer climate. We briefly outline the methodological framework, and then present selected results of the first, fast-tracked phase of ISI-MIP. The fast track brought together 35 global impact models internationally, spanning five sectors across human society and the natural world (agriculture, water, natural ecosystems, health and coastal infrastructure), and using the latest generation of global climate simulations (RCP projections from the CMIP5 archive) and socioeconomic drivers provided within the SSP process. We also introduce the second phase of the project, which will enlarge the scope of ISI-MIP by encompassing further impact sectors (e.g., forestry, fisheries, permafrost) and regional modeling approaches. The focus for the next round of simulations will be the validation and improvement of models based on historical observations and the analysis of variability and extreme events. Last but not least, we discuss the longer-term objective of ISI-MIP to initiate a coordinated, ongoing impact assessment process, driven by the entire impact community and in parallel with well-established climate model intercomparisons (CMIP).
Parallel Semi-Implicit Spectral Element Atmospheric Model
NASA Astrophysics Data System (ADS)
Fournier, A.; Thomas, S.; Loft, R.
2001-05-01
The shallow-water equations (SWE) have long been used to test atmospheric-modeling numerical methods. The SWE contain essential wave-propagation and nonlinear effects of more complete models. We present a semi-implicit (SI) improvement of the Spectral Element Atmospheric Model to solve the SWE (SEAM, Taylor et al. 1997, Fournier et al. 2000, Thomas & Loft 2000). SE methods are h-p finite element methods combining the geometric flexibility of size-h finite elements with the accuracy of degree-p spectral methods. Our work suggests that exceptional parallel-computation performance is achievable by a General-Circulation-Model (GCM) dynamical core, even at modest climate-simulation resolutions (>1°). The code derivation involves weak variational formulation of the SWE, Gauss(-Lobatto) quadrature over the collocation points, and Legendre cardinal interpolators. Appropriate weak variation yields a symmetric positive-definite Helmholtz operator. To meet the Ladyzhenskaya-Babuska-Brezzi inf-sup condition and avoid spurious modes, we use a staggered grid. The SI scheme combines leapfrog and Crank-Nicolson schemes for the nonlinear and linear terms, respectively. The localization of operations to elements ideally fits the method to cache-based microprocessor architectures: derivatives are computed as collections of small (8x8), naturally cache-blocked matrix-vector products. SEAM also has desirable boundary-exchange communication, like finite-difference models. Timings on the IBM SP and Compaq ES40 supercomputers indicate that the SI code (20-min timestep) requires 1/3 the CPU time of the explicit code (2-min timestep) at T42 resolution. Both codes scale nearly linearly out to 400 processors. We achieved single-processor performance up to 30% of peak for both codes on the 375-MHz IBM Power-3 processors. Fast computation and linear scaling lead to a useful climate-simulation dycore only if enough model time is computed per unit wall-clock time. An efficient SI solver is essential to substantially increase this rate. Parallel preconditioning for an iterative conjugate-gradient elliptic solver is described. We are building a GCM dycore capable of 200 GFLOPS sustained performance on clustered RISC/cache architectures using hybrid MPI/OpenMP programming.
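The semi-implicit combination named here has a standard generic form: advance the nonlinear terms N by leapfrog and the linear gravity-wave terms L by Crank-Nicolson,

    \frac{u^{n+1} - u^{n-1}}{2\Delta t} = N(u^{n}) + \frac{1}{2} L\left(u^{n+1} + u^{n-1}\right)
    \quad\Longrightarrow\quad
    \left(I - \Delta t\, L\right) u^{n+1} = \left(I + \Delta t\, L\right) u^{n-1} + 2\Delta t\, N(u^{n}),

so each step costs one elliptic solve, and eliminating the velocity increment from the implicit system is what produces the symmetric positive-definite Helmholtz problem for the height field mentioned above. This is the textbook form of the scheme, not necessarily SEAM's exact formulation; the payoff is that the timestep is limited by the slow advective motions rather than the fast gravity waves, consistent with the 20-minute versus 2-minute timesteps quoted in the abstract.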
NASA Astrophysics Data System (ADS)
Chen, Xiuhong; Huang, Xianglei; Jiao, Chaoyi; Flanner, Mark G.; Raeker, Todd; Palen, Brock
2017-01-01
The suites of numerical models used for simulating climate of our planet are usually run on dedicated high-performance computing (HPC) resources. This study investigates an alternative to the usual approach, i.e. carrying out climate model simulations in a commercially available cloud computing environment. We test the performance and reliability of running the CESM (Community Earth System Model), a flagship climate model in the United States developed by the National Center for Atmospheric Research (NCAR), on Amazon Web Service (AWS) EC2, the cloud computing environment by Amazon.com, Inc. StarCluster is used to create a virtual computing cluster on the AWS EC2 for the CESM simulations. The wall-clock time for one year of CESM simulation on the AWS EC2 virtual cluster is comparable to the time spent for the same simulation on a local dedicated high-performance computing cluster with InfiniBand connections. The CESM simulation can be efficiently scaled with the number of CPU cores on the AWS EC2 virtual cluster environment up to 64 cores. For the standard configuration of the CESM at a spatial resolution of 1.9° latitude by 2.5° longitude, increasing the number of cores from 16 to 64 reduces the wall-clock running time by more than 50% and the scaling is nearly linear. Beyond 64 cores, the communication latency starts to outweigh the benefit of distributed computing and the parallel speedup becomes nearly unchanged.
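The reported scaling behaviour, near-linear speedup up to 64 cores and little or no gain beyond, is what a simple compute-plus-communication cost model produces once per-core communication costs grow with core count. The constants below are illustrative, not EC2 measurements:

    # Toy cost model: compute time shrinks with cores, communication grows.
    def wallclock(cores, compute=3600.0, comm_per_core=0.6):
        return compute / cores + comm_per_core * cores

    for c in (16, 32, 64, 128):
        print("%4d cores -> %6.0f s" % (c, wallclock(c)))

With these invented constants the 16-to-64-core step cuts wall-clock time by more than half, while 128 cores is already slower than 64, mirroring the saturation described above.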
Visualization and Analysis of Climate Simulation Performance Data
NASA Astrophysics Data System (ADS)
Röber, Niklas; Adamidis, Panagiotis; Behrens, Jörg
2015-04-01
Visualization is the key process of transforming abstract (scientific) data into a graphical representation, to aid in the understanding of the information hidden within the data. Climate simulation data sets are typically quite large, time varying, and consist of many different variables sampled on an underlying grid. A large variety of climate models - and sub models - exist to simulate various aspects of the climate system. Generally, one is mainly interested in the physical variables produced by the simulation runs, but model developers are also interested in performance data measured along with these simulations. Climate simulation models are carefully developed complex software systems, designed to run in parallel on large HPC systems. An important goal thereby is to utilize the entire hardware as efficiently as possible, that is, to distribute the workload as evenly as possible among the individual components. This is a very challenging task, and detailed performance data, such as timings, cache misses, etc., have to be used to locate and understand performance problems in order to optimize the model implementation. Furthermore, the correlation of performance data to the processes of the application and the sub-domains of the decomposed underlying grid is vital when addressing communication and load imbalance issues. High resolution climate simulations are carried out on tens to hundreds of thousands of cores, thus yielding a vast amount of profiling data, which cannot be analyzed without appropriate visualization techniques. This PICO presentation displays and discusses the ICON simulation model, which is jointly developed by the Max Planck Institute for Meteorology and the German Weather Service and in partnership with DKRZ. The visualization and analysis of the model's performance data allows us to optimize and fine tune the model, as well as to understand its execution on the HPC system. We show and discuss our workflow, as well as present new ideas and solutions that greatly aided our understanding. The software employed is based on Avizo Green, ParaView and SimVis, as well as our own software extensions.
Dettinger, M.D.; Cayan, D.R.; Meyer, M.K.; Jeton, A.
2004-01-01
Hydrologic responses of river basins in the Sierra Nevada of California to historical and future climate variations and changes are assessed by simulating daily streamflow and water-balance responses to simulated climate variations over a continuous 200-yr period. The coupled atmosphere-ocean-ice-land Parallel Climate Model provides the simulated climate histories, and existing hydrologic models of the Merced, Carson, and American Rivers are used to simulate the basin responses. The historical simulations yield stationary climate and hydrologic variations through the first part of the 20th century until about 1975 when temperatures begin to warm noticeably and when snowmelt and streamflow peaks begin to occur progressively earlier within the seasonal cycle. A future climate simulated with business-as-usual increases in greenhouse-gas and aerosol radiative forcings continues those recent trends through the 21st century with an attendant +2.5 °C warming and a hastening of snowmelt and streamflow within the seasonal cycle by almost a month. The various projected trends in the business-as-usual simulations become readily visible despite realistic simulated natural climatic and hydrologic variability by about 2025. In contrast to these changes that are mostly associated with streamflow timing, long-term average totals of streamflow and other hydrologic fluxes remain similar to the historical mean in all three simulations. A control simulation in which radiative forcings are held constant at 1995 levels for the 50 years following 1995 yields climate and streamflow timing conditions much like the 1980s and 1990s throughout its duration. The availability of continuous climate-change projection outputs and careful design of initial conditions and control experiments, like those utilized here, promise to improve the quality and usability of future climate-change impact assessments.
Cultural and climatic changes shape the evolutionary history of the Uralic languages.
Honkola, T; Vesakoski, O; Korhonen, K; Lehtinen, J; Syrjänen, K; Wahlberg, N
2013-06-01
Quantitative phylogenetic methods have been used to study the evolutionary relationships and divergence times of biological species, and recently, these have also been applied to linguistic data to elucidate the evolutionary history of language families. In biology, the factors driving macroevolutionary processes are assumed to be either mainly biotic (the Red Queen model) or mainly abiotic (the Court Jester model) or a combination of both. The applicability of these models is assumed to depend on the temporal and spatial scale observed as biotic factors act on species divergence faster and in smaller spatial scale than the abiotic factors. Here, we used the Uralic language family to investigate whether both 'biotic' interactions (i.e. cultural interactions) and abiotic changes (i.e. climatic fluctuations) are also connected to language diversification. We estimated the times of divergence using Bayesian phylogenetics with a relaxed-clock method and related our results to climatic, historical and archaeological information. Our timing results paralleled the previous linguistic studies but suggested a later divergence of Finno-Ugric, Finnic and Saami languages. Some of the divergences co-occurred with climatic fluctuation and some with cultural interaction and migrations of populations. Thus, we suggest that both 'biotic' and abiotic factors contribute either directly or indirectly to the diversification of languages and that both models can be applied when studying language evolution. © 2013 The Authors. Journal of Evolutionary Biology © 2013 European Society For Evolutionary Biology.
Decadal climate prediction in the large ensemble limit
NASA Astrophysics Data System (ADS)
Yeager, S. G.; Rosenbloom, N. A.; Strand, G.; Lindsay, K. T.; Danabasoglu, G.; Karspeck, A. R.; Bates, S. C.; Meehl, G. A.
2017-12-01
In order to quantify the benefits of initialization for climate prediction on decadal timescales, two parallel sets of historical simulations are required: one "initialized" ensemble that incorporates observations of past climate states and one "uninitialized" ensemble whose internal climate variations evolve freely and without synchronicity. In the large ensemble limit, ensemble averaging isolates potentially predictable forced and internal variance components in the "initialized" set, but only the forced variance remains after averaging the "uninitialized" set. The ensemble size needed to achieve this variance decomposition, and to robustly distinguish initialized from uninitialized decadal predictions, remains poorly constrained. We examine a large ensemble (LE) of initialized decadal prediction (DP) experiments carried out using the Community Earth System Model (CESM). This 40-member CESM-DP-LE set of experiments represents the "initialized" complement to the CESM large ensemble of 20th century runs (CESM-LE) documented in Kay et al. (2015). Both simulation sets share the same model configuration, historical radiative forcings, and large ensemble sizes. The twin experiments afford an unprecedented opportunity to explore the sensitivity of DP skill assessment, and in particular the skill enhancement associated with initialization, to ensemble size. This talk will highlight the benefits of a large ensemble size for initialized predictions of seasonal climate over land in the Atlantic sector as well as predictions of shifts in the likelihood of climate extremes that have large societal impact.
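The variance decomposition can be sketched numerically: averaging a toy "initialized" ensemble, whose internal mode is phase-locked across members, retains forced plus internal variance, while averaging an "uninitialized" ensemble with randomly phased internal variability leaves approximately the forced component alone. All series below are synthetic.

    # Ensemble-mean variance for phased vs. randomly phased internal variability.
    import numpy as np

    rng = np.random.default_rng(4)
    t = np.arange(60)                                # years
    forced = 0.02 * t                                # common forced trend
    internal = np.sin(2 * np.pi * t / 12.0)          # a decadal internal mode

    members = 40
    init = forced + internal + rng.normal(0, 0.3, (members, t.size))       # phase-locked
    uninit = forced + np.array([np.roll(internal, rng.integers(t.size))    # random phases
                                for _ in range(members)]) \
                    + rng.normal(0, 0.3, (members, t.size))

    print("initialized ensemble-mean variance  : %.3f" % init.mean(0).var())   # forced + internal
    print("uninitialized ensemble-mean variance: %.3f" % uninit.mean(0).var()) # ~ forced only

Repeating the calculation with subsets of members shows how noisy the decomposition becomes at small ensemble sizes, which is the sensitivity question the twin 40-member experiments are designed to answer.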
Modeling the Climatic Consequences of Geoengineering
NASA Astrophysics Data System (ADS)
Somerville, R. C.
2005-12-01
The last half-century has seen the development of physically comprehensive computer models of the climate system. These models are the primary tool for making predictions of climate change due to human activities, such as emitting greenhouse gases into the atmosphere. Because scientific understanding of the climate system is incomplete, however, any climate model will necessarily have imperfections. The inevitable uncertainties associated with these models have sometimes been cited as reasons for not taking action to reduce such emissions. Climate models could certainly be employed to predict the results of various attempts at geoengineering, but many questions would arise. For example, in considering proposals to increase the planetary reflectivity by brightening parts of the land surface or by orbiting mirrors, can models be used to bound the results and to warn of unintended consequences? How could confidence limits be placed on such model results? How can climate changes due to proposed geoengineering be distinguished from natural variability? There are historical parallels on smaller scales, in which models have been employed to predict the results of attempts to alter the weather, such as the use of cloud seeding for precipitation enhancement, hail suppression and hurricane modification. However, there are also many lessons to be learned from the recent record of using models to simulate the effects of the great unintended geoengineering experiment involving greenhouse gases, now in progress. In this major research effort, the same types of questions have been studied at length. The best modern models have demonstrated an impressive ability to predict some aspects of climate change. A large body of evidence has already accumulated through comparing model predictions to many observed aspects of recent climate change, ranging from increases in ocean heat content to changes in atmospheric water vapor to reductions in glacier extent. The preponderance of expert opinion is that this evidence is now sufficient to establish the human cause of much recent climate change. Nevertheless, no model can provide detailed and fully trustworthy answers to every possible question of interest. As an example, how will the climatology of Atlantic hurricanes change as the greenhouse effect becomes stronger? Can models reliably forecast changes in the length of the hurricane season or changes in the geographical regions affected by hurricanes? The answer is no, or at least, not yet. Additionally, climate models are not based entirely on first principles, such as Newtonian physics. Instead, they have been developed primarily to simulate the present climate and relatively small departures from it. To achieve this goal, a certain amount of empiricism has been built into the models. The result has sometimes been to increase the apparent realism of models at the cost of limiting their generality. Thus, the available climate models may well be less capable of simulating a geoengineering experiment that might lead to a radically different climate. New model development may be required for this new application. The challenge is to distinguish between what models can and cannot do well. It would be irresponsible and unethical either to undertake geoengineering projects without modeling their consequences, or to place blind faith in the models. To decide how best to model a proposed geoengineering technique requires a deep understanding of the strengths and weaknesses of climate models.
The history of modeling successes and failures is a valuable guide to the wise interpretation of model results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gittens, Alex; Devarakonda, Aditya; Racah, Evan
We explore the trade-offs of performing linear algebra using Apache Spark, compared to traditional C and MPI implementations on HPC platforms. Spark is designed for data analytics on cluster computing platforms with access to local disks and is optimized for data-parallel tasks. We examine three widely-used and important matrix factorizations: NMF (for physical plausibility), PCA (for its ubiquity) and CX (for data interpretability). We apply these methods to 1.6TB particle physics, 2.2TB and 16TB climate modeling and 1.1TB bioimaging data. The data matrices are tall-and-skinny, which enables the algorithms to map conveniently into Spark's data-parallel model. We perform scaling experiments on up to 1600 Cray XC40 nodes, describe the sources of slowdowns, and provide tuning guidance to obtain high performance.
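Why tall-and-skinny shapes map well onto a data-parallel engine: the Gram matrix A^T A is a sum of small per-row-block contributions, i.e. a map followed by a reduce, after which the eigenproblem is tiny. A serial NumPy emulation of that pattern (sizes arbitrary; mean-centering omitted for brevity, so this is PCA of the uncentered second-moment matrix):

    # Tall-and-skinny PCA via blockwise Gram-matrix accumulation (map-reduce shape).
    import numpy as np

    rng = np.random.default_rng(5)
    blocks = [rng.normal(size=(10_000, 50)) for _ in range(8)]   # row partitions of A

    gram = sum(b.T @ b for b in blocks)           # map: b.T @ b per block; reduce: sum
    eigvals, eigvecs = np.linalg.eigh(gram)       # small 50x50 eigenproblem, done anywhere
    top_pc = eigvecs[:, -1]                       # leading principal axis of A^T A
    print("top eigenvalue: %.1f" % eigvals[-1])

In Spark the per-block products would be executed by workers against cached partitions, with only 50x50 matrices crossing the network, which is the locality property the paper exploits.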
NASA Astrophysics Data System (ADS)
Liu, Yongwen; Wang, Tao; Huang, Mengtian; Yao, Yitong; Ciais, Philippe; Piao, Shilong
2016-03-01
Terrestrial carbon fluxes are sensitive to climate change, but the interannual climate sensitivity of the land carbon cycle can also change with time. We analyzed the changes in responses of net biome production (NBP), net primary production (NPP), and heterotrophic respiration (Rh) to interannual climate variations over the 21st century in the Earth System Models (ESMs) from the Coupled Model Intercomparison Project Phase 5. Under Representative Concentration Pathway (RCP) 4.5, interannual temperature sensitivities of NBP (γTempNBP), NPP (γTempNPP), and Rh (γTempRh) remain relatively stable at global scale, yet with large differences among ESMs and spatial heterogeneity. Modeled γTempNPP and γTempRh appear to increase in parallel in boreal regions, resulting in unchanged γTempNBP. Tropical γTempNBP decreases in most models, due to decreasing γTempNPP and relatively stable γTempRh. Across models, the changes in γTempNBP can be mainly explained by changes in γTempNPP rather than changes in γTempRh, at both global and regional scales. Interannual precipitation sensitivities of global NBP (γPrecNBP), NPP (γPrecNPP), and Rh (γPrecRh) are predicted not to change significantly, with large differences among ESMs. Across models, the changes in γPrecNBP can be mainly explained by changes in γPrecNPP rather than changes in γPrecRh in temperate regions, but not in other regions. Changes in the interannual climate sensitivities of carbon fluxes are consistent across RCPs 4.5, 6.0, and 8.5 but larger in more intensive scenarios. More effort should be devoted to improving modeled terrestrial carbon flux responses to interannual climate variability, e.g., by incorporating biogeochemical processes of nutrient limitation, permafrost dynamics, and microbial decomposition.
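Operationally, a sensitivity such as γTempNBP is the regression slope between detrended interannual anomalies of the two series. A minimal sketch on synthetic data (the sensitivity of -1.5 used to generate the series is invented):

    # Interannual climate sensitivity as a detrended regression slope.
    import numpy as np

    rng = np.random.default_rng(6)
    yrs = np.arange(2006, 2100)
    detrend = lambda y: y - np.poly1d(np.polyfit(yrs, y, 1))(yrs)

    T = 0.03 * (yrs - 2006) + rng.normal(0.0, 0.2, yrs.size)   # warming trend + variability
    nbp = -1.5 * detrend(T) + rng.normal(0.0, 0.3, yrs.size)   # assumed true sensitivity -1.5

    gamma = np.polyfit(detrend(T), detrend(nbp), 1)[0]
    print("gamma_Temp^NBP = %.2f per K" % gamma)

Computing the slope in moving windows along a 21st-century run is then how a change in the sensitivity itself, the quantity of interest above, would be detected.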
NASA Astrophysics Data System (ADS)
Hori, Y.; Cheng, V. Y. S.; Gough, W. A.
2017-12-01
A network of winter roads in northern Canada connects a number of remote First Nations communities to all-season roads and rails. The extent of the winter road networks depends on geographic features, socio-economic activities, and the number of remote First Nations communities, and so differs among the provinces. The most extensive winter road networks south of the 60th parallel are located in Ontario and Manitoba, serving 32 and 18 communities respectively. In recent years, a warmer climate has resulted in a shorter winter road season and an increase in unreliable road conditions, thus limiting access to remote communities. This study focused on examining future freezing degree-day (FDD) accumulations during the winter road season at selected locations throughout Ontario's Far North and northern Manitoba, using recent climate model projections from multi-model ensembles of General Circulation Models (GCMs) under the Representative Concentration Pathway (RCP) scenarios. First, the non-parametric Mann-Kendall correlation test and the Theil-Sen method were used to identify any statistically significant trends between FDDs and time for the base period (1981-2010). Second, future climate scenarios were developed for the study areas using statistical downscaling methods. This study also examined the lowest threshold of FDDs for winter road construction in a future period. Our previous study established a lowest threshold of 380 FDDs, derived from the relationship between FDDs and the opening dates of the James Bay Winter Road near the Hudson-James Bay coast. This study applied that threshold as a conservative estimate of the minimum FDDs required, in order to examine the effects of climate change on the winter road construction period.
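For readers unfamiliar with the FDD metric, a short sketch of the bookkeeping may help: FDDs accumulate the magnitude of below-freezing daily mean temperatures over a season, and the 380-FDD value above acts as a minimum threshold for road construction. The season window and synthetic temperatures in this illustration are assumptions, not the study's data.

```python
# Sketch of freezing degree-day (FDD) accumulation and a threshold check.
import numpy as np

def freezing_degree_days(daily_mean_temp_c):
    """Sum of |T| over days with mean temperature below 0 degC."""
    t = np.asarray(daily_mean_temp_c, dtype=float)
    return float(-t[t < 0.0].sum())

def road_construction_feasible(daily_mean_temp_c, threshold_fdd=380.0):
    return freezing_degree_days(daily_mean_temp_c) >= threshold_fdd

# Example: a synthetic November-March season (151 days)
rng = np.random.default_rng(2)
season = rng.normal(loc=-8.0, scale=7.0, size=151)
print(freezing_degree_days(season), road_construction_feasible(season))
```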
NASA Technical Reports Server (NTRS)
Parrish, D. D.; Lamarque, J.-F.; Naik, V.; Horowitz, L.; Shindell, D. T.; Staehelin, J.; Derwent, R.; Cooper, O. R.; Tanimoto, H.; Volz-Thomas, A.;
2014-01-01
Two recent papers have quantified long-term ozone (O3) changes observed at northern midlatitude sites that are believed to represent baseline (here understood as representative of continental to hemispheric scales) conditions. Three chemistry-climate models (NCAR CAM-chem, GFDL-CM3, and GISS-E2-R) have calculated retrospective tropospheric O3 concentrations as part of the Atmospheric Chemistry and Climate Model Intercomparison Project and Coupled Model Intercomparison Project Phase 5 model intercomparisons. We present an approach for quantitative comparisons of model results with measurements for seasonally averaged O3 concentrations. There is considerable qualitative agreement between the measurements and the models, but there are also substantial and consistent quantitative disagreements. Most notably, models (1) overestimate absolute O3 mixing ratios, on average by approximately 5 to 17 ppbv in the year 2000, (2) capture only approximately 50% of the O3 changes observed over the past five to six decades, and little of the observed seasonal differences, and (3) capture only approximately 25 to 45% of the rate of the observed long-term changes. These disagreements are significant enough to indicate that only limited confidence can be placed on estimates of present-day radiative forcing of tropospheric O3 derived from modeled historic concentration changes, and on predicted future O3 concentrations. Evidently our understanding of tropospheric O3, or the incorporation of chemistry and transport processes into current chemistry-climate models, is incomplete. Modeled O3 trends approximately parallel estimated trends in anthropogenic emissions of NOx, an important O3 precursor, while measured O3 changes increase more rapidly than these emission estimates.
DYNAMICO, an atmospheric dynamical core for high-performance climate modeling
NASA Astrophysics Data System (ADS)
Dubos, Thomas; Meurdesoif, Yann; Spiga, Aymeric; Millour, Ehouarn; Fita, Lluis; Hourdin, Frédéric; Kageyama, Masa; Traore, Abdoul-Khadre; Guerlet, Sandrine; Polcher, Jan
2017-04-01
The Institut Pierre Simon Laplace has developed a very scalable atmospheric dynamical core, DYNAMICO, based on energy-conserving finite-difference/finite-volume numerics on a quasi-uniform icosahedral-hexagonal mesh. Scalability is achieved by combining hybrid MPI/OpenMP parallelism with asynchronous I/O. This dynamical core has been coupled to radiative transfer physics tailored to the atmosphere of Saturn, allowing unprecedented simulations of the climate of this giant planet. For terrestrial climate studies DYNAMICO is being integrated into the IPSL Earth System Model IPSL-CM. Preliminary aquaplanet and AMIP-style simulations yield reasonable results when compared to outputs from IPSL-CM5. The observed performance suggests that an order of magnitude may be gained with respect to IPSL-CM CMIP5 simulations, either in the duration of simulations or in their resolution. Longer simulations would be of interest for the study of paleoclimate, while higher resolution could improve certain aspects of the modeled climate, such as extreme events, as will be explored in the HighResMIP project. Following IPSL's strategic vision of building a unified global-regional modelling system, a fully-compressible, non-hydrostatic prototype of DYNAMICO has been developed, enabling future convection-resolving simulations. Work supported by ANR project "HEAT", grant number CE23_2014_HEAT. Reference: Dubos, T., Dubey, S., Tort, M., Mittal, R., Meurdesoif, Y., and Hourdin, F.: DYNAMICO-1.0, an icosahedral hydrostatic dynamical core designed for consistency and versatility, Geosci. Model Dev., 8, 3131-3150, doi:10.5194/gmd-8-3131-2015, 2015.
Reliability of regional climate simulations
NASA Astrophysics Data System (ADS)
Ahrens, W.; Block, A.; Böhm, U.; Hauffe, D.; Keuler, K.; Kücken, M.; Nocke, Th.
2003-04-01
Quantification of uncertainty is increasingly a key issue in assessing the trustworthiness of future climate scenarios. In addition to mean conditions, climate impact modelers focus in particular on extremes. Before generating such scenarios using, e.g., dynamic regional climate models, a careful validation of present-day simulations should be performed to determine the range of errors for the quantities of interest under recent conditions, as a raw estimate of their uncertainty in the future. Often, multiple aspects must be covered together, and the required simulation accuracy depends on the user's demands. In our approach, a massively parallel regional climate model shall be used on the one hand to generate "long-term" high-resolution climate scenarios for several decades, and on the other hand to provide very high-resolution ensemble simulations of future dry spells or heavy rainfall events. To diagnose the model's performance for present-day simulations, we have recently developed and tested a first version of a validation and visualization chain for this model. It is, however, applicable in a much more general sense and could be used as a common test bed for any regional climate model aiming at this type of simulation. Depending on the user's interest, integrated quality measures can be derived for near-surface parameters using multivariate techniques and multidimensional distance measures in a first step. At this point, advanced visualization techniques have been developed and included to allow for visual data mining and to qualitatively identify dominating aspects and regularities. Univariate techniques that are especially designed to assess climatic aspects in terms of statistical properties can then be used to quantitatively diagnose the error contributions of the individual parameters used. Finally, a comprehensive in-depth diagnosis tool allows investigation of why the model produces the obtained near-surface results, to answer the question of whether the model performs well from the modeler's point of view. Examples will be presented for results obtained using this approach for assessing the risk of potential total agricultural yield loss under drought conditions in Northeast Brazil and for evaluating simulation results for a 10-year period for Europe. To support multi-run simulations and result evaluation, the model will in one of the next steps be embedded into an already existing simulation environment that provides further postprocessing tools for sensitivity studies, behavioral analysis and Monte-Carlo simulations, but also for ensemble scenario analysis.
Public Understanding of Climate Change: Certainty and Willingness To Act.
ERIC Educational Resources Information Center
Fortner, Rosanne W.; Lee, Jae-Young; Corney, Jeffrey R.; Romanello, Samantha; Bonnell, Joseph; Luthy, Brian; Figuerido, Claudia; Ntsiko, Nyathi
2000-01-01
Describes two parallel studies conducted shortly before the Kyoto conference on climate change: (1) an examination of media portrayals of global warming and the certainty with which information was reported; and (2) a telephone survey to assess public knowledge and attitudes about global climate change. Findings do not support a hypothesis that…
Parallel Work of CO2 Ejectors Installed in a Multi-Ejector Module of Refrigeration System
NASA Astrophysics Data System (ADS)
Bodys, Jakub; Palacz, Michal; Haida, Michal; Smolka, Jacek; Nowak, Andrzej J.; Banasiak, Krzysztof; Hafner, Armin
2016-09-01
A performance analysis of fixed ejectors installed in a multi-ejector module in a CO2 refrigeration system is presented in this study. The serial and parallel operation of the four fixed-geometry units that compose the multi-ejector pack was investigated. The numerical simulations were performed with a validated Homogeneous Equilibrium Model (HEM). The computational tool ejectorPL was used in all tests for typical transcritical parameters at the motive nozzle. A wide range of operating conditions for supermarket applications in three different European climate zones was taken into consideration. The obtained results show high and stable performance of all the ejectors in the multi-ejector pack.
A Framework for Prioritizing NOAA's Climate Data Portfolio to Improve Relevance and Value
NASA Astrophysics Data System (ADS)
Privette, J. L.; Hutchins, C.; McPherson, T.; Wunder, D.
2016-12-01
NOAA's National Centers for Environmental Information (NCEI) is the largest civilian environmental data archive in the world. NCEI operationally provides hundreds of long term homogeneous climate data records and assessments that describe Earth's atmosphere, oceans and land surface. For decades, these data have underpinned leading climate research and modeling efforts and provided key insights into weather and climate changes. Recently, NCEI has increased support for economic and societal sectors beyond climate research by emphasizing use-inspired product development and services. Accordingly, NCEI has begun comprehensively assessing customer needs and user applications. In parallel, NCEI is analyzing and adjusting its full product portfolio to best address those needs and applications. In this presentation, we will describe NCEI's new approaches to capturing needs, performing use analytics, and molding a more responsive portfolio. We will summarize the findings of a quantitative relevance- and cost-scoring analysis that suggests the relative effectiveness of NCEI science and service investments. Finally, we will describe NCEI's effort to review, document and validate customer-driven product requirements. Results will help guide future prioritization of measurements, research and development, and product services.
Parallel Climate Data Assimilation PSAS Package
NASA Technical Reports Server (NTRS)
Ding, Hong Q.; Chan, Clara; Gennery, Donald B.; Ferraro, Robert D.
1996-01-01
We have designed and implemented a set of highly efficient and highly scalable algorithms for an unstructured computational package, the PSAS data assimilation package, as demonstrated by detailed performance analysis of systematic runs on up to a 512-node Intel Paragon. The equation solver achieves a sustained 18 Gflops performance. As a result, we achieved an unprecedented 100-fold solution time reduction on the Intel Paragon parallel platform over the Cray C90. This not only meets and exceeds the DAO time requirements, but also significantly enlarges the window of exploration in climate data assimilation.
Simulating Heinrich events in a coupled atmosphere-ocean-ice sheet model
NASA Astrophysics Data System (ADS)
Mikolajewicz, Uwe; Ziemen, Florian
2016-04-01
Heinrich events are among the most prominent events of long-term climate variability recorded in proxies across the northern hemisphere. They are the archetype of ice sheet-climate interactions on millennial time scales. Nevertheless, the exact mechanisms that cause Heinrich events are still under discussion, and their climatic consequences are far from being fully understood. We contribute to answering the open questions by studying Heinrich events in a coupled ice sheet model (ISM) and atmosphere-ocean-vegetation general circulation model (AOVGCM) framework, where this variability occurs as part of the model-generated internal variability, without the need to prescribe external perturbations, as was the standard approach in almost all model studies so far. The setup consists of a northern hemisphere configuration of the modified Parallel Ice Sheet Model (mPISM) coupled to the global coarse-resolution AOVGCM ECHAM5/MPIOM/LPJ. The simulations used for this analysis were an ensemble covering substantial parts of the late Glacial, forced with transient insolation and prescribed atmospheric greenhouse gas concentrations. The modeled Heinrich events show a marked influence of the ice discharge on the Atlantic circulation and heat transport, but none of the Heinrich events during the Glacial showed a complete collapse of the North Atlantic meridional overturning circulation. The main simulated consequences of the Heinrich events are a freshening and cooling over the North Atlantic and a drying over northern Europe.
Application of Local Discretization Methods in the NASA Finite-Volume General Circulation Model
NASA Technical Reports Server (NTRS)
Yeh, Kao-San; Lin, Shian-Jiann; Rood, Richard B.
2002-01-01
We present the basic ideas of the dynamics system of the finite-volume General Circulation Model developed at NASA Goddard Space Flight Center for climate simulations and other applications in meteorology. The dynamics of this model is designed with emphasis on conservative and monotonic transport, where the property of Lagrangian conservation is used to maintain the physical consistency of the computational fluid for long-term simulations. While the model benefits from the noise-free solutions of monotonic finite-volume transport schemes, the property of Lagrangian conservation also partly compensates for the diffusion introduced by the monotonicity treatment. By faithfully maintaining the fundamental laws of physics during the computation, this model is able to achieve sufficient accuracy for the global consistency of climate processes. Because the computing algorithms are based on local memory, this model has the advantage of efficient parallel computation with distributed memory. Further research is desirable to reduce the diffusive effects of monotonic transport for better accuracy, and to mitigate the limitation imposed by fast-moving gravity waves for better efficiency.
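A one-dimensional toy illustrates the trade-off described above: a donor-cell (first-order upwind) finite-volume update is exactly conservative and monotone, but diffusive. This sketch is only an analogue of the scheme family discussed, not the NASA model's algorithm.

```python
# 1-D donor-cell advection: conservative and monotone, but diffusive.
import numpy as np

def upwind_step(q, u, dx, dt):
    """One periodic donor-cell step for dq/dt + u dq/dx = 0, with u > 0."""
    flux = u * q                       # upwinded flux for positive wind
    return q - (dt / dx) * (flux - np.roll(flux, 1))

nx, u = 200, 1.0
dx = 1.0 / nx
dt = 0.8 * dx / u                      # CFL number 0.8, within stability
q = np.where(np.abs(np.linspace(0, 1, nx) - 0.3) < 0.1, 1.0, 0.0)
total0 = q.sum()
for _ in range(500):
    q = upwind_step(q, u, dx, dt)
# Conservation holds to round-off; monotonicity means no new extrema appear.
assert abs(q.sum() - total0) < 1e-9
assert q.min() >= -1e-12 and q.max() <= 1.0 + 1e-12
```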
A new climate modeling framework for convection-resolving simulation at continental scale
NASA Astrophysics Data System (ADS)
Charpilloz, Christophe; di Girolamo, Salvatore; Arteaga, Andrea; Fuhrer, Oliver; Hoefler, Torsten; Schulthess, Thomas; Schär, Christoph
2017-04-01
Major uncertainties remain in our understanding of the processes that govern the water cycle in a changing climate and their representation in weather and climate models. Of particular concern are heavy precipitation events of convective origin (thunderstorms and rain showers). The aim of the crCLIM project [1] is to propose a new climate modeling framework that alleviates the I/O bottleneck in large-scale, convection-resolving climate simulations and thus to enable new analysis techniques for climate scientists. Due to the large computational costs, convection-resolving simulations are currently restricted to small computational domains or very short time scales, unless the largest available supercomputer systems, such as hybrid CPU-GPU architectures, are used [3]. Hence, the COSMO model has been adapted to run on these architectures for research and production purposes [2]. However, the amount of generated data also increases, and storing these data becomes infeasible, making the analysis of simulation results impractical. To circumvent this problem and enable high-resolution climate models, we propose a data-virtualization layer (DVL) that re-runs simulations on demand and transparently manages the data for analysis; that is, we trade off computational effort (time) for storage (space). This approach also requires a bit-reproducible version of the COSMO model that produces identical results on different architectures (CPUs and GPUs) [4], which will be coupled with a performance model in order to enable optimal re-runs depending on the requirements of the re-run and the available resources. In this contribution, we discuss the strategy to develop the DVL, a first performance model, the challenge of bit-reproducibility, and the first results of the crCLIM project. [1] http://www.c2sm.ethz.ch/research/crCLIM.html [2] O. Fuhrer, C. Osuna, X. Lapillonne, T. Gysi, M. Bianco, and T. Schulthess. "Towards gpu-accelerated operational weather forecasting." In The GPU Technology Conference, GTC. 2013. [3] D. Leutwyler, O. Fuhrer, X. Lapillonne, D. Lüthi, and C. Schär. "Towards European-scale convection-resolving climate simulations with GPUs: a study with COSMO 4.19." Geoscientific Model Development 9, no. 9 (2016): 3393. [4] A. Arteaga, O. Fuhrer, and T. Hoefler. "Designing bit-reproducible portable high-performance applications." In Parallel and Distributed Processing Symposium, 2014 IEEE 28th International, pp. 1235-1244. IEEE, 2014.
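The DVL concept, trading compute time for storage space, can be sketched as a cache-or-recompute layer. The class and names below are hypothetical illustrations of the idea, assuming a bit-reproducible rerun function as the project requires; they are not the crCLIM implementation.

```python
# Schematic data-virtualization layer: serve a stored field if available,
# otherwise re-run the (bit-reproducible) simulation on demand.
import os
import pickle

class DataVirtualizationLayer:
    def __init__(self, cache_dir, rerun_simulation):
        """rerun_simulation(variable, time_slice) -> field; assumed to be
        bit-reproducible so cached and recomputed answers agree."""
        self.cache_dir = cache_dir
        self.rerun = rerun_simulation
        os.makedirs(cache_dir, exist_ok=True)

    def _path(self, variable, time_slice):
        return os.path.join(self.cache_dir, f"{variable}_{time_slice}.pkl")

    def get(self, variable, time_slice):
        path = self._path(variable, time_slice)
        if os.path.exists(path):                  # space: serve stored data
            with open(path, "rb") as f:
                return pickle.load(f)
        field = self.rerun(variable, time_slice)  # time: recompute on demand
        with open(path, "wb") as f:
            pickle.dump(field, f)
        return field

# Usage with a stand-in "model" returning a constant field
dvl = DataVirtualizationLayer("/tmp/dvl_cache", lambda v, t: [[0.0] * 4] * 4)
precip = dvl.get("precip", 42)
```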
Duveneck, Matthew J; Scheller, Robert M
2015-09-01
Within the time frame of the longevity of tree species, climate will change faster than trees can migrate naturally. Migration lags may result in reduced productivity and reduced diversity in forests under current management and climate change. We evaluated the efficacy of planting climate-suitable tree species (CSP), those tree species with current or historic distributions immediately south of a focal landscape, to maintain or increase aboveground biomass productivity, and species and functional diversity. We modeled forest change with the LANDIS-II forest simulation model for 100 years (2000-2100) at a 2-ha cell resolution and five-year time steps within two landscapes in the Great Lakes region (northeastern Minnesota and northern lower Michigan, USA). We compared current climate to low- and high-emission futures. We simulated a low-emission climate future with the Intergovernmental Panel on Climate Change (IPCC) 2007 B1 emission scenario and the Parallel Climate Model Global Circulation Model (GCM). We simulated a high-emission climate future with the IPCC A1FI emission scenario and the Geophysical Fluid Dynamics Laboratory (GFDL) GCM. We compared current forest management practices (business-as-usual) to CSP management. In the CSP scenario, we simulated a target planting of 5.28% and 4.97% of forested area per five-year time step in the Minnesota and Michigan landscapes, respectively. We found that simulated CSP species successfully established in both landscapes under all climate scenarios. The presence of CSP species generally increased simulated aboveground biomass. Species diversity increased due to CSP; however, the effect on functional diversity was variable. Because the planted species were functionally similar to many native species, CSP did not result in a consistent increase or decrease in functional diversity. These results provide an assessment of the potential efficacy and limitations of CSP management, with management implications for sites where diversity and productivity are expected to decline. Future efforts to restore a specific species or forest type may not be possible, but CSP may sustain a more general ecosystem service (e.g., aboveground biomass).
Blanchard, Julia L; Jennings, Simon; Holmes, Robert; Harle, James; Merino, Gorka; Allen, J Icarus; Holt, Jason; Dulvy, Nicholas K; Barange, Manuel
2012-11-05
Existing methods to predict the effects of climate change on the biomass and production of marine communities are predicated on modelling the interactions and dynamics of individual species, a very challenging approach when interactions and distributions are changing and little is known about the ecological mechanisms driving the responses of many species. An informative parallel approach is to develop size-based methods. These capture the properties of food webs that describe energy flux and production at a particular size, independent of species' ecology. We couple a physical-biogeochemical model with a dynamic, size-based food web model to predict the future effects of climate change on fish biomass and production in 11 large regional shelf seas, with and without fishing effects. Changes in potential fish production are shown to most strongly mirror changes in phytoplankton production. We project declines of 30-60% in potential fish production across some important areas of tropical shelf and upwelling seas, most notably in the eastern Indo-Pacific, the northern Humboldt and the North Canary Current. Conversely, in some areas of the high latitude shelf seas, the production of pelagic predators was projected to increase by 28-89%.
The dynamics of climate-induced deglacial ice stream acceleration
NASA Astrophysics Data System (ADS)
Robel, A.; Tziperman, E.
2015-12-01
Geological observations indicate that ice streams were a significant contributor to ice flow in the Laurentide Ice Sheet during the Last Glacial Maximum. Conceptual and simple model studies have also argued that the gradual development of ice streams increases the sensitivity of large ice sheets to weak climate forcing. In this study, we use an idealized configuration of the Parallel Ice Sheet Model to explore the role of ice streams in rapid deglaciation. In a growing ice sheet, ice streams develop gradually as the bed warms and the margin expands outward onto the continental shelf. Then, a weak change in equilibrium line altitude commensurate with Milankovitch forcing results in a rapid deglacial response, as ice stream acceleration leads to enhanced calving and surface melting at low elevations. We explain the dynamical mechanism that drives this ice stream acceleration and its broader applicability as a feedback for enhancing ice sheet decay in response to climate forcing. We show how our idealized ice sheet simulations match geomorphological observations of deglacial ice stream variability and previous model-data analyses. We conclude with observations on the potential for interaction between ice streams and other feedback mechanisms within the earth system.
NASA Astrophysics Data System (ADS)
Tao, F.; Rötter, R.
2013-12-01
Many studies on global climate report that climate variability is increasing, with more frequent and intense extreme events [1]. There are quite large uncertainties in both plot- and regional-scale models in simulating the impacts of climate variability and extremes on crop development, growth and productivity [2,3]. One key to reducing these uncertainties is better exploitation of experimental data to eliminate crop model deficiencies and to develop better algorithms that more adequately capture the impacts of extreme events, such as high temperature and drought, on crop performance [4,5]. In the present study, in a first step, the inter-annual variability in wheat yield and climate from 1971 to 2012 in Finland was investigated. Using statistical approaches, the impacts of climate variability and extremes on wheat growth and productivity were quantified. In a second step, a plot-scale model, WOFOST [6], and a regional-scale crop model, MCWLA [7], were calibrated and validated, and applied to simulate wheat growth and yield variability from 1971 to 2012. Next, the impacts of high temperature stress, cold damage, and drought stress on crop growth and productivity estimated with the statistical approaches and with the crop simulation models WOFOST and MCWLA were compared. Then, the impact mechanisms of climate extremes on crop growth and productivity in the WOFOST and MCWLA models were identified, and subsequently the various algorithms and impact functions were fitted against the long-term crop trial data. Finally, the impact mechanisms, algorithms and functions in the WOFOST and MCWLA models were improved to better simulate the impacts of climate variability and extremes, particularly high temperature stress, cold damage and drought stress, for location-specific and large-area climate impact assessments. Our studies provide a good example of how to improve, in parallel, plot- and regional-scale models for simulating the impacts of climate variability and extremes, as needed for better-informed decision-making on adaptation strategies. References: 1. Coumou, D. & Rahmstorf, S. A decade of extremes. Nature Clim. Change, 2, 491-496 (2012). 2. Rötter, R. P., Carter, T. R., Olesen, J. E. & Porter, J. R. Crop-climate models need an overhaul. Nature Clim. Change, 1, 175-177 (2011). 3. Asseng, S. et al. Uncertainty in simulating wheat yields under climate change. Nature Clim. Change, doi:10.1038/nclimate1916 (2013). 4. Porter, J. R. & Semenov, M. Crop responses to climatic variation. Trans. R. Soc. B, 360, 2021-2035 (2005). 5. Porter, J. R. & Christensen, S. Deconstructing crop processes and models via identities. Plant, Cell and Environment, doi:10.1111/pce.12107 (2013). 6. Boogaard, H. L., van Diepen, C. A., Rötter, R. P., Cabrera, J. M. & van Laar, H. H. User's guide for the WOFOST 7.1 crop growth simulation model and Control Center 1.5, Alterra, Wageningen, The Netherlands (1998). 7. Tao, F. & Zhang, Z. Climate change, wheat productivity and water use in the North China Plain: a new super-ensemble-based probabilistic projection. Agric. Forest Meteorol., 170, 146-165 (2013).
Modelling the Madden Julian Oscillation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slingo, J M; Inness, P M; Sperber, K R
2004-05-21
The MJO has long been an aspect of the global climate that has provided a tough test for the climate modelling community. Since the 1980s there have been numerous studies of the simulation of the MJO in atmospheric general circulation models (GCMs), ranging from Hayashi and Golder (1986, 1988) and Lau and Lau (1986), through to more recent studies such as Wang and Schlesinger (1999) and Wu et al. (2002). Of course, attempts to reproduce the MJO in climate models have proceeded in parallel with developments in our understanding of what the MJO is and what drives it. In fact, many advances in understanding the MJO have come through modeling studies. In particular, failure of climate models to simulate various aspects of the MJO has prompted investigations into the mechanisms that are important to its initiation and maintenance, leading to improvements both in our understanding of, and ability to simulate, the MJO. The initial focus of this chapter will be on modeling the MJO during northern winter, when it is characterized as a predominantly eastward propagating mode and is most readily seen in observations. Aspects of the simulation of the MJO will be discussed in the context of its sensitivity to the formulation of the atmospheric model, and the increasing evidence that it may be a coupled ocean-atmosphere phenomenon. Later, we will discuss the challenges regarding the simulation of boreal summer intraseasonal variability, which is more complex since it is a combination of the eastward propagating MJO and the northward propagation of the tropical convergence zone. Finally some concluding remarks on future directions in modeling the MJO and its relationship with other timescales of variability in the tropics will be made.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahowald, Natalie
Soils in natural and managed ecosystems and wetlands are well known sources of methane, nitrous oxide, and reactive nitrogen gases, but the magnitudes of gas flux to the atmosphere are still poorly constrained. Thus, the reasons for the large increases in atmospheric concentrations of methane and nitrous oxide since the preindustrial time period are not well understood. The low atmospheric concentrations of methane and nitrous oxide, despite their being more potent greenhouse gases than carbon dioxide, complicate empirical studies that could provide explanations. In addition to climate concerns, the emissions of reactive nitrogen gases from soils are important to the changing nitrogen balance in the earth system, are subject to human management, and may change substantially in the future. Thus improved modeling of the emission fluxes of these species from the land surface is important. Currently, there are emission modules for methane and some nitrogen species in the Community Earth System Model's Community Land Model (CLM-ME/N); however, there are large uncertainties and problems in the simulations, resulting in coarse estimates. In this proposal, we seek to improve these emission modules by combining state-of-the-art process modules for emissions, available data, and new optimization methods. In earth science problems, we often have substantial data and knowledge of processes in disparate systems, and thus we need to combine data and a general process-level understanding into a model for projections of future climate that are as accurate as possible. The best methodologies for optimization of parameters in earth system models are still being developed. In this proposal we will develop and apply surrogate algorithms that a) were especially developed for computationally expensive simulations like the CLM-ME/N models; b) were (in the earlier surrogate optimization Stochastic RBF) demonstrated to perform very well on computationally expensive complex partial differential equations in earth science with limited numbers of simulations; and c) will be (as part of the proposed research) significantly improved, both by adding asynchronous parallelism and early truncation of unsuccessful simulations, and by improving serial and parallel performance through the use of derivative and sensitivity information from global and local surrogate approximations S(x). The algorithm development and testing will be focused on the CLM-ME/N model application, but the methods are general and are expected to also perform well on optimization for parameter estimation of other climate models and other classes of continuous multimodal optimization problems arising from complex simulation models. In addition, this proposal will compile available datasets of emissions of methane, nitrous oxide and reactive nitrogen species and develop protocols for site-level comparisons with the CLM-ME/N. Once the model parameters are optimized against site-level data, the model will be simulated at the global level and compared to atmospheric concentration measurements for the current climate, and future emissions will be estimated using climate change as simulated by the CESM. This proposal combines experts in earth system modeling, optimization, computer science, and process-level understanding of soil gas emissions in an interdisciplinary team in order to improve the modeling of methane and nitrogen gas emissions. This proposal thus meets the requirements of the SciDAC RFP by integrating state-of-the-art computer science and earth system science to build an improved earth system model.
Multi-Resolution Climate Ensemble Parameter Analysis with Nested Parallel Coordinates Plots.
Wang, Junpeng; Liu, Xiaotong; Shen, Han-Wei; Lin, Guang
2017-01-01
Due to the uncertain nature of weather prediction, climate simulations are usually performed multiple times with different spatial resolutions. The outputs of simulations are multi-resolution spatial temporal ensembles. Each simulation run uses a unique set of values for multiple convective parameters. Distinct parameter settings from different simulation runs in different resolutions constitute a multi-resolution high-dimensional parameter space. Understanding the correlation between the different convective parameters, and establishing a connection between the parameter settings and the ensemble outputs are crucial to domain scientists. The multi-resolution high-dimensional parameter space, however, presents a unique challenge to the existing correlation visualization techniques. We present Nested Parallel Coordinates Plot (NPCP), a new type of parallel coordinates plots that enables visualization of intra-resolution and inter-resolution parameter correlations. With flexible user control, NPCP integrates superimposition, juxtaposition and explicit encodings in a single view for comparative data visualization and analysis. We develop an integrated visual analytics system to help domain scientists understand the connection between multi-resolution convective parameters and the large spatial temporal ensembles. Our system presents intricate climate ensembles with a comprehensive overview and on-demand geographic details. We demonstrate NPCP, along with the climate ensemble visualization system, based on real-world use-cases from our collaborators in computational and predictive science.
Downscaling GISS ModelE Boreal Summer Climate over Africa
NASA Technical Reports Server (NTRS)
Druyan, Leonard M.; Fulakeza, Matthew
2015-01-01
The study examines the perceived added value of downscaling atmosphere-ocean global climate model simulations over Africa and adjacent oceans with a nested regional climate model. NASA/Goddard Institute for Space Studies (GISS) coupled ModelE simulations for June-September 1998-2002 are used to form lateral boundary conditions for synchronous simulations by the GISS RM3 regional climate model. The ModelE computational grid spacing is 2deg latitude by 2.5deg longitude and the RM3 grid spacing is 0.44deg. ModelE precipitation climatology for June-September 1998-2002 is shown to be a good proxy for 30-year means, so results based on the 5-year sample are presumed to be generally representative. Comparison with observational evidence shows several discrepancies in the ModelE configuration of the boreal summer inter-tropical convergence zone (ITCZ). One glaring shortcoming is that ModelE simulations do not advance the West African rain band northward during the summer to represent monsoon precipitation onset over the Sahel. Results for 1998-2002 show that onset simulation is an important added value produced by downscaling with RM3. ModelE Eastern South Atlantic Ocean computed sea-surface temperatures (SST) are some 4 K warmer than reanalysis, contributing to large positive biases in overlying surface air temperatures (Tsfc). ModelE Tsfc are also too warm over most of Africa. RM3 downscaling somewhat mitigates the magnitude of Tsfc biases over the African continent; it eliminates the ModelE double ITCZ over the Atlantic and produces more realistic orographic precipitation maxima. Parallel ModelE and RM3 simulations with observed SST forcing (in place of the predicted ocean) lower Tsfc errors but have mixed impacts on circulation and precipitation biases. Downscaling improvements in the meridional movement of the rain band over West Africa and in the configuration of orographic precipitation maxima are realized irrespective of the SST biases.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Di Vittorio, Alan V.; Chini, Louise M.; Bond-Lamberty, Benjamin
2014-11-27
Climate projections depend on scenarios of fossil fuel emissions and land use change, and the IPCC AR5 parallel process assumes consistent climate scenarios across Integrated Assessment and Earth System Models (IAMs and ESMs). To facilitate consistency, CMIP5 used a novel land use harmonization to provide ESMs with seamless, 1500-2100 land use trajectories generated by historical data and four IAMs. However, we have identified and partially addressed a major gap in the CMIP5 land coupling design. The CMIP5 Community ESM (CESM) global afforestation is only 22% of RCP4.5 afforestation from 2005 to 2100. Likewise, only 17% of the Global Change Assessment Model's (GCAM's) 2040 RCP4.5 afforestation signal, and none of the pasture loss, were transmitted to CESM within a newly integrated model. This is a critical problem because afforestation is necessary for achieving the RCP4.5 climate stabilization. We attempted to rectify this problem by modifying only the ESM component of the integrated model, enabling CESM to simulate 66% of GCAM's afforestation in 2040, and 94% of GCAM's pasture loss as grassland and shrubland losses. This additional afforestation increases vegetation carbon gain by 19 PgC and decreases atmospheric CO2 gain by 8 ppmv from 2005 to 2040, implying different climate scenarios between CMIP5 GCAM and CESM. Similar inconsistencies likely exist in other CMIP5 model results, primarily because land cover information is not shared between models, with possible contributions from afforestation exceeding model-specific, potentially viable forest area. Further work to harmonize land cover among models will be required to adequately rectify this problem.
Large-Scale, Parallel, Multi-Sensor Atmospheric Data Fusion Using Cloud Computing
NASA Astrophysics Data System (ADS)
Wilson, B. D.; Manipon, G.; Hua, H.; Fetzer, E.
2013-05-01
NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the "A-Train" platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over decades. Moving to multi-sensor, long-duration analyses of important climate variables presents serious challenges for large-scale data mining and fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the "cloud scenes" from CloudSat, and repeat the entire analysis over 10 years of data. To efficiently assemble such datasets, we are utilizing Elastic Computing in the Cloud and parallel map/reduce-based algorithms. However, these are data-intensive computing problems, so data transfer times and storage costs (for caching) are key issues. SciReduce is a Hadoop-like parallel analysis system, programmed in parallel Python, that is designed from the ground up for Earth science. SciReduce executes inside VMware images and scales to any number of nodes in the Cloud. Unlike Hadoop, SciReduce operates on bundles of named numeric arrays, which can be passed in memory or serialized to disk in netCDF4 or HDF5. Figure 1 shows the architecture of the full computational system, with SciReduce at the core. Multi-year datasets are automatically "sharded" by time and space across a cluster of nodes so that years of data (millions of files) can be processed in a massively parallel way. Input variables (arrays) are pulled on-demand into the Cloud using OPeNDAP URLs or other subsetting services, thereby minimizing the size of the cached input and intermediate datasets. We are using SciReduce to automate the production of multiple versions of a ten-year A-Train water vapor climatology under a NASA MEASURES grant. We will present the architecture of SciReduce, describe the achieved "clock time" speedups in fusing datasets on our own nodes and in the Cloud, and discuss the Cloud cost tradeoffs for storage, compute, and data transfer. We will also present a concept/prototype for staging NASA's A-Train Atmospheric datasets (Levels 2 & 3) in the Amazon Cloud so that any number of compute jobs can be executed "near" the multi-sensor data. Given such a system, multi-sensor climate studies over 10-20 years of data could be performed in an efficient way, with the researcher paying only his own Cloud compute bill.
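The shard-and-reduce pattern described above can be illustrated with a small stand-alone sketch: map tasks compute per-file partial sums of a named array, and a reduce step combines them into a time-mean climatology. The file names and the variable name "water_vapor" are hypothetical, and multiprocessing plus the netCDF4 library stand in for SciReduce's own runtime.

```python
# Illustrative map/reduce climatology over time-sharded netCDF files.
from multiprocessing import Pool

import numpy as np
from netCDF4 import Dataset  # pip install netCDF4

def partial_sum(path):
    """Map task: per-file sum over the time axis, plus the time-step count."""
    with Dataset(path) as nc:
        v = nc.variables["water_vapor"][:]        # shape (time, lat, lon)
        return np.asarray(v).sum(axis=0), v.shape[0]

def climatology(paths, workers=8):
    with Pool(workers) as pool:
        parts = pool.map(partial_sum, paths)      # map across shards
    total = sum(p[0] for p in parts)              # reduce: combine sums
    count = sum(p[1] for p in parts)
    return total / count                          # time-mean field

# Hypothetical usage over a ten-year archive:
# mean_field = climatology([f"airs_wv_{y}.nc" for y in range(2003, 2013)])
```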
Conservation in the face of climate change: recent developments.
Lawler, Joshua; Watson, James; Game, Edward
2015-01-01
An increased understanding of the current and potential future impacts of climate change has significantly influenced conservation practice in recent years. Climate change has necessitated a shift toward longer planning time horizons, moving baselines, and evolving conservation goals and targets. This shift has resulted in new perspectives on, and changes in, the basic approaches practitioners use to conserve biodiversity. Restoration, spatial planning and reserve selection, connectivity modelling, extinction risk assessment, and species translocations have all been reimagined in the face of climate change. Restoration is being conducted with a new acceptance of uncertainty and an understanding that goals will need to shift through time. New conservation targets, such as geophysical settings and climatic refugia, are being incorporated into conservation plans. Risk assessments have begun to consider the potentially synergistic impacts of climate change and other threats. Assisted colonization has gained acceptance in recent years as a viable and necessary conservation tool. This evolution has paralleled a larger trend in conservation: a shift toward conservation actions that benefit both people and nature. As we look forward, it is clear that more change is on the horizon. To protect biodiversity and essential ecosystem services, conservation will need to anticipate the human response to climate change and to focus not only on resistance and resilience but also on transitions to new states and new ecosystems.
NASA Astrophysics Data System (ADS)
Li, J.; Zhang, T.; Huang, Q.; Liu, Q.
2014-12-01
Today's climate datasets are characterized by large volume and a high degree of spatiotemporal complexity, and they evolve quickly over time. Because visualizing large, distributed climate datasets is computationally intensive, traditional desktop-based visualization applications fail to handle the computational load. Recently, scientists have developed remote visualization techniques to address this computational issue. Remote visualization techniques usually leverage server-side parallel computing capabilities to perform visualization tasks and deliver visualization results to clients through the network. In this research, we aim to build a remote parallel visualization platform for visualizing and analyzing massive climate data. Our visualization platform was built on ParaView, one of the most popular open-source remote visualization and analysis applications. To further enhance the scalability and stability of the platform, we have employed cloud computing techniques to support its deployment. In this platform, all climate datasets are regular grid data stored in NetCDF format. Three types of data access are supported: accessing remote datasets provided by OPeNDAP servers, accessing datasets hosted on the web visualization server, and accessing local datasets. Regardless of the data access method, all visualization tasks are completed on the server side to reduce the workload of clients. As a proof of concept, we have implemented a set of scientific visualization methods to show the feasibility of the platform. Preliminary results indicate that the framework can address the computational limitations of desktop-based visualization applications.
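The first of the three data access paths, reading remote datasets from OPeNDAP servers, can be sketched in a few lines: the netCDF4 library accepts OPeNDAP URLs directly when built with DAP support, and slicing a variable then requests only the needed subset from the server. The URL and variable name below are placeholders, not endpoints from this platform.

```python
# Hedged sketch: server-side subsetting of a remote dataset via OPeNDAP.
from netCDF4 import Dataset

url = "http://example.org/opendap/tas_Amon_model_rcp45.nc"  # hypothetical
with Dataset(url) as nc:
    tas = nc.variables["tas"]          # lazy handle; no bulk download yet
    # Slicing triggers a subset request to the server, not a full transfer
    january_slab = tas[0, :, :]
print(january_slab.shape)
```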
NASA Astrophysics Data System (ADS)
Murray, A. B.; Thomas, C.; Hurst, M. D.; Barkwith, A.; Ashton, A. D.; Ellis, M. A.
2014-12-01
Recent numerical modelling demonstrates that when sandy coastlines are affected predominantly by waves approaching from "high" angles (> ~45° between the coastline and wave crests at the offshore limit of shore-parallel contours), large-scale (kms to 100 kms) morphodynamic instabilities and finite-amplitude interactions can lead to the emergence of striking coastline features, including sand waves, capes and spits. The type of feature that emerges depends on the wave climate, defined as the angular distribution of wave influences on alongshore sediment transport. Under a constant wave climate, coastline morphology reaches a dynamical steady state; the cross-shore/alongshore aspect ratio and the general appearance of the features remain constant. In previous modelling involving wave-climate change, as well as comparisons between observed coastline morphologies and wave climates, it has been implicitly assumed that the morphology adjusts in a quasi-equilibrium fashion, so that at any time the coastline shape reflects the current forcing. However, here we present new model results showing pronounced path dependence in coastline morphodynamics. In experiments with a period of constant wave climate followed by a period of transition to a new wave climate and then a run-on phase, the features that exist during the run-on phase can be qualitatively and quantitatively different from those that would develop initially under the final wave climate. Although the features inherited from the past wave-climate history may in some cases be true alternate stable states, in other cases the inherited features gradually decay toward the morphology that would be expected given the final wave climate. A suite of such experiments allows us to characterize how the e-folding timescale of this decay depends on 1) the initial wave climate, 2) the path through wave-climate space, and 3) the rate of transition. When the initial features are flying spits with cross-shore amplitudes of 6-8 km, e-folding times can be on the order of millennia or longer. These results could provide a new perspective when interpreting current and past coastline features. In addition, the complex paleo-coastline structure that develops in the coastal hinterlands in these experiments could be relevant to the structures observed in some coastal environments.
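As an aside on method, an e-folding timescale like those characterized above can be estimated by fitting an exponential decay A(t) = A0 exp(-t/τ) to a feature-amplitude time series. The sketch below uses synthetic data purely for illustration; it is not the authors' analysis code.

```python
# Fit an exponential decay to recover an e-folding timescale tau.
import numpy as np
from scipy.optimize import curve_fit

def decay(t, a0, tau):
    return a0 * np.exp(-t / tau)

t_years = np.linspace(0.0, 5000.0, 60)
rng = np.random.default_rng(3)
amplitude_km = decay(t_years, 7.0, 1500.0) + rng.normal(0.0, 0.2, t_years.size)

(a0_fit, tau_fit), _ = curve_fit(decay, t_years, amplitude_km, p0=(5.0, 1000.0))
print(f"e-folding timescale: {tau_fit:.0f} years")  # ~1500: millennial-scale
```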
Attribution of floods in the Okavango basin, Southern Africa
NASA Astrophysics Data System (ADS)
Wolski, Piotr; Stone, Dáithí; Tadross, Mark; Wehner, Michael; Hewitson, Bruce
2014-04-01
In the charismatic wetlands of the Okavango Delta, Botswana, the annual floods of 2009-2011 reached magnitudes last seen 20-30 years ago, considerably affecting the lives of local populations and the economically important tourism industry. In this study, we analyse results from an attribution modelling system designed to examine how anthropogenic greenhouse gas emissions have contributed to weather and flood risk in our current climate. The system is based on comparing real-world climate and hydrological simulations with parallel counterfactual simulations of the climate and hydrological responses under conditions that might have prevailed had human activities not emitted greenhouse gases. The analyses allow us to address the question of whether anthropogenic climate change contributed to increasing the risk of these high flood events in the Okavango system. Results show that the probability of occurrence of high floods during 2009-2011 in the current climate is likely lower than it would have been in a climate without anthropogenic greenhouse gases. This result is robust across the two climate models and various data processing procedures, although the exact figures for the associated decrease in risk differ. Results also differ between the three years examined, indicating that the "time-slice" method used here needs to be applied to multiple years in order to accurately estimate the contribution of emissions to current risk. Simple sensitivity analyses indicate that the reduction in flood risk is attributable to higher temperatures (and thus evaporation) in the current world, with little difference in the analysed domain's rainfall simulated in the two scenarios.
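The attribution arithmetic behind such statements reduces to comparing exceedance probabilities between the factual and counterfactual ensembles, for example as a risk ratio. Here is a minimal sketch with synthetic placeholder ensembles; the threshold, units and distributions are assumptions for illustration only.

```python
# Compare flood-threshold exceedance probabilities across two ensembles.
import numpy as np

def exceedance_probability(ensemble, threshold):
    ensemble = np.asarray(ensemble)
    return float((ensemble >= threshold).mean())

rng = np.random.default_rng(4)
threshold = 900.0                                  # e.g. peak inflow, m^3/s
factual = rng.normal(820.0, 80.0, 10000)           # current-climate runs
counterfactual = rng.normal(860.0, 80.0, 10000)    # no-GHG climate runs

p1 = exceedance_probability(factual, threshold)
p0 = exceedance_probability(counterfactual, threshold)
print(f"risk ratio P1/P0 = {p1 / p0:.2f}")  # < 1: risk reduced, as the study found
```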
Global terrestrial water storage connectivity revealed using complex climate network analyses
NASA Astrophysics Data System (ADS)
Sun, A. Y.; Chen, J.; Donges, J.
2015-07-01
Terrestrial water storage (TWS) exerts a key control on global water, energy, and biogeochemical cycles. Although a causal relationship exists between precipitation and TWS, the latter quantity also reflects the impacts of anthropogenic activities. Thus, quantifying the spatial patterns of TWS will not only help in understanding feedbacks between climate dynamics and the hydrologic cycle, but will also provide new insights and model calibration constraints for improving current land surface models. This work is the first attempt to quantify the spatial connectivity of TWS using complex network theory, which has received broad attention in the climate modeling community in recent years. Complex networks of TWS anomalies are built using two global TWS data sets: a remote sensing product obtained from the Gravity Recovery and Climate Experiment (GRACE) satellite mission, and a model-generated data set from the global land data assimilation system's NOAH model (GLDAS-NOAH). Both data sets have 1° × 1° grid resolutions and cover most global land areas except for permafrost regions. TWS networks are built by first quantifying pairwise correlation among all valid TWS anomaly time series, and then applying a cutoff threshold derived from the edge-density function to retain only the most important features in the network. Basinwise network connectivity maps are used to illuminate the connectivity of individual river basins with other regions. The constructed network degree centrality maps show the TWS anomaly hotspots around the globe, and the patterns are consistent with recent GRACE studies. Parallel analyses of networks constructed using the two data sets reveal that the GLDAS-NOAH model captures many of the spatial patterns shown by GRACE, although significant discrepancies exist in some regions. Thus, our results provide further measures for constraining current land surface models, especially in data-sparse regions.
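The network-construction recipe just described (pairwise correlation, an edge-density-based cutoff, then degree centrality) can be sketched compactly. The grid size and target edge density below are illustrative assumptions, not the paper's settings.

```python
# Build a correlation network from gridded anomaly time series and rank
# grid cells by degree centrality to locate connectivity hotspots.
import numpy as np

def build_tws_network(anomalies, target_edge_density=0.05):
    """anomalies: (n_cells, n_months) array of TWS anomaly series."""
    corr = np.abs(np.corrcoef(anomalies))          # pairwise |correlation|
    np.fill_diagonal(corr, 0.0)
    # Cutoff chosen so the retained edges hit the target edge density
    triu = corr[np.triu_indices_from(corr, k=1)]
    cutoff = np.quantile(triu, 1.0 - target_edge_density)
    adjacency = corr >= cutoff
    degree_centrality = adjacency.sum(axis=1) / (len(corr) - 1)
    return adjacency, degree_centrality

rng = np.random.default_rng(5)
series = rng.normal(size=(500, 120))               # 500 cells, 10 years monthly
adj, degree = build_tws_network(series)
hotspots = np.argsort(degree)[::-1][:10]           # most-connected cells
```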
Downscaling climate information for local disease mapping.
Bernardi, M; Gommes, R; Grieser, J
2006-06-01
The study of the impacts of climate on human health requires the interdisciplinary efforts of health professionals, climatologists, biologists, and social scientists to analyze the relationships among physical, biological, ecological, and social systems. Because disease dynamics respond to variations in regional and local climate, and climate variability affects every region of the world, diseases are not necessarily limited to specific regions, and vectors may become endemic in new regions. Climate data at the local level are thus essential to evaluate the dynamics of vector-borne disease through health-climate models, yet the available climatological databases are often not adequate. Climate data at high spatial resolution can be derived by statistical downscaling using historical observations, but the method is limited by the availability of historical data at the local level. Since the 1990s, the statistical interpolation of climate data has been an important priority of the Agrometeorology Group of the Food and Agriculture Organization of the United Nations (FAO), as such data are required for agricultural planning and operational activities at the local level. Since 1995, when the first FAO spatial interpolation software for climate data was released, more advanced applications have been developed, such as SEDI (Satellite Enhanced Data Interpolation) for the downscaling of climate data, and LOCCLIM (Local Climate Estimator) and NEW_LOCCLIM, developed in collaboration with the Deutscher Wetterdienst (German Weather Service), to estimate climatic conditions at locations for which no observations are available. In parallel, an important effort has been made to improve the FAO climate database, which at present includes more than 30,000 stations worldwide, expanding coverage from developing countries to the entire globe.
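As a generic illustration of estimating climate values at unobserved locations from scattered station data (not the algorithm inside SEDI or LOCCLIM), inverse distance weighting gives each query point a weighted average of nearby stations:

```python
# Inverse-distance-weighted interpolation of station observations.
import numpy as np

def idw(station_xy, station_values, query_xy, power=2.0, eps=1e-12):
    """Estimate values at query points from scattered station observations."""
    d = np.linalg.norm(query_xy[:, None, :] - station_xy[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power                   # nearby stations dominate
    return (w * station_values).sum(axis=1) / w.sum(axis=1)

stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
rainfall_mm = np.array([120.0, 80.0, 100.0, 60.0])
queries = np.array([[5.0, 5.0], [1.0, 1.0]])
print(idw(stations, rainfall_mm, queries))         # local estimates
```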
Pedersen, Ulrik B; Karagiannis-Voules, Dimitrios-Alexios; Midzi, Nicholas; Mduluza, Tkafira; Mukaratirwa, Samson; Fensholt, Rasmus; Vennervald, Birgitte J; Kristensen, Thomas K; Vounatsou, Penelope; Stensgaard, Anna-Sofie
2017-05-08
Temperature, precipitation and humidity are known to be important factors for the development of schistosome parasites as well as their intermediate snail hosts. Climate therefore plays an important role in determining the geographical distribution of schistosomiasis, and it is expected that climate change will alter distribution and transmission patterns. Reliable predictions of distribution changes and likely transmission scenarios are key to efficient schistosomiasis intervention planning. However, it is often difficult to assess the direction and magnitude of the impact on schistosomiasis induced by climate change, as well as the temporal transferability and predictive accuracy of the models, as prevalence data are often only available from one point in time. We evaluated potential climate-induced changes in the geographical distribution of schistosomiasis in Zimbabwe using prevalence data from two points in time, 29 years apart; to our knowledge, this is the first study investigating this over such a long time period. We applied historical weather data and matched prevalence data for two schistosome species (Schistosoma haematobium and S. mansoni). For each time period studied, a Bayesian geostatistical model was fitted to a range of climatic, environmental and other potential risk factors to identify significant predictors that could help us obtain spatially explicit schistosomiasis risk estimates for Zimbabwe. The observed general downward trend in schistosomiasis prevalence for Zimbabwe between 1981 and the period preceding a survey and control campaign in 2010 parallels a shift towards a drier and warmer climate. However, a statistically significant relationship between climate change and the change in prevalence could not be established.
The future of climate science analysis in a coming era of exascale computing
NASA Astrophysics Data System (ADS)
Bates, S. C.; Strand, G.
2013-12-01
Projections of Community Earth System Model (CESM) output based on the growth of data archived over 2000-2012 at all of our computing sites (NCAR, NERSC, ORNL) show that we can expect to reach 1,000 PB (1 EB) sometime in the next decade or so. The current paradigms of using site-based archival systems to hold these data that are then accessed via portals or gateways, downloading the data to a local system, and then processing/analyzing the data will be irretrievably broken before then. From a climate modeling perspective, the expertise involved in making climate models themselves efficient on HPC systems will need to be applied to the data as well - providing fast parallel analysis tools co-resident in memory with the data, because disk I/O bandwidth simply will not keep up with the expected arrival of exaflop systems. The ability of scientists, analysts, stakeholders and others to use climate model output to turn these data into understanding and knowledge will require significant advances in the current typical analysis tools and packages to enable these processes for these vast volumes of data. Allowing data users to enact their own analyses on model output is virtually a requirement as well - climate modelers cannot anticipate all the possibilities for analysis that users may want to do. In addition, the expertise of data scientists, and their knowledge of the model output and of best practices in data management (metadata, curation, provenance and so on), will need to be rewarded and exploited to gain the most understanding possible from these volumes of data. In response to growing data size, demand, and future projections, the CESM output has undergone a structural evolution, and the data management plan has been reevaluated and updated. The major evolution of the CESM data structure is presented here, along with the CESM experience and role within CMIP3 and CMIP5.
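The headline projection is a simple compound-growth extrapolation, which can be reproduced with assumed numbers; the starting archive size and annual growth rate below are illustrative stand-ins, not figures from the abstract.

    # Assumed numbers for illustration only: archive size and growth
    # rate are hypothetical, not taken from the CESM sites.
    size_pb = 20.0      # archived petabytes at the start year
    growth = 1.45       # assumed 45% growth per year
    year = 2013
    while size_pb < 1000.0:         # 1,000 PB = 1 EB
        size_pb *= growth
        year += 1
    print(f"~1 EB reached around {year} under this assumed growth rate")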
NASA Astrophysics Data System (ADS)
Biercamp, Joachim; Adamidis, Panagiotis; Neumann, Philipp
2017-04-01
With the exascale era approaching, the length and time scales used for climate research on the one hand and numerical weather prediction on the other blend into each other. The Centre of Excellence in Simulation of Weather and Climate in Europe (ESiWACE) represents a European consortium comprising partners from climate, weather and HPC in their effort to address key scientific challenges that both communities have in common. A particular challenge is to reach global models with spatial resolutions that allow simulating convective clouds and small-scale ocean eddies. These simulations would produce better predictions of trends and provide much more fidelity in the representation of high-impact regional events. However, running such models in operational mode, i.e., with sufficient throughput in ensemble mode, will clearly require exascale computing and data handling capability. We will discuss the ESiWACE initiative and relate it to work-in-progress on high-resolution simulations in Europe. We present recent strong scalability measurements from ESiWACE to demonstrate current computability in weather and climate simulation. A special focus of this talk is on the Icosahedral Nonhydrostatic (ICON) model, used for a comparison of high-resolution regional and global simulations with high-quality observation data. We demonstrate that close-to-optimal parallel efficiency can be achieved in strong-scaling global resolution experiments on Mistral/DKRZ, e.g., 94% for 5 km resolution simulations using 36k cores. Based on our scalability and high-resolution experiments, we deduce and extrapolate future capabilities for ICON that are expected for weather and climate research at exascale.
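For reference, strong-scaling parallel efficiency is the measured speedup divided by the ideal speedup relative to a baseline run; the sketch below computes it with hypothetical timings chosen to reproduce the quoted 94% figure.

    def strong_scaling_efficiency(t_ref, n_ref, t_n, n):
        """Parallel efficiency relative to a reference run:
        speedup = t_ref / t_n; ideal speedup = n / n_ref."""
        speedup = t_ref / t_n
        return speedup / (n / n_ref)

    # Hypothetical timings: doubling cores from 18k to 36k
    print(strong_scaling_efficiency(t_ref=200.0, n_ref=18_000, t_n=106.4, n=36_000))
    # -> ~0.94, i.e. 94% efficiency, the figure quoted for ICON at 5 km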
NASA Astrophysics Data System (ADS)
Sopaheluwakan, Ardhasena; Fajariana, Yuaning; Satyaningsih, Ratna; Aprilina, Kharisma; Astuti Nuraini, Tri; Ummiyatul Badriyah, Imelda; Lukita Sari, Dyah; Haryoko, Urip
2017-04-01
Inhomogeneities are often found in long records of climate data. These can occur for various reasons, among them relocation of the observation site, changes in observation method, and the transition to automated instruments. Changes to automated systems are inevitable and are taking place worldwide in many National Meteorological Services. However, this shift in observational practice must be made cautiously, and a sufficient period of parallel observation of co-located manual and automated systems should take place, as suggested by the World Meteorological Organization. With a sufficient parallel observation period, biases between the two systems can be analyzed. In this study we analyze the biases from a yearlong parallel observation of manual and automatic weather stations at 30 locations in Indonesia. The sites span approximately 45 degrees of longitude from east to west, covering different climate characteristics and geographical settings. We study temperature and rainfall measurements taken by both systems. We found that the biases between the two systems vary from place to place and depend more on the setup of the instruments than on climatic and geographical factors. For instance, daytime observations of the automatic weather stations are found to be consistently higher than the manual observations, whereas night-time observations of the automatic weather stations are lower than the manual observations.
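A bias analysis of this kind reduces to paired differences stratified by time of day; the following pandas sketch shows the shape of such a computation on synthetic co-located readings, with the bias pattern, day/night definition, and all magnitudes assumed for illustration.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    times = pd.date_range("2016-01-01", periods=24 * 30, freq="h")  # one month, hourly
    manual = 26.0 + 3.0 * np.sin((times.hour - 14) / 24 * 2 * np.pi) \
             + rng.normal(0, 0.3, len(times))
    day = times.hour.isin(range(6, 18))
    # Assumed bias pattern: AWS reads warm by day, cool by night
    aws = manual + np.where(day, 0.4, -0.3) + rng.normal(0, 0.2, len(times))

    df = pd.DataFrame({"time": times, "manual_t": manual, "aws_t": aws})
    df["bias"] = df["aws_t"] - df["manual_t"]          # AWS minus manual
    df["daytime"] = df["time"].dt.hour.between(6, 17)
    print(df.groupby("daytime")["bias"].agg(["mean", "std", "count"]))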
Large-Scale, Multi-Sensor Atmospheric Data Fusion Using Hybrid Cloud Computing
NASA Astrophysics Data System (ADS)
Wilson, Brian; Manipon, Gerald; Hua, Hook; Fetzer, Eric
2014-05-01
NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the "A-Train" platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over decades. Moving to multi-sensor, long-duration analyses of important climate variables presents serious challenges for large-scale data mining and fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the "cloud scenes" from CloudSat, and repeat the entire analysis over 10 years of data. To efficiently assemble such datasets, we are utilizing Elastic Computing in the Cloud and parallel map-reduce-based algorithms. However, these are data-intensive computing problems, so data transfer times and storage costs (for caching) are key issues. SciReduce is a Hadoop-like parallel analysis system, programmed in parallel Python, that is designed from the ground up for Earth science. SciReduce executes inside VMWare images and scales to any number of nodes in a hybrid Cloud (private Eucalyptus and public Amazon). Unlike Hadoop, SciReduce operates on bundles of named numeric arrays, which can be passed in memory or serialized to disk in netCDF4 or HDF5. Multi-year datasets are automatically "sharded" by time and space across a cluster of nodes so that years of data (millions of files) can be processed in a massively parallel way. Input variables (arrays) are pulled on demand into the Cloud using OPeNDAP URLs or other subsetting services, thereby minimizing the size of the cached input and intermediate datasets. We are using SciReduce to automate the production of multiple versions of a ten-year A-Train water vapor climatology under a NASA MEASURES grant. We will present the architecture of SciReduce, describe the achieved "clock time" speedups in fusing datasets on our own nodes and in the Cloud, and discuss the Cloud cost tradeoffs for storage, compute, and data transfer. We will also present a concept and prototype for staging NASA's A-Train Atmospheric datasets (Levels 2 & 3) in the Amazon Cloud so that any number of compute jobs can be executed "near" the multi-sensor data. Given such a system, multi-sensor climate studies over 10-20 years of data could be performed.
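The shard-then-reduce pattern described here can be illustrated with a minimal Python sketch: each yearly shard contributes a partial sum and count, and the reduce step combines them into an exact multi-year mean. The synthetic arrays below stand in for files that a real system would open via netCDF4 or OPeNDAP; none of this reflects SciReduce's actual interfaces.

    from concurrent.futures import ProcessPoolExecutor
    import numpy as np

    def shard_stats(year):
        """Map step: open one yearly shard and return its partial sum
        and count. A synthetic array stands in for a real data file."""
        rng = np.random.default_rng(year)
        q = rng.gamma(2.0, 2.0, size=(365, 90, 180))   # fake daily global field
        return q.sum(), q.size

    if __name__ == "__main__":
        years = range(2003, 2013)
        with ProcessPoolExecutor() as pool:        # one worker per shard
            partials = list(pool.map(shard_stats, years))
        total, n = map(sum, zip(*partials))        # reduce: exact decadal mean
        print("decadal mean:", total / n)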
The Kalman Filter and High Performance Computing at NASA's Data Assimilation Office (DAO)
NASA Technical Reports Server (NTRS)
Lyster, Peter M.
1999-01-01
Atmospheric data assimilation is a method of combining actual observations with model simulations to produce a more accurate description of the earth system than the observations alone provide. The output of data assimilation, sometimes called "the analysis," is a set of accurate, regular, gridded datasets of observed and unobserved variables. This is used not only for weather forecasting but is becoming increasingly important for climate research. For example, these datasets may be used to retrospectively assess energy budgets or the effects of trace gases such as ozone. This allows researchers to understand processes driving weather and climate, which have important scientific and policy implications. The primary goal of NASA's Data Assimilation Office (DAO) is to provide datasets for climate research and to support NASA satellite and aircraft missions. This presentation will: (1) describe ongoing work on the advanced Kalman/Lagrangian filter parallel algorithm for the assimilation of trace gases in the stratosphere; and (2) discuss the Kalman filter in relation to other presentations from the DAO on Four Dimensional Data Assimilation at this meeting. Although the designation "Kalman filter" is often used to describe the overarching work, the series of talks will show that the scientific software and the kinds of parallelization techniques being developed at the DAO are very different depending on the type of problem being considered, the extent to which the problem is mission critical, and the degree of Software Engineering that has to be applied.
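For orientation, the core of any Kalman filter analysis step is the gain computation and the blending of forecast and observations; the numpy sketch below shows a textbook dense-matrix update on a toy three-variable state, not the DAO's parallel trace-gas implementation.

    import numpy as np

    def kalman_update(x, P, y, H, R):
        """One analysis step: blend forecast x (covariance P) with
        observations y (operator H, error covariance R)."""
        S = H @ P @ H.T + R                    # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
        x_a = x + K @ (y - H @ x)              # analysis state
        P_a = (np.eye(len(x)) - K @ H) @ P     # analysis covariance
        return x_a, P_a

    # Toy example: 3 grid points, one observed directly at point 1
    x = np.array([1.0, 2.0, 3.0]); P = 0.5 * np.eye(3)
    H = np.array([[0.0, 1.0, 0.0]]); R = np.array([[0.1]])
    y = np.array([2.4])
    print(kalman_update(x, P, y, H, R))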
Understanding the major transitions in Quaternary climate dynamics
NASA Astrophysics Data System (ADS)
Willeit, Matteo; Ganopolski, Andrey
2017-04-01
Climate dynamics over the past 3 million years were characterized by strong variability associated with glacial cycles and several distinct regime changes. The Pliocene-Pleistocene Transition (PPT), which happened around 2.7 million years ago, was characterized by the appearance of the large continental ice sheets over Northern Eurasia and North America. For two million years after the PPT, climate variability was dominated by relatively symmetric 40 kyr cycles. Around 1 million years ago, the dominant mode of climate variability experienced a relatively rapid transition from 40 kyr to strongly asymmetric 100 kyr cycles of larger amplitude (Mid-Pleistocene Transition). Additionally, during the past 800 kyr there are clear differences between the earlier and the later glacial cycles, with the last five cycles characterized by a larger magnitude of variability (Mid-Brunhes Event). Here, we use the Earth system model of intermediate complexity CLIMBER-2 to explore possible mechanisms that could explain these regime shifts. CLIMBER-2 incorporates all major components of the Earth system - atmosphere, ocean, land surface, northern hemisphere ice sheets, terrestrial biota and soil carbon, marine biogeochemistry and aeolian dust. The model was optimally tuned to reproduce climate, ice volume and CO2 variability over the last 400,000 years. Using the same model version, we performed a large set of simulations covering the entire Quaternary (3 million years), starting from identical initial conditions and using a parallelization-in-time technique that consists of starting the model at different times (every 100,000 years) and running each simulation for 500,000 years. The Earth's orbital variations are the only prescribed radiative forcing. Several sets of Northern Hemisphere orography and sediment thickness representing different stages of landscape evolution during the Quaternary are prescribed as boundary conditions for the ice sheet model, and volcanic CO2 outgassing is used as the external forcing for the carbon cycle to allow for different background atmospheric CO2 concentrations. We show that by varying only these two boundary conditions and the volcanic forcing, the model is able to reproduce the major regime changes of Quaternary long-term climate dynamics.
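The parallelization-in-time technique amounts to launching many overlapping segments independently; the sketch below shows the dispatch pattern with a stand-in for the model run, assuming (as in the abstract) starts every 100 kyr and 500-kyr segments.

    from concurrent.futures import ProcessPoolExecutor

    def run_segment(start_kyr, length_kyr=500):
        """Stand-in for one CLIMBER-2-style run launched at start_kyr
        before present and integrated forward for length_kyr."""
        # ... call the actual model here ...
        return (start_kyr, start_kyr - length_kyr)

    if __name__ == "__main__":
        starts = range(3000, 0, -100)          # every 100 kyr over 3 Myr
        with ProcessPoolExecutor() as pool:
            segments = list(pool.map(run_segment, starts))
        # Overlapping 500-kyr windows let each later segment "forget"
        # its arbitrary initial state before the interval it is used for.
        print(len(segments), "segments dispatched")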
Surface energy fluxes and their representation in CMIP5 models
NASA Astrophysics Data System (ADS)
Wild, M.
2016-12-01
Energy fluxes at the Earth's surface play a key role in the determination of surface climate and in the coupling of the atmosphere, land and ocean components. Unlike their counterparts at the top of atmosphere (TOA), surface fluxes cannot be directly measured from satellites, but have to be inferred from spaceborne observations using additional models to account for atmospheric perturbations, or from the limited number of surface observations. Uncertainties in the energy fluxes at the surface have therefore traditionally been larger than at the TOA, and have limited our knowledge of the distribution of the energy flows within the climate system. Accordingly, current climate models still differ considerably in their representation of surface and atmospheric energy fluxes. Since the mid-1990s, accurate flux measurements have become increasingly available from surface networks such as BSRN, which make it possible to better constrain the surface energy fluxes. There is, however, still a lack of flux measurements, particularly over oceans. Further, the larger-scale representativeness of the station records needs to be assessed to judge their suitability as anchor sites for gridded flux products inferred from satellites, reanalyses and climate models. In addition, historic records need to be carefully quality-checked and homogenized. In parallel, satellite-derived products of surface fluxes benefit from the great advances in spaceborne observations since the turn of the millennium, and from improved validation capabilities with surface observations. Ultimately, it is the combination of surface and spaceborne observations, reanalyses and modeling approaches that will advance our knowledge of the distribution of the surface energy fluxes. Uncertainties remain in the determination of surface albedo, skin temperatures and the partitioning of surface net radiation into sensible and latent heat. Climate models over generations up to the present day (CMIP5) tend to overestimate the downward shortwave and underestimate the downward longwave radiation. The consistent representation of the global energy and water cycles also remains a challenge. Yet it is shown that those climate models with a realistic surface radiation balance also simulate global precipitation amounts within the uncertainty range of observational estimates.
Data informatics for the Detection, Characterization, and Attribution of Climate Extremes
NASA Astrophysics Data System (ADS)
Collins, W.; Wehner, M. F.; O'Brien, T. A.; Paciorek, C. J.; Krishnan, H.; Johnson, J. N.; Prabhat, M.
2015-12-01
The potential for increasing frequency and intensity of extreme phenomena including downpours, heat waves, and tropical cyclones constitutes one of the primary risks of climate change for society and the environment. The challenge of characterizing these risks is that extremes represent the "tails" of distributions of atmospheric phenomena and are, by definition, highly localized and typically relatively transient. Therefore very large volumes of observational data and projections of future climate are required to quantify their properties in a robust manner. Massive data analytics are required in order to detect individual extremes, accumulate statistics on their properties, quantify how these statistics are changing with time, and attribute the effects of anthropogenic global warming on these statistics. We describe examples of the suite of techniques the climate community is developing to address these analytical challenges. The techniques include massively parallel methods for detecting and tracking atmospheric rivers and cyclones; data-intensive extensions to generalized extreme value theory to summarize the properties of extremes; and multi-model ensembles of hindcasts to quantify the attributable risk of anthropogenic influence on individual extremes. We conclude by highlighting examples of these methods developed by our CASCADE (Calibrated and Systematic Characterization, Attribution, and Detection of Extremes) project.
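As one concrete instance of the extreme value machinery mentioned here, the following scipy sketch fits a generalized extreme value distribution to synthetic annual maxima and derives a 20-year return level; the data and numbers are illustrative only.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    annual_max = rng.gumbel(loc=35.0, scale=2.0, size=60)  # synthetic annual maxima (deg C)

    # Fit a generalized extreme value distribution to the block maxima
    shape, loc, scale = stats.genextreme.fit(annual_max)

    # 20-year return level: the value exceeded with probability 1/20 per year
    rl20 = stats.genextreme.ppf(1.0 - 1.0 / 20.0, shape, loc=loc, scale=scale)
    print(f"20-year return level: {rl20:.1f} deg C")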
Pyrolysis kinetics of algal consortia grown using swine manure wastewater.
Sharara, Mahmoud A; Holeman, Nathan; Sadaka, Sammy S; Costello, Thomas A
2014-10-01
In this study, pyrolysis kinetics of periphytic microalgae consortia grown using swine manure slurry under two seasonal climatic patterns in northwest Arkansas were investigated. Four heating rates (5, 10, 20 and 40 °C min(-1)) were used to determine the pyrolysis kinetics. Differences in proximate, ultimate, and heating value analyses reflected variability in growing substrate conditions, i.e., flocculant use, manure slurry dilution, and differences in diurnal solar radiation and air temperature regimes. The peak decomposition temperature of the algal harvests varied with the heating rate. Analyzing pyrolysis kinetics using differential and integral isoconversional methods (Friedman, Flynn-Wall-Ozawa, and Kissinger-Akahira-Sunose) showed a strong dependency of the apparent activation energy on the degree of conversion, suggesting a parallel reaction scheme. Consequently, the weight loss data in each thermogravimetric test were modeled using independent parallel reactions (IPR). The quality of fit (QOF) for the model ranged between 2.09% and 3.31%, indicating good agreement with the experimental data.
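To illustrate the isoconversional idea, the Kissinger-Akahira-Sunose method treats ln(beta/T^2) at a fixed degree of conversion as linear in 1/T, with slope -Ea/R; the sketch below recovers an apparent activation energy from hypothetical peak temperatures, not the paper's measured values.

    import numpy as np

    R = 8.314  # gas constant, J mol-1 K-1

    # Hypothetical temperatures (K) at which conversion alpha = 0.5 is
    # reached for each heating rate beta (K min-1); illustrative only.
    beta = np.array([5.0, 10.0, 20.0, 40.0])
    T_alpha = np.array([556.0, 566.0, 577.0, 589.0])

    # KAS: ln(beta/T^2) is linear in 1/T with slope -Ea/R
    slope, _ = np.polyfit(1.0 / T_alpha, np.log(beta / T_alpha**2), 1)
    print(f"apparent activation energy: {-slope * R / 1000:.0f} kJ/mol")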
Hadoop for High-Performance Climate Analytics: Use Cases and Lessons Learned
NASA Technical Reports Server (NTRS)
Tamkin, Glenn
2013-01-01
Scientific data services are a critical aspect of the mission of the NASA Center for Climate Simulation (NCCS). Hadoop, via MapReduce, provides an approach to high-performance analytics that is proving to be useful for data-intensive problems in climate research. It offers an analysis paradigm that uses clusters of computers and combines distributed storage of large data sets with parallel computation. The NCCS is particularly interested in the potential of Hadoop to speed up basic operations common to a wide range of analyses. In order to evaluate this potential, we prototyped a series of canonical MapReduce operations over a test suite of observational and climate simulation datasets. The initial focus was on averaging operations over arbitrary spatial and temporal extents within Modern-Era Retrospective Analysis for Research and Applications (MERRA) data. After preliminary results suggested that this approach improves efficiencies within data-intensive analytic workflows, we invested in building a cyberinfrastructure resource for developing a new generation of climate data analysis capabilities using Hadoop. This resource is focused on reducing the time spent in the preparation of reanalysis data used in data-model intercomparison, a long-sought goal of the climate community. This paper summarizes the related use cases and lessons learned.
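A canonical MapReduce averaging operation of the kind prototyped here can be sketched in a few lines of plain Python: the mapper emits (cell, (sum, count)) pairs and the reducer combines them into per-cell means. The records and the 1-degree binning below are hypothetical, not the NCCS implementation.

    from collections import defaultdict

    def mapper(record):
        """Emit one key/value pair per observation: key is a (lat, lon)
        cell, value a (sum, count) pair ready for associative reduction."""
        lat, lon, value = record
        yield (round(lat), round(lon)), (value, 1)

    def reducer(pairs):
        acc = defaultdict(lambda: [0.0, 0])
        for key, (s, n) in pairs:
            acc[key][0] += s
            acc[key][1] += n
        return {k: s / n for k, (s, n) in acc.items()}

    records = [(38.2, -76.9, 290.1), (38.4, -77.1, 291.3), (38.2, -76.8, 289.7)]
    pairs = (p for r in records for p in mapper(r))
    print(reducer(pairs))   # per-cell spatio-temporal mean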
ERIC Educational Resources Information Center
Boon, Helen
2009-01-01
Regional Australian students were surveyed to explore their understanding and knowledge of the greenhouse effect, ozone depletion and climate change. Results were compared with a parallel study undertaken in 1991 in a regional UK city. The comparison was conducted to investigate whether more awareness and understanding of these issues is…
NASA Astrophysics Data System (ADS)
Carvalhais, N.; Forkel, M.; Khomik, M.; Bellarby, J.; Migliavacca, M.; Thurner, M.; Beer, C.; Jung, M.; Mu, M.; Randerson, J. T.; Saatchi, S. S.; Santoro, M.; Reichstein, M.
2012-12-01
The turnover rates of carbon in terrestrial ecosystems and their sensitivity to climate are instrumental properties for diagnosing the interannual variability and forecasting trends of biogeochemical processes and carbon-cycle-climate feedbacks. We propose to look globally at the spatial distribution of turnover rates of carbon to explore the association between bioclimatic regimes and the rates at which carbon cycles in terrestrial ecosystems. Based on data-driven estimates of ecosystem carbon fluxes and stocks, it is possible to build fully observationally supported diagnostics. These data-driven diagnostics support the benchmarking of CMIP5 (Coupled Model Intercomparison Project Phase 5) model outputs against observationally based estimates. The models' performance is addressed by confronting spatial patterns of carbon fluxes and stocks with data, as well as the global and regional sensitivities of turnover rates to climate. Our results show strong latitudinal gradients globally, mostly controlled by temperature, which are not always paralleled by CMIP5 simulations. Colder northern regions are also where the largest differences in temperature sensitivity between models and data occur. Interestingly, there seem to be two different statistical populations in the data (some with high, others with low apparent temperature sensitivity of carbon turnover rates), where the different models only seem to describe one or the other population. Additionally, the comparisons within bioclimatic classes can even show opposite patterns between turnover rates and temperature in water-limited regions. Overall, our analysis emphasizes the role of finding patterns and intrinsic properties, instead of plain magnitudes of fluxes, for diagnosing the sensitivities of terrestrial biogeochemical cycles to climate. Further, our regional analysis suggests a significant gap in addressing the partial influence of water on ecosystem carbon turnover rates, especially in very cold or water-limited regions.
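The basic diagnostic is simple arithmetic: a whole-ecosystem turnover time is the carbon stock divided by the carbon influx, and an apparent temperature sensitivity is the slope of ln(turnover) against temperature. The sketch below shows this on made-up values; the variables and magnitudes are illustrative, not the study's data.

    import numpy as np

    # Hypothetical gridded values (illustrative magnitudes only)
    c_stock = np.array([12.0, 25.0, 60.0])   # total ecosystem carbon, kg C m-2
    gpp     = np.array([2.4, 1.0, 0.6])      # carbon influx, kg C m-2 yr-1
    mat     = np.array([25.0, 10.0, -5.0])   # mean annual temperature, deg C

    tau = c_stock / gpp                      # whole-ecosystem turnover time, yr
    # Apparent temperature sensitivity: slope of ln(tau) against MAT
    slope, _ = np.polyfit(mat, np.log(tau), 1)
    print(tau, f"d ln(tau)/dT = {slope:.3f} per deg C")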
Climate Information and Misinformation: Getting the Message Out
NASA Astrophysics Data System (ADS)
Carr, M.; Rubenstein, M.; Brash, K.; Hernandez, T. E.; Anderson, R. F.; Fulton, M.; Kahn, B.
2010-12-01
While it is commonly accepted that improved science comprehension is a key element to informed decisions on the many societal issues that interface with science and technology, it is not always clear what that understanding should entail. Is it knowledge of a set of facts and their context, the ability to read scientific papers, familiarity with data sets and their strengths and limitations, the development of original research? Physical scientists continue to operate assuming the deficit model: that lack of societal engagement results from ignorance or lack of information. Yet, in the case of climate, an active community of citizen scientists is engaged in a parallel research activity that aims to audit the basic tenets of the field, thus illustrating that greater literacy does not necessarily lead to consensus. Communication experts have long noted the inadequacy of the deficit model, highlighting the importance of prior knowledge, interests, and values. Science communicators recommend direct public engagement using non-traditional tools and fora. Here we explore three modes of engaging the public on the theme of climate change skepticism: a report published by a major financial institution (following a deficit model, but targeting a highly educated non-science community), blogging (using the broad potential reach and ongoing engagement of the internet), and student discussion groups (taking a participatory 'community outreach' approach).
Test Driven Development of a Parameterized Ice Sheet Component
NASA Astrophysics Data System (ADS)
Clune, T.
2011-12-01
Test driven development (TDD) is a software development methodology that offers many advantages over traditional approaches, including reduced development and maintenance costs, improved reliability, and superior design quality. Although TDD is widely accepted in many software communities, its suitability for scientific software is largely undemonstrated and warrants a degree of skepticism. Indeed, numerical algorithms pose several challenges to unit testing in general, and TDD in particular. Among these challenges are the need for simple, non-redundant closed-form expressions to compare against the results obtained from the implementation, as well as realistic error estimates. The necessity for serial and parallel performance raises additional concerns for many scientific applications. In previous work I demonstrated that TDD performed well for the development of a relatively simple numerical model that simulates the growth of snowflakes, but the results were anecdotal and of limited relevance to the far more complex software components typical of climate models. This investigation has now been extended by successfully applying TDD to the implementation of a substantial portion of a new parameterized ice sheet component within a full climate model. After a brief introduction to TDD, I will present techniques that address some of the obstacles encountered with numerical algorithms. I will conclude with some quantitative and qualitative comparisons against climate components developed in a more traditional manner.
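In the TDD style described here, the test encoding a closed-form expectation is written before the implementation; the pytest sketch below shows the pattern on a deliberately tiny, hypothetical degree-day melt function, not the actual ice sheet component.

    import pytest

    def surface_mass_balance(t_celsius, ddf=0.008):
        """Toy degree-day melt scheme, the kind of small unit under test.
        Melt (m w.e. per day) is proportional to positive degrees."""
        return ddf * max(t_celsius, 0.0)

    def test_no_melt_below_freezing():
        assert surface_mass_balance(-10.0) == 0.0

    def test_melt_matches_closed_form():
        # Closed-form expectation written *before* the implementation
        assert surface_mass_balance(5.0) == pytest.approx(0.04)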
Defining Top-of-Atmosphere Flux Reference Level for Earth Radiation Budget Studies
NASA Technical Reports Server (NTRS)
Loeb, N. G.; Kato, S.; Wielicki, B. A.
2002-01-01
To estimate the earth's radiation budget at the top of the atmosphere (TOA) from satellite-measured radiances, it is necessary to account for the finite geometry of the earth and recognize that the earth is a solid body surrounded by a translucent atmosphere of finite thickness that attenuates solar radiation differently at different heights. As a result, in order to account for all of the reflected solar and emitted thermal radiation from the planet by direct integration of satellite-measured radiances, the measurement viewing geometry must be defined at a reference level well above the earth's surface (e.g., 100 km). This ensures that all radiation contributions, including radiation escaping the planet along slant paths above the earth's tangent point, are accounted for. By using a field-of-view (FOV) reference level that is too low (such as the surface reference level), TOA fluxes for most scene types are systematically underestimated by 1-2 W/sq m. In addition, since TOA flux represents a flow of radiant energy per unit area, and varies with distance from the earth according to the inverse-square law, a reference level is also needed to define satellite-based TOA fluxes. From theoretical radiative transfer calculations using a model that accounts for spherical geometry, the optimal reference level for defining TOA fluxes in radiation budget studies for the earth is estimated to be approximately 20 km. At this reference level, there is no need to explicitly account for horizontal transmission of solar radiation through the atmosphere in the earth radiation budget calculation. In this context, therefore, the 20-km reference level corresponds to the effective radiative top of atmosphere for the planet. Although the optimal flux reference level depends slightly on scene type due to differences in effective transmission of solar radiation with cloud height, the difference in flux caused by neglecting the scene-type dependence is less than 0.1%. If an inappropriate TOA flux reference level is used to define satellite TOA fluxes, and horizontal transmission of solar radiation through the planet is not accounted for in the radiation budget equation, systematic errors in net flux of up to 8 W/sq m can result. Since climate models generally use a plane-parallel model approximation to estimate TOA fluxes and the earth radiation budget, they implicitly assume zero horizontal transmission of solar radiation in the radiation budget equation, and do not need to specify a flux reference level. By defining satellite-based TOA flux estimates at a 20-km flux reference level, comparisons with plane-parallel climate model calculations are simplified since there is no need to explicitly correct plane-parallel climate model fluxes for horizontal transmission of solar radiation through a finite earth.
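Since TOA flux scales with the inverse square of distance from the earth's center, rescaling a flux between reference levels is one line of arithmetic; the sketch below moves a flux referenced at a hypothetical 705-km orbit down to the 20-km reference level.

    R_EARTH = 6371.0   # mean Earth radius, km

    def flux_at_reference(flux, sat_alt_km, ref_alt_km=20.0):
        """Rescale a flux (W/sq m) referenced at satellite altitude to
        the 20-km reference level via the inverse-square law."""
        r_sat = R_EARTH + sat_alt_km
        r_ref = R_EARTH + ref_alt_km
        return flux * (r_sat / r_ref) ** 2

    # A 340 W/sq m flux referenced at a 705-km (Terra-like) orbit
    print(flux_at_reference(340.0, 705.0))   # ~417 W/sq m at 20 km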
Good Models Gone Bad: Quantifying and Predicting Parameter-Induced Climate Model Simulation Failures
NASA Astrophysics Data System (ADS)
Lucas, D. D.; Klein, R.; Tannahill, J.; Brandon, S.; Covey, C. C.; Domyancic, D.; Ivanova, D. P.
2012-12-01
Simulations using IPCC-class climate models can fail or crash for a variety of reasons. Statistical analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation failures of the Parallel Ocean Program (POP2). About 8.5% of our POP2 runs failed for numerical reasons at certain combinations of parameter values. We apply support vector machine (SVM) classification from the fields of pattern recognition and machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. The SVM classifiers readily predict POP2 failures in an independent validation ensemble, and are subsequently used to determine the causes of the failures via a global sensitivity analysis. Four parameters related to ocean mixing and viscosity are identified as the major sources of POP2 failures. Our method can be used to improve the robustness of complex scientific models to parameter perturbations and to better steer UQ ensembles. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).
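A minimal version of this classification workflow, with synthetic data standing in for the 18-dimensional POP2 parameter samples and their pass/fail labels, might look like the scikit-learn sketch below; the failure rule is invented so that roughly 8% of the runs "crash."

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.uniform(size=(2000, 18))            # 18 scaled ocean parameters
    y = (X[:, 0] + X[:, 3] > 1.6).astype(int)   # synthetic crash rule, ~8% failures

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
    clf.fit(X_tr, y_tr)

    print("validation accuracy:", clf.score(X_te, y_te))
    print("P(failure) for one candidate:", clf.predict_proba(X_te[:1])[0, 1])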
How to make a tree ring: Coupling stem water flow and cambial activity in mature Alpine conifers
NASA Astrophysics Data System (ADS)
Peters, Richard L.; Frank, David C.; Treydte, Kerstin; Steppe, Kathy; Kahmen, Ansgar; Fonti, Patrick
2017-04-01
Inter-annual tree-ring measurements are used to understand tree-growth responses to climatic variability and reconstruct past climate conditions. In parallel, mechanistic models use experimentally defined plant-atmosphere interactions to explain past growth responses and predict future environmental impact on forest productivity. Yet, substantial inconsistencies within mechanistic model ensembles and mismatches with empirical data indicate that significant progress is still needed to understand the processes occurring at an intra-annual resolution that drive annual growth. However, challenges arise from (i) the scarcity of datasets describing the climatic responses of high-resolution physiological processes over longer time-scales, (ii) uncertainty about the main mechanistic process limiting radial stem growth, and (iii) complex interactions between multiple environmental factors that obscure detection of the main stem-growth driver, generating a gap between our understanding of intra- and inter-annual growth mechanisms. We attempt to bridge the gap between inter-annual tree-ring width and sub-daily radial stem growth and provide a mechanistic perspective on how environmental conditions affect the physiological processes that shape tree rings in conifers. We combine sub-hourly sap flow and point dendrometer measurements performed on mature Alpine conifers (Larix decidua) into an individual-based mechanistic tree-growth model to simulate sub-hourly cambial activity. The monitored trees are located along a high-elevation transect in the Swiss Alps (Lötschental) to analyse the effect of increasing temperature. The model quantifies internal tree hydraulic pathways that regulate the turgidity within the cambial zone and induce cell enlargement for radial growth. The simulations are validated against intra-annual growth patterns derived from xylogenesis data and anatomical analyses. Our efforts advance the process-based understanding of how climate shapes annual tree-ring structures and could potentially improve our ability to reconstruct the climate of the past and predict future growth under a changing climate.
Numerical Modeling of Geomorphic Change on Sandy Coasts as a Function of Changing Wave Climate
NASA Astrophysics Data System (ADS)
Adams, P. N.; McNamara, D.; Murray, A. B.; Lovering, J.
2009-12-01
Climate change is expected to affect sandy coast geomorphology through two principal mechanisms: (1) sea level rise, which affects cross-shore sediment transport tending to drive shoreline retreat, and (2) alteration of statistical distributions in ocean storm wave climate (deep water wave height, period, and direction), which affects longshore sediment transport gradients that result in shoreline erosion and accretion. To address potential climate change-driven effects on longshore sediment transport gradients, we are developing techniques to link various numerical models of wave transformation with several different longshore sediment transport formulae in accordance with the Community Surface Dynamics Modeling System (CSDMS) project. Results of the various wave transformation models are compared to field observations of cross-shelf wave transformation along the North Florida Atlantic coast for purposes of model verification and calibration. Initial comparisons between wave-transformation methods (assumption of shore-parallel contours, simple wave ray tracing, and the SWAN spectral wave model) on artificially constructed continental shelves reveal an increasing discrepancy of results for increasing complexity of shelf bathymetry. When the more advanced SWAN spectral wave model is coupled with a simple CERC-type formulation of longshore sediment transport and applied to a real coast with complex offshore shoals (Cape Canaveral region of the North Florida Atlantic Coast), the patterns of erosion and accretion agree with results of the simplest wave-propagation models for some wave conditions, but disagree in others. Model simulations in which wave height and period are held constant show that locations of divergence and convergence of sediment flux shift with deep water wave-approach angle in ways that would not always be predicted using less sophisticated wave propagation models. Thus, predicting long-term local shoreline change on actual coastlines featuring complex bathymetry requires the extra computational effort to run the more advanced model over a wide range of wave conditions.
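The simplest of the transport formulae referred to here is a CERC-type relation in which potential longshore transport scales as breaking wave height to the 5/2 power times sin(2 x breaking angle); the sketch below uses schematic units and an illustrative coefficient, not the calibrated formulation of the study.

    import numpy as np

    def cerc_transport(h_b, alpha_b_deg, k=0.39):
        """Simplified CERC-type potential longshore transport:
        proportional to breaking wave height**(5/2) and sin(2*angle).
        The coefficient k and units are schematic here."""
        return k * h_b**2.5 * np.sin(2.0 * np.radians(alpha_b_deg))

    # Gradients in transport between neighbouring shoreline cells imply
    # erosion (divergence) or accretion (convergence)
    q = cerc_transport(np.array([1.2, 1.3, 1.1]), np.array([8.0, 12.0, 5.0]))
    print(np.diff(q))   # positive: flux increasing downdrift -> erosion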
Validating Satellite-Retrieved Cloud Properties for Weather and Climate Applications
NASA Astrophysics Data System (ADS)
Minnis, P.; Bedka, K. M.; Smith, W., Jr.; Yost, C. R.; Bedka, S. T.; Palikonda, R.; Spangenberg, D.; Sun-Mack, S.; Trepte, Q.; Dong, X.; Xi, B.
2014-12-01
Cloud properties determined from satellite imager radiances are increasingly used in weather and climate applications, particularly in nowcasting, model assimilation and validation, trend monitoring, and precipitation and radiation analyses. The value of using the satellite-derived cloud parameters is determined by the accuracy of the particular parameter for a given set of conditions, such as viewing and illumination angles, surface background, and cloud type and structure. Because of the great variety of those conditions and of the sensors used to monitor clouds, determining the accuracy or uncertainties in the retrieved cloud parameters is a daunting task. Sensitivity studies of the retrieved parameters to the various inputs for a particular cloud type are helpful for understanding the errors associated with the retrieval algorithm relative to the plane-parallel world assumed in most of the model clouds that serve as the basis for the retrievals. Real-world clouds, however, rarely fit the plane-parallel mold and generate radiances that likely produce much greater errors in the retrieved parameters than can be inferred from sensitivity analyses. Thus, independent, empirical methods are used to provide a more reliable uncertainty analysis. At NASA Langley, cloud properties have been retrieved from both geostationary (GEO) and low-earth-orbiting (LEO) satellite imagers for climate monitoring and model validation as part of the NASA CERES project since 2000, and from AVHRR data since 1978 as part of the NOAA CDR program. Cloud properties are also being retrieved in near-real time globally from both GEO and LEO satellites for weather model assimilation and nowcasting for hazards such as aircraft icing. This paper discusses the various independent datasets and approaches used to assess the imager-based satellite cloud retrievals, including, but not limited to, data from ARM sites, CloudSat, and CALIPSO; the methods employed to utilize them in the cloud property retrieval validation process; and the results and how they aid future development of the retrieval algorithms. Future needs are also discussed.
Conservation in the face of climate change: recent developments
Lawler, Joshua; Watson, James; Game, Edward
2015-01-01
An increased understanding of the current and potential future impacts of climate change has significantly influenced conservation in practice in recent years. Climate change has necessitated a shift toward longer planning time horizons, moving baselines, and evolving conservation goals and targets. This shift has resulted in new perspectives on, and changes in, the basic approaches practitioners use to conserve biodiversity. Restoration, spatial planning and reserve selection, connectivity modelling, extinction risk assessment, and species translocations have all been reimagined in the face of climate change. Restoration is being conducted with a new acceptance of uncertainty and an understanding that goals will need to shift through time. New conservation targets, such as geophysical settings and climatic refugia, are being incorporated into conservation plans. Risk assessments have begun to consider the potentially synergistic impacts of climate change and other threats. Assisted colonization has gained acceptance in recent years as a viable and necessary conservation tool. This evolution has paralleled a larger trend in conservation—a shift toward conservation actions that benefit both people and nature. As we look forward, it is clear that more change is on the horizon. To protect biodiversity and essential ecosystem services, conservation will need to anticipate the human response to climate change and to focus not only on resistance and resilience but on transitions to new states and new ecosystems. PMID:26937271
Thompson, R.S.; Fleming, R.F.
1996-01-01
The general characteristics of global vegetation during the middle Pliocene warm period can be reconstructed from fossil pollen and plant megafossil data. The largest differences between Pliocene vegetation and that of today occurred at high latitudes in both hemispheres, where warming was pronounced relative to today. In the Northern Hemisphere coniferous forests lived in the modern tundra and polar desert regions, whereas in the Southern Hemisphere southern beech apparently grew in coastal areas of Antarctica. Pliocene middle-latitude vegetation differed less, although moister-than-modern conditions supported forest and woodland growth in some regions now covered by steppe or grassland. Pliocene tropical vegetation reflects essentially modern conditions in some regions and slightly cooler-than- or warmer-than-modern climates in other areas. Changes in topography induced by tectonics may be responsible for many of the climatic changes since the Pliocene in both middle and lower latitudes. However, the overall latitudinal progression of climatic conditions on land parallels that seen in the reconstruction of middle Pliocene sea-surface temperatures. Pliocene paleovegetational data were employed to construct a 2° × 2° global grid of estimated mid-Pliocene vegetational cover for use as boundary conditions for numerical General Circulation Model simulations of middle Pliocene climates. Continental outlines and topography were first modified to represent the Pliocene landscape on the 2° × 2° grid. A modern 1° × 1° vegetation grid was simplified and mapped onto this Pliocene grid, and then modified following general geographic trends evident in the Pliocene paleovegetation data set.
NASA Astrophysics Data System (ADS)
Reyes, A. V.; Wolfe, A. P.; Royer, D. L.; Greenwood, D. R.; Tierney, J. E.; Doria, G.; Gagen, M. H.; Siver, P.; Westgate, J.
2016-12-01
Eocene paleoclimate reconstructions are rarely accompanied by parallel estimates of CO2, complicating assessment of the equilibrium climate responses to CO2. We reconstruct temperature, precipitation, and CO2 from latest middle Eocene (~38 Myr ago) peats in subarctic Canada, preserved in sediments that record the infilling of a kimberlite-pipe maar crater. Mutual climatic range analyses of pollen, together with oxygen isotope analyses of α-cellulose from unpermineralized wood and inferences from branched glycerol dialkyl glycerol tetraethers (GDGTs), reveal a high-latitude humid-temperate forest ecosystem with mean annual temperatures (MATs) >17 °C warmer than present, mean coldest-month temperatures above 0 °C, and mean annual precipitation ~4× present. Metasequoia stomatal indices and gas-exchange modeling produce median CO2 concentrations of 634 and 432 ppm, respectively, with a consensus median estimate of 494 ppm. Reconstructed MATs are >6 °C warmer than those produced by Eocene climate models forced at 560 ppm CO2, underscoring the capacity for exceptional polar amplification of warming and hydrological intensification under relatively modest CO2 concentrations, once both fast and slow feedbacks become expressed.
NASA Astrophysics Data System (ADS)
Pribulick, C. E.; Maxwell, R. M.; Williams, K. H.; Carroll, R. W. H.
2014-12-01
Prediction of environmental response to global climate change is paramount for regions that rely upon snowpack for their dominant water supply. Temperature increases are anticipated to be greater at higher elevations, perturbing hydrologic systems that provide water to millions of downstream users. In this study, the relationships between large-scale climatic change and the corresponding small-scale hydrologic processes of mountainous terrain are investigated in the East River headwaters catchment near Gothic, CO. This catchment is emblematic of many others within the upper Colorado River Basin; it covers an area of 250 square kilometers, has a topographic relief of 1420 meters and an average elevation of 3266 meters, and has varying stream characteristics. This site allows for the examination of the varying effects of climate-induced changes on the hydrologic response of three different characteristic components of the catchment: a steep high-energy mountain system, a medium-grade lower-energy system and a low-grade low-energy meandering floodplain. To capture the surface and subsurface heterogeneity of this headwaters system, the basin has been modeled at a 10-meter resolution using ParFlow, a parallel, integrated hydrologic model. Driven by meteorological forcing, ParFlow is able to capture land surface processes and represents surface and subsurface interactions through saturated and variably saturated heterogeneous flow. Data from Digital Elevation Models (DEMs), land cover, permeability, geologic and soil maps, and on-site meteorological stations were prepared, analyzed and input into ParFlow as layers, with a grid comprising 1403 by 1685 cells to best represent the small-scale, high-resolution model domain. Water table depth, soil moisture, soil temperature, snowpack, runoff and local energy budget values provide useful insight into the catchment's response to the Intergovernmental Panel on Climate Change (IPCC) temperature projections. In the near term, coupling this watershed model with one describing a diverse suite of subsurface elemental cycling pathways, including carbon and nitrogen, will provide an improved understanding of the response of subsurface ecosystems to hydrologic transitions induced as a result of global climate change.
Algorithm of dynamic regulation of a system of duct, for a high accuracy climatic system
NASA Astrophysics Data System (ADS)
Arbatskiy, A. A.; Afonina, G. N.; Glazov, V. S.
2017-11-01
Currently, most climatic systems are static, operating as designed only at their projected (design) conditions. At the same time, many modern industrial sites require constant or periodic changes in the technological process, so that roughly 80% of the time the site does not need the ventilation system at its design point, while high precision of the climatic parameters must still be maintained. When a climatic system serves several rooms in parallel and is not in constant use, balancing the duct system becomes a problem. To solve it, an algorithm for quantity (flow) regulation with minimal changes was created. Dynamic duct system: a parallel control system for air balance was developed that maintains high precision of the climatic parameters. The algorithm maintains a constant pressure in the main duct under varying air flows, so each terminal device has only one regulation parameter: its flap open area. The precision of regulation increases, and the climatic system holds tight tolerances on temperature and humidity (0.5 °C for temperature, 5% for relative humidity). Result: the research was carried out in the CFD code PHOENICS. Results for air velocity and pressure in the duct for different operating modes were obtained, and an equation for air valve positions as a function of the required room climate parameters was derived. The energy-saving potential of the dynamic duct system was calculated for different types of rooms.
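One way to realize constant main-duct pressure under varying flow is a simple proportional controller on the fan; the sketch below is an illustrative loop with a made-up setpoint, gain, and pressure readings, not the authors' derived regulation equation.

    def fan_update(speed, p_measured, p_setpoint=150.0, gain=0.01,
                   s_min=0.2, s_max=1.0):
        """Proportional control of fan speed to hold main-duct static
        pressure at the setpoint (Pa); constants are illustrative only."""
        error = p_setpoint - p_measured
        return min(max(speed + gain * error, s_min), s_max)

    speed = 0.6
    for p in [120.0, 135.0, 149.0, 152.0]:   # hypothetical readings as flaps move
        speed = fan_update(speed, p)
        print(f"P = {p:5.1f} Pa -> fan speed {speed:.2f}")

With duct pressure held constant, each room's terminal device can regulate its own air flow through flap open area alone, which is the single-parameter regulation the abstract describes.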
NASA Astrophysics Data System (ADS)
Hamann, Ilse; Arnault, Joel; Bliefernicht, Jan; Klein, Cornelia; Heinzeller, Dominikus; Kunstmann, Harald
2014-05-01
Changing climate and hydro-meteorological boundary conditions are among the most severe challenges to Africa in the 21st century. In particular, West Africa faces an urgent need to develop effective adaptation and mitigation strategies to cope with negative impacts on humans and the environment due to climate change, increased hydro-meteorological variability and land use changes. To help meet these challenges, the German Federal Ministry of Education and Research (BMBF) started an initiative with institutions in Germany and West African countries to jointly establish a West African Science Service Center on Climate Change and Adapted Land Use (WASCAL). This activity is accompanied by the establishment of trans-boundary observation networks, an interdisciplinary core research program and graduate research programs on climate change and related issues for strengthening the analytical capabilities of the Science Service Center. A key research activity of the WASCAL Competence Center is the provision of regional climate simulations at a fine spatio-temporal resolution for the core research sites of WASCAL for the present and the near future. The climate information is needed for subsequent local climate impact studies in agriculture, water resources and further socio-economic sectors. The simulation experiments are performed using regional climate models such as COSMO-CLM, RegCM and WRF, together with statistical techniques for a further refinement of the projections. The core research sites of WASCAL are located in the Sudanian Savannah belt in Northern Ghana, Southern Burkina Faso and Northern Benin. The climate in this region is semi-arid with six rainy months. Due to the strong population growth in West Africa, many areas of the Sudanian Savannah have already been converted to farmland, since the majority of the people live directly or indirectly from the income produced in agriculture. The simulation experiments of the Competence Center and the Core Research Program are accompanied by the WASCAL Graduate Research Program on the West African Climate System (GRP-WACS), which provides ten three-year scholarships per year for West African PhD students. Present and future WASCAL PhD students will constitute one important user group of the Linux cluster that will be installed at the Competence Center in Ouagadougou, Burkina Faso. Regional land-atmosphere simulations: a key research activity of the WASCAL Core Research Program is the analysis of interactions between the land surface and the atmosphere to investigate how land surface changes affect hydro-meteorological surface fluxes such as evapotranspiration. Since current land surface models of global and regional climate models neglect dominant lateral hydrological processes such as surface runoff, a novel land surface model is used, the NCAR Distributed Hydrological Modeling System (NDHMS). This model can be coupled to WRF (WRF-Hydro) to perform two-way coupled atmospheric-hydrological simulations for the watershed of interest. Hardware and network prerequisites include an HPC cluster, network switches, internal storage media, and Internet connectivity of sufficient bandwidth. Competences needed are HPC, storage, and visualization systems optimized for climate research, parallelization and optimization of climate models and workflows, and efficient management of very large data volumes.
Quantifying the Climate-Scale Accuracy of Satellite Cloud Retrievals
NASA Astrophysics Data System (ADS)
Roberts, Y.; Wielicki, B. A.; Sun-Mack, S.; Minnis, P.; Liang, L.; Di Girolamo, L.
2014-12-01
Instrument calibration and cloud retrieval algorithms have been developed to minimize retrieval errors on small scales. However, measurement uncertainties and assumptions within retrieval algorithms at the pixel level may alias into decadal-scale trends of cloud properties. We first, therefore, quantify how instrument calibration changes could alias into cloud property trends. For a perfect observing system the climate trend accuracy is limited only by the natural variability of the climate variable. Alternatively, for an actual observing system, the climate trend accuracy is additionally limited by the measurement uncertainty. Drifts in calibration over time may therefore be disguised as a true climate trend. We impose absolute calibration changes to MODIS spectral reflectance used as input to the CERES Cloud Property Retrieval System (CPRS) and run the modified MODIS reflectance through the CPRS to determine the sensitivity of cloud properties to calibration changes. We then use these changes to determine the impact of instrument calibration changes on trend uncertainty in reflected solar cloud properties. Secondly, we quantify how much cloud retrieval algorithm assumptions alias into cloud optical retrieval trends by starting with the largest of these biases: the plane-parallel assumption in cloud optical thickness (τC) retrievals. First, we collect liquid water cloud fields obtained from Multi-angle Imaging Spectroradiometer (MISR) measurements to construct realistic probability distribution functions (PDFs) of 3D cloud anisotropy (a measure of the degree to which clouds depart from plane-parallel) for different ISCCP cloud types. Next, we will conduct a theoretical study with dynamically simulated cloud fields and a 3D radiative transfer model to determine the relationship between 3D cloud anisotropy and 3D τC bias for each cloud type. Combining these results provides distributions of 3D τC bias by cloud type. Finally, we will estimate the change in frequency of occurrence of cloud types between two decades and will have the information needed to calculate the total change in 3D optical thickness bias between two decades. If we uncover aliases in this study, the results will motivate the development and rigorous testing of climate specific cloud retrieval algorithms.
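The aliasing mechanism in the first part of this study can be demonstrated in a few lines: impose an assumed multiplicative calibration drift on a trend-free series and fit linear trends to both versions; all magnitudes below are illustrative, not CERES/MODIS numbers.

    import numpy as np

    rng = np.random.default_rng(2)
    years = np.arange(2000, 2020)
    albedo = 0.30 + 0.002 * rng.standard_normal(len(years))   # flat truth + noise

    drift = 0.001 * (years - years[0]) / 10.0   # assumed 0.1%-per-decade drift
    measured = albedo * (1.0 + drift)           # drift imposed on the record

    true_trend = np.polyfit(years, albedo, 1)[0]
    obs_trend = np.polyfit(years, measured, 1)[0]
    print(f"spurious trend from drift alone: {(obs_trend - true_trend) * 10:.5f} per decade")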
Fu, Yongshuo H; Campioli, Matteo; Deckmyn, Gaby; Janssens, Ivan A
2012-01-01
Budburst phenology is a key driver of ecosystem structure and functioning, and it is sensitive to global change. Both cold winter temperatures (chilling) and spring warming (forcing) are important for budburst. Future climate warming is expected to have contrasting effects on chilling and forcing, and consequently a non-linear effect on budburst timing. To clarify the different effects of warming during the chilling and forcing phases of budburst phenology in deciduous trees, (i) we conducted a temperature manipulation experiment, with separate winter and spring warming treatments, on well-irrigated and fertilized saplings of beech, birch and oak, and (ii) we analyzed the observations with five temperature-based budburst models (Thermal Time model, Parallel model, Sequential model, Alternating model, and Unified model). The results show that both winter warming and spring warming significantly advanced the budburst date, with the combination of winter plus spring warming accelerating budburst most. As expected, all three species were more sensitive to spring warming than to winter warming. Despite their different chilling requirements, the warming sensitivity did not differ significantly among the studied species. Model evaluation showed that both one- and two-phase models (without and with chilling, respectively) are able to accurately predict budburst. For beech, the Sequential model reproduced budburst dates best. For oak and birch, both the Sequential model and the Thermal Time model yielded a good fit to the data, but the latter was slightly better in the case of high parameter uncertainty. However, for late-flushing species, the Sequential model is likely to be the most appropriate for predicting budburst dates in a future warmer climate.
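Of the five models compared, the Thermal Time model is the simplest: budburst is predicted when growing degree days accumulated above a base temperature reach a critical forcing sum. The sketch below implements that rule with illustrative (not fitted) parameter values; the Sequential model would add a chilling requirement that must be satisfied before forcing begins to accumulate.

    def thermal_time_budburst(daily_t, t_base=5.0, f_crit=120.0, start_doy=1):
        """Thermal Time (growing degree day) model: budburst occurs when
        forcing units accumulated above t_base reach f_crit.
        Parameter values are illustrative, not fitted."""
        forcing = 0.0
        for doy, t in enumerate(daily_t, start=start_doy):
            forcing += max(t - t_base, 0.0)
            if forcing >= f_crit:
                return doy            # predicted budburst day of year
        return None                   # threshold never reached

    # Hypothetical spring: mean daily temperature rising through the season
    temps = [2.0 + 0.15 * d for d in range(200)]
    print(thermal_time_budburst(temps))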
NASA Astrophysics Data System (ADS)
Haustein, Karsten; Otto, Friederike; Uhe, Peter; Allen, Myles; Cullen, Heidi
2015-04-01
Extreme weather detection and attribution analysis has emerged as a core theme in climate science over the last decade or so. By using a combination of observational data and climate models it is possible to identify the role of climate change in certain types of extreme weather events, such as sea level rise and its contribution to storm surges, extreme heat events and droughts, or heavy rainfall and flood events. These analyses are usually carried out after an extreme event has occurred, when reanalysis and observational data become available. The Climate Central WWA project will exploit the increasing forecast skill of seasonal prediction systems such as the UK Met Office GloSea5 (Global seasonal forecasting system). This way, the current weather can be fed into climate models to simulate large ensembles of possible weather scenarios before an event has fully emerged. This effort runs along parallel and intersecting tracks of science and communications that involve research, message development and testing, staged socialization of attribution science with key audiences, and dissemination. The method we employ uses a very large ensemble of simulations of regional climate models to run two different analyses: one to represent the current climate as it was observed, and one to represent the same events in a world that might have been without human-induced climate change. For the weather "as observed" experiment, the atmospheric model uses observed sea surface temperature (SST) data from GloSea5 (currently) and present-day atmospheric gas concentrations to simulate weather events that are possible given the observed climate conditions. The weather in the "world that might have been" experiments is obtained by removing the anthropogenic forcing from the observed SSTs, thereby simulating a counterfactual world without human activity. The anthropogenic forcing is obtained by comparing the CMIP5 historical and natural simulations from a variety of CMIP5 model ensembles. Here, we present results for the UK 2013/14 winter floods as proof of concept, and we show validation and testing results that demonstrate the robustness of our method. We also revisit the record temperatures over Europe in 2014 and present a detailed analysis of this attribution exercise, as an example of how a sensible statement about the changed odds of such a year can be made while the event is still unfolding.
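Given the two ensembles, the attribution statement typically reduces to a probability ratio and a fraction of attributable risk (FAR); the sketch below computes both from synthetic ensemble values and an arbitrary event threshold.

    import numpy as np

    def attribution_stats(actual, counterfactual, threshold):
        """Probability ratio and fraction of attributable risk (FAR)
        from two large ensembles of a seasonal metric."""
        p1 = np.mean(np.asarray(actual) >= threshold)          # with human forcing
        p0 = np.mean(np.asarray(counterfactual) >= threshold)  # world that might have been
        return p1 / p0, 1.0 - p0 / p1

    rng = np.random.default_rng(3)
    actual = rng.normal(310.0, 40.0, 10_000)   # synthetic winter rainfall (mm)
    natural = rng.normal(295.0, 40.0, 10_000)
    pr, far = attribution_stats(actual, natural, threshold=380.0)
    print(f"probability ratio {pr:.2f}, FAR {far:.2f}")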
NASA Technical Reports Server (NTRS)
Fox-Rabinovitz, Michael S.; Takacs, Lawrence L.; Suarez, Max; Sawyer, William; Govindaraju, Ravi C.
1999-01-01
The results obtained with the variable-resolution stretched grid (SG) GEOS GCM (Goddard Earth Observing System General Circulation Model) are discussed, with emphasis on the regional down-scaling effects and their dependence on the stretched grid design and parameters. A variable-resolution SG-GCM and SG-DAS, using a global stretched grid with fine resolution over an area of interest, is a viable new approach to REGIONAL and subregional CLIMATE studies and applications. The stretched grid approach is an ideal tool for representing regional to global scale interactions. It is an alternative to the widely used nested grid approach introduced a decade ago as a pioneering step in regional climate modeling. The GEOS SG-GCM is used for simulations of the anomalous U.S. climate events of the 1988 drought and 1993 flood, with enhanced regional resolution. The height and low-level jet fields, precipitation, and other diagnostic patterns are successfully simulated and show efficient down-scaling over the area of interest, the U.S. An imitation of the nested grid approach is performed using the developed SG-DAS (Data Assimilation System) that incorporates the SG-GCM. The SG-DAS is run with data withheld over the area of interest. The design imitates the nested grid framework with boundary conditions provided from analyses. No boundary-condition buffer is needed in this case due to the global domain of integration used for the SG-GCM and SG-DAS. The experiments based on the newly developed versions of the GEOS SG-GCM and SG-DAS, with finer 0.5 degree (and higher) regional resolution, are briefly discussed. The major aspects of parallelization of the SG-GCM code are outlined. The KEY OBJECTIVES of the study are: 1) obtaining an efficient DOWN-SCALING over the area of interest with fine and very fine resolution; 2) providing CONSISTENT interactions between regional and global scales, including the consistent representation of regional ENERGY and WATER BALANCES; 3) providing high computational efficiency for future SG-GCM and SG-DAS versions using PARALLEL codes.
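The essence of a stretched grid is a monotone mapping that concentrates grid points over the area of interest; the sketch below builds a schematic 1-D stretched latitude grid using a Gaussian spacing weight, which is only an illustration and not the GEOS SG-GCM's actual mapping.

    import numpy as np

    def stretched_latitudes(n_cells=180, lat_min=-90.0, lat_max=90.0,
                            center=38.0, half_width=15.0, ratio=4.0):
        """Schematic 1-D stretched grid: cell spacing is ~`ratio` times
        finer near `center` than far away. Illustrative only."""
        lat_u = np.linspace(lat_min, lat_max, n_cells)  # uniform placeholder latitudes
        # spacing weight: small (fine cells) near the center, ~1 far away
        w = 1.0 - (1.0 - 1.0 / ratio) * np.exp(-((lat_u - center) / half_width) ** 2)
        edges = np.concatenate([[0.0], np.cumsum(w)])
        return lat_min + (lat_max - lat_min) * edges / edges[-1]  # n_cells + 1 edges

    lat = stretched_latitudes()
    d = np.diff(lat)
    print(f"finest {d.min():.2f} deg near {lat[np.argmin(d)]:.0f} N, coarsest {d.max():.2f} deg")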
Anthropogenic Sulfate, Clouds, and Climate Forcing
NASA Technical Reports Server (NTRS)
Ghan, Steven J.
1997-01-01
This research work is a joint effort between research groups at the Battelle Pacific Northwest Laboratory, Virginia Tech, the Georgia Institute of Technology, Brookhaven National Laboratory, and Texas A&M University. It has been jointly sponsored by the National Aeronautics and Space Administration, the U.S. Department of Energy, and the U.S. Environmental Protection Agency. In this research, a detailed tropospheric aerosol-chemistry model that predicts oxidant concentrations as well as concentrations of sulfur dioxide and sulfate aerosols has been coupled to a general circulation model that distinguishes between cloud water mass and cloud droplet number. The coupled model system was first validated and then used to estimate the radiative impact of anthropogenic sulfur emissions. Both the direct radiative impact of the aerosols and their indirect impact through their influence on cloud droplet number are represented by distinguishing between sulfuric acid vapor and fresh and aged sulfate aerosols, and by parameterizing cloud droplet nucleation in terms of vertical velocity and the number concentration of aged sulfate aerosols. Natural sulfate aerosols, dust, and carbonaceous and nitrate aerosols, and their influence on the radiative impact of anthropogenic sulfate aerosols through competition as cloud condensation nuclei, will also be simulated. Parallel simulations with and without anthropogenic sulfur emissions are performed for a global domain. The objectives of the research are: to couple a state-of-the-art tropospheric aerosol-chemistry model with a global climate model; to use field and satellite measurements to evaluate the treatment of tropospheric chemistry and aerosol physics in the coupled model; and to use the coupled model to simulate the radiative (and ultimately climatic) impacts of anthropogenic sulfur emissions.
NASA Astrophysics Data System (ADS)
Ercan, Mehmet Bulent
Watershed-scale hydrologic models are used for a variety of applications, from flood prediction, to drought analysis, to water quality assessments. A particular challenge in applying these models is calibration of the model parameters, many of which are difficult to measure at the watershed scale. A primary goal of this dissertation is to contribute new computational methods and tools for calibration of watershed-scale hydrologic models, and the Soil and Water Assessment Tool (SWAT) model in particular. SWAT is a physically-based, watershed-scale hydrologic model developed to predict the impact of land management practices on water quality and quantity. The dissertation follows a manuscript format, meaning it is comprised of three separate but interrelated research studies. The first two research studies focus on SWAT model calibration, and the third presents an application of the new calibration methods and tools to study climate change impacts on water resources in the Upper Neuse Watershed of North Carolina using SWAT. The objective of the first two studies is to overcome computational challenges associated with calibration of SWAT models. The first study evaluates a parallel SWAT calibration tool built using the Windows Azure cloud environment and a parallel version of the Dynamically Dimensioned Search (DDS) calibration method modified to run in Azure. The calibration tool was tested for six model scenarios constructed using three watersheds of increasing size (the Eno, Upper Neuse, and Neuse) for both 2-year and 10-year simulation durations. Leveraging the cloud as an on-demand computing resource allowed for a significantly reduced calibration time, such that calibration of the Neuse watershed went from taking 207 hours on a personal computer to only 3.4 hours using 256 cores in the Azure cloud. The second study aims at increasing SWAT model calibration efficiency by creating an open source, multi-objective calibration tool using the Non-Dominated Sorting Genetic Algorithm II (NSGA-II). This tool was demonstrated through an application for the Upper Neuse Watershed in North Carolina, USA. The objective functions used for the calibration were Nash-Sutcliffe efficiency (E) and Percent Bias (PB), and the objective sites were the Flat, Little, and Eno watershed outlets. The results show that the use of multi-objective calibration algorithms for SWAT calibration improved model performance, especially in terms of minimizing PB, compared to single-objective model calibration. The third study builds upon the first two by leveraging the new calibration methods and tools to study future climate impacts on the Upper Neuse watershed. Statistically downscaled outputs from eight Global Circulation Models (GCMs) were used for both low and high emission scenarios to drive a well-calibrated SWAT model of the Upper Neuse watershed. The objective of the study was to understand the potential hydrologic response of the watershed, which serves as a public water supply for the growing Research Triangle Park region of North Carolina, under projected climate change scenarios. The future climate change scenarios, in general, indicate an increase in precipitation and temperature for the watershed in coming decades. The SWAT simulations using the future climate scenarios, in general, suggest an increase in soil water and water yield, and a decrease in evapotranspiration, within the Upper Neuse watershed.
In summary, this dissertation advances the field of watershed-scale hydrologic modeling by (i) providing some of the first work to apply cloud computing to the computationally demanding task of model calibration; (ii) providing a new, open source library that can be used by SWAT modelers to perform multi-objective calibration of their models; and (iii) advancing understanding of climate change impacts on water resources for an important watershed in the Research Triangle Park region of North Carolina. The third study leveraged the methodological advances presented in the first two studies. Therefore, the dissertation contains three independent but interrelated studies that collectively advance the field of watershed-scale hydrologic modeling and analysis.
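For readers unfamiliar with the two objective functions named above, the following minimal Python sketch computes Nash-Sutcliffe efficiency and Percent Bias for a toy streamflow series (the data are invented, and the PBIAS sign convention varies between tools):

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, values < 0 are worse than the mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent bias under the common convention 100 * sum(obs - sim) / sum(obs);
    positive values then indicate model underestimation."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

# Toy daily streamflow series (m^3/s), purely illustrative
obs = np.array([5.1, 7.3, 12.0, 9.4, 6.2, 5.8])
sim = np.array([4.8, 8.0, 10.5, 9.9, 6.6, 5.0])
print(f"NSE = {nse(obs, sim):.3f}, PBIAS = {pbias(obs, sim):.1f}%")
```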
NASA Astrophysics Data System (ADS)
Wood, E. F.; Yuan, X.; Sheffield, J.; Pan, M.; Roundy, J.
2013-12-01
One of the key recommendations of the WCRP Global Drought Information System (GDIS) workshop is to develop an experimental real-time global monitoring and prediction system. While great advances have been made in global drought monitoring based on satellite observations and model reanalysis data, global drought forecasting has lagged behind, in part due to the limited skill of both climate forecast models and global hydrologic predictions. Having worked on drought monitoring and forecasting over the USA for more than a decade, the Princeton land surface hydrology group is now developing an experimental global drought early warning system that is based on multiple climate forecast models and a calibrated global hydrologic model. In this presentation, we will test its capability in seasonal forecasting of meteorological, agricultural and hydrologic droughts over global major river basins, using precipitation, soil moisture and streamflow forecasts respectively. Based on the joint probability distribution between observations from Princeton's global drought monitoring system and model hindcasts and real-time forecasts from the North American Multi-Model Ensemble (NMME) project, we (i) bias correct the monthly precipitation and temperature forecasts from multiple climate forecast models, (ii) downscale them to a daily time scale, and (iii) use them to drive the calibrated VIC model to produce global drought forecasts at a 1-degree resolution. A parallel run using the ESP forecast method, which is based on resampling historical forcings, is also carried out for comparison. Analysis is being conducted over global major river basins, with multiple drought indices that have different time scales and characteristics. The meteorological drought forecast does not carry uncertainty from hydrologic models and can be validated directly against observations - making the validation an 'apples-to-apples' comparison. Preliminary results for the evaluation of meteorological drought onset hindcasts indicate that climate models increase drought detectability over ESP by 31%-81%. However, less than 30% of global drought onsets can be detected by climate models. The missed drought events are associated with weak ENSO signals and lower potential predictability. Due to the high false-alarm rate of the climate models, reliability is more important than sharpness for a skillful probabilistic drought onset forecast. Validations and skill assessments for agricultural and hydrologic drought forecasts are carried out using soil moisture and streamflow output from the VIC land surface model (LSM) forced by a global forcing data set. Given our previous drought forecasting experience over the USA and Africa, validating the hydrologic drought forecasts is a significant challenge for a global drought early warning system.
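The bias-correction step (i) is often implemented as quantile mapping between the hindcast and observed climatologies. A minimal sketch, assuming gamma-distributed toy data rather than actual NMME or observational records:

```python
import numpy as np

def quantile_map(fcst, model_hist, obs_hist):
    """Empirical quantile mapping: map each forecast value through the
    model-climatology CDF into the observed climatology."""
    model_sorted = np.sort(model_hist)
    obs_sorted = np.sort(obs_hist)
    # Non-exceedance probability of each forecast value in the model climatology
    p = np.searchsorted(model_sorted, fcst, side="right") / len(model_sorted)
    p = np.clip(p, 0.0, 1.0)
    # Invert the observed empirical CDF at those probabilities
    quantiles = (np.arange(len(obs_sorted)) + 0.5) / len(obs_sorted)
    return np.interp(p, quantiles, obs_sorted)

rng = np.random.default_rng(0)
obs_hist = rng.gamma(2.0, 40.0, 600)    # observed monthly precip (mm), toy data
model_hist = rng.gamma(2.0, 50.0, 600)  # model hindcast with a wet bias
fcst = rng.gamma(2.0, 50.0, 12)         # raw real-time forecast
print(quantile_map(fcst, model_hist, obs_hist).round(1))
```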
Climate Model Diagnostic Analyzer Web Service System
NASA Astrophysics Data System (ADS)
Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.
2013-12-01
The latest Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report stressed the need for comprehensive and innovative evaluation of climate models with newly available global observations. The traditional approach to climate model evaluation, which compares a single parameter at a time, identifies symptomatic model biases and errors but fails to diagnose the model problems. The model diagnosis process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally and data-intensive. To address these challenges, we are developing a parallel, distributed web-service system that enables physics-based multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational datasets, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation, and (2) many of the CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 historical runs and AMIP runs. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of the time evolution of those means in any specified geographical region, (3) the calculation of the correlation between two variables, and (4) the calculation of the difference between two variables. A web user interface was chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA is planned to be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. The requirements of the educational tool were defined in interaction with the school organizers, and CMDA is being customized to meet them accordingly. The tool needs to be of production quality for 30+ simultaneous users. The summer school will thus serve as a valuable testbed for the tool development, preparing CMDA to serve the Earth-science modeling and model-analysis community at the end of the project. This work was funded by the NASA Earth Science Program called Computational Modeling Algorithms and Cyberinfrastructure (CMAC).
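The Flask part of such a wrapping methodology can be sketched as follows; the route name and the seasonal_mean stand-in function are hypothetical, not CMDA's actual API:

```python
from flask import Flask, jsonify, request
import numpy as np

app = Flask(__name__)

def seasonal_mean(data, months):
    """Stand-in for an existing science routine; a real service would call
    the wrapped legacy analysis code here."""
    return float(np.mean(data[months]))

@app.route("/seasonal_mean", methods=["POST"])
def seasonal_mean_service():
    # Accept a JSON payload, run the wrapped analysis, return a JSON result
    payload = request.get_json()
    data = np.asarray(payload["values"], dtype=float)
    months = payload.get("months", list(range(len(data))))
    return jsonify({"mean": seasonal_mean(data, months)})

if __name__ == "__main__":
    # In production one would run this behind Gunicorn/Tornado, as in CMDA.
    app.run(port=8080)
```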
Distributed Memory Parallel Computing with SEAWAT
NASA Astrophysics Data System (ADS)
Verkaik, J.; Huizer, S.; van Engelen, J.; Oude Essink, G.; Ram, R.; Vuik, K.
2017-12-01
Fresh groundwater reserves in coastal aquifers are threatened by sea-level rise, extreme weather conditions, and increasing urbanization with its associated groundwater extraction rates. To counteract these threats, accurate high-resolution numerical models are required to optimize the management of these precious reserves. The major model drawbacks are long run times and large memory requirements, which limit the predictive power of these models. Distributed memory parallel computing is an efficient technique for reducing run times and memory requirements, in which the problem is divided over multiple processor cores. A new Parallel Krylov Solver (PKS) for SEAWAT is presented. PKS has recently been applied to MODFLOW and includes Conjugate Gradient (CG) and Biconjugate Gradient Stabilized (BiCGSTAB) linear accelerators. Both accelerators are preconditioned by an overlapping additive Schwarz preconditioner such that: a) subdomains are partitioned using Recursive Coordinate Bisection (RCB) load balancing; b) each subdomain uses local memory only and communicates with other subdomains by Message Passing Interface (MPI) within the linear accelerator; and c) the solver is fully integrated in SEAWAT. Within SEAWAT, the PKS-CG solver replaces the Preconditioned Conjugate Gradient (PCG) solver for solving the variable-density groundwater flow equation, and the PKS-BiCGSTAB solver replaces the Generalized Conjugate Gradient (GCG) solver for solving the advection-diffusion equation. PKS supports the third-order Total Variation Diminishing (TVD) scheme for computing advection. Benchmarks were performed on the Dutch national supercomputer (https://userinfo.surfsara.nl/systems/cartesius) using up to 128 cores, for a synthetic 3D Henry model (100 million cells) and the real-life Sand Engine model (~10 million cells). The Sand Engine model was used to investigate the potential effect of the long-term morphological evolution of a large sand replenishment and climate change on fresh groundwater resources. Speed-ups of up to 40 were obtained with the new PKS solver.
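For reference, the core of a CG linear accelerator (here unpreconditioned and serial, a simplified sketch rather than the PKS implementation) is only a few lines:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-8, max_iter=1000):
    """Unpreconditioned CG for a symmetric positive-definite system Ax = b.
    PKS adds an overlapping additive Schwarz preconditioner and distributes
    the system over subdomains via MPI; this shows only the accelerator core."""
    x = np.zeros_like(b)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Small SPD test problem
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))  # approx [0.0909, 0.6364]
```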
NASA Astrophysics Data System (ADS)
Tian, Fang; Cao, Xianyong; Dallmeyer, Anne; Zhao, Yan; Ni, Jian; Herzschuh, Ulrike
2017-01-01
Temporal and spatial stability of the vegetation-climate relationship is a basic ecological assumption for pollen-based quantitative inferences of past climate change and for predicting future vegetation. We explore this assumption for the Holocene in eastern continental Asia (China, Mongolia). Boosted regression trees (BRT) between fossil pollen taxa percentages (Abies, Artemisia, Betula, Chenopodiaceae, Cyperaceae, Ephedra, Picea, Pinus, Poaceae and Quercus) and climate model outputs of mean annual precipitation (Pann) and mean temperature of the warmest month (Mtwa) for 9 and 6 ka (ka = thousand years before present) were set up and results compared to those obtained from relating modern pollen to modern climate. Overall, our results reveal only slight temporal differences in the pollen-climate relationships. Our analyses suggest that the importance of Pann compared with Mtwa for taxa distribution is higher today than it was at 6 ka and 9 ka. In particular, the relevance of Pann for Picea and Pinus increases and has become the main determinant. This change in the climate-tree pollen relationship parallels a widespread tree pollen decrease in north-central China and the eastern Tibetan Plateau. We assume that this is at least partly related to vegetation-climate disequilibrium originating from human impact. Increased atmospheric CO2 concentration may have permitted the expansion of moisture-loving herb taxa (Cyperaceae and Poaceae) during the late Holocene into arid/semi-arid areas. We furthermore find that the pollen-climate relationship between north-central China and the eastern Tibetan Plateau is generally similar, but that regional differences are larger than temporal differences. In summary, vegetation-climate relationships in China are generally stable in space and time, and pollen-based climate reconstructions can be applied to the Holocene. Regional differences imply the calibration-set should be restricted spatially.
Pliocene environments and climates in the western United States
Thompson, R.S.
1991-01-01
The available evidence from the western United States suggests that the climate of the Early and Middle Pliocene (prior to ~2.4 Ma) was less seasonal (more equable) and generally more humid than now. Along the Pacific coast, summer drought was less pronounced than today. In the interior of the Pacific Northwest rainfall was more abundant, and mild winter temperatures prevailed across much of the High Plains. In the Northwestern interior, a trend toward drier conditions began after ~4 Ma, although there may have been short periods of relatively humid conditions after this time. The period between 2.5 or 2.4 and 2.0 Ma was drier than earlier in the Pliocene throughout the American West, and apparently colder in many regions, although the occurrence of land tortoises as far north as Kansas may indicate intermittent frost-free conditions during this interval. After ~2.0 Ma conditions became warmer and more humid. The general climatic trends in the terrestrial data parallel fluctuations seen in North Pacific records and in oxygen isotopic records of global glacial fluctuations. Global Climate Model (GCM) simulations of the regional effects of Late Cenozoic uplift and mountain-building are generally in accord with the nature, direction, and amplitude of differences between Pliocene and modern climates.
Climate Signal Detection in Wine Quality Using Gridded vs. Station Data in North-East Hungary
NASA Astrophysics Data System (ADS)
Mika, Janos; Razsi, Andras; Gal, Lajos
2017-04-01
The grapevine is one of the oldest cultivated plants. Today's viticultural regions for quality wine production are located in relatively narrow geographical and therefore climatic niches. Our target area, the Matra Region in NE Hungary, is fairly close to the edge of optimal wine production as far as its climate conditions are concerned. Fifty years (1961-2010) of wine quality data (natural sugar content, in weight % of must) are analysed and compared to parallel climate variables. Two sets of station-based monthly temperature, sunshine duration and precipitation data, taken from the neighbouring stations Eger-Kőlyuktető (1961-2010) and Kompolt (1976-2006), are used in 132 combinations, together with daily grid-point data provided by the CarpatClim Project (www.carpatclim-eu.org/pages/home). By now it is clear that (1) wine quality is in significant negative correlation with annual precipitation and in positive correlation with temperature and sunshine duration. (2) Applying a wide combination of monthly data, we obtain even stronger correlations (higher significance according to t-tests) even from the station-based data, but it is difficult to select an optimum model from the many suitable combinations, which differ only slightly in performance over the test sample. (3) The interpolated site-specific areal averages from the grid-point data provide even better results and stronger differences between the best models and the few other candidates. (4) Further improvement of the statistical signal detection capacity of the above climate variables, achieved by using 5-day averages, points to the strong sensitivity of wine quality to climate anomalies in some key phenological phases of the investigated grapevine mixes. Enhanced spatial and temporal resolution provides a much better fit to the observed wine quality data. The study has been supported by the OTKA-113209 national project.
Mid-Century Ensemble Regional Climate Change Scenarios for the Western United States
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leung, Lai R.; Qian, Yun; Bian, Xindi
To study the impacts of climate change on water resources in the western U.S., global climate simulations were produced using the National Center for Atmospheric Research/Department of Energy (NCAR/DOE) Parallel Climate Model (PCM). The Penn State/NCAR Mesoscale Model (MM5) was used to downscale the PCM control (1995-2015) and three future (2040-2060) climate simulations to yield ensemble regional climate simulations at 40 km spatial resolution for the western U.S. This paper focuses on analyses of regional simulations in the Columbia River and Sacramento-San Joaquin River Basins. Results based on the regional simulations show that by mid-century, the average regional warming of 1-2.5°C strongly affects snowpack in the western U.S. Along coastal mountains, the reduction in annual snowpack is about 70%. Besides changes in mean temperature, precipitation, and snowpack, cold season extreme daily precipitation is found to increase by 5 to 15 mm/day (15-20%) along the Cascades and the Sierra. The warming results in increased rainfall over snowfall and reduced snow accumulation (or earlier snowmelt) during the cold season. In the Columbia River Basin, these changes are accompanied by more frequent rain-on-snow events. Overall, they induce a higher likelihood of wintertime flooding and reduced runoff and soil moisture in the summer. Such changes could have serious impacts on water resources and agriculture in the western U.S. Changes in surface water and energy budgets in the Columbia River and Sacramento-San Joaquin basins are driven mainly by changes in surface temperature, which are statistically significant at the 0.95 confidence level. Changes in precipitation, however, are spatially incoherent and not statistically significant except for the drying trend during summer.
NASA Astrophysics Data System (ADS)
Khouider, B.; Goswami, B. B.; Majda, A.; Krishna, R. P. M. M.; Mukhopadhyay, P.
2016-12-01
Improvements in the capability of climate models to realistically capture the synoptic and intra-seasonal variability associated with tropical rainfall are conditioned on improvements in the representation of subgrid variability due to organized convection and of the underlying two-way interactions across multiple scales, thus breaking with the quasi-equilibrium bottleneck. By design, the stochastic multi-cloud model (SMCM) mimics the life cycle of organized tropical convective systems and the interactions of the associated cloud types with each other and with the large scales, as observed. It is based on a lattice particle-interaction model in which predefined microscopic (subgrid) sites make random transitions from one cloud type to another, conditioned on the large-scale state. In return, the SMCM provides the cloud-type area fractions in the form of a Markov chain model that can be run in parallel with the climate model without any significant computational overhead. The SMCM was previously successfully tested in both reduced-complexity tropical models and an aquaplanet global atmospheric model. Here, we report for the first time the results of its implementation in the fully coupled NCEP climate model (CFSv2), through the use of prescribed vertical profiles of heating and drying obtained from observations. While many known biases in CFSv2 are slightly improved, there is no noticeable degradation in the simulated mean climatology. Moreover, comparison with observations shows that the improvements in terms of synoptic and intra-seasonal variability are spectacular, despite the fact that CFSv2 is one of the best models in this regard. In particular, while CFSv2 exaggerates the intra-seasonal variance at the expense of the synoptic contribution, the CFS-SMCM shows a good balance between the two, as in the observations.
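The area-fraction mechanics can be sketched with a toy conditional Markov chain; the cloud types follow the SMCM literature, but the transition rates and large-scale predictors below are invented placeholders, not the model's calibrated values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Cloud types: 0 = clear, 1 = congestus, 2 = deep, 3 = stratiform
N_SITES = 40 * 40          # microscopic lattice sites in one GCM column
state = np.zeros(N_SITES, dtype=int)

def transition_matrix(cape, dryness):
    """Hypothetical rates: a real SMCM derives these from large-scale
    predictors such as CAPE and mid-level dryness."""
    P = np.array([
        [0.96, 0.03 * cape, 0.01 * cape * (1 - dryness), 0.00],
        [0.10, 0.80,        0.10 * (1 - dryness),        0.00],
        [0.05, 0.00,        0.75,                        0.20],
        [0.15, 0.00,        0.00,                        0.85],
    ])
    return P / P.sum(axis=1, keepdims=True)  # normalise rows to probabilities

for step in range(10):
    P = transition_matrix(cape=0.8, dryness=0.3)  # large-scale state held fixed here
    # Each site makes an independent transition conditioned on the large scales
    u = rng.random(N_SITES)
    cdf = np.cumsum(P[state], axis=1)
    state = (u[:, None] < cdf).argmax(axis=1)
    fractions = np.bincount(state, minlength=4) / N_SITES
    print(step, np.round(fractions, 3))   # cloud-type area fractions per step
```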
Emergence of long distance bird migrations: a new model integrating global climate changes
NASA Astrophysics Data System (ADS)
Louchart, Antoine
2008-12-01
During the history of modern birds, climatic and environmental conditions have evolved on wide scales. In a continuously changing world, the annual migrations of landbirds emerged and developed. However, models accounting for the origins of these avian migrations were formulated with static ecogeographic perspectives. Here I review Cenozoic paleoclimatic and paleontological data relative to the palearctic-paleotropical long distance (LD) migration system. This leads me to propose a new model for the origin of LD migrations, the ‘shifting home’ model (SHM). It is based on a dynamic perspective of climate evolution and may apply to the origins of most modern migrations. Non-migrant tropical African bird taxa were present at European latitudes during most of the Cenozoic. Their distribution limits shifted progressively toward modern tropical latitudes during periods of global cooling and increasing seasonality. In parallel, decreasing winter temperatures in the western Palearctic drove shifts of population winter ranges toward the equator. I propose that this induced the emergence of most short distance migrations, and in turn LD migrations. This model reconciles the ecologically tropical ancestry of most LD migrants with predominant winter range shifts, in accordance with requirements for heritable homing. In addition, it is more parsimonious than other, non-exclusive models. The greater intrinsic plasticity of winter ranges implied by the SHM is supported by recently observed impacts of the present global warming on migrating birds. This may pose particular threats to some LD migrants. The ancestral, breeding homes of LD migrants were not ‘northern’ or ‘southern’ but shifted across high and middle latitudes while migrations emerged through winter range shifts themselves.
3D visualization of ultra-fine ICON climate simulation data
NASA Astrophysics Data System (ADS)
Röber, Niklas; Spickermann, Dela; Böttinger, Michael
2016-04-01
Advances in high performance computing and model development allow the simulation of finer and more detailed climate experiments. The new ICON model is based on an unstructured triangular grid and can be used for a wide range of applications, ranging from global coupled climate simulations down to very detailed and high resolution regional experiments. It consists of an atmospheric and an oceanic component and scales very well to high numbers of cores. This allows us to conduct very detailed climate experiments with ultra-fine resolutions. ICON is jointly developed in partnership with DKRZ by the Max Planck Institute for Meteorology and the German Weather Service. This presentation discusses our current workflow for analyzing and visualizing these high resolution data. The ICON model has been used for eddy-resolving (<10 km) ocean simulations, as well as for ultra-fine cloud-resolving (120 m) atmospheric simulations. This results in very large 3D time-dependent multi-variate data sets that need to be displayed and analyzed. We have developed specific plugins for the freely available visualization software ParaView and Vapor, which allow us to read and handle such large amounts of data. Within ParaView, we can additionally compare prognostic variables with performance data side by side to investigate the performance and scalability of the model. With the simulation running in parallel on several hundred nodes, an even load balance is imperative. In our presentation we show visualizations of high-resolution ICON oceanographic and HDCP2 atmospheric simulations that were created using ParaView and Vapor. Furthermore we discuss our current efforts to improve our visualization capabilities, thereby exploring the potential of regular in-situ visualization, as well as of in-situ compression / post-visualization.
Software architecture and design of the web services facilitating climate model diagnostic analysis
NASA Astrophysics Data System (ADS)
Pan, L.; Lee, S.; Zhang, J.; Tang, B.; Zhai, C.; Jiang, J. H.; Wang, W.; Bao, Q.; Qi, M.; Kubar, T. L.; Teixeira, J.
2015-12-01
Climate model diagnostic analysis is a computationally- and data-intensive task because it involves multiple numerical model outputs and satellite observation data that can both be high resolution. We have built an online tool that facilitates this process. The tool is called Climate Model Diagnostic Analyzer (CMDA). It employs web service technology and provides a web-based user interface. The benefits of these choices include: (1) no installation of any software other than a browser, hence platform independence; (2) co-location of computation and big data on the server side, with only small results and plots downloaded to the client side, hence high data efficiency; (3) a multi-threaded implementation to achieve parallel performance on multi-core servers; and (4) cloud deployment, so that each user has a dedicated virtual machine. In this presentation, we will focus on the computer science aspects of this tool, namely the architectural design, the infrastructure of the web services, the implementation of the web-based user interface, the mechanism of provenance collection, the approach to virtualization, and the Amazon Cloud deployment. As an example, we will describe our methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). Another example is the use of Docker, a light-weight virtualization container, to distribute and deploy CMDA onto an Amazon EC2 instance. Our CMDA tool was successfully used in the 2014 Summer School hosted by the JPL Center for Climate Science. Students gave positive feedback in general, and we will report their comments. An enhanced version of CMDA with several new features, some requested by the 2014 students, will be used in the 2015 Summer School soon.
NASA Astrophysics Data System (ADS)
Sengupta, D.; Gao, L.; Wilcox, E. M.; Beres, N. D.; Moosmüller, H.; Khlystov, A.
2017-12-01
Radiative forcing and climate change depend greatly on Earth's surface albedo and its temporal and spatial variation. The surface albedo varies widely depending on the surface characteristics, ranging from 5-10% for calm ocean waters to 80% for some snow-covered areas. Clean and fresh snow surfaces have the highest albedo and are most sensitive to contamination with light absorbing impurities, which can greatly reduce surface albedo and change overall radiative forcing estimates. Accurate estimation of snow albedo, as well as understanding of feedbacks on climate from changes in snow-covered areas, is important for radiative forcing, snow energy balance, and predicting seasonal snowmelt and runoff rates. Such information is essential to inform timely decision making by stakeholders and policy makers. Light absorbing particles deposited onto the snow surface can greatly alter snow albedo and have been identified as a major contributor to regional climate forcing where seasonal snow cover is involved. However, the uncertainty associated with quantification of albedo reduction by these light absorbing particles is high. Here, we use Mie theory (under the assumption of spherical snow grains) to reconstruct the single scattering parameters of snow (i.e., single scattering albedo ῶ and asymmetry parameter g) from observation-based size distribution information and retrieved refractive index values. The single scattering parameters of impurities are extracted with the same approach from datasets obtained during laboratory combustion of biomass samples. Instead of using plane-parallel approximation methods to account for multiple scattering, we have used the simple "Monte Carlo ray/photon tracing approach" to calculate the snow albedo. This simple approach treats multiple scattering as a collection of single scattering events. Using this approach, we vary the effective snow grain size and impurity concentrations to explore the evolution of snow albedo over a wide wavelength range (300-2000 nm). Results will be compared with the SNICAR model to better understand the differences in snow albedo computation between plane-parallel methods and statistical Monte Carlo methods.
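A minimal version of such a Monte Carlo photon-tracing calculation for a semi-infinite snowpack might look as follows (a sketch assuming Henyey-Greenstein scattering and illustrative omega/g values, not the authors' code; pure Python makes it slow):

```python
import numpy as np

rng = np.random.default_rng(7)

def hg_cosine(g):
    """Sample a scattering-angle cosine from the Henyey-Greenstein phase function."""
    if abs(g) < 1e-6:
        return rng.uniform(-1.0, 1.0)
    u = rng.random()
    return (1 + g**2 - ((1 - g**2) / (1 - g + 2 * g * u)) ** 2) / (2 * g)

def snow_albedo_mc(omega, g, n_photons=10_000):
    """Albedo of a semi-infinite plane-parallel snowpack by photon tracing:
    multiple scattering treated as a chain of single-scattering events."""
    reflected = 0
    for _ in range(n_photons):
        mu, tau = 1.0, 0.0                        # photon enters straight down
        while True:
            tau += mu * (-np.log(rng.random()))   # free path to the next event
            if tau < 0.0:                         # escaped back up: reflected
                reflected += 1
                break
            if rng.random() > omega:              # absorbed with prob. 1 - omega
                break
            cos_t = hg_cosine(g)                  # deflection from current direction
            phi = 2.0 * np.pi * rng.random()
            sin_t = np.sqrt(max(0.0, 1.0 - cos_t**2))
            sin_m = np.sqrt(max(0.0, 1.0 - mu**2))
            mu = mu * cos_t + sin_m * sin_t * np.cos(phi)
    return reflected / n_photons

# omega and g are illustrative; "dirtier" snow (lower omega) darkens quickly
print(snow_albedo_mc(omega=0.999, g=0.89), snow_albedo_mc(omega=0.99, g=0.89))
```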
Geomorphologic Mapping of a Last Glacial Maximum Moraine Sequence in the Far Eastern Tibetan Plateau
NASA Astrophysics Data System (ADS)
Lindsay, B. J.; Putnam, A. E.; Strand, P.; Radue, M. J.; Dong, G.; Kong, X.; Li, M.; Sheriff, M.; Stevens, J.
2017-12-01
The abrupt millennial-scale climate events of the last glacial cycle constitute an important component of the ice-age puzzle. A complete explanation of glacial cycles, and their rapid terminations, must account for these millennial climatic 'flickers'. Here we present a glacial geomorphologic map of a moraine system in a formerly glaciated valley within the mountains of Litang County in the eastern Tibetan Plateau of China. Geomorphologic mapping was conducted by interpreting satellite imagery, structure-from-motion imagery and digital elevation models, and field observations. This map provides context for a parallel, ongoing 10Be exposure-dating effort, the preliminary results of which may be available by the time of the 2017 AGU Fall Meeting. We interpret the mapped moraines to document the millennial-scale pulsebeat of glacier advances in this region during the peak of the last ice age. Because changes in mountain glacier extent in this region are driven by atmospheric temperature, these moraines record past millennial climate changes. Altogether, this mapping and exposure-dating approach will provide insight into the mechanisms behind millennial-scale glacier and climate fluctuations in the interior of Asia.
Oscillators and relaxation phenomena in Pleistocene climate theory
Crucifix, Michel
2012-01-01
Ice sheets appeared in the northern hemisphere around 3 Ma (million years) ago and glacial–interglacial cycles have paced Earth's climate since then. Superimposed on these long glacial cycles comes an intricate pattern of millennial and sub-millennial variability, including Dansgaard–Oeschger and Heinrich events. There are numerous theories about these oscillations. Here, we review a number of them in order to draw a parallel between climatic concepts and dynamical system concepts, including, in particular, the relaxation oscillator, excitability, slow–fast dynamics and homoclinic orbits. Namely, almost all theories of ice ages reviewed here feature a phenomenon of synchronization between internal climate dynamics and astronomical forcing. However, these theories differ in their bifurcation structure and this has an effect on the way the ice age phenomenon could grow 3 Ma ago. All theories on rapid events reviewed here rely on the concept of a limit cycle excited by changes in the surface freshwater balance of the ocean. The article also reviews basic effects of stochastic fluctuations on these models, including the phenomenon of phase dispersion, shortening of the limit cycle and stochastic resonance. It concludes with a more personal statement about the potential for inference with simple stochastic dynamical systems in palaeoclimate science. PMID:22291227
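To make the relaxation-oscillator concept concrete, a forced van der Pol oscillator (a standard toy model, not any specific theory reviewed above; all parameters are illustrative) exhibits the slow build-up, fast relaxation, and frequency locking to a weak periodic forcing discussed here:

```python
import numpy as np

# Forced van der Pol oscillator in Lienard form: slow-fast dynamics with a
# weak periodic term standing in for the astronomical forcing. Several
# forcing cycles can fit into one relaxation cycle, echoing how glacial
# cycles can "skip" obliquity beats.
mu, amp, period = 8.0, 0.6, 41.0     # stiffness, forcing amplitude, forcing period
dt, n = 0.01, 400_000
t = np.arange(n) * dt
x, y = np.empty(n), np.empty(n)
x[0], y[0] = 0.1, 0.0
for i in range(n - 1):
    # x' = mu*(x - x^3/3 - y), y' = x/mu + forcing (explicit Euler step)
    x[i + 1] = x[i] + dt * mu * (x[i] - x[i] ** 3 / 3.0 - y[i])
    y[i + 1] = y[i] + dt * (x[i] / mu + amp * np.sin(2 * np.pi * t[i] / period))
# x traces slow build-up and fast relaxation, phase-locked to the forcing
print(x[::40_000].round(2))
```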
Understanding the origin of the solar cyclic activity for an improved earth climate prediction
NASA Astrophysics Data System (ADS)
Turck-Chièze, Sylvaine; Lambert, Pascal
This review is dedicated to the processes that could explain the origin of the grand extrema of solar activity. We would like to reach a more suitable estimate and prediction of the temporal solar variability and its real impact on Earth climate models. The development of this new field is stimulated by the SoHO helioseismic measurements and by recent solar modelling improvements that aim to describe the dynamical processes from the core to the surface. We first recall assumptions about the potentially different solar variabilities. Then, we introduce stellar seismology and summarize the main SoHO results that are relevant for this field. Finally we mention the dynamical processes that are presently being introduced into new solar models. We believe that knowledge of two important elements, (1) the magnetic field interplay between the radiative zone and the convective zone and (2) the role of gravity waves, would allow us to understand the origin of the grand minima and maxima observed during the last millennium. Complementary observables (acoustic and gravity modes, radius, and spectral irradiance from far UV to visible), in parallel with the development of 1D-2D-3D simulations, will improve this field. PICARD, SDO, and DynaMICCS are key projects for a prediction of the next century's variability.
NASA Astrophysics Data System (ADS)
Robinson, Tyler D.; Crisp, David
2018-05-01
Solar and thermal radiation are critical aspects of planetary climate, with gradients in radiative energy fluxes driving heating and cooling. Climate models require that radiative transfer tools be versatile, computationally efficient, and accurate. Here, we describe a technique that uses an accurate full-physics radiative transfer model to generate a set of atmospheric radiative quantities which can be used to linearly adapt radiative flux profiles to changes in the atmospheric and surface state: the Linearized Flux Evolution (LiFE) approach. These radiative quantities describe how each model layer in a plane-parallel atmosphere reflects and transmits light, as well as how the layer generates diffuse radiation by thermal emission and by scattering light from the direct solar beam. By computing derivatives of these layer radiative properties with respect to dynamic elements of the atmospheric state, we can then efficiently adapt the flux profiles computed by the full-physics model to new atmospheric states. We validate the LiFE approach, and then apply this approach to Mars, Earth, and Venus, demonstrating the information contained in the layer radiative properties and their derivatives, as well as how the LiFE approach can be used to determine the thermal structure of radiative and radiative-convective equilibrium states in one-dimensional atmospheric models.
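The linear update at the heart of such an approach can be sketched in a few lines; the layer count, reference profiles, and Jacobian below are invented placeholders rather than LiFE outputs:

```python
import numpy as np

# Sketch of the linearisation idea: a full-physics model supplies a reference
# flux profile and its derivatives with respect to state elements; later
# states get a cheap linear update instead of a new full calculation.
n_layers = 20
T0 = np.linspace(290.0, 210.0, n_layers)    # reference temperatures (K)
F0 = np.linspace(240.0, 80.0, n_layers)     # reference net flux (W m^-2)
# dF_i/dT_j would come from the full model; here a made-up Jacobian
J = 0.8 * np.eye(n_layers) + 0.05 * np.eye(n_layers, k=1)

def linearized_flux(T, T0=T0, F0=F0, J=J):
    """First-order flux update: F(T) ~ F(T0) + J (T - T0)."""
    return F0 + J @ (T - T0)

T_new = T0 + np.where(np.arange(n_layers) < 5, 2.0, 0.0)  # warm the top layers
print(linearized_flux(T_new)[:6].round(2))
```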
Red spruce (Picea rubens Sarg.) cold hardiness and freezing injury susceptibility. Chapter 18
Donald H. DeHayes; Paul G. Schaberg; G.Richard Strimbeck
2001-01-01
To survive subfreezing winter temperatures, perennial plant species have evolved tissue-specific mechanisms to undergo changes in freezing tolerance that parallel seasonal variations in climate. As such, most northern temperate tree species, including conifers, are adapted to the habitat and climatic conditions within their natural ranges and suffer little or no...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Shujia; Duffy, Daniel; Clune, Thomas
The call for ever-increasing model resolutions and physical processes in climate and weather models demands a continual increase in computing power. The IBM Cell processor's order-of-magnitude peak performance increase over conventional processors makes it very attractive for fulfilling this requirement. However, the Cell's characteristics, 256 KB of local memory per SPE and a new low-level communication mechanism, make it very challenging to port an application. As a trial, we selected the solar radiation component of the NASA GEOS-5 climate model, which: (1) is representative of column physics components (half the total computational time), (2) has an extremely high computational intensity, i.e., ratio of computational load to main memory transfers, and (3) exhibits embarrassingly parallel column computations. In this paper, we converted the baseline code (single-precision Fortran) to C and ported it to an IBM BladeCenter QS20. For performance, we manually SIMDize four independent columns and include several unrolling optimizations. Our results show that when compared with the baseline implementation running on one core of Intel's Xeon Woodcrest, Dempsey, and Itanium2, the Cell is approximately 8.8x, 11.6x, and 12.8x faster, respectively. Our preliminary analysis shows that the Cell can also accelerate the dynamics component (~25% of total computational time). We believe these dramatic performance improvements make the Cell processor very competitive as an accelerator.
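The embarrassing parallelism across columns that makes SIMDization profitable can be illustrated in a higher-level setting by vectorizing the column dimension (a toy attenuation recurrence, not the GEOS-5 solar radiation code; column and level counts are arbitrary):

```python
import numpy as np

n_col, n_lev = 8192, 72                        # illustrative sizes
rng = np.random.default_rng(3)
tau = rng.uniform(0.01, 0.1, (n_lev, n_col))   # toy per-layer optical depths

def downwelling(tau, s0=1361.0):
    """March the direct beam down through the layers; each array operation
    advances all columns in lock-step, mirroring SIMD over columns."""
    flux = np.empty((tau.shape[0] + 1, tau.shape[1]))
    flux[0] = s0
    for k in range(tau.shape[0]):              # serial in the vertical only
        flux[k + 1] = flux[k] * np.exp(-tau[k])
    return flux

print(downwelling(tau)[-1, :4].round(1))       # surface flux, first 4 columns
```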
NASA Astrophysics Data System (ADS)
Millstein, D.; Brown, N. J.; Zhai, P.; Menon, S.
2012-12-01
We use the WRF/Chem model (Weather Research and Forecasting model with chemistry) and pollutant emissions based on the EPA National Emission Inventories from 2005 and 2008 to model regional climate and air quality over the continental United States. Additionally, 2030 emission scenarios are developed to investigate the effects of future enhancements to solar power generation. Modeling covered 6 summer and 6 winter weeks in each year. We model feedback between aerosols and meteorology and thus capture direct and indirect aerosol effects. The grid resolution is 25 km, with no nesting. Between 2005 and 2008 significant emission reductions were reported in the National Emission Inventory. The 2008 weekday emissions of SO2 and NO over the continental U.S. were reduced from 2005 values by 28% and 16%, respectively. Emission reductions of this magnitude are similar in scale to the potential emission reductions from various energy policy initiatives. By evaluating modeled and observed air quality changes from 2005 to 2008, we analyze how well the model represents the effects of historical emission changes. We also gain insight into how well the model might predict the effects of future emission changes. In addition to direct comparisons of model outputs to ground and satellite observations, we compare observed differences between 2005 and 2008 to corresponding modeled differences. Modeling was extended to future scenarios (2030) to simulate the air quality and regional climate effects of large-scale adoption of solar power. The year 2030 was selected to allow time for development of solar generation infrastructure. The 2030 emission scenario was scaled, with separate factors for different economic sectors, from the 2008 National Emissions Inventory. The changes to emissions caused by the introduction of large-scale solar power (here assumed to be 10% of total energy generation) are based on results from a parallel project that used an electricity grid model applied over multiple regions across the country. The regional climate and air quality effects of future large-scale solar power adoption are analyzed in the context of uncertainty quantified by the dynamic evaluation of the historical (2005 and 2008) WRF/Chem simulations.
NASA Astrophysics Data System (ADS)
Kwiatkowski, L.; Yool, A.; Allen, J. I.; Anderson, T. R.; Barciela, R.; Buitenhuis, E. T.; Butenschön, M.; Enright, C.; Halloran, P. R.; Le Quéré, C.; de Mora, L.; Racault, M.-F.; Sinha, B.; Totterdell, I. J.; Cox, P. M.
2014-12-01
Ocean biogeochemistry (OBGC) models span a wide variety of complexities, including highly simplified nutrient-restoring schemes, nutrient-phytoplankton-zooplankton-detritus (NPZD) models that crudely represent the marine biota, models that represent a broader trophic structure by grouping organisms as plankton functional types (PFTs) based on their biogeochemical role (dynamic green ocean models) and ecosystem models that group organisms by ecological function and trait. OBGC models are now integral components of Earth system models (ESMs), but they compete for computing resources with higher resolution dynamical setups and with other components such as atmospheric chemistry and terrestrial vegetation schemes. As such, the choice of OBGC in ESMs needs to balance model complexity and realism alongside relative computing cost. Here we present an intercomparison of six OBGC models that were candidates for implementation within the next UK Earth system model (UKESM1). The models cover a large range of biological complexity (from 7 to 57 tracers) but all include representations of at least the nitrogen, carbon, alkalinity and oxygen cycles. Each OBGC model was coupled to the ocean general circulation model Nucleus for European Modelling of the Ocean (NEMO) and results from physically identical hindcast simulations were compared. Model skill was evaluated for biogeochemical metrics of global-scale bulk properties using conventional statistical techniques. The computing cost of each model was also measured in standardised tests run at two resource levels. No model is shown to consistently outperform all other models across all metrics. Nonetheless, the simpler models are broadly closer to observations across a number of fields and thus offer a high-efficiency option for ESMs that prioritise high-resolution climate dynamics. However, simpler models provide limited insight into more complex marine biogeochemical processes and ecosystem pathways, and a parallel approach of low-resolution climate dynamics and high-complexity biogeochemistry is desirable in order to provide additional insights into biogeochemistry-climate interactions.
New perspectives for European climate services: HORIZON2020
NASA Astrophysics Data System (ADS)
Bruning, Claus; Tilche, Andrea
2014-05-01
The development of new end-to-end climate services was one of the core priorities of the European Commission's 7th Framework Programme for Research and Technological Development, and will become one of the key strategic priorities of Societal Challenge 5 of HORIZON2020 (the new EU Framework Programme for Research and Innovation, 2014-2020). Results should increase the competitiveness of European businesses and the ability of regional and national authorities to make effective decisions in climate-sensitive sectors. In parallel, the production of new tailored climate information should strengthen the resilience of European society to climate change. In this perspective, the strategy to support and foster the underpinning science for climate services in HORIZON2020 will be presented.
Heinrich events modeled in transient glacial simulations
NASA Astrophysics Data System (ADS)
Ziemen, Florian; Kapsch, Marie; Mikolajewicz, Uwe
2017-04-01
Heinrich events are among the most prominent events of climate variability recorded in proxies across the northern hemisphere. They are the archetype of ice sheet-climate interactions on millennial time scales. Nevertheless, the exact mechanisms that cause Heinrich events are still under debate, and their climatic consequences are far from being fully understood. We address open questions by studying Heinrich events in a coupled ice sheet model (ISM) and atmosphere-ocean-vegetation general circulation model (AOVGCM) framework, in which this variability occurs as part of the model-generated internal variability. The framework consists of a northern hemisphere setup of the modified Parallel Ice Sheet Model (mPISM) coupled to the global AOVGCM ECHAM5/MPIOM/LPJ. The simulations were performed fully coupled and with transient orbital and greenhouse gas forcing. They span from several millennia before the last glacial maximum into the deglaciation. To make these long simulations feasible, the atmosphere is accelerated by a factor of 10 relative to the other model components using a periodically synchronous coupling technique. To disentangle effects of the Heinrich events and the deglaciation, we focus on the events occurring before the deglaciation. The modeled Heinrich events show a peak ice discharge of about 0.05 Sv and raise the sea level by 2.3 m on average. The resulting surface water freshening reduces the Atlantic meridional overturning circulation and ocean heat release. The reduction in ocean heat release causes a sub-surface warming and decreases the air temperature and precipitation regionally and downstream into Eurasia. The surface elevation decrease of the ice sheet enhances moisture transport onto the ice sheet and thus increases precipitation over the Hudson Bay area, thereby accelerating the recovery after an event.
Improving NASA's Multiscale Modeling Framework for Tropical Cyclone Climate Study
NASA Technical Reports Server (NTRS)
Shen, Bo-Wen; Nelson, Bron; Cheung, Samson; Tao, Wei-Kuo
2013-01-01
One of the current challenges in tropical cyclone (TC) research is how to improve our understanding of TC interannual variability and the impact of climate change on TCs. Recent advances in global modeling, visualization, and supercomputing technologies at NASA show potential for such studies. In this article, the authors discuss recent scalability improvements to the multiscale modeling framework (MMF) that make it feasible to perform long-term TC-resolving simulations. The MMF consists of the finite-volume general circulation model (fvGCM), supplemented by a copy of the Goddard cumulus ensemble model (GCE) at each of the fvGCM grid points, giving 13,104 GCE copies. The original fvGCM implementation has a 1D data decomposition; the revised MMF implementation retains the 1D decomposition for most of the code, but uses a 2D decomposition for the massive copies of GCEs. Because the vast majority of computation time in the MMF is spent computing the GCEs, this approach can achieve excellent speedup without incurring the cost of modifying the entire code. Intelligent process mapping allows differing numbers of processes to be assigned to each domain for load balancing. The revised parallel implementation shows highly promising scalability, obtaining a nearly 80-fold speedup when increasing the number of cores from 30 to 3,335.
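As a back-of-the-envelope check (assuming the two configurations are directly comparable), the reported scaling implies roughly 70% parallel efficiency:

```python
# 80x speedup while cores grow from 30 to 3,335
cores_ratio = 3335 / 30        # ~111x more cores
speedup = 80.0
print(f"parallel efficiency ~ {speedup / cores_ratio:.0%}")   # ~72%
```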
A Cascade Approach to Uncertainty Estimation for the Hydrological Simulation of Droughts
NASA Astrophysics Data System (ADS)
Smith, Katie; Tanguy, Maliko; Parry, Simon; Prudhomme, Christel
2016-04-01
Uncertainty poses a significant challenge in environmental research, and the characterisation and quantification of uncertainty has become a research priority over the past decade. Studies of extreme events are particularly affected by issues of uncertainty. This study focusses on the sources of uncertainty in the modelling of streamflow droughts in the United Kingdom. Droughts are a poorly understood natural hazard with no universally accepted definition. Meteorological, hydrological and agricultural droughts have different meanings and vary both spatially and temporally, yet each is inextricably linked to the others. The work presented here is part of two extensive interdisciplinary projects investigating drought reconstruction and drought forecasting capabilities in the UK. Lumped catchment models are applied to simulate streamflow drought, and uncertainties from five different sources are investigated: climate input data, potential evapotranspiration (PET) method, hydrological model, within-model structure, and model parameterisation. Latin Hypercube sampling is applied to develop large parameter ensembles for each model structure, which are run using parallel computing on a high performance computer cluster. Parameterisations are assessed using multi-objective evaluation criteria that include both general and drought performance metrics. The effect of different climate input data and PET methods on model output is then considered using the accepted model parameterisations. The uncertainty from each of the sources creates a cascade, and when presented as such, the relative importance of each aspect of uncertainty can be determined.
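Latin Hypercube sampling of this kind can be implemented in a few lines; the parameter names and ranges below are hypothetical stand-ins for the actual lumped-model parameters:

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng=None):
    """Stratified LHS: one sample per equal-probability stratum in each
    dimension, with strata independently permuted across dimensions."""
    rng = rng or np.random.default_rng()
    bounds = np.asarray(bounds, float)               # shape (n_params, 2)
    d = len(bounds)
    strata = np.tile(np.arange(n_samples), (d, 1))   # stratum indices per dim
    u = (rng.permuted(strata, axis=1).T + rng.random((n_samples, d))) / n_samples
    return bounds[:, 0] + u * (bounds[:, 1] - bounds[:, 0])

# Hypothetical ranges for four lumped-model parameters
bounds = [(0.1, 1.0),     # storage coefficient
          (1.0, 200.0),   # routing delay (h)
          (0.0, 0.5),     # baseflow index
          (50.0, 500.0)]  # soil moisture capacity (mm)
ensemble = latin_hypercube(5000, bounds, np.random.default_rng(11))
print(ensemble.shape, ensemble.min(axis=0).round(2))
```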
Stanley, Steven M.
2010-01-01
Conspicuous global stable carbon isotope excursions that are recorded in marine sedimentary rocks of Phanerozoic age and were associated with major extinctions have generally paralleled global stable oxygen isotope excursions. All of these phenomena are therefore likely to share a common origin through global climate change. Exceptional patterns for carbon isotope excursions resulted from massive carbon burial during warm intervals of widespread marine anoxic conditions. The many carbon isotope excursions that parallel those for oxygen isotopes can to a large degree be accounted for by the Q10 pattern of respiration for bacteria: as temperature changed along continental margins, where ∼90% of marine carbon burial occurs today, rates of remineralization of isotopically light carbon must have changed exponentially. This would have reduced organic carbon burial during global warming and increased it during global cooling. Also contributing to the δ13C excursions have been the release and uptake of methane by clathrates, the positive correlation between temperature and the degree of fractionation of carbon isotopes by phytoplankton at temperatures below ∼15 °C, and increased phytoplankton productivity during “icehouse” conditions. The Q10 pattern for bacteria and climate-related changes in clathrate volume represent positive feedbacks for climate change. PMID:21041682
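The Q10 dependence invoked here has the standard form (functional form only; the abstract does not give parameter values):

```latex
% Q10 rate law for bacterial remineralisation:
R(T) = R(T_0)\, Q_{10}^{(T - T_0)/10}
% e.g. with Q_{10} = 2, a 10 degree C warming doubles the remineralisation
% rate, reducing organic carbon burial; cooling does the opposite.
```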
NASA Astrophysics Data System (ADS)
Robles-Morua, A.; Vivoni, E. R.; Rivera-Fernandez, E. R.; Dominguez, F.; Meixner, T.
2012-12-01
Assessing the impact of climate change on large river basins in the southwestern United States is important given the natural water scarcity in the region. The bimodal distribution of annual precipitation also presents a challenge as differential climate impacts during the winter and summer seasons are not currently well understood. In this work, we focus on the hydrological consequences of climate change in the Santa Cruz and San Pedro river basins along the Arizona-Sonora border at high spatiotemporal resolutions (~100 m and ~1 hour). These river systems support rich ecological communities along riparian corridors that provide habitat to migratory birds and support recreational and economic activities. Determining the climate impacts on riparian communities involves assessing how river flows and groundwater recharge will change with altered temperature and precipitation regimes. In this study, we use a distributed hydrologic model, known as the TIN-based Real-time Integrated Basin Simulator (tRIBS), to generate simulated hydrological fields under historical (1991-2000) and climate change (2031-2040) scenarios obtained from an application of the Weather Research and Forecast (WRF) model. Using the distributed model, we transform the meteorological scenarios from WRF at 10-km, hourly resolution into predictions of the annual water budget, seasonal land surface fluxes and individual hydrographs of flood and recharge events. For this contribution, we selected two full years in the historical period and in the future scenario that represent wet and dry conditions for each decade. Given the size of the two basins, we rely on a high performance computing platform and a parallel domain discretization using sub-basin partitioning with higher resolutions maintained at experimental catchments in each river basin. Model simulations utilize best-available data across the Arizona-Sonora border on topography, land cover and soils obtained from analysis of remotely-sensed imagery and government databases. For the historical period, we build confidence in the model simulations through comparisons with streamflow estimates in the region. We also evaluate the WRF forcing outcomes with respect to meteorological inputs from ground rain gauges and the North American Land Data Assimilation System (NLDAS). We then analyze the high-resolution spatiotemporal predictions of soil moisture, evapotranspiration, runoff generation and recharge under past conditions and for the climate change scenario. A comparison with the historical period will yield a first-of-its-kind assessment at very high spatiotemporal resolution on the impacts of climate change on the hydrologic response of two large semiarid river basins of the southwestern United States.
Global warming preceded by increasing carbon dioxide concentrations during the last deglaciation.
Shakun, Jeremy D; Clark, Peter U; He, Feng; Marcott, Shaun A; Mix, Alan C; Liu, Zhengyu; Otto-Bliesner, Bette; Schmittner, Andreas; Bard, Edouard
2012-04-04
The covariation of carbon dioxide (CO2) concentration and temperature in Antarctic ice-core records suggests a close link between CO2 and climate during the Pleistocene ice ages. The role and relative importance of CO2 in producing these climate changes remains unclear, however, in part because the ice-core deuterium record reflects local rather than global temperature. Here we construct a record of global surface temperature from 80 proxy records and show that temperature is correlated with and generally lags CO2 during the last (that is, the most recent) deglaciation. Differences between the respective temperature changes of the Northern Hemisphere and Southern Hemisphere parallel variations in the strength of the Atlantic meridional overturning circulation recorded in marine sediments. These observations, together with transient global climate model simulations, support the conclusion that an antiphased hemispheric temperature response to ocean circulation changes superimposed on globally in-phase warming driven by increasing CO2 concentrations is an explanation for much of the temperature change at the end of the most recent ice age.
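The lead-lag relationship described above can be illustrated with a small cross-correlation sketch. The series below are synthetic stand-ins, not the proxy stack or the ice-core CO2 record; the sign convention is that a positive best-fit lag means temperature lags CO2.

```python
import numpy as np

def lag_correlation(co2, temp, max_lag):
    """Correlate temp against co2 over lags -max_lag..+max_lag.
    A positive best lag means temperature lags CO2."""
    out = []
    for k in range(-max_lag, max_lag + 1):
        if k >= 0:
            a, b = co2[: len(co2) - k], temp[k:]
        else:
            a, b = co2[-k:], temp[:k]
        out.append((k, np.corrcoef(a, b)[0, 1]))
    return out

# Synthetic stand-ins for the deglacial CO2 rise and temperature stack.
t = np.arange(200)
co2 = np.tanh((t - 80) / 30.0)
temp = np.tanh((t - 90) / 30.0)     # lags the CO2 curve by 10 steps
best = max(lag_correlation(co2, temp, 30), key=lambda kv: kv[1])
print(f"best lag: {best[0]} steps, r = {best[1]:.3f}")
```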
Efficient in-situ visualization of unsteady flows in climate simulation
NASA Astrophysics Data System (ADS)
Vetter, Michael; Olbrich, Stephan
2017-04-01
The simulation of climate data tends to produce very large data sets, which can hardly be processed in classical post-processing visualization applications. Typically, the visualization pipeline, consisting of the processes of data generation, visualization mapping and rendering, is distributed into two parts over the network or separated via file transfer. In most traditional post-processing scenarios the simulation runs on a supercomputer, whereas the data analysis and visualization are done on a graphics workstation. That way, temporary data sets of huge volume have to be transferred over the network, which leads to bandwidth bottlenecks and volume limitations. The solution to this issue is to avoid temporary storage, or at least to significantly reduce the data complexity. Within the Climate Visualization Lab - as part of the Cluster of Excellence "Integrated Climate System Analysis and Prediction" (CliSAP) at the University of Hamburg, in cooperation with the German Climate Computing Center (DKRZ) - we develop and integrate an in-situ approach. Our software framework DSVR is based on separating the process chain between the mapping and rendering processes. It couples the mapping process directly to the simulation by calling methods of a parallelized data extraction library, which creates a time-based sequence of geometric 3D scenes. This sequence is stored on a special streaming server with an interactive post-filtering option and then played out asynchronously in a separate 3D viewer application. Since the rendering is part of this viewer application, the scenes can be navigated interactively. In contrast to other in-situ approaches, where 2D images are created as part of the simulation or synchronous co-visualization takes place, our method supports interaction in 3D space and in time, as well as fixed frame rates. To integrate in-situ processing based on our DSVR framework into the ICON climate model, we are continuously evolving the data structures and mapping algorithms of the framework to support the ICON model's native grid structures, since DSVR was originally designed for rectilinear grids only. We have now implemented a new ICON output module that takes advantage of DSVR visualization. Like most output modules, the visualization is configured via a specific namelist, and it is integrated, as an example, within the non-hydrostatic atmospheric model time loop. With the integration of DSVR-based in-situ pathline extraction within ICON, a further milestone has been reached. The pathline algorithm as well as the grid data structures have been optimized for the domain decomposition used to parallelize ICON with MPI and OpenMP. The software implementation and evaluation are done on the supercomputers at DKRZ. In principle, the data complexity is reduced from O(n³) to O(m), where n is the grid resolution and m the number of supporting points of all pathlines. The stability and scalability evaluation is done using Atmospheric Model Intercomparison Project (AMIP) runs. We give a short introduction to our software framework, a short overview of the implementation and usage of DSVR within ICON, and visualization and evaluation results from sample applications.
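To make the division of labor concrete, here is a minimal, hypothetical sketch of the in-situ pattern described above. The names (SceneStream, advect_pathlines) are illustrative placeholders, not the DSVR API; the point is that the simulation loop emits O(m) geometry instead of writing O(n³) grid data.

```python
import numpy as np

class SceneStream:
    """Stand-in for a streaming-server connection holding 3D scenes."""
    def __init__(self):
        self.scenes = []
    def send(self, step, geometry):
        # A real system would send this over the network to the
        # streaming server; here we just keep the scene sequence.
        self.scenes.append((step, geometry))

def advect_pathlines(velocity, points, dt):
    """One explicit-Euler advection step for the pathline supporting points."""
    return points + dt * velocity(points)

stream = SceneStream()
points = np.random.rand(100, 3)                  # m = 100 pathline seeds
velocity = lambda p: np.stack([-p[:, 1], p[:, 0], 0.0 * p[:, 2]], axis=1)
for step in range(10):                           # stands in for the model time loop
    points = advect_pathlines(velocity, points, dt=0.05)
    stream.send(step, points.copy())             # stream O(m) geometry, not the grid
```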
The Co-evolution of Climate Models and the Intergovernmental Panel on Climate Change
NASA Astrophysics Data System (ADS)
Somerville, R. C.
2010-12-01
As recently as the 1950s, global climate models, or GCMs, did not exist, and the notion that man-made carbon dioxide might lead to significant climate change was not regarded as a serious possibility by most experts. Today, of course, the prospect or threat of exactly this type of climate change dominates the science and ranks among the most pressing issues confronting all mankind. Indeed, the prevailing scientific view throughout the first half of the twentieth century was that adding carbon dioxide to the atmosphere would have only a negligible effect on climate. The science of climate change caused by atmospheric carbon dioxide changes has thus undergone a genuine revolution. An extraordinarily rapid development of global climate models has also characterized this period, especially in the three decades since about 1980. In these three decades, the number of GCMs has greatly increased, and their physical and computational aspects have both markedly improved. Modeling progress has been enabled by many scientific advances, of course, but especially by a massive increase in available computer power, with supercomputer speeds increasing by roughly a factor of a million in the three decades from about 1980 to 2010. This technological advance has permitted a rapid increase in the physical comprehensiveness of GCMs as well as in spatial computational resolution. In short, GCMs have dramatically evolved over time, in exactly the same recent period as popular interest and scientific concern about anthropogenic climate change have markedly increased. In parallel, a unique international organization, the Intergovernmental Panel on Climate Change, or IPCC, has also recently come into being and also evolved rapidly. Today, the IPCC has become widely respected and globally influential. The IPCC was founded in 1988, and its history is thus even shorter than that of GCMs. Yet, its stature today is such that a series of IPCC reports assessing climate change science has already been endorsed by many leading scientific professional societies and academies of science worldwide. These reports are considered as definitive summaries of the state of the science. In 2007, in recognition of its exceptional accomplishments, the IPCC shared the Nobel Peace Prize equally with Al Gore. The present era is characterized not only by the reality and seriousness of human-caused climate change, but also by a young yet powerful science that enables us to understand much about the climate change that has occurred already and that awaits in the future. The development of GCMs is a critical part of the scientific story, and the development of the IPCC is a key factor in connecting the science to the perceptions and priorities of the global public and policymakers. GCMs and the IPCC have co-evolved and strongly influenced one another, as both scientists and the world at large have worked to confront the challenge of climate change.
Millennial-Scale Temperature Change Velocity in the Continental Northern Neotropics
Correa-Metrio, Alexander; Bush, Mark; Lozano-García, Socorro; Sosa-Nájera, Susana
2013-01-01
Climate has been inherently linked to global diversity patterns, and yet no empirical data are available to put modern climate change into a millennial-scale context. High tropical species diversity has been linked to slow rates of climate change during the Quaternary, an assumption that lacks an empirical foundation. Thus, there is the need for quantifying the velocity at which the bioclimatic space changed during the Quaternary in the tropics. Here we present rates of climate change for the late Pleistocene and Holocene from Mexico and Guatemala. An extensive modern pollen survey and fossil pollen data from two long sedimentary records (30,000 and 86,000 years for highlands and lowlands, respectively) were used to estimate past temperatures. Derived temperature profiles show a parallel long-term trend and a similar cooling during the Last Glacial Maximum in the Guatemalan lowlands and the Mexican highlands. Temperature estimates and digital elevation models were used to calculate the velocity of isotherm displacement (temperature change velocity) for the time period contained in each record. Our analyses showed that temperature change velocities in Mesoamerica during the late Quaternary were at least four times slower than values reported for the last 50 years, but also at least twice as fast as those obtained from recent models. Our data demonstrate that, given extremely high temperature change velocities, species survival must have relied on either microrefugial populations or persistence of suppressed individuals. Contrary to the usual expectation of stable climates being associated with high diversity, our results suggest that Quaternary tropical diversity was probably maintained by centennial-scale oscillatory climatic variability that forestalled competitive exclusion. As humans have simplified modern landscapes, thereby removing potential microrefugia, and climate change is occurring monotonically at a very high velocity, extinction risk for tropical species is higher than at any time in the last 86,000 years. PMID:24312614
NASA Astrophysics Data System (ADS)
Richetti, J.; Ahmad, I.; Aristizabal, F.; Judge, J.
2017-12-01
Determining maize agricultural production under climate variability is valuable to policy makers in Pakistan, since maize is the third most produced crop by area after wheat and rice. This study aims to predict maize production under climate variability. Two hundred ground-truth points of both maize and non-maize land covers were collected from the Faisalabad district during the growing seasons of 2015 and 2016. Landsat-8 images taken in the second week of May, which correspond spatially and temporally to the local peak growing season for maize, were gathered. To classify the region, training data were constructed for a variety of machine learning algorithms by sampling the second, third, and fourth bands of the Landsat-8 imagery at these reference locations. Cross-validation was used for parameter tuning as well as for estimating generalized performance. All the classifiers achieved overall accuracies greater than 90% for both years, and a support vector machine with a radial basis kernel recorded the maximum accuracy of 97%. The tuned models were used to determine the spatial distribution of maize fields for both growing seasons in the Faisalabad district, using parallel processing to improve computation time. The overall classified maize growing area differed by 12% from that reported by the Crop Reporting Service (CRS) of Punjab, Pakistan for both 2015 and 2016. For agricultural production, the normalized difference vegetation index from Landsat-8 and climate indicators from ground stations will be used as inputs to a variety of machine learning regression algorithms. The expected results will be compared to actual yields from 64 commercial farms. To assess the impact of climate variability on maize production, historical climate data from the previous 30 years will be used in the developed model.
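As a rough illustration of the classification step, the following scikit-learn sketch tunes an RBF-kernel support vector machine by cross-validation; the band samples, labels, and parameter grid are synthetic placeholders rather than the study's materials.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.random((200, 3))            # Landsat-8 bands 2-4 at reference points
y = (X[:, 2] > 0.5).astype(int)     # 1 = maize, 0 = non-maize (synthetic)

# Cross-validated tuning of an RBF-kernel SVM, as in the abstract.
search = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid={"C": [1, 10, 100], "gamma": ["scale", 0.1, 1.0]},
    cv=5,
)
search.fit(X, y)
print(f"CV accuracy: {search.best_score_:.2f}, params: {search.best_params_}")
```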
NASA Astrophysics Data System (ADS)
Gusev, Anatoly; Diansky, Nikolay; Zalesny, Vladimir
2010-05-01
An original program complex is proposed for the ocean circulation sigma-model developed at the Institute of Numerical Mathematics (INM), Russian Academy of Sciences (RAS). The complex can be used in various curvilinear orthogonal coordinate systems. In addition to the ocean circulation model, the complex contains a sea-ice dynamics and thermodynamics model, as well as an original system for implementing atmospheric forcing on the basis of both prescribed meteorological data and atmospheric model results. The complex can serve as the oceanic block of an Earth climate model, and it can be used to solve scientific and practical problems concerning the World Ocean and its separate oceans and seas. It runs effectively on parallel shared-memory computational systems and on contemporary personal computers. On the basis of the proposed complex, an ocean general circulation model (OGCM) was developed. The model is realized in a curvilinear orthogonal coordinate system obtained by a conformal transformation of the standard geographical grid, which allowed us to locate the system singularities outside the integration domain. The horizontal resolution of the OGCM is 1 degree in longitude and 0.5 degree in latitude, with 40 non-uniform sigma levels in depth. The model was integrated for 100 years starting from the Levitus January climatology, using a realistic atmospheric annual cycle calculated from the CORE datasets. The experimental results show that the model adequately reproduces the basic characteristics of large-scale World Ocean dynamics, in good agreement with both observational data and the results of the best climatic OGCMs. This OGCM is used as the oceanic component of the new version of the climate system model (CSM) developed at INM RAS. The latter is now ready for new numerical experiments on modelling climate and its change according to IPCC (Intergovernmental Panel on Climate Change) scenarios within the scope of CMIP-5 (Coupled Model Intercomparison Project). On the basis of the proposed complex, a Pacific Ocean eddy-resolving circulation model was also realized. The integration domain covers the Pacific from the Equator to the Bering Strait. The model's horizontal resolution is 0.125 degree, with 20 non-uniform sigma levels in depth. The model adequately reproduces the large-scale circulation structure and its variability: Kuroshio meandering, ocean synoptic eddies, frontal zones, etc. The high variability of the Kuroshio is shown. The distribution of a contaminant, assumed to be discharged near Petropavlovsk-Kamchatsky, was also simulated. The results demonstrate the structure of the contaminant distribution and improve our understanding of the processes that form hydrological fields in the Northwest Pacific.
Large-Scale Transport Responses to Tropospheric Circulation Changes Using GEOS-5
NASA Technical Reports Server (NTRS)
Orbe, Clara; Molod, Andrea; Arnold, Nathan; Waugh, Darryn W.; Yang, Huang
2017-01-01
The mean age since air was last at the Northern Hemisphere midlatitude surface is a fundamental property of tropospheric transport. Recent comparisons among chemistry climate models, however, reveal that there are large differences in the mean age among models and that these differences are most likely related to differences in tropical (parameterized) convection. Here we use aquaplanet simulations of the Goddard Earth Observing System Model Version 5 (GEOS-5) to explore the sensitivity of the mean age to changes in the tropical circulation. Tropical circulation changes are forced by prescribed localized off-equatorial warm sea surface temperature anomalies that (qualitatively) reproduce the convection and circulation differences among the comprehensive models. Idealized chemical species subject to prescribed OH loss are also integrated in parallel in order to illustrate the impact of tropical transport changes on interhemispheric constituent transport.
NASA Astrophysics Data System (ADS)
Schnase, J. L.; Duffy, D. Q.; McInerney, M. A.; Tamkin, G. S.; Thompson, J. H.; Gill, R.; Grieg, C. M.
2012-12-01
MERRA Analytic Services (MERRA/AS) is a cyberinfrastructure resource for developing and evaluating a new generation of climate data analysis capabilities. MERRA/AS supports OBS4MIP activities by reducing the time spent preparing Modern Era Retrospective-Analysis for Research and Applications (MERRA) data used in data-model intercomparison. It also provides a testbed for experimental development of high-performance analytics. MERRA/AS is a cloud-based service built around the Virtual Climate Data Server (vCDS) technology that is currently used by the NASA Center for Climate Simulation (NCCS) to deliver Intergovernmental Panel on Climate Change (IPCC) data to the Earth System Grid Federation (ESGF). Crucial to its effectiveness, MERRA/AS's servers will use a workflow-generated realizable object capability to perform analyses over the MERRA data using the MapReduce approach to parallel storage-based computation. The results produced by these operations will be stored by the vCDS, which will also be able to host code sets for those who wish to explore the use of MapReduce for more advanced analytics. While the work described here focuses on the MERRA collection, these technologies can be used to publish other reanalysis, observational, and ancillary OBS4MIP data to ESGF and, importantly, offer an architectural approach to climate data services that can be generalized to applications and customers beyond the traditional climate research community. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. [Figure: (A) MERRA/AS software stack; (B) example MERRA/AS interfaces.]
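The MapReduce idea described above can be sketched compactly: map tasks reduce each data granule to partial sums, and a reduce step combines them into a global statistic. The file names and NumPy arrays below are stand-ins for MERRA granules and parallel storage, not the MERRA/AS implementation.

```python
from multiprocessing import Pool
import numpy as np

def map_partial(path):
    """Map: read one (synthetic) data granule, return (sum, count)."""
    data = np.load(path)          # stands in for reading a MERRA granule
    return data.sum(), data.size

def reduce_mean(partials):
    """Reduce: combine per-file partial sums into a global mean."""
    total = sum(s for s, _ in partials)
    count = sum(n for _, n in partials)
    return total / count

if __name__ == "__main__":
    paths = [f"granule_{i}.npy" for i in range(4)]      # hypothetical files
    for p in paths:                                     # create demo inputs
        np.save(p, np.random.rand(10, 10))
    with Pool(4) as pool:                               # parallel map phase
        print("global mean:", reduce_mean(pool.map(map_partial, paths)))
```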
NASA Wrangler: Automated Cloud-Based Data Assembly in the RECOVER Wildfire Decision Support System
NASA Technical Reports Server (NTRS)
Schnase, John; Carroll, Mark; Gill, Roger; Wooten, Margaret; Weber, Keith; Blair, Kindra; May, Jeffrey; Toombs, William
2017-01-01
NASA Wrangler is a loosely coupled, event-driven, highly parallel data aggregation service designed to take advantage of the elastic resource capabilities of cloud computing. Wrangler automatically collects Earth observational data, climate model outputs, derived remote sensing data products, and historic biophysical data for pre-, active-, and post-wildfire decision making. It is a core service of the RECOVER decision support system, which is providing rapid-response GIS analytic capabilities to state and local government agencies. Wrangler reduces to minutes the time needed to assemble and deliver crucial wildfire-related data.
Somatic growth dynamics of West Atlantic hawksbill sea turtles: a spatio-temporal perspective
Bjorndal, Karen A.; Chaloupka, Milani; Saba, Vincent S.; Diez, Carlos E.; van Dam, Robert P.; Krueger, Barry H.; Horrocks, Julia A.; Santos, Armando J.B.; Bellini, Cláudio; Marcovaldi, Maria A.G.; Nava, Mabel; Willis, Sue; Godley, Brendan J.; Gore, Shannon; Hawkes, Lucy A.; McGowan, Andrew; Witt, Matthew J.; Stringell, Thomas B.; Sanghera, Amdeep; Richardson, Peter B.; Broderick, Annette C.; Phillips, Quinton; Calosso, Marta C.; Claydon, John A.B.; Blumenthal, Janice; Moncada, Felix; Nodarse, Gonzalo; Medina, Yosvani; Dunbar, Stephen G.; Wood, Lawrence D.; Lagueux, Cynthia J.; Campbell, Cathi L.; Meylan, Anne B.; Meylan, Peter A.; Burns Perez, Virginia R.; Coleman, Robin A.; Strindberg, Samantha; Guzmán-H, Vicente; Hart, Kristen M.; Cherkiss, Michael S.; Hillis-Starr, Zandy; Lundgren, Ian; Boulon, Ralf H.; Connett, Stephen; Outerbridge, Mark E.; Bolten, Alan B.
2016-01-01
Somatic growth dynamics are an integrated response to environmental conditions. Hawksbill sea turtles (Eretmochelys imbricata) are long-lived, major consumers in coral reef habitats that move over broad geographic areas (hundreds to thousands of kilometers). We evaluated spatio-temporal effects on hawksbill growth dynamics over a 33-yr period and 24 study sites throughout the West Atlantic and explored relationships between growth dynamics and climate indices. We compiled the largest ever data set on somatic growth rates for hawksbills – 3541 growth increments from 1980 to 2013. Using generalized additive mixed model analyses, we evaluated 10 covariates, including spatial and temporal variation, that could affect growth rates. Growth rates throughout the region responded similarly over space and time. The lack of a spatial effect or spatio-temporal interaction and the very strong temporal effect reveal that growth rates in West Atlantic hawksbills are likely driven by region-wide forces. Between 1997 and 2013, mean growth rates declined significantly and steadily by 18%. Regional climate indices have significant relationships with annual growth rates with 0- or 1-yr lags: positive with the Multivariate El Niño Southern Oscillation Index (correlation = 0.99) and negative with Caribbean sea surface temperature (correlation = −0.85). Declines in growth rates between 1997 and 2013 throughout the West Atlantic most likely resulted from warming waters through indirect negative effects on foraging resources of hawksbills. These climatic influences are complex. With increasing temperatures, trajectories of decline of coral cover and availability in reef habitats of major prey species of hawksbills are not parallel. Knowledge of how choice of foraging habitats, prey selection, and prey abundance are affected by warming water temperatures is needed to understand how climate change will affect productivity of consumers that live in association with coral reefs.
Recent Climate and Ice-Sheet Changes in West Antarctica Compared with the Past 2,000 Years
NASA Technical Reports Server (NTRS)
Steig, Eric J.; Ding, Qinghua; White, James W.; Kuttel, Marcel; Rupper, Summer B.; Neumann, Thomas Allen; Neff, Peter D.; Gallant, Ailie J. E.; Mayewski, Paul A.; Taylor, Kendrick C.;
2013-01-01
Changes in atmospheric circulation over the past five decades have enhanced the wind-driven inflow of warm ocean water onto the Antarctic continental shelf, where it melts ice shelves from below. Atmospheric circulation changes have also caused rapid warming over the West Antarctic Ice Sheet, and contributed to declining sea-ice cover in the adjacent Amundsen-Bellingshausen seas. It is unknown whether these changes are part of a longer-term trend. Here, we use water-isotope (δ18O) data from an array of ice-core records to place recent West Antarctic climate changes in the context of the past two millennia. We find that the δ18O of West Antarctic precipitation has increased significantly in the past 50 years, in parallel with the trend in temperature, and was probably more elevated during the 1990s than at any other time during the past 200 years. However, δ18O anomalies comparable to those of recent decades occur about 1% of the time over the past 2,000 years. General circulation model simulations suggest that recent trends in δ18O and climate in West Antarctica cannot be distinguished from decadal variability that originates in the tropics. We conclude that the uncertain trajectory of tropical climate variability represents a significant source of uncertainty in projections of West Antarctic climate and ice-sheet change.
Health Impacts of Air Pollution Under a Changing Climate
NASA Astrophysics Data System (ADS)
Kinney, P. L.; Knowlton, K.; Rosenthal, J.; Hogrefe, C.; Rosenzweig, C.; Solecki, W.
2003-12-01
Outdoor air pollution remains a serious public health problem in cities throughout the world. In the US, despite considerable progress in reducing emissions over the past 30 years, as many as 50,000 premature deaths each year have been attributed to airborne particulate matter alone. Tropospheric ozone has been associated with increased daily mortality and hospitalization rates, and with a variety of related respiratory problems. Weather plays an important role in the transport and transformation of air pollution. In particular, a warming climate is likely to promote the atmospheric reactions that are responsible for ozone and secondary aerosol production, as well as increasing emissions of many of their volatile precursors. Increasingly, efforts to address urban air pollution problems throughout the world will be complicated by trends and variability in climate. The New York Climate and Health Project (NYCHP) is developing and applying tools for integrated assessment of health impacts from air pollution and heat associated with climate and land-use changes in the New York City metropolitan region. Global climate change is modeled over the 21st century based on the Intergovernmental Panel on Climate Change (IPCC) A2 greenhouse gas emissions scenario using the Goddard Institute for Space Studies (GISS) Global Atmosphere-Ocean Model (GCM). Meteorological fields are downscaled to a 36 km grid over the eastern US using the Penn State/NCAR MM5 mesoscale meteorological model. MM5 results are then used as input to the Community Multiscale Air Quality (CMAQ) model for simulating air quality, with emissions based on the Sparse Matrix Operator Kernel Emissions Modeling System (SMOKE). To date, simulations have been performed for five summer seasons each during the 1990s and the 2050s. An evaluation of the present-day climate and air quality predictions indicates that the modeling system largely captures the observed climate-ozone system. Analysis of future-year predictions shows an increase in temperature and humidity as well as mean and extreme ozone concentrations under the IPCC A2 emission scenario. To address public health impacts, a risk assessment framework is used to estimate ozone-related mortality in the region, with a focus on comparing health impact estimates for the 1990s versus the 2050s. This endpoint represents a potentially appreciable public health impact resulting from climate change-induced alterations in regional air quality profiles. Concentration-response functions from the epidemiological literature describing ozone-mortality relationships are used to estimate numbers of regional deaths in a typical 1990s summer and a typical 2050s summer. Preliminary analysis of future-year ozone-related mortality suggests a subtle increase in the number of summer ozone-related deaths in the New York region in the 2050s as compared to the 1990s. A parallel evaluation of heat-related mortality in a typical summer of the 2050s suggests a greater relative increase as compared to the 1990s, with a doubling to tripling of regional summer heat deaths possible by the 2050s.
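The risk-assessment step described above rests on a standard log-linear concentration-response form, ΔM = y0(1 − e^(−βΔC))·Pop, which translates an ozone change into attributable deaths. The sketch below applies it with illustrative numbers that are not the NYCHP coefficients.

```python
import math

def excess_deaths(beta, delta_ozone_ppb, baseline_rate, population):
    """Attributable deaths = y0 * (1 - exp(-beta * dC)) * population."""
    return baseline_rate * (1.0 - math.exp(-beta * delta_ozone_ppb)) * population

# Illustrative inputs only: a 5 ppb summer ozone increase over 8 million
# people with a summer baseline mortality rate of 0.004 deaths per person.
print(f"excess deaths: {excess_deaths(0.0005, 5.0, 0.004, 8_000_000):.0f}")
```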
Modernizing Earth and Space Science Modeling Workflows in the Big Data Era
NASA Astrophysics Data System (ADS)
Kinter, J. L.; Feigelson, E.; Walker, R. J.; Tino, C.
2017-12-01
Modeling is a major aspect of Earth and space science research. The development of numerical models of the Earth system, planetary systems or astrophysical systems is essential to linking theory with observations. Optimal use of observations that are expensive to obtain and maintain typically requires data assimilation involving numerical models. In the Earth sciences, models of the physical climate system are typically used for data assimilation, climate projection, and inter-disciplinary research, spanning applications from analysis of multi-sensor data sets to decision-making in climate-sensitive sectors with applications to ecosystems, hazards, and various biogeochemical processes. In space physics, most models are built from first principles, require considerable expertise to run, and are frequently modified significantly for each case study. The volume and variety of model output data from modeling Earth and space systems are rapidly increasing and have reached a scale where human interaction with the data is prohibitively inefficient. A major barrier to progress is that modeling workflows are not treated by practitioners as a design problem. Existing workflows have been created by a slow accretion of software, typically based on undocumented, inflexible scripts haphazardly modified by a succession of scientists and students not trained in modern software engineering methods. As a result, existing modeling workflows suffer from an inability to onboard new datasets into models, an inability to keep pace with accelerating data production rates, and irreproducibility, among other problems. These factors are creating an untenable situation for those conducting and supporting Earth system and space science. Improving modeling workflows requires investments in hardware, software and human resources. This paper describes the critical path issues that must be targeted to accelerate modeling workflows, including script modularization, parallelization, and automation in the near term, and longer term investments in virtualized environments for improved scalability, tolerance for lossy data compression, novel data-centric memory and storage technologies, and tools for peer reviewing, preserving and sharing workflows, as well as fundamental statistical and machine learning algorithms.
The stability of ground ice in the equatorial region of Mars
NASA Technical Reports Server (NTRS)
Clifford, S. M.; Hillel, D.
1983-01-01
The lifetime of an unreplenished layer of ground ice lying within 30 deg of the Martian equator was examined within the context of the existing data base on Martian regolith and climate. Data on the partial pressure of H2O in the Martian atmosphere and the range of mean annual temperatures indicated that ground ice would be restricted to latitudes poleward of 40 deg. However, the ground ice near the poles may be a relic from early Martian geologic times, held in place by a thin layer of regolith. Consideration of twelve model pore size distributions, similar to silt- and clay-type earth soils, was combined with a parallel pore model of gaseous diffusion to calculate the flux of H2O molecules escaping from the subsurface ground ice layer. Martian equatorial ground ice was found to be influenced by the soil structure, the magnitude of the geothermal gradient, and the climatic desorption of CO2 from the regolith. It is concluded that equatorial ground ice is present on Mars only if a process of replenishment is active.
Simplified Parallel Domain Traversal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erickson III, David J
2011-01-01
Many data-intensive scientific analysis techniques require global domain traversal, which over the years has been a bottleneck for efficient parallelization across distributed-memory architectures. Inspired by MapReduce and other simplified parallel programming approaches, we have designed DStep, a flexible system that greatly simplifies efficient parallelization of domain traversal techniques at scale. In order to deliver both simplicity to users as well as scalability on HPC platforms, we introduce a novel two-tiered communication architecture for managing and exploiting asynchronous communication loads. We also integrate our design with advanced parallel I/O techniques that operate directly on native simulation output. We demonstrate DStep by performing teleconnection analysis across ensemble runs of terascale atmospheric CO2 and climate data, and we show scalability results on up to 65,536 IBM BlueGene/P cores.
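A minimal flavor of such a parallel teleconnection analysis, sketched with mpi4py on synthetic data (this is not DStep's two-tiered architecture): each rank correlates its slab of grid cells with a reference time series, and the root gathers the per-cell correlations.

```python
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

ntime, ncells = 120, 4096                     # synthetic CO2 field dimensions
rng = np.random.default_rng(rank)
local = rng.standard_normal((ntime, ncells // size))  # this rank's slab
ref = np.sin(np.linspace(0, 12 * np.pi, ntime))       # reference series

# Correlate every local grid cell with the reference time series.
ref_a = (ref - ref.mean()) / ref.std()
loc_a = (local - local.mean(axis=0)) / local.std(axis=0)
corr = ref_a @ loc_a / ntime                  # per-cell correlation coefficients

gathered = comm.gather(corr, root=0)          # reduce step at the root
if rank == 0:
    print("max |r|:", np.abs(np.concatenate(gathered)).max())
```

Run under an MPI launcher, e.g. `mpiexec -n 4 python teleconnect.py`; with one process it degenerates to a serial scan of the whole domain.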
Risk Assessment of Hurricane Storm Surge for Tampa Bay
NASA Astrophysics Data System (ADS)
Lin, N.; Emanuel, K.
2011-12-01
Hurricane storm surge presents a major hazard for the United States and many other coastal areas around the world. Risk assessment of current and future hurricane storm surge provides the basis for risk mitigation and related decision making. This study investigates the hurricane surge risk for Tampa Bay, located on the central west coast of Florida. Although fewer storms have made landfall in central west Florida than in regions farther west in the Gulf of Mexico and on the U.S. east coast, Tampa Bay is highly vulnerable to storm surge due to its geophysical features. It is surrounded by low-lying lands, much of which may be inundated by a storm tide of 6 m. Also, edge waves trapped on the west Florida shelf can propagate along the coastline and affect the sea level outside the area of a forced storm surge; Tampa Bay may therefore be affected by storms passing some distance outside the Bay. Moreover, when the propagation speed of the edge wave is close to that of a storm moving parallel to the coast, resonance may occur and the water elevation in the Bay may be greatly enhanced. Tampa Bay is therefore vulnerable to storms with a broad spectrum of characteristics. We apply a model-based risk assessment method to carry out the investigation. To estimate the current surge risk, we apply a statistical/deterministic hurricane model to generate a set of 1500 storms for the Tampa area under the observed current climate (represented by 1981-2000 statistics) estimated from the NCAR/NCEP reanalysis. To study the effect of climate change, we use four climate models, CNRM-CM3, ECHAM, GFDL-CM2.0, and MIROC3.2, to drive the hurricane model, generating four sets of 1500 Tampa storms under current climate conditions (represented by 1981-2000 statistics) and another four under future climate conditions of the IPCC-AR4 A1B emission scenario (represented by 2081-2100 statistics). Then, we apply two hydrodynamic models, the Advanced Circulation (ADCIRC) model and the Sea, Lake, and Overland Surges from Hurricanes (SLOSH) model, with grids of various resolutions to simulate the surges induced by the synthetic storms. The surge risk under each of the climate scenarios is depicted by a surge return level curve (exceedance probability curve). For the city of Tampa, the heights of the 100-year, 500-year, and 1000-year surges under the current climate are estimated to be 3.85, 5.66, and 6.31 m, respectively. Two of the four climate models predict that surge return periods will be significantly shortened in the future climate due to the change of storm climatology; the current 100-year surge may happen every 50 years or less, the 500-year surge every 200 years or less, and the 1000-year surge every 300 years or less. The other two climate models predict that the surge return periods will become moderately shorter or remain almost unchanged in the future climate. Extreme surges up to 12 m at Tampa appear in our simulations. Although occurring with very small probabilities, these extreme surges would have a devastating impact on the Tampa Bay area. By examining the generated synthetic surge database, we study the characteristics of extreme storms at Tampa Bay, especially storms that may interact with edge waves along the Florida west coast.
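The return-level curves described above can be approximated empirically from a synthetic storm set. In the sketch below, a T-year return period implies a per-storm exceedance probability of 1/(λT) for an annual storm rate λ; the rate and the Gumbel-distributed surge sample are illustrative assumptions, not the study's simulations.

```python
import numpy as np

def return_levels(surges, annual_rate, periods):
    """Empirical return levels: the T-year surge is exceeded with
    per-storm probability p = 1 / (annual_rate * T)."""
    s = np.sort(surges)[::-1]                     # surges, descending
    n = len(s)
    levels = []
    for T in periods:
        p = 1.0 / (annual_rate * T)               # per-storm exceedance probability
        k = max(int(np.ceil(p * n)) - 1, 0)       # index of that sample quantile
        levels.append(s[k])
    return levels

rng = np.random.default_rng(1)
surges = rng.gumbel(1.0, 0.8, 1500)               # 1500 synthetic storm surges (m)
for T, z in zip((100, 500, 1000), return_levels(surges, 0.5, (100, 500, 1000))):
    print(f"{T:4d}-yr surge: {z:.2f} m")
```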
Climate Change and Implications for Prevention. California's Efforts to Provide Leadership.
Balmes, John R
2018-04-01
The atmospheric concentration of carbon dioxide (CO2) and the temperature of the earth's surface have been rising in parallel for decades, with the former recently reaching 400 parts per million, consistent with a 1.5°C increase in global warming. Climate change models predict that a "business as usual" approach, that is, no effort to control CO2 emissions from combustion of fossil fuels, will result in a more than 2°C increase in annual average surface temperature by approximately 2034. With atmospheric warming comes increased air pollution. The concept of a "climate gap" in air quality control captures the decreased effectiveness of regulatory policies to reduce pollution with a hotter climate. Sources of greenhouse gases and climate-forcing aerosols ("black carbon") are the same sources of air pollutants that harm health. California has adopted robust climate change mitigation policies that are also designed to achieve public health cobenefits by improving air quality. These policies include advanced clean car standards, renewable energy, a sustainable communities strategy to limit suburban sprawl, a low carbon fuel standard, and energy efficiency. A market-based mechanism to put a price on CO2 emissions is the cap-and-trade program that allows capped facilities to trade state-issued greenhouse gas emissions allowances. The "cap" limits total greenhouse gas emissions from all covered sources, and declines over time to progressively reduce emissions. An alternative approach is a carbon tax. California's leadership on air quality and climate change mitigation is increasingly important, given the efforts to slow or even reverse implementation of such policies at the U.S. national level.
NASA Astrophysics Data System (ADS)
Duffy, D.; Maxwell, T. P.; Doutriaux, C.; Williams, D. N.; Chaudhary, A.; Ames, S.
2015-12-01
As the size of remote sensing observations and model output data grows, the volume of the data has become overwhelming, even to many scientific experts. As societies are forced to better understand, mitigate, and adapt to climate change, the combination of Earth observation data and global climate model projections is crucial not only to scientists but to policy makers, downstream applications, and even the public. Scientific progress on understanding climate is critically dependent on the availability of a reliable infrastructure that promotes data access, management, and provenance. The Earth System Grid Federation (ESGF) has created such an environment for the Intergovernmental Panel on Climate Change (IPCC). ESGF provides a federated global cyberinfrastructure for data access and management of model outputs generated for the IPCC Assessment Reports (ARs). The current generation of the ESGF federated grid allows consumers of the data to find and download data, with limited capabilities for server-side processing. Since the amount of data for future ARs is expected to grow dramatically, ESGF is working on integrating server-side analytics throughout the federation. The ESGF Compute Working Team (CWT) has created a Web Processing Service (WPS) Application Programming Interface (API) to enable access to scalable computational resources. The API is the exposure point to high performance computing resources across the federation. Specifically, the API allows users to execute simple operations, such as maximum, minimum, average, and anomalies, on ESGF data without having to download the data. These operations are executed at the ESGF data node site with access to large amounts of parallel computing capability. This presentation will highlight the WPS API, its capabilities, provide implementation details, and discuss future developments.
Stomata: key players in the earth system, past and present.
Berry, Joseph A; Beerling, David J; Franks, Peter J
2010-06-01
Stomata have played a key role in the Earth System for at least 400 million years. By enabling plants to control the rate of evaporation from their photosynthetic organs, stomata helped to set in motion non-linear processes that led to an acceleration of the hydrologic cycle over the continents and an expansion of climate zones favorable for plant life. Global-scale modeling of land-atmosphere interactions provides a way to explore parallels between the influence of vegetation on climate over time, and the influence of spatial and temporal variation in the activities of vegetation in the current Earth System on climate and weather. We use the logic in models that simulate land-atmosphere interactions to illustrate the central role played by stomatal conductance in the Earth System. In the modeling context, most of the activities of plants and their manifold interactions with their genomes and with the environment are communicated to the atmosphere through a single property: the aperture, or conductance, of their stomata. We tend to think of the controls on vegetation responses in the real world as being distributed among factors such as seasonal patterns of growth, the changing availability of soil water, or changes in light intensity and leaf water potential over a day. However, the impact of these controls on crucial exchanges of energy and water vapor with the atmosphere is also largely mediated by stomata. The decisions 'made by' stomata emerge as an important and inadequately understood component of these models. At the present time we lack effective ways to link advances in the biology of stomata to this decision-making process. While not unusual, this failure to connect across disciplines introduces uncertainty in modeling studies being used to predict weather and climate change, and ultimately to inform policy decisions. This problem is also an opportunity.
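In many land-surface models the stomatal "decision" discussed above is compressed into a single empirical relation; one widely used form is the Ball-Berry model, gs = g0 + g1·A·hs/cs. The sketch below uses illustrative coefficients, not values from any model cited here.

```python
def ball_berry(a_net, rh, co2_surf, g0=0.01, g1=9.0):
    """Stomatal conductance gs = g0 + g1 * A * hs / cs (mol m-2 s-1),
    with net assimilation A in umol m-2 s-1 and CO2 in ppm, both
    converted to mol-based units inside the formula."""
    return g0 + g1 * (a_net * 1e-6) * rh / (co2_surf * 1e-6)

# Illustrative midday C3 leaf: A = 15 umol m-2 s-1, 60% relative
# humidity at the leaf surface, 400 ppm CO2.
gs = ball_berry(15.0, 0.60, 400.0)
print(f"gs ~ {gs:.3f} mol m-2 s-1")
```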
NASA Astrophysics Data System (ADS)
Robles-Morua, A.; Vivoni, E.; Rivera-Fernandez, E. R.; Dominguez, F.; Meixner, T.
2013-05-01
Hydrologic modeling using high spatiotemporal resolution satellite precipitation products in the southwestern United States and northwest Mexico is important given the sparse nature of available rain gauges. In addition, the bimodal distribution of annual precipitation presents a challenge, as differential climate impacts during the winter and summer seasons are not currently well understood. In this work, we focus on hydrological comparisons using rainfall forcing from a satellite-based product, downscaled GCM precipitation estimates and available ground observations. The simulations are being conducted in the Santa Cruz and San Pedro river basins along the Arizona-Sonora border at high spatiotemporal resolutions (~100 m and ~1 hour). We use a distributed hydrologic model, known as the TIN-based Real-time Integrated Basin Simulator (tRIBS), to generate simulated hydrological fields under historical (1991-2000) and climate change (2031-2040) scenarios obtained from an application of the Weather Research and Forecasting (WRF) model. Using the distributed model, we transform the meteorological scenarios at 10-km, hourly resolution into predictions of the annual water budget, seasonal land surface fluxes and individual hydrographs of flood and recharge events. We compare the model outputs and rainfall fields of the WRF products against the forcing from the North American Land Data Assimilation System (NLDAS) and available ground observations from the National Climatic Data Center (NCDC) and the Arizona Meteorological Network (AZMET). For this contribution, we selected two full years in the historical period and in the future scenario that represent wet and dry conditions for each decade. Given the size of the two basins, we rely on a high performance computing platform and a parallel domain discretization with higher resolutions maintained at experimental catchments in each river basin. Model simulations utilize best-available data across the Arizona-Sonora border on topography, land cover and soils obtained from analysis of remotely-sensed imagery and government databases. In addition, for the historical period, we build confidence in the model simulations through comparisons with streamflow estimates in the region. The model comparisons during the historical and future periods will yield a first-of-its-kind assessment of the impacts of climate change on the hydrology of two large semiarid river basins of the southwestern United States.
Study of phase clustering method for analyzing large volumes of meteorological observation data
NASA Astrophysics Data System (ADS)
Volkov, Yu. V.; Krutikov, V. A.; Botygin, I. A.; Sherstnev, V. S.; Sherstneva, A. I.
2017-11-01
The article describes an iterative parallel phase-grouping algorithm for temperature field classification. The algorithm is based on a modified structure-forming method that uses the analytic signal. The method can address climate classification as well as climatic zoning at any temporal or spatial scale. When applied to surface temperature measurement series, the algorithm identifies climatic structures with correlated changes of the temperature field, supports conclusions about climate uniformity in a given area, and tracks climate changes over time by analyzing shifts in the type groups. The information on climate type groups specific to selected geographical areas is supplemented by a genetic scheme of class distribution that depends on changes in the mutual correlation between monthly average ground temperatures.
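A rough sketch of phase-based grouping in this spirit: the analytic signal (via the Hilbert transform) gives each station series an instantaneous phase, and stations with similar mean phase offsets are grouped together. The data and the crude two-group split below are illustrative, not the authors' algorithm.

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(2)
t = np.linspace(0, 4 * np.pi, 240)                 # 20 "years" of monthly values
# Four synthetic station series: two pairs with distinct phase lags.
stations = np.array([np.sin(t + lag) + 0.2 * rng.standard_normal(t.size)
                     for lag in (0.0, 0.1, 2.0, 2.1)])

# Instantaneous phase from the analytic signal of each series.
phase = np.unwrap(np.angle(hilbert(stations, axis=1)), axis=1)

# Group stations by their mean phase offset relative to the ensemble.
offsets = phase.mean(axis=1) - phase.mean()
groups = np.digitize(offsets, bins=[offsets.mean()])   # crude 2-way split
print("phase offsets:", np.round(offsets, 2), "group labels:", groups)
```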
Impacts of Climate Change on Management of the Colorado River Reservoir System
NASA Astrophysics Data System (ADS)
Christensen, N. S.; Lettenmaier, D. P.
2002-05-01
The Colorado River system provides water supply to a large area of the interior west. It drains a mostly arid area, with naturalized flow (effects of reservoirs and diversions removed) averaging only 40 mm/yr over the 630,000 km2 drainage area at the mouth of the river. Total reservoir storage (mostly behind Hoover and Glen Canyon Dams) is equivalent to over four times the mean flow of the river. Runoff is heavily dominated by high-elevation source areas in the Rocky Mountain headwaters, and the seasonal runoff pattern throughout the Colorado basin is strongly dominated by winter snow accumulation and spring melt. Because of the arid nature of the basin and the low runoff per unit area, performance of the reservoir system is potentially susceptible to changes in streamflow that would result from global warming, although those manifestations are somewhat different than elsewhere in the west, where reservoir storage is relatively much smaller. Using the macroscale Variable Infiltration Capacity (VIC) model, we evaluate possible changes in streamflow over the next century based on three 100-year ensemble climate simulations of the NCAR/DOE Parallel Climate Model corresponding to business-as-usual (BAU) future greenhouse gas emissions. Single ensemble simulations from the U.K. Hadley Center and the Max Planck Institute models are considered as well. For most of the climate scenarios, the peak runoff shifts about one month earlier relative to the recent past. However, unlike reservoir systems elsewhere in the west, the effect of these timing shifts is largely mitigated by the size of the reservoir system, and changes in reservoir system reliability (for agricultural water supply and hydropower production) are dominated by streamflow volume shifts, which vary considerably across the climate scenarios.
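Reservoir-system reliability of the kind evaluated above can be illustrated with a toy single-reservoir mass balance: step storage through a monthly inflow series, deliver demand when storage allows, and report the fraction of months fully met. The capacity, demand, and inflow statistics below are arbitrary, not Colorado River values.

```python
import numpy as np

def reliability(inflows, capacity, demand, storage0):
    """Fraction of time steps in which full demand is delivered."""
    storage, met = storage0, 0
    for q in inflows:
        storage = min(storage + q, capacity)     # add inflow, spill at capacity
        release = min(demand, storage)           # deliver what is available
        storage -= release
        met += release >= demand                 # count fully met months
    return met / len(inflows)

rng = np.random.default_rng(3)
inflows = rng.gamma(2.0, 0.5, 1200)              # 100 synthetic years, monthly
r = reliability(inflows, capacity=4.0, demand=1.0, storage0=2.0)
print(f"reliability: {r:.2%}")
```

Shifting the inflow series earlier in the year (a timing shift) barely moves this metric when capacity is several times mean annual flow, while scaling the inflows down (a volume shift) degrades it directly, which is the qualitative point of the abstract.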
A Petascale Non-Hydrostatic Atmospheric Dynamical Core in the HOMME Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tufo, Henry
The High-Order Method Modeling Environment (HOMME) is a framework for building scalable, conservative atmospheric models for climate simulation and general atmospheric-modeling applications. Its spatial discretizations are based on Spectral-Element (SE) and Discontinuous Galerkin (DG) methods. These are local methods employing high-order accurate spectral basis functions that have been shown to perform well on massively parallel supercomputers at any resolution, and to scale particularly well at high resolutions. HOMME provides the framework upon which the CAM-SE community atmosphere model dynamical core is constructed. In its current incarnation, CAM-SE employs the hydrostatic primitive equations (PE) of motion, which limits its resolution to simulations coarser than 0.1° per grid cell. The primary objective of this project is to remove this resolution limitation by providing HOMME with the capabilities needed to build nonhydrostatic models that solve the compressible Euler/Navier-Stokes equations.
POM.gpu-v1.0: a GPU-based Princeton Ocean Model
NASA Astrophysics Data System (ADS)
Xu, S.; Huang, X.; Oey, L.-Y.; Xu, F.; Fu, H.; Zhang, Y.; Yang, G.
2015-09-01
Graphics processing units (GPUs) are an attractive solution in many scientific applications due to their high performance. However, most existing GPU conversions of climate models use GPUs for only a few computationally intensive regions. In the present study, we redesign the mpiPOM (a parallel version of the Princeton Ocean Model) for GPUs. Specifically, we first convert the model from its original Fortran form to a new Compute Unified Device Architecture C (CUDA-C) code; then we optimize the code on each of the GPUs, the communications between the GPUs, and the I/O between the GPUs and the central processing units (CPUs). We show that the performance of the new model on a workstation containing four GPUs is comparable to that on a powerful cluster with 408 standard CPU cores, and that it reduces energy consumption by a factor of 6.8.
Interaction Between Ecohydrologic Dynamics and Microtopographic Variability Under Climate Change
NASA Astrophysics Data System (ADS)
Le, Phong V. V.; Kumar, Praveen
2017-10-01
Vegetation acclimation resulting from elevated atmospheric CO2 concentration, along with responses to increased temperature and altered rainfall patterns, is expected to result in emergent behavior in ecologic and hydrologic functions. We hypothesize that microtopographic variability, comprising landscape features typically at length scales of the order of meters, such as topographic depressions, will play an important role in determining these dynamics by altering the persistence and variability of moisture. To investigate these emergent ecohydrologic dynamics, we develop a modeling framework, Dhara, which explicitly incorporates the control of microtopographic variability on vegetation, moisture, and energy dynamics. The intensive computational demand of such a modeling framework, which couples multilayer modeling of the soil-vegetation continuum with 3-D surface-subsurface flow processes, is addressed using a hybrid CPU-GPU parallel computing framework. The study is performed for different climate change scenarios for an intensively managed agricultural landscape in central Illinois, USA, which is dominated by row-crop agriculture, primarily soybean (Glycine max) and maize (Zea mays). We show that rising CO2 concentration will decrease evapotranspiration, thus increasing soil moisture and surface water ponding in topographic depressions. However, increased atmospheric demand from higher air temperature overcomes this conservative behavior, resulting in a net increase of evapotranspiration and leading to reductions in both soil moisture storage and the persistence of ponding. These results shed light on the linkage between vegetation acclimation under climate change and microtopographic controls on ecohydrologic processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graumlich, L.J.; Brubaker, L.B.
1995-07-01
The specter of human-induced alteration of atmospheric composition, and the associated changes in climate, have focused attention on how species, communities, and ecosystems respond to climate change. One source of information concerning this is the paleoecological record. Paleoecology offers insights into the nature of climate-vegetation interactions that derive from the well-documented response of plant communities to environmental changes of the past. The spatial and temporal resolution of paleoecological data sets has increased in recent decades, so that relatively detailed histories of conifer forests are available for much of North America and Europe. In addition, comparisons of records of past vegetation dynamics to paleoclimatic simulations by general circulation models have improved the understanding of the role of climate in governing past vegetation change. Several major findings of paleoresearch have importance to investigations of the effects of future climate change on the Earth's biota. These include the findings (1) that changing seasonality may result in unexpected vegetation patterns, (2) that climatic and vegetation changes can be rapid, with ecosystem-wide implications, and (3) that short-term, extreme events can have long-term effects on tree population structures. In this chapter, we discuss patterns of coniferous forest response to climatic variation at two temporal scales: the Late Quaternary and the last millennium. Our examples illustrate the wide range of potential responses of coniferous forests to climatic variation, and emphasize opportunities for applying paleoecological findings to questions of ecophysiological research. Although we rely largely on examples from North America, our conclusions are well-supported by parallel research results in Europe and Asia.
Heat remains unaccounted for in thermal physiology and climate change research.
Flouris, Andreas D; Kenny, Glen P
2017-01-01
In the aftermath of the Paris Agreement, there is a crucial need for scientists in both thermal physiology and climate change research to develop the integrated approaches necessary to evaluate the health, economic, technological, social, and cultural impacts of 1.5°C warming. Our aim was to explore the fidelity of remote temperature measurements for quantitatively identifying the continuous redistribution of heat within both the Earth and the human body. Not accounting for the regional distribution of warming and heat storage patterns can undermine the results of thermal physiology and climate change research. These concepts are discussed herein using two parallel examples: the so-called slowdown of the Earth's surface temperature warming in the period 1998-2013, and the controversial results in thermal physiology that arise from relying heavily on core temperature measurements. Overall, the concept of heat is of major importance for the integrity of systems such as the Earth and the human body. At present, our understanding of the interplay of key factors modulating the heat distribution on the surface of the Earth and in the human body remains incomplete. Identifying and accounting for the interconnections among these factors will be instrumental in improving the accuracy of both climate models and health guidelines.
NASA Astrophysics Data System (ADS)
Turney, C. S.; Fogwill, C. J.; Palmer, J. G.; VanSebille, E.; Thomas, Z.; McGlone, M.; Richardson, S.; Wilmshurst, J.; Fenwick, P.; Zunz, V.; Goosse, H.; Wilson, K. J.; Carter, L.; Lipson, M.; Jones, R. T.; Harsch, M.; Clark, G.; Marzinelli, E.; Rogers, T.; Rainsley, E.; Ciasto, L.; Waterman, S.; Thomas, E. R.; Visbeck, M.
2017-12-01
Occupying about 14% of the world's surface, the Southern Ocean plays a fundamental role in ocean and atmosphere circulation, carbon cycling and Antarctic ice-sheet dynamics. Unfortunately, high interannual variability and a dearth of instrumental observations before the 1950s limit our understanding of how marine-atmosphere-ice domains interact on multi-decadal timescales and the impact of anthropogenic forcing. Here we integrate climate-sensitive tree growth with ocean and atmospheric observations on south-west Pacific subantarctic islands that lie at the boundary of polar and subtropical climates (52-54°S). Our annually resolved temperature reconstruction captures regional change since the 1870s and demonstrates a significant increase in variability from the 1940s, a phenomenon predating the observational record and coincident with major changes in mammalian and bird populations. Climate reanalysis and modelling show a parallel change in tropical Pacific sea surface temperatures that generates an atmospheric Rossby wave train which propagates across a large part of the Southern Hemisphere during the austral spring and summer. Our results suggest that the high interannual variability observed today was established across the mid-twentieth century, and that the influence of contemporary equatorial Pacific temperatures may now be a permanent feature across the mid- to high latitudes.
Joët, Thierry; Ourcival, Jean-Marc; Capelli, Mathilde; Dussert, Stéphane; Morin, Xavier
2016-01-01
Background and Aims: Dominant tree species in northern temperate forests, for example oak and beech, produce desiccation-sensitive seeds. Despite the potentially major influence of this functional trait on the regeneration and distribution of species under climate change, little is currently known about the ecological determinants of the persistence of desiccation-sensitive seeds in transient soil seed banks. Knowing which key climatic and microsite factors favour seed survival will help define the regeneration niche for species whose seeds display extreme sensitivity to environmental stress. Methods: Using the Mediterranean Holm oak (Quercus ilex) forest as a model system, an in situ time-course monitoring of seed water status and viability was performed during the unfavourable winter season in two years with contrasting rainfall, at an instrumented site with detailed climate records. In parallel, the characteristics of the microhabitat and their influence on the post-winter water status and viability of seeds were investigated in a regional survey of 33 woodlands representative of the French distribution of the species. Key Results: Time-course monitoring of seed water status in natural conditions confirmed that in situ desiccation is the main abiotic cause of mortality in winter. Critical water contents could be reached in a few days during drought spells. Seed dehydration rates were satisfactorily estimated using integrative climate proxies including vapour pressure deficit and potential evapotranspiration. Seed water status was therefore determined by the balance between water uptake after a rainfall event and water loss during dry periods. Structural equation modelling of microhabitat factors highlighted the major influence of canopy openness and the resulting incident radiation on the ground. Conclusions: This study provides part of the knowledge required to implement species distribution models which incorporate the regeneration niche. It is an important step forward in evaluating the ecological consequences of increasing winter drought and environmental filtering due to climate change on the regeneration of the most dominant Mediterranean tree species. PMID:26420203
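The vapour pressure deficit proxy mentioned in the Key Results can be computed from standard meteorological variables; below is a small sketch using the widely used FAO-56 saturation vapour pressure formula (the paper's exact proxy construction may differ):

```python
# Vapour pressure deficit (kPa) from air temperature and relative humidity,
# using the FAO-56 saturation vapour pressure curve.
import math

def vpd_kpa(t_celsius, rh_percent):
    """VPD = es(T) * (1 - RH/100), with es from the FAO-56 formula."""
    es = 0.6108 * math.exp(17.27 * t_celsius / (t_celsius + 237.3))
    return es * (1.0 - rh_percent / 100.0)

print(vpd_kpa(15.0, 40.0))   # a dry spell -> high VPD -> fast seed dehydration
```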
NASA Astrophysics Data System (ADS)
Mahmud, A.; Hixson, M.; Kleeman, M. J.
2012-02-01
The effect of climate change on population-weighted concentrations of particulate matter (PM) during extreme events was studied using the Parallel Climate Model (PCM), the Weather Research and Forecasting (WRF) model and the UCD/CIT 3-D photochemical air quality model. A "business as usual" (B06.44) global emissions scenario was dynamically downscaled for the entire state of California between the years 2000-2006 and 2047-2053. Air quality simulations were carried out for 1008 days in each of the present-day and future climate conditions using year-2000 emissions. Population-weighted concentrations of PM0.1, PM2.5, and PM10 total mass, component species, and primary source contributions were calculated for California and three air basins: the Sacramento Valley air basin (SV), the San Joaquin Valley air basin (SJV) and the South Coast Air Basin (SoCAB). Results over annual-average periods were contrasted with extreme events. Climate change between 2000 and 2050 did not cause a statistically significant change in annual-average population-weighted PM2.5 mass concentrations within any major sub-region of California in the current study. Climate change did alter the annual-average composition of the airborne particles in the SoCAB, with notable reductions of elemental carbon (EC; -3%) and organic carbon (OC; -3%) due to increased annual-average wind speeds that diluted primary concentrations from gasoline combustion (-3%) and food cooking (-4%). In contrast, climate change caused significant increases in population-weighted PM2.5 mass concentrations in central California during extreme events. The maximum 24-h average PM2.5 concentration experienced by an average person during a ten-year period in the SJV increased by 21% due to enhanced production of secondary particulate matter (manifested as NH4NO3). In general, climate change caused increased stagnation during future extreme pollution events, leading to higher exposure to diesel engine particles (+32%) and wood combustion particles (+14%) when averaging across the population of the entire state. Enhanced stagnation also isolated populations from distant sources such as shipping (-61%) during extreme events. The combination of these factors altered the statewide population-averaged composition of particles during extreme events, with EC increasing by 23%, nitrate increasing by 58%, and sulfate decreasing by 46%.
NASA Astrophysics Data System (ADS)
Mahmud, A.; Hixson, M.; Kleeman, M. J.
2012-08-01
The effect of climate change on population-weighted concentrations of particulate matter (PM) during extreme pollution events was studied using the Parallel Climate Model (PCM), the Weather Research and Forecasting (WRF) model and the UCD/CIT 3-D photochemical air quality model. A "business as usual" (B06.44) global emissions scenario was dynamically downscaled for the entire state of California between the years 2000-2006 and 2047-2053. Air quality simulations were carried out for 1008 days in each of the present-day and future climate conditions using year-2000 emissions. Population-weighted concentrations of PM0.1, PM2.5, and PM10 total mass, component species, and primary source contributions were calculated for California and three air basins: the Sacramento Valley air basin (SV), the San Joaquin Valley air basin (SJV) and the South Coast Air Basin (SoCAB). Results over annual-average periods were contrasted with extreme events. The current study found that the change in annual-average population-weighted PM2.5 mass concentrations due to climate change between 2000 and 2050 within any major sub-region in California was not statistically significant. However, climate change did alter the annual-average composition of the airborne particles in the SoCAB, with notable reductions of elemental carbon (EC; -3%) and organic carbon (OC; -3%) due to increased annual-average wind speeds that diluted primary concentrations from gasoline combustion (-3%) and food cooking (-4%). In contrast, climate change caused significant increases in population-weighted PM2.5 mass concentrations in central California during extreme events. The maximum 24-h average PM2.5 concentration experienced by an average person during a ten-year period in the SJV increased by 21% due to enhanced production of secondary particulate matter (manifested as NH4NO3). In general, climate change caused increased stagnation during future extreme pollution events, leading to higher exposure to diesel engine particles (+32%) and wood combustion particles (+14%) when averaging across the population of the entire state. Enhanced stagnation also isolated populations from distant sources such as shipping (-61%) during extreme events. The combination of these factors altered the statewide population-averaged composition of particles during extreme events, with EC increasing by 23%, nitrate increasing by 58%, and sulfate decreasing by 46%.
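Both versions of this study report population-weighted concentrations; the metric is simply a population-weighted average over grid cells, as the toy calculation below illustrates (all numbers are made up):

```python
# Population-weighted PM2.5: sum(pop_i * conc_i) / sum(pop_i).
import numpy as np

conc = np.array([12.0, 35.0, 8.0])   # cell-mean PM2.5 (ug/m3), illustrative
pop  = np.array([1e6, 3e5, 5e4])     # population per cell, illustrative

c_pw = np.sum(pop * conc) / np.sum(pop)
print(round(c_pw, 1))                # ~17.0 ug/m3, pulled toward populous cells
```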
Lindborg, T; Thorne, M; Andersson, E; Becker, J; Brandefelt, J; Cabianca, T; Gunia, M; Ikonen, A T K; Johansson, E; Kangasniemi, V; Kautsky, U; Kirchner, G; Klos, R; Kowe, R; Kontula, A; Kupiainen, P; Lahdenperä, A-M; Lord, N S; Lunt, D J; Näslund, J-O; Nordén, M; Norris, S; Pérez-Sánchez, D; Proverbio, A; Riekki, K; Rübel, A; Sweeck, L; Walke, R; Xu, S; Smith, G; Pröhl, G
2018-03-01
The International Atomic Energy Agency has coordinated an international project addressing climate change and landscape development in post-closure safety assessments of solid radioactive waste disposal. The work has been supported by results of parallel on-going research that has been published in a variety of reports and peer reviewed journal articles. The project is due to be described in detail in a forthcoming IAEA report. Noting the multi-disciplinary nature of post-closure safety assessments, an overview of the work is given here to provide researchers in the broader fields of radioecology and radiological safety assessment with a review of the work that has been undertaken. It is hoped that such dissemination will support and promote integrated understanding and coherent treatment of climate change and landscape development within an overall assessment process. The key activities undertaken in the project were: identification of the key processes that drive environmental change (mainly those associated with climate and climate change), and description of how a relevant future may develop on a global scale; development of a methodology for characterising environmental change that is valid on a global scale, showing how modelled global changes in climate can be downscaled to provide information that may be needed for characterising environmental change in site-specific assessments; and illustration of different aspects of the methodology in a number of case studies that show the evolution of site characteristics and the implications for the dose assessment models. Overall, the study has shown that quantitative climate and landscape modelling has now developed to the stage that it can be used to define an envelope of climate and landscape change scenarios at specific sites, under specific greenhouse-gas emissions assumptions, that is suitable for use in quantitative post-closure performance assessments. These scenarios are not predictions of the future, but are projections based on a well-established understanding of the important processes involved and their impacts on different types of landscape. Such projections support the understanding of, and selection of, plausible ranges of scenarios for use in post-closure safety assessments. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Alicja Breymeyer
1998-01-01
The responses of pine forest to changing climate and environmental chemistry were studied along two transects following the pollution and continentality gradients in Poland. One axis begins on the western border of Poland, crosses the country along the 52nd parallel, and ends on the eastern border of Poland in the area of Bialowieza National Park, Biosphere Reserve....
NASA Technical Reports Server (NTRS)
1991-01-01
Various papers on supercomputing are presented. The general topics addressed include: program analysis/data dependence, memory access, distributed memory code generation, numerical algorithms, supercomputer benchmarks, latency tolerance, parallel programming, applications, processor design, networks, performance tools, mapping and scheduling, characterization affecting performance, parallelism packaging, computing climate change, combinatorial algorithms, hardware and software performance issues, and system issues. (No individual items are abstracted in this volume.)
Ivan Arismendi; Sherri L. Johnson; Jason B. Dunham; Roy Haggerty
2012-01-01
Temperature is a fundamentally important driver of ecosystem processes in streams. Recent warming of terrestrial climates around the globe has motivated concern about consequent increases in stream temperature. More specifically, observed trends of increasing air temperature and declining stream flow are widely believed to result in corresponding increases in stream...
Rose, Hannah; Hoar, Bryanne; Kutz, Susan J.; Morgan, Eric R.
2014-01-01
Global change, including climate, policy, land use and other associated environmental changes, is likely to have a major impact on parasitic disease in wildlife, altering the spatio-temporal patterns of transmission, with wide-ranging implications for wildlife, domestic animals, humans and ecosystem health. Predicting the potential impact of climate change on parasites infecting wildlife will become increasingly important in the management of species of conservation concern and control of disease at the wildlife–livestock and wildlife–human interface, but is confounded by incomplete knowledge of host–parasite interactions, logistical difficulties, small sample sizes and limited opportunities to manipulate the system. By exploiting parallels between livestock and wildlife, existing theoretical frameworks and research on livestock and their gastrointestinal nematodes (GINs) can be adapted to wildlife systems. Similarities in the gastrointestinal nematodes and the life-histories of wild and domestic ruminants, coupled with a detailed knowledge of the ecology and life-cycle of the parasites, render the ruminant–GIN host–parasite system particularly amenable to a cross-disciplinary approach. PMID:25197625
Kutnjak, Denis; Kuttner, Michael; Niketić, Marjan; Dullinger, Stefan; Schönswetter, Peter; Frajman, Božo
2014-09-01
The Balkans are a major European biodiversity hotspot; however, almost nothing is known about processes of intraspecific diversification of the region's high-altitude biota and their reaction to the predicted global warming. To fill this gap, genome size measurements, AFLP fingerprints, and plastid and nuclear sequences were employed to explore the phylogeography of Cerastium dinaricum. Range size changes under future climatic conditions were predicted by niche-based modeling. Likely the most cold-adapted plant endemic to the Dinaric Mountains in the western Balkan Peninsula, the species has conservation priority in the European Union as its highly fragmented distribution range includes only a few small populations. A deep phylogeographic split paralleled by divergent genome size separates the populations into two vicariant groups. Substructure is pronounced within the southeastern group, corresponding to the area's higher geographic complexity. Cerastium dinaricum likely responded to past climatic oscillations with altitudinal range shifts, which, coupled with the high topographic complexity of the region and a warmer climate in the Holocene, sculpted its present fragmented distribution. Field observations revealed that the species is rarer than previously assumed and, as shown by modeling, severely endangered by global warming, as viable habitat was predicted to be reduced by more than 70% by the year 2080. Copyright © 2014 Elsevier Inc. All rights reserved.
Genomic basis for coral resilience to climate change.
Barshis, Daniel J; Ladner, Jason T; Oliver, Thomas A; Seneca, François O; Traylor-Knowles, Nikki; Palumbi, Stephen R
2013-01-22
Recent advances in DNA-sequencing technologies now allow for in-depth characterization of the genomic stress responses of many organisms beyond model taxa. They are especially appropriate for organisms such as reef-building corals, for which dramatic declines in abundance are expected to worsen as anthropogenic climate change intensifies. Corals differ substantially in physiological resilience to environmental stress, but the molecular mechanisms behind enhanced coral resilience remain unclear. Here, we compare transcriptome-wide gene expression (via RNA-Seq using Illumina sequencing) among conspecific thermally sensitive and thermally resilient corals to identify the molecular pathways contributing to coral resilience. Under simulated bleaching stress, both sensitive and resilient corals change expression of hundreds of genes, but the resilient corals show higher expression of 60 of these genes under control conditions. These "frontloaded" transcripts were less up-regulated in resilient corals during heat stress and included thermal tolerance genes such as heat shock proteins and antioxidant enzymes, as well as a broad array of genes involved in apoptosis regulation, tumor suppression, innate immune response, and cell adhesion. We propose that constitutive frontloading enables an individual to maintain physiological resilience during frequently encountered environmental stress, an idea that has strong parallels in model systems such as yeast. Our study provides broad insight into the fundamental cellular processes responsible for enhanced stress tolerance that may enable some organisms to better persist into the future in an era of global climate change.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahn, Tae-Hyuk; Sandu, Adrian; Watson, Layne T.
2015-08-01
Ensembles of simulations are employed to estimate the statistics of possible future states of a system, and are widely used in important applications such as climate change and biological modeling. Ensembles of runs can naturally be executed in parallel. However, when the CPU times of individual simulations vary considerably, a simple strategy of assigning an equal number of tasks per processor can lead to serious work imbalances and low parallel efficiency. This paper presents a new probabilistic framework to analyze the performance of dynamic load balancing algorithms for ensembles of simulations where many tasks are mapped onto each processor, and where the individual compute times vary considerably among tasks. Four load balancing strategies are discussed: most-dividing, all-redistribution, random-polling, and neighbor-redistribution. Simulation results with a stochastic budding yeast cell cycle model are consistent with the theoretical analysis. It is especially significant that a global decrease in load imbalance can be proven for the local rebalancing algorithms, which are preferable given the scalability concerns of the global rebalancing algorithms. The overall simulation time is reduced by up to 25%, and the total processor idle time by 85%.
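The gap between static assignment and dynamic balancing can be illustrated with a toy experiment; the greedy heuristic below is a generic longest-processing-time scheme, not necessarily any of the four strategies analyzed in the paper, and the task times are synthetic:

```python
# Toy comparison: equal-count static assignment vs. greedy assignment of the
# next-largest task to the least-loaded processor.
import heapq
import random

random.seed(1)
tasks = [random.expovariate(1.0) for _ in range(512)]   # highly variable CPU times
P = 16

# Static: every processor gets the same number of tasks.
static_loads = [sum(tasks[p::P]) for p in range(P)]

# Greedy (LPT): hand each task, largest first, to the least-loaded processor.
heap = [(0.0, p) for p in range(P)]
for t in sorted(tasks, reverse=True):
    load, p = heapq.heappop(heap)
    heapq.heappush(heap, (load + t, p))
greedy_loads = [load for load, _ in heap]

print("static makespan:", max(static_loads))   # typically noticeably larger
print("greedy makespan:", max(greedy_loads))
```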
NASA Astrophysics Data System (ADS)
Mills, R. T.
2014-12-01
As the high performance computing (HPC) community pushes towards the exascale horizon, the importance and prevalence of fine-grained parallelism in new computer architectures is increasing. This is perhaps most apparent in the proliferation of so-called "accelerators" such as the Intel Xeon Phi or NVIDIA GPGPUs, but the trend also holds for CPUs, where serial performance has grown slowly and effective use of hardware threads and vector units is becoming increasingly important to realizing high performance. This has significant implications for weather, climate, and Earth system modeling codes, many of which display impressive scalability across MPI ranks but take relatively little advantage of threading and vector processing. In addition to increasing parallelism, next generation codes will also need to address increasingly deep hierarchies for data movement: NUMA/cache levels, on node vs. off node, local vs. wide neighborhoods on the interconnect, and even in the I/O system. We will discuss some approaches (grounded in experiences with the Intel Xeon Phi architecture) for restructuring Earth science codes to maximize concurrency across multiple levels (vectors, threads, MPI ranks), and also discuss some novel approaches for minimizing expensive data movement/communication.
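A small sketch of the vectorization point, using an illustrative column-physics kernel (the kernel, names, and sizes are hypothetical, and the NumPy array expression stands in for compiler auto-vectorization):

```python
# The same kernel as an explicit per-column loop and as one vectorized
# expression over all columns at once.
import time
import numpy as np

ncol, nlev = 100_000, 72
t = np.random.rand(ncol, nlev)

def loop_kernel(t):
    out = np.empty_like(t)
    for i in range(t.shape[0]):            # serial over columns
        out[i] = np.exp(-t[i]) * (1.0 + t[i])
    return out

t0 = time.perf_counter()
a = loop_kernel(t)
t1 = time.perf_counter()
b = np.exp(-t) * (1.0 + t)                 # vectorized over the whole array
t2 = time.perf_counter()

assert np.allclose(a, b)
print(f"loop {t1 - t0:.2f}s vs vectorized {t2 - t1:.2f}s")
```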
Soybean Physiology Calibration in the Community Land Model
NASA Astrophysics Data System (ADS)
Drewniak, B. A.; Bilionis, I.; Constantinescu, E. M.
2014-12-01
With the large influence of agricultural land use on biophysical and biogeochemical cycles, integrating cultivation into Earth System Models (ESMs) is increasingly important. The Community Land Model (CLM) was augmented with a CLM-Crop extension that simulates the development of three crop types: maize, soybean, and spring wheat. The CLM-Crop model is a complex system that relies on a suite of parametric inputs that govern plant growth under a given atmospheric forcing and available resources. However, the strong nonlinearity of ESMs makes parameter fitting a difficult task. In this study, our goal is to calibrate ten of the CLM-Crop parameters for one crop type, soybean, in order to improve model projection of plant development and carbon fluxes. We used measurements of gross primary productivity, net ecosystem exchange, and plant biomass from AmeriFlux sites to choose parameter values that optimize crop productivity in the model. Calibration is performed in a Bayesian framework by developing a scalable and adaptive scheme based on sequential Monte Carlo (SMC). Our scheme can perform model calibration using very few evaluations and, by exploiting parallelism, at a fraction of the time required by plain vanilla Markov Chain Monte Carlo (MCMC). We present the results from a twin experiment (self-validation) and calibration results and validation using real observations from an AmeriFlux tower site in the Midwestern United States, for the soybean crop type. The improved model will help researchers understand how climate affects crop production and resulting carbon fluxes, and additionally, how cultivation impacts climate.
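A stripped-down sketch of the SMC idea under stated assumptions (a one-parameter toy model stands in for CLM-Crop; this is not the authors' scheme), showing why the per-particle model evaluations parallelize naturally:

```python
# Generic particle scheme: weight prior draws against pseudo-observations,
# resample, jitter, repeat. Each model(p) call is independent, so an
# ensemble of expensive runs can be farmed out across processors.
import numpy as np

rng = np.random.default_rng(0)

def model(theta):
    """Stand-in for an expensive CLM-Crop simulation."""
    return 2.0 * theta

obs, sigma = 3.0, 0.5
particles = rng.uniform(0.0, 5.0, size=200)      # draws from a flat prior

for step in range(5):
    pred = model(particles)                      # trivially parallel step
    w = np.exp(-0.5 * ((pred - obs) / sigma) ** 2)
    w /= w.sum()
    idx = rng.choice(particles.size, size=particles.size, p=w)
    particles = particles[idx] + rng.normal(0.0, 0.05, particles.size)  # jitter

print("posterior mean:", particles.mean())      # ~1.5 for this toy setup
```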
Multitasking TORT under UNICOS: Parallel performance models and measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnett, A.; Azmy, Y.Y.
1999-09-27
The existing parallel algorithms in the TORT discrete ordinates code were updated to function in a UNICOS environment. A performance model for the parallel overhead was derived for the existing algorithms. The largest contributors to the parallel overhead were identified and a new algorithm was developed. A parallel overhead model was also derived for the new algorithm. The parallel performance models were compared against measurements from applications of the code to two TORT standard test problems and a large production problem. The parallel performance models agree well with the measured parallel overhead.
Predictability Analysis of PM10 Concentrations in Budapest
NASA Astrophysics Data System (ADS)
Ferenczi, Zita
2013-04-01
Climate, weather and air quality may have harmful effects on human health and the environment. Over the past few hundred years, changes in climate have occurred in parallel with changes in air quality. These observed changes in climate, weather and air quality continuously interact with each other: pollutants alter the climate, and thereby the weather, but climate also has impacts on air quality. The increasing number of extreme weather situations may be a result of climate change, which can create favourable conditions for rising pollutant concentrations. Air quality in Budapest is determined by domestic and traffic emissions combined with the meteorological conditions. In some cases, the effect of long-range transport can also be essential. While the time variability of the industrial and traffic emissions is not significant, domestic emissions increase in the winter season. In recent years, PM10 episodes have caused the most critical air quality problems in Budapest, especially in winter. In Budapest, an air quality network of 11 stations measures the concentrations of different pollutants hourly. The Hungarian Meteorological Service has developed an air quality prediction model system for the area of Budapest. The system forecasts the concentration of air pollutants (PM10, NO2, SO2 and O3) two days in advance. In this work we used meteorological parameters and PM10 data recorded by the stations of the air quality network, as well as the forecasted PM10 values of the air quality prediction model system. We present an evaluation of the PM10 predictions over the last two years and the most important meteorological parameters affecting PM10 concentration. The results of this analysis determine the effect of the meteorological parameters and the emission of aerosol particles on the PM10 concentration values, as well as the limits of this prediction system.
Persistence in a Two-Dimensional Moving-Habitat Model.
Phillips, Austin; Kot, Mark
2015-11-01
Environmental changes are forcing many species to track suitable conditions or face extinction. In this study, we use a two-dimensional integrodifference equation to analyze whether a population can track a habitat that is moving due to climate change. We model habitat as a simple rectangle. Our model quickly leads to an eigenvalue problem that determines whether the population persists or declines. After surveying techniques to solve the eigenvalue problem, we highlight three findings that impact conservation efforts such as reserve design and species risk assessment. First, while other models focus on habitat length (parallel to the direction of habitat movement), we show that ignoring habitat width (perpendicular to habitat movement) can lead to overestimates of persistence. Dispersal barriers and hostile landscapes that constrain habitat width greatly decrease the population's ability to track its habitat. Second, for some long-distance dispersal kernels, increasing habitat length improves persistence without limit; for other kernels, increasing length is of limited help and has diminishing returns. Third, it is not always best to orient the long side of the habitat in the direction of climate change. Evidence suggests that the kurtosis of the dispersal kernel determines whether it is best to have a long, wide, or square habitat. In particular, populations with platykurtic dispersal benefit more from a wide habitat, while those with leptokurtic dispersal benefit more from a long habitat. We apply our model to the Rocky Mountain Apollo butterfly (Parnassius smintheus).
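In standard moving-habitat integrodifference notation (the symbols here are generic, not necessarily the paper's), the model class and the eigenvalue problem it leads to look like the following:

```latex
% Moving rectangular habitat of length L and width W, shifting at speed c:
\[
  n_{t+1}(x) \;=\; \int_{\Omega_t} k(x - y)\, f\bigl(n_t(y)\bigr)\, dy,
  \qquad
  \Omega_t \;=\; [ct,\; ct + L] \times [0,\; W].
\]
% Linearizing about n = 0 in coordinates moving with the habitat (growth
% factor R = f'(0)) gives the eigenproblem
\[
  \lambda\, \phi(x) \;=\; R \int_{\Omega_0} k\bigl(x + c\,e_1 - y\bigr)\, \phi(y)\, dy,
\]
% and the population persists exactly when the dominant eigenvalue
% \(\lambda\) exceeds 1.
```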
SciSpark's SRDD : A Scientific Resilient Distributed Dataset for Multidimensional Data
NASA Astrophysics Data System (ADS)
Palamuttam, R. S.; Wilson, B. D.; Mogrovejo, R. M.; Whitehall, K. D.; Mattmann, C. A.; McGibbney, L. J.; Ramirez, P.
2015-12-01
Remote sensing data and climate model output are multi-dimensional arrays of massive sizes locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF), making it difficult to perform multi-stage, iterative science processing since each stage requires writing and reading data to and from disk. We have developed SciSpark, a robust Big Data framework that extends Apache Spark for scaling scientific computations. Apache Spark improves the map-reduce implementation in Apache Hadoop for parallel computing on a cluster by emphasizing in-memory computation, "spilling" to disk only as needed, and relying on lazy evaluation. Central to Spark is the Resilient Distributed Dataset (RDD), an in-memory distributed data structure that extends the functional paradigm provided by the Scala programming language. However, RDDs are ideal for tabular or unstructured data, not for highly dimensional data. The SciSpark project introduces the Scientific Resilient Distributed Dataset (sRDD), a distributed-computing array structure which supports iterative scientific algorithms for multidimensional data. SciSpark processes data stored in NetCDF and HDF files by partitioning them across time or space and distributing the partitions among a cluster of compute nodes. We show the usability and extensibility of SciSpark by implementing distributed algorithms for geospatial operations on large collections of multi-dimensional grids. In particular we address the problem of scaling an automated method for finding Mesoscale Convective Complexes. SciSpark provides a tensor interface to support the pluggability of different matrix libraries, and we evaluate the performance of various matrix libraries, such as Nd4j and Breeze, in distributed pipelines. We detail the architecture and design of SciSpark, our efforts to integrate climate science algorithms, and the parallel ingest and partitioning (sharding) of A-Train satellite observations and model grids. These solutions are encompassed in SciSpark, an open-source software framework for distributed computing on scientific data.
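The time-sharding idea can be sketched in a few lines. SciSpark itself is written in Scala, so the PySpark fragment below is only an illustration, with a hypothetical file name and variable name, and it assumes a shared filesystem visible to all workers:

```python
# Partition a NetCDF variable across its time axis and reduce in parallel.
from pyspark import SparkContext
from netCDF4 import Dataset
import numpy as np

sc = SparkContext(appName="time-partitioned-grids")

PATH, VAR = "precip.nc", "pr"              # hypothetical file and variable
with Dataset(PATH) as nc:
    n_times = nc.variables[VAR].shape[0]

def load_slab(t):
    """Each task re-opens the file and reads only its own time slice."""
    with Dataset(PATH) as nc:
        return t, np.array(nc.variables[VAR][t])

slabs = sc.parallelize(range(n_times), numSlices=8).map(load_slab)
means = slabs.mapValues(np.nanmean).collectAsMap()   # per-step spatial mean
```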
Effects of Global Change on U.S. Urban Areas: Vulnerabilities, Impacts, and Adaptation
NASA Technical Reports Server (NTRS)
Quattrochi, Dale A.; Wilbanks, Thomas J.; Kirshen, Paul; Romero-Lankao, Patricia; Rosenzweig, Cynthia; Ruth, Mattias; Solecki, William; Tarr, Joel
2008-01-01
This slide presentation reviews some of the effects that global change has on urban areas in the United States and how the growth of urban areas will affect the environment. It presents the elements of our Synthesis and Assessment Report (SAP) report that relate to what vulnerabilities and impacts will occur, what adaptation responses may take place, and what possible effects on settlement patterns and characteristics will potentially arise, on human settlements in the U.S. as a result of climate change and climate variability. We will also present some recommendations about what should be done to further research on how climate change and variability will impact human settlements in the U.S., as well as how to engage government officials, policy and decision makers, and the general public in understanding the implications of climate change and variability on the local and regional levels. Additionally, we wish to explore how technology such as remote sensing data coupled with modeling, can be employed as synthesis tools for deriving insight across a spectrum of impacts (e.g. public health, urban planning for mitigation strategies) on how cities can cope and adapt to climate change and variability. This latter point parallels the concepts and ideas presented in the U.S. National Academy of Sciences, Decadal Survey report on "Earth Science Applications from Space: National Imperatives for the Next Decade and Beyond" wherein the analysis of the impacts of climate change and variability, human health, and land use change are listed as key areas for development of future Earth observing remote sensing systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Micah Johnson, Andrew Slaughter
PIKA is a MOOSE-based application for modeling micro-structure evolution of seasonal snow. The model will be useful for environmental, atmospheric, and climate scientists. Possible applications include energy balance models, ice sheet modeling, and avalanche forecasting. The model implements physics from published, peer-reviewed articles. The main purpose is to foster university and laboratory collaboration to build a larger multi-scale snow model using MOOSE. The main feature of the code is that it is implemented using the MOOSE framework, thus making features such as multiphysics coupling, adaptive mesh refinement, and parallel scalability native to the application. PIKA implements three equations: the phase-field equation for tracking the evolution of the ice-air interface within seasonal snow at the grain scale; the heat equation for computing the temperature of both the ice and air within the snow; and the mass transport equation for monitoring the diffusion of water vapor in the pore space of the snow.
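Generic textbook forms of the three named equations are shown below for orientation; PIKA's exact formulation, coefficients, and coupling terms may well differ:

```latex
% Phase field for the ice-air interface (generic Allen-Cahn-type form;
% the coupling term with the vapour field u is hypothetical)
\[
  \tau \,\frac{\partial \phi}{\partial t}
  \;=\; \xi^{2}\, \nabla^{2} \phi \;+\; \phi \;-\; \phi^{3}
        \;+\; \lambda\, u \,\bigl(1 - \phi^{2}\bigr)^{2}
\]
% Heat conduction in the ice and pore air
\[
  \rho c_{p}\, \frac{\partial T}{\partial t}
  \;=\; \nabla \cdot \bigl(k\, \nabla T\bigr)
\]
% Water-vapour diffusion in the pore space, with source/sink S from
% sublimation and deposition at the interface
\[
  \frac{\partial \rho_{v}}{\partial t}
  \;=\; \nabla \cdot \bigl(D\, \nabla \rho_{v}\bigr) \;+\; S
\]
```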
Responses of leaf traits to climatic gradients: adaptive variation versus compositional shifts
NASA Astrophysics Data System (ADS)
Meng, T.-T.; Wang, H.; Harrison, S. P.; Prentice, I. C.; Ni, J.; Wang, G.
2015-09-01
Dynamic global vegetation models (DGVMs) typically rely on plant functional types (PFTs), which are assigned distinct environmental tolerances and replace one another progressively along environmental gradients. Fixed values of traits are assigned to each PFT; modelled trait variation along gradients is thus driven by PFT replacement. But empirical studies have revealed "universal" scaling relationships (quantitative trait variations with climate that are similar within and between species, PFTs and communities); and continuous, adaptive trait variation has been proposed to replace PFTs as the basis for next-generation DGVMs. Here we analyse quantitative leaf-trait variation on long temperature and moisture gradients in China with a view to understanding the relative importance of PFT replacement vs. continuous adaptive variation within PFTs. Leaf area (LA), specific leaf area (SLA), leaf dry matter content (LDMC) and nitrogen content of dry matter were measured on all species at 80 sites ranging from temperate to tropical climates and from dense forests to deserts. Chlorophyll fluorescence traits and carbon, phosphorus and potassium contents were measured at 47 sites. Generalized linear models were used to relate log-transformed trait values to growing-season temperature and moisture indices, with or without PFT identity as a predictor, and to test for differences in trait responses among PFTs. Continuous trait variation was found to be ubiquitous. Responses to moisture availability were generally similar within and between PFTs, but biophysical traits (LA, SLA and LDMC) of forbs and grasses responded differently from woody plants. SLA and LDMC responses to temperature were dominated by the prevalence of evergreen PFTs with thick, dense leaves at the warm end of the gradient. Nutrient (N, P and K) responses to climate gradients were generally similar within all PFTs. Area-based nutrients generally declined with moisture; Narea and Karea declined with temperature, but Parea increased with temperature. Although the adaptive nature of many of these trait-climate relationships is understood qualitatively, a key challenge for modelling is to predict them quantitatively. Models must take into account that community-level responses to climatic gradients can be influenced by shifts in PFT composition, such as the replacement of deciduous by evergreen trees, which may run either parallel or counter to trait variation within PFTs. The importance of PFT shifts varies among traits, being important for biophysical traits but less so for physiological and chemical traits. Finally, models should take account of the diversity of trait values that is found in all sites and PFTs, representing the "pool" of variation that is locally available for the natural adaptation of ecosystem function to environmental change.
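The modelling approach described, log-transformed traits regressed on climate indices with PFT identity as a categorical predictor, can be sketched as follows (the data and the variable names are synthetic, not the study's):

```python
# Generalized linear model of log SLA against growing-season temperature (tg),
# a moisture index (mi), and PFT identity, in the style described above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "sla": rng.lognormal(2.5, 0.3, 200),           # synthetic trait values
    "tg": rng.uniform(5, 25, 200),                 # growing-season temperature
    "mi": rng.uniform(0.1, 1.5, 200),              # moisture index
    "pft": rng.choice(["tree", "shrub", "grass"], 200),
})

fit = smf.ols("np.log(sla) ~ tg + mi + C(pft)", data=df).fit()
print(fit.params)   # within-PFT climate slopes plus PFT intercept shifts
```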
NASA Astrophysics Data System (ADS)
Anantharaj, V.; Mayer, B.; Wang, F.; Hack, J.; McKenna, D.; Hartman-Baker, R.
2012-04-01
The Oak Ridge Leadership Computing Facility (OLCF) facilitates the execution of computational experiments that require tens of millions of CPU hours (typically using thousands of processors simultaneously) while generating hundreds of terabytes of data. A set of ultra high resolution climate experiments in progress, using the Community Earth System Model (CESM), will produce over 35,000 files, ranging in sizes from 21 MB to 110 GB each. The execution of the experiments will require nearly 70 Million CPU hours on the Jaguar and Titan supercomputers at OLCF. The total volume of the output from these climate modeling experiments will be in excess of 300 TB. This model output must then be archived, analyzed, distributed to the project partners in a timely manner, and also made available more broadly. Meeting this challenge would require efficient movement of the data, staging the simulation output to a large and fast file system that provides high volume access to other computational systems used to analyze the data and synthesize results. This file system also needs to be accessible via high speed networks to an archival system that can provide long term reliable storage. Ideally this archival system is itself directly available to other systems that can be used to host services making the data and analysis available to the participants in the distributed research project and to the broader climate community. The various resources available at the OLCF now support this workflow. The available systems include the new Jaguar Cray XK6 2.63 petaflops (estimated) supercomputer, the 10 PB Spider center-wide parallel file system, the Lens/EVEREST analysis and visualization system, the HPSS archival storage system, the Earth System Grid (ESG), and the ORNL Climate Data Server (CDS). The ESG features federated services, search & discovery, extensive data handling capabilities, deep storage access, and Live Access Server (LAS) integration. The scientific workflow enabled on these systems, and developed as part of the Ultra-High Resolution Climate Modeling Project, allows users of OLCF resources to efficiently share simulated data, often multi-terabyte in volume, as well as the results from the modeling experiments and various synthesized products derived from these simulations. The final objective in the exercise is to ensure that the simulation results and the enhanced understanding will serve the needs of a diverse group of stakeholders across the world, including our research partners in U.S. Department of Energy laboratories & universities, domain scientists, students (K-12 as well as higher education), resource managers, decision makers, and the general public.
The Climate Data Analytic Services (CDAS) Framework.
NASA Astrophysics Data System (ADS)
Maxwell, T. P.; Duffy, D.
2016-12-01
Faced with unprecedented growth in climate data volume and demand, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute data processing workflows combining common analysis operations in a high performance environment close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted climate data analysis tools (ESMF, CDAT, NCO, etc.). A dynamic caching architecture enables interactive response times. CDAS utilizes Apache Spark for parallelization and a custom array framework for processing huge datasets within limited memory spaces. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed via direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. Client packages in Python, Scala, or JavaScript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends and variability, and compare multiple reanalysis datasets.
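A hedged sketch of what a direct web service call might look like in Python; the endpoint URL, the operation identifier, and the parameter payload below are hypothetical placeholders following the general WPS key-value convention, not the documented CDAS API:

```python
# Hypothetical WPS Execute request against a CDAS-style endpoint.
import requests

WPS = "https://example.nasa.gov/cdas/wps"     # hypothetical endpoint
params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "CDSpark.average",          # hypothetical operation name
    "datainputs": "[variable=tas;axes=t]",    # hypothetical input payload
}

resp = requests.get(WPS, params=params, timeout=60)
resp.raise_for_status()
print(resp.text)   # WPS responses are XML status/result documents
```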
I/O Parallelization for the Goddard Earth Observing System Data Assimilation System (GEOS DAS)
NASA Technical Reports Server (NTRS)
Lucchesi, Rob; Sawyer, W.; Takacs, L. L.; Lyster, P.; Zero, J.
1998-01-01
The National Aeronautics and Space Administration (NASA) Data Assimilation Office (DAO) at the Goddard Space Flight Center (GSFC) has developed the GEOS DAS, a data assimilation system that provides production support for NASA missions and will support NASA's Earth Observing System (EOS) in the coming years. The GEOS DAS will be used to provide background fields of meteorological quantities to EOS satellite instrument teams for use in their data algorithms as well as providing assimilated data sets for climate studies on decadal time scales. The DAO has been involved in prototyping parallel implementations of the GEOS DAS for a number of years and is now embarking on an effort to convert the production version from shared-memory parallelism to distributed-memory parallelism using the portable Message-Passing Interface (MPI). The GEOS DAS consists of two main components, an atmospheric General Circulation Model (GCM) and a Physical-space Statistical Analysis System (PSAS). The GCM operates on data that are stored on a regular grid while PSAS works with observational data that are scattered irregularly throughout the atmosphere. As a result, the two components have different data decompositions. The GCM is decomposed horizontally as a checkerboard with all vertical levels of each box existing on the same processing element (PE). The dynamical core of the GCM can also operate on a rotated grid, which requires communication-intensive grid transformations during GCM integration. PSAS groups observations on PEs in a more irregular and dynamic fashion.
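The checkerboard decomposition can be made concrete with a toy rank-to-block mapping; the grid sizes and the process-grid shape below are illustrative, not the GEOS DAS configuration:

```python
# Map an MPI-style rank onto its horizontal (lon, lat) sub-box; all vertical
# levels of that box stay on the same processing element.
def block_of(rank, px, py, nlon, nlat):
    """Return ((lo_x, hi_x), (lo_y, hi_y)) for a px-by-py checkerboard."""
    i, j = rank % px, rank // px            # checkerboard coordinates
    lo_x, hi_x = i * nlon // px, (i + 1) * nlon // px
    lo_y, hi_y = j * nlat // py, (j + 1) * nlat // py
    return (lo_x, hi_x), (lo_y, hi_y)

# e.g. rank 5 on a 4x2 process grid over a 144x91 lat-lon grid
print(block_of(5, 4, 2, 144, 91))           # ((36, 72), (45, 91))
```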
Change We Can Fight Over: The Relationship between Arable Land Supply and Substate Conflict
2010-01-01
environmental impact of global warming has spurred a parallel discussion among national security academics and policymakers about the security...consequences of climate change. Roughly speaking, there are two camps in this discussion - one that ominously predicts the potential for global warming to spark...future climate change, but the stark reality is that global warming is already upon us. Thus, policymakers need to know - both now and in the coming
Strategies for Large Scale Implementation of a Multiscale, Multiprocess Integrated Hydrologic Model
NASA Astrophysics Data System (ADS)
Kumar, M.; Duffy, C.
2006-05-01
Distributed models simulate hydrologic state variables in space and time while taking into account the heterogeneities in terrain, surface and subsurface properties, and meteorological forcings. The computational cost and complexity associated with these models increase with their tendency to accurately simulate the large number of interacting physical processes at fine spatio-temporal resolution in a large basin. A hydrologic model run on a coarse spatial discretization of the watershed with a limited number of physical processes needs less computation, but this negatively affects the accuracy of model results and restricts physical realization of the problem. So it is imperative to have an integrated modeling strategy (a) which can be universally applied at various scales in order to study the tradeoffs between computational complexity (determined by spatio-temporal resolution), accuracy and predictive uncertainty in relation to various approximations of physical processes, (b) which can be applied at adaptively different spatial scales in the same domain by taking into account the local heterogeneity of topography and hydrogeologic variables, and (c) which is flexible enough to incorporate different numbers and approximations of process equations depending on model purpose and computational constraints. An efficient implementation of this strategy becomes all the more important for the Great Salt Lake river basin, which is relatively large (~89,000 sq. km) and complex in terms of hydrologic and geomorphic conditions. The types and time scales of hydrologic processes which are dominant in different parts of the basin also differ: part of the snowmelt runoff generated in the Uinta Mountains infiltrates and contributes as base flow to the Great Salt Lake over a time scale of decades to centuries. The adaptive strategy helps capture the steep topographic and climatic gradient along the Wasatch Front. Here we present the aforesaid modeling strategy along with an associated hydrologic modeling framework which facilitates a seamless, computationally efficient and accurate integration of the process model with the data model. The flexibility of this framework leads to implementation of multiscale, multiresolution, adaptive refinement/de-refinement and nested modeling simulations with the least computational burden. However, performing these simulations and the related calibration of these models over a large basin at higher spatio-temporal resolutions is computationally intensive and requires increasing computing power. With the advent of parallel processing architectures, high computing performance can be achieved by parallelizing the existing serial integrated-hydrologic-model code. This translates to running the same model simulation on a network of a large number of processors, thereby reducing the time needed to obtain a solution. The paper also discusses the implementation of the integrated model on parallel processors, the mapping of the problem onto a multi-processor environment, methods to incorporate coupling between hydrologic processes using interprocessor communication models, the model data structure, and parallel numerical algorithms to obtain high performance.
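One common pattern for the interprocessor communication mentioned above is a ghost-cell (halo) exchange between neighboring subdomains; the following is a generic mpi4py sketch under that assumption, not the authors' code:

```python
# 1-D halo exchange: each rank owns n_local cells plus two ghost cells and
# swaps boundary values with its neighbors every time step.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_local = 100
h = np.full(n_local + 2, float(rank))    # state + 2 ghost cells, toy values

left  = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

# send my first interior cell left, receive my right ghost from the right
comm.Sendrecv(h[1:2],   dest=left,  recvbuf=h[-1:], source=right)
# send my last interior cell right, receive my left ghost from the left
comm.Sendrecv(h[-2:-1], dest=right, recvbuf=h[0:1], source=left)
```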
Longitudinal trends in climate drive flowering time clines in North American Arabidopsis thaliana.
Samis, Karen E; Murren, Courtney J; Bossdorf, Oliver; Donohue, Kathleen; Fenster, Charles B; Malmberg, Russell L; Purugganan, Michael D; Stinchcombe, John R
2012-06-01
Introduced species frequently show geographic differentiation, and when differentiation mirrors the ancestral range, it is often taken as evidence of adaptive evolution. The mouse-ear cress (Arabidopsis thaliana) was introduced to North America from Eurasia 150-200 years ago, providing an opportunity to study parallel adaptation in a genetic model organism. Here, we test for clinal variation in flowering time using 199 North American (NA) accessions of A. thaliana, and evaluate the contributions of major flowering time genes FRI, FLC, and PHYC as well as potential ecological mechanisms underlying differentiation. We find evidence for substantial within population genetic variation in quantitative traits and flowering time, and putatively adaptive longitudinal differentiation, despite low levels of variation at FRI, FLC, and PHYC and genome-wide reductions in population structure relative to Eurasian (EA) samples. The observed longitudinal cline in flowering time in North America is parallel to an EA cline, robust to the effects of population structure, and associated with geographic variation in winter precipitation and temperature. We detected major effects of FRI on quantitative traits associated with reproductive fitness, although the haplotype associated with higher fitness remains rare in North America. Collectively, our results suggest the evolution of parallel flowering time clines through novel genetic mechanisms.
NASA Astrophysics Data System (ADS)
Sueishi, T.; Yucel, M.; Ashie, Y.; Varquez, A. C. G.; Inagaki, A.; Darmanto, N. S.; Nakayoshi, M.; Kanda, M.
2017-12-01
Recently, temperatures in urban areas have continued to rise as an effect of climate change and urbanization. Specifically, Asian megacities are projected to expand rapidly, resulting in serious deterioration of the future atmospheric environment. Thus, detailed analysis of urban meteorology for Asian megacities is needed to prescribe optimum countermeasures against these negative climate modifications. A building-resolving large eddy simulation (LES) coupled with an energy balance model is conducted for a highly urbanized district in central Jakarta during typical daytime hours. Five cases were considered: case 1 utilizes the present urban scenario, and the four other cases represent different urban configurations in 2050. The future configurations were based on representative concentration pathways (RCP) and shared socio-economic pathways (SSP). Building height maps and land use maps of the simulation domains are shown in the attached figure (top). Cases 1-3 focus on the differences between future scenarios: case 1 represents current climatic and urban conditions, while cases 2 and 3 are idealized futures represented by RCP2.6/SSP1 and RCP8.5/SSP3, respectively. More complex urban morphology was applied in case 4, and vegetation and building area were changed in case 5. Meteorological inputs and anthropogenic heat emission (AHE) were calculated using the Weather Research and Forecasting (WRF) model (Varquez et al [2017]). Sensible and latent heat fluxes from surfaces were calculated using an energy balance model (Ashie et al [2011]), which considers multi-reflection, evapotranspiration and evaporation. The results of the energy balance model (shown in the middle row of the figure), in addition to WRF outputs, were used as input into the PArallelized LES Model (PALM) (Raasch et al [2001]). Thermal comfort in the urban area was evaluated from the standard new effective temperature (SET*), which includes the effects of temperature, wind speed, humidity and radiation. SET* contours at 1 m height are shown in the bottom row of the figure. Extreme climate change increases average SET* as expected; however, construction of dense high-rise buildings (case 2) can minimize this effect due to increased shading throughout the district. Acknowledgement: This research was supported by the Environment Research and Technology Development Fund (S-14) of the Ministry of the Environment, Japan.
Impact of a changing environment on the built heritage
NASA Astrophysics Data System (ADS)
Grossi, C. M.; Brimblecombe, P.; Bonazza, A.
2012-04-01
Stone monuments are degraded by both climate and pollution. Deterioration by pollution was especially intense from the 1700s, and until the late 20th century the dominant impact of air pollution was the sulfation of surfaces. The parallel deposition of soot caused blackening and, on some surfaces, dark coloured crusts. The decrease of sulfur and soot from coal combustion during the last decades of the 20th century led to cleaner air in cities, a decrease of pollution-decay rates on building stones and a public desire for cleaner buildings. Although there were decreases in SO2, it was replaced by ozone, nitrogen oxides and particles richer in organic compounds, the result of an extensive use of automobiles. Deposited organic compounds can oxidise in modern urban environments in a yellowing process. The future may reveal variation in building colour from biological growth in a changing climate. In urban atmospheres with less sulfur, biological growth is more effective. A greater rate of delivery of nitrate to building surfaces that acts as "airborne fertiliser" favours colonisation. Depending on climate, there might be different processes (e.g. greening or reddening) and patterns of colouration. Climate is also a relevant factor in the weathering of monuments. Recent research suggests the concept of Heritage Climatology in the study of climate interactions with monuments, materials and sites. These parameters concentrate on aspects and combinations of meteorological variables that relate to material damage. The Köppen-Geiger climate classification can be a good approximation for some heritage risks. For instance, the number of salt transitions shows distinct seasonality, which can be related to Köppen-Geiger climate types and their change during the 21st century. The study of changing pollution and climate impacts on the built heritage needs the output of pollution emissions and climate change models, which are prone to uncertainties. The use of multiple climate models or ENSEMBLES may improve the accuracy and reliability of predictions. This approach has been used to predict salt damage. However, more work is needed on the uncertainty in predictions and the way this affects the management of stone heritage. Climate and pollution data are publicly available, but frequently these need to be unified and made user-friendly for cultural heritage researchers in many countries; e.g. the UKCP09 user interface is a good example of user-friendly access to probabilistic projections and downscaled climate change data, but the available data are limited to the UK. The utilisation of these improved techniques can contribute to better strategies for managing buildings.
Hermida-Carrera, Carmen; Fares, Mario A; Fernández, Ángel; Gil-Pelegrín, Eustaquio; Kapralov, Maxim V; Mir, Arnau; Molins, Arántzazu; Peguero-Pina, José Javier; Rocha, Jairo; Sancho-Knapik, Domingo; Galmés, Jeroni
2017-01-01
Phylogenetic analysis by maximum likelihood (PAML) has become the standard approach to study positive selection at the molecular level, but other methods may provide complementary ways to identify amino acid replacements associated with particular conditions. Here, we compare results of the decision tree (DT) model method with those of PAML, using the key photosynthetic enzyme RuBisCO as a model system to study molecular adaptation to particular ecological conditions in oaks (Quercus). We sequenced the chloroplast rbcL gene encoding the RuBisCO large subunit in 158 Quercus species, covering about a third of the global genus diversity. It has been hypothesized that RuBisCO has evolved differentially depending on the environmental conditions and leaf traits governing internal gas diffusion patterns. Here, we show, using PAML, that amino acid replacements at the residue positions 95, 145, 251, 262 and 328 of the RuBisCO large subunit have been the subject of positive selection along particular Quercus lineages associated with the leaf traits and climate characteristics. In parallel, the DT model identified amino acid replacements at sites 95, 219, 262 and 328 as being associated with the leaf traits and climate characteristics, exhibiting partial overlap with the results obtained using PAML.
Associations between safety climate and safety management practices in the construction industry.
Marín, Luz S; Lipscomb, Hester; Cifuentes, Manuel; Punnett, Laura
2017-06-01
Safety climate, a group-level measure of workers' perceptions regarding management's safety priorities, has been suggested as a key predictor of safety outcomes. However, its relationship with actual injury rates is inconsistent. We posit that safety climate may instead be a parallel outcome of workplace safety practices, rather than a determinant of workers' safety behaviors or outcomes. Using a sample of 25 commercial construction companies in Colombia, selected by injury rate stratum (high, medium, low), we examined the relationship between workers' safety climate perceptions and safety management practices (SMPs) reported by safety officers. Workers' perceptions of safety climate were independent of their own company's implementation of SMPs, as measured here, and its injury rates. However, injury rates were negatively related to the implementation of SMPs. Safety management practices may be more important than workers' perceptions of safety climate as direct predictors of injury rates. © 2017 Wiley Periodicals, Inc.
PALM-USM v1.0: A new urban surface model integrated into the PALM large-eddy simulation model
NASA Astrophysics Data System (ADS)
Resler, Jaroslav; Krč, Pavel; Belda, Michal; Juruš, Pavel; Benešová, Nina; Lopata, Jan; Vlček, Ondřej; Damašková, Daša; Eben, Kryštof; Derbek, Přemysl; Maronga, Björn; Kanani-Sühring, Farah
2017-10-01
Urban areas are an important part of the climate system, and many aspects of urban climate have direct effects on human health and living conditions. This implies that reliable tools for local urban climate studies supporting sustainable urban planning are needed. However, a realistic implementation of urban canopy processes still poses a serious challenge for the current generation of weather and climate models. To address this demand, a new urban surface model (USM), describing the surface energy processes of urban environments, was developed and integrated as a module into the PALM large-eddy simulation model. The development of this first version of the USM originated from modelling the urban heat island during summer heat-wave episodes, and it therefore implements primarily the processes important under such conditions. The USM contains a multi-reflection radiation model for shortwave and longwave radiation with an integrated model of the absorption of radiation by the resolved plant canopy (i.e. trees, shrubs). Furthermore, it contains an energy balance solver for horizontal and vertical impervious surfaces and for thermal diffusion in ground, wall, and roof materials, and it includes a simple model for anthropogenic heat sources. The USM was parallelized using the standard Message Passing Interface, and performance testing demonstrates that its computational costs are reasonable on typical clusters for the tested configurations. The module was fully integrated into PALM and is available via its online repository under the GNU General Public License (GPL). The USM was tested on a summer heat-wave episode for a selected Prague crossroads. The general representation of the urban boundary layer and the patterns of surface temperatures of various surface types (walls, pavement) are in good agreement with in situ observations made in Prague. Additional simulations were performed to assess the sensitivity of the results to uncertainties in the material parameters, the domain size, and the general effect of the USM itself. The first version of the USM is limited to the processes most relevant to the study of summer heat waves and serves as a basis for ongoing development, which will address additional processes of the urban environment and extend the utilization of the USM to other environments and conditions.
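The core of such an energy balance solver is compact. The following minimal sketch (Python, with all constants and conductances as illustrative assumptions; it is not PALM-USM code) finds the surface temperature at which absorbed shortwave and longwave radiation balance emitted longwave, sensible, and ground heat fluxes, using Newton iteration:

    import math

    # Illustrative constants (assumed values, not PALM-USM parameters)
    SIGMA = 5.67e-8    # Stefan-Boltzmann constant [W m-2 K-4]
    EMISS = 0.95       # surface emissivity
    R_H = 20.0         # bulk sensible-heat conductance [W m-2 K-1]
    LAMBDA_G = 5.0     # ground-heat conductance [W m-2 K-1]

    def surface_temperature(sw_abs, lw_down, t_air, t_deep, t_init=300.0):
        """Solve F(Ts) = 0, where F is the surface energy balance residual."""
        ts = t_init
        for _ in range(50):
            # residual: absorbed radiation minus emitted LW, sensible and ground fluxes
            f = (sw_abs + EMISS * lw_down
                 - EMISS * SIGMA * ts**4
                 - R_H * (ts - t_air)
                 - LAMBDA_G * (ts - t_deep))
            dfdts = -4.0 * EMISS * SIGMA * ts**3 - R_H - LAMBDA_G  # analytic derivative
            step = f / dfdts
            ts -= step
            if abs(step) < 1e-6:
                break
        return ts

    print(surface_temperature(sw_abs=450.0, lw_down=350.0, t_air=298.0, t_deep=290.0))

In a full urban scheme, each wall, roof, and ground facet would solve such a balance with its own material properties, with the fluxes coupled to the resolved flow and radiation fields.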
Modeling seasonality of ice and ocean carbon production in the Arctic
NASA Astrophysics Data System (ADS)
Jin, M.; Deal, C. M.; Ji, R.
2011-12-01
In the Arctic Ocean, both phytoplankton and sea ice algae are important contributors to primary production and the Arctic food web. Copepods in Arctic regions have developed feeding habits that depend on the timing between the ice algal bloom and the subsequent phytoplankton bloom. A mismatch in this timing due to climate change could have dramatic consequences for the food web, as shown by some regional observations. In this study, a global coupled ice-ocean-ecosystem model was used to assess the seasonality of the ice algal and phytoplankton blooms in the Arctic. The ice and ocean ecosystem modules are fully coupled within the physical model POP-CICE (Parallel Ocean Program-Los Alamos Sea Ice Model). The model results are compared with various observations. The modeled ice and ocean carbon production was analyzed by region, together with its linkage to changes in the physical environment (such as changes in ice concentration, water temperature, and light intensity) between low- and high-ice years.
Progress with lossy compression of data from the Community Earth System Model
NASA Astrophysics Data System (ADS)
Xu, H.; Baker, A.; Hammerling, D.; Li, S.; Clyne, J.
2017-12-01
Climate models, such as the Community Earth System Model (CESM), generate massive quantities of data, particularly when run at high spatial and temporal resolutions. The burden of storage is further exacerbated by creating large ensembles, generating large numbers of variables, outputting at high frequencies, and duplicating data archives (to protect against disk failures). Applying lossy compression methods to CESM datasets is an attractive means of reducing data storage requirements, but ensuring that the loss of information does not negatively impact science objectives is critical. In particular, test methods are needed to evaluate whether critical features (e.g., extreme values and spatial and temporal gradients) have been preserved and to boost scientists' confidence in the lossy compression process. We will provide an overview of our progress in applying lossy compression to CESM output and describe our unique suite of metric tests that evaluate the impact of information loss. Further, we will describe our process for choosing an appropriate compression algorithm (and its associated parameters) given the diversity of CESM data (e.g., variables may be constant, smooth, change abruptly, contain missing values, or have large ranges). Traditional compression algorithms, such as those used for images, are not necessarily well suited to floating-point climate simulation data, and different methods may have different strengths and be more effective for certain types of variables than others. We will discuss our progress towards our ultimate goal of developing an automated multi-method parallel approach for compression of climate data that both maximizes data reduction and minimizes the impact of data loss on science results.
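As a toy illustration of this kind of feature-preservation test (not the authors' actual metric suite), one can emulate a lossy compressor by keeping a fixed number of significant digits and then check pointwise errors, extremes, and gradients:

    import numpy as np

    def lossy_round(x, sig_digits=3):
        """Emulate lossy compression by keeping only `sig_digits` significant digits."""
        x = np.asarray(x, dtype=np.float64)
        with np.errstate(divide="ignore"):
            mag = np.where(x == 0.0, 0.0, np.floor(np.log10(np.abs(x))))
        scale = 10.0 ** (sig_digits - 1 - mag)
        return np.round(x * scale) / scale

    rng = np.random.default_rng(0)
    field = rng.normal(280.0, 15.0, size=(192, 288))   # synthetic temperature field [K]
    recon = lossy_round(field, sig_digits=4)

    # Feature-preservation checks: pointwise error, extremes, spatial gradients
    max_rel_err = np.max(np.abs(recon - field) / np.abs(field))
    print("max relative error:", max_rel_err)
    print("max value preserved within:", abs(recon.max() - field.max()))
    gx_orig, gx_rec = np.gradient(field)[0], np.gradient(recon)[0]
    print("gradient RMSE:", np.sqrt(np.mean((gx_rec - gx_orig) ** 2)))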
Interactive Parallel Data Analysis within Data-Centric Cluster Facilities using the IPython Notebook
NASA Astrophysics Data System (ADS)
Pascoe, S.; Lansdowne, J.; Iwi, A.; Stephens, A.; Kershaw, P.
2012-12-01
The data deluge is making traditional analysis workflows obsolete for many researchers. Support for parallelism within popular tools such as Matlab, IDL and NCO is not well developed and rarely used, yet parallelism is necessary for processing modern data volumes on a timescale conducive to curiosity-driven analysis. Furthermore, for peta-scale datasets such as the CMIP5 archive, it is no longer practical to bring an entire dataset to a researcher's workstation for analysis, or even to their institutional cluster. There is therefore an increasing need for new analysis platforms that both enable processing at the point of data storage and provide parallelism. Such an environment should, where possible, maintain the convenience and familiarity of our current analysis environments to encourage curiosity-driven research. We describe how we are combining the interactive Python shell (IPython) with our JASMIN data-cluster infrastructure. IPython has been specifically designed to bridge the gap between HPC-style parallel workflows and the opportunistic curiosity-driven analysis usually carried out using domain-specific languages and scriptable tools. IPython offers a web-based interactive environment, the IPython notebook, and a cluster engine for parallelism, all underpinned by the well-respected Python/SciPy scientific programming stack. JASMIN is designed to support the data analysis requirements of the UK and European climate and earth system modelling community. JASMIN, with its sister facility CEMS focused on the earth observation community, has 4.5 PB of fast parallel disk storage alongside over 370 computing cores that provide local computation. Through the IPython interface to JASMIN, users can make efficient use of JASMIN's multi-core virtual machines to perform interactive analysis on all cores simultaneously, or can configure IPython clusters across multiple VMs. Larger-scale clusters can be provisioned through JASMIN's batch scheduling system. Outputs can be summarised and visualised using the full power of Python's many scientific tools, including SciPy, Matplotlib, Pandas and CDAT. This rich user experience is delivered through the user's web browser, maintaining the interactive feel of a workstation-based environment with the parallel power of a remote data-centric processing facility.
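A minimal sketch of this mode of working, using the ipyparallel package (the current name of IPython's cluster machinery) and assuming a cluster has already been started with, e.g., 'ipcluster start -n 8'; the file paths and variable name below are hypothetical:

    import ipyparallel as ipp

    rc = ipp.Client()      # connect to the running IPython cluster
    dview = rc[:]          # a direct view over all engines

    def zonal_mean(path):
        """Compute a zonal-mean field from one NetCDF file (runs on an engine)."""
        import netCDF4
        import numpy as np
        with netCDF4.Dataset(path) as ds:
            tas = ds.variables["tas"][:]     # assumed variable name
            return np.asarray(tas).mean(axis=-1)

    # Hypothetical file list; in practice these would live on the data cluster
    files = ["/data/cmip5/tas_%04d.nc" % year for year in range(1986, 1996)]
    results = dview.map_sync(zonal_mean, files)  # one file per engine, in parallel
    print(len(results), "zonal-mean arrays computed")

The point of the design is that the function ships to the engines, which sit next to the storage, rather than the data shipping to the user's workstation.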
Millennial-scale Climate Variations Recorded As Far Back As The Early Pliocene
NASA Astrophysics Data System (ADS)
Steenbrink, J.; Hilgen, F. J.; Lourens, L. J.
Quaternary climate proxy records show compelling evidence for climate variability on time scales of a few thousand years. The causes of these millennial-scale or sub-Milankovitch cycles are as yet poorly understood, not least because of the complex feedback mechanisms of the large Quaternary ice sheets. We present evidence of millennial-scale climate variability in Early Pliocene lacustrine sediments from the intramontane Ptolemais Basin in northwestern Greece. The sediments are well exposed in a series of open-pit lignite mines and exhibit a distinct m-scale sedimentary cyclicity of alternating lignites and lacustrine marl beds that results from precession-induced variations in climate. A higher-frequency cyclicity is particularly prominent within the marl segment of individual cycles. A stratigraphic interval of ~115 kyr, covering five precession-induced sedimentary cycles, was studied in nine parallel sections from two quarries located several km apart. Colour reflectance records were used to quantify the within-cycle variability and to determine its lateral continuity. Much of the within-cycle variability could be correlated between the parallel sections, even in fine detail, which suggests that these changes reflect basin-wide variations in environmental conditions related to (regional) climate fluctuations. Interbedded volcanic ash beds demonstrate the synchronicity of these fluctuations, and spectral analysis of the reflectance time series shows a significant concentration of variability at periods of ~11, ~5.5 and ~2 kyr. Their occurrence before the intensification of the Northern Hemisphere glaciation suggests that they cannot have resulted solely from internal ice-sheet dynamics. Possible candidates include harmonics or combination tones of the main orbital cycles, variations in solar output, or periodic motions of the Earth and Moon.
The Urbino Summer School in Paleoclimatology: Investing in the future of paleoclimatology
NASA Astrophysics Data System (ADS)
Schellenberg, S. A.; Galeotti, S.; Brinkhuis, H.; Leckie, R. M.
2010-12-01
Improving our understanding of global climate dynamics is increasingly critical as we continue to perturb the Earth system on geologically rapid time-scales. One approach is the modeling of climate dynamics; another is the exploitation of natural archives of climate history. To promote the synergistic integration of these approaches in the next generation of paleoclimatologists, a group of international teacher-scholars have developed the Urbino Summer School in Paleoclimatology (USSP), which has been offered since 2004 at the Università degli Studi di Urbino in Urbino, Italy. The USSP provides international graduate students with an intensive three-week experience in reconstructing the history and dynamics of climate through an integrated series of lectures, investigations, and field and laboratory analyses. Complementing these formal components, informal scientific discussions and collaborations are promoted among faculty and students through group meals, coffee breaks, socials, and evening presentations. The first week begins with a broad overview of climate history and dynamics, and then focuses on the principles and methods that transform geographically and materially diverse data into globally time-ordinated paleoclimatic information. Lectures largely serve as “connective tissue” for student-centered investigations that use ocean drilling data and student-collected field data from the spectacular exposures of the surrounding Umbria-Marche Basin. The second week provides sessions and investigations on various biotic and geochemical proxies, and marks the start of student “working groups,” each of which focuses on the current understanding of, and outstanding questions regarding, a particular geologic time interval. Parallel sessions also commence, wherein students self-select to attend one of three concurrently offered specialized topics. The third week is an intensive exploration of geochemical, climate, and ocean modeling that stresses the integration of paleoclimate modeling and proxy data. The third week also includes the “Cioppino” conference, comprising lectures by experts from various fields presenting “new and exciting ideas for digestion.” The course concludes with a series of lectures, discussions, and student presentations examining the relevance of paleoclimate to understanding modern climate dynamics and anthropogenic impacts. Student costs are increasingly being reduced, per capita through governmental/institutional underwriting and individually through competitive awards (e.g., recent NSF USSP scholarships). Based on student and faculty evaluations, the current USSP structure appears largely optimized for our initial goal of promoting the integration of paleoclimate proxy data and modeling. Current planning efforts focus on strengthening course connections to Anthropocene issues and managing the large number of international faculty who donate their time and energy as an investment in the future of paleoclimatology.
A big data approach for climate change indicators processing in the CLIP-C project
NASA Astrophysics Data System (ADS)
D'Anca, Alessandro; Conte, Laura; Palazzo, Cosimo; Fiore, Sandro; Aloisio, Giovanni
2016-04-01
Defining and implementing processing chains with multiple (e.g. tens or hundreds of) data analytics operators can be a real challenge in many practical scientific use cases, such as climate change indicators. This is usually done via scripts (e.g. bash) on the client side, and it requires climate scientists to implement and replicate workflow-like control logic (which may be error-prone) in their scripts, alongside the expected application-level part. Moreover, the large amount of data and the strong I/O demand pose additional performance challenges. In this regard, production-level tools for climate data analysis are mostly sequential, and there is a lack of big data analytics solutions implementing fine-grained data parallelism or adopting stronger parallel I/O strategies, data locality, workflow optimization, etc. High-level solutions leveraging workflow-enabled big data analytics frameworks for eScience could help scientists define and implement the workflows related to their experiments through a more declarative, efficient and powerful approach. This talk will begin by introducing the main needs and challenges of big data analytics workflow management for eScience, and will then provide insights into the implementation of some real use cases concerning climate change indicators on large datasets produced in the context of the CLIP-C project, an EU FP7 project aiming at providing access to climate information of direct relevance to a wide variety of users, from scientists to policy makers and private-sector decision makers. All the proposed use cases have been implemented using the Ophidia big data analytics framework. The software stack includes an internal workflow management system, which coordinates, orchestrates, and optimises the execution of multiple scientific data analytics and visualization tasks. Real-time monitoring of workflow execution is also supported through a graphical user interface. To address the challenges of the use cases, the implemented data analytics workflows include parallel data analysis, metadata management, virtual file system tasks, map generation, rolling of datasets, and import/export of datasets in NetCDF format. The use cases have been implemented on an 8-node (16 cores/node) partition of the Athena cluster available at the CMCC Supercomputing Centre. Benchmark results will also be presented during the talk.
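To make "climate change indicator" concrete, here is a small self-contained sketch (plain NumPy on synthetic data, not the Ophidia operators used in the project) computing one common indicator, the annual count of tropical nights (daily minimum temperature above 20 °C):

    import numpy as np

    rng = np.random.default_rng(42)
    years, days = 30, 365
    # Synthetic daily-minimum temperature [deg C] for one grid point, 30 years,
    # with a seasonal cycle and a small imposed warming trend
    tmin = (12.0 + 8.0 * np.sin(2 * np.pi * np.arange(days) / 365.0)
            + rng.normal(0.0, 3.0, size=(years, days))
            + 0.04 * np.arange(years)[:, None])

    tropical_nights = (tmin > 20.0).sum(axis=1)   # indicator: nights/year above 20 degC
    trend = np.polyfit(np.arange(years), tropical_nights, 1)[0]
    print("mean tropical nights per year:", tropical_nights.mean())
    print("trend [nights/year per year]:", trend)

A workflow engine such as the one described above would chain many such operators (subsetting, reduction, thresholding, map generation) over full model grids rather than a single point.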
NASA Technical Reports Server (NTRS)
1992-01-01
The U.S. Global Change Research Program (USGCRP) was established as a Presidential initiative in the FY-1990 Budget to help develop sound national and international policies related to global environmental issues, particularly global climate change. The USGCRP is implemented through a priority-driven scientific research agenda that is designed to be integrated, comprehensive, and multidisciplinary. It is designed explicitly to address scientific uncertainties in such areas as climate change, ozone depletion, changes in terrestrial and marine productivity, global water and energy cycles, sea level changes, the impact of global changes on human health and activities, and the impact of anthropogenic activities on the Earth system. The USGCRP addresses three parallel but interconnected streams of activity: documenting global change (observations); enhancing understanding of key processes (process research); and predicting global and regional environmental change (integrated modeling and prediction).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Dean N.
2011-04-02
This report summarizes work carried out by the Earth System Grid Center for Enabling Technologies (ESG-CET) from October 1, 2010 through March 31, 2011. It discusses ESG-CET highlights for the reporting period, overall progress, period goals, and collaborations, and lists papers and presentations. To learn more about our project and to find previous reports, please visit the ESG-CET Web sites: http://esg-pcmdi.llnl.gov/ and/or https://wiki.ucar.edu/display/esgcet/Home. This report will be forwarded to managers in the Department of Energy (DOE) Scientific Discovery through Advanced Computing (SciDAC) program and the Office of Biological and Environmental Research (OBER), as well as national and international collaborators and stakeholders (e.g., those involved in the Coupled Model Intercomparison Project, phase 5 (CMIP5) for the Intergovernmental Panel on Climate Change (IPCC) 5th Assessment Report (AR5); the Community Earth System Model (CESM); the Climate Science Computational End Station (CCES); SciDAC II: A Scalable and Extensible Earth System Model for Climate Change Science; the North American Regional Climate Change Assessment Program (NARCCAP); the Atmospheric Radiation Measurement (ARM) program; the National Aeronautics and Space Administration (NASA); and the National Oceanic and Atmospheric Administration (NOAA)), and also to researchers working on a variety of other climate model and observation evaluation activities. The ESG-CET executive committee consists of Dean N. Williams, Lawrence Livermore National Laboratory (LLNL); Ian Foster, Argonne National Laboratory (ANL); and Don Middleton, National Center for Atmospheric Research (NCAR). The ESG-CET team is a group of researchers and scientists with diverse domain knowledge, whose home institutions include eight laboratories and two universities: ANL, Los Alamos National Laboratory (LANL), Lawrence Berkeley National Laboratory (LBNL), LLNL, NASA/Jet Propulsion Laboratory (JPL), NCAR, Oak Ridge National Laboratory (ORNL), Pacific Marine Environmental Laboratory (PMEL)/NOAA, Rensselaer Polytechnic Institute (RPI), and the University of Southern California Information Sciences Institute (USC/ISI). All ESG-CET work is accomplished under DOE open-source guidelines and in close collaboration with the project's stakeholders, domain researchers, and scientists. Through the ESG project, the ESG-CET team has developed and delivered a production environment for climate data from multiple sources (e.g., CMIP (IPCC) and CESM climate model data, ocean model data (e.g., Parallel Ocean Program), observation data (e.g., Atmospheric Infrared Sounder, Microwave Limb Sounder), and analysis and visualization tools) that serves a worldwide climate research community. Data holdings are distributed across multiple sites including LANL, LBNL, LLNL, NCAR, and ORNL, as well as unfunded partner sites such as the Australian National University (ANU) National Computational Infrastructure (NCI), the British Atmospheric Data Center (BADC), the Geophysical Fluid Dynamics Laboratory/NOAA, the Max Planck Institute for Meteorology (MPI-M), the German Climate Computing Centre (DKRZ), and NASA/JPL. As we transition from development activities to production and operations, the ESG-CET team is tasked with making data available to all users who want to understand it, process it, extract value from it, visualize it, and/or communicate it to others.
This ongoing effort is extremely large and complex, but it will be incredibly valuable for building 'science gateways' to critical climate resources (such as CESM, CMIP5, ARM, NARCCAP, Atmospheric Infrared Sounder (AIRS), etc.) for processing the next IPCC assessment report. Continued ESG progress will result in a production-scale system that will empower scientists to attempt new and exciting data exchanges, which could ultimately lead to breakthrough climate science discoveries.
Springer, Yuri P.; Jarnevich, Catherine S.; Barnett, David T.; Monaghan, Andrew J.; Eisen, Rebecca J.
2015-01-01
The lone star tick (Amblyomma americanum L.) is the primary vector for pathogens of significant public health importance in North America, yet relatively little is known about its current and potential future distribution. Building on a published summary of tick collection records, we used an ensemble modeling approach to predict the present-day and future distribution of climatically suitable habitat for establishment of the lone star tick within the continental United States. Of the nine climatic predictor variables included in our five present-day models, average vapor pressure in July was by far the most important determinant of suitable habitat. The present-day ensemble model predicted an essentially contiguous distribution of suitable habitat extending to the Atlantic coast east of the 100th western meridian and south of the 40th northern parallel, but excluding a high-elevation region associated with the Appalachian Mountains. Future ensemble predictions for 2061–2080 forecasted a stable western range limit, northward expansion of suitable habitat into the Upper Midwest and western Pennsylvania, and range contraction along portions of the Gulf coast and the lower Mississippi river valley. These findings are informative for raising awareness of A. americanum-transmitted pathogens in areas where the lone star tick has recently become, or may become, established.
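The ensemble step itself is simple to sketch. The toy code below (synthetic suitability surfaces and assumed skill weights, not the authors' five models) averages several habitat-suitability predictions, weighting each by a skill score, and thresholds the result:

    import numpy as np

    rng = np.random.default_rng(1)
    ny, nx = 100, 150
    # Synthetic suitability predictions (0..1) from three hypothetical models
    preds = [np.clip(rng.normal(0.5, 0.2, (ny, nx)) + bias, 0, 1)
             for bias in (-0.1, 0.0, 0.1)]
    skill = np.array([0.7, 0.9, 0.8])     # assumed per-model skill (e.g., AUC)

    weights = skill / skill.sum()
    ensemble = sum(w * p for w, p in zip(weights, preds))   # weighted mean suitability
    suitable = ensemble >= 0.5                              # binary habitat map
    print("fraction of area predicted suitable:", suitable.mean())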
NASA Astrophysics Data System (ADS)
Zhao, F.; Frieler, K.; Warszawski, L.; Lange, S.; Schewe, J.; Reyer, C.; Ostberg, S.; Piontek, F.; Betts, R. A.; Burke, E.; Ciais, P.; Deryng, D.; Ebi, K. L.; Emanuel, K.; Elliott, J. W.; Galbraith, E. D.; Gosling, S.; Hickler, T.; Hinkel, J.; Jones, C.; Krysanova, V.; Lotze-Campen, H.; Mouratiadou, I.; Popp, A.; Tian, H.; Tittensor, D.; Vautard, R.; van Vliet, M. T. H.; Eddy, T.; Hattermann, F.; Huber, V.; Mengel, M.; Stevanovic, M.; Kirsten, T.; Mueller Schmied, H.; Denvil, S.; Halladay, K.; Suzuki, T.; Lotze, H. K.
2016-12-01
In Paris, France, in December 2015, the Conference of Parties (COP) to the United Nations Framework Convention on Climate Change (UNFCCC) invited the IPCC to provide a "special report in 2018 on the impacts of global warming of 1.5°C above pre-industrial levels and related global greenhouse gas emission pathways". In Nairobi, Kenya, in April 2016, the IPCC panel accepted the invitation. Here we describe the model simulations planned within the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP) to address this request by providing tailored, cross-sectorally consistent impact projections. The protocol is designed to allow for 1) a separation of the impacts of the historical warming starting from pre-industrial conditions from other human drivers such as historical land use changes (based on pre-industrial and historical impact model simulations); 2) a quantification of the effects of additional warming up to 1.5°C, including a potential overshoot, and long-term effects up to 2300, in comparison to a no-mitigation scenario (based on the low-emissions Representative Concentration Pathway RCP2.6 and the no-mitigation scenario RCP6.0), keeping socio-economic conditions fixed at year-2005 levels; and 3) an assessment of the climate effects based on the same climate scenarios but accounting for parallel changes in socio-economic conditions following the middle-of-the-road Shared Socioeconomic Pathway (SSP2) and the differential bio-energy requirements associated with the transformation of the energy system needed to reach RCP2.6 rather than RCP6.0. To provide the scientific basis for an aggregation of impacts across sectors and an analysis of cross-sectoral interactions that may dampen or amplify sectoral impacts, the protocol is designed to provide consistent impact projections across a range of impact models from different sectors (global and regional hydrological models, global gridded crop models, global vegetation models, regional forestry models, global and regional marine ecosystem and fisheries models, global and regional coastal infrastructure models, energy models, health models, and agro-economic models).
NASA Astrophysics Data System (ADS)
Sun, H.; Bond, T.
2004-12-01
Carbonaceous aerosols, including black carbon (BC) and organic carbon (OC), make up a large fraction of atmospheric aerosols and affect the radiative balance of the Earth, either by directly scattering and absorbing solar radiation or through indirect influence on cloud optical properties and cloud lifetimes. The major sources of BC and OC emissions are combustion processes, mainly fossil-fuel burning, biofuel burning, and open biomass burning. OC is nearly always emitted with BC. Because different combustion practices contribute to the emission of BC and OC to the atmosphere, the magnitude and characteristics of carbonaceous aerosols vary between regions. Since OC mainly scatters light and BC absorbs it, OC may oppose the warming effect of BC, so that the net climatic effect of carbonaceous aerosols is not known. There is presently disagreement on whether carbonaceous aerosols produce a net warming or cooling effect on climate. Some differences in model predictions may result from model differences, such as dynamics and the treatment of cloud feedbacks. However, large differences also result from initial assumptions about the properties of BC and OC: optical properties, size distribution, and interaction with water. Although there are hundreds of different organic species in atmospheric aerosols, with widely varying properties, global climate models to date have treated organics as one "compound." In addition, emissions of OC are often derived by multiplying BC emissions by a constant factor, so that the balance between these different compounds is assumed. Addressing these critical model assumptions is a necessary step toward estimating the net climatic impact of carbonaceous aerosols and of different human activities. We aim to contribute to this effort by tabulating important climate-relevant properties of both emissions and ambient measurements. Since a single organic "compound" is not sufficient to represent all the organics in aerosols, we propose Climate-Relevant Optical & Structural Subgroups of OC (CROSS-OC), a classification for organic aerosols based on structural and optical properties. We provide broad classes aimed at global models instead of very detailed classifications, which are not amenable to use in global-scale models due to the computational cost. Organic matter (OM), which includes the hydrogen and oxygen bound to this carbon, is divided into classes with varied absorption and scattering capabilities. Because our inventory tabulates emissions from specific sources, we make use of data available from source characterization. We present a global emission inventory of primary carbonaceous aerosols that has been designed for global climate modeling purposes. The inventory is based on our CROSS-OC classification and considers emissions from fossil fuels, biofuels, and open biomass burning. Fuel type, combustion type, and emission controls, and their prevalence on a regional basis, are combined to determine emission factors for all types of carbonaceous aerosols. We also categorize surface concentration observations for BC and OC by region, size (super- vs. sub-micron), measurement type, time (including season) and date. We parallel the data format suggested by the Global Atmosphere Watch aerosol database. Work underway includes providing information on the CROSS-OC divisions in ambient aerosol when measurements contain sufficient detail.
NASA Astrophysics Data System (ADS)
Oki, T.; KIM, H.; Ferguson, C. R.; Dirmeyer, P.; Seneviratne, S. I.
2013-12-01
As the climate warms, the frequency and severity of flood and drought events are projected to increase. Understanding the role that the land surface will play in reinforcing or diminishing these extremes at regional scales will become critical. In fact, the current development path from atmospheric (GCM) to coupled atmosphere-ocean (AOGCM) to fully coupled dynamic earth system models (ESMs) has brought new awareness to the climate modeling community of the abundance of uncertainty in land surface parameterizations. One way to test the representativeness of a land surface scheme is to do so in off-line (uncoupled) mode with controlled, high-quality meteorological forcing. When multiple land schemes are run in parallel (with the same forcing data), an inter-comparison of their outputs can provide the basis for model confidence estimates and future model refinements. In 2003, the Global Soil Wetness Project Phase 2 (GSWP2) provided the first global multi-model analysis of land surface state variables and fluxes. It spanned the decade 1986-1995. While it was state-of-the-art at the time, physical schemes have since been enhanced, a number of additional processes and components in the water-energy-ecosystems nexus can now be simulated, and the availability of global, long-term observationally based datasets that can be used for forcing and validating models has grown. Today, the data exist to support century-scale off-line experiments. The ongoing follow-on to GSWP2, named GSWP3, capitalizes on these new capabilities and model functionalities. The project's cornerstone is its century-scale (1901-2010), 3-hourly, 0.5° meteorological forcing dataset, which has been dynamically downscaled from the Twentieth Century Reanalysis and bias-corrected using monthly Climate Research Unit (CRU) temperature and Global Precipitation Climatology Centre (GPCC) precipitation data. However, GSWP3 also has an important long-term future climate component that spans the 21st century. Forcings for this period are produced from a select number of GCM-representative concentration pathway (RCP) pairings. GSWP3 is specifically directed towards addressing the following key science questions: 1. How have interactions between eco-hydrological processes changed in the long term within a changing climate? 2. What is/will be the state of the water, energy, and carbon balances over land in the 20th and 21st centuries, and what are the implications of the anticipated changes for human society in terms of freshwater resources, food productivity, and biodiversity? 3. How do state-of-the-art land surface modeling systems perform, and how can they be improved? In this presentation, we present preliminary results relevant to science question two, including revised best-estimate global hydrological cycles for the retrospective period, inter-comparisons of modeled terrestrial water storage in large river basins with satellite remote-sensing estimates from the Gravity Recovery and Climate Experiment (GRACE), and the impacts of climate and anthropogenic changes during the 20th century on the long-term trend of water availability and scarcity.
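The bias-correction step mentioned above can be sketched in miniature (synthetic data and hypothetical observed values; the actual GSWP3 procedure is more elaborate): daily precipitation is rescaled so that each month matches the GPCC total, and daily temperature is shifted to match the CRU monthly mean:

    import numpy as np

    rng = np.random.default_rng(7)
    days_in_month = 30
    # Synthetic daily reanalysis values for one month at one grid point
    precip = rng.gamma(0.5, 4.0, days_in_month)    # [mm/day]
    temp = rng.normal(284.0, 3.0, days_in_month)   # [K]

    gpcc_monthly_total = 95.0   # hypothetical GPCC observation [mm]
    cru_monthly_mean = 285.5    # hypothetical CRU observation [K]

    # Multiplicative correction preserves the daily sequence of wet/dry days
    precip_bc = precip * (gpcc_monthly_total / precip.sum())
    # Additive correction preserves daily temperature variability
    temp_bc = temp + (cru_monthly_mean - temp.mean())

    print(precip_bc.sum(), temp_bc.mean())   # now match the monthly observations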
Gutierrez, Kristie S; LePrevost, Catherine E
2016-02-03
Climate justice is a local, national, and global movement to protect at-risk populations who are disproportionately affected by climate change. The social context for this review is the Southeastern region of the United States, which is particularly susceptible to climate change because of the geography of the area and the vulnerabilities of the inhabiting populations. Negative human health effects on variable and vulnerable populations within the Southeast region due to a changing climate are concerning, as health threats are not expected to produce parallel effects among all individuals. Vulnerable communities, such as communities of color, indigenous people, the geographically isolated, and those who are socioeconomically disadvantaged and already experiencing poor environmental quality, are least able to respond and adapt to climate change. Focusing on vulnerable populations in the Southeastern United States, this review is a synthesis of the recent (2010 to 2015) literature base on the health effects connected to climate change. This review also addresses local and regional mitigation and adaptation strategies for citizens and leaders to combat direct and indirect human health effects related to a changing climate.
Scenarios of global mercury emissions from anthropogenic sources
NASA Astrophysics Data System (ADS)
Rafaj, P.; Bertok, I.; Cofala, J.; Schöpp, W.
2013-11-01
This paper discusses the impact of air quality and climate policies on global mercury emissions over the time horizon up to 2050. The evolution of mercury emissions is based on projections of energy consumption for a scenario without any global greenhouse gas mitigation efforts, and for a 2 °C climate policy scenario, which assumes internationally coordinated action to mitigate climate change. The assessment takes into account current air quality legislation in each country, and provides estimates of the maximum feasible reductions in mercury through 2050. Results indicate significant scope for co-benefits of climate policies for mercury emissions. Atmospheric releases of mercury from anthropogenic sources under the global climate mitigation regime are reduced in 2050 by 45% compared to the case without climate measures. Around one third of the worldwide co-benefits for mercury emissions by 2050 occur in China. An annual Hg abatement of about 800 tons is estimated for coal combustion in the power sector if current air pollution legislation and climate policies are adopted in parallel.
Spatially explicit shallow landslide susceptibility mapping over large areas
Bellugi, Dino; Dietrich, William E.; Stock, Jonathan D.; McKean, Jim; Kazian, Brian; Hargrove, Paul
2011-01-01
Recent advances in downscaling climate model precipitation predictions now yield spatially explicit patterns of rainfall that could be used to estimate shallow landslide susceptibility over large areas. In California, the United States Geological Survey is exploring community emergency response to the possible effects of a very large simulated storm event, and to do so it has generated downscaled precipitation maps for the storm. To predict the corresponding pattern of shallow landslide susceptibility across the state, we have used the model Shalstab (a coupled steady-state runoff and infinite-slope stability model), which produces spatially explicit estimates of relative potential instability. Slope stability models that include the effects of subsurface runoff on potentially destabilizing pore-pressure evolution require water routing, and hence the definition of the upslope drainage area of each grid cell. To calculate drainage area efficiently over a large area, we developed a parallel framework to scale up Shalstab, and specifically introduce a new, efficient parallel drainage area algorithm that produces seamless results. The single seamless shallow landslide susceptibility map for all of California was produced in a short run time, indicating that much larger areas can be modelled efficiently. Because landslide susceptibility maps generally overpredict the extent of instability for any given storm, local empirical data on the fraction of predicted unstable cells that failed for an observed rainfall intensity can be used to specify the likely extent of hazard for a given storm. This suggests that campaigns to collect local precipitation data and detailed shallow landslide location maps after major storms could be used to calibrate models and improve their use in hazard assessment for individual storms.
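For readers unfamiliar with Shalstab, the per-cell stability test is compact. The sketch below implements the commonly published form of the criterion (critical steady-state rainfall q_cr) on synthetic grids; all parameter values are illustrative assumptions, and this is not the parallel production code described above:

    import numpy as np

    # Assumed, illustrative parameters (not calibrated values)
    T = 65.0           # soil transmissivity [m^2/day]
    RHO_RATIO = 2.0    # bulk soil density / water density
    TAN_PHI = np.tan(np.radians(40.0))   # tangent of the soil friction angle

    def shalstab_critical_rainfall(slope_rad, area_per_width):
        """Critical steady-state rainfall [m/day] from the Shalstab criterion:
        q_cr = (T*sin(theta) / (a/b)) * (rho_s/rho_w) * (1 - tan(theta)/tan(phi)).
        Cells with q_cr <= 0 are unconditionally unstable."""
        tan_theta = np.tan(slope_rad)
        return (T * np.sin(slope_rad) / area_per_width) * RHO_RATIO \
               * (1.0 - tan_theta / TAN_PHI)

    rng = np.random.default_rng(3)
    slope = np.radians(rng.uniform(5.0, 45.0, size=(200, 200)))   # synthetic slopes
    a_over_b = rng.lognormal(3.0, 1.0, size=(200, 200))  # drainage area per contour width [m]

    q_cr = shalstab_critical_rainfall(slope, a_over_b)
    storm = 0.1                      # hypothetical storm rainfall [m/day]
    unstable = q_cr <= storm         # includes q_cr <= 0 (unconditionally unstable)
    print("fraction of cells predicted unstable:", unstable.mean())

The expensive part at the statewide scale is not this per-cell test but computing a_over_b, the upslope drainage area, which is why the parallel drainage area algorithm is the paper's focus.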
Inexact hardware for modelling weather & climate
NASA Astrophysics Data System (ADS)
Düben, Peter D.; McNamara, Hugh; Palmer, Tim
2014-05-01
The use of stochastic processing hardware and low-precision arithmetic in atmospheric models is investigated. Stochastic processors allow hardware-induced faults in calculations, sacrificing exact calculations in exchange for improvements in performance, potentially accuracy, and a reduction in power consumption. A similar trade-off is achieved using low-precision arithmetic, with improvements in computation and communication speed and savings in storage and memory requirements. As high-performance computing becomes more massively parallel and power intensive, these two approaches may be important stepping stones in the pursuit of global cloud-resolving atmospheric modelling. The impact of both hardware-induced faults and low-precision arithmetic is tested in the dynamical core of a global atmosphere model. Our simulations show that both approaches to inexact calculations do not substantially affect the quality of the model simulations, provided they are restricted to act only on the smaller scales. This suggests that inexact calculations at the small scale could reduce computation and power costs without adversely affecting the quality of the simulations.
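A common way to emulate low-precision arithmetic in such experiments is to truncate mantissa bits of double-precision numbers in software. The sketch below does this for float64 arrays (simple truncation rather than round-to-nearest, and an assumed stand-in for the emulator actually used in the study):

    import numpy as np

    def truncate_mantissa(x, keep_bits):
        """Zero out all but the leading `keep_bits` of the 52 explicit
        float64 mantissa bits, emulating reduced-precision storage."""
        assert 0 <= keep_bits <= 52
        bits = np.asarray(x, dtype=np.float64).view(np.uint64)
        mask = np.uint64(~((1 << (52 - keep_bits)) - 1) & 0xFFFFFFFFFFFFFFFF)
        return (bits & mask).view(np.float64)

    x = np.array([3.141592653589793, 2.718281828459045])
    for kb in (52, 23, 10):
        print(kb, truncate_mantissa(x, kb))

Keeping 23 mantissa bits roughly mimics single precision; in the experiments described above, such degraded precision would be applied only to the small-scale part of the model state.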
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stamnes, K.; Ellingson, R.G.; Curry, J.A.
1999-01-01
Recent climate modeling results point to the Arctic as a region that is particularly sensitive to global climate change. The Arctic warming predicted by the models to result from the expected doubling of atmospheric carbon dioxide is two to three times the predicted mean global warming, and considerably greater than the warming predicted for the Antarctic. The North Slope of Alaska-Adjacent Arctic Ocean (NSA-AAO) Cloud and Radiation Testbed (CART) site of the Atmospheric Radiation Measurement (ARM) Program is designed to collect data on temperature-ice-albedo and water vapor-cloud-radiation feedbacks, which are believed to be important to the predicted enhanced warming in the Arctic. The most important scientific issues of Arctic, as well as global, significance to be addressed at the NSA-AAO CART site are discussed, and a brief overview of the current approach toward, and status of, site development is provided. ARM radiometric and remote sensing instrumentation is already deployed and taking data in the perennial Arctic ice pack as part of the SHEBA (Surface Heat Budget of the Arctic Ocean) experiment. In parallel with ARM's participation in SHEBA, the NSA-AAO facility near Barrow was formally dedicated on 1 July 1997 and began routine data collection early in 1998. This schedule permits the US Department of Energy's ARM Program, NASA's Arctic Cloud program, and the SHEBA program (funded primarily by the National Science Foundation and the Office of Naval Research) to be mutually supportive. In addition, the location of the NSA-AAO Barrow facility on National Oceanic and Atmospheric Administration land, immediately adjacent to its Climate Monitoring and Diagnostics Laboratory Barrow Observatory, includes NOAA in this major interagency Arctic collaboration.
Impact of anthropogenic aerosols on regional climate change in Beijing, China
NASA Astrophysics Data System (ADS)
Zhao, B.; Liou, K. N.; He, C.; Lee, W. L.; Gu, Y.; Li, Q.; Leung, L. R.
2015-12-01
Anthropogenic aerosols affect regional climate significantly through radiative (direct and semi-direct) and indirect effects, but the magnitude of these effects over megacities is subject to large uncertainty. In this study, we evaluated the effects of anthropogenic aerosols on regional climate change in Beijing, China using the online-coupled Weather Research and Forecasting/Chemistry model (WRF/Chem) with the Fu-Liou-Gu radiation scheme and a spatial resolution of 4 km. We further updated this radiation scheme with a geometric-optics surface-wave (GOS) approach for the computation of light absorption and scattering by black carbon (BC) particles, in which aggregation shape and internal mixing properties are accounted for. In addition, we incorporated in WRF/Chem a 3D radiative transfer parameterization, in conjunction with high-resolution digital data for city buildings and landscape, to improve the simulation of the boundary layer, surface solar fluxes, and the associated sensible/latent heat fluxes. Preliminary simulated meteorological parameters, fine particles (PM2.5), and their chemical components agree well with observational data in terms of both magnitude and spatio-temporal variations. The effects of anthropogenic aerosols, including BC, on radiative forcing, surface temperature, wind speed, humidity, cloud water path, and precipitation are quantified on the basis of the simulation results. From several preliminary sensitivity runs, we found that meteorological parameters and aerosol radiative effects simulated with the improved BC absorption and 3D radiation parameterizations deviate substantially from simulation results using the conventional homogeneous/core-shell configuration for BC and the plane-parallel model for radiative transfer. Understanding the aerosol effects on regional climate change over megacities must therefore consider the complex shape and mixing state of aerosol aggregates and 3D radiative transfer effects over the city landscape.
Crossing the chasm: how to develop weather and climate models for next generation computers?
NASA Astrophysics Data System (ADS)
Lawrence, Bryan N.; Rezny, Michael; Budich, Reinhard; Bauer, Peter; Behrens, Jörg; Carter, Mick; Deconinck, Willem; Ford, Rupert; Maynard, Christopher; Mullerworth, Steven; Osuna, Carlos; Porter, Andrew; Serradell, Kim; Valcke, Sophie; Wedi, Nils; Wilson, Simon
2018-05-01
Weather and climate models are complex pieces of software which include many individual components, each of which is evolving under pressure to exploit advances in computing to enhance some combination of a range of possible improvements (higher spatio-temporal resolution, increased fidelity in terms of resolved processes, more quantification of uncertainty, etc.). However, after many years of a relatively stable computing environment with little choice in processing architecture or programming paradigm (basically X86 processors using MPI for parallelism), the existing menu of processor choices includes significant diversity, and more is on the horizon. This computational diversity, coupled with ever increasing software complexity, leads to the very real possibility that weather and climate modelling will arrive at a chasm which will separate scientific aspiration from our ability to develop and/or rapidly adapt codes to the available hardware. In this paper we review the hardware and software trends which are leading us towards this chasm, before describing current progress in addressing some of the tools which we may be able to use to bridge the chasm. This brief introduction to current tools and plans is followed by a discussion outlining the scientific requirements for quality model codes which have satisfactory performance and portability, while simultaneously supporting productive scientific evolution. We assert that the existing method of incremental model improvements employing small steps which adjust to the changing hardware environment is likely to be inadequate for crossing the chasm between aspiration and hardware at a satisfactory pace, in part because institutions cannot have all the relevant expertise in house. Instead, we outline a methodology based on large community efforts in engineering and standardisation, which will depend on identifying a taxonomy of key activities - perhaps based on existing efforts to develop domain-specific languages, identify common patterns in weather and climate codes, and develop community approaches to commonly needed tools and libraries - and then collaboratively building up those key components. Such a collaborative approach will depend on institutions, projects, and individuals adopting new interdependencies and ways of working.
Parallelization and automatic data distribution for nuclear reactor simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liebrock, L.M.
1997-07-01
Detailed attempts at realistic nuclear reactor simulations currently take many times real time to execute on high-performance workstations. Even the fastest sequential machine cannot run these simulations fast enough to ensure that the best corrective measure is used during a nuclear accident to prevent a minor malfunction from becoming a major catastrophe. Since sequential computers have nearly reached the speed-of-light barrier, these simulations will have to be run in parallel to make significant improvements in speed. In physical reactor plants, parallelism abounds: fluids flow, controls change, and reactions occur in parallel, with only adjacent components directly affecting each other. These do not occur in the sequentialized manner, with global instantaneous effects, that is often used in simulators. Development of parallel algorithms that more closely approximate the real-world operation of a reactor may, in addition to speeding up the simulations, actually improve the accuracy and reliability of the predictions generated. Three types of parallel architecture (shared memory machines, distributed memory multicomputers, and distributed networks) are briefly reviewed as targets for parallelization of nuclear reactor simulation. Various parallelization models (the loop-based model, shared memory model, functional model, data parallel model, and a combined functional and data parallel model) are discussed along with their advantages and disadvantages for nuclear reactor simulation. A variety of tools are introduced for each of the models. Emphasis is placed on the data parallel model as the primary focus for two-phase flow simulation. Tools to support data parallel programming for multiple-component applications, and special parallelization considerations, are also discussed.
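The data parallel model is easy to illustrate in serial form: split a field into blocks, give each block ghost cells holding copies of neighbouring values, and update the blocks independently. In the sketch below, plain NumPy stands in for the message passing a real multicomputer would use, and the fixed boundary treatment is a simplifying assumption:

    import numpy as np

    def diffuse_block(block_with_ghosts, alpha=0.1):
        """One explicit diffusion step on a block, using its ghost cells."""
        u = block_with_ghosts
        return u[1:-1] + alpha * (u[2:] - 2.0 * u[1:-1] + u[:-2])

    n, nblocks = 16, 4
    u = np.linspace(0.0, 1.0, n)
    blocks = np.split(u, nblocks)

    # "Halo exchange": each block gets one ghost cell from each neighbour
    # (fixed values at the domain ends stand in for boundary conditions)
    new_blocks = []
    for i, b in enumerate(blocks):
        left = blocks[i - 1][-1] if i > 0 else u[0]
        right = blocks[i + 1][0] if i < nblocks - 1 else u[-1]
        padded = np.concatenate(([left], b, [right]))
        new_blocks.append(diffuse_block(padded))   # independent: could run in parallel

    u_new = np.concatenate(new_blocks)
    print(u_new)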
NASA Astrophysics Data System (ADS)
Penner, Joyce E.; Andronova, Natalia; Oehmke, Robert C.; Brown, Jonathan; Stout, Quentin F.; Jablonowski, Christiane; van Leer, Bram; Powell, Kenneth G.; Herzog, Michael
2007-07-01
One of the most important advances needed in global climate models is the development of atmospheric General Circulation Models (GCMs) that can reliably treat convection. Such GCMs require high resolution in local convectively active regions, both in the horizontal and vertical directions. During previous research we have developed an Adaptive Mesh Refinement (AMR) dynamical core that can adapt its grid resolution horizontally. Our approach utilizes a finite volume numerical representation of the partial differential equations with floating Lagrangian vertical coordinates and requires resolving dynamical processes on small spatial scales. For the latter it uses a newly developed general-purpose library, which facilitates 3D block-structured AMR on spherical grids. The library manages neighbor information as the blocks adapt, and handles the parallel communication and load balancing, freeing the user to concentrate on the scientific modeling aspects of their code. In particular, this library defines and manages adaptive blocks on the sphere, provides user interfaces for interpolation routines and supports the communication and load-balancing aspects for parallel applications. We have successfully tested the library in a 2-D (longitude-latitude) implementation. During the past year, we have extended the library to treat adaptive mesh refinement in the vertical direction. Preliminary results are discussed. This research project is characterized by an interdisciplinary approach involving atmospheric science, computer science and mathematical/numerical aspects. The work is done in close collaboration between the Atmospheric Science, Computer Science and Aerospace Engineering Departments at the University of Michigan and NOAA GFDL.
ERIC Educational Resources Information Center
von Davier, Matthias
2016-01-01
This report presents results on a parallel implementation of the expectation-maximization (EM) algorithm for multidimensional latent variable models. The developments presented here are based on code that parallelizes both the E step and the M step of the parallel-E parallel-M algorithm. Examples presented in this report include item response…
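The parallel-E, parallel-M idea can be sketched on a toy one-dimensional Gaussian mixture (not the multidimensional latent variable models treated in the report): the E step is split over data chunks processed concurrently, and the partial sufficient statistics are then reduced for the M step:

    import numpy as np
    from concurrent.futures import ProcessPoolExecutor

    def e_step_chunk(args):
        """E step on one data chunk: responsibilities and partial sufficient statistics."""
        x, mu, sigma, pi = args
        # unnormalized responsibilities for each of K components
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
        r = dens / dens.sum(axis=1, keepdims=True)
        return r.sum(axis=0), r.T @ x, r.T @ (x ** 2)   # N_k, sum_k x, sum_k x^2

    def em_parallel(x, K=2, iters=50, nchunks=4):
        rng = np.random.default_rng(0)
        mu = rng.choice(x, K); sigma = np.full(K, x.std()); pi = np.full(K, 1.0 / K)
        chunks = np.array_split(x, nchunks)
        with ProcessPoolExecutor(max_workers=nchunks) as pool:
            for _ in range(iters):
                parts = list(pool.map(e_step_chunk, [(c, mu, sigma, pi) for c in chunks]))
                n_k = sum(p[0] for p in parts)    # reduce partial statistics
                s1 = sum(p[1] for p in parts)
                s2 = sum(p[2] for p in parts)
                mu = s1 / n_k                     # M step from reduced statistics
                sigma = np.sqrt(np.maximum(s2 / n_k - mu ** 2, 1e-12))
                pi = n_k / n_k.sum()
        return mu, sigma, pi

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        data = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 0.5, 500)])
        print(em_parallel(data))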
Effects of Climate on Co-evolution of Weathering Profiles and Hillscapes
NASA Astrophysics Data System (ADS)
Anderson, R. S.; Rajaram, H.; Anderson, S. P.
2017-12-01
Considerable debate revolves around the relative importance of rock type, tectonics, and climate in creating the architecture of the critical zone. It has recently been proposed that differences in the depths and patterns of weathering between landscapes in Colorado's Front Range and South Carolina's piedmont can be attributed to the state of stress in the rock, imposed by the magnitude and orientation of the regional stresses with respect to the ridgelines (St. Claire et al., 2016). We argue instead for the importance of climate and, in particular, in temperate regions, the amount of recharge. We employ numerical models of hillslope evolution between bounding erosional channels, in which the degree of rock weathering governs the rate of transformation of rock to soil. As the water table drapes between the stream channels, fresh rock is brought into the weathering zone at a rate governed by the rate of incision of the channels. We track the chemical weathering of rock, represented by the alteration of feldspar to clays, which in turn requires calculation of the concentration of reactive species in the water along hydrologic flow paths. We present results from analytic solutions to the flow field in which travel times can be efficiently assessed. Below the water table, flow paths are hyperbolic, taking on considerable lateral components as they veer toward the bounding channels that serve as drains to the hillslope. We find that if water is far from equilibrium with respect to weatherable minerals at the water table, as occurs in wet, slowly eroding landscapes, deep weathering can occur well below the water table, to levels approximating the base of the bounding channels. In dry climates, on the other hand, the weathering zone is limited to a shallow, surface-parallel layer. These models capture the essence of the observed differences in depth to fresh rock in both wet and dry climates without appeal to the state of stress in the rock.
A feasibility study on porting the community land model onto accelerators using OpenACC
Wang, Dali; Wu, Wei; Winkler, Frank; ...
2014-01-01
As environmental models (such as the Accelerated Climate Model for Energy (ACME), the Parallel Reactive Flow and Transport Model (PFLOTRAN), the Arctic Terrestrial Simulator (ATS), etc.) become more and more complicated, we face enormous challenges in porting those applications onto hybrid computing architectures. OpenACC appears to be a very promising technology; therefore, we have conducted a feasibility analysis of porting the Community Land Model (CLM), a terrestrial ecosystem model within the Community Earth System Model (CESM). Specifically, we used an automatic function testing platform to extract a small computing kernel out of CLM, applied this kernel within the actual CLM dataflow procedure, and investigated the strategy of data parallelization and the benefit of the data movement provided by the current implementation of OpenACC. Even though it is a non-intensive kernel, on a single 16-core computing node the performance (based on the actual computation time using one GPU) of the OpenACC implementation is 2.3 times faster than that of the OpenMP implementation using a single OpenMP thread, but 2.8 times slower than the OpenMP implementation using 16 threads. On multiple nodes, the MPI+OpenACC implementation demonstrated very good scalability on up to 128 GPUs on 128 computing nodes. This study also provides useful information for looking into the potential benefits of the "deep copy" capability and the "routine" feature of the OpenACC standard. In conclusion, we believe that our experience with the environmental model CLM can be beneficial to many other scientific research programs interested in porting their large-scale scientific codes onto high-end computers empowered by hybrid computing architectures.
Parallelization of the Physical-Space Statistical Analysis System (PSAS)
NASA Technical Reports Server (NTRS)
Larson, J. W.; Guo, J.; Lyster, P. M.
1999-01-01
Atmospheric data assimilation is a method of combining observations with model forecasts to produce a more accurate description of the atmosphere than the observations or forecast alone can provide. Data assimilation plays an increasingly important role in the study of climate and atmospheric chemistry. The NASA Data Assimilation Office (DAO) has developed the Goddard Earth Observing System Data Assimilation System (GEOS DAS) to create assimilated datasets. The core computational components of the GEOS DAS include the GEOS General Circulation Model (GCM) and the Physical-space Statistical Analysis System (PSAS). The need for timely validation of scientific enhancements to the data assimilation system poses computational demands that are best met by distributed parallel software. PSAS is implemented in Fortran 90 using object-based design principles. The analysis portions of the code solve two equations. The first of these is the "innovation" equation, which is solved on the unstructured observation grid using a preconditioned conjugate gradient (CG) method. The "analysis" equation is a transformation from the observation grid back to a structured grid, and is solved by a direct matrix-vector multiplication. Use of a factored-operator formulation reduces the computational complexity of both the CG solver and the matrix-vector multiplication, rendering the matrix-vector multiplications as a successive product of operators on a vector. Sparsity is introduced to these operators by partitioning the observations using an icosahedral decomposition scheme. PSAS builds a large (approx. 128 MB) run-time database of parameters used in the calculation of these operators. Implementing a message-passing parallel computing paradigm in an existing and still-developing computational system as complex as PSAS is nontrivial. One of the technical challenges is balancing the requirements for computational reproducibility with the need for high performance. The problem of computational reproducibility is well known in the parallel computing community. It is a requirement that the parallel code perform calculations in a fashion that will yield identical results on different configurations of processing elements on the same platform. In some cases this problem can be solved by sacrificing performance. Meeting this requirement and still achieving high performance is very difficult. Topics to be discussed include: the current PSAS design and parallelization strategy; reproducibility issues; load balance vs. database memory demands; and possible solutions to these problems.
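The innovation-equation solver described above is a preconditioned conjugate gradient method. Below is a generic PCG iteration in C, with function pointers standing in for the factored innovation operator and the preconditioner; both are assumptions for illustration, not the actual PSAS operators:

    #include <math.h>
    #include <stdlib.h>

    typedef void (*apply_op)(const double *x, double *y, int n);

    static double dot(const double *a, const double *b, int n)
    {
        double s = 0.0;
        for (int i = 0; i < n; i++) s += a[i] * b[i];
        return s;
    }

    /* Solve A x = b; M applies the preconditioner M^{-1}. Returns iterations. */
    int pcg(apply_op A, apply_op M, const double *b, double *x,
            int n, int maxit, double tol)
    {
        double *r = malloc(n * sizeof *r), *z = malloc(n * sizeof *z);
        double *p = malloc(n * sizeof *p), *q = malloc(n * sizeof *q);
        for (int i = 0; i < n; i++) { x[i] = 0.0; r[i] = b[i]; }
        M(r, z, n);
        for (int i = 0; i < n; i++) p[i] = z[i];
        double rz = dot(r, z, n);
        int k = 0;
        while (k < maxit && sqrt(dot(r, r, n)) > tol) {
            A(p, q, n);                               /* q = A p */
            double alpha = rz / dot(p, q, n);
            for (int i = 0; i < n; i++) { x[i] += alpha * p[i]; r[i] -= alpha * q[i]; }
            M(r, z, n);                               /* z = M^{-1} r */
            double rz_new = dot(r, z, n);
            double beta = rz_new / rz;
            for (int i = 0; i < n; i++) p[i] = z[i] + beta * p[i];
            rz = rz_new;
            k++;
        }
        free(r); free(z); free(p); free(q);
        return k;
    }

The factored-operator point from the abstract shows up in A: rather than forming the innovation matrix, A applies a chain of sparse operators to p in sequence, which is what keeps the per-iteration cost tractable.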
Climate Modeling with a Million CPUs
NASA Astrophysics Data System (ADS)
Tobis, M.; Jackson, C. S.
2010-12-01
Meteorological, oceanographic, and climatological applications have been at the forefront of scientific computing since its inception. The trend toward ever larger and more capable computing installations is unabated. However, much of the increase in capacity is accompanied by an increase in parallelism and a concomitant increase in complexity. An increase of at least four additional orders of magnitude in the computational power of scientific platforms is anticipated. It is unclear how individual climate simulations can continue to make effective use of the largest platforms. Conversion of existing community codes to higher resolution, or to more complex phenomenology, or both, presents daunting design and validation challenges. Our alternative approach is to use the expected resources to run very large ensembles of simulations of modest size, rather than to await the emergence of very large simulations. We are already doing this in exploring the parameter space of existing models using the Multiple Very Fast Simulated Annealing algorithm, which was developed for seismic imaging. Our experiments have the dual intentions of tuning the model and identifying ranges of parameter uncertainty. Our approach is less strongly constrained by the dimensionality of the parameter space than are competing methods. Nevertheless, scaling up remains costly. Much could be achieved by increasing the dimensionality of the search and adding complexity to the search algorithms. Such ensemble approaches scale naturally to very large platforms. Extensions of the approach are anticipated. For example, structurally different models can be tuned to comparable effectiveness. This can provide an objective test for which there is no realistic precedent with smaller computations. We find ourselves inventing new code to manage our ensembles. Component computations involve tens to hundreds of CPUs and tens to hundreds of hours. The results of these moderately large parallel jobs influence the scheduling of subsequent jobs, and complex scheduling algorithms may easily be contemplated. The operating system concept of a "thread" re-emerges at a very coarse level, where each thread manages atomic computations of thousands of CPU-hours. That is, rather than multiple threads operating on a processor, at this level, multiple processors operate within a single thread. In collaboration with the Texas Advanced Computing Center, we are developing a software library at the system level, which should facilitate the development of computations involving complex strategies that invoke large numbers of moderately large multi-processor jobs. While this may have applications in other sciences, our key intent is to better characterize the coupled behavior of a very large set of climate model configurations.
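A rough sketch of the coarse-grained "thread" concept in C, with a hypothetical climate_member executable and invented chain/step counts; each POSIX thread shepherds a chain of moderately large MPI jobs, and in a real system the result of each job would steer the parameters of the next submission (e.g. a simulated-annealing move):

    #include <pthread.h>
    #include <stdio.h>
    #include <stdlib.h>

    #define N_CHAINS 4          /* coarse threads; counts invented here */
    #define STEPS_PER_CHAIN 3

    static void *shepherd(void *arg)
    {
        long id = (long)arg;
        char cmd[256];
        for (int step = 0; step < STEPS_PER_CHAIN; step++) {
            /* Each job is an atomic computation of many CPU-hours;
             * its output would set the parameters of the next step. */
            snprintf(cmd, sizeof cmd,
                     "mpirun -np 128 ./climate_member --chain %ld --step %d",
                     id, step);
            if (system(cmd) != 0)
                fprintf(stderr, "chain %ld: step %d failed\n", id, step);
        }
        return NULL;
    }

    int main(void)
    {
        pthread_t tid[N_CHAINS];
        for (long i = 0; i < N_CHAINS; i++)
            pthread_create(&tid[i], NULL, shepherd, (void *)i);
        for (int i = 0; i < N_CHAINS; i++)
            pthread_join(tid[i], NULL);
        return 0;
    }

Note the inversion the abstract describes: the parallelism inside each job (128 MPI ranks in this sketch) is much wider than the handful of coarse threads scheduling those jobs.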
NASA Astrophysics Data System (ADS)
Pietroń, Jan; Jarsjö, Jerker
2014-05-01
Ongoing changes in the Central Asian climate, including increasing temperatures, can influence the hydrological regimes of rivers and the waterborne transport of sediments. Changes in the latter, especially in combination with adverse human activities, may severely impact water quality and aquatic ecosystems. However, waterborne transport of sediments is a result of complex processes and varies considerably between, and even within, river systems. There is therefore a need to increase our general knowledge about sediment transport under changing climate conditions. The Tuul River, the case site of this study, is located in the upper part of the basin of the Selenga River, which is the main tributary to Lake Baikal, a UNESCO World Heritage Site. Like many other rivers located in the steppes of Northern Mongolia, the Tuul River is characterized by a hydrological regime that is not disturbed by engineered structures such as reservoirs and dams. However, the water quality of the downstream Tuul River is increasingly affected by adverse human activities, including placer gold mining. The largest contribution to the annual river discharge occurs during the relatively warm period in May to August. Typically, there are numerous rainfall events during this period that cause considerable river flow peaks. Parallel work has furthermore shown that, due to climate change, the daily variability of discharge and the number of peak flow events in the Tuul River Basin have increased during the past 60 years. This trend is expected to continue. Here we aim to increase our understanding of future sediment transport patterns in the Tuul River, specifically considering the scenario that peak flow events may become more frequent due to climate change. We use a one-dimensional sediment transport model of the downstream reach of the river to simulate natural patterns of sediment transport for a recent hydrological year. In general, the results show that sediment transport varies considerably spatially and temporally. Peak flow events during the warm period contribute largely to the total annual transport of sediments and also to the erosion of stored bed material. These results suggest that if the number of peak flow events increases further due to climate change, there will be a significant increase in the annual sediment load and consequently in the load of contaminants that are attached to the sediments, in particular downstream of mining sites. The present results are furthermore consistent with parallel studies on sediment transport and climate change showing that increased water discharges and frequencies of rainfall/flow events can lead to enhanced erosion processes. Furthermore, in addition to climate change effects, human activities can change sediment loads in rivers to an even greater extent, as pointed out in several studies. Thus, the management of Central Asian rivers such as the Tuul, and of their ecosystems, can be expected to face several different challenges in the future.
Trends and Correlation Estimation in Climate Sciences: Effects of Timescale Errors
NASA Astrophysics Data System (ADS)
Mudelsee, M.; Bermejo, M. A.; Bickert, T.; Chirila, D.; Fohlmeister, J.; Köhler, P.; Lohmann, G.; Olafsdottir, K.; Scholz, D.
2012-12-01
Trend describes time-dependence in the first moment of a stochastic process, and correlation measures the linear relation between two random variables. Accurately estimating the trend and correlation, including uncertainties, from climate time series data in the uni- and bivariate domain, respectively, allows first-order insights into the geophysical process that generated the data. Timescale errors, ubiquitous in paleoclimatology, where archives are sampled for proxy measurements and dated, pose a problem for the estimation. Statistical science and the various applied research fields, including geophysics, have almost completely ignored this problem due to its theoretical near-intractability. However, computational adaptations or replacements of traditional error formulas have become technically feasible. This contribution gives a short overview of such an adaptation package: bootstrap resampling combined with parametric timescale simulation. We study linear regression, parametric change-point models and nonparametric smoothing for trend estimation. We introduce pairwise-moving block bootstrap resampling for correlation estimation. Both methods share robustness against autocorrelation and non-Gaussian distributional shape. We briefly touch on computing-intensive calibration of bootstrap confidence intervals and consider options to parallelize the related computer code. The following examples serve not only to illustrate the methods but also to tell their own climate stories: (1) the search for climate drivers of the Agulhas Current on recent timescales, (2) the comparison of three stalagmite-based proxy series of regional, western German climate over the later part of the Holocene, and (3) trends and transitions in benthic oxygen isotope time series from the Cenozoic. Financial support by Deutsche Forschungsgemeinschaft (FOR 668, FOR 1070, MU 1595/4-1) and the European Commission (MC ITN 238512, MC ITN 289447) is acknowledged.
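A sketch of the pairwise-moving block bootstrap for correlation in C, under the simplifying assumptions of a fixed block length and uniform random block starts (the published method additionally handles timescale simulation and calibrated confidence intervals):

    #include <math.h>
    #include <stdlib.h>

    /* Pearson correlation of two series of length n. */
    static double pearson(const double *x, const double *y, int n)
    {
        double mx = 0, my = 0, sxy = 0, sxx = 0, syy = 0;
        for (int i = 0; i < n; i++) { mx += x[i]; my += y[i]; }
        mx /= n; my /= n;
        for (int i = 0; i < n; i++) {
            sxy += (x[i] - mx) * (y[i] - my);
            sxx += (x[i] - mx) * (x[i] - mx);
            syy += (y[i] - my) * (y[i] - my);
        }
        return sxy / sqrt(sxx * syy);
    }

    /* Bootstrap standard error of the correlation: resample blocks of
     * length L, moving (x, y) pairs together so both the autocorrelation
     * and the cross-correlation structure are preserved. */
    double block_bootstrap_se(const double *x, const double *y,
                              int n, int L, int B)
    {
        double *xb = malloc(n * sizeof *xb), *yb = malloc(n * sizeof *yb);
        double *r = malloc(B * sizeof *r), mean = 0, var = 0;
        for (int b = 0; b < B; b++) {
            for (int i = 0; i < n; i += L) {
                int start = rand() % (n - L + 1);   /* random block start */
                for (int j = 0; j < L && i + j < n; j++) {
                    xb[i + j] = x[start + j];       /* pairs move together */
                    yb[i + j] = y[start + j];
                }
            }
            r[b] = pearson(xb, yb, n);
            mean += r[b];
        }
        mean /= B;
        for (int b = 0; b < B; b++) var += (r[b] - mean) * (r[b] - mean);
        free(xb); free(yb); free(r);
        return sqrt(var / (B - 1));
    }

Because each bootstrap replicate is independent, the loop over b parallelizes trivially, which is one of the parallelization options the contribution mentions.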
NASA Astrophysics Data System (ADS)
Nijssen, B.; Hamman, J.; Bohn, T. J.
2015-12-01
The Variable Infiltration Capacity (VIC) model is a macro-scale semi-distributed hydrologic model. VIC development began in the early 1990s and it has been used extensively, applied from basin to global scales. VIC has been applied in many use cases, including the construction of hydrologic data sets, trend analysis, data evaluation and assimilation, forecasting, coupled climate modeling, and climate change impact analysis. Ongoing applications of the VIC model include the University of Washington's drought monitor and forecast systems, and NASA's land data assimilation systems. The development of VIC version 5.0 focused on reconfiguring the legacy VIC source code to support a wider range of modern modeling applications. The VIC source code has been moved to a public GitHub repository to encourage participation by the model development community at large. The reconfiguration has separated the physical core of the model from the driver, which is responsible for memory allocation, pre- and post-processing, and I/O. VIC 5.0 includes four drivers that use the same physical model core: classic, image, CESM, and Python. The classic driver supports legacy VIC configurations and runs in the traditional time-before-space configuration. The image driver includes a space-before-time configuration, netCDF I/O, and uses MPI for parallel processing. This configuration facilitates the direct coupling of streamflow routing, reservoir, and irrigation processes within VIC. The image driver is the foundation of the CESM driver, which couples VIC to CESM's CPL7 and a prognostic atmosphere. Finally, we have added a Python driver that provides access to the functions and datatypes of VIC's physical core from a Python interface. This presentation demonstrates how reconfiguring legacy source code extends the life and applicability of a research model.
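The difference between the classic and image drivers comes down to loop ordering; a schematic sketch in C, with step_cell a hypothetical stand-in for one call into VIC's physical core:

    /* Hypothetical stand-in for one call into VIC's physical core. */
    static void step_cell(int cell, int t)
    {
        (void)cell; (void)t;   /* placeholder physics */
    }

    /* Classic driver: time-before-space; each cell is run through all
     * time steps before the next cell starts. */
    void run_time_before_space(int n_cells, int n_steps)
    {
        for (int cell = 0; cell < n_cells; cell++)
            for (int t = 0; t < n_steps; t++)
                step_cell(cell, t);
    }

    /* Image driver: space-before-time; all cells advance one step at a
     * time, which is what permits coupling routing, reservoir, and
     * irrigation processes between steps, plus MPI domain decomposition. */
    void run_space_before_time(int n_cells, int n_steps)
    {
        for (int t = 0; t < n_steps; t++)
            for (int cell = 0; cell < n_cells; cell++)
                step_cell(cell, t);
    }

In the space-before-time ordering, the full spatial field exists at every time step, so a router or coupler can consume it before the next step begins; in the time-before-space ordering it never does.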
NASA Astrophysics Data System (ADS)
Anantharaj, Valentine; Norman, Matthew; Evans, Katherine; Taylor, Mark; Worley, Patrick; Hack, James; Mayer, Benjamin
2014-05-01
During 2013, high-resolution climate model simulations accounted for over 100 million "core hours" using Titan at the Oak Ridge Leadership Computing Facility (OLCF). The suite of climate modeling experiments, primarily using the Community Earth System Model (CESM) at nearly 0.25 degree horizontal resolution, generated over a petabyte of data and nearly 100,000 files, ranging in size from 20 MB to over 100 GB. Effective utilization of leadership class resources requires careful planning and preparation. The application software, such as CESM, needs to be ported, optimized and benchmarked for the target platform in order to meet the computational readiness requirements. The model configuration needs to be "tuned and balanced" for the experiments. This can be a complicated and resource intensive process, especially for high-resolution configurations using complex physics. The volume of I/O also increases with resolution, and new strategies may be required to manage I/O, especially for large checkpoint and restart files that may require more frequent output for resiliency. It is also essential to monitor the application performance during the course of the simulation exercises. Finally, the large volume of data needs to be analyzed to derive the scientific results, and appropriate data and information delivered to the stakeholders. Titan is currently the largest supercomputer available for open science. The computational resources, in terms of "titan core hours", are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) and ASCR Leadership Computing Challenge (ALCC) programs, both sponsored by the U.S. Department of Energy (DOE) Office of Science. Titan is a Cray XK7 system capable of a theoretical peak performance of over 27 PFlop/s; it consists of 18,688 compute nodes, with a NVIDIA Kepler K20 GPU and a 16-core AMD Opteron CPU in every node, for a total of 299,008 Opteron cores and 18,688 GPUs offering a cumulative 560,640 equivalent cores. Scientific applications, such as CESM, are also required to demonstrate a "computational readiness capability" to efficiently scale across and utilize 20% of the entire system. The 0.25 deg configuration of the spectral element dynamical core of the Community Atmosphere Model (CAM-SE), the atmospheric component of CESM, has been demonstrated to scale efficiently across more than 5,000 nodes (80,000 CPU cores) on Titan. The tracer transport routines of CAM-SE have also been ported to take advantage of the hybrid many-core architecture of Titan using GPUs [see EGU2014-4233], yielding over 2X speedup when transporting over 100 tracers. The high throughput I/O in CESM, based on the Parallel IO Library (PIO), is being further augmented to support even higher resolutions and enhance resiliency. The application performance of the individual runs is archived in a database and routinely analyzed to identify and rectify performance degradation during the course of the experiments. The various resources available at the OLCF now support a scientific workflow to facilitate high-resolution climate modelling. A high-speed center-wide parallel file system, called ATLAS, capable of 1 TB/s, is available on Titan as well as on the clusters used for analysis (Rhea) and visualization (Lens/EVEREST). Long-term archive is facilitated by the HPSS storage system. The Earth System Grid (ESG), featuring search & discovery, is also used to deliver data.
The end-to-end workflow allows OLCF users to efficiently share data and publish results in a timely manner.
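The high-throughput I/O mentioned above rests on collective parallel writes; PIO typically sits over backends such as PnetCDF. A minimal PnetCDF sketch of the underlying pattern, with file name, sizes, and decomposition invented for illustration:

    #include <mpi.h>
    #include <pnetcdf.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        int rank, nprocs, ncid, dimid, varid;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        MPI_Offset nlocal = 1000;               /* points per rank */
        MPI_Offset nglobal = nlocal * nprocs;

        /* All ranks participate in file creation and metadata definition. */
        ncmpi_create(MPI_COMM_WORLD, "field.nc", NC_CLOBBER, MPI_INFO_NULL, &ncid);
        ncmpi_def_dim(ncid, "ncol", nglobal, &dimid);
        ncmpi_def_var(ncid, "T", NC_DOUBLE, 1, &dimid, &varid);
        ncmpi_enddef(ncid);

        double *buf = malloc(nlocal * sizeof *buf);
        for (MPI_Offset i = 0; i < nlocal; i++) buf[i] = (double)rank;

        /* Each rank writes its own slab in a single collective call. */
        MPI_Offset start = rank * nlocal, count = nlocal;
        ncmpi_put_vara_double_all(ncid, varid, &start, &count, buf);

        ncmpi_close(ncid);
        free(buf);
        MPI_Finalize();
        return 0;
    }

The same pattern extends to the large checkpoint and restart files discussed above: the collective call lets the MPI-IO layer aggregate the per-rank slabs into large, well-aligned file system requests.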
Modeling the fracture of ice sheets on parallel computers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waisman, Haim; Bell, Robin; Keyes, David
2010-03-01
The objective of this project is to investigate the complex fracture of ice and understand its role within larger ice sheet simulations and global climate change. At the present time, ice fracture is not explicitly considered within ice sheet models, due in part to the large computational costs associated with the accurate modeling of this complex phenomenon. However, fracture not only plays an extremely important role in regional behavior but also influences ice dynamics over much larger zones in ways that are currently not well understood. Dramatic illustrations of fracture-induced phenomena most notably include the recent collapse of ice shelves in Antarctica (e.g. the partial collapse of the Wilkins shelf in March of 2008 and the diminishing extent of the Larsen B shelf from 1998 to 2002). Other fracture examples include ice calving (fracture of icebergs), which is presently approximated in simplistic ways within ice sheet models, and the draining of supraglacial lakes through a complex network of cracks, a so-called ice sheet plumbing system, that is believed to cause accelerated ice sheet flows due essentially to lubrication of the contact surface with the ground. These dramatic changes are emblematic of the ongoing change in the Earth's polar regions and highlight the important role of fracturing ice. To model ice fracture, a simulation capability will be designed centered on extended finite elements and solved by specialized multigrid methods on parallel computers. In addition, appropriate dynamic load balancing techniques will be employed to ensure an approximately equal amount of work for each processor.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Anthony P; Hanson, Paul J; DeKauwe, Martin G
2014-01-01
Free Air CO2 Enrichment (FACE) experiments provide a remarkable wealth of data to test the sensitivities of terrestrial ecosystem models (TEMs). In this study, a broad set of 11 TEMs was compared to 22 years of data from two contrasting FACE experiments in temperate forests of the southeastern US: the evergreen Duke Forest and the deciduous Oak Ridge forest. We evaluated the models' ability to reproduce observed net primary productivity (NPP), transpiration and leaf area index (LAI) in ambient CO2 treatments. Encouragingly, many models simulated annual NPP and transpiration within observed uncertainty. Daily transpiration model errors were often related to errors in leaf area phenology and peak LAI. Our analysis demonstrates that the simulation of LAI often drives the simulation of transpiration, and hence there is a need to adopt the most appropriate hypothesis-driven methods to simulate and predict LAI. Of the three competing hypotheses for determining peak LAI ((1) optimisation to maximise carbon export, (2) increasing SLA with canopy depth, and (3) the pipe model), the pipe model produced LAI closest to the observations. Modelled phenology was either prescribed or based on broader empirical calibrations to climate. In some cases, simulation accuracy was achieved through compensating biases in component variables. For example, NPP accuracy was sometimes achieved with counter-balancing biases in nitrogen use efficiency and nitrogen uptake. Combined analysis of parallel measurements aids the identification of offsetting biases; without it, over-confidence in model abilities to predict ecosystem function may emerge, potentially leading to erroneous predictions of change under future climates.
[Adaptability of APSIM model in Southwestern China: A case study of winter wheat in Chongqing City].
Dai, Tong; Wang, Jing; He, Di; Zhang, Jian-ping; Wang, Na
2015-04-01
Field experimental data for winter wheat and parallel daily meteorological data at four typical stations in Chongqing City were used to calibrate and validate the APSIM-wheat model and determine the genetic parameters for 12 varieties of winter wheat. The results showed that there was good agreement between the simulated and observed growth periods from sowing to emergence, flowering and maturity of wheat. Root mean squared errors (RMSEs) between simulated and observed emergence, flowering and maturity were 0-3, 1-8, and 0-8 d, respectively. Normalized root mean squared errors (NRMSEs) between simulated and observed above-ground biomass for the 12 study varieties were less than 30%. NRMSEs between simulated and observed yields for 10 of the 12 study varieties were less than 30%. The APSIM-wheat model performed well in simulating phenology, aboveground biomass and yield of winter wheat in Chongqing City, and could provide foundational support for assessing the impact of climate change on wheat production in the study area based on the model.
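For reference, the two error measures used above can be computed as below; normalizing by the observed mean is an assumption here, since NRMSE conventions vary:

    #include <math.h>

    /* Root mean squared error between simulated and observed values. */
    double rmse(const double *sim, const double *obs, int n)
    {
        double s = 0.0;
        for (int i = 0; i < n; i++)
            s += (sim[i] - obs[i]) * (sim[i] - obs[i]);
        return sqrt(s / n);
    }

    /* Normalized RMSE in percent; < 30% is the threshold used above. */
    double nrmse_percent(const double *sim, const double *obs, int n)
    {
        double mean = 0.0;
        for (int i = 0; i < n; i++) mean += obs[i];
        mean /= n;
        return 100.0 * rmse(sim, obs, n) / mean;
    }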
NASA Astrophysics Data System (ADS)
Vieira, V. M. N. C. S.; Sahlée, E.; Jurus, P.; Clementi, E.; Pettersson, H.; Mateus, M.
2015-09-01
Earth-System and regional models, forecasting climate change and its impacts, simulate atmosphere-ocean gas exchanges using classical yet too-simple generalizations that rely on wind speed as the sole mediator while neglecting factors such as sea-surface agitation, atmospheric stability, current drag with the bottom, rain and surfactants. These factors have been proved fundamental for accurate estimates, particularly in the coastal ocean, where a significant part of the atmosphere-ocean greenhouse gas exchanges occurs. We include several of these factors in a customizable algorithm proposed as the basis for novel couplers of the atmospheric and oceanographic model components. We tested performance with measured and simulated data from the European coastal ocean, and found that our algorithm forecasts greenhouse gas exchanges largely different from those forecast by the generalization currently in use. Our algorithm allows vectorized calculation and parallel processing, improving computational speed roughly 12× on a single CPU core, an essential feature for Earth-System model applications.
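For contrast, here is a sketch of the kind of wind-speed-only generalization the authors argue is too simple: a quadratic transfer velocity (the 0.251 coefficient follows common Wanninkhof-type fits and is an assumption here, as are the variable names) applied over a flat array of grid cells; this loop shape is also what makes vectorization and parallel processing straightforward:

    #include <math.h>

    /* Wind-speed-only air-sea CO2 flux over n grid cells. */
    void co2_flux_wind_only(const double *u10,   /* 10-m wind speed, m/s  */
                            const double *sc,    /* Schmidt number        */
                            const double *dpco2, /* pCO2(sea) - pCO2(air) */
                            const double *sol,   /* gas solubility        */
                            double *flux, int n)
    {
        #pragma omp parallel for
        for (int i = 0; i < n; i++) {
            /* transfer velocity, cm/h, scaled to Schmidt number 660 */
            double k = 0.251 * u10[i] * u10[i] * sqrt(660.0 / sc[i]);
            flux[i] = k * sol[i] * dpco2[i];
        }
    }

The customizable algorithm described above would replace the single k(U) line with terms for agitation, stability, bottom drag, rain and surfactants, while keeping the same cell-independent loop structure.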
Hidalgo-Galiana, A; Monge, M; Biron, D G; Canals, F; Ribera, I; Cieslak, A
2016-01-01
Physiological changes associated with evolutionary and ecological processes such as diversification, range expansion or speciation are still incompletely understood, especially for non-model species. Here we study differences in protein expression in response to temperature in a western Mediterranean diving beetle species complex, using two-dimensional differential gel electrophoresis with one Moroccan and one Iberian population each of Agabus ramblae and Agabus brunneus. We identified proteins with significant expression differences after thermal treatments comparing them with a reference EST library generated from one of the species of the complex (A. ramblae). The colonisation during the Middle Pleistocene of the Iberian peninsula by A. ramblae, where maximum temperatures and seasonality are lower than in the ancestral north African range, was associated with changes in the response to 27 °C in proteins related to energy metabolism. The subsequent speciation of A. brunneus from within populations of Iberian A. ramblae was associated with changes in the expression of several stress-related proteins (mostly chaperons) when exposed to 4 °C. These changes are in agreement with the known tolerance to lower temperatures of A. brunneus, which occupies a larger geographical area with a wider range of climatic conditions. In both cases, protein expression changes paralleled the evolution of thermal tolerance and the climatic conditions experienced by the species. However, although the colonisation of the Iberian peninsula did not result in morphological change, the speciation process of A. brunneus within Iberia involved genetic isolation and substantial differences in male genitalia and body size and shape. PMID:26328758
Perceived climate in physical activity settings.
Gill, Diane L; Morrow, Ronald G; Collins, Karen E; Lucey, Allison B; Schultz, Allison M
2010-01-01
This study focused on the perceived climate for LGBT youth and other minority groups in physical activity settings. A large sample of undergraduates and a selected sample including student teachers/interns and a campus Pride group completed a school climate survey and rated the climate in three physical activity settings (physical education, organized sport, exercise). Overall, school climate survey results paralleled the results with national samples, revealing high levels of homophobic remarks and low levels of intervention. Physical activity climate ratings were mid-range, but a multivariate analysis of variance (MANOVA) revealed clear differences, with all settings rated more inclusive for racial/ethnic minorities and most exclusive for gays/lesbians and people with disabilities. The results are in line with national surveys and research suggesting sexual orientation and physical characteristics are often the basis for harassment and exclusion in sport and physical activity. The current results also indicate that future physical activity professionals recognize exclusion, suggesting they could benefit from programs that move beyond awareness to skills and strategies for creating more inclusive programs.
The Uniqueness of Similarities: Parallels of Milton H. Erickson and Carl Rogers.
ERIC Educational Resources Information Center
Gunnison, Hugh
1985-01-01
Describes the influence of the philosophy and values of Carl Rogers and Milton Erickson on the counseling profession. Reviews the person-centered approach, direction, therapeutic climate, and the influence of early experiences. Includes a reaction by Carl Rogers. (JAC)
NASA Astrophysics Data System (ADS)
Yuni, Juniarti
2017-04-01
Gambir (Uncaria gambir Roxb.) is a specific export commodity of West Sumatra. The area planted with gambir increases by about 8% per year in West Sumatra, and until 1998 its production increased by about 17% per year. However, since 1999 its area has not paralleled its production. In the last five years, the volume of exports increased by about 82.81%, while its export value reached US$ 2.5/kg. Therefore, this commodity has a strategic value for the city's earnings. One of the suspected causes is the use of unsuitable land. The aim of this research is to measure levels of land suitability in the buffer zone of TNKS (Kerinci-Seblat National Park) in order to identify areas suitable for growing gambir. To evaluate land suitability, a quantitative model from FAO is used, combining environmental data, climate and land conditions (physical and chemical characteristics of the land), together with an estimation of the Radiation Thermal Production Potential (RPP). Each input is rated individually and entered into several mathematical formulas, after which the potential production of the land based on climate (Climate Production Potential, CPP) is obtained quantitatively. By changing certain variants of this model program, the yield of the crop in another area can be predicted. By entering the real plant production data of a land, this model can predict the real plant production of the land (Land Production Potential, LPP). The Salido Saribulan area falls into land suitability class S3f, which is suitable for growing gambir with a limiting factor of nutrient retention. The potential gambir production at Salido Saribulan is 5 ton/ha, which is higher than the actual gambir production.
Stewart, I.T.; Cayan, D.R.; Dettinger, M.D.
2004-01-01
Spring snowmelt is the most important contribution to the flow of many rivers in western North America. If climate changes, this contribution may change. A shift in the timing of springtime snowmelt towards earlier in the year has already been observed during 1948-2000 in many western rivers. Streamflow timing changes for the 1995-2099 period are projected using regression relations between observed streamflow-timing responses in each river, measured by the temporal centroid of streamflow (CT) each year, and local temperature (TI) and precipitation (PI) indices. Under 21st century warming trends predicted by the Parallel Climate Model (PCM) under business-as-usual greenhouse-gas emissions, streamflow timing trends across much of western North America suggest even earlier springtime snowmelt than observed to date. Projected CT changes are consistent with observed rates and directions of change during the past five decades, and are strongest in the Pacific Northwest, Sierra Nevada, and Rocky Mountains, where many rivers eventually run 30-40 days earlier. The modest PI changes projected by PCM yield minimal CT changes. The responses of CT to the simultaneous effects of projected TI and PI trends are dominated by the TI changes. Regression-based CT projections agree with those from physically based simulations of rivers in the Pacific Northwest and Sierra Nevada.
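In hedged form (the abstract does not give the exact functional form), the regression relation described above can be written per river i as

    \begin{equation}
      CT_{i}(t) = a_{i} + b_{i}\, TI_{i}(t) + c_{i}\, PI_{i}(t) + \varepsilon_{i}(t)
    \end{equation}

with the coefficients fitted to the 1948-2000 observations and the projections obtained by driving the fitted relation with PCM-simulated TI and PI trends; the reported dominance of the TI changes corresponds to the b_i term controlling the projected CT response.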
Warmer, deeper, and greener mixed layers in the North Atlantic subpolar gyre over the last 50 years.
Martinez, Elodie; Raitsos, Dionysios E; Antoine, David
2016-02-01
Shifts in global climate resonate in plankton dynamics, biogeochemical cycles, and marine food webs. We studied these linkages in the North Atlantic subpolar gyre (NASG), which hosts extensive phytoplankton blooms. We show that phytoplankton abundance has increased since the 1960s in parallel with a deepening of the mixed layer and a strengthening of winds and heat losses from the ocean, as driven by the low-frequency variability of the North Atlantic Oscillation (NAO). In parallel to these bottom-up processes, the top-down control of phytoplankton by copepods decreased over the same time period in the western NASG, following sea surface temperature changes typical of the Atlantic Multi-decadal Oscillation (AMO). While previous studies have hypothesized that climate-driven warming would facilitate seasonal stratification of surface waters and long-term phytoplankton increase in subpolar regions, here we show that deeper mixed layers in the NASG can be warmer and host a higher phytoplankton biomass. These results emphasize that different modes of climate variability regulate bottom-up (NAO control) and top-down (AMO control) forcing on phytoplankton at decadal timescales. As a consequence, different relationships between phytoplankton, zooplankton, and their physical environment emerge depending on the temporal scale of the observations (seasonal, interannual, or decadal). The prediction of phytoplankton response to climate change should be built upon what is learnt from observations at the longest timescales. © 2015 John Wiley & Sons Ltd.
Gutierrez, Kristie S.; LePrevost, Catherine E.
2016-01-01
Climate justice is a local, national, and global movement to protect at-risk populations who are disproportionately affected by climate change. The social context for this review is the Southeastern region of the United States, which is particularly susceptible to climate change because of the geography of the area and the vulnerabilities of the inhabiting populations. Negative human health effects on variable and vulnerable populations within the Southeast region due to changing climate are concerning, as health threats are not expected to produce parallel effects among all individuals. Vulnerable communities, such as communities of color, indigenous people, the geographically isolated, and those who are socioeconomically disadvantaged and already experiencing poor environmental quality, are least able to respond and adapt to climate change. Focusing on vulnerable populations in the Southeastern United States, this review is a synthesis of the recent (2010 to 2015) literature base on the health effects connected to climate change. This review also addresses local and regional mitigation and adaptation strategies for citizens and leaders to combat direct and indirect human health effects related to a changing climate. PMID:26848673
NASA Astrophysics Data System (ADS)
Hoffert, M.
2012-12-01
Climate/energy policy is gridlocked between (1) a geophysics perspective revealing long-term instabilities from continued energy consumption growth, of which the fossil fuel greenhouse effect is an early symptom; and (2) short-term, fossil-fuel-energized, rapid-economic-growth-driven policies, likely adaptive for hunter-gatherers competing for scarce food but climatically fatal to planetary-scale economies dependent on agriculture and "energy slaves." Incorporating social science into climate/energy policy formulation has focused on integrated assessment models (IAMs) exploring scenarios (parallel universes making different social choices) depicting the evolution of GDP, energy consumed, the energy technology mixture, land use, greenhouse gas and aerosol emissions, and radiative forcing. Representative concentration pathway (RCP) scenarios developed for the IPCC AR5 report imply 5-10 degrees C of warming from fossil fuel burning unless unprecedentedly fast decarbonization rates of ~7%/yr are implemented from 2020 to 2100. A massive transition to carbon neutrality by midcentury is needed to keep warming below 2 degrees C (Fig. 1). Fossil fuel greenhouse warming is leveraged by two orders of magnitude relative to heating from human energy consumption. Even if civilization successfully transitions to carbon neutrality in time, if energy use continues growing at 2%/year, fossil-fuel-greenhouse-level warming would be generated by heat rejection in only 200-300 years, underscoring that sustainability implies a steady-state planetary economy (Fig. 2). Evolutionary psychology and neuroeconomics are emergent disciplines that may illuminate the physical vs. social science paradigm conflict threatening human survivability.
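The 200-300 year figure admits a one-line check. If waste heat starts two orders of magnitude below greenhouse forcing and energy consumption grows at 2%/yr, the time to close the gap is

    \begin{equation}
      t = \frac{\ln 100}{\ln 1.02} \approx \frac{4.61}{0.0198} \approx 230\ \mathrm{yr}
    \end{equation}

consistent with the range quoted above.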
Parallel Evolution of Cold Tolerance within Drosophila melanogaster
Braun, Dylan T.; Lack, Justin B.
2017-01-01
Drosophila melanogaster originated in tropical Africa before expanding into strikingly different temperate climates in Eurasia and beyond. Here, we find elevated cold tolerance in three distinct geographic regions: beyond the well-studied non-African case, we show that populations from the highlands of Ethiopia and South Africa have significantly increased cold tolerance as well. We observe greater cold tolerance in outbred versus inbred flies, but only in populations with higher inversion frequencies. Each cold-adapted population shows lower inversion frequencies than a closely-related warm-adapted population, suggesting that inversion frequencies may decrease with altitude in addition to latitude. Using the FST-based “Population Branch Excess” statistic (PBE), we found only limited evidence for parallel genetic differentiation at the scale of ∼4 kb windows, specifically between Ethiopian and South African cold-adapted populations. And yet, when we looked for single nucleotide polymorphisms (SNPs) with codirectional frequency change in two or three cold-adapted populations, strong genomic enrichments were observed from all comparisons. These findings could reflect an important role for selection on standing genetic variation leading to “soft sweeps”. One SNP showed sufficient codirectional frequency change in all cold-adapted populations to achieve experiment-wide significance: an intronic variant in the synaptic gene Prosap. Another codirectional outlier SNP, at senseless-2, had a strong association with our cold trait measurements, but in the opposite direction as predicted. More generally, proteins involved in neurotransmission were enriched as potential targets of parallel adaptation. The ability to study cold tolerance evolution in a parallel framework will enhance this classic study system for climate adaptation. PMID:27777283
NASA Astrophysics Data System (ADS)
Mahmud, A.; Hixson, M.; Hu, J.; Zhao, Z.; Chen, S.-H.; Kleeman, M. J.
2010-11-01
The effect of global climate change on the annual average concentration of fine particulate matter (PM2.5) in California was studied using a climate-air quality modeling system composed of global through regional models. Output from the NCAR/DOE Parallel Climate Model (PCM) generated under the "business as usual" global emissions scenario was downscaled using the Weather Research and Forecasting (WRF) model followed by air quality simulations using the UCD/CIT airshed model. The system represents major atmospheric processes acting on gas and particle phase species including meteorological effects on emissions, advection, dispersion, chemical reaction rates, gas-particle conversion, and dry/wet deposition. The air quality simulations were carried out for the entire state of California with a resolution of 8 km for the years 2000-2006 (present climate with present emissions) and 2047-2053 (future climate with present emissions). Each of these 7-year analysis periods was analyzed using a total of 1008 simulated days to span a climatologically relevant time period with a practical computational burden. The 7-year windows were chosen to properly account for annual variability with the added benefit that the air quality predictions under the present climate could be compared to actual measurements. The climate-air quality modeling system successfully predicted the spatial pattern of present climate PM2.5 concentrations in California, but the absolute magnitudes of the annual average PM2.5 concentrations were under-predicted by ~4-39% in the major air basins. The majority of this under-prediction was caused by excess ventilation predicted by PCM-WRF that should be present to the same degree in the current and future time periods so that the net bias introduced into the comparison is minimized. Surface temperature, relative humidity (RH), rain rate, and wind speed were predicted to increase in the future climate while the ultraviolet (UV) radiation was predicted to decrease in major urban areas in the San Joaquin Valley (SJV) and South Coast Air Basin (SoCAB). These changes lead to a predicted decrease in PM2.5 mass concentrations of ~0.3-0.7 μg m-3 in the southern portion of the SJV and ~0.3-1.1 μg m-3 along coastal regions of California including the heavily populated San Francisco Bay Area and the SoCAB surrounding Los Angeles. Annual average PM2.5 concentrations were predicted to increase at certain locations within the SJV and the Sacramento Valley (SV) due to the effects of climate change, but a corresponding analysis of the annual variability showed that these predictions are not statistically significant (i.e. the choice of a different 7-year period could produce a different outcome for these regions). Overall, virtually no region in California outside of coastal + central Los Angeles, and a small region around the port of Oakland in the San Francisco Bay Area, experienced a statistically significant change in annual average PM2.5 concentrations due to the effects of climate change in the present study. The present study employs the highest spatial resolution (8 km) and the longest analysis windows (7 years) of any climate-air quality analysis conducted for California to date, but the results still have some degree of uncertainty. Most significantly, GCM calculations have inherent uncertainty that is not fully represented in the current study since a single GCM was used as the starting point for all calculations.
The PCM results used in the current study predicted greater wintertime increases in air temperature over the Pacific Ocean than over land, further motivating comparison to other GCM results. Ensembles of GCM results are usually employed to build confidence in climate calculations. The current results provide a first data point for climate-air quality analyses that simultaneously employ the fine spatial resolution and long time scales needed to capture the behavior of climate-PM2.5 interactions in California. Future downscaling studies should follow up with a full ensemble of GCMs as their starting point, and include aerosol feedback effects on local meteorology.
Climate Change: A "Green" Approach to Teaching Contemporary Germany
ERIC Educational Resources Information Center
Melin, Charlotte
2013-01-01
This article describes a newly designed upper division German language course, "Contemporary Germany: Food, Energy Politics," and two sampling methods of assessment for measuring parallel gains in German skills and sustainable development (SD) thinking. Second Language Acquisition (SLA) informed course design, key assignments, and…
NASA Astrophysics Data System (ADS)
Ward, E. J.; Thomas, R. Q.; Sun, G.; McNulty, S. G.; Domec, J. C.; Noormets, A.; King, J. S.
2015-12-01
Numerous studies, both experimental and observational, have been conducted over the past two decades in an attempt to understand how water and carbon cycling in terrestrial ecosystems may respond to changes in climatic conditions. These studies have produced a wealth of detailed data on key processes driving these cycles. In parallel, sophisticated models of these processes have been formulated to answer a variety of questions relevant to natural resource management. Recent advances in data assimilation techniques offer exciting new possibilities to combine this wealth of ecosystem data with process models of ecosystem function to improve prediction and quantify associated uncertainty. Using forests of the southeastern United States as our focus, we will specify how fine-scale physiological measurements (e.g., half-hourly sap flux) can be scaled up with quantified error for use in models of stand growth and hydrology. This approach represents an opportunity to leverage current and past research from experiments including throughfall displacement × fertilization (PINEMAP), irrigation × fertilization (SETRES), elevated CO2 (Duke and ORNL FACE) and a variety of observational studies in both conifer and hardwood forests throughout the region, using a common platform for data assimilation and prediction. As part of this discussion, we will address variation in dominant species, stand structure, site age, management practices, soils and climate that represents both challenges to the development of a common analytical approach and opportunities to address questions of interest to policy makers and natural resource managers.
Conversi, Alessandra; Fonda Umani, Serena; Peluso, Tiziana; Molinero, Juan Carlos; Santojanni, Alberto; Edwards, Martin
2010-05-19
Regime shifts are abrupt changes encompassing a multitude of physical properties and ecosystem variables, which lead to new regime conditions. Recent investigations focus on the changes in ecosystem diversity and functioning associated with such shifts. Of particular interest, because of the implication for climate drivers, are shifts that occur synchronously in separated basins. In this work we analyze and review long-term records of Mediterranean ecological and hydro-climate variables and find that all point to a synchronous change in the late 1980s. A quantitative synthesis of the literature (including observed oceanic data, models and satellite analyses) shows that these years mark a major change in Mediterranean hydrographic properties, surface circulation, and deep water convection (the Eastern Mediterranean Transient). We provide novel analyses that link local, regional and basin scale hydrological properties with two major indicators of large scale climate, the North Atlantic Oscillation index and the Northern Hemisphere Temperature index, suggesting that the Mediterranean shift is part of a large scale change in the Northern Hemisphere. We provide a simplified scheme of the different effects of climate vs. temperature on pelagic ecosystems. Our results show that the Mediterranean Sea underwent a major change at the end of the 1980s that encompassed atmospheric, hydrological, and ecological systems, for which it can be considered a regime shift. We further provide evidence that the local hydrography is linked to the larger scale, northern hemisphere climate. These results suggest that the shifts that affected the North, Baltic, Black and Mediterranean (this work) Seas at the end of the 1980s, which have so far been only partly associated, are likely linked as part of a northern hemisphere change. These findings bear wide implications for the development of climate change scenarios, as synchronous shifts may provide the key for distinguishing local (i.e., basin) anthropogenic drivers, such as eutrophication or fishing, from larger scale (hemispheric) climate drivers.
Enhancing GIS Capabilities for High Resolution Earth Science Grids
NASA Astrophysics Data System (ADS)
Koziol, B. W.; Oehmke, R.; Li, P.; O'Kuinghttons, R.; Theurich, G.; DeLuca, C.
2017-12-01
Applications for high performance GIS will continue to increase as Earth system models pursue more realistic representations of Earth system processes. Finer spatial resolution model input and output, unstructured or irregular modeling grids, data assimilation, and regional coordinate systems present novel challenges for GIS frameworks operating in the Earth system modeling domain. This presentation provides an overview of two GIS-driven applications that combine high performance software with big geospatial datasets to produce value-added tools for the modeling and geoscientific community. First, a large-scale interpolation experiment using National Hydrography Dataset (NHD) catchments, a high resolution rectilinear CONUS grid, and the Earth System Modeling Framework's (ESMF) conservative interpolation capability will be described. ESMF is a parallel, high-performance software toolkit that provides capabilities (e.g. interpolation) for building and coupling Earth science applications. ESMF is developed primarily by the NOAA Environmental Software Infrastructure and Interoperability (NESII) group. The purpose of this experiment was to test and demonstrate the utility of high performance scientific software in traditional GIS domains. Special attention will be paid to the nuanced requirements for dealing with high resolution, unstructured grids in scientific data formats. Second, a chunked interpolation application using ESMF and OpenClimateGIS (OCGIS) will demonstrate how spatial subsetting can virtually remove computing resource ceilings for very high spatial resolution interpolation operations. OCGIS is a NESII-developed Python software package designed for the geospatial manipulation of high-dimensional scientific datasets. An overview of the data processing workflow, why a chunked approach is required, and how the application could be adapted to meet operational requirements will be discussed here. In addition, we'll provide a general overview of OCGIS's parallel subsetting capabilities including challenges in the design and implementation of a scientific data subsetter.
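As a sketch of the interpolation at the core of both applications (ESMF also offers higher-order methods; this is the first-order conservative form), each destination cell value is the area-weighted average of the overlapping source cells:

    \begin{equation}
      F_{d} = \frac{1}{A_{d}} \sum_{s\,:\,s \cap d \neq \emptyset} F_{s}\, A_{s \cap d}
    \end{equation}

Because each destination cell depends only on the source cells it overlaps, chunking the destination grid, as in the OCGIS-based application, simply partitions the rows of the sparse weight matrix and bounds the memory required per chunk.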
NASA Astrophysics Data System (ADS)
Steenbrink, J.; Kloosterboer-van Hoeve, M. L.; Hilgen, F. J.
2003-03-01
Quaternary climate proxy records show compelling evidence for climate variability on time scales of a few thousand years. The causes for these millennial-scale or sub-Milankovitch cycles are still poorly understood, not least due to the complex feedback mechanisms of large ice sheets during the Quaternary. We present evidence of millennial-scale climate variability in Early Pliocene lacustrine sediments from the intramontane Ptolemais Basin in northwestern Greece. The sediments are well exposed in a series of open-pit lignite mines and exhibit a distinct millennial-scale sedimentary cyclicity of alternating lignites and lacustrine marl beds that resulted from precession-induced variations in climate. The higher-frequency, millennial-scale cyclicity is particularly prominent within the grey-coloured marl segment of individual cycles. A stratigraphic interval of ~115 ka, covering five precession-induced sedimentary cycles, was studied in nine parallel sections from two open-pit lignite mines located several km apart. High-resolution colour reflectance records were used to quantify the within-cycle variability and to determine its lateral continuity. Much of the within-cycle variability could be correlated between the parallel sections, even in fine detail, which suggests that these changes reflect basin-wide variations in environmental conditions related to (regional) climate fluctuations. Interbedded volcanic ash beds demonstrate the synchronicity of these fluctuations and spectral analysis of the reflectance time series shows a significant concentration of within-cycle variability at periods of ~11, ~5.5 and ~2 ka. The occurrence of variability at such time scales at times before the intensification of the Northern Hemisphere glaciation suggests that they cannot solely have resulted from internal ice-sheet dynamics. Possible candidates include harmonics or combination tones of the main orbital cycles, variations in solar output or periodic motions of the Earth and Moon.
Kahilainen, Aapo; van Nouhuys, Saskya; Schulz, Torsti; Saastamoinen, Marjo
2018-04-23
Habitat fragmentation and climate change are both prominent manifestations of global change, but there is little knowledge on the specific mechanisms of how climate change may modify the effects of habitat fragmentation, for example, by altering dynamics of spatially structured populations. The long-term viability of metapopulations is dependent on independent dynamics of local populations, because it mitigates fluctuations in the size of the metapopulation as a whole. Metapopulation viability will be compromised if climate change increases spatial synchrony in weather conditions associated with population growth rates. We studied a recently reported increase in metapopulation synchrony of the Glanville fritillary butterfly (Melitaea cinxia) in the Finnish archipelago, to see if it could be explained by an increase in synchrony of weather conditions. For this, we used 23 years of butterfly survey data together with monthly weather records for the same period. We first examined the associations between population growth rates within different regions of the metapopulation and weather conditions during different life-history stages of the butterfly. We then examined the association between the trends in the synchrony of the weather conditions and the synchrony of the butterfly metapopulation dynamics. We found that precipitation from spring to late summer is associated with the M. cinxia per capita growth rate, with early summer conditions being most important. We further found that the increase in metapopulation synchrony is paralleled by an increase in the synchrony of weather conditions. Alternative explanations for spatial synchrony, such as increased dispersal or trophic interactions with a specialist parasitoid, did not show parallel trends and are not supported. The climate-driven increase in M. cinxia metapopulation synchrony suggests that climate change can increase the extinction risk of spatially structured populations living in fragmented landscapes by altering their dynamics. © 2018 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Scholz, Denis; Hoffmann, Dirk L.; Spötl, Christoph; Hopcroft, Peter; Jochum, Klaus Peter; Richter, Detlev K.
2015-04-01
We present high-resolution δ18O, δ13C and trace element profiles for three stalagmites from western Germany, which grew during Marine Isotope Stage (MIS) 5. All stalagmites were precisely dated by MC-ICPMS 230Th/U-dating. Stalagmite HBSH-1 from Hüttenbläserschachthöhle grew between 130 and 80 ka and provides a climate record with decadal to centennial resolution. The other two stalagmites grew faster than HBSH-1, but their growth phases are shorter. Stalagmite HBSH-5 grew between 129 and 122 ka, whereas stalagmite BR-5 grew between 126 and 122 ka. The record of HBSH-1 shows four growth interruptions coinciding with Greenland Stadials (GS) 21, 22, 24, 25, and 26. This shows that stalagmite growth is a very sensitive proxy for cool and dry conditions in the northern hemisphere and enables us to precisely determine the timing and duration of the GS. We interpret stalagmite δ18O values as a proxy for supra-regional temperature changes in the North Atlantic realm, which is particularly evident from their close resemblance with the δ18O values of the NGRIP and NEEM ice core records. Stalagmite δ13C values primarily reflect changes in hydrological balance and (local) vegetation and are, thus, a proxy for terrestrial climate change in central Europe. The δ13C record shows three pronounced negative peaks during MIS 5, and their timing is in agreement with MIS 5e, 5c and 5a. This suggests generally warm and humid climate in central Europe during these phases. The evolution of the δ18O and δ13C values during the Eemian is not parallel. The δ18O values progressively increase from 130 ka, peak at 125 ka and subsequently show a gradual decrease. The δ13C values, in contrast, start to decrease at 123 ka, show a negative peak at 120 ka and an abrupt increase at 114 ka. This suggests that the Eemian sensu stricto lasted from 124 to 114 ka, in agreement with a marine record from the Norwegian Sea, and indicates a strong influence of the high northern latitudes on central European climate. We also compare our records with other MIS 5 climate records and with climate modelling simulations performed with the general circulation model FAMOUS.
NASA Astrophysics Data System (ADS)
Chen, X.; Huang, X.; Flanner, M.; Yang, P.; Feldman, D.; Kuo, C.
2016-12-01
As of today, most state-of-the-art GCMs still assume a blackbody surface in their longwave radiation scheme. Recent works by Chen et al. (2014) and Feldman et al. (2014) have suggested that the surface spectral emissivity can impact the simulated radiation budget and climate change in a discernible way, especially in high latitudes. Using a recently developed global emissivity database that covers both the far-IR and mid-IR, we incorporated the LW surface spectral emissivity into the radiation scheme of the CESM. Effort has been made to ensure a consistent treatment of the surface upward LW broadband flux in both the land module and the atmospheric module of the CESM, an important aspect overlooked by the previous studies. We then assess the impacts of the inclusion of surface spectral emissivity on the simulated mean-state climate and climate changes by carrying out two sets of parallel runs. The first pair of experiments uses the standard slab-ocean CESM v1.1.1 to run two experiments: a control run using forcings at year-2000 levels and a sensitivity run abruptly doubling the CO2. The second pair of experiments is identical to the first but uses the CESM that we have modified (surface emissivity is a prognostic variable in the second pair of experiments). The current-climate simulation results show that the Sahara desert region in the modified CESM has a warmer surface temperature than in the standard CESM by 2-3 K. Over the high-latitude regions, the modified CESM tends to have a colder surface temperature than the standard CESM by 1-2.5 K. As a result, the climatological sea ice coverage in the modified CESM is 8% greater than in the standard CESM in both polar regions. All these differences are statistically significant. As for the simulated climate change in response to a doubling of CO2, the Arctic region in the modified CESM warms consistently faster than in the standard CESM by 1-2 K, while the Antarctic region shows a non-uniform pattern of differences between the two models. Differences in the changes of sea ice coverage between the two models show a zonally uniform dipole pattern over both polar oceans. The reasons for such differences and their linkage with the change of surface spectral emissivity are further explained.
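The consistency requirement between the land and atmosphere modules follows from the standard form of the upward longwave flux under spectral emissivity (written here generically; the exact CESM discretization is not given in the abstract):

    \begin{equation}
      F^{\uparrow} = \int_{0}^{\infty} \left[ \epsilon_{\nu}\, \pi B_{\nu}(T_s)
        + \left(1 - \epsilon_{\nu}\right) F^{\downarrow}_{\nu} \right] \mathrm{d}\nu
    \end{equation}

This reduces to \sigma T_s^4 only in the blackbody limit \epsilon_{\nu} = 1; both modules must apply the same \epsilon_{\nu} in both the emission and reflected-downwelling terms, or the surface energy budget will not close.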
Lewandowsky, Stephan; Oberauer, Klaus; Gignac, Gilles E
2013-05-01
Although nearly all domain experts agree that carbon dioxide emissions are altering the world's climate, segments of the public remain unconvinced by the scientific evidence. Internet blogs have become a platform for denial of climate change, and bloggers have taken a prominent role in questioning climate science. We report a survey of climate-blog visitors to identify the variables underlying acceptance and rejection of climate science. Our findings parallel those of previous work and show that endorsement of free-market economics predicted rejection of climate science. Endorsement of free markets also predicted the rejection of other established scientific findings, such as the facts that HIV causes AIDS and that smoking causes lung cancer. We additionally show that, above and beyond endorsement of free markets, endorsement of a cluster of conspiracy theories (e.g., that the Federal Bureau of Investigation killed Martin Luther King, Jr.) predicted rejection of climate science as well as other scientific findings. Our results provide empirical support for previous suggestions that conspiratorial thinking contributes to the rejection of science. Acceptance of science, by contrast, was strongly associated with the perception of a consensus among scientists.
NASA Astrophysics Data System (ADS)
Pedro, J. B.; Martin, T.; Steig, E. J.; Jochum, M.; Park, W.; Rasmussen, S.
2015-12-01
Antarctic Isotope Maxima (AIM) are centennial-to-millennial scale warming events observed in Antarctic ice core records from the last glacial period and deglaciation. Mounting evidence links AIM events to parallel variations in atmospheric CO2, Southern Ocean (SO) sea surface temperatures and Antarctic Bottom Water production. According to the prevailing view, AIM events are forced from the North Atlantic by melt-water discharge from ice sheets suppressing the production of North Atlantic Deep Water and the associated northward heat transport in the Atlantic. However, observations and model studies increasingly suggest that melt-water fluxes have the wrong timing to be invoked as such a trigger. Here, drawing on results from the Kiel Climate Model, we present an alternative hypothesis in which AIM events are forced via internal oscillations in SO deep convection. The quasi-periodic timescale of deep-convection events is set by heat (buoyancy) accumulation at SO intermediate depths and stochastic variability in sea ice conditions and freshening at the surface. Massive heat release from the SO convective zone drives Antarctic and large-scale southern hemisphere warming via a two-stage process involving changes in the location of Southern Ocean fronts, in the strength and intensity of the westerlies, and in meridional ocean and atmospheric heat flux anomalies. The potential for AIM events to be driven by internal Southern Ocean processes, and the identification of time lags internal to the southern high latitudes, challenge conventional views of the North Atlantic as the pacemaker of millennial-scale climate variability.
NASA Astrophysics Data System (ADS)
Will, Andreas; Akhtar, Naveed; Brauch, Jennifer; Breil, Marcus; Davin, Edouard; Ho-Hagemann, Ha T. M.; Maisonnave, Eric; Thürkow, Markus; Weiher, Stefan
2017-04-01
We developed a coupled regional climate system model based on the CCLM regional climate model. Within this model system, using OASIS3-MCT as a coupler, CCLM can be coupled to two land surface models (the Community Land Model (CLM) and VEG3D), the NEMO-MED12 regional ocean model for the Mediterranean Sea, two ocean models for the North and Baltic seas (NEMO-NORDIC and TRIMNP+CICE) and the MPI-ESM Earth system model. We first present the different model components and the unified OASIS3-MCT interface, which handles all couplings in a consistent way, minimising the model source code modifications and defining the physical and numerical aspects of the couplings. We also address specific coupling issues such as the handling of different domains, multiple usage of the MCT library and the exchange of 3-D fields. We then analyse and compare the computational performance of the different couplings based on real-case simulations over Europe. The LUCIA tool implemented in OASIS3-MCT enables quantification of the contributions of the coupled components to the overall coupling cost. These individual contributions are (1) the cost of the model(s) coupled, (2) the direct cost of coupling, including horizontal interpolation and communication between the components, (3) load imbalance, (4) the cost of the different usage of processors by CCLM in coupled and stand-alone mode and (5) the residual cost, including among other things additional CCLM computations. Finally, a procedure for finding an optimum processor configuration for each of the couplings was developed, considering the time to solution, computing cost and parallel efficiency of the simulation. The optimum configurations are presented for sequential, concurrent and mixed (sequential+concurrent) coupling layouts. The procedure applied can be regarded as independent of the specific coupling layout and coupling details. We found that the direct cost of coupling, i.e. communication and horizontal interpolation, in OASIS3-MCT remains below 7 % of the CCLM stand-alone cost for all couplings investigated. This holds in particular for the exchange of 450 2-D fields between CCLM and MPI-ESM. We also identify remaining limitations in the coupling strategies and discuss possible future improvements of the computational efficiency.
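As a rough illustration of this decomposition, a minimal sketch with hypothetical timing inputs (not the actual LUCIA output format) showing how the non-model contributions can be derived from per-component wall-clock timings:

```python
def coupling_cost_breakdown(t_coupled, t_standalone, t_interp_comm, t_wait):
    """All arguments are wall-clock seconds for the same simulated period.
    t_coupled: total coupled run; t_standalone: sum of stand-alone model costs;
    t_interp_comm: measured interpolation + communication (direct cost);
    t_wait: time components spend waiting on each other (load imbalance)."""
    direct = t_interp_comm                                    # contribution (2)
    imbalance = t_wait                                        # contribution (3)
    residual = t_coupled - t_standalone - direct - imbalance  # (4) + (5)
    return {"models": t_standalone, "direct": direct,
            "imbalance": imbalance, "residual": residual,
            "direct_share_of_standalone": direct / t_standalone}

print(coupling_cost_breakdown(1180.0, 1000.0, 60.0, 80.0))    # direct share: 6 %
```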
Parallelization of fine-scale computation in Agile Multiscale Modelling Methodology
NASA Astrophysics Data System (ADS)
Macioł, Piotr; Michalik, Kazimierz
2016-10-01
Nowadays, multiscale modelling of material behavior is an extensively developed area. An important obstacle to its wide application is its high computational demand. Among other approaches, the parallelization of multiscale computations is a promising solution. Heterogeneous multiscale models are good candidates for parallelization, since communication between sub-models is limited. In this paper, the possibility of parallelizing multiscale models based on the Agile Multiscale Methodology framework is discussed. A sequential, FEM-based macroscopic model has been combined with concurrently computed fine-scale models, employing the MatCalc thermodynamic simulator. The main issues investigated in this work are (i) the speed-up of multiscale models, with special focus on fine-scale computations, and (ii) the decrease in the quality of computations enforced by parallel execution. Speed-up has been evaluated on the basis of Amdahl's law. The problem of the 'delay error', arising from the parallel execution of fine-scale sub-models controlled by the sequential macroscopic sub-model, is discussed. Some technical aspects of combining third-party commercial modelling software with an in-house multiscale framework and an MPI library are also discussed.
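For reference, a sketch of Amdahl's law in its standard form (p is the parallelizable fraction of the work, n the number of processors; the serial remainder bounds the attainable speed-up):

```latex
S(n) = \frac{1}{(1 - p) + \dfrac{p}{n}}, \qquad
\lim_{n \to \infty} S(n) = \frac{1}{1 - p}
```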
Lattice Boltzmann modeling of transport phenomena in fuel cells and flow batteries
NASA Astrophysics Data System (ADS)
Xu, Ao; Shyy, Wei; Zhao, Tianshou
2017-06-01
Fuel cells and flow batteries are promising technologies for addressing climate change and air pollution. Understanding the complex multiscale and multiphysics transport phenomena occurring in these electrochemical systems requires powerful numerical tools. Over the past decades, the lattice Boltzmann (LB) method has attracted broad interest in the computational fluid dynamics and numerical heat transfer communities, primarily because its kinetic nature makes it well suited to modeling complex multiphase transport phenomena. Equally important, the LB method maps well onto parallel computing because its update rule is local, which is essential for large-scale engineering applications. In this article, we review the LB method for gas-liquid two-phase flows, coupled fluid flow and mass transport in porous media, and particulate flows. Examples of applications in fuel cells and flow batteries are provided. Further developments of the LB method are also outlined.
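To illustrate the locality that makes LB attractive for parallel computing, here is a minimal single-phase D2Q9 BGK sketch (a generic textbook scheme, not taken from the article): collision touches only on-site data and streaming only nearest neighbours, so a domain decomposition needs just a one-cell halo exchange per step.

```python
import numpy as np

# D2Q9 lattice: velocity set c and quadrature weights w
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

def equilibrium(rho, u):
    """Second-order Maxwellian equilibrium per lattice direction."""
    cu = np.einsum('qd,xyd->xyq', c, u)
    usq = np.einsum('xyd,xyd->xy', u, u)
    return rho[..., None] * w * (1 + 3*cu + 4.5*cu**2 - 1.5*usq[..., None])

def step(f, tau=0.6):
    rho = f.sum(axis=-1)                               # local moments only
    u = np.einsum('xyq,qd->xyd', f, c) / rho[..., None]
    f = f - (f - equilibrium(rho, u)) / tau            # collision: purely local
    for q in range(9):                                 # streaming: nearest neighbours
        f[..., q] = np.roll(f[..., q], tuple(c[q]), axis=(0, 1))
    return f

f = equilibrium(np.ones((64, 64)), np.zeros((64, 64, 2)))  # uniform fluid at rest
f = step(f)
```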
NASA Astrophysics Data System (ADS)
Park, Jong-Yeon; Stock, Charles A.; Yang, Xiaosong; Dunne, John P.; Rosati, Anthony; John, Jasmin; Zhang, Shaoqing
2018-03-01
Reliable estimates of historical and current biogeochemistry are essential for understanding past ecosystem variability and predicting future changes. Efforts to translate improved physical ocean state estimates into improved biogeochemical estimates, however, are hindered by high biogeochemical sensitivity to transient momentum imbalances that arise during physical data assimilation. Most notably, the breakdown of geostrophic constraints on data assimilation in equatorial regions can lead to spurious upwelling, resulting in excessive equatorial productivity and biogeochemical fluxes. This hampers efforts to understand and predict the biogeochemical consequences of El Niño and La Niña. We develop a strategy to robustly integrate an ocean biogeochemical model with an ensemble coupled-climate data assimilation system used for seasonal to decadal global climate prediction. Addressing spurious vertical velocities requires two steps. First, we find that tightening constraints on atmospheric data assimilation maintains a better equatorial wind stress and pressure gradient balance. This reduces spurious vertical velocities, but those remaining still produce substantial biogeochemical biases. The remainder is addressed by imposing stricter fidelity to model dynamics over data constraints near the equator. We determine an optimal choice of model-data weights that removes spurious biogeochemical signals while still benefiting from off-equatorial constraints that substantially improve equatorial physical ocean simulations. Compared to the unconstrained control run, the optimally constrained model reduces equatorial biogeochemical biases and markedly improves the equatorial subsurface nitrate concentrations and hypoxic area. The pragmatic approach described herein offers a means of advancing earth system prediction in parallel with continued data assimilation advances aimed at fully considering equatorial data constraints.
PROcess Based Diagnostics PROBE
NASA Technical Reports Server (NTRS)
Clune, T.; Schmidt, G.; Kuo, K.; Bauer, M.; Oloso, H.
2013-01-01
Many of the aspects of the climate system that are of the greatest interest (e.g., the sensitivity of the system to external forcings) are emergent properties that arise via the complex interplay between disparate processes. This is also true for climate models: most diagnostics are not a function of an isolated portion of source code, but rather are affected by multiple components and procedures. Thus any model-observation mismatch is hard to attribute to any specific piece of code or imperfection in a specific model assumption. An alternative approach is to identify diagnostics that are more closely tied to specific processes, implying that if a mismatch is found, it should be much easier to identify and address the specific algorithmic choices that will improve the simulation. However, this approach requires looking at model output and observational data in a more sophisticated way than the traditional production of monthly or annual mean quantities. The data must instead be filtered in time and space for examples of the specific process being targeted. We are developing a data analysis environment called PROcess-Based Explorer (PROBE) that seeks to enable efficient and systematic computation of process-based diagnostics on very large sets of data. In this environment, investigators can define arbitrarily complex filters and then seamlessly perform computations in parallel on the filtered output from their model. The same analysis can be performed on additional related data sets (e.g., reanalyses), thereby enabling routine comparisons between model and observational data. PROBE also incorporates workflow technology to automatically update computed diagnostics for subsequent executions of a model. In this presentation, we will discuss the design and current status of PROBE as well as share results from some preliminary use cases.
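The filter-then-compute-in-parallel pattern described here can be pictured with a generic sketch; the function names and file layout below are hypothetical, not PROBE's actual API:

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def load_field(path):
    """Stand-in loader; a real tool would read NetCDF model output."""
    return np.load(path)

def event_filter(field, threshold):
    """Keep only the samples where the targeted process is active,
    e.g. strong-updraft or frontal-passage episodes."""
    return field[field > threshold]

def diagnostic(path, threshold=10.0):
    """Per-file process-based diagnostic: mean over filtered samples."""
    samples = event_filter(load_field(path), threshold)
    return samples.mean() if samples.size else np.nan

if __name__ == "__main__":
    paths = [f"run/output_{i:03d}.npy" for i in range(100)]
    with ProcessPoolExecutor() as pool:     # one file per worker process
        means = list(pool.map(diagnostic, paths))
    print(np.nanmean(means))
```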
NASA Astrophysics Data System (ADS)
Yuan, F.; Wang, G.; Painter, S. L.; Tang, G.; Xu, X.; Kumar, J.; Bisht, G.; Hammond, G. E.; Mills, R. T.; Thornton, P. E.; Wullschleger, S. D.
2017-12-01
In Arctic tundra ecosystems, soil freezing and thawing is one of the dominant physical processes through which biogeochemical (e.g., carbon and nitrogen) cycles are tightly coupled. Besides hydraulic transport, freezing and thawing can cause pore water movement and aqueous species gradients, which are additional mechanisms for soil nitrogen (N) reactive transport in tundra ecosystems. In this study, we have fully coupled the aboveground processes of the Land Model (ALM) of an in-development ESM (the Accelerated Climate Modeling for Energy, ACME) with a state-of-the-art massively parallel 3-D subsurface thermal-hydrology and reactive transport code, PFLOTRAN. The resulting coupled ALM-PFLOTRAN model is a Land Surface Model (LSM) capable of resolving 3-D soil thermal-hydrological-biogeochemical cycles. This version of PFLOTRAN incorporates the CLM-CN Converging Trophic Cascade (CTC) model and a simple but robust soil N cycle, including absorption-desorption of soil NH4+ and gas dissolution-degassing. It also extends the thermal-hydrology mode with three newly modified freezing-thawing algorithms that greatly improve computational performance with regard to the numerical stiffness at the freezing point. Here we tested the model in fully coupled 3-D mode at the Next-Generation Ecosystem Experiments-Arctic (NGEE-Arctic) intensive field study site at the Barrow Environmental Observatory (BEO), AK. The simulations show that (1) synchronous coupling of soil thermal-hydrology and biogeochemistry in 3-D can greatly affect ecosystem dynamics across the polygonal tundra landscape; and (2) freezing-thawing cycles add complexity to the system, resulting in greater vertical and lateral mobility of soil N, depending upon local micro-topography. As a preliminary experiment, the model was also implemented for the Pan-Arctic region in 1-D column mode (i.e., no lateral connection), showing significant differences compared to stand-alone ALM. The ALM-PFLOTRAN coupling, embedded within the ESM, will be used for Pan-Arctic regional evaluation of climate-change-driven ecosystem responses and their feedbacks to the climate system at various scales.
NASA Astrophysics Data System (ADS)
Keller, P.; Gehring, A. U.
1992-06-01
Paleomagnetic and structural data from the Pedraforca thrust sheet in the southeast Pyrenees show that the chemical weathering of the late Cretaceous limestones is a multistage process. The first weathering stage, of latest Eocene to early Oligocene age, is indicated by a chemical remanent magnetization carried by hematite. The formation of hematite as the dominant weathering product suggests a subtropical climate in northeast Spain during this period. The second weathering stage is indicated by the presence of goethite, which carries a chemical remanent magnetization parallel to the present-day Earth field. This suggests formation of the goethite since the late Pleistocene under cooler climatic conditions similar to the present-day climate in the Pyrenees.
Environmental research program. 1995 Annual report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, N.J.
1996-06-01
The objective of the Environmental Research Program is to enhance the understanding of, and mitigate the effects of, pollutants on health, ecological systems, global and regional climate, and air quality. The program is multidisciplinary and includes fundamental research and development in efficient and environmentally benign combustion, pollutant abatement and destruction, and novel methods of detection and analysis of criteria and noncriteria pollutants. This diverse group conducts investigations in combustion, atmospheric and marine processes, flue-gas chemistry, and ecological systems. Combustion chemistry research emphasizes modeling at microscopic and macroscopic scales. At the microscopic scale, functional sensitivity analysis is used to explore the nature of the potential-to-dynamics relationships for reacting systems. Rate coefficients are estimated using quantum dynamics and path integral approaches. At the macroscopic level, combustion processes are modelled using chemical mechanisms at the appropriate level of detail dictated by the requirements of predicting particular aspects of combustion behavior. Parallel computing has facilitated the efforts to use detailed chemistry in models of turbulent reacting flow to predict minor species concentrations.
Dragoni, Lisa
2005-11-01
This article attends to a broad range of practically significant employee motivations and provides insight into how to enhance individual-level performance by examining individual-level state goal orientation emergence in organizational work groups. Leadership and multilevel climate processes are theorized to parallel each dimension of state goal orientation to cue and ultimately induce the corresponding achievement focus among individual work group members. It is argued that the patterns of leader behavior, which elucidate the leader's achievement priority, shape group members' psychological and work group climate to embody this priority. Resulting multilevel climate perceptions signal and compel group members to adopt the ascribed form of state goal orientation. The quality of the leader-member exchange relationship is viewed as a means to clarify leader messages in the formation of group members' psychological climate and internalize these cues in the emergence of state goal orientation. Considerations for future research and practice are discussed. ((c) 2005 APA, all rights reserved).
Accelerating Dust Storm Simulation by Balancing Task Allocation in Parallel Computing Environment
NASA Astrophysics Data System (ADS)
Gui, Z.; Yang, C.; XIA, J.; Huang, Q.; YU, M.
2013-12-01
Dust storms have serious negative impacts on the environment, human health, and assets. The continuing global climate change has increased the frequency and intensity of dust storms in the past decades. To better understand and predict the distribution, intensity and structure of dust storms, a series of dust storm models have been developed, such as the Dust Regional Atmospheric Model (DREAM), the NMM meteorological module (NMM-dust) and the Chinese Unified Atmospheric Chemistry Environment for Dust (CUACE/Dust). The development and application of these models have contributed significantly to both scientific research and our daily life. However, dust storm simulation is a data- and computing-intensive process. Normally, a simulation of a single dust storm event may take several hours or even days to run, which seriously impacts the timeliness of prediction and potential applications. To speed up the process, high-performance computing is widely adopted: by partitioning a large study area into small subdomains according to their geographic location and executing them on different computing nodes in parallel, computing performance can be significantly improved. Since spatiotemporal correlations exist in the geophysical process of dust storm simulation, each subdomain allocated to a node needs to communicate with geographically adjacent subdomains to exchange data. Inappropriate allocations may introduce imbalanced task loads and unnecessary communication among computing nodes. Therefore, the task allocation method is a key factor affecting the feasibility of parallelization. The allocation algorithm needs to carefully balance the computing cost and communication cost for each computing node to minimize total execution time and reduce overall communication cost for the entire system. This presentation introduces two algorithms for such allocation and compares them with an evenly distributed allocation method. Specifically: 1) To obtain optimized solutions, a quadratic programming based modeling method is proposed. This algorithm performs well with a small number of computing tasks; however, its efficiency decreases significantly as the number of subdomains and computing nodes increases. 2) To compensate for this performance decrease on large-scale tasks, a K-means clustering based algorithm is introduced. Instead of seeking optimal solutions, this method obtains relatively good feasible solutions within acceptable time; however, it may introduce imbalanced communication among nodes or node-isolated subdomains. This research shows that both algorithms have their own strengths and weaknesses for task allocation. A combination of the two algorithms is under study to obtain better performance. Keywords: Scheduling; Parallel Computing; Load Balance; Optimization; Cost Model
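One way to picture the K-means-based allocation (my illustration, not the authors' code): cluster subdomain centres into as many groups as compute nodes, so that geographically adjacent subdomains, which must exchange boundary data, tend to be placed on the same node.

```python
import numpy as np
from sklearn.cluster import KMeans

def allocate_subdomains(centers, n_nodes, seed=0):
    """centers: (n_subdomains, 2) x/y positions of subdomain centres.
    Spatially compact clusters keep most halo exchange on-node; plain
    k-means does not balance cluster sizes, so a real scheduler would
    rebalance afterwards (the imbalance the abstract mentions)."""
    km = KMeans(n_clusters=n_nodes, n_init=10, random_state=seed)
    return km.fit_predict(centers)

# toy example: an 8x8 grid of subdomains mapped onto 4 nodes
xs, ys = np.meshgrid(np.arange(8.0), np.arange(8.0))
print(allocate_subdomains(np.column_stack([xs.ravel(), ys.ravel()]), 4))
```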
NASA Technical Reports Server (NTRS)
Waheed, Abdul; Yan, Jerry
1998-01-01
This paper presents a model to evaluate the performance and overhead of parallelizing sequential code using compiler directives for multiprocessing on distributed shared memory (DSM) systems. With the increasing popularity of shared address space architectures, it is essential to understand their performance impact on programs that benefit from shared memory multiprocessing. We present a simple model to characterize the performance of programs that are parallelized using compiler directives for shared memory multiprocessing. We parallelized the sequential implementation of the NAS benchmarks using native Fortran77 compiler directives for an Origin2000, a DSM system based on a cache-coherent Non-Uniform Memory Access (ccNUMA) architecture. We report measurement-based performance of these parallelized benchmarks from four perspectives: efficacy of the parallelization process; scalability; parallelization overhead; and comparison with hand-parallelized and -optimized versions of the same benchmarks. Our results indicate that sequential programs can conveniently be parallelized for DSM systems using compiler directives, but realizing the performance gains predicted by the performance model depends primarily on minimizing architecture-specific data locality overhead.
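A sketch of the kind of simple performance model described (my formulation under assumed terms, not the paper's exact equations): parallel time as a serial part, a perfectly parallel part, and an architecture-specific locality overhead that grows with processor count.

```python
def predicted_speedup(t_seq, f_par, n, overhead_per_proc):
    """t_seq: sequential run time (s); f_par: parallelizable fraction;
    overhead_per_proc: assumed data-locality penalty per processor
    (e.g. ccNUMA remote-memory traffic); n: number of processors."""
    t_par = t_seq * (1 - f_par) + t_seq * f_par / n + overhead_per_proc * n
    return t_seq / t_par

for n in (1, 2, 4, 8, 16, 32):
    print(n, round(predicted_speedup(100.0, 0.95, n, 0.05), 2))
```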
The Tyndall Petition: Bridging the Gap between Academia and the General Public
NASA Astrophysics Data System (ADS)
Duong, K.; Ong, J.
2017-12-01
Climatepedia is a student-founded organization with a mission to communicate climate science to a broad audience. Since its inception in 2011, Climatepedia has grown from a UCLA club to a transitioning 501(c)(3) non-profit organization with members from UCLA, UC Irvine, Yale University, Duke University, UC Santa Barbara, and the University of Pennsylvania. Our main project is the Tyndall Petition (http://www.climatepedia.org/home/tyndallpetition) - the largest online climate petition of its kind - which features nearly 700 signatories who agree that human-induced climate change is an urgent and real issue. Our signatories are PhD level experts with a research focus in climate science or a highly related field. Each signatory has their own profile page that links to other signatories within our network. The Tyndall Petition can be used as a tool to bring transparency to the climate experts that support our statement. In this way, we hope to inform the general audience about the strong scientific consensus about climate change. We also seek to improve climate literacy through exposure to diverse research topics related to climate change. The Tyndall Petition can serve as a mechanism to connect signatories to regional climate issues and the communities affected by these issues. In parallel, Climatepedia administers a Student Certificate Program that trains college students to become climate literate, gain skills in climate communication, and support the growth of the Tyndall Petition.
The Centre of High-Performance Scientific Computing, Geoverbund, ABC/J - Geosciences enabled by HPSC
NASA Astrophysics Data System (ADS)
Kollet, Stefan; Görgen, Klaus; Vereecken, Harry; Gasper, Fabian; Hendricks-Franssen, Harrie-Jan; Keune, Jessica; Kulkarni, Ketan; Kurtz, Wolfgang; Sharples, Wendy; Shrestha, Prabhakar; Simmer, Clemens; Sulis, Mauro; Vanderborght, Jan
2016-04-01
The Centre of High-Performance Scientific Computing (HPSC TerrSys) was founded in 2011 to establish a centre of competence in high-performance scientific computing in terrestrial systems and the geosciences, enabling fundamental and applied geoscientific research in the Geoverbund ABC/J (the geoscientific research alliance of the Universities of Aachen, Cologne and Bonn and the Research Centre Jülich, Germany). The specific goals of HPSC TerrSys are to achieve relevance at the national and international level in (i) the development and application of HPSC technologies in the geoscientific community; (ii) student education; (iii) HPSC services and support, also for the wider geoscientific community; and (iv) the industry and public sectors, via e.g. useful applications and data products. A key feature of HPSC TerrSys is the Simulation Laboratory Terrestrial Systems, which is located at the Jülich Supercomputing Centre (JSC) and provides extensive capabilities with respect to porting, profiling, tuning and performance monitoring of geoscientific software in JSC's supercomputing environment. We will present a summary of success stories of HPSC applications, including integrated terrestrial model development; parallel profiling and its application from watersheds to the continent; massively parallel data assimilation using physics-based models and ensemble methods; quasi-operational terrestrial water and energy monitoring; and convection-permitting climate simulations over Europe. The success stories stress the need for formalized education of students in the application of HPSC technologies in the future.
Nadkarni, P M; Miller, P L
1991-01-01
A parallel program for inter-database sequence comparison was developed on the Intel Hypercube using two models of parallel programming. One version was built using machine-specific Hypercube parallel programming commands. The other version was built using Linda, a machine-independent parallel programming language. The two versions of the program provide a case study comparing these two approaches to parallelization in an important biological application area. Benchmark tests with both programs gave comparable results with a small number of processors. As the number of processors was increased, the Linda version was somewhat less efficient. The Linda version was also run without change on Network Linda, a virtual parallel machine running on a network of desktop workstations.
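For contrast with the machine-specific message-passing version, Linda coordinates workers through a shared tuple space; a loose Python analogue (illustrative only, using a shared queue in place of Linda's in/out operations on tuples):

```python
import multiprocessing as mp

def compare(a, b):
    """Toy stand-in for a sequence-comparison kernel (match count)."""
    return sum(x == y for x, y in zip(a, b))

def worker(tasks, results):
    # analogue of Linda's in(): blocking removal of a work tuple
    for pair in iter(tasks.get, None):
        results.put((pair, compare(*pair)))   # analogue of out(): deposit result

if __name__ == "__main__":
    tasks, results = mp.Queue(), mp.Queue()
    pairs = [("GATTACA", "GATTGCA"), ("ACGT", "ACGA")]
    procs = [mp.Process(target=worker, args=(tasks, results)) for _ in range(2)]
    for p in procs: p.start()
    for pair in pairs: tasks.put(pair)
    for _ in procs: tasks.put(None)           # poison pills stop the workers
    for _ in pairs: print(results.get())
    for p in procs: p.join()
```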
Anatomically constrained neural network models for the categorization of facial expression
NASA Astrophysics Data System (ADS)
McMenamin, Brenton W.; Assadi, Amir H.
2004-12-01
In humans, facial expressions are recognized by the amygdala, which uses parallel processing streams to identify the expressions quickly and accurately. Additionally, it is possible that a feedback mechanism plays a role in this process as well. Implementing a model with a similar parallel structure and feedback mechanisms could improve current facial recognition algorithms, for which varied expressions are a source of error. An anatomically constrained artificial neural-network model was created that uses this parallel processing architecture and feedback to categorize facial expressions. The presence of a feedback mechanism was not found to significantly improve performance for models with parallel architecture. However, the use of parallel processing streams significantly improved accuracy over a similar network without parallel architecture. Further investigation is necessary to determine the benefits of using parallel streams and feedback mechanisms in more advanced object recognition tasks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hasenkamp, Daren; Sim, Alexander; Wehner, Michael
Extensive computing power has been used to tackle issues such as climate change, fusion energy, and other pressing scientific challenges. These computations produce a tremendous amount of data; however, many of the data analysis programs currently run on only a single processor. In this work, we explore the possibility of using the emerging cloud computing platform to parallelize such sequential data analysis tasks. As a proof of concept, we wrap a program for analyzing trends of tropical cyclones in a set of virtual machines (VMs). This approach allows the user to keep their familiar data analysis environment in the VMs, while we provide the coordination and data transfer services to ensure the necessary input and output are directed to the desired locations. This work extensively exercises the networking capability of cloud computing systems and has revealed a number of weaknesses in the current cloud system software. In our tests, we are able to scale the parallel data analysis job to a modest number of VMs and achieve a speedup comparable to running the same analysis task using MPI. However, compared to MPI-based parallelization, the cloud-based approach has a number of advantages. The cloud-based approach is more flexible because the VMs can capture arbitrary software dependencies without requiring the user to rewrite their programs. The cloud-based approach is also more resilient to failure: as long as a single VM is running it can make progress, while as soon as one MPI node fails the whole analysis job fails. In short, this initial work demonstrates that a cloud computing system is a viable platform for distributed scientific data analyses traditionally conducted on dedicated supercomputing systems.
NASA Astrophysics Data System (ADS)
Mic, R.; Corbus, C.; Caian, M.; Neculau, G.
2009-09-01
This paper presents work carried out within the European project 037005 STREP FP6 CECILIA ("The assessment of impact and vulnerability of climate change in Central and Eastern Europe"). The aim of this project is to assess the impact of climate change from the regional to the local scale in the Central and Eastern European area, highlighting the usefulness of very high climate resolution for capturing the effects of the terrain complexity of the study area. The analysed Buzau and Ialomita river basins in Romania, covering an area of 14392 km², are situated outside the Curvature Carpathian Mountains, in a zone where the altitude varies from 2500 m to 50 m. Depending on altitude, annual precipitation varies from 1400 mm/year in the mountainous area to 400 mm/year in the plains, and evapotranspiration ranges from 500 mm/year in the high areas to 850 mm/year in the plains. However, due to the very high variability of weather conditions, both droughts and periods of excessive humidity occur in the course of a year. To study the impact of possible climate change on runoff in the Buzau and Ialomita river basins, the WatBal model was used, calibrated through runoff simulation in 17 cross-sections for the reference period 1971-2000. The WatBal model has two main components: a water balance component that uses continuous functions to describe water movement in a conceptualised basin, and a component that calculates potential evapotranspiration using the Priestley-Taylor equation. To calculate the changes in the main climatic parameters (atmospheric precipitation, air temperature, relative humidity, solar radiation and wind speed) used in analysing the impact of climate change on the hydrological regime, simulations performed with the regional climate model RegCM3, developed by ICTP (Trieste), implemented in Romania and applied to monthly, seasonal and climate-scenario numerical simulations at a high spatial resolution of 10 km, were used. The grid nodes of RegCM3 were matched to the sub-basins of the Buzau and Ialomita river basins using a methodology based on a digital map of the river basins and their sub-basins; this digital map was overlaid on the grid nodes by georeferencing. The changes were calculated for the periods 2021-2050 and 2071-2100 relative to the reference period, for each month, as the differences between the values of the climatic parameters in the two periods. Monthly mean discharges at 4 gauging stations in the Buzau river basin and 13 gauging stations in the Ialomita river basin are estimated under these assumptions. The study revealed the following changes in the components of the hydrological cycle due to climate change:
- An increase in evapotranspiration, especially in the summer months, due to the increase in air temperature.
- A reduction in the depth and duration of snow cover due to the increase in air temperature during winter.
- An increase in the variation of annual mean runoff from the plains to the mountains, with a tendency towards smoothing over the year in parallel with an overall decrease.
- Earlier occurrence of floods and a reduction of mixed spring floods (snow and rain) through the desynchronisation of snowmelt and rainfall occurrence.
- A reduction in annual mean runoff on rivers, due especially to the increase in evapotranspiration.
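For reference, a sketch of the Priestley-Taylor potential evapotranspiration estimate used by WatBal, in its standard form with FAO-style constants (the variable names are mine):

```python
import math

def priestley_taylor_pet(rn, g, t_c, alpha=1.26):
    """Potential evapotranspiration (mm/day). rn: net radiation and
    g: soil heat flux (both MJ m-2 day-1); t_c: mean air temperature (deg C);
    alpha: the Priestley-Taylor coefficient."""
    lam = 2.501 - 0.002361 * t_c                          # latent heat, MJ/kg
    es = 0.6108 * math.exp(17.27 * t_c / (t_c + 237.3))   # sat. vapour pressure, kPa
    delta = 4098.0 * es / (t_c + 237.3) ** 2              # slope of sat. curve, kPa/K
    gamma = 0.00163 * 101.3 / lam                         # psychrometric const., kPa/K
    return alpha * delta / (delta + gamma) * (rn - g) / lam

print(priestley_taylor_pet(rn=12.0, g=0.8, t_c=18.0))     # roughly 3.8 mm/day
```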
NASA Astrophysics Data System (ADS)
Tolson, B.; Matott, L. S.; Gaffoor, T. A.; Asadzadeh, M.; Shafii, M.; Pomorski, P.; Xu, X.; Jahanpour, M.; Razavi, S.; Haghnegahdar, A.; Craig, J. R.
2015-12-01
We introduce asynchronous parallel implementations of the Dynamically Dimensioned Search (DDS) family of algorithms, including DDS, discrete DDS, PA-DDS and DDS-AU. These parallel algorithms differ from most existing parallel optimization algorithms in the water resources field in that parallel DDS is asynchronous and does not require an entire population (set of candidate solutions) to be evaluated before generating and sending a new candidate solution for evaluation. One key advance in this study is developing the first parallel PA-DDS multi-objective optimization algorithm. The other key advance is enhancing the computational efficiency of solving optimization problems (such as model calibration) by combining a parallel optimization algorithm with the deterministic model pre-emption concept. These two efficiency techniques can only be combined because of the asynchronous nature of parallel DDS. Model pre-emption terminates simulation model runs early, for example prior to completely simulating the model calibration period, when intermediate results indicate the candidate solution is so poor that it will definitely have no influence on the generation of further candidate solutions. The computational savings of deterministic model pre-emption available in serial implementations of population-based algorithms (e.g., PSO) disappear in synchronous parallel implementations of these algorithms. In addition to the key advances above, we implement the algorithms across a range of computation platforms (Windows and Unix-based operating systems, from multi-core desktops to a supercomputer system) and package them for future modellers within a model-independent calibration software package called Ostrich, as well as in MATLAB versions. Results across multiple platforms and multiple case studies (from 4 to 64 processors) demonstrate the vast improvement over serial DDS-based algorithms and highlight the important role model pre-emption plays in the performance of parallel, pre-emptable DDS algorithms. Case studies include single- and multiple-objective optimization problems in water resources model calibration, and in many cases linear or near-linear speedups are observed.
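A minimal sketch of the core DDS perturbation step (my illustration after the published serial algorithm, with simplified bound handling; the asynchronous parallel variants dispatch such candidates to idle workers without waiting for a full population):

```python
import numpy as np

def dds_candidate(x_best, lo, hi, i, max_iter, r=0.2, rng=None):
    """One DDS perturbation of the current best solution x_best.
    lo, hi: per-dimension bound arrays; i: current iteration (1-based).
    Each dimension is perturbed with a probability that decays over the
    search, so the search narrows from global to local automatically."""
    rng = rng or np.random.default_rng()
    d = x_best.size
    p = 1.0 - np.log(i) / np.log(max_iter)   # fewer dims perturbed later on
    mask = rng.random(d) < p
    if not mask.any():
        mask[rng.integers(d)] = True         # always perturb at least one dim
    x = x_best.copy()
    x[mask] += r * (hi - lo)[mask] * rng.standard_normal(mask.sum())
    return np.clip(x, lo, hi)                # simplified bound handling
```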
Serial vs. parallel models of attention in visual search: accounting for benchmark RT-distributions.
Moran, Rani; Zehetleitner, Michael; Liesefeld, Heinrich René; Müller, Hermann J; Usher, Marius
2016-10-01
Visual search is central to the investigation of selective visual attention. Classical theories propose that items are identified by serially deploying focal attention to their locations. While this accounts for set-size effects over a continuum of task difficulties, it has been suggested that parallel models can account for such effects equally well. We compared the serial Competitive Guided Search model with a parallel model in their ability to account for RT distributions and error rates from a large visual search data set featuring three classical search tasks: 1) a spatial configuration search (2 vs. 5); 2) a feature-conjunction search; and 3) a unique feature search (Wolfe, Palmer & Horowitz, Vision Research, 50(14), 1304-1311, 2010). In the parallel model, each item is represented by a diffusion to two boundaries (target-present/absent); the search corresponds to a parallel race between these diffusors. The parallel model was highly flexible in that it allowed both for a parametric range of capacity limitations and for set-size adjustments of the identification boundaries. Furthermore, a quit unit allowed for a continuum of search-quitting policies when the target is not found, with "single-item inspection" and exhaustive searches comprising its extremes. The serial model was found to be superior to the parallel model, even before penalizing the parallel model for its increased complexity. We discuss the implications of the results and the need for future studies to resolve the debate.
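To make the parallel-race idea concrete, a toy simulation (my sketch, not the authors' fitted model): each display item diffuses toward a "present" or "absent" boundary; the first "present" crossing terminates the search, while an "absent" response here requires every item to finish (the exhaustive extreme of the quitting continuum).

```python
import numpy as np

rng = np.random.default_rng(0)

def search_trial(n_items, drift_target=0.6, drift_dist=-0.4, noise=1.0,
                 bound=1.0, dt=0.01, max_steps=10_000):
    """One trial as a parallel race of two-boundary diffusors.
    Returns (RT in seconds, target-present response)."""
    drifts = np.full(n_items, drift_dist)
    drifts[0] = drift_target              # item 0 is the target
    x = np.zeros(n_items)
    active = np.ones(n_items, bool)
    for step in range(1, max_steps + 1):
        x[active] += (drifts[active] * dt +
                      noise * np.sqrt(dt) * rng.standard_normal(active.sum()))
        if (x >= bound).any():            # any 'present' decision ends the search
            return step * dt, True
        active &= x > -bound              # items deciding 'absent' drop out
        if not active.any():              # exhaustive: all items said 'absent'
            return step * dt, False
    return max_steps * dt, False

print(search_trial(n_items=5))
```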
Crops in silico: A community wide multi-scale computational modeling framework of plant canopies
NASA Astrophysics Data System (ADS)
Srinivasan, V.; Christensen, A.; Borkiewic, K.; Yiwen, X.; Ellis, A.; Panneerselvam, B.; Kannan, K.; Shrivastava, S.; Cox, D.; Hart, J.; Marshall-Colon, A.; Long, S.
2016-12-01
Current crop models predict a looming gap between supply and demand for primary foodstuffs over the next 100 years. While significant yield increases were achieved in major food crops during the early years of the green revolution, the current rates of yield increases are insufficient to meet future projected food demand. Furthermore, with projected reduction in arable land, decrease in water availability, and increasing impacts of climate change on future food production, innovative technologies are required to sustainably improve crop yield. To meet these challenges, we are developing Crops in silico (Cis), a biologically informed, multi-scale, computational modeling framework that can facilitate whole plant simulations of crop systems. The Cis framework is capable of linking models of gene networks, protein synthesis, metabolic pathways, physiology, growth, and development in order to investigate crop response to different climate scenarios and resource constraints. This modeling framework will provide the mechanistic details to generate testable hypotheses toward accelerating directed breeding and engineering efforts to increase future food security. A primary objective for building such a framework is to create synergy among an inter-connected community of biologists and modelers to create a realistic virtual plant. This framework advantageously casts the detailed mechanistic understanding of individual plant processes across various scales in a common scalable framework that makes use of current advances in high performance and parallel computing. We are currently designing a user friendly interface that will make this tool equally accessible to biologists and computer scientists. Critically, this framework will provide the community with much needed tools for guiding future crop breeding and engineering, understanding the emergent implications of discoveries at the molecular level for whole plant behavior, and improved prediction of plant and ecosystem responses to the environment.
Parallelized CCHE2D flow model with CUDA Fortran on Graphics Process Units
USDA-ARS?s Scientific Manuscript database
This paper presents the CCHE2D implicit flow model parallelized using CUDA Fortran programming technique on Graphics Processing Units (GPUs). A parallelized implicit Alternating Direction Implicit (ADI) solver using Parallel Cyclic Reduction (PCR) algorithm on GPU is developed and tested. This solve...
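For context, parallel cyclic reduction solves the tridiagonal systems arising in each ADI sweep in O(log n) fully data-parallel stages, which is what maps well onto GPUs. A minimal NumPy sketch of the idea (mine, not the CCHE2D/CUDA Fortran code; it assumes the usual convention a[0] = c[n-1] = 0):

```python
import numpy as np

def pcr_solve(a, b, c, d):
    """Parallel cyclic reduction for a[i]*x[i-1] + b[i]*x[i] + c[i]*x[i+1] = d[i].
    Every sweep updates all rows independently, and ceil(log2 n) sweeps
    decouple the system into 1x1 equations."""
    a, b, c, d = (np.asarray(v, float).copy() for v in (a, b, c, d))
    n, s = len(b), 1
    while s < n:
        i = np.arange(n)
        has_m, has_p = i - s >= 0, i + s < n
        im, ip = np.clip(i - s, 0, n - 1), np.clip(i + s, 0, n - 1)
        alpha = np.where(has_m, -a / b[im], 0.0)   # eliminate row i-s
        beta = np.where(has_p, -c / b[ip], 0.0)    # eliminate row i+s
        a, b, c, d = (alpha * a[im],
                      b + alpha * c[im] + beta * a[ip],
                      beta * c[ip],
                      d + alpha * d[im] + beta * d[ip])
        s *= 2
    return d / b                                   # each row is now 1x1

# check against a dense solve on a diagonally dominant system
n = 8
a = -np.ones(n); a[0] = 0.0
c = -np.ones(n); c[-1] = 0.0
b = 4.0 * np.ones(n)
d = np.arange(1.0, n + 1)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(np.allclose(pcr_solve(a, b, c, d), np.linalg.solve(A, d)))
```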
Deforestation in Amazonia impacts riverine carbon dynamics
NASA Astrophysics Data System (ADS)
Langerwisch, F.; Walz, A.; Rammig, A.; Tietjen, B.; Thonicke, K.; Cramer, W.
2015-10-01
Fluxes of organic and inorganic carbon within the Amazon basin are considerably controlled by annual flooding, which triggers the export of terrigenous organic material to the river and ultimately to the Atlantic Ocean. The amount of carbon imported to the river, and its further conversion, transport and export, depend on terrestrial productivity and discharge, as well as on temperature and atmospheric CO2. Both terrestrial productivity and discharge are influenced by climate and land use change. To assess the impact of these changes on riverine carbon dynamics, the coupled model system of LPJmL and RivCM (Langerwisch et al., 2015) has been used. Vegetation dynamics (in LPJmL) as well as the export and conversion of terrigenous carbon to and within the river (RivCM) are included. The model system has been applied for the years 1901 to 2099 under two deforestation scenarios and with climate forcing of three SRES emission scenarios, each for five climate models. The results suggest that, following deforestation, riverine particulate and dissolved organic carbon will strongly decrease, by up to 90 %, by the end of the current century. In parallel, discharge increases, leading to roughly unchanged net carbon transport during the first decades of the century, as long as a sufficient area is still forested. During the following decades the amount of transported carbon will decrease drastically. In contrast to the riverine organic carbon, the amount of riverine inorganic carbon is determined only by climate change forcing, namely increased temperature and atmospheric CO2 concentration. Mainly owing to the higher atmospheric CO2, riverine inorganic carbon increases by up to 20 % (SRES A2). The changes in riverine carbon fluxes have direct effects on the export of carbon, either to the atmosphere via outgassing or to the Atlantic Ocean via discharge. Basin-wide, the outgassed carbon will increase slightly, but it can be regionally reduced by up to 60 % due to deforestation. The discharge of organic carbon to the ocean will be reduced by about 40 % under the most severe deforestation and climate change scenario. These changes would have local and regional consequences for the carbon balance and habitat characteristics in the Amazon basin itself, but also in the adjacent Atlantic Ocean.
Examining Parallelism of Sets of Psychometric Measures Using Latent Variable Modeling
ERIC Educational Resources Information Center
Raykov, Tenko; Patelis, Thanos; Marcoulides, George A.
2011-01-01
A latent variable modeling approach that can be used to examine whether several psychometric tests are parallel is discussed. The method consists of sequentially testing the properties of parallel measures via a corresponding relaxation of parameter constraints in a saturated model or an appropriately constructed latent variable model. The…
Scalable Algorithms for Clustering Large Geospatiotemporal Data Sets on Manycore Architectures
NASA Astrophysics Data System (ADS)
Mills, R. T.; Hoffman, F. M.; Kumar, J.; Sreepathi, S.; Sripathi, V.
2016-12-01
The increasing availability of high-resolution geospatiotemporal data sets from sources such as observatory networks, remote sensing platforms, and computational Earth system models has opened new possibilities for knowledge discovery using data sets fused from disparate sources. Traditional algorithms and computing platforms are impractical for the analysis and synthesis of data sets of this size; however, new algorithmic approaches that can effectively utilize the complex memory hierarchies and the extremely high levels of available parallelism in state-of-the-art high-performance computing platforms can enable such analysis. We describe a massively parallel implementation of accelerated k-means clustering and some optimizations to boost computational intensity and utilization of wide SIMD lanes on state-of-the-art multi- and manycore processors, including the second-generation Intel Xeon Phi ("Knights Landing") processor based on the Intel Many Integrated Core (MIC) architecture, which offers several new features, including an on-package high-bandwidth memory. We also analyze the code in the context of a few practical applications to the analysis of climatic and remotely sensed vegetation phenology data sets, and speculate on some of the new applications that such scalable analysis methods may enable.
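The dominant cost in accelerated k-means is the distance/assignment step, which is embarrassingly parallel over observations and vectorizes onto wide SIMD lanes; a schematic chunk-parallel version (illustrative Python, not the authors' optimized implementation):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def assign_chunk(chunk, centroids):
    """Nearest-centroid labels for one chunk of rows. The expansion
    (x - c)^2 = x^2 - 2xc + c^2 turns the work into a matrix product,
    which vectorizes well on SIMD hardware."""
    d2 = ((chunk**2).sum(1)[:, None]
          - 2.0 * chunk @ centroids.T
          + (centroids**2).sum(1))
    return d2.argmin(1)

def kmeans_step(x, centroids, n_workers=4):
    """One assignment + update iteration, parallel over data chunks."""
    chunks = np.array_split(x, n_workers)
    with ThreadPoolExecutor(n_workers) as pool:
        labels = np.concatenate(list(pool.map(assign_chunk, chunks,
                                              [centroids] * n_workers)))
    new = centroids.copy()                 # empty clusters keep old centre
    for k in range(len(centroids)):
        members = x[labels == k]
        if len(members):
            new[k] = members.mean(0)
    return labels, new
```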
New Developments in NOAA's Comprehensive Large Array-Data Stewardship System
NASA Astrophysics Data System (ADS)
Ritchey, N. A.; Morris, J. S.; Carter, D. J.
2012-12-01
The Comprehensive Large Array-data Stewardship System (CLASS) is part of the NOAA strategic goal of Climate Adaptation and Mitigation that gives focus to the building and sustaining of key observational assets and data archives critical to maintaining the global climate record. Since 2002, CLASS has been NOAA's enterprise solution for ingesting, storing and providing access to a host of near real-time remote sensing streams such as the Polar and Geostationary Operational Environmental Satellites (POES and GOES) and the Defense Meteorological Satellite Program (DMSP). Since October, 2011 CLASS has also been the dedicated Archive Data Segment (ADS) of the Suomi National Polar-orbiting Partnership (S-NPP). As the ADS, CLASS receives raw and processed S-NPP records for archival and distribution to the broad user community. Moving beyond just remote sensing and model data, NOAA has endorsed a plan to migrate all archive holdings from NOAA's National Data Centers into CLASS while retiring various disparate legacy data storage systems residing at the National Climatic Data Center (NCDC), National Geophysical Data Center (NGDC) and the National Oceanographic Data Center (NODC). In parallel to this data migration, CLASS is evolving to a service-oriented architecture utilizing cloud technologies for dissemination in addition to clearly defined interfaces that allow better collaboration with partners. This evolution will require implementation of standard access protocols and metadata which will lead to cost effective data and information preservation.
Effects of Global Change on U.S. Urban Areas: Vulnerabilities, Impacts, and Adaptation
NASA Technical Reports Server (NTRS)
Quattrochi, Dale A.; Wilbanks, Thomas J.; Kirshen, Paul; Romero-Lnkao, Patricia; Rosenzweig, Cynthia; Ruth, Matthias; Solecki, William; Tarr, Joel
2007-01-01
Human settlements, both large and small, are where the vast majority of people on the Earth live. Expansion of cities, both in population and areal extent, is a relentless process that will accelerate in the 21st century. As a consequence of urban growth both in the United States and around the globe, it is important to develop an understanding of how urbanization will affect the local and regional environment. Of equal importance, however, is the assessment of how cities will be impacted by the looming prospects of global climate change and climate variability. The potential impacts of climate change and variability have recently been articulated by the IPCC's "Climate Change 2007" report. Moreover, the U.S. Climate Change Science Program (CCSP) is preparing a series of "Synthesis and Assessment Product" (SAP) reports to support informed discussion and decision making regarding climate change and variability by policymakers, resource managers, stakeholders, the media, and the general public. We are authors on a SAP describing the effects of global climate change on human settlements. This paper will present the elements of our SAP report that relate to the vulnerabilities and impacts that will occur, the adaptation responses that may take place, and the possible effects on settlement patterns and characteristics that may arise for human settlements in the U.S. as a result of climate change and climate variability. We will also present some recommendations about what should be done to further research on how climate change and variability will impact human settlements in the U.S., as well as how to engage government officials, policy and decision makers, and the general public in understanding the implications of climate change and variability at the local and regional levels. Additionally, we wish to explore how technology, such as remote sensing data coupled with modeling, can be employed as a synthesis tool for deriving insight across a spectrum of impacts (e.g., public health, urban planning for mitigation strategies) on how cities can cope with and adapt to climate change and variability. This latter point parallels the concepts and ideas presented in the U.S. National Academy of Sciences Decadal Survey report on "Earth Science Applications from Space: National Imperatives for the Next Decade and Beyond", wherein the analysis of the impacts of climate change and variability, human health, and land use change are listed as key areas for the development of future Earth-observing remote sensing systems.
Barrett, Tristam; Feola, Giuseppe; Khusnitdinova, Marina; Krylova, Viktoria
2017-01-01
The convergence of climate change and post-Soviet socio-economic and institutional transformations has been underexplored so far, as have the consequences of such convergence on crop agriculture in Central Asia. This paper provides a place-based analysis of constraints and opportunities for adaptation to climate change, with a specific focus on water use, in two districts in southeast Kazakhstan. Data were collected by 2 multi-stakeholder participatory workshops, 21 semi-structured in-depth interviews, and secondary statistical data. The present-day agricultural system is characterised by enduring Soviet-era management structures, but without state inputs that previously sustained agricultural productivity. Low margins of profitability on many privatised farms mean that attempts to implement integrated water management have produced water users associations unable to maintain and upgrade a deteriorating irrigation infrastructure. Although actors engage in tactical adaptation measures, necessary structural adaptation of the irrigation system remains difficult without significant public or private investments. Market-based water management models have been translated ambiguously to this region, which fails to encourage efficient water use and hinders adaptation to water stress. In addition, a mutual interdependence of informal networks and formal institutions characterises both state governance and everyday life in Kazakhstan. Such interdependence simultaneously facilitates operational and tactical adaptation, but hinders structural adaptation, as informal networks exist as a parallel system that achieves substantive outcomes while perpetuating the inertia and incapacity of the state bureaucracy. This article has relevance for critical understanding of integrated water management in practice and adaptation to climate change in post-Soviet institutional settings more broadly.
Potential climatic impacts and reliability of very large-scale wind farms
NASA Astrophysics Data System (ADS)
Wang, C.; Prinn, R. G.
2010-02-01
Meeting future world energy needs while addressing climate change requires large-scale deployment of low or zero greenhouse gas (GHG) emission technologies such as wind energy. The widespread availability of wind power has fueled substantial interest in this renewable energy source as one of the needed technologies. For very large-scale utilization of this resource, there are however potential environmental impacts, and also problems arising from its inherent intermittency, in addition to the present need to lower unit costs. To explore some of these issues, we use a three-dimensional climate model to simulate the potential climate effects associated with installation of wind-powered generators over vast areas of land or coastal ocean. Using wind turbines to meet 10% or more of global energy demand in 2100, could cause surface warming exceeding 1 °C over land installations. In contrast, surface cooling exceeding 1 °C is computed over ocean installations, but the validity of simulating the impacts of wind turbines by simply increasing the ocean surface drag needs further study. Significant warming or cooling remote from both the land and ocean installations, and alterations of the global distributions of rainfall and clouds also occur. These results are influenced by the competing effects of increases in roughness and decreases in wind speed on near-surface turbulent heat fluxes, the differing nature of land and ocean surface friction, and the dimensions of the installations parallel and perpendicular to the prevailing winds. These results are also dependent on the accuracy of the model used, and the realism of the methods applied to simulate wind turbines. Additional theory and new field observations will be required for their ultimate validation. Intermittency of wind power on daily, monthly and longer time scales as computed in these simulations and inferred from meteorological observations, poses a demand for one or more options to ensure reliability, including backup generation capacity, very long distance power transmission lines, and onsite energy storage, each with specific economic and/or technological challenges.
NASA Astrophysics Data System (ADS)
Hellmer, Hartmut H.; Rhein, Monika; Heinemann, Günther; Abalichin, Janna; Abouchami, Wafa; Baars, Oliver; Cubasch, Ulrich; Dethloff, Klaus; Ebner, Lars; Fahrbach, Eberhard; Frank, Martin; Gollan, Gereon; Greatbatch, Richard J.; Grieger, Jens; Gryanik, Vladimir M.; Gryschka, Micha; Hauck, Judith; Hoppema, Mario; Huhn, Oliver; Kanzow, Torsten; Koch, Boris P.; König-Langlo, Gert; Langematz, Ulrike; Leckebusch, Gregor C.; Lüpkes, Christof; Paul, Stephan; Rinke, Annette; Rost, Bjoern; van der Loeff, Michiel Rutgers; Schröder, Michael; Seckmeyer, Gunther; Stichel, Torben; Strass, Volker; Timmermann, Ralph; Trimborn, Scarlett; Ulbrich, Uwe; Venchiarutti, Celia; Wacker, Ulrike; Willmes, Sascha; Wolf-Gladrow, Dieter
2016-11-01
In the early 1980s, Germany started a new era of modern Antarctic research. The Alfred Wegener Institute Helmholtz Centre for Polar and Marine Research (AWI) was founded and important research platforms such as the German permanent station in Antarctica, today called Neumayer III, and the research icebreaker Polarstern were installed. The research primarily focused on the Atlantic sector of the Southern Ocean. In parallel, the German Research Foundation (Deutsche Forschungsgemeinschaft, DFG) started a priority program `Antarctic Research' (since 2003 called SPP-1158) to foster and intensify the cooperation between scientists from different German universities and the AWI as well as other institutes involved in polar research. Here, we review the main findings in meteorology and oceanography of the last decade, funded by the priority program. The paper presents field observations and modelling efforts, extending from the stratosphere to the deep ocean. The research spans a large range of temporal and spatial scales, including the interaction of both climate components. In particular, radiative processes, the interaction of the changing ozone layer with large-scale atmospheric circulations, and changes in the sea ice cover are discussed. Climate and weather forecast models provide an insight into the water cycle and the climate change signals associated with synoptic cyclones. Investigations of the atmospheric boundary layer focus on the interaction between atmosphere, sea ice and ocean in the vicinity of polynyas and leads. The chapters dedicated to polar oceanography review the interaction between the ocean and ice shelves with regard to the freshwater input and discuss the changes in water mass characteristics, ventilation and formation rates, crucial for the deepest limb of the global, climate-relevant meridional overturning circulation. They also highlight the associated storage of anthropogenic carbon as well as the cycling of carbon, nutrients and trace metals in the ocean with special emphasis on the Weddell Sea.
Nadkarni, P. M.; Miller, P. L.
1991-01-01
A parallel program for inter-database sequence comparison was developed on the Intel Hypercube using two models of parallel programming. One version was built using machine-specific Hypercube parallel programming commands. The other version was built using Linda, a machine-independent parallel programming language. The two versions of the program provide a case study comparing these two approaches to parallelization in an important biological application area. Benchmark tests with both programs gave comparable results with a small number of processors. As the number of processors was increased, the Linda version was somewhat less efficient. The Linda version was also run without change on Network Linda, a virtual parallel machine running on a network of desktop workstations. PMID:1807632
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chrisochoides, N.; Sukup, F.
In this paper we present a parallel implementation of the Bowyer-Watson (BW) algorithm using the task-parallel programming model. The BW algorithm constitutes an ideal mesh refinement strategy for implementing a large class of unstructured mesh generation techniques on both sequential and parallel computers, since it obviates the need for global mesh refinement. Its implementation on distributed-memory multicomputers using the traditional data-parallel model has proven very inefficient due to the excessive synchronization needed among processors. In this paper we demonstrate that with the task-parallel model we can tolerate the synchronization costs inherent to data-parallel methods by exploiting concurrency at the processor level. Our preliminary performance data indicate that the task-parallel approach: (i) is almost four times faster than the existing data-parallel methods, (ii) scales linearly, and (iii) introduces minimal overhead compared to the "best" sequential implementation of the BW algorithm.
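To make the cavity-retriangulation step concrete, here is a minimal sequential sketch of the Bowyer-Watson insertion loop in Python. The task-parallel scheduling described in the abstract is not shown; the super-triangle coordinates and the linear scan for "bad" triangles are illustrative simplifications, and degenerate (collinear) inputs are not handled.

```python
import numpy as np

def in_circumcircle(tri, p, pts):
    # True if point p lies strictly inside the circumcircle of triangle tri.
    (ax, ay), (bx, by), (cx, cy) = (pts[v] for v in tri)
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (p[0] - ux) ** 2 + (p[1] - uy) ** 2 < (ax - ux) ** 2 + (ay - uy) ** 2

def bowyer_watson(points):
    # Incremental Delaunay triangulation by cavity retriangulation.
    pts = [tuple(p) for p in points]
    n = len(pts)
    pts += [(-1e6, -1e6), (1e6, -1e6), (0.0, 1e6)]   # enclosing super-triangle
    tris = [(n, n + 1, n + 2)]
    for i in range(n):
        bad = [t for t in tris if in_circumcircle(t, pts[i], pts)]
        # Cavity boundary = edges belonging to exactly one "bad" triangle.
        count = {}
        for t in bad:
            for e in ((t[0], t[1]), (t[1], t[2]), (t[2], t[0])):
                k = tuple(sorted(e))
                count[k] = count.get(k, 0) + 1
        boundary = [e for e, c in count.items() if c == 1]
        tris = [t for t in tris if t not in bad]
        tris += [(a, b, i) for a, b in boundary]      # retriangulate the cavity
    # Discard triangles touching super-triangle vertices.
    return [t for t in tris if all(v < n for v in t)]

print(bowyer_watson(np.random.rand(10, 2)))
```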
Modelling parallel programs and multiprocessor architectures with AXE
NASA Technical Reports Server (NTRS)
Yan, Jerry C.; Fineman, Charles E.
1991-01-01
AXE, An Experimental Environment for Parallel Systems, was designed to model and simulate parallel systems at the process level. It provides an integrated environment for specifying computation models, multiprocessor architectures, data collection, and performance visualization. AXE is being used at NASA-Ames for developing resource management strategies, parallel problem formulation, multiprocessor architectures, and operating system issues related to the High Performance Computing and Communications Program. AXE's simple, structured user interface enables the user to model parallel programs and machines precisely and efficiently. Its quick turn-around time keeps the user interested and productive. AXE models multicomputers. The user may easily modify various architectural parameters, including the number of sites, connection topologies, and overhead for operating system activities. Parallel computations in AXE are represented as collections of autonomous computing objects known as players; their use and behavior are described. Performance data of the multiprocessor model can be observed on a color screen, including CPU and message-routing bottlenecks and the dynamic status of the software.
Global Environmental Multiscale model - a platform for integrated environmental predictions
NASA Astrophysics Data System (ADS)
Kaminski, Jacek W.; Struzewska, Joanna; Neary, Lori; Dearden, Frank
2017-04-01
The Global Environmental Multiscale model was developed by the Government of Canada as an operational weather prediction model in the mid-1990s. Subsequently, it was used as the host meteorological model for an on-line implementation of air quality chemistry and aerosols from the global to the meso-gamma scale. Further model developments led to the vertical extension of the modelling domain to include stratospheric chemistry, aerosols, and the formation of polar stratospheric clouds. In parallel, the modelling platform was used for planetary applications, where dynamical, radiative transfer and chemical processes in the atmosphere of Mars were successfully simulated. The modelling platform developed is thus capable of seamless, coupled modelling of the dynamics and chemistry of planetary atmospheres. We will present modelling results for global, regional, and local air quality episodes and for long-term air quality trends. Upper troposphere and lower stratosphere modelling results will be presented in terms of climate change and subsonic aviation emissions modelling. Model results for the atmosphere of Mars will be presented in the context of the 2016 ExoMars mission and the anticipated observations from the NOMAD instrument. Also, we will present plans and the design for extending the GEM model to the F region, with further coupling to a magnetospheric model that extends to 15 Re.
Climate Change in the Western United States: Projections and Observations (Invited)
NASA Astrophysics Data System (ADS)
Redmond, K. T.
2009-12-01
The interplay between projections and observations of climate, and the role of observations as they unfold, form the primary emphasis for this talk. The consensus among climate projections is that the Western United States will warm, and that annual precipitation will increase near the Canada/US border and decrease near the Mexico/US border. Inter-model agreement is greater for temperature than precipitation, though precipitation projections show some tendency toward slow convergence. Seasonal temperature changes are expected to be similar from month to month, slightly greater in summer and slightly smaller in winter. Coastal temperature increases are expected to be smaller than inland. High elevation increases may be slightly greater than those at low elevation. The precipitation season is in general expected to be more concentrated in winter, with less (or less increase, depending on latitude) precipitation in spring, summer, and autumn than without climate change. Climate should have started to depart from the baseline (no-change) case about 30-35 years ago. Observations show that temperatures West-wide did begin to rise during the 1970s. Precipitation changes have been more ambiguous. Annual temperature increases in the U.S. have been much more prominent in the West (and to some extent the north) than in the East, especially during the last decade. Summer in particular has shown a marked temperature increase since around 2000. Minimum temperatures have shown more increase (in many cases considerably more) than maximum temperatures. Annual freezing levels, from essentially independent data sets, have risen during this time. Acceptance of climate change in the public mind is increased when evidence visibly aligns with projections. This appears to have been particularly important in the western states. However, other sources of climate variability, of human or natural origin, on seasonal to decadal scales, can obscure or partially and temporarily mask expected effects of greenhouse gas forcing. Observational factors can likewise affect the reported climate history. Changes in climate elements have been detected, but parallel efforts at attribution are necessary to properly interpret the measurements, and provide the consistency desired by scientists and the remainder of the public. All of the above factors converge in the region's most prominent climate narrative, the ongoing Colorado River drought and its uncertain outcome.
Wedi, Nils P
2014-06-28
The steady path of doubling the global horizontal resolution approximately every 8 years in numerical weather prediction (NWP) at the European Centre for Medium-Range Weather Forecasts may be substantially altered by emerging novel computing architectures. It coincides with the need to appropriately address and determine forecast uncertainty with increasing resolution, in particular when convective-scale motions start to be resolved. Blunt increases in model resolution will quickly become unaffordable and may not lead to improved NWP forecasts. Consequently, there is a need to adjust proven numerical techniques accordingly. An informed decision on the modelling strategy for harnessing exascale, massively parallel computing power thus also requires a deeper understanding of the sensitivity to uncertainty, for each part of the model, and ultimately a deeper understanding of multi-scale interactions in the atmosphere and their numerical realization in ultra-high-resolution NWP and climate simulations. This paper explores opportunities for substantial increases in forecast efficiency by judicious adjustment of the formal accuracy or relative resolution in spectral and physical space. One path is to reduce the formal accuracy with which the spectral transforms are computed. The other explores the importance of the ratio of the horizontal resolution in gridpoint space to the number of wavenumbers in spectral space. This is relevant for both high-resolution simulations and ensemble-based uncertainty estimation. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, W. -L.; Gu, Y.; Liou, K. N.
2015-05-19
We investigate 3-D mountain effects on solar flux distributions and their impact on surface hydrology over the western United States, specifically the Rocky Mountains and the Sierra Nevada, using the global CCSM4 (Community Climate System Model version 4; Community Atmosphere Model/Community Land Model – CAM4/CLM4) with a 0.23° × 0.31° resolution for simulations over 6 years. In a 3-D radiative transfer parameterization, we have updated surface topography data from a resolution of 1 km to 90 m to improve parameterization accuracy. In addition, we have also modified the upward-flux deviation (3-D–PP (plane-parallel)) adjustment to ensure that the energy balance at the surface is conserved in global climate simulations based on 3-D radiation parameterization. We show that deviations in the net surface fluxes are not only affected by 3-D mountains but also influenced by feedbacks of cloud and snow in association with the long-term simulations. Deviations in sensible heat and surface temperature generally follow the patterns of net surface solar flux. The monthly snow water equivalent (SWE) deviations show an increase in lower elevations due to reduced snowmelt, leading to a reduction in cumulative runoff. Over higher-elevation areas, negative SWE deviations are found because of increased solar radiation available at the surface. Simulated precipitation increases for lower elevations, while it decreases for higher elevations, with a minimum in April. Liquid runoff significantly decreases at higher elevations after April due to reduced SWE and precipitation.
NASA Astrophysics Data System (ADS)
Reerink, Thomas J.; van de Berg, Willem Jan; van de Wal, Roderik S. W.
2016-11-01
This paper accompanies the second OBLIMAP open-source release. The package is developed to map climate fields between a general circulation model (GCM) and an ice sheet model (ISM) in both directions by using optimal aligned oblique projections, which minimize distortions. The curvature of the surfaces of the GCM and ISM grid differ, both grids may be irregularly spaced and the ratio of the grids is allowed to differ largely. OBLIMAP's stand-alone version is able to map data sets that differ in various aspects on the same ISM grid. Each grid may either coincide with the surface of a sphere, an ellipsoid or a flat plane, while the grid types might differ. Re-projection of, for example, ISM data sets is also facilitated. This is demonstrated by relevant applications concerning the major ice caps. As the stand-alone version also applies to the reverse mapping direction, it can be used as an offline coupler. Furthermore, OBLIMAP 2.0 is an embeddable GCM-ISM coupler, suited for high-frequency online coupled experiments. A new fast scan method is presented for structured grids as an alternative for the former time-consuming grid search strategy, realising a performance gain of several orders of magnitude and enabling the mapping of high-resolution data sets with a much larger number of grid nodes. Further, a highly flexible masked mapping option is added. The limitation of the fast scan method with respect to unstructured and adaptive grids is discussed together with a possible future parallel Message Passing Interface (MPI) implementation.
McConnell, Joseph R.; Aristarain, Alberto J.; Banta, J. Ryan; Edwards, P. Ross; Simões, Jefferson C.
2007-01-01
Crustal dust in the atmosphere impacts Earth's radiative forcing directly by modifying the radiation budget and affecting cloud nucleation and optical properties, and indirectly through ocean fertilization, which alters carbon sequestration. Increased dust in the atmosphere has been linked to decreased global air temperature in past ice core studies of glacial to interglacial transitions. We present a continuous ice core record of aluminum deposition during recent centuries in the northern Antarctic Peninsula, the most rapidly warming region of the Southern Hemisphere; such a record has not been reported previously. This record shows that aluminosilicate dust deposition more than doubled during the 20th century, coincident with the ≈1°C Southern Hemisphere warming: a pattern in parallel with increasing air temperatures, decreasing relative humidity, and widespread desertification in Patagonia and northern Argentina. These results have far-reaching implications for understanding the forces driving dust generation and impacts of changing dust levels on climate both in the recent past and future. PMID:17389397
Preliminary Evaluation of MapReduce for High-Performance Climate Data Analysis
NASA Technical Reports Server (NTRS)
Duffy, Daniel Q.; Schnase, John L.; Thompson, John H.; Freeman, Shawn M.; Clune, Thomas L.
2012-01-01
MapReduce is an approach to high-performance analytics that may be useful to data intensive problems in climate research. It offers an analysis paradigm that uses clusters of computers and combines distributed storage of large data sets with parallel computation. We are particularly interested in the potential of MapReduce to speed up basic operations common to a wide range of analyses. In order to evaluate this potential, we are prototyping a series of canonical MapReduce operations over a test suite of observational and climate simulation datasets. Our initial focus has been on averaging operations over arbitrary spatial and temporal extents within Modern Era Retrospective-Analysis for Research and Applications (MERRA) data. Preliminary results suggest this approach can improve efficiencies within data intensive analytic workflows.
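A minimal sketch of the map/reduce averaging pattern the abstract describes, using Python's multiprocessing on one machine in place of a Hadoop-style cluster; the array shape and chunking are illustrative stand-ins for MERRA fields.

```python
from multiprocessing import Pool
import numpy as np

def map_chunk(chunk):
    # Map: reduce one time-slice chunk to a (partial sum, count) pair.
    arr = np.asarray(chunk, dtype=np.float64)
    return arr.sum(axis=0), arr.shape[0]

def reduce_pairs(pairs):
    # Reduce: combine partial sums into a single temporal-mean field.
    total = sum(s for s, _ in pairs)
    count = sum(c for _, c in pairs)
    return total / count

if __name__ == "__main__":
    data = np.random.rand(365, 90, 180)        # stand-in for a year of gridded fields
    chunks = np.array_split(data, 8, axis=0)   # partition along the time axis
    with Pool(8) as pool:
        mean_field = reduce_pairs(pool.map(map_chunk, chunks))
    assert np.allclose(mean_field, data.mean(axis=0))
```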
NASA Astrophysics Data System (ADS)
Fiore, Sandro; Płóciennik, Marcin; Doutriaux, Charles; Blanquer, Ignacio; Barbera, Roberto; Donvito, Giacinto; Williams, Dean N.; Anantharaj, Valentine; Salomoni, Davide D.; Aloisio, Giovanni
2017-04-01
In many scientific domains such as climate, data is often n-dimensional and requires tools that support specialized data types and primitives to be properly stored, accessed, analysed and visualized. Moreover, new challenges arise in large-scale scenarios and eco-systems where petabytes (PB) of data can be available and data can be distributed and/or replicated, such as the Earth System Grid Federation (ESGF) serving the Coupled Model Intercomparison Project, Phase 5 (CMIP5) experiment, providing access to 2.5PB of data for the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5). A case study on climate model intercomparison data analysis addressing several classes of multi-model experiments is being implemented in the context of the EU H2020 INDIGO-DataCloud project. Such experiments require the availability of large amounts of data (multi-terabyte order) related to the output of several climate model simulations, as well as the exploitation of scientific data management tools for large-scale data analytics. More specifically, the talk discusses in detail a use case on precipitation trend analysis in terms of requirements, architectural design solution, and infrastructural implementation. The experiment has been tested and validated on CMIP5 datasets, in the context of a large-scale distributed testbed across the EU and US involving three ESGF sites (LLNL, ORNL, and CMCC) and one central orchestrator site (PSNC). The general "environment" of the case study relates to: (i) multi-model data analysis inter-comparison challenges; (ii) addressed on CMIP5 data; and (iii) made available through the IS-ENES/ESGF infrastructure. The added value of the solution proposed in the INDIGO-DataCloud project is summarized in the following: (i) it implements a different paradigm (from client- to server-side); (ii) it intrinsically reduces data movement; (iii) it makes the end-user setup lightweight; (iv) it fosters re-usability (of data, final/intermediate products, workflows, sessions, etc.) since everything is managed on the server side; (v) it complements, extends and interoperates with the ESGF stack; (vi) it provides a "tool" for scientists to run multi-model experiments; and (vii) it can drastically reduce the time-to-solution for these experiments from weeks to hours. At the time this contribution is being written, the proposed testbed represents the first concrete implementation of a distributed multi-model experiment in the ESGF/CMIP context joining server-side and parallel processing, end-to-end workflow management and cloud computing. As opposed to the current scenario based on search & discovery, data download, and client-based data analysis, the INDIGO-DataCloud architectural solution described in this contribution addresses the scientific computing & analytics requirements by providing a paradigm shift based on server-side and high-performance big data frameworks jointly with two-level workflow management systems realized at the PaaS level via a cloud infrastructure.
NASA Astrophysics Data System (ADS)
Steinberg, P. D.; Brener, G.; Duffy, D.; Nearing, G. S.; Pelissier, C.
2017-12-01
Hyperparameterization of statistical models, i.e. automated model scoring and selection using methods such as evolutionary algorithms, grid searches, and randomized searches, can improve forecast model skill by reducing errors associated with model parameterization, model structure, and statistical properties of training data. Ensemble Learning Models (Elm), and the related Earthio package, provide a flexible interface for automating the selection of parameters and model structure for machine learning models common in climate science and land cover classification, offering convenient tools for loading NetCDF, HDF, Grib, or GeoTiff files, decomposition methods like PCA and manifold learning, and parallel training and prediction with unsupervised and supervised classification, clustering, and regression estimators. Continuum Analytics is using Elm to experiment with statistical soil moisture forecasting based on meteorological forcing data from NASA's North American Land Data Assimilation System (NLDAS). There, Elm uses the NSGA-2 multiobjective optimization algorithm to optimize statistical preprocessing of the forcing data and improve goodness-of-fit for statistical models (i.e. feature engineering). This presentation will discuss Elm and its components, including dask (distributed task scheduling), xarray (data structures for n-dimensional arrays), and scikit-learn (statistical preprocessing, clustering, classification, regression), and it will show how NSGA-2 is being used to automate the selection of soil moisture forecast statistical models for North America.
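Elm's own API is not shown here; as an illustration of the automated model scoring and selection the abstract describes, a minimal scikit-learn randomized search over a toy forcing/soil-moisture regression might look like the following (all data, parameter ranges, and the choice of random forests are invented for the sketch).

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RandomizedSearchCV

# Toy stand-ins for meteorological forcing (X) and soil moisture (y).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
y = 0.5 * X[:, 0] + np.sin(X[:, 1]) + rng.normal(scale=0.1, size=500)

# Randomized search over model structure; cross-validation does the scoring.
search = RandomizedSearchCV(
    RandomForestRegressor(random_state=0),
    param_distributions={
        "n_estimators": [50, 100, 200],
        "max_depth": [3, 5, 10, None],
        "min_samples_leaf": [1, 2, 5],
    },
    n_iter=10, cv=3, scoring="neg_mean_squared_error", n_jobs=-1,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```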
Rice, Kenneth G.; Beier, Paul; Breault, Tim; Middleton, Beth A.; Peck, Myron A.; Tirpak, John M.; Ratnaswamy, Mary; Austen, Douglas; Harrison, Sarah
2017-01-01
In 2008, the U.S. Congress authorized the establishment of the National Climate Change and Wildlife Science Center (NCCWSC) within the U.S. Department of Interior (DOI). Housed administratively within the U.S. Geological Survey (USGS), NCCWSC is part of the DOI’s ongoing mission to meet the challenges of climate change and its effects on wildlife and aquatic resources. From 2010 through 2012, NCCWSC established eight regional DOI Climate Science Centers (CSCs). Each of these regional CSCs operated with the mission to “synthesize and integrate climate change impact data and develop tools that the Department’s managers and partners can use when managing the Department’s land, water, fish and wildlife, and cultural heritage resources” (Salazar 2009). The model developed by NCCWSC for the regional CSCs employed a dual approach of a federal USGS-staffed component and a parallel host-university component established competitively through a 5-year cooperative agreement with NCCWSC. At the conclusion of this 5-year agreement, a review of each CSC was undertaken, with the Southeast Climate Science Center (SE CSC) review in February 2016. The SE CSC is hosted by North Carolina State University (NCSU) in Raleigh, North Carolina, and is physically housed within the NCSU Department of Applied Ecology along with the Center for Applied Aquatic Ecology, the North Carolina Cooperative Fish and Wildlife Research Unit (CFWRU), and the North Carolina Agromedicine Institute. The U.S. Department of Agriculture Southeast Regional Climate Hub is based at NCSU as is the National Oceanic and Atmospheric Administration (NOAA) Southeast Regional Climate Center, the North Carolina Institute for Climate Studies, the North Carolina Wildlife Resources Commission, the NOAA National Weather Service, the State Climate Office of North Carolina, and the U.S. Forest Service Eastern Forest Environmental Threat Assessment Center. This creates a strong core of organizations operating in close proximity focused on climate issues. The geographic area covered by the SE CSC represents all or part of 16 states and the Caribbean Islands and has overlapping boundaries with seven Landscape Conservation Cooperatives (LCCs): Appalachian LCC, Eastern Tallgrass Prairie and Big Rivers LCC, Gulf Coast Prairie LCC, Gulf Coastal Plains and Ozarks LCC, Peninsular Florida LCC, South Atlantic LCC, and Caribbean LCC. The SE CSC region also encompasses 134 U.S. Fish and Wildlife Service refuges and 89 National Park Service (NPS) units and is home to 11 federally recognized and 54 state recognized tribes.
Ferrucci, Filomena; Salza, Pasquale; Sarro, Federica
2017-06-29
The need to improve the scalability of Genetic Algorithms (GAs) has motivated research on Parallel Genetic Algorithms (PGAs), and different technologies and approaches have been used. Hadoop MapReduce represents one of the most mature technologies for developing parallel algorithms. Given that parallel algorithms introduce communication overhead, the aim of the present work is to understand if, and possibly when, parallel GA solutions using Hadoop MapReduce show better performance than sequential versions in terms of execution time. Moreover, we are interested in understanding which PGA model is most effective among the global, grid, and island models. We empirically assessed the performance of these three parallel models with respect to a sequential GA on a software engineering problem, evaluating the execution time and the achieved speedup. We also analysed the behaviour of the parallel models in relation to the overhead produced by the use of Hadoop MapReduce and the GAs' computational effort, which gives a more machine-independent measure of these algorithms. We exploited three problem instances to differentiate the computation load and three cluster configurations based on 2, 4, and 8 parallel nodes. Moreover, we estimated the costs of executing the experimentation on a potential cloud infrastructure, based on the pricing of the major commercial cloud providers. The empirical study revealed that the PGA based on the island model outperforms the other parallel models and the sequential GA for all the considered instances and clusters. Using 2, 4, and 8 nodes, the island model achieves an average speedup over the three datasets of 1.8, 3.4, and 7.0 times, respectively. Hadoop MapReduce has a set of constraints that need to be considered during the design and implementation of parallel algorithms. The overhead of data store (i.e., HDFS) accesses, communication, and latency requires solutions that reduce data store operations. For this reason, the island model is more suitable for PGAs than the global and grid models, also in terms of costs when executed on a commercial cloud provider.
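A minimal pure-Python sketch of the island model itself, on a OneMax toy problem. Sequential loops stand in for Hadoop MapReduce tasks; population sizes, the mutation rate, and the ring-migration policy are illustrative choices, not the paper's configuration.

```python
import random

def evolve_island(pop, fitness, generations, elite=2):
    # Simple elitist GA step on one island.
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: len(pop) // 2]
        children = []
        while len(children) < len(pop) - elite:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(a))
            child = a[:cut] + b[cut:]                 # one-point crossover
            if random.random() < 0.1:                 # bit-flip mutation
                i = random.randrange(len(child))
                child = child[:i] + (1 - child[i],) + child[i + 1:]
            children.append(child)
        pop = pop[:elite] + children
    return pop

def island_model(n_islands, pop_size, genome_len, fitness, epochs, migrants=2):
    islands = [[tuple(random.randint(0, 1) for _ in range(genome_len))
                for _ in range(pop_size)] for _ in range(n_islands)]
    for _ in range(epochs):
        islands = [evolve_island(p, fitness, generations=10) for p in islands]
        # Ring migration: best individuals replace the worst on the next island.
        for i, pop in enumerate(islands):
            best = sorted(pop, key=fitness, reverse=True)[:migrants]
            nxt = islands[(i + 1) % n_islands]
            nxt.sort(key=fitness)
            nxt[:migrants] = best
    return max((ind for pop in islands for ind in pop), key=fitness)

print(island_model(4, 20, 16, fitness=sum, epochs=5))
```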
Effect of Metamorphic Foliation on Regolith Thickness, Catalina Critical Zone Observatory, Arizona
NASA Astrophysics Data System (ADS)
Leone, J. D.; Holbrook, W. S.; Chorover, J.; Carr, B.
2016-12-01
Terrestrial life is sustained by nutrients and water held in soil and weathered rock, which are components of the Earth's critical zone, referred to as regolith. The thickness of regolith in the near-surface is thought to be influenced by factors such as climate, topographic stress, erosion and lithology. Our study has two aims: to determine the effect of metamorphic foliation on regolith thickness and to test an environmental model, Effective Energy Mass Transfer (EEMT), within a zero-order basin (ZOB) in the Santa Catalina Mountains. Seismic refraction and electrical resistivity data show a stark contrast in physical properties, and inferred regolith thickness, on north- versus south-facing slopes: north-facing slopes are characterized by higher seismic velocities and higher resistivities, consistent with thin regolith, while south-facing slopes show lower resistivities and velocities, indicative of deeper and more extensive weathering. This contrast is exactly the opposite of that expected from most climatic models, including the EEMT model, which predicts deeper regolith on north-facing slopes. Instead, regolith thickness appears to be controlled by metamorphic foliation: we observed a general, positive correlation between interpreted regolith thickness and foliation dip within heavily foliated lithologies and no correlation in weakly foliated lithologies. We hypothesize that hydraulic conductivity controls weathering here: where foliation is parallel to the surface topography, regolith is thin, but where foliation pierces the surface topography at a substantial angle, regolith is thick. The effect of foliation is much larger than that expected from environmental models: regolith thickness varies by a factor of 4 (2.5 m vs. 10 m). These results suggest that metamorphic foliation, and perhaps by extension sedimentary layering, plays a key role in determining regolith thickness and must be accounted for in models of critical zone development.
Gridding Cloud and Irradiance to Quantify Variability at the ARM Southern Great Plains Site
NASA Astrophysics Data System (ADS)
Riihimaki, L.; Long, C. N.; Gaustad, K.
2017-12-01
Ground-based radiometers provide the most accurate measurements of surface irradiance. However, geometry differences between surface point measurements and large area climate model grid boxes or satellite-based footprints can cause systematic differences in surface irradiance comparisons. In this work, irradiance measurements from a network of ground stations around Kansas and Oklahoma at the US Department of Energy Atmospheric Radiation Measurement (ARM) Southern Great Plains facility are examined. Upwelling and downwelling broadband shortwave and longwave radiometer measurements are available at each site as well as surface meteorological measurements. In addition to the measured irradiances, clear sky irradiance and cloud fraction estimates are analyzed using well established methods based on empirical fits to measured clear sky irradiances. Measurements are interpolated onto a 0.25 degree latitude and longitude grid using a Gaussian weight scheme in order to provide a more accurate statistical comparison between ground measurements and a larger area such as that used in climate models, plane parallel radiative transfer calculations, and other statistical and climatological research. Validation of the gridded product will be shown, as well as analysis that quantifies the impact of site location, cloud type, and other factors on the resulting surface irradiance estimates. The results of this work are being incorporated into the Surface Cloud Grid operational data product produced by ARM, and will be made publicly available for use by others.
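A minimal sketch of the Gaussian-weight gridding step, assuming a simple squared-degree distance metric and an illustrative length scale sigma; the operational product's exact weighting kernel and quality controls are not reproduced.

```python
import numpy as np

def gaussian_grid(station_lon, station_lat, values, grid_lon, grid_lat, sigma=0.5):
    # Interpolate station values onto a regular grid with Gaussian weights;
    # sigma is the e-folding length scale in degrees (illustrative choice).
    glon, glat = np.meshgrid(grid_lon, grid_lat)
    out = np.zeros_like(glon, dtype=float)
    wsum = np.zeros_like(glon, dtype=float)
    for lon, lat, v in zip(station_lon, station_lat, values):
        d2 = (glon - lon) ** 2 + (glat - lat) ** 2     # squared degree distance
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        out += w * v
        wsum += w
    return out / wsum

# Two invented stations on a 0.25-degree grid over the SGP region.
grid = gaussian_grid([-97.5, -96.8], [36.6, 36.4], [550.0, 560.0],
                     np.arange(-99, -95, 0.25), np.arange(35, 38, 0.25))
```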
Effects of land cover change on the tropical circulation in a GCM
NASA Astrophysics Data System (ADS)
Jonko, Alexandra Karolina; Hense, Andreas; Feddema, Johannes Jan
2010-09-01
Multivariate statistics are used to investigate sensitivity of the tropical atmospheric circulation to scenario-based global land cover change (LCC), with the largest changes occurring in the tropics. Three simulations performed with the fully coupled Parallel Climate Model (PCM) are compared: (1) a present day control run; (2) a simulation with present day land cover and Intergovernmental Panel on Climate Change (IPCC) Special Report on Emission Scenarios (SRES) A2 greenhouse gas (GHG) projections; and (3) a simulation with SRES A2 land cover and GHG projections. Dimensionality of PCM data is reduced by projection onto a priori specified eigenvectors, consisting of Rossby and Kelvin waves produced by a linearized, reduced gravity model of the tropical circulation. A Hotelling T² test is performed on projection amplitudes. Effects of LCC evaluated by this method are limited to diabatic heating. A statistically significant and recurrent signal is detected for 33% of all tests performed for various combinations of parameters. Taking into account uncertainties and limitations of the present methodology, this signal can be interpreted as a Rossby wave response to prescribed LCC. The Rossby waves are shallow, large-scale motions, trapped at the equator and most pronounced in boreal summer. Differences in mass and flow fields indicate a shift of the tropical Walker circulation patterns with an anomalous subsidence over tropical South America.
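For reference, a one-sample Hotelling T² test on projection-amplitude vectors can be sketched as follows; the data here are synthetic, and the paper's exact test configuration is not reproduced.

```python
import numpy as np
from scipy import stats

def hotelling_t2(amplitudes, mu0):
    # One-sample Hotelling T^2 test.
    # amplitudes: (n, p) array of n samples of p projection coefficients;
    # mu0: length-p mean vector under the null (e.g., the control run).
    n, p = amplitudes.shape
    diff = amplitudes.mean(axis=0) - mu0
    S = np.cov(amplitudes, rowvar=False)            # sample covariance
    t2 = n * diff @ np.linalg.solve(S, diff)
    f = (n - p) / (p * (n - 1)) * t2                # F-distributed under H0
    pval = stats.f.sf(f, p, n - p)
    return t2, pval

rng = np.random.default_rng(1)
t2, p = hotelling_t2(rng.normal(0.2, 1.0, size=(40, 3)), mu0=np.zeros(3))
print(t2, p)
```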
gpuPOM: a GPU-based Princeton Ocean Model
NASA Astrophysics Data System (ADS)
Xu, S.; Huang, X.; Zhang, Y.; Fu, H.; Oey, L.-Y.; Xu, F.; Yang, G.
2014-11-01
Rapid advances in the performance of the graphics processing unit (GPU) have made the GPU a compelling solution for a range of scientific applications. However, most existing GPU acceleration work for climate models ports only partial code for certain hot spots, and can thus achieve only limited speedup for the entire model. In this work, we take the mpiPOM (a parallel version of the Princeton Ocean Model) as our starting point and design and implement a GPU-based Princeton Ocean Model. By carefully considering the architectural features of state-of-the-art GPU devices, we rewrite the full mpiPOM model from the original Fortran version into a new Compute Unified Device Architecture C (CUDA-C) version. We apply several accelerating methods to further improve the performance of gpuPOM, including optimizing memory access on a single GPU, overlapping communication and boundary operations among multiple GPUs, and overlapping input/output (I/O) between the CPU and the GPU in the hybrid system. Our experimental results indicate that the performance of the gpuPOM on a workstation containing 4 GPUs is comparable to that of a powerful cluster with 408 CPU cores, while reducing energy consumption by a factor of 6.8.
Karasick, Michael S.; Strip, David R.
1996-01-01
A parallel computing system is described that comprises a plurality of uniquely labeled, parallel processors, each processor capable of modelling a three-dimensional object that includes a plurality of vertices, faces and edges. The system comprises a front-end processor for issuing a modelling command to the parallel processors, relating to a three-dimensional object. Each parallel processor, in response to the command and through the use of its own unique label, creates a directed-edge (d-edge) data structure that uniquely relates an edge of the three-dimensional object to one face of the object. Each d-edge data structure at least includes vertex descriptions of the edge and a description of the one face. As a result, each processor, in response to the modelling command, operates upon a small component of the model and generates results, in parallel with all other processors, without the need for processor-to-processor intercommunication.
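A minimal sketch of such a d-edge record and its per-processor construction, with hypothetical field names; the patent's exact data layout is not reproduced, only the idea that each directed edge binds one edge of the solid to exactly one face and can be created independently by a labeled processor.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DEdge:
    # Directed edge relating one edge of a solid to exactly one face.
    tail: tuple      # (x, y, z) of the edge's start vertex
    head: tuple      # (x, y, z) of the edge's end vertex
    face_id: int     # the single face this directed edge bounds
    owner: int       # label of the processor that created it

def build_dedges(processor_label, faces):
    # Each processor emits d-edges for the faces it owns, independently
    # of all other processors (no intercommunication needed).
    dedges = []
    for face_id, loop in faces:                  # loop: ordered vertex tuples
        for a, b in zip(loop, loop[1:] + loop[:1]):
            dedges.append(DEdge(a, b, face_id, processor_label))
    return dedges

# One square face; the reversed orientation would belong to the adjacent face.
quad = [(0, ((0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)))]
print(build_dedges(processor_label=3, faces=quad))
```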
Romañach, Stephanie; Watling, James I.; Fletcher, Robert J.; Speroterra, Carolina; Bucklin, David N.; Brandt, Laura A.; Pearlstine, Leonard G.; Escribano, Yesenia; Mazzotti, Frank J.
2014-01-01
Climate change poses new challenges for natural resource managers. Predictive modeling of species–environment relationships using climate envelope models can enhance our understanding of climate change effects on biodiversity, assist in assessment of invasion risk by exotic organisms, and inform life-history understanding of individual species. While increasing interest has focused on the role of uncertainty in future conditions on model predictions, models also may be sensitive to the initial conditions on which they are trained. Although climate envelope models are usually trained using data on contemporary climate, we lack systematic comparisons of model performance and predictions across alternative climate data sets available for model training. Here, we seek to fill that gap by comparing variability in predictions between two contemporary climate data sets to variability in spatial predictions among three alternative projections of future climate. Overall, correlations between monthly temperature and precipitation variables were very high for both contemporary and future data. Model performance varied across algorithms, but not between two alternative contemporary climate data sets. Spatial predictions varied more among alternative general-circulation models describing future climate conditions than between contemporary climate data sets. However, we did find that climate envelope models with low Cohen's kappa scores made more discrepant spatial predictions between climate data sets for the contemporary period than did models with high Cohen's kappa scores. We suggest conservation planners evaluate multiple performance metrics and be aware of the importance of differences in initial conditions for spatial predictions from climate envelope models.
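For reference, Cohen's kappa for two binary suitability maps, kappa = (po - pe) / (1 - pe) with po the observed agreement and pe the agreement expected by chance, can be computed as in this minimal sketch.

```python
import numpy as np

def cohens_kappa(pred_a, pred_b):
    # Cohen's kappa for two binary presence/absence maps (1 = suitable).
    a = np.asarray(pred_a).ravel().astype(bool)
    b = np.asarray(pred_b).ravel().astype(bool)
    po = np.mean(a == b)                                        # observed agreement
    pe = a.mean() * b.mean() + (1 - a.mean()) * (1 - b.mean())  # chance agreement
    return (po - pe) / (1 - pe)

print(cohens_kappa([1, 1, 0, 0, 1, 0], [1, 0, 0, 0, 1, 0]))
```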
NASA Astrophysics Data System (ADS)
Kan, Guangyuan; He, Xiaoyan; Ding, Liuqian; Li, Jiren; Hong, Yang; Zuo, Depeng; Ren, Minglei; Lei, Tianjie; Liang, Ke
2018-01-01
Hydrological model calibration has been a hot issue for decades. The shuffled complex evolution method developed at the University of Arizona (SCE-UA) has proven to be an effective and robust optimization approach. However, its computational efficiency deteriorates significantly when the amount of hydrometeorological data increases. In recent years, the rise of heterogeneous parallel computing has brought hope for accelerating hydrological model calibration. This study proposed a parallel SCE-UA method and applied it to the calibration of a watershed rainfall-runoff model, the Xinanjiang model. The parallel method was implemented on heterogeneous computing systems using OpenMP and CUDA. Performance testing and sensitivity analysis were carried out to verify its correctness and efficiency. Comparison results indicated that the heterogeneous parallel computing-accelerated SCE-UA converged much more quickly than the original serial version and possessed satisfactory accuracy and stability for the task of fast hydrological model calibration.
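The full SCE-UA shuffling logic is not reproduced here; the parallelizable kernel is the independent evaluation of many parameter sets, which a minimal Python sketch can illustrate. The two-parameter toy runoff model and the synthetic observations are invented for the example.

```python
from multiprocessing import Pool
import numpy as np

def nse_loss(params):
    # Objective for one parameter set: 1 - Nash-Sutcliffe efficiency of a
    # toy linear-reservoir runoff model against synthetic observations.
    k, c = params
    rain = np.abs(np.sin(np.arange(100.0)))          # stand-in forcing
    sim = c * np.convolve(rain, np.exp(-np.arange(20.0) / k))[:100]
    obs = 0.8 * np.convolve(rain, np.exp(-np.arange(20.0) / 3.0))[:100]
    return np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    population = rng.uniform([0.5, 0.1], [10.0, 2.0], size=(64, 2))
    with Pool() as pool:                             # evaluate all points at once
        losses = pool.map(nse_loss, population)
    best = population[int(np.argmin(losses))]
    print(best)
```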
Iterative algorithms for large sparse linear systems on parallel computers
NASA Technical Reports Server (NTRS)
Adams, L. M.
1982-01-01
Algorithms are developed for assembling in parallel the sparse systems of linear equations that result from finite difference or finite element discretizations of elliptic partial differential equations, such as those that arise in structural engineering. Parallel linear stationary iterative algorithms and parallel preconditioned conjugate gradient algorithms are developed for solving these systems. In addition, a model for comparing parallel algorithms on array architectures is developed, and results of this model for the algorithms are given.
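A minimal sketch of a preconditioned conjugate gradient solver with a Jacobi (diagonal) preconditioner, applied to a 1-D Poisson test matrix; the paper's array-architecture mapping is not shown, only the serial algorithm that would be parallelized.

```python
import numpy as np

def jacobi_pcg(A, b, tol=1e-8, max_iter=1000):
    # Conjugate gradients with a Jacobi (diagonal) preconditioner.
    M_inv = 1.0 / np.diag(A)                  # inverse of the diagonal of A
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# 1-D Poisson test matrix (SPD tridiagonal), as from a finite-difference grid.
n = 100
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
x = jacobi_pcg(A, np.ones(n))
```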
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liou, Kuo-Nan
2016-02-09
Under the support of the aforementioned DOE Grant, we have made two fundamental contributions to atmospheric and climate sciences: (1) development of an efficient 3-D radiative transfer parameterization for application to intense and intricate inhomogeneous mountain/snow regions, and (2) a novel stochastic parameterization for light absorption by internally mixed black carbon and dust particles in snow grains, providing understanding and physical insight into snow albedo reduction in climate models. With reference to item (1), we divided solar fluxes reaching mountain surfaces into five components: direct and diffuse fluxes, direct- and diffuse-reflected fluxes, and coupled mountain-mountain flux. "Exact" 3D Monte Carlo photon tracing computations can then be performed for these solar flux components to compare with those calculated from the conventional plane-parallel (PP) radiative transfer program readily available in climate models. Subsequently, parameterizations of the deviations of 3D from PP results for the five flux components are carried out by means of multiple linear regression analysis associated with topographic information, including elevation, solar incident angle, sky view factor, and terrain configuration factor. We derived five regression equations with high statistical correlations for flux deviations and successfully incorporated this efficient parameterization into the WRF model, which was used as the testbed in connection with the Fu-Liou-Gu PP radiation scheme that has been included in the WRF physics package. Incorporating this 3D parameterization program, we conducted simulations of WRF and CCSM4 to understand and evaluate the mountain/snow effect on snow albedo reduction during seasonal transition and the interannual variability of snowmelt, cloud cover, and precipitation over the Western United States, presented in the final report. With reference to item (2), we developed in our previous research a geometric-optics surface-wave approach (GOS) for the computation of light absorption and scattering by complex and inhomogeneous particles for application to aggregates and snow grains with external and internal mixing structures. We demonstrated that a small black carbon (BC) particle on the order of 1 μm internally mixed with snow grains could effectively reduce visible snow albedo by as much as 5-10%. Following this work and within the context of DOE support, we have made two key accomplishments presented in the attached final report.
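The regression step can be sketched as an ordinary least-squares fit of the 3D-minus-PP flux deviation to the four topographic predictors named above; the linear functional form and the example call are assumptions, shown only to illustrate the idea of one regression per flux component.

```python
import numpy as np

def fit_flux_deviation(elevation, cos_sza, sky_view, terrain_cfg, deviation):
    # Least-squares fit of (3-D minus plane-parallel) flux deviations to
    # topographic predictors; one such regression per flux component.
    X = np.column_stack([np.ones_like(elevation), elevation,
                         cos_sza, sky_view, terrain_cfg])
    coef, *_ = np.linalg.lstsq(X, deviation, rcond=None)
    return coef                                 # intercept plus 4 slopes

def predict_deviation(coef, elevation, cos_sza, sky_view, terrain_cfg):
    X = np.column_stack([np.ones_like(elevation), elevation,
                         cos_sza, sky_view, terrain_cfg])
    return X @ coef

# Synthetic example: fit on invented training points.
rng = np.random.default_rng(0)
elev, mu, vf, ct = (rng.random(500) for _ in range(4))
dev = 3.0 * mu - 2.0 * vf + rng.normal(scale=0.1, size=500)
coef = fit_flux_deviation(elev, mu, vf, ct, dev)
```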
Profiling and Improving I/O Performance of a Large-Scale Climate Scientific Application
NASA Technical Reports Server (NTRS)
Liu, Zhuo; Wang, Bin; Wang, Teng; Tian, Yuan; Xu, Cong; Wang, Yandong; Yu, Weikuan; Cruz, Carlos A.; Zhou, Shujia; Clune, Tom;
2013-01-01
Exascale computing systems are soon to emerge, which will pose great challenges due to the huge gap between computing and I/O performance. Many large-scale scientific applications play an important role in our daily life. The huge amounts of data generated by such applications require highly parallel and efficient I/O management policies. In this paper, we adopt a mission-critical scientific application, GEOS-5, as a case study to profile and analyze the communication and I/O issues that prevent applications from fully utilizing the underlying parallel storage systems. Through detailed architectural and experimental characterization, we observe that current legacy I/O schemes incur significant network communication overheads and are unable to fully parallelize the data access, thus degrading applications' I/O performance and scalability. To address these inefficiencies, we redesign its I/O framework along with a set of parallel I/O techniques to achieve high scalability and performance. Evaluation results on the NASA Discover cluster show that our optimization of GEOS-5 with ADIOS has led to significant performance improvements compared to the original GEOS-5 implementation.
Wu, Xiao-Lin; Sun, Chuanyu; Beissinger, Timothy M; Rosa, Guilherme Jm; Weigel, Kent A; Gatti, Natalia de Leon; Gianola, Daniel
2012-09-25
Most Bayesian models for the analysis of complex traits are not analytically tractable and inferences are based on computationally intensive techniques. This is true of Bayesian models for genome-enabled selection, which uses whole-genome molecular data to predict the genetic merit of candidate animals for breeding purposes. In this regard, parallel computing can overcome the bottlenecks that can arise from series computing. Hence, a major goal of the present study is to bridge the gap to high-performance Bayesian computation in the context of animal breeding and genetics. Parallel Monte Carlo Markov chain algorithms and strategies are described in the context of animal breeding and genetics. Parallel Monte Carlo algorithms are introduced as a starting point including their applications to computing single-parameter and certain multiple-parameter models. Then, two basic approaches for parallel Markov chain Monte Carlo are described: one aims at parallelization within a single chain; the other is based on running multiple chains, yet some variants are discussed as well. Features and strategies of the parallel Markov chain Monte Carlo are illustrated using real data, including a large beef cattle dataset with 50K SNP genotypes. Parallel Markov chain Monte Carlo algorithms are useful for computing complex Bayesian models, which does not only lead to a dramatic speedup in computing but can also be used to optimize model parameters in complex Bayesian models. Hence, we anticipate that use of parallel Markov chain Monte Carlo will have a profound impact on revolutionizing the computational tools for genomic selection programs.
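A minimal sketch of the multiple-chains approach, running independent Metropolis chains on separate cores with Python's multiprocessing; a standard-normal target stands in for a genomic-selection posterior, and the proposal scale and burn-in fraction are illustrative choices.

```python
from multiprocessing import Pool
import numpy as np

def run_chain(seed, n_iter=5000):
    # One Metropolis chain sampling a standard normal target density.
    rng = np.random.default_rng(seed)
    log_p = lambda v: -0.5 * v * v
    x, chain = 0.0, np.empty(n_iter)
    for i in range(n_iter):
        prop = x + rng.normal(scale=1.0)            # random-walk proposal
        if np.log(rng.random()) < log_p(prop) - log_p(x):
            x = prop
        chain[i] = x
    return chain[n_iter // 2:]                      # discard burn-in half

if __name__ == "__main__":
    with Pool(4) as pool:                           # chains run on separate cores
        chains = pool.map(run_chain, [1, 2, 3, 4])
    samples = np.concatenate(chains)
    print(samples.mean(), samples.std())
```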
Full Stokes finite-element modeling of ice sheets using a graphics processing unit
NASA Astrophysics Data System (ADS)
Seddik, H.; Greve, R.
2016-12-01
Thermo-mechanical simulation of ice sheets is an important approach to understand and predict their evolution in a changing climate. For that purpose, higher-order (e.g., ISSM, BISICLES) and full Stokes (e.g., Elmer/Ice, http://elmerice.elmerfem.org) models are increasingly used to more accurately model the flow of entire ice sheets. In parallel to this development, the rapidly improving performance and capabilities of Graphics Processing Units (GPUs) allow more calculations of complex and computationally demanding problems to be efficiently offloaded onto those devices. Thus, in order to continue the trend of using full Stokes models at greater resolutions, GPUs should be considered for the implementation of ice sheet models. We developed the GPU-accelerated ice-sheet model Sainō. Sainō is an Elmer (http://www.csc.fi/english/pages/elmer) derivative implemented in Objective-C which solves the full Stokes equations with the finite element method. It uses the standard OpenCL language (http://www.khronos.org/opencl/) to offload the assembly of the finite element matrix onto the GPU. A mesh-coloring scheme is used so that elements with the same color (sharing no nodes) are assembled in parallel on the GPU without the need for synchronization primitives. The current implementation shows that, for the ISMIP-HOM experiment A, during the matrix assembly in double precision with 8000, 87,500 and 252,000 brick elements, Sainō is respectively 2x, 10x and 14x faster than Elmer/Ice (when both models are run on a single processing unit). In single precision, Sainō is even 3x, 20x and 25x faster than Elmer/Ice. A detailed description of the comparative results between Sainō and Elmer/Ice will be presented, along with further perspectives on optimization and the limitations of the current implementation.
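The mesh-coloring idea can be sketched with a greedy algorithm: elements that share a node receive different colors, so all elements of one color can be assembled concurrently without atomics or locks. A minimal sketch follows (Sainō's actual coloring scheme may differ).

```python
from collections import defaultdict

def color_elements(elements):
    # Greedy coloring: elements sharing a node never share a color, so all
    # elements of one color can be assembled concurrently without conflicts.
    node_to_elems = defaultdict(list)
    for e, nodes in enumerate(elements):
        for n in nodes:
            node_to_elems[n].append(e)
    colors = {}
    for e, nodes in enumerate(elements):
        used = {colors[o] for n in nodes for o in node_to_elems[n] if o in colors}
        c = 0
        while c in used:
            c += 1
        colors[e] = c
    return colors

# Four quad elements on a 2x2 patch, all sharing the centre node (4):
quads = [(0, 1, 4, 3), (1, 2, 5, 4), (3, 4, 7, 6), (4, 5, 8, 7)]
print(color_elements(quads))   # the shared node forces four distinct colors
```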
A model for optimizing file access patterns using spatio-temporal parallelism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boonthanome, Nouanesengsy; Patchett, John; Geveci, Berk
2013-01-01
For many years now, I/O read time has been recognized as the primary bottleneck for parallel visualization and analysis of large-scale data. In this paper, we introduce a model that can estimate the read time for a file stored in a parallel filesystem when given the file access pattern. Read times ultimately depend on how the file is stored and the access pattern used to read the file. The file access pattern will be dictated by the type of parallel decomposition used. We employ spatio-temporal parallelism, which combines both spatial and temporal parallelism, to provide greater flexibility in the possible file access patterns. Using our model, we were able to configure the spatio-temporal parallelism to design optimized read access patterns that resulted in a speedup factor of approximately 400 over traditional file access patterns.
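The paper's model itself is not reproduced here; a toy cost model of the same flavor, splitting read time into per-request latency and bandwidth-limited transfer, illustrates why the access pattern matters. All numbers below are invented.

```python
def estimated_read_time(n_blocks, block_bytes, n_readers,
                        bandwidth_bps, latency_s):
    # Toy cost model: each reader issues its share of block requests;
    # read time = per-request latency plus bytes over aggregate bandwidth.
    requests_per_reader = -(-n_blocks // n_readers)   # ceiling division
    seek_cost = requests_per_reader * latency_s
    transfer_cost = n_blocks * block_bytes / bandwidth_bps
    return seek_cost + transfer_cost

# Same total bytes: few large contiguous reads vs. many small strided reads.
contiguous = estimated_read_time(64, 64 * 2**20, 16, 10 * 2**30, 0.01)
strided = estimated_read_time(65536, 64 * 2**10, 16, 10 * 2**30, 0.01)
print(contiguous, strided)    # the strided pattern pays far more latency
```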
Optimisation of a parallel ocean general circulation model
NASA Astrophysics Data System (ADS)
Beare, M. I.; Stevens, D. P.
1997-10-01
This paper presents the development of a general-purpose parallel ocean circulation model, for use on a wide range of computer platforms, from traditional scalar machines to workstation clusters and massively parallel processors. Parallelism is provided, as a modular option, via high-level message-passing routines, thus hiding the technical intricacies from the user. An initial implementation highlights that the parallel efficiency of the model is adversely affected by a number of factors, for which optimisations are discussed and implemented. The resulting ocean code is portable and, in particular, allows science to be achieved on local workstations that could otherwise only be undertaken on state-of-the-art supercomputers.
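A common building block of such message-passing ocean codes is the halo (ghost-cell) exchange between neighbouring subdomains. A minimal mpi4py sketch, assuming a 1-D periodic decomposition in latitude, might look like the following (run under mpirun with several ranks; the array shape and periodic wraparound are illustrative assumptions).

```python
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Each rank owns one latitude band plus one halo row on each side
# (row 0 and row -1 are halos; rows 1..-2 are the owned interior).
local = np.full((10, 360), float(rank))
north = (rank - 1) % size
south = (rank + 1) % size

# Swap edge rows with neighbours; halos then mirror remote interiors.
comm.Sendrecv(sendbuf=local[1, :].copy(), dest=north,
              recvbuf=local[-1, :], source=south)
comm.Sendrecv(sendbuf=local[-2, :].copy(), dest=south,
              recvbuf=local[0, :], source=north)
```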
Impacts of weighting climate models for hydro-meteorological climate change studies
NASA Astrophysics Data System (ADS)
Chen, Jie; Brissette, François P.; Lucas-Picher, Philippe; Caya, Daniel
2017-06-01
Weighting climate models is controversial in climate change impact studies using an ensemble of climate simulations from different climate models. In climate science, there is a general consensus that all climate models should be considered as having equal performance, or in other words that all projections are equiprobable. On the other hand, in the impacts and adaptation community, many believe that climate models should be weighted based on their ability to better represent various metrics over a reference period. The debate appears to be partly philosophical in nature, as few studies have investigated the impact of using weights in projecting future climate changes. The present study focuses on the impact of assigning weights to climate models for hydrological climate change studies. Five methods are used to determine weights on an ensemble of 28 global climate models (GCMs) adapted from the Coupled Model Intercomparison Project Phase 5 (CMIP5) database. Using a hydrological model, streamflows are computed over reference (1961-1990) and future (2061-2090) periods, with and without post-processing climate model outputs. The impacts of using different weighting schemes for GCM simulations are then analyzed in terms of ensemble mean and uncertainty. The results show that weighting GCMs has a limited impact on both the projected future climate, in terms of precipitation and temperature changes, and the hydrology, in terms of nine different streamflow criteria. These results apply to both raw and post-processed GCM model outputs, thus supporting the view that climate models should be considered equiprobable.
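The core computation, a weighted ensemble mean and spread under a given weighting scheme, can be sketched in a few lines; the changes and weights below are invented toy values, not the study's results.

```python
import numpy as np

def ensemble_stats(changes, weights=None):
    # Mean and spread of projected changes under a given weighting.
    # changes: (n_models,) array, e.g. future-minus-reference streamflow change;
    # weights: model weights (None means equiprobable models).
    w = np.ones_like(changes) if weights is None else np.asarray(weights, float)
    w = w / w.sum()
    mean = np.sum(w * changes)
    spread = np.sqrt(np.sum(w * (changes - mean) ** 2))
    return mean, spread

changes = np.array([5.0, 8.0, -2.0, 3.0, 6.0])   # toy 5-model ensemble (%)
print(ensemble_stats(changes))                    # equiprobable models
print(ensemble_stats(changes, weights=[0.9, 0.3, 0.1, 0.6, 0.8]))
```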
NASA Technical Reports Server (NTRS)
Lou, John; Ferraro, Robert; Farrara, John; Mechoso, Carlos
1996-01-01
An analysis is presented of several factors influencing the performance of a parallel implementation of the UCLA atmospheric general circulation model (AGCM) on massively parallel computer systems. Several modifications to the original parallel AGCM code, aimed at improving its numerical efficiency, reducing interprocessor communication cost, balancing load, and addressing issues affecting single-node code performance, are discussed.
Hadano, Mayumi; Nasahara, Kenlo Nishida; Motohka, Takeshi; Noda, Hibiki Muraoka; Murakami, Kazutaka; Hosaka, Masahiro
2013-06-01
Reports indicate that leaf onset (leaf flush) of deciduous trees in cool-temperate ecosystems is occurring earlier in the spring in response to global warming. In this study, we created two types of phenology models, one driven only by warmth (spring warming [SW] model) and another driven by both warmth and winter chilling (parallel chill [PC] model), to predict such phenomena in the Japanese Islands at high spatial resolution (500 m). We calibrated these models using leaf onset dates derived from satellite data (Terra/MODIS) and in situ temperature data derived from a dense network of ground stations (the Automated Meteorological Data Acquisition System, AMeDAS). We ran the model using future climate predictions created by the Japan Meteorological Agency's MRI-AGCM3.1S model. In comparison to the first decade of the 2000s, our results predict that the date of leaf onset in the 2030s will advance by an average of 12 days under the SW model and 7 days under the PC model throughout the study area. The date of onset in the 2090s will advance by 26 days under the SW model and by 15 days under the PC model. The greatest impact will occur on Hokkaido (the northernmost island) and in the central mountains.
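A minimal sketch of a spring-warming (SW) style model: leaf onset is predicted on the day when thermal forcing accumulated above a base temperature reaches a critical sum. The base temperature and critical sum below are illustrative values, not the paper's calibrated, per-pixel parameters.

```python
import numpy as np

def spring_warming_onset(daily_tmean, t_base=5.0, f_crit=100.0, start_doy=1):
    # SW-style model: onset occurs on the day the accumulated forcing
    # sum of max(T - t_base, 0), counted from start_doy, reaches f_crit.
    forcing = np.maximum(np.asarray(daily_tmean) - t_base, 0.0)
    cum = np.cumsum(forcing[start_doy - 1:])
    hit = np.argmax(cum >= f_crit)
    if cum[hit] < f_crit:
        return None                            # threshold never reached
    return start_doy + int(hit)                # predicted day of year of onset

# Synthetic spring: onset advances as the temperature series warms.
days = np.arange(1, 181)
temps = -5 + 0.15 * days
print(spring_warming_onset(temps))
print(spring_warming_onset(temps + 2.0))       # a 2-degree warmer spring
```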
Combined Climate and Flow Abstraction Impacts on an Aggrading Alpine River
NASA Astrophysics Data System (ADS)
Bakker, M.; Costa, A.; Silva, T. A.; Stutenbecker, L.; Girardclos, S.; Loizeau, J. L.; Molnar, P.; Schlunegger, F.; Lane, S. N.
2017-12-01
Recent climatic warming and associated glacial retreat may have a large impact on sediment release and transfer in Alpine river basins. In parallel, the sediment transport capacity of many European Alpine streams is affected by hydropower exploitation, notably where flow is abstracted but the sediment supply to the headwaters is maintained at flow intakes. Here, we investigate the combined effects of climate change and flow abstraction on morphodynamics and sediment transfer in one such Alpine stream, the Borgne River, Switzerland. A unique dataset forms the basis for determining sediment deposition and transfer: (1) a set of high-resolution Digital Elevation Models (DEMs) of braided river reaches is derived by applying Structure from Motion (SfM) photogrammetry to archival aerial photographs available for the period 1959-2014; (2) flow intake management data are used for the reconstruction of (up- and downstream) discharge and sediment supply since 1977. Subsequently, we use bedload transport capacity calculations and climate data to assess their relative impact on the system evolution over the last 25 years. From the historical DEMs we find considerable aggradation of the river bed (up to 5 meters) since the onset of flow abstraction in 1963. Rapid and widespread aggradation, however, did not commence until the onset of glacier retreat in the late 1980s and the dry and notably warm years of the early 1990s. This aggradation coincided with an increase in sediment supply, although it accounts for only c. 25% of the supplied material; the remainder was transferred through the studied reaches. Flow abstraction reduces transport capacity by an order of magnitude, but the residual transport rates are close to sediment supply rates, which is why significant transport remains. However, the reduction in transport capacity due to direct human impacts on basin hydrology (flow abstraction) makes the system much more sensitive to changes in climate-driven hydrological variability and climate-induced changes in intake management and sediment supply rates. This was exemplified by an increasingly strong climate (winter precipitation and summer temperature) influence on the delivery of glacially derived sediment.
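DEM-of-difference volume budgeting of the kind used to quantify such aggradation can be sketched in a few lines of numpy; co-registration and uncertainty thresholding, which matter greatly in practice, are omitted, and the arrays below are synthetic.

```python
import numpy as np

def volume_change(dem_t0, dem_t1, cell_size):
    # Net and gross volume change between two co-registered DEMs (m^3).
    dz = np.asarray(dem_t1) - np.asarray(dem_t0)
    cell_area = cell_size ** 2
    return {
        "net": np.nansum(dz) * cell_area,
        "deposition": np.nansum(np.where(dz > 0, dz, 0.0)) * cell_area,
        "erosion": np.nansum(np.where(dz < 0, dz, 0.0)) * cell_area,
    }

rng = np.random.default_rng(0)
before = rng.normal(1500, 2, size=(200, 300))
after = before + rng.normal(0.5, 0.3, size=before.shape)  # net aggradation
print(volume_change(before, after, cell_size=0.5))
```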
How does climate warming affect plant-pollinator interactions?
Hegland, Stein Joar; Nielsen, Anders; Lázaro, Amparo; Bjerknes, Anne-Line; Totland, Ørjan
2009-02-01
Climate warming affects the phenology, local abundance and large-scale distribution of plants and pollinators. Despite this, there is still limited knowledge of how elevated temperatures affect plant-pollinator mutualisms and how changed availability of mutualistic partners influences the persistence of interacting species. Here we review the evidence of climate warming effects on plants and pollinators and discuss how their interactions may be affected by increased temperatures. The onset of flowering in plants and the first appearance dates of pollinators appear in several cases to advance linearly in response to recent temperature increases. Phenological responses to climate warming may therefore occur at parallel magnitudes in plants and pollinators, although considerable variation in responses across species should be expected. Despite the overall similarities in responses, a few studies have shown that climate warming may generate temporal mismatches among the mutualistic partners. Mismatches in pollination interactions are still rarely explored and their demographic consequences are largely unknown. Studies on multi-species plant-pollinator assemblages indicate that the overall structure of pollination networks is probably robust against perturbations caused by climate warming. We suggest potential ways of studying warming-caused mismatches and their consequences for plant-pollinator interactions, and highlight the strengths and limitations of such approaches.
Translational Environmental Research: Improving the Usefulness and Usability of Research Results
NASA Astrophysics Data System (ADS)
Garfin, G.
2008-12-01
In recent years, requests for proposals have more frequently emphasized outreach to stakeholder communities, decision support, and science that serves societal needs. Reports from the National Academy of Sciences and the Western States Water Council emphasize the need for science translation and outreach in order to address societal concerns with climate extremes, such as drought, the use of climate predictions, and the growing challenges of climate change. In the 1990s, the NOAA Climate Program Office developed its Regional Integrated Sciences and Assessments program to help bridge the gap between climate science (notably, seasonal predictions) and society, to improve the flow of information to stakeholders, and to increase the relevance of climate science to inform decisions. During the same time period, the National Science Foundation initiated multi-year Science and Technology Centers and Decision Making Under Uncertainty Centers, with similar goals but different metrics of success. Moreover, the combination of population growth, climate change, and environmental degradation has prompted numerous research initiatives on linking knowledge and action for sustainable development. This presentation reviews various models and methodologies for translating science results from field, lab, or modeling work into use by society. Lessons and approaches from cooperative extension, boundary organizations, co-production of science and policy, and medical translational research are examined. In particular, multi-step translation as practiced within the health care community is examined. For example, so-called "T1" (translation 1) research moves insights from basic science to clinical research; T2 research evaluates the effectiveness of clinical practice and who benefits from promising care regimens, and develops tools for clinicians, patients, and policy makers; T3 activities test the implementation, delivery, and spread of research results and clinical practices in order to foster policy changes and improve general health. Parallels in the environmental sciences might be TER1 (translational environmental research 1), basic insights regarding environmental processes and the relationships between environmental changes and their causes; TER2, applied environmental research, development of best practices, and development of decision support tools; and TER3, usability and impact evaluation, effective outreach and implementation of best practices, and application of research insights to public policy and institutional change. According to the medical literature, and anecdotal evidence from end-to-end environmental science, decision-maker and public involvement in these various forms of engaged research decreases the lag between scientific discovery and the implementation of discoveries in operational practices, information tools, and organizational and public policies.
Using decision pathway surveys to inform climate engineering policy choices
Gregory, Robin; Satterfield, Terre; Hasell, Ariel
2016-01-01
Over the coming decades citizens living in North America and Europe will be asked about a variety of new technological and behavioral initiatives intended to mitigate the worst impacts of climate change. A common approach to public input has been surveys whereby respondents’ attitudes about climate change are explained by individuals’ demographic background, values, and beliefs. In parallel, recent deliberative research seeks to more fully address the complex value tradeoffs linked to novel technologies and difficult ethical questions that characterize leading climate mitigation alternatives. New methods such as decision pathway surveys may offer important insights for policy makers by capturing much of the depth and reasoning of small-group deliberations while meeting standard survey goals including large-sample stakeholder engagement. Pathway surveys also can help participants to deepen their factual knowledge base and arrive at a more complete understanding of their own values as they apply to proposed policy alternatives. The pathway results indicate more fully the conditional and context-specific nature of support for several “upstream” climate interventions, including solar radiation management techniques and carbon dioxide removal technologies. PMID:26729883
The Monash University Interactive Simple Climate Model
NASA Astrophysics Data System (ADS)
Dommenget, D.
2013-12-01
The Monash University Interactive Simple Climate Model is a web-based interface that allows students and the general public to explore the physical simulation of the climate system with a real global climate model. It is based on the Globally Resolved Energy Balance (GREB) model, a climate model published by Dommenget and Floeter [2011] in the international peer-reviewed journal Climate Dynamics. The model simulates most of the main physical processes in the climate system in a very simplistic way and therefore allows very fast and simple climate model simulations on a normal PC. Despite its simplicity, the model simulates the climate response to external forcings, such as a doubling of CO2 concentrations, very realistically (similar to state-of-the-art climate models). The Monash Simple Climate Model web interface allows you to study the results of more than 2000 different model experiments in an interactive way, to work through a number of tutorials on the interactions of physical processes in the climate system, and to solve some puzzles. By switching physical processes OFF or ON you can deconstruct the climate and learn how the different processes interact to generate the observed climate, and how they interact to generate the IPCC-predicted climate change for an anthropogenic CO2 increase. The presentation will illustrate how this web-based tool works and what the possibilities are for teaching students with it.
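GREB itself is a published model with many more processes; as a hedged illustration of the energy-balance reasoning such tools teach, here is a minimal zero-dimensional sketch (not the GREB code; the parameter values are textbook-style assumptions).

```python
# Minimal zero-dimensional energy balance sketch:
#   C dT/dt = S0/4 * (1 - alpha) - epsilon * sigma * T^4
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # solar constant, W m^-2
ALPHA = 0.30       # planetary albedo
C = 4.0e8          # effective heat capacity, J m^-2 K^-1 (~100 m ocean)

def integrate(epsilon=0.61, t0=288.0, years=200, dt=86400.0):
    """Forward-Euler integration; epsilon is an effective emissivity
    that crudely stands in for the greenhouse effect."""
    t = t0
    for _ in range(int(years * 365)):
        flux = S0 / 4.0 * (1.0 - ALPHA) - epsilon * SIGMA * t**4
        t += dt * flux / C
    return t

print(integrate())               # ~288 K control climate
print(integrate(epsilon=0.60))   # stronger greenhouse -> warmer equilibrium
```

Switching a process OFF/ON in the web tool corresponds, in spirit, to freezing or altering a term like the emissivity feedback above and watching the new equilibrium emerge.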
NASA Astrophysics Data System (ADS)
Renner, M.; Bernhofer, C.
2011-12-01
The prediction of climate effects on terrestrial ecosystems and water resources is one of the major research questions in hydrology. Conceptual water-energy balance models can be used to gain a first-order estimate of how long-term average streamflow changes with a change in water and energy supply. A common framework for investigating this question is based on the Budyko hypothesis, which links hydrological response to aridity. Recently, Renner et al. (2011) introduced the CCUW hypothesis, which is based on the assumption that the total efficiency of the catchment ecosystem in using the available water and energy for actual evapotranspiration remains constant even under climate changes. Here, we confront the climate sensitivity approaches (including several versions of Budyko's approach and the CCUW) with data from more than 400 basins distributed over the continental United States. We first map an estimate of the sensitivity of streamflow to changes in precipitation using long-term average data for the period 1949-2003. This provides a hydro-climatic status of the respective basins as well as their expected proportional response to changes in climate. Next, by splitting the data into two periods, we (i) analyse the long-term average changes in hydro-climatology, (ii) use the different climate sensitivity methods to predict the change in streamflow given the observed changes in water and energy supply, and (iii) apply a quantitative approach to separate the impacts of changes in the long-term average climate from those of basin characteristics change on streamflow. This allows us to evaluate the observed changes in streamflow as well as the impact of basin changes on the validity of climate sensitivity approaches. The apparent increase of streamflow in the majority of basins in the US is dominated by a climate trend towards increased humidity. It is further evident that impacts of changes in basin characteristics appear in parallel with climate changes. There are coherent spatial patterns, with basins of increasing catchment efficiency being dominant in the western and central parts of the US. A hot spot of decreasing efficiency is found within the US Midwest. The impact of basin changes on the prediction is large and can be twice as large as the observed change signal. However, we find that both the CCUW hypothesis and the approaches using the Budyko hypothesis show minimal deviations between observed and predicted changes in streamflow for basins where a dominance of climatic changes and low influences of basin changes have been found. Thus, climate sensitivity methods can be regarded as valid tools if we expect climate changes only and neglect any direct anthropogenic influences.
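For readers unfamiliar with the Budyko framework, the sketch below shows how a precipitation sensitivity of streamflow can be derived from one common form of the Budyko curve. It is a generic illustration, not the CCUW hypothesis or the authors' code, and the basin numbers are invented.

```python
import numpy as np

def budyko_evap_ratio(phi):
    """Budyko (1974) curve: E/P as a function of aridity phi = PET/P."""
    return np.sqrt(phi * np.tanh(1.0 / phi) * (1.0 - np.exp(-phi)))

def streamflow(p, pet):
    """Long-term mean streamflow Q = P - E under the Budyko hypothesis."""
    return p * (1.0 - budyko_evap_ratio(pet / p))

def precip_elasticity(p, pet, dp=1.0):
    """Numerical elasticity of Q with respect to P: (dQ/Q) / (dP/P)."""
    q = streamflow(p, pet)
    dq = streamflow(p + dp, pet) - q
    return (dq / q) / (dp / p)

# Invented example basins: humid (P=1200 mm, PET=700 mm) vs arid (P=400, PET=1200)
print(precip_elasticity(1200.0, 700.0))   # modest amplification
print(precip_elasticity(400.0, 1200.0))   # strong amplification in the arid basin
```

Elasticities above 1 mean a given percentage change in precipitation is amplified in streamflow, which is why arid basins appear as hot spots of climate sensitivity in analyses of this kind.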
NASA Astrophysics Data System (ADS)
Georgiev, K.; Zlatev, Z.
2010-11-01
The Danish Eulerian Model (DEM) is an Eulerian model for studying the transport of air pollutants on a large scale. Originally, the model was developed at the National Environmental Research Institute of Denmark. The model's computational domain covers Europe and neighbouring parts of the Atlantic Ocean, Asia and Africa. If the DEM is applied on fine grids, its discretization leads to a huge computational problem, which implies that such a model can only be run on high-performance computer architectures. The implementation and tuning of such a complex large-scale model on each different computer is a non-trivial task. Here, comparative results from running this model on various vector computers (CRAY C92A, Fujitsu, etc.), distributed-memory parallel computers (IBM SP, CRAY T3E, Beowulf clusters, Macintosh G4 clusters, etc.), shared-memory parallel computers (SGI Origin, SUN, etc.) and parallel computers with two levels of parallelism (IBM SMP, IBM BlueGene/P, clusters of multiprocessor nodes, etc.) are presented. The main idea in the parallel version of DEM is a domain partitioning approach. The effective use of the caches and hierarchical memories of modern computers is discussed, along with the performance, speed-ups and efficiency achieved. The parallel code of DEM, created using the standard MPI library, appears to be highly portable and shows good efficiency and scalability on different kinds of vector and parallel computers. Some important applications of the computer model output are briefly presented.
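As a hedged illustration of the domain partitioning idea (not the DEM code), the sketch below splits a 1-D strip of grid columns across MPI ranks with mpi4py and exchanges one ghost cell per side, the basic communication pattern behind transport stencils. It assumes the global width divides evenly among ranks.

```python
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

NX_GLOBAL = 1024                 # global grid columns (assumed divisible by size)
nx = NX_GLOBAL // size           # columns owned by this rank
field = np.random.rand(nx + 2)   # local strip plus one ghost cell per side

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

# Exchange ghost cells with neighbours; MPI.PROC_NULL turns the
# boundary ranks' sends/receives into no-ops.
comm.Sendrecv(sendbuf=field[1:2], dest=left,
              recvbuf=field[nx + 1:nx + 2], source=right)
comm.Sendrecv(sendbuf=field[nx:nx + 1], dest=right,
              recvbuf=field[0:1], source=left)
```

A production code like DEM would exchange whole faces of a 2-D or 3-D subdomain and overlap communication with computation, but the ownership-plus-ghost-layer structure is the same.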
An OpenACC-Based Unified Programming Model for Multi-accelerator Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Jungwon; Lee, Seyong; Vetter, Jeffrey S
2015-01-01
This paper proposes a novel SPMD programming model of OpenACC. Our model integrates the different granularities of parallelism from vector-level parallelism to node-level parallelism into a single, unified model based on OpenACC. It allows programmers to write programs for multiple accelerators using a uniform programming model whether they are in shared or distributed memory systems. We implement a prototype of our model and evaluate its performance with a GPU-based supercomputer using three benchmark applications.
Boundedness and exponential convergence in a chemotaxis model for tumor invasion
NASA Astrophysics Data System (ADS)
Jin, Hai-Yang; Xiang, Tian
2016-12-01
We revisit the following chemotaxis system modeling tumor invasion: $u_t = \Delta u - \nabla \cdot (u \nabla v)$, $v_t = \Delta v + wz$, $w_t = -wz$, $z_t = \Delta z - z + u$, for $x \in \Omega$, $t > 0$, in a smooth bounded domain $\Omega \subset \mathbb{R}^n$ ($n \ge 1$) with homogeneous Neumann boundary and initial conditions. This model was recently proposed by Fujie et al (2014 Adv. Math. Sci. Appl. 24 67-84) as a model for tumor invasion with the role of extracellular matrix incorporated, and was analyzed later by Fujie et al (2016 Discrete Contin. Dyn. Syst. 36 151-69), showing uniform boundedness and convergence for $n \le 3$. In this work, we first show that the $L^{\infty}$-boundedness of the system can be reduced to the boundedness of $\|u(\cdot,t)\|_{L^{n/4+\varepsilon}(\Omega)}$ for some $\varepsilon > 0$ alone, and then, for $n \ge 4$, if the initial data $\|u_0\|_{L^{n/4}}$, $\|z_0\|_{L^{n/2}}$ and …
Effects of Global Change on U.S. Urban Areas: Vulnerabilities, Impacts, and Adaptation
NASA Astrophysics Data System (ADS)
Quattrochi, D. A.; Wilbanks, T. J.; Kirshen, P. H.; Romero-Lankao, P.; Rosenzweig, C. E.; Ruth, M.; Solecki, W.; Tarr, J. A.
2007-05-01
Human settlements, both large and small, are where the vast majority of people on the Earth live. Expansion of cities, both in population and areal extent, is a relentless process that will accelerate in the 21st century. As a consequence of urban growth both in the United States and around the globe, it is important to develop an understanding of how urbanization will affect the local and regional environment. Of equal importance, however, is the assessment of how cities will be impacted by the looming prospects of global climate change and climate variability. The potential impacts of climate change and variability have recently been enunciated by the IPCC's "Climate Change 2007" report. Moreover, the U.S. Climate Change Science Program (CCSP) is preparing a series of "Synthesis and Assessment Products" (SAP) reports to support informed discussion and decision making regarding climate change and variability by policy makers, resource managers, stakeholders, the media, and the general public. We are working on a chapter of SAP 4.6 ("Analysis of the Effects of Global Change on Human Health and Welfare and Human Systems") in which we describe the effects of global climate change on human settlements. This paper will present the thoughts and ideas being formulated for our SAP report concerning the vulnerabilities and impacts that will occur, the adaptation responses that may take place, and the possible effects on settlement patterns and characteristics that may arise for human settlements in the U.S. as a result of climate change and climate variability. We present these ideas and concepts as a "work in progress" that is subject to several rounds of review, and we invite comments from listeners at this session on the rationale and veracity of our thoughts. Additionally, we wish to explore how technology such as remote sensing data coupled with modeling can be employed as a synthesis tool for deriving insight across a spectrum of impacts (e.g. public health, urban planning for mitigation strategies) on how cities can cope with and adapt to climate change and variability. This latter point parallels the concepts and ideas presented in the U.S. National Academy of Sciences Decadal Survey report on "Earth Science Applications from Space: National Imperatives for the Next Decade and Beyond," wherein the analysis of the impacts of climate change and variability, human health, and land use change are listed as key areas for the development of future Earth-observing remote sensing systems.
An embedded multi-core parallel model for real-time stereo imaging
NASA Astrophysics Data System (ADS)
He, Wenjing; Hu, Jian; Niu, Jingyu; Li, Chuanrong; Liu, Guangyu
2018-04-01
Real-time processing based on embedded systems will enhance the application capability of stereo imaging for LiDAR and hyperspectral sensors. Research on task partitioning and scheduling strategies for embedded multiprocessor systems started relatively late compared with that for PC platforms. In this paper, aimed at an embedded multi-core processing platform, a parallel model for stereo imaging is studied and verified. After analyzing the computational load, throughput capacity and buffering requirements, a two-stage pipeline parallel model based on message transmission is established. This model can be applied to fast stereo imaging for airborne sensors with various characteristics. To demonstrate the feasibility and effectiveness of the parallel model, parallel software was designed using test flight data, based on the eight-core DSP processor TMS320C6678. The results indicate that the design performed well in workload distribution and achieved a speed-up ratio of up to 6.4.
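A two-stage message-passing pipeline of the kind described can be sketched generically with Python's multiprocessing queues; the stage bodies below are placeholders for the real preprocessing and stereo-matching work, not the DSP implementation.

```python
import multiprocessing as mp

def stage1(q_in, q_out):
    """First pipeline stage (e.g., rectify / preprocess a frame)."""
    for frame in iter(q_in.get, None):   # None is the stop sentinel
        q_out.put(frame * 2)             # stand-in for real work
    q_out.put(None)

def stage2(q_in, results):
    """Second pipeline stage (e.g., stereo matching on the frame)."""
    for frame in iter(q_in.get, None):
        results.put(frame + 1)
    results.put(None)

if __name__ == "__main__":
    q01, q12, out = mp.Queue(), mp.Queue(), mp.Queue()
    workers = [mp.Process(target=stage1, args=(q01, q12)),
               mp.Process(target=stage2, args=(q12, out))]
    for w in workers:
        w.start()
    for frame in range(8):               # feed frames into the pipeline
        q01.put(frame)
    q01.put(None)
    for result in iter(out.get, None):
        print(result)
    for w in workers:
        w.join()
```

The design point is that once the pipeline is full, both stages work on different frames concurrently, so throughput approaches that of the slower stage rather than the sum of the two.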
Hurricane Forecasting with the High-resolution NASA Finite-volume General Circulation Model
NASA Technical Reports Server (NTRS)
Atlas, R.; Reale, O.; Shen, B.-W.; Lin, S.-J.; Chern, J.-D.; Putman, W.; Lee, T.; Yeh, K.-S.; Bosilovich, M.; Radakovich, J.
2004-01-01
A high-resolution finite-volume General Circulation Model (fvGCM), resulting from a development effort of more than ten years, is now being run operationally at the NASA Goddard Space Flight Center and Ames Research Center. The model is based on a finite-volume dynamical core with terrain-following Lagrangian control-volume discretization and performs efficiently on massively parallel architectures. The computational efficiency allows simulations at a resolution of a quarter of a degree, which is double the resolution currently adopted by most global models in operational weather centers. Such fine global resolution brings us closer to overcoming a fundamental barrier in global atmospheric modeling for both weather and climate, because tropical cyclones and even tropical convective clusters can be represented more realistically. In this work, preliminary results of the fvGCM are shown. Fifteen simulations of four Atlantic tropical cyclones in 2002 and 2004 are chosen because of the strong and varied difficulties they presented to numerical weather forecasting. It is shown that the fvGCM, run at a resolution of a quarter of a degree, can produce very good forecasts of these tropical systems, adequately resolving problems like erratic tracks, abrupt recurvature, intense extratropical transition, multiple landfall and reintensification, and interaction among vortices.
NASA Astrophysics Data System (ADS)
Wan, Hui; Zhang, Kai; Rasch, Philip J.; Singh, Balwinder; Chen, Xingyuan; Edwards, Jim
2017-02-01
A test procedure is proposed for identifying numerically significant solution changes in evolution equations used in atmospheric models. The test issues a fail signal when any code modifications or computing environment changes lead to solution differences that exceed the known time step sensitivity of the reference model. Initial evidence is provided using the Community Atmosphere Model (CAM) version 5.3 that the proposed procedure can be used to distinguish rounding-level solution changes from the impacts of compiler optimization or parameter perturbation, which are known to cause substantial differences in the simulated climate. The test is not exhaustive since it does not detect issues associated with diagnostic calculations that do not feed back to the model state variables. Nevertheless, it provides a practical and objective way to assess the significance of solution changes. The short simulation length implies low computational cost. The independence between ensemble members allows for parallel execution of all simulations, thus facilitating fast turnaround. The new method is simple to implement since it does not require any code modifications. We expect that the same methodology can be used for any geophysical model to which the concept of time step convergence is applicable.
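The published procedure is specific to CAM's time-step convergence behavior; the sketch below is only a loose, illustrative analogue in Python, using the spread of a reference ensemble as the pass/fail bound. The quantile threshold and RMSE metric are assumptions, not the paper's calibrated test.

```python
import numpy as np

def convergence_style_test(reference_ensemble, new_solution,
                           threshold_quantile=0.999):
    """Illustrative pass/fail check: fail if the new solution differs
    from the ensemble mean by more than the spread exhibited by the
    reference ensemble itself.

    reference_ensemble: (n_members, n_cells) array of flattened fields.
    """
    mean = reference_ensemble.mean(axis=0)
    member_rmse = np.sqrt(((reference_ensemble - mean) ** 2).mean(axis=1))
    bound = np.quantile(member_rmse, threshold_quantile)
    rmse = np.sqrt(((new_solution - mean) ** 2).mean())
    return "PASS" if rmse <= bound else "FAIL"

rng = np.random.default_rng(0)
ens = rng.normal(size=(50, 10_000))
print(convergence_style_test(ens, rng.normal(size=10_000)))        # PASS
print(convergence_style_test(ens, rng.normal(1.0, 1.0, 10_000)))   # FAIL
```

Because each ensemble member is an independent short run, building the reference bound parallelizes trivially, which is the fast-turnaround property the abstract emphasizes.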
Drought mitigation in Australia: reducing the losses but not removing the hazard
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heathcote, R.L.
This paper presents a brief history of drought in Australia, pointing out some parallels and contrasts with the North American experience. It then outlines the various strategies (technological and nontechnological) that have been adopted to try to mitigate drought. It reviews current thinking on the effects of increasing atmospheric carbon dioxide on the Australian climate and their relevance to agricultural and pastoral activities through possible modification of the incidence and intensity of drought. Finally, it evaluates the history of technological adjustments to drought stresses and tries to forecast the success or failure of such adjustments under foreseeable climate change.
Climate-mediated dance of the plankton
NASA Astrophysics Data System (ADS)
Behrenfeld, Michael J.
2014-10-01
Climate change will unquestionably influence global ocean plankton because it directly impacts both the availability of growth-limiting resources and the ecological processes governing biomass distributions and annual cycles. Forecasting this change demands recognition of the vital, yet counterintuitive, attributes of the plankton world. The biomass of photosynthetic phytoplankton, for example, is not proportional to their division rate. Perhaps more surprising, physical processes (such as deep vertical mixing) can actually trigger an accumulation in phytoplankton while simultaneously decreasing their division rates. These behaviours emerge because changes in phytoplankton division rates are paralleled by proportional changes in grazing, viral attack and other loss rates. Here I discuss this trophic dance between predators and prey, how it dictates when phytoplankton biomass remains constant or achieves massive blooms, and how it can determine even the sign of change in ocean ecosystems under a warming climate.
Spatiotemporal Domain Decomposition for Massive Parallel Computation of Space-Time Kernel Density
NASA Astrophysics Data System (ADS)
Hohl, A.; Delmelle, E. M.; Tang, W.
2015-07-01
Accelerated processing capabilities are deemed critical when conducting analysis on spatiotemporal datasets of increasing size, diversity and availability. High-performance parallel computing offers the capacity to solve computationally demanding problems in a limited timeframe, but likewise poses the challenge of preventing processing inefficiency due to workload imbalance between computing resources. Therefore, when designing new algorithms capable of implementing parallel strategies, careful spatiotemporal domain decomposition is necessary to account for heterogeneity in the data. In this study, we perform octree-based adaptive decomposition of the spatiotemporal domain for parallel computation of space-time kernel density. In order to avoid edge effects near subdomain boundaries, we establish spatiotemporal buffers to include adjacent data points that are within the spatial and temporal kernel bandwidths. Then, we quantify the computational intensity of each subdomain to balance workloads among processors. We illustrate the benefits of our methodology using a space-time epidemiological dataset of Dengue fever, an infectious vector-borne disease that poses a severe threat to communities in tropical climates. Our parallel implementation of the kernel density computation reaches substantial speedup compared to sequential processing, and achieves high levels of workload balance among processors due to great accuracy in quantifying computational intensity. Our approach is portable to other space-time analytical tests.
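Two ingredients of the method lend themselves to a short sketch: selecting the events belonging to a subdomain plus its spatiotemporal buffer, and evaluating the space-time kernel density itself. The Python below is a generic serial illustration (kernel normalisation constants omitted), not the authors' parallel implementation.

```python
import numpy as np

def clip_to_subdomain(points, bounds, hs, ht):
    """Keep events inside a subdomain plus its buffer of one spatial
    and one temporal bandwidth, so densities near subdomain edges
    still see every contributing point.
    points: (n, 3) array of (x, y, t);
    bounds: (xmin, xmax, ymin, ymax, tmin, tmax)."""
    xmin, xmax, ymin, ymax, tmin, tmax = bounds
    x, y, t = points.T
    keep = ((x >= xmin - hs) & (x <= xmax + hs) &
            (y >= ymin - hs) & (y <= ymax + hs) &
            (t >= tmin - ht) & (t <= tmax + ht))
    return points[keep]

def stkd(points, grid_xy, grid_t, hs, ht):
    """Space-time kernel density at grid nodes, using separable
    Epanechnikov-shaped kernels in space (radius hs) and time (ht)."""
    density = np.zeros((len(grid_xy), len(grid_t)))
    for i, (gx, gy) in enumerate(grid_xy):
        ds = np.hypot(points[:, 0] - gx, points[:, 1] - gy) / hs
        ks = np.where(ds < 1.0, 1.0 - ds**2, 0.0)
        for j, gt in enumerate(grid_t):
            dt = np.abs(points[:, 2] - gt) / ht
            kt = np.where(dt < 1.0, 1.0 - dt**2, 0.0)
            density[i, j] = np.sum(ks * kt)
    return density
```

The buffer step is what makes the subdomains independent: each processor can run `stkd` on its clipped point set without communicating during the evaluation.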
Ecological adaptation of diverse honey bee (Apis mellifera) populations.
Parker, Robert; Melathopoulos, Andony P; White, Rick; Pernal, Stephen F; Guarna, M Marta; Foster, Leonard J
2010-06-15
Honey bees are complex eusocial insects that provide a critical contribution to human agricultural food production. Their natural migration has selected for traits that increase fitness within geographical areas, but in parallel their domestication has selected for traits that enhance productivity and survival under local conditions. Elucidating the biochemical mechanisms of these local adaptive processes is a key goal of evolutionary biology. Proteomics provides tools unique among the major 'omics disciplines for identifying the mechanisms employed by an organism in adapting to environmental challenges. Through proteome profiling of adult honey bee midgut from geographically dispersed, domesticated populations combined with multiple parallel statistical treatments, the data presented here suggest some of the major cellular processes involved in adapting to different climates. These findings provide insight into the molecular underpinnings that may confer an advantage to honey bee populations. Significantly, the major energy-producing pathways of the mitochondria, the organelle most closely involved in heat production, were consistently higher in bees that had adapted to colder climates. Conversely, up-regulation of protein metabolism capacity, from biosynthesis to degradation, had been selected for in bees from warmer climates. Overall, our results present a proteomic interpretation of expression polymorphisms between honey bee ecotypes and provide insight into molecular aspects of local adaptation or selection with consequences for honey bee management and breeding. The implications of our findings extend beyond apiculture as they underscore the need to consider the interdependence of animal populations and their agro-ecological context.
The QWeCI Project: seamlessly linking climate science to society
NASA Astrophysics Data System (ADS)
Morse, A. P.; Caminade, C.; Jones, A. E.; MacLeod, D.; Heath, A. E.
2012-04-01
The EU FP7 QWeCI project, Quantifying Weather and Climate Impacts on health in developing countries (www.liv.ac.uk/qweci), has 13 partners, 7 of them in Africa. The geographical focus of the project is on Senegal, Ghana and Malawi. The project has a strong scientific dissemination outlook in all three countries, as well as field-based surveillance programmes in Ghana and Senegal to understand more about the local parameters controlling the transmission of malaria and, in Senegal, of Rift Valley fever. The project has a strong and active climate science programme: using hindcasts of the new System 4 seasonal forecasting system at ECMWF; further developing the use of monthly-to-seasonal forecasts from ensemble prediction systems; within-project downscaling development; assessing decadal ensemble prediction systems; and developing and testing vector-borne disease models for malaria and Rift Valley fever. In parallel with the science programme, the project has a large outreach activity involving regular communication, bilateral exchanges, and workshops focused on science and decision makers. In Malawi, a long-range WiFi network has been established for the dissemination of data. In Senegal, where there is a concentration of partners and stakeholders, the project is gaining a role as a catalyst for wider health- and climate-related activity within government departments and national research bodies, along with the support and involvement of local communities. Within these wider community discussions we have interactive inputs from African and European scientists who are partners in the project. This paper will show highlights of the work completed so far, outline future development, and encourage wider user interaction from outside the current project team and their direct collaborators.
NASA Astrophysics Data System (ADS)
Aalto, R. E.; Lauer, J. W.; Darby, S. E.; Best, J.; Dietrich, W. E.
2015-12-01
During glacial-marine transgressions vast volumes of sediment are deposited due to the infilling of lowland fluvial systems and shallow shelves, material that is removed during ensuing regressions. Modelling these processes would illuminate system morphodynamics, fluxes, and 'complexity' in response to base level change, yet such problems are computationally formidable. Environmental systems are characterized by strong interconnectivity, yet traditional supercomputers have slow inter-node communication -- whereas rapidly advancing Graphics Processing Unit (GPU) technology offers vastly higher (>100x) bandwidths. GULLEM (GpU-accelerated Lowland Landscape Evolution Model) employs massively parallel code to simulate coupled fluvial-landscape evolution for complex lowland river systems over large temporal and spatial scales. GULLEM models the accommodation space carved/infilled by representing a range of geomorphic processes, including: river & tributary incision within a multi-directional flow regime, non-linear diffusion, glacial-isostatic flexure, hydraulic geometry, tectonic deformation, sediment production, transport & deposition, and full 3D tracking of all resulting stratigraphy. Model results concur with the Holocene dynamics of the Fly River, PNG -- as documented with dated cores, sonar imaging of floodbasin stratigraphy, and observations of topographic remnants from LGM conditions. Other supporting research was conducted along the Mekong River, the largest fluvial system of the Sunda Shelf. These and other field data provide tantalizing empirical glimpses into the lowland landscapes of large rivers during glacial-interglacial transitions, observations that can be explored with this powerful numerical model. GULLEM affords estimates of the timing and flux budgets within the Fly and Sunda systems, illustrating complex internal system responses to the external forcing of sea level and climate. Furthermore, GULLEM can be applied to almost any fluvial system to explore processes across a wide range of temporal and spatial scales. The presentation will provide insights (and many animations) illustrating river morphodynamics and the resulting landscapes formed as a result of sea level oscillations. [Image: the incised 3.2e6 km^2 Sundaland domain at 431 ka]
NASA Astrophysics Data System (ADS)
Lian, Yanping; Lin, Stephen; Yan, Wentao; Liu, Wing Kam; Wagner, Gregory J.
2018-05-01
In this paper, a parallelized 3D cellular automaton computational model is developed to predict grain morphology for solidification of metal during the additive manufacturing process. Solidification phenomena are characterized by highly localized events, such as the nucleation and growth of multiple grains. As a result, parallelization requires careful treatment of load balancing between processors as well as interprocess communication in order to maintain a high parallel efficiency. We give a detailed summary of the formulation of the model, as well as a description of the communication strategies implemented to ensure parallel efficiency. Scaling tests on a representative problem with about half a billion cells demonstrate parallel efficiency of more than 80% on 8 processors and around 50% on 64; loss of efficiency is attributable to load imbalance due to near-surface grain nucleation in this test problem. The model is further demonstrated through an additive manufacturing simulation with resulting grain structures showing reasonable agreement with those observed in experiments.
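As a hedged, much-simplified illustration of cellular-automaton grain capture (without the nucleation physics, growth kinetics, crystallographic orientation, or parallel load balancing of the actual model), consider the following toy in Python.

```python
import numpy as np

rng = np.random.default_rng(0)

def grain_growth(shape=(200, 200), n_nuclei=40, steps=400):
    """Toy 2-D cellular-automaton solidification: random nuclei get
    distinct grain IDs, then every liquid cell (ID 0) captures the ID
    of a solid von Neumann neighbour each step. The periodic wrap from
    np.roll is acceptable for a toy domain."""
    grid = np.zeros(shape, dtype=np.int32)
    ys = rng.integers(0, shape[0], n_nuclei)
    xs = rng.integers(0, shape[1], n_nuclei)
    grid[ys, xs] = np.arange(1, n_nuclei + 1)
    for _ in range(steps):
        for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            neighbour = np.roll(grid, shift, axis=(0, 1))
            capture = (grid == 0) & (neighbour > 0)
            grid[capture] = neighbour[capture]
        if grid.all():           # fully solidified
            break
    return grid

grains = grain_growth()
print(np.unique(grains).size, "distinct grain IDs after solidification")
```

The load-balance difficulty the abstract describes is visible even here: all the work happens on the moving solid-liquid interface, so a static spatial partition leaves processors that own already-solidified (or still-empty) regions idle.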
Karasick, M.S.; Strip, D.R.
1996-01-30
A parallel computing system is described that comprises a plurality of uniquely labeled, parallel processors, each processor capable of modeling a three-dimensional object that includes a plurality of vertices, faces and edges. The system comprises a front-end processor for issuing a modeling command to the parallel processors, relating to a three-dimensional object. Each parallel processor, in response to the command and through the use of its own unique label, creates a directed-edge (d-edge) data structure that uniquely relates an edge of the three-dimensional object to one face of the object. Each d-edge data structure at least includes vertex descriptions of the edge and a description of the one face. As a result, each processor, in response to the modeling command, operates upon a small component of the model and generates results, in parallel with all other processors, without the need for processor-to-processor intercommunication. 8 figs.
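A minimal reading of the d-edge idea can be sketched as follows; the field layout is an assumption for illustration, since the patent's actual record structure is richer.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DEdge:
    """Directed edge: one geometric edge as seen from one face, so
    each undirected edge of a closed solid yields two d-edges."""
    tail: tuple   # (x, y, z) of the start vertex
    head: tuple   # (x, y, z) of the end vertex
    face: int     # ID of the single face this d-edge borders

def dedges_for_triangle(face_id, v0, v1, v2):
    """Emit the three d-edges bounding one triangular face. A parallel
    processor owning this face can build them from its own unique
    label alone, without talking to processors owning adjacent faces,
    which is the communication-free property the patent describes."""
    return [DEdge(v0, v1, face_id),
            DEdge(v1, v2, face_id),
            DEdge(v2, v0, face_id)]

# Example: one face of a tetrahedron handled by the processor labeled 7.
print(dedges_for_triangle(7, (0, 0, 0), (1, 0, 0), (0, 1, 0)))
```

Because the twin d-edge of (v0, v1) is (v1, v0) on the adjacent face, adjacency can be reconstructed later by matching reversed vertex pairs, with no coordination needed while the d-edges are being created.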
Study on Development of 1D-2D Coupled Real-time Urban Inundation Prediction model
NASA Astrophysics Data System (ADS)
Lee, Seungsoo
2017-04-01
In recent years, abnormal weather conditions attributable to climate change have been experienced around the world, so countermeasures for flood defense are an urgent task. In this research, a study on the development of a 1D-2D coupled real-time urban inundation prediction model using precipitation forecasts based on remote sensing technology is conducted. A one-dimensional (1D) sewerage system analysis model introduced by Lee et al. (2015) is used to simulate inlet and overflow phenomena by interacting with surface flow as well as flows in conduits. A two-dimensional (2D) grid mesh refinement method is applied to depict road networks while keeping calculation times practical. The 2D surface model is coupled with the 1D sewerage analysis model in order to account for the bidirectional flow between the two. A parallel computing method, OpenMP, is also applied to reduce calculation time. The model is evaluated against the 25 August 2014 extreme rainfall event, which caused severe inundation damage in Busan, Korea. The Oncheoncheon basin is selected as the study basin, and observed radar data are used in place of predicted rainfall data. The model shows acceptable calculation speed and accuracy, so it is expected that it can serve in a real-time urban inundation forecasting system to minimize damage.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Procassini, R.J.
1997-12-31
The fine-scale, multi-space resolution that is envisioned for accurate simulations of complex weapons systems in three spatial dimensions implies flop-rate and memory-storage requirements that will only be obtained in the near future through the use of parallel computational techniques. Since the Monte Carlo transport models in these simulations usually stress both of these computational resources, they are prime candidates for parallelization. The MONACO Monte Carlo transport package, which is currently under development at LLNL, will utilize two types of parallelism within the context of a multi-physics design code: decomposition of the spatial domain across processors (spatial parallelism) and distribution of particles in a given spatial subdomain across additional processors (particle parallelism). This implementation of the package will utilize explicit data communication between domains (message passing). Such a parallel implementation of a Monte Carlo transport model will result in non-deterministic communication patterns. The communication of particles between subdomains during a Monte Carlo time step may require a significant level of effort to achieve a high parallel efficiency.
NASA Astrophysics Data System (ADS)
Akil, Mohamed
2017-05-01
Real-time processing is becoming more and more important in many image processing applications. Image segmentation is one of the most fundamental tasks in image analysis, and many different approaches to it have been proposed. The watershed transform is a well-known image segmentation tool, and it is a very data-intensive task. To achieve acceleration and obtain real-time processing of watershed algorithms, parallel architectures and programming models for multicore computing have been developed. This paper presents a survey of approaches for the parallel implementation of sequential watershed algorithms on multicore general-purpose CPUs: homogeneous multicore processors with shared memory. To achieve an efficient parallel implementation, it is necessary to explore different strategies (parallelization/distribution/distributed scheduling) combined with different acceleration and optimization techniques to enhance parallelism. We give a comparison of various parallelizations of sequential watershed algorithms on a shared-memory multicore architecture, analyzing the performance measurements of each parallel implementation and the impact of the different sources of overhead on performance. In this comparison study, we also discuss the advantages and disadvantages of the parallel programming models, comparing OpenMP (an application programming interface for multiprocessing) with Pthreads (POSIX Threads) to illustrate the impact of each parallel programming model on the performance of the parallel implementations.
Hierarchical Parallelism in Finite Difference Analysis of Heat Conduction
NASA Technical Reports Server (NTRS)
Padovan, Joseph; Krishna, Lala; Gute, Douglas
1997-01-01
Based on the concept of hierarchical parallelism, this research effort resulted in highly efficient parallel solution strategies for very large scale heat conduction problems. Overall, the method of hierarchical parallelism involves the partitioning of thermal models into several substructured levels wherein an optimal balance among the various associated bandwidths is achieved. The details are described in this report, which is organized into two parts. Part 1 describes the parallel modelling methodology and the associated multilevel direct, iterative and mixed solution schemes. Part 2 establishes both the formal and computational properties of the scheme.
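The locality that makes such partitioning effective is visible in the finite-difference stencil itself: each updated value touches only nearest neighbours, so a substructured partition needs just one boundary value per side. A minimal explicit sketch in Python (not the report's multilevel solver):

```python
import numpy as np

def heat_step(u, r=0.1):
    """One explicit finite-difference step of u_t = k * u_xx on a
    uniform grid, with r = k*dt/dx^2 (r <= 0.5 for stability).
    The interior update u[i] depends only on u[i-1], u[i], u[i+1],
    which is what makes strip-wise partitioning across processors
    natural: each partition needs one ghost value per side."""
    un = u.copy()
    un[1:-1] = u[1:-1] + r * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return un

u = np.zeros(101)
u[50] = 100.0                    # point heat source in the middle
for _ in range(500):             # fixed-temperature (zero) ends
    u = heat_step(u)
print(round(u[50], 3), round(u[40], 3))   # heat has diffused outward
```

In a hierarchical scheme the same stencil locality is exploited recursively: substructures exchange only interface values at each level, keeping the communication bandwidth per level balanced against the computation it encloses.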
The Art and Science of Climate Model Tuning
Hourdin, Frederic; Mauritsen, Thorsten; Gettelman, Andrew; ...
2017-03-31
The process of parameter estimation targeting a chosen set of observations is an essential aspect of numerical modeling. This process is usually named tuning in the climate modeling community. In climate models, the variety and complexity of physical processes involved, and their interplay through a wide range of spatial and temporal scales, must be summarized in a series of approximate submodels. Most submodels depend on uncertain parameters. Tuning consists of adjusting the values of these parameters to bring the solution as a whole into line with aspects of the observed climate. Tuning is an essential aspect of climate modeling with its own scientific issues, which is probably not advertised enough outside the community of model developers. Optimization of climate models raises important questions about whether tuning methods a priori constrain the model results in unintended ways that would affect our confidence in climate projections. Here, we present the definition and rationale behind model tuning, review specific methodological aspects, and survey the diversity of tuning approaches used in current climate models. We also discuss the challenges and opportunities in applying so-called objective methods in climate model tuning, and how tuning methodologies may affect fundamental results of climate models, such as climate sensitivity. The article concludes with a series of recommendations to make the process of climate model tuning more transparent.
Current and Future Carbon Budgets of Tropical Rain Forest: A Cross Scale Analysis. Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oberbauer, S. F.
2004-01-16
The goal of this project was to make a first assessment of the major carbon stocks and fluxes and their climatic determinants in a lowland neotropical rain forest, the La Selva Biological Station, Costa Rica. Our research design was based on the concurrent use of several of the best available approaches, so that data could be cross-validated. A major focus of our effort was to combine meteorological studies of whole-forest carbon exchange (eddy flux) with parallel independent measurements of key components of the forest carbon budget. The eddy flux system operated from February 1998 to February 2001. To obtain field data that could be scaled up to the landscape level, we monitored carbon stocks and net primary productivity components, including tree growth and mortality, litterfall, woody debris production, root biomass, and soil respiration, in a series of replicated plots stratified across the major environmental gradients of the forest. A second major focus of this project was on the stocks and changes of carbon in the soil. We used isotope studies and intensive monitoring to investigate soil organic stocks and the climate-driven variation of soil respiration down the soil profile, in a set of six 4 m deep soil shafts stratified across the landscape. We measured short-term tree growth, climate responses of sap flow, and phenology in a suite of ten canopy trees to develop individual models relating tree growth to daytime weather variables.
Hamilton, Jill A; Aitken, Sally N
2013-08-01
Historic colonization and contemporary evolutionary processes contribute to patterns of genetic variation and differentiation among populations. However, separating the respective influences of these processes remains a challenge, particularly for natural hybrid zones, where standing genetic variation may result from evolutionary processes both preceding and following contact, influencing the evolutionary trajectory of hybrid populations. Where adaptation to novel environments may be facilitated by interspecific hybridization, teasing apart these processes will have practical implications for forest management in changing environments. We evaluated the neutral genetic architecture of the Picea sitchensis (Sitka spruce) × P. glauca (white spruce) hybrid zone along the Nass and Skeena river valleys in northwestern British Columbia using chloroplast, mitochondrial, and nuclear microsatellite markers, in combination with cone morphological traits. Sitka spruce mitotype "capture", evidenced by this species dominating the maternal lineage, is consistent with earlier colonization of the region by Sitka spruce. This "capture" differs from the spatial distribution of chloroplast haplotypes, indicating pollen dispersal and its contribution to geographic structure. Genetic ancestry, based on nuclear markers, was strongly influenced by climate and geography. Highly parallel results for replicate transects along environmental gradients provide support for the bounded hybrid superiority model of hybrid zone maintenance. This broad-scale analysis of neutral genetic structure indicates the importance of historic and contemporary gene flow, environmental selection, and their interaction in shaping neutral genetic variation within this hybrid zone, informative to seed transfer development and reforestation for future climates.
Majarena, Ana C.; Santolaria, Jorge; Samper, David; Aguilar, Juan J.
2010-01-01
This paper presents an overview of the literature on kinematic and calibration models of parallel mechanisms, the influence of sensors on mechanism accuracy, and parallel mechanisms used as sensors. The most relevant classifications for obtaining and solving kinematic models and for identifying geometric and non-geometric parameters in the calibration of parallel robots are discussed, examining the advantages and disadvantages of each method, presenting new trends and identifying unsolved problems. This overview tries to answer some of the most frequent questions that appear in the modelling of a parallel mechanism, such as how to measure, the number of sensors and configurations necessary, the type and influence of errors, or the number of parameters required, and to show the solutions developed by the most up-to-date research. PMID:22163469
Probabilistic projections of 21st century climate change over Northern Eurasia
NASA Astrophysics Data System (ADS)
Monier, Erwan; Sokolov, Andrei; Schlosser, Adam; Scott, Jeffery; Gao, Xiang
2013-12-01
We present probabilistic projections of 21st century climate change over Northern Eurasia using the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an Earth system model of intermediate complexity with a two-dimensional zonal-mean atmosphere to a human activity model. Regional climate change is obtained by two downscaling methods: a dynamical downscaling, where the IGSM is linked to a three-dimensional atmospheric model, and a statistical downscaling, where a pattern scaling algorithm uses climate change patterns from 17 climate models. This framework allows for four major sources of uncertainty in future projections of regional climate change to be accounted for: emissions projections, climate system parameters (climate sensitivity, strength of aerosol forcing and ocean heat uptake rate), natural variability, and structural uncertainty. The results show that the choice of climate policy and the climate parameters are the largest drivers of uncertainty. We also find that different initial conditions lead to differences in patterns of change as large as when using different climate models. Finally, this analysis reveals the wide range of possible climate change over Northern Eurasia, emphasizing the need to consider these sources of uncertainty when modeling climate impacts over Northern Eurasia.
Retargeting of existing FORTRAN program and development of parallel compilers
NASA Technical Reports Server (NTRS)
Agrawal, Dharma P.
1988-01-01
The software models used in implementing the parallelizing compiler for the B-HIVE multiprocessor system are described. The various models and strategies used in the compiler development are: the flexible granularity model, which allows a compromise between two extreme granularity models; the communication model, which is capable of precisely describing interprocessor communication timings and patterns; the loop type detection strategy, which identifies different types of loops; the critical path with coloring scheme, which is a versatile scheduling strategy for any multicomputer with associated communication costs; and the loop allocation strategy, which realizes optimal overlap between computation and communication in the system. Using these models, several sample routines of the AIR3D package are examined and tested. It may be noted that the automatically generated codes are highly parallelized to provide the maximum degree of parallelism, obtaining speedups on systems of up to 28 to 32 processors. A comparison of parallel codes for both the existing and the proposed communication model is performed, and the corresponding expected speedup factors are obtained. The experimentation shows that the B-HIVE compiler produces more efficient codes than existing techniques. Work is progressing well in completing the final phase of the compiler. Numerous enhancements are needed to improve the capabilities of the parallelizing compiler.
Research on Multi-Person Parallel Modeling Method Based on Integrated Model Persistent Storage
NASA Astrophysics Data System (ADS)
Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying
2018-03-01
This paper studies a multi-person parallel modeling method based on integrated model persistent storage. The integrated model refers to a set of MDDT modeling graphics systems that can carry out multi-angle, multi-level and multi-stage descriptions of general aerospace embedded software. Persistent storage refers to converting the data model in memory into a storage model and converting the storage model back into a data model in memory, where the data model is the object model and the storage model is a binary stream. Multi-person parallel modeling addresses the need for multi-person collaboration, separation of roles, and even real-time remote synchronized modeling.
NASA Astrophysics Data System (ADS)
Baker, Allison H.; Hu, Yong; Hammerling, Dorit M.; Tseng, Yu-heng; Xu, Haiying; Huang, Xiaomeng; Bryan, Frank O.; Yang, Guangwen
2016-07-01
The Parallel Ocean Program (POP), the ocean model component of the Community Earth System Model (CESM), is widely used in climate research. Most current work in CESM-POP focuses on improving the model's efficiency or accuracy, such as improving numerical methods, advancing parameterizations, porting to new architectures, or increasing parallelism. Since ocean dynamics are chaotic in nature, achieving bit-for-bit (BFB) identical results in ocean solutions cannot be guaranteed for even tiny code modifications, and determining whether modifications are admissible (i.e., statistically consistent with the original results) is non-trivial. In recent work, an ensemble-based statistical approach was shown to work well for software verification (i.e., quality assurance) on atmospheric model data. The general idea of ensemble-based statistical consistency testing is to use a quantitative measurement of the variability of an ensemble of simulations as a metric with which to compare future simulations and make a determination of statistical distinguishability. The capability to determine consistency without BFB results boosts model confidence and provides the flexibility needed, for example, for more aggressive code optimizations and the use of heterogeneous execution environments. Since ocean and atmosphere models have differing characteristics in terms of dynamics, spatial variability, and timescales, we present a new statistical method to evaluate ocean model simulation data that requires the evaluation of ensemble means and deviations in a spatial manner. In particular, the statistical distribution from an ensemble of CESM-POP simulations is used to determine the standard score of any new model solution at each grid point. The percentage of points with scores greater than a specified threshold then indicates whether the new model simulation is statistically distinguishable from the ensemble simulations. Both ensemble size and composition are important. Our experiments indicate that the new POP ensemble consistency test (POP-ECT) tool is capable of distinguishing cases that should be statistically consistent with the ensemble from those that should not, and of providing a simple, objective and systematic way to detect errors in CESM-POP due to the hardware or software stack, positively contributing to quality assurance for the CESM-POP code.
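The grid-point standard-score test described here can be sketched compactly in Python; the score and fail-fraction thresholds below are illustrative assumptions, not the calibrated values of the POP-ECT tool.

```python
import numpy as np

def pop_ect_sketch(ensemble, new_run, z_max=3.0, fail_fraction=0.05):
    """Grid-point consistency check in the spirit described above:
    z-score the new solution against the ensemble mean and standard
    deviation at every grid point, then flag the run as distinguishable
    if too many points exceed the score threshold.
    ensemble: (n_members, ny, nx); new_run: (ny, nx)."""
    mean = ensemble.mean(axis=0)
    std = ensemble.std(axis=0, ddof=1)
    valid = std > 0                      # skip degenerate points
    z = np.abs(new_run[valid] - mean[valid]) / std[valid]
    frac = float((z > z_max).mean())
    verdict = "consistent" if frac <= fail_fraction else "distinguishable"
    return verdict, frac

rng = np.random.default_rng(1)
ens = rng.normal(size=(30, 60, 80))
print(pop_ect_sketch(ens, rng.normal(size=(60, 80))))           # consistent
print(pop_ect_sketch(ens, rng.normal(2.0, 1.0, (60, 80))))      # biased run flagged
```

Evaluating scores point by point, rather than through a single global statistic, is what lets the test respect the strong spatial structure of ocean variability that the abstract emphasizes.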
Synergies Between Grace and Regional Atmospheric Modeling Efforts
NASA Astrophysics Data System (ADS)
Kusche, J.; Springer, A.; Ohlwein, C.; Hartung, K.; Longuevergne, L.; Kollet, S. J.; Keune, J.; Dobslaw, H.; Forootan, E.; Eicker, A.
2014-12-01
In the meteorological community, efforts converge towards the implementation of high-resolution (< 12 km) data-assimilating regional climate modelling/monitoring systems based on numerical weather prediction (NWP) cores. This is driven by requirements of improving process understanding, better representation of land surface interactions, atmospheric convection, and orographic effects, and better forecasting on shorter timescales. This is relevant for the GRACE community since (1) these models may provide improved atmospheric mass separation / de-aliasing and smaller topography-induced errors compared to global (ECMWF-Op, ERA-Interim) data; (2) they inherit high temporal resolution from NWP models; (3) parallel efforts towards improving the land surface component and coupling groundwater models may provide realistic hydrological mass estimates with sub-diurnal resolution; (4) parallel efforts towards re-analyses aim at providing consistent time series; and (5) on the other hand, GRACE can help validate models and aids in the identification of processes needing improvement. A coupled atmosphere - land surface - groundwater modelling system is currently being implemented for the European CORDEX region at 12.5 km resolution, based on the TerrSysMP platform (COSMO-EU NWP, CLM land surface and ParFlow groundwater models). We report results from Springer et al. (J. Hydromet., accepted) on validating the water cycle in COSMO-EU using GRACE, precipitation, evapotranspiration and runoff data, confirming that the model performs favorably at representing observations. We show that after GRACE-derived bias correction, basin-average hydrological conditions prior to 2002 can be reconstructed better than before. Next, comparing GRACE with CLM forced by EURO-CORDEX simulations allows identifying processes needing improvement in the model. Finally, we compare COSMO-EU atmospheric pressure, a proxy for mass corrections in satellite gravimetry, with ERA-Interim over Europe at timescales shorter/longer than 1 month, and spatial scales below/above ERA resolution. We find differences between the regional and global models more pronounced at high frequencies, with magnitudes at sub-grid and larger scales corresponding to 1-3 hPa (1-3 cm EWH); this is relevant for the assessment of post-GRACE concepts.
Selection of climate change scenario data for impact modelling.
Sloth Madsen, M; Maule, C Fox; MacKellar, N; Olesen, J E; Christensen, J Hesselbjerg
2012-01-01
Impact models investigating climate change effects on food safety often need detailed climate data. The aim of this study was to select climate change projection data for selected crop phenology and mycotoxin impact models. Using the ENSEMBLES database of climate model output, this study illustrates how the projected climate change signal of important variables such as temperature, precipitation and relative humidity depends on the choice of the climate model. Using climate change projections from at least two different climate models is recommended to account for model uncertainty. To make the climate projections suitable for impact analysis at the local scale, a weather generator approach was adopted. As the weather generator did not treat all the necessary variables, an ad-hoc statistical method was developed to synthesise realistic values of the missing variables. The method is presented in this paper, applied to relative humidity, but it could be adapted to other variables if needed.
Improving Climate Projections Using "Intelligent" Ensembles
NASA Technical Reports Server (NTRS)
Baker, Noel C.; Taylor, Patrick C.
2015-01-01
Recent changes in the climate system have led to growing concern, especially in communities which are highly vulnerable to resource shortages and weather extremes. There is an urgent need for better climate information to develop solutions and strategies for adapting to a changing climate. Climate models provide excellent tools for studying the current state of climate and making future projections. However, these models are subject to biases created by structural uncertainties. Performance metrics, i.e., the systematic determination of model biases, succinctly quantify aspects of climate model behavior. Efforts to standardize climate model experiments and collect simulation data, such as the Coupled Model Intercomparison Project (CMIP), provide the means to directly compare and assess model performance. Performance metrics have been used to show that some models reproduce present-day climate better than others. Simulation data from multiple models are often used to add value to projections by creating a consensus projection from the model ensemble, in which each model is given an equal weight. It has been shown that the ensemble mean generally outperforms any single model. It is possible to use unequal weights to produce ensemble means, in which models are weighted based on performance (called "intelligent" ensembles). Can performance metrics be used to improve climate projections? Previous work introduced a framework for comparing the utility of model performance metrics, showing that the best metrics are related to the variance of top-of-atmosphere outgoing longwave radiation. These metrics improve present-day climate simulations of Earth's energy budget using the "intelligent" ensemble method. The current project identifies several approaches for testing whether performance metrics can be applied to future simulations to create "intelligent" ensemble-mean climate projections. It is shown that certain performance metrics test key climate processes in the models, and that these metrics can be used to evaluate model quality in both current and future climate states. This information will be used to produce new consensus projections and provide communities with improved climate projections for urgent decision-making.
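The "intelligent" ensemble idea, i.e., weighting models by a performance metric instead of equally, can be sketched as follows (inverse-RMSE weights and the synthetic fields are stand-ins; the study's actual metrics relate to the variance of top-of-atmosphere outgoing longwave radiation):

```python
# Hedged sketch of a performance-weighted ("intelligent") ensemble mean.
import numpy as np

def weighted_ensemble_mean(models, obs):
    """models: (n_models, ny, nx); obs: (ny, nx)."""
    rmse = np.sqrt(((models - obs) ** 2).mean(axis=(1, 2)))
    weights = 1.0 / rmse                    # better-performing models weigh more
    weights /= weights.sum()
    return np.tensordot(weights, models, axes=1)

rng = np.random.default_rng(1)
obs = rng.normal(0.0, 1.0, (32, 64))
sigmas = np.array([0.2, 0.5, 1.0]).reshape(3, 1, 1)   # per-model error levels
models = obs + rng.normal(0.0, sigmas, size=(3, 32, 64))
consensus = weighted_ensemble_mean(models, obs)
```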
Psychosocial effects of workplace physical exercise among workers with chronic pain
Andersen, Lars L.; Persson, Roger; Jakobsen, Markus D.; Sundstrup, Emil
2017-01-01
While workplace physical exercise can help manage musculoskeletal disorders, less is known about the psychosocial effects of such interventions. The aim of this study was to investigate the effect of workplace physical exercise on psychosocial factors among workers with chronic musculoskeletal pain. The trial design was a 2-armed parallel-group randomized controlled trial with allocation concealment. A total of 66 slaughterhouse workers (51 men and 15 women, mean age 45 years [standard deviation (SD) 10]) with upper limb chronic musculoskeletal pain were randomly allocated to group-based strength training (physical exercise group) or individual ergonomic training and education (reference group) for 10 weeks. Social climate was assessed with the General Nordic Questionnaire for Psychological and Social Factors at Work, and vitality and mental health were assessed with the 36-item Short Form Health Survey. All scales were converted to 0 to 100 (higher scores are better). Between-group differences from baseline to follow-up were determined using linear mixed models adjusted for workplace, age, gender, and baseline values of the outcome. Mean baseline scores of social climate, mental health, and vitality were 52.2 (SD 14.9), 79.5 (SD 13.7), and 53.9 (SD 19.7), respectively. Complete baseline and follow-up data were obtained from 30 and 31 participants in the physical exercise and reference groups, respectively. The between-group differences from baseline to follow-up between physical exercise and reference were 7.6 (95% CI 0.3 to 14.9), -2.3 (95% CI -10.3 to 5.8), and 10.1 (95% CI 0.6 to 19.5) for social climate, mental health, and vitality, respectively. For social climate and vitality, this corresponded to moderate effect sizes (Cohen d = 0.51 for both) in favor of physical exercise. There were no reported adverse events. In conclusion, workplace physical exercise performed together with colleagues improves social climate and vitality among workers with chronic musculoskeletal pain. Mental health remained unchanged. PMID:28072707
NASA Astrophysics Data System (ADS)
Muszynski, G.; Kashinath, K.; Wehner, M. F.; Prabhat, M.; Kurlin, V.
2017-12-01
We investigate novel approaches to detecting, classifying and characterizing extreme weather events, such as atmospheric rivers (ARs), in large high-dimensional climate datasets. ARs are narrow filaments of concentrated water vapour in the atmosphere that bring much of the precipitation in many mid-latitude regions. The precipitation associated with ARs is also responsible for major flooding events in many coastal regions of the world, including the west coast of the United States and western Europe. In this study we combine ideas from Topological Data Analysis (TDA) with Machine Learning (ML) for detecting, classifying and characterizing extreme weather events like ARs. TDA is a young field at the interface between topology and computer science that studies "shape", i.e., hidden topological structure, in raw data. It has been applied successfully in many areas of the applied sciences, including complex networks, signal processing and image recognition. Using TDA we provide ARs with a shape characteristic as a new feature descriptor for the task of AR classification. In particular, we track the change in topology of precipitable water (integrated water vapour) fields using the Union-Find algorithm. We use the generated feature descriptors with ML classifiers to establish the reliability and classification performance of our approach. We utilize the parallel toolkit for extreme climate events analysis (TECA: Petascale Pattern Recognition for Climate Science, Prabhat et al., Computer Analysis of Images and Patterns, 2015) for comparison (events identified by TECA are assumed to be ground truth). Preliminary results indicate that our approach brings new insight into the study of ARs and provides quantitative information about the relevance of topological feature descriptors in analyses of large climate datasets. We illustrate this method on climate model output and NCEP reanalysis datasets. Further, our method outperforms existing methods on detection and classification of ARs. This work illustrates that TDA combined with ML may provide a uniquely powerful approach for the detection, classification and characterization of extreme weather phenomena.
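The Union-Find step named above can be sketched on a thresholded precipitable-water field: connected regions above a threshold are merged, and their count is one simple topological summary (the grid and threshold below are synthetic):

```python
# Hedged sketch: union-find over a thresholded 2-D field, counting
# connected components (4-connectivity). Data are synthetic.
import numpy as np

def count_components(mask):
    parent = {(i, j): (i, j) for i, j in zip(*np.nonzero(mask))}
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a
    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb
    for (i, j) in list(parent):
        if (i - 1, j) in parent:
            union((i, j), (i - 1, j))
        if (i, j - 1) in parent:
            union((i, j), (i, j - 1))
    return len({find(p) for p in parent})

pw = np.random.default_rng(2).random((50, 100))  # stand-in for precipitable water
n_regions = count_components(pw > 0.8)
```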
Living with a Star: New Opportunities in Sun-Climate Research
NASA Technical Reports Server (NTRS)
2003-01-01
Living With a Star is a NASA initiative employing the combination of dedicated spacecraft with targeted research and modeling efforts to improve what we know of solar effects of all kinds on the Earth and its surrounding space environment, with particular emphasis on those that have significant practical impacts on life and society. The highest priority among these concerns is the subject of this report: the potential effects of solar variability on regional and global climate, including the extent to which solar variability has contributed to the well-documented warming of the Earth in the last 100 years. Understanding how the climate system reacts to external forcing from the Sun will also greatly improve our knowledge of how climate will respond to other climate drivers, including those of anthropogenic origin. A parallel element of the LWS program addresses solar effects on space weather: the impulsive emissions of charged particles, short-wave electromagnetic radiation and magnetic disturbances in the upper atmosphere and near-Earth environment that also affect life and society. These include a wide variety of solar impacts on aeronautics, astronautics, electric power transmission, and national defense. Specific examples are (1) the impacts of potentially-damaging high energy radiation and atomic particles of solar origin on satellites and satellite operations, spacecraft electronics systems and components, electronic communications, electric power distribution grids, navigational and GPS systems, and high altitude aircraft; and (2) the threat of sporadic, high-energy solar radiation to astronauts and high altitude aircraft passengers and crews. Elements of the LWS program include an array of dedicated spacecraft in near-Earth and near-Sun orbits that will closely study and observe both the Sun itself and the impacts of its variations on the Earth's radiation belts and magnetosphere, the upper atmosphere, and ionosphere. These spacecraft, positioned to study and monitor changing conditions in the Sun-Earth neighborhood, will also serve as sentinels of solar storms and impulsive events.
NASA Astrophysics Data System (ADS)
Ollinger, S. V.; Silverberg, S.; Albrechtova, J.; Freuder, R.; Gengarelly, L.; Martin, M.; Randolph, G.; Schloss, A.
2007-12-01
The global carbon cycle is a key regulator of the Earth's climate and is central to the normal function of ecological systems. Because rising atmospheric CO2 is the principal cause of climate change, understanding how ecosystems cycle and store carbon has become an extremely important issue. In recent years, the growing importance of the carbon cycle has brought it to the forefront of both science and environmental policy. The need for better scientific understanding has led to the establishment of numerous research programs, such as the North American Carbon Program (NACP), which seeks to understand controls on carbon cycling under present and future conditions. Parallel efforts are greatly needed to integrate state-of-the-art science on the carbon cycle and its importance to climate with education and outreach efforts that help prepare society to make sound decisions on energy use, carbon management and climate change adaptation. Here, we present a new effort that joins carbon cycle scientists with the International GLOBE Education program to develop carbon cycle activities for K-12 classrooms. The GLOBE Carbon Cycle project is focused on bringing cutting-edge research and research techniques in the field of terrestrial ecosystem carbon cycling into the classroom. Students will collect data about their school field site through existing protocols of phenology, land cover and soils, as well as new protocols focused on leaf traits and ecosystem growth and change. They will also participate in classroom activities to understand carbon cycling in terrestrial ecosystems; these will include plant-a-plant experiments, hands-on demonstrations of various concepts, and analysis of collected data. In addition to the traditional GLOBE experience, students will have the opportunity to integrate their data with emerging and expanding technologies, including global and local carbon cycle models and remote sensing toolkits. This program design will allow students to explore research questions from local to global scales under both present and future environmental conditions.
Steele, Madeline O.; Chang, Heejun; Reusser, Deborah A.; Brown, Cheryl A.; Jung, Il-Won
2012-01-01
As part of a larger investigation into potential effects of climate change on estuarine habitats in the Pacific Northwest, we estimated changes in freshwater inputs into four estuaries: Coquille River estuary, South Slough of Coos Bay, and Yaquina Bay in Oregon, and Willapa Bay in Washington. We used the U.S. Geological Survey's Precipitation Runoff Modeling System (PRMS) to model watershed hydrological processes under current and future climatic conditions. This model allowed us to explore possible shifts in coastal hydrologic regimes at a range of spatial scales. All modeled watersheds are located in rainfall-dominated coastal areas with relatively insignificant base flow inputs, and their areas vary from 74.3 to 2,747.6 square kilometers. The watersheds also vary in mean elevation, ranging from 147 meters in the Willapa to 1,179 meters in the Coquille. The latitudes of watershed centroids range from 43.037 degrees north latitude in the Coquille River estuary to 46.629 degrees north latitude in Willapa Bay. We calibrated model parameters using historical climate grid data downscaled to one-sixteenth of a degree by the Climate Impacts Group, and historical runoff from sub-watersheds or neighboring watersheds. Nash-Sutcliffe efficiency values for daily flows in calibration sub-watersheds ranged from 0.71 to 0.89. After calibration, we forced the PRMS models with four North American Regional Climate Change Assessment Program climate models: Canadian Regional Climate Model-(National Center for Atmospheric Research) Community Climate System Model version 3, Canadian Regional Climate Model-Canadian Global Climate Model version 3, Hadley Regional Model version 3-Hadley Centre Climate Model version 3, and Regional Climate Model-Canadian Global Climate Model version 3. These are global climate models (GCMs) downscaled with regional climate models that are embedded within the GCMs, and all use the A2 carbon emission scenario developed by the Intergovernmental Panel on Climate Change. With these climate-forcing outputs, we derived the mean change in flow from the period encompassing the 1980s (1971-1995) to the period encompassing the 2050s (2041-2065). Specifically, we calculated percent change in mean monthly flow rate, coefficient of variation, top 5 percent of flow, and 7-day low flow. The trends with the most agreement among climate models and among watersheds were increases in autumn mean monthly flows, especially in October and November, decreases in summer monthly mean flow, and increases in the top 5 percent of flow. We also estimated variance in PRMS outputs owing to parameter uncertainty and the selection of climate model using Latin hypercube sampling. This analysis showed that PRMS low-flow simulations are more uncertain than medium- or high-flow simulations, and that variation among climate models was a larger source of uncertainty than the hydrological model parameters. These results improve our understanding of how climate change may affect the saltwater-freshwater balance in Pacific Northwest estuaries, with implications for their sensitive ecosystems.
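The Nash-Sutcliffe efficiency values quoted above (0.71-0.89) follow the standard definition NSE = 1 - Σ(Q_obs - Q_sim)² / Σ(Q_obs - mean(Q_obs))²; a direct implementation (the flow values below are invented):

```python
# Nash-Sutcliffe efficiency of simulated vs. observed daily flows.
import numpy as np

def nash_sutcliffe(sim, obs):
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([10.0, 12.0, 30.0, 22.0, 15.0])   # observed flow, m3/s (made up)
sim = np.array([11.0, 13.0, 27.0, 21.0, 16.0])   # simulated flow (made up)
nse = nash_sutcliffe(sim, obs)                   # 1.0 would be a perfect fit
```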
Designing ecological climate change impact assessments to reflect key climatic drivers
Sofaer, Helen R.; Barsugli, Joseph J.; Jarnevich, Catherine S.; Abatzoglou, John T.; Talbert, Marian; Miller, Brian W.; Morisette, Jeffrey T.
2017-01-01
Identifying the climatic drivers of an ecological system is a key step in assessing its vulnerability to climate change. The climatic dimensions to which a species or system is most sensitive – such as means or extremes – can guide methodological decisions for projections of ecological impacts and vulnerabilities. However, scientific workflows for combining climate projections with ecological models have received little explicit attention. We review Global Climate Model (GCM) performance along different dimensions of change and compare frameworks for integrating GCM output into ecological models. In systems sensitive to climatological means, it is straightforward to base ecological impact assessments on mean projected changes from several GCMs. Ecological systems sensitive to climatic extremes may benefit from what we term the ‘model space’ approach: a comparison of ecological projections based on simulated climate from historical and future time periods. This approach leverages the experimental framework used in climate modeling, in which historical climate simulations serve as controls for future projections. Moreover, it can capture projected changes in the intensity and frequency of climatic extremes, rather than assuming that future means will determine future extremes. Given the recent emphasis on the ecological impacts of climatic extremes, the strategies we describe will be applicable across species and systems. We also highlight practical considerations for the selection of climate models and data products, emphasizing that the spatial resolution of the climate change signal is generally coarser than the grid cell size of downscaled climate model output. Our review illustrates how an understanding of how climate model outputs are derived and downscaled can improve the selection and application of climatic data used in ecological modeling.
The Effects of Climate Model Similarity on Local, Risk-Based Adaptation Planning
NASA Astrophysics Data System (ADS)
Steinschneider, S.; Brown, C. M.
2014-12-01
The climate science community has recently proposed techniques to develop probabilistic projections of climate change from ensemble climate model output. These methods provide a means to incorporate the formal concept of risk, i.e., the product of impact and probability, into long-term planning assessments for local systems under climate change. However, approaches for developing probability density functions (pdfs) often assume that different climate models provide independent information for the estimation of probabilities, despite model similarities that stem from a common genealogy. Here we utilize an ensemble of projections from the Coupled Model Intercomparison Project Phase 5 (CMIP5) to develop probabilistic climate information, with and without an accounting of inter-model correlations, and use it to estimate climate-related risks to a local water utility in Colorado, U.S. We show that the tail risk of extreme climate changes in both mean precipitation and temperature is underestimated if model correlations are ignored. When coupled with impact models of the hydrology and infrastructure of the water utility, the underestimation of extreme climate changes substantially alters the quantification of risk for water supply shortages by mid-century. We argue that progress in climate change adaptation for local systems requires the recognition that there is less information in multi-model climate ensembles than previously thought. Importantly, adaptation decisions cannot be limited to the spread in one generation of climate models.
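The core point, that correlated models carry less independent information than their count suggests, can be made with the textbook variance formula for a mean of equicorrelated quantities (the numbers below are purely illustrative, not taken from the study):

```python
# For n models with common spread sigma and mean pairwise correlation rho,
# Var(ensemble mean) = sigma**2 / n * (1 + (n - 1) * rho); treating the
# models as independent drops the correction factor and understates spread.
n, sigma, rho = 20, 1.0, 0.4              # illustrative values only
var_independent = sigma**2 / n
var_correlated = sigma**2 / n * (1 + (n - 1) * rho)
print(var_correlated / var_independent)   # 8.6: tails are far wider
```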
Fully Parallel MHD Stability Analysis Tool
NASA Astrophysics Data System (ADS)
Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang
2014-10-01
Progress on the full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria in MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters. It will be an efficient tool for the simulation of MHD instabilities with low, intermediate and high toroidal mode numbers within both the fluid and kinetic plasma models already implemented in MARS. Parallelization of the code includes parallelization of the construction of the matrix for the eigenvalue problem and parallelization of the inverse-iteration algorithm implemented in MARS for the solution of the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. Parallelization of the solution of the eigenvalue problem is achieved by repeating the steps of the present MARS algorithm using parallel libraries and procedures. Initial results of the code parallelization will be reported. Work is supported by the U.S. DOE SBIR program.
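The inverse-iteration step named above can be sketched on a small dense generalized eigenproblem A x = λ B x near a shift σ (MARS itself factors a much larger structured matrix and distributes the work over magnetic surfaces):

```python
# Hedged sketch of shifted inverse iteration for A x = lambda B x.
import numpy as np

def inverse_iteration(A, B, sigma, iters=50):
    x = np.random.default_rng(3).normal(size=A.shape[0])
    M = A - sigma * B                       # in practice factored once
    for _ in range(iters):
        x = np.linalg.solve(M, B @ x)       # one inverse-iteration step
        x /= np.linalg.norm(x)
    return (x @ A @ x) / (x @ B @ x), x     # Rayleigh-quotient estimate

A = np.diag([1.0, 4.0, 9.0])
B = np.eye(3)
lam, x = inverse_iteration(A, B, sigma=3.8)   # converges to 4.0, the
                                              # eigenvalue nearest the shift
```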
Equatorward dispersion of the Sarychev volcanic plume and the relation to the Asian summer monsoon
NASA Astrophysics Data System (ADS)
Wu, Xue; Griessbach, Sabine; Hoffmann, Lars
2017-04-01
Sulfur dioxide emissions and the subsequent sulfate aerosols from strong volcanic eruptions have a large impact on global climate. Although most previous studies attribute this global influence to volcanic eruptions in the tropics, high-latitude volcanic eruptions are also an important cause of global climate variations. In fact, the potential climate impact of a volcanic eruption also largely depends on the season in which the eruption occurs, the erupted plume height and the surrounding meteorological conditions. This work focuses on the eruption of the high-latitude volcano Sarychev and the role of the Asian summer monsoon (ASM) in the transport and dispersion of the erupted plumes. First, the sulfur dioxide emission rate and emission height of the Sarychev eruption in June 2009 are modelled using the Lagrangian particle dispersion model Massive-Parallel Trajectory Calculations (MPTRAC), together with sulfur dioxide observations from the Atmospheric Infrared Sounder (AIRS/Aqua) and a backward trajectory approach. Then, the transport and dispersion of the plumes are modelled with MPTRAC and validated with sulfur dioxide observations from AIRS and aerosol observations from the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS). The modelled trajectories and the MIPAS data both show that the plumes were transported towards the tropics from the southeast edge of the ASM (in the vertical range of 340-400 K), controlled by the clockwise winds of the ASM, and from above the ASM (above 400 K) through an in-mixing process. In particular, in the vertical range of about 340-400 K, a transport barrier based on potential vorticity (PV) gradients separates the 'aerosol hole' inside the ASM circulation from the aerosol-rich surrounding area, suggesting that a PV-gradient-based barrier may be more practical than one based on geopotential height. With the help of the ASM circulation, the aerosol was transported to the tropics and remained in the tropical lower stratosphere for about eight months, constituting the main aerosol source during that time. This enables the Sarychev eruption to have a potential impact on the global radiative budget similar to that of a tropical volcanic eruption.
Parallel Climate Data Assimilation PSAS Package Achieves 18 GFLOPs on 512-Node Intel Paragon
NASA Technical Reports Server (NTRS)
Ding, H. Q.; Chan, C.; Gennery, D. B.; Ferraro, R. D.
1995-01-01
Several algorithms were added to the Physical-space Statistical Analysis System (PSAS) from Goddard, which assimilates observational weather data by correcting for different levels of uncertainty about the data and different locations for mobile observation platforms. The new algorithms and use of the 512-node Intel Paragon allowed a hundred-fold decrease in processing time.
NASA Astrophysics Data System (ADS)
Brinkerhoff, D. J.; Johnson, J. V.
2013-07-01
We introduce a novel, higher order, finite element ice sheet model called VarGlaS (Variational Glacier Simulator), which is built on the finite element framework FEniCS. Contrary to standard procedure in ice sheet modelling, VarGlaS formulates ice sheet motion as the minimization of an energy functional, conferring advantages such as a consistent platform for making numerical approximations, a coherent relationship between motion and heat generation, and implicit boundary treatment. VarGlaS also solves the equations of enthalpy rather than temperature, avoiding the solution of a contact problem. Rather than include a lengthy model spin-up procedure, VarGlaS possesses an automated framework for model inversion. These capabilities are brought to bear on several benchmark problems in ice sheet modelling, as well as a 500 yr simulation of the Greenland ice sheet at high resolution. VarGlaS performs well in benchmarking experiments and, given a constant climate and a 100 yr relaxation period, predicts a mass evolution of the Greenland ice sheet that matches present-day observations of mass loss. VarGlaS predicts a thinning in the interior and thickening of the margins of the ice sheet.
Chenier plain genesis explained by feedbacks between waves, mud, and sand
NASA Astrophysics Data System (ADS)
Nardin, William; Fagherazzi, Sergio
2017-04-01
Cheniers are sandy ridges parallel to the coast established by high-energy waves. Here we discuss the ontogeny of chenier plains through dimensional analysis and numerical results from the morphodynamic model Delft3D-SWAN. Our results show that wave energy and inner-shelf slope play an important role in the formation of chenier plains. In our numerical experiments, waves affect chenier plain development in three ways: by winnowing coarse sediment from the mudflat, by eroding mud, and by accumulating sand over the beach during extreme wave events. We further show that different sediment characteristics and wave climates can lead to three alternative coastal landscapes: strand plains, mudflats, or the more complex chenier plains. Low inner-shelf slopes are the most favorable for strand plain and chenier plain formation, while high slopes decrease the likelihood of mudflat development and preservation.
Chenier plain development: feedbacks between waves, mud and sand
NASA Astrophysics Data System (ADS)
Nardin, W.; Fagherazzi, S.
2015-12-01
Cheniers are sandy ridges parallel to the coast established by high-energy waves. Here we discuss the ontogeny of Chenier plains through dimensional analysis and numerical results from the morphodynamic model Delft3D-SWAN. Our results show that wave energy and shelf slope play an important role in the formation of Chenier plains. In our numerical experiments, waves affect Chenier plain development in three ways: by winnowing sediment from the mudflat, by eroding mud, and by accumulating sand over the beach during extreme wave events. We further show that different sediment characteristics and wave climates can lead to three alternative coastal landscapes: strand plains, mudflats, or the more complex Chenier plains. Low inner-shelf slopes are the most favorable for strand plain and Chenier plain formation, while high slopes decrease the likelihood of mudflat development and preservation.
Rupp, David E.
2016-05-05
The 20th century climate for the Southeastern United States and surrounding areas as simulated by global climate models used in the Coupled Model Intercomparison Project Phase 5 (CMIP5) was evaluated. A suite of statistics that characterize various aspects of the regional climate was calculated from both model simulations and observation-based datasets. CMIP5 global climate models were ranked by their ability to reproduce the observed climate. Differences in the performance of the models between regions of the United States (the Southeastern and Northwestern United States) warrant a regional-scale assessment of CMIP5 models.
Sustained Assessment Metadata as a Pathway to Trustworthiness of Climate Science Information
NASA Astrophysics Data System (ADS)
Champion, S. M.; Kunkel, K.
2017-12-01
The Sustained Assessment process has produced a suite of climate change reports: The Third National Climate Assessment (NCA3), Regional Surface Climate Conditions in CMIP3 and CMIP5 for the United States: Differences, Similarities, and Implications for the U.S. National Climate Assessment, Impacts of Climate Change on Human Health in the United States: A Scientific Assessment, The State Climate Summaries, as well as the anticipated Climate Science Special Report and Fourth National Climate Assessment. Not only are these groundbreaking reports of climate change science, they are also the first suite of climate science reports to provide access to complex metadata directly connected to the report figures and graphics products. While the basic metadata documentation requirement is federally mandated through a series of federal guidelines as part of the Information Quality Act, Sustained Assessment products are also deemed Highly Influential Scientific Assessments, which further requires demonstration of the transparency and reproducibility of the content. To meet these requirements, the Technical Support Unit (TSU) for the Sustained Assessment embarked on building a system for not only collecting and documenting metadata to the required standards, but also providing consumers unprecedented access to the underlying data and methods. As our process and documentation have evolved, the value of both continues to grow in parallel with the consumer expectation of quality, accessible climate science information. This presentation will detail how the TSU accomplishes the mandated requirements with its metadata collection and documentation process, as well as the technical solution designed to demonstrate compliance while also providing access to the content for the general public. We will also illustrate how our accessibility platforms guide consumers through the Assessment science at a level of transparency that builds trust and confidence in the report content.
Rapid genetic divergence in response to 15 years of simulated climate change.
Ravenscroft, Catherine H; Whitlock, Raj; Fridley, Jason D
2015-11-01
Genetic diversity may play an important role in allowing individual species to resist climate change, by permitting evolutionary responses. Our understanding of the potential for such responses to climate change remains limited, and very few experimental tests have been carried out within intact ecosystems. Here, we use amplified fragment length polymorphism (AFLP) data to assess genetic divergence and test for signatures of evolutionary change driven by long-term simulated climate change applied to natural grassland at Buxton Climate Change Impacts Laboratory (BCCIL). Experimental climate treatments were applied to grassland plots for 15 years using a replicated and spatially blocked design and included warming, drought and precipitation treatments. We detected significant genetic differentiation between climate change treatments and control plots in two coexisting perennial plant study species (Festuca ovina and Plantago lanceolata). Outlier analyses revealed a consistent signature of selection associated with experimental climate treatments at individual AFLP loci in P. lanceolata, but not in F. ovina. Average background differentiation at putatively neutral AFLP loci was close to zero, and genomewide genetic structure was associated neither with species abundance changes (demography) nor with plant community-level responses to long-term climate treatments. Our results demonstrate genetic divergence in response to a suite of climatic environments in reproductively mature populations of two perennial plant species and are consistent with an evolutionary response to climatic selection in P. lanceolata. These genetic changes have occurred in parallel with impacts on plant community structure and may have contributed to the persistence of individual species through 15 years of simulated climate change at BCCIL.
NASA Astrophysics Data System (ADS)
Challinor, A. J.
2010-12-01
Recent progress in assessing the impacts of climate variability and change on crops using multiple regional-scale simulations of crop and climate (i.e. ensembles) is presented. Simulations for India and China used perturbed responses to elevated carbon dioxide constrained using observations from FACE studies and controlled environments. Simulations with crop parameter sets representing existing and potential future adapted varieties were also carried out. The results for India are compared to sensitivity tests on two other crop models. For China, a parallel approach used socio-economic data to account for autonomous farmer adaptation. Results for the USA analysed cardinal temperatures under a range of local warming scenarios for 2711 varieties of spring wheat. The results are as follows: 1. Quantifying and reducing uncertainty. The relative contribution of uncertainty in crop and climate simulation to the total uncertainty in projected yield changes is examined. The observational constraints from FACE and controlled environment studies are shown to be the likely critical factor in maintaining relatively low crop parameter uncertainty. Without these constraints, crop simulation uncertainty in a doubled CO2 environment would likely be greater than uncertainty in simulating climate. However, consensus across crop models in India varied across different biophysical processes. 2. The response of yield to changes in local mean temperature was examined and compared to that found in the literature. No consistent response to temperature change was found across studies. 3. Implications for adaptation. China. The simulations of spring wheat in China show the relative importance of tolerance to water and heat stress in avoiding future crop failures. The greatest potential for reducing the number of harvests less than one standard deviation below the baseline mean yield value comes from alleviating water stress; the greatest potential for reducing harvests less than two standard deviations below the mean comes from alleviation of heat stress. The socio-economic analysis suggests that adaptation is also possible through measures such as greater investment. India. The simulations of groundnut in India identified regions where heat stress will play an increasing role in limiting crop yields, and other regions where crops with greater thermal time requirement will be needed. The simulations were used, together with an observed dataset and a simple analysis of crop cardinal temperatures and thermal time, to estimate the potential for adaptation using existing cultivars. USA. Analysis of spring wheat in the USA showed that at +2°C of local warming, 87% of the 2711 varieties examined, and all of the five most common varieties, could be used to maintain the crop duration of the current climate (i.e. successful adaptation to mean warming). At +4°C this fell to 54% of all varieties, and two of the top five. 4. Future research. The results, and the limitations of the study, suggest directions for research to link climate and crop models, socio-economic analyses and crop variety trial data in order to prioritise adaptation options such as capacity building, plant breeding and biotechnology.
Partitioning and packing mathematical simulation models for calculation on parallel computers
NASA Technical Reports Server (NTRS)
Arpasi, D. J.; Milner, E. J.
1986-01-01
The development of multiprocessor simulations from a serial set of ordinary differential equations describing a physical system is described. Degrees of parallelism (i.e., coupling between the equations) and their impact on parallel processing are discussed. The problem of identifying computational parallelism within sets of closely coupled equations that require the exchange of current values of variables is described. A technique is presented for identifying this parallelism and for partitioning the equations for parallel solution on a multiprocessor. An algorithm which packs the equations into a minimum number of processors is also described. The results of the packing algorithm when applied to a turbojet engine model are presented in terms of processor utilization.
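The packing step described above is, in essence, a bin-packing problem; a hypothetical first-fit-decreasing sketch (equation costs and the per-step capacity are invented, and the paper's actual algorithm may differ):

```python
# Hedged sketch: pack equations onto as few processors as possible so no
# processor exceeds a per-step compute budget (first-fit decreasing).
def pack_equations(costs, capacity):
    processors = []                          # entries: [load, [equation ids]]
    for eq in sorted(costs, key=costs.get, reverse=True):
        for proc in processors:
            if proc[0] + costs[eq] <= capacity:
                proc[0] += costs[eq]
                proc[1].append(eq)
                break
        else:                                # no existing processor had room
            processors.append([costs[eq], [eq]])
    return processors

costs = {"eq1": 7, "eq2": 5, "eq3": 4, "eq4": 3, "eq5": 2}  # made-up costs
print(pack_equations(costs, capacity=10))    # three processors suffice here
```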
First results of high-resolution modeling of Cenozoic subduction orogeny in Andes
NASA Astrophysics Data System (ADS)
Liu, S.; Sobolev, S. V.; Babeyko, A. Y.; Krueger, F.; Quinteros, J.; Popov, A.
2016-12-01
The Andean Orogeny is the result of upper-plate crustal shortening during the Cenozoic subduction of the Nazca plate beneath the South American plate. With up to 300 km of shortening, the Earth's second-highest plateau, the Altiplano-Puna Plateau, was formed with a pronounced N-S oriented diversity of deformation. Furthermore, tectonic shortening in the Southern Andes was much less intensive and started much later. The mechanism of the shortening and the nature of the N-S variation of its magnitude remain controversial. Previous studies of the Central Andes suggested that they might be related to N-S variation in the strength of the lithosphere and in the friction coupling at the slab interface, and are probably influenced by the interaction of the climate and tectonic systems. However, the exact nature of the strength variation was not explored due to the lack of high numerical resolution and 3D numerical models at that time. Here we employ large-scale, high-resolution subduction models to reveal and quantify the factors controlling the strength of lithospheric structures and their effect on the magnitude of tectonic shortening in the South American plate between 18°-35°S. These high-resolution models are performed using the highly scalable parallel 3D code LaMEM (Lithosphere and Mantle Evolution Model). This code is based on a finite-difference staggered-grid approach and employs massive linear and non-linear solvers within the PETSc library for high-performance MPI-based parallelization in geodynamic modeling. Currently, in addition to benchmark models, we are developing high-resolution (< 1 km) 2D subduction models with application to Nazca-South America convergence. In particular, we will present models focusing on the effect of friction reduction in the Paleozoic-Cenozoic sediments above the uppermost crust in the Subandean Ranges. Future work will focus on the origin of different styles of deformation and topography evolution in the Altiplano-Puna Plateau and the Central-Southern Andes through 3D modeling of the large-scale interaction of the subducting and overriding plates.
OpenMP parallelization of a gridded SWAT (SWATG)
NASA Astrophysics Data System (ADS)
Zhang, Ying; Hou, Jinliang; Cao, Yongpan; Gu, Juan; Huang, Chunlin
2017-12-01
Large-scale, long-term, high-spatial-resolution simulation is a common issue in environmental modeling. A gridded Hydrologic Response Unit (HRU)-based Soil and Water Assessment Tool (SWATG) that integrates a grid modeling scheme with different spatial representations presents exactly this problem: long run times restrict applications of very-high-resolution, large-scale watershed modeling. The OpenMP (Open Multi-Processing) parallel application interface is integrated with SWATG (the result is called SWATGP) to accelerate grid modeling at the HRU level. Such a parallel implementation takes better advantage of the computational power of a shared-memory computer system. We conducted two experiments at multiple temporal and spatial scales of hydrological modeling using SWATG and SWATGP on a high-end server. At 500-m resolution, SWATGP was found to be up to nine times faster than SWATG in modeling a roughly 2000 km2 watershed on one CPU with a 15-thread configuration. The study results demonstrate that parallel models save considerable time relative to traditional sequential simulation runs. Parallel computation of environmental models is beneficial for model applications, especially at large spatial and temporal scales and at high resolutions. The proposed SWATGP model is thus a promising tool for large-scale, high-resolution water resources research and management, in addition to offering data fusion and model coupling ability.
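The pattern described, parallelizing the independent per-HRU loop on shared memory while channel routing stays sequential, looks roughly like the following Python analogue of an OpenMP parallel-for (the HRU function is a placeholder):

```python
# Hedged sketch: loop-level parallelism over HRUs, analogous to an OpenMP
# "parallel for" but using a Python process pool. simulate_hru is a stub.
from concurrent.futures import ProcessPoolExecutor

def simulate_hru(hru_id):
    # placeholder for one gridded HRU's water balance for one time step
    return hru_id, (0.75 * hru_id) % 5.0     # fake runoff value

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=15) as pool:
        runoff = dict(pool.map(simulate_hru, range(10_000)))
    # channel routing would then consume `runoff` sequentially
```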
NASA Astrophysics Data System (ADS)
McGibbney, L. J.; Rittger, K.; Painter, T. H.; Selkowitz, D.; Mattmann, C. A.; Ramirez, P.
2014-12-01
As part of a JPL-USGS collaboration to expand distribution of essential climate variables (ECV) to include on-demand fractional snow cover, we describe our experience and implementation of a shift towards NVIDIA's CUDA® parallel computing platform and programming model. In particular, the on-demand aspect of this work involves faster processing and a reduction in overall running times for the determination of fractional snow-covered area (fSCA) from Landsat TM/ETM+. Our observations indicate that processing tasks associated with remote sensing, including the Snow Covered Area and Grain Size Model (SCAG) applied to MODIS or Landsat TM/ETM+, are computationally intensive. We believe the shift to the CUDA programming paradigm represents a significant improvement in the ability to more quickly assess the outcomes of such activities, and we use the TMSCAG model as our subject to support this argument. We do so by describing how we can ingest a Landsat surface reflectance image (typically provided in HDF format) and perform spectral mixture analysis to produce land cover fractions, including snow, vegetation and rock/soil, while greatly reducing the running time of such tasks. Within the scope of this work we first document the original workflow used to estimate fSCA for Landsat TM and its primary shortcomings. We then introduce the logic and justification behind the switch to the CUDA paradigm for running single as well as batch jobs on the GPU in order to achieve parallel processing. Finally, we share lessons learned from the consolidation of a myriad of existing algorithms into a single code base in a single target language, as well as the benefits this ultimately provides scientists at the USGS.
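The spectral mixture analysis step is, per pixel, a small constrained least-squares solve: observed reflectance is modeled as a non-negative combination of endmember spectra. A sketch with invented endmember values (the operational TMSCAG endmember library is not reproduced here):

```python
# Hedged sketch: per-pixel non-negative unmixing into snow / vegetation /
# rock-soil fractions. Endmember spectra below are invented stand-ins.
import numpy as np
from scipy.optimize import nnls

E = np.array([[0.95, 0.05, 0.25],      # rows: six TM bands
              [0.90, 0.08, 0.28],      # cols: snow, vegetation, rock/soil
              [0.85, 0.06, 0.30],
              [0.70, 0.45, 0.35],
              [0.15, 0.25, 0.40],
              [0.05, 0.12, 0.38]])

pixel = 0.6 * E[:, 0] + 0.3 * E[:, 1] + 0.1 * E[:, 2]   # synthetic mixture
fractions, _ = nnls(E, pixel)            # non-negative least squares
fsca = fractions[0] / fractions.sum()    # normalized snow fraction, ~0.6
```

On a GPU, thousands of such per-pixel solves run concurrently, which is where a CUDA port of this workflow would gain its speedup.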
Impact of automatization in temperature series in Spain and comparison with the POST-AWS dataset
NASA Astrophysics Data System (ADS)
Aguilar, Enric; López-Díaz, José Antonio; Prohom Duran, Marc; Gilabert, Alba; Luna Rico, Yolanda; Venema, Victor; Auchmann, Renate; Stepanek, Petr; Brandsma, Theo
2016-04-01
Climate data records are very often affected by inhomogeneities. In particular, inhomogeneities that introduce network-wide biases are sometimes related to changes happening almost simultaneously across an entire network. Relative homogenization is difficult in these cases, especially at the daily scale. A good example is the substitution of manual observations (MAN) by automatic weather stations (AWS). Parallel measurements (i.e. records taken at the same time with the old (MAN) and new (AWS) sensors) can provide an idea of the bias introduced and help to evaluate the suitability of different correction approaches. We present here a quality-controlled dataset compiled under the DAAMEC Project, comprising 46 stations across Spain and over 85,000 parallel measurements (AWS-MAN) of daily maximum and minimum temperature. We study the differences between the two sensors and compare them with the available metadata to account for internal inhomogeneities. The differences between the two systems vary considerably across stations, with patterns related more to their particular settings than to climatic or geographical factors. The typical median biases (AWS-MAN) by station (within the interquartile range) lie between -0.2°C and 0.4°C in daily maximum temperature and between -0.4°C and 0.2°C in daily minimum temperature. These and other results are compared with a larger network dataset compiled by the Parallel Observations Scientific Team, a working group of the International Surface Temperature Initiative (ISTI-POST), which comprises our stations as well as others from different countries in America, Asia and Europe.
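The summary statistic described, the per-station median of daily AWS-minus-MAN differences and its across-station interquartile range, is straightforward to compute; a sketch with synthetic data standing in for the 85,000 parallel measurements:

```python
# Hedged sketch: median (AWS - MAN) bias per station, then the IQR of
# those medians across the network. Data are synthetic.
import numpy as np

rng = np.random.default_rng(4)
diffs = {f"st{k:02d}": rng.normal(0.1, 0.3, 365) for k in range(46)}

medians = {s: float(np.median(d)) for s, d in diffs.items()}
q1, q3 = np.percentile(list(medians.values()), [25, 75])
print(f"IQR of station median bias: {q1:.2f} to {q3:.2f} degC")
```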
The Research of the Parallel Computing Development from the Angle of Cloud Computing
NASA Astrophysics Data System (ADS)
Peng, Zhensheng; Gong, Qingge; Duan, Yanyu; Wang, Yun
2017-10-01
Cloud computing is the development of parallel computing, distributed computing and grid computing. The development of cloud computing has brought parallel computing into people's lives. Firstly, this paper expounds the concept of cloud computing and introduces several traditional parallel programming models. Secondly, it analyzes the principles, advantages and disadvantages of OpenMP, MPI and MapReduce, respectively. Finally, it compares the MPI and OpenMP models with MapReduce from the angle of cloud computing. The results of this paper are intended to provide a reference for the development of parallel computing.
Describing, using 'recognition cones'. [parallel-series model with English-like computer program]
NASA Technical Reports Server (NTRS)
Uhr, L.
1973-01-01
A parallel-serial 'recognition cone' model is examined, taking into account the model's ability to describe scenes of objects. An actual program is presented in an English-like language. The concept of a 'description' is discussed together with possible types of descriptive information. Questions regarding the level and the variety of detail are considered along with approaches for improving the serial representations of parallel systems.
Parallel distributed, reciprocal Monte Carlo radiation in coupled, large eddy combustion simulations
NASA Astrophysics Data System (ADS)
Hunsaker, Isaac L.
Radiation is the dominant mode of heat transfer in high-temperature combustion environments. Radiative heat transfer affects the gas and particle phases, including all the associated combustion chemistry. The radiative properties are in turn affected by the turbulent flow field. This bi-directional coupling of radiation-turbulence interactions poses a major challenge in creating parallel-capable, high-fidelity combustion simulations. In this work, a new model was developed in which reciprocal Monte Carlo radiation was coupled with a turbulent large-eddy simulation combustion model. A technique wherein domain patches are stitched together was implemented to allow for scalable parallelism. The combustion model runs in parallel on a decomposed domain. The radiation model runs in parallel on a recomposed domain. The recomposed domain is stored on each processor after information sharing of the decomposed domain is handled via the Message Passing Interface. Verification and validation testing of the new radiation model were favorable. Strong scaling analyses were performed on the Ember cluster and the Titan cluster for the CPU-radiation model and GPU-radiation model, respectively. The model demonstrated strong scaling to over 1,700 and 16,000 processing cores on Ember and Titan, respectively.
NASA Astrophysics Data System (ADS)
Stippich, Christian; Glasmacher, Ulrich Anton; Hackspacher, Peter
2015-04-01
The aim of the research is to quantify the long-term landscape evolution of the South Atlantic passive continental margin (SAPCM) in SE Brazil and NW Namibia. Excellent onshore outcrop conditions and complete rift to post-rift archives between Sao Paulo and Porto Alegre and in the transition from Namibia to Angola (onshore Walvis Ridge) allow a high-precision quantification of exhumation and uplift rates, influencing physical parameters, long-term acting forces, and process-response systems. The research will integrate the published and partly published thermochronological data from Brazil and Namibia, and test recently published concepts on the causes of long-term landscape evolution at rifted margins. The climate-continental margin-mantle coupled process-response system is caused by the interaction between endogenous and exogenous forces, which are related to the mantle-process-driven rift - drift - passive continental margin evolution of the South Atlantic, and to climate change since the Early/Late Cretaceous climate maximum. Special emphasis will be given to the influence of long-lived transform faults such as the Florianopolis Fracture Zone (FFZ) on the long-term topographic evolution of the SAPCMs. A long-term landscape evolution model with process rates will be achieved by thermo-kinematic 3-D modeling (software codes PECUBE [1,2] and FastScape [3]). By testing model solutions obtained for a multidimensional parameter space against the real thermochronological and geomorphological data set, the most likely combinations of parameter rates and values can be constrained. The data and models will allow separating the exogenous and endogenous forces and their process rates. References: [1] Braun, J., 2003. Pecube: A new finite element code to solve the 3D heat transport equation including the effects of a time-varying, finite amplitude surface topography. Computers and Geosciences, v.29, pp.787-794. [2] Braun, J., van der Beek, P., Valla, P., Robert, X., Herman, F., Glotzbach, C., Pedersen, V., Perry, C., Simon-Labric, T., Prigent, C., 2012. Quantifying rates of landscape evolution and tectonic processes by thermochronology and numerical modeling of crustal heat transport using PECUBE. Tectonophysics, v.524-525, pp.1-28. [3] Braun, J. and Willett, S.D., 2013. A very efficient, O(n), implicit and parallel method to solve the basic stream power law equation governing fluvial incision and landscape evolution. Geomorphology, v.180-181, pp.170-179.
NASA Astrophysics Data System (ADS)
Vivoni, Enrique R.; Mascaro, Giuseppe; Mniszewski, Susan; Fasel, Patricia; Springer, Everett P.; Ivanov, Valeriy Y.; Bras, Rafael L.
2011-10-01
A major challenge in the use of fully-distributed hydrologic models has been the lack of computational capabilities for high-resolution, long-term simulations in large river basins. In this study, we present the parallel model implementation and real-world hydrologic assessment of the Triangulated Irregular Network (TIN)-based Real-time Integrated Basin Simulator (tRIBS). Our parallelization approach is based on the decomposition of a complex watershed using the channel network as a directed graph. The resulting sub-basin partitioning divides effort among processors and handles hydrologic exchanges across boundaries. Through numerical experiments in a set of nested basins, we quantify parallel performance relative to serial runs for a range of processors, simulation complexities and lengths, and sub-basin partitioning methods, while accounting for inter-run variability on a parallel computing system. In contrast to serial simulations, the parallel model speed-up depends on the variability of hydrologic processes. Load balancing significantly improves parallel speed-up with proportionally faster runs as simulation complexity (domain resolution and channel network extent) increases. The best strategy for large river basins is to combine a balanced partitioning with an extended channel network, with potential savings through a lower TIN resolution. Based on these advances, a wider range of applications for fully-distributed hydrologic models are now possible. This is illustrated through a set of ensemble forecasts that account for precipitation uncertainty derived from a statistical downscaling model.
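The decomposition idea, cutting the channel network (a directed graph toward the outlet) into sub-basins and balancing them across processors, can be sketched on a toy network (the greedy balance below is a simplification of the partitioning methods the study evaluates):

```python
# Hedged sketch: split a toy channel network at the outlet's tributaries
# and hand whole upstream subtrees to the least-loaded processor.
from collections import defaultdict

downstream = {"A": "C", "B": "C", "C": "E", "D": "E", "E": None}  # toy reaches

upstream = defaultdict(list)
for reach, down in downstream.items():
    if down:
        upstream[down].append(reach)

def subtree_size(reach):                 # node count as a stand-in for load
    return 1 + sum(subtree_size(u) for u in upstream[reach])

outlet = next(r for r, d in downstream.items() if d is None)
loads = [1, 0]                           # two processors; outlet on the first
assignment = {outlet: 0}
for unit in sorted(upstream[outlet], key=subtree_size, reverse=True):
    p = loads.index(min(loads))          # least-loaded processor
    stack = [unit]
    while stack:
        r = stack.pop()
        assignment[r] = p
        loads[p] += 1
        stack.extend(upstream[r])
print(assignment, loads)                 # e.g. {'E': 0, 'C': 1, ...} [2, 3]
```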
Modelling and simulation of parallel triangular triple quantum dots (TTQD) by using SIMON 2.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fathany, Maulana Yusuf, E-mail: myfathany@gmail.com; Fuada, Syifaul, E-mail: fsyifaul@gmail.com; Lawu, Braham Lawas, E-mail: bram-labs@rocketmail.com
2016-04-19
This research presents a modeling analysis of parallel Triple Quantum Dots (TQD) using SIMON (SIMulation Of Nano-structures). The Single Electron Transistor (SET) is used as the basic modeling concept. We design the structure of the parallel TQD in metal with a triangular geometry, referred to as Triangular Triple Quantum Dots (TTQD). We simulate it under several scenarios with different parameters, such as different capacitance values, various gate voltages, and different thermal conditions.
NASA Astrophysics Data System (ADS)
Kalafatis, S.
2015-12-01
Many climate scientists and boundary organizations have accumulated years of experience providing decision support for climate adaptation related to landscape change. The Great Lakes Integrated Sciences + Assessments (GLISA) is one such organization that has developed a reputation for providing stakeholders with climate change decision support throughout the Great Lakes region of North America. After five years of applied outreach, GLISA climate scientists working with practitioners identified three common limitations across projects that were slowing the use of information, describing them as mismatched terminology, unrealistic expectations, and disordered integration. Discussions with GLISA-affiliated social scientists revealed compelling parallels between these observations and the existing social science literature on the persistent "usability gap" in information use, as well as opportunities to preemptively overcome these barriers. The overlap between the climate scientists' experience of barriers and the social science literature, together with strategies to systematically address them, demonstrates the potential for boundary organizations to act as incubators of increasingly efficient co-production over time. To help illustrate these findings, this presentation also provides an example of decision-making for adaptation in the face of landscape change, in which GLISA scientists assisted Isle Royale National Park with assessing the implications of future ecological transitions for current wildlife management efforts.
Beauregard, Frieda; de Blois, Sylvie
2014-01-01
Both climatic and edaphic conditions determine plant distribution; however, many species distribution models do not include edaphic variables, especially over large geographical extents. Using an exceptional database of vegetation plots (n = 4839) covering an extent of ∼55,000 km2, we tested whether the inclusion of fine-scale edaphic variables would improve model predictions of plant distribution compared to models using only climate predictors. We also tested how well these edaphic variables could predict distribution on their own, to evaluate the assumption that at large extents, distribution is governed largely by climate. We also hypothesized that the relative contribution of edaphic and climatic data would vary among species depending on their growth forms and biogeographical attributes within the study area. We modelled 128 native plant species from diverse taxa using four statistical model types and three sets of abiotic predictors: climate, edaphic, and edaphic-climate. Model predictive accuracy and variable importance were compared among these models and for species' characteristics describing growth form, range boundaries within the study area, and prevalence. For many species both the climate-only and edaphic-only models performed well; however, the edaphic-climate models generally performed best. The three sets of predictors differed in the spatial information provided about habitat suitability, with climate models able to distinguish range edges, but edaphic models better able to distinguish within-range variation. Model predictive accuracy was generally lower for species without a range boundary within the study area and for common species, but these effects were buffered by including both edaphic and climatic predictors. The relative importance of edaphic and climatic variables varied with growth forms, with trees being more related to climate whereas lower growth forms were more related to edaphic conditions. Our study identifies the potential for non-climate aspects of the environment to pose a constraint on range expansion under climate change. PMID:24658097
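The core comparison in the study - climate-only vs. edaphic-only vs. combined predictor sets - can be sketched as follows. The study used four statistical model types; this illustrative Python snippet stands in a single logistic regression with cross-validated AUC, and all variable names are assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def compare_predictor_sets(X_clim, X_edaph, y):
    """Score climate-only, edaphic-only and combined predictor sets for one
    species' presence/absence data (y in {0, 1}); returns mean CV AUC."""
    sets = {"climate": X_clim,
            "edaphic": X_edaph,
            "edaphic-climate": np.hstack([X_clim, X_edaph])}
    return {name: cross_val_score(LogisticRegression(max_iter=1000),
                                  X, y, cv=5, scoring="roc_auc").mean()
            for name, X in sets.items()}
```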
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kostka, Joel
The goal of this project was to investigate changes in the structure of dissolved and solid-phase organic matter, the production of CO2 and CH4, and the composition of decomposer microbial communities in response to the climatic forcing of environmental processes that determine the balance between carbon gas production versus storage and sequestration in peatlands. Cutting-edge analytical chemistry and next-generation sequencing of microbial genes were applied to habitats at the Marcell Experimental Forest (MEF), where the US DOE's Oak Ridge National Laboratory and the USDA Forest Service are constructing a large-scale ecosystem study entitled "Spruce and Peatland Responses Under Climatic and Environmental Change" (SPRUCE). Our study represented a comprehensive characterization of the sources, transformation, and decomposition of organic matter in the S1 bog at MEF. Multiple lines of evidence point to distinct, vertical zones of organic matter transformation: 1) the acrotelm, consisting of living mosses, root material, and newly formed litter (0-30 cm); 2) the mesotelm, a mid-depth transition zone (30-75 cm) characterized by labile organic C compounds and intense decomposition; and 3) the underlying catotelm (below 75 cm), characterized by refractory organic compounds as well as relatively low decomposition rates. These zones are in part defined by physical changes in hydraulic conductivity and water table depth. O-alkyl-C, which represents the carbohydrate fraction in the peat, was shown to be an excellent proxy for soil decomposition rates. The carbon cycle in deep peat was shown to be fueled by modern carbon sources, further indicating that hydrology and surface vegetation play a role in belowground carbon cycling. We provide the first metagenomic study of an ombrotrophic peat bog, with novel insights into microbial specialization and functions in this unique terrestrial ecosystem. Vertical structuring of microbial communities closely paralleled the chemical evolution of peat, with large shifts in microbial populations occurring in the biogeochemical hotspot, the mesotelm, where the highest rates of decomposition were detected. Stable isotope geochemistry and potential rates of methane production paralleled vertical changes in methanogen community composition, indicating a predominance of acetoclastic methanogenesis mediated by the Methanosarcinales in the mesotelm, while hydrogen-utilizing methanogens dominated in the deeper catotelm. Evidence pointed to the availability of phosphorus as well as nitrogen limiting the microbially mediated turnover of organic carbon at MEF. Prior to initiation of the experimental treatments, our study provided key baseline data for the SPRUCE site on the vertical stratification of peat decomposition, key enzymatic pathways, and the microbial taxa containing these pathways. The sensitivity of soil carbon turnover to climate change is strongly linked to recalcitrant carbon stocks, and the temperature sensitivity of decomposition is thought to increase with the molecular complexity of carbon substrates. This project delivered results on how climate change perturbations impact the microbially mediated turnover of recalcitrant organic matter in peatland forest soils, both under controlled conditions in the laboratory and at the ecosystem scale in the field.
This project revisited the concept of "recalcitrance" in the regulation of soil carbon turnover using a combination of natural-abundance radiocarbon and optical spectroscopic measurements on bulk DOM, and high-resolution molecular characterization of DOM. The project elucidated how organic matter reactivity and decomposition will respond to climate change in both a qualitative (organic matter lability) and quantitative (increased rates) manner. An Aromaticity Index was developed to represent a more direct and accurate parameter for modeling DOM reactivity in peatlands. The abundance and community composition of soil microorganisms that mediate C cycling were interrogated with depth in the peat, with season, and in manipulated climate enclosures at unprecedented resolution. This project therefore delivered strategic new insights into the functioning of peatland ecosystems, which collectively store approximately one-third of the world's soil carbon. Furthermore, results from the detailed characterization of DOM lability and microbial community structure/function will be employed to further develop biogeochemical models to include microbial respiration pathways and to track carbon flow with a term that incorporates relative reactivity based on the Aromaticity Index. As it stands now, detailed soil organic matter structure and microbial parameters are not included in Earth system models.
Wildhaber, Mark L.; Wikle, Christopher K.; Anderson, Christopher J.; Franz, Kristie J.; Moran, Edward H.; Dey, Rima; Mader, Helmut; Kraml, Julia
2012-01-01
Climate change operates over a broad range of spatial and temporal scales. Understanding its effects on ecosystems requires multi-scale models. For understanding effects on fish populations of riverine ecosystems, climate predicted by coarse-resolution Global Climate Models must be downscaled to Regional Climate Models to watersheds to river hydrology to population response. An additional challenge is quantifying sources of uncertainty given the highly nonlinear nature of interactions between climate variables and community level processes. We present a modeling approach for understanding and accomodating uncertainty by applying multi-scale climate models and a hierarchical Bayesian modeling framework to Midwest fish population dynamics and by linking models for system components together by formal rules of probability. The proposed hierarchical modeling approach will account for sources of uncertainty in forecasts of community or population response. The goal is to evaluate the potential distributional changes in an ecological system, given distributional changes implied by a series of linked climate and system models under various emissions/use scenarios. This understanding will aid evaluation of management options for coping with global climate change. In our initial analyses, we found that predicted pallid sturgeon population responses were dependent on the climate scenario considered.
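The idea of linking component models by formal rules of probability can be illustrated with a simple Monte Carlo propagation through the model chain. All distributions and coefficients below are illustrative placeholders, not the study's calibrated models.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_population_response(n_draws=10_000):
    """Propagate uncertainty through a chain of linked models by sampling:
    p(pop) = ∫ p(pop | hydro) p(hydro | climate) p(climate) dclimate dhydro.
    Every distribution here is a hypothetical placeholder."""
    dT = rng.normal(2.0, 0.8, n_draws)             # downscaled warming (deg C)
    flow = rng.normal(100 - 8 * dT, 15)            # discharge given climate
    growth = rng.normal(0.02 * flow - 1.0, 0.3)    # population growth given flow
    return growth

draws = sample_population_response()
print("P(population decline) ~", np.mean(draws < 0))
```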
Northern Hemisphere glaciation and the evolution of Plio-Pleistocene climate noise
NASA Astrophysics Data System (ADS)
Meyers, Stephen R.; Hinnov, Linda A.
2010-08-01
Deterministic orbital controls on climate variability are commonly inferred to dominate across timescales of 10^4-10^6 years, although some studies have suggested that stochastic processes may be of equal or greater importance. Here we explicitly quantify changes in deterministic orbital processes (forcing and/or pacing) versus stochastic climate processes during the Plio-Pleistocene, via time-frequency analysis of two prominent foraminifera oxygen isotopic stacks. Our results indicate that development of the Northern Hemisphere ice sheet is paralleled by an overall amplification of both deterministic and stochastic climate energy, but their relative dominance is variable. The progression from a more stochastic early Pliocene to a strongly deterministic late Pleistocene is primarily accommodated during two transitory phases of Northern Hemisphere ice sheet growth. This long-term trend is punctuated by “stochastic events,” which we interpret as evidence for abrupt reorganization of the climate system at the initiation and termination of the mid-Pleistocene transition and at the onset of Northern Hemisphere glaciation. In addition to highlighting a complex interplay between deterministic and stochastic climate change during the Plio-Pleistocene, our results support an early onset for Northern Hemisphere glaciation (between 3.5 and 3.7 Ma) and reveal some new characteristics of the orbital signal response, such as the puzzling emergence of 100 ka and 400 ka cyclic climate variability during theoretical eccentricity nodes.
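A minimal sketch of this kind of time-frequency analysis, assuming an evenly resampled isotope series; the window length and detrending choices are illustrative, not those of the study.

```python
import numpy as np
from scipy.signal import spectrogram

def evolutive_spectrum(d18O, dt_kyr, window_kyr=500):
    """Sliding-window spectral estimate used to track deterministic orbital
    bands (~41, ~100, ~400 kyr) against the stochastic broadband background.
    d18O: evenly sampled series; dt_kyr: sample spacing in kyr."""
    nper = int(window_kyr / dt_kyr)
    f, t, S = spectrogram(d18O, fs=1.0 / dt_kyr, nperseg=nper,
                          noverlap=nper // 2, detrend="linear")
    return f, t, S   # power in orbital bands vs. residual background
```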
Parallelization of elliptic solver for solving 1D Boussinesq model
NASA Astrophysics Data System (ADS)
Tarwidi, D.; Adytia, D.
2018-03-01
In this paper, a parallel implementation of an elliptic solver for the 1D Boussinesq model is presented. The numerical solution of the Boussinesq model is obtained by applying a staggered-grid scheme to the continuity, momentum, and elliptic equations of the model. The tridiagonal system emerging from the numerical scheme for the elliptic equation is solved by the cyclic reduction algorithm. The parallel implementation of cyclic reduction is executed on multicore processors with shared-memory architecture using OpenMP. To measure the performance of the parallel program, the number of grid points is varied from 2^8 to 2^14. Two numerical test cases, the propagation of a solitary wave and of a standing wave, are used to evaluate the parallel program. The numerical results are verified against analytical solutions for the solitary and standing waves. The best speedup for the solitary and standing wave test cases is about 2.07 with 2^14 grid points and 1.86 with 2^13 grid points, respectively, both executed using 8 threads. Moreover, the best efficiency of the parallel program is 76.2% and 73.5% for the solitary and standing wave test cases, respectively.
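For reference, here is a serial Python sketch of the cyclic reduction algorithm for a tridiagonal system (the paper's version is parallelized with OpenMP in a compiled language; this sketch assumes n = 2^k - 1 unknowns):

```python
import numpy as np

def cyclic_reduction(a, b, c, d):
    """Solve a tridiagonal system with n = 2**k - 1 unknowns.
    a: sub-diagonal (a[0] unused), b: diagonal, c: super-diagonal
    (c[-1] unused), d: right-hand side. Inputs are modified in place."""
    n = len(b)
    k = int(np.log2(n + 1))
    assert 2**k - 1 == n, "this sketch requires n = 2**k - 1"
    # Forward reduction: each level eliminates every other remaining unknown.
    for lvl in range(k - 1):
        off = 2**lvl
        for j in range(2 * off - 1, n, 2 * off):
            al = a[j] / b[j - off]
            ga = c[j] / b[j + off]
            b[j] -= al * c[j - off] + ga * a[j + off]
            d[j] -= al * d[j - off] + ga * d[j + off]
            a[j] = -al * a[j - off]
            c[j] = -ga * c[j + off]
    # Back substitution, from the single middle equation outward.
    x = np.zeros(n)
    mid = (n - 1) // 2
    x[mid] = d[mid] / b[mid]
    for lvl in range(k - 2, -1, -1):
        off = 2**lvl
        for j in range(off - 1, n, 2 * off):
            left = x[j - off] if j - off >= 0 else 0.0
            right = x[j + off] if j + off < n else 0.0
            x[j] = (d[j] - a[j] * left - c[j] * right) / b[j]
    return x
```

Each forward level halves the number of coupled unknowns, and the updates within a level are mutually independent, which is what the OpenMP version can exploit; speedup is T_serial/T_parallel and parallel efficiency is the speedup divided by the thread count.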
A hybrid parallel framework for the cellular Potts model simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Yi; He, Kejing; Dong, Shoubin
2009-01-01
The Cellular Potts Model (CPM) has been widely used for biological simulations. However, most current implementations are either sequential or approximate, and cannot be used for large-scale, complex 3D simulation. In this paper we present a hybrid parallel framework for CPM simulations. The time-consuming PDE solving, cell division, and cell reaction operations are distributed to clusters using the Message Passing Interface (MPI). The Monte Carlo lattice update is parallelized on shared-memory SMP systems using OpenMP. Because the Monte Carlo lattice update is much faster than the PDE solving, and SMP systems are increasingly common, this hybrid approach achieves good performance and high accuracy at the same time. Based on the parallel Cellular Potts Model, we studied avascular tumor growth using a multiscale model. The application and performance analysis show that the hybrid parallel framework is quite efficient. The hybrid parallel CPM can be used for large-scale simulation (~10^8 sites) of the complex collective behavior of numerous cells (~10^6).
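To make the lattice-update step concrete, here is a toy serial Metropolis sweep of a minimal 2D CPM (adhesion plus a quadratic volume constraint, applied for simplicity to every cell id including the medium). It sketches the kind of update the paper parallelizes; it is not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

def cpm_sweep(sigma, J=2.0, target_vol=25, lam=1.0, T=10.0):
    """One Monte Carlo sweep over an integer lattice of cell ids `sigma`."""
    ny, nx = sigma.shape
    vols = np.bincount(sigma.ravel())
    for _ in range(sigma.size):
        y, x = int(rng.integers(ny)), int(rng.integers(nx))
        y2 = (y + int(rng.integers(-1, 2))) % ny      # random neighbor site
        x2 = (x + int(rng.integers(-1, 2))) % nx
        src, tgt = sigma[y2, x2], sigma[y, x]         # copy attempt: src -> (y, x)
        if src == tgt:
            continue
        dE = 0.0
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = sigma[(y + dy) % ny, (x + dx) % nx]
            dE += J * (int(nb != src) - int(nb != tgt))   # adhesion change
        dE += lam * (2 * (vols[src] - target_vol) + 1)    # src grows by 1 site
        dE += lam * (-2 * (vols[tgt] - target_vol) + 1)   # tgt shrinks by 1 site
        if dE <= 0 or rng.random() < np.exp(-dE / T):     # Metropolis acceptance
            sigma[y, x] = src
            vols[src] += 1
            vols[tgt] -= 1
    return sigma
```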
Unusual forest growth decline in boreal North America covaries with the retreat of Arctic sea ice.
Girardin, Martin P; Guo, Xiao Jing; De Jong, Rogier; Kinnard, Christophe; Bernier, Pierre; Raulier, Frédéric
2014-03-01
The 20th century was a pivotal period at high northern latitudes as it marked the onset of rapid climatic warming brought on by major anthropogenic changes in global atmospheric composition. In parallel, Arctic sea ice extent has been decreasing over the period of available satellite data records. Here, we document how these changes influenced vegetation productivity in adjacent eastern boreal North America. To do this, we used normalized difference vegetation index (NDVI) data, model simulations of net primary productivity (NPP) and tree-ring width measurements covering the last 300 years. Climatic and proxy-climatic data sets were used to explore the relationships between vegetation productivity and Arctic sea ice concentration and extent, and temperatures. Results indicate that an unusually large number of black spruce (Picea mariana) trees entered into a period of growth decline during the late-20th century (62% of sampled trees; n = 724 cross sections of age >70 years). This finding is coherent with evidence encoded in NDVI and simulated NPP data. Analyses of climatic and vegetation productivity relationships indicate that the influence of recent climatic changes in the studied forests has been via the enhanced moisture stress (i.e. greater water demands) and autotrophic respiration amplified by the declining sea ice concentration in Hudson Bay and Hudson Strait. The recent decline strongly contrasts with other growth reduction events that occurred during the 19th century, which were associated with cooling and high sea ice severity. The recent decline of vegetation productivity is the first one to occur under circumstances related to excess heat in a 300-year period, and further culminates with an intensifying wildfire regime in the region. Our results concur with observations from other forest ecosystems about intensifying temperature-driven drought stress and tree mortality with ongoing climatic changes.
Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN
NASA Astrophysics Data System (ADS)
Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.
2013-12-01
Open-source software development has become increasingly popular in recent years. Open-source encourages collaborative and transparent software development and promotes unlimited free redistribution of source code to the public. Open-source development is good for science as it reveals implementation details that are critical to scientific reproducibility, but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among code developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface/subsurface environment. PFLOTRAN is a massively-parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from the laptop to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is founded upon the well-established and open-source parallel PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e. Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive, yet domain-friendly programming language has greatly facilitated collaboration in the code's software development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e. extensible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators. For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion, while the same code base can be used as a third-party library to provide hydrologic flow, energy transport, and biogeochemical capability to the Community Land Model (CLM), part of the open-source Community Earth System Model (CESM) for climate. In this presentation, the advantages and disadvantages of open-source software development in support of geoscience research at government laboratories, universities, and the private sector are discussed. Since the code is open-source (i.e. it is transparent and readily available to competitors), the PFLOTRAN team's development strategy within a competitive research environment is presented. Finally, the developers discuss their approach to object-oriented programming and the leveraging of modern Fortran in support of collaborative geoscience research as the Fortran standard evolves among compiler vendors.
Atmospheric, Climatic, and Environmental Research
NASA Technical Reports Server (NTRS)
Broecker, Wallace S.; Gornitz, Vivien M.
1994-01-01
The climate and atmospheric modeling project involves analysis of basic climate processes, with special emphasis on studies of the atmospheric CO2 and H2O source/sink budgets and studies of the climatic role of CO2, trace gases and aerosols. These studies are carried out based in part on the use of simplified climate models and climate process models developed at GISS. The principal models currently employed are a variable-resolution 3-D general circulation model (GCM) and an associated "tracer" model which simulates the advection of trace constituents using the winds generated by the GCM.
A Parallel Saturation Algorithm on Shared Memory Architectures
NASA Technical Reports Server (NTRS)
Ezekiel, Jonathan; Siminiceanu
2007-01-01
Symbolic state-space generators are notoriously hard to parallelize. However, the Saturation algorithm implemented in the SMART verification tool differs from other sequential symbolic state-space generators in that it exploits the locality of firing events in asynchronous system models. This paper explores whether event locality can be utilized to efficiently parallelize Saturation on shared-memory architectures. Conceptually, we propose to parallelize the firing of events within a decision diagram node, which is technically realized via a thread pool. We discuss the challenges involved in our parallel design and conduct experimental studies on its prototypical implementation. On a dual-processor dual-core PC, our studies show speed-ups for several example models, e.g., of up to 50% for a Kanban model, when compared to running our algorithm only on a single core.
Assessing NARCCAP climate model effects using spatial confidence regions.
French, Joshua P; McGinnis, Seth; Schwartzman, Armin
2017-01-01
We assess similarities and differences between model effects for the North American Regional Climate Change Assessment Program (NARCCAP) climate models using varying classes of linear regression models. Specifically, we consider how the average temperature effect differs for the various global and regional climate model combinations, including assessment of possible interaction between the effects of global and regional climate models. We use both pointwise and simultaneous inference procedures to identify regions where global and regional climate model effects differ. We also show conclusively that results from pointwise inference are misleading, and that accounting for multiple comparisons is important for making proper inference.
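The contrast between pointwise and multiplicity-aware inference can be sketched in a few lines. The paper constructs simultaneous confidence regions; this illustration uses Benjamini-Hochberg FDR control as a readily available stand-in for accounting for multiple comparisons.

```python
import numpy as np
from statsmodels.stats.multitest import multipletests

def significant_regions(p_values, alpha=0.05):
    """Flag grid cells where a model-effect contrast is significant.
    p_values: 1D array of per-grid-cell p-values."""
    pointwise = p_values < alpha                      # ignores multiplicity
    corrected = multipletests(p_values, alpha=alpha,
                              method="fdr_bh")[0]     # Benjamini-Hochberg
    return pointwise, corrected  # the corrected set is typically much smaller
```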
To share or not to share: Drivers and barriers for sharing data via online amateur weather networks
NASA Astrophysics Data System (ADS)
Gharesifard, Mohammad; Wehn, Uta
2016-04-01
Increasing attention is being paid to the importance and potential of crowd-sourced data to complement current environmental data streams (i.e., in-situ observations and remote sensing data). In parallel, the diffusion of Information and Communication Technologies (ICTs) that are interactive and easy to use has provided a way forward in facing extreme climatic events and the hazards resulting from them. The combination of these two trends is referred to as ICT-enabled 'citizen observatories' of the environment. Nevertheless, the success of these citizen observatories hinges on the continued involvement of citizens as central actors in these initiatives. Developing strategies to (further) engage citizens requires in-depth understanding of the behavioral determinants that encourage or impede individuals to collect and share environment-related data. This paper takes the case of citizen-sensed weather data from Personal Weather Stations (PWSs) and looks at the drivers and barriers for sharing such data via online amateur weather networks. This is done by employing a behavioral science lens that treats data sharing as a decision and systematically investigates the factors that affect this decision. The analysis and findings are based on qualitative empirical research carried out in the Netherlands, the United Kingdom and Italy. Subsequently, a model was developed that depicts the main drivers and barriers for citizen participation in weather observatories. The resulting model can be utilized as a tool to develop strategies for further enhancing ICT-enabled citizen participation in climatic observations and, consequently, in environmental management.
CCN and IN concentration measurements during the Antarctic Circumnavigation Expedition
NASA Astrophysics Data System (ADS)
Stratmann, F.; Henning, S.; Löffler, M.; Welti, A.; Hartmann, M.; Wernli, H.; Baccarini, A.; Schmale, J.
2017-12-01
Cloud condensation nuclei (CCN) and ice nuclei (IN) concentrations measured during the Antarctic Circumnavigation Expedition (ACE) within the Study of Preindustrial-like Aerosol-Climate Effects (SPACE) are presented. The measurements give a circumpolar transect through the Sub-Antarctic Ocean, where existing measurements are scarce. ACE took place during the austral summer of 2016/17 and included exploration of different environments, from the pristine open ocean to Antarctic islands and the southernmost ports of the three surrounding continents. CCN concentrations were measured over the entire range of expected in-cloud supersaturations, from 0.1 to 1%, using a CCNc instrument from DMT. IN concentrations were determined from filter samples at water-saturated conditions from -5°C to -25°C, covering common temperatures of mixed-phase cloud glaciation. The sensitivity of measured IN and CCN concentrations to meteorological parameters, marine biological activity and location is assessed to gain insight into potential sources of CCN and IN. Back-trajectory modelling is used to attribute regional variations to aerosol sources originating in the marine boundary layer or from long-range transport. The resulting datasets constrain CCN and IN concentrations in the marine boundary layer along the cruise track. The comprehensive set of parameters measured in parallel during ACE allows the contributions of local ocean-surface sources versus long-range transport to Sub-Antarctic CCN and IN to be evaluated. The measurements can be used as input to climate models; e.g., pristine Sub-Antarctic conditions can provide an approximation of a pre-industrial environment.
NASA Astrophysics Data System (ADS)
Cailleret, Maxime; Snell, Rebecca; von Waldow, Harald; Kotlarski, Sven; Bugmann, Harald
2015-04-01
Different levels of uncertainty should be considered in climate impact projections by Dynamic Vegetation Models (DVMs), particularly when it comes to managing climate risks. Such information is useful to detect the key processes and uncertainties in the climate model - impact model chain and may be used to support recommendations for future improvements in the simulation of both climate and biological systems. In addition, determining which uncertainty source is dominant is an important aspect to recognize the limitations of climate impact projections by a multi-model ensemble mean approach. However, to date, few studies have clarified how each uncertainty source (baseline climate data, greenhouse gas emission scenario, climate model, and DVM) affects the projection of ecosystem properties. Focusing on one greenhouse gas emission scenario, we assessed the uncertainty in the projections of a forest landscape model (LANDCLIM) and a stand-scale forest gap model (FORCLIM) that is caused by linking climate data with an impact model. LANDCLIM was used to assess the uncertainty in future landscape properties of the Visp valley in Switzerland that is due to (i) the use of different 'baseline' climate data (gridded data vs. data from weather stations), and (ii) differences in climate projections among 10 GCM-RCM chains. This latter point was also considered for the projections of future forest properties by FORCLIM at several sites along an environmental gradient in Switzerland (14 GCM-RCM chains), for which we also quantified the uncertainty caused by (iii) the model chain specific statistical properties of the climate time-series, and (iv) the stochasticity of the demographic processes included in the model, e.g., the annual number of saplings that establish, or tree mortality. Using methods of variance decomposition analysis, we found that (i) The use of different baseline climate data strongly impacts the prediction of forest properties at the lowest and highest, but not so much at medium elevations. (ii) Considering climate change, the variability that is due to the GCM-RCM chains is much greater than the variability induced by the uncertainty in the initial climatic conditions. (iii) The uncertainties caused by the intrinsic stochasticity in the DVMs and by the random generation of the climate time-series are negligible. Overall, our results indicate that DVMs are quite sensitive to the climate data, highlighting particularly (1) the limitations of using one single multi-model average climate change scenario in climate impact studies and (2) the need to better consider the uncertainty in climate model outputs for projecting future vegetation changes.
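The variance decomposition step can be illustrated with a one-way ANOVA-style split, assuming an ensemble arranged as chains × stochastic replicates (a simplification of the study's multi-factor design):

```python
import numpy as np

def decompose_variance(runs):
    """Split ensemble variability into a between-chain part (GCM-RCM choice)
    and a within-chain part (stochastic replicates of the DVM).
    runs: array (n_chains, n_replicates) of a projected forest property."""
    grand = runs.mean()
    between = runs.shape[1] * ((runs.mean(axis=1) - grand) ** 2).sum()
    within = ((runs - runs.mean(axis=1, keepdims=True)) ** 2).sum()
    total = between + within
    return between / total, within / total   # fractions of total variance
```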
Bisetti, Fabrizio; Attili, Antonio; Pitsch, Heinz
2014-01-01
Combustion of fossil fuels is likely to continue for the near future due to the growing trends in energy consumption worldwide. The increase in efficiency and the reduction of pollutant emissions from combustion devices are pivotal to achieving meaningful levels of carbon abatement as part of the ongoing climate change efforts. Computational fluid dynamics featuring adequate combustion models will play an increasingly important role in the design of more efficient and cleaner industrial burners, internal combustion engines, and combustors for stationary power generation and aircraft propulsion. Today, turbulent combustion modelling is hindered severely by the lack of data that are accurate and sufficiently complete to assess and remedy model deficiencies effectively. In particular, the formation of pollutants is a complex, nonlinear and multi-scale process characterized by the interaction of molecular and turbulent mixing with a multitude of chemical reactions with disparate time scales. The use of direct numerical simulation (DNS) featuring a state of the art description of the underlying chemistry and physical processes has contributed greatly to combustion model development in recent years. In this paper, the analysis of the intricate evolution of soot formation in turbulent flames demonstrates how DNS databases are used to illuminate relevant physico-chemical mechanisms and to identify modelling needs. PMID:25024412
What’s Needed from Climate Modeling to Advance Actionable Science for Water Utilities?
NASA Astrophysics Data System (ADS)
Barsugli, J. J.; Anderson, C. J.; Smith, J. B.; Vogel, J. M.
2009-12-01
“…perfect information on climate change is neither available today nor likely to be available in the future, but … over time, as the threats climate change poses to our systems grow more real, predicting those effects with greater certainty is non-discretionary. We’re not yet at a level at which climate change projections can drive climate change adaptation.” (Testimony of WUCA Staff Chair David Behar to the House Committee on Science and Technology, May 5, 2009) To respond to this challenge, the Water Utility Climate Alliance (WUCA) has sponsored a white paper titled “Options for Improving Climate Modeling to Assist Water Utility Planning for Climate Change. ” This report concerns how investments in the science of climate change, and in particular climate modeling and downscaling, can best be directed to help make climate projections more actionable. The meaning of “model improvement” can be very different depending on whether one is talking to a climate model developer or to a water manager trying to incorporate climate projections in to planning. We first surveyed the WUCA members on present and potential uses of climate model projections and on climate inputs to their various system models. Based on those surveys and on subsequent discussions, we identified four dimensions along which improvement in modeling would make the science more “actionable”: improved model agreement on change in key parameters; narrowing the range of model projections; providing projections at spatial and temporal scales that match water utilities system models; providing projections that water utility planning horizons. With these goals in mind we developed four options for improving global-scale climate modeling and three options for improving downscaling that will be discussed. However, there does not seem to be a single investment - the proverbial “magic bullet” -- which will substantially reduce the range of model projections at the scales at which utility planning is conducted. In the near term we feel strongly that water utilities and climate scientists should work together to leverage the upcoming Coupled Model Intercomparison Project, Phase 5 (CMIP5; a coordinated set climate model experiments that will be used to support the upcoming IPCC Fifth Assessment) to better benefit water utilities. In the longer term, even with model and downscaling improvements, it is very likely that substantial uncertainty about future climate change at the desired spatial and temporal scales will remain. Nonetheless, there is no doubt the climate is changing, and the challenge is to work with what we have, or what we can reasonably expect to have in the coming years to make the best decisions we can.
NASA Astrophysics Data System (ADS)
Khodayari, Arezoo; Wuebbles, Donald J.; Olsen, Seth C.; Fuglestvedt, Jan S.; Berntsen, Terje; Lund, Marianne T.; Waitz, Ian; Wolfe, Philip; Forster, Piers M.; Meinshausen, Malte; Lee, David S.; Lim, Ling L.
2013-08-01
This study evaluates the capabilities of the carbon cycle and energy balance treatments relative to the effect of aviation CO2 emissions on climate in several existing simplified climate models (SCMs) that are either being used or could be used for evaluating the effects of aviation on climate. Since these models are used in policy-related analyses, it is important that the capabilities of such models represent the state of understanding of the science. We compare the Aviation Environmental Portfolio Management Tool (APMT) Impacts climate model, two models used at the Center for International Climate and Environmental Research-Oslo (CICERO-1 and CICERO-2), the Integrated Science Assessment Model (ISAM) model as described in Jain et al. (1994), the simple Linear Climate response model (LinClim) and the Model for the Assessment of Greenhouse-gas Induced Climate Change version 6 (MAGICC6). In this paper we select scenarios to illustrate the behavior of the carbon cycle and energy balance models in these SCMs. This study is not intended to determine the absolute and likely range of the expected climate response in these models but to highlight specific features in model representations of the carbon cycle and energy balance models that need to be carefully considered in studies of aviation effects on climate. These results suggest that carbon cycle models that use linear impulse-response-functions (IRF) in combination with separate equations describing air-sea and air-biosphere exchange of CO2 can account for the dominant nonlinearities in the climate system that would otherwise not have been captured with an IRF alone, and hence, produce a close representation of more complex carbon cycle models. Moreover, results suggest that an energy balance model with a 2-box ocean sub-model and IRF tuned to reproduce the response of coupled Earth system models produces a close representation of the globally-averaged temperature response of more complex energy balance models.
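A minimal sketch of the two model components discussed - an impulse-response-function carbon cycle and a 2-box energy balance model - is given below. All coefficients are illustrative placeholders in the spirit of Joos-type response functions, not the parameters of APMT, CICERO-1/2, ISAM, LinClim or MAGICC6.

```python
import numpy as np

# Illustrative IRF: fraction of a CO2 pulse remaining after t years.
A = np.array([0.22, 0.28, 0.28, 0.22])        # pool amplitudes, sum to 1
TAU = np.array([np.inf, 300.0, 30.0, 4.0])    # decay timescales (years)

def co2_ppm(emissions, dt=1.0, c0=278.0):
    """CO2 concentration from an emissions series (GtC/yr) via IRF convolution."""
    t = np.arange(len(emissions)) * dt
    irf = (A[:, None] * np.exp(-t[None, :] / TAU[:, None])).sum(axis=0)
    airborne = np.convolve(emissions, irf)[: len(emissions)] * dt  # GtC
    return c0 + airborne / 2.12                                    # GtC -> ppm

def two_box_temperature(forcing, dt=1.0, C=8.0, C_d=100.0, lam=1.2, gamma=0.7):
    """2-box (mixed layer + deep ocean) energy balance model."""
    T = T_d = 0.0
    out = []
    for F in forcing:
        T_new = T + dt * (F - lam * T - gamma * (T - T_d)) / C
        T_d += dt * gamma * (T - T_d) / C_d
        T = T_new
        out.append(T)
    return np.array(out)
```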
Research on Parallel Three Phase PWM Converters based on RTDS
NASA Astrophysics Data System (ADS)
Xia, Yan; Zou, Jianxiao; Li, Kai; Liu, Jingbo; Tian, Jun
2018-01-01
Parallel operation of converters can increase the capacity of a system, but it may lead to zero-sequence circulating current, so controlling the circulating current is an important goal in the design of parallel inverters. In this paper, the Real Time Digital Simulator (RTDS) is used to model the parallel converter system in real time and to study circulating current suppression. The equivalent model of two parallel converters and the zero-sequence circulating current (ZSCC) are established and analyzed, and a strategy using variable zero-vector control is proposed to suppress the circulating current. For two parallel modular converters, a hardware-in-the-loop (HIL) study based on RTDS and a practical experiment were implemented; the results prove that the proposed control strategy is feasible and effective.
Scalable multi-objective control for large scale water resources systems under uncertainty
NASA Astrophysics Data System (ADS)
Giuliani, Matteo; Quinn, Julianne; Herman, Jonathan; Castelletti, Andrea; Reed, Patrick
2016-04-01
The use of mathematical models to support the optimal management of environmental systems has expanded rapidly in recent years due to advances in scientific knowledge of natural processes, in the efficiency of optimization techniques, and in the availability of computational resources. However, ongoing changes in climate and society introduce additional challenges for controlling these systems, ultimately motivating the emergence of complex models to explore key causal relationships and dependencies on uncontrolled sources of variability. In this work, we contribute a novel implementation of the evolutionary multi-objective direct policy search (EMODPS) method for controlling environmental systems under uncertainty. The proposed approach combines direct policy search (DPS) with hierarchical parallelization of multi-objective evolutionary algorithms (MOEAs) and offers a threefold advantage: the DPS simulation-based optimization can be combined with any simulation model and does not add any constraint on modeled information, allowing the use of exogenous information in conditioning the decisions. Moreover, the combination of DPS and MOEAs prompts the generation of a Pareto-approximate set of solutions for up to 10 objectives, thus overcoming the decision biases produced by cognitive myopia, where narrow or restrictive definitions of optimality strongly limit the discovery of decision-relevant alternatives. Finally, the use of large-scale MOEA parallelization improves the ability of the designed solutions to handle the uncertainty due to severe natural variability. The proposed approach is demonstrated on a challenging water resources management problem: the optimal control of a network of four multipurpose water reservoirs in the Red River basin (Vietnam). As part of the medium- to long-term energy and food security national strategy, four large reservoirs have been constructed on the Red River tributaries, operated mainly for hydropower production, flood control, and water supply. Numerical results under historical as well as synthetically generated hydrologic conditions show that our approach is able to discover key tradeoffs in the operations of the system. The ability of the algorithm to find near-optimal solutions increases with the number of islands in the adopted hierarchical parallelization scheme. In addition, although significant performance degradation is observed when the solutions designed over the historical record are re-evaluated over synthetically generated inflows, we successfully reduced these vulnerabilities by identifying alternative solutions that are more robust to hydrologic uncertainties, while also addressing the tradeoffs across the Red River multi-sector services.
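The DPS building block - a parameterized release policy whose parameters the MOEA optimizes - can be sketched as a radial-basis-function rule. Shapes and names below are illustrative assumptions, not the study's formulation.

```python
import numpy as np

def rbf_policy(storage, inflow, theta):
    """Direct policy search release rule: a weighted sum of Gaussian RBFs of
    the (normalized) system state. `theta` holds the centers, radii and
    weights that an MOEA would tune against the simulation objectives."""
    s = np.array([storage, inflow])
    c, r, w = theta["centers"], theta["radii"], theta["weights"]
    phi = np.exp(-(((s - c) / r) ** 2).sum(axis=1))   # one value per RBF
    return float(w @ phi)                             # release decision

# Hypothetical parameter vector for a 3-RBF policy over a 2D state:
theta = {"centers": np.zeros((3, 2)), "radii": np.ones((3, 2)),
         "weights": np.array([10.0, 5.0, 1.0])}
print(rbf_policy(storage=0.4, inflow=0.7, theta=theta))
```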
NASA Astrophysics Data System (ADS)
Yu, Entao; King, Martin P.; Sobolowski, Stefan; Otterå, Odd Helge; Gao, Yongqi
2018-06-01
This study investigates the robustness of hydroclimate impacts in Asia due to major drivers of climate variability in the Pacific Ocean, namely the El Niño-Southern Oscillation (ENSO) and Pacific Decadal Oscillation (PDO). Composite analyses are carried out on a tree ring-based Palmer Drought Severity Index as well as on a long coupled global climate model control experiment. El Niño (La Niña) has a robust impact on wet (dry) conditions in West Asia and dry (wet) conditions in South Asia. For the PDO, impacts are found throughout the Asia domain. However, identifying the robust signals due to PDO from these analyses is more challenging due to the limited lengths of the data. Results indicate that West Asia (South and Southeast Asia) experiences wet (dry) conditions during periods of positive PDO. For East Asia, there is indication that positive (negative) PDO is associated with wet (dry) conditions around and southward of 30°N and dry (wet) conditions north of this latitude. This result is consistent with the current understanding of the role of PDO in the "southern-flood northern-drought" phenomenon in China. We suggest that specific extreme events or periods have regional impacts with strong intensities that cannot be fully explained through the composite analysis of ENSO, PDO, or any combination thereof. Two such examples are shown to illustrate this: the Strange Parallel Drought (1756-1768 CE) and the Great Drought (1876-1878 CE). Additionally, during these climate events, ENSO and PDO can be in phases which are not consistent with the required phases of these drivers that explain the concurrent drought and pluvial conditions in Asia. Therefore, not all historical drought and pluvial events in Northeast Asia and northern China can be related back to ENSO or PDO. Finally, we also examine the dynamical characteristics of the reported hydroclimatic impacts in the global climate model experiment. There is moisture transport into (out of) regions that exhibit wet (dry) conditions in a manner consistent with the various ENSO and PDO composites, thereby providing physical explanation of the index-based results.
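The composite analysis underlying these results reduces to averaging a gridded drought index over event years. A minimal sketch, assuming a (years × grid cells) PDSI array and externally supplied event-year lists:

```python
import numpy as np

def composite_mean(pdsi, years, event_years):
    """Mean PDSI field over the given event years.
    pdsi: array (n_years, n_cells); years: length-n_years vector."""
    return pdsi[np.isin(years, event_years)].mean(axis=0)

# e.g. the wet/dry pattern attributed to ENSO phase (year lists assumed):
# enso_pattern = composite_mean(pdsi, years, elnino_years) - \
#                composite_mean(pdsi, years, lanina_years)
```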
The Ophidia Stack: Toward Large Scale, Big Data Analytics Experiments for Climate Change
NASA Astrophysics Data System (ADS)
Fiore, S.; Williams, D. N.; D'Anca, A.; Nassisi, P.; Aloisio, G.
2015-12-01
The Ophidia project is a research effort on big data analytics addressing scientific data analysis challenges in multiple domains (e.g., climate change). It provides a "datacube-oriented" framework responsible for atomically processing and manipulating scientific datasets, providing a common way to run distributed tasks on large sets of data fragments (chunks). Ophidia provides declarative, server-side, parallel data analysis, jointly with an internal storage model able to deal efficiently with multidimensional data and a hierarchical data organization to manage large data volumes. The project relies on a strong background in high-performance database management and On-Line Analytical Processing (OLAP) systems to manage large scientific datasets. The Ophidia analytics platform provides several data operators to manipulate datacubes (about 50), and array-based primitives (more than 100) to perform data analysis on large scientific data arrays. To address interoperability, Ophidia provides multiple server interfaces (e.g., OGC-WPS). From a client standpoint, a Python interface enables the exploitation of the framework in Python-based ecosystems/applications (e.g., IPython) and the straightforward adoption of a strong set of related libraries (e.g., SciPy, NumPy). The talk will highlight a key feature of the Ophidia framework stack: the "Analytics Workflow Management System" (AWfMS). The Ophidia AWfMS coordinates, orchestrates, optimises and monitors the execution of multiple scientific data analytics and visualization tasks, thus supporting "complex analytics experiments". Some real use cases related to the CMIP5 experiment will be discussed. In particular, with regard to the "Climate models intercomparison data analysis" case study proposed in the EU H2020 INDIGO-DataCloud project, workflows related to (i) anomaly, (ii) trend, and (iii) climate change signal analysis will be presented. Such workflows will be distributed across multiple sites - according to the distribution of the datasets - and will include intercomparison, ensemble, and outlier analysis. The two-level workflow solution envisioned in INDIGO (coarse-grained for distributed task orchestration, and fine-grained at the level of a single data analytics cluster instance) will be presented and discussed.
NASA Astrophysics Data System (ADS)
Sanabria, Diego Ignacio
2001-07-01
Detailed outcrop analysis of the Lower Jurassic Kayenta Formation provides the basis for the formulation of a new sequence stratigraphic model for arid to semi-arid continental deposits and the generation of a comprehensive set of sedimentologic criteria for the recognition of ephemeral stream deposits. Criteria for the recognition of ephemeral deposits in the ancient record were divided into three categories according to the scale of the feature being considered. The first category takes into account sedimentary structures commonly found in the record of ephemeral stream deposits including hyperconcentrated and debris flow deposits, planar parallel bedding, sigmoidal cross-bedding, hummocky cross-bedding, climbing ripple lamination, scour-and-fill structures, convolute bedding, overturned cross-bedding, ball-and-pillow structures, pocket structures, pillars, mud curls, flaser lamination, algal lamination, termite nests, and vertebrate tracks. The second category is concerned with the mesoscale facies architecture of ephemeral stream deposits and includes waning flow successions, bedform climb, downstream accretion, terminal wadi splays, and channel-fill successions indicating catastrophic flooding. At the large-scale facies architecture level, the third category, ephemeral stream deposits are commonly arranged in depositional units characterized by a downstream decrease in grain size and scale of sedimentary structures resulting from deposition in terminal fan systems. Outcrops of the Kayenta Formation and its transition to the Navajo Sandstone along the Vermilion and Echo Cliffs of Northern Arizona indicate that wet/dry climatic cyclicity exerted a major control on regional facies architecture. Two scales of wet/dry climatic cyclicity can be recognized in northern Arizona. Three sequence sets composed of rocks accumulated under predominantly dry or wet conditions are the expression of long-term climatic cyclicity. Short-term climatic cyclicity, on the other hand, is represented by high-frequency sequences composed of eolian or ephemeral fluvial deposits overlain by perennial fluvial sediments. Increased evapotranspiration rates, depressed water tables, and accumulation of eolian or ephemeral fluvial deposits characterize the dry portion of these cycles. The wet part of the cycles is marked by an increase in precipitation and the establishment of perennial fluvial systems and lacustrine basins. This depositional model constitutes a valuable tool for correlation of similar deposits in the subsurface.
Capabilities of Fully Parallelized MHD Stability Code MARS
NASA Astrophysics Data System (ADS)
Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang
2016-10-01
Results of full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria in MHD-kinetic plasma models. Parallel version of MARS, named PMARS, has been recently developed at FAR-TECH. Parallelized MARS is an efficient tool for simulation of MHD instabilities with low, intermediate and high toroidal mode numbers within both fluid and kinetic plasma models, implemented in MARS. Parallelization of the code included parallelization of the construction of the matrix for the eigenvalue problem and parallelization of the inverse vector iterations algorithm, implemented in MARS for the solution of the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. Parallelization of the solution of the eigenvalue problem is made by repeating steps of the MARS algorithm using parallel libraries and procedures. Parallelized MARS is capable of calculating eigenmodes with significantly increased spatial resolution: up to 5,000 adapted radial grid points with up to 500 poloidal harmonics. Such resolution is sufficient for simulation of kink, tearing and peeling-ballooning instabilities with physically relevant parameters. Work is supported by the U.S. DOE SBIR program.
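The inverse vector iteration at the heart of the eigensolver can be sketched as follows. MARS solves a large generalized eigenproblem with the construction and solve steps parallelized; this illustration uses a small dense standard eigenproblem, and all names are assumptions.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def inverse_iteration(A, shift, tol=1e-10, max_iter=200):
    """Shifted inverse vector iteration: converges to the eigenpair of A
    whose eigenvalue lies closest to `shift`."""
    n = A.shape[0]
    lu = lu_factor(A - shift * np.eye(n))   # factor once, reuse each sweep
    v = np.random.default_rng(0).standard_normal(n)
    v /= np.linalg.norm(v)
    lam_old = np.inf
    for _ in range(max_iter):
        v = lu_solve(lu, v)                 # apply (A - shift I)^-1
        v /= np.linalg.norm(v)
        lam = v @ A @ v                     # Rayleigh quotient estimate
        if abs(lam - lam_old) < tol * max(1.0, abs(lam)):
            break
        lam_old = lam
    return lam, v
```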
Fully Parallel MHD Stability Analysis Tool
NASA Astrophysics Data System (ADS)
Svidzinski, Vladimir; Galkin, Sergei; Kim, Jin-Soo; Liu, Yueqiang
2015-11-01
Progress on full parallelization of the plasma stability code MARS will be reported. MARS calculates eigenmodes in 2D axisymmetric toroidal equilibria in MHD-kinetic plasma models. It is a powerful tool for studying MHD and MHD-kinetic instabilities and is widely used by the fusion community. The parallel version of MARS is intended for simulations on local parallel clusters. It will be an efficient tool for simulation of MHD instabilities with low, intermediate and high toroidal mode numbers within both the fluid and kinetic plasma models already implemented in MARS. Parallelization of the code includes parallelization of the construction of the matrix for the eigenvalue problem and parallelization of the inverse iterations algorithm, implemented in MARS for the solution of the formulated eigenvalue problem. Construction of the matrix is parallelized by distributing the load among processors assigned to different magnetic surfaces. Parallelization of the solution of the eigenvalue problem is made by repeating steps of the present MARS algorithm using parallel libraries and procedures. Results of the MARS parallelization and of the development of a new fixed-boundary equilibrium code adapted for MARS input will be reported. Work is supported by the U.S. DOE SBIR program.
NASA Astrophysics Data System (ADS)
Boyko, Oleksiy; Zheleznyak, Mark
2015-04-01
The original numerical code TOPKAPI-IMMS of the distributed rainfall-runoff model TOPKAPI (Todini et al., 1996-2014) has been developed and implemented in Ukraine. A parallel version of the code has recently been developed for use on multiprocessor systems - multicore/multiprocessor PCs and clusters. The algorithm is based on a binary-tree decomposition of the watershed to balance the amount of computation across processors/cores. The Message Passing Interface (MPI) protocol is used as the parallel computing framework. The numerical efficiency of the parallelization algorithm is demonstrated in case studies of flood prediction for mountain watersheds of the Ukrainian Carpathian region. The modeling results are compared with predictions based on lumped-parameter models.
Dynamic modeling of parallel robots for computed-torque control implementation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Codourey, A.
1998-12-01
In recent years, increased interest in parallel robots has been observed. Their control with modern theory, such as the computed-torque method, has, however, been restrained, essentially due to the difficulty of establishing a simple dynamic model that can be calculated in real time. In this paper, a simple method based on the virtual work principle is proposed for modeling parallel robots. The mass matrix of the robot, needed for decoupling control strategies, does not explicitly appear in the formulation; however, it can be computed separately, based on kinetic energy considerations. The method is applied to the DELTA parallel robot, leading to a very efficient model that has been implemented in a real-time computed-torque control algorithm.
Hadano, Mayumi; Nasahara, Kenlo Nishida; Motohka, Takeshi; Noda, Hibiki Muraoka; Murakami, Kazutaka; Hosaka, Masahiro
2013-01-01
Reports indicate that leaf onset (leaf flush) of deciduous trees in cool-temperate ecosystems is occurring earlier in the spring in response to global warming. In this study, we created two types of phenology models, one driven only by warmth (spring warming [SW] model) and another driven by both warmth and winter chilling (parallel chill [PC] model), to predict such phenomena in the Japanese Islands at high spatial resolution (500 m). We calibrated these models using leaf onset dates derived from satellite data (Terra/MODIS) and in situ temperature data derived from a dense network of ground stations (the Automated Meteorological Data Acquisition System, AMeDAS). We ran the models using future climate predictions created by the Japan Meteorological Agency's MRI-AGCM3.1S model. In comparison to the first decade of the 2000s, our results predict that the date of leaf onset in the 2030s will advance by an average of 12 days under the SW model and 7 days under the PC model throughout the study area. The date of onset in the 2090s will advance by 26 days under the SW model and by 15 days under the PC model. The greatest impact will occur on Hokkaido (the northernmost island) and in the central mountains. PMID:23789086
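A spring-warming model of this type reduces to growing-degree accumulation against a critical forcing sum; below is a minimal sketch with illustrative thresholds, not the calibrated MODIS/AMeDAS values. The PC variant would additionally make the critical sum depend on accumulated winter chilling.

```python
import numpy as np

def leaf_onset_doy(tmean, t_base=5.0, f_crit=150.0, start_doy=1):
    """SW-type phenology model: leaf onset occurs on the first day the
    growing-degree sum since `start_doy` exceeds the critical forcing.
    tmean: daily mean temperatures (deg C) for one year; thresholds are
    illustrative placeholders."""
    forcing = np.maximum(tmean[start_doy - 1:] - t_base, 0.0).cumsum()
    hit = int(np.argmax(forcing >= f_crit))
    return start_doy + hit if forcing[hit] >= f_crit else None  # None: no onset
```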
A communication library for the parallelization of air quality models on structured grids
NASA Astrophysics Data System (ADS)
Miehe, Philipp; Sandu, Adrian; Carmichael, Gregory R.; Tang, Youhua; Dăescu, Dacian
PAQMSG is an MPI-based, Fortran 90 communication library for the parallelization of air quality models (AQMs) on structured grids. It consists of distribution, gathering and repartitioning routines for different domain decompositions implementing a master-worker strategy. The library is architecture and application independent and includes optimization strategies for different architectures. This paper presents the library from a user perspective. Results are shown from the parallelization of STEM-III on Beowulf clusters. The PAQMSG library is available on the web. The communication routines are easy to use, and should allow for an immediate parallelization of existing AQMs. PAQMSG can also be used for constructing new models.
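The master-worker distribute/compute/gather cycle the library provides can be illustrated with a Python/mpi4py analogue (PAQMSG itself is Fortran 90/MPI; the names, shapes and the slab-per-rank layout below are assumptions):

```python
# Run with e.g.: mpiexec -n 4 python paqmsg_sketch.py  (hypothetical filename)
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

def advance(slab):
    """Placeholder for the local chemistry/transport work on one sub-domain."""
    return slab * 0.99

# The master holds the full (illustrative) concentration field, one slab per rank.
field = np.ones((size, 64, 32)) if rank == 0 else None
slab = comm.scatter(field, root=0)     # distribution routine
slab = advance(slab)                   # each processor advances its slab
field = comm.gather(slab, root=0)      # gathering routine
if rank == 0:
    print("reassembled field:", np.array(field).shape)
```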