Real-time, global, sea surface temperature (RTG_SST_HR) analysis
Global sensitivity analysis of DRAINMOD-FOREST, an integrated forest ecosystem model
Shiying Tian; Mohamed A. Youssef; Devendra M. Amatya; Eric D. Vance
2014-01-01
Global sensitivity analysis is a useful tool to understand process-based ecosystem models by identifying key parameters and processes controlling model predictions. This study reported a comprehensive global sensitivity analysis for DRAINMOD-FOREST, an integrated model for simulating water, carbon (C), and nitrogen (N) cycles and plant growth in lowland forests. The...
Computational methods for global/local analysis
NASA Technical Reports Server (NTRS)
Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.
1992-01-01
Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.
Global/local methods for probabilistic structural analysis
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Wu, Y.-T.
1993-01-01
A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations, with a more refined local model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of finer mesh, smaller time step, tighter tolerances, etc., and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate significant computer savings with minimal loss in accuracy.
Global/local methods for probabilistic structural analysis
NASA Astrophysics Data System (ADS)
Millwater, H. R.; Wu, Y.-T.
1993-04-01
A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations, with a more refined local model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of finer mesh, smaller time step, tighter tolerances, etc., and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate significant computer savings with minimal loss in accuracy.
Model-Derived Global Aerosol Climatology for MISR Analysis ("Clim-Likely" Data Set)
Atmospheric Science Data Center
2018-04-19
Model-Derived Global Aerosol Climatology for MISR Analysis: the Multi-angle Imaging SpectroRadiometer (MISR) monthly, global 1° x 1° "Clim-Likely" aerosol climatology, derived from 'typical-year' aerosol transport model results, is available for individual 1° x 1° boxes or ...
Global/local stress analysis of composite panels
NASA Technical Reports Server (NTRS)
Ransom, Jonathan B.; Knight, Norman F., Jr.
1989-01-01
A method for performing a global/local stress analysis is described, and its capabilities are demonstrated. The method employs spline interpolation functions which satisfy the linear plate bending equation to determine displacements and rotations from a global model which are used as boundary conditions for the local model. Then, the local model is analyzed independent of the global model of the structure. This approach can be used to determine local, detailed stress states for specific structural regions using independent, refined local models which exploit information from less-refined global models. The method presented is not restricted to having a priori knowledge of the location of the regions requiring local detailed stress analysis. This approach also reduces the computational effort necessary to obtain the detailed stress state. Criteria for applying the method are developed. The effectiveness of the method is demonstrated using a classical stress concentration problem and a graphite-epoxy blade-stiffened panel with a discontinuous stiffener.
Global/local stress analysis of composite structures. M.S. Thesis
NASA Technical Reports Server (NTRS)
Ransom, Jonathan B.
1989-01-01
A method for performing a global/local stress analysis is described and its capabilities are demonstrated. The method employs spline interpolation functions which satisfy the linear plate bending equation to determine displacements and rotations from a global model which are used as boundary conditions for the local model. Then, the local model is analyzed independent of the global model of the structure. This approach can be used to determine local, detailed stress states for specific structural regions using independent, refined local models which exploit information from less-refined global models. The method presented is not restricted to having a priori knowledge of the location of the regions requiring local detailed stress analysis. This approach also reduces the computational effort necessary to obtain the detailed stress state. Criteria for applying the method are developed. The effectiveness of the method is demonstrated using a classical stress concentration problem and a graphite-epoxy blade-stiffened panel with a discontinuous stiffener.
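The two records above describe the same spline-based transfer of global-model displacements and rotations onto the boundary of an independent local model. Below is a minimal sketch of that transfer step, assuming the global results are available as nodal values along the interface; it uses a generic cubic spline from SciPy rather than the plate-bending splines of the original method, and all arrays are illustrative placeholders.

```python
# Sketch of the global->local boundary-condition transfer described above.
# The papers use spline functions that satisfy the plate bending equation;
# here a generic cubic spline stands in for that step, so this is only an
# illustration of the workflow, not the authors' formulation.
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical coarse global-model results along the global/local interface:
# arc-length positions of global nodes and their out-of-plane displacements.
s_global = np.linspace(0.0, 1.0, 9)            # coarse interface coordinates
w_global = 1e-3 * np.sin(np.pi * s_global)     # stand-in displacement field

# Fit an interpolant through the global results.
w_spline = CubicSpline(s_global, w_global)

# Evaluate displacements and rotations (dw/ds) at the refined local-model
# boundary nodes; these become prescribed boundary conditions for the
# independent local analysis.
s_local = np.linspace(0.0, 1.0, 81)
bc_displacement = w_spline(s_local)
bc_rotation = w_spline(s_local, 1)             # first derivative along s

print(bc_displacement[:5], bc_rotation[:5])
```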
IPMP Global Fit - A one-step direct data analysis tool for predictive microbiology.
Huang, Lihan
2017-12-04
The objective of this work is to develop and validate a unified optimization algorithm for performing one-step global regression analysis of isothermal growth and survival curves for determination of kinetic parameters in predictive microbiology. The algorithm is incorporated with user-friendly graphical user interfaces (GUIs) to develop a data analysis tool, the USDA IPMP-Global Fit. The GUIs are designed to guide the users to easily navigate through the data analysis process and properly select the initial parameters for different combinations of mathematical models. The software is developed for one-step kinetic analysis to directly construct tertiary models by minimizing the global error between the experimental observations and mathematical models. The current version of the software is specifically designed for constructing tertiary models with time and temperature as the independent model parameters in the package. The software is tested with a total of 9 different combinations of primary and secondary models for growth and survival of various microorganisms. The results of data analysis show that this software provides accurate estimates of kinetic parameters. In addition, it can be used to improve the experimental design and data collection for more accurate estimation of kinetic parameters. IPMP-Global Fit can be used in combination with the regular USDA-IPMP for solving the inverse problems and developing tertiary models in predictive microbiology. Published by Elsevier B.V.
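As a rough illustration of the one-step (global) regression idea described above, the sketch below fits a primary growth model and a secondary temperature model to several isothermal data sets simultaneously by minimizing a single pooled residual. The log-linear primary model, the Ratkowsky square-root secondary model, and the data are illustrative assumptions, not necessarily the model combinations or interface of IPMP Global Fit.

```python
# One-step ("global") regression sketch: fit a primary growth model and a
# secondary temperature model to all isothermal curves at once by minimizing
# a single pooled residual. The model forms and data below are illustrative
# choices, not necessarily those shipped with IPMP Global Fit.
import numpy as np
from scipy.optimize import least_squares

def log_linear_growth(t, y0, mu):
    """Primary model: log10 counts growing linearly at rate mu (log10/h)."""
    return y0 + mu * t

def ratkowsky_sqrt(T, b, Tmin):
    """Secondary model: sqrt(mu) = b * (T - Tmin)."""
    return (b * (T - Tmin)) ** 2

# Hypothetical isothermal data sets: (temperature C, times h, log10 counts).
datasets = [
    (10.0, np.array([0, 24, 48, 72]), np.array([3.0, 3.5, 4.1, 4.6])),
    (20.0, np.array([0, 6, 12, 24]),  np.array([3.0, 4.0, 5.1, 7.0])),
    (30.0, np.array([0, 3, 6, 9]),    np.array([3.0, 4.2, 5.5, 6.8])),
]

def pooled_residuals(params):
    y0, b, Tmin = params
    res = []
    for T, t, y_obs in datasets:
        mu = ratkowsky_sqrt(T, b, Tmin)                   # secondary model
        res.append(log_linear_growth(t, y0, mu) - y_obs)  # primary model
    return np.concatenate(res)

fit = least_squares(pooled_residuals, x0=[3.0, 0.03, 2.0])
print("y0, b, Tmin =", fit.x)
```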
NASA Astrophysics Data System (ADS)
Ye, M.; Chen, Z.; Shi, L.; Zhu, Y.; Yang, J.
2017-12-01
Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. While global sensitivity analysis is a vital tool for identifying the parameters important to nitrogen reactive transport, conventional global sensitivity analysis only considers parametric uncertainty. This may result in inaccurate selection of important parameters, because parameter importance may vary under different models and modeling scenarios. By using a recently developed variance-based global sensitivity analysis method, this paper identifies important parameters with simultaneous consideration of parametric uncertainty, model uncertainty, and scenario uncertainty. In a numerical example of nitrogen reactive transport modeling, a combination of three scenarios of soil temperature and two scenarios of soil moisture leads to a total of six scenarios. Four alternative models are used to evaluate reduction functions used for calculating actual rates of nitrification and denitrification. The model uncertainty is tangled with scenario uncertainty, as the reduction functions depend on soil temperature and moisture content. The results of sensitivity analysis show that parameter importance varies substantially between different models and modeling scenarios, which may lead to inaccurate selection of important parameters if model and scenario uncertainties are not considered. This problem is avoided by using the new method of sensitivity analysis in the context of model averaging and scenario averaging. The new method of sensitivity analysis can be applied to other problems of contaminant transport modeling when model uncertainty and/or scenario uncertainty are present.
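A minimal sketch of the averaging idea described above follows: first-order variance-based indices, assumed to have already been computed for each model-scenario combination by any standard estimator, are averaged using model and scenario probabilities. Parameter names, probabilities, and index values are made up for illustration and are not taken from the study.

```python
# Sketch of scenario- and model-averaged sensitivity indices, in the spirit of
# the approach described above. It assumes first-order variance-based indices
# are already available for each (scenario, model) pair; only the averaging
# step is shown, with made-up numbers.
import numpy as np

params = ["k_nit", "k_den", "Kd"]            # illustrative parameter names
scenario_prob = np.array([0.4, 0.3, 0.3])    # P(scenario), 3 scenarios
model_prob = np.array([0.5, 0.3, 0.1, 0.1])  # P(model), 4 alternative models

# indices[s, m, p]: first-order index of parameter p under scenario s, model m
rng = np.random.default_rng(0)
indices = rng.uniform(0.0, 0.5, size=(3, 4, len(params)))

# Average over models within each scenario, then over scenarios.
model_averaged = np.tensordot(model_prob, indices, axes=([0], [1]))   # (3, p)
scenario_averaged = scenario_prob @ model_averaged                    # (p,)

for name, s in zip(params, scenario_averaged):
    print(f"{name}: averaged first-order index ~ {s:.3f}")
```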
NASA Technical Reports Server (NTRS)
1984-01-01
The Global Modeling and Simulation Branch (GMSB) of the Laboratory for Atmospheric Sciences (GLAS) is engaged in general circulation modeling studies related to global atmospheric and oceanographic research. The research activities discussed are organized into two disciplines: Global Weather/Observing Systems and Climate/Ocean-Air Interactions. The Global Weather activities are grouped into four areas: (1) Analysis and Forecast Studies, (2) Satellite Observing Systems, (3) Analysis and Model Development, and (4) Atmospheric Dynamics and Diagnostic Studies. The GLAS Analysis/Forecast/Retrieval System was applied to both FGGE and post-FGGE periods. The resulting analyses have already been used in a large number of theoretical studies of atmospheric dynamics, forecast impact studies, and development of new or improved algorithms for the utilization of satellite data. Ocean studies have focused on the analysis of long-term global sea surface temperature data, for use in the study of the response of the atmosphere to sea surface temperature anomalies. Climate research has concentrated on the simulation of global cloudiness, and on the sensitivities of the climate to sea surface temperature and ground wetness anomalies.
National Centers for Environmental Prediction
GLOBAL CLIMATE & WEATHER MODELING: Global Forecast System (GFS) products - Please see ...
NASA Technical Reports Server (NTRS)
Johnson, Donald R.
1998-01-01
The goal of this research is the continued development and application of global isentropic modeling and analysis capabilities to describe hydrologic processes and energy exchange in the climate system, and discern regional climate change. This work involves a combination of modeling and analysis efforts involving 4DDA datasets and simulations from the University of Wisconsin (UW) hybrid isentropic-sigma (theta-sigma) coordinate model and the GEOS GCM.
Sumner, T; Shephard, E; Bogle, I D L
2012-09-07
One of the main challenges in the development of mathematical and computational models of biological systems is the precise estimation of parameter values. Understanding the effects of uncertainties in parameter values on model behaviour is crucial to the successful use of these models. Global sensitivity analysis (SA) can be used to quantify the variability in model predictions resulting from the uncertainty in multiple parameters and to shed light on the biological mechanisms driving system behaviour. We present a new methodology for global SA in systems biology which is computationally efficient and can be used to identify the key parameters and their interactions which drive the dynamic behaviour of a complex biological model. The approach combines functional principal component analysis with established global SA techniques. The methodology is applied to a model of the insulin signalling pathway, defects of which are a major cause of type 2 diabetes, and a number of key features of the system are identified.
Validation and Verification of Operational Land Analysis Activities at the Air Force Weather Agency
NASA Technical Reports Server (NTRS)
Shaw, Michael; Kumar, Sujay V.; Peters-Lidard, Christa D.; Cetola, Jeffrey
2011-01-01
The NASA-developed Land Information System (LIS) is the Air Force Weather Agency's (AFWA) operational Land Data Assimilation System (LDAS), combining real-time precipitation observations and analyses, global forecast model data, vegetation, terrain, and soil parameters with the community Noah land surface model, along with other hydrology module options, to generate profile analyses of global soil moisture, soil temperature, and other important land surface characteristics. Key characteristics: (1) a range of satellite data products and surface observations is used to generate the land analysis products; (2) global, 1/4-degree spatial resolution; (3) model analysis generated every 3 hours.
NASA Astrophysics Data System (ADS)
Ise, Takeshi; Litton, Creighton M.; Giardina, Christian P.; Ito, Akihiko
2010-12-01
Partitioning of gross primary production (GPP) to aboveground versus belowground, to growth versus respiration, and to short versus long-lived tissues exerts a strong influence on ecosystem structure and function, with potentially large implications for the global carbon budget. A recent meta-analysis of forest ecosystems suggests that carbon partitioning to leaves, stems, and roots varies consistently with GPP and that the ratio of net primary production (NPP) to GPP is conservative across environmental gradients. To examine influences of carbon partitioning schemes employed by global ecosystem models, we used this meta-analysis-based model and a satellite-based (MODIS) terrestrial GPP data set to estimate global woody NPP and equilibrium biomass, and then compared it to two process-based ecosystem models (Biome-BGC and VISIT) using the same GPP data set. We hypothesized that different carbon partitioning schemes would result in large differences in global estimates of woody NPP and equilibrium biomass. Woody NPP estimated by Biome-BGC and VISIT was 25% and 29% higher than the meta-analysis-based model for boreal forests, with smaller differences in temperate and tropics. Global equilibrium woody biomass, calculated from model-specific NPP estimates and a single set of tissue turnover rates, was 48 and 226 Pg C higher for Biome-BGC and VISIT compared to the meta-analysis-based model, reflecting differences in carbon partitioning to structural versus metabolically active tissues. In summary, we found that different carbon partitioning schemes resulted in large variations in estimates of global woody carbon flux and storage, indicating that stand-level controls on carbon partitioning are not yet accurately represented in ecosystem models.
SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool
Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda
2008-01-01
Background: It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML-compatible software tools are limited in their ability to perform global sensitivity analyses of these models. Results: This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis, and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, Sobol's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. Conclusion: SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes. PMID:18706080
SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool.
Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda
2008-08-15
It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML-compatible software tools are limited in their ability to perform global sensitivity analyses of these models. This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis, and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, Sobol's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes.
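Of the global sensitivity methods listed in the two SBML-SAT records above, the partial rank correlation coefficient (PRCC) is easy to sketch directly. The snippet below computes PRCCs with NumPy/SciPy on a toy response standing in for an SBML model; it is not SBML-SAT code, and the parameter names are placeholders.

```python
# Partial rank correlation coefficients (PRCC), one of the global sensitivity
# methods listed above, computed directly rather than through SBML-SAT.
import numpy as np
from scipy.stats import rankdata

def prcc(X, y):
    """PRCC of each column of X with y: correlate residuals after removing
    the linear effect of all other rank-transformed parameters."""
    Xr = np.column_stack([rankdata(col) for col in X.T])
    yr = rankdata(y)
    n, k = Xr.shape
    out = np.empty(k)
    for j in range(k):
        others = np.column_stack([np.ones(n), np.delete(Xr, j, axis=1)])
        # Residuals of parameter j and of the output w.r.t. the other parameters.
        rx = Xr[:, j] - others @ np.linalg.lstsq(others, Xr[:, j], rcond=None)[0]
        ry = yr - others @ np.linalg.lstsq(others, yr, rcond=None)[0]
        out[j] = np.corrcoef(rx, ry)[0, 1]
    return out

rng = np.random.default_rng(1)
X = rng.uniform(size=(500, 3))                                # sampled parameters
y = 3 * X[:, 0] ** 2 - X[:, 1] + 0.1 * rng.normal(size=500)   # toy response

print(dict(zip(["k1", "k2", "k3"], prcc(X, y).round(2))))
```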
NASA Technical Reports Server (NTRS)
Robertson, Franklin R.; Christy, John R.; Goodman, Steven J.; Miller, Tim L.; Fitzjarrald, Dan; Lapenta, Bill; Wang, Shouping
1991-01-01
The primary objective is to determine the scope and interactions of the global water cycle with all components of the Earth system and to understand how it stimulates and regulates changes on both global and regional scales. The following subject areas are covered: (1) water vapor variability; (2) multi-phase water analysis; (3) diabatic heating; (4) MSU (Microwave Sounding Unit) temperature analysis; (5) Optimal precipitation and streamflow analysis; (6) CCM (Community Climate Model) hydrological cycle; (7) CCM1 climate sensitivity to lower boundary forcing; and (8) mesoscale modeling of atmosphere/surface interaction.
Exploratory Model Analysis of the Space Based Infrared System (SBIRS) Low Global Scheduler Problem
1999-12-01
solution. The non-linear least squares model is defined as Y = f(θ, t), where θ = M-element parameter vector; Y = N-element vector of all data; t ... Naval Postgraduate School, Monterey, California; Master's Thesis, December 1999.
Global Real-Time Ocean Forecast System
Marine Modeling and Analysis Branch. On 17 Oct 2017 at 0Z, the Global RTOFS model was upgraded to version 1.1.2. Changes include: ... The global operational Real-Time Ocean Forecast System (Global RTOFS) at the National Centers for ...
How Stationary Are the Internal Tides in a High-Resolution Global Ocean Circulation Model?
2014-05-12
Egbert et al., 1994] and that the model global internal tide amplitudes compare well with an altimetric-based tidal analysis [Ray and Byrne, 2010]. The ... analysis [Foreman, 1977] applied to the HYCOM total SSH. We will follow Shriver et al. [2012], analyzing the tides along satellite altimeter tracks ... spots," the comparison between the model and altimetric analysis is not as good due, in part, to two problems: errors in the model barotropic tides and ...
Revealing the underlying drivers of disaster risk: a global analysis
NASA Astrophysics Data System (ADS)
Peduzzi, Pascal
2017-04-01
Disaster events are perfect examples of compound events. Disaster risk lies at the intersection of several independent components such as hazard, exposure and vulnerability. Understanding the weight of each component requires extensive standardisation. Here, I show how footprints of past disastrous events were generated using GIS modelling techniques and used for extracting population and economic exposures based on distribution models. Using past event losses, it was possible to identify and quantify a wide range of socio-politico-economic drivers associated with human vulnerability. The analysis was applied to about nine thousand individual past disastrous events covering earthquakes, floods and tropical cyclones. Using a multiple regression analysis on these individual events it was possible to quantify each risk component and assess how vulnerability is influenced by various hazard intensities. The results show that hazard intensity, exposure, poverty, governance as well as other underlying factors (e.g. remoteness) can explain the magnitude of past disasters. Analysis was also performed to highlight the role of future trends in population and climate change and how these may impact exposure to tropical cyclones in the future. GIS models combined with statistical multiple regression analysis provided a powerful methodology to identify, quantify and model disaster risk taking into account its various components. The same methodology can be applied to various types of risk at local to global scale. This method was applied and developed for the Global Risk Analysis of the Global Assessment Report on Disaster Risk Reduction (GAR). It was first applied to mortality risk in GAR 2009 and GAR 2011. New models ranging from global assets exposure to global flood hazard models were also recently developed to improve the resolution of the risk analysis and applied through CAPRA software to provide probabilistic economic risk assessments such as Average Annual Losses (AAL) and Probable Maximum Losses (PML) in GAR 2013 and GAR 2015. In parallel, similar methodologies were developed to highlight the role of ecosystems for Climate Change Adaptation (CCA) and Disaster Risk Reduction (DRR). New developments may include slow hazards (e.g. soil degradation and droughts) and natech hazards (by intersecting with georeferenced critical infrastructures). The various global hazard, exposure and risk models can be visualized and downloaded through the PREVIEW Global Risk Data Platform.
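A hedged sketch of the multiple-regression step described above follows: event losses regressed on hazard intensity, exposure, and socio-economic vulnerability proxies. The variables, coefficients, and data are synthetic placeholders, not the GAR event database or the author's model.

```python
# Sketch of the multiple-regression step described above: past event losses
# regressed on hazard intensity, exposed population, and socio-economic
# vulnerability proxies. All values are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(2)
n = 2000                                           # hypothetical past events
log_intensity = rng.normal(0.0, 1.0, n)            # e.g. log wind speed / PGA
log_exposure = rng.normal(10.0, 2.0, n)            # log exposed population
poverty = rng.uniform(0.0, 1.0, n)                 # vulnerability proxy
governance = rng.uniform(0.0, 1.0, n)

log_losses = (1.2 * log_intensity + 0.9 * log_exposure
              + 1.5 * poverty - 0.8 * governance
              + rng.normal(0.0, 0.5, n))

# Ordinary least squares on the design matrix [1, intensity, exposure, ...].
X = np.column_stack([np.ones(n), log_intensity, log_exposure, poverty, governance])
beta, *_ = np.linalg.lstsq(X, log_losses, rcond=None)

names = ["intercept", "hazard intensity", "exposure", "poverty", "governance"]
for name, b in zip(names, beta):
    print(f"{name:>16s}: {b:+.2f}")
```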
Dynamic Analysis of the Melanoma Model: From Cancer Persistence to Its Eradication
NASA Astrophysics Data System (ADS)
Starkov, Konstantin E.; Jimenez Beristain, Laura
In this paper, we study the global dynamics of the five-dimensional melanoma model developed by Kronik et al. This model describes interactions of tumor cells with cytotoxic T cells and respective cytokines under cellular immunotherapy. We derive ultimate upper and lower bounds for the variables of this model, provide formulas for equilibrium points, and present local asymptotic stability/hyperbolic instability conditions. Next, we prove the existence of the attracting set. Based on these results, we obtain conditions for global asymptotic melanoma eradication via global stability analysis. Finally, we provide bounds for the locus of the melanoma persistence equilibrium point, study the case of melanoma persistence, and describe conditions under which we observe global attractivity to the unique melanoma persistence equilibrium point.
Van Steen, Kristel; Curran, Desmond; Kramer, Jocelyn; Molenberghs, Geert; Van Vreckem, Ann; Bottomley, Andrew; Sylvester, Richard
2002-12-30
Clinical and quality of life (QL) variables from an EORTC clinical trial of first-line chemotherapy in advanced breast cancer were used in a prognostic factor analysis of survival and response to chemotherapy. For response, different final multivariate models were obtained from forward and backward selection methods, suggesting a disconcerting instability. Quality of life was measured using the EORTC QLQ-C30 questionnaire completed by patients. Subscales on the questionnaire are known to be highly correlated, and therefore it was hypothesized that multicollinearity contributed to model instability. A correlation matrix indicated that global QL was highly correlated with 7 out of 11 variables. In a first attempt to explore multicollinearity, we used global QL as the dependent variable in a regression model with the other QL subscales as predictors. Afterwards, standard diagnostic tests for multicollinearity were performed. An exploratory principal components analysis and factor analysis of the QL subscales identified at most three important components and indicated that inclusion of global QL made minimal difference to the loadings on each component, suggesting that it is redundant in the model. In a second approach, we advocate a bootstrap technique to assess the stability of the models. Based on these analyses, and since global QL exacerbates problems of multicollinearity, we therefore recommend that global QL be excluded from prognostic factor analyses using the QLQ-C30. The prognostic factor analysis was rerun without global QL in the model, and selected the same significant prognostic factors as before. Copyright 2002 John Wiley & Sons, Ltd.
A global sensitivity analysis approach for morphogenesis models.
Boas, Sonja E M; Navarro Jimenez, Maria I; Merks, Roeland M H; Blom, Joke G
2015-11-21
Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such 'black-box' models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, also provided new insights into the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operand mechanisms in morphogenesis. The workflow is applicable to all 'black-box' models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.
Coupled 2D-3D finite element method for analysis of a skin panel with a discontinuous stiffener
NASA Technical Reports Server (NTRS)
Wang, J. T.; Lotts, C. G.; Davis, D. D., Jr.; Krishnamurthy, T.
1992-01-01
This paper describes a computationally efficient analysis method which was used to predict detailed stress states in a typical composite compression panel with a discontinuous hat stiffener. A global-local approach was used. The global model incorporated both 2D shell and 3D brick elements connected by newly developed transition elements. Most of the panel was modeled with 2D elements, while 3D elements were employed to model the stiffener flange and the adjacent skin. Both linear and geometrically nonlinear analyses were performed on the global model. The effect of geometric nonlinearity induced by the eccentric load path due to the discontinuous hat stiffener was significant. The local model used a fine mesh of 3D brick elements to model the region at the end of the stiffener. Boundary conditions of the local 3D model were obtained by spline interpolation of the nodal displacements from the global analysis. Detailed in-plane and through-the-thickness stresses were calculated in the flange-skin interface near the end of the stiffener.
Understanding Coupling of Global and Diffuse Solar Radiation with Climatic Variability
NASA Astrophysics Data System (ADS)
Hamdan, Lubna
Global solar radiation data is very important for a wide variety of applications and scientific studies. However, these data are not readily available because of the cost of measuring equipment and the tedious maintenance and calibration requirements. A wide variety of models has been introduced by researchers to estimate and/or predict global solar radiation and its components (direct and diffuse radiation) using other readily obtainable atmospheric parameters. The goal of this research is to understand the coupling of global and diffuse solar radiation with climatic variability, by investigating the relationships between these radiations and atmospheric parameters. For this purpose, we applied multilinear regression analysis to the data of the National Solar Radiation Database 1991-2010 Update. The analysis showed that the main atmospheric parameters that affect the amount of global radiation received on Earth's surface are cloud cover and relative humidity. Global radiation correlates negatively with both variables. Linear models are excellent approximations for the relationship between atmospheric parameters and global radiation. A linear model with the predictors total cloud cover, relative humidity, and extraterrestrial radiation is able to explain around 98% of the variability in global radiation. For diffuse radiation, the analysis showed that the main atmospheric parameters that affect the amount received on Earth's surface are cloud cover and aerosol optical depth. Diffuse radiation correlates positively with both variables. Linear models are very good approximations for the relationship between atmospheric parameters and diffuse radiation. A linear model with the predictors total cloud cover, aerosol optical depth, and extraterrestrial radiation is able to explain around 91% of the variability in diffuse radiation. Prediction analysis showed that the fitted linear models were able to predict diffuse radiation with test adjusted R2 values of 0.93, using the data of total cloud cover, aerosol optical depth, relative humidity, and extraterrestrial radiation. However, for prediction purposes, using nonlinear terms or nonlinear models might enhance the prediction of diffuse radiation.
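The workflow described above is essentially an ordinary multilinear regression; a small sketch follows, with synthetic stand-ins for the NSRDB variables (total cloud cover, relative humidity, extraterrestrial radiation) and illustrative coefficients, not the thesis data or fitted model.

```python
# Multilinear regression sketch in the spirit of the analysis above: global
# radiation modeled from total cloud cover, relative humidity, and
# extraterrestrial radiation. Data and coefficients are synthetic; only the
# workflow is shown.
import numpy as np

rng = np.random.default_rng(3)
n = 5000
cloud_cover = rng.uniform(0.0, 1.0, n)            # total cloud cover fraction
rel_humidity = rng.uniform(0.1, 1.0, n)           # relative humidity fraction
extraterrestrial = rng.uniform(200.0, 1200.0, n)  # W/m^2 at top of atmosphere

# Synthetic linear relationship (not physical): radiation decreases with
# cloud cover and humidity, increases with extraterrestrial radiation.
global_rad = (0.7 * extraterrestrial - 300.0 * cloud_cover
              - 80.0 * rel_humidity + rng.normal(0.0, 15.0, n))

X = np.column_stack([np.ones(n), cloud_cover, rel_humidity, extraterrestrial])
beta, *_ = np.linalg.lstsq(X, global_rad, rcond=None)

pred = X @ beta
ss_res = np.sum((global_rad - pred) ** 2)
ss_tot = np.sum((global_rad - global_rad.mean()) ** 2)
print("coefficients:", beta.round(2))
print("R^2 =", round(1 - ss_res / ss_tot, 3))
```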
System for the Analysis of Global Energy Markets - Vol. II, Model Documentation
2003-01-01
The second volume provides a data implementation guide that lists all naming conventions and model constraints. In addition, Volume 1 has two appendixes that provide a schematic of the System for the Analysis of Global Energy Markets (SAGE) structure and a listing of the source code, respectively.
Bennett, Katrina Eleanor; Urrego Blanco, Jorge Rolando; Jonko, Alexandra; ...
2017-11-20
The Colorado River basin is a fundamentally important river for society, ecology and energy in the United States. Streamflow estimates are often provided using modeling tools which rely on uncertain parameters; sensitivity analysis can help determine which parameters impact model results. Despite the fact that simulated flows respond to changing climate and vegetation in the basin, parameter sensitivity of the simulations under climate change has rarely been considered. In this study, we conduct a global sensitivity analysis to relate changes in runoff, evapotranspiration, snow water equivalent and soil moisture to model parameters in the Variable Infiltration Capacity (VIC) hydrologic model. Here, we combine global sensitivity analysis with a space-filling Latin Hypercube sampling of the model parameter space and statistical emulation of the VIC model to examine sensitivities to uncertainties in 46 model parameters following a variance-based approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, Katrina Eleanor; Urrego Blanco, Jorge Rolando; Jonko, Alexandra
The Colorado River basin is a fundamentally important river for society, ecology and energy in the United States. Streamflow estimates are often provided using modeling tools which rely on uncertain parameters; sensitivity analysis can help determine which parameters impact model results. Despite the fact that simulated flows respond to changing climate and vegetation in the basin, parameter sensitivity of the simulations under climate change has rarely been considered. In this study, we conduct a global sensitivity analysis to relate changes in runoff, evapotranspiration, snow water equivalent and soil moisture to model parameters in the Variable Infiltration Capacity (VIC) hydrologic model. Here, we combine global sensitivity analysis with a space-filling Latin Hypercube sampling of the model parameter space and statistical emulation of the VIC model to examine sensitivities to uncertainties in 46 model parameters following a variance-based approach.
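The two records above outline a Latin Hypercube design over the parameter space, emulation of the VIC model, and a variance-based analysis. The sketch below mimics that workflow on a reduced scale: a space-filling sample from scipy.stats.qmc, a cheap surrogate standing in for the emulated model, and a crude binned variance-based importance measure (not the full Sobol estimator used in the study). Parameter names and ranges are illustrative, not the 46 VIC parameters.

```python
# Sketch of the sampling-plus-emulation workflow described above: a Latin
# Hypercube design, a cheap stand-in for the emulated model response, and a
# simple binned variance-based importance measure.
import numpy as np
from scipy.stats import qmc

names = ["infilt", "Ds", "Dsmax", "Ws"]               # illustrative subset
lower = np.array([0.001, 0.001, 1.0, 0.5])
upper = np.array([0.4, 1.0, 30.0, 1.0])

sampler = qmc.LatinHypercube(d=len(names), seed=4)
X = qmc.scale(sampler.random(n=2048), lower, upper)   # space-filling design

def surrogate_runoff(x):
    """Cheap stand-in for the emulated model response (e.g. annual runoff)."""
    return 100.0 * x[:, 0] + 5.0 * x[:, 2] + 50.0 * x[:, 0] * x[:, 3]

y = surrogate_runoff(X)

# Crude first-order importance: variance of conditional means over quantile
# bins of each parameter, divided by the total variance.
def first_order(xj, y, bins=32):
    edges = np.quantile(xj, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.digitize(xj, edges[1:-1]), 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

for j, name in enumerate(names):
    print(f"{name}: S1 ~ {first_order(X[:, j], y):.2f}")
```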
Global climate change impacts on forests and markets
Xiaohui Tian; Brent Sohngen; John B Kim; Sara Ohrel; Jefferson Cole
2016-01-01
This paper develops an economic analysis of climate change impacts in the global forest sector. It illustrates how potential future climate change impacts can be integrated into a dynamic forestry economics model using data from a global dynamic vegetation model, the MC2 model. The results suggest that climate change will cause forest outputs (such as timber) to increase...
An Analysis of San Diego's Housing Market Using a Geographically Weighted Regression Approach
NASA Astrophysics Data System (ADS)
Grant, Christina P.
San Diego County real estate transaction data was evaluated with a set of linear models calibrated by ordinary least squares and geographically weighted regression (GWR). The goal of the analysis was to determine whether the spatial effects assumed to be in the data are best studied globally with no spatial terms, globally with a fixed effects submarket variable, or locally with GWR. 18,050 single-family residential sales which closed in the six months between April 2014 and September 2014 were used in the analysis. Diagnostic statistics including AICc, R2, Global Moran's I, and visual inspection of diagnostic plots and maps indicate superior model performance by GWR as compared to both global regressions.
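To make the OLS-versus-GWR comparison above concrete, the sketch below fits one global OLS coefficient and then a locally weighted (Gaussian-kernel) coefficient at two target locations. Coordinates, bandwidth, and the single hedonic variable are synthetic placeholders, not the San Diego transaction data or the calibration actually used in the thesis.

```python
# Sketch contrasting a global OLS fit with a geographically weighted
# regression (GWR) estimate at a single location, using weighted least
# squares with a Gaussian distance kernel.
import numpy as np

rng = np.random.default_rng(5)
n = 1000
coords = rng.uniform(0.0, 50.0, size=(n, 2))          # km grid
sqft = rng.uniform(800.0, 4000.0, n)

# Synthetic prices: the sqft premium drifts with longitude (spatial effect).
price = 50_000 + (100.0 + 2.0 * coords[:, 0]) * sqft + rng.normal(0, 20_000, n)

X = np.column_stack([np.ones(n), sqft])

# Global OLS: one coefficient vector for the whole study area.
beta_ols, *_ = np.linalg.lstsq(X, price, rcond=None)

# GWR at one target location: weighted least squares with a Gaussian kernel.
def gwr_at(point, bandwidth_km=10.0):
    d = np.linalg.norm(coords - point, axis=1)
    w = np.exp(-0.5 * (d / bandwidth_km) ** 2)
    beta, *_ = np.linalg.lstsq(np.sqrt(w)[:, None] * X, np.sqrt(w) * price,
                               rcond=None)
    return beta

print("global sqft coefficient:", round(beta_ols[1], 1))
print("local  sqft coefficient at (5, 25):", round(gwr_at(np.array([5.0, 25.0]))[1], 1))
print("local  sqft coefficient at (45, 25):", round(gwr_at(np.array([45.0, 25.0]))[1], 1))
```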
Global sensitivity analysis for fuzzy inputs based on the decomposition of fuzzy output entropy
NASA Astrophysics Data System (ADS)
Shi, Yan; Lu, Zhenzhou; Zhou, Yicheng
2018-06-01
To analyse the component of fuzzy output entropy, a decomposition method of fuzzy output entropy is first presented. After the decomposition of fuzzy output entropy, the total fuzzy output entropy can be expressed as the sum of the component fuzzy entropy contributed by fuzzy inputs. Based on the decomposition of fuzzy output entropy, a new global sensitivity analysis model is established for measuring the effects of uncertainties of fuzzy inputs on the output. The global sensitivity analysis model can not only tell the importance of fuzzy inputs but also simultaneously reflect the structural composition of the response function to a certain degree. Several examples illustrate the validity of the proposed global sensitivity analysis, which is a significant reference in engineering design and optimization of structural systems.
NASA Technical Reports Server (NTRS)
Johnson, Donald R.
2001-01-01
This research was directed to the development and application of global isentropic modeling and analysis capabilities to describe hydrologic processes and energy exchange in the climate system, and discern regional climate change. An additional objective was to investigate the accuracy and theoretical limits of global climate predictability which are imposed by the inherent limitations of simulating trace constituent transport and the hydrologic processes of condensation, precipitation and cloud life cycles.
Dynamic Modeling of Cell-Free Biochemical Networks Using Effective Kinetic Models
2015-03-16
... sensitivity value was the maximum uncertainty in that value estimated by the Sobol method. 2.4. Global Sensitivity Analysis of the Reduced Order Coagulation ... sensitivity analysis, using the variance-based method of Sobol, to estimate which parameters controlled the performance of the reduced order model [69]. We ... Comput. Sci. Eng. 2007, 9, 90-95. 69. Sobol, I. Global sensitivity indices for nonlinear mathematical models and their Monte Carlo estimates.
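Since the excerpt above cites Sobol's variance-based method and its Monte Carlo estimates, a compact pick-freeze (Saltelli-style) estimator of first-order Sobol indices is sketched below on a toy response function; it is not the authors' reduced-order coagulation model or their code.

```python
# Compact Monte Carlo estimator of first-order Sobol sensitivity indices
# (pick-freeze / Saltelli-style), standing in for the variance-based analysis
# cited above. The response function is a toy stand-in.
import numpy as np

def model(x):
    """Toy response: strongly driven by x1, weakly by x2, not by x3."""
    return np.sin(2 * np.pi * x[:, 0]) + 0.3 * x[:, 1] ** 2 + 0.0 * x[:, 2]

rng = np.random.default_rng(6)
N, k = 20_000, 3
A = rng.uniform(size=(N, k))
B = rng.uniform(size=(N, k))
yA, yB = model(A), model(B)
V = np.var(np.concatenate([yA, yB]))

S1 = []
for i in range(k):
    ABi = A.copy()
    ABi[:, i] = B[:, i]          # resample only column i
    yABi = model(ABi)
    S1.append(np.mean(yB * (yABi - yA)) / V)

print("first-order Sobol indices:", np.round(S1, 2))
```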
System for the Analysis of Global Energy Markets - Vol. I, Model Documentation
2003-01-01
Documents the objectives and the conceptual and methodological approach used in the development of projections for the International Energy Outlook. The first volume of this report describes the System for the Analysis of Global Energy Markets (SAGE) methodology and provides an in-depth explanation of the equations of the model.
NASA Technical Reports Server (NTRS)
Fu, L. L.; Chao, Y.
1997-01-01
Investigated in this study is the response of a global ocean general circulation model to forcing provided by two wind products: operational analysis from the National Center for Environmental Prediction (NCEP); observations made by the ERS-1 radar scatterometer.
Structural Analysis of the Redesigned Ice/Frost Ramp Bracket
NASA Technical Reports Server (NTRS)
Phillips, D. R.; Dawicke, D. S.; Gentz, S. J.; Roberts, P. W.; Raju, I. S.
2007-01-01
This paper describes the interim structural analysis of a redesigned Ice/Frost Ramp bracket for the Space Shuttle External Tank (ET). The proposed redesigned bracket consists of mounts for attachment to the ET wall, supports for the electronic/instrument cables and propellant repressurization lines that run along the ET, an upper plate, a lower plate, and complex bolted connections. The eight nominal bolted connections are considered critical in the summarized structural analysis. Each bolted connection contains a bolt, a nut, four washers, and a non-metallic spacer and block that are designed for thermal insulation. A three-dimensional (3D) finite element model of the bracket is developed using solid 10-node tetrahedral elements. The loading provided by the ET Project is used in the analysis. Because of the complexities associated with accurately modeling the bolted connections in the bracket, the analysis is performed using a global/local analysis procedure. The finite element analysis of the bracket identifies one of the eight bolted connections as having high stress concentrations. A local area of the bracket surrounding this bolted connection is extracted from the global model and used as a local model. Within the local model, the various components of the bolted connection are refined, and contact is introduced along the appropriate interfaces determined by the analysts. The deformations from the global model are applied as boundary conditions to the local model. The results from the global/local analysis show that while the stresses in the bolts are well within yield, the spacers fail due to compression. The primary objective of the interim structural analysis is to show concept viability for static thermal testing. The proposed design concept would undergo continued design optimization to address the identified analytical assumptions and concept shortcomings, assuming successful thermal testing.
Evaluating Economic Impacts of Expanded Global Wood Energy Consumption with the USFPM/GFPM Model
Peter J. Ince; Andrew Kramp; Kenneth E. Skog
2012-01-01
A U.S. forest sector market module was developed within the general Global Forest Products Model. The U.S. module tracks regional timber markets, timber harvests by species group, and timber product outputs in greater detail than does the global model. This hybrid approach provides detailed regional market analysis for the United States while retaining the...
NASA Technical Reports Server (NTRS)
Christensen, E. J.; Haines, B. J.; Mccoll, K. C.; Nerem, R. S.
1994-01-01
We have compared Global Positioning System (GPS)-based dynamic and reduced-dynamic TOPEX/Poseidon orbits over three 10-day repeat cycles of the ground-track. The results suggest that the prelaunch joint gravity model (JGM-1) introduces geographically correlated errors (GCEs) which have a strong meridional dependence. The global distribution and magnitude of these GCEs are consistent with a prelaunch covariance analysis, with estimated and predicted global rms error statistics of 2.3 and 2.4 cm rms, respectively. Repeating the analysis with the post-launch joint gravity model (JGM-2) suggests that a portion of the meridional dependence observed in JGM-1 still remains, with global rms error of 1.2 cm.
Global Futures: The Emerging Scenario.
ERIC Educational Resources Information Center
Seth, Satish C.
1983-01-01
Acknowledging global interdependence, especially in economics, may be the most important step toward resolving international conflicts. Describes seven major global dangers and gives scenarios for exploring likely global futures. As "tools of prescription" these global models are inadequate, but as "tools of analysis" they have…
Lattice Boltzmann methods for global linear instability analysis
NASA Astrophysics Data System (ADS)
Pérez, José Miguel; Aguilar, Alfonso; Theofilis, Vassilis
2017-12-01
Modal global linear instability analysis is performed using, for the first time ever, the lattice Boltzmann method (LBM) to analyze incompressible flows with two and three inhomogeneous spatial directions. Four linearization models have been implemented in order to recover the linearized Navier-Stokes equations in the incompressible limit. Two of those models employ the single relaxation time and have been proposed previously in the literature as linearization of the collision operator of the lattice Boltzmann equation. Two additional models are derived herein for the first time by linearizing the local equilibrium probability distribution function. Instability analysis results are obtained in three benchmark problems, two in closed geometries and one in open flow, namely the square and cubic lid-driven cavity flow and flow in the wake of the circular cylinder. Comparisons with results delivered by classic spectral element methods verify the accuracy of the proposed new methodologies and point to potential limitations particular to the LBM approach. The known issue of appearance of numerical instabilities when the SRT model is used in direct numerical simulations employing the LBM is shown to be reflected in a spurious global eigenmode when the SRT model is used in the instability analysis. Although this mode is absent in the multiple relaxation times model, other spurious instabilities can also arise and are documented herein. Areas of potential improvements in order to make the proposed methodology competitive with established approaches for global instability analysis are discussed.
Improving Incremental Balance in the GSI 3DVAR Analysis System
NASA Technical Reports Server (NTRS)
Errico, Ronald M.; Yang, Runhua; Kleist, Daryl T.; Parrish, David F.; Derber, John C.; Treadon, Russ
2008-01-01
The Gridpoint Statistical Interpolation (GSI) analysis system is a unified global/regional 3DVAR analysis code that has been under development for several years at the National Centers for Environmental Prediction (NCEP)/Environmental Modeling Center. It has recently been implemented into operations at NCEP in both the global and North American data assimilation systems (GDAS and NDAS). An important aspect of this development has been improving the balance of the analysis produced by GSI. The improved balance between variables has been achieved through the inclusion of a Tangent Linear Normal Mode Constraint (TLNMC). The TLNMC method has proven to be very robust and effective. The TLNMC as part of the global GSI system has resulted in substantial improvement in data assimilation both at NCEP and at the NASA Global Modeling and Assimilation Office (GMAO).
Global model of zenith tropospheric delay proposed based on EOF analysis
NASA Astrophysics Data System (ADS)
Sun, Langlang; Chen, Peng; Wei, Erhu; Li, Qinzheng
2017-07-01
Tropospheric delay is one of the main error budgets in Global Navigation Satellite System (GNSS) measurements. Many empirical correction models have been developed to compensate for this delay, and models which do not require meteorological parameters have received the most attention. This study established a global troposphere zenith total delay (ZTD) model, called Global Empirical Orthogonal Function Troposphere (GEOFT), based on the empirical orthogonal function (EOF, also known as geographically weighted PCA) analysis method and the Global Geodetic Observing System (GGOS) Atmosphere data from 2012 to 2015. The results showed that ZTD variation could be well represented by the characteristics of the EOF base functions Ek and associated coefficients Pk. Here, E1 mainly signifies the equatorial anomaly; E2 represents north-south asymmetry; and E3 and E4 reflect regional variation. Moreover, P1 mainly reflects annual and semiannual variation components; P2 and P3 mainly contain annual variation components; and P4 displays semiannual variation components. We validated the proposed GEOFT model using tropospheric delay data from the GGOS ZTD grid data and the tropospheric product of the International GNSS Service (IGS) over the year 2016. The results showed that the GEOFT model has high accuracy, with bias and RMS of -0.3 and 3.9 cm, respectively, with respect to the GGOS ZTD data, and of -0.8 and 4.1 cm, respectively, with respect to the global IGS tropospheric product. The accuracy of GEOFT demonstrates that the use of the EOF analysis method to characterize ZTD variation is reasonable.
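A minimal sketch of the EOF decomposition that underlies a model of this kind: remove the time mean from gridded ZTD fields and take an SVD, so the spatial singular vectors play the role of the base functions Ek and the time expansion coefficients the role of Pk. The gridded data below are synthetic, not the GGOS Atmosphere product.

```python
# EOF decomposition sketch: anomalies of gridded ZTD fields -> SVD. The
# leading spatial singular vector stands in for E_1 and its time expansion
# coefficient for P_1. The grid and ZTD values are synthetic.
import numpy as np

rng = np.random.default_rng(7)
n_time, n_lat, n_lon = 365, 19, 36                # daily fields on a coarse grid
t = np.arange(n_time)

# Synthetic ZTD (meters): mean field plus an annual cycle with a latitudinal
# pattern, plus noise.
lat = np.linspace(-90, 90, n_lat)
pattern = np.cos(np.deg2rad(lat))[:, None] * np.ones((1, n_lon))
ztd = (2.3 + 0.05 * np.sin(2 * np.pi * t / 365.25)[:, None, None] * pattern
       + 0.005 * rng.normal(size=(n_time, n_lat, n_lon)))

# EOF analysis: anomalies -> (time x space) matrix -> SVD.
anomalies = (ztd - ztd.mean(axis=0)).reshape(n_time, -1)
U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)

E1 = Vt[0].reshape(n_lat, n_lon)    # leading spatial base function (E_1 analogue)
P1 = U[:, 0] * s[0]                 # associated time coefficient (P_1 analogue)
explained = s[0] ** 2 / np.sum(s ** 2)
print(f"EOF 1 explains {100 * explained:.1f}% of the anomaly variance")
```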
NASA Astrophysics Data System (ADS)
Razavi, Saman; Gupta, Hoshin V.
2015-05-01
Sensitivity analysis is an essential paradigm in Earth and Environmental Systems modeling. However, the term "sensitivity" has a clear definition, based in partial derivatives, only when specified locally around a particular point (e.g., optimal solution) in the problem space. Accordingly, no unique definition exists for "global sensitivity" across the problem space, when considering one or more model responses to different factors such as model parameters or forcings. A variety of approaches have been proposed for global sensitivity analysis, based on different philosophies and theories, and each of these formally characterizes a different "intuitive" understanding of sensitivity. These approaches focus on different properties of the model response at a fundamental level and may therefore lead to different (even conflicting) conclusions about the underlying sensitivities. Here we revisit the theoretical basis for sensitivity analysis, summarize and critically evaluate existing approaches in the literature, and demonstrate their flaws and shortcomings through conceptual examples. We also demonstrate the difficulty involved in interpreting "global" interaction effects, which may undermine the value of existing interpretive approaches. With this background, we identify several important properties of response surfaces that are associated with the understanding and interpretation of sensitivities in the context of Earth and Environmental System models. Finally, we highlight the need for a new, comprehensive framework for sensitivity analysis that effectively characterizes all of the important sensitivity-related properties of model response surfaces.
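The contrast drawn above between a locally defined, derivative-based sensitivity and the many notions of "global" sensitivity can be illustrated with a toy example: the two deliberately simple measures below (a finite-difference derivative at one point and the variance of the response over the whole factor range) rank the same two factors differently. Neither measure is a specific published method; both are stand-ins for illustration.

```python
# Toy illustration of local versus global sensitivity: a derivative evaluated
# at one point can disagree with a measure taken over the whole factor space.
import numpy as np

def response(x1, x2):
    # x1 acts only far from the origin; x2 acts everywhere but weakly.
    return np.sin(3 * x1) * (x1 > 0.5) + 0.3 * x2

# Local sensitivity: finite-difference partial derivatives at (0.1, 0.1).
h = 1e-5
x0 = (0.1, 0.1)
d_x1 = (response(x0[0] + h, x0[1]) - response(x0[0] - h, x0[1])) / (2 * h)
d_x2 = (response(x0[0], x0[1] + h) - response(x0[0], x0[1] - h)) / (2 * h)

# A crude global measure: variance of the response when each factor varies
# alone over [0, 1] with the other held at its midpoint.
grid = np.linspace(0.0, 1.0, 10_001)
g_x1 = response(grid, 0.5).var()
g_x2 = response(0.5, grid).var()

print(f"local  |dY/dx1| = {abs(d_x1):.2f}   |dY/dx2| = {abs(d_x2):.2f}")
print(f"global var(x1)  = {g_x1:.3f}  var(x2)  = {g_x2:.3f}")
```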
Global Atmospheric Chemistry/Transport Modeling and Data-Analysis
NASA Technical Reports Server (NTRS)
Prinn, Ronald G.
1999-01-01
This grant supported a global atmospheric chemistry/transport modeling and data-analysis project devoted to: (a) development, testing, and refining of inverse methods for determining regional and global transient source and sink strengths for trace gases; (b) utilization of these inverse methods, which use either the Model for Atmospheric Chemistry and Transport (MATCH), based on analyzed observed winds, or back-trajectories calculated from these same winds, for determining regional and global source and sink strengths for long-lived trace gases important in ozone depletion and the greenhouse effect; (c) determination of global (and perhaps regional) average hydroxyl radical concentrations using inverse methods with multiple "titrating" gases; and (d) computation of the lifetimes and spatially resolved destruction rates of trace gases using 3D models. Important ultimate goals included determination of regional source strengths of important biogenic/anthropogenic trace gases and also of halocarbons restricted by the Montreal Protocol and its follow-on agreements, and hydrohalocarbons now used as alternatives to the above restricted halocarbons.
Development and verification of local/global analysis techniques for laminated composites
NASA Technical Reports Server (NTRS)
Griffin, O. Hayden, Jr.
1989-01-01
Analysis and design methods for laminated composite materials have been the subject of considerable research over the past 20 years, and are currently well developed. In performing the detailed three-dimensional analyses which are often required in proximity to discontinuities, however, analysts often encounter difficulties due to large models. Even with the current availability of powerful computers, models which are too large to run, either from a resource or time standpoint, are often required. There are several approaches which can permit such analyses, including substructuring, use of superelements or transition elements, and the global/local approach. This effort is based on the so-called zoom technique to global/local analysis, where a global analysis is run, with the results of that analysis applied to a smaller region as boundary conditions, in as many iterations as is required to attain an analysis of the desired region. Before beginning the global/local analyses, it was necessary to evaluate the accuracy of the three-dimensional elements currently implemented in the Computational Structural Mechanics (CSM) Testbed. It was also desired to install, using the Experimental Element Capability, a number of displacement formulation elements which have well known behavior when used for analysis of laminated composites.
2-D to 3-D global/local finite element analysis of cross-ply composite laminates
NASA Technical Reports Server (NTRS)
Thompson, D. Muheim; Griffin, O. Hayden, Jr.
1990-01-01
An example of two-dimensional to three-dimensional global/local finite element analysis of a laminated composite plate with a hole is presented. The 'zoom' technique of global/local analysis is used, where displacements of the global/local interface from the two-dimensional global model are applied to the edges of the three-dimensional local model. Three different hole diameters, one, three, and six inches, are considered in order to compare the effect of hole size on the three-dimensional stress state around the hole. In addition, three different stacking sequences are analyzed for the six inch hole case in order to study the effect of stacking sequence. The existence of a 'critical' hole size, where the interlaminar stresses are maximum, is indicated. Dispersion of plies at the same angle, as opposed to clustering, is found to reduce the magnitude of some interlaminar stress components and increase others.
A global/local analysis method for treating details in structural design
NASA Technical Reports Server (NTRS)
Aminpour, Mohammad A.; Mccleary, Susan L.; Ransom, Jonathan B.
1993-01-01
A method for analyzing global/local behavior of plate and shell structures is described. In this approach, a detailed finite element model of the local region is incorporated within a coarser global finite element model. The local model need not be nodally compatible (i.e., need not have a one-to-one nodal correspondence) with the global model at their common boundary; therefore, the two models may be constructed independently. The nodal incompatibility of the models is accounted for by introducing appropriate constraint conditions into the potential energy in a hybrid variational formulation. The primary advantage of this method is that the need for transition modeling between global and local models is eliminated. Eliminating transition modeling has two benefits. First, modeling efforts are reduced since tedious and complex transitioning need not be performed. Second, errors due to the mesh distortion, often unavoidable in mesh transitioning, are minimized by avoiding distorted elements beyond what is needed to represent the geometry of the component. The method is applied to a plate loaded in tension and transverse bending. The plate has a central hole, and various hole sizes and shapes are studied. The method is also applied to a composite laminated fuselage panel with a crack emanating from a window in the panel. While this method is applied herein to global/local problems, it is also applicable to the coupled analysis of independently modeled components as well as adaptive refinement.
Expansion or extinction: deterministic and stochastic two-patch models with Allee effects.
Kang, Yun; Lanchier, Nicolas
2011-06-01
We investigate the impact of Allee effect and dispersal on the long-term evolution of a population in a patchy environment. Our main focus is on whether a population already established in one patch either successfully invades an adjacent empty patch or undergoes a global extinction. Our study is based on the combination of analytical and numerical results for both a deterministic two-patch model and a stochastic counterpart. The deterministic model has either two, three or four attractors. The existence of a regime with exactly three attractors only appears when patches have distinct Allee thresholds. In the presence of weak dispersal, the analysis of the deterministic model shows that a high-density and a low-density populations can coexist at equilibrium in nearby patches, whereas the analysis of the stochastic model indicates that this equilibrium is metastable, thus leading after a large random time to either a global expansion or a global extinction. Up to some critical dispersal, increasing the intensity of the interactions leads to an increase of both the basin of attraction of the global extinction and the basin of attraction of the global expansion. Above this threshold, for both the deterministic and the stochastic models, the patches tend to synchronize as the intensity of the dispersal increases. This results in either a global expansion or a global extinction. For the deterministic model, there are only two attractors, while the stochastic model no longer exhibits a metastable behavior. In the presence of strong dispersal, the limiting behavior is entirely determined by the value of the Allee thresholds as the global population size in the deterministic and the stochastic models evolves as dictated by their single-patch counterparts. For all values of the dispersal parameter, Allee effects promote global extinction in terms of an expansion of the basin of attraction of the extinction equilibrium for the deterministic model and an increase of the probability of extinction for the stochastic model.
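For concreteness, a deterministic two-patch model with Allee effects and linear dispersal can be integrated as below. The cubic Allee growth term and all parameter values are standard textbook choices used here for illustration; they are not taken from the paper above.

```python
# Sketch of a deterministic two-patch model with Allee effects coupled by
# dispersal, integrated with scipy. Model form and parameters are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

r, K = 1.0, 1.0          # growth rate and carrying capacity (both patches)
A1, A2 = 0.2, 0.35       # distinct Allee thresholds for the two patches
D = 0.05                 # dispersal intensity

def two_patch(t, u):
    n1, n2 = u
    # Cubic Allee growth: negative below the threshold, positive between
    # the threshold and the carrying capacity.
    growth1 = r * n1 * (n1 / A1 - 1.0) * (1.0 - n1 / K)
    growth2 = r * n2 * (n2 / A2 - 1.0) * (1.0 - n2 / K)
    exchange = D * (n2 - n1)
    return [growth1 + exchange, growth2 - exchange]

# Patch 1 starts established near carrying capacity, patch 2 nearly empty.
sol = solve_ivp(two_patch, (0.0, 200.0), [0.9, 0.01], dense_output=True)
n1_end, n2_end = sol.y[:, -1]
print(f"t=200: patch 1 ~ {n1_end:.2f}, patch 2 ~ {n2_end:.2f}")
```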
Balcan, Duygu; Gonçalves, Bruno; Hu, Hao; Ramasco, José J.; Colizza, Vittoria
2010-01-01
Here we present the Global Epidemic and Mobility (GLEaM) model that integrates sociodemographic and population mobility data in a spatially structured stochastic disease approach to simulate the spread of epidemics at the worldwide scale. We discuss the flexible structure of the model that is open to the inclusion of different disease structures and local intervention policies. This makes GLEaM suitable for the computational modeling and anticipation of the spatio-temporal patterns of global epidemic spreading, the understanding of historical epidemics, the assessment of the role of human mobility in shaping global epidemics, and the analysis of mitigation and containment scenarios. PMID:21415939
Accurate Realization of GPS Vertical Global Reference Frame
NASA Technical Reports Server (NTRS)
Elosegui, Pedro
2004-01-01
The few millimeter per year level accuracy of radial global velocity estimates with the Global Positioning System (GPS) is at least an order of magnitude poorer than the accuracy of horizontal global motions. An improvement in the accuracy of radial global velocities would have a very positive impact on a number of geophysical studies of current general interest such as global sea-level and climate change, coastal hazards, glacial isostatic adjustment, atmospheric and oceanic loading, glaciology and ice mass variability, tectonic deformation and volcanic inflation, and geoid variability. The goal of this project is to improve our current understanding of GPS error sources associated with estimates of radial velocities at global scales. GPS error sources relevant to this project can be classified in two broad categories: (1) those related to the analysis of the GPS phase observable, and (2) those related to the combination of the positions and velocities of a set of globally distributed stations as determined from the analysis of GPS data. Important aspects in the first category include the effect on vertical rate estimates due to standard analysis choices, such as orbit modeling, network geometry, ambiguity resolution, as well as errors in models (or simply the lack of models) for clocks, multipath, phase-center variations, atmosphere, and solid-Earth tides. The second category includes the possible methods of combining and defining terrestrial reference frames for determining vertical velocities on a global scale. The latter has been the subject of our research activities during this reporting period.
Modeling the plasmasphere based on LEO satellites onboard GPS measurements
NASA Astrophysics Data System (ADS)
Chen, Peng; Yao, Yibin; Li, Qinzheng; Yao, Wanqiang
2017-01-01
The plasmasphere, which is located above the ionosphere, is a significant component of Earth's atmosphere. A global plasmaspheric model was constructed using the total electron content (TEC) along the signal propagation path calculated using onboard Global Positioning System observations from the Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) and MetOp-A, provided by the COSMIC Data Analysis and Archive Center (CDAAC). First, the global plasmaspheric model was established using only COSMIC TEC, and a set of MetOp-A TEC provided by CDAAC served for external evaluation. Results indicated that the established model using only COSMIC data is highly accurate. Then, COSMIC and MetOp-A TEC were combined to produce a new global plasmaspheric model. Finally, the variational characteristics of global plasmaspheric electron content with latitude, local time, and season were investigated using the global plasmaspheric model established in this paper.
The statistical analysis of global climate change studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardin, J.W.
1992-01-01
The focus of this work is to contribute to the enhancement of the relationship between climatologists and statisticians. The analysis of global change data has been underway for many years by atmospheric scientists. Much of this analysis includes a heavy reliance on statistics and statistical inference. Some specific climatological analyses are presented and the dependence on statistics is documented before the analysis is undertaken. The first problem presented involves the fluctuation-dissipation theorem and its application to global climate models. This problem has a sound theoretical niche in the literature of both climate modeling and physics, but a statistical analysis in which the data is obtained from the model to show graphically the relationship has not been undertaken. It is under this motivation that the author presents this problem. A second problem concerning the standard errors in estimating global temperatures is purely statistical in nature although very little material exists for sampling on such a frame. This problem not only has climatological and statistical ramifications, but political ones as well. It is planned to use these results in a further analysis of global warming using actual data collected on the earth. In order to simplify the analysis of these problems, the development of a computer program, MISHA, is presented. This interactive program contains many of the routines, functions, graphics, and map projections needed by the climatologist in order to effectively enter the arena of data visualization.
NASA Astrophysics Data System (ADS)
Hameed, M.; Demirel, M. C.; Moradkhani, H.
2015-12-01
The Global Sensitivity Analysis (GSA) approach helps identify the effectiveness of model parameters or inputs and thus provides essential information about the model performance. In this study, the effects of the Sacramento Soil Moisture Accounting (SAC-SMA) model parameters, forcing data, and initial conditions are analysed by using two GSA methods: Sobol' and Fourier Amplitude Sensitivity Test (FAST). The simulations are carried out over five sub-basins within the Columbia River Basin (CRB) for three different periods: one-year, four-year, and seven-year. Four factors are considered and evaluated by using the two sensitivity analysis methods: the simulation length, parameter range, model initial conditions, and the reliability of the global sensitivity analysis methods. The reliability of the sensitivity analysis results is compared based on 1) the agreement between the two sensitivity analysis methods (Sobol' and FAST) in terms of highlighting the same parameters or input as the most influential parameters or input and 2) how consistently the methods rank these sensitive parameters under the same conditions (sub-basins and simulation length). The results show the coherence between the Sobol' and FAST sensitivity analysis methods. Additionally, it is found that the FAST method is sufficient to evaluate the main effects of the model parameters and inputs. Another conclusion of this study is that the smaller the parameter or initial-condition ranges, the more consistent and coherent the results of the two sensitivity analysis methods.
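As an illustration of the Sobol'/FAST comparison described above, the sketch below computes both sets of sensitivity indices with the SALib package for a stand-in toy function; the parameter names, bounds, and model are illustrative assumptions, and SAC-SMA itself is not run here.

```python
import numpy as np
from SALib.sample import saltelli, fast_sampler
from SALib.analyze import sobol, fast

# Compare Sobol' and FAST sensitivity rankings on a toy stand-in model.
problem = {
    "num_vars": 3,
    "names": ["UZTWM", "LZTWM", "PCTIM"],   # hypothetical SAC-SMA-like names
    "bounds": [[1.0, 150.0], [1.0, 500.0], [0.0, 0.1]],
}

def toy_model(X):
    # Includes an interaction term so total-order indices exceed first-order ones.
    return np.sin(X[:, 0] / 50.0) + 0.5 * X[:, 1] / 500.0 + X[:, 0] * X[:, 2]

# Sobol' indices via Saltelli sampling
Xs = saltelli.sample(problem, 1024)
Si_sobol = sobol.analyze(problem, toy_model(Xs))

# FAST indices
Xf = fast_sampler.sample(problem, 1024)
Si_fast = fast.analyze(problem, toy_model(Xf))

for name, s1, st, f1 in zip(problem["names"], Si_sobol["S1"],
                            Si_sobol["ST"], Si_fast["S1"]):
    print(f"{name:6s}  Sobol S1={s1:5.2f}  Sobol ST={st:5.2f}  FAST S1={f1:5.2f}")
```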
GCIP water and energy budget synthesis (WEBS)
Roads, J.; Lawford, R.; Bainto, E.; Berbery, E.; Chen, S.; Fekete, B.; Gallo, K.; Grundstein, A.; Higgins, W.; Kanamitsu, M.; Krajewski, W.; Lakshmi, V.; Leathers, D.; Lettenmaier, D.; Luo, L.; Maurer, E.; Meyers, T.; Miller, D.; Mitchell, Ken; Mote, T.; Pinker, R.; Reichler, T.; Robinson, D.; Robock, A.; Smith, J.; Srinivasan, G.; Verdin, K.; Vinnikov, K.; Vonder, Haar T.; Vorosmarty, C.; Williams, S.; Yarosh, E.
2003-01-01
As part of the World Climate Research Program's (WCRP's) Global Energy and Water-Cycle Experiment (GEWEX) Continental-scale International Project (GCIP), a preliminary water and energy budget synthesis (WEBS) was developed for the period 1996-1999 from the "best available" observations and models. Besides this summary paper, a companion CD-ROM with more extensive discussion, figures, tables, and raw data is available to the interested researcher from the GEWEX project office, the GAPP project office, or the first author. An updated online version of the CD-ROM is also available at http://ecpc.ucsd.edu/gcip/webs.htm/. Observations cannot adequately characterize or "close" budgets since too many fundamental processes are missing. Models that properly represent the many complicated atmospheric and near-surface interactions are also required. This preliminary synthesis therefore included a representative global general circulation model, regional climate model, and a macroscale hydrologic model as well as a global reanalysis and a regional analysis. Judging by the qualitative agreement among the models and available observations, it did appear that we now qualitatively understand water and energy budgets of the Mississippi River Basin. However, there is still much quantitative uncertainty. In that regard, there did appear to be a clear advantage to using a regional analysis over a global analysis or a regional simulation over a global simulation to describe the Mississippi River Basin water and energy budgets. There also appeared to be some advantage to using a macroscale hydrologic model for at least the surface water budgets. Copyright 2003 by the American Geophysical Union.
Long-Term Global Morphology of Gravity Wave Activity Using UARS Data
NASA Technical Reports Server (NTRS)
Eckermann, Stephen D.; Bacmeister, Julio T.; Wu, Dong L.
1998-01-01
Progress in research into the global morphology of gravity wave activity using UARS data is described for the period March-June, 1998. Highlights this quarter include further progress in the analysis and interpretation of CRISTA temperature variances; model-generated climatologies of mesospheric gravity wave activity using the HWM-93 wind and temperature model; and modeling of gravity wave detection from space-based platforms. Preliminary interpretations and recommended avenues for further analysis are also described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kang, Shujiang; Kline, Keith L; Nair, S. Surendran
A global energy crop productivity model that provides geospatially explicit quantitative details on biomass potential and factors affecting sustainability would be useful, but does not exist now. This study describes a modeling platform capable of meeting many challenges associated with global-scale agro-ecosystem modeling. We designed an analytical framework for bioenergy crops consisting of six major components: (i) standardized natural resources datasets, (ii) global field-trial data and crop management practices, (iii) simulation units and management scenarios, (iv) model calibration and validation, (v) high-performance computing (HPC) simulation, and (vi) simulation output processing and analysis. The HPC-Environmental Policy Integrated Climate (HPC-EPIC) model simulated a perennial bioenergy crop, switchgrass (Panicum virgatum L.), estimating feedstock production potentials and effects across the globe. This modeling platform can assess soil C sequestration, net greenhouse gas (GHG) emissions, nonpoint source pollution (e.g., nutrient and pesticide loss), and energy exchange with the atmosphere. It can be expanded to include additional bioenergy crops (e.g., miscanthus, energy cane, and agave) and food crops under different management scenarios. The platform and switchgrass field-trial dataset are available to support global analysis of biomass feedstock production potential and corresponding metrics of sustainability.
Requirements for next generation global flood inundation models
NASA Astrophysics Data System (ADS)
Bates, P. D.; Neal, J. C.; Smith, A.; Sampson, C. C.
2016-12-01
In this paper we review the current status of global hydrodynamic models for flood inundation prediction and highlight recent successes and current limitations. Building on this analysis we then go on to consider what is required to develop the next generation of such schemes and show that to achieve this a number of fundamental science problems will need to be overcome. New data sets and new types of analysis will be required, and we show that these will only partially be met by currently planned satellite missions and data collection initiatives. A particular example is the quality of available global Digital Elevation data. The current best data set for flood modelling, SRTM, is only available at a relatively modest 30m resolution, contains pixel-to-pixel noise of 6m and is corrupted by surface artefacts. Creative processing techniques have sought to address these issues with some success, but fundamentally the quality of the available global terrain data limits flood modelling and needs to be overcome. Similar arguments can be made for many other elements of global hydrodynamic models including their bathymetry data, boundary conditions, flood defence information and model validation data. We therefore systematically review each component of global flood models and document whether planned new technology will solve current limitations and, if not, what exactly will be required to do so.
Fletcher, Patrick; Bertram, Richard; Tabak, Joel
2016-06-01
Models of electrical activity in excitable cells involve nonlinear interactions between many ionic currents. Changing parameters in these models can produce a variety of activity patterns with sometimes unexpected effects. Furthermore, introducing new currents will have different effects depending on the initial parameter set. In this study we combined global sampling of parameter space and local analysis of representative parameter sets in a pituitary cell model to understand the effects of adding K(+) conductances, which mediate some effects of hormone action on these cells. Global sampling ensured that the effects of introducing K(+) conductances were captured across a wide variety of contexts of model parameters. For each type of K(+) conductance we determined the types of behavioral transition that it evoked. Some transitions were counterintuitive, and may have been missed without the use of global sampling. In general, the wide range of transitions that occurred when the same current was applied to the model cell at different locations in parameter space highlights the challenge of making accurate model predictions in light of cell-to-cell heterogeneity. Finally, we used bifurcation analysis and fast/slow analysis to investigate why specific transitions occur in representative individual models. This approach relies on the use of a graphics processing unit (GPU) to quickly map parameter space to model behavior and identify parameter sets for further analysis. Acceleration with modern low-cost GPUs is particularly well suited to exploring the moderate-sized (5-20) parameter spaces of excitable cell and signaling models.
Surface Current Skill Assessment of Global and Regional forecast models.
NASA Astrophysics Data System (ADS)
Allen, A. A.
2016-02-01
The U.S. Coast Guard has been using SAROPS since January 2007 at all fifty of its operational centers to plan search and rescue missions. SAROPS relies on an Environmental Data Server (EDS) that integrates global, national, and regional ocean and meteorological observation and forecast data. The server manages spatial and temporal aggregation of hindcast, nowcast, and forecast data so the SAROPS controller has the best available data for search planning. The EDS harvests a wide range of global and regional forecasts and data, including NOAA NCEP's global HYCOM model (RTOFS), the U.S. Navy's Global HYCOM model, the 5 NOAA NOS Great Lakes models and a suite of other regional forecasts from NOS and IOOS Regional Associations. The EDS also integrates surface drifter data as the U.S. Coast Guard regularly deploys Self-Locating Datum Marker Buoys (SLDMBs) during SAR cases and a significant set of drifter data has been collected and the archive continues to grow. This data is critically useful during real-time SAR planning, but also represents a valuable scientific dataset for analyzing surface currents. In 2014, a new initiative was started by the U.S. Coast Guard to evaluate the skill of the various models to support the decision making process during search and rescue planning. This analysis falls into two categories: historical analysis of drifter tracks and model predictions to provide skill assessment of models in different regions, and real-time analysis of models and drifter tracks during a SAR incident. The EDS, using the approach of Liu and Weisberg (2014), autonomously determines surface skill measurements of the co-located models' simulated surface trajectories versus the actual drift of the SLDMBs (CODE/Davis style surface drifters GPS-positioned at 30-minute intervals). Surface skill measurements are archived in a database and are user-retrievable by lat/long/time cubes. This paper will focus on the comparison of models in the period from 23 August to 21 September 2015. Surface skill was determined for the following regions: California Coast, Gulf of Mexico, South and Mid Atlantic Bights. Skill was determined for the two versions of the NCEP Global RTOFS, the Navy's Global HYCOM model, and where appropriate, the local regional models.
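The sketch below shows one way to compute a normalized-cumulative-separation skill score for a simulated trajectory against an observed drifter track, in the spirit of the Liu and Weisberg approach cited above; the exact normalization and tolerance used by the EDS are assumptions here, and the positions are toy coordinates.

```python
import numpy as np

# Trajectory skill score from normalized cumulative separation (illustrative).
def skill_score(obs, sim, tolerance=1.0):
    """obs, sim: arrays of shape (n, 2) of positions sampled at the same times."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    sep = np.linalg.norm(sim - obs, axis=1)             # separation at each fix
    seg = np.linalg.norm(np.diff(obs, axis=0), axis=1)  # observed segment lengths
    traj_len = np.cumsum(seg)                           # cumulative drifter path length
    s = np.cumsum(sep[1:]) / traj_len                   # normalized cumulative separation
    ss = 1.0 - s[-1] / tolerance                        # tolerance value is an assumption
    return max(ss, 0.0)                                 # clip: no skill below zero

# Toy drifter track and simulated trajectory (positions in kilometres).
obs = np.array([[0.0, 0.0], [1.0, 0.2], [2.0, 0.5], [3.0, 0.9]])
sim = np.array([[0.0, 0.0], [1.1, 0.1], [2.3, 0.3], [3.6, 0.6]])
print(f"skill score: {skill_score(obs, sim):.2f}")
```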
Selecting global climate models for regional climate change studies
Pierce, David W.; Barnett, Tim P.; Santer, Benjamin D.; Gleckler, Peter J.
2009-01-01
Regional or local climate change modeling studies currently require starting with a global climate model, then downscaling to the region of interest. How should global models be chosen for such studies, and what effect do such choices have? This question is addressed in the context of a regional climate detection and attribution (D&A) study of January-February-March (JFM) temperature over the western U.S. Models are often selected for a regional D&A analysis based on the quality of the simulated regional climate. Accordingly, 42 performance metrics based on seasonal temperature and precipitation, the El Nino/Southern Oscillation (ENSO), and the Pacific Decadal Oscillation are constructed and applied to 21 global models. However, no strong relationship is found between the score of the models on the metrics and results of the D&A analysis. Instead, the importance of having ensembles of runs with enough realizations to reduce the effects of natural internal climate variability is emphasized. Also, the superiority of the multimodel ensemble average (MM) to any one individual model, already found in global studies examining the mean climate, is true in this regional study that includes measures of variability as well. Evidence is shown that this superiority is largely caused by the cancellation of offsetting errors in the individual global models. Results with both the MM and models picked randomly confirm the original D&A results of anthropogenically forced JFM temperature changes in the western U.S. Future projections of temperature do not depend on model performance until the 2080s, after which the better performing models show warmer temperatures. PMID:19439652
Should precipitation influence dust emission in global dust models?
NASA Astrophysics Data System (ADS)
Okin, Gregory
2016-04-01
Soil moisture modulates the threshold shear stress required to initiate aeolian transport and dust emission. Most of the theoretical and laboratory work that has confirmed the impact of soil moisture has appropriately acknowledged that it is the soil moisture of a surface layer a few grain diameters thick that truly controls threshold shear velocity. Global and regional models of dust emission include the effect of soil moisture on transport threshold, but most ignore the fact that only the moisture of the very topmost "active layer" matters. The soil moisture in the active layer can differ greatly from that integrated through the top 2, 5, 10, or 100 cm (surface layers used by various global models) because the top 2 mm of heavy-textured soils dries within ~1/2 day while that of sandy soils dries in less than 2 hours. Thus, in drylands where dust emission occurs, it is likely that this top layer is drier than the underlying soil in the days and weeks after rain. This paper explores, globally, the time between rain events in relation to the time for the active layer to dry and the timing of high wind events. This analysis is carried out using the same coarse reanalyses used in global dust models and is intended to inform the soil moisture controls in these models. The results of this analysis indicate that the time between rain events is, in almost all dust-producing areas, significantly longer than the drying time of the active layer, even when considering soil texture differences. Further, the analysis shows that the probability of a high wind event during the period after a rain where the surface is wet is small. Therefore, in coarse global models, there is little reason to include rain-derived soil moisture in the modeling scheme.
Real-Time Onboard Global Nonlinear Aerodynamic Modeling from Flight Data
NASA Technical Reports Server (NTRS)
Brandon, Jay M.; Morelli, Eugene A.
2014-01-01
Flight test and modeling techniques were developed to accurately identify global nonlinear aerodynamic models onboard an aircraft. The techniques were developed and demonstrated during piloted flight testing of an Aermacchi MB-326M Impala jet aircraft. Advanced piloting techniques and nonlinear modeling techniques based on fuzzy logic and multivariate orthogonal function methods were implemented with efficient onboard calculations and flight operations to achieve real-time maneuver monitoring and analysis, and near-real-time global nonlinear aerodynamic modeling and prediction validation testing in flight. Results demonstrated that global nonlinear aerodynamic models for a large portion of the flight envelope were identified rapidly and accurately using piloted flight test maneuvers during a single flight, with the final identified and validated models available before the aircraft landed.
The GEOS Ozone Data Assimilation System: Specification of Error Statistics
NASA Technical Reports Server (NTRS)
Stajner, Ivanka; Riishojgaard, Lars Peter; Rood, Richard B.
2000-01-01
A global three-dimensional ozone data assimilation system has been developed at the Data Assimilation Office of the NASA/Goddard Space Flight Center. The Total Ozone Mapping Spectrometer (TOMS) total ozone and the Solar Backscatter Ultraviolet (SBUV) or (SBUV/2) partial ozone profile observations are assimilated. The assimilation, into an off-line ozone transport model, is done using the global Physical-space Statistical Analysis Scheme (PSAS). This system became operational in December 1999. A detailed description of the statistical analysis scheme, and in particular, the forecast and observation error covariance models is given. A new global anisotropic horizontal forecast error correlation model accounts for a varying distribution of observations with latitude. Correlations are largest in the zonal direction in the tropics where data is sparse. The forecast error variance is modeled as proportional to the ozone field. The forecast error covariance parameters were determined by maximum likelihood estimation. The error covariance models are validated using chi-squared statistics. The analyzed ozone fields in the winter of 1992 are validated against independent observations from ozone sondes and the Halogen Occultation Experiment (HALOE). There is better than 10% agreement between mean HALOE and analysis fields between 70 and 0.2 hPa. The global root-mean-square (RMS) difference between TOMS observed and forecast values is less than 4%. The global RMS difference between SBUV observed and analyzed ozone between 50 and 3 hPa is less than 15%.
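For orientation, the following sketch performs a generic statistical (optimal interpolation) analysis update of the kind underlying such assimilation systems, with the forecast error variance taken proportional to the ozone field as described above. The grid size, observation operator, and covariance parameters are illustrative assumptions, not the operational PSAS configuration.

```python
import numpy as np

# Generic statistical analysis update: xa = xb + K (y - H xb),
# with gain K = B H^T (H B H^T + R)^-1. All numbers are illustrative.
n, m = 10, 4                       # state size, number of observations
xb = np.full(n, 300.0)             # background ozone (Dobson-unit-like values)
H = np.zeros((m, n))               # observation operator: sample 4 grid points
H[np.arange(m), [1, 3, 6, 8]] = 1.0
y = np.array([310.0, 305.0, 295.0, 290.0])

# Forecast error covariance: variance proportional to the field, with an
# isotropic correlation that decays with grid distance (assumed shape).
var_b = 0.04 * xb                  # hypothetical proportionality factor
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
corr = np.exp(-(dist / 3.0) ** 2)
B = np.sqrt(np.outer(var_b, var_b)) * corr
R = np.diag(np.full(m, 4.0))       # observation error covariance

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
xa = xb + K @ (y - H @ xb)
print("analysis:", np.round(xa, 1))
```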
Flight motor set 360L001 (STS-26R). (Reconstructed dynamic loads analysis)
NASA Technical Reports Server (NTRS)
Call, V. B.
1989-01-01
A transient analysis was performed to correlate the predicted versus measured behavior of the Redesigned Solid Rocket Booster (RSRB) during Flight 360L001 (STS-26R) liftoff. Approximately 9 accelerometers, 152 strain gages, and 104 girth gages were bonded to the motors during this event. Prior to Flight 360L001, a finite element model of the RSRB was analyzed to predict the accelerations, strains, and displacements measured by this developmental flight instrumentation (DFI) within an order of magnitude. Subsequently, an analysis has been performed which uses actual Flight 360L001 liftoff loading conditions, and makes more precise predictions for the RSRB structural behavior. Essential information describing the analytical model, the analytical techniques used, the correlation of the predicted versus measured RSRB behavior, and conclusions is presented. A detailed model of the RSRB was developed and correlated for use in analyzing the motor behavior during liftoff loading conditions. This finite element model, referred to as the RSRB global model, uses super-element techniques to model all components of the RSRB. The objective of the RSRB global model is to accurately predict deflections and gap openings in the field joints to an accuracy of approximately 0.001 inch. The model of the field joint component was correlated to Referee and Joint Environment Simulation (JES) tests. The accuracy of the assembled RSRB global model was validated by correlation to static-fire tests such as DM-8, DM-9, QM-7, and QM-8. This validated RSRB global model was used to predict RSRB structural behavior and joint gap opening during Flight 360L001 liftoff. The results of a transient analysis of the RSRB global model with imposed liftoff loading conditions are presented. Rockwell used many gage measurements to reconstruct the load parameters which were imposed on the RSRB during the Flight 360L001 liftoff. Each load parameter, and its application, is described. Also presented are conclusions and recommendations based on the analysis of this load case and the resulting correlation between predicted and measured RSRB structural behavior.
Spatially explicit modeling of particulate nutrient flux in Large global rivers
NASA Astrophysics Data System (ADS)
Cohen, S.; Kettner, A.; Mayorga, E.; Harrison, J. A.
2017-12-01
Water, sediment, nutrient and carbon fluxes along river networks have undergone considerable alterations in response to anthropogenic and climatic changes, with significant consequences to infrastructure, agriculture, water security, ecology and geomorphology worldwide. However, in a global setting, these changes in fluvial fluxes and their spatial and temporal characteristics are poorly constrained, due to the limited availability of continuous and long-term observations. We present results from a new global-scale particulate modeling framework (WBMsedNEWS) that combines the Global NEWS watershed nutrient export model with the spatially distributed WBMsed water and sediment model. We compare the model predictions against multiple observational datasets. The results indicate that the model is able to accurately predict particulate nutrient (Nitrogen, Phosphorus and Organic Carbon) fluxes on an annual time scale. Analysis of intra-basin nutrient dynamics and fluxes to global oceans is presented.
NASA Astrophysics Data System (ADS)
Meng, J.; Mitchell, K.; Wei, H.; Yang, R.; Kumar, S.; Geiger, J.; Xie, P.
2008-05-01
Over the past several years, the Environmental Modeling Center (EMC) of the National Centers for Environmental Prediction (NCEP) of the U.S. National Weather Service has developed a Global Land Data Assimilation System (GLDAS). For its computational infrastructure, the GLDAS applies the NASA Land Information System (LIS), developed by the Hydrological Science Branch of NASA Goddard Space Flight Center. The land model utilized in the NCEP GLDAS is the NCEP Noah Land Surface Model (Noah LSM). This presentation will 1) describe how the GLDAS component has been included in the development of NCEP's third global reanalysis (with special attention to the input sources of global precipitation), and 2) present results from the GLDAS component of pilot tests of the new NCEP global reanalysis. Unlike NCEP's past two global reanalysis projects, this new NCEP global reanalysis includes both a global land data assimilation system (GLDAS) and a global ocean data assimilation system (GODAS). The new global reanalysis will span 30 years (1979-2008) and will include a companion realtime operational component. The atmospheric, ocean, and land states of this global reanalysis will provide the initial conditions for NCEP's 3rd-generation global coupled Climate Forecast System (CFS). NCEP is now preparing to launch a 28-year seasonal reforecast project with its new CFS, to provide the reforecast foundation for operational NCEP seasonal climate forecasts using the new CFS. Together, the new global reanalysis and companion CFS reforecasts constitute what NCEP calls the Climate Forecast System Reanalysis and Reforecast (CFSRR) project. Compared to the previous two generations of NCEP global reanalysis, the hallmark of the GLDAS component of CFSRR is the GLDAS's use of global analyses of observed precipitation to drive the land surface component of the reanalysis (rather than the typical reanalysis approach of using precipitation from the assimilating background atmospheric model). Specifically, the GLDAS merges two global analyses of observed precipitation produced by the Climate Prediction Center (CPC) of NCEP, as follows: 1) a new CPC daily gauge-only land-only global precipitation analysis at 0.5-degree resolution and 2) the well-known CPC CMAP global 2.0 x 2.5 degree 5-day precipitation analysis, which utilizes satellite estimates of precipitation, as well as some gauge observations. The presentation will describe how these two analyses are merged with latitude-dependent weights that favor the gauge-only analysis in mid-latitudes and the satellite-dominated CMAP analysis in tropical latitudes. Finally, we will show some impacts of using GLDAS to initialize the land states of seasonal CFS reforecasts, versus using the previous generation of NCEP global reanalysis as the source for CFS initial land states.
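A minimal sketch of the latitude-dependent blending described above is given below: a gauge-only analysis is weighted more heavily in mid-latitudes and a satellite-based analysis in the tropics. The weight function and the stand-in precipitation fields are illustrative assumptions, not the operational GLDAS formula.

```python
import numpy as np

# Blend two precipitation analyses with latitude-dependent weights (illustrative).
rng = np.random.default_rng(0)
lats = np.linspace(-90.0, 90.0, 361)             # 0.5-degree latitude grid
gauge = rng.gamma(2.0, 1.5, lats.size)           # stand-in gauge-only analysis (mm/day)
cmap_like = rng.gamma(2.0, 1.5, lats.size)       # stand-in satellite-based analysis

# Weight on the gauge analysis: small in the tropics, approaching 1 poleward.
w_gauge = np.clip((np.abs(lats) - 25.0) / 20.0, 0.0, 1.0)
merged = w_gauge * gauge + (1.0 - w_gauge) * cmap_like

for lat in (0, 30, 60):
    i = np.argmin(np.abs(lats - lat))
    print(f"lat {lat:+3d}: gauge weight = {w_gauge[i]:.2f}, merged = {merged[i]:.2f} mm/day")
```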
Superspace and global stability in general relativity
NASA Astrophysics Data System (ADS)
Gurzadyan, A. V.; Kocharyan, A. A.
A framework is developed enabling the global analysis of the stability of cosmological models using the local geometric characteristics of the infinite-dimensional superspace, i.e. using the generalized Jacobi equation reformulated for pseudo-Riemannian manifolds. We give a direct formalism for dynamical analysis in the superspace, and the equation required for stability analysis of the universe is derived by means of the generalized covariant and Fermi derivatives. Then, the relevant definitions and formulae are retrieved for cosmological models with a scalar field.
NASA Astrophysics Data System (ADS)
Gu, Huaying; Liu, Zhixue; Weng, Yingliang
2017-04-01
The present study applies the multivariate generalized autoregressive conditional heteroscedasticity (MGARCH) with spatial effects approach for the analysis of the time-varying conditional correlations and contagion effects among global real estate markets. A distinguishing feature of the proposed model is that it can simultaneously capture the spatial interactions and the dynamic conditional correlations compared with the traditional MGARCH models. Results reveal that the estimated dynamic conditional correlations have exhibited significant increases during the global financial crisis from 2007 to 2009, thereby suggesting contagion effects among global real estate markets. The analysis further indicates that the returns of the regional real estate markets that are in close geographic and economic proximity exhibit strong co-movement. In addition, evidence of significantly positive leverage effects in global real estate markets is also determined. The findings have significant implications on global portfolio diversification opportunities and risk management practices.
NASA Technical Reports Server (NTRS)
Armstrong, Richard; Hardman, Molly
1991-01-01
A snow model that supports the daily, operational analysis of global snow depth and age has been developed. It provides improved spatial interpolation of surface reports by incorporating digital elevation data, and by the application of regionalized variables (kriging) through the use of a global snow depth climatology. Where surface observations are inadequate, the model applies satellite remote sensing. Techniques for extrapolation into data-void mountain areas and a procedure to compute snow melt are also contained in the model.
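A minimal sketch of the regionalized-variables (kriging) interpolation step is shown below using the pykrige package (assumed available); the station coordinates and snow depths are synthetic stand-ins rather than operational surface reports.

```python
import numpy as np
from pykrige.ok import OrdinaryKriging  # assumes the pykrige package is installed

# Interpolate scattered snow-depth reports onto a regular grid with ordinary kriging.
rng = np.random.default_rng(0)
lon = rng.uniform(-110.0, -100.0, 40)
lat = rng.uniform(40.0, 50.0, 40)
depth_cm = 5.0 * (lat - 40.0) + rng.normal(0.0, 3.0, 40)   # deeper snow to the north

ok = OrdinaryKriging(lon, lat, depth_cm, variogram_model="spherical")
grid_lon = np.arange(-110.0, -100.0, 0.5)
grid_lat = np.arange(40.0, 50.0, 0.5)
grid_depth, variance = ok.execute("grid", grid_lon, grid_lat)

print("gridded snow depth shape (lat, lon):", grid_depth.shape)
```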
Global Sensitivity Analysis for Process Identification under Model Uncertainty
NASA Astrophysics Data System (ADS)
Ye, M.; Dai, H.; Walker, A. P.; Shi, L.; Yang, J.
2015-12-01
The environmental system consists of various physical, chemical, and biological processes, and environmental models are always built to simulate these processes and their interactions. For model building, improvement, and validation, it is necessary to identify important processes so that limited resources can be used to better characterize the processes. While global sensitivity analysis has been widely used to identify important processes, the process identification is always based on deterministic process conceptualization that uses a single model for representing a process. However, environmental systems are complex, and it often happens that a single process may be simulated by multiple alternative models. Ignoring the model uncertainty in process identification may lead to biased identification, in that processes identified as important may not be so in the real world. This study addresses this problem by developing a new method of global sensitivity analysis for process identification. The new method is based on the concept of Sobol sensitivity analysis and model averaging. Similar to the Sobol sensitivity analysis used to identify important parameters, our new method evaluates the variance change when a process is fixed at its different conceptualizations. The variance considers both parametric and model uncertainty using the method of model averaging. The method is demonstrated using a synthetic study of groundwater modeling that considers a recharge process and a parameterization process. Each process has two alternative models. Important processes of groundwater flow and transport are evaluated using our new method. The method is mathematically general, and can be applied to a wide range of environmental problems.
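The sketch below illustrates the idea of a process-sensitivity index: the share of output variance explained by switching between alternative conceptualizations of one process, with model-averaging weights and parametric sampling within each conceptualization. The toy groundwater response, recharge models, and weights are illustrative assumptions, not the synthetic study described in the abstract.

```python
import numpy as np

# Sobol-like process sensitivity under model uncertainty (illustrative).
rng = np.random.default_rng(1)
n = 5000

def head_model(recharge, conductivity):
    # Toy groundwater response standing in for the flow/transport model.
    return recharge / conductivity

# Two alternative conceptualizations of the recharge process, with weights.
recharge_models = [
    ("constant", lambda: rng.uniform(0.2, 0.4, n)),
    ("seasonal", lambda: rng.gamma(2.0, 0.15, n)),
]
weights = np.array([0.5, 0.5])

conductivity = rng.lognormal(mean=0.0, sigma=0.3, size=n)

# Output samples and conditional means with the recharge process fixed at each model.
cond_means, samples = [], []
for _, sampler in recharge_models:
    out = head_model(sampler(), conductivity)
    cond_means.append(out.mean())
    samples.append(out)

cond_means = np.array(cond_means)
total_mean = np.sum(weights * cond_means)
# Law of total variance: weighted within-model variance plus between-model variance.
total_var = np.sum(weights * np.array([s.var() + (m - total_mean) ** 2
                                       for s, m in zip(samples, cond_means)]))

# Variance explained by switching recharge conceptualizations, normalized.
process_index = np.sum(weights * (cond_means - total_mean) ** 2) / total_var
print(f"recharge process sensitivity index: {process_index:.3f}")
```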
Global spatiotemporal distribution of soil respiration modeled using a global database
NASA Astrophysics Data System (ADS)
Hashimoto, S.; Carvalhais, N.; Ito, A.; Migliavacca, M.; Nishina, K.; Reichstein, M.
2015-07-01
The flux of carbon dioxide from the soil to the atmosphere (soil respiration) is one of the major fluxes in the global carbon cycle. At present, the accumulated field observation data cover a wide range of geographical locations and climate conditions. However, there are still large uncertainties in the magnitude and spatiotemporal variation of global soil respiration. Using a global soil respiration data set, we developed a climate-driven model of soil respiration by modifying and updating Raich's model, and the global spatiotemporal distribution of soil respiration was examined using this model. The model was applied at a spatial resolution of 0.5° and a monthly time step. Soil respiration was divided into the heterotrophic and autotrophic components of respiration using an empirical model. The estimated mean annual global soil respiration was 91 Pg C yr-1 (between 1965 and 2012; Monte Carlo 95% confidence interval: 87-95 Pg C yr-1) and increased at the rate of 0.09 Pg C yr-2. The contribution of soil respiration from boreal regions to the total increase in global soil respiration was on the same order of magnitude as that of tropical and temperate regions, despite a lower absolute magnitude of soil respiration in boreal regions. The estimated annual global heterotrophic respiration and global autotrophic respiration were 51 and 40 Pg C yr-1, respectively. The global soil respiration responded to the increase in air temperature at the rate of 3.3 Pg C yr-1 °C-1, and Q10 = 1.4. Our study scaled up observed soil respiration values from field measurements to estimate global soil respiration and provide a data-oriented estimate of global soil respiration. The estimates are based on a semi-empirical model parameterized with over one thousand data points. Our analysis indicates that the climate controls on soil respiration may translate into an increasing trend in global soil respiration and our analysis emphasizes the relevance of the soil carbon flux from soil to the atmosphere in response to climate change. Further approaches should additionally focus on climate controls in soil respiration in combination with changes in vegetation dynamics and soil carbon stocks, along with their effects on the long temporal dynamics of soil respiration. We expect that these spatiotemporal estimates will provide a benchmark for future studies and also help to constrain process-oriented models.
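The temperature response implied by the reported Q10 can be written as a simple scaling law; the sketch below evaluates it around an assumed reference temperature (the 13 °C reference value and the use of the global total as the reference flux are illustrative assumptions).

```python
# Q10 temperature response: respiration scales by a factor Q10 per 10 °C of warming.
def soil_respiration(temp_c, r_ref=91.0, t_ref=13.0, q10=1.4):
    """Soil respiration (Pg C yr-1) at temperature temp_c; reference values are assumed."""
    return r_ref * q10 ** ((temp_c - t_ref) / 10.0)

for dt in (0.0, 0.5, 1.0):
    print(f"+{dt:.1f} degC -> {soil_respiration(13.0 + dt):.1f} Pg C yr-1")
# Near the reference, the slope is roughly r_ref * ln(q10) / 10, i.e. about
# 3 Pg C yr-1 per degC, of the same order as the sensitivity quoted above.
```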
Quasi-decadal Oscillation in the CMIP5 and CMIP3 Climate Model Simulations: California Case
NASA Astrophysics Data System (ADS)
Wang, J.; Yin, H.; Reyes, E.; Chung, F. I.
2014-12-01
The three ongoing drought years in California are reminiscent of two other historical long drought periods: 1987-1992 and 1928-1934. This kind of interannual variability corresponds to the dominating 7-15 yr quasi-decadal oscillation in precipitation and streamflow in California. When using global climate model projections to assess the climate change impact on water resources planning in California, it is natural to ask if global climate models are able to reproduce the observed interannual variability like the 7-15 yr quasi-decadal oscillation. Spectral analysis of tree-ring-reconstructed precipitation and the historical precipitation record confirms the existence of the 7-15 yr quasi-decadal oscillation in California. However, when spectral analysis was applied to all the CMIP5 and CMIP3 global climate model historical simulations using a wavelet analysis approach, it was found that only two models in CMIP3, CGCM 2.3.2a of MRI and NCAR PCM1.0, and only two models in CMIP5, MIROC5 and CESM1-WACCM, have statistically significant 7-15 yr quasi-decadal oscillations in California. More interestingly, the existence of the 7-15 yr quasi-decadal oscillation in the global climate model simulations is also sensitive to initial conditions. A 12-13 yr quasi-decadal oscillation occurs in one ensemble run of CGCM 2.3.2a of MRI but does not exist in the other four ensemble runs.
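A lightweight way to check a series for a 7-15 yr quasi-decadal signal is a Fourier power spectrum, sketched below on a synthetic precipitation series (the study itself uses wavelet analysis; this stand-in only illustrates the band-limited periodicity check).

```python
import numpy as np

# Look for a dominant period in the 7-15 yr band of an annual series (synthetic data).
rng = np.random.default_rng(42)
years = np.arange(1900, 2014)
precip = 600 + 80 * np.sin(2 * np.pi * years / 11.0) + rng.normal(0, 60, years.size)

anom = precip - precip.mean()
power = np.abs(np.fft.rfft(anom)) ** 2
freq = np.fft.rfftfreq(anom.size, d=1.0)                    # cycles per year
periods = np.where(freq > 0, 1.0 / np.maximum(freq, 1e-12), np.inf)

band = (periods >= 7) & (periods <= 15)
peak_period = periods[band][np.argmax(power[band])]
print(f"dominant period in the 7-15 yr band: {peak_period:.1f} yr")
```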
Identifiability of PBPK Models with Applications to ...
Any statistical model should be identifiable in order for estimates and tests using it to be meaningful. We consider statistical analysis of physiologically-based pharmacokinetic (PBPK) models in which parameters cannot be estimated precisely from available data, and discuss different types of identifiability that occur in PBPK models and give reasons why they occur. We particularly focus on how the mathematical structure of a PBPK model and lack of appropriate data can lead to statistical models in which it is impossible to estimate at least some parameters precisely. Methods are reviewed which can determine whether a purely linear PBPK model is globally identifiable. We propose a theorem which determines when identifiability at a set of finite and specific values of the mathematical PBPK model (global discrete identifiability) implies identifiability of the statistical model. However, we are unable to establish conditions that imply global discrete identifiability, and conclude that the only safe approach to analysis of PBPK models involves Bayesian analysis with truncated priors. Finally, computational issues regarding posterior simulations of PBPK models are discussed. The methodology is very general and can be applied to numerous PBPK models which can be expressed as linear time-invariant systems. A real data set of a PBPK model for exposure to dimethyl arsinic acid (DMA(V)) is presented to illustrate the proposed methodology.
Zumpano, Camila Eugênia; Mendonça, Tânia Maria da Silva; Silva, Carlos Henrique Martins da; Correia, Helena; Arnold, Benjamin; Pinto, Rogério de Melo Costa
2017-01-23
This study aimed to perform the cross-cultural adaptation and validation of the Patient-Reported Outcomes Measurement Information System (PROMIS) Global Health scale in the Portuguese language. The ten Global Health items were cross-culturally adapted by the method proposed in the Functional Assessment of Chronic Illness Therapy (FACIT). The instrument's final version in Portuguese was self-administered by 1,010 participants in Brazil. The scale's precision was verified by floor and ceiling effects analysis, reliability of internal consistency, and test-retest reliability. Exploratory and confirmatory factor analyses were used to assess the construct's validity and instrument's dimensionality. Calibration of the items used the Graded Response Model proposed by Samejima. Four global items required adjustments after the pretest. Analysis of the psychometric properties showed that the Global Health scale has good reliability, with Cronbach's alpha of 0.83 and intra-class correlation of 0.89. Exploratory and confirmatory factor analyses showed good fit in the previously established two-dimensional model. The Global Physical Health and Global Mental Health scales showed good latent trait coverage according to the Graded Response Model. The PROMIS Global Health items showed equivalence in Portuguese compared to the original version and satisfactory psychometric properties for application in clinical practice and research in the Brazilian population.
Global Citizenship and the Importance of Education in a Globally Integrated World
ERIC Educational Resources Information Center
Smith, William C.; Fraser, Pablo; Chykina, Volha; Ikoma, Sakiko; Levitan, Joseph; Liu, Jing; Mahfouz, Julia
2017-01-01
As national borders dissipate and technology allows different cultures and nationalities to communicate on a regular basis, more individuals are self-identifying as global citizens. Using Social Network Analysis and multi-level modelling, this study explores factors associated with global citizen affinity and finds that education plays an…
The Global Tsunami Model (GTM)
NASA Astrophysics Data System (ADS)
Thio, H. K.; Løvholt, F.; Harbitz, C. B.; Polet, J.; Lorito, S.; Basili, R.; Volpe, M.; Romano, F.; Selva, J.; Piatanesi, A.; Davies, G.; Griffin, J.; Baptista, M. A.; Omira, R.; Babeyko, A. Y.; Power, W. L.; Salgado Gálvez, M.; Behrens, J.; Yalciner, A. C.; Kanoglu, U.; Pekcan, O.; Ross, S.; Parsons, T.; LeVeque, R. J.; Gonzalez, F. I.; Paris, R.; Shäfer, A.; Canals, M.; Fraser, S. A.; Wei, Y.; Weiss, R.; Zaniboni, F.; Papadopoulos, G. A.; Didenkulova, I.; Necmioglu, O.; Suppasri, A.; Lynett, P. J.; Mokhtari, M.; Sørensen, M.; von Hillebrandt-Andrade, C.; Aguirre Ayerbe, I.; Aniel-Quiroga, Í.; Guillas, S.; Macias, J.
2016-12-01
The large tsunami disasters of the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.
The Global Tsunami Model (GTM)
NASA Astrophysics Data System (ADS)
Lorito, S.; Basili, R.; Harbitz, C. B.; Løvholt, F.; Polet, J.; Thio, H. K.
2017-12-01
The tsunamis that occurred worldwide in the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but often disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.
The Global Tsunami Model (GTM)
NASA Astrophysics Data System (ADS)
Løvholt, Finn
2017-04-01
The large tsunami disasters of the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.
Micromechanics Fatigue Damage Analysis Modeling for Fabric Reinforced Ceramic Matrix Composites
NASA Technical Reports Server (NTRS)
Min, J. B.; Xue, D.; Shi, Y.
2013-01-01
A micromechanics analysis modeling method was developed to analyze the damage progression and fatigue failure of fabric reinforced composite structures, especially for the brittle ceramic matrix material composites. A repeating unit cell concept of fabric reinforced composites was used to represent the global composite structure. The thermal and mechanical properties of the repeating unit cell were considered to be the same as those of the global composite structure. The three-phase micromechanics, the shear-lag, and the continuum fracture mechanics models were integrated with a statistical model in the repeating unit cell to predict the progressive damages and fatigue life of the composite structures. The global structure failure was defined as the loss of loading capability of the repeating unit cell, which depends on the stiffness reduction due to material slice failures and nonlinear material properties in the repeating unit cell. The present methodology is demonstrated with the analysis results evaluated through the experimental test performed with carbon fiber reinforced silicon carbide matrix plain weave composite specimens.
NASA Technical Reports Server (NTRS)
Smith, Andrew; LaVerde, Bruce; Teague, David; Gardner, Bryce; Cotoni, Vincent
2010-01-01
This presentation further develops the orthogrid vehicle panel work. Hybrid Module capabilities were employed to assess both low/mid frequency and high frequency models in the VA One simulation environment. The response estimates from three modeling approaches are compared to ground test measurements: (1) a detailed finite element model of the test article, expected to capture both the global panel modes and the local pocket mode response, but at a considerable analysis expense (time and resources); (2) a composite layered construction equivalent global stiffness approximation using SEA, expected to capture the response of the global panel modes only; and (3) an SEA approximation using the Periodic Subsystem Formulation, in which a finite element model of a single periodic cell is used to derive the vibroacoustic properties of the entire periodic structure (modal density, radiation efficiency, etc.), expected to capture the response at various locations on the panel (on the skin and on the ribs) with less analysis expense.
NASA Technical Reports Server (NTRS)
Robertson, Franklin; Goodman, Steven J.; Christy, John R.; Fitzjarrald, Daniel E.; Chou, Shi-Hung; Crosson, William; Wang, Shouping; Ramirez, Jorge
1993-01-01
This research is the MSFC component of a joint MSFC/Pennsylvania State University Eos Interdisciplinary Investigation on the global water cycle extension across the earth sciences. The primary long-term objective of this investigation is to determine the scope and interactions of the global water cycle with all components of the Earth system and to understand how it stimulates and regulates change on both global and regional scales. Significant accomplishments in the past year are presented and include the following: (1) water vapor variability; (2) multi-phase water analysis; (3) global modeling; and (4) optimal precipitation and stream flow analysis and hydrologic processes.
Melenteva, Anastasiia; Galyanin, Vladislav; Savenkova, Elena; Bogomolov, Andrey
2016-07-15
A large set of fresh cow milk samples collected from many suppliers over a large geographical area in Russia during a year has been analyzed by optical spectroscopy in the range 400-1100 nm in accordance with a previously developed scatter-based technique. The global (i.e. resistant to seasonal, genetic, regional and other variations of the milk composition) models for fat and total protein content, which were built using partial least-squares (PLS) regression, exhibit satisfactory prediction performances enabling their practical application in the dairy. The root mean-square errors of prediction (RMSEP) were 0.09 and 0.10 for fat and total protein content, respectively. The issues of raw milk analysis and multivariate modelling based on the historical spectroscopic data have been considered and approaches to the creation of global models and their transfer between the instruments have been proposed. Availability of global models should significantly facilitate the dissemination of optical spectroscopic methods for the laboratory and in-line quantitative milk analysis. Copyright © 2016. Published by Elsevier Ltd.
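The sketch below builds a PLS calibration of the kind described above with scikit-learn, predicting fat and total protein from spectra and reporting RMSEP on a held-out set; the spectra are synthetic stand-ins, not the milk data set.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# PLS calibration of fat and protein from synthetic visible/short-wave NIR spectra.
rng = np.random.default_rng(7)
n_samples, n_wavelengths = 300, 351                  # e.g. 400-1100 nm at 2 nm steps
fat = rng.uniform(2.5, 6.0, n_samples)
protein = rng.uniform(2.8, 3.8, n_samples)

wl = np.linspace(0, 1, n_wavelengths)
spectra = (np.outer(fat, np.exp(-((wl - 0.3) ** 2) / 0.01)) +
           np.outer(protein, np.exp(-((wl - 0.7) ** 2) / 0.02)) +
           rng.normal(0, 0.05, (n_samples, n_wavelengths)))

X_train, X_test, y_train, y_test = train_test_split(
    spectra, np.column_stack([fat, protein]), test_size=0.3, random_state=0)

pls = PLSRegression(n_components=6)
pls.fit(X_train, y_train)
pred = pls.predict(X_test)

rmsep = np.sqrt(mean_squared_error(y_test, pred, multioutput="raw_values"))
print(f"RMSEP fat = {rmsep[0]:.3f}, protein = {rmsep[1]:.3f}")
```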
Validation and Verification of Operational Land Analysis Activities at the Air Force Weather Agency
NASA Technical Reports Server (NTRS)
Shaw, Michael; Kumar, Sujay V.; Peters-Lidard, Christa D.; Cetola, Jeffrey
2012-01-01
The NASA developed Land Information System (LIS) is the Air Force Weather Agency's (AFWA) operational Land Data Assimilation System (LDAS), combining real time precipitation observations and analyses, global forecast model data, vegetation, terrain, and soil parameters with the community Noah land surface model, along with other hydrology module options, to generate profile analyses of global soil moisture, soil temperature, and other important land surface characteristics. Key characteristics are: (1) a range of satellite data products and surface observations used to generate the land analysis products; (2) global, 1/4 deg spatial resolution; and (3) model analysis generated at 3-hour intervals. AFWA recognizes the importance of operational benchmarking and uncertainty characterization for land surface modeling and is developing standard methods, software, and metrics to verify and/or validate LIS output products. To facilitate this and other needs for land analysis activities at AFWA, the Model Evaluation Toolkit (MET) -- a joint product of the National Center for Atmospheric Research Developmental Testbed Center (NCAR DTC), AFWA, and the user community -- and the Land surface Verification Toolkit (LVT), developed at the Goddard Space Flight Center (GSFC), have been adapted to operational benchmarking needs of AFWA's land characterization activities.
NASA Astrophysics Data System (ADS)
Safaei, S.; Haghnegahdar, A.; Razavi, S.
2016-12-01
Complex environmental models are now the primary tool to inform decision makers for the current or future management of environmental resources under climate and environmental changes. These complex models often contain a large number of parameters that need to be determined by a computationally intensive calibration procedure. Sensitivity analysis (SA) is a very useful tool that not only allows for understanding the model behavior, but also helps in reducing the number of calibration parameters by identifying unimportant ones. The issue is that most global sensitivity techniques are themselves highly computationally demanding when generating robust and stable sensitivity metrics over the entire model response surface. Recently, a novel global sensitivity analysis method, Variogram Analysis of Response Surfaces (VARS), was introduced that can efficiently provide a comprehensive assessment of global sensitivity using the variogram concept. In this work, we aim to evaluate the effectiveness of this highly efficient GSA method in reducing the computational burden when applied to systems with an extra-large number of input factors (~100). We use a test function and a hydrological modelling case study to demonstrate the capability of the VARS method in reducing problem dimensionality by identifying important vs. unimportant input factors.
Mixed kernel function support vector regression for global sensitivity analysis
NASA Astrophysics Data System (ADS)
Cheng, Kai; Lu, Zhenzhou; Wei, Yuhao; Shi, Yan; Zhou, Yicheng
2017-11-01
Global sensitivity analysis (GSA) plays an important role in exploring the respective effects of input variables on an assigned output response. Amongst the wide range of sensitivity analyses in the literature, the Sobol indices have attracted much attention since they can provide accurate information for most models. In this paper, a mixed kernel function (MKF) based support vector regression (SVR) model is employed to evaluate the Sobol indices at low computational cost. By the proposed derivation, the estimation of the Sobol indices can be obtained by post-processing the coefficients of the SVR meta-model. The MKF is constituted by the orthogonal polynomial kernel function and the Gaussian radial basis kernel function; thus the MKF possesses both the global characteristic advantage of the polynomial kernel function and the local characteristic advantage of the Gaussian radial basis kernel function. The proposed approach is suitable for high-dimensional and non-linear problems. Its performance is validated on various analytical functions and compared with the popular polynomial chaos expansion (PCE). Results demonstrate that the proposed approach is an efficient method for global sensitivity analysis.
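A minimal sketch of the surrogate idea, assuming scikit-learn: an SVR meta-model whose kernel is a convex combination of a polynomial kernel (global behaviour) and a Gaussian RBF kernel (local behaviour). The weight, degree, gamma and the toy response are illustrative; the paper's post-processing of SVR coefficients into Sobol indices is not reproduced here.

```python
# Sketch of a mixed-kernel SVR surrogate: a weighted sum of a polynomial kernel
# and a Gaussian RBF kernel. Weight, degree and gamma are illustrative choices.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics.pairwise import polynomial_kernel, rbf_kernel

def mixed_kernel(X, Y, weight=0.6, degree=3, gamma=0.5):
    return (weight * polynomial_kernel(X, Y, degree=degree)
            + (1.0 - weight) * rbf_kernel(X, Y, gamma=gamma))

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 3))                        # stand-in input samples
y = X[:, 0] ** 2 + np.sin(np.pi * X[:, 1]) + 0.1 * X[:, 2]   # stand-in model response

svr = SVR(kernel=mixed_kernel, C=10.0, epsilon=0.01)
svr.fit(X, y)
print("training R^2 of the mixed-kernel surrogate:", round(svr.score(X, y), 3))
```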
1981-01-01
on modeling the managerial aspects of the firm. The second has been the application to economic theory led by ... individual portfolio optimization problems which were embedded in a larger global optimization problem. In the global problem, portfolios were linked by market ... demand quantities or be given by linear demand relationships. As in the source markets, the model
Wu, Yiping; Liu, Shuguang; Huang, Zhihong; Yan, Wende
2014-01-01
Ecosystem models are useful tools for understanding ecological processes and for sustainable management of resources. In the biogeochemical field, numerical models have been widely used for investigating carbon dynamics under global changes from site to regional and global scales. However, it is still challenging to optimize parameters and estimate parameterization uncertainty for complex process-based models such as the Erosion Deposition Carbon Model (EDCM), a modified version of CENTURY, that consider carbon, water, and nutrient cycles of ecosystems. This study was designed to conduct parameter identifiability, optimization, sensitivity, and uncertainty analysis of EDCM using our developed EDCM-Auto, which incorporates the comprehensive R package Flexible Modeling Framework (FME) and the Shuffled Complex Evolution (SCE) algorithm. Using a forest flux tower site as a case study, we implemented a comprehensive modeling analysis involving nine parameters and four target variables (carbon and water fluxes) with their corresponding measurements based on the eddy covariance technique. The local sensitivity analysis shows that the model cost function is most sensitive to the plant production-related parameters (e.g., PPDF1 and PRDX). Both SCE and FME performed comparably well in deriving the optimal parameter set with satisfactory simulations of the target variables. Global sensitivity and uncertainty analysis indicate that the parameter uncertainty and the resulting output uncertainty can be quantified, and that the magnitude of parameter-uncertainty effects depends on variables and seasons. This study also demonstrates that using cutting-edge R packages such as FME is feasible and attractive for conducting comprehensive parameter analysis for ecosystem modeling.
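As a hedged sketch of the calibration step, the snippet below minimises a sum-of-squares cost between a toy model and synthetic observations with an evolutionary global optimiser. SciPy has no Shuffled Complex Evolution routine, so differential_evolution stands in for SCE; the model, bounds and observations are illustrative only.

```python
# Hedged sketch of model calibration with an evolutionary global optimiser
# (differential evolution used as a stand-in for SCE); toy model and data only.
import numpy as np
from scipy.optimize import differential_evolution

obs_t = np.linspace(0, 1, 50)
true_params = (2.0, 0.5)
observations = true_params[0] * np.exp(-true_params[1] * obs_t)  # stand-in flux observations

def model(params, t):
    a, k = params
    return a * np.exp(-k * t)

def cost(params):
    # sum of squared residuals between simulated and observed target variables
    return np.sum((model(params, obs_t) - observations) ** 2)

result = differential_evolution(cost, bounds=[(0.1, 5.0), (0.01, 2.0)], seed=0)
print("optimal parameters:", np.round(result.x, 3), "cost:", round(result.fun, 6))
```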
The global gridded crop model intercomparison: Data and modeling protocols for Phase 1 (v1.0)
Elliott, J.; Müller, C.; Deryng, D.; ...
2015-02-11
We present protocols and input data for Phase 1 of the Global Gridded Crop Model Intercomparison, a project of the Agricultural Model Intercomparison and Improvement Project (AgMIP). The project consists of global simulations of yields, phenologies, and many land-surface fluxes by 12–15 modeling groups for many crops, climate forcing data sets, and scenarios over the historical period from 1948 to 2012. The primary outcomes of the project include (1) a detailed comparison of the major differences and similarities among global models commonly used for large-scale climate impact assessment, (2) an evaluation of model and ensemble hindcasting skill, (3) quantification of key uncertainties from climate input data, model choice, and other sources, and (4) a multi-model analysis of the agricultural impacts of large-scale climate extremes from the historical record.
Challenges in the global QCD analysis of parton structure of nucleons
NASA Astrophysics Data System (ADS)
Tung, Wu-Ki
2000-12-01
We briefly summarize the current status of global QCD analysis of the parton structure of the nucleon and then highlight the open questions and challenges which confront this endeavor, on which much of the phenomenology of the Standard Model and the search for New Physics depend.
Prospects for improving the representation of coastal and shelf seas in global ocean models
NASA Astrophysics Data System (ADS)
Holt, Jason; Hyder, Patrick; Ashworth, Mike; Harle, James; Hewitt, Helene T.; Liu, Hedong; New, Adrian L.; Pickles, Stephen; Porter, Andrew; Popova, Ekaterina; Icarus Allen, J.; Siddorn, John; Wood, Richard
2017-02-01
Accurately representing coastal and shelf seas in global ocean models represents one of the grand challenges of Earth system science. They are regions of immense societal importance through the goods and services they provide, hazards they pose and their role in global-scale processes and cycles, e.g. carbon fluxes and dense water formation. However, they are poorly represented in the current generation of global ocean models. In this contribution, we aim to briefly characterise the problem, and then to identify the important physical processes, and their scales, needed to address this issue in the context of the options available to resolve these scales globally and the evolving computational landscape. We find barotropic and topographic scales are well resolved by the current state-of-the-art model resolutions, e.g. nominal 1/12°, and still reasonably well resolved at 1/4°; here, the focus is on process representation. We identify tides, vertical coordinates, river inflows and mixing schemes as four areas where modelling approaches can readily be transferred from regional to global modelling with substantial benefit. In terms of finer-scale processes, we find that a 1/12° global model resolves the first baroclinic Rossby radius for only ˜ 8 % of regions < 500 m deep, but this increases to ˜ 70 % for a 1/72° model, so resolving scales globally requires substantially finer resolution than the current state of the art. We quantify the benefit of improved resolution and process representation using 1/12° global- and basin-scale northern North Atlantic nucleus for a European model of the ocean (NEMO) simulations; the latter includes tides and a k-ɛ vertical mixing scheme. These are compared with global stratification observations and 19 models from CMIP5. In terms of correlation and basin-wide rms error, the high-resolution models outperform all these CMIP5 models. The model with tides shows improved seasonal cycles compared to the high-resolution model without tides. The benefits of resolution are particularly apparent in eastern boundary upwelling zones. To explore the balance between the size of a globally refined model and that of multiscale modelling options (e.g. finite element, finite volume or a two-way nesting approach), we consider a simple scale analysis and a conceptual grid refining approach. We put this analysis in the context of evolving computer systems, discussing model turnaround time, scalability and resource costs. Using a simple cost model compared to a reference configuration (taken to be a 1/4° global model in 2011) and the increasing performance of the UK Research Councils' computer facility, we estimate an unstructured mesh multiscale approach, resolving process scales down to 1.5 km, would use a comparable share of the computer resource by 2021, the two-way nested multiscale approach by 2022, and a 1/72° global model by 2026. However, we also note that a 1/12° global model would not have a comparable computational cost to a 1° global model in 2017 until 2027. Hence, we conclude that for computationally expensive models (e.g. for oceanographic research or operational oceanography), resolving scales to ˜ 1.5 km would be routinely practical in about a decade given substantial effort on numerical and computational development. For complex Earth system models, this extends to about 2 decades, suggesting the focus here needs to be on improved process parameterisation to meet these challenges.
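A rough, illustrative back-of-envelope of the kind of cost reasoning described above (not the paper's calibrated cost model): assume compute cost grows with the cube of horizontal refinement (two horizontal dimensions plus a CFL-limited time step) and that the available machine share doubles on a fixed timescale. Both assumptions, and therefore the printed numbers, are stand-ins and will differ from the paper's estimates, which use the actual growth of the UK facility.

```python
# Illustrative cost-scaling sketch; exponent and doubling time are assumptions.
import numpy as np

def years_until_affordable(refinement_factor, doubling_time_years=2.0, cost_exponent=3.0):
    """Years until a refined model uses the same machine share as the reference."""
    cost_ratio = refinement_factor ** cost_exponent
    return np.log2(cost_ratio) * doubling_time_years

for name, factor in [("1/4 deg -> 1/12 deg", 3), ("1/4 deg -> 1/72 deg", 18)]:
    print(f"{name}: ~{years_until_affordable(factor):.0f} years after the reference year")
```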
Emery, D W
1997-01-01
In many circles, managed care and capitation have become synonymous; unfortunately, the assumptions informing capitation are based on a flawed unidimensional model of risk. PEHP of Utah has rejected the unidimensional model and has therefore embraced a multidimensional model of risk that suggests that global fees are the optimal purchasing modality. A globally priced episode of care forms a natural unit of analysis that enhances purchasing clarity, allows providers to more efficiently focus on the Marginal Rate of Technical Substitution, and conforms to the multidimensional reality of risk. Most importantly, global fees simultaneously maximize patient choice and provider cost consciousness.
Robles, A; Ruano, M V; Ribes, J; Seco, A; Ferrer, J
2014-04-01
The results of a global sensitivity analysis of a filtration model for submerged anaerobic MBRs (AnMBRs) are assessed in this paper. This study aimed to (1) identify the less- (or non-) influential factors of the model in order to facilitate model calibration and (2) validate the modelling approach (i.e. to determine the need for each of the proposed factors to be included in the model). The sensitivity analysis was conducted using a revised version of the Morris screening method. The dynamic simulations were conducted using long-term data obtained from an AnMBR plant fitted with industrial-scale hollow-fibre membranes. Of the 14 factors in the model, six were identified as influential, i.e. those calibrated using off-line protocols. A dynamic calibration (based on optimisation algorithms) of these influential factors was conducted. The resulting estimated model factors accurately predicted membrane performance. Copyright © 2014 Elsevier Ltd. All rights reserved.
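A minimal sketch of Morris screening, assuming the SALib package is available; the three factors, their hypothetical names and bounds, and the toy output stand in for the 14-factor filtration model.

```python
# Minimal Morris screening sketch with SALib; factors and model are stand-ins.
import numpy as np
from SALib.sample.morris import sample as morris_sample
from SALib.analyze.morris import analyze as morris_analyze

problem = {
    "num_vars": 3,
    "names": ["k_fouling", "alpha_cake", "backflush_eff"],  # hypothetical factor names
    "bounds": [[0.1, 1.0], [0.0, 5.0], [0.2, 0.9]],
}

X = morris_sample(problem, N=100, num_levels=4)
Y = X[:, 0] * np.exp(0.3 * X[:, 1]) + 0.05 * X[:, 2]        # stand-in model output

res = morris_analyze(problem, X, Y, num_levels=4, print_to_console=False)
for name, mu_star, sigma in zip(problem["names"], res["mu_star"], res["sigma"]):
    print(f"{name}: mu* = {mu_star:.3f}, sigma = {sigma:.3f}")
```

Factors with small mu* across the explored ranges are candidates for being fixed before a dynamic calibration of the remaining influential ones.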
Physical Education and Health: Global Perspectives and Best Practice
ERIC Educational Resources Information Center
Chin, Ming-Kai, Ed.; Edginton, Christopher R.
2014-01-01
"Physical Education and Health: Global Perspectives and Best Practice" draws together global scholars, researchers, and practitioners to provide a review and analysis of new directions in physical education and health worldwide. The book provides descriptive information from 40 countries regarding contemporary practices, models, and…
Gill, Stephen; Benatar, Solomon
2016-01-01
The Lancet-University of Oslo Commission Report on Global Governance for Health provides an insightful analysis of the global health inequalities that result from transnational activities consequent on what the authors call contemporary "global social norms." Our critique is that the analysis and suggested reforms to prevailing institutions and practices are confined within the perspective of the dominant-although unsustainable and inequitable-market-oriented, neoliberal development model of global capitalism. Consequently, the report both elides critical discussion of many key forms of material and political power under conditions of neoliberal development and governance that shape the nature and priorities of the global governance for health, and fails to point to the extent of changes required to sustainably improve global health. We propose that an alternative concept of progress-one grounded in history, political economy, and ecologically responsible health ethics-is sorely needed to better address challenges of global health governance in the new millennium. This might be premised on global solidarity and the "development of sustainability." We argue that the prevailing market civilization model that lies at the heart of global capitalism is being, and will further need to be, contested to avoid contradictions and dislocations associated with the commodification and privatization of health. © The Author(s) 2016.
Dai, Heng; Ye, Ming; Walker, Anthony P.; ...
2017-03-28
A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Heng; Ye, Ming; Walker, Anthony P.
A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
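A hedged Monte Carlo sketch of the idea of a process sensitivity index: the share of output variance explained by the choice of recharge representation (model form plus its parameter) versus the geology representation. The two competing models per process, the toy output and the binned conditional-variance estimator are illustrative, not the authors' formulation.

```python
# Toy variance decomposition over process-level inputs (model form + parameter).
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# recharge process: two competing model forms, each with its own random parameter
recharge_model = rng.integers(0, 2, size=n)
p_recharge = rng.uniform(0.5, 1.5, size=n)
recharge = np.where(recharge_model == 0, 0.2 * p_recharge, 0.1 * p_recharge ** 2)

# geology process: two competing hydraulic-conductivity models
geology_model = rng.integers(0, 2, size=n)
p_geology = rng.uniform(1.0, 3.0, size=n)
conductivity = np.where(geology_model == 0, p_geology, np.exp(0.4 * p_geology))

output = recharge / conductivity                     # stand-in model prediction

def process_index(output, model_choice, param, bins=20):
    """Var(E[output | process inputs]) / Var(output), with the parameter binned."""
    edges = np.quantile(param, np.linspace(0, 1, bins + 1)[1:-1])
    groups = model_choice * bins + np.digitize(param, edges)
    uniq = np.unique(groups)
    cond_means = np.array([output[groups == g].mean() for g in uniq])
    weights = np.array([(groups == g).mean() for g in uniq])
    return np.sum(weights * (cond_means - output.mean()) ** 2) / output.var()

print("recharge process index:", round(process_index(output, recharge_model, p_recharge), 3))
print("geology  process index:", round(process_index(output, geology_model, p_geology), 3))
```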
Regional Spherical Harmonic Magnetic Modeling from Near-Surface and Satellite-Altitude Anomalies
NASA Technical Reports Server (NTRS)
Kim, Hyung Rae; von Frese, Ralph R. B.; Taylor, Patrick T.
2013-01-01
The compiled near-surface data and satellite-measured crustal magnetic data are modeled with a regionally concentrated spherical harmonic representation technique over Australia and Antarctica. Global crustal magnetic anomaly studies have used spherical harmonic analysis to represent the Earth's crustal magnetic field. This global approach, however, is best applied where the data are uniformly distributed over the entire Earth. Satellite observations generally meet this requirement, but unequally distributed data cannot be easily adapted in global modeling. Even for the satellite observations, because the errors are spread over the globe, data smoothing is inevitable in global spherical harmonic representations. In addition, global high-resolution modeling requires a great number of global spherical harmonic coefficients for the regional representation of crustal magnetic anomalies, whereas a smaller number of localized spherical coefficients will suffice. We compared global and regional approaches, including a case where the errors were propagated outside the region of interest. For observations from the upcoming Swarm constellation, the regional modeling will allow the production of a smaller number of spherical coefficients relevant to the region of interest.
NASA Astrophysics Data System (ADS)
Mitchell, K. E.
2006-12-01
The Environmental Modeling Center (EMC) of the National Centers for Environmental Prediction (NCEP) applies several different analyses of observed precipitation in both the data assimilation and validation components of NCEP's global and regional numerical weather and climate prediction/analysis systems (including in NCEP global and regional reanalysis). This invited talk will survey these data assimilation and validation applications and methodologies, as well as the temporal frequency, spatial domains, spatial resolution, data sources, data density and data quality control in the precipitation analyses that are applied. Some of the precipitation analyses applied by EMC are produced by NCEP's Climate Prediction Center (CPC), while others are produced by the River Forecast Centers (RFCs) of the National Weather Service (NWS), or by automated algorithms of the NWS WSR-88D Radar Product Generator (RPG). Depending on the specific type of application in data assimilation or model forecast validation, the temporal resolution of the precipitation analyses may be hourly, daily, or pentad (5-day) and the domain may be global, continental U.S. (CONUS), or Mexico. The data sources for precipitation include ground-based gauge observations, radar-based estimates, and satellite-based estimates. The precipitation analyses over the CONUS are analyses of either hourly, daily or monthly totals of precipitation, and they are of two distinct types: gauge-only or primarily radar-estimated. The gauge-only CONUS analysis of daily precipitation utilizes an orographic-adjustment technique (based on the well-known PRISM precipitation climatology of Oregon State University) developed by the NWS Office of Hydrologic Development (OHD). The primary NCEP global precipitation analysis is the pentad CPC Merged Analysis of Precipitation (CMAP), which blends both gauge observations and satellite estimates. The presentation will include a brief comparison between the CMAP analysis and other global precipitation analyses by other institutions. Other global precipitation analyses produced by other methodologies are also used by EMC in certain applications, such as CPC's well-known satellite-IR based technique known as "GPI", and satellite-microwave based estimates from NESDIS or NASA. Finally, the presentation will cover the three assimilation methods used by EMC to assimilate precipitation data, including 1) 3D-VAR variational assimilation in NCEP's Global Data Assimilation System (GDAS), 2) direct insertion of precipitation-inferred vertical latent heating profiles in NCEP's N. American Data Assimilation System (NDAS) and its N. American Regional Reanalysis (NARR) counterpart, and 3) direct use of observed precipitation to drive the Noah land model component of NCEP's Global and N. American Land Data Assimilation Systems (GLDAS and NLDAS). In the applications of precipitation analyses in data assimilation at NCEP, the analyses are temporally disaggregated to hourly or less using time-weights calculated from A) either radar-based estimates or an analysis of hourly gauge-observations for the CONUS-domain daily precipitation analyses, or B) global model forecasts of 6-hourly precipitation (followed by linear interpolation to hourly or less) for the global CMAP precipitation analysis.
Local conformity induced global oscillation
NASA Astrophysics Data System (ADS)
Li, Dong; Li, Wei; Hu, Gang; Zheng, Zhigang
2009-04-01
The ‘rock-paper-scissors’ game model is investigated with consideration of the psychology of conformity. The interaction between any two agents is global, but the conformity strategy is local to individuals. Statistically, each strategy appears with uniform probability. The dynamical analysis of this model indicates that the equilibrium state may lose its stability at a threshold and is replaced by a globally oscillating state. The global oscillation is induced by the local conformity, which originates from the synchronization of individual strategies.
2015-03-16
shaded region around each total sensitivity value was the maximum uncertainty in that value estimated by the Sobol method. 2.4. Global Sensitivity ... Performance. We conducted a global sensitivity analysis, using the variance-based method of Sobol, to estimate which parameters controlled the ... Hunter, J.D. Matplotlib: A 2D Graphics Environment. Comput. Sci. Eng. 2007, 9, 90–95. 69. Sobol, I. Global sensitivity indices for nonlinear
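The snippet above refers to the variance-based method of Sobol'; a minimal sketch with the SALib package (assumed available) is shown below, using the Ishigami test function as a stand-in model.

```python
# Variance-based Sobol sensitivity sketch with SALib; Ishigami function as the model.
import numpy as np
from SALib.sample.saltelli import sample as saltelli_sample
from SALib.analyze.sobol import analyze as sobol_analyze

problem = {
    "num_vars": 3,
    "names": ["x1", "x2", "x3"],
    "bounds": [[-np.pi, np.pi]] * 3,
}

X = saltelli_sample(problem, 1024)
Y = np.sin(X[:, 0]) + 7.0 * np.sin(X[:, 1]) ** 2 + 0.1 * X[:, 2] ** 4 * np.sin(X[:, 0])

res = sobol_analyze(problem, Y, print_to_console=False)
for name, s1, st in zip(problem["names"], res["S1"], res["ST"]):
    print(f"{name}: first-order = {s1:.2f}, total-order = {st:.2f}")
```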
Global Sensitivity Analysis of Environmental Models: Convergence, Robustness and Validation
NASA Astrophysics Data System (ADS)
Sarrazin, Fanny; Pianosi, Francesca; Khorashadi Zadeh, Farkhondeh; Van Griensven, Ann; Wagener, Thorsten
2015-04-01
Global Sensitivity Analysis aims to characterize the impact that variations in model input factors (e.g. the parameters) have on the model output (e.g. simulated streamflow). In sampling-based Global Sensitivity Analysis, the sample size has to be chosen carefully in order to obtain reliable sensitivity estimates while spending computational resources efficiently. Furthermore, insensitive parameters are typically identified through the definition of a screening threshold: the theoretical value of their sensitivity index is zero, but in a sampling-based framework they regularly take non-zero values. There is little guidance available for these two steps in environmental modelling though. The objective of the present study is to support modellers in making appropriate choices, regarding both sample size and screening threshold, so that a robust sensitivity analysis can be implemented. We performed sensitivity analysis for the parameters of three hydrological models with increasing level of complexity (Hymod, HBV and SWAT), and tested three widely used sensitivity analysis methods (Elementary Effect Test or method of Morris, Regional Sensitivity Analysis, and Variance-Based Sensitivity Analysis). We defined criteria based on a bootstrap approach to assess three different types of convergence: the convergence of the values of the sensitivity indices, of the ranking (the ordering among the parameters) and of the screening (the identification of the insensitive parameters). We investigated the screening threshold through the definition of a validation procedure. The results showed that full convergence of the values of the sensitivity indices is not necessarily needed to rank or to screen the model input factors. Furthermore, typical values of the sample sizes that are reported in the literature can be well below the sample sizes that actually ensure convergence of ranking and screening.
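A hedged sketch of the bootstrap idea described above: resample elementary-effect-style sensitivity measures, recompute the summary statistic, and check how stable the factor ranking and the screening decision are across replicates. The effects array, threshold and replicate count are illustrative assumptions.

```python
# Bootstrap check of ranking and screening convergence for a toy effects array.
import numpy as np

rng = np.random.default_rng(0)
# stand-in elementary effects: rows = trajectories, columns = model parameters
effects = np.abs(rng.normal(loc=[2.0, 1.0, 0.1, 0.02], scale=0.5, size=(100, 4)))

def bootstrap_rankings(effects, n_boot=500, threshold=0.05):
    n = effects.shape[0]
    ranks, screened = [], []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)
        mu_star = effects[idx].mean(axis=0)
        ranks.append(np.argsort(-mu_star))                       # most to least sensitive
        screened.append(mu_star < threshold * mu_star.max())     # screening decision
    return np.array(ranks), np.array(screened)

full_rank = np.argsort(-effects.mean(axis=0))
ranks, screened = bootstrap_rankings(effects)
print("fraction of replicates reproducing the full-sample ranking:",
      round(np.mean(np.all(ranks == full_rank, axis=1)), 3))
print("fraction of replicates screening each factor out:", screened.mean(axis=0).round(2))
```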
Global Analysis, Interpretation and Modelling: An Earth Systems Modelling Program
NASA Technical Reports Server (NTRS)
Moore, Berrien, III; Sahagian, Dork
1997-01-01
The Goal of the GAIM is: To advance the study of the coupled dynamics of the Earth system using as tools both data and models; to develop a strategy for the rapid development, evaluation, and application of comprehensive prognostic models of the Global Biogeochemical Subsystem which could eventually be linked with models of the Physical-Climate Subsystem; to propose, promote, and facilitate experiments with existing models or by linking subcomponent models, especially those associated with IGBP Core Projects and with WCRP efforts. Such experiments would be focused upon resolving interface issues and questions associated with developing an understanding of the prognostic behavior of key processes; to clarify key scientific issues facing the development of Global Biogeochemical Models and the coupling of these models to General Circulation Models; to assist the Intergovernmental Panel on Climate Change (IPCC) process by conducting timely studies that focus upon elucidating important unresolved scientific issues associated with the changing biogeochemical cycles of the planet and upon the role of the biosphere in the physical-climate subsystem, particularly its role in the global hydrological cycle; and to advise the SC-IGBP on progress in developing comprehensive Global Biogeochemical Models and to maintain scientific liaison with the WCRP Steering Group on Global Climate Modelling.
Can global hydrological models reproduce large scale river flood regimes?
NASA Astrophysics Data System (ADS)
Eisner, Stephanie; Flörke, Martina
2013-04-01
River flooding remains one of the most severe natural hazards. On the one hand, major flood events pose a serious threat to human well-being, causing deaths and considerable economic damage. On the other hand, the periodic occurrence of flood pulses is crucial to maintain the functioning of riverine floodplains and wetlands, and to preserve the ecosystem services the latter provide. In many regions, river floods reveal a distinct seasonality, i.e. they occur at a particular time during the year. This seasonality is related to regionally dominant flood generating processes which can be expressed in river flood types. While in data-rich regions (esp. Europe and North America) the analysis of flood regimes can be based on observed river discharge time series, this data is sparse or lacking in many other regions of the world. This gap of knowledge can be filled by global modeling approaches. However, to date most global modeling studies have focused on mean annual or monthly water availability and their change over time while simulating discharge extremes, both floods and droughts, still remains a challenge for large scale hydrological models. This study will explore the ability of the global hydrological model WaterGAP3 to simulate the large scale patterns of river flood regimes, represented by seasonal pattern and the dominant flood type. WaterGAP3 simulates the global terrestrial water balance on a 5 arc minute spatial grid (excluding Greenland and Antarctica) at a daily time step. The model accounts for human interference on river flow, i.e. water abstraction for various purposes, e.g. irrigation, and flow regulation by large dams and reservoirs. Our analysis will provide insight in the general ability of global hydrological models to reproduce river flood regimes and thus will promote the creation of a global map of river flood regimes to provide a spatially inclusive and comprehensive picture. Understanding present-day flood regimes can support both flood risk analysis and the assessment of potential regional impacts of climate change on river flooding.
Why Is Rainfall Error Analysis Requisite for Data Assimilation and Climate Modeling?
NASA Technical Reports Server (NTRS)
Hou, Arthur Y.; Zhang, Sara Q.
2004-01-01
Given the large temporal and spatial variability of precipitation processes, errors in rainfall observations are difficult to quantify yet crucial to making effective use of rainfall data for improving atmospheric analysis, weather forecasting, and climate modeling. We highlight the need for developing a quantitative understanding of systematic and random errors in precipitation observations by examining explicit examples of how each type of errors can affect forecasts and analyses in global data assimilation. We characterize the error information needed from the precipitation measurement community and how it may be used to improve data usage within the general framework of analysis techniques, as well as accuracy requirements from the perspective of climate modeling and global data assimilation.
Evaluating the utility of mid-infrared spectral subspaces for predicting soil properties.
Sila, Andrew M; Shepherd, Keith D; Pokhariyal, Ganesh P
2016-04-15
We propose four methods for finding local subspaces in large spectral libraries: (a) cosine angle spectral matching; (b) hit quality index spectral matching; (c) self-organizing maps; and (d) archetypal analysis. We then evaluate prediction accuracies for global and subspace calibration models. These methods were tested on a mid-infrared spectral library containing 1907 soil samples collected from 19 different countries under the Africa Soil Information Service project. Calibration models for pH, Mehlich-3 Ca, Mehlich-3 Al, total carbon and clay soil properties were developed for the whole library and for the subspaces. Root mean square error of prediction, computed using a one-third-holdout validation set, was used to evaluate the predictive performance of subspace and global models. The effect of pretreating spectra with different methods was tested for the 1st and 2nd derivative Savitzky-Golay algorithm, multiplicative scatter correction, standard normal variate, and standard normal variate followed by detrending. In summary, the results show that global models outperformed the subspace models; we therefore conclude that global models are more accurate than the local models except in a few cases. For instance, sand and clay root mean square error values from local models built with the archetypal analysis method were 50% poorer than the global models, except for subspace models obtained using multiplicative-scatter-corrected spectra, which were 12% better. However, the subspace approach provides novel methods for discovering data patterns that may exist in large spectral libraries.
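A hedged sketch of one of the proposed subspace searches (cosine angle spectral matching) combined with Savitzky-Golay first-derivative pretreatment, assuming SciPy and scikit-learn; the library size, window length and neighbour count are illustrative, and a PLS calibration would then be fitted on the selected subset.

```python
# Select a local calibration subspace by cosine angle to the target spectrum.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.metrics.pairwise import cosine_similarity

rng = np.random.default_rng(0)
library = rng.normal(size=(1907, 1700))      # stand-in MIR spectral library
target = rng.normal(size=(1, 1700))          # spectrum of the sample to predict

# pretreatment: Savitzky-Golay smoothing with a first derivative
library_d1 = savgol_filter(library, window_length=11, polyorder=2, deriv=1, axis=1)
target_d1 = savgol_filter(target, window_length=11, polyorder=2, deriv=1, axis=1)

# cosine-angle matching: keep the most similar library spectra as the subspace
similarity = cosine_similarity(target_d1, library_d1).ravel()
subspace_idx = np.argsort(-similarity)[:200]
print("subspace size:", subspace_idx.size, "max cosine similarity:", round(similarity.max(), 3))
# a local PLS calibration would then be fitted on library[subspace_idx] only
```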
Air Quality Forecasts Using the NASA GEOS Model: A Unified Tool from Local to Global Scales
NASA Technical Reports Server (NTRS)
Knowland, E. Emma; Keller, Christoph; Nielsen, J. Eric; Orbe, Clara; Ott, Lesley; Pawson, Steven; Saunders, Emily; Duncan, Bryan; Cook, Melanie; Liu, Junhua;
2017-01-01
We provide an introduction to a new high-resolution (0.25 degree) global composition forecast produced by NASA's Global Modeling and Assimilation Office. The NASA Goddard Earth Observing System version 5 (GEOS-5) model has been expanded to provide global near-real-time forecasts of atmospheric composition at a horizontal resolution of 0.25 degrees (approximately 25 km). Previously, this combination of detailed chemistry and resolution was only provided by regional models. This system combines the operational GEOS-5 weather forecasting model with the state-of-the-science GEOS-Chem chemistry module (version 11) to provide detailed chemical analysis of a wide range of air pollutants such as ozone, carbon monoxide, nitrogen oxides, and fine particulate matter (PM2.5). The forecasts have the highest resolution of any currently publicly available global composition forecast. Evaluation and validation of modeled trace gases and aerosols against surface and satellite observations will be presented for constituents relevant to health-based air quality standards. Comparisons of modeled trace gases and aerosols against satellite observations show that the model produces realistic concentrations of atmospheric constituents in the free troposphere. Model comparisons against surface observations highlight the model's capability to capture the diurnal variability of air pollutants under a variety of meteorological conditions. The GEOS-5 composition forecasting system offers a new tool for scientists and the public health community, and is being developed jointly with several government and non-profit partners. Potential applications include air quality warnings, flight campaign planning and exposure studies using the archived analysis fields.
Performance Analysis of a Ring Current Model Driven by Global MHD
NASA Astrophysics Data System (ADS)
Falasca, A.; Keller, K. A.; Fok, M.; Hesse, M.; Gombosi, T.
2003-12-01
Effectively modeling the high-energy particles in Earth's inner magnetosphere has the potential to improve safety in both manned and unmanned spacecraft. One model of this environment is the Fok Ring Current Model. This model can utilize as inputs both solar wind data, and empirical ionospheric electric field and magnetic field models. Alternatively, we have a procedure which allows the model to be driven by outputs from the BATS-R-US global MHD model. By using in-situ satellite data we will compare the predictive capability of this model in its original stand-alone form, to that of the model when driven by the BATS-R-US Global Magnetosphere Model. As a basis for comparison we use the April 2002 and May 2003 storms where suitable LANL geosynchronous data are available.
Global Sensitivity Analysis and Parameter Calibration for an Ecosystem Carbon Model
NASA Astrophysics Data System (ADS)
Safta, C.; Ricciuto, D. M.; Sargsyan, K.; Najm, H. N.; Debusschere, B.; Thornton, P. E.
2013-12-01
We present uncertainty quantification results for a process-based ecosystem carbon model. The model employs 18 parameters and is driven by meteorological data corresponding to years 1992-2006 at the Harvard Forest site. Daily Net Ecosystem Exchange (NEE) observations were available to calibrate the model parameters and test the performance of the model. Posterior distributions show good predictive capabilities for the calibrated model. A global sensitivity analysis was first performed to determine the important model parameters based on their contribution to the variance of NEE. We then proceed to calibrate the model parameters in a Bayesian framework. The daily discrepancies between measured and predicted NEE values were modeled as independent and identically distributed Gaussians with prescribed daily variance according to the recorded instrument error. All model parameters were assumed to have uninformative priors with bounds set according to expert opinion. The global sensitivity results show that the rate of leaf fall (LEAFALL) is responsible for approximately 25% of the total variance in the average NEE for 1992-2005. A set of 4 other parameters, Nitrogen use efficiency (NUE), base rate for maintenance respiration (BR_MR), growth respiration fraction (RG_FRAC), and allocation to plant stem pool (ASTEM) contribute between 5% and 12% to the variance in average NEE, while the rest of the parameters have smaller contributions. The posterior distributions, sampled with a Markov Chain Monte Carlo algorithm, exhibit significant correlations between model parameters. However LEAFALL, the most important parameter for the average NEE, is not informed by the observational data, while less important parameters show significant updates between their prior and posterior densities. The Fisher information matrix values, indicating which parameters are most informed by the experimental observations, are examined to augment the comparison between the calibration and global sensitivity analysis results.
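A minimal sketch of the Bayesian calibration step, assuming a single parameter, an i.i.d. Gaussian likelihood with prescribed observation error and a flat bounded prior; the toy NEE model, bounds and proposal width are illustrative, and the study's 18-parameter model and data are not reproduced.

```python
# Random-walk Metropolis sampler for a one-parameter toy NEE model.
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(365)
true_value = 0.7
obs_sigma = 0.5                                            # prescribed instrument error
nee_obs = np.cos(2 * np.pi * days / 365) * true_value + rng.normal(0, obs_sigma, days.size)

def model_nee(param):
    return np.cos(2 * np.pi * days / 365) * param          # stand-in ecosystem model

def log_posterior(param, lower=0.0, upper=2.0):
    if not (lower <= param <= upper):                      # flat prior with expert bounds
        return -np.inf
    resid = nee_obs - model_nee(param)
    return -0.5 * np.sum((resid / obs_sigma) ** 2)

samples, current = [], 1.0
logp = log_posterior(current)
for _ in range(5000):
    proposal = current + rng.normal(0, 0.05)               # random-walk proposal
    logp_prop = log_posterior(proposal)
    if np.log(rng.uniform()) < logp_prop - logp:
        current, logp = proposal, logp_prop
    samples.append(current)

post = np.array(samples[1000:])                            # discard burn-in
print(f"posterior mean = {post.mean():.3f}, "
      f"95% CI = ({np.quantile(post, 0.025):.3f}, {np.quantile(post, 0.975):.3f})")
```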
Local-global analysis of crack growth in continuously reinforced ceramic matrix composites
NASA Technical Reports Server (NTRS)
Ballarini, Roberto; Ahmed, Shamim
1989-01-01
This paper describes the development of a mathematical model for predicting the strength and micromechanical failure characteristics of continuously reinforced ceramic matrix composites. The local-global analysis models the vicinity of a propagating crack tip as a local heterogeneous region (LHR) consisting of spring-like representation of the matrix, fibers and interfaces. Parametric studies are conducted to investigate the effects of LHR size, component properties, and interface conditions on the strength and sequence of the failure processes in the unidirectional composite system.
Crustal Structure of Mars from Mars Global Surveyor Topography and Gravity
NASA Technical Reports Server (NTRS)
Zuber, M. T.; Solomon, S. C.; Phillips, R. J.; Smith, D. E.; Tyler, G. L.; Aharonson, O.; Balmino, G.; Banerdt, W. B.; Head, J. W.; Johnson, C. L.
2000-01-01
In this analysis we invert global models of Mars' topography from Mars Orbiter Laser Altimeter (MOLA) and gravity from Doppler tracking obtained during the mapping mission of Mars Global Surveyor (MGS). We analyze the distribution of Martian crust and discuss implications for Mars' thermal history.
Global alliances effect in coalition forming
NASA Astrophysics Data System (ADS)
Vinogradova, Galina; Galam, Serge
2014-11-01
Coalition forming is investigated among countries, which are coupled by short-range interactions, under the influence of externally set opposing global alliances. The model extends a recent Natural Model of coalition forming inspired by Statistical Physics, where instabilities are a consequence of decentralized maximization of the individual benefits of actors. In contrast to physics, where spins can only evaluate the immediate cost/benefit of a flip of orientation, countries have a long horizon of rationality, which is associated with the ability to envision a way up to a better configuration even at the cost of passing through intermediate losing states. The stabilizing effect is produced through polarization by the global alliances around either a single global interest factor or multiple simultaneous ones. This model provides a versatile theoretical tool for the analysis of real cases and the design of novel strategies. Such analysis is provided for several real cases, including the Eurozone. The results shed new light on the understanding of the complex phenomenon of planned stabilization in coalition forming.
The Pilot Phase of the Global Soil Wetness Project Phase 3
NASA Astrophysics Data System (ADS)
Kim, H.; Oki, T.
2015-12-01
After the second phase of the Global Soil Wetness Project (GSWP2), an early global continuous gridded multi-model analysis, a comprehensive set of land surface fluxes and state variables became available. It has been broadly utilized in the hydrology community, and its success has motivated taking advantage of recent scientific progress and extending the relatively short time span (1986-1995) of the previous project. In the third phase proposed here (GSWP3), an extensive set of quantities for hydro-energy-eco systems will be produced to investigate their long-term (1901-2010) changes. The energy-water-carbon cycles and their interactions are also examined subcomponent-wise with appropriate model verifications in ensemble land simulations. In this study, the preliminary results and problems found in the first-round analysis of the GSWP3 pilot study are shown. It is also discussed how the global offline simulation activity contributes to wider communities and larger efforts such as the Coupled Model Intercomparison Project Phase 6 (CMIP6).
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCabe, M. F.; Ershadi, A.; Jimenez, C.
Determining the spatial distribution and temporal development of evaporation at regional and global scales is required to improve our understanding of the coupled water and energy cycles and to better monitor any changes in observed trends and variability of linked hydrological processes. With recent international efforts guiding the development of long-term and globally distributed flux estimates, continued product assessments are required to inform upon the selection of suitable model structures and also to establish the appropriateness of these multi-model simulations for global application. In support of the objectives of the Global Energy and Water Cycle Exchanges (GEWEX) LandFlux project, four commonly used evaporation models are evaluated against data from tower-based eddy-covariance observations, distributed across a range of biomes and climate zones. The selected schemes include the Surface Energy Balance System (SEBS) approach, the Priestley–Taylor Jet Propulsion Laboratory (PT-JPL) model, the Penman–Monteith-based Mu model (PM-Mu) and the Global Land Evaporation Amsterdam Model (GLEAM). Here we seek to examine the fidelity of global evaporation simulations by examining the multi-model response to varying sources of forcing data. To do this, we perform parallel and collocated model simulations using tower-based data together with a global-scale grid-based forcing product. Through quantifying the multi-model response to high-quality tower data, a better understanding of the subsequent model response to the coarse-scale globally gridded data that underlies the LandFlux product can be obtained, while also providing a relative evaluation and assessment of model performance. Using surface flux observations from 45 globally distributed eddy-covariance stations as independent metrics of performance, the tower-based analysis indicated that PT-JPL provided the highest overall statistical performance (0.72; 61 W m-2; 0.65), followed closely by GLEAM (0.68; 64 W m-2; 0.62), with values in parentheses representing the R2, RMSD and Nash–Sutcliffe efficiency (NSE), respectively. PM-Mu (0.51; 78 W m-2; 0.45) tended to underestimate fluxes, while SEBS (0.72; 101 W m-2; 0.24) overestimated values relative to observations. A focused analysis across specific biome types and climate zones showed considerable variability in the performance of all models, with no single model consistently able to outperform any other. Results also indicated that the global gridded data tended to reduce the performance for all of the studied models when compared to the tower data, likely a response to scale mismatch and issues related to forcing quality. Rather than relying on any single model simulation, the spatial and temporal variability at both the tower- and grid-scale highlighted the potential benefits of developing an ensemble or blended evaporation product for global-scale LandFlux applications. Hence, challenges related to the robust assessment of the LandFlux product are also discussed.
McCabe, M. F.; Ershadi, A.; Jimenez, C.; ...
2016-01-26
Determining the spatial distribution and temporal development of evaporation at regional and global scales is required to improve our understanding of the coupled water and energy cycles and to better monitor any changes in observed trends and variability of linked hydrological processes. With recent international efforts guiding the development of long-term and globally distributed flux estimates, continued product assessments are required to inform upon the selection of suitable model structures and also to establish the appropriateness of these multi-model simulations for global application. In support of the objectives of the Global Energy and Water Cycle Exchanges (GEWEX) LandFlux project, four commonly used evaporation models are evaluated against data from tower-based eddy-covariance observations, distributed across a range of biomes and climate zones. The selected schemes include the Surface Energy Balance System (SEBS) approach, the Priestley–Taylor Jet Propulsion Laboratory (PT-JPL) model, the Penman–Monteith-based Mu model (PM-Mu) and the Global Land Evaporation Amsterdam Model (GLEAM). Here we seek to examine the fidelity of global evaporation simulations by examining the multi-model response to varying sources of forcing data. To do this, we perform parallel and collocated model simulations using tower-based data together with a global-scale grid-based forcing product. Through quantifying the multi-model response to high-quality tower data, a better understanding of the subsequent model response to the coarse-scale globally gridded data that underlies the LandFlux product can be obtained, while also providing a relative evaluation and assessment of model performance. Using surface flux observations from 45 globally distributed eddy-covariance stations as independent metrics of performance, the tower-based analysis indicated that PT-JPL provided the highest overall statistical performance (0.72; 61 W m-2; 0.65), followed closely by GLEAM (0.68; 64 W m-2; 0.62), with values in parentheses representing the R2, RMSD and Nash–Sutcliffe efficiency (NSE), respectively. PM-Mu (0.51; 78 W m-2; 0.45) tended to underestimate fluxes, while SEBS (0.72; 101 W m-2; 0.24) overestimated values relative to observations. A focused analysis across specific biome types and climate zones showed considerable variability in the performance of all models, with no single model consistently able to outperform any other. Results also indicated that the global gridded data tended to reduce the performance for all of the studied models when compared to the tower data, likely a response to scale mismatch and issues related to forcing quality. Rather than relying on any single model simulation, the spatial and temporal variability at both the tower- and grid-scale highlighted the potential benefits of developing an ensemble or blended evaporation product for global-scale LandFlux applications. Hence, challenges related to the robust assessment of the LandFlux product are also discussed.
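The three statistics quoted in parentheses above (R2, RMSD and Nash-Sutcliffe efficiency) can be computed as in the following sketch; the observed and modelled flux series here are random stand-ins for tower data.

```python
# Evaluation statistics for a modelled flux series against tower observations.
import numpy as np

rng = np.random.default_rng(0)
observed = rng.gamma(shape=2.0, scale=50.0, size=1000)       # stand-in tower fluxes (W m-2)
modelled = observed + rng.normal(0, 60.0, size=1000)         # stand-in model estimates

def evaluate(obs, mod):
    r2 = np.corrcoef(obs, mod)[0, 1] ** 2
    rmsd = np.sqrt(np.mean((mod - obs) ** 2))
    nse = 1.0 - np.sum((mod - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
    return r2, rmsd, nse

r2, rmsd, nse = evaluate(observed, modelled)
print(f"R^2 = {r2:.2f}, RMSD = {rmsd:.0f} W m-2, NSE = {nse:.2f}")
```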
Redefinition and global estimation of basal ecosystem respiration rate
NASA Astrophysics Data System (ADS)
Yuan, Wenping; Luo, Yiqi; Li, Xianglan; Liu, Shuguang; Yu, Guirui; Zhou, Tao; Bahn, Michael; Black, Andy; Desai, Ankur R.; Cescatti, Alessandro; Marcolla, Barbara; Jacobs, Cor; Chen, Jiquan; Aurela, Mika; Bernhofer, Christian; Gielen, Bert; Bohrer, Gil; Cook, David R.; Dragoni, Danilo; Dunn, Allison L.; Gianelle, Damiano; Grünwald, Thomas; Ibrom, Andreas; Leclerc, Monique Y.; Lindroth, Anders; Liu, Heping; Marchesini, Luca Belelli; Montagnani, Leonardo; Pita, Gabriel; Rodeghiero, Mirco; Rodrigues, Abel; Starr, Gregory; Stoy, Paul C.
2011-12-01
Basal ecosystem respiration rate (BR), the ecosystem respiration rate at a given temperature, is a common and important parameter in empirical models for quantifying ecosystem respiration (ER) globally. Numerous studies have indicated that BR varies in space. However, many empirical ER models still use a global constant BR largely due to the lack of a functional description for BR. In this study, we redefined BR to be ecosystem respiration rate at the mean annual temperature. To test the validity of this concept, we conducted a synthesis analysis using 276 site-years of eddy covariance data, from 79 research sites located at latitudes ranging from ˜3°S to ˜70°N. Results showed that mean annual ER rate closely matches ER rate at mean annual temperature. Incorporation of site-specific BR into global ER model substantially improved simulated ER compared to an invariant BR at all sites. These results confirm that ER at the mean annual temperature can be considered as BR in empirical models. A strong correlation was found between the mean annual ER and mean annual gross primary production (GPP). Consequently, GPP, which is typically more accurately modeled, can be used to estimate BR. A light use efficiency GPP model (i.e., EC-LUE) was applied to estimate global GPP, BR and ER with input data from MERRA (Modern Era Retrospective-Analysis for Research and Applications) and MODIS (Moderate resolution Imaging Spectroradiometer). The global ER was 103 Pg C yr -1, with the highest respiration rate over tropical forests and the lowest value in dry and high-latitude areas.
Redefinition and global estimation of basal ecosystem respiration rate
Yuan, W.; Luo, Y.; Li, X.; Liu, S.; Yu, G.; Zhou, T.; Bahn, M.; Black, A.; Desai, A.R.; Cescatti, A.; Marcolla, B.; Jacobs, C.; Chen, J.; Aurela, M.; Bernhofer, C.; Gielen, B.; Bohrer, G.; Cook, D.R.; Dragoni, D.; Dunn, A.L.; Gianelle, D.; Grünwald, T.; Ibrom, A.; Leclerc, M.Y.; Lindroth, A.; Liu, H.; Marchesini, L.B.; Montagnani, L.; Pita, G.; Rodeghiero, M.; Rodrigues, A.; Starr, G.; Stoy, Paul C.
2011-01-01
Basal ecosystem respiration rate (BR), the ecosystem respiration rate at a given temperature, is a common and important parameter in empirical models for quantifying ecosystem respiration (ER) globally. Numerous studies have indicated that BR varies in space. However, many empirical ER models still use a global constant BR largely due to the lack of a functional description for BR. In this study, we redefined BR to be ecosystem respiration rate at the mean annual temperature. To test the validity of this concept, we conducted a synthesis analysis using 276 site-years of eddy covariance data, from 79 research sites located at latitudes ranging from ∼3°S to ∼70°N. Results showed that mean annual ER rate closely matches ER rate at mean annual temperature. Incorporation of site-specific BR into global ER model substantially improved simulated ER compared to an invariant BR at all sites. These results confirm that ER at the mean annual temperature can be considered as BR in empirical models. A strong correlation was found between the mean annual ER and mean annual gross primary production (GPP). Consequently, GPP, which is typically more accurately modeled, can be used to estimate BR. A light use efficiency GPP model (i.e., EC-LUE) was applied to estimate global GPP, BR and ER with input data from MERRA (Modern Era Retrospective-Analysis for Research and Applications) and MODIS (Moderate resolution Imaging Spectroradiometer). The global ER was 103 Pg C yr −1, with the highest respiration rate over tropical forests and the lowest value in dry and high-latitude areas.
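A hedged sketch of the redefinition: fit a simple Q10-type temperature response to (temperature, ER) data and read BR off as the fitted respiration rate at the site's mean annual temperature. The Q10 form and the synthetic data are illustrative, not the empirical model used in the study.

```python
# Fit a Q10-type respiration model anchored at the mean annual temperature.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
temp = rng.normal(loc=10.0, scale=8.0, size=365)                 # daily temperature (deg C)
er_obs = 2.0 * 2.1 ** ((temp - temp.mean()) / 10.0) + rng.normal(0, 0.2, temp.size)

t_mean = temp.mean()                                             # mean annual temperature

def er_model(t, br, q10):
    # BR is, by construction, the respiration rate at the mean annual temperature
    return br * q10 ** ((t - t_mean) / 10.0)

(br, q10), _ = curve_fit(er_model, temp, er_obs, p0=(1.0, 2.0))
print(f"basal respiration BR = {br:.2f} (at T = {t_mean:.1f} deg C), Q10 = {q10:.2f}")
```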
The credibility challenge for global fluvial flood risk analysis
NASA Astrophysics Data System (ADS)
Trigg, M. A.; Birch, C. E.; Neal, J. C.; Bates, P. D.; Smith, A.; Sampson, C. C.; Yamazaki, D.; Hirabayashi, Y.; Pappenberger, F.; Dutra, E.; Ward, P. J.; Winsemius, H. C.; Salamon, P.; Dottori, F.; Rudari, R.; Kappes, M. S.; Simpson, A. L.; Hadzilacos, G.; Fewtrell, T. J.
2016-09-01
Quantifying flood hazard is an essential component of resilience planning, emergency response, and mitigation, including insurance. Traditionally undertaken at catchment and national scales, recently, efforts have intensified to estimate flood risk globally to better allow consistent and equitable decision making. Global flood hazard models are now a practical reality, thanks to improvements in numerical algorithms, global datasets, computing power, and coupled modelling frameworks. Outputs of these models are vital for consistent quantification of global flood risk and in projecting the impacts of climate change. However, the urgency of these tasks means that outputs are being used as soon as they are made available and before such methods have been adequately tested. To address this, we compare multi-probability flood hazard maps for Africa from six global models and show wide variation in their flood hazard, economic loss and exposed population estimates, which has serious implications for model credibility. While there is around 30%-40% agreement in flood extent, our results show that even at continental scales, there are significant differences in hazard magnitude and spatial pattern between models, notably in deltas, arid/semi-arid zones and wetlands. This study is an important step towards a better understanding of modelling global flood hazard, which is urgently required for both current risk and climate change projections.
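One simple way to quantify the 30%-40% flood-extent agreement mentioned above is an intersection-over-union score between binary hazard maps, sketched below; the random toy grids stand in for model output rasters.

```python
# Pairwise flood-extent agreement (intersection over union) between two models.
import numpy as np

rng = np.random.default_rng(0)
model_a = rng.random((200, 200)) < 0.15       # True where model A predicts flooding
model_b = rng.random((200, 200)) < 0.15       # True where model B predicts flooding

intersection = np.logical_and(model_a, model_b).sum()
union = np.logical_or(model_a, model_b).sum()
print(f"flood extent agreement (IoU): {intersection / union:.2f}")
```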
Simulation skill of APCC set of global climate models for Asian summer monsoon rainfall variability
NASA Astrophysics Data System (ADS)
Singh, U. K.; Singh, G. P.; Singh, Vikas
2015-04-01
The performance of 11 Asia-Pacific Economic Cooperation Climate Center (APCC) global climate models (both coupled and uncoupled) in simulating the seasonal summer (June-August) monsoon rainfall variability over Asia (especially over India and East Asia) has been evaluated in detail using hind-cast data (3 months in advance) generated by APCC, which provides regional climate information products and services based on multi-model ensemble dynamical seasonal prediction systems. The skill of each global climate model over Asia was tested separately in detail for a period of 21 years (1983-2003), and the simulated Asian summer monsoon rainfall (ASMR) was verified using various statistical measures for the Indian and East Asian land masses separately. The analysis found a large variation in spatial ASMR simulated with uncoupled models compared to coupled models (like the Predictive Ocean Atmosphere Model for Australia, National Centers for Environmental Prediction and Japan Meteorological Agency). The simulated ASMR in coupled models was closer to the Climate Prediction Centre Merged Analysis of Precipitation (CMAP) than in uncoupled models, although the amount of ASMR was underestimated in both. A high spread in simulated ASMR was also found among the ensemble members, suggesting that model performance is highly dependent on initial conditions. The correlation analysis between sea surface temperature (SST) and ASMR shows that the coupled models are strongly associated with ASMR compared to the uncoupled models, suggesting that air-sea interaction is better represented in coupled models. The analysis of rainfall using various statistical measures suggests that the multi-model ensemble (MME) performed better than any individual model, and that treating the Indian and East Asian land masses separately is more useful than considering Asian monsoon rainfall as a whole. The results of the various statistical measures (the skill of the multi-model ensemble, the large spread among the ensemble members of individual models, the strong teleconnection with SST from correlation analysis, the coefficient of variation, inter-annual variability, and the Taylor diagram analysis) suggest that there is a need to improve coupled models, rather than uncoupled models, for the development of a better dynamical seasonal forecast system.
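A sketch of the statistics underlying a Taylor diagram comparison of a model hind-cast against the CMAP analysis: correlation, normalised standard deviation and centred RMS difference. The two 21-year series are random stand-ins.

```python
# Taylor-diagram statistics for one model against a reference analysis.
import numpy as np

rng = np.random.default_rng(0)
reference = rng.gamma(2.0, 3.0, size=21)                 # stand-in CMAP seasonal rainfall, 21 years
model = 0.8 * reference + rng.normal(0, 1.0, size=21)    # stand-in model hind-cast

def taylor_stats(ref, mod):
    corr = np.corrcoef(ref, mod)[0, 1]
    std_ratio = mod.std() / ref.std()
    crmsd = np.sqrt(np.mean(((mod - mod.mean()) - (ref - ref.mean())) ** 2)) / ref.std()
    return corr, std_ratio, crmsd

corr, std_ratio, crmsd = taylor_stats(reference, model)
print(f"correlation = {corr:.2f}, sigma_model/sigma_ref = {std_ratio:.2f}, "
      f"centred RMSD (normalised) = {crmsd:.2f}")
```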
What factors mediate the relationship between global self-worth and weight and shape concerns?
Murphy, Edel; Dooley, Barbara; Menton, Aoife; Dolphin, Louise
2016-04-01
The primary aim of this study was to investigate whether the relationship between global self-worth and weight concerns, and between global self-worth and shape concerns, was mediated by pertinent body image factors, while controlling for gender and estimated BMI. Participants were 775 adolescents (56% male) aged 12-18 years (M=14.6; SD=1.50). Mediation analysis revealed both a direct and a mediated effect between global self-worth and two body image models: 1) weight concerns and 2) shape concerns. The strongest mediators in both models were physical appearance, restrained eating, and depression. Partial mediation was observed for both models, indicating that body image factors spanning cognitive, affective, and behavioral constructs explain the association between global self-worth and weight and shape concerns. Implications for future research, weight and shape concern prevention, and global self-worth enhancement programs are discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.
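A hedged sketch of a single-mediator product-of-coefficients analysis (global self-worth to physical appearance to weight concerns) with plain least-squares fits; the variable names and synthetic data are stand-ins, and the published models included several mediators and covariates.

```python
# Simple single-mediator analysis via ordinary least squares (toy data).
import numpy as np

rng = np.random.default_rng(0)
n = 775
self_worth = rng.normal(size=n)
appearance = 0.6 * self_worth + rng.normal(size=n)            # mediator
weight_concern = 0.2 * self_worth + 0.5 * appearance + rng.normal(size=n)

def ols(y, *predictors):
    X = np.column_stack([np.ones(len(y)), *predictors])
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols(appearance, self_worth)[1]                  # path a: predictor -> mediator
b = ols(weight_concern, self_worth, appearance)[2]  # path b: mediator -> outcome
c_prime = ols(weight_concern, self_worth, appearance)[1]  # direct effect
print(f"indirect effect a*b = {a * b:.3f}, direct effect c' = {c_prime:.3f}")
```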
1976-03-01
atmosphere, as well as very fine grid cloud models and cloud probability models. Some of the new requirements that will be supported with this system are a ... including the Advanced Prediction Model for the global atmosphere, as well as very fine grid cloud models and cloud probability models. Some of the new ... with the mapping and gridding function (input and output)? Should the capability exist to interface raw ungridded data with the SID interface
Ager, Alastair; Zarowsky, Christina
2015-01-17
Strengthening health research capacity in low- and middle-income countries remains a major policy goal. The Health Research Capacity Strengthening (HRCS) Global Learning (HGL) program of work documented experiences of HRCS across sub-Saharan Africa. We reviewed findings from HGL case studies and reflective papers regarding the dynamics of HRCS. Analysis was structured with respect to common challenges in such work, identified through a multi-dimensional scaling analysis of responses from 37 participants at the concluding symposium of the program of work. Symposium participants identified 10 distinct clusters of challenges: engaging researchers, policymakers, and donors; securing trust and cooperation; finding common interest; securing long-term funding; establishing sustainable models of capacity strengthening; ensuring Southern ownership; accommodating local health system priorities and constraints; addressing disincentives for academic engagement; establishing and retaining research teams; and sustaining mentorship and institutional support. Analysis links these challenges to three key and potentially competing drivers of the political economy of health research: an enduring model of independent researchers and research leaders, the globalization of knowledge and the linked mobility of (elite) individuals, and institutionalization of research within universities and research centres and, increasingly, national research and development agendas. We identify tensions between efforts to embrace the global 'Community of Science' and the promotion and protection of national and institutional agendas in an unequal global health research environment. A nuanced understanding of the dynamics and implications of the uneven global health research landscape is required, along with a willingness to explore pragmatic models that seek to balance these competing drivers.
Managing uncertainty: a review of food system scenario analysis and modelling
Reilly, Michael; Willenbockel, Dirk
2010-01-01
Complex socio-ecological systems like the food system are unpredictable, especially to long-term horizons such as 2050. In order to manage this uncertainty, scenario analysis has been used in conjunction with food system models to explore plausible future outcomes. Food system scenarios use a diversity of scenario types and modelling approaches determined by the purpose of the exercise and by technical, methodological and epistemological constraints. Our case studies do not suggest Malthusian futures for a projected global population of 9 billion in 2050; but international trade will be a crucial determinant of outcomes; and the concept of sustainability across the dimensions of the food system has been inadequately explored so far. The impact of scenario analysis at a global scale could be strengthened with participatory processes involving key actors at other geographical scales. Food system models are valuable in managing existing knowledge on system behaviour and ensuring the credibility of qualitative stories but they are limited by current datasets for global crop production and trade, land use and hydrology. Climate change is likely to challenge the adaptive capacity of agricultural production and there are important knowledge gaps for modelling research to address. PMID:20713402
Vernon, Ian; Liu, Junli; Goldstein, Michael; Rowe, James; Topping, Jen; Lindsey, Keith
2018-01-02
Many mathematical models have now been employed across every area of systems biology. These models increasingly involve large numbers of unknown parameters, have complex structure which can result in substantial evaluation time relative to the needs of the analysis, and need to be compared to observed data of various forms. The correct analysis of such models usually requires a global parameter search, over a high dimensional parameter space, that incorporates and respects the most important sources of uncertainty. This can be an extremely difficult task, but it is essential for any meaningful inference or prediction to be made about any biological system. It hence represents a fundamental challenge for the whole of systems biology. Bayesian statistical methodology for the uncertainty analysis of complex models is introduced, which is designed to address the high dimensional global parameter search problem. Bayesian emulators that mimic the systems biology model but which are extremely fast to evaluate are embedded within an iterative history match: an efficient method to search high dimensional spaces within a more formal statistical setting, while incorporating major sources of uncertainty. The approach is demonstrated via application to a model of hormonal crosstalk in Arabidopsis root development, which has 32 rate parameters, for which we identify the sets of rate parameter values that lead to acceptable matches between model output and observed trend data. The multiple insights into the model's structure that this analysis provides are discussed. The methodology is applied to a second related model, and the biological consequences of the resulting comparison, including the evaluation of gene functions, are described. Bayesian uncertainty analysis for complex models using both emulators and history matching is shown to be a powerful technique that can greatly aid the study of a large class of systems biology models. It both provides insight into model behaviour and identifies the sets of rate parameters of interest.
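The emulation-plus-history-matching idea can be sketched in a few lines. The example below is not the authors' code: the systems biology model is replaced by a cheap toy function, a generic Gaussian-process emulator stands in for their Bayesian emulators, and the observation, variances, and implausibility cut-off are illustrative assumptions.

```python
# Minimal sketch of one wave of history matching with a Gaussian-process emulator.
# The expensive simulator is replaced by a cheap toy function f(x); thresholds and
# variances are illustrative, not taken from the paper.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def slow_model(x):                                   # stand-in for the expensive simulator
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(1)
X_train = rng.uniform(-1, 1, size=(40, 2))           # small design of full model runs
gp = GaussianProcessRegressor(normalize_y=True).fit(X_train, slow_model(X_train))

z, var_obs, var_disc = 0.7, 0.01, 0.02               # observed value and its uncertainties
X_cand = rng.uniform(-1, 1, size=(100_000, 2))       # cheap emulator evaluations
mean, std = gp.predict(X_cand, return_std=True)

# Implausibility: distance between emulator mean and data, scaled by all variances
I = np.abs(mean - z) / np.sqrt(std**2 + var_obs + var_disc)
non_implausible = X_cand[I < 3.0]                    # conventional cut-off of 3
print(f"{len(non_implausible)} of {len(X_cand)} candidate parameter sets retained")
```

Subsequent waves would rerun the simulator only inside the retained (non-implausible) region and refit the emulator there, progressively shrinking the acceptable parameter space.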
ERIC Educational Resources Information Center
Engberg, Mark E.; Jourian, T. J.; Davidson, Lisa M.
2016-01-01
This study examines the mediating role of intercultural wonderment in relation to students' development of a global perspective. We utilize both confirmatory factor analysis and structural equation modeling to validate the intercultural wonderment construct and test the direct and indirect effects of the structural pathways in the model,…
High-resolution local gravity model of the south pole of the Moon from GRAIL extended mission data.
Goossens, Sander; Sabaka, Terence J; Nicholas, Joseph B; Lemoine, Frank G; Rowlands, David D; Mazarico, Erwan; Neumann, Gregory A; Smith, David E; Zuber, Maria T
2014-05-28
We estimated a high-resolution local gravity field model over the south pole of the Moon using data from the Gravity Recovery and Interior Laboratory's extended mission. Our solution consists of adjustments with respect to a global model expressed in spherical harmonics. The adjustments are expressed as gridded gravity anomalies with a resolution of 1/6° by 1/6° (equivalent to that of a degree and order 1080 model in spherical harmonics), covering a cap over the south pole with a radius of 40°. The gravity anomalies have been estimated from a short-arc analysis using only Ka-band range-rate (KBRR) data over the area of interest. We apply a neighbor-smoothing constraint to our solution. Our local model removes striping present in the global model; it reduces the misfit to the KBRR data and improves correlations with topography to higher degrees than current global models. Key points: we present a high-resolution gravity model of the south pole of the Moon; improved correlations with topography to higher degrees than global models; improved fits to the data and reduced striping that is present in global models.
Sensitivity analysis of a wing aeroelastic response
NASA Technical Reports Server (NTRS)
Kapania, Rakesh K.; Eldred, Lloyd B.; Barthelemy, Jean-Francois M.
1991-01-01
A variation of Sobieski's Global Sensitivity Equations (GSE) approach is implemented to obtain the sensitivity of the static aeroelastic response of a three-dimensional wing model. The formulation is quite general and accepts any aerodynamics and structural analysis capability. An interface code is written to convert one analysis's output to the other's input, and vice versa. Local sensitivity derivatives are calculated by either analytic methods or finite difference techniques. A program to combine the local sensitivities, such as the sensitivity of the stiffness matrix or the aerodynamic kernel matrix, into global sensitivity derivatives is developed. The aerodynamic analysis package FAST, using a lifting surface theory, and a structural package, ELAPS, implementing Giles' equivalent plate model, are used.
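The core of the GSE approach is a small linear system that converts local partial derivatives from each discipline into total (global) derivatives that account for the aero-structural coupling. A minimal sketch for one design variable and one output per discipline, with made-up partial-derivative values, is:

```python
# Hedged sketch of Sobieski's Global Sensitivity Equations for a two-discipline
# (aerodynamics/structures) coupling; the partial-derivative values are invented.
import numpy as np

# Local partials, e.g. from finite differences on each analysis code
dY1_dY2 = np.array([[0.3]])     # sensitivity of aero output to structural output
dY2_dY1 = np.array([[0.5]])     # sensitivity of structural output to aero output
dY1_dX = np.array([[1.2]])      # direct sensitivity of aero output to the design variable
dY2_dX = np.array([[0.4]])      # direct sensitivity of structural output to the design variable

I = np.eye(1)
A = np.block([[I, -dY1_dY2],
              [-dY2_dY1, I]])
b = np.vstack([dY1_dX, dY2_dX])

total = np.linalg.solve(A, b)   # global (total) derivatives accounting for coupling
print("dY1/dX =", total[0, 0], " dY2/dX =", total[1, 0])
```

With vector-valued outputs the blocks simply become the corresponding Jacobians; the structure of the linear system is unchanged.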
NASA Technical Reports Server (NTRS)
Hou, Arthur Y.; Einaudi, Franco (Technical Monitor)
2001-01-01
I will discuss the need for accurate rainfall observations to improve our ability to model the earth's climate and improve short-range weather forecasts. I will give an overview of the recent progress in the use of rainfall data provided by TRMM and other microwave instruments in data assimilation to improve global analyses and diagnose state-dependent systematic errors in physical parameterizations. I will outline the current and future research strategies in preparation for the Global Precipitation Mission.
NASA Technical Reports Server (NTRS)
Chin, Mian; Ginoux, Paul; Torres, Omar; Zhao, Xue-Peng
2005-01-01
We propose a research project to incorporate a global 3-D model and satellite data into the multi-national Aerosol Characterization Experiment-Asia (ACE-Asia) mission. Our objectives are (1) to understand the physical, chemical, and optical properties of aerosols and the processes that control those properties over the Asian-Pacific region, (2) to investigate the interaction between aerosols and tropospheric chemistry, and (3) to determine the aerosol radiative forcing over the Asia-Pacific region. We will use the Georgia Tech/Goddard Global Ozone Chemistry Aerosol Radiation and Transport (GOCART) model to link satellite observations and the ACE-Asia measurements. First, we will use the GOCART model to simulate aerosols and related species, and evaluate the model with satellite and in-situ observations. Second, the model generated aerosol vertical profiles and compositions will be used to validate the satellite products; and the satellite data will be used for during- and post-mission analysis. Third, we will use the model to analyze and interpret both satellite and ACE-Asia field campaign data and investigate the aerosol-chemistry interactions. Finally, we will calculate aerosol radiative forcing over the Asian-Pacific region, and assess the influence of Asian pollution in the global atmosphere.
NASA Astrophysics Data System (ADS)
van den Dool, G.
2017-11-01
This study (van den Dool, 2017) is a proof of concept for a global predictive wildfire model, in which the temporal-spatial characteristics of wildfires are placed in a Geographical Information System (GIS), and the risk analysis is based on data-driven fuzzy logic functions. The data sources used in this model are available as global datasets, but subdivided into three pilot areas: North America (California/Nevada), Europe (Spain), and Asia (Mongolia), and are downscaled to the highest resolution (3-arc second). The GIS is constructed around three themes: topography, fuel availability and climate. From the topographical data, six derived sub-themes are created and converted to a fuzzy membership based on the catchment area statistics. The fuel availability score is a composite of four data layers: land cover, wood loads, biomass, biovolumes. As input for the climatological sub-model reanalysed daily averaged, weather-related data is used, which is accumulated to a global weekly time-window (to account for the uncertainty within the climatological model) and forms the temporal component of the model. The final product is a wildfire risk score (from 0 to 1) by week, representing the average wildfire risk in an area. To compute the potential wildfire risk the sub-models are combined using a Multi-Criteria Approach, and the model results are validated against the area under the Receiver Operating Characteristic curve.
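A hedged sketch of the final aggregation step, combining fuzzy membership scores for topography, fuel availability, and climate into a single risk score in [0, 1]; the membership functions and criterion weights below are assumptions, not the study's calibrated values:

```python
# Illustrative combination of fuzzy membership scores into a wildfire risk index
# in [0, 1]; membership functions and weights are invented for this sketch.
import numpy as np

def fuzzy_linear(x, low, high):
    """Membership rising linearly from 0 at `low` to 1 at `high`."""
    return float(np.clip((x - low) / (high - low), 0.0, 1.0))

slope_deg, fuel_load, dryness = 22.0, 0.7, 0.85     # example values for one grid cell
topo = fuzzy_linear(slope_deg, 0.0, 35.0)           # topographic sub-theme
fuel = fuel_load                                    # fuel score, already scaled to [0, 1]
clim = dryness                                      # weekly climatological dryness score

weights = np.array([0.25, 0.35, 0.40])              # hypothetical criterion weights
risk = float(np.dot(weights, [topo, fuel, clim]))   # weighted-sum multi-criteria score
print(f"wildfire risk score: {risk:.2f}")
```

A weighted sum is only one of several multi-criteria aggregation rules; the same structure applies per grid cell and per week to produce the risk maps described above.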
NASA Astrophysics Data System (ADS)
Wang, Audrey; Price, David T.
2007-03-01
A simple integrated algorithm was developed to relate global climatology to distributions of tree plant functional types (PFT). Multivariate cluster analysis was performed to analyze the statistical homogeneity of the climate space occupied by individual tree PFTs. Forested regions identified from the satellite-based GLC2000 classification were separated into tropical, temperate, and boreal sub-PFTs for use in the Canadian Terrestrial Ecosystem Model (CTEM). Global data sets of monthly minimum temperature, growing degree days, an index of climatic moisture, and estimated PFT cover fractions were then used as variables in the cluster analysis. The statistical results for individual PFT clusters were found consistent with other global-scale classifications of dominant vegetation. As an improvement of the quantification of the climatic limitations on PFT distributions, the results also demonstrated overlapping of PFT cluster boundaries that reflected vegetation transitions, for example, between tropical and temperate biomes. The resulting global database should provide a better basis for simulating the interaction of climate change and terrestrial ecosystem dynamics using global vegetation models.
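The multivariate cluster analysis of the climate space occupied by a PFT can be illustrated with a standard k-means clustering of standardized climate variables; the synthetic data below merely stand in for the minimum temperature, growing degree day, and moisture-index fields used in the study:

```python
# Sketch of a multivariate cluster analysis of the climate space occupied by a plant
# functional type; the synthetic "climate" columns stand in for monthly minimum
# temperature, growing degree days, and a climatic moisture index.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
# rows = grid cells where the PFT occurs; columns = climate variables
climate = np.column_stack([
    rng.normal(-5, 8, 5000),      # minimum temperature (deg C)
    rng.gamma(4, 400, 5000),      # growing degree days
    rng.uniform(0, 1, 5000),      # climatic moisture index
])

X = StandardScaler().fit_transform(climate)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
for k in range(3):
    print(f"cluster {k}: centroid (standardized) = {km.cluster_centers_[k].round(2)}")
```

Overlap between clusters belonging to different sub-PFTs is then interpretable as the transitional climate space (e.g. between tropical and temperate biomes) noted in the abstract.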
A fully traits-based approach to modeling global vegetation distribution.
van Bodegom, Peter M; Douma, Jacob C; Verheijen, Lieneke M
2014-09-23
Dynamic Global Vegetation Models (DGVMs) are indispensable for our understanding of climate change impacts. The application of traits in DGVMs is increasingly refined. However, a comprehensive analysis of the direct impacts of trait variation on global vegetation distribution does not yet exist. Here, we present such analysis as proof of principle. We run regressions of trait observations for leaf mass per area, stem-specific density, and seed mass from a global database against multiple environmental drivers, making use of findings of global trait convergence. This analysis explained up to 52% of the global variation of traits. Global trait maps, generated by coupling the regression equations to gridded soil and climate maps, showed up to orders of magnitude variation in trait values. Subsequently, nine vegetation types were characterized by the trait combinations that they possess using Gaussian mixture density functions. The trait maps were input to these functions to determine global occurrence probabilities for each vegetation type. We prepared vegetation maps, assuming that the most probable (and thus, most suited) vegetation type at each location will be realized. This fully traits-based vegetation map predicted 42% of the observed vegetation distribution correctly. Our results indicate that a major proportion of the predictive ability of DGVMs with respect to vegetation distribution can be attained by three traits alone if traits like stem-specific density and seed mass are included. We envision that our traits-based approach, our observation-driven trait maps, and our vegetation maps may inspire a new generation of powerful traits-based DGVMs.
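The step from trait maps to vegetation-type probabilities can be sketched with class-conditional Gaussian densities (a simplification of the Gaussian mixture density functions used in the paper); the trait values here are synthetic, not from the global database:

```python
# Sketch of the traits-to-vegetation-type step: fit a Gaussian density to the trait
# combinations observed for each vegetation type, then assign a new grid cell to the
# most probable type. One Gaussian per type is used here for brevity instead of a
# full mixture; all numbers are synthetic.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(3)
# three traits: log leaf mass per area, stem-specific density, log seed mass
types = {
    "tropical forest": rng.normal([4.5, 0.60, 1.0], 0.2, size=(200, 3)),
    "boreal forest":   rng.normal([5.2, 0.45, -0.5], 0.2, size=(200, 3)),
}

densities = {name: multivariate_normal(obs.mean(axis=0), np.cov(obs.T))
             for name, obs in types.items()}

cell_traits = np.array([5.1, 0.48, -0.3])            # traits predicted for one grid cell
probs = {name: d.pdf(cell_traits) for name, d in densities.items()}
print(max(probs, key=probs.get), probs)
```

Applying the same scoring to every grid cell of the trait maps, and keeping the most probable type per cell, yields a traits-based vegetation map of the kind evaluated in the paper.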
NASA Astrophysics Data System (ADS)
Fan, Yun; van den Dool, Huug
2004-05-01
We have produced a 0.5° × 0.5° monthly global soil moisture data set for the period from 1948 to the present. The land model is a one-layer "bucket" water balance model, while the driving input fields are Climate Prediction Center monthly global precipitation over land, which uses over 17,000 gauges worldwide, and monthly global temperature from global Reanalysis. The output consists of global monthly soil moisture, evaporation, and runoff, starting from January 1948. A distinguishing feature of this data set is that all fields are updated monthly, which greatly enhances utility for near-real-time purposes. Data validation shows that the land model does well; both the simulated annual cycle and interannual variability of soil moisture are reasonably good against the limited observations in different regions. A data analysis reveals that, on average, the land surface water balance components have a stronger annual cycle in the Southern Hemisphere than those in the Northern Hemisphere. From the point of view of soil moisture, climates can be characterized into two types, monsoonal and midlatitude climates, with the monsoonal ones covering most of the low-latitude land areas and showing a more prominent annual variation. A global soil moisture empirical orthogonal function analysis and time series of hemisphere means reveal some interesting patterns (like El Niño-Southern Oscillation) and long-term trends in both regional and global scales.
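A one-layer bucket water balance of the kind described here can be written in a few lines; the bucket capacity, the evaporation formulation, and the forcing values below are illustrative assumptions rather than the operational CPC configuration:

```python
# Minimal one-layer "bucket" water balance, in the spirit of the model described
# above; capacity, PET scaling, and forcing values are illustrative only.
def bucket_step(soil_moisture, precip, pet, capacity=150.0):
    """Advance soil moisture (mm) by one month given precipitation and
    potential evapotranspiration (mm/month)."""
    # Evaporation scales with how full the bucket is
    evap = pet * min(soil_moisture / capacity, 1.0)
    soil_moisture = soil_moisture + precip - evap
    runoff = max(soil_moisture - capacity, 0.0)      # overflow becomes runoff
    soil_moisture = min(max(soil_moisture, 0.0), capacity)
    return soil_moisture, evap, runoff

sm = 75.0
for month, (p, pe) in enumerate([(90, 40), (60, 55), (20, 80)], start=1):
    sm, e, q = bucket_step(sm, p, pe)
    print(f"month {month}: soil moisture={sm:.1f} mm, evap={e:.1f}, runoff={q:.1f}")
```

Driving such a step with gridded monthly precipitation and temperature-derived PET, cell by cell, is essentially how a monthly global soil moisture, evaporation, and runoff data set of this type is produced.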
Estimating Sobol Sensitivity Indices Using Correlations
Sensitivity analysis is a crucial tool in the development and evaluation of complex mathematical models. Sobol's method is a variance-based global sensitivity analysis technique that has been applied to computational models to assess the relative importance of input parameters on...
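For readers unfamiliar with variance-based indices, a minimal Monte Carlo estimator of first-order Sobol indices (a generic pick-and-freeze scheme, not necessarily the correlation-based estimator this entry refers to) looks like this for a toy two-parameter model:

```python
# Minimal Monte Carlo estimator of first-order Sobol indices for a toy model,
# using a pick-and-freeze scheme; sample sizes and the test function are arbitrary.
import numpy as np

def model(x):                                  # toy model in which x1 dominates
    return x[:, 0] + 0.3 * x[:, 1] ** 2

rng = np.random.default_rng(4)
n, d = 100_000, 2
A, B = rng.uniform(0, 1, (n, d)), rng.uniform(0, 1, (n, d))
fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                        # replace only column i
    fABi = model(ABi)
    S_i = np.mean(fB * (fABi - fA)) / var      # Saltelli-style first-order estimator
    print(f"S_{i + 1} = {S_i:.3f}")
```

The indices sum to at most one; the gap to one (and the total-effect indices, not shown) measures interaction effects among the inputs.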
Global Change adaptation in water resources management: the Water Change project.
Pouget, Laurent; Escaler, Isabel; Guiu, Roger; Mc Ennis, Suzy; Versini, Pierre-Antoine
2012-12-01
In recent years, water resources management has been facing new challenges due to increasing changes and their associated uncertainties, such as changes in climate, water demand or land use, which can be grouped under the term Global Change. The Water Change project (LIFE+ funding) developed a methodology and a tool to assess the Global Change impacts on water resources, thus helping river basin agencies and water companies in their long term planning and in the definition of adaptation measures. The main result of the project was the creation of a step-by-step methodology to assess Global Change impacts and define strategies of adaptation. This methodology was tested in the Llobregat river basin (Spain) with the objective of being applicable to any water system. It includes several steps such as setting up the problem with a DPSIR framework, developing Global Change scenarios, running river basin models and performing a cost-benefit analysis to define optimal strategies of adaptation. This methodology was supported by the creation of a flexible modelling system, which can link a wide range of models, such as hydrological, water quality, and water management models. The tool allows users to integrate their own models into the system, which can then exchange information among them automatically. This makes it possible to simulate the interactions among multiple components of the water cycle and to run a large number of Global Change scenarios quickly. The outcomes of this project make it possible to define and test different sets of adaptation measures for the basin that can be further evaluated through cost-benefit analysis. The integration of the results contributes to efficient decision-making on how to adapt to Global Change impacts. Copyright © 2012 Elsevier B.V. All rights reserved.
Modeling and Managing the Risks of Measles and Rubella: A Global Perspective, Part I.
Thompson, Kimberly M; Cochi, Stephen L
2016-07-01
Over the past 50 years, the use of vaccines led to significant decreases in the global burdens of measles and rubella, motivated at least in part by the successive development of global control and elimination targets. The Global Vaccine Action Plan (GVAP) includes specific targets for regional elimination of measles and rubella in five of six regions of the World Health Organization by 2020. Achieving the GVAP measles and rubella goals will require significant immunization efforts and associated financial investments and political commitments. Planning and budgeting for these efforts can benefit from learning some important lessons from the Global Polio Eradication Initiative (GPEI). Following an overview of the global context of measles and rubella risks and discussion of lessons learned from the GPEI, we introduce the contents of the special issue on modeling and managing the risks of measles and rubella. This introduction describes the synthesis of the literature available to support evidence-based model inputs to support the development of an integrated economic and dynamic disease transmission model to support global efforts to optimally manage these diseases globally using vaccines. © 2016 Society for Risk Analysis.
Information flow in an atmospheric model and data assimilation
NASA Astrophysics Data System (ADS)
Yoon, Young-noh
2011-12-01
Weather forecasting consists of two processes, model integration and analysis (data assimilation). During the model integration, the state estimate produced by the analysis evolves to the next cycle time according to the atmospheric model to become the background estimate. The analysis then produces a new state estimate by combining the background state estimate with new observations, and the cycle repeats. In an ensemble Kalman filter, the probability distribution of the state estimate is represented by an ensemble of sample states, and the covariance matrix is calculated using the ensemble of sample states. We perform numerical experiments on toy atmospheric models introduced by Lorenz in 2005 to study the information flow in an atmospheric model in conjunction with ensemble Kalman filtering for data assimilation. This dissertation consists of two parts. The first part of this dissertation is about the propagation of information and the use of localization in ensemble Kalman filtering. If we can perform data assimilation locally by considering the observations and the state variables only near each grid point, then we can reduce the number of ensemble members necessary to cover the probability distribution of the state estimate, reducing the computational cost for the data assimilation and the model integration. Several localized versions of the ensemble Kalman filter have been proposed. Although tests applying such schemes have proven them to be extremely promising, a full basic understanding of the rationale and limitations of localization is currently lacking. We address these issues and elucidate the role played by chaotic wave dynamics in the propagation of information and the resulting impact on forecasts. The second part of this dissertation is about ensemble regional data assimilation using joint states. Assuming that we have a global model and a regional model of higher accuracy defined in a subregion inside the global region, we propose a data assimilation scheme that produces the analyses for the global and the regional model simultaneously, considering forecast information from both models. We show that our new data assimilation scheme produces better results both in the subregion and the global region than the data assimilation scheme that produces the analyses for the global and the regional model separately.
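The analysis step described above can be condensed into a short stochastic ensemble Kalman filter update; the state dimension, observation operator, and error statistics below are toy values rather than the Lorenz-2005 configuration used in the dissertation:

```python
# Sketch of a (stochastic) ensemble Kalman filter analysis step: the background
# covariance is formed from the ensemble and used to blend background states with
# a perturbed observation. Dimensions and noise levels are illustrative only.
import numpy as np

rng = np.random.default_rng(5)
n_state, n_ens = 40, 20
Xb = rng.normal(size=(n_state, n_ens))          # background ensemble (e.g. toy model states)
H = np.zeros((1, n_state)); H[0, 10] = 1.0      # observe a single grid point
R = np.array([[0.1]])                           # observation-error covariance
y = np.array([0.5])                             # the observation

Xb_mean = Xb.mean(axis=1, keepdims=True)
Xp = Xb - Xb_mean                               # ensemble perturbations
Pb = Xp @ Xp.T / (n_ens - 1)                    # sample background covariance

K = Pb @ H.T @ np.linalg.inv(H @ Pb @ H.T + R)  # Kalman gain
Xa = np.empty_like(Xb)
for m in range(n_ens):
    y_pert = y + rng.normal(scale=np.sqrt(R[0, 0]))     # perturbed observation
    Xa[:, m] = Xb[:, m] + K @ (y_pert - H @ Xb[:, m])
print("analysis mean at observed point:", round(float(Xa[10].mean()), 3))
```

Localization, the subject of the first part of the dissertation, amounts to tapering or restricting Pb (and hence K) so that each state variable is updated only by nearby observations, which keeps small ensembles from producing spurious long-range correlations.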
Surrogate models for efficient stability analysis of brake systems
NASA Astrophysics Data System (ADS)
Nechak, Lyes; Gillot, Frédéric; Besset, Sébastien; Sinou, Jean-Jacques
2015-07-01
This study assesses the capacity of global sensitivity analysis, combined with the kriging formalism, to support the robust stability analysis of brake systems, which is too costly when performed with the classical complex eigenvalue analysis (CEA) based on finite element models (FEMs). By considering a simplified brake system, the global sensitivity analysis is first shown to be very helpful for understanding the effects of design parameters on the brake system's stability. This is allowed by the so-called Sobol indices, which discriminate design parameters with respect to their influence on the stability. Consequently, only the uncertainty of influential parameters is taken into account in the following step, namely, the surrogate modelling based on kriging. The latter is then demonstrated to be an interesting alternative to FEMs since it allowed, at a lower cost, an accurate estimation of the system's proportions of instability corresponding to the influential parameters.
NASA Astrophysics Data System (ADS)
Ise, T.; Litton, C. M.; Giardina, C. P.; Ito, A.
2009-12-01
Plant partitioning of carbon (C) to above- vs. belowground, to growth vs. respiration, and to short vs. long lived tissues exerts a large influence on ecosystem structure and function with implications for the global C budget. Importantly, outcomes of process-based terrestrial vegetation models are likely to vary substantially with different C partitioning algorithms. However, controls on C partitioning patterns remain poorly quantified, and studies have yielded variable, and at times contradictory, results. A recent meta-analysis of forest studies suggests that the ratio of net primary production (NPP) and gross primary production (GPP) is fairly conservative across large scales. To illustrate the effect of this unique meta-analysis-based partitioning scheme (MPS), we compared an application of MPS to a terrestrial satellite-based (MODIS) GPP to estimate NPP vs. two global process-based vegetation models (Biome-BGC and VISIT) to examine the influence of C partitioning on C budgets of woody plants. Due to the temperature dependence of maintenance respiration, NPP/GPP predicted by the process-based models increased with latitude while the ratio remained constant with MPS. Overall, global NPP estimated with MPS was 17 and 27% lower than the process-based models for temperate and boreal biomes, respectively, with smaller differences in the tropics. Global equilibrium biomass of woody plants was then calculated from the NPP estimates and tissue turnover rates from VISIT. Since turnover rates differed greatly across tissue types (i.e., metabolically active vs. structural), global equilibrium biomass estimates were sensitive to the partitioning scheme employed. The MPS estimate of global woody biomass was 7-21% lower than that of the process-based models. In summary, we found that model output for NPP and equilibrium biomass was quite sensitive to the choice of C partitioning schemes. (Figure: carbon use efficiency (CUE; NPP/GPP) by forest biome and for the globe; values are means for 2001-2006.)
Global sensitivity analysis in stochastic simulators of uncertain reaction networks.
Navarro Jimenez, M; Le Maître, O P; Knio, O M
2016-12-28
Stochastic models of chemical systems are often subjected to uncertainties in kinetic parameters in addition to the inherent random nature of their dynamics. Uncertainty quantification in such systems is generally achieved by means of sensitivity analyses in which one characterizes the variability of the first statistical moments of model predictions with respect to the uncertain kinetic parameters. In this work, we propose an original global sensitivity analysis method where the parametric and inherent variability sources are both treated through Sobol's decomposition of the variance into contributions from arbitrary subsets of uncertain parameters and stochastic reaction channels. The conceptual development only assumes that the inherent and parametric sources are independent, and considers the Poisson processes in the random-time-change representation of the state dynamics as the fundamental objects governing the inherent stochasticity. A sampling algorithm is proposed to perform the global sensitivity analysis, and to estimate the partial variances and sensitivity indices characterizing the importance of the various sources of variability and their interactions. The birth-death and Schlögl models are used to illustrate both the implementation of the algorithm and the richness of the proposed analysis method. The output of the proposed sensitivity analysis is also contrasted with a local derivative-based sensitivity analysis method classically used for this type of system.
NASA Astrophysics Data System (ADS)
Dowell, M.; Moore, T.; Follows, M.; Dutkiewicz, S.
2006-12-01
In recent years there has been significant progress both in the use of satellite ocean colour remote sensing and in coupled hydrodynamic-biological models for producing maps of different dominant phytoplankton groups in the global ocean. In parallel to these initiatives, there is ongoing research, largely following on from Alan Longhurst's seminal work, on defining a template of distinct ecological and biogeochemical provinces for the oceans based on their physical and biochemical characteristics. For these products and models to be of maximum use in their subsequent inclusion in re-analysis and climate scale models, there is a need to understand how the "observed" distributions of dominant phytoplankton (realized niche) coincide with the environmental constraints in which they occur (fundamental niche). In the current paper, we base our analysis on recently published results on the distribution of dominant phytoplankton species at global scale, derived both from satellite and model analysis. Furthermore, we will present research on defining biogeochemical provinces using satellite and model data inputs and a fuzzy-logic-based approach. This will be compared with ongoing modelling efforts, which include competitive exclusion and are therefore compatible with the definition of the realized ecological niche, to define the emergent distribution of dominant phytoplankton species. Ultimately we investigate the coherence of these two distinct approaches to studying phytoplankton distributions and propose the significance of this in the context of modelling and analysis at various scales.
Air Quality Forecasts Using the NASA GEOS Model
NASA Technical Reports Server (NTRS)
Keller, Christoph A.; Knowland, K. Emma; Nielsen, Jon E.; Orbe, Clara; Ott, Lesley; Pawson, Steven; Saunders, Emily; Duncan, Bryan; Follette-Cook, Melanie; Liu, Junhua;
2018-01-01
We provide an introduction to a new high-resolution (0.25 degree) global composition forecast produced by NASA's Global Modeling and Assimilation office. The NASA Goddard Earth Observing System version 5 (GEOS-5) model has been expanded to provide global near-real-time forecasts of atmospheric composition at a horizontal resolution of 0.25 degrees (25 km). Previously, this combination of detailed chemistry and resolution was only provided by regional models. This system combines the operational GEOS-5 weather forecasting model with the state-of-the-science GEOS-Chem chemistry module (version 11) to provide detailed chemical analysis of a wide range of air pollutants such as ozone, carbon monoxide, nitrogen oxides, and fine particulate matter (PM2.5). The resolution of the forecasts is the highest among current, publicly available global composition forecasts. Evaluation and validation of modeled trace gases and aerosols against surface and satellite observations will be presented for constituents relevant to health-based air quality standards. Comparisons of modeled trace gases and aerosols against satellite observations show that the model produces realistic concentrations of atmospheric constituents in the free troposphere. Model comparisons against surface observations highlight the model's capability to capture the diurnal variability of air pollutants under a variety of meteorological conditions. The GEOS-5 composition forecasting system offers a new tool for scientists and the public health community, and is being developed jointly with several government and non-profit partners. Potential applications include air quality warnings, flight campaign planning and exposure studies using the archived analysis fields.
Maintaining Atmospheric Mass and Water Balance Within Reanalysis
NASA Technical Reports Server (NTRS)
Takacs, Lawrence L.; Suarez, Max; Todling, Ricardo
2015-01-01
This report describes the modifications implemented into the Goddard Earth Observing System Version-5 (GEOS-5) Atmospheric Data Assimilation System (ADAS) to maintain global conservation of dry atmospheric mass as well as to preserve the model balance of globally integrated precipitation and surface evaporation during reanalysis. Section 1 begins with a review of these global quantities from four current reanalysis efforts. Section 2 introduces the modifications necessary to preserve these constraints within the atmospheric general circulation model (AGCM), the Gridpoint Statistical Interpolation (GSI) analysis procedure, and the Incremental Analysis Update (IAU) algorithm. Section 3 presents experiments quantifying the impact of the new procedure. Section 4 shows preliminary results from its use within the GMAO MERRA-2 Reanalysis project. Section 5 concludes with a summary.
Uncertainty analysis in 3D global models: Aerosol representation in MOZART-4
NASA Astrophysics Data System (ADS)
Gasore, J.; Prinn, R. G.
2012-12-01
The Probabilistic Collocation Method (PCM) has been proven to be an efficient general method of uncertainty analysis in atmospheric models (Tatang et al., 1997; Cohen and Prinn, 2011). However, its application has been mainly limited to urban- and regional-scale models and chemical source-sink models, because of the drastic increase in computational cost when the dimension of uncertain parameters increases. Moreover, the high-dimensional output of global models has to be reduced to allow a computationally reasonable number of polynomials to be generated. This dimensional reduction has been mainly achieved by grouping the model grids into a few regions based on prior knowledge and expectations; urban versus rural for instance. As the model output is used to estimate the coefficients of the polynomial chaos expansion (PCE), the arbitrariness in the regional aggregation can generate problems in estimating uncertainties. To address these issues in a complex model, we apply the probabilistic collocation method of uncertainty analysis to the aerosol representation in MOZART-4, which is a 3D global chemical transport model (Emmons et al., 2010). Thereafter, we deterministically delineate the model output surface into regions of homogeneous response using the method of Principal Component Analysis. This allows the quantification of the uncertainty associated with the dimensional reduction. Because only a bulk mass is calculated online in MOZART-4, a lognormal number distribution is assumed with a priori fixed scale and location parameters, to calculate the surface area for heterogeneous reactions involving tropospheric oxidants. We have applied the PCM to the six parameters of the lognormal number distributions of Black Carbon, Organic Carbon and Sulfate. We have carried out a Monte-Carlo sampling from the probability density functions of the six uncertain parameters, using the reduced PCE model. The global mean concentration of major tropospheric oxidants did not show a significant variation in response to the variation in input parameters. However, a substantial variation at regional and temporal scale has been found. Tatang M. A., Pan W., Prinn R. G., McRae G. J., An efficient method for parametric uncertainty analysis of numerical geophysical models, J. Geophys. Res., 102, 21925-21932, 1997. Cohen, J. B., and R. G. Prinn, Development of a fast, urban chemistry metamodel for inclusion in global models, Atmos. Chem. Phys., 11, 7629-7656, doi:10.5194/acp-11-7629-2011, 2011. Emmons L. K., Walters S., Hess P. G., Lamarque J.-F., Pfister G. G., Fillmore D., Granier C., Guenther A., Kinnison D., Laepple T., Orlando J., Tie X., Tyndall G., Wiedinmyer C., Baughcum S. L., Kloster S., Description and evaluation of the Model for Ozone and Related chemical Tracers, version 4 (MOZART-4), Geosci. Model Dev., 3, 43-67, 2010.
NASA Technical Reports Server (NTRS)
1986-01-01
A variety of topics relevant to global modeling and simulation are presented. Areas of interest include: (1) analysis and forecast studies; (2) satellite observing systems; (3) analysis and forecast model development; (4) atmospheric dynamics and diagnostic studies; (5) climate/ocean-air interactions; and notes from lectures.
Bravo, Adrian J; Pearson, Matthew R
2017-10-01
The present study sought to address an issue in the drinking to cope (DTC) motives literature, namely the inconsistent application of treating DTC motives as a single construct and splitting it into DTC-depression and DTC-anxiety motives. Specifically, we aimed to determine if the effects of anxiety and depression on alcohol-related problems are best explained via their associations with DTC with specific affects or via their associations with a more global measure of DTC by testing four distinct models: the effects of anxiety/depression on alcohol-related problems mediated by DTC-anxiety only (Model 1), these effects mediated by DTC-depression only (Model 2), these effects mediated by a combined, global DTC factor (Model 3), and these effects mediated by both DTC-anxiety and DTC-depression (Model 4). Using path analysis/structural equation modeling across two independent samples, we found that there was a significant total indirect effect of both anxiety and depressive symptoms on alcohol-related problems in every model. However, there was a slightly larger indirect effect in all models using the global DTC motives factor compared to even the model that included the two distinct DTC motives. Our results provide some preliminary evidence that at least at the between-subjects level, a global DTC motives factor may have more predictive validity than separate DTC motives. Additional research is needed to examine how to best operationalize DTC motives at different levels of analysis (e.g., within-subjects vs. between subjects) and in different populations (e.g., college students vs. individuals with alcohol use disorder). Copyright © 2017. Published by Elsevier Ltd.
Multiphysics Nuclear Thermal Rocket Thrust Chamber Analysis
NASA Technical Reports Server (NTRS)
Wang, Ten-See
2005-01-01
The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for hypothetical thrust chamber design and analysis. The current task scope is to perform multidimensional, multiphysics analysis of thrust performance and heat transfer analysis for a hypothetical solid-core, nuclear thermal engine including thrust chamber and nozzle. The multiphysics aspects of the model include: real fluid dynamics, chemical reactivity, turbulent flow, and conjugate heat transfer. The model will be designed to identify thermal, fluid, and hydrogen environments in all flow paths and materials. This model would then be used to perform non-nuclear reproduction of the flow element failures demonstrated in the Rover/NERVA testing, investigate performance of specific configurations, and assess potential issues and enhancements. A two-pronged approach will be employed in this effort: a detailed analysis of a multi-channel flow element, and global modeling of the entire thrust chamber assembly with a porosity modeling technique. It is expected that the detailed analysis of a single flow element would provide detailed fluid, thermal, and hydrogen environments for stress analysis, while the global thrust chamber assembly analysis would promote understanding of the effects of hydrogen dissociation and heat transfer on thrust performance. These modeling activities will be validated as much as possible by testing performed by other related efforts.
Lee, Yeonok; Wu, Hulin
2012-01-01
Differential equation models are widely used for the study of natural phenomena in many fields. The study usually involves unknown factors such as initial conditions and/or parameters. It is important to investigate the impact of unknown factors (parameters and initial conditions) on model outputs in order to better understand the system the model represents. Apportioning the uncertainty (variation) of output variables of a model according to the input factors is referred to as sensitivity analysis. In this paper, we focus on the global sensitivity analysis of ordinary differential equation (ODE) models over a time period using the multivariate adaptive regression spline (MARS) as a meta-model, based on the concept of the variance of conditional expectation (VCE). We propose evaluating the VCE analytically using the MARS model structure of univariate tensor-product functions, which is more computationally efficient. Our simulation studies show that the MARS model approach performs very well and helps to significantly reduce the computational cost. We present an application example of sensitivity analysis of ODE models for influenza infection to further illustrate the usefulness of the proposed method.
NASA Technical Reports Server (NTRS)
Schubert, Siegfried
2011-01-01
The Global Modeling and Assimilation Office at NASA's Goddard Space Flight Center is developing a number of experimental prediction and analysis products suitable for research and applications. The prediction products include a large suite of subseasonal and seasonal hindcasts and forecasts (as a contribution to the US National MME), a suite of decadal (10-year) hindcasts (as a contribution to the IPCC decadal prediction project), and a series of large ensemble and high resolution simulations of selected extreme events, including the 2010 Russian and 2011 US heat waves. The analysis products include an experimental atlas of climate (in particular drought) and weather extremes. This talk will provide an update on those activities, and discuss recent efforts by WCRP to leverage off these and similar efforts at other institutions throughout the world to develop an experimental global drought early warning system.
NASA Astrophysics Data System (ADS)
Liang, Q.; Chipperfield, M.; Daniel, J. S.; Burkholder, J. B.; Rigby, M. L.; Velders, G. J. M.
2015-12-01
The hydroxyl radical (OH) is the major oxidant in the atmosphere. Reaction with OH is the primary removal process for many non-CO2 greenhouse gases (GHGs), ozone-depleting substances (ODSs) and their replacements, e.g. hydrochlorofluorocarbons (HCFCs) and hydrofluorocarbons (HFCs). Traditionally, the global OH abundance is inferred using the observed atmospheric rate of change for methyl chloroform (MCF). Due to the Montreal Protocol regulation, the atmospheric abundance of MCF has been decreasing rapidly to near-zero values. It is becoming critical to find an alternative reference compound to continue to provide quantitative information for the global OH abundance. Our model analysis using the NASA 3-D GEOS-5 Chemistry Climate Model suggests that the inter-hemispheric gradients (IHG) of the HCFCs and HFCs show a strong linear correlation with their global emissions. Therefore it is possible to use (i) the observed IHGs of HCFCs and HFCs to estimate their global emissions, and (ii) the derived emissions and the observed long-term trends to calculate their lifetimes and to infer the global OH abundance. Preliminary analysis using a simple global two-box model (one box for each hemisphere) and information from the global 3-D model suggests that the quantitative relationship between IHG and global emissions varies slightly among individual compounds depending on their lifetime, their emissions history and emission fractions from the two hemispheres. While each compound shows different sensitivity to the above quantities, the combined suite of the HCFCs and HFCs provides a means to derive global OH abundance and the corresponding atmospheric lifetimes of long-lived gases with respect to OH (tOH). The fact that the OH partial lifetimes of these compounds are highly correlated, with the ratio of tOH equal to the inverse ratio of their OH thermal reaction rates at 272 K, provides an additional constraint that can greatly reduce the uncertainty in the OH abundance and tOH estimates. We will use the observed IHGs and long-term trends of three major HCFCs and six major HFCs in the two-box model to derive their global emissions and atmospheric lifetimes as well as the global OH abundance. The derived global OH abundance between 2000 and 2014 will be compared with that derived using MCF for consistency.
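The two-box reasoning can be illustrated with a toy model in which each hemisphere's burden evolves under emission, OH loss, and inter-hemispheric exchange; the lifetime, exchange time, and emissions below are made-up numbers, not the study's values:

```python
# Toy two-box (Northern/Southern Hemisphere) model of a gas removed by OH, of the
# kind described above; emissions, lifetime, and exchange time are invented.
tau_OH = 10.0             # lifetime against OH loss (years)
tau_ex = 1.0              # inter-hemispheric exchange time (years)
E_NH, E_SH = 90.0, 10.0   # emissions (Gg/yr), mostly from the Northern Hemisphere

dt, years = 0.01, 50
C_NH = C_SH = 0.0
for _ in range(int(years / dt)):
    exch = (C_NH - C_SH) / tau_ex
    C_NH += dt * (E_NH - C_NH / tau_OH - exch)
    C_SH += dt * (E_SH - C_SH / tau_OH + exch)

ihg = C_NH - C_SH
print(f"steady-state burdens: NH={C_NH:.1f}, SH={C_SH:.1f}, IHG={ihg:.1f}")
# At steady state the IHG scales roughly linearly with the (mostly Northern Hemisphere)
# emissions; inverting that relationship, given observed gradients and trends, is what
# allows emissions, lifetimes, and hence OH to be estimated.
```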
Advection modes by optimal mass transfer
NASA Astrophysics Data System (ADS)
Iollo, Angelo; Lombardi, Damiano
2014-02-01
Classical model reduction techniques approximate the solution of a physical model by a limited number of global modes. These modes are usually determined by variants of principal component analysis. Global modes can lead to reduced models that perform well in terms of stability and accuracy. However, when the physics of the model is mainly characterized by advection, the nonlocal representation of the solution by global modes essentially reduces to a Fourier expansion. In this paper we describe a method to determine a low-order representation of advection. This method is based on the solution of Monge-Kantorovich mass transfer problems. Examples of application to point vortex scattering, Korteweg-de Vries equation, and hurricane Dean advection are discussed.
NASA Astrophysics Data System (ADS)
Hassan, Gasser E.; Youssef, M. Elsayed; Ali, Mohamed A.; Mohamed, Zahraa E.; Shehata, Ali I.
2016-11-01
Different models have been introduced to predict the daily global solar radiation in different locations, but no specific model based on the day of the year has been proposed for many locations around the world. In this study, more than 20 years of measured data for daily global solar radiation on a horizontal surface are used to develop and validate seven models to estimate the daily global solar radiation by day of the year for ten cities around Egypt as a case study. Moreover, the generalization capability of the best models is examined all over the country. Regression analysis is employed to calculate the coefficients of the different suggested models. The statistical indicators, namely RMSE, MABE, MAPE, r and R2, are calculated to evaluate the performance of the developed models. Based on the validation with the available data, the results show that the hybrid sine and cosine wave model and the 4th-order polynomial model have the best performance among the suggested models. Consequently, these two models, coupled with suitable coefficients, can be used for estimating the daily global solar radiation on a horizontal surface for each city, and also for all locations around the studied region. It is believed that the established models in this work are applicable and significant for quick estimation of the average daily global solar radiation on a horizontal surface with higher accuracy. The values of global solar radiation generated by this approach can be utilized in the design and estimation of the performance of different solar applications.
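A rough illustration of fitting two day-of-year models to daily radiation data follows; the exact hybrid sine-and-cosine functional form used in the paper is not reproduced here, so both the model form and the synthetic data are assumptions:

```python
# Illustrative fit of two day-of-year models for daily global solar radiation:
# an assumed hybrid sine+cosine form and a 4th-order polynomial. The "measured"
# data are synthetic, with a Northern Hemisphere seasonal cycle plus noise.
import numpy as np

rng = np.random.default_rng(6)
day = np.arange(1, 366)
H_obs = 20 + 8 * np.cos(2 * np.pi * (day - 172) / 365) + rng.normal(0, 1.5, day.size)

# Hybrid sine-cosine model: H = a0 + a1*sin(2*pi*n/365) + a2*cos(2*pi*n/365)
X = np.column_stack([np.ones(day.size), np.sin(2 * np.pi * day / 365),
                     np.cos(2 * np.pi * day / 365)])
coef, *_ = np.linalg.lstsq(X, H_obs, rcond=None)
H_trig = X @ coef

# 4th-order polynomial model (day scaled to [0, 1] to keep the fit well conditioned)
H_poly = np.polyval(np.polyfit(day / 365.0, H_obs, 4), day / 365.0)

for name, pred in [("sine+cosine", H_trig), ("4th-order poly", H_poly)]:
    rmse = np.sqrt(np.mean((pred - H_obs) ** 2))
    print(f"{name}: RMSE = {rmse:.2f} MJ/m^2/day")
```

The RMSE, MABE, MAPE, r and R2 indicators mentioned in the abstract are all computed from the same observed-versus-predicted pairs produced by such a fit.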
1991-01-01
Acronym glossary fragment: FYDP - Five Year Defense Plan; FSI - Fog Stability Index; G - gravity, giga-; GISM - Gridded ...; ... Global Circulation Model; GOES-TAP - GOES imagery processing & dissemination system; GCS - grid course; GOFS - Global Ocean Flux Study; GD - ... Analysis Support System ... Complex Systems; GRID - Global Resource Information Data-Base; GEMAG - geomagnetic; GRIST - grazing-incidence solar ...
Posttest analysis of the 1:6-scale reinforced concrete containment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pfeiffer, P.A.; Kennedy, J.M.; Marchertas, A.H.
A prediction of the response of the Sandia National Laboratories 1:6-scale reinforced concrete containment model test was made by Argonne National Laboratory. ANL along with nine other organizations performed a detailed nonlinear response analysis of the 1:6-scale model containment subjected to overpressurization in the fall of 1986. The two-dimensional code TEMP-STRESS and the three-dimensional NEPTUNE code were utilized (1) to predict the global response of the structure, (2) to identify global failure sites and the corresponding failure pressures and (3) to identify some local failure sites and pressure levels. A series of axisymmetric models was studied with the two-dimensional computer program TEMP-STRESS. The comparison of these pretest computations with test data from the containment model has provided a test for the capability of the respective finite element codes to predict global failure modes, and hence serves as a validation of these codes. Only the two-dimensional analyses will be discussed in this paper. 3 refs., 10 figs.
NASA Astrophysics Data System (ADS)
Wang, Yuan; Zhang, Renyi; Saravanan, R.
2014-01-01
Increasing levels of anthropogenic aerosols in Asia have raised considerable concern regarding their potential impact on the global atmosphere, but the magnitude of the associated climate forcing remains to be quantified. Here, using a novel hierarchical modelling approach and observational analysis, we demonstrate that Asian pollution has modulated mid-latitude cyclones over the past three decades. Regional and seasonal simulations using a cloud-resolving model show that Asian pollution invigorates winter cyclones over the northwest Pacific, increasing precipitation by 7% and net cloud radiative forcing by 1.0 W m-2 at the top of the atmosphere and by 1.7 W m-2 at the Earth's surface. A global climate model incorporating the diabatic heating anomalies from Asian pollution produces a 9% enhanced transient eddy meridional heat flux and reconciles a decadal variation of mid-latitude cyclones derived from the Reanalysis data. Our results unambiguously reveal a large impact of the Asian pollutant outflows on the global general circulation and climate.
NASA/MSFC FY90 Global Scale Atmospheric Processes Research Program Review
NASA Technical Reports Server (NTRS)
Leslie, Fred W. (Editor)
1990-01-01
Research supported by the Global Atmospheric Research Program at the Marshall Space Flight Center on atmospheric remote sensing, meteorology, numerical weather forecasting, satellite data analysis, cloud precipitation, atmospheric circulation, atmospheric models and related topics is discussed.
NASA Astrophysics Data System (ADS)
Casson, David; Werner, Micha; Weerts, Albrecht; Schellekens, Jaap; Solomatine, Dimitri
2017-04-01
Hydrological modelling in the Canadian Sub-Arctic is hindered by the limited spatial and temporal coverage of local meteorological data. Local watershed modelling often relies on data from a sparse network of meteorological stations with a rough density of 3 active stations per 100,000 km2. Global datasets hold great promise for application due to more comprehensive spatial and extended temporal coverage. A key objective of this study is to demonstrate the application of global datasets and data assimilation techniques for hydrological modelling of a data sparse, Sub-Arctic watershed. Application of available datasets and modelling techniques is currently limited in practice due to a lack of local capacity and understanding of available tools. Due to the importance of snow processes in the region, this study also aims to evaluate the performance of global SWE products for snowpack modelling. The Snare Watershed is a 13,300 km2 snowmelt driven sub-basin of the Mackenzie River Basin, Northwest Territories, Canada. The Snare watershed is data sparse in terms of meteorological data, but is well gauged with consistent discharge records since the late 1970s. End of winter snowpack surveys have been conducted every year from 1978-present. The application of global re-analysis datasets from the EU FP7 eartH2Observe project is investigated in this study. Precipitation data are taken from Multi-Source Weighted-Ensemble Precipitation (MSWEP) and temperature data from the WATCH Forcing Data applied to European Reanalysis (ERA)-Interim data (WFDEI). GlobSnow-2 is a global Snow Water Equivalent (SWE) measurement product funded by the European Space Agency (ESA) and is also evaluated over the local watershed. Downscaled precipitation, temperature and potential evaporation datasets are used as forcing data in a distributed version of the HBV model implemented in the WFLOW framework. Results demonstrate the successful application of global datasets in local watershed modelling, but also that validation of actual frozen precipitation and snowpack conditions is very difficult. The distributed hydrological model shows good streamflow simulation performance based on statistical model evaluation techniques. Results are also promising for inter-annual variability, spring snowmelt onset and time to peak flows. It is expected that data assimilation of stream flow using an Ensemble Kalman Filter will further improve model performance. This study shows that global re-analysis datasets hold great potential for understanding the hydrology and snowpack dynamics of the expansive and data sparse sub-Arctic. However, global SWE products will require further validation and algorithm improvements, particularly over boreal forest and lake-rich regions.
NASA Astrophysics Data System (ADS)
Alonso-Contes, C.; Gerber, S.; Bliznyuk, N.; Duerr, I.
2017-12-01
Wetlands contribute approximately 20 to 40% of global methane emissions. We build a methane model for tropical and subtropical forests that allows for inundated conditions, following the approaches used in more complex global biogeochemical emission models (LPJWhyMe and CLM4Me). The model was designed to replace model formulations with field and remotely sensed data for two essential drivers: plant productivity and hydrology. This allows us to focus directly on the central processes of methane production, consumption and transport. One of our long-term goals is to make the model available to scientists interested in including methane modeling in their location of study. Sensitivity analysis results help in focusing field data collection efforts. Here, we present results from a pilot global sensitivity analysis of the model in order to determine which parameters and processes contribute most to the model's uncertainty in methane emissions. Results show that parameters related to water table behavior, carbon input (in the form of plant productivity) and rooting depth affect simulated methane emissions the most. Current efforts include performing the sensitivity analysis again on methane emission outputs from an updated model that incorporates a soil heat flux routine, to determine the extent to which the soil temperature parameters affect CH4 emissions. We are currently conducting field data collection during Summer 2017 for comparison among 3 different landscapes located in the Ordway-Swisher Biological Station in Melrose, FL. We are collecting soil moisture and CH4 emission data from 4 different wetland types. Having data from 4 wetland types allows for calibration of the model to diverse soil, water and vegetation characteristics.
HYSOGs250m, global gridded hydrologic soil groups for curve-number-based runoff modeling.
Ross, C Wade; Prihodko, Lara; Anchang, Julius; Kumar, Sanath; Ji, Wenjie; Hanan, Niall P
2018-05-15
Hydrologic soil groups (HSGs) are a fundamental component of the USDA curve-number (CN) method for estimating rainfall runoff; yet these data are not readily available in a format or spatial resolution suitable for regional- and global-scale modeling applications. We developed a globally consistent, gridded dataset defining HSGs from soil texture, bedrock depth, and groundwater. The resulting data product, HYSOGs250m, represents runoff potential at 250 m spatial resolution. Our analysis indicates that the global distribution of soil is dominated by moderately high runoff potential, followed by moderately low, high, and low runoff potential. Low runoff potential, sandy soils are found primarily in parts of the Sahara and Arabian Deserts. High runoff potential soils occur predominantly within tropical and sub-tropical regions. No clear pattern could be discerned for moderately low runoff potential soils, as they occur in arid and humid environments and at both high and low elevations. Potential applications of these data include CN-based runoff modeling, flood risk assessment, and use as a covariate for biogeographical analysis of vegetation distributions.
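For context, the curve-number method that HYSOGs250m supports converts a storm rainfall depth and an HSG-dependent CN into direct runoff. A minimal metric-unit sketch follows; the Ia = 0.2S initial-abstraction ratio is the conventional assumption, not something prescribed by this dataset.

def scs_runoff(rain_mm, cn):
    # Direct runoff depth (mm) from the SCS curve-number method.
    s = 25400.0 / cn - 254.0        # potential maximum retention (mm)
    ia = 0.2 * s                    # initial abstraction (conventional ratio)
    if rain_mm <= ia:
        return 0.0
    return (rain_mm - ia) ** 2 / (rain_mm - ia + s)

# e.g. a 50 mm storm on a high-runoff-potential soil (CN ~ 85) vs. a sandy soil (CN ~ 60)
print(scs_runoff(50.0, 85.0), scs_runoff(50.0, 60.0))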
Social Sensor Analytics: Making Sense of Network Models in Social Media
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dowling, Chase P.; Harrison, Joshua J.; Sathanur, Arun V.
Social networks can be thought of as noisy sensor networks mapping real-world information to the web. Drawing on the extensive body of literature in sensor network analysis, this work sought to apply several novel and traditional sensor-network methods for efficiently interrogating raw social media data streams. We carefully revisit our definition of a social media signal from previous work, both in terms of time-varying features within the data and the networked nature of the medium. Further, we detail our analysis of global patterns in Twitter over the months of November 2013 and June 2014, detect and categorize events, and illustrate how these analyses can be used to inform graph-based models of Twitter, namely using a recent network influence model called PhySense: similar to PageRank but tuned to behavioral analysis by leveraging a sociologically inspired probabilistic model. We ultimately identify forms of information dissemination via analysis of time series and dynamic graph spectra and corroborate these findings through manual investigation of the data as a requisite step in modeling the diffusion process with PhySense. We hope to sufficiently characterize global behavior in a medium such as Twitter as a means of learning global model parameters one may use to predict or simulate behavior on a large scale. We have made our time series and dynamic graph analytical code available via a GitHub repository https://github.com/cpatdowling/salsa and our data are available upon request.
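PhySense itself is not reproduced here, but the PageRank baseline the abstract compares it to can be computed on an interaction graph with networkx. The edge list below is an invented toy example, not the Twitter data from the study.

import networkx as nx

# Toy directed graph: an edge (u, v) means account u retweets or mentions account v.
G = nx.DiGraph([("a", "b"), ("c", "b"), ("b", "d"), ("d", "a"), ("c", "d")])

# Baseline influence scores; PhySense would replace this step with its
# sociologically inspired probabilistic propagation model.
scores = nx.pagerank(G, alpha=0.85)
print(sorted(scores.items(), key=lambda kv: -kv[1]))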
NASA Astrophysics Data System (ADS)
Matsypura, Dmytro
In this dissertation, I develop a new theoretical framework for the modeling, pricing analysis, and computation of solutions to electric power supply chains with power generators, suppliers, transmission service providers, and the inclusion of consumer demands. In particular, I advocate the application of finite-dimensional variational inequality theory, projected dynamical systems theory, game theory, network theory, and other tools that have been recently proposed for the modeling and analysis of supply chain networks (cf. Nagurney (2006)) to electric power markets. This dissertation contributes to the extant literature on the modeling, analysis, and solution of supply chain networks, including global supply chains, in general, and electric power supply chains, in particular, in the following ways. It develops a theoretical framework for modeling, pricing analysis, and computation of electric power flows/transactions in electric power systems using the rationale for supply chain analysis. The models developed include both static and dynamic ones. The dissertation also adds a new dimension to the methodology of the theory of projected dynamical systems by proving that, irrespective of the speeds of adjustment, the equilibrium of the system remains the same. Finally, I include alternative fuel suppliers, along with their behavior into the supply chain modeling and analysis framework. This dissertation has strong practical implications. In an era in which technology and globalization, coupled with increasing risk and uncertainty, complicate electricity demand and supply within and between nations, the successful management of electric power systems and pricing become increasingly pressing topics with relevance not only for economic prosperity but also national security. This dissertation addresses such related topics by providing models, pricing tools, and algorithms for decentralized electric power supply chains. This dissertation is based heavily on the following coauthored papers: Nagurney, Cruz, and Matsypura (2003), Nagurney and Matsypura (2004, 2005, 2006), Matsypura and Nagurney (2005), Matsypura, Nagurney, and Liu (2006).
A brief review of models of DC-DC power electronic converters for analysis of their stability
NASA Astrophysics Data System (ADS)
Siewniak, Piotr; Grzesik, Bogusław
2014-10-01
A brief review of models of DC-DC power electronic converters (PECs) is presented in this paper. It contains the most popular, continuous-time and discrete-time models used for PEC simulation, design, stability analysis and other applications. Both large-signal and small-signal models are considered. Special attention is paid to models that are used in practice for the analysis of the global and local stability of PECs.
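As an example of the continuous-time averaged (small-signal) models such reviews cover, the control-to-output transfer function of an ideal buck converter in continuous conduction mode is Gvd(s) = Vin / (L C s^2 + (L/R) s + 1). The sketch below evaluates its frequency response with illustrative component values; it is not drawn from the paper itself.

import numpy as np
from scipy import signal

Vin, L, C, R = 12.0, 100e-6, 220e-6, 5.0          # illustrative buck-converter values
Gvd = signal.TransferFunction([Vin], [L * C, L / R, 1.0])

w = np.logspace(2, 6, 400)                         # rad/s
w, mag, phase = signal.bode(Gvd, w)                # magnitude (dB) and phase (deg)
print(mag[0], phase[-1])                           # low-frequency gain ~ 20*log10(Vin), phase tends to -180 deg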
Global and regional ecosystem modeling: comparison of model outputs and field measurements
NASA Astrophysics Data System (ADS)
Olson, R. J.; Hibbard, K.
2003-04-01
The Ecosystem Model-Data Intercomparison (EMDI) Workshops provide a venue for global ecosystem modeling groups to compare model outputs against measurements of net primary productivity (NPP). The objective of the EMDI Workshops is to evaluate model performance relative to observations in order to improve confidence in global model projections of terrestrial carbon cycling. The questions addressed by EMDI include: How does the simulated NPP compare with the field data across biome and environmental gradients? How sensitive are models to site-specific climate? Does additional mechanistic detail in models result in a better match with field measurements? How useful are the measures of NPP for evaluating model predictions? How well do models represent regional patterns of NPP? Initial EMDI results showed general agreement between model predictions and field measurements, but with obvious differences that indicated areas for potential data and model improvement. The effort was built on the development and compilation of complete and consistent databases for model initialization and comparison. Database development improves the data as well as the models; however, there is a need to incorporate additional observations and model outputs (LAI, hydrology, etc.) for comprehensive analyses of biogeochemical processes and their relationships to ecosystem structure and function. EMDI initialization and NPP data sets are available from the Oak Ridge National Laboratory Distributed Active Archive Center http://www.daac.ornl.gov/. Acknowledgements: This work was partially supported by the International Geosphere-Biosphere Programme - Data and Information System (IGBP-DIS); the IGBP-Global Analysis, Interpretation and Modelling Task Force (GAIM); the National Center for Ecological Analysis and Synthesis (NCEAS); and the National Aeronautics and Space Administration (NASA) Terrestrial Ecosystem Program. Oak Ridge National Laboratory is managed by UT-Battelle LLC for the U.S. Department of Energy under contract DE-AC05-00OR22725.
NASA Astrophysics Data System (ADS)
Wood, E. F.; Yuan, X.; Sheffield, J.; Pan, M.; Roundy, J.
2013-12-01
One of the key recommendations of the WCRP Global Drought Information System (GDIS) workshop is to develop an experimental real-time global monitoring and prediction system. While great advances have been made in global drought monitoring based on satellite observations and model reanalysis data, global drought forecasting has lagged behind, in part due to limited skill both in climate forecast models and in global hydrologic predictions. Having worked on drought monitoring and forecasting over the USA for more than a decade, the Princeton land surface hydrology group is now developing an experimental global drought early warning system that is based on multiple climate forecast models and a calibrated global hydrologic model. In this presentation, we will test its capability in seasonal forecasting of meteorological, agricultural and hydrologic droughts over global major river basins, using precipitation, soil moisture and streamflow forecasts respectively. Based on the joint probability distribution between observations from Princeton's global drought monitoring system and model hindcasts and real-time forecasts from the North American Multi-Model Ensemble (NMME) project, we (i) bias correct the monthly precipitation and temperature forecasts from multiple climate forecast models, (ii) downscale them to a daily time scale, and (iii) use them to drive the calibrated VIC model to produce global drought forecasts at a 1-degree resolution. A parallel run using the ESP forecast method, which is based on resampling historical forcings, is also carried out for comparison. Analysis is being conducted over global major river basins, with multiple drought indices that have different time scales and characteristics. The meteorological drought forecast does not have uncertainty from hydrologic models and can be validated directly against observations, making the validation an 'apples-to-apples' comparison. Preliminary results for the evaluation of meteorological drought onset hindcasts indicate that climate models increase drought detectability over ESP by 31%-81%. However, less than 30% of global drought onsets can be detected by climate models. The missed drought events are associated with weak ENSO signals and lower potential predictability. Due to the high false alarm rate from climate models, reliability is more important than sharpness for a skillful probabilistic drought onset forecast. Validations and skill assessments for agricultural and hydrologic drought forecasts are carried out using soil moisture and streamflow output from the VIC land surface model (LSM) forced by a global forcing data set. Given our previous drought forecasting experience over the USA and Africa, validating the hydrologic drought forecasting is a significant challenge for a global drought early warning system.
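The bias correction step described here builds a joint distribution of observations and hindcasts; one common way to realize such a correction is empirical quantile mapping, sketched below with synthetic data. This is an illustration of the general idea, not the exact scheme used in the study.

import numpy as np

def quantile_map(fcst, model_clim, obs_clim):
    # Map forecast values through the model climatology's empirical CDF
    # onto the observed climatology (empirical quantile mapping).
    model_sorted = np.sort(model_clim)
    p = np.searchsorted(model_sorted, fcst, side="right") / len(model_sorted)
    return np.quantile(obs_clim, np.clip(p, 0.0, 1.0))

rng = np.random.default_rng(42)
obs = rng.gamma(2.0, 40.0, 600)          # synthetic observed monthly precipitation (mm)
mod = obs * 0.7 + 10.0                   # synthetic biased model climatology
print(quantile_map(np.array([30.0, 120.0]), mod, obs))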
Safta, C.; Ricciuto, Daniel M.; Sargsyan, Khachik; ...
2015-07-01
In this paper we propose a probabilistic framework for an uncertainty quantification (UQ) study of a carbon cycle model and focus on the comparison between steady-state and transient simulation setups. A global sensitivity analysis (GSA) study indicates the parameters and parameter couplings that are important at different times of the year for quantities of interest (QoIs) obtained with the data assimilation linked ecosystem carbon (DALEC) model. We then employ a Bayesian approach and a statistical model error term to calibrate the parameters of DALEC using net ecosystem exchange (NEE) observations at the Harvard Forest site. The calibration results are employed in the second part of the paper to assess the predictive skill of the model via posterior predictive checks.
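The Bayesian calibration mentioned here relies, at its core, on an MCMC sampler; a bare-bones random-walk Metropolis step is sketched below against a toy Gaussian misfit. The real study evaluates DALEC inside the log-posterior and includes a statistical model-error term, so everything below the sampler definition is a placeholder.

import numpy as np

def metropolis(log_post, theta0, n_iter=5000, step=0.1, seed=0):
    # Random-walk Metropolis sampler for a log-posterior density.
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    chain = []
    for _ in range(n_iter):
        prop = theta + rng.normal(0.0, step, size=theta.shape)    # symmetric proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:                  # accept/reject
            theta, lp = prop, lp_prop
        chain.append(theta.copy())
    return np.array(chain)

obs = np.array([1.2, 0.8, 1.0])                                   # toy "NEE" observations

def log_post(th):
    mu, sigma = th
    if sigma <= 0.0:
        return -np.inf
    return -len(obs) * np.log(sigma) - 0.5 * np.sum((obs - mu) ** 2) / sigma ** 2

samples = metropolis(log_post, [0.5, 1.0])
print(samples[1000:].mean(axis=0))                                # posterior means after burn-in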
Long term, non-anthropogenic groundwater storage changes simulated by a global land surface model
NASA Astrophysics Data System (ADS)
Li, B.; Rodell, M.; Sheffield, J.; Wood, E. F.
2017-12-01
Groundwater is crucial for meeting agricultural, industrial and municipal water needs, especially in arid, semi-arid and drought-impacted regions. Yet the response of groundwater to climate variability is not well understood, due to a lack of systematic and continuous in situ measurements. In this study, we investigate global non-anthropogenic groundwater storage variations with a land surface model driven by a 67-year (1948-2014) meteorological forcing data set. Model estimates were evaluated using in situ groundwater data from the central and northeastern U.S. and terrestrial water storage derived from the Gravity Recovery and Climate Experiment (GRACE) satellites, and found to be reasonable. Empirical orthogonal function (EOF) analysis was employed to examine modes of variability of groundwater storage and their relationship with atmospheric effects such as precipitation and evapotranspiration. The results show that the leading mode in global groundwater storage reflects the influence of the El Niño Southern Oscillation (ENSO). Consistent with the EOF analysis, global total groundwater storage reflected the low-frequency variability of ENSO and decreased significantly over 1948-2014, while global ET and precipitation did not exhibit statistically significant trends. This study suggests that while precipitation and ET are the primary drivers of climate-related groundwater variability, changes in forcing fields other than precipitation and temperature are also important because of their influence on ET. We discuss the need to improve model physics and to continuously validate model estimates and forcing data for future studies.
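The EOF decomposition used to isolate the leading modes of groundwater-storage variability amounts to an SVD of the time-by-space anomaly matrix. The sketch below runs on random numbers standing in for the simulated storage fields.

import numpy as np

def eof_analysis(field, n_modes=3):
    # field: (time, space) matrix; EOFs via SVD of the temporal anomalies.
    anom = field - field.mean(axis=0)
    U, s, Vt = np.linalg.svd(anom, full_matrices=False)
    var_frac = s ** 2 / np.sum(s ** 2)                # explained variance fractions
    pcs = U[:, :n_modes] * s[:n_modes]                # principal-component time series
    eofs = Vt[:n_modes]                               # spatial patterns
    return eofs, pcs, var_frac[:n_modes]

rng = np.random.default_rng(0)
gw = rng.standard_normal((804, 500))                  # 67 years x 12 months by 500 grid cells (synthetic)
eofs, pcs, var_frac = eof_analysis(gw)
print(var_frac)

In the study, the leading principal-component time series is the one compared against ENSO indices.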
Global stability of a multiple infected compartments model for waterborne diseases
NASA Astrophysics Data System (ADS)
Wang, Yi; Cao, Jinde
2014-10-01
In this paper, mathematical analysis is carried out for a multiple infected compartments model for waterborne diseases, such as cholera, giardia, and rotavirus. The model accounts for both person-to-person and water-to-person transmission routes. Global stability of the equilibria is studied. In terms of the basic reproduction number R0, we prove that, if R0 ≤ 1, then the disease-free equilibrium is globally asymptotically stable and the infection always disappears; whereas if R0 > 1, there exists a unique endemic equilibrium which is globally asymptotically stable for the corresponding fast-slow system. Numerical simulations verify our theoretical results and show that the decay rate of waterborne pathogens has a significant impact on the epidemic growth rate. We also observe numerically that the unique endemic equilibrium is globally asymptotically stable for the whole system. This indicates that the present method needs to be improved by other techniques.
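The paper's model has multiple infected compartments; the single-infected-compartment SIWR skeleton below (with invented parameter values) only illustrates the combined person-to-person and water-to-person transmission structure that such models generalize.

import numpy as np
from scipy.integrate import solve_ivp

def siwr(t, y, beta_i, beta_w, xi, delta, gamma):
    # S, I, W, R with direct (beta_i) and waterborne (beta_w) transmission.
    S, I, W, R = y
    N = S + I + R
    new_inf = beta_i * S * I / N + beta_w * S * W
    dS = -new_inf
    dI = new_inf - gamma * I
    dW = xi * I - delta * W            # pathogen shedding into, and decay within, the water source
    dR = gamma * I
    return [dS, dI, dW, dR]

y0 = [9999.0, 1.0, 0.0, 0.0]
pars = (0.3, 1e-4, 0.01, 0.2, 0.1)     # illustrative values only
sol = solve_ivp(siwr, (0.0, 200.0), y0, args=pars, max_step=0.5)
print(sol.y[1].max())                   # peak number of infected individuals

In this toy system, raising the pathogen decay rate delta damps the waterborne route and slows the epidemic growth, which mirrors the sensitivity highlighted in the abstract.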
Long-Term Global Morphology of Gravity Wave Activity Using UARS Data
NASA Technical Reports Server (NTRS)
Eckermann, Stephen D.; Bacmeister, Julio T.; Wu, Dong L.
1998-01-01
This is the first quarter's report on research to extract global gravity-wave data from satellite data and to model those observations synoptically. Preliminary analysis of global maps of extracted middle atmospheric temperature variance from the CRISTA instrument is presented, which appear to contain gravity-wave information. Corresponding simulations of global gravity-wave and mountain-wave activity during this mission period are described using global ray-tracing and mountain-wave models, and interesting similarities among simulated data and CRISTA data are noted. Climatological simulations of mesospheric gravity-wave activity using the HWM-03 wind-temperature climatology are also reported, for comparison with UARS MLS data. Preparatory work on modeling of gravity wave observations from space-based platforms and subsequent interpretation of the MLS gravity-wave product are also described. Preliminary interpretation and relation to the research objectives are provided, and further action for the next quarter's research is recommended.
The Virtual Brain: Modeling Biological Correlates of Recovery after Chronic Stroke
Falcon, Maria Inez; Riley, Jeffrey D.; Jirsa, Viktor; McIntosh, Anthony R.; Shereen, Ahmed D.; Chen, E. Elinor; Solodkin, Ana
2015-01-01
There currently remains considerable variability in stroke survivor recovery. To address this, developing individualized treatment has become an important goal in stroke treatment. As a first step, it is necessary to determine brain dynamics associated with stroke and recovery. While recent methods have made strides in this direction, we still lack physiological biomarkers. The Virtual Brain (TVB) is a novel application for modeling brain dynamics that simulates an individual’s brain activity by integrating their own neuroimaging data with local biophysical models. Here, we give a detailed description of the TVB modeling process and explore model parameters associated with stroke. In order to establish a parallel between this new type of modeling and those currently in use, in this work we establish an association between a specific TVB parameter (long-range coupling) that increases after stroke with metrics derived from graph analysis. We used TVB to simulate the individual BOLD signals for 20 patients with stroke and 10 healthy controls. We performed graph analysis on their structural connectivity matrices calculating degree centrality, betweenness centrality, and global efficiency. Linear regression analysis demonstrated that long-range coupling is negatively correlated with global efficiency (P = 0.038), but is not correlated with degree centrality or betweenness centrality. Our results suggest that the larger influence of local dynamics seen through the long-range coupling parameter is closely associated with a decreased efficiency of the system. We thus propose that the increase in the long-range parameter in TVB (indicating a bias toward local over global dynamics) is deleterious because it reduces communication as suggested by the decrease in efficiency. The new model platform TVB hence provides a novel perspective to understanding biophysical parameters responsible for global brain dynamics after stroke, allowing the design of focused therapeutic interventions. PMID:26579071
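The graph metrics used in the comparison are standard and available in networkx; applied to a structural connectivity matrix the computation looks roughly as follows, with a random symmetric matrix standing in for the tractography-derived connectome.

import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
W = rng.random((20, 20))
W = (W + W.T) / 2.0
np.fill_diagonal(W, 0.0)                              # toy symmetric connectome
G = nx.from_numpy_array((W > 0.6).astype(int))        # binarized graph for the metrics below

degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)
efficiency = nx.global_efficiency(G)                  # the metric correlated with TVB long-range coupling
print(efficiency, max(degree.values()), max(betweenness.values()))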
NASA Astrophysics Data System (ADS)
Razavi, Saman; Gupta, Hoshin; Haghnegahdar, Amin
2016-04-01
Global sensitivity analysis (GSA) is a systems theoretic approach to characterizing the overall (average) sensitivity of one or more model responses across the factor space, by attributing the variability of those responses to different controlling (but uncertain) factors (e.g., model parameters, forcings, and boundary and initial conditions). GSA can be very helpful to improve the credibility and utility of Earth and Environmental System Models (EESMs), as these models are continually growing in complexity and dimensionality with continuous advances in understanding and computing power. However, conventional approaches to GSA suffer from (1) an ambiguous characterization of sensitivity, and (2) poor computational efficiency, particularly as the problem dimension grows. Here, we identify several important sensitivity-related characteristics of response surfaces that must be considered when investigating and interpreting the ''global sensitivity'' of a model response (e.g., a metric of model performance) to its parameters/factors. Accordingly, we present a new and general sensitivity and uncertainty analysis framework, Variogram Analysis of Response Surfaces (VARS), based on an analogy to 'variogram analysis', that characterizes a comprehensive spectrum of information on sensitivity. We prove, theoretically, that Morris (derivative-based) and Sobol (variance-based) methods and their extensions are special cases of VARS, and that their SA indices are contained within the VARS framework. We also present a practical strategy for the application of VARS to real-world problems, called STAR-VARS, including a new sampling strategy, called "star-based sampling". Our results across several case studies show the STAR-VARS approach to provide reliable and stable assessments of "global" sensitivity, while being at least 1-2 orders of magnitude more efficient than the benchmark Morris and Sobol approaches.
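The directional variogram at the heart of VARS measures how quickly a model response decorrelates as one moves a distance h along a factor axis. A minimal empirical estimate along a single-parameter transect is sketched below with a toy response function; the full framework's STAR sampling and integrated sensitivity indices are not reproduced here.

import numpy as np

def variogram_1d(x, y, lags):
    # gamma(h) = 0.5 * E[(y(x + h) - y(x))^2] estimated along one factor axis.
    gam = []
    for h in lags:
        sq_diffs = []
        for i in range(len(x)):
            j = np.argmin(np.abs(x - (x[i] + h)))     # nearest sample at separation ~h
            if abs(x[j] - x[i] - h) < 1e-9:
                sq_diffs.append((y[j] - y[i]) ** 2)
        gam.append(0.5 * np.mean(sq_diffs) if sq_diffs else np.nan)
    return np.array(gam)

x = np.linspace(0.0, 1.0, 101)                        # transect through one factor
y = np.sin(6 * np.pi * x) + 0.3 * x                   # toy model response
print(variogram_1d(x, y, lags=[0.01, 0.05, 0.1, 0.3]))

Steeply rising variograms at small h indicate factors the response is locally very sensitive to, the kind of information VARS aggregates into its sensitivity indices.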
The regional and global significance of nitrogen removal in lakes and reservoirs
Harrison, J.A.; Maranger, R.J.; Alexander, Richard B.; Giblin, A.E.; Jacinthe, P.-A.; Mayorga, Emilio; Seitzinger, S.P.; Sobota, D.J.; Wollheim, W.M.
2009-01-01
Human activities have greatly increased the transport of biologically available nitrogen (N) through watersheds to potentially sensitive coastal ecosystems. Lentic water bodies (lakes and reservoirs) have the potential to act as important sinks for this reactive N as it is transported across the landscape, because they offer ideal conditions for N burial in sediments or permanent loss via denitrification. However, the patterns and controls on lentic N removal have not been explored in great detail at large regional to global scales. In this paper we describe, evaluate, and apply a new, spatially explicit, annual-scale, global model of lentic N removal called NiRReLa (Nitrogen Retention in Reservoirs and Lakes). The NiRReLa model incorporates smaller lakes and reservoirs than have been included in previous global analyses, and also allows for separate treatment and analysis of reservoirs and natural lakes. Model runs for the mid-1990s indicate that lentic systems are indeed important sinks for N and are conservatively estimated to remove 19.7 Tg N year-1 from watersheds globally. Small lakes (<50 km2) were critical in the analysis, retaining almost half (9.3 Tg N year-1) of the global total. In model runs, the capacity of lakes and reservoirs to remove watershed N varied substantially at the half-degree scale (0-100%), both as a function of climate and of the density of lentic systems. Although reservoirs occupy just 6% of the global lentic surface area, we estimate they retain ~33% of the total N removed by lentic systems, due to a combination of higher drainage ratios (catchment surface area:lake or reservoir surface area), higher apparent settling velocities for N, and greater average N loading rates in reservoirs than in lakes. Finally, a sensitivity analysis of NiRReLa suggests that, on average, N removal within lentic systems will respond more strongly to changes in land use and N loading than to changes in climate at the global scale. © 2008 Springer Science+Business Media B.V.
Implications of global warming for the climate of African rainforests
James, Rachel; Washington, Richard; Rowell, David P.
2013-01-01
African rainforests are likely to be vulnerable to changes in temperature and precipitation, yet there has been relatively little research to suggest how the regional climate might respond to global warming. This study presents projections of temperature and precipitation indices of relevance to African rainforests, using global climate model experiments to identify local change as a function of global temperature increase. A multi-model ensemble and two perturbed physics ensembles are used, one with over 100 members. In the east of the Congo Basin, most models (92%) show a wet signal, whereas in west equatorial Africa, the majority (73%) project an increase in dry season water deficits. This drying is amplified as global temperature increases, and in over half of coupled models by greater than 3% per °C of global warming. Analysis of atmospheric dynamics in a subset of models suggests that this could be partly because of a rearrangement of zonal circulation, with enhanced convection in the Indian Ocean and anomalous subsidence over west equatorial Africa, the Atlantic Ocean and, in some seasons, the Amazon Basin. Further research to assess the plausibility of this and other mechanisms is important, given the potential implications of drying in these rainforest regions. PMID:23878329
Ecological network analysis on global virtual water trade.
Yang, Zhifeng; Mao, Xufeng; Zhao, Xu; Chen, Bin
2012-02-07
Global water interdependencies are likely to increase with growing virtual water trade. To address the indirect effects of water trade through global economic circulation, we use ecological network analysis (ENA) to shed light on the complicated system interactions. A global model of virtual water flows associated with trade in agricultural and livestock products in 1995-1999 is also built as the basis for the network analysis. Control analysis is used to identify the quantitative control or dependency relations. The utility analysis provides more indicators for describing the mutual relationship between two regions/countries by imitating the interactions in an ecosystem, and distinguishes the beneficiaries and the contributors of the virtual water trade system. Results show that control and utility relations can well depict the mutual relations in the trade system, and that directly observable relations differ from integral ones once indirect interactions are considered. This paper offers a new way to depict the interrelations between trade components and can serve as a meaningful start as we continue to use ENA to provide more valuable implications for freshwater study on a global scale.
A Global Analysis of Light and Charge Yields in Liquid Xenon
Lenardo, Brian; Kazkaz, Kareem; Manalaysay, Aaron; ...
2015-11-04
Here, we present an updated model of light and charge yields from nuclear recoils in liquid xenon with a simultaneously constrained parameter set. A global analysis is performed using measurements of electron and photon yields compiled from all available historical data, as well as measurements of the ratio of the two. These data span a range of recoil energies (keV) and externally applied electric fields (V/cm). The model is constrained by constructing global cost functions and using a simulated annealing algorithm and a Markov Chain Monte Carlo approach to optimize and find confidence intervals on all free parameters in the model. This analysis contrasts with previous work in that we do not unnecessarily exclude datasets nor impose artificially conservative assumptions, do not use spline functions, and reduce the number of parameters used in NEST v0.98. We report our results and the calculated best-fit charge and light yields. These quantities are crucial to understanding the response of liquid xenon detectors in the energy regime important for rare event searches such as the direct detection of dark matter particles.
NASA Astrophysics Data System (ADS)
Huang, Shih-Yu; Deng, Yi; Wang, Jingfeng
2017-09-01
The maximum-entropy-production (MEP) model of surface heat fluxes, based on contemporary non-equilibrium thermodynamics, information theory, and atmospheric turbulence theory, is used to re-estimate the global surface heat fluxes. The surface fluxes predicted by the MEP model automatically balance the surface energy budget at all time and space scales without the explicit use of near-surface temperature and moisture gradients, wind speed, or surface roughness data. The new MEP-based global annual mean fluxes over the land surface, using input data of surface radiation and temperature from the National Aeronautics and Space Administration Clouds and the Earth's Radiant Energy System (NASA CERES) supplemented by surface specific humidity data from the Modern-Era Retrospective Analysis for Research and Applications (MERRA), agree closely with previous estimates. The new estimate of ocean evaporation, which does not use the MERRA reanalysis data as model input, is lower than previous estimates, while the new estimate of ocean sensible heat flux is higher than previously reported. The MEP model also produces the first global map of ocean surface heat flux that is not available from existing global reanalysis products.
Evaluation of Diagnostic CO2 Flux and Transport Modeling in NU-WRF and GEOS-5
NASA Astrophysics Data System (ADS)
Kawa, S. R.; Collatz, G. J.; Tao, Z.; Wang, J. S.; Ott, L. E.; Liu, Y.; Andrews, A. E.; Sweeney, C.
2015-12-01
We report on recent diagnostic (constrained by observations) model simulations of atmospheric CO2 flux and transport using a newly developed facility in the NASA Unified-Weather Research and Forecast (NU-WRF) model. The results are compared to CO2 data (ground-based, airborne, and GOSAT) and to corresponding simulations from a global model that uses meteorology from the NASA GEOS-5 Modern Era Retrospective analysis for Research and Applications (MERRA). The objective of these intercomparisons is to assess the relative strengths and weaknesses of the respective models in pursuit of an overall carbon process improvement at both regional and global scales. Our guiding hypothesis is that the finer resolution and improved land surface representation in NU-WRF will lead to better comparisons with CO2 data than those using global MERRA, which will, in turn, inform process model development in global prognostic models. Initial intercomparison results, however, have generally been mixed: NU-WRF is better at some sites and times but not uniformly. We are examining the model transport processes in detail to diagnose differences in the CO2 behavior. These comparisons are done in the context of a long history of simulations from the Parameterized Chemistry and Transport Model, based on GEOS-5 meteorology and Carnegie Ames-Stanford Approach-Global Fire Emissions Database (CASA-GFED) fluxes, that capture much of the CO2 variation from synoptic to seasonal to global scales. We have run the NU-WRF model using unconstrained, internally generated meteorology within the North American domain, and with meteorological 'nudging' from Global Forecast System and North American Regional Reanalysis (NARR) in an effort to optimize the CO2 simulations. Output results constrained by NARR show the best comparisons to data. Discrepancies, of course, may arise either from flux or transport errors and compensating errors are possible. Resolving their interplay is also important to using the data in inverse models. Recent analysis is focused on planetary boundary depth, which can be significantly different between MERRA and NU-WRF, along with subgrid transport differences. Characterization of transport differences between the models will allow us to better constrain the CO2 fluxes, which is the major objective of this work.
Global Energy and Water Budgets in MERRA
NASA Technical Reports Server (NTRS)
Bosilovich, Michael G.; Robertson, Franklin R.; Chen, Junye
2010-01-01
Reanalyses, retrospectively analyzing observations over climatological time scales, represent a merger between satellite observations and models to provide globally continuous data, and they have improved over several generations. Balancing the Earth's global water and energy budgets has been a focus of research for more than two decades. Models tend to drift toward their own climate, while remotely sensed observations have had varying degrees of uncertainty. This study evaluates the latest NASA reanalysis, called the Modern Era Retrospective-analysis for Research and Applications (MERRA), from a global water and energy cycles perspective. MERRA was configured to provide complete budgets in its output diagnostics, including the Incremental Analysis Update (IAU), the term that represents the observations' influence on the analyzed states, alongside the physical flux terms. Precipitation in reanalyses is typically sensitive to the observational analysis. For MERRA, the global mean precipitation bias and spatial variability are more comparable to merged satellite observations (GPCP and CMAP) than previous generations of reanalyses. Ocean evaporation also has a much lower value, which is comparable to observed data sets. The global energy budget shows that MERRA cloud effects may be generally weak, leading to excess shortwave radiation reaching the ocean surface. Evaluating the MERRA time series of budget terms, a significant change occurs which does not appear to be represented in observations. In 1999, the global analysis increments of water vapor change sign from negative to positive, primarily leading to more oceanic precipitation. This change is coincident with the beginning of AMSU radiance assimilation. Previous and current reanalyses all exhibit some sensitivity to perturbations in the observation record, and this remains a significant research topic for reanalysis development. The effect of the changing observing system is evaluated for MERRA water and energy budget terms.
Consistency between the global and regional modeling components of CAMS over Europe.
NASA Astrophysics Data System (ADS)
Katragkou, Eleni; Akritidis, Dimitrios; Kontos, Serafim; Zanis, Prodromos; Melas, Dimitrios; Engelen, Richard; Plu, Matthieu; Eskes, Henk
2017-04-01
The Copernicus Atmosphere Monitoring Service (CAMS) is a component of the European Earth Observation programme Copernicus. CAMS consists of two major forecast and analysis systems: i) the CAMS global near-real time service, based on the ECMWF Integrated Forecast System (C-IFS), which provides daily analyses and forecasts of reactive trace gases, greenhouse gases and aerosol concentrations ii) a regional ensemble (ENS) for European air quality, compiled and disseminated by Météo-France, which consists of seven ensemble members. The boundaries from the regional ensemble members are extracted from the global CAMS forecast product. This work reports on the consistency between the global and regional modeling components of CAMS, and the impact of global CAMS boundary conditions on regional forecasts. The current analysis includes ozone (O3) carbon monoxide (CO) and aerosol (PM10/PM2.5) forecasts. The comparison indicates an overall good agreement between the global C-IFS and the regional ENS patterns for O3 and CO, especially above 250m altitude, indicating that the global boundary conditions are efficiently included in the regional ensemble simulations. As expected, differences are found within the PBL, with lower/higher C-IFS O3/CO concentrations over continental Europe with respect to ENS.
NASA Astrophysics Data System (ADS)
Raleigh, M. S.; Lundquist, J. D.; Clark, M. P.
2015-07-01
Physically based models provide insights into key hydrologic processes but are associated with uncertainties due to deficiencies in forcing data, model parameters, and model structure. Forcing uncertainty is enhanced in snow-affected catchments, where weather stations are scarce and prone to measurement errors, and meteorological variables exhibit high variability. Hence, there is limited understanding of how forcing error characteristics affect simulations of cold region hydrology and which error characteristics are most important. Here we employ global sensitivity analysis to explore how (1) different error types (i.e., bias, random errors), (2) different error probability distributions, and (3) different error magnitudes influence physically based simulations of four snow variables (snow water equivalent, ablation rates, snow disappearance, and sublimation). We use the Sobol' global sensitivity analysis, which is typically used for model parameters but adapted here for testing model sensitivity to coexisting errors in all forcings. We quantify the Utah Energy Balance model's sensitivity to forcing errors with 1 840 000 Monte Carlo simulations across four sites and five different scenarios. Model outputs were (1) consistently more sensitive to forcing biases than random errors, (2) generally less sensitive to forcing error distributions, and (3) critically sensitive to different forcings depending on the relative magnitude of errors. For typical error magnitudes found in areas with drifting snow, precipitation bias was the most important factor for snow water equivalent, ablation rates, and snow disappearance timing, but other forcings had a more dominant impact when precipitation uncertainty was due solely to gauge undercatch. Additionally, the relative importance of forcing errors depended on the model output of interest. Sensitivity analysis can reveal which forcing error characteristics matter most for hydrologic modeling.
Sensitivity analysis of Repast computational ecology models with R/Repast.
Prestes García, Antonio; Rodríguez-Patón, Alfonso
2016-12-01
Computational ecology is an emerging interdisciplinary discipline founded mainly on modeling and simulation methods for studying ecological systems. Among the existing modeling formalisms, individual-based modeling is particularly well suited to capturing the complex temporal and spatial dynamics, as well as the nonlinearities, arising in ecosystems, communities, or populations due to individual variability. In addition, being a bottom-up approach, it is useful for providing new insights into the local mechanisms which generate some observed global dynamics. Of course, no conclusions about model results can be taken seriously if they are based on a single model execution and are not analyzed carefully. Therefore, a sound methodology should always be used to underpin the interpretation of model results. Sensitivity analysis is a methodology for quantitatively assessing the effect of input uncertainty on simulation output, and it should be incorporated into every work based on an in-silico experimental setup. In this article, we present R/Repast, a GNU R package for running and analyzing Repast Simphony models, accompanied by two worked examples on how to perform global sensitivity analysis and how to interpret the results.
Rapid Global Fitting of Large Fluorescence Lifetime Imaging Microscopy Datasets
Warren, Sean C.; Margineanu, Anca; Alibhai, Dominic; Kelly, Douglas J.; Talbot, Clifford; Alexandrov, Yuriy; Munro, Ian; Katan, Matilda
2013-01-01
Fluorescence lifetime imaging (FLIM) is widely applied to obtain quantitative information from fluorescence signals, particularly using Förster Resonant Energy Transfer (FRET) measurements to map, for example, protein-protein interactions. Extracting FRET efficiencies or population fractions typically entails fitting data to complex fluorescence decay models but such experiments are frequently photon constrained, particularly for live cell or in vivo imaging, and this leads to unacceptable errors when analysing data on a pixel-wise basis. Lifetimes and population fractions may, however, be more robustly extracted using global analysis to simultaneously fit the fluorescence decay data of all pixels in an image or dataset to a multi-exponential model under the assumption that the lifetime components are invariant across the image (dataset). This approach is often considered to be prohibitively slow and/or computationally expensive but we present here a computationally efficient global analysis algorithm for the analysis of time-correlated single photon counting (TCSPC) or time-gated FLIM data based on variable projection. It makes efficient use of both computer processor and memory resources, requiring less than a minute to analyse time series and multiwell plate datasets with hundreds of FLIM images on standard personal computers. This lifetime analysis takes account of repetitive excitation, including fluorescence photons excited by earlier pulses contributing to the fit, and is able to accommodate time-varying backgrounds and instrument response functions. We demonstrate that this global approach allows us to readily fit time-resolved fluorescence data to complex models including a four-exponential model of a FRET system, for which the FRET efficiencies of the two species of a bi-exponential donor are linked, and polarisation-resolved lifetime data, where a fluorescence intensity and bi-exponential anisotropy decay model is applied to the analysis of live cell homo-FRET data. A software package implementing this algorithm, FLIMfit, is available under an open source licence through the Open Microscopy Environment. PMID:23940626
DOE Office of Scientific and Technical Information (OSTI.GOV)
MACKEY, T.C.
M&D Professional Services, Inc. (M&D) is under subcontract to Pacific Northwest National Laboratories (PNNL) to perform seismic analysis of the Hanford Site Double-Shell Tanks (DSTs) in support of a project entitled "Double-Shell Tank (DST) Integrity Project - DST Thermal and Seismic Analyses". The overall scope of the project is to complete an up-to-date comprehensive analysis of record of the DST System at Hanford in support of Tri-Party Agreement Milestone M-48-14. The work described herein was performed in support of the seismic analysis of the DSTs. The thermal and operating loads analysis of the DSTs is documented in Rinker et al. (2004). The overall seismic analysis of the DSTs is being performed with the general-purpose finite element code ANSYS. The overall model used for the seismic analysis of the DSTs includes the DST structure, the contained waste, and the surrounding soil. The seismic analysis of the DSTs must address the fluid-structure interaction behavior and sloshing response of the primary tank and contained liquid. ANSYS has demonstrated capabilities for structural analysis, but the capabilities and limitations of ANSYS to perform fluid-structure interaction are less well understood. The purpose of this study is to demonstrate the capabilities and investigate the limitations of ANSYS for performing a fluid-structure interaction analysis of the primary tank and contained waste. To this end, the ANSYS solutions are benchmarked against theoretical solutions appearing in BNL 1995, when such theoretical solutions exist. When theoretical solutions were not available, comparisons were made to theoretical solutions of similar problems and to the results from Dytran simulations. The capabilities and limitations of the finite element code Dytran for performing a fluid-structure interaction analysis of the primary tank and contained waste were explored in a parallel investigation (Abatt 2006). In conjunction with the results of the global ANSYS analysis reported in Carpenter et al. (2006), the results of the two investigations will be compared to help determine whether a more refined sub-model of the primary tank is necessary to capture the important fluid-structure interaction effects in the tank and, if so, how best to utilize a refined sub-model of the primary tank. Both rigid tank and flexible tank configurations were analyzed with ANSYS. The response parameters of interest are total hydrodynamic reaction forces, impulsive and convective mode frequencies, waste pressures, and slosh heights. To a limited extent, tank stresses are also reported. The results of this study demonstrate that the ANSYS model has the capability to adequately predict global responses such as frequencies and overall reaction forces. Thus, the model is suitable for predicting the global response of the tank and contained waste. On the other hand, while the ANSYS model is capable of adequately predicting waste pressures and primary tank stresses in a large portion of the waste tank, the model does not accurately capture the convective behavior of the waste near the free surface, nor did the model give accurate predictions of slosh heights. Based on the ability of the ANSYS benchmark model to accurately predict frequencies and global reaction forces, and on the results presented in Abatt et al. (2006), the global ANSYS model described in Carpenter et al. (2006) is sufficient for the seismic evaluation of all tank components except for local areas of the primary tank.
Due to the limitations of the ANSYS model in predicting the convective response of the waste, the evaluation of primary tank stresses near the waste free surface should be supplemented by results from an ANSYS sub-model of the primary tank that incorporates pressures from theoretical solutions or from Dytran solutions. However, the primary tank is expected to have low demand-to-capacity ratios in the upper wall. Moreover, due to the less than desired mesh resolution in the primary tank knuckle of the global ANSYS model, the evaluation of the primary tank stresses in the lower knuckle should be supplemented by results from a more refined ANSYS sub-model of the primary tank that incorporates pressures from theoretical solutions or from Dytran solutions.
Long-range persistence in the global mean surface temperature and the global warming "time bomb"
NASA Astrophysics Data System (ADS)
Rypdal, M.; Rypdal, K.
2012-04-01
Detrended Fluctuation Analysis (DFA) and Maximum Likelihood Estimations (MLE) based on instrumental data over the last 160 years indicate that there is Long-Range Persistence (LRP) in Global Mean Surface Temperature (GMST) on time scales of months to decades. The persistence is much higher in sea surface temperature than in land temperatures. Power spectral analysis of multi-model, multi-ensemble runs of global climate models indicate further that this persistence may extend to centennial and maybe even millennial time-scales. We also support these conclusions by wavelet variogram analysis, DFA, and MLE of Northern hemisphere mean surface temperature reconstructions over the last two millennia. These analyses indicate that the GMST is a strongly persistent noise with Hurst exponent H>0.9 on time scales from decades up to at least 500 years. We show that such LRP can be very important for long-term climate prediction and for the establishment of a "time bomb" in the climate system due to a growing energy imbalance caused by the slow relaxation to radiative equilibrium under rising anthropogenic forcing. We do this by the construction of a multi-parameter dynamic-stochastic model for the GMST response to deterministic and stochastic forcing, where LRP is represented by a power-law response function. Reconstructed data for total forcing and GMST over the last millennium are used with this model to estimate trend coefficients and Hurst exponent for the GMST on multi-century time scale by means of MLE. Ensembles of solutions generated from the stochastic model also allow us to estimate confidence intervals for these estimates.
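A compact version of the DFA estimator referenced here (with order-1 detrending) is sketched below; the scaling exponent is the slope of log F(s) against log s, and a synthetic persistent series stands in for the instrumental GMST record.

import numpy as np

def dfa(x, scales):
    # Detrended fluctuation analysis with linear (order-1) detrending per window.
    profile = np.cumsum(x - np.mean(x))                   # integrated series
    F = []
    for s in scales:
        n_seg = len(profile) // s
        rms = []
        for k in range(n_seg):
            seg = profile[k * s:(k + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear fit
            rms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    return np.array(F)

rng = np.random.default_rng(0)
x = rng.standard_normal(4096) + 0.02 * np.cumsum(rng.standard_normal(4096))  # toy persistent series
scales = np.array([8, 16, 32, 64, 128, 256])
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]       # scaling exponent (H for stationary noise)
print(alpha)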
Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D A; Arneth, Almut; Calvin, Katherine; Doelman, Jonathan; Eitelberg, David A; Engström, Kerstin; Fujimori, Shinichiro; Hasegawa, Tomoko; Havlik, Petr; Humpenöder, Florian; Jain, Atul K; Krisztin, Tamás; Kyle, Page; Meiyappan, Prasanth; Popp, Alexander; Sands, Ronald D; Schaldach, Rüdiger; Schüngel, Jan; Stehfest, Elke; Tabeau, Andrzej; Van Meijl, Hans; Van Vliet, Jasper; Verburg, Peter H
2016-12-01
Model-based global projections of future land-use and land-cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socioeconomic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios, we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g., boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process and improving the allocation mechanisms of LULC change models remain important challenges. Current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches, and many studies ignore the uncertainty in LULC projections in assessments of LULC change impacts on climate, water resources or biodiversity. © 2016 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.
An operational global-scale ocean thermal analysis system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clancy, R. M.; Pollak, K.D.; Phoebus, P.A.
1990-04-01
The Optimum Thermal Interpolation System (OTIS) is an ocean thermal analysis system designed for operational use at FNOC. It is based on the optimum-interpolation data assimilation technique and functions in an analysis-prediction-analysis data assimilation cycle with the TOPS mixed-layer model. OTIS provides a rigorous framework for combining real-time data, climatology, and predictions from numerical ocean prediction models to produce a large-scale synoptic representation of ocean thermal structure. The techniques and assumptions used in OTIS are documented, and results of operational tests of global-scale OTIS at FNOC are presented. The tests involved comparisons of OTIS against an existing operational ocean thermal structure model and were conducted during February, March, and April 1988. Qualitative comparison of the two products suggests that OTIS gives a more realistic representation of subsurface anomalies and horizontal gradients, and that it also gives a more accurate analysis of the thermal structure, with improvements largest below the mixed layer. 37 refs.
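The optimum-interpolation update underlying OTIS combines a background field (climatology or a TOPS forecast) with observations weighted by their error covariances. A generic one-dimensional sketch of that analysis equation follows; the covariance shapes and observation locations are toy choices, not the FNOC configuration.

import numpy as np

def oi_analysis(xb, y, H, B, R):
    # xa = xb + K (y - H xb), with gain K = B H^T (H B H^T + R)^-1
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (y - H @ xb)

n = 50
grid = np.linspace(0.0, 1.0, n)
B = 0.5 * np.exp(-np.abs(grid[:, None] - grid[None, :]) / 0.1)   # background error covariance
H = np.zeros((3, n)); H[[0, 1, 2], [5, 25, 40]] = 1.0            # three point observations
R = 0.1 * np.eye(3)                                              # observation error covariance
xb = np.zeros(n)                                                 # background (anomaly) field
y = np.array([1.0, -0.5, 0.3])                                   # observed anomalies
xa = oi_analysis(xb, y, H, B, R)
print(xa[[5, 25, 40]])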
Ménard, Richard; Deshaies-Jacques, Martin; Gasset, Nicolas
2016-09-01
An objective analysis is one of the main components of data assimilation. By combining observations with the output of a predictive model we combine the best features of each source of information: the complete spatial and temporal coverage provided by models, with a close representation of the truth provided by observations. The process of combining observations with a model output is called an analysis. Producing an analysis requires knowledge of the observation and model error variances, as well as the spatial correlation of the model error. This paper is devoted to the development of methods for estimating these error variances and the characteristic length-scale of the model error correlation for operational use in the Canadian objective analysis system. We first argue in favor of using compact-support correlation functions, and then introduce three estimation methods: the Hollingsworth-Lönnberg (HL) method in local and global form, the maximum likelihood method (ML), and the χ² diagnostic method. We perform one-dimensional (1D) simulation studies where the error variance and true correlation length are known, and perform an estimation of both error variances and correlation length where both are non-uniform. We show that a local version of the HL method can capture accurately the error variances and correlation length at each observation site, provided that spatial variability is not too strong. However, the operational objective analysis requires only a single and globally valid correlation length. We examine whether any statistic of the local HL correlation lengths could be a useful estimate, or whether other global estimation methods, such as the global HL, ML, or χ² methods, should be used. We found, in both the 1D simulations and using real data, that the ML method is able to capture physically significant aspects of the correlation length, while most other estimates give unphysical and larger length-scale values. This paper describes a proposed improvement of the objective analysis of surface pollutants at Environment and Climate Change Canada (formerly known as Environment Canada). Objective analyses are essentially surface maps of air pollutants that are obtained by combining observations with an air quality model output, and are thought to provide a complete and more accurate representation of the air quality. The highlight of this study is an analysis of methods to estimate the model (or background) error correlation length-scale. The error statistics are an important and critical component of the analysis scheme.
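The Hollingsworth-Lönnberg approach estimates the background-error variance and correlation length by binning covariances of observation-minus-background differences (innovations) by station separation and fitting a correlation model; extrapolating the fit toward zero separation separates background from observation error. A rough sketch of the fitting step, assuming the binned innovation covariances are already computed and using an exponential correlation model, is:

import numpy as np
from scipy.optimize import curve_fit

def background_cov(r, sigma_b2, L):
    # Exponential background-error covariance model C(r) = sigma_b^2 * exp(-r / L).
    return sigma_b2 * np.exp(-r / L)

# Hypothetical binned innovation covariances vs. station separation (km)
r_bins = np.array([25.0, 75.0, 150.0, 300.0, 600.0, 1000.0])
cov_bins = np.array([0.82, 0.64, 0.45, 0.22, 0.06, 0.02])

(sigma_b2, L), _ = curve_fit(background_cov, r_bins, cov_bins, p0=[1.0, 200.0])
print(sigma_b2, L)    # fitted background error variance and correlation length-scale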
Mathematical supply-chain modelling: Product analysis of cost and time
NASA Astrophysics Data System (ADS)
Easters, D. J.
2014-03-01
Establishing a mathematical supply-chain model is a proposition that has received attention due to its inherent benefits for improving global supply-chain efficiency. This paper discusses the prevailing relationships found within apparel supply-chain environments, and considers the complex issues involved in constructing a mathematical model. Principal results identified within the data suggest that the multifarious nature of global supply-chain activities requires a degree of simplification in order to fully delineate the factors which affect each sub-section of the chain. Subsequently, the research findings allowed the division of supply-chain components into sub-sections, which together constitute a coherent method of product development activity. Concurrently, the supply-chain model was found to allow systematic mathematical analysis of cost and time within the multiple contexts of each sub-section encountered. The paper presents the supply-chain model structure and the mathematics, and considers how product analysis of cost and time can improve the comprehension of product lifecycle management.
NASA Technical Reports Server (NTRS)
Pi, Xiaoqing; Mannucci, Anthony J.; Verkhoglyadova, Olga P.; Stephens, Philip; Wilson, Brian D.; Akopian, Vardan; Komjathy, Attila; Lijima, Byron A.
2013-01-01
ISOGAME is designed and developed to quantitatively assess the impact of new observation systems on the capability of imaging and modeling the ionosphere. With ISOGAME, one can perform observation system simulation experiments (OSSEs). A typical OSSE using ISOGAME would involve: (1) simulating various ionospheric conditions on global scales; (2) simulating ionospheric measurements made from a constellation of low-Earth-orbiters (LEOs), particularly Global Navigation Satellite System (GNSS) radio occultation data, and from ground-based global GNSS networks; (3) conducting ionospheric data assimilation experiments with the Global Assimilative Ionospheric Model (GAIM); and (4) analyzing modeling results with visualization tools. ISOGAME can provide quantitative assessment of the accuracy of assimilative modeling with the observation system of interest. Observation systems other than those based on GNSS can also be analyzed. The system is composed of a suite of software that combines the GAIM, including a 4D first-principles ionospheric model and data assimilation modules, the International Reference Ionosphere (IRI) model developed by the international ionospheric research community, an observation simulator, visualization software, and orbit design, simulation, and optimization software. The core GAIM model used in ISOGAME is based on the GAIM++ code (written in C++) that includes a new high-fidelity geomagnetic field representation (multi-dipole). New visualization tools and analysis algorithms for the OSSEs are now part of ISOGAME.
NASA Technical Reports Server (NTRS)
1978-01-01
Research activities related to global weather, ocean/air interactions, and climate are reported. The global weather research is aimed at improving the assimilation of satellite-derived data in weather forecast models, developing analysis/forecast models that can more fully utilize satellite data, and developing new measures of forecast skill to properly assess the impact of satellite data on weather forecasting. The oceanographic research goal is to understand and model the processes that determine the general circulation of the oceans, focusing on those processes that affect sea surface temperature and oceanic heat storage, which are the oceanographic variables with the greatest influence on climate. The climate research objective is to support the development and effective utilization of space-acquired data systems in climate forecast models and to conduct sensitivity studies to determine the effect of lower boundary conditions on climate, as well as predictability studies to determine which global climate features can be modeled either deterministically or statistically.
Polar motion excitation analysis due to global continental water redistribution
NASA Astrophysics Data System (ADS)
Fernandez, L.; Schuh, H.
2006-10-01
We present the results obtained when studying the hydrological excitation of the Earth's wobble due to global redistribution of continental water storage. This work was performed in two steps. First, we computed the hydrological angular momentum (HAM) time series based on the global hydrological model LaD (Land Dynamics model) for the period 1980 to 2004. Then, we evaluated the effectiveness of this excitation by comparing the residuals of the geodetic time series, after removing atmospheric and oceanic contributions, with the respective hydrological ones. The emphasis was put on low-frequency variations. We also present a comparison of the HAM time series from LaD with that from a global model based on the assimilated soil moisture and snow accumulation data from the NCEP/NCAR (National Centers for Environmental Prediction/National Center for Atmospheric Research) reanalysis. Finally, we evaluate the performance of the LaD model in closing the polar motion budget at seasonal periods in comparison with the NCEP and the Land Data Assimilation System (LDAS) models.
High-resolution Local Gravity Model of the South Pole of the Moon from GRAIL Extended Mission Data
NASA Technical Reports Server (NTRS)
Goossens, Sander Johannes; Sabaka, Terence J.; Nicholas, Joseph B.; Lemoine, Frank G.; Rowlands, David D.; Mazarico, Erwan; Neumann, Gregory A.; Smith, David E.; Zuber, Maria T.
2014-01-01
We estimated a high-resolution local gravity field model over the south pole of the Moon using data from the Gravity Recovery and Interior Laboratory's extended mission. Our solution consists of adjustments with respect to a global model expressed in spherical harmonics. The adjustments are expressed as gridded gravity anomalies with a resolution of 1/6deg by 1/6deg (equivalent to that of a degree and order 1080 model in spherical harmonics), covering a cap over the south pole with a radius of 40deg. The gravity anomalies have been estimated from a short-arc analysis using only Ka-band range-rate (KBRR) data over the area of interest. We apply a neighbor-smoothing constraint to our solution. Our local model removes striping present in the global model; it reduces the misfit to the KBRR data and improves correlations with topography to higher degrees than current global models.
Production of NOx by Lightning and its Effects on Atmospheric Chemistry
NASA Technical Reports Server (NTRS)
Pickering, Kenneth E.
2009-01-01
Production of NO(x) by lightning remains the NO(x) source with the greatest uncertainty. Current estimates of the global source strength range over a factor of four (from 2 to 8 TgN/year). Ongoing efforts to reduce this uncertainty through field programs, cloud-resolved modeling, global modeling, and satellite data analysis will be described in this seminar. Representation of the lightning source in global or regional chemical transport models requires three types of information: the distribution of lightning flashes as a function of time and space, the production of NO(x) per flash, and the effective vertical distribution of the lightning-injected NO(x). Methods of specifying these items in a model will be discussed. For example, the current method of specifying flash rates in NASA's Global Modeling Initiative (GMI) chemical transport model will be discussed, as well as work underway in developing algorithms for use in the regional models CMAQ and WRF-Chem. A number of methods have been employed to estimate either production per lightning flash or the production per unit flash length. Such estimates derived from cloud-resolved chemistry simulations and from satellite NO2 retrievals will be presented as well as the methodologies employed. Cloud-resolved model output has also been used in developing vertical profiles of lightning NO(x) for use in global models. Effects of lightning NO(x) on O3 and HO(x) distributions will be illustrated regionally and globally.
NASA Astrophysics Data System (ADS)
Bennett, Katrina E.; Urrego Blanco, Jorge R.; Jonko, Alexandra; Bohn, Theodore J.; Atchley, Adam L.; Urban, Nathan M.; Middleton, Richard S.
2018-01-01
The Colorado River Basin is a fundamentally important river for society, ecology, and energy in the United States. Streamflow estimates are often provided using modeling tools which rely on uncertain parameters; sensitivity analysis can help determine which parameters impact model results. Despite the fact that simulated flows respond to changing climate and vegetation in the basin, parameter sensitivity of the simulations under climate change has rarely been considered. In this study, we conduct a global sensitivity analysis to relate changes in runoff, evapotranspiration, snow water equivalent, and soil moisture to model parameters in the Variable Infiltration Capacity (VIC) hydrologic model. We combine global sensitivity analysis with a space-filling Latin Hypercube Sampling of the model parameter space and statistical emulation of the VIC model to examine sensitivities to uncertainties in 46 model parameters following a variance-based approach. We find that snow-dominated regions are much more sensitive to uncertainties in VIC parameters. Although baseflow and runoff changes respond to parameters used in previous sensitivity studies, we discover new key parameter sensitivities. For instance, changes in runoff and evapotranspiration are sensitive to albedo, while changes in snow water equivalent are sensitive to canopy fraction and Leaf Area Index (LAI) in the VIC model. It is critical for improved modeling to narrow uncertainty in these parameters through improved observations and field studies. This is important because LAI and albedo are anticipated to change under future climate and narrowing uncertainty is paramount to advance our application of models such as VIC for water resource management.
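The variance-based, space-filling-sampling approach described above can be illustrated schematically (the actual study emulates the VIC model over 46 parameters): the sketch below draws a Latin Hypercube sample for a toy three-parameter response and estimates first-order sensitivity indices by binning each parameter. The model form and parameter names here are placeholders, not VIC.

import numpy as np
from scipy.stats import qmc

def toy_model(theta):
    # Placeholder response standing in for a simulated hydrologic change (not VIC)
    albedo, lai, canopy = theta.T
    return 2.0 * albedo + lai ** 2 + 0.1 * canopy * lai

sampler = qmc.LatinHypercube(d=3, seed=0)    # space-filling sample of the unit cube
theta = sampler.random(n=5000)
y = toy_model(theta)

def first_order_index(xj, y, nbins=20):
    # Variance-based first-order index: Var[E(y|x_j)] / Var(y), estimated by binning x_j
    edges = np.quantile(xj, np.linspace(0.0, 1.0, nbins + 1))
    idx = np.clip(np.digitize(xj, edges) - 1, 0, nbins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(nbins)])
    weights = np.array([(idx == b).mean() for b in range(nbins)])
    return float(np.sum(weights * (cond_means - y.mean()) ** 2) / y.var())

for j, name in enumerate(["albedo", "LAI", "canopy_fraction"]):
    print(name, round(first_order_index(theta[:, j], y), 3))

In a full study the expensive model is replaced by a statistical emulator trained on the sampled runs, and the same variance decomposition is applied to the emulator output.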
2015-07-17
under-ice scattering, bathymetric diffraction and the application of the ocean acoustic Parabolic Equation to infrasound. 2. Tasks a. Task 1...and Climate of the Ocean, Phase II (ECCO2): High-Resolution Global-Ocean and Sea-Ice Data Synthesis) model re-analysis for the years 1992 and 1993...The ECCO2 model is a state estimation based upon data syntheses obtained by least squares fitting of the global ocean and sea-ice configuration of
GLOBAL REFERENCE ATMOSPHERIC MODELS FOR AEROASSIST APPLICATIONS
NASA Technical Reports Server (NTRS)
Duvall, Aleta; Justus, C. G.; Keller, Vernon W.
2005-01-01
Aeroassist is a broad category of advanced transportation technology encompassing aerocapture, aerobraking, aeroentry, precision landing, hazard detection and avoidance, and aerogravity assist. The eight destinations in the Solar System with sufficient atmosphere to enable aeroassist technology are Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Saturn's moon Titan. Engineering-level atmospheric models for five of these targets - Earth, Mars, Titan, Neptune, and Venus - have been developed at NASA's Marshall Space Flight Center. These models are useful as tools in mission planning and systems analysis studies associated with aeroassist applications. The series of models is collectively named the Global Reference Atmospheric Model or GRAM series. An important capability of all the models in the GRAM series is their ability to simulate quasi-random perturbations for Monte Carlo analysis in developing guidance, navigation and control algorithms, for aerothermal design, and for other applications sensitive to atmospheric variability. Recent example applications are discussed.
Multi-water-bag models of ion temperature gradient instability in cylindrical geometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coulette, David; Besse, Nicolas
2013-05-15
Ion temperature gradient instabilities play a major role in the understanding of anomalous transport in core fusion plasmas. In the considered cylindrical geometry, ion dynamics is described using a drift-kinetic multi-water-bag model for the parallel velocity dependency of the ion distribution function. In a first stage, global linear stability analysis is performed. From the obtained normal modes, parametric dependencies of the main spectral characteristics of the instability are then examined. Comparison of the multi-water-bag results with a reference continuous Maxwellian case allows us to evaluate the effects of discrete parallel velocity sampling induced by the multi-water-bag model. Differences between the global model and local models considered in previous works are discussed. Using results from linear, quasilinear, and nonlinear numerical simulations, an analysis of the first stage saturation dynamics of the instability is proposed, where the divergence between the three models is examined.
NASA Astrophysics Data System (ADS)
Miller, D. O.; Brune, W. H.
2017-12-01
Accurate estimation of secondary organic aerosol (SOA) in atmospheric models is a major research challenge due to the complexity of the chemical and physical processes involved in SOA formation and continuous aging. The primary uncertainties of SOA models include those associated with the formation of gas-phase products, the conversion between the gas phase and the particle phase, the aging mechanisms of SOA, and other processes related to heterogeneous and particle-phase reactions. To address this challenge, we use a modular modeling framework that combines both simple and near-explicit gas-phase reactions and a two-dimensional volatility basis set (2D-VBS) to simulate the formation and evolution of SOA. Global sensitivity analysis is used to assess the relative importance of the model input parameters. In addition, the model is compared to the measurements from the Focused Isoprene eXperiment at the California Institute of Technology (FIXCIT).
A New Methodology of Spatial Cross-Correlation Analysis
Chen, Yanguang
2015-01-01
Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran’s index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson’s correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China’s urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes. PMID:25993120
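The Moran-style quadratic form used above for the global spatial cross-correlation coefficient can be sketched generically; the normalization below (weights summing to one, standardized variables) is one common convention and may differ in detail from the published definition, and the weights matrix and data are invented for illustration.

import numpy as np

def standardize(v):
    return (v - v.mean()) / v.std()

def global_cross_correlation(x, y, W):
    # Spatial cross-correlation in Moran-style quadratic form; W is a spatial
    # weights matrix, normalized here so its entries sum to 1
    W = W / W.sum()
    zx, zy = standardize(x), standardize(y)
    return float(zx @ W @ zy)

# Toy example: 5 regions on a line with rook-style contiguity weights
n = 5
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0
urbanization = np.array([0.3, 0.4, 0.55, 0.6, 0.8])
gdp_per_cap = np.array([1.0, 1.4, 2.0, 2.2, 3.1])
print(global_cross_correlation(urbanization, gdp_per_cap, W))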
A new methodology of spatial cross-correlation analysis.
Chen, Yanguang
2015-01-01
Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran's index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson's correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China's urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes.
High resolution global climate modelling; the UPSCALE project, a large simulation campaign
NASA Astrophysics Data System (ADS)
Mizielinski, M. S.; Roberts, M. J.; Vidale, P. L.; Schiemann, R.; Demory, M.-E.; Strachan, J.; Edwards, T.; Stephens, A.; Lawrence, B. N.; Pritchard, M.; Chiu, P.; Iwi, A.; Churchill, J.; del Cano Novales, C.; Kettleborough, J.; Roseblade, W.; Selwood, P.; Foster, M.; Glover, M.; Malcolm, A.
2014-01-01
The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project constructed and ran an ensemble of HadGEM3 (Hadley Centre Global Environment Model 3) atmosphere-only global climate simulations over the period 1985-2011, at resolutions of N512 (25 km), N216 (60 km) and N96 (130 km) as used in current global weather forecasting, seasonal prediction and climate modelling respectively. Alongside these present climate simulations a parallel ensemble looking at extremes of future climate was run, using a time-slice methodology to consider conditions at the end of this century. These simulations were primarily performed using a 144 million core hour, single year grant of computing time from PRACE (the Partnership for Advanced Computing in Europe) in 2012, with additional resources supplied by the Natural Environment Research Council (NERC) and the Met Office. Almost 400 terabytes of simulation data were generated on the HERMIT supercomputer at the High Performance Computing Center Stuttgart (HLRS), and transferred to the JASMIN super-data cluster provided by the Science and Technology Facilities Council Centre for Data Archival (STFC CEDA) for analysis and storage. In this paper we describe the implementation of the project, present the technical challenges in terms of optimisation, data output, transfer and storage that such a project involves and include details of the model configuration and the composition of the UPSCALE dataset. This dataset is available for scientific analysis to allow assessment of the value of model resolution in both present and potential future climate conditions.
High-resolution global climate modelling: the UPSCALE project, a large-simulation campaign
NASA Astrophysics Data System (ADS)
Mizielinski, M. S.; Roberts, M. J.; Vidale, P. L.; Schiemann, R.; Demory, M.-E.; Strachan, J.; Edwards, T.; Stephens, A.; Lawrence, B. N.; Pritchard, M.; Chiu, P.; Iwi, A.; Churchill, J.; del Cano Novales, C.; Kettleborough, J.; Roseblade, W.; Selwood, P.; Foster, M.; Glover, M.; Malcolm, A.
2014-08-01
The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project constructed and ran an ensemble of HadGEM3 (Hadley Centre Global Environment Model 3) atmosphere-only global climate simulations over the period 1985-2011, at resolutions of N512 (25 km), N216 (60 km) and N96 (130 km) as used in current global weather forecasting, seasonal prediction and climate modelling respectively. Alongside these present climate simulations a parallel ensemble looking at extremes of future climate was run, using a time-slice methodology to consider conditions at the end of this century. These simulations were primarily performed using a 144 million core hour, single year grant of computing time from PRACE (the Partnership for Advanced Computing in Europe) in 2012, with additional resources supplied by the Natural Environment Research Council (NERC) and the Met Office. Almost 400 terabytes of simulation data were generated on the HERMIT supercomputer at the High Performance Computing Center Stuttgart (HLRS), and transferred to the JASMIN super-data cluster provided by the Science and Technology Facilities Council Centre for Data Archival (STFC CEDA) for analysis and storage. In this paper we describe the implementation of the project, present the technical challenges in terms of optimisation, data output, transfer and storage that such a project involves and include details of the model configuration and the composition of the UPSCALE data set. This data set is available for scientific analysis to allow assessment of the value of model resolution in both present and potential future climate conditions.
Disciplinary competitiveness analysis in international stomatology education.
Wen, Ping; Hong, Xiao; Zhu, Lu; Zhang, Linglin; Gu, Xuekui; Gao, Zhihua; Chen, Qianming
2013-11-01
With economic and cultural globalization, the globalization of higher education has become inevitable. Using the concept of competitiveness, the authors established a principal component analysis (PCA) model to examine the disciplinary competitiveness in stomatology of various higher education institutions worldwide. A total of forty-four universities entered the final list according to these calculations. Possible reasons for their selection were explored and explained at the macro and micro levels. The authors further accessed various sources of data and summarized several suggestions for enhancing disciplinary competitiveness for other universities seeking to improve their position in the global spectrum.
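A PCA-based competitiveness ranking of this kind can be illustrated generically: standardize the indicator columns, project onto principal components, and combine the component scores into a composite weighted by explained variance. The indicators and values below are invented, not the study's data.

import numpy as np

# Hypothetical indicator matrix: rows = institutions, columns = indicators
# (e.g. publications, citations, faculty, funding); values are made up
X = np.array([[320, 4100, 85, 12.0],
              [210, 2800, 60,  8.5],
              [450, 6900, 95, 20.0],
              [150, 1500, 40,  5.0]], float)

Z = (X - X.mean(axis=0)) / X.std(axis=0)         # standardize indicators
eigval, eigvec = np.linalg.eigh(np.cov(Z, rowvar=False))
order = np.argsort(eigval)[::-1]                 # sort components by variance explained
eigval, eigvec = eigval[order], eigvec[:, order]

scores = Z @ eigvec                              # principal-component scores
weights = eigval / eigval.sum()                  # weight components by explained variance
composite = scores @ weights                     # composite competitiveness score
print(np.argsort(composite)[::-1])               # institutions ranked by composite score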
NASA Astrophysics Data System (ADS)
Holt, C. R.; Szunyogh, I.; Gyarmati, G.; Hoffman, R. N.; Leidner, M.
2011-12-01
Tropical cyclone (TC) track and intensity forecasts have improved in recent years due to increased model resolution, improved data assimilation, and the rapid increase in the number of routinely assimilated observations over oceans. The data assimilation approach that has received the most attention in recent years is Ensemble Kalman Filtering (EnKF). The most attractive feature of the EnKF is that it uses a fully flow-dependent estimate of the error statistics, which can have important benefits for the analysis of rapidly developing TCs. We implement the Local Ensemble Transform Kalman Filter algorithm, a variation of the EnKF, on a reduced-resolution version of the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS) model and the NCEP Regional Spectral Model (RSM) to build a coupled global-limited area analysis/forecast system. This is the first time, to our knowledge, that such a system is used for the analysis and forecast of tropical cyclones. We use data from summer 2004 to study eight tropical cyclones in the Northwest Pacific. The benchmark data sets that we use to assess the performance of our system are the NCEP Reanalysis and the NCEP Operational GFS analyses from 2004. These benchmark analyses were both obtained by the Spectral Statistical Interpolation, which was the operational data assimilation system of NCEP in 2004. The GFS Operational analysis assimilated a large number of satellite radiance observations in addition to the observations assimilated in our system. All analyses are verified against the Joint Typhoon Warning Center Best Track data set. The errors are calculated for the position and intensity of the TCs. The global component of the ensemble-based system shows improvement in position analysis over the NCEP Reanalysis, but shows no significant difference from the NCEP operational analysis for most of the storm tracks. The regional component of our system improves position analysis over all the global analyses. The intensity analyses, measured by the minimum sea level pressure, are of similar quality in all of the analyses. Regional deterministic forecasts started from our analyses are generally not significantly different from those started from the GFS operational analysis. On average, the regional experiments performed better for sea level pressure forecasts beyond 48 h, while the global forecast performed better in predicting the position beyond 48 h.
NASA Technical Reports Server (NTRS)
Hailperin, M.
1993-01-01
This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that the authors' techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. The authors' method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive load-balancing methods. Results from a preliminary analysis of another system and from simulation with a synthetic load provide some evidence of more general applicability.
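The thesis models both the load process and the averaging process stochastically; as a rough stand-in (not the thesis's own algorithm), the sketch below tracks the global average load with a scalar Kalman filter, assuming the average load follows a mean-reverting AR(1) process and that each round only a small random subset of processors reports a noisy sample. All rates and variances are invented.

import numpy as np

rng = np.random.default_rng(1)
n_steps = 200
phi, mu, q, r = 0.95, 1.0, 0.02, 0.1   # assumed AR(1) dynamics and noise variances

true_avg = mu
est, var = mu, 1.0                      # filter state: estimate and its variance
errors = []
for t in range(n_steps):
    # True global average load evolves as a mean-reverting AR(1) process
    true_avg = mu + phi * (true_avg - mu) + rng.normal(0, np.sqrt(q))
    # Only 8 sampled processors report; their mean is a noisy measurement (variance r/8)
    sample = true_avg + rng.normal(0, np.sqrt(r), size=8).mean()
    # Scalar Kalman predict/update
    est = mu + phi * (est - mu)
    var = phi ** 2 * var + q
    gain = var / (var + r / 8)
    est = est + gain * (sample - est)
    var = (1 - gain) * var
    errors.append(abs(est - true_avg))
print("mean abs estimation error:", np.mean(errors))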
Global impacts of the 1980s regime shift.
Reid, Philip C; Hari, Renata E; Beaugrand, Grégory; Livingstone, David M; Marty, Christoph; Straile, Dietmar; Barichivich, Jonathan; Goberville, Eric; Adrian, Rita; Aono, Yasuyuki; Brown, Ross; Foster, James; Groisman, Pavel; Hélaouët, Pierre; Hsu, Huang-Hsiung; Kirby, Richard; Knight, Jeff; Kraberg, Alexandra; Li, Jianping; Lo, Tzu-Ting; Myneni, Ranga B; North, Ryan P; Pounds, J Alan; Sparks, Tim; Stübi, René; Tian, Yongjun; Wiltshire, Karen H; Xiao, Dong; Zhu, Zaichun
2016-02-01
Despite evidence from a number of Earth systems that abrupt temporal changes known as regime shifts are important, their nature, scale and mechanisms remain poorly documented and understood. Applying principal component analysis, change-point analysis and a sequential t-test analysis of regime shifts to 72 time series, we confirm that the 1980s regime shift represented a major change in the Earth's biophysical systems from the upper atmosphere to the depths of the ocean and from the Arctic to the Antarctic, and occurred at slightly different times around the world. Using historical climate model simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5) and statistical modelling of historical temperatures, we then demonstrate that this event was triggered by rapid global warming from anthropogenic plus natural forcing, the latter associated with the recovery from the El Chichón volcanic eruption. The shift in temperature that occurred at this time is hypothesized as the main forcing for a cascade of abrupt environmental changes. Within the context of the last century or more, the 1980s event was unique in terms of its global scope and scale; our observed consequences imply that if unavoidable natural events such as major volcanic eruptions interact with anthropogenic warming, unforeseen multiplier effects may occur. © 2015 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Ampatzidis, Dimitrios; König, Rolf; Glaser, Susanne; Heinkelmann, Robert; Schuh, Harald; Flechtner, Frank; Nilsson, Tobias
2016-04-01
The aim of our study is to assess the classical Helmert similarity transformation using the Velocity Decomposition Analysis (VEDA). VEDA is a new methodology developed by GFZ for the assessment of reference frames' temporal variation; it is based on separating the velocities into two parts: the first is related to the reference system choice (the so-called datum effect), and the second refers to the real deformation of the terrestrial points. The advantage of VEDA is its ability to detect relative biases and reference system effects between two different frames, or between two different realizations of the same frame. We apply VEDA to assess the differences between several modern tectonic plate models and recent global terrestrial reference frames.
A program in global biology. [biota-environment interaction important to life
NASA Technical Reports Server (NTRS)
Mooneyhan, D. W.
1983-01-01
NASA's Global Biology Research Program and its goals for greater understanding of planetary biological processes are discussed. Consideration is given to assessing major pathways and rates of exchange of elements such as carbon and nitrogen, extrapolating local rates of anaerobic activities, determining exchange rates of ocean nutrients, and developing models for the global cycles of carbon, nitrogen, sulfur, and phosphorus. Satellites and sensors operating today are covered: the Nimbus, NOAA, and Landsat series. Block diagrams of the software and hardware for a typical ground data processing and analysis system are provided. Samples of the surface cover data achieved with the Advanced Very High Resolution Radiometer, the Multispectral Scanner, and the Thematic Mapper are presented, as well as a productive capacity model for coastal wetlands. Finally, attention is given to future goals, their engineering requirements, and the necessary data analysis system.
Global Gridded Crop Model Evaluation: Benchmarking, Skills, Deficiencies and Implications.
NASA Technical Reports Server (NTRS)
Muller, Christoph; Elliott, Joshua; Chryssanthacopoulos, James; Arneth, Almut; Balkovic, Juraj; Ciais, Philippe; Deryng, Delphine; Folberth, Christian; Glotter, Michael; Hoek, Steven;
2017-01-01
Crop models are increasingly used to simulate crop yields at the global scale, but so far there is no general framework on how to assess model performance. Here we evaluate the simulation results of 14 global gridded crop modeling groups that have contributed historic crop yield simulations for maize, wheat, rice and soybean to the Global Gridded Crop Model Intercomparison (GGCMI) of the Agricultural Model Intercomparison and Improvement Project (AgMIP). Simulation results are compared to reference data at global, national and grid cell scales and we evaluate model performance with respect to time series correlation, spatial correlation and mean bias. We find that global gridded crop models (GGCMs) show mixed skill in reproducing time series correlations or spatial patterns at the different spatial scales. Generally, maize, wheat and soybean simulations of many GGCMs are capable of reproducing larger parts of observed temporal variability (time series correlation coefficients (r) of up to 0.888 for maize, 0.673 for wheat and 0.643 for soybean at the global scale) but rice yield variability cannot be well reproduced by most models. Yield variability can be well reproduced for most major producing countries by many GGCMs and for all countries by at least some. A comparison with gridded yield data and a statistical analysis of the effects of weather variability on yield variability shows that the ensemble of GGCMs can explain more of the yield variability than an ensemble of regression models for maize and soybean, but not for wheat and rice. We identify future research needs in global gridded crop modeling and for all individual crop modeling groups. In the absence of a purely observation-based benchmark for model evaluation, we propose that the best performing crop model per crop and region establishes the benchmark for all others, and modelers are encouraged to investigate how crop model performance can be increased. We make our evaluation system accessible to all crop modelers so that other modeling groups can also test their model performance against the reference data and the GGCMI benchmark.
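The GGCMI-style evaluation reduces, per model, crop and region, to a few simple statistics; a minimal sketch of the time-series correlation and mean bias against reference yields is given below. The yield anomaly numbers are invented for illustration and stand in for detrended national yields.

import numpy as np

# Hypothetical detrended yield anomalies (t/ha) for one crop and one country
reference = np.array([0.3, -0.1, 0.4, -0.5, 0.2, 0.1, -0.3])
simulated = np.array([0.2,  0.0, 0.5, -0.4, 0.1, 0.2, -0.1])

def time_series_correlation(sim, ref):
    # Pearson correlation of simulated vs. reference interannual variability
    return float(np.corrcoef(sim, ref)[0, 1])

def mean_bias(sim, ref):
    # Average over- or under-estimation of the reference yields
    return float(np.mean(sim - ref))

print("r =", round(time_series_correlation(simulated, reference), 3))
print("bias =", round(mean_bias(simulated, reference), 3))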
NASA Technical Reports Server (NTRS)
Hailperin, Max
1993-01-01
This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that our techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. Our method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive methods.
NASA Astrophysics Data System (ADS)
Wang, Wei; Zhong, Ming; Cheng, Ling; Jin, Lu; Shen, Si
2018-02-01
Against the background of building a global energy internet, forecasting and analysing the ratio of electric energy in terminal energy consumption has both theoretical and practical significance. This paper first analysed the factors influencing the ratio of electric energy in terminal energy consumption and then used a combination method to forecast and analyse the global proportion of electric energy. A cointegration model for the proportion of electric energy was then constructed using influencing factors such as the electricity price index, GDP, economic structure, energy use efficiency, and total population. Finally, the paper obtained a projection of the proportion of electric energy using a combination-forecasting model based on multiple linear regression, trend analysis, and the variance-covariance method. This projection describes the development trend of the proportion of electric energy over 2017-2050, and the proportion of electric energy in 2050 was analysed in detail using scenario analysis.
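The variance-covariance combination method referenced above weights the individual forecasts to minimize the variance of the combined error; a minimal two-method sketch (with invented error histories and invented point forecasts) is shown below.

import numpy as np

# Hypothetical historical forecast errors of two individual methods
# (e.g. multiple linear regression and trend extrapolation)
err_regression = np.array([0.8, -0.5, 0.3, 0.6, -0.2])
err_trend      = np.array([1.5, -1.1, 0.9, 1.2, -0.7])
E = np.vstack([err_regression, err_trend])

Sigma = np.cov(E)                       # error variance-covariance matrix
ones = np.ones(2)
w = np.linalg.solve(Sigma, ones)
w = w / (ones @ w)                      # minimum-variance combination weights
print("weights:", w)

# Combine point forecasts of the electricity share for some future year
f_regression, f_trend = 0.27, 0.30      # invented individual forecasts
print("combined forecast:", w[0] * f_regression + w[1] * f_trend)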
A variable resolution nonhydrostatic global atmospheric semi-implicit semi-Lagrangian model
NASA Astrophysics Data System (ADS)
Pouliot, George Antoine
2000-10-01
The objective of this project is to develop a variable-resolution finite difference adiabatic global nonhydrostatic semi-implicit semi-Lagrangian (SISL) model based on the fully compressible nonhydrostatic atmospheric equations. To achieve this goal, a three-dimensional variable resolution dynamical core was developed and tested. The main characteristics of the dynamical core can be summarized as follows: Spherical coordinates were used in a global domain. A hydrostatic/nonhydrostatic switch was incorporated into the dynamical equations to use the fully compressible atmospheric equations. A generalized horizontal variable resolution grid was developed and incorporated into the model. For a variable resolution grid, in contrast to a uniform resolution grid, the order of accuracy of finite difference approximations is formally lost but remains close to the order of accuracy associated with the uniform resolution grid provided the grid stretching is not too significant. The SISL numerical scheme was implemented for the fully compressible set of equations. In addition, the generalized minimum residual (GMRES) method with restart and preconditioner was used to solve the three-dimensional elliptic equation derived from the discretized system of equations. The three-dimensional momentum equation was integrated in vector-form to incorporate the metric terms in the calculations of the trajectories. Using global re-analysis data for a specific test case, the model was compared to similar SISL models previously developed. Reasonable agreement between the model and the other independently developed models was obtained. The Held-Suarez test for dynamical cores was used for a long integration and the model was successfully integrated for up to 1200 days. Idealized topography was used to test the variable resolution component of the model. Nonhydrostatic effects were simulated at grid spacings of 400 meters with idealized topography and uniform flow. Using a high-resolution topographic data set and the variable resolution grid, sets of experiments with increasing resolution were performed over specific regions of interest. Using realistic initial conditions derived from re-analysis fields, nonhydrostatic effects were significant for grid spacings on the order of 0.1 degrees with orographic forcing. If the model code was adapted for use in a message passing interface (MPI) on a parallel supercomputer today, it was estimated that a global grid spacing of 0.1 degrees would be achievable for a global model. In this case, nonhydrostatic effects would be significant for most areas. A variable resolution grid in a global model provides a unified and flexible approach to many climate and numerical weather prediction problems. The ability to configure the model from very fine to very coarse resolutions allows for the simulation of atmospheric phenomena at different scales using the same code. We have developed a dynamical core illustrating the feasibility of using a variable resolution in a global model.
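The elliptic solve mentioned above uses restarted, preconditioned GMRES; independently of this particular model, the same pattern can be reproduced with SciPy's sparse GMRES on a small Helmholtz-like test operator. The matrix, preconditioner settings and tolerances below are placeholders, not the model's actual discretization.

import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import gmres, spilu, LinearOperator

# Placeholder 2-D Helmholtz-like operator standing in for the model's elliptic equation
n = 50
I = sp.identity(n)
T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))
A = (sp.kron(I, T) + sp.kron(T, I) + 0.1 * sp.identity(n * n)).tocsc()
b = np.ones(n * n)

# Incomplete-LU factorization wrapped as a preconditioner
ilu = spilu(A, drop_tol=1e-4)
M = LinearOperator(A.shape, ilu.solve)

# Restarted, preconditioned GMRES
x, info = gmres(A, b, M=M, restart=30, maxiter=500, atol=1e-8)
print("converged" if info == 0 else f"info = {info}",
      "residual =", np.linalg.norm(b - A @ x))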
ERIC Educational Resources Information Center
Walsh, Jim; McGehee, Richard
2013-01-01
A dynamical systems approach to energy balance models of climate is presented, focusing on low order, or conceptual, models. Included are global average and latitude-dependent, surface temperature models. The development and analysis of the differential equations and corresponding bifurcation diagrams provides a host of appropriate material for…
The Impact of Desert Dust Aerosol Radiative Forcing on Global and West African Precipitation
NASA Astrophysics Data System (ADS)
Jordan, A.; Zaitchik, B. F.; Gnanadesikan, A.; Dezfuli, A. K.
2015-12-01
Desert dust aerosols exert a radiative forcing on the atmosphere, influencing atmospheric temperature structure and modifying radiative fluxes at the top of the atmosphere (TOA) and surface. As dust aerosols perturb radiative fluxes, the atmosphere responds by altering both energy and moisture dynamics, with potentially significant impacts on regional and global precipitation. Global Climate Model (GCM) experiments designed to characterize these processes have yielded a wide range of results, owing to both the complex nature of the system and diverse differences across models. Most model results show a general decrease in global precipitation, but regional results vary. Here, we compare simulations from GFDL's CM2Mc GCM with multiple other model experiments from the literature in order to investigate mechanisms of radiative impact and reasons for GCM differences on a global and regional scale. We focus on West Africa, a region of high interannual rainfall variability that is a source of dust and that neighbors major Sahara Desert dust sources. As such, changes in West African climate due to radiative forcing of desert dust aerosol have serious implications for desertification feedbacks. Our CM2Mc results show net cooling of the planet at TOA and surface, net warming of the atmosphere, and significant increases in precipitation over West Africa during the summer rainy season. These results differ from some previous GCM studies, prompting comparative analysis of desert dust parameters across models. This presentation will offer quantitative analysis of differences in dust aerosol parameters, aerosol optical properties, and overall particle burden across GCMs, and will characterize the contribution of model differences to the uncertainty of forcing and climate response affecting West Africa.
Global and Local Stress Analyses of McDonnell Douglas Stitched/RFI Composite Wing Stub Box
NASA Technical Reports Server (NTRS)
Wang, John T.
1996-01-01
This report contains results of structural analyses performed in support of the NASA structural testing of an all-composite stitched/RFI (resin film infusion) wing stub box. McDonnell Douglas Aerospace Company designed and fabricated the wing stub box. The analyses used a global/local approach. The global model contains the entire test article. It includes the all-composite stub box, a metallic load-transition box and a metallic wing-tip extension box. The two metallic boxes are connected to the inboard and outboard ends of the composite wing stub box, respectively. The load-transition box was attached to a steel and concrete vertical reaction structure and a load was applied at the tip of the extension box to bend the wing stub box upward. The local model contains an upper cover region surrounding three stringer runouts. In that region, a large nonlinear deformation was identified by the global analyses. A more detailed mesh was used for the local model to obtain more accurate analysis results near stringer runouts. Numerous analysis results such as deformed shapes, displacements at selected locations, and strains at critical locations are included in this report.
Modal simulation of gearbox vibration with experimental correlation
NASA Technical Reports Server (NTRS)
Choy, Fred K.; Ruan, Yeefeng F.; Zakrajsek, James J.; Oswald, Fred B.
1992-01-01
A newly developed global dynamic model was used to simulate the dynamics of a gear noise rig at NASA Lewis Research Center. Experimental results from the test rig were used to verify the analytical model. In this global dynamic model, the number of degrees of freedom of the system is reduced by transforming the system equations of motion into modal coordinates. The vibrations of the individual gear-shaft systems are coupled through the gear mesh forces. A three-dimensional, axial-lateral coupled bearing model was used to couple the casing structural vibration to the gear-rotor dynamics. The coupled system of modal equations is solved to predict the resulting vibration at several locations on the test rig. Experimental vibration data were compared to the predictions of the global dynamic model. There is excellent agreement between the vibration results from analysis and experiment.
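The modal-coordinate reduction used in such a global dynamic model follows the standard pattern of projecting the equations of motion onto a truncated set of mode shapes; a generic sketch on a small lumped-parameter system (not the gear-rig model itself, with invented mass and stiffness values) is given below.

import numpy as np
from scipy.linalg import eigh

# Hypothetical lumped mass and stiffness matrices of a small rotor-like system
M = np.diag([2.0, 1.0, 1.5, 1.0])
K = np.array([[ 4e4, -2e4,    0,    0],
              [-2e4,  5e4, -3e4,    0],
              [   0, -3e4,  6e4, -3e4],
              [   0,    0, -3e4,  3e4]])

# Undamped modes from the generalized eigenproblem K phi = w^2 M phi
w2, Phi = eigh(K, M)
Phi_r = Phi[:, :2]                           # keep only the two lowest modes

# Project the physical equations of motion onto modal coordinates
K_r = Phi_r.T @ K @ Phi_r                    # reduced stiffness matrix
f = np.array([0.0, 0.0, 0.0, 100.0])         # physical force (e.g. a mesh force)
f_r = Phi_r.T @ f                            # modal force

# Static modal response mapped back to physical coordinates
q = np.linalg.solve(K_r, f_r)
print("approx. displacements:", Phi_r @ q)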
Global Qualitative Flow-Path Modeling for Local State Determination in Simulation and Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T. (Inventor); Fleming, Land D. (Inventor)
1998-01-01
For qualitative modeling and analysis, a general qualitative abstraction of power transmission variables (flow and effort) for elements of flow paths is discussed; it includes information on resistance, net flow, permissible directions of flow, and qualitative potential. Each type of component model has flow-related variables and an associated internal flow map, connected into an overall flow network of the system. For storage devices, the implicit power transfer to the environment is represented by "virtual" circuits that include an environmental junction. A heterogeneous aggregation method simplifies the path structure. A method determines global flow-path changes during dynamic simulation and analysis, and identifies corresponding local flow state changes that are effects of global configuration changes. Flow-path determination is triggered by any change in a flow-related device variable in a simulation or analysis. Components (path elements) that may be affected are identified, and flow-related attributes favoring flow in the two possible directions are collected for each of them. Next, flow-related attributes are determined for each affected path element, based on possibly conflicting indications of flow direction. Spurious qualitative ambiguities are minimized by using relative magnitudes and permissible directions of flow, and by favoring flow sources over effort sources when comparing flow tendencies. The results are output to local flow states of affected components.
SensA: web-based sensitivity analysis of SBML models.
Floettmann, Max; Uhlendorf, Jannis; Scharp, Till; Klipp, Edda; Spiesser, Thomas W
2014-10-01
SensA is a web-based application for sensitivity analysis of mathematical models. The sensitivity analysis is based on metabolic control analysis, computing the local, global and time-dependent properties of model components. Interactive visualization facilitates interpretation of usually complex results. SensA can contribute to the analysis, adjustment and understanding of mathematical models for dynamic systems. SensA is available at http://gofid.biologie.hu-berlin.de/ and can be used with any modern browser. The source code can be found at https://bitbucket.org/floettma/sensa/ (MIT license). © The Author 2014. Published by Oxford University Press.
Simplified Models for the Study of Postbuckled Hat-Stiffened Composite Panels
NASA Technical Reports Server (NTRS)
Vescovini, Riccardo; Davila, Carlos G.; Bisagni, Chiara
2012-01-01
The postbuckling response and failure of multistringer stiffened panels is analyzed using models with three levels of approximation. The first model uses a relatively coarse mesh to capture the global postbuckling response of a five-stringer panel. The second model can predict the nonlinear response as well as the debonding and crippling failure mechanisms in a single stringer compression specimen (SSCS). The third model consists of a simplified version of the SSCS that is designed to minimize the computational effort. The simplified model is well-suited to perform sensitivity analyses for studying the phenomena that lead to structural collapse. In particular, the simplified model is used to obtain a deeper understanding of the role played by geometric and material modeling parameters such as mesh size, inter-laminar strength, fracture toughness, and fracture mode mixity. Finally, a global/local damage analysis method is proposed in which a detailed local model is used to scan the global model to identify the locations that are most critical for damage tolerance.
NASA Astrophysics Data System (ADS)
Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan
2015-04-01
Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood risk have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and if they were open access. This process was used to select a subset of 31 models that include 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, making the compendium of risk software tools in excess of 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. It was seen that many software tools could be improved by enabling user-defined exposure and vulnerability. Without this function, many tools can only be used regionally and not at global or continental scale. It is becoming increasingly easy to use multiple packages for a single region and/or hazard to characterize the uncertainty in the risk, or use as checks for the sensitivities in the analysis. There is a potential for valuable synergy between existing software. A number of open source software packages could be combined to generate a multi-risk model with multiple views of a hazard. This extensive review has simply attempted to provide a platform for dialogue between all open source and open access software packages and to hopefully inspire collaboration between developers, given the great work done by all open access and open source developers.
Simulating the effects of climate and agricultural management practices on global crop yield
NASA Astrophysics Data System (ADS)
Deryng, D.; Sacks, W. J.; Barford, C. C.; Ramankutty, N.
2011-06-01
Climate change is expected to significantly impact global food production, and it is important to understand the potential geographic distribution of yield losses and the means to alleviate them. This study presents a new global crop model, PEGASUS 1.0 (Predicting Ecosystem Goods And Services Using Scenarios) that integrates, in addition to climate, the effect of planting dates and cultivar choices, irrigation, and fertilizer application on crop yield for maize, soybean, and spring wheat. PEGASUS combines carbon dynamics for crops with a surface energy and soil water balance model. It also benefits from the recent development of a suite of global data sets and analyses that serve as model inputs or as calibration data. These include data on crop planting and harvesting dates, crop-specific irrigated areas, a global analysis of yield gaps, and harvested area and yield of major crops. Model results for present-day climate and farm management compare reasonably well with global data. Simulated planting and harvesting dates are within the range of crop calendar observations in more than 75% of the total crop-harvested areas. Correlation of simulated and observed crop yields indicates a weighted coefficient of determination, with the weighting based on crop-harvested area, of 0.81 for maize, 0.66 for soybean, and 0.45 for spring wheat. We found that changes in temperature and precipitation as predicted by global climate models for the 2050s lead to a global yield reduction if planting and harvesting dates remain unchanged. However, adapting planting dates and cultivar choices increases yield in temperate regions and avoids 7-18% of global losses.
Naujokaitis-Lewis, Ilona; Curtis, Janelle M R
2016-01-01
Developing a rigorous understanding of multiple global threats to species persistence requires the use of integrated modeling methods that capture processes which influence species distributions. Species distribution models (SDMs) coupled with population dynamics models can incorporate relationships between changing environments and demographics and are increasingly used to quantify relative extinction risks associated with climate and land-use changes. Despite their appeal, uncertainties associated with complex models can undermine their usefulness for advancing predictive ecology and informing conservation management decisions. We developed a computationally-efficient and freely available tool (GRIP 2.0) that implements and automates a global sensitivity analysis of coupled SDM-population dynamics models for comparing the relative influence of demographic parameters and habitat attributes on predicted extinction risk. Advances over previous global sensitivity analyses include the ability to vary habitat suitability across gradients, as well as habitat amount and configuration of spatially-explicit suitability maps of real and simulated landscapes. Using GRIP 2.0, we carried out a multi-model global sensitivity analysis of a coupled SDM-population dynamics model of whitebark pine (Pinus albicaulis) in Mount Rainier National Park as a case study and quantified the relative influence of input parameters and their interactions on model predictions. Our results differed from the one-at-a-time analyses used in the original study, and we found that the most influential parameters included the total amount of suitable habitat within the landscape, survival rates, and effects of a prevalent disease, white pine blister rust. Strong interactions between habitat amount and survival rates of older trees suggest the importance of habitat in mediating the negative influences of white pine blister rust. Our results underscore the importance of considering habitat attributes along with demographic parameters in sensitivity routines. GRIP 2.0 is an important decision-support tool that can be used to prioritize research, identify habitat-based thresholds and management intervention points to improve probability of species persistence, and evaluate trade-offs of alternative management options.
Curtis, Janelle M.R.
2016-01-01
Developing a rigorous understanding of multiple global threats to species persistence requires the use of integrated modeling methods that capture processes which influence species distributions. Species distribution models (SDMs) coupled with population dynamics models can incorporate relationships between changing environments and demographics and are increasingly used to quantify relative extinction risks associated with climate and land-use changes. Despite their appeal, uncertainties associated with complex models can undermine their usefulness for advancing predictive ecology and informing conservation management decisions. We developed a computationally-efficient and freely available tool (GRIP 2.0) that implements and automates a global sensitivity analysis of coupled SDM-population dynamics models for comparing the relative influence of demographic parameters and habitat attributes on predicted extinction risk. Advances over previous global sensitivity analyses include the ability to vary habitat suitability across gradients, as well as habitat amount and configuration of spatially-explicit suitability maps of real and simulated landscapes. Using GRIP 2.0, we carried out a multi-model global sensitivity analysis of a coupled SDM-population dynamics model of whitebark pine (Pinus albicaulis) in Mount Rainier National Park as a case study and quantified the relative influence of input parameters and their interactions on model predictions. Our results differed from the one-at-a-time analyses used in the original study, and we found that the most influential parameters included the total amount of suitable habitat within the landscape, survival rates, and effects of a prevalent disease, white pine blister rust. Strong interactions between habitat amount and survival rates of older trees suggest the importance of habitat in mediating the negative influences of white pine blister rust. Our results underscore the importance of considering habitat attributes along with demographic parameters in sensitivity routines. GRIP 2.0 is an important decision-support tool that can be used to prioritize research, identify habitat-based thresholds and management intervention points to improve probability of species persistence, and evaluate trade-offs of alternative management options. PMID:27547529
Spectral Analysis of Forecast Error Investigated with an Observing System Simulation Experiment
NASA Technical Reports Server (NTRS)
Prive, N. C.; Errico, Ronald M.
2015-01-01
The spectra of analysis and forecast error are examined using the observing system simulation experiment (OSSE) framework developed at the National Aeronautics and Space Administration Global Modeling and Assimilation Office (NASA GMAO). A global numerical weather prediction model, the Goddard Earth Observing System version 5 (GEOS-5) with Gridpoint Statistical Interpolation (GSI) data assimilation, is cycled for two months with once-daily forecasts to 336 hours to generate a control case. Verification of forecast errors using the Nature Run as truth is compared with verification of forecast errors using self-analysis; significant underestimation of forecast errors is seen using self-analysis verification for up to 48 hours. Likewise, self-analysis verification significantly overestimates the error growth rates of the early forecast, as well as mischaracterizing the spatial scales at which the strongest growth occurs. The Nature Run-verified error variances exhibit a complicated progression of growth, particularly for low-wavenumber errors. In a second experiment, cycling of the model and data assimilation over the same period is repeated, but using synthetic observations with different explicitly added observation errors having the same error variances as the control experiment, thus creating a different realization of the control. The forecast errors of the two experiments become more correlated during the early forecast period, with correlations increasing for up to 72 hours before beginning to decrease.
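Decomposing a forecast-error field by spatial scale can be illustrated in a minimal way (here simply along one latitude circle with a synthetic error field, not the GEOS-5 diagnostics): the error variance is partitioned across zonal wavenumbers using the discrete Fourier transform and Parseval's identity. The wavenumbers and amplitudes below are invented.

import numpy as np

# Synthetic 1-D forecast-error field along a latitude circle (periodic)
n = 360
lon = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
rng = np.random.default_rng(0)
error = 1.5 * np.sin(3 * lon) + 0.5 * np.sin(20 * lon) + 0.2 * rng.standard_normal(n)

# Error variance contributed by each zonal wavenumber
coeffs = np.fft.rfft(error) / n
power = np.abs(coeffs) ** 2
power[1:] *= 2.0                       # conjugate-symmetric bins counted twice
if n % 2 == 0:
    power[-1] /= 2.0                   # Nyquist bin appears only once for even n
print("error variance:", round(error.var(), 4))
print("sum over wavenumbers k >= 1:", round(power[1:].sum(), 4))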
Parametric uncertainties in global model simulations of black carbon column mass concentration
NASA Astrophysics Data System (ADS)
Pearce, Hana; Lee, Lindsay; Reddington, Carly; Carslaw, Ken; Mann, Graham
2016-04-01
Previous studies have deduced that the annual mean direct radiative forcing from black carbon (BC) aerosol may regionally be up to 5 W m^-2 larger than expected due to underestimation of global atmospheric BC absorption in models. We have identified the magnitude and important sources of parametric uncertainty in simulations of BC column mass concentration from a global aerosol microphysics model (GLOMAP-Mode). A variance-based uncertainty analysis of 28 parameters has been performed, based on statistical emulators trained on model output from GLOMAP-Mode. This is the largest number of uncertain model parameters to be considered in a BC uncertainty analysis to date and covers primary aerosol emissions, microphysical processes and structural parameters related to the aerosol size distribution. We will present several recommendations for further research to improve the fidelity of simulated BC. In brief, we find that the standard deviation around the simulated mean annual BC column mass concentration varies globally between 2.5 x 10^-9 g cm^-2 in remote marine regions and 1.25 x 10^-6 g cm^-2 near emission sources due to parameter uncertainty. Between 60 and 90% of the variance over source regions is due to uncertainty associated with primary BC emission fluxes, including biomass burning, fossil fuel and biofuel emissions. While the contributions to BC column uncertainty from microphysical processes, for example those related to dry and wet deposition, are increased over remote regions, we find that emissions still make an important contribution in these areas. It is likely, however, that the importance of structural model error, i.e. differences between models, is greater than parametric uncertainty. We have extended our analysis to emulate vertical BC profiles at several locations in the mid-Pacific Ocean and identify the parameters contributing to uncertainty in the vertical distribution of black carbon at these locations. We will present preliminary comparisons of emulated BC vertical profiles from the AeroCom multi-model ensemble and HIAPER Pole-to-Pole Observations (HIPPO).
Realism of Indian Summer Monsoon Simulation in a Quarter Degree Global Climate Model
NASA Astrophysics Data System (ADS)
Salunke, P.; Mishra, S. K.; Sahany, S.; Gupta, K.
2017-12-01
This study assesses the fidelity of Indian Summer Monsoon (ISM) simulations using a global model at an ultra-high horizontal resolution (UHR) of 0.25°. The model used was the atmospheric component of the Community Earth System Model version 1.2.0 (CESM 1.2.0) developed at the National Center for Atmospheric Research (NCAR). Precipitation and temperature over the Indian region were analyzed for a wide range of space and time scales to evaluate the fidelity of the model under UHR, with special emphasis on the ISM simulations during the period of June-through-September (JJAS). Comparing the UHR simulations with observed data from the India Meteorological Department (IMD) over the Indian landmass, it was found that 0.25° resolution significantly improved spatial rainfall patterns over many regions, including the Western Ghats and the South-Eastern peninsula as compared to the standard model resolution. Convective and large-scale rainfall components were analyzed using the European Centre for Medium-Range Weather Forecasts (ECMWF) Re-Analysis (ERA)-Interim (ERA-I) data and it was found that at 0.25° resolution, there was an overall increase in the large-scale component and an associated decrease in the convective component of rainfall as compared to the standard model resolution. Analysis of the diurnal cycle of rainfall suggests a significant improvement in the phase characteristics simulated by the UHR model as compared to the standard model resolution. Analysis of the annual cycle of rainfall, however, failed to show any significant improvement in the UHR model as compared to the standard version. Surface temperature analysis showed small improvements in the UHR model simulations as compared to the standard version. Thus, one may conclude that there are some significant improvements in the ISM simulations using a 0.25° global model, although there is still plenty of scope for further improvement in certain aspects of the annual cycle of rainfall.
Yang, Junyuan; Martcheva, Maia; Wang, Lin
2015-10-01
Vaccination is the most effective method of preventing the spread of infectious diseases. For many diseases, vaccine-induced immunity is not life long and the duration of immunity is not always fixed. In this paper, we propose an SIVS model taking the waning of vaccine-induced immunity and general nonlinear incidence into consideration. Our analysis shows that the model exhibits global threshold dynamics in the sense that if the basic reproduction number is less than 1, then the disease-free equilibrium is globally asymptotically stable implying the disease dies out; while if the basic reproduction number is larger than 1, then the endemic equilibrium is globally asymptotically stable indicating that the disease persists. This global threshold result indicates that if the vaccination coverage rate is below a critical value, then the disease always persists and only if the vaccination coverage rate is above the critical value, the disease can be eradicated. Copyright © 2015 Elsevier Inc. All rights reserved.
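The threshold behaviour described in the abstract can be reproduced numerically with a toy SIVS system. The equations below use a simple bilinear incidence and made-up rates, so they are an illustrative assumption rather than the paper's general nonlinear-incidence model; the point is only that the infectious fraction decays to zero when the basic reproduction number is below 1.

```python
# Toy SIVS sketch (bilinear incidence, constant rates) illustrating the R0 threshold.
import numpy as np
from scipy.integrate import solve_ivp

mu, beta, gamma, phi, theta = 0.02, 0.4, 0.2, 0.3, 0.1   # invented rates
S_dfe = (mu + theta) / (mu + theta + phi)                 # disease-free susceptible fraction
R0 = beta * S_dfe / (mu + gamma)
print(f"basic reproduction number R0 = {R0:.2f}")

def sivs(t, y):
    S, I, V = y
    dS = mu - beta * S * I - (mu + phi) * S + theta * V + gamma * I
    dI = beta * S * I - (mu + gamma) * I
    dV = phi * S - (mu + theta) * V
    return [dS, dI, dV]

sol = solve_ivp(sivs, (0, 2000), [0.7, 0.01, 0.29], rtol=1e-8)
print(f"final infectious fraction: {sol.y[1, -1]:.4f}")   # tends to 0 when R0 < 1
```

Raising the transmission rate beta (or lowering the vaccination rate phi) until R0 exceeds 1 makes the same integration settle on a positive endemic level, mirroring the critical vaccination coverage interpretation given above.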
Disentangling climatic and anthropogenic controls on global terrestrial evapotranspiration trends
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mao, Jiafu; Shi, Xiaoying; Ricciuto, Daniel M.
Here, we examined natural and anthropogenic controls on terrestrial evapotranspiration (ET) changes from 1982 to 2010 using multiple estimates from remote sensing-based datasets and process-oriented land surface models. A significant increasing trend of ET in each hemisphere was consistently revealed by observationally-constrained data and multi-model ensembles that considered historic natural and anthropogenic drivers. The climate impacts were simulated to determine the spatiotemporal variations in ET. Globally, rising CO2 ranked second in these models after the predominant climatic influences, and yielded a decreasing trend in canopy transpiration and ET, especially for tropical forests and high-latitude shrubland. Increased nitrogen deposition slightly amplified global ET via enhanced plant growth. Land-use-induced ET responses, albeit with substantial uncertainties across the factorial analysis, were minor globally, but pronounced locally, particularly over regions with intensive land-cover changes. Our study highlights the importance of employing multi-stream ET and ET-component estimates to quantify the strengthening anthropogenic fingerprint in the global hydrologic cycle.
Disentangling climatic and anthropogenic controls on global terrestrial evapotranspiration trends
Mao, Jiafu; Shi, Xiaoying; Ricciuto, Daniel M.; ...
2015-09-08
Here, we examined natural and anthropogenic controls on terrestrial evapotranspiration (ET) changes from 1982 to 2010 using multiple estimates from remote sensing-based datasets and process-oriented land surface models. A significant increasing trend of ET in each hemisphere was consistently revealed by observationally-constrained data and multi-model ensembles that considered historic natural and anthropogenic drivers. The climate impacts were simulated to determine the spatiotemporal variations in ET. Globally, rising CO2 ranked second in these models after the predominant climatic influences, and yielded a decreasing trend in canopy transpiration and ET, especially for tropical forests and high-latitude shrubland. Increased nitrogen deposition slightly amplified global ET via enhanced plant growth. Land-use-induced ET responses, albeit with substantial uncertainties across the factorial analysis, were minor globally, but pronounced locally, particularly over regions with intensive land-cover changes. Our study highlights the importance of employing multi-stream ET and ET-component estimates to quantify the strengthening anthropogenic fingerprint in the global hydrologic cycle.
Sweetapple, Christine; Fu, Guangtao; Butler, David
2013-09-01
This study investigates sources of uncertainty in the modelling of greenhouse gas emissions from wastewater treatment, through the use of local and global sensitivity analysis tools, and contributes to an in-depth understanding of wastewater treatment modelling by revealing critical parameters and parameter interactions. One-factor-at-a-time sensitivity analysis is used to screen model parameters and identify those with significant individual effects on three performance indicators: total greenhouse gas emissions, effluent quality and operational cost. Sobol's method enables identification of parameters with significant higher order effects and of particular parameter pairs to which model outputs are sensitive. Use of a variance-based global sensitivity analysis tool to investigate parameter interactions enables identification of important parameters not revealed in one-factor-at-a-time sensitivity analysis. These interaction effects have not been considered in previous studies and thus provide a better understanding of wastewater treatment plant model characterisation. It was found that uncertainty in modelled nitrous oxide emissions is the primary contributor to uncertainty in total greenhouse gas emissions, due largely to the interaction effects of three nitrogen conversion modelling parameters. The higher order effects of these parameters are also shown to be a key source of uncertainty in effluent quality. Copyright © 2013 Elsevier Ltd. All rights reserved.
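A hedged sketch of the one-factor-at-a-time screening step is given below: each parameter is perturbed individually about a baseline and the relative change in a performance indicator is recorded. The surrogate indicator function and the parameter names are stand-ins, not the study's activated-sludge plant model.

```python
# One-factor-at-a-time (OAT) screening sketch with an invented surrogate model.
import numpy as np

baseline = {"nitrification_rate": 0.8, "n2o_yield": 0.005, "aeration_energy": 1.2}

def indicator(p):
    # toy surrogate for total greenhouse gas emissions (arbitrary units)
    return 0.3 * p["aeration_energy"] + 300.0 * p["n2o_yield"] / p["nitrification_rate"]

base_out = indicator(baseline)
for name, value in baseline.items():
    perturbed = dict(baseline, **{name: 1.1 * value})        # +10 % perturbation
    effect = (indicator(perturbed) - base_out) / base_out
    print(f"{name:20s} +10% -> {100 * effect:+.1f}% change in emissions")
```

OAT screening of this kind only captures individual effects; the Sobol step described above is what exposes the parameter-pair interactions that dominate the nitrous oxide contribution.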
Designing novel cellulase systems through agent-based modeling and global sensitivity analysis.
Apte, Advait A; Senger, Ryan S; Fong, Stephen S
2014-01-01
Experimental techniques allow engineering of biological systems to modify functionality; however, there still remains a need to develop tools to prioritize targets for modification. In this study, agent-based modeling (ABM) was used to build stochastic models of complexed and non-complexed cellulose hydrolysis, including enzymatic mechanisms for endoglucanase, exoglucanase, and β-glucosidase activity. Modeling results were consistent with experimental observations of higher efficiency in complexed systems than non-complexed systems and established relationships between specific cellulolytic mechanisms and overall efficiency. Global sensitivity analysis (GSA) of model results identified key parameters for improving overall cellulose hydrolysis efficiency including: (1) the cellulase half-life, (2) the exoglucanase activity, and (3) the cellulase composition. Overall, the following parameters were found to significantly influence cellulose consumption in a consolidated bioprocess (CBP): (1) the glucose uptake rate of the culture, (2) the bacterial cell concentration, and (3) the nature of the cellulase enzyme system (complexed or non-complexed). Broadly, these results demonstrate the utility of combining modeling and sensitivity analysis to identify key parameters and/or targets for experimental improvement.
Designing novel cellulase systems through agent-based modeling and global sensitivity analysis
Apte, Advait A; Senger, Ryan S; Fong, Stephen S
2014-01-01
Experimental techniques allow engineering of biological systems to modify functionality; however, there still remains a need to develop tools to prioritize targets for modification. In this study, agent-based modeling (ABM) was used to build stochastic models of complexed and non-complexed cellulose hydrolysis, including enzymatic mechanisms for endoglucanase, exoglucanase, and β-glucosidase activity. Modeling results were consistent with experimental observations of higher efficiency in complexed systems than non-complexed systems and established relationships between specific cellulolytic mechanisms and overall efficiency. Global sensitivity analysis (GSA) of model results identified key parameters for improving overall cellulose hydrolysis efficiency including: (1) the cellulase half-life, (2) the exoglucanase activity, and (3) the cellulase composition. Overall, the following parameters were found to significantly influence cellulose consumption in a consolidated bioprocess (CBP): (1) the glucose uptake rate of the culture, (2) the bacterial cell concentration, and (3) the nature of the cellulase enzyme system (complexed or non-complexed). Broadly, these results demonstrate the utility of combining modeling and sensitivity analysis to identify key parameters and/or targets for experimental improvement. PMID:24830736
Reduction and Uncertainty Analysis of Chemical Mechanisms Based on Local and Global Sensitivities
NASA Astrophysics Data System (ADS)
Esposito, Gaetano
Numerical simulations of critical reacting flow phenomena in hypersonic propulsion devices require accurate representation of finite-rate chemical kinetics. The chemical kinetic models available for hydrocarbon fuel combustion are rather large, involving hundreds of species and thousands of reactions. As a consequence, they cannot be used in multi-dimensional computational fluid dynamic calculations in the foreseeable future due to the prohibitive computational cost. In addition to the computational difficulties, it is also known that some fundamental chemical kinetic parameters of detailed models have a significant level of uncertainty, due to the limited experimental data available and to poor understanding of interactions among kinetic parameters. In the present investigation, local and global sensitivity analysis techniques are employed to develop a systematic approach to reducing and analyzing detailed chemical kinetic models. Unlike previous studies in which skeletal model reduction was based on the separate analysis of simple cases, in this work a novel strategy based on Principal Component Analysis of local sensitivity values is presented. This new approach is capable of simultaneously taking into account all the relevant canonical combustion configurations over different composition, temperature and pressure conditions. Moreover, the procedure developed in this work represents the first documented inclusion of non-premixed extinction phenomena, which is of great relevance in hypersonic combustors, in an automated reduction algorithm. The application of the skeletal reduction to a detailed kinetic model consisting of 111 species in 784 reactions is demonstrated. The resulting reduced skeletal model of 37-38 species shows that the global ignition/propagation/extinction phenomena of ethylene-air mixtures can be predicted within 2% of the full detailed model. The problems of both understanding non-linear interactions between kinetic parameters and identifying sources of uncertainty affecting relevant reaction pathways are usually addressed by resorting to Global Sensitivity Analysis (GSA) techniques. In particular, the most sensitive reactions controlling combustion phenomena are first identified using the Morris Method and then analyzed under the Random Sampling - High Dimensional Model Representation (RS-HDMR) framework. The HDMR decomposition shows that 10% of the variance seen in the extinction strain rate of non-premixed flames is due to second-order effects between parameters, whereas the maximum concentration of acetylene, a key soot precursor, is affected mostly by first-order contributions. Moreover, the analysis of the global sensitivity indices demonstrates that improving the accuracy of the reaction rates involving the vinyl radical, C2H3, can drastically reduce the uncertainty in predicting targeted flame properties. Finally, the back-propagation of the experimental uncertainty of the extinction strain rate to the parameter space is also performed. This exercise, achieved by recycling the numerical solutions of the RS-HDMR, shows that some regions of the parameter space have a high probability of reproducing the experimental value of the extinction strain rate within its uncertainty bounds. Therefore this study demonstrates that the uncertainty analysis of bulk flame properties can effectively provide information on relevant chemical reactions.
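The reduction strategy built on Principal Component Analysis of local sensitivity values can be sketched as follows: assemble a targets-by-reactions matrix of sensitivity coefficients and inspect the leading singular directions. The 6x4 matrix here is random stand-in data, not an actual kinetic mechanism.

```python
# PCA (via SVD) of a local-sensitivity matrix; rows are combustion targets,
# columns are reactions. Reactions with large loadings on the leading
# components would be retained in a skeletal mechanism.
import numpy as np

rng = np.random.default_rng(1)
S = rng.normal(size=(6, 4))              # stand-in sensitivity coefficients

Sc = S - S.mean(axis=0)                  # centre each reaction's column
U, sigma, Vt = np.linalg.svd(Sc, full_matrices=False)
explained = sigma**2 / np.sum(sigma**2)

print("variance explained per principal component:", np.round(explained, 2))
print("loadings of PC1 on each reaction:", np.round(Vt[0], 2))
```

In the actual workflow the rows would span the different canonical configurations (ignition, propagation, extinction) and conditions simultaneously, which is what distinguishes this approach from case-by-case reduction.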
Modelling and analysis of creep deformation and fracture in a 1 Cr 1/2 Mo ferritic steel
NASA Astrophysics Data System (ADS)
Dyson, B. F.; Osgerby, D.
A quantitative model, based upon a proposed new mechanism of creep deformation in particle-hardened alloys, has been validated by analysis of creep data from a 13CrMo 4 4 (1Cr 1/2 Mo) material tested under a range of stresses and temperatures. The methodology that has been used to extract the model parameters quantifies, as a first approximation, only the main degradation (damage) processes - in the case of the 1Cr 1/2 Mo steel, these are considered to be the parallel operation of particle-coarsening and a progressively increasing stress due to a constant-load boundary condition. These 'global' model parameters can then be modified (only slightly) as required to obtain a detailed description and 'fit' to the rupture lifetime and strain/time trajectory of any individual test. The global model parameter approach may be thought of as predicting average behavior and the detailed fits as taking account of uncertainties (scatter) due to variability in the material. Using the global parameter dataset, predictions have also been made of behavior under biaxial stressing; constant straining rate; constant total strain (stress relaxation) and the likely success or otherwise of metallographic and mechanical remanent lifetime procedures.
Symmetry analysis for hyperbolic equilibria using a TB/dengue fever model
NASA Astrophysics Data System (ADS)
Massoukou, R. Y. M.'Pika; Govinder, K. S.
2016-08-01
We investigate the interplay between Lie symmetry analysis and dynamical systems analysis. As an example, we take a toy model describing the spread of TB and dengue fever. We first undertake a comprehensive dynamical systems analysis including a discussion of local stability. For those regions in which such analyses cannot be translated to global behavior, we undertake a Lie symmetry analysis. It is shown that the Lie analysis can be useful in providing information for systems where the (local) dynamical systems analysis breaks down.
NASA/MSFC FY88 Global Scale Atmospheric Processes Research Program Review
NASA Technical Reports Server (NTRS)
Wilson, Greg S. (Editor); Leslie, Fred W. (Editor); Arnold, J. E. (Editor)
1989-01-01
Interest in environmental issues and the magnitude of environmental changes continues. One way to gain more understanding of the atmosphere is to make measurements on a global scale from space. The Earth Observing System is a series of new sensors to measure atmospheric parameters globally. Analysis of satellite data, through the development of algorithms to interpret radiance information, improves this understanding and also defines requirements for these sensors. One measure of knowledge of the atmosphere lies in the ability to predict its behavior. Use of numerical and experimental models provides a better understanding of these processes. These efforts are described in the context of satellite data analysis and fundamental studies of atmospheric dynamics which examine selected processes important to the global circulation.
Assessing Climate Change Risks Using a Multi-Model Approach
NASA Astrophysics Data System (ADS)
Knorr, W.; Scholze, M.; Prentice, C.
2007-12-01
We quantify the risks of climate-induced changes in key ecosystem processes during the 21st century by forcing a dynamic global vegetation model with multiple scenarios from the IPCC AR4 data archive using 16 climate models and mapping the proportions of model runs showing exceedance of natural variability in wildfire frequency and freshwater supply or shifts in vegetation cover. Our analysis does not assign probabilities to scenarios. Instead, we consider the distribution of outcomes within three sets of model runs grouped according to the amount of global warming they simulate: < 2 degree C (including committed climate change simulations), 2-3 degree C, and >3 degree C. Here, we contrast two different methods for calculating the risks: first, an equal-weighting approach that gives every model within one of the three sets the same weight, and second, a weighting of the models according to their ability to simulate ENSO. The differences underpin the need for the development of more robust performance metrics for global climate models.
Localized Principal Component Analysis based Curve Evolution: A Divide and Conquer Approach
Appia, Vikram; Ganapathy, Balaji; Yezzi, Anthony; Faber, Tracy
2014-01-01
We propose a novel localized principal component analysis (PCA) based curve evolution approach which evolves the segmenting curve semi-locally within various target regions (divisions) in an image and then combines these locally accurate segmentation curves to obtain a global segmentation. The training data for our approach consists of training shapes and associated auxiliary (target) masks. The masks indicate the various regions of the shape exhibiting highly correlated variations locally which may be rather independent of the variations in the distant parts of the global shape. Thus, in a sense, we are clustering the variations exhibited in the training data set. We then use a parametric model to implicitly represent each localized segmentation curve as a combination of the local shape priors obtained by representing the training shapes and the masks as a collection of signed distance functions. We also propose a parametric model to combine the locally evolved segmentation curves into a single hybrid (global) segmentation. Finally, we combine the evolution of these semilocal and global parameters to minimize an objective energy function. The resulting algorithm thus provides a globally accurate solution, which retains the local variations in shape. We present some results to illustrate how our approach performs better than the traditional approach with fully global PCA. PMID:25520901
Downscaling global precipitation for local applications - a case for the Rhine basin
NASA Astrophysics Data System (ADS)
Sperna Weiland, Frederiek; van Verseveld, Willem; Schellekens, Jaap
2017-04-01
Within the EU FP7 project eartH2Observe a global Water Resources Re-analysis (WRR) is being developed. This re-analysis consists of meteorological and hydrological water balance variables with global coverage, spanning the period 1979-2014 at 0.25 degrees resolution (Schellekens et al., 2016). The dataset can be of special interest in regions with limited in-situ data availability, yet for local-scale analysis, particularly in mountainous regions, a resolution of 0.25 degrees may be too coarse and downscaling the data to a higher resolution may be required. A downscaling toolbox has been made that includes spatial downscaling of precipitation based on the global WorldClim dataset, which is available at 1 km resolution as a monthly climatology (Hijmans et al., 2005). The inputs to the downscaling tool are either the global eartH2Observe WRR1 and WRR2 datasets based on the WFDEI correction methodology (Weedon et al., 2014) or the global Multi-Source Weighted-Ensemble Precipitation (MSWEP) dataset (Beck et al., 2016). Here we present a validation of the datasets over the Rhine catchment by means of a distributed hydrological model (wflow, Schellekens et al., 2014) using a number of precipitation scenarios. (1) We start by running the model using the local reference dataset derived by spatial interpolation of gauge observations. Furthermore we use (2) the MSWEP dataset at the native 0.25-degree resolution, followed by (3) MSWEP downscaled with the WorldClim dataset and finally (4) MSWEP downscaled with the local reference dataset. The validation will be based on comparison of the modeled river discharges as well as rainfall statistics. We expect that downscaling the MSWEP dataset with the WorldClim data to higher resolution will increase its performance. To test the performance of the downscaling routine we have added a run with MSWEP data downscaled with the local dataset and compare this with the run based on the local dataset itself. - Beck, H. E. et al., 2016. MSWEP: 3-hourly 0.25° global gridded precipitation (1979-2015) by merging gauge, satellite, and reanalysis data, Hydrol. Earth Syst. Sci. Discuss., doi:10.5194/hess-2016-236, accepted for final publication. - Hijmans, R.J. et al., 2005. Very high resolution interpolated climate surfaces for global land areas. International Journal of Climatology 25: 1965-1978. - Schellekens, J. et al., 2016. A global water resources ensemble of hydrological models: the eartH2Observe Tier-1 dataset, Earth Syst. Sci. Data Discuss., doi:10.5194/essd-2016-55, under review. - Schellekens, J. et al., 2014. Rapid setup of hydrological and hydraulic models using OpenStreetMap and the SRTM derived digital elevation model. Environmental Modelling & Software. - Weedon, G.P. et al., 2014. The WFDEI meteorological forcing data set: WATCH Forcing Data methodology applied to ERA-Interim reanalysis data. Water Resources Research, 50, doi:10.1002/2014WR015638.
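A minimal sketch of one plausible form of the downscaling step, multiplicative redistribution of a coarse monthly field using a high-resolution climatology, is shown below. The grids and refinement factor are synthetic assumptions; the eartH2Observe toolbox itself may implement the scaling differently.

```python
# Multiplicative precipitation downscaling: redistribute each coarse cell's
# monthly total according to a high-resolution climatology, preserving the
# coarse-cell mean. All fields are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(3)
coarse = rng.gamma(2.0, 40.0, size=(4, 4))        # 0.25-degree monthly precip (mm)
clim_hr = rng.gamma(2.0, 40.0, size=(16, 16))     # WorldClim-like 1-km climatology
factor = 4                                        # refinement ratio

# coarse-cell mean of the high-resolution climatology
clim_coarse = clim_hr.reshape(4, factor, 4, factor).mean(axis=(1, 3))

scaling = clim_hr / np.kron(clim_coarse, np.ones((factor, factor)))
precip_hr = np.kron(coarse, np.ones((factor, factor))) * scaling

# mass check: each coarse cell's mean is preserved exactly
assert np.allclose(precip_hr.reshape(4, factor, 4, factor).mean(axis=(1, 3)), coarse)
print(precip_hr.shape)
```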
2017-01-01
The input-output table is comprehensive and detailed in describing the national economic system with complex economic relationships, which embodies information of supply and demand among industrial sectors. This paper aims to scale the degree of competition/collaboration on the global value chain from the perspective of econophysics. Global Industrial Strongest Relevant Network models were established by extracting the strongest and most immediate industrial relevance in the global economic system with inter-country input-output tables and then transformed into Global Industrial Resource Competition Network/Global Industrial Production Collaboration Network models embodying the competitive/collaborative relationships based on bibliographic coupling/co-citation approach. Three indicators well suited for these two kinds of weighted and non-directed networks with self-loops were introduced, including unit weight for competitive/collaborative power, disparity in the weight for competitive/collaborative amplitude and weighted clustering coefficient for competitive/collaborative intensity. Finally, these models and indicators were further applied to empirically analyze the function of sectors in the latest World Input-Output Database, to reveal inter-sector competitive/collaborative status during the economic globalization. PMID:28873432
Xing, Lizhi
2017-01-01
The input-output table is comprehensive and detailed in describing the national economic system with complex economic relationships, which embodies information of supply and demand among industrial sectors. This paper aims to scale the degree of competition/collaboration on the global value chain from the perspective of econophysics. Global Industrial Strongest Relevant Network models were established by extracting the strongest and most immediate industrial relevance in the global economic system with inter-country input-output tables and then transformed into Global Industrial Resource Competition Network/Global Industrial Production Collaboration Network models embodying the competitive/collaborative relationships based on bibliographic coupling/co-citation approach. Three indicators well suited for these two kinds of weighted and non-directed networks with self-loops were introduced, including unit weight for competitive/collaborative power, disparity in the weight for competitive/collaborative amplitude and weighted clustering coefficient for competitive/collaborative intensity. Finally, these models and indicators were further applied to empirically analyze the function of sectors in the latest World Input-Output Database, to reveal inter-sector competitive/collaborative status during the economic globalization.
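Two of the indicators mentioned above, unit weight (strength per link) and disparity of link weights, can be computed directly from a weighted adjacency matrix. The 4-sector matrix below is invented, and the exact definitions used in the paper (for example the treatment of self-loops) may differ from this plain sketch.

```python
# Plain-NumPy sketch of per-sector indicators on a weighted, undirected
# network with self-loops (diagonal entries).
import numpy as np

W = np.array([[2.0, 1.0, 0.0, 0.5],
              [1.0, 1.0, 3.0, 0.0],
              [0.0, 3.0, 0.5, 2.0],
              [0.5, 0.0, 2.0, 1.0]])      # symmetric stand-in weight matrix

strength = W.sum(axis=1)                                   # total link weight per sector
degree = (W > 0).sum(axis=1)                               # number of links (incl. self-loop)
unit_weight = strength / degree                            # proxy for competitive/collaborative power
disparity = ((W / strength[:, None]) ** 2).sum(axis=1)     # Y_i: concentration of link weights

for i in range(len(W)):
    print(f"sector {i}: unit weight {unit_weight[i]:.2f}, disparity {disparity[i]:.2f}")
```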
Le Pogam, Adrien; Hatt, Mathieu; Descourt, Patrice; Boussion, Nicolas; Tsoumpas, Charalampos; Turkheimer, Federico E; Prunier-Aesch, Caroline; Baulieu, Jean-Louis; Guilloteau, Denis; Visvikis, Dimitris
2011-09-01
Partial volume effects (PVEs) are consequences of the limited spatial resolution in emission tomography leading to underestimation of uptake in tissues of size similar to the point spread function (PSF) of the scanner as well as activity spillover between adjacent structures. Among PVE correction methodologies, a voxel-wise mutual multiresolution analysis (MMA) was recently introduced. MMA is based on the extraction and transformation of high resolution details from an anatomical image (MR/CT) and their subsequent incorporation into a low-resolution PET image using wavelet decompositions. Although this method allows creating PVE corrected images, it is based on a 2D global correlation model, which may introduce artifacts in regions where no significant correlation exists between anatomical and functional details. A new model was designed to overcome these two issues (2D only and global correlation) using a 3D wavelet decomposition process combined with a local analysis. The algorithm was evaluated on synthetic, simulated and patient images, and its performance was compared to the original approach as well as the geometric transfer matrix (GTM) method. Quantitative performance was similar to the 2D global model and GTM in correlated cases. In cases where mismatches between anatomical and functional information were present, the new model outperformed the 2D global approach, avoiding artifacts and significantly improving quality of the corrected images and their quantitative accuracy. A new 3D local model was proposed for a voxel-wise PVE correction based on the original mutual multiresolution analysis approach. Its evaluation demonstrated an improved and more robust qualitative and quantitative accuracy compared to the original MMA methodology, particularly in the absence of full correlation between anatomical and functional information.
Le Pogam, Adrien; Hatt, Mathieu; Descourt, Patrice; Boussion, Nicolas; Tsoumpas, Charalampos; Turkheimer, Federico E.; Prunier-Aesch, Caroline; Baulieu, Jean-Louis; Guilloteau, Denis; Visvikis, Dimitris
2011-01-01
Purpose Partial volume effects (PVE) are consequences of the limited spatial resolution in emission tomography leading to under-estimation of uptake in tissues of size similar to the point spread function (PSF) of the scanner as well as activity spillover between adjacent structures. Among PVE correction methodologies, a voxel-wise mutual multi-resolution analysis (MMA) was recently introduced. MMA is based on the extraction and transformation of high resolution details from an anatomical image (MR/CT) and their subsequent incorporation into a low resolution PET image using wavelet decompositions. Although this method allows creating PVE corrected images, it is based on a 2D global correlation model which may introduce artefacts in regions where no significant correlation exists between anatomical and functional details. Methods A new model was designed to overcome these two issues (2D only and global correlation) using a 3D wavelet decomposition process combined with a local analysis. The algorithm was evaluated on synthetic, simulated and patient images, and its performance was compared to the original approach as well as the geometric transfer matrix (GTM) method. Results Quantitative performance was similar to the 2D global model and GTM in correlated cases. In cases where mismatches between anatomical and functional information were present the new model outperformed the 2D global approach, avoiding artefacts and significantly improving quality of the corrected images and their quantitative accuracy. Conclusions A new 3D local model was proposed for a voxel-wise PVE correction based on the original mutual multi-resolution analysis approach. Its evaluation demonstrated an improved and more robust qualitative and quantitative accuracy compared to the original MMA methodology, particularly in the absence of full correlation between anatomical and functional information. PMID:21978037
Analyzing Regional Climate Change in Africa in a 1.5, 2, and 3°C Global Warming World
NASA Astrophysics Data System (ADS)
Weber, T.; Haensler, A.; Rechid, D.; Pfeifer, S.; Eggert, B.; Jacob, D.
2018-04-01
At the 21st session of the United Nations Framework Convention on Climate Change Conference of the Parties (COP21) in Paris, an agreement to strengthen the effort to limit the global temperature increase to well below 2°C was reached. However, even if global warming is limited, some regions might still be substantially affected by climate change, especially continents like Africa where socio-economic conditions are strongly linked to climatic conditions. In this paper we discuss the analysis of indices assigned to the health, agriculture, and infrastructure sectors in a 1.5, 2, and 3°C global warming world for the African continent. For this analysis an ensemble of 10 different general circulation model-regional climate model simulations conducted in the framework of the COordinated Downscaling EXperiment for Africa was investigated. The results show that the African continent, in particular the regions between 15°S and 15°N, has to expect an increase in hot nights and longer and more frequent heat waves even if the global temperature is kept below 2°C. These effects intensify if the global mean temperature exceeds the 2°C threshold. Moreover, the daily rainfall intensity is expected to increase toward higher global warming scenarios and will especially affect the sub-Saharan African coastal regions.
Global-constrained hidden Markov model applied on wireless capsule endoscopy video segmentation
NASA Astrophysics Data System (ADS)
Wan, Yiwen; Duraisamy, Prakash; Alam, Mohammad S.; Buckles, Bill
2012-06-01
Accurate analysis of wireless capsule endoscopy (WCE) videos is vital but tedious. Automatic image analysis can expedite this task. Video segmentation of WCE into the four parts of the gastrointestinal tract is one way to assist a physician. The segmentation approach described in this paper integrates pattern recognition with statistical analysis. Initially, a support vector machine is applied to classify video frames into four classes using a combination of multiple color and texture features as the feature vector. A Poisson cumulative distribution, for which the parameter depends on the length of segments, models prior knowledge. This prior knowledge, together with inter-frame differences, serves as the global constraint driven by the underlying observation of each WCE video, which is fitted by a Gaussian distribution to constrain the transition probability of the hidden Markov model. Experimental results demonstrate the effectiveness of the approach.
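The decoding step of such a segmentation can be illustrated with a standard Viterbi pass over a left-to-right hidden Markov model. In the sketch below, the per-frame emission scores that would come from the SVM colour-texture classifier are replaced by random values, and the simple stay-or-advance transition matrix stands in for the Poisson/Gaussian global constraints; everything here is an illustrative assumption.

```python
# Viterbi decoding for a 4-state, left-to-right HMM (toy stand-in for
# segmenting a WCE video into gastrointestinal sections).
import numpy as np

rng = np.random.default_rng(7)
n_states, n_frames = 4, 200
log_emis = np.log(rng.dirichlet(np.ones(n_states), size=n_frames))  # fake classifier scores

# left-to-right transitions: stay in the current section or advance to the next
A = np.full((n_states, n_states), -np.inf)
for s in range(n_states):
    A[s, s] = np.log(0.98)
    if s + 1 < n_states:
        A[s, s + 1] = np.log(0.02)

delta = np.full((n_frames, n_states), -np.inf)
psi = np.zeros((n_frames, n_states), dtype=int)
delta[0, 0] = log_emis[0, 0]                       # the video starts in section 0
for t in range(1, n_frames):
    scores = delta[t - 1][:, None] + A             # rows: previous state, cols: next state
    psi[t] = scores.argmax(axis=0)
    delta[t] = scores.max(axis=0) + log_emis[t]

path = [int(delta[-1].argmax())]
for t in range(n_frames - 1, 0, -1):
    path.append(int(psi[t, path[-1]]))
segmentation = path[::-1]
print("section boundaries at frames:", np.flatnonzero(np.diff(segmentation)) + 1)
```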
International Management: Creating a More Realistic Global Planning Environment.
ERIC Educational Resources Information Center
Waldron, Darryl G.
2000-01-01
Discusses the need for realistic global planning environments in international business education, introducing a strategic planning model that has teams interacting with teams to strategically analyze a selected multinational company. This dynamic process must result in a single integrated written analysis that specifies an optimal strategy for…
NASA Astrophysics Data System (ADS)
Döpking, Sandra; Plaisance, Craig P.; Strobusch, Daniel; Reuter, Karsten; Scheurer, Christoph; Matera, Sebastian
2018-01-01
In the last decade, first-principles-based microkinetic modeling has been developed into an important tool for a mechanistic understanding of heterogeneous catalysis. A commonly known, but hitherto barely analyzed issue in this kind of modeling is the presence of sizable errors from the use of approximate Density Functional Theory (DFT). We here address the propagation of these errors to the catalytic turnover frequency (TOF) by global sensitivity and uncertainty analysis. Both analyses require the numerical quadrature of high-dimensional integrals. To achieve this efficiently, we utilize and extend an adaptive sparse grid approach and exploit the confinement of the strongly non-linear behavior of the TOF to local regions of the parameter space. We demonstrate the methodology on a model of the oxygen evolution reaction at the Co3O4 (110)-A surface, using a maximum entropy error model that imposes nothing but reasonable bounds on the errors. For this setting, the DFT errors lead to an absolute uncertainty of several orders of magnitude in the TOF. We nevertheless find that it is still possible to draw conclusions from such uncertain models about the atomistic aspects controlling the reactivity. A comparison with derivative-based local sensitivity analysis instead reveals that this more established approach provides incomplete information. Since the adaptive sparse grids allow for the evaluation of the integrals with only a modest number of function evaluations, this approach opens the way for a global sensitivity analysis of more complex models, for instance, models based on kinetic Monte Carlo simulations.
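The propagation of bounded DFT errors to an uncertainty of several orders of magnitude in the TOF can be mimicked with a plain Monte Carlo sketch; note that this substitutes simple random sampling for the adaptive sparse grids used in the paper, and the two-step Arrhenius "model", the barrier values and the ±0.2 eV bound are all invented for illustration.

```python
# Monte Carlo propagation of bounded (maximum-entropy, i.e. uniform) errors on
# two activation energies through a toy rate expression standing in for the TOF.
import numpy as np

rng = np.random.default_rng(11)
kB_T = 0.0257 * 600.0 / 300.0          # thermal energy in eV at ~600 K
E1, E2 = 0.9, 1.1                      # nominal barriers (eV), invented
bound = 0.2                            # assumed +/- error bound per barrier (eV)

n = 100_000
err = rng.uniform(-bound, bound, size=(n, 2))
# toy TOF: two steps in series, so the largest effective barrier dominates
tof = np.exp(-np.maximum(E1 + err[:, 0], E2 + err[:, 1]) / kB_T)

lo, hi = np.percentile(tof, [2.5, 97.5])
print(f"TOF spans {np.log10(hi / lo):.1f} orders of magnitude (95% interval)")
```

The adaptive sparse grids in the study achieve the same kind of integral over the error distribution with far fewer model evaluations, which is what makes the analysis tractable for expensive microkinetic or kinetic Monte Carlo models.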
NASA Technical Reports Server (NTRS)
Koster, Randal D. (Editor); Bosilovich, Michael G.; Akella, Santha; Lawrence, Coy; Cullather, Richard; Draper, Clara; Gelaro, Ronald; Kovach, Robin; Liu, Qing; Molod, Andrea;
2015-01-01
The years since the introduction of MERRA have seen numerous advances in the GEOS-5 Data Assimilation System as well as a substantial decrease in the number of observations that can be assimilated into the MERRA system. To allow continued data processing into the future, and to take advantage of several important innovations that could improve system performance, a decision was made to produce MERRA-2, an updated retrospective analysis of the full modern satellite era. One of the many advances in MERRA-2 is a constraint on the global dry mass balance; this allows the global changes in water by the analysis increment to be near zero, thereby minimizing abrupt global interannual variations due to changes in the observing system. In addition, MERRA-2 includes the assimilation of interactive aerosols into the system, a feature of the Earth system absent from previous reanalyses. Also, in an effort to improve land surface hydrology, observations-corrected precipitation forcing is used instead of model-generated precipitation. Overall, MERRA-2 takes advantage of numerous updates to the global modeling and data assimilation system. In this document, we summarize an initial evaluation of the climate in MERRA-2, from the surface to the stratosphere and from the tropics to the poles. Strengths and weaknesses of the MERRA-2 climate are accordingly emphasized.
Simplifying global biogeochemistry models to evaluate methane emissions
NASA Astrophysics Data System (ADS)
Gerber, S.; Alonso-Contes, C.
2017-12-01
Process-based models are important tools for quantifying wetland methane emissions, particularly under climate change scenarios, but evaluating these models is often cumbersome because they are embedded in larger land-surface models in which the fluctuating water table and the carbon cycle (including new, readily decomposable plant material) are predicted variables. Here, we build on these large-scale models, but instead of modeling water table and plant productivity we provide their values as boundary conditions. In contrast, aerobic and anaerobic decomposition, as well as soil-column transport of oxygen and methane, are predicted by the model. Because of these simplifications, the model has the potential to be more readily adaptable to the analysis of field-scale data. Here we determine the sensitivity of the model to specific setups, parameter choices, and boundary conditions in order to determine set-up needs and inform which critical auxiliary variables need to be measured to better predict field-scale methane emissions from wetland soils. To that end we performed a global sensitivity analysis that also considers non-linear interactions between processes. The global sensitivity analysis revealed, not surprisingly, that water table dynamics (both mean level and amplitude of fluctuations) and the rate of the carbon cycle (i.e. net primary productivity) are critical determinants of methane emissions. The depth scale over which most of the potential decomposition occurs also affects methane emissions. Different transport mechanisms compensate for each other to some degree: if plant conduits are constrained, methane emissions by diffusive flux and ebullition partly compensate; however, annual emissions are higher when plants help methane bypass methanotrophs in temporarily unsaturated upper layers. Finally, while oxygen consumption by plant roots helps create anoxic conditions, it has little effect on overall methane emission. Our initial sensitivity analysis helps guide further model development and improvement. However, an important goal for our model is to use it in field settings as a tool to deconvolve the different processes that contribute to the net transfer of methane from soils to the atmosphere.
Venus Global Reference Atmospheric Model
NASA Technical Reports Server (NTRS)
Justh, Hilary L.
2017-01-01
The Venus Global Reference Atmospheric Model (Venus-GRAM) is an engineering-level atmospheric model developed by MSFC that is widely used for diverse mission applications, including systems design, performance analysis, and operations planning for aerobraking, entry, descent and landing, and aerocapture. It is not a forecast model. Outputs include density, temperature, pressure, wind components, and chemical composition; the model provides dispersions of thermodynamic parameters, winds, and density, and accepts optional trajectory and auxiliary profile input files. Venus-GRAM has been used in multiple studies and proposals, including the NASA Engineering and Safety Center (NESC) Autonomous Aerobraking study and various Discovery proposals. It was released in 2005 and is available at: https://software.nasa.gov/software/MFS-32314-1.
Effect of antibodies on pathogen dynamics with delays and two routes of infection
NASA Astrophysics Data System (ADS)
Elaiw, A. M.; Almatrafi, A. A.; Hobiny, A. D.
2018-06-01
We study the global stability of pathogen dynamics models with saturated pathogen-susceptible and infected-susceptible incidence. The models incorporate antibody immune response and three types of discrete or distributed time delays. We first show that the solutions of the model are nonnegative and ultimately bounded. We determine two threshold parameters, the basic reproduction number and the antibody response activation number. We establish the existence and stability of the steady states. We carry out the global stability analysis of the models using the Lyapunov method. Numerical simulations show that antibodies can reduce pathogen progression.
Remote Sensing Information Science Research
NASA Technical Reports Server (NTRS)
Clarke, Keith C.; Scepan, Joseph; Hemphill, Jeffrey; Herold, Martin; Husak, Gregory; Kline, Karen; Knight, Kevin
2002-01-01
This document is the final report summarizing research conducted by the Remote Sensing Research Unit, Department of Geography, University of California, Santa Barbara under National Aeronautics and Space Administration Research Grant NAG5-10457. This document describes work performed during the period of 1 March 2001 through 30 September 2002. This report includes a survey of research proposed and performed within RSRU and the UCSB Geography Department during the past 25 years. A broad suite of RSRU research conducted under NAG5-10457 is also described under the themes of Applied Research Activities and Information Science Research. This research includes: 1. NASA ESA Research Grant Performance Metrics Reporting. 2. Global Data Set Thematic Accuracy Analysis. 3. ISCGM/Global Map Project Support. 4. Cooperative International Activities. 5. User Model Study of Global Environmental Data Sets. 6. Global Spatial Data Infrastructure. 7. CIESIN Collaboration. 8. On the Value of Coordinating Landsat Operations. 10. The California Marine Protected Areas Database: Compilation and Accuracy Issues. 11. Assessing Landslide Hazard Over a 130-Year Period for La Conchita, California; Remote Sensing and Spatial Metrics for Applied Urban Area Analysis, including: (1) IKONOS Data Processing for Urban Analysis. (2) Image Segmentation and Object Oriented Classification. (3) Spectral Properties of Urban Materials. (4) Spatial Scale in Urban Mapping. (5) Variable Scale Spatial and Temporal Urban Growth Signatures. (6) Interpretation and Verification of SLEUTH Modeling Results. (7) Spatial Land Cover Pattern Analysis for Representing Urban Land Use and Socioeconomic Structures. 12. Colorado River Flood Plain Remote Sensing Study Support. 13. African Rainfall Modeling and Assessment. 14. Remote Sensing and GIS Integration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, Zeli; Leung, L. Ruby; Li, Hongyi
Although sediment yield (SY) from water erosion is ubiquitous and its environmental consequences are well recognized, its impacts on the global carbon cycle remain largely uncertain. This knowledge gap is partly due to the lack of soil erosion modeling in Earth System Models (ESMs), which are important tools used to understand the global carbon cycle and explore its changes. This study analyzed sediment and particulate organic carbon yield (CY) data from 1081 and 38 small catchments (0.1-200 km2), respectively, in different environments across the globe. Using multiple statistical analysis techniques, we explored environmental factors and hydrological processes important for SY and CY modeling in ESMs. Our results show clear correlations of high SY with traditional agriculture, seismicity and heavy storms, as well as strong correlations between SY and annual peak runoff. These highlight the potential limitation of SY models that represent only interrill and rill erosion, because shallow overland flow and rill flow have limited transport capacity, due to their hydraulic geometry, to produce high SY. Further, our results suggest that SY modeling in ESMs should be implemented at the event scale to produce the catastrophic mass transport during episodic events. Several environmental factors such as seismicity and land management that are often not considered in current catchment-scale SY models can be important in controlling global SY. Our analyses show that SY is likely the primary control on CY in small catchments, and a statistically significant empirical relationship is established to calculate SY and CY jointly in ESMs.
NASA Technical Reports Server (NTRS)
Huffman, George J.; Adler, Robert F.; Rudolf, Bruno; Schneider, Udo; Keehn, Peter R.
1995-01-01
The 'satellite-gauge model' (SGM) technique is described for combining precipitation estimates from microwave satellite data, infrared satellite data, rain gauge analyses, and numerical weather prediction models into improved estimates of global precipitation. Throughout, monthly estimates on a 2.5 degrees x 2.5 degrees lat-long grid are employed. First, a multisatellite product is developed using a combination of low-orbit microwave and geosynchronous-orbit infrared data in the latitude range 40 degrees N - 40 degrees S (the adjusted geosynchronous precipitation index) and low-orbit microwave data alone at higher latitudes. Then the rain gauge analysis is brought in, weighting each field by its inverse relative error variance to produce a nearly global, observationally based precipitation estimate. To produce a complete global estimate, the numerical model results are used to fill data voids in the combined satellite-gauge estimate. Our sequential approach to combining estimates allows a user to select the multisatellite estimate, the satellite-gauge estimate, or the full SGM estimate (observationally based estimates plus the model information). The primary limitation in the method is imperfections in the estimation of relative error for the individual fields. The SGM results for one year of data (July 1987 to June 1988) show important differences from the individual estimates, including model estimates as well as climatological estimates. In general, the SGM results are drier in the subtropics than the model and climatological results, reflecting the relatively dry microwave estimates that dominate the SGM in oceanic regions.
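The core combination rule, weighting each field by its inverse relative error variance, amounts to a few lines of NumPy. The fields and error variances below are synthetic stand-ins rather than actual satellite or gauge data.

```python
# Inverse-error-variance weighted combination of two precipitation estimates.
import numpy as np

rng = np.random.default_rng(5)
multisat = rng.gamma(2.0, 50.0, size=(72, 144))                  # multisatellite estimate (mm/month)
gauge = multisat + rng.normal(0.0, 10.0, size=multisat.shape)    # gauge analysis stand-in

var_sat = np.full(multisat.shape, 400.0)     # assumed error variance (mm/month)^2
var_gauge = np.full(multisat.shape, 100.0)   # smaller where gauge density is high, in reality

w_sat, w_gauge = 1.0 / var_sat, 1.0 / var_gauge
combined = (w_sat * multisat + w_gauge * gauge) / (w_sat + w_gauge)
combined_var = 1.0 / (w_sat + w_gauge)       # error variance of the merged field
print(combined.shape, float(combined_var[0, 0]))
```

As the abstract notes, the quality of the merged field hinges on how well the per-field error variances are estimated, since they set the weights.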
Saccomani, Maria Pia; Audoly, Stefania; Bellu, Giuseppina; D'Angiò, Leontina
2010-04-01
DAISY (Differential Algebra for Identifiability of SYstems) is a recently developed computer algebra software tool which can be used to automatically check global identifiability of (linear and) nonlinear dynamic models described by differential equations involving polynomial or rational functions. Global identifiability is a fundamental prerequisite for model identification, which is important not only for biological or medical systems but also for many physical and engineering systems derived from first principles. Lack of identifiability implies that the parameter estimation techniques may not fail, but any obtained numerical estimates will be meaningless. The software does not require understanding of the underlying mathematical principles and can be used by researchers in applied fields with a minimum of mathematical background. We illustrate the DAISY software by checking the a priori global identifiability of two benchmark nonlinear models taken from the literature. The analysis of these two examples includes comparison with other methods and demonstrates how identifiability analysis is simplified by this tool. We then illustrate the identifiability analysis of two further examples, including a discussion of some specific aspects related to the role of observability and knowledge of initial conditions in testing identifiability and to the computational complexity of the software. The main focus of this paper is not on the description of the mathematical background of the algorithm, which has been presented elsewhere, but on illustrating its use and on some of its more interesting features. DAISY is available on the web site http://www.dei.unipd.it/~pia/. 2010 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Kyle, P.; Patel, P.; Calvin, K. V.
2014-12-01
Global integrated assessment models used for understanding the linkages between the future energy, agriculture, and climate systems typically represent between 8 and 30 geopolitical macro-regions, balancing the benefits of geographic resolution with the costs of additional data collection, processing, analysis, and computing resources. As these models are continually being improved and updated in order to address new questions for the research and policy communities, it is worth examining the consequences of the country-to-region mapping schemes used for model results. This study presents an application of a data processing system built for the GCAM integrated assessment model that allows any country-to-region assignments, with a minimum of four geopolitical regions and a maximum of 185. We test ten different mapping schemes, including the specific mappings used in existing major integrated assessment models. We also explore the impacts of clustering nations into regions according to the similarity of the structure of each nation's energy and agricultural sectors, as indicated by multivariate analysis. Scenarios examined include a reference scenario, a low-emissions scenario, and scenarios with agricultural and buildings sector climate change impacts. We find that at the global level, the major output variables (primary energy, agricultural land use) are surprisingly similar regardless of regional assignments, but at finer geographic scales, differences are pronounced. We suggest that enhancing geographic resolution is advantageous for analysis of climate impacts on the buildings and agricultural sectors, due to the spatial heterogeneity of these drivers.
Error Analysis for High Resolution Topography with Bi-Static Single-Pass SAR Interferometry
NASA Technical Reports Server (NTRS)
Muellerschoen, Ronald J.; Chen, Curtis W.; Hensley, Scott; Rodriguez, Ernesto
2006-01-01
We present a flow-down error analysis from the radar system to topographic height errors for bi-static single-pass SAR interferometry for a satellite tandem pair. Because the baseline length and baseline orientation evolve spatially and temporally due to orbital dynamics, the height accuracy of the system is modeled as a function of spacecraft position and ground location. Vector sensitivity equations of height and the planar error components due to metrology, media effects, and radar system errors are derived and evaluated globally for a baseline mission. Included in the model are terrain effects that contribute to layover and shadow, and slope effects on height errors. The analysis also accounts for non-overlapping spectra and the non-overlapping bandwidth due to differences between the two platforms' viewing geometries. The model is applied to a 514 km altitude, 97.4 degree inclination tandem satellite mission with a 300 m baseline separation and X-band SAR. Results from our model indicate that global DTED level 3 can be achieved.
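For orientation, a textbook single-pass InSAR relation links interferometric phase noise to height error; this is an assumption of the standard form, not a formula quoted from the paper, and the bistatic/monostatic factor p in particular should be checked against the specific system definition.

```latex
\[
  \sigma_h \;\approx\; \frac{\partial h}{\partial \phi}\,\sigma_\phi
          \;=\; \frac{\lambda\, r \,\sin\theta}{2\pi\, p\, B_\perp}\,\sigma_\phi ,
\]
% where $\lambda$ is the radar wavelength, $r$ the slant range, $\theta$ the
% look angle, $B_\perp$ the perpendicular baseline, and $p = 1$ for a bistatic
% pair (one transmitter, two receivers) or $p = 2$ for a monostatic/ping-pong
% configuration.
```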
Invited review: A position on the Global Livestock Environmental Assessment Model (GLEAM).
MacLeod, M J; Vellinga, T; Opio, C; Falcucci, A; Tempio, G; Henderson, B; Makkar, H; Mottet, A; Robinson, T; Steinfeld, H; Gerber, P J
2018-02-01
The livestock sector is one of the fastest growing subsectors of the agricultural economy and, while it makes a major contribution to global food supply and economic development, it also consumes significant amounts of natural resources and alters the environment. In order to improve our understanding of the global environmental impact of livestock supply chains, the Food and Agriculture Organization of the United Nations has developed the Global Livestock Environmental Assessment Model (GLEAM). The purpose of this paper is to provide a review of GLEAM. Specifically, it explains the model architecture, methods and functionality, that is, the types of analysis that the model can perform. The model focuses primarily on the quantification of greenhouse gas emissions arising from the production of the 11 main livestock commodities. The model inputs and outputs are managed and produced as raster data sets, with a spatial resolution of 0.05 decimal degrees. The Global Livestock Environmental Assessment Model v1.0 consists of five distinct modules: (a) the Herd Module; (b) the Manure Module; (c) the Feed Module; (d) the System Module; (e) the Allocation Module. In terms of the modelling approach, GLEAM has several advantages. For example, spatial information on livestock distributions and crop yields enables rations to be derived that reflect the local availability of feed resources in developing countries. The Global Livestock Environmental Assessment Model also contains a herd model that enables livestock statistics to be disaggregated and variation in livestock performance and management to be captured. Priorities for future development of GLEAM include: improving data quality and the methods used to perform emissions calculations; extending the scope of the model to include selected additional environmental impacts and to enable predictive modelling; and improving the utility of GLEAM output.
On dynamical systems approaches and methods in f(R) cosmology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alho, Artur; Carloni, Sante; Uggla, Claes, E-mail: aalho@math.ist.utl.pt, E-mail: sante.carloni@tecnico.ulisboa.pt, E-mail: claes.uggla@kau.se
We discuss dynamical systems approaches and methods applied to flat Robertson-Walker models in f(R)-gravity. We argue that a complete description of the solution space of a model requires a global state space analysis that motivates globally covering state space adapted variables. This is shown explicitly by an illustrative example, f(R) = R + αR², α > 0, for which we introduce new regular dynamical systems on global compactly extended state spaces for the Jordan and Einstein frames. This example also allows us to illustrate several local and global dynamical systems techniques involving, e.g., blow-ups of nilpotent fixed points, center manifold analysis, averaging, and use of monotone functions. As a result of applying dynamical systems methods to globally state space adapted dynamical systems formulations, we obtain pictures of the entire solution spaces in both the Jordan and the Einstein frames. This shows, e.g., that due to the domain of the conformal transformation between the Jordan and Einstein frames, not all the solutions in the Jordan frame are completely contained in the Einstein frame. We also make comparisons with previous dynamical systems approaches to f(R) cosmology and discuss their advantages and disadvantages.
NASA Astrophysics Data System (ADS)
Hu, Shujuan; Chou, Jifan; Cheng, Jianbo
2018-04-01
In order to study the interactions between the atmospheric circulations at the middle-high and low latitudes from the global perspective, the authors proposed the mathematical definition of three-pattern circulations, i.e., horizontal, meridional and zonal circulations with which the actual atmospheric circulation is expanded. This novel decomposition method is proved to accurately describe the actual atmospheric circulation dynamics. The authors used the NCEP/NCAR reanalysis data to calculate the climate characteristics of those three-pattern circulations, and found that the decomposition model agreed with the observed results. Further dynamical analysis indicates that the decomposition model is more accurate to capture the major features of global three dimensional atmospheric motions, compared to the traditional definitions of Rossby wave, Hadley circulation and Walker circulation. The decomposition model for the first time realized the decomposition of global atmospheric circulation using three orthogonal circulations within the horizontal, meridional and zonal planes, offering new opportunities to study the large-scale interactions between the middle-high latitudes and low latitudes circulations.
Teamwork tools and activities within the hazard component of the Global Earthquake Model
NASA Astrophysics Data System (ADS)
Pagani, M.; Weatherill, G.; Monelli, D.; Danciu, L.
2013-05-01
The Global Earthquake Model (GEM) is a public-private partnership aimed at supporting and fostering a global community of scientists and engineers working in the fields of seismic hazard and risk assessment. In the hazard sector, in particular, GEM recognizes the importance of local ownership and leadership in the creation of seismic hazard models. For this reason, over the last few years, GEM has been promoting different activities in the context of seismic hazard analysis, ranging, for example, from regional projects targeted at the creation of updated seismic hazard studies to the development of a new open-source seismic hazard and risk calculation software called the OpenQuake-engine (http://globalquakemodel.org). In this communication we will provide a tour of the various activities completed, such as the new ISC-GEM Global Instrumental Catalogue, and of ongoing initiatives such as the development of a suite of tools for building PSHA input models. Discussion, comments and criticism by colleagues in the audience will be highly appreciated.
Combined constraints on global ocean primary production using observations and models
NASA Astrophysics Data System (ADS)
Buitenhuis, Erik T.; Hashioka, Taketo; Quéré, Corinne Le
2013-09-01
Primary production is at the base of the marine food web and plays a central role in global biogeochemical cycles. Yet global ocean primary production is known only to within a factor of 2, with previous estimates ranging from 38 to 65 Pg C yr-1 and no formal uncertainty analysis. Here, we present an improved global ocean biogeochemistry model that includes a mechanistic representation of photosynthesis and a new observational database of net primary production (NPP) in the ocean. We combine the model and observations to constrain particulate NPP in the ocean with statistical metrics. The PlankTOM5.3 model includes a new photosynthesis formulation with a dynamic representation of iron-light colimitation, which leads to a considerable improvement of the interannual variability of surface chlorophyll. The database includes a consistent set of 50,050 measurements of 14C primary production. The model best reproduces observations when global NPP is 58 ± 7 Pg C yr-1, with a most probable value of 56 Pg C yr-1. The most probable value is robust to the model used. The uncertainty represents 95% confidence intervals. It considers all random errors in the model and observations, but not potential biases in the observations. We show that tropical regions (23°S-23°N) contribute half of the global NPP, while NPPs in the Northern and Southern Hemispheres are approximately equal in spite of the larger ocean area in the South.
NASA Astrophysics Data System (ADS)
Rhodes, R. C.; Barron, C. N.; Fox, D. N.; Smedstad, L. F.
2001-12-01
A global implementation of the Navy Coastal Ocean Model (NCOM), developed by the Naval Research Laboratory (NRL) at Stennis Space Center is currently running in real-time and is planned for transition to the Naval Oceanographic Office (NAVOCEANO) in 2002. The model encompasses the open ocean to 5 m depth on a curvilinear global model grid with 1/8 degree grid spacing at 45N, extending from 80 S to a complete arctic cap with grid singularities mapped into Canada and Russia. Vertically, the model employs 41 sigma-z levels with sigma in the upper-ocean and coastal regions and z in the deeper ocean. The Navy Operational Global Atmospheric Prediction System (NOGAPS) provides 6-hourly wind stresses and heat fluxes for forcing, while the operational Modular Ocean Data Assimilation System (MODAS) provides the background climatology and tools for data pre-processing. Operationally available sea surface temperature (SST) and altimetry (SSH) data are assimilated into the NAVOCEANO global 1/8 degree MODAS 2-D analysis and the 1/16 degree Navy Layered Ocean Model (NLOM) to provide analyses and forecasts of SSH and SST. The 2-D SSH and SST nowcast fields are used as input to the MODAS synthetic climatology database to yield three-dimensional fields of synthetic temperature and salinity for assimilation into global NCOM. The synthetic profiles are weighted higher at depth in the assimilation process to allow the numerical model to properly develop the mixed-layer structure driven by the real-time atmospheric forcing. Global NCOM nowcasts and forecasts provide a valuable resource for rapid response to the varied and often unpredictable operational requests for 3-dimensional fields of ocean temperature, salinity, and currents. In some cases, the resolution of the global product is sufficient for guidance. In cases requiring higher resolution, the global product offers a quick overview of local circulation and provides initial and boundary conditions for higher resolution coastal models that may be more specialized for a particular task or domain. Nowcast and forecast results are presented globally and in selected areas of interest and model results are compared with historical and concurrent observations and analyses.
NASA Astrophysics Data System (ADS)
Oliveira, José J.
2017-10-01
In this paper, we investigate the global convergence of solutions of non-autonomous Hopfield neural network models with discrete time-varying delays, infinite distributed delays, and possible unbounded coefficient functions. Instead of using Lyapunov functionals, we explore intrinsic features between the non-autonomous systems and their asymptotic systems to ensure the boundedness and global convergence of the solutions of the studied models. Our results are new and complement known results in the literature. The theoretical analysis is illustrated with some examples and numerical simulations.
NASA Astrophysics Data System (ADS)
Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing; Zhang, Guannan; Ye, Ming; Wu, Jianfeng; Wu, Jichun
2017-12-01
Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations on a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost in surrogate construction and consequently to improve the efficiency of the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search for informative samples, and a robust stopping criterion to terminate the sample search that guarantees the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions with different dimensionality and complexity, in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method to two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.
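The abstract does not give the exact form of the TEAD score function, but the general idea of an adaptive design that balances space-filling exploration against a Taylor-expansion-based discrepancy term can be sketched as follows. Everything in this sketch is an assumption made for illustration: the radial-basis surrogate, the 50/50 score weighting, the candidate pool size, the stopping tolerance, and the toy test function are not the published TEAD algorithm.

```python
# Illustrative adaptive experimental design in the spirit of TEAD: balance
# space-filling exploration with a Taylor-expansion discrepancy term.
# Score weights, tolerances and the test function are assumptions for demonstration only.
import numpy as np
from scipy.interpolate import RBFInterpolator

def true_model(x):                      # stand-in for an expensive groundwater model
    return np.sin(3 * x[:, 0]) * np.cos(2 * x[:, 1]) + 0.5 * x[:, 0]

def surrogate_gradient(surr, x, h=1e-4):
    """Central-difference gradient of the surrogate at a single point x (shape (d,))."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x); e[i] = h
        g[i] = (surr((x + e)[None, :])[0] - surr((x - e)[None, :])[0]) / (2 * h)
    return g

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(8, 2))      # small initial design
y = true_model(X)

for it in range(30):                    # adaptive enrichment loop
    surr = RBFInterpolator(X, y)
    cand = rng.uniform(0, 1, size=(200, 2))           # candidate pool
    D = np.linalg.norm(cand[:, None, :] - X[None, :, :], axis=2)
    dists = D.min(axis=1)                             # exploration: distance to nearest sample
    nearest = D.argmin(axis=1)

    # Exploitation: disagreement between the surrogate and a first-order
    # Taylor extrapolation from the nearest training point.
    disc = np.empty(len(cand))
    for j, (c, k) in enumerate(zip(cand, nearest)):
        g = surrogate_gradient(surr, X[k])
        taylor = y[k] + g @ (c - X[k])
        disc[j] = abs(surr(c[None, :])[0] - taylor)

    score = 0.5 * dists / dists.max() + 0.5 * disc / (disc.max() + 1e-12)
    best = cand[np.argmax(score)]
    if disc.max() < 1e-3:               # assumed stopping criterion
        break
    X = np.vstack([X, best])
    y = np.append(y, true_model(best[None, :]))

print(f"final design size: {len(X)}")
```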
Seismic waves and earthquakes in a global monolithic model
NASA Astrophysics Data System (ADS)
Roubíček, Tomáš
2018-03-01
The philosophy that a single "monolithic" model can "asymptotically" replace and couple in a simple elegant way several specialized models relevant on various Earth layers is presented and, in special situations, also rigorously justified. In particular, global seismicity and tectonics is coupled to capture, e.g., (here by a simplified model) ruptures of lithospheric faults generating seismic waves which then propagate through the solid-like mantle and inner core both as shear (S) or pressure (P) waves, while S-waves are suppressed in the fluidic outer core and also in the oceans. The "monolithic-type" models have the capacity to describe all the mentioned features globally in a unified way together with corresponding interfacial conditions implicitly involved, only when scaling its parameters appropriately in different Earth's layers. Coupling of seismic waves with seismic sources due to tectonic events is thus an automatic side effect. The global ansatz is here based, rather for an illustration, only on a relatively simple Jeffreys' viscoelastic damageable material at small strains whose various scaling (limits) can lead to Boger's viscoelastic fluid or even to purely elastic (inviscid) fluid. Self-induced gravity field, Coriolis, centrifugal, and tidal forces are counted in our global model, as well. The rigorous mathematical analysis as far as the existence of solutions, convergence of the mentioned scalings, and energy conservation is briefly presented.
Simulating PACE Global Ocean Radiances
NASA Technical Reports Server (NTRS)
Gregg, Watson W.; Rousseaux, Cecile S.
2017-01-01
The NASA PACE mission is a hyper-spectral radiometer planned for launch in the next decade. It is intended to provide new information on ocean biogeochemical constituents by parsing the details of high resolution spectral absorption and scattering. It is the first of its kind for global applications and as such, poses challenges for design and operation. To support pre-launch mission development and assess on-orbit capabilities, the NASA Global Modeling and Assimilation Office has developed a dynamic simulation of global water-leaving radiances, using an ocean model containing multiple ocean phytoplankton groups, particulate detritus, particulate inorganic carbon (PIC), and chromophoric dissolved organic carbon (CDOC) along with optical absorption and scattering processes at 1 nm spectral resolution. The purpose here is to assess the skill of the dynamic model and derived global radiances. Global bias, uncertainty, and correlation are derived using available modern satellite radiances at moderate spectral resolution. Total chlorophyll, PIC, and the absorption coefficient of CDOC (aCDOC) are simultaneously assimilated to improve the fidelity of the optical constituent fields. A 5-year simulation showed statistically significant (P < 0.05) comparisons of chlorophyll (r = 0.869), PIC (r = 0.868), and aCDOC (r = 0.890) with satellite data. Additionally, diatoms (r = 0.890), cyanobacteria (r = 0.732), and coccolithophores (r = 0.716) were significantly correlated with in situ data. Global assimilated distributions of optical constituents were coupled with a radiative transfer model (Ocean-Atmosphere Spectral Irradiance Model, OASIM) to estimate normalized water-leaving radiances at 1 nm for the spectral range 250-800 nm. These unassimilated radiances were within 0.074 mW/sq cm/micron/sr of MODIS-Aqua radiances at 412, 443, 488, 531, 547, and 667 nm. This difference represented a bias of 10.4% (model low). A mean correlation of 0.706 (P < 0.05) was found with global distributions of MODIS radiances. These results suggest skill in the global assimilated model and resulting radiances. The reported error characterization suggests that the global dynamical simulation can support some aspects of mission design and analysis. For example, the high spectral resolution of the simulation supports investigations of band selection. The global nature of the radiance representations supports investigations of satellite observing scenarios. Global radiances at bands not available in current and past missions support investigations of mission capability. Keywords: PACE, ocean color, water-leaving radiances, biogeochemical model, radiative transfer model.
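As a minimal illustration of the kind of skill metrics quoted above (mean bias, percent bias, and correlation between simulated and observed band radiances), the short sketch below compares two arrays; the array names and radiance values are hypothetical, not the study's data.

```python
# Hypothetical skill metrics for simulated vs. observed water-leaving radiances.
import numpy as np

model_lwn = np.array([1.72, 1.55, 1.20, 0.83, 0.74, 0.21])   # mW/cm^2/micron/sr (made-up values)
modis_lwn = np.array([1.90, 1.71, 1.33, 0.95, 0.82, 0.25])   # e.g., bands 412-667 nm

bias = np.mean(model_lwn - modis_lwn)                 # mean difference (model minus observation)
pct_bias = 100.0 * bias / np.mean(modis_lwn)          # percent bias relative to the observations
r = np.corrcoef(model_lwn, modis_lwn)[0, 1]           # Pearson correlation

print(f"bias = {bias:.3f}, percent bias = {pct_bias:.1f}%, r = {r:.3f}")
```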
Computing diffuse fraction of global horizontal solar radiation: A model comparison.
Dervishi, Sokol; Mahdavi, Ardeshir
2012-06-01
For simulation-based prediction of buildings' energy use or expected gains from building-integrated solar energy systems, information on both the direct and diffuse components of solar radiation is necessary. Available measured data are, however, typically restricted to global horizontal irradiance. There have thus been many past efforts to develop algorithms for the derivation of the diffuse fraction of solar irradiance. In this context, the present paper compares eight models for estimating the diffuse fraction of irradiance based on a database of measured irradiance from Vienna, Austria. These models generally involve mathematical formulations with multiple coefficients whose values are typically valid for a specific location. Subsequent to a first comparison of these eight models, three better performing models were selected for a more detailed analysis. Thereby, the coefficients of the models were modified to account for the Vienna data. The results suggest that some models can provide relatively reliable estimations of the diffuse fraction of the global irradiance. The calibration procedure could only slightly improve the models' performance.
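Models of this family typically express the diffuse fraction kd as a piecewise polynomial of the clearness index kt (global horizontal irradiance divided by extraterrestrial horizontal irradiance). The sketch below shows only that general structure; the breakpoints and coefficients are illustrative placeholders, not the calibrated Vienna values or those of any particular published correlation.

```python
# Generic piecewise-polynomial diffuse-fraction model (illustrative coefficients only).
import numpy as np

def diffuse_fraction(kt):
    """Estimate the diffuse fraction kd from an array of clearness indices kt."""
    kt = np.asarray(kt, dtype=float)
    kd = np.empty_like(kt)
    low = kt <= 0.3                      # overcast: mostly diffuse
    mid = (kt > 0.3) & (kt <= 0.78)      # intermediate sky conditions
    high = kt > 0.78                     # clear sky: small diffuse share
    kd[low] = 1.0 - 0.2 * kt[low]
    kd[mid] = 1.4 - 1.7 * kt[mid] + 0.6 * kt[mid] ** 2
    kd[high] = 0.18
    return np.clip(kd, 0.0, 1.0)

ghi = np.array([120.0, 450.0, 820.0])        # measured global horizontal irradiance, W/m^2
ext = np.array([900.0, 1000.0, 1050.0])      # extraterrestrial horizontal irradiance, W/m^2
kt = ghi / ext
dhi = diffuse_fraction(kt) * ghi             # diffuse horizontal irradiance
direct_horiz = ghi - dhi                     # direct component on the horizontal plane
print(np.round(dhi, 1), np.round(direct_horiz, 1))
```

Calibrating such a model to a specific site, as done for the Vienna data, amounts to re-estimating the polynomial coefficients against local measurements.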
Ionospheric Slant Total Electron Content Analysis Using Global Positioning System Based Estimation
NASA Technical Reports Server (NTRS)
Komjathy, Attila (Inventor); Mannucci, Anthony J. (Inventor); Sparks, Lawrence C. (Inventor)
2017-01-01
A method, system, apparatus, and computer program product provide the ability to analyze ionospheric slant total electron content (TEC) using global navigation satellite systems (GNSS)-based estimation. Slant TEC is estimated for a given set of raypath geometries by fitting historical GNSS data to a specified delay model. The accuracy of the specified delay model is estimated by computing delay estimate residuals and plotting a behavior of the delay estimate residuals. An ionospheric threat model is computed based on the specified delay model. Ionospheric grid delays (IGDs) and grid ionospheric vertical errors (GIVEs) are computed based on the ionospheric threat model.
A global optimization approach to multi-polarity sentiment analysis.
Li, Xinmiao; Li, Jing; Wu, Yukeng
2015-01-01
Following the rapid development of social media, sentiment analysis has become an important social media mining technique. The performance of automatic sentiment analysis primarily depends on feature selection and sentiment classification. While information gain (IG) and support vector machines (SVM) are two important techniques, few studies have optimized both approaches in sentiment analysis. The effectiveness of applying a global optimization approach to sentiment analysis remains unclear. We propose a global optimization-based sentiment analysis (PSOGO-Senti) approach to improve sentiment analysis with IG for feature selection and SVM as the learning engine. The PSOGO-Senti approach utilizes a particle swarm optimization algorithm to obtain a global optimal combination of feature dimensions and parameters in the SVM. We evaluate the PSOGO-Senti model on two datasets from different fields. The experimental results showed that the PSOGO-Senti model can improve binary and multi-polarity Chinese sentiment analysis. We compared the optimal feature subset selected by PSOGO-Senti with the features in the sentiment dictionary. The results of this comparison indicated that PSOGO-Senti can effectively remove redundant and noisy features and can select a domain-specific feature subset with higher explanatory power for a particular sentiment analysis task. The experimental results showed that the PSOGO-Senti approach is effective and robust for sentiment analysis tasks in different domains. By comparing the improvements of two-polarity, three-polarity and five-polarity sentiment analysis results, we found that the five-polarity sentiment analysis delivered the largest improvement, while the improvement of the two-polarity sentiment analysis was the smallest. We conclude that PSOGO-Senti achieves greater improvement for more complicated sentiment analysis tasks. We also compared the results of PSOGO-Senti with those of the genetic algorithm (GA) and grid search method. From the results of this comparison, we found that PSOGO-Senti is more suitable for improving a difficult multi-polarity sentiment analysis problem.
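The abstract does not publish the PSOGO-Senti implementation, but the underlying idea of jointly searching the number of retained features and the SVM parameters with particle swarm optimization can be sketched as below. The swarm size, inertia and acceleration constants, the use of mutual information as a stand-in for information-gain ranking, and the synthetic data are all assumptions made for illustration.

```python
# Illustrative joint optimization of feature count and SVM (C, gamma) with a simple PSO.
# Mutual information is used here as a proxy for information-gain feature ranking.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(42)
X, y = make_classification(n_samples=300, n_features=40, n_informative=8, random_state=0)
ranking = np.argsort(mutual_info_classif(X, y, random_state=0))[::-1]   # best features first

def fitness(pos):
    """Cross-validated accuracy for a particle position (log2 C, log2 gamma, k features)."""
    c, g, k = 2.0 ** pos[0], 2.0 ** pos[1], int(np.clip(round(pos[2]), 1, X.shape[1]))
    return cross_val_score(SVC(C=c, gamma=g), X[:, ranking[:k]], y, cv=3).mean()

lo, hi = np.array([-5.0, -10.0, 1.0]), np.array([10.0, 3.0, 40.0])
pos = rng.uniform(lo, hi, size=(12, 3))          # 12 particles
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[np.argmax(pbest_val)].copy()

for _ in range(20):
    r1, r2 = rng.uniform(size=pos.shape), rng.uniform(size=pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([fitness(p) for p in pos])
    improved = vals > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmax(pbest_val)].copy()

print("best log2(C), log2(gamma), k:", np.round(gbest, 2), "accuracy:", round(pbest_val.max(), 3))
```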
NASA Astrophysics Data System (ADS)
Zhao, F.; Frieler, K.; Warszawski, L.; Lange, S.; Schewe, J.; Reyer, C.; Ostberg, S.; Piontek, F.; Betts, R. A.; Burke, E.; Ciais, P.; Deryng, D.; Ebi, K. L.; Emanuel, K.; Elliott, J. W.; Galbraith, E. D.; Gosling, S.; Hickler, T.; Hinkel, J.; Jones, C.; Krysanova, V.; Lotze-Campen, H.; Mouratiadou, I.; Popp, A.; Tian, H.; Tittensor, D.; Vautard, R.; van Vliet, M. T. H.; Eddy, T.; Hattermann, F.; Huber, V.; Mengel, M.; Stevanovic, M.; Kirsten, T.; Mueller Schmied, H.; Denvil, S.; Halladay, K.; Suzuki, T.; Lotze, H. K.
2016-12-01
In Paris, France, December 2015 the Conference of Parties (COP) to the United Nations Framework Convention on Climate Change (UNFCCC) invited the IPCC to provide a "special report in 2018 on the impacts of global warming of 1.5°C above pre-industrial levels and related global greenhouse gas emission pathways". In Nairobi, Kenya, April 2016 the IPCC panel accepted the invitation. Here we describe the model simulations planned within the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP) to address the request by providing tailored cross-sectoral consistent impacts projections. The protocol is designed to allow for 1) a separation of the impacts of the historical warming starting from pre-industrial conditions from other human drivers such as historical land use changes (based on pre-industrial and historical impact model simulations), 2) a quantification of the effects of an additional warming to 1.5°C including a potential overshoot and long term effects up to 2300 in comparison to a no-mitigation scenario (based on the low emissions Representative Concentration Pathway RCP2.6 and a no-mitigation scenario RCP6.0) keeping socio-economic conditions fixed at year 2005 levels, and 3) an assessment of the climate effects based on the same climate scenarios but accounting for parallel changes in socio-economic conditions following the middle of the road Shared Socioeconomic Pathway (SSP2) and differential bio-energy requirements associated with the transformation of the energy system to reach RCP2.6 compared to RCP6.0. To provide the scientific basis for an aggregation of impacts across sectors and an analysis of cross-sectoral interactions potentially damping or amplifying sectoral impacts the protocol is designed to provide consistent impacts projections across a range of impact models from different sectors (global and regional hydrological models, global gridded crop models, global vegetation models, regional forestry models, global and regional marine ecosystem and fisheries models, global and regional coastal infrastructure models, energy models, health models, and agro-economic models).
NASA Astrophysics Data System (ADS)
Frieler, Katja; Warszawski, Lila; Zhao, Fang
2017-04-01
In Paris, France, December 2015 the Conference of Parties (COP) to the United Nations Framework Convention on Climate Change (UNFCCC) invited the IPCC to provide a "special report in 2018 on the impacts of global warming of 1.5°C above pre-industrial levels and related global greenhouse gas emission pathways". In Nairobi, Kenya, April 2016 the IPCC panel accepted the invitation. Here we describe the model simulations planned within the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP) to address the request by providing tailored cross-sectoral consistent impacts projections. The protocol is designed to allow for 1) a separation of the impacts of the historical warming starting from pre-industrial conditions from other human drivers such as historical land use changes (based on pre-industrial and historical impact model simulations), 2) a quantification of the effects of an additional warming to 1.5°C including a potential overshoot and long term effects up to 2300 in comparison to a no-mitigation scenario (based on the low emissions Representative Concentration Pathway RCP2.6 and a no-mitigation scenario RCP6.0) keeping socio-economic conditions fixed at year 2005 levels, and 3) an assessment of the climate effects based on the same climate scenarios but accounting for parallel changes in socio-economic conditions following the middle of the road Shared Socioeconomic Pathway (SSP2) and differential bio-energy requirements associated with the transformation of the energy system to reach RCP2.6 compared to RCP6.0. To provide the scientific basis for an aggregation of impacts across sectors and an analysis of cross-sectoral interactions potentially damping or amplifying sectoral impacts the protocol is designed to provide consistent impacts projections across a range of impact models from different sectors (global and regional hydrological models, global gridded crop models, global vegetation models, regional forestry models, global and regional marine ecosystem and fisheries models, global and regional coastal infrastructure models, energy models, health models, and agro-economic models).
Plank, Barbara; Eisenmenger, Nina; Schaffartzik, Anke; Wiedenhofer, Dominik
2018-04-03
Globalization has led to an immense increase in international trade and the emergence of complex global value chains. At the same time, global resource use and pressures on the environment are increasing steadily. With these two processes occurring in parallel, the question arises whether trade contributes positively to resource efficiency or, to the contrary, further drives resource use. In this article, the socioeconomic driving forces of increasing global raw material consumption (RMC) are investigated to assess the role of changing trade relations, extended supply chains and increasing consumption. We apply a structural decomposition analysis of changes in RMC from 1990 to 2010, utilizing the Eora multi-regional input-output (MRIO) model. We find that changes in international trade patterns significantly contributed to an increase of global RMC. Wealthy developed countries play a major role in driving global RMC growth through changes in their trade structures, as they shifted production processes increasingly to less material-efficient input suppliers. Even the dramatic increase in material consumption in the emerging economies has not diminished the role of industrialized countries as drivers of global RMC growth.
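In the standard environmentally extended MRIO accounting that structural decomposition analyses of this kind build on, consumption-based material use is the product of a material-intensity vector, the Leontief inverse, and final demand. A generic two-factor decomposition of a change in RMC is sketched below; the notation is standard MRIO notation assumed for illustration, not taken from the article itself.

```latex
% Consumption-based raw material consumption in an MRIO framework (generic notation)
\[
  \mathrm{RMC} \;=\; \mathbf{m}^{\top} (\mathbf{I} - \mathbf{A})^{-1} \mathbf{y}
  \;=\; \mathbf{m}^{\top} \mathbf{L}\, \mathbf{y},
\]
% where m is material extraction per unit output, A the technical coefficient
% matrix, L the Leontief inverse, and y final demand.
% A simple two-factor decomposition using period averages (exact for two factors):
\[
  \Delta \mathrm{RMC}
  \;=\; \underbrace{\Delta\!\left(\mathbf{m}^{\top}\mathbf{L}\right)\bar{\mathbf{y}}}_{\text{production structure and trade}}
  \;+\; \underbrace{\overline{\mathbf{m}^{\top}\mathbf{L}}\;\Delta\mathbf{y}}_{\text{final demand}} .
\]
```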
NASA Astrophysics Data System (ADS)
Misra, Vasubandhu; Li, H.; Wu, Z.; DiNapoli, S.
2014-03-01
This paper shows demonstrable improvement in the global seasonal climate predictability of boreal summer (at zero lead) and fall (at one-season lead) seasonal mean precipitation and surface temperature from a two-tiered seasonal hindcast forced with forecasted SST, relative to two other contemporary operational coupled ocean-atmosphere climate models. The results from an extensive set of seasonal hindcasts are analyzed to reach this conclusion. This improvement is attributed to: (1) the multi-model bias-corrected SST used to force the atmospheric model; (2) the global atmospheric model, which is run at a relatively high resolution of 50 km compared to the two other coupled ocean-atmosphere models; and (3) the physics of the atmospheric model, especially that related to the convective parameterization scheme. The results of the seasonal hindcast are analyzed for both deterministic and probabilistic skill. The probabilistic skill analysis shows that significant forecast skill can be harvested from these seasonal hindcasts relative to the deterministic skill analysis. The paper concludes that the coupled ocean-atmosphere seasonal hindcasts have reached a reasonable fidelity to exploit their SST anomaly forecasts to force such relatively higher-resolution two-tier prediction experiments to glean further boreal summer and fall seasonal prediction skill.
Exploring and Analyzing Climate Variations Online by Using MERRA-2 data at GES DISC
NASA Astrophysics Data System (ADS)
Shen, S.; Ostrenga, D.; Vollmer, B.; Kempler, S.
2016-12-01
NASA Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure) (http://giovanni.sci.gsfc.nasa.gov/giovanni/) is a web-based data visualization and analysis system developed by the Goddard Earth Sciences Data and Information Services Center (GES DISC). Current data analysis functions include Lat-Lon map, time series, scatter plot, correlation map, difference, cross-section, vertical profile, and animation, etc. The system enables basic statistical analysis and comparisons of multiple variables. This web-based tool facilitates data discovery, exploration and analysis of large amounts of global and regional remote sensing and model data sets from a number of NASA data centers. Recently, long-term global assimilated atmospheric, land, and ocean data have been integrated into the system, enabling quick exploration and analysis of climate data without downloading and preprocessing the data. Example data include climate reanalysis from the NASA Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2), which provides data from 1980 to the present; land data from the NASA Global Land Data Assimilation System (GLDAS), which assimilates data from 1948 to 2012; as well as ocean biological data from the NASA Ocean Biogeochemical Model (NOBM), which assimilates data from 1998 to 2012. This presentation, using surface air temperature, precipitation, ozone, and aerosol, etc. from MERRA-2, demonstrates climate variation analysis with Giovanni at selected regions.
Exploring and Analyzing Climate Variations Online by Using NASA MERRA-2 Data at GES DISC
NASA Technical Reports Server (NTRS)
Shen, Suhung; Ostrenga, Dana M.; Vollmer, Bruce E.; Kempler, Steven J.
2016-01-01
NASA Giovanni (Goddard Interactive Online Visualization ANd aNalysis Infrastructure) (http://giovanni.sci.gsfc.nasa.gov/giovanni) is a web-based data visualization and analysis system developed by the Goddard Earth Sciences Data and Information Services Center (GES DISC). Current data analysis functions include Lat-Lon map, time series, scatter plot, correlation map, difference, cross-section, vertical profile, and animation, etc. The system enables basic statistical analysis and comparisons of multiple variables. This web-based tool facilitates data discovery, exploration and analysis of large amounts of global and regional remote sensing and model data sets from a number of NASA data centers. Long-term global assimilated atmospheric, land, and ocean data have been integrated into the system, enabling quick exploration and analysis of climate data without downloading, preprocessing, and learning the data. Example data include climate reanalysis data from the NASA Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2), which provides data from 1980 to the present; land data from the NASA Global Land Data Assimilation System (GLDAS), which assimilates data from 1948 to 2012; as well as ocean biological data from the NASA Ocean Biogeochemical Model (NOBM), which provides data from 1998 to 2012. This presentation, using surface air temperature, precipitation, ozone, and aerosol, etc. from MERRA-2, demonstrates climate variation analysis with Giovanni at selected regions.
Cosmological backreaction within the Szekeres model and emergence of spatial curvature
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bolejko, Krzysztof, E-mail: krzysztof.bolejko@sydney.edu.au
This paper discusses the phenomenon of backreaction within the Szekeres model. Cosmological backreaction describes how the mean global evolution of the Universe deviates from the Friedmannian evolution. The analysis is based on models of a single cosmological environment and the global ensemble of the Szekeres models (of the Swiss-Cheese-type and Styrofoam-type). The obtained results show that non-linear growth of cosmic structures is associated with the growth of the spatial curvature Ω_R (in the FLRW limit Ω_R → Ω_k). If averaged over global scales the result depends on the assumed global model of the Universe. Within the Swiss-Cheese model, which does have a fixed background, the volume average follows the evolution of the background, and the global spatial curvature averages out to zero (the background model is the ΛCDM model, which is spatially flat). In the Styrofoam-type model, which does not have a fixed background, the mean evolution deviates from the spatially flat ΛCDM model, and the mean spatial curvature evolves from Ω_R = 0 at the CMB to Ω_R ∼ 0.1 at z = 0. If the Styrofoam-type model correctly captures evolutionary features of the real Universe then one should expect that in our Universe, the spatial curvature should build up (local growth of cosmic structures) and its mean global average should deviate from zero (backreaction). As a result, this paper predicts that the low-redshift Universe should not be spatially flat (i.e. Ω_k ≠ 0, even if in the early Universe Ω_k = 0) and therefore when analysing low-z cosmological data one should keep Ω_k as a free parameter and independent from the CMB constraints.
Cosmological backreaction within the Szekeres model and emergence of spatial curvature
NASA Astrophysics Data System (ADS)
Bolejko, Krzysztof
2017-06-01
This paper discusses the phenomenon of backreaction within the Szekeres model. Cosmological backreaction describes how the mean global evolution of the Universe deviates from the Friedmannian evolution. The analysis is based on models of a single cosmological environment and the global ensemble of the Szekeres models (of the Swiss-Cheese-type and Styrofoam-type). The obtained results show that non-linear growth of cosmic structures is associated with the growth of the spatial curvature Ω_R (in the FLRW limit Ω_R → Ω_k). If averaged over global scales the result depends on the assumed global model of the Universe. Within the Swiss-Cheese model, which does have a fixed background, the volume average follows the evolution of the background, and the global spatial curvature averages out to zero (the background model is the ΛCDM model, which is spatially flat). In the Styrofoam-type model, which does not have a fixed background, the mean evolution deviates from the spatially flat ΛCDM model, and the mean spatial curvature evolves from Ω_R = 0 at the CMB to Ω_R ~ 0.1 at z = 0. If the Styrofoam-type model correctly captures evolutionary features of the real Universe then one should expect that in our Universe, the spatial curvature should build up (local growth of cosmic structures) and its mean global average should deviate from zero (backreaction). As a result, this paper predicts that the low-redshift Universe should not be spatially flat (i.e. Ω_k ≠ 0, even if in the early Universe Ω_k = 0) and therefore when analysing low-z cosmological data one should keep Ω_k as a free parameter and independent from the CMB constraints.
NASA Astrophysics Data System (ADS)
Drapeau, L.; Mangiarotti, S.; Le Jean, F.; Gascoin, S.; Jarlan, L.
2014-12-01
The global modeling technique provides a way to obtain ordinary differential equations from single time series [1]. This technique, initiated in the 1990s, has been applied successfully to numerous theoretic and experimental systems. More recently it has been applied to environmental systems [2,3]. Here this technique is applied to the seasonal snow cover area in the Pyrenees mountains (Europe) and Mount Lebanon (Mediterranean region). The snowpack evolution is complex because it results from a combination of processes driven by physiography (elevation, slope, land cover...) and meteorological variables (precipitation, temperature, wind speed...), which are highly heterogeneous in such regions. Satellite observations in visible bands offer a powerful tool to monitor snow cover areas at global scale, over a large range of resolutions. Although this observable does not directly inform about snow water equivalent, its dynamical behavior strongly relies on it. Therefore, snow cover area is likely to be a good proxy of the global dynamics and the global modeling technique a well adapted approach. The MOD10A2 product (500 m) generated from MODIS by NASA is used after a pretreatment is applied to minimize cloud effects. The global modeling technique is then applied using two packages [4,5]. The analysis is performed with two time series for the whole period (2000-2012) and year by year. Low-dimensional chaotic models are obtained in many cases. Such models provide a strong argument for chaos since they involve the two necessary conditions in a synthetic way: determinism and strong sensitivity to initial conditions. The model comparison suggests important non-stationarities at interannual scale which prevent the detection of long term changes. [1] Letellier et al 2009. Frequently asked questions about global modeling, Chaos, 19, 023103. [2] Maquet et al 2007. Global models from the Canadian lynx cycles as a direct evidence for chaos in real ecosystems. J. of Mathematical Biology, 55 (1), 21-39. [3] Mangiarotti et al 2014. Two chaotic global models for cereal crops cycles observed from satellite in Northern Morocco. Chaos, 24, 023130. [4] Mangiarotti et al 2012. Polynomial search and Global modelling: two algorithms for modeling chaos. Physical Review E, 86(4), 046205. [5] http://cran.r-project.org/web/packages/PoMoS/index.html.
Global Persistent Attack: A Systems Architecture, Process Modeling, and Risk Analysis Approach
2008-06-01
develop an analysis process for quantifying risk associated with the limitations presented by a fiscally constrained environment. The second step...previous independent analysis of each force structure provided information for quantifying risk associated with the given force presentations, the
Examination of Satellite and Model Reanalysis Precipitation with Climate Oscillations
NASA Astrophysics Data System (ADS)
Donato, T. F.; Houser, P. R.
2016-12-01
The purpose of this study is to examine the efficacy of satellite and model reanalysis precipitation with respect to climate oscillations. Specifically, we examine and compare the relationship between the Global Precipitation Climatology Project (GPCP) and the Modern-Era Retrospective Analysis for Research and Applications, Version 2 (MERRA-2) in regard to four climate indices: the North Atlantic Oscillation, the Southern Oscillation Index, the Southern Annular Mode and solar activity. This analysis covers a 35-year observation period from 1980 through 2015. We ask two questions: How is global and regional precipitation changing over the observation period, and how are global and regional variations in precipitation related to global climate variation? We explore and compare global and regional precipitation trends between the two data sets. To do this, we constructed a total of 56 Regions of Interest (ROIs). Nineteen of the ROIs were focused on geographic regions including continents, ocean basins, and marginal seas. Twelve ROIs examine hemispheric processes. The remaining 26 regions are derived from a spatial-temporal classification analysis of GPCP data over a ten-year period (2001-2010). These regions include the primary wet and dry monsoon regions, regions influenced by western boundary currents, and orography. We investigate and interpret the monthly, seasonal and yearly global and regional response to the selected climate indices. Initial results indicate that no correlation exists between the GPCP data and the MERRA-2 data. A preliminary qualitative assessment between GPCP and solar activity suggests a possible relationship in intra-annual variability. This work is performed under the State of the Global Water and Energy Cycle (SWEC) project, a NASA-sponsored program in support of NASA's Energy and Water cycle Study (NEWS).
A human-driven decline in global burned area
NASA Astrophysics Data System (ADS)
Andela, N.; Morton, D. C.; Chen, Y.; van der Werf, G.; Giglio, L.; Kasibhatla, P. S.; Randerson, J. T.
2016-12-01
Fire is an important and dynamic ecosystem process that influences many aspects of the global Earth system. Here, we used several different satellite datasets to assess trends in global burned area during 1998 to 2014. Global burned area decreased by about 21.6 ± 8.5% over the period from 1998-2014, with large regional declines observed in savanna and grassland ecosystems in northern Africa, Eurasia, and South America. The decrease in burned area remained robust after removing the influence of climate (16.0 ± 6.0%), implicating human activity as a likely driver. To further investigate the mechanisms contributing to regional and global trends, we conducted several kinds of analysis, including separation of burned area into ignition and fire size components and geospatial analysis of fire trends in relationship with demographic and land use variables. We found that fire number was a more important factor contributing to burned area trends than fire size, suggesting a reduction in the use of fire for management purposes. Concurrent decreases in fire size also contributed to the trend outside of North and South America, suggesting a role for greater landscape fragmentation. From our geospatial analysis, we developed a conceptual model that incorporates a range of drivers for human-driven changes in biomass burning that can be used to guide global fire models, currently unable to reproduce these large scale recent trends. Patterns of agricultural expansion and land use intensification are likely to further contribute to declining burned area trends in future decades, with important consequences for Earth system processes mediated by surface albedo, greenhouse gas emissions, and aerosols. Our results also highlight the vulnerability of savannas and grassland to land use changes with unprecedented global scale consequences for vegetation structure and the carbon cycle.
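One step described above, separating the burned area trend into an ignition (fire number) component and a fire size component, can be illustrated with a toy calculation: because burned area is the product of fire counts and mean fire size, its fractional trend decomposes approximately into the sum of the two fractional trends. The yearly values below are fabricated for illustration and are not the satellite record analyzed in the study.

```python
# Toy decomposition of a burned-area trend into fire-number and fire-size contributions.
import numpy as np

years = np.arange(1998, 2015)
n_fires = 1.00e6 * (1 - 0.012) ** (years - years[0])    # synthetic: ~1.2%/yr fewer fires
mean_size = 4.0 * (1 - 0.003) ** (years - years[0])     # synthetic: ~0.3%/yr smaller fires (km^2)
burned_area = n_fires * mean_size                       # km^2 burned per year

def rel_trend(y):
    """Linear trend expressed as a fraction of the series mean, per year."""
    slope = np.polyfit(years, y, 1)[0]
    return slope / y.mean()

ba, nf, ms = rel_trend(burned_area), rel_trend(n_fires), rel_trend(mean_size)
print(f"burned area: {100*ba:.2f}%/yr ≈ fires {100*nf:.2f}%/yr + size {100*ms:.2f}%/yr")
```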
Limits of Risk Predictability in a Cascading Alternating Renewal Process Model.
Lin, Xin; Moussawi, Alaa; Korniss, Gyorgy; Bakdash, Jonathan Z; Szymanski, Boleslaw K
2017-07-27
Most risk analysis models systematically underestimate the probability and impact of catastrophic events (e.g., economic crises, natural disasters, and terrorism) by not taking into account interconnectivity and interdependence of risks. To address this weakness, we propose the Cascading Alternating Renewal Process (CARP) to forecast interconnected global risks. However, assessments of the model's prediction precision are limited by lack of sufficient ground truth data. Here, we establish prediction precision as a function of input data size by using alternative long ground truth data generated by simulations of the CARP model with known parameters. We illustrate the approach on a model of fires in artificial cities assembled from basic city blocks with diverse housing. The results confirm that parameter recovery variance exhibits power law decay as a function of the length of available ground truth data. Using CARP, we also demonstrate estimation using a disparate dataset that also has dependencies: real-world prediction precision for the global risk model based on the World Economic Forum Global Risk Report. We conclude that the CARP model is an efficient method for predicting catastrophic cascading events with potential applications to emerging local and global interconnected risks.
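The reported power-law decay of parameter-recovery variance with ground-truth length can be checked with a simple log-log regression, as in the sketch below; the variance values are synthetic stand-ins for the simulation output, not results from the CARP model.

```python
# Fit a power law var(L) ~ a * L^(-b) to synthetic parameter-recovery variances.
import numpy as np

rng = np.random.default_rng(1)
lengths = np.array([100.0, 200.0, 400.0, 800.0, 1600.0, 3200.0])   # ground-truth lengths (synthetic)
variance = 2.5 * lengths ** -0.9 * np.exp(rng.normal(0, 0.05, lengths.size))

slope, intercept = np.polyfit(np.log(lengths), np.log(variance), 1)
print(f"estimated decay exponent b = {-slope:.2f}, prefactor a = {np.exp(intercept):.2f}")
```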
Beretta, E; Capasso, V; Rinaldi, F
1988-01-01
The paper contains an extension of the general ODE system proposed in previous papers by the same authors, to include distributed time delays in the interaction terms. The new system describes a large class of Lotka-Volterra like population models and epidemic models with continuous time delays. Sufficient conditions for the boundedness of solutions and for the global asymptotic stability of nontrivial equilibrium solutions are given. A detailed analysis of the epidemic system is given with respect to the conditions for global stability. For a relevant subclass of these systems an existence criterion for steady states is also given.
NASA Astrophysics Data System (ADS)
Hoff, R. M.; Pappalardo, G.
2010-12-01
In 2007, the WMO Global Atmosphere Watch's Science Advisory Group on Aerosols described a global network of lidar networks called the GAW Aerosol Lidar Observation Network (GALION). The purpose of GALION is to provide expanded coverage of aerosol observations for climate and air quality use. Comprising networks in Asia (AD-NET), Europe (EARLINET and CIS-LINET), North America (CREST and CORALNET), and South America (ALINE), with contributions from global networks such as MPLNET and NDACC, the collaboration provides a unique capability to define aerosol profiles in the vertical. GALION is designed to supplement existing ground-based and column profiling (AERONET, PHOTONS, SKYNET, GAW PFR) stations. In September 2010, GALION held its second workshop, and one component of the discussion focused on how the network would integrate into model needs. GALION partners have contributed to the Sand and Dust Storm Warning and Analysis System (SDS-WAS) and to assimilation in models such as DREAM. This paper will present the conclusions of those discussions and how these observations can fit into a global model analysis framework. Questions of availability, latency, and aerosol parameters that might be ingested into models will be discussed. An example of where EARLINET and GALION have contributed near-real-time observations was the suite of measurements during the Eyjafjallajökull eruption in Iceland and its impact on European air travel. Lessons learned from this experience will be discussed.
Hydroclimatic Controls over Global Variations in Phenology and Carbon Flux
NASA Technical Reports Server (NTRS)
Koster, Randal; Walker, G.; Thornton, Patti; Collatz, G. J.
2012-01-01
The connection between phenological and hydroclimatological variations is quantified through joint analyses of global NDVI, LAI, and precipitation datasets. The global distributions of both NDVI and LAI in the warm season are strongly controlled by three quantities: mean annual precipitation, the standard deviation of annual precipitation, and Budyko's index of dryness. Upon demonstrating that these same basic (if biased) relationships are produced by a dynamic vegetation model (the dynamic vegetation and carbon storage components of the NCAR Community Land Model version 4 combined with the water and energy balance framework of the Catchment Land Surface Model of the NASA Global Modeling and Assimilation Office), we use the model to perform a sensitivity study focusing on how phenology and carbon flux might respond to climatic change. The offline (decoupled from the atmosphere) simulations show us, for example, where on the globe a given small increment in precipitation mean or variability would have the greatest impact on carbon uptake. The analysis framework additionally allows us to quantify the degree to which climatic biases in a free-running GCM are manifested as biases in simulated phenology.
Hydroclimatic Controls over Global Variations in Phenology and Carbon Flux
NASA Astrophysics Data System (ADS)
Koster, R. D.; Walker, G.; Thornton, P. E.; Collatz, G. J.
2012-12-01
The connection between phenological and hydroclimatological variations is quantified through joint analyses of global NDVI, LAI, and precipitation datasets. The global distributions of both NDVI and LAI in the warm season are strongly controlled by three quantities: mean annual precipitation, the standard deviation of annual precipitation, and Budyko's index of dryness. Upon demonstrating that these same basic (if somewhat biased) relationships are produced by a dynamic vegetation model (the dynamic vegetation and carbon storage components of the NCAR Community Land Model version 4 combined with the water and energy balance framework of the Catchment Land Surface Model of the NASA Global Modeling and Assimilation Office), we use the model to perform a sensitivity study focusing on how phenology and carbon flux might respond to climatic change. The offline (decoupled from the atmosphere) simulations show us, for example, where on the globe a given small increment in precipitation mean or variability would have the greatest impact on carbon uptake. The analysis framework additionally allows us to quantify the degree to which climatic biases in a free-running GCM are manifested as biases in simulated phenology.
The CAFE model: A net production model for global ocean phytoplankton
NASA Astrophysics Data System (ADS)
Silsbe, Greg M.; Behrenfeld, Michael J.; Halsey, Kimberly H.; Milligan, Allen J.; Westberry, Toby K.
2016-12-01
The Carbon, Absorption, and Fluorescence Euphotic-resolving (CAFE) net primary production model is an adaptable framework for advancing global ocean productivity assessments by exploiting state-of-the-art satellite ocean color analyses and addressing key physiological and ecological attributes of phytoplankton. Here we present the first implementation of the CAFE model that incorporates inherent optical properties derived from ocean color measurements into a mechanistic and accurate model of phytoplankton growth rates (μ) and net phytoplankton production (NPP). The CAFE model calculates NPP as the product of energy absorption (QPAR) and the efficiency (ϕμ) by which absorbed energy is converted into carbon biomass (CPhyto), while μ is calculated as NPP normalized to CPhyto. The CAFE model performance is evaluated alongside 21 other NPP models against a spatially robust and globally representative set of direct NPP measurements. This analysis demonstrates that the CAFE model explains the greatest amount of variance and has the lowest model bias relative to the other NPP models analyzed with this data set. Global oceanic NPP from the CAFE model (52 Pg C yr-1) and mean division rates (0.34 day-1) are derived from climatological satellite data (2002-2014). This manuscript discusses and validates individual CAFE model parameters (e.g., QPAR and ϕμ), provides detailed sensitivity analyses, and compares the CAFE model results and parameterization to other widely cited models.
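The two relations stated in the abstract can be written compactly as below; the symbols follow the abstract's notation, and the accompanying comments are interpretive glosses rather than text from the paper.

```latex
% CAFE model relations as described in the abstract
\[
  \mathrm{NPP} \;=\; Q_{\mathrm{PAR}} \,\phi_{\mu},
  \qquad
  \mu \;=\; \frac{\mathrm{NPP}}{C_{\mathrm{Phyto}}},
\]
% where Q_PAR is the absorbed photosynthetically available radiation,
% phi_mu the efficiency of converting absorbed energy into carbon biomass,
% C_Phyto the phytoplankton carbon biomass, and mu the division (growth) rate.
```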
NASA Astrophysics Data System (ADS)
Hancher, M.; Lieber, A.; Scott, L.
2017-12-01
The volume of satellite and other Earth data is growing rapidly. Combined with information about where people are, these data can inform decisions in a range of areas including food and water security, disease and disaster risk management, biodiversity, and climate adaptation. Google's platform for planetary-scale geospatial data analysis, Earth Engine, grants access to petabytes of continually updating Earth data, programming interfaces for analyzing the data without the need to download and manage it, and mechanisms for sharing the analyses and publishing results for data-driven decision making. In addition to data about the planet, data about the human planet - population, settlement and urban models - are now available for global scale analysis. The Earth Engine APIs enable these data to be joined, combined or visualized with economic or environmental indicators such as nighttime lights trends, global surface water, or climate projections, in the browser without the need to download anything. We will present our newly developed application intended to serve as a resource for government agencies, disaster response and public health programs, or other consumers of these data to quickly visualize the different population models, and compare them to ground truth tabular data to determine which model suits their immediate needs. Users can further tap into the power of Earth Engine and other Google technologies to perform a range of analysis from simple statistics in custom regions to more complex machine learning models. We will highlight case studies in which organizations around the world have used Earth Engine to combine population data with multiple other sources of data, such as water resources and roads data, over deep stacks of temporal imagery to model disease risk and accessibility to inform decisions.
Principal process analysis of biological models.
Casagranda, Stefano; Touzeau, Suzanne; Ropers, Delphine; Gouzé, Jean-Luc
2018-06-14
Understanding the dynamical behaviour of biological systems is challenged by their large number of components and interactions. While efforts have been made in this direction to reduce model complexity, they often prove insufficient to grasp which and when model processes play a crucial role. Answering these questions is fundamental to unravel the functioning of living organisms. We design a method for dealing with model complexity, based on the analysis of dynamical models by means of Principal Process Analysis. We apply the method to a well-known model of circadian rhythms in mammals. The knowledge of the system trajectories allows us to decompose the system dynamics into processes that are active or inactive with respect to a certain threshold value. Process activities are graphically represented by Boolean and Dynamical Process Maps. We detect model processes that are always inactive, or inactive on some time interval. Eliminating these processes reduces the complex dynamics of the original model to the much simpler dynamics of the core processes, in a succession of sub-models that are easier to analyse. We quantify by means of global relative errors the extent to which the simplified models reproduce the main features of the original system dynamics and apply global sensitivity analysis to test the influence of model parameters on the errors. The results obtained prove the robustness of the method. The analysis of the sub-model dynamics allows us to identify the source of circadian oscillations. We find that the negative feedback loop involving proteins PER, CRY, CLOCK-BMAL1 is the main oscillator, in agreement with previous modelling and experimental studies. In conclusion, Principal Process Analysis is a simple-to-use method, which constitutes an additional and useful tool for analysing the complex dynamical behaviour of biological systems.
NASA Astrophysics Data System (ADS)
Li, Longhui
2015-04-01
Twelve Earth System Models (ESMs) from phase 5 of the Coupled Model Intercomparison Project (CMIP5) are evaluated in terms of ecosystem water use efficiency (WUE) and the Budyko framework. Simulated values of GPP and ET from the ESMs were validated against FLUXNET measurements, and the slope of the linear regression between the measurement and the model ranged from 0.24 in CanESM2 to 0.8 in GISS-E2 for GPP, and from 0.51 to 0.86 for ET. The performance of the 12 ESMs in simulating ET is generally better than for GPP. Compared with flux-tower-based estimates by Jung et al. [Journal of Geophysical Research 116 (2011) G00J07] (JU11), all ESMs could capture the latitudinal variations of GPP and ET, but the majority of models strongly overestimated GPP and ET, particularly around the equator. The 12 ESMs showed much larger variations in latitudinal WUE. Four of the 12 ESMs predicted global annual GPP higher than 150 Pg C year-1, and the other 8 ESMs predicted global GPP within ±15% of the JU11 GPP. In contrast, all ESMs predicted moderate bias for global ET. The coefficient of variation (CV) of ET (0.11) is significantly less than that of GPP (0.25). More than half of the 12 ESMs generally comply with the Budyko framework, but some models deviate considerably. Spatial analysis of errors in GPP and ET indicated that model results differ substantially among models in different regions. This study suggested that the estimate of ET was much better than that of GPP. Incorporating the convergence of WUE and the Budyko framework into ESMs as constraints in the next round of the CMIP scheme is expected to decrease the uncertainties of carbon and water flux estimates.
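The evaluation rests on two standard definitions, written out below for reference; these are the conventional forms and are not quoted from the paper itself.

```latex
% Standard definitions underlying the evaluation (conventional notation, assumed here)
\[
  \mathrm{WUE} \;=\; \frac{\mathrm{GPP}}{\mathrm{ET}},
  \qquad
  \frac{\mathrm{ET}}{P} \;=\; F\!\left(\frac{\mathrm{PET}}{P}\right),
\]
% where P is precipitation, PET potential evapotranspiration, and F the Budyko
% curve relating the evaporative index ET/P to the dryness index PET/P.
```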
Kuo, Jane Z.; Zangwill, Linda M.; Medeiros, Felipe A.; Liebmann, Jeffery M.; Girkin, Christopher A.; Hammel, Na’ama; Rotter, Jerome I.; Weinreb, Robert N.
2015-01-01
Purpose: To perform a quantitative trait locus (QTL) analysis and evaluate whether a locus between SIX1 and SIX6 is associated with retinal nerve fiber layer (RNFL) thickness in individuals of European descent. Design: Observational, multi-center, cross-sectional study. Methods: 231 participants were recruited from the Diagnostic Innovations in Glaucoma Study and the African Descent and Glaucoma Evaluation Study. Association of rs10483727 in SIX1-SIX6 with global and sectoral RNFL thickness was performed. Quantitative trait analysis with the additive model of inheritance was analyzed using linear regression. Trend analysis was performed to evaluate the mean global and sectoral RNFL thickness with 3 genotypes of interest (T/T, C/T, C/C). All models were adjusted for age and gender. Results: The direction of association between the T allele and RNFL thickness was consistent in the global and different sectoral RNFL regions. Each copy of the T risk allele in rs10483727 was associated with 0.16 μm thinner global RNFL thickness (β = −0.16, 95% CI: −0.28 to −0.03; P = 0.01). Similar patterns were found for the sectoral regions, including inferior (P = 0.03), inferior-nasal (P = 0.017), superior-nasal (P = 0.0025), superior (P = 0.002) and superior-temporal (P = 0.008). The greatest differences were observed in the superior and inferior quadrants, supporting clinical observations of RNFL thinning in glaucoma. Thinner global RNFL was found in subjects with T/T genotypes compared to subjects with C/T and C/C genotypes (P = 0.044). Conclusions: Each copy of the T risk allele has an additive effect and was associated with thinner global and sectoral RNFL. Findings from this QTL analysis further support a genetic contribution to glaucoma pathophysiology. PMID:25849520
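The additive-model quantitative trait analysis described above (RNFL thickness regressed on the count of T risk alleles, adjusted for age and gender) can be sketched as an ordinary least squares fit. The data frame below is entirely synthetic and the variable names are placeholders, not the study's dataset.

```python
# Synthetic illustration of an additive-model QTL regression adjusted for age and gender.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 231
df = pd.DataFrame({
    "t_alleles": rng.integers(0, 3, n),              # 0, 1, or 2 copies of the T risk allele
    "age": rng.normal(60, 10, n),
    "male": rng.integers(0, 2, n),
})
# Synthetic global RNFL thickness (micrometers) with a small additive allele effect.
df["rnfl_global"] = 95 - 0.16 * df["t_alleles"] - 0.2 * (df["age"] - 60) + rng.normal(0, 5, n)

model = smf.ols("rnfl_global ~ t_alleles + age + male", data=df).fit()
print(model.params["t_alleles"], model.pvalues["t_alleles"])   # per-allele effect and its p-value
```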
A Canonical Response in Rainfall Characteristics to Global Warming: Projections by IPCC CMIP5 Models
NASA Technical Reports Server (NTRS)
Lau, William K. M.; Wu, H. T.; Kim, K. M.
2012-01-01
Changes in rainfall characteristics induced by global warming are examined based on probability distribution function (PDF) analysis of the outputs of 14 IPCC (Intergovernmental Panel on Climate Change) CMIP5 (5th Coupled Model Intercomparison Project) models under various scenarios of increased CO2 emissions. Results show that collectively the CMIP5 models project a robust and consistent global and regional rainfall response to CO2 warming. Globally, the models show a 1-3% increase in rainfall per degree rise in temperature, with a canonical response featuring a large increase (100-250%) in the frequency of occurrence of very heavy rain, a reduction (5-10%) of moderate rain, and an increase (10-15%) of light rain events. Regionally, even though details vary among models, a majority of the models (>10 out of 14) project a consistent large-scale response with more heavy rain events in climatologically wet regions, most pronounced in the Pacific ITCZ and the Asian monsoon. Moderate rain events are found to decrease over extensive regions of the subtropical and extratropical oceans, but increase over the extratropical land regions and the Southern Ocean. The spatial distribution of light rain resembles that of moderate rain, but mostly with opposite polarity. The majority of the models also show an increase in the number of dry events (absence or only trace amounts of rain) over subtropical and tropical land regions in both hemispheres. These results suggest that rainfall characteristics are changing and that increased extreme rainfall events and drought occurrences are connected, as a consequence of a global adjustment of the large-scale circulation to global warming.
Lascola, Robert; O'Rourke, Patrick E.; Kyser, Edward A.
2017-10-05
Here, we have developed a piecewise local (PL) partial least squares (PLS) analysis method for total plutonium measurements by absorption spectroscopy in nitric acid-based nuclear material processing streams. Instead of using a single PLS model that covers all expected solution conditions, the method selects one of several local models based on an assessment of solution absorbance, acidity, and Pu oxidation state distribution. The local models match the global model for accuracy against the calibration set, but were observed in several instances to be more robust to variations associated with measurements in the process. The improvements are attributed to the relative parsimony of the local models. Not all of the sources of spectral variation are uniformly present at each part of the calibration range. Thus, the global model is locally overfitting and susceptible to increased variance when presented with new samples. A second set of models quantifies the relative concentrations of Pu(III), (IV), and (VI). Standards containing a mixture of these species were not at equilibrium due to a disproportionation reaction. Therefore, a separate principal component analysis is used to estimate the concentrations of the individual oxidation states in these standards in the absence of independent confirmatory analysis. The PL analysis approach is generalizable to other systems where the analysis of chemically complicated systems can be aided by rational division of the overall range of solution conditions into simpler sub-regions.
Mukhtar, Hussnain; Lin, Yu-Pin; Shipin, Oleg V; Petway, Joy R
2017-07-12
This study presents an approach for obtaining realization sets of parameters for nitrogen removal in a pilot-scale waste stabilization pond (WSP) system. The proposed approach was designed for optimal parameterization, local sensitivity analysis, and global uncertainty analysis of a dynamic simulation model of the WSP using the R software package Flexible Modeling Environment (R-FME) with the Markov chain Monte Carlo (MCMC) method. Additionally, generalized likelihood uncertainty estimation (GLUE) was integrated into the FME to evaluate the major parameters that affect the simulation outputs in the study WSP. Comprehensive modeling analysis was used to simulate and assess nine parameters and the concentrations of ON-N, NH₃-N and NO₃-N. Results indicate that the integrated FME-GLUE-based model, with good Nash-Sutcliffe coefficients (0.53-0.69) and correlation coefficients (0.76-0.83), successfully simulates the concentrations of ON-N, NH₃-N and NO₃-N. Moreover, the Arrhenius constant was the only parameter to which model performance for the ON-N and NH₃-N simulations was sensitive. However, global sensitivity analysis showed that the Nitrosomonas growth rate, the denitrification constant, and the maximum growth rate at 20 °C were sensitive parameters for the ON-N and NO₃-N simulations.
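The following is a generic GLUE-style sketch of the uncertainty step, not the R-FME/MCMC implementation used in the study; the toy decay model, parameter ranges, and the behavioral Nash-Sutcliffe threshold are assumptions for illustration only.

```python
# Generic GLUE-style uncertainty sketch. (The study above used the R package
# FME with MCMC; this Python stand-in only illustrates the GLUE idea.) The toy
# first-order decay model, the parameter ranges, and the behavioral threshold
# on the Nash-Sutcliffe efficiency are all hypothetical.
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(0.0, 30.0)                                        # days
obs = 20.0 * np.exp(-0.15 * t) + rng.normal(0, 0.5, t.size)     # synthetic NH3-N observations

def model(k, c0):
    """Toy first-order nitrogen decay standing in for the WSP simulator."""
    return c0 * np.exp(-k * t)

def nse(sim, obs):
    """Nash-Sutcliffe efficiency."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Monte Carlo sampling of the parameter space
k_s = rng.uniform(0.01, 0.5, 5000)
c0_s = rng.uniform(10.0, 30.0, 5000)
scores = np.array([nse(model(k, c0), obs) for k, c0 in zip(k_s, c0_s)])

behavioral = scores > 0.5                                        # behavioral threshold (assumed)
sims = np.array([model(k, c0) for k, c0 in zip(k_s[behavioral], c0_s[behavioral])])
lower, upper = np.percentile(sims, [5, 95], axis=0)              # uncertainty band over behavioral runs

print("behavioral parameter sets:", int(behavioral.sum()))
print("band width at t=0:", upper[0] - lower[0])
```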
A global goodness-of-fit statistic for Cox regression models.
Parzen, M; Lipsitz, S R
1999-06-01
In this paper, a global goodness-of-fit test statistic for a Cox regression model, which has an approximate chi-squared distribution when the model has been correctly specified, is proposed. Our goodness-of-fit statistic is global and has power to detect if interactions or higher order powers of covariates in the model are needed. The proposed statistic is similar to the Hosmer and Lemeshow (1980, Communications in Statistics A10, 1043-1069) goodness-of-fit statistic for binary data as well as Schoenfeld's (1980, Biometrika 67, 145-153) statistic for the Cox model. The methods are illustrated using data from a Mayo Clinic trial in primary biliary cirrhosis of the liver (Fleming and Harrington, 1991, Counting Processes and Survival Analysis), in which the outcome is the time until liver transplantation or death. There are 17 possible covariates. Two Cox proportional hazards models are fit to the data, and the proposed goodness-of-fit statistic is applied to the fitted models.
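As a hedged illustration of the grouped goodness-of-fit idea (in the spirit of Hosmer-Lemeshow, not the exact statistic proposed here), the sketch below fits a Cox model with the lifelines package, splits subjects into deciles of fitted risk, and compares observed with model-expected event counts.

```python
# Illustration of a grouped goodness-of-fit check for a fitted Cox model using
# the lifelines package: split subjects into risk-score deciles and compare
# observed event counts with expected counts from the cumulative hazard. This
# mimics the Hosmer-Lemeshow idea; it is not the exact proposed statistic.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()                                            # example recidivism survival data
cph = CoxPHFitter().fit(df, duration_col="week", event_col="arrest")

risk = np.asarray(cph.predict_partial_hazard(df)).ravel()    # exp(X @ beta) per subject
H0 = cph.baseline_cumulative_hazard_.iloc[:, 0]              # baseline cumulative hazard H0(t)
# Expected events per subject ~ H0(follow-up time) * exp(X @ beta);
# linear interpolation of the step function is good enough for a sketch.
H0_at_t = np.interp(df["week"], H0.index.values, H0.values)
expected = H0_at_t * risk

groups = pd.qcut(risk, 10, labels=False)                     # deciles of fitted risk
table = pd.DataFrame({"observed": df["arrest"].values, "expected": expected, "group": groups})
summary = table.groupby("group")[["observed", "expected"]].sum()
chi2 = ((summary["observed"] - summary["expected"]) ** 2 / summary["expected"]).sum()
print(summary)
print("approximate chi-square over 10 groups:", chi2)
```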
USDA-ARS's Scientific Manuscript database
Soil moisture is a fundamental data source used in crop growth stage and crop stress models developed by the USDA Foreign Agriculture Service for global crop estimation. USDA’s International Production Assessment Division (IPAD) of the Office of Global Analysis (OGA). Currently, the PECAD DSS utiliz...
Global Information Justice: Rights, Responsibilities, and Caring Connections.
ERIC Educational Resources Information Center
Smith, Martha
2001-01-01
Explains the concept of global information justice and describes it as an ethical ideal, as an organizing principle for a model for analysis, and as a direction for policy making. Discusses the use of new technologies; access to technology; ownership; privacy; security; community; and the Universal Declaration of Human Rights. (Author/LRW)
NASA Astrophysics Data System (ADS)
Tohidnia, S.; Tohidi, G.
2018-02-01
The current paper develops three different ways to measure the multi-period global cost efficiency of homogeneous networks of processes when the prices of exogenous inputs are known at all time periods. A multi-period network data envelopment analysis model is presented to measure the minimum cost of the network system based on the global production possibility set. We show that there is a relationship between the multi-period global cost efficiency of the network system and those of its subsystems and processes. The proposed model is applied to compute the global cost Malmquist productivity index for measuring the productivity change of the network system and of each of its processes between two time periods. This index is circular. Furthermore, we show that the productivity change of the network system can be defined as a weighted average of the process productivity changes. Finally, a numerical example is presented to illustrate the proposed approach.
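A single-period cost-efficiency DEA building block can be written as a small linear program; the sketch below solves it with scipy, using hypothetical input/output data, and does not reproduce the paper's multi-period network extension or the global cost Malmquist index.

```python
# Sketch of the single-period cost-efficiency DEA building block: for a target
# DMU, choose input quantities x and intensity weights lambda that minimize
# input cost subject to envelopment constraints, then compare with the observed
# cost. Data and prices are hypothetical.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0, 3.0, 5.0],        # inputs  (m x n): rows=inputs, cols=DMUs
              [3.0, 2.0, 5.0, 4.0]])
Y = np.array([[1.0, 1.5, 1.2, 2.0]])       # outputs (s x n)
p = np.array([1.0, 2.0])                   # known exogenous input prices
m, n = X.shape
s = Y.shape[0]
o = 0                                      # index of the DMU under evaluation

c = np.concatenate([p, np.zeros(n)])       # decision vars: [x (m), lambda (n)]
A_ub = np.block([[-np.eye(m), X],          # X @ lam <= x
                 [np.zeros((s, m)), -Y]])  # Y @ lam >= Y[:, o]
b_ub = np.concatenate([np.zeros(m), -Y[:, o]])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs")   # variables bounded >= 0 by default
cost_efficiency = res.fun / (p @ X[:, o])
print("minimum cost:", res.fun, "cost efficiency:", cost_efficiency)
```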
A remote sensing based vegetation classification logic for global land cover analysis
Running, Steven W.; Loveland, Thomas R.; Pierce, Lars L.; Nemani, R.R.; Hunt, E. Raymond
1995-01-01
This article proposes a simple new logic for classifying global vegetation. The critical features of this classification are that 1) it is based on simple, observable, unambiguous characteristics of vegetation structure that are important to ecosystem biogeochemistry and can be measured in the field for validation, 2) the structural characteristics are remotely sensible so that repeatable and efficient global reclassifications of existing vegetation will be possible, and 3) the defined vegetation classes directly translate into the biophysical parameters of interest by global climate and biogeochemical models. A first test of this logic for the continental United States is presented based on an existing 1 km AVHRR normalized difference vegetation index database. Procedures for solving critical remote sensing problems needed to implement the classification are discussed. Also, some inferences from this classification to advanced vegetation biophysical variables such as specific leaf area and photosynthetic capacity useful to global biogeochemical modeling are suggested.
NASA Astrophysics Data System (ADS)
Zhuang, Y.; Tian, F.; Yigzaw, W.; Hejazi, M. I.; Li, H. Y.; Turner, S. W. D.; Vernon, C. R.
2017-12-01
More and more reservoirs are being built or planned to help meet increasing water demand all over the world. However, is building new reservoirs always helpful to water supply? To address this question, the river routing module of the Global Change Assessment Model (GCAM) has been extended with a simple yet physically based reservoir scheme accounting for irrigation, flood control and hydropower operations at each individual reservoir. The new GCAM river routing model has been applied over the global domain with runoff inputs from the Variable Infiltration Capacity Model. The simulated streamflow is validated at 150 global river basins where observed streamflow data are available. The model performance has been significantly improved at 77 basins and worsened at 35 basins. To facilitate the analysis of additional reservoir storage impacts at the basin level, a lumped version of the GCAM reservoir model has been developed, representing a single lumped reservoir at each river basin with the regulation capacity of all reservoirs combined. A Sequent Peak Analysis is used to estimate how much additional reservoir storage is required to satisfy the current water demand. For basins with a water deficit, the water supply reliability can be improved with additional storage. However, there is a threshold storage value at each basin beyond which the reliability stops increasing, suggesting that building new reservoirs will not help further relieve the water stress. Findings from this research can inform the future planning and management of new reservoirs.
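The Sequent Peak Analysis mentioned above reduces to a short recursion over accumulated deficits; the sketch below applies it to a hypothetical monthly inflow and demand series.

```python
# Minimal Sequent Peak Analysis sketch: the required storage is the maximum
# accumulated deficit K_t, where K_t = max(0, K_{t-1} + demand_t - inflow_t).
# Inflow and demand series are hypothetical monthly values.
import numpy as np

inflow = np.array([80, 60, 40, 20, 10, 5, 5, 15, 30, 60, 90, 100], dtype=float)
demand = np.full(12, 40.0)                 # constant monthly demand

def sequent_peak(inflow, demand, cycles=2):
    """Return the storage needed to satisfy `demand` given `inflow`.

    The series is repeated (`cycles`) so that a deficit spanning the
    year boundary is captured, as is standard for periodic records.
    """
    k = 0.0
    k_max = 0.0
    for q, d in zip(np.tile(inflow, cycles), np.tile(demand, cycles)):
        k = max(0.0, k + d - q)            # accumulated deficit
        k_max = max(k_max, k)
    return k_max

print("required storage:", sequent_peak(inflow, demand))
```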
Internally Consistent MODIS Estimate of Aerosol Clear-Sky Radiative Effect Over the Global Oceans
NASA Technical Reports Server (NTRS)
Remer, Lorraine A.; Kaufman, Yoram J.
2004-01-01
Modern satellite remote sensing, and in particular the MODerate resolution Imaging Spectroradiometer (MODIS), offers a measurement-based pathway to estimate global aerosol radiative effects and aerosol radiative forcing. Over the oceans, MODIS retrieves the total aerosol optical thickness, but also reports which combination of the 9 different aerosol models was used to obtain the retrieval. Each of the 9 models is characterized by a size distribution and complex refractive index, which through Mie calculations correspond to a unique set of single scattering albedo, asymmetry parameter and spectral extinction for each model. The combination of these sets of optical parameters weighted by the optical thickness attributed to each model in the retrieval produces the best fit to the observed radiances at the top of the atmosphere. Thus the MODIS ocean aerosol retrieval provides us with (1) an observed distribution of global aerosol loading, and (2) an internally-consistent, observed, distribution of aerosol optical models that when used in combination will best represent the radiances at the top of the atmosphere. We use these two observed global distributions to initialize the column climate model by Chou and Suarez to calculate the aerosol radiative effect at the top of the atmosphere and the radiative efficiency of the aerosols over the global oceans. We apply the analysis to 3 years of MODIS retrievals from the Terra satellite and produce global and regional, seasonally varying, estimates of aerosol radiative effect over the clear-sky oceans.
Integrated modelling of anthropogenic land-use and land-cover change on the global scale
NASA Astrophysics Data System (ADS)
Schaldach, R.; Koch, J.; Alcamo, J.
2009-04-01
In many cases land-use activities go hand in hand with substantial modifications of the physical and biological cover of the Earth's surface, resulting in direct effects on energy and matter fluxes between terrestrial ecosystems and the atmosphere. For instance, the conversion of forest to cropland changes climate-relevant surface parameters (e.g. albedo) as well as evapotranspiration processes and carbon flows. In turn, human land-use decisions are also influenced by environmental processes. Changing temperature and precipitation patterns, for example, are important determinants for the location and intensity of agriculture. Due to these close linkages, processes of land-use and related land-cover change should be considered as important components in the construction of Earth System models. A major challenge in modelling land-use change on the global scale is the integration of socio-economic aspects and human decision making with environmental processes. One of the few global approaches that integrates functional components to represent both anthropogenic and environmental aspects of land-use change is the LandSHIFT model. It simulates the spatial and temporal dynamics of the human land-use activities settlement, cultivation of food crops and grazing management, which compete for the available land resources. The rationale of the model is to regionalize the demands for area-intensive commodities (e.g. crop production) and services (e.g. space for housing) from the country level to a global grid with a spatial resolution of 5 arc-minutes. The modelled land-use decisions within the agricultural sector are influenced by changing climate and the resulting effects on biomass productivity. Currently, this causal chain is modelled by integrating results from the process-based vegetation model LPJmL for changing crop yields and net primary productivity of grazing land. Model output of LandSHIFT is a time series of grid maps with land-use/land-cover information that can serve as a basis for further impact analysis. An exemplary simulation study with LandSHIFT is presented, based on scenario assumptions from the UNEP Global Environmental Outlook 4. The time horizon of the analysis is the year 2050. Changes of future food production at the country level are computed by the agro-economy model IMPACT as a function of demography, economic development and global trade patterns. Together with scenario assumptions on climatic change and population growth, these data serve as model input to compute the changing land-use and land-cover. The continental and global scale model results are then analysed with respect to changes in the spatial pattern of natural vegetation as well as the resulting effects on evapotranspiration processes and land surface parameters. Furthermore, possible linkages of LandSHIFT to the different components of Earth System models (e.g. climate and natural vegetation) are discussed.
NASA Astrophysics Data System (ADS)
Cai, X.; Zhang, X.; Zhu, T.
2014-12-01
Global food security is constrained by local and regional land and water availability, as well as other agricultural input limitations and inappropriate national and global regulations. In a theoretical context, this study assumes that optimal water and land use in local food production to maximize food security and social welfare at the global level can be driven by global trade. It follows the concept of "virtual resources trade", i.e., utilizing international trade of agricultural commodities to reduce dependency on local resources and achieve land and water savings in the world. An optimization model based on the partial equilibrium of agriculture is developed for the analysis, including local commodity production, land and water resource constraints, demand by country, and the global food market. Through the model, the marginal values (MVs) of social welfare for water and land at the level of so-called food production units (i.e., sub-basins with similar agricultural production conditions) are derived and mapped for the world. In this presentation, we will introduce the model structure, explain the meaning of MVs at the local level and their distribution around the world, and discuss the policy implications for global communities to enhance global food security. In particular, we will examine the economic values of water and land under different world targets of food security (e.g., number of malnourished population or children in a future year). In addition, we will also discuss the opportunities for data to improve such global modeling exercises.
Nonlinear dynamics of global atmospheric and Earth-system processes
NASA Technical Reports Server (NTRS)
Saltzman, Barry; Ebisuzaki, Wesley; Maasch, Kirk A.; Oglesby, Robert; Pandolfo, Lionel
1990-01-01
Researchers are continuing their studies of the nonlinear dynamics of global weather systems. Sensitivity analyses of large-scale dynamical models of the atmosphere (i.e., general circulation models, GCMs) were performed to establish the role of satellite signatures of soil moisture, sea surface temperature, snow cover, and sea ice as crucial boundary conditions determining global weather variability. To complete their study of the bimodality of the planetary wave states, they are using the dynamical systems approach to construct a low-order theoretical explanation of this phenomenon. This work should have important implications for extended range forecasting of low-frequency oscillations, elucidating the mechanisms for the transitions between the two wave modes. Researchers are using the methods of jump analysis and attractor dimension analysis to examine the long-term satellite records of significant variables (e.g., longwave radiation and cloud amount), to explore the nature of mode transitions in the atmosphere, and to determine the minimum number of equations needed to describe the main weather variations with a low-order dynamical system. Where feasible they will continue to explore the applicability of the methods of complex dynamical systems analysis to the study of the global earth-system from an integrative viewpoint involving the roles of geochemical cycling and the interactive behavior of the atmosphere, hydrosphere, and biosphere.
NASA Technical Reports Server (NTRS)
Cullather, Richard; Bosilovich, Michael
2017-01-01
The Modern-Era Retrospective analysis for Research and Applications, version 2 (MERRA-2) is a global atmospheric reanalysis produced by the NASA Global Modeling and Assimilation Office (GMAO). It spans the satellite observing era from 1980 to the present. The goals of MERRA-2 are to provide a regularly-gridded, homogeneous record of the global atmosphere, and to incorporate additional aspects of the climate system including trace gas constituents (stratospheric ozone), improved land surface representation, and cryospheric processes. MERRA-2 is also the first satellite-era global reanalysis to assimilate space-based observations of aerosols and represent their interactions with other physical processes in the climate system. The inclusion of these additional components is consistent with the overall objectives of an Integrated Earth System Analysis (IESA). MERRA-2 is intended to replace the original MERRA product, and reflects recent advances in atmospheric modeling and data assimilation. Modern hyperspectral radiance and microwave observations, along with GPS-Radio Occultation and NASA ozone datasets, are now assimilated in MERRA-2. Much of the structure of the data files remains the same in MERRA-2. While the original MERRA data format was HDF-EOS, the MERRA-2 supplied binary data format is now NetCDF4 (with lossy compression to save space).
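For readers working with the NetCDF4 files mentioned above, the sketch below shows one way to open a MERRA-2 granule and form an area-weighted global mean with xarray; the file name and the T2M variable are placeholders for whichever collection is actually downloaded.

```python
# Sketch of reading a MERRA-2 NetCDF4 granule with xarray and forming an
# area-weighted global mean. The file name and the variable name ("T2M",
# 2-m air temperature from the single-level diagnostics) are placeholders;
# substitute whichever MERRA-2 collection and field you actually have locally.
import numpy as np
import xarray as xr

ds = xr.open_dataset("MERRA2_400.tavg1_2d_slv_Nx.20170101.nc4")   # hypothetical local file
t2m = ds["T2M"]                                  # dims: (time, lat, lon)

weights = np.cos(np.deg2rad(ds["lat"]))          # area weights on the regular lat-lon grid
global_mean = t2m.weighted(weights).mean(dim=("lat", "lon"))
print(global_mean.values)
```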
The NASA Modern Era Reanalysis for Research and Applications, Version-2 (MERRA-2)
NASA Astrophysics Data System (ADS)
Gelaro, R.; McCarty, W.; Molod, A.; Suarez, M.; Takacs, L.; Todling, R.
2014-12-01
The NASA Modern Era Reanalysis for Research and Applications, Version-2 (MERRA-2) is a reanalysis for the satellite era using an updated version of the Goddard Earth Observing System Data Assimilation System Version-5 (GEOS-5) produced by the Global Modeling and Assimilation Office (GMAO). MERRA-2 will assimilate meteorological and aerosol observations not available to MERRA and includes improvements to the GEOS-5 model and analysis scheme so as to provide an ongoing climate analysis beyond MERRA's terminus. MERRA-2 will also serve as a development milestone for a future GMAO coupled Earth system analysis. Production of MERRA-2 began in June 2014 in four processing streams, with convergence to a single near-real-time climate analysis expected by early 2015. This talk provides an overview of the MERRA-2 system developments and key science results. For example, compared with MERRA, MERRA-2 exhibits a well-balanced relationship between global precipitation and evaporation, with significantly reduced sensitivity to changes in the global observing system through time. Other notable improvements include reduced biases in the tropical middle- and upper-tropospheric wind and near-surface temperature over continents.
MULTIVARIATE LINEAR MIXED MODELS FOR MULTIPLE OUTCOMES. (R824757)
We propose a multivariate linear mixed model (MLMM) for the analysis of multiple outcomes, which generalizes the latent variable model of Sammel and Ryan. The proposed model assumes a flexible correlation structure among the multiple outcomes, and allows a global test of the impact of ...
A Query Expansion Framework in Image Retrieval Domain Based on Local and Global Analysis
Rahman, M. M.; Antani, S. K.; Thoma, G. R.
2011-01-01
We present an image retrieval framework based on automatic query expansion in a concept feature space by generalizing the vector space model of information retrieval. In this framework, images are represented by vectors of weighted concepts similar to the keyword-based representation used in text retrieval. To generate the concept vocabularies, a statistical model is built by utilizing Support Vector Machine (SVM)-based classification techniques. The images are represented as “bag of concepts” that comprise perceptually and/or semantically distinguishable color and texture patches from local image regions in a multi-dimensional feature space. To explore the correlation between the concepts and overcome the assumption of feature independence in this model, we propose query expansion techniques in the image domain from a new perspective based on both local and global analysis. For the local analysis, the correlations between the concepts based on the co-occurrence pattern, and the metrical constraints based on the neighborhood proximity between the concepts in encoded images, are analyzed by considering local feedback information. We also analyze the concept similarities in the collection as a whole in the form of a similarity thesaurus and propose an efficient query expansion based on the global analysis. The experimental results on a photographic collection of natural scenes and a biomedical database of different imaging modalities demonstrate the effectiveness of the proposed framework in terms of precision and recall. PMID:21822350
Güizado-Rodríguez, Martha Anahí; Ballesteros-Barrera, Claudia; Casas-Andreu, Gustavo; Barradas-Miranda, Victor Luis; Téllez-Valdés, Oswaldo; Salgado-Ugarte, Isaías Hazarmabeth
2012-12-01
The ectothermic nature of reptiles makes them especially sensitive to global warming. Although climate change and its implications are a frequent topic of detailed studies, most of these studies are carried out without making a distinction between populations. Here we present the first study of an Aspidoscelis species that evaluates the effects of global warming on its distribution using ecological niche modeling. The aims of our study were (1) to understand whether predicted warmer climatic conditions affect the geographic potential distribution of different climatic groups of Aspidoscelis costata costata and (2) to identify potential altitudinal changes of these groups under global warming. We used the maximum entropy species distribution model (MaxEnt) to project the potential distributions expected for the years 2020, 2050, and 2080 under a single simulated climatic scenario. Our analysis suggests that some climatic groups of Aspidoscelis costata costata will exhibit reductions in their distribution while others will exhibit expansions, with potential upward shifts toward higher elevations in response to climate warming. The different climatic groups revealed in our analysis showed heterogeneous responses to climatic change, illustrating the complex nature of species' geographic responses to environmental change and the importance of modeling climatic or geographic groups and/or populations instead of the entire species' range treated as a homogeneous entity.
Scaling law analysis of paraffin thin films on different surfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dotto, M. E. R.; Camargo, S. S. Jr.
2010-01-15
The dynamics of paraffin deposit formation on different surfaces was analyzed based on scaling laws. Carbon-based films were deposited onto silicon (Si) and stainless steel substrates from methane (CH4) gas using radio frequency plasma enhanced chemical vapor deposition. The different substrates were characterized with respect to their surface energy by contact angle measurements, surface roughness, and morphology. Paraffin thin films were obtained by the casting technique and were subsequently characterized by an atomic force microscope in noncontact mode. The results indicate that the morphology of paraffin deposits is strongly influenced by the substrates used. Scaling law analysis for coated substrates presents two distinct dynamics: a local roughness exponent (α_local) associated with short-range surface correlations and a global roughness exponent (α_global) associated with long-range surface correlations. The local dynamics is described by the Wolf-Villain model, and the global dynamics by the Kardar-Parisi-Zhang model. A local correlation length (L_local) defines the transition between the local and global dynamics, with L_local approximately 700 nm, in accordance with the spacing of planes measured from atomic force micrographs. For uncoated substrates, the growth dynamics is related to the Edwards-Wilkinson model.
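The roughness exponents discussed above can be estimated from the scaling of the interface width with window size; the sketch below does this on a synthetic height map, with the window sizes and the local/global crossover chosen purely for illustration.

```python
# Sketch of a scaling-law analysis on a height map: the interface width w(L),
# the RMS height fluctuation inside windows of size L, scales as w ~ L^alpha.
# Fitting log w against log L on short and long length scales gives the local
# and global roughness exponents. The surface here is synthetic noise, so the
# numbers are illustrative only.
import numpy as np

rng = np.random.default_rng(3)
h = np.cumsum(rng.normal(size=(512, 512)), axis=1)    # toy correlated surface

def interface_width(h, L):
    """RMS deviation of heights inside non-overlapping L x L windows."""
    n = (h.shape[0] // L) * L
    blocks = h[:n, :n].reshape(n // L, L, n // L, L).swapaxes(1, 2).reshape(-1, L, L)
    return np.mean([b.std() for b in blocks])

sizes = np.array([4, 8, 16, 32, 64, 128, 256])
w = np.array([interface_width(h, L) for L in sizes])

crossover = 32                                         # assumed local/global crossover (pixels)
local = sizes <= crossover
alpha_local = np.polyfit(np.log(sizes[local]), np.log(w[local]), 1)[0]
alpha_global = np.polyfit(np.log(sizes[~local]), np.log(w[~local]), 1)[0]
print("alpha_local ~", alpha_local, "alpha_global ~", alpha_global)
```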
NASA Astrophysics Data System (ADS)
Reisner, Jon; D'Angelo, Gennaro; Koo, Eunmo; Even, Wesley; Hecht, Matthew; Hunke, Elizabeth; Comeau, Darin; Bos, Randall; Cooley, James
2018-03-01
We present a multiscale study examining the impact of a regional exchange of nuclear weapons on global climate. Our models investigate multiple phases of the effects of nuclear weapons usage, including growth and rise of the nuclear fireball, ignition and spread of the induced firestorm, and comprehensive Earth system modeling of the oceans, land, ice, and atmosphere. This study follows from the scenario originally envisioned by Robock, Oman, Stenchikov, et al. (2007, https://doi.org/10.5194/acp-7-2003-2007), based on the analysis of Toon et al. (2007, https://doi.org/10.5194/acp-7-1973-2007), which assumes a regional exchange between India and Pakistan of fifty 15 kt weapons detonated by each side. We expand this scenario by modeling the processes that lead to production of black carbon, in order to refine the black carbon forcing estimates of these previous studies. When the Earth system model is initiated with 5 × 10⁹ kg of black carbon in the upper troposphere (approximately from 9 to 13 km), the impact on climate variables such as global temperature and precipitation in our simulations is similar to that predicted by previously published work. However, while our thorough simulations of the firestorm produce about 3.7 × 10⁹ kg of black carbon, we find that the vast majority of the black carbon never reaches an altitude above weather systems (approximately 12 km). Therefore, our Earth system model simulations conducted with model-informed atmospheric distributions of black carbon produce significantly lower global climatic impacts than assessed in prior studies, as the carbon at lower altitudes is more quickly removed from the atmosphere. In addition, our model ensembles indicate that statistically significant effects on global surface temperatures are limited to the first 5 years and are much smaller in magnitude than those shown in earlier works. None of the simulations produced a nuclear winter effect. We find that the effects on global surface temperatures are not uniform and are concentrated primarily around the highest arctic latitudes, dramatically reducing the global impact on human health and agriculture compared with that reported by earlier studies. Our analysis demonstrates that the probability of significant global cooling from a limited exchange scenario as envisioned in previous studies is highly unlikely, a conclusion supported by examination of natural analogs, such as large forest fires and volcanic eruptions.
Reisner, Jon Michael; D'Angelo, Gennaro; Koo, Eunmo; ...
2018-02-13
In this paper, we present a multi-scale study examining the impact of a regional exchange of nuclear weapons on global climate. Our models investigate multiple phases of the effects of nuclear weapons usage, including growth and rise of the nuclear fireball, ignition and spread of the induced firestorm, and comprehensive Earth system modeling of the oceans, land, ice, and atmosphere. This study follows from the scenario originally envisioned by Robock et al. (2007a), based on the analysis of Toon et al. (2007), which assumes a regional exchange between India and Pakistan of fifty 15-kiloton weapons detonated by each side. We expand this scenario by modeling the processes that lead to production of black carbon, in order to refine the black carbon forcing estimates of these previous studies. When the Earth system model is initiated with 5 × 10⁹ kg of black carbon in the upper troposphere (approximately 9 to 13 km), the impact on climate variables such as global temperature and precipitation in our simulations is similar to that predicted by previously published work. However, while our thorough simulations of the firestorm produce about 3.7 × 10⁹ kg of black carbon, we find that the vast majority of the black carbon never reaches an altitude above weather systems (approximately 12 km). Therefore, our Earth system model simulations conducted with model-informed atmospheric distributions of black carbon produce significantly lower global climatic impacts than assessed in prior studies, as the carbon at lower altitudes is more quickly removed from the atmosphere. In addition, our model ensembles indicate that statistically significant effects on global surface temperatures are limited to the first 5 years and are much smaller in magnitude than those shown in earlier works. None of the simulations produced a nuclear winter effect. We find that the effects on global surface temperatures are not uniform and are concentrated primarily around the highest arctic latitudes, dramatically reducing the global impact on human health and agriculture compared with that reported by earlier studies. Lastly, our analysis demonstrates that the probability of significant global cooling from a limited exchange scenario as envisioned in the previous studies is highly unlikely, a conclusion supported by examination of natural analogs, such as large forest fires and volcanic eruptions.
Human and climate impact on global riverine water and sediment fluxes - a distributed analysis
NASA Astrophysics Data System (ADS)
Cohen, S.; Kettner, A.; Syvitski, J. P.
2013-05-01
Understanding riverine water and sediment dynamics is an important undertaking both for socially relevant issues such as agriculture, water security and infrastructure management, and for scientific analysis of climate, landscapes, river ecology, oceanography and other disciplines. Providing good quantitative and predictive tools is therefore timely, particularly in light of predicted climate and land-use changes. The intensity of, and interplay between, man-made and climatic factors vary widely across the globe and are therefore hard to predict, which warrants the use of sophisticated numerical models. Here we use a distributed global riverine sediment and water discharge model (WBMsed) to simulate human and climate effects on our planet's large rivers.
Integrative Analysis of Desert Dust Size and Abundance Suggests Less Dust Climate Cooling
NASA Technical Reports Server (NTRS)
Kok, Jasper F.; Ridley, David A.; Zhou, Qing; Miller, Ron L.; Zhao, Chun; Heald, Colette L.; Ward, Daniel S.; Albani, Samuel; Haustein, Karsten
2017-01-01
Desert dust aerosols affect Earth's global energy balance through interactions with radiation, clouds, and ecosystems. But the magnitudes of these effects are so uncertain that it remains unclear whether atmospheric dust has a net warming or cooling effect on global climate. Consequently, it is still uncertain whether large changes in atmospheric dust loading over the past century have slowed or accelerated anthropogenic climate change, and the climate impact of possible future alterations in dust loading is similarly disputed. Here we use an integrative analysis of dust aerosol sizes and abundance to constrain the climatic impact of dust through direct interactions with radiation. Using a combination of observational, experimental, and model data, we find that atmospheric dust is substantially coarser than represented in current climate models. Since coarse dust warms global climate, the dust direct radiative effect (DRE) is likely less cooling than the −0.4 W m⁻² estimated by models in a current ensemble. We constrain the dust DRE to −0.20 (−0.48 to +0.20) W m⁻², which suggests that the dust DRE produces only about half the cooling that current models estimate, and raises the possibility that dust DRE is actually net warming the planet.
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2016-12-01
Sensitivity analysis has been an important tool in groundwater modeling to identify influential parameters. Among various sensitivity analysis methods, variance-based global sensitivity analysis has gained popularity for its model independence and its capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers the uncertainty contribution of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model and parametric. Furthermore, each layer of uncertainty source can contain multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility in how uncertainty components are grouped. The variance-based sensitivity analysis is thus extended to investigate the importance of a wider range of uncertainty sources: scenario, model, and other combinations of uncertainty components that can represent key model system processes (e.g., the groundwater recharge process, the flow and reactive transport process). For test and demonstration purposes, the developed methodology was applied to a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty source formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management and help decision-makers formulate policies and strategies.
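For comparison, the conventional single-parameter variance-based (Sobol) analysis that this framework extends can be run with the SALib package as sketched below; the toy model, parameter names, and ranges are hypothetical.

```python
# Conventional variance-based (Sobol) sensitivity analysis with SALib, the
# single-parameter baseline that the framework described above extends to
# grouped scenario/model/parameter uncertainty sources. The toy model and the
# parameter ranges are hypothetical.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["recharge", "conductivity", "dispersivity"],
    "bounds": [[0.1, 1.0], [1e-5, 1e-3], [0.5, 10.0]],
}

X = saltelli.sample(problem, 1024)                    # N*(2D+2) parameter sets

def toy_model(x):
    """Stand-in for a groundwater reactive-transport response."""
    recharge, conductivity, dispersivity = x
    return recharge * np.log10(conductivity) + 0.1 * dispersivity

Y = np.apply_along_axis(toy_model, 1, X)
Si = sobol.analyze(problem, Y)
print(dict(zip(problem["names"], Si["S1"])))          # first-order indices
print(dict(zip(problem["names"], Si["ST"])))          # total-effect indices
```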
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kollias, Pavlos
This is a multi-institutional, collaborative project using a three-tier modeling approach to bridge field observations and global cloud-permitting models, with emphasis on cloud population structural evolution through various large-scale environments. Our contribution was in data analysis for the generation of high-value cloud and precipitation products and the derivation of cloud statistics for model validation. There are two areas of data analysis to which we contributed: the development of a synergistic cloud and precipitation classification that identifies different cloud types (e.g. shallow cumulus, cirrus) and precipitation types (shallow, deep, convective, stratiform) using profiling ARM observations, and the development of a quantitative precipitation rate retrieval algorithm using profiling ARM observations. Similar efforts have been developed in the past for precipitation (weather radars), but not for the millimeter-wavelength (cloud) radar deployed at the ARM sites.
A mechanics framework for a progressive failure methodology for laminated composites
NASA Technical Reports Server (NTRS)
Harris, Charles E.; Allen, David H.; Lo, David C.
1989-01-01
A laminate strength and life prediction methodology has been postulated for laminated composites which accounts for the progressive development of microstructural damage to structural failure. A damage dependent constitutive model predicts the stress redistribution in an average sense that accompanies damage development in laminates. Each mode of microstructural damage is represented by a second-order tensor valued internal state variable which is a strain like quantity. The mechanics framework together with the global-local strategy for predicting laminate strength and life is presented in the paper. The kinematic effects of damage are represented by effective engineering moduli in the global analysis and the results of the global analysis provide the boundary conditions for the local ply level stress analysis. Damage evolution laws are based on experimental results.
NASA Astrophysics Data System (ADS)
DY, C. Y.; Fung, J. C. H.
2016-08-01
A meteorological model requires accurate initial and boundary conditions to obtain realistic numerical weather predictions. The land surface controls the surface heat and moisture exchanges, which can be determined by the physical properties of the soil and the soil state variables, subsequently exerting an effect on boundary layer meteorology. The initial and boundary conditions of soil moisture are currently obtained from the National Centers for Environmental Prediction FNL (Final) Operational Global Analysis data, which are available operationally at 1° by 1° resolution every 6 h. Another input to the model is the soil map generated from the Food and Agriculture Organization of the United Nations - United Nations Educational, Scientific and Cultural Organization (FAO-UNESCO) soil database, which combines several soil surveys from around the world. Both the soil moisture from the FNL analysis data and the default soil map lack accuracy and feature coarse resolutions, particularly for certain areas of China. In this study, we update the global soil map with data from Beijing Normal University on a 1 km by 1 km grid and propose an alternative method of soil moisture initialization. Simulations with the Weather Research and Forecasting model show that spinning up the soil moisture improves near-surface temperature and relative humidity prediction for the different types of soil moisture initialization. Explanations of these improvements, and of the improvement in planetary boundary layer height, are provided through process analysis.
NASA Astrophysics Data System (ADS)
Nemani, R. R.; Votava, P.; Golden, K.; Hashimoto, H.; Jolly, M.; White, M.; Running, S.; Coughlan, J.
2003-12-01
The latest generation of NASA Earth Observing System satellites has brought a new dimension to continuous monitoring of the living part of the Earth System, the Biosphere. EOS data can now provide weekly global measures of vegetation productivity and ocean chlorophyll, and many related biophysical factors such as land cover changes or snowmelt rates. However, the information with the highest economic value would be forecasts of impending conditions of the biosphere that would allow advanced decision-making to mitigate dangers or exploit positive trends. We have developed a software system called the Terrestrial Observation and Prediction System (TOPS) to facilitate rapid analysis of ecosystem states/functions by integrating EOS data with ecosystem models, surface weather observations and weather/climate forecasts. Land products from MODIS (Moderate Resolution Imaging Spectroradiometer) including land cover, albedo, snow, surface temperature, and leaf area index are ingested into TOPS for parameterization of models and for verifying model outputs such as snow cover and vegetation phenology. TOPS is programmed to gather data from observing networks such as USDA soil moisture, AMERIFLUX, and SNOTEL to further enhance model predictions. Key technologies enabling TOPS implementation include the ability to understand and process heterogeneous, distributed data sets, automated planning and execution of ecosystem models, and causation analysis for understanding model outputs. Current TOPS implementations at local (vineyard) to global scales (global net primary production) can be found at http://www.ntsg.umt.edu/tops.
NASA Technical Reports Server (NTRS)
da Silva, Arlindo M.; Alpert, Pinhas
2016-01-01
In the late 1990's, prior to the launch of the Terra satellite, atmospheric general circulation models (GCMs) did not include aerosol processes because aerosols were not properly monitored on a global scale and their spatial distributions were not known well enough for their incorporation in operational GCMs. At the time of the first GEOS Reanalysis (Schubert et al. 1993), long time series of analysis increments (the corrections to the atmospheric state by all available meteorological observations) became readily available, enabling detailed analysis of the GEOS-1 errors on a global scale. Such analysis revealed that temperature biases were particularly pronounced in the Tropical Atlantic region, with patterns depicting a remarkable similarity to dust plumes emanating from the African continent as evidenced by TOMS aerosol index maps. Yoram Kaufman was instrumental in encouraging us to pursue this issue further, resulting in the study reported in Alpert et al. (1998) where we attempted to assess aerosol forcing by studying the errors of the GEOS-1 GCM without aerosol physics within a data assimilation system. Based on this analysis, Alpert et al. (1998) put forward that dust aerosols are an important source of inaccuracies in numerical weather-prediction models in the Tropical Atlantic region, although a direct verification of this hypothesis was not possible back then. Nearly 20 years later, numerical prediction models have increased in resolution and complexity of physical parameterizations, including the representation of aerosols and their interactions with the circulation. Moreover, with the advent of NASA's EOS program and subsequent satellites, atmospheric aerosols are now monitored globally on a routine basis, and their assimilation in global models is becoming well established. In this talk we will reexamine the Alpert et al. (1998) hypothesis using the most recent version of the GEOS-5 Data Assimilation System with assimilation of aerosols. We will explicitly calculate the impact of aerosols on the temperature analysis increments in the tropical Atlantic and assess the extent to which the inclusion of atmospheric aerosols has reduced these increments.
Moho Modeling Using FFT Technique
NASA Astrophysics Data System (ADS)
Chen, Wenjin; Tenzer, Robert
2017-04-01
To improve numerical efficiency, the Fast Fourier Transform (FFT) technique has been used in Parker-Oldenburg's method for regional gravimetric Moho recovery, which assumes a planar approximation of the Earth. In this study, we extend this approach to global applications under a spherical approximation of the Earth. In particular, we utilize the FFT technique for a global Moho recovery, which is practically realized in two numerical steps. Gravimetric forward modeling is first applied, based on methods for a spherical harmonic analysis and synthesis of the global gravity and lithospheric structure models, to compute the refined gravity field, which comprises mainly the gravitational signature of the Moho geometry. The gravimetric inverse problem is then solved iteratively in order to determine the Moho depth. The application of the FFT technique to both numerical steps reduces the computation time to a fraction of that required without this fast algorithm. The developed numerical procedures are used to estimate the Moho depth globally, and the gravimetric result is validated against the global (CRUST1.0) and regional (ESC) seismic Moho models. The comparison reveals a relatively good agreement between the gravimetric and seismic models, with an RMS of differences (4-5 km) at the level of the expected uncertainties of the input datasets, and without significant systematic bias.
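A planar-approximation sketch of the underlying Parker-Oldenburg iteration is given below; the density contrast, reference depth, grid, and low-pass cutoff are assumed values, sign conventions differ between authors, and the paper's spherical extension is not reproduced.

```python
# Planar-approximation sketch of Oldenburg's iterative rearrangement of
# Parker's formula, which the paper extends to a spherical Earth: the Fourier
# transform of the interface relief h is updated from the gravity anomaly and
# higher-order terms in h. All numerical values here are assumptions, and the
# synthetic anomaly stands in for the "refined" field; treat this only as a
# structural illustration.
import numpy as np
from math import factorial

G = 6.674e-11                 # gravitational constant, m^3 kg^-1 s^-2
rho = 400.0                   # crust-mantle density contrast, kg m^-3 (assumed)
z0 = 30e3                     # reference Moho depth, m (assumed)
dx = 10e3                     # grid spacing, m
ny = nx = 128

rng = np.random.default_rng(4)
dg = 1e-5 * rng.normal(0, 20, (ny, nx))      # synthetic anomaly, ~20 mGal in m s^-2

kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
ky = 2 * np.pi * np.fft.fftfreq(ny, d=dx)
KX, KY = np.meshgrid(kx, ky)
k = np.hypot(KX, KY)
lowpass = k < (2 * np.pi / 150e3)            # suppress short wavelengths (assumed cutoff)

DG = np.fft.fft2(dg)
first_order = -DG * np.exp(k * z0) / (2 * np.pi * G * rho)
H = first_order * lowpass                    # first-order relief estimate (Fourier domain)
for _ in range(10):                          # fixed-point iteration with higher-order terms
    h = np.fft.ifft2(H).real
    correction = sum(k ** (n - 1) / factorial(n) * np.fft.fft2(h ** n) for n in range(2, 5))
    H = (first_order - correction) * lowpass

moho_relief = np.fft.ifft2(H).real           # relief about z0, in metres
print("relief std (m):", moho_relief.std())
```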
A global magnetic anomaly map. [obtained from POGO satellite data
NASA Technical Reports Server (NTRS)
Regan, R. D.; Davis, W. M.; Cain, J. C.
1974-01-01
A subset of POGO satellite magnetometer data has been formed that is suitable for analysis of crustal magnetic anomalies. Using a thirteenth-order field model fit to these data, magnetic residuals have been calculated over the world to latitude limits of ±50 deg. These residuals, averaged over one-degree latitude-longitude blocks, represent a detailed global magnetic anomaly map derived solely from satellite data. Preliminary analysis of the map indicates that the anomalies are real and of geological origin.
NASA Astrophysics Data System (ADS)
Bloom, A. Anthony; Bowman, Kevin W.; Lee, Meemong; Turner, Alexander J.; Schroeder, Ronny; Worden, John R.; Weidner, Richard; McDonald, Kyle C.; Jacob, Daniel J.
2017-06-01
Wetland emissions remain one of the principal sources of uncertainty in the global atmospheric methane (CH4) budget, largely due to poorly constrained process controls on CH4 production in waterlogged soils. Process-based estimates of global wetland CH4 emissions and their associated uncertainties can provide crucial prior information for model-based top-down CH4 emission estimates. Here we construct a global wetland CH4 emission model ensemble for use in atmospheric chemical transport models (WetCHARTs version 1.0). Our 0.5° × 0.5° resolution model ensemble is based on satellite-derived surface water extent and precipitation reanalyses, nine heterotrophic respiration simulations (eight carbon cycle models and a data-constrained terrestrial carbon cycle analysis) and three temperature dependence parameterizations for the period 2009-2010; an extended ensemble subset based solely on precipitation and the data-constrained terrestrial carbon cycle analysis is derived for the period 2001-2015. We incorporate the mean of the full and extended model ensembles into GEOS-Chem and compare the model against surface measurements of atmospheric CH4; the model performance (site-level and zonal mean anomaly residuals) compares favourably against published wetland CH4 emissions scenarios. We find that uncertainties in carbon decomposition rates and the wetland extent together account for more than 80 % of the dominant uncertainty in the timing, magnitude and seasonal variability in wetland CH4 emissions, although uncertainty in the temperature CH4 : C dependence is a significant contributor to seasonal variations in mid-latitude wetland CH4 emissions. The combination of satellite, carbon cycle models and temperature dependence parameterizations provides a physically informed structural a priori uncertainty that is critical for top-down estimates of wetland CH4 fluxes. Specifically, our ensemble can provide enhanced information on the prior CH4 emission uncertainty and the error covariance structure, as well as a means for using posterior flux estimates and their uncertainties to quantitatively constrain the biogeochemical process controls of global wetland CH4 emissions.
NASA Astrophysics Data System (ADS)
Takle, E. S.; Gustafson, D. I.; Beachy, R.; Nelson, G. C.; Mason-D'Croz, D.; Palazzo, A.
2013-12-01
Agreement is developing among agricultural scientists on the emerging inability of agriculture to meet growing global food demands. The lack of additional arable land and availability of freshwater have long been constraints on agriculture. Changes in trends of weather conditions that challenge physiological limits of crops, as projected by global climate models, are expected to exacerbate the global food challenge toward the middle of the 21st century. These climate- and constraint-driven crop production challenges are interconnected within a complex global economy, where diverse factors add to price volatility and food scarcity. We use the DSSAT crop modeling suite, together with mid-century projections of four AR4 global models, as input to the International Food Policy Research Institute IMPACT model to project the impact of climate change on food security through the year 2050 for internationally traded crops. IMPACT is an iterative model that responds to endogenous and exogenous drivers to dynamically solve for the world prices that ensure global supply equals global demand. The modeling methodology reconciles the limited spatial resolution of macro-level economic models that operate through equilibrium-driven relationships at a national level with detailed models of biophysical processes at high spatial resolution. The analysis presented here suggests that climate change in the first half of the 21st century does not represent a near-term threat to food security in the US due to the availability of adaptation strategies (e.g., loss of current growing regions is balanced by gain of new growing regions). However, as climate continues to trend away from 20th century norms current adaptation measures will not be sufficient to enable agriculture to meet growing food demand. Climate scenarios from higher-level carbon emissions exacerbate the food shortfall, although uncertainty in climate model projections (particularly precipitation) is a limitation to impact studies.
Modeling global Hammond landform regions from 250-m elevation data
Karagulle, Deniz; Frye, Charlie; Sayre, Roger; Breyer, Sean P.; Aniello, Peter; Vaughan, Randy; Wright, Dawn J.
2017-01-01
In 1964, E.H. Hammond proposed criteria for classifying and mapping physiographic regions of the United States. Hammond produced a map entitled “Classes of Land Surface Form in the Forty-Eight States, USA”, which is regarded as a pioneering and rigorous treatment of regional physiography. Several researchers automated Hammond's model in GIS. However, these were local or regional in application, and resulted in inadequate characterization of tablelands. We used a global 250 m DEM to produce a new characterization of global Hammond landform regions. The improved algorithm we developed for the regional landform modeling: (1) incorporated a profile parameter for the delineation of tablelands; (2) accommodated negative elevation data values; (3) allowed neighborhood analysis window (NAW) size to vary between parameters; (4) more accurately bounded plains regions; and (5) mapped landform regions as opposed to discrete landform features. The new global Hammond landform regions product builds on an existing global Hammond landform features product developed by the U.S. Geological Survey, which, while globally comprehensive, did not include tablelands, used a fixed NAW size, and essentially classified pixels rather than regions. Our algorithm also permits the disaggregation of “mixed” Hammond types (e.g. plains with high mountains) into their component parts.
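As a rough illustration of the neighborhood-analysis-window computations behind a Hammond-style classification, the sketch below derives percent gentle slope and local relief from a synthetic DEM; the window size, slope threshold, and class cutoffs are illustrative, not the authors' parameters.

```python
# Sketch of the neighborhood-analysis-window (NAW) terrain parameters behind a
# Hammond-style landform classification: percent of gently sloping land and
# local relief computed in moving windows over a DEM. The window size, the 8%
# slope threshold and the relief cutoffs are illustrative choices only.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(5)
dem = ndimage.gaussian_filter(rng.normal(0, 300, (400, 400)), 8)   # synthetic DEM (m)
cell = 250.0                                                       # grid spacing (m)

gy, gx = np.gradient(dem, cell)
slope_pct = 100.0 * np.hypot(gx, gy)                               # slope in percent

naw = 25                                                           # window size in cells
gentle = (slope_pct < 8.0).astype(float)                           # gentle-slope test
pct_gentle = 100.0 * ndimage.uniform_filter(gentle, size=naw)      # % gentle slope in NAW
relief = (ndimage.maximum_filter(dem, size=naw)
          - ndimage.minimum_filter(dem, size=naw))                 # local relief in NAW

# Coarse example classes: plains, tablelands/hills, mountains (cutoffs assumed)
landform = np.select(
    [(pct_gentle > 80) & (relief < 90), relief < 300],
    [1, 2],
    default=3,
)
print(np.bincount(landform.ravel()))
```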
Hwang, Yen-Ting; Frierson, Dargan M. W.
2013-01-01
The double-Intertropical Convergence Zone (ITCZ) problem, in which excessive precipitation is produced in the Southern Hemisphere tropics, which resembles a Southern Hemisphere counterpart to the strong Northern Hemisphere ITCZ, is perhaps the most significant and most persistent bias of global climate models. In this study, we look to the extratropics for possible causes of the double-ITCZ problem by performing a global energetic analysis with historical simulations from a suite of global climate models and comparing with satellite observations of the Earth’s energy budget. Our results show that models with more energy flux into the Southern Hemisphere atmosphere (at the top of the atmosphere and at the surface) tend to have a stronger double-ITCZ bias, consistent with recent theoretical studies that suggest that the ITCZ is drawn toward heating even outside the tropics. In particular, we find that cloud biases over the Southern Ocean explain most of the model-to-model differences in the amount of excessive precipitation in Southern Hemisphere tropics, and are suggested to be responsible for this aspect of the double-ITCZ problem in most global climate models. PMID:23493552
Yang, Xujun; Li, Chuandong; Song, Qiankun; Chen, Jiyang; Huang, Junjian
2018-05-04
This paper addresses the stability and synchronization problems of fractional-order quaternion-valued neural networks (FQVNNs) with linear threshold neurons. On account of the non-commutativity of quaternion multiplication resulting from the Hamilton rules, the FQVNN models are separated into four real-valued neural network (RVNN) models. Consequently, the dynamic analysis of FQVNNs can be realized by investigating the real-valued ones. Based on the method of M-matrices, the existence and uniqueness of the equilibrium point of the FQVNNs are obtained without detailed proof. Afterwards, several sufficient criteria ensuring the global Mittag-Leffler stability of the unique equilibrium point of the FQVNNs are derived by applying the Lyapunov direct method, the theory of fractional differential equations, the theory of matrix eigenvalues, and some inequality techniques. Meanwhile, global Mittag-Leffler synchronization for the drive-response models of the addressed FQVNNs is investigated explicitly. Finally, simulation examples are designed to verify the feasibility and availability of the theoretical results.
Torres, Jaume; Briggs, John A G; Arkin, Isaiah T
2002-01-01
Molecular interactions between transmembrane alpha-helices can be explored using global searching molecular dynamics simulations (GSMDS), a method that produces a group of probable low energy structures. We have shown previously that the correct model in various homooligomers is always located at the bottom of one of various possible energy basins. Unfortunately, the correct model is not necessarily the one with the lowest energy according to the computational protocol, which has resulted in overlooking of this parameter in favor of experimental data. In an attempt to use energetic considerations in the aforementioned analysis, we used global searching molecular dynamics simulations on three homooligomers of different sizes, the structures of which are known. As expected, our results show that even when the conformational space searched includes the correct structure, taking together simulations using both left and right handedness, the correct model does not necessarily have the lowest energy. However, for the models derived from the simulation that uses the correct handedness, the lowest energy model is always at, or very close to, the correct orientation. We hypothesize that this should also be true when simulations are performed using homologous sequences, and consequently lowest energy models with the right handedness should produce a cluster around a certain orientation. In contrast, using the wrong handedness the lowest energy structures for each sequence should appear at many different orientations. The rationale behind this is that, although more than one energy basin may exist, basins that do not contain the correct model will shift or disappear because they will be destabilized by at least one conservative (i.e. silent) mutation, whereas the basin containing the correct model will remain. This not only allows one to point to the possible handedness of the bundle, but can be used to overcome ambiguities arising from the use of homologous sequences in the analysis of global searching molecular dynamics simulations. In addition, because clustering of lowest energy models arising from homologous sequences only happens when the estimation of the helix tilt is correct, it may provide a validation for the helix tilt estimate. PMID:12023229
Gutiérrez, Salvador; Tardaguila, Javier; Fernández-Novales, Juan; Diago, María P
2015-01-01
The identification of different grapevine varieties, currently addressed using visual ampelometry, DNA analysis and, very recently, hyperspectral analysis under laboratory conditions, is an issue of great importance in the wine industry. This work presents support vector machine and artificial neural network modelling for grapevine varietal classification from in-field leaf spectroscopy. Modelling was attempted at two scales: site-specific and global. Spectral measurements were obtained on the near-infrared (NIR) spectral range between 1600 and 2400 nm under field conditions in a non-destructive way using a portable spectrophotometer. For the site-specific approach, spectra were collected from the adaxial side of 400 individual leaves of 20 grapevine (Vitis vinifera L.) varieties one week after veraison. For the global model, two additional sets of spectra were collected one week before harvest from two different vineyards in another vintage, each consisting of 48 measurements from individual leaves of six varieties. Several combinations of spectral scatter correction and smoothing filtering were studied. For the training of the models, support vector machines and artificial neural networks were employed using the pre-processed spectra as input and the varieties as the classes of the models. The pre-processing study showed that scatter correction had no influence on the results, and that a second-derivative Savitzky-Golay filter with a window size of 5 yielded the best outcomes. For the site-specific model, with 20 classes, the best classifiers yielded an overall score of 87.25% correctly classified samples. These results were compared under the same conditions with a model trained using partial least squares discriminant analysis, which showed worse performance in every case. For the global model, a 6-class dataset involving samples from three different vineyards, two years, and leaves monitored at post-veraison and harvest was also built, reaching 77.08% correctly classified samples. The outcomes obtained demonstrate the capability of a reliable method for fast, in-field, non-destructive grapevine varietal classification, either global or site-specific, that could be very useful in viticulture and the wine industry.
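A hedged sketch of the kind of preprocessing and classification pipeline the abstract describes is given below: a second-derivative Savitzky-Golay filter with window size 5 followed by a support vector machine classifier. The array shapes, the polynomial order of the filter, and the SVM hyperparameters are illustrative assumptions, not the study's settings.

```python
# Sketch of a NIR-spectra preprocessing + SVM classification pipeline.
# Synthetic spectra stand in for the leaf measurements; parameters are assumptions.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 200))        # 400 leaf spectra, 200 NIR wavelengths (synthetic)
y = rng.integers(0, 20, size=400)      # 20 grapevine varieties (synthetic labels)

# Second-degree derivative, window size 5 (cubic polynomial fit per window is assumed)
X_d2 = savgol_filter(X, window_length=5, polyorder=3, deriv=2, axis=1)

X_tr, X_te, y_tr, y_te = train_test_split(X_d2, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_tr, y_tr)
print("fraction correctly classified:", clf.score(X_te, y_te))
```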
Ma, Jianyong; Shugart, Herman H; Yan, Xiaodong; Cao, Cougui; Wu, Shuang; Fang, Jing
2017-05-15
The carbon budget of forest ecosystems, an important component of the terrestrial carbon cycle, needs to be accurately quantified and predicted by ecological models. As a preamble to applying the model to estimate global carbon uptake by forest ecosystems, we used CO2 flux measurements from 37 forest eddy-covariance sites to examine the individual tree-based FORCCHN model's performance globally. In these initial tests, the FORCCHN model simulated gross primary production (GPP), ecosystem respiration (ER) and net ecosystem production (NEP) with correlations of 0.72, 0.70 and 0.53, respectively, across all forest biomes. The model underestimated GPP and slightly overestimated ER across most of the eddy-covariance sites. An underestimation of NEP arose primarily from the lower GPP estimates. Model performance was better in capturing both the temporal changes and magnitude of carbon fluxes in deciduous broadleaf forest than in evergreen broadleaf forest, and it performed less well for sites in Mediterranean climate. We then applied the model to estimate the carbon fluxes of forest ecosystems at the global scale over 1982-2011. This application of FORCCHN gave a total GPP of 59.41±5.67 Pg C yr-1 and an ER of 57.21±5.32 Pg C yr-1 for global forest ecosystems during 1982-2011. Over this same period, forest ecosystems contributed substantial carbon storage, with a total NEP of 2.20±0.64 Pg C yr-1. These values are comparable to and reinforce estimates reported in other studies. This analysis highlights that the individual tree-based FORCCHN model can be used to evaluate the carbon fluxes of forest ecosystems at the global scale. Copyright © 2017 Elsevier B.V. All rights reserved.
Intercomparison of hydrologic processes in global climate models
NASA Technical Reports Server (NTRS)
Lau, W. K.-M.; Sud, Y. C.; Kim, J.-H.
1995-01-01
In this report, we address the intercomparison of precipitation (P), evaporation (E), and surface hydrologic forcing (P-E) for 23 Atmospheric Model Intercomparison Project (AMIP) general circulation models (GCMs), together with relevant observations, over a variety of spatial and temporal scales. The intercomparison includes global and hemispheric means, latitudinal profiles, and selected area means for the tropics and extratropics, over ocean and land, respectively. In addition, we have computed anomaly pattern correlations among models and observations for different seasons, harmonic analysis for the annual and semiannual cycles, and rain-rate frequency distributions. We also compare the joint influence of temperature and precipitation on local climate using the Koeppen climate classification scheme.
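The harmonic analysis of annual and semiannual cycles mentioned above can be illustrated with a small least-squares fit; the sketch below uses synthetic monthly means and is only meant to show the technique, not to reproduce the study's diagnostics.

```python
# Minimal sketch: fit annual and semiannual harmonics to a 12-month climatology.
import numpy as np

months = np.arange(12)
t = 2 * np.pi * months / 12.0
precip = 3.0 + 1.5 * np.cos(t - 0.8) + 0.4 * np.cos(2 * t - 2.0)  # synthetic mm/day

# Design matrix: mean, annual (cos, sin), semiannual (cos, sin)
A = np.column_stack([np.ones_like(t), np.cos(t), np.sin(t), np.cos(2 * t), np.sin(2 * t)])
coef, *_ = np.linalg.lstsq(A, precip, rcond=None)

annual_amp = np.hypot(coef[1], coef[2])
semiannual_amp = np.hypot(coef[3], coef[4])
print(f"annual amplitude {annual_amp:.2f} mm/day, semiannual amplitude {semiannual_amp:.2f} mm/day")
```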
2016-09-01
Change in Weather Research and Forecasting (WRF) Model Accuracy with Age of Input Data from the Global Forecast System (GFS), by JL Cogan ... As expected, accuracy generally tended to decline as the large-scale data aged, but appeared to improve slightly as the age of the large... Table 7: Minimum and maximum mean RMDs for each WRF time (or GFS data age) category.
Wang, Kefeng; Peng, Changhui; Zhu, Qiuan; ...
2017-09-28
Microbial physiology plays a critical role in the biogeochemical cycles of the Earth system. However, most traditional soil carbon models are lacking in terms of the representation of key microbial processes that control the soil carbon response to global climate change. In this study, the improved process-based model TRIPLEX-GHG was developed by coupling it with the new MEND (Microbial-ENzyme-mediated Decomposition) model to estimate total global soil organic carbon (SOC) and global soil microbial carbon. The new model (TRIPLEX-MICROBE) shows considerable improvement over the previous version (TRIPLEX-GHG) in simulating SOC. We estimated the global soil carbon stock to be approximately 1195 Pg C, with 348 Pg C located in the high northern latitudes, which is in good agreement with the well-regarded Harmonized World Soil Database (HWSD) and the Northern Circumpolar Soil Carbon Database (NCSCD). We also estimated the global soil microbial carbon to be 21 Pg C, similar to the 23 Pg C estimated by Xu et al. (2014). We found that the microbial carbon quantity in the latitudinal direction showed reversions at approximately 30°N, near the equator and at 25°S. A sensitivity analysis suggested that the tundra ecosystem exhibited the highest sensitivity to a 1°C increase or decrease in temperature in terms of dissolved organic carbon (DOC), microbial biomass carbon (MBC) and mineral-associated organic carbon (MOC). Furthermore, our work represents the first step towards a new generation of ecosystem process models capable of integrating key microbial processes into soil carbon cycles.
NASA Astrophysics Data System (ADS)
Wang, Kefeng; Peng, Changhui; Zhu, Qiuan; Zhou, Xiaolu; Wang, Meng; Zhang, Kerou; Wang, Gangsheng
2017-10-01
Microbial physiology plays a critical role in the biogeochemical cycles of the Earth system. However, most traditional soil carbon models are lacking in terms of the representation of key microbial processes that control the soil carbon response to global climate change. In this study, the improved process-based model TRIPLEX-GHG was developed by coupling it with the new MEND (Microbial-ENzyme-mediated Decomposition) model to estimate total global soil organic carbon (SOC) and global soil microbial carbon. The new model (TRIPLEX-MICROBE) shows considerable improvement over the previous version (TRIPLEX-GHG) in simulating SOC. We estimated the global soil carbon stock to be approximately 1195 Pg C, with 348 Pg C located in the high northern latitudes, which is in good agreement with the well-regarded Harmonized World Soil Database (HWSD) and the Northern Circumpolar Soil Carbon Database (NCSCD). We also estimated the global soil microbial carbon to be 21 Pg C, similar to the 23 Pg C estimated by Xu et al. (2014). We found that the microbial carbon quantity in the latitudinal direction showed reversions at approximately 30°N, near the equator and at 25°S. A sensitivity analysis suggested that the tundra ecosystem exhibited the highest sensitivity to a 1°C increase or decrease in temperature in terms of dissolved organic carbon (DOC), microbial biomass carbon (MBC), and mineral-associated organic carbon (MOC). However, our work represents the first step toward a new generation of ecosystem process models capable of integrating key microbial processes into soil carbon cycles.
NASA Technical Reports Server (NTRS)
Pickering, K. E.; Ziemke, J.; Bucsela, E.; Gleason, J.; Marufu, L.; Dickerson, R.; Mathur, R.; Davidson, P.; Duncan, B.; Bhartia, P. K.
2006-01-01
The Ozone Monitoring Instrument (OMI) on board NASA's Aura satellite was launched in July 2004, and is now providing daily global observations of total column ozone, NO2, and SO2, as well as aerosol information. Algorithms have also been developed to produce daily tropospheric ozone and NO2 products. The tropospheric ozone product reported here is a tropospheric residual computed through use of Aura Microwave Limb Sounder (MLS) ozone profile data to quantify stratospheric ozone. We are investigating the applicability of OMI products for use in air quality modeling, forecasting, and analysis. These investigations include comparison of the OMI tropospheric O3 and NO2 products with global and regional models and with lower tropospheric aircraft observations. Large-scale transport of pollution seen in the OMI tropospheric O3 data is compared with output from NASA's Global Modeling Initiative global chemistry and transport model. On the regional scale we compare the OMI tropospheric O3 and NO2 with fields from the National Oceanic and Atmospheric Administration and Environmental Protection Agency (NOAA/EPA) operational Eta/CMAQ air quality forecasting model over the eastern United States. This 12-km horizontal resolution model output is roughly of equivalent resolution to the OMI pixel data. Correlation analysis between lower tropospheric aircraft O3 profile data taken by the University of Maryland over the Mid-Atlantic States and OMI tropospheric column mean volume mixing ratio for O3 will be presented. These aircraft data are representative of the lowest 3 kilometers of the atmosphere, the region in which much of the locally generated and regionally transported ozone exists.
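The tropospheric residual idea described above can be illustrated in a few lines: subtract a stratospheric column from a total column on a common grid. The fields and grid below are synthetic stand-ins, not the operational OMI/MLS algorithm.

```python
# Hedged illustration of a tropospheric column residual: total column minus
# stratospheric column, in Dobson Units, on a synthetic latitude-longitude grid.
import numpy as np

lat = np.linspace(-60, 60, 61)
lon = np.linspace(-180, 180, 121)
total_column = 280 + 20 * np.cos(np.deg2rad(lat))[:, None] + np.zeros((61, 121))   # DU, OMI-like
stratospheric = 250 + 10 * np.cos(np.deg2rad(lat))[:, None] + np.zeros((61, 121))  # DU, MLS-like

tropospheric_residual = total_column - stratospheric   # DU
print("mean tropospheric column (DU):", tropospheric_residual.mean().round(1))
```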
NASA Astrophysics Data System (ADS)
Guan, Jun; Xu, Xiaoyu; Xing, Lizhi
2018-03-01
The input-output table is comprehensive and detailed in describing national economic systems, with an abundance of economic relationships depicting supply and demand among industrial sectors. This paper focuses on how to quantify the degree of competition on the global value chain (GVC) from the perspective of econophysics. Global Industrial Strongest Relevant Network models are established by extracting the strongest and most immediate industrial relevance in the global economic system from inter-country input-output (ICIO) tables, and are then transformed into Global Industrial Resource Competition Network models to analyze competitive relationships based on a bibliographic coupling approach. Three indicators well suited to weighted, undirected networks with self-loops are introduced: unit weight for competitive power, disparity in the weights for competitive amplitude, and the weighted clustering coefficient for competitive intensity. Finally, these models and indicators are applied empirically to analyze the function of industrial sectors on the basis of the latest World Input-Output Database (WIOD) in order to reveal inter-sector competitive status during economic globalization.
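The three indicators named above can be illustrated with standard weighted-network definitions (unit weight as strength per link, disparity as the Herfindahl index of a node's link weights, and the weighted clustering coefficient provided by networkx); the paper's exact formulations and its treatment of self-loops may differ.

```python
# Sketch of weighted-network indicators on a tiny, made-up sector network.
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([
    ("agriculture", "food", 4.0),
    ("agriculture", "chemicals", 1.0),
    ("food", "chemicals", 2.0),
    ("chemicals", "machinery", 3.0),
])

strength = dict(G.degree(weight="weight"))   # total weight attached to each node
degree = dict(G.degree())                    # number of links of each node

unit_weight = {n: strength[n] / degree[n] for n in G}                       # "competitive power"
disparity = {n: sum((d["weight"] / strength[n]) ** 2
                    for _, _, d in G.edges(n, data=True)) for n in G}       # "competitive amplitude"
clustering = nx.clustering(G, weight="weight")                              # "competitive intensity"

for n in G:
    print(f"{n:12s} unit weight {unit_weight[n]:.2f}  disparity {disparity[n]:.2f}  "
          f"weighted clustering {clustering[n]:.2f}")
```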
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Yuanshun; Baek, Seung H.; Garcia-Diza, Alberto
2012-01-01
This paper designs a comprehensive approach based on the engineering machine/system concept to model, analyze, and assess the level of CO2 exchange between the atmosphere and terrestrial ecosystems, which is an important factor in understanding changes in global climate. The focus of this article is on spatial patterns and on the correlation between levels of CO2 fluxes and a variety of influencing factors in eco-environments. The engineering/machine concept used is a system protocol that includes the sequential activities of design, test, observe, and model. This concept is applied to explicitly include various influencing factors and interactions associated with CO2 fluxes. To formulate effective models of a large and complex climate system, this article introduces a modeling technique referred to as Stochastic Filtering Analysis of Variance (SF-ANOVA). CO2 flux data observed at several AmeriFlux sites are used to illustrate and validate the analysis, prediction and globalization capabilities of the proposed engineering approach and the SF-ANOVA technology. The SF-ANOVA modeling approach was compared to stepwise regression, ridge regression, and neural networks. The comparison indicated that the proposed approach is a valid and effective tool with similar accuracy and less complexity than the other procedures.
Miyakawa, Tomoki; Satoh, Masaki; Miura, Hiroaki; Tomita, Hirofumi; Yashiro, Hisashi; Noda, Akira T.; Yamada, Yohei; Kodama, Chihiro; Kimoto, Masahide; Yoneyama, Kunio
2014-01-01
Global cloud/cloud system-resolving models are perceived to perform well in the prediction of the Madden–Julian Oscillation (MJO), a huge eastward-propagating atmospheric pulse that dominates intraseasonal variation of the tropics and affects the entire globe. However, owing to model complexity, detailed analysis is limited by computational power. Here we carry out a simulation series using a recently developed supercomputer, which enables the statistical evaluation of the MJO prediction skill of a costly new-generation model in a manner similar to operational forecast models. We estimate the current MJO predictability of the model as 27 days by conducting simulations including all winter MJO cases identified during 2003–2012. The simulated precipitation patterns associated with different MJO phases compare well with observations. An MJO case captured in a recent intensive observation is also well reproduced. Our results reveal that the global cloud-resolving approach is effective in understanding the MJO and in providing month-long tropical forecasts. PMID:24801254
Classification of Clouds and Deep Convection from GEOS-5 Using Satellite Observations
NASA Technical Reports Server (NTRS)
Putman, William; Suarez, Max
2010-01-01
With the increased resolution of global atmospheric models and the push toward global cloud-resolving models, model output has come to resemble satellite observations strikingly closely. As we progress with our adaptation of the Goddard Earth Observing System Model, Version 5 (GEOS-5) as a high-resolution cloud-system-resolving model, evaluation of cloud properties and deep convection requires in-depth analysis beyond a visual comparison. Outgoing long-wave radiation (OLR) provides a sufficient comparison with infrared (IR) satellite imagery to isolate areas of deep convection. We have adopted a binning technique to generate a series of histograms of OLR which classify the presence and fraction of clear sky versus deep convection in the tropics, and which can be compared with a similar analysis of IR imagery from composite Geostationary Operational Environmental Satellite (GOES) observations. We will present initial results that have been used to evaluate the amount of deep convective parameterization required within the model as we move toward cloud-system-resolving resolutions of 10- to 1-km globally.
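A minimal sketch of the OLR-binning idea follows, using synthetic data and assumed thresholds for the clear-sky and deep-convection bins; the thresholds and bin widths are illustrative, not the values used in the GEOS-5 analysis.

```python
# Histogram OLR over the tropics and report the fraction of scenes in a
# "deep convection" (very low OLR) versus "clear sky" (high OLR) range.
import numpy as np

rng = np.random.default_rng(1)
olr = rng.normal(loc=250, scale=40, size=100_000)           # W m-2, synthetic tropical OLR

bins = np.arange(80, 361, 10)                                # 10 W m-2 bins (assumed width)
hist, edges = np.histogram(olr, bins=bins)
print("modal OLR bin starts at", edges[np.argmax(hist)], "W m-2")

deep_convection_frac = np.mean(olr < 180)                    # threshold is an assumption
clear_sky_frac = np.mean(olr > 280)                          # threshold is an assumption
print(f"deep convection fraction {deep_convection_frac:.2%}, clear sky {clear_sky_frac:.2%}")
```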
Miyakawa, Tomoki; Satoh, Masaki; Miura, Hiroaki; Tomita, Hirofumi; Yashiro, Hisashi; Noda, Akira T; Yamada, Yohei; Kodama, Chihiro; Kimoto, Masahide; Yoneyama, Kunio
2014-05-06
Global cloud/cloud system-resolving models are perceived to perform well in the prediction of the Madden-Julian Oscillation (MJO), a huge eastward-propagating atmospheric pulse that dominates intraseasonal variation of the tropics and affects the entire globe. However, owing to model complexity, detailed analysis is limited by computational power. Here we carry out a simulation series using a recently developed supercomputer, which enables the statistical evaluation of the MJO prediction skill of a costly new-generation model in a manner similar to operational forecast models. We estimate the current MJO predictability of the model as 27 days by conducting simulations including all winter MJO cases identified during 2003-2012. The simulated precipitation patterns associated with different MJO phases compare well with observations. An MJO case captured in a recent intensive observation is also well reproduced. Our results reveal that the global cloud-resolving approach is effective in understanding the MJO and in providing month-long tropical forecasts.
Stability analysis of an HIV/AIDS epidemic model with treatment
NASA Astrophysics Data System (ADS)
Cai, Liming; Li, Xuezhi; Ghosh, Mini; Guo, Baozhu
2009-07-01
An HIV/AIDS epidemic model with treatment is investigated. The model allows for some infected individuals to move from the symptomatic phase to the asymptomatic phase by all sorts of treatment methods. We first establish the ODE treatment model with two infective stages. Mathematical analyses establish that the global dynamics of the spread of the HIV infectious disease are completely determined by the basic reproduction number R0. If R0 ≤ 1, the disease-free equilibrium is globally stable, whereas the unique infected equilibrium is globally asymptotically stable if R0 > 1. Then, we introduce a discrete time delay to the model to describe the time from the start of treatment in the symptomatic stage until treatment effects become visible. The effect of the time delay on the stability of the endemically infected equilibrium is investigated. Moreover, the delay model exhibits Hopf bifurcations by using the delay as a bifurcation parameter. Finally, numerical simulations are presented to illustrate the results.
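A generic two-stage model of this kind can be integrated in a few lines. The sketch below is not the paper's exact system: it uses an illustrative susceptible/asymptomatic/symptomatic structure in which treatment moves individuals from the symptomatic class back to the asymptomatic class, with made-up parameter values and no time delay.

```python
# Hedged sketch of a two-infective-stage HIV model with treatment (alpha moves
# symptomatic J back to asymptomatic I). Parameters are illustrative only.
import numpy as np
from scipy.integrate import solve_ivp

Lambda, mu = 10.0, 0.02          # recruitment and natural death rate
beta1, beta2 = 0.0002, 0.0004    # transmission from asymptomatic / symptomatic contacts
k, d = 0.1, 0.3                  # progression I -> J and disease-induced death
alpha = 0.25                     # treatment rate J -> I

def rhs(t, y):
    S, I, J = y
    incidence = (beta1 * I + beta2 * J) * S
    dS = Lambda - incidence - mu * S
    dI = incidence - (mu + k) * I + alpha * J
    dJ = k * I - (mu + d + alpha) * J
    return [dS, dI, dJ]

sol = solve_ivp(rhs, (0, 2000), [Lambda / mu, 1.0, 0.0])
S_end, I_end, J_end = sol.y[:, -1]
print(f"long-run state: S={S_end:.1f}, I={I_end:.1f}, J={J_end:.1f}")
```

Varying alpha in such a sketch shows qualitatively how a stronger treatment rate shifts individuals out of the symptomatic class, which is the mechanism the abstract analyzes rigorously through R0 and the delay.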
Development of mpi_EPIC model for global agroecosystem modeling
Kang, Shujiang; Wang, Dali; Jeff A. Nichols; ...
2014-12-31
Models that address policy-maker concerns about multi-scale effects of food and bioenergy production systems are computationally demanding. We integrated the message passing interface algorithm into the process-based EPIC model to accelerate computation of ecosystem effects. Simulation performance was further enhanced by applying the Vampir framework. When this enhanced mpi_EPIC model was tested, total execution time for a global 30-year simulation of a switchgrass cropping system was shortened to less than 0.5 hours on a supercomputer. The results illustrate that mpi_EPIC using parallel design can balance simulation workloads and facilitate large-scale, high-resolution analysis of agricultural production systems, management alternatives and environmental effects.
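The parallelization pattern described above, distributing point simulations across processes with message passing, can be sketched generically with mpi4py; the run_epic_point function below is a hypothetical placeholder, not the EPIC model.

```python
# Generic MPI scatter/gather pattern: grid cells are split across ranks, each
# rank runs its share of (mocked) point simulations, results are gathered on rank 0.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:
    cells = np.arange(10_000)                 # global simulation units (grid cells)
    chunks = np.array_split(cells, size)
else:
    chunks = None

my_cells = comm.scatter(chunks, root=0)

def run_epic_point(cell_id):
    """Stand-in for one point simulation (hypothetical placeholder, not EPIC)."""
    return float(cell_id) * 0.001             # mock output

my_results = np.array([run_epic_point(c) for c in my_cells])
all_results = comm.gather(my_results, root=0)

if rank == 0:
    total = np.concatenate(all_results)
    print("simulated cells:", total.size)
```

Launched with, for example, mpiexec -n 8 python script.py, each rank would handle roughly 1,250 of the 10,000 mock cells, which is the workload-balancing idea the abstract describes.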
Grid Transmission Expansion Planning Model Based on Grid Vulnerability
NASA Astrophysics Data System (ADS)
Tang, Quan; Wang, Xi; Li, Ting; Zhang, Quanming; Zhang, Hongli; Li, Huaqiang
2018-03-01
Based on grid vulnerability and uniformity theory, a global network structure and state vulnerability factor model is proposed to measure different grid models. A multi-objective power grid planning model is then established that considers global power network vulnerability, economy, and grid security constraints, and an improved chaos crossover and mutation genetic algorithm is used to search for the optimal plan. Because the objectives of the multi-objective optimization have non-uniform dimensions, their weights are not easily assigned; a principal component analysis (PCA) method is therefore used to comprehensively assess the population in every generation, making the assessment results more objective and credible. The feasibility and effectiveness of the proposed model are validated by simulation results for the Garver 6-bus and Garver 18-bus systems.
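The PCA-based assessment step can be illustrated with synthetic objective values; the objectives, scales, and scoring rule below are assumptions used only to show how principal components can combine objectives with non-uniform dimensions into a single ranking.

```python
# Illustrative PCA scoring of candidate plans evaluated on objectives with
# different units (vulnerability index, cost, security margin). Synthetic data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
# rows: candidate plans in one GA generation; columns: vulnerability, cost, security index
objectives = rng.normal(size=(50, 3)) * np.array([0.1, 1e6, 5.0]) + np.array([0.5, 3e6, 20.0])

Z = StandardScaler().fit_transform(objectives)       # remove unit/scale differences
pca = PCA(n_components=objectives.shape[1]).fit(Z)

# Composite score: projections weighted by each component's explained variance
scores = (pca.transform(Z) * pca.explained_variance_ratio_).sum(axis=1)
ranking = np.argsort(scores)
print("candidate plans ordered by composite score:", ranking[:5])
```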
A structural model decomposition framework for systems health management
NASA Astrophysics Data System (ADS)
Roychoudhury, I.; Daigle, M.; Bregon, A.; Pulido, B.
Systems health management (SHM) is an important set of technologies aimed at increasing system safety and reliability by detecting, isolating, and identifying faults; and predicting when the system reaches end of life (EOL), so that appropriate fault mitigation and recovery actions can be taken. Model-based SHM approaches typically make use of global, monolithic system models for online analysis, which results in a loss of scalability and efficiency for large-scale systems. Improvement in scalability and efficiency can be achieved by decomposing the system model into smaller local submodels and operating on these submodels instead. In this paper, the global system model is analyzed offline and structurally decomposed into local submodels. We define a common model decomposition framework for extracting submodels from the global model. This framework is then used to develop algorithms for solving model decomposition problems for the design of three separate SHM technologies, namely, estimation (which is useful for fault detection and identification), fault isolation, and EOL prediction. We solve these model decomposition problems using a three-tank system as a case study.
A Structural Model Decomposition Framework for Systems Health Management
NASA Technical Reports Server (NTRS)
Roychoudhury, Indranil; Daigle, Matthew J.; Bregon, Anibal; Pulido, Belamino
2013-01-01
Systems health management (SHM) is an important set of technologies aimed at increasing system safety and reliability by detecting, isolating, and identifying faults; and predicting when the system reaches end of life (EOL), so that appropriate fault mitigation and recovery actions can be taken. Model-based SHM approaches typically make use of global, monolithic system models for online analysis, which results in a loss of scalability and efficiency for large-scale systems. Improvement in scalability and efficiency can be achieved by decomposing the system model into smaller local submodels and operating on these submodels instead. In this paper, the global system model is analyzed offline and structurally decomposed into local submodels. We define a common model decomposition framework for extracting submodels from the global model. This framework is then used to develop algorithms for solving model decomposition problems for the design of three separate SHM technologies, namely, estimation (which is useful for fault detection and identification), fault isolation, and EOL prediction. We solve these model decomposition problems using a three-tank system as a case study.
NASA Astrophysics Data System (ADS)
Ruiz-Sinoga, José D.; Hueso-González, Paloma; León-Gross, Teodoro; Molina, Julián; Remond, Ricardo; Martínez-Murillo, Juan F.
2017-04-01
Global Change refers to the occurrence of major environmental changes associated with climatic fluctuations as well as with human activity (Vitousek et al., 1997; Steffen et al., 2004; Dearing et al., 2006). García-Ruiz et al. (2015) indicated that relief varies very slowly in time, while changes in vegetation, overland flow generation and erosion occur very rapidly, conditioned by their interactions and by climate variability. The GLOMED-LAND Project bases its scientific justification on combining the experience of the members of the research team in, on one side, the analysis of the dynamics and the eco-geomorphological and climatic processes of Mediterranean environments of southern Spain in the context of current Global Change; on another, the study, development and application of new tools for the simulation and modelling of future scenarios; and finally, the analysis of the impact that society exerts through the broadcast media on the problems derived from awareness of, and adaptation to, Global Change. Climate change (CC) directly affects the elements that compose the landscape. In both the future climate scenarios put forward by the IPCC (2013) and the regionalisation carried out by AEMET, the Mediterranean region and, especially, the south of Spain, with its well-defined longitudinal pluviometric gradient, appears as one of the areas of greatest uncertainty, reflected in a higher temporal concentration of rainfall and even a reduction in rainfall. Faced with this situation, CC can modify the current landscape setting, with all the environmental impacts that this would entail for terrestrial ecosystems and the services they render to society. The combination of different working scales allows the analysis of landscape dynamics and of the consequences of its modification for hydro-geomorphological processes, which are closely related to degradation processes that can affect the abiotic, biotic and human elements of the landscape (soil, plant cover, crops, water resources, etc.). Simulation and modelling are now essential tools in the study of the landscape and of the effects of climate change, not only toward the future, through scenarios and simulation modelling, but also toward the past, to better understand which causes have led to which effects, and to what extent. In this work we aim to create a set of software tools for the analysis, modelling and simulation of the effects of Global Change on two Mediterranean catchments: the middle and upper basin of the Grande River and the upper Benamargosa River, both in the Province of Málaga (southern Spain). This will make it possible to fully analyse, monitor and predict those effects at the local scale. Finally, we analyse the role that Global Change issues have played in the media and what tendency this coverage may follow. References: Dearing, J. et al. (2006): «Human-environment interactions: towards synthesis and simulation». Regional Environmental Change, no. 6, 115-123. García-Ruiz et al. (2015): «Los efectos geoecológicos del cambio global en el Pirineo central español: una revisión a distintas escalas espaciales y temporales». Pirineos, 170. Steffen, W. et al. (2004): Global Change and the Earth System: a planet under pressure. Executive summary. The IGBP Global Change Series. Springer-Verlag, Berlin, Heidelberg, New York, 44 pp. Vitousek, P.M. et al. (1997): «Human domination of Earth's ecosystems». Science, no. 277, 494-499.
Bifurcation Analysis and Optimal Harvesting of a Delayed Predator-Prey Model
NASA Astrophysics Data System (ADS)
Tchinda Mouofo, P.; Djidjou Demasse, R.; Tewa, J. J.; Aziz-Alaoui, M. A.
A delayed predator-prey model with continuous threshold prey harvesting and a Holling type III response function is formulated. Global qualitative and bifurcation analyses are combined to determine the global dynamics of the model. The positive invariance of the non-negative orthant and the uniform boundedness of the trajectories are proved. The stability of the equilibria is investigated and the existence of some local bifurcations (saddle-node and Hopf bifurcations) is established. We use optimal control theory to provide the correct approach to natural resource management, and results are also obtained for optimal harvesting. Numerical simulations are given to illustrate the results.
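A delayed predator-prey system with a Holling type III response and continuous threshold prey harvesting can be simulated with a simple fixed-step scheme that keeps a history buffer for the delayed term; the equations and parameter values below are generic illustrations, not the paper's exact model.

```python
# Generic delayed predator-prey sketch: Holling type III predation, continuous
# threshold prey harvesting, delay in the predator's numerical response,
# integrated with explicit Euler plus a history buffer.
import numpy as np

r, K = 1.0, 10.0             # prey growth rate and carrying capacity
c, a = 0.8, 4.0              # capture rate and half-saturation constant
e, m = 0.5, 0.15             # conversion efficiency and predator mortality
tau = 1.5                    # delay (time units)
h, T = 0.3, 2.0              # harvesting rate and prey threshold

def holling3(x):
    return c * x**2 / (a**2 + x**2)          # Holling type III functional response

def harvest(x):
    return h * (x - T) if x > T else 0.0     # continuous threshold prey harvesting

dt, steps = 0.01, 20_000
lag = int(tau / dt)
x = np.full(steps + 1, 5.0)                  # prey; indices 0..lag hold the constant history
y = np.full(steps + 1, 2.0)                  # predator

for n in range(lag, steps):
    x_del = x[n - lag]                       # prey density tau time units earlier
    dx = r * x[n] * (1 - x[n] / K) - holling3(x[n]) * y[n] - harvest(x[n])
    dy = e * holling3(x_del) * y[n] - m * y[n]
    x[n + 1] = max(x[n] + dt * dx, 0.0)      # keep densities non-negative under Euler
    y[n + 1] = max(y[n] + dt * dy, 0.0)

print(f"final densities: prey {x[-1]:.2f}, predator {y[-1]:.2f}")
```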
Mars Global Reference Atmospheric Model 2010 Version: Users Guide
NASA Technical Reports Server (NTRS)
Justh, H. L.
2014-01-01
This Technical Memorandum (TM) presents the Mars Global Reference Atmospheric Model 2010 (Mars-GRAM 2010) and its new features. Mars-GRAM is an engineering-level atmospheric model widely used for diverse mission applications. Applications include systems design, performance analysis, and operations planning for aerobraking, entry, descent and landing, and aerocapture. Additionally, this TM includes instructions on obtaining the Mars-GRAM source code and data files as well as running Mars-GRAM. It also contains sample Mars-GRAM input and output files and an example of how to incorporate Mars-GRAM as an atmospheric subroutine in a trajectory code.
Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing; ...
2017-12-27
Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations on a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost in surrogate construction and consequently to improve the efficiency in the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search informative samples, and a robust stopping criterion to terminate the sample search that guarantees the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated in seven analytical functions with different dimensionality and complexity in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method in two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.
Prospects for development of unified global flood observation and prediction systems (Invited)
NASA Astrophysics Data System (ADS)
Lettenmaier, D. P.
2013-12-01
Floods are among the most damaging of natural hazards, with global flood losses in 2011 alone estimated to have exceeded $100B. Historically, flood economic damages have been highest in the developed world (due in part to encroachment on historical flood plains), but loss of life, and human impacts have been greatest in the developing world. However, as the 2011 Thailand floods show, industrializing countries, many of which do not have well developed flood protection systems, are increasingly vulnerable to economic damages as they become more industrialized. At present, unified global flood observation and prediction systems are in their infancy; notwithstanding that global weather forecasting is a mature field. The summary for this session identifies two evolving capabilities that hold promise for development of more sophisticated global flood forecast systems: global hydrologic models and satellite remote sensing (primarily of precipitation, but also of flood inundation). To this I would add the increasing sophistication and accuracy of global precipitation analysis (and forecast) fields from numerical weather prediction models. In this brief overview, I will review progress in all three areas, and especially the evolution of hydrologic data assimilation which integrates modeling and data sources. I will also comment on inter-governmental and inter-agency cooperation, and related issues that have impeded progress in the development and utilization of global flood observation and prediction systems.
Hydrologic Derivatives for Modeling and Analysis—A new global high-resolution database
Verdin, Kristine L.
2017-07-17
The U.S. Geological Survey has developed a new global high-resolution hydrologic derivative database. Loosely modeled on the HYDRO1k database, this new database, entitled Hydrologic Derivatives for Modeling and Analysis, provides comprehensive and consistent global coverage of topographically derived raster layers (digital elevation model data, flow direction, flow accumulation, slope, and compound topographic index) and vector layers (streams and catchment boundaries). The coverage of the data is global, and the underlying digital elevation model is a hybrid of three datasets: HydroSHEDS (Hydrological data and maps based on SHuttle Elevation Derivatives at multiple Scales), GMTED2010 (Global Multi-resolution Terrain Elevation Data 2010), and the SRTM (Shuttle Radar Topography Mission). For most of the globe south of 60°N., the raster resolution of the data is 3 arc-seconds, corresponding to the resolution of the SRTM. For the areas north of 60°N., the resolution is 7.5 arc-seconds (the highest resolution of the GMTED2010 dataset) except for Greenland, where the resolution is 30 arc-seconds. The streams and catchments are attributed with Pfafstetter codes, based on a hierarchical numbering system, that carry important topological information. This database is appropriate for use in continental-scale modeling efforts. The work described in this report was conducted by the U.S. Geological Survey in cooperation with the National Aeronautics and Space Administration Goddard Space Flight Center.
Lapaige, Véronique
2009-01-01
The current phase of globalization represents a "double-edged sword" challenge facing public health practitioners and health policy makers. The first "edge" throws light on two constructs in the field of public health: global health (formerly international health) and globalized public health. The second "edge" is that of global governance, and raises the question, "how can we construct public health regulations that adequately respond to both global and local complexities related to the two constructs mentioned earlier (global health and globalized public health)?" The two constructs call for the development of norms that will assure sustained population-wide health improvement and these two constructs have their own conceptual tools and theoretical models that permit a better understanding of them. In this paper, we introduce the "globalized public health" construct and we present an interactive comprehensive framework for critically analyzing contemporary globalization's influences on the field of public health. "Globalized public health", simultaneously a theoretical model and a conceptual framework, concerns the transformation of the field of public health in the sociohistorical context of globalization. The model is the fruit of an original theoretical research study conducted from 2005 to 2008 ("contextualized research," Gibbons' Mode II of knowledge production), founded on a QUAL-quant sequential mixed-method design. This research also reflects our political and ideological position, fuelled with aspirations of social democracy and cosmopolitical values. It is profoundly anchored in the pragmatic approach to globalization, looking to "reconcile" the market and equity. The model offers several features to users: (1) it is transdisciplinary; (2) it is interactive (CD-ROM); (3) it is nonlinear (nonlinear interrelations between the contextual globalization and the field of public health); (4) it is synchronic/diachronic (a double-crossed perspective permits analysis of global social change, the emergence of global agency and the transmutation of the field of public health, in the full complexity of their nonlinear interaction); (5) it offers five characteristics as an auto-eco-organized system of social interactions, or dynamic, nonlinear sociohistorical system. The model features a visual interface (five interrelated figures), a structure of 30 "integrator concepts" that integrates 114 other element-parts via 1,300 hypertext links. The model is both a knowledge translation tool and an interactive heuristic guide designed for practitioners and researchers in public health/community health/population health, as well as for decision-makers at all levels.
Global, spatial, and temporal sensitivity analysis for a complex pesticide fate and transport model.
Background/Questions/Methods As one of the most heavily used exposure models by U.S. EPA, the Pesticide Root Zone Model (PRZM) is a one-dimensional, dynamic, compartment model that predicts the fate and transport of a pesticide in the unsaturated soil system around a plant's root zo...
2011-01-01
Background Design of newly engineered microbial strains for biotechnological purposes would greatly benefit from the development of realistic mathematical models for the processes to be optimized. Such models can then be analyzed and, with the development and application of appropriate optimization techniques, one could identify the modifications that need to be made to the organism in order to achieve the desired biotechnological goal. As appropriate models to perform such an analysis are necessarily non-linear and typically non-convex, finding their global optimum is a challenging task. Canonical modeling techniques, such as Generalized Mass Action (GMA) models based on the power-law formalism, offer a possible solution to this problem because they have a mathematical structure that enables the development of specific algorithms for global optimization. Results Based on the GMA canonical representation, we have developed in previous works a highly efficient optimization algorithm and a set of related strategies for understanding the evolution of adaptive responses in cellular metabolism. Here, we explore the possibility of recasting kinetic non-linear models into an equivalent GMA model, so that global optimization on the recast GMA model can be performed. With this technique, optimization is greatly facilitated and the results are transposable to the original non-linear problem. This procedure is straightforward for a particular class of non-linear models known as Saturable and Cooperative (SC) models that extend the power-law formalism to deal with saturation and cooperativity. Conclusions Our results show that recasting non-linear kinetic models into GMA models is indeed an appropriate strategy that helps overcoming some of the numerical difficulties that arise during the global optimization task. PMID:21867520
Simulating PACE Global Ocean Radiances
Gregg, Watson W.; Rousseaux, Cécile S.
2017-01-01
The NASA PACE mission is a hyper-spectral radiometer planned for launch in the next decade. It is intended to provide new information on ocean biogeochemical constituents by parsing the details of high resolution spectral absorption and scattering. It is the first of its kind for global applications and as such, poses challenges for design and operation. To support pre-launch mission development and assess on-orbit capabilities, the NASA Global Modeling and Assimilation Office has developed a dynamic simulation of global water-leaving radiances, using an ocean model containing multiple ocean phytoplankton groups, particulate detritus, particulate inorganic carbon (PIC), and chromophoric dissolved organic carbon (CDOC) along with optical absorption and scattering processes at 1 nm spectral resolution. The purpose here is to assess the skill of the dynamic model and derived global radiances. Global bias, uncertainty, and correlation are derived using available modern satellite radiances at moderate spectral resolution. Total chlorophyll, PIC, and the absorption coefficient of CDOC (aCDOC) are simultaneously assimilated to improve the fidelity of the optical constituent fields. A 5-year simulation showed statistically significant (P < 0.05) comparisons of chlorophyll (r = 0.869), PIC (r = 0.868), and aCDOC (r = 0.890) with satellite data. Additionally, diatoms (r = 0.890), cyanobacteria (r = 0.732), and coccolithophores (r = 0.716) were significantly correlated with in situ data. Global assimilated distributions of optical constituents were coupled with a radiative transfer model (Ocean-Atmosphere Spectral Irradiance Model, OASIM) to estimate normalized water-leaving radiances at 1 nm for the spectral range 250–800 nm. These unassimilated radiances were within −0.074 mW cm−2 μm−1 sr−1 of MODIS-Aqua radiances at 412, 443, 488, 531, 547, and 667 nm. This difference represented a bias of −10.4% (model low). A mean correlation of 0.706 (P < 0.05) was found with global distributions of MODIS radiances. These results suggest skill in the global assimilated model and resulting radiances. The reported error characterization suggests that the global dynamical simulation can support some aspects of mission design and analysis. For example, the high spectral resolution of the simulation supports investigations of band selection. The global nature of the radiance representations supports investigations of satellite observing scenarios. Global radiances at bands not available in current and past missions support investigations of mission capability. PMID:29292403
Local and Global Gestalt Laws: A Neurally Based Spectral Approach.
Favali, Marta; Citti, Giovanna; Sarti, Alessandro
2017-02-01
This letter presents a mathematical model of figure-ground articulation that takes into account both local and global gestalt laws and is compatible with the functional architecture of the primary visual cortex (V1). The local gestalt law of good continuation is described by means of suitable connectivity kernels that are derived from Lie group theory and quantitatively compared with long-range connectivity in V1. Global gestalt constraints are then introduced in terms of spectral analysis of a connectivity matrix derived from these kernels. This analysis performs grouping of local features and individuates perceptual units with the highest salience. Numerical simulations are performed, and results are obtained by applying the technique to a number of stimuli.
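The spectral step described above can be illustrated numerically: build an affinity (connectivity) matrix, take its leading eigenvector, and read off the most salient group of features. The matrix below is synthetic and its block structure is planted by hand; it only demonstrates the eigenvector-grouping idea, not the cortical connectivity kernels of the paper.

```python
# Toy spectral grouping: the leading eigenvector of a symmetric affinity matrix
# concentrates on the most strongly inter-connected block of features.
import numpy as np

rng = np.random.default_rng(3)
n = 12
A = rng.random((n, n)) * 0.05
A[:6, :6] += 0.9 * rng.random((6, 6))      # strongly connected block = one perceptual unit
A = (A + A.T) / 2                          # symmetric affinity matrix

eigvals, eigvecs = np.linalg.eigh(A)
leading = np.abs(eigvecs[:, -1])           # eigenvector of the largest eigenvalue

salient = np.argsort(leading)[::-1][:6]    # features with the largest eigenvector weight
print("features grouped into the most salient unit:", sorted(salient.tolist()))
```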
NASA Technical Reports Server (NTRS)
Gallagher, D. L.; Fok, M.-C.; Fuselier, S.; Gladstone, G. R.; Green, J. L.; Fung, S. F.; Perez, J.; Reiff, P.; Roelof, E. C.; Wilson, G.
1998-01-01
Simultaneous, global measurement of major magnetospheric plasma systems will be performed for the first time with the Imager for Magnetopause-to-Aurora Global Exploration (IMAGE) Mission. The ring current, plasmasphere, and auroral systems will be imaged using energetic neutral and ultraviolet cameras. Quantitative remote measurement of the magnetosheath, plasmaspheric, and magnetospheric densities will be obtained through radio sounding by the Radio Plasma Imager. The IMAGE Mission will open a new era in global magnetospheric physics, while bringing with it new challenges in data analysis. An overview of the IMAGE Theory and Modeling team efforts will be presented, including the state of development of Internet tools that will be available to the science community for access and analysis of IMAGE observations.
Carbonell, Felix; Bellec, Pierre; Shmuel, Amir
2011-01-01
The influence of the global average signal (GAS) on functional-magnetic resonance imaging (fMRI)-based resting-state functional connectivity is a matter of ongoing debate. The global average fluctuations increase the correlation between functional systems beyond the correlation that reflects their specific functional connectivity. Hence, removal of the GAS is a common practice for facilitating the observation of network-specific functional connectivity. This strategy relies on the implicit assumption of a linear-additive model according to which global fluctuations, irrespective of their origin, and network-specific fluctuations are super-positioned. However, removal of the GAS introduces spurious negative correlations between functional systems, bringing into question the validity of previous findings of negative correlations between fluctuations in the default-mode and the task-positive networks. Here we present an alternative method for estimating global fluctuations, immune to the complications associated with the GAS. Principal components analysis was applied to resting-state fMRI time-series. A global-signal effect estimator was defined as the principal component (PC) that correlated best with the GAS. The mean correlation coefficient between our proposed PC-based global effect estimator and the GAS was 0.97±0.05, demonstrating that our estimator successfully approximated the GAS. In 66 out of 68 runs, the PC that showed the highest correlation with the GAS was the first PC. Since PCs are orthogonal, our method provides an estimator of the global fluctuations, which is uncorrelated to the remaining, network-specific fluctuations. Moreover, unlike the regression of the GAS, the regression of the PC-based global effect estimator does not introduce spurious anti-correlations beyond the decrease in seed-based correlation values allowed by the assumed additive model. After regressing this PC-based estimator out of the original time-series, we observed robust anti-correlations between resting-state fluctuations in the default-mode and the task-positive networks. We conclude that resting-state global fluctuations and network-specific fluctuations are uncorrelated, supporting a Resting-State Linear-Additive Model. In addition, we conclude that the network-specific resting-state fluctuations of the default-mode and task-positive networks show artifact-free anti-correlations.
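A sketch of the proposed estimator, with synthetic data standing in for resting-state fMRI time series, is given below: compute the global average signal (GAS), select the principal component most correlated with it, and regress that component, rather than the GAS itself, out of every voxel time series.

```python
# PC-based global effect estimator: PCA on the time series, pick the component
# most correlated with the global average signal, regress it out of each voxel.
import numpy as np

rng = np.random.default_rng(42)
T, V = 200, 500                                   # time points, voxels
global_fluct = rng.normal(size=T)
data = 0.8 * global_fluct[:, None] + rng.normal(size=(T, V))   # voxels share a global component

gas = data.mean(axis=1)                           # global average signal

# PCA via SVD of the temporally demeaned data
X = data - data.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
pcs = U * s                                       # mutually orthogonal component time courses

corrs = [abs(np.corrcoef(pc, gas)[0, 1]) for pc in pcs.T]
g_est = pcs[:, int(np.argmax(corrs))]             # PC-based global effect estimator
print("correlation of estimator with GAS:", round(max(corrs), 3))

# Regress the estimator (not the GAS) out of each voxel time series
beta = g_est @ X / (g_est @ g_est)
cleaned = X - np.outer(g_est, beta)
```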
NASA Technical Reports Server (NTRS)
Prive, Nikki C.; Errico, Ronald M.
2013-01-01
A series of experiments that explore the roles of model and initial condition error in numerical weather prediction are performed using an observing system simulation experiment (OSSE) framework developed at the National Aeronautics and Space Administration Global Modeling and Assimilation Office (NASA/GMAO). The use of an OSSE allows the analysis and forecast errors to be explicitly calculated, and different hypothetical observing networks can be tested with ease. In these experiments, both a full global OSSE framework and an 'identical twin' OSSE setup are utilized to compare the behavior of the data assimilation system and evolution of forecast skill with and without model error. The initial condition error is manipulated by varying the distribution and quality of the observing network and the magnitude of observation errors. The results show that model error has a strong impact on both the quality of the analysis field and the evolution of forecast skill, including both systematic and unsystematic model error components. With a realistic observing network, the analysis state retains a significant quantity of error due to systematic model error. If errors of the analysis state are minimized, model error acts to rapidly degrade forecast skill during the first 24-48 hours of forward integration. In the presence of model error, the impact of observation errors on forecast skill is small, but in the absence of model error, observation errors cause a substantial degradation of the skill of medium range forecasts.
PISA: Multiple 'Truths' and Mediatised Global Governance
ERIC Educational Resources Information Center
Grey, Sue; Morris, Paul
2018-01-01
The OECD's PISA programme has been portrayed as central to the emergence of a regime of global educational governance and the subsequent convergence of policies towards a standardised model. Whilst there is an extensive literature describing PISA's impact on education policies, there is a paucity of analysis of how PISA data is presented to the…
A Land System representation for global assessments and land-use modeling.
van Asselen, Sanneke; Verburg, Peter H
2012-10-01
Current global scale land-change models used for integrated assessments and climate modeling are based on classifications of land cover. However, land-use management intensity and livestock keeping are also important aspects of land use, and are an integrated part of land systems. This article aims to classify, map, and characterize Land Systems (LS) at a global scale and to analyze the spatial determinants of these systems. Besides proposing such a classification, the article tests whether global assessments can be based on globally uniform allocation rules. Land cover, livestock, and agricultural intensity data are used to map LS using a hierarchical classification method. Logistic regressions are used to analyze variation in spatial determinants of LS. The analysis of the spatial determinants of LS indicates strong associations between LS and a range of socioeconomic and biophysical indicators of human-environment interactions. The set of identified spatial determinants of a LS differs among regions and scales, especially for (mosaic) cropland systems, grassland systems with livestock, and settlements. (Semi-)Natural LS have more similar spatial determinants across regions and scales. Using LS in global models is expected to result in a more accurate representation of land use capturing important aspects of land systems and land architecture: the variation in land cover and the link between land-use intensity and landscape composition. Because the set of most important spatial determinants of LS varies among regions and scales, land-change models that include the human drivers of land change are best parameterized at sub-global level, where similar biophysical, socioeconomic and cultural conditions prevail in the specific regions. © 2012 Blackwell Publishing Ltd.
VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox
NASA Astrophysics Data System (ADS)
Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.
2016-12-01
VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it generates simultaneously three philosophically different families of global sensitivity metrics, including (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - VARS approach), (2) variance-based total-order effects (Sobol approach), and (3) derivative-based elementary effects (Morris approach). VARS-TOOL is also enabled with two novel features; the first one being a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties. The second feature is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features in conjunction with bootstrapping enable the user to monitor the stability, robustness, and convergence of GSA with the increase in sample size for any given case study. VARS-TOOL has been shown to achieve robust and stable results within 1-2 orders of magnitude smaller sample sizes (fewer model runs) than alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development and new capabilities and features are forthcoming.
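The variogram-of-response-surface idea underlying VARS can be illustrated on a toy function, perturbing one parameter at a time across a range of scales; this is only a rough sketch of the concept, not the VARS-TOOL sampling scheme or the IVARS metrics themselves.

```python
# Rough concept sketch: directional variogram of a model response,
# gamma_i(h) = 0.5 * E[(y(x + h*e_i) - y(x))^2], estimated at several scales h.
import numpy as np

def model(x):
    # cheap stand-in for a simulation model with two parameters in [0, 1]
    return np.sin(2 * np.pi * x[..., 0]) + 0.5 * x[..., 1] ** 2

rng = np.random.default_rng(1)
base = rng.random((2000, 2))                       # base sample in the unit square
scales = np.array([0.01, 0.05, 0.1, 0.3])

for i in range(2):
    gammas = []
    for h in scales:
        shifted = base.copy()
        shifted[:, i] = np.clip(base[:, i] + h, 0.0, 1.0)
        gammas.append(0.5 * np.mean((model(shifted) - model(base)) ** 2))
    print(f"parameter {i}: variogram over scales {scales.tolist()} -> "
          f"{np.round(gammas, 4).tolist()}")
```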
The impact of lake and reservoir parameterization on global streamflow simulation.
Zajac, Zuzanna; Revilla-Romero, Beatriz; Salamon, Peter; Burek, Peter; Hirpa, Feyera A; Beck, Hylke
2017-05-01
Lakes and reservoirs affect the timing and magnitude of streamflow, and are therefore essential hydrological model components, especially in the context of global flood forecasting. However, the parameterization of lake and reservoir routines on a global scale is subject to considerable uncertainty due to lack of information on lake hydrographic characteristics and reservoir operating rules. In this study we estimated the effect of lakes and reservoirs on global daily streamflow simulations of a spatially-distributed LISFLOOD hydrological model. We applied state-of-the-art global sensitivity and uncertainty analyses for selected catchments to examine the effect of uncertain lake and reservoir parameterization on model performance. Streamflow observations from 390 catchments around the globe and multiple performance measures were used to assess model performance. Results indicate a considerable geographical variability in the lake and reservoir effects on the streamflow simulation. Nash-Sutcliffe Efficiency (NSE) and Kling-Gupta Efficiency (KGE) metrics improved for 65% and 38% of catchments respectively, with median skill score values of 0.16 and 0.2 while scores deteriorated for 28% and 52% of the catchments, with median values -0.09 and -0.16, respectively. The effect of reservoirs on extreme high flows was substantial and widespread in the global domain, while the effect of lakes was spatially limited to a few catchments. As indicated by global sensitivity analysis, parameter uncertainty substantially affected uncertainty of model performance. Reservoir parameters often contributed to this uncertainty, although the effect varied widely among catchments. The effect of reservoir parameters on model performance diminished with distance downstream of reservoirs in favor of other parameters, notably groundwater-related parameters and channel Manning's roughness coefficient. This study underscores the importance of accounting for lakes and, especially, reservoirs and using appropriate parameterization in large-scale hydrological simulations.
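The two skill scores used in this study, the Nash-Sutcliffe Efficiency and the Kling-Gupta Efficiency, have standard closed forms; the sketch below implements them (KGE in its 2009 form) on a synthetic observed/simulated streamflow pair.

```python
# Standard NSE and KGE (2009) skill scores for a simulated vs. observed series.
import numpy as np

def nse(obs, sim):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(obs, sim):
    r = np.corrcoef(obs, sim)[0, 1]
    alpha = sim.std() / obs.std()        # variability ratio
    beta = sim.mean() / obs.mean()       # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

rng = np.random.default_rng(0)
obs = np.abs(rng.gamma(shape=2.0, scale=30.0, size=365))         # synthetic daily flow (m3/s)
sim = obs * 1.1 + rng.normal(scale=10.0, size=365)               # biased, noisy simulation

print(f"NSE = {nse(obs, sim):.2f}, KGE = {kge(obs, sim):.2f}")
```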
Convection in Extratropical Cyclones: Analysis of GPM, NexRAD, GCMs and Re-Analysis
NASA Astrophysics Data System (ADS)
Jeyaratnam, J.; Booth, J. F.; Naud, C. M.; Luo, J.
2017-12-01
Extratropical Cyclones (ETCs) are the most common cause of extreme precipitation in the mid-latitudes and are important in the general atmospheric circulation as they redistribute moisture and heat. Isentropic lifting, upright convection, and slantwise convection are mechanisms of vertical motion within an ETC, which deliver different rain rates and might respond differently to global warming. In this study we compare different metrics for identifying convection within ETCs and calculate the relative contribution of convection to total ETC precipitation. We determine whether convection occurs preferentially in specific regions of the storm and assess how best to utilize GPM retrievals covering other parts of the mid-latitudes. Additionally, mid-latitude cyclones are tracked and composites of these tracked cyclones are compared among multiple versions of General Circulation Models (GCMs) from Coupled Model Intercomparison Project Phase 6 (CMIP6) prototype models and re-analysis data: the Model Diagnostics Task Force (MDTF) Geophysical Fluid Dynamics Laboratory (GFDL) model using a two-plume convection scheme, the MDTF GFDL model using the Donner convection scheme, the Modern-Era Retrospective analysis for Research and Applications, version 2 (MERRA-2), and the European Reanalysis produced by the European Centre for Medium-Range Weather Forecasts (ECMWF).
Alam, Maksudul; Deng, Xinwei; Philipson, Casandra; Bassaganya-Riera, Josep; Bisset, Keith; Carbo, Adria; Eubank, Stephen; Hontecillas, Raquel; Hoops, Stefan; Mei, Yongguo; Abedi, Vida; Marathe, Madhav
2015-01-01
Agent-based models (ABM) are widely used to study immune systems, providing a procedural and interactive view of the underlying system. The interaction of components and the behavior of individual objects is described procedurally as a function of the internal states and the local interactions, which are often stochastic in nature. Such models typically have complex structures and consist of a large number of modeling parameters. Determining the key modeling parameters which govern the outcomes of the system is very challenging. Sensitivity analysis plays a vital role in quantifying the impact of modeling parameters in massively interacting systems, including large complex ABM. The high computational cost of executing simulations impedes running experiments with exhaustive parameter settings. Existing techniques of analyzing such a complex system typically focus on local sensitivity analysis, i.e. one parameter at a time, or a close “neighborhood” of particular parameter settings. However, such methods are not adequate to measure the uncertainty and sensitivity of parameters accurately because they overlook the global impacts of parameters on the system. In this article, we develop novel experimental design and analysis techniques to perform both global and local sensitivity analysis of large-scale ABMs. The proposed method can efficiently identify the most significant parameters and quantify their contributions to outcomes of the system. We demonstrate the proposed methodology for ENteric Immune SImulator (ENISI), a large-scale ABM environment, using a computational model of immune responses to Helicobacter pylori colonization of the gastric mucosa. PMID:26327290
The Crossover Time as an Evaluation of Ocean Models Against Persistence
NASA Astrophysics Data System (ADS)
Phillipson, L. M.; Toumi, R.
2018-01-01
A new ocean evaluation metric, the crossover time, is defined as the time it takes for a numerical model to equal the performance of persistence. As an example, the average crossover time calculated using the Lagrangian separation distance (the distance between simulated trajectories and observed drifters) for the global MERCATOR ocean model analysis is found to be about 6 days. Conversely, the model forecast has an average crossover time longer than 6 days, suggesting limited skill in Lagrangian predictability by the current generation of global ocean models. The crossover time of the velocity error is less than 3 days, which is similar to the average decorrelation time of the observed drifters. The crossover time is a useful measure to quantify future ocean model improvements.
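A minimal sketch of how such a crossover time can be read off error-versus-lead-time curves follows; the error values are invented, and the example assumes, consistent with the definition above, that the model starts out worse than persistence and first equals it at the crossover lead time.

import numpy as np

lead_days   = np.arange(1, 11)
model_err   = np.array([12., 22., 31., 39., 46., 52., 57., 61., 64., 66.])    # e.g. Lagrangian separation (km)
persist_err = np.array([ 8., 17., 27., 38., 50., 63., 77., 92., 108., 125.])

def crossover_time(lead, err_model, err_persist):
    # first lead time at which the model's error no longer exceeds persistence
    better = err_model <= err_persist
    return lead[np.argmax(better)] if better.any() else None

print(crossover_time(lead_days, model_err, persist_err))   # -> 5 (days) for these invented numbers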
Hydrodynamic modelling and global datasets: Flow connectivity and SRTM data, a Bangkok case study.
NASA Astrophysics Data System (ADS)
Trigg, M. A.; Bates, P. B.; Michaelides, K.
2012-04-01
The rise of globally interconnected manufacturing supply chains requires an understanding and consistent quantification of flood risk at a global scale. Flood risk is often better quantified (or at least more precisely defined) in regions where there has been investment in comprehensive topographical data collection, such as LiDAR, coupled with detailed hydrodynamic modelling. Yet in regions where these data and modelling are unavailable, the implications of flooding and the knock-on effects for global industries can be dramatic, as evidenced by the recent floods in Bangkok, Thailand. There is growing momentum in global modelling initiatives to address this lack of a consistent understanding of flood risk, and they will rely heavily on the application of available global datasets relevant to hydrodynamic modelling, such as Shuttle Radar Topography Mission (SRTM) data and its derivatives. These global datasets bring opportunities to apply consistent methodologies on an automated basis in all regions, while the use of coarser-scale datasets also brings many challenges, such as sub-grid process representation and downscaled hydrology data from global climate models. There are significant opportunities for hydrological science in helping to define new, realistic and physically based methodologies that can be applied globally, as well as the possibility of gaining new insights into flood risk through analysis of the many large datasets that will be derived from this work. We use Bangkok as a case study to explore some of the issues related to using these available global datasets for hydrodynamic modelling, with particular focus on using SRTM data to represent topography. Research has shown that flow connectivity on the floodplain is an important component in the dynamics of flood flows on to and off the floodplain, and indeed within different areas of the floodplain. A lack of representation of flow connectivity, often due to data resolution limitations, means that important subgrid processes are missing from hydrodynamic models, leading to poor model predictive capabilities. Specifically, the issue of flow connectivity during flood events is explored here using geostatistical techniques to quantify the change in flow connectivity on floodplains due to grid rescaling methods. We also test whether this method of assessing connectivity can be used as a new tool in the quantification of flood risk that moves beyond the simple flood-extent approach, encapsulating threshold changes and data limitations.
Assessing the vertical structure of baroclinic tidal currents in a global model
NASA Astrophysics Data System (ADS)
Timko, Patrick; Arbic, Brian; Scott, Robert
2010-05-01
Tidal forcing plays an important role in many aspects of oceanography. Mixing, transport of particulates, and internal wave generation are just three examples of local phenomena that may depend on the strength of local tidal currents. Advances in satellite altimetry have made an assessment of the global barotropic tide possible. However, the vertical structure of the tide may only be observed by deploying instruments throughout the water column. Typically these observations are conducted at pre-determined depths based upon the interest of the observer. The high cost of such observations often limits both their number and their length, limiting our knowledge of the vertical structure of tidal currents. One way to expand our insight into the baroclinic structure of the ocean is through the use of numerical models. We compare the vertical structure of the global baroclinic tidal velocities in 1/12 degree HYCOM (HYbrid Coordinate Ocean Model) to a global database of current meter records. The model output is a subset of a 5-year global simulation that resolves the eddying general circulation, barotropic tides, and baroclinic tides using 32 vertical layers. The density structure within the simulation is both vertically and horizontally non-uniform. In addition to buoyancy forcing, the model is forced by astronomical tides and winds. We estimate the dominant semi-diurnal (M2) and diurnal (K1) tidal constituents of the model data using classical harmonic analysis. In regions where current meter record coverage is adequate, the model skill in replicating the vertical structure of the dominant diurnal and semi-diurnal tidal currents is assessed based upon the strength, orientation, and phase of the tidal ellipses. We also present a global estimate of the baroclinic tidal energy at fixed depths estimated from the model output.
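The classical harmonic analysis mentioned above amounts to a least-squares fit of sinusoids at the known constituent frequencies. The following minimal sketch fits M2 and K1 to a synthetic hourly velocity record; the amplitudes, phases, and noise level of the synthetic signal are invented, while the constituent periods are the standard values.

import numpy as np

hours = np.arange(0, 30 * 24, 1.0)                                # 30 days of hourly data
omega = {"M2": 2 * np.pi / 12.4206, "K1": 2 * np.pi / 23.9345}    # rad/hour

rng = np.random.default_rng(0)
u = (0.30 * np.cos(omega["M2"] * hours - 1.0)                     # synthetic east velocity (m/s)
     + 0.10 * np.cos(omega["K1"] * hours + 0.5)
     + 0.02 * rng.standard_normal(hours.size))

# Design matrix: a mean term plus a cos/sin pair per constituent
cols = [np.ones_like(hours)]
for w in omega.values():
    cols += [np.cos(w * hours), np.sin(w * hours)]
G = np.column_stack(cols)
coef, *_ = np.linalg.lstsq(G, u, rcond=None)

for i, name in enumerate(omega):
    a, b = coef[1 + 2 * i], coef[2 + 2 * i]
    amp, phase = np.hypot(a, b), np.degrees(np.arctan2(b, a))
    print(f"{name}: amplitude {amp:.3f} m/s, phase {phase:.1f} deg")

Fitting the east and north velocity components together, constituent by constituent, yields the tidal ellipse parameters (semi-major and semi-minor axes, orientation, and phase) used in the comparison.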
A Data Analysis Toolbox for Modeling the Global Food-Energy-Water Nexus
NASA Astrophysics Data System (ADS)
AghaKouchak, A.; Sadegh, M.; Mallakpour, I.
2017-12-01
Water, food, and energy systems are highly interconnected. More than seventy percent of global water resources are used for food production. Water withdrawal, purification, and transfer systems are energy intensive. Furthermore, energy generation strongly depends on water availability. Therefore, considering the interactions in the nexus of water, food, and energy is crucial for sustainable management of available resources. In this presentation, we introduce a user-friendly data analysis toolbox that mines the available global data on food, energy, and water, and analyzes their interactions. This toolbox provides estimates of the water footprint for a wide range of food types in different countries and also approximates the required energy and water resources. The toolbox also provides estimates of the corresponding emissions and biofuel production of different crops. In summary, this toolbox allows evaluating dependencies of the food, energy, and water systems at the country scale. We present a global analysis of the interactions between water, food, and energy from different perspectives, including efficiency and diversity of resource use.
NASA Technical Reports Server (NTRS)
Pu, Zhao-Xia; Tao, Wei-Kuo
2004-01-01
An effort has been made at NASA/GSFC to use the Goddard Earth Observing System (GEOS) global analysis to generate the initial and boundary conditions for MM5/WRF simulations. This linkage between the GEOS global analysis and the MM5/WRF models has made possible a few useful applications. As one of the sample studies, a series of MM5 simulations was conducted to test the sensitivity of MM5-simulated precipitation over the eastern USA to the initial and boundary conditions. Global analyses from different operational centers (e.g., NCEP, ECMWF, NASA/GSFC) were used to provide the first-guess fields and boundary conditions for MM5. Numerical simulations were performed for a one-week period over the eastern coastal areas of the USA, and the distribution and quantities of MM5-simulated precipitation were compared. Results will be presented in the workshop. In addition, other applications from recent and future studies will also be addressed.
Global CLEWs model - A novel application of OSeMOSYS
NASA Astrophysics Data System (ADS)
Avgerinopoulos, Georgios; Pereira Ramos, Eunice; Howells, Mark
2017-04-01
Over the past years, studies that analyse nexus issues from a holistic point of view, rather than energy, land, or water separately, have been gaining momentum. This project aims at giving insights into global issues through the application and analysis of a global-scale OSeMOSYS model. The latter, which is based on fully open and amendable code, has been used successfully in recent years to produce fully accessible energy models suitable for capacity building and policy-making support. This study develops a CLEWs (climate, land, energy and water) model with the objective of interrogating global challenges (e.g. increasing food demand) and international trade features, with policy priorities on food security, resource efficiency, low-carbon energy and climate change mitigation, water availability and vulnerability to water stress and floods, water quality, biodiversity, and ecosystem services. It will, for instance, assess (i) the impact of water constraints on food security and human development (clean water for human use; industrial and energy water demands), as well as (ii) the impact of climate change on aggravating or relieving water problems.
Implementation of a cost-accounting model in a biobank: practical implications.
Gonzalez-Sanchez, Maria Beatriz; Lopez-Valeiras, Ernesto; García-Montero, Andres C
2014-01-01
Given the state of the global economy, cost measurement and control have become increasingly relevant over the past years. The scarcity of resources and the need to use these resources more efficiently are making cost information essential in management, even in non-profit public institutions. Biobanks are no exception. However, no empirical experiences on the implementation of cost accounting in biobanks have been published to date. The aim of this paper is to present a step-by-step implementation of a cost-accounting tool for the main production and distribution activities of a real/active biobank, including a comprehensive explanation of how to perform the calculations carried out in this model. Two mathematical models for the analysis of (1) production costs and (2) request costs (order management and sample distribution) have stemmed from the analysis of the results of this implementation, and different theoretical scenarios have been prepared. The global analysis and discussion provide valuable information for internal biobank management and even for strategic decisions at the level of research and development governmental policies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rafique, Rashid; Xia, Jianyang; Hararuk, Oleksandra
Land models are valuable tools for understanding the dynamics of the global carbon (C) cycle. Various models have been developed and used for predictions of future C dynamics, but uncertainties still exist. Diagnosing the models' behaviors in terms of structure can help to narrow down the uncertainties in predictions of C dynamics. In this study, three widely used land surface models, namely CSIRO's Atmosphere Biosphere Land Exchange (CABLE) with 9 C pools, the Community Land Model (version 3.5) combined with the Carnegie-Ames-Stanford Approach (CLM-CASA) with 12 C pools, and the Community Land Model (version 4) (CLM4) with 26 C pools, were driven by the observed meteorological forcing. The simulated C storage and residence time were used for analysis. The C storage and residence time were computed globally for all individual soil and plant pools, as well as net primary productivity (NPP) and its allocation to different plant components, based on these models. Remotely sensed NPP and the statistically derived HWSD and GLC2000 datasets were used as references to evaluate the performance of these models. Results showed that CABLE exhibited better agreement with the reference C storage and residence time for plant and soil pools than CLM-CASA and CLM4. CABLE had longer bulk residence times for soil C pools and stored more C in roots, whereas CLM-CASA and CLM4 stored more C in woody pools due to differential NPP allocation. Overall, these results indicate that the differences in C storage and residence times among the three models are largely due to differences in their fundamental structures (number of C pools), NPP allocation, and C transfer rates. Our results have implications for model development and provide a general framework to explain the biases/uncertainties in simulations of C storage and residence times from the perspective of model structure.
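The storage and residence-time diagnostics used in such comparisons reduce, at quasi-steady state, to simple ratios of pool stocks and fluxes. The sketch below illustrates the arithmetic with invented pool sizes and fluxes; it is not output from CABLE, CLM-CASA, or CLM4.

# Residence time of a carbon pool ~ stock / flux through the pool (at steady state)
pools   = {"leaf": 120.0, "wood": 4200.0, "fast_soil": 900.0, "slow_soil": 14000.0}  # g C m-2
outflux = {"leaf": 240.0, "wood": 140.0, "fast_soil": 450.0, "slow_soil": 70.0}      # g C m-2 yr-1

for name in pools:
    tau = pools[name] / outflux[name]
    print(f"{name:10s} residence time ~ {tau:6.1f} yr")

# Conversely, the equilibrium storage implied by an NPP allocation flux and a residence time
npp_to_wood = 150.0    # g C m-2 yr-1 allocated to wood (illustrative)
tau_wood = 30.0        # yr
print("implied wood storage:", npp_to_wood * tau_wood, "g C m-2")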
The GEOS-5 Data Assimilation System-Documentation of Versions 5.0.1, 5.1.0, and 5.2.0
NASA Technical Reports Server (NTRS)
Suarez, Max J.; Rienecker, M. M.; Todling, R.; Bacmeister, J.; Takacs, L.; Liu, H. C.; Gu, W.; Sienkiewicz, M.; Koster, R. D.; Gelaro, R.;
2008-01-01
This report documents the GEOS-5 global atmospheric model and data assimilation system (DAS), including versions 5.0.1, 5.1.0, and 5.2.0, which have been implemented in products distributed for use by various NASA instrument team algorithms and ultimately for the Modern Era Retrospective analysis for Research and Applications (MERRA). The DAS is the integration of the GEOS-5 atmospheric model with the Gridpoint Statistical Interpolation (GSI) Analysis, a joint analysis system developed by the NOAA/National Centers for Environmental Prediction and the NASA/Global Modeling and Assimilation Office. The primary performance drivers for the GEOS DAS are temperature and moisture fields suitable for the EOS instrument teams, wind fields for the transport studies of the stratospheric and tropospheric chemistry communities, and climate-quality analyses to support studies of the hydrological cycle through MERRA. The GEOS-5 atmospheric model has been approved for open source release and is available from: http://opensource.gsfc.nasa.gov/projects/GEOS-5/GEOS-5.php.
Evaluation of methodology for detecting/predicting migration of forest species
Dale S. Solomon; William B. Leak
1996-01-01
Available methods for analyzing migration of forest species are evaluated, including simulation models, remeasured plots, resurveys, pollen/vegetation analysis, and age/distance trends. Simulation models have provided some of the most drastic estimates of species changes due to predicted changes in global climate. However, these models require additional testing...
Spatially-explicit models of global tree density.
Glick, Henry B; Bettigole, Charlie; Maynard, Daniel S; Covey, Kristofer R; Smith, Jeffrey R; Crowther, Thomas W
2016-08-16
Remote sensing and geographic analysis of woody vegetation provide means of evaluating the distribution of natural resources, patterns of biodiversity and ecosystem structure, and socio-economic drivers of resource utilization. While these methods bring geographic datasets with global coverage into our day-to-day analytic spheres, many of the studies that rely on these strategies do not capitalize on the extensive collection of existing field data. We present the methods and maps associated with the first spatially-explicit models of global tree density, which relied on over 420,000 forest inventory field plots from around the world. This research is the result of a collaborative effort engaging over 20 scientists and institutions, and capitalizes on an array of analytical strategies. Our spatial data products offer precise estimates of the number of trees at global and biome scales, but should not be used for local-level estimation. At larger scales, these datasets can contribute valuable insight into resource management, ecological modelling efforts, and the quantification of ecosystem services.
Redefinition and global estimation of basal ecosystem respiration rate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yuan, Wenping; Luo, Yiqi; Li, Xianglan
2011-10-13
Basal ecosystem respiration rate (BR), the ecosystem respiration rate at a given temperature, is a common and important parameter in empirical models for quantifying ecosystem respiration (ER) globally. Numerous studies have indicated that BR varies in space. However, many empirical ER models still use a globally constant BR, largely due to the lack of a functional description for BR. In this study, we redefined BR to be the ecosystem respiration rate at the mean annual temperature. To test the validity of this concept, we conducted a synthesis analysis using 276 site-years of eddy covariance data from 79 research sites located at latitudes ranging from ~3°S to ~70°N. Results showed that the mean annual ER rate closely matches the ER rate at the mean annual temperature. Incorporation of site-specific BR into a global ER model substantially improved simulated ER compared to an invariant BR at all sites. These results confirm that ER at the mean annual...
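A minimal sketch of the redefinition, using the common Q10 form of an empirical ER model (the functional form and all parameter values here are illustrative, not those fitted in the study): BR is taken as the respiration rate at the site's mean annual temperature.

import numpy as np

temps = np.array([-5., 0., 6., 14., 20., 12., 4., -2.])   # sample of air temperatures over a year (deg C)
t_mat = temps.mean()                                       # site mean annual temperature
q10   = 2.0                                                # temperature sensitivity (illustrative)
br    = 2.3                                                # ER at T = t_mat, i.e. the redefined BR

er = br * q10 ** ((temps - t_mat) / 10.0)                  # empirical ER model
print("mean annual ER :", round(er.mean(), 2))             # close to BR, as the synthesis found
print("ER at T = MAT  :", br)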
Large historical growth in global terrestrial gross primary production
Campbell, J. E.; Berry, J. A.; Seibt, U.; ...
2017-04-05
Growth in terrestrial gross primary production (GPP) may provide a negative feedback for climate change. It remains uncertain, however, to what extent biogeochemical processes can suppress global GPP growth. In consequence, model estimates of terrestrial carbon storage and carbon cycle-climate feedbacks remain poorly constrained. Here we present a global, measurement-based estimate of GPP growth during the twentieth century based on long-term atmospheric carbonyl sulphide (COS) records derived from ice core, firn, and ambient air samples. We interpret these records using a model that simulates changes in COS concentration due to changes in its sources and sinks, including a large sink that is related to GPP. We find that the COS record is most consistent with climate-carbon cycle model simulations that assume large GPP growth during the twentieth century (31% ± 5%; mean ± 95% confidence interval). Finally, while this COS analysis does not directly constrain estimates of future GPP growth, it provides a global-scale benchmark for historical carbon cycle simulations.
NASA Astrophysics Data System (ADS)
Bostrom, A.; Lashof, D.
2004-12-01
For almost two decades both national polls and in-depth studies of global warming perceptions have shown that people commonly conflate weather and global climate change. Not only are current weather events such as anecdotal heat waves, droughts or cold spells treated as evidence for or against global warming, but weather changes such as warmer weather and increased storm intensity and frequency are the consequences most likely to come to mind. Distinguishing weather from climate remains a challenge for many. This weather 'framing' of global warming may inhibit behavioral and policy change in several ways. Weather is understood as natural, on an immense scale that makes controlling it difficult to conceive. Further, these attributes contribute to perceptions that global warming, like weather, is uncontrollable. This talk presents an analysis of data from public opinion polls, focus groups, and cognitive studies regarding people's mental models of and 'frames' for global warming and climate change, and the role weather plays in these. This research suggests that priming people with a model of global warming as being caused by a "thickening blanket of carbon dioxide" that "traps heat" in the atmosphere solves some of these communications problems and makes it more likely that people will support policies to address global warming.
Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D. A.; ...
2016-05-02
Model-based global projections of future land use and land cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socio-economic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline, and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g. boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from the different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process, as well as improving the allocation mechanisms of LULC change models, remain important challenges. Furthermore, current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches, and many studies ignore the uncertainty in LULC projections in assessments of LULC change impacts on climate, water resources, or biodiversity.
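The attribution step described above can be illustrated with a one-way, ANOVA-style variance decomposition; the tiny table of projected cropland change below is synthetic and the factor labels are invented, standing in for the 43 simulations.

import pandas as pd

df = pd.DataFrame({
    "model":    ["A", "A", "B", "B", "C", "C", "A", "B"],
    "scenario": ["s1", "s2", "s1", "s2", "s1", "s2", "s1", "s2"],
    "change":   [4.1, 6.0, 2.2, 3.1, 5.5, 7.9, 3.8, 2.9],   # projected cropland change (illustrative units)
})

total_ss = ((df["change"] - df["change"].mean()) ** 2).sum()
for factor in ["model", "scenario"]:
    group_means = df.groupby(factor)["change"].transform("mean")
    explained = ((group_means - df["change"].mean()) ** 2).sum()
    print(f"{factor}: {100 * explained / total_ss:.0f}% of variance explained")

In the paper itself the decomposition also includes input data and a residual term and is based on a regression analysis; this sketch only conveys the partitioning idea.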
A review of some problems in global-local stress analysis
NASA Technical Reports Server (NTRS)
Nelson, Richard B.
1989-01-01
The various types of local-global finite-element problems point out the need to develop a new generation of software. First, this new software needs to have a complete analysis capability, encompassing linear and nonlinear analysis of 1-, 2-, and 3-dimensional finite-element models, as well as mixed dimensional models. The software must be capable of treating static and dynamic (vibration and transient response) problems, including the stability effects of initial stress, and the software should be able to treat both elastic and elasto-plastic materials. The software should carry a set of optional diagnostics to assist the program user during model generation in order to help avoid obvious structural modeling errors. In addition, the program software should be well documented so the user has a complete technical reference for each type of element contained in the program library, including information on such topics as the type of numerical integration, use of underintegration, and inclusion of incompatible modes, etc. Some packaged information should also be available to assist the user in building mixed-dimensional models. An important advancement in finite-element software should be in the development of program modularity, so that the user can select from a menu various basic operations in matrix structural analysis.
NASA Astrophysics Data System (ADS)
Vanrolleghem, Peter A.; Mannina, Giorgio; Cosenza, Alida; Neumann, Marc B.
2015-03-01
Sensitivity analysis represents an important step in improving the understanding and use of environmental models. Indeed, by means of global sensitivity analysis (GSA), modellers may identify both important (factor prioritisation) and non-influential (factor fixing) model factors. No general rule has yet been defined for verifying the convergence of GSA methods. In order to fill this gap, this paper presents a convergence analysis of three widely used GSA methods (SRC, Extended FAST and Morris screening) for an urban drainage stormwater quality-quantity model. After convergence was achieved, the results of each method were compared. In particular, a discussion on peculiarities, applicability, and reliability of the three methods is presented. Moreover, a graphical, Venn-diagram-based classification scheme and a precise terminology for better identifying important, interacting, and non-influential factors for each method are proposed. In terms of convergence, it was shown that sensitivity indices related to factors of the quantity model achieve convergence faster. Results for the Morris screening method deviated considerably from those of the other methods. Factors related to the quality model require a much higher number of simulations than the number suggested in the literature for achieving convergence with this method. In fact, the results have shown that the term "screening" is improperly used, as the method may exclude important factors from further analysis. Moreover, for the presented application the convergence analysis shows more stable sensitivity coefficients for the Extended FAST method compared to SRC and Morris screening. Substantial agreement in terms of factor fixing was found between the Morris screening and Extended FAST methods. In general, the water quality related factors exhibited more important interactions than factors related to water quantity. Furthermore, in contrast to water quantity model outputs, water quality model outputs were found to be characterised by high non-linearity.
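As an illustration of the Morris screening step discussed above, the following minimal sketch uses the SALib Python package; the factor names, ranges, and the stand-in model are assumptions for the example, and convergence would be checked by repeating the analysis with an increasing number of trajectories (and bootstrapping), as the paper does.

import numpy as np
from SALib.sample import morris as morris_sample
from SALib.analyze import morris

problem = {
    "num_vars": 3,
    "names": ["runoff_coeff", "decay_rate", "washoff_exp"],   # hypothetical factors
    "bounds": [[0.1, 0.9], [0.01, 0.5], [0.5, 3.0]],
}

def model(x):
    # Cheap stand-in for the stormwater quality-quantity model
    return x[0] * 10.0 + np.exp(-x[1]) + x[2] ** 2

X = morris_sample.sample(problem, N=100, num_levels=4)   # 100 Morris trajectories
Y = np.apply_along_axis(model, 1, X)
res = morris.analyze(problem, X, Y, num_levels=4)
print(res["mu_star"])   # mean absolute elementary effect per factor
print(res["sigma"])     # spread, flagging interactions or non-linearity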
NASA Astrophysics Data System (ADS)
Sorooshian, S.; Nguyen, P.; Hsu, K. L.
2017-12-01
This presentation provides an overview of the PERSIANN precipitation products, from the near-real-time high-resolution (4 km, 30 min) PERSIANN-CCS to the most recent 34+-year PERSIANN-CDR (25 km, daily). It is widely believed that the hydrologic cycle has been intensifying due to global warming and that the frequency and intensity of hydrologic extremes have also been increasing. Using the long-term historical global high-resolution (daily, 0.25 degree) PERSIANN-CDR dataset covering over three decades from 1983 to the present day, we assess changes in global precipitation across different spatial scales. Our results show differences in trends depending on which spatial scale is used, highlighting the importance of spatial scale in trend analysis. In addition, while there is an easily observable increasing global temperature trend, the global precipitation trend results produced from the PERSIANN-CDR dataset used in this study are inconclusive. We also use PERSIANN-CDR to assess the performance of the 32 CMIP5 models in terms of extreme precipitation indices in various continent-climate zones. The assessment can provide a guide both for model developers to target regions and processes that are not yet fully captured in certain climate types, and for climate model output users to select the models and/or the study areas that may best fit their applications of interest.
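The scale dependence of the trend results can be illustrated as in the following minimal sketch: a linear trend is fitted to annual precipitation in each grid cell and again after aggregating to a coarser grid. The small random field is invented and merely stands in for the PERSIANN-CDR archive.

import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1983, 2017)
# Synthetic annual precipitation: 8x8 grid cells with weak trends of mixed sign plus noise
cell_trend = rng.normal(0.0, 1.5, size=(8, 8))                          # mm/yr per cell
precip = (800.0 + cell_trend[None, :, :] * (years - years[0])[:, None, None]
          + rng.normal(0.0, 60.0, size=(years.size, 8, 8)))

def slope(y):
    return np.polyfit(years, y, 1)[0]      # least-squares trend (mm/yr)

fine_slopes = np.apply_along_axis(slope, 0, precip)
coarse = precip.reshape(years.size, 4, 2, 4, 2).mean(axis=(2, 4))       # aggregate 2x2 blocks
coarse_slopes = np.apply_along_axis(slope, 0, coarse)

print("fraction of fine cells with positive trend  :", (fine_slopes > 0).mean())
print("fraction of coarse boxes with positive trend:", (coarse_slopes > 0).mean())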
Neutron densities from a global analysis of medium-energy proton-nucleus elastic scattering
NASA Astrophysics Data System (ADS)
Clark, B. C.; Kerr, L. J.; Hama, S.
2003-05-01
A new method for extracting neutron densities from intermediate-energy elastic proton-nucleus scattering observables uses a global Dirac phenomenological approach based on the relativistic impulse approximation. Datasets for 40Ca, 48Ca, and 208Pb in the energy range from 500 MeV to 1040 MeV are considered. The global fits are successful in reproducing the data and in predicting datasets not included in the analysis. Using this global approach, energy-independent neutron densities are obtained. The vector point proton density distribution ρpv is determined from the empirical charge density after unfolding the proton form factor. The other densities, ρnv, ρps, ρns, are parametrized. This work provides energy-independent values for the rms neutron radius Rn and the neutron skin thickness Sn, in contrast to the energy-dependent values obtained by previous studies. In addition, the results presented in this paper show that the expected rms neutron radius and the skin thickness for 40Ca are accurately reproduced. The values of Rn and Sn obtained from the global fits that we consider to be the most reliable are given as follows: for 40Ca, 3.314>Rn>3.310 fm and -0.063>Sn >-0.067 fm; for 48Ca, 3.459>Rn>3.413 fm and 0.102>Sn>0.056 fm; and for 208Pb, 5.550>Rn>5.522 fm and 0.111>Sn>0.083 fm. These values are in reasonable agreement with nonrelativistic Skyrme-Hartree-Fock models and with relativistic Hartree-Bogoliubov models with density-dependent meson-nucleon couplings. The results from the global fits for 48Ca and 208Pb are generally not in agreement with the usual relativistic mean-field models.
Evaluating Effects of H2O and overhead O3 on Global Mean Tropospheric OH Concentration
NASA Technical Reports Server (NTRS)
Nicely, Julie M.; Salawitch, R.J.; Canty, T.; Lang, Chang; Duncan, Bryan; Liang, Qing; Oman, Luke David; Stolarski, Richard S.; Waugh, Darryn
2012-01-01
The oxidizing capacity of the troposphere is controlled, to a large extent, by the abundance of hydroxyl radical (OH). The global mean concentration of OH, [OH]GLOBAL, inferred from measurements of methyl chloroform, has remained relatively constant during the past several decades, despite rising levels of CH4 that should have led to a steady decline. Here we examine other factors that may have affected [OH]GLOBAL, such as the overhead burden of stratospheric O3 and tropospheric H2O, using global OH fields from the GEOS-CHEM Chemistry-Climate Model. Our analysis suggests these factors may have contributed a positive trend to [OH]GLOBAL large enough to counter the decrease due to CH4.
Process evaluation of sea salt aerosol concentrations at remote marine locations
NASA Astrophysics Data System (ADS)
Struthers, H.; Ekman, A. M.; Nilsson, E. D.
2011-12-01
Sea salt, an important natural aerosol, is generated by bubbles bursting at the surface of the ocean. Sea salt aerosol contributes significantly to the global aerosol burden and radiative budget and is a significant source of cloud condensation nuclei in remote marine areas (Monahan et al., 1986). Consequently, changes in marine aerosol abundance are expected to impact climate forcing. Estimates of the atmospheric burden of sea salt aerosol mass derived from chemical transport and global climate models vary greatly, both in the global total and in the spatial distribution (Textor et al., 2006). This large uncertainty in the sea salt aerosol distribution in turn contributes to the large uncertainty in current estimates of anthropogenic aerosol climate forcing (IPCC, 2007). To correctly attribute anthropogenic climate change and to reliably project future climate, natural aerosols including sea salt must be understood and accurately modelled. In addition, the physical processes that determine the sea salt aerosol concentration are susceptible to modification due to climate change (Carslaw et al., 2010), which means there is the potential for feedbacks within the climate/aerosol system. Given the large uncertainties in sea salt aerosol modelling, there is an urgent need to evaluate the process description of sea salt aerosols in global models. An extremely valuable source of data for model evaluation is the long-term measurements of PM10 sea salt aerosol mass available from a number of remote marine observation sites around the globe (including the GAW network). Sea salt aerosol concentrations at remote marine locations depend strongly on the surface exchange (emission and deposition) as well as entrainment to or detrainment from the free troposphere. This suggests that the key parameters to consider in any analysis include the sea surface water temperature, wind speed, precipitation rate, and atmospheric stability. In this study, the sea salt aerosol observations are analysed to quantify the key sensitivities of the processes connecting the physical drivers of sea salt aerosol to the mass tendency. The analysis employs a semi-empirical model based on the time-tendency of the aerosol mass. This approach of focusing on the time-tendency of the sea salt aerosol concentration provides a framework for the process evaluation of sea salt aerosol concentrations in global models. The same analysis methodology can be applied to output from global models, and comparing the sensitivity parameters derived from observations and models will reveal model inadequacies and thus guide model improvements. References: Carslaw, K. S., Boucher, O., Spracklen, D. V., Mann, G. W., Rae, J. G. L., Woodward, S., Kulmala, M. (2010). Atmos. Chem. Phys., 10, 1701-1737. IPCC (2007). Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, Solomon, S. et al. (eds.). Monahan, E. C., Spiel, D. E., Davidson, K. L. (1986). In Oceanic Whitecaps, Monahan, E. C. and MacNiochaill (eds.), D. Reidel, Norwell, Mass. Textor, C., et al. (2006). Atmos. Chem. Phys., 6, 1777-1813.
Jiang, Weiping; Wang, Li; Niu, Xiaoji; Zhang, Quan; Zhang, Hui; Tang, Min; Hu, Xiangyun
2014-01-01
A high-precision image-aided inertial navigation system (INS) is proposed as an alternative to the carrier-phase-based differential Global Navigation Satellite Systems (CDGNSSs) when satellite-based navigation systems are unavailable. In this paper, the image/INS integrated algorithm is modeled by a tightly-coupled iterative extended Kalman filter (IEKF). Tightly-coupled integration ensures that the integrated system is reliable, even if few known feature points (i.e., less than three) are observed in the images. A new global observability analysis of this tightly-coupled integration is presented to guarantee that the system is observable under the necessary conditions. The analysis conclusions were verified by simulations and field tests. The field tests also indicate that high-precision position (centimeter-level) and attitude (half-degree-level)-integrated solutions can be achieved in a global reference. PMID:25330046
Satellite-enhanced dynamical downscaling for the analysis of extreme events
NASA Astrophysics Data System (ADS)
Nunes, Ana M. B.
2016-09-01
The use of regional models in the downscaling of general circulation models provides a strategy to generate more detailed climate information. In that case, boundary-forcing techniques can be useful to keep the large-scale features from the coarse-resolution global models in agreement with the inner modes of the higher-resolution regional models. Although those procedures might improve the dynamics, downscaling via regional modeling still aims for a better representation of physical processes. With the purpose of improving dynamics and physical processes in regional downscaling of global reanalysis, the Regional Spectral Model, originally developed at the National Centers for Environmental Prediction, employs a newly reformulated scale-selective bias correction, together with the 3-hourly assimilation of satellite-based precipitation estimates constructed from the Climate Prediction Center morphing technique. The two-scheme technique for the dynamical downscaling of global reanalysis can be applied in analyses of environmental disasters and risk assessment, with hourly outputs and a resolution of about 25 km. Here the added value of satellite-enhanced dynamical downscaling is demonstrated in simulations of the first reported hurricane in the western South Atlantic Ocean basin through comparisons with global reanalyses and satellite products available over ocean areas.
Estimating heterotrophic respiration at large scales: Challenges, approaches, and next steps
Bond-Lamberty, Ben; Epron, Daniel; Harden, Jennifer W.; Harmon, Mark E.; Hoffman, Forrest; Kumar, Jitendra; McGuire, Anthony David; Vargas, Rodrigo
2016-01-01
Heterotrophic respiration (HR), the aerobic and anaerobic processes mineralizing organic matter, is a key carbon flux but one impossible to measure at scales significantly larger than small experimental plots. This impedes our ability to understand carbon and nutrient cycles, benchmark models, or reliably upscale point measurements. Given that a new generation of highly mechanistic, genomic-specific global models is not imminent, we suggest that a useful step to improve this situation would be the development of “Decomposition Functional Types” (DFTs). Analogous to plant functional types (PFTs), DFTs would abstract and capture important differences in HR metabolism and flux dynamics, allowing modelers and experimentalists to efficiently group and vary these characteristics across space and time. We argue that DFTs should be initially informed by top-down expert opinion, but ultimately developed using bottom-up, data-driven analyses, and provide specific examples of potential dependent and independent variables that could be used. We present an example clustering analysis to show how annual HR can be broken into distinct groups associated with global variability in biotic and abiotic factors, and demonstrate that these groups are distinct from (but complementary to) already-existing PFTs. A similar analysis incorporating observational data could form the basis for future DFTs. Finally, we suggest next steps and critical priorities: collection and synthesis of existing data; more in-depth analyses combining open data with rigorous testing of analytical results; using point measurements and realistic forcing variables to constrain process-based models; and planning by the global modeling community for decoupling decomposition from fixed site data. These are all critical steps to build a foundation for DFTs in global models, thus providing the ecological and climate change communities with robust, scalable estimates of HR.
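The clustering idea behind such Decomposition Functional Types can be sketched with a standard k-means analysis; the feature table below is randomly generated and only stands in for the kind of site-level compilation the authors describe.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n_sites = 200
features = np.column_stack([
    rng.normal(10, 8, n_sites),      # mean annual temperature (deg C)
    rng.normal(900, 400, n_sites),   # annual precipitation (mm)
    rng.normal(5, 2, n_sites),       # soil carbon stock (kg C m-2)
])
annual_hr = 200 + 25 * features[:, 0] + 0.2 * features[:, 1] + rng.normal(0, 80, n_sites)

X = StandardScaler().fit_transform(np.column_stack([features, annual_hr]))
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
for k in range(4):
    members = labels == k
    print(f"cluster {k}: n={members.sum()}, mean HR ~ {annual_hr[members].mean():.0f} g C m-2 yr-1")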
NASA Astrophysics Data System (ADS)
Wilkin, J.; Levin, J.; Lopez, A.; Arango, H.
2016-02-01
Coastal ocean models that downscale output from basin and global scale models are widely used to study regional circulation at enhanced resolution and locally important ecosystem, biogeochemical, and geomorphologic processes. When operated as now-cast or forecast systems, these models offer predictions that assist decision-making for numerous maritime applications. We describe such a system for shelf waters of the Mid-Atlantic Bight (MAB) and Gulf of Maine (GoM) where the MARACOOS and NERACOOS associations of U.S. IOOS operate coastal ocean observing systems that deliver a dense observation set using CODAR HF-radar, autonomous underwater glider vehicles (AUGV), telemetering moorings, and drifting buoys. Other U.S. national and global observing systems deliver further sustained observations from moorings, ships, profiling floats, and a constellation of satellites. Our MAB and GoM re-analysis and forecast system uses the Regional Ocean Modeling System (ROMS; myroms.org) with 4-dimensional Variational (4D-Var) data assimilation to adjust initial conditions, boundary conditions, and surface forcing in each analysis cycle. Data routinely assimilated include CODAR velocities, altimeter satellite sea surface height (with coastal corrections), satellite temperature, in situ CTD data from AUGV and ships (NMFS Ecosystem Monitoring voyages), and all in situ data reported via the WMO GTS network. A climatological data assimilative analysis of hydrographic and long-term mean velocity observations specifies the regional Mean Dynamic Topography that augments altimeter sea level anomaly data and is also used to adjust boundary condition biases that would otherwise be introduced in the process of downscaling from global models. System performance is described with respect to the impact of satellite, CODAR and in situ observations on analysis skill. Results from a 2-way nested modeling system that adds enhanced resolution over the NSF OOI Pioneer Array in the central MAB are also shown.
Maritime Continent seasonal climate biases in AMIP experiments of the CMIP5 multimodel ensemble
NASA Astrophysics Data System (ADS)
Toh, Ying Ying; Turner, Andrew G.; Johnson, Stephanie J.; Holloway, Christopher E.
2018-02-01
The fidelity of 28 Coupled Model Intercomparison Project phase 5 (CMIP5) models in simulating mean climate over the Maritime Continent in the Atmospheric Model Intercomparison Project (AMIP) experiment is evaluated in this study. The performance of AMIP models varies greatly in reproducing seasonal mean climate and the seasonal cycle. The multi-model mean has better skill at reproducing the observed mean climate than the individual models. The spatial pattern of 850 hPa wind is better simulated than the precipitation in all four seasons. We found that model horizontal resolution is not a good indicator of model performance. Instead, a model's local Maritime Continent biases are somewhat related to its biases in the local Hadley circulation and global monsoon. The comparison with coupled models in CMIP5 shows that AMIP models generally performed better than coupled models in the simulation of the global monsoon and local Hadley circulation but less well at simulating the Maritime Continent annual cycle of precipitation. To characterize model systematic biases in the AMIP runs, we performed cluster analysis on Maritime Continent annual cycle precipitation. Our analysis resulted in two distinct clusters. Cluster I models are able to capture both the winter monsoon and summer monsoon shift, but they overestimate the precipitation; especially during the JJA and SON seasons. Cluster II models simulate weaker seasonal migration than observed, and the maximum rainfall position stays closer to the equator throughout the year. The tropics-wide properties of these clusters suggest a connection between the skill of simulating global properties of the monsoon circulation and the skill of simulating the regional scale of Maritime Continent precipitation.
Did Child Restraint Laws Globally Converge? Examining 40 Years of Policy Diffusion.
Nazif-Muñoz, José Ignacio
2015-01-01
The objective of the current study is to determine what factors have been associated with the global adoption of mandatory child restraint laws (ChRLs) since 1975. In order to determine what factors explained the global adoption of mandatory ChRLs, Weibull models were analyzed. To carry out this analysis, 170 countries were considered, and the time at risk corresponded to 5,146 observations for the period 1957-2013. The dependent variable was the time to first adoption of a ChRL. Independent variables representing global factors were the World Health Organization (WHO) and World Bank's (WB) road safety global campaign, the Geneva Convention on Road Traffic, and the United Nations' (UN) 1958 Vehicle Agreement. Independent variables representing regional factors were the creation of the European Transport Safety Council and being a Commonwealth country. Independent variables representing national factors were population, gross domestic product (GDP) per capita, political violence, existence of road safety nongovernmental organizations (NGOs), and existence of road safety agencies. Urbanization served as a control variable. To examine regional dynamics, Weibull models for Africa, Asia, Europe, North America, Latin America, the Caribbean, and the Commonwealth were also carried out. Empirical estimates from full Weibull models suggest that 2 global factors and 2 national factors are significantly associated with the adoption of this measure. The global factors explaining adoption are the WHO and WB's road safety global campaign implemented after 2004 (P <.01) and the UN's 1958 Vehicle Agreement (P <.001). National factors were GDP (P <.01) and existence of road safety agencies (P <.05). The time parameter ρ for the full Weibull model was 1.425 (P <.001), suggesting that the likelihood of ChRL adoption increased over the observed period of time, confirming that the diffusion of this policy was global. Regional analysis showed that the UN's Convention on Road Traffic was significant in Asia, the creation of the European Transport Safety Council was significant in Europe and North America, and the global campaign was significant in Africa. In Commonwealth and European and North American countries, the existence of road safety agencies was also positively associated with ChRL adoption. Results of the world models suggest that the WHO and WB's global road safety campaign was effective in disseminating ChRLs after 2004. Furthermore, regions such as Asia and Europe and North America were early adopters, since specific regional and national characteristics anticipated the introduction of this policy before 2004. In this particular case, the creation of the European Transport Safety Council was fundamental in promoting ChRLs. Thus, in order to introduce conditions to more rapidly diffuse road safety measures across lagging regions, the maintenance of global efforts and the creation of road safety regional organizations should be encouraged. Lastly, the case of ChRL convergence illustrates how mechanisms of global and regional diffusion need to be analytically differentiated in order to better assess the process of policy diffusion.
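The interpretation of the reported Weibull shape parameter can be made concrete with the hazard function h(t) = lambda * rho * t^(rho - 1): for rho > 1 the adoption hazard rises with time. In the minimal sketch below, only rho = 1.425 comes from the study; the scale parameter and time points are arbitrary illustrative values.

import numpy as np

rho, lam = 1.425, 0.01
t = np.array([5.0, 15.0, 30.0, 45.0])              # years since the start of the risk period
hazard = lam * rho * t ** (rho - 1.0)
print(dict(zip(t.tolist(), np.round(hazard, 4))))  # increases monotonically because rho > 1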
A high resolution global scale groundwater model
NASA Astrophysics Data System (ADS)
de Graaf, Inge; Sutanudjaja, Edwin; van Beek, Rens; Bierkens, Marc
2014-05-01
As the world's largest accessible source of freshwater, groundwater plays a vital role in satisfying the basic needs of human society. It serves as a primary source of drinking water and supplies water for agricultural and industrial activities. During times of drought, groundwater storage provides a large natural buffer against water shortage and sustains flows to rivers and wetlands, supporting ecosystem habitats and biodiversity. Yet the current generation of global-scale hydrological models (GHMs) does not include a groundwater flow component, although it is a crucial part of the hydrological cycle. Thus, a realistic physical representation of the groundwater system that allows for the simulation of groundwater head dynamics and lateral flows is essential for GHMs that increasingly run at finer resolution. In this study we present a global groundwater model with a resolution of 5 arc-minutes (approximately 10 km at the equator) using MODFLOW (McDonald and Harbaugh, 1988). With this global groundwater model we eventually intend to simulate the changes in the groundwater system over time that result from variations in recharge and abstraction. The aquifer schematization and properties of this groundwater model were developed from available global lithological maps and datasets (Dürr et al., 2005; Gleeson et al., 2010; Hartmann and Moosdorf, 2013), combined with our estimate of aquifer thickness for sedimentary basins. We forced the groundwater model with the output of the global hydrological model PCR-GLOBWB (van Beek et al., 2011), specifically the net groundwater recharge and average surface water levels derived from routed channel discharge. For the parameterization, we relied entirely on available global datasets and did not calibrate the model, so that it can equally be extended to data-poor environments. Based on our sensitivity analysis, in which we ran the model with various hydrogeological parameter settings, we observed that most of the variance in groundwater depth is explained by variation in saturated conductivity and, for the sedimentary basins, also by variation in recharge. We validated simulated groundwater heads against piezometer heads (available from www.glowasis.eu), resulting in a coefficient of determination for sedimentary basins of 0.92 with a regression constant of 0.8. This shows that the method used is suitable for building a global groundwater model from the best available global information, and that estimated water table depths are within acceptable accuracy in many parts of the world.
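The kind of head calculation a gridded groundwater model performs can be sketched with a toy one-dimensional finite-difference problem (this is not MODFLOW or the authors' configuration; all numbers are invented): steady flow in a homogeneous aquifer with uniform recharge and fixed heads at both boundaries.

import numpy as np

n, dx = 50, 1000.0            # 50 cells of 1 km
T = 500.0                     # transmissivity (m2/day)
R = 2.0e-5                    # recharge (m/day, roughly 7 mm/yr)
h_left, h_right = 20.0, 10.0  # fixed boundary heads (m)

# Discretized steady-state flow: T * (h[i-1] - 2*h[i] + h[i+1]) / dx**2 = -R
A = np.zeros((n, n))
b = np.full(n, -R * dx ** 2 / T)
for i in range(n):
    A[i, i] = -2.0
    if i > 0:
        A[i, i - 1] = 1.0
    if i < n - 1:
        A[i, i + 1] = 1.0
b[0] -= h_left      # fold the fixed-head boundaries into the right-hand side
b[-1] -= h_right
h = np.linalg.solve(A, b)
print("maximum head", round(h.max(), 2), "m at cell", int(np.argmax(h)))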
Objective analysis of observational data from the FGGE observing systems
NASA Technical Reports Server (NTRS)
Baker, W.; Edelmann, D.; Iredell, M.; Han, D.; Jakkempudi, S.
1981-01-01
An objective analysis procedure for updating the GLAS second- and fourth-order general atmospheric circulation models using observational data from the First GARP Global Experiment (FGGE) is described. The objective analysis procedure is based on a successive corrections method, and the model is updated in a data assimilation cycle. Preparation of the observational data for analysis and the objective analysis scheme are described. The organization of the program and descriptions of the required data sets are presented. The program logic and detailed descriptions of each subroutine are given.
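The successive corrections idea can be illustrated with a minimal one-dimensional, Cressman-style sketch; this is not the GLAS scheme itself, and the grid, observations, and influence radii are invented.

```python
# Minimal 1-D successive-corrections (Cressman-style) analysis sketch.
import numpy as np

grid_x = np.linspace(0.0, 10.0, 21)           # analysis grid points
background = np.zeros_like(grid_x)             # first-guess field from the model
obs_x = np.array([2.0, 5.5, 8.0])              # observation locations
obs_val = np.array([1.0, 3.0, -1.5])           # observed values

analysis = background.copy()
for radius in [4.0, 2.0, 1.0]:                 # successively smaller influence radii
    # innovation = observation minus current analysis interpolated to the obs location
    innov = obs_val - np.interp(obs_x, grid_x, analysis)
    for i, xg in enumerate(grid_x):
        r2 = (obs_x - xg) ** 2
        w = np.where(r2 < radius**2, (radius**2 - r2) / (radius**2 + r2), 0.0)  # Cressman weights
        if w.sum() > 0:
            analysis[i] += np.sum(w * innov) / w.sum()

print(np.round(analysis, 2))
```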
Shaping the global landscape in the Anthropocene
NASA Astrophysics Data System (ADS)
Lotze-Campen, H.
2012-12-01
In the emerging era of the Anthropocene (Crutzen and Stoermer 2000) most ecosystems are either directly or indirectly influenced by human activities, and neither socio-economic processes nor environmental changes can be understood without taking their interactions into account. Social transitions towards more sustainable development paths will only be achieved through a co-evolution process of society and nature. Both are parts of one integrated "Earth system", where land and water use are key linking elements. In the industrialised countries the transition task will have to focus on maintaining current standards of living while reducing the demand for ecosystem services. In the developing countries the major challenge will be to raise income levels substantially and find more sustainable development paths that try to minimise the negative side-effects of economic growth. Due to technological changes and a globally integrated economy, human society is now in a position where it has to ask itself: "What kind of landscapes and ecosystems do we really want in the future?" Shaping environmental conditions in the course of economic growth and climate change becomes a social management task. While many environmental and social problems have to be dealt with at the regional and national scale, in some areas, like climate change and international trade, the level of analysis and political action extends to the global scale. The allocation of land and water resources for different human uses has to be consciously managed. The potential and limitations of different options and the trade-offs between land expansion, increased land use intensity and re-allocation between different uses have to be carefully assessed. While agricultural productivity has continuously grown in the past, a slowing pace has to be expected in many regions in the future. Water may pose the most serious limitation to future global food and bioenergy supplies. Rising crop outputs per unit of land and water are essential to feed growing demands. The technological and organisational changes required to increase productivity will only be achieved through continuous investments and appropriate institutional settings and incentives. Strategies for a "sustainable land management" will only emerge from truly integrated methods of analysis. These have to combine theories, models and data from various social sciences (e.g. economics, sociology, psychology) and natural sciences (e.g. ecology, hydrology, biogeochemistry). We provide an integrated assessment approach for modeling global landscape change and related management options, including changes in lifestyles and global consumption patterns. The global biogeochemistry model LPJmL (Bondeau et al. 2007) is linked to the economic land and water use model MAgPIE (Lotze-Campen et al. 2008) and the economy-climate model REMIND-R (Leimbach et al. 2010). We illustrate the trade-offs between different societal goals with regard to land use and landscape diversity. Finally, we provide a research design for multi-scale analysis of landscape change through a combination of regional case studies with our global models of the economy, biosphere, and climate.
Mukhtar, Hussnain; Lin, Yu-Pin; Shipin, Oleg V.; Petway, Joy R.
2017-01-01
This study presents an approach for obtaining realization sets of parameters for nitrogen removal in a pilot-scale waste stabilization pond (WSP) system. The proposed approach was designed for optimal parameterization, local sensitivity analysis, and global uncertainty analysis of a dynamic simulation model for the WSP by using the R software package Flexible Modeling Environment (R-FME) with the Markov chain Monte Carlo (MCMC) method. Additionally, generalized likelihood uncertainty estimation (GLUE) was integrated into the FME to evaluate the major parameters that affect the simulation outputs in the study WSP. Comprehensive modeling analysis was used to simulate and assess nine parameters and concentrations of ON-N, NH3-N and NO3-N. Results indicate that the integrated FME-GLUE-based model, with good Nash–Sutcliffe coefficients (0.53–0.69) and correlation coefficients (0.76–0.83), successfully simulates the concentrations of ON-N, NH3-N and NO3-N. Moreover, the Arrhenius constant was the only parameter to which model performance for the ON-N and NH3-N simulations was sensitive. However, the Nitrosomonas growth rate, the denitrification constant, and the maximum growth rate at 20 °C were sensitive parameters for the ON-N and NO3-N simulations, as measured by the global sensitivity analysis. PMID:28704958
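The study itself uses the R package FME; the sketch below only illustrates the GLUE idea (sampling parameters and keeping "behavioural" sets above a Nash-Sutcliffe threshold) with a toy first-order decay model in Python, not the WSP nitrogen model.

```python
# GLUE-style uncertainty sketch with a toy model standing in for the WSP simulator.
import numpy as np

rng = np.random.default_rng(0)
obs = np.array([5.2, 4.8, 4.1, 3.9, 3.5])              # placeholder "observed" concentration series

def toy_model(k):
    t = np.arange(len(obs))
    return 5.2 * np.exp(-k * t)                         # first-order decay stand-in

def nash_sutcliffe(sim, obs):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

samples = rng.uniform(0.01, 0.5, size=5000)             # prior samples of the rate parameter
scores = np.array([nash_sutcliffe(toy_model(k), obs) for k in samples])
behavioural = samples[scores > 0.5]                      # keep parameter sets above an NS threshold
print(f"{behavioural.size} behavioural sets; 5-95% range: "
      f"{np.percentile(behavioural, 5):.3f}-{np.percentile(behavioural, 95):.3f}")
```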
The Solsticial Pause on Mars. Part 1: A Planetary Wave Reanalysis
NASA Technical Reports Server (NTRS)
Lewis, Stephen R.; Mulholland, David P.; Read, Peter L.; Montabone, Luca; Wilson, R. John; Smith, Michael D.
2015-01-01
Large-scale planetary waves are diagnosed from an analysis of profiles retrieved from the Thermal Emission Spectrometer aboard the Mars Global Surveyor spacecraft during its scientific mapping phase. The analysis is conducted by assimilating thermal profiles and total dust opacity retrievals into a Mars global circulation model. Transient waves are largest throughout the northern hemisphere autumn, winter and spring period and almost absent during the summer. The southern hemisphere exhibits generally weaker transient wave behavior. A striking feature of the low-altitude transient waves in the analysis is that they show a broad subsidiary minimum in amplitude centred on the winter solstice, a period when the thermal contrast between the summer hemisphere and the winter pole is strongest and baroclinic wave activity might be expected to be strong. This behavior, here called the 'solsticial pause,' is present in every year of the analysis. This strong pause is under-represented in many independent model experiments, which tend to produce relatively uniform baroclinic wave activity throughout the winter. This paper documents and diagnoses the transient wave solsticial pause found in the analysis; a companion paper investigates the origin of the phenomenon in a series of model experiments.
NASA Astrophysics Data System (ADS)
Emery, C. M.; Biancamaria, S.; Boone, A. A.; Ricci, S. M.; Garambois, P. A.; Decharme, B.; Rochoux, M. C.
2015-12-01
Land Surface Models (LSM) coupled with River Routing schemes (RRM) are used in Global Climate Models (GCM) to simulate the continental part of the water cycle. They are key components of GCMs as they provide boundary conditions to atmospheric and oceanic models. However, at the global scale, errors arise mainly from simplified physics, atmospheric forcing, and input parameters. More particularly, those used in RRMs, such as river width, depth and friction coefficients, are difficult to calibrate and are mostly derived from geomorphologic relationships, which may not always be realistic. In situ measurements are then used to calibrate these relationships and validate the model, but global in situ data are very sparse. Additionally, due to the lack of a global river geomorphology database and of accurate forcing, models are run at coarse resolution. This is typically the case of the ISBA-TRIP model used in this study. A complementary alternative to in situ data is satellite observation. In this regard, the Surface Water and Ocean Topography (SWOT) satellite mission, jointly developed by NASA/CNES/CSA/UKSA and scheduled for launch around 2020, should be very valuable for calibrating RRM parameters. It will provide maps of water surface elevation for rivers wider than 100 meters over continental surfaces between 78°S and 78°N, as well as direct observations of river geomorphological parameters such as width and slope. Yet, before assimilating such data, it is necessary to analyze the temporal sensitivity of RRMs to time-constant parameters. This study presents such an analysis over large river basins for the TRIP RRM. Model output uncertainty, represented by unconditional variance, is decomposed into ordered contributions from each parameter. A time-dependent analysis then identifies the parameters to which modeled water levels and discharge are most sensitive along a hydrological year. The results show that local parameters directly impact water levels, while discharge is more affected by parameters from the whole upstream drainage area. Understanding model output variance behavior will have a direct impact on the design and performance of the ensemble-based data assimilation platform, for which uncertainties are also modeled by variances. It will help to select more objectively the RRM parameters to correct.
Uncertainty quantification and global sensitivity analysis of the Los Alamos sea ice model
NASA Astrophysics Data System (ADS)
Urrego-Blanco, Jorge R.; Urban, Nathan M.; Hunke, Elizabeth C.; Turner, Adrian K.; Jeffery, Nicole
2016-04-01
Changes in the high-latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with midlatitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. We present a quantitative way to assess uncertainty in complex computer models, which is a new approach in the analysis of sea ice models. We characterize parametric uncertainty in the Los Alamos sea ice model (CICE) in a standalone configuration and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in 39 individual model parameters. Unlike common sensitivity analyses conducted in previous studies where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol' sequences are used to efficiently sample the full 39-dimensional parameter space. We implement a fast emulator of the sea ice model whose predictions of sea ice extent, area, and volume are used to compute the Sobol' sensitivity indices of the 39 parameters. Main effects and interactions among the most influential parameters are also estimated by a nonparametric regression technique based on generalized additive models. A ranking based on the sensitivity indices indicates that model predictions are most sensitive to snow parameters such as snow conductivity and grain size, and the drainage of melt ponds. It is recommended that research be prioritized toward more accurately determining these most influential parameter values by observational studies or by improving parameterizations in the sea ice model.
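A compact version of this variance-based workflow can be sketched with the SALib package (assumed here); the three "parameters" and the toy response stand in for the CICE emulator and are not actual CICE inputs.

```python
# Sobol' variance-based sensitivity sketch using SALib (assumed available).
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["snow_conductivity", "snow_grain_size", "pond_drainage"],  # illustrative names
    "bounds": [[0.1, 0.5], [50.0, 500.0], [0.0, 1.0]],
}

X = saltelli.sample(problem, 1024)                 # Sobol'-sequence based design
# Toy stand-in for the fast emulator of sea ice volume evaluated at each sample.
Y = 10.0 - 8.0 * X[:, 0] + 0.002 * X[:, 1] + 2.0 * X[:, 0] * X[:, 2]

Si = sobol.analyze(problem, Y)
for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name:18s}  first-order={s1:5.2f}  total={st:5.2f}")
```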
Modeling nitrous oxide emission from rivers: a global assessment.
Hu, Minpeng; Chen, Dingjiang; Dahlgren, Randy A
2016-11-01
Estimates of global riverine nitrous oxide (N2O) emissions contain great uncertainty. We conducted a meta-analysis incorporating 169 observations from published literature to estimate global riverine N2O emission rates and emission factors. Riverine N2O flux was significantly correlated with NH4, NO3 and DIN (NH4 + NO3) concentrations, loads and yields. The emission factors EF(a) (i.e., the ratio of N2O emission rate and DIN load) and EF(b) (i.e., the ratio of N2O and DIN concentrations) values were comparable and showed negative correlations with nitrogen concentration, load and yield and water discharge, but positive correlations with the dissolved organic carbon : DIN ratio. After individually evaluating 82 potential regression models based on EF(a) or EF(b) for global, temperate zone and subtropical zone datasets, a power function of DIN yield multiplied by watershed area was determined to provide the best fit between modeled and observed riverine N2O emission rates (EF(a): R2 = 0.92 for both global and climatic zone models, n = 70; EF(b): R2 = 0.91 for global model and R2 = 0.90 for climatic zone models, n = 70). Using recent estimates of DIN loads for 6400 rivers, models estimated global riverine N2O emission rates of 29.6-35.3 (mean = 32.2) Gg N2O-N yr-1 and emission factors of 0.16-0.19% (mean = 0.17%). Global riverine N2O emission rates are forecasted to increase by 35%, 25%, 18% and 3% in 2050 compared to the 2000s under the Millennium Ecosystem Assessment's Global Orchestration, Order from Strength, Technogarden, and Adapting Mosaic scenarios, respectively. Previous studies may overestimate global riverine N2O emission rates (300-2100 Gg N2O-N yr-1) because they ignore declining emission factor values with increasing nitrogen levels and channel size, as well as neglect differences in emission factors corresponding to different nitrogen forms. Riverine N2O emission estimates will be further enhanced through refining emission factor estimates, extending measurements longitudinally along entire river networks and improving estimates of global riverine nitrogen loads. © 2016 John Wiley & Sons Ltd.
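Fitting a power function of the kind described, with R2 computed against observations, might look like the following sketch; the DIN and N2O values are synthetic.

```python
# Sketch: fit a power-law relation between (DIN yield x watershed area) and
# riverine N2O emission with scipy's curve_fit. All data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

din_yield_area = np.array([1e3, 5e3, 2e4, 8e4, 3e5])    # DIN yield * watershed area (placeholder units)
n2o_emission = np.array([0.8, 3.1, 9.5, 28.0, 80.0])     # observed N2O emission rate (placeholder)

def power_law(x, a, b):
    return a * x**b

(a_hat, b_hat), _ = curve_fit(power_law, din_yield_area, n2o_emission, p0=[0.01, 0.8])
pred = power_law(din_yield_area, a_hat, b_hat)
r2 = 1 - np.sum((pred - n2o_emission) ** 2) / np.sum((n2o_emission - n2o_emission.mean()) ** 2)
print(f"a = {a_hat:.3g}, b = {b_hat:.2f}, R^2 = {r2:.2f}")
```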
A White Paper on Global Wheat Health Based on Scenario Development and Analysis.
Savary, S; Djurle, A; Yuen, J; Ficke, A; Rossi, V; Esker, P D; Fernandes, J M C; Del Ponte, E M; Kumar, J; Madden, L V; Paul, P; McRoberts, N; Singh, P K; Huber, L; Pope de Vallavielle, C; Saint-Jean, S; Willocquet, L
2017-10-01
Scenario analysis constitutes a useful approach to synthesize knowledge and derive hypotheses in the case of complex systems that are documented with mainly qualitative or very diverse information. In this article, a framework for scenario analysis is designed and then, applied to global wheat health within a timeframe from today to 2050. Scenario analysis entails the choice of settings, the definition of scenarios of change, and the analysis of outcomes of these scenarios in the chosen settings. Three idealized agrosystems, representing a large fraction of the global diversity of wheat-based agrosystems, are considered, which represent the settings of the analysis. Several components of global changes are considered in their consequences on global wheat health: climate change and climate variability, nitrogen fertilizer use, tillage, crop rotation, pesticide use, and the deployment of host plant resistances. Each idealized agrosystem is associated with a scenario of change that considers first, a production situation and its dynamics, and second, the impacts of the evolving production situation on the evolution of crop health. Crop health is represented by six functional groups of wheat pathogens: the pathogens associated with Fusarium head blight; biotrophic fungi, Septoria-like fungi, necrotrophic fungi, soilborne pathogens, and insect-transmitted viruses. The analysis of scenario outcomes is conducted along a risk-analytical pattern, which involves risk probabilities represented by categorized probability levels of disease epidemics, and risk magnitudes represented by categorized levels of crop losses resulting from these levels of epidemics within each production situation. The results from this scenario analysis suggest an overall increase of risk probabilities and magnitudes in the three idealized agrosystems. Changes in risk probability or magnitude however vary with the agrosystem and the functional groups of pathogens. We discuss the effects of global changes on the six functional groups, in terms of their epidemiology and of the crop losses they cause. Scenario analysis enables qualitative analysis of complex systems, such as plant pathosystems that are evolving in response to global changes, including climate change and technology shifts. It also provides a useful framework for quantitative simulation modeling analysis for plant disease epidemiology.
NASA Astrophysics Data System (ADS)
Hill, M. C.; Jakeman, J.; Razavi, S.; Tolson, B.
2015-12-01
For many environmental systems, model runtimes have remained very long as more capable computers have been used to add more processes and more time and space discretization. Scientists have also added more parameters and kinds of observations, and many model runs are needed to explore the models. Computational demand equals run time multiplied by number of model runs divided by parallelization opportunities. Model exploration is conducted using sensitivity analysis, optimization, and uncertainty quantification. Sensitivity analysis is used to reveal consequences of what may be very complex simulated relations, optimization is used to identify parameter values that fit the data best, or at least better, and uncertainty quantification is used to evaluate the precision of simulated results. The long execution times make such analyses a challenge. Methods for addressing these challenges include computationally frugal analysis of the demanding original model and a number of ingenious surrogate modeling methods. Both commonly use about 50-100 runs of the demanding original model. In this talk we consider the tradeoffs between (1) original model development decisions, (2) computationally frugal analysis of the original model, and (3) using many model runs of the fast surrogate model. Some questions of interest are as follows. If the added processes and discretization invested in (1) are compared with the restrictions and approximations in model analysis produced by long model execution times, is there a net benefit related to the goals of the model? Are there changes to the numerical methods that could reduce the computational demands while giving up less fidelity than is compromised by using computationally frugal methods or surrogate models for model analysis? Both the computationally frugal methods and surrogate models require that the solution of interest be a smooth function of the parameters of interest. How does the information obtained from the local methods typical of (2) compare with that from the globally averaged methods typical of (3) for typical systems? The discussion will use examples of the response of the Greenland glacier to global warming and surface and groundwater modeling.
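The computational-demand relation stated above can be made concrete with a tiny worked example; the run time, run count, and parallelism values are arbitrary.

```python
# Worked example of the relation stated in the abstract:
# computational demand = run time x number of model runs / parallelization opportunities.
run_time_hours = 6.0      # one run of the demanding original model (illustrative)
n_runs = 100              # e.g. a frugal-analysis or surrogate-training budget
n_parallel = 20           # model runs that can execute concurrently

demand_hours = run_time_hours * n_runs / n_parallel
print(f"wall-clock demand: {demand_hours:.0f} hours")   # 30 hours in this example
```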
Lapaige, Véronique
2009-01-01
The current phase of globalization represents a “double-edged sword” challenge facing public health practitioners and health policy makers. The first “edge” throws light on two constructs in the field of public health: global health (formerly international health) and globalized public health. The second “edge” is that of global governance, and raises the question, “how can we construct public health regulations that adequately respond to both global and local complexities related to the two constructs mentioned earlier (global health and globalized public health)?” The two constructs call for the development of norms that will assure sustained population-wide health improvement and these two constructs have their own conceptual tools and theoretical models that permit a better understanding of them. In this paper, we introduce the “globalized public health” construct and we present an interactive comprehensive framework for critically analyzing contemporary globalization’s influences on the field of public health. “Globalized public health”, simultaneously a theoretical model and a conceptual framework, concerns the transformation of the field of public health in the sociohistorical context of globalization. The model is the fruit of an original theoretical research study conducted from 2005 to 2008 (“contextualized research,” Gibbons’ Mode II of knowledge production), founded on a QUAL-quant sequential mixed-method design. This research also reflects our political and ideological position, fuelled with aspirations of social democracy and cosmopolitical values. It is profoundly anchored in the pragmatic approach to globalization, looking to “reconcile” the market and equity. The model offers several features to users: (1) it is transdisciplinary; (2) it is interactive (CD-ROM); (3) it is nonlinear (nonlinear interrelations between the contextual globalization and the field of public health); (4) it is synchronic/diachronic (a double-crossed perspective permits analysis of global social change, the emergence of global agency and the transmutation of the field of public health, in the full complexity of their nonlinear interaction); (5) it offers five characteristics as an auto-eco-organized system of social interactions, or dynamic, nonlinear sociohistorical system. The model features a visual interface (five interrelated figures), a structure of 30 “integrator concepts” that integrates 114 other element-parts via 1,300 hypertext links. The model is both a knowledge translation tool and an interactive heuristic guide designed for practitioners and researchers in public health/community health/population health, as well as for decision-makers at all levels. PMID:22312210
Gravity model development for precise orbit computations for satellite altimetry
NASA Technical Reports Server (NTRS)
Marsh, James G.; Lerch, Francis J.; Smith, David E.; Klosko, Steven M.; Pavlis, Erricos
1986-01-01
Two preliminary gravity models developed as a first step in reaching the TOPEX/Poseidon modeling goals are discussed. They were obtained by NASA-Goddard from an analysis of exclusively satellite tracking observations. With the new Preliminary Gravity Solution-T2 model, an improved global estimate of the field is achieved with an improved description of the geoid.
NASA Technical Reports Server (NTRS)
Ruane, Alex C.; McDermid, Sonali; Rosenzweig, Cynthia; Baigorria, Guillermo A.; Jones, James W.; Romero, Consuelo C.; Cecil, L. DeWayne
2014-01-01
Climate change is projected to push the limits of cropping systems and has the potential to disrupt the agricultural sector from local to global scales. This article introduces the Coordinated Climate-Crop Modeling Project (C3MP), an initiative of the Agricultural Model Intercomparison and Improvement Project (AgMIP) to engage a global network of crop modelers to explore the impacts of climate change via an investigation of crop responses to changes in carbon dioxide concentration ([CO2]), temperature, and water. As a demonstration of the C3MP protocols and enabled analyses, we apply the Decision Support System for Agrotechnology Transfer (DSSAT) CROPGRO-Peanut crop model for Henry County, Alabama, to evaluate responses to the range of plausible [CO2], temperature changes, and precipitation changes projected by climate models out to the end of the 21st century. These sensitivity tests are used to derive crop model emulators that estimate changes in mean yield and the coefficient of variation for seasonal yields across a broad range of climate conditions, reproducing mean yields from sensitivity test simulations with deviations of ca. 2% for rain-fed conditions. We apply these statistical emulators to investigate how peanuts respond to projections from various global climate models, time periods, and emissions scenarios, finding a robust projection of modest (<10%) median yield losses in the middle of the 21st century accelerating to more severe (>20%) losses and larger uncertainty at the end of the century under the more severe representative concentration pathway (RCP8.5). This projection is not substantially altered by the selection of the AgMERRA global gridded climate dataset rather than the local historical observations, differences between the Third and Fifth Coupled Model Intercomparison Project (CMIP3 and CMIP5), or the use of the delta method of climate impacts analysis rather than the C3MP impacts response surface and emulator approach.
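A minimal response-surface emulator of the kind described (yield change as a low-order polynomial in [CO2], temperature change, and precipitation change) can be sketched as below; the sensitivity-test results are invented, not DSSAT output.

```python
# Sketch of a response-surface emulator fitted to invented sensitivity-test runs.
import numpy as np

# columns: CO2 (ppm), dT (deg C), dP (fraction); rows: sensitivity-test simulations
X = np.array([[360, 0.0, 0.00], [450, 1.0, 0.05], [540, 2.0, -0.10], [630, 3.0, 0.00],
              [720, 4.0, -0.20], [500, 2.0, 0.10], [680, 3.5, -0.15], [400, 0.5, -0.05]])
dyield = np.array([0.0, 2.0, -4.0, -8.0, -16.0, -2.0, -12.0, -1.0])   # % yield change (invented)

def design(X):
    co2, dt, dp = X.T
    return np.column_stack([np.ones(len(X)), co2, dt, dp, dt**2, co2 * dt])

coef, *_ = np.linalg.lstsq(design(X), dyield, rcond=None)
scenario = np.array([[600.0, 2.5, -0.05]])          # a projected mid-century climate (placeholder)
print(f"emulated yield change: {(design(scenario) @ coef)[0]:.1f} %")
```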
NASA Astrophysics Data System (ADS)
Werner, Micha; Blyth, Eleanor; Schellekens, Jaap
2016-04-01
Global hydrological and land-surface models are becoming increasingly available, and as the resolution of these improves, as well as how hydrological processes are represented, so does their potential. These offer consistent datasets at the global scale, which can be used to establish water balances and derive policy-relevant indicators in medium to large basins, including those that are poorly gauged. However, differences in model structure, model parameterisation, and model forcing may result in quite different indicator values being derived, depending on the model used. In this paper we explore indicators developed using four land surface models (LSM) and five global hydrological models (GHM). Results from these models have been made available through the Earth2Observe project, a recent research initiative funded by the European Union 7th Research Framework. All models have a resolution of 0.5 arc degrees, and are forced using the same WATCH-ERA-Interim (WFDEI) meteorological re-analysis data at a daily time step for the 32-year period from 1979 to 2012. We explore three water resources indicators: an aridity index; a simplified water exploitation index; and an indicator that calculates the frequency of occurrence of root zone stress. We compare indicators derived over selected areas/basins in Europe, Colombia, Southern Africa, the Indian Subcontinent and Australia/New Zealand. The hydrological fluxes calculated show quite significant differences between the nine models, despite the common forcing dataset, with these differences reflected in the indicators subsequently derived. The results show that the variability between models is related to the different climate types, with that variability quite logically depending largely on the availability of water. Patterns are also found in the type of models that dominate different parts of the distribution of the indicator values, with LSMs providing lower values and GHMs providing higher values in some climates, and vice versa in others. How important this variability is in supporting a policy decision depends largely on how decision thresholds are set. For example, in the case of the aridity index, with areas being denoted as arid with an index of 0.6 or above, we show that the variability is primarily of interest in transitional climates, such as the Mediterranean. The analysis shows that while both LSMs and GHMs provide useful data, indices derived to support water resources management planning may differ substantially, depending on the model used. The analysis also identifies in which climates improvements to the models are particularly relevant to support the confidence with which decisions can be taken based on derived indicators.
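Two of the indicators can be sketched from basin-average values as below; note that the paper's exact aridity-index formulation and its 0.6 threshold may follow a different convention than the simple P/PET ratio used here, and all numbers are placeholders rather than Earth2Observe output.

```python
# Sketch of two water resources indicators from placeholder basin-average values.
import numpy as np

precip = np.array([850.0, 420.0, 300.0])        # mean annual precipitation, mm
pet = np.array([700.0, 900.0, 1400.0])          # mean annual potential evapotranspiration, mm
withdrawals = np.array([120.0, 180.0, 150.0])   # water withdrawals, mm equivalent
availability = precip - 0.6 * pet                # crude runoff proxy, placeholder only

aridity = precip / pet                           # P/PET convention; the paper's definition may differ
wei = withdrawals / np.clip(availability, 1.0, None)   # simplified water exploitation index
for i, (ai, w) in enumerate(zip(aridity, wei)):
    print(f"basin {i}: P/PET = {ai:.2f}, water exploitation index = {w:.2f}")
```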
NASA Astrophysics Data System (ADS)
Felfelani, Farshid; Wada, Yoshihide; Longuevergne, Laurent; Pokhrel, Yadu N.
2017-10-01
Hydrological models and the data derived from the Gravity Recovery and Climate Experiment (GRACE) satellite mission have been widely used to study the variations in terrestrial water storage (TWS) over large regions. However, both GRACE products and model results suffer from inherent uncertainties, calling for a combined use of GRACE and models to examine the variations in total TWS and their individual components, especially in relation to natural and human-induced changes in the terrestrial water cycle. In this study, we use the results from two state-of-the-art hydrological models and different GRACE spherical harmonic products to examine the variations in TWS and its individual components, and to attribute the changes to natural and human-induced factors over large global river basins. Analysis of the spatial patterns of the long-term trend in TWS from the two models and GRACE suggests that both models capture the GRACE-measured direction of change, but differ from GRACE as well as each other in terms of the magnitude over different regions. A detailed analysis of the seasonal cycle of TWS variations over 30 river basins shows notable differences not only between models and GRACE but also among different GRACE products and between the two models. Further, it is found that while one model performs well in highly managed river basins, it fails to reproduce the GRACE-observed signal in snow-dominated regions, and vice versa. The isolation of natural and human-induced changes in TWS in some of the managed basins reveals a consistently declining TWS trend during 2002-2010; however, significant differences are again obvious both between GRACE and models and among different GRACE products and models. Results from the decomposition of the TWS signal into the general trend and seasonality indicate that neither model adequately captures the trend and seasonality in the managed or snow-dominated basins, implying that the TWS variations from a single model cannot be reliably used for all global regions. It is also found that the uncertainties arising from climate forcing datasets can introduce significant additional uncertainties, making direct comparison of model results and GRACE products even more difficult. Our results highlight the need to further improve the representation of human land-water management and snow processes in large-scale models to enable a reliable use of models and GRACE to study the changes in freshwater systems in all global regions.
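The decomposition of a TWS series into a linear trend plus annual and semi-annual harmonics, as commonly applied to GRACE and model output, can be sketched by least squares; the monthly series below is synthetic.

```python
# Sketch: decompose a monthly TWS series into trend + annual + semi-annual harmonics.
import numpy as np

t = np.arange(108) / 12.0                                   # 9 years of monthly data (years)
rng = np.random.default_rng(1)
tws = -1.5 * t + 8.0 * np.sin(2 * np.pi * t) + rng.normal(0, 1.0, t.size)  # synthetic cm water equivalent

A = np.column_stack([
    np.ones_like(t), t,
    np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),           # annual cycle
    np.sin(4 * np.pi * t), np.cos(4 * np.pi * t),           # semi-annual cycle
])
coef, *_ = np.linalg.lstsq(A, tws, rcond=None)
print(f"trend = {coef[1]:.2f} cm/yr; annual amplitude = {np.hypot(coef[2], coef[3]):.2f} cm")
```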
Global Hopf bifurcation analysis on a BAM neural network with delays
NASA Astrophysics Data System (ADS)
Sun, Chengjun; Han, Maoan; Pang, Xiaoming
2007-01-01
A delayed differential equation that models a bidirectional associative memory (BAM) neural network with four neurons is considered. By using a global Hopf bifurcation theorem for FDEs and Bendixson's criterion for high-dimensional ODEs, a group of sufficient conditions for the system to have multiple periodic solutions is obtained when the sum of delays is sufficiently large.
ERIC Educational Resources Information Center
Greenberg, Daniel J.
2008-01-01
For educators, the anti-globalization movement has created a literature of opposition which offers enhanced opportunities for teaching critical analysis of neo-liberal political economy. The movement also aids those who wish to teach how First World wealth and privilege is functionally related to Third World poverty and underdevelopment. The…
Benefit Sharing in a Global Context: Working Towards Solutions for Implementation.
Hurst, Daniel J
2017-08-01
Due to the state of globalized clinical research, questions have been raised as to what, if any, benefits those who contribute to research should receive. One model for compensating research participants is "benefit sharing," and the basic premise is that, as a matter of justice, those who contribute to scientific research should share in its benefits. While incorporated into several international documents for over two decades, benefit sharing has only been sparsely implemented. This analysis begins by addressing the concept of benefit sharing, its historical development, and how it has been applied in the context of virus sharing for influenza research. The second portion of this analysis presents recommendations for ensuring benefit sharing. These recommendations are threefold: 1) an emphasis on social pressure, 2) the revision of international documents as a means to ensure benefit sharing, and 3) greater collaboration between the sponsor IRB and the host country IRB. Because clinical research is a globalized industry, a global model is proposed in this second portion, one that focuses on collaboration between the sponsor and host countries. This collaboration is vital in order to ensure that proper forms of benefit sharing are accomplished as a matter of justice. © 2016 John Wiley & Sons Ltd.
Validation of reactive gases and aerosols in the MACC global analysis and forecast system
NASA Astrophysics Data System (ADS)
Eskes, H.; Huijnen, V.; Arola, A.; Benedictow, A.; Blechschmidt, A.-M.; Botek, E.; Boucher, O.; Bouarar, I.; Chabrillat, S.; Cuevas, E.; Engelen, R.; Flentje, H.; Gaudel, A.; Griesfeller, J.; Jones, L.; Kapsomenakis, J.; Katragkou, E.; Kinne, S.; Langerock, B.; Razinger, M.; Richter, A.; Schultz, M.; Schulz, M.; Sudarchikova, N.; Thouret, V.; Vrekoussis, M.; Wagner, A.; Zerefos, C.
2015-02-01
The European MACC (Monitoring Atmospheric Composition and Climate) project is preparing the operational Copernicus Atmosphere Monitoring Service (CAMS), one of the services of the European Copernicus Programme on Earth observation and environmental services. MACC uses data assimilation to combine in-situ and remote sensing observations with global and regional models of atmospheric reactive gases, aerosols and greenhouse gases, and is based on the Integrated Forecast System of the ECMWF. The global component of the MACC service has a dedicated validation activity to document the quality of the atmospheric composition products. In this paper we discuss the approach to validation that has been developed over the past three years. Topics discussed are the validation requirements, the operational aspects, the measurement data sets used, the structure of the validation reports, the models and assimilation systems validated, the procedure to introduce new upgrades, and the scoring methods. One specific target of the MACC system concerns forecasting special events with high pollution concentrations. Such events receive extra attention in the validation process. Finally, a summary is provided of the results from the validation of the latest set of daily global analysis and forecast products from the MACC system reported in November 2014.
Liu, Gang; Müller, Daniel B
2013-10-15
Material cycles have become increasingly coupled and interconnected in a globalizing era. While material flow analysis (MFA) has been widely used to characterize stocks and flows along technological life cycle within a specific geographical area, trade networks among individual cycles have remained largely unexplored. Here we developed a trade-linked multilevel MFA model to map the contemporary global journey of anthropogenic aluminum. We demonstrate that the anthropogenic aluminum cycle depends substantially on international trade of aluminum in all forms and becomes highly interconnected in nature. While the Southern hemisphere is the main primary resource supplier, aluminum production and consumption concentrate in the Northern hemisphere, where we also find the largest potential for recycling. The more developed countries tend to have a substantial and increasing presence throughout the stages after bauxite refining and possess highly consumption-based cycles, thus maintaining advantages both economically and environmentally. A small group of countries plays a key role in the global redistribution of aluminum and in the connectivity of the network, which may render some countries vulnerable to supply disruption. The model provides potential insights to inform government and industry policies in resource criticality, supply chain security, value chain management, and cross-boundary environmental impacts mitigation.
Optimizing human activity patterns using global sensitivity analysis.
Fairchild, Geoffrey; Hickmann, Kyle S; Mniszewski, Susan M; Del Valle, Sara Y; Hyman, James M
2014-12-01
Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule's regularity for a population. We show how to tune an activity's regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. We use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations.
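The regularity statistic being tuned, sample entropy, can be computed with a short self-contained function; DASim and the harmony search optimizer are not reproduced here, and the two test series are toys.

```python
# Self-contained sample entropy (SampEn) sketch: low values indicate regular series.
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    x = np.asarray(x, float)
    r *= x.std()
    n = len(x) - m                           # same number of templates for lengths m and m+1
    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(n)])
        count = 0
        for i in range(n):
            dist = np.max(np.abs(templates - templates[i]), axis=1)   # Chebyshev distance
            count += np.sum(dist <= r) - 1                            # exclude the self-match
        return count
    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

regular = np.tile([1.0, 2.0, 3.0, 4.0], 50)              # highly regular "schedule"
noisy = np.random.default_rng(2).normal(size=200)         # irregular one
print(f"SampEn regular = {sample_entropy(regular):.2f}, noisy = {sample_entropy(noisy):.2f}")
```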
Optimizing human activity patterns using global sensitivity analysis
Hickmann, Kyle S.; Mniszewski, Susan M.; Del Valle, Sara Y.; Hyman, James M.
2014-01-01
Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule’s regularity for a population. We show how to tune an activity’s regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. We use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations. PMID:25580080
Optimizing human activity patterns using global sensitivity analysis
Fairchild, Geoffrey; Hickmann, Kyle S.; Mniszewski, Susan M.; ...
2013-12-10
Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule’s regularity for a population. We show how to tune an activity’s regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. Here we use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Finally, though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations.
Inference of Global Mean Temperature Trend and Climate Change from MSU and AMSU
NASA Technical Reports Server (NTRS)
Prabhakara, Cuddapah; Iacovazzi, R. A., Jr.; Yoo, J.-M.; Lau, William K. M. (Technical Monitor)
2001-01-01
Microwave Sounding Unit (MSU) and Advanced MSU (AMSU) radiometers flown on the NOAA operational satellite series are potentially valuable as global temperature monitoring devices. Spencer and Christy pioneered the analysis of mid-tropospheric temperature, given by MSU Channel 2 (Ch 2) at 53.74 GHz, to derive the global temperature trend. Also, in addition to monitoring global temperature, these microwave radiometers have the potential to reveal interannual climate signals in the tropics. We have analyzed the data of MSU Ch 2 and AMSU Ch 5 (53.6 GHz) from the NOAA operational satellites for the period 1980 to 2000, utilizing the NOAA calibration procedure. The data are corrected for the satellite orbital drift based on the temporal changes of the on-board warm blackbody temperature. From our analysis, we find that the global temperature increased at a rate of 0.13 +/- 0.05 K decade^-1 during 1980 to 2000. From an Empirical Orthogonal Function (EOF) analysis of the MSU global data, we find that the mid-tropospheric temperature in middle and high latitudes responds to the ENSO forcing during the Northern Hemisphere winter in a distinct manner. This mid-latitude response is opposite in phase to that in the tropics. This result is in accord with simulations performed with an ECMWF global spectral model. This study shows a potential use of the satellite observations for monitoring climatic change.
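The EOF analysis mentioned above amounts to a principal component decomposition of the anomaly field, which can be sketched via SVD; the data matrix here is random noise standing in for the MSU Ch 2 record.

```python
# EOF (principal component) analysis sketch on a placeholder month-by-gridpoint matrix.
import numpy as np

rng = np.random.default_rng(3)
data = rng.normal(size=(240, 1000))                 # 240 months x 1000 grid points (placeholder)
anom = data - data.mean(axis=0)                     # remove the time mean at each grid point

U, s, Vt = np.linalg.svd(anom, full_matrices=False)
explained = s**2 / np.sum(s**2)
pc1 = U[:, 0] * s[0]                                # principal component time series of EOF 1
eof1 = Vt[0]                                        # spatial pattern of EOF 1
print(f"EOF1 explains {100 * explained[0]:.1f}% of the variance")
```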
Modeling the Heterogeneous Effects of GHG Mitigation Policies on Global Agriculture and Forestry
NASA Astrophysics Data System (ADS)
Golub, A.; Henderson, B.; Hertel, T. W.; Rose, S. K.; Sohngen, B.
2010-12-01
Agriculture and forestry are envisioned as potentially key sectors for climate change mitigation policy, yet the depth of analysis of mitigation options and their economic consequences remains remarkably shallow in comparison to that for industrial mitigation. Farming and land use change - much of it induced by agriculture - account for one-third of global greenhouse gas (GHG) emissions. Any serious attempt to curtail these emissions will involve changes in the way farming is conducted, as well as placing limits on agricultural expansion into areas currently under more carbon-intensive land cover. However, agriculture and forestry are extremely heterogeneous, both in the technology and intensity of production, as well as in the GHG emissions intensity of these activities. And these differences, in turn, give rise to significant changes in the distribution of agricultural production, trade and consumption in the wake of mitigation policies. This paper assesses such distributional impacts via a global economic analysis undertaken with a modified version of the GTAP model. The paper builds on a global general equilibrium GTAP-AEZ-GHG model (Golub et al., 2009). This is a unified modeling framework that links the agricultural, forestry, food processing and other sectors through land and other factor markets and international trade, and incorporates different land types, land uses and related CO2 and non-CO2 GHG emissions and sequestration. The economic data underlying this work is the global GTAP database aggregated up to 19 regions and 29 sectors. The model incorporates mitigation cost curves for different regions and sectors based on information from the US-EPA. The forestry component of the model is calibrated to the results of the state-of-the-art partial equilibrium global forestry model of Sohngen and Mendelson (2007). Forest carbon sequestration at both the extensive and intensive margins is modeled separately to better isolate land competition between agriculture and timber products. We analyze regional changes in land use, output, competitiveness, and food consumption under climate change mitigation policy regimes which differ by participation/exclusion of agricultural sectors and non-Annex I countries, as well as policy instruments. While responsible for only a third of global GHG emissions, under the global carbon tax the land-using sectors could contribute half of all economically efficient mitigation in the near term, at modest carbon prices. The imposition of a carbon tax in agriculture, however, has adverse effects on food consumption, especially in developing countries. These effects are much smaller if an agricultural producer subsidy is introduced to compensate for the carbon tax that producers pay. The global forest carbon sequestration subsidy effectively controls emission leakage when the carbon tax is imposed only in Annex I regions, since the sequestration subsidy bids land away from agriculture in non-Annex I regions. Though the sequestration subsidy yields a GHG abatement benefit, the policy may adversely affect food security and agricultural development in developing countries.
Changes in Concurrent Risk of Warm and Dry Years under Impact of Climate Change
NASA Astrophysics Data System (ADS)
Sarhadi, A.; Wiper, M.; Touma, D. E.; Ausín, M. C.; Diffenbaugh, N. S.
2017-12-01
Anthropogenic global warming has changed the nature and the risk of extreme climate phenomena. The changing concurrence of multiple climatic extremes (warm and dry years) may result in intensification of undesirable consequences for water resources, human and ecosystem health, and environmental equity. The present study assesses how global warming influences the probability that warm and dry years co-occur at the global scale. In the first step of the study, a designed multivariate Mann-Kendall trend analysis is used to detect the areas in which the concurrence of warm and dry years has increased, both in historical climate records and in climate models, at the global scale. The next step investigates the concurrent risk of the extremes under dynamic nonstationary conditions, using a fully generalized multivariate risk framework designed to evolve through time. In this methodology, Bayesian dynamic copulas are developed to model the time-varying dependence structure between the two different climate extremes (warm and dry years). The results reveal an increasing trend in the concurrence risk of warm and dry years, which is in agreement with the multivariate trend analysis of historical records and climate models. In addition to providing a novel quantification of the changing probability of compound extreme events, the results of this study can help decision makers develop short- and long-term strategies to prepare for climate stresses now and in the future.
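A univariate Mann-Kendall trend test, the building block behind the designed multivariate version used in the study, can be sketched as follows; the yearly counts are invented and the Bayesian dynamic copulas are not reproduced.

```python
# Univariate Mann-Kendall trend test sketch (variance without a tie correction,
# which is adequate for this illustration). Data are synthetic counts.
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    x = np.asarray(x, float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return z, 2 * (1 - norm.cdf(abs(z)))

series = np.array([1, 0, 2, 1, 3, 2, 4, 3, 5, 4, 6, 5])   # concurrent warm-dry years per period (toy)
z, p = mann_kendall(series)
print(f"Mann-Kendall z = {z:.2f}, two-sided p = {p:.3f}")
```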
NASA Astrophysics Data System (ADS)
Nunes, A.; Ivanov, V. Y.
2014-12-01
Although current global reanalyses provide reasonably accurate large-scale features of the atmosphere, systematic errors are still found in the hydrological and energy budgets of such products. In the tropics, precipitation is particularly challenging to model, a difficulty compounded by the scarcity of hydrometeorological datasets in the region. With the goal of producing downscaled analyses that are appropriate for a climate assessment at regional scales, a regional spectral model was run using a combination of precipitation assimilation and scale-selective bias correction. The latter is similar to the spectral nudging technique, which prevents the departure of the regional model's internal states from the large-scale forcing. The target area in this study is the Amazon region, where large errors are detected in reanalysis precipitation. To generate the downscaled analysis, the regional climate model used the NCEP/DOE R2 global reanalysis as the initial and lateral boundary conditions, and assimilated NOAA's Climate Prediction Center (CPC) MORPHed precipitation (CMORPH), available at 0.25-degree resolution, every 3 hours. The regional model's precipitation was successfully brought closer to the observations, in comparison to the NCEP global reanalysis products, as a result of the impact of the precipitation assimilation scheme on the cumulus-convection parameterization and of improved boundary forcing achieved through a new version of the scale-selective bias correction. Water and energy budget terms were also evaluated against global reanalyses and other datasets.
NASA Astrophysics Data System (ADS)
Adam, L.; Döll, P.; Prigent, C.; Papa, F.
2010-08-01
Floodplains play an important role in the terrestrial water cycle and are very important for biodiversity. Therefore, an improved representation of the dynamics of floodplain water flows and storage in global hydrological and land surface models is required. To support model validation, we combined monthly time series of satellite-derived inundation areas (Papa et al., 2010) with data on irrigated rice areas (Portmann et al., 2010). In this way, we obtained global-scale time series of naturally inundated areas (NIA), with monthly values of inundation extent during 1993-2004 and a spatial resolution of 0.5°. For most grid cells (0.5°×0.5°), the mean annual maximum of NIA agrees well with the static open water extent of the Global Lakes and Wetlands database (GLWD) (Lehner and Döll, 2004), but in 16% of the cells NIA is larger than GLWD. In some regions, like Northwestern Europe, NIA clearly overestimates inundated areas, probably because of confounding very wet soils with inundated areas. In other areas, such as South Asia, it is likely that NIA can help to enhance GLWD. NIA data will be very useful for developing and validating a floodplain modeling algorithm for the global hydrological model WGHM. For example, we found that monthly NIAs correlate with observed river discharges.
Bradley, Beverly D; Jung, Tiffany; Tandon-Verma, Ananya; Khoury, Bassem; Chan, Timothy C Y; Cheng, Yu-Ling
2017-04-18
Operations research (OR) is a discipline that uses advanced analytical methods (e.g. simulation, optimisation, decision analysis) to better understand complex systems and aid in decision-making. Herein, we present a scoping review of the use of OR to analyse issues in global health, with an emphasis on health equity and research impact. A systematic search of five databases was designed to identify relevant published literature. A global overview of 1099 studies highlights the geographic distribution of OR and common OR methods used. From this collection of literature, a narrative description of the use of OR across four main application areas of global health - health systems and operations, clinical medicine, public health and health innovation - is also presented. The theme of health equity is then explored in detail through a subset of 44 studies. Health equity is a critical element of global health that cuts across all four application areas, and is an issue particularly amenable to analysis through OR. Finally, we present seven select cases of OR analyses that have been implemented or have influenced decision-making in global health policy or practice. Based on these cases, we identify three key drivers for success in bridging the gap between OR and global health policy, namely international collaboration with stakeholders, use of contextually appropriate data, and varied communication outlets for research findings. Such cases, however, represent a very small proportion of the literature found. Poor availability of representative and quality data, and a lack of collaboration between those who develop OR models and stakeholders in the contexts where OR analyses are intended to serve, were found to be common challenges for effective OR modelling in global health.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tselioudis, George
2016-03-04
From its location on the subtropics-midlatitude boundary, the Azores is influenced by both the subtropical high pressure and the midlatitude baroclinic storm regimes, and therefore experiences a wide range of cloud structures, from fair-weather scenes to stratocumulus sheets to deep convective systems. This project combined three types of data sets to study cloud variability in the Azores: a satellite analysis of cloud regimes, a reanalysis characterization of storminess, and a 19-month field campaign that occurred on Graciosa Island. Combined analysis of the three data sets provides a detailed picture of cloud variability and the respective dynamic influences, with emphasis on low clouds that constitute a major uncertainty source in climate model simulations. The satellite cloud regime analysis shows that the Azores cloud distribution is similar to the mean global distribution and can therefore be used to evaluate cloud simulation in global models. Regime analysis of low clouds shows that stratocumulus decks occur under the influence of the Azores high-pressure system, while shallow cumulus clouds are sustained by cold-air outbreaks, as revealed by their preference for post-frontal environments and northwesterly flows. An evaluation of CMIP5 climate model cloud regimes over the Azores shows that all models severely underpredict shallow cumulus clouds, while most models also underpredict the occurrence of stratocumulus cloud decks. It is demonstrated that carefully selected case studies can be related through regime analysis to climatological cloud distributions, and a methodology is suggested utilizing process-resolving model simulations of individual cases to better understand cloud-dynamics interactions and attempt to explain and correct climate model cloud deficiencies.
NASA Astrophysics Data System (ADS)
James, Rachel; Washington, Richard; Jones, Richard
2015-04-01
There is a demand from adaptation planners for regional climate change projections, particularly the finer resolution data delivered by regional models. However, climate models are subject to important uncertainties, and their projections diverge substantially, particularly for precipitation. So how should decision makers know which futures to consider and which to disregard? Model evaluation is clearly a priority. The majority of studies seeking to assess the validity of projections are based on comparison of the models' twentieth century climatologies with observations or reanalysis. Whilst this work is very important, examination of the modelled mean state is not sufficient to assess the credibility of modelled changes. Direct investigation of the mechanisms for change is also vital. In this study, a framework for process-based analysis of projections is presented, whereby circulation changes accompanying future responses are examined, and then compared to atmospheric dynamics during historical years in models and reanalyses. This framework has previously been applied to investigate a drying signal in West Africa, and will here be used to examine projected precipitation change in southern Africa. An ensemble of five global and regional model experiments will be employed, consisting of five perturbed versions of HadCM3 and five corresponding runs of HadRM3P (PRECIS), run over the CORDEX Africa domain. The global and regional model runs show contrasting future responses: there is a strong drying in the global models over southern Africa during the rainy season, but the regional models show drying over Madagascar and the south west Indian Ocean. Circulation changes associated with these projections will be presented as a first step towards understanding the mechanisms for change and the reasons for the differences between the global and regional models. The interannual variability will also be examined and compared to reanalysis to explore how well the models represent the dipole between southern Africa and Madagascar in the twentieth century simulations. This analysis could shed light on the credibility of the projected changes, and the relative trustworthiness of the global and regional models. This research makes a valuable contribution to the understanding of mechanisms for change in southern Africa. It also has wider relevance for regional climate model studies, in highlighting the need to evaluate models on a case-by-case basis, and providing a framework for assessment which could be applied to other models and other regions.
A hierarchical structure for automatic meshing and adaptive FEM analysis
NASA Technical Reports Server (NTRS)
Kela, Ajay; Saxena, Mukul; Perucchio, Renato
1987-01-01
A new algorithm for generating automatically, from solid models of mechanical parts, finite element meshes that are organized as spatially addressable quaternary trees (for 2-D work) or octal trees (for 3-D work) is discussed. Because such meshes are inherently hierarchical as well as spatially addressable, they permit efficient substructuring techniques to be used for both global analysis and incremental remeshing and reanalysis. The global and incremental techniques are summarized and some results from an experimental closed loop 2-D system in which meshing, analysis, error evaluation, and remeshing and reanalysis are done automatically and adaptively are presented. The implementation of 3-D work is briefly discussed.
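As a rough illustration of the hierarchical, spatially addressable meshes described above, the sketch below implements a toy quadtree in Python: cells flagged by an error indicator are split recursively, which is the property that makes local, incremental remeshing cheap. The class and refinement criterion are illustrative assumptions, not the paper's data structure.

```python
# Minimal quadtree sketch: spatially addressable, hierarchical 2-D cells that can
# be refined locally (names and refinement criterion are illustrative).
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class QuadCell:
    x: float; y: float; size: float                    # lower-left corner and edge length
    children: Optional[List["QuadCell"]] = None

    def subdivide(self) -> None:
        h = self.size / 2.0
        self.children = [QuadCell(self.x + dx * h, self.y + dy * h, h)
                         for dx in (0, 1) for dy in (0, 1)]

def refine(cell: QuadCell, needs_refinement: Callable[[QuadCell], bool], max_depth: int) -> None:
    """Recursively split cells flagged by an error indicator (incremental remeshing)."""
    if max_depth == 0 or not needs_refinement(cell):
        return
    cell.subdivide()
    for child in cell.children:
        refine(child, needs_refinement, max_depth - 1)

# Example: refine around a point of interest, mimicking a local error estimate.
root = QuadCell(0.0, 0.0, 1.0)
refine(root, lambda c: (c.x - 0.3) ** 2 + (c.y - 0.7) ** 2 < c.size ** 2, max_depth=5)
```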
Multi-region statistical shape model for cochlear implantation
NASA Astrophysics Data System (ADS)
Romera, Jordi; Kjer, H. Martin; Piella, Gemma; Ceresa, Mario; González Ballester, Miguel A.
2016-03-01
Statistical shape models are commonly used to analyze the variability between similar anatomical structures and their use is established as a tool for analysis and segmentation of medical images. However, using a global model to capture the variability of complex structures is not enough to achieve the best results. The complexity of a proper global model increases even more when the amount of data available is limited to a small number of datasets. Typically, the anatomical variability between structures is associated to the variability of their physiological regions. In this paper, a complete pipeline is proposed for building a multi-region statistical shape model to study the entire variability from locally identified physiological regions of the inner ear. The proposed model, which is based on an extension of the Point Distribution Model (PDM), is built for a training set of 17 high-resolution images (24.5 μm voxels) of the inner ear. The model is evaluated according to its generalization ability and specificity. The results are compared with the ones of a global model built directly using the standard PDM approach. The evaluation results suggest that better accuracy can be achieved using a regional modeling of the inner ear.
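A Point Distribution Model of the kind extended here is essentially PCA on stacked, pre-aligned landmark coordinates. The sketch below uses synthetic data and invented dimensions (not the inner-ear training set) to show how the mean shape and modes of variation would be extracted and how new shapes are synthesized from mode weights.

```python
# Sketch of a Point Distribution Model: PCA on flattened landmark coordinates gives
# the mean shape and modes of variation. (Illustrative only; a real pipeline would
# include Procrustes alignment and the regional decomposition described above.)
import numpy as np

def build_pdm(shapes: np.ndarray, n_modes: int = 5):
    """shapes: (n_samples, n_points, 3) pre-aligned landmark coordinates."""
    X = shapes.reshape(shapes.shape[0], -1)           # flatten to (n_samples, 3*n_points)
    mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    eigvals = (s ** 2) / (X.shape[0] - 1)             # variance explained by each mode
    return mean, Vt[:n_modes], eigvals[:n_modes]

def synthesize(mean, modes, b):
    """Generate a new shape from mode weights b (shape parameters)."""
    return mean + b @ modes

rng = np.random.default_rng(0)
shapes = rng.normal(size=(17, 400, 3))                # e.g. 17 training samples, 400 landmarks
mean, modes, eigvals = build_pdm(shapes, n_modes=3)
new_shape = synthesize(mean, modes, np.array([1.0, -0.5, 0.2]) * np.sqrt(eigvals))
```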
Unitary subsector of generalized minimal models
NASA Astrophysics Data System (ADS)
Behan, Connor
2018-05-01
We revisit the line of nonunitary theories that interpolate between the Virasoro minimal models. Numerical bootstrap applications have brought about interest in the four-point function involving the scalar primary of lowest dimension. Using recent progress in harmonic analysis on the conformal group, we prove the conjecture that global conformal blocks in this correlator appear with positive coefficients. We also compute many such coefficients in the simplest mixed correlator system. Finally, we comment on the status of using global conformal blocks to isolate the truly unitary points on this line.
NASA Technical Reports Server (NTRS)
Bosilovich, Michael G.; Yang, Runhua; Houser, Paul R.
1998-01-01
Land surface hydrology for the Off-line Land-surface GEOS Analysis (OLGA) system and Goddard Earth Observing System (GEOS-1) Data Assimilation System (DAS) has been examined using a river routing model. The GEOS-1 DAS land-surface parameterization is very simple, using an energy balance prediction of surface temperature and prescribed soil water. OLGA uses near-surface atmospheric data from the GEOS-1 DAS to drive a more comprehensive parameterization of the land-surface physics. The two global systems are evaluated using a global river routing model. The river routing model uses climatologic surface runoff from each system to simulate the river discharge from global river basins, which can be compared to climatologic river discharge. Due to the soil hydrology, the OLGA system shows a general improvement in the simulation of river discharge compared to the GEOS-1 DAS. Snowmelt processes included in OLGA also have a positive effect on the annual cycle of river discharge and source runoff. Preliminary tests of a coupled land-atmosphere model indicate improvements to the hydrologic cycle compared to the uncoupled system. The river routing model has provided a useful tool in the evaluation of the GCM hydrologic cycle, and has helped quantify the influence of the more advanced land surface model.
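A minimal sketch of what a river routing model does with gridded runoff: accumulate basin runoff into storage and release it as discharge that can be compared with climatological river discharge. The single linear reservoir below is a toy stand-in for the global routing scheme, with illustrative parameter values.

```python
# Toy linear-reservoir routing: convert a basin-aggregated runoff series into
# simulated discharge (parameter values and units are illustrative).
import numpy as np

def route_runoff(runoff: np.ndarray, k_days: float = 15.0, dt_days: float = 1.0) -> np.ndarray:
    """Single linear reservoir: dS/dt = R - Q, with Q = S / k."""
    storage, discharge = 0.0, np.zeros_like(runoff)
    for t, r in enumerate(runoff):
        storage += (r - storage / k_days) * dt_days
        discharge[t] = storage / k_days
    return discharge

# Annual cycle of basin runoff (arbitrary units) with a snowmelt pulse near day 140.
runoff = 100.0 + 80.0 * np.exp(-0.5 * ((np.arange(365) - 140) / 20.0) ** 2)
q = route_runoff(runoff)   # routed discharge lags and smooths the runoff pulse
```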
Topology driven modeling: the IS metaphor.
Merelli, Emanuela; Pettini, Marco; Rasetti, Mario
In order to define a new method for analyzing the immune system within the realm of Big Data, we draw on the metaphor provided by an extension of Parisi's model, based on a mean field approach. The novelty is the multilinearity of the couplings in the configurational variables. This peculiarity allows us to compare the partition function Z with a particular functor of topological field theory-the generating function of the Betti numbers of the state manifold of the system-which contains the same global information of the system configurations and of the data set representing them. The comparison between the Betti numbers of the model and the real Betti numbers obtained from the topological analysis of phenomenological data is expected to discover hidden n-ary relations among idiotypes and anti-idiotypes. The data topological analysis will select global features, reducible neither to a mere subgraph nor to a metric or vector space. How the immune system reacts, how it evolves, how it responds to stimuli is the result of an interaction that took place among many entities constrained in specific configurations which are relational. Within this metaphor, the proposed method turns out to be a global topological application of the S[B] paradigm for modeling complex systems.
Pseudo Phase Plane and Fractional Calculus modeling of western global economic downturn
NASA Astrophysics Data System (ADS)
Tenreiro Machado, J. A.; Mata, Maria Eugénia
2015-05-01
This paper applies Pseudo Phase Plane (PPP) and Fractional Calculus (FC) mathematical tools for modeling world economies. A challenging global rivalry among the largest international economies began in the early 1970s, when the post-war prosperity declined, and it continues today. While worrying threats of ambitious military aggression, invasion, or hegemony may currently exist, countries' relative PPP positions can tell something about the present global peaceful equilibrium. A decline of the USA's global hegemony in favor of Asian partners is possible, but may still not be accomplished in the coming decades. If the 1973 oil shock represented the beginning of a long-run recession, the PPP analysis of the last four decades (1972-2012) does not point to global dominance by other partners (Russia, Brazil, Japan, and Germany) in reaching high degrees of similarity with the most developed world countries. The synergies of the proposed mathematical tools lead to a better understanding of the dynamics underlying world economies and point towards the estimation of future states based on the memory of each time series.
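The PPP construction itself is simple: a series is plotted against a delayed copy of itself so that trajectories of different economies can be compared geometrically. A minimal sketch, assuming a synthetic GDP-like index and an arbitrary delay:

```python
# Minimal Pseudo Phase Plane sketch: plot x(t) against x(t - tau); the resulting
# trajectory geometry is what the paper compares across economies (delay is illustrative).
import numpy as np

def pseudo_phase_plane(x: np.ndarray, tau: int):
    """Return the two coordinates of the PPP reconstruction."""
    return x[tau:], x[:-tau]

years = np.arange(1972, 2013)
gdp_index = 100.0 * np.exp(0.02 * (years - years[0])) * (1 + 0.05 * np.sin(0.8 * (years - years[0])))
x1, x2 = pseudo_phase_plane(gdp_index, tau=3)   # points (x1[i], x2[i]) trace the PPP curve
```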
Building confidence and credibility amid growing model and computing complexity
NASA Astrophysics Data System (ADS)
Evans, K. J.; Mahajan, S.; Veneziani, C.; Kennedy, J. H.
2017-12-01
As global Earth system models are developed to answer an ever-wider range of science questions, software products that provide robust verification, validation, and evaluation must evolve in tandem. Measuring the degree to which these new models capture past behavior, predict the future, and provide the certainty of predictions is becoming ever more challenging for reasons that are generally well known, yet are still challenging to address. Two specific and divergent needs for analysis of the Accelerated Climate Model for Energy (ACME) model - but with a similar software philosophy - are presented to show how a model developer-based focus can address analysis needs during expansive model changes to provide greater fidelity and execute on multi-petascale computing facilities. A-PRIME is a python script-based quick-look overview of a fully-coupled global model configuration to determine quickly if it captures specific behavior before significant computer time and expense is invested. EVE is an ensemble-based software framework that focuses on verification of performance-based ACME model development, such as compiler or machine settings, to determine the equivalence of relevant climate statistics. The challenges and solutions for analysis of multi-petabyte output data are highlighted from the aspect of the scientist using the software, with the aim of fostering discussion and further input from the community about improving developer confidence and community credibility.
Using global sensitivity analysis of demographic models for ecological impact assessment.
Aiello-Lammens, Matthew E; Akçakaya, H Resit
2017-02-01
Population viability analysis (PVA) is widely used to assess population-level impacts of environmental changes on species. When combined with sensitivity analysis, PVA yields insights into the effects of parameter and model structure uncertainty. This helps researchers prioritize efforts for further data collection so that model improvements are efficient and helps managers prioritize conservation and management actions. Usually, sensitivity is analyzed by varying one input parameter at a time and observing the influence that variation has over model outcomes. This approach does not account for interactions among parameters. Global sensitivity analysis (GSA) overcomes this limitation by varying several model inputs simultaneously. Then, regression techniques allow measuring the importance of input-parameter uncertainties. In many conservation applications, the goal of demographic modeling is to assess how different scenarios of impact or management cause changes in a population. This is challenging because the uncertainty of input-parameter values can be confounded with the effect of impacts and management actions. We developed a GSA method that separates model outcome uncertainty resulting from parameter uncertainty from that resulting from projected ecological impacts or simulated management actions, effectively separating the 2 main questions that sensitivity analysis asks. We applied this method to assess the effects of predicted sea-level rise on Snowy Plover (Charadrius nivosus). A relatively small number of replicate models (approximately 100) resulted in consistent measures of variable importance when not trying to separate the effects of ecological impacts from parameter uncertainty. However, many more replicate models (approximately 500) were required to separate these effects. These differences are important to consider when using demographic models to estimate ecological impacts of management actions. © 2016 Society for Conservation Biology.
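A hedged sketch of the regression-based global sensitivity idea: sample parameters jointly across many replicate runs, include the impact scenario as an explicit predictor, and read standardized regression coefficients as importance measures. The population model below is a toy stand-in, not the authors' PVA of the Snowy Plover.

```python
# Toy GSA with replicate simulations: jointly sampled parameters plus a scenario
# flag, with standardized regression coefficients separating parameter uncertainty
# from the simulated impact (all names and the model are illustrative).
import numpy as np

rng = np.random.default_rng(1)
n = 500
survival = rng.uniform(0.6, 0.9, n)           # uncertain demographic parameters
fecundity = rng.uniform(1.0, 3.0, n)
sea_level_rise = rng.integers(0, 2, n)        # 0 = baseline, 1 = impact scenario

def toy_population_model(s, f, impact):
    habitat = 1.0 - 0.3 * impact              # scenario reduces habitat
    return 100 * (s * f * habitat) ** 10      # final abundance proxy

y = toy_population_model(survival, fecundity, sea_level_rise)

# Standardized regression coefficients as importance measures.
X = np.column_stack([survival, fecundity, sea_level_rise])
Xs = (X - X.mean(0)) / X.std(0)
ys = (y - y.mean()) / y.std()
coef, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
print(dict(zip(["survival", "fecundity", "scenario"], np.round(coef, 2))))
```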
NASA Astrophysics Data System (ADS)
Jacquin, A. P.; Shamseldin, A. Y.
2009-04-01
This study analyses the sensitivity of the parameters of Takagi-Sugeno-Kang rainfall-runoff fuzzy models previously developed by the authors. These models can be classified in two types, where the first type is intended to account for the effect of changes in catchment wetness and the second type incorporates seasonality as a source of non-linearity in the rainfall-runoff relationship. The sensitivity analysis is performed using two global sensitivity analysis methods, namely Regional Sensitivity Analysis (RSA) and Sobol's Variance Decomposition (SVD). In general, the RSA method has the disadvantage of not being able to detect sensitivities arising from parameter interactions. By contrast, the SVD method is suitable for analysing models where the model response surface is expected to be affected by interactions at a local scale and/or local optima, such as the case of the rainfall-runoff fuzzy models analysed in this study. The data of six catchments from different geographical locations and sizes are used in the sensitivity analysis. The sensitivity of the model parameters is analysed in terms of two measures of goodness of fit, assessing the model performance from different points of view. These measures are the Nash-Sutcliffe criterion and the index of volumetric fit. The results of the study show that the sensitivity of the model parameters depends on both the type of non-linear effects (i.e. changes in catchment wetness or seasonality) that dominates the catchment's rainfall-runoff relationship and the measure used to assess the model performance. Acknowledgements: This research was supported by FONDECYT, Research Grant 11070130. We would also like to express our gratitude to Prof. Kieran M. O'Connor from the National University of Ireland, Galway, for providing the data used in this study.
Uncertainty quantification and global sensitivity analysis of the Los Alamos sea ice model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Urrego-Blanco, Jorge Rolando; Urban, Nathan Mark; Hunke, Elizabeth Clare
Changes in the high-latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with midlatitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. We present a quantitative way to assess uncertainty in complex computer models, which is a new approach in the analysis of sea ice models. We characterize parametric uncertainty in the Los Alamos sea ice model (CICE) in a standalone configuration and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in 39 individual model parameters. Unlike common sensitivity analyses conducted in previous studies where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol' sequences are used to efficiently sample the full 39-dimensional parameter space. We implement a fast emulator of the sea ice model whose predictions of sea ice extent, area, and volume are used to compute the Sobol' sensitivity indices of the 39 parameters. Main effects and interactions among the most influential parameters are also estimated by a nonparametric regression technique based on generalized additive models. A ranking based on the sensitivity indices indicates that model predictions are most sensitive to snow parameters such as snow conductivity and grain size, and the drainage of melt ponds. Lastly, it is recommended that research be prioritized toward more accurately determining these most influential parameter values by observational studies or by improving parameterizations in the sea ice model.
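The Sobol'-sequence sampling and variance-based indices described above can be illustrated with the SALib package (assuming it is available) on a cheap stand-in function; the parameter names and bounds below are placeholders, not the 39 CICE parameters.

```python
# Illustrative Sobol' sensitivity workflow using SALib on a stand-in function
# (not the CICE emulator; names and bounds are placeholders).
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["snow_conductivity", "snow_grain_size", "pond_drainage"],
    "bounds": [[0.1, 0.5], [50.0, 500.0], [0.0, 1.0]],
}

X = saltelli.sample(problem, 1024)                   # quasi-random Sobol' sequence design
# Stand-in for the fast emulator of sea ice volume:
Y = 10.0 - 8.0 * X[:, 0] + 0.005 * X[:, 1] + 2.0 * X[:, 0] * X[:, 2]

Si = sobol.analyze(problem, Y)
print(Si["S1"])    # first-order (main effect) indices
print(Si["ST"])    # total-effect indices, including interactions
```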
The Generation Challenge Programme Platform: Semantic Standards and Workbench for Crop Science
Bruskiewich, Richard; Senger, Martin; Davenport, Guy; Ruiz, Manuel; Rouard, Mathieu; Hazekamp, Tom; Takeya, Masaru; Doi, Koji; Satoh, Kouji; Costa, Marcos; Simon, Reinhard; Balaji, Jayashree; Akintunde, Akinnola; Mauleon, Ramil; Wanchana, Samart; Shah, Trushar; Anacleto, Mylah; Portugal, Arllet; Ulat, Victor Jun; Thongjuea, Supat; Braak, Kyle; Ritter, Sebastian; Dereeper, Alexis; Skofic, Milko; Rojas, Edwin; Martins, Natalia; Pappas, Georgios; Alamban, Ryan; Almodiel, Roque; Barboza, Lord Hendrix; Detras, Jeffrey; Manansala, Kevin; Mendoza, Michael Jonathan; Morales, Jeffrey; Peralta, Barry; Valerio, Rowena; Zhang, Yi; Gregorio, Sergio; Hermocilla, Joseph; Echavez, Michael; Yap, Jan Michael; Farmer, Andrew; Schiltz, Gary; Lee, Jennifer; Casstevens, Terry; Jaiswal, Pankaj; Meintjes, Ayton; Wilkinson, Mark; Good, Benjamin; Wagner, James; Morris, Jane; Marshall, David; Collins, Anthony; Kikuchi, Shoshi; Metz, Thomas; McLaren, Graham; van Hintum, Theo
2008-01-01
The Generation Challenge programme (GCP) is a global crop research consortium directed toward crop improvement through the application of comparative biology and genetic resources characterization to plant breeding. A key consortium research activity is the development of a GCP crop bioinformatics platform to support GCP research. This platform includes the following: (i) shared, public platform-independent domain models, ontology, and data formats to enable interoperability of data and analysis flows within the platform; (ii) web service and registry technologies to identify, share, and integrate information across diverse, globally dispersed data sources, as well as to access high-performance computational (HPC) facilities for computationally intensive, high-throughput analyses of project data; (iii) platform-specific middleware reference implementations of the domain model integrating a suite of public (largely open-access/-source) databases and software tools into a workbench to facilitate biodiversity analysis, comparative analysis of crop genomic data, and plant breeding decision making. PMID:18483570
A new framework for comprehensive, robust, and efficient global sensitivity analysis: 2. Application
NASA Astrophysics Data System (ADS)
Razavi, Saman; Gupta, Hoshin V.
2016-01-01
Based on the theoretical framework for sensitivity analysis called "Variogram Analysis of Response Surfaces" (VARS), developed in the companion paper, we develop and implement a practical "star-based" sampling strategy (called STAR-VARS), for the application of VARS to real-world problems. We also develop a bootstrap approach to provide confidence level estimates for the VARS sensitivity metrics and to evaluate the reliability of inferred factor rankings. The effectiveness, efficiency, and robustness of STAR-VARS are demonstrated via two real-data hydrological case studies (a 5-parameter conceptual rainfall-runoff model and a 45-parameter land surface scheme hydrology model), and a comparison with the "derivative-based" Morris and "variance-based" Sobol approaches is provided. Our results show that STAR-VARS provides reliable and stable assessments of "global" sensitivity across the full range of scales in the factor space, while being 1-2 orders of magnitude more efficient than the Morris or Sobol approaches.
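The Morris method used above as a benchmark for STAR-VARS can be sketched with SALib on a toy 5-parameter function; this is the baseline screening method mentioned in the abstract, not STAR-VARS itself, and all names and bounds are placeholders.

```python
# Hedged sketch of Morris elementary-effects screening with SALib on a toy function
# (a baseline the paper compares against, not the STAR-VARS algorithm).
import numpy as np
from SALib.sample import morris as morris_sample
from SALib.analyze import morris as morris_analyze

problem = {
    "num_vars": 5,
    "names": ["k1", "k2", "k3", "k4", "k5"],
    "bounds": [[0.0, 1.0]] * 5,
}

X = morris_sample.sample(problem, N=100, num_levels=4)
Y = X[:, 0] ** 2 + 0.5 * X[:, 1] + 0.1 * X[:, 2] * X[:, 3]   # stand-in model response

res = morris_analyze.analyze(problem, X, Y, num_levels=4)
print(res["mu_star"])    # mean absolute elementary effect: overall influence
print(res["sigma"])      # spread: nonlinearity / interaction indicator
```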
The Existence and Stability Analysis of the Equilibria in Dengue Disease Infection Model
NASA Astrophysics Data System (ADS)
Anggriani, N.; Supriatna, A. K.; Soewono, E.
2015-06-01
In this paper we formulate an SIR (Susceptible - Infective - Recovered) model of Dengue fever transmission with constant recruitment. We found a threshold parameter K0, known as the Basic Reproduction Number (BRN). This model has two equilibria, a disease-free equilibrium and an endemic equilibrium. By constructing a suitable Lyapunov function, we show that the disease-free equilibrium is globally asymptotically stable whenever the BRN is less than one, and that the endemic equilibrium is globally asymptotically stable when it is greater than one. Numerical results show the dynamics of each compartment, together with the effect of multiple bio-agent interventions as a control of dengue transmission.
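A minimal sketch, not the authors' exact system, of an SIR model with constant recruitment integrated numerically to illustrate the threshold behaviour proved above with Lyapunov functions: trajectories approach the disease-free state when the basic reproduction number is below one and an endemic state when it exceeds one. All rates are illustrative.

```python
# Toy SIR model with constant recruitment, showing the reproduction-number threshold
# (illustrative rates; vector dynamics and bio-agent controls are omitted).
import numpy as np
from scipy.integrate import solve_ivp

mu, gamma, Lambda = 1 / 70.0, 52.0, 1 / 70.0      # demography and recovery rates (per year)

def sir(t, y, beta):
    S, I, R = y
    dS = Lambda - beta * S * I - mu * S
    dI = beta * S * I - (gamma + mu) * I
    dR = gamma * I - mu * R
    return [dS, dI, dR]

for beta in (30.0, 80.0):
    R0 = beta * (Lambda / mu) / (gamma + mu)       # basic reproduction number
    sol = solve_ivp(sir, (0, 200), [0.999, 0.001, 0.0], args=(beta,), rtol=1e-8)
    print(f"R0 = {R0:.2f}, final infective fraction = {sol.y[1, -1]:.2e}")
```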
Epidemic spreading and global stability of an SIS model with an infective vector on complex networks
NASA Astrophysics Data System (ADS)
Kang, Huiyan; Fu, Xinchu
2015-10-01
In this paper, we present a new SIS model with delay on scale-free networks. The model is suitable to describe some epidemics which are not only transmitted by a vector but also spread between individuals by direct contacts. In view of the biological relevance and real spreading process, we introduce a delay to denote the average incubation period of the disease in a vector. By mathematical analysis, we obtain the epidemic threshold and prove the global stability of equilibria. The simulation shows the delay will affect the epidemic spreading. Finally, we investigate and compare two major immunization strategies, uniform immunization and targeted immunization.
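A toy discrete-time SIS simulation on a scale-free contact network conveys the network part of the setup; it omits the vector compartment and the incubation delay that the paper adds, and all parameters and the simplified infection rule are assumptions for illustration only.

```python
# Toy SIS spreading on a scale-free network (direct contacts only; the infection rule
# "probability beta if any neighbour is infected" is a deliberate simplification).
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
G = nx.barabasi_albert_graph(2000, m=3, seed=0)       # scale-free contact network
beta, gamma = 0.05, 0.2                               # infection / recovery probabilities per step

state = np.zeros(G.number_of_nodes(), dtype=bool)     # False = susceptible, True = infected
state[rng.choice(G.number_of_nodes(), 20, replace=False)] = True

for _ in range(200):
    new_state = state.copy()
    for node in G:
        if state[node]:
            if rng.random() < gamma:
                new_state[node] = False                # recover back to susceptible (SIS)
        elif any(state[nbr] for nbr in G.neighbors(node)) and rng.random() < beta:
            new_state[node] = True
    state = new_state

print("endemic prevalence ≈", state.mean())
```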
Heat balance statistics derived from four-dimensional assimilations with a global circulation model
NASA Technical Reports Server (NTRS)
Schubert, S. D.; Herman, G. F.
1981-01-01
The reported investigation was conducted to develop a reliable procedure for obtaining the diabatic and vertical terms required for atmospheric heat balance studies. The method developed employs a four-dimensional assimilation mode in connection with the general circulation model of NASA's Goddard Laboratory for Atmospheric Sciences. The initial analysis was conducted with data obtained in connection with the 1976 Data Systems Test. On the basis of the results of the investigation, it appears possible to use the model's observationally constrained diagnostics to provide estimates of the global distribution of virtually all of the quantities which are needed to compute the atmosphere's heat and energy balance.
Use of wind data in global modelling
NASA Technical Reports Server (NTRS)
Pailleux, J.
1985-01-01
The European Centre for Medium Range Weather Forecasts (ECMWF) is producing operational global analyses every 6 hours and operational global forecasts every day from the 12Z analysis. How the wind data are used in the ECMWF global analysis is described. For each current wind observing system, its ability to provide initial conditions for the forecast model is discussed as well as its weaknesses. An assessment of the impact of each individual system on the quality of the analysis and the forecast is given whenever possible. Sometimes the deficiencies which are pointed out are related not only to the observing system itself but also to the optimum interpolation (OI) analysis scheme; then some improvements are generally possible through ad hoc modifications of the analysis scheme and especially tunings of the structure functions. Examples are given. The future observing network over the North Atlantic is examined. Several countries, coordinated by WMO, are working to set up an 'Operational WWW System Evaluation' (OWSE), in order to evaluate the operational aspects of the deployment of new systems (ASDAR, ASAP). Most of the new systems are expected to be deployed before January 1987, and in order to make the best use of the available resources during the deployment phase, some network studies are being carried out at the present time using simulated data for the ASDAR and ASAP systems. They are summarized.
Global sensitivity analysis of groundwater transport
NASA Astrophysics Data System (ADS)
Cvetkovic, V.; Soltani, S.; Vigouroux, G.
2015-12-01
In this work we address the model and parametric sensitivity of groundwater transport using the Lagrangian-Stochastic Advection-Reaction (LaSAR) methodology. The 'attenuation index' is used as a relevant and convenient measure of the coupled transport mechanisms. The coefficients of variation (CV) for seven uncertain parameters are assumed to be between 0.25 and 3.5, the highest value being for the lower bound of the mass transfer coefficient k0. In almost all cases, the uncertainties in the macro-dispersion (CV = 0.35) and in the mass transfer rate k0 (CV = 3.5) are most significant. The global sensitivity analysis using Sobol and derivative-based indices yields consistent rankings on the significance of different models and/or parameter ranges. The results presented here are generic; however, the proposed methodology can be easily adapted to specific conditions where uncertainty ranges in models and/or parameters can be estimated from field and/or laboratory measurements.
Trends in MODIS Geolocation Error Analysis
NASA Technical Reports Server (NTRS)
Wolfe, R. E.; Nishihama, Masahiro
2009-01-01
Data from the two MODIS instruments have been accurately geolocated (Earth located) to enable retrieval of global geophysical parameters. The authors describe the approach used to geolocate with sub-pixel accuracy over nine years of data from MODIS on NASA's EOS Terra spacecraft and seven years of data from MODIS on the Aqua spacecraft. The approach uses a geometric model of the MODIS instruments, accurate navigation (orbit and attitude) data and an accurate Earth terrain model to compute the location of each MODIS pixel. The error analysis approach automatically matches MODIS imagery with a global set of over 1,000 ground control points from the finer-resolution Landsat satellite to measure static biases and trends in the MODIS geometric model parameters. Both within-orbit and yearly thermally induced cyclic variations in the pointing have been found, as well as a general long-term trend.
Women's Work Conditions and Marital Adjustment in Two-Earner Couples: A Structural Model.
ERIC Educational Resources Information Center
Sears, Heather A.; Galambos, Nancy L.
1992-01-01
Evaluated structural model of women's work conditions, women's stress, and marital adjustment using path analysis. Findings from 86 2-earner couples with adolescents indicated support for spillover model in which women's work stress and global stress mediated link between their work conditions and their perceptions of marital adjustment.…
NASA Technical Reports Server (NTRS)
Gregg, Watson W.
1999-01-01
A coupled general ocean circulation, biogeochemical, and radiative model was constructed to evaluate and understand the nature of seasonal variability of chlorophyll and nutrients in the global oceans. The model is driven by climatological meteorological conditions, cloud cover, and sea surface temperature. Biogeochemical processes in the model are determined from the influences of circulation and turbulence dynamics, irradiance availability, and the interactions among three functional phytoplankton groups (diatoms, chlorophytes, and picoplankton) and three nutrient groups (nitrate, ammonium, and silicate). Phytoplankton groups are initialized as homogeneous fields horizontally and vertically, and allowed to distribute themselves according to the prevailing conditions. Basin-scale model chlorophyll results are in very good agreement with CZCS pigments in virtually every global region. Seasonal variability observed in the CZCS is also well represented in the model. Synoptic scale (100-1000 km) comparisons of imagery are also in good conformance, although occasional departures are apparent. Agreement of nitrate distributions with in situ data is even better, including seasonal dynamics, except for the equatorial Atlantic. The good agreement of the model with satellite and in situ data sources indicates that the model dynamics realistically simulate phytoplankton and nutrient dynamics on synoptic scales. This is especially true given that initial conditions are homogeneous chlorophyll fields. The success of the model in producing a reasonable representation of chlorophyll and nutrient distributions and seasonal variability in the global oceans is attributed to the application of a generalized, process-driven approach as opposed to regional parameterization, and the existence of multiple phytoplankton groups with different physiological and physical properties. These factors enable the model to simultaneously represent the great diversity of physical, biological, chemical, and radiative environments encountered in the global oceans.
A New Global Regression Analysis Method for the Prediction of Wind Tunnel Model Weight Corrections
NASA Technical Reports Server (NTRS)
Ulbrich, Norbert Manfred; Bridge, Thomas M.; Amaya, Max A.
2014-01-01
A new global regression analysis method is discussed that predicts wind tunnel model weight corrections for strain-gage balance loads during a wind tunnel test. The method determines corrections by combining "wind-on" model attitude measurements with least squares estimates of the model weight and center of gravity coordinates that are obtained from "wind-off" data points. The method treats the least squares fit of the model weight separate from the fit of the center of gravity coordinates. Therefore, it performs two fits of "wind-off" data points and uses the least squares estimator of the model weight as an input for the fit of the center of gravity coordinates. Explicit equations for the least squares estimators of the weight and center of gravity coordinates are derived that simplify the implementation of the method in the data system software of a wind tunnel. In addition, recommendations for sets of "wind-off" data points are made that take typical model support system constraints into account. Explicit equations of the confidence intervals on the model weight and center of gravity coordinates and two different error analyses of the model weight prediction are also discussed in the appendices of the paper.
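A hedged sketch of the two-step "wind-off" fit described above: estimate the model weight from measured force components at several attitudes, then, with that estimate fixed, fit the centre-of-gravity coordinates from the measured moments. The geometry and noise levels below are invented for illustration, not taken from the NASA data system.

```python
# Toy two-step least squares fit: (1) weight from wind-off force components,
# (2) centre-of-gravity offset from moments, given the fitted weight.
import numpy as np

rng = np.random.default_rng(2)
pitch = np.deg2rad(np.array([-10, -5, 0, 5, 10, 15]))    # wind-off attitude sweep
W_true, r_true = 250.0, np.array([0.10, 0.00, 0.02])      # weight [N], CG offset [m]

# Gravity direction in balance axes for each pitch angle (roll = 0 for simplicity).
g_dir = np.column_stack([-np.sin(pitch), np.zeros_like(pitch), -np.cos(pitch)])
F = W_true * g_dir + rng.normal(0, 0.2, g_dir.shape)              # measured forces
M = np.cross(r_true, F) + rng.normal(0, 0.02, g_dir.shape)        # measured moments

# Step 1: least squares estimate of the weight from all force components.
W_hat = np.linalg.lstsq(g_dir.reshape(-1, 1), F.reshape(-1), rcond=None)[0][0]

# Step 2: with W_hat fixed, moments are linear in the CG coordinates:
# M = r x F = -skew(F) @ r, so stack the skew-symmetric matrices and solve for r.
def skew(v):
    return np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])

A = np.vstack([-skew(f) for f in F])
r_hat = np.linalg.lstsq(A, M.reshape(-1), rcond=None)[0]
print(W_hat, r_hat)
```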
NASA Astrophysics Data System (ADS)
Koneshov, V. N.; Nepoklonov, V. B.
2018-05-01
The development of studies on estimating the accuracy of the Earth's modern global gravity models, expressed in terms of spherical harmonics of the geopotential, in the problematic regions of the world is discussed. A comparative analysis of the results of reconstructing quasi-geoid heights and gravity anomalies from the different models is carried out for two polar regions selected within a radius of 1000 km from the North and South poles. The analysis covers nine recently developed models, including six high-resolution models and three lower-order models, among them the Russian GAOP2012 model. It is shown that the modern models determine the quasi-geoid heights and gravity anomalies in the polar regions with errors ranging from 5-10 cm up to a few tens of centimeters and from 3-5 mGal up to a few tens of milligals, respectively, depending on the resolution. The accuracy of the models in the Arctic is several times higher than in the Antarctic. This is associated with the peculiarities of the gravity anomalies in each particular region and with the fact that the polar part of the Antarctic has been comparatively less explored by gravity methods than the polar Arctic.
The global contribution of energy consumption by product exports from China.
Tang, Erzi; Peng, Chong
2017-06-01
This paper presents a model to analyze the mechanism of the global contribution of energy usage by product exports. The theoretical analysis is based on the perspective that contributions should be estimated for relatively small sectors, in which production characteristics such as the productivity distribution of each sector can be considered. We then constructed a method to measure the global contribution of energy usage. A simple way to estimate the global contribution is to multiply total energy consumption by the share of goods export volume in GDP, but this method underestimates the global contribution because it ignores the structure of energy consumption and product exports in China. Using our measurement method and building on the theoretical analysis, we calculated the global contribution of energy consumption by industrial and manufactured product exports at the level of individual industry and manufacturing sectors. The results indicated that approximately 42% of the total energy usage of the whole Chinese economy in 2013 was attributable to foreign regions. When primary product and service exports are also taken into account, the global contribution of energy consumption for China in 2013 was larger than 42% of total energy usage.
Multi-stage Vector-Borne Zoonoses Models: A Global Analysis.
Bichara, Derdei; Iggidr, Abderrahman; Smith, Laura
2018-04-25
A class of models that describes the interactions between multiple host species and an arthropod vector is formulated and its dynamics investigated. A host-vector disease model where the host's infection is structured into n stages is formulated and a complete global dynamics analysis is provided. The basic reproduction number R0 acts as a sharp threshold, that is, the disease-free equilibrium is globally asymptotically stable (GAS) whenever R0 ≤ 1, and a unique interior endemic equilibrium exists and is GAS if R0 > 1. We proceed to extend this model with m host species, capturing a class of zoonoses where the cross-species bridge is an arthropod vector. The basic reproduction number of the multi-host-vector model, R0, is derived and shown to be the sum of the basic reproduction numbers of the model when each host is isolated with an arthropod vector. It is shown that the disease will persist in all hosts as long as it persists in one host. Moreover, the overall basic reproduction number increases with the number of hosts, and bringing the basic reproduction number of each isolated host below unity is not sufficient to eradicate the disease in all hosts. This is a type of "amplification effect," that is, for the considered vector-borne zoonoses, the increase in host diversity increases the basic reproduction number and therefore the disease burden.
A sensitivity analysis for a thermomechanical model of the Antarctic ice sheet and ice shelves
NASA Astrophysics Data System (ADS)
Baratelli, F.; Castellani, G.; Vassena, C.; Giudici, M.
2012-04-01
The outcomes of an ice sheet model depend on a number of parameters and physical quantities which are often estimated with large uncertainty, because of lack of sufficient experimental measurements in such remote environments. Therefore, the efforts to improve the accuracy of the predictions of ice sheet models by including more physical processes and interactions with atmosphere, hydrosphere and lithosphere can be affected by the inaccuracy of the fundamental input data. A sensitivity analysis can help to understand which are the input data that most affect the different predictions of the model. In this context, a finite difference thermomechanical ice sheet model based on the Shallow-Ice Approximation (SIA) and on the Shallow-Shelf Approximation (SSA) has been developed and applied for the simulation of the evolution of the Antarctic ice sheet and ice shelves for the last 200 000 years. The sensitivity analysis of the model outcomes (e.g., the volume of the ice sheet and of the ice shelves, the basal melt rate of the ice sheet, the mean velocity of the Ross and Ronne-Filchner ice shelves, the wet area at the base of the ice sheet) with respect to the model parameters (e.g., the basal sliding coefficient, the geothermal heat flux, the present-day surface accumulation and temperature, the mean ice shelves viscosity, the melt rate at the base of the ice shelves) has been performed by computing three synthetic numerical indices: two local sensitivity indices and a global sensitivity index. Local sensitivity indices imply a linearization of the model and neglect both non-linear and joint effects of the parameters. The global variance-based sensitivity index, instead, takes into account the complete variability of the input parameters but is usually conducted with a Monte Carlo approach which is computationally very demanding for non-linear complex models. Therefore, the global sensitivity index has been computed using a development of the model outputs in a neighborhood of the reference parameter values with a second-order approximation. The comparison of the three sensitivity indices proved that the approximation of the non-linear model with a second-order expansion is sufficient to show some differences between the local and the global indices. As a general result, the sensitivity analysis showed that most of the model outcomes are mainly sensitive to the present-day surface temperature and accumulation, which, in principle, can be measured more easily (e.g., with remote sensing techniques) than the other input parameters considered. On the other hand, the parameters to which the model resulted less sensitive are the basal sliding coefficient and the mean ice shelves viscosity.
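The distinction between the local and global indices can be illustrated on a toy response: local indices are normalized derivatives at the reference parameter values, while a global first-order index measures how much of the output variance is explained by each input over its full range. The binning estimator below is a crude stand-in for illustration, not the paper's second-order-expansion approach, and the "model" is not the ice sheet model.

```python
# Toy contrast of local (derivative-based) and global (variance-based) sensitivity.
import numpy as np

def model(theta):                       # toy nonlinear response
    a, b = theta
    return a ** 2 + 0.5 * a * b + np.sin(b)

theta0, h = np.array([1.0, 0.5]), 1e-5

# Local indices: centred finite-difference derivatives at the reference values.
local = []
for k in range(2):
    dtheta = np.zeros(2); dtheta[k] = h
    local.append((model(theta0 + dtheta) - model(theta0 - dtheta)) / (2 * h))
print("local sensitivities:", np.round(local, 3))

# Global first-order indices: variance of the conditional mean over the full ranges.
rng = np.random.default_rng(7)
A = rng.uniform(0.5, 1.5, 20000)
B = rng.uniform(0.0, 1.0, 20000)
Y = model((A, B))
total_var = Y.var()
for name, xin in [("a", A), ("b", B)]:
    bins = np.digitize(xin, np.quantile(xin, np.linspace(0, 1, 21)[1:-1]))
    cond_means = np.array([Y[bins == k].mean() for k in np.unique(bins)])
    print(f"first-order index for {name}: {cond_means.var() / total_var:.2f}")
```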
Applications of the U.S. Geological Survey's global land cover product
Reed, B.
1997-01-01
The U.S. Geological Survey (USGS), in partnership with several international agencies and universities, has produced a global land cover characteristics database. The land cover data were created using multitemporal analysis of advanced very high resolution radiometer satellite images in conjunction with other existing geographic data. A translation table permits the conversion of the land cover classes into several conventional land cover schemes that are used by ecosystem modelers, climate modelers, land management agencies, and other user groups. The alternative classification schemes include Global Ecosystems, the Biosphere Atmosphere Transfer Scheme, the Simple Biosphere, the USGS Anderson Level 2, and the International Geosphere Biosphere Programme. The distribution system for these data is through the World Wide Web (the web site address is: http://edcwww.cr.usgs.gov/landdaac/glcc/glcc.html) or by magnetic media upon special request. The availability of the data over the World Wide Web, in conjunction with the flexible database structure, allows easy data access to a wide range of users. The web site contains a user registration form that allows analysis of the diverse applications of large-area land cover data. Currently, applications are divided among mapping (20 percent), conservation (30 percent), and modeling (35 percent).
NASA Technical Reports Server (NTRS)
Goossens, Sander Johannes; Ishihara, Yoshiaki; Matsumoto, Koji; Sasaki, Sho
2012-01-01
We present a method with which we determined the local lunar gravity field model over the South Pole-Aitken (SPA) basin on the farside of the Moon by estimating adjustments to a global lunar gravity field model using SELENE tracking data. Our adjustments are expressed in localized functions concentrated over the SPA region in a spherical cap with a radius of 45 deg centered at (191.1 deg E, 53.2 deg S), and the resolution is equivalent to a 150th degree and order spherical harmonics expansion. The new solution over SPA was used in several applications of geophysical analysis. It shows an increased correlation with high-resolution lunar topography in the frequency band l = 40-70, and admittance values are slightly different and more leveled when compared to other, global gravity field models using the same data. The adjustments expressed in free-air anomalies and differences in Bouguer anomalies between the local solution and the a priori global solution correlate with topographic surface features. The Moho structure beneath the SPA basin is slightly modified in our solution, most notably at the southern rim of the Apollo basin and around the Zeeman crater.
A Program for Solving the Brain Ischemia Problem
DeGracia, Donald J.
2013-01-01
Our recently described nonlinear dynamical model of cell injury is here applied to the problems of brain ischemia and neuroprotection. We discuss measurement of global brain ischemia injury dynamics by time course analysis. Solutions to proposed experiments are simulated using hypothetical values for the model parameters. The solutions solve the global brain ischemia problem in terms of “master bifurcation diagrams” that show all possible outcomes for arbitrary durations of all lethal cerebral blood flow (CBF) decrements. The global ischemia master bifurcation diagrams: (1) can map to a single focal ischemia insult, and (2) reveal all CBF decrements susceptible to neuroprotection. We simulate measuring a neuroprotectant by time course analysis, which revealed emergent nonlinear effects that set dynamical limits on neuroprotection. Using over-simplified stroke geometry, we calculate a theoretical maximum protection of approximately 50% recovery. We also calculate what is likely to be obtained in practice and obtain 38% recovery; a number close to that often reported in the literature. The hypothetical examples studied here illustrate the use of the nonlinear cell injury model as a fresh avenue of approach that has the potential, not only to solve the brain ischemia problem, but also to advance the technology of neuroprotection. PMID:24961411
Yoon, Sunmoo
2017-01-01
Background Twitter can address the mental health challenges of dementia care. The aim of this study is to explore the contents and user interactions of tweets mentioning dementia to gain insights for dementia care. Methods We collected 35,260 tweets mentioning Alzheimer's or dementia on World Alzheimer's Day, September 21st, 2015. Topic modeling and social network analysis were applied to uncover the content and structure of user communication. Results Global users generated keywords related to mental health and care, including #psychology and #mentalhealth. There were similarities and differences between the UK and the US in tweet content. The macro-level analysis uncovered substantial public interest in dementia. The meso-level network analysis revealed that the top leaders of communities were spiritual organizations and traditional media. Conclusions The application of topic modeling and multi-level network analysis while incorporating visualization techniques can promote a global-level understanding regarding public attention, interests, and insights regarding dementia care and mental health. PMID:27803262
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.
1998-01-01
The use of response surface models and kriging models are compared for approximating non-random, deterministic computer analyses. After discussing the traditional response surface approach for constructing polynomial models for approximation, kriging is presented as an alternative statistical-based approximation method for the design and analysis of computer experiments. Both approximation methods are applied to the multidisciplinary design and analysis of an aerospike nozzle which consists of a computational fluid dynamics model and a finite element analysis model. Error analysis of the response surface and kriging models is performed along with a graphical comparison of the approximations. Four optimization problems are formulated and solved using both approximation models. While neither approximation technique consistently outperforms the other in this example, the kriging models using only a constant for the underlying global model and a Gaussian correlation function perform as well as the second order polynomial response surface models.
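A sketch of the two approximation strategies on a toy deterministic function, assuming scikit-learn is available: a second-order polynomial response surface versus a kriging-style Gaussian process with a constant mean and Gaussian (RBF) kernel, as in the configuration the abstract highlights.

```python
# Toy comparison of a quadratic response surface and a kriging (Gaussian-process)
# surrogate for a deterministic "expensive" analysis (stand-in function only).
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_analysis(x):                     # stand-in for a CFD/FEA evaluation
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(3)
X_train = rng.uniform(-1, 1, size=(30, 2))
y_train = expensive_analysis(X_train)
X_test = rng.uniform(-1, 1, size=(200, 2))
y_test = expensive_analysis(X_test)

rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X_train, y_train)
krig = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.5),
                                normalize_y=True).fit(X_train, y_train)

for name, surrogate in [("response surface", rsm), ("kriging", krig)]:
    err = np.sqrt(np.mean((surrogate.predict(X_test) - y_test) ** 2))
    print(f"{name}: RMSE = {err:.3f}")
```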
Tang, Shuaiqi; Zhang, Minghua; Xie, Shaocheng
2016-01-05
Large-scale atmospheric forcing data can greatly impact the simulations of atmospheric process models including Large Eddy Simulations (LES), Cloud Resolving Models (CRMs) and Single-Column Models (SCMs), and impact the development of physical parameterizations in global climate models. This study describes the development of an ensemble variationally constrained objective analysis of atmospheric large-scale forcing data and its application to evaluate the cloud biases in the Community Atmospheric Model (CAM5). Sensitivities of the variational objective analysis to background data, error covariance matrix and constraint variables are described and used to quantify the uncertainties in the large-scale forcing data. Application of the ensemble forcing in the CAM5 SCM during the March 2000 intensive operational period (IOP) at the Southern Great Plains (SGP) of the Atmospheric Radiation Measurement (ARM) program shows systematic biases in the model simulations that cannot be explained by the uncertainty of large-scale forcing data, which points to the deficiencies of physical parameterizations. The SCM is shown to overestimate high clouds and underestimate low clouds. These biases are found to also exist in the global simulation of CAM5 when it is compared with satellite data.
Economic impacts of climate change on agriculture: the AgMIP approach
NASA Astrophysics Data System (ADS)
Delincé, Jacques; Ciaian, Pavel; Witzke, Heinz-Peter
2015-01-01
The current paper investigates the long-term global impacts on crop productivity under different climate scenarios using the AgMIP approach (Agricultural Model Intercomparison and Improvement Project). The paper provides a horizontal model intercomparison across 11 economic models as well as a more detailed analysis of the simulated effects from the Common Agricultural Policy Regionalized Impact (CAPRI) model, to systematically compare its performance with other AgMIP models and specifically for Chinese agriculture. CAPRI is a comparative static partial equilibrium model extensively used for medium- and long-term economic and environmental policy impact applications. The results indicate that, at the global level, climate change will cause an agricultural productivity decrease (between -2% and -15%), a food price increase (between 1.3% and 56%) and an expansion of cultivated area (between 1% and 4%) by 2050. The results for China indicate that climate change effects tend to be smaller than the global impacts. The CAPRI-simulated effects are, in general, close to the median across all AgMIP models. Model intercomparison analyses reveal consistency in the direction of change under climate change but relatively strong heterogeneity in the magnitude of the effects between models.
Simple global carbon model: The atmosphere-terrestrial biosphere-ocean interaction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kwon, O.Y.; Schnoor, J.L.
A simple global carbon model has been developed for scenario analysis and research needs prioritization. CO2 fertilization and temperature effects are included in the terrestrial biosphere compartment, and the ocean compartment includes inorganic chemistry which, with ocean water circulation, enables the calculation of time-variable oceanic carbon uptake. Model-derived Q10 values (the rate increase for every 10°C increase in temperature) are 1.37 for land biota photosynthesis, 1.89 for land biota respiration, and 1.95 for soil respiration, and the feedback temperature is set at 0.01°C/ppm of CO2. These could be the important parameters controlling the carbon cycle in potential global warming scenarios. Scenario analysis, together with sensitivity analysis of temperature feedback, suggests that if CO2 emissions from fossil fuel combustion continue at the present increasing rate of ~1.5% per year, a CO2 doubling (to 560 ppm) will appear in year 2060. Global warming would be responsible for a 40 Gt carbon (Gt C) accumulation in the land biota, an 88 Gt C depletion of the soil carbon, a 7 Gt C accumulation in the oceans, and a 19 ppm increase in atmospheric CO2. The ocean buffering capacity to take up the excess CO2 will decrease with increasing atmospheric CO2 concentration. 51 refs., 8 figs., 3 tabs.
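The flavour of such a box model can be sketched with a land-only toy that omits the ocean compartment: Q10-type respiration, logarithmic CO2 fertilization and a linear temperature feedback. All coefficients below are illustrative assumptions, not the paper's calibrated values.

```python
# Toy land-only carbon box model: atmosphere, land biota and soil, with Q10
# respiration and CO2 fertilization (illustrative coefficients; no ocean compartment).
import numpy as np
from scipy.integrate import solve_ivp

Q10_resp, T_sens = 1.9, 0.01        # respiration Q10; warming per ppm of CO2 [°C/ppm]
NPP0, CO2_0 = 60.0, 280.0           # pre-industrial NPP [Gt C/yr] and CO2 [ppm]

def carbon_cycle(t, y, emissions):
    C_atm, C_biota, C_soil = y                       # carbon pools [Gt C]
    co2 = C_atm / 2.12                               # ~2.12 Gt C per ppm
    dT = T_sens * (co2 - CO2_0)                      # warming relative to pre-industrial
    npp = NPP0 * (1 + 0.35 * np.log(co2 / CO2_0))    # logarithmic CO2 fertilization
    resp = (NPP0 / 2) * Q10_resp ** (dT / 10.0)      # land biota respiration
    soil_resp = (NPP0 / 2) * 1.95 ** (dT / 10.0)     # soil respiration
    litter = npp / 2                                 # transfer of biota carbon to soil
    return [emissions(t) + resp + soil_resp - npp,   # atmosphere
            npp - resp - litter,                     # land biota
            litter - soil_resp]                      # soil

emissions = lambda t: 6.0 * 1.015 ** t               # fossil emissions growing ~1.5%/yr
sol = solve_ivp(carbon_cycle, (0, 100), [280.0 * 2.12, 600.0, 1500.0], args=(emissions,))
print("CO2 after 100 yr ≈", sol.y[0, -1] / 2.12, "ppm")
```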
Strategically Planning to Change
ERIC Educational Resources Information Center
Atkins, Kemal
2010-01-01
Higher education, like the private sector, is searching for innovative ways to respond to demographic shifts, globalization, greater accountability, and new technologies. New organizational models are needed to meet these challenges. In a rapidly changing world, the development of such models can occur through effective strategic analysis and…
Anantha M. Prasad; Louis R. Iverson; Andy Liaw
2006-01-01
We evaluated four statistical models - Regression Tree Analysis (RTA), Bagging Trees (BT), Random Forests (RF), and Multivariate Adaptive Regression Splines (MARS) - for predictive vegetation mapping under current and future climate scenarios according to the Canadian Climate Centre global circulation model.
Human Land-Use Practices Lead to Global Long-Term Increases in Photosynthetic Capacity
NASA Technical Reports Server (NTRS)
Mueller, Thomas; Tucker, Compton J.; Dressler, Gunnar; Pinzon, Jorge E.; Leimgruber, Peter; Dubayah, Ralph O.; Hurtt, George C.; Boehning-Gaese, Katrin; Fagan, William F.
2014-01-01
Long-term trends in photosynthetic capacity measured with the satellite-derived Normalized Difference Vegetation Index (NDVI) are usually associated with climate change. Human impacts on the global land surface are typically not accounted for. Here, we provide the first global analysis quantifying the effect of the earth's human footprint on NDVI trends. Globally, more than 20% of the variability in NDVI trends was explained by anthropogenic factors such as land use, nitrogen fertilization, and irrigation. Intensely used land classes, such as villages, showed the greatest rates of increase in NDVI, more than twice than those of forests. These findings reveal that factors beyond climate influence global long-term trends in NDVI and suggest that global climate change models and analyses of primary productivity should incorporate land use effects.
NASA Astrophysics Data System (ADS)
Chen, Zheng; Gan, Bolan; Wu, Lixin
2017-09-01
Based on 22 of the climate models from phase 3 of the Coupled Model Intercomparison Project, we investigate the ability of the models to reproduce the spatiotemporal features of the wintertime North Pacific Oscillation (NPO), which is the second most important factor determining the wintertime sea level pressure field in simulations of the pre-industrial control climate, and evaluate the NPO response to a plausible future global warming scenario (the A1B scenario). We reveal that while most models simulate the geographic distribution and amplitude of the NPO pattern satisfactorily, only 13 models capture both features well. However, the temporal variability of the simulated NPO is not significantly correlated with that of the observations. Further analysis indicates that the weakened NPO intensity under strong global warming is attributable to the reduced lower-tropospheric baroclinicity at mid-latitudes, which is anticipated to disrupt large-scale and low-frequency atmospheric variability, resulting in the diminished transfer of energy to the NPO, together with its northward shift.
NASA Astrophysics Data System (ADS)
Bhargava, K.; Kalnay, E.; Carton, J.; Yang, F.
2017-12-01
Systematic forecast errors, arising from model deficiencies, form a significant portion of the total forecast error in weather prediction models like the Global Forecast System (GFS). While much effort has been expended to improve models, substantial model error remains. The aim here is to (i) estimate the model deficiencies in the GFS that lead to systematic forecast errors, (ii) implement an online correction (i.e., within the model) scheme to correct GFS following the methodology of Danforth et al. [2007] and Danforth and Kalnay [2008, GRL]. Analysis Increments represent the corrections that new observations make on, in this case, the 6-hr forecast in the analysis cycle. Model bias corrections are estimated from the time average of the analysis increments divided by 6-hr, assuming that initial model errors grow linearly and first ignoring the impact of observation bias. During 2012-2016, seasonal means of the 6-hr model bias are generally robust despite changes in model resolution and data assimilation systems, and their broad continental scales explain their insensitivity to model resolution. The daily bias dominates the sub-monthly analysis increments and consists primarily of diurnal and semidiurnal components, also requiring a low dimensional correction. Analysis increments in 2015 and 2016 are reduced over oceans, which is attributed to improvements in the specification of the SSTs. These results encourage application of online correction, as suggested by Danforth and Kalnay, for mean, seasonal and diurnal and semidiurnal model biases in GFS to reduce both systematic and random errors. As the error growth in the short-term is still linear, estimated model bias corrections can be added as a forcing term in the model tendency equation to correct online. Preliminary experiments with GFS, correcting temperature and specific humidity online show reduction in model bias in 6-hr forecast. This approach can then be used to guide and optimize the design of sub-grid scale physical parameterizations, more accurate discretization of the model dynamics, boundary conditions, radiative transfer codes, and other potential model improvements which can then replace the empirical correction scheme. The analysis increments also provide guidance in testing new physical parameterizations.
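The online-correction idea can be illustrated with a scalar toy model: estimate the time-mean analysis increment over a training period and add it back as a constant forcing on the forecast tendency. This is a sketch of the concept only, not the GFS implementation, and all numbers are invented.

```python
# Toy illustration of online bias correction from time-averaged analysis increments.
import numpy as np

rng = np.random.default_rng(4)
truth_tendency, model_bias = 0.0, -0.5            # the toy model drifts cold by 0.5 K per 6-hr window

def forecast_6hr(x0, correction=0.0):
    return x0 + truth_tendency + model_bias + correction + rng.normal(0, 0.1)

# "Training" period: collect analysis increments (analysis minus 6-hr forecast).
x_analysis, increments = 10.0, []
for _ in range(200):
    x_fcst = forecast_6hr(x_analysis)
    x_true = x_analysis + truth_tendency + rng.normal(0, 0.05)
    increments.append(x_true - x_fcst)            # observation-driven correction
    x_analysis = x_true

bias_correction = np.mean(increments)             # time-averaged analysis increment

# Online-corrected forecasts should show a much smaller systematic error.
errors = [forecast_6hr(10.0, bias_correction) - 10.0 for _ in range(200)]
print("estimated correction:", round(bias_correction, 2), " mean error:", round(np.mean(errors), 2))
```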
Global ozone and air quality: a multi-model assessment of risks to human health and crops
NASA Astrophysics Data System (ADS)
Ellingsen, K.; Gauss, M.; van Dingenen, R.; Dentener, F. J.; Emberson, L.; Fiore, A. M.; Schultz, M. G.; Stevenson, D. S.; Ashmore, M. R.; Atherton, C. S.; Bergmann, D. J.; Bey, I.; Butler, T.; Drevet, J.; Eskes, H.; Hauglustaine, D. A.; Isaksen, I. S. A.; Horowitz, L. W.; Krol, M.; Lamarque, J. F.; Lawrence, M. G.; van Noije, T.; Pyle, J.; Rast, S.; Rodriguez, J.; Savage, N.; Strahan, S.; Sudo, K.; Szopa, S.; Wild, O.
2008-02-01
Within ACCENT, a European Network of Excellence, eighteen atmospheric models from the U.S., Europe, and Japan calculated present (2000) and future (2030) concentrations of ozone at the Earth's surface with hourly temporal resolution. Comparison of model results with surface ozone measurements in 14 world regions indicates that levels and seasonality of surface ozone in North America and Europe are characterized well by global models, with annual average biases typically within 5-10 nmol/mol. However, comparison with rather sparse observations over some regions suggest that most models overestimate annual ozone by 15-20 nmol/mol in some locations. Two scenarios from the International Institute for Applied Systems Analysis (IIASA) and one from the Intergovernmental Panel on Climate Change Special Report on Emissions Scenarios (IPCC SRES) have been implemented in the models. This study focuses on changes in near-surface ozone and their effects on human health and vegetation. Different indices and air quality standards are used to characterise air quality. We show that often the calculated changes in the different indices are closely inter-related. Indices using lower thresholds are more consistent between the models, and are recommended for global model analysis. Our analysis indicates that currently about two-thirds of the regions considered do not meet health air quality standards, whereas only 2-4 regions remain below the threshold. Calculated air quality exceedances show moderate deterioration by 2030 if current emissions legislation is followed and slight improvements if current emissions reduction technology is used optimally. For the "business as usual" scenario severe air quality problems are predicted. We show that model simulations of air quality indices are particularly sensitive to how well ozone is represented, and improved accuracy is needed for future projections. Additional measurements are needed to allow a more quantitative assessment of the risks to human health and vegetation from changing levels of surface ozone.
Attribution of trends in global vegetation greenness from 1982 to 2011
NASA Astrophysics Data System (ADS)
Zhu, Z.; Xu, L.; Bi, J.; Myneni, R.; Knyazikhin, Y.
2012-12-01
Time series of remotely sensed vegetation index data provide evidence of changes in terrestrial vegetation activity over the past decades worldwide. However, it is difficult to attribute cause and effect to vegetation trends because variations in vegetation productivity are driven by various factors. This study first investigated changes in global vegetation productivity and then attributed the greening trend of global natural vegetation. Growing season integrated normalized difference vegetation index (GSI NDVI) derived from the new GIMMS NDVI3g dataset (1982-2011) was analyzed. A combined time series analysis model, developed from the simple linear trend (SLT) model, the autoregressive integrated moving average (ARIMA) model, and Vogelsang's t-PST model, shows that the productivity of all vegetation types except deciduous broadleaf forest predominantly exhibited increasing trends over the 30-year period. The evolution of changes in productivity in the last decade was also investigated. The area of greening vegetation increased monotonically through the last decade, while both the browning and no-change areas decreased monotonically. To attribute the predominant increasing trend in productivity of global natural vegetation, trends in eight climate time series datasets (three temperature, three precipitation and two radiation datasets) were analyzed. The attribution of trends in global vegetation greenness was summarized as relaxation of climatic constraints, fertilization, and other unknown causes. Results show that nearly all of the productivity increase of global natural vegetation was driven by relaxation of climatic constraints and fertilization, which play equally important roles in driving global vegetation greenness. (Figure caption: area fraction and productivity-change fraction of IGBP vegetation land-cover classes showing statistically significant (10% level) trends in GSI NDVI.)
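The simple linear trend (SLT) component of such an analysis can be sketched as a per-pixel least-squares fit with a significance test at the 10% level; this is only an illustrative stand-in for the combined SLT/ARIMA/t-PST procedure, and the array layout is assumed.

```python
import numpy as np
from scipy import stats

def gsi_ndvi_trends(annual_gsi_ndvi, alpha=0.10):
    """Per-pixel ordinary least-squares trend in growing-season integrated
    NDVI with a two-sided significance test.

    annual_gsi_ndvi : array of shape (n_years, n_pixels), e.g. 30 years of
        GIMMS NDVI3g-derived GSI NDVI.
    Returns the slope per year and a mask of pixels significant at `alpha`.
    """
    n_years, n_pixels = annual_gsi_ndvi.shape
    years = np.arange(n_years)
    slopes = np.empty(n_pixels)
    pvalues = np.empty(n_pixels)
    for j in range(n_pixels):
        fit = stats.linregress(years, annual_gsi_ndvi[:, j])
        slopes[j], pvalues[j] = fit.slope, fit.pvalue
    return slopes, pvalues < alpha
```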
New limits on intrinsic charm in the nucleon from global analysis of parton distributions
Jimenez-Delgado, P.; Hobbs, T. J.; Londergan, J. T.; ...
2015-02-27
We present a new global QCD analysis of parton distribution functions, allowing for possible intrinsic charm (IC) contributions in the nucleon inspired by light-front models. The analysis makes use of the full range of available high-energy scattering data for Q² ≥ 1 GeV² and W² ≥ 3.5 GeV², including fixed-target proton and deuteron deep-inelastic cross sections at lower energies that were excluded in previous global analyses. The expanded data set places more stringent constraints on the momentum carried by IC, with ⟨x⟩IC at most 0.5% (corresponding to an IC normalization of ~1%) at the 4σ level for Δχ² = 1. We also assess the impact of older EMC measurements of the charm structure function F₂^{cc̄} at large x, which favor a nonzero IC but with very large χ² values.
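Schematically, a Δχ² = 1 limit of the kind quoted above is read off a χ² profile obtained by scanning the IC momentum fraction; the toy function below illustrates only that final step, not the underlying global PDF fit.

```python
import numpy as np

def delta_chi2_limit(x_ic_grid, chi2_profile, delta=1.0):
    """Read an upper limit off a chi-square profile: the largest scanned
    intrinsic-charm momentum fraction whose global chi-square lies within
    `delta` of the minimum. The profile itself would come from refitting
    the PDFs at each fixed IC value, which is not reproduced here."""
    chi2 = np.asarray(chi2_profile, dtype=float)
    allowed = np.asarray(x_ic_grid, dtype=float)[chi2 <= chi2.min() + delta]
    return allowed.max()
```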
Large Scale Flood Risk Analysis using a New Hyper-resolution Population Dataset
NASA Astrophysics Data System (ADS)
Smith, A.; Neal, J. C.; Bates, P. D.; Quinn, N.; Wing, O.
2017-12-01
Here we present the first national-scale flood risk analyses using high-resolution population data from the Facebook Connectivity Lab and data from a hyper-resolution flood hazard model. In recent years the field of large-scale hydraulic modelling has been transformed by new remotely sensed datasets, improved process representation, highly efficient flow algorithms and increases in computational power. These developments have allowed flood risk analysis to be undertaken in previously unmodelled territories and at continental to global scales. Flood risk analyses are typically conducted via the integration of modelled water depths with an exposure dataset. Over large scales and in data-poor areas, these exposure data typically take the form of a gridded population dataset, estimating population density using remotely sensed data and/or locally available census data. The local nature of flooding dictates that, for robust flood risk analysis, both hazard and exposure data should sufficiently resolve local-scale features. Global flood frameworks now enable flood hazard data to be produced at 90 m resolution, resulting in a mismatch with available population datasets, which are typically more coarsely resolved. Moreover, these exposure data are typically focused on urban areas and struggle to represent rural populations. In this study we integrate a new population dataset with a global flood hazard model. The population dataset was produced by the Connectivity Lab at Facebook and provides gridded population data at 5 m resolution, a resolution increase of multiple orders of magnitude over previous countrywide datasets. Flood risk analyses undertaken over a number of developing countries are presented, along with a comparison with flood risk analyses undertaken using pre-existing population datasets.
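The core integration of hazard and exposure described above amounts to summing population over cells the flood model marks as inundated. The sketch below assumes both grids have already been resampled onto a common grid; names and the depth threshold are illustrative.

```python
import numpy as np

def exposed_population(flood_depth_m, population, depth_threshold=0.0):
    """Population exposure for one return period: sum of people in cells
    the hazard model marks as inundated.

    flood_depth_m : 2-D array of modelled water depth (m); NaN or 0 where dry.
    population    : 2-D array of people per cell on the same grid.
    """
    flooded = np.nan_to_num(flood_depth_m) > depth_threshold
    return float(np.asarray(population)[flooded].sum())
```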
A multi-model assessment of terrestrial biosphere model data needs
NASA Astrophysics Data System (ADS)
Gardella, A.; Cowdery, E.; De Kauwe, M. G.; Desai, A. R.; Duveneck, M.; Fer, I.; Fisher, R.; Knox, R. G.; Kooper, R.; LeBauer, D.; McCabe, T.; Minunno, F.; Raiho, A.; Serbin, S.; Shiklomanov, A. N.; Thomas, A.; Walker, A.; Dietze, M.
2017-12-01
Terrestrial biosphere models provide us with the means to simulate the impacts of climate change and their uncertainties. Going beyond direct observation and experimentation, models synthesize our current understanding of ecosystem processes and can give us insight into the data needed to constrain model parameters. In previous work, we leveraged the Predictive Ecosystem Analyzer (PEcAn) to assess the contribution of different parameters to the uncertainty of the Ecosystem Demography model v2 (ED) outputs across various North American biomes (Dietze et al., JGR-G, 2014). While this analysis identified key research priorities, the extent to which these priorities were model- and/or biome-specific was unclear. Furthermore, because the analysis studied only one model, we were unable to comment on the effect of variability in model structure on overall predictive uncertainty. Here, we expand this analysis to all biomes globally and a wide sample of models that vary in complexity: BioCro, CABLE, CLM, DALEC, ED2, FATES, G'DAY, JULES, LANDIS, LINKAGES, LPJ-GUESS, MAESPA, PRELES, SDGVM, SIPNET, and TEM. Prior to performing uncertainty analyses, model parameter uncertainties were assessed by assimilating all available trait data from the combination of the BETYdb and TRY trait databases, using an updated multivariate version of PEcAn's hierarchical Bayesian meta-analysis. Next, sensitivity analyses were performed for all models across a range of sites globally to assess sensitivities for a range of different outputs (GPP, ET, SH, Ra, NPP, Rh, NEE, LAI) at multiple time scales from the sub-annual to the decadal. Finally, parameter uncertainties and model sensitivities were combined to evaluate the fractional contribution of each parameter to the predictive uncertainty for a specific variable at a specific site and timescale. Facilitated by PEcAn's automated workflows, this analysis represents the broadest assessment of the sensitivities and uncertainties in terrestrial models to date, and provides a comprehensive roadmap for constraining model uncertainties through model development and data collection.
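The final step, combining parameter uncertainties with model sensitivities into fractional contributions to predictive uncertainty, can be sketched as a simple variance decomposition; this is a schematic in the spirit of the PEcAn workflow, not its actual implementation.

```python
def fractional_uncertainty(sensitivities, parameter_sd):
    """Approximate the contribution of each parameter to predictive
    variance as (output sensitivity x posterior standard deviation)^2,
    normalised so the fractions sum to one.

    sensitivities : dict mapping parameter name -> d(output)/d(parameter)
    parameter_sd  : dict mapping parameter name -> posterior std. deviation
    """
    variance = {p: (sensitivities[p] * parameter_sd[p]) ** 2
                for p in sensitivities}
    total = sum(variance.values())
    return {p: v / total for p, v in variance.items()}
```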
NASA Astrophysics Data System (ADS)
Shen, Chengcheng; Shi, Honghua; Liu, Yongzhi; Li, Fen; Ding, Dewen
2016-07-01
Marine ecosystem dynamic models (MEDMs) are important tools for the simulation and prediction of marine ecosystems. This article summarizes the methods and strategies used for the improvement and assessment of MEDM skill, and it attempts to establish a technical framework to inspire further ideas concerning MEDM skill improvement. The skill of MEDMs can be improved by parameter optimization (PO), which is an important step in model calibration. An efficient approach to solve the problem of PO constrained by MEDMs is the global treatment of both sensitivity analysis and PO. Model validation is an essential step following PO, which validates the efficiency of model calibration by analyzing and estimating the goodness-of-fit of the optimized model. Additionally, by focusing on the degree of impact of various factors on model skill, model uncertainty analysis can supply model users with a quantitative assessment of model confidence. Research on MEDMs is ongoing; however, improvement in model skill still lacks global treatments and its assessment is not integrated. Thus, the predictive performance of MEDMs is not strong and model uncertainties lack quantitative descriptions, limiting their application. Therefore, a large number of case studies concerning model skill should be performed to promote the development of a scientific and normative technical framework for the improvement of MEDM skill.
2013-09-30
accuracy of the analysis. Root mean square difference (RMSD) is much smaller for RIP than for either Simple Ocean Data Assimilation or Incremental Analysis Update globally for temperature as well as salinity. Regionally the same results were found, with only one exception in which the salinity RMSD ... short-term forecast using a numerical model with the observations taken within the forecast time window. The resulting state is the so-called "analysis"
A rumor transmission model with incubation in social networks
NASA Astrophysics Data System (ADS)
Jia, Jianwen; Wu, Wenjiang
2018-02-01
In this paper, we propose a rumor transmission model with an incubation period and constant recruitment in social networks. By carrying out an analysis of the model, we study the stability of the rumor-free equilibrium and obtain the local stability condition for the rumor equilibrium. We use the geometric approach for ordinary differential equations to show the global stability of the rumor equilibrium. When ℜ0 = 1, the model undergoes a transcritical bifurcation. Furthermore, numerical simulations are used to support the analysis. Finally, some conclusions are presented.
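For orientation, a generic ignorant-incubated-spreader-stifler system with constant recruitment can be integrated numerically as below; the equations and parameter values are illustrative and are not the authors' exact model.

```python
import numpy as np
from scipy.integrate import odeint

def rumor_model(y, t, recruit, beta, sigma, gamma, mu):
    """Ignorant (S), incubated (E), spreader (I), stifler (R) dynamics with
    constant recruitment and per-capita removal; purely illustrative."""
    S, E, I, R = y
    dS = recruit - beta * S * I - mu * S   # recruitment and contacts
    dE = beta * S * I - (sigma + mu) * E   # incubation period
    dI = sigma * E - (gamma + mu) * I      # spreading
    dR = gamma * I - mu * R                # stifling
    return [dS, dE, dI, dR]

t = np.linspace(0.0, 200.0, 2001)
params = (0.02, 0.8, 0.2, 0.1, 0.02)       # recruit, beta, sigma, gamma, mu
trajectory = odeint(rumor_model, [0.99, 0.0, 0.01, 0.0], t, args=params)
# For this illustrative system the threshold quantity is
# R0 = beta * (recruit / mu) * sigma / ((sigma + mu) * (gamma + mu)),
# with the rumor dying out for R0 < 1 and persisting for R0 > 1.
```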
Sustainability, collapse and oscillations in a simple World-Earth model
NASA Astrophysics Data System (ADS)
Nitzbon, Jan; Heitzig, Jobst; Parlitz, Ulrich
2017-07-01
The Anthropocene is characterized by close interdependencies between the natural Earth system and the global human society, posing novel challenges to model development. Here we present a conceptual model describing the long-term co-evolution of the natural and socio-economic subsystems of Earth. While the climate is represented via a global carbon cycle, we use economic concepts to model socio-metabolic flows of biomass and fossil fuels between nature and society. A well-being-dependent parametrization of fertility and mortality governs human population dynamics. Our analysis focuses on assessing possible asymptotic states of the Earth system for a qualitative understanding of its complex dynamics rather than quantitative predictions. The model's low dimensionality and simple equations enable a parameter-space analysis that allows us to identify the preconditions of several asymptotic states, and hence fates of humanity and planet. These include a sustainable co-evolution of nature and society, a global collapse and everlasting oscillations. We consider different scenarios corresponding to different socio-cultural stages of human history. The necessity of accounting for the ‘human factor’ in Earth system models is highlighted by the finding that carbon stocks during the past centuries evolved opposite to what would ‘naturally’ be expected on a planet without humans. The intensity of biomass use and the contribution of ecosystem services to human well-being are found to be crucial determinants of the asymptotic state in a (pre-industrial) biomass-only scenario without capital accumulation. The capitalistic, fossil-based scenario reveals that trajectories with fundamentally different asymptotic states might still be almost indistinguishable during even a centuries-long transient phase. Given current human population levels, our study also supports the claim that, besides reducing the global demand for energy, only the extensive use of renewable energies may pave the way into a sustainable future.
Estimating heterotrophic respiration at large scales: challenges, approaches, and next steps
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bond-Lamberty, Benjamin; Epron, Daniel; Harden, Jennifer W.
2016-06-27
Heterotrophic respiration (HR), the aerobic and anaerobic processes mineralizing organic matter, is a key carbon flux but one impossible to measure at scales significantly larger than small experimental plots. This impedes our ability to understand carbon and nutrient cycles, benchmark models, or reliably upscale point measurements. Given that a new generation of highly mechanistic, genomic-specific global models is not imminent, we suggest that a useful step to improve this situation would be the development of "Decomposition Functional Types" (DFTs). Analogous to plant functional types (PFTs), DFTs would abstract and capture important differences in HR metabolism and flux dynamics, allowing models to efficiently group and vary these characteristics across space and time. We argue that DFTs should be initially informed by top-down expert opinion, but ultimately developed using bottom-up, data-driven analyses, and provide specific examples of potential dependent and independent variables that could be used. We present and discuss an example clustering analysis to show how model-produced annual HR can be broken into distinct groups associated with global variability in biotic and abiotic factors, and demonstrate that these groups are distinct from already-existing PFTs. A similar analysis, incorporating observational data, could form a basis for future DFTs. Finally, we suggest next steps and critical priorities: collection and synthesis of existing data; more in-depth analyses combining open data with high-performance computing; rigorous testing of analytical results; and planning by the global modeling community for decoupling decomposition from fixed site data. These are all critical steps to build a foundation for DFTs in global models, thus providing the ecological and climate change communities with robust, scalable estimates of HR at large scales.
NASA Astrophysics Data System (ADS)
Proestos, Y.; Christophides, G.; Erguler, K.; Tanarhte, M.; Waldock, J.; Lelieveld, J.
2014-12-01
Climate change can influence the transmission of vector borne diseases (VBDs) through altering the habitat suitability of insect vectors. Here we present global climate model simulations and evaluate the associated uncertainties in view of the main meteorological factors that may affect the distribution of the Asian Tiger mosquito (Aedes albopictus), which can transmit pathogens that cause Chikungunya, Dengue fever, yellow fever and various encephalitides. Using a general circulation model (GCM) at 50 km horizontal resolution to simulate mosquito survival variables including temperature, precipitation and relative humidity, we present both global and regional projections of the habitat suitability up to the middle of the 21st century. The model resolution of 50 km allows evaluation against previous projections for Europe and provides a basis for comparative analyses with other regions. Model uncertainties and performance are addressed in light of the recent CMIP5 ensemble climate model simulations for the RCP8.5 concentration pathway and using meteorological re-analysis data (ERA-Interim/ECMWF) for the recent past. Uncertainty ranges associated with the thresholds of meteorological variables that may affect the distribution of Ae. albopictus are diagnosed using fuzzy-logic methodology, notably to assess the influence of selected meteorological criteria and combinations of criteria that influence mosquito habitat suitability. From the climate projections for 2050, and adopting a habitat suitability index larger than 70%, we estimate that about 2.4 billion individuals in a land area of nearly 20 million square kilometres will potentially be exposed to Ae. albopictus. The synthesis of fuzzy-logic based on mosquito biology and climate change analysis provides new insights into the regional and global spreading of VBDs to support disease control and policy making.
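The fuzzy-logic combination of meteorological criteria into a habitat suitability index can be sketched with trapezoidal membership functions combined by a minimum operator; the threshold values below are illustrative placeholders rather than the calibrated criteria used in the study.

```python
import numpy as np

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: 0 below a and above d, 1 between b and c."""
    x = np.asarray(x, dtype=float)
    rise = np.clip((x - a) / max(b - a, 1e-9), 0.0, 1.0)
    fall = np.clip((d - x) / max(d - c, 1e-9), 0.0, 1.0)
    return np.minimum(rise, fall)

def habitat_suitability(temperature_c, precipitation_mm, rel_humidity_pct):
    """Combine temperature, precipitation and relative-humidity criteria
    into a 0-1 suitability index with a minimum (AND) operator; a value
    above 0.7 would count as suitable. Breakpoints are placeholders."""
    m_t = trapezoid(temperature_c, 5, 11, 28, 32)
    m_p = trapezoid(precipitation_mm, 300, 500, 3000, 4000)
    m_h = trapezoid(rel_humidity_pct, 40, 55, 95, 100)
    return np.minimum(np.minimum(m_t, m_p), m_h)
```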
Estimating heterotrophic respiration at large scales: Challenges, approaches, and next steps
Bond-Lamberty, Ben; Epron, Daniel; Harden, Jennifer; ...
2016-06-27
Heterotrophic respiration (HR), the aerobic and anaerobic processes mineralizing organic matter, is a key carbon flux but one impossible to measure at scales significantly larger than small experimental plots. This impedes our ability to understand carbon and nutrient cycles, benchmark models, or reliably upscale point measurements. Given that a new generation of highly mechanistic, genomic-specific global models is not imminent, we suggest that a useful step to improve this situation would be the development of Decomposition Functional Types (DFTs). Analogous to plant functional types (PFTs), DFTs would abstract and capture important differences in HR metabolism and flux dynamics, allowing modelers and experimentalists to efficiently group and vary these characteristics across space and time. We argue that DFTs should be initially informed by top-down expert opinion, but ultimately developed using bottom-up, data-driven analyses, and provide specific examples of potential dependent and independent variables that could be used. We present an example clustering analysis to show how annual HR can be broken into distinct groups associated with global variability in biotic and abiotic factors, and demonstrate that these groups are distinct from (but complementary to) already-existing PFTs. A similar analysis incorporating observational data could form the basis for future DFTs. Finally, we suggest next steps and critical priorities: collection and synthesis of existing data; more in-depth analyses combining open data with rigorous testing of analytical results; using point measurements and realistic forcing variables to constrain process-based models; and planning by the global modeling community for decoupling decomposition from fixed site data. These are all critical steps to build a foundation for DFTs in global models, thus providing the ecological and climate change communities with robust, scalable estimates of HR.
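A clustering analysis of the kind described in these two abstracts might look like the sketch below, grouping grid cells by annual HR together with candidate biotic and abiotic drivers; the driver variables and the number of clusters are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def cluster_dfts(annual_hr, mat_c, map_mm, soil_c, n_clusters=5, seed=0):
    """Group grid cells into candidate Decomposition Functional Types by
    clustering annual heterotrophic respiration together with mean annual
    temperature, mean annual precipitation and soil carbon (all 1-D arrays
    over grid cells)."""
    features = np.column_stack([annual_hr, mat_c, map_mm, soil_c])
    features = StandardScaler().fit_transform(features)  # common scale
    km = KMeans(n_clusters=n_clusters, random_state=seed, n_init=10)
    labels = km.fit_predict(features)
    return labels, km.cluster_centers_
```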
2016-02-01
We live in an age that increasingly calls for national or regional management of global risks. This article discusses the contributions that expert elicitation can bring to efforts to manage global risks and identifies challenges faced in conducting expert elicitation at this scale. In doing so it draws on lessons learned from conducting an expert elicitation as part of the World Health Organization's (WHO) initiative to estimate the global burden of foodborne disease, a study commissioned by the Foodborne Disease Epidemiology Reference Group (FERG). Expert elicitation is designed to fill gaps in data and research using structured, transparent methods. Such gaps are a significant challenge for global risk modeling. Experience with the WHO FERG expert elicitation shows that it is feasible to conduct an expert elicitation at a global scale, but that challenges do arise, including: defining an informative, yet feasible geographical structure for the elicitation; defining what constitutes expertise in a global setting; structuring international, multidisciplinary expert panels; and managing demands on experts' time during the elicitation. This article was written as part of a workshop, "Methods for Research Synthesis: A Cross-Disciplinary Approach," held at the Harvard Center for Risk Analysis on October 13, 2013. © 2016 Society for Risk Analysis.
Impacts of Canadian and global black carbon shipping emissions on Arctic climate
NASA Astrophysics Data System (ADS)
Shrestha, R.; von Salzen, K.
2017-12-01
Shipping activities have increased across the Arctic and are projected to continue increasing in the future. In this study we compare the climate impacts of Canadian and global shipping black carbon (BC) emissions on the Arctic using the Canadian Centre for Climate Modelling and Analysis Earth System Model (CanESM4.1). The model simulations are performed with and without shipping emissions at T63 (128 x 64) spectral resolution. Results indicate that shipping activities enhance BC concentrations in the areas close to the shipping emissions, which causes increased absorption of solar radiation (direct effect). An impact of shipping on temperatures is simulated across the entire Arctic, with maximum warming in the fall and winter seasons. Although global mean temperature changes are very similar between the two simulations, the increase in Canadian BC shipping emissions causes warmer Arctic land surface temperatures in summer due to the direct radiative effects of aerosol.
Importance of vegetation distribution for future carbon balance
NASA Astrophysics Data System (ADS)
Ahlström, A.; Xia, J.; Arneth, A.; Luo, Y.; Smith, B.
2015-12-01
Projections of future terrestrial carbon uptake vary greatly between simulations. Net primary production (NPP), wildfires, vegetation dynamics (including biome shifts) and soil decomposition constitute the main processes governing the response of the terrestrial carbon cycle in a changing climate. While primary production and soil respiration are relatively well studied and implemented in all global ecosystem models used to project the future land sink of CO2, vegetation dynamics are less studied and not always represented in global models. Here we used a detailed second-generation dynamic global vegetation model with an advanced representation of vegetation growth, mortality and the associated turnover, and with proven skill in predicting vegetation distribution and succession. We apply an emulator that describes the carbon flows and pools exactly as in simulations with the full model. The emulator simulates ecosystem dynamics in response to 13 different climate or Earth system model simulations from the CMIP5 ensemble under RCP8.5 radiative forcing at year 2085. We exchanged carbon cycle processes between these 13 simulations and investigated the changes predicted by the emulator. This method allowed us to partition the entire ensemble carbon uptake uncertainty into individual processes. We found that NPP, vegetation dynamics (including biome shifts, wildfires and mortality) and soil decomposition rates explained 49%, 17% and 33%, respectively, of the uncertainties in modeled global C uptake. Uncertainty due to vegetation dynamics was further partitioned into stand-clearing disturbances (16%), wildfires (0%), stand dynamics (7%), reproduction (10%) and biome shifts (67%) globally. We conclude that while NPP and soil decomposition rates jointly account for 83% of future climate-induced C-uptake uncertainties, vegetation turnover and structure, dominated by shifts in vegetation distribution, represent a significant fraction globally and regionally (tropical forests: 40%), strongly motivating their representation and analysis in future C-cycle studies.
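The process-exchange partitioning can be sketched as a full factorial recombination of emulated responses followed by a main-effect variance decomposition; the emulator itself is represented only by a placeholder function, and this simplified scheme is an assumption rather than the authors' exact bookkeeping.

```python
import numpy as np

def partition_uptake_uncertainty(uptake, n_members=13):
    """Recombine the emulated responses of NPP, vegetation dynamics and
    soil decomposition across climate ensemble members and attribute the
    spread in global C uptake to each process via its main-effect variance.

    uptake(n, v, s) is a placeholder for the emulator: global C uptake when
    NPP responds to member n, vegetation dynamics to member v and soil
    decomposition to member s."""
    members = range(n_members)
    U = np.array([[[uptake(n, v, s) for s in members]
                   for v in members] for n in members])
    total_var = U.var()
    return {
        "NPP": U.mean(axis=(1, 2)).var() / total_var,
        "vegetation dynamics": U.mean(axis=(0, 2)).var() / total_var,
        "soil decomposition": U.mean(axis=(0, 1)).var() / total_var,
    }
```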
A high-resolution global-scale groundwater model
NASA Astrophysics Data System (ADS)
de Graaf, I. E. M.; Sutanudjaja, E. H.; van Beek, L. P. H.; Bierkens, M. F. P.
2015-02-01
Groundwater is the world's largest accessible source of fresh water. It plays a vital role in satisfying basic needs for drinking water, agriculture and industrial activities. During times of drought, groundwater sustains baseflow to rivers and wetlands, thereby supporting ecosystems. Most global-scale hydrological models (GHMs) do not include a groundwater flow component, mainly due to a lack of geohydrological data at the global scale. For the simulation of lateral flow and groundwater head dynamics, a realistic physical representation of the groundwater system is needed, especially for GHMs that run at finer resolutions. In this study we present a global-scale groundwater model (run at 6' resolution) using MODFLOW to construct an equilibrium water table at its natural state as the result of long-term climatic forcing. The aquifer schematization and properties used are based on available global datasets of lithology and transmissivities combined with the estimated thickness of an upper, unconfined aquifer. The model is forced with outputs from the land-surface PCRaster Global Water Balance (PCR-GLOBWB) model, specifically net recharge and surface water levels. A sensitivity analysis, in which the model was run with various parameter settings, showed that variation in saturated conductivity has the largest impact on the simulated groundwater levels. Validation with observed groundwater heads showed that groundwater heads are reasonably well simulated for many regions of the world, especially for sediment basins (R² = 0.95). The simulated regional-scale groundwater patterns and flow paths demonstrate the relevance of lateral groundwater flow in GHMs. Inter-basin groundwater flows can be a significant part of a basin's water budget and help to sustain river baseflows, especially during droughts. Also, the water availability of larger aquifer systems can be positively affected by additional recharge from inter-basin groundwater flows.
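The validation statistic reported above can be computed with a few lines once simulated heads have been extracted at observation-well locations; one common definition of R² is used here, and the pairing of model cells with wells is assumed to have been done beforehand.

```python
import numpy as np

def validate_heads(simulated_heads, observed_heads):
    """Coefficient of determination and mean bias between simulated and
    observed groundwater heads, paired per observation well."""
    sim = np.asarray(simulated_heads, dtype=float)
    obs = np.asarray(observed_heads, dtype=float)
    ss_res = np.sum((obs - sim) ** 2)
    ss_tot = np.sum((obs - obs.mean()) ** 2)
    r_squared = 1.0 - ss_res / ss_tot
    bias = float((sim - obs).mean())
    return r_squared, bias
```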