Sample records for UKMO AMIP simulations

  1. Validation of land-surface processes in the AMIP models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, T J

    The Atmospheric Model Intercomparison Project (AMIP) is a commonly accepted protocol for testing the performance of the world's atmospheric general circulation models (AGCMs) under common specifications of radiative forcings (in solar constant and carbon dioxide concentration) and observed ocean boundary conditions (Gates 1992, Gates et al. 1999). From the standpoint of land-surface specialists, the AMIP affords an opportunity to investigate the behaviors of a wide variety of land-surface schemes (LSS) that are coupled to their "native" AGCMs (Phillips et al. 1995, Phillips 1999). In principle, therefore, the AMIP permits consideration of an overarching question: "To what extent does an AGCM's performance in simulating continental climate depend on the representations of land-surface processes by the embedded LSS?" There are, of course, some formidable obstacles to satisfactorily addressing this question. First, there is the dilemma of how to effectively validate simulation performance, given the present dearth of global land-surface data sets. Even if this data problem were alleviated, some inherent methodological difficulties would remain: in the context of the AMIP, it is not possible to validate a given LSS per se, since the associated land-surface climate simulation is a product of the coupled AGCM/LSS system. Moreover, aside from the intrinsic differences in LSS across the AMIP models, the varied representations of land-surface characteristics (e.g. vegetation properties, surface albedos and roughnesses, etc.) and related variations in land-surface forcings further complicate such an attribution process. Nevertheless, it may be possible to develop validation methodologies/statistics that are sufficiently penetrating to reveal "signatures" of particular LSS representations (e.g. "bucket" vs. more complex parameterizations of hydrology) in the AMIP land-surface simulations.

  2. Predicting the severity of the spurious "double ITCZ" problem in CMIP5 coupled models from AMIP simulations [Tropical versus extratropical origins of the spurious "double ITCZ" in coupled climate models]

    DOE PAGES

    Xiang, Baoqiang; Zhao, Ming; Held, Isaac M.; ...

    2017-02-13

    The severity of the double Intertropical Convergence Zone (DI) problem in climate models can be measured by a tropical precipitation asymmetry index (PAI), indicating whether tropical precipitation favors the Northern Hemisphere or the Southern Hemisphere. Examination of 19 Coupled Model Intercomparison Project phase 5 models reveals that the PAI is tightly linked to the tropical sea surface temperature (SST) bias. As one of the factors determining the SST bias, the asymmetry of tropical net surface heat flux in Atmospheric Model Intercomparison Project (AMIP) simulations is identified as a skillful predictor of the PAI change from an AMIP to a coupled simulation, with an intermodel correlation of 0.90. Using tropical top-of-atmosphere (TOA) fluxes, the correlations are lower but still strong. However, the extratropical asymmetries of surface and TOA fluxes in AMIP simulations cannot serve as useful predictors of the PAI change. Furthermore, this study suggests that the largest source of the DI bias is from the tropics and from atmospheric models.
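The asymmetry index in this record can be illustrated with a toy calculation. The exact PAI definition used by Xiang et al. is not reproduced here, so the sketch below assumes a simple form: the area-weighted northern-tropics mean precipitation minus the southern-tropics mean, normalised by the 20°S-20°N mean.

```python
import numpy as np

def precip_asymmetry_index(precip, lat):
    """Toy precipitation asymmetry index (PAI): northern-tropics mean
    minus southern-tropics mean, normalised by the 20S-20N mean.
    The exact definition in the paper may differ."""
    w = np.cos(np.deg2rad(lat))                      # area weighting
    north = (lat > 0) & (lat <= 20)
    south = (lat < 0) & (lat >= -20)
    tropics = np.abs(lat) <= 20
    p_n = np.average(precip[north], weights=w[north])
    p_s = np.average(precip[south], weights=w[south])
    p_t = np.average(precip[tropics], weights=w[tropics])
    return (p_n - p_s) / p_t

# A hemispherically symmetric zonal-mean precipitation profile gives
# PAI ~ 0; a northward-displaced ITCZ gives PAI > 0.
lat = np.linspace(-30, 30, 121)
itcz_north = np.exp(-((lat - 7.0) / 8.0) ** 2)       # single NH ITCZ
symmetric = np.exp(-(lat / 8.0) ** 2)                # equatorial band
print(precip_asymmetry_index(itcz_north, lat))       # positive
print(precip_asymmetry_index(symmetric, lat))        # ~0
```

Applied model by model, such an index yields the scalar quantity that the AMIP surface-flux asymmetry is reported to predict across the CMIP5 ensemble.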

  3. Maritime Continent seasonal climate biases in AMIP experiments of the CMIP5 multimodel ensemble

    NASA Astrophysics Data System (ADS)

    Toh, Ying Ying; Turner, Andrew G.; Johnson, Stephanie J.; Holloway, Christopher E.

    2018-02-01

    The fidelity of 28 Coupled Model Intercomparison Project phase 5 (CMIP5) models in simulating mean climate over the Maritime Continent in the Atmospheric Model Intercomparison Project (AMIP) experiment is evaluated in this study. The performance of AMIP models varies greatly in reproducing seasonal mean climate and the seasonal cycle. The multi-model mean has better skill at reproducing the observed mean climate than the individual models. The spatial pattern of 850 hPa wind is better simulated than the precipitation in all four seasons. We found that model horizontal resolution is not a good indicator of model performance. Instead, a model's local Maritime Continent biases are somewhat related to its biases in the local Hadley circulation and global monsoon. The comparison with coupled models in CMIP5 shows that AMIP models generally performed better than coupled models in the simulation of the global monsoon and local Hadley circulation but less well at simulating the Maritime Continent annual cycle of precipitation. To characterize model systematic biases in the AMIP runs, we performed cluster analysis on Maritime Continent annual cycle precipitation. Our analysis resulted in two distinct clusters. Cluster I models are able to capture both the winter monsoon and summer monsoon shift, but they overestimate the precipitation, especially during the JJA and SON seasons. Cluster II models simulate weaker seasonal migration than observed, and the maximum rainfall position stays closer to the equator throughout the year. The tropics-wide properties of these clusters suggest a connection between the skill of simulating global properties of the monsoon circulation and the skill of simulating regional-scale Maritime Continent precipitation.
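The cluster analysis described above can be sketched with a minimal two-cluster k-means on synthetic annual cycles; the data, amplitudes and seeding scheme below are all illustrative, not those of the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "annual cycles" of area-averaged Maritime Continent
# precipitation for 28 hypothetical models: 12 monthly means each.
# One half has a strong seasonal cycle, the other a weak one.
months = np.arange(12)
strong = 6 + 3.0 * np.cos(2 * np.pi * months / 12)   # strong cycle
weak = 6 + 0.5 * np.cos(2 * np.pi * months / 12)     # weak cycle
X = np.vstack([strong + 0.2 * rng.standard_normal((14, 12)),
               weak + 0.2 * rng.standard_normal((14, 12))])

def kmeans2(X, n_iter=20):
    """Two-cluster Lloyd's algorithm, initialised with the two most
    dissimilar annual cycles (a simple deterministic seeding)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    i, j = np.unravel_index(d2.argmax(), d2.shape)
    centers = X[[i, j]]
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        centers = np.array([X[labels == k].mean(axis=0) for k in (0, 1)])
    return labels, centers

labels, centers = kmeans2(X)
# Models sharing the same underlying cycle should share a cluster label.
print(labels)
```

In the study itself the feature vectors would be each model's observed-region annual cycle, and the two resulting clusters correspond to the strong- and weak-migration groups described in the abstract.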

  4. Vegetation/Ecosystem Modeling and Analysis Project: Comparing biogeography and biogeochemistry models in a continental-scale study of terrestrial ecosystem responses to climate change and CO2 doubling

    NASA Astrophysics Data System (ADS)

    1995-12-01

    We compare the simulations of three biogeography models (BIOME2, Dynamic Global Phytogeography Model (DOLY), and Mapped Atmosphere-Plant Soil System (MAPSS)) and three biogeochemistry models (BIOME-BGC (BioGeochemical Cycles), CENTURY, and Terrestrial Ecosystem Model (TEM)) for the conterminous United States under contemporary conditions of atmospheric CO2 and climate. We also compare the simulations of these models under doubled CO2 and a range of climate scenarios. For contemporary conditions, the biogeography models successfully simulate the geographic distribution of major vegetation types and have similar estimates of area for forests (42 to 46% of the conterminous United States), grasslands (17 to 27%), savannas (15 to 25%), and shrublands (14 to 18%). The biogeochemistry models estimate similar continental-scale net primary production (NPP; 3125 to 3772 × 10^12 gC yr^-1) and total carbon storage (108 to 118 × 10^15 gC) for contemporary conditions. Among the scenarios of doubled CO2 and associated equilibrium climates produced by the three general circulation models (Oregon State University (OSU), Geophysical Fluid Dynamics Laboratory (GFDL), and United Kingdom Meteorological Office (UKMO)), all three biogeography models show both gains and losses of total forest area depending on the scenario (between 38 and 53% of conterminous United States area). The only consistent gains in forest area with all three models (BIOME2, DOLY, and MAPSS) were under the GFDL scenario due to large increases in precipitation. MAPSS lost forest area under UKMO, DOLY under OSU, and BIOME2 under both UKMO and OSU. The variability in forest area estimates occurs because the hydrologic cycles of the biogeography models have different sensitivities to increases in temperature and CO2. However, in general, the biogeography models produced broadly similar results when incorporating both climate change and elevated CO2 concentrations.
For these scenarios, the NPP estimated by the biogeochemistry models increases between 2% (BIOME-BGC with UKMO climate) and 35% (TEM with UKMO climate). Changes in total carbon storage range from losses of 33% (BIOME-BGC with UKMO climate) to gains of 16% (TEM with OSU climate). The CENTURY responses of NPP and carbon storage are positive and intermediate to the responses of BIOME-BGC and TEM. The variability in carbon cycle responses occurs because the hydrologic and nitrogen cycles of the biogeochemistry models have different sensitivities to increases in temperature and CO2. When the biogeochemistry models are run with the vegetation distributions of the biogeography models, NPP ranges from no response (BIOME-BGC with all three biogeography model vegetations for UKMO climate) to increases of 40% (TEM with MAPSS vegetation for OSU climate). The total carbon storage response ranges from a decrease of 39% (BIOME-BGC with MAPSS vegetation for UKMO climate) to an increase of 32% (TEM with MAPSS vegetation for OSU and GFDL climates). The UKMO responses of BIOME-BGC with MAPSS vegetation are primarily caused by decreases in forested area and temperature-induced water stress. The OSU and GFDL responses of TEM with MAPSS vegetations are primarily caused by forest expansion and temperature-enhanced nitrogen cycling.

  5. Vegetation/ecosystem modeling and analysis project: Comparing biogeography and biogeochemistry models in a continental-scale study of terrestrial ecosystem responses to climate change and CO2 doubling

    NASA Astrophysics Data System (ADS)

    Melillo, J. M.; Borchers, J.; Chaney, J.; Fisher, H.; Fox, S.; Haxeltine, A.; Janetos, A.; Kicklighter, D. W.; Kittel, T. G. F.; McGuire, A. D.; McKeown, R.; Neilson, R.; Nemani, R.; Ojima, D. S.; Painter, T.

    1995-12-01

    We compare the simulations of three biogeography models (BIOME2, Dynamic Global Phytogeography Model (DOLY), and Mapped Atmosphere-Plant Soil System (MAPSS)) and three biogeochemistry models (BIOME-BGC (BioGeochemical Cycles), CENTURY, and Terrestrial Ecosystem Model (TEM)) for the conterminous United States under contemporary conditions of atmospheric CO2 and climate. We also compare the simulations of these models under doubled CO2 and a range of climate scenarios. For contemporary conditions, the biogeography models successfully simulate the geographic distribution of major vegetation types and have similar estimates of area for forests (42 to 46% of the conterminous United States), grasslands (17 to 27%), savannas (15 to 25%), and shrublands (14 to 18%). The biogeochemistry models estimate similar continental-scale net primary production (NPP; 3125 to 3772 × 10^12 gC yr^-1) and total carbon storage (108 to 118 × 10^15 gC) for contemporary conditions. Among the scenarios of doubled CO2 and associated equilibrium climates produced by the three general circulation models (Oregon State University (OSU), Geophysical Fluid Dynamics Laboratory (GFDL), and United Kingdom Meteorological Office (UKMO)), all three biogeography models show both gains and losses of total forest area depending on the scenario (between 38 and 53% of conterminous United States area). The only consistent gains in forest area with all three models (BIOME2, DOLY, and MAPSS) were under the GFDL scenario due to large increases in precipitation. MAPSS lost forest area under UKMO, DOLY under OSU, and BIOME2 under both UKMO and OSU. The variability in forest area estimates occurs because the hydrologic cycles of the biogeography models have different sensitivities to increases in temperature and CO2. However, in general, the biogeography models produced broadly similar results when incorporating both climate change and elevated CO2 concentrations.
For these scenarios, the NPP estimated by the biogeochemistry models increases between 2% (BIOME-BGC with UKMO climate) and 35% (TEM with UKMO climate). Changes in total carbon storage range from losses of 33% (BIOME-BGC with UKMO climate) to gains of 16% (TEM with OSU climate). The CENTURY responses of NPP and carbon storage are positive and intermediate to the responses of BIOME-BGC and TEM. The variability in carbon cycle responses occurs because the hydrologic and nitrogen cycles of the biogeochemistry models have different sensitivities to increases in temperature and CO2. When the biogeochemistry models are run with the vegetation distributions of the biogeography models, NPP ranges from no response (BIOME-BGC with all three biogeography model vegetations for UKMO climate) to increases of 40% (TEM with MAPSS vegetation for OSU climate). The total carbon storage response ranges from a decrease of 39% (BIOME-BGC with MAPSS vegetation for UKMO climate) to an increase of 32% (TEM with MAPSS vegetation for OSU and GFDL climates). The UKMO responses of BIOME-BGC with MAPSS vegetation are primarily caused by decreases in forested area and temperature-induced water stress. The OSU and GFDL responses of TEM with MAPSS vegetations are primarily caused by forest expansion and temperature-enhanced nitrogen cycling.

  6. Tropical versus extratropical origins of the spurious "double ITCZ" in coupled climate models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiang, Baoqiang; Zhao, Ming; Held, Isaac M.

    The severity of the double Intertropical Convergence Zone (DI) problem in climate models can be measured by a tropical precipitation asymmetry index (PAI), indicating whether tropical precipitation favors the Northern Hemisphere or the Southern Hemisphere. Examination of 19 Coupled Model Intercomparison Project phase 5 models reveals that the PAI is tightly linked to the tropical sea surface temperature (SST) bias. As one of the factors determining the SST bias, the asymmetry of tropical net surface heat flux in Atmospheric Model Intercomparison Project (AMIP) simulations is identified as a skillful predictor of the PAI change from an AMIP to a coupled simulation, with an intermodel correlation of 0.90. Using tropical top-of-atmosphere (TOA) fluxes, the correlations are lower but still strong. However, the extratropical asymmetries of surface and TOA fluxes in AMIP simulations cannot serve as useful predictors of the PAI change. Furthermore, this study suggests that the largest source of the DI bias is from the tropics and from atmospheric models.

  7. An Evaluation of Teleconnections Over the United States in an Ensemble of AMIP Simulations with the MERRA-2 Configuration of the GEOS Atmospheric Model

    NASA Technical Reports Server (NTRS)

    Collow, Allison B. Marquardt; Mahanama, Sarith P.; Bosilovich, Michael G.; Koster, Randal D.; Schubert, Siegfried D.

    2017-01-01

    The atmospheric general circulation model that is used in NASA's Modern Era Retrospective Analysis for Research and Applications Version 2 (MERRA-2) is evaluated with respect to the relationship between large-scale teleconnection patterns and daily temperature and precipitation over the United States (US) using a ten-member ensemble of simulations, referred to as M2AMIP. A focus is placed on four teleconnection patterns that are known to influence weather and climate in the US: El Nino Southern Oscillation, the Pacific Decadal Oscillation, the North Atlantic Oscillation, and the Pacific-North American Pattern. The monthly and seasonal indices associated with the patterns are correlated with daily temperature and precipitation statistics, including: (i) monthly mean 2 m temperature and precipitation, (ii) the frequency of extreme temperature events at the 90th, 95th, and 99th percentiles, and (iii) the frequency and intensity of extreme precipitation events classified at the 90th, 95th, and 99th percentiles. Correlations obtained with M2AMIP data and thus the strength of teleconnections in the free-running model are evaluated through comparison against corresponding correlations computed from observations and from MERRA-2. Overall, the strongest teleconnections in all datasets occur during the winter and coincide with the largest agreement between the observations, MERRA-2, and M2AMIP. When M2AMIP does capture the correlation seen in observations, there is a tendency for the spatial extent to be exaggerated. The weakest agreement between the data sources, for all teleconnection patterns, is in the correlation with extreme precipitation; however, there are discrepancies between the datasets in the number of days with at least 1 mm of precipitation: M2AMIP has too few days with precipitation in the Northwest and the Northern Great Plains and too many days in the Northeast.
In JJA, M2AMIP has too few days with precipitation in the western two-thirds of the country and too many days with precipitation along the east coast.
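The correlation of teleconnection indices with extreme-event frequencies, as computed in this record, can be sketched on synthetic data; the index, temperatures and the 0.8 coupling strength are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic daily 2 m temperature for 30 "winters" of 90 days each,
# with a hypothetical teleconnection index that warms the whole season
# when positive (coupling strength 0.8 is an arbitrary choice).
n_years, n_days = 30, 90
index = rng.standard_normal(n_years)              # e.g. a NAO-like index
temp = rng.standard_normal((n_years, n_days)) + 0.8 * index[:, None]

# Frequency of daily extremes above the pooled 90th percentile,
# computed per year, then correlated with the seasonal index.
thresh = np.percentile(temp, 90)
freq = (temp > thresh).mean(axis=1)
r = np.corrcoef(index, freq)[0, 1]
print(round(r, 2))
```

The same recipe, with the 95th and 99th percentiles and with precipitation intensity in place of frequency, reproduces the full set of statistics compared between observations, MERRA-2 and M2AMIP.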

  8. ENSO-Related Precipitation and Its Statistical Relationship with the Walker Circulation Trend in CMIP5 AMIP Models

    DOE PAGES

    Yim, Bo; Yeh, Sang-Wook; Sohn, Byung-Ju

    2016-01-29

    Observational evidence shows that the Walker circulation (WC) in the tropical Pacific has strengthened in recent decades. In this study, we examine the WC trend for 1979–2005 and its relationship with the precipitation associated with the El Niño Southern Oscillation (ENSO) using the sea surface temperature (SST)-constrained Atmospheric Model Intercomparison Project (AMIP) simulations of the Coupled Model Intercomparison Project Phase 5 (CMIP5) climate models. All of the 29 models show a strengthening of the WC trend in response to an increase in the SST zonal gradient along the equator. Despite the same SST-constrained AMIP simulations, however, a large diversity is found among the CMIP5 climate models in the magnitude of the WC trend. The relationship between the WC trend and precipitation anomalies (PRCPAs) associated with ENSO (ENSO-related PRCPAs) shows that the longitudinal position of the ENSO-related PRCPAs in the western tropical Pacific is closely related to the magnitude of the WC trend. Specifically, it is found that the strengthening of the WC trend is large (small) in the CMIP5 AMIP simulations in which the ENSO-related PRCPAs are located relatively westward (eastward) in the western tropical Pacific. Furthermore, the zonal shift of the ENSO-related precipitation in the western tropical Pacific, which is associated with the climatological mean precipitation in the tropical Pacific, could play an important role in modifying the WC trend in the CMIP5 climate models.

  9. El Nino-Induced Tropical Ocean/Land Energy Exchange in MERRA-2 and M2AMIP

    NASA Technical Reports Server (NTRS)

    Bosilovich, Michael G.; Robertson, Franklin R.

    2017-01-01

    Studies have shown the correlation and connection of surface temperatures across the globe, over ocean and land, with tropical SSTs, especially El Niño. This climate variability greatly influences regional weather and hydroclimate extremes (e.g. drought and flood). In this paper, we evaluate the relationship of temperatures across the tropical oceans and continents in MERRA-2, and also in a newly developed MERRA-2 AMIP ensemble simulation (M2AMIP). M2AMIP uses the same model and spatial resolution as MERRA-2, producing the same output diagnostics over 10 ensemble members. Composite El Niño temperature data are compared with observations to evaluate the land/sea contrast, variations and phase relationship. The temperature variations are related to surface heat fluxes and to atmospheric temperatures and transport, to identify the processes that lead to the lagged redistribution of heat in the tropics and beyond. Discernible cloud, radiation and data assimilation changes accompany the onset of El Niño, affecting continental regions through the progression to and following the peak values. While the model represents these variations in general, regional strengths and weaknesses can be identified.

  10. Analysis of precipitation teleconnections in CMIP models as a measure of model fidelity in simulating precipitation

    NASA Astrophysics Data System (ADS)

    Langenbrunner, B.; Neelin, J.; Meyerson, J.

    2011-12-01

    The accurate representation of precipitation is a recurring issue in global climate models, especially in the tropics. Poor skill in modeling the variability and climate teleconnections associated with El Niño/Southern Oscillation (ENSO) also persisted in the latest Coupled Model Intercomparison Project (CMIP) campaigns. Observed ENSO precipitation teleconnections provide a standard by which we can judge a given model's ability to reproduce precipitation and dynamic feedback processes originating in the tropical Pacific. Using CMIP3 Atmospheric Model Intercomparison Project (AMIP) runs as a baseline, we compare precipitation teleconnections between models and observations, and we evaluate these results against available CMIP5 historical and AMIP runs. Using AMIP simulations restricts evaluation to the atmospheric response, as sea surface temperatures (SSTs) in AMIP are prescribed from observations. We use a rank correlation between ENSO SST indices and precipitation to define teleconnections, since this method is robust to outliers and appropriate for non-Gaussian data. Spatial correlations of the modeled and observed teleconnections are then evaluated. We look at these correlations in regions of strong precipitation teleconnections, including equatorial S. America, the "horseshoe" region in the western tropical Pacific, and southern N. America. For each region and season, we create a "normalized projection" of a given model's teleconnection pattern onto that of the observations, a metric that assesses the quality of regional pattern simulations while rewarding signals of correct sign over the region. Comparing this to an area-averaged (i.e., more generous) metric suggests that models do better when restrictions on exact spatial dependence are loosened and conservation constraints apply. Model fidelity in regional measures remains far from perfect, suggesting intrinsic issues with the models' regional sensitivities in moist processes.
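A minimal version of the two metrics described above, rank correlation for teleconnection strength and a normalized projection onto the observed pattern, might look as follows; the exact normalisation used by the authors is an assumption here, and the data are synthetic.

```python
import numpy as np

def rankdata(x):
    """Simple ranks (no tie handling; fine for continuous data)."""
    ranks = np.empty(len(x), dtype=float)
    ranks[np.argsort(x)] = np.arange(1, len(x) + 1)
    return ranks

def spearman(x, y):
    """Rank correlation: robust to outliers, makes no Gaussian assumption."""
    return np.corrcoef(rankdata(x), rankdata(y))[0, 1]

def normalized_projection(model_pattern, obs_pattern):
    """Projection of the model teleconnection pattern onto the observed
    one, normalised by the observed amplitude; 1 means a match in both
    pattern and amplitude (one plausible reading of the metric)."""
    obs, mod = obs_pattern.ravel(), model_pattern.ravel()
    return np.dot(mod, obs) / np.dot(obs, obs)

rng = np.random.default_rng(1)
nino34 = rng.standard_normal(30)                 # 30 seasons of an SST index
precip = 2.0 * nino34 + rng.standard_normal(30)  # one grid point's rainfall
print(round(spearman(nino34, precip), 2))

obs = rng.standard_normal((10, 10))              # "observed" teleconnection map
model = 0.8 * obs + 0.1 * rng.standard_normal((10, 10))
print(round(normalized_projection(model, obs), 2))
```

Mapping `spearman` over every grid point of a model's precipitation field against its ENSO index yields the teleconnection pattern that `normalized_projection` then scores region by region.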

  11. Co-variation of Temperature and Precipitation in CMIP5 Models and Satellite Observations

    NASA Technical Reports Server (NTRS)

    Liu, Chunlei; Allan, Richard P.; Huffman, George J.

    2013-01-01

    Current variability of precipitation (P) and its response to surface temperature (T) are analysed using coupled (CMIP5) and atmosphere-only (AMIP5) climate model simulations and compared with observational estimates. There is striking agreement between Global Precipitation Climatology Project (GPCP) observed and AMIP5-simulated P anomalies over land, both globally and in the tropics, suggesting that prescribed sea surface temperature and realistic radiative forcings are sufficient for simulating the interannual variability in continental P. Differences between the observed and simulated P variability over the ocean originate primarily from the wet tropical regions, in particular the western Pacific, but are reduced slightly after 1995. All datasets show positive responses of P to T globally of around 2 %/K for simulations and 3-4 %/K in GPCP observations, but model responses over the tropical oceans are around 3 times smaller than GPCP over the period 1988-2005. The observed anticorrelation between land and ocean P, linked with El Niño Southern Oscillation, is captured by the simulations. All data sets over the tropical ocean show a tendency for wet regions to become wetter and dry regions drier with warming. Over the wet region (greater than or equal to the 75th precipitation percentile), the precipitation response is 13-15 %/K for GPCP and 5 %/K for models, while trends in P are 2.4 %/decade for GPCP, 0.6 %/decade for CMIP5 and 0.9 %/decade for AMIP5, suggesting that models are underestimating the precipitation responses or a deficiency exists in the satellite datasets.
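The "% per K" precipitation responses quoted above are essentially regression slopes of precipitation anomalies (in percent) on temperature anomalies (in K). A sketch on synthetic anomalies, with an assumed 3 %/K underlying sensitivity:

```python
import numpy as np

def precip_response(p_anom_pct, t_anom):
    """Least-squares slope of precipitation anomalies (%) against
    temperature anomalies (K): the '% per K' scaling discussed above."""
    slope, _ = np.polyfit(t_anom, p_anom_pct, 1)
    return slope

rng = np.random.default_rng(7)
t = 0.3 * rng.standard_normal(25)            # interannual T anomalies (K)
p = 3.0 * t + 0.4 * rng.standard_normal(25)  # % anomalies, ~3 %/K plus noise
print(round(precip_response(p, t), 1))
```

Applying the same slope estimate separately to GPCP, CMIP5 and AMIP5 anomaly series is what produces the 2-4 %/K comparisons in the abstract; the decadal trends are the analogous slopes against time.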

  12. Summer U.S. Surface Air Temperature Variability: Controlling Factors and AMIP Simulation Biases

    NASA Astrophysics Data System (ADS)

    Merrifield, A.; Xie, S. P.

    2016-02-01

    This study documents and investigates biases in simulating summer surface air temperature (SAT) variability over the continental U.S. in the Coupled Model Intercomparison Project phase 5 (CMIP5) Atmospheric Model Intercomparison Project (AMIP). Empirical orthogonal function (EOF) and multivariate regression analyses are used to assess the relative importance of circulation and the land surface feedback in setting summer SAT over a 30-year period (1979-2008). In observations, regions of high SAT variability are closely associated with midtropospheric highs and subsidence, consistent with adiabatic theory (Meehl and Tebaldi 2004, Lau and Nath 2012). Preliminary analysis shows the majority of the AMIP models feature high SAT variability over the central U.S., displaced south and/or west of observed centers of action (COAs). SAT COAs in models tend to be concomitant with regions of high sensible heat flux variability, suggesting that an excessive land surface feedback in these models modulates U.S. summer SAT. Additionally, tropical sea surface temperatures (SSTs) play a role in forcing the leading EOF mode for summer SAT, in concert with internal atmospheric variability. There is evidence that models respond to different SST patterns than observed. Addressing issues with the bulk land surface feedback and the SST-forced component of atmospheric variability may be key to improving model skill in simulating summer SAT variability over the U.S.
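The EOF analysis underpinning this record can be sketched as a singular value decomposition of the (time × space) anomaly matrix; the synthetic SAT field below stands in for the real model output.

```python
import numpy as np

def eof_analysis(field, n_modes=1):
    """Leading EOFs of a (time, space) anomaly matrix via SVD.
    Returns spatial patterns, principal-component time series and the
    fraction of variance explained by each mode."""
    anom = field - field.mean(axis=0)
    u, s, vt = np.linalg.svd(anom, full_matrices=False)
    var_frac = s**2 / np.sum(s**2)
    pcs = u[:, :n_modes] * s[:n_modes]       # PC time series
    eofs = vt[:n_modes]                      # spatial patterns
    return eofs, pcs, var_frac[:n_modes]

# Synthetic SAT field: one dominant standing pattern plus weak noise.
rng = np.random.default_rng(3)
t = rng.standard_normal(30)                  # 30 summers
pattern = np.sin(np.linspace(0, np.pi, 50))  # idealised spatial mode
field = np.outer(t, pattern) + 0.1 * rng.standard_normal((30, 50))
eofs, pcs, var = eof_analysis(field)
print(round(var[0], 2))                      # mode 1 dominates
```

In the study, the leading EOF of summer SAT is then regressed against circulation and surface-flux fields to separate the circulation and land-surface-feedback contributions; note that EOF signs are arbitrary and must be fixed by convention before comparing models.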

  13. ENSO-Related Precipitation and Its Statistical Relationship with the Walker Circulation Trend in CMIP5 AMIP Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yim, Bo; Yeh, Sang-Wook; Sohn, Byung-Ju

    Observational evidence shows that the Walker circulation (WC) in the tropical Pacific has strengthened in recent decades. In this study, we examine the WC trend for 1979–2005 and its relationship with the precipitation associated with the El Niño Southern Oscillation (ENSO) using the sea surface temperature (SST)-constrained Atmospheric Model Intercomparison Project (AMIP) simulations of the Coupled Model Intercomparison Project Phase 5 (CMIP5) climate models. All of the 29 models show a strengthening of the WC trend in response to an increase in the SST zonal gradient along the equator. Despite the same SST-constrained AMIP simulations, however, a large diversity is found among the CMIP5 climate models in the magnitude of the WC trend. The relationship between the WC trend and precipitation anomalies (PRCPAs) associated with ENSO (ENSO-related PRCPAs) shows that the longitudinal position of the ENSO-related PRCPAs in the western tropical Pacific is closely related to the magnitude of the WC trend. Specifically, it is found that the strengthening of the WC trend is large (small) in the CMIP5 AMIP simulations in which the ENSO-related PRCPAs are located relatively westward (eastward) in the western tropical Pacific. Furthermore, the zonal shift of the ENSO-related precipitation in the western tropical Pacific, which is associated with the climatological mean precipitation in the tropical Pacific, could play an important role in modifying the WC trend in the CMIP5 climate models.

  14. Parallelization and Visual Analysis of Multidimensional Fields: Application to Ozone Production, Destruction, and Transport in Three Dimensions

    NASA Technical Reports Server (NTRS)

    Schwan, Karsten; Alyea, Fred; Ribarsky, M. William; Trauner, Mary; Eisenhauer, Greg; Jean, Yves; Gu, Weiming; Wang, Ray; Waldrop, Jeffrey; Schroeder, Beth

    1996-01-01

    The three-dimensional, spectral transport model used in the current project was first successfully integrated over climatological time scales by Dr. Guang Ping Lou for the simulation of atmospheric N2O using the United Kingdom Meteorological Office (UKMO) 4-dimensional, assimilated wind and temperature data set. A non-parallel, FORTRAN version of this integration using a fairly simple N2O chemistry package containing only photochemical reactions was used to verify our initial parallel model results. The integrations reproduced the gross features of the observed stratospheric climatological N2O distributions but also simulated the structure of the stratospheric Antarctic vortex and its evolution. Subsequently, Dr. Thomas Kindler, who produced much of the parallel version of our model, enlarged the N2O model chemistry package to include N2O reactions involving O(1D) and also introduced assimilated wind data from NASA as well as UKMO. Initially, transport calculations without chemistry were run using Carbon-14 as a non-reactive tracer gas, with the result that large differences in the transport properties of the two assimilated wind data sets were apparent from the resultant Carbon-14 distributions. Subsequent calculations for N2O, including its chemistry, with the two input wind data sets, with verification from UARS satellite observations, have refined the transport differences between the two such that the model's steering capabilities could be used to infer the correct climatological vertical velocity fields required to support the N2O observations. During this process, it was also discovered that both the NASA and the UKMO data contained spurious values in some of the higher frequency wave components, leading to incorrect local transport calculations and ultimately affecting the large scale properties of the model's N2O distributions, particularly at tropical latitudes.
Subsequent model runs with wind data that had been filtered to remove some of the high frequency components produced much more realistic N2O distributions. During the past few months, the UKMO wind data base for a complete two-year period was processed into spectral form for model use. This new version of the input transport data base now includes complete temperature fields as well as the necessary wind data. This was done to facilitate advanced chemical calculations in the parallel model which often depend upon temperature. Additional UKMO data is being added as it becomes available.

  15. The North Pacific as a Regulator of Summertime Climate Over North America and the Asian Monsoon

    NASA Technical Reports Server (NTRS)

    Lau, William K. M.; Wang, H.

    2004-01-01

    The interannual variability of summertime rainfall over the U.S. may be linked to climate anomalies over the Pacific and East Asia through teleconnection patterns that may be components of recurring global climate modes in boreal summer (Lau and Weng 2002). In this study, maintenance of the boreal summer teleconnection patterns is investigated. The particular focus is on the potential effects of North Pacific air-sea interaction on climate anomalies over the U.S. Observational data, reanalysis and outputs of a series of NASA NSIPP AGCM experiments, both uncoupled and coupled to the NASA GSFC mixed-layer ocean (MLO) model, are used. Statistical analysis of observations and NSIPP AMIP-type simulations indicates that the interannual variability of observed warm-season precipitation over the U.S. is related to SST variation in both the tropical and North Pacific, whereas the NSIPP AMIP-simulated summertime U.S. precipitation variation mainly reflects the impact of ENSO in the tropical Pacific. This implies the potential importance of air-sea interaction in the North Pacific in contributing to the interannual variability of observed summer climate over the U.S. The anomalous atmospheric circulation associated with the dominant summertime teleconnection modes in both observations and NSIPP AMIP simulations is further diagnosed using a stationary wave modeling approach. In observations, for the two dominant modes, both anomalous diabatic heating and anomalous transients contribute significantly to the anomalous circulation. The distributions of the anomalous diabatic heating and transient forcing are configured in quadrature over the North Pacific and North America, so that both forcings act constructively to maintain the teleconnection patterns. The contrast between observations and NSIPP AMIP simulations in the stationary wave modeling diagnosis confirms the previous conclusion based on statistical analysis.
To better appreciate the role of extra-tropical air-sea interaction in maintaining the summertime teleconnection pattern, various dynamical and physical fields and their interlinkage in the series of NSIPP AGCM and coupled AGCM-MLO model experiments are examined in depth. Based on comparison between different model experiments, we will discuss the physical and dynamical mechanisms through which air-sea interaction in the extratropics, and transient mean flow interactions over the North Pacific, affect interannual variation of U.S. climate during boreal summer.

  16. Use of circulation types classifications to evaluate AR4 climate models over the Euro-Atlantic region

    NASA Astrophysics Data System (ADS)

    Pastor, M. A.; Casado, M. J.

    2012-10-01

    This paper presents an evaluation of the multi-model simulations for the 4th Assessment Report (AR4) of the Intergovernmental Panel on Climate Change (IPCC) in terms of their ability to simulate the ERA40 circulation types over the Euro-Atlantic region in the winter season. Two classification schemes, k-means and SANDRA, have been considered to test the sensitivity of the evaluation results to the classification procedure. The assessment allows different model rankings to be established according to the spatial and temporal features of the circulation types. Regarding temporal characteristics, all AR4 models generally tend to underestimate the frequency of occurrence. The best model at simulating the spatial characteristics is UKMO-HadGEM1, whereas CCSM3, UKMO-HadGEM1 and CGCM3.1(T63) are the best at simulating the temporal features, for both classification schemes. This result agrees with the AR4 model ranking obtained when analysing the ability of the same models to simulate Euro-Atlantic variability modes. This study demonstrates the utility of such a synoptic climatology approach as a diagnostic tool for model assessment. The ability of models to properly reproduce the position of ridges and troughs and the frequency of synoptic patterns will therefore improve our confidence in their response to future climate changes.
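
    As a rough illustration of the first of the two schemes, a minimal k-means classification of daily circulation maps can be sketched as follows. The grid size, the number of circulation types, and the synthetic anomaly fields are assumptions for illustration, not the study's actual ERA40 setup.

```python
import numpy as np

def kmeans(X, k, n_iter=50, seed=0):
    """Minimal k-means: assign each row of X (one daily map) to one of k types."""
    rng = np.random.default_rng(seed)
    # initialize centroids from k randomly chosen daily maps
    centroids = X[rng.choice(len(X), size=k, replace=False)].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        # squared distance of every daily map to every centroid
        d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):  # update each centroid as its cluster mean
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# Synthetic stand-in for winter-day height-anomaly maps (days x grid points)
rng = np.random.default_rng(1)
z500_anom = rng.normal(0.0, 1.0, size=(900, 600))

labels, patterns = kmeans(z500_anom, k=9)
# Frequency of occurrence per type -- the temporal feature evaluated above
freq = np.bincount(labels, minlength=9) / len(labels)
```

    The centroid maps in `patterns` play the role of the circulation types whose spatial characteristics are compared between models and the reanalysis, and `freq` the temporal characteristic (frequency of occurrence).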

  17. Utilization of Short-Simulations for Tuning High-Resolution Climate Model

    NASA Astrophysics Data System (ADS)

    Lin, W.; Xie, S.; Ma, P. L.; Rasch, P. J.; Qian, Y.; Wan, H.; Ma, H. Y.; Klein, S. A.

    2016-12-01

    Many physical parameterizations in atmospheric models are sensitive to resolution. Tuning models that involve a multitude of parameters at high resolution is computationally expensive, particularly when relying primarily on multi-year simulations. This work describes a complementary set of strategies for tuning high-resolution atmospheric models, using ensembles of short simulations to reduce the computational cost and elapsed time. Specifically, we utilize the hindcast approach developed through the DOE Cloud Associated Parameterization Testbed (CAPT) project for high-resolution model tuning, guided by a combination of short (<10 days) and longer (1 year) Perturbed Parameter Ensemble (PPE) simulations at low resolution to identify the sensitivity of model features to parameter changes. CAPT tests have been found effective in numerous previous studies for identifying model biases due to parameterized fast physics, and we demonstrate that the approach is also useful for tuning. After the most egregious errors are addressed in an initial "rough" tuning phase, longer simulations are performed to "home in" on model features that evolve over longer timescales. We explore these strategies to tune the DOE ACME (Accelerated Climate Modeling for Energy) model. For the ACME model at 0.25° resolution, it is confirmed that, given the same parameters, major biases in global mean statistics and many spatial features are consistent between Atmospheric Model Intercomparison Project (AMIP)-type simulations and CAPT-type hindcasts, with just a small number of short-term simulations required for the latter over the corresponding season. Using CAPT hindcasts to find parameter choices that reduce large model biases dramatically improves the turnaround time for tuning at high resolution, and improvement seen in CAPT hindcasts generally translates to improved AMIP-type simulations.
An iterative CAPT-AMIP tuning approach is therefore adopted during each major tuning cycle, with the former used to survey the likely responses and narrow the parameter space, and the latter to verify the results in a climate context, along with more detailed assessment, once an educated set of parameter choices is selected. Limitations of using short-term simulations for tuning climate models are also discussed.

  18. Predictability of the 1997 and 1998 South Asian Summer Monsoons on the Intraseasonal Time Scale Based on 10 AMIP2 Model Runs

    NASA Technical Reports Server (NTRS)

    Wu, Man Li C.; Schubert, Siegfried; Einaudi, Franco (Technical Monitor)

    2000-01-01

    The predictability of the 1997 and 1998 South Asian summer monsoons is examined using National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalyses and 100 two-year simulations with ten different Atmospheric General Circulation Models (AGCMs) forced with prescribed sea surface temperature (SST). We focus on the intraseasonal variations of the South Asian summer monsoon associated with the Madden-Julian Oscillation (MJO). The NCEP/NCAR reanalysis shows a clear coupling between SST anomalies and upper-level velocity potential anomalies associated with the MJO. We analyze several MJO events that developed during 1997 and 1998, focusing on the coupling with SST, and carry out the same analysis for the model simulations. Remarkably, the ensemble mean of the two-year AGCM simulations shows a signature of the observed MJO events. The ensemble-mean simulated MJO events are approximately in phase with the observed events, although they are weaker, their period of oscillation is somewhat longer, and their onset is delayed by about ten days compared with the observations. Details of the analysis and comparisons among the ten AMIP2 (Atmospheric Model Intercomparison Project) models will be presented at the conference.

  19. An Intercomparison of Changes Associated with Earth's Lower Tropospheric Temperature Using Traditional and AMIP-Style Reanalyses

    NASA Technical Reports Server (NTRS)

    Marquardt-Collow, Allison B.; Bosilovich, Michael G.; Cullather, Richard I.

    2017-01-01

    Reanalyses have become an integral tool for evaluating regional and global climate variations, and an important component of this is modifications to the energy budget. Reductions in Arctic sea ice extent have induced an albedo feedback, causing the Arctic to warm more rapidly than anywhere else in the world, a phenomenon referred to as "Arctic Amplification." This has been demonstrated by observations and numerous reanalyses, including the Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2). However, the Arctic Amplification signal is absent in a ten-member ensemble of MERRA-2 Atmospheric Model Intercomparison Project (M2AMIP) simulations that use the same prescribed climate forcings, including Sea Surface Temperature (SST) and sea ice. An evaluation of the lower-tropospheric temperature tendency due to radiation, moisture, and dynamics, as well as of the surface energy budget, in MERRA-2 and M2AMIP will demonstrate that, despite identical prescribed SSTs and sea ice in both versions, the enhanced Arctic warming in MERRA-2 is a response to the analysis increment tendency due to temperature observations. Furthermore, the roles of boundary conditions, model biases, and changes in observing systems in the Arctic Amplification signal will be assessed. The literature on Arctic Amplification shows that the enhanced warming begins in the mid-1990s. Anomalously warm Arctic SSTs in the early part of the MERRA-2 period can mute the trend in Arctic lower-tropospheric temperature without the constraint of observations in M2AMIP. Additionally, MERRA-2 uses three distinct datasets of SST and sea ice concentration, which also play a role.

  20. The GFDL global atmosphere and land model AM4.0/LM4.0: 1. Simulation characteristics with prescribed SSTs

    USGS Publications Warehouse

    Zhao, M.; Golaz, J.-C.; Held, I. M.; Guo, H.; Balaji, V.; Benson, R.; Chen, J.-H.; Chen, X.; Donner, L. J.; Dunne, J. P.; Dunne, Krista A.; Durachta, J.; Fan, S.-M.; Freidenreich, S. M.; Garner, S. T.; Ginoux, P.; Harris, L. M.; Horowitz, L. W.; Krasting, J. P.; Langenhorst, A. R.; Liang, Z.; Lin, P.; Lin, S.-J.; Malyshev, S. L.; Mason, E.; Milly, Paul C.D.; Ming, Y.; Naik, V.; Paulot, F.; Paynter, D.; Phillipps, P.; Radhakrishnan, A.; Ramaswamy, V.; Robinson, T.; Schwarzkopf, D.; Seman, C. J.; Shevliakova, E.; Shen, Z.; Shin, H.; Silvers, L.; Wilson, J. R.; Winton, M.; Wittenberg, A. T.; Wyman, B.; Xiang, B.

    2018-01-01

    In this two‐part paper, a description is provided of a version of the AM4.0/LM4.0 atmosphere/land model that will serve as a base for a new set of climate and Earth system models (CM4 and ESM4) under development at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL). This version, with roughly 100 km horizontal resolution and 33 levels in the vertical, contains an aerosol model that generates aerosol fields from emissions and a “light” chemistry mechanism designed to support the aerosol model but with prescribed ozone. In Part 1, the quality of the simulation in AMIP (Atmospheric Model Intercomparison Project) mode—with prescribed sea surface temperatures (SSTs) and sea‐ice distribution—is described and compared with previous GFDL models and with the CMIP5 archive of AMIP simulations. The model's Cess sensitivity (response in the top‐of‐atmosphere radiative flux to uniform warming of SSTs) and effective radiative forcing are also presented. In Part 2, the model formulation is described more fully and key sensitivities to aspects of the model formulation are discussed, along with the approach to model tuning.

  1. The GFDL Global Atmosphere and Land Model AM4.0/LM4.0: 1. Simulation Characteristics With Prescribed SSTs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Ming; Golaz, J. -C.; Held, I. M.

    In this two-part paper, a description is provided of a version of the AM4.0/LM4.0 atmosphere/land model that will serve as a base for a new set of climate and Earth system models (CM4 and ESM4) under development at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL). This version, with roughly 100 km horizontal resolution and 33 levels in the vertical, contains an aerosol model that generates aerosol fields from emissions and a "light" chemistry mechanism designed to support the aerosol model but with prescribed ozone. In Part 1, the quality of the simulation in AMIP (Atmospheric Model Intercomparison Project) mode—with prescribed sea surface temperatures (SSTs) and sea-ice distribution—is described and compared with previous GFDL models and with the CMIP5 archive of AMIP simulations. Here, the model's Cess sensitivity (response in the top-of-atmosphere radiative flux to uniform warming of SSTs) and effective radiative forcing are also presented. In Part 2, the model formulation is described more fully and key sensitivities to aspects of the model formulation are discussed, along with the approach to model tuning.

  2. The GFDL Global Atmosphere and Land Model AM4.0/LM4.0: 1. Simulation Characteristics With Prescribed SSTs

    DOE PAGES

    Zhao, Ming; Golaz, J. -C.; Held, I. M.; ...

    2018-02-19

    In this two-part paper, a description is provided of a version of the AM4.0/LM4.0 atmosphere/land model that will serve as a base for a new set of climate and Earth system models (CM4 and ESM4) under development at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL). This version, with roughly 100 km horizontal resolution and 33 levels in the vertical, contains an aerosol model that generates aerosol fields from emissions and a "light" chemistry mechanism designed to support the aerosol model but with prescribed ozone. In Part 1, the quality of the simulation in AMIP (Atmospheric Model Intercomparison Project) mode—with prescribed sea surface temperatures (SSTs) and sea-ice distribution—is described and compared with previous GFDL models and with the CMIP5 archive of AMIP simulations. Here, the model's Cess sensitivity (response in the top-of-atmosphere radiative flux to uniform warming of SSTs) and effective radiative forcing are also presented. In Part 2, the model formulation is described more fully and key sensitivities to aspects of the model formulation are discussed, along with the approach to model tuning.

  3. The GFDL Global Atmosphere and Land Model AM4.0/LM4.0: 1. Simulation Characteristics With Prescribed SSTs

    NASA Astrophysics Data System (ADS)

    Zhao, M.; Golaz, J.-C.; Held, I. M.; Guo, H.; Balaji, V.; Benson, R.; Chen, J.-H.; Chen, X.; Donner, L. J.; Dunne, J. P.; Dunne, K.; Durachta, J.; Fan, S.-M.; Freidenreich, S. M.; Garner, S. T.; Ginoux, P.; Harris, L. M.; Horowitz, L. W.; Krasting, J. P.; Langenhorst, A. R.; Liang, Z.; Lin, P.; Lin, S.-J.; Malyshev, S. L.; Mason, E.; Milly, P. C. D.; Ming, Y.; Naik, V.; Paulot, F.; Paynter, D.; Phillipps, P.; Radhakrishnan, A.; Ramaswamy, V.; Robinson, T.; Schwarzkopf, D.; Seman, C. J.; Shevliakova, E.; Shen, Z.; Shin, H.; Silvers, L. G.; Wilson, J. R.; Winton, M.; Wittenberg, A. T.; Wyman, B.; Xiang, B.

    2018-03-01

    In this two-part paper, a description is provided of a version of the AM4.0/LM4.0 atmosphere/land model that will serve as a base for a new set of climate and Earth system models (CM4 and ESM4) under development at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL). This version, with roughly 100 km horizontal resolution and 33 levels in the vertical, contains an aerosol model that generates aerosol fields from emissions and a "light" chemistry mechanism designed to support the aerosol model but with prescribed ozone. In Part 1, the quality of the simulation in AMIP (Atmospheric Model Intercomparison Project) mode—with prescribed sea surface temperatures (SSTs) and sea-ice distribution—is described and compared with previous GFDL models and with the CMIP5 archive of AMIP simulations. The model's Cess sensitivity (response in the top-of-atmosphere radiative flux to uniform warming of SSTs) and effective radiative forcing are also presented. In Part 2, the model formulation is described more fully and key sensitivities to aspects of the model formulation are discussed, along with the approach to model tuning.

  4. Understanding the Central Equatorial African long-term drought using AMIP-type simulations

    NASA Astrophysics Data System (ADS)

    Hua, Wenjian; Zhou, Liming; Chen, Haishan; Nicholson, Sharon E.; Jiang, Yan; Raghavendra, Ajay

    2018-02-01

    Previous studies show that Indo-Pacific sea surface temperature (SST) variations may help to explain the observed long-term drought during April-May-June (AMJ) since the 1990s over Central Equatorial Africa (CEA). However, the underlying physical mechanisms for this drought are still not clear due to observational limitations. Here we use AMIP-type simulations with 24 ensemble members, forced by observed SSTs, from the ECHAM4.5 model to explore the likely physical processes that determine the rainfall variations over CEA. We not only examine the ensemble mean (EM), but also compare the "good" and "poor" ensemble members to understand the intra-ensemble variability. In general, the EM and the "good" ensemble members can simulate the drought and the associated reduced vertical velocity and anomalous anti-cyclonic circulation in the lower troposphere. However, the "poor" ensemble members cannot simulate the drought and the associated circulation patterns. These contrasts indicate that the drought is tightly associated with the tropical Walker circulation and atmospheric teleconnection patterns: if the observed circulation patterns are not reproduced, the CEA drought is not captured. Despite the large intra-ensemble spread, the model simulations indicate an essential role of SST forcing in causing the drought. These results suggest that the long-term drought may result from tropical Indo-Pacific SST variations associated with the enhanced and westward-extended tropical Walker circulation.

  5. Tropical Indian Ocean warming contributions to China winter climate trends since 1960

    NASA Astrophysics Data System (ADS)

    Wu, Qigang; Yao, Yonghong; Liu, Shizuo; Cao, DanDan; Cheng, Luyao; Hu, Haibo; Sun, Leng; Yao, Ying; Yang, Zhiqi; Gao, Xuxu; Schroeder, Steven R.

    2018-01-01

    This study investigates observed and modeled contributions of global sea surface temperature (SST) to China winter climate trends in 1960-2014, including increased precipitation, warming through about 1997, and cooling since then. Observations and Atmospheric Model Intercomparison Project (AMIP) simulations with prescribed historical SST and sea ice show that tropical Indian Ocean (TIO) warming and increasing rainfall cause diabatic heating that generates a tropospheric wave train with anticyclonic 500-hPa height anomaly centers in the TIO or equatorial western Pacific (TIWP) and northeastern Eurasia (EA) and a cyclonic anomaly over China, referred to as the TIWP-EA wave train. The cyclonic anomaly causes Indochina moisture convergence and southwesterly moist flow that enhances South China precipitation, while the northern anticyclone enhances cold surges, sometimes causing severe ice storms. The AMIP simulations reproduce the 1960-1997 China cooling trend by simulating increasing instead of decreasing Arctic 500-hPa heights, which move the northern anticyclone into Siberia but enlarge the cyclonic anomaly, so realistic China precipitation trend patterns are still simulated. A separate idealized TIO SST warming simulation reproduces the TIWP-EA feature more realistically, with correct precipitation patterns, and supports the TIWP-EA teleconnection as the primary mechanism for the long-term increase in South China precipitation since 1960. Coupled Model Intercomparison Project (CMIP) experiments simulate a reduced TIO SST warming trend and weak precipitation trends, so the TIWP-EA feature is absent and strong drying is simulated in South China for 1960-1997. These simulations highlight the need for accurately modeled SST to correctly attribute regional climate trends.

  6. Coupled ocean-atmosphere models feature systematic delay in Indian monsoon onset compared to their atmosphere-only component

    NASA Astrophysics Data System (ADS)

    Turner, Andrew

    2014-05-01

    In this study we examine monsoon onset characteristics in 20th century historical and AMIP integrations of the CMIP5 multi-model database. We use the period 1979-2005, common to both the AMIP and historical integrations. While all available observed boundary conditions, including sea-surface temperature (SST), are prescribed in the AMIP integrations, the historical integrations feature ocean-atmosphere models that generate SSTs via air-sea coupled processes. The onset of Indian monsoon rainfall is shown to be systematically earlier in the AMIP integrations, both when comparing groups of models that provide both experiments and in the multi-model ensemble means for each experiment in turn. We also test some common circulation indices of the monsoon onset, including the horizontal shear in the lower troposphere and wind kinetic energy. Since the AMIP integrations are forced by observed SSTs, and CMIP5 models are known to have large cold SST biases in the northern Arabian Sea during winter and spring that limit their monsoon rainfall, we relate the delayed onset in the coupled historical integrations to these cold Arabian Sea SST biases. This study provides further motivation for correcting cold SST biases in the Arabian Sea in coupled models.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sakaguchi, Koichi; Leung, Lai-Yung R.; Zhao, Chun

    This study presents a diagnosis of a multi-resolution approach using the Model for Prediction Across Scales - Atmosphere (MPAS-A) for simulating regional climate. Four AMIP experiments are conducted for 1999-2009. In the first two experiments, MPAS-A is configured using global quasi-uniform grids at 120 km and 30 km grid spacing. In the other two experiments, MPAS-A is configured using variable-resolution (VR) mesh with local refinement at 30 km over North America and South America embedded inside a quasi-uniform domain at 120 km elsewhere. Precipitation and related fields in the four simulations are examined to determine how well the VR simulations reproduce the features simulated by the globally high-resolution model in the refined domain. In previous analyses of idealized aqua-planet simulations, the characteristics of the global high-resolution simulation in moist processes only developed near the boundary of the refined region. In contrast, the AMIP simulations with VR grids are able to reproduce the high-resolution characteristics across the refined domain, particularly in South America. This indicates the importance of finely resolved lower-boundary forcing such as topography and surface heterogeneity for the regional climate, and demonstrates the ability of the MPAS-A VR to replicate the large-scale moisture transport as simulated in the quasi-uniform high-resolution model. Outside of the refined domain, some upscale effects are detected through large-scale circulation but the overall climatic signals are not significant at regional scales. Our results provide support for the multi-resolution approach as a computationally efficient and physically consistent method for modeling regional climate.

  8. Understanding the tropical warm temperature bias simulated by climate models

    NASA Astrophysics Data System (ADS)

    Brient, Florent; Schneider, Tapio

    2017-04-01

    State-of-the-art coupled general circulation models have difficulty representing the observed spatial pattern of surface temperature. A majority of them suffer from a warm bias in the tropical subsiding regions located over the eastern parts of the ocean basins. These regions are usually covered by low-level clouds, ranging from stratus along the coasts to more vertically developed shallow cumulus farther offshore, and models usually fail to represent this transition accurately. Here we investigate the physical drivers of this warm bias in CMIP5 models from a near-surface energy budget perspective. We show that overestimated solar insolation due to a lack of stratocumulus mostly explains the warm bias. The bias also arises partly from inter-model differences in surface fluxes that can be traced to differences in near-surface relative humidity and the air-sea temperature gradient. We investigate the role of the atmosphere in driving surface biases by comparing historical and atmospheric (AMIP) experiments. We show that some differences in boundary-layer characteristics, mostly those related to cloud fraction and relative humidity, are already present in the AMIP experiments and may be the drivers of the coupled biases. This gives insight into how models can be improved for better simulations of the tropical climate.

  9. A Comparison of Five Numerical Weather Prediction Analysis Climatologies in Southern High Latitudes.

    NASA Astrophysics Data System (ADS)

    Connolley, William M.; Harangozo, Stephen A.

    2001-01-01

    In this paper, numerical weather prediction analyses from four major centers are compared: the Australian Bureau of Meteorology (ABM), the European Centre for Medium-Range Weather Forecasts (ECMWF), the U.S. National Centers for Environmental Prediction-National Center for Atmospheric Research (NCEP-NCAR), and The Met. Office (UKMO). Two of the series, the ECMWF reanalysis (ERA) and the NCEP-NCAR reanalysis (NNR), are "reanalyses"; that is, the data have recently been processed through a consistent, modern analysis system. The other three (ABM, ECMWF operational (EOP), and UKMO) are archived from operational analyses. The primary focus in this paper is on the period 1979-93, the period used for the reanalyses, and on climatology. However, ABM and NNR are also compared for the period before 1979, for which the evidence tends to favor NNR. The authors are concerned with basic variables (mean sea level pressure, height of the 500-hPa surface, and near-surface temperature) that are available from the basic analysis step, rather than more derived quantities (such as precipitation), which are available only from the forecast step. Direct comparisons against station observations, intercomparisons of the spatial pattern of the analyses, and intercomparisons of the temporal variation indicate that ERA, EOP, and UKMO are best for sea level pressure; that UKMO and EOP are best for 500-hPa height; and that none of the analyses perform well for near-surface temperature.

  10. Simulations of Eurasian winter temperature trends in coupled and uncoupled CFSv2

    NASA Astrophysics Data System (ADS)

    Collow, Thomas W.; Wang, Wanqiu; Kumar, Arun

    2018-01-01

    Conflicting results have been presented regarding the link between Arctic sea-ice loss and midlatitude cooling, particularly over Eurasia. This study analyzes uncoupled (atmosphere-only) and coupled (ocean-atmosphere) simulations by the Climate Forecast System, version 2 (CFSv2), to examine this linkage during the Northern Hemisphere winter, focusing on the simulation of the observed surface cooling trend over Eurasia during the last three decades. The uncoupled simulations are Atmospheric Model Intercomparison Project (AMIP) runs forced with mean seasonal cycles of sea surface temperature (SST) and sea ice, using combinations of SST and sea ice from different time periods to assess the role that each plays individually, and to assess the role of atmospheric internal variability. Coupled runs are used to further investigate the role of internal variability via the analysis of initialized predictions and the evolution of the forecast with lead time. The AMIP simulations show a mean warming response over Eurasia due to SST changes, but little response to changes in sea ice. Individual runs simulate cooler periods over Eurasia, and this is shown to be concurrent with a stronger Siberian high and warming over Greenland. No substantial differences in the variability of Eurasian surface temperatures are found between the different model configurations. In the coupled runs, the region of significant warming over Eurasia is small at short leads, but increases at longer leads. It is concluded that, although the models have some capability in highlighting the temperature variability over Eurasia, the observed cooling may still be a consequence of internal variability.

  11. Comparison of large-scale dynamical variability in the extratropical stratosphere among the JRA-55 family data sets: impacts of assimilation of observational data in JRA-55 reanalysis data

    NASA Astrophysics Data System (ADS)

    Taguchi, Masakazu

    2017-09-01

    This study compares large-scale dynamical variability in the extratropical stratosphere, such as major stratospheric sudden warmings (MSSWs), among the Japanese 55-year Reanalysis (JRA-55) family data sets. The JRA-55 family consists of three products: a standard product (STDD) of the JRA-55 reanalysis data and two sub-products, JRA-55C (CONV) and JRA-55AMIP (AMIP). CONV assimilates only conventional surface and upper-air observations, without assimilation of satellite observations, whereas AMIP runs the same numerical weather prediction model without assimilation of any observational data. A comparison of the occurrence of MSSWs in Northern Hemisphere (NH) winter shows that, compared to STDD, CONV delays several MSSWs by 1 to 4 days and also misses a few MSSWs; CONV also misses the Southern Hemisphere (SH) MSSW of September 2002. AMIP shows significantly fewer MSSWs in NH winter and especially lacks MSSWs with a high aspect ratio of the polar vortex, i.e., cases in which the vortex is highly stretched or split. A further examination of daily geopotential height differences between STDD and CONV reveals occasional peaks in both hemispheres that are separate from MSSWs. The delayed and missed MSSW cases have smaller height differences in magnitude than such peaks. The height differences for those MSSWs include large contributions from the zonal component, reflecting an underestimation of the weakening of the zonal-mean polar night jet in CONV. We also explore strong planetary wave forcings and the associated polar vortex weakenings for STDD and AMIP, finding a lower frequency of strong wave forcings and weaker vortex responses to such forcings in AMIP, consistent with the lower MSSW frequency.

  12. Recent intensification of the Walker Circulation and the role of natural sea surface temperature variability

    NASA Astrophysics Data System (ADS)

    Zhao, X.; Allen, R.

    2017-12-01

    In a warming world, the tropical atmospheric overturning circulation, including the Walker Circulation, is expected to weaken due to thermodynamic constraints. Tropical precipitation increases at a slower rate than water vapor, which increases according to Clausius-Clapeyron scaling assuming constant relative humidity, so the tropical overturning circulation slows down. This is supported by both observations and model simulations, which show a slowdown of the Walker Circulation over the 20th century, and model projections suggest a further weakening in the 21st century. However, over the last several decades (1979-2014), multiple observations reveal a robust strengthening of the Walker Circulation. Although coupled CMIP5 simulations are unable to reproduce this strengthening, AMIP simulations, which feature the observed evolution of SSTs, are generally able to reproduce it. Assuming that the ensemble-mean sea surface temperatures (SSTs) from historical CMIP5 simulations accurately represent the externally forced SST response, the observed SSTs can be decomposed into a forced and an unforced component. CAM5 AMIP-type simulations driven by the unforced component of observed SSTs reproduce the observed strengthening of the Walker Circulation, whereas corresponding simulations driven by the forced component yield a weaker Walker Circulation. These results are consistent with the zonal tropical SST gradient and the Bjerknes feedback. The unforced component of SSTs yields an increased SST gradient over the tropical Pacific (a La Niña-like pattern) and a strengthening of the tropical trade winds, which constitute the lower branch of the Walker Circulation. The forced component yields a zonally uniform tropical Pacific SST warming and a marginal weakening of the Walker Circulation. Our results suggest significant modulation of the tropical Walker Circulation by natural SST variability over the last several decades.
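
    The forced/unforced decomposition described in this record can be sketched numerically: treat the multi-model (or multi-member) ensemble-mean SST as the externally forced signal, and take the observed residual as the natural, unforced component. Everything below is a synthetic illustration under that assumption, not the study's actual data or ensemble.

```python
import numpy as np

rng = np.random.default_rng(0)
n_members, n_years, n_grid = 40, 36, 100      # stand-ins for 1979-2014

# Common externally forced warming signal (illustrative linear trend)
forced_true = np.linspace(0.0, 0.8, n_years)[:, None] * np.ones(n_grid)

# "Observed" SST = forced signal + one realization of internal variability
obs_sst = forced_true + rng.normal(0.0, 0.3, size=(n_years, n_grid))

# Ensemble of historical runs: internal variability differs per member,
# so averaging across members isolates the forced response
members = forced_true[None] + rng.normal(0.0, 0.3,
                                         size=(n_members, n_years, n_grid))
forced_est = members.mean(axis=0)

# Unforced (natural) component: what would drive the "unforced" AMIP runs
unforced = obs_sst - forced_est
```

    In the study's framework, AMIP-type runs forced with `forced_est + climatology` versus `unforced + climatology` would then separate the externally forced and naturally driven circulation responses.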

  13. Intermodel spread of the double-ITCZ bias in coupled GCMs tied to land surface temperature in AMIP GCMs

    NASA Astrophysics Data System (ADS)

    Zhou, Wenyu; Xie, Shang-Ping

    2017-08-01

    Global climate models (GCMs) have long suffered from biases of excessive tropical precipitation in the Southern Hemisphere (SH). The severity of the double-Intertropical Convergence Zone (ITCZ) bias, defined here as the interhemispheric difference in zonal mean tropical precipitation, varies strongly among models in the Coupled Model Intercomparison Project Phase 5 (CMIP5) ensemble. Models with a more severe double-ITCZ bias feature warmer tropical sea surface temperature (SST) in the SH, coupled with weaker southeast trades. While previous studies focus on coupled ocean-atmosphere interactions, here we show that the intermodel spread in the severity of the double-ITCZ bias is closely related to land surface temperature biases, which can be further traced back to those in the Atmosphere Model Intercomparison Project (AMIP) simulations. By perturbing land temperature in models, we demonstrate that cooler land can indeed lead to a more severe double-ITCZ bias by inducing the above coupled SST-trade wind pattern in the tropics. The response to land temperature can be consistently explained from both the dynamic and energetic perspectives. Although this intermodel spread from the land temperature variation does not account for the ensemble model mean double-ITCZ bias, identifying the land temperature effect provides insights into simulating a realistic ITCZ for the right reasons.

  14. Analysis of the Diurnal Cycle of Precipitation and its Relation to Cloud Radiative Forcing Using TRMM Products

    NASA Technical Reports Server (NTRS)

    Randall, David A.; Fowler, Laura D.; Lin, Xin

    1998-01-01

    In order to improve our understanding of the interactions between clouds, radiation, and the hydrological cycle simulated in the Colorado State University General Circulation Model (CSU GCM), we focused our research on the analysis of the diurnal cycle of precipitation, top-of-the-atmosphere and surface radiation budgets, and cloudiness using 10-year-long Atmospheric Model Intercomparison Project (AMIP) simulations. Comparisons of the simulated diurnal cycle were made against the diurnal cycles of the Earth Radiation Budget Experiment (ERBE) radiation budget and International Satellite Cloud Climatology Project (ISCCP) cloud products. This report summarizes our major findings over the Amazon Basin.

  15. How well do CMIP5 models simulate the low-level jet in western Colombia?

    NASA Astrophysics Data System (ADS)

    Sierra, Juan P.; Arias, Paola A.; Vieira, Sara C.; Agudelo, Jhoana

    2017-11-01

    The Choco jet is an important atmospheric feature of Colombian and northern South American hydro-climatology. This work assesses the ability of 26 coupled and 11 uncoupled (AMIP) global climate models (GCMs) included in the fifth phase of the Coupled Model Intercomparison Project (CMIP5) archive to simulate the basic climatological features (annual cycle, spatial distribution and vertical structure) of this jet. Using factor and cluster analysis, we objectively classify models into Best, Worst, and Intermediate groups. Despite the coarse resolution of the GCMs, this study demonstrates that nearly all models can represent the existence of the Choco low-level jet. AMIP and Best models present a more realistic simulation of the jet. Worst models exhibit biases such as an anomalous southward location of the Choco jet throughout the year and a shallower jet. Model skill in representing this jet comes from the ability to reproduce some of its main causes, such as the temperature and pressure differences between particular regions in the eastern Pacific and western Colombia, which are non-local features. Conversely, Worst models considerably underestimate the temperature and pressure differences between these key regions. We identify a close relationship between the location of the Choco jet and the Intertropical Convergence Zone (ITCZ), and CMIP5 models are able to represent such a relationship. Errors in the Worst models are related to biases in the location of the ITCZ over the eastern tropical Pacific Ocean, as well as to the representation of topography and the horizontal resolution.

  16. Role of the Tropical Pacific in recent Antarctic Sea-Ice Trends

    NASA Astrophysics Data System (ADS)

    Codron, F.; Bardet, D.; Allouache, C.; Gastineau, G.; Friedman, A. R.; Douville, H.; Voldoire, A.

    2017-12-01

    The recent (up to 2016) trends in Antarctic sea-ice cover, a global increase masking a dipole between the Ross and Bellingshausen-Weddell seas, are still not well understood, and are not reproduced by CMIP5 coupled climate models. We here explore the potential role of atmospheric circulation changes around the Amundsen Sea, themselves possibly forced by tropical SSTs, an explanation that has recently been advanced. As a first check on this hypothesis, we compare the atmospheric circulation trends simulated by atmospheric GCMs coupled with an ocean or with imposed SSTs (the AMIP experiment from CMIP5), the latter being in theory able to reproduce changes caused by natural SST variability. While coupled models in aggregate simulate trends that project onto the SAM structure, strongest in summer, the AMIP simulations add in the winter season a pronounced Amundsen Sea Low signature (and a PNA signature in the Northern Hemisphere), both consistent with a La Niña-like trend in the tropical Pacific. We then use a specific coupled GCM setup, in which surface wind anomalies over the tropical Pacific are strongly nudged towards the observed ones, including their interannual variability, while the model is free to evolve elsewhere. The two GCMs used then simulate a deepening trend in the Amundsen Sea Low in winter, and are able to reproduce a dipole in sea-ice cover. Further analysis shows that the sea-ice dipole is partially forced by surface heat flux anomalies in early winter, the extent varying with the region and GCM used. The turbulent heat fluxes then act to damp the anomalies in late winter, which may however be maintained by ice-albedo feedbacks.

  17. Update of global TC simulations using a variable resolution non-hydrostatic model

    NASA Astrophysics Data System (ADS)

    Park, S. H.

    2017-12-01

    Tropical cyclone (TC) forecasts are simulated using variable-resolution meshes in MPAS during the summer of 2017. Two physics suites are tested to explore the performance and biases of each suite for TC forecasting. The WRF physics suite is selected based on experience with weather forecasting, and the CAM (Community Atmosphere Model) physics suite is taken from AMIP-type climate simulations. Building on last year's results with the CAM5 physical parameterization package, and comparing against the WRF physics, we investigate an intensity bias using an updated version of the CAM physics (CAM6). We also compare these results with coupled versions of the TC simulations. In this talk, the TC structure will be compared, especially around the boundary layer, to investigate the relationship between TC intensity and the different physics packages.

  18. Do gravity waves significantly impact PSC occurrence in the Antarctic?

    NASA Astrophysics Data System (ADS)

    McDonald, A. J.; George, S. E.; Woollands, R. M.

    2009-02-01

    This study uses a combination of POAM III aerosol extinction measurements and CHAMP GPS radio occultation (GPS/RO) temperature measurements to examine the role of atmospheric gravity waves in Polar Stratospheric Cloud (PSC) formation in the Antarctic. POAM III aerosol extinction observations are used to identify Type I PSCs using an unsupervised clustering algorithm. The seasonal and spatial distribution of PSCs observed by POAM III is examined to determine whether there is a bias towards regions of high wave activity early in the Antarctic winter, which may enhance PSC formation. The probability of temperatures below the Type Ia formation temperature threshold based on UKMO analyses generally corresponds well to the PSC occurrence derived from POAM III extinction data. However, in June the POAM III observations of PSCs are more abundant than expected from temperature thresholds. In addition, the PSC occurrence based on temperature thresholds in September and October is often significantly higher than the PSC occurrence observed by POAM III, probably due to dehydration and denitrification. Use of high-resolution temperatures from CHAMP GPS/RO observations provides a slightly improved relationship to the POAM III-derived values. Analysis of the CHAMP temperature observations indicates that temperature perturbations associated with gravity waves may explain the enhanced PSC incidence observed in June compared to the UKMO analyses. Comparison of the UKMO analysis temperatures with corresponding CHAMP observations also suggests a small warm bias in the UKMO analyses during June. Examination of the longitudinal structure of PSC occurrence in June 2005 also shows that regions of enhancement are associated with data near the Antarctic Peninsula, a known mountain-wave "hotspot". The impact of temperature perturbations causing enhanced temperature threshold crossings is shown to be particularly important early in the Antarctic winter, while later in the season temperature perturbations associated with gravity waves could contribute to about 15% of the PSCs observed, a value which corresponds well with several previous studies.
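    The threshold-crossing effect above can be illustrated with a toy calculation. The 195 K Type Ia (NAT) threshold, the Gaussian background temperatures, and the 2 K wave amplitude are illustrative assumptions only, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
T_NAT = 195.0  # assumed Type Ia (NAT) PSC formation threshold, K

def psc_probability(temps, threshold=T_NAT):
    """Fraction of temperature samples below the formation threshold."""
    return float(np.mean(temps < threshold))

# Synthetic analysis temperatures a couple of kelvin above threshold,
# standing in for relatively smooth UKMO-style analyses.
t_analysis = rng.normal(197.0, 2.0, size=10_000)
p_background = psc_probability(t_analysis)

# Superimposing zero-mean gravity-wave perturbations (~2 K amplitude)
# raises the probability of threshold crossings even though the mean
# temperature is unchanged, which is the mechanism invoked for June.
wave = 2.0 * np.sin(rng.uniform(0.0, 2.0 * np.pi, size=t_analysis.shape))
p_with_waves = psc_probability(t_analysis + wave)
```

    Because the mean temperature sits above the threshold, any zero-mean perturbation widens the cold tail and increases the crossing probability, so p_with_waves exceeds p_background.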

  19. The new GFDL global atmosphere and land model AM2-LM2: Evaluation with prescribed SST simulations

    USGS Publications Warehouse

    Anderson, J.L.; Balaji, V.; Broccoli, A.J.; Cooke, W.F.; Delworth, T.L.; Dixon, K.W.; Donner, L.J.; Dunne, K.A.; Freidenreich, S.M.; Garner, S.T.; Gudgel, R.G.; Gordon, C.T.; Held, I.M.; Hemler, R.S.; Horowitz, L.W.; Klein, S.A.; Knutson, T.R.; Kushner, P.J.; Langenhost, A.R.; Lau, N.-C.; Liang, Z.; Malyshev, S.L.; Milly, P.C.D.; Nath, M.J.; Ploshay, J.J.; Ramaswamy, V.; Schwarzkopf, M.D.; Shevliakova, E.; Sirutis, J.J.; Soden, B.J.; Stern, W.F.; Thompson, L.A.; Wilson, R.J.; Wittenberg, A.T.; Wyman, B.L.

    2004-01-01

    The configuration and performance of a new global atmosphere and land model for climate research developed at the Geophysical Fluid Dynamics Laboratory (GFDL) are presented. The atmosphere model, known as AM2, includes a new gridpoint dynamical core, a prognostic cloud scheme, and a multispecies aerosol climatology, as well as components from previous models used at GFDL. The land model, known as LM2, includes soil sensible and latent heat storage, groundwater storage, and stomatal resistance. The performance of the coupled model AM2-LM2 is evaluated with a series of prescribed sea surface temperature (SST) simulations. Particular focus is given to the model's climatology and the characteristics of interannual variability related to El Niño-Southern Oscillation (ENSO). One AM2-LM2 integration was performed according to the prescriptions of the second Atmospheric Model Intercomparison Project (AMIP II) and the data were submitted to the Program for Climate Model Diagnosis and Intercomparison (PCMDI). Particular strengths of AM2-LM2, as judged by comparison to other models participating in AMIP II, include its circulation and distributions of precipitation. Prominent problems of AM2-LM2 include a cold bias in surface and tropospheric temperatures, weak tropical cyclone activity, and weak tropical intraseasonal activity associated with the Madden-Julian oscillation. An ensemble of 10 AM2-LM2 integrations with observed SSTs for the second half of the twentieth century permits a statistically reliable assessment of the model's response to ENSO. In general, AM2-LM2 produces a realistic simulation of the anomalies in tropical precipitation and extratropical circulation that are associated with ENSO. © 2004 American Meteorological Society.

  20. Land surface energy budget during dry spells: global CMIP5 AMIP simulations vs. satellite observations

    NASA Astrophysics Data System (ADS)

    Gallego-Elvira, Belen; Taylor, Christopher M.; Harris, Phil P.; Ghent, Darren; Folwell, Sonja S.

    2015-04-01

    During extended periods without rain (dry spells), the soil can dry out due to vegetation transpiration and soil evaporation. At some point in this drying cycle, land surface conditions change from energy-limited to water-limited evapotranspiration, and this is accompanied by an increase of the ground and overlying air temperatures. Regionally, the characteristics of this transition determine the influence of soil moisture on air temperature and rainfall. Global Climate Models (GCMs) disagree on where and how strongly the surface energy budget is limited by soil moisture. Flux tower observations are improving our understanding of these dry-down processes, but typical heterogeneous landscapes are too sparsely sampled to ascertain a representative regional response. Alternatively, satellite observations of land surface temperature (LST) provide indirect information about the surface energy partition at 1 km resolution globally. In our study, we analyse how well the dry spell dynamics of LST are represented by GCMs across the globe. We use a spatially and temporally aggregated diagnostic to describe the composite response of LST during surface dry-down in rain-free periods in distinct climatic regions. The diagnostic is derived from daytime MODIS-Terra LST observations and bias-corrected meteorological re-analyses, and compared against the outputs of historical climate simulations of seven models running the CMIP5 AMIP experiment. Dry spell events are stratified by antecedent precipitation, land cover type and geographic region to assess the sensitivity of surface warming rates to soil moisture levels at the onset of a dry spell for different surface and climatic zones. In a number of drought-prone hot spot regions, we find important differences in simulated dry spell behaviour, both between models and compared to observations. These model biases are likely to compromise seasonal forecasts and future climate projections.

  1. FIRE_MS_UKMO_C130

    Atmospheric Science Data Center

    2015-11-25

    ... Hot-Wire MCRW Refractometer Platinum Resistance Pressure Transducer RT-4 Pyranometer Pyrgeometer Radiometer Wind ... Parameters:  Dew/Frost Point Temperature Liquid Water Content Humidity Temperature Pressure Irradiance ...

  2. Skill of Predicting Heavy Rainfall Over India: Improvement in Recent Years Using UKMO Global Model

    NASA Astrophysics Data System (ADS)

    Sharma, Kuldeep; Ashrit, Raghavendra; Bhatla, R.; Mitra, A. K.; Iyengar, G. R.; Rajagopal, E. N.

    2017-11-01

    The quantitative precipitation forecast (QPF) performance for heavy rains is still a challenge, even for the most advanced state-of-the-art high-resolution Numerical Weather Prediction (NWP) modeling systems. This study aims to evaluate the performance of the UK Met Office Unified Model (UKMO) over India for the prediction of high rainfall amounts (>2 and >5 cm/day) during the monsoon period (JJAS) from 2007 to 2015 in short-range forecasts up to Day 3. Among the various modeling upgrades and improvements in the parameterizations during this period, the model horizontal resolution improved from 40 km in 2007 to 17 km in 2015. The skill of short-range rainfall forecasts in the UKMO model has improved in recent years, mainly due to increased horizontal and vertical resolution along with improved physics schemes. Categorical verification carried out using four verification metrics, namely probability of detection (POD), false alarm ratio (FAR), frequency bias (Bias) and critical success index (CSI), indicates that QPF skill has improved by >29% in POD and >24% in FAR. Additionally, verification scores like EDS (Extreme Dependency Score), EDI (Extremal Dependence Index) and SEDI (Symmetric EDI) are used, with special emphasis on verification of extreme and rare rainfall events. These scores also show an improvement, by 60% (EDS) and >34% (EDI and SEDI), during the period of study, suggesting an improved skill in predicting heavy rains.
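    For reference, the four basic categorical scores named above follow directly from the 2x2 contingency table of forecast versus observed exceedances; a minimal sketch, with counts that are made up for illustration:

```python
def categorical_scores(hits, misses, false_alarms):
    """POD, FAR, frequency bias and CSI from a 2x2 contingency table
    for an event such as rainfall exceeding 2 cm/day.

    hits: event forecast and observed; misses: observed but not
    forecast; false_alarms: forecast but not observed.
    """
    pod = hits / (hits + misses)                     # probability of detection
    far = false_alarms / (hits + false_alarms)       # false alarm ratio
    bias = (hits + false_alarms) / (hits + misses)   # frequency bias
    csi = hits / (hits + misses + false_alarms)      # critical success index
    return {"POD": pod, "FAR": far, "Bias": bias, "CSI": csi}

# Illustrative counts for heavy-rain days in a Day 3 forecast
scores = categorical_scores(hits=60, misses=40, false_alarms=20)
```

    With these counts the forecast detects 60% of events, one quarter of its alarms are false, and it slightly underforecasts event frequency (Bias < 1). The extreme-event scores (EDS, EDI, SEDI) are built from the same table but are designed not to degenerate as the event becomes rare.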

  3. Demonstrating the Importance of "Good" Models of Land Surface Hydrological Processes

    NASA Astrophysics Data System (ADS)

    Pitman, A.; Irannejad, P.; McGuffie, K.; Henderson-Sellers, A.

    2003-12-01

    To reduce the uncertainty in the prediction of land surface climates, the Atmospheric Model Intercomparison Project (AMIP) Diagnostic Subproject 12 (DSP 12) and the Project for Intercomparison of Land-surface Parameterisation Schemes (PILPS) have analysed the dependence of climate simulations on the land-surface schemes (LSSs). This analysis has comprised three efforts: (i) proving that LSSs matter in coupled simulations; (ii) investigating whether improvements in LSSs have occurred over time; and (iii) searching for novel means of validating LSS predictions. In the first, Irannejad et al. (2003) introduce a novel method for evaluating the dependence of 19 AMIP AGCMs' latent heat flux (LH) on the LSS by excluding the impact of the atmosphere. Pseudo-LSSs (PLSSs) for LH, in the form of multi-variable linear models expressing mean monthly LH as a function of atmospheric forcing, are developed. Analysis over three large and climatically diverse river basins shows estimates of mean annual LH from the PLSSs agreeing well with the AGCMs' simulations. RMS errors range from 0.4 to 2.2 W m-2 depending on the region and the AGCM. When the PLSSs are driven by a single atmospheric forcing, different LSSs behave differently, and the variability of mean annual LH among AGCMs increases. The second strand of our investigation uncovered a clear generational sequence of land-surface schemes: first-generation 'no canopy' schemes; second-generation 'SiBlings'; and 'recent schemes'. We conclude that although continental surface modelling has improved over the last 30 years, full confidence remains elusive, in part due to tuning to available observations. Finally, we show that stable water isotopes challenge predictions of evaporation and condensation processes.
These three-pronged findings prove that LSSs are important to AGCM and coupled climate predictions; demonstrate that new, or changed, land-surface components increase diversity among simulations; underline the need for validation data and also challenge current parameterisations with novel observations.
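    The pseudo-LSS idea, a multi-variable linear model predicting mean monthly LH from atmospheric forcing, can be sketched with ordinary least squares on synthetic data. The choice of forcing variables and the coefficients below are invented for illustration, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(1)
n_months = 120

# Synthetic monthly atmospheric forcing: net radiation (W m-2),
# 2-m air temperature (K), precipitation (mm/day).
forcing = np.column_stack([
    rng.normal(120.0, 30.0, n_months),
    rng.normal(288.0, 5.0, n_months),
    rng.normal(2.5, 1.0, n_months),
])

# "AGCM-simulated" latent heat flux, generated from known coefficients
# plus noise, standing in for one model's monthly LH output.
lh = (0.4 * forcing[:, 0] + 1.5 * (forcing[:, 1] - 280.0)
      + 6.0 * forcing[:, 2] + rng.normal(0.0, 1.0, n_months))

# Fit the pseudo-LSS: a multi-variable linear model with an intercept.
design = np.column_stack([np.ones(n_months), forcing])
coef, *_ = np.linalg.lstsq(design, lh, rcond=None)
lh_hat = design @ coef

rms_error = float(np.sqrt(np.mean((lh - lh_hat) ** 2)))
```

    Driving such fitted surrogates with a common forcing, as described above, isolates how differently the underlying LSSs respond once the atmospheric differences between AGCMs are removed.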

  4. FIRE_AX_UKMO_C130

    Atmospheric Science Data Center

    2015-11-25

    ... FSSP Gust Probe Hot-Wire Hygrometer Platinum Resistance PMS 2D-C Probe PRT-4 Pyranometer Pyrgeometer ... Parameters:  Barometric Altitude Cloud Top Temperature Deiced Temperature Dew/Frost Point Temperature Droplet ...

  5. Exacerbation of South Asian monsoon biases in GCMs when using coupled ocean models

    NASA Astrophysics Data System (ADS)

    Turner, Andrew

    2015-04-01

    Cold biases during spring in the northern Arabian Sea in coupled ocean-atmosphere GCMs have previously been shown to limit monsoon rainfall over South Asia during the subsequent summer, by limiting the availability of moisture being advected. The cold biases develop following advection of cold, dry air on anomalous northerly low-level flow, suggestive of a too-strong winter monsoon in the coupled GCMs. At the same time, these cold biases and the anomalous advection have been related to larger scales by interaction with the progression of the midlatitude westerly upper-level flow. In this study we compare monsoon characteristics in 20th-century historical and AMIP integrations of the CMIP5 multi-model database. We use the period 1979-2005, common to both the AMIP and historical integrations. While all available observed boundary conditions, including sea surface temperature (SST), are prescribed in the AMIP integrations, the historical integrations feature ocean-atmosphere models that generate SSTs via air-sea coupled processes. In the AMIP experiments, the seasonal mean monsoon rainfall is shown to be systematically larger than in the coupled versions, with an earlier onset date also shown using a variety of circulation and precipitation metrics. In addition, examination of the springtime jet structure suggests that it sits too far south in the coupled models, leading to a delayed formation of the South Asia High over the Tibetan Plateau in summer. Further, we show that anomalous low-entropy air is advected near the surface from the north over the Arabian Sea in spring in the coupled models.

  6. Great Plains Drought in Simulations of Twentieth Century

    NASA Astrophysics Data System (ADS)

    McCrary, R. R.; Randall, D. A.

    2008-12-01

    The Great Plains region of the United States was influenced by a number of multi-year droughts during the twentieth century. Most notable were the "Dust Bowl" drought of the 1930s and the 1950s Great Plains drought. In this study we evaluate the ability of three of the Coupled Global Climate Models (CGCMs) used in the Fourth Assessment Report (AR4) of the IPCC to simulate Great Plains drought with the same frequency and intensity as was observed during the twentieth century. The models chosen for this study are GFDL CM 2.0, NCAR CCSM3, and UKMO HadCM3. We find that the models accurately capture the climatology of the hydrologic cycle of the Great Plains, but that they tend to overestimate the variability in Great Plains precipitation. We also find that in each model simulation at least one long-term drought occurs over the Great Plains region during its representation of 20th-century climate. The multi-year droughts produced by the models exhibit magnitudes and spatial scales similar to those observed during the twentieth century. This study also investigates the relative roles that external forcing from the tropical Pacific and local feedbacks between the land surface and the atmosphere play in the initiation and perpetuation of Great Plains drought in each model. We find that cool, La Niña-like conditions in the tropical Pacific are often associated with long-term drought conditions over the Great Plains in GFDL CM 2.0 and UKMO HadCM3, but there appears to be no systematic relationship between tropical Pacific SST variability and Great Plains drought in CCSM3. It is possible that the strong coupling between the land surface and the atmosphere in the NCAR model causes precipitation anomalies to lock into phase over the Great Plains, thereby perpetuating drought conditions. Results from this study are intended to help assess whether or not these climate models are credible for use in the assessment of future drought over the Great Plains region of the United States.

  7. Two-Dimensional Model Simulations of Interannual Variability in the Tropical Stratosphere

    NASA Technical Reports Server (NTRS)

    Fleming, Eric L.; Jackman, Charles H.; Considine, David B.; Rosenfeld, Joan; Bhartia, P. K. (Technical Monitor)

    2001-01-01

    Meteorological data from the United Kingdom Meteorological Office (UKMO) and constituent data from the Upper Atmosphere Research Satellite (UARS) are used to construct yearly zonal mean dynamical fields for the 1990s for use in the GSFC 2-D chemistry and transport model. This allows interannual dynamical variability to be included in the model constituent simulations. In this study, we focus on the tropical stratosphere. We find that the phase of quasi-biennial oscillation (QBO) signals in equatorial CH4 and in profile and total-column O3 data is resolved quite well using this empirically based 2-D model transport framework. However, the QBO amplitudes in the model constituents are systematically underestimated relative to the observations at most levels. This deficiency is probably due in part to the limited vertical resolutions of the 2-D model and of the UKMO and UARS input data sets. We find that using different heating rate calculations in the model affects the interannual and QBO amplitudes in the constituent fields, but has little impact on the phase. Sensitivity tests reveal that the QBO in transport dominates the ozone interannual variability in the lower stratosphere, with the effect of the temperature QBO being dominant in the upper stratosphere via the strong temperature dependence of the ozone loss reaction rates. We also find that the QBO in odd nitrogen radicals, which is caused by the QBO-modulated transport of NOy, plays a significant but not dominant role in determining the ozone QBO variability in the middle stratosphere. The model mean age of air is in good overall agreement with that determined from tropical lower/middle-stratospheric OMS balloon observations of SF6 and CO2. The interannual variability of the equatorial mean age in the model increases with altitude and maximizes near 40 km, with a range of 4-5 years over the 1993-2000 time period.

  9. Energy considerations in the Community Atmosphere Model (CAM)

    DOE PAGES

    Williamson, David L.; Olson, Jerry G.; Hannay, Cécile; ...

    2015-06-30

    An error in the energy formulation in the Community Atmosphere Model (CAM) is identified and corrected. Ten-year AMIP simulations are compared using the correct and incorrect energy formulations. Statistics of selected primary variables all indicate physically insignificant differences between the simulations, comparable to differences with simulations initialized with rounding-sized perturbations. The two simulations are so similar mainly because of an inconsistency in the application of the incorrect energy formulation in the original CAM. CAM used the erroneous energy form to determine the states passed between the parameterizations, but used a form related to the correct formulation for the state passed from the parameterizations to the dynamical core. If the incorrect form is also used to determine the state passed to the dynamical core, the simulations are significantly different. In addition, CAM uses the incorrect form for the global energy fixer, but that seems to be less important. The difference in the magnitudes of the fixers using the correct and incorrect energy definitions is very small.

  10. Three-dimensional evolution of water vapor distributions in the Northern Hemisphere stratosphere as observed by the MLS

    NASA Technical Reports Server (NTRS)

    Lahoz, W. A.; O'Neill, A.; Carr, E. S.; Harwood, R. S.; Froidevaux, L.; Read, W. G.; Waters, J. W.; Kumer, J. B.; Mergenthaler, J. L.; Roche, A. E.

    1994-01-01

    The three-dimensional evolution of stratospheric water vapor distributions observed by the Microwave Limb Sounder (MLS) during the period October 1991 - July 1992 is documented. The transport features inferred from the MLS water vapor distributions are corroborated using other dynamical fields, namely, nitrous oxide from the Cryogenic Limb Array Etalon Spectrometer instrument, analyzed winds from the U.K. Meteorological Office (UKMO), UKMO-derived potential vorticity, and the diabatic heating field. By taking a vortex-centered view and an along-track view, the authors observe in great detail the vertical and horizontal structure of the northern winter stratosphere. It is demonstrated that the water vapor distributions show clear signatures of the effects of diabatic descent through isentropic surfaces and quasi-horizontal transport along isentropic surfaces, and that the large-scale winter flow is organized by the interaction between the westerly polar vortex and the Aleutian high.

  11. Using MERRA, AMIP II, CMIP5 Outputs to Assess Actual and Potential Building Climate Zone Change and Variability From the Last 30 Years Through 2100

    NASA Astrophysics Data System (ADS)

    Stackhouse, P. W.; Westberg, D. J.; Hoell, J. M., Jr.; Chandler, W.; Zhang, T.

    2014-12-01

    In the US, residential and commercial building infrastructure combined consumes about 40% of total energy usage and emits about 39% of total CO2 emissions (DOE/EIA "Annual Energy Outlook 2013"). Thus, increasing the energy efficiency of buildings is paramount to reducing energy costs and emissions. Building codes, as used by local and state enforcement entities, are typically tied to the dominant climate within an enforcement jurisdiction, classified according to various climate zones. These climate zones are based upon a 30-year average of local surface observations and are developed by DOE and ASHRAE (formerly known as the American Society of Heating, Refrigerating and Air-Conditioning Engineers). A significant shortcoming of the methodology used in constructing such maps is the use of surface observations (located mainly near airports) that are unequally distributed and frequently have periods of missing data that need to be filled by various approximation schemes. This paper demonstrates the usefulness of NASA's Modern-Era Retrospective analysis for Research and Applications (MERRA) atmospheric data assimilation for deriving the ASHRAE climate zone maps, and then uses MERRA to define the last 30 years of variability in climate zones. These results show a statistically significant increase in the area covered by warmer climate zones and some tendency toward a reduction in the area of colder climate zones that requires a longer time series to confirm. Using the uncertainties of the basic surface temperature and precipitation parameters from MERRA, as determined by comparison to surface measurements, we first compare patterns and variability of ASHRAE climate zones from MERRA relative to present-day climate model runs from AMIP simulations to establish baseline sensitivity.
Based upon these results, we assess the variability of the ASHRAE climate zones according to CMIP runs through 2100 using an ensemble analysis that classifies model output changes by percentiles. Estimates of statistical significance are then compared to original model variability during the AMIP period. This work quantifies and tests for significance the changes seen in the various US regions that represent a potential contribution by NASA to the ongoing National Climate Assessment.

  12. Ship accessibility predictions for the Arctic Ocean based on IPCC CO2 emission scenarios

    NASA Astrophysics Data System (ADS)

    Oh, Jai-Ho; Woo, Sumin; Yang, Sin-Il

    2017-02-01

    Changes in the extent of Arctic sea ice, which have resulted from climate change, offer new opportunities to use the Northern Sea Route (NSR) and Northwest Passage (NWP) for shipping. However, choosing to navigate the Arctic Ocean remains challenging due to the limited accessibility of ships and the balance between economic gain and potential risk. As a result, more precise and detailed information on both weather and sea ice change in the Arctic is required. In this study, a high-resolution global AGCM was used to provide detailed information on the extent and thickness of Arctic sea ice. We performed an AMIP-type simulation of present-day climate for the 31 years from 1979 to 2009 with observed SST and sea ice concentrations. For the future climate projection, we simulated the historical climate during 1979-2005 and subsequently projected the future climate during 2010-2099 using the mean of four CMIP5 models under two Representative Concentration Pathway scenarios (RCP 8.5 and RCP 4.5). First, the AMIP-type simulation was evaluated by comparison with observations from the Hadley Centre Sea Ice and Sea Surface Temperature (HadISST) dataset. The model reproduces the maximum (March) and minimum (September) sea ice extents and the annual cycle. Based on this validation, the future sea ice extents show a decreasing trend for both the maximum and minimum seasons, and RCP 8.5 shows a more sharply decreasing pattern of sea ice than RCP 4.5. Under both scenarios, ships classified as Polar Class (PC) 3 and Open-Water (OW) were predicted to have the largest and smallest number of ship-accessible days (in any given year) for the NSR and NWP, respectively. Based on the RCP 8.5 scenario, the projections suggest that after 2070, PC3 and PC6 vessels will have year-round access to the Arctic Ocean. In contrast, OW vessels will continue to have a seasonal handicap, inhibiting their ability to pass through the NSR and NWP.

  13. Influence of Madden-Julian Oscillation (MJO) on Rainfall Variability over West Africa at Intraseasonal Timescale

    NASA Astrophysics Data System (ADS)

    Niang, C.

    2015-12-01

    Intraseasonal variability of rainfall over West Africa plays a significant role in the economy of the region and is highly linked to agriculture and water resources. This research study aims to investigate the relationship between the Madden-Julian Oscillation (MJO) and rainfall over West Africa during the boreal summer in state-of-the-art Atmospheric Model Intercomparison Project (AMIP)-type simulations performed by atmospheric general circulation models (GCMs) forced with prescribed sea surface temperature (SST). It aims to determine the impact of the MJO on rainfall and convection over West Africa and to identify the dynamical processes involved in these state-of-the-art climate simulations. The simulations generally show good skill in capturing the MJO's main characteristics as well as its influence on rainfall over West Africa. On the global scale, most models simulated an eastward spatio-temporal propagation of enhanced and suppressed convection similar to that observed. However, over West Africa the MJO signal is weak in a few of the models, although the eastward propagation remains coherent. The influence on rainfall is well captured in both the Sahel and Guinea regions, adequately reproducing the transition between positive and negative rainfall anomalies through the different phases, as seen in the observations. Furthermore, the results show that the strong active convective phase is clearly associated with the African Easterly Jet (AEJ), whereas the weak convective phase is associated with a much weaker AEJ, particularly over coastal Ghana. To assess the mechanisms involved in these impacts, convectively coupled equatorial waves (CCEW) are analysed separately. The analysis of the longitudinal propagation of zonal wind at 850 hPa and outgoing longwave radiation (OLR) shows that the CCEW are very weak and that their extension beyond the West African region is very limited.
    It was found that westward-propagating coupled equatorial Rossby waves are needed to bring out the MJO-convection link over the region, and this relationship is well reproduced by all the models. The results also confirm that it may be possible to predict anomalous convection over West Africa with a lead time of 15-20 days relative to convection over the Indian Ocean, and the AMIP simulations performed well in this regard.
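
    A minimal sketch of the kind of MJO-phase compositing commonly used in such analyses: daily rainfall anomalies are averaged within each of the eight MJO (RMM-style) phases, keeping only active-MJO days. The function, the phase convention, and the amplitude cutoff are illustrative assumptions, not the study's exact procedure.

```python
import numpy as np

def mjo_phase_composite(rain_anom, phase, amplitude, min_amp=1.0):
    """Mean rainfall anomaly for each of the 8 MJO phases, restricted to
    active-MJO days (index amplitude >= min_amp).

    rain_anom, phase, amplitude: 1-D daily series of equal length.
    Returns a dict {phase: mean anomaly}."""
    rain_anom = np.asarray(rain_anom, dtype=float)
    phase = np.asarray(phase)
    active = np.asarray(amplitude, dtype=float) >= min_amp
    return {p: float(rain_anom[active & (phase == p)].mean())
            for p in range(1, 9)}
```

    Plotting the eight composite values then shows the positive-to-negative anomaly transition through the phases that the abstract describes.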

  14. What have we learned from the German consortium project STORM aiming at high-resolution climate simulations?

    NASA Astrophysics Data System (ADS)

    von Storch, Jin-Song

    2014-05-01

    The German consortium STORM was formed to explore high-resolution climate simulations using the high-performance computer hosted at the German Climate Computing Center (DKRZ). One of the primary goals is to quantify the effect of unresolved (and parametrized) processes on climate sensitivity. We use ECHAM6/MPIOM, the coupled atmosphere-ocean model developed at the Max Planck Institute for Meteorology. The resolution is T255L95 for the atmosphere and 1/10 degree with 80 vertical levels for the ocean. We discuss results of stand-alone runs, i.e. the ocean-only simulation driven by the NCEP/NCAR reanalysis and the atmosphere-only AMIP-type simulation. Increasing resolution leads to a redistribution of biases, even though some improvements, both in the atmosphere and in the ocean, can clearly be attributed to the increase in resolution. We also present new insights on ocean mesoscale eddies, in particular their effects on the ocean's energetics. Finally, we discuss the status and problems of the coupled high-resolution runs.

  15. Projected climate and vegetation changes and potential biotic effects for Fort Benning, Georgia; Fort Hood, Texas; and Fort Irwin, California

    USGS Publications Warehouse

    Shafer, S.L.; Atkins, J.; Bancroft, B.A.; Bartlein, P.J.; Lawler, J.J.; Smith, B.; Wilsey, C.B.

    2012-01-01

    The responses of species and ecosystems to future climate changes will present challenges for conservation and natural resource managers attempting to maintain both species populations and essential habitat. This report describes projected future changes in climate and vegetation for three study areas surrounding the military installations of Fort Benning, Georgia, Fort Hood, Texas, and Fort Irwin, California. Projected climate changes are described for the time period 2070–2099 (30-year mean) as compared to 1961–1990 (30-year mean) for each study area using data simulated by the coupled atmosphere-ocean general circulation models CCSM3, CGCM3.1(T47), and UKMO-HadCM3, run under the B1, A1B, and A2 future greenhouse gas emissions scenarios. These climate data are used to simulate potential changes in important components of the vegetation for each study area using LPJ, a dynamic global vegetation model, and LPJ-GUESS, a dynamic vegetation model optimized for regional studies. The simulated vegetation results are compared with observed vegetation data for the study areas. Potential effects of the simulated future climate and vegetation changes for species and habitats of management concern are discussed in each study area, with a particular focus on federally listed threatened and endangered species.

  16. North Atlantic Tropical Cyclones: historical simulations and future changes with the new high-resolution Arpege AGCM.

    NASA Astrophysics Data System (ADS)

    Pilon, R.; Chauvin, F.; Palany, P.; Belmadani, A.

    2017-12-01

    A new version of the variable high-resolution Meteo-France Arpege atmospheric general circulation model (AGCM) has been developed for tropical cyclone (TC) studies, with a focus on the North Atlantic basin, where the model horizontal resolution is 15 km. Ensemble historical AMIP (Atmospheric Model Intercomparison Project)-type simulations (1965-2014) and future projections (2020-2080) under the IPCC (Intergovernmental Panel on Climate Change) representative concentration pathway (RCP) 8.5 scenario have been produced. A tracking algorithm for TC-like vortices is used to investigate TC activity and variability. TC frequency, genesis, geographical distribution and intensity are examined. Historical simulations are compared to best-track and reanalysis datasets. Model TC frequency is generally realistic but tends to be too high during the first decade of the historical simulations. Biases appear to originate from both the tracking algorithm and the model climatology. Nevertheless, the model simulates intense TCs, corresponding to category 5 hurricanes, extremely well in the North Atlantic, where grid resolution is highest. Interaction between developing TCs and vertical wind shear is shown to be a contributing factor for TC variability. Future changes in TC activity and properties are also discussed.
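
    Trackers of TC-like vortices typically start from simple gridpoint criteria. The sketch below flags Northern Hemisphere candidate points using two common tests, low-level vorticity above a threshold and an upper-level warm-core anomaly; the thresholds are illustrative, and the study's actual tracker applies additional criteria (e.g. wind speed, duration, trajectory stitching).

```python
import numpy as np

def tc_candidate_mask(vort850, t_anom_upper,
                      vort_thresh=3.5e-5, warm_thresh=1.0):
    """Boolean mask of grid points meeting two generic Northern Hemisphere
    TC-detection criteria: 850-hPa relative vorticity above vort_thresh
    (s^-1) and a positive upper-tropospheric warm-core temperature anomaly
    above warm_thresh (K). Thresholds are illustrative only."""
    v = np.asarray(vort850, dtype=float)
    t = np.asarray(t_anom_upper, dtype=float)
    return (v > vort_thresh) & (t > warm_thresh)
```

    Candidate points are then linked in time into tracks, which is where tracker-dependent biases of the kind mentioned in the abstract can enter.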

  17. SST Patterns, Atmospheric Variability, and Inferred Sensitivities in the CMIP5 Model Archive

    NASA Astrophysics Data System (ADS)

    Marvel, K.; Pincus, R.; Schmidt, G. A.

    2017-12-01

    An emerging consensus suggests that global mean feedbacks to increasing temperature are not constant in time. If feedbacks become more positive in the future, the equilibrium climate sensitivity (ECS) inferred from recent observed global energy budget constraints is likely to be biased low. Time-varying feedbacks are largely tied to evolving sea-surface temperature patterns. In particular, recent anomalously cool conditions in the tropical Pacific may have triggered feedbacks that are not reproduced in equilibrium simulations where the tropical Pacific and Southern Ocean have had time to warm. Here, we use AMIP and CMIP5 historical simulations to explore the ECS that may be inferred over the recent historical period. We find that in all but one CMIP5 model, the feedbacks triggered by observed SST patterns are significantly less positive than those arising from historical simulations in which SST patterns are allowed to evolve unconstrained. However, there are substantial variations in feedbacks even when the SST pattern is held fixed, suggesting that atmospheric and land variability contribute to uncertainty in the estimates of ECS obtained from recent observations of the global energy budget.
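
    The energy-budget inference of ECS referred to above can be written compactly: with observed-period warming dT, forcing change dF and top-of-atmosphere imbalance change dN, the net feedback is lambda = (dF - dN)/dT and the implied sensitivity is ECS = F_2x / lambda. A minimal sketch, assuming the conventional F_2x = 3.7 W m^-2 for CO2 doubling:

```python
def inferred_ecs(dT, dF, dN, F2x=3.7):
    """Equilibrium climate sensitivity (K) implied by an energy-budget
    constraint: feedback lambda = (dF - dN) / dT in W m^-2 K^-1,
    ECS = F2x / lambda. Illustrative values of F2x assumed."""
    lam = (dF - dN) / dT
    return F2x / lam
```

    If feedbacks become more positive in the future (smaller effective lambda), the same formula applied to recent observations underestimates the true ECS, which is the bias the abstract discusses.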

  18. Application of Seasonal CRM Integrations to Develop Statistics and Improved GCM Parameterization of Subgrid Cloud-Radiation Interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiaoqing Wu; Xin-Zhong Liang; Sunwook Park

    2007-01-23

    The work supported by this ARM project lays a solid foundation for improving the parameterization of subgrid cloud-radiation interactions in the NCAR CCSM and its climate simulations. We have made significant use of CRM simulations and concurrent ARM observations to produce long-term, consistent cloud and radiative property datasets at the cloud scale (Wu et al. 2006, 2007). With these datasets, we have investigated the mesoscale enhancement of surface heat fluxes by cloud systems (Wu and Guimond 2006), quantified the effects of cloud horizontal inhomogeneity and vertical overlap on the domain-averaged radiative fluxes (Wu and Liang 2005), and subsequently validated and improved the physically based mosaic treatment of subgrid cloud-radiation interactions (Liang and Wu 2005). We have implemented the mosaic treatment into the CCM3. The 5-year (1979-1983) AMIP-type simulation showed significant impacts of subgrid cloud-radiation interactions on the climate simulations (Wu and Liang 2005). We have also actively participated in CRM intercomparisons that foster the identification and physical understanding of common errors in cloud-scale modeling (Xie et al. 2005; Xu et al. 2005; Grabowski et al. 2005).
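
    The vertical-overlap effect mentioned above can be illustrated with the standard total-cloud-cover calculation: random overlap multiplies the clear-sky fractions of all layers, while maximum-random overlap (Geleyn-Hollingsworth form) maximally overlaps adjacent cloudy layers. This is a generic textbook formula for illustration, not the mosaic treatment of Liang and Wu itself.

```python
import numpy as np

def total_cloud_cover(frac, overlap="maximum-random"):
    """Column total cloud cover from layer cloud fractions (top to bottom).

    overlap="random": 1 - product of clear fractions.
    overlap="maximum-random": Geleyn-Hollingsworth formula, which overlaps
    adjacent cloudy layers maximally and separated layers randomly."""
    c = np.asarray(frac, dtype=float)
    if overlap == "random":
        return float(1.0 - np.prod(1.0 - c))
    clear = 1.0
    prev = 0.0
    for ck in c:
        # Guard the denominator against a fully cloudy layer above.
        clear *= (1.0 - max(ck, prev)) / (1.0 - min(prev, 0.999999))
        prev = ck
    return float(1.0 - clear)
```

    For two adjacent half-cloudy layers, random overlap gives 0.75 total cover while maximum-random gives 0.5, which is why the overlap assumption matters for domain-averaged radiative fluxes.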

  19. Circulation and rainfall climatology of a 10-year (1979 - 1988) integration with the Goddard Laboratory for atmospheres general circulation model

    NASA Technical Reports Server (NTRS)

    Kim, J.-H.; Sud, Y. C.

    1993-01-01

    A 10-year (1979-1988) integration of the Goddard Laboratory for Atmospheres (GLA) general circulation model (GCM) under the Atmospheric Model Intercomparison Project (AMIP) is analyzed and compared with observations. The first-moment fields of circulation variables, as well as hydrological variables including precipitation, evaporation, and soil moisture, are presented. Our goals are (1) to produce a benchmark documentation of the GLA GCM for future model improvements; (2) to examine systematic errors between the simulated and the observed circulation, precipitation, and hydrologic cycle; (3) to examine the interannual variability of the simulated atmosphere and compare it with observations; and (4) to examine the ability of the model to capture major climate anomalies in response to events such as El Nino and La Nina. The 10-year mean seasonal and annual simulated circulation is quite reasonable compared to the analyzed circulation, except over the polar regions and areas of high orography. Precipitation over the tropics is quite well simulated, and the signal of El Nino/La Nina episodes can be easily identified. The time series of evaporation and soil moisture in the 12 biomes of the biosphere also show reasonable patterns compared to the estimated evaporation and soil moisture.

  20. Land-total and Ocean-total Precipitation and Evaporation from a Community Atmosphere Model version 5 Perturbed Parameter Ensemble

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Covey, Curt; Lucas, Donald D.; Trenberth, Kevin E.

    2016-03-02

    This document presents the large-scale water budget statistics of a perturbed input-parameter ensemble of atmospheric model runs. The model is Version 5.1.02 of the Community Atmosphere Model (CAM). These runs are the "C-Ensemble" described by Qian et al., "Parametric Sensitivity Analysis of Precipitation at Global and Local Scales in the Community Atmosphere Model CAM5" (Journal of Advances in Modeling Earth Systems, 2015). As noted by Qian et al., the simulations are "AMIP type" with temperature and sea ice boundary conditions chosen to match surface observations for the five-year period 2000-2004. There are 1100 ensemble members in addition to one run with default input-parameter values.

  1. Review of asset hierarchy criticality assessment and risk analysis practices.

    DOT National Transportation Integrated Search

    2014-01-01

    The MTA NYC Transit (NYCT) has begun an enterprise-wide Asset Management Improvement Program (AMIP). In : 2012, NYCT developed an executive-level concept of operations that defined a new asset management : framework following a systems engineering ap...

  2. Predictability of short-range forecasting: a multimodel approach

    NASA Astrophysics Data System (ADS)

    García-Moya, Jose-Antonio; Callado, Alfons; Escribà, Pau; Santos, Carlos; Santos-Muñoz, Daniel; Simarro, Juan

    2011-05-01

    Numerical weather prediction (NWP) models, including mesoscale models, have limitations when it comes to dealing with severe weather events, because extreme weather is highly unpredictable even in the short range. A probabilistic forecast based on an ensemble of slightly different model runs may help to address this issue. Among other ensemble techniques, multimodel ensemble prediction systems (EPSs) are proving useful for adding probabilistic value to mesoscale deterministic models. A Multimodel Short Range Ensemble Prediction System (SREPS) focused on forecasting the weather up to 72 h ahead has been developed at the Spanish Meteorological Service (AEMET). The system uses five different limited-area models (LAMs), namely HIRLAM (HIRLAM Consortium), HRM (DWD), the UM (UKMO), MM5 (PSU/NCAR) and COSMO (COSMO Consortium). These models run with initial and boundary conditions provided by five different global deterministic models, namely IFS (ECMWF), UM (UKMO), GME (DWD), GFS (NCEP) and CMC (MSC). AEMET-SREPS (AE) validation of the large-scale flow against ECMWF analyses shows a consistent and slightly underdispersive system. For surface parameters, the system shows high skill in forecasting binary events. Probabilistic 24-h precipitation forecasts are verified against an upscaled grid of observations from European high-resolution precipitation networks and compared with the ECMWF-EPS (EC).
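
    Skill for probabilistic binary-event forecasts of the kind verified here is commonly summarized with the Brier score, the mean squared difference between the forecast probability and the 0/1 outcome. A generic sketch, not AEMET's actual verification code:

```python
import numpy as np

def brier_score(prob_forecasts, observed):
    """Brier score for probabilistic binary-event forecasts.
    0 is a perfect score; 0.25 is the score of a constant 0.5 forecast.

    prob_forecasts: forecast probabilities in [0, 1].
    observed: matching 0/1 event outcomes."""
    p = np.asarray(prob_forecasts, dtype=float)
    o = np.asarray(observed, dtype=float)
    return float(np.mean((p - o) ** 2))
```

    For an ensemble, the forecast probability of an event (e.g. 24-h precipitation above a threshold) is typically taken as the fraction of members predicting it.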

  3. Interannual Tropical Rainfall Variability in General Circulation Model Simulations Associated with the Atmospheric Model Intercomparison Project.

    NASA Astrophysics Data System (ADS)

    Sperber, K. R.; Palmer, T. N.

    1996-11-01

    The interannual variability of rainfall over the Indian subcontinent, the African Sahel, and the Nordeste region of Brazil has been evaluated in 32 models for the period 1979-88 as part of the Atmospheric Model Intercomparison Project (AMIP). The interannual variations of Nordeste rainfall are the most readily captured, owing to the intimate link with Pacific and Atlantic sea surface temperatures. The precipitation variations over India and the Sahel are less well simulated. Additionally, an Indian monsoon wind shear index was calculated for each model. Evaluation of the interannual variability of the wind shear index over the summer monsoon region indicates that the models exhibit greater fidelity in capturing the large-scale dynamic fluctuations than the regional-scale rainfall variations. A rainfall/SST teleconnection quality control was used to objectively stratify model performance. Skill scores improved for those models that qualitatively simulated the observed rainfall/El Niño-Southern Oscillation SST correlation pattern. This subset of models also had a rainfall climatology in better agreement with observations, indicating a link between systematic model error and the ability to simulate interannual variations. A suite of six European Centre for Medium-Range Weather Forecasts (ECMWF) AMIP runs (differing only in their initial conditions) has also been examined. As observed, all-India rainfall was enhanced in 1988 relative to 1987 in each of these realizations. All-India rainfall variability during other years showed little or no predictability, possibly due to internal chaotic dynamics associated with intraseasonal monsoon fluctuations and/or unpredictable land-surface process interactions. The interannual variations of Nordeste rainfall were best represented. The State University of New York at Albany/National Center for Atmospheric Research Genesis model was run in five initial-condition realizations.
    In this model, the Nordeste rainfall variability was also best reproduced. However, for all regions the skill was less than that of the ECMWF model. The relationships of the all-India and Sahel rainfall/SST teleconnections with horizontal resolution, convection scheme closure, and numerics have been evaluated. Models with horizontal resolution of T42 or higher performed more poorly than lower-resolution models. The higher-resolution models were predominantly spectral. At low resolution, spectral versus gridpoint numerics performed with nearly equal verisimilitude. At low resolution, moisture convergence closure was slightly preferable to other convective closure techniques. At high resolution, the models that used moisture convergence closure performed very poorly, suggesting that moisture convergence may be problematic for models with horizontal resolution at or above T42.
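
    A common definition of such a monsoon wind shear index, in the spirit of the Webster-Yang index, is the area-mean 850-hPa zonal wind minus the area-mean 200-hPa zonal wind over the South Asian monsoon domain (roughly 0-20°N, 40-110°E). The exact domain and definition used in the study may differ; this is a generic sketch:

```python
import numpy as np

def wind_shear_index(u850, u200):
    """Webster-Yang-style monsoon shear index (m/s): area-mean zonal wind
    at 850 hPa minus that at 200 hPa. u850 and u200 are 2-D (lat, lon)
    fields already subset to the monsoon domain; a simple unweighted mean
    is used here for illustration (a real analysis would area-weight)."""
    return float(np.mean(u850) - np.mean(u200))
```

    Computed per summer and per model, the interannual series of this index is what is correlated against observed large-scale dynamic fluctuations.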

  4. Decadal shifts of East Asian summer monsoon in a climate model free of explicit GHGs and aerosols

    NASA Astrophysics Data System (ADS)

    Lin, Renping; Zhu, Jiang; Zheng, Fei

    2016-12-01

    The East Asian summer monsoon (EASM) experienced decadal transitions over the past few decades, and the associated "wetter-South-drier-North" shifts in rainfall patterns significantly affected social and economic development in China. Two viewpoints stand out to explain these decadal shifts, regarding them as either a result of internal variability of the climate system or a response to external forcings (e.g. greenhouse gases (GHGs) and anthropogenic aerosols). However, most climate models, for example the Atmospheric Model Intercomparison Project (AMIP)-type simulations and the Coupled Model Intercomparison Project (CMIP)-type simulations, fail to simulate the variation patterns, leaving the mechanisms responsible for these shifts open to dispute. In this study, we conducted a successful simulation of these decadal transitions in a coupled model free of explicit aerosol and GHG forcing, in which we applied ocean data assimilation. The associated decadal shifts of the three-dimensional spatial structure in the 1990s, including the eastward retreat and northward shift of the western Pacific subtropical high (WPSH) and the south-cool-north-warm pattern of the upper-level tropospheric temperature, were all well captured. Our simulation supports the argument that variations in the oceanic fields are the dominant factor responsible for the EASM decadal transitions.

  5. Decadal shifts of East Asian summer monsoon in a climate model free of explicit GHGs and aerosols

    PubMed Central

    Lin, Renping; Zhu, Jiang; Zheng, Fei

    2016-01-01

    The East Asian summer monsoon (EASM) experienced decadal transitions over the past few decades, and the associated "wetter-South-drier-North" shifts in rainfall patterns significantly affected social and economic development in China. Two viewpoints stand out to explain these decadal shifts, regarding them as either a result of internal variability of the climate system or a response to external forcings (e.g. greenhouse gases (GHGs) and anthropogenic aerosols). However, most climate models, for example the Atmospheric Model Intercomparison Project (AMIP)-type simulations and the Coupled Model Intercomparison Project (CMIP)-type simulations, fail to simulate the variation patterns, leaving the mechanisms responsible for these shifts open to dispute. In this study, we conducted a successful simulation of these decadal transitions in a coupled model free of explicit aerosol and GHG forcing, in which we applied ocean data assimilation. The associated decadal shifts of the three-dimensional spatial structure in the 1990s, including the eastward retreat and northward shift of the western Pacific subtropical high (WPSH) and the south-cool-north-warm pattern of the upper-level tropospheric temperature, were all well captured. Our simulation supports the argument that variations in the oceanic fields are the dominant factor responsible for the EASM decadal transitions. PMID:27934933

  6. Arctic Stratospheric Temperature In The Winters 1999/2000 and 2000/2001: A Quantitative Assessment and Microphysical Implications

    NASA Astrophysics Data System (ADS)

    Buss, S.; Wernli, H.; Peter, T.; Kivi, R.; Bui, T. P.; Kleinböhl, A.; Schiller, C.

    Stratospheric winter temperatures play a key role in the chain of microphysical and chemical processes that lead to the formation of polar stratospheric clouds (PSCs), chlorine activation, and eventually stratospheric ozone depletion. Here the temperature conditions during the Arctic winters 1999/2000 and 2000/2001 are quantitatively investigated using observed profiles of water vapour and nitric acid, temperatures from high-resolution radiosondes and aircraft observations, global ECMWF and UKMO analyses, and mesoscale model simulations over Scandinavia and Greenland. The ECMWF model resolves part of the gravity wave activity and generally agrees well with the observations. However, for the very cold temperatures near the ice frost point the ECMWF analyses have a warm bias of 1-6 K compared to radiosondes. For the mesoscale model HRM, this bias is generally reduced due to a more accurate representation of gravity waves. Quantitative estimates of the impact of the mesoscale temperature perturbations indicate that over Scandinavia and Greenland the wave-induced stratospheric cooling (as simulated by the HRM) only moderately affects the estimated chlorine activation and homogeneous NAT particle formation, but strongly enhances the potential for ice formation.

  7. Momentum and Energy Assessments with NASA and Other Model and Data Assimilation Systems

    NASA Technical Reports Server (NTRS)

    Salstein, David; Nelson, Peter; Hu, Wen-Jie

    2001-01-01

    Support from the NASA Global Modeling and Analysis Program has been used for the following research objectives: 1) the study of aspects of dynamics of torques and angular momentum based on the Goddard GEOS and other analyses; 2) the study of how models participating in the second Atmospheric Model Intercomparison Project (AMIP-2) have success in simulating certain large-scale quantities; 3) the study of the energetics and momentum cycle from certain runs from the Goddard Laboratory for Atmospheres and other models as well; 4) the assessment of changes in diabatic heating and related energetics in the community climate model (CCM3); 5) the analysis of modes of climate of the atmosphere, especially the Arctic and North Atlantic Oscillations. Further information on these endeavors will be provided in published works and the Final Report of the project.

  8. Momentum and Energy Assessments with NASA and Other Model and Data Assimilation Systems

    NASA Technical Reports Server (NTRS)

    Salstein, David; Nelson, Peter; Hu, Wen-Jie; Sud, Yogesh (Technical Monitor)

    2001-01-01

    Aspects of the angular momentum cycle, energetics, and related diagnostics from a number of models, including some from the Goddard Laboratory for Atmospheres and from the Atmospheric Model Intercomparison Project (AMIP), are examined. Torques that dynamically excite changes in angular momentum, including strong torques at mountains, were studied. The extent to which atmospheric mass redistribution during a strong weather event can notably change the angular momentum is also studied. For AMIP, there is a spread in angular momentum among the models, while the GLA model compares reasonably well with the other models in the diagnostics examined, namely angular momentum and water vapor. Trends and interannual variability in water vapor over a lengthy period were examined. The role of the diabatic heating components, especially latent heating, in the energy cycle, and the terms converting available potential energy to kinetic energy, among other parts of the energy cycle, are studied. Modes of climate of the atmosphere, especially the Arctic and North Atlantic Oscillations, are analyzed as well.

  9. Omens of coupled model biases in the CMIP5 AMIP simulations

    NASA Astrophysics Data System (ADS)

    Găinuşă-Bogdan, Alina; Hourdin, Frédéric; Traore, Abdoul Khadre; Braconnot, Pascale

    2018-02-01

    Despite decades of efforts and improvements in the representation of processes as well as in model resolution, current global climate models still suffer from a set of important, systematic biases in sea surface temperature (SST), not much different from the previous generation of climate models. Many studies have looked at errors in the wind field, cloud representation or oceanic upwelling in coupled models to explain the SST errors. In this paper we highlight the relationship between latent heat flux (LH) biases in forced atmospheric simulations and the SST biases models develop in coupled mode, at the scale of the entire intertropical domain. By analyzing 22 pairs of forced atmospheric and coupled ocean-atmosphere simulations from the CMIP5 database, we show a systematic, negative correlation between the spatial patterns of these two biases. This link between forced and coupled bias patterns is also confirmed by two sets of dedicated sensitivity experiments with the IPSL-CM5A-LR model. The analysis of the sources of the atmospheric LH bias pattern reveals that the near-surface wind speed bias dominates the zonal structure of the LH bias and that the near-surface relative humidity dominates the east-west contrasts.
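
    The systematic negative correlation between the forced-mode LH bias pattern and the coupled-mode SST bias pattern reported here is a centered spatial (pattern) correlation between two bias maps. A generic sketch of that statistic (a real analysis would also area-weight the grid cells):

```python
import numpy as np

def pattern_correlation(bias_a, bias_b):
    """Centered spatial pattern correlation between two bias maps,
    flattened over the analysis domain (unweighted, for illustration)."""
    a = np.asarray(bias_a, dtype=float).ravel()
    b = np.asarray(bias_b, dtype=float).ravel()
    a = a - a.mean()
    b = b - b.mean()
    return float(np.dot(a, b) / np.sqrt(np.dot(a, a) * np.dot(b, b)))
```

    Applied to each of the 22 model pairs, a consistently negative value of this statistic is what establishes the link between forced and coupled bias patterns.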

  10. Implementing a warm cloud microphysics parameterization for convective clouds in NCAR CESM

    NASA Astrophysics Data System (ADS)

    Shiu, C.; Chen, Y.; Chen, W.; Li, J. F.; Tsai, I.; Chen, J.; Hsu, H.

    2013-12-01

    Most cumulus convection schemes use simple empirical approaches to convert cloud liquid mass to rain water or cloud ice to snow, e.g. a constant autoconversion rate, and divide cloud liquid mass into cloud water and ice as a function of air temperature (e.g. the Zhang and McFarlane scheme in the NCAR CAM model). Few studies have tried to use cloud microphysical schemes to better simulate such precipitation processes in the convective schemes of global models (e.g. Lohmann [2008] and Song, Zhang, and Li [2012]). A two-moment warm-cloud parameterization (Chen and Liu [2004]) is implemented into the deep convection scheme of CAM5.2 of the CESM model to treat the conversion of cloud liquid water to rain water. Short-term AMIP-type global simulations are conducted to evaluate the possible impacts of this modified physical parameterization. Simulated results are further compared to observational results from the AMWG diagnostic package and CloudSat datasets. Several sensitivity tests regarding changes in cloud-top droplet concentration (a rough test of aerosol indirect effects) and in the detrained cloud size of convective cloud ice are also carried out to understand their possible impacts on the cloud and precipitation simulations.
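
    For contrast with the constant-rate approach criticized above, a widely used two-moment warm-rain autoconversion rate is that of Khairoutdinov and Kogan (2000), in which the rate depends on both cloud liquid mass and droplet number. Note the study itself implements the Chen and Liu (2004) scheme; the KK2000 rate below is only a stand-in illustration of the two-moment idea:

```python
def autoconversion_kk2000(qc, nc):
    """Khairoutdinov-Kogan (2000) warm-rain autoconversion rate (kg/kg/s).

    qc: cloud liquid water mixing ratio (kg/kg).
    nc: cloud droplet number concentration (cm^-3).
    Higher droplet numbers suppress autoconversion, which is how
    two-moment schemes admit an aerosol indirect effect."""
    return 1350.0 * qc ** 2.47 * nc ** (-1.79)
```

    The negative exponent on nc is exactly why the cloud-top droplet concentration sensitivity tests in the abstract serve as a rough probe of aerosol indirect effects.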

  11. Potential impacts of climate change on rainfall erosivity and water availability in China in the next 100 years

    Treesearch

    Ge Sun; Steven G. McNulty; Jennifer Moore; Corey Bunch; Jian Ni

    2002-01-01

    Soil erosion and water shortages threaten China’s social and economic development in the 21st century. This paper examines how projected climate change could affect soil erosion and water availability across China. We used both historical climate data (1961-1980) and the UKMO Hadley3 climate scenario (1960-2099) to drive regional hydrology and soil erosivity models....

  12. Rayleigh lidar observations of enhanced stratopause temperature over Gadanki (13.5° N, 79.2° E) during major stratospheric warming in 2006

    NASA Astrophysics Data System (ADS)

    Sridharan, S.; Sathishkumar, S.; Raghunath, K.

    2009-01-01

    Rayleigh lidar observations of temperature structure and gravity wave activity were carried out at Gadanki (13.5° N, 79.2° E) during January-February 2006. A major stratospheric warming event occurred at high latitudes during the end of January and early February. There was a sudden enhancement in the stratopause temperature over Gadanki coinciding with the onset date of the major high-latitude stratospheric warming event. The temperature enhancement persisted even after the end of the high-latitude major warming. During the same time, the UKMO (United Kingdom Meteorological Office) zonal mean temperature showed a similar warming episode at 10° N and a cooling episode at 60° N around the stratopause region. This could be due to ascending (descending) motions at high (low) latitudes above the critical level of planetary waves, where there is no planetary wave flux. The time variation of the gravity wave potential energy computed from the temperature perturbations over Gadanki shows variability at planetary wave periods, suggesting a non-linear interaction between gravity waves and planetary waves. A space-time analysis of UKMO temperature data at high and low latitudes shows the presence of similar periodicities in a planetary wave of zonal wavenumber 1.
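
    The gravity wave potential energy referred to here is conventionally computed from relative temperature perturbations as Ep = (1/2) (g/N)^2 <(T'/T)^2>. A minimal sketch, with an assumed typical stratospheric Brunt-Väisälä frequency (the study's exact N and averaging choices may differ):

```python
import numpy as np

def wave_potential_energy(T_prime, T_mean, N=0.02, g=9.81):
    """Gravity-wave potential energy per unit mass (J/kg) from temperature
    perturbations: Ep = 0.5 * (g/N)^2 * <(T'/T_mean)^2>.

    T_prime: perturbation temperatures (K) over the altitude range.
    T_mean: background temperature (K).
    N: Brunt-Vaisala frequency (s^-1), an assumed stratospheric value."""
    frac = np.asarray(T_prime, dtype=float) / T_mean
    return float(0.5 * (g / N) ** 2 * np.mean(frac ** 2))
```

    Because Ep is quadratic in T', time series of Ep highlight modulation of wave amplitude, which is how the planetary-wave-period variability in the abstract is diagnosed.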

  13. The GFDL global atmosphere and land model AM4.0/LM4.0: 2. Model description, sensitivity studies, and tuning strategies

    USGS Publications Warehouse

    Zhao, M.; Golaz, J.-C.; Held, I. M.; Guo, H.; Balaji, V.; Benson, R.; Chen, J.-H.; Chen, X.; Donner, L. J.; Dunne, J. P.; Dunne, Krista A.; Durachta, J.; Fan, S.-M.; Freidenreich, S. M.; Garner, S. T.; Ginoux, P.; Harris, L. M.; Horowitz, L. W.; Krasting, J. P.; Langenhorst, A. R.; Liang, Z.; Lin, P.; Lin, S.-J.; Malyshev, S. L.; Mason, E.; Milly, Paul C.D.; Ming, Y.; Naik, V.; Paulot, F.; Paynter, D.; Phillipps, P.; Radhakrishnan, A.; Ramaswamy, V.; Robinson, T.; Schwarzkopf, D.; Seman, C. J.; Shevliakova, E.; Shen, Z.; Shin, H.; Silvers, L.; Wilson, J. R.; Winton, M.; Wittenberg, A. T.; Wyman, B.; Xiang, B.

    2018-01-01

    In Part 2 of this two‐part paper, documentation is provided of key aspects of a version of the AM4.0/LM4.0 atmosphere/land model that will serve as a base for a new set of climate and Earth system models (CM4 and ESM4) under development at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL). The quality of the simulation in AMIP (Atmospheric Model Intercomparison Project) mode has been provided in Part 1. Part 2 provides documentation of key components and some sensitivities to choices of model formulation and values of parameters, highlighting the convection parameterization and orographic gravity wave drag. The approach taken to tune the model's clouds to observations is a particular focal point. Care is taken to describe the extent to which aerosol effective forcing and Cess sensitivity have been tuned through the model development process, both of which are relevant to the ability of the model to simulate the evolution of temperatures over the last century when coupled to an ocean model.

  14. The GFDL Global Atmosphere and Land Model AM4.0/LM4.0: 2. Model Description, Sensitivity Studies, and Tuning Strategies

    DOE PAGES

    Zhao, Ming; Golaz, J. -C.; Held, I. M.; ...

    2018-02-19

    Here, in Part 2 of this two-part paper, documentation is provided of key aspects of a version of the AM4.0/LM4.0 atmosphere/land model that will serve as a base for a new set of climate and Earth system models (CM4 and ESM4) under development at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL). The quality of the simulation in AMIP (Atmospheric Model Intercomparison Project) mode has been provided in Part 1. Part 2 provides documentation of key components and some sensitivities to choices of model formulation and values of parameters, highlighting the convection parameterization and orographic gravity wave drag. The approach taken to tune the model's clouds to observations is a particular focal point. Care is taken to describe the extent to which aerosol effective forcing and Cess sensitivity have been tuned through the model development process, both of which are relevant to the ability of the model to simulate the evolution of temperatures over the last century when coupled to an ocean model.

  15. The GFDL Global Atmosphere and Land Model AM4.0/LM4.0: 2. Model Description, Sensitivity Studies, and Tuning Strategies

    NASA Astrophysics Data System (ADS)

    Zhao, M.; Golaz, J.-C.; Held, I. M.; Guo, H.; Balaji, V.; Benson, R.; Chen, J.-H.; Chen, X.; Donner, L. J.; Dunne, J. P.; Dunne, K.; Durachta, J.; Fan, S.-M.; Freidenreich, S. M.; Garner, S. T.; Ginoux, P.; Harris, L. M.; Horowitz, L. W.; Krasting, J. P.; Langenhorst, A. R.; Liang, Z.; Lin, P.; Lin, S.-J.; Malyshev, S. L.; Mason, E.; Milly, P. C. D.; Ming, Y.; Naik, V.; Paulot, F.; Paynter, D.; Phillipps, P.; Radhakrishnan, A.; Ramaswamy, V.; Robinson, T.; Schwarzkopf, D.; Seman, C. J.; Shevliakova, E.; Shen, Z.; Shin, H.; Silvers, L. G.; Wilson, J. R.; Winton, M.; Wittenberg, A. T.; Wyman, B.; Xiang, B.

    2018-03-01

    In Part 2 of this two-part paper, documentation is provided of key aspects of a version of the AM4.0/LM4.0 atmosphere/land model that will serve as a base for a new set of climate and Earth system models (CM4 and ESM4) under development at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL). The quality of the simulation in AMIP (Atmospheric Model Intercomparison Project) mode has been provided in Part 1. Part 2 provides documentation of key components and some sensitivities to choices of model formulation and values of parameters, highlighting the convection parameterization and orographic gravity wave drag. The approach taken to tune the model's clouds to observations is a particular focal point. Care is taken to describe the extent to which aerosol effective forcing and Cess sensitivity have been tuned through the model development process, both of which are relevant to the ability of the model to simulate the evolution of temperatures over the last century when coupled to an ocean model.

  16. The GFDL Global Atmosphere and Land Model AM4.0/LM4.0: 2. Model Description, Sensitivity Studies, and Tuning Strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Ming; Golaz, J. -C.; Held, I. M.

    Here, in Part 2 of this two-part paper, documentation is provided of key aspects of a version of the AM4.0/LM4.0 atmosphere/land model that will serve as a base for a new set of climate and Earth system models (CM4 and ESM4) under development at NOAA's Geophysical Fluid Dynamics Laboratory (GFDL). The quality of the simulation in AMIP (Atmospheric Model Intercomparison Project) mode has been provided in Part 1. Part 2 provides documentation of key components and some sensitivities to choices of model formulation and values of parameters, highlighting the convection parameterization and orographic gravity wave drag. The approach taken to tune the model's clouds to observations is a particular focal point. Care is taken to describe the extent to which aerosol effective forcing and Cess sensitivity have been tuned through the model development process, both of which are relevant to the ability of the model to simulate the evolution of temperatures over the last century when coupled to an ocean model.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yuan; Ma, Po-Lun; Jiang, Jonathan H.

    The attribution of widely observed shifts in precipitation extremes to different forcing agents is a critical issue for understanding changes in the hydrological cycle. To compare aerosol and greenhouse-gas effects on the historical trends of precipitation intensity, we performed AMIP-style NCAR/DOE CAM5 model simulations from 1950 to 2005 with and without anthropogenic aerosol forcings. Precipitation rates at every time step in CAM5 are used to construct precipitation probability distribution functions. By contrasting the two sets of experiments, we found that the global warming induced by accumulating greenhouse gases is responsible for the changes in precipitation intensity at the global scale. However, regionally over Eastern China, the drastic increase in anthropogenic aerosols primarily accounts for the observed suppression of light precipitation since the 1950s. Compared with aerosol radiative effects, the aerosol microphysical effect plays the predominant role in determining the historical trends of precipitation intensity in Eastern China.
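As an illustration of the distribution-building step described above, the following minimal Python sketch constructs empirical precipitation-intensity PDFs from per-time-step rain rates and contrasts a control run with a hypothetical "aerosol" run in which light rain is partially suppressed. All data are synthetic and the bin edges are arbitrary illustrative choices, not the study's configuration.

```python
import numpy as np

def precip_pdf(rates_mm_per_day, bins):
    """Empirical PDF of precipitation intensity from per-time-step rates."""
    hist, _ = np.histogram(rates_mm_per_day, bins=bins)
    return hist / hist.sum()

# Synthetic rates: a control run, and an "aerosol" run in which
# light precipitation (< 1 mm/day) is partially suppressed.
rng = np.random.default_rng(0)
control = rng.gamma(shape=0.5, scale=4.0, size=100_000)
aerosol = control.copy()
light = aerosol < 1.0
aerosol[light] *= rng.uniform(0.0, 0.5, size=light.sum())

bins = np.array([0.0, 0.1, 1.0, 5.0, 10.0, 50.0, 500.0])
pdf_ctl = precip_pdf(control, bins)
pdf_aer = precip_pdf(aerosol, bins)
# Suppression shifts probability mass toward the lightest bin.
```

Contrasting the two PDFs bin by bin is then the basis for attributing intensity shifts to the forcing that differs between the experiments.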

  18. An Intercomparison of Lidar Ozone and Temperature Measurements From the SOLVE Mission With Predicted Model Values

    NASA Technical Reports Server (NTRS)

    Burris, John; McGee, Thomas J.; Hoegy, Walt; Lait, Leslie; Sumnicht, Grant; Twigg, Larry; Heaps, William

    2000-01-01

    Temperature profiles acquired by Goddard Space Flight Center's AROTEL lidar during the SOLVE mission onboard NASA's DC-8 are compared with predicted values from several atmospheric models (DAO, NCEP and UKMO). The variability in the differences between measured and calculated temperature fields was approximately 5 K. Retrieved temperatures within the polar vortex showed large regions that were significantly colder than predicted by the atmospheric models.

  19. DYNAMICO, an atmospheric dynamical core for high-performance climate modeling

    NASA Astrophysics Data System (ADS)

    Dubos, Thomas; Meurdesoif, Yann; Spiga, Aymeric; Millour, Ehouarn; Fita, Lluis; Hourdin, Frédéric; Kageyama, Masa; Traore, Abdoul-Khadre; Guerlet, Sandrine; Polcher, Jan

    2017-04-01

    Institut Pierre Simon Laplace has developed a highly scalable atmospheric dynamical core, DYNAMICO, based on energy-conserving finite-difference/finite-volume numerics on a quasi-uniform icosahedral-hexagonal mesh. Scalability is achieved by combining hybrid MPI/OpenMP parallelism with asynchronous I/O. This dynamical core has been coupled to radiative transfer physics tailored to the atmosphere of Saturn, allowing unprecedented simulations of the climate of this giant planet. For terrestrial climate studies, DYNAMICO is being integrated into the IPSL Earth System Model IPSL-CM. Preliminary aquaplanet and AMIP-style simulations yield reasonable results when compared to outputs from IPSL-CM5. The observed performance suggests that, relative to IPSL-CM CMIP5 simulations, an order of magnitude may be gained either in simulation duration or in resolution. Longer simulations would be of interest for the study of paleoclimate, while higher resolution could improve certain aspects of the modeled climate such as extreme events, as will be explored in the HighResMIP project. Following IPSL's strategic vision of building a unified global-regional modelling system, a fully compressible, non-hydrostatic prototype of DYNAMICO has been developed, enabling future convection-resolving simulations. Work supported by ANR project "HEAT", grant number CE23_2014_HEAT. Reference: Dubos, T., Dubey, S., Tort, M., Mittal, R., Meurdesoif, Y., and Hourdin, F.: DYNAMICO-1.0, an icosahedral hydrostatic dynamical core designed for consistency and versatility, Geosci. Model Dev., 8, 3131-3150, doi:10.5194/gmd-8-3131-2015, 2015.

  20. Improving biomass burning pollution predictions in Singapore using AERONET and Lidar observations.

    NASA Astrophysics Data System (ADS)

    Hardacre, Catherine; Chew, Boon Ning; Gan, Christopher; Burgin, Laura; Hort, Matthew; Lee, Shao Yi; Shaw, Felicia; Witham, Claire

    2016-04-01

    Every year millions of people are affected by poor air quality from trans-boundary smoke haze emitted from large-scale biomass burning in Asia. These fires are a particular problem in the Indonesian regions of Sumatra and Kalimantan, where peat fires, lit to clear land for oil palm plantations and agriculture, typically result in high levels of particulate matter (PM) emissions. In June 2013 and from August-October 2015, the combination of widespread burning and meteorological and climatological conditions resulted in severe air pollution throughout Southeast Asia. The Met Office of the United Kingdom (UKMO) and the Hazard and Risk Impact Assessment Unit of the Meteorological Service of Singapore (MSS) have developed a quantitative haze forecast to provide a reliable, routine warning of haze events in the Singapore region. The forecast system uses the UKMO's Lagrangian particle dispersion model NAME (Numerical Atmosphere-dispersion Modelling Environment) in combination with high-resolution, satellite-based emission data from the Global Fire Assimilation System (GFAS). The buoyancy of biomass burning smoke and its rise through the atmosphere have a large impact on the amount of air pollution at the Earth's surface. This is important in Singapore, which is affected by pollution that has travelled long distances and whose vertical distribution is influenced by meteorology. The vertical distribution of atmospheric aerosol can be observed by lidar, which provides information about haze plume structure. NAME output from the two severe haze periods of June 2013 and August-October 2015 was compared with observations of total-column aerosol optical depth (AOD) from AERONET stations in Singapore and the surrounding region, as well as vertically resolved lidar data from a station maintained by MSS and from MPLNET.
Comparing total-column and vertically resolved AOD observations with NAME output indicates that the model underestimates PM concentrations throughout the column. This discrepancy may arise from (i) PM emissions that are too low, (ii) uncertainties in the long-range transport of PM, or (iii) the role of the boundary layer in NWP, all of which are being explored at UKMO and MSS. This study gives a more comprehensive evaluation of the model's performance and indicates that vertically resolved AOD data may be useful as a model input for the haze forecast system.
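The comparison of total-column and vertically resolved AOD rests on the fact that column AOD is the vertical integral of the extinction profile a lidar retrieves. A minimal sketch, with an invented Gaussian haze layer standing in for real lidar data:

```python
import numpy as np

def column_aod(extinction_per_m, height_m):
    """Trapezoidal integral of an extinction profile (1/m) over height (m),
    giving the dimensionless total-column aerosol optical depth."""
    ext = np.asarray(extinction_per_m)
    z = np.asarray(height_m)
    return float(np.sum(0.5 * (ext[1:] + ext[:-1]) * np.diff(z)))

# Invented haze layer: extinction peaking near 1.5 km altitude.
z = np.linspace(0.0, 5000.0, 251)                    # m
ext = 4e-4 * np.exp(-(((z - 1500.0) / 600.0) ** 2))  # 1/m
aod = column_aod(ext, z)                             # roughly 0.4 here
```

A model column that underestimates AOD against both a sun photometer (total column) and this integral of the lidar profile points to a deficit distributed through the column rather than at one level.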

  1. Global Ocean Evaporation Increases Since 1960 in Climate Reanalyses: How Accurate Are They?

    NASA Astrophysics Data System (ADS)

    Robertson, F. R.; Roberts, J. B.; Bosilovich, M. G.

    2016-12-01

    Evaporation from the world's oceans constitutes the largest component of the global water balance. It is important not only as the ultimate source of moisture tied to the radiative processes determining Earth's energy balance, but also for freshwater availability over land, governing the habitability of the planet. The question we address is whether, by using conventional observations alone, the problematic stepwise changes in model bias correction imposed by the continually changing satellite data record can be avoided and a more accurate estimate of evaporation changes obtained over the past six decades, including the satellite era from 1979 to the present. Three climate reanalyses are used: the NOAA ESRL 20CR V2, the ECMWF ERA-20C, and the JRA-55C. In contrast to conventional reanalyses, these reduced-observation reanalyses are run with fewer constraints, using more temporally homogeneous records (SSTs, sea ice, and radiative forcing, as in AMIP experiments) plus minimal additional observations of surface pressure and marine reports. An ensemble of AMIP-style experiments provides an important comparison. Though limited in temporal extent, state-of-the-art satellite retrievals from the SeaFlux project and 10 m neutral winds from Remote Sensing Systems analysis of passive microwave measurements represent observationally driven estimates of evaporation and near-surface wind speed. ENSO-related changes in evaporation dominate interannual timescales, though over multi-decadal periods we find increasing evaporation trends approaching the Clausius-Clapeyron rate of 6% per degree of SST rise. This contrasts with the more muted changes in the AMIP experiments. Near-surface relative humidity and stability changes generally act to counterbalance the effects of SST alone, but wind speed changes are a chief driver of the evaporation changes.
Multi-decadal signals related to Pacific and Atlantic climate variability are prominent; however, there are notable signatures of wind data issues, particularly over the southern Indian Ocean. Though the passive microwave record extends only from 1988, the associated wind speed measurements confirm the basic structure of wind-driven evaporation trends in recent decades.
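The quoted Clausius-Clapeyron rate of roughly 6% per degree can be checked with a short calculation using the August-Roche-Magnus approximation for saturation vapor pressure; the specific formula constants and reference temperature below are standard illustrative choices, not taken from the study.

```python
import math

def saturation_vapor_pressure_hpa(t_celsius):
    """August-Roche-Magnus approximation (over liquid water), in hPa."""
    return 6.1094 * math.exp(17.625 * t_celsius / (t_celsius + 243.04))

# Fractional increase in saturation vapor pressure per 1 K of warming,
# evaluated near a typical tropical SST.
t = 25.0
cc_rate = saturation_vapor_pressure_hpa(t + 1.0) / saturation_vapor_pressure_hpa(t) - 1.0
# cc_rate comes out near 0.06, i.e. ~6% per degree, matching the rate above.
```

Evaporation trends at or near this rate indicate the moisture supply is tracking the thermodynamic limit set by the warming sea surface.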

  2. Projected Future Vegetation Changes for the Northwest United States and Southwest Canada at a Fine Spatial Resolution Using a Dynamic Global Vegetation Model.

    PubMed

    Shafer, Sarah L; Bartlein, Patrick J; Gray, Elizabeth M; Pelltier, Richard T

    2015-01-01

    Future climate change may significantly alter the distributions of many plant taxa. The effects of climate change may be particularly large in mountainous regions where climate can vary significantly with elevation. Understanding potential future vegetation changes in these regions requires methods that can resolve vegetation responses to climate change at fine spatial resolutions. We used LPJ, a dynamic global vegetation model, to assess potential future vegetation changes for a large topographically complex area of the northwest United States and southwest Canada (38.0-58.0°N latitude by 136.6-103.0°W longitude). LPJ is a process-based vegetation model that mechanistically simulates the effect of changing climate and atmospheric CO2 concentrations on vegetation. It was developed and has been mostly applied at spatial resolutions of 10-minutes or coarser. In this study, we used LPJ at a 30-second (~1-km) spatial resolution to simulate potential vegetation changes for 2070-2099. LPJ was run using downscaled future climate simulations from five coupled atmosphere-ocean general circulation models (CCSM3, CGCM3.1(T47), GISS-ER, MIROC3.2(medres), UKMO-HadCM3) produced using the A2 greenhouse gases emissions scenario. Under projected future climate and atmospheric CO2 concentrations, the simulated vegetation changes result in the contraction of alpine, shrub-steppe, and xeric shrub vegetation across the study area and the expansion of woodland and forest vegetation. Large areas of maritime cool forest and cold forest are simulated to persist under projected future conditions. The fine spatial-scale vegetation simulations resolve patterns of vegetation change that are not visible at coarser resolutions and these fine-scale patterns are particularly important for understanding potential future vegetation changes in topographically complex areas.

  3. Satellite Data Product and Data Dissemination Updates for the SPoRT Sea Surface Temperature Composite Product

    NASA Technical Reports Server (NTRS)

    Zavodsky, Bradley; LaFontaine, Frank; Berndt, Emily; Meyer, Paul; Jedlovec, Gary

    2017-01-01

    The SPoRT SST composite is a reliable and robust high-resolution product generated twice per day in near real time. It incorporates the highest-quality satellite data from infrared imagers together with global analyses from NESDIS and UKMO. Recent updates include the incorporation of VIIRS data to extend the life of the product beyond the MODIS era. A number of users incorporate the product into their decision support systems (DSS).

  4. Patterns of tropical Pacific convection anomalies and associated extratropical wave trains in AMIP5

    NASA Astrophysics Data System (ADS)

    Ding, Shuoyi; Chen, Wen; Graf, Hans-F.; Guo, Yuanyuan

    2018-05-01

    In this paper, the performance of 18 Coupled Model Intercomparison Project Phase 5 (CMIP5) models forced by observed SSTs in simulating tropical Pacific convective variations and the associated atmospheric responses in the extratropics is assessed. The multi-model ensemble mean of the 18 CMIP5 models shows that the five major patterns of tropical Pacific convection anomalies in winter can indeed be well reproduced; however, compared with observations, the simulated extratropical responses for each pattern show some deficiencies, except for the La Niña pattern. We defined an optimized subset of models (ACCESS1.0, CanAM4, CCSM4, CMCC-CM, HadGEM2-A, MPI-ESM-MR) that perform well for tropical Pacific deep convection, according to a ranking of model skill scores. These models exhibit convection anomaly patterns nearly identical to observations in both amplitude and spatial structure, which may improve the representation of extratropical teleconnections with the tropical Pacific, especially for the CP El Niño (CPEN), EP El Niño (EPEN), and western CP (W-CP) patterns. Both the evident atmospheric anomalies of the CPEN and EPEN patterns over the NA/E sector and the northeastward-propagating wave trains of the W-CP pattern are quite well simulated by these high-skill models.
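The skill-score ranking used to select well-performing models is not specified in the abstract; a common minimal ingredient of such scores is the centered pattern correlation between simulated and observed anomaly fields. A hypothetical sketch with synthetic fields and placeholder model names:

```python
import numpy as np

def pattern_skill(model_field, obs_field):
    """Centered spatial (pattern) correlation between two 2-D fields."""
    m = model_field - model_field.mean()
    o = obs_field - obs_field.mean()
    return float((m * o).sum() / np.sqrt((m ** 2).sum() * (o ** 2).sum()))

# Placeholder "models": synthetic fields at increasing distance from obs.
rng = np.random.default_rng(1)
obs = rng.standard_normal((20, 30))
models = {
    "model_a": obs + 0.3 * rng.standard_normal((20, 30)),  # close to obs
    "model_b": obs + 1.5 * rng.standard_normal((20, 30)),  # noisier
    "model_c": rng.standard_normal((20, 30)),              # unrelated
}
ranking = sorted(models, key=lambda k: pattern_skill(models[k], obs), reverse=True)
```

Ranking models by such a statistic (often combined with an amplitude term, as in a Taylor-diagram score) is one way a "well-performing subset" can be defined.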

  5. Spectral cumulus parameterization based on cloud-resolving model

    NASA Astrophysics Data System (ADS)

    Baba, Yuya

    2018-02-01

    We have developed a spectral cumulus parameterization using a cloud-resolving model. It includes a new parameterization of the entrainment rate, derived from analysis of the cloud properties obtained from a cloud-resolving model simulation and valid for both shallow and deep convection. The new scheme was examined in a single-column model experiment and compared with the existing parameterization of Gregory (2001, Q J R Meteorol Soc 127:53-72) (GR scheme). The results showed that the GR scheme simulated shallower and more dilute convection than the new scheme. To further validate the physical performance of the parameterizations, Atmospheric Model Intercomparison Project (AMIP) experiments were performed and the results compared with reanalysis data. The new scheme performed better than the GR scheme in terms of the mean state and variability of the atmospheric circulation: it reduced the positive precipitation bias in the western Pacific region and the positive bias in outgoing shortwave radiation over the ocean. The new scheme also better simulated features of convectively coupled equatorial waves and the Madden-Julian oscillation. These improvements derive from the modified entrainment-rate parameterization, which suppresses an excessive increase of entrainment and thus an excessive increase of low-level clouds.

  6. Wind-Stress Simulations and Equatorial Dynamics in an AGCM. Part 1; Basic Results from a 1979-1999 Forced SST Experiment

    NASA Technical Reports Server (NTRS)

    Bacmeister, Julio T.; Suarez, Max J.; Einaudi, Franco (Technical Monitor)

    2001-01-01

    This is the first of a two-part study examining the connection between the equatorial momentum budget in an AGCM (Atmospheric General Circulation Model) and simulated equatorial surface wind stresses over the Pacific. The AGCM used in this study forms part of a newly developed coupled forecasting system used at NASA's Seasonal-to-Interannual Prediction Project. Here we describe the model and present results from a 20-year (1979-1999) AMIP-type experiment forced with observed SSTs (Sea Surface Temperatures). Model results are compared with available observational data sets. The climatological pattern of extratropical planetary waves, as well as their ENSO-related variability, is found to agree quite well with re-analysis estimates. The model's surface wind stress is examined in detail and reveals a reasonable overall simulation of seasonal and interannual variability, as well as of seasonal mean distributions. However, an excessive annual oscillation in wind stress over the equatorial central Pacific is found. We examine the model's divergent circulation over the tropical Pacific and compare it with estimates based on re-analysis data. These comparisons are generally good, but reveal excessive upper-level convergence in the central Pacific. In Part II of this study, a direct examination of individual terms in the AGCM's momentum budget is presented. We relate the results of this analysis to the model's simulation of surface wind stress.

  7. Scaling a Convection-Resolving RCM to Near-Global Scales

    NASA Astrophysics Data System (ADS)

    Leutwyler, D.; Fuhrer, O.; Chadha, T.; Kwasniewski, G.; Hoefler, T.; Lapillonne, X.; Lüthi, D.; Osuna, C.; Schar, C.; Schulthess, T. C.; Vogt, H.

    2017-12-01

    In recent years, the first decade-long kilometer-scale-resolution RCM simulations have been performed on continental-scale computational domains. However, the planet Earth is still an order of magnitude larger, so the computational implications of performing global climate simulations at this resolution are challenging. We explore the gap between currently established RCM simulations and global simulations by scaling the GPU-accelerated version of the COSMO model to a near-global computational domain. To this end, the evolution of an idealized moist baroclinic wave was simulated over the course of 10 days with a grid spacing as fine as 930 m. The computational mesh employs 36'000 x 16'001 x 60 grid points and covers 98.4% of the planet's surface. The code shows perfect weak scaling up to 4'888 nodes of the Piz Daint supercomputer and yields 0.043 simulated years per day (SYPD), approximately one seventh of the 0.2-0.3 SYPD required to conduct AMIP-type simulations. At half the resolution (1.9 km), however, we observed 0.23 SYPD. Besides the formation of frontal precipitating systems containing embedded explicitly resolved convective motions, the simulations reveal a secondary instability that leads to cut-off warm-core cyclonic vortices in the cyclone's core once the grid spacing is refined to the kilometer scale. The explicit representation of embedded moist convection and of the previously unresolved instabilities exhibits physically different behavior compared with coarser-resolution simulations. The study demonstrates that global climate simulations at kilometer-scale resolution are imminent, and it serves as a baseline benchmark for global climate model applications and future exascale supercomputing systems.
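The throughput figures above imply a concrete gap that can be checked with simple arithmetic; all numbers below are taken directly from the abstract.

```python
# Throughput-gap arithmetic using the figures quoted in the abstract.
achieved_sypd = 0.043          # at 930 m grid spacing on 4'888 nodes
required_sypd = (0.2, 0.3)     # typical AMIP-type production throughput
gap = tuple(r / achieved_sypd for r in required_sypd)   # ~4.7x to ~7.0x

# Doubling the grid spacing to 1.9 km gave 0.23 SYPD, a ~5.3x speedup
# (an idealized estimate would be up to 8x: 4x fewer columns, 2x time step).
speedup = 0.23 / 0.043
```

The roughly 5x-7x shortfall at 930 m is what motivates describing kilometer-scale global climate simulation as "imminent" rather than already practical.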

  8. Anticipating Installation Natural Resource Climate Change Concerns: The Data

    DTIC Science & Technology

    2013-10-15

    period of development (1 to 2 decades) include: 1. CM2.1 (GFDL model — NOAA Princeton) 2. E-H and E-R (NASA GISS) 3. HadGEM1 (Hadley UKMO) 4. CGCM3...sixth GCM, the Australian CSIRO model, to increase the sample. Thus the adopted GCMs include: 1. GFDL model (NOAA Princeton) 6. GISS Model e (NASA...Sciences Laboratory (USDA 2012) created data that would be useful to the related threshold project. These US Forest Service data were similar to those of

  9. Sea surface temperature measurements by the along-track scanning radiometer on the ERS 1 satellite: Early results

    NASA Astrophysics Data System (ADS)

    Mutlow, C. T.; ZáVody, A. M.; Barton, I. J.; Llewellyn-Jones, D. T.

    1994-11-01

    The along-track scanning radiometer (ATSR) was launched in July 1991 on the European Space Agency's first remote sensing satellite, ERS 1. An initial analysis of ATSR data demonstrates that sea surface temperature (SST) can be measured from space with very high accuracy. Comparisons of simultaneous SST measurements from ATSR and from a ship-borne radiometer show that they agree to within 0.3°C. To assess data consistency, a complementary analysis of ATSR SST data was also carried out: the ATSR global SST field was compared on a daily basis with the daily SST analysis of the United Kingdom Meteorological Office (UKMO). The ATSR global field is consistently within 1.0°C of the UKMO analysis. Also, to demonstrate the benefits of along-track-scanning SST determination, the ATSR SST data were compared with high-quality bulk temperature observations from drifting buoys. The likely causes of the differences between ATSR and the bulk temperature data are briefly discussed. These results provide early confidence in the quantitative benefit of ATSR's two-angle view of the Earth and its high radiometric performance, and show a significant advance over the data obtained from other spaceborne sensors. It should be noted that these measurements were made at a time when the atmosphere was severely contaminated with volcanic aerosol particles, which degrade infrared measurements of the Earth's surface made from space.
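Satellite-versus-in-situ validation of the kind described above typically reduces to bias and RMS difference over a set of matched observation pairs. A minimal sketch with synthetic matchups (the numbers are invented, not the ATSR results):

```python
import numpy as np

def match_stats(satellite_sst, reference_sst):
    """Bias and RMS difference (deg C) for matched SST pairs."""
    diff = np.asarray(satellite_sst) - np.asarray(reference_sst)
    return float(diff.mean()), float(np.sqrt((diff ** 2).mean()))

# Hypothetical matchup set: retrievals scattered around in-situ "truth"
# with a small cool bias, sketching the validation approach.
rng = np.random.default_rng(2)
reference = 15.0 + 5.0 * rng.random(500)
satellite = reference - 0.1 + 0.2 * rng.standard_normal(500)
bias, rmsd = match_stats(satellite, reference)
```

Statements such as "agree to within 0.3°C" summarize exactly these matchup statistics over the validation set.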

  10. The Diversity of Cloud Responses to Twentieth Century Sea Surface Temperatures

    NASA Astrophysics Data System (ADS)

    Silvers, Levi G.; Paynter, David; Zhao, Ming

    2018-01-01

    Low-level clouds are shown to be the conduit between the observed sea surface temperatures (SST) and large decadal fluctuations of the top of the atmosphere radiative imbalance. The influence of low-level clouds on the climate feedback is shown for global mean time series as well as particular geographic regions. The changes of clouds are found to be important for a midcentury period of high sensitivity and a late century period of low sensitivity. These conclusions are drawn from analysis of amip-piForcing simulations using three atmospheric general circulation models (AM2.1, AM3, and AM4.0). All three models confirm the importance of the relationship between the global climate sensitivity and the eastern Pacific trends of SST and low-level clouds. However, this work argues that the variability of the climate feedback parameter is not driven by stratocumulus-dominated regions in the eastern ocean basins, but rather by the cloudy response in the rest of the tropics.

  11. Reconstructing Holocene climate using a climate model: Model strategy and preliminary results

    NASA Astrophysics Data System (ADS)

    Haberkorn, K.; Blender, R.; Lunkeit, F.; Fraedrich, K.

    2009-04-01

    An Earth system model of intermediate complexity (Planet Simulator; PlaSim) is used to reconstruct Holocene climate based on proxy data. The Planet Simulator is a user-friendly general circulation model (GCM) suitable for palaeoclimate research. Its easy handling and modular structure allow fast, problem-dependent simulations. The spectral model is based on the moist primitive equations, conserving momentum, mass, energy, and moisture. Besides the atmospheric part, a mixed-layer ocean with sea ice and a land surface with biosphere are included. The present-day climate of PlaSim, based on an AMIP II control run (T21/10L resolution), shows reasonable agreement with ERA-40 reanalysis data. Combining PlaSim with a socio-technological model (GLUES; DFG priority project INTERDYNAMIK) improves knowledge of the shift from hunting-gathering to agropastoral subsistence societies. This is achieved by a data assimilation approach that incorporates proxy time series into PlaSim to initialize palaeoclimate simulations during the Holocene. The following strategy is applied: the sensitivities of the terrestrial PlaSim climate are determined with respect to sea surface temperature (SST) anomalies, with a focus on the impact of regionally varying SST in both the tropics and the Northern Hemisphere mid-latitudes. The inverse of these sensitivities is then used to determine the SST conditions necessary for nudging land and coastal proxy climates. Preliminary results indicate the potential, the uncertainty, and the limitations of the method.
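If the sensitivities are treated as a linearized map S from regional SST anomalies to terrestrial proxy-site anomalies, "inverting the sensitivities" amounts to a least-squares solve for the SST forcing that reproduces a proxy-derived target. A toy sketch with invented numbers (not PlaSim output):

```python
import numpy as np

# Invented linearized sensitivities: rows are proxy sites, columns are
# (tropical, midlatitude) SST anomaly regions; response r = S @ sst.
S = np.array([[0.8, 0.1],
              [0.2, 0.6],
              [0.5, 0.5]])
target = np.array([0.5, 0.4, 0.5])   # proxy-derived climate anomalies (K)

# Least-squares inverse: the SST anomalies needed to nudge the model's
# land and coastal climates toward the proxy target.
sst_needed, *_ = np.linalg.lstsq(S, target, rcond=None)
residual = S @ sst_needed - target   # misfit the linear map cannot remove
```

With more proxy sites than SST regions the system is overdetermined, so the residual quantifies how consistently the proxies can be explained by the chosen SST patterns.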

  12. Impact of Soil Moisture Initialization on Seasonal Weather Prediction

    NASA Technical Reports Server (NTRS)

    Koster, Randal D.; Suarez, Max J.; Houser, Paul (Technical Monitor)

    2002-01-01

    The potential role of soil moisture initialization in seasonal forecasting is illustrated through ensembles of simulations with the NASA Seasonal-to-Interannual Prediction Project (NSIPP) model. For each boreal summer during 1997-2001, we generated two 16-member ensembles of 3-month simulations. The first, "AMIP-style" ensemble establishes the degree to which a perfect prediction of SSTs would contribute to the seasonal prediction of precipitation and temperature over continents. The second ensemble is identical to the first, except that the land surface is also initialized with "realistic" soil moisture contents through the continuous prior application (within GCM simulations leading up to the start of the forecast period) of a daily observational precipitation data set and the associated avoidance of model drift through the scaling of all surface prognostic variables. A comparison of the two ensembles shows that soil moisture initialization has a statistically significant impact on summertime precipitation and temperature over only a handful of continental regions. These regions agree, to first order, with regions that satisfy three conditions: (1) a tendency toward large initial soil moisture anomalies, (2) a strong sensitivity of evaporation to soil moisture, and (3) a strong sensitivity of precipitation to evaporation. The degree to which the initialization improves forecasts relative to observations is mixed, reflecting a critical need for the continued development of model parameterizations and data analysis strategies.

  13. The Impacts of Amazon Deforestation on Pacific Climate

    NASA Astrophysics Data System (ADS)

    Lindsey, Leah

    Variability in eastern Pacific sea surface temperatures (SSTs) associated with the El Nino Southern Oscillation is known to affect Amazonian precipitation, but to what extent do changing Amazonian vegetation and rainfall impact eastern Pacific SST? The Amazon rainforest is threatened by many factors, including climate change and clearing for agriculture. Forest fires and dieback are more likely due to the increased frequency and intensity of droughts in the region. It is possible that extensive Amazon deforestation can enhance El Nino conditions by weakening the Walker circulation. Correlations between annual rainfall rates over the Amazon and other atmospheric parameters (global precipitation, surface air temperature, low cloud amount, 500 hPa vertical velocity, surface winds, and 200 hPa winds) over the eastern Pacific indicate strong relationships among these fields. Maps of these correlations (teleconnection maps) reveal that when the Amazon is rainy, SSTs in the central and eastern Pacific are cold, rainfall is suppressed over the central and eastern Pacific, low clouds are prominent over the eastern and southeastern Pacific, and subsidence over the central and eastern Pacific is enhanced. Precipitation in the Amazon is also consistent with a strong Walker circulation (La Nina conditions), manifest as strong correlations with the easterly surface and westerly 200 hPa zonal winds. Coupling between Amazon rainfall and these fields is seen in observations and model data. Correlations were calculated using data from observations, reanalysis data, two models under the Coupled Model Intercomparison Project/Atmospheric Model Intercomparison Project (CMIP5/AMIP), and an AMIP run with the model used in this study, the Community Earth System Model (CESM1.1.1). Although the correlations between Amazon precipitation and the aforementioned fields are strong, they do not show causality.
In order to investigate the impact of tropical South American deforestation on Pacific climate, numerical experiments were performed using the CESM. Amazon deforestation was studied in an idealized world where a single continent was covered in forest and then, in a separate simulation, covered in grassland. Four sets of simulations were carried out: 1) the baseline idealized set-up with prescribed SST, 2) another with an Andes-like mountain range, 3) a simulation with a slab ocean model rather than prescribed SST, and 4) a simulation with the standard Community Atmosphere Model (CAM4) replaced by the superparameterized version (SP-CAM). The continent in these simulations was compared to the Amazon, and the ocean to the west of the continent to the eastern Pacific. All of the simulations showed a strong warming of around 3-4°C over the continent going from forest to grassland. A notable decrease in precipitation over land of about 1-3 mm day-1 and an increase to the west of the continent of about 1-2 mm day-1 were also observed in most of the simulations. The simulations with the slab ocean model showed enhanced precipitation changes, with a corresponding decrease of 2-4 mm day-1 over land and an increase of 3-5 mm day-1 west of the continent. Simulations that used the SP-CAM showed very small changes in precipitation, likely due to the decreased spin-up time allowed for these simulations. The decrease in surface roughness and the reduction in evapotranspiration in the grassland simulations contributed to these changes in surface temperature and precipitation. The conversion of forest to grassland in our experiments implies that deforestation can weaken the Walker circulation by weakening the easterly surface winds and westerly upper-tropospheric winds. These findings suggest that large-scale Amazon deforestation is capable of enhancing El Nino conditions.
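The teleconnection maps described above are point-by-point correlations of a gridded field with a single rainfall index. A compact sketch, verified on a synthetic field with one embedded anti-correlated grid cell standing in for the rainy-Amazon/cold-east-Pacific relationship:

```python
import numpy as np

def teleconnection_map(index_series, gridded_field):
    """Pearson correlation of a 1-D index (time,) with a field
    (time, lat, lon), computed independently at every grid point."""
    idx = (index_series - index_series.mean()) / index_series.std()
    anom = gridded_field - gridded_field.mean(axis=0)
    return (idx[:, None, None] * anom).mean(axis=0) / gridded_field.std(axis=0)

# Synthetic check: one grid cell anti-correlated with the rainfall index.
rng = np.random.default_rng(3)
nt, ny, nx = 240, 4, 6
amazon_rain = rng.standard_normal(nt)
sst = rng.standard_normal((nt, ny, nx))
sst[:, 2, 4] = -amazon_rain + 0.3 * rng.standard_normal(nt)
corr = teleconnection_map(amazon_rain, sst)   # corr[2, 4] strongly negative
```

Applied to observed or simulated fields, the sign pattern of such a map is what identifies cold central/eastern Pacific SSTs accompanying rainy Amazon conditions, while the deforestation experiments are still needed to establish causality.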

  14. Simulation of Asian monsoon seasonal variations with climate model R42L9/LASG

    NASA Astrophysics Data System (ADS)

    Wang, Zaizhi; Wu, Guoxiong; Wu, Tongwen; Yu, Rucong

    2004-12-01

    The seasonal variations of the Asian monsoon were explored by applying the atmospheric general circulation model R42L9, developed recently at the State Key Laboratory of Numerical Modeling for Atmospheric Sciences and Geophysical Fluid Dynamics, Institute of Atmospheric Physics, Chinese Academy of Sciences (LASG/IAP/CAS). The 20-yr (1979-1998) simulation was done using the prescribed 20-yr monthly SST and sea-ice data required by the Atmospheric Model Intercomparison Project (AMIP) II. The monthly precipitation and monsoon circulations were analyzed and compared with observations to validate the model's performance in simulating the climatological mean and seasonal variations of the Asian monsoon. The results show that the model can capture the main features of the spatial distribution and the temporal evolution of precipitation in the Indian and East Asian monsoon areas. The model also reproduced the basic patterns of the monsoon circulation. However, some biases exist in this model. The simulated heating over the Tibetan Plateau in summer was too strong. The overestimated heating caused a stronger East Asian monsoon and a weaker Indian monsoon than observed. In the circulation fields, the South Asia high was stronger than observed and located over the Tibetan Plateau. The western Pacific subtropical high extended westward, which is in accordance with observational results when the heating over the Tibetan Plateau is stronger. Consequently, the simulated rainfall around this area and in northwest China was heavier than in observations, but in the Indian monsoon area and the west Pacific the rainfall was somewhat deficient.

  15. Sensitivity studies of high-resolution RegCM3 simulations of precipitation over the European Alps: the effect of lateral boundary conditions and domain size

    NASA Astrophysics Data System (ADS)

    Nadeem, Imran; Formayer, Herbert

    2016-11-01

    A suite of high-resolution (10 km) simulations was performed with the International Centre for Theoretical Physics (ICTP) Regional Climate Model (RegCM3) to study the effect of various lateral boundary conditions (LBCs), domain size, and intermediate domains on simulated precipitation over the Great Alpine Region. The boundary conditions used were the ECMWF ERA-Interim Reanalysis with grid spacing 0.75∘, the ECMWF ERA-40 Reanalysis with grid spacings of 1.125∘ and 2.5∘, and finally the 2.5∘ NCEP/DOE AMIP-II Reanalysis. The model was run in one-way nesting mode with direct nesting of the high-resolution RCM (horizontal grid spacing Δx = 10 km) in the driving reanalysis, with one intermediate-resolution nest (Δx = 30 km) between the high-resolution RCM and the reanalysis forcings, and also with two intermediate-resolution nests (Δx = 90 km and Δx = 30 km) for simulations forced with LBCs of resolution 2.5∘. Additionally, the impact of domain size was investigated. The results of the multiple simulations were evaluated using different analysis techniques, e.g., the Taylor diagram and a newly defined statistical parameter, called the Skill-Score, for evaluating the daily precipitation simulated by the model. It was found that domain size has the largest impact on the results, while different resolutions and versions of the LBCs, e.g., 1.125∘ ERA-40 and 0.75∘ ERA-Interim, do not produce significantly different results. It was also noticed that direct nesting with a reasonable domain size seems to be the most adequate method for reproducing precipitation over complex terrain, while introducing intermediate-resolution nests seems to degrade the results.
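The Taylor diagram mentioned above condenses three linked statistics: correlation, normalized standard deviation, and centered RMSE. A minimal sketch of those standard definitions on synthetic daily precipitation (the paper's own, newly defined Skill-Score is not reproduced here):

```python
# Standard Taylor-diagram statistics for a simulated vs. observed series.
import numpy as np

def taylor_stats(sim, obs):
    sim_a, obs_a = sim - sim.mean(), obs - obs.mean()
    corr = (sim_a * obs_a).mean() / (sim_a.std() * obs_a.std())
    std_ratio = sim.std() / obs.std()                    # normalized std. dev.
    crmse = np.sqrt(((sim_a - obs_a) ** 2).mean()) / obs.std()  # centered RMSE
    return corr, std_ratio, crmse

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 2.0, size=365)        # synthetic daily precipitation, mm/day
sim = obs + rng.normal(0.0, 1.0, size=365) # "model" = obs plus random error
corr, ratio, crmse = taylor_stats(sim, obs)
# The geometry of the Taylor diagram follows from the identity:
#   crmse**2 == ratio**2 + 1 - 2 * ratio * corr
```

This identity is why one point on the diagram encodes all three numbers at once.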

  16. Projected future vegetation changes for the northwest United States and southwest Canada at a fine spatial resolution using a dynamic global vegetation model.

    USGS Publications Warehouse

    Shafer, Sarah; Bartlein, Patrick J.; Gray, Elizabeth M.; Pelltier, Richard T.

    2015-01-01

    Future climate change may significantly alter the distributions of many plant taxa. The effects of climate change may be particularly large in mountainous regions where climate can vary significantly with elevation. Understanding potential future vegetation changes in these regions requires methods that can resolve vegetation responses to climate change at fine spatial resolutions. We used LPJ, a dynamic global vegetation model, to assess potential future vegetation changes for a large topographically complex area of the northwest United States and southwest Canada (38.0–58.0°N latitude by 136.6–103.0°W longitude). LPJ is a process-based vegetation model that mechanistically simulates the effect of changing climate and atmospheric CO2 concentrations on vegetation. It was developed and has been mostly applied at spatial resolutions of 10 minutes or coarser. In this study, we used LPJ at a 30-second (~1-km) spatial resolution to simulate potential vegetation changes for 2070–2099. LPJ was run using downscaled future climate simulations from five coupled atmosphere-ocean general circulation models (CCSM3, CGCM3.1(T47), GISS-ER, MIROC3.2(medres), UKMO-HadCM3) produced using the A2 greenhouse gas emissions scenario. Under projected future climate and atmospheric CO2 concentrations, the simulated vegetation changes result in the contraction of alpine, shrub-steppe, and xeric shrub vegetation across the study area and the expansion of woodland and forest vegetation. Large areas of maritime cool forest and cold forest are simulated to persist under projected future conditions. The fine spatial-scale vegetation simulations resolve patterns of vegetation change that are not visible at coarser resolutions, and these fine-scale patterns are particularly important for understanding potential future vegetation changes in topographically complex areas.

  17. Projected Future Vegetation Changes for the Northwest United States and Southwest Canada at a Fine Spatial Resolution Using a Dynamic Global Vegetation Model

    PubMed Central

    Shafer, Sarah L.; Bartlein, Patrick J.; Gray, Elizabeth M.; Pelltier, Richard T.

    2015-01-01

    Future climate change may significantly alter the distributions of many plant taxa. The effects of climate change may be particularly large in mountainous regions where climate can vary significantly with elevation. Understanding potential future vegetation changes in these regions requires methods that can resolve vegetation responses to climate change at fine spatial resolutions. We used LPJ, a dynamic global vegetation model, to assess potential future vegetation changes for a large topographically complex area of the northwest United States and southwest Canada (38.0–58.0°N latitude by 136.6–103.0°W longitude). LPJ is a process-based vegetation model that mechanistically simulates the effect of changing climate and atmospheric CO2 concentrations on vegetation. It was developed and has been mostly applied at spatial resolutions of 10 minutes or coarser. In this study, we used LPJ at a 30-second (~1-km) spatial resolution to simulate potential vegetation changes for 2070–2099. LPJ was run using downscaled future climate simulations from five coupled atmosphere-ocean general circulation models (CCSM3, CGCM3.1(T47), GISS-ER, MIROC3.2(medres), UKMO-HadCM3) produced using the A2 greenhouse gas emissions scenario. Under projected future climate and atmospheric CO2 concentrations, the simulated vegetation changes result in the contraction of alpine, shrub-steppe, and xeric shrub vegetation across the study area and the expansion of woodland and forest vegetation. Large areas of maritime cool forest and cold forest are simulated to persist under projected future conditions. The fine spatial-scale vegetation simulations resolve patterns of vegetation change that are not visible at coarser resolutions, and these fine-scale patterns are particularly important for understanding potential future vegetation changes in topographically complex areas. PMID:26488750

  18. Importance of impacts scenarios for the adaptation of agriculture to climate change

    NASA Astrophysics Data System (ADS)

    Zullo, J.; Macedo, C.; Pinto, H. S.; Assad, E. D.; Koga Vicente, A.

    2012-12-01

    The strong possibility that the climate is already changing, perhaps drastically, increases the challenge for agricultural engineering, especially in environmentally vulnerable areas and in regions where agriculture has high economic and social importance. Knowledge of the potential impacts that may be caused by changes in water and thermal regimes in the coming decades is increasingly strategic, as it allows the development of techniques to adapt agriculture to climate change and therefore minimizes the risk of undesirable impacts on, for example, food and nutritional security. Thus, the main objective of this paper is to describe a way to generate impacts scenarios caused by anomalies of precipitation and temperature for the climate risk zoning of an agricultural crop of great importance in the tropics, sugar cane, especially in central-southern Brazil, one of its main world producers. A key point here is the choice of the climate model to be used, considering that 23 different models were used in the fourth IPCC report published in 2007. The number and range of available models require criteria for choosing the most suitable ones for preparing the impacts scenarios. One approach, proposed and used in this work, is based on the division of the models into two groups according to 27 technical attributes. The clustering of the 23 models into two groups, with one model representing each group (UKMO_HadCM3 and MIROC3.2_medres), assists the generation and comparison of impacts scenarios, making them more representative and useful. Another important aspect in the generation of impacts scenarios is the estimate of the relative importance of the anomalies of precipitation and temperature, which are the most commonly used. To assess their relative importance, scenarios are generated considering one anomaly at a time and then both together.
The impacts scenarios for a high emission of greenhouse gases (A2), from 2010 to 2039, were more drastic for sugar cane in central-southern Brazil using the UKMO_HadCM3 model than the MIROC3.2_medres model. These impacts scenarios, however, were less drastic than those generated for arabica coffee under the same simulation conditions, reinforcing the greater vulnerability of that crop to climate change compared with sugar cane. The inclusion of other restrictions in the climate risk zoning improves the quality of the generated scenarios and expands their usefulness for agricultural engineering.

  19. SAO and Kelvin Waves in the EuroGRIPS GCMs and the UK Meteorological Office's Analyses

    NASA Technical Reports Server (NTRS)

    Amodei, M.; Pawson, S.; Scaife, A. A.; Lahoz, W.; Langematz, U.; Li, Ding Min; Simon, P.

    2000-01-01

    This work is an intercomparison of four tropospheric-stratospheric climate models, the Unified Model (UM) of the U.K. Meteorological Office (UKMO), the model of the Free University of Berlin (FUB), the ARPEGE-climat model of the National Center for Meteorological Research (CNRM), and the Extended UGAMP GCM (EUGCM) of the Centre for Global Atmospheric Modelling (CGAM), against the UKMO analyses. This comparison has been made in the framework of the "GCM-Reality Intercomparison Project for SPARC" (GRIPS). SPARC (Stratospheric Processes and their Role in Climate) aims to investigate the effects of the middle atmosphere on climate, and the purpose of GRIPS is to organize a comprehensive assessment of current Middle Atmosphere-Climate Models (MACMs). The model integrations were made without identical constraints (e.g., boundary conditions, incoming solar radiation). All models are able to represent the dominant features of the extratropical circulation. In this paper, the structure of the tropical winds and the strengths of the Kelvin waves are examined. Explanations for the differences exhibited between the models, as well as between models and analyses, are also proposed. In the analyses a rich spectrum of waves (eastward and westward) is present and contributes to driving the SAO (Semiannual Oscillation) and the QBO (Quasi-Biennial Oscillation). The amplitude of the Kelvin waves is close to that observed in UARS (Upper Atmosphere Research Satellite) data. In agreement with observations, the Kelvin waves generated in the models propagate into the middle atmosphere as wave packets, which underlines their convective forcing origin. In most models, slow Kelvin waves propagate too high and are hence overestimated in the upper stratosphere and in the mesosphere, except in the UM, which is more diffusive. These waves are not sufficient to force realistic westerlies in the QBO or SAO westerly phases.
While the SAO is represented by all models, only two of them are able to generate westerlies between 10 hPa and 50 hPa. The importance of the role played by subgrid-scale gravity waves is increasingly recognized. Indeed, the EUGCM, which includes a parameterization of gravity waves with a non-zero phase speed, is able to simulate, albeit with some unrealistic features, clear easterly-to-westerly transitions as well as downward propagation of westerlies. Thermal damping is also important in forcing the westerlies in the stratosphere. The ARPEGE-climat model shows more westerlies in the stratosphere than the other three models, probably due to the use of a simplified scheme to predict the ozone distribution in the middle atmosphere.

  20. Local increase of anticyclonic wave activity over northern Eurasia under amplified Arctic warming: WAVE ACTIVITY RESPONSE TO ARCTIC MELTING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xue, Daokai; Lu, Jian; Sun, Lantao

    In an attempt to resolve the controversy as to whether Arctic sea ice loss leads to more mid-latitude extremes, a metric of finite-amplitude wave activity is adopted to quantify the midlatitude wave activity and its change during the observed period of drastic Arctic sea ice decline, in both ERA-Interim reanalysis data and a set of AMIP-type atmospheric model experiments. Neither the experiment with the trend in SST nor that with the declining trend of Arctic sea ice can simulate the sizable midlatitude-wide reduction in the total wave activity (Ae) observed in the reanalysis, leaving its explanation to atmospheric internal variability. On the other hand, both the diagnostics of the flux of the local wave activity and the model experiments lend evidence to a possible linkage between the sea ice loss near the Barents and Kara seas and the increasing trend of anticyclonic local wave activity over the northern part of central Eurasia, and the associated impacts on the frequency of temperature extremes.

  1. Interannual to Decadal Variability of Ocean Evaporation as Viewed from Climate Reanalyses

    NASA Technical Reports Server (NTRS)

    Robertson, Franklin R.; Bosilovich, Michael G.; Roberts, Jason B.; Wang, Hailan

    2015-01-01

    Questions we'll address: Given the uncoupled framework of "AMIP" (Atmospheric Model Intercomparison Project) experiments, what can they tell us regarding evaporation variability? Do Reduced Observations Reanalyses (RedObs), which assimilate only surface (SFC) pressure (and wind), provide a more realistic picture of evaporation variability? What signals of interannual variability (e.g., the El Nino/Southern Oscillation (ENSO)) and decadal variability (the Interdecadal Pacific Oscillation (IPO)) are detectable with this hierarchy of evaporation estimates?

  2. Pan-European climate at convection-permitting scale: a model intercomparison study

    NASA Astrophysics Data System (ADS)

    Berthou, Ségolène; Kendon, Elizabeth J.; Chan, Steven C.; Ban, Nikolina; Leutwyler, David; Schär, Christoph; Fosser, Giorgia

    2018-03-01

    We investigate the effect of using convection-permitting models (CPMs) spanning a pan-European domain on the representation of the precipitation distribution at a climatic scale. In particular we compare two 2.2 km models with two 12 km models run by ETH Zürich (ETH-12 km and ETH-2.2 km) and the Met Office (UKMO-12 km and UKMO-2.2 km). The two CPMs yield qualitatively similar differences to the precipitation climatology compared to the 12 km models, despite using different dynamical cores and different parameterization packages. A quantitative analysis confirms that the CPMs give the largest differences compared to the 12 km models in the hourly precipitation distribution in regions and seasons where convection is a key process: in summer across the whole of Europe and in autumn over the Mediterranean Sea and coasts. Mean precipitation is increased over high orography, with an increased amplitude of the diurnal cycle. We highlight that both CPMs show an increased number of moderate-to-intense short-lasting events and a decreased number of longer-lasting low-intensity events everywhere, correcting (and often over-correcting) biases in the 12 km models. The overall hourly distribution and the intensity of the most intense events are improved in Switzerland and to a lesser extent in the UK but deteriorate in Germany. The timing of the peak in the diurnal cycle of precipitation is improved. At the daily time-scale, differences in the precipitation distribution are less clear, but the greater Alpine region stands out with the largest differences. Also, Mediterranean autumnal intense events are better represented at the daily time-scale in both 2.2 km models, due to improved representation of mesoscale processes.
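The hourly-distribution comparison described above amounts to binning wet-hour intensities into classes and comparing their frequencies across model configurations. A sketch of that kind of diagnostic on synthetic gamma-distributed samples (the bin edges and both samples are invented for illustration, not the ETH or UKMO model output):

```python
# Compare the distribution of wet-hour precipitation intensities between two
# synthetic samples: one drizzle-heavy (12 km-like), one with fewer but
# heavier events (CPM-like). Illustrative only.
import numpy as np

bins = [0.1, 1.0, 5.0, 20.0, np.inf]   # mm/h: light / moderate / intense / extreme

def wet_hour_fractions(x):
    """Fraction of wet hours (>= 0.1 mm/h) falling in each intensity class."""
    wet = x[x >= bins[0]]
    counts, _ = np.histogram(wet, bins=bins)
    return counts / wet.size

rng = np.random.default_rng(2)
km12 = rng.gamma(shape=0.5, scale=1.5, size=10_000)  # more frequent, weaker rain
km22 = rng.gamma(shape=0.3, scale=4.0, size=10_000)  # fewer, heavier events

f12, f22 = wet_hour_fractions(km12), wet_hour_fractions(km22)
# Expected pattern here: the CPM-like sample puts more weight in the two
# intense bins (f22[2:] > f12[2:] in total), mirroring the shift in the text.
```

Applied to real model output, the same binning at daily instead of hourly accumulation would reproduce the daily time-scale comparison in the record.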

  3. Prediction skill of rainstorm events over India in the TIGGE weather prediction models

    NASA Astrophysics Data System (ADS)

    Karuna Sagar, S.; Rajeevan, M.; Vijaya Bhaskara Rao, S.; Mitra, A. K.

    2017-12-01

    Extreme rainfall events pose a serious threat of severe floods in many countries worldwide. Therefore, advance prediction of their occurrence and spatial distribution is essential. In this paper, an analysis has been made to assess the skill of numerical weather prediction models in predicting rainstorms over India. Using a gridded daily rainfall data set and objective criteria, 15 rainstorms were identified during the monsoon season (June to September). The analysis was made using three TIGGE (The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble) models: the European Centre for Medium-Range Weather Forecasts (ECMWF), the National Centers for Environmental Prediction (NCEP), and the UK Met Office (UKMO). Verification of the TIGGE models for 43 observed rainstorm days from the 15 rainstorm events has been made for the period 2007-2015. The comparison reveals that rainstorm events are predictable up to 5 days in advance, however with a bias in spatial distribution and intensity. Statistical parameters such as the mean error (ME) or bias, root mean square error (RMSE), and correlation coefficient (CC) have been computed over the rainstorm region using the multi-model ensemble (MME) mean. The study reveals that the spread is large in ECMWF and UKMO, followed by the NCEP model. Though the ensemble spread is quite small in NCEP, the ensemble member averages are not well predicted. The rank histograms suggest that the forecasts under-predict. The modified Contiguous Rain Area (CRA) technique was used to verify the spatial as well as the quantitative skill of the TIGGE models. Overall, the contribution from the displacement and pattern errors to the total RMSE is found to be larger in magnitude. The volume error increases from the 24 hr forecast to the 48 hr forecast in all three models.
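The verification statistics named above (ME/bias, RMSE, and CC) have standard definitions. A generic sketch over a small invented grid of rainfall values, not the paper's exact implementation or data:

```python
# Standard gridded-forecast verification: mean error (bias), RMSE, and
# correlation coefficient between forecast and observed fields.
import numpy as np

def verify(forecast, observed):
    err = forecast - observed
    me = err.mean()                                        # mean error / bias
    rmse = np.sqrt((err ** 2).mean())                      # root mean square error
    cc = np.corrcoef(forecast.ravel(), observed.ravel())[0, 1]
    return me, rmse, cc

# Invented rainstorm-region values, mm/day
obs = np.array([[12., 40., 85.], [5., 22., 60.]])
fc = np.array([[10., 35., 70.], [8., 25., 50.]])   # e.g., an ensemble-mean field
me, rmse, cc = verify(fc, obs)
print(round(me, 2))   # -4.33 : negative bias, i.e., under-prediction here
```

In practice these numbers would be computed per rainstorm day over the identified region, then aggregated across the 43 verification days.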

  4. Improvement in Simulation of Eurasian Winter Climate Variability with a Realistic Arctic Sea Ice Condition in an Atmospheric GCM

    NASA Technical Reports Server (NTRS)

    Lim, Young-Kwon; Ham, Yoo-Geun; Jeong, Jee-Hoon; Kug, Jong-Seong

    2012-01-01

    The present study investigates how much a realistic Arctic sea ice condition can contribute to improving the simulation of winter climate variation over the Eurasia region. Model experiments are set up using different sea ice boundary conditions over the past 24 years (1988-2011). One is an atmospheric model inter-comparison (AMIP) type of run forced with observed sea-surface temperature (SST), sea ice, and greenhouse gases (referred to as Exp RSI), and the other is the same as Exp RSI except for the sea ice forcing, which is a repeating climatological annual cycle (referred to as Exp CSI). Results show that Exp RSI produces the observed dominant pattern of Eurasian winter temperatures and their interannual variation better than Exp CSI (correlation difference up to approx. 0.3). Exp RSI captures the observed strong relationship between the sea ice concentration near the Barents and Kara seas and the temperature anomaly across Eurasia, including northeastern Asia, which is not well captured in Exp CSI. Lagged atmospheric responses to sea ice retreat are examined using observations to understand the atmospheric processes behind the Eurasian cooling response, including the Arctic temperature increase, sea-level pressure increase, upper-level jet weakening, and cold air outbreaks toward the mid-latitudes. The reproducibility of these lagged responses by Exp RSI is also evaluated.

  5. An effective parameter optimization with radiation balance constraints in the CAM5

    NASA Astrophysics Data System (ADS)

    Wu, L.; Zhang, T.; Qin, Y.; Lin, Y.; Xue, W.; Zhang, M.

    2017-12-01

    Uncertain parameters in the physical parameterizations of General Circulation Models (GCMs) greatly impact model performance. Traditional parameter tuning methods are mostly unconstrained optimizations, so the simulation results with the optimal parameters may not meet conditions that the model must satisfy. In this study, the radiation balance constraint is taken as an example and incorporated into the automatic parameter optimization procedure. The Lagrangian multiplier method is used to solve this constrained optimization problem. In our experiment, we use the CAM5 atmosphere model in a 5-yr AMIP simulation with prescribed seasonal climatology of SST and sea ice. We take a synthesized metric using global means of radiation, precipitation, relative humidity, and temperature as the goal of the optimization, and simultaneously treat the conditions that FLUT and FSNTOA should satisfy as constraints. The global averages of the output variables FLUT and FSNTOA are required to be approximately equal to 240 W m-2 in CAM5. Experiment results show that the synthesized metric is 13.6% better than in the control run. At the same time, both FLUT and FSNTOA are close to the constrained conditions. The FLUT condition is well satisfied, clearly better than the annual-average FLUT obtained with the default parameters. FSNTOA shows a slight deviation from the observed value, but the relative error is less than 7.7‰.
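The Lagrangian multiplier idea described above can be illustrated on a toy problem: keep a parameter vector close to a preferred setting while a linearized flux constraint (standing in for the ≈240 W m-2 radiation balance) is met exactly. The parameter vector, constraint coefficients, and values below are invented for illustration; the actual CAM5 tuning involves a nonlinear, simulation-driven cost:

```python
# Quadratic cost + linear constraint has a closed-form Lagrange saddle point:
#   argmin ||p - p_free||^2  subject to  a @ p == b
import numpy as np

def constrained_optimum(p_free, a, b):
    """Project the unconstrained optimum onto the constraint surface."""
    lam = (a @ p_free - b) / (a @ a)   # multiplier from stationarity of the Lagrangian
    return p_free - lam * a            # grad ||p - p_free||^2 + lam * grad(a@p - b) = 0

p_free = np.array([100.0, 80.0, 70.0])  # hypothetical unconstrained optimum
a = np.array([1.0, 1.0, 1.0])           # hypothetical linearized flux coefficients
p_opt = constrained_optimum(p_free, a, 240.0)
print(np.isclose(a @ p_opt, 240.0))     # True: constraint satisfied exactly
```

For the real nonlinear problem, the same stationarity condition is solved iteratively rather than in closed form, but the structure (cost gradient balanced against multiplier times constraint gradient) is identical.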

  6. The Impact of Sea Ice Concentration Accuracies on Climate Model Simulations with the GISS GCM

    NASA Technical Reports Server (NTRS)

    Parkinson, Claire L.; Rind, David; Healy, Richard J.; Martinson, Douglas G.; Zukor, Dorothy J. (Technical Monitor)

    2000-01-01

    The Goddard Institute for Space Studies global climate model (GISS GCM) is used to examine the sensitivity of the simulated climate to sea ice concentration specifications in the type of simulation done in the Atmospheric Model Intercomparison Project (AMIP), with specified oceanic boundary conditions. Results show that sea ice concentration uncertainties of +/- 7% can affect simulated regional temperatures by more than 6 C, and biases in sea ice concentrations of +7% and -7% alter simulated annually averaged global surface air temperatures by -0.10 C and +0.17 C, respectively, over those in the control simulation. The resulting 0.27 C difference in simulated annual global surface air temperatures is reduced by a third, to 0.18 C, when considering instead biases of +4% and -4%. More broadly, least-squares fits through the temperature results of 17 simulations with ice concentration input changes ranging from increases of 50% versus the control simulation to decreases of 50% yield a yearly average global impact of 0.0107 C warming for every 1% ice concentration decrease, i.e., 1.07 C warming for the full +50% to -50% range. Regionally and on a monthly average basis, the differences can be far greater, especially in the polar regions, where wintertime contrasts between the +50% and -50% cases can exceed 30 C. However, few statistically significant effects are found outside the polar latitudes, and temperature effects over the non-polar oceans tend to be under 1 C, due in part to the specification of an unvarying annual cycle of sea surface temperatures. The +/- 7% and +/- 4% results provide bounds on the impact (on GISS GCM simulations making use of satellite data) of satellite-derived ice concentration inaccuracies, +/- 7% being the current estimated average accuracy of satellite retrievals and +/- 4% being the anticipated improved average accuracy for upcoming satellite instruments.
Results show that the impact on simulated temperatures of imposed ice concentration changes is least in summer, encouragingly the same season in which the satellite accuracies are thought to be worst. Hence the impact of satellite inaccuracies is probably less than the use of an annually averaged satellite inaccuracy would suggest.
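The least-squares fit quoted above (≈0.0107 C warming per 1% ice concentration decrease) is an ordinary linear regression through the per-simulation temperature responses. A sketch with idealized numbers constructed to reproduce that slope, not the actual GISS GCM output:

```python
# Fit a slope (deg C per % ice concentration change) through simulation results.
import numpy as np

# Ice concentration changes vs. control (%), as in the 17-simulation design
ice_change = np.array([-50., -25., -7., -4., 0., 4., 7., 25., 50.])
# Idealized global temperature anomalies constructed from the quoted slope
temp_anom = -0.0107 * ice_change   # deg C: warming when ice decreases

slope = np.polyfit(ice_change, temp_anom, 1)[0]
print(round(slope, 4))   # -0.0107 deg C per +1% ice concentration
```

With real (noisy) simulation output the slope would come with a standard error, which is what supports the paper's "few statistically significant effects outside the polar latitudes" caveat.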

  7. Possible climate change over Eurasia under different emission scenarios

    NASA Astrophysics Data System (ADS)

    Sokolov, A. P.; Monier, E.; Scott, J. R.; Forest, C. E.; Schlosser, C. A.

    2011-12-01

    In an attempt to evaluate possible climate change over Eurasia, we analyze results of six AMIP-type simulations with CAM version 3 (CAM3) at 2x2.5 degree resolution. CAM3 is driven by time series of sea surface temperatures (SSTs) and sea ice obtained by running the MIT IGSM2.3, which consists of a 3D ocean GCM coupled to a zonally-averaged atmospheric climate-chemistry model. In addition to changes in SSTs, CAM3 is forced by changes in greenhouse gases and ozone concentrations, sulfate aerosol forcing, and black carbon loading calculated by the IGSM2.3. An essential feature of the IGSM is the ability to vary its climate sensitivity (using a cloud adjustment technique) and the strength of the aerosol forcing. For consistency, new modules were developed in CAM3 to modify its climate sensitivity and aerosol forcing to match those used in the simulations with the IGSM2.3. The simulations presented in this paper were carried out for two emission scenarios, a "Business as usual" scenario and a 660 ppm CO2-equivalent stabilization, which are similar to the RCP8.5 and RCP4.5 scenarios, respectively. The values of climate sensitivity used in the simulations within the IGSM-CAM framework are the median and the bounds of the 90% probability interval of the probability distribution obtained by comparing the 20th century climate simulated by different versions of the IGSM with observations. The associated strength of the aerosol forcing was chosen to ensure good agreement with the observed climate change over the 20th century. Because the concentration of sulfate aerosol decreases significantly over the 21st century in both emissions scenarios, the climate changes obtained in these simulations provide a good approximation for the median and the 5th and 95th percentiles of the probability distribution of 21st century climate change.

  8. Possible climate change over Eurasia under different emission scenarios

    NASA Astrophysics Data System (ADS)

    Sokolov, A. P.; Monier, E.; Gao, X.

    2012-12-01

    In an attempt to evaluate possible climate change over Eurasia, we analyze results of six AMIP-type simulations with CAM version 3 (CAM3) at 2x2.5 degree resolution. CAM3 is driven by time series of sea surface temperatures (SSTs) and sea ice obtained by running the MIT IGSM2.3, which consists of a 3D ocean GCM coupled to a zonally-averaged atmospheric climate-chemistry model. In addition to changes in SSTs, CAM3 is forced by changes in greenhouse gases and ozone concentrations, sulfate aerosol forcing, and black carbon loading calculated by the IGSM2.3. An essential feature of the IGSM is the ability to vary its climate sensitivity (using a cloud adjustment technique) and the strength of the aerosol forcing. For consistency, new modules were developed in CAM3 to modify its climate sensitivity and aerosol forcing to match those used in the simulations with the IGSM2.3. The simulations presented in this paper were carried out for two emission scenarios, a "Business as usual" scenario and a 660 ppm CO2-equivalent stabilization, which are similar to the RCP8.5 and RCP4.5 scenarios, respectively. The values of climate sensitivity used in the simulations within the IGSM-CAM framework are the median and the bounds of the 90% probability interval of the probability distribution obtained by comparing the 20th century climate simulated by different versions of the IGSM with observations. The associated strength of the aerosol forcing was chosen to ensure good agreement with the observed climate change over the 20th century. Because the concentration of sulfate aerosol decreases significantly over the 21st century in both emissions scenarios, the climate changes obtained in these simulations provide a good approximation for the median and the 5th and 95th percentiles of the probability distribution of 21st century climate change.

  9. Tropical intraseasonal oscillation simulated in an AMIP-type experiment by NICAM

    NASA Astrophysics Data System (ADS)

    Kikuchi, Kazuyoshi; Kodama, Chihiro; Nasuno, Tomoe; Nakano, Masuo; Miura, Hiroaki; Satoh, Masaki; Noda, Akira T.; Yamada, Yohei

    2017-04-01

    This is the first time the non-hydrostatic icosahedral atmospheric model (NICAM), at a horizontal mesh size of approximately 14 km, has been used to conduct a continuous long-term Atmospheric Model Intercomparison Project-type simulation. This study examines the performance of NICAM in simulating the tropical intraseasonal oscillation (ISO) from a statistical point of view using 30 years of data (1979-2008), in the context of the bimodal ISO representation concept proposed by Kikuchi et al., which allows us to examine the seasonally varying behavior of the ISO in great detail, in addition to the MJO working group level 2 diagnostics. It is found that many of the fundamental features of the ISO are well captured by NICAM. The evolution of the ISO convection as well as the large-scale circulation over the course of its life cycle is reasonably well reproduced throughout the year. As in the observations, the Madden-Julian oscillation (MJO) mode, characterized by prominent eastward propagation of convection, is predominant during boreal winter, whereas the boreal summer ISO (BSISO) mode, characterized by a combination of pronounced eastward and northward propagation, is predominant during summer. The overall shape of the seasonal cycle, as measured by the numbers of significant MJO and BSISO days in a month, is relatively well captured. Two major biases, however, are also identified: the amplitude of the simulated ISO is weaker by a factor of 2, and significant BSISO events sometimes appear even during winter (December-April), amounting to 30% of the total significant ISO days as opposed to 2% in the observations. The results warrant further studies using the simulation dataset to understand not only many aspects of the dynamics and physics of the ISO but also its role in weather and climate. It is also demonstrated that the concept of the bimodal ISO representation provides a useful framework for assessing a model's capability to simulate the ISO and for illuminating a model's deficiencies in reproducing it.
The nature and causes of the two major biases are also discussed.

  10. Impact of Radiatively Interactive Dust Aerosols in the NASA GEOS-5 Climate Model: Sensitivity to Dust Particle Shape and Refractive Index

    NASA Technical Reports Server (NTRS)

    Colarco, Peter R.; Nowottnick, Edward Paul; Randles, Cynthia A.; Yi, Bingqi; Yang, Ping; Kim, Kyu-Myong; Smith, Jamison A.; Bardeen, Charles D.

    2013-01-01

    We investigate the radiative effects of dust aerosols in the NASA GEOS-5 atmospheric general circulation model. GEOS-5 is improved with the inclusion of a sectional aerosol and cloud microphysics module, the Community Aerosol and Radiation Model for Atmospheres (CARMA). Into CARMA we introduce treatment of the dust and sea salt aerosol lifecycle, including sources, transport evolution, and sinks. The aerosols are radiatively coupled to GEOS-5, and we perform a series of multi-decade AMIP-style simulations in which dust optical properties (spectral refractive index and particle shape distribution) are varied. Optical properties assuming spherical dust particles are from Mie theory, while those for non-spherical shape distributions are drawn from a recently available database for tri-axial ellipsoids. The climatologies of the various simulations generally compare well to data from the MODIS, MISR, and CALIOP space-based sensors, the ground-based AERONET, and surface measurements of dust deposition and concentration. Focusing on the summertime Saharan dust cycle we show significant variability in our simulations resulting from different choices of dust optical properties. Atmospheric heating due to dust enhances surface winds over important Saharan dust sources, and we find a positive feedback where increased dust absorption leads to increased dust emissions. We further find that increased dust absorption leads to a strengthening of the summertime Hadley cell circulation, increasing dust lofting to higher altitudes and strengthening the African Easterly Jet. This leads to a longer atmospheric residence time, higher altitude, and generally more northward transport of dust in simulations with the most absorbing dust optical properties. 
We find that particle shape, although important for radiance simulations, has a minor effect compared with the choice of refractive index; nevertheless, total atmospheric forcing is enhanced by more than 10 percent in simulations incorporating a spheroidal shape distribution versus ellipsoidal or spherical shapes.

  11. Global Ocean Evaporation: How Well Can We Estimate Interannual to Decadal Variability?

    NASA Technical Reports Server (NTRS)

    Robertson, Franklin R.; Bosilovich, Michael G.; Roberts, Jason B.; Wang, Hailan

    2015-01-01

    Evaporation from the world's oceans constitutes the largest component of the global water balance. It is important not only as the ultimate source of moisture that is tied to the radiative processes determining Earth's energy balance, but also to freshwater availability over land, governing the habitability of the planet. Here we focus on variability of ocean evaporation on scales from interannual to decadal by appealing to three sources of data: the new MERRA-2 (Modern-Era Retrospective analysis for Research and Applications-2); climate models run with historical sea-surface temperatures, ice, and atmospheric constituents (so-called AMIP experiments); and state-of-the-art satellite retrievals from the Seaflux and HOAPS (Hamburg Ocean-Atmosphere Parameters and Fluxes from Satellite) projects. Each of these sources has distinct advantages as well as drawbacks. MERRA-2, like other reanalyses, synthesizes evaporation estimates consistent with observationally constrained physical and dynamical models, but data-stream discontinuities are a major problem for interpreting multi-decadal records. The climate models used in data assimilation can also be run with fewer constraints, such as with SSTs and sea ice only (i.e. AMIPs), or with additional, minimal observations of surface pressure and marine observations that have longer and less fragmentary observational records; we use the new ERA-20C reanalysis produced by ECMWF, which embodies the latter methodology. Still, the model physics biases in climate models and the lack of a predicted surface energy balance are of concern. Satellite retrievals and comparisons to ship-based measurements offer the most observationally based estimates, but sensor inter-calibration, algorithm retrieval assumptions, and short records are dominant issues. Our strategy depends on maximizing the advantages of these combined records. The primary diagnostic tool used here is an analysis of bulk aerodynamic computations produced by these sources, using a first-order Taylor series expansion of wind speed, SST, near-surface stability, and relative humidity variations about climatology to gauge the importance of each component. We find that the MERRA-2 evaporation record is strongly influenced by the availability of wind speed and humidity from passive microwave imagers beginning in the late 1980s, as well as by the SST record; its trend over the period from 1980 to the present is nearly 10%, whereas the AMIP and ERA-20C trends are much smaller. We find that ENSO-related signals involving both wind speed and thermodynamic variability remain the primary signal in the latter and are confirmed by satellite retrievals. We present uncertainty estimates based on the various data sources and discuss the implications for GEWEX water and energy budget science challenges.
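    The first-order Taylor attribution described above can be sketched as follows. This is a minimal illustration, not the study's formulation: the bulk formula is simplified (stability and relative humidity are folded into the humidity terms), and the transfer coefficient and all numerical values are assumptions chosen for clarity.

```python
RHO = 1.2    # air density (kg m-3), assumed constant for the sketch
CE = 1.2e-3  # bulk transfer coefficient for moisture, illustrative value

def bulk_evap(wind, qs, qa):
    """Bulk aerodynamic evaporation proxy: E ~ rho * C_E * U * (q_s - q_a)."""
    return RHO * CE * wind * (qs - qa)

# Climatological state: wind speed (m/s), saturation and near-surface
# specific humidity (kg/kg), illustrative warm-ocean values.
U0, qs0, qa0 = 7.0, 0.020, 0.015
# Anomalies, e.g. an ENSO-related wind change and an SST-driven qs change.
dU, dqs, dqa = 0.5, 1.0e-3, 4.0e-4

E0 = bulk_evap(U0, qs0, qa0)
E1 = bulk_evap(U0 + dU, qs0 + dqs, qa0 + dqa)

# First-order Taylor terms about climatology partition the anomaly into a
# wind-speed contribution and a thermodynamic (humidity) contribution.
dE_wind = RHO * CE * dU * (qs0 - qa0)
dE_thermo = RHO * CE * U0 * (dqs - dqa)
residual = (E1 - E0) - (dE_wind + dE_thermo)  # second-order cross term
```

The residual equals the neglected cross term, so for small anomalies the two linear contributions recover nearly all of the total evaporation anomaly.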

  12. Simulating the impact of the large-scale circulation on the 2-m temperature and precipitation climatology

    NASA Astrophysics Data System (ADS)

    Bowden, Jared H.; Nolte, Christopher G.; Otte, Tanya L.

    2013-04-01

    The impact of the simulated large-scale atmospheric circulation on the regional climate is examined using the Weather Research and Forecasting (WRF) model as a regional climate model. The purpose is to understand the potential need for interior grid nudging for dynamical downscaling of global climate model (GCM) output for air quality applications under a changing climate. In this study we downscale the NCEP-Department of Energy Atmospheric Model Intercomparison Project (AMIP-II) Reanalysis using three continuous 20-year WRF simulations: one simulation without interior grid nudging and two using different interior grid nudging methods. The biases in 2-m temperature and precipitation for the simulation without interior grid nudging are unreasonably large with respect to the North American Regional Reanalysis (NARR) over the eastern half of the contiguous United States (CONUS) during the summer when air quality concerns are most relevant. This study examines how these differences arise from errors in predicting the large-scale atmospheric circulation. It is demonstrated that the Bermuda high, which strongly influences the regional climate for much of the eastern half of the CONUS during the summer, is poorly simulated without interior grid nudging. In particular, two summers when the Bermuda high was west (1993) and east (2003) of its climatological position are chosen to illustrate problems in the large-scale atmospheric circulation anomalies. For both summers, WRF without interior grid nudging fails to simulate the placement of the upper-level anticyclonic (1993) and cyclonic (2003) circulation anomalies. The displacement of the large-scale circulation impacts the lower atmosphere moisture transport and precipitable water, affecting the convective environment and precipitation. 
Using interior grid nudging improves the large-scale circulation aloft and moisture transport/precipitable water anomalies, thereby improving the simulated 2-m temperature and precipitation. The results demonstrate that constraining the RCM to the large-scale features in the driving fields improves the overall accuracy of the simulated regional climate, and suggest that in the absence of such a constraint, the RCM will likely misrepresent important large-scale shifts in the atmospheric circulation under a future climate.
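    Interior grid nudging of the kind discussed above adds a Newtonian relaxation term that pulls the RCM state toward the driving large-scale fields. A minimal sketch, with a hypothetical relaxation timescale and state values (real implementations nudge selected variables, levels, and wavelengths):

```python
def nudge_step(x_model, x_driving, dt, tau=6 * 3600.0):
    """One explicit step of Newtonian relaxation ('nudging'):
    dx/dt includes -(x - x_driving)/tau, relaxing the model state toward
    the driving analysis on timescale tau (here 6 h, illustrative)."""
    return x_model - dt * (x_model - x_driving) / tau

x = 300.0       # model temperature (K), hypothetical
target = 294.0  # driving-field value at the same point
for _ in range(1000):          # 1000 steps of 60 s = ~16.7 h of nudging
    x = nudge_step(x, target, dt=60.0)
# x has decayed most of the way toward the driving value
```

After roughly 2.8 relaxation timescales the initial 6 K departure has decayed to a few tenths of a kelvin, which is the sense in which nudging constrains the RCM's large-scale circulation.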

  13. Impact of Subsurface Temperature Variability on Meteorological Variability: An AGCM Study

    NASA Astrophysics Data System (ADS)

    Mahanama, S. P.; Koster, R. D.; Liu, P.

    2006-05-01

    Anomalous atmospheric conditions can lead to surface temperature anomalies, which in turn can lead to temperature anomalies deep in the soil. The deep soil temperature (and the associated ground heat content) has significant memory -- the dissipation of a temperature anomaly may take weeks to months -- and thus deep soil temperature may contribute to the low frequency variability of energy and water variables elsewhere in the system. The memory may even provide some skill to subseasonal and seasonal forecasts. This study uses two long-term AGCM experiments to isolate the contribution of deep soil temperature variability to variability elsewhere in the climate system. The first experiment consists of a standard ensemble of AMIP-type simulations, simulations in which the deep soil temperature variable is allowed to interact with the rest of the system. In the second experiment, the coupling of the deep soil temperature to the rest of the climate system is disabled -- at each grid cell, the local climatological seasonal cycle of deep soil temperature (as determined from the first experiment) is prescribed. By comparing the variability of various atmospheric quantities as generated in the two experiments, we isolate the contribution of interactive deep soil temperature to that variability. The results show that interactive deep soil temperature contributes significantly to surface temperature variability. Interactive deep soil temperature, however, reduces the variability of the hydrological cycle (evaporation and precipitation), largely because it allows for a negative feedback between evaporation and temperature.
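    The two-experiment design can be illustrated with a short sketch: the climatological seasonal cycle diagnosed from the interactive run is repeated every year in the second run, which removes the interannual variability of deep soil temperature by construction. The array shapes and temperature values below are hypothetical.

```python
import numpy as np

def monthly_climatology(series):
    """Mean seasonal cycle: average each calendar month over all years.
    `series` has shape (n_years, 12)."""
    return series.mean(axis=0)

def prescribed_cycle(clim, n_years):
    """Experiment 2: the same climatological cycle repeated every year, so
    deep soil temperature no longer varies from year to year."""
    return np.tile(clim, (n_years, 1))

# Hypothetical deep soil temperatures (K), 3 years x 12 months:
# a seasonal cycle plus random interannual noise.
rng = np.random.default_rng(0)
interactive = (275.0 + 10.0 * np.sin(np.linspace(0.0, 2.0 * np.pi, 12))
               + rng.normal(0.0, 1.0, (3, 12)))

clim = monthly_climatology(interactive)
fixed = prescribed_cycle(clim, 3)

interannual_std = fixed.std(axis=0)  # zero everywhere by construction
```

Comparing the variability of atmospheric fields between runs driven by `interactive` versus `fixed` boundary behavior isolates the contribution of the interactive deep soil temperature.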

  14. An extreme Arctic cyclone in August 2016 and its predictability on medium-range timescales

    NASA Astrophysics Data System (ADS)

    Yamagami, Akio; Matsueda, Mio; Tanaka, Hiroshi

    2017-04-01

    An extremely strong Arctic cyclone (AC) developed in August 2016. The AC exhibited a minimum sea level pressure (SLP) of 967.2 hPa and covered the entire Pacific sector of the Arctic Ocean at 0000 UTC on 16 August. At this time the AC was comparable to the strong AC observed in August 2012 in terms of horizontal extent, position, and intensity as measured by SLP. Two processes contributed to the explosive development of the AC: growth due to baroclinic instability, similar to extratropical cyclones, during the early part of the development stage, and later nonlinear development via the merging of upper warm cores. The AC was maintained for more than one month through multiple mergers with cyclones both generated in the Arctic and migrating northward from lower latitudes, a consequence of the high cyclone activity in summer 2016. This study also investigated the predictability of the AC using operational medium-range ensemble forecasts from CMC (Canada), ECMWF (EU), JMA (Japan), NCEP (USA), and UKMO (UK), available in The Interactive Grand Global Ensemble (TIGGE) database. The minimum SLP of the AC at 0000 UTC on 16 August was well predicted 6 days in advance by ECMWF, 5 days by NCEP and UKMO, 4 days by CMC, and 3 days by JMA. The predictability of the minimum SLP of the AC in August 2016 was much higher than that of the AC in August 2012. Whereas most of the members predicted the cyclogenesis of the AC well, the growth due to baroclinic instability was weaker in some members. Even when the baroclinic growth was predicted well, the predicted AC did not develop if the nonlinear development via merging was not predicted accurately. Accurate prediction of the processes in both the early and later parts of the development stage was therefore essential for accurately predicting the development of the AC.

  15. Managing Livestock Species under Climate Change in Australia

    PubMed Central

    Seo, S. Niggol; McCarl, Bruce

    2011-01-01

    Simple Summary World communities are concerned about the impacts of a hotter and drier climate on future agriculture. By examining Australian regional livestock data on sheep, beef cattle, dairy cattle, and pigs, the authors find that livestock production will expand under such conditions. Livestock revenue per farm is expected to increase by more than 47% by 2060 under the UKMO and GISS scenarios and under a high-warming CSIRO scenario. The existence of a threshold temperature for these species is not evident. Abstract This paper examines the vulnerabilities of the major livestock species raised in Australia to climate change using a regional livestock profile of Australia comprising around 1,400 regions. The number of each species owned, the number of each species sold, and the aggregate livestock revenue across all species are examined. The four major species analyzed are sheep, beef cattle, dairy cattle, and pigs. The analysis also includes livestock products such as wool and milk. These livestock production statistics are regressed against climate, geophysical, market, and household characteristics. In contrast to crop studies, the analysis finds that livestock species are resilient to a hotter and more arid climate. Under the CSIRO climate scenario, in which temperature increases by 3.4 °C, livestock revenue per farm increases significantly, while the number of each species owned increases by large percentages, except for dairy cattle. A precipitation reduction of about 8% by 2060 also increases the number of livestock per farm household. Under both the UKMO and GISS scenarios, livestock revenue is expected to increase by around 47% while the livestock population increases by large percentages. Livestock management may play a key role in adapting to a hot and arid climate in Australia. However, critical values of the climatic variables for the species analyzed in this paper are not obvious from the regional data. PMID:26486620

  16. Ionospheric reaction on sudden stratospheric warming events in Russia's Asia region

    NASA Astrophysics Data System (ADS)

    Polyakova, Anna; Perevalova, Natalya; Chernigovskaya, Marina

    2015-12-01

    The response of the ionosphere to sudden stratospheric warmings (SSWs) in the Asian region of Russia is studied. Two SSW events observed during the 2008-2009 and 2012-2013 winter periods, under extreme solar minimum and moderate solar maximum conditions respectively, are considered. To detect the ionospheric effects caused by SSWs, we carried out a joint analysis of global ionospheric maps (GIM) of the total electron content (TEC), MLS (Microwave Limb Sounder, EOS Aura) measurements of vertical temperature profiles, and NCEP/NCAR and UKMO reanalysis data. For the first time, it was found that during strong SSWs the amplitude of the diurnal TEC variation in the mid-latitude ionosphere decreases by nearly half compared to quiet days. At the same time, the intensity of TEC deviations from the background level increases. It was also found that at the SSW peak the midday TEC maximum decreases, while night/morning TEC values increase compared to quiet days. It was shown that during SSWs, TEC dynamics was identical for different geophysical conditions.

  17. The NOGAPS Ten Year AMIP Integration

    DTIC Science & Technology

    1993-10-01

    c (problems with open)
    c
          character*54 file
          character*48 filnam
          character*8  status
          character*6  ctau
    c
          logical lex,opn,iread
    c
          lenr = 8*(2*len)
          if(itype.ne. ...
          return
          endif
    c
          endif
    c
          open(unit=iun,file=file,access='direct',form='unformatted',
         *     recl=lenr,status=status)
    c
          if(msg.lt.2) return
    c
          print 100, status(1:3),filnam,iun,lenr
      100 format(1x,a3,1x,a54,' opened as unit=',i3,' : lenr =',i9,' bytes')
          return
          end
          subroutine nfread(fnam,msg,itype,istrt,nrec

  18. Impact of physical permafrost processes on hydrological change

    NASA Astrophysics Data System (ADS)

    Hagemann, Stefan; Blome, Tanja; Beer, Christian; Ekici, Altug

    2015-04-01

    Permafrost, or perennially frozen ground, is an important part of the terrestrial cryosphere; roughly one quarter of Earth's land surface is underlain by permafrost. As it is a thermal phenomenon, its characteristics are highly dependent on climatic factors. The currently observed warming, which is projected to persist during the coming decades due to anthropogenic CO2 input, will certainly affect the vast permafrost areas of the high northern latitudes. The quantification of these effects, however, remains an open scientific question. This is partly due to the complexity of the system, in which several feedbacks interact between land and atmosphere, sometimes counterbalancing each other. Moreover, until recently, many general circulation models (GCMs) and Earth system models (ESMs) lacked a sufficient representation of permafrost physics in their land surface schemes. Within the European Union FP7 project PAGE21, the land surface scheme JSBACH of the Max Planck Institute for Meteorology ESM (MPI-ESM) has been equipped with representations of the physical processes relevant for permafrost studies. These processes include the effects of freezing and thawing of soil water on both the energy and water cycles, thermal properties that depend on soil water and ice contents, and soil moisture movement that is influenced by the presence of soil ice. In the present study, we analyse how these permafrost-relevant processes affect projected hydrological changes over northern hemisphere high-latitude land areas. For this analysis, the atmosphere-land part of MPI-ESM, ECHAM6-JSBACH, is driven by prescribed SST and sea ice in an AMIP2-type setup, with and without the newly implemented permafrost processes. Observed SST and sea ice for 1979-1999 are used to assess the induced changes in the simulated hydrological cycle. In addition, simulated SST and sea ice are taken from an MPI-ESM simulation conducted for CMIP5 following the RCP8.5 scenario. The corresponding simulations with ECHAM6-JSBACH are used to assess differences in projected hydrological changes induced by the permafrost-relevant processes.

  19. Evaluating the cloud radiative forcing over East Asia during summer simulated by CMIP5 models

    NASA Astrophysics Data System (ADS)

    Lin, Z.; Wang, Y.; Liu, X.

    2017-12-01

    A large degree of uncertainty in global climate models (GCMs) can be attributed to the representation of clouds and their radiative forcing (CRF). In this study, the simulated CRFs, total cloud fraction (CF), and cloud properties over East Asia from 20 CMIP5 AMIP models are evaluated against multiple satellite observations, and the possible causes of the CRF biases in the CMIP5 models are then investigated. Based on the satellite observations, strong longwave CRF (LWCRF) and shortwave CRF (SWCRF) are found over Southwestern China, with a minimum SWCRF below -130 W m-2, associated with the large cloud amount in the region. By contrast, weak CRFs are located over Northwest China and the Western Pacific because of smaller cloud amounts. In Northeastern China, strong SWCRF and weak LWCRF are found, due to the dominance of low-level cloud. In Eastern China, the CRFs are moderate, due to the coexistence of multi-layer cloud. The CMIP5 models can basically capture the structure of CRFs in East Asia, with spatial correlation coefficients between 0.5 and 0.9, but most models underestimate CRFs in East Asia, which is closely associated with their underestimation of cloud amount in the region. The performance of the CMIP5 models varies across the East Asian region, with larger deviations in Eastern China (EC). Further investigation suggests that underestimation of the cloud amount in EC leads to a weak bias in the CRFs there; however, this bias can be partly cancelled by an overestimation of CRF due to the excessive cloud optical depth (COD) simulated by the models. The annual cycle of simulated CRF over Eastern China is also examined, and it is found that the CMIP5 models are unable to reproduce the northward migration of CRF during the summer monsoon season, which is closely related to the northward shift of the East Asian summer monsoon rain belt.
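    For reference, the CRF (often called the cloud radiative effect) is the difference between clear-sky and all-sky radiative fluxes at the top of the atmosphere. A minimal sketch of the sign conventions, with purely illustrative flux values:

```python
def crf(clear_sky_outgoing, all_sky_outgoing):
    """Cloud radiative forcing at TOA: clear-sky minus all-sky outgoing flux
    (W m-2). Positive values mean clouds reduce outgoing radiation, a warming
    effect; negative values mean clouds increase it, a cooling effect."""
    return clear_sky_outgoing - all_sky_outgoing

# Illustrative TOA fluxes (W m-2) for a deep, optically thick cloud regime:
lwcrf = crf(280.0, 220.0)   # clouds trap longwave  -> positive LWCRF
swcrf = crf(50.0, 180.0)    # clouds reflect shortwave -> negative SWCRF
net_crf = lwcrf + swcrf     # strongly reflective regimes cool on net
```

With these assumed fluxes the net effect is negative, as over the bright cloud decks described above; regimes with less reflective, higher cloud can tip the balance the other way.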

  20. The impact of parametrized convection on cloud feedback.

    PubMed

    Webb, Mark J; Lock, Adrian P; Bretherton, Christopher S; Bony, Sandrine; Cole, Jason N S; Idelkadi, Abderrahmane; Kang, Sarah M; Koshiro, Tsuyoshi; Kawai, Hideaki; Ogura, Tomoo; Roehrig, Romain; Shin, Yechul; Mauritsen, Thorsten; Sherwood, Steven C; Vial, Jessica; Watanabe, Masahiro; Woelfle, Matthew D; Zhao, Ming

    2015-11-13

    We investigate the sensitivity of cloud feedbacks to the use of convective parametrizations by repeating the CMIP5/CFMIP-2 AMIP/AMIP + 4K uniform sea surface temperature perturbation experiments with 10 climate models which have had their convective parametrizations turned off. Previous studies have suggested that differences between parametrized convection schemes are a leading source of inter-model spread in cloud feedbacks. We find however that 'ConvOff' models with convection switched off have a similar overall range of cloud feedbacks compared with the standard configurations. Furthermore, applying a simple bias correction method to allow for differences in present-day global cloud radiative effects substantially reduces the differences between the cloud feedbacks with and without parametrized convection in the individual models. We conclude that, while parametrized convection influences the strength of the cloud feedbacks substantially in some models, other processes must also contribute substantially to the overall inter-model spread. The positive shortwave cloud feedbacks seen in the models in subtropical regimes associated with shallow clouds are still present in the ConvOff experiments. Inter-model spread in shortwave cloud feedback increases slightly in regimes associated with trade cumulus in the ConvOff experiments but is quite similar in the most stable subtropical regimes associated with stratocumulus clouds. Inter-model spread in longwave cloud feedbacks in strongly precipitating regions of the tropics is substantially reduced in the ConvOff experiments however, indicating a considerable local contribution from differences in the details of convective parametrizations. In both standard and ConvOff experiments, models with less mid-level cloud and less moist static energy near the top of the boundary layer tend to have more positive tropical cloud feedbacks. 
The role of non-convective processes in contributing to inter-model spread in cloud feedback is discussed. © 2015 The Authors.

  1. The impact of parametrized convection on cloud feedback

    PubMed Central

    Webb, Mark J.; Lock, Adrian P.; Bretherton, Christopher S.; Bony, Sandrine; Cole, Jason N. S.; Idelkadi, Abderrahmane; Kang, Sarah M.; Koshiro, Tsuyoshi; Kawai, Hideaki; Ogura, Tomoo; Roehrig, Romain; Shin, Yechul; Mauritsen, Thorsten; Sherwood, Steven C.; Vial, Jessica; Watanabe, Masahiro; Woelfle, Matthew D.; Zhao, Ming

    2015-01-01

    We investigate the sensitivity of cloud feedbacks to the use of convective parametrizations by repeating the CMIP5/CFMIP-2 AMIP/AMIP + 4K uniform sea surface temperature perturbation experiments with 10 climate models which have had their convective parametrizations turned off. Previous studies have suggested that differences between parametrized convection schemes are a leading source of inter-model spread in cloud feedbacks. We find however that ‘ConvOff’ models with convection switched off have a similar overall range of cloud feedbacks compared with the standard configurations. Furthermore, applying a simple bias correction method to allow for differences in present-day global cloud radiative effects substantially reduces the differences between the cloud feedbacks with and without parametrized convection in the individual models. We conclude that, while parametrized convection influences the strength of the cloud feedbacks substantially in some models, other processes must also contribute substantially to the overall inter-model spread. The positive shortwave cloud feedbacks seen in the models in subtropical regimes associated with shallow clouds are still present in the ConvOff experiments. Inter-model spread in shortwave cloud feedback increases slightly in regimes associated with trade cumulus in the ConvOff experiments but is quite similar in the most stable subtropical regimes associated with stratocumulus clouds. Inter-model spread in longwave cloud feedbacks in strongly precipitating regions of the tropics is substantially reduced in the ConvOff experiments however, indicating a considerable local contribution from differences in the details of convective parametrizations. In both standard and ConvOff experiments, models with less mid-level cloud and less moist static energy near the top of the boundary layer tend to have more positive tropical cloud feedbacks. 
The role of non-convective processes in contributing to inter-model spread in cloud feedback is discussed. PMID:26438278

  2. Constraining the low-cloud optical depth feedback at middle and high latitudes using satellite observations

    DOE PAGES

    Terai, C. R.; Klein, S. A.; Zelinka, M. D.

    2016-08-26

    The increase in cloud optical depth with warming at middle and high latitudes is a robust cloud feedback response found across all climate models. This study builds on results that suggest the optical depth response to temperature is timescale invariant for low-level clouds. The timescale invariance allows one to use satellite observations to constrain the models' optical depth feedbacks. Three passive-sensor satellite retrievals are compared against simulations from eight models from the Atmosphere Model Intercomparison Project (AMIP) of the 5th Coupled Model Intercomparison Project (CMIP5). This study confirms that the low-cloud optical depth response is timescale invariant in the AMIP simulations, generally at latitudes higher than 40°. Compared to satellite estimates, most models overestimate the increase in optical depth with warming at the monthly and interannual timescales. Many models also do not capture the increase in optical depth with estimated inversion strength that is found in all three satellite observations and in previous studies. The discrepancy between models and satellites exists in both hemispheres and in most months of the year. A simple replacement of the models' optical depth sensitivities with the satellites' sensitivities reduces the negative shortwave cloud feedback by at least 50% in the 40°–70°S latitude band and by at least 65% in the 40°–70°N latitude band. Furthermore, based on this analysis of satellite observations, we conclude that the low-cloud optical depth feedback at middle and high latitudes is likely too negative in climate models.
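    The "simple replacement" described above amounts to rescaling the feedback contribution by the ratio of the satellite-derived sensitivity to the modeled one. A sketch under hypothetical numbers; the kernel value and both sensitivities below are illustrative assumptions, not the study's values:

```python
def optical_depth_feedback(kernel, dlntau_dT):
    """Shortwave cloud feedback contribution (W m-2 K-1): a radiative kernel
    for ln(optical depth) times the fractional optical-depth response to
    surface warming."""
    return kernel * dlntau_dT

K_TAU = -0.5           # hypothetical SW kernel, W m-2 per unit dln(tau)
model_sens = 0.08      # model dln(tau)/dT (K-1), illustrative
satellite_sens = 0.03  # satellite-constrained sensitivity, illustrative

fb_model = optical_depth_feedback(K_TAU, model_sens)
fb_constrained = optical_depth_feedback(K_TAU, satellite_sens)

# Fraction of the negative feedback removed by the replacement:
reduction = 1.0 - fb_constrained / fb_model
```

Because the kernel is held fixed, overestimating the optical-depth sensitivity by a given factor makes the negative feedback too strong by the same factor, which is the logic behind the observational constraint.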

  3. Experimental prediction of severe droughts on seasonal to intra-annual time scales with GFDL High-Resolution Atmosphere Model

    NASA Astrophysics Data System (ADS)

    Yu, Z.; Lin, S.

    2011-12-01

    Regional heat waves and droughts have major economic and societal impacts on regional and even global scales. For example, during and following the 2010-2011 La Niña period, severe droughts were reported in many places around the world, including China, the southern US, and East Africa, causing severe hardship in China and famine in East Africa. In this study, we investigate the feasibility and predictability of severe spring-summer drought events, 3 to 6 months in advance, with the 25-km resolution Geophysical Fluid Dynamics Laboratory High-Resolution Atmosphere Model (HiRAM), which is built as a seamless weather-climate model capable of long-term climate simulations as well as skillful seasonal predictions (e.g., Chen and Lin 2011, GRL). We adopt a similar methodology and the same (HiRAM) model as Chen and Lin (2011), which was used successfully for seasonal hurricane predictions. A series of initialized 7-month forecasts starting from Dec 1 is performed for each year (5 members each) of the past decade (2000-2010). We then evaluate the predictability of the severe drought events during this period by comparing model predictions against available observations. To evaluate the predictive skill in this preliminary report, we focus on the anomalies of precipitation, sea-level pressure, and 500-mb height. These anomalies are computed as the individual model prediction minus the mean climatology obtained from an independent AMIP-type "simulation" using observed SSTs (rather than the predicted SSTs used in the forecasts) from the same model.
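    The anomaly computation described in the last sentence can be sketched as follows; the ensemble values and climatology below are hypothetical numbers chosen for illustration.

```python
import numpy as np

def forecast_anomaly(members, amip_climatology):
    """Ensemble-mean anomaly: the mean over ensemble members minus the
    model's own mean climatology from an independent AMIP-type run driven
    by observed SSTs."""
    return np.asarray(members).mean(axis=0) - amip_climatology

# Hypothetical 5-member precipitation forecasts (mm/day) at three grid points
members = [[2.0, 1.1, 0.4],
           [1.8, 1.0, 0.5],
           [2.2, 0.9, 0.3],
           [2.0, 1.2, 0.4],
           [2.0, 0.8, 0.4]]
climatology = np.array([1.5, 1.5, 1.5])

anom = forecast_anomaly(members, climatology)  # negative values flag drought
```

Using the model's own AMIP climatology, rather than an observed one, removes the model's mean bias so that the anomaly reflects the predicted departure rather than systematic error.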

  4. Constraining the low-cloud optical depth feedback at middle and high latitudes using satellite observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terai, C. R.; Klein, S. A.; Zelinka, M. D.

    The increase in cloud optical depth with warming at middle and high latitudes is a robust cloud feedback response found across all climate models. This study builds on results that suggest the optical depth response to temperature is timescale invariant for low-level clouds. The timescale invariance allows one to use satellite observations to constrain the models' optical depth feedbacks. Three passive-sensor satellite retrievals are compared against simulations from eight models from the Atmosphere Model Intercomparison Project (AMIP) of the 5th Coupled Model Intercomparison Project (CMIP5). This study confirms that the low-cloud optical depth response is timescale invariant in the AMIP simulations, generally at latitudes higher than 40°. Compared to satellite estimates, most models overestimate the increase in optical depth with warming at the monthly and interannual timescales. Many models also do not capture the increase in optical depth with estimated inversion strength that is found in all three satellite observations and in previous studies. The discrepancy between models and satellites exists in both hemispheres and in most months of the year. A simple replacement of the models' optical depth sensitivities with the satellites' sensitivities reduces the negative shortwave cloud feedback by at least 50% in the 40°–70°S latitude band and by at least 65% in the 40°–70°N latitude band. Furthermore, based on this analysis of satellite observations, we conclude that the low-cloud optical depth feedback at middle and high latitudes is likely too negative in climate models.

  5. Intercomparison of hydrologic processes in global climate models

    NASA Technical Reports Server (NTRS)

    Lau, W. K.-M.; Sud, Y. C.; Kim, J.-H.

    1995-01-01

    In this report, we address the intercomparison of precipitation (P), evaporation (E), and surface hydrologic forcing (P-E) for 23 Atmospheric Model Intercomparison Project (AMIP) general circulation models (GCMs), along with relevant observations, over a variety of spatial and temporal scales. The intercomparison includes global and hemispheric means, latitudinal profiles, and selected area means for the tropics and extratropics and for ocean and land, respectively. In addition, we have computed anomaly pattern correlations among models and observations for different seasons, harmonic analyses of the annual and semiannual cycles, and rain-rate frequency distributions. We also compare the joint influence of temperature and precipitation on local climate using the Koeppen climate classification scheme.
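    The anomaly pattern correlation used above is a centered spatial correlation between two anomaly fields. A minimal unweighted sketch (on a real latitude-longitude grid the averages would need area weighting, omitted here; the field values are hypothetical):

```python
import numpy as np

def pattern_correlation(field_a, field_b):
    """Centered anomaly pattern correlation: remove each field's spatial
    mean, then correlate the two fields point by point."""
    a = field_a - field_a.mean()
    b = field_b - field_b.mean()
    return (a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum())

model = np.array([1.0, 2.0, 3.0, 4.0])  # hypothetical seasonal anomalies
obs = np.array([0.8, 2.1, 2.9, 4.2])

r = pattern_correlation(model, obs)  # close to 1 when patterns agree
```

A value near 1 indicates the model places its anomalies in the same locations as the observations, regardless of any common offset in the means.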

  6. The North American Regional Climate Change Assessment Program (NARCCAP): Status and results

    NASA Astrophysics Data System (ADS)

    Gutowski, W. J.

    2009-12-01

    NARCCAP is a multi-institutional program, supported by multiple federal agencies, that is systematically investigating the uncertainties in regional-scale simulations of contemporary climate and projections of future climate. NARCCAP is producing an ensemble of high-resolution climate-change scenarios by nesting multiple RCMs in reanalyses and multiple atmosphere-ocean GCM simulations of contemporary and future-scenario climates. The RCM domains cover the contiguous U.S., northern Mexico, and most of Canada. The simulation suite also includes time-slice, high-resolution GCMs that use sea-surface temperatures from parent atmosphere-ocean GCMs. The baseline resolution of the RCMs and time-slice GCMs is 50 km. Simulations use three sources of boundary conditions: the National Centers for Environmental Prediction (NCEP)/Department of Energy (DOE) AMIP-II Reanalysis, GCMs simulating contemporary climate, and GCMs using the A2 SRES emission scenario for the twenty-first century. Simulations cover 1979-2004 and 2038-2060, with the first 3 years discarded for spin-up. The resulting RCM and time-slice simulations offer the opportunity for extensive analysis of RCM simulations as well as a basis for multiple high-resolution climate scenarios for climate-change impacts assessments. Geophysical statisticians are developing measures of uncertainty from the ensemble. To enable very high-resolution simulations of specific regions, both RCM and high-resolution time-slice simulations are saving the output needed for further downscaling. All output is publicly available to the climate analysis and climate impacts assessment communities through an archiving and data-distribution plan. Some initial results show that the models closely reproduce ENSO-related precipitation variations in coastal California, where the correlation between the simulated and observed monthly time series exceeds 0.94 for all models.
The strong El Nino events of 1982-83 and 1997-98 are well reproduced for the Pacific coastal region of the U.S. in all models. ENSO signals are less well reproduced in other regions. The models also reproduce extreme monthly precipitation well in coastal California and the Upper Midwest. Model performance tends to deteriorate from west to east across the domain, or roughly from the inflow boundary toward the outflow boundary. This deterioration with distance from the inflow boundary is ameliorated to some extent in models formulated such that large-scale information is included in the model solution, whether implemented by spectral nudging or by use of a perturbation form of the governing equations.

  7. Wave Activity (Planetary, Tidal) throughout the Middle Atmosphere (25-100 km) over the CUJO Network: Satellite and Medium Frequency (MF) Radar Observations

    NASA Astrophysics Data System (ADS)

    Manson, A.; Meek, C.; Chshyolkova, T.; Avery, S.; Thorsen, D.; MacDougall, J.; Hocking, W.; Murayama, Y.; Igarashi, K.

    Planetary and tidal wave activity in the mesosphere-lower thermosphere (MLT), and assessment of wave activity sources in the lower atmosphere, are studied using combinations of ground-based (GB) and satellite instruments (2000-2002). CUJO (Canada U.S. Japan Opportunity) comprises MF radar (MFR) systems at London (43°N, 81°W), Platteville (40°N, 105°W), Saskatoon (52°N, 107°W), Wakkanai (45°N, 142°E) and Yamagawa (31°N, 131°E). It offers a significant mid-latitude 7,000 km longitudinal sector in the North American-Pacific region, and a useful range of latitudes (12-14°) at two longitudes. CUJO provides winds and tides over 70-100 km. Satellite data include the daily values of the total ozone column measured by the Earth Probe (EP) TOMS (Total Ozone Mapping Spectrometer), which provides a measure of tropopause-lower-stratospheric planetary wave activity as well as ozone variability. The so-called UKMO data (an assimilation system) are used for correlative purposes with the TOMS data. Climatologies of ozone and winds/tides involving frequency-versus-time (wavelet) contour plots, for periods from 2-d to 30-d over the interval from mid 2000 to 2002, show that the changes with altitude, longitude and latitude are very significant and distinctive. Geometric-mean wavelets for the region of the 40°N MFRs demonstrate occasions during the autumn, winter and spring months when there are similarities in the spectral features of the lower atmosphere and at mesopause (85 km) heights. Direct planetary wave (PW) propagation into the MLT, non-linear PW-tide interactions, and disturbances in MLT tides associated with fluctuations in the ozone forcing are all considered possible coupling processes. The complex horizontal wave numbers of the longer-period oscillations are provided in frequency contour plots for the TOMS and UKMO data to demonstrate the differences between lower atmospheric and MLT wave motions and their directions of propagation.

  8. West African Monsoon dynamics in idealized simulations: the competitive roles of SST warming and CO2

    NASA Astrophysics Data System (ADS)

    Gaetani, Marco; Flamant, Cyrille; Hourdin, Frederic; Bastin, Sophie; Braconnot, Pascale; Bony, Sandrine

    2015-04-01

    The West African Monsoon (WAM) is affected by large climate variability at different timescales, from interannual to multidecadal, with strong environmental and socio-economic impacts associated with climate-related rainfall variability, especially in the Sahelian belt. State-of-the-art coupled climate models still show poor ability to correctly simulate past WAM variability, and a large spread is observed in future climate projections. In this work, the July-to-September (JAS) WAM variability in the period 1979-2008 is studied in AMIP-like (SST-forced) simulations from CMIP5. The individual roles of global SST warming and increasing CO2 concentration are investigated through idealized experiments simulating a 4 K warmer SST and a 4x CO2 concentration, respectively. Results show a dry response in the Sahel to SST warming, with drier conditions over the western Sahel. On the contrary, wet conditions are observed when CO2 is increased, with the strongest response over the central-eastern Sahel. The precipitation changes are associated with modifications in the regional atmospheric circulation: dry (wet) conditions are associated with reduced (increased) convergence in the lower troposphere, a southward (northward) shift of the African Easterly Jet, and a weaker (stronger) Tropical Easterly Jet. The co-variability between global SST and WAM precipitation is also investigated, highlighting a reorganization of the main co-variability modes. Namely, in the 4xCO2 simulation the influence of the Tropical Pacific is dominant, while it is reduced in the 4K simulation, which also shows an increased coupling with the eastern Pacific and the Indian Ocean. The above results suggest a competitive action of SST warming and increasing CO2 on WAM climate variability, with opposite effects on precipitation.
The combination of the observed positive and negative responses in precipitation, with wet conditions in the central-eastern Sahel and dry conditions in the western Sahel, is consistent with the future precipitation trends over West Africa resulting from CMIP5 coupled simulations. It is argued that the large spread in CMIP5 future projections may be related to the weight given to SST warming and the direct CO2 effect by individual models. The capability of climate models to reproduce the SST-precipitation relationship appears to be crucial in this respect.

  9. CLIVAR-GSOP/GODAE Ocean Synthesis Inter-Comparison of Global Air-Sea Fluxes From Ocean and Coupled Reanalyses

    NASA Astrophysics Data System (ADS)

    Valdivieso, Maria

    2014-05-01

    The GODAE OceanView and CLIVAR-GSOP ocean synthesis program has been assessing the degree of consistency between global air-sea flux data sets obtained from ocean or coupled reanalyses (Valdivieso et al., 2014). So far, fifteen global air-sea heat flux products obtained from ocean or coupled reanalyses have been examined: seven are from low-resolution ocean reanalyses (BOM PEODAS, ECMWF ORAS4, JMA/MRI MOVEG2, JMA/MRI MOVECORE, Hamburg Univ. GECCO2, JPL ECCOv4, and NCEP GODAS), five are from eddy-permitting ocean reanalyses developed as part of the EU GMES MyOcean program (Mercator GLORYS2v1, Reading Univ. UR025.3, UR025.4, UKMO GloSea5, and CMCC C-GLORS), and the remaining three are coupled reanalyses based on coupled climate models (JMA/MRI MOVE-C, GFDL ECDA and NCEP CFSR). The global heat closure in the products over the period 1993-2009 spanned by all data sets is presented in comparison with observational and atmospheric reanalysis estimates. Then, global maps of ensemble spread in the seasonal cycle, and of the signal-to-noise ratio of interannual flux variability over the 17-yr common period, are shown to illustrate the consistency between the products. We have also studied regional variability in the products, particularly at the OceanSITES project locations (such as, for instance, the TAO/TRITON and PIRATA arrays in the Tropical Pacific and Atlantic, respectively). Comparisons are being made with other products such as OAFlux latent and sensible heat fluxes (Yu et al., 2008) combined with ISCCP satellite-based radiation (Zhang et al., 2004), the ship-based NOC2.0 product (Berry and Kent, 2009), the Large and Yeager (2009) hybrid flux dataset CORE.2, and two atmospheric reanalysis products, the ECMWF ERA-Interim reanalysis (referred to as ERAi, Dee et al., 2011) and the NCEP/DOE reanalysis R2 (referred to as NCEP-R2, Kanamitsu et al., 2002). Preliminary comparisons with the observational flux products from OceanSITES are also underway.
References: Berry, D.I. and E.C. Kent (2009), A New Air-Sea Interaction Gridded Dataset from ICOADS with Uncertainty Estimates. Bull. Amer. Meteor. Soc., 90(5), 645-656. doi: 10.1175/2008BAMS2639.1. Dee, D. P. et al. (2011), The ERA-Interim reanalysis: configuration and performance of the data assimilation system. Q.J.R. Meteorol. Soc., 137: 553-597. doi: 10.1002/qj.828. Kanamitsu M., Ebisuzaki W., Woollen J., Yang S.K., Hnilo J.J., Fiorino M., Potter G. (2002), NCEP-DOE AMIP-II reanalysis (R-2). Bull. Amer. Meteor. Soc., 83: 1631-1643. Large, W. and Yeager, S. (2009), The global climatology of an interannually varying air-sea flux data set. Clim. Dynamics, 33, 341-364. Valdivieso, M. and co-authors (2014), Heat fluxes from ocean and coupled reanalyses. Clivar Exchanges, Issue 64. Yu, L., X. Jin, and R. A. Weller (2008), Multidecade Global Flux Datasets from the Objectively Analyzed Air-sea Fluxes (OAFlux) Project: Latent and Sensible Heat Fluxes, Ocean Evaporation, and Related Surface Meteorological Variables. Technical Report OAFlux Project (OA2008-01), Woods Hole Oceanographic Institution. Zhang, Y., W.B. Rossow, A.A. Lacis, V. Oinas, M.I. Mishchenko (2004), Calculation of radiative fluxes from the surface to top of the atmosphere based on ISCCP and other global data sets. Journal of Geophysical Research: Atmospheres (1984-2012), 109 (D19).

  10. High-resolution regional climate model evaluation using variable-resolution CESM over California

    NASA Astrophysics Data System (ADS)

    Huang, X.; Rhoades, A.; Ullrich, P. A.; Zarzycki, C. M.

    2015-12-01

    Understanding the effect of climate change at regional scales remains a topic of intensive research. Though computational constraints remain a problem, high horizontal resolution is needed to represent topographic forcing, which is a significant driver of local climate variability. Although regional climate models (RCMs) have traditionally been used at these scales, variable-resolution global climate models (VRGCMs) have recently arisen as an alternative for studying regional weather and climate, allowing two-way interaction between these domains without the need for nudging. In this study, the recently developed variable-resolution option within the Community Earth System Model (CESM) is assessed for long-term regional climate modeling over California. Our variable-resolution simulations will focus on relatively high resolutions for climate assessment, namely 28 km and 14 km regional resolution, which are much more typical for dynamically downscaled studies. For comparison with the more widely used RCM method, the Weather Research and Forecasting (WRF) model will be used for simulations at 27 km and 9 km. All simulations use the AMIP (Atmospheric Model Intercomparison Project) protocols. The time period is from 1979-01-01 to 2005-12-31 (UTC), and year 1979 was discarded as spin-up time. The mean climatology across California's diverse climate zones, including temperature and precipitation, is analyzed and contrasted with the WRF model (as a traditional RCM), regional reanalysis, gridded observational datasets, and uniform high-resolution CESM at 0.25 degree with the finite-volume (FV) dynamical core. The results show that variable-resolution CESM is competitive in representing regional climatology on both annual and seasonal time scales.
This assessment adds value to the use of VRGCMs for projecting climate change over the coming century and improves our understanding of both past and future regional climate related to fine-scale processes. It is also relevant for addressing the scale limitation of current RCMs or VRGCMs when next-generation model resolution increases to ~10 km and beyond.

  11. The Role of Low-Level, Terrain-Induced Jets in Rainfall Variability in Tigris Euphrates Headwaters

    NASA Technical Reports Server (NTRS)

    Dezfuli, Amin K.; Zaitchik, Benjamin F.; Badr, Hamada S.; Evans, Jason; Peters-Lidard, Christa D.

    2017-01-01

    Rainfall variability in the Tigris Euphrates headwaters is a result of interaction between topography and meteorological features at a range of spatial scales. Here, the Weather Research and Forecasting (WRF) Model, driven by the NCEP-DOE AMIP-II reanalysis (R-2), has been implemented to better understand these interactions. Simulations were performed over a domain covering most of the Middle East. The extended simulation period (1983-2013) enables us to study seasonality, interannual variability, spatial variability, and extreme events of rainfall. Results showed that the annual cycle of precipitation produced by WRF agrees much more closely with observations than does R-2. This was particularly evident during the transition months of April and October, which were further examined to study the underlying physical mechanisms. In both months, WRF improves representation of interannual variability relative to R-2, with a substantially larger benefit in April. This improvement results primarily from WRF's ability to resolve two low-level, terrain-induced flows in the region that are either absent or weak in R-2: one parallel to the western edge of the Zagros Mountains, and one along the east Turkish highlands. The first shows a complete reversal in its direction during wet and dry days: when flowing southeasterly it transports moisture from the Persian Gulf to the region, and when flowing northwesterly it blocks moisture and transports it away from the region. The second is more directly related to synoptic-scale systems and carries moist, warm air from the Mediterranean and Red Seas toward the region. The combined contribution of these flows explains about 50% of the interannual variability in both WRF and observations for April and October precipitation.

  12. Responses of Mean and Extreme Precipitation to Deforestation in the Maritime Continent

    NASA Astrophysics Data System (ADS)

    Chen, C. C.; Lo, M. H.; Yu, J. Y.

    2017-12-01

    Anthropogenic land use and land cover change, including tropical deforestation, can have substantial effects on local surface energy and water budgets, and thus on atmospheric stability, which may result in changes in precipitation. The Maritime Continent has undergone severe deforestation in recent decades but has received less attention than the Amazon or Congo rainforests. This study therefore aims to decipher the precipitation response to deforestation in the Maritime Continent. We conduct deforestation experiments using the Community Earth System Model (CESM), converting the tropical rainforest into grassland. The results show that deforestation in the Maritime Continent leads to an increase in both mean temperature and mean precipitation. Moisture budget analysis indicates that the increase in precipitation is associated with vertically integrated vertical moisture advection, especially its dynamic component (changes in convection). In addition, through moist static energy (MSE) budget analysis, we find that the atmosphere over deforested areas becomes unstable owing to the combined effects of positive specific humidity anomalies at around 850 hPa and anomalous warming extending from the surface to 750 hPa. This instability induces anomalous ascending motion, which enhances the low-level moisture convergence, providing water vapor from the surrounding warm ocean. To further evaluate the precipitation response to deforestation, we examine the precipitation changes under La Niña events and a global warming scenario using CESM Atmospheric Model Intercomparison Project (AMIP) simulations and Representative Concentration Pathway (RCP) 8.5 simulations. We find that the precipitation increase caused by deforestation in the Maritime Continent is comparable in magnitude to that generated by either natural variability or global warming forcing. Besides the changes in mean precipitation, preliminary results show that extreme precipitation also increases.
We will further explore how the extreme precipitation changes with the deforestation forcing.
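    The moist static energy used in the MSE budget analysis above is conventionally h = cp*T + g*z + Lv*q. A small illustrative calculation (the input values are assumptions for warm, moist near-surface tropical air, not model output):

```python
# Standard constants for the moist static energy h = cp*T + g*z + Lv*q.
CP = 1004.0    # J kg-1 K-1, specific heat of dry air at constant pressure
G = 9.81       # m s-2, gravitational acceleration
LV = 2.5e6     # J kg-1, latent heat of vaporization

def moist_static_energy(T_kelvin, z_meters, q_kg_per_kg):
    """MSE in J/kg from temperature, geopotential height, specific humidity."""
    return CP * T_kelvin + G * z_meters + LV * q_kg_per_kg

# Illustrative near-surface values: 300 K, sea level, 18 g/kg moisture.
h = moist_static_energy(300.0, 0.0, 0.018)
print(f"MSE = {h / 1000:.1f} kJ/kg")   # -> MSE = 346.2 kJ/kg
```

Contrasting vertical profiles of h below and above the boundary layer is what reveals the instability the abstract attributes to the 850 hPa humidity and low-level warming anomalies.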

  13. The Role of Low-Level Terrain-Induced Jets in Rainfall Variability in Tigris-Euphrates Headwaters

    NASA Technical Reports Server (NTRS)

    Dezfuli, Amin K.; Zaitchik, Benjamin F.; Badr, Hamada S.; Evans, Jason; Peters-Lidard, Christa D.

    2017-01-01

    Rainfall variability in the Tigris-Euphrates headwaters is a result of interaction between topography and meteorological features at a range of spatial scales. Here, the Weather Research and Forecasting (WRF) Model, driven by the NCEP-DOE AMIP-II reanalysis (R-2), has been implemented to better understand these interactions. Simulations were performed over a domain covering most of the Middle East. The extended simulation period (1983-2013) enables us to study seasonality, interannual variability, spatial variability, and extreme events of rainfall. Results showed that the annual cycle of precipitation produced by WRF agrees much more closely with observations than does R-2. This was particularly evident during the transition months of April and October, which were further examined to study the underlying physical mechanisms. In both months, WRF improves representation of interannual variability relative to R-2, with a substantially larger benefit in April. This improvement results primarily from WRF's ability to resolve two low-level, terrain-induced flows in the region that are either absent or weak in R-2: one parallel to the western edge of the Zagros Mountains, and one along the east Turkish highlands. The first shows a complete reversal in its direction during wet and dry days: when flowing southeasterly it transports moisture from the Persian Gulf to the region, and when flowing northwesterly it blocks moisture and transports it away from the region. The second is more directly related to synoptic-scale systems and carries moist, warm air from the Mediterranean and Red Seas toward the region. The combined contribution of these flows explains about 50% of the interannual variability in both WRF and observations for April and October precipitation.

  14. Causes of skill in seasonal predictions of the Arctic Oscillation

    NASA Astrophysics Data System (ADS)

    Kumar, Arun; Chen, Mingyue

    2017-11-01

    Based on an analysis of hindcasts from a seasonal forecast system, complemented by the analysis of a large ensemble of AMIP simulations, possible causes for skillful prediction of the winter Arctic Oscillation (AO) on a seasonal time-scale are analyzed. The possibility that the recent increase in AO skill could be due to model improvements, or due to changes in the persistence characteristics of the AO, is first discounted. The analysis then focuses on exploring the possibility that the recent increase in the prediction skill of the AO may be due to sampling variations or could have physical causes. Temporal variations in AO skill due entirely to sampling alone cannot be discounted, as this is a fundamental constraint on verifications over a short time-series. This notion is supported by theoretical considerations, and by the analysis of the temporal variations in the perfect-model skill, where substantial variations in skill due to sampling alone are documented. As for the physical causes, the analysis indicates possible links between the prediction skill of the AO and the SST forcing from the tropics, particularly related to the SST variations associated with the Trans-Niño Index (TNI). Interannual and low-frequency variations in the TNI could have contributed to similar temporal variations in AO skill. For example, a dominance of central Pacific El Niño events after 2000 (a reflection of low-frequency variations in TNI) coincided with an increase in the prediction skill of the AO. The analysis approach and results provide an avenue for further investigations; for example, model simulations forced with the SST pattern associated with the TNI, to establish or reaffirm causes for AO skill.

  15. Ranking GCM Estimates of Twentieth Century Precipitation Seasonality in the Western U.S. and its Influence on Floristic Provinces.

    NASA Astrophysics Data System (ADS)

    Cole, K. L.; Eischeid, J. K.; Garfin, G. M.; Ironside, K.; Cobb, N. S.

    2008-12-01

    Floristic provinces of the western United States (west of 100°W) can be segregated into three regions defined by significant seasonal precipitation during the months of: 1) November-March (Mediterranean); 2) July-September (Monsoonal); or, 3) May-June (Rocky Mountain). This third region is best defined by the absence of the late spring-early summer drought that affects regions 1 and 2. Each of these precipitation regimes is characterized by distinct vegetation types and fire seasonality adapted to that particular cycle of seasonal moisture availability and deficit. Further, areas where these regions blend from one to another can support even more complex seasonal patterns and resulting distinctive vegetation types. As a result, modeling the effects of climates on these ecosystems requires confidence that GCMs can at least approximate these sub-continental seasonal precipitation patterns. We evaluated the late Twentieth Century (1950-1999 AD) estimates of annual precipitation seasonality produced by 22 GCMs contained within the IPCC Fourth Assessment (AR4). These modeled estimates were compared to values from the PRISM dataset, extrapolated from station data, over the same historical period for the 3 seasonal periods defined above. The correlations between GCM estimates and PRISM values were ranked using 4 measures: 1) a map pattern relationship based on the correlation coefficient; 2) a map pattern relationship based on the congruence coefficient; 3) the ratio of simulated/observed area-averaged precipitation based on the seasonal precipitation amounts; and 4) the ratio of simulated/observed area-averaged precipitation based on the seasonal precipitation percentages of the annual total. For each of the four metrics, the rank order of models was very similar. The ranked order of the performance of the different models quantified aspects of the model performance visible in the mapped results.
While some models represented the seasonal patterns very well, others showed little correspondence with the regional patterns, especially for the summer monsoon period. These sub-continental patterns were especially well simulated over this period by the UKMO-HadGEM1, ECHAM5/MPI-OM, and the MRI-CGCM2 model runs.
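    The two map-pattern measures ranked above differ only in centering: the correlation coefficient removes each field's mean, while the congruence coefficient compares the raw patterns and therefore also penalizes mean offsets. A minimal sketch with hypothetical 1-D stand-ins for flattened precipitation maps (unweighted, for brevity):

```python
import numpy as np

# Pearson correlation: centered, insensitive to a constant bias.
def correlation(x, y):
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc))

# Congruence coefficient: uncentered, so a mean offset lowers the score.
def congruence(x, y):
    return (x @ y) / np.sqrt((x @ x) * (y @ y))

obs = np.array([1.0, 2.0, 3.0, 4.0])
model = obs + 10.0            # same spatial pattern, large wet bias
print(correlation(model, obs))   # 1.0: the pattern is identical
print(congruence(model, obs))    # < 1.0: penalized for the offset
```

Using both measures, as the study does, separates models that get the spatial pattern right from models that also get the mean amounts right.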

  16. Antarctic Polar Descent and Planetary Wave Activity Observed in ISAMS CO from April to July 1992

    NASA Technical Reports Server (NTRS)

    Allen, D. R.; Stanford, J. L.; Nakamura, N.; Lopez-Valverde, M. A.; Lopez-Puertas, M.; Taylor, F. W.; Remedios, J. J.

    2000-01-01

    Antarctic polar descent and planetary wave activity in the upper stratosphere and lower mesosphere are observed in ISAMS CO data from April to July 1992. CO-derived mean April-to-May upper stratosphere descent rates of 15 K/day (0.25 km/day) at 60 S and 20 K/day (0.33 km/day) at 80 S are compared with descent rates from diabatic trajectory analyses. At 60 S there is excellent agreement, while at 80 S the trajectory-derived descent is significantly larger in early April. Zonal wavenumber 1 enhancement of CO is observed on 9 and 28 May, coincident with enhanced wave 1 in UKMO geopotential height. The 9 May event extends from 40 to 70 km and shows westward phase tilt with height, while the 28 May event extends from 40 to 50 km and shows virtually no phase tilt with height.

  17. Quantitative impact of aerosols on numerical weather prediction. Part I: Direct radiative forcing

    NASA Astrophysics Data System (ADS)

    Marquis, J. W.; Zhang, J.; Reid, J. S.; Benedetti, A.; Christensen, M.

    2017-12-01

    While the effects of aerosols on climate have been extensively studied over the past two decades, the impacts of aerosols on operational weather forecasts have not been carefully quantified. Yet aerosol plumes can impact weather forecasts directly by reducing surface-reaching solar radiation and indirectly through affecting remotely sensed data that are used for weather forecasts. In Part I of this study, the direct impact of smoke aerosol plumes on surface temperature forecasts is quantified using a smoke aerosol event affecting the Upper Midwest of the United States in 2015. NCEP, ECMWF and UKMO model forecast surface temperature uncertainties are studied with respect to aerosol loading. Smoke aerosol direct cooling efficiencies are derived, and the potential of including aerosol particles in operational forecasts is discussed, with consideration of aerosol trends, especially over regions with heavy aerosol loading.

  18. Statistical analysis of simulated global soil moisture and its memory in an ensemble of CMIP5 general circulation models

    NASA Astrophysics Data System (ADS)

    Wiß, Felix; Stacke, Tobias; Hagemann, Stefan

    2014-05-01

    Soil moisture and its memory can have a strong impact on near-surface temperature and precipitation, and have the potential to promote severe heat waves, dry spells and floods. To analyze how soil moisture is simulated in recent general circulation models (GCMs), soil moisture data from a 23-model ensemble of Atmospheric Model Intercomparison Project (AMIP) type simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5) are examined for the period 1979 to 2008 with regard to parameterization and statistical characteristics. With respect to soil moisture processes, the models vary in their maximum soil and root depth, the number of soil layers, the water-holding capacity, and the ability to simulate freezing, which together lead to very different soil moisture characteristics. Differences in the water-holding capacity result in deviations in the global median soil moisture of more than one order of magnitude between the models. In contrast, the variance shows similar absolute values when comparing the models to each other. Thus, the input and output rates by precipitation and evapotranspiration, which are computed by the atmospheric component of the models, have to be in the same range. Most models simulate large variances in the monsoon areas of the tropics and the northwestern U.S., intermediate variances in Europe and the eastern U.S., and low variances in the Sahara, continental Asia, and central and western Australia. In general, the variance decreases with latitude over the high northern latitudes. As soil moisture trends in the models were found to be negligible, the soil moisture anomalies were calculated by subtracting the 30-year monthly climatology from the data. The length of the memory is determined from the soil moisture anomalies by calculating the first insignificant autocorrelation for ascending monthly lags (insignificant autocorrelation folding time).
The models show a large spread in autocorrelation length, from a few months in the tropics, northwestern Canada, the eastern U.S. and northern Europe up to a few years in the Sahara, the Arabian Peninsula, continental Eurasia and the central U.S. Some models simulate very long memory all over the globe. This behavior is associated with differences between the models in the maximum root and soil depth. Models with shallow roots and deep soils exhibit longer memories than models with similar soil and root depths. Further analysis will be conducted to clearly divide the models into groups based on the inter-model spatial correlation of their simulated soil moisture characteristics.
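    The memory measure described above, the first monthly lag at which the anomaly autocorrelation becomes insignificant, can be sketched on a synthetic red-noise series (the AR(1) coefficient and the approximate 95% significance threshold are illustrative assumptions, not the study's exact test):

```python
import numpy as np

# Scan ascending monthly lags and return the first lag whose sample
# autocorrelation falls below an approximate 95% significance level.
def first_insignificant_lag(anom, max_lag=60):
    n = len(anom)
    threshold = 1.96 / np.sqrt(n)          # white-noise approximation
    for lag in range(1, max_lag + 1):
        r = np.corrcoef(anom[:-lag], anom[lag:])[0, 1]
        if r < threshold:
            return lag
    return max_lag

# Synthetic soil moisture anomalies: AR(1) with a few months of memory.
rng = np.random.default_rng(1)
anom = np.zeros(360)                       # 30 years of monthly anomalies
for t in range(1, 360):
    anom[t] = 0.7 * anom[t - 1] + rng.normal()

print(first_insignificant_lag(anom), "months")
```

Applying this per grid cell to each model's anomaly field is what produces the maps of memory length that the abstract compares across the ensemble.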

  19. A study on the predictability of the transition day from the dry to the rainy season over South Korea

    NASA Astrophysics Data System (ADS)

    Lee, Sang-Min; Nam, Ji-Eun; Choi, Hee-Wook; Ha, Jong-Chul; Lee, Yong Hee; Kim, Yeon-Hee; Kang, Hyun-Suk; Cho, ChunHo

    2016-08-01

    This study was conducted to evaluate the prediction accuracies of The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) data at six operational forecast centers, using the root-mean-square difference (RMSD) and Brier score (BS) from April to July 2012. It also tested the precipitation predictability of the ensemble prediction systems (EPSs) for the onset of the summer rainy season (the withdrawal day of the spring drought) over South Korea on 29 June 2012, with use of the ensemble mean precipitation, ensemble probability precipitation, 10-day lag ensemble forecasts (ensemble mean and probability precipitation), and the effective drought index (EDI). The RMSD analysis of atmospheric variables (geopotential height at 500 hPa, temperature at 850 hPa, sea-level pressure, and specific humidity at 850 hPa) showed that the prediction accuracies of the EPSs at the Meteorological Service of Canada (CMC) and the China Meteorological Administration (CMA) were poor, and those at the European Centre for Medium-Range Weather Forecasts (ECMWF) and the Korea Meteorological Administration (KMA) were good. ECMWF and KMA also showed better results than the other EPSs for predicting precipitation in the BS distributions. The onset of the summer rainy season could be predicted using ensemble-mean precipitation from a 4-day leading time at all forecast centers. In addition, the spatial distributions of predicted precipitation of the EPSs at KMA and the Met Office of the United Kingdom (UKMO) were similar to those of observed precipitation, showing good performance. The precipitation probability forecasts of the EPSs at CMA, the National Centers for Environmental Prediction (NCEP), and UKMO (ECMWF and KMA) at 1-day lead time produced over-forecasting (under-forecasting) in the reliability diagram, and all forecasts at 2-4-day lead times showed under-forecasting.
The precipitation on the onset day of the summer rainy season could also be predicted from a 4-day lead time up to the initial time by using the 10-day lag ensemble mean and probability forecasts. Additionally, the predictability of the withdrawal day of the spring drought, which ended with the precipitation on the onset day of the summer rainy season, was evaluated using the effective drought index (EDI) calculated from the ensemble mean precipitation forecasts and spreads of the five EPSs.
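    The two verification measures used above, RMSD and the Brier score, can be sketched on made-up data (the numbers are illustrative assumptions, not TIGGE values):

```python
import numpy as np

# Root-mean-square difference between a forecast field and an analysis.
def rmsd(forecast, analysis):
    return np.sqrt(np.mean((forecast - analysis) ** 2))

# Brier score: mean squared error of forecast event probabilities
# against binary outcomes (1 = event occurred, 0 = it did not).
def brier_score(prob, occurred):
    return np.mean((prob - occurred) ** 2)

# Hypothetical 500-hPa geopotential heights (gpm) at three grid points.
fc = np.array([5870.0, 5820.0, 5760.0])
an = np.array([5880.0, 5810.0, 5770.0])
print(f"RMSD = {rmsd(fc, an):.1f} gpm")              # -> RMSD = 10.0 gpm

# Hypothetical precipitation-event probabilities and observed outcomes.
p = np.array([0.9, 0.6, 0.2, 0.1])
o = np.array([1.0, 1.0, 0.0, 0.0])
print(f"Brier score = {brier_score(p, o):.3f}")       # -> 0.055 (0 is perfect)
```

Lower is better for both measures, which is why smaller RMSD and BS values mark the better-performing centers in the comparison above.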

  20. On the recent warming in the subcloud layer entropy and vertically integrated moist static energy over South Asian Monsoon region.

    NASA Astrophysics Data System (ADS)

    Konduru, R.; Gupta, A.; Matsumoto, J.; Takahashi, H. G.

    2017-12-01

    To explain the monsoon circulation, surface temperature gradients have traditionally been invoked. However, they cannot explain certain important aspects of the circulation. More recently, the convective quasi-equilibrium framework and the vertically integrated atmospheric energy budget have become recognized theories for explaining the monsoon circulation. In this article, these frameworks are applied over the South Asian summer monsoon region for the period 1979-2010. Using the NCEP-R2, NOAA 20th Century, and ERA-Interim reanalyses, an important feature was identified in the subcloud layer entropy and the vertically integrated moist static energy. Over the last 32 years, both quantities have shown significant seasonal warming across the region, peaking over the poleward flank of the cross-equatorial cell. The warming was found to be driven mainly by an increase in surface enthalpy fluxes, although dynamical contributions were also observed. An increase in positive anomalies of vertical advection of moist static energy over the northern Bay of Bengal, Central India, Peninsular India, the eastern Arabian Sea, and the equatorial Indian Ocean was found to be an important dynamic factor contributing to the warming of the vertically integrated moist static energy; the vertical moist stability also supported this argument. Similar behavior was found in the AMIP simulation of the CCSM4 model. Further modeling experiments on this warming will help to identify the exact mechanism behind it.
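
    The central quantity here, moist static energy, has a simple closed form, h = cp*T + g*z + Lv*q, and its column value is the mass-weighted pressure integral. The sketch below uses textbook constant values, and the (T, z, q, dp) level layout is an assumed illustration rather than the paper's data format:

```python
# Standard constant values (approximate)
CP = 1004.0   # J kg-1 K-1, specific heat of dry air at constant pressure
G = 9.81      # m s-2, gravitational acceleration
LV = 2.5e6    # J kg-1, latent heat of vaporization

def moist_static_energy(T, z, q):
    """MSE h = cp*T + g*z + Lv*q (J/kg), for temperature T (K),
    geopotential height z (m), and specific humidity q (kg/kg)."""
    return CP * T + G * z + LV * q

def column_mse(levels):
    """Mass-weighted vertical integral (1/g) * sum(h * dp), in J m-2,
    for levels given as (T, z, q, dp) tuples with layer thickness dp
    in Pa (positive)."""
    return sum(moist_static_energy(T, z, q) * dp for T, z, q, dp in levels) / G
```

    Warming of the column-integrated MSE then shows up directly as an increase in this integral from one season or decade to the next.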

  1. A measurement/model comparison of ozone photochemical loss in the Antarctic ozone hole using Polar Ozone and Aerosol Measurement observations and the Match technique

    NASA Astrophysics Data System (ADS)

    Hoppel, Karl; Bevilacqua, Richard; Canty, Timothy; Salawitch, Ross; Santee, Michelle

    2005-10-01

    The Polar Ozone and Aerosol Measurement (POAM III) instrument has provided 6 years (1998 to present) of Antarctic ozone profile measurements, which detail the annual formation of the ozone hole. During the period of ozone hole formation the measurement latitude follows the edge of the polar night and presents a unique challenge for comparing with model simulations. The formation of the ozone hole has been simulated by using a photochemical box model with an ensemble of trajectories, and the results were sampled at the measurement latitude for comparison with the measured ozone. The agreement is generally good but very sensitive to the model dynamics and less sensitive to changes in the model chemistry. In order to better isolate the chemical ozone loss the Match technique was applied to 5 years of data to directly calculate ozone photochemical loss rates. The measured loss rates are specific to the high solar zenith angle conditions of the POAM-Match trajectories and are found to increase slowly from July to early August and then increase rapidly until mid-September. The Match results are sensitive to the choice of meteorological analysis used for the trajectory calculations. The ECMWF trajectories yield the smallest, and perhaps most accurate, peak loss rates that can be reproduced by a photochemical model using standard JPL 2002 kinetics, assuming reactive bromine (BrOx) of 14 pptv based solely on contributions from CH3Br and halons, and without requiring ClOx to exceed the upper limit for available inorganic chlorine of 3.7 ppbv. Larger Match ozone loss rates are found for the late August and early September period if trajectories based on UKMO and NCEP analyses are employed. Such loss rates require higher values for ClO and/or BrO than can be simulated using JPL 2002 chemical kinetics and complete activation of chlorine. 
In these cases, the agreement between modeled and measured loss rates is significantly improved if the model employs larger ClOOCl cross sections (e.g., Burkholder et al., 1990) and BrOx of 24 ppt which reflects significant contributions from very short-lived bromocarbons to the inorganic bromine budget.

  2. Understanding the Geographic Controls of Hazardous Convective Weather Environments in the United States

    NASA Astrophysics Data System (ADS)

    Reed, K. A.; Chavas, D. R.

    2017-12-01

    Hazardous Convective Weather (HCW), such as severe thunderstorms and tornadoes, poses significant risk to life and property in the United States every year. While HCW events are small in scale, they develop principally within favorable larger-scale environments (i.e., HCW environments). Why these large-scale environments are confined to specific regions, particularly the eastern United States, is not well understood. This can, in part, be related to limited fundamental knowledge of how the climate system creates HCW environments, which introduces uncertainty into how HCW environments may be altered in a changing climate. Previous research has identified the Gulf of Mexico to the south and elevated terrain upstream as key geographic contributors to the generation of HCW environments over the eastern United States. This work investigates the relative roles of these geographic features through "component denial" experiments in the Community Atmosphere Model version 5 (CAM5). In particular, CAM5 simulations in which topography is removed (globally and regionally) and/or the Gulf of Mexico is converted to land are compared to a CAM5 control simulation of the current climate following the Atmospheric Model Intercomparison Project (AMIP) protocols. In addition to exploring differences in the general characteristics of the large-scale environments among the experiments, HCW changes are explored through the joint occurrence of high-shear and high Convective Available Potential Energy (CAPE) environments. Preliminary work suggests that removing elevated terrain reduces the inland extent of HCW environments in the United States, but does not eliminate these environments altogether. This indicates that topography is crucial for inland HCW environments but perhaps not for their existence in general (e.g., near the Gulf of Mexico).
This initial work is a crucial first step to building a reduced-complexity framework within CAM5 to quantify how land-ocean contrast and elevated terrain control HCW environments.
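
    A minimal sketch of a joint high-shear/high-CAPE criterion of the kind used to flag HCW environments. The threshold values and the CAPE-times-shear product form are illustrative assumptions, not the values used in this study:

```python
def hcw_environment(cape, shear_0_6km, cape_min=100.0, product_min=10000.0):
    """Flag a hazardous-convective-weather environment when CAPE (J/kg)
    and 0-6 km bulk shear (m/s) jointly exceed illustrative thresholds.
    The product CAPE * S06 is a commonly used discriminator; the
    threshold values here are hypothetical."""
    return cape >= cape_min and cape * shear_0_6km >= product_min

def count_hcw_days(cape_series, shear_series):
    """Count days in paired daily time series satisfying the criterion."""
    return sum(hcw_environment(c, s) for c, s in zip(cape_series, shear_series))
```

    Applied at each grid point of a control and a "component denial" run, such a count gives a simple map of how the frequency of favorable environments changes between experiments.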

  3. The Great Plains low-level jet in 1.5C and 2C HAPPI simulations: Implications for changes in extreme climate events

    NASA Astrophysics Data System (ADS)

    Weaver, S. J.; Barcikowska, M. J.

    2017-12-01

    Global temperature targets have become the cornerstone of global climate policy discussions. Given the goal of the Paris Accord to limit the rise in global mean temperature to well below 2.0°C above pre-industrial levels, and to pursue efforts toward the more ambitious 1.5°C goal, there is increasing focus in the climate science community on the relative changes in regional climate extremes under these two scenarios. Despite the successes of major climate modeling efforts, there is still a significant information gap regarding the regional and seasonal changes in some climate extremes over the U.S. as a function of these global mean temperature targets. During the spring and summer, large amounts of heat and moisture are transported northward into the central and eastern U.S. by the Great Plains Low-Level Jet (GPLLJ), an atmospheric river that dominates the subcontinental-scale climate variability during the warm half of the year. Accordingly, the GPLLJ and its vast spatiotemporal variability are highly influential on several types of extreme climate anomalies east of the Rocky Mountains, including drought and pluvial events, tornadic activity, and the evolution of the central U.S. warming hole. Changes in the GPLLJ and its variability are probed from the perspective of several hundred climate realizations afforded by the climate model experiments of the Half a degree additional warming, Prognosis, and Projected Impacts (HAPPI) effort, a suite of multi-model ensemble AMIP simulations forced by 1.5°C and 2°C levels of global warming. The multi-model analysis focuses on the magnitude of the seasonal changes in the mean GPLLJ and on shifts in the extremes of the prominent modes of GPLLJ variability, both of which have implications for future shifts in extreme climate events over the Great Plains, Midwest, and southeastern regions of the U.S.

  4. Prediction of the Arctic Oscillation in Boreal Winter by Dynamical Seasonal Forecasting Systems

    NASA Technical Reports Server (NTRS)

    Kang, Daehyun; Lee, Myong-In; Im, Jungho; Kim, Daehyun; Kim, Hye-Mi; Kang, Hyun-Suk; Schubert, Siegfried D.; Arribas, Alberto; MacLachlan, Craig

    2013-01-01

    This study assesses the prediction skill of the boreal winter Arctic Oscillation (AO) in three state-of-the-art dynamical ensemble prediction systems (EPSs): the UKMO GloSea4, the NCEP CFSv2, and the NASA GEOS-5. Long-term reforecasts made with the EPSs are used to evaluate representations of the AO and to examine skill scores for deterministic and probabilistic forecasts of the AO index. The reforecasts reproduce the observed changes in the large-scale patterns of Northern Hemisphere surface temperature, upper-level wind, and precipitation according to the AO phase. Results demonstrate that all EPSs have better prediction skill than the persistence prediction for lead times of up to 3 months, suggesting great potential for skillful prediction of the AO and the associated climate anomalies on seasonal time scales. It is also found that the deterministic and probabilistic forecast skill of the AO in the recent period (1997-2010) is higher than that in the earlier period (1983-1996).
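
    The comparison against a persistence baseline can be sketched as follows. The anomaly-correlation metric and the simple lagged persistence forecast below are assumed illustrations; the abstract does not specify the exact skill scores used:

```python
import math

def anomaly_correlation(forecast, observed):
    """Pearson correlation between forecast and observed AO-index
    anomalies, a common deterministic skill measure."""
    n = len(forecast)
    fm = sum(forecast) / n
    om = sum(observed) / n
    cov = sum((f - fm) * (o - om) for f, o in zip(forecast, observed))
    norm = math.sqrt(sum((f - fm) ** 2 for f in forecast) *
                     sum((o - om) ** 2 for o in observed))
    return cov / norm

def persistence_forecast(index_series, lead):
    """Persistence baseline: the index at initialization is carried
    forward unchanged to the verification time `lead` steps later."""
    return index_series[:] if lead == 0 else index_series[:-lead]
```

    A dynamical system "beats persistence" at a given lead when its anomaly correlation with the verifying index exceeds that of the lagged series.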

  5. A case study of cumulus formation beneath a stratocumulus sheet: Its structure and effect on boundary layer budgets

    NASA Technical Reports Server (NTRS)

    Barlow, Roy W.; Nicholls, S.

    1990-01-01

    On several occasions during the FIRE Marine Stratocumulus IFO off the California coast, small cumulus were observed to form during the morning beneath the main stratocumulus (Sc) deck. This occurs in the type of situation described by Turton and Nicholls (1987), in which there is insufficient generation of turbulent kinetic energy (TKE) from the cloud top or the surface to sustain mixing throughout the layer, and the surface and cloud layers separate. The build-up of humidity in the surface layer allows cumuli to form, and the more energetic of these may penetrate back into the Sc deck, reconnecting the layers. The results presented were collected by the UKMO C-130 aircraft flying in a region where these small cumulus had grown to the extent that they had penetrated into the main Sc deck above. The structure of these penetrative cumulus is examined, and their implications for the layer flux and radiation budgets are discussed.

  6. Stratospheric warmings during February and March 1993

    NASA Technical Reports Server (NTRS)

    Manney, G. L.; Zurek, R. W.; O'Neill, A.; Swinbank, R.; Kumer, J. B.; Mergenthaler, J. L.; Roche, A. E.

    1994-01-01

    Two stratospheric warmings during February and March 1993 are described using United Kingdom Meteorological Office (UKMO) analyses, calculated potential vorticity (PV) and diabatic heating, and N2O observed by the Cryogenic Limb Array Etalon Spectrometer (CLAES) instrument on the Upper Atmosphere Research Satellite (UARS). The first warming affected temperatures over a larger region, while the second produced a larger region of reversed zonal winds. Tilted baroclinic zones formed in the temperature field, and the polar vortex tilted westward with height. Narrow tongues of high PV and low N2O were drawn off the polar vortex and irreversibly mixed. Tongues of material were drawn from low latitudes into the region between the polar vortex and the anticyclone; diabatic descent was also strongest in this region. Increased N2O over a broad region near the edge of the polar vortex indicates the importance of horizontal transport. N2O decreased in the vortex, consistent with enhanced diabatic descent during the warmings.

  7. Ionospheric effects of sudden stratospheric warmings in eastern Siberia region

    NASA Astrophysics Data System (ADS)

    Polyakova, A. S.; Chernigovskaya, M. A.; Perevalova, N. P.

    2014-12-01

    Ionospheric effects observed in the Asian region of Russia during sudden stratospheric warmings (SSWs) in the winters of 2008/2009 and 2012/2013, corresponding to extreme solar minimum and moderate solar maximum conditions respectively, have been examined. To detect the ionospheric effects induced by the SSWs, we carried out a joint analysis of total electron content (TEC) global ionospheric maps (GIM), MLS (Microwave Limb Sounder, EOS Aura) measurements of vertical temperature profiles, and NCEP/NCAR and UKMO reanalysis data. It has been revealed for the first time that during strong SSWs the amplitude of the diurnal variation of TEC in the mid-latitude ionosphere decreases by nearly half. In addition, the intensity of TEC deviations from the background level increases during SSWs. It has also been found that during the SSW peak the midday TEC maximum decreases considerably, and the night/morning TEC increases compared with quiet days. The pattern of the TEC response to SSWs is shown to be identical for both quiet and disturbed geophysical conditions.

  8. Intercomparison of Operational Ocean Forecasting Systems in the framework of GODAE

    NASA Astrophysics Data System (ADS)

    Hernandez, F.

    2009-04-01

    One of the main benefits of the GODAE 10-year activity is the implementation of ocean forecasting systems in several countries. In 2008, several systems were operated routinely at global or basin scale. Among them, the BLUElink (Australia), HYCOM (USA), MOVE/MRI.COM (Japan), Mercator (France), FOAM (United Kingdom), TOPAZ (Norway) and C-NOOFS (Canada) systems demonstrated their operational feasibility by performing an intercomparison exercise over a three-month period (February to April 2008). The objectives were: a) to show that operational ocean forecasting systems are operated routinely in different countries and that they can interact; b) to perform, in a common way, a scientific validation assessing the quality of the ocean estimates and the performance and forecasting capabilities of each system; and c) to learn from this intercomparison exercise in order to increase interoperability and collaboration in real time. The intercomparison relies on the assessment strategy developed for the EU MERSEA project, whose diagnostics over the global ocean have been revisited by the GODAE contributors. This approach, based on metrics, allows each system: a) to verify whether ocean estimates are consistent with current general knowledge of the dynamics; and b) to evaluate the accuracy of delivered products against space-based and in-situ observations. Using the same diagnostics also allows the results from the different systems to be intercompared consistently. Water masses and the general circulation described by the different systems are consistent with the WOA05 Levitus climatology. The large-scale dynamics (tropical, subtropical and subpolar gyres) are also correctly reproduced. At short scales, the benefit of high-resolution systems is evident in the turbulent eddy field, in particular when compared to eddy kinetic energy deduced from satellite altimetry or drifter observations.
Comparisons to high-resolution SST products show some discrepancies in the representation of the ocean surface, due either to model and forcing-field errors or to assimilation-scheme efficiency. Comparisons to sea-ice satellite products also reveal discrepancies linked to the model, forcing and assimilation strategies of each forecasting system. Key words: intercomparison, ocean analysis, operational oceanography, system assessment, metrics, validation. GODAE Intercomparison Team: L. Bertino (NERSC/Norway), G. Brassington (BMRC/Australia), E. Chassignet (FSU/USA), J. Cummings (NRL/USA), F. Davidson (DFO/Canada), M. Drévillon (CERFACS/France), P. Hacker (IPRC/USA), M. Kamachi (MRI/Japan), J.-M. Lellouche (CERFACS/France), K. A. Lisæter (NERSC/Norway), R. Mahdon (UKMO/UK), M. Martin (UKMO/UK), A. Ratsimandresy (DFO/Canada), and C. Regnier (Mercator Ocean/France)

  9. Global Effects of SuperParameterization on Hydro-Thermal Land-Atmosphere Coupling on Multiple Timescales and an Amplification of the Bowen Ratio

    NASA Astrophysics Data System (ADS)

    Qin, H.; Pritchard, M. S.; Kooperman, G. J.; Parishani, H.

    2017-12-01

    Conventional General Circulation Models (GCMs) in the Global Land-Atmosphere Coupling Experiment (GLACE) tend to produce overly strong land-atmosphere coupling (L-A coupling). We investigate the effects of cloud SuperParameterization (SP) on L-A coupling on timescales longer than the diurnal, where it has previously been shown to have a strong effect. Using the Community Atmosphere Model v3.5 (CAM3.5) and its SuperParameterized counterpart SPCAM3.5, we conducted experiments following the GLACE and Atmospheric Model Intercomparison Project (AMIP) protocols. On synoptic-to-subseasonal timescales, SP significantly mutes hydrologic L-A coupling on a global scale through the atmospheric segment. On longer seasonal timescales, however, SP has no detectable effect on hydrologic L-A coupling. Two regional effects of SP on thermal L-A coupling are also discovered and explored. Over the Arabian Peninsula, SP strikingly reduces thermal L-A coupling, controlled by a reduction in mean regional rainfall. Over the Southwestern US and Northern Mexico, however, SP remarkably enhances thermal L-A coupling, independently of rainfall or soil moisture. We argue that the cause may be a previously unrecognized tendency of SP to amplify the simulated Bowen ratio. Not only does this help reconcile a puzzling local enhancement of thermal L-A coupling over the Southwestern US, but it is also demonstrated to be a robust, global effect of SP over land that is independent of model version and experiment design, and that has important consequences for climate change prediction.
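
    The Bowen ratio invoked above is simply the ratio of surface sensible to latent heat flux; a minimal sketch follows, with hypothetical flux values in the test of the amplification diagnostic:

```python
def bowen_ratio(sensible, latent):
    """Bowen ratio B = H / LE: surface sensible heat flux divided by
    latent heat flux (both W m-2). B > 1 indicates a surface energy
    budget dominated by sensible heating."""
    if latent == 0:
        raise ValueError("latent heat flux must be nonzero")
    return sensible / latent

def bowen_amplification(ctrl_fluxes, sp_fluxes):
    """Ratio of Bowen ratios between a superparameterized run and its
    conventional control, given (H, LE) tuples; a value > 1 means SP
    shifts the surface energy partition toward sensible heating."""
    return bowen_ratio(*sp_fluxes) / bowen_ratio(*ctrl_fluxes)
```

    Computed pointwise over land in paired simulations, an amplification map above 1 would express the effect described in the abstract.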

  10. Evaluation of the NASA GISS AR5 SCM/GCM at the ARM SGP Site using Self Organizing Maps

    NASA Astrophysics Data System (ADS)

    Kennedy, A. D.; Dong, X.; Xi, B.; Del Genio, A. D.; Wolf, A.

    2011-12-01

    Understanding and improving clouds in climate models requires moving beyond comparisons of annual and seasonal means. Errors can offset one another, so that models get the right long-term solution for the wrong reasons. For example, cloud parameterization errors may be balanced by the model incorrectly simulating the frequency distribution of atmospheric states. To faithfully evaluate climate models it is necessary to partition results into specific regimes. In the past this has been done by evaluating models on their ability to produce cloud regimes as determined by observational products from satellites. An alternative approach is to first classify meteorological regimes (i.e., synoptic pattern and forcing) and then determine what types of clouds occur in each class. In this study, a competitive neural network known as the Self Organizing Map (SOM) is first used to classify synoptic patterns from a reanalysis over the Southern Great Plains (SGP) region during the period 1999-2008. These results are then used to evaluate simulated clouds from the AR5 version of the NASA GISS Model E Single Column Model (SCM). Unlike past studies that narrowed classes into several categories, this study assumes that the atmosphere is capable of producing an infinite number of states. As a result, SOMs were generated with a large number of classes for specific months in which model errors were found. With nearly ten years of forcing data, an adequate number of samples is available to determine how cloud fraction varies across the SOM and to distinguish cloud errors. Barring major forcing errors, SCM studies can be thought of as what the GCM would simulate if the dynamics were perfect. Accordingly, simulated and observed cloud fractions (CFs) frequently occur for the same atmospheric states. For example, physically deep clouds during the winter months occur for a small number of classes in the SOM. Although the model produces clouds during the correct states, CFs are consistently too low.
Instead, the model has a positive bias of thinner clouds in these classes, which are associated with low-pressure systems and fronts. To determine whether this and other SCM errors are present in the GCM, the Atmospheric Model Intercomparison Project (AMIP) run of the NASA GISS GCM will also be investigated. The SOM will be used to classify atmospheric states within the GCM to determine how well the GCM captures the PDF of observed atmospheric states. Together, these comparisons will allow a thorough evaluation of the model at the ARM SGP site.
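
    A self-organizing map of the kind described can be sketched in pure Python. This toy 1-D SOM, its Gaussian neighborhood, and the training parameters are illustrative assumptions, not the study's configuration:

```python
import math
import random

def classify(x, nodes):
    """Index of the node (synoptic class) nearest to pattern x."""
    return min(range(len(nodes)),
               key=lambda i: sum((xd - wd) ** 2 for xd, wd in zip(x, nodes[i])))

def train_som(data, n_nodes=4, epochs=50, lr0=0.5, sigma0=2.0, seed=0):
    """Train a toy 1-D self-organizing map on a list of input vectors.
    Each node's weight vector is pulled toward every sample, weighted by
    a Gaussian neighborhood around the best-matching unit (BMU)."""
    rng = random.Random(seed)
    dim = len(data[0])
    nodes = [[rng.uniform(-1.0, 1.0) for _ in range(dim)] for _ in range(n_nodes)]
    for t in range(epochs):
        lr = lr0 * (1.0 - t / epochs)                   # learning rate decays
        sigma = max(sigma0 * (1.0 - t / epochs), 0.5)   # neighborhood shrinks
        for x in data:
            bmu = classify(x, nodes)
            for i, node in enumerate(nodes):
                h = math.exp(-((i - bmu) ** 2) / (2.0 * sigma ** 2))
                for d in range(dim):
                    node[d] += lr * h * (x[d] - node[d])
    return nodes
```

    After training on, say, gridded reanalysis anomaly patterns, `classify` assigns each day to a node, and cloud statistics can then be composited per class.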

  11. Representation of solar tides in the stratosphere and lower mesosphere in state-of-the-art reanalyses and in satellite observations

    NASA Astrophysics Data System (ADS)

    Sakazaki, Takatoshi; Fujiwara, Masatomo; Shiotani, Masato

    2018-02-01

    Atmospheric solar tides in the stratosphere and the lower mesosphere are investigated using temperature data from five state-of-the-art reanalysis data sets (MERRA-2, MERRA, JRA-55, ERA-Interim, and CFSR) as well as TIMED SABER and Aura MLS satellite measurements. The main focus is on the period 2006-2012 during which the satellite observations are available for direct comparison with the reanalyses. Diurnal migrating tides, semidiurnal migrating tides, and nonmigrating tides are diagnosed. Overall the reanalyses agree reasonably well with each other and with the satellite observations for both migrating and nonmigrating components, including their vertical structure and the seasonality. However, the agreement among reanalyses is more pronounced in the lower stratosphere and relatively weaker in the upper stratosphere and mesosphere. A systematic difference between SABER and the reanalyses is found for diurnal migrating tides in the upper stratosphere and the lower mesosphere; specifically, the amplitude of trapped modes in reanalyses is significantly smaller than that in SABER, although such difference is less clear between MLS and the reanalyses. The interannual variability and the possibility of long-term changes in migrating tides are also examined using the reanalyses during 1980-2012. All the reanalyses agree in exhibiting a clear quasi-biennial oscillation (QBO) in the tides, but the most significant indications of long-term changes in the tides represented in the reanalyses are most plausibly explained by the evolution of the satellite observing systems during this period. The tides are also compared in the full reanalyses produced by the Japan Meteorological Agency (i.e., JRA-55) and in two parallel data sets from this agency: one (JRA-55C) that repeats the reanalysis procedure but without any satellite data assimilated and one (JRA-55AMIP) that is a free-running integration of the model constrained only by observed sea surface temperatures. 
Many aspects of the tides are closer between JRA-55C and JRA-55AMIP than either is to the full reanalysis JRA-55, demonstrating the importance of the assimilation of satellite data in representing the diurnal variability of the middle atmosphere. In contrast to the assimilated data sets, the free-running model has no QBO in the equatorial stratospheric mean circulation, and our results show that it displays no quasi-biennial variability in the tides.

  12. The Stochastic Multicloud Model as part of an operational convection parameterisation in a comprehensive GCM

    NASA Astrophysics Data System (ADS)

    Peters, Karsten; Jakob, Christian; Möbis, Benjamin

    2015-04-01

    An adequate representation of convective processes in numerical models of the atmospheric circulation (general circulation models, GCMs) remains one of the grand challenges in atmospheric science. In particular, models struggle to correctly represent the spatial distribution and high variability of tropical convection. It is thought that this deficiency partly results from formulating current convection parameterisation schemes in a purely deterministic manner. Here, we use observations of tropical convection to inform the design of a novel convection parameterisation with stochastic elements. The scheme is built around the Stochastic MultiCloud Model (SMCM, Khouider et al. 2010). We present the progress made in utilising SMCM-based estimates of updraft area fractions at cloud base as part of the deep convection scheme of a GCM. The updraft area fractions yield one part of the cloud-base mass flux used in the closure assumption of convective mass-flux schemes. The closure thus receives a stochastic component, potentially improving modelled convective variability and coherence. For initial investigations, we apply this methodology to the operational convective parameterisation of the ECHAM6 GCM. We perform 5-year AMIP simulations, i.e., with prescribed observed SSTs. We find that with the SMCM, convection is weaker and more coherent and continuous from timestep to timestep than in the standard model. Total global precipitation is reduced in the SMCM run, but this reduces (i) the overall error relative to observed global precipitation (GPCP) and (ii) tropical mid-tropospheric temperature biases relative to ERA-Interim. Hovmöller diagrams indicate a slightly higher degree of convective organisation than in the base case, and Wheeler-Kiladis wavenumber-frequency diagrams indicate slightly more spectral power in the MJO range.
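
    The role of the updraft area fraction in the closure can be illustrated with the standard bulk form M_b = rho * sigma * w. This is a hedged sketch: per the abstract, the SMCM-derived area fraction supplies only one part of the cloud-base mass flux, and the full ECHAM6 closure is not reproduced here:

```python
def cloud_base_mass_flux(rho, area_fraction, w_up):
    """Bulk cloud-base mass flux M_b = rho * sigma * w (kg m-2 s-1),
    from air density rho (kg m-3), the stochastic updraft area fraction
    sigma supplied by the SMCM (dimensionless), and an updraft velocity
    w (m s-1). Illustrative bulk form only, not the scheme's full
    closure."""
    return rho * area_fraction * w_up
```

    Because sigma is drawn from the SMCM's stochastic cloud-population dynamics rather than fixed deterministically, the resulting mass flux, and hence the triggered convection, varies between timesteps even for identical large-scale states.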

  13. Equatorial waves in some CMIP5 coupled models (with stratosphere)

    NASA Astrophysics Data System (ADS)

    Maury, Pauline; Lott, François; Guez, Lionel

    2013-04-01

    The Kelvin wave and Rossby-gravity wave (RGW) packets that dominate the day-to-day variability of the lower equatorial stratosphere (50 hPa) are analyzed in 7 ESMs that participate in CMIP5 and include a well-resolved stratosphere. The results are compared to ERA-Interim. Two pairs of models are also used to better quantify (i) the impact of the QBO on these waves (MPI-P and MPI-MR) and (ii) the impact of convection (IPSL-CM5A and CM5B). In the stratosphere, all models present quite coherent Kelvin wave and RGW packets, which is encouraging since these waves dominate the day-to-day variability of the lower stratosphere. The errors in these freely propagating waves therefore seem less pronounced than the differences found by others for the convectively coupled waves in the troposphere. The differences between the models nevertheless remain very large: the models with a QBO have more pronounced waves and represent their life cycle better (this is particularly true for the RGWs). The sensitivity of the rather slow waves analysed here to convection is not as pronounced as was found in the past for possibly faster waves, but it is nevertheless confirmed when the same model is examined with two drastically different convection parameterizations. In the same spirit, the sensitivity of the RGWs to the QBO is confirmed by comparing nearly identical model runs, one with a QBO and one without. Having a QBO nevertheless does not guarantee that the waves are realistic in all respects, as shown for instance by the temperature signature of the RGWs in the UKMO model. There also seems to be an issue when the resolution changes drastically, the MRI model behaving quite differently from the other models in its simulation of these equatorial waves.

  14. A new general circulation model of Jupiter's atmosphere based on the UKMO Unified Model: Three-dimensional evolution of isolated vortices and zonal jets in mid-latitudes

    NASA Astrophysics Data System (ADS)

    Yamazaki, Y. H.; Skeet, D. R.; Read, P. L.

    2004-04-01

    We have been developing a new three-dimensional general circulation model for the stratosphere and troposphere of Jupiter based on the dynamical core of a portable version of the Unified Model of the UK Meteorological Office. Being one of the leading terrestrial GCMs, employed for operational weather forecasting and climate research, the Unified Model has been thoroughly tested and performance-tuned for both vector and parallel computers. It is formulated as a generalized form of the standard primitive equations to handle a thick atmosphere, using a scaled pressure as the vertical coordinate. It can accurately simulate the dynamics of a three-dimensional fully compressible atmosphere on the whole or part of a spherical shell at high spatial resolution in all three directions. Using the current version of the GCM, we examine the characteristics of the Jovian winds in idealized configurations based on the observed vertical structure of temperature. Our initial focus is on the evolution of isolated eddies in the mid-latitudes. Following a brief theoretical investigation of the vertical structure of the atmosphere, limited-area cyclic channel domains are used to investigate numerically the nonlinear evolution of the mid-latitude winds. First, the evolution of deep and shallow cyclones and anticyclones is tested in an atmosphere at rest to identify a preferred horizontal and vertical structure of the vortices. Then, the dependence of the migration characteristics of the vortices on the modelling parameters is investigated, showing that migration is most sensitive to the horizontal diffusion. We also examine the hydrodynamical stability of the observed subtropical jets in both the northern and southern hemispheres in the three-dimensional nonlinear model as initial value problems.
In both cases, it was found that the prominent jets are unstable at various scales and that vortices of various sizes are generated, including some comparable to the White Ovals and the Great Red Spot.

  15. Isolating the Roles of Different Forcing Agents in Global Stratospheric Temperature Changes Using Model Integrations with Incrementally Added Single Forcings

    NASA Technical Reports Server (NTRS)

    Aquila, V.; Swartz, W. H.; Waugh, D. W.; Colarco, P. R.; Pawson, S.; Polvani, L. M.; Stolarski, R. S.

    2016-01-01

    Satellite instruments show a cooling of global stratospheric temperatures over the whole data record (1979-2014). This cooling is not linear and includes two descending steps in the early 1980s and mid-1990s. The 1979-1995 period is characterized by increasing concentrations of ozone depleting substances (ODS) and by the two major volcanic eruptions of El Chichon (1982) and Mount Pinatubo (1991). The 1995-present period is characterized by decreasing ODS concentrations and by the absence of major volcanic eruptions. Greenhouse gas (GHG) concentrations increase over the whole time period. In order to isolate the roles of different forcing agents in the global stratospheric temperature changes, we performed a set of AMIP-style simulations using the NASA Goddard Earth Observing System Chemistry-Climate Model (GEOSCCM). We find that in our model simulations the cooling of the stratosphere from 1979 to present is mostly driven by changes in GHG concentrations in the middle and upper stratosphere and by GHG and ODS changes in the lower stratosphere. While the cooling trend caused by increasing GHGs is roughly constant over the satellite era, changing ODS concentrations cause a significant stratospheric cooling only up to the mid-1990s, when they start to decrease because of the implementation of the Montreal Protocol. Sporadic volcanic events and the solar cycle have a distinct signature in the time series of stratospheric temperature anomalies but do not play a statistically significant role in the long-term trends from 1979 to 2014. Several factors combine to produce the step-like behavior in the stratospheric temperatures: in the lower stratosphere, the flattening starting in the mid-1990s is due to the decrease in ozone-depleting substances; Mount Pinatubo and the solar cycle cause the abrupt steps through the aerosol-associated warming and the volcanically induced ozone depletion. 
In the middle and upper stratosphere, changes in solar irradiance are largely responsible for the step-like behavior of global temperature anomalies, together with volcanically induced ozone depletion and water vapor increases in the post-Pinatubo years.
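The statements above about statistically significant long-term trends rest on standard trend testing. A minimal sketch (an illustration, not the authors' exact method) of a least-squares trend on a temperature-anomaly time series with a t-statistic for the slope:

```python
import numpy as np

def linear_trend(t, y):
    """Least-squares trend of y(t); returns the slope and the t-statistic
    used for a two-sided significance test on the slope."""
    t = np.asarray(t, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(t)
    slope, intercept = np.polyfit(t, y, 1)
    resid = y - (slope * t + intercept)
    # Standard error of the slope under i.i.d. residuals (an assumption;
    # real temperature series need autocorrelation corrections).
    se = np.sqrt((resid @ resid) / (n - 2) / np.sum((t - t.mean()) ** 2))
    return slope, slope / se
```

Applied to, say, annual-mean stratospheric temperature anomalies, a large negative t-statistic indicates a robust cooling trend, while volcanic or solar signatures mostly inflate the residuals rather than the slope.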

  16. Isolating the roles of different forcing agents in global stratospheric temperature changes using model integrations with incrementally added single forcings.

    PubMed

    Aquila, V; Swartz, W H; Waugh, D W; Colarco, P R; Pawson, S; Polvani, L M; Stolarski, R S

    2016-07-16

Satellite instruments show a cooling of global stratospheric temperatures over the whole data record (1979-2014). This cooling is not linear, and includes two descending steps in the early 1980s and mid-1990s. The 1979-1995 period is characterized by increasing concentrations of ozone depleting substances (ODS) and by the two major volcanic eruptions of El Chichón (1982) and Mount Pinatubo (1991). The 1995-present period is characterized by decreasing ODS concentrations and by the absence of major volcanic eruptions. Greenhouse gas (GHG) concentrations increase over the whole time period. In order to isolate the roles of different forcing agents in the global stratospheric temperature changes, we performed a set of AMIP-style simulations using the NASA Goddard Earth Observing System Chemistry-Climate Model (GEOSCCM). We find that in our model simulations the cooling of the stratosphere from 1979 to present is mostly driven by changes in GHG concentrations in the middle and upper stratosphere and by GHG and ODS changes in the lower stratosphere. While the cooling trend caused by increasing GHGs is roughly constant over the satellite era, changing ODS concentrations cause a significant stratospheric cooling only up to the mid-1990s, when they start to decrease because of the implementation of the Montreal Protocol. Sporadic volcanic events and the solar cycle have a distinct signature in the time series of stratospheric temperature anomalies but do not play a statistically significant role in the long-term trends from 1979 to 2014. Several factors combine to produce the step-like behavior in the stratospheric temperatures: in the lower stratosphere, the flattening starting in the mid-1990s is due to the decrease in ozone depleting substances; Mount Pinatubo and the solar cycle cause the abrupt steps through the aerosol-associated warming and the volcanically induced ozone depletion.
In the middle and upper stratosphere, changes in solar irradiance are largely responsible for the step-like behavior of global temperature anomalies, together with volcanically induced ozone depletion and water vapor increases in the post-Pinatubo years.

  17. Isolating the roles of different forcing agents in global stratospheric temperature changes using model integrations with incrementally added single forcings

    PubMed Central

    Aquila, V.; Swartz, W. H.; Waugh, D. W.; Colarco, P. R.; Pawson, S.; Polvani, L. M.; Stolarski, R. S.

    2018-01-01

Satellite instruments show a cooling of global stratospheric temperatures over the whole data record (1979–2014). This cooling is not linear, and includes two descending steps in the early 1980s and mid-1990s. The 1979–1995 period is characterized by increasing concentrations of ozone depleting substances (ODS) and by the two major volcanic eruptions of El Chichón (1982) and Mount Pinatubo (1991). The 1995-present period is characterized by decreasing ODS concentrations and by the absence of major volcanic eruptions. Greenhouse gas (GHG) concentrations increase over the whole time period. In order to isolate the roles of different forcing agents in the global stratospheric temperature changes, we performed a set of AMIP-style simulations using the NASA Goddard Earth Observing System Chemistry-Climate Model (GEOSCCM). We find that in our model simulations the cooling of the stratosphere from 1979 to present is mostly driven by changes in GHG concentrations in the middle and upper stratosphere and by GHG and ODS changes in the lower stratosphere. While the cooling trend caused by increasing GHGs is roughly constant over the satellite era, changing ODS concentrations cause a significant stratospheric cooling only up to the mid-1990s, when they start to decrease because of the implementation of the Montreal Protocol. Sporadic volcanic events and the solar cycle have a distinct signature in the time series of stratospheric temperature anomalies but do not play a statistically significant role in the long-term trends from 1979 to 2014. Several factors combine to produce the step-like behavior in the stratospheric temperatures: in the lower stratosphere, the flattening starting in the mid-1990s is due to the decrease in ozone depleting substances; Mount Pinatubo and the solar cycle cause the abrupt steps through the aerosol-associated warming and the volcanically induced ozone depletion.
In the middle and upper stratosphere, changes in solar irradiance are largely responsible for the step-like behavior of global temperature anomalies, together with volcanically induced ozone depletion and water vapor increases in the post-Pinatubo years. PMID:29593948

  18. Projected rainfall and temperature changes over Malaysia at the end of the 21st century based on PRECIS modelling system

    NASA Astrophysics Data System (ADS)

    Loh, Jui Le; Tangang, Fredolin; Juneng, Liew; Hein, David; Lee, Dong-In

    2016-05-01

This study investigates projected changes in rainfall and temperature over Malaysia by the end of the 21st century based on the Intergovernmental Panel on Climate Change (IPCC) Special Report on Emission Scenarios (SRES) A2, A1B and B2 emission scenarios using the Providing Regional Climates for Impacts Studies (PRECIS) modelling system. The PRECIS regional climate model (HadRM3P) is configured at 0.22° × 0.22° horizontal grid resolution and is forced at the lateral boundaries by the UKMO-HadAM3P and UKMO-HadCM3Q0 global models. The model performance in simulating the present-day climate was assessed by comparing the model-simulated results to the Asian Precipitation - Highly-Resolved Observational Data Integration Towards Evaluation (APHRODITE) dataset. Generally, the HadAM3P/PRECIS and HadCM3Q0/PRECIS simulated the spatio-temporal variability structure of both temperature and rainfall reasonably well, albeit with the presence of cold biases. The cold biases appear to be associated with the systematic error in the HadRM3P. The future projection of temperature indicates widespread warming over the entire country by the end of the 21st century. The projected temperature increment ranges from 2.5 to 3.9°C, 2.7 to 4.2°C and 1.7 to 3.1°C for the A2, A1B and B2 scenarios, respectively. However, the projection of rainfall at the end of the 21st century indicates substantial spatio-temporal variation, with a tendency for drier conditions in the boreal winter and spring seasons and wetter conditions in the summer and fall seasons. During the months of December to May, a ~20-40% decrease in rainfall is projected over Peninsular Malaysia and Borneo, particularly for the A2 and B2 emission scenarios. During the summer months, rainfall is projected to increase by ~20-40% across most regions in Malaysia, especially for the A2 and A1B scenarios.
The spatio-temporal variations in the projected rainfall can be related to the weakening of the monsoon circulations, which in turn alters the patterns of regional moisture convergence in the region.

  19. Investigation of Tropical Transport with UARS Data

    NASA Technical Reports Server (NTRS)

    Dunkerton, Timothy J.

    1999-01-01

    Measurements of trace constituents obtained by instruments aboard the Upper Atmosphere Research Satellite (UARS) have been used to study transport processes associated with the quasi-biennial oscillation, laterally propagating Rossby waves, and upward propagating Kelvin waves in the tropical and subtropical upper troposphere and stratosphere. Mean vertical motions, vertical diffusivities and in-mixing rates were inferred from observations of the 'tape recorder' signal in near-equatorial stratospheric water vapor. The effect of the quasi-biennial oscillation (QBO) on tracer distributions in the upper half of the stratosphere was seen in a spectacular 'staircase' pattern, predominantly in the winter hemisphere, revealing the latitudinally asymmetric nature of QBO transport due to induced mean meridional circulations and modulation of lateral mixing associated with planetary Rossby waves. The propagation of Rossby waves across the equator in the westerly phase of the QBO was seen in tracer fields and corroborating United Kingdom Meteorological Office (UKMO) analyses; a modeling study of the effect of these waves on typical QBO wind profiles was performed. Water vapor in the upper troposphere and lower stratosphere was found to exhibit signatures of the tropical intraseasonal oscillation (TIO) and faster Kelvin waves in the two regions, respectively.

  20. Analyses of the stratospheric dynamics simulated by a GCM with a stochastic nonorographic gravity wave parameterization

    NASA Astrophysics Data System (ADS)

    Serva, Federico; Cagnazzo, Chiara; Riccio, Angelo

    2016-04-01

The effects of the propagation and breaking of atmospheric gravity waves have long been considered crucial for their impact on the circulation, especially in the stratosphere and mesosphere, between heights of 10 and 110 km. These waves, which in the Earth's atmosphere originate from surface orography (OGWs) or from transient (nonorographic) phenomena such as fronts and convective processes (NOGWs), have horizontal wavelengths between 10 and 1000 km, vertical wavelengths of several km, and periods ranging from minutes to hours. Orographic and nonorographic GWs must be accounted for in climate models to obtain a realistic simulation of the stratosphere in both hemispheres, since they can have a substantial impact on circulation and temperature, and hence an important role in ozone chemistry for chemistry-climate models. Several types of parameterization are currently employed in models, differing in their formulation and in the values assigned to parameters, but the common aim is to quantify the effect of wave breaking on large-scale wind and temperature patterns. In the last decade, both global observations from satellite-borne instruments and the output of very high resolution climate models have provided insight into the variability and properties of the gravity wave field, and these results can be used to constrain some of the empirical parameters present in most parameterization schemes. A feature of the NOGW forcing that clearly emerges is its intermittency, linked with the nature of the sources: this property is absent in the majority of models, in which NOGW parameterizations are uncoupled from other atmospheric phenomena, leading to results which display lower variability compared to observations. In this work, we analyze the climate simulated in AMIP runs of the MAECHAM5 model, which uses the Hines NOGW parameterization and has a vertical resolution fine enough to capture the effects of wave-mean flow interaction.
We compare the results obtained with two versions of the model, the default one and a new stochastic version in which the value of the perturbation field at the launching level is not constant and uniform, but is drawn at each time step and grid point from a given PDF. With this approach we aim to add further variability to the effects given by the deterministic NOGW parameterization: the impact on the simulated climate will be assessed focusing on the Quasi-Biennial Oscillation of the equatorial stratosphere (known to be driven partly by gravity waves) and on the variability of the mid-to-high-latitude atmosphere. The different characteristics of the circulation will be compared with recent reanalysis products in order to determine the advantages of the stochastic approach over the traditional deterministic scheme.
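The deterministic-versus-stochastic launch perturbation described above can be sketched as follows. The gamma distribution and its parameters are illustrative assumptions chosen only to keep amplitudes positive with a prescribed mean; they are not the PDF or values used in MAECHAM5:

```python
import numpy as np

rng = np.random.default_rng(42)

def launch_amplitude_deterministic(shape, rms=1.0):
    """Deterministic scheme: the same launch-level wind perturbation
    (RMS amplitude) at every grid point and time step."""
    return np.full(shape, rms)

def launch_amplitude_stochastic(shape, rms=1.0, rng=rng):
    """Stochastic variant: draw the perturbation independently at each
    grid point and time step from a PDF whose mean matches the
    deterministic value (gamma keeps amplitudes positive)."""
    k = 2.0  # illustrative shape parameter, an assumption
    return rng.gamma(k, rms / k, size=shape)
```

Averaged over many time steps the two schemes force the mean flow similarly, but the stochastic draw adds the intermittency that the abstract notes is missing from purely deterministic NOGW schemes.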

  1. Impact of convective activity on precipitation δ18O in isotope-enabled models

    NASA Astrophysics Data System (ADS)

    Hu, J.; Emile-Geay, J.; Dee, S.

    2017-12-01

The δ18O signal preserved in paleo-archives (e.g. speleothems, tree ring cellulose, ice cores) is widely used to reconstruct precipitation or temperature. In the tropics, the inverse relationship between precipitation δ18O and rainfall amount, namely the "amount effect" [Dansgaard, Tellus, 1964], is often used to interpret precipitation δ18O. However, recent studies have shown that precipitation δ18O is also influenced by precipitation type [Kurita et al, JGR, 2009; Moerman et al, EPSL, 2013], and recent observations indicate that it is negatively correlated with the fraction of precipitation associated with stratiform clouds [Aggarwal et al, Nature Geosci, 2016]. It is thus important to determine to what extent isotope-enabled climate models can reproduce these relationships. Here we do so using output from LMDZ, CAM2, and isoGSM from the Stable Water Isotope Intercomparison Group, Phase 2 (SWING2) project and results of SPEEDY-IER [Dee et al, JGR, 2015] from an AMIP-style experiment. The results show that these models simulate the "amount effect" well in the tropics, and that the relationship between precipitation δ18O and precipitation is reversed in many places in the mid-latitudes, in accordance with observations [Bowen, JGR, 2008]. Also, these models can all reproduce the negative correlation between monthly precipitation δ18O and the stratiform precipitation proportion in the mid-latitudes (30°N-50°N; 50°S-30°S), but in the tropics (30°S-30°N) the models show a positive correlation instead. The reason for this bias will be investigated in idealized experiments with SPEEDY-IER. Correctly simulating the impact of convective activity on precipitation δ18O in isotope-enabled models will improve our interpretation of paleoclimate proxies with respect to hydroclimate variability. P. K. Aggarwal et al. (2016), Nature Geosci., 9, 624-629, doi:10.1038/ngeo2739. G. J. Bowen (2008), J. Geophys. Res., 113, D05113, doi:10.1029/2007JD009295. W. Dansgaard (1964), Tellus, 16(4), 436-468. S. Dee et al. (2015), J. Geophys. Res. Atmos., 120, 73-91, doi:10.1002/2014JD022194. N. Kurita (2013), J. Geophys. Res. Atmos., 118, 10,376-10,390, doi:10.1002/jgrd.50754. J. W. Moerman et al. (2013), Earth Planet. Sci. Lett., 369, 108-119.

  2. A Multi-Model Analysis of the Cloud Phase Transition in 16 GCMs Using Satellite Observations (CALIPSO/GPCP) and Reanalysis Data (ECMWF/MERRA).

    NASA Astrophysics Data System (ADS)

    Cesana, G.; Waliser, D. E.; Jiang, X.; Li, J. L. F.

    2014-12-01

The ubiquitous presence of clouds within the troposphere helps modulate the radiative balance of the earth-atmosphere system. Depending on their phase, clouds may have different microphysical and macrophysical properties and, hence, different radiative effects. In this study, we took advantage of climate runs from the GASS-YoTC and AMIP multi-model experiments to document the differences associated with the cloud phase parameterizations of 16 GCMs. Particular emphasis has been placed on the vertical structure of the transition between liquid and ice in clouds. A way to intercompare the models regardless of their cloud fraction is to study the ratio of the ice mass to the total mass of condensed water. To address the challenge of evaluating the modeled cloud phase, we used the cloud phase climatology known as CALIPSO-GOCCP, which separates liquid clouds from ice clouds at global scale, with a high vertical resolution (480 m), above all surfaces. We also used reanalysis data and GPCP satellite observations to investigate the influence of temperature, relative humidity, vertical wind speed and precipitation on the cloud phase transition. In 12 (of 16) models, there is too little supercooled liquid in clouds compared to observations, mostly in the high troposphere. We present evidence of the link between the cloud phase transition and humidity, vertical wind speed and precipitation. Some cloud phase schemes are more affected by humidity and vertical velocity, and others by precipitation. Although a few models can reproduce the observed relation between cloud phase and temperature, humidity, vertical velocity or precipitation, none of them performs well for all the parameters. An important result of this study is that the T-dependent phase parameterizations cannot simulate the complexity of the observed cloud phase transition.
Unfortunately, more complex microphysics schemes do not succeed in reproducing all the processes either. Finally, thanks to the combined use of CALIPSO-GOCCP and ECMWF water vapor pressure, we present an updated version of the Clausius-Clapeyron water vapor phase diagram. This diagram represents a new tool to improve the simulation of the cloud phase transition in climate models.
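The Clausius-Clapeyron phase diagram mentioned above builds on saturation vapor pressure curves over liquid water and over ice. A minimal sketch using Magnus-type approximation formulas (the coefficients below are commonly cited WMO-style values, an assumption here, not the paper's data):

```python
import math

def es_water(t_c):
    """Saturation vapor pressure (hPa) over liquid water at t_c (deg C),
    Magnus-type approximation."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def es_ice(t_c):
    """Saturation vapor pressure (hPa) over ice at t_c (deg C),
    Magnus-type approximation."""
    return 6.112 * math.exp(22.46 * t_c / (272.62 + t_c))

# Below 0 deg C, es_ice < es_water: in mixed-phase clouds, vapor deposits
# onto ice crystals at the expense of evaporating supercooled droplets
# (the Wegener-Bergeron-Findeisen process), which shapes the liquid-to-ice
# transition the abstract evaluates.
```

Plotting observed cloud-phase occurrence against these two curves is one way to build the kind of updated phase diagram the authors describe.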

  3. Predicting and attributing recent East African Spring droughts with dynamical-statistical climate model ensembles

    NASA Astrophysics Data System (ADS)

    Funk, C. C.; Shukla, S.; Hoerling, M. P.; Robertson, F. R.; Hoell, A.; Liebmann, B.

    2013-12-01

During boreal spring, eastern portions of Kenya and Somalia have experienced more frequent droughts since 1999. Given the region's high levels of food insecurity, better predictions of these droughts could provide substantial humanitarian benefits. We show that dynamical-statistical seasonal climate forecasts, based on the latest generation of coupled atmosphere-ocean and uncoupled atmospheric models, effectively predict boreal spring rainfall in this area. Skill sources are assessed by comparing ensembles driven with full-ocean forcing to ensembles driven with ENSO-only sea surface temperatures (SSTs). Our analysis suggests that both ENSO and non-ENSO Indo-Pacific SST forcing have played an important role in the increase in drought frequencies. Over the past 30 years, La Niña drought teleconnections have strengthened, while non-ENSO Indo-Pacific convection patterns have also supported increased (decreased) Western Pacific (East African) rainfall. To further examine the relative contributions of ENSO, low-frequency warming and the Pacific Decadal Oscillation, we present decompositions of ECHAM5, GFS, CAM4 and GMAO AMIP simulations. These decompositions suggest that rapid warming in the western Pacific and steeper western-to-central Pacific SST gradients have likely played an important role in the recent intensification of the Walker circulation, and the associated increase in East African aridity. A linear combination of time series describing the Pacific Decadal Oscillation and the strength of Indo-Pacific warming is shown to track East African rainfall reasonably well. The talk concludes with a few thoughts on the potentially important interplay of attribution and prediction. At least for recent East African droughts, it appears that a characteristic Indo-Pacific SST and precipitation anomaly pattern can be linked statistically to support forecasts and attribution analyses.
The combination of traditional AGCM attribution analyses with simple yet physically plausible statistical estimation procedures may help us better untangle some climate mysteries.

  4. Multimodel Evidence for an Atmospheric Circulation Response to Arctic Sea Ice Loss in the CMIP5 Future Projections

    NASA Astrophysics Data System (ADS)

    Zappa, G.; Pithan, F.; Shepherd, T. G.

    2018-01-01

    Previous single-model experiments have found that Arctic sea ice loss can influence the atmospheric circulation. To evaluate this process in a multimodel ensemble, a novel methodology is here presented and applied to infer the influence of Arctic sea ice loss in the CMIP5 future projections. Sea ice influence is estimated by comparing the circulation response in the RCP8.5 scenario against the circulation response to sea surface warming and CO2 increase inferred from the AMIPFuture and AMIP4xCO2 experiments, where sea ice is unperturbed. Multimodel evidence of the impact of sea ice loss on midlatitude atmospheric circulation is identified in late winter (January-March), when the sea ice-related surface heat flux perturbation is largest. Sea ice loss acts to suppress the projected poleward shift of the North Atlantic jet, to increase surface pressure in northern Siberia, and to lower it in North America. These features are consistent with previous single-model studies, and the present results indicate that they are robust to model formulation.

  5. Forecasting the magnitude and onset of El Niño based on climate network

    NASA Astrophysics Data System (ADS)

    Meng, Jun; Fan, Jingfang; Ashkenazy, Yosef; Bunde, Armin; Havlin, Shlomo

    2018-04-01

El Niño is probably the most influential climate phenomenon on inter-annual time scales. It affects the global climate system and is associated with natural disasters; it has serious consequences in many aspects of human life. However, forecasts of the onset and, in particular, the magnitude of El Niño are still not accurate more than about half a year ahead. Here, we introduce a new forecasting index based on climate network links representing the similarity of low frequency temporal temperature anomaly variations between different sites in the Niño 3.4 region. We find that significant upward trends in our index forecast the onset of El Niño approximately 1 year ahead, and the highest peak since the end of the last El Niño in our index forecasts the magnitude of the following event. We study the forecasting capability of the proposed index on several datasets, including ERA-Interim, NCEP Reanalysis I, PCMDI-AMIP 1.1.3 and ERSST.v5.
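A climate-network link of the kind described above can be sketched as a lagged cross-correlation between deseasonalized temperature series at pairs of grid points, with the node-pair average serving as a network index. This is an illustrative reconstruction under simplified assumptions, not the authors' exact index:

```python
import numpy as np

def anomalies(series, period=12):
    """Remove the mean seasonal cycle (period samples) to get anomalies."""
    series = np.asarray(series, dtype=float)
    n = len(series)
    clim = np.array([series[p::period].mean() for p in range(period)])
    return series - np.tile(clim, n // period + 1)[:n]

def link_strength(a, b, max_lag=12):
    """Maximum absolute lagged cross-correlation between two anomaly series."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    n = len(a)
    corrs = [np.corrcoef(a[lag:], b[:n - lag])[0, 1]
             for lag in range(max_lag + 1)]
    return float(np.max(np.abs(corrs)))

def network_index(series_list, max_lag=12):
    """Mean link strength over all node pairs (here: sites in a region)."""
    m = len(series_list)
    pairs = [link_strength(series_list[i], series_list[j], max_lag)
             for i in range(m) for j in range(i + 1, m)]
    return float(np.mean(pairs))
```

Sites sharing a common low-frequency signal yield strong links and a high index, while uncorrelated sites yield weak links; an upward trend in such an index is the kind of precursor the abstract describes.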

  6. Multimodel Evidence for an Atmospheric Circulation Response to Arctic Sea Ice Loss in the CMIP5 Future Projections.

    PubMed

    Zappa, G; Pithan, F; Shepherd, T G

    2018-01-28

Previous single-model experiments have found that Arctic sea ice loss can influence the atmospheric circulation. To evaluate this process in a multimodel ensemble, a novel methodology is here presented and applied to infer the influence of Arctic sea ice loss in the CMIP5 future projections. Sea ice influence is estimated by comparing the circulation response in the RCP8.5 scenario against the circulation response to sea surface warming and CO2 increase inferred from the AMIPFuture and AMIP4xCO2 experiments, where sea ice is unperturbed. Multimodel evidence of the impact of sea ice loss on midlatitude atmospheric circulation is identified in late winter (January-March), when the sea ice-related surface heat flux perturbation is largest. Sea ice loss acts to suppress the projected poleward shift of the North Atlantic jet, to increase surface pressure in northern Siberia, and to lower it in North America. These features are consistent with previous single-model studies, and the present results indicate that they are robust to model formulation.

  7. Near-real-time TOMS, telecommunications and meteorological support for the 1987 Airborne Antarctic Ozone Experiment

    NASA Technical Reports Server (NTRS)

    Ardanuy, P.; Victorine, J.; Sechrist, F.; Feiner, A.; Penn, L.

    1988-01-01

The goal of the 1987 Airborne Antarctic Ozone Experiment was to improve the understanding of the mechanisms involved in the formation of the Antarctic ozone hole. Total ozone data taken by the Nimbus-7 Total Ozone Mapping Spectrometer (TOMS) played a central role in the successful outcome of the experiment. During the experiment, the near-real-time TOMS total ozone observations were supplied within hours of real time to the operations center in Punta Arenas, Chile. The final report summarizes the role which Research and Data Systems (RDS) Corporation played in the support of the experiment. The RDS provided telecommunications to support the science and operations efforts for the Airborne Antarctic Ozone Experiment, and supplied near-real-time weather information to ensure flight and crew safety; designed and installed the telecommunications network to link NASA-GSFC, the United Kingdom Meteorological Office (UKMO), Palmer Station, and the European Centre for Medium-Range Weather Forecasts (ECMWF) to the operation at Punta Arenas; engineered and installed stations and other stand-alone systems to collect data from designated low-orbiting polar satellites and beacons; provided analyses of Nimbus-7 TOMS data and backup data products to Punta Arenas; and provided synoptic meteorological data analysis and reduction.

  8. The Yearly Variation in Fall-Winter Arctic Winter Vortex Descent

    NASA Technical Reports Server (NTRS)

    Schoeberl, Mark R.; Newman, Paul A.

    1999-01-01

Using the change in HALOE methane profiles from early September to late March, we have estimated the minimum amount of diabatic descent within the polar vortex which takes place during Arctic winter. The year-to-year variations are a result of year-to-year variations in stratospheric wave activity, which (1) modify the temperature of the vortex and thus the cooling rate; and (2) reduce the apparent descent by mixing high amounts of methane into the vortex. The peak descent amounts from HALOE methane vary from 10 km to 14 km near the arrival altitude of 25 km. Using a diabatic trajectory calculation, we compare forward and backward trajectories over the course of the winter using UKMO assimilated stratospheric data. The forward calculation agrees fairly well with the observed descent. The backward calculation appears to be unable to produce the observed amount of descent, but this is only an apparent effect due to the density decrease in parcels with altitude. Finally, we show the results for unmixed descent experiments - where the parcels are fixed in latitude and longitude and allowed to descend based on the local cooling rate. Unmixed descent is found to always exceed mixed descent, because when normal parcel motion is included, the path-average cooling is always less than the cooling at a fixed polar point.

  9. Efficient in-situ visualization of unsteady flows in climate simulation

    NASA Astrophysics Data System (ADS)

    Vetter, Michael; Olbrich, Stephan

    2017-04-01

The simulation of climate data tends to produce very large data sets, which can hardly be processed in classical post-processing visualization applications. Typically, the visualization pipeline consisting of the processes data generation, visualization mapping and rendering is distributed into two parts over the network or separated via file transfer. Within most traditional post-processing scenarios the simulation is done on a supercomputer whereas the data analysis and visualization is done on a graphics workstation. That way, temporary data sets of huge volume have to be transferred over the network, which leads to bandwidth bottlenecks and volume limitations. The solution to this issue is the avoidance of temporary storage, or at least a significant reduction of data complexity. Within the Climate Visualization Lab - as part of the Cluster of Excellence "Integrated Climate System Analysis and Prediction" (CliSAP) at the University of Hamburg, in cooperation with the German Climate Computing Center (DKRZ) - we develop and integrate an in-situ approach. Our software framework DSVR is based on the separation of the process chain between the mapping and the rendering processes. It couples the mapping process directly to the simulation by calling methods of a parallelized data extraction library, which create a time-based sequence of geometric 3D scenes. This sequence is stored on a special streaming server with an interactive post-filtering option and then played out asynchronously in a separate 3D viewer application. Since the rendering is part of this viewer application, the scenes can be navigated interactively. In contrast to other in-situ approaches where 2D images are created as part of the simulation or synchronous co-visualization takes place, our method supports interaction in 3D space and in time, as well as fixed frame rates.
To integrate in-situ processing based on our DSVR framework and methods in the ICON climate model, we are continuously evolving the data structures and mapping algorithms of the framework to support the ICON model's native grid structures, since DSVR originally was designed for rectilinear grids only. We now have implemented a new output module in ICON to take advantage of the DSVR visualization. The visualization can be configured like most output modules by using a specific namelist and is integrated, as an example, within the non-hydrostatic atmospheric model time loop. With the integration of a DSVR-based in-situ pathline extraction within ICON, a further milestone is reached. The pathline algorithm as well as the grid data structures have been optimized for the domain decomposition used for the parallelization of ICON based on MPI and OpenMP. The software implementation and evaluation is done on the supercomputers at DKRZ. In principle, the data complexity is reduced from O(n³) to O(m), where n is the grid resolution and m is the number of supporting points of all pathlines. The stability and scalability evaluation is done using Atmospheric Model Intercomparison Project (AMIP) runs. We will give a short introduction to our software framework, as well as a short overview of the implementation and usage of DSVR within ICON. Furthermore, we will present visualization and evaluation results of sample applications.

  10. Long-term change of the atmospheric energy cycles and weather disturbances

    NASA Astrophysics Data System (ADS)

    Kim, WonMoo; Choi, Yong-Sang

    2017-11-01

Weather disturbances are the manifestation of mean atmospheric energy cascading into eddies, thus identifying the atmospheric energy structure is of fundamental importance to understanding weather variability in a changing climate. The question is whether our observational data can lead to a consistent diagnosis of the energy conversion characteristics. Here we investigate the atmospheric energy cascades within the simple framework of the Lorenz energy cycle, and analyze the energy distribution in mean and eddy fields as forms of potential and kinetic energy. It is found that even the widely utilized independent reanalysis datasets, NCEP-DOE AMIP-II Reanalysis (NCEP2) and ERA-Interim (ERA-INT), draw different conclusions on the change of weather variability measured by eddy-related kinetic energy. NCEP2 shows an increased mean-to-eddy energy conversion and enhanced eddy activity due to efficient baroclinic energy cascade, but ERA-INT shows a relatively constant energy cascading structure between the 1980s and the 2000s. The source of discrepancy mainly originates from the uncertainties in hydrological variables in the mid-troposphere. Therefore, much effort should be made to improve mid-tropospheric observations for a more reliable diagnosis of weather disturbances as a consequence of the man-made greenhouse effect.
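The mean/eddy split underlying the Lorenz energy cycle can be illustrated with a zonal-mean decomposition and the eddy kinetic energy per unit mass, EKE = 0.5*(u'^2 + v'^2). This is a minimal sketch of the decomposition, not the full four-reservoir budget used in the paper:

```python
import numpy as np

def mean_eddy_decompose(field):
    """Split a (lat, lon) field into its zonal mean and the eddy deviation
    from that mean (primes in Lorenz energy cycle notation)."""
    zonal_mean = field.mean(axis=-1, keepdims=True)
    eddy = field - zonal_mean
    return zonal_mean, eddy

def eddy_kinetic_energy(u, v):
    """Eddy kinetic energy per unit mass, 0.5*(u'^2 + v'^2),
    from horizontal wind components on a (lat, lon) grid."""
    _, u_eddy = mean_eddy_decompose(u)
    _, v_eddy = mean_eddy_decompose(v)
    return 0.5 * (u_eddy ** 2 + v_eddy ** 2)
```

A purely zonal jet contributes nothing to EKE, while any wave-like deviation from the zonal mean does; trends in area-averaged EKE are the "eddy-related kinetic energy" measure the abstract compares between reanalyses.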

  11. Synoptic Storms in the North Atlantic in the Atmospheric Reanalysis and Scatterometer-Based Wind Products

    NASA Astrophysics Data System (ADS)

    Dukhovskoy, D. S.; Bourassa, M. A.

    2016-12-01

The study compares and analyses the characteristics of synoptic storms in the Subpolar North Atlantic over the time period from 2000 through 2009 derived from reanalysis data sets and scatterometer-based gridded wind products. The analysis is performed for ocean 10-m winds derived from the following wind data sets: NCEP/DOE AMIP-II reanalysis (NCEPR2), NCAR/CFSR, Arctic System Reanalysis (ASR) version 1, Cross-Calibrated Multi-Platform (CCMP) wind product version 1.1 and the recently released version 2.0 prepared by Remote Sensing Systems, and QuikSCAT. The cyclone tracking algorithm employed in this study for storm identification is based on average vorticity fields derived from the wind data. The study discusses storm characteristics such as storm counts, trajectories, intensity, integrated kinetic energy, and spatial scale. Interannual variability of these characteristics in the data sets is compared. The analyses demonstrate general agreement among the wind data products on the characteristics of the storms, their spatial distribution and trajectories. On average, the NCEPR2 storms are more energetic, mostly due to large spatial scales and stronger winds. There is noticeable interannual variability in the storm characteristics, yet no obvious trend in storms is observed in the data sets.
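The vorticity-based identification step described above can be sketched as a finite-difference relative vorticity computed from the 10-m wind components, with a simple threshold picking candidate storm centers. This is an illustrative sketch on a uniform grid; the actual tracking algorithm (linking candidates into trajectories over time) is more involved:

```python
import numpy as np

def relative_vorticity(u, v, dx, dy):
    """Relative vorticity zeta = dv/dx - du/dy via centered finite
    differences on a uniform grid (axis 0 = y, axis 1 = x)."""
    dvdx = np.gradient(v, dx, axis=1)
    dudy = np.gradient(u, dy, axis=0)
    return dvdx - dudy

def storm_candidates(vorticity, threshold):
    """Indices of grid points where cyclonic vorticity exceeds a threshold;
    these serve as seeds for a storm tracking algorithm."""
    return np.argwhere(vorticity > threshold)
```

For example, a solid-body rotation u = -omega*y, v = omega*x has constant vorticity 2*omega, so every grid point exceeds any threshold below that value.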

  12. Long History of IAM Comparisons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Steven J.; Clarke, Leon E.; Edmonds, James A.

    2015-04-23

    Correspondence to editor: We agree with the editors that the assumptions behind models of all types, including integrated assessment models (IAMs), should be as transparent as possible. The editors were in error, however, when they implied that the IAM community is just “now emulating the efforts of climate researchers by instigating their own model inter-comparison projects (MIPs).” In fact, model comparisons for integrated assessment and climate models followed a remarkably similar trajectory. Early General Circulation Model (GCM) comparison efforts evolved into the first Atmospheric Model Inter-comparison Project (AMIP), which was initiated in the early 1990s. Atmospheric models evolved into coupled atmosphere-ocean models (AOGCMs), and results from the first Coupled Model Inter-comparison Project (CMIP1) became available about a decade later. Results of the first energy model comparison exercise, conducted under the auspices of the Stanford Energy Modeling Forum, were published in 1977. A summary of the first comparison focused on climate change was published in 1993. As energy models were coupled to simple economic and climate models to form IAMs, the first comparison exercise for IAMs (EMF-14) was initiated in 1994, and IAM comparison exercises have been ongoing since that time.

  13. Use of System Thinking Software for Determining Climate Change Impacts in Water Balance for the Rio Yaqui Basin, Sonora, Mexico

    NASA Astrophysics Data System (ADS)

    Tapia, E. M.; Minjarez, J. I.; Espinoza, I. G.; Sosa, C. M.

    2013-05-01

    Climate change in Northwestern Mexico and its hydrological impact on water balance, water scarcity and flooding events has become a matter of increasing concern over the past several decades due to the region's semiarid conditions. Changes in temperature, precipitation, and sea level will affect agriculture, farming, and aquaculture, in addition to compromising the quality of water resources for human consumption. According to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC, 2007), Global Circulation Models (GCMs) can provide reliable estimations of future climate conditions, in addition to the atmospheric processes that cause them, based on different input scenarios such as A2 (higher emission of greenhouse gases) and B1 (lower emission of GHG), among others. However, the resolution of GCMs is too coarse for regions with high spatial and temporal climate variability. To remedy this, several downscaling methods based on dynamical, statistical and empirical analysis have been proposed. In this study, we evaluate possible changes in precipitation and temperature for the "Rio Yaqui Basin" in Sonora, Mexico and assess the impact of such changes on runoff, evapotranspiration and aquifer recharge for the 2010-2099 period. For this purpose, we analyzed the results of a Bias Corrected and Downscaled Climate Projection from the World Climate Research Programme's (WCRP's) Coupled Model Intercomparison Project phase 3 (CMIP3) multi-model dataset: UKMO-HADCM3 from the Hadley Centre for Climate Prediction. Northwest Mexico is under the influence of the North American Monsoon (NAM), a system affecting the states of Sinaloa and Sonora, where the precipitation regime changes drastically during the summer months of June, July and August. The NAM is associated with sharp variations of topography, precipitation and temperature regimes in the region, hence the importance of analyzing the downscaled climate projections.
The Rio Yaqui Basin is one of the most important basins in Sonora. It is located in northwestern Mexico and covers an approximate area of 74,054 km2, providing water for one of the most prominent agricultural zones in the state. We used the System Thinking software "Stella 9.0.2" to dynamically visualize the effects of climate change on the Rio Yaqui Basin. In this software, the main components of the water balance are simulated over the designated period with tools that include stock-and-flow diagrams, causal loops, model equations and built-in functions. Climate change projections for the Rio Yaqui Basin showed highly variable runoff behavior, indicating the possibility of frequent droughts alternating with years of extraordinary runoff. Simulations generated with the System Thinking software provide a reasonable basis for establishing policies for optimizing the storage of water during extraordinary runoff periods, so that it can serve as a water supply during frequent droughts.

  14. Equatorial Wave Activity during NOAA's 2016 El Niño Rapid Response Field Campaign

    NASA Astrophysics Data System (ADS)

    Kiladis, G. N.; Dias, J.; Gehne, M.; Mayer, K.

    2016-12-01

    The El Niño Rapid Response (ENRR) field campaign targeted equatorial Pacific atmospheric convective activity during January-March 2016 through enhanced observations using dropsondes from the NOAA G-IV aircraft and radiosonde observations from Kiritimati (Christmas) Island and the NOAA research ship Ronald H. Brown. This presentation examines the equatorial wave activity observed during ENRR and its relationship to tropical convection, and compares this activity to observations of past large El Niño events. The 2015-16 El Niño had much in common with the events during 1982-83 and 1997-98, with similar amplitude sea surface temperature (SST) anomalies, but also differed in several key aspects. All of these episodes featured enhanced convectively coupled Kelvin wave activity crossing the entire Pacific basin, which is generally absent during northern winter seasons with near-normal or La Niña SSTs. Prior to the ENRR period, during December 2015, a large-amplitude Madden-Julian Oscillation (MJO) was observed, with a convective signal that propagated unusually far to the east (to near 150W). This was associated with an eastward displacement of the North Pacific storm track and heavy precipitation along the west coast of North America, broadly matching the large-scale behavior of MJO evolution in statistical composites during El Niño. A second MJO-like event occurred during the latter part of February 2016, but despite a similar convective heating field, the basic-state flow was much different than during December, with a well-developed "westerly duct" that favored the intrusion of extratropical Rossby wave energy into the equatorial eastern Pacific region, as can be seen in E-vector fields. This latter event was accompanied by a distinct lack of an extended storm track and of associated precipitation along the west coast of North America. 
Based on the preliminary results of AMIP simulations using observed SSTs, these differences are difficult to reproduce, and are hypothesized to be due to a certain level of "internal variability" within the storm track itself that may have been overriding the large scale forcing by the tropical diabatic heating field.

  15. Can we trust climate models to realistically represent severe European windstorms?

    NASA Astrophysics Data System (ADS)

    Trzeciak, Tomasz M.; Knippertz, Peter; Owen, Jennifer S. R.

    2014-05-01

    Despite the enormous advances made in climate change research, robust projections of the position and strength of the North Atlantic storm track are not yet possible. In particular with respect to damaging windstorms, this uncertainty poses enormous risks to European societies and the (re)insurance industry. Previous studies have addressed the problem of climate model uncertainty through statistical comparisons of simulations of the current climate with (re-)analysis data and found that there is large disagreement between different climate models, between different ensemble members of the same model, and with observed climatologies of intense cyclones. One weakness of such statistical evaluations lies in the difficulty of separating influences of the climate model's basic state from the influence of fast processes on the development of the most intense storms. Compensating effects between the two might conceal errors and suggest higher reliability than there really is. A possible way to separate influences of fast and slow processes in climate projections is through a "seamless" approach of hindcasting historical severe storms with climate models started from predefined initial conditions and run in numerical weather prediction mode on the time scale of several days. Such a cost-effective case-study approach, which draws from and expands on concepts from the Transpose-AMIP initiative, has recently been undertaken in the SEAMSEW project at the University of Leeds, funded by the AXA Research Fund. Key results from this work, focusing on 20 historical storms and using different lead times and horizontal and vertical resolutions, include: (a) Tracks are represented reasonably well by most hindcasts. (b) Sensitivity to vertical resolution is low. 
(c) There is a systematic underprediction of cyclone depth at a coarse resolution of T63, but surprisingly no systematic bias is found for higher-resolution runs using T127, showing that climate models are in fact able to represent the storm dynamics well if given the correct initial conditions. Combined with the too low number of deep cyclones in many climate models, this points to an insufficient number of storm-prone initial conditions in free-running climate simulations. This question will be addressed in future work.

  16. Is Polar Amplification Deeper and Stronger than Dynamicists Assume?

    NASA Astrophysics Data System (ADS)

    Scheff, J.; Maroon, E.

    2017-12-01

    In the CMIP multi-model mean under strong future warming, Arctic amplification is confined to the lower troposphere, so that the meridional gradient of warming reverses around 500 mb and the upper troposphere is characterized by strong "tropical amplification" in which warming weakens with increasing latitude. This model-derived pattern of warming maxima in the upper-level tropics and lower-level Arctic has become a canonical assumption driving theories of the large-scale circulation response to climate change. Yet, several lines of evidence and reasoning suggest that Arctic amplification may in fact extend through the entire depth of the troposphere, and/or may be stronger than commonly modeled. These include satellite Microwave Sounding Unit (MSU) temperature trends as a function of latitude and vertical level, the recent discovery that the extratropical negative cloud phase feedback in models is largely spurious, and the very strong polar amplification observed in past warm and lukewarm climates. Such a warming pattern, with deep, dominant Arctic amplification, would have very different implications for the circulation than a canonical CMIP-like warming: instead of slightly shifting poleward and strengthening, eddies, jets and cells might shift equatorward and considerably weaken. Indeed, surface winds have been mysteriously weakening ("stilling") at almost all stations over the last half-century or so, there has been no poleward shift in northern hemisphere circulation metrics, and past warm climates' subtropics were apparently quite wet (and their global ocean circulations were weak). To explore these possibilities more deeply, we examine the y-z structure of warming and circulation changes across a much broader range of models, scenarios and time periods than the CMIP future mean, and use an MSU simulator to compare them to the satellite warming record. 
Specifically, we examine whether the use of historical (rather than future) forcing, AMIP (rather than CMIP) configuration, individual GCMs, and/or individual ensemble members can better reproduce the structure of the MSU and surface-wind observations. Figure 1 already shows that tropical amplification is absent in the CESM1 historical ensemble (1979-2012). The results of these analyses will guide our future modeling work on these topics.

  17. Accuracy of Modelled Stratospheric Temperatures in the Winter Arctic Vortex from Infra Red Montgolfier Long Duration Balloon Measurements

    NASA Technical Reports Server (NTRS)

    Pommereau, J.-P.; Garnier, A.; Knudson, B. M.; Letrenne, G.; Durand, M.; Cseresnjes, M.; Nunes-Pinharanda, M.; Denis, L.; Newman, P. A.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    The temperature of the stratosphere has been measured in the Arctic vortex every 9-10 minutes along the trajectories of four Infra-Red Montgolfier long-duration balloons flown for 7 to 22 days during the winters of 1997 and 1999. From a number of comparisons with independent sensors, the accuracy of the measurements is demonstrated to be plus or minus 0.5 K during nighttime and at altitudes below 28 km (10 hPa). The performance of the analyses of global meteorological models (European Centre for Medium-Range Weather Forecasts (ECMWF) at 31 and 50 levels, United Kingdom Meteorological Office (UKMO), Data Assimilation Office (DAO), National Centers for Environmental Prediction (NCEP) and the NCEP/NCAR reanalysis), used in photochemical simulations of ozone destruction and in the interpretation of satellite data, is evaluated by comparison with this large (3500 data points) and homogeneous experimental data set. Most of the models, except ECMWF 31 levels in 1999, show a small average warm bias of between 0 and 1.6 K, with deviations that are particularly large, up to 20 K, at high altitude (5 hPa) in stratospheric warming conditions in 1999. Particularly wrong was the ECMWF 31-level model near its top level at 10 hPa in 1999, where temperatures 25 K colder than the real atmosphere were reported. The average dispersion between models and measurements varies from plus or minus 1.0 to plus or minus 3.0 K depending on the model and the year. It is shown to be the result of three contributions. The largest is a long-wave modulation likely caused by the displacement of the temperature field in the analyses compared to the real atmosphere. The second is the overestimation of the vertical gradient of temperature, particularly in warming conditions, which explains the increase of dispersion from 1997 to 1999. Unexpectedly, the third and smallest (plus or minus 0.6-0.7 K) is the contribution of meso- and subgrid-scale vertical and horizontal features associated with the vertical propagation of orographic or gravity waves. 
The newly available ECMWF 50-level version, which assimilates the high-vertical-resolution radiances of the spaceborne Advanced Microwave Sounding Unit, performs significantly better than the other models (0.03 plus or minus 1.12 K on average between 10 and 140 hPa in 1999).

  18. Evaluation of TIGGE Ensemble Forecasts of Precipitation in Distinct Climate Regions in Iran

    NASA Astrophysics Data System (ADS)

    Aminyavari, Saleh; Saghafian, Bahram; Delavar, Majid

    2018-04-01

    The application of numerical weather prediction (NWP) products is increasing dramatically. Existing reports indicate that ensemble predictions have better skill than deterministic forecasts. In this study, numerical ensemble precipitation forecasts in the TIGGE database were evaluated using deterministic, dichotomous (yes/no), and probabilistic techniques over Iran for the period 2008-16. Thirteen rain gauges spread over eight homogeneous precipitation regimes were selected for evaluation. The Inverse Distance Weighting and Kriging methods were adopted for interpolation of the prediction values, downscaled to the stations at lead times of one to three days. To enhance the forecast quality, NWP values were post-processed via Bayesian Model Averaging. The results showed that ECMWF had better scores than other products. However, products of all centers underestimated precipitation in high precipitation regions while overestimating precipitation in other regions. This points to a systematic bias in forecasts and demands application of bias correction techniques. Based on dichotomous evaluation, NCEP did better at most stations, although all centers overpredicted the number of precipitation events. Compared to those of ECMWF and NCEP, UKMO yielded higher scores in mountainous regions, but performed poorly at other selected stations. Furthermore, the evaluations showed that all centers had better skill in wet than in dry seasons. The quality of post-processed predictions was better than those of the raw predictions. In conclusion, the accuracy of the NWP predictions made by the selected centers could be classified as medium over Iran, while post-processing of predictions is recommended to improve the quality.
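    Of the two interpolation methods mentioned, Inverse Distance Weighting is the simpler; a minimal sketch (illustrative only, with an assumed function name and Shepard's classic 1/d^p weights) of bringing gridded values down to station locations might look like:

    ```python
    import numpy as np

    def idw(xy_known, values, xy_target, power=2.0):
        """Inverse Distance Weighting (Shepard's method): estimate values at
        target points as distance-weighted averages of known samples."""
        xy_known = np.asarray(xy_known, float)
        values = np.asarray(values, float)
        xy_target = np.atleast_2d(np.asarray(xy_target, float))
        out = np.empty(len(xy_target))
        for i, p in enumerate(xy_target):
            d = np.linalg.norm(xy_known - p, axis=1)  # distances to samples
            if np.any(d == 0):                        # exact hit: copy the sample
                out[i] = values[np.argmin(d)]
                continue
            w = 1.0 / d**power                        # weights fall off as 1/d^p
            out[i] = np.sum(w * values) / np.sum(w)
        return out
    ```

    In the setting of the study, `xy_known` would hold the surrounding forecast grid points and `xy_target` the rain-gauge coordinates.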

  19. Intraseasonal Oscillations over South America: A Study with a Regional Climate Model

    NASA Technical Reports Server (NTRS)

    Chen, Baode; Chao, Winston

    2003-01-01

    The National Center for Atmospheric Research (NCAR) regional climate model version 2 (RegCM2) is used to investigate the observed characteristics of intraseasonal oscillations over South America. Our study concentrates on an intraseasonal mode which is observed to account for a large portion of the intraseasonal variation, to have a standing character, and to be independent of the MJO. The NCEP-DOE AMIP-II reanalysis is utilized to provide initial and lateral boundary conditions for the RegCM2, based upon the 00Z, 06Z, 12Z and 18Z data. Our results indicate that the intraseasonal oscillation still exists with a time-averaged lateral boundary condition, which prevents the MJO and other outside disturbances from entering the model's domain, suggesting that a locally forced oscillation is responsible for this intraseasonal mode independent of the MJO. Further experiments show that the annual and daily variabilities and a radiative-convective interaction are not essential to the locally forced intraseasonal oscillation. The intraseasonal oscillations over the Amazon in our model essentially result from interactions among the continental-scale atmospheric circulation, surface radiation, surface sensible and latent heat fluxes, and cumulus convection. Wavelet analyses of various surface energy fluxes and of the surface energy budget also verify that the primary cause of the intraseasonal oscillation is the interaction of land surface processes with the atmosphere.

  20. Global Ocean Evaporation Increases Since 1960 in Climate Reanalyses: How Accurate Are They?

    NASA Technical Reports Server (NTRS)

    Robertson, Franklin R.; Roberts, Jason B.; Bosilovich, Michael G.

    2016-01-01

    AGCMs with specified SSTs (AMIP runs): GEOS-5, ERA-20CM ensembles. These incorporate the best historical estimates of SST, sea ice and radiative forcing, but the atmospheric "weather noise" is inconsistent with the specified SST, so instantaneous surface fluxes can have the wrong sign (e.g. Indian Ocean monsoon, high-latitude oceans); averaging over ensemble members helps isolate the SST-forced signal. 
    Reduced-observational reanalyses: NOAA 20CR V2C, ERA-20C, JRA-55C. These incorporate observed surface pressure (20CR), marine winds (ERA-20C) and rawinsondes (JRA-55C) to recover much of the true synoptic weather without the shock of new satellite observations. 
    Comprehensive reanalyses (MERRA-2): the full suite of observational constraints, both conventional and remote sensing, but with substantial uncertainties owing to the evolving satellite observing system. 
    Multi-source statistical blends: OAFlux, Large-Yeager. These blend reanalysis, satellite, and ocean-buoy information; while climatological biases are removed, non-physical trends or variations in the components remain. 
    Satellite retrievals: GSSTF3, SeaFlux, HOAPS3. Global coverage; retrieved near-surface wind speed and humidity are used with SST to drive accurate bulk aerodynamic flux estimates, so satellite inter-calibration and spacecraft pointing variations are crucial, and the record is short (late 1987 to present). 
    In situ measurements: ICOADS, IVAD, research cruises. VOS and buoys offer direct measurements, but data coverage is sparse (especially south of 30S) and measurement techniques have changed (e.g. shipboard anemometer height).

  1. The influence of the atmospheric boundary layer on nocturnal layers of noctuids and other moths migrating over southern Britain.

    PubMed

    Wood, Curtis R; Chapman, Jason W; Reynolds, Donald R; Barlow, Janet F; Smith, Alan D; Woiwod, Ian P

    2006-03-01

    Insects migrating at high altitude over southern Britain have been continuously monitored by automatically operating vertical-looking radars over a period of several years. On some occasions in the summer months, the migrants were observed to form well-defined layer concentrations, typically at heights of 200-400 m, in the stable night-time atmosphere. Under these conditions, insects are likely to have control over their vertical movements and are selecting flight heights that are favourable for long-range migration. We therefore investigated the factors influencing the formation of these insect layers by comparing radar measurements of the vertical distribution of insect density with meteorological profiles generated by the UK Meteorological Office's (UKMO) Unified Model (UM). Radar-derived measurements of mass and displacement speed, along with data from Rothamsted Insect Survey light traps, provided information on the identity of the migrants. We present here three case studies where noctuid and pyralid moths contributed substantially to the observed layers. The major meteorological factors influencing the layer concentrations appeared to be: (a) the altitude of the warmest air, (b) heights corresponding to temperature preferences or thresholds for sustained migration and (c) on nights when air temperatures are relatively high, wind-speed maxima associated with the nocturnal jet. Back-trajectories indicated that layer duration may have been determined by the distance to the coast. Overall, the unique combination of meteorological data from the UM and insect data from entomological radar described here shows considerable promise for systematic studies of high-altitude insect layering.

  2. Rainfall and its seasonality over the Amazon in the 21st century as assessed by the coupled models for the IPCC AR4

    NASA Astrophysics Data System (ADS)

    Li, Wenhong; Fu, Rong; Dickinson, Robert E.

    2006-01-01

    The global climate models for the Intergovernmental Panel on Climate Change Fourth Assessment Report (IPCC AR4) predict very different changes of rainfall over the Amazon under the SRES A1B scenario for global climate change. Five of the eleven models predict an increase of annual rainfall, three models predict a decrease of rainfall, and the other three models predict no significant changes in the Amazon rainfall. We have further examined two models. The UKMO-HadCM3 model predicts an El Niño-like sea surface temperature (SST) change and warming in the northern tropical Atlantic which appear to enhance atmospheric subsidence and consequently reduce clouds over the Amazon. The resultant increase of surface solar absorption causes a stronger surface sensible heat flux and thus reduces relative humidity of the surface air. These changes decrease the rate and length of wet season rainfall and surface latent heat flux. This decreased wet season rainfall leads to drier soil during the subsequent dry season, which in turn can delay the transition from the dry to wet season. GISS-ER predicts a weaker SST warming in the western Pacific and the southern tropical Atlantic which increases moisture transport and hence rainfall in the Amazon. In the southern Amazon and Nordeste where the strongest rainfall increase occurs, the resultant higher soil moisture supports a higher surface latent heat flux during the dry and transition season and leads to an earlier wet season onset.

  3. Similarity-based multi-model ensemble approach for 1-15-day advance prediction of monsoon rainfall over India

    NASA Astrophysics Data System (ADS)

    Jaiswal, Neeru; Kishtawal, C. M.; Bhomia, Swati

    2018-04-01

    The southwest (SW) monsoon season (June, July, August and September) is the major period of rainfall over the Indian region. The present study focuses on the development of a new multi-model ensemble approach based on a similarity criterion (SMME) for the prediction of SW monsoon rainfall in the extended range. The approach is based on the assumption that training on similar conditions may provide better forecasts than the sequential training used in conventional MME approaches. Here, the training dataset was selected by matching the present-day condition to the archived dataset; the days with the most similar conditions were identified and used to train the model. The coefficients thus generated were used for the rainfall prediction. The precipitation forecasts from four general circulation models (GCMs), viz. the European Centre for Medium-Range Weather Forecasts (ECMWF), the United Kingdom Meteorological Office (UKMO), the National Centers for Environmental Prediction (NCEP) and the China Meteorological Administration (CMA), were used to develop the SMME forecasts. Forecasts for days 1-5, 6-10 and 11-15 were generated using the newly developed approach for each pentad of June-September during the years 2008-2013, and the skill of the model was analysed using verification scores, viz. the equitable threat score (ETS), mean absolute error (MAE), Pearson's correlation coefficient and the Nash-Sutcliffe model efficiency index. Statistical analysis of the SMME forecasts shows superior forecast skill compared to the conventional MME and the individual models for all forecast ranges, viz. 1-5, 6-10 and 11-15 days.
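    The similarity-based selection and regression step described above can be sketched as follows. This is a simplified illustration under assumed details (Euclidean distance as the similarity measure, ordinary least squares for the coefficients); the actual SMME implementation may differ:

    ```python
    import numpy as np

    def smme_forecast(hist_preds, hist_obs, today_preds, k=20):
        """Similarity-based multi-model ensemble (illustrative sketch).

        hist_preds : (n_days, n_models) archived model forecasts
        hist_obs   : (n_days,) corresponding observed rainfall
        today_preds: (n_models,) current forecasts from each model
        k          : number of most similar past days used for training

        Picks the k archived days whose forecasts are closest (Euclidean
        distance) to today's, fits per-model weights by least squares on
        those days only, and applies the weights to today's forecasts.
        """
        dist = np.linalg.norm(hist_preds - today_preds, axis=1)
        idx = np.argsort(dist)[:k]                          # most similar days
        X = np.column_stack([hist_preds[idx], np.ones(k)])  # add an intercept
        coef, *_ = np.linalg.lstsq(X, hist_obs[idx], rcond=None)
        return np.append(today_preds, 1.0) @ coef
    ```

    The contrast with a conventional MME is only in `idx`: a sequential scheme would train on the most recent days rather than the most similar ones.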

  4. Potential Predictability of the Monsoon Subclimate Systems

    NASA Technical Reports Server (NTRS)

    Yang, Song; Lau, K.-M.; Chang, Y.; Schubert, S.

    1999-01-01

    While the El Niño/Southern Oscillation (ENSO) phenomenon can be predicted with some success using coupled ocean-atmosphere models, the skill of predicting the tropical monsoons is low regardless of the method applied. The low skill of monsoon prediction may arise either because the monsoons are not defined appropriately or because they are not influenced significantly by boundary forcing. The latter possibility underlines the importance of internal dynamics in monsoon variability and leads to many of the prominent chaotic features of the monsoons. In this study, we analyze results from nine AMIP-type ensemble experiments with the NASA/GEOS-2 general circulation model to assess the potential predictability of the tropical climate system. We focus on the variability and predictability of tropical monsoon rainfall on seasonal-to-interannual time scales. It is known that the tropical climate is more predictable than its extratropical counterpart. However, predictability differs from one climate subsystem to another within the tropics, and it is important to understand the differences among these subsystems in order to increase the skill of seasonal-to-interannual prediction. We assess potential predictability by comparing the magnitudes of the internal and forced variances, as defined by Harzallah and Sadourny (1995). The internal variance measures the spread among the various ensemble members. The forced part of the rainfall variance is determined by the magnitude of the ensemble-mean rainfall anomaly and by the degree of consistency of the results from the various experiments.
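    The partition into forced and internal variance can be written compactly; the sketch below is illustrative (an assumed (members, years) array layout, population variances), following the spirit of the Harzallah and Sadourny (1995) decomposition rather than its exact formulation:

    ```python
    import numpy as np

    def forced_and_internal_variance(ens):
        """Partition ensemble rainfall variance into forced and internal parts.

        ens : array (n_members, n_years) of e.g. seasonal rainfall anomalies.
        Forced variance  : variance over years of the ensemble mean, i.e. the
                           part reproducible from the common boundary forcing.
        Internal variance: mean over years of the spread across members.
        """
        ens_mean = ens.mean(axis=0)          # (n_years,) SST-forced signal
        forced = ens_mean.var()
        internal = ens.var(axis=0).mean()    # member spread, averaged in time
        return forced, internal
    ```

    A high ratio of forced to internal variance then marks a subsystem as potentially predictable from boundary conditions.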

  5. Potential Seasonal Predictability for Winter Storms over Europe

    NASA Astrophysics Data System (ADS)

    Wild, Simon; Befort, Daniel J.; Leckebusch, Gregor C.

    2017-04-01

    Reliable seasonal forecasts of strong extra-tropical cyclones and windstorms would have great societal and economic benefits, as these events are the most costly natural hazards over Europe. In a previous study we showed good agreement between the spatial climatological distributions of extra-tropical cyclones and windstorms in state-of-the-art multi-member seasonal prediction systems and reanalysis. We also found significant seasonal prediction skill for extra-tropical cyclones and windstorms affecting numerous European countries. We continue this research by investigating the mechanisms and precursor conditions (primarily over the North Atlantic) on a seasonal time scale that lead to enhanced extra-tropical cyclone activity and winter storm frequency over Europe. Our results regarding mechanisms show that an increased surface temperature gradient at the western edge of the North Atlantic can be related to enhanced winter storm frequency further downstream, causing, for example, a greater number of storms over the British Isles, as observed in winter 2013-14. The so-called "Horseshoe Index", an SST tripole anomaly pattern over the North Atlantic in the summer months, can also lead to a higher number of winter storms over Europe in the subsequent winter. We will show results of AMIP-type sensitivity experiments using an AGCM (ECHAM5) supporting this hypothesis. Finally, we will analyse whether existing seasonal forecast systems are able to capture these identified mechanisms and precursor conditions affecting the models' seasonal prediction skill.

  6. Thermodynamic ocean-atmosphere Coupling and the Predictability of Nordeste rainfall

    NASA Astrophysics Data System (ADS)

    Chang, P.; Saravanan, R.; Giannini, A.

    2003-04-01

    The interannual variability of rainfall in the northeastern region of Brazil, or Nordeste, is known to be very strongly correlated with sea surface temperature (SST) variability of Atlantic and Pacific origin. For this reason the potential predictability of Nordeste rainfall is high. The current generation of state-of-the-art atmospheric models can replicate the observed rainfall variability with high skill when forced with the observed record of SST variability. The correlation between observed and modeled indices of Nordeste rainfall, in the AMIP-style integrations with two such models (NSIPP and CCM3) analyzed here, is of the order of 0.8, i.e. the models explain about 2/3 of the observed variability. Assuming that thermodynamic ocean-atmosphere heat exchange plays the dominant role in tropical Atlantic SST variability on the seasonal to interannual time scale, we analyze its role in Nordeste rainfall predictability using an atmospheric general circulation model coupled to a slab ocean model. Predictability experiments initialized with observed December SST show that thermodynamic coupling plays a significant role in enhancing the persistence of SST anomalies, both in the tropical Pacific and in the tropical Atlantic. We show that thermodynamic coupling is sufficient to provide fairly accurate forecasts of tropical Atlantic SST in the boreal spring that are significantly better than persistence forecasts. The consequences for the prediction of Nordeste rainfall are analyzed.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jiali; Swati, F. N. U.; Stein, Michael L.

    Regional climate models (RCMs) are a standard tool for downscaling climate forecasts to finer spatial scales. The evaluation of RCMs against observational data is an important step in building confidence in the use of RCMs for future prediction. In addition to model performance in climatological means and marginal distributions, a model’s ability to capture spatio-temporal relationships is important. This study develops two approaches: (1) spatial correlation/variogram for a range of spatial lags, with total monthly precipitation and non-seasonal precipitation components used to assess the spatial variations of precipitation; and (2) spatio-temporal correlation for a wide range of distances, directions, and time lags, with daily precipitation occurrence used to detect the dynamic features of precipitation. These measures of spatial and spatio-temporal dependence are applied to a high-resolution RCM run and to the National Centers for Environmental Prediction (NCEP)-U.S. Department of Energy (DOE) AMIP II reanalysis data (NCEP-R2), which provides initial and lateral boundary conditions for the RCM. The RCM performs better than NCEP-R2 in capturing both the spatial variations of the total and non-seasonal precipitation components and the spatio-temporal correlations of daily precipitation occurrences, which are related to the dynamic behavior of precipitating systems. The improvements are apparent not just at resolutions finer than that of NCEP-R2, but also when the RCM and observational data are aggregated to the resolution of NCEP-R2.
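    The first diagnostic, an empirical variogram over a range of spatial lags, can be sketched as follows; the binning scheme and function name here are illustrative assumptions rather than the study's exact procedure:

    ```python
    import numpy as np

    def empirical_variogram(coords, vals, bins):
        """Empirical semivariogram: gamma(h) = 0.5 * mean[(z_i - z_j)^2]
        over all location pairs whose separation falls in each distance bin.

        coords : (n, 2) grid-point or station coordinates
        vals   : (n,) e.g. total monthly precipitation at each location
        bins   : 1-D array of bin edges for separation distance
        """
        n = len(vals)
        i, j = np.triu_indices(n, k=1)                     # all unique pairs
        d = np.linalg.norm(coords[i] - coords[j], axis=1)  # pair separations
        sq = 0.5 * (vals[i] - vals[j])**2                  # semivariance terms
        which = np.digitize(d, bins) - 1                   # assign pairs to bins
        gamma = np.full(len(bins) - 1, np.nan)
        for b in range(len(bins) - 1):
            sel = which == b
            if sel.any():
                gamma[b] = sq[sel].mean()
        return gamma
    ```

    Comparing `gamma` curves computed from the RCM, NCEP-R2, and observations is then a direct test of how each captures spatial dependence.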

  8. Analyzing Multidecadal Trends in Cloudiness Over the Subtropical Andes Mountains of South America Using a Regional Climate Model.

    NASA Astrophysics Data System (ADS)

    Zaitchik, B. F.; Russell, A.; Gnanadesikan, A.

    2016-12-01

    Satellite-based products indicate that many parts of South America have been experiencing increases in outgoing longwave radiation (OLR) and corresponding decreases in cloudiness over the last few decades, with the strongest trends occurring in the subtropical Andes Mountains - an area that is highly vulnerable to climate change due to its reliance on glacial melt for dry-season runoff. Changes in cloudiness may be contributing to increases in atmospheric temperature, thereby raising the freezing level height (FLH) - a critical geophysical parameter. Yet these trends are only partially captured in reanalysis products, while AMIP climate models generally show no significant trend in OLR over this timeframe, making it difficult to determine the underlying drivers. Therefore, controlled numerical experiments with a regional climate model are performed in order to investigate drivers of the observed OLR and cloudiness trends. The Weather Research and Forecasting model (WRF) is used here because it offers several advantages over global models, including higher resolution - a critical asset in areas of complex topography - as well as flexible physics, parameterization, and data assimilation capabilities. It is likely that changes in the mean states and meridional gradients of SSTs in the Pacific and Atlantic oceans are driving regional trends in clouds. A series of lower boundary manipulations are performed with WRF to determine to what extent changes in SSTs influence regional OLR.

  9. Soil frost-induced soil moisture precipitation feedback and effects on atmospheric states

    NASA Astrophysics Data System (ADS)

    Hagemann, Stefan; Blome, Tanja; Ekici, Altug; Beer, Christian

    2016-04-01

    Permafrost, or perennially frozen ground, is an important part of the terrestrial cryosphere; roughly one quarter of Earth's land surface is underlain by permafrost. As it is a thermal phenomenon, its characteristics are highly dependent on climatic factors. The currently observed warming, which is projected to persist during the coming decades due to anthropogenic CO2 input, certainly has effects on the vast permafrost areas of the high northern latitudes. The quantification of these effects, however, is scientifically still an open question. This is partly due to the complexity of the system, in which several feedbacks interact between land and atmosphere, sometimes counterbalancing each other. Moreover, until recently, many global circulation models (GCMs) and Earth system models (ESMs) lacked a sufficient representation of permafrost physics in their land surface schemes. Within the European Union FP7 project PAGE21, the land surface scheme JSBACH of the Max Planck Institute for Meteorology ESM (MPI-ESM) has been equipped with representations of the physical processes relevant for permafrost studies. These processes include the effects of freezing and thawing of soil water on both the energy and water cycles, thermal properties that depend on soil water and ice contents, and soil moisture movement influenced by the presence of soil ice. In the present study, we analyse how these permafrost-relevant processes affect large-scale hydrology and climate over northern-hemisphere high-latitude land areas. For this analysis, the atmosphere-land part of MPI-ESM, ECHAM6-JSBACH, is driven by prescribed observed SST and sea ice in an AMIP2-type setup with and without the newly implemented permafrost processes. Results show a large improvement in the simulated discharge. On the one hand, this is related to an improved snowmelt runoff peak due to frozen soil in spring. On the other hand, a subsequent reduction of soil moisture leads to a positive land-atmosphere feedback to precipitation over the high latitudes, which reduces the model's wet biases in precipitation and evapotranspiration during the summer. This is noteworthy, as soil moisture-atmosphere feedbacks have previously not been a research focus over the high latitudes. These results point out the importance of high-latitude physical processes at the land surface for the regional climate.

  10. Mass and Ozone Fluxes from the Lowermost Stratosphere

    NASA Technical Reports Server (NTRS)

    Schoeberl, Mark R.; Olsen, Mark A.

    2004-01-01

    Net mass flux from the stratosphere to the troposphere can be computed from the heating rate along the 380 K isentropic surface and the time rate of change of the mass of the lowermost stratosphere (the region between the tropopause and the 380 K isentrope). Given this net mass flux and the cross-tropopause diabatic mass flux, the residual adiabatic mass flux across the tropopause can also be estimated. These fluxes have been computed using meteorological fields from a free-running general circulation model (FVGCM) and two assimilation data sets, FVDAS and UKMO. The data sets tend to agree that the annual average net mass flux for the Northern Hemisphere is about 1×10^10 kg/s. There is less agreement on the Southern Hemisphere flux, which may be about half as large. For all three data sets, the adiabatic mass flux is computed to be from the upper troposphere into the lowermost stratosphere. This flux will dilute air entering from higher stratospheric altitudes. The mass fluxes are convolved with ozone mixing ratios from the Goddard 3D CTM (which uses the FVGCM) to estimate the cross-tropopause transport of ozone. A relatively large adiabatic flux of tropospheric ozone from the tropical upper troposphere into the extratropical lowermost stratosphere dilutes the stratospheric air in the lowermost stratosphere. Thus, a significant fraction of any measured ozone STE may not be ozone produced in the higher stratosphere. The results also illustrate that the annual cycle of ozone concentration in the lowermost stratosphere plays as large a role as the transport in the seasonal ozone flux cycle. This implies that a simplified calculation of ozone STE mass from air mass and a mean ozone mixing ratio may have a large uncertainty.
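    The budget logic in the opening sentences reduces to simple bookkeeping: the net cross-tropopause flux follows from the 380 K flux and the mass tendency of the lowermost stratosphere, and the adiabatic component is the residual once the diabatic flux is specified. A minimal sketch with made-up numbers and an assumed sign convention (positive = downward, stratosphere to troposphere):

```python
def residual_adiabatic_flux(flux_380K, dM_dt, diabatic_flux):
    """Mass budget of the lowermost stratosphere: the net flux out
    across the tropopause is the flux in across 380 K minus the mass
    tendency dM/dt of the layer; the adiabatic residual is that net
    flux minus the diabatic cross-tropopause component."""
    net_tropopause_flux = flux_380K - dM_dt
    return net_tropopause_flux - diabatic_flux

# illustrative placeholder numbers (kg/s), not the paper's results
res = residual_adiabatic_flux(flux_380K=1.0e10, dM_dt=2.0e9, diabatic_flux=9.0e9)
```

    A negative residual under this convention would indicate adiabatic transport from the upper troposphere into the lowermost stratosphere, which is the direction the paper finds for all three data sets.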

  11. Verification of the skill of numerical weather prediction models in forecasting rainfall from U.S. landfalling tropical cyclones

    NASA Astrophysics Data System (ADS)

    Luitel, Beda; Villarini, Gabriele; Vecchi, Gabriel A.

    2018-01-01

    The goal of this study is the evaluation of the skill of five state-of-the-art numerical weather prediction (NWP) systems [European Centre for Medium-Range Weather Forecasts (ECMWF), UK Met Office (UKMO), National Centers for Environmental Prediction (NCEP), China Meteorological Administration (CMA), and Canadian Meteorological Center (CMC)] in forecasting rainfall from North Atlantic tropical cyclones (TCs). Analyses focus on 15 North Atlantic TCs that made landfall along the U.S. coast over the 2007-2012 period. As reference data we use gridded rainfall provided by the Climate Prediction Center (CPC). We consider forecast lead times up to five days. To benchmark the skill of these models, we consider rainfall estimates from one radar-based (Stage IV) and four satellite-based [Tropical Rainfall Measuring Mission - Multi-satellite Precipitation Analysis (TMPA, both real-time and research version); Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN); the CPC MORPHing technique (CMORPH)] rainfall products. Daily and storm-total rainfall fields from each of these remote sensing products are compared to the reference data to obtain information about the range of errors we can expect from "observational data." The skill of the NWP models is quantified (1) by visual examination of the distribution of the errors in storm-total rainfall for the different lead times, together with numerical examination of the first three moments of the error distribution; and (2) relative to climatology at the daily scale. Considering these skill metrics, we conclude that the NWP models can provide skillful forecasts of TC rainfall with lead times up to 48 h, with no single NWP model consistently the best or worst.
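    The first verification step above, examining the first three moments of the storm-total error distribution, amounts to computing the bias, variance and skewness of forecast-minus-observed totals. A small sketch with hypothetical storm totals (the numbers are illustrative, not the study's data):

```python
import numpy as np

def error_moments(forecast, observed):
    """First three moments of the storm-total rainfall error:
    mean (bias), sample variance, and sample skewness."""
    err = np.asarray(forecast, float) - np.asarray(observed, float)
    mean = err.mean()
    var = err.var(ddof=1)
    # Fisher-Pearson sample skewness, no small-sample correction
    skew = ((err - mean) ** 3).mean() / err.std(ddof=0) ** 3
    return mean, var, skew

# hypothetical storm totals (mm) for five landfalling TCs
fcst = [120.0, 80.0, 200.0, 45.0, 150.0]
obs = [100.0, 90.0, 170.0, 50.0, 140.0]
bias, var, skew = error_moments(fcst, obs)
```

    A positive bias flags systematic over-forecasting of storm totals, the variance captures the spread of the errors, and the skewness indicates whether large over- or under-predictions dominate the tail.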

  12. On the Characterization of Rainfall Associated with U.S. Landfalling North Atlantic Tropical Cyclones Based on Satellite Data and Numerical Weather Prediction Outputs

    NASA Astrophysics Data System (ADS)

    Luitel, B. N.; Villarini, G.; Vecchi, G. A.

    2014-12-01

    When we think about tropical cyclones (TCs), the first things that come to mind are strong winds and storm surge affecting coastal areas. However, according to the Federal Emergency Management Agency (FEMA), 59% of the deaths caused by TCs since 1970 are due to freshwater flooding. Heavy rainfall associated with TCs accounts for 13% of heavy rainfall events nationwide for the June-October months, with this percentage being much higher if the focus is on the eastern and southern United States. This study focuses on the evaluation of precipitation associated with the North Atlantic TCs that affected the continental United States over the period 2007-2012. We evaluate the rainfall associated with these TCs using four satellite-based rainfall products: Tropical Rainfall Measuring Mission - Multi-satellite Precipitation Analysis (TMPA; both real-time and research version); Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN); and the Climate Prediction Center (CPC) MORPHing technique (CMORPH). As reference data we use gridded rainfall provided by CPC (Daily US Unified Gauge-Based Analysis of Precipitation). Rainfall fields from each of these satellite products are compared to the reference data, providing valuable information about the realism of these products in reproducing the rainfall associated with TCs affecting the continental United States. In addition to the satellite products, we evaluate the forecasted rainfall produced by five state-of-the-art numerical weather prediction (NWP) models: the European Centre for Medium-Range Weather Forecasts (ECMWF), UK Met Office (UKMO), National Centers for Environmental Prediction (NCEP), China Meteorological Administration (CMA), and Canadian Meteorological Center (CMC). The skill of these models in reproducing TC rainfall is quantified for different lead times and discussed in light of the performance of the satellite products.

  13. Seasonal drought predictability in Portugal using statistical-dynamical techniques

    NASA Astrophysics Data System (ADS)

    Ribeiro, A. F. S.; Pires, C. A. L.

    2016-08-01

    Atmospheric forecasting and predictability are important for promoting adaptation and mitigation measures that minimize drought impacts. This study estimates hybrid (statistical-dynamical) long-range forecasts of the regional drought index SPI (3 months) over homogeneous regions of mainland Portugal, based on forecasts from the UKMO operational forecasting system with lead times up to 6 months. ERA-Interim reanalysis data are used to build a set of SPI predictors integrating recent past information prior to the forecast launch. The advantage of combining predictors with both dynamical and statistical backgrounds in the prediction of drought conditions at different lags is then evaluated. A two-step hybridization procedure is performed, in which both forecasted and observed 500 hPa geopotential height fields are subjected to a PCA in order to use forecasted PCs and persistent PCs as predictors. The second hybridization step consists of a statistical/hybrid downscaling to the regional SPI, based on regression techniques, after pre-selection of the statistically significant predictors. The SPI forecasts and the added value of combining dynamical and statistical methods are evaluated in cross-validation mode, using R2 and binary event scores. Results are obtained for the four seasons; winter is found to be the most predictable season, and most of the predictive power lies in the large-scale fields from past observations. The hybridization improves the downscaling based on the forecasted PCs, since they provide complementary (though modest) information beyond that of the persistent PCs. These findings provide clues about the predictability of the SPI, particularly in Portugal, and may contribute to the predictability of crop yields and to guidance for users (such as farmers) in their decision-making processes.
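    The core of the two-step scheme, PCA of a geopotential-like field followed by linear regression of a drought index on the leading PCs, can be sketched as follows. This is a hedged illustration on synthetic data: the array sizes, regression coefficients and noise level are invented, and the real study additionally uses UKMO forecast fields, predictor pre-selection and cross-validation.

```python
import numpy as np

rng = np.random.default_rng(1)
nt, nx, npc = 120, 50, 3                       # months, gridpoints, retained PCs

# stand-in for a (time x gridpoint) field of Z500 anomalies
field = rng.standard_normal((nt, nx))
field -= field.mean(axis=0)                    # remove the time mean
U, s, Vt = np.linalg.svd(field, full_matrices=False)
pcs = U[:, :npc] * s[:npc]                     # leading principal components

# a synthetic SPI that truly depends on the PCs, plus noise
spi = pcs @ np.array([0.5, -0.3, 0.2]) + 0.3 * rng.standard_normal(nt)

# second step: multiple linear regression of the SPI on the PCs
X = np.column_stack([np.ones(nt), pcs])        # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, spi, rcond=None)
fit = X @ beta
r2 = 1.0 - ((spi - fit) ** 2).sum() / ((spi - spi.mean()) ** 2).sum()
```

    In practice the R2 would be computed in cross-validation, with the regression refit on each training fold, to avoid overstating the skill.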

  14. Regional Climate Change Hotspots over Africa

    NASA Astrophysics Data System (ADS)

    Anber, U.

    2009-04-01

    A Regional Climate Change Index (RCCI) is developed based on regional mean precipitation change, mean surface air temperature change, and changes in precipitation and temperature interannual variability. The RCCI is a comparative index designed to identify the regions most responsive to climate change, or Hot-Spots. The RCCI is calculated for seven land regions over North Africa and the Arabian region from the latest set of climate change projections by 14 global climate models for the A1B, A2 and B1 IPCC emission scenarios. The concept of a climate change Hot-Spot can be approached from the viewpoint of vulnerability or from that of climate response. In the former case, a Hot-Spot can be defined as a region for which potential climate change impacts on the environment or on different activity sectors can be particularly pronounced. In the latter case, a Hot-Spot can be defined as a region whose climate is especially responsive to global change. In particular, the characterization of response-based Hot-Spots can provide key information to identify and investigate climate change Hot-Spots based on results from a multi-model ensemble of climate change simulations performed by modeling groups from around the world as contributions to the Assessment Report of the Intergovernmental Panel on Climate Change (IPCC). The RCCI is defined based on four variables: change in regional mean surface air temperature relative to the global average temperature change (the Regional Warming Amplification Factor, RWAF); change in mean regional precipitation (∆P, as % of the present-day value); change in regional surface air temperature interannual variability (∆σT, as % of the present-day value); and change in regional precipitation interannual variability (∆σP, as % of the present-day value). In the definition of the RCCI it is important to include quantities other than the mean change, because mean changes are often not the only factors important for specific impacts. 
    We thus also include interannual variability, which is critical for many activity sectors, such as agriculture and water management. The RCCI is calculated for the above-mentioned set of global climate change simulations and is intercompared across regions to identify climate change Hot-Spots, that is, regions with the largest values of the RCCI. It is important to stress that the RCCI is a comparative index: a small RCCI value does not imply a small absolute change, only a small climate response compared to other regions. The models used are: CCMA-3-T47, CNRM-CM3, CSIRO-MK3, GFDL-CM2-0, GISS-ER, INMCM3, IPSL-CM4, MIROC3-2M, MIUB-ECHO-G, MPI-ECHAM5, MRI-CGCM2, NCAR-CCSM3, NCAR-PCM1, and UKMO-HADCM3. Note that the three IPCC emission scenarios, A1B, B1 and A2, almost encompass the entire IPCC scenario range, with A2 close to the high end of the range, B1 close to the low end, and A1B toward the middle. The model data are obtained from the IPCC site and are interpolated onto a common 1-degree grid to facilitate intercomparison. The RCCI is here defined as in Giorgi (2006), except that the entire year is divided into two six-month periods, DJFMAM and JJASON: RCCI = [n(∆P) + n(∆σP) + n(RWAF) + n(∆σT)]_DJFMAM + [n(∆P) + n(∆σP) + n(RWAF) + n(∆σT)]_JJASON (1)
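    Equation (1) can be sketched directly: each normalized change factor is binned into an integer score by n(), and the scores are summed over the two six-month seasons. The binning thresholds below are illustrative placeholders, not necessarily the exact thresholds used by Giorgi (2006).

```python
import numpy as np

def n(x, thresholds=(1.0, 1.5, 2.0)):
    """Giorgi-style binning: map a normalized change factor to an
    integer score 0..len(thresholds). Threshold values are assumed
    for illustration."""
    return int(np.searchsorted(thresholds, abs(x), side="right"))

def rcci(season_factors):
    """RCCI per Eq. (1): sum of n(dP) + n(dsigP) + n(RWAF) + n(dsigT)
    over the two six-month seasons (DJFMAM, JJASON)."""
    return sum(n(dP) + n(dsP) + n(rwaf) + n(dsT)
               for (dP, dsP, rwaf, dsT) in season_factors)

# hypothetical factors for one region: (dP, dsigP, RWAF, dsigT) per season
score = rcci([(1.2, 0.8, 1.6, 0.9),    # DJFMAM
              (2.1, 1.1, 1.4, 0.5)])   # JJASON
```

    Because the index is a sum of comparative scores, only the ranking of regions is meaningful; the absolute value of any single region's RCCI carries no physical units.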

  15. Towards a parameterization of convective wind gusts in Sahel

    NASA Astrophysics Data System (ADS)

    Largeron, Yann; Guichard, Françoise; Bouniol, Dominique; Couvreux, Fleur; Birch, Cathryn; Beucher, Florent

    2014-05-01

    West Africa is responsible for between 25 and 50% of the global emissions of mineral dust (cf. [Engelstaedter et al., 2006]), and these dust emissions have a large impact on climate (cf. [Carslaw et al., 2010]) and soil erosion. Numerous studies have focused on the quantification of dust emission fluxes from knowledge of soil surface characteristics, leading to the formulation of a threshold wind friction velocity (cf. [Marticorena and Bergametti, 1995]) above which dust can be uplifted. That flux varies with the cube of the surface wind speed above the threshold and is therefore particularly sensitive to the way the wind speed is modeled (cf. [Menut, 2008]). Moreover, in the Sahelian belt, about half of the dust uplift happens during isolated events in which moist deep convection generates violent cold-pool outflows and the associated high surface wind speeds. The representation of convectively generated winds therefore appears critical (cf. [Marsham et al., 2011], [Knippertz and Todd, 2012]). The present study is motivated by these issues and is carried out within the CAVIARS French National Research Agency (ANR) project. First, we examine the ERA-Interim reanalysis of the ECMWF, frequently used as an input wind field for off-line dust emission models (cf. [Pierre et al., 2012]). The comparison with high-frequency local measurements shows that, not unexpectedly, the increase of the surface wind speed from deep convection is not represented in the large-scale reanalysis. Therefore, following [Redelsperger et al., 2000], we propose a statistical approach to introduce a formulation of surface wind gusts during deep convection, based on the analysis of convection-permitting high-resolution simulations made with the UKMO atmospheric model (CASCADE project), the AROME operational model from Meteo-France, and the MesoNH large-eddy simulation model. High-frequency observations are also used to complement the analysis. 
    However, unlike [Redelsperger et al., 2000], who focused on the wet tropical Pacific region and linked wind gusts to convective precipitation rates alone, here we also analyse the subgrid wind distribution during convective events and quantify the statistical moments (variance, skewness and kurtosis) in terms of mean wind speed and convective indices such as DCAPE. The next step of this work will be to formulate a parameterization of the cold-pool convective gust from those probability density functions and analytical formulae obtained from basic energy budget models. References: [Carslaw et al., 2010] A review of natural aerosol interactions and feedbacks within the Earth system. Atmospheric Chemistry and Physics, 10(4):1701-1737. [Engelstaedter et al., 2006] North African dust emissions and transport. Earth-Science Reviews, 79(1):73-100. [Knippertz and Todd, 2012] Mineral dust aerosols over the Sahara: Meteorological controls on emission and transport and implications for modeling. Reviews of Geophysics, 50(1). [Marsham et al., 2011] The importance of the representation of deep convection for modeled dust-generating winds over West Africa during summer. Geophysical Research Letters, 38(16). [Marticorena and Bergametti, 1995] Modeling the atmospheric dust cycle: 1. Design of a soil-derived dust emission scheme. Journal of Geophysical Research, 100(D8):16415-16430. [Menut, 2008] Sensitivity of hourly Saharan dust emissions to NCEP and ECMWF modeled wind speed. Journal of Geophysical Research: Atmospheres (1984-2012), 113(D16). [Pierre et al., 2012] Impact of vegetation and soil moisture seasonal dynamics on dust emissions over the Sahel. Journal of Geophysical Research: Atmospheres (1984-2012), 117(D6). [Redelsperger et al., 2000] A parameterization of mesoscale enhancement of surface fluxes for large-scale models. Journal of Climate, 13(2):402-421.
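    The threshold-plus-cubic behaviour described above (no emission below the threshold, flux scaling with the cube of the wind speed above it) is easy to illustrate. The constant k, the threshold u_t, and the simplified functional form below are placeholders for illustration, not the Marticorena-Bergametti scheme itself.

```python
def dust_flux(u, u_t=7.0, k=1.0e-6):
    """Simplified threshold emission law: zero flux below the
    threshold wind u_t, cubic dependence on wind speed above it.
    k and u_t are placeholder values, not calibrated constants."""
    return k * u**3 if u > u_t else 0.0

# a convective gust doubling the wind from 6 to 12 m/s switches
# emission on and then responds to the *cube* of the speed
weak, gusty = dust_flux(6.0), dust_flux(12.0)
```

    This nonlinearity is why unresolved convective gusts matter so much: a reanalysis wind that averages out cold-pool outflows can miss most of the emitted dust even when its mean wind is accurate.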

  16. Global Effects of Superparameterization on Hydrothermal Land-Atmosphere Coupling on Multiple Timescales

    NASA Astrophysics Data System (ADS)

    Qin, Hongchen; Pritchard, Michael S.; Kooperman, Gabriel J.; Parishani, Hossein

    2018-02-01

    Many conventional general circulation models (GCMs) in the Global Land-Atmosphere Coupling Experiment (GLACE) tend to produce what is now recognized as overly strong land-atmosphere (L-A) coupling. We investigate the effects of cloud superparameterization (SP) on L-A coupling on timescales beyond the diurnal, where it has recently been shown to have a favorable muting effect hydrologically. Using the Community Atmosphere Model v3.5 (CAM3.5) and its superparameterized counterpart SPCAM3.5, we conducted soil moisture interference experiments following the GLACE and Atmospheric Model Intercomparison Project (AMIP) protocols. The results show that, on weekly-to-subseasonal timescales, SP also mutes hydrologic L-A coupling. This is detectable globally and happens through the evapotranspiration-precipitation segment. On seasonal timescales, however, SP does not exhibit detectable effects on hydrologic L-A coupling. Two robust regional effects of SP on thermal L-A coupling have also been explored. Over the Arabian Peninsula, SP reduces thermal L-A coupling through a straightforward control by mean rainfall reduction. More counterintuitively, over the Southwestern US and Northern Mexico, SP enhances thermal L-A coupling in a way that is independent of rainfall and soil moisture. This signal is associated with a systematic and previously unrecognized effect of SP that produces an amplified Bowen ratio, and it is detectable in multiple SP model versions and experiment designs. In addition to amplifying the present-day Bowen ratio, SP is found to amplify the climate sensitivity of the Bowen ratio as well, which likely plays a role in influencing climate change predictions at the L-A interface.

  17. The Good, the Bad, and the Ugly: Numerical Prediction for Hurricane Juan (2003)

    NASA Astrophysics Data System (ADS)

    Gyakum, J.; McTaggart-Cowan, R.

    2004-05-01

    The range of accuracy of the numerical weather prediction (NWP) guidance for the landfall of Hurricane Juan (2003), from nearly perfect to nearly useless, motivates a study of the NWP forecast errors on 28-29 September 2003 in the eastern North Atlantic. Although the forecasts issued over the period were of very high quality, this is primarily because of the diligence of the forecasters, and not because of the reliability of the numerical predictions provided to them by the North American operational centers and the research community. A bifurcation in the forecast fields from the various centers and institutes occurred beginning with the 0000 UTC run of 28 September and continuing until landfall just after 0000 UTC on 29 September. The GFS (NCEP), Eta (NCEP), GEM (Canadian Meteorological Centre; CMC), and MC2 (McGill) forecast models all showed an extremely weak (minimum SLP above 1000 hPa) remnant vortex moving north-northwestward into the Gulf of Maine and merging with a diabatically developed surface low offshore. The GFS uses a vortex-relocation scheme, the Eta a vortex bogus, and the GEM and MC2 are run on CMC analyses that contain no enhanced vortex. The UK Met Office operational, GFDL, and NOGAPS (US Navy) forecast models all ran a small-scale hurricane-like vortex directly into Nova Scotia and verified very well for this case. The UKMO model uses synthetic observations to enhance structures in poorly forecast areas during the analysis cycle, and both the GFDL and NOGAPS models use advanced idealized vortex bogusing in their initial conditions. The quality of the McGill MC2 forecast is found to be significantly enhanced using a bogusing technique similar to that used in the initialization of the successful forecast models. 
A verification of the improved forecast is presented along with a discussion of the need for operational quality control of the background fields in the analysis cycle and for proper representation of strong, small-scale tropical vortices.

  18. Use of podcast technology to facilitate education, communication and dissemination in palliative care: the development of the AmiPal podcast

    PubMed Central

    Monnery, Daniel; Reid, Victoria Louise; Chapman, Laura

    2017-01-01

    Objectives Podcasts have the potential to facilitate communication about palliative care with researchers, policymakers and the public. Some podcasts about palliative care are available; however, this is not reflected in the academic literature. Further study is needed to evaluate the utility of podcasts to facilitate knowledge transfer about subjects related to palliative care. The aims of this paper are to (1) describe the development of a palliative care podcast according to international recommendations for podcast quality and (2) conduct an analysis of podcast listenership over a 14-month period. Methods The podcast was designed according to internationally agreed quality indicators for medical education podcasts. The podcast was published on SoundCloud and was promoted via social media. Data were analysed for frequency of plays and geographical location between January 2015 and February 2016. Results Twenty podcasts were developed, which were listened to 3036 times (an average of 217 monthly plays). The Rich Site Summary feed was the most popular way to access the podcast (n=1937; 64%). The mean duration of each podcast was 10 min (range 3–21 min). The podcast was listened to in 68 different countries and was most popular in English-speaking areas, of which the USA (n=1372, 45.2%), UK (n=661, 21.8%) and Canada (n=221, 7.3%) were the most common. Conclusions A palliative care podcast is a method to facilitate palliative care discussion with a global audience. Podcasts offer the potential to develop educational content and promote research dissemination. Future work should focus on content development, quality metrics and impact analysis, as this form of digital communication is likely to increase and engage wider society. PMID:27580942

  19. Applying an economical scale-aware PDF-based turbulence closure model in NOAA NCEP GCMs

    NASA Astrophysics Data System (ADS)

    Belochitski, A.; Krueger, S. K.; Moorthi, S.; Bogenschutz, P.; Pincus, R.

    2016-12-01

    A novel unified representation of sub-grid scale (SGS) turbulence, cloudiness, and shallow convection is being implemented into the NOAA NCEP Global Forecast System (GFS) general circulation model. The approach, known as Simplified High Order Closure (SHOC), is based on predicting a joint PDF of SGS thermodynamic variables and vertical velocity and using it to diagnose turbulent diffusion coefficients, SGS fluxes, condensation and cloudiness. Unlike other similar methods, only one new prognostic variable, turbulent kinetic energy (TKE), needs to be introduced, making the technique computationally efficient. SHOC is now incorporated into a version of GFS, as well as into the next generation of the NCEP global model, the NOAA Environmental Modeling System (NEMS). Turbulent diffusion coefficients computed by SHOC are now used in place of those produced by the boundary layer turbulence and shallow convection parameterizations. The large-scale microphysics scheme is no longer used to calculate cloud fraction or large-scale condensation/deposition; instead, SHOC provides these variables. The radiative transfer parameterization uses cloudiness computed by SHOC. Outstanding problems include high-level tropical cloud fraction being too high in SHOC runs, possibly related to the interaction of SHOC with condensate detrained from deep convection. Future work will consist of evaluating model performance and tuning the physics if necessary, by performing medium-range NWP forecasts with prescribed initial conditions and AMIP-type climate tests with prescribed SSTs. Depending on the results, the model will be tuned or the parameterizations modified. Next, SHOC will be implemented in the NCEP CFS, and tuned and evaluated for climate applications: seasonal prediction and long coupled climate runs. The impact of the new physics on ENSO, MJO, ISO, monsoon variability, etc. will be examined.
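    The assumed-PDF idea behind diagnosing cloudiness from a predicted distribution can be illustrated in one dimension: if the subgrid saturation deficit is taken to be Gaussian with a predicted mean and width, the cloud fraction is the probability mass above saturation. This is a generic single-Gaussian sketch of the technique, not SHOC's actual joint multivariate PDF.

```python
import math

def cloud_fraction_gaussian(s_mean, s_std):
    """Cloud fraction as P(s > 0) for a subgrid saturation deficit
    s ~ N(s_mean, s_std^2). Generic assumed-PDF sketch; SHOC's real
    closure predicts a joint PDF of thermodynamic variables and w."""
    if s_std <= 0.0:
        return 1.0 if s_mean > 0.0 else 0.0
    # Gaussian tail probability via the error function
    return 0.5 * (1.0 + math.erf(s_mean / (s_std * math.sqrt(2.0))))

cf_dry = cloud_fraction_gaussian(-1.0e-3, 5.0e-4)   # grid mean subsaturated
cf_sat = cloud_fraction_gaussian(+1.0e-3, 5.0e-4)   # grid mean supersaturated
```

    The appeal of the approach is that partial cloudiness appears even when the grid-mean state is subsaturated, because the predicted subgrid width lets part of the distribution exceed saturation.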

  20. Coupling between strong warm ENSO events and the phase of the stratospheric QBO.

    NASA Astrophysics Data System (ADS)

    Christiansen, Bo

    2017-04-01

    Although in general there are no significant long-term correlations between the QBO and ENSO in observations, we find that the QBO and ENSO were aligned in the 3 to 4 years after the three strong warm ENSO events of 1982, 1997, and 2015. We study this possible connection between the QBO and ENSO with a new version of the EC-Earth model, which includes non-orographic gravity waves and a well-modeled QBO. We analyze the modeled QBO in ensembles consisting of 10 AMIP-type experiments with climatological SSTs and 10 experiments with observed daily SSTs. The model experiments cover the period 1982-2013. For ENSO we use the multivariate index (MEI). As expected, the coherence is strong and statistically significant in the equatorial troposphere in the ensemble with observed SSTs. Here the coherence is a measure of the alignment of the ensemble members. In the ensemble with observed SSTs we find a strong and significant alignment of the ensemble members in the equatorial stratospheric winds in the 2 to 4 years after the strong ENSO event of 1997. This alignment also includes the observed QBO. No such alignment is found in the ensemble with climatological SSTs. These results indicate that strong warm ENSO events can directly influence the phase of the QBO. An open, and possibly related, question is what caused the anomalous QBO in 2016. This behaviour, unprecedented in the 50-60 years of available data, has been described as a hiccup or a death spiral. At least it is clear that over the last 18 months the QBO has been stuck in the same corner of the phase space spanned by its two leading principal components. The possible connection to ENSO will be investigated.

  1. Global trends in significant wave height and marine wind speed from the ERA-20CM

    NASA Astrophysics Data System (ADS)

    Aarnes, Ole Johan; Breivik, Øyvind

    2016-04-01

    The ERA-20CM is one of the latest additions to the ERA series produced at the European Centre for Medium-Range Weather Forecasts (ECMWF). This 10-member ensemble is generated with a version of the Integrated Forecast System (IFS), a coupled atmosphere-wave model. The model integration is run as an AMIP-style (Atmospheric Model Intercomparison Project) experiment, constrained by CMIP5-recommended radiative forcing and different realizations of sea-surface temperature (SST) and sea-ice cover (SIC) prescribed by HadISST2 (Met Office Hadley Centre). While the ERA-20CM is unable to reproduce the actual synoptic conditions, it is designed to offer a realistic statistical representation of the past climate, spanning the period 1899-2010. In this study we investigate global trends in significant wave height and marine wind speed based on ERA-20CM, using monthly mean data, upper percentiles and monthly/annual maxima. The aim of the study is to assess the quality of the trends and how these estimates are affected by the different SST and SIC realizations. Global trends are compared against corresponding estimates obtained with ERA-Interim (1979-2009), but also cross-checked against ERA-20C, an ECMWF pilot reanalysis of the 20th century known to be most trustworthy in the Northern Hemisphere extratropics. Over the period 1900-2009, the 10-member ensemble yields trends mainly within +/- 5% per century. However, significant trends of opposite sign are found locally. Certain areas, like the eastern equatorial Pacific, which is highly affected by the El Niño-Southern Oscillation, show stronger trends. In general, trends based on statistical quantities further into the tail of the distribution are found to be less reliable.

  2. Water Cycle Variability over the Global Oceans Estimated Using Homogenized Reanalysis Fluxes

    NASA Astrophysics Data System (ADS)

    Robertson, F. R.; Bosilovich, M. G.; Roberts, J. B.

    2017-12-01

    Establishing consistent records of the global water cycle fluxes and their variations is particularly difficult over oceans, where the density of in situ observations varies enormously with time, satellite retrievals of flux processes are sparse, and reanalyses are uncertain. The latter have the positive attribute of assimilating diverse observations to provide boundary fluxes and transports but are hindered by at least two factors: (1) the physical parameterizations are imperfect, and (2) the forcing data availability and quality vary greatly in time and, thus, can induce time-dependent, false signals of climate variability. Here we examine the prospects for homogenization of reanalysis records, that is, identifying and greatly minimizing non-physical signals. Our analysis focuses on the satellite era, 1980 to near-present. The strategy involves three atmospheric reanalysis systems: (1) the NASA MERRA-2, (2) the newest reanalysis produced by the Japan Meteorological Agency, JRA-55, and (3) the European Centre for Medium-Range Weather Forecasts 20th-century reanalysis, ERA-20C. MERRA-2 and ERA-20C are also accompanied by 10-member AMIP integrations, and JRA-55 by a reanalysis using only conventional observations, JRA-55C. Differencing these latter integrations from the more comprehensive reanalyses helps provide a clearer picture of the impact of satellite observations by removing the effects of SST forcing. This facilitates the use of principal component analysis as a tool to identify and remove non-physical signals. We then use these homogenized E, P and moisture transports to examine the consistency of diagnostics of thermodynamic and hydrologic scaling, especially the P-E pattern amplification or the "wet-get-wetter, dry-get-drier" response. Prospects for further validation with new satellite-based turbulent flux retrievals are discussed.
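    The homogenization idea described above, using principal component analysis to isolate and remove non-physical signals, can be sketched with an SVD; this is a minimal illustration under assumed names, not the authors' actual procedure:

```python
import numpy as np

def remove_leading_modes(anom, k=1):
    """Subtract the k leading principal-component modes from an anomaly
    matrix (time x space). In the homogenization context sketched
    above, the leading mode of a reanalysis-minus-AMIP difference
    field would stand in for a non-physical signal."""
    u, s, vt = np.linalg.svd(anom, full_matrices=False)
    leading = (u[:, :k] * s[:k]) @ vt[:k]   # top-k mode reconstruction
    return anom - leading
```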

  3. Acoustic analysis of shock production by very high-altitude meteors—I: infrasonic observations, dynamics and luminosity

    NASA Astrophysics Data System (ADS)

    Brown, P. G.; Edwards, W. N.; Revelle, D. O.; Spurny, P.

    2007-04-01

    Four very high-velocity and high-altitude meteors (a Leonid, two Perseids and a high-speed sporadic fireball) have been unambiguously detected at the ground both optically, using precision all-sky cameras, and acoustically, via infrasound and seismic signals. Infrasound arriving from altitudes of over 100 km is not very common, but has been previously observed for re-entering spacecraft. To our knowledge, however, this is the first reported detection of such high-altitude infrasound unambiguously from meteors. These fragile meteoroids were found to generate acoustic waves at source heights ranging from 80 to 110 km, with most acoustic energy being generated near the lowest heights. Time residuals between observed acoustic onset and model predictions based on ray-tracing points along the photographically determined trajectories indicate that the upper winds given by the UK Meteorological Office (UKMO) model systematically produce lower residuals for first arrivals than those from the Naval Research Laboratory Horizontal Wind Model (HWM). Average source energies for three of the four events from acoustic data alone are found to be in the range of 2×10^8-10^9 J. One event, EN010803, had unusually favorable geometry for acoustic detection at the ground and therefore has the smallest photometric source energy (10^-5 kt; 6×10^7 J) of any meteor detected infrasonically. When compared to the total optical radiation recorded by film, the results for the three events produce equivalent integral panchromatic luminous efficiencies of 3-7%, within a factor of two of the values proposed by Ceplecha and McCrosky [1976. Fireball end heights—a diagnostic for the structure of meteoric material. Journal of Geophysical Research 81, 6257-6275] for the velocity range (55-70 km s^-1) appropriate to our events.
Application of these findings to meteor showers in general suggests that the Geminid shower should be the most prolific producer of infrasound-detectable meteors at the ground of all the major showers, with one Geminid fireball producing detectable infrasound from a given location every ˜400 h of observation.

  4. Development and application of an atmospheric-hydrologic-hydraulic flood forecasting model driven by TIGGE ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Bao, Hongjun; Zhao, Linna

    2012-02-01

    A coupled atmospheric-hydrologic-hydraulic ensemble flood forecasting model, driven by The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) data, has been developed for flood forecasting over the Huaihe River. The incorporation of numerical weather prediction (NWP) information into flood forecasting systems may increase forecast lead time from a few hours to a few days. A single NWP model forecast from a single forecast center, however, is insufficient, as it involves considerable non-predictable uncertainties and leads to a high number of false alarms. The availability of global ensemble NWP systems through TIGGE offers a new opportunity for flood forecasting. The Xinanjiang model used for hydrological rainfall-runoff modeling and the one-dimensional unsteady flow model applied to channel flood routing are coupled with ensemble weather predictions based on the TIGGE data from the Canadian Meteorological Centre (CMC), the European Centre for Medium-Range Weather Forecasts (ECMWF), the UK Met Office (UKMO), and the US National Centers for Environmental Prediction (NCEP). The developed ensemble flood forecasting model is applied to flood forecasting of the 2007 flood season as a test case. The test case covers the upper reaches of the Huaihe River above Lutaizi station, with flood diversion and retarding areas. The input flood discharge hydrograph from the main channel to the flood diversion area is estimated with a fixed split ratio of the main-channel discharge. The flood flow inside the flood retarding area is calculated by treating the area as a reservoir with the water balance method. The Muskingum method is used for flood routing in the flood diversion area. A probabilistic discharge and flood inundation forecast is provided as the end product to study the potential benefits of using the TIGGE ensemble forecasts.
The results demonstrate satisfactory flood forecasting, with clear probability signals of floods up to a few days in advance, and show that TIGGE ensemble forecast data are a promising tool for forecasting flood inundation, comparable with forecasts driven by rain-gauge observations.
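    The Muskingum routing step named above follows the standard recurrence O_t = C0*I_t + C1*I_{t-1} + C2*O_{t-1}; the parameter values below are illustrative, not the Huaihe calibration:

```python
def muskingum_route(inflow, K=12.0, X=0.2, dt=6.0):
    """Muskingum channel routing. K (h) is the storage constant,
    X the weighting factor, dt (h) the time step; values here are
    illustrative only."""
    denom = 2.0 * K * (1.0 - X) + dt
    c0 = (dt - 2.0 * K * X) / denom
    c1 = (dt + 2.0 * K * X) / denom
    c2 = (2.0 * K * (1.0 - X) - dt) / denom   # c0 + c1 + c2 == 1
    outflow = [inflow[0]]                      # assume initial steady state
    for i_prev, i_cur in zip(inflow, inflow[1:]):
        outflow.append(c0 * i_cur + c1 * i_prev + c2 * outflow[-1])
    return outflow
```

    Routing a flood wave with this recurrence attenuates and delays the inflow peak, which is the behavior wanted for the diversion-area hydrograph.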

  5. Using ARM Observations to Evaluate Climate Model Representation of Land-Atmosphere Coupling on the U.S. Southern Great Plains

    NASA Astrophysics Data System (ADS)

    Phillips, T. J.; Klein, S. A.; Ma, H. Y.; Tang, Q.

    2016-12-01

    Statistically significant coupling between summertime soil moisture and various atmospheric variables has been observed at the U.S. Southern Great Plains (SGP) facilities maintained by the U.S. DOE Atmospheric Radiation Measurement (ARM) program (Phillips and Klein, 2014 JGR). In the current study, we employ several independent measurements of shallow-depth soil moisture (SM) and of the surface evaporative fraction (EF) over multiple summers in order to estimate the range of SM-EF coupling strength at seven sites, and to approximate the SGP regional-scale coupling strength (and its uncertainty). We will use this estimate of regional-scale SM-EF coupling strength to evaluate its representation in version 5.1 of the global Community Atmosphere Model (CAM5.1) coupled to the CLM4 land model. Two experimental cases are considered for the 2003-2011 study period: 1) an Atmospheric Model Intercomparison Project (AMIP) run with historically observed sea surface temperatures specified, and 2) a more constrained hindcast run in which the CAM5.1 atmospheric state is initialized each day from the ERA-Interim reanalysis, while the CLM4 initial conditions are obtained from an offline run of the land model using observed surface net radiation, precipitation, and wind as forcings. These twin experimental cases allow a distinction to be drawn between the land-atmosphere coupling in the free-running CAM5.1/CLM4 model and that in which the land and atmospheric states are constrained to remain closer to "reality". The constrained hindcast case, for example, should allow model errors in coupling strength to be related more closely to potential deficiencies in land-surface or atmospheric boundary-layer parameterizations. Acknowledgments: This work was funded by the U.S. Department of Energy Office of Science and was performed at the Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
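    One simple way to quantify the SM-EF coupling strength discussed above is the correlation of daily soil moisture with the evaporative fraction EF = LE/(LE+H); a minimal sketch with hypothetical helper names, not the study's statistics:

```python
import numpy as np

def evaporative_fraction(latent, sensible):
    """EF = LE / (LE + H) from surface turbulent heat fluxes (W m^-2)."""
    return latent / (latent + sensible)

def coupling_strength(soil_moisture, ef):
    """Pearson correlation of daily soil moisture and evaporative
    fraction -- one simple stand-in coupling-strength measure."""
    return float(np.corrcoef(soil_moisture, ef)[0, 1])
```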

  6. Use of podcast technology to facilitate education, communication and dissemination in palliative care: the development of the AmiPal podcast.

    PubMed

    Nwosu, Amara Callistus; Monnery, Daniel; Reid, Victoria Louise; Chapman, Laura

    2017-06-01

    Podcasts have the potential to facilitate communication about palliative care with researchers, policymakers and the public. Some podcasts about palliative care are available; however, this is not reflected in the academic literature. Further study is needed to evaluate the utility of podcasts to facilitate knowledge transfer about subjects related to palliative care. The aims of this paper are to (1) describe the development of a palliative care podcast according to international recommendations for podcast quality and (2) conduct an analysis of podcast listenership over a 14-month period. The podcast was designed according to internationally agreed quality indicators for medical education podcasts. The podcast was published on SoundCloud and was promoted via social media. Data were analysed for frequency of plays and geographical location between January 2015 and February 2016. 20 podcasts were developed, which were listened to 3036 times (an average of 217 monthly plays). The Rich Site Summary feed was the most popular way to access the podcast (n=1937; 64%). The mean duration of each podcast was 10 min (range 3-21 min). The podcast was listened to in 68 different countries and was most popular in English-speaking areas, of which the USA (n=1372, 45.2%), UK (n=661, 21.8%) and Canada (n=221, 7.3%) were most common. A palliative care podcast is a method to facilitate palliative care discussion with a global audience. Podcasts offer the potential to develop educational content and promote research dissemination. Future work should focus on content development, quality metrics and impact analysis, as this form of digital communication is likely to increase and engage wider society. Published by the BMJ Publishing Group Limited.

  7. Performance of the multi-model SREPS precipitation probabilistic forecast over Mediterranean area

    NASA Astrophysics Data System (ADS)

    Callado, A.; Escribà, P.; Santos, C.; Santos-Muñoz, D.; Simarro, J.; García-Moya, J. A.

    2009-09-01

    The performance of the Short-Range Ensemble Prediction System (SREPS) probabilistic precipitation forecast over the Mediterranean area has been evaluated by comparison with both an Atlantic-European area excluding the first, and a more general area encompassing the two. The main aim is to assess whether the performance of the system, given its meso-alpha horizontal resolution of 25 kilometres, is degraded over the Mediterranean area, where mesoscale meteorological events play a more important role than in the Atlantic-European area, which is more tied to the synoptic scale under Atlantic influence. Furthermore, two different verification methods have been applied and compared for the three areas in order to assess performance. The SREPS is a daily experimental LAM EPS focused on the short range (up to 72 hours) which has been developed at the Spanish Meteorological Agency (AEMET). To take model errors into account implicitly, five independent limited area models are used (COSMO (COSMO), HIRLAM (HIRLAM Consortium), HRM (DWD), MM5 (NOAA) and UM-NAE (UKMO)), and in order to sample the initial and boundary condition uncertainties each model is integrated using data from four different global deterministic models (GFS (NCEP), GME (DWD), IFS (ECMWF) and UM (UKMO)). As a result, crossing models and initial conditions, the EPS is composed of 20 members. The underlying idea is that ensemble performance improves insofar as each member itself performs as well as possible, i.e. the best operational configurations of the limited area models are combined with the best global deterministic model configurations initialized with the best analyses. For this reason neither global EPS members as initial conditions nor different model settings such as multi-parameterizations or multi-parameters are used to generate SREPS.
The performance over the three areas has been assessed focusing on 24-hour accumulated precipitation with four usual forecasting thresholds: 1, 5, 10 and 20 mm. A standard probabilistic verification exercise (following ECMWF recommendations) has been carried out, assessing quality with well-known properties like reliability, resolution and discrimination, using usual performance measures: Reliability (Attributes) Diagram, Brier Score and Brier Skill Score decomposition, Relative Operating Characteristic (ROC) and ROC area. The value of the forecasts with respect to sample climatology is shown with Relative Value envelopes. This exercise has been carried out for a one-year period (May 2007 to May 2008). Observed precipitation data from High Resolution (HR) networks over Europe have been used as reference. To avoid the potential lack of statistical significance due to spatial dependence between close observations, up-scaled processed observations have been used, provided by ECMWF, which collects the raw data from different member and cooperating states over Europe. This advanced up-scaling methodology has the advantage of being more independent of the density of precipitation observations than the more classical simple methodology of interpolating the model outputs to the observation station points. In particular, the observations have been up-scaled to a 0.25ºx0.25º box, taking each box as representative only when more than five observations are available in it. In the first verification method the box average is taken, and in the second a set of quantiles is considered, specifically the 10, 25, 50, 75 and 90 quantiles. The difference between the methods is that the first takes a single value over each box as representative of precipitation, whereas the second takes a probability density function as the representation of precipitation over the box, thus introducing uncertainty (related to spatial distribution) in the observations.
The results are consistent, and show that in general SREPS is a reliable probabilistic forecasting system for the three selected areas. Concerning performance over the different regions, the SREPS probabilistic precipitation forecasts over the selected Mediterranean area have slightly less reliability and resolution than over the North Europe area, especially for the higher thresholds of 10 and 20 mm. These results suggest that SREPS's representation of mesoscale meteorological events around the Mediterranean basin has to be improved, and probably also its representation of orography-related processes such as orographic enhancement of precipitation. It is therefore suggested that the predictability skill of the SREPS system around the Mediterranean could be expected to improve if the horizontal and vertical resolution of each limited area model in the system were increased to resolve the meso-beta scale. When comparing the two verification methods, one using the up-scaled box average and the other using an up-scaled set of quantiles (i.e. a box PDF), it is shown that the validation of the probabilistic forecast is considerably more consistent in the latter method, when uncertainties in the observations are introduced, and probably gives a more realistic idea of performance.
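    The Brier score and Brier skill score used in the verification above have standard definitions that can be sketched as follows; this is an illustration, not AEMET's verification code:

```python
import numpy as np

def brier_score(prob, outcome):
    """Brier score of probability forecasts against binary outcomes:
    the mean squared difference between forecast probability and
    the 0/1 observation."""
    prob, outcome = np.asarray(prob, float), np.asarray(outcome, float)
    return float(np.mean((prob - outcome) ** 2))

def brier_skill_score(prob, outcome):
    """Skill relative to a constant sample-climatology forecast;
    positive values beat climatology, 1 is a perfect forecast."""
    outcome = np.asarray(outcome, float)
    clim = np.full_like(outcome, outcome.mean())
    return 1.0 - brier_score(prob, outcome) / brier_score(clim, outcome)
```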

  8. Potential Carbon Stock Changes in Arizona's Ecosystems Due to Projected Climate Change

    NASA Astrophysics Data System (ADS)

    Finley, B. K.; Ironside, K.; Hungate, B. A.; Hurteau, M.; Koch, G. W.

    2011-12-01

    Climate change can alter the role of plants and soils as sources or sinks of atmospheric carbon dioxide and result in changes in long-term carbon storage. To understand the sensitivity of Arizona's ecosystems to climate change, we quantified the present carbon stocks in Arizona's major ecosystem types using the NASA-CASA (Carnegie Ames Stanford Approach) model. Carbon stocks for each vegetation type included surface mineral soil, dead wood, litter, standing wood and live leaf biomass. The total Arizona ecosystem carbon stock is presently 1775 MMtC, 545 MMtC of which is in Pinus ponderosa and Pinus edulis forests and woodlands. Evergreen forest vegetation, predominately Pinus ponderosa, has the largest current C density at 11.3 kgC/m2, while Pinus edulis woodlands have a C density of 6.0 kgC/m2. A change in climate will alter the suitable range for each tree species, and consequently the amount of C stored. Trees within their present habitat ranges are projected to suffer widespread mortality and likely will be replaced by herbaceous species, resulting in a loss of stored C. We evaluated the C storage implications over the 2010 to 2099 period of climate change based on output from GCMs with contrasting projections for the southwestern US: MPI-ECHAM5, which projects warming and reduced precipitation, and UKMO-HadGEM, which projects warming and increased precipitation. These projected changes are end points of a spectrum of possible future climate scenarios. The vegetation distribution models used describe potential suitable habitat, and we assumed that each vegetation type would grow one-third of the way to full C density in each 30-year period up to 2099. With the increasing temperature and decreasing precipitation projected under the MPI-ECHAM5 model, P. ponderosa and P. edulis vegetation show a decrease in carbon stored from 545 MMtC presently to 116 MMtC.
Under UKMO-HadGEM's combined increase in temperature and precipitation, C storage in these vegetation types is projected to increase to 808 MMtC. Our results indicate that future C storage in Arizona is highly dependent on precipitation. Given that most climate models for the Southwest predict a more arid future, it is likely that C storage will decrease in Arizona ecosystems, as it has in response to recent droughts, reducing mitigation of rising human emissions.
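    The growth rule stated above (one-third of the way to full C density per 30-year period) can be sketched as a simple interpolation; the linear reading of the rule, and the function name, are assumptions for illustration:

```python
def projected_density(current, target, periods_elapsed):
    """Assumed reading of the growth rule above: after n 30-year
    periods the carbon density has moved n/3 of the way from the
    current value to the full density of the projected vegetation
    type, reaching it by 2099 (three periods)."""
    frac = min(periods_elapsed, 3) / 3.0
    return current + frac * (target - current)
```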

  9. Studies of the polar MLT region using SATI airglow measurements

    NASA Astrophysics Data System (ADS)

    Cho, Youngmin

    To investigate atmospheric dynamics of the MLT (Mesosphere and Lower Thermosphere) region, a ground-based instrument called SATI (Spectral Airglow Temperature Imager) was developed at York University. The rotational temperatures and emission rates of the OH (6-2) Meinel band and the O2 (0-1) Atmospheric band have been measured in the MLT region by the SATI instrument at Resolute Bay (74.68°N, 94.90°W) since November, 2001, and at the King Sejong station (62.22°S, 58.75°W) since February, 2002. The MLT measurements are examined for periodic oscillations in the ambient temperature and airglow emission rate. A dominant and coherent 4-hr oscillation is seen in both the OH and O2 temperature and emission rate at Resolute Bay in November, 2001. Tidal variation with a 12 hour period is shown in hourly averaged temperatures of the season 2001--2002 and the season 2003--2004. In addition, planetary waves with periods of 3 and 4.5 days are also seen in a longer interval. The observations at high latitudes have revealed that temperatures and emission rates are higher around the winter solstice. MLT cooling events were found at Resolute Bay in December, 2001 and February, 2002. They are compared with the UKMO (UK Meteorological Office) stratospheric assimilated data, and the MLT coolings coincide in time with the stratospheric warmings. A consistent inverse relationship of the OH temperatures and temperatures at 0.316 hPa is presented in the comparison. In previous studies of wave perturbations, the background (mean) values were normally subtracted from the instantaneous signal, but in the present investigation this was not done, allowing the long-term relationship to be examined. A positive relationship of the temperature and emission rate is seen from the SATI measurements for both short and long-term variations, suggesting that similar dynamical processes are responsible for both. 
This relationship is supported by satellite data from the SABER (Sounding of the Atmosphere using Broadband Emission Radiometry) instrument. The correlation is compared with the result of a simple atmospheric model based on the dynamical and chemical processes involved in the diurnal tide, and the model results are in good agreement with the observations.
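    The periodicity analysis described above (4-hr, 12-hr and planetary-wave oscillations) can be illustrated with a simple FFT-based period finder; the helper and its defaults are hypothetical, not the SATI processing chain:

```python
import numpy as np

def dominant_period_hours(series, dt_hours=1.0):
    """Locate the dominant oscillation period from the FFT power
    spectrum of an evenly sampled series (hours between samples)."""
    x = np.asarray(series, float)
    x = x - x.mean()                         # drop the zero-frequency mean
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=dt_hours)
    k = 1 + np.argmax(power[1:])             # skip the DC bin
    return 1.0 / freqs[k]
```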

  10. Using ensembles in water management: forecasting dry and wet episodes

    NASA Astrophysics Data System (ADS)

    van het Schip-Haverkamp, Tessa; van den Berg, Wim; van de Beek, Remco

    2015-04-01

    Extreme weather situations such as droughts and extensive precipitation are becoming more frequent, which makes it more important to obtain accurate weather forecasts for the short and long term. Ensembles can provide a solution in the form of scenario forecasts. MeteoGroup uses ensembles in a new forecasting technique which presents a number of weather scenarios for a dynamical water management project, called Water-Rijk, in which water storage and water retention play a large role. The Water-Rijk is part of Park Lingezegen, which is located between Arnhem and Nijmegen in the Netherlands. In collaboration with the University of Wageningen, Alterra and Eijkelkamp, a forecasting system has been developed for this area which can provide water boards with a number of weather and hydrology scenarios in order to assist in the decision whether or not water retention or water storage is necessary in the near future. In order to forecast drought and extensive precipitation, the difference 'precipitation minus evaporation' is used as a measure of drought in the weather forecasts. In case of an upcoming drought this difference will take larger negative values; in case of a wet episode, it will be positive. The Makkink potential evaporation is used, which gives the most accurate potential evaporation values during the summer, when evaporation plays an important role in the availability of surface water. Scenarios are determined by reducing the large number of forecasts in the ensemble to a number of averaged members, each with its own likelihood of occurrence. For the Water-Rijk project five scenario forecasts are calculated: extreme dry, dry, normal, wet and extreme wet. These scenarios are constructed for two forecasting periods, each using its own ensemble technique: up to 48 hours ahead and up to 15 days ahead.
The 48-hour forecast uses an ensemble constructed from forecasts of multiple high-resolution regional models: UKMO's Euro4 model, the ECMWF model, WRF and Hirlam. Using multiple model runs and additional post-processing, an ensemble can be created from non-ensemble models. The 15-day forecast uses the ECMWF Ensemble Prediction System forecast, from which scenarios can be deduced directly. A combination of the ensembles from the two forecasting periods is used in order to have the highest possible resolution of the forecast for the first 48 hours, followed by the lower-resolution long-term forecast.
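    The reduction of an ensemble to five scenario values described above can be sketched with quantiles across members; the quantile levels chosen here are illustrative, not MeteoGroup's actual reduction scheme:

```python
import numpy as np

def scenario_values(members, levels=(0.05, 0.25, 0.5, 0.75, 0.95)):
    """Collapse ensemble members (e.g. accumulated precipitation minus
    evaporation) into five scenario values -- extreme dry, dry, normal,
    wet, extreme wet -- by taking quantiles across the members."""
    return np.quantile(np.asarray(members, float), levels)
```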

  11. Effect of the tropical Pacific and Indian Ocean warming since the late 1970s on wintertime Northern Hemispheric atmospheric circulation and East Asian climate interdecadal changes

    NASA Astrophysics Data System (ADS)

    Chu, Cuijiao; Yang, Xiu-Qun; Sun, Xuguang; Yang, Dejian; Jiang, Yiquan; Feng, Tao; Liang, Jin

    2018-04-01

    Observations reveal that the tropical Pacific-Indian Ocean (TPIO) has experienced a pronounced interdecadal warming since the end of the 1970s. Meanwhile, the wintertime midlatitude Northern Hemispheric atmospheric circulation and East Asian climate have also undergone substantial interdecadal changes. The effect of the TPIO warming on these interdecadal changes is identified by a suite of AMIP-type atmospheric general circulation model experiments in which the model is integrated from September 1948 to December 1999 with prescribed historical, observed realistic sea surface temperature (SST) in a specific region and climatological SST elsewhere. Results show that the TPIO warming reproduces quite well the observed Northern Hemispheric wintertime interdecadal changes, suggesting that these interdecadal changes primarily originate from the TPIO warming. However, each sub-region of the TPIO makes its own distinct contribution. Comparatively, the tropical central-eastern Pacific (TCEP) and tropical western Pacific (TWP) warming make dominant contributions to the observed positive-phase PNA-like interdecadal anomaly over the North Pacific sector, while the tropical Indian Ocean (TIO) warming tends to cancel these contributions. Meanwhile, the TIO and TWP warming make dominant contributions to the observed positive NAO-like interdecadal anomaly over the North Atlantic sector as well as to the interdecadal anomalies over the Eurasian sector, although the TWP warming's contribution is relatively small. These remote responses are directly attributed to the TPIO warming-induced increases in tropical convection, rainfall and diabatic heating, in which the TIO warming has the most significant effect. Moreover, the TPIO warming excites a Gill-type pattern anomaly over the tropical western Pacific, with a low-level anticyclonic circulation anomaly over the Philippine Sea.
Of the three sub-regions, the TIO warming dominates this pattern, although the TWP warming tends to cancel the effect. The anticyclonic circulation anomaly intensifies the southwesterly flow, which transfers more moisture from the Bay of Bengal to East Asia and considerably increases the winter precipitation over southern East Asia. This is strongly supported by the observational fact that there has been a significant interdecadal increase of winter precipitation over southern China since the end of the 1970s.

  12. Shifts of regional hydro-climatic regimes in the warmer future

    NASA Astrophysics Data System (ADS)

    Kim, H.; Morishita, S.

    2016-12-01

    It is well known that the global climate is projected to be significantly warmer than in the pre-industrial period; in 2015, a previously unprecedented 1-degree increase of global mean temperature was recorded. Human-induced additional radiative forcing causes global and regional mean temperature increases and alters energy and water partitioning along heterogeneous pathways. Budyko proposed a conceptual equation that estimates climate-induced dryness by relating available energy and precipitation, and it has been used broadly in the hydrology community to determine regional hydro-climatic characteristics. In this study, a diagnostic framework is proposed to trace how regional hydro-climatic regimes shift under a warming scenario with a 4 °C increase of global mean temperature. A database for Policy Decision making for Future climate change (d4PDF), based on a super-ensemble AMIP-style experiment (11,400 model years in total) with sea surface temperature patterns extracted from six CMIP5 models, is used to estimate the probability distribution of the regime shifts while maximizing signal-to-noise. It was found that the global future hydro-climate condition shifts slightly toward more humid conditions compared with the historical condition, since globally the increase of precipitation is greater than the increase of net radiation. Very humid regions, including the tropics, and semi-arid regions tend to expand, while semi-humid and arid regions tend to shrink. Although the change of the global mean state between historical and future climate is not considerable, temporal variability under the warming climate is amplified significantly, inducing more frequent occurrence of once-in-a-century droughts over large terrestrial regions including Africa, South America, East and Central Asia, Australia, and the United States.
This analysis will be extended upon the availability (expected in October 2016) of a similar database being produced under the Half a degree Additional warming, Projections, Prognosis and Impacts (HAPPI) project, which follows the 2015 Paris Agreement's aim to limit the increase in global average temperature to 1.5°C above pre-industrial levels.
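    The Budyko relation underlying the framework above can be written as E/P = [phi tanh(1/phi) (1 - exp(-phi))]^(1/2), where phi = PET/P is the dryness index; a minimal sketch:

```python
import math

def budyko_evaporation_ratio(dryness):
    """Budyko curve: long-term evaporation ratio E/P as a function of
    the dryness index phi = PET/P. E/P -> phi in very humid climates
    (energy-limited) and -> 1 in very arid climates (water-limited)."""
    phi = float(dryness)
    return math.sqrt(phi * math.tanh(1.0 / phi) * (1.0 - math.exp(-phi)))
```

    Tracking how a region's phi moves under warming, and where it lands on this curve, is the kind of regime-shift diagnosis the abstract describes.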

  13. Comparative analysis of atmosphere temperature variability for Northern Eurasia based on the Reanalysis and in-situ observed data

    NASA Astrophysics Data System (ADS)

    Shulgina, T.; Genina, E.; Gordov, E.; Nikitchuk, K.

    2009-04-01

    At present, numerous data archives including meteorological observations as well as climate modeling data are available to Earth-science specialists. Methods of mathematical statistics are widely used for their processing and analysis, and in many cases represent the only way to quantitatively assess meteorological and climatic information. A unified set of analysis methods allows us to compare climatic characteristics calculated from different datasets in order to perform a more detailed analysis of climate dynamics at both regional and global levels. This report presents the results of a comparative analysis of atmospheric temperature behavior over Northern Eurasia for the period from 1979 to 2004 based on the NCEP/NCAR Reanalysis, NCEP/DOE Reanalysis AMIP-II, JMA/CRIEPI JRA-25 Reanalysis and ECMWF ERA-40 Reanalysis data, and on observational data from meteorological stations of the former Soviet Union. Statistical processing of the atmospheric temperature data included analysis of the homogeneity of time series of WMO-approved climate indices, such as "Number of frost days", "Number of summer days", "Number of icing days", "Number of tropical nights", etc., by means of parametric methods of mathematical statistics (Fisher and Student tests). This allowed a comprehensive study of the spatio-temporal features of the atmospheric temperature. Analysis of the atmospheric temperature dynamics revealed inhomogeneity of the data obtained over large observation intervals. In particular, analysis for the period 1979-2004 showed a significant increase in the number of frost and icing days of approximately 1 day every 2 years, and a decrease in the number of summer days of roughly 1 day every 2 years. It should also be mentioned that the growing-season mean temperature has increased by 1.5-2 °C over the period considered.
The use of different reanalysis datasets in conjunction with in-situ observations allowed comparison of climate index values calculated from the different datasets, which improves the reliability of the results obtained. Partial support of SB RAS Basic Research Program 4.5.2 (Project 2) is acknowledged.
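    The WMO climate indices named above are simple threshold counts over daily temperature series; a minimal sketch using the standard ETCCDI thresholds:

```python
def frost_days(tmin):
    """Number of days with daily minimum temperature below 0 degC."""
    return sum(1 for t in tmin if t < 0.0)

def summer_days(tmax):
    """Number of days with daily maximum temperature above 25 degC."""
    return sum(1 for t in tmax if t > 25.0)

def icing_days(tmax):
    """Number of days with daily maximum temperature below 0 degC."""
    return sum(1 for t in tmax if t < 0.0)
```

    Computing the same index from each reanalysis and from station data, then testing the series for homogeneity, is the comparison the abstract describes.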

  14. Climate Model Diagnostic Analyzer Web Service System

    NASA Astrophysics Data System (ADS)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2014-12-01

    We have developed a cloud-enabled web-service system that empowers physics-based, multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks. The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the observational datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation, (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs, and (3) ECMWF reanalysis outputs for several environmental variables in order to supplement observational datasets. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, (4) the calculation of difference between two variables, and (5) the conditional sampling of one physical variable with respect to another variable. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA will be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. In order to support 30+ simultaneous users during the school, we have deployed CMDA to the Amazon cloud environment. 
The cloud-enabled CMDA will provide each student with a virtual machine while the user interaction with the system will remain the same through web-browser interfaces. The summer school will serve as a valuable testbed for the tool development, preparing CMDA to serve its target community: Earth-science modeling and model-analysis community.
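The Python-wrapper pattern described above, turning an existing science application into a web service, can be sketched with the standard library (CMDA itself uses Flask, Gunicorn, and Tornado; the endpoint, parameter names, and analysis routine below are hypothetical, not CMDA's actual API):

```python
# Stdlib sketch of exposing an existing science application code as a web
# service, in the spirit of CMDA's Python-wrapper approach. Endpoint and
# parameter names are illustrative only.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse

def seasonal_mean(dataset, variable, season):
    """Stand-in for an existing science application code."""
    return {"dataset": dataset, "variable": variable,
            "season": season, "mean": 0.0}

class AnalysisHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Translate HTTP query parameters into arguments of the wrapped code.
        q = parse_qs(urlparse(self.path).query)
        result = seasonal_mean(q.get("dataset", ["Obs4MIPs"])[0],
                               q.get("variable", ["ta"])[0],
                               q.get("season", ["DJF"])[0])
        body = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To serve: HTTPServer(("", 8080), AnalysisHandler).serve_forever()
```

The same wrapper shape applies to each of the listed analysis capabilities: the HTTP layer only marshals parameters in and JSON results out, leaving the science code untouched.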

  15. Applying an economical scale-aware PDF-based turbulence closure model in NOAA NCEP GCMs.

    NASA Astrophysics Data System (ADS)

    Krueger, S. K.; Belochitski, A.; Moorthi, S.; Bogenschutz, P.; Pincus, R.

    2015-12-01

A novel unified representation of sub-grid scale (SGS) turbulence, cloudiness, and shallow convection is being implemented into the NOAA NCEP Global Forecasting System (GFS) general circulation model. The approach, known as Simplified High Order Closure (SHOC), is based on predicting a joint PDF of SGS thermodynamic variables and vertical velocity and using it to diagnose turbulent diffusion coefficients, SGS fluxes, condensation, and cloudiness. Unlike other similar methods, only one new prognostic variable, turbulent kinetic energy (TKE), needs to be introduced, making the technique computationally efficient. The SHOC code was adapted from its origins in a cloud-resolving model (CRM) to the global-model environment and incorporated into the NCEP GFS. SHOC was first tested in a non-interactive mode, a configuration in which SHOC receives inputs from the host model but its outputs are not returned to the GFS. In this configuration: a) SGS TKE values produced by GFS SHOC are consistent with those produced by SHOC in a CRM, b) SGS TKE in GFS SHOC exhibits a well-defined diurnal cycle, c) boundary layer turbulence is enhanced in the subtropical stratocumulus and tropical transition-to-cumulus areas, and d) buoyancy flux diagnosed from the assumed PDF is consistent with independently calculated Brunt-Vaisala frequency in identifying stable and unstable regions. Next, SHOC was coupled to the GFS: turbulent diffusion coefficients computed by SHOC are now used in place of those produced by the GFS boundary layer and shallow convection schemes (Han and Pan, 2011), and the condensation and cloud fraction diagnosed from the SGS PDF replace those calculated by the current large-scale cloudiness scheme (Zhao and Carr, 1997).
Ongoing activities consist of debugging the fully coupled GFS/SHOC. Future work will consist of evaluating model performance and tuning the physics if necessary, by performing medium-range NWP forecasts with prescribed initial conditions and AMIP-type climate tests with prescribed SSTs; depending on the results, the model will be tuned or the parameterizations modified. Next, SHOC will be implemented in the NCEP CFS and tuned and evaluated for climate applications: seasonal prediction and long coupled climate runs. The impact of the new physics on ENSO, MJO, ISO, monsoon variability, etc., will be examined.
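The assumed-PDF diagnosis of condensation and cloud fraction at the heart of SHOC can be illustrated in its simplest, single-Gaussian limit (the classic Sommeria-Deardorff calculation; SHOC itself predicts a double-Gaussian joint PDF, and the numbers below are invented):

```python
# Single-Gaussian sketch of diagnosing cloud fraction and mean condensate
# from an assumed SGS PDF of the saturation deficit s = qt - qs(T, p).
# This is the Sommeria-Deardorff limit, for illustration only; SHOC uses
# a double-Gaussian joint PDF of thermodynamic variables and w.
import math

def pdf_cloud(s_mean, s_std):
    """Cloud fraction and mean condensate from a Gaussian PDF of s."""
    q1 = s_mean / s_std                       # normalized saturation deficit
    # Cloud fraction = probability that s > 0 under the Gaussian PDF.
    cloud_frac = 0.5 * (1.0 + math.erf(q1 / math.sqrt(2.0)))
    # Mean condensate = integral of s over the saturated part of the PDF.
    ql = s_std * (q1 * cloud_frac
                  + math.exp(-0.5 * q1 * q1) / math.sqrt(2.0 * math.pi))
    return cloud_frac, max(ql, 0.0)

# Slightly subsaturated grid-box mean with SGS variability (invented values):
frac, ql = pdf_cloud(s_mean=-2.0e-4, s_std=5.0e-4)
```

The key property, partial cloudiness in a grid box whose mean state is subsaturated, falls out directly from the PDF, which is why no separate cloud-fraction scheme is needed once the PDF moments are known.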

  16. The global mean energy balance under cloud-free conditions

    NASA Astrophysics Data System (ADS)

Wild, Martin; Hakuba, Maria; Folini, Doris; Ott, Patricia; Long, Charles

    2017-04-01

A long-standing problem of climate models is their overestimation of surface solar radiation not only under all-sky but also under clear-sky conditions (Wild et al. 1995, Wild et al. 2006). This overestimation has decreased over consecutive model generations owing to the simulation of stronger atmospheric absorption. Here we analyze the clear-sky fluxes of the latest climate model generation from the Coupled Model Intercomparison Project Phase 5 (CMIP5) against an expanded and updated set of direct observations from the Baseline Surface Radiation Network (BSRN). Clear-sky climatologies for these sites have been composed based on the Long and Ackerman (2000) clear-sky detection algorithm (Hakuba et al. 2017), and sampling issues that arise when comparing with model-simulated clear-sky fluxes have been analyzed in Ott (2017). Overall, the overestimation of clear-sky insolation in the CMIP5 models is now merely 1-2 Wm-2 in the multimodel mean, compared to 4 Wm-2 in CMIP3 and 6 Wm-2 in AMIPII (Wild et al. 2006). Still, a considerable spread in the individual model biases is apparent, ranging from -2 Wm-2 to 10 Wm-2 when averaged over 53 globally distributed BSRN sites. This bias structure is used to infer best estimates for present-day global mean clear-sky insolation, following an approach developed in Wild et al. (2013, 2015, Clim. Dyn.) for all-sky fluxes: the flux biases in the various models are linearly related to their respective global means, and a best estimate is then inferred from the linear regression at the point where the bias against the surface observations becomes zero. This way we obtain a best estimate of 247 Wm-2 for the global mean insolation at the Earth's surface under cloud-free conditions, and a global mean surface-absorbed solar radiation of 214 Wm-2 under cloud-free conditions, assuming a global mean surface albedo of 13.5%.
Combined with a best estimate for the net influx of solar radiation at the top of atmosphere under cloud-free conditions from CERES EBAF of 286 Wm-2, this leaves 72 Wm-2 of absorbed solar radiation in the cloud-free atmosphere. The 72 Wm-2 closely matches our best estimate for the global mean cloud-free atmospheric absorption in Wild et al. (2006, JGR), which was based on older models and their biases against far fewer direct observations. This indicates that an estimate of global mean solar absorption in the cloud-free atmosphere slightly above 70 Wm-2 is fairly robust. In comparison, the global mean solar absorption under all-sky conditions was estimated in Wild et al. (2015) at 80 Wm-2 based on the same approach. The difference between the all-sky and clear-sky absorption represents the cloud radiative effect on the atmospheric absorption, and is thus estimated here to be around 8 Wm-2. This is similar in magnitude to the 11 Wm-2 derived by Hakuba et al. (2017) when averaged over the atmospheric cloud effect determined at 36 BSRN stations. We applied the same methodology to the longwave fluxes, obtaining a best estimate for the global mean clear-sky downward longwave flux at the Earth's surface of 214 Wm-2. Together with surface and TOA upward longwave fluxes of 398 Wm-2 and 266 Wm-2, respectively, this leaves an atmospheric longwave divergence under clear-sky conditions of 182 Wm-2. Selected related references: Hakuba, M. Z., Folini, D., Wild, M., Long, C. N., Schaepman-Strub, G., and Stephens, G. L., 2017: Cloud Effects on Atmospheric Solar Absorption in Light of Most Recent Surface and Satellite Measurements. AIP Conf. Proc. (in press). Ott, P., 2017: Master Thesis at ETH Zurich (in prep.). Wild, M., Ohmura, A., Gilgen, H., and Roeckner, E., 1995: Validation of GCM simulated radiative fluxes using surface observations. J. Climate, 8, 1309-1324.
Wild, M., Long, C.N., and Ohmura, A., 2006: Evaluation of clear-sky solar fluxes in GCMs participating in AMIP and IPCC-AR4 from a surface perspective. J. Geophys. Res., 111, D01104, doi:10.1029/2005JD006118. Wild, M., Folini, D., Schär, C., Loeb, N., Dutton, E.G., and König-Langlo, G., 2013: The global energy balance from a surface perspective. Climate Dynamics, 40, 3107-3134. Wild, M., Folini, D., Hakuba, M., Schär, C., Seneviratne, S.I., Kato, S., Rutan, D., Ammann, C., Wood, E.F., and König-Langlo, G., 2015: The energy balance over land and oceans: An assessment based on direct observations and CMIP5 climate models. Climate Dynamics, 44, 3393-3429, DOI 10.1007/s00382-014-2430-z.
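The regression-to-zero-bias estimation described in this abstract can be sketched schematically: fit a line of model bias against simulated global mean, and read off the global mean at which the fitted bias vanishes. All numbers below are invented purely to illustrate the mechanics:

```python
# Schematic of the Wild et al. (2013, 2015) best-estimate approach:
# regress each model's bias against surface observations on its simulated
# global-mean flux, and take the global mean at which the fitted bias is
# zero. The model values below are invented for illustration.
def best_estimate(global_means, biases):
    """Least-squares fit bias = a*mean + b; return the mean where bias = 0."""
    n = len(global_means)
    mx = sum(global_means) / n
    my = sum(biases) / n
    sxx = sum((x - mx) ** 2 for x in global_means)
    sxy = sum((x - mx) * (y - my) for x, y in zip(global_means, biases))
    a = sxy / sxx                 # slope of the bias-vs-mean line
    b = my - a * mx               # intercept
    return -b / a                 # global mean at which the bias is zero

# Hypothetical multi-model clear-sky insolation (W m-2) and site-mean biases:
means = [245.0, 249.0, 251.0, 255.0, 257.0]
biases = [-2.0, 2.0, 4.0, 8.0, 10.0]
estimate = best_estimate(means, biases)
```

In this toy data the biases are exactly `means - 247`, so the fitted line crosses zero at 247 W m-2, mirroring how the paper's 247 W m-2 estimate is read off the real multi-model regression.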

  17. Inter-comparison of the EUMETSAT H-SAF and NASA PPS precipitation products over Western Europe.

    NASA Astrophysics Data System (ADS)

    Kidd, Chris; Panegrossi, Giulia; Ringerud, Sarah; Stocker, Erich

    2017-04-01

The development of precipitation retrieval techniques utilising passive microwave satellite observations has achieved a good degree of maturity through the use of physically-based schemes. The DMSP Special Sensor Microwave Imager/Sounder (SSMIS) has been the mainstay of passive microwave observations over the last 13 years, forming the basis of many satellite precipitation products, including NASA's Precipitation Processing System (PPS) and EUMETSAT's Hydrological Satellite Application Facility (H-SAF). The NASA PPS product utilises the Goddard Profiling (GPROF; currently 2014v2-0) retrieval scheme, which provides a physically consistent retrieval through the use of coincident active/passive microwave retrievals from the Global Precipitation Measurement (GPM) mission core satellite. The GPM combined algorithm retrieves hydrometeor profiles optimized for consistency with both the Dual-frequency Precipitation Radar (DPR) and the GPM Microwave Imager (GMI); these profiles form the basis of the GPROF database, which can be utilized for any constellation radiometer within the framework of a Bayesian retrieval scheme. The H-SAF product (PR-OBS-1 v1.7) is based on a physically-based Bayesian technique in which the a priori information is provided by a Cloud Dynamic Radiation Database (CDRD). Meteorological parameter constraints, derived from synthetic dynamical-thermodynamical-hydrological meteorological profile variables, are combined with multi-hydrometeor microphysical profiles and multispectral PMW brightness temperature vectors into a specialized a priori knowledge database that underpins and guides the algorithm's Bayesian retrieval solver. This paper will present the results of an inter-comparison of the NASA PPS GPROF and EUMETSAT H-SAF PR-OBS-1 products over Western Europe for the period from 1 January 2015 through 31 December 2016. Surface radar estimates are taken from the UKMO Nimrod European radar product, available at 15 minute/5 km resolution.
Initial results show that overall the correlations between the two satellite precipitation products and the surface radar precipitation estimates are similar, particularly for cases with extensive precipitation; however, the H-SAF product tends to have poorer correlations in situations where rain is light or limited in extent. Similarly, RMSEs for the GPROF scheme tend to be smaller than those of the H-SAF retrievals. The difference in performance can be traced to the identification of precipitation: the GPROF2014v2-0 scheme overestimates the occurrence and extent of precipitation, generating a significant amount of light precipitation, while the H-SAF scheme has a lower precipitation threshold of about 0.25 mm h-1 and overestimates moderate and higher precipitation intensities.
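The point-wise verification statistics discussed above, correlation and RMSE of satellite estimates against collocated radar values, can be sketched as follows (all values invented for illustration):

```python
# Sketch of the verification statistics used in the inter-comparison:
# Pearson correlation and RMSE of collocated satellite and radar
# precipitation estimates. Sample values are invented.
import math

def corr_rmse(sat, radar):
    """Pearson correlation and root-mean-square error of paired samples."""
    n = len(sat)
    ms, mr = sum(sat) / n, sum(radar) / n
    cov = sum((s - ms) * (r - mr) for s, r in zip(sat, radar))
    var_s = sum((s - ms) ** 2 for s in sat)
    var_r = sum((r - mr) ** 2 for r in radar)
    corr = cov / math.sqrt(var_s * var_r)
    rmse = math.sqrt(sum((s - r) ** 2 for s, r in zip(sat, radar)) / n)
    return corr, rmse

radar = [0.0, 0.2, 1.5, 4.0, 8.0]   # mm/h, hypothetical collocated pairs
gprof = [0.1, 0.4, 1.2, 3.5, 7.0]
c, e = corr_rmse(gprof, radar)
```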

18. Sensing Water Vapor via Spacecraft Radio Occultation Observations

    NASA Technical Reports Server (NTRS)

    Kursinski, E. Robert; Hajj, George A.

    2000-01-01

The radio occultation technique has been used to characterize planetary atmospheres since the 1960s, spanning atmospheric pressures from 16 microbars to several bars. In 1988, the use of GPS signals to make occultation observations of Earth's atmosphere was proposed by Tom Yunck and Gunnar Lindal at JPL. In the GPS to low-Earth-orbiter limb-viewing occultation geometry, Fresnel diffraction yields a unique combination of high vertical resolution (100 m to 1 km) at long wavelengths (approx. 20 cm) insensitive to particulate scattering, which allows routine limb sounding from the lower mesosphere through the troposphere. A single orbiting GPS/GLONASS receiver can observe approximately 1000 to 1400 daily occultations, providing as many daily high-vertical-resolution soundings as the present global radiosonde network, but with far more evenly distributed, global coverage. The occultations yield profiles of refractivity as a function of height. In the cold, dry conditions of the upper troposphere and above (T less than 240 K), profiles of density, pressure (geopotential), and temperature can be derived. Given additional temperature information, water vapor can be derived in the middle and lower troposphere, with a unique combination of vertical resolution, global distribution, and insensitivity to clouds and precipitation, to an accuracy of approx. 0.2 g/kg. At low latitudes, moisture profiles will be accurate to 1-5% within the convective boundary layer and better than 20% below 6 to 7 km. Accuracies of climatological averages should be approx. 0.1 g/kg, limited by the biases in the temperature estimates. To use refractivity to constrain water vapor, knowledge of temperature is required. The simplest approach is to use the temperature field from an analysis such as the 6-hour ECMWF global analysis interpolated to the location of each occultation.
A better approach is to combine the temperature and moisture fields from such an analysis with the occultation refractivity in a weighting scheme based on the errors in each data field. A 1D variational combination approach has been developed at the UKMO. We will present results from both approaches from GPS/MET data taken in June and July 1995 and compare them with the ECMWF global 6-hour moisture analyses, which are derived largely from TOVS and radiosonde data. Overall, the atmosphere below the 500 mb level appears somewhat drier in general than the ECMWF humidity field. A 2-D (latitude vs. height) climatological snapshot derived from a 2-week span of GPS/MET data will be compared to the humidity climatology of Peixoto and Oort derived from radiosonde data from 1963-1973. Differences between the GPS results and Peixoto and Oort may be the signature of a climate trend over the past 30 years.
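The simplest inversion of refractivity for water vapor given temperature can be sketched with the standard Smith-Weintraub relation (textbook constants; the sample profile point is invented and the operational retrieval may differ in detail):

```python
# Inverting refractivity for water vapor given temperature, using the
# standard Smith-Weintraub relation
#   N = 77.6 * P/T + 3.73e5 * e/T**2   (P, e in hPa; T in K).
# Constants are textbook values; the sample profile point is invented.
def vapor_pressure(N, P_hpa, T_k):
    """Water vapor partial pressure e (hPa) from refractivity N."""
    return (N - 77.6 * P_hpa / T_k) * T_k ** 2 / 3.73e5

# Warm, moist lower-troposphere example:
e_hpa = vapor_pressure(N=320.0, P_hpa=900.0, T_k=280.0)
# Approximate specific humidity in g/kg (q ~ 0.622 e / P):
q_gkg = 0.622 * e_hpa / 900.0 * 1000.0
```

The second, moist term dominates the refractivity in the warm lower troposphere, which is why temperature information from an analysis suffices to separate the two contributions there.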

  19. Benefits of Sharing Information: Supermodel Ensemble and Applications in South America

    NASA Astrophysics Data System (ADS)

    Dias, P. L.

    2006-05-01

A model intercomparison program involving a large number of academic and operational institutions has been implemented in South America since 2003, motivated by the SALLJEX Intercomparison Program in 2003 (a research program focused on identifying the role of the Andes low-level-jet moisture transport from the Amazon to the Plata basin) and the WMO/THORPEX (www.wmo.int/thorpex) goals to improve predictability through the proper combination of numerical weather forecasts. This program also explores the potential predictability associated with the combination of a large number of possible scenarios on time scales of a few days up to 15 days. Five academic institutions and five operational forecasting centers in several countries in South America, one academic institution in the USA, and the main global forecasting centers (NCEP, UKMO, ECMWF) agreed to provide numerical products based on operational and experimental models. The metric for model validation concentrates on the fit of the forecast to surface observations. Meteorological data from airports, synoptic stations operated by national weather services, automatic data platforms maintained by different institutions, the PIRATA buoys, etc., are all collected through LDM/NCAR or direct transmission. Approximately 40 model outputs are available on a daily basis, twice a day. A simple procedure based on data assimilation principles was quite successful in combining the available forecasts to produce temperature, dew point, wind, pressure, and precipitation forecasts at station points in South America. The procedure removes each model's bias at the observation point and forms a weighted average based on the mean square error of the forecasts. The base period for estimating the bias and mean square error is of the order of 15 to 30 days.
Products of the intercomparison program and the optimal statistical combination of the available forecasts are public and available in real time (www.master.iag.usp.br/). Monitoring of the use of the products reveals a growing trend in the last year (reaching about 10,000 accesses per day in recent months). The intercomparison program provides a rich data set for educational products (real-time use in Synoptic Meteorology and Numerical Weather Forecasting lectures), operational weather forecasts in national or regional weather centers, and research purposes. During the first phase of the program it was difficult to convince potential participants to share their information on the public homepage. However, as the system evolved, more and more institutions became associated with the program. The general opinion of the participants is that the system provides a unified metric for evaluation and a forum for discussing the physical origin of the model forecast differences, and thereby improves the quality of the numerical guidance.
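The combination procedure described above, per-model bias removal over a training window followed by an inverse-mean-square-error weighted average, can be sketched as follows (all values invented):

```python
# Sketch of the multi-model combination described in the abstract: remove
# each model's recent bias at a station and weight models by the inverse
# mean-square error over a ~15-30 day training window. Values invented.
def combine(forecasts, past_errors):
    """forecasts: model -> current forecast at a station.
    past_errors: model -> list of (forecast - observation) over the window."""
    num = den = 0.0
    for model, fc in forecasts.items():
        errs = past_errors[model]
        bias = sum(errs) / len(errs)
        mse = sum(e * e for e in errs) / len(errs)
        weight = 1.0 / mse if mse > 0.0 else 1.0
        num += weight * (fc - bias)        # bias-corrected forecast
        den += weight
    return num / den

forecasts = {"NCEP": 24.0, "UKMO": 26.0, "ECMWF": 25.0}   # deg C, invented
errors = {"NCEP": [1.0, 2.0, 1.5],
          "UKMO": [-0.5, 0.5, 0.0],
          "ECMWF": [0.2, -0.2, 0.4]}
t2m = combine(forecasts, errors)
```

Models with small recent errors dominate the average, so the blend adapts automatically as the relative skill of the contributing centers changes.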

20. The Program for Climate Model Diagnosis and Intercomparison: 20th Anniversary Symposium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Potter, Gerald L; Bader, David C; Riches, Michael

Twenty years ago, W. Lawrence (Larry) Gates approached the U.S. Department of Energy (DOE) Office of Energy Research (now the Office of Science) with a plan to coordinate the comparison and documentation of climate model differences. This effort would help improve our understanding of climate change through a systematic approach to model intercomparison. Early attempts at comparing results showed a surprisingly large range in control climate for such parameters as cloud cover, precipitation, and even atmospheric temperature. The DOE agreed to fund the effort at the Lawrence Livermore National Laboratory (LLNL), in part because of the existing computing environment and because of a preexisting atmospheric science group that contained a wide variety of expertise. The project was named the Program for Climate Model Diagnosis and Intercomparison (PCMDI), and it has changed the international landscape of climate modeling over the past 20 years. In spring 2009 the DOE hosted a 1-day symposium to celebrate the twentieth anniversary of PCMDI and to honor its founder, Larry Gates. Through their personal experiences, the morning presenters painted an image of climate science in the 1970s and 1980s that generated early support from the international community for model intercomparison, thereby bringing PCMDI into existence. Four talks covered Gates's early contributions to climate research at the University of California, Los Angeles (UCLA), the RAND Corporation, and Oregon State University, through the founding of PCMDI to coordinate the Atmospheric Model Intercomparison Project (AMIP). The speakers were, in order of presentation, Warren Washington [National Center for Atmospheric Research (NCAR)], Kelly Redmond (Western Regional Climate Center), George Boer (Canadian Centre for Climate Modelling and Analysis), and Lennart Bengtsson [University of Reading, former director of the European Centre for Medium-Range Weather Forecasts (ECMWF)].
The afternoon session emphasized the scientific ideas that are the basis of PCMDI's success, summarizing their evolution and impact. Four speakers followed the various PCMDI-supported climate model intercomparison projects, beginning with early work on cloud representations in models, presented by Robert D. Cess (Distinguished Professor Emeritus, Stony Brook University), and then the latest Cloud Feedback Model Intercomparison Projects (CFMIPs) led by Sandrine Bony (Laboratoire de Météorologie Dynamique). Benjamin Santer (LLNL) presented a review of the climate change detection and attribution (D & A) work pioneered at PCMDI, and Gerald A. Meehl (NCAR) ended the day with a look toward the future of climate change research.

  1. Climate Model Diagnostic Analyzer Web Service System

    NASA Astrophysics Data System (ADS)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2013-12-01

    The latest Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with newly available global observations. The traditional approach to climate model evaluation, which compares a single parameter at a time, identifies symptomatic model biases and errors but fails to diagnose the model problems. The model diagnosis process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. To address these challenges, we are developing a parallel, distributed web-service system that enables the physics-based multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation and (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, and (4) the calculation of difference between two variables. 
A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA is planned to be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. The requirements of the educational tool are defined with the interaction with the school organizers, and CMDA is customized to meet the requirements accordingly. The tool needs to be production quality for 30+ simultaneous users. The summer school will thus serve as a valuable testbed for the tool development, preparing CMDA to serve the Earth-science modeling and model-analysis community at the end of the project. This work was funded by the NASA Earth Science Program called Computational Modeling Algorithms and Cyberinfrastructure (CMAC).

  2. Nonhuman TRIM5 Variants Enhance Recognition of HIV-1-Infected Cells by CD8+ T Cells

    PubMed Central

    Jimenez-Moyano, Esther; Ruiz, Alba; Kløverpris, Henrik N.; Rodriguez-Plata, Maria T.; Peña, Ruth; Blondeau, Caroline; Selwood, David L.; Izquierdo-Useros, Nuria; Moris, Arnaud; Clotet, Bonaventura; Goulder, Philip; Towers, Greg J.

    2016-01-01

ABSTRACT Tripartite motif-containing protein 5 (TRIM5) restricts human immunodeficiency virus type 1 (HIV-1) in a species-specific manner by uncoating viral particles while activating early innate responses. Although the contribution of TRIM5 proteins to cellular immunity has not yet been studied, their interactions with the incoming viral capsid and the cellular proteasome led us to hypothesize a role for them. Here, we investigate whether the expression of two nonhuman TRIM5 orthologs, rhesus TRIM5α (RhT5) and TRIM-cyclophilin A (TCyp), both of which are potent restrictors of HIV-1, could enhance immune recognition of infected cells by CD8+ T cells. We illustrate how TRIM5 restriction improves CD8+ T-cell-mediated HIV-1 inhibition. Moreover, when TRIM5 activity was blocked by the nonimmunosuppressive analog of cyclosporine (CsA), sarcosine-3(4-methylbenzoate)–CsA (SmBz-CsA), we found a significant reduction in CD107a/MIP-1β expression in HIV-1-specific CD8+ T cells. This finding underscores the direct link between TRIM5 restriction and activation of CD8+ T-cell responses. Interestingly, cells expressing RhT5 induced stronger CD8+ T-cell responses through the specific recognition of the HIV-1 capsid by the immune system. The underlying mechanism of this process may involve TRIM5-specific capsid recruitment to cellular proteasomes, increasing peptide availability for the loading and presentation of HLA class I antigens. In summary, we identified a novel function for nonhuman TRIM5 variants in cellular immunity. We hypothesize that TRIM5 can couple innate viral sensing and CD8+ T-cell activation to increase species barriers against retrovirus infection. IMPORTANCE New therapeutics to tackle HIV-1 infection should aim to combine rapid innate viral sensing and cellular immune recognition. Such strategies could prevent seeding of the viral reservoir and the immune damage that occurs during acute infection.
The nonhuman TRIM5 variants, rhesus TRIM5α (RhT5) and TRIM-cyclophilin A (TCyp), are attractive candidates owing to their potency in sensing HIV-1 and blocking its activity. Here, we show that expression of RhT5 and TCyp in HIV-1-infected cells improves CD8+ T-cell-mediated inhibition through the direct activation of HIV-1-specific CD8+ T-cell responses. We found that the potency in CD8+ activation was stronger for RhT5 variants and capsid-specific CD8+ T cells in a mechanism that relies on TRIM5-dependent particle recruitment to cellular proteasomes. This novel mechanism couples innate viral sensing with cellular immunity in a single protein and could be exploited to develop innovative therapeutics for control of HIV-1 infection. PMID:27440884

  3. Added value of non-calibrated and BMA calibrated AEMET-SREPS probabilistic forecasts: the 24 January 2009 extreme wind event over Catalonia

    NASA Astrophysics Data System (ADS)

    Escriba, P. A.; Callado, A.; Santos, D.; Santos, C.; Simarro, J.; García-Moya, J. A.

    2009-09-01

At 00 UTC on 24 January 2009 an explosive cyclogenesis that originated over the Atlantic Ocean reached its maximum intensity, with observed surface pressures lower than 970 hPa at its center, located over the Bay of Biscay. As it tracked across southern France, this low caused strong westerly and north-westerly winds over the Iberian Peninsula, higher than 150 km/h in some places. These extreme winds left 10 casualties in Spain, 8 of them in Catalonia. The aim of this work is to show whether there is added value in the short-range prediction of the 24 January 2009 strong winds when using the Short Range Ensemble Prediction System (SREPS) of the Spanish Meteorological Agency (AEMET), with respect to the operational forecasting tools. This study emphasizes two aspects of probabilistic forecasting: the ability of a 3-day forecast to warn of an extreme wind event, and the ability to quantify the predictability of the event, thereby giving value to the deterministic forecast. Two types of probabilistic wind forecasts are carried out: a non-calibrated one, and one calibrated using Bayesian Model Averaging (BMA). AEMET runs SREPS experimentally twice a day (00 and 12 UTC). This system consists of 20 members constructed by integrating 5 limited-area models, COSMO (COSMO), HIRLAM (HIRLAM Consortium), HRM (DWD), MM5 (NOAA) and UM (UKMO), at 25 km horizontal resolution. Each model uses 4 different initial and boundary conditions, taken from the global models GFS (NCEP), GME (DWD), IFS (ECMWF) and UM. In this way a probabilistic forecast is obtained that takes into account the initial-condition, boundary-condition and model errors. BMA is a statistical tool for combining predictive probability functions from different sources. The BMA predictive probability density function (PDF) is a weighted average of PDFs centered on the individual bias-corrected forecasts.
The weights are equal to the posterior probabilities of the models generating the forecasts and reflect the skill of the ensemble members. Here BMA is applied to provide probabilistic forecasts of wind speed. In this work several forecasts at different time ranges (H+72, H+48 and H+24) of 10 m wind speed over Catalonia are verified subjectively at one of the instants of maximum intensity, 12 UTC 24 January 2009. On one hand, three probabilistic forecasts are compared: ECMWF EPS, non-calibrated SREPS and calibrated SREPS. On the other hand, the relationship between predictability and the skill of the deterministic forecast is studied by looking at HIRLAM 0.16 deterministic forecasts of the event. Verification is focused on the location and intensity of 10 m wind speed, and 10-minute measurements from AEMET automatic ground stations are used as observations. The results indicate that SREPS is able to forecast, three days ahead, mean winds higher than 36 km/h and correctly locates them with a significant probability of occurrence in the affected area. The probability is higher after BMA calibration of the ensemble. The fact that the probability of strong winds is high allows us to state that the predictability of the event is also high and, as a consequence, deterministic forecasts are more reliable. This is confirmed when verifying HIRLAM deterministic forecasts against observed values.
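The BMA predictive PDF described above, a weighted average of member PDFs centered on bias-corrected forecasts, can be sketched as follows (Gaussian kernels for simplicity; BMA for wind speed is often formulated with gamma kernels, and all weights, forecasts and spreads below are invented):

```python
# Sketch of a BMA predictive density: a mixture of kernels centered on
# bias-corrected member forecasts, weighted by posterior model
# probabilities. Gaussian kernels and all numbers are illustrative only.
import math

def bma_pdf(x, forecasts, weights, sigma):
    """BMA predictive density at wind speed x (same units as forecasts)."""
    total = 0.0
    for fc, w in zip(forecasts, weights):
        total += w * math.exp(-0.5 * ((x - fc) / sigma) ** 2) \
                 / (sigma * math.sqrt(2.0 * math.pi))
    return total

members = [38.0, 42.0, 45.0]   # bias-corrected 10 m wind forecasts, km/h
weights = [0.5, 0.3, 0.2]      # posterior model probabilities (sum to 1)
density = bma_pdf(40.0, members, weights, sigma=4.0)
```

Because the weights sum to one and each kernel integrates to one, the mixture is itself a proper PDF, from which exceedance probabilities (e.g. of winds above a warning threshold) follow by integration.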

  4. Information-computational system for storage, search and analytical processing of environmental datasets based on the Semantic Web technologies

    NASA Astrophysics Data System (ADS)

    Titov, A.; Gordov, E.; Okladnikov, I.

    2009-04-01

In this report we present the results of work devoted to the development of a working model of a software system for the storage, semantically-enabled search and retrieval, processing and visualization of environmental datasets containing results of meteorological and air pollution observations and mathematical climate modeling. A specially designed metadata standard for machine-readable description of datasets in the meteorology, climate and atmospheric pollution transport domains is introduced as one of the key system components. To provide semantic interoperability, the Resource Description Framework (RDF, http://www.w3.org/RDF/) technology has been chosen to realize the metadata description model in the form of an RDF Schema. The final version of the RDF Schema is implemented on the basis of widely used standards, such as the Dublin Core Metadata Element Set (http://dublincore.org/), the Directory Interchange Format (DIF, http://gcmd.gsfc.nasa.gov/User/difguide/difman.html), ISO 19139, etc. At present the system is available as a Web server (http://climate.risks.scert.ru/metadatabase/) based on the web-portal ATMOS engine [1] and implements dataset management functionality, including SeRQL-based semantic search as well as statistical analysis and visualization of selected data archives [2,3]. The core of the system is an Apache web server in conjunction with the Tomcat Java Servlet Container (http://jakarta.apache.org/tomcat/) and Sesame Server (http://www.openrdf.org/), used as a database for RDF and RDF Schema. At present, statistical analysis of meteorological and climatic data with subsequent visualization of results is implemented for such datasets as the NCEP/NCAR Reanalysis, the NCEP/DOE AMIP II Reanalysis, JMA/CRIEPI JRA-25, ECMWF ERA-40, and local measurements obtained from meteorological stations on the territory of Russia. This functionality is aimed primarily at identifying the main characteristics of regional climate dynamics.
The proposed system represents a step in the process of development of a distributed collaborative information-computational environment to support multidisciplinary investigations of Earth regional environment [4]. Partial support of this work by SB RAS Integration Project 34, SB RAS Basic Program Project 4.5.2.2, APN Project CBA2007-08NSY and FP6 Enviro-RISKS project (INCO-CT-2004-013427) is acknowledged. References 1. E.P. Gordov, V.N. Lykosov, and A.Z. Fazliev. Web portal on environmental sciences "ATMOS" // Advances in Geosciences. 2006. Vol. 8. p. 33-38. 2. Gordov E.P., Okladnikov I.G., Titov A.G. Development of elements of web based information-computational system supporting regional environment processes investigations // Journal of Computational Technologies, Vol. 12, Special Issue #3, 2007, pp. 20-28. 3. Okladnikov I.G., Titov A.G., Melnikova V.N., Shulgina T.M. Web-system for processing and visualization of meteorological and climatic data // Journal of Computational Technologies, Vol. 13, Special Issue #3, 2008, pp. 64-69. 4. Gordov E.P., Lykosov V.N. Development of information-computational infrastructure for integrated study of Siberia environment // Journal of Computational Technologies, Vol. 12, Special Issue #2, 2007, pp. 19-30.
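The kind of machine-readable dataset description the system stores can be illustrated with a plain-Python sketch (field names loosely follow Dublin Core "dc:" and DIF "dif:" conventions; the actual system encodes such records as RDF in a Sesame repository and queries them with SeRQL, so the structures below are illustrative only):

```python
# Plain-Python sketch of machine-readable dataset descriptions and a toy
# search over them. Field names loosely follow Dublin Core ("dc:") and
# DIF ("dif:"); the real system stores these as RDF and queries via SeRQL.
datasets = [
    {"dc:title": "NCEP/NCAR Reanalysis",
     "dc:subject": ["meteorology", "reanalysis"],
     "dc:coverage": "global",
     "dif:Temporal_Coverage": "1948-present"},
    {"dc:title": "ECMWF ERA-40",
     "dc:subject": ["meteorology", "reanalysis"],
     "dc:coverage": "global",
     "dif:Temporal_Coverage": "1957-2002"},
]

def find_by_subject(records, subject):
    """Toy stand-in for a semantic search over dataset metadata."""
    return [r["dc:title"] for r in records if subject in r["dc:subject"]]

titles = find_by_subject(datasets, "reanalysis")
```

A shared vocabulary for these fields is what lets heterogeneous archives (reanalyses, station measurements, model output) be discovered through one search interface.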

  5. The Shuttle Mission Simulator computer generated imagery

    NASA Technical Reports Server (NTRS)

    Henderson, T. H.

    1984-01-01

    Equipment available in the primary training facility for the Space Transportation System (STS) flight crews includes the Fixed Base Simulator, the Motion Base Simulator, the Spacelab Simulator, and the Guidance and Navigation Simulator. The Shuttle Mission Simulator (SMS) consists of the Fixed Base Simulator and the Motion Base Simulator. The SMS utilizes four visual Computer Generated Image (CGI) systems. The Motion Base Simulator has a forward crew station with six-degree-of-freedom motion simulation. Operation of the Spacelab Simulator is planned for the spring of 1983. The Guidance and Navigation Simulator went into operation in 1982. Aspects of orbital visual simulation are discussed, taking into account the earth scene, payload simulation, the generation and display of 1079 stars, the simulation of sun glare, and Reaction Control System jet firing plumes. Attention is also given to landing site visual simulation, and night launch and landing simulation.

  6. Design of simulation-based medical education and advantages and disadvantages of in situ simulation versus off-site simulation.

    PubMed

    Sørensen, Jette Led; Østergaard, Doris; LeBlanc, Vicki; Ottesen, Bent; Konge, Lars; Dieckmann, Peter; Van der Vleuten, Cees

    2017-01-21

    Simulation-based medical education (SBME) has traditionally been conducted as off-site simulation in simulation centres. Some hospital departments also provide off-site simulation using in-house training room(s) set up for simulation away from the clinical setting, and these activities are called in-house training. In-house training facilities can be part of hospital departments and resemble simulation centres to some extent, but often have less technical equipment. In situ simulation, introduced over the past decade, mainly comprises team-based activities and occurs in patient care units with healthcare professionals in their own working environment. This intentional blend of simulation and real working environments means that in situ simulation brings simulation to the real working environment and provides training where people work. In situ simulation can be either announced or unannounced, the latter also known as a drill. This article presents and discusses the design of SBME and the advantages and disadvantages of the different simulation settings, such as training in simulation centres, in-house simulation in hospital departments, and announced or unannounced in situ simulation. Non-randomised studies argue that in situ simulation is more effective for educational purposes than other types of simulation settings. Conversely, the few comparison studies that exist, either randomised or retrospective, show that choice of setting does not seem to influence individual or team learning. However, hospital department-based simulations, such as in-house simulation and in situ simulation, lead to a gain in organisational learning. To our knowledge no studies have compared announced and unannounced in situ simulation. The literature suggests some improved organisational learning from unannounced in situ simulation; however, unannounced in situ simulation was also found to be challenging to plan and conduct, and more stressful for participants. 
The importance of setting, context, and fidelity is discussed. Based on the current limited research, we suggest that the choice of setting for simulations does not seem to influence individual and team learning. Department-based local simulation, such as in-house simulation and especially in situ simulation, leads to gains in organisational learning. The overall objectives of simulation-based education and factors such as feasibility can help determine the choice of simulation setting.

  7. Implications of Simulation Conceptual Model Development for Simulation Management and Uncertainty Assessment

    NASA Technical Reports Server (NTRS)

    Pace, Dale K.

    2000-01-01

    A simulation conceptual model is a simulation developer's way of translating modeling requirements (i.e., what is to be represented by the simulation or its modification) into a detailed design framework (i.e., how it is to be done), from which the software, hardware, networks (in the case of distributed simulation), and systems/equipment that will make up the simulation can be built or modified. A conceptual model is the collection of information which describes a simulation developer's concept about the simulation and its pieces. That information consists of assumptions, algorithms, characteristics, relationships, and data. Taken together, these describe how the simulation developer understands what is to be represented by the simulation (entities, actions, tasks, processes, interactions, etc.) and how that representation will satisfy the requirements to which the simulation responds. Thus the conceptual model is the basis for judgment about simulation fidelity and validity for any condition that is not specifically tested. The more perspicuous and precise the conceptual model, the more likely it is that the simulation development will both fully satisfy requirements and allow demonstration that the requirements are satisfied (i.e., validation). Methods used in simulation conceptual model development have significant implications for simulation management and for assessment of simulation uncertainty. This paper suggests how to develop and document a simulation conceptual model so that the simulation fidelity and validity can be most effectively determined. These ideas for conceptual model development apply to all simulation varieties. The paper relates these ideas to uncertainty assessments as they relate to simulation fidelity and validity. The paper also explores implications for simulation management from conceptual model development methods, especially relative to reuse of simulation components.
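Pace's abstract enumerates the kinds of information a conceptual model collects: assumptions, algorithms, characteristics, relationships, and data. A minimal container sketch of that collection, where the field names and the crude completeness check are our own illustration rather than anything from the paper:

```python
from dataclasses import dataclass, field

@dataclass
class ConceptualModel:
    """Toy container mirroring the information categories the paper lists
    for a simulation conceptual model (structure is illustrative only)."""
    assumptions: list = field(default_factory=list)
    algorithms: dict = field(default_factory=dict)
    characteristics: dict = field(default_factory=dict)
    relationships: list = field(default_factory=list)
    data: dict = field(default_factory=dict)

    def is_documented(self):
        # Crude completeness check: every category has at least one entry,
        # supporting later judgments about fidelity and validity.
        return all([self.assumptions, self.algorithms, self.characteristics,
                    self.relationships, self.data])

cm = ConceptualModel(
    assumptions=["rigid-body dynamics"],
    algorithms={"integrator": "RK4"},
    characteristics={"timestep_s": 0.01},
    relationships=["vehicle state -> sensor feed"],
    data={"mass_kg": 1200.0},
)
print(cm.is_documented())  # True
```

The point of making the collection explicit, as the paper argues, is that each recorded assumption or algorithm becomes something a validator can inspect for conditions the simulation was never specifically tested against.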

  8. [Malfunction simulation by spaceflight training simulator].

    PubMed

    Chang, Tian-chun; Zhang, Lian-hua; Xue, Liang; Lian, Shun-guo

    2005-04-01

    To implement malfunction simulation in a spaceflight training simulator. The principles of malfunction simulation were defined according to spacecraft malfunction prediction and its countermeasures. The malfunction patterns were classified and the malfunction types confirmed. A malfunction simulation model was established, and malfunction simulation was realized by mathematical simulation. According to the requirements of astronaut training, malfunction simulation models were established and realized for spacecraft subsystems such as environment control and life support, GNC, propulsion, power supply, thermal control, data management, measurement control and communication, and structure. The malfunction simulation functions implemented in the spaceflight training simulator satisfied the requirements of astronaut training.

  9. Computer-based simulation training to improve learning outcomes in mannequin-based simulation exercises.

    PubMed

    Curtin, Lindsay B; Finn, Laura A; Czosnowski, Quinn A; Whitman, Craig B; Cawley, Michael J

    2011-08-10

    To assess the impact of computer-based simulation on the achievement of student learning outcomes during mannequin-based simulation. Participants were randomly assigned to rapid response teams of 5-6 students, and teams were then randomly assigned to a group that completed either the computer-based or the mannequin-based simulation cases first. In both simulations, students used their critical thinking skills and selected interventions independent of facilitator input. A predetermined rubric was used to record and assess students' performance in the mannequin-based simulations. Feedback and student performance scores were generated by the software in the computer-based simulations. More of the teams in the group that completed the computer-based simulation before completing the mannequin-based simulation achieved the primary outcome for the exercise, which was survival of the simulated patient (41.2% vs. 5.6%). The majority of students (>90%) recommended the continuation of simulation exercises in the course. Students in both groups felt the computer-based simulation should be completed prior to the mannequin-based simulation. The use of computer-based simulation prior to mannequin-based simulation improved the achievement of learning goals and outcomes. In addition to improving participants' skills, completing the computer-based simulation first may improve participants' confidence during the more real-life setting achieved in the mannequin-based simulation.

  10. Payload crew training complex simulation engineer's handbook

    NASA Technical Reports Server (NTRS)

    Shipman, D. L.

    1984-01-01

    The Simulation Engineer's Handbook is a guide for new engineers assigned to Experiment Simulation and a reference for engineers previously assigned. The experiment simulation process, development of experiment simulator requirements, development of experiment simulator hardware and software, and the verification of experiment simulators are discussed. The training required for experiment simulation is extensive and is only referenced in the handbook.

  11. Simulators IV; Proceedings of the SCS Conference, Orlando, FL, Apr. 6-9, 1987

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fairchild, B.T.

    1987-01-01

    The conference presents papers on the applicability of AI techniques to simulation models, the simulation of a reentry vehicle on Simstar, Simstar missile simulation, measurement issues associated with simulator sickness, and tracing the etiology of simulator sickness. Consideration is given to a simulator of a steam generator tube bundle response to a blowdown transient, the census of simulators for fossil fueled boiler and gas turbine plant operation training, and a new approach for flight simulator visual systems. Other topics include past and present simulated aircraft maintenance trainers, an AI-simulation based approach for aircraft maintenance training, simulator qualification using EPRI methodology, and the role of instinct in organizational dysfunction.

  12. Simulation's Ensemble is Better Than Ensemble Simulation

    NASA Astrophysics Data System (ADS)

    Yan, X.

    2017-12-01

    Simulation's ensemble is better than ensemble simulation. Yan Xiaodong, State Key Laboratory of Earth Surface Processes and Resource Ecology (ESPRE), Beijing Normal University, 19 Xinjiekouwai Street, Haidian District, Beijing 100875, China. Email: yxd@bnu.edu.cn. A dynamical system is simulated from an initial state. However, initial-state data carry great uncertainty, which leads to uncertainty in the simulation. Therefore, simulation from multiple possible initial states has been used widely in atmospheric science and has indeed been shown to lower the uncertainty; this is named the simulation's ensemble because multiple simulation results are fused. In the ecological field, individual-based model simulation (forest gap models, for example) can be regarded as a simulation's ensemble compared with community-based simulation (most ecosystem models). In this talk, we address the advantage of individual-based simulation and even its ensembles.
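The ensemble idea in this abstract, fusing runs started from several uncertain initial states, can be illustrated with a toy relaxation model. The dynamics, perturbation range, and member count below are arbitrary choices for demonstration, not anything from the talk:

```python
import random
import statistics

def simulate(x0, steps=50, r=0.9):
    """Toy dynamical system: geometric relaxation from initial state x0."""
    x = x0
    for _ in range(steps):
        x *= r
    return x

random.seed(0)
true_x0 = 1.0
truth = simulate(true_x0)

# Uncertain initial states: perturbations around the (unknown) true state.
members = [simulate(true_x0 + random.uniform(-0.2, 0.2)) for _ in range(100)]
ensemble_mean = statistics.mean(members)   # fuse the ensemble of runs
single_run = simulate(true_x0 + 0.2)       # one run from an extreme perturbation

# The fused ensemble lands closer to the truth than the single perturbed run.
print(abs(ensemble_mean - truth) < abs(single_run - truth))
```

Because the toy dynamics are linear, the ensemble mean equals the run started from the mean perturbed state, and averaging 100 perturbations shrinks the initial-state error, which is exactly the uncertainty-reduction argument the abstract makes.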

  13. Advancing renal education: hybrid simulation, using simulated patients to enhance realism in haemodialysis education.

    PubMed

    Dunbar-Reid, Kylie; Sinclair, Peter M; Hudson, Denis

    2015-06-01

    Simulation is a well-established and proven teaching method, yet its use in renal education is not widely reported. Criticisms of simulation-based teaching include limited realism and a lack of authentic patient interaction. This paper discusses the benefits and challenges of high-fidelity simulation and suggests hybrid simulation as a complementary model to existing simulation programmes. Through the use of a simulated patient, hybrid simulation can improve the authenticity of renal simulation-based education while simultaneously teaching and assessing technologically enframed caring. © 2015 European Dialysis and Transplant Nurses Association/European Renal Care Association.

  14. Real time digital propulsion system simulation for manned flight simulators

    NASA Technical Reports Server (NTRS)

    Mihaloew, J. R.; Hart, C. E.

    1978-01-01

    A real time digital simulation of a STOL propulsion system was developed which generates significant dynamics and internal variables needed to evaluate system performance and aircraft interactions using manned flight simulators. The simulation ran at a real-to-execution time ratio of 8.8. The model was used in a piloted NASA flight simulator program to evaluate the simulation technique and the propulsion system digital control. The simulation is described and results shown. Limited results of the flight simulation program are also presented.

  15. Operationalizing Healthcare Simulation Psychological Safety: A Descriptive Analysis of an Intervention.

    PubMed

    Henricksen, Jared W; Altenburg, Catherine; Reeder, Ron W

    2017-10-01

    Despite efforts to prepare a psychologically safe environment, simulation participants are occasionally psychologically distressed. Instructing simulation educators about participant psychological risks and having a participant psychological distress action plan available to simulation educators may assist them as they seek to keep all participants psychologically safe. A Simulation Participant Psychological Safety Algorithm was designed to aid simulation educators as they debrief simulation participants perceived to have psychological distress and categorize these events as mild (level 1), moderate (level 2), or severe (level 3). A prebrief dedicated to creating a psychologically safe learning environment was held constant. The algorithm was used for 18 months in an active pediatric simulation program. Data collected included level of participant psychological distress as perceived and categorized by the simulation team using the algorithm, type of simulation that participants went through, who debriefed, and timing of when psychological distress was perceived to occur during the simulation session. The Kruskal-Wallis test was used to evaluate the relationship between events and simulation type, events and simulation educator team who debriefed, and timing of event during the simulation session. A total of 3900 participants went through 399 simulation sessions between August 1, 2014, and January 26, 2016. Thirty-four (<1%) simulation participants from 27 sessions (7%) were perceived to have an event. One participant was perceived to have a severe (level 3) psychological distress event. Events occurred more commonly in high-intensity simulations, with novice learners and with specific educator teams. Simulation type and simulation educator team were associated with occurrence of events (P < 0.001). There was no association between event timing and event level. 
Severe psychological distress as categorized by simulation personnel using the Simulation Participant Psychological Safety Algorithm is rare, with mild and moderate events being more common. The algorithm was used to teach simulation educators how to assist a participant who may be psychologically distressed and document perceived event severity.

  16. Advanced EMT and Phasor-Domain Hybrid Simulation with Simulation Mode Switching Capability for Transmission and Distribution Systems

    DOE PAGES

    Huang, Qiuhua; Vittal, Vijay

    2018-05-09

    Conventional electromagnetic transient (EMT) and phasor-domain hybrid simulation approaches presently exist for transmission system level studies. Their simulation efficiency is generally constrained by the EMT simulation. With an increasing number of distributed energy resources and non-conventional loads being installed in distribution systems, it is imperative to extend the hybrid simulation application to include distribution systems and integrated transmission and distribution systems. Meanwhile, it is equally important to improve the simulation efficiency as the modeling scope and complexity of the detailed system in the EMT simulation increase. To meet both requirements, this paper introduces an advanced EMT and phasor-domain hybrid simulation approach. This approach has two main features: 1) a comprehensive phasor-domain modeling framework which supports positive-sequence, three-sequence, three-phase and mixed three-sequence/three-phase representations and 2) a robust and flexible simulation mode switching scheme. The developed scheme enables switching from hybrid simulation mode back to pure phasor-domain dynamic simulation mode to achieve significantly improved simulation efficiency. The proposed method has been tested on integrated transmission and distribution systems. The results show that with the developed simulation switching feature, the total computational time is significantly reduced compared to running the hybrid simulation for the whole simulation period, while maintaining good simulation accuracy.
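The abstract's mode-switching scheme, detailed EMT stepping only while a transient is active and reverting to coarse phasor stepping afterwards, can be caricatured as follows. This toy coordinator (a first-order decay standing in for the network, with invented time steps and settling tolerance) is our sketch of the general idea, not the authors' algorithm:

```python
import math

def hybrid_simulate(t_end=0.5, disturb_at=0.1, tau=0.02,
                    dt_phasor=0.01, dt_emt=1e-4, settle_tol=1e-3):
    """Toy hybrid run: fine 'EMT' steps only while the transient is active."""
    t, x, steps = 0.0, 0.0, 0
    mode, disturbed, switches = "phasor", False, []
    while t < t_end:
        if not disturbed and t >= disturb_at:
            disturbed, x, mode = True, 1.0, "emt"   # disturbance hits: go detailed
            switches.append("to_emt")
        dt = dt_emt if mode == "emt" else dt_phasor
        x *= math.exp(-dt / tau)                    # transient decays away
        if mode == "emt" and abs(x) < settle_tol:
            mode = "phasor"                         # settled: back to coarse mode
            switches.append("to_phasor")
        t += dt
        steps += 1
    return steps, switches

steps, switches = hybrid_simulate()
pure_emt_steps = int(0.5 / 1e-4)  # cost of fine stepping for the whole period
print(steps, switches, pure_emt_steps)
```

The step counts illustrate the paper's efficiency claim in miniature: confining fine-grained stepping to the disturbance window costs far fewer steps than running at the EMT step size for the entire simulation period.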

  17. Advanced EMT and Phasor-Domain Hybrid Simulation with Simulation Mode Switching Capability for Transmission and Distribution Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Qiuhua; Vittal, Vijay

    Conventional electromagnetic transient (EMT) and phasor-domain hybrid simulation approaches presently exist for transmission system level studies. Their simulation efficiency is generally constrained by the EMT simulation. With an increasing number of distributed energy resources and non-conventional loads being installed in distribution systems, it is imperative to extend the hybrid simulation application to include distribution systems and integrated transmission and distribution systems. Meanwhile, it is equally important to improve the simulation efficiency as the modeling scope and complexity of the detailed system in the EMT simulation increase. To meet both requirements, this paper introduces an advanced EMT and phasor-domain hybrid simulation approach. This approach has two main features: 1) a comprehensive phasor-domain modeling framework which supports positive-sequence, three-sequence, three-phase and mixed three-sequence/three-phase representations and 2) a robust and flexible simulation mode switching scheme. The developed scheme enables switching from hybrid simulation mode back to pure phasor-domain dynamic simulation mode to achieve significantly improved simulation efficiency. The proposed method has been tested on integrated transmission and distribution systems. The results show that with the developed simulation switching feature, the total computational time is significantly reduced compared to running the hybrid simulation for the whole simulation period, while maintaining good simulation accuracy.

  18. Accurate Behavioral Simulator of All-Digital Time-Domain Smart Temperature Sensors by Using SIMULINK

    PubMed Central

    Chen, Chun-Chi; Chen, Chao-Lieh; Lin, You-Ting

    2016-01-01

    This study proposes a new behavioral simulator that uses SIMULINK for all-digital CMOS time-domain smart temperature sensors (TDSTSs) for performing rapid and accurate simulations. Inverter-based TDSTSs offer the benefits of low cost and simple structure for temperature-to-digital conversion and have been developed. Typically, electronic design automation tools, such as HSPICE, are used to simulate TDSTSs for performance evaluations. However, such tools require extremely long simulation time and complex procedures to analyze the results and generate figures. In this paper, we organize simple but accurate equations into a temperature-dependent model (TDM) by which the TDSTSs evaluate temperature behavior. Furthermore, temperature-sensing models of a single CMOS NOT gate were devised using HSPICE simulations. Using the TDM and these temperature-sensing models, a novel simulator in SIMULINK environment was developed to substantially accelerate the simulation and simplify the evaluation procedures. Experiments demonstrated that the simulation results of the proposed simulator have favorable agreement with those obtained from HSPICE simulations, showing that the proposed simulator functions successfully. This is the first behavioral simulator addressing the rapid simulation of TDSTSs. PMID:27509507
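The temperature-dependent model (TDM) idea, gate delay as a simple function of temperature digitized against a reference clock, can be sketched behaviorally. All coefficients below (per-stage delay, temperature slope, clock period, chain length) are invented for illustration and are not the paper's values; this is a Python stand-in, not the SIMULINK simulator itself:

```python
def gate_delay_ns(temp_c, d0=1.0, k=0.01):
    """Per-inverter delay: d0 ns at 0 degC plus k ns per degC (assumed linear TDM)."""
    return d0 + k * temp_c

def digitize(temp_c, stages=1000, ref_clk_ns=5.0):
    """Digital code = whole reference-clock ticks spanning the chain's total delay."""
    total_delay_ns = stages * gate_delay_ns(temp_c)
    return int(total_delay_ns // ref_clk_ns)

# Warmer silicon -> slower inverters -> longer chain delay -> larger code.
codes = {t: digitize(t) for t in (0, 25, 50, 100)}
print(codes)  # {0: 200, 25: 250, 50: 300, 100: 400}
```

A behavioral model at this level of abstraction is what lets the authors' simulator avoid transistor-level HSPICE runs: the temperature-to-code mapping is evaluated directly instead of being re-derived from circuit equations at every time point.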

  19. Tri-FAST Hardware-in-the-Loop Simulation. Volume I. Tri-FAST Hardware-in-the-Loop Simulation at the Advanced Simulation Center

    DTIC Science & Technology

    1979-03-28

    TECHNICAL REPORT T-79-43, Tri-FAST Hardware-in-the-Loop Simulation, Volume 1: Tri-FAST Hardware-in-the-Loop Simulation at the Advanced Simulation Center. Keywords: Tri-FAST; hardware-in-the-loop simulation; ACSL; Advanced Simulation Center; RF target models. The purpose of this report is to document the Tri-FAST missile simulation development and the seeker hardware-in-the-loop simulation at the Advanced Simulation Center.

  20. Simulation Credibility: Advances in Verification, Validation, and Uncertainty Quantification

    NASA Technical Reports Server (NTRS)

    Mehta, Unmeel B. (Editor); Eklund, Dean R.; Romero, Vicente J.; Pearce, Jeffrey A.; Keim, Nicholas S.

    2016-01-01

    Decision makers and other users of simulations need to know quantified simulation credibility to make simulation-based critical decisions and effectively use simulations, respectively. The credibility of a simulation is quantified by its accuracy in terms of uncertainty, and the responsibility of establishing credibility lies with the creator of the simulation. In this volume, we present some state-of-the-art philosophies, principles, and frameworks. The contributing authors involved in this publication have been dedicated to advancing simulation credibility. They detail and provide examples of key advances over the last 10 years in the processes used to quantify simulation credibility: verification, validation, and uncertainty quantification. The philosophies and assessment methods presented here are anticipated to be useful to other technical communities conducting continuum physics-based simulations; for example, issues related to the establishment of simulation credibility in the discipline of propulsion are discussed. We envision that simulation creators will find this volume very useful to guide and assist them in quantitatively conveying the credibility of their simulations.

  1. Genetic data simulators and their applications: an overview

    PubMed Central

    Peng, Bo; Chen, Huann-Sheng; Mechanic, Leah E.; Racine, Ben; Clarke, John; Gillanders, Elizabeth; Feuer, Eric J.

    2016-01-01

    Computer simulations have played an indispensable role in the development and application of statistical models and methods for genetic studies across multiple disciplines. The need to simulate complex evolutionary scenarios and pseudo-datasets for various studies has fueled the development of dozens of computer programs with varying reliability, performance, and application areas. To help researchers compare and choose the most appropriate simulators for their studies, we have created the Genetic Simulation Resources (GSR) website, which allows authors of simulation software to register their applications and describe them with more than 160 defined attributes. This article summarizes the properties of 93 simulators currently registered at GSR and provides an overview of the development and applications of genetic simulators. Unlike other review articles that address technical issues or compare simulators for particular application areas, we focus on software development, maintenance, and features of simulators, often from a historical perspective. Publications that cite these simulators are used to summarize both the applications of genetic simulations and the utilization of simulators. PMID:25504286

  2. A breakthrough for experiencing and understanding simulated physics

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1988-01-01

    The use of computer simulation in physics research is discussed, focusing on improvements to graphic workstations. Simulation capabilities and applications of enhanced visualization tools are outlined. The elements of an ideal computer simulation are presented and the potential for improving various simulation elements is examined. The interface between the human and the computer and simulation models are considered. Recommendations are made for changes in computer simulation practices and applications of simulation technology in education.

  3. Response of Flight Nurses in a Simulated Helicopter Environment.

    PubMed

    Kaniecki, David M; Hickman, Ronald L; Alfes, Celeste M; Reimer, Andrew P

    The purpose of this study was to determine if a helicopter flight simulator could provide a useful educational platform by creating experiences similar to those encountered by actual flight nurses. Flight nurse (FN) and non-FN participants completed a simulated emergency scenario in a flight simulator. Physiologic and psychological stress during the simulation was measured using heart rate and perceived stress scores. A questionnaire was then administered to assess the realism of the flight simulator. Subjects reported that the overall experience in the flight simulator was comparable with a real helicopter. Sounds, communications, vibrations, and movements in the simulator most approximated those of a real-life helicopter environment. Perceived stress levels of all participants increased significantly from 27 (on a 0-100 scale) before simulation to 51 at the peak of the simulation and declined thereafter to 28 (P < .001). Perceived stress levels of FNs increased significantly from 25 before simulation to 54 at the peak of the simulation and declined thereafter to 30 (P < .001). Perceived stress levels of non-FNs increased significantly from 31 before simulation to 49 at the peak of the simulation and declined thereafter to 25 (P < .001). There were no significant differences in perceived stress levels between FNs and non-FNs before (P = .58), during (P = .63), or after (P = .55) simulation. FNs' heart rates increased significantly from 77 before simulation to 100 at the peak of the simulation and declined thereafter to 72 (P < .001). The results of this study suggest that simulation of a critical care scenario in a high-fidelity helicopter flight simulator can provide a realistic helicopter transport experience and create physiologic and psychological stress for participants. Copyright © 2017 Air Medical Journal Associates. Published by Elsevier Inc. All rights reserved.

  4. The effects of simulated fog and motion on simulator sickness in a driving simulator and the duration of after-effects.

    PubMed

    Dziuda, Lukasz; Biernacki, Marcin P; Baran, Paulina M; Truszczyński, Olaf E

    2014-05-01

    In the study, we checked: 1) how the simulator test conditions affect the severity of simulator sickness symptoms; 2) how the severity of simulator sickness symptoms changes over time; and 3) whether the conditions of the simulator test affect the severity of these symptoms in different ways, depending on the time that has elapsed since the performance of the task in the simulator. We studied 12 men aged 24-33 years (M = 28.8, SD = 3.26) using a truck simulator. The SSQ questionnaire was used to assess the severity of the symptoms of simulator sickness. Each of the subjects performed three 30-minute tasks running along the same route in a driving simulator. Each of these tasks was carried out in a different simulator configuration: A) fixed base platform with poor visibility; B) fixed base platform with good visibility; and C) motion base platform with good visibility. The measurement of the severity of the simulator sickness symptoms took place in five consecutive intervals. The results of the analysis showed that the simulator test conditions affect in different ways the severity of the simulator sickness symptoms, depending on the time which has elapsed since performing the task on the simulator. The simulator sickness symptoms persisted at the highest level for the test conditions involving the motion base platform. Also, when performing the tasks on the motion base platform, the severity of the simulator sickness symptoms varied depending on the time that had elapsed since performing the task. Specifically, the addition of motion to the simulation increased the oculomotor and disorientation symptoms reported as well as the duration of the after-effects. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  5. Current status of endoscopic simulation in gastroenterology fellowship training programs.

    PubMed

    Jirapinyo, Pichamol; Thompson, Christopher C

    2015-07-01

    Recent guidelines have encouraged gastroenterology and surgical training programs to integrate simulation into their core endoscopic curricula. However, the role that simulation currently has within training programs is unknown. This study aims to assess the current status of simulation among gastroenterology fellowship programs. This questionnaire study consisted of 38 fields divided into two sections. The first section queried program directors' experience on simulation and assessed the current status of simulation at their institution. The second portion surveyed their opinion on the potential role of simulation on the training curriculum. The study was conducted at the 2013 American Gastroenterological Association Training Directors' Workshop in Phoenix, Arizona. The participants were program directors from Accreditation Council for Graduate Medical Education accredited gastroenterology training programs, who attended the workshop. The questionnaire was returned by 69 of 97 program directors (response rate of 71%). 42% of programs had an endoscopic simulator. Computerized simulators (61.5%) were the most common, followed by mechanical (30.8%) and animal tissue (7.7%) simulators, respectively. Eleven programs (15%) required fellows to use simulation prior to clinical cases. Only one program has a minimum number of hours fellows have to participate in simulation training. Current simulators are deemed as easy to use (76%) and good educational tools (65%). Problems are cost (72%) and accessibility (69%). The majority of program directors believe that there is a need for endoscopic simulator training, with only 8% disagreeing. Additionally, a majority believe there is a role for simulation prior to initiation of clinical cases with 15% disagreeing. Gastroenterology fellowship program directors widely recognize the importance of simulation. 
Nevertheless, simulation is used by only 42% of programs and only 15% of programs require that trainees use simulation prior to clinical cases. No programs currently use simulation as part of the evaluation process.

  6. Building the evidence on simulation validity: comparison of anesthesiologists' communication patterns in real and simulated cases.

    PubMed

    Weller, Jennifer; Henderson, Robert; Webster, Craig S; Shulruf, Boaz; Torrie, Jane; Davies, Elaine; Henderson, Kaylene; Frampton, Chris; Merry, Alan F

    2014-01-01

    Effective teamwork is important for patient safety, and verbal communication underpins many dimensions of teamwork. The validity of the simulated environment would be supported if it elicited similar verbal communications to the real setting. The authors hypothesized that anesthesiologists would exhibit similar verbal communication patterns in routine operating room (OR) cases and routine simulated cases. The authors further hypothesized that anesthesiologists would exhibit different communication patterns in routine cases (real or simulated) and simulated cases involving a crisis. Key communications relevant to teamwork were coded from video recordings of anesthesiologists in the OR, routine simulation and crisis simulation and percentages were compared. The authors recorded comparable videos of 20 anesthesiologists in the two simulations, and 17 of these anesthesiologists in the OR, generating 400 coded events in the OR, 683 in the routine simulation, and 1,419 in the crisis simulation. The authors found no significant differences in communication patterns in the OR and the routine simulations. The authors did find significant differences in communication patterns between the crisis simulation and both the OR and the routine simulations. Participants rated team communication as realistic and considered their communications occurred with a similar frequency in the simulations as in comparable cases in the OR. The similarity of teamwork-related communications elicited from anesthesiologists in simulated cases and the real setting lends support for the ecological validity of the simulation environment and its value in teamwork training. Different communication patterns and frequencies under the challenge of a crisis support the use of simulation to assess crisis management skills.

  7. Simulation Activity in Otolaryngology Residencies.

    PubMed

    Deutsch, Ellen S; Wiet, Gregory J; Seidman, Michael; Hussey, Heather M; Malekzadeh, Sonya; Fried, Marvin P

    2015-08-01

    Simulation has become a valuable tool in medical education, and several specialties accept or require simulation as a resource for resident training or assessment as well as for board certification or maintenance of certification. This study investigates current simulation resources and activities in US otolaryngology residency programs and examines interest in advancing simulation training and assessment within the specialty. Web-based survey. US otolaryngology residency training programs. An electronic web-based survey was disseminated to all US otolaryngology program directors to determine their respective institutional and departmental simulation resources, existing simulation activities, and interest in further simulation initiatives. Descriptive results are reported. Responses were received from 43 of 104 (43%) residency programs. Simulation capabilities and resources are available in most respondents' institutions (78.6% report onsite resources; 73.8% report availability of models, manikins, and devices). Most respondents (61%) report limited simulation activity within otolaryngology. Areas of simulation are broad, addressing technical and nontechnical skills related to clinical training (94%). Simulation is infrequently used for research, credentialing, or systems improvement. The majority of respondents (83.8%) expressed interest in participating in multicenter trials of simulation initiatives. Most respondents from otolaryngology residency programs have incorporated some simulation into their curriculum. Interest among program directors to participate in future multicenter trials appears high. Future research efforts in this area should aim to determine optimal simulators and simulation activities for training and assessment as well as how to best incorporate simulation into otolaryngology residency training programs. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2015.

  8. To simulate or not to simulate: what are the questions?

    PubMed

    Dudai, Yadin; Evers, Kathinka

    2014-10-22

    Simulation is a powerful method in science and engineering. However, simulation is an umbrella term, and its meaning and goals differ among disciplines. Rapid advances in neuroscience and computing draw increasing attention to large-scale brain simulations. What is the meaning of simulation, and what should the method expect to achieve? We discuss the concept of simulation from an integrated scientific and philosophical vantage point and pinpoint selected issues that are specific to brain simulation.

  9. Synchronization Of Parallel Discrete Event Simulations

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S.

    1992-01-01

Adaptive, parallel, discrete-event-simulation-synchronization algorithm, Breathing Time Buckets, developed in Synchronous Parallel Environment for Emulation and Discrete Event Simulation (SPEEDES) operating system. Algorithm allows parallel simulations to process events optimistically in fluctuating time cycles that naturally adapt while simulation is in progress, combining the best of the optimistic and conservative synchronization strategies while avoiding their major disadvantages. Well suited for modeling communication networks, large-scale war games, simulated flights of aircraft, simulations of computer equipment, mathematical modeling, interactive engineering simulations, and depictions of flows of information.
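    The time-cycle idea behind Breathing Time Buckets can be illustrated with a toy event loop. This is a hypothetical sketch of the time-bucket concept only, not the SPEEDES algorithm: the function name, the fixed bucket rule, and the events are invented for illustration.

    ```python
    import heapq

    def run_time_buckets(events, bucket_width):
        """Group timestamped events into time cycles (buckets).

        Toy illustration: events whose timestamps fall in the current
        window are gathered together; in a parallel simulation they could
        be processed concurrently, since none precedes the others by more
        than the bucket width.
        """
        heap = list(events)          # (timestamp, name) pairs
        heapq.heapify(heap)
        buckets = []
        while heap:
            window_end = heap[0][0] + bucket_width
            current = []
            while heap and heap[0][0] < window_end:
                current.append(heapq.heappop(heap))
            buckets.append(current)
        return buckets

    buckets = run_time_buckets([(0.5, "a"), (0.7, "b"), (2.1, "c")], 1.0)
    ```

    Here events "a" and "b" land in the first cycle and "c" in the second; the real algorithm additionally adapts the window to the earliest event that could be affected by unsent messages.
    
    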

  10. Computational simulation of concurrent engineering for aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1992-01-01

Results are summarized of an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure (multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties), fundamental to developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering for propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.

  11. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure (multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties), fundamental to developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  12. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Astrophysics Data System (ADS)

    Chamis, C. C.; Singhal, S. N.

    1993-02-01

Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure (multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties), fundamental to developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  13. Residents' perceptions of simulation as a clinical learning approach.

    PubMed

    Walsh, Catharine M; Garg, Ankit; Ng, Stella L; Goyal, Fenny; Grover, Samir C

    2017-02-01

    Simulation is increasingly being integrated into medical education; however, there is little research into trainees' perceptions of this learning modality. We elicited trainees' perceptions of simulation-based learning, to inform how simulation is developed and applied to support training. We conducted an instrumental qualitative case study entailing 36 semi-structured one-hour interviews with 12 residents enrolled in an introductory simulation-based course. Trainees were interviewed at three time points: pre-course, post-course, and 4-6 weeks later. Interview transcripts were analyzed using a qualitative descriptive analytic approach. Residents' perceptions of simulation included: 1) simulation serves pragmatic purposes; 2) simulation provides a safe space; 3) simulation presents perils and pitfalls; and 4) optimal design for simulation: integration and tension. Key findings included residents' markedly narrow perception of simulation's capacity to support non-technical skills development or its use beyond introductory learning. Trainees' learning expectations of simulation were restricted. Educators should critically attend to the way they present simulation to learners as, based on theories of problem-framing, trainees' a priori perceptions may delimit the focus of their learning experiences. If they view simulation as merely a replica of real cases for the purpose of practicing basic skills, they may fail to benefit from the full scope of learning opportunities afforded by simulation.

  14. A Simulation of Alternatives for Wholesale Inventory Replenishment

    DTIC Science & Technology

    2016-03-01

The report compares alternative methods for wholesale inventory replenishment; the last method is a mixed-integer, linear optimization model. Comparative Inventory Simulation, a discrete event simulation model, is designed to find the fill rates achieved for each National Item. Keywords: simulation; event graphs; reorder point; fill-rate; backorder; discrete event simulation; wholesale inventory optimization model.
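    The reorder-point and fill-rate concepts named in the keywords can be sketched as a small periodic-review inventory simulation. This is a hypothetical illustration only: the function, the uniform demand distribution, and the parameters are invented, and this is not the thesis's Comparative Inventory Simulation model.

    ```python
    import random

    def simulate_fill_rate(reorder_point, order_up_to, lead_time, periods, seed=1):
        """Toy periodic-review inventory simulation estimating the fill rate
        (fraction of demand satisfied immediately from on-hand stock)."""
        rng = random.Random(seed)
        on_hand = order_up_to
        pipeline = []                     # (arrival_period, quantity)
        filled = demanded = 0
        for t in range(periods):
            # receive any replenishment orders that have arrived
            arrived = [q for (a, q) in pipeline if a <= t]
            pipeline = [(a, q) for (a, q) in pipeline if a > t]
            on_hand += sum(arrived)
            # random demand for this period
            d = rng.randint(0, 10)
            demanded += d
            filled += min(d, on_hand)
            on_hand = max(0, on_hand - d)
            # reorder when inventory position falls to the reorder point
            if on_hand + sum(q for (_, q) in pipeline) <= reorder_point:
                pipeline.append((t + lead_time, order_up_to - on_hand))
        return filled / demanded

    rate = simulate_fill_rate(reorder_point=20, order_up_to=50,
                              lead_time=3, periods=1000)
    ```

    Sweeping the reorder point in such a loop is the basic way a simulation of this kind maps stocking policy to achieved fill rate.
    
    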

  15. A methodology towards virtualisation-based high performance simulation platform supporting multidisciplinary design of complex products

    NASA Astrophysics Data System (ADS)

    Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin

    2012-08-01

Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology to realise an HPS platform. This research is driven by issues concerning large-scale simulation resource deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources, and highly reliable simulation with fault tolerance. A framework of a virtualisation-based simulation platform (VSIM) is first proposed. The article then investigates and discusses key approaches in VSIM, including simulation resource modelling, a method for automatically deploying simulation resources for dynamic construction of the system environment, and a live migration mechanism for handling faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping, and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) result in a great reduction in deployment time and an increased flexibility for simulation environment construction, and (3) achieve fault-tolerant simulation.

  16. Future directions in flight simulation: A user perspective

    NASA Technical Reports Server (NTRS)

    Jackson, Bruce

    1993-01-01

Langley Research Center was an early leader in simulation technology, including a special emphasis on space vehicle simulations such as the rendezvous and docking simulator for the Gemini program and the lunar landing simulator used before Apollo. In more recent times, Langley operated the first synergistic six-degree-of-freedom motion platform (the Visual Motion Simulator, or VMS) and developed the first dual-dome air combat simulator, the Differential Maneuvering Simulator (DMS). Each Langley simulator was developed more or less independently of the others, with different programming support. At the present time, the various simulation cockpits, while supported by the same host computer system, run dissimilar software. The majority of recent investments in Langley's simulation facilities have been hardware procurements: host processors, visual systems, and most recently, an improved motion system. Investments in software improvements, however, have not been of the same order.

  17. DIMENSIONS OF SIMULATION.

    ERIC Educational Resources Information Center

    CRAWFORD, MEREDITH P.

Open- and closed-loop simulation is discussed from the viewpoint of research and development in training techniques. Areas discussed include: (1) open-loop environmental simulation, (2) simulation not involving people, (3) analysis of occupations, (4) simulation for training, (5) real-size system simulation, (6) techniques of miniaturization, and…

  18. Man-in-the-control-loop simulation of manipulators

    NASA Technical Reports Server (NTRS)

    Chang, J. L.; Lin, Tsung-Chieh; Yae, K. Harold

    1989-01-01

A method to achieve man-in-the-control-loop simulation is presented. Emerging real-time dynamics simulation suggests a potential for creating an interactive design workstation with a human operator in the control loop. The recursive formulation for multibody dynamics simulation is studied to determine requirements for man-in-the-control-loop simulation. High-speed computer graphics techniques provide realistic visual cues for the simulator. Backhoe and robot arm simulations are implemented to demonstrate the capability of man-in-the-control-loop simulation.

  19. Writing Technical Reports for Simulation in Education for Health Professionals: Suggested Guidelines.

    PubMed

    Dubrowski, Adam; Alani, Sabrina; Bankovic, Tina; Crowe, Andrea; Pollard, Megan

    2015-11-02

Simulation is an important training tool used in a variety of influential fields. However, development of simulation scenarios - the key component of simulation - occurs in isolation; sharing of scenarios is almost non-existent. This can make simulation use costly in terms of resources and time, with possible redundancy of effort. To alleviate these issues, the goal is to strive for an open communication of practice (CoP) surrounding simulation. To facilitate this goal, this report describes a set of guidelines for writing technical reports about simulation use for educating health professionals. Using an accepted set of guidelines will allow for homogeneity when building simulation scenarios and facilitate open sharing among simulation users. In addition to optimizing simulation efforts in institutions that currently use simulation as an educational tool, the development of such a repository may have direct implications for developing countries, where simulation is only starting to be used systematically. Our project facilitates equivalent and global access to information, knowledge, and highest-caliber education - in this context, simulation - collectively, the building blocks of optimal healthcare.

  20. Writing Technical Reports for Simulation in Education for Health Professionals: Suggested Guidelines

    PubMed Central

    Alani, Sabrina; Bankovic, Tina; Crowe, Andrea; Pollard, Megan

    2015-01-01

Simulation is an important training tool used in a variety of influential fields. However, development of simulation scenarios - the key component of simulation - occurs in isolation; sharing of scenarios is almost non-existent. This can make simulation use costly in terms of resources and time, with possible redundancy of effort. To alleviate these issues, the goal is to strive for an open communication of practice (CoP) surrounding simulation. To facilitate this goal, this report describes a set of guidelines for writing technical reports about simulation use for educating health professionals. Using an accepted set of guidelines will allow for homogeneity when building simulation scenarios and facilitate open sharing among simulation users. In addition to optimizing simulation efforts in institutions that currently use simulation as an educational tool, the development of such a repository may have direct implications for developing countries, where simulation is only starting to be used systematically. Our project facilitates equivalent and global access to information, knowledge, and highest-caliber education - in this context, simulation - collectively, the building blocks of optimal healthcare. PMID:26677421

  1. Conducting Simulation Studies in the R Programming Environment.

    PubMed

    Hallgren, Kevin A

    2013-10-12

    Simulation studies allow researchers to answer specific questions about data analysis, statistical power, and best-practices for obtaining accurate results in empirical research. Despite the benefits that simulation research can provide, many researchers are unfamiliar with available tools for conducting their own simulation studies. The use of simulation studies need not be restricted to researchers with advanced skills in statistics and computer programming, and such methods can be implemented by researchers with a variety of abilities and interests. The present paper provides an introduction to methods used for running simulation studies using the R statistical programming environment and is written for individuals with minimal experience running simulation studies or using R. The paper describes the rationale and benefits of using simulations and introduces R functions relevant for many simulation studies. Three examples illustrate different applications for simulation studies, including (a) the use of simulations to answer a novel question about statistical analysis, (b) the use of simulations to estimate statistical power, and (c) the use of simulations to obtain confidence intervals of parameter estimates through bootstrapping. Results and fully annotated syntax from these examples are provided.
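    The paper's examples are written in R; the third example (bootstrapping confidence intervals of parameter estimates) translates to a few lines in any language. A rough Python analogue, with the function name and sample data invented for illustration:

    ```python
    import random
    import statistics

    def bootstrap_ci(data, n_boot=2000, alpha=0.05, seed=0):
        """Percentile bootstrap confidence interval for the mean.

        Resample the data with replacement, recompute the statistic on
        each resample, and take the alpha/2 and 1-alpha/2 percentiles.
        """
        rng = random.Random(seed)
        means = sorted(
            statistics.mean(rng.choices(data, k=len(data)))
            for _ in range(n_boot)
        )
        lo = means[int((alpha / 2) * n_boot)]
        hi = means[int((1 - alpha / 2) * n_boot) - 1]
        return lo, hi

    data = [2.1, 2.5, 1.9, 2.8, 2.3, 2.6, 2.0, 2.4]
    lo, hi = bootstrap_ci(data)
    ```

    The same resampling loop, with the statistic swapped for a test decision, also yields the paper's power-estimation example (b).
    
    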

  2. Using a simulation assistant in modeling manufacturing systems

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, S. X.; Wolfsberger, John W.

    1988-01-01

Numerous simulation languages exist for modeling discrete event processes, and many are now ported to microcomputers. Graphics and animation capabilities were added to many of these languages to help users build models and evaluate simulation results. Even with all these languages and added features, the user is still burdened with learning the simulation language. Furthermore, the time to construct and then to validate the simulation model is always greater than originally anticipated. One approach to minimizing the time requirement is to use pre-defined macros that describe various common processes or operations in a system. The development of a simulation assistant for modeling discrete event manufacturing processes is presented. A simulation assistant is defined as an interactive intelligent software tool that assists the modeler in writing a simulation program by translating the modeler's symbolic description of the problem and then automatically generating the corresponding simulation code. The simulation assistant is discussed with emphasis on an overview of the simulation assistant, the elements of the assistant, and the five manufacturing simulation generators. A typical manufacturing system is modeled using the simulation assistant, and the advantages and disadvantages are discussed.

  3. Fluid Structural Analysis of Human Cerebral Aneurysm Using Their Own Wall Mechanical Properties

    PubMed Central

    Valencia, Alvaro; Burdiles, Patricio; Ignat, Miguel; Mura, Jorge; Rivera, Rodrigo; Sordo, Juan

    2013-01-01

Computational Structural Dynamics (CSD) simulations, Computational Fluid Dynamics (CFD) simulation, and Fluid Structure Interaction (FSI) simulations were carried out in an anatomically realistic model of a saccular cerebral aneurysm with the objective of quantifying the effects of type of simulation on principal fluid and solid mechanics results. Eight CSD simulations, one CFD simulation, and four FSI simulations were made. The results allowed the study of the influence of the type of material elements in the solid, the aneurysm's wall thickness, and the type of simulation on the modeling of a human cerebral aneurysm. The simulations use the aneurysm's own wall mechanical properties. The most complex simulation was the fully coupled FSI simulation with hyperelastic Mooney-Rivlin material, normal internal pressure, and normal variable thickness. The FSI simulation coupled in one direction using hyperelastic Mooney-Rivlin material, normal internal pressure, and normal variable thickness is the one that presents the results most similar to those of the most complex FSI simulation, requiring one-fourth of the calculation time. PMID:24151523

  4. NASA Lunar Regolith Simulant Program

    NASA Technical Reports Server (NTRS)

    Edmunson, J.; Betts, W.; Rickman, D.; McLemore, C.; Fikes, J.; Stoeser, D.; Wilson, S.; Schrader, C.

    2010-01-01

Lunar regolith simulant production is absolutely critical to returning man to the Moon. Regolith simulant is used to test hardware exposed to the lunar surface environment, simulate health risks to astronauts, practice in situ resource utilization (ISRU) techniques, and evaluate dust mitigation strategies. Lunar regolith simulant design, production process, and management is a cooperative venture between members of the NASA Marshall Space Flight Center (MSFC) and the U.S. Geological Survey (USGS). The MSFC simulant team is a satellite of the Dust group based at Glenn Research Center. The goals of the cooperative group are to (1) reproduce characteristics of lunar regolith using simulants, (2) produce simulants as cheaply as possible, (3) produce simulants in the amount needed, and (4) produce simulants to meet users' schedules.

  5. Channel simulation to facilitate mobile-satellite communications research

    NASA Technical Reports Server (NTRS)

    Davarian, Faramaz

    1987-01-01

The mobile-satellite-service channel simulator, a facility for end-to-end hardware simulation of mobile satellite communications links, is discussed. Propagation effects, Doppler, interference, band limiting, satellite nonlinearity, and thermal noise have been incorporated into the simulator. The propagation environment in which the simulator needs to operate and the architecture of the simulator are described. The simulator is composed of: a mobile/fixed transmitter, interference transmitters, a propagation path simulator, a spacecraft, and a fixed/mobile receiver. Data from application experiments conducted with the channel simulator are presented; the noise conversion technique to evaluate interference effects, the error floor phenomenon of digital multipath fading links, and the fade margin associated with a noncoherent receiver are examined. Diagrams of the simulator are provided.

  6. Real-time simulation of the TF30-P-3 turbofan engine using a hybrid computer

    NASA Technical Reports Server (NTRS)

    Szuch, J. R.; Bruton, W. M.

    1974-01-01

    A real-time, hybrid-computer simulation of the TF30-P-3 turbofan engine was developed. The simulation was primarily analog in nature but used the digital portion of the hybrid computer to perform bivariate function generation associated with the performance of the engine's rotating components. FORTRAN listings and analog patching diagrams are provided. The hybrid simulation was controlled by a digital computer programmed to simulate the engine's standard hydromechanical control. Both steady-state and dynamic data obtained from the digitally controlled engine simulation are presented. Hybrid simulation data are compared with data obtained from a digital simulation provided by the engine manufacturer. The comparisons indicate that the real-time hybrid simulation adequately matches the baseline digital simulation.
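    The bivariate function generation mentioned above amounts to two-dimensional table lookup with interpolation. A sketch of bilinear lookup follows; the grid and the table (representing f(x, y) = x + y) are hypothetical stand-ins, not the TF30-P-3 component performance maps.

    ```python
    import bisect

    def bilinear(xs, ys, table, x, y):
        """Bivariate function generation by bilinear table lookup.

        xs, ys are sorted grid coordinates; table[i][j] holds the value
        at (xs[i], ys[j]). The query point is clamped to the grid.
        """
        # locate (and clamp to) the grid cell containing (x, y)
        i = min(max(bisect.bisect_right(xs, x) - 1, 0), len(xs) - 2)
        j = min(max(bisect.bisect_right(ys, y) - 1, 0), len(ys) - 2)
        tx = (x - xs[i]) / (xs[i + 1] - xs[i])
        ty = (y - ys[j]) / (ys[j + 1] - ys[j])
        f00, f10 = table[i][j], table[i + 1][j]
        f01, f11 = table[i][j + 1], table[i + 1][j + 1]
        # blend the four corner values
        return (f00 * (1 - tx) * (1 - ty) + f10 * tx * (1 - ty)
                + f01 * (1 - tx) * ty + f11 * tx * ty)

    xs, ys = [0.0, 1.0, 2.0], [0.0, 1.0]
    table = [[0.0, 1.0], [1.0, 2.0], [2.0, 3.0]]  # f(x, y) = x + y
    val = bilinear(xs, ys, table, 0.5, 0.5)
    ```

    In the hybrid setup, evaluations like this ran on the digital side each frame while the analog side integrated the engine dynamics.
    
    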

  7. The optical design and simulation of the collimated solar simulator

    NASA Astrophysics Data System (ADS)

    Zhang, Jun; Ma, Tao

    2018-01-01

The solar simulator is a lighting device that can simulate solar radiation. It has been widely used in the testing of solar cells, satellite space environment simulation and ground experiments, and the testing and calibration of solar sensor precision. The solar simulator mainly consists of a short-arc xenon lamp, ellipsoidal reflectors, a group of optical integrators, a field stop, an aspheric folding mirror, and a collimating reflector. In this paper, the basic dimensions of the solar simulator's optical system are derived by calculation. The system is then optically modeled with the Lighttools software, and a simulation analysis of the solar simulator using the Monte Carlo ray-tracing technique is conducted. Finally, the simulation results are given quantitatively in diagrammatic form. The rationality of the design is verified on the basis of theory.
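    The Monte Carlo ray-tracing analysis can be illustrated in miniature: sample rays from the source aperture with a small random divergence and tally where they land on the irradiation plane. This is a toy geometric sketch with invented dimensions, not the Lighttools model of the paper.

    ```python
    import math
    import random

    def estimate_hit_fraction(source_radius, target_radius, distance,
                              divergence_deg, n_rays=20_000, seed=0):
        """Monte Carlo sketch: fraction of rays from a disk source that
        land on a target disk after traveling a given distance with a
        small random angular deviation from the optical axis."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(n_rays):
            # uniform sample on the source disk
            r = source_radius * math.sqrt(rng.random())
            phi = 2 * math.pi * rng.random()
            x, y = r * math.cos(phi), r * math.sin(phi)
            # small random angular deviation, random azimuth
            theta = math.radians(divergence_deg) * math.sqrt(rng.random())
            psi = 2 * math.pi * rng.random()
            x += distance * math.tan(theta) * math.cos(psi)
            y += distance * math.tan(theta) * math.sin(psi)
            if math.hypot(x, y) <= target_radius:
                hits += 1
        return hits / n_rays

    frac = estimate_hit_fraction(source_radius=0.05, target_radius=0.06,
                                 distance=1.0, divergence_deg=1.0)
    ```

    Binning the landing points instead of merely counting hits would give the irradiance-uniformity map that such analyses report.
    
    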

  8. Method and system for fault accommodation of machines

    NASA Technical Reports Server (NTRS)

    Goebel, Kai Frank (Inventor); Subbu, Rajesh Venkat (Inventor); Rausch, Randal Thomas (Inventor); Frederick, Dean Kimball (Inventor)

    2011-01-01

    A method for multi-objective fault accommodation using predictive modeling is disclosed. The method includes using a simulated machine that simulates a faulted actual machine, and using a simulated controller that simulates an actual controller. A multi-objective optimization process is performed, based on specified control settings for the simulated controller and specified operational scenarios for the simulated machine controlled by the simulated controller, to generate a Pareto frontier-based solution space relating performance of the simulated machine to settings of the simulated controller, including adjustment to the operational scenarios to represent a fault condition of the simulated machine. Control settings of the actual controller are adjusted, represented by the simulated controller, for controlling the actual machine, represented by the simulated machine, in response to a fault condition of the actual machine, based on the Pareto frontier-based solution space, to maximize desirable operational conditions and minimize undesirable operational conditions while operating the actual machine in a region of the solution space defined by the Pareto frontier.
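    The Pareto frontier underlying the method can be constructed with a simple dominance check. This is a generic sketch of Pareto-frontier construction, not the patented process; the objective pairs are invented stand-ins (say, simulated machine performance versus control margin), both to be maximized.

    ```python
    def pareto_frontier(points):
        """Return the non-dominated subset of (objective1, objective2)
        points, where both objectives are to be maximized.

        A point is kept unless some other point is at least as good in
        both objectives.
        """
        frontier = []
        for p in points:
            dominated = any(q[0] >= p[0] and q[1] >= p[1] and q != p
                            for q in points)
            if not dominated:
                frontier.append(p)
        return frontier

    front = pareto_frontier([(1, 5), (2, 4), (3, 3), (2, 2), (4, 1)])
    ```

    Selecting controller settings from this non-dominated set is what lets the method trade off desirable against undesirable operating conditions under a fault.
    
    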

  9. The Distributed Space Exploration Simulation (DSES)

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Chung, Victoria I.; Blum, Mike G.; Bowman, James D.

    2007-01-01

The paper describes the Distributed Space Exploration Simulation (DSES) Project, a research and development collaboration between NASA centers that focuses on the investigation and development of technologies, processes, and integrated simulations related to the collaborative distributed simulation of complex space systems in support of NASA's Exploration Initiative. This paper describes the three major components of DSES: network infrastructure, software infrastructure, and simulation development. In the network work area, DSES is developing a Distributed Simulation Network that will provide agency-wide support for distributed simulation between all NASA centers. In the software work area, DSES is developing a collection of software models, tools, and procedures that eases the burden of developing distributed simulations and provides a consistent interoperability infrastructure for agency-wide participation in integrated simulation. Finally, for simulation development, DSES is developing an integrated end-to-end simulation capability to support NASA development of new exploration spacecraft and missions. This paper presents current status and plans for each of these work areas, with specific examples of simulations that support NASA's exploration initiatives.

  10. Warriors Edge Simulation and Gaming System: The Squad Simulation

    DTIC Science & Technology

    2005-08-01

Warriors Edge Simulation and Gaming System: The Squad Simulation, by Mark Thomas and Gary Moss. ARL-TR-3564, August 2005. Computational and Information Sciences Directorate, ARL. Reporting period: 2004 - 30 September 2004.

  11. Mathematical modeling and SAR simulation multifunction SAR technology efforts

    NASA Technical Reports Server (NTRS)

    Griffin, C. R.; Estes, J. M.

    1981-01-01

The orbital SAR (synthetic aperture radar) simulation data were used in several simulation efforts directed toward advanced SAR development. Efforts toward simulating an operational radar, simulating antenna polarization effects, and simulating SAR images at several different wavelengths are discussed. Avenues for improvements in the orbital SAR simulation and its application to the development of advanced digital radar data processing schemes are indicated.

  12. LibKiSAO: a Java library for Querying KiSAO.

    PubMed

    Zhukova, Anna; Adams, Richard; Laibe, Camille; Le Novère, Nicolas

    2012-09-24

The Kinetic Simulation Algorithm Ontology (KiSAO) supplies information about existing algorithms available for the simulation of Systems Biology models, their characteristics, parameters, and inter-relationships. KiSAO enables the unambiguous identification of algorithms from simulation descriptions. Information about analogous methods having similar characteristics, and about algorithm parameters, incorporated into KiSAO is desirable for simulation tools. To retrieve this information programmatically, an application programming interface (API) for KiSAO is needed. We developed libKiSAO, a Java library to enable querying of the KiSA Ontology. It implements methods to retrieve information about simulation algorithms stored in KiSAO, their characteristics and parameters, and methods to query the algorithm hierarchy and search for similar algorithms providing comparable results for the same simulation set-up. Using libKiSAO, simulation tools can make logical inferences based on this knowledge and choose the most appropriate algorithm to perform a simulation. LibKiSAO also enables simulation tools to handle a wider range of simulation descriptions by determining which of the available methods are similar and can be used instead of the one indicated in the simulation description if that one is not implemented. LibKiSAO enables Java applications to easily access information about simulation algorithms, their characteristics and parameters stored in the OWL-encoded Kinetic Simulation Algorithm Ontology. LibKiSAO can be used by simulation description editors and simulation tools to improve reproducibility of computational simulation tasks and facilitate model re-use.
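    The kind of similar-algorithm query libKiSAO answers can be mimicked with a characteristic-set comparison. This is a hypothetical sketch only: the dictionary entries and the function are invented for illustration and do not reflect the actual KiSAO terms or the libKiSAO Java API.

    ```python
    def similar_algorithms(characteristics, algo):
        """Return other algorithms whose characteristic sets include all
        of the given algorithm's characteristics, i.e. candidates that
        could stand in for it in a simulation set-up."""
        wanted = characteristics[algo]
        return sorted(
            other for other, chars in characteristics.items()
            if other != algo and wanted <= chars
        )

    # Invented characteristic sets for a few well-known algorithm families
    characteristics = {
        "gillespie-direct": {"stochastic", "discrete", "exact"},
        "gillespie-first-reaction": {"stochastic", "discrete", "exact"},
        "tau-leaping": {"stochastic", "discrete", "approximate"},
        "lsoda": {"deterministic", "continuous"},
    }
    alts = similar_algorithms(characteristics, "gillespie-direct")
    ```

    A simulation tool that lacks the requested method could fall back to any algorithm returned by such a query, which is the substitution behavior the abstract describes.
    
    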

  13. Construction of multi-functional open modulized Matlab simulation toolbox for imaging ladar system

    NASA Astrophysics Data System (ADS)

    Wu, Long; Zhao, Yuan; Tang, Meng; He, Jiang; Zhang, Yong

    2011-06-01

Ladar system simulation means simulating ladar models using computer simulation technology in order to predict the performance of the ladar system. This paper reviews developments in laser imaging radar simulation in domestic and overseas studies, and studies of computer simulation of ladar systems under different application requirements. The LadarSim and FOI-LadarSIM simulation facilities of Utah State University and the Swedish Defence Research Agency are introduced in detail. Domestic research in imaging ladar system simulation is characterized by small simulation scale, non-unified designs, and applications that mostly achieve simple functional simulation based on ranging equations for ladar systems. A laser imaging radar simulation with an open and modularized structure is proposed, with unified modules for the ladar system, laser emitter, atmosphere models, target models, signal receiver, parameter settings, and system controller. A unified Matlab toolbox and standard control modules have been built with regulated function inputs and outputs and communication protocols between hardware modules. A simulation of an ICCD gain-modulated imaging ladar system observing a space shuttle is made with the toolbox. The simulation result shows that the models and parameter settings of the Matlab toolbox are able to simulate the actual detection process precisely. The unified control module and pre-defined parameter settings simplify the simulation of imaging ladar detection. Its open structure enables the toolbox to be modified for specialized requests, and its modularization gives simulations flexibility.

  14. Expansion of flight simulator capability for study and solution of aircraft directional control problems on runways

    NASA Technical Reports Server (NTRS)

    Kibbee, G. W.

    1978-01-01

    The development of a DC-9-10 runway directional control simulator and the results of its evaluation are described. An existing wide-bodied flight simulator was modified to this aircraft configuration. The simulator was structured to use either of two antiskid simulations: (1) an analog mechanization that used aircraft hardware, or (2) a digital software simulation. After the simulation was developed, it was evaluated by 14 pilots who made 818 simulated flights. These evaluations involved landings, rejected takeoffs, and various ground maneuvers. Qualitatively, most pilots rated the simulator as realistic, with good potential, especially for pilot training for adverse runway conditions.

  15. Physical Models and Virtual Reality Simulators in Otolaryngology.

    PubMed

    Javia, Luv; Sardesai, Maya G

    2017-10-01

    The increasing role of simulation in the medical education of future otolaryngologists has followed suit with other surgical disciplines. Simulators make it possible for the resident to explore and learn in a safe and less stressful environment. The various subspecialties in otolaryngology use physical simulators and virtual-reality simulators. Although physical simulators allow the operator to make direct contact with its components, virtual-reality simulators allow the operator to interact with an environment that is computer generated. This article gives an overview of the various types of physical simulators and virtual-reality simulators used in otolaryngology that have been reported in the literature. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Drape simulation and subjective assessment of virtual drape

    NASA Astrophysics Data System (ADS)

    Buyukaslan, E.; Kalaoglu, F.; Jevsnik, S.

    2017-10-01

    In this study, a commercial 3D virtual garment simulation software package (Optitex) is used to simulate the drape behaviour of five different fabrics. Mechanical properties of the selected fabrics are measured by the Fabric Assurance by Simple Testing (FAST) method. The measured bending, shear, and extension properties are entered into the simulation software to achieve more realistic simulations. Simulation images of the fabrics were shown to 27 observers, who were asked to match real drape images of the fabrics with the simulated drape images. The simulations of two fabrics were correctly matched by the majority of the test group; however, the simulations of the other three fabrics were mismatched by most observers.

  17. Simulation verification techniques study. Subsystem simulation validation techniques

    NASA Technical Reports Server (NTRS)

    Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.

    1974-01-01

    Techniques for validation of software modules which simulate spacecraft onboard systems are discussed. An overview of the simulation software hierarchy for a shuttle mission simulator is provided. A set of guidelines for the identification of subsystem/module performance parameters and critical performance parameters are presented. Various sources of reference data to serve as standards of performance for simulation validation are identified. Environment, crew station, vehicle configuration, and vehicle dynamics simulation software are briefly discussed from the point of view of their interfaces with subsystem simulation modules. A detailed presentation of results in the area of vehicle subsystems simulation modules is included. A list of references, conclusions and recommendations are also given.

  18. Desktop Modeling and Simulation: Parsimonious, yet Effective Discrete-Event Simulation Analysis

    NASA Technical Reports Server (NTRS)

    Bradley, James R.

    2012-01-01

    This paper evaluates how quickly students can be trained to construct useful discrete-event simulation models using Excel. The typical supply chain used by many large national retailers is described, and an Excel-based simulation model of it is constructed. The set of programming and simulation skills required for development of that model is then determined; we conclude that six hours of training are required to teach the skills to MBA students. The simulation presented here contains all fundamental functionality of a simulation model, and so our result holds for any discrete-event simulation model. We argue, therefore, that industry workers with the same technical skill set as students who have completed one year in an MBA program can be quickly trained to construct simulation models. This result gives credence to the efficacy of Desktop Modeling and Simulation, whereby simulation analyses can be quickly developed, run, and analyzed with widely available software, namely Excel.
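The "fundamental functionality" of a discrete-event simulation model referred to above (a future-event list, a simulation clock, and state updated event by event) can be sketched in a few lines. This hypothetical single-server queue is written in Python rather than Excel, but it exercises the same mechanics.

```python
import heapq
import random

def mm1_simulation(arrival_rate, service_rate, horizon, seed=1):
    """Minimal discrete-event simulation of a single-server (M/M/1) queue:
    future-event list (heap), simulation clock, and per-event state updates."""
    random.seed(seed)
    events = [(random.expovariate(arrival_rate), "arrival")]  # (time, kind)
    clock, queue_len, busy, served = 0.0, 0, False, 0
    while events:
        clock, kind = heapq.heappop(events)
        if clock > horizon:
            break
        if kind == "arrival":
            # schedule the next arrival, then seize or queue for the server
            heapq.heappush(events, (clock + random.expovariate(arrival_rate), "arrival"))
            if busy:
                queue_len += 1
            else:
                busy = True
                heapq.heappush(events, (clock + random.expovariate(service_rate), "departure"))
        else:  # departure: free the server or start the next waiting customer
            served += 1
            if queue_len > 0:
                queue_len -= 1
                heapq.heappush(events, (clock + random.expovariate(service_rate), "departure"))
            else:
                busy = False
    return served

print(mm1_simulation(arrival_rate=0.8, service_rate=1.0, horizon=1000.0))
```

The same event-calendar logic is what an Excel-based model reproduces with formulas and columns.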

  19. Solving search problems by strongly simulating quantum circuits

    PubMed Central

    Johnson, T. H.; Biamonte, J. D.; Clark, S. R.; Jaksch, D.

    2013-01-01

    Simulating quantum circuits using classical computers lets us analyse the inner workings of quantum algorithms. The most complete type of simulation, strong simulation, is believed to be generally inefficient. Nevertheless, several efficient strong simulation techniques are known for restricted families of quantum circuits and we develop an additional technique in this article. Further, we show that strong simulation algorithms perform another fundamental task: solving search problems. Efficient strong simulation techniques allow solutions to a class of search problems to be counted and found efficiently. This enhances the utility of strong simulation methods, known or yet to be discovered, and extends the class of search problems known to be efficiently simulable. Relating strong simulation to search problems also bounds the computational power of efficiently strongly simulable circuits; if they could solve all problems in P this would imply that all problems in NP and #P could be solved in polynomial time. PMID:23390585
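A minimal sketch of what "strong" simulation means here: tracking the full state vector of a small circuit so that every outcome amplitude, and hence a count of satisfying outcomes, is available. The two-qubit Bell circuit below is a toy stand-in for the restricted circuit families the paper treats.

```python
import numpy as np

# Toy "strong" simulation: track the full state vector of a 2-qubit circuit,
# so every outcome amplitude is available and solutions can be counted.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.zeros(4)
state[0] = 1.0                                   # start in |00>
state = np.kron(H, np.eye(2)) @ state            # H on the first qubit
state = CNOT @ state                             # entangle -> Bell state

probs = np.abs(state) ** 2
# Count basis states ("search solutions") with non-negligible probability.
solutions = [i for i, p in enumerate(probs) if p > 1e-12]
print(solutions)   # the Bell state supports outcomes 00 and 11
```

A weak simulator would only sample outcomes; having the amplitudes themselves is what lets solutions be counted as well as found.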

  20. Rapidly Re-Configurable Flight Simulator Tools for Crew Vehicle Integration Research and Design

    NASA Technical Reports Server (NTRS)

    Schutte, Paul C.; Trujillo, Anna; Pritchett, Amy R.

    2000-01-01

    While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plug-in' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).
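The registration-and-communication core described above can be sketched as follows; the class and method names are hypothetical illustrations of the pattern, not the RFS API.

```python
# Hypothetical sketch of the plug-in pattern the RFS architecture describes:
# a core that defines an interface standard, registers components, and
# drives communication between them each simulation step.
class SimComponent:
    def init(self, core): ...
    def step(self, dt): ...

class SimulatorCore:
    def __init__(self):
        self.components = {}

    def register(self, name, component: SimComponent):
        """Register a plug-in component and let it initialize against the core."""
        self.components[name] = component
        component.init(self)

    def run(self, steps, dt=0.02):
        """Advance every registered component through each time step."""
        for _ in range(steps):
            for c in self.components.values():
                c.step(dt)

class Clock(SimComponent):
    def init(self, core): self.t = 0.0
    def step(self, dt): self.t += dt

core = SimulatorCore()
core.register("clock", Clock())
core.run(steps=50)
print(round(core.components["clock"].t, 2))   # 1.0
```

In the real architecture each component is a pre-compiled library module, so components can be developed and shared independently of the core.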

  1. Rapidly Re-Configurable Flight Simulator Tools for Crew Vehicle Integration Research and Design

    NASA Technical Reports Server (NTRS)

    Pritchett, Amy R.

    2002-01-01

    While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plugin' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).

  2. Requirements and Techniques for Developing and Measuring Simulant Materials

    NASA Technical Reports Server (NTRS)

    Rickman, Doug; Owens, Charles; Howard, Rick

    2006-01-01

    The 1989 workshop report Workshop on Production and Uses of Simulated Lunar Materials and the NASA Technical Publication Lunar Regolith Simulant Materials: Recommendations for Standardization, Production, and Usage identified and reinforced the need for a set of standards and requirements for the production and usage of lunar simulant materials. As NASA prepares to return to the Moon, a set of requirements has been developed for simulant materials, and methods to produce and measure those simulants have been defined. Addressed in the requirements document are: 1) a method for evaluating the quality of any simulant of a regolith, 2) the minimum characteristics for simulants of lunar regolith, and 3) a method to produce the lunar regolith simulants needed for NASA's exploration mission. A method to evaluate new and current simulants has also been rigorously defined through the mathematics of Figures of Merit (FoM), a concept new to simulant development. A single FoM is conceptually an algorithm defining a single characteristic of a simulant, providing a clear comparison of that characteristic between the simulant and a reference material. Included as an intrinsic part of the algorithm is a minimum acceptable performance for the characteristic of interest. The algorithms for the FoM for Standard Lunar Regolith Simulants are also explicitly keyed to a recommended method for making lunar simulants.
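In the spirit of the Figures of Merit described above, here is an illustrative FoM for one characteristic (particle-size distribution), with a minimum acceptable performance built into the algorithm. The overlap measure, size bins, and threshold are hypothetical stand-ins, not the official FoM definitions.

```python
# Illustrative figure-of-merit (FoM): score one characteristic of a simulant
# against a reference material, with a minimum acceptable performance.
# The data, bins, and threshold below are hypothetical.
def figure_of_merit(simulant, reference, minimum=0.8):
    """Overlap of two normalized particle-size distributions, in [0, 1]."""
    total_s, total_r = sum(simulant.values()), sum(reference.values())
    score = sum(min(simulant.get(b, 0) / total_s, reference.get(b, 0) / total_r)
                for b in set(simulant) | set(reference))
    return score, score >= minimum

reference = {"<45um": 30, "45-250um": 50, ">250um": 20}   # reference regolith
simulant  = {"<45um": 25, "45-250um": 55, ">250um": 20}   # candidate simulant

score, acceptable = figure_of_merit(simulant, reference)
print(round(score, 2), acceptable)
```

A full FoM suite would score each tracked characteristic (size, shape, density, composition) the same way against the reference material.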

  3. The internal validity of arthroscopic simulators and their effectiveness in arthroscopic education.

    PubMed

    Slade Shantz, Jesse Alan; Leiter, Jeff R S; Gottschalk, Tania; MacDonald, Peter Benjamin

    2014-01-01

    The purpose of this systematic review was to identify standard procedures for the validation of arthroscopic simulators and to determine whether simulators improve the surgical skills of users. Arthroscopic simulator validation studies and randomized trials assessing the effectiveness of arthroscopic simulators in education were identified from online databases, as well as grey literature and reference lists. Only validation studies and randomized trials were included for review. Study heterogeneity was calculated and, where appropriate, study results were combined employing a random effects model. Four hundred and thirteen studies were reviewed. Thirteen studies met the inclusion criteria assessing the construct validity of simulators. A pooled analysis of internal validation studies determined that simulators could discriminate between novices and experts, but not between novice and intermediate trainees, on time to completion of a simulated task. Only one study assessed the utility of a knee simulator in training arthroscopic skills directly, and it demonstrated that the skill level of simulator-trained residents was greater than that of non-simulator-trained residents. Too much heterogeneity exists in the literature to determine the internal and transfer validity of the arthroscopic simulators currently available. Evidence suggests that simulators can discriminate between novice and expert users, but discrimination between novice and intermediate trainees should be paramount in surgical education. International standards for the assessment of arthroscopic simulator validity should be developed to increase the use and effectiveness of simulators in orthopedic surgery.
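The random effects pooling step mentioned above can be sketched as a DerSimonian-Laird estimate. The effect sizes and variances below are invented for illustration, not taken from the review.

```python
import math

# Sketch of random-effects pooling (DerSimonian-Laird). The effect sizes
# and within-study variances below are made up for illustration.
def random_effects(effects, variances):
    w = [1 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    w_re = [1 / (v + tau2) for v in variances]           # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return pooled, se

pooled, se = random_effects([0.6, 1.2, 0.2], [0.04, 0.09, 0.06])
print(round(pooled, 3), round(se, 3))
```

When the between-study variance estimate is zero, the pooled estimate reduces to the fixed-effect value; otherwise heterogeneous studies are down-weighted, which is why heterogeneity matters so much in a review like this one.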

  4. Evaluating best educational practices, student satisfaction, and self-confidence in simulation: A descriptive study.

    PubMed

    Zapko, Karen A; Ferranto, Mary Lou Gemma; Blasiman, Rachael; Shelestak, Debra

    2018-01-01

    The National League for Nursing (NLN) has endorsed simulation as a necessary teaching approach to prepare students for the demanding role of professional nursing. Questions arise about the suitability of simulation experiences to educate students. Empirical support for the effect of simulation on patient outcomes is sparse. Most studies on simulation report only anecdotal results rather than data obtained using evaluative tools. The aim of this study was to examine student perception of best educational practices in simulation and to evaluate their satisfaction and self-confidence in simulation. This study was a descriptive study designed to explore students' perceptions of the simulation experience over a two-year period. Using the Jeffries framework, a Simulation Day was designed consisting of serial patient simulations using high and medium fidelity simulators and live patient actors. The setting for the study was a regional campus of a large Midwestern Research 2 university. The convenience sample consisted of 199 participants and included sophomore, junior, and senior nursing students enrolled in the baccalaureate nursing program. The Simulation Days consisted of serial patient simulations using high and medium fidelity simulators and live patient actors. Participants rotated through four scenarios that corresponded to their level in the nursing program. Data was collected in two consecutive years. Participants completed both the Educational Practices Questionnaire (Student Version) and the Student Satisfaction and Self-Confidence in Learning Scale. Results provide strong support for using serial simulation as a learning tool. Students were satisfied with the experience, felt confident in their performance, and felt the simulations were based on sound educational practices and were important for learning. Serial simulations and having students experience simulations more than once in consecutive years is a valuable method of clinical instruction. 
When conducted well, simulations can lead to increased student satisfaction and self-confidence. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Prospects for Simulation and Gaming in Mathematics and Science Education

    ERIC Educational Resources Information Center

    Bloomer, Jacquetta

    1974-01-01

    The growth and potential of simulation and gaming techniques are examined in pure science, applied science and mathematics. The contribution of simulations, simulation games and non-simulation games are separately assessed with selective illustrations; in particular, indications for using simulated, as opposed to "live," experiments in science…

  6. A Simbol-X Event Simulator

    NASA Astrophysics Data System (ADS)

    Puccetti, S.; Fiore, F.; Giommi, P.

    2009-05-01

    The ASI Science Data Center (ASDC) has developed an X-ray event simulator to support users (and team members) in simulating data taken with the two cameras on board the Simbol-X X-Ray Telescope. The Simbol-X simulator is very fast and flexible compared to ray-tracing simulators. These properties make it well suited to supporting users in planning proposals, comparing real data with theoretical expectations, and quickly detecting unexpected features. We present here the simulator outline and a few examples of simulated data.

  7. Higher-level simulations of turbulent flows

    NASA Technical Reports Server (NTRS)

    Ferziger, J. H.

    1981-01-01

    The fundamentals of large eddy simulation are considered and the approaches to it are compared. Subgrid scale models and the development of models for the Reynolds-averaged equations are discussed as well as the use of full simulation in testing these models. Numerical methods used in simulating large eddies, the simulation of homogeneous flows, and results from full and large scale eddy simulations of such flows are examined. Free shear flows are considered with emphasis on the mixing layer and wake simulation. Wall-bounded flow (channel flow) and recent work on the boundary layer are also discussed. Applications of large eddy simulation and full simulation in meteorological and environmental contexts are included along with a look at the direction in which work is proceeding and what can be expected from higher-level simulation in the future.
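The subgrid-scale modeling problem referred to above is commonly summarized by the filtered momentum equation together with the classic Smagorinsky eddy-viscosity closure, shown here in standard textbook notation for reference (not as a result specific to this report):

```latex
% Filtered momentum equation with the classic Smagorinsky subgrid-scale
% (SGS) closure, in standard notation (textbook form, for reference).
\begin{align}
\frac{\partial \bar{u}_i}{\partial t}
  + \frac{\partial}{\partial x_j}\bigl(\bar{u}_i \bar{u}_j\bigr)
  &= -\frac{1}{\rho}\frac{\partial \bar{p}}{\partial x_i}
     + \nu\,\nabla^2 \bar{u}_i
     - \frac{\partial \tau_{ij}}{\partial x_j}, \\
\tau_{ij} - \tfrac{1}{3}\tau_{kk}\,\delta_{ij}
  &= -2\,\nu_t\,\bar{S}_{ij}, \qquad
     \nu_t = (C_s \Delta)^2\,\lvert\bar{S}\rvert, \\
\bar{S}_{ij}
  &= \tfrac{1}{2}\!\left(\frac{\partial \bar{u}_i}{\partial x_j}
     + \frac{\partial \bar{u}_j}{\partial x_i}\right), \qquad
     \lvert\bar{S}\rvert = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}}
\end{align}
```

Here the overbar denotes the grid filter, $\tau_{ij}$ the subgrid stress, $\Delta$ the filter width, and $C_s$ the Smagorinsky constant; full ("direct") simulation resolves all scales and needs no $\tau_{ij}$ model.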

  8. Low Fidelity Simulation of a Zero-G Robot

    NASA Technical Reports Server (NTRS)

    Sweet, Adam

    2001-01-01

    The item to be cleared is a low-fidelity software simulation model of a hypothetical free-flying robot designed for use in zero-gravity environments. This simulation model works with the HCC simulation system that was developed by Xerox PARC and NASA Ames Research Center. HCC has previously been cleared for distribution. When used with the HCC software, the model computes the location and orientation of the simulated robot over time. Failures (such as a broken motor) can be injected into the simulation to produce simulated behavior corresponding to the failure. Release of this simulation will allow researchers to test their software diagnosis systems by attempting to diagnose the simulated failure from the simulated behavior. This model does not contain any encryption software, nor can it perform any control tasks that might be export controlled.

  9. Development Issues for Lunar Regolith Simulants

    NASA Technical Reports Server (NTRS)

    Rickman, Doug; Carpenter, Paul; Sibille, Laurent; Owens, Charles; French, Raymond; McLemore, Carole

    2006-01-01

    Significant challenges and logistical issues exist for the development of standardized lunar regolith simulant (SLRS) materials for use in the development and testing of flight hardware for upcoming NASA lunar missions. A production program at Marshall Space Flight Center (MSFC) for the deployment of the lunar mare basalt simulant JSC-1A is underway. Root simulants have been proposed for the development of a low-Ti mare basalt simulant and a high-Ca highland anorthosite simulant, as part of the framework of simulant development outlined in the 2005 Lunar Regolith Simulant Materials Workshop held at MSFC. Many of the recommendations for production and standardization of simulants have already been documented by the MSFC team, but a number of unanswered questions related to geology need to be addressed prior to the creation of the simulants.

  10. Residents’ perceptions of simulation as a clinical learning approach

    PubMed Central

    Walsh, Catharine M.; Garg, Ankit; Ng, Stella L.; Goyal, Fenny; Grover, Samir C.

    2017-01-01

    Background Simulation is increasingly being integrated into medical education; however, there is little research into trainees’ perceptions of this learning modality. We elicited trainees’ perceptions of simulation-based learning, to inform how simulation is developed and applied to support training. Methods We conducted an instrumental qualitative case study entailing 36 semi-structured one-hour interviews with 12 residents enrolled in an introductory simulation-based course. Trainees were interviewed at three time points: pre-course, post-course, and 4–6 weeks later. Interview transcripts were analyzed using a qualitative descriptive analytic approach. Results Residents’ perceptions of simulation included: 1) simulation serves pragmatic purposes; 2) simulation provides a safe space; 3) simulation presents perils and pitfalls; and 4) optimal design for simulation: integration and tension. Key findings included residents’ markedly narrow perception of simulation’s capacity to support non-technical skills development or its use beyond introductory learning. Conclusion Trainees’ learning expectations of simulation were restricted. Educators should critically attend to the way they present simulation to learners as, based on theories of problem-framing, trainees’ a priori perceptions may delimit the focus of their learning experiences. If they view simulation as merely a replica of real cases for the purpose of practicing basic skills, they may fail to benefit from the full scope of learning opportunities afforded by simulation. PMID:28344719

  11. 2007 Lunar Regolith Simulant Workshop Overview

    NASA Technical Reports Server (NTRS)

    McLemore, Carole A.; Fikes, John C.; Howell, Joe T.

    2007-01-01

    The National Aeronautics and Space Administration (NASA) vision has as a cornerstone, the establishment of an Outpost on the Moon. This Lunar Outpost will eventually provide the necessary planning, technology development, and training for a manned mission to Mars in the future. As part of the overall activity, NASA is conducting Earth-based research and advancing technologies to a Technology Readiness Level (TRL) 6 maturity under the Exploration Technology Development Program that will be incorporated into the Constellation Project as well as other projects. All aspects of the Lunar environment, including the Lunar regolith and its properties, are important in understanding the long-term impacts to hardware, scientific instruments, and humans prior to returning to the Moon and living on the Moon. With the goal of reducing risk to humans and hardware and increasing mission success on the Lunar surface, it is vital that terrestrial investigations including both development and verification testing have access to Lunar-like environments. The Marshall Space Flight Center (MSFC) is supporting this endeavor by developing, characterizing, and producing Lunar simulants in addition to analyzing existing simulants for appropriate applications. A Lunar Regolith Simulant Workshop was conducted by MSFC in Huntsville, Alabama, in October 2007. The purpose of the Workshop was to bring together simulant developers, simulant users, and program and project managers from ETDP and Constellation with the goals of understanding users' simulant needs and their applications. A status of current simulant developments such as the JSC-1A (Mare Type Simulant) and the NASA/U.S. Geological Survey Lunar Highlands-Type Pilot Simulant (NU-LHT-1M) was provided. The method for evaluating simulants, performed via Figures of Merit (FoMs) algorithms, was presented and a demonstration was provided. The four FoM properties currently being assessed are: size, shape, density, and composition. 
Some of the Workshop findings include: simulant developers must understand simulant users' needs and applications; higher fidelity simulants are needed and needed in larger quantities now; simulants must be characterized to allow "apples-to-apples" comparison of test results; simulant users should confer with simulant experts to assist them in the selection of simulants; safety precautions should be taken in the handling and use of simulants; shipping, storing, and preparation of simulants have important implications; and most importantly, close communications among the simulant community must be maintained and will be continued via telecoms, meetings, and an annual Lunar Regolith Simulant Workshop.

  13. Enabling parallel simulation of large-scale HPC network systems

    DOE PAGES

    Mubarak, Misbah; Carothers, Christopher D.; Ross, Robert B.; ...

    2016-04-07

    Here, with the increasing complexity of today’s high-performance computing (HPC) architectures, simulation has become an indispensable tool for exploring the design space of HPC systems—in particular, networks. In order to make effective design decisions, simulations of these systems must possess the following properties: (1) have high accuracy and fidelity, (2) produce results in a timely manner, and (3) be able to analyze a broad range of network workloads. Most state-of-the-art HPC network simulation frameworks, however, are constrained in one or more of these areas. In this work, we present a simulation framework for modeling two important classes of networks used in today’s IBM and Cray supercomputers: torus and dragonfly networks. We use the Co-Design of Multi-layer Exascale Storage Architecture (CODES) simulation framework to simulate these network topologies at flit-level detail using the Rensselaer Optimistic Simulation System (ROSS) for parallel discrete-event simulation. Our simulation framework meets all the requirements of a practical network simulation and can assist network designers in design space exploration. First, it uses validated and detailed flit-level network models to provide an accurate and high-fidelity network simulation. Second, instead of relying on serial time-stepped or traditional conservative discrete-event simulations that limit simulation scalability and efficiency, we use the optimistic event-scheduling capability of ROSS to achieve efficient and scalable HPC network simulations on today’s high-performance cluster systems. Third, our models give network designers a choice in simulating a broad range of network workloads, including HPC application workloads using detailed network traces, an ability that is rarely offered in parallel with high-fidelity network simulations.
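The optimistic event-scheduling idea credited to ROSS can be illustrated with a toy Time Warp-style logical process: execute events speculatively, keep state snapshots, and roll back and re-execute when a "straggler" event arrives out of timestamp order. This is a pedagogical sketch of the mechanism, not the ROSS implementation.

```python
class LogicalProcess:
    """Toy Time Warp-style logical process: speculative execution with
    state saving and rollback, as in optimistic discrete-event simulation."""

    def __init__(self):
        self.state = 0          # event counter (stands in for real LP state)
        self.clock = 0.0        # local virtual time
        self.history = []       # (timestamp, state_before) snapshots
        self.rollbacks = 0

    def _execute(self, ts):
        self.history.append((ts, self.state))
        self.state += 1
        self.clock = ts

    def process(self, ts):
        if ts >= self.clock:                   # in-order event: just run it
            self._execute(ts)
            return
        self.rollbacks += 1                    # straggler: roll back past ts
        undone = []
        while self.history and self.history[-1][0] >= ts:
            t, saved_state = self.history.pop()
            undone.append(t)
            self.state = saved_state
        self.clock = self.history[-1][0] if self.history else 0.0
        for t in sorted(undone + [ts]):        # re-execute in timestamp order
            self._execute(t)

lp = LogicalProcess()
for ts in [1.0, 2.0, 4.0, 3.0]:                # 3.0 arrives after 4.0 ran
    lp.process(ts)
print(lp.state, lp.rollbacks)                  # 4 events applied, 1 rollback
```

Conservative schedulers avoid the rollback machinery by never executing an event until it is provably safe, which is exactly the serialization bottleneck the optimistic approach removes.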

  15. A typology of educationally focused medical simulation tools.

    PubMed

    Alinier, Guillaume

    2007-10-01

    The concept of simulation as an educational tool in healthcare is not a new idea, but its use has blossomed over the last few years. This enthusiasm is partly driven by an attempt to increase patient safety and also because the technology is becoming more affordable and advanced. Simulation is becoming more commonly used for initial training purposes as well as for continuing professional development, but people often have very different perceptions of the definition of the term simulation, especially in an educational context. This highlights the need for a clear classification not only of the technology available but also of the method and teaching approach employed. The aims of this paper are to discuss the current range of simulation approaches and propose a clear typology of simulation teaching aids. Commonly used simulation techniques have been identified and discussed in order to create a classification that reports simulation techniques, their usual mode of delivery, the skills they can address, the facilities required, their typical use, and their pros and cons. This paper presents a clear classification scheme of educational simulation tools and techniques with six different technological levels. They are respectively: written simulations, three-dimensional models, screen-based simulators, standardized patients, intermediate fidelity patient simulators, and interactive patient simulators. This typology allows the accurate description of the simulation technology and the teaching methods applied. Thus, valid comparisons of educational tools can be made as to their potential effectiveness and verisimilitude at different training stages. The proposed typology of simulation methodologies available for educational purposes provides a helpful guide for educators and participants which should help them to realise the potential learning outcomes at different technological simulation levels in relation to the training approach employed. It should also be a useful resource for simulation users who are trying to improve their educational practice.

  16. Auditory perceptual simulation: Simulating speech rates or accents?

    PubMed

    Zhou, Peiyun; Christianson, Kiel

    2016-07-01

    When readers engage in Auditory Perceptual Simulation (APS) during silent reading, they mentally simulate characteristics of voices attributed to a particular speaker or a character depicted in the text. Previous research found that auditory perceptual simulation of a faster native English speaker during silent reading led to shorter reading times than auditory perceptual simulation of a slower non-native English speaker. Yet, it was uncertain whether this difference was triggered by the different speech rates of the speakers, or by the difficulty of simulating an unfamiliar accent. The current study investigates this question by comparing faster Indian-English speech and slower American-English speech in the auditory perceptual simulation paradigm. Analyses of reading times of individual words and the full sentence reveal that the auditory perceptual simulation effect again modulated reading rate, and auditory perceptual simulation of the faster Indian-English speech led to faster reading rates compared to auditory perceptual simulation of the slower American-English speech. The comparison between this experiment and the data from Zhou and Christianson (2016) demonstrates further that the "speakers'" speech rates, rather than the difficulty of simulating a non-native accent, are the primary mechanism underlying auditory perceptual simulation effects. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Automated numerical simulation of biological pattern formation based on visual feedback simulation framework

    PubMed Central

    Sun, Mingzhu; Xu, Hui; Zeng, Xingjuan; Zhao, Xin

    2017-01-01

    There are various fantastic biological phenomena in biological pattern formation. Mathematical modeling using reaction-diffusion partial differential equation systems is employed to study the mechanism of pattern formation. However, model parameter selection is both difficult and time consuming. In this paper, a visual feedback simulation framework is proposed to calculate the parameters of a mathematical model automatically based on the basic principle of feedback control. In the simulation framework, the simulation results are visualized, and the image features are extracted as the system feedback. Then, the unknown model parameters are obtained by comparing the image features of the simulation image and the target biological pattern. Considering two typical applications, the visual feedback simulation framework is applied to fulfill pattern formation simulations for vascular mesenchymal cells and lung development. In the simulation framework, the spot, stripe, labyrinthine patterns of vascular mesenchymal cells, the normal branching pattern and the branching pattern lacking side branching for lung branching are obtained in a finite number of iterations. The simulation results indicate that it is easy to achieve the simulation targets, especially when the simulation patterns are sensitive to the model parameters. Moreover, this simulation framework can expand to other types of biological pattern formation. PMID:28225811
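
    The feedback loop described above (simulate, extract an image feature, compare it with the target, adjust the parameters) can be sketched as follows. This is a minimal illustration, not the authors' code: the Gray-Scott reaction-diffusion model stands in for their PDE system, and the pattern-coverage feature and proportional gain are invented for the sketch.

```python
import numpy as np

def laplacian(u):
    # 5-point stencil with periodic boundaries
    return (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
            np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)

def gray_scott(f, k, steps=2000, n=64, seed=0):
    # Simulate the Gray-Scott reaction-diffusion system and return
    # the final activator field v.
    rng = np.random.default_rng(seed)
    u = np.ones((n, n))
    v = np.zeros((n, n))
    c = n // 2
    u[c-5:c+5, c-5:c+5] = 0.5
    v[c-5:c+5, c-5:c+5] = 0.25
    v += 0.01 * rng.random((n, n))
    du, dv, dt = 0.16, 0.08, 1.0
    for _ in range(steps):
        uvv = u * v * v
        u += dt * (du * laplacian(u) - uvv + f * (1 - u))
        v += dt * (dv * laplacian(v) + uvv - (f + k) * v)
    return v

def fit_parameter(target_coverage, f=0.035, k=0.065, iters=8):
    # Visual-feedback loop: extract one image feature (fraction of the
    # domain covered by pattern) and nudge the feed rate f toward the target.
    for _ in range(iters):
        v = gray_scott(f, k)
        coverage = float((v > 0.2).mean())   # crude image feature
        error = target_coverage - coverage
        f += 0.005 * error                   # proportional feedback update
    return f, coverage

f_fit, cov = fit_parameter(target_coverage=0.15)
print(f"fitted f = {f_fit:.4f}, coverage = {cov:.3f}")
```

    In the paper the feature extraction is image-based and more sophisticated, but the control structure is the same: the simulated pattern is reduced to features, and the error against the target pattern drives the parameter update.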

  18. Collaborative simulation method with spatiotemporal synchronization process control

    NASA Astrophysics Data System (ADS)

    Zou, Yisheng; Ding, Guofu; Zhang, Weihua; Zhang, Jian; Qin, Shengfeng; Tan, John Kian

    2016-10-01

    When designing a complex mechatronics system, such as high speed trains, it is relatively difficult to effectively simulate the entire system's dynamic behaviors because it involves multi-disciplinary subsystems. Currently, the most practical approach to multi-disciplinary simulation is the interface-based coupling simulation method, but it faces a twofold challenge: spatial and temporal unsynchronization among the multi-directional coupled simulations of subsystems. A new collaborative simulation method with spatiotemporal synchronization process control is proposed for coupling simulation of a given complex mechatronics system across multiple subsystems on different platforms. The method consists of 1) a coupler-based coupling mechanism to define the interfacing and interaction mechanisms among subsystems, and 2) a simulation process control algorithm to realize the coupling simulation in a spatiotemporally synchronized manner. The test results from a case study show that the proposed method 1) can certainly be used to simulate the sub-systems' interactions under different simulation conditions in an engineering system, and 2) effectively supports multi-directional coupling simulation among multi-disciplinary subsystems. This method has been successfully applied in China's high speed train design and development processes, demonstrating that it can be applied to a wide range of engineering systems design and simulation with improved efficiency and effectiveness.
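
    A minimal sketch of the coupler idea, assuming two toy first-order subsystems: each solver sub-cycles at its own internal step, while the coupler exchanges interface values only at synchronized macro time steps. The dynamics and coefficients are illustrative, not taken from the paper.

```python
class Subsystem:
    """Toy solver that integrates dx/dt = a*x + b*u with its own inner step."""
    def __init__(self, a, b, x0, inner_dt):
        self.a, self.b, self.x, self.inner_dt = a, b, x0, inner_dt

    def advance(self, macro_dt, u):
        # Sub-cycle with the subsystem's own (smaller) time step, holding
        # the coupling input u constant over the macro step.
        t = 0.0
        while t < macro_dt - 1e-12:
            dt = min(self.inner_dt, macro_dt - t)
            self.x += dt * (self.a * self.x + self.b * u)
            t += dt
        return self.x

def co_simulate(t_end=1.0, macro_dt=0.1):
    # Coupler: exchanges interface variables at synchronized macro steps,
    # so both subsystems always see inputs from the same simulation time.
    s1 = Subsystem(a=-1.0, b=0.5, x0=1.0, inner_dt=0.01)
    s2 = Subsystem(a=-2.0, b=0.3, x0=0.0, inner_dt=0.02)
    y1, y2 = s1.x, s2.x
    t = 0.0
    while t < t_end - 1e-12:
        # Freeze the exchanged values, then advance both subsystems.
        u1, u2 = y2, y1
        y1 = s1.advance(macro_dt, u1)
        y2 = s2.advance(macro_dt, u2)
        t += macro_dt
    return y1, y2

y1, y2 = co_simulate()
print(round(y1, 4), round(y2, 4))
```

    The key property is that the exchanged values are frozen at each macro step, so neither subsystem ever consumes an input from a different simulation time, which is the "temporal synchronization" half of the problem the paper addresses.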

  19. Improving the result of forecasting using reservoir and surface network simulation

    NASA Astrophysics Data System (ADS)

    Hendri, R. S.; Winarta, J.

    2018-01-01

    This study aimed to obtain more representative production forecasts using integrated simulation of the pipeline gathering system of the X field. Five main scenarios were considered, covering production forecasts for the existing condition, workovers, and infill drilling, from which the best development scenario was determined. The method couples a reservoir simulator with a pipeline simulator, a so-called Integrated Reservoir and Surface Network Simulation. Well data from the reservoir simulator were integrated with the pipeline network simulator to construct a new schedule, which served as input for the whole simulation procedure. Well designs were produced with a well modeling simulator and then exported into the pipeline simulator. In stand-alone reservoir prediction, each well is constrained by a minimum Tubing Head Pressure (THP), and the pressure drop in the gathering network is not explicitly calculated; the same scenarios were therefore also run as single-reservoir simulations for comparison. The integrated simulation produces results closer to the actual reservoir condition, as confirmed by the THP profiles, which differ between the two methods; the difference between the integrated and single-model simulations is 6-9%. The aim of solving the back-pressure problem in the pipeline gathering system of the X field was achieved.
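
    The back-and-forth between reservoir deliverability and gathering-network back-pressure can be illustrated as a fixed-point iteration. The deliverability and pressure-drop relations below are deliberately simplified stand-ins (not the simulators used in the study), chosen only to show why the integrated THP differs from a fixed-THP single-reservoir run.

```python
def reservoir_rate(thp, pr=3000.0, j=0.8):
    # Simplified deliverability: rate proportional to the drawdown between
    # reservoir pressure and tubing-head pressure (illustrative IPR).
    return max(j * (pr - thp), 0.0)

def pipeline_thp(rate, sep_p=500.0, c=2e-4):
    # Simplified gathering-network back-pressure: separator pressure plus
    # a rate-dependent friction loss (illustrative correlation).
    return sep_p + c * rate ** 2

def integrated_solve(thp0=500.0, iters=50, tol=1e-6):
    # Fixed-point iteration between the two models: the reservoir sets the
    # rate from THP, the network sets THP back from the rate.
    thp = thp0
    for _ in range(iters):
        q = reservoir_rate(thp)
        new_thp = pipeline_thp(q)
        if abs(new_thp - thp) < tol:
            break
        thp = 0.5 * thp + 0.5 * new_thp   # damping for stability
    return q, thp

q, thp = integrated_solve()
print(f"rate = {q:.1f}, converged THP = {thp:.1f}")
```

    A single-reservoir run would keep THP fixed at the assumed minimum (here 500) and overestimate the rate; the integrated solve converges to a higher THP that honors the network back-pressure, which is the source of the 6-9% difference the abstract reports in spirit.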

  20. Automated numerical simulation of biological pattern formation based on visual feedback simulation framework.

    PubMed

    Sun, Mingzhu; Xu, Hui; Zeng, Xingjuan; Zhao, Xin

    2017-01-01

    There are various fantastic biological phenomena in biological pattern formation. Mathematical modeling using reaction-diffusion partial differential equation systems is employed to study the mechanism of pattern formation. However, model parameter selection is both difficult and time consuming. In this paper, a visual feedback simulation framework is proposed to calculate the parameters of a mathematical model automatically based on the basic principle of feedback control. In the simulation framework, the simulation results are visualized, and the image features are extracted as the system feedback. Then, the unknown model parameters are obtained by comparing the image features of the simulation image and the target biological pattern. Considering two typical applications, the visual feedback simulation framework is applied to fulfill pattern formation simulations for vascular mesenchymal cells and lung development. In the simulation framework, the spot, stripe, labyrinthine patterns of vascular mesenchymal cells, the normal branching pattern and the branching pattern lacking side branching for lung branching are obtained in a finite number of iterations. The simulation results indicate that it is easy to achieve the simulation targets, especially when the simulation patterns are sensitive to the model parameters. Moreover, this simulation framework can expand to other types of biological pattern formation.

  1. Workshop on data acquisition and trigger system simulations for high energy physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1992-12-31

    This report discusses the following topics: DAQSIM: A data acquisition system simulation tool; Front end and DCC Simulations for the SDC Straw Tube System; Simulation of Non-Blocking Data Acquisition Architectures; Simulation Studies of the SDC Data Collection Chip; Correlation Studies of the Data Collection Circuit & The Design of a Queue for this Circuit; Fast Data Compression & Transmission from a Silicon Strip Wafer; Simulation of SCI Protocols in Modsim; Visual Design with vVHDL; Stochastic Simulation of Asynchronous Buffers; SDC Trigger Simulations; Trigger Rates, DAQ & Online Processing at the SSC; Planned Enhancements to MODSIM II & SIMOBJECT -- an Overview; DAGAR -- A synthesis system; Proposed Silicon Compiler for Physics Applications; Timed-LOTOS in a PROLOG Environment: an Algebraic language for Simulation; Modeling and Simulation of an Event Builder for High Energy Physics Data Acquisition Systems; A Verilog Simulation for the CDF DAQ; Simulation to Design with Verilog; The DZero Data Acquisition System: Model and Measurements; DZero Trigger Level 1.5 Modeling; Strategies Optimizing Data Load in the DZero Triggers; Simulation of the DZero Level 2 Data Acquisition System; A Fast Method for Calculating DZero Level 1 Jet Trigger Properties and Physics Input to DAQ Studies.

  2. JASMINE simulator

    NASA Astrophysics Data System (ADS)

    Yamada, Yoshiyuki; Gouda, Naoteru; Yano, Taihei; Kobayashi, Yukiyasu; Tsujimoto, Takuji; Suganuma, Masahiro; Niwa, Yoshito; Sako, Nobutada; Hatsutori, Yoichi; Tanaka, Takashi

    2006-06-01

    We explain the simulation tools in the JASMINE project (the JASMINE simulator). The JASMINE project stands at the stage where its basic design will be determined in a few years. It is therefore very important to simulate the data stream generated by astrometric fields in JASMINE in order to support investigations into error budgets, sampling strategy, data compression, data analysis, scientific performances, etc. Of course, component simulations are needed, but total simulations which include all components from observation target to satellite system are also very important. We find that new software technologies, such as Object Oriented (OO) methodologies, are ideal tools for the simulation system of JASMINE (the JASMINE simulator). In this article, we explain the framework of the JASMINE simulator.

  3. Development and operation of a real-time simulation at the NASA Ames Vertical Motion Simulator

    NASA Technical Reports Server (NTRS)

    Sweeney, Christopher; Sheppard, Shirin; Chetelat, Monique

    1993-01-01

    The Vertical Motion Simulator (VMS) facility at the NASA Ames Research Center combines the largest vertical motion capability in the world with a flexible real-time operating system allowing research to be conducted quickly and effectively. Due to the diverse nature of the aircraft simulated and the large number of simulations conducted annually, the challenge for the simulation engineer is to develop an accurate real-time simulation in a timely, efficient manner. The SimLab facility and the software tools necessary for an operating simulation will be discussed. Subsequent sections will describe the development process through operation of the simulation; this includes acceptance of the model, validation, integration and production phases.

  4. SimZones: An Organizational Innovation for Simulation Programs and Centers.

    PubMed

    Roussin, Christopher J; Weinstock, Peter

    2017-08-01

    The complexity and volume of simulation-based learning programs have increased dramatically over the last decade, presenting several major challenges for those who lead and manage simulation programs and centers. The authors present five major issues affecting the organization of simulation programs: (1) supporting both single- and double-loop learning experiences; (2) managing the training of simulation teaching faculty; (3) optimizing the participant mix, including individuals, professional groups, teams, and other role-players, to ensure learning; (4) balancing in situ, node-based, and center-based simulation delivery; and (5) organizing simulation research and measuring value. They then introduce the SimZones innovation, a system of organization for simulation-based learning, and explain how it can alleviate the problems associated with these five issues. Simulations are divided into four zones (Zones 0-3). Zone 0 simulations include autofeedback exercises typically practiced by solitary learners, often using virtual simulation technology. Zone 1 simulations include hands-on instruction of foundational clinical skills. Zone 2 simulations include acute situational instruction, such as clinical mock codes. Zone 3 simulations involve authentic, native teams of participants and facilitate team and system development. The authors also discuss the translation of debriefing methods from Zone 3 simulations to real patient care settings (Zone 4), and they illustrate how the SimZones approach can enable the development of longitudinal learning systems in both teaching and nonteaching hospitals. The SimZones approach was initially developed in the context of the Boston Children's Hospital Simulator Program, which the authors use to illustrate this innovation in action.

  5. The Effects of Time Advance Mechanism on Simple Agent Behaviors in Combat Simulations

    DTIC Science & Technology

    2011-12-01

    modeling packages that illustrate the differences between discrete-time simulation (DTS) and discrete-event simulation (DES) methodologies. Many combat... DES models, often referred to as “next-event” (Law and Kelton 2000), or discrete-time simulation (DTS), commonly referred to as “time-step.” DTS... discrete-time simulation (DTS) and discrete-event simulation (DES) methodologies. Many combat models use DTS as their simulation time advance mechanism
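
    The distinction the report draws can be shown in a few lines: a next-event (DES) clock jumps exactly to each scheduled event time, while a time-step (DTS) clock only resolves events to fixed increments. The "combat" events below are invented for the sketch.

```python
import heapq

# Toy comparison of the two time-advance mechanisms: the same scheduled
# shot-impact events, processed under each clock discipline.
events = [(0.7, "A hits B"), (1.3, "B hits A"), (3.05, "A hits B")]

def run_des(events):
    # Discrete-event ("next-event") advance: jump straight to each event time.
    clock_samples = []
    heap = list(events)
    heapq.heapify(heap)
    while heap:
        t, _ = heapq.heappop(heap)
        clock_samples.append(t)          # clock lands exactly on event times
    return clock_samples

def run_dts(events, dt=0.5, t_end=3.5):
    # Discrete-time ("time-step") advance: fixed increments; events are
    # processed at the end of the step in which they fall.
    clock_samples = []
    pending = sorted(events)
    t = 0.0
    while t < t_end:
        t = round(t + dt, 10)
        while pending and pending[0][0] <= t:
            pending.pop(0)
            clock_samples.append(t)      # clock only resolves to step edges
    return clock_samples

print(run_des(events))   # [0.7, 1.3, 3.05]
print(run_dts(events))   # [1.0, 1.5, 3.5]
```

    The DTS run quantizes every event to a step boundary, which is exactly the kind of behavioral difference (event ordering, simultaneity, timing error) the thesis examines in simple agent behaviors.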

  6. Application of Coalition Battle Management Language (C-BML) and C-BML Services to Live, Virtual, and Constructive (LVC) Simulation Environments

    DTIC Science & Technology

    2011-12-01

    Task Based Approach to Planning.” Paper 08F-SIW-033. In Proceedings of the Fall Simulation Interoperability Workshop. Simulation Interoperability... Paper 06F-SIW-003. In Proceedings of the Fall Simulation Interoperability Workshop. Simulation Interoperability Standards Organization... MSDL).” Paper 10S-SIW-003. In Proceedings of the Spring Simulation Interoperability Workshop. Simulation Interoperability Standards Organization

  7. Perceptions, training experiences, and preferences of surgical residents toward laparoscopic simulation training: a resident survey.

    PubMed

    Shetty, Shohan; Zevin, Boris; Grantcharov, Teodor P; Roberts, Kurt E; Duffy, Andrew J

    2014-01-01

    Simulation training for surgical residents can shorten learning curves, improve technical skills, and expedite competency. Several studies have shown that skills learned in the simulated environment are transferable to the operating room. Residency programs are trying to incorporate simulation into the resident training curriculum to supplement the hands-on experience gained in the operating room. Despite the availability and proven utility of surgical simulators and simulation laboratories, they are still widely underutilized by surgical trainees. Studies have shown that voluntary use leads to minimal participation in a training curriculum. Although there are several simulation tools, there is no clear evidence of the superiority of one tool over the other in skill acquisition. The purpose of this study was to explore resident perceptions, training experiences, and preferences regarding laparoscopic simulation training. Our goal was to profile resident participation in surgical skills simulation, recognize potential barriers to voluntary simulator use, and identify simulation tools and tasks preferred by residents. Furthermore, this study may help to inform whether mandatory/protected training time, as part of the residents' curriculum is essential to enhance participation in the simulation laboratory. A cross-sectional study on general surgery residents (postgraduate years 1-5) at Yale University School of Medicine and the University of Toronto via an online questionnaire was conducted. Overall, 67 residents completed the survey. The institutional review board approved the methods of the study. Overall, 95.5% of the participants believed that simulation training improved their laparoscopic skills. Most respondents (92.5%) perceived that skills learned during simulation training were transferrable to the operating room. Overall, 56.7% of participants agreed that proficiency in a simulation curriculum should be mandatory before operating room experience. 
The simulation laboratory was most commonly used during work hours; lack of free time during work hours was most commonly cited as a reason for underutilization. Factors influencing use of the simulation laboratory in order of importance were the need for skill development, an interest in minimally invasive surgery, mandatory/protected time in a simulation environment as part of the residency program curriculum, a recommendation by an attending surgeon, and proximity of the simulation center. The most preferred simulation tool was the live animal model followed by cadaveric tissue. Virtual reality simulators were among the least-preferred (25%) simulation tools. Most residents (91.0%) felt that mandatory/protected time in a simulation environment should be introduced into resident training protocols. Mandatory and protected time in a simulation environment as part of the resident training curriculum may improve participation in simulation training. A comprehensive curriculum, which includes the use of live animals, cadaveric tissue, and virtual reality simulators, may enhance the laparoscopic training experience and interest level of surgical trainees. Copyright © 2014 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  8. The State of Simulations: Soft-Skill Simulations Emerge as a Powerful New Form of E-Learning.

    ERIC Educational Resources Information Center

    Aldrich, Clark

    2001-01-01

    Presents responses of leaders from six simulation companies about challenges and opportunities of soft-skills simulations in e-learning. Discussion includes: evaluation metrics; role of subject matter experts in developing simulations; video versus computer graphics; technology needed to run simulations; technology breakthroughs; pricing;…

  9. The effectiveness of and satisfaction with high-fidelity simulation to teach cardiac surgical resuscitation skills to nurses.

    PubMed

    McRae, Marion E; Chan, Alice; Hulett, Renee; Lee, Ai Jin; Coleman, Bernice

    2017-06-01

    There are few reports of the effectiveness or satisfaction with simulation to learn cardiac surgical resuscitation skills. To test the effect of simulation on the self-confidence of nurses to perform cardiac surgical resuscitation simulation and nurses' satisfaction with the simulation experience. A convenience sample of sixty nurses rated their self-confidence to perform cardiac surgical resuscitation skills before and after two simulations. Simulation performance was assessed. Subjects completed the Satisfaction with Simulation Experience scale and demographics. Self-confidence scores to perform all cardiac surgical skills as measured by paired t-tests were significantly increased after the simulation (d=-0.50 to 1.78). Self-confidence and cardiac surgical work experience were not correlated with time to performance. Total satisfaction scores were high (mean 80.2, SD 1.06) indicating satisfaction with the simulation. There was no correlation of the satisfaction scores with cardiac surgical work experience (τ=-0.05, ns). Self-confidence scores to perform cardiac surgical resuscitation procedures were higher after the simulation. Nurses were highly satisfied with the simulation experience. Copyright © 2016 Elsevier Ltd. All rights reserved.
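
    For readers unfamiliar with the statistics reported, a paired t statistic and a paired-samples Cohen's d (the convention the abstract's d values suggest) can be computed as follows; the pre/post ratings below are invented for the sketch, not the study's data.

```python
import math
import statistics

# Illustrative pre/post self-confidence ratings (1-5 Likert) for ten nurses.
pre  = [2, 3, 2, 3, 4, 2, 3, 3, 2, 4]
post = [4, 4, 3, 5, 5, 3, 4, 4, 3, 5]

diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
mean_d = statistics.mean(diffs)
sd_d = statistics.stdev(diffs)

# Paired t statistic: mean difference over its standard error.
t_stat = mean_d / (sd_d / math.sqrt(n))

# Cohen's d for paired data: mean difference divided by the SD of the
# differences.
cohens_d = mean_d / sd_d

print(f"t({n - 1}) = {t_stat:.2f}, d = {cohens_d:.2f}")  # → t(9) = 9.00, d = 2.85
```

    The d range reported in the abstract (-0.50 to 1.78) is this same ratio computed per skill item; values near or above 0.8 are conventionally read as large effects.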

  10. A "Skylight" Simulator for HWIL Simulation of Hyperspectral Remote Sensing.

    PubMed

    Zhao, Huijie; Cui, Bolun; Jia, Guorui; Li, Xudong; Zhang, Chao; Zhang, Xinyang

    2017-12-06

    Even though digital simulation technology has been widely used in the last two decades, hardware-in-the-loop (HWIL) simulation is still an indispensable method for spectral uncertainty research of ground targets. However, previous facilities mainly focus on the simulation of panchromatic imaging. Therefore, neither the spectral nor the spatial performance is enough for hyperspectral simulation. To improve the accuracy of illumination simulation, a new dome-like skylight simulator is designed and developed to fit the spatial distribution and spectral characteristics of a real skylight for the wavelength from 350 nm to 2500 nm. The simulator's performance was tested using a spectroradiometer with different accessories. The spatial uniformity is greater than 0.91. The spectral mismatch decreases to 1/243 of the spectral mismatch of the Imagery Simulation Facility (ISF). The spatial distribution of radiance can be adjusted, and the accuracy of the adjustment is greater than 0.895. The ability of the skylight simulator is also demonstrated by comparing radiometric quantities measured in the skylight simulator with those in a real skylight in Beijing.
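
    As a rough illustration of how a spatial-uniformity figure such as 0.91 might be computed, one common convention is the ratio of minimum to maximum irradiance over a measurement grid. The paper's exact definition is not stated here, and the readings below are invented.

```python
# Hypothetical uniformity check over a grid of irradiance measurements
# taken across the simulator's test plane (values are invented).
readings = [
    [98.2, 99.1, 97.5],
    [99.0, 100.0, 98.7],
    [97.9, 99.4, 98.1],
]

flat = [v for row in readings for v in row]
uniformity = min(flat) / max(flat)   # one common min/max convention
print(f"spatial uniformity = {uniformity:.3f}")
```

    Under this convention a perfectly uniform field scores 1.0, so "greater than 0.91" means the dimmest sampled point receives at least 91% of the brightest point's irradiance.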

  11. Development of space simulation / net-laboratory system

    NASA Astrophysics Data System (ADS)

    Usui, H.; Matsumoto, H.; Ogino, T.; Fujimoto, M.; Omura, Y.; Okada, M.; Ueda, H. O.; Murata, T.; Kamide, Y.; Shinagawa, H.; Watanabe, S.; Machida, S.; Hada, T.

    A research project for the development of a space simulation / net-laboratory system was approved by the Japan Science and Technology Corporation (JST) in the category of Research and Development for Applying Advanced Computational Science and Technology (ACT-JST) in 2000. This research project, which continues for three years, is a collaboration with an astrophysical simulation group as well as other space simulation groups which use MHD and hybrid models. In this project, we develop a prototype of a unique simulation system which enables us to perform simulation runs by providing or selecting plasma parameters through a Web-based interface on the internet. We are also developing an on-line database system for space simulation from which we will be able to search and extract various information such as simulation methods and programs, manuals, and typical simulation results in graphic or ascii format. This unique system will help simulation beginners to start simulation studies without much difficulty or effort, and contribute to the promotion of simulation studies in the STP field. In this presentation, we will report the overview and the current status of the project.

  12. The Tuscan Mobile Simulation Program: a description of a program for the delivery of in situ simulation training.

    PubMed

    Ullman, Edward; Kennedy, Maura; Di Delupis, Francesco Dojmi; Pisanelli, Paolo; Burbui, Andrea Giuliattini; Cussen, Meaghan; Galli, Laura; Pini, Riccardo; Gensini, Gian Franco

    2016-09-01

    Simulation has become a critical aspect of medical education. It allows health care providers the opportunity to focus on safety and high-risk situations in a protected environment. Recently, in situ simulation, which is performed in the actual clinical setting, has been used to recreate a more realistic work environment. This form of simulation allows for better team evaluation as the workers are in their traditional roles, and can reveal latent safety errors that often are not seen in typical simulation scenarios. We discuss the creation and implementation of a mobile in situ simulation program in emergency departments of three hospitals in Tuscany, Italy, including equipment, staffing, and start-up costs for this program. We also describe latent safety threats identified in the pilot in situ simulations. This novel approach has the potential to both reduce the costs of simulation compared to traditional simulation centers, and to expand medical simulation experiences to providers and healthcare organizations that do not have access to a large simulation center.

  13. Establishing a convention for acting in healthcare simulation: merging art and science.

    PubMed

    Sanko, Jill S; Shekhter, Ilya; Kyle, Richard R; Di Benedetto, Stephen; Birnbach, David J

    2013-08-01

    Among the most powerful tools available to simulation instructors is a confederate. Although technical and logical realism is dictated by the simulation platform and setting, the quality of role playing by confederates strongly determines psychological or emotional fidelity of simulation. The highest level of realism, however, is achieved when the confederates are properly trained. Theater and acting methodology can provide simulation educators a framework from which to establish an acting convention specific to the discipline of healthcare simulation. This report attempts to examine simulation through the lens of theater arts and represents an opinion on acting in healthcare simulation for both simulation educators and confederates. It aims to refine the practice of simulation by embracing the lessons of the theater community. Although the application of these approaches in healthcare education has been described in the literature, a systematic way of organizing, publicizing, or documenting the acting within healthcare simulation has never been completed. Therefore, we attempt, for the first time, to take on this challenge and create a resource, which infuses theater arts into the practice of healthcare simulation.

  14. Simulation in bronchoscopy: current and future perspectives.

    PubMed

    Nilsson, Philip Mørkeberg; Naur, Therese Maria Henriette; Clementsen, Paul Frost; Konge, Lars

    2017-01-01

    To provide an overview of current literature that informs how to approach simulation practice of bronchoscopy and discuss how findings from other simulation research can help inform the use of simulation in bronchoscopy training. We conducted a literature search on simulation training of bronchoscopy and divided relevant studies in three categories: 1) structuring simulation training in bronchoscopy, 2) assessment of competence in bronchoscopy training, and 3) development of cheap alternatives for bronchoscopy simulation. Bronchoscopy simulation is effective, and the training should be structured as distributed practice with mastery learning criteria (ie, training until a certain level of competence is achieved). Dyad practice (training in pairs) is possible and may increase utility of available simulators. Trainee performance should be assessed with assessment tools with established validity. Three-dimensional printing is a promising new technology opening possibilities for developing cheap simulators with innovative features.

  15. Large Scale Simulation Platform for NODES Validation Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sotorrio, P.; Qin, Y.; Min, L.

    2017-04-27

    This report summarizes the Large Scale (LS) simulation platform created for the Eaton NODES project. The simulation environment consists of both a wholesale market simulator and a distribution simulator and includes the CAISO wholesale market model and a PG&E footprint of 25-75 feeders to validate the scalability under a scenario of 33% RPS in California with an additional 17% of DERs coming from distribution and customers. The simulator can generate hourly unit commitment, 5-minute economic dispatch, and 4-second AGC regulation signals. The simulator is also capable of simulating greater than 10k individual controllable devices. Simulated DERs include water heaters, EVs, residential and light commercial HVAC/buildings, and residential-level battery storage. Feeder-level voltage regulators and capacitor banks are also simulated for feeder-level real and reactive power management and Volt/Var control.
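
    The three nested signal rates the platform generates (hourly unit commitment, 5-minute economic dispatch, 4-second AGC regulation) imply a multi-rate simulation loop along these lines; the signal generators below are placeholders, not the NODES models.

```python
import math

def unit_commitment(hour):
    # Placeholder: committed capacity profile (MW) by hour.
    return 1000.0 + 200.0 * math.sin(2 * math.pi * hour / 24)

def economic_dispatch(capacity, minute):
    # Placeholder: 5-minute setpoint tracking a within-hour ramp.
    return capacity * (0.8 + 0.001 * minute)

def agc_signal(setpoint, second):
    # Placeholder: 4-second regulation wiggle around the dispatch setpoint.
    return setpoint + 5.0 * math.sin(2 * math.pi * second / 60)

samples = 0
for hour in range(2):                      # two simulated hours
    cap = unit_commitment(hour)
    for minute in range(0, 60, 5):         # 5-minute economic dispatch
        sp = economic_dispatch(cap, minute)
        for second in range(0, 300, 4):    # 4-second AGC within each interval
            reg = agc_signal(sp, second)
            samples += 1

print(samples)  # 2 hours x 12 intervals x 75 AGC ticks = 1800
```

    Even this toy loop shows the data-volume pressure the report implies: each simulated hour produces 900 AGC-rate samples per signal, before any of the >10k device models are evaluated.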

  16. Training students to detect delirium: An interprofessional pilot study.

    PubMed

    Chambers, Breah; Meyer, Mary; Peterson, Moya

    2018-06-01

    The purpose of this paper is to report nursing student knowledge acquisition and attitudes after completing an interprofessional simulation with medical students. The IOM has challenged healthcare educators to teach teamwork and communication skills in interprofessional settings. Interprofessional simulation provides a higher fidelity experience than simulation in silos. Simulation may be particularly useful in helping healthcare workers gain the necessary skills to care for psychiatric clients. Specifically, healthcare providers have difficulty differentiating between dementia and delirium. Recognizing this deficit, an interprofessional simulation was created using medical students in their neurology rotation and senior nursing students. Twenty-four volunteer nursing students completed a pre-survey to assess delirium knowledge and then completed an education module about delirium. Twelve of these students participated in a simulation with medical students. Pre- and post-simulation KidSIM Attitude questionnaires were completed by all students participating in the simulation. After the simulations were complete, all twenty-four students were asked to complete the post-survey regarding delirium knowledge. While delirium knowledge scores improved in both groups, the simulation group scored higher, but the difference did not reach significance. The simulation group demonstrated a statistically significant improvement in attitudes toward simulation, interprofessional education, and teamwork post simulation compared to their pre-simulation scores. Nursing students who participated in an interprofessional simulation developed a heightened appreciation for learning communication, teamwork, situational awareness, and interprofessional roles and responsibilities. These results support the use of interprofessional simulation in healthcare education. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Fast Simulation of Electromagnetic Showers in the ATLAS Calorimeter: Frozen Showers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barberio, E.; /Melbourne U.; Boudreau, J.

    2011-11-29

    One of the most time-consuming processes in simulating pp interactions in the ATLAS detector at the LHC is the simulation of electromagnetic showers in the calorimeter. In order to speed up the event simulation, several parametrisation methods are available in ATLAS. In this paper we present a short description of the frozen shower technique, together with some recent benchmarks and a comparison with full simulation. The expected high rate of proton-proton collisions in the ATLAS detector at the LHC requires large samples of simulated events (Monte Carlo) to study various physics processes. A detailed simulation of particle reactions ('full simulation') in the ATLAS detector is based on GEANT4 and is very accurate. However, due to the complexity of the detector, high particle multiplicity, and GEANT4 itself, the average CPU time spent to simulate a typical QCD event in a pp collision is 20 or more minutes on modern computers. During detector simulation most of the time is spent in the calorimeters (up to 70%), and most of that is required for electromagnetic particles in the electromagnetic (EM) part of the calorimeters. This is the motivation for fast simulation approaches which reduce the simulation time without affecting the accuracy. Several of the fast simulation methods available within the ATLAS simulation framework (the standard Athena-based simulation program) are discussed here, with the focus on the novel frozen shower library (FS) technique. The results obtained with FS are presented here as well.
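
    The frozen-shower idea can be sketched as a lookup: pre-simulate low-energy EM showers once, bin them by energy, and substitute a stored shower at run time instead of tracking every secondary. Everything below (bin edges, profile shapes, the cutoff) is invented for illustration and is not the ATLAS implementation.

```python
import random

# "Library": energy bin (MeV, illustrative) -> list of pre-generated shower
# energy-deposit profiles (here, 5 deposit samples per shower).
shower_library = {
    bin_e: [[random.gauss(bin_e / 5, bin_e / 50) for _ in range(5)]
            for _ in range(100)]
    for bin_e in (10, 20, 50, 100)
}

def simulate_full(energy):
    # Stand-in for expensive GEANT4-style tracking of all secondaries.
    return [energy / 5.0] * 5

def simulate_frozen(energy, cutoff=100):
    # Above the cutoff, track in full; below it, pull a stored shower from
    # the nearest library bin and reuse it. The time saving comes from
    # replacing low-energy tracking, where most of the CPU time is spent.
    if energy > cutoff:
        return simulate_full(energy)
    nearest = min(shower_library, key=lambda b: abs(b - energy))
    return random.choice(shower_library[nearest])

print(len(simulate_frozen(42.0)))  # a library shower has 5 deposit samples
```

    In the real technique the library is keyed on more than energy (position in the calorimeter, particle type, direction), and the stored showers come from full GEANT4 runs, so the substitution preserves the detector response statistics.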

  18. Piloted aircraft simulation concepts and overview

    NASA Technical Reports Server (NTRS)

    Sinacori, J. B.

    1978-01-01

    An overview of piloted aircraft simulation is presented that reflects the viewpoint of an aeronautical technologist. The intent is to acquaint potential users with some of the basic concepts and issues that characterize piloted simulation. Applications to aircraft development are highlighted, but some aspects of training simulators are also covered. A historical review is given, together with a description of some current simulators. Simulator usages, advantages, and limitations are discussed, and human perception qualities important to simulation are described. An assessment of current simulation is presented that addresses validity, fidelity, and deficiencies. Future prospects are discussed and technology projections are made.

  19. Closed loop models for analyzing the effects of simulator characteristics. [digital simulation of human operators

    NASA Technical Reports Server (NTRS)

    Baron, S.; Muralidharan, R.; Kleinman, D. L.

    1978-01-01

    The optimal control model of the human operator is used to develop closed loop models for analyzing the effects of (digital) simulator characteristics on predicted performance and/or workload. Two approaches are considered: the first utilizes a continuous approximation to the discrete simulation in conjunction with the standard optimal control model; the second involves a more exact discrete description of the simulator in a closed loop multirate simulation in which the optimal control model simulates the pilot. Both models predict that simulator characteristics can have significant effects on performance and workload.

  20. Reevaluating simulation in nursing education: beyond the human patient simulator.

    PubMed

    Schiavenato, Martin

    2009-07-01

    The human patient simulator or high-fidelity mannequin has become synonymous with the word simulation in nursing education. Founded on a historical context and on an evaluation of the current application of simulation in nursing education, this article challenges that assumption as limited and restrictive. A definition of simulation and a broader conceptualization of its application in nursing education are presented. The need for an ideological basis for simulation in nursing education is highlighted. The call is made for theory to answer the question of why simulation is used in nursing to anchor its proper and effective application in nursing education.

  1. Simulation and evaluation of the SH-2F helicopter in a shipboard environment using the interchangeable cab system

    NASA Technical Reports Server (NTRS)

    Paulk, C. H., Jr.; Astill, D. L.; Donley, S. T.

    1983-01-01

    The operation of the SH-2F helicopter from the decks of small ships in adverse weather was simulated using a large amplitude vertical motion simulator, a wide angle computer generated imagery visual system, and an interchangeable cab (ICAB). The simulation facility, the mathematical programs, and the validation method used to ensure simulation fidelity are described. The results show the simulator to be a useful tool in simulating the ship-landing problem. Characteristics of the ICAB system and ways in which the simulation can be improved are presented.

  2. Mental simulation of routes during navigation involves adaptive temporal compression

    PubMed Central

    Arnold, Aiden E.G.F.; Iaria, Giuseppe; Ekstrom, Arne D.

    2016-01-01

    Mental simulation is a hallmark feature of human cognition, allowing features from memories to be flexibly used during prospection. While past studies demonstrate the preservation of real-world features such as size and distance during mental simulation, their temporal dynamics remain unknown. Here, we compare mental simulations to navigation of routes in a large-scale spatial environment to test the hypothesis that such simulations are temporally compressed in an adaptive manner. Our results show that simulations ran at 2.39x the speed of actual navigation of a route, increasing in compression (3.57x) for slower movement speeds. Participant self-reports of vividness and spatial coherence of simulations also correlated strongly with simulation duration, providing an important link between subjective experiences of simulated events and how spatial representations are combined during prospection. These findings suggest that the simulation of spatial events involves adaptive temporal mechanisms, mediated partly by the fidelity of the memories used to generate the simulation. PMID:27568586

  3. Design of a bounded wave EMP (Electromagnetic Pulse) simulator

    NASA Astrophysics Data System (ADS)

    Sevat, P. A. A.

    1989-06-01

    Electromagnetic Pulse (EMP) simulators are used to simulate the EMP generated by a nuclear weapon and to harden equipment against the effects of EMP. At present, DREO has a 1 m EMP simulator for testing computer-terminal-sized equipment. To develop the R and D capability for testing larger objects, such as a helicopter, a much larger threat-level facility is required. This report concerns the design of a bounded wave EMP simulator suitable for testing large equipment. Different types of simulators are described and their pros and cons are discussed. A bounded wave parallel plate type simulator is chosen for its efficiency and minimal environmental impact. Detailed designs are given for 6 m and 10 m parallel plate type wire grid simulators. Electromagnetic fields inside and outside the simulators are computed. Preliminary specifications for a pulse generator required for the simulator are also given. Finally, the electromagnetic fields radiated from the simulator are computed and discussed.

  4. Development of the Transport Class Model (TCM) Aircraft Simulation From a Sub-Scale Generic Transport Model (GTM) Simulation

    NASA Technical Reports Server (NTRS)

    Hueschen, Richard M.

    2011-01-01

    A six degree-of-freedom, flat-earth dynamics, non-linear, and non-proprietary aircraft simulation was developed that is representative of a generic mid-sized twin-jet transport aircraft. The simulation was developed from a non-proprietary, publicly available, subscale twin-jet transport aircraft simulation using scaling relationships and a modified aerodynamic database. The simulation has an extended aerodynamics database with aero data outside the normal transport-operating envelope (large angle-of-attack and sideslip values). The simulation has representative transport aircraft surface actuator models with variable rate-limits and generally fixed position limits. The simulation contains a generic 40,000 lb sea level thrust engine model. The engine model is a first order dynamic model with a variable time constant that changes according to simulation conditions. The simulation provides a means for interfacing a flight control system to use the simulation sensor variables and to command the surface actuators and throttle position of the engine model.
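    The first-order engine dynamics described above can be sketched as follows. This is a hedged illustration, not the actual TCM engine code: the idle-thrust value and the condition-dependent time-constant schedule are assumptions chosen only to show the structure of a first-order lag with a variable time constant.

```python
def engine_thrust_step(thrust, cmd, tau, dt):
    """One explicit-Euler step of a first-order engine lag model.

    thrust: current thrust (lb), cmd: commanded thrust (lb),
    tau: time constant (s), dt: integration step (s).
    Implements dT/dt = (cmd - thrust) / tau.
    """
    return thrust + (dt / tau) * (cmd - thrust)

def variable_tau(thrust_frac):
    """Illustrative schedule: slower spool-up at low power settings."""
    return 4.0 if thrust_frac < 0.5 else 1.5

# Spool up from an assumed idle toward the 40,000 lb sea-level thrust rating
thrust, cmd, dt = 4000.0, 40000.0, 0.02
for _ in range(500):  # 10 s of simulated time
    tau = variable_tau(thrust / 40000.0)
    thrust = engine_thrust_step(thrust, cmd, tau, dt)
```

A flight control system interfaced to the simulation would set `cmd` from throttle position and read back `thrust` each frame.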

  5. Durham extremely large telescope adaptive optics simulation platform.

    PubMed

    Basden, Alastair; Butterley, Timothy; Myers, Richard; Wilson, Richard

    2007-03-01

    Adaptive optics systems are essential on all large telescopes for which image quality is important. These are complex systems with many design parameters requiring optimization before good performance can be achieved. The simulation of adaptive optics systems is therefore necessary to characterize the expected performance. We describe an adaptive optics simulation platform, developed at Durham University, which can be used to simulate adaptive optics systems on the largest proposed future extremely large telescopes as well as on current systems. This platform is modular, object oriented, and has the benefit of hardware application acceleration that can be used to improve the simulation performance, essential for ensuring that the run time of a given simulation is acceptable. The simulation platform described here can be highly parallelized using parallelization techniques suited for adaptive optics simulation, while still offering the user complete control while the simulation is running. The results from the simulation of a ground layer adaptive optics system are provided as an example to demonstrate the flexibility of this simulation platform.

  6. Exact-Differential Large-Scale Traffic Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanai, Masatoshi; Suzumura, Toyotaro; Theodoropoulos, Georgios

    2015-01-01

    Analyzing large-scale traffic by simulation requires repeated execution with various scenarios or parameter settings. Such repetition introduces substantial redundancy, because the change from one scenario to the next is usually minor, for example, blocking a single road or changing the speed limit of several roads. In this paper, we propose a new redundancy reduction technique, called exact-differential simulation, which makes it possible to simulate only the changed scenarios in later executions while keeping exactly the same results as a whole re-simulation. The paper consists of two main efforts: (i) a key idea and algorithm for exact-differential simulation, and (ii) a method to build large-scale traffic simulation on top of exact-differential simulation. In experiments with a Tokyo traffic simulation, exact-differential simulation improves elapsed time by a factor of 7.26 on average, and by a factor of 2.26 even in the worst case, compared with whole simulation.
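    The core reuse-with-identical-results idea can be illustrated with a deliberately simplified sketch. Here road segments are independent, so only the changed road needs recomputation; the paper's actual contribution is achieving this for interdependent events in a discrete event simulation, which this toy does not attempt. All names and numbers below are illustrative assumptions.

```python
def travel_time(length_km, speed_kmh):
    """Travel time (h) over one road segment."""
    return length_km / speed_kmh

def simulate_all(roads):
    """Whole simulation: travel time for every road."""
    return {rid: travel_time(length, v) for rid, (length, v) in roads.items()}

def simulate_differential(roads, cache, changed):
    """Differential re-run: recompute only the changed roads and
    reuse cached results for everything else."""
    out = dict(cache)
    for rid in changed:
        length, v = roads[rid]
        out[rid] = travel_time(length, v)
    return out

base = {"r1": (10.0, 50.0), "r2": (5.0, 30.0), "r3": (8.0, 60.0)}
cache = simulate_all(base)

# Later scenario: lower the speed limit on r2 only
scenario = dict(base, r2=(5.0, 20.0))
diff_result = simulate_differential(scenario, cache, changed={"r2"})
# The differential run yields exactly the whole-simulation result
assert diff_result == simulate_all(scenario)
```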

  7. Open-source framework for power system transmission and distribution dynamics co-simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Renke; Fan, Rui; Daily, Jeff

    The promise of the smart grid entails more interactions between the transmission and distribution networks, and there is an immediate need for tools that provide the comprehensive modelling and simulation required to integrate operations at both transmission and distribution levels. Existing electromagnetic transient simulators can perform simulations with integration of transmission and distribution systems, but the computational burden is high for large-scale system analysis. For transient stability analysis, there are currently only separate tools for simulating transient dynamics of the transmission and distribution systems. In this paper, we introduce an open-source co-simulation framework, the Framework for Network Co-Simulation (FNCS), together with a decoupled simulation approach that links existing transmission and distribution dynamic simulators through FNCS. FNCS is a middleware interface and framework that manages the interaction and synchronization of the transmission and distribution simulators. Preliminary testing results show the validity and capability of the proposed open-source co-simulation framework and the decoupled co-simulation methodology.
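    The decoupled transmission/distribution coupling that such middleware manages can be caricatured as a lock-step exchange of boundary variables. This sketch is not the FNCS API; the toy voltage and load models and all parameter values are invented purely to show the synchronization pattern of exchanging boundary conditions at each time step.

```python
class TransmissionSim:
    """Toy transmission model: bus voltage sags as total load grows."""
    def __init__(self):
        self.voltage = 1.0  # per-unit
    def step(self, dt, total_load):
        self.voltage = 1.0 - 0.05 * total_load
        return self.voltage

class DistributionSim:
    """Toy distribution feeder: constant-impedance load scales with V^2."""
    def __init__(self, base_load):
        self.base_load = base_load
    def step(self, dt, voltage):
        return self.base_load * voltage ** 2

def cosimulate(tx, feeders, t_end, dt):
    """Lock-step co-simulation: at each step the transmission simulator
    receives the total feeder load, and each feeder receives the
    resulting boundary-bus voltage (the roles a middleware broker plays)."""
    t, loads = 0.0, [f.base_load for f in feeders]
    v = tx.voltage
    while t < t_end:
        v = tx.step(dt, sum(loads))               # transmission side
        loads = [f.step(dt, v) for f in feeders]  # distribution side
        t += dt
    return v, loads

tx = TransmissionSim()
feeders = [DistributionSim(0.3), DistributionSim(0.4)]
v_final, final_loads = cosimulate(tx, feeders, t_end=1.0, dt=0.01)
```

The exchange converges to a consistent operating point where the voltage and the voltage-dependent loads agree.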

  8. A survey of simulators for palpation training.

    PubMed

    Zhang, Yan; Phillips, Roger; Ward, James; Pisharody, Sandhya

    2009-01-01

    Palpation is a widely used diagnostic method in medical practice. The sensitivity of palpation is highly dependent upon the skill of clinicians, which is often difficult to master. There is a need for simulators in palpation training. This paper summarizes important work and the latest achievements in simulation for palpation training. Three types of simulators are surveyed: physical models, virtual reality (VR) based simulations, and hybrid (computerized and physical) simulators. Comparisons among the different kinds of simulators are presented.

  9. Agent-based modeling: Methods and techniques for simulating human systems

    PubMed Central

    Bonabeau, Eric

    2002-01-01

    Agent-based modeling is a powerful simulation modeling technique that has seen a number of applications in the last few years, including applications to real-world business problems. After the basic principles of agent-based simulation are briefly introduced, its four areas of application are discussed using real-world examples: flow simulation, organizational simulation, market simulation, and diffusion simulation. For each category, one or several business applications are described and analyzed. PMID:12011407
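    As a concrete illustration of the diffusion-simulation category, the following is a minimal agent-based diffusion-of-innovation model. It is a generic sketch under assumed parameters (external influence p, imitation strength q), not a model taken from the article: each non-adopting agent adopts with a probability that grows with the current adopter fraction.

```python
import random

def diffusion_abm(n_agents=1000, p=0.03, q=0.4, steps=25, seed=42):
    """Minimal agent-based diffusion model: at each step a non-adopter
    adopts via external influence (p) or imitation of the currently
    adopted fraction (q). Returns the adoption curve over time."""
    rng = random.Random(seed)
    adopted = [False] * n_agents
    history = []
    for _ in range(steps):
        frac = sum(adopted) / n_agents
        for i in range(n_agents):
            if not adopted[i] and rng.random() < p + q * frac:
                adopted[i] = True
        history.append(sum(adopted) / n_agents)
    return history

curve = diffusion_abm()
```

The emergent S-shaped adoption curve arises from individual-level rules rather than an aggregate equation, which is the defining feature of the agent-based approach.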

  10. A data-driven dynamics simulation framework for railway vehicles

    NASA Astrophysics Data System (ADS)

    Nie, Yinyu; Tang, Zhao; Liu, Fengjia; Chang, Jian; Zhang, Jianjun

    2018-03-01

    The finite element (FE) method is essential for simulating vehicle dynamics with fine details, especially for train crash simulations. However, factors such as the complexity of meshes and the distortion involved in a large deformation would undermine its calculation efficiency. An alternative method, multi-body (MB) dynamics simulation, provides satisfying time efficiency but limited accuracy when highly nonlinear dynamic processes are involved. To maintain the advantages of both methods, this paper proposes a data-driven simulation framework for the dynamics simulation of railway vehicles. This framework uses machine learning techniques to extract nonlinear features from training data generated by FE simulations, so that specific mesh structures can be represented by a surrogate element (or surrogate elements) replacing the original mechanical elements, and the dynamics simulation can be implemented by co-simulation with the surrogate element(s) embedded in a MB model. This framework consists of a series of techniques including data collection, feature extraction, training data sampling, surrogate element building, and model evaluation and selection. To verify the feasibility of this framework, we present two case studies, a vertical dynamics simulation and a longitudinal dynamics simulation, based on co-simulation with MATLAB/Simulink and Simpack, and a further comparison with a popular data-driven model (the Kriging model) is provided. The simulation results show that using the Legendre polynomial regression model in building surrogate elements can largely cut down the simulation time without sacrificing accuracy.
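    The surrogate-element idea with Legendre polynomial regression can be sketched as below. The force-deflection data are synthetic stand-ins for FE training samples; the paper's actual surrogate elements, inputs, and co-simulation coupling are not reproduced here, only the fit-then-replace pattern.

```python
import numpy as np
from numpy.polynomial import legendre

# Synthetic stand-in for FE-simulation samples of a nonlinear
# force-deflection curve (illustrative data, not from the paper).
x = np.linspace(-1.0, 1.0, 50)  # deflection, normalised to [-1, 1]
rng = np.random.default_rng(0)
force = 2.0 * x + 0.8 * x**3 + rng.normal(0, 0.01, x.size)

# Fit a Legendre-series surrogate to replace the expensive FE element
coeffs = legendre.legfit(x, force, deg=5)

def surrogate_force(deflection):
    """Surrogate element: evaluate the fitted Legendre series, which is
    what the MB integrator would call instead of the FE model."""
    return legendre.legval(deflection, coeffs)

max_err = float(np.max(np.abs(surrogate_force(x) - force)))
```

Legendre bases are well conditioned on the normalised interval, which is one practical reason to prefer them over raw monomials for this kind of regression.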

  11. Surgical simulation: a urological perspective.

    PubMed

    Wignall, Geoffrey R; Denstedt, John D; Preminger, Glenn M; Cadeddu, Jeffrey A; Pearle, Margaret S; Sweet, Robert M; McDougall, Elspeth M

    2008-05-01

    Surgical education is changing rapidly as several factors including budget constraints and medicolegal concerns limit opportunities for urological trainees. New methods of skills training such as low fidelity bench trainers and virtual reality simulators offer new avenues for surgical education. In addition, surgical simulation has the potential to allow practicing surgeons to develop new skills and maintain those they already possess. We provide a review of the background, current status and future directions of surgical simulators as they pertain to urology. We performed a literature review and an overview of surgical simulation in urology. Surgical simulators are in various stages of development and validation. Several simulators have undergone extensive validation studies and are in use in surgical curricula. While virtual reality simulators offer the potential to more closely mimic reality and present entire operations, low fidelity simulators remain useful in skills training, particularly for novices and junior trainees. Surgical simulation remains in its infancy. However, the potential to shorten learning curves for difficult techniques and practice surgery without risk to patients continues to drive the development of increasingly more advanced and realistic models. Surgical simulation is an exciting area of surgical education. The future is bright as advancements in computing and graphical capabilities offer new innovations in simulator technology. Simulators must continue to undergo rigorous validation studies to ensure that time spent by trainees on bench trainers and virtual reality simulators will translate into improved surgical skills in the operating room.

  12. PyNN: A Common Interface for Neuronal Network Simulators.

    PubMed

    Davison, Andrew P; Brüderle, Daniel; Eppler, Jochen; Kremkow, Jens; Muller, Eilif; Pecevski, Dejan; Perrinet, Laurent; Yger, Pierre

    2008-01-01

    Computational neuroscience has produced a diversity of software for simulations of networks of spiking neurons, with both negative and positive consequences. On the one hand, each simulator uses its own programming or configuration language, leading to considerable difficulty in porting models from one simulator to another. This impedes communication between investigators and makes it harder to reproduce and build on the work of others. On the other hand, simulation results can be cross-checked between different simulators, giving greater confidence in their correctness, and each simulator has different optimizations, so the most appropriate simulator can be chosen for a given modelling task. A common programming interface to multiple simulators would reduce or eliminate the problems of simulator diversity while retaining the benefits. PyNN is such an interface, making it possible to write a simulation script once, using the Python programming language, and run it without modification on any supported simulator (currently NEURON, NEST, PCSIM, Brian and the Heidelberg VLSI neuromorphic hardware). PyNN increases the productivity of neuronal network modelling by providing high-level abstraction, by promoting code sharing and reuse, and by providing a foundation for simulator-agnostic analysis, visualization and data-management tools. PyNN increases the reliability of modelling studies by making it much easier to check results on multiple simulators. PyNN is open-source software and is available from http://neuralensemble.org/PyNN.
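    The write-once, run-on-any-backend pattern that PyNN implements can be sketched generically. The toy backends below are not the real PyNN API; they only illustrate how a single model script can drive interchangeable simulators through one common interface, with identical scripts producing comparable runs on each backend.

```python
class SimulatorBackend:
    """A minimal common interface in the spirit of PyNN: model scripts
    talk only to this API, and each simulator implements it.
    (Illustrative sketch only, not the actual PyNN API.)"""
    def setup(self, timestep): ...
    def create_population(self, n): ...
    def run(self, duration): ...

class ToyBackendA(SimulatorBackend):
    def __init__(self):
        self.log = []
    def setup(self, timestep):
        self.log.append(("setup", timestep))
    def create_population(self, n):
        self.log.append(("population", n))
        return n
    def run(self, duration):
        self.log.append(("run", duration))

class ToyBackendB(ToyBackendA):
    """A second simulator exposing the same interface."""
    pass

def model_script(sim):
    """Written once; runs unmodified on any conforming backend."""
    sim.setup(timestep=0.1)
    sim.create_population(100)
    sim.run(1000.0)
    return sim.log

log_a = model_script(ToyBackendA())
log_b = model_script(ToyBackendB())
```

Because both backends receive the identical call sequence, results can be cross-checked between simulators, which is the reliability benefit the abstract describes.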

  13. Tests for malingering in ophthalmology

    PubMed Central

    Incesu, Ali Ihsan

    2013-01-01

    Simulation can be defined as malingering, or sometimes functional visual loss (FVL). It manifests as either simulating an ophthalmic disease (positive simulation), or denial of ophthalmic disease (negative simulation). Conscious behavior and compensation or indemnity claims are prominent features of simulation. Since some authors suggest that this is a manifestation of underlying psychopathology, even conversion is included in this context. In today's world, every ophthalmologist can face with simulation of ophthalmic disease or disorder. In case of simulation suspect, the physician's responsibility is to prove the simulation considering the disease/disorder first, and simulation as an exclusion. In simulation examinations, the physician should be firm and smart to select appropriate test(s) to convince not only the subject, but also the judge in case of indemnity or compensation trials. Almost all ophthalmic sensory and motor functions including visual acuity, visual field, color vision and night vision can be the subject of simulation. Examiner must be skillful in selecting the most appropriate test. Apart from those in the literature, we included all kinds of simulation in ophthalmology. In addition, simulation examination techniques, such as, use of optical coherence tomography, frequency doubling perimetry (FDP), and modified polarization tests were also included. In this review, we made a thorough literature search, and added our experiences to give the readers up-to-date information on malingering or simulation in ophthalmology. PMID:24195054

  14. Computer Simulation Is an Undervalued Tool for Genetic Analysis: A Historical View and Presentation of SHIMSHON – A Web-Based Genetic Simulation Package

    PubMed Central

    Greenberg, David A.

    2011-01-01

    Computer simulation methods are under-used tools in genetic analysis because simulation approaches have been portrayed as inferior to analytic methods. Even when simulation is used, its advantages are not fully exploited. Here, I present SHIMSHON, our package of genetic simulation programs that have been developed, tested, used for research, and used to generate data for Genetic Analysis Workshops (GAW). These simulation programs, now web-accessible, can be used by anyone to answer questions about designing and analyzing genetic disease studies for locus identification. This work has three foci: (1) the historical context of SHIMSHON's development, suggesting why simulation has not been more widely used so far. (2) Advantages of simulation: computer simulation helps us to understand how genetic analysis methods work. It has advantages for understanding disease inheritance and methods for gene searches. Furthermore, simulation methods can be used to answer fundamental questions that either cannot be answered by analytical approaches or cannot even be defined until the problems are identified and studied using simulation. (3) I argue that, because simulation was not accepted, there was a failure to grasp the meaning of some simulation-based studies of linkage. This may have contributed to perceived weaknesses in linkage analysis; weaknesses that did not, in fact, exist. PMID:22189467

  15. A review of virtual reality based training simulators for orthopaedic surgery.

    PubMed

    Vaughan, Neil; Dubey, Venketesh N; Wainwright, Thomas W; Middleton, Robert G

    2016-02-01

    This review presents current virtual reality based training simulators for hip, knee and other orthopaedic surgery, including elective and trauma surgical procedures. There have not been any previous reviews focussing on hip and knee orthopaedic simulators. A comparison of existing simulator features is provided to identify what is missing and what is required to improve upon current simulators. In total, 11 hip replacement pre-operative planning tools were analysed, plus 9 hip trauma fracture training simulators. Additionally, 9 knee arthroscopy simulators and 8 other orthopaedic simulators were included for comparison. The findings are that for orthopaedic surgery simulators in general, there is increasing use of patient-specific virtual models, which reduce the learning curve. Modelling is also being used for patient-specific implant design and manufacture. Simulators are being increasingly validated for assessment as well as training. There are very few training simulators available for hip replacement, yet more advanced virtual reality is being used for other procedures such as hip trauma and drilling. Training simulators for hip replacement and orthopaedic surgery in general lag behind other surgical procedures for which virtual reality has become more common. Further developments are required to bring hip replacement training simulation up to date with other procedures. This suggests there is a gap in the market for a new high-fidelity hip replacement and resurfacing training simulator. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.

  16. PyNN: A Common Interface for Neuronal Network Simulators

    PubMed Central

    Davison, Andrew P.; Brüderle, Daniel; Eppler, Jochen; Kremkow, Jens; Muller, Eilif; Pecevski, Dejan; Perrinet, Laurent; Yger, Pierre

    2008-01-01

    Computational neuroscience has produced a diversity of software for simulations of networks of spiking neurons, with both negative and positive consequences. On the one hand, each simulator uses its own programming or configuration language, leading to considerable difficulty in porting models from one simulator to another. This impedes communication between investigators and makes it harder to reproduce and build on the work of others. On the other hand, simulation results can be cross-checked between different simulators, giving greater confidence in their correctness, and each simulator has different optimizations, so the most appropriate simulator can be chosen for a given modelling task. A common programming interface to multiple simulators would reduce or eliminate the problems of simulator diversity while retaining the benefits. PyNN is such an interface, making it possible to write a simulation script once, using the Python programming language, and run it without modification on any supported simulator (currently NEURON, NEST, PCSIM, Brian and the Heidelberg VLSI neuromorphic hardware). PyNN increases the productivity of neuronal network modelling by providing high-level abstraction, by promoting code sharing and reuse, and by providing a foundation for simulator-agnostic analysis, visualization and data-management tools. PyNN increases the reliability of modelling studies by making it much easier to check results on multiple simulators. PyNN is open-source software and is available from http://neuralensemble.org/PyNN. PMID:19194529

  17. 14 CFR 142.3 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... accordance with subpart C of this part. Line-Operational Simulation means simulation conducted using... operations. Line operational simulations are conducted for training and evaluation purposes and include random, abnormal, and emergency occurrences. Line operational simulation specifically includes...

  18. Facilitating researcher use of flight simulators

    NASA Technical Reports Server (NTRS)

    Russell, C. Ray

    1990-01-01

    Researchers conducting experiments with flight simulators encounter numerous obstacles in bringing their ideas to the simulator. Research into how these simulators could be used more efficiently is presented. The study involved: (1) analyzing the Advanced Concepts Simulator software architecture, (2) analyzing the interaction between the researchers and simulation programmers, and (3) proposing a documentation tool for the researchers.

  19. Parallel discrete event simulation: A shared memory approach

    NASA Technical Reports Server (NTRS)

    Reed, Daniel A.; Malony, Allen D.; Mccredie, Bradley D.

    1987-01-01

    With traditional event list techniques, evaluating a detailed discrete event simulation model can often require hours or even days of computation time. Parallel simulation mimics the interacting servers and queues of a real system by assigning each simulated entity to a processor. By eliminating the event list and maintaining only sufficient synchronization to ensure causality, parallel simulation can potentially provide speedups that are linear in the number of processors. A set of shared memory experiments is presented using the Chandy-Misra distributed simulation algorithm to simulate networks of queues. Parameters include queueing network topology and routing probabilities, number of processors, and assignment of network nodes to processors. These experiments show that Chandy-Misra distributed simulation is a questionable alternative to sequential simulation of most queueing network models.
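    The "traditional event list technique" that the abstract contrasts with Chandy-Misra can be made concrete with a minimal sequential event-list simulation of a single M/M/1 queue. Parameters are illustrative; the Chandy-Misra algorithm itself, with its per-processor clocks and null messages, is beyond this sketch.

```python
import heapq
import random

def mm1_event_list(lam=0.8, mu=1.0, n_jobs=5000, seed=1):
    """Sequential event-list simulation of an M/M/1 queue: a global
    time-ordered heap of pending events drives the simulation, which is
    exactly the centralized structure parallel simulation eliminates.
    Returns the mean waiting time in queue."""
    rng = random.Random(seed)
    events = []  # (time, kind) min-heap: the global event list
    queue, busy, done, waits = [], False, 0, []
    heapq.heappush(events, (rng.expovariate(lam), "arrival"))
    while done < n_jobs:
        t, kind = heapq.heappop(events)
        if kind == "arrival":
            # Schedule the next arrival, then serve or enqueue this one
            heapq.heappush(events, (t + rng.expovariate(lam), "arrival"))
            if busy:
                queue.append(t)
            else:
                busy = True
                waits.append(0.0)
                heapq.heappush(events, (t + rng.expovariate(mu), "departure"))
        else:  # departure: start the next queued job, if any
            done += 1
            if queue:
                arrived = queue.pop(0)
                waits.append(t - arrived)
                heapq.heappush(events, (t + rng.expovariate(mu), "departure"))
            else:
                busy = False
    return sum(waits) / len(waits)

mean_wait = mm1_event_list()
```

For these rates the theoretical mean queueing delay is lam / (mu * (mu - lam)) = 4.0, so the estimate from a single run should land in that neighbourhood.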

  20. MOSES: A Matlab-based open-source stochastic epidemic simulator.

    PubMed

    Varol, Huseyin Atakan

    2016-08-01

    This paper presents an open-source stochastic epidemic simulator. The Discrete Time Markov Chain based simulator is implemented in Matlab. The simulator, capable of simulating the SEQIJR (susceptible, exposed, quarantined, infected, isolated and recovered) model, can be reduced to simpler models by setting some of the parameters (transition probabilities) to zero. Similarly, it can be extended to more complicated models by editing the source code. It is designed to be used for testing different control algorithms to contain epidemics. The simulator is also designed to be compatible with a network-based epidemic simulator and can be used in the network-based scheme for the simulation of a node. Simulations show the capability of reproducing different epidemic model behaviors successfully in a computationally efficient manner.
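    The reduce-by-zeroing idea can be sketched with a discrete-time Markov chain model in the spirit of SEQIJR. The original simulator is Matlab; this Python sketch drops the quarantine (Q) and isolation (J) compartments, i.e., it is the SEIR reduction obtained by setting those transition probabilities to zero, and all parameter values are illustrative assumptions.

```python
import random

def dtmc_seir(n=1000, i0=10, p_inf=0.3, p_eqi=0.2, p_rec=0.1,
              steps=200, seed=7):
    """Discrete-time Markov chain epidemic (SEIR reduction of a
    SEQIJR-style model). Per step, each susceptible becomes exposed
    w.p. p_inf*I/n, each exposed becomes infected w.p. p_eqi, and
    each infected recovers w.p. p_rec."""
    rng = random.Random(seed)
    S, E, I, R = n - i0, 0, i0, 0
    for _ in range(steps):
        new_e = sum(rng.random() < p_inf * I / n for _ in range(S))
        new_i = sum(rng.random() < p_eqi for _ in range(E))
        new_r = sum(rng.random() < p_rec for _ in range(I))
        S, E = S - new_e, E + new_e - new_i
        I, R = I + new_i - new_r, R + new_r
    return S, E, I, R

final = dtmc_seir()
```

Extending the sketch back toward the full SEQIJR model would amount to adding Q and J compartments with their own nonzero transition probabilities, mirroring how the Matlab simulator is extended or reduced.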

  1. A Software Framework for Aircraft Simulation

    NASA Technical Reports Server (NTRS)

    Curlett, Brian P.

    2008-01-01

    The National Aeronautics and Space Administration Dryden Flight Research Center has a long history in developing simulations of experimental fixed-wing aircraft from gliders to suborbital vehicles on platforms ranging from desktop simulators to pilot-in-the-loop/aircraft-in-the-loop simulators. Regardless of the aircraft or simulator hardware, much of the software framework is common to all NASA Dryden simulators. Some of this software has withstood the test of time, but in recent years the push toward high-fidelity user-friendly simulations has resulted in some significant changes. This report presents an overview of the current NASA Dryden simulation software framework and capabilities with an emphasis on the new features that have permitted NASA to develop more capable simulations while maintaining the same staffing levels.

  2. LPJ-GUESS Simulated North America Vegetation for 21-0 ka Using the TraCE-21ka Climate Simulation

    NASA Astrophysics Data System (ADS)

    Shafer, S. L.; Bartlein, P. J.

    2016-12-01

    Transient climate simulations that span multiple millennia (e.g., TraCE-21ka) have become more common as computing power has increased, allowing climate models to complete long simulations in relatively short periods of time (i.e., months). These climate simulations provide information on the potential rate, variability, and spatial expression of past climate changes. They also can be used as input data for other environmental models to simulate transient changes for different components of paleoenvironmental systems, such as vegetation. Long, transient paleovegetation simulations can provide information on a range of ecological processes, describe the spatial and temporal patterns of changes in species distributions, and identify the potential locations of past species refugia. Paleovegetation simulations also can be used to fill in spatial and temporal gaps in observed paleovegetation data (e.g., pollen records from lake sediments) and to test hypotheses of past vegetation change. We used the TraCE-21ka transient climate simulation for 21-0 ka from CCSM3, a coupled atmosphere-ocean general circulation model. The TraCE-21ka simulated temperature, precipitation, and cloud data were regridded onto a 10-minute grid of North America. These regridded climate data, along with soil data and atmospheric carbon dioxide concentrations, were used as input to LPJ-GUESS, a general ecosystem model, to simulate North America vegetation from 21-0 ka. LPJ-GUESS simulates many of the processes controlling the distribution of vegetation (e.g., competition), although some important processes (e.g., dispersal) are not simulated. We evaluate the LPJ-GUESS-simulated vegetation (in the form of plant functional types and biomes) for key time periods and compare the simulated vegetation with observed paleovegetation data, such as data archived in the Neotoma Paleoecology Database. 
In general, vegetation simulated by LPJ-GUESS reproduces the major North America vegetation patterns (e.g., forest, grassland) with regional areas of disagreement between simulated and observed vegetation. We describe the regions and time periods with the greatest data-model agreement and disagreement, and discuss some of the strengths and weaknesses of both the simulated climate and simulated vegetation data.

  3. Local and national laparoscopic skill competitions: residents' opinions and impact on adoption of simulation-based training.

    PubMed

    McCreery, Greig L; El-Beheiry, Mostafa; Schlachta, Christopher M

    2017-11-01

    Dedicated practice using laparoscopic simulators improves operative performance, yet voluntary utilization is minimal. We hypothesized that skill competition between peers, at the local and national level, positively influences residents' use of laparoscopic simulators. A web-based survey evaluated the relationship between Canadian General Surgery residents' use of laparoscopic simulation and participation in competition. Secondary outcomes assessed attitudes regarding simulation training, factors limiting use, and associations between competition level and usage. One hundred ninety (23%) of 826 potential participants responded. Eighty-three percent rated their laparoscopic abilities as novice or intermediate. More than 70% agreed that simulation practice improves intra-operative performance and should be a mandatory component of training. However, 58% employed simulator practice less than once per month, and 18% never used a simulator. Sixty-five percent engaged in simulator training for 5 h or less over the preceding 6 months. Seventy-three percent had participated in laparoscopic skill competition. Of those, 51% agreed that competition was a motivation for simulation practice. No association was found between competition experience and simulator use overall. However, 83% of those who had competed nationally reported >5 h of simulator use in the previous 6 months, compared to those with no competition experience (26%), local competition (40%), and local national-qualifying competition (23%) (p < 0.001). This study does not support the hypothesis that competition alone universally increases voluntary use of simulation-based training; only the minority of individuals competing at the national level demonstrated significantly higher simulation use. However, simulation training was perceived as a valuable exercise. Lack of time and access to simulators, as opposed to lack of interest, were the most commonly reported limits on use.

  4. The Effectiveness of Remote Facilitation in Simulation-Based Pediatric Resuscitation Training for Medical Students.

    PubMed

    Ohta, Kunio; Kurosawa, Hiroshi; Shiima, Yuko; Ikeyama, Takanari; Scott, James; Hayes, Scott; Gould, Michael; Buchanan, Newton; Nadkarni, Vinay; Nishisaki, Akira

    2017-08-01

    To assess the effectiveness of pediatric simulation by remote facilitation. We hypothesized that simulation by remote facilitation is more effective than simulation by an on-site facilitator. We defined remote facilitation as a model in which a facilitator remotely (1) introduces simulation-based learning and the simulation environment, (2) runs scenarios, and (3) performs debriefing together with an on-site facilitator. A remote simulation program for medical students during their pediatric rotation was implemented. Groups were allocated to either remote or on-site facilitation depending on the availability of telemedicine technology. Both groups had identical 1-hour simulation sessions with 2 scenarios and debriefing. Their team performance was assessed with a behavioral assessment tool by a trained rater. Perception by students was evaluated with a Likert scale (1-7). Fifteen groups with 89 students participated in a simulation by remote facilitation, and 8 groups with 47 students participated in a simulation by on-site facilitation. Participant demographics and previous simulation experience were similar. Both groups improved their performance from the first to the second scenario: groups by remote simulation (first [8.5 ± 4.2] vs second [13.2 ± 6.2], P = 0.003), and groups by on-site simulation (first [6.9 ± 4.1] vs second [12.4 ± 6.4], P = 0.056). The performance improvement was not significantly different between the 2 groups (P = 0.94). Faculty evaluation by students was equally high in both groups (7 vs 7; P = 0.65). A pediatric acute care simulation by remote facilitation significantly improved students' performance. In this pilot study, remote facilitation appears as effective as traditional, locally facilitated simulation. Remote simulation can be a strong alternative method, especially where experienced facilitators are limited.

  5. Driving simulator sickness: Impact on driving performance, influence of blood alcohol concentration, and effect of repeated simulator exposures.

    PubMed

    Helland, Arne; Lydersen, Stian; Lervåg, Lone-Eirin; Jenssen, Gunnar D; Mørland, Jørg; Slørdal, Lars

    2016-09-01

    Simulator sickness is a major obstacle to the use of driving simulators for research, training, and driver assessment purposes. The purpose of the present study was to investigate the possible influence of simulator sickness on driving performance measures such as standard deviation of lateral position (SDLP), and the effect of alcohol or repeated simulator exposure on the degree of simulator sickness. Twenty healthy male volunteers underwent three 1-h simulated driving trials with a curvy rural road scenario and rated their degree of simulator sickness after each trial. Subjects drove sober and with blood alcohol concentrations (BAC) of approximately 0.5 g/L and 0.9 g/L in a randomized order. Simulator sickness score (SSS) did not influence the primary outcome measure, SDLP. Higher SSS significantly predicted lower average speed and a lower frequency of steering wheel reversals. These effects seemed to be mitigated by alcohol. Higher BAC significantly predicted lower SSS, suggesting that alcohol inebriation alleviates simulator sickness. The negative relation between the number of previous exposures to the simulator and SSS was not statistically significant, but is consistent with habituation to the sickness-inducing effects, as shown in other studies. Overall, the results suggest no influence of simulator sickness on SDLP or several other driving performance measures. However, simulator sickness seems to cause test subjects to drive more carefully, with lower average speed and fewer steering wheel reversals, hampering the interpretation of these outcomes as measures of driving impairment and safety. BAC and repeated simulator exposures may act as confounding variables by influencing the degree of simulator sickness in experimental studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
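    The primary outcome above, SDLP, is simply the standard deviation of the vehicle's lateral position over a driving segment; a minimal sketch with hypothetical position samples (not data from the study):

    ```python
    import statistics

    def sdlp(lateral_positions_m):
        """Standard deviation of lateral position (SDLP) in metres:
        a standard lane-keeping measure; higher values mean more weaving."""
        return statistics.stdev(lateral_positions_m)

    # Hypothetical lateral-position samples (metres from lane centre)
    sober = [0.02, -0.05, 0.04, -0.03, 0.01, -0.02]
    impaired = [0.10, -0.25, 0.30, -0.20, 0.15, -0.28]
    assert sdlp(impaired) > sdlp(sober)
    ```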

  6. A Functional Comparison of Lunar Regoliths and Their Simulants

    NASA Technical Reports Server (NTRS)

    Rickman, D.; Edmunson, J.; McLemore, C.

    2012-01-01

    Lunar regolith simulants are essential to the development of technology for human exploration of the Moon. Any equipment that will interact with the surface environment must be tested with simulant to mitigate risk. To reduce the greatest amount of risk, the simulant must replicate the lunar surface as well as possible. To quantify the similarities and differences between simulants, the Figures of Merit were developed. The Figures of Merit software compares the simulants and regolith by particle size, particle shape, density, and bulk chemistry and mineralogy; these four properties dictate the majority of the remaining characteristics of a geologic material. There are limitations to both the current Figures of Merit approach and simulants in general. The effect of particle textures is lacking in the Figures of Merit software, and research into this topic has only recently begun with applications to simulants. In addition, not all of the properties of lunar regolith are defined sufficiently for simulant reproduction or comparison; for example, the size distribution of particles greater than 1 centimeter and the makeup of particles less than 10 micrometers are not well known. For simulants, contamination by terrestrial weathering products or undesired trace phases in feedstock material is a major issue. Vapor deposited rims have not yet been created for simulants. Fortunately, previous limitations such as the lack of agglutinates in simulants have been addressed, and commercial companies are now making agglutinate material for simulants. Despite some limitations, the Figures of Merit sufficiently quantify the comparison between simulants and regolith for useful application in lunar surface technology. Over time, the compilation and analysis of simulant user data will add an advantageous predictive capability to the Figures of Merit, accurately relating Figures of Merit characteristics to simulant user parameters.

  7. A New Approach to Modeling Jupiter's Magnetosphere

    NASA Astrophysics Data System (ADS)

    Fukazawa, K.; Katoh, Y.; Walker, R. J.; Kimura, T.; Tsuchiya, F.; Murakami, G.; Kita, H.; Tao, C.; Murata, K. T.

    2017-12-01

    The scales in planetary magnetospheres range from 10s of planetary radii to kilometers. For a number of years we have studied the magnetospheres of Jupiter and Saturn by using 3-dimensional magnetohydrodynamic (MHD) simulations. However, we have not been able to reach even the limits of the MHD approximation because of the large amount of computer resources required. Recently, thanks to progress in supercomputer systems, we have obtained the capability to simulate Jupiter's magnetosphere with 1000 times the number of grid points used in our previous simulations. This has allowed us to combine the high-resolution global simulation with a micro-scale simulation of the Jovian magnetosphere. In particular, we can combine a hybrid (kinetic ions and fluid electrons) simulation with the MHD simulation. In addition, the new capability enables us to run multi-parameter survey simulations of the Jupiter-solar wind system. In this study we performed a high-resolution simulation of the Jovian magnetosphere to connect with the hybrid simulation, and lower-resolution simulations under various solar wind conditions to compare with Hisaki and Juno observations. In the high-resolution simulation we used a regular Cartesian grid with 0.15 RJ grid spacing and placed the inner boundary at 7 RJ. From these simulation settings, we provide the magnetic field out to around 20 RJ from Jupiter as a background field for the hybrid simulation. For the first time we have been able to resolve Kelvin-Helmholtz waves on the magnetopause. We have investigated solar wind dynamic pressures between 0.01 and 0.09 nPa for a number of IMF values. The raw data from these simulations are available for download by registered users. We have compared the results of these simulations with Hisaki auroral observations.
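    A multi-parameter survey of the kind described above amounts to crossing the dynamic-pressure range (0.01-0.09 nPa) with a set of IMF values; a minimal sketch of building such a run matrix (the IMF Bz values here are placeholders, not values from the study):

    ```python
    import itertools

    # Solar wind dynamic pressures (nPa), spanning the survey range above
    pressures_nPa = [0.01, 0.03, 0.05, 0.07, 0.09]
    # Placeholder IMF Bz values (nT) -- illustrative only
    imf_bz_nT = [-1.0, 0.0, 1.0]

    # One simulation case per (pressure, IMF) combination
    runs = [{"p_dyn_nPa": p, "imf_bz_nT": bz}
            for p, bz in itertools.product(pressures_nPa, imf_bz_nT)]
    print(len(runs))  # 15 simulation cases
    ```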

  8. Virtual versus face-to-face clinical simulation in relation to student knowledge, anxiety, and self-confidence in maternal-newborn nursing: A randomized controlled trial.

    PubMed

    Cobbett, Shelley; Snelgrove-Clarke, Erna

    2016-10-01

    Clinical simulations can provide students with realistic clinical learning environments to increase their knowledge and self-confidence, and decrease their anxiety, prior to entering clinical practice settings. To compare the effectiveness of two maternal newborn clinical simulation modalities: virtual clinical simulation and face-to-face high-fidelity manikin simulation. Randomized pretest-posttest design. A public research university in Canada. Fifty-six third-year Bachelor of Science in Nursing students. Participants were randomized to either face-to-face or virtual clinical simulation and then to dyads for completion of two clinical simulations. Measures included: (1) Nursing Anxiety and Self-Confidence with Clinical Decision Making Scale (NASC-CDM) (White, 2011), (2) knowledge pretest and post-test related to preeclampsia and group B strep, and (3) Simulation Completion Questionnaire. Students completed a knowledge test and the NASC-CDM before and after each simulation, and the Simulation Completion Questionnaire at study completion. There were no statistically significant differences in student knowledge and self-confidence between face-to-face and virtual clinical simulations. Anxiety scores were higher for students in the virtual clinical simulation than for those in the face-to-face simulation. Students' self-reported preference was face-to-face, citing the similarities to practicing in a 'real' situation and the immediate debrief. Students not liking the virtual clinical simulation most often cited technological issues as their rationale. Given the equivalence in knowledge and self-confidence between the face-to-face and virtual maternal newborn clinical simulations identified in this trial, it is important to take into consideration the costs and benefits/risks of simulation implementation.

  9. Reproducible computational biology experiments with SED-ML--the Simulation Experiment Description Markup Language.

    PubMed

    Waltemath, Dagmar; Adams, Richard; Bergmann, Frank T; Hucka, Michael; Kolpakov, Fedor; Miller, Andrew K; Moraru, Ion I; Nickerson, David; Sahle, Sven; Snoep, Jacky L; Le Novère, Nicolas

    2011-12-15

    The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes, in a computer-readable exchange format, the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project; it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiment in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply to the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about the exact modeling language(s) used, experiments covering models from different fields of research can be accurately described and combined.
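    As a rough illustration of the structure such a document encodes (a model reference, a time course simulation, and a task binding the two), the following sketch assembles a minimal SED-ML-like XML skeleton with Python's standard library; the element names follow the Level 1 Version 1 vocabulary, but the document is simplified and not schema-validated:

    ```python
    import xml.etree.ElementTree as ET

    # Minimal SED-ML-like skeleton (attributes simplified for illustration)
    sed = ET.Element("sedML", level="1", version="1")
    models = ET.SubElement(sed, "listOfModels")
    ET.SubElement(models, "model", id="model1", source="model.xml",
                  language="urn:sedml:language:sbml")
    sims = ET.SubElement(sed, "listOfSimulations")
    ET.SubElement(sims, "uniformTimeCourse", id="sim1",
                  initialTime="0", outputStartTime="0",
                  outputEndTime="100", numberOfPoints="1000")
    tasks = ET.SubElement(sed, "listOfTasks")
    ET.SubElement(tasks, "task", id="task1",
                  modelReference="model1", simulationReference="sim1")

    xml_text = ET.tostring(sed, encoding="unicode")
    ```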

  10. Striving for Better Medical Education: the Simulation Approach.

    PubMed

    Sakakushev, Boris E; Marinov, Blagoi I; Stefanova, Penka P; Kostianev, Stefan St; Georgiou, Evangelos K

    2017-06-01

    Medical simulation is a rapidly expanding area within medical education due to advances in technology, significant reduction in training hours, and increased procedural complexity. Simulation training aims to enhance patient safety through improved technical competency and eliminating human factors in a risk-free environment. It is particularly applicable to practical, procedure-oriented specialties. Simulation can be useful for novice trainees, experienced clinicians (e.g. for revalidation) and team building. It has become a cornerstone in the delivery of medical education, being a paradigm shift in how doctors are educated and trained. Simulation must take a proactive position in the development of a metric-based simulation curriculum and the adoption of proficiency benchmarking definitions, and should not depend on the simulation platforms used. Conversely, ingraining of poor practice may occur in the absence of adequate supervision, and equipment malfunction during the simulation can break the immersion and disrupt any learning that has occurred. Despite the presence of high technology, there is a substantial learning curve for both learners and facilitators. The technology of simulation continues to advance, offering devices capable of improved fidelity in virtual reality simulation, more sophisticated procedural practice and advanced patient simulators. Simulation-based training has also brought about paradigm shifts in the medical and surgical education arenas and ensured that the scope and impact of simulation will continue to broaden.

  11. Validation of the Monte Carlo simulator GATE for indium-111 imaging.

    PubMed

    Assié, K; Gardin, I; Véra, P; Buvat, I

    2005-07-07

    Monte Carlo simulations are useful for optimizing and assessing single photon emission computed tomography (SPECT) protocols, especially when aiming at measuring quantitative parameters from SPECT images. Before Monte Carlo simulated data can be trusted, the simulation model must be validated. The purpose of this work was to validate the use of GATE, a new Monte Carlo simulation platform based on GEANT4, for modelling indium-111 SPECT data, the quantification of which is of foremost importance for dosimetric studies. To that end, acquisitions of (111)In line sources in air and in water and of a cylindrical phantom were performed, together with the corresponding simulations. The simulation model included Monte Carlo modelling of the camera collimator and of a back-compartment accounting for photomultiplier tubes and associated electronics. Energy spectra, spatial resolution, sensitivity values, images and count profiles obtained for experimental and simulated data were compared. An excellent agreement was found between experimental and simulated energy spectra. For source-to-collimator distances varying from 0 to 20 cm, simulated and experimental spatial resolution differed by less than 2% in air, while the simulated sensitivity values were within 4% of the experimental values. The simulation of the cylindrical phantom closely reproduced the experimental data. These results suggest that GATE enables accurate simulation of (111)In SPECT acquisitions.
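    The agreement figures quoted above (spatial resolution within 2%, sensitivity within 4%) reduce to relative differences between simulated and measured values; a minimal sketch of such a validation check, with hypothetical numbers rather than the study's data:

    ```python
    def rel_diff_pct(simulated, measured):
        """Relative difference of a simulated value from measurement, in %."""
        return 100.0 * abs(simulated - measured) / abs(measured)

    # Hypothetical sensitivity values at several source-to-collimator distances
    measured  = [10.2, 9.8, 9.1]
    simulated = [10.4, 9.6, 9.3]

    # Validation criterion in the spirit of the study: all within 4%
    assert all(rel_diff_pct(s, m) < 4.0 for s, m in zip(simulated, measured))
    ```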

  12. Traffic and Driving Simulator Based on Architecture of Interactive Motion.

    PubMed

    Paz, Alexander; Veeramisti, Naveen; Khaddar, Romesh; de la Fuente-Mella, Hanns; Modorcea, Luiza

    2015-01-01

    This study proposes an architecture for an interactive motion-based traffic simulation environment. In order to enhance modeling realism involving actual human beings, the proposed architecture integrates multiple types of simulation, including: (i) motion-based driving simulation, (ii) pedestrian simulation, (iii) motorcycling and bicycling simulation, and (iv) traffic flow simulation. The architecture has been designed to enable the simulation of the entire network; as a result, the actual driver, pedestrian, and bike rider can navigate anywhere in the system. In addition, the background traffic interacts with the actual human beings. This is accomplished by using a hybrid mesomicroscopic traffic flow simulation modeling approach. The mesoscopic traffic flow simulation model loads the results of a user equilibrium traffic assignment solution and propagates the corresponding traffic through the entire system. The microscopic traffic flow simulation model provides background traffic around the vicinities where actual human beings are navigating the system. The two traffic flow simulation models interact continuously to update system conditions based on the interactions between actual humans and the fully simulated entities. Implementation efforts are currently in progress and some preliminary tests of individual components have been conducted. The implementation of the proposed architecture faces significant challenges ranging from multiplatform and multilanguage integration to multievent communication and coordination.
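    The hybrid meso/micro handoff described above (microscopic simulation only in the vicinity of actual human participants, mesoscopic propagation elsewhere) can be sketched as a simple zone test; the handoff radius and the point representation of entities are hypothetical, not taken from the paper:

    ```python
    import math

    MICRO_RADIUS_M = 500.0  # hypothetical handoff radius around a human driver

    def needs_micro(entity_xy, human_positions):
        """True if a background vehicle is close enough to any actual human
        participant that the microscopic model should handle it; otherwise
        it stays in the mesoscopic model."""
        return any(math.dist(entity_xy, h) < MICRO_RADIUS_M
                   for h in human_positions)

    humans = [(0.0, 0.0)]
    assert needs_micro((100.0, 200.0), humans)      # nearby -> microscopic
    assert not needs_micro((5000.0, 0.0), humans)   # far -> mesoscopic
    ```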

  13. Traffic and Driving Simulator Based on Architecture of Interactive Motion

    PubMed Central

    Paz, Alexander; Veeramisti, Naveen; Khaddar, Romesh; de la Fuente-Mella, Hanns; Modorcea, Luiza

    2015-01-01

    This study proposes an architecture for an interactive motion-based traffic simulation environment. In order to enhance modeling realism involving actual human beings, the proposed architecture integrates multiple types of simulation, including: (i) motion-based driving simulation, (ii) pedestrian simulation, (iii) motorcycling and bicycling simulation, and (iv) traffic flow simulation. The architecture has been designed to enable the simulation of the entire network; as a result, the actual driver, pedestrian, and bike rider can navigate anywhere in the system. In addition, the background traffic interacts with the actual human beings. This is accomplished by using a hybrid mesomicroscopic traffic flow simulation modeling approach. The mesoscopic traffic flow simulation model loads the results of a user equilibrium traffic assignment solution and propagates the corresponding traffic through the entire system. The microscopic traffic flow simulation model provides background traffic around the vicinities where actual human beings are navigating the system. The two traffic flow simulation models interact continuously to update system conditions based on the interactions between actual humans and the fully simulated entities. Implementation efforts are currently in progress and some preliminary tests of individual components have been conducted. The implementation of the proposed architecture faces significant challenges ranging from multiplatform and multilanguage integration to multievent communication and coordination. PMID:26491711

  14. The Persistent Issue of Simulator Sickness in Naval Aviation Training.

    PubMed

    Geyer, Daniel J; Biggs, Adam T

    2018-04-01

    Virtual simulations offer nearly unlimited training potential for naval aviation due to the wide array of scenarios that can be simulated in a safe, reliable, and cost-effective environment. This versatility has created substantial interest in using existing and emerging virtual technology to enhance training scenarios. However, the virtual simulations themselves may hinder training initiatives by inducing simulator sickness among the trainees, which is a series of symptoms similar to motion sickness that can arise from simulator use. Simulator sickness has been a problem for military aviation since the first simulators were introduced. The problem has also persisted despite the increasing fidelity and sense of immersion offered by new generations of simulators. As such, it is essential to understand the various problems so that trainers can ensure the best possible use of the simulators. This review will examine simulator sickness as it pertains to naval aviation training. Topics include: the prevailing theories on why symptoms develop, methods of measurement, contributing factors, effects on training, effects when used shipboard, aftereffects, countermeasures, and recommendations for future research involving virtual simulations in an aviation training environment. Geyer DJ, Biggs AT. The persistent issue of simulator sickness in naval aviation training. Aerosp Med Hum Perform. 2018; 89(4):396-405.

  15. 14 CFR 142.3 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... instruction in accordance with subpart C of this part. Line-Operational Simulation means simulation conducted..., and ground operations. Line operational simulations are conducted for training and evaluation purposes and include random, abnormal, and emergency occurrences. Line operational simulation...

  16. 14 CFR 142.3 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... instruction in accordance with subpart C of this part. Line-Operational Simulation means simulation conducted..., and ground operations. Line operational simulations are conducted for training and evaluation purposes and include random, abnormal, and emergency occurrences. Line operational simulation...

  17. 14 CFR 142.3 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... instruction in accordance with subpart C of this part. Line-Operational Simulation means simulation conducted..., and ground operations. Line operational simulations are conducted for training and evaluation purposes and include random, abnormal, and emergency occurrences. Line operational simulation...

  18. Medical simulation: Overview, and application to wound modelling and management

    PubMed Central

    Pai, Dinker R.; Singh, Simerjit

    2012-01-01

    Simulation in medical education is progressing in leaps and bounds. The need for simulation in medical education and training is increasing because of a) overall increase in the number of medical students vis-à-vis the availability of patients; b) increasing awareness among patients of their rights and consequent increase in litigations and c) tremendous improvement in simulation technology which makes simulation more and more realistic. Simulation in wound care can be divided into use of simulation in wound modelling (to test the effect of projectiles on the body) and simulation for training in wound management. Though this science is still in its infancy, more and more researchers are now devising both low-technology and high-technology (virtual reality) simulators in this field. It is believed that simulator training will eventually translate into better wound care in real patients, though this will be the subject of further research. PMID:23162218

  19. A 2.5D Computational Method to Simulate Cylindrical Fluidized Beds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Tingwen; Benyahia, Sofiane; Dietiker, Jeff

    2015-02-17

    In this paper, the limitations of axisymmetric and Cartesian two-dimensional (2D) simulations of cylindrical gas-solid fluidized beds are discussed. A new method has been proposed to carry out pseudo-two-dimensional (2.5D) simulations of a cylindrical fluidized bed by appropriately combining computational domains of Cartesian 2D and axisymmetric simulations. The proposed method was implemented in the open-source code MFIX and applied to the simulation of a lab-scale bubbling fluidized bed with the necessary sensitivity study. After a careful grid study to ensure the numerical results are grid independent, detailed comparisons of the flow hydrodynamics were presented against axisymmetric and Cartesian 2D simulations. Furthermore, the 2.5D simulation results have been compared to a three-dimensional (3D) simulation for evaluation. This new approach yields better agreement with the 3D simulation results than with axisymmetric and Cartesian 2D simulations.
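    A grid study of the kind mentioned above typically refines the mesh until a monitored quantity stops changing within a tolerance; a minimal sketch of that acceptance test, with invented values (the monitored quantity and tolerance are hypothetical, not from the paper):

    ```python
    def grid_independent(values, tol_pct=1.0):
        """True if the last grid refinement changed the monitored quantity
        (e.g. a bed pressure drop) by less than tol_pct percent relative
        to the previous grid."""
        last, prev = values[-1], values[-2]
        return 100.0 * abs(last - prev) / abs(prev) < tol_pct

    # Hypothetical pressure drop (Pa) on successively finer grids
    assert grid_independent([820.0, 801.0, 798.5, 798.1])   # converged
    assert not grid_independent([820.0, 760.0])             # still changing
    ```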

  20. Medium Fidelity Simulation of Oxygen Tank Venting

    NASA Technical Reports Server (NTRS)

    Sweet, Adam; Kurien, James; Lau, Sonie (Technical Monitor)

    2001-01-01

    The item to be cleared is a medium-fidelity software simulation model of a vented cryogenic tank. Such tanks are commonly used to transport cryogenic liquids such as liquid oxygen via truck, and have appeared on liquid-fueled rockets for decades. This simulation model works with the HCC simulation system, which was developed by Xerox PARC and NASA Ames Research Center; HCC has been previously cleared for distribution. When used with the HCC software, the model generates simulated readings for the tank pressure and temperature as the simulated cryogenic liquid boils off and is vented. Failures (such as a broken vent valve) can be injected into the simulation to produce readings corresponding to the failure. Release of this simulation will allow researchers to test their software diagnosis systems by attempting to diagnose the simulated failure from the simulated readings. This model does not contain any encryption software, nor can it perform any control tasks that might be export controlled.
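    A toy version of the fault-injection idea (nominal boil-off balanced by venting versus a stuck-closed vent valve driving pressure up) might look like the sketch below; the rates, units, and initial pressure are invented for illustration and have nothing to do with the actual model:

    ```python
    def simulate_tank(steps, vent_ok=True, boiloff=2.0, vent=2.0, p0=100.0):
        """Toy pressure trace (kPa): boil-off raises pressure each step and a
        working vent removes it. A failed (stuck-closed) valve lets pressure
        climb -- the signature a diagnosis system should detect."""
        p, trace = p0, []
        for _ in range(steps):
            p += boiloff - (vent if vent_ok else 0.0)
            trace.append(p)
        return trace

    nominal = simulate_tank(10)
    failed = simulate_tank(10, vent_ok=False)
    assert nominal[-1] == 100.0 and failed[-1] > nominal[-1]
    ```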

  1. An Overview of the Distributed Space Exploration Simulation (DSES) Project

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Chung, Victoria I.; Blum, Michael G.; Bowman, James D.

    2007-01-01

    This paper describes the Distributed Space Exploration Simulation (DSES) Project, a research and development collaboration between NASA centers which investigates technologies and processes related to integrated, distributed simulation of complex space systems in support of NASA's Exploration Initiative. In particular, it describes the three major components of DSES: network infrastructure, software infrastructure and simulation development. With regard to network infrastructure, DSES is developing a Distributed Simulation Network for use by all NASA centers. With regard to software, DSES is developing software models, tools and procedures that streamline distributed simulation development and provide an interoperable infrastructure for agency-wide integrated simulation. Finally, with regard to simulation development, DSES is developing an integrated end-to-end simulation capability to support NASA development of new exploration spacecraft and missions. This paper presents the current status and plans for these three areas, including examples of specific simulations.

  2. INACSL Standards of Best Practice for Simulation: Past, Present, and Future.

    PubMed

    Sittner, Barbara J; Aebersold, Michelle L; Paige, Jane B; Graham, Leslie L M; Schram, Andrea Parsons; Decker, Sharon I; Lioce, Lori

    2015-01-01

    To describe the historical evolution of the International Nursing Association for Clinical Simulation and Learning's (INACSL) Standards of Best Practice: Simulation. The establishment of simulation standards began as a concerted effort by the INACSL Board of Directors in 2010 to provide best practices to design, conduct, and evaluate simulation activities in order to advance the science of simulation as a teaching methodology. A comprehensive review of the evolution of INACSL Standards of Best Practice: Simulation was conducted using journal publications, the INACSL website, INACSL member survey, and reports from members of the INACSL Standards Committee. The initial seven standards, published in 2011, were reviewed and revised in 2013. Two new standards were published in 2015. The standards will continue to evolve as the science of simulation advances. As the use of simulation-based experiences increases, the INACSL Standards of Best Practice: Simulation are foundational to standardizing language, behaviors, and curricular design for facilitators and learners.

  3. SMI Compatible Simulation Scheduler Design for Reuse of Model Complying with Smp Standard

    NASA Astrophysics Data System (ADS)

    Koo, Cheol-Hea; Lee, Hoon-Hee; Cheon, Yee-Jin

    2010-12-01

    Software reusability is one of the key factors that impact cost and schedule on a software development project. It is also crucial in satellite simulator development, since there are many commercial simulator models related to satellites and dynamics. If these models can be used in another simulator platform, a great deal of confidence and cost/schedule reduction would be achieved. Simulation model portability (SMP) is maintained by the European Space Agency, and many models compatible with SMP/simulation model interface (SMI) are available. Korea Aerospace Research Institute (KARI) is developing a hardware abstraction layer (HAL) supported satellite simulator to verify the on-board software of a satellite. For these reasons, KARI wants to port SMI compatible models to the HAL supported satellite simulator. To port these SMI compatible models to the HAL supported satellite simulator, a simulation scheduler has been preliminarily designed according to the SMI standard.
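    At its core, a simulation scheduler of this kind dispatches model entry points in simulated-time order at fixed rates; a minimal, hypothetical sketch of that idea (this is not the SMI API, and real SMP/SMI schedulers add event types, priorities, and model lifecycle calls):

    ```python
    import heapq

    def run_scheduler(tasks, t_end):
        """Dispatch (period, callback) tasks in simulated-time order up to
        t_end, using a heap keyed on next activation time (index i breaks
        ties so callbacks are never compared)."""
        queue = [(0.0, i, period, cb) for i, (period, cb) in enumerate(tasks)]
        heapq.heapify(queue)
        while queue and queue[0][0] <= t_end:
            t, i, period, cb = heapq.heappop(queue)
            cb(t)  # invoke the model's entry point at simulated time t
            heapq.heappush(queue, (t + period, i, period, cb))

    calls = []
    run_scheduler([(0.5, lambda t: calls.append(("fast", t))),
                   (1.0, lambda t: calls.append(("slow", t)))], t_end=1.0)
    ```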

  4. Medical simulation: Overview, and application to wound modelling and management.

    PubMed

    Pai, Dinker R; Singh, Simerjit

    2012-05-01

    Simulation in medical education is progressing in leaps and bounds. The need for simulation in medical education and training is increasing because of a) overall increase in the number of medical students vis-à-vis the availability of patients; b) increasing awareness among patients of their rights and consequent increase in litigations and c) tremendous improvement in simulation technology which makes simulation more and more realistic. Simulation in wound care can be divided into use of simulation in wound modelling (to test the effect of projectiles on the body) and simulation for training in wound management. Though this science is still in its infancy, more and more researchers are now devising both low-technology and high-technology (virtual reality) simulators in this field. It is believed that simulator training will eventually translate into better wound care in real patients, though this will be the subject of further research.

  5. Transfer of training and simulator qualification or myth and folklore in helicopter simulation

    NASA Technical Reports Server (NTRS)

    Dohme, Jack

    1992-01-01

    Transfer of training studies at Fort Rucker using the backward-transfer paradigm have shown that existing flight simulators are not entirely adequate for meeting training requirements. Using an ab initio training research simulator (a simulation of the UH-1), training effectiveness ratios were developed; the data demonstrate it to be a cost-effective primary trainer. A simulator qualification method was suggested in which a combination of these transfer-of-training paradigms is used to determine overall simulator fidelity and training effectiveness.

  6. Modeling of Army Research Laboratory EMP simulators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miletta, J.R.; Chase, R.J.; Luu, B.B.

    1993-12-01

    Models are required that permit the estimation of emitted field signatures from EMP simulators to design the simulator antenna structure, to establish the usable test volumes, and to estimate human exposure risk. This paper presents the capabilities and limitations of a variety of EMP simulator models useful to the Army's EMP survivability programs. Comparisons among frequency and time-domain models are provided for two powerful US Army Research Laboratory EMP simulators: AESOP (Army EMP Simulator Operations) and VEMPS II (Vertical EMP Simulator II).

  7. Expert systems and simulation models; Proceedings of the Seminar, Tucson, AZ, November 18, 19, 1985

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The seminar presents papers on modeling and simulation methodology, artificial intelligence and expert systems, environments for simulation/expert system development, and methodology for simulation/expert system development. Particular attention is given to simulation modeling concepts and their representation, modular hierarchical model specification, knowledge representation, and rule-based diagnostic expert system development. Other topics include the combination of symbolic and discrete event simulation, real time inferencing, and the management of large knowledge-based simulation projects.

  8. Use of a Virtual Learning Platform for Distance-Based Simulation in an Acute Care Nurse Practitioner Curriculum.

    PubMed

    Carman, Margaret; Xu, Shu; Rushton, Sharron; Smallheer, Benjamin A; Williams, Denise; Amarasekara, Sathya; Oermann, Marilyn H

    Acute care nurse practitioner (ACNP) programs that use high-fidelity simulation as a teaching tool need to consider innovative strategies to provide distance-based students with learning experiences that are comparable to those in a simulation laboratory. The purpose of this article is to describe the use of virtual simulations in a distance-based ACNP program and student performance in the simulations. Virtual simulations using iSimulate were integrated into the ACNP course to promote the translation of content into a clinical context and enable students to develop their knowledge and decision-making skills. With these simulations, students worked as a team, even though they were at different sites from each other and from the faculty, to manage care of an acutely ill patient. The students were assigned to simulation groups of 4 students each. One week before the simulation, they reviewed past medical records. The virtual simulation sessions were recorded and then evaluated. The evaluation tools assessed 8 areas of performance and included key behaviors in each of these areas to be performed by students in the simulation. More than 80% of the student groups performed the key behaviors. Virtual simulations provide a learning platform that allows live interaction between students and faculty, at a distance, and application of content to clinical situations. With simulation, learners have an opportunity to practice assessment and decision-making in emergency and high-risk situations. Simulations not only are valuable for student learning but also provide a nonthreatening environment for staff to practice, receive feedback on their skills, and improve their confidence.

  9. Using flight simulators aboard ships: human side effects of an optimal scenario with smooth seas.

    PubMed

    Muth, Eric R; Lawson, Ben

    2003-05-01

    The U.S. Navy is considering placing flight simulators aboard ships. It is known that certain types of flight simulators can elicit motion adaptation syndrome (MAS), and also that certain types of ship motion can cause MAS. The goal of this study was to determine if using a flight simulator during ship motion would cause MAS, even when the simulator stimulus and the ship motion were both very mild. All participants in this study completed three conditions. Condition 1 (Sim) entailed "flying" a personal computer-based flight simulator situated on land. Condition 2 (Ship) involved riding aboard a U.S. Navy Yard Patrol boat. Condition 3 (ShipSim) entailed "flying" a personal computer-based flight simulator while riding aboard a Yard Patrol boat. Before and after each condition, participants' balance and dynamic visual acuity were assessed. After each condition, participants filled out the Nausea Profile and the Simulator Sickness Questionnaire. Following exposure to a flight simulator aboard a ship, participants reported negligible symptoms of nausea and simulator sickness. However, participants exhibited a decrease in dynamic visual acuity after exposure to the flight simulator aboard ship (T[25] = 3.61, p < 0.05). Balance results were confounded by significant learning and, therefore, not interpretable. This study suggests that flight simulators can be used aboard ship. As a minimal safety precaution, these simulators should be used according to current safety practices for land-based simulators. Optimally, these simulators should be designed to minimize MAS, located near the ship's center of rotation and used when ship motion is not provocative.

  10. The use of psychiatry-focused simulation in undergraduate nursing education: A systematic search and review.

    PubMed

    Vandyk, Amanda D; Lalonde, Michelle; Merali, Sabrina; Wright, Erica; Bajnok, Irmajean; Davies, Barbara

    2018-04-01

    Evidence on the use of simulation to teach psychiatry and mental health (including addiction) content is emerging, yet no summary of the implementation processes or associated outcomes exists. The aim of this study was to systematically search and review empirical literature on the use of psychiatry-focused simulation in undergraduate nursing education. Objectives were to (i) assess the methodological quality of existing evidence on the use of simulation to teach mental health content to undergraduate nursing students, (ii) describe the operationalization of the simulations, and (iii) summarize the associated quantitative and qualitative outcomes. We conducted online database (MEDLINE, Embase, ERIC, CINAHL, PsycINFO from January 2004 to October 2015) and grey literature searches. Thirty-two simulation studies were identified describing and evaluating six types of simulations (standardized patients, audio simulations, high-fidelity simulators, virtual world, multimodal, and tabletop). Overall, 2724 participants were included in the studies. Studies reflected a limited number of intervention designs, and outcomes were evaluated with qualitative and quantitative methods incorporating a variety of tools. Results indicated that simulation was effective in reducing student anxiety and improving their knowledge, empathy, communication, and confidence. The summarized qualitative findings all supported the benefit of simulation; however, more research is needed to assess the comparative effectiveness of the types of simulations. Recommendations from the findings include the development of guidelines for educators to deliver each simulation component (briefing, active simulation, debriefing). Finally, consensus around appropriate training of facilitators is needed, as is consistent and agreed upon simulation terminology. © 2017 Australian College of Mental Health Nurses Inc.

  11. Simulation in International Relations Education.

    ERIC Educational Resources Information Center

    Starkey, Brigid A.; Blake, Elizabeth L.

    2001-01-01

    Discusses the educational implications of simulations in international relations. Highlights include the development of international relations simulations; the role of technology; the International Communication and Negotiation Simulations (ICONS) project at the University of Maryland; evolving information technology; and simulating real-world…

  12. The role of the research simulator in the systems development of rotorcraft

    NASA Technical Reports Server (NTRS)

    Statler, I. C.; Deel, A.

    1981-01-01

    The potential application of the research simulator to future rotorcraft systems design, development, product improvement evaluations, and safety analysis is examined. Current simulation capabilities for fixed-wing aircraft are reviewed and the requirements of a rotorcraft simulator are defined. The visual system components, vertical motion simulator, cab, and computation system for a research simulator under development are described.

  13. Hardware Fault Simulator for Microprocessors

    NASA Technical Reports Server (NTRS)

    Hess, L. M.; Timoc, C. C.

    1983-01-01

    A breadboarded circuit is faster and more thorough than a software simulator. An elementary fault simulator for an AND gate uses three gates and a shift register to simulate stuck-at-one or stuck-at-zero conditions at its inputs and output. Experimental results showed that a hardware fault simulator for a microprocessor gave results faster than a software simulator by two orders of magnitude, with one test applied every 4 microseconds.
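
    As a software counterpart to the hardware approach described in this record, the stuck-at fault model itself is easy to sketch. The following Python fragment (names and structure are illustrative, not the paper's circuit) injects a stuck-at fault into a 2-input AND gate and checks whether a test vector detects it:

```python
def and_gate(a, b, fault=None):
    """2-input AND with an optional stuck-at fault.
    fault: (node, value) where node is 'a', 'b' or 'out'
    and value is the stuck level (0 or 1)."""
    if fault:
        node, v = fault
        if node == 'a':
            a = v
        if node == 'b':
            b = v
    out = a & b
    if fault and fault[0] == 'out':
        out = fault[1]
    return out

def detects(fault, a, b):
    """A test vector (a, b) detects a fault exactly when the
    faulty output differs from the fault-free output."""
    return and_gate(a, b) != and_gate(a, b, fault)
```

    For example, the vector (1, 1) detects a stuck-at-zero on either input, since the fault-free output is 1 but the faulty output is 0.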

  14. The new ATLAS Fast Calorimeter Simulation

    NASA Astrophysics Data System (ADS)

    Schaarschmidt, J.; ATLAS Collaboration

    2017-10-01

    Current and future need for large scale simulated samples motivate the development of reliable fast simulation techniques. The new Fast Calorimeter Simulation is an improved parameterized response of single particles in the ATLAS calorimeter that aims to accurately emulate the key features of the detailed calorimeter response as simulated with Geant4, yet approximately ten times faster. Principal component analysis and machine learning techniques are used to improve the performance and decrease the memory need compared to the current version of the ATLAS Fast Calorimeter Simulation. A prototype of this new Fast Calorimeter Simulation is in development and its integration into the ATLAS simulation infrastructure is ongoing.
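
    As a generic illustration of how principal component analysis can compress a parameterized response (a sketch of the technique only, not the ATLAS implementation; function names and array shapes are assumptions), shower shapes stored as rows can be reduced to a few coefficients and approximately reconstructed:

```python
import numpy as np

def pca_compress(showers, n_comp):
    """Reduce each shower (one row) to n_comp principal-component coefficients."""
    mean = showers.mean(axis=0)
    X = showers - mean
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    comps = Vt[:n_comp]           # leading principal directions
    coeffs = X @ comps.T          # compact per-shower representation
    return mean, comps, coeffs

def pca_reconstruct(mean, comps, coeffs):
    """Approximate the original showers from the stored coefficients."""
    return mean + coeffs @ comps
```

    Storing only the mean, a few components, and per-shower coefficients is what reduces the memory footprint relative to keeping full parameterizations.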

  15. Accelerating a Particle-in-Cell Simulation Using a Hybrid Counting Sort

    NASA Astrophysics Data System (ADS)

    Bowers, K. J.

    2001-11-01

    In this article, performance limitations of the particle advance in a particle-in-cell (PIC) simulation are discussed. It is shown that the memory subsystem and cache-thrashing severely limit the speed of such simulations. Methods to implement a PIC simulation under such conditions are explored. An algorithm based on a counting sort is developed which effectively eliminates PIC simulation cache thrashing. Sustained performance gains of 40 to 70 percent are measured on commodity workstations for a minimal 2d2v electrostatic PIC simulation. More complete simulations are expected to have even better results as larger simulations are usually even more memory subsystem limited.
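
    The cache-friendly reordering described in this record can be sketched in a few lines. The following NumPy version (function name and data layout are illustrative, not the paper's implementation) sorts particles by cell index with a counting sort, so that particles in the same cell become contiguous in memory:

```python
import numpy as np

def counting_sort_particles(cell_idx, particle_data, n_cells):
    """Reorder particles by cell index so particles in the same cell
    are contiguous (better cache locality for the field gather/scatter).
    Runs in O(N + n_cells) and is stable within each cell."""
    counts = np.bincount(cell_idx, minlength=n_cells)
    offsets = np.concatenate(([0], np.cumsum(counts)[:-1]))  # first slot per cell
    order = np.empty(cell_idx.size, dtype=np.int64)
    next_slot = offsets.copy()
    for i, c in enumerate(cell_idx):
        order[next_slot[c]] = i
        next_slot[c] += 1
    return cell_idx[order], particle_data[order], offsets
```

    The `offsets` array also gives each cell's particle range for free, which the field gather/scatter loops can then traverse sequentially.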

  16. Displays and simulators

    NASA Astrophysics Data System (ADS)

    Mohon, N.

    A 'simulator' is defined as a machine which imitates the behavior of a real system in a very precise manner. The major components of a simulator and their interaction are outlined in brief form, taking into account the major components of an aircraft flight simulator. Particular attention is given to the visual display portion of the simulator, the basic components of the display, their interactions, and their characteristics. Real image displays are considered along with virtual image displays, and image generators. Attention is given to an advanced simulator for pilot training, a holographic pancake window, a scan laser image generator, the construction of an infrared target simulator, and the Apollo Command Module Simulator.

  17. Research of laser echo signal simulator

    NASA Astrophysics Data System (ADS)

    Xu, Rui; Shi, Rui; Wang, Xin; Li, Zhou

    2015-11-01

    Laser echo signal simulator is one of the most significant components of hardware-in-the-loop (HWIL) simulation systems for LADAR. System model and time series model of laser echo signal simulator are established. Some influential factors which could induce fixed error and random error on the simulated return signals are analyzed, and then these system insertion errors are analyzed quantitatively. Using this theoretical model, the simulation system is investigated experimentally. The results corrected by subtracting fixed error indicate that the range error of the simulated laser return signal is less than 0.25m, and the distance range that the system can simulate is from 50m to 20km.
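
    At the heart of any such range simulator is the round-trip time-of-flight relation t = 2R/c. A minimal sketch (an idealized model with a single fixed timing error, not the system described in the record):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def echo_delay(range_m):
    """Round-trip delay the simulator must generate for a target at range_m."""
    return 2.0 * range_m / C

def simulated_range(delay_s, fixed_error_s=0.0):
    """Range a LADAR under test would infer from the generated delay,
    including a fixed timing error of the simulator hardware."""
    return C * (delay_s + fixed_error_s) / 2.0
```

    Because range error scales as c/2 per unit of timing error, a hypothetical 1 ns fixed timing offset maps to roughly 0.15 m of range error; calibrating and subtracting such fixed errors is what makes sub-0.25 m corrected accuracy plausible.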

  18. Mars Smart Lander Simulations for Entry, Descent, and Landing

    NASA Technical Reports Server (NTRS)

    Striepe, S. A.; Way, D. W.; Balaram, J.

    2002-01-01

    Two primary simulations have been developed and are being updated for Mars Smart Lander Entry, Descent, and Landing (EDL): a high-fidelity engineering end-to-end EDL simulation based on NASA Langley's Program to Optimize Simulated Trajectories (POST), and an end-to-end real-time, hardware-in-the-loop simulation testbed based on NASA JPL's (Jet Propulsion Laboratory) Dynamics Simulator for Entry, Descent and Surface landing (DSENDS). This paper presents the current status of these Mars Smart Lander EDL end-to-end simulations. Various models and capabilities, as well as validation and verification of these simulations, are discussed.

  19. Use of Carbon Arc Lamps as Solar Simulation in Environmental Testing

    NASA Technical Reports Server (NTRS)

    Goggia, R. J.; Maclay, J. E.

    1962-01-01

    This report covers work done by the authors on the solar simulator for the six-foot diameter space simulator presently in use at JPL. The space simulator was made by modifying an existent vacuum chamber and uses carbon arc lamps for solar simulation. All Ranger vehicles flown to date have been tested in this facility. The report also contains a series of appendixes covering various aspects of space-simulation design and use. Some of these appendixes contain detailed analyses of space-simulator design criteria. Others cover the techniques used in studying carbon-arc lamps and in applying them as solar simulation.

  20. An Example-Based Brain MRI Simulation Framework.

    PubMed

    He, Qing; Roy, Snehashis; Jog, Amod; Pham, Dzung L

    2015-02-21

    The simulation of magnetic resonance (MR) images plays an important role in the validation of image analysis algorithms such as image segmentation, due to lack of sufficient ground truth in real MR images. Previous work on MRI simulation has focused on explicitly modeling the MR image formation process. However, because of the overwhelming complexity of MR acquisition these simulations must involve simplifications and approximations that can result in visually unrealistic simulated images. In this work, we describe an example-based simulation framework, which uses an "atlas" consisting of an MR image and its anatomical models derived from the hard segmentation. The relationships between the MR image intensities and its anatomical models are learned using a patch-based regression that implicitly models the physics of the MR image formation. Given the anatomical models of a new brain, a new MR image can be simulated using the learned regression. This approach has been extended to also simulate intensity inhomogeneity artifacts based on the statistical model of training data. Results show that the example based MRI simulation method is capable of simulating different image contrasts and is robust to different choices of atlas. The simulated images resemble real MR images more than simulations produced by a physics-based model.
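
    A minimal sketch of the example-based idea, substituting a brute-force nearest-neighbour patch lookup for the paper's learned patch regression (function names, 2-D arrays and the 3x3 patch size are illustrative assumptions):

```python
import numpy as np

def train_patch_regression(atlas_model, atlas_mri, patch=3):
    """Collect (anatomical-model patch -> center MR intensity) training pairs
    from the interior of a 2-D atlas."""
    r = patch // 2
    X, y = [], []
    for i in range(r, atlas_model.shape[0] - r):
        for j in range(r, atlas_model.shape[1] - r):
            X.append(atlas_model[i - r:i + r + 1, j - r:j + r + 1].ravel())
            y.append(atlas_mri[i, j])
    return np.array(X), np.array(y)

def simulate(new_model, X, y, patch=3):
    """Predict MR intensities for a new anatomy: for each patch of the new
    anatomical model, copy the intensity of the closest training patch."""
    r = patch // 2
    out = np.zeros(new_model.shape, dtype=float)
    for i in range(r, new_model.shape[0] - r):
        for j in range(r, new_model.shape[1] - r):
            p = new_model[i - r:i + r + 1, j - r:j + r + 1].ravel()
            k = np.argmin(((X - p) ** 2).sum(axis=1))
            out[i, j] = y[k]
    return out
```

    As a sanity check, simulating from the atlas's own anatomical model reproduces the atlas intensities in the interior; a real implementation would use 3-D patches and a trained regressor rather than exhaustive search.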

  1. Using Reconstructed POD Modes as Turbulent Inflow for LES Wind Turbine Simulations

    NASA Astrophysics Data System (ADS)

    Nielson, Jordan; Bhaganagar, Kiran; Juttijudata, Vejapong; Sirisup, Sirod

    2016-11-01

    Currently, in order to get realistic atmospheric effects of turbulence, wind turbine LES simulations require computationally expensive precursor simulations. At times, the precursor simulation is more computationally expensive than the wind turbine simulation itself. The precursor simulations are important because they capture turbulence in the atmosphere, which in turn impacts the power production estimation. On the other hand, POD analysis has been shown to be capable of capturing turbulent structures. The current study was performed to determine the plausibility of using lower-dimension models from POD analysis of LES simulations as turbulent inflow to wind turbine LES simulations. The study will aid the wind energy community by lowering the computational cost of full-scale wind turbine LES simulations while maintaining a high level of turbulent information, and by making it possible to quickly apply the turbulent inflow to multi-turbine wind farms. This is done by comparing a pure LES precursor wind turbine simulation with simulations that use reduced POD mode inflow conditions. The study shows the feasibility of using lower-dimension models as turbulent inflow of LES wind turbine simulations. Overall, the power production estimation and velocity field of the wind turbine wake are well captured, with small errors.
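
    The POD machinery involved can be illustrated with the snapshot method (a generic sketch, not the study's code; names and shapes are assumptions): stack velocity snapshots as columns, take an SVD, and keep the leading r modes as the reduced inflow model:

```python
import numpy as np

def pod_reconstruct(snapshots, r):
    """Rank-r POD reconstruction of a snapshot matrix.
    snapshots: (n_points, n_times), one flow-field snapshot per column."""
    mean = snapshots.mean(axis=1, keepdims=True)
    X = snapshots - mean                                 # fluctuations about the mean
    U, s, Vt = np.linalg.svd(X, full_matrices=False)     # columns of U are POD modes
    X_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]          # keep the r most energetic modes
    energy = (s[:r] ** 2).sum() / (s ** 2).sum()         # fraction of fluctuation energy kept
    return mean + X_r, energy
```

    The columns of the reconstruction can then be fed in time order as inflow planes; r trades inflow fidelity against the cost of storing and replaying the precursor data.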

  2. A systematic review of validated sinus surgery simulators.

    PubMed

    Stew, B; Kao, S S-T; Dharmawardana, N; Ooi, E H

    2018-06-01

    Simulation provides a safe and effective opportunity to develop surgical skills. A variety of endoscopic sinus surgery (ESS) simulators has been described in the literature. Validation of these simulators allows for their effective utilisation in training. To conduct a systematic review of the published literature to analyse the evidence for validated ESS simulation. Pubmed, Embase, Cochrane and Cinahl were searched from inception of the databases to 11 January 2017. Twelve thousand five hundred and sixteen articles were retrieved, of which 10 112 were screened following the removal of duplicates. Thirty-eight full-text articles were reviewed after meeting the search criteria. Evidence of face, content, construct, discriminant and predictive validity was extracted. Twenty articles were included in the analysis, describing 12 ESS simulators. Eleven of these simulators had undergone validation: 3 virtual reality, 7 physical bench models and 1 cadaveric simulator. Seven of the simulators were shown to have face validity, 7 had construct validity and 1 had predictive validity. None of the simulators demonstrated discriminant validity. This systematic review demonstrates that a number of ESS simulators have been comprehensively validated. Many of the validation processes, however, lack standardisation in outcome reporting, thus limiting meta-analytic comparison between simulators. © 2017 John Wiley & Sons Ltd.

  3. NASA One-Dimensional Combustor Simulation--User Manual for S1D_ML

    NASA Technical Reports Server (NTRS)

    Stueber, Thomas J.; Paxson, Daniel E.

    2014-01-01

    The work presented in this paper is to promote research leading to a closed-loop control system to actively suppress thermo-acoustic instabilities. To serve as a model for such a closed-loop control system, a one-dimensional combustor simulation composed using MATLAB software tools has been written. This MATLAB based process is similar to a precursor one-dimensional combustor simulation that was formatted as FORTRAN 77 source code. The previous simulation process requires modification to the FORTRAN 77 source code, compiling, and linking when creating a new combustor simulation executable file. The MATLAB based simulation does not require making changes to the source code, recompiling, or linking. Furthermore, the MATLAB based simulation can be run from script files within the MATLAB environment or with a compiled copy of the executable file running in the Command Prompt window without requiring a licensed copy of MATLAB. This report presents a general simulation overview. Details regarding how to setup and initiate a simulation are also presented. Finally, the post-processing section describes the two types of files created while running the simulation and it also includes simulation results for a default simulation included with the source code.

  4. Challenges of NDE simulation tool validation, optimization, and utilization for composites

    NASA Astrophysics Data System (ADS)

    Leckey, Cara A. C.; Seebo, Jeffrey P.; Juarez, Peter

    2016-02-01

    Rapid, realistic nondestructive evaluation (NDE) simulation tools can aid in inspection optimization and prediction of inspectability for advanced aerospace materials and designs. NDE simulation tools may someday aid in the design and certification of aerospace components; potentially shortening the time from material development to implementation by industry and government. Furthermore, ultrasound modeling and simulation are expected to play a significant future role in validating the capabilities and limitations of guided wave based structural health monitoring (SHM) systems. The current state-of-the-art in ultrasonic NDE/SHM simulation is still far from the goal of rapidly simulating damage detection techniques for large scale, complex geometry composite components/vehicles containing realistic damage types. Ongoing work at NASA Langley Research Center is focused on advanced ultrasonic simulation tool development. This paper discusses challenges of simulation tool validation, optimization, and utilization for composites. Ongoing simulation tool development work is described along with examples of simulation validation and optimization challenges that are more broadly applicable to all NDE simulation tools. The paper will also discuss examples of simulation tool utilization at NASA to develop new damage characterization methods for composites, and associated challenges in experimentally validating those methods.

  5. What is going on in augmented reality simulation in laparoscopic surgery?

    PubMed

    Botden, Sanne M B I; Jakimowicz, Jack J

    2009-08-01

    To prevent unnecessary errors and adverse results of laparoscopic surgery, proper training is of paramount importance. A safe way to train surgeons for laparoscopic skills is simulation. For this purpose traditional box trainers are often used, however they lack objective assessment of performance. Virtual reality laparoscopic simulators assess performance, but lack realistic haptic feedback. Augmented reality (AR) combines a virtual reality (VR) setting with real physical materials, instruments, and feedback. This article presents the current developments in augmented reality laparoscopic simulation. Pubmed searches were performed to identify articles regarding surgical simulation and augmented reality. Identified companies manufacturing an AR laparoscopic simulator received the same questionnaire referring to the features of the simulator. Seven simulators that fitted the definition of augmented reality were identified during the literature search. Five of the approached manufacturers returned a completed questionnaire, of which one simulator appeared to be VR and was therefore not applicable for this review. Several augmented reality simulators have been developed over the past few years and they are improving rapidly. We recommend the development of AR laparoscopic simulators for component tasks of procedural training. AR simulators should be implemented in current laparoscopic training curricula, in particular for laparoscopic suturing training.

  6. Aviation Simulators for the Desktop: Panel and Demonstrations

    NASA Technical Reports Server (NTRS)

    Pisanich, Greg; Rosekind, Mark R. (Technical Monitor)

    1997-01-01

    Panel members are: Christine M. Mitchell (Georgia Tech), Michael T. Palmer (NASA Langley), Greg Pisanich (NASA Ames), and Amy R. Pritchett (MIT). The panel members are affiliated with aviation human factors groups from NASA Ames, NASA Langley, MIT's Department of Aeronautics and Astronautics, and Georgia Tech's Center for Human-Machine Systems Research. Panelists will describe the simulator(s) used in their respective institutions, including a description of the FMS aircraft models, software, hardware, and displays. Panelists will summarize previous, on-going, and planned empirical studies conducted with the simulators. Greg Pisanich will describe two NASA Ames simulation systems: the Stone Soup Simulator (SSS) and the Airspace Operations Human Factors Simulation Laboratory. The Stone Soup Simulator is a desktop-based research flight simulator that includes mode control, flight management, and datalink functionality. It has been developed as a non-proprietary simulator that can be easily distributed to academic and industry researchers who are collaborating on NASA research projects. It will be used and extended by research groups represented by at least two panelists (Mitchell and Palmer). The Airspace Operations Simulator supports the study of air traffic control in conjunction with the flight deck. This simulator will be used to provide an environment in which many AATT and free flight concepts can be demonstrated and evaluated. Mike Palmer will describe two NASA Langley efforts: the Langley simulator and MD-11 extensions to the NASA Ames S3 simulator. The first simulator is publicly available and combines a B-737 model with a high-fidelity flight management system. The second simulator enhances the S3 simulator with MD-11 electronic flight displays, together with modifications to the flight and FMS models to emulate MD-11 dynamics and operations.
Chris Mitchell will describe GT-EFIRT (Georgia Tech-Electronic Flight Instrument Research Tool) and B-757 enhancements to the NASA Ames S3. GT-EFIRT is a medium fidelity simulator used to conduct preliminary studies of the CATS (crew activity tracking system). Like the Langley efforts with S3, the Georgia Tech enhancements will allow it to emulate the dynamics and operations of a widely used glass cockpit. Amy Pritchett will describe the MIT simulator(s) that have been used in a range of research investigating cockpit displays, warning devices, and flight deck-ATC interaction.

  7. Large eddy simulation in a rotary blood pump: Viscous shear stress computation and comparison with unsteady Reynolds-averaged Navier-Stokes simulation.

    PubMed

    Torner, Benjamin; Konnigk, Lucas; Hallier, Sebastian; Kumar, Jitendra; Witte, Matthias; Wurm, Frank-Hendrik

    2018-06-01

    Numerical flow analysis (computational fluid dynamics) in combination with the prediction of blood damage is an important procedure to investigate the hemocompatibility of a blood pump, since blood trauma due to shear stresses remains a problem in these devices. Today, the numerical damage prediction is conducted using unsteady Reynolds-averaged Navier-Stokes simulations. Investigations with large eddy simulations are rarely being performed for blood pumps. Hence, the aim of the study is to examine the viscous shear stresses of a large eddy simulation in a blood pump and compare the results with an unsteady Reynolds-averaged Navier-Stokes simulation. The simulations were carried out at two operation points of a blood pump. The flow was simulated on a 100M element mesh for the large eddy simulation and a 20M element mesh for the unsteady Reynolds-averaged Navier-Stokes simulation. As a first step, the large eddy simulation was verified by analyzing internal dissipative losses within the pump. Then, the pump characteristics and mean and turbulent viscous shear stresses were compared between the two simulation methods. The verification showed that the large eddy simulation is able to reproduce the significant portion of dissipative losses, which is a global indication that the equivalent viscous shear stresses are adequately resolved. The comparison with the unsteady Reynolds-averaged Navier-Stokes simulation revealed that the hydraulic parameters were in agreement, but differences for the shear stresses were found. The results show the potential of the large eddy simulation as a high-quality comparative case to check the suitability of a chosen Reynolds-averaged Navier-Stokes setup and turbulence model. Furthermore, the results suggest that large eddy simulations are superior to unsteady Reynolds-averaged Navier-Stokes simulations when instantaneous stresses are applied for the blood damage prediction.

  8. Visual Data-Analytics of Large-Scale Parallel Discrete-Event Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, Caitlin; Carothers, Christopher D.; Mubarak, Misbah

    Parallel discrete-event simulation (PDES) is an important tool in the codesign of extreme-scale systems because PDES provides a cost-effective way to evaluate designs of high-performance computing systems. Optimistic synchronization algorithms for PDES, such as Time Warp, allow events to be processed without global synchronization among the processing elements. A rollback mechanism is provided when events are processed out of timestamp order. Although optimistic synchronization protocols enable the scalability of large-scale PDES, the performance of the simulations must be tuned to reduce the number of rollbacks and provide an improved simulation runtime. To enable efficient large-scale optimistic simulations, one has to gain insight into the factors that affect the rollback behavior and simulation performance. We developed a tool for ROSS model developers that gives them detailed metrics on the performance of their large-scale optimistic simulations at varying levels of simulation granularity. Model developers can use this information for parameter tuning of optimistic simulations in order to achieve better runtime and fewer rollbacks. In this work, we instrument the ROSS optimistic PDES framework to gather detailed statistics about the simulation engine. We have also developed an interactive visualization interface that uses the data collected by the ROSS instrumentation to understand the underlying behavior of the simulation engine. The interface connects real time to virtual time in the simulation and provides the ability to view simulation data at different granularities. We demonstrate the usefulness of our framework by performing a visual analysis of the dragonfly network topology model provided by the CODES simulation framework built on top of ROSS. The instrumentation needs to minimize overhead in order to accurately collect data about the simulation performance. To ensure that the instrumentation does not introduce unnecessary overhead, we perform a scaling study that compares instrumented ROSS simulations with their noninstrumented counterparts in order to determine the amount of perturbation when running at different simulation scales.

  9. Benefits of computer screen-based simulation in learning cardiac arrest procedures.

    PubMed

    Bonnetain, Elodie; Boucheix, Jean-Michel; Hamet, Maël; Freysz, Marc

    2010-07-01

    What is the best way to train medical students early so that they acquire basic skills in cardiopulmonary resuscitation as effectively as possible? Studies have shown the benefits of high-fidelity patient simulators, but have also demonstrated their limits. New computer screen-based multimedia simulators have fewer constraints than high-fidelity patient simulators. In this area, as yet, there has been no research on the effectiveness of transfer of learning from a computer screen-based simulator to more realistic situations such as those encountered with high-fidelity patient simulators. We tested the benefits of learning cardiac arrest procedures using a multimedia computer screen-based simulator in 28 Year 2 medical students. Just before the end of the traditional resuscitation course, we compared two groups. An experiment group (EG) was first asked to learn to perform the appropriate procedures in a cardiac arrest scenario (CA1) in the computer screen-based learning environment and was then tested on a high-fidelity patient simulator in another cardiac arrest simulation (CA2). While the EG was learning to perform CA1 procedures in the computer screen-based learning environment, a control group (CG) actively continued to learn cardiac arrest procedures using practical exercises in a traditional class environment. Both groups were given the same amount of practice, exercises and trials. The CG was then also tested on the high-fidelity patient simulator for CA2, after which it was asked to perform CA1 using the computer screen-based simulator. Performances with both simulators were scored on a precise 23-point scale. On the test on a high-fidelity patient simulator, the EG trained with a multimedia computer screen-based simulator performed significantly better than the CG trained with traditional exercises and practice (16.21 versus 11.13 of 23 possible points, respectively; p<0.001). 
Computer screen-based simulation appears to be effective in preparing learners to use high-fidelity patient simulators, which present simulations that are closer to real-life situations.

  10. Simulation-Based Training Platforms for Arthroscopy: A Randomized Comparison of Virtual Reality Learning to Benchtop Learning.

    PubMed

    Middleton, Robert M; Alvand, Abtin; Garfjeld Roberts, Patrick; Hargrove, Caroline; Kirby, Georgina; Rees, Jonathan L

    2017-05-01

    To determine whether a virtual reality (VR) arthroscopy simulator or benchtop (BT) arthroscopy simulator showed superiority as a training tool. Arthroscopic novices were randomized to a training program on a BT or a VR knee arthroscopy simulator. The VR simulator provided user performance feedback. Individuals performed a diagnostic arthroscopy on both simulators before and after the training program. Performance was assessed using wireless objective motion analysis and a global rating scale. The groups (8 in the VR group, 9 in the BT group) were well matched at baseline across all parameters (P > .05). Training on each simulator resulted in significant performance improvements across all parameters (P < .05). BT training conferred a significant improvement in all parameters when trainees were reassessed on the VR simulator (P < .05). In contrast, VR training did not confer improvement in performance when trainees were reassessed on the BT simulator (P > .05). BT-trained subjects outperformed VR-trained subjects in all parameters during final assessments on the BT simulator (P < .05). There was no difference in objective performance between VR-trained and BT-trained subjects on final VR simulator wireless objective motion analysis assessment (P > .05). Both simulators delivered improvements in arthroscopic skills. BT training led to skills that readily transferred to the VR simulator. Skills acquired after VR training did not transfer as readily to the BT simulator. Despite trainees receiving automated metric feedback from the VR simulator, the results suggest a greater gain in psychomotor skills for BT training. Further work is required to determine if this finding persists in the operating room. This study suggests that there are differences in skills acquired on different simulators and skills learnt on some simulators may be more transferable. Further work in identifying user feedback metrics that enhance learning is also required. 
Copyright © 2016 Arthroscopy Association of North America. All rights reserved.

  11. The effect of simulation courseware on critical thinking in undergraduate nursing students: multi-site pre-post study.

    PubMed

    Shin, Hyunsook; Ma, Hyunhee; Park, Jiyoung; Ji, Eun Sun; Kim, Dong Hee

    2015-04-01

Simulation has been considered an opportunity for students to enhance their critical thinking (CT), but previous studies were limited because they did not provide in-depth information on the working dynamics of simulation or on the effects of the number of simulation exposures on CT. This study examined the effect of an integrated pediatric nursing simulation used in a nursing practicum on students' CT abilities and identified the effects of differing numbers of simulation exposures on CT in a multi-site environment. The study used a multi-site, pre-test, post-test design. A total of 237 nursing students at three universities enrolled in a pediatric practicum participated in this study from February to December 2013. All three schools used the same simulation courseware, including the same simulation scenarios, evaluation tools, and simulation equipment. The courseware incorporated high-fidelity simulators and standardized patients. Students at school A completed one simulation session, whereas students at schools B and C completed two and three simulation sessions, respectively. Yoon's Critical Thinking Disposition tool (2008) was used to measure students' CT abilities. The gains in students' CT scores varied according to their numbers of exposures to the simulation courseware. With a single exposure, there were no statistically significant gains in CT, whereas three exposures to the courseware produced significant gains in CT. In seven subcategories of critical thinking, three exposures to the simulation courseware produced CT gains in the prudence and intellectual eagerness subcategories, and the overall simulation experience produced CT gains in the prudence, systematicity, healthy skepticism, and intellectual eagerness subcategories. Simulation courseware may produce positive learning outcomes for prudence in nursing education.
In addition, the findings from the multi-site comparative study may contribute to greater understanding of how patient simulation experiences impact students' CT abilities. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. MCC level C formulation requirements. Shuttle TAEM guidance and flight control, STS-1 baseline

    NASA Technical Reports Server (NTRS)

    Carman, G. L.; Montez, M. N.

    1980-01-01

    The TAEM guidance and body rotational dynamics models required for the MCC simulation of the TAEM mission phase are defined. This simulation begins at the end of the entry phase and terminates at TAEM autoland interface. The logic presented is the required configuration for the first shuttle orbital flight (STS-1). The TAEM guidance is simulated in detail. The rotational dynamics simulation is a simplified model that assumes that the commanded rotational rates can be achieved in the integration interval. Thus, the rotational dynamics simulation is essentially a simulation of the autopilot commanded rates and integration of these rates to determine orbiter attitude. The rotational dynamics simulation also includes a simulation of the speedbrake deflection. The body flap and elevon deflections are computed in the orbiter aerodynamic simulation.
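
The "simplified model" above can be made concrete: attitude propagation reduces to integrating the autopilot's commanded rates. A minimal sketch under that assumption (the names, and the direct treatment of commanded rates as attitude rates, are illustrative, not the MCC Level C formulation):

```python
import numpy as np

def integrate_attitude(att0, commanded_rates, dt):
    """Propagate attitude by integrating commanded rotational rates.

    Mirrors the simplified model above: commanded rates are assumed to
    be achieved within each integration interval, and the rate vector
    is treated directly as the attitude rate.
    att0: initial [roll, pitch, yaw] in rad; commanded_rates: sequence
    of [p, q, r] commands in rad/s; dt: integration interval in s.
    """
    att = np.asarray(att0, dtype=float)
    history = [att.copy()]
    for rates in commanded_rates:
        att = att + np.asarray(rates, dtype=float) * dt
        history.append(att.copy())
    return np.array(history)
```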

  13. Simulator for heterogeneous dataflow architectures

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.

    1993-01-01

A new simulator is developed to simulate the execution of an algorithm graph in accordance with the Algorithm to Architecture Mapping Model (ATAMM) rules. ATAMM is a Petri Net model which describes the periodic execution of large-grained, data-independent dataflow graphs and which provides predictable steady-state time-optimized performance. This simulator extends the ATAMM simulation capability from a homogeneous set of resources, or functional units, to a more general heterogeneous architecture. Simulation test cases show that the simulator accurately executes the ATAMM rules for both a heterogeneous architecture and a homogeneous architecture, which is the special case with only one processor type. The simulator forms one tool in an ATAMM Integrated Environment which contains other tools for graph entry, graph modification for performance optimization, and playback of simulations for analysis.
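
The large-grained dataflow execution that ATAMM formalizes can be sketched with a toy firing rule: a node fires when every input edge holds a token, consuming those tokens and producing one on each output edge. This is a simplification, not the full ATAMM Petri Net semantics, and all names are invented:

```python
from collections import defaultdict

class DataflowGraph:
    """Minimal large-grained dataflow firing sketch."""
    def __init__(self):
        self.inputs = defaultdict(list)   # node -> incoming edges
        self.outputs = defaultdict(list)  # node -> outgoing edges
        self.tokens = defaultdict(int)    # edge -> token count

    def add_edge(self, src, dst):
        edge = (src, dst)
        self.outputs[src].append(edge)
        self.inputs[dst].append(edge)

    def fire_ready(self):
        """Fire every node whose input edges all hold a token.

        Ready nodes are snapshotted first, so one call advances the
        graph by exactly one wave of firings.
        """
        ready = [n for n in self.inputs
                 if self.inputs[n]
                 and all(self.tokens[e] > 0 for e in self.inputs[n])]
        for node in ready:
            for e in self.inputs[node]:
                self.tokens[e] -= 1
            for e in self.outputs[node]:
                self.tokens[e] += 1
        return ready
```

Source nodes with no input edges are fed by placing tokens on their outgoing edges externally; repeated calls to `fire_ready` advance the graph one wave at a time.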

  14. An agent-based stochastic Occupancy Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yixing; Hong, Tianzhen; Luo, Xuan

Occupancy has significant impacts on building performance. However, in current building performance simulation programs, occupancy inputs are static and lack diversity, contributing to discrepancies between the simulated and actual building performance. This work presents an Occupancy Simulator that simulates the stochastic behavior of occupant presence and movement in buildings, capturing the spatial and temporal occupancy diversity. Each occupant and each space in the building are explicitly simulated as an agent with their profiles of stochastic behaviors. The occupancy behaviors are represented with three types of models: (1) the status transition events (e.g., first arrival in office) simulated with a probability distribution model, (2) the random moving events (e.g., from one office to another) simulated with a homogeneous Markov chain model, and (3) the meeting events simulated with a new stochastic model. A hierarchical data model was developed for the Occupancy Simulator, which reduces the amount of data input by using the concepts of occupant types and space types. Finally, a case study of a small office building is presented to demonstrate the use of the Simulator to generate detailed annual sub-hourly occupant schedules for individual spaces and the whole building. The Simulator is a web application freely available to the public and capable of performing a detailed stochastic simulation of occupant presence and movement in buildings. Future work includes enhancements in the meeting event model, consideration of personal absent days, verification and validation of the simulated occupancy results, and expansion for use with residential buildings.

  15. An agent-based stochastic Occupancy Simulator

    DOE PAGES

    Chen, Yixing; Hong, Tianzhen; Luo, Xuan

    2017-06-01

Occupancy has significant impacts on building performance. However, in current building performance simulation programs, occupancy inputs are static and lack diversity, contributing to discrepancies between the simulated and actual building performance. This work presents an Occupancy Simulator that simulates the stochastic behavior of occupant presence and movement in buildings, capturing the spatial and temporal occupancy diversity. Each occupant and each space in the building are explicitly simulated as an agent with their profiles of stochastic behaviors. The occupancy behaviors are represented with three types of models: (1) the status transition events (e.g., first arrival in office) simulated with a probability distribution model, (2) the random moving events (e.g., from one office to another) simulated with a homogeneous Markov chain model, and (3) the meeting events simulated with a new stochastic model. A hierarchical data model was developed for the Occupancy Simulator, which reduces the amount of data input by using the concepts of occupant types and space types. Finally, a case study of a small office building is presented to demonstrate the use of the Simulator to generate detailed annual sub-hourly occupant schedules for individual spaces and the whole building. The Simulator is a web application freely available to the public and capable of performing a detailed stochastic simulation of occupant presence and movement in buildings. Future work includes enhancements in the meeting event model, consideration of personal absent days, verification and validation of the simulated occupancy results, and expansion for use with residential buildings.
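
The homogeneous Markov chain used for the random moving events can be sketched as follows; the spaces and transition probabilities below are invented for illustration and are not taken from the Occupancy Simulator:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical spaces and a homogeneous (time-invariant) transition
# matrix: P[i, j] is the probability of moving from space i to space j
# in one time step. Each row sums to 1.
spaces = ["own office", "other office", "meeting room", "corridor"]
P = np.array([
    [0.85, 0.05, 0.05, 0.05],
    [0.30, 0.55, 0.05, 0.10],
    [0.40, 0.05, 0.50, 0.05],
    [0.50, 0.20, 0.10, 0.20],
])

def simulate_movement(start, n_steps):
    """Simulate one occupant's movement as a homogeneous Markov chain."""
    state = spaces.index(start)
    trajectory = [spaces[state]]
    for _ in range(n_steps):
        state = rng.choice(len(spaces), p=P[state])
        trajectory.append(spaces[state])
    return trajectory
```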

  16. Simulation of transmission electron microscope images of biological specimens.

    PubMed

    Rullgård, H; Ofverstedt, L-G; Masich, S; Daneholt, B; Oktem, O

    2011-09-01

We present a new approach to simulate electron cryo-microscope images of biological specimens. The simulation framework consists of two parts: the first is a phantom generator that generates a model of a specimen suitable for simulation, and the second is a transmission electron microscope simulator. The phantom generator calculates the scattering potential of an atomic structure in aqueous buffer and allows the user to define the distribution of molecules in the simulated image. The simulator includes a well-defined electron-specimen interaction model based on the scalar Schrödinger equation, the contrast transfer function for the optics, and a noise model that includes shot noise as well as detector noise, including detector blurring. To enable optimal performance, the simulation framework also includes a calibration protocol for setting simulation parameters. To test the accuracy of the new framework, we compare simulated images to experimental images recorded of the Tobacco Mosaic Virus (TMV) in vitreous ice. The simulated and experimental images show good agreement with respect to contrast variations depending on dose and defocus. Furthermore, random fluctuations present in experimental and simulated images exhibit similar statistical properties. The simulator has been designed to provide a platform for development of new instrumentation and image processing procedures in single particle electron microscopy, two-dimensional crystallography and electron tomography, with well documented protocols and an open source code into which new improvements and extensions are easily incorporated. © 2011 The Authors Journal of Microscopy © 2011 Royal Microscopical Society.
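
The contrast transfer function mentioned above is central to the optics model. A minimal sketch of the standard weak-phase-object CTF with an amplitude-contrast term (sign conventions and default parameters vary between packages; this is not necessarily the authors' exact formulation):

```python
import numpy as np

def ctf(k, defocus, cs, wavelength, amp_contrast=0.07):
    """Contrast transfer function vs spatial frequency k (1/m).

    Standard weak-phase-object form with aberration phase
    chi = pi*lambda*dz*k^2 - (pi/2)*Cs*lambda^3*k^4.
    defocus: underfocus positive (m); cs: spherical aberration (m);
    wavelength: electron wavelength (m); amp_contrast: amplitude
    contrast fraction (a typical cryo-EM value, chosen here as an
    illustrative default).
    """
    chi = (np.pi * wavelength * defocus * k**2
           - 0.5 * np.pi * cs * wavelength**3 * k**4)
    return -(np.sqrt(1.0 - amp_contrast**2) * np.sin(chi)
             + amp_contrast * np.cos(chi))
```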

  17. Quantitative Technique for Comparing Simulant Materials through Figures of Merit

    NASA Technical Reports Server (NTRS)

    Rickman, Doug; Hoelzer, Hans; Fourroux, Kathy; Owens, Charles; McLemore, Carole; Fikes, John

    2007-01-01

The 1989 workshop report Workshop on Production and Uses of Simulated Lunar Materials and the NASA Technical Publication Lunar Regolith Simulant Materials: Recommendations for Standardization, Production, and Usage both identified and reinforced a need for a set of standards and requirements for the production and usage of Lunar simulant materials. As NASA prepares to return to the Moon and set out for Mars, a set of early requirements has been developed for simulant materials, and the initial methods to produce and measure those simulants have been defined. Addressed in the requirements document are: 1) a method for evaluating the quality of any simulant of a regolith, 2) the minimum characteristics for simulants of Lunar regolith, and 3) a method to produce simulants needed for NASA's Exploration mission. As an extension of the requirements document, a method to evaluate new and current simulants has been rigorously defined through the mathematics of Figures of Merit (FoM). Requirements and techniques have been developed that allow the simulant provider to compare their product to a standard reference material through Figures of Merit. Standard reference material may be physical material, such as the Apollo core samples, or material properties predicted for any landing site. The simulant provider is not restricted to providing a single "high fidelity" simulant, which may be costly to produce; the provider can now develop "lower fidelity" simulants for engineering applications such as drilling and mobility.

  18. iCrowd: agent-based behavior modeling and crowd simulator

    NASA Astrophysics Data System (ADS)

    Kountouriotis, Vassilios I.; Paterakis, Manolis; Thomopoulos, Stelios C. A.

    2016-05-01

Initially designed in the context of the TASS (Total Airport Security System) FP-7 project, the Crowd Simulation platform developed by the Integrated Systems Lab of the Institute of Informatics and Telecommunications at N.C.S.R. Demokritos has evolved into a complete domain-independent agent-based behavior simulator with an emphasis on crowd behavior and building evacuation simulation. Under continuous development, it reflects an effort to implement a modern, multithreaded, data-oriented simulation engine employing state-of-the-art programming technologies and paradigms. It is based on an extensible architecture that separates core services from the individual layers of agent behavior, offering a concrete simulation kernel designed for high performance and stability. Its primary goal is to deliver an abstract platform to facilitate implementation of several Agent-Based Simulation solutions with applicability in several domains of knowledge, such as: (i) Crowd behavior simulation during indoor/outdoor evacuation. (ii) Non-Player Character AI for Game-oriented applications and Gamification activities. (iii) Vessel traffic modeling and simulation for Maritime Security and Surveillance applications. (iv) Urban and Highway Traffic and Transportation Simulations. (v) Social Behavior Simulation and Modeling.

  19. Software for Engineering Simulations of a Spacecraft

    NASA Technical Reports Server (NTRS)

    Shireman, Kirk; McSwain, Gene; McCormick, Bernell; Fardelos, Panayiotis

    2005-01-01

    Spacecraft Engineering Simulation II (SES II) is a C-language computer program for simulating diverse aspects of operation of a spacecraft characterized by either three or six degrees of freedom. A functional model in SES can include a trajectory flight plan; a submodel of a flight computer running navigational and flight-control software; and submodels of the environment, the dynamics of the spacecraft, and sensor inputs and outputs. SES II features a modular, object-oriented programming style. SES II supports event-based simulations, which, in turn, create an easily adaptable simulation environment in which many different types of trajectories can be simulated by use of the same software. The simulation output consists largely of flight data. SES II can be used to perform optimization and Monte Carlo dispersion simulations. It can also be used to perform simulations for multiple spacecraft. In addition to its generic simulation capabilities, SES offers special capabilities for space-shuttle simulations: for this purpose, it incorporates submodels of the space-shuttle dynamics and a C-language version of the guidance, navigation, and control components of the space-shuttle flight software.

  20. Pre-simulation orientation for medical trainees: An approach to decrease anxiety and improve confidence and performance.

    PubMed

    Bommer, Cassidy; Sullivan, Sarah; Campbell, Krystle; Ahola, Zachary; Agarwal, Suresh; O'Rourke, Ann; Jung, Hee Soo; Gibson, Angela; Leverson, Glen; Liepert, Amy E

    2018-02-01

    We assessed the effect of basic orientation to the simulation environment on anxiety, confidence, and clinical decision making. Twenty-four graduating medical students participated in a two-week surgery preparatory curriculum, including three simulations. Baseline anxiety was assessed pre-course. Scenarios were completed on day 2 and day 9. Prior to the first simulation, participants were randomly divided into two groups. Only one group received a pre-simulation orientation. Before the second simulation, all students received the same orientation. Learner anxiety was reported immediately preceding and following each simulation. Confidence was assessed post-simulation. Performance was evaluated by surgical faculty. The oriented group experienced decreased anxiety following the first simulation (p = 0.003); the control group did not. Compared to the control group, the oriented group reported less anxiety and greater confidence and received higher performance scores following all three simulations (all p < 0.05). Pre-simulation orientation reduces anxiety while increasing confidence and improving performance. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Holistic Nursing Simulation: A Concept Analysis.

    PubMed

    Cohen, Bonni S; Boni, Rebecca

    2018-03-01

    Simulation as a technology and holistic nursing care as a philosophy are two components within nursing programs that have merged during the process of knowledge and skill acquisition in the care of the patients as whole beings. Simulation provides opportunities to apply knowledge and skill through the use of simulators, standardized patients, and virtual settings. Concerns with simulation have been raised regarding the integration of the nursing process and recognizing the totality of the human being. Though simulation is useful as a technology, the nursing profession places importance on patient care, drawing on knowledge, theories, and expertise to administer patient care. There is a need to promptly and comprehensively define the concept of holistic nursing simulation to provide consistency and a basis for quality application within nursing curricula. This concept analysis uses Walker and Avant's approach to define holistic nursing simulation by defining antecedents, consequences, and empirical referents. The concept of holism and the practice of holistic nursing incorporated into simulation require an analysis of the concept of holistic nursing simulation by developing a language and model to provide direction for educators in design and development of holistic nursing simulation.

  2. A Parallel Sliding Region Algorithm to Make Agent-Based Modeling Possible for a Large-Scale Simulation: Modeling Hepatitis C Epidemics in Canada.

    PubMed

    Wong, William W L; Feng, Zeny Z; Thein, Hla-Hla

    2016-11-01

    Agent-based models (ABMs) are computer simulation models that define interactions among agents and simulate emergent behaviors that arise from the ensemble of local decisions. ABMs have been increasingly used to examine trends in infectious disease epidemiology. However, the main limitation of ABMs is the high computational cost for a large-scale simulation. To improve the computational efficiency for large-scale ABM simulations, we built a parallelizable sliding region algorithm (SRA) for ABM and compared it to a nonparallelizable ABM. We developed a complex agent network and performed two simulations to model hepatitis C epidemics based on the real demographic data from Saskatchewan, Canada. The first simulation used the SRA that processed on each postal code subregion subsequently. The second simulation processed the entire population simultaneously. It was concluded that the parallelizable SRA showed computational time saving with comparable results in a province-wide simulation. Using the same method, SRA can be generalized for performing a country-wide simulation. Thus, this parallel algorithm enables the possibility of using ABM for large-scale simulation with limited computational resources.
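
The key idea, partitioning agents by postal-code subregion so each subregion can be processed independently, can be sketched as below. This is a simplification of the SRA: the placeholder agent dynamics and the use of a thread pool are assumptions for illustration, and cross-boundary interactions, which the sliding mechanism handles in the paper, are ignored here.

```python
from collections import defaultdict
from concurrent.futures import ThreadPoolExecutor

def step_region(agents):
    """Advance one subregion's agents for one time step.

    Placeholder dynamics: each agent ages by one year. Real ABM logic
    (contacts, infection events, etc.) would go here.
    """
    return [{**a, "age": a["age"] + 1} for a in agents]

def parallel_step(population, workers=4):
    """One simulation step, processing each subregion independently."""
    regions = defaultdict(list)
    for agent in population:
        regions[agent["region"]].append(agent)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(step_region, regions.values()))
    return [a for region_agents in results for a in region_agents]
```

For CPU-bound Python agent logic, a process pool (or a compiled inner loop) would be needed to realize actual speedup; the structure of the decomposition is the point here.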

  3. A “Skylight” Simulator for HWIL Simulation of Hyperspectral Remote Sensing

    PubMed Central

    Zhao, Huijie; Cui, Bolun; Li, Xudong; Zhang, Chao; Zhang, Xinyang

    2017-01-01

    Even though digital simulation technology has been widely used in the last two decades, hardware-in-the-loop (HWIL) simulation is still an indispensable method for spectral uncertainty research of ground targets. However, previous facilities mainly focus on the simulation of panchromatic imaging. Therefore, neither the spectral nor the spatial performance is enough for hyperspectral simulation. To improve the accuracy of illumination simulation, a new dome-like skylight simulator is designed and developed to fit the spatial distribution and spectral characteristics of a real skylight for the wavelength from 350 nm to 2500 nm. The simulator’s performance was tested using a spectroradiometer with different accessories. The spatial uniformity is greater than 0.91. The spectral mismatch decreases to 1/243 of the spectral mismatch of the Imagery Simulation Facility (ISF). The spatial distribution of radiance can be adjusted, and the accuracy of the adjustment is greater than 0.895. The ability of the skylight simulator is also demonstrated by comparing radiometric quantities measured in the skylight simulator with those in a real skylight in Beijing. PMID:29211004

  4. Development of a device to simulate tooth mobility.

    PubMed

    Erdelt, Kurt-Jürgen; Lamper, Timea

    2010-10-01

The testing of new materials under simulated oral conditions is essential in medicine. For fracture strength testing, different simulation devices are used in the test set-up. The results of these in vitro tests differ because there is no standardization of tooth mobility in simulation devices. The aim of this study is to develop a simulation device that depicts the tooth mobility curve as accurately as possible and creates reproducible and scalable mobility curves. With the aid of published literature and with the help of dentists, average forms of tooth classes were generated. Based on these tooth data, different abutment tooth shapes and different simulation devices were designed with a CAD system and were generated with a Rapid Prototyping system. Then, for all simulation devices, displacement curves were created with a universal testing machine and compared with the tooth mobility curve. With this new information, an improved, adapted simulation device was constructed. A simulation device was developed that simulates the mobility curve of natural teeth with high accuracy and whose mobility is reproducible and scalable.

  5. Prototype software model for designing intruder detection systems with simulation

    NASA Astrophysics Data System (ADS)

    Smith, Jeffrey S.; Peters, Brett A.; Curry, James C.; Gupta, Dinesh

    1998-08-01

This article explores using discrete-event simulation for the design and control of defence-oriented, fixed-sensor-based detection systems in a facility housing items of significant interest to enemy forces. The key issues discussed include software development, simulation-based optimization within a modeling framework, and the expansion of the framework to create real-time control tools and training simulations. The software discussed in this article is a flexible simulation environment in which the data for the simulation are stored in an external database and the simulation logic is implemented using a commercial simulation package. The simulation assesses the overall security level of a building against various intruder scenarios. A series of simulation runs with different inputs can determine the change in security level with changes in the sensor configuration, building layout, and intruder/guard strategies. In addition, the simulation model developed for the design stage of the project can be modified to produce a control tool for the testing, training, and real-time control of systems with humans and sensor hardware in the loop.

  6. Simulation in surgery: a review.

    PubMed

    Tan, Shaun Shi Yan; Sarker, Sudip K

    2011-05-01

The ability to acquire surgical skills requires consistent practice, and evidence suggests that many of these technical skills can be learnt away from the operating theatre. The aim of this review article is to discuss the importance of surgical simulation today and its various types, exploring the effectiveness of simulation in the clinical setting and its challenges for the future. Surgical simulation offers the opportunity for trainees to practise their surgical skills prior to entering the operating theatre, allowing detailed feedback and objective assessment of their performance. This enables better patient safety and standards of care. Surgical simulators can be divided into organic or inorganic simulators. Organic simulators, consisting of live animal and fresh human cadaver models, are considered to be of high fidelity. Inorganic simulators comprise virtual reality simulators and synthetic bench models. Current evidence suggests that skills acquired through training with simulators transfer positively to the clinical setting and improve operative outcome. The major challenge for the future revolves around understanding the value of this new technology and developing an educational curriculum that can incorporate surgical simulators.

  7. Enhancing the Simulation Speed of Sensor Network Applications by Asynchronization of Interrupt Service Routines

    PubMed Central

    Joe, Hyunwoo; Woo, Duk-Kyun; Kim, Hyungshin

    2013-01-01

    Sensor network simulations require high fidelity and timing accuracy to be used as an implementation and evaluation tool. The cycle-accurate and instruction-level simulator is the known solution for these purposes. However, this type of simulation incurs a high computation cost since it has to model not only the instruction level behavior but also the synchronization between multiple sensors for their causality. This paper presents a novel technique that exploits asynchronous simulations of interrupt service routines (ISR). We can avoid the synchronization overheads when the interrupt service routines are simulated without preemption. If the causality errors occur, we devise a rollback procedure to restore the original synchronized simulation. This concept can be extended to any instruction-level sensor network simulator. Evaluation results show our method can enhance the simulation speed up to 52% in the case of our experiments. For applications with longer interrupt service routines and smaller number of preemptions, the speedup becomes greater. In addition, our simulator is 2 to 11 times faster than the well-known sensor network simulator. PMID:23966200
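
The asynchronization-plus-rollback idea can be sketched at a much coarser grain than the instruction-level simulator described: run ISRs without synchronizing, snapshot state first, and if an event arrives in the node's past, restore snapshots until causality is repaired. All names and the dict-based state are invented for illustration:

```python
import copy

class Node:
    """Sketch of asynchronous ISR execution with rollback.

    The real simulator is cycle-accurate and instruction-level; here a
    node's state is just a dict, and a causality error simply means an
    event arriving with a time stamp earlier than work already done.
    """
    def __init__(self):
        self.clock = 0.0
        self.state = {"count": 0}
        self.checkpoints = []  # (exec_time, prior_clock, prior_state)

    def run_isr_async(self, timestamp):
        """Execute an ISR without synchronizing with other nodes,
        snapshotting state first so it can be rolled back."""
        self.checkpoints.append(
            (timestamp, self.clock, copy.deepcopy(self.state)))
        self.clock = timestamp
        self.state["count"] += 1  # placeholder for the ISR's real work

    def receive_event(self, timestamp):
        """On a causality error, undo every ISR executed after the
        incoming event's time stamp by restoring saved snapshots."""
        while self.checkpoints and self.checkpoints[-1][0] > timestamp:
            _, self.clock, self.state = self.checkpoints.pop()
        self.clock = max(self.clock, timestamp)
```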

  8. Synchronous Parallel Emulation and Discrete Event Simulation System with Self-Contained Simulation Objects and Active Event Objects

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    1998-01-01

The present invention is embodied in a method of performing object-oriented simulation and a system having inter-connected processor nodes operating in parallel to simulate mutual interactions of a set of discrete simulation objects distributed among the nodes as a sequence of discrete events changing state variables of respective simulation objects so as to generate new event-defining messages addressed to respective ones of the nodes. The object-oriented simulation is performed at each one of the nodes by assigning passive self-contained simulation objects to each one of the nodes, responding to messages received at one node by generating corresponding active event objects having user-defined inherent capabilities and individual time stamps and corresponding to respective events affecting one of the passive self-contained simulation objects of the one node, restricting the respective passive self-contained simulation objects to only providing and receiving information from the respective active event objects, requesting information and changing variables within a passive self-contained simulation object by the active event object, and producing corresponding messages specifying events resulting therefrom by the active event objects.
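
The division of labor in the claim, passive self-contained simulation objects that only hold state and active event objects with individual time stamps that query and change it, maps onto a conventional discrete-event loop. A loose single-node sketch with all names invented (this is not the patented parallel synchronization mechanism):

```python
import heapq
import itertools

_tie = itertools.count()  # tie-breaker so equal time stamps never compare Events

class SimObject:
    """Passive, self-contained simulation object: it only holds state;
    active event objects query and change it."""
    def __init__(self, name):
        self.name = name
        self.value = 0

class Event:
    """Active event object with its own time stamp and behavior."""
    def __init__(self, time, target, delta):
        self.time, self.target, self.delta = time, target, delta

    def execute(self, schedule):
        # Change a variable within the passive object, then produce a
        # new event-defining entry addressed to it if work remains.
        self.target.value += self.delta
        if self.target.value < 3:
            nxt = Event(self.time + 1.0, self.target, 1)
            heapq.heappush(schedule, (nxt.time, next(_tie), nxt))

def run(events):
    """Process events in time-stamp order until the schedule empties."""
    schedule = [(e.time, next(_tie), e) for e in events]
    heapq.heapify(schedule)
    while schedule:
        _, _, event = heapq.heappop(schedule)
        event.execute(schedule)
```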

  9. Implementation of extended Lagrangian dynamics in GROMACS for polarizable simulations using the classical Drude oscillator model.

    PubMed

    Lemkul, Justin A; Roux, Benoît; van der Spoel, David; MacKerell, Alexander D

    2015-07-15

    Explicit treatment of electronic polarization in empirical force fields used for molecular dynamics simulations represents an important advancement in simulation methodology. A straightforward means of treating electronic polarization in these simulations is the inclusion of Drude oscillators, which are auxiliary, charge-carrying particles bonded to the cores of atoms in the system. The additional degrees of freedom make these simulations more computationally expensive relative to simulations using traditional fixed-charge (additive) force fields. Thus, efficient tools are needed for conducting these simulations. Here, we present the implementation of highly scalable algorithms in the GROMACS simulation package that allow for the simulation of polarizable systems using extended Lagrangian dynamics with a dual Nosé-Hoover thermostat as well as simulations using a full self-consistent field treatment of polarization. The performance of systems of varying size is evaluated, showing that the present code parallelizes efficiently and is the fastest implementation of the extended Lagrangian methods currently available for simulations using the Drude polarizable force field. © 2015 Wiley Periodicals, Inc.
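
The Drude construction itself is compact: an auxiliary particle of charge q_D on a harmonic spring of constant k_D yields an induced dipole mu = alpha * E with polarizability alpha = q_D**2 / k_D. A toy self-consistent-field placement for a single oscillator in an external field (illustrative units and values, not CHARMM Drude parameters or the GROMACS implementation):

```python
def drude_scf(E_ext, q_d=-1.0, k_d=1000.0, tol=1e-10, max_iter=100):
    """Self-consistently place one Drude particle in an external field.

    The particle sits where the spring force balances the electrostatic
    force: k_d * d = q_d * E. Returns the displacement d and the induced
    dipole mu = q_d * d = alpha * E, with alpha = q_d**2 / k_d.
    A single isolated oscillator converges immediately; with many
    mutually polarizing sites, this loop would iterate to consistency.
    """
    d = 0.0
    for _ in range(max_iter):
        d_new = q_d * E_ext / k_d
        if abs(d_new - d) < tol:
            return d_new, q_d * d_new
        d = d_new
    return d, q_d * d
```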

  10. The role of simulation in neurosurgery.

    PubMed

    Rehder, Roberta; Abd-El-Barr, Muhammad; Hooten, Kristopher; Weinstock, Peter; Madsen, Joseph R; Cohen, Alan R

    2016-01-01

In an era of residency duty-hour restrictions, there has been a recent effort to implement simulation-based training methods in neurosurgery teaching institutions. Several surgical simulators have been developed, ranging from physical models to sophisticated virtual reality systems. To date, there is a paucity of information describing the clinical benefits of existing simulators and the assessment strategies to help implement them into neurosurgical curricula. Here, we present a systematic review of the current models of simulation and discuss the state-of-the-art and future directions for simulation in neurosurgery. Retrospective literature review. Multiple simulators have been developed for neurosurgical training, including those for minimally invasive procedures, vascular, skull base, pediatric, tumor resection, functional neurosurgery, and spine surgery. The pros and cons of existing systems are reviewed. Advances in imaging and computer technology have led to the development of different simulation models to complement traditional surgical training. Sophisticated virtual reality (VR) simulators with haptic feedback and impressive imaging technology have provided novel options for training in neurosurgery. Breakthrough training simulation using 3D printing technology holds promise for future simulation practice, providing high-fidelity patient-specific models to complement residency surgical learning.

  11. Simulation of networks of spiking neurons: A review of tools and strategies

    PubMed Central

    Brette, Romain; Rudolph, Michelle; Carnevale, Ted; Hines, Michael; Beeman, David; Bower, James M.; Diesmann, Markus; Morrison, Abigail; Goodman, Philip H.; Harris, Frederick C.; Zirpe, Milind; Natschläger, Thomas; Pecevski, Dejan; Ermentrout, Bard; Djurfeldt, Mikael; Lansner, Anders; Rochel, Olivier; Vieville, Thierry; Muller, Eilif; Davison, Andrew P.; El Boustani, Sami

    2009-01-01

    We review different aspects of the simulation of spiking neural networks. We start by reviewing the different types of simulation strategies and algorithms that are currently implemented. We next review the precision of those simulation strategies, in particular in cases where plasticity depends on the exact timing of the spikes. We overview different simulators and simulation environments presently available (restricted to those freely available, open source and documented). For each simulation tool, its advantages and pitfalls are reviewed, with an aim to allow the reader to identify which simulator is appropriate for a given task. Finally, we provide a series of benchmark simulations of different types of networks of spiking neurons, including Hodgkin–Huxley type, integrate-and-fire models, interacting with current-based or conductance-based synapses, using clock-driven or event-driven integration strategies. The same set of models are implemented on the different simulators, and the codes are made available. The ultimate goal of this review is to provide a resource to facilitate identifying the appropriate integration strategy and simulation tool to use for a given modeling problem related to spiking neural networks. PMID:17629781
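    The clock-driven strategy reviewed above can be illustrated with a minimal leaky integrate-and-fire network. This sketch is generic and not taken from any of the benchmarked simulators; all parameter values are invented, and the neurons are uncoupled for brevity.

```python
import numpy as np

def simulate_lif(n=100, t_stop=0.5, dt=1e-4, tau=0.02, v_th=1.0, v_reset=0.0,
                 i_ext=1.2, seed=0):
    """Clock-driven simulation of n uncoupled LIF neurons; returns spike list."""
    rng = np.random.default_rng(seed)
    v = rng.uniform(0.0, v_th, n)        # random initial membrane potentials
    spikes = []                          # (time, neuron index) pairs
    for step in range(int(t_stop / dt)):
        # Forward-Euler step of dv/dt = (-v + i_ext) / tau
        v += dt * (-v + i_ext) / tau
        fired = np.where(v >= v_th)[0]   # threshold crossing check every step
        for j in fired:
            spikes.append((step * dt, int(j)))
        v[fired] = v_reset               # reset after spike
    return spikes

spikes = simulate_lif()
```

    In a clock-driven scheme like this, spike times are only resolved to the grid `dt`, which is exactly the precision issue the review discusses for timing-dependent plasticity; event-driven integration instead locates threshold crossings between events.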

  12. Hybrid Eulerian and Lagrangian Simulation of Steep and Breaking Waves and Surface Fluxes in High Winds

    DTIC Science & Technology

    2010-09-30

    simulating violent free-surface flows, and show the importance of wave breaking in energy transport... using Eulerian simulation. IMPACT/APPLICATION: This project aims at developing an advanced simulation tool for multi-fluid free-surface flows that... several Eulerian and Lagrangian methods for free-surface turbulence and wave simulation. The WIND–SNOW is used to simulate...

  13. Integrating simulation training into the nursing curriculum.

    PubMed

    Wilford, Amanda; Doyle, Thomas J

    The use of simulation is gaining momentum in nurse education across the UK. The Nursing and Midwifery Council is currently investigating the use of simulation in pre-registration nursing. This article gives a brief history of simulation, discusses competence issues and why simulation is best placed to teach nurses in today's health service. An innovative approach to implementing simulation into the nursing curriculum is introduced.

  14. Military Training: Observations on Efforts to Prepare Personnel to Survive Helicopter Crashes into Water

    DTIC Science & Technology

    2014-07-14

    Air Force Environmental conditions simulation equipment: equipment that simulates conditions such as waves, wind, rain, thunder, lightning, and combat sounds... items such as wave generators, heavy-duty fans to simulate high winds, strobe lights to simulate lightning, and water spray and injection systems to...

  15. Effects of water-management alternatives on streamflow in the Ipswich River basin, Massachusetts

    USGS Publications Warehouse

    Zarriello, Philip J.

    2001-01-01

    Management alternatives that could help mitigate the effects of water withdrawals on streamflow in the Ipswich River Basin were evaluated by simulation with a calibrated Hydrologic Simulation Program--Fortran (HSPF) model. The effects of management alternatives on streamflow were simulated for a 35-year period (1961-95). Most alternatives examined increased low flows compared to the base simulation of average 1989-93 withdrawals. Only the simulation of no septic-effluent inflow, and the simulation of a 20-percent increase in withdrawals, further lowered flows or caused the river to stop flowing for longer periods of time than the simulation of average 1989-93 withdrawals. Simulations of reduced seasonal withdrawals by 20 percent, and by 50 percent, resulted in a modest increase in low flow in a critical habitat reach (model reach 8 near the Reading town well field); log-Pearson Type III analysis of simulated daily-mean flow indicated that under these reduced withdrawals, model reach 8 would stop flowing for a period of seven consecutive days about every other year, whereas under average 1989-93 withdrawals this reach would stop flowing for a seven-consecutive-day period almost every year. Simulations of no seasonal withdrawals, and simulations that stopped streamflow depletion when flow in model reach 19 was below 22 cubic feet per second, indicated flow would be maintained in model reach 8 at all times. Simulations indicated wastewater-return flows would augment low flow in proportion to the rate of return flow. Simulations of a 1.5 million gallons per day return flow rate indicated model reach 8 would stop flowing for a period of seven consecutive days about once every 5 years; simulated return flow rates of 1.1 million gallons per day indicated that model reach 8 would stop flowing for a period of seven consecutive days about every other year. Simulation of reduced seasonal withdrawals, combined with no septic-effluent return flow, indicated only a slight increase in low flow compared to low flows simulated under average 1989-93 withdrawals. Simulation of reduced seasonal withdrawal, combined with 2.6 million gallons per day wastewater-return flows, provided more flow in model reach 8 than that simulated under no withdrawals.
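    The low-flow statistic used throughout the abstract above, the minimum flow over seven consecutive days, is straightforward to compute from a daily series. This is a generic sketch on synthetic data, not the HSPF model output; the seasonal flow curve and its parameters are invented.

```python
import numpy as np

def seven_day_low_flow(daily_flow):
    """Minimum 7-consecutive-day mean flow of one year of daily values."""
    flow = np.asarray(daily_flow, dtype=float)
    seven_day_means = np.convolve(flow, np.ones(7) / 7.0, mode="valid")
    return seven_day_means.min()

# One synthetic "year" of daily flow (cfs): a smooth seasonal cycle with a
# late-summer minimum near day 240. Illustrative only.
days = np.arange(365)
flow = 30.0 - 25.0 * np.cos(2 * np.pi * (days - 240) / 365)
low = seven_day_low_flow(flow)
```

    Fitting a log-Pearson Type III distribution to the annual series of such minima, as in the study, then yields recurrence intervals such as "stops flowing for seven consecutive days about every other year".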

  16. Simulation System Fidelity Assessment at the Vertical Motion Simulator

    NASA Technical Reports Server (NTRS)

    Beard, Steven D.; Reardon, Scott E.; Tobias, Eric L.; Aponso, Bimal L.

    2013-01-01

    Fidelity is a word that is often used but rarely understood when talking about ground-based simulation. Assessing the cueing fidelity of a ground-based flight simulator requires a comparison to actual flight data, either directly or indirectly. Two experiments were conducted at the Vertical Motion Simulator using the GenHel UH-60A Black Hawk helicopter math model, which was directly compared to flight data. Prior to the experiment, the simulator's motion and visual system frequency responses were measured, the aircraft math model was adjusted to account for the simulator motion system delays, and the motion system gains and washouts were tuned for the individual tasks. The tuned motion system fidelity was then assessed against the modified Sinacori criteria. The first experiment showed handling qualities ratings (HQRs) similar to actual flight for the bob-up and sidestep maneuvers. The second experiment showed equivalent HQRs between flight and simulation for the ADS-33 slalom maneuver for the two pilot participants. The ADS-33 vertical maneuver HQRs were mixed, with one pilot rating the flight and simulation the same while the second pilot rated the simulation worse. In addition to recording HQRs in the second experiment, an experimental Simulation Fidelity Rating (SFR) scale developed by the University of Liverpool was tested for applicability to engineering simulators. A discussion of the SFR scale for use on the Vertical Motion Simulator is included in this paper.

  17. NOTE: Implementation of angular response function modeling in SPECT simulations with GATE

    NASA Astrophysics Data System (ADS)

    Descourt, P.; Carlier, T.; Du, Y.; Song, X.; Buvat, I.; Frey, E. C.; Bardies, M.; Tsui, B. M. W.; Visvikis, D.

    2010-05-01

    Among Monte Carlo simulation codes in medical imaging, the GATE simulation platform is widely used today given its flexibility and accuracy, despite long run times, which in SPECT simulations are mostly spent in tracking photons through the collimators. In this work, a tabulated model of the collimator/detector response was implemented within the GATE framework to significantly reduce the simulation times in SPECT. This implementation uses the angular response function (ARF) model. The performance of the implemented ARF approach has been compared to standard SPECT GATE simulations in terms of the ARF tables' accuracy, overall SPECT system performance and run times. Considering the simulation of the Siemens Symbia T SPECT system using high-energy collimators, differences of less than 1% were measured between the ARF-based and the standard GATE-based simulations, while considering the same noise level in the projections, acceleration factors of up to 180 were obtained when simulating a planar 364 keV source seen with the same SPECT system. The ARF-based and the standard GATE simulation results also agreed very well when considering a four-head SPECT simulation of a realistic Jaszczak phantom filled with iodine-131, with a resulting acceleration factor of 100. In conclusion, the implementation of an ARF-based model of collimator/detector response for SPECT simulations within GATE significantly reduces the simulation run times without compromising accuracy.
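    The core of the ARF approach described above is replacing photon tracking through the collimator with a lookup in a precomputed table of detection probability versus incidence angle. The sketch below illustrates only that lookup-and-accept step; the toy Gaussian "table" is invented and is not one of the paper's validated ARF tables.

```python
import numpy as np

n_bins = 90
angles = np.linspace(0.0, np.pi / 2, n_bins)          # incidence-angle bins
# Toy table: acceptance falls off sharply away from normal incidence,
# mimicking a parallel-hole collimator (illustrative shape only).
arf_table = np.exp(-(np.degrees(angles) / 2.0) ** 2)

rng = np.random.default_rng(1)
thetas = rng.uniform(0.0, np.pi / 2, 100_000)         # simulated photon angles
probs = np.interp(thetas, angles, arf_table)          # tabulated response lookup
detected = rng.random(thetas.size) < probs            # accept/reject per photon
fraction_detected = detected.mean()
```

    Because each photon costs one table lookup instead of a full collimator-tracking history, large acceleration factors of the kind reported (up to 180) become plausible without changing the physics encoded in the table.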

  18. Performance evaluation of an agent-based occupancy simulation model

    DOE PAGES

    Luo, Xuan; Lam, Khee Poh; Chen, Yixing; ...

    2017-01-17

    Occupancy is an important factor driving building performance. Static and homogeneous occupant schedules, commonly used in building performance simulation, contribute to issues such as performance gaps between simulated and measured energy use in buildings. Stochastic occupancy models have been recently developed and applied to better represent spatial and temporal diversity of occupants in buildings. However, there is very limited evaluation of the usability and accuracy of these models. This study used measured occupancy data from a real office building to evaluate the performance of an agent-based occupancy simulation model: the Occupancy Simulator. The occupancy patterns of various occupant types were first derived from the measured occupant schedule data using statistical analysis. Then the performance of the simulation model was evaluated and verified based on (1) whether the distribution of observed occupancy behavior patterns follows the theoretical ones included in the Occupancy Simulator, and (2) whether the simulator can reproduce a variety of occupancy patterns accurately. Results demonstrated the feasibility of applying the Occupancy Simulator to simulate a range of occupancy presence and movement behaviors for regular types of occupants in office buildings, and to generate stochastic occupant schedules at the room and individual occupant levels for building performance simulation. For future work, model validation is recommended, which includes collecting and using detailed interval occupancy data of all spaces in an office building to validate the simulated occupant schedules from the Occupancy Simulator.
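    One simple form of stochastic presence modeling of the kind evaluated above is a two-state Markov chain per occupant (present/absent, with arrival and departure probabilities per time step). This is a generic illustration, not the Occupancy Simulator's actual model; the transition probabilities and step size are invented.

```python
import numpy as np

def simulate_presence(steps=96, p_arrive=0.1, p_leave=0.05, seed=0):
    """One day of 15-minute steps for one occupant: 0 = absent, 1 = present."""
    rng = np.random.default_rng(seed)
    state, trace = 0, []
    for _ in range(steps):
        if state == 0 and rng.random() < p_arrive:
            state = 1                     # arrival event
        elif state == 1 and rng.random() < p_leave:
            state = 0                     # departure event
        trace.append(state)
    return trace

trace = simulate_presence()
occupied_fraction = sum(trace) / len(trace)
```

    Running such a chain per occupant, with type-specific probabilities fitted to measured schedule data, yields the diverse room-level and individual-level stochastic schedules the study verifies against observations.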

  19. Performance evaluation of an agent-based occupancy simulation model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Xuan; Lam, Khee Poh; Chen, Yixing

    Occupancy is an important factor driving building performance. Static and homogeneous occupant schedules, commonly used in building performance simulation, contribute to issues such as performance gaps between simulated and measured energy use in buildings. Stochastic occupancy models have been recently developed and applied to better represent spatial and temporal diversity of occupants in buildings. However, there is very limited evaluation of the usability and accuracy of these models. This study used measured occupancy data from a real office building to evaluate the performance of an agent-based occupancy simulation model: the Occupancy Simulator. The occupancy patterns of various occupant types were first derived from the measured occupant schedule data using statistical analysis. Then the performance of the simulation model was evaluated and verified based on (1) whether the distribution of observed occupancy behavior patterns follows the theoretical ones included in the Occupancy Simulator, and (2) whether the simulator can reproduce a variety of occupancy patterns accurately. Results demonstrated the feasibility of applying the Occupancy Simulator to simulate a range of occupancy presence and movement behaviors for regular types of occupants in office buildings, and to generate stochastic occupant schedules at the room and individual occupant levels for building performance simulation. For future work, model validation is recommended, which includes collecting and using detailed interval occupancy data of all spaces in an office building to validate the simulated occupant schedules from the Occupancy Simulator.

  20. Numerical simulation and characterization of trapping noise in InGaP-GaAs heterojunctions devices at high injection

    NASA Astrophysics Data System (ADS)

    Nallatamby, Jean-Christophe; Abdelhadi, Khaled; Jacquet, Jean-Claude; Prigent, Michel; Floriot, Didier; Delage, Sylvain; Obregon, Juan

    2013-03-01

    Commercially available simulators present considerable advantages in performing accurate DC, AC and transient simulations of semiconductor devices, including many fundamental and parasitic effects which are generally not taken into account in in-house simulators. Nevertheless, while the public-domain TCAD simulators we have tested give accurate results for the simulation of diffusion noise, none of them handles trap-assisted GR noise accurately. In order to overcome this problem, we propose a robust solution to accurately simulate GR noise due to traps. It is based on numerical processing of the output data of one of the simulators available in the public domain, namely SENTAURUS (from Synopsys). We have linked together, through a dedicated Data Access Component (DAC), the deterministic output data available from SENTAURUS and a powerful, customizable post-processing tool developed on the mathematical SCILAB software package. Thus, robust simulations of GR noise in semiconductor devices can be performed by using GR Langevin sources associated with the scalar Green function responses of the device. Our method takes advantage of the accuracy of the deterministic simulations of electronic devices obtained with SENTAURUS. A comparison between 2-D simulations and measurements of low-frequency noise on InGaP-GaAs heterojunctions, at low as well as high injection levels, demonstrates the validity of the proposed simulation tool.

  1. Simulation training: a systematic review of simulation in arthroscopy and proposal of a new competency-based training framework.

    PubMed

    Tay, Charison; Khajuria, Ankur; Gupte, Chinmay

    2014-01-01

    Traditional orthopaedic training has followed an apprenticeship model whereby trainees enhance their skills by operating under guidance. However, the introduction of limitations on training hours and shorter training programmes means that alternative training strategies are required. To perform a literature review on simulation training in arthroscopy and devise a framework that structures different simulation techniques that could be used in arthroscopic training. A systematic search of Medline, Embase, Google Scholar and the Cochrane Databases was performed. Search terms included "virtual reality OR simulator OR simulation" and "arthroscopy OR arthroscopic". Fourteen studies evaluating simulators in knee, shoulder and hip arthroscopy were included. The majority of the studies demonstrated construct and transference validity, but only one showed concurrent validity. More studies are required to assess simulation's potential as a training and assessment tool, skills transference between simulators, and the extent of skills decay from prolonged delays in training. We also devised a "ladder of arthroscopic simulation" that provides a competency-based framework for implementing different simulation strategies. The incorporation of simulation into an orthopaedic curriculum will depend on a coordinated approach between many bodies, but the successful integration of simulators in other areas of surgery supports a possible role for simulation in advancing orthopaedic education. Copyright © 2014 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.

  2. Validation of the SimSET simulation package for modeling the Siemens Biograph mCT PET scanner

    NASA Astrophysics Data System (ADS)

    Poon, Jonathan K.; Dahlbom, Magnus L.; Casey, Michael E.; Qi, Jinyi; Cherry, Simon R.; Badawi, Ramsey D.

    2015-02-01

    Monte Carlo simulation provides a valuable tool in performance assessment and optimization of system design parameters for PET scanners. SimSET is a popular Monte Carlo simulation toolkit that features fast simulation time, as well as variance reduction tools to further enhance computational efficiency. However, SimSET has lacked the ability to simulate block detectors until its most recent release. Our goal is to validate new features of SimSET by developing a simulation model of the Siemens Biograph mCT PET scanner and comparing the results to a simulation model developed in the GATE simulation suite and to experimental results. We used the NEMA NU-2 2007 scatter fraction, count rates, and spatial resolution protocols to validate the SimSET simulation model and its new features. The SimSET model overestimated the experimental results of the count rate tests by 11-23% and the spatial resolution test by 13-28%, which is comparable to previous validation studies of other PET scanners in the literature. The difference between the SimSET and GATE simulation was approximately 4-8% for the count rate test and approximately 3-11% for the spatial resolution test. In terms of computational time, SimSET performed simulations approximately 11 times faster than GATE simulations. The new block detector model in SimSET offers a fast and reasonably accurate simulation toolkit for PET imaging applications.

  3. A Process for Comparing Dynamics of Distributed Space Systems Simulations

    NASA Technical Reports Server (NTRS)

    Cures, Edwin Z.; Jackson, Albert A.; Morris, Jeffery C.

    2009-01-01

    The paper describes a process that was developed for comparing the primary orbital dynamics behavior between space systems distributed simulations. This process is used to characterize and understand the fundamental fidelities and compatibilities of the modeling of orbital dynamics between spacecraft simulations. This is required for high-latency distributed simulations such as NASA's Integrated Mission Simulation and must be understood when reporting results from simulation executions. This paper presents 10 principal comparison tests along with their rationale and examples of the results. The Integrated Mission Simulation (IMSim) (formerly known as the Distributed Space Exploration Simulation (DSES)) is a NASA research and development project focusing on the technologies and processes that are related to the collaborative simulation of complex space systems involved in the exploration of our solar system. Currently, the NASA centers that are actively participating in the IMSim project are the Ames Research Center, the Jet Propulsion Laboratory (JPL), the Johnson Space Center (JSC), the Kennedy Space Center, the Langley Research Center and the Marshall Space Flight Center. In concept, each center participating in IMSim has its own set of simulation models and environment(s). These simulation tools are used to build the various simulation products that are used for scientific investigation, engineering analysis, system design, training, planning, operations and more. Working individually, these production simulations provide important data to various NASA projects.

  4. Flight simulator fidelity assessment in a rotorcraft lateral translation maneuver

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Malsbury, T.; Atencio, A., Jr.

    1992-01-01

    A model-based methodology for assessing flight simulator fidelity in closed-loop fashion is exercised in analyzing a rotorcraft low-altitude maneuver for which flight test and simulation results were available. The addition of a handling qualities sensitivity function to a previously developed model-based assessment criteria allows an analytical comparison of both performance and handling qualities between simulation and flight test. Model predictions regarding the existence of simulator fidelity problems are corroborated by experiment. The modeling approach is used to assess analytically the effects of modifying simulator characteristics on simulator fidelity.

  5. Computer-generated forces in distributed interactive simulation

    NASA Astrophysics Data System (ADS)

    Petty, Mikel D.

    1995-04-01

    Distributed Interactive Simulation (DIS) is an architecture for building large-scale simulation models from a set of independent simulator nodes communicating via a common network protocol. DIS is most often used to create a simulated battlefield for military training. Computer Generated Forces (CGF) systems control large numbers of autonomous battlefield entities in a DIS simulation using computer equipment and software rather than humans in simulators. CGF entities serve as both enemy forces and supplemental friendly forces in a DIS exercise. Research into various aspects of CGF systems is ongoing. Several CGF systems have been implemented.

  6. An electrical circuit model for simulation of indoor radon concentration.

    PubMed

    Musavi Nasab, S M; Negarestani, A

    2013-01-01

    In this study, a new model based on electric circuit theory was introduced to simulate the behaviour of indoor radon concentration. In this model, a voltage source simulates radon generation in walls, conductivity simulates migration through walls and voltage across a capacitor simulates radon concentration in a room. This simulation considers migration of radon through walls by diffusion mechanism in one-dimensional geometry. Data reported in a typical Greek house were employed to examine the application of this technique of simulation to the behaviour of radon.
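    The circuit analogy described above reduces, in lumped form, to a first-order balance that behaves exactly like charging a capacitor through a resistor: dC/dt = E/V - (λ_rn + λ_v)·C, where E is the radon entry rate, V the room volume, λ_rn the Rn-222 decay constant and λ_v the ventilation (air-exchange) rate. The sketch below uses this standard lumped equation with invented values; it is an illustration of the analogy, not the paper's one-dimensional diffusion model.

```python
import math

E = 10.0          # radon entry rate into the room, Bq/h (illustrative)
V = 50.0          # room volume, m^3 (illustrative)
lam_rn = 7.55e-3  # Rn-222 decay constant, 1/h
lam_v = 0.5       # ventilation / air-exchange rate, 1/h (illustrative)

def concentration(t, c0=0.0):
    """Closed-form solution of the first-order (RC-like) balance equation."""
    k = lam_rn + lam_v
    c_inf = E / (V * k)                  # steady state: the "fully charged" level
    return c_inf + (c0 - c_inf) * math.exp(-k * t)

c_steady = concentration(1e6)            # long-time limit, Bq/m^3
```

    The time constant 1/(λ_rn + λ_v) plays the role of RC: with ventilation dominating decay, the room approaches its steady-state concentration within a few air-exchange times.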

  7. An Innovative and Successful Simulation Day.

    PubMed

    Bowling, Ann M; Eismann, Michelle

    This article discusses the development of a creative and innovative plan to incorporate independent activities, including skill reviews and scenarios, into a single eight-hour day, using small student groups to enhance the learning process for pediatric nursing students. The simulation day consists of skills activities and pediatric simulation scenarios using the human patient simulator. Using small student groups in simulation captures the students' attention and enhances motivation to learn. The simulation day is a work in progress; appropriate changes are continually being made to improve the simulation experience for students.

  8. An approach to value-based simulator selection: The creation and evaluation of the simulator value index tool.

    PubMed

    Rooney, Deborah M; Hananel, David M; Covington, Benjamin J; Dionise, Patrick L; Nykamp, Michael T; Pederson, Melvin; Sahloul, Jamal M; Vasquez, Rachael; Seagull, F Jacob; Pinsky, Harold M; Sweier, Domenica G; Cooke, James M

    2018-04-01

    Currently there is no reliable, standardized mechanism to support health care professionals during the evaluation and procurement of simulators. A tool founded on best practices could facilitate simulator purchase processes. In a 3-phase process, we identified the top factors considered during the simulator purchase process through expert consensus (n = 127), created the Simulator Value Index (SVI) tool, evaluated targeted validity evidence, and evaluated the practical value of the SVI. A web-based survey was sent to simulation professionals. Participants (n = 79) used the SVI and provided feedback. We evaluated the practical value of 4 tool variations by calculating their sensitivity to predict a preferred simulator. Seventeen top factors were identified and ranked. The top 2 were technical stability/reliability of the simulator and customer service, with no practical differences in rank across institution or stakeholder role. The full SVI variations successfully predicted the preferred simulator with good sensitivity (87%), whereas the sensitivity of the reduced variations (cost plus customer service; cost plus technical stability) was lower (≤54%). The majority (73%) of participants agreed that the SVI was helpful at guiding simulator purchase decisions, and 88% agreed the SVI tool would help facilitate discussion with peers and leadership. Our findings indicate the SVI supports the simulator purchase process within a standardized framework. Sensitivity of the tool improved when factors extended beyond those traditionally targeted. We propose the tool will facilitate discussion amongst simulation professionals, provide essential information for finance and procurement professionals, and improve the long-term value of simulation solutions. Limitations and application of the tool are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Limits to high-speed simulations of spiking neural networks using general-purpose computers.

    PubMed

    Zenke, Friedemann; Gerstner, Wulfram

    2014-01-01

    To understand how the central nervous system performs computations using recurrent neuronal circuitry, simulations have become an indispensable tool for theoretical neuroscience. To study neuronal circuits and their ability to self-organize, increasing attention has been directed toward synaptic plasticity. In particular spike-timing-dependent plasticity (STDP) creates specific demands for simulations of spiking neural networks. On the one hand a high temporal resolution is required to capture the millisecond timescale of typical STDP windows. On the other hand network simulations have to evolve over hours up to days, to capture the timescale of long-term plasticity. To do this efficiently, fast simulation speed is the crucial ingredient rather than large neuron numbers. Using different medium-sized network models consisting of several thousands of neurons and off-the-shelf hardware, we compare the simulation speed of the simulators: Brian, NEST and Neuron as well as our own simulator Auryn. Our results show that real-time simulations of different plastic network models are possible in parallel simulations in which numerical precision is not a primary concern. Even so, the speed-up margin of parallelism is limited and boosting simulation speeds beyond one tenth of real-time is difficult. By profiling simulation code we show that the run times of typical plastic network simulations encounter a hard boundary. This limit is partly due to latencies in the inter-process communications and thus cannot be overcome by increased parallelism. Overall, these results show that to study plasticity in medium-sized spiking neural networks, adequate simulation tools are readily available which run efficiently on small clusters. However, to run simulations substantially faster than real-time, special hardware is a prerequisite.

  10. Simulating adverse event spontaneous reporting systems as preferential attachment networks: application to the Vaccine Adverse Event Reporting System.

    PubMed

    Scott, J; Botsis, T; Ball, R

    2014-01-01

    Spontaneous Reporting Systems [SRS] are critical tools in the post-licensure evaluation of medical product safety. Regulatory authorities use a variety of data mining techniques to detect potential safety signals in SRS databases. Assessing the performance of such signal detection procedures requires simulated SRS databases, but simulation strategies proposed to date each have limitations. We sought to develop a novel SRS simulation strategy based on plausible mechanisms for the growth of databases over time. We developed a simulation strategy based on the network principle of preferential attachment. We demonstrated how this strategy can be used to create simulations based on specific databases of interest, and provided an example of using such simulations to compare signal detection thresholds for a popular data mining algorithm. The preferential attachment simulations were generally structurally similar to our targeted SRS database, although they had fewer nodes of very high degree. The approach was able to generate signal-free SRS simulations, as well as mimicking specific known true signals. Explorations of different reporting thresholds for the FDA Vaccine Adverse Event Reporting System suggested that using proportional reporting ratio [PRR] > 3.0 may yield better signal detection operating characteristics than the more commonly used PRR > 2.0 threshold. The network analytic approach to SRS simulation based on the principle of preferential attachment provides an attractive framework for exploring the performance of safety signal detection algorithms. This approach is potentially more principled and versatile than existing simulation approaches. The utility of network-based SRS simulations needs to be further explored by evaluating other types of simulated signals with a broader range of data mining approaches, and comparing network-based simulations with other simulation strategies where applicable.
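    The PRR thresholds compared above come from the standard 2x2 contingency table of an SRS database: a = reports with the product of interest and the event of interest, b = reports with the product and other events, c = reports with other products and the event, d = reports with other products and other events. The counts below are invented for illustration.

```python
def prr(a, b, c, d):
    """Proportional reporting ratio from a 2x2 contingency table."""
    return (a / (a + b)) / (c / (c + d))

# Illustrative (invented) counts: the product-event pair is reported
# disproportionately often relative to the background rate.
value = prr(a=30, b=970, c=50, d=8950)

signal_at_2 = value > 2.0   # commonly used threshold
signal_at_3 = value > 3.0   # stricter threshold explored in the study
```

    Raising the threshold from 2.0 to 3.0 trades sensitivity for specificity, which is exactly the operating-characteristic question the simulated databases are used to answer.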

  11. Validation of Multibody Program to Optimize Simulated Trajectories II Parachute Simulation with Interacting Forces

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Behzad; Queen, Eric M.; Hotchko, Nathaniel J.

    2009-01-01

    A capability to simulate trajectories of multiple interacting rigid bodies has been developed, tested and validated. This capability uses the Program to Optimize Simulated Trajectories II (POST 2). The standard version of POST 2 allows trajectory simulation of multiple bodies without force interaction. In the current implementation, the force interaction between the parachute and the suspended bodies has been modeled using flexible lines, allowing accurate trajectory simulation of the individual bodies in flight. The POST 2 multibody capability is intended to be general purpose and applicable to any parachute entry trajectory simulation. This research paper explains the motivation for multibody parachute simulation, discusses implementation methods, and presents validation of this capability.
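    A flexible line of the kind described above can be modeled, in its simplest form, as a tension-only spring-damper: it pulls the parachute and suspended body together when taut and carries no load when slack. This is a hedged generic sketch of that idea, not the POST 2 implementation; all parameter values are invented.

```python
import math

def line_force(p1, p2, v1, v2, rest_length, k, c):
    """Force on body 1 from a tension-only spring-damper line to body 2."""
    dx = [b - a for a, b in zip(p1, p2)]
    dist = math.sqrt(sum(d * d for d in dx))
    if dist <= rest_length or dist == 0.0:
        return [0.0, 0.0, 0.0]           # slack line carries no load
    unit = [d / dist for d in dx]
    stretch = dist - rest_length
    # Relative closing velocity projected along the line, for damping.
    rel_v = sum((vb - va) * u for va, vb, u in zip(v1, v2, unit))
    tension = k * stretch + c * rel_v    # spring plus damping along the line
    return [tension * u for u in unit]

f = line_force(p1=(0, 0, 0), p2=(0, 0, 12), v1=(0, 0, 0), v2=(0, 0, 1),
               rest_length=10.0, k=500.0, c=20.0)
```

    Applying equal and opposite forces of this form to each rigid body couples their trajectories while letting each body keep its own full equations of motion, which is the essence of the multibody capability validated in the paper.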

  12. Translational simulation: not 'where?' but 'why?' A functional view of in situ simulation.

    PubMed

    Brazil, Victoria

    2017-01-01

    Healthcare simulation has been widely adopted for health professional education at all stages of training and practice and across cognitive, procedural, communication and teamwork domains. Recent enthusiasm for in situ simulation-delivered in the real clinical environment-cites improved transfer of knowledge and skills into real-world practice, as well as opportunities to identify latent safety threats and other workplace-specific issues. However, describing simulation type according to place may not be helpful. Instead, I propose the term translational simulation as a functional term for how simulation may be connected directly with health service priorities and patient outcomes, through interventional and diagnostic functions, independent of the location of the simulation activity.

  13. Mosquito population dynamics from cellular automata-based simulation

    NASA Astrophysics Data System (ADS)

    Syafarina, Inna; Sadikin, Rifki; Nuraini, Nuning

    2016-02-01

    In this paper we present an innovative model for simulating mosquito-vector population dynamics. The simulation consists of two stages: demography and dispersal dynamics. For the demography simulation, we follow an existing model of the mosquito life cycle. Moreover, we use a cellular automata-based model for simulating dispersal of the vector. In the simulation, each individual vector is able to move to other grid cells based on a random walk. Our model is also capable of representing an immunity factor for each grid cell. We simulated the model to evaluate its correctness. Based on the simulations, we can conclude that our model is correct. However, the model needs to be improved with realistic parameters to match real data.
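    The dispersal stage described above can be sketched as a per-individual random walk on a grid. This is a generic illustration of the cellular-automata idea, not the paper's model; the grid size, population count, boundary rule and step count are invented, and the immunity factor is omitted for brevity.

```python
import numpy as np

def disperse(positions, grid_shape, rng):
    """Move each (row, col) walker one step in a random cardinal direction."""
    moves = rng.integers(0, 4, size=len(positions))          # N, S, E, W
    deltas = np.array([(-1, 0), (1, 0), (0, 1), (0, -1)])
    new_pos = positions + deltas[moves]
    # Reflecting boundaries: clip walkers to the grid.
    new_pos[:, 0] = np.clip(new_pos[:, 0], 0, grid_shape[0] - 1)
    new_pos[:, 1] = np.clip(new_pos[:, 1], 0, grid_shape[1] - 1)
    return new_pos

rng = np.random.default_rng(0)
grid = (20, 20)
pos = np.full((100, 2), 10)              # 100 mosquitoes start at the centre
for _ in range(50):
    pos = disperse(pos, grid, rng)
spread = pos.std(axis=0)                 # the population spreads out over time
```

    Coupling such a dispersal step with a per-cell demography update (and a per-cell immunity factor modulating survival) gives the two-stage structure the paper describes.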

  14. Automated simulation as part of a design workstation

    NASA Technical Reports Server (NTRS)

    Cantwell, E.; Shenk, T.; Robinson, P.; Upadhye, R.

    1990-01-01

    A development project for a design workstation for advanced life-support systems incorporating qualitative simulation required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulations, such that simulation capabilities are maximized without duplication. The reason is that, to produce design solutions to a system goal, the behavior of the system in both steady and perturbed states must be represented. The paper reports on the Qualitative Simulation Tool (QST), on an expert-system-like model-building and simulation interface tool called ScratchPad (SP), and on the integration of QST and SP with more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components.

  15. The ATLAS Simulation Infrastructure

    DOE PAGES

    Aad, G.; Abbott, B.; Abdallah, J.; ...

    2010-09-25

    The simulation software for the ATLAS Experiment at the Large Hadron Collider is being used for large-scale production of events on the LHC Computing Grid. This simulation requires many components, from the generators that simulate particle collisions, through packages simulating the response of the various detectors and triggers. All of these components come together under the ATLAS simulation infrastructure. In this paper, that infrastructure is discussed, including that supporting the detector description, interfacing the event generation, and combining the GEANT4 simulation of the response of the individual detectors. Also described are the tools allowing the software validation, performance testing, and the validation of the simulated output against known physics processes.

  16. Microgrid and Inverter Control and Simulator Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-09-13

    A collection of software that can simulate the operation of an inverter on a microgrid or control a real inverter; in addition, it can simulate the control of multiple nodes on a microgrid. Application: simulation of inverters and microgrids; control of inverters on microgrids. The MMI submodule is designed to control custom inverter hardware, and to simulate that hardware. The INVERTER submodule contains only the simulator code, and is of an earlier generation than the simulator in MMI. The MICROGRID submodule is an agent-based simulator of multiple nodes on a microgrid which presents a web interface. The WIND submodule produces movies of wind data with a web interface.

  17. COCOA: Simulating Observations of Star Cluster Simulations

    NASA Astrophysics Data System (ADS)

    Askar, Abbas; Giersz, Mirek; Pych, Wojciech; Dalessandro, Emanuele

    2017-03-01

    COCOA (Cluster simulatiOn Comparison with ObservAtions) creates idealized mock photometric observations using results from numerical simulations of star cluster evolution. COCOA is able to present the output of realistic numerical simulations of star clusters carried out using Monte Carlo or N-body codes in a way that is useful for direct comparison with photometric observations. The code can simulate optical observations from simulation snapshots in which positions and magnitudes of objects are known. The parameters for simulating the observations can be adjusted to mimic telescopes of various sizes. COCOA also has a photometry pipeline that can use standalone versions of DAOPHOT (ascl:1104.011) and ALLSTAR to produce photometric catalogs for all observed stars.
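    The core idea, projecting snapshot positions and magnitudes onto a pixel grid, can be sketched as follows. The coordinate normalization and photometric zero-point are illustrative assumptions; COCOA's actual pipeline (PSF convolution, noise, DAOPHOT photometry) is far more detailed:

```python
def render_snapshot(stars, size, zeropoint=25.0):
    """Project star (x, y, magnitude) records from a simulation snapshot onto
    a square pixel grid, converting magnitudes to fluxes via the zero-point."""
    image = [[0.0] * size for _ in range(size)]
    for x, y, mag in stars:                  # x, y assumed normalized to [0, 1)
        px, py = int(x * size), int(y * size)
        if 0 <= px < size and 0 <= py < size:
            image[py][px] += 10 ** (-0.4 * (mag - zeropoint))  # flux in zero-point units
    return image

img = render_snapshot([(0.5, 0.5, 25.0)], 10)
print(img[5][5])  # 1.0: a star at the zero-point magnitude contributes unit flux
```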

  18. An Orion/Ares I Launch and Ascent Simulation: One Segment of the Distributed Space Exploration Simulation (DSES)

    NASA Technical Reports Server (NTRS)

    Chung, Victoria I.; Crues, Edwin Z.; Blum, Mike G.; Alofs, Cathy; Busto, Juan

    2007-01-01

    This paper describes the architecture and implementation of a distributed launch and ascent simulation of NASA's Orion spacecraft and Ares I launch vehicle. This simulation is one segment of the Distributed Space Exploration Simulation (DSES) Project. The DSES project is a research and development collaboration between NASA centers which investigates technologies and processes for distributed simulation of complex space systems in support of NASA's Exploration Initiative. DSES is developing an integrated end-to-end simulation capability to support NASA development and deployment of new exploration spacecraft and missions. This paper describes the first in a collection of simulation capabilities that DSES will support.

  19. Cosimulation of embedded system using RTOS software simulator

    NASA Astrophysics Data System (ADS)

    Wang, Shihao; Duan, Zhigang; Liu, Mingye

    2003-09-01

    Embedded system design often employs co-simulation to verify a system's function; one efficient software verification tool is the Instruction Set Simulator (ISS). As a full functional model of the target CPU, an ISS interprets embedded software instruction by instruction, which is usually time-consuming since it simulates at a low level. Hence the ISS often becomes the bottleneck of co-simulation in a complicated system. In this paper, a new software verification tool, the RTOS software simulator (RSS), is presented and the mechanism of its operation is described in full detail. In the RSS method, the RTOS API is extended and a hardware-simulator driver is adopted to handle data exchange and synchronization between the two simulators.
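    The step-by-step interpretation an ISS performs can be illustrated with a toy register-machine interpreter. The instruction set here (li, add, addi, jnz) is invented for illustration and does not correspond to any real CPU:

```python
def run(program, registers=None):
    """Interpret a toy register program instruction by instruction, the way an
    Instruction Set Simulator steps through target code."""
    regs = dict(registers or {})
    pc = 0
    while pc < len(program):
        op, *args = program[pc]
        if op == "li":                      # load immediate into a register
            regs[args[0]] = args[1]
        elif op == "add":                   # add two registers
            regs[args[0]] = regs[args[1]] + regs[args[2]]
        elif op == "addi":                  # add an immediate to a register
            regs[args[0]] = regs[args[1]] + args[2]
        elif op == "jnz":                   # jump if register is non-zero
            if regs[args[0]] != 0:
                pc = args[1]
                continue
        pc += 1
    return regs

# A countdown loop: decrements 'a' from 3 until it reaches zero.
print(run([("li", "a", 3), ("addi", "a", "a", -1), ("jnz", "a", 1)])["a"])  # 0
```

    Because every instruction passes through this dispatch loop, interpretation is much slower than native execution, which is the bottleneck the RSS approach tries to avoid.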

  20. Gravitational Reference Sensor Front-End Electronics Simulator for LISA

    NASA Astrophysics Data System (ADS)

    Meshksar, Neda; Ferraioli, Luigi; Mance, Davor; ten Pierick, Jan; Zweifel, Peter; Giardini, Domenico; LISA Pathfinder collaboration

  1. Medical Simulation Practices 2010 Survey Results

    NASA Technical Reports Server (NTRS)

    McCrindle, Jeffrey J.

    2011-01-01

    Medical Simulation Centers are an essential component of our learning infrastructure to prepare doctors and nurses for their careers. Unlike the military and aerospace simulation industry, very little has been published regarding the best practices currently in use within medical simulation centers. This survey attempts to provide insight into the current simulation practices at medical schools, hospitals, university nursing programs and community college nursing programs. Students within the MBA program at Saint Joseph's University conducted a survey of medical simulation practices during the summer 2010 semester. A total of 115 institutions responded to the survey. The survey results discuss the overall effectiveness of current simulation centers as well as the tools and techniques used to conduct the simulation activity.

  2. Development of IR imaging system simulator

    NASA Astrophysics Data System (ADS)

    Xiang, Xinglang; He, Guojing; Dong, Weike; Dong, Lu

    2017-02-01

    To overcome the disadvantages of traditional semi-physical simulation and injection-simulation equipment in evaluating the performance of the infrared imaging system (IRIS), a low-cost and reconfigurable IRIS simulator, which can simulate the realistic physical process of infrared imaging, is proposed to test and evaluate IRIS performance. According to the theoretical simulation framework and the theoretical models of the IRIS, the architecture of the IRIS simulator is constructed. The 3D scenes are generated and the infrared atmospheric transmission effects are simulated in real time on the computer using OGRE. The physical effects of the IRIS are classified as the signal response characteristic, modulation transfer characteristic and noise characteristic, and they are simulated in real time on a single-board signal-processing platform built around an FPGA using a high-speed parallel computation method.
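    The three effect classes named above can be mimicked on a toy image: a gain term for the signal response, a 3x3 box blur standing in for the modulation transfer characteristic, and additive Gaussian noise. All parameter values and the blur kernel are illustrative assumptions, not the paper's models:

```python
import random

def simulate_sensor(scene, gain=1.0, noise_sigma=0.05, seed=0):
    """Apply gain (signal response), a 3x3 box blur (a crude stand-in for the
    modulation transfer characteristic), and additive Gaussian noise."""
    rng = random.Random(seed)
    h, w = len(scene), len(scene[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            acc, n = 0.0, 0
            for di in (-1, 0, 1):            # average over the 3x3 neighbourhood
                for dj in (-1, 0, 1):
                    if 0 <= i + di < h and 0 <= j + dj < w:
                        acc += scene[i + di][j + dj]
                        n += 1
            out[i][j] = gain * acc / n + rng.gauss(0.0, noise_sigma)
    return out

flat = [[1.0] * 5 for _ in range(5)]   # a uniform test scene
noisy = simulate_sensor(flat)          # blur leaves it flat; noise perturbs it
```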

  3. Acoustic Parametric Array for Identifying Standoff Targets

    NASA Astrophysics Data System (ADS)

    Hinders, M. K.; Rudd, K. E.

    2010-02-01

    An integrated simulation method for investigating nonlinear sound beams and 3D acoustic scattering from any combination of complicated objects is presented. A standard finite-difference simulation method is used to model pulsed nonlinear sound propagation from a source to a scattering target via the KZK equation. Then, a parallel 3D acoustic simulation method based on the finite integration technique is used to model the acoustic wave interaction with the target. Any combination of objects and material layers can be placed into the 3D simulation space to study the resulting interaction. Several example simulations are presented to demonstrate the simulation method and 3D visualization techniques. The combined simulation method is validated by comparing experimental and simulation data and a demonstration of how this combined simulation method assisted in the development of a nonlinear acoustic concealed weapons detector is also presented.
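    The finite-difference side of such a simulation can be illustrated with the classic leapfrog update for the 1D linear wave equation. The grid size, Courant number, and initial pulse are illustrative; the paper's KZK and finite-integration models are nonlinear and three-dimensional:

```python
def propagate(n, steps, c=0.5):
    """Leapfrog finite-difference update for the 1D wave equation
    u_tt = c^2 u_xx on a unit grid (c < 1 keeps the scheme stable)."""
    prev = [0.0] * n
    curr = [0.0] * n
    curr[n // 2] = 1.0                  # initial pulse at the centre
    for _ in range(steps):
        nxt = [0.0] * n                 # fixed (zero) boundaries
        for i in range(1, n - 1):
            nxt[i] = (2 * curr[i] - prev[i]
                      + c ** 2 * (curr[i + 1] - 2 * curr[i] + curr[i - 1]))
        prev, curr = curr, nxt
    return curr

u = propagate(101, 30)   # pulse splits into two fronts travelling outward
```

    A symmetric initial condition on a symmetric stencil stays symmetric, which is a cheap sanity check on any such scheme.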

  4. Compound simulator IR radiation characteristics test and calibration

    NASA Astrophysics Data System (ADS)

    Li, Yanhong; Zhang, Li; Li, Fan; Tian, Yi; Yang, Yang; Li, Zhuo; Shi, Rui

    2015-10-01

    Hardware-in-the-loop simulation reproduces, in the laboratory, the target/interference physical radiation and the interception process of a product in flight. Simulating the environment is particularly difficult when the radiation energy is high and the interference model is complicated. Here we introduce a development in IR scene generation based on a fiber-array imaging transducer with circumferential lamp spot sources. The IR simulation capability includes effective simulation of aircraft signatures and point-source IR countermeasures; two point sources acting as interference can move in random two-dimensional directions. To simulate the interference-release process, the radiation and motion characteristics are tested. After zero calibration of the simulator's optical axis, the radiation is projected accurately onto the product detector. The test and calibration results show that the new compound simulator can be used in hardware-in-the-loop simulation trials.

  5. 10 CFR 55.46 - Simulation facilities.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Simulation facilities. 55.46 Section 55.46 Energy NUCLEAR... Simulation facilities. (a) General. This section addresses the use of a simulation facility for the... applicants for operator and senior operator licenses. (b) Commission-approved simulation facilities and...

  6. 10 CFR 55.46 - Simulation facilities.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Simulation facilities. 55.46 Section 55.46 Energy NUCLEAR... Simulation facilities. (a) General. This section addresses the use of a simulation facility for the... applicants for operator and senior operator licenses. (b) Commission-approved simulation facilities and...

  7. 10 CFR 55.46 - Simulation facilities.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Simulation facilities. 55.46 Section 55.46 Energy NUCLEAR... Simulation facilities. (a) General. This section addresses the use of a simulation facility for the... applicants for operator and senior operator licenses. (b) Commission-approved simulation facilities and...

  8. 10 CFR 55.46 - Simulation facilities.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Simulation facilities. 55.46 Section 55.46 Energy NUCLEAR... Simulation facilities. (a) General. This section addresses the use of a simulation facility for the... applicants for operator and senior operator licenses. (b) Commission-approved simulation facilities and...

  9. 10 CFR 55.46 - Simulation facilities.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Simulation facilities. 55.46 Section 55.46 Energy NUCLEAR... Simulation facilities. (a) General. This section addresses the use of a simulation facility for the... applicants for operator and senior operator licenses. (b) Commission-approved simulation facilities and...

  10. Computer Based Simulation of Laboratory Experiments.

    ERIC Educational Resources Information Center

    Edward, Norrie S.

    1997-01-01

    Examines computer based simulations of practical laboratory experiments in engineering. Discusses the aims and achievements of lab work (cognitive, process, psychomotor, and affective); types of simulations (model building and behavioral); and the strengths and weaknesses of simulations. Describes the development of a centrifugal pump simulation,…

  11. Simulating neural systems with Xyce.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schiek, Richard Louis; Thornquist, Heidi K.; Mei, Ting

    2012-12-01

    Sandia's parallel circuit simulator, Xyce, can address large-scale neuron simulations in a new way, extending the range within which one can perform high-fidelity, multi-compartment neuron simulations. This report documents the implementation of neuron devices in Xyce and their use in the simulation and analysis of neuron systems.

  12. Some Dimensions of Simulation.

    ERIC Educational Resources Information Center

    Beck, Isabel; Monroe, Bruce

    Beginning with definitions of "simulation" (a methodology for testing alternative decisions under hypothetical conditions), this paper focuses on the use of simulation as an instructional method, pointing out the relationships and differences between role playing, games, and simulation. The term "simulation games" is explored with an analysis of…

  13. Computer Simulation in Tomorrow's Schools.

    ERIC Educational Resources Information Center

    Foster, David

    1984-01-01

    Suggests use of simulation as an educational strategy has promise for the school of the future; discusses specific advantages of simulations over alternative educational methods, role of microcomputers in educational simulation, and past obstacles and future promise of microcomputer simulations; and presents a literature review on effectiveness of…

  14. 78 FR 30956 - Cruise Vessel Security and Safety Training Provider Certification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-23

    ..., practical demonstration, or simulation program. A detailed instructor manual must be submitted. Submissions... simulation programs to be used. If a simulator or simulation program is to be used, include technical... lessons and, if appropriate, for practical demonstrations or simulation exercises and assessments...

  15. Construction of dynamic stochastic simulation models using knowledge-based techniques

    NASA Technical Reports Server (NTRS)

    Williams, M. Douglas; Shiva, Sajjan G.

    1990-01-01

    Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).

  16. The Many Faces of Patient-Centered Simulation: Implications for Researchers.

    PubMed

    Arnold, Jennifer L; McKenzie, Frederic Rick D; Miller, Jane Lindsay; Mancini, Mary E

    2018-06-01

    Patient-centered simulation for nonhealthcare providers is an emerging and innovative application for healthcare simulation. Currently, no consensus exists on what patient-centered simulation encompasses and outcomes research in this area is limited. Conceptually, patient-centered simulation aligns with the principles of patient- and family-centered care, bringing this educational tool directly to patients and caregivers with the potential to improve patient care and outcomes. This descriptive article is a summary of findings presented at the 2nd International Meeting for Simulation in Healthcare Research Summit. Experts in the field delineated a categorization for better describing patient-centered simulation and reviewed the literature to identify a research agenda. Three types of patient-centered simulation (patient-directed, patient-driven, and patient-specific) are presented, with research priorities identified for each. Patient-centered simulation has been shown to be an effective educational tool and has the potential to directly improve patient care outcomes. Presenting a typology for patient-centered simulation provides direction for future research.

  17. VASA: Interactive Computational Steering of Large Asynchronous Simulation Pipelines for Societal Infrastructure.

    PubMed

    Ko, Sungahn; Zhao, Jieqiong; Xia, Jing; Afzal, Shehzad; Wang, Xiaoyu; Abram, Greg; Elmqvist, Niklas; Kne, Len; Van Riper, David; Gaither, Kelly; Kennedy, Shaun; Tolone, William; Ribarsky, William; Ebert, David S

    2014-12-01

    We present VASA, a visual analytics platform consisting of a desktop application, a component model, and a suite of distributed simulation components for modeling the impact of societal threats such as weather, food contamination, and traffic on critical infrastructure such as supply chains, road networks, and power grids. Each component encapsulates a high-fidelity simulation model that together form an asynchronous simulation pipeline: a system of systems of individual simulations with a common data and parameter exchange format. At the heart of VASA is the Workbench, a visual analytics application providing three distinct features: (1) low-fidelity approximations of the distributed simulation components using local simulation proxies to enable analysts to interactively configure a simulation run; (2) computational steering mechanisms to manage the execution of individual simulation components; and (3) spatiotemporal and interactive methods to explore the combined results of a simulation run. We showcase the utility of the platform using examples involving supply chains during a hurricane as well as food contamination in a fast food restaurant chain.

  18. Real-time visual simulation of APT system based on RTW and Vega

    NASA Astrophysics Data System (ADS)

    Xiong, Shuai; Fu, Chengyu; Tang, Tao

    2012-10-01

    The Matlab/Simulink simulation model of an APT (acquisition, pointing and tracking) system is analyzed and established, and the model's C code, which can be used for real-time simulation, is generated by RTW (Real-Time Workshop). Practical experiments show that running the C code gives the same simulation result as running the Simulink model directly in the Matlab environment. MultiGen-Vega is a real-time 3D scene simulation software system; with it and OpenGL, the APT scene simulation platform is developed and used to render and display the virtual scenes of the APT system. To add necessary graphics effects to the virtual scenes in real time, GLSL (OpenGL Shading Language) shaders are used on a programmable GPU. By calling the C code, the scene simulation platform can adjust the system parameters on-line and obtain the APT system's real-time simulation data to drive the scenes. Practical application shows that this visual simulation platform has high efficiency, low cost and good simulation results.

  19. Role of a cumulus parameterization scheme in simulating atmospheric circulation and rainfall in the nine-layer Goddard Laboratory for Atmospheres General Circulation Model

    NASA Technical Reports Server (NTRS)

    Sud, Y. C.; Chao, Winston C.; Walker, G. K.

    1992-01-01

    The influence of a cumulus convection scheme on the simulated atmospheric circulation and hydrologic cycle is investigated by means of a coarse version of the GCM. Two sets of integrations, each containing an ensemble of three summer simulations, were produced. The ensemble sets of control and experiment simulations are compared and differentially analyzed to determine the influence of a cumulus convection scheme on the simulated circulation and hydrologic cycle. The results show that cumulus parameterization has a very significant influence on the simulated circulation and precipitation. The upper-level condensation heating over the ITCZ is much smaller for the experiment simulations than for the control simulations; correspondingly, the Hadley and Walker cells for the experiment simulations are also weaker and are accompanied by a weaker Ferrel cell in the Southern Hemisphere. Overall, the difference fields show that the experiment simulations (without cumulus convection) produce a cooler and less energetic atmosphere.

  20. Understanding Emergency Care Delivery Through Computer Simulation Modeling.

    PubMed

    Laker, Lauren F; Torabi, Elham; France, Daniel J; Froehle, Craig M; Goldlust, Eric J; Hoot, Nathan R; Kasaie, Parastu; Lyons, Michael S; Barg-Walkow, Laura H; Ward, Michael J; Wears, Robert L

    2018-02-01

    In 2017, Academic Emergency Medicine convened a consensus conference entitled, "Catalyzing System Change through Health Care Simulation: Systems, Competency, and Outcomes." This article, a product of the breakout session on "understanding complex interactions through systems modeling," explores the role that computer simulation modeling can and should play in research and development of emergency care delivery systems. This article discusses areas central to the use of computer simulation modeling in emergency care research. The four central approaches to computer simulation modeling are described (Monte Carlo simulation, system dynamics modeling, discrete-event simulation, and agent-based simulation), along with problems amenable to their use and relevant examples to emergency care. Also discussed is an introduction to available software modeling platforms and how to explore their use for research, along with a research agenda for computer simulation modeling. Through this article, our goal is to enhance adoption of computer simulation, a set of methods that hold great promise in addressing emergency care organization and design challenges. © 2017 by the Society for Academic Emergency Medicine.
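    Of the four approaches listed, a queueing model of a single emergency bay is the easiest to sketch. The following processes arrival and service events in time order and returns the mean patient wait; the rates, counts, and exponential distributions are illustrative assumptions, not results from the conference:

```python
import random

def simulate_ed(n_patients, mean_interarrival, mean_service, seed=0):
    """Single-server queue: generate arrival events, process them in order,
    and track when the server frees up. Returns the mean wait before service."""
    rng = random.Random(seed)
    t, arrivals = 0.0, []
    for _ in range(n_patients):              # exponential interarrival times
        t += rng.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)
    server_free, total_wait = 0.0, 0.0
    for arr in arrivals:
        start = max(arr, server_free)        # wait if the server is busy
        total_wait += start - arr
        server_free = start + rng.expovariate(1.0 / mean_service)
    return total_wait / n_patients

mean_wait = simulate_ed(200, mean_interarrival=5.0, mean_service=4.0)
```

    Varying the service rate or adding servers in such a model is how capacity questions are explored before changing a real department.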

  1. Benefits of full scope simulators during solar thermal power plants design and construction

    NASA Astrophysics Data System (ADS)

    Gallego, José F.; Gil, Elena; Rey, Pablo

    2017-06-01

    In order to efficiently develop high-precision dynamic simulators for solar thermal power plants, Tecnatom adapted its simulation technology to include solar thermal models. This effort and the excellent response of the simulation market have allowed Tecnatom to develop simulators for both parabolic trough and solar power tower technologies, including molten salt energy storage. These simulators may pursue different objectives, giving rise to training or engineering simulators. The solar thermal power market combines the need to train operators with the potential benefits of improving plant design. This fact, along with the simulation capabilities enabled by current technology and Tecnatom's broad experience, makes the development of a combined engineering and training simulator a very advantageous option. This paper describes the challenge of developing and integrating a full scope simulator during the design and construction stages of a solar thermal power plant, showing the added value to the different engineering areas.

  2. Virtual reality simulation: basic concepts and use in endoscopic neurosurgery training.

    PubMed

    Cohen, Alan R; Lohani, Subash; Manjila, Sunil; Natsupakpong, Suriya; Brown, Nathan; Cavusoglu, M Cenk

    2013-08-01

    Virtual reality simulation is a promising alternative for training surgical residents outside the operating room. It is also a useful aid to anatomic study, residency training, surgical rehearsal, credentialing, and recertification. Surgical simulation is based on a virtual reality with varying degrees of immersion and realism. Simulators provide a no-risk environment for harmless and repeatable practice. Virtual reality simulation has three main components: graphics/volume rendering, model behavior/tissue deformation, and haptic feedback. The challenge of accurately simulating the forces and tactile sensations experienced in neurosurgery limits the sophistication of a virtual simulator. The limited haptic feedback available in minimally invasive neurosurgery makes it a favorable subject for simulation. Virtual simulators with realistic graphics and force feedback have been developed for ventriculostomy, intraventricular surgery, and transsphenoidal pituitary surgery, thus allowing preoperative study of the individual anatomy and increasing the safety of the procedure. The authors also present experiences with their own virtual simulation of endoscopic third ventriculostomy.

  3. Evaluation and development of the routing protocol of a fully functional simulation environment for VANETs

    NASA Astrophysics Data System (ADS)

    Ali, Azhar Tareq; Warip, Mohd Nazri Mohd; Yaakob, Naimah; Abduljabbar, Waleed Khalid; Atta, Abdu Mohammed Ali

    2017-11-01

    Vehicular Ad-hoc Networks (VANETs) are an area of wireless technology that is attracting a great deal of interest. Several areas of VANETs, such as security, routing protocols and medium access control, still lack large amounts of research, and there is also a lack of freely available simulators that can quickly and accurately simulate VANETs. The main goal of this paper is to develop a freely available VANET simulator and to evaluate popular mobile ad-hoc network routing protocols in several VANET scenarios. The VANET simulator consists of a network simulator and a traffic (mobility) simulator, kept in sync by a client-server application. It also models buildings to create a more realistic wireless network environment. Ad hoc On-Demand Distance Vector routing (AODV), Dynamic Source Routing (DSR) and Dynamic MANET On-demand (DYMO) were initially simulated in city, country, and highway environments to provide an overall evaluation.

  4. A path-level exact parallelization strategy for sequential simulation

    NASA Astrophysics Data System (ADS)

    Peredo, Oscar F.; Baeza, Daniel; Ortiz, Julián M.; Herrero, José R.

    2018-01-01

    Sequential Simulation is a well known method in geostatistical modelling. Following the Bayesian approach for simulation of conditionally dependent random events, Sequential Indicator Simulation (SIS) method draws simulated values for K categories (categorical case) or classes defined by K different thresholds (continuous case). Similarly, Sequential Gaussian Simulation (SGS) method draws simulated values from a multivariate Gaussian field. In this work, a path-level approach to parallelize SIS and SGS methods is presented. A first stage of re-arrangement of the simulation path is performed, followed by a second stage of parallel simulation for non-conflicting nodes. A key advantage of the proposed parallelization method is to generate identical realizations as with the original non-parallelized methods. Case studies are presented using two sequential simulation codes from GSLIB: SISIM and SGSIM. Execution time and speedup results are shown for large-scale domains, with many categories and maximum kriging neighbours in each case, achieving high speedup results in the best scenarios using 16 threads of execution in a single machine.
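    The sequential-path idea can be sketched in one dimension: shuffle a visiting order, then simulate each node conditional on already-simulated neighbours. The Gaussian draw with a neighbour-mean stand-in below replaces the kriging system that SGSIM actually solves; the radius and variance are illustrative assumptions:

```python
import random

def sequential_simulation(n, neigh_radius, seed=0):
    """Visit 1D grid nodes along a random path; draw each value from a Gaussian
    whose mean is the average of already-simulated neighbours (a crude stand-in
    for the kriging mean used in SGS)."""
    rng = random.Random(seed)
    path = list(range(n))
    rng.shuffle(path)                    # the random simulation path
    values = [None] * n
    for node in path:
        neigh = [values[j]
                 for j in range(max(0, node - neigh_radius),
                                min(n, node + neigh_radius + 1))
                 if values[j] is not None]
        mean = sum(neigh) / len(neigh) if neigh else 0.0
        values[node] = rng.gauss(mean, 1.0)
    return values

field = sequential_simulation(20, neigh_radius=3)
```

    The data dependence along the path is exactly what makes parallelization hard: two nodes can be simulated concurrently only if neither lies in the other's neighbourhood, which is the conflict test the paper's path-level strategy exploits.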

  5. Innovations in surgery simulation: a review of past, current and future techniques

    PubMed Central

    Burtt, Karen; Solorzano, Carlos A.; Carey, Joseph N.

    2016-01-01

    As a result of recent work-hours limitations and concerns for patient safety, innovations in extraclinical surgical simulation have become a desired part of residency education. Current simulation models, including cadaveric, animal, bench-top, virtual reality (VR) and robotic simulators are increasingly used in surgical training programs. Advances in telesurgery, three-dimensional (3D) printing, and the incorporation of patient-specific anatomy are paving the way for simulators to become integral components of medical training in the future. Evidence from the literature highlights the benefits of including simulations in surgical training; skills acquired through simulations translate into improvements in operating room performance. Moreover, simulations are rapidly incorporating new medical technologies and offer increasingly high-fidelity recreations of procedures. As a result, both novice and expert surgeons are able to benefit from their use. As dedicated, structured curricula are developed that incorporate simulations into daily resident training, simulated surgeries will strengthen the surgeon’s skill set, decrease hospital costs, and improve patient outcomes. PMID:28090509

  6. Innovations in surgery simulation: a review of past, current and future techniques.

    PubMed

    Badash, Ido; Burtt, Karen; Solorzano, Carlos A; Carey, Joseph N

    2016-12-01

    As a result of recent work-hours limitations and concerns for patient safety, innovations in extraclinical surgical simulation have become a desired part of residency education. Current simulation models, including cadaveric, animal, bench-top, virtual reality (VR) and robotic simulators are increasingly used in surgical training programs. Advances in telesurgery, three-dimensional (3D) printing, and the incorporation of patient-specific anatomy are paving the way for simulators to become integral components of medical training in the future. Evidence from the literature highlights the benefits of including simulations in surgical training; skills acquired through simulations translate into improvements in operating room performance. Moreover, simulations are rapidly incorporating new medical technologies and offer increasingly high-fidelity recreations of procedures. As a result, both novice and expert surgeons are able to benefit from their use. As dedicated, structured curricula are developed that incorporate simulations into daily resident training, simulated surgeries will strengthen the surgeon's skill set, decrease hospital costs, and improve patient outcomes.

  7. Integrating Medical Simulation Into the Physician Assistant Physiology Curriculum.

    PubMed

    Li, Lixin; Lopes, John; Zhou, Joseph Yi; Xu, Biao

    2016-12-01

    Medical simulation has recently been used in medical education, and evidence indicates that it is a valuable tool for teaching and evaluation. Very few studies have evaluated the integration of medical simulation in medical physiology education, particularly in PA programs. This study was designed to assess the value of integrating medical simulation into the PA physiology curriculum. Seventy-five students from the PA program at Central Michigan University participated in this study. Mannequin-based simulation was used to simulate a patient with hemorrhagic shock and congestive heart failure to demonstrate the Frank-Starling force and cardiac function curve. Before and after the medical simulation, students completed a questionnaire as a self-assessment. A knowledge test was also delivered after the simulation. Our study demonstrated a significant improvement in student confidence in understanding congestive heart failure, hemorrhagic shock, and the Frank-Starling curve after the simulation. Medical simulation may be an effective way to enhance basic science learning experiences for students and an ideal supplement to traditional, lecture-based teaching in PA education.

  8. Mathematical simulation of power conditioning systems. Volume 1: Simulation of elementary units. Report on simulation methodology

    NASA Technical Reports Server (NTRS)

    Prajous, R.; Mazankine, J.; Ippolito, J. C.

    1978-01-01

    Methods and algorithms used for the simulation of elementary power conditioning units (buck, boost, and buck-boost converters, as well as shunt PWM) are described. Definitions are given of similar converters and reduced parameters. The various parts of the simulation to be carried out are dealt with: local stability, corrective networks, measurement of input-output impedance, and global stability. A simulation example is given.
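As an illustration of the averaged-model approach such reports rely on, the sketch below integrates the state-averaged equations of an ideal buck converter with forward Euler. The component values, time step, and solver choice are illustrative assumptions, not taken from the report:

```python
# Averaged-model simulation of an ideal buck converter (illustrative
# values; forward Euler integration).
def simulate_buck(vin=12.0, duty=0.5, L=100e-6, C=100e-6, R=10.0,
                  dt=1e-6, steps=200_000):
    """Integrate L di/dt = duty*vin - v and C dv/dt = i - v/R."""
    i, v = 0.0, 0.0
    for _ in range(steps):
        di = (duty * vin - v) / L
        dv = (i - v / R) / C
        i += di * dt
        v += dv * dt
    return v

# The output voltage settles near duty * vin = 6.0 V.
print(round(simulate_buck(), 2))
```

In the averaged model the switching ripple is abstracted away, which is why the steady state is simply the duty cycle times the input voltage.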

  9. Modeling, Simulation and Design of Plasmonic Interconnects for On-Chip Signal Processing

    DTIC Science & Technology

    2011-02-14

    integration and computation can be achieved by using the photonic detection devices such as the ultrafast photodetectors and nanowire field transistors... infrared to optical frequencies, and their FDTD simulation results are shown in the middle diagram. In the rightmost diagram, the HSPICE simulation...FDTD simulation. The results tally very well to affirm that plasmonic nanowires can be simulated using circuit simulators like HSPICE to combine the

  10. Program For Parallel Discrete-Event Simulation

    NASA Technical Reports Server (NTRS)

    Beckman, Brian C.; Blume, Leo R.; Geiselman, John S.; Presley, Matthew T.; Wedel, John J., Jr.; Bellenot, Steven F.; Diloreto, Michael; Hontalas, Philip J.; Reiher, Peter L.; Weiland, Frederick P.

    1991-01-01

    The user does not have to add any special logic to aid in synchronization. The Time Warp Operating System (TWOS) computer program is a special-purpose operating system designed to support parallel discrete-event simulation. It is a complete implementation of the Time Warp mechanism and supports only simulations and other computations designed for virtual time. The Time Warp Simulator (TWSIM) subdirectory contains a sequential simulation engine that is interface-compatible with TWOS. TWOS and TWSIM are written in, and support simulations in, the C programming language.
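The virtual-time ordering that Time Warp enforces can be illustrated with a minimal sequential event queue. TWOS itself is a parallel, optimistic (rollback-based) system written in C; this Python sketch shows only the event-scheduling idea, and all names in it are invented for illustration:

```python
import heapq

class Simulator:
    """Tiny sequential discrete-event engine ordered by virtual time."""
    def __init__(self):
        self.now = 0.0      # current virtual time
        self.queue = []     # min-heap of (timestamp, seq, handler)
        self.seq = 0        # tie-breaker for simultaneous events

    def schedule(self, delay, handler):
        heapq.heappush(self.queue, (self.now + delay, self.seq, handler))
        self.seq += 1

    def run(self):
        while self.queue:
            self.now, _, handler = heapq.heappop(self.queue)
            handler(self)

log = []
def ping(sim):
    log.append(("ping", sim.now))
    if sim.now < 3:
        sim.schedule(1.0, ping)   # reschedule one virtual second later

sim = Simulator()
sim.schedule(1.0, ping)
sim.run()
print(log)  # [('ping', 1.0), ('ping', 2.0), ('ping', 3.0)]
```

Time Warp's contribution is making this ordering hold across processors by letting each processor run optimistically and rolling back on out-of-order message arrival; the sequential TWSIM engine corresponds to the simple loop above.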

  11. OpenSimulator Interoperability with DRDC Simulation Tools: Compatibility Study

    DTIC Science & Technology

    2014-09-01

    into two components: (1) backend data services consisting of user accounts, login service, assets, and inventory; and (2) the simulator server which...components are combined into a single OpenSimulator process. In grid mode, the two components are separated, placing the backend services into a ROBUST... mobile devices. Potential points of compatibility between Unity and OpenSimulator include: a Unity-based desktop computer OpenSimulator viewer; a

  12. Multiscale optical simulation settings: challenging applications handled with an iterative ray-tracing FDTD interface method.

    PubMed

    Leiner, Claude; Nemitz, Wolfgang; Schweitzer, Susanne; Kuna, Ladislav; Wenzl, Franz P; Hartmann, Paul; Satzinger, Valentin; Sommer, Christian

    2016-03-20

    We show that with an appropriate combination of two optical simulation techniques-classical ray-tracing and the finite difference time domain method-an optical device containing multiple diffractive and refractive optical elements can be accurately simulated in an iterative simulation approach. We compare the simulation results with experimental measurements of the device to discuss the applicability and accuracy of our iterative simulation procedure.

  13. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 3 (L1V3).

    PubMed

    Bergmann, Frank T; Cooper, Jonathan; König, Matthias; Moraru, Ion; Nickerson, David; Le Novère, Nicolas; Olivier, Brett G; Sahle, Sven; Smith, Lucian; Waltemath, Dagmar

    2018-03-19

    The creation of computational simulation experiments to inform modern biological research poses challenges to reproduce, annotate, archive, and share such experiments. Efforts such as SBML or CellML standardize the formal representation of computational models in various areas of biology. The Simulation Experiment Description Markup Language (SED-ML) describes what procedures the models are subjected to, and the details of those procedures. These standards, together with further COMBINE standards, describe models sufficiently well for the reproduction of simulation studies among users and software tools. The Simulation Experiment Description Markup Language (SED-ML) is an XML-based format that encodes, for a given simulation experiment, (i) which models to use; (ii) which modifications to apply to models before simulation; (iii) which simulation procedures to run on each model; (iv) how to post-process the data; and (v) how these results should be plotted and reported. SED-ML Level 1 Version 1 (L1V1) implemented support for the encoding of basic time course simulations. SED-ML L1V2 added support for more complex types of simulations, specifically repeated tasks and chained simulation procedures. SED-ML L1V3 extends L1V2 by means to describe which datasets and subsets thereof to use within a simulation experiment.
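A rough feel for the five ingredients listed above can be had by assembling a small SED-ML-like document with the standard library. Element and attribute names below follow the SED-ML pattern but this is only a sketch, not a schema-validated Level 1 Version 3 file (the change list for item (ii) is omitted):

```python
import xml.etree.ElementTree as ET

# Illustrative skeleton of a SED-ML document (not schema-validated).
root = ET.Element("sedML", level="1", version="3")

models = ET.SubElement(root, "listOfModels")            # (i) which models
ET.SubElement(models, "model", id="m1",
              language="urn:sedml:language:sbml", source="model.xml")

sims = ET.SubElement(root, "listOfSimulations")         # (iii) procedures
ET.SubElement(sims, "uniformTimeCourse", id="sim1", initialTime="0",
              outputStartTime="0", outputEndTime="10",
              numberOfPoints="100")

tasks = ET.SubElement(root, "listOfTasks")              # bind model to sim
ET.SubElement(tasks, "task", id="t1", modelReference="m1",
              simulationReference="sim1")

dgs = ET.SubElement(root, "listOfDataGenerators")       # (iv) post-processing
ET.SubElement(dgs, "dataGenerator", id="dg1")

outs = ET.SubElement(root, "listOfOutputs")             # (v) reporting
ET.SubElement(outs, "plot2D", id="plot1")

print(ET.tostring(root, encoding="unicode")[:60])
```

The task element is what makes SED-ML composable: repeated tasks (L1V2) and dataset subsetting (L1V3) build on the same model/simulation cross-referencing shown here.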

  14. [Current and future use of surgical skills simulation in gynecologic resident education: a French national survey].

    PubMed

    Crochet, P; Aggarwal, R; Berdah, S; Yaribakht, S; Boubli, L; Gamerre, M; Agostini, A

    2014-05-01

    Simulation is a promising method to enhance surgical education in gynecology. The purpose of this study was to provide baseline information on the current use of simulators across French academic schools. Two questionnaires were created, one specifically for residents and one for professors. Main issues included the type of simulators used and the kind of use made for training purposes. Opinions and agreement about the use of simulators were also solicited. Twenty-six percent of residents (258/998) and 24% of professors (29/122) answered the questionnaire. Sixty-five percent of residents (167/258) had experienced simulators. Laparoscopic pelvic-trainers (84%) and sessions on live pigs (63%) were most commonly used. Residents reported access to simulators most commonly during introductory sessions (51%) and days of academic workshops (38%). Residents considered simulators very useful for training. Professors agreed that simulators should become a required part of residency training, but were less enthusiastic regarding simulation becoming a part of certification for practice. Surgical skills simulators have already been experienced by a majority of French gynecologic residents. However, the use of these educational tools varies among surgical schools and remains occasional for the majority of residents. There was a strong agreement that simulation technology should be a component of training. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  15. Virtual reality simulators and training in laparoscopic surgery.

    PubMed

    Yiannakopoulou, Eugenia; Nikiteas, Nikolaos; Perrea, Despina; Tsigris, Christos

    2015-01-01

    Virtual reality simulators provide basic skills training without supervision in a controlled environment, free of the pressure of operating on patients. Skills obtained through virtual reality simulation training can be transferred to the operating room. However, relative evidence is limited, with data available only for basic surgical skills and for laparoscopic cholecystectomy. No data exist on the effect of virtual reality simulation on performance of advanced surgical procedures. Evidence suggests that performance on virtual reality simulators reliably distinguishes experienced from novice surgeons. Limited available data suggest that an independent approach to virtual reality simulation training is not different from a proctored approach. The effect of virtual reality simulator training on the acquisition of basic surgical skills does not seem to differ from that of physical simulators. Limited data exist on the effect of virtual reality simulation training on the acquisition of visual spatial perception and stress coping skills. Undoubtedly, virtual reality simulation training provides an alternative means of improving performance in laparoscopic surgery. However, future research efforts should focus on the effect of virtual reality simulation on performance of advanced surgical procedures, on standardization of training, on the possible synergistic effect of virtual reality simulation training combined with mental training, and on personalized training. Copyright © 2014 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.

  16. Taming Wild Horses: The Need for Virtual Time-based Scheduling of VMs in Network Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoginath, Srikanth B; Perumalla, Kalyan S; Henz, Brian J

    2012-01-01

    The next generation of scalable network simulators employ virtual machines (VMs) to act as high-fidelity models of traffic producer/consumer nodes in simulated networks. However, network simulations could be inaccurate if VMs are not scheduled according to virtual time, especially when many VMs are hosted per simulator core in a multi-core simulator environment. Since VMs are by default free-running, at the outset it is not clear if, and to what extent, their untamed execution affects the results in simulated scenarios. Here, we provide the first quantitative basis for establishing the need for generalized virtual time scheduling of VMs in network simulators, based on actual prototyped implementations. To exercise breadth, our system is tested with multiple disparate applications: (a) a set of message passing parallel programs, (b) a computer worm propagation phenomenon, and (c) a mobile ad-hoc wireless network simulation. We define and use error metrics and benchmarks in scaled tests to empirically report the poor match of traditional, fairness-based VM scheduling to VM-based network simulation, and also clearly show the better performance of our simulation-specific scheduler, with up to 64 VMs hosted on a 12-core simulator node.
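The contrast the authors draw between fairness-based and virtual-time scheduling can be caricatured with a least-virtual-time dispatcher: always run the VM whose local virtual time is furthest behind. This toy sketch is not the paper's scheduler; the VM names and advance rates are invented:

```python
import heapq

def run_virtual_time(vms, slices):
    """Dispatch `slices` time slices, each to the VM with the lowest
    local virtual time. vms maps a VM name to its virtual-time advance
    per slice; returns the dispatch order."""
    heap = [(0.0, name) for name in sorted(vms)]
    heapq.heapify(heap)
    order = []
    for _ in range(slices):
        lvt, name = heapq.heappop(heap)
        order.append(name)
        heapq.heappush(heap, (lvt + vms[name], name))
    return order

# A VM that advances slowly in virtual time is dispatched more often,
# keeping the VMs aligned, in contrast to round-robin fairness.
print(run_virtual_time({"fast": 4.0, "slow": 1.0}, 6))
```

Under a fairness scheduler both VMs would receive alternating slices and their virtual clocks would drift apart, which is the source of the simulation error the paper quantifies.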

  17. Is There Bias against Simulation in Microsurgery Training?

    PubMed

    Theman, Todd A; Labow, Brian I

    2016-09-01

    Background While other surgical specialties have embraced virtual reality simulation for training and recertification, microsurgery has lagged. This study aims to assess the opinions of microsurgeons on the role of simulation in microsurgery assessment and training. Methods We surveyed faculty members of the American Society of Reconstructive Microsurgery to ascertain opinions on their use of simulation in training and opinions about the utility of simulation for skills acquisition, teaching, and skills assessment. The 21-question survey was disseminated online to 675 members. Results Eighty-nine members completed the survey for a 13.2% response rate. Few microsurgeons have experience with high-fidelity simulation, and opinions on its utility are internally inconsistent. Although 84% of respondents could not identify a reason why simulation would not be useful, only 24% believed simulation is a useful measure of clinical performance. Nearly three-fourths of respondents were skeptical that simulation would improve their skills. Ninety-four percent had no experience with simulator-based assessment. Conclusion Simulation has been shown to improve skills acquisition in microsurgery, but our survey suggests that unfamiliarity may foster bias against the technology. Failure to incorporate simulation may adversely affect training and may put surgeons at a disadvantage should these technologies be adopted for recertification by regulatory agencies.

  18. A Method for Functional Task Alignment Analysis of an Arthrocentesis Simulator.

    PubMed

    Adams, Reid A; Gilbert, Gregory E; Buckley, Lisa A; Nino Fong, Rodolfo; Fuentealba, I Carmen; Little, Erika L

    2018-05-16

    During simulation-based education, simulators are subjected to procedures composed of a variety of tasks and processes. Simulators should functionally represent a patient in response to the physical action of these tasks. The aim of this work was to describe a method for determining whether a simulator does or does not have sufficient functional task alignment (FTA) to be used in a simulation. Potential performance checklist items were gathered from published arthrocentesis guidelines and aggregated into a performance checklist using Lawshe's method. An expert panel used this performance checklist and an FTA analysis questionnaire to evaluate a simulator's ability to respond to the physical actions required by the performance checklist. Thirteen items, from a pool of 39, were included on the performance checklist. Experts had mixed reviews of the simulator's FTA and its suitability for use in simulation. Unexpectedly, some positive FTA was found for several tasks where the simulator lacked functionality. By developing a detailed list of specific tasks required to complete a clinical procedure, and surveying experts on the simulator's response to those actions, educators can gain insight into the simulator's clinical accuracy and suitability. Unexpected positive FTA ratings in the presence of functional deficits suggest that further revision of the survey method is required.
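Lawshe's method, used above to aggregate checklist items, centers on the content validity ratio CVR = (n_e - N/2) / (N/2), where n_e of N panelists rate an item "essential". A minimal sketch with invented panel numbers:

```python
# Lawshe's content validity ratio: CVR = (n_e - N/2) / (N/2).
# Panel sizes below are invented for illustration.
def cvr(n_essential, n_experts):
    half = n_experts / 2
    return (n_essential - half) / half

print(cvr(9, 10))   # 0.8: strong agreement that the item is essential
print(cvr(5, 10))   # 0.0: the panel is split
```

Items whose CVR falls below a critical value for the panel size are dropped, which is how a pool of 39 candidate items can shrink to a 13-item checklist.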

  19. Sensitivity of polar ozone recovery predictions of the GMI 3D CTM to GCM and DAS dynamics

    NASA Astrophysics Data System (ADS)

    Considine, D.; Connell, P.; Strahan, S.; Douglass, A.; Rotman, D.

    2003-04-01

    The Global Modeling Initiative (GMI) 3-D chemistry and transport model has been used to generate 2 simulations of the 1995-2030 time period. The 36-year simulations both used the source gas and aerosol boundary conditions of the 2002 World Meteorological Organization assessment exercise MA2. The first simulation was based on a single year of meteorological data (winds, temperatures) generated by the new Goddard Space Flight Center "Finite Volume" General Circulation Model (FVGCM), repeated for each year of the simulation. The second simulation used a year of meteorological data generated by a new data assimilation system based on the FVGCM (FVDAS), using observations for July 1, 1999 - June 30, 2000. All other aspects of the two simulations were identical. The increase in vortex-averaged south polar springtime ozone concentrations in the lower stratosphere over the course of the simulations is more robust in the simulation driven by the GCM meteorological data than in the simulation driven by DAS winds. At the same time, the decrease in estimated chemical springtime ozone loss is similar. We thus attribute the differences between the two simulations to differences in the representations of polar dynamics which reduce the sensitivity of the simulation driven by DAS winds to changes in vortex chemistry. We also evaluate the representations in the two simulations of trace constituent distributions in the current polar lower stratosphere using various observations. In these comparisons the GCM-based simulation often is in better agreement with the observations than the DAS-based simulation.

  20. The effects of simulated patients and simulated gynecologic models on student anxiety in providing IUD services.

    PubMed

    Khadivzadeh, Talat; Erfanian, Fatemeh

    2012-10-01

    Midwifery students experience high levels of stress during their initial clinical practices. Addressing the learner's source of anxiety and discomfort can ease the learning experience and lead to better outcomes. The aim of this study was to find out the effect of a simulation-based course, using simulated patients and simulated gynecologic models, on student anxiety and comfort while practicing to provide intrauterine device (IUD) services. Fifty-six eligible midwifery students were randomly allocated into simulation-based and traditional training groups. They participated in a 12-hour workshop in providing IUD services. The simulation group was trained through an educational program including simulated gynecologic models and simulated patients. The students in both groups then practiced IUD consultation and insertion with real patients in the clinic. The students' anxiety in IUD insertion was assessed using the "Spielberger anxiety test" and the "comfort in providing IUD services" questionnaire. There were significant differences between the simulation and traditional groups in two aspects of anxiety, state (P < 0.001) and trait (P = 0.024), and in the level of comfort (P = 0.000) in providing IUD services. "Fear of uterine perforation during insertion" was the most important cause of students' anxiety in providing IUD services, reported by 74.34% of students. Simulated patients and simulated gynecologic models are effective in optimizing students' anxiety levels when practicing to deliver IUD services. Therefore, it is recommended that simulated patients and simulated gynecologic models be used before engaging students in real clinical practice.

  1. Mobile Simulation Unit: taking simulation to the surgical trainee.

    PubMed

    Pena, Guilherme; Altree, Meryl; Babidge, Wendy; Field, John; Hewett, Peter; Maddern, Guy

    2015-05-01

    Simulation-based training has become an increasingly accepted part of surgical training. However, simulators are still not widely available to surgical trainees. Some factors that hinder the widespread implementation of simulation-based training are the lack of standardized methods and equipment, costs and time constraints. We have developed a Mobile Simulation Unit (MSU) that enables trainees to access modern simulation equipment tailored to the needs of the learner at the trainee's workplace. From July 2012 to December 2012, the MSU visited six hospitals in South Australia, four in metropolitan and two in rural areas. Resident Medical Officers, surgical trainees, Fellows and International Medical Graduates were invited to voluntarily utilize a variety of surgical simulators on offer. Participants were asked to complete a survey about the accessibility of simulation equipment at their workplace, environment of the MSU, equipment available and instruction received. Utilization data were collected. The MSU was available for a total of 303 h over 52 days. Fifty-five participants were enrolled in the project and each spent on average 118 min utilizing the simulators. The utilization of the total available time was 36%. Participants reported having a poor access to simulation at their workplace and overwhelmingly gave positive feedback regarding their experience in the MSU. The use of the MSU to provide simulation-based education in surgery is feasible and practical. The MSU provides consistent simulation training at the surgical trainee's workplace, regardless of geographic location, and it has the potential to increase participation in simulation programmes. © 2014 Royal Australasian College of Surgeons.

  2. Visualization and simulation techniques for surgical simulators using actual patient's data.

    PubMed

    Radetzky, Arne; Nürnberger, Andreas

    2002-11-01

    Because of the increasing complexity of surgical interventions, research in surgical simulation has become more and more important in recent years. However, the simulation of tissue deformation is still a challenging problem, mainly due to the short response times that are required for real-time interaction. The demands on hardware and software are even greater if not only modeled human anatomy but the anatomy of actual patients is used. This is required if the surgical simulator is to be used as a training medium for expert surgeons rather than students. In this article, suitable visualization and simulation methods for surgical simulation utilizing actual patients' datasets are described. The advantages and disadvantages of direct and indirect volume rendering for the visualization are discussed, and a neuro-fuzzy system is described which can be used for the simulation of interactive tissue deformations. The neuro-fuzzy system makes it possible to define the deformation behavior based on a linguistic description of the tissue characteristics or to learn the dynamics by using measured data of real tissue. Furthermore, a simulator for minimally-invasive neurosurgical interventions is presented that utilizes the described visualization and simulation methods. The structure of the simulator is described in detail and the results of a system evaluation by an experienced neurosurgeon--a quantitative comparison between different methods of virtual endoscopy as well as a comparison between real brain images and virtual endoscopies--are given. The evaluation proved that the simulator provides higher realism of visualization and simulation than other currently available simulators. Copyright 2002 Elsevier Science B.V.

  3. Bringing consistency to simulation of population models--Poisson simulation as a bridge between micro and macro simulation.

    PubMed

    Gustafsson, Leif; Sternad, Mikael

    2007-10-01

    Population models concern collections of discrete entities such as atoms, cells, humans, animals, etc., where the focus is on the number of entities in a population. Because of the complexity of such models, simulation is usually needed to reproduce their complete dynamic and stochastic behaviour. Two main types of simulation models are used for different purposes, namely micro-simulation models, where each individual is described with its particular attributes and behaviour, and macro-simulation models based on stochastic differential equations, where the population is described in aggregated terms by the number of individuals in different states. Consistency between micro- and macro-models is a crucial but often neglected aspect. This paper demonstrates how the Poisson Simulation technique can be used to produce a population macro-model consistent with the corresponding micro-model. This is accomplished by defining Poisson Simulation in strictly mathematical terms as a series of Poisson processes that generate sequences of Poisson distributions with dynamically varying parameters. The method can be applied to any population model. It provides the unique stochastic and dynamic macro-model consistent with a correct micro-model. The paper also presents a general macro form for stochastic and dynamic population models. In an appendix Poisson Simulation is compared with Markov Simulation showing a number of advantages. Especially aggregation into state variables and aggregation of many events per time-step makes Poisson Simulation orders of magnitude faster than Markov Simulation. Furthermore, you can build and execute much larger and more complicated models with Poisson Simulation than is possible with the Markov approach.
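The core idea, drawing each time step's event counts from Poisson distributions with dynamically varying parameters, can be sketched for a simple birth-death macro-model. The rates, step size, and horizon below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def simulate(n0=1000, birth=0.1, death=0.12, dt=0.1, steps=100, seed=0):
    """One Poisson Simulation run of a birth-death population model:
    event counts per step are Poisson with state-dependent rates."""
    rng = np.random.default_rng(seed)
    n = n0
    for _ in range(steps):
        births = rng.poisson(birth * n * dt)   # Po(birth * n * dt)
        deaths = rng.poisson(death * n * dt)   # Po(death * n * dt)
        n = max(n + births - deaths, 0)
    return n

# With death > birth the mean decays roughly like
# n0 * exp((birth - death) * steps * dt), about 819 here, plus noise.
print(simulate())
```

Because the state variable is an aggregate count rather than a list of individuals, many events are processed per step, which is the source of the speed advantage over micro- and Markov simulation noted in the abstract.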

  4. Laparoscopic skills acquisition: a study of simulation and traditional training.

    PubMed

    Marlow, Nicholas; Altree, Meryl; Babidge, Wendy; Field, John; Hewett, Peter; Maddern, Guy J

    2014-12-01

    Training in basic laparoscopic skills can be undertaken using traditional methods, where trainees are educated by experienced surgeons through a process of graduated responsibility, or by simulation-based training. This study aimed to assess whether simulation-trained individuals reach the same level of proficiency in basic laparoscopic skills as traditionally trained participants when assessed in a simulated environment. A prospective study was undertaken. Participants were allocated to one of two cohorts according to surgical experience. Participants from the inexperienced cohort were randomized to receive training in basic laparoscopic skills on either a box trainer or a virtual reality simulator. They were then assessed on the simulator on which they did not receive training. Participants from the experienced cohort, considered to have received traditional training in basic laparoscopic skills, did not receive simulation training and were randomized to either the box trainer or virtual reality simulator for skills assessment. The assessment scores from different cohorts on either simulator were then compared. A total of 138 participants completed the assessment session, 101 in the inexperienced simulation-trained cohort and 37 in the experienced traditionally trained cohort. There was no statistically significant difference between the training outcomes of simulation and traditionally trained participants, irrespective of the simulator type used. The results demonstrated that participants trained on either a box trainer or virtual reality simulator achieved a level of basic laparoscopic skills, assessed in a simulated environment, that was not significantly different from participants who had been traditionally trained in basic laparoscopic skills. © 2013 Royal Australasian College of Surgeons.

  5. Surgical simulators in urological training--views of UK Training Programme Directors.

    PubMed

    Forster, James A; Browning, Anthony J; Paul, Alan B; Biyani, C Shekhar

    2012-09-01

    What's known on the subject? and What does the study add? The role of surgical simulators is currently being debated in urological and other surgical specialties. Simulators are not presently implemented in the UK urology training curriculum. The availability of simulators and the opinions of Training Programme Directors (TPDs) on their role have not been described. In the present questionnaire-based survey, the trainees of most, but not all, UK TPDs had access to laparoscopic simulators, and all responding TPDs thought that simulators improved laparoscopic training. We hope that the present study will be a positive step towards making an agreement to formally introduce simulators into the UK urology training curriculum. To discuss the current situation on the use of simulators in surgical training. To determine the views of UK Urology Training Programme Directors (TPDs) on the availability and use of simulators in Urology at present, and to discuss the role that simulators may have in future training. An online questionnaire survey was distributed to all UK Urology TPDs. In all, 16 of 21 TPDs responded. All 16 thought that laparoscopic simulators improved the quality of laparoscopic training. The trainees of 13 TPDs had access to a laparoscopic simulator (either in their own hospital or another hospital in the deanery). Most TPDs thought that trainees should use simulators in their free time, in quiet time during work hours, or in teaching sessions (rather than incorporated into the weekly timetable). We feel that the current apprentice-style method of training in urological surgery is out-dated. We think that all TPDs and trainees should have access to a simulator, and that a formal competency based simulation training programme should be incorporated into the urology training curriculum, with trainees reaching a minimum proficiency on a simulator before undertaking surgical procedures. © 2012 THE AUTHORS. BJU INTERNATIONAL © 2012 BJU INTERNATIONAL.

  6. Design and Test of Advanced Thermal Simulators for an Alkali Metal-Cooled Reactor Simulator

    NASA Technical Reports Server (NTRS)

    Garber, Anne E.; Dickens, Ricky E.

    2011-01-01

    The Early Flight Fission Test Facility (EFF-TF) at NASA Marshall Space Flight Center (MSFC) has as one of its primary missions the development and testing of fission reactor simulators for space applications. A key component in these simulated reactors is the thermal simulator, designed to closely mimic the form and function of a nuclear fuel pin using electric heating. Continuing effort has been made to design simple, robust, inexpensive thermal simulators that closely match the steady-state and transient performance of a nuclear fuel pin. A series of these simulators has been designed, developed, fabricated and tested individually and in a number of simulated reactor systems at the EFF-TF. The purpose of the thermal simulators developed under the Fission Surface Power (FSP) task is to ensure that non-nuclear testing can be performed at sufficiently high fidelity to allow a cost-effective qualification and acceptance strategy to be used. Prototype thermal simulator design is founded on the baseline Fission Surface Power reactor design. Recent efforts have been focused on the design, fabrication and test of a prototype thermal simulator appropriate for use in the Technology Demonstration Unit (TDU). While designing the thermal simulators described in this paper, efforts were made to improve the axial power profile matching of the thermal simulators. Simultaneously, a search was conducted for graphite materials with higher resistivities than had been employed in the past. The combination of these two efforts resulted in the creation of thermal simulators with power capacities of 2300-3300 W per unit. Six of these elements were installed in a simulated core and tested in the alkali metal-cooled Fission Surface Power Primary Test Circuit (FSP-PTC) at a variety of liquid metal flow rates and temperatures. This paper documents the design of the thermal simulators, the test program, and the test results.

  7. FERN - a Java framework for stochastic simulation and evaluation of reaction networks.

    PubMed

    Erhard, Florian; Friedel, Caroline C; Zimmer, Ralf

    2008-08-29

    Stochastic simulation can be used to illustrate the development of biological systems over time and the stochastic nature of these processes. Currently available programs for stochastic simulation, however, are limited in that they either (a) do not provide the most efficient simulation algorithms and are difficult to extend, (b) cannot be easily integrated into other applications, or (c) do not allow monitoring of and intervention in the simulation process in an easy and intuitive way. Thus, in order to use stochastic simulation in innovative high-level modeling and analysis approaches, more flexible tools are necessary. In this article, we present FERN (Framework for Evaluation of Reaction Networks), a Java framework for the efficient simulation of chemical reaction networks. FERN is subdivided into three layers for network representation, simulation and visualization of the simulation results, each of which can be easily extended. It provides efficient and accurate state-of-the-art stochastic simulation algorithms for well-mixed chemical systems and a powerful observer system, which makes it possible to track and control the simulation progress on every level. To illustrate how FERN can be easily integrated into other systems biology applications, plugins to Cytoscape and CellDesigner are included. These plugins make it possible to run simulations and to observe the simulation progress in a reaction network in real-time from within the Cytoscape or CellDesigner environment. FERN addresses shortcomings of currently available stochastic simulation programs in several ways. First, it provides a broad range of efficient and accurate algorithms both for exact and approximate stochastic simulation and a simple interface for extending to new algorithms. FERN's implementations are considerably faster than the C implementations of gillespie2 or the Java implementations of ISBJava. Second, it can be used in a straightforward way both as a stand-alone program and within new systems biology applications. Finally, complex scenarios requiring intervention during the simulation progress can be modelled easily with FERN.
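The kind of exact stochastic simulation algorithm FERN provides can be illustrated with Gillespie's direct method for a single decay reaction A -> B. FERN itself is Java and far more general; this minimal Python sketch uses an invented rate constant and population:

```python
import math
import random

def gillespie_decay(n_a=100, c=1.0, t_end=50.0, seed=1):
    """Gillespie's direct method for the single reaction A -> B with
    stochastic rate constant c (illustrative values)."""
    rng = random.Random(seed)
    t, n_b = 0.0, 0
    while n_a > 0:
        a0 = c * n_a                              # total propensity
        tau = -math.log(1.0 - rng.random()) / a0  # exponential waiting time
        if t + tau > t_end:
            break
        t += tau
        n_a -= 1                                  # fire A -> B once
        n_b += 1
    return n_a, n_b

na, nb = gillespie_decay()
print(na + nb)  # molecule count is conserved: prints 100
```

With several reactions, the direct method additionally picks which reaction fires in proportion to its share of the total propensity; approximate methods such as tau-leaping trade this exactness for speed, which is the exact/approximate split the abstract refers to.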

  8. Mobile in Situ Simulation as a Tool for Evaluation and Improvement of Trauma Treatment in the Emergency Department.

    PubMed

    Amiel, Imri; Simon, Daniel; Merin, Ofer; Ziv, Amitai

    2016-01-01

    Medical simulation is an increasingly recognized tool for teaching, coaching, training, and examining practitioners in the medical field. For many years, simulation has been used to improve trauma care and teamwork. Despite technological advances in trauma simulators, including better means of mobilization and control, most reported simulation-based trauma training has been conducted inside simulation centers, and the practice of mobile simulation in hospitals' trauma rooms has not been investigated fully. The emergency department personnel from a second-level trauma center in Israel were evaluated. Divided into randomly formed trauma teams, they were reviewed twice using in situ mobile simulation training at the hospital's trauma bay. In all, 4 simulations were held before and 4 simulations were held after a structured learning intervention. The intervention included a 1-day simulation-based training conducted at the Israel Center for Medical Simulation (MSR), which included video-based debriefing facilitated by the hospital's 4 trauma team leaders, who completed a 2-day simulation-based instructors' course before the start of the study. The instructors were also trained in performance rating and thus were responsible for the assessment of their respective teams in real time as well as through review of the recorded videos, enabling a comparison of the performances in the mobile simulation exercise before and after the educational intervention. The internal reliability of the experts' evaluation, calculated with the Cronbach α model, was found to be 0.786. Statistically significant improvement was observed in 4 of 10 parameters, among which were teamwork (29.64%) and communication (24.48%) (p = 0.00005). The mobile in situ simulation-based training demonstrated efficacy both as an assessment tool for trauma teams' function and as an educational intervention when coupled with in vitro simulation-based training, resulting in a significant improvement of the teams' function in various aspects of treatment. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  9. Fluid, solid and fluid-structure interaction simulations on patient-based abdominal aortic aneurysm models.

    PubMed

    Kelly, Sinead; O'Rourke, Malachy

    2012-04-01

    This article describes the use of fluid, solid and fluid-structure interaction simulations on three patient-based abdominal aortic aneurysm geometries. All simulations were carried out using OpenFOAM, which uses the finite volume method to solve both fluid and solid equations. Initially a fluid-only simulation was carried out on a single patient-based geometry and results from this simulation were compared with experimental results. There was good qualitative and quantitative agreement between the experimental and numerical results, suggesting that OpenFOAM is capable of predicting the main features of unsteady flow through a complex patient-based abdominal aortic aneurysm geometry. The intraluminal thrombus and arterial wall were then included, and solid stress and fluid-structure interaction simulations were performed on this, and two other patient-based abdominal aortic aneurysm geometries. It was found that the solid stress simulations resulted in an under-estimation of the maximum stress by up to 5.9% when compared with the fluid-structure interaction simulations. In the fluid-structure interaction simulations, flow induced pressure within the aneurysm was found to be up to 4.8% higher than the value of peak systolic pressure imposed in the solid stress simulations, which is likely to be the cause of the variation in the stress results. In comparing the results from the initial fluid-only simulation with results from the fluid-structure interaction simulation on the same patient, it was found that wall shear stress values varied by up to 35% between the two simulation methods. It was concluded that solid stress simulations are adequate to predict the maximum stress in an aneurysm wall, while fluid-structure interaction simulations should be performed if accurate prediction of the fluid wall shear stress is necessary. 
Therefore, the decision to perform fluid-structure interaction simulations should be based on the particular variables of interest in a given study.

  10. Reproducible computational biology experiments with SED-ML - The Simulation Experiment Description Markup Language

    PubMed Central

    2011-01-01

    Background The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. Results In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. Conclusions With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. 
Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from different fields of research can be accurately described and combined. PMID:22172142
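The record above describes SED-ML's main building blocks: models, time-course simulations, and tasks that bind the two. A minimal sketch of how such a Level 1 Version 1 document might be assembled follows; the element and attribute names are taken from the published SED-ML L1V1 specification as best understood here, and the model source `model.xml` and all `id` values are illustrative assumptions, not a complete or schema-validated document.

```python
import xml.etree.ElementTree as ET

# SED-ML L1V1 default namespace (per the SED-ML specification).
NS = "http://sed-ml.org/"
ET.register_namespace("", NS)

def q(tag):
    # Qualify a tag name with the SED-ML namespace.
    return f"{{{NS}}}{tag}"

root = ET.Element(q("sedML"), {"level": "1", "version": "1"})

# A uniform time-course simulation: the most common experiment type.
sims = ET.SubElement(root, q("listOfSimulations"))
ET.SubElement(sims, q("uniformTimeCourse"), {
    "id": "sim1", "initialTime": "0", "outputStartTime": "0",
    "outputEndTime": "100", "numberOfPoints": "1000",
})

# The model to simulate; source and language are assumptions for this sketch.
models = ET.SubElement(root, q("listOfModels"))
ET.SubElement(models, q("model"), {
    "id": "model1",
    "language": "urn:sedml:language:sbml",
    "source": "model.xml",
})

# A task links one model to one simulation.
tasks = ET.SubElement(root, q("listOfTasks"))
ET.SubElement(tasks, q("task"), {
    "id": "task1", "modelReference": "model1", "simulationReference": "sim1",
})

xml_text = ET.tostring(root, encoding="unicode")
```

The resulting `xml_text` holds a small SED-ML-shaped document; a real experiment description would also carry data generators and outputs, which are omitted here.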

  11. A Primer on Simulation and Gaming.

    ERIC Educational Resources Information Center

    Barton, Richard F.

    In a primer intended for the administrative professions, for the behavioral sciences, and for education, simulation and its various aspects are defined, illustrated, and explained. Man-model simulation, man-computer simulation, all-computer simulation, and analysis are discussed as techniques for studying object systems (parts of the "real…

  12. A Study of Umbilical Communication Interface of Simulator Kernel to Enhance Visibility and Controllability

    NASA Astrophysics Data System (ADS)

    Koo, Cheol Hea; Lee, Hoon Hee; Moon, Sung Tae; Han, Sang Hyuck; Ju, Gwang Hyeok

    2013-08-01

In aerospace research and development, the use of simulation in software development, component design, and system operation continues to grow, and the pace of adoption keeps increasing. This trend reflects both the ease of handling simulations and the power of their output. Simulation offers several benefits: it is easy to handle, since it cannot be broken or damaged by mistake; it never wears out or grows old; and it is cost effective, since once built it can be distributed to hundreds or thousands of users. GenSim (Generic Simulator), which is being developed by KARI and is compatible with the ESA SMP standard, provides such a simulation platform to support flight software validation and mission operation verification. The user interface of GenSim is shown in Figure 1 [1,2]. As is typical of simulation platforms, GenSim provides a GRD (Graphical Display) and an AND (Alpha Numeric Display). However, actual system validation, for example mission operation, frequently requires more complex and powerful handling of the simulated data. In Figure 2, a system simulation result for COMS (Communication, Ocean, and Meteorological Satellite, launched on June 26, 2010) is rendered by the Celestia 3D program. In this case, the data needed by Celestia is supplied by one of the simulation models resident in the system simulator over a UDP network connection. However, the required display format, data size, and communication rate vary, so the developer has to manage the connection protocol manually for each case. This complicates the design and development of the simulation models and ultimately raises performance issues. A performance issue occurs when the volume of required data exceeds the capacity of the simulation kernel to process it safely.
The underlying problem is that the data sent to a visualization tool such as Celestia is produced by a simulation model, not by the kernel. Because the simulation model has no way to know the kernel's load in processing simulation events, it sends the data as frequently as it sees fit. This can create many potential problems, such as lack of responsiveness, missed deadlines, and data-integrity problems with the model data during the simulation. SIMSAT and EuroSim issue a warning message if a user-requested event, such as printing a log, cannot be processed as planned or requested; as a consequence the requested event is delayed or not processed at all, which may violate the planned deadline. In most soft real-time simulations this can be neglected and merely inconveniences users, but if user requests are not managed properly in critical situations, the simulation results may end in disorder. Having traced these disadvantages of letting the simulation model service user requests, we conclude that the simulation model is not the appropriate place to provide such a service, and this kind of work should be minimized as much as possible.
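The flooding problem described above — a simulation model streaming telemetry over UDP faster than the receiving side or the kernel can absorb it — can be mitigated with a simple rate limiter on the sending side. The sketch below is an illustrative assumption, not part of GenSim or Celestia: the class name `ThrottledTelemetry`, the port 5005, and the minimum-interval policy are all invented for the example.

```python
import socket
import time

class ThrottledTelemetry:
    """Send UDP telemetry, dropping messages that arrive faster
    than a configured minimum interval (hypothetical sketch)."""

    def __init__(self, host="127.0.0.1", port=5005, min_interval=0.1):
        self.addr = (host, port)
        self.min_interval = min_interval  # seconds between sends
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self._last = float("-inf")
        self.sent = 0
        self.dropped = 0

    def maybe_send(self, payload: bytes, now=None) -> bool:
        """Send payload unless the previous send was too recent.
        Returns True if the datagram was actually sent."""
        now = time.monotonic() if now is None else now
        if now - self._last < self.min_interval:
            self.dropped += 1   # too soon: shed load instead of flooding
            return False
        self.sock.sendto(payload, self.addr)  # fire-and-forget UDP
        self._last = now
        self.sent += 1
        return True
```

A fixed minimum interval is the simplest policy; a production design would more likely let the kernel publish its load so senders could adapt, which is the direction the record argues for.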

  13. Simulating Issue Networks in Small Classes using the World Wide Web.

    ERIC Educational Resources Information Center

    Josefson, Jim; Casey, Kelly

    2000-01-01

    Provides background information on simulations and active learning. Discusses the use of simulations in political science courses. Describes a simulation exercise where students performed specific institutional role playing, simulating the workings of a single congressional issue network, based on the reauthorization of the Endangered Species Act.…

  14. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Advanced Simulation H Appendix H to Part... Simulation This appendix provides guidelines and a means for achieving flightcrew training in advanced... simulator, as appropriate. Advanced Simulation Training Program For an operator to conduct Level C or D...

  15. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 3 2012-01-01 2012-01-01 false Advanced Simulation H Appendix H to Part... Simulation This appendix provides guidelines and a means for achieving flightcrew training in advanced... simulator, as appropriate. Advanced Simulation Training Program For an operator to conduct Level C or D...

  16. 14 CFR Appendix H to Part 121 - Advanced Simulation

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 3 2013-01-01 2013-01-01 false Advanced Simulation H Appendix H to Part... Simulation This appendix provides guidelines and a means for achieving flightcrew training in advanced... simulator, as appropriate. Advanced Simulation Training Program For an operator to conduct Level C or D...

  17. Do Simulations Enhance Student Learning? An Empirical Evaluation of an IR Simulation

    ERIC Educational Resources Information Center

    Shellman, Stephen M.; Turan, Kursad

    2006-01-01

    There is a nascent literature on the question of whether active learning methods, and in particular simulation methods, enhance student learning. In this article, the authors evaluate the utility of an international relations simulation in enhancing learning objectives. Student evaluations provide evidence that the simulation process enhances…

  18. 14 CFR 121.921 - Training devices and simulators.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... devices and simulators. (a) Each flight training device or airplane simulator that will be used in an AQP... device or flight simulator qualification level: (1) Required evaluation of individual or crew proficiency... 14 Aeronautics and Space 3 2013-01-01 2013-01-01 false Training devices and simulators. 121.921...

  19. 14 CFR 121.921 - Training devices and simulators.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... devices and simulators. (a) Each flight training device or airplane simulator that will be used in an AQP... device or flight simulator qualification level: (1) Required evaluation of individual or crew proficiency... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Training devices and simulators. 121.921...

  20. 14 CFR 121.921 - Training devices and simulators.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... devices and simulators. (a) Each flight training device or airplane simulator that will be used in an AQP... device or flight simulator qualification level: (1) Required evaluation of individual or crew proficiency... 14 Aeronautics and Space 3 2014-01-01 2014-01-01 false Training devices and simulators. 121.921...
