Sample records for simulant variability study

  1. Temporal Variability of Observed and Simulated Hyperspectral Earth Reflectance

    NASA Technical Reports Server (NTRS)

    Roberts, Yolanda; Pilewskie, Peter; Kindel, Bruce; Feldman, Daniel; Collins, William D.

    2012-01-01

    The Climate Absolute Radiance and Refractivity Observatory (CLARREO) is a climate observation system designed to study Earth's climate variability with unprecedented absolute radiometric accuracy and SI traceability. Observation System Simulation Experiments (OSSEs) were developed using GCM output and MODTRAN to simulate CLARREO reflectance measurements during the 21st century as a design tool for the CLARREO hyperspectral shortwave imager. With OSSE simulations of hyperspectral reflectance, Feldman et al. [2011a,b] found that shortwave reflectance is able to detect changes in climate variables during the 21st century and improve time-to-detection compared to broadband measurements. The OSSE has been a powerful tool in the design of the CLARREO imager and for understanding the effect of climate change on the spectral variability of reflectance, but it is important to evaluate how well the OSSE simulates the Earth's present-day spectral variability. For this evaluation we have used hyperspectral reflectance measurements from the Scanning Imaging Absorption Spectrometer for Atmospheric Cartography (SCIAMACHY), a shortwave spectrometer that was operational between March 2002 and April 2012. To study the spectral variability of SCIAMACHY-measured and OSSE-simulated reflectance, we used principal component analysis (PCA), a spectral decomposition technique that identifies dominant modes of variability in a multivariate data set. Using quantitative comparisons of the OSSE and SCIAMACHY PCs, we have quantified how well the OSSE captures the spectral variability of Earth's climate system at the beginning of the 21st century relative to SCIAMACHY measurements. These results showed that the OSSE and SCIAMACHY data sets share over 99% of their total variance in 2004. Using the PCs and the temporally distributed reflectance spectra projected onto the PCs (PC scores), we can study the temporal variability of the observed and simulated reflectance spectra. Multivariate time series analysis of the PC scores using techniques such as Singular Spectrum Analysis (SSA) and Multichannel SSA will provide information about the temporal variability of the dominant variables. Quantitative comparison techniques can evaluate how well the OSSE reproduces the temporal variability observed by SCIAMACHY spectral reflectance measurements during the first decade of the 21st century. PCA of OSSE-simulated reflectance can also be used to study how the dominant spectral variables change on centennial scales for forced and unforced climate change scenarios. To have confidence in OSSE predictions of the spectral variability of hyperspectral reflectance, it is first necessary for us to evaluate the degree to which the OSSE simulations are able to reproduce the Earth's present-day spectral variability.
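
    As context for the PCA comparison described above, the following is a minimal illustrative sketch (not the authors' code) of how the dominant modes of two hyperspectral reflectance datasets can be compared and a shared-variance fraction computed; the arrays `osse_spectra` and `sciamachy_spectra` are hypothetical stand-ins.

```python
# Illustrative sketch only: compare dominant PCA modes of simulated (OSSE) and
# observed (SCIAMACHY) reflectance and estimate a shared-variance fraction.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Hypothetical stand-ins: rows are spectra (samples in time/space), columns are wavelengths.
osse_spectra = rng.normal(size=(500, 200))       # simulated reflectance
sciamachy_spectra = rng.normal(size=(500, 200))  # observed reflectance

def leading_pcs(spectra, n_modes=10):
    """Return the leading principal components (dominant modes of spectral variability)."""
    pca = PCA(n_components=n_modes).fit(spectra)
    return pca.components_, pca.explained_variance_ratio_

pcs_sim, var_sim = leading_pcs(osse_spectra)
pcs_obs, var_obs = leading_pcs(sciamachy_spectra)

# One simple measure of shared variability: the fraction of the observed
# anomalies' variance captured by projecting them onto the simulated PCs.
anom_obs = sciamachy_spectra - sciamachy_spectra.mean(axis=0)
projected = anom_obs @ pcs_sim.T @ pcs_sim
shared_fraction = 1.0 - np.var(anom_obs - projected) / np.var(anom_obs)
print(f"fraction of observed variance captured by simulated PCs: {shared_fraction:.3f}")
```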

  2. Skill of ENSEMBLES seasonal re-forecasts for malaria prediction in West Africa

    NASA Astrophysics Data System (ADS)

    Jones, A. E.; Morse, A. P.

    2012-12-01

    This study examines the performance of malaria-relevant climate variables from the ENSEMBLES seasonal ensemble re-forecasts for sub-Saharan West Africa, using a dynamic malaria model to transform temperature and rainfall forecasts into simulated malaria incidence and verifying these forecasts against simulations obtained by driving the malaria model with General Circulation Model-derived reanalysis. Two subregions of forecast skill are identified: the highlands of Cameroon, where low temperatures limit simulated malaria during the forecast period and interannual variability in simulated malaria is closely linked to variability in temperature, and northern Nigeria/southern Niger, where simulated malaria variability is strongly associated with rainfall variability during the peak rain months.

  3. Simulating tracer transport in variably saturated soils and shallow groundwater

    USDA-ARS's Scientific Manuscript database

    The objective of this study was to develop a realistic model to simulate the complex processes of flow and tracer transport in variably saturated soils and to compare simulation results with the detailed monitoring observations. The USDA-ARS OPE3 field site was selected for the case study due to ava...

  4. Simulating variable source problems via post processing of individual particle tallies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bleuel, D.L.; Donahue, R.J.; Ludewigt, B.A.

    2000-10-20

    Monte Carlo is an extremely powerful method of simulating complex, three dimensional environments without excessive problem simplification. However, it is often time consuming to simulate models in which the source can be highly varied. Similarly difficult are optimization studies involving sources in which many input parameters are variable, such as particle energy, angle, and spatial distribution. Such studies are often approached using brute force methods or intelligent guesswork. One field in which these problems are often encountered is accelerator-driven Boron Neutron Capture Therapy (BNCT) for the treatment of cancers. Solving the reverse problem of determining the best neutron source for optimal BNCT treatment can be accomplished by separating the time-consuming particle-tracking process of a full Monte Carlo simulation from the calculation of the source weighting factors, which is typically performed at the beginning of a Monte Carlo simulation. By post-processing these weighting factors on a recorded file of individual particle tally information, the effect of changing source variables can be realized in a matter of seconds, instead of requiring hours or days for additional complete simulations. By intelligent source biasing, any number of different source distributions can be calculated quickly from a single Monte Carlo simulation. The source description can be treated as variable and the effect of changing multiple interdependent source variables on the problem's solution can be determined. Though the focus of this study is on BNCT applications, this procedure may be applicable to any problem that involves a variable source.
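
    The post-processing idea described above can be sketched roughly as follows (a hypothetical data layout, not the authors' implementation): each recorded particle carries the source variables it was born with together with its tally contribution, so a new source distribution can be applied as a per-particle weight without re-running the transport calculation.

```python
# Hypothetical data layout, not the authors' code: re-weight recorded per-particle
# tallies for a new source spectrum instead of re-running the transport simulation.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
birth_energy = rng.uniform(0.0, 10.0, n)   # MeV, sampled flat in the original run
tally = rng.exponential(1.0, n)            # e.g. dose contribution per source particle

def retally(new_source_pdf):
    """Estimate the tally for a new source energy spectrum by per-particle re-weighting.

    The original run sampled energies uniformly on [0, 10) MeV, so each weight is
    new_pdf(E) / original_pdf(E) with original_pdf = 1/10.
    """
    weights = new_source_pdf(birth_energy) / (1.0 / 10.0)
    return np.average(tally, weights=weights)

# Evaluate a softened (exponential) source spectrum in seconds rather than
# running another complete simulation.
soft_spectrum = lambda e: np.exp(-e) / (1.0 - np.exp(-10.0))
print("re-weighted tally estimate:", retally(soft_spectrum))
```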

  5. Uncertainty analysis of the simulations of effects of discharging treated wastewater to the Red River of the North at Fargo, North Dakota, and Moorhead, Minnesota

    USGS Publications Warehouse

    Wesolowski, Edwin A.

    1996-01-01

    Two separate studies to simulate the effects of discharging treated wastewater to the Red River of the North at Fargo, North Dakota, and Moorhead, Minnesota, have been completed. In the first study, the Red River at Fargo Water-Quality Model was calibrated and verified for ice-free conditions. In the second study, the Red River at Fargo Ice-Cover Water-Quality Model was verified for ice-cover conditions. To better understand and apply the Red River at Fargo Water-Quality Model and the Red River at Fargo Ice-Cover Water-Quality Model, the uncertainty associated with simulated constituent concentrations and property values was analyzed and quantified using the Enhanced Stream Water Quality Model-Uncertainty Analysis. The Monte Carlo simulation and first-order error analysis methods were used to analyze the uncertainty in simulated values for six constituents and properties at sites 5, 10, and 14 (upstream to downstream order). The constituents and properties analyzed for uncertainty are specific conductance, total organic nitrogen (reported as nitrogen), total ammonia (reported as nitrogen), total nitrite plus nitrate (reported as nitrogen), 5-day carbonaceous biochemical oxygen demand for ice-cover conditions and ultimate carbonaceous biochemical oxygen demand for ice-free conditions, and dissolved oxygen. Results are given in detail for both the ice-cover and ice-free conditions for specific conductance, total ammonia, and dissolved oxygen. The sensitivity and uncertainty of the simulated constituent concentrations and property values to input variables differ substantially between ice-cover and ice-free conditions. During ice-cover conditions, simulated specific-conductance values are most sensitive to the headwater-source specific-conductance values upstream of site 10 and the point-source specific-conductance values downstream of site 10. These headwater-source and point-source specific-conductance values also are the key sources of uncertainty. Simulated total ammonia concentrations are most sensitive to the point-source total ammonia concentrations at all three sites. Other input variables that contribute substantially to the variability of simulated total ammonia concentrations are the headwater-source total ammonia and the instream reaction coefficient for biological decay of total ammonia to total nitrite. Simulated dissolved-oxygen concentrations at all three sites are most sensitive to headwater-source dissolved-oxygen concentration. This input variable is the key source of variability for simulated dissolved-oxygen concentrations at sites 5 and 10. Headwater-source and point-source dissolved-oxygen concentrations are the key sources of variability for simulated dissolved-oxygen concentrations at site 14. During ice-free conditions, simulated specific-conductance values at all three sites are most sensitive to the headwater-source specific-conductance values. Headwater-source specific-conductance values also are the key source of uncertainty. The input variables to which total ammonia and dissolved oxygen are most sensitive vary from site to site and may or may not correspond to the input variables that contribute the most to the variability. The input variables that contribute the most to the variability of simulated total ammonia concentrations are point-source total ammonia, instream reaction coefficient for biological decay of total ammonia to total nitrite, and Manning's roughness coefficient.
The input variables that contribute the most to the variability of simulated dissolved-oxygen concentrations are reaeration rate, sediment oxygen demand rate, and headwater-source algae as chlorophyll a.
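
    The two uncertainty methods named above (first-order error analysis and Monte Carlo simulation) can be illustrated with a toy stand-in for a water-quality response; the function and parameter values below are hypothetical and only show the mechanics.

```python
import numpy as np

def simulated_do(headwater_do, point_do, reaeration):
    """Toy dissolved-oxygen response to three input variables (illustrative only)."""
    return 0.6 * headwater_do + 0.3 * point_do + 2.0 * np.sqrt(reaeration)

means = np.array([8.0, 6.0, 0.5])   # hypothetical input means
sds = np.array([0.5, 0.8, 0.1])     # hypothetical input standard deviations

# First-order error analysis: variance is the sum of (dF/dx_i * sd_i)^2,
# with sensitivities estimated by finite differences.
eps = 1e-4
base = simulated_do(*means)
grads = np.array([(simulated_do(*(means + eps * np.eye(3)[i])) - base) / eps
                  for i in range(3)])
first_order_sd = np.sqrt(np.sum((grads * sds) ** 2))

# Monte Carlo simulation: propagate sampled inputs through the model.
rng = np.random.default_rng(2)
samples = rng.normal(means, sds, size=(10_000, 3))
reaer = np.clip(samples[:, 2], 0.0, None)        # keep the toy input physical
mc_sd = simulated_do(samples[:, 0], samples[:, 1], reaer).std()

print(f"first-order sd = {first_order_sd:.3f}, Monte Carlo sd = {mc_sd:.3f}")
```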

  6. Simulating maize yield and biomass with spatial variability of soil field capacity

    USDA-ARS's Scientific Manuscript database

    Spatial variability in field soil water and other properties is a challenge for system modelers who use only representative values for model inputs, rather than their distributions. In this study, we compared simulation results from a calibrated model with spatial variability of soil field capacity ...

  7. North Atlantic Jet Variability in PMIP3 LGM Simulations

    NASA Astrophysics Data System (ADS)

    Hezel, P.; Li, C.

    2017-12-01

    North Atlantic jet variability in glacial climates has been shown in modelling studies to be strongly influenced by upstream ice sheet topography. We analyze the results of 8 models from the PMIP3 simulations, forced with a hybrid Laurentide Ice Sheet topography, and compare them to the PMIP2 simulations, which were forced with the ICE-5G topography, to develop a general understanding of the North Atlantic jet and jet variability. The strengthening of the jet and reduced spatial variability is a robust feature of the last glacial maximum (LGM) simulations compared to the pre-industrial state. However, the canonical picture of the LGM North Atlantic jet as being more zonal and elongated compared to pre-industrial climate states is not a robust result across models, and may have arisen in the literature as a function of multiple studies performed with the same model.

  8. A sensitivity study of the coupled simulation of the Northeast Brazil rainfall variability

    NASA Astrophysics Data System (ADS)

    Misra, Vasubandhu

    2007-06-01

    Two long-term coupled ocean-land-atmosphere simulations with slightly different parameterizations of the diagnostic shallow inversion clouds in the atmospheric general circulation model (AGCM) of the Center for Ocean-Land-Atmosphere Studies (COLA) coupled climate model are compared in terms of the annual cycle and interannual variability of northeast Brazil (NEB) rainfall. The change in solar insolation caused by the modified shallow inversion clouds results in large-scale changes to the gradients of SST and surface pressure, which in turn modulate the surface convergence, the associated Atlantic ITCZ precipitation, and the annual cycle of NEB rainfall. In contrast, the differences in NEB interannual rainfall variability between the two coupled simulations are attributed to their different remote ENSO forcing.

  9. Simulation of South-Asian Summer Monsoon in a GCM

    NASA Astrophysics Data System (ADS)

    Ajayamohan, R. S.

    2007-10-01

    Major characteristics of the Indian summer monsoon climate are analyzed using simulations from the upgraded version of the Florida State University Global Spectral Model (FSUGSM). The Indian monsoon has been studied in terms of mean precipitation and low-level and upper-level circulation patterns and compared with observations. In addition, the model's fidelity in simulating observed monsoon intraseasonal variability, interannual variability and teleconnection patterns is examined. The model is successful in simulating the major rainbelts over the Indian monsoon region. However, the model exhibits bias in simulating the precipitation bands over the South China Sea and the West Pacific region. Seasonal mean circulation patterns of low-level and upper-level winds are consistent with the model's precipitation pattern. Basic features such as the onset and peak phase of the monsoon are realistically simulated. However, the model simulation indicates an early withdrawal of the monsoon. Northward propagation of rainbelts over the Indian continent is simulated fairly well, but the propagation is weak over the ocean. The model simulates the meridional dipole structure associated with monsoon intraseasonal variability realistically. The model is unable to capture the observed interannual variability of the monsoon and its teleconnection patterns. An estimate of the model's potential predictability reveals the dominant influence of internal variability over the Indian monsoon region.

  10. A Simulation Study of Missing Data with Multiple Missing X's

    ERIC Educational Resources Information Center

    Rubright, Jonathan D.; Nandakumar, Ratna; Glutting, Joseph J.

    2014-01-01

    When exploring missing data techniques in a realistic scenario, the current literature is limited: most studies only consider consequences with data missing on a single variable. This simulation study compares the relative bias of two commonly used missing data techniques when data are missing on more than one variable. Factors varied include type…
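
    A hedged sketch of the kind of simulation described above: generate data with missingness on more than one predictor, apply two common missing-data techniques (listwise deletion and mean imputation, used here only as examples), and compare the relative bias of the regression coefficients. All data-generating values are hypothetical.

```python
# Hypothetical data-generating setup, for illustration only.
import numpy as np

rng = np.random.default_rng(3)
true_beta = np.array([1.0, 0.5, -0.8])

def one_replication(n=500, miss_rate=0.3):
    X = rng.normal(size=(n, 3))
    y = X @ true_beta + rng.normal(scale=1.0, size=n)
    X_miss = X.copy()
    for j in (1, 2):                         # missingness on two predictors
        X_miss[rng.random(n) < miss_rate, j] = np.nan

    # Technique 1: listwise deletion.
    keep = ~np.isnan(X_miss).any(axis=1)
    b_del = np.linalg.lstsq(X_miss[keep], y[keep], rcond=None)[0]

    # Technique 2: mean imputation.
    X_imp = np.where(np.isnan(X_miss), np.nanmean(X_miss, axis=0), X_miss)
    b_imp = np.linalg.lstsq(X_imp, y, rcond=None)[0]
    return b_del, b_imp

reps = [one_replication() for _ in range(200)]
bias_del = np.mean([r[0] for r in reps], axis=0) / true_beta - 1.0
bias_imp = np.mean([r[1] for r in reps], axis=0) / true_beta - 1.0
print("relative bias, listwise deletion:", np.round(bias_del, 3))
print("relative bias, mean imputation:  ", np.round(bias_imp, 3))
```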

  11. Affected States Soft Independent Modeling by Class Analogy from the Relation Between Independent Variables, Number of Independent Variables and Sample Size

    PubMed Central

    Kanık, Emine Arzu; Temel, Gülhan Orekici; Erdoğan, Semra; Kaya, İrem Ersöz

    2013-01-01

    Objective: The aim of this study is to introduce the Soft Independent Modeling of Class Analogy (SIMCA) method and to assess whether it is affected by the number of independent variables, the relationship between the variables, and the sample size. Study Design: Simulation study. Material and Methods: The SIMCA model is performed in two stages. Simulations were carried out to determine whether the method is influenced by the number of independent variables, the relationship between the variables, and the sample size. The conditions considered were equal sample sizes in both groups of 30, 100 and 1000 samples; 2, 3, 5, 10, 50 and 100 variables; and relationships between variables that are quite high, medium, or quite low. Results: Average classification accuracies of the simulations, which were carried out 1000 times for each condition of the trial plan, are given as tables. Conclusion: Diagnostic accuracy increases as the number of independent variables increases. SIMCA is suited to data in which the relationship between variables is quite high, the number of independent variables is large, and outlier values are present. PMID:25207065
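
    A compact, simplified SIMCA-style classifier is sketched below (one PCA model per class, classification by smallest reconstruction residual); it is illustrative only and not the authors' implementation, and the two-class data are hypothetical.

```python
# Simplified, illustrative SIMCA-style classifier; hypothetical two-class data.
import numpy as np
from sklearn.decomposition import PCA

class SimpleSIMCA:
    """One PCA model per class; classify by smallest reconstruction residual."""

    def __init__(self, n_components=1):
        self.n_components = n_components
        self.models = {}

    def fit(self, X, y):
        for label in np.unique(y):
            Xc = X[y == label]
            self.models[label] = PCA(n_components=self.n_components).fit(Xc)
        return self

    def predict(self, X):
        labels = list(self.models)
        residuals = [np.linalg.norm(X - m.inverse_transform(m.transform(X)), axis=1)
                     for m in self.models.values()]
        return np.array(labels)[np.argmin(np.vstack(residuals), axis=0)]

rng = np.random.default_rng(4)
cov = np.full((3, 3), 0.8) + 0.2 * np.eye(3)           # highly correlated variables
X0 = rng.multivariate_normal([0.0, 0.0, 0.0], cov, 200)
X1 = rng.multivariate_normal([1.5, -1.5, 0.0], cov, 200)
X, y = np.vstack([X0, X1]), np.repeat([0, 1], 200)
print("training accuracy:", (SimpleSIMCA().fit(X, y).predict(X) == y).mean())
```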

  12. Observations and simulations of the ionospheric lunar tide: Seasonal variability

    NASA Astrophysics Data System (ADS)

    Pedatella, N. M.

    2014-07-01

    The seasonal variability of the ionospheric lunar tide is investigated using a combination of Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) observations and thermosphere-ionosphere-mesosphere electrodynamics general circulation model (TIME-GCM) simulations. The present study focuses on the seasonal variability of the lunar tide in the ionosphere and its potential connection to the occurrence of stratosphere sudden warmings (SSWs). COSMIC maximum F region electron density (NmF2) and total electron content observations reveal a primarily annual variation of the ionospheric lunar tide, with maximum amplitudes occurring at low latitudes during December-February. Simulations of the lunar tide climatology in TIME-GCM display a similar annual variability as the COSMIC observations. This leads to the conclusion that the annual variability of the lunar tide in the ionosphere is not solely due to the occurrence of SSWs. Rather, the annual variability of the lunar tide in the ionosphere is generated by the seasonal variability of the lunar tide at E region altitudes. However, compared to the observations, the ionospheric lunar tide annual variability is weaker in the climatological simulations which is attributed to the occurrence of SSWs during the majority of the years included in the observations. Introducing a SSW into the TIME-GCM simulation leads to an additional enhancement of the lunar tide during Northern Hemisphere winter, increasing the lunar tide annual variability and resulting in an annual variability that is more consistent with the observations. The occurrence of SSWs can therefore potentially bias lunar tide climatologies, and it is important to consider these effects in studies of the lunar tide in the atmosphere and ionosphere.

  13. Uncertainty Propagation of Non-Parametric-Derived Precipitation Estimates into Multi-Hydrologic Model Simulations

    NASA Astrophysics Data System (ADS)

    Bhuiyan, M. A. E.; Nikolopoulos, E. I.; Anagnostou, E. N.

    2017-12-01

    Quantifying the uncertainty of global precipitation datasets is beneficial when using these precipitation products in hydrological applications, because precipitation uncertainty propagation through hydrologic modeling can significantly affect the accuracy of the simulated hydrologic variables. In this research the Iberian Peninsula is used as the study area, with a study period spanning eleven years (2000-2010). This study evaluates the performance of multiple hydrologic models forced with combined global rainfall estimates derived with a Quantile Regression Forests (QRF) technique. The QRF technique combines three satellite precipitation products (CMORPH, PERSIANN, and 3B42 V7), an atmospheric reanalysis precipitation and air temperature dataset, satellite-derived near-surface daily soil moisture data, and a terrain elevation dataset. A high-resolution precipitation dataset derived from ground-based observations (SAFRAN), available at 5 km/1 h resolution, is used as the reference. Through the QRF blending framework the stochastic error model produces error-adjusted ensemble precipitation realizations, which are used to force four global hydrological models (JULES (Joint UK Land Environment Simulator), WaterGAP3 (Water-Global Assessment and Prognosis), ORCHIDEE (Organizing Carbon and Hydrology in Dynamic Ecosystems) and SURFEX (Surface Externalisée)) to simulate three hydrologic variables (surface runoff, subsurface runoff and evapotranspiration). The models are also forced with the reference precipitation to generate reference-based hydrologic simulations. This study presents a comparative analysis of multiple hydrologic model simulations for different hydrologic variables and the impact of the blending algorithm on the simulated hydrologic variables. Results show how precipitation uncertainty propagates through the different hydrologic model structures and manifests as a reduction of error in the simulated hydrologic variables.
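
    The quantile-based blending idea can be sketched as follows; this is only an approximation in the spirit of QRF (empirical quantiles of per-tree random forest predictions rather than true leaf-wise quantiles), and all predictor and reference arrays are hypothetical stand-ins.

```python
# Illustrative approximation in the spirit of QRF (quantiles of per-tree random
# forest predictions); all inputs are hypothetical stand-ins, not the study's data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(5)
n = 2000
# Hypothetical predictors: three satellite products, reanalysis precipitation and
# temperature, soil moisture, elevation.
X = rng.gamma(2.0, 2.0, size=(n, 7))
y = X[:, :3].mean(axis=1) * rng.lognormal(0.0, 0.3, n)   # stand-in "reference" precipitation

forest = RandomForestRegressor(n_estimators=200, min_samples_leaf=5, random_state=0)
forest.fit(X, y)

def ensemble_realizations(x_new, quantiles=(0.1, 0.25, 0.5, 0.75, 0.9)):
    """Error-adjusted ensemble members from the spread of per-tree predictions."""
    per_tree = np.array([tree.predict(x_new) for tree in forest.estimators_])
    return np.quantile(per_tree, quantiles, axis=0)

members = ensemble_realizations(X[:5])
print(members.round(2))   # rows: quantiles, columns: five example grid cells
```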

  14. Natural variability of marine ecosystems inferred from a coupled climate to ecosystem simulation

    NASA Astrophysics Data System (ADS)

    Le Mézo, Priscilla; Lefort, Stelly; Séférian, Roland; Aumont, Olivier; Maury, Olivier; Murtugudde, Raghu; Bopp, Laurent

    2016-01-01

    This modeling study analyzes the simulated natural variability of pelagic ecosystems in the North Atlantic and North Pacific. Our model system includes a global Earth System Model (IPSL-CM5A-LR), the biogeochemical model PISCES and the ecosystem model APECOSM that simulates upper trophic level organisms using a size-based approach and three interactive pelagic communities (epipelagic, migratory and mesopelagic). Analyzing an idealized (e.g., no anthropogenic forcing) 300-yr long pre-industrial simulation, we find that low and high frequency variability is dominant for the large and small organisms, respectively. Our model shows that the size-range exhibiting the largest variability at a given frequency, defined as the resonant range, also depends on the community. At a given frequency, the resonant range of the epipelagic community includes larger organisms than that of the migratory community and similarly, the latter includes larger organisms than the resonant range of the mesopelagic community. This study shows that the simulated temporal variability of marine pelagic organisms' abundance is not only influenced by natural climate fluctuations but also by the structure of the pelagic community. As a consequence, the size- and community-dependent response of marine ecosystems to climate variability could impact the sustainability of fisheries in a warming world.

  15. Effects of input uncertainty on cross-scale crop modeling

    NASA Astrophysics Data System (ADS)

    Waha, Katharina; Huth, Neil; Carberry, Peter

    2014-05-01

    The quality of data on climate, soils and agricultural management in the tropics is in general low or data is scarce leading to uncertainty in process-based modeling of cropping systems. Process-based crop models are common tools for simulating crop yields and crop production in climate change impact studies, studies on mitigation and adaptation options or food security studies. Crop modelers are concerned about input data accuracy as this, together with an adequate representation of plant physiology processes and choice of model parameters, are the key factors for a reliable simulation. For example, assuming an error in measurements of air temperature, radiation and precipitation of ± 0.2°C, ± 2 % and ± 3 % respectively, Fodor & Kovacs (2005) estimate that this translates into an uncertainty of 5-7 % in yield and biomass simulations. In our study we seek to answer the following questions: (1) are there important uncertainties in the spatial variability of simulated crop yields on the grid-cell level displayed on maps, (2) are there important uncertainties in the temporal variability of simulated crop yields on the aggregated, national level displayed in time-series, and (3) how does the accuracy of different soil, climate and management information influence the simulated crop yields in two crop models designed for use at different spatial scales? The study will help to determine whether more detailed information improves the simulations and to advise model users on the uncertainty related to input data. We analyse the performance of the point-scale crop model APSIM (Keating et al., 2003) and the global scale crop model LPJmL (Bondeau et al., 2007) with different climate information (monthly and daily) and soil conditions (global soil map and African soil map) under different agricultural management (uniform and variable sowing dates) for the low-input maize-growing areas in Burkina Faso/West Africa. We test the models' response to different levels of input data from very little to very detailed information, and compare the models' abilities to represent the spatial variability and temporal variability in crop yields. We display the uncertainty in crop yield simulations from different input data and crop models in Taylor diagrams, which are a graphical summary of the similarity between simulations and observations (Taylor, 2001). The observed spatial variability can be represented well by both models (R=0.6-0.8) but APSIM predicts higher spatial variability than LPJmL due to its sensitivity to soil parameters. Simulations with the same crop model, climate and sowing dates have similar statistics and therefore similar skill to reproduce the observed spatial variability. Soil data is less important for the skill of a crop model to reproduce the observed spatial variability. However, the uncertainty in simulated spatial variability from the two crop models is larger than from input data settings and APSIM is more sensitive to input data than LPJmL. Even with a detailed, point-scale crop model and detailed input data it is difficult to capture the complexity and diversity in maize cropping systems.
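
    The Taylor-diagram statistics mentioned above (correlation, normalized standard deviation, centered RMS difference) reduce to a few lines; the yield arrays below are hypothetical placeholders for simulated and observed values.

```python
# Minimal sketch of the Taylor-diagram statistics; arrays are hypothetical.
import numpy as np

def taylor_stats(sim, obs):
    """Correlation, normalized standard deviation, and centered RMS difference."""
    sim_a, obs_a = sim - sim.mean(), obs - obs.mean()
    corr = np.corrcoef(sim, obs)[0, 1]
    sd_ratio = sim.std() / obs.std()
    crmse = np.sqrt(np.mean((sim_a - obs_a) ** 2)) / obs.std()
    return corr, sd_ratio, crmse

rng = np.random.default_rng(6)
obs_yield = rng.gamma(4.0, 0.5, 100)                   # observed spatial pattern (stand-in)
sim_yield = 0.8 * obs_yield + rng.normal(0, 0.4, 100)  # one model/input setting (stand-in)
print("R = %.2f, sigma ratio = %.2f, cRMSE = %.2f" % taylor_stats(sim_yield, obs_yield))
```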

  16. Variables affecting learning in a simulation experience: a mixed methods study.

    PubMed

    Beischel, Kelly P

    2013-02-01

    The primary purpose of this study was to test a hypothesized model describing the direct effects of learning variables on anxiety and cognitive learning outcomes in a high-fidelity simulation (HFS) experience. The secondary purpose was to explain and explore student perceptions concerning the qualities and context of HFS affecting anxiety and learning. This study used a mixed methods quantitative-dominant explanatory design with concurrent qualitative data collection to examine variables affecting learning in undergraduate, beginning nursing students (N = 124). Being ready to learn, having a strong auditory-verbal learning style, and being prepared for simulation directly affected anxiety, whereas learning outcomes were directly affected by having strong auditory-verbal and hands-on learning styles. Anxiety did not quantitatively mediate cognitive learning outcomes as theorized, although students qualitatively reported debilitating levels of anxiety. This study advances nursing education science by providing evidence concerning variables affecting learning outcomes in HFS.

  17. Sea Surface Salinity Variability from Simulations and Observations: Preparing for Aquarius

    NASA Technical Reports Server (NTRS)

    Jacob, S. Daniel; LeVine, David M.

    2010-01-01

    Oceanic fresh water transport has been shown to play an important role in the global hydrological cycle. Sea surface salinity (SSS) is representative of the surface fresh water fluxes and the upcoming Aquarius mission scheduled to be launched in December 2010 will provide excellent spatial and temporal SSS coverage to better estimate the net exchange. In most ocean general circulation models, SSS is relaxed to climatology to prevent model drift. While SST remains a well-observed variable, relaxing to SST reduces the range of SSS variability in the simulations (Fig. 1). The main objective of the present study is to simulate surface tracers using a primitive equation ocean model for multiple forcing data sets to identify and establish a baseline SSS variability. The simulated variability scales are compared to those from near-surface Argo salinity measurements.

  18. Historical range of variability in landscape structure: a simulation study in Oregon, USA.

    Treesearch

    Etsuko Nonaka; Thomas A. Spies

    2005-01-01

    We estimated the historical range of variability (HRV) of forest landscape structure under natural disturbance regimes at the scale of a physiographic province (Oregon Coast Range, 2 million ha) and evaluated the similarity to HRV of current and future landscapes under alternative management scenarios. We used a stochastic fire simulation model to simulate...

  19. Ionosphere variability at mid latitudes during sudden stratosphere warmings

    NASA Astrophysics Data System (ADS)

    Pedatella, N. M.; Maute, A. I.; Maruyama, N.

    2015-12-01

    Variability of the mid latitude ionosphere and thermosphere during the 2009 and 2013 sudden stratosphere warmings (SSWs) is investigated in the present study using a combination of Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) observations and model simulations. The simulations are performed using the Thermosphere-Ionosphere-Mesosphere-Electrodynamics General Circulation Model (TIME-GCM) and Ionosphere Plasmasphere Electrodynamics (IPE) model. Both the COSMIC observations and TIME-GCM simulations reveal perturbations in the F-region peak height (hmF2) at Southern Hemisphere mid latitudes during SSW time periods. The perturbations are ~20-30 km, which corresponds to 10-20% variability in hmF2. The TIME-GCM simulations and COSMIC observations of the hmF2 variability are in overall good agreement, and the simulations can thus be used to understand the physical processes responsible for the hmF2 variability. The simulation results demonstrate that the mid latitude hmF2 variability is primarily driven by the propagation of the migrating semidiurnal lunar tide (M2) into the thermosphere where it modulates the field-aligned neutral winds, which in turn raise and lower the F-region peak height. The importance of the thermosphere neutral winds on generating the ionosphere variability at mid latitudes during SSWs is supported by IPE simulations performed both with and without the neutral wind variability. Though there are subtle differences, the consistency of the behavior between the 2009 and 2013 SSWs suggests that variability in the Southern Hemisphere mid latitude ionosphere and thermosphere is a consistent feature of the SSW impact on the upper atmosphere.

  20. Simulation of carbon allocation and organ growth variability in apple tree by connecting architectural and source–sink models

    PubMed Central

    Pallas, Benoît; Da Silva, David; Valsesia, Pierre; Yang, Weiwei; Guillaume, Olivier; Lauri, Pierre-Eric; Vercambre, Gilles; Génard, Michel; Costes, Evelyne

    2016-01-01

    Background and aims Plant growth depends on carbon availability and allocation among organs. QualiTree has been designed to simulate carbon allocation and partitioning in the peach tree (Prunus persica), whereas MappleT is dedicated to the simulation of apple tree (Malus × domestica) architecture. The objective of this study was to couple both models and adapt QualiTree to apple trees to simulate organ growth traits and their within-tree variability. Methods MappleT was used to generate architectures corresponding to the ‘Fuji’ cultivar, accounting for the variability within and among individuals. These architectures were input into QualiTree to simulate shoot and fruit growth during a growth cycle. We modified QualiTree to account for the observed shoot polymorphism in apple trees, i.e. different classes (long, medium and short) that were characterized by different growth function parameters. Model outputs were compared with observed 3D tree geometries, considering shoot and final fruit size and growth dynamics. Key Results The modelling approach connecting MappleT and QualiTree was appropriate to the simulation of growth and architectural characteristics at the tree scale (plant leaf area, shoot number and types, fruit weight at harvest). At the shoot scale, mean fruit weight and its variability within trees was accurately simulated, whereas the model tended to overestimate individual shoot leaf area and underestimate its variability for each shoot type. Varying the parameter related to the intensity of carbon exchange between shoots revealed that behaviour intermediate between shoot autonomy and a common assimilate pool was required to properly simulate within-tree fruit growth variability. Moreover, the model correctly dealt with the crop load effect on organ growth. Conclusions This study provides understanding of the integration of shoot ontogenetic properties, carbon supply and transport between entities for simulating organ growth in trees. Further improvements regarding the integration of retroaction loops between carbon allocation and the resulting plant architecture are expected to allow multi-year simulations. PMID:27279576

  1. North Atlantic simulations in Coordinated Ocean-ice Reference Experiments phase II (CORE-II). Part II: Inter-annual to decadal variability

    NASA Astrophysics Data System (ADS)

    Danabasoglu, Gokhan; Yeager, Steve G.; Kim, Who M.; Behrens, Erik; Bentsen, Mats; Bi, Daohua; Biastoch, Arne; Bleck, Rainer; Böning, Claus; Bozec, Alexandra; Canuto, Vittorio M.; Cassou, Christophe; Chassignet, Eric; Coward, Andrew C.; Danilov, Sergey; Diansky, Nikolay; Drange, Helge; Farneti, Riccardo; Fernandez, Elodie; Fogli, Pier Giuseppe; Forget, Gael; Fujii, Yosuke; Griffies, Stephen M.; Gusev, Anatoly; Heimbach, Patrick; Howard, Armando; Ilicak, Mehmet; Jung, Thomas; Karspeck, Alicia R.; Kelley, Maxwell; Large, William G.; Leboissetier, Anthony; Lu, Jianhua; Madec, Gurvan; Marsland, Simon J.; Masina, Simona; Navarra, Antonio; Nurser, A. J. George; Pirani, Anna; Romanou, Anastasia; Salas y Mélia, David; Samuels, Bonita L.; Scheinert, Markus; Sidorenko, Dmitry; Sun, Shan; Treguier, Anne-Marie; Tsujino, Hiroyuki; Uotila, Petteri; Valcke, Sophie; Voldoire, Aurore; Wang, Qiang; Yashayaev, Igor

    2016-01-01

    Simulated inter-annual to decadal variability and trends in the North Atlantic for the 1958-2007 period from twenty global ocean - sea-ice coupled models are presented. These simulations are performed as contributions to the second phase of the Coordinated Ocean-ice Reference Experiments (CORE-II). The study is Part II of our companion paper (Danabasoglu et al., 2014) which documented the mean states in the North Atlantic from the same models. A major focus of the present study is the representation of Atlantic meridional overturning circulation (AMOC) variability in the participating models. Relationships between AMOC variability and those of some other related variables, such as subpolar mixed layer depths, the North Atlantic Oscillation (NAO), and the Labrador Sea upper-ocean hydrographic properties, are also investigated. In general, AMOC variability shows three distinct stages. During the first stage that lasts until the mid- to late-1970s, AMOC is relatively steady, remaining lower than its long-term (1958-2007) mean. Thereafter, AMOC intensifies with maximum transports achieved in the mid- to late-1990s. This enhancement is then followed by a weakening trend until the end of our integration period. This sequence of low frequency AMOC variability is consistent with previous studies. Regarding strengthening of AMOC between about the mid-1970s and the mid-1990s, our results support a previously identified variability mechanism where AMOC intensification is connected to increased deep water formation in the subpolar North Atlantic, driven by NAO-related surface fluxes. The simulations tend to show general agreement in their temporal representations of, for example, AMOC, sea surface temperature (SST), and subpolar mixed layer depth variabilities. In particular, the observed variability of the North Atlantic SSTs is captured well by all models. These findings indicate that simulated variability and trends are primarily dictated by the atmospheric datasets which include the influence of ocean dynamics from nature superimposed onto anthropogenic effects. Despite these general agreements, there are many differences among the model solutions, particularly in the spatial structures of variability patterns. For example, the location of the maximum AMOC variability differs among the models between Northern and Southern Hemispheres.

  2. North Atlantic Simulations in Coordinated Ocean-Ice Reference Experiments Phase II (CORE-II) . Part II; Inter-Annual to Decadal Variability

    NASA Technical Reports Server (NTRS)

    Danabasoglu, Gokhan; Yeager, Steve G.; Kim, Who M.; Behrens, Erik; Bentsen, Mats; Bi, Daohua; Biastoch, Arne; Bleck, Rainer; Boening, Claus; Bozec, Alexandra

    2015-01-01

    Simulated inter-annual to decadal variability and trends in the North Atlantic for the 1958-2007 period from twenty global ocean - sea-ice coupled models are presented. These simulations are performed as contributions to the second phase of the Coordinated Ocean-ice Reference Experiments (CORE-II). The study is Part II of our companion paper (Danabasoglu et al., 2014) which documented the mean states in the North Atlantic from the same models. A major focus of the present study is the representation of Atlantic meridional overturning circulation (AMOC) variability in the participating models. Relationships between AMOC variability and those of some other related variables, such as subpolar mixed layer depths, the North Atlantic Oscillation (NAO), and the Labrador Sea upper-ocean hydrographic properties, are also investigated. In general, AMOC variability shows three distinct stages. During the first stage that lasts until the mid- to late-1970s, AMOC is relatively steady, remaining lower than its long-term (1958-2007) mean. Thereafter, AMOC intensifies with maximum transports achieved in the mid- to late-1990s. This enhancement is then followed by a weakening trend until the end of our integration period. This sequence of low frequency AMOC variability is consistent with previous studies. Regarding strengthening of AMOC between about the mid-1970s and the mid-1990s, our results support a previously identified variability mechanism where AMOC intensification is connected to increased deep water formation in the subpolar North Atlantic, driven by NAO-related surface fluxes. The simulations tend to show general agreement in their representations of, for example, AMOC, sea surface temperature (SST), and subpolar mixed layer depth variabilities. In particular, the observed variability of the North Atlantic SSTs is captured well by all models. These findings indicate that simulated variability and trends are primarily dictated by the atmospheric datasets which include the influence of ocean dynamics from nature superimposed onto anthropogenic effects. Despite these general agreements, there are many differences among the model solutions, particularly in the spatial structures of variability patterns. For example, the location of the maximum AMOC variability differs among the models between Northern and Southern Hemispheres.

  3. A Method for Modeling the Intrinsic Dynamics of Intraindividual Variability: Recovering the Parameters of Simulated Oscillators in Multi-Wave Panel Data.

    ERIC Educational Resources Information Center

    Boker, Steven M.; Nesselroade, John R.

    2002-01-01

    Examined two methods for fitting models of intrinsic dynamics to intraindividual variability data by testing these techniques' behavior in equations through simulation studies. Among the main results is the demonstration that a local linear approximation of derivatives can accurately recover the parameters of a simulated linear oscillator, with…
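
    One simple variant of the approach (not the authors' exact local linear approximation procedure) can be sketched as follows: simulate a damped linear oscillator, approximate its first and second derivatives numerically, and recover the frequency and damping parameters by regressing acceleration on position and velocity.

```python
# Illustrative only (one simple variant, not the authors' exact LLA procedure);
# the oscillator parameters are hypothetical and the series is noise-free for clarity.
import numpy as np

eta_true, zeta_true = -1.2, -0.1          # x'' = eta * x + zeta * x'
dt, n = 0.1, 600
x, v = np.empty(n), np.empty(n)
x[0], v[0] = 1.0, 0.0
for t in range(n - 1):                    # semi-implicit Euler integration
    a = eta_true * x[t] + zeta_true * v[t]
    v[t + 1] = v[t] + a * dt
    x[t + 1] = x[t] + v[t + 1] * dt

dx = np.gradient(x, dt)                   # approximate first derivative
d2x = np.gradient(dx, dt)                 # approximate second derivative

# Regress acceleration on position and velocity to recover the parameters.
eta_hat, zeta_hat = np.linalg.lstsq(np.column_stack([x, dx]), d2x, rcond=None)[0]
print(f"eta: true {eta_true}, estimated {eta_hat:.2f}; "
      f"zeta: true {zeta_true}, estimated {zeta_hat:.2f}")
```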

  4. Impact of atmospheric forcing on heat content variability in the sub-surface layer in the Japan/East Sea, 1948-2009

    NASA Astrophysics Data System (ADS)

    Stepanov, Dmitry; Gusev, Anatoly; Diansky, Nikolay

    2016-04-01

    Based on numerical simulations, this study investigates the impact of atmospheric forcing on heat content variability of the sub-surface layer in the Japan/East Sea (JES) during 1948-2009. We developed a model configuration based on the INMOM model and atmospheric forcing extracted from the CORE phase II experiment dataset for 1948-2009, which makes it possible to assess the impact of atmospheric forcing alone on heat content variability of the sub-surface layer of the JES. An analysis of kinetic energy (KE) and total heat content (THC) in the JES obtained from our numerical simulations showed that the simulated circulation of the JES is in a quasi-steady state. The year-mean KE variations obtained from our simulations are similar to those extracted from the SODA reanalysis. Comparison of the simulated THC with that extracted from the SODA reanalysis showed significant consistency between them. The simulated circulation structure is very similar to that obtained from the PALACE floats in the intermediate and abyssal layers of the JES. Using empirical orthogonal function analysis, we studied the spatial-temporal variability of the heat content of the sub-surface layer in the JES. Based on a comparison of the simulated heat content variations with those obtained from observations, an assessment of the impact of atmospheric forcing on the heat content variability was obtained. Using singular value decomposition analysis, we considered relationships between the heat content variability and the wind stress curl as well as the sensible heat flux in winter. The major role of the sensible heat flux in the decadal variability of the heat content of the sub-surface layer in the JES was established. The research was supported by the Russian Foundation for Basic Research (grant N 14-05-00255) and the Council on the Russian Federation President Grants (grant N MK-3241.2015.5)
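
    The EOF (empirical orthogonal function) analysis named above can be sketched compactly via an SVD of the anomaly field; the heat-content array below is a hypothetical stand-in, not the study's data.

```python
# Compact illustrative sketch (not the study's code): EOF analysis of a
# hypothetical heat-content anomaly field of shape (time, grid points) via SVD.
import numpy as np

rng = np.random.default_rng(8)
nt, nx = 62, 500                      # e.g. 62 years, 500 grid cells (hypothetical)
field = rng.normal(size=(nt, nx))     # stand-in for sub-surface heat content

anom = field - field.mean(axis=0)     # remove the time mean at each grid point
u, s, vt = np.linalg.svd(anom, full_matrices=False)

explained = s**2 / np.sum(s**2)       # fraction of variance per mode
eof1 = vt[0]                          # leading spatial pattern (EOF1)
pc1 = anom @ eof1                     # its principal-component time series
print("variance explained by the first three EOFs:", explained[:3].round(3))
```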

  5. Transferability of optimally-selected climate models in the quantification of climate change impacts on hydrology

    NASA Astrophysics Data System (ADS)

    Chen, Jie; Brissette, François P.; Lucas-Picher, Philippe

    2016-11-01

    Given the ever increasing number of climate change simulations being carried out, it has become impractical to use all of them to cover the uncertainty of climate change impacts. Various methods have been proposed to optimally select subsets of a large ensemble of climate simulations for impact studies. However, the behaviour of optimally-selected subsets of climate simulations for climate change impacts is unknown, since the transfer process from climate projections to the impact study world is usually highly non-linear. Consequently, this study investigates the transferability of optimally-selected subsets of climate simulations in the case of hydrological impacts. Two different methods were used for the optimal selection of subsets of climate scenarios, and both were found to be capable of adequately representing the spread of selected climate model variables contained in the original large ensemble. However, in both cases, the optimal subsets had limited transferability to hydrological impacts. To capture a similar variability in the impact model world, many more simulations have to be used than those that are needed to simply cover variability from the climate model variables' perspective. Overall, both optimal subset selection methods were better than random selection when small subsets were selected from a large ensemble for impact studies. However, as the number of selected simulations increased, random selection often performed better than the two optimal methods. To ensure adequate uncertainty coverage, the results of this study imply that selecting as many climate change simulations as possible is the best avenue. Where this was not possible, the two optimal methods were found to perform adequately.
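
    The abstract does not specify the selection algorithms used, but one common way to optimally select a subset of climate simulations is to cluster them in climate-variable space and keep the run nearest each cluster centre; the sketch below is only illustrative of that general idea.

```python
# Illustrative only: cluster-based subset selection of climate simulations
# (the actual selection methods used in the study are not specified here).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(12)
# Hypothetical rows: climate simulations; columns: summary climate variables
# (e.g. seasonal temperature and precipitation changes).
projections = rng.normal(size=(100, 8))

def select_subset(ensemble, k):
    """Pick k simulations, one nearest each k-means cluster centre."""
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(ensemble)
    chosen = []
    for c in range(k):
        members = np.where(km.labels_ == c)[0]
        d = np.linalg.norm(ensemble[members] - km.cluster_centers_[c], axis=1)
        chosen.append(int(members[np.argmin(d)]))
    return sorted(chosen)

print("selected simulations:", select_subset(projections, k=8))
```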

  6. Real versus Simulated Mobile Phone Exposures in Experimental Studies

    PubMed Central

    Panagopoulos, Dimitris J.; Johansson, Olle; Carlo, George L.

    2015-01-01

    We examined whether exposures to mobile phone radiation in biological/clinical experiments should be performed with real-life Electromagnetic Fields (EMFs) emitted by commercially available mobile phone handsets, instead of simulated EMFs emitted by generators or test phones. Real mobile phone emissions are constantly and unpredictably varying and thus are very different from simulated emissions which employ fixed parameters and no variability. This variability is an important parameter that makes real emissions more bioactive. Living organisms seem to have decreased defense against environmental stressors of high variability. While experimental studies employing simulated EMF-emissions present a strong inconsistency among their results with less than 50% of them reporting effects, studies employing real mobile phone exposures demonstrate an almost 100% consistency in showing adverse effects. This consistency is in agreement with studies showing association with brain tumors, symptoms of unwellness, and declines in animal populations. Average dosimetry in studies with real emissions can be reliable with increased number of field measurements, and variation in experimental outcomes due to exposure variability becomes less significant with increased number of experimental replications. We conclude that, in order for experimental findings to reflect reality, it is crucially important that exposures be performed by commercially available mobile phone handsets. PMID:26346766

  7. Three-dimensional benchmark for variable-density flow and transport simulation: matching semi-analytic stability modes for steady unstable convection in an inclined porous box

    USGS Publications Warehouse

    Voss, Clifford I.; Simmons, Craig T.; Robinson, Neville I.

    2010-01-01

    This benchmark for three-dimensional (3D) numerical simulators of variable-density groundwater flow and solute or energy transport consists of matching simulation results with the semi-analytical solution for the transition from one steady-state convective mode to another in a porous box. Previous experimental and analytical studies of natural convective flow in an inclined porous layer have shown that there are a variety of convective modes possible depending on system parameters, geometry and inclination. In particular, there is a well-defined transition from the helicoidal mode consisting of downslope longitudinal rolls superimposed upon an upslope unicellular roll to a mode consisting of purely an upslope unicellular roll. Three-dimensional benchmarks for variable-density simulators are currently (2009) lacking and comparison of simulation results with this transition locus provides an unambiguous means to test the ability of such simulators to represent steady-state unstable 3D variable-density physics.

  8. Affected States soft independent modeling by class analogy from the relation between independent variables, number of independent variables and sample size.

    PubMed

    Kanık, Emine Arzu; Temel, Gülhan Orekici; Erdoğan, Semra; Kaya, Irem Ersöz

    2013-03-01

    The aim of this study is to introduce the Soft Independent Modeling of Class Analogy (SIMCA) method and to assess whether it is affected by the number of independent variables, the relationship between the variables, and the sample size. Simulation study. The SIMCA model is performed in two stages. Simulations were carried out to determine whether the method is influenced by the number of independent variables, the relationship between the variables, and the sample size. The conditions considered were equal sample sizes in both groups of 30, 100 and 1000 samples; 2, 3, 5, 10, 50 and 100 variables; and relationships between variables that are quite high, medium, or quite low. Average classification accuracies of the simulations, which were carried out 1000 times for each condition of the trial plan, are given as tables. Diagnostic accuracy increases as the number of independent variables increases. SIMCA is suited to data in which the relationship between variables is quite high, the number of independent variables is large, and outlier values are present.

  9. Sensitivity of the interannual variability of mineral aerosol simulations to meteorological forcing dataset

    DOE PAGES

    Smith, Molly B.; Mahowald, Natalie M.; Albani, Samuel; ...

    2017-03-07

    Interannual variability in desert dust is widely observed and simulated, yet the sensitivity of these desert dust simulations to a particular meteorological dataset, as well as a particular model construction, is not well known. Here we use version 4 of the Community Atmospheric Model (CAM4) with the Community Earth System Model (CESM) to simulate dust forced by three different reanalysis meteorological datasets for the period 1990–2005. We then contrast the results of these simulations with dust simulated using online winds dynamically generated from sea surface temperatures, as well as with simulations conducted using other modeling frameworks but the same meteorological forcings, in order to determine the sensitivity of climate model output to the specific reanalysis dataset used. For the seven cases considered in our study, the different model configurations are able to simulate the annual mean of the global dust cycle, seasonality and interannual variability approximately equally well (or poorly) at the limited observational sites available. Altogether, aerosol dust-source strength has remained fairly constant during the time period from 1990 to 2005, although there is strong seasonal and some interannual variability simulated in the models and seen in the observations over this time period. Model interannual variability comparisons to observations, as well as comparisons between models, suggest that interannual variability in dust is still difficult to simulate accurately, with averaged correlation coefficients of 0.1 to 0.6. Because of the large variability, at least 1 year of observations at most sites is needed to correctly observe the mean, but in some regions, particularly the remote oceans of the Southern Hemisphere, where interannual variability may be larger than in the Northern Hemisphere, 2–3 years of data are likely to be needed.
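
    The interannual-variability comparison described above amounts to correlating detrended annual anomalies of simulated and observed dust at a site; the sketch below uses entirely synthetic values.

```python
# Synthetic illustration of the interannual comparison: detrend annual dust
# values and correlate simulated with observed anomalies at one site.
import numpy as np

years = np.arange(1990, 2006)
rng = np.random.default_rng(9)
obs = 20 + 0.1 * (years - 1990) + rng.normal(0, 3, years.size)   # observed annual dust (hypothetical)
sim = 18 + 0.1 * (years - 1990) + rng.normal(0, 3, years.size)   # simulated annual dust (hypothetical)

def detrended_anomaly(x):
    trend = np.polyval(np.polyfit(years, x, 1), years)
    return x - trend

r = np.corrcoef(detrended_anomaly(obs), detrended_anomaly(sim))[0, 1]
print(f"interannual correlation: r = {r:.2f}")
```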

  10. Multimodel ensembles of wheat growth: many models are better than one.

    PubMed

    Martre, Pierre; Wallach, Daniel; Asseng, Senthold; Ewert, Frank; Jones, James W; Rötter, Reimund P; Boote, Kenneth J; Ruane, Alex C; Thorburn, Peter J; Cammarano, Davide; Hatfield, Jerry L; Rosenzweig, Cynthia; Aggarwal, Pramod K; Angulo, Carlos; Basso, Bruno; Bertuzzi, Patrick; Biernath, Christian; Brisson, Nadine; Challinor, Andrew J; Doltra, Jordi; Gayler, Sebastian; Goldberg, Richie; Grant, Robert F; Heng, Lee; Hooker, Josh; Hunt, Leslie A; Ingwersen, Joachim; Izaurralde, Roberto C; Kersebaum, Kurt Christian; Müller, Christoph; Kumar, Soora Naresh; Nendel, Claas; O'leary, Garry; Olesen, Jørgen E; Osborne, Tom M; Palosuo, Taru; Priesack, Eckart; Ripoche, Dominique; Semenov, Mikhail A; Shcherbak, Iurii; Steduto, Pasquale; Stöckle, Claudio O; Stratonovitch, Pierre; Streck, Thilo; Supit, Iwan; Tao, Fulu; Travasso, Maria; Waha, Katharina; White, Jeffrey W; Wolf, Joost

    2015-02-01

    Crop models of crop growth are increasingly used to quantify the impact of global changes due to climate or crop management. Therefore, accuracy of simulation results is a major concern. Studies with ensembles of crop models can give valuable information about model accuracy and uncertainty, but such studies are difficult to organize and have only recently begun. We report on the largest ensemble study to date, of 27 wheat models tested in four contrasting locations for their accuracy in simulating multiple crop growth and yield variables. The relative error averaged over models was 24-38% for the different end-of-season variables including grain yield (GY) and grain protein concentration (GPC). There was little relation between error of a model for GY or GPC and error for in-season variables. Thus, most models did not arrive at accurate simulations of GY and GPC by accurately simulating preceding growth dynamics. Ensemble simulations, taking either the mean (e-mean) or median (e-median) of simulated values, gave better estimates than any individual model when all variables were considered. Compared to individual models, e-median ranked first in simulating measured GY and third in GPC. The error of e-mean and e-median declined with an increasing number of ensemble members, with little decrease beyond 10 models. We conclude that multimodel ensembles can be used to create new estimators with improved accuracy and consistency in simulating growth dynamics. We argue that these results are applicable to other crop species, and hypothesize that they apply more generally to ecological system models. © 2014 John Wiley & Sons Ltd.
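
    The e-mean / e-median idea can be illustrated with synthetic numbers (not the study's data): compare the relative error of individual "models" with that of ensemble mean and median estimators as the number of ensemble members grows.

```python
# Synthetic illustration (not the study's data) of ensemble-mean and
# ensemble-median estimators versus individual models.
import numpy as np

rng = np.random.default_rng(10)
true_yield = 6.0                                   # hypothetical observed value (t/ha)
n_models, n_sites = 27, 50
bias = rng.normal(0, 1.0, n_models)[:, None]       # each model has its own bias ...
sims = true_yield + bias + rng.normal(0, 0.8, (n_models, n_sites))  # ... plus noise

def rel_error(pred):
    return np.mean(np.abs(pred - true_yield)) / true_yield

print("mean relative error of individual models:",
      round(float(np.mean([rel_error(sims[m]) for m in range(n_models)])), 3))

for k in (3, 10, 27):                              # growing ensemble size
    subset = sims[:k]
    print(f"k={k:2d}  e-mean: {rel_error(subset.mean(axis=0)):.3f}"
          f"  e-median: {rel_error(np.median(subset, axis=0)):.3f}")
```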

  11. Multimodel Ensembles of Wheat Growth: More Models are Better than One

    NASA Technical Reports Server (NTRS)

    Martre, Pierre; Wallach, Daniel; Asseng, Senthold; Ewert, Frank; Jones, James W.; Rotter, Reimund P.; Boote, Kenneth J.; Ruane, Alex C.; Thorburn, Peter J.; Cammarano, Davide

    2015-01-01

    Crop models of crop growth are increasingly used to quantify the impact of global changes due to climate or crop management. Therefore, accuracy of simulation results is a major concern. Studies with ensembles of crop models can give valuable information about model accuracy and uncertainty, but such studies are difficult to organize and have only recently begun. We report on the largest ensemble study to date, of 27 wheat models tested in four contrasting locations for their accuracy in simulating multiple crop growth and yield variables. The relative error averaged over models was 24-38% for the different end-of-season variables including grain yield (GY) and grain protein concentration (GPC). There was little relation between error of a model for GY or GPC and error for in-season variables. Thus, most models did not arrive at accurate simulations of GY and GPC by accurately simulating preceding growth dynamics. Ensemble simulations, taking either the mean (e-mean) or median (e-median) of simulated values, gave better estimates than any individual model when all variables were considered. Compared to individual models, e-median ranked first in simulating measured GY and third in GPC. The error of e-mean and e-median declined with an increasing number of ensemble members, with little decrease beyond 10 models. We conclude that multimodel ensembles can be used to create new estimators with improved accuracy and consistency in simulating growth dynamics. We argue that these results are applicable to other crop species, and hypothesize that they apply more generally to ecological system models.

  12. Multimodel Ensembles of Wheat Growth: Many Models are Better than One

    NASA Technical Reports Server (NTRS)

    Martre, Pierre; Wallach, Daniel; Asseng, Senthold; Ewert, Frank; Jones, James W.; Rotter, Reimund P.; Boote, Kenneth J.; Ruane, Alexander C.; Thorburn, Peter J.; Cammarano, Davide

    2015-01-01

    Crop models of crop growth are increasingly used to quantify the impact of global changes due to climate or crop management. Therefore, accuracy of simulation results is a major concern. Studies with ensembles of crop models can give valuable information about model accuracy and uncertainty, but such studies are difficult to organize and have only recently begun. We report on the largest ensemble study to date, of 27 wheat models tested in four contrasting locations for their accuracy in simulating multiple crop growth and yield variables. The relative error averaged over models was 24-38% for the different end-of-season variables including grain yield (GY) and grain protein concentration (GPC). There was little relation between error of a model for GY or GPC and error for in-season variables. Thus, most models did not arrive at accurate simulations of GY and GPC by accurately simulating preceding growth dynamics. Ensemble simulations, taking either the mean (e-mean) or median (e-median) of simulated values, gave better estimates than any individual model when all variables were considered. Compared to individual models, e-median ranked first in simulating measured GY and third in GPC. The error of e-mean and e-median declined with an increasing number of ensemble members, with little decrease beyond 10 models. We conclude that multimodel ensembles can be used to create new estimators with improved accuracy and consistency in simulating growth dynamics. We argue that these results are applicable to other crop species, and hypothesize that they apply more generally to ecological system models.

  13. The Relative Influence of Several Factors on Simulation Performance.

    ERIC Educational Resources Information Center

    Gosenpud, Jerry; Miesing, Paul

    1992-01-01

    Describes a study that examined the relative impact of 44 independent variables on performance in a business simulation used in an administrative policy course. Categories of variables include ability, motivation, interest, confidence, cohesion, and organizational formality; and results indicate that motivation and interest have the most effect on…

  14. NAO and its relationship with the Northern Hemisphere mean surface temperature in CMIP5 simulations

    NASA Astrophysics Data System (ADS)

    Wang, Xiaofan; Li, Jianping; Sun, Cheng; Liu, Ting

    2017-04-01

    The North Atlantic Oscillation (NAO) is one of the most prominent teleconnection patterns in the Northern Hemisphere and has recently been found to be both an internal source and useful predictor of the multidecadal variability of the Northern Hemisphere mean surface temperature (NHT). In this study, we examine how well the variability of the NAO and NHT are reproduced in historical simulations generated by the 40 models that constitute Phase 5 of the Coupled Model Intercomparison Project (CMIP5). All of the models are able to capture the basic characteristics of the interannual NAO pattern reasonably well, whereas the simulated decadal NAO patterns show less consistency with the observations. The NAO fluctuations over multidecadal time scales are underestimated by almost all models. Regarding the NHT multidecadal variability, the models generally represent the externally forced variations well but tend to underestimate the internal NHT variability. With respect to the performance of the models in reproducing the NAO-NHT relationship, 14 models capture the observed decadal lead of the NAO, and model discrepancies in the representation of this linkage derive mainly from differences in how they represent the underlying physical processes associated with the Atlantic Multidecadal Oscillation (AMO) and the Atlantic meridional overturning circulation (AMOC). This study suggests that one way to improve the simulation of the multidecadal variability of the internal NHT lies in better simulation of the multidecadal variability of the NAO and its delayed effect on the NHT variability via slow ocean processes.

  15. Effect of land model ensemble versus coupled model ensemble on the simulation of precipitation climatology and variability

    NASA Astrophysics Data System (ADS)

    Wei, Jiangfeng; Dirmeyer, Paul A.; Yang, Zong-Liang; Chen, Haishan

    2017-10-01

    Through a series of model simulations with an atmospheric general circulation model coupled to three different land surface models, this study investigates the impacts of land model ensembles and coupled model ensemble on precipitation simulation. It is found that coupling an ensemble of land models to an atmospheric model has a very minor impact on the improvement of precipitation climatology and variability, but a simple ensemble average of the precipitation from three individually coupled land-atmosphere models produces better results, especially for precipitation variability. The generally weak impact of land processes on precipitation should be the main reason that the land model ensembles do not improve precipitation simulation. However, if there are big biases in the land surface model or land surface data set, correcting them could improve the simulated climate, especially for well-constrained regional climate simulations.

  16. Realistic simulated MRI and SPECT databases. Application to SPECT/MRI registration evaluation.

    PubMed

    Aubert-Broche, Berengere; Grova, Christophe; Reilhac, Anthonin; Evans, Alan C; Collins, D Louis

    2006-01-01

    This paper describes the construction of simulated SPECT and MRI databases that account for realistic anatomical and functional variability. The data is used as a gold-standard to evaluate four SPECT/MRI similarity-based registration methods. Simulation realism was accounted for using accurate physical models of data generation and acquisition. MRI and SPECT simulations were generated from three subjects to take into account inter-subject anatomical variability. Functional SPECT data were computed from six functional models of brain perfusion. Previous models of normal perfusion and ictal perfusion observed in Mesial Temporal Lobe Epilepsy (MTLE) were considered to generate functional variability. We studied the impact that noise and intensity non-uniformity in MRI simulations and scatter correction in SPECT may have on registration accuracy. We quantified the amount of registration error caused by anatomical and functional variability. Registration involving ictal data was less accurate than registration involving normal data. MR intensity nonuniformity was the main factor decreasing registration accuracy. The proposed simulated database is promising for evaluating many functional neuroimaging methods involving MRI and SPECT data.

  17. Simulated sudden increase in geomagnetic activity and its effect on heart rate variability: Experimental verification of correlation studies.

    PubMed

    Caswell, Joseph M; Singh, Manraj; Persinger, Michael A

    2016-08-01

    Previous research investigating the potential influence of geomagnetic factors on human cardiovascular state has tended to converge upon similar inferences although the results remain relatively controversial. Furthermore, previous findings have remained essentially correlational without accompanying experimental verification. An exception to this was noted for human brain activity in a previous study employing experimental simulation of sudden geomagnetic impulses in order to assess correlational results that had demonstrated a relationship between geomagnetic perturbations and neuroelectrical parameters. The present study employed the same equipment in a similar procedure in order to validate previous findings of a geomagnetic-cardiovascular dynamic with electrocardiography and heart rate variability measures. Results indicated that potential magnetic field effects on frequency components of heart rate variability tended to overlap with previous correlational studies where low frequency power and the ratio between low and high frequency components of heart rate variability appeared affected. In the present study, a significant increase in these particular parameters was noted during geomagnetic simulation compared to baseline recordings. Copyright © 2016 The Committee on Space Research (COSPAR). Published by Elsevier Ltd. All rights reserved.

  18. Comparing the Pearson and Spearman correlation coefficients across distributions and sample sizes: A tutorial using simulations and empirical data.

    PubMed

    de Winter, Joost C F; Gosling, Samuel D; Potter, Jeff

    2016-09-01

    The Pearson product-moment correlation coefficient (r_p) and the Spearman rank correlation coefficient (r_s) are widely used in psychological research. We compare r_p and r_s on 3 criteria: variability, bias with respect to the population value, and robustness to an outlier. Using simulations across low (N = 5) to high (N = 1,000) sample sizes we show that, for normally distributed variables, r_p and r_s have similar expected values but r_s is more variable, especially when the correlation is strong. However, when the variables have high kurtosis, r_p is more variable than r_s. Next, we conducted a sampling study of a psychometric dataset featuring symmetrically distributed data with light tails, and of 2 Likert-type survey datasets, 1 with light-tailed and the other with heavy-tailed distributions. Consistent with the simulations, r_p had lower variability than r_s in the psychometric dataset. In the survey datasets with heavy-tailed variables in particular, r_s had lower variability than r_p, and often corresponded more accurately to the population Pearson correlation coefficient (R_p) than r_p did. The simulations and the sampling studies showed that variability in terms of standard deviations can be reduced by about 20% by choosing r_s instead of r_p. In comparison, increasing the sample size by a factor of 2 results in a 41% reduction of the standard deviations of r_s and r_p. In conclusion, r_p is suitable for light-tailed distributions, whereas r_s is preferable when variables feature heavy-tailed distributions or when outliers are present, as is often the case in psychological research. PsycINFO Database Record (c) 2016 APA, all rights reserved
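    A minimal simulation in the spirit of the comparison above can be reproduced with standard tools; the population correlation, sample size, and replication count below are illustrative choices rather than the exact settings of the tutorial.

```python
# Compare the sampling variability of Pearson's r_p and Spearman's r_s for a
# bivariate normal population (illustrative settings only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
rho, n, reps = 0.6, 20, 2000
cov = [[1.0, rho], [rho, 1.0]]

rp, rs = [], []
for _ in range(reps):
    x, y = rng.multivariate_normal([0, 0], cov, size=n).T
    r_p, _ = stats.pearsonr(x, y)
    r_s, _ = stats.spearmanr(x, y)
    rp.append(r_p)
    rs.append(r_s)

print(f"SD of r_p: {np.std(rp):.3f}")
print(f"SD of r_s: {np.std(rs):.3f}")   # typically larger for normal data
```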

  19. Variability of North Atlantic Hurricane Frequency in a Large Ensemble of High-Resolution Climate Simulations

    NASA Astrophysics Data System (ADS)

    Mei, W.; Kamae, Y.; Xie, S. P.

    2017-12-01

    Forced and internal variability of North Atlantic hurricane frequency during 1951-2010 is studied using a large ensemble of climate simulations by a 60-km atmospheric general circulation model that is forced by observed sea surface temperatures (SSTs). The simulations well capture the interannual-to-decadal variability of hurricane frequency in best track data, and further suggest a possible underestimate of hurricane counts in the current best track data prior to 1966 when satellite measurements were unavailable. A genesis potential index (GPI) averaged over the Main Development Region (MDR) accounts for more than 80% of the forced variations in hurricane frequency, with potential intensity and vertical wind shear being the dominant factors. In line with previous studies, the difference between MDR SST and tropical mean SST is a simple but useful predictor; a one-degree increase in this SST difference produces 7.1±1.4 more hurricanes. The hurricane frequency also exhibits internal variability that is comparable in magnitude to the interannual variability. The 100-member ensemble allows us to address the following important questions: (1) Are the observations equivalent to one realization of such a large ensemble? (2) How many ensemble members are needed to reproduce the variability in observations and in the forced component of the simulations? The sources of the internal variability in hurricane frequency will be identified and discussed. The results provide an explanation for the relatively weak correlation (about 0.6) between MDR GPI and hurricane frequency on interannual timescales in observations.
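    The quoted sensitivity (7.1±1.4 hurricanes per degree of relative SST) lends itself to a quick back-of-the-envelope use; the baseline seasonal count and SST anomaly in the sketch below are hypothetical and only illustrate how the slope would be applied.

```python
# Back-of-the-envelope use of the quoted slope; baseline count and SST anomaly
# are invented for illustration only.
slope, slope_err = 7.1, 1.4   # hurricanes per K of (MDR SST - tropical mean SST)
baseline_count = 6.0          # hypothetical climatological seasonal count
d_sst = 0.5                   # hypothetical relative SST anomaly in K

extra = slope * d_sst
extra_err = slope_err * d_sst
print(f"expected count: {baseline_count + extra:.1f} "
      f"(+/- {extra_err:.1f} from the slope uncertainty alone)")
```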

  20. Sensitivity of summer ensembles of fledgling superparameterized U.S. mesoscale convective systems to cloud resolving model microphysics and grid configuration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elliott, Elizabeth J.; Yu, Sungduk; Kooperman, Gabriel J.

    The sensitivities of simulated mesoscale convective systems (MCSs) in the central U.S. to microphysics and grid configuration are evaluated here in a global climate model (GCM) that also permits global-scale feedbacks and variability. Since conventional GCMs do not simulate MCSs, studying their sensitivities in a global framework useful for climate change simulations has not previously been possible. To date, MCS sensitivity experiments have relied on controlled cloud resolving model (CRM) studies with limited domains, which avoid internal variability and neglect feedbacks between local convection and larger-scale dynamics. However, recent work with superparameterized (SP) GCMs has shown that eastward propagating MCS-like events are captured when embedded CRMs replace convective parameterizations. This study uses a SP version of the Community Atmosphere Model version 5 (SP-CAM5) to evaluate MCS sensitivities, applying an objective empirical orthogonal function algorithm to identify MCS-like events, and harmonizing composite storms to account for seasonal and spatial heterogeneity. A five-summer control simulation is used to assess the magnitude of internal and interannual variability relative to 10 sensitivity experiments with varied CRM parameters, including ice fall speed, one-moment and two-moment microphysics, and grid spacing. MCS sensitivities were found to be subtle with respect to internal variability, and indicate that ensembles of over 100 storms may be necessary to detect robust differences in SP-GCMs. Furthermore, these results emphasize that the properties of MCSs can vary widely across individual events, and improving their representation in global simulations with significant internal variability may require comparison to long (multidecadal) time series of observed events rather than single season field campaigns.

  1. Forced and Free Intra-Seasonal Variability Over the South Asian Monsoon Region Simulated by 10 AGCMs

    NASA Technical Reports Server (NTRS)

    Wu, Man Li C.; Kang, In-Sik; Waliser, Duane; Atlas, Robert (Technical Monitor)

    2001-01-01

    This study examines intra-seasonal (20-70 day) variability in the South Asian monsoon region during 1997/98 in ensembles of 10 simulations with 10 different atmospheric general circulation models. The 10 ensemble members for each model are forced with the same observed weekly sea surface temperature (SST) but differ from each other in that they are started from different initial atmospheric conditions. The results show considerable differences between the models in the simulated 20-70 day variability, ranging from much weaker to much stronger than the observed. A key result is that the models do produce, to varying degrees, a response to the imposed weekly SST. The forced variability tends to be largest in the Indian and western Pacific Oceans where, for some models, it accounts for more than 1/4 of the 20-70 day intra-seasonal variability in the upper level velocity potential during these two years. A case study of a strong observed MJO (intraseasonal oscillation) event shows that the models produce an ensemble mean eastward propagating signal in the tropical precipitation field over the Indian Ocean and western Pacific, similar to that found in the observations. The associated forced 200 mb velocity potential anomalies are strongly phase locked with the precipitation anomalies, propagating slowly to the east (about 5 m/s) with a local zonal wave number two pattern that is generally consistent with the developing observed MJO. The simulated and observed events are, however, approximately in quadrature, with the simulated response leading by 5-10 days. The phase lag occurs because, in the observations, the positive SST anomalies develop upstream of the main convective center in the subsidence region of the MJO, while in the simulations, the forced component is in phase with the SST. For all the models examined here, the intraseasonal variability is dominated by the free (intra-ensemble) component. The results of our case study show that the free variability has a predominantly zonal wave number one pattern, and has propagation speeds (10 - 15 m/s) that are more typical of observed MJO behavior away from the convectively active regions. The free variability appears to be synchronized with the forced response, at least, during the strong event examined here. The results of this study support the idea that coupling with SSTs plays an important, though probably not dominant, role in the MJO. The magnitude of the atmospheric response to the SST appears to be in the range of 15% - 30% of the 20-70 day variability over much of the tropical eastern Indian and western Pacific Oceans. The results also highlight the need to use caution when interpreting atmospheric model simulations in which the prescribed SSTs resolve MJO time scales.
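    The forced/free split used above follows from treating the ensemble mean as the SST-forced component and the spread about it as the free (intra-ensemble) component. A minimal sketch with a synthetic stand-in series, not model output, is shown below.

```python
# Forced vs. free decomposition of an ensemble forced with identical SSTs: the
# ensemble-mean variance estimates the forced part, the spread about the mean
# estimates the free part. The "velocity potential" series is synthetic.
import numpy as np

rng = np.random.default_rng(2)
n_members, n_days = 10, 360
forced_signal = np.sin(2 * np.pi * np.arange(n_days) / 45.0)        # common SST-forced part
members = forced_signal + rng.normal(0, 1.2, (n_members, n_days))   # member-dependent "free" part

ens_mean = members.mean(axis=0)
forced_var = ens_mean.var()
free_var = (members - ens_mean).var()
print(f"forced fraction of total variance: {forced_var / (forced_var + free_var):.2f}")
```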

  2. Low-frequency variability of the Atlantic MOC in the eddying regime : the intrinsic component.

    NASA Astrophysics Data System (ADS)

    Gregorio, S.; Penduff, T.; Barnier, B.; Molines, J.-M.; Le Sommer, J.

    2012-04-01

    A 327-year 1/4° global ocean/sea-ice simulation has been produced by the DRAKKAR ocean modeling consortium. This simulation is forced by a repeated seasonal atmospheric forcing but nevertheless exhibits a substantial low-frequency variability (at interannual and longer timescales), which is therefore of intrinsic origin. This nonlinearly-generated intrinsic variability is almost absent from the coarse-resolution (2°) version of this simulation. Comparing the 1/4° simulation with its fully-forced counterpart, Penduff et al. (2011) have shown that the low-frequency variability of local sea-level is largely generated by the ocean itself in eddying areas, rather than directly forced by the atmosphere. Using the same simulations, the present study quantifies the imprint of the intrinsic low-frequency variability on the Meridional Overturning Circulation (MOC) at interannual-to-decadal timescales in the Atlantic. We first compare the intrinsic and atmospherically-forced interannual variances of the Atlantic MOC calculated in geopotential coordinates. This analysis reveals substantial sources of intrinsic MOC variability in the South Atlantic (driven by the Agulhas mesoscale activity according to Biastoch et al. (2008)), but also in the North Atlantic. We extend our investigation to the MOC calculated in isopycnal coordinates, and identify regions in the basin where the water mass transformation exhibits low-frequency intrinsic variability. In this eddy-permitting regime, intrinsic processes are shown to generate about half the total (geopotential and isopycnal) MOC interannual variance in certain key regions of the Atlantic. This intrinsic variability is absent from 2° simulations. Penduff, T., Juza, M., Barnier, B., Zika, J., Dewar, W.K., Treguier, A.-M., Molines, J.-M., Audiffren, N., 2011: Sea-level expression of intrinsic and forced ocean variabilities at interannual time scales. J. Climate, 24, 5652-5670. doi: 10.1175/JCLI-D-11-00077.1. Biastoch, A., Böning, C. W., Lutjeharms, J. R. E., 2008: Agulhas leakage dynamics affects decadal variability in Atlantic overturning circulation. Nature, 456, 489-492, doi: 10.1038/nature07426.

  3. A Monte Carlo Simulation Study of the Reliability of Intraindividual Variability

    PubMed Central

    Estabrook, Ryne; Grimm, Kevin J.; Bowles, Ryan P.

    2012-01-01

    Recent research has seen intraindividual variability (IIV) become a useful technique to incorporate trial-to-trial variability into many types of psychological studies. IIV as measured by individual standard deviations (ISDs) has shown unique prediction to several types of positive and negative outcomes (Ram, Rabbitt, Stollery, & Nesselroade, 2005). One unanswered question regarding measuring intraindividual variability is its reliability and the conditions under which optimal reliability is achieved. Monte Carlo simulation studies were conducted to determine the reliability of the ISD compared to the intraindividual mean. The results indicate that ISDs generally have poor reliability and are sensitive to insufficient measurement occasions, poor test reliability, and unfavorable amounts and distributions of variability in the population. Secondary analysis of psychological data shows that use of individual standard deviations in unfavorable conditions leads to a marked reduction in statistical power, although careful adherence to underlying statistical assumptions allows their use as a basic research tool. PMID:22268793
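    One way to picture the reliability question for ISDs is a parallel-forms style Monte Carlo: simulate two independent sets of occasions for the same subjects and correlate the two ISD estimates. The population settings below are illustrative, not those used in the study.

```python
# Monte Carlo sketch of ISD reliability: two independent "administrations" of the
# same occasions, correlated across subjects (all settings are hypothetical).
import numpy as np

rng = np.random.default_rng(3)
n_subjects, n_occasions = 200, 8
true_sd = rng.uniform(0.5, 2.0, n_subjects)          # each subject's true variability

def observed_isd():
    scores = rng.normal(0, true_sd[:, None], (n_subjects, n_occasions))
    return scores.std(axis=1, ddof=1)

isd_a, isd_b = observed_isd(), observed_isd()
reliability = np.corrcoef(isd_a, isd_b)[0, 1]        # parallel-forms style estimate
print(f"ISD reliability with {n_occasions} occasions: {reliability:.2f}")
```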

  4. Variable geometry Darrieus wind machine

    NASA Astrophysics Data System (ADS)

    Pytlinski, J. T.; Serrano, D.

    1983-08-01

    A variable geometry Darrieus wind machine is proposed. The lower attachment of the blades to the rotor can move freely up and down the axle, allowing the blades to change shape during rotation. Experimental data for a 17 m diameter Darrieus rotor and a theoretical model for multiple streamtube performance prediction were used to develop a computer simulation program for studying parameters that affect the machine's performance. This new variable geometry concept is described and interrelated with multiple streamtube theory through aerodynamic parameters. The computer simulation study shows that governor behavior of a Darrieus turbine cannot be attained by a standard turbine operating within normally occurring rotational velocity limits. A second generation variable geometry Darrieus wind turbine which uses a telescopic blade is proposed as a potential improvement on the studied concept.

  5. Simulation of Mesoscale Cellular Convection in Marine Stratocumulus. Part I: Drizzling Conditions

    DOE PAGES

    Zhou, Xiaoli; Ackerman, Andrew S.; Fridlind, Ann M.; ...

    2018-01-01

    This study uses eddy-permitting simulations to investigate the mechanisms that promote mesoscale variability of moisture in drizzling stratocumulus-topped marine boundary layers. Simulations show that precipitation tends to increase horizontal scales. Analysis of terms in the prognostic equation for total water mixing ratio variance indicates that moisture stratification plays a leading role in setting horizontal scales. This result is supported by simulations in which horizontal mean thermodynamic profiles are strongly nudged to their initial well-mixed state, which limits cloud scales. It is found that the spatial variability of subcloud moist cold pools surprisingly tends to respond to, rather than determine, the mesoscale variability, which may distinguish them from dry cold pools associated with deeper convection. Finally, simulations also indicate that moisture stratification increases cloud scales specifically by increasing latent heating within updrafts, which increases updraft buoyancy and favors greater horizontal scales.

  6. Simulation of Mesoscale Cellular Convection in Marine Stratocumulus. Part I: Drizzling Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Xiaoli; Ackerman, Andrew S.; Fridlind, Ann M.

    This study uses eddy-permitting simulations to investigate the mechanisms that promote mesoscale variability of moisture in drizzling stratocumulus-topped marine boundary layers. Simulations show that precipitation tends to increase horizontal scales. Analysis of terms in the prognostic equation for total water mixing ratio variance indicates that moisture stratification plays a leading role in setting horizontal scales. This result is supported by simulations in which horizontal mean thermodynamic profiles are strongly nudged to their initial well-mixed state, which limits cloud scales. It is found that the spatial variability of subcloud moist cold pools surprisingly tends to respond to, rather than determine, the mesoscale variability, which may distinguish them from dry cold pools associated with deeper convection. Finally, simulations also indicate that moisture stratification increases cloud scales specifically by increasing latent heating within updrafts, which increases updraft buoyancy and favors greater horizontal scales.

  7. A Variable Resolution Stretched Grid General Circulation Model: Regional Climate Simulation

    NASA Technical Reports Server (NTRS)

    Fox-Rabinovitz, Michael S.; Takacs, Lawrence L.; Govindaraju, Ravi C.; Suarez, Max J.

    2000-01-01

    The development of and results obtained with a variable-resolution stretched-grid GCM for the regional climate simulation mode are presented. The global variable-resolution stretched grid used in the study has enhanced horizontal resolution over the U.S. as the area of interest. The stretched-grid approach is an ideal tool for representing regional-to-global scale interactions. It is an alternative to the widely used nested-grid approach introduced over a decade ago as a pioneering step in regional climate modeling. The major results of the study are presented for the successful stretched-grid GCM simulation of the anomalous climate event of the 1988 U.S. summer drought. The straightforward (with no updates) two-month simulation is performed with 60 km regional resolution. The major drought fields, patterns, and characteristics, such as the time-averaged 500 hPa heights, precipitation, and the low-level jet over the drought area, appear to be close to the verifying analyses for the stretched-grid simulation. In other words, the stretched-grid GCM provides an efficient downscaling over the area of interest with enhanced horizontal resolution. It is also shown that the GCM skill is sustained throughout the simulation extended to one year. The stretched-grid GCM, developed and tested in a simulation mode, is a viable tool for regional and subregional climate studies and applications.

  8. The Chandra Source Catalog: Source Variability

    NASA Astrophysics Data System (ADS)

    Nowak, Michael; Rots, A. H.; McCollough, M. L.; Primini, F. A.; Glotfelty, K. J.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, J. D.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Grier, J. D.; Hain, R.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Plummer, D. A.; Refsdal, B. L.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; Van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-01-01

    The Chandra Source Catalog (CSC) contains fields of view that have been studied with individual, uninterrupted observations that span integration times ranging from 1 ksec to 160 ksec, a large number of which have received (multiple) repeat observations days to years later. The CSC thus offers an unprecedented look at the variability of the X-ray sky over a broad range of time scales, and across a wide diversity of variable X-ray sources: stars in the local galactic neighborhood, galactic and extragalactic X-ray binaries, Active Galactic Nuclei, etc. Here we describe the methods used to identify and quantify source variability within a single observation, and the methods used to assess the variability of a source when detected in multiple, individual observations. Three tests are used to detect source variability within a single observation: the Kolmogorov-Smirnov test and its variant, the Kuiper test, and a Bayesian approach originally suggested by Gregory and Loredo. The latter test not only provides an indicator of variability, but is also used to create a best estimate of the variable lightcurve shape. We assess the performance of these tests via simulation of statistically stationary, variable processes with arbitrary input power spectral densities (here we concentrate on results of red noise simulations) at a variety of mean count rates and fractional root mean square variabilities relevant to CSC sources. We also assess the false positive rate via simulations of constant sources whose sole source of fluctuation is Poisson noise. We compare these simulations to a preliminary assessment of the variability found in real CSC sources, and estimate the variability sensitivities of the CSC.
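    The single-observation tests named above can be illustrated with a toy example: a Kolmogorov-Smirnov test of photon arrival times against the uniform distribution expected for a constant source, plus a crude false-positive check using Poisson-only (constant) lightcurves. Count totals and exposure are invented.

```python
# KS test of event arrival times against uniformity (constant source), and a
# rough false-positive rate from constant-source realizations; settings are
# illustrative and unrelated to actual CSC observations.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
exposure = 10_000.0  # seconds (hypothetical)

# constant source: uniform arrival times; variable source: extra events in a "flare"
constant_times = np.sort(rng.uniform(0, exposure, 400))
flare_times = np.sort(np.concatenate([rng.uniform(0, exposure, 300),
                                      rng.normal(6000, 300, 100)]))

for name, times in [("constant", constant_times), ("flaring", flare_times)]:
    d, p = stats.kstest(times / exposure, "uniform")
    print(f"{name:8s}: KS p-value = {p:.3g}")

# crude false-positive estimate for constant sources (Poisson noise only)
false_pos = np.mean([stats.kstest(rng.uniform(0, 1, 400), "uniform")[1] < 0.01
                     for _ in range(500)])
print(f"false-positive rate at p<0.01: {false_pos:.3f}")
```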

  9. The Chandra Source Catalog: Source Variability

    NASA Astrophysics Data System (ADS)

    Nowak, Michael; Rots, A. H.; McCollough, M. L.; Primini, F. A.; Glotfelty, K. J.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, J. D.; Evans, I.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G., II; Grier, J. D.; Hain, R.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Plummer, D. A.; Refsdal, B. L.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-09-01

    The Chandra Source Catalog (CSC) contains fields of view that have been studied with individual, uninterrupted observations that span integration times ranging from 1 ksec to 160 ksec, a large number of which have received (multiple) repeat observations days to years later. The CSC thus offers an unprecedented look at the variability of the X-ray sky over a broad range of time scales, and across a wide diversity of variable X-ray sources: stars in the local galactic neighborhood, galactic and extragalactic X-ray binaries, Active Galactic Nuclei, etc. Here we describe the methods used to identify and quantify source variability within a single observation, and the methods used to assess the variability of a source when detected in multiple, individual observations. Three tests are used to detect source variability within a single observation: the Kolmogorov-Smirnov test and its variant, the Kuiper test, and a Bayesian approach originally suggested by Gregory and Loredo. The latter test not only provides an indicator of variability, but is also used to create a best estimate of the variable lightcurve shape. We assess the performance of these tests via simulation of statistically stationary, variable processes with arbitrary input power spectral densities (here we concentrate on results of red noise simulations) at a variety of mean count rates and fractional root mean square variabilities relevant to CSC sources. We also assess the false positive rate via simulations of constant sources whose sole source of fluctuation is Poisson noise. We compare these simulations to an assessment of the variability found in real CSC sources, and estimate the variability sensitivities of the CSC.

  10. Collective feature selection to identify crucial epistatic variants.

    PubMed

    Verma, Shefali S; Lucas, Anastasia; Zhang, Xinyuan; Veturi, Yogasudha; Dudek, Scott; Li, Binglan; Li, Ruowang; Urbanowicz, Ryan; Moore, Jason H; Kim, Dokyoon; Ritchie, Marylyn D

    2018-01-01

    Machine learning methods have gained popularity and practicality in identifying linear and non-linear effects of variants associated with complex disease/traits. Detection of epistatic interactions still remains a challenge due to the large number of features and relatively small sample size as input, thus leading to the so-called "short fat data" problem. The efficiency of machine learning methods can be increased by limiting the number of input features. Thus, it is very important to perform variable selection before searching for epistasis. Many methods have been evaluated and proposed to perform feature selection, but no single method works best in all scenarios. We demonstrate this by conducting two separate simulation analyses to evaluate the proposed collective feature selection approach. Through our simulation study we propose a collective feature selection approach to select features that are in the "union" of the best performing methods. We explored various parametric, non-parametric, and data mining approaches to perform feature selection. We chose our top performing methods to select the union of the resulting variables based on a user-defined percentage of variants selected from each method to take to downstream analysis. Our simulation analysis shows that non-parametric data mining approaches, such as MDR, may work best under one simulation criterion for the high effect size (penetrance) datasets, while non-parametric methods designed for feature selection, such as Ranger and Gradient boosting, work best under other simulation criteria. Thus, using a collective approach proves to be more beneficial for selecting variables with epistatic effects, even in low effect size datasets and different genetic architectures. Following this, we applied our proposed collective feature selection approach to select the top 1% of variables to identify potential interacting variables associated with Body Mass Index (BMI) in ~44,000 samples obtained from Geisinger's MyCode Community Health Initiative (on behalf of DiscovEHR collaboration). In this study, we were able to show that selecting variables using a collective feature selection approach could help in selecting true positive epistatic variables more frequently than applying any single method for feature selection via simulation studies. We were able to demonstrate the effectiveness of collective feature selection along with a comparison of many methods in our simulation analysis. We also applied our method to identify non-linear networks associated with obesity.
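    A toy version of the "union of best-performing methods" idea, using generic scikit-learn scorers as stand-ins for the parametric, non-parametric, and data-mining methods evaluated in the study, might look like the following; the data, methods, and cutoff are illustrative assumptions.

```python
# Collective feature selection sketch: keep each method's top-k features and
# take the union for downstream analysis (generic scorers, synthetic data).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.feature_selection import mutual_info_classif

X, y = make_classification(n_samples=500, n_features=100, n_informative=8, random_state=0)
k = 5  # keep the top 5 features from each method (standing in for a top percentage)

def top_k(scores, k):
    return set(np.argsort(scores)[::-1][:k])

selected = (
    top_k(mutual_info_classif(X, y, random_state=0), k)
    | top_k(RandomForestClassifier(random_state=0).fit(X, y).feature_importances_, k)
    | top_k(GradientBoostingClassifier(random_state=0).fit(X, y).feature_importances_, k)
)
print(f"collective selection kept {len(selected)} features: {sorted(selected)}")
```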

  11. Propagation of uncertainty in nasal spray in vitro performance models using Monte Carlo simulation: Part II. Error propagation during product performance modeling.

    PubMed

    Guo, Changning; Doub, William H; Kauffman, John F

    2010-08-01

    Monte Carlo simulations were applied to investigate the propagation of uncertainty in both input variables and response measurements on model prediction for nasal spray product performance design of experiment (DOE) models in the first part of this study, with an initial assumption that the models perfectly represent the relationship between input variables and the measured responses. In this article, we discard the initial assumption, and extended the Monte Carlo simulation study to examine the influence of both input variable variation and product performance measurement variation on the uncertainty in DOE model coefficients. The Monte Carlo simulations presented in this article illustrate the importance of careful error propagation during product performance modeling. Our results show that the error estimates based on Monte Carlo simulation result in smaller model coefficient standard deviations than those from regression methods. This suggests that the estimated standard deviations from regression may overestimate the uncertainties in the model coefficients. Monte Carlo simulations provide a simple software solution to understand the propagation of uncertainty in complex DOE models so that design space can be specified with statistically meaningful confidence levels. (c) 2010 Wiley-Liss, Inc. and the American Pharmacists Association
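    The kind of Monte Carlo error propagation described above can be sketched for a generic regression (DOE-style) model: perturb the inputs and responses with assumed measurement noise, refit, and inspect the spread of the coefficients. The model form and noise levels below are illustrative, not those of the nasal spray study.

```python
# Monte Carlo propagation of input and response uncertainty into regression
# coefficients (synthetic two-factor model, assumed noise levels).
import numpy as np

rng = np.random.default_rng(5)
true_beta = np.array([2.0, 0.8, -0.5])              # intercept and two factor effects
x = rng.uniform(-1, 1, (50, 2))                     # nominal design settings
y = true_beta[0] + x @ true_beta[1:] + rng.normal(0, 0.1, 50)

coefs = []
for _ in range(2000):
    x_mc = x + rng.normal(0, 0.05, x.shape)         # assumed input-variable uncertainty
    y_mc = y + rng.normal(0, 0.1, y.shape)          # assumed response-measurement uncertainty
    design = np.column_stack([np.ones(len(y_mc)), x_mc])
    coefs.append(np.linalg.lstsq(design, y_mc, rcond=None)[0])

print("Monte Carlo SD of coefficients:", np.std(coefs, axis=0).round(3))
```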

  12. Climate Modeling and Causal Identification for Sea Ice Predictability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunke, Elizabeth Clare; Urrego Blanco, Jorge Rolando; Urban, Nathan Mark

    This project aims to better understand causes of ongoing changes in the Arctic climate system, particularly as decreasing sea ice trends have been observed in recent decades and are expected to continue in the future. As part of the Sea Ice Prediction Network, a multi-agency effort to improve sea ice prediction products on seasonal-to-interannual time scales, our team is studying sensitivity of sea ice to a collection of physical processes and feedback mechanisms in the coupled climate system. During 2017 we completed a set of climate model simulations using the fully coupled ACME-HiLAT model. The simulations consisted of experiments in which cloud, sea ice, and air-ocean turbulent exchange parameters previously identified as important for driving output uncertainty in climate models were perturbed to account for parameter uncertainty in simulated climate variables. We conducted a sensitivity study of these parameters, which built upon a previous study we made for standalone simulations (Urrego-Blanco et al., 2016, 2017). Using the results from the ensemble of coupled simulations, we are examining robust relationships between climate variables that emerge across the experiments. We are also using causal discovery techniques to identify interaction pathways among climate variables which can help identify physical mechanisms and provide guidance in predictability studies. This work further builds on and leverages the large ensemble of standalone sea ice simulations produced in our previous w14_seaice project.

  13. Theory and Design Tools For Studies of Reactions to Abrupt Changes in Noise Exposure

    NASA Technical Reports Server (NTRS)

    Fields, James M.; Ehrlich, Gary E.; Zador, Paul; Shepherd, Kevin P. (Technical Monitor)

    2000-01-01

    Study plans, a pre-tested questionnaire, a sample design evaluation tool, a community publicity monitoring plan, and a theoretical framework have been developed to support combined social/acoustical surveys of residents' reactions to an abrupt change in environmental noise. Secondary analyses of more than 20 previous surveys provide estimates of three parameters of a study simulation model: within-individual variability, between-study-wave variability, and between-neighborhood variability in response to community noise. The simulation model predicts the precision of the results from social surveys of reactions to noise, including changes in noise. When the study simulation model analyzed the population distribution, noise exposure environments, and feasible noise measurement program at a proposed noise change survey site, it was concluded that the site could not yield sufficiently precise estimates of the human reaction model to justify conducting a survey. Additional secondary analyses determined that noise reactions are affected by the season of the social survey.
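    A study simulation model of this kind can be reduced, at its simplest, to drawing the three variance components and checking the spread of the resulting survey estimate. The sketch below uses invented variance values and sample sizes purely to show the mechanics; it is not the model developed in the study.

```python
# Precision of a survey-mean estimate driven by three variance components:
# between-neighborhood, between-wave, and within-individual (all values invented).
import numpy as np

rng = np.random.default_rng(6)
sd_neigh, sd_wave, sd_within = 0.6, 0.4, 1.0        # hypothetical component SDs
n_neigh, n_resp = 8, 40                             # neighborhoods, respondents per neighborhood

estimates = []
for _ in range(2000):
    neigh_eff = rng.normal(0, sd_neigh, n_neigh)
    wave_eff = rng.normal(0, sd_wave)               # one survey wave
    responses = neigh_eff[:, None] + wave_eff + rng.normal(0, sd_within, (n_neigh, n_resp))
    estimates.append(responses.mean())

print(f"SD of the simulated survey estimate: {np.std(estimates):.3f}")
```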

  14. Simulating a Thin Accretion Disk Using PLUTO

    NASA Astrophysics Data System (ADS)

    Phillipson, Rebecca; Vogeley, Michael S.; Boyd, Patricia T.

    2017-08-01

    Accreting black hole systems such as X-ray binaries and active galactic nuclei exhibit variability in their luminosity on many timescales ranging from milliseconds to tens of days, and even hundreds of days. The mechanism(s) driving this variability and the relationship between short- and long-term variability are poorly understood. Current studies on accretion disks seek to determine how the changes in black hole mass, the rate at which mass accretes onto the central black hole, and the external environment affect the variability on scales ranging from stellar-mass black holes to supermassive black holes. Traditionally, the fluid mechanics equations governing accretion disks have been simplified by considering only the kinematics of the disk, and perhaps magnetic fields, in order for their phenomenological behavior to be predicted analytically. We seek to employ numerical techniques to study accretion disks including more complicated physics traditionally ignored in order to more accurately understand their behavior over time. We present a proof-of-concept three dimensional, global simulation using the astrophysical hydrodynamic code PLUTO of a simplified thin disk model about a central black hole which will serve as the basis for development of more complicated models including external effects such as radiation and magnetic fields. We also develop a tool to generate a synthetic light curve that displays the variability in luminosity of the simulation over time. The preliminary simulation and accompanying synthetic light curve demonstrate that PLUTO is a reliable code to perform sophisticated simulations of accretion disk systems which can then be compared to observational results.

  15. Influences of Appalachian orography on heavy rainfall and rainfall variability associated with the passage of hurricane Isabel by ensemble simulations

    NASA Astrophysics Data System (ADS)

    Oldaker, Guy; Liu, Liping; Lin, Yuh-Lang

    2017-12-01

    This study focuses on the heavy rainfall event associated with hurricane Isabel's (2003) passage over the Appalachian mountains of the eastern United States. Specifically, an ensemble consisting of two groups of simulations using the Weather Research and Forecasting model (WRF), with and without topography, is performed to investigate the orographic influences on heavy rainfall and rainfall variability. In general, the simulated ensemble mean with full terrain is able to reproduce the key observed 24-h rainfall amount and distribution, while the flat-terrain mean falls short in this respect. In fact, 30-h rainfall amounts are reduced by 75% with the removal of topography. Rainfall variability is also significantly increased with the presence of orography. Further analysis shows that the complex interaction between the hurricane and terrain along with contributions from varied microphysics, cumulus parametrization, and planetary boundary layer schemes have a pronounced effect on rainfall and rainfall variability. This study closely follows a previous study, but for a different TC case, Isabel (2003). It is an important sensitivity test for a different TC in a very different environment. This study reveals that the rainfall variability behaves similarly, even with different settings of the environment.

  16. Analysis of covariance as a remedy for demographic mismatch of research subject groups: some sobering simulations.

    PubMed

    Adams, K M; Brown, G G; Grant, I

    1985-08-01

    Analysis of Covariance (ANCOVA) is often used in neuropsychological studies to effect ex-post-facto adjustment of performance variables amongst groups of subjects mismatched on some relevant demographic variable. This paper reviews some of the statistical assumptions underlying this usage. In an attempt to illustrate the complexities of this statistical technique, three sham studies using actual patient data are presented. These staged simulations have varying relationships between group test performance differences and levels of covariate discrepancy. The results were robust and consistent in their nature, and were held to support the wisdom of previous cautions by statisticians concerning the employment of ANCOVA to justify comparisons between incomparable groups. ANCOVA should not be used in neuropsychological research to equate groups unequal on variables such as age and education or to exert statistical control whose objective is to eliminate consideration of the covariate as an explanation for results. Finally, the report advocates by example the use of simulation to further our understanding of neuropsychological variables.
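    For readers unfamiliar with the usage being critiqued, the sketch below shows the model form in question: a group comparison adjusted for a covariate (age) on which the groups are mismatched. The data are synthetic, and the example only illustrates the technique; it does not endorse the practice the study cautions against.

```python
# ANCOVA-style adjustment with statsmodels on synthetic data: groups differ in
# age, and performance depends on age only, so the adjusted group effect is small.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 80
group = np.repeat(["patient", "control"], n // 2)
age = np.where(group == "patient", rng.normal(55, 6, n), rng.normal(40, 6, n))  # mismatched covariate
score = 100 - 0.5 * age + rng.normal(0, 5, n)        # performance declines with age only

fit = smf.ols("score ~ C(group) + age",
              data=pd.DataFrame({"score": score, "group": group, "age": age})).fit()
print(fit.params)                                    # adjusted group effect near zero here
```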

  17. The Effect of a Variable Disc Pad Friction Coefficient for the Mechanical Brake System of a Railway Vehicle

    PubMed Central

    Lee, Nam-Jin; Kang, Chul-Goo

    2015-01-01

    A brake hardware-in-the-loop simulation (HILS) system for a railway vehicle is widely applied to estimate and validate braking performance in research studies and field tests. When we develop a simulation model for a full vehicle system, the characteristics of all components are generally properly simplified based on the understanding of each component’s purpose and interaction with other components. The friction coefficient between the brake disc and the pad used in simulations has been conventionally considered constant, and the effect of a variable friction coefficient is ignored with the assumption that the variability affects the performance of the vehicle braking very little. However, the friction coefficient of a disc pad changes significantly within a range due to environmental conditions, and thus, the friction coefficient can affect the performance of the brakes considerably, especially on the wheel slide. In this paper, we apply a variable friction coefficient and analyze the effects of the variable friction coefficient on a mechanical brake system of a railway vehicle. We introduce a mathematical formula for the variable friction coefficient in which the variable friction is represented by two variables and five parameters. The proposed formula is applied to real-time simulations using a brake HILS system, and the effectiveness of the formula is verified experimentally by testing the mechanical braking performance of the brake HILS system. PMID:26267883

  18. The Effect of a Variable Disc Pad Friction Coefficient for the Mechanical Brake System of a Railway Vehicle.

    PubMed

    Lee, Nam-Jin; Kang, Chul-Goo

    2015-01-01

    A brake hardware-in-the-loop simulation (HILS) system for a railway vehicle is widely applied to estimate and validate braking performance in research studies and field tests. When we develop a simulation model for a full vehicle system, the characteristics of all components are generally properly simplified based on the understanding of each component's purpose and interaction with other components. The friction coefficient between the brake disc and the pad used in simulations has been conventionally considered constant, and the effect of a variable friction coefficient is ignored with the assumption that the variability affects the performance of the vehicle braking very little. However, the friction coefficient of a disc pad changes significantly within a range due to environmental conditions, and thus, the friction coefficient can affect the performance of the brakes considerably, especially on the wheel slide. In this paper, we apply a variable friction coefficient and analyze the effects of the variable friction coefficient on a mechanical brake system of a railway vehicle. We introduce a mathematical formula for the variable friction coefficient in which the variable friction is represented by two variables and five parameters. The proposed formula is applied to real-time simulations using a brake HILS system, and the effectiveness of the formula is verified experimentally by testing the mechanical braking performance of the brake HILS system.
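    The paper's formula itself is not reproduced in the abstract, so the function below is only an illustrative parametric form with two inputs (sliding speed and pad pressure) and five parameters, to show the kind of dependence a brake HILS model might use; it is not the authors' equation.

```python
# Hypothetical disc-pad friction model: two inputs (v, p) and five parameters
# (mu0, a, b, c, p_ref); the functional form and values are illustrative only.
import numpy as np

def friction_coefficient(v, p, mu0=0.35, a=0.08, b=0.015, c=0.12, p_ref=1.0e6):
    """Speed-fade term plus a pressure-dependent deviation from a reference pressure."""
    speed_fade = a * (1.0 - np.exp(-b * v))     # friction drops as sliding speed rises
    pressure_term = c * (p / p_ref - 1.0)       # deviation from the reference contact pressure
    return mu0 - speed_fade + pressure_term

print(friction_coefficient(v=20.0, p=1.2e6))    # e.g. 20 m/s sliding speed, 1.2 MPa
```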

  19. Impact of the semidiurnal lunar tide on the midlatitude thermospheric wind and ionosphere during sudden stratosphere warmings

    NASA Astrophysics Data System (ADS)

    Pedatella, N. M.; Maute, A.

    2015-12-01

    Variability of the midlatitude ionosphere and thermosphere during the 2009 and 2013 sudden stratosphere warmings (SSWs) is investigated in the present study using a combination of Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) observations and thermosphere-ionosphere-mesosphere electrodynamics general circulation model (TIME-GCM) simulations. Both the COSMIC observations and TIME-GCM simulations reveal perturbations in the F region peak height (hmF2) at Southern Hemisphere midlatitudes during SSW time periods. The perturbations are ˜20-30 km, which corresponds to 10-20% variability of the background mean hmF2. The TIME-GCM simulations and COSMIC observations of the hmF2 variability are in overall good agreement, and the simulations can thus be used to understand the physical processes responsible for the hmF2 variability. Through comparison of simulations with and without the migrating semidiurnal lunar tide (M2), we conclude that the midlatitude hmF2 variability is primarily driven by the propagation of the M2 into the thermosphere where it modulates the field-aligned neutral winds, which in turn raise and lower the F region peak height. Though there are subtle differences, the consistency of the behavior between the 2009 and 2013 SSWs suggests that variability in the Southern Hemisphere midlatitude ionosphere and thermosphere is a consistent feature of the SSW impact on the upper atmosphere.

  20. Quantifying the Contribution of Wind-Driven Linear Response to the Seasonal and Interannual Variability of AMOC Volume Transports Across 26.5°N

    NASA Astrophysics Data System (ADS)

    Shimizu, K.; von Storch, J. S.; Haak, H.; Nakayama, K.; Marotzke, J.

    2014-12-01

    Surface wind stress is considered to be an important forcing of the seasonal and interannual variability of Atlantic Meridional Overturning Circulation (AMOC) volume transports. A recent study showed that even linear response to wind forcing captures observed features of the mean seasonal cycle. However, the study did not assess the contribution of wind-driven linear response in realistic conditions against the RAPID/MOCHA array observation or Ocean General Circulation Model (OGCM) simulations, because it applied a linear two-layer model to the Atlantic assuming constant upper layer thickness and density difference across the interface. Here, we quantify the contribution of wind-driven linear response to the seasonal and interannual variability of AMOC transports by comparing wind-driven linear simulations under realistic continuous stratification against the RAPID observation and OGCM (MPI-OM) simulations with 0.4° resolution (TP04) and 0.1° resolution (STORM). All the linear and MPI-OM simulations capture more than 60% of the variance in the observed mean seasonal cycle of the Upper Mid-Ocean (UMO) and Florida Strait (FS) transports, two components of the upper branch of the AMOC. The linear and TP04 simulations also capture 25-40% of the variance in the observed transport time series between Apr 2004 and Oct 2012; the STORM simulation does not capture the observed variance because of the stochastic signal in both datasets. Comparison of half-overlapping 12-month-long segments reveals some periods when the linear and TP04 simulations capture 40-60% of the observed variance, as well as other periods when the simulations capture only 0-20% of the variance. These results show that wind-driven linear response is a major contributor to the seasonal and interannual variability of the UMO and FS transports, and that its contribution varies on an interannual timescale, probably due to the variability of stochastic processes.

  1. Patient safety during assistant propelled wheelchair transfers: the effect of the seat cushion on risk of falling.

    PubMed

    Okunribido, Olanrewaju O

    2013-01-01

    This article is a report of a study of the effect of the seat cushion on risk of falling from a wheelchair. Two laboratory studies and simulated assistant propelled wheelchair transfers were conducted with four healthy female participants. For the laboratory studies there were three independent variables: trunk posture (upright/flexed forward), seat cushion (flat polyurethane/propad low profile), and feet condition (dangling/supported), and two dependent variables: occupied wheelchair (wheelchair) center of gravity (CG), and stability. For the simulated transfers there was one independent variable: seat cushion (flat polyurethane/propad low profile), and one dependent variable: perception of safety (risk of falling). Results showed that the wheelchair CG was closer to the front wheels, and stability lower for the propad low profile cushion compared to the polyurethane cushion, when the participants sat with their feet dangling. During the simulated transfers, sitting on the propad low profile cushion caused participants to feel more apprehensive (anxious or uneasy) compared to sitting on the polyurethane cushion. The findings can contribute to the assessment of risk and care planning of non-ambulatory wheelchair users.

  2. Variable-Speed Simulation of a Dual-Clutch Gearbox Tiltrotor Driveline

    NASA Technical Reports Server (NTRS)

    DeSmidt, Hans; Wang, Kon-Well; Smith, Edward C.; Lewicki, David G.

    2012-01-01

    This investigation explores the variable-speed operation and shift response of a prototypical two-speed dual-clutch transmission tiltrotor driveline in forward flight. Here, a Comprehensive Variable-Speed Rotorcraft Propulsion System Modeling (CVSRPM) tool developed under a NASA funded NRA program is utilized to simulate the drive system dynamics. In this study, a sequential shifting control strategy is analyzed under a steady forward cruise condition. This investigation attempts to build upon previous variable-speed rotorcraft propulsion studies by 1) including a fully nonlinear transient gas-turbine engine model, 2) including clutch stick-slip friction effects, 3) including shaft flexibility, 4) incorporating a basic flight dynamics model to account for interactions with the flight control system. Through exploring the interactions between the various subsystems, this analysis provides important insights into the continuing development of variable-speed rotorcraft propulsion systems.

  3. The Education of Attention as Explanation of Variability of Practice Effects : Learning the Final Approach Phase in a Flight Simulator

    ERIC Educational Resources Information Center

    Huet, Michael; Jacobs, David M.; Camachon, Cyril; Missenard, Olivier; Gray, Rob; Montagne, Gilles

    2011-01-01

    The present study reports two experiments in which a total of 20 participants without prior flight experience practiced the final approach phase in a fixed-base simulator. All participants received self-controlled concurrent feedback during 180 practice trials. Experiment 1 shows that participants learn more quickly under variable practice…

  4. Stress Inoculation through Cognitive and Biofeedback Training

    DTIC Science & Technology

    2010-12-01

    based on Heart Rate Variability (HRV) with innovative simulation game-based training tools. The training system described here will be implemented on a mobile device... and studies (e.g., Fletcher & Tobias, 2006; Thayer, 2009). HRV Coherence Training for Stress Resilience: Satisfactory performance in stressful

  5. Research on the Diesel Engine with Sliding Mode Variable Structure Theory

    NASA Astrophysics Data System (ADS)

    Ma, Zhexuan; Mao, Xiaobing; Cai, Le

    2018-05-01

    This study constructed a nonlinear mathematical model of the diesel engine high-pressure common rail (HPCR) system through two polynomial fits; the model was treated as a kind of affine nonlinear system. Based on sliding-mode variable structure control (SMVSC) theory, a sliding-mode controller for affine nonlinear systems was designed to control the common rail pressure and the diesel engine's rotational speed. Finally, the designed nonlinear HPCR system was simulated on the MATLAB platform. The simulation results demonstrated that the sliding-mode variable structure control algorithm achieves favourable control performance, overcoming the shortcomings of traditional PID control in overshoot, parameter tuning, system precision, settling time, and rise time.
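    A textbook-style sliding-mode controller for a generic first-order affine nonlinear system x' = f(x) + g(x)u conveys the idea; the plant and gains below are invented and are not the HPCR rail-pressure model of the study.

```python
# Minimal sliding-mode control sketch for an assumed first-order affine plant,
# with a tanh boundary layer to smooth the switching term (illustrative only).
import numpy as np

def f(x):  return -0.5 * x + 0.2 * x ** 2      # assumed drift term
def g(x):  return 1.0 + 0.1 * np.abs(x)        # assumed (positive) input gain

x, x_ref, dt = 0.0, 1.0, 1e-3
k, lam = 2.0, 5.0                              # switching gain and surface slope
for step in range(5000):
    s = lam * (x - x_ref)                      # sliding surface for a first-order plant
    u = (-f(x) - k * np.tanh(s / 0.05)) / g(x) # equivalent control + smoothed switching
    x += dt * (f(x) + g(x) * u)                # Euler step of the closed loop
print(f"state after 5 s: {x:.3f} (reference {x_ref})")
```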

  6. Implementing the DC Mode in Cosmological Simulations with Supercomoving Variables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gnedin, Nickolay Y; Kravtsov, Andrey V; Rudd, Douglas H

    2011-06-02

    As emphasized by previous studies, proper treatment of the density fluctuation on the fundamental scale of a cosmological simulation volume - the 'DC mode' - is critical for accurate modeling of spatial correlations on scales ≳ 10% of the simulation box size. We provide further illustration of the effects of the DC mode on the abundance of halos in small boxes and show that it is straightforward to incorporate this mode in cosmological codes that use the 'supercomoving' variables. The equations governing evolution of dark matter and baryons recast with these variables are particularly simple and include the expansion factor, and hence the effect of the DC mode, explicitly only in the Poisson equation.

  7. Study report on combining diagnostic and therapeutic considerations with subsystem and whole-body simulation

    NASA Technical Reports Server (NTRS)

    Furukawa, S.

    1975-01-01

    Current applications of simulation models for clinical research described include tilt-model simulation of orthostatic intolerance with hemorrhage and modeling of the long-term circulation. Current capabilities include: (1) simulation of analogous pathological states and effects of abnormal environmental stressors by the manipulation of system variables and changing inputs in various sequences; (2) simulation of time courses of responses of controlled variables by the altered inputs and their relationships; (3) simulation of physiological responses to treatment such as isotonic saline transfusion; (4) simulation of the effectiveness of a treatment as well as the effects of complications superimposed on an existing pathological state; and (5) comparison of the effectiveness of various treatments/countermeasures for a given pathological state. The feasibility of applying simulation models to diagnostic and therapeutic research problems is assessed.

  8. SIMRAND I- SIMULATION OF RESEARCH AND DEVELOPMENT PROJECTS

    NASA Technical Reports Server (NTRS)

    Miles, R. F.

    1994-01-01

    The Simulation of Research and Development Projects program (SIMRAND) aids in the optimal allocation of R&D resources needed to achieve project goals. SIMRAND models the system subsets or project tasks as various network paths to a final goal. Each path is described in terms of task variables such as cost per hour, cost per unit, availability of resources, etc. Uncertainty is incorporated by treating task variables as probabilistic random variables. SIMRAND calculates the measure of preference for each alternative network. The networks yielding the highest utility function (or certainty equivalence) are then ranked as the optimal network paths. SIMRAND has been used in several economic potential studies at NASA's Jet Propulsion Laboratory involving solar dish power systems and photovoltaic array construction. However, any project having tasks which can be reduced to equations and related by measures of preference can be modeled. SIMRAND analysis consists of three phases: reduction, simulation, and evaluation. In the reduction phase, analytical techniques from probability theory and simulation techniques are used to reduce the complexity of the alternative networks. In the simulation phase, a Monte Carlo simulation is used to derive statistics on the variables of interest for each alternative network path. In the evaluation phase, the simulation statistics are compared and the networks are ranked in preference by a selected decision rule. The user must supply project subsystems in terms of equations based on variables (for example, parallel and series assembly line tasks in terms of number of items, cost factors, time limits, etc). The associated cumulative distribution functions and utility functions for each variable must also be provided (allowable upper and lower limits, group decision factors, etc). SIMRAND is written in Microsoft FORTRAN 77 for batch execution and has been implemented on an IBM PC series computer operating under DOS.
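    The evaluation phase described above amounts to Monte Carlo sampling of task-level variables for each alternative network and ranking the alternatives by a decision rule. The sketch below uses invented task distributions and a lowest-expected-cost rule as stand-ins for the user-supplied equations and utility functions SIMRAND expects; it is not SIMRAND itself.

```python
# SIMRAND-style evaluation sketch: sample task costs for alternative network
# paths, then rank alternatives by expected cost (all distributions invented).
import numpy as np

rng = np.random.default_rng(8)

networks = {
    "path_A": lambda: rng.normal(120, 15) + rng.uniform(10, 30),      # two tasks in series
    "path_B": lambda: max(rng.normal(90, 30), rng.normal(100, 10)),   # two tasks in parallel
}

def simulate(cost_fn, reps=10_000):
    draws = np.array([cost_fn() for _ in range(reps)])
    return draws.mean(), draws.std()

ranked = sorted((simulate(fn), name) for name, fn in networks.items())
for (mean, sd), name in ranked:
    print(f"{name}: expected cost {mean:.1f} (sd {sd:.1f})")
```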

  9. Sensitivity of summer ensembles of fledgling superparameterized U.S. mesoscale convective systems to cloud resolving model microphysics and grid configuration

    DOE PAGES

    Elliott, Elizabeth J.; Yu, Sungduk; Kooperman, Gabriel J.; ...

    2016-05-01

    The sensitivities of simulated mesoscale convective systems (MCSs) in the central U.S. to microphysics and grid configuration are evaluated here in a global climate model (GCM) that also permits global-scale feedbacks and variability. Since conventional GCMs do not simulate MCSs, studying their sensitivities in a global framework useful for climate change simulations has not previously been possible. To date, MCS sensitivity experiments have relied on controlled cloud resolving model (CRM) studies with limited domains, which avoid internal variability and neglect feedbacks between local convection and larger-scale dynamics. However, recent work with superparameterized (SP) GCMs has shown that eastward propagating MCS-like events are captured when embedded CRMs replace convective parameterizations. This study uses a SP version of the Community Atmosphere Model version 5 (SP-CAM5) to evaluate MCS sensitivities, applying an objective empirical orthogonal function algorithm to identify MCS-like events, and harmonizing composite storms to account for seasonal and spatial heterogeneity. A five-summer control simulation is used to assess the magnitude of internal and interannual variability relative to 10 sensitivity experiments with varied CRM parameters, including ice fall speed, one-moment and two-moment microphysics, and grid spacing. MCS sensitivities were found to be subtle with respect to internal variability, and indicate that ensembles of over 100 storms may be necessary to detect robust differences in SP-GCMs. Furthermore, these results emphasize that the properties of MCSs can vary widely across individual events, and improving their representation in global simulations with significant internal variability may require comparison to long (multidecadal) time series of observed events rather than single season field campaigns.

  10. Sources and pathways of the upscale effects on the Southern Hemisphere jet in MPAS-CAM4 variable-resolution simulations

    DOE PAGES

    Sakaguchi, Koichi; Lu, Jian; Leung, L. Ruby; ...

    2016-10-22

    Impacts of regional grid refinement on large-scale circulations (“upscale effects”) were detected in a previous study that used the Model for Prediction Across Scales-Atmosphere coupled to the physics parameterizations of the Community Atmosphere Model version 4. The strongest upscale effect was identified in the Southern Hemisphere jet during austral winter. This study examines the detailed underlying processes by comparing two simulations at quasi-uniform resolutions of 30 and 120 km to three variable-resolution simulations in which the horizontal grids are regionally refined to 30 km in North America, South America, or Asia from 120 km elsewhere. In all the variable-resolution simulations, precipitation increases in convective areas inside the high-resolution domains, as in the reference quasi-uniform high-resolution simulation. With grid refinement encompassing the tropical Americas, the increased condensational heating expands the local divergent circulations (Hadley cell) meridionally such that their descending branch is shifted poleward, which also pushes the baroclinically unstable regions, momentum flux convergence, and the eddy-driven jet poleward. This teleconnection pathway is not found in the reference high-resolution simulation due to a strong resolution sensitivity of cloud radiative forcing that dominates the aforementioned teleconnection signals. The regional refinement over Asia enhances Rossby wave sources and strengthens the upper level southerly flow, both facilitating the cross-equatorial propagation of stationary waves. Evidence indicates that this teleconnection pathway is also found in the reference high-resolution simulation. Lastly, the result underlines the intricate diagnoses needed to understand the upscale effects in global variable-resolution simulations, with implications for science investigations using the computationally efficient modeling framework.

  11. Sources and pathways of the upscale effects on the Southern Hemisphere jet in MPAS-CAM4 variable-resolution simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sakaguchi, Koichi; Lu, Jian; Leung, L. Ruby

    Impacts of regional grid refinement on large-scale circulations (“upscale effects”) were detected in a previous study that used the Model for Prediction Across Scales-Atmosphere coupled to the physics parameterizations of the Community Atmosphere Model version 4. The strongest upscale effect was identified in the Southern Hemisphere jet during austral winter. This study examines the detailed underlying processes by comparing two simulations at quasi-uniform resolutions of 30 and 120 km to three variable-resolution simulations in which the horizontal grids are regionally refined to 30 km in North America, South America, or Asia from 120 km elsewhere. In all the variable-resolution simulations, precipitation increases in convective areas inside the high-resolution domains, as in the reference quasi-uniform high-resolution simulation. With grid refinement encompassing the tropical Americas, the increased condensational heating expands the local divergent circulations (Hadley cell) meridionally such that their descending branch is shifted poleward, which also pushes the baroclinically unstable regions, momentum flux convergence, and the eddy-driven jet poleward. This teleconnection pathway is not found in the reference high-resolution simulation due to a strong resolution sensitivity of cloud radiative forcing that dominates the aforementioned teleconnection signals. The regional refinement over Asia enhances Rossby wave sources and strengthens the upper level southerly flow, both facilitating the cross-equatorial propagation of stationary waves. Evidence indicates that this teleconnection pathway is also found in the reference high-resolution simulation. Lastly, the result underlines the intricate diagnoses needed to understand the upscale effects in global variable-resolution simulations, with implications for science investigations using the computationally efficient modeling framework.

  12. Statistical Analysis of Large Simulated Yield Datasets for Studying Climate Effects

    NASA Technical Reports Server (NTRS)

    Makowski, David; Asseng, Senthold; Ewert, Frank; Bassu, Simona; Durand, Jean-Louis; Martre, Pierre; Adam, Myriam; Aggarwal, Pramod K.; Angulo, Carlos; Baron, Chritian

    2015-01-01

    Many studies have been carried out during the last decade to study the effect of climate change on crop yields and other key crop characteristics. In these studies, one or several crop models were used to simulate crop growth and development for different climate scenarios that correspond to different projections of atmospheric CO2 concentration, temperature, and rainfall changes (Semenov et al., 1996; Tubiello and Ewert, 2002; White et al., 2011). The Agricultural Model Intercomparison and Improvement Project (AgMIP; Rosenzweig et al., 2013) builds on these studies with the goal of using an ensemble of multiple crop models in order to assess effects of climate change scenarios for several crops in contrasting environments. These studies generate large datasets, including thousands of simulated crop yield data. They include series of yield values obtained by combining several crop models with different climate scenarios that are defined by several climatic variables (temperature, CO2, rainfall, etc.). Such datasets potentially provide useful information on the possible effects of different climate change scenarios on crop yields. However, it is sometimes difficult to analyze these datasets and to summarize them in a useful way due to their structural complexity; simulated yield data can differ among contrasting climate scenarios, sites, and crop models. Another issue is that it is not straightforward to extrapolate the results obtained for the scenarios to alternative climate change scenarios not initially included in the simulation protocols. Additional dynamic crop model simulations for new climate change scenarios are an option but this approach is costly, especially when a large number of crop models are used to generate the simulated data, as in AgMIP. Statistical models have been used to analyze responses of measured yield data to climate variables in past studies (Lobell et al., 2011), but the use of a statistical model to analyze yields simulated by complex process-based crop models is a rather new idea. We demonstrate herewith that statistical methods can play an important role in analyzing simulated yield data sets obtained from the ensembles of process-based crop models. Formal statistical analysis is helpful to estimate the effects of different climatic variables on yield, and to describe the between-model variability of these effects.
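    The following minimal sketch illustrates the kind of statistical surrogate discussed above: a linear model fitted to an ensemble of simulated yields as a function of temperature, rainfall, and CO2 changes, with a separate intercept for each crop model. The synthetic data, the linear form, and all names are hypothetical placeholders rather than the AgMIP protocol.

      # Sketch of fitting a statistical surrogate to simulated yields from an ensemble of
      # crop models and climate scenarios. Data are synthetic; the linear form is an
      # illustration only.
      import numpy as np

      rng = np.random.default_rng(0)
      n_models, n_scen = 5, 40

      # Hypothetical scenario definitions: temperature change (K), rainfall change (%), CO2 (ppm).
      dT  = rng.uniform(0.0, 5.0,  size=n_scen)
      dP  = rng.uniform(-20., 20., size=n_scen)
      co2 = rng.uniform(360., 700., size=n_scen)

      # Synthetic "simulated" yields: each crop model gets its own offset and noise.
      model_offset = rng.normal(0.0, 0.4, size=n_models)
      rows = []
      for m in range(n_models):
          y = (6.0 - 0.35 * dT + 0.02 * dP + 0.004 * (co2 - 360.)
               + model_offset[m] + rng.normal(0, 0.3, n_scen))
          rows.append(np.column_stack([np.full(n_scen, m), dT, dP, co2, y]))
      data = np.vstack(rows)

      # Design matrix: one intercept per crop model (between-model effect) plus climate terms.
      m_idx = data[:, 0].astype(int)
      X = np.column_stack([
          np.eye(n_models)[m_idx],            # per-model intercepts
          data[:, 1], data[:, 2], data[:, 3]  # dT, dP, CO2
      ])
      beta, *_ = np.linalg.lstsq(X, data[:, 4], rcond=None)

      print("per-model intercepts:", np.round(beta[:n_models], 2))
      print("dT, dP, CO2 effects :", np.round(beta[n_models:], 4))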

  13. Designed experiment evaluation of key variables affecting the cutting performance of rotary instruments.

    PubMed

    Funkenbusch, Paul D; Rotella, Mario; Ercoli, Carlo

    2015-04-01

    Laboratory studies of tooth preparation are often performed under a limited range of conditions involving single values for all variables other than the 1 being tested. In contrast, in clinical settings not all variables can be tightly controlled. For example, a new dental rotary cutting instrument may be tested in the laboratory by making a specific cut with a fixed force, but in clinical practice, the instrument must make different cuts with individual dentists applying a range of different forces. Therefore, the broad applicability of laboratory results to diverse clinical conditions is uncertain and the comparison of effects across studies is difficult. The purpose of this study was to examine the effect of 9 process variables on dental cutting in a single experiment, allowing each variable to be robustly tested over a range of values for the other 8 and permitting a direct comparison of the relative importance of each on the cutting process. The effects of 9 key process variables on the efficiency of a simulated dental cutting operation were measured. A fractional factorial experiment was conducted by using a computer-controlled, dedicated testing apparatus to simulate dental cutting procedures and Macor blocks as the cutting substrate. Analysis of Variance (ANOVA) was used to judge the statistical significance (α=.05). Five variables consistently produced large, statistically significant effects (target applied load, cut length, starting rpm, diamond grit size, and cut type), while 4 variables produced relatively small, statistically insignificant effects (number of cooling ports, rotary cutting instrument diameter, disposability, and water flow rate). The control exerted by the dentist, simulated in this study by targeting a specific level of applied force, was the single most important factor affecting cutting efficiency. Cutting efficiency was also significantly affected by factors simulating patient/clinical circumstances as well as hardware choices. These results highlight the importance of local clinical conditions (procedure, dentist) in understanding dental cutting procedures and in designing adequate experimental methodologies for future studies. Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
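    A hedged sketch of main-effect estimation for a two-level designed experiment of this kind is given below; the factor names, the response model, and the use of a full rather than fractional design are simplifying assumptions for illustration.

      # Sketch of main-effect estimation and per-factor ANOVA for a two-level designed
      # experiment, in the spirit of the fractional factorial described above. The design,
      # the synthetic response, and the factor names are placeholders.
      import numpy as np
      from itertools import product
      from scipy.stats import f_oneway

      rng = np.random.default_rng(1)
      factors = ["load", "cut_length", "rpm", "grit", "coolant_flow"]

      # Full 2^5 design here for simplicity (a real fractional design uses a subset of runs).
      design = np.array(list(product([-1, 1], repeat=len(factors))), dtype=float)

      # Synthetic cutting-rate response: a few factors matter, the rest are noise.
      true_effects = np.array([2.0, 1.2, 0.8, 0.6, 0.05])
      response = 10.0 + design @ true_effects + rng.normal(0, 0.5, len(design))

      for j, name in enumerate(factors):
          hi, lo = response[design[:, j] > 0], response[design[:, j] < 0]
          effect = hi.mean() - lo.mean()                 # main effect (high minus low)
          f_stat, p_val = f_oneway(hi, lo)               # one-way ANOVA on this factor
          print(f"{name:13s} effect = {effect:+.2f}  F = {f_stat:6.1f}  p = {p_val:.3g}")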

  14. Ground-Motion Variability for a Strike-Slip Earthquake from Broadband Ground-Motion Simulations

    NASA Astrophysics Data System (ADS)

    Iwaki, A.; Maeda, T.; Morikawa, N.; Fujiwara, H.

    2016-12-01

    One of the important issues in seismic hazard analysis is the evaluation of ground-motion variability due to the epistemic and aleatory uncertainties in various aspects of ground-motion simulations. This study investigates the within-event ground-motion variability in broadband ground-motion simulations for strike-slip events. We conduct ground-motion simulations for a past event (the 2000 MW 6.6 Tottori earthquake) using a set of characterized source models (e.g. Irikura and Miyake, 2011) considering aleatory variability. Broadband ground motion is computed by a hybrid approach that combines a 3D finite-difference method (> 1 s) and the stochastic Green's function method (< 1 s), using the 3D velocity model J-SHIS v2. We consider various locations of the asperities, which are defined as the regions with large slip and stress drop within the fault, and of the rupture nucleation point (hypocenter). Ground motion records at 29 K-NET and KiK-net stations are used to validate our simulations. By comparing the simulated and observed ground motion, we found that the performance of the simulations is acceptable under the condition that the source parameters are poorly constrained. In addition to the observation stations, we set 318 virtual receivers at 10 km spatial intervals for statistical analysis of the simulated ground motion. The maximum fault distance is 160 km. The standard deviation (SD) of the simulated acceleration response spectra (Sa, 5% damped) of the RotD50 component (Boore, 2010) is investigated at each receiver. SD from 50 different patterns of asperity locations is generally smaller than 0.15 in terms of log10 (0.34 in natural log). It shows dependence on distance at periods shorter than 1 s; SD increases as the distance decreases. On the other hand, SD from 39 different hypocenter locations is also smaller than 0.15 in log10, and shows azimuthal dependence at long periods; it increases as the rupture directivity parameter Xcosθ (Somerville et al. 1997) increases at periods longer than 1 s. The characteristics of ground-motion variability inferred from simulations can provide information on variability in simulation-based seismic hazard assessment for future earthquakes. We will further investigate the variability associated with other source parameters, such as rupture velocity and short-period level.
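    The variability statistic described above can be computed as in the following sketch: the standard deviation of log10 spectral acceleration across rupture-scenario realizations at each receiver. The response-spectra array is synthetic and the array shapes are assumptions.

      # Sketch of the variability statistic above: SD of log10 Sa across rupture-scenario
      # realizations at each receiver and period. The Sa values are synthetic placeholders.
      import numpy as np

      rng = np.random.default_rng(2)
      n_scenarios, n_receivers, n_periods = 50, 318, 20   # e.g. 50 asperity-location patterns

      # Synthetic 5%-damped Sa (g): lognormal scatter about a receiver-dependent median.
      median_sa = rng.uniform(0.05, 0.5, size=(n_receivers, n_periods))
      sa = median_sa * np.exp(rng.normal(0.0, 0.3, size=(n_scenarios, n_receivers, n_periods)))

      # Within-event variability from source-parameter perturbations, per receiver and period.
      sd_log10 = np.std(np.log10(sa), axis=0, ddof=1)     # shape (n_receivers, n_periods)
      print("median SD over receivers/periods: %.3f (log10 units), %.3f (natural log)"
            % (np.median(sd_log10), np.median(sd_log10) * np.log(10)))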

  15. Simulation of crop yield variability by improved root-soil-interaction modelling

    NASA Astrophysics Data System (ADS)

    Duan, X.; Gayler, S.; Priesack, E.

    2009-04-01

    Understanding the processes and factors that govern within-field variability in crop yield has gained great importance due to applications in precision agriculture. Crop response to the environment at field scale is a complex dynamic process involving the interactions of soil characteristics, weather conditions and crop management. The numerous static factors combined with temporal variations make it very difficult to identify and manage the variability pattern. Therefore, crop simulation models are considered useful tools for analyzing separately the effects of changes in soil or weather conditions on the spatial variability, in order to identify the causes of yield variability and to quantify the spatial and temporal variation. However, tests showed that common crop models such as CERES-Wheat and CERES-Maize were not able to quantify the observed within-field yield variability, while their performance in crop growth simulation under more homogeneous and mainly non-limiting conditions was sufficient to simulate average yields at the field scale. On a study site in South Germany, within-field variability in crop growth has been documented for years. After detailed analysis and classification of the soil patterns, two site-specific factors, plant-available water and O2 deficiency, were considered the main causes of the crop growth variability in this field. Based on our measurements of root distribution in the soil profile, we hypothesize that in our case the inability of the applied crop models to simulate the yield variability can be due to the oversimplification of the underlying root models, which fail to be sensitive to different soil conditions. In this study, the root growth model described by Jones et al. (1991) was adapted by using data of root distributions in the field and linking the adapted root model to the CERES crop model. The ability of the new root model to increase the sensitivity of the CERES crop models to different environmental conditions was then evaluated by comparing the simulation results with measured data and by scenario calculations.

  16. Combined effects of constant versus variable intensity simulated rainfall and reduced tillage management on cotton preemergence herbicide runoff.

    PubMed

    Potter, Thomas L; Truman, Clint C; Strickland, Timothy C; Bosch, David D; Webster, Theodore M; Franklin, Dorcas H; Bednarz, Craig W

    2006-01-01

    Pesticide runoff research relies heavily on rainfall simulation experiments. Most are conducted at a constant intensity, i.e., at a fixed rainfall rate; however, large differences in natural rainfall intensity are common. To assess the implications, we quantified runoff of two herbicides, fluometuron and pendimethalin, applied preemergence after planting cotton on Tifton loamy sand. Rainfall with constant and variable intensity patterns representative of late spring thunderstorms in the Atlantic Coastal Plain region of Georgia (USA) was simulated on 6-m2 plots under strip- (ST) and conventional-tillage (CT) management. The variable pattern produced significantly higher runoff rates of both compounds from CT but not ST plots. However, on an event basis, runoff totals (% applied) were not significantly different, with one exception: fluometuron runoff from CT plots. There was about 25% more fluometuron runoff with the variable versus the constant intensity pattern (P = 0.10). Study results suggest that conducting simulations with variable-intensity storm patterns may provide more representative rainfall simulation-based estimates of pesticide runoff and that the greatest impacts will be observed with CT. The study also found significantly more fluometuron in runoff from ST than CT plots. Further work is needed to determine whether this behavior may be generalized to other active ingredients with similar properties [low K(oc) (organic carbon partition coefficient) approximately 100 mL g(-1); high water solubility approximately 100 mg L(-1)]. If so, it should be considered when making tillage-specific herbicide recommendations to reduce runoff potential.

  17. Transition-Tempered Metadynamics Is a Promising Tool for Studying the Permeation of Drug-like Molecules through Membranes.

    PubMed

    Sun, Rui; Dama, James F; Tan, Jeffrey S; Rose, John P; Voth, Gregory A

    2016-10-11

    Metadynamics is an important enhanced sampling technique in molecular dynamics simulation to efficiently explore potential energy surfaces. The recently developed transition-tempered metadynamics (TTMetaD) has been proven to converge asymptotically without sacrificing exploration of the collective variable space in the early stages of simulations, unlike other convergent metadynamics (MetaD) methods. We have applied TTMetaD to study the permeation of drug-like molecules through a lipid bilayer to further investigate the usefulness of this method as applied to problems of relevance to medicinal chemistry. First, ethanol permeation through a lipid bilayer was studied to compare TTMetaD with nontempered metadynamics and well-tempered metadynamics. The bias energies computed from various metadynamics simulations were compared to the potential of mean force calculated from umbrella sampling. Though all of the MetaD simulations agree with one another asymptotically, TTMetaD is able to predict the most accurate and reliable estimate of the potential of mean force for permeation in the early stages of the simulations and is robust to the choice of required additional parameters. We also show that using multiple randomly initialized replicas allows convergence analysis and also provides an efficient means to converge the simulations in shorter wall times and, more unexpectedly, in shorter CPU times; splitting the CPU time between multiple replicas appears to lead to less overall error. After validating the method, we studied the permeation of a more complicated drug-like molecule, trimethoprim. Three sets of TTMetaD simulations with different choices of collective variables were carried out, and all converged within feasible simulation time. The minimum free energy paths showed that TTMetaD was able to predict almost identical permeation mechanisms in each case despite significantly different definitions of collective variables.

  18. Impact of internal variability on projections of Sahel precipitation change

    NASA Astrophysics Data System (ADS)

    Monerie, Paul-Arthur; Sanchez-Gomez, Emilia; Pohl, Benjamin; Robson, Jon; Dong, Buwen

    2017-11-01

    The impact of the increase of greenhouse gases on Sahelian precipitation is very uncertain in both its spatial pattern and magnitude. In particular, the relative importance of internal variability versus external forcings depends on the time horizon considered in the climate projection. In this study we address the respective roles of the internal climate variability versus external forcings on Sahelian precipitation by using the data from the CESM Large Ensemble Project, which consists of a 40 member ensemble performed with the CESM1-CAM5 coupled model for the period 1920-2100. We show that CESM1-CAM5 is able to simulate the mean and interannual variability of Sahel precipitation, and is representative of a CMIP5 ensemble of simulations (i.e. it simulates the same pattern of precipitation change along with equivalent magnitude and seasonal cycle changes as the CMIP5 ensemble mean). However, CESM1-CAM5 underestimates the long-term decadal variability in Sahel precipitation. For short-term (2010-2049) and mid-term (2030-2069) projections the simulated internal variability component is able to obscure the projected impact of the external forcing. For long-term (2060-2099) projections external forcing induced change becomes stronger than simulated internal variability. Precipitation changes are found to be more robust over the central Sahel than over the western Sahel, where climate change effects struggle to emerge. Ten (thirty) members are needed to separate the 10 year averaged forced response from climate internal variability response in the western Sahel for a long-term (short-term) horizon. Over the central Sahel two members (ten members) are needed for a long-term (short-term) horizon.
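    A simple back-of-the-envelope version of the member-count estimate described above requires the forced signal to exceed roughly two standard errors of the N-member ensemble mean; the sketch below implements this rule with placeholder numbers, not the study's actual values.

      # Sketch of estimating how many ensemble members are needed before a forced
      # precipitation change emerges from internal variability: require the signal to
      # exceed ~2 standard errors of the N-member mean. All numbers are placeholders.
      import numpy as np

      def members_needed(signal, sigma_internal, z=2.0):
          """Smallest N with |signal| > z * sigma_internal / sqrt(N)."""
          return int(np.ceil((z * sigma_internal / abs(signal)) ** 2))

      # Hypothetical 10-yr-mean precipitation change and member-to-member spread (mm/day).
      cases = {
          "western Sahel, long-term":  (0.15, 0.25),
          "central Sahel, long-term":  (0.40, 0.25),
          "western Sahel, short-term": (0.09, 0.25),
      }
      for name, (sig, sd) in cases.items():
          print(f"{name:27s}: ~{members_needed(sig, sd)} members")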

  19. Internal variability of a dynamically downscaled climate over North America

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jiali; Bessac, Julie; Kotamarthi, Rao

    This study investigates the internal variability (IV) of a regional climate model, and considers the impacts of horizontal resolution and spectral nudging on the IV. A 16-member simulation ensemble was conducted using the Weather Research Forecasting model for three model configurations. Ensemble members included simulations at spatial resolutions of 50 km and 12 km without spectral nudging and simulations at a spatial resolution of 12 km with spectral nudging. All the simulations were generated over the same domain, which covered much of North America. The degree of IV was measured as the spread between the individual members of the ensemble during the integration period. The IV of the 12 km simulation with spectral nudging was also compared with a future climate change simulation projected by the same model configuration. The variables investigated focus on precipitation and near-surface air temperature. While the IVs show a clear annual cycle with larger values in summer and smaller values in winter, the seasonal IV is smaller for a 50-km spatial resolution than for a 12-km resolution when nudging is not applied. Applying a nudging technique to the 12-km simulation reduces the IV by a factor of two, and produces smaller IV than the simulation at 50 km without nudging. Applying a nudging technique also changes the geographic distributions of IV in all examined variables. The IV is much smaller than the inter-annual variability at seasonal scales for regionally averaged temperature and precipitation. The IV is also smaller than the projected changes in air-temperature for the mid- and late 21st century. However, the IV is larger than the projected changes in precipitation for the mid- and late 21st century.
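    The internal-variability metric used here, the spread across ensemble members, can be computed as in the sketch below; the ensemble arrays and configuration labels are synthetic stand-ins, not model output.

      # Sketch of the IV metric above: member-to-member spread at each time, compared
      # between model configurations. Arrays and configuration names are assumptions.
      import numpy as np

      rng = np.random.default_rng(3)
      n_members, n_times = 16, 365          # e.g. daily, domain-averaged precipitation

      ensembles = {
          "50km_no_nudging": 2.0 + rng.normal(0, 0.6, (n_members, n_times)),
          "12km_no_nudging": 2.0 + rng.normal(0, 0.8, (n_members, n_times)),
          "12km_nudged":     2.0 + rng.normal(0, 0.4, (n_members, n_times)),
      }

      for name, ens in ensembles.items():
          iv = ens.std(axis=0, ddof=1)      # spread across members at each time step
          print(f"{name:16s} mean IV = {iv.mean():.2f}, max IV = {iv.max():.2f}")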

  20. Internal variability of a dynamically downscaled climate over North America

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jiali; Bessac, Julie; Kotamarthi, Rao

    This study investigates the internal variability (IV) of a regional climate model, and considers the impacts of horizontal resolution and spectral nudging on the IV. A 16-member simulation ensemble was conducted using the Weather Research Forecasting model for three model configurations. Ensemble members included simulations at spatial resolutions of 50 and 12 km without spectral nudging and simulations at a spatial resolution of 12 km with spectral nudging. All the simulations were generated over the same domain, which covered much of North America. The degree of IV was measured as the spread between the individual members of the ensemble during the integration period. The IV of the 12 km simulation with spectral nudging was also compared with a future climate change simulation projected by the same model configuration. The variables investigated focus on precipitation and near-surface air temperature. While the IVs show a clear annual cycle with larger values in summer and smaller values in winter, the seasonal IV is smaller for a 50-km spatial resolution than for a 12-km resolution when nudging is not applied. Applying a nudging technique to the 12-km simulation reduces the IV by a factor of two, and produces smaller IV than the simulation at 50 km without nudging. Applying a nudging technique also changes the geographic distributions of IV in all examined variables. The IV is much smaller than the inter-annual variability at seasonal scales for regionally averaged temperature and precipitation. The IV is also smaller than the projected changes in air-temperature for the mid- and late twenty-first century. However, the IV is larger than the projected changes in precipitation for the mid- and late twenty-first century.

  1. Internal variability of a dynamically downscaled climate over North America

    NASA Astrophysics Data System (ADS)

    Wang, Jiali; Bessac, Julie; Kotamarthi, Rao; Constantinescu, Emil; Drewniak, Beth

    2018-06-01

    This study investigates the internal variability (IV) of a regional climate model, and considers the impacts of horizontal resolution and spectral nudging on the IV. A 16-member simulation ensemble was conducted using the Weather Research Forecasting model for three model configurations. Ensemble members included simulations at spatial resolutions of 50 and 12 km without spectral nudging and simulations at a spatial resolution of 12 km with spectral nudging. All the simulations were generated over the same domain, which covered much of North America. The degree of IV was measured as the spread between the individual members of the ensemble during the integration period. The IV of the 12 km simulation with spectral nudging was also compared with a future climate change simulation projected by the same model configuration. The variables investigated focus on precipitation and near-surface air temperature. While the IVs show a clear annual cycle with larger values in summer and smaller values in winter, the seasonal IV is smaller for a 50-km spatial resolution than for a 12-km resolution when nudging is not applied. Applying a nudging technique to the 12-km simulation reduces the IV by a factor of two, and produces smaller IV than the simulation at 50 km without nudging. Applying a nudging technique also changes the geographic distributions of IV in all examined variables. The IV is much smaller than the inter-annual variability at seasonal scales for regionally averaged temperature and precipitation. The IV is also smaller than the projected changes in air-temperature for the mid- and late twenty-first century. However, the IV is larger than the projected changes in precipitation for the mid- and late twenty-first century.

  2. Internal variability of a dynamically downscaled climate over North America

    NASA Astrophysics Data System (ADS)

    Wang, Jiali; Bessac, Julie; Kotamarthi, Rao; Constantinescu, Emil; Drewniak, Beth

    2017-09-01

    This study investigates the internal variability (IV) of a regional climate model, and considers the impacts of horizontal resolution and spectral nudging on the IV. A 16-member simulation ensemble was conducted using the Weather Research Forecasting model for three model configurations. Ensemble members included simulations at spatial resolutions of 50 and 12 km without spectral nudging and simulations at a spatial resolution of 12 km with spectral nudging. All the simulations were generated over the same domain, which covered much of North America. The degree of IV was measured as the spread between the individual members of the ensemble during the integration period. The IV of the 12 km simulation with spectral nudging was also compared with a future climate change simulation projected by the same model configuration. The variables investigated focus on precipitation and near-surface air temperature. While the IVs show a clear annual cycle with larger values in summer and smaller values in winter, the seasonal IV is smaller for a 50-km spatial resolution than for a 12-km resolution when nudging is not applied. Applying a nudging technique to the 12-km simulation reduces the IV by a factor of two, and produces smaller IV than the simulation at 50 km without nudging. Applying a nudging technique also changes the geographic distributions of IV in all examined variables. The IV is much smaller than the inter-annual variability at seasonal scales for regionally averaged temperature and precipitation. The IV is also smaller than the projected changes in air-temperature for the mid- and late twenty-first century. However, the IV is larger than the projected changes in precipitation for the mid- and late twenty-first century.

  3. Atmospheric modeling to assess wind dependence in tracer dilution method measurements of landfill methane emissions.

    PubMed

    Taylor, Diane M; Chow, Fotini K; Delkash, Madjid; Imhoff, Paul T

    2018-03-01

    The short-term temporal variability of landfill methane emissions is not well understood due to uncertainty in measurement methods. Significant variability is seen over short-term measurement campaigns with the tracer dilution method (TDM), but this variability may be due in part to measurement error rather than fluctuations in the actual landfill emissions. In this study, landfill methane emissions and TDM-measured emissions are simulated over a real landfill in Delaware, USA using the Weather Research and Forecasting model (WRF) for two emissions scenarios. In the steady emissions scenario, a constant landfill emissions rate is prescribed at each model grid point on the surface of the landfill. In the unsteady emissions scenario, emissions are calculated at each time step as a function of the local surface wind speed, resulting in variable emissions over each 1.5-h measurement period. The simulation output is used to assess the standard deviation and percent error of the TDM-measured emissions. Eight measurement periods are simulated over two different days to look at different conditions. Results show that standard deviation of the TDM-measured emissions does not increase significantly from the steady emissions simulations to the unsteady emissions scenarios, indicating that the TDM may have inherent errors in its prediction of emissions fluctuations. Results also show that TDM error does not increase significantly from the steady to the unsteady emissions simulations. This indicates that introducing variability to the landfill emissions does not increase errors in the TDM at this site. Across all simulations, TDM errors range from -15% to 43%, consistent with the range of errors seen in previous TDM studies. Simulations indicate diurnal variations of methane emissions when wind effects are significant, which may be important when developing daily and annual emissions estimates from limited field data. Copyright © 2017 Elsevier Ltd. All rights reserved.
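    For context, a minimal sketch of the tracer dilution calculation is given below: the methane emission rate is estimated from the known tracer release rate multiplied by the ratio of plume-integrated methane to tracer enhancements (and, for mass units, the molar-mass ratio). The tracer species, release rate, and plume transect are assumptions for illustration.

      # Sketch of a tracer dilution method (TDM) emission estimate from a downwind plume
      # transect. The tracer gas (acetylene), release rate, and plume shape are assumed.
      import numpy as np

      M_CH4, M_C2H2 = 16.04, 26.04          # g/mol
      q_tracer = 0.50                        # known tracer release rate, kg/h (assumed)

      # Synthetic downwind transect: mixing-ratio enhancements above background (ppb).
      x = np.linspace(-300, 300, 121)        # distance along the transect (m)
      d_ch4    = 120.0 * np.exp(-(x / 120.0) ** 2)
      d_tracer =  10.0 * np.exp(-(x / 120.0) ** 2)

      # Ratio of plume-integrated enhancements times the molar-mass ratio gives the emission.
      ratio = np.trapz(d_ch4, x) / np.trapz(d_tracer, x)
      q_ch4 = q_tracer * ratio * (M_CH4 / M_C2H2)
      print(f"estimated CH4 emission: {q_ch4:.2f} kg/h")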

  4. HadCM3 Simulations of ENSO behaviour during the Mid-Pliocene Warm Period

    NASA Astrophysics Data System (ADS)

    Bonham, S. G.; Haywood, A. M.; Lunt, D. J.

    2009-04-01

    It has been suggested that a permanent El Niño state existed during the mid-Pliocene (ca. 3.3 - 3.0 Ma BP), with a west-to-east temperature gradient in the tropical Pacific considerably weaker than today. This is based upon a number of palaeoceanographic studies which have examined the development of the thermocline and SST gradient in the tropical Pacific over the last five million years. This state is now being referred to as El Padre in recognition of the fact that a mean state warming in eastern equatorial Pacific (EEP) SSTs does not necessarily imply the presence of a permanent El Niño. Recent results from mid-Pliocene coupled ocean-atmosphere model simulations have shown clear ENSO variability whilst maintaining the warming in the EEP. This research expands on that work, using the UK Met Office GCM (HadCM3) to examine the behaviour and characteristics of ENSO in two mid-Pliocene simulations (with an open and closed Central American Seaway, CAS) compared with a control pre-industrial run, as well as to produce a detailed profile of the mean state climates. The results shown include timescales of ENSO variability across four regions in the Pacific, as well as frequency, EOF and wavelet analysis. We have also looked at the interaction of ENSO with the annual cycle and the onset of ENSO events, and the interdecadal variability in the simulations. The initial time series produced have shown a greater variability of ENSO during the closed CAS mid-Pliocene simulation, where the system oscillates between events much more frequently than seen in the pre-industrial run. The EOF and wavelet analyses quantify this behaviour, showing that the variability is approximately 15% higher over the central and eastern equatorial Pacific, with a period of oscillation of 2-5 years compared with 4-8 years for the pre-industrial simulation. These results will be compared with those obtained from the second mid-Pliocene simulation (open CAS).
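    The EOF analysis mentioned above amounts to a singular value decomposition of area-weighted anomaly fields, as in the sketch below; the SST field here is random noise, so the resulting patterns carry no physical meaning and serve only to show the procedure.

      # Sketch of an EOF (principal component) analysis of SST anomalies via SVD.
      # The field is synthetic; with real SST the leading mode isolates the ENSO pattern.
      import numpy as np

      rng = np.random.default_rng(4)
      n_time, n_lat, n_lon = 600, 20, 60        # e.g. monthly SST over a tropical Pacific box

      anom = rng.normal(0.0, 0.5, size=(n_time, n_lat, n_lon))   # synthetic anomalies

      lat = np.linspace(-9.5, 9.5, n_lat)
      w = np.sqrt(np.cos(np.deg2rad(lat)))[None, :, None]        # area weighting by latitude
      X = (anom * w).reshape(n_time, -1)
      X -= X.mean(axis=0)                                        # remove the time mean

      U, s, Vt = np.linalg.svd(X, full_matrices=False)
      explained = s**2 / np.sum(s**2)
      eof1 = Vt[0].reshape(n_lat, n_lon)                         # leading spatial pattern
      pc1 = U[:, 0] * s[0]                                       # leading PC time series
      print("fraction of variance in the first three EOFs:", np.round(explained[:3], 3))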

  5. Stochastic Earthquake Rupture Modeling Using Nonparametric Co-Regionalization

    NASA Astrophysics Data System (ADS)

    Lee, Kyungbook; Song, Seok Goo

    2017-09-01

    Accurate predictions of the intensity and variability of ground motions are essential in simulation-based seismic hazard assessment. Advanced simulation-based ground motion prediction methods have been proposed to complement the empirical approach, which suffers from the lack of observed ground motion data, especially in the near-source region for large events. It is important to quantify the variability of the earthquake rupture process for future events and to produce a number of rupture scenario models to capture the variability in simulation-based ground motion predictions. In this study, we improved the previously developed stochastic earthquake rupture modeling method by applying the nonparametric co-regionalization, which was proposed in geostatistics, to the correlation models estimated from dynamically derived earthquake rupture models. The nonparametric approach adopted in this study is computationally efficient and, therefore, enables us to simulate numerous rupture scenarios, including large events ( M > 7.0). It also gives us an opportunity to check the shape of true input correlation models in stochastic modeling after being deformed for permissibility. We expect that this type of modeling will improve our ability to simulate a wide range of rupture scenario models and thereby predict ground motions and perform seismic hazard assessment more accurately.

  6. Performance of Sorghum Varieties under Variable Rainfall in Central Tanzania

    PubMed Central

    Tumbo, S. D.; Kihupi, N. I.; Rwehumbiza, Filbert B.

    2017-01-01

    Rainfall variability has a significant impact on crop production with manifestations in frequent crop failure in semiarid areas. This study used the parameterized APSIM crop model to investigate how rainfall variability may affect yields of improved sorghum varieties based on long-term historical rainfall and projected climate. Analyses of historical rainfall indicate a mix of nonsignificant and significant trends on the onset, cessation, and length of the growing season. The study confirmed that rainfall variability indeed affects yields of improved sorghum varieties. Further analyses of simulated sorghum yields based on seasonal rainfall distribution indicate the concurrence of lower grain yields with the 10-day dry spells during the cropping season. Simulation results for future sorghum response, however, show that impacts of rainfall variability on sorghum will be overridden by temperature increase. We conclude that, in the event where harms imposed by moisture stress in the study area are not abated, even improved sorghum varieties are likely to perform poorly. PMID:28536708

  7. Performance of Sorghum Varieties under Variable Rainfall in Central Tanzania.

    PubMed

    Msongaleli, Barnabas M; Tumbo, S D; Kihupi, N I; Rwehumbiza, Filbert B

    2017-01-01

    Rainfall variability has a significant impact on crop production with manifestations in frequent crop failure in semiarid areas. This study used the parameterized APSIM crop model to investigate how rainfall variability may affect yields of improved sorghum varieties based on long-term historical rainfall and projected climate. Analyses of historical rainfall indicate a mix of nonsignificant and significant trends on the onset, cessation, and length of the growing season. The study confirmed that rainfall variability indeed affects yields of improved sorghum varieties. Further analyses of simulated sorghum yields based on seasonal rainfall distribution indicate the concurrence of lower grain yields with the 10-day dry spells during the cropping season. Simulation results for future sorghum response, however, show that impacts of rainfall variability on sorghum will be overridden by temperature increase. We conclude that, in the event where harms imposed by moisture stress in the study area are not abated, even improved sorghum varieties are likely to perform poorly.

  8. Effect of climate data on simulated carbon and nitrogen balances for Europe

    NASA Astrophysics Data System (ADS)

    Blanke, Jan Hendrik; Lindeskog, Mats; Lindström, Johan; Lehsten, Veiko

    2016-05-01

    In this study, we systematically assess the spatial variability in carbon and nitrogen balance simulations calculated with LPJ-GUESS that is related to the choice of global circulation models (GCMs), representative concentration pathways (RCPs), spatial resolutions, and downscaling methods. We employed a complete factorial design and performed 24 simulations for Europe with different climate input data sets and different combinations of these four factors. Our results reveal that the variability in simulated output in Europe is moderate, with 35.6%-93.5% of the total variability being common among all combinations of factors. The spatial resolution is the most important of the examined factors, explaining 1.5%-10.7% of the total variability, followed by GCMs (0.3%-7.6%), RCPs (0%-6.3%), and downscaling methods (0.1%-4.6%). The higher-order interaction effect, which captures nonlinear relations between the factors and random effects, is pronounced and accounts for 1.6%-45.8% of the total variability. The most distinct hot spots of variability include the mountain ranges in northern Scandinavia and the Alps, and the Iberian Peninsula. Based on our findings, we advise applying models such as LPJ-GUESS at a reasonably high spatial resolution that is supported by the model structure. There is no notable gain in simulations of ecosystem carbon and nitrogen stocks and fluxes from using regionally downscaled climate in preference to bias-corrected, bilinearly interpolated CMIP5 projections.

  9. Arctic Ocean Freshwater: How Robust are Model Simulations

    NASA Technical Reports Server (NTRS)

    Jahn, A.; Aksenov, Y.; deCuevas, B. A.; deSteur, L.; Haekkinen, S.; Hansen, E.; Herbaut, C.; Houssais, M.-N.; Karcher, M.; Kauker, F.

    2012-01-01

    The Arctic freshwater (FW) has been the focus of many modeling studies, due to the potential impact of Arctic FW on the deep water formation in the North Atlantic. A comparison of the hindcasts from ten ocean-sea ice models shows that the simulation of the Arctic FW budget is quite different in the investigated models. While they agree on the general sink and source terms of the Arctic FW budget, the long-term means as well as the variability of the FW export vary among models. The best model-to-model agreement is found for the interannual and seasonal variability of the solid FW export and the solid FW storage, which also agree well with observations. For the interannual and seasonal variability of the liquid FW export, the agreement among models is better for the Canadian Arctic Archipelago (CAA) than for Fram Strait. The reason for this is that models are more consistent in simulating volume flux anomalies than salinity anomalies and volume-flux anomalies dominate the liquid FW export variability in the CAA but not in Fram Strait. The seasonal cycle of the liquid FW export generally shows a better agreement among models than the interannual variability, and compared to observations the models capture the seasonality of the liquid FW export rather well. In order to improve future simulations of the Arctic FW budget, the simulation of the salinity field needs to be improved, so that model results on the variability of the liquid FW export and storage become more robust.

  10. Impacts of climate change and internal climate variability on french rivers streamflows

    NASA Astrophysics Data System (ADS)

    Dayon, Gildas; Boé, Julien; Martin, Eric

    2016-04-01

    The assessment of the impacts of climate change often requires setting up long chains of modeling, from the model used to estimate future greenhouse gas concentrations to the impact model. Throughout the modeling chain, sources of uncertainty accumulate, making the exploitation of results for the development of adaptation strategies difficult. It is proposed here to assess the impacts of climate change on the hydrological cycle over France and the associated uncertainties. The contributions of the uncertainties from the greenhouse gas emission scenario, climate models and internal variability are addressed in this work. To obtain a large ensemble of climate simulations, the study is based on Global Climate Model (GCM) simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5), including several simulations from the same GCM to properly assess uncertainties from internal climate variability. Simulations for the four Representative Concentration Pathways (RCPs) are downscaled with a statistical method developed in a previous study (Dayon et al. 2015). The hydrological system Isba-Modcou is then driven by the downscaling results on an 8 km grid over France. Isba is a land surface model that calculates the energy and water balance, and Modcou is a hydrogeological model that routes the surface runoff given by Isba. Based on that framework, uncertainties from the greenhouse gas emission scenario, climate models and internal climate variability are evaluated. Their relative importance is described for the next decades and the end of this century. In a last part, the uncertainties due to internal climate variability in streamflows simulated with downscaled GCMs and Isba-Modcou are evaluated against observations and hydrological reconstructions over the whole 20th century. Hydrological reconstructions are based on the downscaling of recent atmospheric reanalyses of the 20th century and observations of temperature and precipitation. We show that the multi-decadal variability of streamflows observed in the 20th century is generally weaker in the hydrological simulations done with the historical simulations from climate models. References: Dayon et al. (2015), Transferability in the future climate of a statistical downscaling method for precipitation in France, J. Geophys. Res. Atmos., 120, 1023-1043, doi:10.1002/2014JD022236

  11. Spatial Heterogeneity of Leaf Area Index (LAI) and Its Temporal Course on Arable Land: Combining Field Measurements, Remote Sensing and Simulation in a Comprehensive Data Analysis Approach (CDAA).

    PubMed

    Reichenau, Tim G; Korres, Wolfgang; Montzka, Carsten; Fiener, Peter; Wilken, Florian; Stadler, Anja; Waldhoff, Guido; Schneider, Karl

    2016-01-01

    The ratio of leaf area to ground area (leaf area index, LAI) is an important state variable in ecosystem studies since it influences fluxes of matter and energy between the land surface and the atmosphere. As a basis for generating temporally continuous and spatially distributed datasets of LAI, the current study contributes an analysis of its spatial variability and spatial structure. Soil-vegetation-atmosphere fluxes of water, carbon and energy are nonlinearly related to LAI. Therefore, its spatial heterogeneity, i.e., the combination of spatial variability and structure, has an effect on simulations of these fluxes. To assess LAI spatial heterogeneity, we apply a Comprehensive Data Analysis Approach that combines data from remote sensing (5 m resolution) and simulation (150 m resolution) with field measurements and a detailed land use map. Test area is the arable land in the fertile loess plain of the Rur catchment on the Germany-Belgium-Netherlands border. LAI from remote sensing and simulation compares well with field measurements. Based on the simulation results, we describe characteristic crop-specific temporal patterns of LAI spatial variability. By means of these patterns, we explain the complex multimodal frequency distributions of LAI in the remote sensing data. In the test area, variability between agricultural fields is higher than within fields. Therefore, spatial resolutions less than the 5 m of the remote sensing scenes are sufficient to infer LAI spatial variability. Frequency distributions from the simulation agree better with the multimodal distributions from remote sensing than normal distributions do. The spatial structure of LAI in the test area is dominated by a short distance referring to field sizes. Longer distances that refer to soil and weather can only be derived from remote sensing data. Therefore, simulations alone are not sufficient to characterize LAI spatial structure. It can be concluded that a comprehensive picture of LAI spatial heterogeneity and its temporal course can contribute to the development of an approach to create spatially distributed and temporally continuous datasets of LAI.
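    The spatial structure discussed above (short-range field-scale variability versus longer-range soil and weather controls) is commonly quantified with an empirical semivariogram; the sketch below computes one for a synthetic LAI grid, with the pixel size and field-block structure assumed for illustration.

      # Sketch of an empirical semivariogram, a standard way to quantify the spatial
      # structure (range, sill) of a field such as LAI. The LAI grid is synthetic and the
      # 5 m pixel size is an assumption.
      import numpy as np

      rng = np.random.default_rng(5)
      n, px = 100, 5.0                         # 100 x 100 pixels at 5 m resolution (assumed)

      # Synthetic LAI with field-scale blocks (short-range structure) plus noise.
      blocks = np.kron(rng.uniform(1.0, 5.0, (10, 10)), np.ones((10, 10)))
      lai = blocks + rng.normal(0, 0.2, (n, n))

      yy, xx = np.mgrid[0:n, 0:n]
      coords = np.column_stack([yy.ravel(), xx.ravel()]) * px
      values = lai.ravel()

      # Subsample pixel pairs and bin half the squared differences by separation distance.
      idx = rng.choice(values.size, size=(20000, 2))
      d = np.linalg.norm(coords[idx[:, 0]] - coords[idx[:, 1]], axis=1)
      g = 0.5 * (values[idx[:, 0]] - values[idx[:, 1]]) ** 2

      bins = np.arange(0, 300, 25)
      which = np.digitize(d, bins)
      for b in range(1, len(bins)):
          sel = which == b
          if sel.any():
              print(f"lag {bins[b-1]:3.0f}-{bins[b]:3.0f} m: gamma = {g[sel].mean():.3f}")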

  12. Spatial Heterogeneity of Leaf Area Index (LAI) and Its Temporal Course on Arable Land: Combining Field Measurements, Remote Sensing and Simulation in a Comprehensive Data Analysis Approach (CDAA)

    PubMed Central

    Korres, Wolfgang; Montzka, Carsten; Fiener, Peter; Wilken, Florian; Stadler, Anja; Waldhoff, Guido; Schneider, Karl

    2016-01-01

    The ratio of leaf area to ground area (leaf area index, LAI) is an important state variable in ecosystem studies since it influences fluxes of matter and energy between the land surface and the atmosphere. As a basis for generating temporally continuous and spatially distributed datasets of LAI, the current study contributes an analysis of its spatial variability and spatial structure. Soil-vegetation-atmosphere fluxes of water, carbon and energy are nonlinearly related to LAI. Therefore, its spatial heterogeneity, i.e., the combination of spatial variability and structure, has an effect on simulations of these fluxes. To assess LAI spatial heterogeneity, we apply a Comprehensive Data Analysis Approach that combines data from remote sensing (5 m resolution) and simulation (150 m resolution) with field measurements and a detailed land use map. Test area is the arable land in the fertile loess plain of the Rur catchment on the Germany-Belgium-Netherlands border. LAI from remote sensing and simulation compares well with field measurements. Based on the simulation results, we describe characteristic crop-specific temporal patterns of LAI spatial variability. By means of these patterns, we explain the complex multimodal frequency distributions of LAI in the remote sensing data. In the test area, variability between agricultural fields is higher than within fields. Therefore, spatial resolutions less than the 5 m of the remote sensing scenes are sufficient to infer LAI spatial variability. Frequency distributions from the simulation agree better with the multimodal distributions from remote sensing than normal distributions do. The spatial structure of LAI in the test area is dominated by a short distance referring to field sizes. Longer distances that refer to soil and weather can only be derived from remote sensing data. Therefore, simulations alone are not sufficient to characterize LAI spatial structure. It can be concluded that a comprehensive picture of LAI spatial heterogeneity and its temporal course can contribute to the development of an approach to create spatially distributed and temporally continuous datasets of LAI. PMID:27391858

  13. A Single Column Model Ensemble Approach Applied to the TWP-ICE Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davies, Laura; Jakob, Christian; Cheung, K.

    2013-06-27

    Single column models (SCMs) are useful testbeds for investigating the parameterisation schemes of numerical weather prediction and climate models. The usefulness of SCM simulations is limited, however, by the accuracy of the best-estimate large-scale data prescribed. One method to address this uncertainty is to perform ensemble simulations of the SCM. This study first derives an ensemble of large-scale data for the Tropical Warm Pool International Cloud Experiment (TWP-ICE) based on an estimate of a possible source of error in the best-estimate product. These data are then used to carry out simulations with 11 SCMs and 2 cloud-resolving models (CRMs). Best-estimate simulations are also performed. All models show that moisture-related variables are close to observations and there are limited differences between the best-estimate and ensemble mean values. The models, however, show different sensitivities to changes in the forcing, particularly when weakly forced. The ensemble simulations highlight important differences in the moisture budget between the SCMs and CRMs. Systematic differences are also apparent in the ensemble mean vertical structure of cloud variables. The ensemble is further used to investigate relations between cloud variables and precipitation, identifying large differences between CRMs and SCMs. This study highlights that additional information can be gained by performing ensemble simulations, enhancing the information derived from models using the more traditional single best-estimate simulation.

  14. Does internal variability change in response to global warming? A large ensemble modelling study of tropical rainfall

    NASA Astrophysics Data System (ADS)

    Milinski, S.; Bader, J.; Jungclaus, J. H.; Marotzke, J.

    2017-12-01

    There is some consensus on mean state changes of rainfall under global warming; changes in the internal variability, on the other hand, are more difficult to analyse and have not been discussed as much, despite their importance for understanding changes in extreme events such as droughts or floods. We analyse changes in the rainfall variability in the tropical Atlantic region. We use a 100-member ensemble of historical (1850-2005) model simulations with the Max Planck Institute for Meteorology Earth System Model (MPI-ESM1) to identify changes of internal rainfall variability. To investigate the effects of global warming on the internal variability, we employ an additional ensemble of model simulations with stronger external forcing (1% CO2 increase per year, same integration length as the historical simulations) with 68 ensemble members. The focus of our study is on the oceanic Atlantic ITCZ. We find that the internal variability of rainfall over the tropical Atlantic does change due to global warming and that these changes in variability are larger than changes in the mean state in some regions. By splitting the total variance into patterns of variability, we see that the variability on the southern flank of the ITCZ becomes more dominant, i.e. explains a larger fraction of the total variance in a warmer climate. In agreement with previous studies, we find that changes in the mean state show an increase and narrowing of the ITCZ. The large ensembles allow us to make a statistically robust differentiation between the changes in variability that can be explained by internal variability and those that can be attributed to the external forcing. Furthermore, we argue that internal variability in a transient climate is only well defined in the ensemble domain and not in the temporal domain, which requires the use of a large ensemble.

  15. Estimating degradation in real time and accelerated stability tests with random lot-to-lot variation: a simulation study.

    PubMed

    Magari, Robert T

    2002-03-01

    The effect of different lot-to-lot variability levels on the prediction of stability is studied based on two statistical models for estimating degradation in real-time and accelerated stability tests. Lot-to-lot variability is considered random in both models and is attributed to two sources: variability at time zero and variability of the degradation rate. Real-time stability tests are modeled as a function of time, while accelerated stability tests are modeled as a function of time and temperature. Several data sets were simulated, and a maximum likelihood approach was used for estimation. The 95% confidence intervals for the degradation rate depend on the amount of lot-to-lot variability. When lot-to-lot degradation rate variability is relatively large (CV > or = 8%), the estimated confidence intervals do not represent the trend for individual lots. In such cases it is recommended to analyze each lot individually. Copyright 2002 Wiley-Liss, Inc. and the American Pharmaceutical Association J Pharm Sci 91: 893-899, 2002
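    A minimal sketch of this kind of simulation set-up, assuming lot-specific random intercepts and degradation rates fitted per lot by least squares, is shown below; the variance components, time points, and assay noise are illustrative choices rather than the study's models.

      # Sketch of simulating lots with random starting values and random degradation rates,
      # then fitting each lot by least squares. All variance components are assumptions.
      import numpy as np

      rng = np.random.default_rng(6)
      n_lots, times = 6, np.array([0., 1., 2., 3., 6., 9., 12.])   # months

      mu0, rate = 100.0, -0.8          # population mean at time zero and mean slope (%/month)
      cv_rate = 0.10                   # lot-to-lot CV of the degradation rate (e.g. 10%)

      slopes = []
      for lot in range(n_lots):
          b0 = mu0 + rng.normal(0, 0.5)                         # lot-specific starting value
          b1 = rate * (1 + rng.normal(0, cv_rate))              # lot-specific degradation rate
          y = b0 + b1 * times + rng.normal(0, 0.4, times.size)  # assay noise
          slope, intercept = np.polyfit(times, y, 1)
          slopes.append(slope)
          print(f"lot {lot+1}: fitted rate = {slope:+.3f} %/month")

      slopes = np.array(slopes)
      print(f"across-lot mean rate {slopes.mean():+.3f}, SD {slopes.std(ddof=1):.3f}")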

  16. Analysis on flood generation processes by means of a continuous simulation model

    NASA Astrophysics Data System (ADS)

    Fiorentino, M.; Gioia, A.; Iacobellis, V.; Manfreda, S.

    2006-03-01

    In the present research, we exploited continuous hydrological simulation to investigate the key variables responsible for flood peak formation. With this purpose, a distributed hydrological model (DREAM) is used in cascade with a rainfall generator (IRP, Iterated Random Pulse) to simulate a large number of extreme events, providing insight into the main controls on flood generation mechanisms. The investigated variables are those used in theoretically derived probability distributions of floods based on the concept of partial contributing area (e.g. Iacobellis and Fiorentino, 2000). The continuous simulation model is used to investigate the hydrological losses occurring during extreme events, the variability of the source area contributing to the flood peak, and its lag time. Results suggest interesting simplifications of the theoretical probability distribution of floods according to the different climatic and geomorphologic environments. The study is applied to two basins located in Southern Italy with different climatic characteristics.

  17. Modelling Pseudocalanus elongatus stage-structured population dynamics embedded in a water column ecosystem model for the northern North Sea

    NASA Astrophysics Data System (ADS)

    Moll, Andreas; Stegert, Christoph

    2007-01-01

    This paper outlines an approach to couple a structured zooplankton population model, with state variables for eggs, nauplii, two copepodite stages and adults adapted to Pseudocalanus elongatus, to the complex marine ecosystem model ECOHAM2, whose 13 state variables resolve the carbon and nitrogen cycles. Different temperature and food scenarios derived from laboratory culture studies were examined to improve the process parameterisation of copepod stage-dependent development. To study annual cycles under realistic weather and hydrographic conditions, the coupled ecosystem-zooplankton model is applied to a water column in the northern North Sea. The main ecosystem state variables were validated against observed monthly mean values. Then vertical profiles of selected state variables were compared to the physical forcing to study the differences between representing zooplankton as a single biomass state variable and partitioning it into five population state variables. Simulated generation times are more affected by temperature than by food conditions, except during the spring phytoplankton bloom. Up to six generations within the annual cycle can be discerned in the simulation.
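    The following sketch shows the general shape of such a stage-structured model: stage abundances advanced in time with temperature-dependent development, stage-specific mortality, and egg production by adults. The rates, the Q10 temperature scaling, and the forward-Euler integration are illustrative assumptions, not the parameterisation of the coupled ECOHAM2 model.

      # Sketch of a stage-structured copepod population model: eggs, nauplii, two
      # copepodite classes and adults, with temperature-dependent development.
      # All rates and the Q10 form are illustrative assumptions.
      import numpy as np

      stages = ["eggs", "nauplii", "C1-C3", "C4-C5", "adults"]
      N = np.array([1000., 200., 100., 50., 20.])      # individuals m^-3 (assumed start)

      def dev_rate(base, T, q10=2.0, t_ref=10.0):
          """Temperature-dependent development rate (d^-1), simple Q10 scaling."""
          return base * q10 ** ((T - t_ref) / 10.0)

      base_dev  = np.array([0.5, 0.15, 0.10, 0.08, 0.0])   # transfer out of each stage (d^-1)
      mortality = np.array([0.10, 0.05, 0.04, 0.03, 0.02]) # stage-specific mortality (d^-1)
      egg_rate  = 5.0                                       # eggs per adult per day

      dt, T = 0.25, 12.0                                    # time step (days), temperature (deg C)
      for step in range(int(120 / dt)):                     # 120-day integration
          d = dev_rate(base_dev, T)
          transfer = d * N * dt                             # individuals maturing out of each stage
          N -= transfer + mortality * N * dt
          N[1:] += transfer[:-1]                            # maturation into the next stage
          N[0] += egg_rate * N[-1] * dt                     # egg production by adults

      for name, n in zip(stages, N):
          print(f"{name:8s}: {n:8.1f} ind m^-3 after 120 days")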

  18. Detection and Attribution of Simulated Climatic Extreme Events and Impacts: High Sensitivity to Bias Correction

    NASA Astrophysics Data System (ADS)

    Sippel, S.; Otto, F. E. L.; Forkel, M.; Allen, M. R.; Guillod, B. P.; Heimann, M.; Reichstein, M.; Seneviratne, S. I.; Kirsten, T.; Mahecha, M. D.

    2015-12-01

    Understanding, quantifying and attributing the impacts of climatic extreme events and variability is crucial for societal adaptation in a changing climate. However, climate model simulations generated for this purpose typically exhibit pronounced biases in their output that hinder any straightforward assessment of impacts. To overcome this issue, various bias correction strategies are routinely used to alleviate climate model deficiencies, most of which have been criticized for physical inconsistency and non-preservation of the multivariate correlation structure. We assess how biases and their correction affect the quantification and attribution of simulated extremes and variability in i) climatological variables and ii) impacts on ecosystem functioning as simulated by a terrestrial biosphere model. Our study demonstrates that assessments of simulated climatic extreme events and impacts in the terrestrial biosphere are highly sensitive to bias correction schemes, with major implications for the detection and attribution of these events. We introduce a novel ensemble-based resampling scheme based on a large regional climate model ensemble generated by the distributed weather@home setup [1], which fully preserves the physical consistency and multivariate correlation structure of the model output. We use extreme value statistics to show that this procedure considerably improves the representation of climatic extremes and variability. Subsequently, biosphere-atmosphere carbon fluxes are simulated using a terrestrial ecosystem model (LPJ-GSI) to further demonstrate the sensitivity of ecosystem impacts to the methodology of bias correcting climate model output. We find that uncertainties arising from bias correction schemes are comparable in magnitude to model structural and parameter uncertainties. The present study constitutes a first attempt to alleviate climate model biases in a physically consistent way and demonstrates that this yields improved simulations of climate extremes and associated impacts. [1] http://www.climateprediction.net/weatherathome/
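
    For context, the sketch below shows the kind of standard univariate bias correction (empirical quantile mapping) that the study contrasts with its physically consistent resampling scheme; the resampling scheme itself is not reproduced here. The data are synthetic and all names are placeholders.

```python
import numpy as np

def quantile_map(model_hist, obs, model_new):
    """Empirical quantile mapping: each simulated value is mapped onto the
    observed distribution via its quantile rank in the historical simulation.
    It corrects the marginal distribution of one variable at a time, which is
    exactly why such schemes can break multivariate and physical consistency."""
    ranks = np.searchsorted(np.sort(model_hist), model_new) / len(model_hist)
    return np.quantile(obs, np.clip(ranks, 0.0, 1.0))

rng = np.random.default_rng(1)
obs = rng.normal(15.0, 5.0, 3000)          # pseudo-observations
model_hist = rng.normal(17.0, 4.0, 3000)   # biased model, historical period
model_new = rng.normal(18.0, 4.0, 3000)    # biased model, period to correct
corrected = quantile_map(model_hist, obs, model_new)
print(round(obs.mean(), 2), round(model_new.mean(), 2), round(corrected.mean(), 2))
```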

  19. Impact of variable river water stage on the simulation of groundwater-river interactions over the Upper Rhine Graben hydrosystem

    NASA Astrophysics Data System (ADS)

    Habets, F.; Vergnes, J.

    2013-12-01

    The Upper Rhine alluvial aquifer is an important transboundary water resource that is particularly vulnerable to pollution from the rivers due to anthropogenic activities. A realistic simulation of the groundwater-river exchanges is therefore of crucial importance for effective management of water resources, and hence is the main topic of the NAPROM project financed by the French Ministry of Ecology. Characterization of these fluxes in terms of quantity and spatio-temporal variability depends on the choice made to represent the river water stage in the model. Recently, a coupled surface-subsurface model has been applied to the whole aquifer basin. The river stage was first chosen to be constant over the major part of the basin for the computation of the groundwater-river interactions. The present study aims to introduce a variable river water stage to better simulate these interactions and to quantify the impact of this process on the simulated hydrological variables. The general modeling strategy is based on the Eau-Dyssée modeling platform, which couples existing specialized models to address water resources and quality in regional-scale river basins. In this study, Eau-Dyssée includes the RAPID river routing model and the SAM hydrogeological model. The input data consist of runoff and infiltration coming from a simulation of the ISBA land surface scheme covering the 1986-2003 period. The QtoZ module calculates river stage from simulated river discharges, which is then used to calculate the exchanges between aquifer units and the river. Two approaches are compared. The first one uses rating curves derived from observed river discharges and river stages. The second one is based on Manning's formula, with Manning's parameters defined from geomorphological parametrizations and topographic data based on a Digital Elevation Model (DEM). First results show a relatively good agreement between observed and simulated river water height. Taking into account a variable river stage seems to increase the amount of water exchanged between groundwater and river. Systematic biases are nevertheless found between simulated and observed mean river stage elevation. They show that the primary source of error when simulating river stage - and hence groundwater-river interactions - is the uncertainty associated with the topographic data used to define the riverbed elevation. Thus, this study confirms the need for more accurate DEMs for estimating riverbed elevation and studying groundwater-river interactions, at least at the regional scale.
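
    The second approach mentioned above can be illustrated with a minimal sketch that inverts Manning's formula for an assumed wide rectangular channel; the channel geometry, roughness and riverbed elevation below are placeholders, not values from the Eau-Dyssée configuration.

```python
import numpy as np

def stage_from_discharge(q, width, slope, n_manning=0.035, bed_elevation=0.0):
    """Invert Manning's formula for a wide rectangular channel, where the
    hydraulic radius is approximately the flow depth h:
        Q = (1/n) * (w * h) * h**(2/3) * sqrt(S)  =>  h = (Q*n / (w*sqrt(S)))**(3/5)
    The river stage is the riverbed elevation (e.g. from a DEM) plus h, which is
    where DEM errors propagate directly into the simulated groundwater-river exchange."""
    depth = (q * n_manning / (width * np.sqrt(slope))) ** 0.6
    return bed_elevation + depth

# illustrative values only
print(round(stage_from_discharge(q=300.0, width=120.0, slope=5e-4,
                                 bed_elevation=140.0), 2), "m")
```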

  20. Experiments in pilot decision-making during simulated low visibility approaches

    NASA Technical Reports Server (NTRS)

    Curry, R. E.; Lauber, J. K.; Billings, C. E.

    1975-01-01

    A simulation task is reported which incorporates both kinds of variables, informational and psychological, to successfully study pilot decision making behavior in the laboratory. Preliminary experiments in the measurement of decisions and the inducement of stress in simulated low visibility approaches are described.

  1. Eddy-driven low-frequency variability: physics and observability through altimetry

    NASA Astrophysics Data System (ADS)

    Penduff, Thierry; Sérazin, Guillaume; Arbic, Brian; Mueller, Malte; Richman, James G.; Shriver, Jay F.; Morten, Andrew J.; Scott, Robert B.

    2015-04-01

    Model studies have revealed the propensity of the eddying ocean circulation to generate strong low-frequency variability (LFV) intrinsically, i.e. without low-frequency atmospheric variability. In the present study, gridded satellite altimeter products, idealized quasi-geostrophic (QG) turbulent simulations, and realistic high-resolution global ocean simulations are used to study the spontaneous tendency of mesoscale (relatively high frequency and high wavenumber) kinetic energy to cascade non-linearly towards larger time and space scales. The QG model reveals that large-scale variability, arising from the well-known spatial inverse cascade, is associated with low frequencies. Low-frequency, low-wavenumber energy is maintained primarily by nonlinearities in the QG model, with forcing (by large-scale shear) and friction playing secondary roles. In realistic simulations, nonlinearities also generally drive kinetic energy to low frequencies and low wavenumbers. In some, but not all, regions of the gridded altimeter product, surface kinetic energy is also found to cascade toward low frequencies. Exercises conducted with the realistic model suggest that the spatial and temporal filtering inherent in the construction of gridded satellite altimeter maps may contribute to the discrepancies seen in some regions between the direction of frequency cascade in models versus gridded altimeter maps. Finally, the range of frequencies that are highly energized and engaged in these cascades appears much greater than the range of highly energized and engaged wavenumbers. Global eddying simulations, performed in the context of the CHAOCEAN project in collaboration with the CAREER project, provide estimates of the range of timescales that these oceanic nonlinearities are likely to feed without external variability.

  2. Modeling seasonal variability of fecal coliform in natural surface waters using the modified SWAT

    NASA Astrophysics Data System (ADS)

    Cho, Kyung Hwa; Pachepsky, Yakov A.; Kim, Minjeong; Pyo, JongCheol; Park, Mi-Hyun; Kim, Young Mo; Kim, Jung-Woo; Kim, Joon Ha

    2016-04-01

    Fecal coliforms are indicators of pathogens, and understanding their fate and transport in surface waters is therefore important to protect drinking water sources and public health. We compiled fecal coliform observations from four different sites in the USA and Korea and found a seasonal variability with a significant connection to temperature levels. In all observations, fecal coliform concentrations were relatively higher in summer and lower during the winter season. This could be explained by the seasonal dominance of growth or die-off of bacteria in soil and in-stream. Existing hydrologic models, however, have limitations in simulating the seasonal variability of fecal coliform. Soil and in-stream bacterial modules of the Soil and Water Assessment Tool (SWAT) model are oversimplified in that they exclude simulations of alternating bacterial growth. This study develops a new bacteria subroutine for the SWAT in an attempt to improve its prediction accuracy. We introduced critical temperatures as a parameter to simulate the onset of bacterial growth/die-off and to reproduce the seasonal variability of bacteria. The module developed in this study will improve modeling for environmental management schemes.
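
    A minimal sketch of the critical-temperature idea is given below: a first-order rate that switches sign around an assumed critical temperature, so concentrations grow in summer and decay in winter. Parameter names and values are illustrative assumptions, not the calibrated SWAT subroutine.

```python
import numpy as np

def coliform_step(conc, temp_c, dt=1.0, t_crit=15.0,
                  k_growth=0.02, k_dieoff=0.05, theta=1.07):
    """Advance a fecal coliform concentration (cfu/100 mL) by one day using a
    first-order rate: net growth above an assumed critical temperature and
    temperature-adjusted die-off (Chick's law with a theta correction) below it."""
    if temp_c >= t_crit:
        rate = k_growth * theta ** (temp_c - 20.0)
    else:
        rate = -k_dieoff * theta ** (temp_c - 20.0)
    return conc * np.exp(rate * dt)

# illustrative annual temperature cycle: higher concentrations in summer
temps = 12.0 + 10.0 * np.sin(np.linspace(0.0, 2.0 * np.pi, 365))
conc, series = 1000.0, []
for t in temps:
    conc = coliform_step(conc, t)
    series.append(conc)
print(f"summer peak ~{max(series):.0f}, winter low ~{min(series):.0f} cfu/100 mL")
```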

  3. Multiple-source multiple-harmonic active vibration control of variable section cylindrical structures: A numerical study

    NASA Astrophysics Data System (ADS)

    Liu, Jinxin; Chen, Xuefeng; Gao, Jiawei; Zhang, Xingwu

    2016-12-01

    Air vehicles, space vehicles and underwater vehicles, the cabins of which can be viewed as variable section cylindrical structures, have multiple rotational vibration sources (e.g., engines, propellers, compressors and motors), making the spectrum of noise multiple-harmonic. The suppression of such noise has been a focus of interest in the field of active vibration control (AVC). In this paper, a multiple-source multiple-harmonic (MSMH) active vibration suppression algorithm with a feed-forward structure is proposed based on reference amplitude rectification and the conjugate gradient method (CGM). An AVC simulation scheme called finite element model in-loop simulation (FEMILS) is also proposed for rapid algorithm verification. Numerical studies of AVC are conducted on a variable section cylindrical structure based on the proposed MSMH algorithm and FEMILS scheme. It can be seen from the numerical studies that: (1) the proposed MSMH algorithm can individually suppress each component of the multiple-harmonic noise with a unified and improved convergence rate; (2) the FEMILS scheme is convenient and straightforward for multiple-source simulations with an acceptable loop time. Moreover, the simulations follow a procedure similar to real-life control and can be easily extended to a physical model platform.

  4. Operator’s Manual for Variable Weight, Variable C.G. Helmet Simulator

    DTIC Science & Technology

    1981-09-01

    A variable weight, variable CG helmet simulator has been designed to measure the effect of US Army headgear on muscle...any variable weights in the boxes, is 2.5 lb, slightly less than the weight of most quality crash helmets made by reputable manufacturers. The addition...of variable weights to the boxes can alter the center of gravity to simulate the effect of equipment attached to the outside of a helmet. The

  5. Fluid Structural Analysis of Human Cerebral Aneurysm Using Their Own Wall Mechanical Properties

    PubMed Central

    Valencia, Alvaro; Burdiles, Patricio; Ignat, Miguel; Mura, Jorge; Rivera, Rodrigo; Sordo, Juan

    2013-01-01

    Computational Structural Dynamics (CSD) simulations, Computational Fluid Dynamics (CFD) simulation, and Fluid Structure Interaction (FSI) simulations were carried out in an anatomically realistic model of a saccular cerebral aneurysm with the objective of quantifying the effects of the type of simulation on the principal fluid and solid mechanics results. Eight CSD simulations, one CFD simulation, and four FSI simulations were made. The results allowed the study of the influence of the type of material elements in the solid, the aneurysm's wall thickness, and the type of simulation on the modeling of a human cerebral aneurysm. The simulations use the aneurysm's own wall mechanical properties. The most complex simulation was the fully coupled FSI simulation with hyperelastic Mooney-Rivlin material, normal internal pressure, and normal variable thickness. The FSI simulation coupled in one direction using hyperelastic Mooney-Rivlin material, normal internal pressure, and normal variable thickness is the one that presents the most similar results with respect to this most complex FSI simulation, while requiring one-fourth of the calculation time. PMID:24151523

  6. Cloud-resolving model intercomparison of an MC3E squall line case: Part I-Convective updrafts: CRM Intercomparison of a Squall Line

    DOE PAGES

    Fan, Jiwen; Han, Bin; Varble, Adam; ...

    2017-09-06

    An intercomparison study of a midlatitude mesoscale squall line is performed using the Weather Research and Forecasting (WRF) model at 1 km horizontal grid spacing with eight different cloud microphysics schemes to investigate processes that contribute to the large variability in simulated cloud and precipitation properties. All simulations tend to produce a wider area of high radar reflectivity (Ze > 45 dBZ) than observed but a much narrower stratiform area. Furthermore, the magnitude of the virtual potential temperature drop associated with the gust front passage is similar in simulations and observations, while the pressure rise and peak wind speed are smaller than observed, possibly suggesting that simulated cold pools are shallower than observed. Most of the microphysics schemes overestimate vertical velocity and Ze in convective updrafts as compared with observational retrievals. Simulated precipitation rates and updraft velocities have significant variability across the eight schemes, even in this strongly dynamically driven system. Differences in simulated updraft velocity correlate well with differences in simulated buoyancy and low-level vertical perturbation pressure gradient, which appears related to cold pool intensity that is controlled by the evaporation rate. Simulations with stronger updrafts have a more optimal convective state, with stronger cold pools, ambient low-level vertical wind shear, and rear-inflow jets. We found that updraft velocity variability between schemes is mainly controlled by differences in simulated ice-related processes, which impact the overall latent heating rate, whereas surface rainfall variability increases in no-ice simulations mainly because of scheme differences in collision-coalescence parameterizations.

  7. Operator’s Manual for Variable Weight, Variable C. G. Helmet Simulator,

    DTIC Science & Technology

    1981-09-01

    A variable weight, variable CG helmet simulator has been designed to measure the effect of US Army headgear on muscle loading and fatigue. The helmet...less than the weight of most quality crash helmets made by reputable manufacturers. The addition of variable weights to the boxes can alter the center...of gravity to simulate the effect of equipment attached to the outside of a helmet. The helmet simulator has been calibrated for weights of 3.2, 4.0

  8. Continuous variable quantum optical simulation for time evolution of quantum harmonic oscillators

    PubMed Central

    Deng, Xiaowei; Hao, Shuhong; Guo, Hong; Xie, Changde; Su, Xiaolong

    2016-01-01

    Quantum simulation enables one to mimic the evolution of other quantum systems using a controllable quantum system. Quantum harmonic oscillator (QHO) is one of the most important model systems in quantum physics. To observe the transient dynamics of a QHO with high oscillation frequency directly is difficult. We experimentally simulate the transient behaviors of QHO in an open system during time evolution with an optical mode and a logical operation system of continuous variable quantum computation. The time evolution of an atomic ensemble in the collective spontaneous emission is analytically simulated by mapping the atomic ensemble onto a QHO. The measured fidelity, which is used for quantifying the quality of the simulation, is higher than its classical limit. The presented simulation scheme provides a new tool for studying the dynamic behaviors of QHO. PMID:26961962

  9. On the use of nudging techniques for regional climate modeling: application for tropical convection

    NASA Astrophysics Data System (ADS)

    Pohl, Benjamin; Crétat, Julien

    2014-09-01

    Using a large set of WRF ensemble simulations at 70-km horizontal resolution over a domain encompassing the Warm Pool region and its surroundings [45°N-45°S, 10°E-240°E], this study aims at quantifying how nudging techniques can modify the simulation of deep atmospheric convection. The seasonal mean climate, transient variability at intraseasonal timescales, and the respective weight of internal (stochastic) and forced (reproducible) variability are considered. Sensitivity to a large variety of nudging settings (nudged variables and layers, and nudging strength) and to the model physics (using 3 convective parameterizations) is addressed. Integrations are carried out during a 7-month season characterized by neutral background conditions and strong intraseasonal variability. Results show that (1) the model responds differently to the nudging from one parameterization to another. Biases are decreased by ~50 % for Betts-Miller-Janjic convection against only 17 % for Grell-Dévényi, which is nonetheless the scheme producing the largest biases; (2) relaxing air temperature is the most efficient way to reduce biases, while nudging the wind increases co-variability with daily observations the most; (3) the model's internal variability is drastically reduced and mostly depends on the nudging strength and nudged variables; (4) interrupting the relaxation before the end of the simulations leads to an abrupt convergence towards the model's natural solution, with no clear effects on the simulated climate after a few days. The usefulness and limitations of the approach are finally discussed through the example of the Madden-Julian Oscillation, which the model fails to simulate and which can be artificially, though still imperfectly, reproduced in the relaxation experiments.

  10. Hydroclimate variability in Scandinavia over the last millennium - insights from a climate model-proxy data comparison

    NASA Astrophysics Data System (ADS)

    Seftigen, Kristina; Goosse, Hugues; Klein, Francois; Chen, Deliang

    2017-12-01

    The integration of climate proxy information with general circulation model (GCM) results offers considerable potential for deriving greater understanding of the mechanisms underlying climate variability, as well as unique opportunities for out-of-sample evaluations of model performance. In this study, we combine insights from a new tree-ring hydroclimate reconstruction from Scandinavia with projections from a suite of forced transient simulations of the last millennium and historical intervals from the CMIP5 and PMIP3 archives. Model simulations and proxy reconstruction data are found to broadly agree on the modes of atmospheric variability that produce droughts and pluvials in the region. Despite these dynamical similarities, large differences between simulated and reconstructed hydroclimate time series remain. We find that the GCM-simulated multi-decadal and/or longer hydroclimate variability is systematically smaller than the proxy-based estimates, whereas the dominance of GCM-simulated high-frequency components of variability is not reflected in the proxy record. Furthermore, the paleoclimate evidence indicates in-phase coherencies between regional hydroclimate and temperature on decadal timescales, i.e., sustained wet periods have often been concurrent with warm periods and vice versa. The CMIP5-PMIP3 archive suggests, however, out-of-phase coherencies between the two variables in the last millennium. The lack of adequate understanding of mechanisms linking temperature and moisture supply on longer timescales has serious implications for attribution and prediction of regional hydroclimate changes. Our findings stress the need for further paleoclimate data-model intercomparison efforts to expand our understanding of the dynamics of hydroclimate variability and change, to enhance our ability to evaluate climate models, and to provide a more comprehensive view of future drought and pluvial risks.

  11. An Attempt To Estimate The Contribution Of Variability Of Wetland Extent On The Variability Of The Atmospheric Methane Growth Rate In The Years 1993-2000.

    NASA Astrophysics Data System (ADS)

    Ringeval, B.; de Noblet-Ducoudre, N.; Prigent, C.; Bousquet, P.

    2006-12-01

    The atmospheric methane growth rate exhibits substantial seasonal and year-to-year variations. Large uncertainties still exist in the relative contributions of different sources and sinks to these variations. In this study, we considered the main natural source of methane, which is also supposed to be the main variable source, i.e. wetlands, and tried to simulate the variations of its emissions taking into account the variability of the wetland extent and of the climate. For this study, we use the methane emission model of Walter et al. (2001) and the quantification of the flooded areas for the years 1993-2000 obtained with a suite of satellite observations by Prigent et al. (2001). The data required by Walter's model are obtained from a simulation of the dynamic global vegetation model ORCHIDEE (Krinner et al. (2005)) constrained by the NCC climate data (Ngo-Duc et al. (2005)), after imposing a water-saturated soil to approximate the productivity of wetlands. We calculate global annual methane emissions from wetlands to be 400 Tg per year, which is higher than previous results obtained with a fixed wetland extent. Simulations are carried out to estimate the share of the emission variability explained by the variability of the wetland extent. It appears that the year-to-year emission variability is mainly explained by the interannual variability of wetland extent. The variability of wetland extent explains 75% of the seasonal variability in the tropics, but only 40% north of 30°N. Finally, we compare our results with the top-down approach of Bousquet et al. (2006).

  12. SU-E-I-88: Realistic Pathological Simulations of the NCAT and Zubal Anthropomorphic Models, Based on Clinical PET/CT Data.

    PubMed

    Papadimitroulas, P; Loudos, G; Le Maitre, A; Efthimiou, N; Visvikis, D; Nikiforidis, G; Kagadis, G C

    2012-06-01

    In the present study a patient-specific dataset of realistic PET simulations was created, taking into account the variability of clinical oncology data. Tumor variability was tested in the simulated results. A comparison of the produced simulated data was performed against clinical PET/CT data, for the validation and the evaluation of the procedure. Clinical PET/CT data of oncology patients were used as the basis of the simulated variability, inserting patient-specific characteristics into the NCAT and the Zubal anthropomorphic phantoms. The GATE Monte Carlo toolkit was used for simulating a commercial PET scanner. The standard computational anthropomorphic phantoms were adapted to the CT data (organ shapes) using a fitting algorithm. The activity map was derived from PET images. Patient tumors were segmented and inserted in the phantom using different activity distributions. The produced simulated data were reconstructed using the STIR open-source software and compared to the original clinical data. The accuracy of the procedure was tested in four different oncology cases. Each pathological situation was illustrated by simulating a) a healthy body, b) insertion of the clinical tumor with homogeneous activity, and c) insertion of the clinical tumor with variable activity (voxel-by-voxel) based on the clinical PET data. The accuracy of the presented dataset was compared to the original PET/CT data. Partial Volume Correction (PVC) was also applied to the simulated data. In this study patient-specific characteristics were used in computational anthropomorphic models for simulating realistic pathological patients. Voxel-by-voxel activity distribution with PVC within the tumor gives the most accurate results. Radiotherapy applications can utilize the benefits of accurate realistic imaging simulations, using the anatomical and biological information of each patient. Further work will incorporate the development of analytical anthropomorphic models with motion and cardiac correction, combined with pathological patients, to achieve high accuracy in tumor imaging. This research was supported by the Joint Research and Technology Program between Greece and France; 2009-2011 (protocol ID: 09FR103). © 2012 American Association of Physicists in Medicine.

  13. Measurement variability error for estimates of volume change

    Treesearch

    James A. Westfall; Paul L. Patterson

    2007-01-01

    Using quality assurance data, measurement variability distributions were developed for attributes that affect tree volume prediction. Random deviations from the measurement variability distributions were applied to 19381 remeasured sample trees in Maine. The additional error due to measurement variation and measurement bias was estimated via a simulation study for...

  14. Comparison of Two Grid Refinement Approaches for High Resolution Regional Climate Modeling: MPAS vs WRF

    NASA Astrophysics Data System (ADS)

    Leung, L.; Hagos, S. M.; Rauscher, S.; Ringler, T.

    2012-12-01

    This study compares two grid refinement approaches using global variable resolution model and nesting for high-resolution regional climate modeling. The global variable resolution model, Model for Prediction Across Scales (MPAS), and the limited area model, Weather Research and Forecasting (WRF) model, are compared in an idealized aqua-planet context with a focus on the spatial and temporal characteristics of tropical precipitation simulated by the models using the same physics package from the Community Atmosphere Model (CAM4). For MPAS, simulations have been performed with a quasi-uniform resolution global domain at coarse (1 degree) and high (0.25 degree) resolution, and a variable resolution domain with a high-resolution region at 0.25 degree configured inside a coarse resolution global domain at 1 degree resolution. Similarly, WRF has been configured to run on a coarse (1 degree) and high (0.25 degree) resolution tropical channel domain as well as a nested domain with a high-resolution region at 0.25 degree nested two-way inside the coarse resolution (1 degree) tropical channel. The variable resolution or nested simulations are compared against the high-resolution simulations that serve as virtual reality. Both MPAS and WRF simulate 20-day Kelvin waves propagating through the high-resolution domains fairly unaffected by the change in resolution. In addition, both models respond to increased resolution with enhanced precipitation. Grid refinement induces zonal asymmetry in precipitation (heating), accompanied by zonal anomalous Walker like circulations and standing Rossby wave signals. However, there are important differences between the anomalous patterns in MPAS and WRF due to differences in the grid refinement approaches and sensitivity of model physics to grid resolution. This study highlights the need for "scale aware" parameterizations in variable resolution and nested regional models.

  15. Nonparametric methods in actigraphy: An update

    PubMed Central

    Gonçalves, Bruno S.B.; Cavalcanti, Paula R.A.; Tavares, Gracilene R.; Campos, Tania F.; Araujo, John F.

    2014-01-01

    Circadian rhythmicity in humans has been well studied using actigraphy, a method of measuring gross motor movement. As actigraphic technology continues to evolve, it is important for data analysis to keep pace with new variables and features. Our objective is to study the behavior of two variables, interdaily stability (IS) and intradaily variability (IV), used to describe the rest-activity rhythm. Simulated data and actigraphy data of humans, rats, and marmosets were used in this study. We modified the method of calculation for IV and IS by varying the time intervals of analysis and, for each variable, calculating the average value across those intervals (IVm and ISm). Simulated data showed that (1) synchronization analysis depends on sample size, and (2) fragmentation is independent of the amplitude of the generated noise. We were able to obtain a significant difference in the fragmentation patterns of stroke patients using the IVm variable, whereas this difference was not identified using IV60. Rhythmic synchronization of activity and rest was significantly higher in young subjects than in adults with Parkinson's disease when using the ISm variable; however, this difference was not seen using IS60. We propose an updated format to calculate rhythmic fragmentation, including two additional optional variables. These alternative methods of nonparametric analysis aim to more precisely detect sleep–wake cycle fragmentation and synchronization. PMID:26483921
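
    The two indices discussed above have standard nonparametric definitions, reproduced in the sketch below together with one plausible reading of the IVm/ISm averaging over different time intervals (rebinning 1-min counts to several interval lengths and averaging the resulting IS and IV values). The binning choices are assumptions for illustration.

```python
import numpy as np

def is_iv(x, samples_per_day):
    """Interdaily stability (IS) and intradaily variability (IV) of an evenly
    sampled activity series, using the usual nonparametric definitions:
      IS = N * sum_h (xbar_h - xbar)^2 / (p * sum_i (x_i - xbar)^2)
      IV = N * sum_i (x_i - x_{i-1})^2 / ((N - 1) * sum_i (x_i - xbar)^2)"""
    x = np.asarray(x, float)
    n = x.size - x.size % samples_per_day      # keep whole days only
    x = x[:n]
    xbar = x.mean()
    profile = x.reshape(-1, samples_per_day).mean(axis=0)   # mean daily profile
    ss = ((x - xbar) ** 2).sum()
    IS = n * ((profile - xbar) ** 2).sum() / (samples_per_day * ss)
    IV = n * (np.diff(x) ** 2).sum() / ((n - 1) * ss)
    return IS, IV

def ism_ivm(minute_counts, bins=(1, 2, 3, 5, 6, 10, 15, 20, 30, 60)):
    """Average IS and IV over several interval lengths (minutes), an assumed
    variant of the ISm/IVm procedure described in the abstract."""
    results = []
    for b in bins:
        n = minute_counts.size - minute_counts.size % b
        rebinned = minute_counts[:n].reshape(-1, b).mean(axis=1)
        results.append(is_iv(rebinned, samples_per_day=1440 // b))
    return (float(np.mean([r[0] for r in results])),
            float(np.mean([r[1] for r in results])))
```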

  16. Impacts of Considering Climate Variability on Investment Decisions in Ethiopia

    NASA Astrophysics Data System (ADS)

    Strzepek, K.; Block, P.; Rosegrant, M.; Diao, X.

    2005-12-01

    In Ethiopia, climate extremes, inducing droughts or floods, are not unusual. Monitoring the effects of these extremes, and climate variability in general, is critical for economic prediction and assessment of the country's future welfare. The focus of this study involves adding climate variability to a deterministic, mean climate-driven agro-economic model, in an attempt to understand its effects and degree of influence on general economic prediction indicators for Ethiopia. Four simulations are examined, including a baseline simulation and three investment strategies: simulations of irrigation investment, roads investment, and a combination investment of both irrigation and roads. The deterministic model is transformed into a stochastic model by dynamically adding year-to-year climate variability through climate-yield factors. Nine sets of actual, historic, variable climate data are individually assembled and implemented into the 12-year stochastic model simulation, producing an ensemble of economic prediction indicators. This ensemble allows for a probabilistic approach to planning and policy making, allowing decision makers to consider risk. The economic indicators from the deterministic and stochastic approaches, including rates of return to investments, are significantly different. The predictions of the deterministic model appreciably overestimate the future welfare of Ethiopia; the predictions of the stochastic model, utilizing actual climate data, tend to give a better semblance of what may be expected. Inclusion of climate variability is vital for proper analysis of the predictor values from this agro-economic model.

  17. Cellular Automata Simulation for Wealth Distribution

    NASA Astrophysics Data System (ADS)

    Lo, Shih-Ching

    2009-08-01

    The wealth distribution of a country is a complex system. A model, based on Epstein and Axtell's "Sugarscape" model, is presented in NetLogo. The model considers income, age, working opportunity and salary as control variables. There are still other variables that should be considered when an artificial society is established. In this study, a more elaborate cellular automata model for wealth distribution is proposed. The effects of social welfare, tax, economic investment and inheritance are considered and simulated. The cellular automata simulation of wealth distribution provides a deeper insight into the financial policy of the government.
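
    A toy agent-based sketch in the same spirit is shown below: agents receive stochastic income, pay a flat tax, and taxes are redistributed as welfare, after which an inequality measure is computed. All rules and parameters are simplifications assumed here, not the NetLogo model of the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

n_agents, steps = 500, 200
tax_rate, welfare_floor = 0.10, 5.0
wealth = rng.uniform(10.0, 100.0, n_agents)
salary = rng.uniform(1.0, 10.0, n_agents)          # heterogeneous incomes

for _ in range(steps):
    working = rng.random(n_agents) < 0.9           # stochastic working opportunity
    income = salary * working
    tax = tax_rate * income
    wealth += income - tax                         # earn and pay tax
    poor = wealth < welfare_floor                  # welfare funded by the tax pool
    if poor.any():
        wealth[poor] += tax.sum() / poor.sum()

# Gini coefficient as a summary of the resulting wealth distribution
gini = np.abs(np.subtract.outer(wealth, wealth)).mean() / (2.0 * wealth.mean())
print(f"Gini coefficient after {steps} steps: {gini:.2f}")
```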

  18. Influence of bulk microphysics schemes upon Weather Research and Forecasting (WRF) version 3.6.1 nor'easter simulations

    NASA Astrophysics Data System (ADS)

    Nicholls, Stephen D.; Decker, Steven G.; Tao, Wei-Kuo; Lang, Stephen E.; Shi, Jainn J.; Mohr, Karen I.

    2017-03-01

    This study evaluated the impact of five single- or double-moment bulk microphysics schemes (BMPSs) on Weather Research and Forecasting model (WRF) simulations of seven intense wintertime cyclones impacting the mid-Atlantic United States; 5-day long WRF simulations were initialized roughly 24 h prior to the onset of coastal cyclogenesis off the North Carolina coastline. In all, 35 model simulations (five BMPSs and seven cases) were run and their associated microphysics-related storm properties (hydrometeor mixing ratios, precipitation, and radar reflectivity) were evaluated against model analysis and available gridded radar and ground-based precipitation products. Inter-BMPS comparisons of column-integrated mixing ratios and mixing ratio profiles reveal little variability in non-frozen hydrometeor species due to their shared programming heritage, yet their assumptions concerning snow and graupel intercepts, ice supersaturation, snow and graupel density maps, and terminal velocities led to considerable variability in both simulated frozen hydrometeor species and radar reflectivity. WRF-simulated precipitation fields exhibit minor spatiotemporal variability amongst BMPSs, yet their spatial extent is largely conserved. Compared to ground-based precipitation data, WRF simulations demonstrate low-to-moderate (0.217-0.414) threat scores and a rainfall distribution shifted toward higher values. Finally, an analysis of WRF and gridded radar reflectivity data via contoured frequency with altitude diagrams (CFADs) reveals notable variability amongst BMPSs, where better performing schemes favored lower graupel mixing ratios and better underlying aggregation assumptions.

  19. Influence of Bulk Microphysics Schemes upon Weather Research and Forecasting (WRF) Version 3.6.1 Nor'easter Simulations.

    PubMed

    Nicholls, Stephen D; Decker, Steven G; Tao, Wei-Kuo; Lang, Stephen E; Shi, Jainn J; Mohr, Karen I

    2017-01-01

    This study evaluated the impact of five single- or double-moment bulk microphysics schemes (BMPSs) on Weather Research and Forecasting model (WRF) simulations of seven intense wintertime cyclones impacting the Mid-Atlantic United States. Five-day long WRF simulations were initialized roughly 24 hours prior to the onset of coastal cyclogenesis off the North Carolina coastline. In all, 35 model simulations (5 BMPSs and seven cases) were run and their associated microphysics-related storm properties (hydrometeor mixing ratios, precipitation, and radar reflectivity) were evaluated against model analysis and available gridded radar and ground-based precipitation products. Inter-BMPS comparisons of column-integrated mixing ratios and mixing ratio profiles reveal little variability in non-frozen hydrometeor species due to their shared programming heritage, yet their assumptions concerning snow and graupel intercepts, ice supersaturation, snow and graupel density maps, and terminal velocities lead to considerable variability in both simulated frozen hydrometeor species and radar reflectivity. WRF-simulated precipitation fields exhibit minor spatio-temporal variability amongst BMPSs, yet their spatial extent is largely conserved. Compared to ground-based precipitation data, WRF simulations demonstrate low-to-moderate (0.217-0.414) threat scores and a rainfall distribution shifted toward higher values. Finally, an analysis of WRF and gridded radar reflectivity data via contoured frequency with altitude (CFAD) diagrams reveals notable variability amongst BMPSs, where better performing schemes favored lower graupel mixing ratios and better underlying aggregation assumptions.

  20. Influence of Bulk Microphysics Schemes upon Weather Research and Forecasting (WRF) Version 3.6.1 Nor'easter Simulations

    PubMed Central

    Nicholls, Stephen D.; Decker, Steven G.; Tao, Wei-Kuo; Lang, Stephen E.; Shi, Jainn J.; Mohr, Karen I.

    2018-01-01

    This study evaluated the impact of five single- or double-moment bulk microphysics schemes (BMPSs) on Weather Research and Forecasting model (WRF) simulations of seven intense wintertime cyclones impacting the Mid-Atlantic United States. Five-day long WRF simulations were initialized roughly 24 hours prior to the onset of coastal cyclogenesis off the North Carolina coastline. In all, 35 model simulations (5 BMPSs and seven cases) were run and their associated microphysics-related storm properties (hydrometeor mixing ratios, precipitation, and radar reflectivity) were evaluated against model analysis and available gridded radar and ground-based precipitation products. Inter-BMPS comparisons of column-integrated mixing ratios and mixing ratio profiles reveal little variability in non-frozen hydrometeor species due to their shared programming heritage, yet their assumptions concerning snow and graupel intercepts, ice supersaturation, snow and graupel density maps, and terminal velocities lead to considerable variability in both simulated frozen hydrometeor species and radar reflectivity. WRF-simulated precipitation fields exhibit minor spatio-temporal variability amongst BMPSs, yet their spatial extent is largely conserved. Compared to ground-based precipitation data, WRF simulations demonstrate low-to-moderate (0.217–0.414) threat scores and a rainfall distribution shifted toward higher values. Finally, an analysis of WRF and gridded radar reflectivity data via contoured frequency with altitude (CFAD) diagrams reveals notable variability amongst BMPSs, where better performing schemes favored lower graupel mixing ratios and better underlying aggregation assumptions. PMID:29697705

  1. Influence of Bulk Microphysics Schemes upon Weather Research and Forecasting (WRF) Version 3.6.1 Nor'easter Simulations

    NASA Technical Reports Server (NTRS)

    Nicholls, Stephen D.; Decker, Steven G.; Tao, Wei-Kuo; Lang, Stephen E.; Shi, Jainn J.; Mohr, Karen Irene

    2017-01-01

    This study evaluated the impact of five single- or double-moment bulk microphysics schemes (BMPSs) on Weather Research and Forecasting model (WRF) simulations of seven intense wintertime cyclones impacting the mid-Atlantic United States; 5-day long WRF simulations were initialized roughly 24 hours prior to the onset of coastal cyclogenesis off the North Carolina coastline. In all, 35 model simulations (five BMPSs and seven cases) were run and their associated microphysics-related storm properties (hydrometeor mixing ratios, precipitation, and radar reflectivity) were evaluated against model analysis and available gridded radar and ground-based precipitation products. Inter-BMPS comparisons of column-integrated mixing ratios and mixing ratio profiles reveal little variability in non-frozen hydrometeor species due to their shared programming heritage, yet their assumptions concerning snow and graupel intercepts, ice supersaturation, snow and graupel density maps, and terminal velocities led to considerable variability in both simulated frozen hydrometeor species and radar reflectivity. WRF-simulated precipitation fields exhibit minor spatiotemporal variability amongst BMPSs, yet their spatial extent is largely conserved. Compared to ground-based precipitation data, WRF simulations demonstrate low-to-moderate (0.217 to 0.414) threat scores and a rainfall distribution shifted toward higher values. Finally, an analysis of WRF and gridded radar reflectivity data via contoured frequency with altitude (CFAD) diagrams reveals notable variability amongst BMPSs, where better performing schemes favored lower graupel mixing ratios and better underlying aggregation assumptions.

  2. Scales of variability of black carbon plumes and their dependence on resolution of ECHAM6-HAM

    NASA Astrophysics Data System (ADS)

    Weigum, Natalie; Stier, Philip; Schutgens, Nick; Kipling, Zak

    2015-04-01

    Prediction of the aerosol effect on climate depends on the ability of three-dimensional numerical models to accurately estimate aerosol properties. However, a limitation of traditional grid-based models is their inability to resolve variability on scales smaller than a grid box. Past research has shown that significant aerosol variability exists on scales smaller than these grid boxes, which can lead to discrepancies between observations and aerosol models. The aim of this study is to understand how a global climate model's (GCM) inability to resolve sub-grid scale variability affects simulations of important aerosol features. This problem is addressed by comparing observed black carbon (BC) plume scales from the HIPPO aircraft campaign to those simulated by the ECHAM-HAM GCM, and testing how model resolution affects these scales. This study additionally investigates how model resolution affects BC variability in remote and near-source regions. These issues are examined using three different approaches: comparison of observed and simulated along-flight-track plume scales, two-dimensional autocorrelation analysis, and three-dimensional plume analysis. We find that the degree to which GCMs resolve variability can have a significant impact on the scales of BC plumes, and it is important for models to capture the scales of aerosol plume structures, which account for a large degree of aerosol variability. In this presentation, we will provide further results from the three analysis techniques along with a summary of the implications of these results for future aerosol model development.

  3. Simulations of Eurasian winter temperature trends in coupled and uncoupled CFSv2

    NASA Astrophysics Data System (ADS)

    Collow, Thomas W.; Wang, Wanqiu; Kumar, Arun

    2018-01-01

    Conflicting results have been presented regarding the link between Arctic sea-ice loss and midlatitude cooling, particularly over Eurasia. This study analyzes uncoupled (atmosphere-only) and coupled (ocean-atmosphere) simulations by the Climate Forecast System, version 2 (CFSv2), to examine this linkage during the Northern Hemisphere winter, focusing on the simulation of the observed surface cooling trend over Eurasia during the last three decades. The uncoupled simulations are Atmospheric Model Intercomparison Project (AMIP) runs forced with mean seasonal cycles of sea surface temperature (SST) and sea ice, using combinations of SST and sea ice from different time periods to assess the role that each plays individually, and to assess the role of atmospheric internal variability. Coupled runs are used to further investigate the role of internal variability via the analysis of initialized predictions and the evolution of the forecast with lead time. The AMIP simulations show a mean warming response over Eurasia due to SST changes, but little response to changes in sea ice. Individual runs simulate cooler periods over Eurasia, and this is shown to be concurrent with a stronger Siberian high and warming over Greenland. No substantial differences in the variability of Eurasian surface temperatures are found between the different model configurations. In the coupled runs, the region of significant warming over Eurasia is small at short leads, but increases at longer leads. It is concluded that, although the models have some capability in highlighting the temperature variability over Eurasia, the observed cooling may still be a consequence of internal variability.

  4. A simulation study of capacity utilization to predict future capacity for manufacturing system sustainability

    NASA Astrophysics Data System (ADS)

    Rimo, Tan Hauw Sen; Chai Tin, Ong

    2017-12-01

    Capacity utilization (CU) measurement is an important task in a manufacturing system, especially in a make-to-order (MTO) manufacturing system with product customization, for predicting the capacity needed to meet future demand. A stochastic discrete-event simulation is developed using ARENA software to determine CU and the capacity gap (CG) for a short-run production function. This study focused on machinery breakdown and product defect rate as random variables in the simulation. The study found that the manufacturing system runs at 68.01% CU with a 31.99% CG. It is revealed that machinery breakdown and product defect rate have a direct relationship with CU. By reducing the product defect rate to zero defects, the manufacturing system can improve CU to 73.56% and decrease the CG to 26.44%, while eliminating machinery breakdowns would improve CU to 93.99% and decrease the CG to 6.01%. This study helps the operations level to study CU using "what-if" analysis in order to meet future demand in a more practical and easier way by using a simulation approach. Further study is recommended that includes other random variables affecting CU, to bring the simulation closer to the real-life situation for better decisions.
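
    The sketch below gives a stripped-down Monte Carlo version of the idea: random machine breakdowns and a random defect rate reduce good output relative to design capacity, from which CU and CG follow. The breakdown probability, repair time, defect rate and capacity figures are assumptions, and the full ARENA discrete-event model is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(7)

# Assumed illustrative parameters
days, hours_per_day, units_per_hour = 250, 16, 100   # design capacity
p_breakdown, mean_repair_h = 0.08, 4.0               # daily breakdown risk, repair time
defect_rate_mu, defect_rate_sd = 0.05, 0.01

good_units = 0.0
for _ in range(days):
    downtime = rng.exponential(mean_repair_h) if rng.random() < p_breakdown else 0.0
    run_hours = max(hours_per_day - downtime, 0.0)
    defect_rate = float(np.clip(rng.normal(defect_rate_mu, defect_rate_sd), 0.0, 1.0))
    good_units += run_hours * units_per_hour * (1.0 - defect_rate)

capacity = days * hours_per_day * units_per_hour
cu = good_units / capacity
print(f"capacity utilization: {cu:.1%}, capacity gap: {1.0 - cu:.1%}")
# "What-if" runs simply set p_breakdown or defect_rate_mu to zero and re-simulate.
```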

  5. Comparison of Two Stochastic Daily Rainfall Models and their Ability to Preserve Multi-year Rainfall Variability

    NASA Astrophysics Data System (ADS)

    Kamal Chowdhury, AFM; Lockart, Natalie; Willgoose, Garry; Kuczera, George; Kiem, Anthony; Parana Manage, Nadeeka

    2016-04-01

    Stochastic simulation of rainfall is often required in the simulation of streamflow and reservoir levels for water security assessment. As reservoir water levels generally vary on monthly to multi-year timescales, it is important that these rainfall series accurately simulate the multi-year variability. However, the underestimation of multi-year variability is a well-known issue in daily rainfall simulation. Focusing on this issue, we developed a hierarchical Markov Chain (MC) model in a traditional two-part MC-Gamma distribution modelling structure, but with a new parameterization technique. We used two parameters of a first-order MC process (transition probabilities of wet-to-wet and dry-to-dry days) to simulate the wet and dry days, and two parameters of a Gamma distribution (mean and standard deviation of wet-day rainfall) to simulate wet-day rainfall depths. We found that the use of deterministic Gamma parameter values results in underestimation of the multi-year variability of rainfall depths. Therefore, we calculated the Gamma parameters for each month of each year from the observed data. Then, for each month, we fitted a multi-variate normal distribution to the calculated Gamma parameter values. In the model, we stochastically sampled these two Gamma parameters from the multi-variate normal distribution for each month of each year and used them to generate rainfall depths on wet days using the Gamma distribution. In another study, Mehrotra and Sharma (2007) proposed a semi-parametric Markov model. They also used a first-order MC process for rainfall occurrence simulation, but the MC parameters were modified by using an additional factor to incorporate the multi-year variability. Generally, the additional factor is analytically derived from the rainfall over pre-specified past periods (e.g. the last 30, 180, or 360 days). They used a non-parametric kernel density process to simulate the wet-day rainfall depths. In this study, we have compared the performance of our hierarchical MC model with the semi-parametric model in preserving rainfall variability at daily, monthly, and multi-year scales. To calibrate the parameters of both models and assess their ability to preserve observed statistics, we have used ground-based data from 15 raingauge stations around Australia, which cover a wide range of climate zones including coastal, monsoonal, and arid climate characteristics. In preliminary results, both models show comparable performance in preserving the multi-year variability of rainfall depth and occurrence. However, the semi-parametric model shows a tendency to overestimate the mean rainfall depth, while our model shows a tendency to overestimate the number of wet days. We will discuss further the relative merits of both models for hydrology simulation in the presentation.
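
    A minimal sketch of the hierarchical two-part model described above is given below: a first-order Markov chain for occurrence, a Gamma distribution for wet-day depths, and monthly Gamma parameters drawn from a multivariate normal so that multi-year variability is injected. The transition probabilities and the multivariate normal moments are placeholder numbers, not values fitted to the Australian raingauge data.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_month(n_days, p_ww, p_dd, wet_mean, wet_sd, start_wet=False):
    """Two-part daily rainfall model: first-order Markov chain for occurrence,
    Gamma distribution (parameterized by mean and sd) for wet-day depth."""
    shape = (wet_mean / wet_sd) ** 2
    scale = wet_sd ** 2 / wet_mean
    wet, rain = start_wet, []
    for _ in range(n_days):
        p_wet = p_ww if wet else 1.0 - p_dd
        wet = rng.random() < p_wet
        rain.append(rng.gamma(shape, scale) if wet else 0.0)
    return np.array(rain)

# Hierarchical step: for each year, draw this month's Gamma (mean, sd) from a
# multivariate normal fitted to year-by-year observed values (numbers assumed).
mvn_mean = np.array([8.0, 10.0])                 # wet-day mean and sd (mm)
mvn_cov = np.array([[4.0, 3.0], [3.0, 9.0]])
totals = []
for year in range(100):
    mu, sd = np.maximum(rng.multivariate_normal(mvn_mean, mvn_cov), 0.5)
    totals.append(simulate_month(31, p_ww=0.6, p_dd=0.8,
                                 wet_mean=mu, wet_sd=sd).sum())
print("std of simulated monthly totals across years:", round(float(np.std(totals)), 1))
```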

  6. Modelling Geomechanical Heterogeneity of Rock Masses Using Direct and Indirect Geostatistical Conditional Simulation Methods

    NASA Astrophysics Data System (ADS)

    Eivazy, Hesameddin; Esmaieli, Kamran; Jean, Raynald

    2017-12-01

    An accurate characterization and modelling of rock mass geomechanical heterogeneity can lead to more efficient mine planning and design. Using deterministic approaches and random field methods for modelling rock mass heterogeneity is known to be limited in simulating the spatial variation and spatial pattern of the geomechanical properties. Although the applications of geostatistical techniques have demonstrated improvements in modelling the heterogeneity of geomechanical properties, geostatistical estimation methods such as Kriging result in estimates of geomechanical variables that are not fully representative of field observations. This paper reports on the development of 3D models for spatial variability of rock mass geomechanical properties using geostatistical conditional simulation method based on sequential Gaussian simulation. A methodology to simulate the heterogeneity of rock mass quality based on the rock mass rating is proposed and applied to a large open-pit mine in Canada. Using geomechanical core logging data collected from the mine site, a direct and an indirect approach were used to model the spatial variability of rock mass quality. The results of the two modelling approaches were validated against collected field data. The study aims to quantify the risks of pit slope failure and provides a measure of uncertainties in spatial variability of rock mass properties in different areas of the pit.
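
    As a much-simplified illustration of geostatistical simulation of a rock mass rating (RMR)-like variable, the sketch below draws one unconditional Gaussian realization on a small grid from an exponential covariance model via a Cholesky factorization, then rescales it to an RMR-like range. The sequential Gaussian simulation used in the paper is conditional on logged borehole data and proceeds node by node; that machinery, and all parameter values here, are not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(11)

# Small 2-D grid (assumed 10 m spacing)
nx, ny, dx = 30, 30, 10.0
xx, yy = np.meshgrid(np.arange(nx) * dx, np.arange(ny) * dx)
pts = np.column_stack([xx.ravel(), yy.ravel()])

# Exponential covariance model with assumed sill and practical range
sill, corr_range = 1.0, 80.0
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
cov = sill * np.exp(-3.0 * dist / corr_range) + 1e-8 * np.eye(len(pts))

# One unconditional realization z ~ N(0, C) via Cholesky, back-transformed to an
# RMR-like scale (mean 60, sd 10 are placeholders).
z = np.linalg.cholesky(cov) @ rng.standard_normal(len(pts))
rmr = np.clip(60.0 + 10.0 * z, 0.0, 100.0).reshape(ny, nx)
print("simulated RMR field:", round(float(rmr.mean()), 1), "+/-", round(float(rmr.std()), 1))
```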

  7. DIF Analysis with Multilevel Data: A Simulation Study Using the Latent Variable Approach

    ERIC Educational Resources Information Center

    Jin, Ying; Eason, Hershel

    2016-01-01

    The effects of mean ability difference (MAD) and short tests on the performance of various DIF methods have been studied extensively in previous simulation studies. Their effects, however, have not been studied under multilevel data structure. MAD was frequently observed in large-scale cross-country comparison studies where the primary sampling…

  8. Read margin analysis of crossbar arrays using the cell-variability-aware simulation method

    NASA Astrophysics Data System (ADS)

    Sun, Wookyung; Choi, Sujin; Shin, Hyungsoon

    2018-02-01

    This paper proposes a new concept of read margin analysis of crossbar arrays using cell-variability-aware simulation. The size of the crossbar array should be considered when predicting the read margin characteristic of the crossbar array, because the read margin depends on the number of word lines and bit lines. However, an excessively long CPU time is required to simulate large arrays using a commercial circuit simulator. A variability-aware MATLAB simulator that considers independent variability sources is developed to analyze the characteristics of the read margin according to the array size. The developed MATLAB simulator provides an effective method for reducing the simulation time while maintaining the accuracy of the read margin estimation in the crossbar array. The simulator is also highly efficient in analyzing the characteristics of the crossbar memory array considering the statistical variations in the cell characteristics.
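
    For orientation, the sketch below estimates how a read margin shrinks with array size under cell-to-cell resistance variability, using a crude lumped sneak-path approximation (a voltage divider with the selected cell in parallel with a three-stage sneak network) instead of the full array solve performed by the paper's MATLAB simulator. Resistance values, variability levels and the read circuit are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

def read_margin(n, r_lrs=10e3, r_hrs=1e6, sigma=0.15,
                r_load=5e3, v_read=1.0, trials=2000):
    """Monte Carlo read margin for an n x n crossbar with floating unselected lines.
    The sneak network is lumped into parallel groups of (n-1), (n-1)^2 and (n-1)
    cells in series (a common textbook-style shortcut), with lognormal cell
    variability. Returns the mean of V(read LRS) - V(read HRS) on the load."""
    margins = np.empty(trials)
    for t in range(trials):
        lrs = r_lrs * rng.lognormal(0.0, sigma)
        hrs = r_hrs * rng.lognormal(0.0, sigma)
        sneak_cell = r_lrs * rng.lognormal(0.0, sigma)       # pessimistic: all LRS
        r_sneak = sneak_cell / (n - 1) + sneak_cell / (n - 1) ** 2 + sneak_cell / (n - 1)
        v = []
        for r_sel in (lrs, hrs):
            r_eff = 1.0 / (1.0 / r_sel + 1.0 / r_sneak)      # selected cell // sneak path
            v.append(v_read * r_load / (r_load + r_eff))
        margins[t] = v[0] - v[1]
    return margins.mean()

for n in (16, 32, 64, 128):
    print(n, round(read_margin(n), 3))   # margin degrades as the array grows
```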

  9. Megadroughts in Southwestern North America in ECHO-G Millennial Simulations and Their Comparison to Proxy Drought Reconstructions

    NASA Technical Reports Server (NTRS)

    Coats, Sloan; Smerdon, Jason E.; Seager, Richard; Cook, Benjamin I.; Gonzalez-Rouco, J. F.

    2013-01-01

    Simulated hydroclimate variability in millennium-length forced transient and control simulations from the ECHAM and the global Hamburg Ocean Primitive Equation (ECHO-G) coupled atmosphere-ocean general circulation model (AOGCM) is analyzed and compared to 1000 years of reconstructed Palmer drought severity index (PDSI) variability from the North American Drought Atlas (NADA). The ability of the model to simulate megadroughts in the North American southwest (NASW: 25°-42.5°N, 125°-105°W) is evaluated. Megadroughts in the ECHO-G AOGCM are found to be similar in duration and magnitude to those estimated from the NADA. The droughts in the forced simulation are not, however, temporally synchronous with those in the paleoclimate record, nor are there significant differences between the drought features simulated in the forced and control runs. These results indicate that model-simulated megadroughts can result from internal variability of the modeled climate system rather than as a response to changes in exogenous forcings. Although the ECHO-G AOGCM is capable of simulating megadroughts through persistent La Nina-like conditions in the tropical Pacific, other mechanisms can produce similarly extreme NASW moisture anomalies in the model. In particular, the lack of low-frequency coherence between NASW soil moisture and simulated modes of climate variability like the El Nino-Southern Oscillation, Pacific decadal oscillation, and Atlantic multidecadal oscillation during identified drought periods suggests that stochastic atmospheric variability can contribute significantly to the occurrence of simulated megadroughts in the NASW. These findings indicate that either an expanded paradigm is needed to understand multidecadal hydroclimate variability in the NASW or AOGCMs may incorrectly simulate the strength and/or dynamics of the connection between NASW hydroclimate variability and the tropical Pacific.

  10. Comparison of TID Effects in Space-Like Variable Dose Rates and Constant Dose Rates

    NASA Technical Reports Server (NTRS)

    Harris, Richard D.; McClure, Steven S.; Rax, Bernard G.; Evans, Robin W.; Jun, Insoo

    2008-01-01

    The degradation of the LM193 dual voltage comparator has been studied at different TID dose rate profiles, including several different constant dose rates and a variable dose rate that simulates the behavior of a solar flare. A comparison of results following constant dose rate vs. variable dose rates is made to explore how well the constant dose rates used for typical part testing predict the performance during a simulated space-like mission. Testing at a constant dose rate equal to the lowest dose rate seen during the simulated flare provides an extremely conservative estimate of the overall amount of degradation. A constant dose rate equal to the average dose rate is also more conservative than the variable rate. It appears that, for this part, weighting the dose rates by the amount of total dose received at each rate (rather than the amount of time at each dose rate) results in an average rate that produces an amount of degradation that is a reasonable approximation to that received by the variable rate.
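
    The weighting argument can be made concrete with a small worked example, shown below for an entirely made-up three-segment dose-rate profile: the time-weighted average rate is simply total dose over total time, whereas the dose-weighted average emphasizes the rates at which most of the dose was actually accumulated.

```python
# Assumed illustrative profile: (dose rate in rad(Si)/s, duration in s) per segment.
segments = [(0.001, 20000), (0.050, 4000), (0.005, 60000)]

doses = [rate * dt for rate, dt in segments]
total_dose = sum(doses)
total_time = sum(dt for _, dt in segments)

time_weighted = total_dose / total_time                       # ordinary average rate
dose_weighted = sum(rate * d for (rate, _), d in zip(segments, doses)) / total_dose

print(f"total dose         : {total_dose:.0f} rad(Si)")
print(f"time-weighted rate : {time_weighted:.4f} rad(Si)/s")
print(f"dose-weighted rate : {dose_weighted:.4f} rad(Si)/s")
```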

  11. Filtering and Gridding Satellite Observations of Cloud Variables to Compare with Climate Model Output

    NASA Astrophysics Data System (ADS)

    Pitts, K.; Nasiri, S. L.; Smith, N.

    2013-12-01

    Global climate models have improved considerably over the years, yet clouds still represent a large factor of uncertainty for these models. Comparisons of model-simulated cloud variables with equivalent satellite cloud products are the best way to start diagnosing the differences between model output and observations. Gridded (level 3) cloud products from many different satellites and instruments are required for a full analysis, but these products are created by different science teams using different algorithms and filtering criteria to create similar, but not directly comparable, cloud products. This study makes use of a recently developed uniform space-time gridding algorithm to create a new set of gridded cloud products from each satellite instrument's level 2 data of interest which are each filtered using the same criteria, allowing for a more direct comparison between satellite products. The filtering is done via several variables such as cloud top pressure/height, thermodynamic phase, optical properties, satellite viewing angle, and sun zenith angle. The filtering criteria are determined based on the variable being analyzed and the science question at hand. Each comparison of different variables may require different filtering strategies as no single approach is appropriate for all problems. Beyond inter-satellite data comparison, these new sets of uniformly gridded satellite products can also be used for comparison with model-simulated cloud variables. Of particular interest to this study are the differences in the vertical distributions of ice and liquid water content between the satellite retrievals and model simulations, especially in the mid-troposphere where there are mixed-phase clouds to consider. This presentation will demonstrate the proof of concept through comparisons of cloud water path from Aqua MODIS retrievals and NASA GISS-E2-[R/H] model simulations archived in the CMIP5 data portal.
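
    A minimal sketch of uniform gridding with a shared filter is shown below: level-2 pixel values are kept only if they pass common quality, viewing-angle and finiteness checks, then averaged into regular latitude-longitude bins. Field names, thresholds and the 1-degree resolution are placeholders rather than the actual algorithm or any instrument's level-2 variable names.

```python
import numpy as np

def grid_l2(lat, lon, value, quality, view_zenith, res=1.0, max_vza=60.0):
    """Average level-2 pixels onto a regular lat/lon grid after applying a common
    filter (quality flag, maximum viewing zenith angle, finite values)."""
    keep = (quality == 0) & (view_zenith <= max_vza) & np.isfinite(value)
    lat, lon, value = lat[keep], lon[keep], value[keep]
    ny, nx = int(180 / res), int(360 / res)
    i = np.clip(((lat + 90.0) // res).astype(int), 0, ny - 1)
    j = np.clip(((lon + 180.0) // res).astype(int), 0, nx - 1)
    total, count = np.zeros((ny, nx)), np.zeros((ny, nx))
    np.add.at(total, (i, j), value)
    np.add.at(count, (i, j), 1.0)
    with np.errstate(invalid="ignore", divide="ignore"):
        return np.where(count > 0, total / count, np.nan)

# toy usage with random "pixels"
rng = np.random.default_rng(4)
n = 100000
grid = grid_l2(lat=rng.uniform(-90, 90, n), lon=rng.uniform(-180, 180, n),
               value=rng.gamma(2.0, 50.0, n),              # e.g. a cloud water path proxy
               quality=rng.integers(0, 2, n), view_zenith=rng.uniform(0, 70, n))
print(round(float(np.nanmean(grid)), 1))
```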

  12. A Dynamic Bayesian Network model for long-term simulation of clinical complications in type 1 diabetes.

    PubMed

    Marini, Simone; Trifoglio, Emanuele; Barbarini, Nicola; Sambo, Francesco; Di Camillo, Barbara; Malovini, Alberto; Manfrini, Marco; Cobelli, Claudio; Bellazzi, Riccardo

    2015-10-01

    The increasing prevalence of diabetes and its related complications is raising the need for effective methods to predict patient evolution and for stratifying cohorts in terms of risk of developing diabetes-related complications. In this paper, we present a novel approach to the simulation of a type 1 diabetes population, based on Dynamic Bayesian Networks, which combines literature knowledge with data mining of a rich longitudinal cohort of type 1 diabetes patients, the DCCT/EDIC study. In particular, in our approach we simulate the patient health state and complications through discretized variables. Two types of models are presented, one entirely learned from the data and the other partially driven by literature derived knowledge. The whole cohort is simulated for fifteen years, and the simulation error (i.e. for each variable, the percentage of patients predicted in the wrong state) is calculated every year on independent test data. For each variable, the population predicted in the wrong state is below 10% on both models over time. Furthermore, the distributions of real vs. simulated patients greatly overlap. Thus, the proposed models are viable tools to support decision making in type 1 diabetes. Copyright © 2015 Elsevier Inc. All rights reserved.
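
    A minimal sketch of the yearly simulation-error metric described above (for each variable, the percentage of patients predicted in the wrong discretized state); the state arrays below are synthetic stand-ins, not DCCT/EDIC data.

```python
# Sketch of the per-year simulation error: % of patients in the wrong state.
# True and simulated states are hypothetical integer codes for one variable.
import numpy as np

def simulation_error(true_states, simulated_states):
    """true_states, simulated_states: (n_patients, n_years) integer state codes."""
    wrong = true_states != simulated_states
    return 100.0 * wrong.mean(axis=0)      # one error percentage per simulated year

rng = np.random.default_rng(0)
true_states = rng.integers(0, 3, size=(200, 15))
simulated_states = np.where(rng.random((200, 15)) < 0.92, true_states,
                            rng.integers(0, 3, size=(200, 15)))
print(simulation_error(true_states, simulated_states))
```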

  13. Run-up Variability due to Source Effects

    NASA Astrophysics Data System (ADS)

    Del Giudice, Tania; Zolezzi, Francesca; Traverso, Chiara; Valfrè, Giulio; Poggi, Pamela; Parker, Eric J.

    2010-05-01

    This paper investigates the variability of tsunami run-up at a specific location due to uncertainty in earthquake source parameters. It is important to quantify this 'inter-event' variability for probabilistic assessments of tsunami hazard. In principle, this aspect of variability could be studied by comparing field observations at a single location from a number of tsunamigenic events caused by the same source. As such an extensive dataset does not exist, we decided to study the inter-event variability through numerical modelling. We attempt to answer the question 'What is the potential variability of tsunami wave run-up at a specific site, for a given magnitude earthquake occurring at a known location?'. The uncertainty is expected to arise from the lack of knowledge regarding the specific details of the fault rupture 'source' parameters. The following steps were followed: the statistical distributions of the main earthquake source parameters affecting the tsunami height were established by studying fault plane solutions of known earthquakes; a case study based on a possible tsunami impact on the Egyptian coast was set up and simulated, varying the geometrical parameters of the source; simulation results were analyzed to derive relationships between run-up height and source parameters; using the derived relationships, a Monte Carlo simulation was performed to create the dataset needed to investigate the inter-event variability of the run-up height along the coast; and the inter-event variability of the run-up height along the coast was investigated. Given the distribution of source parameters and their variability, we studied how this variability propagates to the run-up height, using the Cornell 'Multi-grid coupled Tsunami Model' (COMCOT). The case study was based on the large thrust fault offshore of the south-western Greek coast, thought to have been responsible for the infamous 1303 tsunami. Numerical modelling of the event was used to assess the impact on the North African coast. The effects of uncertainty in fault parameters were assessed by perturbing the base model and observing the variation in wave height along the coast. The tsunami wave run-up was computed at 4020 locations along the Egyptian coast between longitudes 28.7°E and 33.8°E. To assess the effects of fault parameter uncertainty, input model parameters were varied and the effects on run-up were analyzed. The simulations show that, for a given point, there are linear relationships between run-up and both fault dislocation and rupture length. A superposition analysis shows that a linear combination of the effects of the different source parameters (the evaluated results) leads to a good approximation of the simulated results. This relationship is then used as the basis for a Monte Carlo simulation. The Monte Carlo simulation was performed for 1600 scenarios at each of the 4020 points along the coast. The coefficient of variation (the ratio between the standard deviation of the results and the average of the run-up heights along the coast) ranges between 0.14 and 3.11, with an average value along the coast of 0.67. The coefficient of variation of normalized run-up has been compared with the standard deviation of the spectral acceleration attenuation laws used in probabilistic seismic hazard assessment studies. These values have a similar meaning, and the uncertainty in the two cases is similar. The 'rule of thumb' relationship between mean and sigma can be expressed as μ + σ ≈ 2μ. The implication is that the uncertainty in run-up estimation should give a range of values within approximately two times the average. This uncertainty should be considered in tsunami hazard analysis, such as inundation and risk maps, evacuation plans and other related steps.
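
    An illustrative Monte Carlo sketch of the workflow summarized above: a linear run-up response surface in fault slip and rupture length is sampled to estimate the coefficient of variation at a single coastal point. The coefficients and input distributions are hypothetical, not values from the study.

```python
# Hedged Monte Carlo sketch: propagate source-parameter variability through a
# linear response surface for run-up and compute the coefficient of variation.
# All coefficients and distributions below are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(42)
n = 1600

slip = rng.lognormal(mean=1.5, sigma=0.3, size=n)      # fault dislocation (m), hypothetical
length = rng.normal(loc=150.0, scale=20.0, size=n)     # rupture length (km), hypothetical

a, b, c = 0.4, 0.01, 0.2                               # hypothetical linear coefficients
runup = a * slip + b * length + c                      # run-up height (m) at one point

cov = runup.std() / runup.mean()                       # coefficient of variation
print(f"mean run-up {runup.mean():.2f} m, CoV {cov:.2f}")
```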

  14. An assessment of global climate model-simulated climate for the western cordillera of Canada (1961-90)

    NASA Astrophysics Data System (ADS)

    Bonsal, Barrie R.; Prowse, Terry D.; Pietroniro, Alain

    2003-12-01

    Climate change is projected to significantly affect future hydrologic processes over many regions of the world. This is of particular importance for alpine systems that provide critical water supplies to lower-elevation regions. The western cordillera of Canada is a prime example where changes to temperature and precipitation could have profound hydro-climatic impacts not only for the cordillera itself, but also for downstream river systems and the drought-prone Canadian Prairies. At present, impact researchers primarily rely on global climate models (GCMs) for future climate projections. The main objective of this study is to assess several GCMs in their ability to simulate the magnitude and spatial variability of current (1961-90) temperature and precipitation over the western cordillera of Canada. In addition, several gridded data sets of observed climate for the study region are evaluated. Results reveal a close correspondence among the four gridded data sets of observed climate, particularly for temperature. There is, however, considerable variability regarding the various GCM simulations of this observed climate. The British, Canadian, German, Australian, and US GFDL models are superior at simulating the magnitude and spatial variability of mean temperature. The Japanese GCM is of intermediate ability, and the US NCAR model is least representative of temperature in this region. Nearly all the models substantially overestimate the magnitude of total precipitation, both annually and on a seasonal basis. An exception involves the British (Hadley) model, which best represents the observed magnitude and spatial variability of precipitation. This study improves our understanding regarding the accuracy of GCM climate simulations over the western cordillera of Canada. The findings may assist in producing more reliable future scenarios of hydro-climatic conditions over various regions of the country.

  15. Approaches for modeling within subject variability in pharmacometric count data analysis: dynamic inter-occasion variability and stochastic differential equations.

    PubMed

    Deng, Chenhui; Plan, Elodie L; Karlsson, Mats O

    2016-06-01

    Parameter variation in pharmacometric analysis studies can be characterized as within subject parameter variability (WSV) in pharmacometric models. WSV has previously been successfully modeled using inter-occasion variability (IOV), but also stochastic differential equations (SDEs). In this study, two approaches, dynamic inter-occasion variability (dIOV) and adapted stochastic differential equations, were proposed to investigate WSV in pharmacometric count data analysis. These approaches were applied to published count models for seizure counts and Likert pain scores. Both approaches improved the model fits significantly. In addition, stochastic simulation and estimation were used to explore further the capability of the two approaches to diagnose and improve models where existing WSV is not recognized. The results of simulations confirmed the gain in introducing WSV as dIOV and SDEs when parameters vary randomly over time. Further, the approaches were also informative as diagnostics of model misspecification, when parameters changed systematically over time but this was not recognized in the structural model. The proposed approaches in this study offer strategies to characterize WSV and are not restricted to count data.
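
    A conceptual sketch, not the published NONMEM models, of what within-subject variability means for count data: seizure-like counts are simulated with a subject baseline rate that either stays fixed or is perturbed anew on each occasion (a dIOV-like scheme); all rates and variances are hypothetical.

```python
# Hedged sketch of within-subject variability in a Poisson count model.
# Baseline rates, variances, and sample sizes are hypothetical assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_subjects, n_occasions = 50, 10

log_base = rng.normal(np.log(2.0), 0.3, size=n_subjects)           # between-subject variability

# Static model: the same rate applies on every occasion for a subject
counts_static = rng.poisson(np.exp(log_base)[:, None],
                            size=(n_subjects, n_occasions))

# Within-subject variability (dIOV-like): rate perturbed anew on each occasion
log_occ = log_base[:, None] + rng.normal(0.0, 0.4, size=(n_subjects, n_occasions))
counts_wsv = rng.poisson(np.exp(log_occ))

print("variance/mean, static:", counts_static.var() / counts_static.mean())
print("variance/mean, WSV:   ", counts_wsv.var() / counts_wsv.mean())
```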

  16. Time-Variable Transit Time Distributions in the Hyporheic Zone of a Headwater Mountain Stream

    NASA Astrophysics Data System (ADS)

    Ward, Adam S.; Schmadel, Noah M.; Wondzell, Steven M.

    2018-03-01

    Exchange of water between streams and their hyporheic zones is known to be dynamic in response to hydrologic forcing, variable in space, and to exist in a framework with nested flow cells. The expected result of heterogeneous geomorphic setting, hydrologic forcing, and between-feature interaction is hyporheic transit times that are highly variable in both space and time. Transit time distributions (TTDs) are important as they reflect the potential for hyporheic processes to impact biogeochemical transformations and ecosystems. In this study we simulate time-variable transit time distributions based on dynamic vertical exchange in a headwater mountain stream with observed, heterogeneous step-pool morphology. Our simulations include hyporheic exchange over a 600 m river corridor reach driven by continuously observed, time-variable hydrologic conditions for more than 1 year. We found that spatial variability at a given instant in time is typically larger than temporal variation for the reach. Furthermore, we found that reach-scale TTDs were marginally variable under all but the most extreme hydrologic conditions, indicating that TTDs are highly transferable in time. Finally, we found that aggregation of annual variation in space and time into a "master TTD" reasonably represents most of the hydrologic dynamics simulated, suggesting that this aggregation approach may provide a relevant basis for scaling from features or short reaches to entire networks.

  17. Performance of the WRF model to simulate the seasonal and interannual variability of hydrometeorological variables in East Africa: a case study for the Tana River basin in Kenya

    NASA Astrophysics Data System (ADS)

    Kerandi, Noah Misati; Laux, Patrick; Arnault, Joel; Kunstmann, Harald

    2017-10-01

    This study investigates the ability of the Weather Research and Forecasting (WRF) regional climate model to simulate the seasonal and interannual variability of hydrometeorological variables in the Tana River basin (TRB) in Kenya, East Africa. The impact of two different land use classifications, i.e., the Moderate Resolution Imaging Spectroradiometer (MODIS) and the US Geological Survey (USGS), at two horizontal resolutions (50 and 25 km) is investigated. Simulated precipitation and temperature for the period 2011-2014 are compared with Tropical Rainfall Measuring Mission (TRMM), Climate Research Unit (CRU), and station data. The ability of the TRMM and CRU data to reproduce in situ observations in the TRB is also analyzed. All considered WRF simulations capture well the annual as well as the interannual and spatial distribution of precipitation in the TRB according to station data and the TRMM estimates. Our results demonstrate that the increase of horizontal resolution from 50 to 25 km, together with the use of the MODIS land use classification, significantly improves the precipitation results. In the case of temperature, spatial patterns and the seasonal cycle are well reproduced, although there is a systematic cold bias with respect to both station and CRU data. Our results contribute to the identification of suitable and regionally adapted regional climate models (RCMs) for East Africa.

  18. Nonlinear intrinsic variables and state reconstruction in multiscale simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dsilva, Carmeline J., E-mail: cdsilva@princeton.edu; Talmon, Ronen, E-mail: ronen.talmon@yale.edu; Coifman, Ronald R., E-mail: coifman@math.yale.edu

    2013-11-14

    Finding informative low-dimensional descriptions of high-dimensional simulation data (like the ones arising in molecular dynamics or kinetic Monte Carlo simulations of physical and chemical processes) is crucial to understanding physical phenomena, and can also dramatically assist in accelerating the simulations themselves. In this paper, we discuss and illustrate the use of nonlinear intrinsic variables (NIV) in the mining of high-dimensional multiscale simulation data. In particular, we focus on the way NIV allows us to functionally merge different simulation ensembles, and different partial observations of these ensembles, as well as to infer variables not explicitly measured. The approach relies on certain simple features of the underlying process variability to filter out measurement noise and systematically recover a unique reference coordinate frame. We illustrate the approach through two distinct sets of atomistic simulations: a stochastic simulation of an enzyme reaction network exhibiting both fast and slow time scales, and a molecular dynamics simulation of alanine dipeptide in explicit water.

  19. Nonlinear intrinsic variables and state reconstruction in multiscale simulations

    NASA Astrophysics Data System (ADS)

    Dsilva, Carmeline J.; Talmon, Ronen; Rabin, Neta; Coifman, Ronald R.; Kevrekidis, Ioannis G.

    2013-11-01

    Finding informative low-dimensional descriptions of high-dimensional simulation data (like the ones arising in molecular dynamics or kinetic Monte Carlo simulations of physical and chemical processes) is crucial to understanding physical phenomena, and can also dramatically assist in accelerating the simulations themselves. In this paper, we discuss and illustrate the use of nonlinear intrinsic variables (NIV) in the mining of high-dimensional multiscale simulation data. In particular, we focus on the way NIV allows us to functionally merge different simulation ensembles, and different partial observations of these ensembles, as well as to infer variables not explicitly measured. The approach relies on certain simple features of the underlying process variability to filter out measurement noise and systematically recover a unique reference coordinate frame. We illustrate the approach through two distinct sets of atomistic simulations: a stochastic simulation of an enzyme reaction network exhibiting both fast and slow time scales, and a molecular dynamics simulation of alanine dipeptide in explicit water.

  20. Simulation analyses of space use: Home range estimates, variability, and sample size

    USGS Publications Warehouse

    Bekoff, Marc; Mech, L. David

    1984-01-01

    Simulations of space use by animals were run to determine the relationship among home range area estimates, variability, and sample size (number of locations). As sample size increased, home range size increased asymptotically, whereas variability decreased among mean home range area estimates generated by multiple simulations for the same sample size. Our results suggest that field workers should ascertain between 100 and 200 locations in order to estimate reliably home range area. In some cases, this suggested guideline is higher than values found in the few published studies in which the relationship between home range area and number of locations is addressed. Sampling differences for small species occupying relatively small home ranges indicate that fewer locations may be sufficient to allow for a reliable estimate of home range. Intraspecific variability in social status (group member, loner, resident, transient), age, sex, reproductive condition, and food resources also have to be considered, as do season, habitat, and differences in sampling and analytical methods. Comparative data still are needed.
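
    An illustrative sketch of the simulation idea above: home range area estimated from a minimum convex polygon grows asymptotically with the number of locations, while the spread among replicate estimates shrinks. The bivariate normal movement model is a hypothetical stand-in for the original simulations.

```python
# Hedged sketch: minimum convex polygon (MCP) home range area vs. sample size.
# The bivariate-normal "animal locations" are a hypothetical movement model.
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(3)

def mcp_area(n_locations, n_reps=50):
    """Mean and SD of MCP area over replicate simulations with n_locations fixes."""
    areas = []
    for _ in range(n_reps):
        pts = rng.normal(0.0, 1.0, size=(n_locations, 2))   # simulated location fixes
        areas.append(ConvexHull(pts).volume)                 # 2D "volume" = polygon area
    return np.mean(areas), np.std(areas)

for n in (25, 50, 100, 200, 400):
    mean_a, sd_a = mcp_area(n)
    print(f"{n:4d} locations: mean area {mean_a:.2f}, SD {sd_a:.2f}")
```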

  1. Regional Climate Simulation and Data Assimilation with Variable-Resolution GCMs

    NASA Technical Reports Server (NTRS)

    Fox-Rabinovitz, Michael S.

    2002-01-01

    Variable-resolution GCMs using a global stretched grid (SG) with enhanced regional resolution over one or multiple areas of interest represent a viable new approach to regional climate/climate change and data assimilation studies and applications. The multiple areas of interest, at least one within each global quadrant, include the major global mountains and major global monsoonal circulations over North America, South America, India-China, and Australia. They can also include the polar domains, and the European and African regions. The SG approach provides an efficient regional downscaling to mesoscales, and it is an ideal tool for representing consistent interactions of global/large and regional/meso scales while preserving the high quality of the global circulation. Basically, the SG-GCM simulations are no different from those of traditional uniform-grid GCM simulations besides using a variable-resolution grid. Several existing SG-GCMs developed by major centers and groups are briefly described. The major discussion is based on the GEOS (Goddard Earth Observing System) SG-GCM regional climate simulations.

  2. Situating Computer Simulation Professional Development: Does It Promote Inquiry-Based Simulation Use?

    ERIC Educational Resources Information Center

    Gonczi, Amanda L.; Maeng, Jennifer L.; Bell, Randy L.; Whitworth, Brooke A.

    2016-01-01

    This mixed-methods study sought to identify professional development implementation variables that may influence participant (a) adoption of simulations, and (b) use for inquiry-based science instruction. Two groups (Cohort 1, N = 52; Cohort 2, N = 104) received different professional development. Cohort 1 was focused on Web site use mechanics.…

  3. Analyzing the Effects of Horizontal Resolution on Long-Term Coupled WRF-CMAQ Simulations

    EPA Science Inventory

    The objective of this study is to determine the adequacy of using a relatively coarse horizontal resolution (i.e. 36 km) to simulate long-term trends of pollutant concentrations and radiation variables with the coupled WRF-CMAQ model. To this end, WRF-CMAQ simulations over the co...

  4. Variable selection in semiparametric cure models based on penalized likelihood, with application to breast cancer clinical trials.

    PubMed

    Liu, Xiang; Peng, Yingwei; Tu, Dongsheng; Liang, Hua

    2012-10-30

    Survival data with a sizable cure fraction are commonly encountered in cancer research. The semiparametric proportional hazards cure model has been recently used to analyze such data. As seen in the analysis of data from a breast cancer study, a variable selection approach is needed to identify important factors in predicting the cure status and risk of breast cancer recurrence. However, no specific variable selection method for the cure model is available. In this paper, we present a variable selection approach with penalized likelihood for the cure model. The estimation can be implemented easily by combining the computational methods for penalized logistic regression and the penalized Cox proportional hazards models with the expectation-maximization algorithm. We illustrate the proposed approach on data from a breast cancer study. We conducted Monte Carlo simulations to evaluate the performance of the proposed method. We used and compared different penalty functions in the simulation studies. Copyright © 2012 John Wiley & Sons, Ltd.
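
    A hedged sketch of the incidence-part idea only: a lasso-penalized logistic regression selecting predictors of cure status on simulated data. This is not the paper's EM-based semiparametric cure model, which also handles the latency (Cox) component; all data and tuning values below are assumptions.

```python
# Sketch: penalized (L1) logistic regression as a variable selector for a binary
# cure indicator. Simulated covariates; only 3 are truly informative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, p = 300, 20
X = rng.normal(size=(n, p))
true_beta = np.zeros(p)
true_beta[:3] = [1.2, -0.8, 0.6]                      # informative covariates (assumed)
prob_cured = 1.0 / (1.0 + np.exp(-(X @ true_beta - 0.5)))
cured = rng.binomial(1, prob_cured)

lasso_logit = LogisticRegression(penalty="l1", solver="liblinear", C=0.3)
lasso_logit.fit(X, cured)
selected = np.flatnonzero(lasso_logit.coef_.ravel() != 0.0)
print("selected covariates:", selected)
```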

  5. Summer U.S. Surface Air Temperature Variability: Controlling Factors and AMIP Simulation Biases

    NASA Astrophysics Data System (ADS)

    Merrifield, A.; Xie, S. P.

    2016-02-01

    This study documents and investigates biases in simulating summer surface air temperature (SAT) variability over the continental U.S. in the Coupled Model Intercomparison Project (CMIP5) Atmospheric Model Intercomparison Project (AMIP). Empirical orthogonal function (EOF) and multivariate regression analyses are used to assess the relative importance of circulation and the land surface feedback in setting summer SAT over a 30-year period (1979-2008). In observations, regions of high SAT variability are closely associated with midtropospheric highs and subsidence, consistent with adiabatic theory (Meehl and Tebaldi 2004, Lau and Nath 2012). Preliminary analysis shows that the majority of the AMIP models feature high SAT variability over the central U.S., displaced south and/or west of the observed centers of action (COAs). SAT COAs in models tend to be concomitant with regions of high sensible heat flux variability, suggesting that an excessive land surface feedback in these models modulates U.S. summer SAT. Additionally, tropical sea surface temperatures (SSTs) play a role in forcing the leading EOF mode for summer SAT, in concert with internal atmospheric variability. There is evidence that models respond to different SST patterns than those observed. Addressing issues with the bulk land surface feedback and the SST-forced component of atmospheric variability may be key to improving model skill in simulating summer SAT variability over the U.S.
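
    A minimal EOF sketch of the kind of analysis described above: anomalies of a (time x space) field are decomposed with an SVD to obtain principal components, spatial patterns, and explained variance; the field here is synthetic, not the AMIP output.

```python
# Hedged EOF sketch via SVD of anomalies; the (time x space) field is synthetic.
import numpy as np

rng = np.random.default_rng(7)
n_years, n_grid = 30, 500
field = rng.normal(size=(n_years, n_grid))            # stand-in for summer-mean SAT

anom = field - field.mean(axis=0)                     # remove the climatology
u, s, vt = np.linalg.svd(anom, full_matrices=False)

pcs = u * s                                           # principal component time series
eofs = vt                                             # spatial patterns (EOFs)
explained = s**2 / np.sum(s**2)
print("variance explained by first 3 EOFs:", np.round(explained[:3], 3))
```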

  6. High resolution modelling of soil moisture patterns with TerrSysMP: A comparison with sensor network data

    NASA Astrophysics Data System (ADS)

    Gebler, S.; Hendricks Franssen, H.-J.; Kollet, S. J.; Qu, W.; Vereecken, H.

    2017-04-01

    The prediction of the spatial and temporal variability of land surface states and fluxes with land surface models at high spatial resolution is still a challenge. This study compares simulation results using TerrSysMP including a 3D variably saturated groundwater flow model (ParFlow) coupled to the Community Land Model (CLM) of a 38 ha managed grassland head-water catchment in the Eifel (Germany), with soil water content (SWC) measurements from a wireless sensor network, actual evapotranspiration recorded by lysimeters and eddy covariance stations and discharge observations. TerrSysMP was discretized with a 10 × 10 m lateral resolution, variable vertical resolution (0.025-0.575 m), and the following parameterization strategies of the subsurface soil hydraulic parameters: (i) completely homogeneous, (ii) homogeneous parameters for different soil horizons, (iii) different parameters for each soil unit and soil horizon and (iv) heterogeneous stochastic realizations. Hydraulic conductivity and Mualem-Van Genuchten parameters in these simulations were sampled from probability density functions, constructed from either (i) soil texture measurements and Rosetta pedotransfer functions (ROS), or (ii) estimated soil hydraulic parameters by 1D inverse modelling using shuffle complex evolution (SCE). The results indicate that the spatial variability of SWC at the scale of a small headwater catchment is dominated by topography and spatially heterogeneous soil hydraulic parameters. The spatial variability of the soil water content thereby increases as a function of heterogeneity of soil hydraulic parameters. For lower levels of complexity, spatial variability of the SWC was underrepresented in particular for the ROS-simulations. Whereas all model simulations were able to reproduce the seasonal evapotranspiration variability, the poor discharge simulations with high model bias are likely related to short-term ET dynamics and the lack of information about bedrock characteristics and an on-site drainage system in the uncalibrated model. In general, simulation performance was better for the SCE setups. The SCE-simulations had a higher inverse air entry parameter resulting in SWC dynamics in better correspondence with data than the ROS simulations during dry periods. This illustrates that small scale measurements of soil hydraulic parameters cannot be transferred to the larger scale and that interpolated 1D inverse parameter estimates result in an acceptable performance for the catchment.

  7. Improving plot- and regional-scale crop models for simulating impacts of climate variability and extremes

    NASA Astrophysics Data System (ADS)

    Tao, F.; Rötter, R.

    2013-12-01

    Many studies on global climate report that climate variability is increasing, with more frequent and intense extreme events [1]. There are quite large uncertainties from both the plot- and regional-scale models in simulating impacts of climate variability and extremes on crop development, growth and productivity [2,3]. One key to reducing the uncertainties is better exploitation of experimental data to eliminate crop model deficiencies and develop better algorithms that more adequately capture the impacts of extreme events, such as high temperature and drought, on crop performance [4,5]. In the present study, in a first step, the inter-annual variability in wheat yield and climate from 1971 to 2012 in Finland was investigated. Using statistical approaches, the impacts of climate variability and extremes on wheat growth and productivity were quantified. In a second step, a plot-scale model, WOFOST [6], and a regional-scale crop model, MCWLA [7], were calibrated and validated, and applied to simulate wheat growth and yield variability from 1971-2012. Next, the estimated impacts of high temperature stress, cold damage, and drought stress on crop growth and productivity based on the statistical approaches and on the crop simulation models WOFOST and MCWLA were compared. Then, the impact mechanisms of climate extremes on crop growth and productivity in the WOFOST model and MCWLA model were identified, and subsequently, the various algorithms and impact functions were fitted against the long-term crop trial data. Finally, the impact mechanisms, algorithms and functions in the WOFOST model and MCWLA model were improved to better simulate the impacts of climate variability and extremes, particularly high temperature stress, cold damage and drought stress, for location-specific and large-area climate impact assessments. Our studies provide a good example of how to improve, in parallel, the plot- and regional-scale models for simulating impacts of climate variability and extremes, as needed for better informed decision-making on adaptation strategies. References: 1. Coumou, D. & Rahmstorf, S. A decade of extremes. Nature Clim. Change, 2, 491-496 (2012). 2. Rötter, R. P., Carter, T. R., Olesen, J. E. & Porter, J. R. Crop-climate models need an overhaul. Nature Clim. Change, 1, 175-177 (2011). 3. Asseng, S. et al. Uncertainty in simulating wheat yields under climate change. Nature Clim. Change, doi: 10.1038/nclimate1916 (2013). 4. Porter, J. R. & Semenov, M. Crop responses to climatic variation. Trans. R. Soc. B, 360, 2021-2035 (2005). 5. Porter, J. R. & Christensen, S. Deconstructing crop processes and models via identities. Plant, Cell and Environment, doi: 10.1111/pce.12107 (2013). 6. Boogaard, H. L., van Diepen, C. A., Rötter, R. P., Cabrera, J. M. & van Laar, H. H. User's guide for the WOFOST 7.1 crop growth simulation model and Control Center 1.5, Alterra, Wageningen, The Netherlands (1998). 7. Tao, F. & Zhang, Z. Climate change, wheat productivity and water use in the North China Plain: a new super-ensemble-based probabilistic projection. Agric. Forest Meteorol., 170, 146-165 (2013).

  8. Multivariate evaluation of the cutting performance of rotary instruments with electric and air-turbine handpieces.

    PubMed

    Funkenbusch, Paul D; Rotella, Mario; Chochlidakis, Konstantinos; Ercoli, Carlo

    2016-10-01

    Laboratory studies of tooth preparation often involve single values for all variables other than the one being tested. In contrast, in clinical settings, not all variables can be adequately controlled. For example, a new dental rotary cutting instrument may be tested in the laboratory by making a specific cut with a fixed force, but, in clinical practice, the instrument must make different cuts with individual dentists applying different forces. Therefore, the broad applicability of laboratory results to diverse clinical conditions is uncertain and the comparison of effects across studies difficult. The purpose of this in vitro study was to examine the effects of 9 process variables on the dental cutting of rotary cutting instruments used with an electric handpiece and compare them with those of a previous study that used an air-turbine handpiece. The effects of 9 key process variables on the efficiency of a simulated dental cutting operation were measured. A fractional factorial experiment was conducted by using an electric handpiece in a computer-controlled, dedicated testing apparatus to simulate dental cutting procedures with Macor blocks as the cutting substrate. Analysis of variance (ANOVA) was used to assess the statistical significance (α=.05). Four variables (targeted applied load, cut length, diamond grit size, and cut type) consistently produced large, statistically significant effects, whereas 5 variables (rotation per minute, number of cooling ports, rotary cutting instrument diameter, disposability, and water flow rate) produced relatively small, statistically insignificant effects. These results are generally similar to those previously found for an air-turbine handpiece. Regardless of whether an electric or air-turbine handpiece was used, the control exerted by the dentist, simulated in this study by targeting a specific level of applied force, was the single most important factor affecting cutting efficiency. Cutting efficiency was also significantly affected by factors simulating patient/clinical circumstances and hardware choices. These results highlight the greater importance of local clinical conditions (procedure, dentist) in understanding dental cutting as opposed to other hardware-related factors. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  9. Spatial Variability of Trace Gases During DISCOVER-AQ: Planning for Geostationary Observations of Atmospheric Composition

    NASA Technical Reports Server (NTRS)

    Follette-Cook, Melanie B.; Pickering, K.; Crawford, J.; Appel, W.; Diskin, G.; Fried, A.; Loughner, C.; Pfister, G.; Weinheimer, A.

    2015-01-01

    Results from an in-depth analysis of trace gas variability in MD indicated that the variability in this region was large enough to be observable by a TEMPO-like instrument. The variability observed in MD is relatively similar to that of the other three campaigns, with a few exceptions: CO variability in CA was much higher than in the other regions; HCHO variability in CA and CO was much lower; and MD showed the lowest variability in NO2. All model simulations do a reasonable job simulating O3 variability. For CO, the CA and CO simulations largely under- or overestimate the variability in the observations. The variability in HCHO is underestimated for every campaign. NO2 variability is slightly overestimated in MD, and more so in CO. The TX simulation underestimates the variability in each trace gas. This is most likely due to missing emissions sources (C. Loughner, manuscript in preparation). Future Work: Where reasonable, we will use these model outputs to further explore the resolvability from space of these key trace gases using analyses of tropospheric column amounts relative to satellite precision requirements, similar to Follette-Cook et al. (2015).

  10. High resolution simulations of a variable HH jet

    NASA Astrophysics Data System (ADS)

    Raga, A. C.; de Colle, F.; Kajdič, P.; Esquivel, A.; Cantó, J.

    2007-04-01

    Context: In many papers, the flows in Herbig-Haro (HH) jets have been modeled as collimated outflows with a time-dependent ejection. In particular, a supersonic variability of the ejection velocity leads to the production of "internal working surfaces" which (for appropriate forms of the time-variability) can produce emitting knots that resemble the chains of knots observed along HH jets. Aims: In this paper, we present axisymmetric simulations of an "internal working surface" in a radiative jet (produced by an ejection velocity variability). We concentrate on a given parameter set (i.e., on a jet with a constant ejection density, and a sinusoidal velocity variability with a 20 yr period and a 40 km s^-1 half-amplitude), and carry out a study of the behaviour of the solution for increasing numerical resolutions. Methods: In our simulations, we solve the gasdynamic equations together with a 17-species atomic/ionic network, and we are therefore able to compute emission coefficients for different emission lines. Results: We compute 3 adaptive grid simulations, with 20, 163 and 1310 grid points (at the highest grid resolution) across the initial jet radius. From these simulations we see that successively more complex structures are obtained for increasing numerical resolutions. Such an effect is seen in the stratifications of the flow variables as well as in the predicted emission line intensity maps. Conclusions: We find that while the detailed structure of an internal working surface depends on resolution, the predicted emission line luminosities (integrated over the volume of the working surface) are surprisingly stable. This is definitely good news for the future computation of predictions from radiative jet models for carrying out comparisons with observations of HH objects.

  11. Intensified Indian Ocean climate variability during the Last Glacial Maximum

    NASA Astrophysics Data System (ADS)

    Thirumalai, K.; DiNezro, P.; Tierney, J. E.; Puy, M.; Mohtadi, M.

    2017-12-01

    Climate models project increased year-to-year climate variability in the equatorial Indian Ocean in response to greenhouse gas warming. This response has been attributed to changes in the mean climate of the Indian Ocean associated with the zonal sea-surface temperature (SST) gradient. According to these studies, air-sea coupling is enhanced due to a stronger SST gradient driving anomalous easterlies that shoal the thermocline in the eastern Indian Ocean. We propose that this relationship between the variability and the zonal SST gradient is consistent across different mean climate states. We test this hypothesis using simulations of past and future climate performed with the Community Earth System Model Version 1 (CESM1). We constrain the realism of the model for the Last Glacial Maximum (LGM) where CESM1 simulates a mean climate consistent with a stronger SST gradient, agreeing with proxy reconstructions. CESM1 also simulates a pronounced increase in seasonal and interannual variability. We develop new estimates of climate variability on these timescales during the LGM using δ18O analysis of individual foraminifera (IFA). IFA data generated from four different cores located in the eastern Indian Ocean indicate a marked increase in δ18O-variance during the LGM as compared to the late Holocene. Such a significant increase in the IFA-δ18O variance strongly supports the modeling simulations. This agreement further supports the dynamics linking year-to-year variability and an altered SST gradient, increasing our confidence in model projections.

  12. Intraseasonal Variability of the Indian Monsoon as Simulated by a Global Model

    NASA Astrophysics Data System (ADS)

    Joshi, Sneh; Kar, S. C.

    2018-01-01

    This study uses the global forecast system (GFS) model at T126 horizontal resolution to carry out seasonal simulations with prescribed sea-surface temperatures. The main objective of the study is to evaluate the simulated Indian monsoon variability on intraseasonal timescales. The GFS model has been integrated for 29 monsoon seasons with 15-member ensembles forced with observed sea-surface temperatures (SSTs), and additional 16-member ensemble runs have been carried out using climatological SSTs. Northward propagation of intraseasonal rainfall anomalies over the Indian region in the model simulations has been examined. It is found that the model is unable to simulate the observed moisture pattern when the active zone of convection is over central India. However, the model simulates the observed pattern of specific humidity during the life cycle of northward propagation on day -10 and day +10 of maximum convection over central India. The space-time spectral analysis of the simulated equatorial waves shows that the ensemble members have varying amounts of power in each band of wavenumbers and frequencies. However, variations among ensemble members are larger in the antisymmetric component of westward-moving waves, and the maximum difference in power among ensemble members is seen in the 8-20 day mode.

  13. Improved spectral comparisons of paleoclimate models and observations via proxy system modeling: Implications for multi-decadal variability

    NASA Astrophysics Data System (ADS)

    Dee, S. G.; Parsons, L. A.; Loope, G. R.; Overpeck, J. T.; Ault, T. R.; Emile-Geay, J.

    2017-10-01

    The spectral characteristics of paleoclimate observations spanning the last millennium suggest the presence of significant low-frequency (multi-decadal to centennial scale) variability in the climate system. Since this low-frequency climate variability is critical for climate predictions on societally-relevant scales, it is essential to establish whether general circulation models (GCMs) are able to simulate it faithfully. Recent studies find large discrepancies between models and paleoclimate data at low frequencies, prompting concerns surrounding the ability of GCMs to predict long-term, high-magnitude variability under greenhouse forcing (Laepple and Huybers, 2014a, 2014b). However, efforts to ground climate model simulations directly in paleoclimate observations are impeded by fundamental differences between models and the proxy data: proxy systems often record a multivariate and/or nonlinear response to climate, precluding a direct comparison to GCM output. In this paper we bridge this gap via a forward proxy modeling approach, coupled to an isotope-enabled GCM. This allows us to disentangle the various contributions to signals embedded in ice cores, speleothem calcite, coral aragonite, tree-ring width, and tree-ring cellulose. The paper addresses the following questions: (1) do forward-modeled "pseudoproxies" exhibit variability comparable to proxy data? (2) if not, which processes alter the shape of the spectrum of simulated climate variability, and are these processes broadly distinguishable from climate? We apply our method to representative case studies, and broaden these insights with an analysis of the PAGES2k database (PAGES2K Consortium, 2013). We find that current proxy system models (PSMs) can help resolve model-data discrepancies on interannual to decadal timescales, but cannot account for the mismatch in variance on multi-decadal to centennial timescales. We conclude that, specific to this set of PSMs and isotope-enabled model, the paleoclimate record may exhibit larger low-frequency variability than GCMs currently simulate, indicative of incomplete physics and/or forcings.

  14. Impact of non-migrating tides on the low latitude ionosphere during a sudden stratospheric warming event in January 2010

    NASA Astrophysics Data System (ADS)

    McDonald, S. E.; Sassi, F.; Tate, J.; McCormack, J.; Kuhl, D. D.; Drob, D. P.; Metzler, C.; Mannucci, A. J.

    2018-06-01

    The lower atmosphere contributes significantly to the day-to-day variability of the ionosphere, especially during solar minimum conditions. Ionosphere/atmosphere model simulations that incorporate meteorology from data assimilation analysis products can be critically important for elucidating the physical processes that have substantial impact on ionospheric weather. In this study, the NCAR Whole Atmosphere Community Climate Model, extended version with specified dynamics (SD-WACCM-X) is coupled with an ionospheric model (Sami3 is Another Model of the Ionosphere) to study day-to-day variability in the ionosphere during January 2010. Lower atmospheric weather patterns are introduced into the SAMI3/SD-WACCM-X simulations using the 6-h Navy Operational Global Atmospheric Prediction System-Advanced Level Physics High Altitude (NOGAPS-ALPHA) data assimilation products. The same time period is simulated using the new atmospheric forecast model, the High Altitude Navy Global Environmental Model (HA-NAVGEM), a hybrid 4D-Var prototype data assimilation with the ability to produce meteorological fields at a 3-h cadence. Our study shows that forcing SD-WACCM-X with HA-NAVGEM better resolves the semidiurnal tides and introduces more day-to-day variability into the ionosphere than forcing with NOGAPS-ALPHA. The SAMI3/SD-WACCM-X/HA-NAVGEM simulation also more accurately captures the longitudinal variability associated with non-migrating tides in the equatorial ionization anomaly (EIA) region as compared to total electron content (TEC) maps derived from GPS data. Both the TEC maps and the SAMI3/SD-WACCM-X/HA-NAVGEM simulation show an enhancement in TEC over South America during 17-21 January 2010, which coincides with the commencement of a stratospheric warming event on 19 January 2010. Analysis of the SAMI3/SD-WACCM-X/HA-NAVGEM simulations indicates non-migrating tides (including DW4, DE2 and SW5) played a role during 17-21 January in shifting the phase of the wave-3 pattern in the ionosphere on these days. Constructive interference of wave-3 and wave-4 patterns in the E × B drifts contributed to the enhanced TEC in the South American longitude sector. The results of the study highlight the importance of high fidelity meteorology in understanding the day-to-day variability of the ionosphere.

  15. Stochastic model search with binary outcomes for genome-wide association studies.

    PubMed

    Russu, Alberto; Malovini, Alberto; Puca, Annibale A; Bellazzi, Riccardo

    2012-06-01

    The spread of case-control genome-wide association studies (GWASs) has stimulated the development of new variable selection methods and predictive models. We introduce a novel Bayesian model search algorithm, Binary Outcome Stochastic Search (BOSS), which addresses the model selection problem when the number of predictors far exceeds the number of binary responses. Our method is based on a latent variable model that links the observed outcomes to the underlying genetic variables. A Markov Chain Monte Carlo approach is used for model search and to evaluate the posterior probability of each predictor. BOSS is compared with three established methods (stepwise regression, logistic lasso, and elastic net) in a simulated benchmark. Two real case studies are also investigated: a GWAS on the genetic bases of longevity, and the type 2 diabetes study from the Wellcome Trust Case Control Consortium. Simulations show that BOSS achieves higher precisions than the reference methods while preserving good recall rates. In both experimental studies, BOSS successfully detects genetic polymorphisms previously reported to be associated with the analyzed phenotypes. BOSS outperforms the other methods in terms of F-measure on simulated data. In the two real studies, BOSS successfully detects biologically relevant features, some of which are missed by univariate analysis and the three reference techniques. The proposed algorithm is an advance in the methodology for model selection with a large number of features. Our simulated and experimental results showed that BOSS proves effective in detecting relevant markers while providing a parsimonious model.
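
    A toy stochastic model search, not the BOSS implementation: a Metropolis-style walk over inclusion indicators for a logistic model, scored by BIC on simulated data, illustrating how a model-space search can rank predictors. The scoring rule, proposal scheme, and data are all simplifying assumptions.

```python
# Hedged sketch of stochastic search over predictor-inclusion indicators.
# Simulated data; BIC scoring and the Metropolis-like rule are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, p = 200, 30
X = rng.normal(size=(n, p))
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(1.5 * X[:, 0] - 1.0 * X[:, 1]))))

def bic(mask):
    """BIC of a logistic model using only the predictors flagged in mask."""
    k = int(mask.sum())
    if k == 0:
        p0 = np.clip(y.mean(), 1e-12, 1 - 1e-12)
        return -2.0 * np.sum(y * np.log(p0) + (1 - y) * np.log(1 - p0))
    fit = LogisticRegression(C=1e6, max_iter=1000).fit(X[:, mask], y)
    prob = np.clip(fit.predict_proba(X[:, mask])[:, 1], 1e-12, 1 - 1e-12)
    ll = np.sum(y * np.log(prob) + (1 - y) * np.log(1 - prob))
    return -2.0 * ll + k * np.log(n)

mask = np.zeros(p, dtype=bool)
score = bic(mask)
visits = np.zeros(p)
for _ in range(500):
    j = rng.integers(p)
    cand = mask.copy()
    cand[j] = ~cand[j]                                 # flip one inclusion indicator
    cand_score = bic(cand)
    # accept better models always, worse models with a Metropolis-like probability
    if cand_score < score or rng.random() < np.exp((score - cand_score) / 2.0):
        mask, score = cand, cand_score
    visits += mask
print("most frequently included predictors:", np.argsort(visits)[::-1][:5])
```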

  16. Propagation of variability in railway dynamic simulations: application to virtual homologation

    NASA Astrophysics Data System (ADS)

    Funfschilling, Christine; Perrin, Guillaume; Kraft, Sönke

    2012-01-01

    Railway dynamic simulations are increasingly used to predict and analyse the behaviour of the vehicle and of the track during their whole life cycle. Up to now, however, no simulation has been used in the certification procedure, even though the expected benefits are important: cheaper and shorter procedures, more objectivity, and better knowledge of the behaviour around critical situations. Deterministic simulations are nevertheless too poor to represent the full physics of the track/vehicle system, which contains several sources of variability: variability of the mechanical parameters of a train among a class of vehicles (mass, stiffness and damping of different suspensions), variability of the contact parameters (friction coefficient, wheel and rail profiles) and variability of the track design and quality. This variability plays an important role in safety and ride quality, and thus in the certification criteria. When using simulation for certification purposes, it therefore seems crucial to take the variability of the different inputs into account. The main goal of this article is thus to propose a method to introduce variability into railway dynamics. A four-step method is described, namely the definition of the stochastic problem, the modelling of the input variability, the propagation, and the analysis of the output. Each step is illustrated with railway examples.
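
    A schematic sketch of the four-step method summarized above (problem definition, input variability modelling, propagation, output analysis), using a hypothetical algebraic surrogate in place of a railway dynamics code; all distributions and the criterion are assumptions for illustration.

```python
# Hedged sketch of Monte Carlo propagation of input variability through a
# surrogate "vehicle/track" model; every distribution and formula is assumed.
import numpy as np

rng = np.random.default_rng(5)
n = 2000

# Step 2: model the input variability
friction = rng.uniform(0.25, 0.45, n)                 # wheel/rail friction coefficient
damping = rng.normal(20e3, 3e3, n)                    # suspension damping (N s/m)
track_quality = rng.lognormal(0.0, 0.25, n)           # dimensionless irregularity factor

# Step 3: propagate through the (surrogate) vehicle/track model
def surrogate_criterion(mu, c, q):
    """Hypothetical safety criterion; a real study would call the dynamics code."""
    return 0.8 * q / (mu * np.sqrt(c / 20e3))

y = surrogate_criterion(friction, damping, track_quality)

# Step 4: analyse the output distribution against a certification threshold
print("mean criterion:", y.mean(), "95th percentile:", np.percentile(y, 95))
```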

  17. The effects of cracks on the quantification of the cancellous bone fabric tensor in fossil and archaeological specimens: a simulation study.

    PubMed

    Bishop, Peter J; Clemente, Christofer J; Hocknull, Scott A; Barrett, Rod S; Lloyd, David G

    2017-03-01

    Cancellous bone is very sensitive to its prevailing mechanical environment, and study of its architecture has previously aided interpretations of locomotor biomechanics in extinct animals or archaeological populations. However, quantification of architectural features may be compromised by poor preservation in fossil and archaeological specimens, such as post mortem cracking or fracturing. In this study, the effects of post mortem cracks on the quantification of cancellous bone fabric were investigated through the simulation of cracks in otherwise undamaged modern bone samples. The effect on both scalar (degree of fabric anisotropy, fabric elongation index) and vector (principal fabric directions) variables was assessed through comparing the results of architectural analyses of cracked vs. non-cracked samples. Error was found to decrease as the relative size of the crack decreased, and as the orientation of the crack approached the orientation of the primary fabric direction. However, even in the best-case scenario simulated, error remained substantial, with at least 18% of simulations showing a > 10% error when scalar variables were considered, and at least 6.7% of simulations showing a > 10° error when vector variables were considered. As a 10% (scalar) or 10° (vector) difference is probably too large for reliable interpretation of a fossil or archaeological specimen, these results suggest that cracks should be avoided if possible when analysing cancellous bone architecture in such specimens. © 2016 Anatomical Society.

  18. Multiscale equation-free algorithms for molecular dynamics

    NASA Astrophysics Data System (ADS)

    Abi Mansour, Andrew

    Molecular dynamics is a physics-based computational tool that has been widely employed to study the dynamics and structure of macromolecules and their assemblies at the atomic scale. However, the efficiency of molecular dynamics simulation is limited because of the broad spectrum of timescales involved. To overcome this limitation, an equation-free algorithm is presented for simulating these systems using a multiscale model cast in terms of atomistic and coarse-grained variables. Both variables are evolved in time in such a way that the cross-talk between short and long scales is preserved. In this way, the coarse-grained variables guide the evolution of the atom-resolved states, while the latter provide the Newtonian physics for the former. While the atomistic variables are evolved using short molecular dynamics runs, time advancement at the coarse-grained level is achieved with a scheme that uses information from past and future states of the system while accounting for both the stochastic and deterministic features of the coarse-grained dynamics. To complete the multiscale cycle, an atom-resolved state consistent with the updated coarse-grained variables is recovered using algorithms from mathematical optimization. This multiscale paradigm is extended to nanofluidics using concepts from hydrodynamics, and it is demonstrated for macromolecular and nanofluidic systems. A toolkit is developed for prototyping these algorithms, which are then implemented within the GROMACS simulation package and released as an open source multiscale simulator.

  19. A Single-column Model Ensemble Approach Applied to the TWP-ICE Experiment

    NASA Technical Reports Server (NTRS)

    Davies, L.; Jakob, C.; Cheung, K.; DelGenio, A.; Hill, A.; Hume, T.; Keane, R. J.; Komori, T.; Larson, V. E.; Lin, Y.; hide

    2013-01-01

    Single-column models (SCM) are useful test beds for investigating the parameterization schemes of numerical weather prediction and climate models. The usefulness of SCM simulations is limited, however, by the accuracy of the best estimate large-scale observations prescribed. Errors in estimating the observations will result in uncertainty in the modeled simulations. One method to address the modeled uncertainty is to simulate an ensemble where the ensemble members span the observational uncertainty. This study first derives an ensemble of large-scale data for the Tropical Warm Pool International Cloud Experiment (TWP-ICE) based on an estimate of a possible source of error in the best estimate product. These data are then used to carry out simulations with 11 SCM and two cloud-resolving models (CRM). Best estimate simulations are also performed. All models show that moisture-related variables are close to observations and there are limited differences between the best estimate and ensemble mean values. The models, however, show different sensitivities to changes in the forcing, particularly when weakly forced. The ensemble simulations highlight important differences in the surface evaporation term of the moisture budget between the SCM and CRM. Differences are also apparent between the models in the ensemble mean vertical structure of cloud variables, while for each model, cloud properties are relatively insensitive to forcing. The ensemble is further used to investigate cloud variables and precipitation and identifies differences between CRM and SCM, particularly for relationships involving ice. This study highlights the additional analysis that can be performed using ensemble simulations and hence enables a more complete model investigation compared to the more traditional single best estimate simulation alone.

  20. Support for Simulation-Based Learning; The Effects of Model Progression and Assignments on Learning about Oscillatory Motion.

    ERIC Educational Resources Information Center

    Swaak, Janine; And Others

    In this study, learners worked with a simulation of harmonic oscillation. Two supportive measures were introduced: model progression and assignments. In model progression, the model underlying the simulation is not offered in its full complexity from the start, but variables are gradually introduced. Assignments are small exercises that help the…

  1. Assessment of the effects of horizontal grid resolution on long-term air quality trends using coupled WRF-CMAQ simulations

    EPA Science Inventory

    The objective of this study is to determine the adequacy of using a relatively coarse horizontal resolution (i.e. 36 km) to simulate long-term trends of pollutant concentrations and radiation variables with the coupled WRF-CMAQ model. WRF-CMAQ simulations over the continental Uni...

  2. Spatial interpolation of forest conditions using co-conditional geostatistical simulation

    Treesearch

    H. Todd Mowrer

    2000-01-01

    In recent work the author used the geostatistical Monte Carlo technique of sequential Gaussian simulation (s.G.s.) to investigate uncertainty in a GIS analysis of potential old-growth forest areas. The current study compares this earlier technique to that of co-conditional simulation, wherein the spatial cross-correlations between variables are included. As in the...

  3. Simulation Framework for Teaching in Modeling and Simulation Areas

    ERIC Educational Resources Information Center

    De Giusti, Marisa Raquel; Lira, Ariel Jorge; Villarreal, Gonzalo Lujan

    2008-01-01

    Simulation is the process of executing a model that describes a system with enough detail; this model has its entities, an internal state, some input and output variables and a list of processes bound to these variables. Teaching a simulation language such as general purpose simulation system (GPSS) is always a challenge, because of the way it…

  4. Evaluation of feedforward and feedback contributions to hand stiffness and variability in multijoint arm control.

    PubMed

    He, Xin; Du, Yu-Fan; Lan, Ning

    2013-07-01

    The purpose of this study is to validate a neuromechanical model of the virtual arm (VA) by comparing emerging behaviors of the model to those of experimental observations. Hand stiffness of the VA model was obtained by either theoretical computation or simulated perturbations. Variability in hand position of the VA was generated by adding signal dependent noise (SDN) to the motoneuron pools of muscles. Reflex circuits of Ia, Ib and Renshaw cells were included to regulate the motoneuron pool outputs. Evaluation of hand stiffness and variability was conducted in simulations with and without afferent feedback under different patterns of muscle activations during postural maintenance. The simulated hand stiffness and variability ellipses captured the experimentally observed features in shape, magnitude and orientation. Steady state afferent feedback contributed significantly to the increase in hand stiffness by 35.75±16.99% in area, 18.37±7.80% and 16.15±7.15% in major and minor axes; and to the reduction of hand variability by 49.41±21.19% in area, 36.89±12.78% and 18.87±23.32% in major and minor axes. The VA model reproduced the neuromechanical behaviors that were consistent with experimental data, and it could be a useful tool for study of neural control of posture and movement, as well as for application to rehabilitation.

  5. What controls the variability of oxygen in the subpolar North Pacific?

    NASA Astrophysics Data System (ADS)

    Takano, Yohei

    Dissolved oxygen is a widely observed chemical quantity in the oceans, along with temperature and salinity. Changes in dissolved oxygen have been observed across the world's oceans. Observed oxygen at Ocean Station Papa (OSP, 50°N, 145°W) in the Gulf of Alaska exhibits strong variability on interannual and decadal timescales; however, the mechanisms driving the observed variability are not yet fully understood. Furthermore, the irregular sampling frequency and relatively short record length make it difficult to detect low-frequency variability. Motivated by these observations, we investigate the mechanisms driving the low-frequency variability of oxygen in the subpolar North Pacific. The specific purposes of this study are (1) to evaluate the robustness of the observed low-frequency variability of dissolved oxygen and (2) to determine the mechanisms driving the observed variability using statistical data analysis and numerical simulations. To evaluate the robustness of the low-frequency variability, we conducted spectral analyses on the observed oxygen at OSP. To address the irregular sampling frequency, we randomly sub-sampled the raw data to form 500 ensemble members with a regular time interval, and then performed spectral analyses. The resulting power spectrum of oxygen exhibits robust low-frequency variability, and a statistically significant spectral peak is identified at a timescale of 15-20 years. The wintertime oceanic barotropic streamfunction is significantly correlated with the observed oxygen anomaly at OSP, with a north-south dipole structure over the North Pacific. We hypothesize that the observed low-frequency variability is primarily driven by the variability of large-scale ocean circulation in the North Pacific. To test this hypothesis, we simulate the three-dimensional distribution of the oxygen anomaly from 1952 to 2001 using data-constrained circulation fields. The simulated oxygen anomaly shows pronounced variability in the Gulf of Alaska, showing that this region is a hotspot of oxygen fluctuation. Anomalous advection acting on the climatological mean oxygen gradient is the source of oxygen variability in this simulation. Empirical Orthogonal Function (EOF) analyses of the simulated oxygen show that the two dominant modes of the oxygen anomaly explain more than 50% of the oxygen variance over the North Pacific and are closely related to the dominant modes of climate variability in the North Pacific (the Pacific Decadal Oscillation and the North Pacific Oscillation). Our results imply an important link between large-scale climate fluctuations, ocean circulation and biogeochemical tracers in the North Pacific.
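
    A hedged sketch of the resampling idea described above: an irregularly sampled series is randomly thinned to one value per regular time step, a spectrum is computed for each of 500 ensemble members, and the ensemble-mean spectrum is inspected for a dominant period; the series here is synthetic, not the OSP record.

```python
# Sketch: ensemble of regularly resampled series from irregular observations,
# followed by a simple periodogram. The synthetic series has a 17-year cycle.
import numpy as np

rng = np.random.default_rng(11)
t = np.sort(rng.uniform(0, 50, 400))                          # irregular sample times (years)
x = np.sin(2 * np.pi * t / 17) + rng.normal(0, 0.5, t.size)   # synthetic "oxygen anomaly"

years = np.arange(0, 50)
spectra = []
for _ in range(500):
    member = np.empty(years.size)
    for k, y0 in enumerate(years):
        idx = np.flatnonzero((t >= y0) & (t < y0 + 1))        # observations within this year
        member[k] = x[rng.choice(idx)] if idx.size else np.nan
    member = np.nan_to_num(member, nan=np.nanmean(member))    # fill rare empty years
    spec = np.abs(np.fft.rfft(member - member.mean()))**2
    spectra.append(spec)

freqs = np.fft.rfftfreq(years.size, d=1.0)                    # cycles per year
mean_spec = np.mean(spectra, axis=0)
print("dominant period (yr):", 1.0 / freqs[1:][np.argmax(mean_spec[1:])])
```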

  6. Statistical structure of intrinsic climate variability under global warming

    NASA Astrophysics Data System (ADS)

    Zhu, Xiuhua; Bye, John; Fraedrich, Klaus

    2017-04-01

    Climate variability is often studied in terms of fluctuations with respect to the mean state, whereas the dependence between the mean and the variability is rarely discussed. We propose a new climate metric to measure the relationship between means and standard deviations of annual surface temperature computed over non-overlapping 100-year segments. This metric is analyzed based on equilibrium simulations of the Max Planck Institute-Earth System Model (MPI-ESM): the last-millennium climate (800-1799), the future climate projection following the A1B scenario (2100-2199), and the 3100-year unforced control simulation. A linear relationship is observed globally in the control simulation and is thus termed intrinsic climate variability; it is most pronounced in the tropical region, with negative regression slopes over the Pacific warm pool and positive slopes in the eastern tropical Pacific. It relates to asymmetric changes in temperature extremes and associates fluctuating climate means with increases or decreases in the intensity and occurrence of both El Niño and La Niña events. In the future scenario period, the linear regression slopes largely retain their spatial structure, with appreciable changes in intensity and geographical location. Since intrinsic climate variability describes the internal rhythm of the climate system, it may serve as guidance for interpreting climate variability and climate change signals in the past and the future.
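
    A minimal sketch of the segment-based metric described above, applied to a single synthetic grid-point series: the series is split into non-overlapping 100-year segments, and the standard deviation of each segment is regressed on its mean. The synthetic temperature series and the ordinary-least-squares fit are assumptions for illustration only.

```python
import numpy as np

def mean_std_slope(annual_temp, segment_len=100):
    """Split an annual series into non-overlapping segments, compute each
    segment's mean and standard deviation, and return the linear regression
    slope (and intercept) of std on mean."""
    n_seg = len(annual_temp) // segment_len
    segs = annual_temp[:n_seg * segment_len].reshape(n_seg, segment_len)
    means, stds = segs.mean(axis=1), segs.std(axis=1, ddof=1)
    slope, intercept = np.polyfit(means, stds, 1)
    return slope, intercept

# Hypothetical 3100-year control-run temperature at one grid point (deg C)
rng = np.random.default_rng(2)
temp = 15.0 + 0.01 * rng.standard_normal(3100).cumsum() + 0.5 * rng.standard_normal(3100)
print(mean_std_slope(temp))
```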

  7. Climate variability in China during the last millennium based on reconstructions and simulations

    NASA Astrophysics Data System (ADS)

    García-Bustamante, E.; Luterbacher, J.; Xoplaki, E.; Werner, J. P.; Jungclaus, J.; Zorita, E.; González-Rouco, J. F.; Fernández-Donado, L.; Hegerl, G.; Ge, Q.; Hao, Z.; Wagner, S.

    2012-04-01

    Multi-decadal to centennial climate variability in China during the last millennium is analysed. We compare the low-frequency temperature and precipitation variations from proxy-based reconstructions and palaeo-simulations from climate models. Focusing on the regional responses to the global climate evolution is highly relevant due to the complexity of the interactions between physical mechanisms at different spatio-temporal scales and the potential severity of the multiple derived socio-economic impacts. China stands out as a particularly interesting region, not only due to its complex climatic features, ranging from the semiarid northwestern Tibetan Plateau to the tropical monsoon southeastern climates, but also because of its wealth of proxy data. However, comprehensive assessments of proxy- and model-based information about palaeo-climatic variations in China are, to our knowledge, still lacking. In addition, existing studies depict a general lack of agreement between reconstructions and model simulations with respect to the amplitude and/or occurrence of warmer/colder and wetter/drier periods during the last millennium and the magnitude of the 20th century warming trend. Furthermore, these works are mainly focused on eastern China regions that have denser proxy data coverage. We investigate how last-millennium palaeo-runs compare to independent evidence from an unusually large number of proxy reconstructions over the study area by employing state-of-the-art palaeo-simulations with multi-member ensembles from the CMIP5/PMIP3 project. This provides an ideal framework for evaluating the uncertainties associated with internal and inter-model variability. Preliminary results indicate that, despite the strong regional and seasonal dependencies, temperature reconstructions in China show coherent variations among all regions at centennial scale, especially during the last 500 years. The spatial consistency of low-frequency temperature changes is an interesting aspect and is relevant for the assessment of forced climatic responses in China. The comparison between reconstructions and simulations from climate models shows that, apart from the 20th century warming trend, the variance of the reconstructed mean China temperature lies within the envelope (uncertainty range) spanned by the temperature simulations. The uncertainty arises from the internal (multi-member ensembles) and the inter-model variability. Centennial variations tend to be broadly synchronous in the reconstructions and the simulations. However, the simulations show a delay of the warm period 1000-1300 AD. This warm medieval period, in both the simulations and the reconstructions, is followed by cooling until 1800 AD. Based on the simulations, the recent warming is not unprecedented and is comparable to the medieval warming. Further steps of this study will address the individual contributions of anthropogenic and natural forcings to climate variability and change during the last millennium in China. We will make use of models that provide runs including single forcings (fingerprints) for the attribution of climate variations from decadal to multi-centennial time scales. With this aim, we will implement statistical techniques for the detection of an optimal signal-to-noise ratio between external forcings and internal variability of reconstructed temperatures and precipitation. To apply these approaches, the uncertainties associated with both reconstructions and simulations will be estimated.
The latter will shed some light on the mechanisms behind current climate evolution and will help to constrain uncertainties in the sensitivity of model simulations to increasing-CO2 scenarios of future climate change. This work will also contribute to the overall aims of the PAGES 2k initiative in Asia (http://www.pages.unibe.ch/workinggroups/2k-network).

  8. The influence of El Niño-Southern Oscillation regimes on eastern African vegetation and its future implications under the RCP8.5 warming scenario

    NASA Astrophysics Data System (ADS)

    Fer, Istem; Tietjen, Britta; Jeltsch, Florian; Wolff, Christian

    2017-09-01

    The El Niño-Southern Oscillation (ENSO) is the main driver of the interannual variability in eastern African rainfall, with a significant impact on vegetation and agriculture and dire consequences for food and social security. In this study, we identify and quantify the ENSO contribution to eastern African rainfall variability in order to forecast the future eastern African vegetation response to rainfall variability related to a predicted intensified ENSO. To differentiate the vegetation variability due to ENSO, we removed the ENSO signal from the climate data using empirical orthogonal teleconnection (EOT) analysis. Then, we simulated the ecosystem carbon and water fluxes under the historical climate without the components related to ENSO teleconnections. We found ENSO-driven patterns in the vegetation response and confirmed that EOT analysis can successfully reproduce the coupled tropical Pacific sea surface temperature-eastern African rainfall teleconnection from observed datasets. We further simulated the eastern African vegetation response under future climate change as projected by climate models, and under future climate change combined with a predicted increase in ENSO intensity. Our EOT analysis highlights that climate simulations still do not capture rainfall variability due to ENSO well, and, as we show here, the future vegetation would differ from what is simulated under climate model outputs that lack an accurate ENSO contribution. We simulated considerable differences in eastern African vegetation growth under the influence of an intensified ENSO regime, which will bring further environmental stress to a region with a reduced capacity to adapt to the effects of global climate change and to maintain food security.

  9. Can APEX Represent In-Field Spatial Variability and Simulate Its Effects On Crop Yields?

    USDA-ARS?s Scientific Manuscript database

    Precision agriculture, from variable rate nitrogen application to precision irrigation, promises improved management of resources by considering the spatial variability of topography and soil properties. Hydrologic models need to simulate the effects of this variability if they are to inform about t...

  10. Quantifying and mapping spatial variability in simulated forest plots

    Treesearch

    Gavin R. Corral; Harold E. Burkhart

    2016-01-01

    We used computer simulations to test the efficacy of multivariate statistical methods to detect, quantify, and map the spatial variability of forest stands. Simulated stands were developed as regularly spaced plantations of loblolly pine (Pinus taeda L.). We assumed no effects of competition or mortality, but random variability was added to individual tree characteristics...

  11. Enhancing Entropy and Enthalpy Fluctuations to Drive Crystallization in Atomistic Simulations.

    PubMed

    Piaggi, Pablo M; Valsson, Omar; Parrinello, Michele

    2017-07-07

    Crystallization is a process of great practical relevance in which rare but crucial fluctuations lead to the formation of a solid phase starting from the liquid. As in all first-order phase transitions, there is an interplay between enthalpy and entropy. Based on this idea, in order to drive crystallization in molecular simulations, we introduce two collective variables, one enthalpic and the other entropic. Defined in this way, these collective variables do not prejudge the structure into which the system is going to crystallize. We show the usefulness of this approach by studying the cases of sodium and aluminum, which crystallize in the bcc and fcc crystalline structures, respectively. Using these two generic collective variables, we perform variationally enhanced sampling and well-tempered metadynamics simulations and find that the systems transform spontaneously and reversibly between the liquid and the solid phases.

  12. Multivariate stochastic simulation with subjective multivariate normal distributions

    Treesearch

    P. J. Ince; J. Buongiorno

    1991-01-01

    In many applications of Monte Carlo simulation in forestry or forest products, it may be known that some variables are correlated. However, for simplicity, most simulations have assumed that random variables are independently distributed. This report describes an alternative Monte Carlo simulation technique for subjectively assessed multivariate normal...

  13. Analysis of real-time numerical integration methods applied to dynamic clamp experiments.

    PubMed

    Butera, Robert J; McCarthy, Maeve L

    2004-12-01

    Real-time systems are frequently used as an experimental tool, whereby simulated models interact in real time with neurophysiological experiments. The most demanding of these techniques is known as the dynamic clamp, where simulated ion channel conductances are artificially injected into a neuron via intracellular electrodes for measurement and stimulation. Methodologies for implementing the numerical integration of the gating variables in real time typically employ first-order numerical methods, either Euler or exponential Euler (EE). EE is often used for rapidly integrating ion channel gating variables. We find via simulation studies that for small time steps the two methods are comparable, but at larger time steps EE performs worse than Euler. We derive error bounds for both methods and find that the error can be characterized in terms of two ratios: time step over time constant, and voltage measurement error over the slope factor of the steady-state activation curve of the voltage-dependent gating variable. These ratios reliably bound the simulation error and yield results consistent with the simulation analysis. Our bounds quantitatively illustrate how measurement error restricts the accuracy that can be obtained by using smaller step sizes. Finally, we demonstrate that Euler can be computed with computational efficiency identical to that of EE.
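
    A minimal sketch, not the paper's implementation, of the two first-order schemes compared above, applied to a single voltage-dependent gating variable obeying dx/dt = (x_inf(V) - x)/tau. The activation curve, slope factor, time constant, voltage step, and the fine-step reference used to measure error are all assumed values for illustration.

```python
import numpy as np

def x_inf(v, v_half=-40.0, k=5.0):
    """Steady-state activation curve; k plays the role of the slope factor."""
    return 1.0 / (1.0 + np.exp(-(v - v_half) / k))

def integrate_gate(v_of_t, tau, dt, t_end, method):
    """Integrate dx/dt = (x_inf(V(t)) - x)/tau with forward Euler or
    exponential Euler (EE), the two first-order schemes compared above."""
    n = int(round(t_end / dt))
    x = np.empty(n)
    x[0] = x_inf(v_of_t(0.0))
    for i in range(1, n):
        target = x_inf(v_of_t((i - 1) * dt))
        if method == 'euler':
            x[i] = x[i - 1] + dt * (target - x[i - 1]) / tau
        else:  # exponential Euler: exact for piecewise-constant voltage
            x[i] = target + (x[i - 1] - target) * np.exp(-dt / tau)
    return x

v_step = lambda t: -65.0 + 30.0 * (t > 10.0)          # simple voltage step (mV)
tau, t_end, dt_ref = 1.0, 50.0, 0.001                 # all times in ms
ref = integrate_gate(v_step, tau, dt_ref, t_end, 'euler')   # fine-step reference
for dt in (0.1, 0.5):                                 # dt/tau ratios of 0.1 and 0.5
    stride = int(round(dt / dt_ref))
    for method in ('euler', 'ee'):
        x = integrate_gate(v_step, tau, dt, t_end, method)
        err = np.abs(x - ref[::stride]).max()
        print(f'dt/tau = {dt / tau:.1f}  {method:5s}  max error = {err:.4f}')
```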

  14. North Atlantic Tropical Cyclones: historical simulations and future changes with the new high-resolution Arpege AGCM.

    NASA Astrophysics Data System (ADS)

    Pilon, R.; Chauvin, F.; Palany, P.; Belmadani, A.

    2017-12-01

    A new version of the variable high-resolution Meteo-France Arpege atmospheric general circulation model (AGCM) has been developed for tropical cyclone (TC) studies, with a focus on the North Atlantic basin, where the model horizontal resolution is 15 km. Ensemble historical AMIP (Atmospheric Model Intercomparison Project)-type simulations (1965-2014) and future projections (2020-2080) under the IPCC (Intergovernmental Panel on Climate Change) representative concentration pathway (RCP) 8.5 scenario have been produced. A TC-like vortex tracking algorithm is used to investigate TC activity and variability. TC frequency, genesis, geographical distribution and intensity are examined. Historical simulations are compared to best-track and reanalysis datasets. Model TC frequency is generally realistic but tends to be too high during the first decade of the historical simulations. Biases appear to originate from both the tracking algorithm and the model climatology. Nevertheless, the model simulates intense TCs, up to category 5 hurricanes, extremely well in the North Atlantic, where the grid resolution is highest. Interaction between developing TCs and vertical wind shear is shown to be a contributing factor to TC variability. Future changes in TC activity and properties are also discussed.

  15. Inability of CMIP5 Climate Models to Simulate Recent Multi-decadal Climate Change in the Tropical Pacific.

    NASA Astrophysics Data System (ADS)

    Power, S.; Delage, F.; Kociuba, G.; Wang, G.; Smith, I.

    2017-12-01

    Observed 15-year surface temperature trends beginning in 1998 or later have attracted a great deal of interest because of an apparent slowdown in the rate of global warming and contrasts between climate model simulations and observations of such trends. Many studies have addressed the statistical significance of these relatively short trends, whether they indicate a possible bias in models, and the implications for global warming generally. Here we analyse historical and projected changes in 38 CMIP5 climate models. All of the models simulate multi-decadal warming in the Pacific over the past half-century that exceeds observed values. This stark difference cannot be fully explained by observed internal multi-decadal climate variability, even if allowance is made for an apparent tendency of models to underestimate internal multi-decadal variability in the Pacific. We also show that CMIP5 models are not able to simulate the magnitude of the strengthening of the Walker Circulation over the past thirty years. Some of the reasons for these major shortcomings in the ability of models to simulate multi-decadal variability in the Pacific, and the impact these findings have on our confidence in global 21st century projections, will be discussed.

  16. Attributing uncertainty in streamflow simulations due to variable inputs via the Quantile Flow Deviation metric

    NASA Astrophysics Data System (ADS)

    Shoaib, Syed Abu; Marshall, Lucy; Sharma, Ashish

    2018-06-01

    Every model used to characterise a real-world process is affected by uncertainty. Selecting a suitable model is a vital aspect of engineering planning and design. Observation or input errors make the prediction of modelled responses more uncertain. Using a recently developed attribution metric, this study develops a method for analysing variability in model inputs together with model structure variability, and for quantifying their relative contributions in typical hydrological modelling applications. The Quantile Flow Deviation (QFD) metric is used to assess these alternate sources of uncertainty. The Australian Water Availability Project (AWAP) precipitation data for four different Australian catchments are used to analyse the impact of spatial rainfall variability on simulated streamflow variability via the QFD. The QFD metric attributes the variability in flow ensembles to uncertainty associated with the selection of a model structure and input time series. For the case study catchments, the relative contribution of input uncertainty due to rainfall is higher than that due to potential evapotranspiration, and overall input uncertainty is significant compared to model structure and parameter uncertainty. Overall, this study investigates the propagation of input uncertainty in a daily streamflow modelling scenario and demonstrates how input errors manifest across different streamflow magnitudes.

  17. Evaluation of the CORDEX-Africa multi-RCM hindcast: systematic model errors

    NASA Astrophysics Data System (ADS)

    Kim, J.; Waliser, Duane E.; Mattmann, Chris A.; Goodale, Cameron E.; Hart, Andrew F.; Zimdars, Paul A.; Crichton, Daniel J.; Jones, Colin; Nikulin, Grigory; Hewitson, Bruce; Jack, Chris; Lennard, Christopher; Favre, Alice

    2014-03-01

    Monthly-mean precipitation, mean (TAVG), maximum (TMAX) and minimum (TMIN) surface air temperatures, and cloudiness from the CORDEX-Africa regional climate model (RCM) hindcast experiment are evaluated for model skill and systematic biases. All RCMs simulate the basic climatological features of these variables reasonably well, but systematic biases also occur across these models. All RCMs show higher fidelity in simulating precipitation for the western part of Africa than for the eastern part, and for the tropics than for the northern Sahara. Interannual variation in wet-season rainfall is better simulated for the western Sahel than for the Ethiopian Highlands. RCM skill is higher for TAVG and TMAX than for TMIN, and regionally, for the subtropics than for the tropics. RCM skill in simulating cloudiness is generally lower than for precipitation or temperature. For all variables, the multi-model ensemble (ENS) generally outperforms the individual models included in it. An overarching conclusion of this study is that some model biases vary systematically across regions, variables, and metrics, posing difficulties in defining a single representative index to measure model fidelity, especially for constructing ENS. This is an important concern in climate change impact assessment studies because most assessment models are run for specific regions/sectors with forcing data derived from model outputs. Thus, model evaluation and ENS construction must be performed separately for regions, variables, and metrics as required by specific analyses and/or assessments. Evaluations using multiple reference datasets reveal that cross-examination, quality control, and uncertainty estimates of reference data are crucial in model evaluations.

  18. Learning from Simulation Games: Effects of Sociometric Grouping.

    ERIC Educational Resources Information Center

    Brand, Charles F.

    1980-01-01

    This study examined the influence of two sociometric variables, mutual selection of playing partners and membership in a cohesive group, on learning from classroom simulation games among 141 fifth graders. Although there was evidence of cognitive learning, no effects of sociometric grouping were apparent. (CMV)

  19. Rivers and Floodplains as Key Components of Global Terrestrial Water Storage Variability

    NASA Astrophysics Data System (ADS)

    Getirana, Augusto; Kumar, Sujay; Girotto, Manuela; Rodell, Matthew

    2017-10-01

    This study quantifies the contribution of rivers and floodplains to terrestrial water storage (TWS) variability. We use state-of-the-art models to simulate land surface processes and river dynamics and to separate TWS into its main components. Based on a proposed impact index, we show that surface water storage (SWS) contributes 8% of TWS variability globally, but that contribution differs widely among climate zones. Changes in SWS are a principal component of TWS variability in the tropics, where major rivers flow over arid regions and at high latitudes. SWS accounts for 22-27% of TWS variability in both the Amazon and Nile Basins. Changes in SWS are negligible in the Western U.S., Northern Africa, Middle East, and central Asia. Based on comparisons with Gravity Recovery and Climate Experiment-based TWS, we conclude that accounting for SWS improves simulated TWS in most of South America, Africa, and Southern Asia, confirming that SWS is a key component of TWS variability.

  20. Daily Rainfall Simulation Using Climate Variables and Nonhomogeneous Hidden Markov Model

    NASA Astrophysics Data System (ADS)

    Jung, J.; Kim, H. S.; Joo, H. J.; Han, D.

    2017-12-01

    A Markov chain is an easy method to apply for rainfall simulation compared with other approaches. However, it has limitations in reflecting the seasonal variability of rainfall or changes in rainfall patterns caused by climate change. This study applied a Nonhomogeneous Hidden Markov Model (NHMM) to address these problems. The NHMM was compared with a Hidden Markov Model (HMM) to evaluate its goodness of fit. First, we chose the Gum River basin in Korea to apply the models and collected daily rainfall data from the stations. The climate variables of geopotential height, temperature, zonal wind, and meridional wind were collected from the NCEP/NCAR reanalysis data to account for external factors affecting rainfall. We conducted a correlation analysis between rainfall and the climate variables and then developed a linear regression equation using the climate variables with high correlation with rainfall. The monthly rainfall obtained from the regression equation became the input to the NHMM. Finally, daily rainfall was simulated with the NHMM, and we evaluated its goodness of fit and predictive capability by comparison with the HMM. For the HMM simulation, the correlation coefficient was 0.2076 and the root mean square errors of daily/monthly rainfall were 10.8243/131.1304 mm, respectively. For the NHMM, the correlation coefficient was 0.6652 and the root mean square errors of daily/monthly rainfall were 10.5112/100.9865 mm, respectively. The errors in daily and monthly rainfall simulated by the NHMM were thus reduced by 2.89% and 22.99%, respectively, compared with the HMM. Therefore, the results of this study are expected to provide more accurate data for hydrologic analysis. Acknowledgements: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (2017R1A2B3005695).
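
    A minimal sketch of two of the steps described above: regressing monthly rainfall on reanalysis covariates, and scoring a simulation with the correlation coefficient and root mean square error. The covariates, coefficients, and data are synthetic placeholders, and the NHMM itself is not implemented here.

```python
import numpy as np

def evaluate(simulated, observed):
    """Correlation coefficient and RMSE, the two scores used above to compare
    the NHMM and HMM rainfall simulations against observations."""
    r = np.corrcoef(simulated, observed)[0, 1]
    rmse = np.sqrt(np.mean((simulated - observed) ** 2))
    return r, rmse

def regress_monthly_rainfall(climate_vars, rainfall):
    """Ordinary least squares fit of monthly rainfall on the climate
    covariates (e.g. geopotential height, temperature, winds)."""
    X = np.column_stack([np.ones(len(rainfall)), climate_vars])
    beta, *_ = np.linalg.lstsq(X, rainfall, rcond=None)
    return beta

# Hypothetical monthly data: 4 reanalysis covariates and observed rainfall (mm)
rng = np.random.default_rng(3)
clim = rng.standard_normal((240, 4))
rain = 100 + clim @ np.array([20.0, -5.0, 8.0, 3.0]) + 15 * rng.standard_normal(240)
beta = regress_monthly_rainfall(clim, rain)
fitted = np.column_stack([np.ones(240), clim]) @ beta
print('r = %.3f, RMSE = %.1f mm' % evaluate(fitted, rain))
```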

  1. Hydrological characterization of Guadalquivir River Basin for the period 1980-2010 using VIC model

    NASA Astrophysics Data System (ADS)

    García-Valdecasas-Ojeda, Matilde; de Franciscis, Sebastiano; Raquel Gámiz-Fortis, Sonia; Castro-Díez, Yolanda; Jesús Esteban-Parra, María

    2017-04-01

    This study analyzes the changes in soil moisture and real evapotranspiration (ETR) over the last 30 years in the Guadalquivir River Basin, located in the south of the Iberian Peninsula. Soil moisture content is related to the different components of real evapotranspiration; it is a relevant factor when analyzing the intensity of droughts and heat waves, and particularly for climate change impact studies. The soil moisture and real evapotranspiration data consist of simulations obtained with the Variable Infiltration Capacity (VIC) hydrological model. This large-scale hydrologic model allows the estimation of different variables in the hydrological system of a basin. The land surface is modeled as a grid of large, uniform cells with sub-grid heterogeneity (e.g. land cover), while water influx is local, depending only on the interaction between the grid cell and the local atmospheric environment. Observational temperature and precipitation data from the Spain02 dataset have been used as input variables for the VIC model. Additionally, estimates of actual evapotranspiration and soil moisture are also analyzed using temperature, precipitation, wind, humidity and radiation as input variables for VIC; these variables are obtained from a dynamical downscaling of ERA-Interim data with the Weather Research and Forecasting (WRF) model. The simulations have a spatial resolution of about 9 km, and the analysis is done on a seasonal time scale. Preliminary results show that ETR from the WRF simulations has very low values in autumn compared with the VIC simulations. Significant positive trends are found only during autumn, for the western part of the basin, in the ETR obtained with the VIC model, while no significant trends are found in the ETR from the WRF simulations. Keywords: Soil moisture, Real evapotranspiration, Guadalquivir Basin, trends, VIC, WRF. Acknowledgements: This work has been financed by the projects P11-RNM-7941 (Junta de Andalucía-Spain) and CGL2013-48539-R (MINECO-Spain, FEDER).

  2. Impact of Subsurface Temperature Variability on Meteorological Variability: An AGCM Study

    NASA Astrophysics Data System (ADS)

    Mahanama, S. P.; Koster, R. D.; Liu, P.

    2006-05-01

    Anomalous atmospheric conditions can lead to surface temperature anomalies, which in turn can lead to temperature anomalies deep in the soil. The deep soil temperature (and the associated ground heat content) has significant memory -- the dissipation of a temperature anomaly may take weeks to months -- and thus deep soil temperature may contribute to the low frequency variability of energy and water variables elsewhere in the system. The memory may even provide some skill to subseasonal and seasonal forecasts. This study uses two long-term AGCM experiments to isolate the contribution of deep soil temperature variability to variability elsewhere in the climate system. The first experiment consists of a standard ensemble of AMIP-type simulations, simulations in which the deep soil temperature variable is allowed to interact with the rest of the system. In the second experiment, the coupling of the deep soil temperature to the rest of the climate system is disabled -- at each grid cell, the local climatological seasonal cycle of deep soil temperature (as determined from the first experiment) is prescribed. By comparing the variability of various atmospheric quantities as generated in the two experiments, we isolate the contribution of interactive deep soil temperature to that variability. The results show that interactive deep soil temperature contributes significantly to surface temperature variability. Interactive deep soil temperature, however, reduces the variability of the hydrological cycle (evaporation and precipitation), largely because it allows for a negative feedback between evaporation and temperature.

  3. West African Monsoon dynamics in idealized simulations: the competitive roles of SST warming and CO2

    NASA Astrophysics Data System (ADS)

    Gaetani, Marco; Flamant, Cyrille; Hourdin, Frederic; Bastin, Sophie; Braconnot, Pascale; Bony, Sandrine

    2015-04-01

    The West African Monsoon (WAM) is affected by large climate variability at different timescales, from interannual to multidecadal, with strong environmental and socio-economic impacts associated with climate-related rainfall variability, especially in the Sahelian belt. State-of-the-art coupled climate models still show limited ability to correctly simulate past WAM variability, and a large spread is observed in future climate projections. In this work, the July-to-September (JAS) WAM variability in the period 1979-2008 is studied in AMIP-like (SST-forced) simulations from CMIP5. The individual roles of global SST warming and increasing CO2 concentration are investigated through idealized experiments simulating a 4 K warmer SST and a 4x CO2 concentration, respectively. Results show a dry response in the Sahel to SST warming, with drier conditions over the western Sahel. On the contrary, wet conditions are observed when CO2 is increased, with the strongest response over the central-eastern Sahel. The precipitation changes are associated with modifications of the regional atmospheric circulation: dry (wet) conditions are associated with reduced (increased) convergence in the lower troposphere, a southward (northward) shift of the African Easterly Jet, and a weaker (stronger) Tropical Easterly Jet. The co-variability between global SST and WAM precipitation is also investigated, highlighting a reorganization of the main co-variability modes. Namely, in the 4xCO2 simulation the influence of the Tropical Pacific is dominant, while it is reduced in the 4K simulation, which also shows an increased coupling with the eastern Pacific and the Indian Ocean. These results suggest competing effects of SST warming and increasing CO2 on WAM climate variability, with opposite impacts on precipitation. The combination of the observed positive and negative precipitation responses, with wet conditions in the central-eastern Sahel and dry conditions in the western Sahel, is consistent with the future precipitation trends over West Africa resulting from CMIP5 coupled simulations. It is argued that the large spread in CMIP5 future projections may be related to the weight given to SST warming and to the direct CO2 effect by individual models. The capability of climate models to reproduce the SST-precipitation relationship appears to be crucial in this respect.

  4. Multiple and variable speed electrical generator systems for large wind turbines

    NASA Technical Reports Server (NTRS)

    Andersen, T. S.; Hughes, P. S.; Kirschbaum, H. S.; Mutone, G. A.

    1982-01-01

    A cost-effective method to achieve increased wind turbine generator energy conversion and other operational benefits through variable-speed operation is presented. Earlier studies of multiple- and variable-speed generators in wind turbines were extended for evaluation in the context of a specific large-sized conceptual design. System design and simulation have defined the costs and performance benefits that can be expected from both two-speed and variable-speed configurations.

  5. Downscaling Solar Power Output to 4-Seconds for Use in Integration Studies (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hummon, M.; Weekley, A.; Searight, K.

    2013-10-01

    High penetration renewable integration studies require solar power data with high spatial and temporal accuracy to quantify the impact of high frequency solar power ramps on the operation of the system. Our previous work concentrated on downscaling solar power from one hour to one minute by simulation. This method used clearness classifications to categorize temporal and spatial variability, and iterative methods to simulate intra-hour clearness variability. We determined that solar power ramp correlations between sites decrease with distance and the duration of the ramp, starting at around 0.6 for 30-minute ramps between sites that are less than 20 km apart. The sub-hour irradiance algorithm we developed has a noise floor that causes the correlations to approach ~0.005. Below one minute, the majority of the correlations of solar power ramps between sites less than 20 km apart are zero, and thus a new method to simulate intra-minute variability is needed. These intra-minute solar power ramps can be simulated using several methods, three of which we evaluate: a cubic spline fit to the one-minute solar power data; projection of the power spectral density toward the higher frequency domain; and average high frequency power spectral density from measured data. Each of these methods either under- or over-estimates the variability of intra-minute solar power ramps. We show that an optimized weighted linear sum of methods, dependent on the classification of temporal variability of the segment of one-minute solar power data, yields time series and ramp distributions similar to measured high-resolution solar irradiance data.
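
    The first of the three candidate methods above (a cubic spline fit to the one-minute data) can be sketched as follows. This is an illustrative fragment only: the one-minute power series is synthetic, and, as noted above, a spline by itself under-estimates intra-minute variability and would be only one term in the weighted combination.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def spline_downscale(power_1min, out_step_s=4.0):
    """Interpolate one-minute solar power to a finer time step with a cubic
    spline; returns the 4-second time axis and interpolated power."""
    t_min = np.arange(len(power_1min)) * 60.0                 # seconds
    t_out = np.arange(0.0, t_min[-1] + out_step_s, out_step_s)
    return t_out, CubicSpline(t_min, power_1min)(t_out)

# Hypothetical one-minute plant output (MW) for one hour
rng = np.random.default_rng(4)
p_1min = np.clip(50 + np.cumsum(rng.standard_normal(60)), 0, None)
t4, p4 = spline_downscale(p_1min)
print(len(p_1min), 'one-minute samples ->', len(p4), 'four-second samples')
```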

  6. Downscaling Solar Power Output to 4-Seconds for Use in Integration Studies: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hummon, M.; Weekley, A.; Searight, K.

    2013-10-01

    High penetration renewable integration studies require solar power data with high spatial and temporal accuracy to quantify the impact of high frequency solar power ramps on the operation of the system. Our previous work concentrated on downscaling solar power from one hour to one minute by simulation. This method used clearness classifications to categorize temporal and spatial variability, and iterative methods to simulate intra-hour clearness variability. We determined that solar power ramp correlations between sites decrease with distance and the duration of the ramp, starting at around 0.6 for 30-minute ramps between sites that are less than 20 km apart. The sub-hour irradiance algorithm we developed has a noise floor that causes the correlations to approach ~0.005. Below one minute, the majority of the correlations of solar power ramps between sites less than 20 km apart are zero, and thus a new method to simulate intra-minute variability is needed. These intra-minute solar power ramps can be simulated using several methods, three of which we evaluate: a cubic spline fit to the one-minute solar power data; projection of the power spectral density toward the higher frequency domain; and average high frequency power spectral density from measured data. Each of these methods either under- or over-estimates the variability of intra-minute solar power ramps. We show that an optimized weighted linear sum of methods, dependent on the classification of temporal variability of the segment of one-minute solar power data, yields time series and ramp distributions similar to measured high-resolution solar irradiance data.

  7. Extreme storm surge and wind wave climate scenario simulations at the Venetian littoral

    NASA Astrophysics Data System (ADS)

    Lionello, P.; Galati, M. B.; Elvini, E.

    Scenario climate projections for extreme marine storms producing storm surges and wind waves are very important for the flat northern coast of the Adriatic Sea, where the area at risk includes a unique cultural and environmental heritage and important economic activities. This study uses a shallow-water model and a spectral wave model for computing the storm surge and the wind wave field, respectively, from the sea level pressure and wind fields computed by the RegCM regional climate model. Simulations cover the period 1961-1990 for the present climate (control simulations) and the period 2071-2100 for the A2 and B2 scenarios. Generalized Extreme Value analysis is used to estimate values for the 10- and 100-year return periods. These modeling tools are shown to be adequate for a reliable estimation of the climate change signal without further downscaling. However, this study has mainly methodological value, because issues such as interdecadal variability and inter-model variability cannot be addressed, since the analysis is based on single-model, 30-year-long simulations. The control simulation looks reasonably accurate for extreme value analysis, though it overestimates/underestimates the frequency of high/low surge and wind wave events with respect to observations. Scenario simulations suggest a higher frequency of intense storms for the B2 scenario, but not for the A2. These differences are likely not the effect of climate change but of multidecadal climate variability. Extreme storms are stronger in the future scenarios, but the differences are not statistically significant. Therefore this study does not provide convincing evidence for stormier conditions in future scenarios.

  8. Regional Community Climate Simulations with variable resolution meshes in the Community Earth System Model

    NASA Astrophysics Data System (ADS)

    Zarzycki, C. M.; Gettelman, A.; Callaghan, P.

    2017-12-01

    Accurately predicting weather extremes such as precipitation (floods and droughts) and temperature (heat waves) requires high resolution to resolve mesoscale dynamics and topography at horizontal scales of 10-30 km. Simulating such resolutions globally on climate scales (years to decades) remains computationally impractical. Simulating only a small region of the planet is more tractable at these scales for climate applications. This work describes global simulations using variable-resolution static meshes with multiple dynamical cores that target the continental United States using developmental versions of the Community Earth System Model version 2 (CESM2). CESM2 is tested in idealized, aquaplanet and full-physics configurations to evaluate variable-mesh simulations against uniform high-resolution and uniform low-resolution simulations at resolutions down to 15 km. Different physical parameterization suites are also evaluated to gauge their sensitivity to resolution. Idealized variable-resolution mesh cases compare well to high-resolution tests. More recent versions of the atmospheric physics, including the cloud schemes for CESM2, are more stable with respect to changes in horizontal resolution. Most of the sensitivity is due to the timestep and to interactions between deep convection and large-scale condensation, as expected from the closure methods. The resulting full-physics model produces a climate comparable to the global low-resolution mesh and similar high-frequency statistics in the high-resolution region. Some biases are reduced (orographic precipitation in the western United States), but biases do not necessarily disappear at high resolution (e.g., summertime (JJA) surface temperature). The simulations are able to reproduce uniform high-resolution results, making them an effective tool for regional climate studies; these capabilities are available in CESM2.

  9. Additional Samples: Where They Should Be Located

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pilger, G. G., E-mail: jfelipe@ufrgs.br; Costa, J. F. C. L.; Koppe, J. C.

    2001-09-15

    Information for mine planning needs to be more closely spaced than the grid used for exploration and resource assessment. The additional samples collected during quasi-mining are usually located in the same pattern as the original diamond drillhole net, but more closely spaced. This procedure is not the best, in a mathematical sense, for selecting a location. The impact of additional information in reducing the uncertainty about the parameter being modeled is not the same everywhere within the deposit. Some locations are more sensitive than others in reducing the local and global uncertainty. This study introduces a methodology to select additional sample locations based on stochastic simulation. The procedure takes into account data variability and spatial location. Multiple equally probable models representing a geological attribute are generated via geostatistical simulation. These models share essentially the same histogram and the same variogram obtained from the original data set. At each block belonging to the model, a value is obtained from each of the n simulations, and their combination allows one to assess local variability. Variability is measured using a proposed uncertainty index, which was used to map zones of high variability. A value extracted from a given simulation is added to the original data set in a zone identified as erratic in the previous maps. The process of adding samples and simulating is repeated, and the benefit of the additional sample is evaluated. The benefit in terms of uncertainty reduction is measured locally and globally. The procedure proved to be robust and theoretically sound, mapping zones where the additional information is most beneficial. A case study in a coal mine using coal seam thickness illustrates the method.
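
    A rough sketch of the ensemble-based idea described above: an uncertainty measure is computed per block across equally probable realizations, and the block with the highest value is flagged as the most beneficial location for an additional sample. The coefficient of variation used here, and the synthetic thickness ensemble, are illustrative assumptions rather than the uncertainty index proposed in the study.

```python
import numpy as np

def uncertainty_index(realizations):
    """Per-block uncertainty across equally probable simulations, taken here
    (as an illustrative choice) as the coefficient of variation over the
    n realizations at each block."""
    mean = realizations.mean(axis=0)
    std = realizations.std(axis=0, ddof=1)
    return std / np.where(mean == 0, np.nan, np.abs(mean))

def next_sample_location(realizations):
    """Return the grid index where the uncertainty index is highest, i.e.
    where an additional sample is expected to be most beneficial."""
    idx = uncertainty_index(realizations)
    return np.unravel_index(np.nanargmax(idx), idx.shape)

# Hypothetical ensemble: 50 simulated coal-seam thickness models on a 20x20 grid
rng = np.random.default_rng(5)
base = 2.0 + rng.random((20, 20))                                   # mean thickness (m)
noise_scale = np.linspace(0.05, 0.5, 400).reshape(20, 20)           # spatially varying spread
ens = base + rng.standard_normal((50, 20, 20)) * noise_scale
print('suggested additional sample location (row, col):', next_sample_location(ens))
```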

  10. Sources of variation in oxygen consumption of aquatic animals demonstrated by simulated constant oxygen consumption and respirometers of different sizes.

    PubMed

    Svendsen, M B S; Bushnell, P G; Christensen, E A F; Steffensen, J F

    2016-01-01

    As intermittent-flow respirometry has become a common method for the determination of resting metabolism or standard metabolic rate (SMR), this study investigated how much of the variability seen in the experiments was due to measurement error. Experiments simulated different constant oxygen consumption rates (M˙O2) of a fish by continuously injecting anoxic water into a respirometer, altering the injection rate to correct for the washout error. The effect of respirometer-to-fish volume ratio (RFR) on SMR measurement and variability was also investigated, using the simulated constant M˙O2 and the M˙O2 of seven roach Rutilus rutilus in respirometers of two different sizes. The results show that a higher RFR increases measurement variability but does not change the mean SMR established using a double Gaussian fit. Further, the study demonstrates that the variation observed when determining oxygen consumption rates of fishes in systems with reasonable RFRs mainly comes from the animal, not from the measuring equipment. © 2016 The Fisheries Society of the British Isles.

  11. Neural network simulation of soil NO3 dynamic under potato crop system

    NASA Astrophysics Data System (ADS)

    Goulet-Fortin, Jérôme; Morais, Anne; Anctil, François; Parent, Léon-Étienne; Bolinder, Martin

    2013-04-01

    Nitrate leaching is a major issue in sandy soils intensively cropped to potato. Modelling could test and improve management practices, particularly with regard to optimal N application rates. Lack of input data is an important barrier to the application of classical process-based models for predicting soil NO3 content (SNOC) and NO3 leaching (NOL). Alternatively, data-driven models such as neural networks (NN) can better take into account indicators of spatial soil heterogeneity and plant growth patterns, such as the leaf area index (LAI), thereby reducing the amount of soil information required. The first objective of this study was to evaluate NN and hybrid models for simulating SNOC in the 0-40 cm soil layer, considering inter-annual variations, spatial soil heterogeneity and differential N application rates. The second objective was to evaluate the same methodology for simulating seasonal NOL dynamics at 1 m depth. To this end, multilayer perceptrons with different combinations of driving meteorological variables, functions of the LAI, and state variables of external deterministic models were trained and evaluated. The state variables from external models were drainage estimated by the CLASS model and soil temperature estimated by an ICBM subroutine. Results of the SNOC simulations were compared to field data collected between 2004 and 2011 at several experimental plots under potato cropping systems in Québec, Eastern Canada. Results of the NOL simulation were compared to data obtained in 2012 from 11 suction lysimeters installed in 2 experimental plots under potato cropping systems in the same region. The best-performing model for SNOC simulation was a 4-input hybrid model composed of 1) cumulative LAI, 2) cumulative drainage, 3) soil temperature and 4) day of year. The best-performing model for NOL simulation was a 5-input NN model composed of 1) N fertilization rate in spring, 2) LAI, 3) cumulative rainfall, 4) day of year and 5) percentage of clay content. The MAE was 22% for the SNOC simulation and 23% for the NOL simulation. High sensitivity to LAI suggests that the model may take into account field and sub-field spatial variability and support N management. Further studies are needed to fully validate the method, particularly for NOL simulation.
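
    A minimal sketch of a multilayer perceptron with the four inputs of the best SNOC model above (cumulative LAI, cumulative drainage, soil temperature, day of year), evaluated with the MAE. The network size, the scikit-learn pipeline, and the synthetic data and response are assumptions for illustration; this does not reproduce the study's model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import mean_absolute_error

# Hypothetical predictors mirroring the 4-input hybrid model described above.
rng = np.random.default_rng(6)
n = 400
X = np.column_stack([
    rng.uniform(0, 5, n),          # cumulative LAI
    rng.uniform(0, 300, n),        # cumulative drainage (mm)
    rng.uniform(2, 25, n),         # soil temperature (deg C)
    rng.integers(120, 300, n),     # day of year
])
# Synthetic soil NO3 content; the true relationship here is made up.
y = 30 + 4 * X[:, 0] - 0.05 * X[:, 1] + 0.8 * X[:, 2] + 5 * rng.standard_normal(n)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0))
model.fit(X[:300], y[:300])                 # train on the first 300 samples
pred = model.predict(X[300:])               # predict the held-out 100 samples
print('hold-out MAE: %.1f' % mean_absolute_error(y[300:], pred))
```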

  12. Effect of physical variables on capture of magnetic nanoparticles in simulated blood vessels

    NASA Astrophysics Data System (ADS)

    Zhang, Minghui; Brazel, Christopher

    2011-11-01

    This study investigated how the percent capture of magnetic nanoparticles in a simulated vessel varies with physical variables. Magnetic nanoparticles (MNPs) can be used as part of therapeutic or diagnostic materials for cancer patients. By capturing these devices with a magnetic field, the particles can be concentrated in an area of diseased tissue. In this study, flow of nanoparticles in simulated blood vessels was used to determine the effect of applying an external magnetic field. This study used maghemite nanoparticles as the MNPs and either water or Fetal Bovine Serum as the carrier fluid. Capture data were collected by UV-Vis spectroscopy. The percent capture of MNPs was positively influenced by the following physical variables: larger vessel diameters, lower linear flow velocity, higher magnetic field strength, better dispersion, lower MNP concentration, and lower protein content in the fluid. Free MNPs were also compared to micelles, with the free particles having more successful magnetic capture. Four factors contributed to these trends: the strength of the magnetic field's influence on the MNPs, the MNPs' interactions with other particles and the fluid, the momentum of the nanoparticles, and the magnetic-mass-to-total-mass ratio of the flowing particles. Funded by NSF REU Site #1062611.

  13. Computer Simulations and Literature Survey of Continuously Variable Transmissions for Use in Buses

    DOT National Transportation Integrated Search

    1981-12-01

    Numerous studies have been conducted on the concept of flywheel energy storage for buses. Flywheel systems require a continuously variable transmission (CVT) of some type to transmit power between the flywheel and the drive wheels. However, a CVT can...

  14. Toward robust estimation of the components of forest population change: simulation results

    Treesearch

    Francis A. Roesch

    2014-01-01

    This report presents the full simulation results of the work described in Roesch (2014), in which multiple levels of simulation were used to test the robustness of estimators for the components of forest change. In that study, a variety of spatial-temporal populations were created based on, but more variable than, an actual forest monitoring dataset, and then those...

  15. The Effects of an Energy-Environment Simulator Upon Selected Energy-Related Attitudes of Science Students and In-Service Teachers.

    ERIC Educational Resources Information Center

    Dunlop, David L.

    This document is the outcome of a study designed to investigate the energy-related attitudes of several different groups of science students and science teachers both before and after working with an energy-environment simulator for approximately an hour. During the interaction with the simulator, the participants decided upon the variables they…

  16. Estimation of Logistic Regression Models in Small Samples. A Simulation Study Using a Weakly Informative Default Prior Distribution

    ERIC Educational Resources Information Center

    Gordovil-Merino, Amalia; Guardia-Olmos, Joan; Pero-Cebollero, Maribel

    2012-01-01

    In this paper, we used simulations to compare the performance of classical and Bayesian estimations in logistic regression models using small samples. In the performed simulations, conditions were varied, including the type of relationship between independent and dependent variable values (i.e., unrelated and related values), the type of variable…

  17. The Influence of PV Module Materials and Design on Solder Joint Thermal Fatigue Durability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosco, Nick; Silverman, Timothy J.; Kurtz, Sarah

    Finite element model (FEM) simulations have been performed to elucidate the effect of flat plate photovoltaic (PV) module materials and design on PbSn eutectic solder joint thermal fatigue durability. The statistical method of Latin Hypercube sampling was employed to investigate the sensitivity of simulated damage to each input variable. Variables of laminate material properties and their thicknesses were investigated. Using analysis of variance, we determined that the rate of solder fatigue was most sensitive to solder layer thickness, with copper ribbon and silicon thickness being the next two most sensitive variables. By simulating both accelerated thermal cycles (ATCs) and PV cell temperature histories through two characteristic days of service, we determined that the acceleration factor between the ATC and outdoor service was independent of the variables sampled in this study. This result implies that an ATC test will represent a similar time of outdoor exposure for a wide range of module designs. This is an encouraging result for the standard ATC that must be universally applied across all modules.

  18. Radio-loud AGN Variability from Propagating Relativistic Jets

    NASA Astrophysics Data System (ADS)

    Li, Yutong; Schuh, Terance; Wiita, Paul J.

    2018-06-01

    The great majority of variable emission in radio-loud AGNs is understood to arise from the relativistic flows of plasma along two oppositely directed jets. We study this process using the Athena hydrodynamics code to simulate propagating three-dimensional relativistic jets for a wide range of input jet velocities and jet-to-ambient matter density ratios. We then focus on those simulations that remain essentially stable for extended distances (60-120 times the jet radius). Adopting results for the densities, pressures and velocities from these propagating simulations we estimate emissivities from each cell. The observed emissivity from each cell is strongly dependent upon its variable Doppler boosting factor, which depends upon the changing bulk velocities in those zones with respect to our viewing angle to the jet. We then sum the approximations to the fluxes from a large number of zones upstream of the primary reconfinement shock. The light curves so produced are similar to those of blazars, although turbulence on sub-grid scales is likely to be important for the variability on the shortest timescales.
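
    As a small illustration of the Doppler-boosting dependence described above, the sketch below evaluates the standard Doppler factor delta = 1/(Gamma(1 - beta cos theta)) and scales an intrinsic emissivity by delta**p. The viewing angle, bulk speeds, and the exponent p (which depends on jet geometry and spectral index) are assumed values; this is not the simulation pipeline of the study.

```python
import numpy as np

def doppler_factor(beta, theta_deg):
    """Relativistic Doppler factor delta = 1 / (Gamma * (1 - beta*cos(theta)))
    for bulk speed beta (in units of c) and viewing angle theta."""
    gamma = 1.0 / np.sqrt(1.0 - beta ** 2)
    return 1.0 / (gamma * (1.0 - beta * np.cos(np.radians(theta_deg))))

def boosted_flux(intrinsic_emissivity, beta, theta_deg, p=3.0):
    """Observed flux contribution of a zone, scaled by delta**p; p = 3 is an
    assumed exponent chosen only for illustration."""
    return intrinsic_emissivity * doppler_factor(beta, theta_deg) ** p

# A zone whose bulk speed fluctuates produces strongly variable observed flux
for beta in (0.90, 0.95, 0.98):
    print(f'beta = {beta:.2f}  delta = {doppler_factor(beta, 5.0):.2f}  '
          f'flux boost = {boosted_flux(1.0, beta, 5.0):.1f}')
```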

  19. Investigating systematic individual differences in sleep-deprived performance on a high-fidelity flight simulator.

    PubMed

    Van Dongen, Hans P A; Caldwell, John A; Caldwell, J Lynn

    2006-05-01

    Laboratory research has revealed considerable systematic variability in the degree to which individuals' alertness and performance are affected by sleep deprivation. However, little is known about whether or not different populations exhibit similar levels of individual variability. In the present study, we examined individual variability in performance impairment due to sleep loss in a highly select population of military jet pilots. Ten active-duty F-117 pilots were deprived of sleep for 38 h and studied repeatedly in a high-fidelity flight simulator. Data were analyzed with a mixed-model ANOVA to quantify individual variability. Statistically significant, systematic individual differences in the effects of sleep deprivation were observed, even when baseline differences were accounted for. The findings suggest that highly select populations may exhibit individual differences in vulnerability to performance impairment from sleep loss just as the general population does. Thus, the scientific and operational communities' reliance on group data as opposed to individual data may entail substantial misestimation of the impact of job-related stressors on safety and performance.
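
    A minimal sketch of the kind of mixed-model analysis described above, using statsmodels rather than the software of the study: repeated simulator scores are fit with a fixed effect of hours awake and pilot-specific random intercepts and slopes, whose variance captures systematic individual differences. The data, sample sizes, and effect sizes are synthetic assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical repeated-measures data: 10 pilots, simulator performance scored
# at 8 points across 38 h of wakefulness (higher score = more impairment).
rng = np.random.default_rng(7)
pilots = np.repeat(np.arange(10), 8)
hours = np.tile(np.arange(2.0, 38.0, 4.5), 10)
vulnerability = rng.normal(1.0, 0.4, 10)          # systematic pilot-specific slope
score = 50 + vulnerability[pilots] * hours + 5 * rng.standard_normal(80)

df = pd.DataFrame({'pilot': pilots, 'hours': hours, 'score': score})
# Random intercept and slope per pilot; a sizeable random-slope variance is the
# signature of systematic individual differences in vulnerability.
model = smf.mixedlm('score ~ hours', df, groups=df['pilot'], re_formula='~hours')
print(model.fit().summary())
```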

  20. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    NASA Astrophysics Data System (ADS)

    WANG, P. T.

    2015-12-01

    Groundwater modeling requires assigning hydrogeological properties to every cell of the numerical grid. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. A hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be constructed. Therefore, statistical sampling plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure of LHS with simulation using LU decomposition to form LULHS. Both conditional and unconditional LULHS simulations were developed. The simulation efficiency and spatial correlation of LULHS are compared to those of three other simulation methods. The results show that, for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort. Fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
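
    A sketch of an unconditional LULHS-style simulation under assumed choices: stratified uniforms from SciPy's Latin Hypercube sampler are mapped to standard normals and correlated with the Cholesky (LU) factor of a covariance matrix built from an assumed exponential covariance model. The grid, correlation length, and covariance model are placeholders, and conditional simulation is not shown.

```python
import numpy as np
from scipy.stats import norm, qmc

def lulhs_field(coords, mean=0.0, sill=1.0, corr_length=50.0, n_real=10, seed=0):
    """Unconditional LULHS-style realizations: LHS uniforms -> standard
    normals -> spatially correlated field via the Cholesky factor of an
    exponential covariance matrix. Returns an (n_real, n_grid) array."""
    dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    cov = sill * np.exp(-dist / corr_length) + 1e-10 * np.eye(len(coords))
    L = np.linalg.cholesky(cov)                       # lower-triangular factor
    sampler = qmc.LatinHypercube(d=len(coords), seed=seed)
    z = norm.ppf(sampler.random(n=n_real))            # stratified standard normals
    return mean + z @ L.T

# 1-D example grid of 100 nodes spaced 10 m apart (embedded in 2-D coordinates)
x = np.arange(100.0).reshape(-1, 1) * 10.0
fields = lulhs_field(np.hstack([x, np.zeros_like(x)]))
print(fields.shape, '(realizations x grid nodes)')
```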

  1. nZVI injection into variably saturated soils: Field and modeling study

    NASA Astrophysics Data System (ADS)

    Chowdhury, Ahmed I. A.; Krol, Magdalena M.; Kocur, Christopher M.; Boparai, Hardiljeet K.; Weber, Kela P.; Sleep, Brent E.; O'Carroll, Denis M.

    2015-12-01

    Nano-scale zero valent iron (nZVI) has been used at a number of contaminated sites over the last decade. At most of these sites, significant decreases in contaminant concentrations have resulted from the application of nZVI. However, limited work has been completed investigating field-scale nZVI mobility. In this study, a field test was combined with numerical modeling to examine nZVI reactivity and transport properties in variably saturated soils. The field test consisted of 142 L of carboxymethyl cellulose (CMC)-stabilized monometallic nZVI synthesized onsite and injected into a variably saturated zone. Periodic groundwater samples were collected from the injection well, as well as from two monitoring wells, to analyze for chlorinated solvents and other geochemical indicators. This study showed that CMC-stabilized monometallic nZVI was able to decrease trichloroethene (TCE) concentrations in groundwater by more than 99% relative to historical TCE concentrations. A three-dimensional, three-phase, finite-difference numerical simulator (CompSim) was used to further investigate nZVI and polymer transport at the variably saturated site. The model was able to accurately predict the field-observed head data without parameter fitting. In addition, the numerical simulator estimated the mass of nZVI delivered to the saturated and unsaturated zones and distinguished the nZVI phase (i.e. aqueous or attached). The simulation results showed that the injected slurry migrated radially outward from the injection well, and therefore nZVI transport was governed by the injection velocity and the viscosity of the injected solution. A suite of sensitivity analyses was performed to investigate the impact of different injection scenarios (e.g. different volumes and injection rates) on nZVI migration. Simulation results showed that injection of a higher nZVI volume delivered more iron particles at a given distance; however, the travel distance was not proportional to the increase in volume. Moreover, simulation results showed that using a 1D transport equation to simulate nZVI migration in the subsurface may overestimate the travel distance, because the 1D transport equation assumes a constant velocity, whereas pore water velocity decreases radially away from the well during injection. This study suggests that on-site synthesized nZVI particles are mobile in the subsurface and that a numerical simulator can be a valuable tool for the optimal design of nZVI field applications.

  2. Multi-model analysis of terrestrial carbon cycles in Japan: limitations and implications of model calibration using eddy flux observations

    NASA Astrophysics Data System (ADS)

    Ichii, K.; Suzuki, T.; Kato, T.; Ito, A.; Hajima, T.; Ueyama, M.; Sasai, T.; Hirata, R.; Saigusa, N.; Ohtani, Y.; Takagi, K.

    2010-07-01

    Terrestrial biosphere models show large differences when simulating carbon and water cycles, and reducing these differences is a priority for developing more accurate estimates of the condition of terrestrial ecosystems and of future climate change. To reduce uncertainties and improve the understanding of their carbon budgets, we investigated the utility of eddy flux datasets for improving model simulations and reducing the variability among multi-model outputs of terrestrial biosphere models in Japan. Using 9 terrestrial biosphere models (Support Vector Machine-based regressions, TOPS, CASA, VISIT, Biome-BGC, DAYCENT, SEIB, LPJ, and TRIFFID), we conducted two simulations: (1) point simulations at four eddy flux sites in Japan and (2) spatial simulations for Japan with a default model (based on original settings) and a modified model (based on model parameter tuning using eddy flux data). Generally, models using default settings showed large deviations of model outputs from observations, with large model-by-model variability. However, after we calibrated the model parameters using eddy flux data (GPP, RE and NEP), most models successfully simulated seasonal variations in the carbon cycle, with less variability among models. We also found that interannual variations in the carbon cycle are mostly consistent among models and observations. Spatial analysis also showed a large reduction in the variability among model outputs. This study demonstrated that careful validation and calibration of models with available eddy flux data reduces model-by-model differences. Yet site history, analysis of changes in model structure, and a more objective model calibration procedure should be included in further analyses.

  3. The role of internal variability for decadal carbon uptake anomalies in the Southern Ocean

    NASA Astrophysics Data System (ADS)

    Spring, Aaron; Hi, Hongmei; Ilyina, Tatiana

    2017-04-01

    The Southern Ocean is a major sink for anthropogenic CO2 emissions and hence plays an essential role in modulating the global carbon cycle and climate change. Previous studies based on observations (e.g., Landschützer et al. 2015) show pronounced decadal variations of carbon uptake in the Southern Ocean in recent decades, and this variability is largely driven by internal climate variability. However, due to the limited ensemble size of simulations, the variability of this important ocean sink is still poorly assessed by state-of-the-art Earth system models (ESMs). To assess the internal variability of the carbon sink in the Southern Ocean, we use a large ensemble of 100 member simulations based on the Max Planck Institute ESM (MPI-ESM). The large ensemble of simulations is generated via perturbed initial conditions in the ocean and atmosphere. Each ensemble member includes a historical simulation from 1850 to 2005 with an extension until 2100 under Representative Concentration Pathway (RCP) 4.5 future projections. Here we use model simulations from 1980-2015 for comparison with the available observation-based dataset. We found several ensemble members showing decadal decreasing trends in the carbon sink that are similar to the trend shown in observations. This result suggests that the MPI-ESM large-ensemble simulations are able to reproduce the decadal variation of the carbon sink in the Southern Ocean. Moreover, the decreasing trends of the Southern Ocean carbon sink in MPI-ESM are mainly contributed by the region between 50 and 60°S. To understand the internal variability of the air-sea carbon fluxes in the Southern Ocean, we further investigate the variability of the underlying processes, such as physical climate variability and ocean biological processes. Our results indicate two main drivers of the decadal decreasing trend of the carbon sink: (i) intensified winds enhance upwelling of old carbon-rich waters, which increases ocean surface pCO2; (ii) primary production is reduced in the area between 50 and 60°S, probably induced by reduced euphotic water-column stability; the biological drawdown of ocean surface pCO2 is therefore weakened, which favors carbon outgassing. Landschützer, et al. (2015): The reinvigoration of the Southern Ocean carbon sink, Science, 349, 1221-1224.

  4. Motion Cues in Flight Simulation and Simulator Induced Sickness

    DTIC Science & Technology

    1988-06-01

    assessed in a driving simulator by means of a response surface methodology central-composite design. The most salient finding of the study was that visual...across treatment conditions. For an orthogonal response surface methodology (RSM) design with only two independent variables, it can be readily shown that...J.E. Fowlkes 8 SESSION III - ETIOLOGICAL FACTORS IN SIMULATOR-INDUCED AFTEREFFECTS THE USE OF VESTIBULAR MODELS FOR DESIGN AND EVALUATION OF FLIGHT

  5. Single pilot scanning behavior in simulated instrument flight

    NASA Technical Reports Server (NTRS)

    Pennington, J. E.

    1979-01-01

    A simulation of tasks associated with single-pilot general aviation flight under instrument flight rules was conducted as a baseline for future research studies on advanced flight controls and avionics. The tasks, ranging from simple climbs and turns to an instrument landing system approach, were flown on a fixed-base simulator. During the simulation the control inputs, state variables, and the pilot's visual scan pattern, including point of regard, were measured and recorded.

  6. [AUTONOMIC CONTROL OF HEART RATE, BLOOD LACTATE AND ACCELERATION DURING COMBAT SIMULATION IN TAEKWONDO ELITE ATHLETES].

    PubMed

    Cerda-Kohler, Hugo; Aguayo Fuentealba, Juan Carlos; Francino Barrera, Giovanni; Guajardo-Sandoval, Adrián; Jorquera Aguilera, Carlos; Báez-San Martín, Eduardo

    2015-09-01

    The aim of the study was to measure heart rate recovery, blood lactate and movement acceleration during simulated taekwondo competition. Twelve male subjects belonging to the national team, each with at least five years of experience, participated in this research. They performed a simulated combat to evaluate the following variables: (i) blood lactate after one minute of recovery between each round, (ii) heart rate recovery (HRR) at thirty and sixty seconds in each one-minute rest between rounds, and (iii) peak acceleration (ACCp) in each round performed. The significance level was set at p < 0.05. The results showed no significant differences between winners and losers in HRR at both thirty and sixty seconds (p > 0.05), blood lactate (p > 0.05), peak acceleration (p > 0.05) or the average acceleration of combat (p = 0.18). There was no correlation between delta lactate and ACCp (r = 0.01; p = 0.93), delta lactate and HRR (r = -0.23; p = 0.18), or ACCp and HRR (r = 0.003; p = 0.98). These data suggest that the studied variables would not be decisive in the simulated combat outcomes. Other factors such as technical-tactical or psychological variables could have a significant impact on athletic performance.

  7. A Study on the Effects of Spatial Scale on Snow Process in Hyper-Resolution Hydrological Modelling over Mountainous Areas

    NASA Astrophysics Data System (ADS)

    Garousi Nejad, I.; He, S.; Tang, Q.; Ogden, F. L.; Steinke, R. C.; Frazier, N.; Tarboton, D. G.; Ohara, N.; Lin, H.

    2017-12-01

    Spatial scale is one of the main considerations in hydrological modeling of snowmelt in mountainous areas. The size of model elements controls the degree to which variability can be explicitly represented versus what needs to be parameterized using effective properties such as averages or other subgrid variability parameterizations that may degrade the quality of model simulations. For snowmelt modeling, terrain parameters such as slope, aspect, vegetation and elevation play an important role in the timing and quantity of snowmelt that serves as an input to hydrologic runoff generation processes. In general, higher resolution enhances the accuracy of the simulation, since fine meshes represent and preserve the spatial variability of atmospheric and surface characteristics better than coarse meshes. However, this increases computational cost, and there may be a scale beyond which the model response does not improve because of diminishing sensitivity to variability and the irreducible uncertainty associated with the spatial interpolation of inputs. This paper examines the influence of spatial resolution on the snowmelt process using simulations of, and data from, the Animas River watershed, an alpine mountainous area in Colorado, USA, with ADHydro, an unstructured, distributed, physically based hydrological model developed for a parallel computing environment. Five spatial resolutions (30 m, 100 m, 250 m, 500 m, and 1 km) were used to investigate the variations in hydrologic response. This study demonstrated the importance of choosing an appropriate spatial scale in the implementation of ADHydro to balance the representation of spatial variability against computational cost. According to the results, variation in the input variables and parameters caused by the use of different spatial resolutions resulted in changes in the simulated hydrological variables, especially snowmelt, both at the basin scale and distributed across the model mesh.

  8. Underestimated AMOC Variability and Implications for AMV and Predictability in CMIP Models

    NASA Astrophysics Data System (ADS)

    Yan, Xiaoqin; Zhang, Rong; Knutson, Thomas R.

    2018-05-01

    The Atlantic Meridional Overturning Circulation (AMOC) has profound impacts on various climate phenomena. Using both observations and simulations from the Coupled Model Intercomparison Project Phases 3 and 5, here we show that most models underestimate the amplitude of low-frequency AMOC variability. We further show that stronger low-frequency AMOC variability leads to stronger linkages between the AMOC and key variables associated with the Atlantic multidecadal variability (AMV), and between the subpolar AMV signal and northern hemisphere surface air temperature. Low-frequency extratropical northern hemisphere surface air temperature variability might increase with the amplitude of low-frequency AMOC variability. Atlantic decadal predictability is much higher in models with stronger low-frequency AMOC variability and much lower in models with weaker or no AMOC variability. Our results suggest that simulating realistic low-frequency AMOC variability is very important, both for simulating realistic linkages between the AMOC and AMV-related variables and for achieving substantially higher Atlantic decadal predictability.

  9. The education of attention as explanation of variability of practice effects: learning the final approach phase in a flight simulator.

    PubMed

    Huet, Michaël; Jacobs, David M; Camachon, Cyril; Missenard, Olivier; Gray, Rob; Montagne, Gilles

    2011-12-01

    The present study reports two experiments in which a total of 20 participants without prior flight experience practiced the final approach phase in a fixed-base simulator. All participants received self-controlled concurrent feedback during 180 practice trials. Experiment 1 shows that participants learn more quickly under variable practice conditions than under constant practice conditions. This finding is attributed to the education of attention to the more useful informational variables: Variability of practice reduces the usefulness of initially used informational variables, which leads to a quicker change in variable use, and hence to a larger improvement in performance. In the practice phase of Experiment 2 variability was selectively applied to some experimental factors but not to others. Participants tended to converge toward the variables that were useful in the specific conditions that they encountered during practice. This indicates that an explanation for variability of practice effects in terms of the education of attention is a useful alternative to traditional explanations based on the notion of the generalized motor program and to explanations based on the notions of noise and local minima.

  10. Internal Interdecadal Variability in CMIP5 Control Simulations

    NASA Astrophysics Data System (ADS)

    Cheung, A. H.; Mann, M. E.; Frankcombe, L. M.; England, M. H.; Steinman, B. A.; Miller, S. K.

    2015-12-01

    Here we make use of control simulations from the CMIP5 models to quantify the amplitude of the interdecadal internal variability component in Atlantic, Pacific, and Northern Hemisphere mean surface temperature. We compare against estimates derived from observations using a semi-empirical approach wherein the forced component as estimated using CMIP5 historical simulations is removed to yield an estimate of the residual, internal variability. While the observational estimates are largely consistent with those derived from the control simulations for both basins and the Northern Hemisphere, they lie in the upper range of the model distributions, suggesting the possibility of differences between the amplitudes of observed and modeled variability. We comment on some possible reasons for the disparity.

  11. Core Self-Evaluations as Causes of Satisfaction: The Mediating Role of Seeking Task Complexity

    ERIC Educational Resources Information Center

    Srivastava, Abhishek; Locke, Edwin A.; Judge, Timothy A.; Adams, John W.

    2010-01-01

    This study examined the mediating role of task complexity in the relationship between core self-evaluations (CSE) and satisfaction. In Study 1, eighty-three undergraduate business students worked on a strategic decision-making simulation. The simulated environment enabled us to verify the temporal sequence of variables, use an objective measure of…

  12. Probabilistic composite micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, T. A.; Bellini, P. X.; Murthy, P. L. N.; Chamis, C. C.

    1988-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material properties at the micro level. Regression results are presented to show the relative correlation between predicted and response variables in the study.
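    A minimal sketch of the Monte Carlo idea described above: sample hypothetical fiber and matrix properties, propagate them through a simple rule-of-mixtures relation for the longitudinal ply modulus, and regress the response on the inputs. The distributions and the micromechanics relation are illustrative assumptions, not the actual methods or graphite/epoxy data of the study.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10_000

# Hypothetical constituent property distributions (illustrative values)
E_f = rng.normal(230e9, 10e9, N)    # fiber modulus, Pa
E_m = rng.normal(3.5e9, 0.3e9, N)   # matrix modulus, Pa
V_f = rng.normal(0.60, 0.03, N)     # fiber volume fraction

# Rule of mixtures for the longitudinal ply modulus
E_11 = V_f * E_f + (1.0 - V_f) * E_m

# Linear regression of the response on the sampled variables
# to gauge their relative correlation with E_11
X = np.column_stack([np.ones(N), E_f, E_m, V_f])
coef, *_ = np.linalg.lstsq(X, E_11, rcond=None)
print(f"mean E_11 = {E_11.mean() / 1e9:.1f} GPa, std = {E_11.std() / 1e9:.1f} GPa")
print("regression coefficients (intercept, E_f, E_m, V_f):", coef)
```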

  13. A Study of the Relationship between Student Final Exam Performance and Simulation Game Participation.

    ERIC Educational Resources Information Center

    Whiteley, T. R.; Faria, A. J.

    1989-01-01

    Describes study that investigated the relationship between participation in a business simulation game and performance on a final exam in a principles of marketing course. Past research on business games is reviewed; the use of midterm exam performance level as a pretest variable is explained; and question classification is described. (44…

  14. Evaluation of Boreal Summer Monsoon Intraseasonal Variability in the GASS-YOTC Multi-Model Physical Processes Experiment

    NASA Astrophysics Data System (ADS)

    Mani, N. J.; Waliser, D. E.; Jiang, X.

    2014-12-01

    While the boreal summer monsoon intraseasonal variability (BSISV) exerts a profound influence on the south Asian monsoon, the capability of present-day dynamical models to simulate and predict the BSISV is still limited. The global model evaluation project on the vertical structure and diabatic processes of the Madden-Julian Oscillation (MJO) is a joint venture, coordinated by the Working Group on Numerical Experimentation (WGNE) MJO Task Force and the GEWEX Atmospheric System Study (GASS) program, for assessing model deficiencies in simulating the ISV and for improving our understanding of the underlying processes. In this study the simulation of the northward-propagating BSISV is investigated in 26 climate models, with special focus on the vertical diabatic heating structure and clouds. Following lines of inquiry parallel to those the MJO Task Force has pursued for the eastward-propagating MJO, we utilize previously proposed and newly developed model performance metrics and process diagnostics and apply them to the global climate model simulations of the BSISV.

  15. 2D hydrodynamic simulations of a variable length gas target for density down-ramp injection of electrons into a laser wakefield accelerator

    NASA Astrophysics Data System (ADS)

    Kononenko, O.; Lopes, N. C.; Cole, J. M.; Kamperidis, C.; Mangles, S. P. D.; Najmudin, Z.; Osterhoff, J.; Poder, K.; Rusby, D.; Symes, D. R.; Warwick, J.; Wood, J. C.; Palmer, C. A. J.

    2016-09-01

    In this work, two-dimensional (2D) hydrodynamic simulations of a variable length gas cell were performed using the open source fluid code OpenFOAM. The gas cell was designed to study controlled injection of electrons into a laser-driven wakefield at the Astra Gemini laser facility. The target consists of two compartments: an accelerator and an injector section connected via an aperture. A sharp transition between the peak and plateau density regions in the injector and accelerator compartments, respectively, was observed in simulations with various inlet pressures. The fluid simulations indicate that the length of the down-ramp connecting the sections depends on the aperture diameter, as does the density drop outside the entrance and the exit cones. Further studies showed that increasing the inlet pressure leads to turbulence and strong fluctuations in density along the axial profile during target filling and, consequently, is expected to negatively impact the accelerator stability.

  16. Continuity-based model interfacing for plant-wide simulation: a general approach.

    PubMed

    Volcke, Eveline I P; van Loosdrecht, Mark C M; Vanrolleghem, Peter A

    2006-08-01

    In plant-wide simulation studies of wastewater treatment facilities, existing models of different origins often need to be coupled. However, as these submodels are likely to contain different state variables, their coupling is not straightforward. The continuity-based interfacing method (CBIM) provides a general framework for constructing model interfaces for models of wastewater systems, taking conservation principles into account. In this contribution, the CBIM approach is applied to study the effect of sludge digestion reject water treatment with a SHARON-Anammox process on a plant-wide scale. Separate models were available for the SHARON process and for the Anammox process. The Benchmark Simulation Model No. 2 (BSM2) is used to simulate the behaviour of the complete WWTP including sludge digestion. The CBIM approach is followed to develop three different model interfaces. At the same time, the generally applicable CBIM approach was further refined, and particular issues that arise when coupling models in which pH is considered a state variable are pointed out.

  17. Observed and predicted sensitivities of extreme surface ozone to meteorological drivers in three US cities

    NASA Astrophysics Data System (ADS)

    Fix, Miranda J.; Cooley, Daniel; Hodzic, Alma; Gilleland, Eric; Russell, Brook T.; Porter, William C.; Pfister, Gabriele G.

    2018-03-01

    We conduct a case study of observed and simulated maximum daily 8-h average (MDA8) ozone (O3) in three US cities for summers during 1996-2005. The purpose of this study is to evaluate the ability of a high resolution atmospheric chemistry model to reproduce observed relationships between meteorology and high or extreme O3. We employ regional coupled chemistry-transport model simulations to make three types of comparisons between simulated and observational data, comparing (1) tails of the O3 response variable, (2) distributions of meteorological predictor variables, and (3) sensitivities of high and extreme O3 to meteorological predictors. This last comparison is made using two methods: quantile regression, for the 0.95 quantile of O3, and tail dependence optimization, which is used to investigate even higher O3 extremes. Across all three locations, we find substantial differences between simulations and observational data in both meteorology and meteorological sensitivities of high and extreme O3.
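    The quantile-regression comparison described above can be sketched with statsmodels on synthetic stand-in data; the predictor (daily maximum temperature), the data-generating relation and the variable names are assumptions for illustration, not the observational or model data of the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 1000

# Synthetic stand-in data: daily maximum temperature (degC) and MDA8 O3 (ppb)
tmax = rng.uniform(15.0, 40.0, n)
o3 = 20.0 + 1.5 * tmax + rng.gamma(shape=2.0, scale=5.0, size=n)
df = pd.DataFrame({"o3": o3, "tmax": tmax})

# Sensitivity of the 0.95 quantile of O3 to temperature
fit = smf.quantreg("o3 ~ tmax", df).fit(q=0.95)
print(fit.params)   # slope ~ change in extreme O3 per degC of warming
```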

  18. Simulating the hydrological impacts of inter-annual and seasonal variability in land use land cover change on streamflow

    NASA Astrophysics Data System (ADS)

    Taxak, A. K.; Ojha, C. S. P.

    2017-12-01

    Land use and land cover (LULC) changes within a watershed are recognised as an important factor affecting hydrological processes and water resources. LULC changes continuously, not only over the long term but also at inter-annual and seasonal levels. Changes in LULC affect interception, storage and soil moisture. A widely used approach in rainfall-runoff modelling with land surface models (LSMs) and hydrological models is to keep the LULC fixed throughout the model run period. In long-term simulations where land use change takes place during the run period, a single LULC layer does not represent a true picture of ground conditions and can result in stationarity of the model responses. The present work presents a case study in which changes in LULC are incorporated by using multiple LULC layers. LULC maps for the study period were created using imagery from the Landsat series, Sentinel and EO-1 ALI. The distributed, physically based Variable Infiltration Capacity (VIC) model was modified to allow LULC to be included as a time-varying variable, just like climate. The Narayani basin was simulated with LULC, leaf area index (LAI), albedo and climate data for 1992-2015. The results showed that the model simulation with the time-varying parametrization approach gives a large improvement over the conventional fixed parametrization approach in terms of long-term water balance. The proposed modelling approach could improve hydrological modelling for applications such as land cover change and water budget studies.
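    The key modification described above, treating LULC as a time-varying input rather than a single static map, can be sketched as follows; the file names, lookup function and model call are hypothetical placeholders, not the modified VIC source code.

```python
# Conceptual sketch (hypothetical interface, not the actual VIC source code):
# instead of reading one static LULC map, the land-surface parameters are
# looked up from the map valid for the simulated year.

lulc_maps = {            # year -> LULC raster derived from Landsat / Sentinel / EO-1 ALI
    1992: "lulc_1992.tif",
    2000: "lulc_2000.tif",
    2010: "lulc_2010.tif",
    2015: "lulc_2015.tif",
}

def lulc_for_year(year: int) -> str:
    """Return the most recent LULC map at or before the simulated year."""
    valid = [y for y in sorted(lulc_maps) if y <= year]
    return lulc_maps[valid[-1]] if valid else lulc_maps[min(lulc_maps)]

for year in range(1992, 2016):
    lulc_layer = lulc_for_year(year)       # time-varying, just like the climate forcing
    # run_vic_one_year(year, lulc_layer)   # placeholder for the actual model call
    print(year, lulc_layer)
```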

  19. Stochastic model search with binary outcomes for genome-wide association studies

    PubMed Central

    Malovini, Alberto; Puca, Annibale A; Bellazzi, Riccardo

    2012-01-01

    Objective The spread of case–control genome-wide association studies (GWASs) has stimulated the development of new variable selection methods and predictive models. We introduce a novel Bayesian model search algorithm, Binary Outcome Stochastic Search (BOSS), which addresses the model selection problem when the number of predictors far exceeds the number of binary responses. Materials and methods Our method is based on a latent variable model that links the observed outcomes to the underlying genetic variables. A Markov Chain Monte Carlo approach is used for model search and to evaluate the posterior probability of each predictor. Results BOSS is compared with three established methods (stepwise regression, logistic lasso, and elastic net) in a simulated benchmark. Two real case studies are also investigated: a GWAS on the genetic bases of longevity, and the type 2 diabetes study from the Wellcome Trust Case Control Consortium. Simulations show that BOSS achieves higher precisions than the reference methods while preserving good recall rates. In both experimental studies, BOSS successfully detects genetic polymorphisms previously reported to be associated with the analyzed phenotypes. Discussion BOSS outperforms the other methods in terms of F-measure on simulated data. In the two real studies, BOSS successfully detects biologically relevant features, some of which are missed by univariate analysis and the three reference techniques. Conclusion The proposed algorithm is an advance in the methodology for model selection with a large number of features. Our simulated and experimental results showed that BOSS proves effective in detecting relevant markers while providing a parsimonious model. PMID:22534080
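    The sketch below is not the BOSS algorithm itself; it only illustrates two of the reference methods named above (logistic lasso and elastic net) on a small simulated case-control benchmark in which the number of predictors far exceeds the number of binary responses, using scikit-learn. All sizes, effect sizes and regularization settings are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n, p, causal = 200, 2000, 5           # predictors far exceed the binary responses

# Simulated genotype-like predictors (0/1/2 minor-allele counts) and case-control outcome
X = rng.integers(0, 3, size=(n, p)).astype(float)
logits = X[:, :causal] @ (0.8 * np.ones(causal))
logits -= logits.mean()
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logits)))

lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
enet = LogisticRegression(penalty="elasticnet", solver="saga",
                          l1_ratio=0.5, C=0.1, max_iter=5000).fit(X, y)

for name, model in [("lasso", lasso), ("elastic net", enet)]:
    selected = np.flatnonzero(model.coef_[0])
    recall = np.isin(np.arange(causal), selected).mean()
    print(f"{name}: {selected.size} predictors selected, recall of causal set = {recall:.2f}")
```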

  20. Evaluation of fine soil moisture data from the IFloodS (NASA GPM) Ground Validation campaign using a fully-distributed ecohydrological model

    NASA Astrophysics Data System (ADS)

    Bastola, S.; Dialynas, Y. G.; Arnone, E.; Bras, R. L.

    2014-12-01

    The spatial variability of soil, vegetation, topography, and precipitation controls hydrological processes, resulting in high spatio-temporal variability of most hydrological variables, such as soil moisture. The limitations of existing measurement systems in characterizing this spatial variability, together with its importance for various applications, have created a need to reconcile spatially distributed soil moisture models with the corresponding measurements. A fully distributed ecohydrological model simulates soil moisture at high resolution, which is relevant for a range of environmental studies, e.g., flood forecasting. Such models can also be used to evaluate the value of space-borne soil moisture data by assimilating them into hydrological models. In this study, fine-resolution soil moisture data simulated by a physically based distributed hydrological model, tRIBS-VEGGIE, are compared with soil moisture data collected during the field campaign in the Turkey River basin, Iowa. The soil moisture series at the 2 and 4 inch depths exhibited a more rapid response to rainfall than the deeper 8 and 20 inch series. The spatial variability in two distinct land surfaces of the Turkey River basin reflects the control of vegetation, topography and soil texture on the characterization of spatial variability. The comparison of observed and simulated soil moisture at various depths showed that the model was able to capture the dynamics of soil moisture at a number of gauging stations. Discrepancies are large at some of the gauging stations, which are characterized by rugged terrain and represented in the model by large computational units.

  1. Predicting marine physical-biogeochemical variabilities in the Gulf of Mexico and southeastern U.S. shelf sea

    NASA Astrophysics Data System (ADS)

    He, R.; Zong, H.; Xue, Z. G.; Fennel, K.; Tian, H.; Cai, W. J.; Lohrenz, S. E.

    2017-12-01

    An integrated terrestrial-ocean ecosystem modeling system is developed and used to investigate marine physical-biogeochemical variabilities in the Gulf of Mexico and southeastern US shelf sea. Such variabilities stem from variations in the shelf circulation, boundary current dynamics, impacts of climate variability, as well as growing population and associated land use practices on transport of carbon and nutrients within terrestrial systems and their delivery to the coastal ocean. We will report our efforts in evaluating the performance of the coupled modeling system via extensive model and data comparisons, as well as findings from a suite of case studies and scenario simulations. Long-term model simulation results are used to quantify regional ocean circulation dynamics, nitrogen budget and carbon fluxes. Their corresponding sub-regional differences are also characterized and contrasted.

  2. How model and input uncertainty impact maize yield simulations in West Africa

    NASA Astrophysics Data System (ADS)

    Waha, Katharina; Huth, Neil; Carberry, Peter; Wang, Enli

    2015-02-01

    Crop models are common tools for simulating crop yields and crop production in studies on food security and global change. Various uncertainties exist, however, not only in the model design and model parameters, but also, and perhaps even more importantly, in the soil, climate and management input data. We analyze the performance of the point-scale crop model APSIM and the global-scale crop model LPJmL with different climate and soil conditions under different agricultural management in the low-input maize-growing areas of Burkina Faso, West Africa. We test the models' response to different levels of input information, from little to detailed information on soil, climate (1961-2000) and agricultural management, and compare the models' ability to represent the observed spatial (between locations) and temporal (between years) variability in crop yields. We found that the resolution of the soil, climate and management information influences the simulated crop yields in both models. However, the difference between the two models is larger than the difference between input datasets, and the difference between simulations with different climate and management information is larger than between simulations with different soil information. The observed spatial variability can be represented well by both models even with little information on soils and management, but APSIM simulates a higher variation between single locations than LPJmL. The agreement of simulated and observed temporal variability is lower, owing to non-climatic factors, e.g., investment in agricultural research and development in Burkina Faso between 1987 and 1991, which resulted in a doubling of maize yields. The findings of our study highlight the importance of scale and model choice and show that the most detailed input data do not necessarily improve model performance.

  3. Development of the Transport Class Model (TCM) Aircraft Simulation From a Sub-Scale Generic Transport Model (GTM) Simulation

    NASA Technical Reports Server (NTRS)

    Hueschen, Richard M.

    2011-01-01

    A six degree-of-freedom, flat-earth dynamics, non-linear, and non-proprietary aircraft simulation was developed that is representative of a generic mid-sized twin-jet transport aircraft. The simulation was developed from a non-proprietary, publicly available, subscale twin-jet transport aircraft simulation using scaling relationships and a modified aerodynamic database. The simulation has an extended aerodynamics database with aero data outside the normal transport-operating envelope (large angle-of-attack and sideslip values). The simulation has representative transport aircraft surface actuator models with variable rate-limits and generally fixed position limits. The simulation contains a generic 40,000 lb sea level thrust engine model. The engine model is a first order dynamic model with a variable time constant that changes according to simulation conditions. The simulation provides a means for interfacing a flight control system to use the simulation sensor variables and to command the surface actuators and throttle position of the engine model.
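    The engine description above can be illustrated with a minimal first-order lag whose time constant varies with flight condition; the Mach dependence, constants and commanded thrust below are illustrative assumptions, not the TCM engine database.

```python
import numpy as np

def engine_step(thrust, thrust_cmd, mach, dt):
    """One Euler step of a first-order thrust lag with a condition-dependent
    time constant (illustrative relationship, not the TCM model)."""
    tau = 1.0 + 2.0 * mach            # hypothetical: slower response at higher Mach
    return thrust + dt * (thrust_cmd - thrust) / tau

dt, t_end = 0.02, 10.0
thrust, thrust_cmd, mach = 5_000.0, 40_000.0, 0.3   # lb; commanded near sea-level static thrust
history = []
for _ in np.arange(0.0, t_end, dt):
    thrust = engine_step(thrust, thrust_cmd, mach, dt)
    history.append(thrust)
print(f"thrust after {t_end:.0f} s: {history[-1]:.0f} lb")
```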

  4. The Stability and Interfacial Motion of Multi-layer Radial Porous Media and Hele-Shaw Flows

    NASA Astrophysics Data System (ADS)

    Gin, Craig; Daripa, Prabir

    2017-11-01

    In this talk, we will discuss viscous fingering instabilities of multi-layer immiscible porous media flows within the Hele-Shaw model in a radial flow geometry. We study the motion of the interfaces for flows with both constant and variable viscosity fluids. We consider the effects of using a variable injection rate on multi-layer flows. We also present a numerical approach to simulating the interface motion within linear theory using the method of eigenfunction expansion. We compare these results with fully non-linear simulations.

  5. Evaluating the variability in surface water reservoir planning characteristics during climate change impacts assessment

    NASA Astrophysics Data System (ADS)

    Soundharajan, Bankaru-Swamy; Adeloye, Adebayo J.; Remesan, Renji

    2016-07-01

    This study employed a Monte-Carlo simulation approach to characterise the uncertainties in climate-change-induced variations in the storage requirements and performance (time- and volume-based reliability, resilience, vulnerability and sustainability) of surface water reservoirs. Using a calibrated rainfall-runoff (R-R) model, the baseline runoff scenario was first simulated. The R-R inputs (rainfall and temperature) were then perturbed using plausible delta-changes to produce simulated climate change runoff scenarios. Stochastic models of the runoff were developed and used to generate ensembles of both the current and the climate-change-perturbed future runoff scenarios. The resulting runoff ensembles were used to force simulation models of the behaviour of the reservoir, producing 'populations' of the reservoir storage capacity required to meet demands and of the corresponding performance indices. Comparing these parameters between the current and perturbed scenarios provided the population of climate change effects, which was then analysed to determine the variability in the impacts. The methodology was applied to the Pong reservoir on the Beas River in northern India. The reservoir serves irrigation and hydropower needs, and the hydrology of the catchment is highly influenced by Himalayan seasonal snow and glaciers and by Monsoon rainfall, both of which are predicted to change due to climate change. The results show that the required reservoir capacity is highly variable, with a coefficient of variation (CV) as high as 0.3 as the future climate becomes drier. Of the performance indices, vulnerability showed the highest variability (CV up to 0.5) while the volume-based reliability was the least variable. Such variabilities or uncertainties will, no doubt, complicate the development of climate change adaptation measures; however, knowledge of their sheer magnitudes as obtained in this study will help in the formulation of appropriate policy and technical interventions for sustaining and possibly enhancing water security for irrigation and other uses served by the Pong reservoir.
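    A compact sketch of the behaviour-simulation step described above: a monthly mass-balance reservoir model driven by a baseline inflow series and by a delta-perturbed (drier) version of it, returning time-based reliability and vulnerability. The inflow statistics, demand, capacity and the -15% delta factor are illustrative assumptions, not the Pong reservoir data.

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_reservoir(inflow, capacity, demand, s0=None):
    """Monthly mass-balance behaviour simulation; returns reliability and vulnerability."""
    s = capacity if s0 is None else s0
    deficits, failures = [], 0
    for q in inflow:
        s = min(s + q, capacity)          # fill, spilling any excess
        supplied = min(demand, s)
        s -= supplied
        if supplied < demand:
            failures += 1
            deficits.append((demand - supplied) / demand)
    reliability = 1.0 - failures / len(inflow)
    vulnerability = float(np.mean(deficits)) if deficits else 0.0
    return reliability, vulnerability

inflow = rng.gamma(shape=2.0, scale=50.0, size=600)     # synthetic monthly inflow (Mm3)
demand, capacity = 80.0, 1000.0

base = simulate_reservoir(inflow, capacity, demand)
drier = simulate_reservoir(0.85 * inflow, capacity, demand)   # delta-change: -15 % runoff
print("baseline  reliability/vulnerability:", base)
print("perturbed reliability/vulnerability:", drier)
```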

  6. Fatigue Tests with Random Flight Simulation Loading

    NASA Technical Reports Server (NTRS)

    Schijve, J.

    1972-01-01

    Crack propagation was studied in a full-scale wing structure under different simulated flight conditions. Omission of low-amplitude gust cycles had a small effect on the crack rate. Truncation of the infrequently occurring high-amplitude gust cycles to a lower level had a noticeably accelerating effect on crack growth. The application of fail-safe load (100 percent limit load) effectively stopped subsequent crack growth under resumed flight-simulation loading. In another flight-simulation test series on sheet specimens, the variables studied were the design stress level and the cyclic frequency of the random gust loading. In-flight mean stresses varied from 5.5 to 10.0 kg/sq mm. The effect of the stress level was larger for the 2024 alloy than for the 7075 alloy. Three frequencies were employed: namely, 10 cps, 1 cps, and 0.1 cps. The frequency effect was small. The advantages and limitations of flight-simulation tests are compared with those of alternative test procedures such as constant-amplitude tests, program tests, and random-load tests. Various testing purposes are considered. The variables of flight-simulation tests are listed and their effects are discussed. A proposal is made for performing systematic flight-simulation tests in such a way that the compiled data may be used as a source of reference.

  7. Decadal predictions of Southern Ocean sea ice : testing different initialization methods with an Earth-system Model of Intermediate Complexity

    NASA Astrophysics Data System (ADS)

    Zunz, Violette; Goosse, Hugues; Dubinkina, Svetlana

    2013-04-01

    The sea ice extent in the Southern Ocean has increased since 1979, but the causes of this expansion have not been firmly identified. In particular, the contributions of internal variability and external forcing to this positive trend have not been fully established. In this region, the lack of observations and the overestimation of the internal variability of the sea ice by contemporary General Circulation Models (GCMs) make it difficult to understand the behaviour of the sea ice. Nevertheless, if its evolution is governed by the internal variability of the system and if this internal variability is in some way predictable, a suitable initialization method should lead to simulation results that better fit reality. Current GCM decadal predictions are generally initialized through a nudging towards some observed fields. This relatively simple method does not seem to be appropriate for the initialization of sea ice in the Southern Ocean. The present study aims at identifying an initialization method that could improve the quality of predictions of Southern Ocean sea ice at decadal timescales. We use LOVECLIM, an Earth-system Model of Intermediate Complexity that allows us to perform, within a reasonable computational time, the large number of simulations required to systematically test different initialization procedures. These involve three data assimilation methods: a nudging, a particle filter and an efficient particle filter. In a first step, simulations are performed in an idealized framework, i.e. data from a reference simulation of LOVECLIM are used instead of observations, hereinafter called pseudo-observations. In this configuration, the internal variability of the model obviously agrees with that of the pseudo-observations. This allows us to get rid of the issues related to the overestimation of the internal variability by models compared to the observed one, and to work out a suitable methodology for assessing the efficiency of the initialization procedures tested. It also allows us to determine the upper limit of improvement that can be expected if more sophisticated initialization methods are used in decadal prediction simulations and if models have an internal variability agreeing with the observed one. Furthermore, since pseudo-observations are available everywhere at every time step, we also analyse the differences between simulations initialized with a complete dataset of pseudo-observations and those for which pseudo-observations are not assimilated everywhere. In a second step, simulations are performed in a realistic framework, i.e. using the actual available observations. The same data assimilation methods are tested in order to check whether more sophisticated methods can improve the reliability and accuracy of decadal prediction simulations, even when they are performed with models that overestimate the internal variability of the sea ice extent in the Southern Ocean.
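    Of the three assimilation methods named above, nudging is the simplest to illustrate. The sketch below applies Newtonian relaxation toward pseudo-observations generated by a toy red-noise model; the toy model, relaxation coefficient and noise levels are illustrative assumptions, not LOVECLIM or the actual procedures tested.

```python
import numpy as np

rng = np.random.default_rng(5)

def model_step(x):
    """Toy red-noise 'model' standing in for the ESM state update."""
    return 0.9 * x + rng.normal(0.0, 1.0)

def nudged_step(x, obs, relax=0.2):
    """Free model step plus Newtonian relaxation toward the (pseudo-)observation."""
    x_free = model_step(x)
    return x_free + relax * (obs - x_free)

# Pseudo-observations taken from a reference run of the same toy model
truth = 0.0
pseudo_obs = []
for _ in range(200):
    truth = model_step(truth)
    pseudo_obs.append(truth + rng.normal(0.0, 0.3))   # imperfect pseudo-obs

x = 5.0                      # deliberately poor initial condition
for obs in pseudo_obs:
    x = nudged_step(x, obs)
print(f"final nudged state: {x:.2f}, final pseudo-observation: {pseudo_obs[-1]:.2f}")
```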

  8. Trans-Pacific transport and evolution of aerosols: evaluation of quasi-global WRF-Chem simulation with multiple observations

    NASA Astrophysics Data System (ADS)

    Hu, Zhiyuan; Zhao, Chun; Huang, Jianping; Leung, L. Ruby; Qian, Yun; Yu, Hongbin; Huang, Lei; Kalashnikova, Olga V.

    2016-05-01

    A fully coupled meteorology-chemistry model (WRF-Chem, the Weather Research and Forecasting model coupled with chemistry) has been configured to conduct quasi-global simulation for 5 years (2010-2014) and evaluated with multiple observation data sets for the first time. The evaluation focuses on the simulation over the trans-Pacific transport region using various reanalysis and observational data sets for meteorological fields and aerosol properties. The simulation generally captures the overall spatial and seasonal variability of satellite retrieved aerosol optical depth (AOD) and absorbing AOD (AAOD) over the Pacific that is determined by the outflow of pollutants and dust and the emissions of marine aerosols. The assessment of simulated extinction Ångström exponent (EAE) indicates that the model generally reproduces the variability of aerosol size distributions as seen by satellites. In addition, the vertical profile of aerosol extinction and its seasonality over the Pacific are also well simulated. The difference between the simulation and satellite retrievals can be mainly attributed to model biases in estimating marine aerosol emissions as well as the satellite sampling and retrieval uncertainties. Compared with the surface measurements over the western USA, the model reasonably simulates the observed magnitude and seasonality of dust, sulfate, and nitrate surface concentrations, but significantly underestimates the peak surface concentrations of carbonaceous aerosol likely due to model biases in the spatial and temporal variability of biomass burning emissions and secondary organic aerosol (SOA) production. A sensitivity simulation shows that the trans-Pacific transported dust, sulfate, and nitrate can make significant contribution to surface concentrations over the rural areas of the western USA, while the peaks of carbonaceous aerosol surface concentrations are dominated by the North American emissions. Both the retrievals and simulation show small interannual variability of aerosol characteristics for 2010-2014 averaged over three Pacific sub-regions. The evaluation in this study demonstrates that the WRF-Chem quasi-global simulation can be used for investigating trans-Pacific transport of aerosols and providing reasonable inflow chemical boundaries for the western USA, allowing one to further understand the impact of transported pollutants on the regional air quality and climate with high-resolution nested regional modeling.

  9. Effects of ocean-atmosphere coupling on rainfall over the Indian Ocean and northwestern Pacific Ocean during boreal summer

    NASA Astrophysics Data System (ADS)

    Zhou, Z. Q.; Xie, S. P.; Zhou, W.

    2016-12-01

    Atmosphere general circulation models (AGCMs), forced with specified SST, have been widely used in climate studies. On the one hand, an AGCM is much faster to run than a coupled general circulation model (CGCM), and the identical SST forcing allows a clean evaluation of the atmospheric component of a CGCM. On the other hand, the coupling between atmosphere and ocean is missing in such atmosphere-only simulations, and it is not clear how this simplification affects the simulation of the atmosphere. In this study, the impact of ocean-atmosphere coupling is studied by comparing a CGCM simulation with an AGCM simulation forced with the monthly SSTs specified from the CGCM simulation. In particular, we focus on the climatology and interannual variability of rainfall over the Indian Ocean and northwestern Pacific (IONWP) during boreal summer. The IONWP is a unique region with a strong negative correlation between sea surface temperature (SST) and rainfall during boreal summer on the interannual time scale. Lead/lag correlation analysis suggests a negative feedback of rainfall on SST, which is only reasonably captured by CGCMs. We find that the lack of this negative feedback in the AGCM not only enhances the climatology and interannual variability of rainfall but also increases the internal variability of rainfall over the IONWP. A simple mechanism is proposed to explain this enhancement. In addition, the AGCM is able to capture the large-scale rainfall pattern over the IONWP during boreal summer, because rainfall here is driven by the remote ENSO effect on the interannual time scale. Our results suggest that caution is warranted when using an AGCM for climate change studies.

  10. Subgrid-scale effects in compressible variable-density decaying turbulence

    DOE PAGES

    GS, Sidharth; Candler, Graham V.

    2018-05-08

    Many turbulent flows are characterized by complex scale interactions and vorticity generation caused by compressibility and variable-density effects. In the large-eddy simulation of variable-density flows, these processes manifest themselves as subgrid-scale (SGS) terms that interact with the resolved-scale flow. This paper studies the effect of the variable-density SGS terms and quantifies their relative importance. We consider the SGS terms appearing in the density-weighted Favre-filtered equations and in the unweighted Reynolds-filtered equations. The conventional form of the Reynolds-filtered momentum equation is complicated by a temporal SGS term; therefore, we derive a new form of the Reynolds-filtered governing equations that does not contain this term and has only double-correlation SGS terms. The new form of the filtered equations has terms that represent the SGS mass flux, pressure-gradient acceleration and velocity-dilatation correlation. To evaluate the dynamical significance of the variable-density SGS effects, we carry out direct numerical simulations of compressible decaying turbulence at a turbulent Mach number of 0.3. Two different initial thermodynamic conditions are investigated: homentropic, and a thermally inhomogeneous gas with regions of differing densities. The simulated flow fields are explicitly filtered to evaluate the SGS terms. The importance of the variable-density SGS terms is quantified relative to the SGS specific stress, which is the only SGS term active in incompressible constant-density turbulence. It is found that while the variable-density SGS terms in the homentropic case are negligible, they are dynamically significant in the thermally inhomogeneous flows. Investigation of the variable-density SGS terms is therefore important, not only to develop variable-density closures but also to improve the understanding of scale interactions in variable-density flows.
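    The explicit-filtering step described above can be illustrated in one dimension: apply a top-hat filter to synthetic density and velocity fields and form the Reynolds-filtered SGS mass flux, i.e. the filtered product of density and velocity minus the product of the filtered fields. The fields, filter width and grid below are illustrative assumptions, not the DNS data of the study.

```python
import numpy as np

def box_filter(f, width):
    """Simple top-hat (box) filter via convolution, with periodic padding."""
    kernel = np.ones(width) / width
    padded = np.concatenate([f[-width:], f, f[:width]])
    return np.convolve(padded, kernel, mode="same")[width:-width]

rng = np.random.default_rng(6)
n, width = 1024, 16
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)

# Synthetic variable-density field: smooth structure plus small-scale fluctuations
rho = 1.0 + 0.5 * np.sin(x) + 0.1 * rng.standard_normal(n)
u = np.cos(3.0 * x) + 0.1 * rng.standard_normal(n)

rho_bar = box_filter(rho, width)
u_bar = box_filter(u, width)
# Reynolds-filtered SGS mass flux: filtered(rho*u) - filtered(rho)*filtered(u)
sgs_mass_flux = box_filter(rho * u, width) - rho_bar * u_bar

print(f"rms SGS mass flux: {np.sqrt(np.mean(sgs_mass_flux**2)):.3e}")
```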

  12. Interannual variability of ammonia concentrations over the United States: sources and implications

    NASA Astrophysics Data System (ADS)

    Schiferl, Luke D.; Heald, Colette L.; Van Damme, Martin; Clarisse, Lieven; Clerbaux, Cathy; Coheur, Pierre-François; Nowak, John B.; Neuman, J. Andrew; Herndon, Scott C.; Roscioli, Joseph R.; Eilerman, Scott J.

    2016-09-01

    The variability of atmospheric ammonia (NH3), emitted largely from agricultural sources, is an important factor when considering how inorganic fine particulate matter (PM2.5) concentrations and nitrogen cycling are changing over the United States. This study combines new observations of ammonia concentration from the surface, aboard aircraft, and retrieved by satellite to both evaluate the simulation of ammonia in a chemical transport model (GEOS-Chem) and identify which processes control the variability of these concentrations over a 5-year period (2008-2012). We find that the model generally underrepresents the ammonia concentration near large source regions (by 26 % at surface sites) and fails to reproduce the extent of interannual variability observed at the surface during the summer (JJA). Variability in the base simulation surface ammonia concentration is dominated by meteorology (64 %) as compared to reductions in SO2 and NOx emissions imposed by regulation (32 %) over this period. Introduction of year-to-year varying ammonia emissions based on animal population, fertilizer application, and meteorologically driven volatilization does not substantially improve the model comparison with observed ammonia concentrations, and these ammonia emissions changes have little effect on the simulated ammonia concentration variability compared to those caused by the variability of meteorology and acid-precursor emissions. There is also little effect on the PM2.5 concentration due to ammonia emissions variability in the summer when gas-phase changes are favored, but variability in wintertime emissions, as well as in early spring and late fall, will have a larger impact on PM2.5 formation. This work highlights the need for continued improvement in both satellite-based and in situ ammonia measurements to better constrain the magnitude and impacts of spatial and temporal variability in ammonia concentrations.

  13. Potential of Spark Ignition Engine, Effect of Vehicle Design Variables on Top Speed, Performance, and Fuel Economy

    DOT National Transportation Integrated Search

    1980-03-01

    The purpose of this report is to evaluate the effect of vehicle characteristics on vehicle performance and fuel economy. The studies were performed using the VEHSIM (vehicle simulation) program at the Transportation Systems Center. The computer simul...

  14. Estimation of Atlantic-Mediterranean netflow variability

    NASA Astrophysics Data System (ADS)

    Guerreiro, Catarina; Peliz, Alvaro; Miranda, Pedro

    2016-04-01

    The exchanges at the Strait of Gibraltar are extremely difficult to measure due to the strong temporal and across-strait variability; yet the Atlantic inflow into the Mediterranean is extremely important both for climate and for ecosystems. Most published numerical modeling studies do not resolve the Strait of Gibraltar realistically. Models that represent the strait at high resolution focus primarily on high-frequency dynamics, whereas long-term dynamics are studied with low-resolution models, and for that reason the strait dynamics are poorly resolved. Estimating the variability of the exchanges requires long-term, high-resolution studies, and thus an improved simulation with an explicit and realistic representation of the strait is necessary. On seasonal to inter-annual timescales the flow is essentially driven by the net evaporation contribution, and consequently realistic fields of precipitation and evaporation are necessary for the model setup. A comparison between observations, reanalyses and combined products shows that ERA-Interim Reanalysis is the most suitable product for the Mediterranean Sea: its time and space variability are in close agreement with NOC 1.1 for the common period (1980-1993) and also with evaporation from OAFLUX (1989-2014). Subinertial fluctuations, with periods from days to a few months, are the second most energetic after tides and are the response to atmospheric pressure fluctuations and local winds. Atmospheric pressure fluctuations over the Mediterranean cause sea level oscillations that induce a barotropic flow through the strait. Candela's analytical model has been used to quantify this response in later studies, though comparison with observations points to an underestimation of the flow at the strait. An improved representation of this term's contribution to the Atlantic-Mediterranean exchange must be achieved on longer time scales. We propose a new simulation for the last 36 years (1979-2014) for the Mediterranean-Atlantic domain with an explicit representation of the strait. The simulations are performed using the Regional Ocean Modeling System (ROMS) and forced with the different contributions of the freshwater budget, sea level pressure fluctuations and winds from ERA-Interim Reanalysis. The model of sea-level-pressure-induced barotropic fluctuations simulates the barotropic variability at the Strait of Gibraltar for the last decades.
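    As a back-of-the-envelope illustration of the pressure-driven barotropic response mentioned above (not Candela's analytical model itself), the sketch below assumes a purely isostatic, inverse-barometer sea-level adjustment over the basin, so that the net flow through the strait scales with the basin-averaged pressure tendency. The basin area and the synthetic pressure signal are illustrative assumptions.

```python
import numpy as np

# Back-of-the-envelope mass balance: an inverse-barometer sea-level response
# eta = -p'/(rho*g) over the basin must be supplied through the strait, so
# Q(t) ~ (A_med / (rho * g)) * dP/dt   (illustrative only, not the full analytical model)
A_med = 2.5e12        # approximate Mediterranean surface area, m^2
rho, g = 1025.0, 9.81

dt = 6 * 3600.0                        # 6-hourly pressure series, s
t = np.arange(0.0, 30 * 24 * 3600.0, dt)
p_anom = 300.0 * np.sin(2.0 * np.pi * t / (5 * 24 * 3600.0))   # +/- 3 hPa, 5-day period

dpdt = np.gradient(p_anom, dt)
Q = A_med / (rho * g) * dpdt           # barotropic flow anomaly, m^3/s
print(f"peak barotropic flow anomaly: {np.abs(Q).max() / 1e6:.2f} Sv")
```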

  15. Ocean carbon and heat variability in an Earth System Model

    NASA Astrophysics Data System (ADS)

    Thomas, J. L.; Waugh, D.; Gnanadesikan, A.

    2016-12-01

    Ocean carbon and heat content are very important for regulating global climate. Furthermore, due to the lack of observations and the dependence on parameterizations, there has been little consensus in the modeling community on the magnitude of realistic ocean carbon and heat content variability, particularly in the Southern Ocean. We assess the differences between global oceanic heat and carbon content variability in GFDL ESM2Mc using a 500-year, pre-industrial control simulation. The global carbon and heat content are directly out of phase with each other; however, in the Southern Ocean the heat and carbon content are in phase. The multi-decadal variability of global heat content is primarily explained by variability in the tropics and mid-latitudes, while the variability in global carbon content is primarily explained by Southern Ocean variability. In order to test the robustness of this relationship, we use three additional pre-industrial control simulations with different mesoscale mixing parameterizations. Three pre-industrial control simulations are conducted with the along-isopycnal diffusion coefficient (Aredi) set to constant values of 400, 800 (control) and 2400 m2 s-1. These values for Aredi are within the range of parameter settings commonly used by modeling groups. Finally, one pre-industrial control simulation is conducted in which the minimum in the Gent-McWilliams parameterization closure scheme (AGM) is increased to 600 m2 s-1. We find that the different simulations have very different multi-decadal variability, especially in the Weddell Sea, where the characteristics of deep convection are drastically changed. While the temporal frequency and amplitude of the global heat and carbon content variability change significantly, the overall spatial pattern of variability remains unchanged between the simulations.

  16. Problems in using p-curve analysis and text-mining to detect rate of p-hacking and evidential value.

    PubMed

    Bishop, Dorothy V M; Thompson, Paul A

    2016-01-01

    Background. The p-curve is a plot of the distribution of p-values reported in a set of scientific studies. Comparisons between ranges of p-values have been used to evaluate fields of research in terms of the extent to which studies have genuine evidential value, and the extent to which they suffer from bias in the selection of variables and analyses for publication, p-hacking. Methods. p-hacking can take various forms. Here we used R code to simulate the use of ghost variables, where an experimenter gathers data on several dependent variables but reports only those with statistically significant effects. We also examined a text-mined dataset used by Head et al. (2015) and assessed its suitability for investigating p-hacking. Results. We show that when there is ghost p-hacking, the shape of the p-curve depends on whether dependent variables are intercorrelated. For uncorrelated variables, simulated p-hacked data do not give the "p-hacking bump" just below .05 that is regarded as evidence of p-hacking, though there is a negative skew when simulated variables are inter-correlated. The way p-curves vary according to features of underlying data poses problems when automated text mining is used to detect p-values in heterogeneous sets of published papers. Conclusions. The absence of a bump in the p-curve is not indicative of lack of p-hacking. Furthermore, while studies with evidential value will usually generate a right-skewed p-curve, we cannot treat a right-skewed p-curve as an indicator of the extent of evidential value, unless we have a model specific to the type of p-values entered into the analysis. We conclude that it is not feasible to use the p-curve to estimate the extent of p-hacking and evidential value unless there is considerable control over the type of data entered into the analysis. In particular, p-hacking with ghost variables is likely to be missed.
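    The authors' simulations were written in R; the sketch below reproduces the ghost-variable idea in Python (it is not their code): under a null effect, several uncorrelated dependent variables are tested and only the smallest p-value is reported when it falls below .05, and the resulting p-curve shows no bump just below .05. Sample sizes and the number of ghost variables are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_studies, n_ghost, n_per_group = 20000, 5, 30

reported = []
for _ in range(n_studies):
    # Null effect, several uncorrelated dependent variables ("ghost" DVs)
    pvals = [stats.ttest_ind(rng.standard_normal(n_per_group),
                             rng.standard_normal(n_per_group)).pvalue
             for _ in range(n_ghost)]
    p_min = min(pvals)
    if p_min < 0.05:          # only the "significant" DV gets reported
        reported.append(p_min)

# p-curve: distribution of reported significant p-values in .01-wide bins
bins = np.arange(0.0, 0.051, 0.01)
counts, _ = np.histogram(reported, bins=bins)
for lo, c in zip(bins[:-1], counts):
    print(f"{lo:.2f}-{lo + 0.01:.2f}: {c}")
# For uncorrelated ghost DVs the counts decline slightly toward .05 -- there is
# no tell-tale bump just below .05 -- consistent with the point made above.
```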

  17. Case-Crossover Analysis of Air Pollution Health Effects: A Systematic Review of Methodology and Application

    PubMed Central

    Carracedo-Martínez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo

    2010-01-01

    Background Case-crossover is one of the most used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design’s application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818

  18. Interannual rainfall variability over China in the MetUM GA6 and GC2 configurations

    NASA Astrophysics Data System (ADS)

    Stephan, Claudia Christine; Klingaman, Nicholas P.; Vidale, Pier Luigi; Turner, Andrew G.; Demory, Marie-Estelle; Guo, Liang

    2018-05-01

    Six climate simulations of the Met Office Unified Model Global Atmosphere 6.0 and Global Coupled 2.0 configurations are evaluated against observations and reanalysis data for their ability to simulate the mean state and year-to-year variability of precipitation over China. To analyse the sensitivity to air-sea coupling and horizontal resolution, atmosphere-only and coupled integrations at atmospheric horizontal resolutions of N96, N216 and N512 (corresponding to approximately 200, 90 and 40 km in the zonal direction at the equator, respectively) are analysed. The mean and interannual variance of seasonal precipitation are too high in all simulations over China but improve with finer resolution and coupling. Empirical orthogonal teleconnection (EOT) analysis is applied to simulated and observed precipitation to identify spatial patterns of temporally coherent interannual variability in seasonal precipitation. To connect these patterns to large-scale atmospheric and coupled air-sea processes, atmospheric and oceanic fields are regressed onto the corresponding seasonal mean time series. All simulations reproduce the observed leading pattern of interannual rainfall variability in winter, spring and autumn; the leading pattern in summer is present in all but one simulation. However, only in two simulations are the four leading patterns associated with the observed physical mechanisms. Coupled simulations capture more observed patterns of variability and associate more of them with the correct physical mechanism, compared to atmosphere-only simulations at the same resolution. However, finer resolution does not improve the fidelity of these patterns or their associated mechanisms. This shows that evaluating climate models only by the geographical distribution of mean precipitation and its interannual variance is insufficient. The EOT analysis adds knowledge about coherent variability and associated mechanisms.
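    For readers unfamiliar with EOT analysis, the following sketch outlines the basic iteration (choose the grid point whose time series explains the most variance of all other points, regress it out, repeat); it is a minimal illustration on synthetic anomalies, not the configuration used in the study.

    ```python
    import numpy as np

    def eot(X, n_modes=3):
        """Empirical orthogonal teleconnection (EOT) analysis, in outline:
        pick the grid point whose time series explains the most variance of all
        other points, regress that series out, and repeat on the residuals.
        X: array (n_time, n_points) of seasonal precipitation anomalies."""
        X = X - X.mean(axis=0)
        series, patterns = [], []
        for _ in range(n_modes):
            var = X.var(axis=0)
            sd = np.sqrt(var)
            sd[sd == 0] = np.inf                   # exhausted base points drop out
            Z = X / sd
            corr2 = (Z.T @ Z / len(X)) ** 2        # squared correlations between points
            explained = (corr2 * var).sum(axis=1)  # variance of all points explained by each candidate
            j = int(np.argmax(explained))          # base point of this EOT mode
            t = X[:, j]                            # mode time series
            beta = X.T @ t / (t @ t)               # regression pattern onto the base series
            series.append(t)
            patterns.append(beta)
            X = X - np.outer(t, beta)              # remove the mode and iterate
        return np.array(series), np.array(patterns)

    # tiny demo on synthetic anomalies: 30 seasons x 500 grid points
    rng = np.random.default_rng(0)
    X = rng.standard_normal((30, 1)) @ rng.standard_normal((1, 500)) + 0.5 * rng.standard_normal((30, 500))
    ts, pats = eot(X, n_modes=2)
    print(ts.shape, pats.shape)
    ```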

  19. Spread in the magnitude of climate model interdecadal global temperature variability traced to disagreements over high-latitude oceans

    NASA Astrophysics Data System (ADS)

    Brown, Patrick T.; Li, Wenhong; Jiang, Jonathan H.; Su, Hui

    2016-12-01

    Unforced variability in global mean surface air temperature can obscure or exaggerate global warming on interdecadal time scales; thus, understanding both the magnitude and generating mechanisms of such variability is of critical importance for both attribution studies as well as decadal climate prediction. Coupled atmosphere-ocean general circulation models (climate models) simulate a wide range of magnitudes of unforced interdecadal variability in global mean surface air temperature (UITglobal), hampering efforts to quantify the influence of UITglobal on contemporary global temperature trends. Recently, a preliminary consensus has emerged that unforced interdecadal variability in local surface temperatures (UITlocal) over the tropical Pacific Ocean is particularly influential on UITglobal. Therefore, a reasonable hypothesis might be that the large spread in the magnitude of UITglobal across climate models can be explained by the spread in the magnitude of simulated tropical Pacific UITlocal. Here we show that this hypothesis is mostly false. Instead, the spread in the magnitude of UITglobal is linked much more strongly to the spread in the magnitude of UITlocal over high-latitude regions characterized by significant variability in oceanic convection, sea ice concentration, and energy flux at both the surface and the top of the atmosphere. Thus, efforts to constrain the climate model produced range of UITglobal magnitude would be best served by focusing on the simulation of air-sea interaction at high latitudes.

  20. Changes in temporal variability of precipitation over land due to anthropogenic forcings

    DOE PAGES

    Konapala, Goutam; Mishra, Ashok; Leung, L. Ruby

    2017-02-02

    This study investigated the anthropogenic influence on the temporal variability of annual precipitation for the period 1950-2005 as simulated by the CMIP5 models. The temporal variability of both annual precipitation amount (PRCPTOT) and intensity (SDII) was first measured using a metric of statistical dispersion called the Gini coefficient. Comparing simulations driven by both anthropogenic and natural forcings (ALL) with simulations of natural forcings only (NAT), we quantified the anthropogenic contributions to the changes in temporal variability at global, continental and sub-continental scales as a relative difference of the respective Gini coefficients of ALL and NAT. Over the period of 1950-2005, our results indicate that anthropogenic forcings have resulted in decreased uniformity (i.e., increase in unevenness or disparity) in annual precipitation amount and intensity at global as well as continental scales. In addition, out of the 21 sub-continental regions considered, 14 (PRCPTOT) and 17 (SDII) regions showed significant anthropogenic influences. The human impacts are generally larger for SDII compared to PRCPTOT, indicating that the temporal variability of precipitation intensity is generally more susceptible to anthropogenic influence than precipitation amount. Lastly, the results highlight that anthropogenic activities have changed not only the trends but also the temporal variability of annual precipitation, which underscores the need to develop effective adaptation management practices to address the increased disparity.
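    As a minimal sketch of the dispersion metric used here, the Gini coefficient of a series of annual totals can be computed directly from the Lorenz-curve formula; the two gamma-distributed series below are hypothetical stand-ins for the ALL and NAT ensembles, not the CMIP5 output.

    ```python
    import numpy as np

    def gini(x):
        """Gini coefficient of a non-negative sample (0 = perfectly even,
        larger values = greater unevenness/disparity)."""
        x = np.sort(np.asarray(x, dtype=float))
        n = x.size
        cum = np.cumsum(x)
        # standard Lorenz-curve formula
        return (n + 1 - 2 * (cum / cum[-1]).sum()) / n

    # illustrative use: relative difference between ALL and NAT forcing runs
    rng = np.random.default_rng(1)
    prcptot_all = rng.gamma(shape=4.0, scale=200.0, size=56)   # 1950-2005, hypothetical
    prcptot_nat = rng.gamma(shape=6.0, scale=133.0, size=56)
    g_all, g_nat = gini(prcptot_all), gini(prcptot_nat)
    print("relative difference in Gini:", round((g_all - g_nat) / g_nat, 3))
    ```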

  1. Simultaneous escaping of explicit and hidden free energy barriers: application of the orthogonal space random walk strategy in generalized ensemble based conformational sampling.

    PubMed

    Zheng, Lianqing; Chen, Mengen; Yang, Wei

    2009-06-21

    To overcome the pseudoergodicity problem, conformational sampling can be accelerated via generalized ensemble methods, e.g., through the realization of random walks along prechosen collective variables, such as spatial order parameters, energy scaling parameters, or even system temperatures or pressures, etc. As usually observed, in generalized ensemble simulations, hidden barriers are likely to exist in the space perpendicular to the collective variable direction and these residual free energy barriers can greatly degrade the sampling efficiency. This sampling issue is particularly severe when the collective variable is defined in a low-dimension subset of the target system; then the "Hamiltonian lagging" problem, in which necessary structural relaxation falls behind the motion of the collective variable, is likely to occur. To overcome this problem in equilibrium conformational sampling, we adopted the orthogonal space random walk (OSRW) strategy, which was originally developed in the context of free energy simulation [L. Zheng, M. Chen, and W. Yang, Proc. Natl. Acad. Sci. U.S.A. 105, 20227 (2008)]. Thereby, generalized ensemble simulations can simultaneously escape both the explicit barriers along the collective variable direction and the hidden barriers that are strongly coupled with the collective variable move. As demonstrated in our model studies, the present OSRW based generalized ensemble treatments show improved sampling capability over the corresponding classical generalized ensemble treatments.

  2. Physical fitness predicts technical-tactical and time-motion profile in simulated Judo and Brazilian Jiu-Jitsu matches.

    PubMed

    Coswig, Victor S; Gentil, Paulo; Bueno, João C A; Follmer, Bruno; Marques, Vitor A; Del Vecchio, Fabrício B

    2018-01-01

    Among combat sports, Judo and Brazilian Jiu-Jitsu (BJJ) present elevated physical fitness demands from the high-intensity intermittent efforts. However, information regarding how metabolic and neuromuscular physical fitness is associated with technical-tactical performance in Judo and BJJ fights is not available. This study aimed to relate indicators of physical fitness with combat performance variables in Judo and BJJ. The sample consisted of Judo (n = 16) and BJJ (n = 24) male athletes. At the first meeting, the physical tests were applied and, in the second, simulated fights were performed for later notational analysis. The main findings indicate: (i) high reproducibility of the proposed instrument and protocol used for notational analysis in a mobile device; (ii) differences in the technical-tactical and time-motion patterns between modalities; (iii) performance-related variables are different in Judo and BJJ; and (iv) regression models based on metabolic fitness variables may account for up to 53% of the variances in technical-tactical and/or time-motion variables in Judo and up to 31% in BJJ, whereas neuromuscular fitness models can reach values up to 44 and 73% of prediction in Judo and BJJ, respectively. When all components are combined, they can explain up to 90% of high intensity actions in Judo. In conclusion, performance prediction models in simulated combat indicate that anaerobic, aerobic and neuromuscular fitness variables contribute to explain time-motion variables associated with high intensity and technical-tactical variables in Judo and BJJ fights.

  3. A Coupled Regional Climate Simulator for the Gulf of St. Lawrence, Canada

    NASA Astrophysics Data System (ADS)

    Faucher, M.; Saucier, F.; Caya, D.

    2003-12-01

    The climate of Eastern Canada is characterized by atmosphere-ocean-ice interactions due to the proximity of the North Atlantic Ocean and the Labrador Sea. Also, there are three relatively large inner basins: the Gulf of St-Lawrence, the Hudson Bay / Hudson Strait / Foxe Basin system and the Great Lakes, influencing the evolution of weather systems and therefore the regional climate. These basins are characterized by irregular coastlines and variable sea ice in winter, so that the interactions between the atmosphere and the ocean are more complex. There are coupled general circulation models (GCMs) that are available to study the climate of Eastern Canada, but their resolution (near 350 km) is too low to resolve the details of the regional climate of this area and to provide valuable information for climate impact studies. The goal of this work is to develop a coupled regional climate simulator for Eastern Canada to study the climate and its variability, necessary to assess the future climate under a doubled-CO2 scenario. An off-line coupling strategy through the interacting fields is used to link the Canadian Regional Climate Model developed at the "Universite du Quebec a Montreal" (CRCM, Caya and Laprise 1999) to the Gulf of St. Lawrence ocean model developed at the "Institut Maurice-Lamontagne" (GOM, Saucier et al. 2002). This strategy involves running both simulators separately and alternately, using variables from the other simulator to supply the needed forcing fields every day. We present the results of a first series of seasonal simulations performed with this system to show the ability of our climate simulator to reproduce the known characteristics of the regional circulation such as mesoscale oceanic features, fronts and sea ice. The simulations were done for the period from December 1st, 1989 to March 31st, 1990. The results are compared with those of previous uncoupled runs (Faucher et al. 2003) and with observations.

  4. Modeling the spatial and temporal variability in climate and primary productivity across the Luquillo Mountains, Puerto Rico.

    Treesearch

    Hongqing Wang; Charles A.S. Hall; Frederick N. Scatena; Ned Fetcher; Wei Wu

    2003-01-01

    There are few studies that have examined the spatial variability of forest productivity over an entire tropical forested landscape. In this study, we used a spatially-explicit forest productivity model, TOPOPROD, which is based on the FORESTBGC model, to simulate spatial patterns of gross primary productivity (GPP), net primary productivity (NPP), and respiration over...

  5. Large-scale expensive black-box function optimization

    NASA Astrophysics Data System (ADS)

    Rashid, Kashif; Bailey, William; Couët, Benoît

    2012-09-01

    This paper presents the application of an adaptive radial basis function method to a computationally expensive black-box reservoir simulation model of many variables. An iterative proxy-based scheme is used to tune the control variables, distributed for finer control over a varying number of intervals covering the total simulation period, to maximize asset NPV. The method shows that large-scale simulation-based function optimization of several hundred variables is practical and effective.
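    A minimal sketch of the proxy-based idea follows, using SciPy's RBFInterpolator as the adaptive radial basis function surrogate; the cheap analytic objective stands in for the expensive reservoir simulator, and the candidate-sampling search is an illustrative simplification of the tuning scheme described above.

    ```python
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    rng = np.random.default_rng(2)

    def expensive_npv(x):
        """Cheap stand-in for one run of the expensive reservoir simulator (hypothetical)."""
        return -np.sum((x - 0.3) ** 2) + 0.1 * np.sum(np.cos(8 * np.pi * x))

    dim, n_init, n_iter, n_cand = 20, 40, 30, 5000
    X = rng.uniform(0, 1, size=(n_init, dim))            # initial control-variable settings
    y = np.array([expensive_npv(x) for x in X])

    for _ in range(n_iter):
        proxy = RBFInterpolator(X, y)                    # refit the RBF proxy to all runs so far
        cand = rng.uniform(0, 1, size=(n_cand, dim))     # cheap search on the proxy only
        best = cand[np.argmax(proxy(cand))]
        X = np.vstack([X, best])                         # one true (expensive) evaluation per iteration
        y = np.append(y, expensive_npv(best))

    print("best simulated NPV found:", round(float(y.max()), 3))
    ```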

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitrani, J

    Bayesian networks (BN) are an excellent tool for modeling uncertainties in systems with several interdependent variables. A BN is a directed acyclic graph, and consists of a structure, or the set of directional links between variables that depend on other variables, and conditional probabilities (CP) for each variable. In this project, we apply BN's to understand uncertainties in NIF ignition experiments. One can represent various physical properties of National Ignition Facility (NIF) capsule implosions as variables in a BN. A dataset containing simulations of NIF capsule implosions was provided. The dataset was generated from a radiation hydrodynamics code, and it contained 120 simulations of 16 variables. Relevant knowledge about the physics of NIF capsule implosions and greedy search algorithms were used to search for hypothetical structures for a BN. Our preliminary results found 6 links between variables in the dataset. However, we thought there should have been more links between the dataset variables based on the physics of NIF capsule implosions. Important reasons for the paucity of links are the relatively small size of the dataset, and the sampling of the values for dataset variables. Another factor that might have caused the paucity of links is the fact that in the dataset, 20% of the simulations represented successful fusion, and 80% didn't (simulations of unsuccessful fusion are useful for measuring certain diagnostics), which skewed the distributions of several variables, and possibly reduced the number of links. Nevertheless, by illustrating the interdependencies and conditional probabilities of several parameters and diagnostics, an accurate and complete BN built from an appropriate simulation set would provide uncertainty quantification for NIF capsule implosions.
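    The greedy structure search mentioned above can be sketched as score-based hill climbing over edge additions. The version below assumes continuous variables scored with a linear-Gaussian BIC and a small synthetic dataset; the actual project may have used different scores, discrete conditional probabilities, and physics-informed constraints.

    ```python
    import numpy as np
    from itertools import permutations

    rng = np.random.default_rng(14)

    def node_bic(X, child, parents):
        """BIC (higher is better) of a linear-Gaussian model for one node given its parents."""
        n = X.shape[0]
        y = X[:, child]
        A = np.column_stack([np.ones(n)] + [X[:, p] for p in parents])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        sigma2 = max(resid @ resid / n, 1e-12)
        loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
        return loglik - 0.5 * (A.shape[1] + 1) * np.log(n)

    def creates_cycle(parents, child, parent):
        """Would adding parent -> child create a directed cycle?"""
        stack, seen = [parent], set()
        while stack:
            node = stack.pop()
            if node == child:
                return True
            if node not in seen:
                seen.add(node)
                stack.extend(parents[node])
        return False

    def greedy_structure(X, max_parents=3):
        """Greedy hill-climbing over single-edge additions, scored by total BIC."""
        d = X.shape[1]
        parents = {i: [] for i in range(d)}
        scores = {i: node_bic(X, i, []) for i in range(d)}
        improved = True
        while improved:
            improved, best = False, (0.0, None)
            for pa, ch in permutations(range(d), 2):
                if pa in parents[ch] or len(parents[ch]) >= max_parents:
                    continue
                if creates_cycle(parents, ch, pa):
                    continue
                gain = node_bic(X, ch, parents[ch] + [pa]) - scores[ch]
                if gain > best[0]:
                    best = (gain, (pa, ch))
            if best[1] is not None:
                pa, ch = best[1]
                parents[ch].append(pa)
                scores[ch] += best[0]
                improved = True
        return parents

    # small synthetic stand-in for the 120-run, 16-variable implosion dataset
    X = rng.standard_normal((120, 5))
    X[:, 1] += 0.8 * X[:, 0]
    X[:, 2] += 0.6 * X[:, 0] - 0.5 * X[:, 1]
    X[:, 4] += 0.9 * X[:, 3]
    print(greedy_structure(X))
    ```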

  7. Using Equation-Free Computation to Accelerate Network-Free Stochastic Simulation of Chemical Kinetics.

    PubMed

    Lin, Yen Ting; Chylek, Lily A; Lemons, Nathan W; Hlavacek, William S

    2018-06-21

    The chemical kinetics of many complex systems can be concisely represented by reaction rules, which can be used to generate reaction events via a kinetic Monte Carlo method that has been termed network-free simulation. Here, we demonstrate accelerated network-free simulation through a novel approach to equation-free computation. In this process, variables are introduced that approximately capture system state. Derivatives of these variables are estimated using short bursts of exact stochastic simulation and finite differencing. The variables are then projected forward in time via a numerical integration scheme, after which a new exact stochastic simulation is initialized and the whole process repeats. The projection step increases efficiency by bypassing the firing of numerous individual reaction events. As we show, the projected variables may be defined as populations of building blocks of chemical species. The maximal number of connected molecules included in these building blocks determines the degree of approximation. Equation-free acceleration of network-free simulation is found to be both accurate and efficient.
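    The equation-free procedure (short exact bursts, finite-difference estimation of the coarse derivative, then a projective step) can be illustrated on a toy birth-death process; the reaction network, burst length, and projection step below are illustrative choices, not those of the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    k_birth, k_death = 10.0, 0.1          # toy birth-death process: 0 -> X, X -> 0

    def ssa_burst(n0, t_burst):
        """Short burst of exact (Gillespie) stochastic simulation."""
        n, t = int(n0), 0.0
        while True:
            rates = np.array([k_birth, k_death * n])
            total = rates.sum()
            t += rng.exponential(1.0 / total)
            if t > t_burst:
                return n
            n += 1 if rng.random() < rates[0] / total else -1

    def coarse_derivative(mean_n, n_rep=200, t_burst=0.05):
        """Lift the coarse variable to an ensemble of microscopic states, run short
        exact bursts, and estimate the coarse time derivative by finite differencing."""
        n0 = rng.poisson(mean_n, size=n_rep)
        n1 = np.array([ssa_burst(n, t_burst) for n in n0])
        return (n1.mean() - n0.mean()) / t_burst

    # coarse projective integration: large Euler steps on the projected variable
    mean_n, dt_proj = 5.0, 0.5
    for _ in range(80):
        mean_n += dt_proj * coarse_derivative(mean_n)
    print(f"projected coarse mean: {mean_n:.1f} (analytic steady state: {k_birth / k_death:.0f})")
    ```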

  8. Variability of the Martian thermospheric temperatures during the last 7 Martian Years

    NASA Astrophysics Data System (ADS)

    Gonzalez-Galindo, Francisco; Lopez-Valverde, Miguel Angel; Millour, Ehouarn; Forget, François

    2014-05-01

    The temperatures and densities in the Martian upper atmosphere have a significant influence over the different processes producing atmospheric escape. A good knowledge of the thermosphere and its variability is thus necessary in order to better understand and quantify the atmospheric loss to space and the evolution of the planet. Different global models have been used to study the seasonal and interannual variability of the Martian thermosphere, usually considering three solar scenarios (solar minimum, solar medium and solar maximum conditions) to take into account the solar cycle variability. However, the variability of the solar activity within the simulated period of time is not usually considered in these models. We have improved the description of the UV solar flux included in the General Circulation Model for Mars developed at the Laboratoire de Météorologie Dynamique (LMD-MGCM) in order to include its observed day-to-day variability. We have used the model to simulate the thermospheric variability during Martian Years 24 to 30, using realistic UV solar fluxes and dust opacities. The model predicts an interannual variability of the temperatures in the upper thermosphere that ranges from about 50 K during the aphelion to up to 150 K during perihelion. The seasonal variability of temperatures due to the eccentricity of the Martian orbit is modified by the variability of the solar flux within a given Martian year. The solar rotation cycle produces temperature oscillations of up to 30 K. We have also studied the response of the modeled thermosphere to the global dust storms in Martian Year 25 and Martian Year 28. The atmospheric dynamics are significantly modified by the global dust storms, which induce significant changes in the thermospheric temperatures. The response of the model to the presence of both global dust storms is in good agreement with previous modeling results (Medvedev et al., Journal of Geophysical Research, 2013). As expected, the simulated ionosphere is also sensitive to the variability of the solar activity. Acknowledgement: Francisco González-Galindo is funded by a CSIC JAE-Doc contract financed by the European Social Fund.

  9. Bio-inspired online variable recruitment control of fluidic artificial muscles

    NASA Astrophysics Data System (ADS)

    Jenkins, Tyler E.; Chapman, Edward M.; Bryant, Matthew

    2016-12-01

    This paper details the creation of a hybrid variable recruitment control scheme for fluidic artificial muscle (FAM) actuators with an emphasis on maximizing system efficiency and switching control performance. Variable recruitment is the process of altering a system’s active number of actuators, allowing operation in distinct force regimes. Previously, FAM variable recruitment was only quantified with offline, manual valve switching; this study addresses the creation and characterization of novel, on-line FAM switching control algorithms. The bio-inspired algorithms are implemented in conjunction with a PID and model-based controller, and applied to a simulated plant model. Variable recruitment transition effects and chatter rejection are explored via a sensitivity analysis, allowing a system designer to weigh tradeoffs in actuator modeling, algorithm choice, and necessary hardware. Variable recruitment is further developed through simulation of a robotic arm tracking a variety of spline position inputs, requiring several levels of actuator recruitment. Switching controller performance is quantified and compared with baseline systems lacking variable recruitment. The work extends current variable recruitment knowledge by creating novel online variable recruitment control schemes, and exploring how online actuator recruitment affects system efficiency and control performance. Key topics associated with implementing a variable recruitment scheme, including the effects of modeling inaccuracies, hardware considerations, and switching transition concerns are also addressed.

  10. How potentially predictable are midlatitude ocean currents?

    PubMed Central

    Nonaka, Masami; Sasai, Yoshikazu; Sasaki, Hideharu; Taguchi, Bunmei; Nakamura, Hisashi

    2016-01-01

    Predictability of atmospheric variability is known to be limited owing to significant uncertainty that arises from intrinsic variability generated independently of external forcing and/or boundary conditions. Observed atmospheric variability is therefore regarded as just a single realization among different dynamical states that could occur. In contrast, subject to wind, thermal and fresh-water forcing at the surface, the ocean circulation has been considered to be rather deterministic under the prescribed atmospheric forcing, and it still remains unknown how uncertain the upper-ocean circulation variability is. This study evaluates how much uncertainty the oceanic interannual variability can potentially have, through multiple simulations with an eddy-resolving ocean general circulation model driven by the observed interannually-varying atmospheric forcing under slightly different conditions. These ensemble “hindcast” experiments have revealed substantial uncertainty due to intrinsic variability in the extratropical ocean circulation that limits potential predictability of its interannual variability, especially along the strong western boundary currents (WBCs) in mid-latitudes, including the Kuroshio and its eastward extension. The intrinsic variability also greatly limits potential predictability of meso-scale oceanic eddy activity. These findings suggest that multi-member ensemble simulations are essential for understanding and predicting variability in the WBCs, which are important for weather and climate variability and marine ecosystems. PMID:26831954

  11. Investigation of North American Vegetation Variability under Recent Climate: A Study Using the SSiB4/TRIFFID Biophysical/Dynamic Vegetation Model

    NASA Technical Reports Server (NTRS)

    Zhang, Zhengqiu; Xue, Yongkang; MacDonald, Glen; Cox, Peter M.; Collatz, George J.

    2015-01-01

    Recent studies have shown that current dynamic vegetation models have serious weaknesses in reproducing the observed vegetation dynamics and contribute to bias in climate simulations. This study intends to identify the major factors that underlie the connections between vegetation dynamics and climate variability and investigates vegetation spatial distribution and temporal variability at seasonal to decadal scales over North America (NA) to assess a 2-D biophysical model/dynamic vegetation model's (Simplified Simple Biosphere Model version 4, coupled with the Top-down Representation of Interactive Foliage and Flora Including Dynamics Model (SSiB4/TRIFFID)) ability to simulate these characteristics for the past 60 years (1948 through 2008). Satellite data are employed as constraints for the study and to compare the relationships between vegetation and climate from the observational and the simulation data sets. Trends in NA vegetation over this period are examined. The optimum temperature for photosynthesis, leaf drop threshold temperatures, and competition coefficients in the Lotka-Volterra equation, which describes the population dynamics of species competing for some common resource, have been identified as having major impacts on vegetation spatial distribution and obtaining proper initial vegetation conditions in SSiB4/TRIFFID. The finding that vegetation competition coefficients significantly affect vegetation distribution suggests the importance of including biotic effects in dynamical vegetation modeling. The improved SSiB4/TRIFFID can reproduce the main features of the NA distributions of dominant vegetation types, the vegetation fraction, and leaf area index (LAI), including its seasonal, interannual, and decadal variabilities. The simulated NA LAI also shows a general increasing trend after the 1970s in response to warming. Both simulation and satellite observations reveal that LAI increased substantially in the southeastern U.S. starting from the 1980s. The effects of the severe drought during 1987-1992 and the last decade in the southwestern U.S. on vegetation are also evident from decreases in the simulated and satellite-derived LAIs. Both simulated and satellite-derived LAIs have the strongest correlations with air temperature at northern middle to high latitudes in spring, reflecting the effect of these climatic variables on photosynthesis and phenological processes. Meanwhile, in southwestern dry lands, negative correlations appear due to the heat and moisture stress there during the summer. Furthermore, there are also positive correlations between soil wetness and LAI, which increases from spring to summer. The present study shows both the current improvements and remaining weaknesses in dynamical vegetation models. It also highlights large continental-scale variations that have occurred in NA vegetation over the past six decades and their potential relations to climate. With more observational data availability, more studies with different models and focusing on different regions will be possible and are necessary to achieve comprehensive understanding of the vegetation dynamics and climate interactions.
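    For reference, the competitive Lotka-Volterra equations referred to above take the form dN_i/dt = r_i N_i (1 - sum_j a_ij N_j / K_i); the sketch below integrates a small three-species example with illustrative (not SSiB4/TRIFFID) parameter values.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # competitive Lotka-Volterra: dN_i/dt = r_i * N_i * (1 - sum_j a_ij * N_j / K_i)
    r = np.array([0.8, 0.6, 0.5])            # intrinsic growth rates (illustrative)
    K = np.array([1.0, 0.9, 0.8])            # carrying capacities
    a = np.array([[1.0, 0.6, 0.4],           # competition coefficients a_ij:
                  [0.5, 1.0, 0.7],           # effect of species j on species i
                  [0.4, 0.6, 1.0]])

    def rhs(t, N):
        return r * N * (1.0 - (a @ N) / K)

    sol = solve_ivp(rhs, (0.0, 200.0), y0=[0.1, 0.1, 0.1])
    print("simulated abundances after 200 time units:", sol.y[:, -1].round(3))
    ```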

  12. Ensemble simulations of the role of the stratosphere in the attribution of northern extratropical tropospheric ozone variability

    NASA Astrophysics Data System (ADS)

    Hess, P.; Kinnison, D.; Tang, Q.

    2015-03-01

    Despite the need to understand the impact of changes in emissions and climate on tropospheric ozone, the attribution of tropospheric interannual ozone variability to specific processes has proven difficult. Here, we analyze the stratospheric contribution to tropospheric ozone variability and trends from 1953 to 2005 in the Northern Hemisphere (NH) mid-latitudes using four ensemble simulations of the free running (FR) Whole Atmosphere Community Climate Model (WACCM). The simulations are externally forced with observed time-varying (1) sea-surface temperatures (SSTs), (2) greenhouse gases (GHGs), (3) ozone depleting substances (ODS), (4) quasi-biennial oscillation (QBO), (5) solar variability (SV) and (6) stratospheric sulfate surface area density (SAD). A detailed representation of stratospheric chemistry is simulated, including the ozone loss due to volcanic eruptions and polar stratospheric clouds. In the troposphere, ozone production is represented by CH4-NOx smog chemistry, where surface chemical emissions remain interannually constant. Despite the simplicity of its tropospheric chemistry, at many NH measurement locations, the interannual ozone variability in the FR WACCM simulations is significantly correlated with the measured interannual variability. This suggests the importance of the external forcing applied in these simulations in driving interannual ozone variability. The variability and trend in the simulated 1953-2005 tropospheric ozone from 30 to 90° N at background surface measurement sites, 500 hPa measurement sites and in the area average are largely explained on interannual timescales by changes in the 30-90° N area averaged flux of ozone across the 100 hPa surface and changes in tropospheric methane concentrations. The average sensitivity of tropospheric ozone to methane (percent change in ozone to a percent change in methane) from 30 to 90° N is 0.17 at 500 hPa and 0.21 at the surface; the average sensitivity of tropospheric ozone to the 100 hPa ozone flux (percent change in ozone to a percent change in the ozone flux) from 30 to 90° N is 0.19 at 500 hPa and 0.11 at the surface. The 30-90° N simulated downward residual velocity at 100 hPa increased by 15% between 1953 and 2005. However, the impact of this on the 30-90° N 100 hPa ozone flux is modulated by the long-term changes in stratospheric ozone. The ozone flux decreases from 1965 to 1990 due to stratospheric ozone depletion, but increases again by approximately 7% from 1990 to 2005. The first empirical orthogonal function of interannual ozone variability explains from 40% (at the surface) to over 80% (at 150 hPa) of the simulated ozone interannual variability from 30 to 90° N. This identified mode of ozone variability shows strong stratosphere-troposphere coupling, demonstrating the importance of the stratosphere in an attribution of tropospheric ozone variability. The simulations, with no change in emissions, capture almost 50% of the measured ozone change during the 1990s at a variety of locations. This suggests that a large portion of the measured change is not due to changes in emissions, but can be traced to changes in large-scale modes of ozone variability. This emphasizes the difficulty in the attribution of ozone changes, and the importance of natural variability in understanding the trends and variability of ozone. We find little relation between the El Niño-Southern Oscillation (ENSO) index and large-scale tropospheric ozone variability over the long-term record.
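    The sensitivities quoted above (percent change in ozone per percent change in a driver) can be estimated, in the simplest case, as the least-squares slope between fractional anomalies; the series in the sketch below are hypothetical and serve only to illustrate the calculation.

    ```python
    import numpy as np

    def sensitivity(ozone, driver):
        """Percent change in ozone per percent change in a driver, estimated as the
        least-squares slope of fractional anomalies (approximately d ln O3 / d ln driver)."""
        x = driver / driver.mean() - 1.0
        y = ozone / ozone.mean() - 1.0
        return np.polyfit(x, y, 1)[0]

    # hypothetical interannual series (53 years), for illustration only
    rng = np.random.default_rng(4)
    ch4 = 1.6 + 0.006 * np.arange(53) + 0.005 * rng.standard_normal(53)   # ppm
    flux = 1.0 + 0.05 * rng.standard_normal(53)                           # normalised 100 hPa O3 flux
    o3 = 50.0 * ch4**0.2 * flux**0.15 * np.exp(0.005 * rng.standard_normal(53))

    print("sensitivity to methane:", round(sensitivity(o3, ch4), 2))
    print("sensitivity to 100 hPa ozone flux:", round(sensitivity(o3, flux), 2))
    ```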

  13. Western Wind and Solar Integration Study Phase 2 (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lew, D.; Brinkman, G.; Ibanez, E.

    This presentation accompanies Phase 2 of the Western Wind and Solar Integration Study, a follow-on to Phase 1, which examined the operational impacts of high penetrations of variable renewable generation on the electric power system in the West and was one of the largest variable generation studies to date. High penetrations of variable generation can induce cycling of fossil-fueled generators. Cycling leads to wear-and-tear costs and changes in emissions. Phase 2 calculated these costs and emissions, and simulated grid operations for a year to investigate the detailed impact of variable generation on the fossil-fueled fleet. The presentation highlights the scope of the study and results.

  14. Sample Size Limits for Estimating Upper Level Mediation Models Using Multilevel SEM

    ERIC Educational Resources Information Center

    Li, Xin; Beretvas, S. Natasha

    2013-01-01

    This simulation study investigated use of the multilevel structural equation model (MLSEM) for handling measurement error in both mediator and outcome variables ("M" and "Y") in an upper level multilevel mediation model. Mediation and outcome variable indicators were generated with measurement error. Parameter and standard…
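    A data-generating step of the kind described (upper-level mediation with error-prone indicators for M and Y) might look like the following sketch; the path coefficients, number of indicators, and reliability are illustrative assumptions, and the within-cluster level is omitted for brevity.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def simulate_upper_level(n_clusters=100, a=0.5, b=0.4, reliability=0.7):
        """Generate cluster-level mediation data (X -> M -> Y) in which the latent
        mediator M and outcome Y are each measured by three error-prone indicators."""
        X = rng.standard_normal(n_clusters)               # level-2 predictor
        M = a * X + rng.standard_normal(n_clusters)       # latent mediator
        Y = b * M + rng.standard_normal(n_clusters)       # latent outcome
        err_sd = np.sqrt((1 - reliability) / reliability) # error scale, assuming roughly unit latent variance
        M_ind = M[:, None] + err_sd * rng.standard_normal((n_clusters, 3))
        Y_ind = Y[:, None] + err_sd * rng.standard_normal((n_clusters, 3))
        return X, M_ind, Y_ind

    X, M_ind, Y_ind = simulate_upper_level()
    # naive mediated effect using indicator means (attenuated by measurement error)
    a_hat = np.polyfit(X, M_ind.mean(axis=1), 1)[0]
    b_hat = np.polyfit(M_ind.mean(axis=1), Y_ind.mean(axis=1), 1)[0]
    print("true a*b =", 0.5 * 0.4, " naive estimate =", round(a_hat * b_hat, 3))
    ```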

  15. Assessment of sampling stability in ecological applications of discriminant analysis

    USGS Publications Warehouse

    Williams, B.K.; Titus, K.

    1988-01-01

    A simulation study was undertaken to assess the sampling stability of the variable loadings in linear discriminant function analysis. A factorial design was used for the factors of multivariate dimensionality, dispersion structure, configuration of group means, and sample size. A total of 32,400 discriminant analyses were conducted, based on data from simulated populations with appropriate underlying statistical distributions. A review of 60 published studies and 142 individual analyses indicated that sample sizes in ecological studies often have met that requirement. However, individual group sample sizes frequently were very unequal, and checks of assumptions usually were not reported. The authors recommend that ecologists obtain group sample sizes that are at least three times as large as the number of variables measured.
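    The kind of sampling-stability check described above can be sketched as follows: repeatedly simulate two multivariate normal groups, refit the discriminant function, and examine how the variability of the (normalised) loadings shrinks as the group sample size grows relative to the number of variables; the dimensionality and effect configuration below are illustrative.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(6)

    def loading_sd(n_per_group, p=5, n_rep=500):
        """Repeatedly draw two multivariate-normal groups, fit a linear discriminant
        function, and summarise the sampling variability of its coefficients."""
        mean_shift = np.r_[1.0, np.zeros(p - 1)]     # groups differ on variable 1 only
        coefs = []
        for _ in range(n_rep):
            x0 = rng.standard_normal((n_per_group, p))
            x1 = rng.standard_normal((n_per_group, p)) + mean_shift
            X = np.vstack([x0, x1])
            y = np.r_[np.zeros(n_per_group), np.ones(n_per_group)]
            lda = LinearDiscriminantAnalysis().fit(X, y)
            c = lda.coef_.ravel()
            coefs.append(c / np.linalg.norm(c))      # normalise for comparability
        return np.std(coefs, axis=0)

    for n in (10, 15, 50):                           # 2x, 3x, 10x the number of variables
        print(f"n per group = {n}: loading SDs = {loading_sd(n).round(2)}")
    ```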

  16. Improving sea level simulation in Mediterranean regional climate models

    NASA Astrophysics Data System (ADS)

    Adloff, Fanny; Jordà, Gabriel; Somot, Samuel; Sevault, Florence; Arsouze, Thomas; Meyssignac, Benoit; Li, Laurent; Planton, Serge

    2017-08-01

    For now, the question about future sea level change in the Mediterranean remains a challenge. Previous climate modelling attempts to estimate future sea level change in the Mediterranean did not meet a consensus. The low resolution of CMIP-type models prevents an accurate representation of important small scales processes acting over the Mediterranean region. For this reason among others, the use of high resolution regional ocean modelling has been recommended in literature to address the question of ongoing and future Mediterranean sea level change in response to climate change or greenhouse gases emissions. Also, it has been shown that east Atlantic sea level variability is the dominant driver of the Mediterranean variability at interannual and interdecadal scales. However, up to now, long-term regional simulations of the Mediterranean Sea do not integrate the full sea level information from the Atlantic, which is a substantial shortcoming when analysing Mediterranean sea level response. In the present study we analyse different approaches followed by state-of-the-art regional climate models to simulate Mediterranean sea level variability. Additionally we present a new simulation which incorporates improved information of Atlantic sea level forcing at the lateral boundary. We evaluate the skills of the different simulations in the frame of long-term hindcast simulations spanning from 1980 to 2012 analysing sea level variability from seasonal to multidecadal scales. Results from the new simulation show a substantial improvement in the modelled Mediterranean sea level signal. This confirms that Mediterranean mean sea level is strongly influenced by the Atlantic conditions, and thus suggests that the quality of the information in the lateral boundary conditions (LBCs) is crucial for the good modelling of Mediterranean sea level. We also found that the regional differences inside the basin, that are induced by circulation changes, are model-dependent and thus not affected by the LBCs. Finally, we argue that a correct configuration of LBCs in the Atlantic should be used for future Mediterranean simulations, which cover hindcast period, but also for scenarios.

  17. A general theoretical framework for interpreting patient-reported outcomes estimated from ordinally scaled item responses.

    PubMed

    Massof, Robert W

    2014-10-01

    A simple theoretical framework explains patient responses to items in rating scale questionnaires. Fixed latent variables position each patient and each item on the same linear scale. Item responses are governed by a set of fixed category thresholds, one for each ordinal response category. A patient's item responses are magnitude estimates of the difference between the patient variable and the patient's estimate of the item variable, relative to his/her personally defined response category thresholds. Differences between patients in their personal estimates of the item variable and in their personal choices of category thresholds are represented by random variables added to the corresponding fixed variables. Effects of intervention correspond to changes in the patient variable, the patient's response bias, and/or latent item variables for a subset of items. Intervention effects on patients' item responses were simulated by assuming the random variables are normally distributed with a constant scalar covariance matrix. Rasch analysis was used to estimate latent variables from the simulated responses. The simulations demonstrate that changes in the patient variable and changes in response bias produce indistinguishable effects on item responses and manifest as changes only in the estimated patient variable. Changes in a subset of item variables manifest as intervention-specific differential item functioning and as changes in the estimated person variable that equals the average of changes in the item variables. Simulations demonstrate that intervention-specific differential item functioning produces inefficiencies and inaccuracies in computer adaptive testing.
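    The response mechanism described in this framework is easy to simulate: draw fixed person and item measures, perturb each patient's perceived item measure and personal category thresholds with random variables, and count how many thresholds the person-item difference exceeds. The sketch below uses illustrative values for the fixed measures, thresholds, and perturbation scales.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def simulate_responses(person, items, thresholds, sd_item=0.3, sd_thresh=0.2):
        """Ordinal item responses: each response is the number of (personally
        perturbed) category thresholds exceeded by the difference between the
        person measure and the perceived item measure."""
        n_p, n_i, n_t = len(person), len(items), len(thresholds)
        perceived = items + sd_item * rng.standard_normal((n_p, n_i))      # random item component
        own_thresh = thresholds + sd_thresh * rng.standard_normal((n_p, 1, n_t))
        diff = person[:, None] - perceived                                 # person minus item
        return (diff[:, :, None] > own_thresh).sum(axis=2)                 # response category 0..n_t

    person = rng.normal(0.0, 1.0, size=200)          # person measures (logits)
    items = np.linspace(-1.5, 1.5, 10)               # fixed item measures
    thresholds = np.array([-1.0, -0.3, 0.3, 1.0])    # fixed category thresholds (5 categories)
    R = simulate_responses(person, items, thresholds)
    print(R.shape, R.min(), R.max())
    ```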

  18. A probabilistic approach to composite micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, T. A.; Bellini, P. X.; Murthy, P. L. N.; Chamis, C. C.

    1988-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material properties at the micro level. Regression results are presented to show the relative correlation between predicted and response variables in the study.
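    A Monte Carlo micromechanics calculation of this kind can be sketched with a simple rule-of-mixtures relation for the ply longitudinal modulus; the constituent means and coefficients of variation below are assumed for illustration and are not the values used in the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    n = 100_000

    # Illustrative constituent properties for a graphite/epoxy ply (assumed means/CVs):
    # longitudinal fiber modulus, matrix modulus, and fiber volume fraction,
    # each treated as a random variable.
    Ef = rng.normal(230.0, 230.0 * 0.05, n)      # GPa
    Em = rng.normal(3.5, 3.5 * 0.08, n)          # GPa
    Vf = rng.normal(0.60, 0.60 * 0.03, n)

    # simple micromechanics relation (rule of mixtures) for the ply longitudinal modulus
    E1 = Vf * Ef + (1.0 - Vf) * Em

    print(f"E1 mean = {E1.mean():.1f} GPa, CV = {100 * E1.std() / E1.mean():.1f}%")
    print("2.5th/97.5th percentiles:", np.percentile(E1, [2.5, 97.5]).round(1))
    ```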

  19. Results from the VALUE perfect predictor experiment: process-based evaluation

    NASA Astrophysics Data System (ADS)

    Maraun, Douglas; Soares, Pedro; Hertig, Elke; Brands, Swen; Huth, Radan; Cardoso, Rita; Kotlarski, Sven; Casado, Maria; Pongracz, Rita; Bartholy, Judit

    2016-04-01

    Until recently, the evaluation of downscaled climate model simulations has typically been limited to surface climatologies, including long term means, spatial variability and extremes. But these aspects are often, at least partly, tuned in regional climate models to match observed climate. The tuning issue is of course particularly relevant for bias corrected regional climate models. In general, a good performance of a model for these aspects in present climate does therefore not imply a good performance in simulating climate change. It is now widely accepted that, to increase our confidence in climate change simulations, it is necessary to evaluate how climate models simulate relevant underlying processes. In other words, it is important to assess whether downscaling does the right thing for the right reason. Therefore, VALUE has carried out a broad process-based evaluation study based on its perfect predictor experiment simulations: the downscaling methods are driven by ERA-Interim data over the period 1979-2008, reference observations are given by a network of 85 meteorological stations covering all European climates. More than 30 methods participated in the evaluation. In order to compare statistical and dynamical methods, only variables provided by both types of approaches could be considered. This limited the analysis to conditioning local surface variables on variables from driving processes that are simulated by ERA-Interim. We considered the following types of processes: at the continental scale, we evaluated the performance of downscaling methods for positive and negative North Atlantic Oscillation, Atlantic ridge and blocking situations. At synoptic scales, we considered Lamb weather types for selected European regions such as Scandinavia, the United Kingdom, the Iberian Peninsula or the Alps. At regional scales we considered phenomena such as the Mistral, the Bora or the Iberian coastal jet. Such process-based evaluation helps to attribute biases in surface variables to underlying processes and ultimately to improve climate models.

  20. Does children's energy intake at one meal influence their intake at subsequent meals? Or do we just think it does?

    PubMed

    Hanley, James A; Hutcheon, Jennifer A

    2010-05-01

    It is widely believed that young children are able to adjust their energy intake across successive meals to compensate for higher or lower intakes at a given meal. This conclusion is based on past observations that although children's intake at individual meals is highly variable, total daily intakes are relatively constant. We investigated how much of this reduction in variability could be explained by the statistical phenomenon of the variability of individual components (each meal) always being relatively larger than the variability of their sum (total daily intake), independent of any physiological compensatory mechanism. We calculated, theoretically and by simulation, how variable a child's daily intake would be if there was no correlation between intakes at individual meals. We simulated groups of children with meal/snack intakes and variability in meal/snack intakes based on previously published values. Most importantly, we assumed that there was no correlation between intakes on successive meals. In both approaches, the coefficient of variation of the daily intakes was roughly 15%, considerably less than the 34% for individual meals. Thus, most of the reduction in variability found in past studies was explained without positing strong 'compensation'. Although children's daily energy intakes are indeed considerably less variable than their individual components, this phenomenon was observed even when intakes at each meal were simulated to be totally independent. We conclude that the commonly held belief that young children have a strong physiological compensatory mechanism to adjust intake at one meal based on intake at prior meals is likely to be based on flawed statistical reasoning.
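    The statistical point is easy to reproduce: if intakes at successive meals are simulated as completely independent, the coefficient of variation of the daily total is still roughly the meal-level CV divided by the square root of the number of eating occasions. The sketch below uses illustrative meal sizes and the 34% meal-level CV cited above.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    n_days, n_meals = 100_000, 5
    mean_meal, cv_meal = 300.0, 0.34          # kcal per eating occasion, CV from the literature

    # intakes at successive meals simulated as completely independent (no compensation)
    meals = rng.gamma(shape=1 / cv_meal**2, scale=mean_meal * cv_meal**2,
                      size=(n_days, n_meals))
    daily = meals.sum(axis=1)

    cv = lambda x: x.std() / x.mean()
    print(f"CV of single meals: {100 * cv(meals.ravel()):.0f}%")
    print(f"CV of daily totals: {100 * cv(daily):.0f}%")   # ~ 34% / sqrt(5), roughly 15%
    ```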

  1. Assessing the Impact of Climatic Variability and Change on Maize Production in the Midwestern USA

    NASA Astrophysics Data System (ADS)

    Andresen, J.; Jain, A. K.; Niyogi, D. S.; Alagarswamy, G.; Biehl, L.; Delamater, P.; Doering, O.; Elias, A.; Elmore, R.; Gramig, B.; Hart, C.; Kellner, O.; Liu, X.; Mohankumar, E.; Prokopy, L. S.; Song, C.; Todey, D.; Widhalm, M.

    2013-12-01

    Weather and climate remain among the most important uncontrollable factors in agricultural production systems. In this study, three process-based crop simulation models were used to identify the impacts of climate on the production of maize in the Midwestern U.S.A. during the past century. The 12-state region is a key global production area, responsible for more than 80% of U.S. domestic and 25% of total global production. The study is a part of the Useful to Useable (U2U) Project, a USDA NIFA-sponsored project seeking to improve the resilience and profitability of farming operations in the region amid climate variability and change. Three process-based crop simulation models were used in the study: CERES-Maize (DSSAT, Hoogenboom et al., 2012), the Hybrid-Maize model (Yang et al., 2004), and the Integrated Science Assessment Model (ISAM, Song et al., 2013). Model validation was carried out with individual plot and county observations. The models were run with 4 to 50 km spatial resolution gridded weather data for representative soils and cultivars, 1981-2012, to examine spatial and temporal yield variability within the region. We also examined the influence of different crop models and spatial scales on regional scale yield estimation, as well as a yield gap analysis between observed and attainable yields. An additional study was carried out with the CERES-Maize model at 18 individual site locations 1901-2012 to examine longer term historical trends. For all simulations, all input variables were held constant in order to isolate the impacts of climate. In general, the model estimates were in good agreement with observed yields, especially in central sections of the region. Regionally, low precipitation and soil moisture stress were chief limitations to simulated crop yields. The study suggests that at least part of the observed yield increases in the region during recent decades have occurred as the result of wetter, less stressful growing season weather conditions.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Konapala, Goutam; Mishra, Ashok; Leung, L. Ruby

    This study investigated the anthropogenic influence on the temporal variability of annual precipitation for the period 1950-2005 as simulated by the CMIP5 models. The temporal variability of both annual precipitation amount (PRCPTOT) and intensity (SDII) was first measured using a metric of statistical dispersion called the Gini coefficient. Comparing simulations driven by both anthropogenic and natural forcings (ALL) with simulations of natural forcings only (NAT), we quantified the anthropogenic contributions to the changes in temporal variability at global, continental and sub-continental scales as a relative difference of the respective Gini coefficients of ALL and NAT. Over the period of 1950-2005, our results indicate that anthropogenic forcings have resulted in decreased uniformity (i.e., increase in unevenness or disparity) in annual precipitation amount and intensity at global as well as continental scales. In addition, out of the 21 sub-continental regions considered, 14 (PRCPTOT) and 17 (SDII) regions showed significant anthropogenic influences. The human impacts are generally larger for SDII compared to PRCPTOT, indicating that the temporal variability of precipitation intensity is generally more susceptible to anthropogenic influence than precipitation amount. Lastly, the results highlight that anthropogenic activities have changed not only the trends but also the temporal variability of annual precipitation, which underscores the need to develop effective adaptation management practices to address the increased disparity.

  3. Underestimated interannual variability of East Asian summer rainfall under climate change

    NASA Astrophysics Data System (ADS)

    Ren, Yongjian; Song, Lianchun; Xiao, Ying; Du, Liangmin

    2018-02-01

    This study evaluates the performance of climate models in simulating the climatological mean and interannual variability of East Asian summer rainfall (EASR) using Coupled Model Intercomparison Project Phase 5 (CMIP5). Compared to observations, the interannual variability of EASR during 1979-2005 is underestimated by the CMIP5 models by 0.86-16.08%. Bias correction of the CMIP5 simulations against historical data enhances the reliability of the future projections. The corrected EASR under representative concentration pathways (RCPs) 4.5 and 8.5 increases by 5.6 and 7.5% during 2081-2100 relative to the baseline of 1986-2005, respectively. After correction, the areas with negative and positive anomalies both decrease; they are mainly located in the South China Sea and central China, and in southern China and west of the Philippines, respectively. In comparison to the baseline, the interannual variability of EASR increases by 20.8% under RCP4.5 but 26.2% under RCP8.5 in 2006-2100, which is underestimated by 10.7 and 11.1% under the two RCPs, respectively, in the original CMIP5 simulations. Compared with the change in mean precipitation, the increase in the interannual variability of EASR under global warming is notably larger. Thus, the probabilities of floods and droughts may increase in the future.

  4. The intraannual variability of land-atmosphere coupling over North America in the Canadian Regional Climate Model (CRCM5)

    NASA Astrophysics Data System (ADS)

    Yang Kam Wing, G.; Sushama, L.; Diro, G. T.

    2016-12-01

    This study investigates the intraannual variability of soil moisture-temperature coupling over North America. To this effect, coupled and uncoupled simulations are performed with the fifth-generation Canadian Regional Climate Model (CRCM5), driven by ERA-Interim. In coupled simulations, land and atmosphere interact freely; in uncoupled simulations, the interannual variability of soil moisture is suppressed by prescribing climatological values for soil liquid and frozen water contents. The study also explores projected changes to coupling by comparing coupled and uncoupled CRCM5 simulations for current (1981-2010) and future (2071-2100) periods, driven by the Canadian Earth System Model. Coupling differs for the northern and southern parts of North America. Over the southern half, it is persistent throughout the year while for the northern half, strongly coupled regions generally follow the freezing line during the cold months. Detailed analysis of the southern Canadian Prairies reveals seasonal differences in the underlying coupling mechanism. During spring and fall, as opposed to summer, the interactive soil moisture phase impacts the snow depth and surface albedo, which further impacts the surface energy budget and thus the surface air temperature; the air temperature then influences the snow depth in a feedback loop. Projected changes to coupling are also season specific: relatively drier soil conditions strengthen coupling during summer, while changes in soil moisture phase, snow depth, and cloud cover impact coupling during colder months. Furthermore, results demonstrate that soil moisture variability amplifies the frequency of temperature extremes over regions of strong coupling in current and future climates.

  5. One- and Two-dimensional Solitary Wave States in the Nonlinear Kramers Equation with Movement Direction as a Variable

    NASA Astrophysics Data System (ADS)

    Sakaguchi, Hidetsugu; Ishibashi, Kazuya

    2018-06-01

    We study self-propelled particles by direct numerical simulation of the nonlinear Kramers equation for self-propelled particles. In our previous paper, we studied self-propelled particles with velocity variables in one dimension. In this paper, we consider another model in which each particle exhibits directional motion. The movement direction is expressed with a variable ϕ. We show that one-dimensional solitary wave states appear in direct numerical simulations of the nonlinear Kramers equation in one- and two-dimensional systems, which is a generalization of our previous result. Furthermore, we find two-dimensionally localized states in the case that each self-propelled particle exhibits rotational motion. The center of mass of the two-dimensionally localized state exhibits circular motion, which implies collective rotating motion. Finally, we consider a simple one-dimensional model equation to qualitatively understand the formation of the solitary wave state.

  6. Water-vapour variability within a convective boundary-layer assessed by large-eddy simulations and IHOP_2002 observations

    NASA Astrophysics Data System (ADS)

    Couvreux, F.; Guichard, F.; Redelsperger, J. L.; Kiemle, C.; Masson, V.; Lafore, J. P.; Flamant, C.

    2005-10-01

    This study presents a comprehensive analysis of the variability of water vapour in a growing convective boundary-layer (CBL) over land, highlighting the complex links between advection, convective activity and moisture heterogeneity in the boundary layer. A Large-eddy Simulation (LES) is designed based on observations and validated using an independent data-set collected during the International H2O Project (IHOP 2002) field experiment. Ample information about the moisture distribution in space and time, as well as other important CBL parameters are acquired by mesonet stations, balloon soundings, instruments on-board two aircraft and the DLR airborne water-vapour differential-absorption lidar. Because it can deliver two-dimensional cross-sections at high spatial resolution (140 m horizontal, 200 m vertical), the airborne lidar offers valuable insights into small-scale moisture variability throughout the CBL. The LES is able to reproduce the development of the CBL in the morning and early afternoon, as assessed by comparisons of simulated mean profiles of key meteorological variables with sounding data. Simulated profiles of the variance of water-vapour mixing-ratio were found to be in good agreement with the lidar-derived counterparts. Finally, probability-density functions of potential temperature, vertical velocity and water-vapour mixing-ratio calculated from the LES show great consistency with those derived from aircraft in situ measurements in the middle of the CBL. The observed water-vapour variability exhibits contributions from different scales. The influence of the mesoscale (larger than LES domain size, i.e. 10 km) on the smaller-scale variability is assessed using LES and observations. The small-scale variability of water vapour is found to be important and to be driven by the dynamics of the CBL. Both the lidar observations and the LES show that dry downdraughts entrained from above the CBL govern the scale of moisture variability. Characteristic length-scales are found to be larger for water-vapour mixing-ratio than for temperature and vertical velocity. In particular, intrusions of drier free-troposphere air from above the growing CBL impose a marked negative skewness on the water-vapour distribution within it, both as observed and in the simulation.

  7. Variability in Temperature-Related Mortality Projections under Climate Change

    PubMed Central

    Benmarhnia, Tarik; Sottile, Marie-France; Plante, Céline; Brand, Allan; Casati, Barbara; Fournier, Michel

    2014-01-01

    Background: Most studies that have assessed impacts on mortality of future temperature increases have relied on a small number of simulations and have not addressed the variability and sources of uncertainty in their mortality projections. Objectives: We assessed the variability of temperature projections and dependent future mortality distributions, using a large panel of temperature simulations based on different climate models and emission scenarios. Methods: We used historical data from 1990 through 2007 for Montreal, Quebec, Canada, and Poisson regression models to estimate relative risks (RR) for daily nonaccidental mortality in association with three different daily temperature metrics (mean, minimum, and maximum temperature) during June through August. To estimate future numbers of deaths attributable to ambient temperatures and the uncertainty of the estimates, we used 32 different simulations of daily temperatures for June–August 2020–2037 derived from three global climate models (GCMs) and a Canadian regional climate model with three sets of RRs (one based on the observed historical data, and two on bootstrap samples that generated the 95% CI of the attributable number (AN) of deaths). We then used analysis of covariance to evaluate the influence of the simulation, the projected year, and the sets of RRs used to derive the attributable numbers of deaths. Results: We found that < 1% of the variability in the distributions of simulated temperature for June–August of 2020–2037 was explained by differences among the simulations. Estimated ANs for 2020–2037 ranged from 34 to 174 per summer (i.e., June–August). Most of the variability in mortality projections (38%) was related to the temperature–mortality RR used to estimate the ANs. Conclusions: The choice of the RR estimate for the association between temperature and mortality may be important to reduce uncertainty in mortality projections. Citation: Benmarhnia T, Sottile MF, Plante C, Brand A, Casati B, Fournier M, Smargiassi A. 2014. Variability in temperature-related mortality projections under climate change. Environ Health Perspect 122:1293–1298; http://dx.doi.org/10.1289/ehp.1306954 PMID:25036003
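    The attributable numbers discussed above can be computed from daily relative risks with the standard attributable-fraction identity AN = deaths x (RR - 1)/RR summed over days; the sketch below assumes a simple log-linear RR above a temperature threshold, with hypothetical inputs, and is not the authors' exact model.

    ```python
    import numpy as np

    def attributable_number(deaths, temp, beta, threshold):
        """Heat-attributable deaths for one summer: on days above the threshold,
        AN_day = deaths_day * (RR_day - 1) / RR_day with RR_day = exp(beta * excess)."""
        excess = np.clip(temp - threshold, 0.0, None)
        rr = np.exp(beta * excess)
        return np.sum(deaths * (rr - 1.0) / rr)

    # hypothetical inputs: 92 summer days of simulated daily mean temperature and deaths
    rng = np.random.default_rng(10)
    temp = rng.normal(22.0, 4.0, 92)
    deaths = rng.poisson(45, 92)
    beta = np.log(1.04)        # e.g. RR = 1.04 per degree C above the threshold (assumed)
    print("attributable deaths this summer:",
          round(attributable_number(deaths, temp, beta, 20.0), 1))
    ```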

  8. Predictability of Subsurface Temperature and the AMOC

    NASA Astrophysics Data System (ADS)

    Chang, Y.; Schubert, S. D.

    2013-12-01

    The GEOS-5 coupled model is extensively used for experimental decadal climate prediction. Understanding the limits of decadal ocean predictability is critical for making progress in these efforts. Using this model, we study the initial-value predictability of subsurface temperature, the variability of the Atlantic meridional overturning circulation (AMOC), and its impacts on the global climate. Our approach is to utilize the idealized data assimilation technology developed at the GMAO. The technique 'replay' allows us to assess, for example, the impact of the surface wind stresses and/or precipitation on the ocean in a very well controlled environment. By running the coupled model in replay mode we can in fact constrain the model using any existing reanalysis data set. We replay the model, constraining (nudging) it to the MERRA reanalysis in various fields from 1948-2012. The fields u, v, T, q, and ps are adjusted towards the 6-hourly analyzed fields in the atmosphere. The simulated AMOC variability is studied with a 400-year-long segment of replay integration. The 84 cases of 10-year hindcasts are initialized from 4 different replay cycles. Here, the variability and predictability are examined further with a measure that quantifies how much the subsurface temperature and AMOC variability have been influenced by atmospheric forcing and by ocean internal variability. The simulated impact of the AMOC on the multi-decadal variability of the SST, sea surface height (SSH) and sea ice extent is also studied.

  9. An exploration of the relationship between knowledge and performance-related variables in high-fidelity simulation: designing instruction that promotes expertise in practice.

    PubMed

    Hauber, Roxanne P; Cormier, Eileen; Whyte, James

    2010-01-01

    Increasingly, high-fidelity patient simulation (HFPS) is becoming essential to nursing education. Much remains unknown about how classroom learning is connected to student decision-making in simulation scenarios and the degree to which transference takes place between the classroom setting and actual practice. The present study was part of a larger pilot study aimed at determining the relationship between nursing students' clinical ability to prioritize their actions and the associated cognitions and physiologic outcomes of care using HFPS. In an effort to better explain the knowledge base being used by nursing students in HFPS, the investigators explored the relationship between common measures of knowledge and performance-related variables. Findings are discussed within the context of the expert performance approach and concepts from cognitive psychology, such as cognitive architecture, cognitive load, memory, and transference.

  10. Using collective variables to drive molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Fiorin, Giacomo; Klein, Michael L.; Hénin, Jérôme

    2013-12-01

    A software framework is introduced that facilitates the application of biasing algorithms to collective variables of the type commonly employed to drive massively parallel molecular dynamics (MD) simulations. The modular framework that is presented enables one to combine existing collective variables into new ones, and combine any chosen collective variable with available biasing methods. The latter include the classic time-dependent biases referred to as steered MD and targeted MD, the temperature-accelerated MD algorithm, as well as the adaptive free-energy biases called metadynamics and adaptive biasing force. The present modular software is extensible, and portable between commonly used MD simulation engines.
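
    To illustrate the kind of time-dependent bias such a framework applies to a collective variable, the sketch below implements a moving harmonic restraint on an interatomic distance (a steered-MD-style bias) and returns the resulting forces on the two atoms. The spring constant, pulling speed, and coordinates are arbitrary, and the code is a generic illustration rather than the framework's own API.

        import numpy as np

        def distance_cv(r1, r2):
            """Collective variable: distance between two atoms, plus its gradients."""
            d = r2 - r1
            dist = np.linalg.norm(d)
            grad2 = d / dist          # d(dist)/d(r2)
            return dist, -grad2, grad2

        def moving_restraint_force(r1, r2, t, k=500.0, cv0=10.0, speed=0.1):
            """Steered-MD-style bias: harmonic restraint whose center moves at constant speed."""
            cv, g1, g2 = distance_cv(r1, r2)
            center = cv0 + speed * t
            dU_dcv = k * (cv - center)          # derivative of the bias potential w.r.t. the CV
            return -dU_dcv * g1, -dU_dcv * g2   # forces on the two atoms (chain rule)

        f1, f2 = moving_restraint_force(np.zeros(3), np.array([10.5, 0.0, 0.0]), t=1.0)
        print(f1, f2)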

  11. A comparison of the cardiovascular effects of simulated and spontaneous laughter.

    PubMed

    Law, Mikaela M; Broadbent, Elizabeth A; Sollers, John J

    2018-04-01

    Laughter has long been regarded as beneficial for health, but the mechanisms are not clearly understood. The current study aimed to compare the acute cardiovascular effects of spontaneous and simulated laughter. A mixed factorial experiment was performed to examine changes in cardiovascular variables in response to experimental tasks across conditions. A sample of 72 participants were randomised to one of three 6 min interventions. Participants in the simulated laughter condition were asked to generate fake laughter, the spontaneous laughter condition viewed a humorous video, and the control condition watched a non-humorous documentary. This was followed by a laboratory stress task. Heart rate and heart rate variability (as indexed by rMSSD) were monitored continuously throughout the experiment using ECG. The simulated laughter condition had a significantly higher heart rate (p < .001, ηp² = .26) and lower rMSSD (p < .001, ηp² = .13) during the laughter task compared to the other two conditions. Follow-up hierarchical regressions indicated that the difference in heart rate was due to the fact that the simulated condition produced more laughter. The difference in rMSSD, however, was unique to the simulated condition even when controlling for the amount of laughter. The simulated laughter condition had a significantly lower mean HR during the stress task, but this was not significant after controlling for the amount of laughter produced. Laughter leads to increased heart rate and reduced heart rate variability, which is similar to the effects of exercise. This finding is more pronounced in simulated laughter. Copyright © 2018 Elsevier Ltd. All rights reserved.
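
    rMSSD, the heart rate variability index used here, is simply the root mean square of successive differences of the inter-beat (RR) intervals; a minimal computation on hypothetical RR intervals:

        import numpy as np

        def rmssd(rr_ms):
            """Root mean square of successive differences of RR intervals (ms)."""
            diffs = np.diff(np.asarray(rr_ms, dtype=float))
            return np.sqrt(np.mean(diffs ** 2))

        rr = [812, 798, 805, 776, 790, 810, 795]   # hypothetical RR intervals in ms
        print(f"rMSSD = {rmssd(rr):.1f} ms")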

  12. Interactions between Antarctic sea ice and large-scale atmospheric modes in CMIP5 models

    NASA Astrophysics Data System (ADS)

    Schroeter, Serena; Hobbs, Will; Bindoff, Nathaniel L.

    2017-03-01

    The response of Antarctic sea ice to large-scale patterns of atmospheric variability varies according to sea ice sector and season. In this study, interannual atmosphere-sea ice interactions were explored using observations and reanalysis data, and compared with simulated interactions by models in the Coupled Model Intercomparison Project Phase 5 (CMIP5). Simulated relationships between atmospheric variability and sea ice variability generally reproduced the observed relationships, though more closely during the season of sea ice advance than the season of sea ice retreat. Atmospheric influence on sea ice is known to be strongest during advance, and it appears that models are able to capture the dominance of the atmosphere during advance. Simulations of ocean-atmosphere-sea ice interactions during retreat, however, require further investigation. A large proportion of model ensemble members overestimated the relative importance of the Southern Annular Mode (SAM) compared with other modes of high southern latitude climate, while the influence of tropical forcing was underestimated. This result emerged particularly strongly during the season of sea ice retreat. The zonal patterns of the SAM in many models and its exaggerated influence on sea ice overwhelm the comparatively underestimated meridional influence, suggesting that simulated sea ice variability would become more zonally symmetric as a result. Across the seasons of sea ice advance and retreat, three of the five sectors did not reveal a strong relationship with a pattern of large-scale atmospheric variability in one or both seasons, indicating that sea ice in these sectors may be influenced more strongly by atmospheric variability unexplained by the major atmospheric modes, or by heat exchange in the ocean.

  13. Exploring the impacts of physics and resolution on aqua-planet simulations from a nonhydrostatic global variable-resolution modeling framework: IMPACTS OF PHYSICS AND RESOLUTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Chun; Leung, L. Ruby; Park, Sang-Hun

    Advances in computing resources are gradually moving regional and global numerical forecasting simulations towards sub-10 km resolution, but global high resolution climate simulations remain a challenge. The non-hydrostatic Model for Prediction Across Scales (MPAS) provides a global framework to achieve very high resolution using regional mesh refinement. Previous studies using the hydrostatic version of MPAS (H-MPAS) with the physics parameterizations of Community Atmosphere Model version 4 (CAM4) found notable resolution dependent behaviors. This study revisits the resolution sensitivity using the non-hydrostatic version of MPAS (NH-MPAS) with both CAM4 and CAM5 physics. A series of aqua-planet simulations at global quasi-uniform resolutions ranging from 240 km to 30 km and global variable resolution simulations with a regional mesh refinement of 30 km resolution over the tropics are analyzed, with a primary focus on the distinct characteristics of NH-MPAS in simulating precipitation, clouds, and large-scale circulation features compared to H-MPAS-CAM4. The resolution sensitivity of total precipitation and column integrated moisture in NH-MPAS is smaller than that in H-MPAS-CAM4. This contributes importantly to the reduced resolution sensitivity of large-scale circulation features such as the inter-tropical convergence zone and Hadley circulation in NH-MPAS compared to H-MPAS. In addition, NH-MPAS shows almost no resolution sensitivity in the simulated westerly jet, in contrast to the obvious poleward shift in H-MPAS with increasing resolution, which is partly explained by differences in the hyperdiffusion coefficients used in the two models that influence wave activity. With the reduced resolution sensitivity, simulations in the refined region of the NH-MPAS global variable resolution configuration exhibit zonally symmetric features that are more comparable to the quasi-uniform high-resolution simulations than those from H-MPAS that displays zonal asymmetry in simulations inside the refined region. Overall, NH-MPAS with CAM5 physics shows less resolution sensitivity compared to CAM4. These results provide a reference for future studies to further explore the use of NH-MPAS for high-resolution climate simulations in idealized and realistic configurations.

  14. Relaxing the rule of ten events per variable in logistic and Cox regression.

    PubMed

    Vittinghoff, Eric; McCulloch, Charles E

    2007-03-15

    The rule of thumb that logistic and Cox models should be used with a minimum of 10 outcome events per predictor variable (EPV), based on two simulation studies, may be too conservative. The authors conducted a large simulation study of other influences on confidence interval coverage, type I error, relative bias, and other model performance measures. They found a range of circumstances in which coverage and bias were within acceptable levels despite less than 10 EPV, as well as other factors that were as influential as or more influential than EPV. They conclude that this rule can be relaxed, in particular for sensitivity analyses undertaken to demonstrate adequate control of confounding.
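
    A simulation of the kind summarized above, checking confidence-interval coverage and bias of a logistic coefficient at a low events-per-variable ratio, could be sketched as follows. The effect size, covariate distribution, and sample size are illustrative choices, not the authors' designs.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        beta_true, n_pred = 0.5, 4
        n, intercept = 400, -2.9       # event rate ~5% -> roughly 20 events, about 5 per predictor

        covered, bias = [], []
        for _ in range(500):
            X = rng.normal(size=(n, n_pred))
            p = 1.0 / (1.0 + np.exp(-(intercept + beta_true * X[:, 0])))
            y = rng.binomial(1, p)
            try:
                fit = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
            except Exception:
                continue                          # skip non-converged or separated replicates
            lo, hi = fit.conf_int()[1]            # CI for the coefficient of the first predictor
            covered.append(lo <= beta_true <= hi)
            bias.append(fit.params[1] - beta_true)

        print(f"coverage ~ {np.mean(covered):.2f}, mean bias ~ {np.mean(bias):+.3f}")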

  15. Probabilistic Fiber Composite Micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, Thomas A.

    1996-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. The variables in which uncertainties are accounted for include constituent and void volume ratios, constituent elastic properties and strengths, and fiber misalignment. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material property variations induced by random changes expected at the material micro level. Regression results are presented to show the relative correlation between predictor and response variables in the study. These computational procedures make possible a formal description of anticipated random processes at the intra-ply level, and the related effects of these on composite properties.
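
    The Monte Carlo idea, sampling constituent properties and volume ratios and propagating them through micromechanics relations, can be sketched with the longitudinal rule of mixtures for a unidirectional ply; the distributions and nominal values below are illustrative rather than the report's inputs.

        import numpy as np

        rng = np.random.default_rng(42)
        n_sim = 10_000

        # Sampled constituent properties (illustrative means and scatter for a graphite/epoxy ply)
        E_f = rng.normal(230e9, 10e9, n_sim)      # fiber modulus, Pa
        E_m = rng.normal(3.5e9, 0.2e9, n_sim)     # matrix modulus, Pa
        v_f = rng.normal(0.60, 0.02, n_sim)       # fiber volume ratio
        v_v = np.clip(rng.normal(0.02, 0.01, n_sim), 0.0, None)   # void volume ratio

        # Longitudinal modulus by the rule of mixtures, with voids simply diluting the matrix
        E_11 = E_f * v_f + E_m * (1.0 - v_f - v_v)

        print(f"E11 mean = {E_11.mean()/1e9:.1f} GPa, CV = {100*E_11.std()/E_11.mean():.1f}%")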

  16. Subject order-independent group ICA (SOI-GICA) for functional MRI data analysis.

    PubMed

    Zhang, Han; Zuo, Xi-Nian; Ma, Shuang-Ye; Zang, Yu-Feng; Milham, Michael P; Zhu, Chao-Zhe

    2010-07-15

    Independent component analysis (ICA) is a data-driven approach to study functional magnetic resonance imaging (fMRI) data. Particularly, for group analysis on multiple subjects, temporal concatenation group ICA (TC-GICA) is intensively used. However, due to the usually limited computational capability, data reduction with principal component analysis (PCA: a standard preprocessing step of ICA decomposition) is difficult to achieve for a large dataset. To overcome this, TC-GICA employs multiple-stage PCA data reduction. Such multiple-stage PCA data reduction, however, leads to variable outputs due to different subject concatenation orders. Consequently, the ICA algorithm uses the variable multiple-stage PCA outputs and generates variable decompositions. In this study, a rigorous theoretical analysis was conducted to prove the existence of such variability. Simulated and real fMRI experiments were used to demonstrate the subject-order-induced variability of TC-GICA results using multiple PCA data reductions. To solve this problem, we propose a new subject order-independent group ICA (SOI-GICA). Both simulated and real fMRI data experiments demonstrated the high robustness and accuracy of the SOI-GICA results compared to those of traditional TC-GICA. Accordingly, we recommend SOI-GICA for group ICA-based fMRI studies, especially those with large data sets. Copyright 2010 Elsevier Inc. All rights reserved.
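
    The order dependence arises because multi-stage PCA reductions of differently ordered concatenations retain slightly different subspaces. The toy sketch below applies a two-stage PCA to synthetic subjects concatenated in two orders and compares the retained spatial subspaces through the cosines of their principal angles (values of 1.0 would indicate identical subspaces); the staging scheme and dimensions are illustrative, not the TC-GICA pipeline itself.

        import numpy as np

        rng = np.random.default_rng(0)
        n_sub, n_time, n_vox, k = 8, 50, 200, 10
        subjects = [rng.normal(size=(n_time, n_vox)) for _ in range(n_sub)]

        def staged_pca(data_list, k):
            """Reduce subjects in pairs first, then reduce the concatenation (toy two-stage PCA)."""
            stage1 = []
            for i in range(0, len(data_list), 2):
                pair = np.vstack(data_list[i:i + 2])
                _, _, vt = np.linalg.svd(pair, full_matrices=False)
                stage1.append(pair @ vt[:k].T @ vt[:k])    # keep the pair's top-k components
            _, _, vt = np.linalg.svd(np.vstack(stage1), full_matrices=False)
            return vt[:k]                                  # final k-dimensional spatial subspace

        V_a = staged_pca(subjects, k)
        V_b = staged_pca(subjects[::-1], k)                # same subjects, reversed order

        # Cosines of the principal angles between the two retained subspaces
        cosines = np.linalg.svd(V_a @ V_b.T, compute_uv=False)
        print(cosines.round(3))   # values below 1.0 indicate order-dependent reductions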

  17. An improved neutral landscape model for recreating real landscapes and generating landscape series for spatial ecological simulations.

    PubMed

    van Strien, Maarten J; Slager, Cornelis T J; de Vries, Bauke; Grêt-Regamey, Adrienne

    2016-06-01

    Many studies have assessed the effect of landscape patterns on spatial ecological processes by simulating these processes in computer-generated landscapes with varying composition and configuration. To generate such landscapes, various neutral landscape models have been developed. However, the limited set of landscape-level pattern variables included in these models is often inadequate to generate landscapes that reflect real landscapes. In order to achieve more flexibility and variability in the generated landscape patterns, a more complete set of class- and patch-level pattern variables should be implemented in these models. These enhancements have been implemented in Landscape Generator (LG), software that uses optimization algorithms to generate landscapes that match user-defined target values. Although LG was originally developed for small-scale participatory spatial planning, we enhanced its usability and demonstrated how it can be used for larger-scale ecological studies. First, we used LG to recreate landscape patterns from a real landscape (i.e., a mountainous region in Switzerland). Second, we generated landscape series with incrementally changing pattern variables, which could be used in ecological simulation studies. We found that LG was able to recreate landscape patterns that approximate those of real landscapes. Furthermore, we successfully generated landscape series that would not have been possible with traditional neutral landscape models. LG is a promising novel approach for generating neutral landscapes and enables testing of new hypotheses regarding the influence of landscape patterns on ecological processes. LG is freely available online.

  18. Synchronization of chaotic systems involving fractional operators of Liouville-Caputo type with variable-order

    NASA Astrophysics Data System (ADS)

    Coronel-Escamilla, A.; Gómez-Aguilar, J. F.; Torres, L.; Escobar-Jiménez, R. F.; Valtierra-Rodríguez, M.

    2017-12-01

    In this paper, we propose a state-observer-based approach to synchronize variable-order fractional (VOF) chaotic systems. In particular, this work is focused on complete synchronization with a so-called unidirectional master-slave topology. The master is described by a dynamical system in state-space representation whereas the slave is described by a state observer. The slave is composed of a master copy and a correction term, which in turn consists of an estimation error and an appropriate gain that assures synchronization. The differential equations of the VOF chaotic system are described by the Liouville-Caputo and Atangana-Baleanu-Caputo derivatives. Numerical simulations involving the synchronization of Rössler oscillators, Chua's systems and multi-scrolls are studied. The simulations show that different chaotic behaviors can be obtained if different smooth functions defined in the interval (0, 1] are used as the variable order of the fractional derivatives. Furthermore, simulations show that the VOF chaotic systems can be synchronized.

  19. No rationale for 1 variable per 10 events criterion for binary logistic regression analysis.

    PubMed

    van Smeden, Maarten; de Groot, Joris A H; Moons, Karel G M; Collins, Gary S; Altman, Douglas G; Eijkemans, Marinus J C; Reitsma, Johannes B

    2016-11-24

    Ten events per variable (EPV) is a widely advocated minimal criterion for sample size considerations in logistic regression analysis. Of three previous simulation studies that examined this minimal EPV criterion, only one supports the use of a minimum of 10 EPV. In this paper, we examine the reasons for substantial differences between these extensive simulation studies. The current study uses Monte Carlo simulations to evaluate small sample bias, coverage of confidence intervals and mean square error of logit coefficients. Logistic regression models fitted by maximum likelihood and a modified estimation procedure, known as Firth's correction, are compared. The results show that besides EPV, the problems associated with low EPV depend on other factors such as the total sample size. It is also demonstrated that simulation results can be dominated by even a few simulated data sets for which the prediction of the outcome by the covariates is perfect ('separation'). We reveal that different approaches for identifying and handling separation lead to substantially different simulation results. We further show that Firth's correction can be used to improve the accuracy of regression coefficients and alleviate the problems associated with separation. The current evidence supporting EPV rules for binary logistic regression is weak. Given our findings, there is an urgent need for new research to provide guidance for supporting sample size considerations for binary logistic regression analysis.
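
    Separation, a covariate pattern that predicts the outcome perfectly, can dominate low-EPV simulations. For a single continuous covariate it can be detected with a simple ordering check, as in this sketch with an illustrative sample size and effect size.

        import numpy as np

        rng = np.random.default_rng(3)

        def is_separated(x, y):
            """Complete separation for one covariate: the classes do not overlap on x."""
            return x[y == 1].min() > x[y == 0].max() or x[y == 1].max() < x[y == 0].min()

        n, beta = 25, 1.5                         # small sample, strong effect (illustrative)
        count = 0
        for _ in range(2000):
            x = rng.normal(size=n)
            y = rng.binomial(1, 1.0 / (1.0 + np.exp(-beta * x)))
            if y.min() == y.max() or is_separated(x, y):
                count += 1
        print(f"separated or degenerate replicates: {100 * count / 2000:.1f}%")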

  20. GCM Simulation of the Large-scale North American Monsoon Including Water Vapor Tracer Diagnostics

    NASA Technical Reports Server (NTRS)

    Bosilovich, Michael G.; Schubert, Siegfried D.; Sud, Yogesh; Walker, Gregory K.

    2002-01-01

    In this study, we have applied GCM water vapor tracers (WVT) to simulate the North American water cycle. WVTs allow quantitative computation of the geographical source of water for precipitation that occurs anywhere in the model simulation. This can be used to isolate the impact that local surface evaporation has on precipitation, compared to advection and convection. A 15-year simulation at 1° × 1.25° resolution has been performed with 11 global and 11 North American regional WVTs. Figure 1 shows the source regions of the North American WVTs. When water evaporates from one of these predefined regions, its mass is used as the source for a distinct prognostic variable in the model. This prognostic variable allows the water to be transported and removed (precipitated) from the system in the same way as the prognostic specific humidity. Details of the model are outlined by Bosilovich and Schubert (2002) and Bosilovich (2002). Here, we present results pertaining to the onset of the simulated North American monsoon.
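
    The tracer bookkeeping can be illustrated with a single-column toy model in which evaporation from a tagged region feeds a tracer humidity that is removed by precipitation in proportion to its share of the total humidity, mirroring how the tracer is treated like the prognostic specific humidity. All rates and the time step are invented for illustration.

        import numpy as np

        dt = 3600.0                       # s
        q, q_tag = 12.0, 0.0              # total and tagged column water (kg m-2), illustrative

        for _ in range(240):              # ten days of hourly steps
            evap_total = 3.0e-5 * dt      # total evaporation into the column
            evap_tag   = 1.0e-5 * dt      # evaporation from the tagged source region (assumed)
            precip     = 2.5e-6 * q * dt  # precipitation removes water in proportion to q

            q_new = q + evap_total - precip
            q_tag = q_tag + evap_tag - precip * (q_tag / q)   # tracer removed by its share of q
            q = q_new

        print(f"fraction of column water from the tagged region: {q_tag / q:.2f}")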

  1. Physiologic Simulation of the Fontan Surgery with Variable Wall Properties and Respiration

    NASA Astrophysics Data System (ADS)

    Long, Christopher; Bazilevs, Yuri; Feinstein, Jeffrey; Marsden, Alison

    2010-11-01

    Children born with single ventricle heart defects typically undergo a surgical procedure known as a total cavopulmonary connection (TCPC). The goal of this work is to perform hemodynamic simulations accounting for motion of the arterial walls in the TCPC. We perform fluid-structure interaction (FSI) simulations using an Arbitrary Lagrangian-Eulerian (ALE) finite element framework applied to a patient-specific model of the TCPC. The patient's post-op anatomy is reconstructed from MRI data. Respiration rate, heart rate, and venous pressures are obtained from catheterization data, and flowrates are obtained from phase contrast MRI data and are used together with a respiratory model. Lumped parameter (RCR) boundary conditions are used at the outlets. This study is the first to introduce variable elastic properties for the different areas of the TCPC, including a Gore-Tex conduit. Quantities such as wall shear stresses and pressures at critical junctions are extracted from the simulation and are compared with pressure tracings from clinical data as well as with rigid wall simulations.
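
    The RCR (three-element Windkessel) outlet boundary condition mentioned above relates outlet pressure to flow through a proximal resistance, a compliance, and a distal resistance. A minimal explicit-Euler sketch with illustrative, non-patient-specific parameter values:

        import numpy as np

        # Illustrative RCR parameters (arbitrary consistent units, not patient-specific values)
        Rp, C, Rd = 100.0, 1.0e-4, 1000.0
        P_dist = 0.0                 # distal (venous) reference pressure
        dt = 1.0e-3

        def rcr_update(P_c, Q):
            """Advance the capacitor pressure and return the outlet pressure for flow Q."""
            dP_c = (Q - (P_c - P_dist) / Rd) / C
            P_c_new = P_c + dt * dP_c
            return P_c_new, Rp * Q + P_c_new

        # Drive the outlet with a pulsatile flow waveform (toy example)
        t = np.arange(0.0, 2.0, dt)
        Q = 50.0 * (1.0 + np.sin(2.0 * np.pi * t))
        P_c, outlet = 0.0, []
        for q in Q:
            P_c, P_out = rcr_update(P_c, q)
            outlet.append(P_out)
        print(f"outlet pressure range: {min(outlet):.0f} to {max(outlet):.0f}")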

  2. Quadrature Moments Method for the Simulation of Turbulent Reactive Flows

    NASA Technical Reports Server (NTRS)

    Raman, Venkatramanan; Pitsch, Heinz; Fox, Rodney O.

    2003-01-01

    A sub-filter model for reactive flows, namely the DQMOM model, was formulated for Large Eddy Simulation (LES) using the filtered mass density function. Transport equations required to determine the location and size of the delta-peaks were then formulated for a 2-peak decomposition of the FDF. The DQMOM scheme was implemented in an existing structured-grid LES solver. Simulations of a scalar shear layer using an experimental configuration showed that the first and second moments of both reactive and inert scalars are in good agreement with a conventional Lagrangian scheme that evolves the same FDF. Comparisons with LES simulations performed using a laminar chemistry assumption for the reactive scalar show that the new method provides vast improvements at minimal computational cost. Currently, the DQMOM model is being implemented for use with the progress variable/mixture fraction model of Pierce. Comparisons with experimental results and LES simulations using a single environment for the progress variable are planned. Future studies will aim at understanding the effect of increasing the number of environments on predictions.

  3. Performance of nonlinear mixed effects models in the presence of informative dropout.

    PubMed

    Björnsson, Marcus A; Friberg, Lena E; Simonsson, Ulrika S H

    2015-01-01

    Informative dropout can lead to bias in statistical analyses if not handled appropriately. The objective of this simulation study was to investigate the performance of nonlinear mixed effects models with regard to bias and precision, with and without handling informative dropout. An efficacy variable and dropout depending on that efficacy variable were simulated and model parameters were reestimated, with or without including a dropout model. The Laplace and FOCE-I estimation methods in NONMEM 7, and the stochastic simulations and estimations (SSE) functionality in PsN, were used in the analysis. For the base scenario, bias was low, less than 5% for all fixed effects parameters, when a dropout model was used in the estimations. When a dropout model was not included, bias increased up to 8% for the Laplace method and up to 21% if the FOCE-I estimation method was applied. The bias increased with decreasing number of observations per subject, increasing placebo effect and increasing dropout rate, but was relatively unaffected by the number of subjects in the study. This study illustrates that ignoring informative dropout can lead to biased parameters in nonlinear mixed effects modeling, but even in cases with few observations or high dropout rate, the bias is relatively low and only translates into small effects on predictions of the underlying effect variable. A dropout model is, however, crucial in the presence of informative dropout in order to make realistic simulations of trial outcomes.

  4. Comparison of Measured and Numerically Simulated Turbulence Statistics in a Convective Boundary Layer Over Complex Terrain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rai, Raj K.; Berg, Larry K.; Kosović, Branko

    High resolution numerical simulation can provide insight into important physical processes that occur within the planetary boundary layer (PBL). The present work employs large eddy simulation (LES) using the Weather Research and Forecasting (WRF) model, with the LES domain nested within a mesoscale simulation, to simulate real conditions in the convective PBL over an area of complex terrain. A multiple nesting approach has been used to downsize the grid spacing from 12.15 km (mesoscale) to 0.03 km (LES). A careful selection of grid spacing in the WRF Meso domain has been conducted to minimize artifacts in the WRF-LES solutions. The WRF-LES results have been evaluated with in situ and remote sensing observations collected during the US Department of Energy-supported Columbia Basin Wind Energy Study (CBWES). Comparison of the first- and second-order moments, turbulence spectrum, and probability density function (PDF) of wind speed shows good agreement between the simulations and data. Furthermore, the WRF-LES variables show a great deal of variability in space and time caused by the complex topography in the LES domain. The WRF-LES results show that the flow structures, such as roll vortices and convective cells, vary depending on both the location and time of day. In addition to basic studies related to boundary-layer meteorology, results from these simulations can be used in other applications, such as studying wind energy resources, atmospheric dispersion, fire weather, etc.

  5. Simulation fails to replicate stress in trainees performing a technical procedure in the clinical environment.

    PubMed

    Baker, B G; Bhalla, A; Doleman, B; Yarnold, E; Simons, S; Lund, J N; Williams, J P

    2017-01-01

    Simulation-based training (SBT) has become an increasingly important method by which doctors learn. Stress has an impact upon learning, performance, technical, and non-technical skills. However, there are currently no studies that compare stress in the clinical and simulated environment. We aimed to compare objective (heart rate variability, HRV) and subjective (state trait anxiety inventory, STAI) measures of stress in theatre with a simulated environment. HRV recordings were obtained from eight anesthetic trainees performing an uncomplicated rapid sequence induction at pre-determined procedural steps using a wireless Polar RS800CX monitor in an emergency theatre setting. This was repeated in the simulated environment. Participants completed an STAI before and after the procedure. Eight trainees completed the study. The theatre environment caused an increase in objective stress vs baseline (p = .004). There was no significant difference between average objective stress levels across all time points (p = .20) between environments. However, there was a significant interaction between the variables of objective stress and environment (p = .045). There was no significant difference in subjective stress (p = .27) between environments. Simulation was unable to accurately replicate the stress of the technical procedure. This is the first study that compares the stress during SBT with the theatre environment and has implications for the assessment of simulated environments for use in examinations, rating of technical and non-technical skills, and stress management training.

  6. The Effects of Longitudinal Control-System Dynamics on Pilot Opinion and Response Characteristics as Determined from Flight Tests and from Ground Simulator Studies

    NASA Technical Reports Server (NTRS)

    Sadoff, Melvin

    1958-01-01

    The results of a fixed-base simulator study of the effects of variable longitudinal control-system dynamics on pilot opinion are presented and compared with flight-test data. The control-system variables considered in this investigation included stick force per g, time constant, and dead-band, or stabilizer breakout force. In general, the fairly good correlation between flight and simulator results for two pilots demonstrates the validity of fixed-base simulator studies which are designed to complement and supplement flight studies and serve as a guide in control-system preliminary design. However, in the investigation of certain problem areas (e.g., sensitive control-system configurations associated with pilot-induced oscillations in flight), fixed-base simulator results did not predict the occurrence of an instability, although the pilots noted the system was extremely sensitive and unsatisfactory. If it is desired to predict pilot-induced-oscillation tendencies, tests in moving-base simulators may be required. It was found possible to represent the human pilot by a linear pilot analog for the tracking task assumed in the present study. The criterion used to adjust the pilot analog was the root-mean-square tracking error of one of the human pilots on the fixed-base simulator. Matching the tracking error of the pilot analog to that of the human pilot gave an approximation to the variation of human-pilot behavior over a range of control-system dynamics. Results of the pilot-analog study indicated that both for optimized control-system dynamics (for poor airplane dynamics) and for a region of good airplane dynamics, the pilot response characteristics are approximately the same.

  7. Snow-atmosphere coupling and its impact on temperature variability and extremes over North America

    NASA Astrophysics Data System (ADS)

    Diro, G. T.; Sushama, L.; Huziy, O.

    2018-04-01

    The impact of snow-atmosphere coupling on climate variability and extremes over North America is investigated using modeling experiments with the fifth generation Canadian Regional Climate Model (CRCM5). To this end, two CRCM5 simulations driven by ERA-Interim reanalysis for the 1981-2010 period are performed, where snow cover and depth are prescribed (uncoupled) in one simulation while they evolve interactively (coupled) during model integration in the second one. Results indicate systematic influence of snow cover and snow depth variability on the inter-annual variability of soil and air temperatures during winter and spring seasons. Inter-annual variability of air temperature is larger in the coupled simulation, with snow cover and depth variability accounting for 40-60% of winter temperature variability over the Mid-west, Northern Great Plains and over the Canadian Prairies. The contribution of snow variability reaches even more than 70% during spring and the regions of high snow-temperature coupling extend north of the boreal forests. The dominant process contributing to the snow-atmosphere coupling is the albedo effect in winter, while the hydrological effect controls the coupling in spring. Snow cover/depth variability at different locations is also found to affect extremes. For instance, variability of cold-spell characteristics is sensitive to snow cover/depth variation over the Mid-west and Northern Great Plains, whereas warm-spell variability is sensitive to snow variation primarily in regions with climatologically extensive snow cover such as northeast Canada and the Rockies. Furthermore, snow-atmosphere interactions appear to have contributed to enhancing the number of cold spell days during the 2002 spring, the coldest recorded during the study period, by over 50% over western North America. Additional results also provide useful information on the importance of the interactions of snow with large-scale modes of variability in modulating temperature extreme characteristics.

  8. Simulating Ordinal Data

    ERIC Educational Resources Information Center

    Ferrari, Pier Alda; Barbiero, Alessandro

    2012-01-01

    The increasing use of ordinal variables in different fields has led to the introduction of new statistical methods for their analysis. The performance of these methods needs to be investigated under a number of experimental conditions. Procedures to simulate from ordinal variables are then required. In this article, we deal with simulation from…
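
    One common way to simulate correlated ordinal variables, given here only as a generic illustration and not necessarily the authors' procedure, is to threshold a correlated Gaussian sample at cut points derived from the target marginal probabilities:

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(7)

        # Target marginal probabilities for two 4-category ordinal variables (illustrative)
        p1 = [0.10, 0.30, 0.40, 0.20]
        p2 = [0.25, 0.25, 0.25, 0.25]
        rho = 0.5                                  # latent Gaussian correlation (illustrative)

        cuts1 = norm.ppf(np.cumsum(p1)[:-1])       # thresholds on the latent scale
        cuts2 = norm.ppf(np.cumsum(p2)[:-1])

        z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=10_000)
        x1 = np.searchsorted(cuts1, z[:, 0]) + 1   # ordinal categories 1..4
        x2 = np.searchsorted(cuts2, z[:, 1]) + 1

        print(np.bincount(x1)[1:] / x1.size)       # approximates p1
        print(np.corrcoef(x1, x2)[0, 1])           # attenuated relative to the latent rho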

  9. Impact of spectral nudging on regional climate simulation over CORDEX East Asia using WRF

    NASA Astrophysics Data System (ADS)

    Tang, Jianping; Wang, Shuyu; Niu, Xiaorui; Hui, Pinhong; Zong, Peishu; Wang, Xueyuan

    2017-04-01

    In this study, the impact of the spectral nudging method on regional climate simulation over the Coordinated Regional Climate Downscaling Experiment East Asia (CORDEX-EA) region is investigated using the Weather Research and Forecasting model (WRF). Driven by the ERA-Interim reanalysis, five continuous simulations covering 1989-2007 are conducted by the WRF model, in which four runs adopt interior spectral nudging with different wavenumbers, nudging variables and nudging coefficients. Model validation shows that WRF has the ability to simulate spatial distributions and temporal variations of the surface climate (air temperature and precipitation) over the CORDEX-EA domain. Comparably, the spectral nudging technique is effective in improving the model's skill in the following respects: (1) the simulated biases and root mean square errors of annual mean temperature and precipitation are obviously reduced. The SN3-UVT (spectral nudging with wavenumber 3 in both zonal and meridional directions applied to U, V and T) and SN6 (spectral nudging with wavenumber 6 in both zonal and meridional directions applied to U and V) experiments give the best simulations for temperature and precipitation respectively. The inter-annual and seasonal variances produced by the SN experiments are also closer to the ERA-Interim observation. (2) the application of spectral nudging in WRF is helpful for simulating extreme temperature and precipitation, and the SN3-UVT simulation shows a clear advantage over the other simulations in depicting both the spatial distributions and inter-annual variances of temperature and precipitation extremes. With spectral nudging, WRF is able to preserve the variability in the large-scale climate information and therefore adjust the temperature and precipitation variabilities toward the observations.
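
    Spectral nudging differs from ordinary grid-point nudging in that only the largest scales are relaxed toward the driving fields. The sketch below filters the driving-minus-model difference to a few low wavenumbers with an FFT before applying the relaxation; it is a 1-D, single-variable illustration with an assumed wavenumber cut-off and relaxation time scale, not the WRF implementation.

        import numpy as np

        def spectral_nudge(model, driving, n_keep=3, dt=600.0, tau=6 * 3600.0):
            """Relax only wavenumbers <= n_keep of (driving - model) back into the model field."""
            diff_hat = np.fft.rfft(driving - model)
            diff_hat[n_keep + 1:] = 0.0                   # discard the small scales
            large_scale_diff = np.fft.irfft(diff_hat, n=model.size)
            return model + (dt / tau) * large_scale_diff

        x = np.linspace(0.0, 2.0 * np.pi, 128, endpoint=False)
        driving = np.sin(x)                               # smooth large-scale driving field
        model = 0.8 * np.sin(x) + 0.5 * np.sin(15.0 * x)  # biased large scale plus small-scale detail
        nudged = spectral_nudge(model, driving)
        # Only the large-scale bias is corrected; the wavenumber-15 detail is left untouched
        print(np.max(np.abs(nudged - model)))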

  10. Logistic Risk Model for the Unique Effects of Inherent Aerobic Capacity on +Gz Tolerance Before and After Simulated Weightlessness

    NASA Technical Reports Server (NTRS)

    Ludwig, David A.; Convertino, Victor A.; Goldwater, Danielle J.; Sandler, Harold

    1987-01-01

    Small sample size (n less than 10) and inappropriate analysis of multivariate data have hindered previous attempts to describe which physiologic and demographic variables are most important in determining how long humans can tolerate acceleration. Data from previous centrifuge studies conducted at NASA/Ames Research Center, utilizing a 7-14 d bed rest protocol to simulate weightlessness, were included in the current investigation. After review, data on 25 women and 22 men were available for analysis. Study variables included gender, age, weight, height, percent body fat, resting heart rate, mean arterial pressure, VO2max, and plasma volume. Since the dependent variable was time to greyout (failure), two contemporary biostatistical modeling procedures (proportional hazard and logistic discriminant function) were used to estimate risk, given a particular subject's profile. After adjusting for pre-bed-rest tolerance time, none of the profile variables remained in the risk equation for post-bed-rest tolerance greyout. However, prior to bed rest, risk of greyout could be predicted with 91% accuracy. All of the profile variables except weight, MAP, and those related to inherent aerobic capacity (VO2max, percent body fat, resting heart rate) entered the risk equation for pre-bed-rest greyout. A cross-validation using 24 new subjects indicated a very stable model for risk prediction, accurate within 5% of the original equation. The result for the inherent fitness variables is significant in that a consensus as to whether an increased aerobic capacity is beneficial or detrimental has not been satisfactorily established. We conclude that tolerance to +Gz acceleration before and after simulated weightlessness is independent of inherent aerobic fitness.

  11. Impact Of The Material Variability On The Stamping Process: Numerical And Analytical Analysis

    NASA Astrophysics Data System (ADS)

    Ledoux, Yann; Sergent, Alain; Arrieux, Robert

    2007-05-01

    Finite element simulation is a very useful tool in the deep drawing industry, used in particular for the development and validation of new stamping tools, where it reduces the cost and time of tooling design and set-up. One of the main difficulties in obtaining good agreement between the simulation and the real process, however, lies in the definition of the numerical conditions (mesh, punch travel speed, boundary conditions,…) and of the parameters that model the material behavior. Indeed, in the press shop, when the sheet batch changes, a variation in the formed part geometry is often observed, reflecting the variability of material properties between batches. This variability is probably one of the main sources of process deviation once the process has been set up. It is therefore important to study the influence of material data variation on the geometry of a classical stamped part. The chosen geometry is an omega-shaped part, because of its simplicity and because it is representative of automotive applications (car body reinforcement); moreover, it shows significant springback deviations. An isotropic behaviour law is assumed. The impact of the statistical deviation of the three law coefficients characterizing the material, and of the friction coefficient, around their nominal values is tested. A Gaussian distribution is assumed, and the resulting geometry variation is studied by FE simulation. A second approach is also considered, in which the process variability is represented by a mathematical model: given the variability of the input parameters, an analytical model is defined that yields the variability of the part geometry around the nominal shape. The two approaches make it possible to predict the process capability as a function of the material parameter variability.

  12. How well do the GCMs/RCMs capture the multi-scale temporal variability of precipitation in the Southwestern United States?

    NASA Astrophysics Data System (ADS)

    Jiang, Peng; Gautam, Mahesh R.; Zhu, Jianting; Yu, Zhongbo

    2013-02-01

    Multi-scale temporal variability of precipitation has an established relationship with floods and droughts. In this paper, we present the diagnostics on the ability of 16 General Circulation Models (GCMs) from Bias Corrected and Downscaled (BCSD) World Climate Research Program's (WCRP's) Coupled Model Inter-comparison Project Phase 3 (CMIP3) projections and 10 Regional Climate Models (RCMs) that participated in the North American Regional Climate Change Assessment Program (NARCCAP) to represent multi-scale temporal variability determined from the observed station data. Four regions (Los Angeles, Las Vegas, Tucson, and Cimarron) in the Southwest United States are selected as they represent four different precipitation regions classified by clustering method. We investigate how storm properties and seasonal, inter-annual, and decadal precipitation variabilities differed between GCMs/RCMs and observed records in these regions. We find that current GCMs/RCMs tend to simulate longer storm duration and lower storm intensity compared to those from observed records. Most GCMs/RCMs fail to produce the high-intensity summer storms caused by local convective heat transport associated with the summer monsoon. Both inter-annual and decadal bands are present in the GCM/RCM-simulated precipitation time series; however, these do not line up to the patterns of large-scale ocean oscillations such as El Nino/La Nina Southern Oscillation (ENSO) and Pacific Decadal Oscillation (PDO). Our results show that the studied GCMs/RCMs can capture long-term monthly mean as the examined data is bias-corrected and downscaled, but fail to simulate the multi-scale precipitation variability including flood generating extreme events, which suggests their inadequacy for studies on floods and droughts that are strongly associated with multi-scale temporal precipitation variability.

  13. Wind Forced Variability in Eddy Formation, Eddy Shedding, and the Separation of the East Australian Current

    NASA Astrophysics Data System (ADS)

    Bull, Christopher Y. S.; Kiss, Andrew E.; Jourdain, Nicolas C.; England, Matthew H.; van Sebille, Erik

    2017-12-01

    The East Australian Current (EAC), like many other subtropical western boundary currents, is believed to be penetrating further poleward in recent decades. Previous observational and model studies have used steady state dynamics to relate changes in the westerly winds to changes in the separation behavior of the EAC. As yet, little work has been undertaken on the impact of forcing variability on the EAC and Tasman Sea circulation. Here using an eddy-permitting regional ocean model, we present a suite of simulations forced by the same time-mean fields, but with different atmospheric and remote ocean variability. These eddy-permitting results demonstrate the nonlinear response of the EAC to variable, nonstationary inhomogeneous forcing. These simulations show an EAC with high intrinsic variability and stochastic eddy shedding. We show that wind stress variability on time scales shorter than 56 days leads to increases in eddy shedding rates and southward eddy propagation, producing an increased transport and southward reach of the mean EAC extension. We adopt an energetics framework that shows the EAC extension changes to be coincident with an increase in offshore, upstream eddy variance (via increased barotropic instability) and increase in subsurface mean kinetic energy along the length of the EAC. The response of EAC separation to regional variable wind stress has important implications for both past and future climate change studies.

  14. Optimization of Operations Resources via Discrete Event Simulation Modeling

    NASA Technical Reports Server (NTRS)

    Joshi, B.; Morris, D.; White, N.; Unal, R.

    1996-01-01

    The resource levels required for operation and support of reusable launch vehicles are typically defined through discrete event simulation modeling. Minimizing these resources constitutes an optimization problem involving discrete variables and simulation. Conventional approaches to solve such optimization problems involving integer valued decision variables are the pattern search and statistical methods. However, in a simulation environment that is characterized by search spaces of unknown topology and stochastic measures, these optimization approaches often prove inadequate. In this paper, we have explored the applicability of genetic algorithms to the simulation domain. Genetic algorithms provide a robust search strategy that does not require continuity and differentiability of the problem domain. The genetic algorithm successfully minimized the operation and support activities for a space vehicle, through a discrete event simulation model. The practical issues associated with simulation optimization, such as stochastic variables and constraints, were also taken into consideration.
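
    A genetic algorithm over integer-valued resource levels with a noisy, simulation-like objective can be sketched in a few lines; the toy objective below merely stands in for a discrete event simulation and all settings are illustrative.

        import numpy as np

        rng = np.random.default_rng(5)
        n_res, pop_size, gens = 6, 30, 60
        low, high = 1, 20                               # allowed integer resource levels

        def objective(levels):
            """Stand-in for a stochastic discrete event simulation: cost plus a noisy delay penalty."""
            cost = levels.sum()
            delay = 200.0 / levels.sum() + rng.normal(0.0, 1.0)
            return cost + 5.0 * delay

        pop = rng.integers(low, high + 1, size=(pop_size, n_res))
        for _ in range(gens):
            fitness = np.array([objective(ind) for ind in pop])
            parents = pop[np.argsort(fitness)[: pop_size // 2]]     # minimisation
            # Uniform crossover followed by small integer mutations
            idx = rng.integers(0, parents.shape[0], size=(pop_size, 2))
            mask = rng.random((pop_size, n_res)) < 0.5
            children = np.where(mask, parents[idx[:, 0]], parents[idx[:, 1]])
            mutate = rng.random((pop_size, n_res)) < 0.1
            pop = np.clip(children + mutate * rng.integers(-2, 3, (pop_size, n_res)), low, high)

        best = pop[np.argmin([objective(ind) for ind in pop])]
        print(best, objective(best))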

  15. [Model for unplanned self extubation of ICU patients using system dynamics approach].

    PubMed

    Song, Yu Gil; Yun, Eun Kyoung

    2015-04-01

    In this study a system dynamics methodology was used to identify correlation and nonlinear feedback structure among factors affecting unplanned extubation (UE) of ICU patients and to construct and verify a simulation model. Factors affecting UE were identified through a theoretical background established by reviewing the literature and preceding studies and referencing various statistical data. Related variables were decided through verification of content validity by an expert group. A causal loop diagram (CLD) was made based on the variables. Stock & Flow modeling using Vensim PLE Plus Version 6.0b was performed to establish a model for UE. Based on the literature review and expert verification, 18 variables associated with UE were identified and the CLD was prepared. From the prepared CLD, a model was developed by converting to the Stock & Flow Diagram. Results of the simulation showed that patient stress, patient agitation, restraint application, patient movability, and individual intensive nursing were the variables with the greatest effect on UE probability. To verify agreement of the UE model with real situations, a simulation with 5 cases was performed. An equation check and a sensitivity analysis on TIME STEP were executed to validate model integrity. Results show that identification of a proper model enables prediction of UE probability. This prediction allows for adjustment of related factors, and provides basic data to develop nursing interventions to decrease UE.

  16. Two types of physical inconsistency to avoid with quantile mapping: a case study with relative humidity over North America.

    NASA Astrophysics Data System (ADS)

    Grenier, P.

    2017-12-01

    Statistical post-processing techniques aim at generating plausible climate scenarios from climate simulations and observation-based reference products. These techniques are generally not physically-based, and consequently they remedy the problem of simulation biases at the risk of generating physical inconsistency (PI). Although this concern is often emphasized, it is rarely addressed quantitatively. Here, PI generated by quantile mapping (QM), a technique widely used in climatological and hydrological applications, is investigated using relative humidity (RH) and its parent variables, namely specific humidity (SH), temperature and pressure. PI is classified into two types: 1) inadequate value for an individual variable (e.g. RH > 100 %), and 2) breaking of an inter-variable relationship. Scenarios built for this study correspond to twelve sites representing a variety of climate types over North America. Data used are an ensemble of ten 3-hourly global (CMIP5) and regional (CORDEX-NAM) simulations, as well as the CFSR reanalysis. PI of type 1 is discussed in terms of frequency of occurrence and amplitude of unphysical cases for RH and SH variables. PI of type 2 is investigated with heuristic proxies designed to directly compare the physical inconsistency problem with the initial bias problem. Finally, recommendations are provided for an appropriate use of QM given the potential to generate physical inconsistency of types 1 and 2.
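
    Quantile mapping and the type-1 check described above can be illustrated by matching the empirical quantiles of a simulated series to a reference and then counting values outside the physically admissible range; the synthetic relative humidity data and the simple sorted-quantile interpolation below are illustrative, not the study's multivariate setup.

        import numpy as np

        rng = np.random.default_rng(11)
        ref = np.clip(rng.normal(70.0, 15.0, 5000), 1.0, 100.0)   # reference RH, %
        sim = np.clip(rng.normal(80.0, 20.0, 5000), 1.0, 110.0)   # biased simulated RH, %

        def quantile_map(x, train_sim, train_ref):
            """Map values of x through the empirical quantiles of train_sim onto train_ref."""
            q = np.linspace(0.0, 1.0, 101)
            return np.interp(x, np.quantile(train_sim, q), np.quantile(train_ref, q))

        corrected = quantile_map(sim, sim, ref)

        # Type-1 physical inconsistency: values outside the physically admissible range
        print("fraction unphysical before QM:", np.mean(sim > 100.0))
        print("fraction unphysical after  QM:", np.mean(corrected > 100.0))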

  17. Evaluation of variable selection methods for random forests and omics data sets.

    PubMed

    Degenhardt, Frauke; Seifert, Stephan; Szymczak, Silke

    2017-10-16

    Machine learning methods and in particular random forests are promising approaches for prediction based on high dimensional omics data sets. They provide variable importance measures to rank predictors according to their predictive power. If building a prediction model is the main goal of a study, often a minimal set of variables with good prediction performance is selected. However, if the objective is the identification of involved variables to find active networks and pathways, approaches that aim to select all relevant variables should be preferred. We evaluated several variable selection procedures based on simulated data as well as publicly available experimental methylation and gene expression data. Our comparison included the Boruta algorithm, the Vita method, recurrent relative variable importance, a permutation approach and its parametric variant (Altmann) as well as recursive feature elimination (RFE). In our simulation studies, Boruta was the most powerful approach, followed closely by the Vita method. Both approaches demonstrated similar stability in variable selection, while Vita was the most robust approach under a pure null model without any predictor variables related to the outcome. In the analysis of the different experimental data sets, Vita demonstrated slightly better stability in variable selection and was less computationally intensive than Boruta. In conclusion, we recommend the Boruta and Vita approaches for the analysis of high-dimensional data sets. Vita is considerably faster than Boruta and thus more suitable for large data sets, but only Boruta can also be applied in low-dimensional settings. © The Author 2017. Published by Oxford University Press.
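
    As a simple point of reference for what selecting relevant variables with a random forest can look like, the sketch below uses scikit-learn permutation importance with a naive two-sigma threshold on synthetic data; it is not an implementation of Boruta, Vita, or the other compared methods.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.inspection import permutation_importance

        rng = np.random.default_rng(0)
        n, p, p_relevant = 200, 50, 5
        X = rng.normal(size=(n, p))
        y = (X[:, :p_relevant].sum(axis=1) + rng.normal(size=n) > 0).astype(int)

        rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
        imp = permutation_importance(rf, X, y, n_repeats=20, random_state=0)  # on training data, for brevity

        # Naive rule: keep variables whose mean importance exceeds twice its permutation spread
        selected = np.where(imp.importances_mean > 2.0 * imp.importances_std)[0]
        print("selected variables:", selected)   # typically recovers most of the first 5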

  18. Older People's Perceptions of Pedestrian Friendliness and Traffic Safety: An Experiment Using Computer-Simulated Walking Environments.

    PubMed

    Kahlert, Daniela; Schlicht, Wolfgang

    2015-08-21

    Traffic safety and pedestrian friendliness are considered to be important conditions for older people's motivation to walk through their environment. This study uses an experimental study design with computer-simulated living environments to investigate the effect of micro-scale environmental factors (parking spaces and green verges with trees) on older people's perceptions of both motivational antecedents (dependent variables). Seventy-four consecutively recruited older people were randomly assigned to watch one of two scenarios (independent variable) on a computer screen. The scenarios simulated a stroll on a sidewalk, as is 'typical' of a German city. In version 'A', subjects take a simulated walk along a sidewalk on which a number of cars are partially parked. In version 'B', cars are in parking spaces separated from the sidewalk by grass verges and trees. Subjects assessed their impressions of both dependent variables. A multivariate analysis of covariance showed that subjects' ratings on perceived traffic safety and pedestrian friendliness were higher for version 'B' compared to version 'A'. Cohen's d indicates medium (d = 0.73) and large (d = 1.23) effect sizes for traffic safety and pedestrian friendliness, respectively. The study suggests that elements of the built environment might affect motivational antecedents of older people's walking behavior.

  19. Modeling the temporal variability of zinc concentrations in zinc roof runoff-experimental study and uncertainty analysis.

    PubMed

    Sage, Jérémie; El Oreibi, Elissar; Saad, Mohamed; Gromaire, Marie-Christine

    2016-08-01

    This study investigates the temporal variability of zinc concentrations in zinc roof runoff. The influence of rainfall characteristics and dry period duration is evaluated by combining laboratory experiments on small zinc sheets and in situ measurements under real weather conditions from a 1.6-m² zinc panel. A reformulation of a commonly used conceptual runoff quality model is introduced and its ability to simulate the evolution of zinc concentrations is evaluated. A systematic and sharp decrease from initially high to relatively low and stable zinc concentrations after 0.5 to 2 mm of rainfall is observed in both experiments, suggesting that highly soluble corrosion products are removed at early stages of runoff. A moderate dependence between antecedent dry period duration and the magnitude of zinc concentrations at the beginning of a rain event is evidenced. Conversely, results indicate that concentrations are not significantly influenced by rainfall intensities. The simulated rainfall experiment nonetheless suggests that a slight effect of rainfall intensities may be expected after the initial decrease of concentrations. Finally, this study shows that relatively simple conceptual runoff quality models may be adopted to simulate the variability of zinc concentrations during a rain event and from one rain event to another.
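
    The sharp initial decrease followed by stable concentrations is the behaviour a simple conceptual washoff model reproduces: an exponential decay, with cumulative rainfall depth, from an initial toward a stable concentration. The parameter values below are illustrative, not the fitted ones.

        import numpy as np

        def zinc_concentration(cum_rain_mm, c_initial=6.0, c_stable=0.8, k=2.0):
            """Event-scale zinc concentration (mg/L) versus cumulative rainfall depth (mm)."""
            return c_stable + (c_initial - c_stable) * np.exp(-k * cum_rain_mm)

        rain = np.array([0.0, 0.25, 0.5, 1.0, 2.0, 5.0])
        print(zinc_concentration(rain).round(2))   # drops quickly within the first 0.5-2 mm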

  20. Impacts of Model Bias on the Climate Change Signal and Effects of Weighted Ensembles of Regional Climate Model Simulations: A Case Study over Southern Québec, Canada

    DOE PAGES

    Eum, Hyung-Il; Gachon, Philippe; Laprise, René

    2016-01-01

    This study examined the impact of model biases on climate change signals for daily precipitation and for minimum and maximum temperatures. Through the use of multiple climate scenarios from 12 regional climate model simulations, the ensemble mean, and three synthetic simulations generated by a weighting procedure, we investigated intermodel seasonal climate change signals between current and future periods, for both median and extreme precipitation/temperature values. A significant dependence of seasonal climate change signals on the model biases over southern Québec in Canada was detected for temperatures, but not for precipitation. This suggests that the regional temperature change signal is affected by local processes. Seasonally, model bias affects future mean and extreme values in winter and summer. In addition, potentially large increases in future extremes of temperature and precipitation values were projected. For three synthetic scenarios, systematically less bias and a narrow range of mean change for all variables were projected compared to those of climate model simulations. In addition, synthetic scenarios were found to better capture the spatial variability of extreme cold temperatures than the ensemble mean scenario. Finally, these results indicate that the synthetic scenarios have greater potential to reduce the uncertainty of future climate projections and capture the spatial variability of extreme climate events.

  2. Calculation of the Local Free Energy Landscape in the Restricted Region by the Modified Tomographic Method.

    PubMed

    Chen, Changjun

    2016-03-31

    The free energy landscape is the most important information in the study of molecular reaction mechanisms, but it is difficult to calculate. In a large collective variable space, a molecule requires a long simulation time to obtain sufficient sampling. To reduce the computational cost, it is often necessary in practice to restrict the sampling region and construct a local free energy landscape. However, the restricted region in the collective variable space may have an irregular shape, so simply restricting one or more collective variables of the molecule is not sufficient. In this paper, we propose a modified tomographic method to perform the simulation. First, it divides the restricted region with a set of hyperplanes and connects the centers of the hyperplanes by a curve. Second, it forces the molecule to sample on the curve and on the hyperplanes during the simulation and calculates the free energy data on them. Finally, all the free energy data are combined to form the local free energy landscape. Because the area outside the restricted region is not considered, the free energy calculation is more efficient. With this method, one can also quickly optimize a path in the collective variable space.

  3. Study on the variable cycle engine modeling techniques based on the component method

    NASA Astrophysics Data System (ADS)

    Zhang, Lihua; Xue, Hui; Bao, Yuhai; Li, Jijun; Yan, Lan

    2016-01-01

    Based on the structural platform of the gas turbine engine, the components of the variable cycle engine were simulated using the component method. The mathematical model of nonlinear equations corresponding to each component of the gas turbine engine was established. Based on Matlab programming, the nonlinear equations were solved using the Newton-Raphson steady-state algorithm, and the performance of the engine components was calculated. The numerical simulation results showed that the model built can describe the basic performance of the gas turbine engine, which verified the validity of the model.
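
    The solution strategy described, Newton-Raphson iteration on the nonlinear matching equations between components, can be sketched with a generic residual vector and a finite-difference Jacobian; the two toy residuals merely stand in for real component matching conditions.

        import numpy as np

        def residuals(x):
            """Toy stand-ins for component matching conditions (e.g. flow and work balance)."""
            r1 = x[0] ** 2 + x[1] - 3.0
            r2 = x[0] * x[1] - 1.0
            return np.array([r1, r2])

        def newton_raphson(f, x0, tol=1e-10, max_iter=50, eps=1e-7):
            x = np.asarray(x0, dtype=float)
            for _ in range(max_iter):
                r = f(x)
                if np.linalg.norm(r) < tol:
                    break
                # Finite-difference Jacobian
                J = np.empty((r.size, x.size))
                for j in range(x.size):
                    xp = x.copy()
                    xp[j] += eps
                    J[:, j] = (f(xp) - r) / eps
                x = x - np.linalg.solve(J, r)
            return x

        print(newton_raphson(residuals, [1.0, 1.0]))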

  4. Effects of baseline conditions on the simulated hydrologic response to projected climate change

    USGS Publications Warehouse

    Koczot, Kathryn M.; Markstrom, Steven L.; Hay, Lauren E.

    2011-01-01

    Changes in temperature and precipitation projected from five general circulation models, using one late-twentieth-century and three twenty-first-century emission scenarios, were downscaled to three different baseline conditions. Baseline conditions are periods of measured temperature and precipitation data selected to represent twentieth-century climate. The hydrologic effects of the climate projections are evaluated using the Precipitation-Runoff Modeling System (PRMS), which is a watershed hydrology simulation model. The Almanor Catchment in the North Fork of the Feather River basin, California, is used as a case study. Differences and similarities between PRMS simulations of hydrologic components (i.e., snowpack formation and melt, evapotranspiration, and streamflow) are examined, and results indicate that the selection of a specific time period used for baseline conditions has a substantial effect on some, but not all, hydrologic variables. This effect seems to be amplified in hydrologic variables that accumulate over time, such as soil-moisture content. Results also indicate that uncertainty related to the selection of baseline conditions should be evaluated using a range of different baseline conditions. This is particularly important for studies in basins with highly variable climate, such as the Almanor Catchment.

  5. Investigating the Sensitivity of Model Intraseasonal Variability to Minimum Entrainment

    NASA Astrophysics Data System (ADS)

    Hannah, W. M.; Maloney, E. D.

    2008-12-01

    Previous studies have shown that using a Relaxed Arakawa-Schubert (RAS) convective parameterization with appropriate convective triggers and assumptions about rain re-evaporation produces realistic intraseasonal variability. RAS represents convection with an ensemble of clouds detraining at different heights, each with a different entrainment rate, the highest clouds having the lowest entrainment rates. If tropospheric temperature gradients are weak and boundary layer moist static energy is relatively constant, then limiting the minimum entrainment rate suppresses deep convection in the presence of dry tropospheric air. This allows moist static energy to accumulate and be discharged during strong intraseasonal convective events, which is consistent with the discharge/recharge paradigm. This study will examine the sensitivity of intraseasonal variability to changes in minimum entrainment rate in the NCAR-CAM3 with the RAS scheme. Simulations using several minimum entrainment rate thresholds will be investigated. A frequency-wavenumber analysis will show the improvement of the MJO signal as the minimum entrainment rate is increased. The spatial and vertical structure of MJO-like disturbances will be examined, including an analysis of the time evolution of the vertical humidity distribution for each simulation. Simulated results will be compared to observed MJO events in NCEP-1 reanalysis and CMAP precipitation.

  6. Ionosphere variability during the 2009 SSW: Influence of the lunar semidiurnal tide and mechanisms producing electron density variability

    NASA Astrophysics Data System (ADS)

    Pedatella, N. M.; Liu, H.-L.; Sassi, F.; Lei, J.; Chau, J. L.; Zhang, X.

    2014-05-01

    To investigate ionosphere variability during the 2009 sudden stratosphere warming (SSW), we present simulation results that combine the Whole Atmosphere Community Climate Model Extended version and the thermosphere-ionosphere-mesosphere electrodynamics general circulation model (TIME-GCM). The simulations reveal notable enhancements in both the migrating semidiurnal solar (SW2) and lunar (M2) tides during the SSW. The SW2 and M2 amplitudes reach ~50 m/s and ~40 m/s, respectively, in zonal wind at E region altitudes. The dramatic increase in the M2 at these altitudes influences the dynamo generation of electric fields, and the importance of the M2 on the ionosphere variability during the 2009 SSW is demonstrated by comparing simulations with and without the M2. TIME-GCM simulations that incorporate the M2 are found to be in good agreement with Jicamarca Incoherent Scatter Radar vertical plasma drifts and Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) observations of the maximum F region electron density. The agreement with observations is worse if the M2 is not included in the simulation, demonstrating that the lunar tide is an important contributor to the ionosphere variability during the 2009 SSW. We additionally investigate sources of the F region electron density variability during the SSW. The primary driver of the electron density variability is changes in electric fields. Changes in meridional neutral winds and thermosphere composition are found to also contribute to the electron density variability during the 2009 SSW. The electron density variability for the 2009 SSW is therefore not solely due to variability in electric fields as previously thought.

  7. Computer simulation of random variables and vectors with arbitrary probability distribution laws

    NASA Technical Reports Server (NTRS)

    Bogdan, V. M.

    1981-01-01

    Assume that there is given an arbitrary n-dimensional probability distribution F. A recursive construction is found for a sequence of functions x_1 = f_1(U_1, ..., U_n), ..., x_n = f_n(U_1, ..., U_n) such that if U_1, ..., U_n are independent random variables having uniform distribution over the open interval (0,1), then the joint distribution of the variables x_1, ..., x_n coincides with the distribution F. Since uniform independent random variables can be well simulated by means of a computer, this result allows one to simulate arbitrary n-random variables if their joint probability distribution is known.
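
    A minimal sketch of the idea for a bivariate case is shown below: the first variable is drawn by inverting its marginal CDF and the second by inverting its conditional CDF given the first (a Rosenblatt-type construction). The particular distribution used here (X1 ~ Exp(1), X2 | X1 uniform on (0, X1)) is purely illustrative.

```python
# Hedged sketch: x1 is drawn by inverting its marginal CDF, x2 by inverting
# its conditional CDF given x1. The chosen distribution is illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def simulate(n_samples):
    u1 = rng.uniform(size=n_samples)
    u2 = rng.uniform(size=n_samples)
    x1 = -np.log(1.0 - u1)   # inverse CDF of Exp(1)
    x2 = u2 * x1             # inverse conditional CDF of Uniform(0, x1)
    return x1, x2

x1, x2 = simulate(100_000)
print(x1.mean(), (x2 / x1).mean())  # ~1.0 and ~0.5, as expected
```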

  8. Analysis of the long-term surface wind variability over complex terrain using a high spatial resolution WRF simulation

    NASA Astrophysics Data System (ADS)

    Jiménez, Pedro A.; González-Rouco, J. Fidel; Montávez, Juan P.; García-Bustamante, E.; Navarro, J.; Dudhia, J.

    2013-04-01

    This work uses a WRF numerical simulation from 1960 to 2005 performed at high horizontal resolution (2 km) to analyze the surface wind variability over a complex terrain region located in northern Iberia. A shorter slice of this simulation was used in a previous study to demonstrate the ability of the WRF model to reproduce the observed wind variability during the period 1992-2005. Building on that validation exercise, the extended simulation is herein used to inspect the wind behavior where and when observations are not available and to determine the main synoptic mechanisms responsible for the surface wind variability. A principal component analysis was applied to the daily mean wind. Two principal modes of variation account for a large percentage of the wind variability (83.7%). The first mode reflects the channeling of the flow between the large mountain systems in northern Iberia, modulated by the smaller topographic features of the region. The second mode further stresses the differentiated wind behavior over the mountains and valleys. Both modes show significant contributions at the higher frequencies during the whole analyzed period, with different contributions at lower frequencies during the different decades. A strong relationship was found between these two modes and the zonal and meridional large-scale pressure gradients over the area. This relationship is described in the context of the influence of standard circulation modes relevant in the European region, such as the North Atlantic Oscillation, the East Atlantic pattern, the East Atlantic/Western Russia pattern, and the Scandinavian pattern.
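
    For readers unfamiliar with the decomposition step, the sketch below shows a generic PCA of a (time x grid point) wind field via the SVD of the anomaly matrix, returning PC time series, spatial modes, and explained-variance fractions. The array shapes and random placeholder data are assumptions, not the study's 2 km WRF output.

```python
# Hedged sketch of a generic PCA of daily mean wind fields; array names and
# placeholder data are assumptions, not the study's actual data.
import numpy as np

def pca(field):
    """field: array of shape (n_days, n_gridpoints)."""
    anomalies = field - field.mean(axis=0)
    # SVD of the anomaly matrix gives PC scores (temporal) and EOFs (spatial)
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    variance = s**2 / (len(field) - 1)
    explained = variance / variance.sum()
    scores = u * s            # PC time series
    eofs = vt                 # spatial modes
    return scores, eofs, explained

wind = np.random.default_rng(1).standard_normal((3650, 500))  # placeholder data
scores, eofs, frac = pca(wind)
print("variance explained by first two modes:", frac[:2].sum())
```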

  9. Using a Functional Simulation of Crisis Management to Test the C2 Agility Model Parameters on Key Performance Variables

    DTIC Science & Technology

    2013-06-01

    18th ICCRTS: Using a Functional Simulation of Crisis Management to Test the C2 Agility Model Parameters on Key Performance Variables. ... command in crisis management. C2 Agility Model: Agility can be conceptualized at a number of different levels; for instance at the team

  10. New Insights in Tropospheric Ozone and its Variability

    NASA Technical Reports Server (NTRS)

    Oman, Luke D.; Douglass, Anne R.; Ziemke, Jerry R.; Rodriquez, Jose M.

    2011-01-01

    We have produced time-slice simulations using the Goddard Earth Observing System Version 5 (GEOS-5) coupled to a comprehensive stratospheric and tropospheric chemical mechanism. These simulations are forced with observed sea surface temperatures over the past 25 years and use constant specified surface emissions, thereby providing a measure of the dynamically controlled ozone response. We examine the model performance in simulating tropospheric ozone and its variability. Here we show targeted comparisons of results from our simulations with a multi-decadal tropical tropospheric column ozone dataset obtained from satellite observations of total column ozone. We use SHADOZ ozonesondes to gain insight into the observed vertical response and compare with the simulated vertical structure. This work includes but is not limited to ENSO-related variability.

  11. A Framework for Simulating Turbine-Based Combined-Cycle Inlet Mode-Transition

    NASA Technical Reports Server (NTRS)

    Le, Dzu K.; Vrnak, Daniel R.; Slater, John W.; Hessel, Emil O.

    2012-01-01

    A simulation framework based on the Memory-Mapped-Files technique was created to operate multiple numerical processes in locked time-steps and exchange I/O data synchronously among them to simulate system dynamics. This simulation scheme is currently used to study the complex interactions between inlet flow dynamics, variable-geometry actuation mechanisms, and flow controls in the transition from supersonic to hypersonic conditions and vice versa. A study of Mode-Transition Control for a high-speed inlet wind-tunnel model with this MMF-based framework is presented to illustrate the scheme and demonstrate its usefulness in simulating supersonic and hypersonic inlet dynamics and controls, as well as other types of complex systems.
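
    The data-exchange pattern behind such a framework can be sketched as follows: two codes map the same small file and swap an I/O record once per time step. The record layout, file name, and polling scheme below are illustrative assumptions, not the framework's actual design.

```python
# Hedged sketch of the data-exchange pattern only: both "sides" are shown in
# one script for brevity; in a real framework each process would map the file
# separately and poll the step counter to stay in lock-step.
import mmap, os, struct

RECORD = struct.Struct("<i d d")   # step counter, inlet state, actuator command
PATH = "exchange.bin"

with open(PATH, "wb") as f:        # create the shared buffer
    f.write(b"\x00" * RECORD.size)

with open(PATH, "r+b") as f:
    buf = mmap.mmap(f.fileno(), RECORD.size)

    def publish(step, inlet_state, command):
        """'Inlet dynamics' side: publish its state for the current step."""
        buf.seek(0)
        buf.write(RECORD.pack(step, inlet_state, command))
        buf.flush()

    def read_record():
        """'Controls' side: read the record back for the same step."""
        buf.seek(0)
        return RECORD.unpack(buf.read(RECORD.size))

    for step in range(3):
        publish(step, inlet_state=0.1 * step, command=float(step))
        print(read_record())

    buf.close()
os.remove(PATH)
```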

  12. A Comparison of Methods for Estimating Quadratic Effects in Nonlinear Structural Equation Models

    ERIC Educational Resources Information Center

    Harring, Jeffrey R.; Weiss, Brandi A.; Hsu, Jui-Chen

    2012-01-01

    Two Monte Carlo simulations were performed to compare methods for estimating and testing hypotheses of quadratic effects in latent variable regression models. The methods considered in the current study were (a) a 2-stage moderated regression approach using latent variable scores, (b) an unconstrained product indicator approach, (c) a latent…

  13. Quantitative Comparison of the Variability in Observed and Simulated Shortwave Reflectance

    NASA Technical Reports Server (NTRS)

    Roberts, Yolanda, L.; Pilewskie, P.; Kindel, B. C.; Feldman, D. R.; Collins, W. D.

    2013-01-01

    The Climate Absolute Radiance and Refractivity Observatory (CLARREO) is a climate observation system that has been designed to monitor the Earth's climate with unprecedented absolute radiometric accuracy and SI traceability. Climate Observation System Simulation Experiments (OSSEs) have been generated to simulate CLARREO hyperspectral shortwave imager measurements to help define the measurement characteristics needed for CLARREO to achieve its objectives. To evaluate how well the OSSE-simulated reflectance spectra reproduce the Earth's climate variability at the beginning of the 21st century, we compared the variability of the OSSE reflectance spectra to that of the reflectance spectra measured by the Scanning Imaging Absorption Spectrometer for Atmospheric Cartography (SCIAMACHY). Principal component analysis (PCA) is a multivariate decomposition technique used to represent and study the variability of hyperspectral radiation measurements. Using PCA, between 99.7% and 99.9% of the total variance of the OSSE and SCIAMACHY data sets can be explained by subspaces defined by six principal components (PCs). To quantify how much information is shared between the simulated and observed data sets, we spectrally decomposed the intersection of the two data set subspaces. The results from four cases in 2004 showed that the two data sets share eight (January and October) and seven (April and July) dimensions, which correspond to about 99.9% of the total SCIAMACHY variance for each month. The spectral nature of these shared spaces, understood by examining the transformed eigenvectors calculated from the subspace intersections, exhibits physical characteristics similar to the original PCs calculated from each data set, such as water vapor absorption, vegetation reflectance, and cloud reflectance.
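
    One common way to quantify the overlap of two PC subspaces, sketched below under generic assumptions, is to compute principal angles between the spaces spanned by the leading eigenvectors; angles near zero indicate shared directions of variability. The matrices here are synthetic placeholders, not the OSSE or SCIAMACHY eigenvectors.

```python
# Hedged sketch of comparing two PC subspaces via principal angles.
import numpy as np

def principal_angles(A, B):
    """A, B: (n_wavelengths, k) matrices whose columns span each subspace."""
    qa, _ = np.linalg.qr(A)
    qb, _ = np.linalg.qr(B)
    sigma = np.linalg.svd(qa.T @ qb, compute_uv=False)
    return np.arccos(np.clip(sigma, -1.0, 1.0))   # radians; ~0 means shared directions

rng = np.random.default_rng(2)
common = rng.standard_normal((200, 5))
A = common + 0.01 * rng.standard_normal((200, 5))   # two noisy copies of the
B = common + 0.01 * rng.standard_normal((200, 5))   # same underlying subspace
print(np.degrees(principal_angles(A, B)))  # small angles -> nearly identical subspaces
```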

  14. Study of Variable Turbulent Prandtl Number Model for Heat Transfer to Supercritical Fluids in Vertical Tubes

    NASA Astrophysics Data System (ADS)

    Tian, Ran; Dai, Xiaoye; Wang, Dabiao; Shi, Lin

    2018-06-01

    In order to improve the prediction performance of numerical simulations of heat transfer to supercritical pressure fluids, a variable turbulent Prandtl number (Prt) model for vertical upward flow at supercritical pressures was developed in this study. The effects of Prt on the numerical simulation were analyzed, especially for heat transfer deterioration conditions. Based on these analyses, the turbulent Prandtl number was modeled as a function of the turbulent viscosity ratio and the molecular Prandtl number. The model was evaluated using experimental heat transfer data for CO2, water, and Freon. The wall temperatures, including the heat transfer deterioration cases, were predicted more accurately by this model than by traditional numerical calculations with a constant Prt. By analyzing the predicted results with and without the variable Prt model, it was found that the velocity distribution and turbulent mixing characteristics predicted with the variable Prt model are quite different from those predicted with a constant Prt. When heat transfer deterioration occurs, the radial velocity profile deviates from the log-law profile, and the restrained turbulent mixing then leads to deteriorated heat transfer.

  15. CMIP5 land surface models systematically underestimate inter-annual variability of net ecosystem exchange in semi-arid southwestern North America.

    NASA Astrophysics Data System (ADS)

    MacBean, N.; Scott, R. L.; Biederman, J. A.; Vuichard, N.; Hudson, A.; Barnes, M.; Fox, A. M.; Smith, W. K.; Peylin, P. P.; Maignan, F.; Moore, D. J.

    2017-12-01

    Recent studies based on analysis of atmospheric CO2 inversions, satellite data and terrestrial biosphere model simulations have suggested that semi-arid ecosystems play a dominant role in the interannual variability and long-term trend in the global carbon sink. These studies have largely cited the response of vegetation activity to changing moisture availability as the primary mechanism of variability. However, some land surface models (LSMs) used in these studies have performed poorly in comparison to satellite-based observations of vegetation dynamics in semi-arid regions. Further analysis is therefore needed to ensure semi-arid carbon cycle processes are well represented in global scale LSMs before we can fully establish their contribution to the global carbon cycle. In this study, we evaluated annual net ecosystem exchange (NEE) simulated by CMIP5 land surface models using observations from 20 Ameriflux sites across semi-arid southwestern North America. We found that CMIP5 models systematically underestimate the magnitude and sign of NEE inter-annual variability; therefore, the true role of semi-arid regions in the global carbon cycle may be even more important than previously thought. To diagnose the factors responsible for this bias, we used the ORCHIDEE LSM to test different climate forcing data, prescribed vegetation fractions and model structures. Climate and prescribed vegetation do contribute to uncertainty in annual NEE simulations, but the bias is primarily caused by incorrect timing and magnitude of peak gross carbon fluxes. Modifications to the hydrology scheme improved simulations of soil moisture in comparison to data. This in turn improved the seasonal cycle of carbon uptake due to a more realistic limitation on photosynthesis during water stress. However, the peak fluxes are still too low, and phenology is poorly represented for desert shrubs and grasses. We provide suggestions on model developments needed to tackle these issues in the future.

  16. Understanding the West African monsoon variability and its remote effects: an illustration of the grid point nudging methodology

    NASA Astrophysics Data System (ADS)

    Bielli, Soline; Douville, Hervé; Pohl, Benjamin

    2010-07-01

    General circulation models still show deficiencies in simulating the basic features of the West African Monsoon at intraseasonal, seasonal and interannual timescales. It is, however, difficult to disentangle the remote versus regional factors that contribute to such deficiencies, and to diagnose their possible consequences for the simulation of the global atmospheric variability. The aim of the present study is to address these questions using the so-called grid point nudging technique, where prognostic atmospheric fields are relaxed either inside or outside the West African Monsoon region toward the ERA40 reanalysis. This regional or quasi-global nudging is tested in ensembles of boreal summer simulations. The impact is evaluated first on the model climatology, then on intraseasonal timescales with an emphasis on North Atlantic/Europe weather regimes, and finally on interannual timescales. Results show that systematic biases in the model climatology over West Africa are mostly of regional origin and have a limited impact outside the domain. A clear impact is found, however, on the eddy component of the extratropical circulation, in particular over the North Atlantic/European sector. At intraseasonal timescales, the main regional biases also resist the quasi-global nudging, though their magnitude is reduced. Conversely, nudging the model over West Africa exerts a strong impact on the frequency of the two North Atlantic weather regimes that favor the occurrence of heat waves over Europe. Significant impacts are also found at interannual timescales. Not surprisingly, the quasi-global nudging allows the model to capture the variability of large-scale dynamical monsoon indices, but exerts a weaker control on rainfall variability, suggesting an additional contribution of regional processes. Conversely, nudging the model over West Africa suppresses the spurious ENSO teleconnection that is simulated over Europe in the control experiment, thereby emphasizing the relevance of a realistic West African monsoon simulation for seasonal prediction in the extratropics. Further experiments will be devoted to case studies aiming at a better understanding of regional processes governing the monsoon variability and of the possible monsoon teleconnections, especially over Europe.
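
    The nudging technique itself reduces to adding a Newtonian relaxation term, -(x - x_ref)/tau, to the model tendency inside a masked region. The sketch below illustrates this on a toy one-dimensional field; the mask, relaxation time, and placeholder dynamics are assumptions, not the actual model configuration.

```python
# Hedged sketch of grid point nudging: a prognostic field is relaxed toward a
# reference (reanalysis-like) field inside a masked region at each model step.
import numpy as np

def step(x, x_ref, mask, dt=900.0, tau=6 * 3600.0):
    """One time step: free dynamics everywhere, plus relaxation -(x - x_ref)/tau
    applied only where mask == 1 (e.g., the chosen monsoon sector)."""
    tendency = -0.1 * x / 86400.0            # placeholder model tendency
    nudging = -mask * (x - x_ref) / tau
    return x + dt * (tendency + nudging)

nx = 100
x = np.zeros(nx)
x_ref = np.ones(nx)                      # stand-in for the reference state
mask = np.zeros(nx); mask[30:60] = 1.0   # nudge only the chosen sector

for _ in range(4 * 24 * 5):              # five days of 15-minute steps
    x = step(x, x_ref, mask)
print(x[45], x[10])   # nudged point is pulled toward 1, outside point is not
```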

  17. nZVI injection into variably saturated soils: Field and modeling study.

    PubMed

    Chowdhury, Ahmed I A; Krol, Magdalena M; Kocur, Christopher M; Boparai, Hardiljeet K; Weber, Kela P; Sleep, Brent E; O'Carroll, Denis M

    2015-12-01

    Nano-scale zero valent iron (nZVI) has been used at a number of contaminated sites over the last decade. At most of these sites, significant decreases in contaminant concentrations have resulted from the application of nZVI. However, limited work has been completed investigating nZVI field-scale mobility. In this study, a field test was combined with numerical modeling to examine nZVI reactivity and transport properties in variably saturated soils. The field test consisted of 142 L of carboxymethyl cellulose (CMC) stabilized monometallic nZVI synthesized onsite and injected into a variably saturated zone. Periodic groundwater samples were collected from the injection well, as well as from two monitoring wells, to analyze for chlorinated solvents and other geochemical indicators. This study showed that CMC-stabilized monometallic nZVI was able to decrease trichloroethene (TCE) concentrations in groundwater by more than 99% from the historical TCE concentrations. A three-dimensional, three-phase, finite-difference numerical simulator (CompSim) was used to further investigate nZVI and polymer transport at the variably saturated site. The model was able to accurately predict the field-observed head data without parameter fitting. In addition, the numerical simulator estimated the mass of nZVI delivered to the saturated and unsaturated zones and distinguished the nZVI phase (i.e., aqueous or attached). The simulation results showed that the injected slurry migrated radially outward from the injection well, and therefore nZVI transport was governed by the injection velocity and the viscosity of the injected solution. A suite of sensitivity analyses was performed to investigate the impact of different injection scenarios (e.g., different volume and injection rate) on nZVI migration. Simulation results showed that injection of a higher nZVI volume delivered more iron particles to a given distance; however, the travel distance was not proportional to the increase in volume. Moreover, simulation results showed that using a 1D transport equation to simulate nZVI migration in the subsurface may overestimate the travel distance, because the 1D transport equation assumes a constant velocity while the pore water velocity decreases radially from the well during injection. This study suggests that on-site synthesized nZVI particles are mobile in the subsurface and that a numerical simulator can be a valuable tool for the optimal design of nZVI field applications.
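
    The 1D-versus-radial point can be illustrated with a simple volume balance: during radial injection the pore-water velocity falls off roughly as 1/r, so a 1D model that freezes the velocity at its near-well value pushes the front much farther. The injection rate, porosity, and thickness in the sketch below are illustrative values, not the field-test parameters.

```python
# Hedged sketch of why a constant-velocity 1D model overestimates travel distance.
import numpy as np

Q = 142e-3 / 3600.0      # injected volume rate, m^3/s (illustrative)
b, theta = 1.0, 0.30     # aquifer thickness (m) and porosity (illustrative)
r_well = 0.05            # injection well radius, m

def radial_front(t):
    """Front position from volume balance: pi*(r^2 - r_w^2)*b*theta = Q*t."""
    return np.sqrt(r_well**2 + Q * t / (np.pi * b * theta))

def constant_velocity_front(t):
    """1D analogue that freezes the velocity at its near-well value."""
    v0 = Q / (2.0 * np.pi * r_well * b * theta)
    return r_well + v0 * t

t = 3600.0  # one hour of injection
print(radial_front(t), constant_velocity_front(t))  # the 1D estimate is far larger
```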

  18. Scale issues in soil hydrology related to measurement and simulation: A case study in Colorado

    USDA-ARS?s Scientific Manuscript database

    State variables, such as soil water content (SWC), are typically measured or inferred at very small scales while being simulated at larger scales relevant to spatial management or hillslope areas. Thus there is an implicit spatial disparity that is often ignored. Surface runoff, on the other hand, ...

  19. A Feedback Intervention to Increase Digital and Paper Checklist Performance in Technically Advanced Aircraft Simulation

    ERIC Educational Resources Information Center

    Rantz, William G.; Van Houten, Ron

    2011-01-01

    This study examined whether pilots operating a flight simulator completed digital or paper flight checklists more accurately after receiving postflight graphic and verbal feedback. The dependent variable was the number of checklist items completed correctly per flight. Following treatment, checklist completion with paper and digital checklists…

  20. Simulation of Teacher Demand, Demographics, and Mobility: A Preliminary Report.

    ERIC Educational Resources Information Center

    Baugh, William H.; Stone, Joe A.

    A Markov chain is used to construct a simulation model of the educator labor market in Oregon. The variables crucial to this study, drawn from the University of Southern California faculty planning model, include factors such as appointment rate; age; probability of attaining promotion; retirement, resignation and mortality rates; length of…

  1. Model aerodynamic test results for two variable cycle engine coannular exhaust systems at simulated takeoff and cruise conditions. [Lewis 8 by 6-foot supersonic wind tunnel tests

    NASA Technical Reports Server (NTRS)

    Nelson, D. P.

    1980-01-01

    Wind tunnel tests were conducted to evaluate the aerodynamic performance of a coannular exhaust nozzle for a proposed variable stream control supersonic propulsion system. Tests were conducted with two simulated configurations differing primarily in the fan duct flowpaths: a short flap mechanism for fan stream control with an isentropic contoured flow splitter, and an iris fan nozzle with a conical flow splitter. Both designs feature a translating primary plug and an auxiliary inlet ejector. Tests were conducted at takeoff and simulated cruise conditions. Data were acquired at Mach numbers of 0, 0.36, 0.9, and 2.0 for a wide range of nozzle operating conditions. At simulated supersonic cruise, both configurations demonstrated good performance, comparable to levels assumed in earlier advanced supersonic propulsion studies. However, at subsonic cruise, both configurations exhibited performance that was 6 to 7.5 percent less than the study assumptions. At takeoff conditions, the iris configuration performance approached the assumed levels, while the short flap design was 4 to 6 percent less.

  2. Hybrid stochastic simulations of intracellular reaction-diffusion systems.

    PubMed

    Kalantzis, Georgios

    2009-06-01

    With the recognition that stochasticity is important in biological systems, stochastic chemical kinetics has begun to receive wider interest. While Monte Carlo discrete-event simulations most accurately capture the variability of molecular species, they become computationally costly for complex reaction-diffusion systems with large populations of molecules. On the other hand, continuous-time models are computationally efficient but fail to capture any variability in the molecular species. In this study a hybrid stochastic approach is introduced for simulating reaction-diffusion systems. We developed an adaptive partitioning strategy in which processes with high frequency are simulated with deterministic rate-based equations, and those with low frequency with the exact stochastic algorithm of Gillespie. The stochastic behavior of cellular pathways is therefore preserved while the method remains applicable to large populations of molecules. We describe our method and demonstrate its accuracy and efficiency compared with the Gillespie algorithm for two different systems: first, a model of intracellular viral kinetics with two steady states, and second, a compartmental model of the postsynaptic spine head for studying the dynamics of Ca2+ and NMDA receptors.
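
    For reference, the exact stochastic component of such a scheme is Gillespie's direct method; a minimal sketch for a single birth-death species is given below. In the hybrid approach described above only the low-frequency channels would be handled this way, with the high-frequency ones integrated as rate equations; the rate constants here are illustrative assumptions.

```python
# Hedged sketch of Gillespie's direct method for a birth-death species.
import numpy as np

rng = np.random.default_rng(3)

def gillespie_birth_death(k_birth=5.0, k_death=0.1, x0=0, t_end=100.0):
    t, x = 0.0, x0
    times, counts = [t], [x]
    while t < t_end:
        propensities = np.array([k_birth, k_death * x])
        a0 = propensities.sum()
        if a0 == 0.0:
            break
        t += rng.exponential(1.0 / a0)            # time to next reaction event
        reaction = rng.choice(2, p=propensities / a0)
        x += 1 if reaction == 0 else -1
        times.append(t); counts.append(x)
    return np.array(times), np.array(counts)

times, counts = gillespie_birth_death()
print("mean copy number (late times):", counts[len(counts) // 2:].mean())  # ~k_birth/k_death
```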

  3. Bayesian Techniques for Comparing Time-dependent GRMHD Simulations to Variable Event Horizon Telescope Observations

    NASA Astrophysics Data System (ADS)

    Kim, Junhan; Marrone, Daniel P.; Chan, Chi-Kwan; Medeiros, Lia; Özel, Feryal; Psaltis, Dimitrios

    2016-12-01

    The Event Horizon Telescope (EHT) is a millimeter-wavelength, very-long-baseline interferometry (VLBI) experiment that is capable of observing black holes with horizon-scale resolution. Early observations have revealed variable horizon-scale emission in the Galactic Center black hole, Sagittarius A* (Sgr A*). Comparing such observations to time-dependent general relativistic magnetohydrodynamic (GRMHD) simulations requires statistical tools that explicitly consider the variability in both the data and the models. We develop here a Bayesian method to compare time-resolved simulation images to variable VLBI data, in order to infer model parameters and perform model comparisons. We use mock EHT data based on GRMHD simulations to explore the robustness of this Bayesian method and contrast it to approaches that do not consider the effects of variability. We find that time-independent models lead to offset values of the inferred parameters with artificially reduced uncertainties. Moreover, neglecting the variability in the data and the models often leads to erroneous model selections. We finally apply our method to the early EHT data on Sgr A*.

  4. BAYESIAN TECHNIQUES FOR COMPARING TIME-DEPENDENT GRMHD SIMULATIONS TO VARIABLE EVENT HORIZON TELESCOPE OBSERVATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Junhan; Marrone, Daniel P.; Chan, Chi-Kwan

    2016-12-01

    The Event Horizon Telescope (EHT) is a millimeter-wavelength, very-long-baseline interferometry (VLBI) experiment that is capable of observing black holes with horizon-scale resolution. Early observations have revealed variable horizon-scale emission in the Galactic Center black hole, Sagittarius A* (Sgr A*). Comparing such observations to time-dependent general relativistic magnetohydrodynamic (GRMHD) simulations requires statistical tools that explicitly consider the variability in both the data and the models. We develop here a Bayesian method to compare time-resolved simulation images to variable VLBI data, in order to infer model parameters and perform model comparisons. We use mock EHT data based on GRMHD simulations to explore the robustness of this Bayesian method and contrast it to approaches that do not consider the effects of variability. We find that time-independent models lead to offset values of the inferred parameters with artificially reduced uncertainties. Moreover, neglecting the variability in the data and the models often leads to erroneous model selections. We finally apply our method to the early EHT data on Sgr A*.

  5. The role of updraft velocity in temporal variability of cloud hydrometeor number

    NASA Astrophysics Data System (ADS)

    Sullivan, Sylvia; Nenes, Athanasios; Lee, Dong Min; Oreopoulos, Lazaros

    2016-04-01

    Significant effort has been dedicated to incorporating direct aerosol-cloud links, through parameterization of liquid droplet activation and ice crystal nucleation, within climate models. This accomplishment has generated the need to understand which parameters affecting hydrometeor formation drive its variability in coupled climate simulations, as this provides the basis for optimal parameter estimation as well as robust comparison with data and other models. Sensitivity analysis alone does not address this issue, given that the importance of each parameter for hydrometeor formation depends on both its variance and its sensitivity. To address this issue, we develop and use a series of attribution metrics defined with adjoint sensitivities to attribute the temporal variability in droplet and crystal number to important aerosol and dynamical parameters. This attribution analysis is done both for the NASA Global Modeling and Assimilation Office Goddard Earth Observing System Model, Version 5 (GEOS-5) and the National Center for Atmospheric Research Community Atmosphere Model Version 5.1 (CAM5.1). Within the GEOS simulation, up to 48% of the temporal variability in output ice crystal number and 61% in droplet number can be attributed to input updraft velocity fluctuations, while for the CAM simulation, updraft fluctuations explain as much as 89% of the ice crystal number variability. These results suggest that vertical velocity is a very important (or dominant) driver of hydrometeor variability in both model frameworks. Yet observations of vertical velocity are seldom available (or used) to evaluate the vertical velocities in simulations; this contrasts strikingly with the amount and quality of data available for aerosol-related parameters. Consequently, there is a strong need for retrievals or measurements of vertical velocity to address this important knowledge gap, which requires a significant investment and effort by the atmospheric community. The attribution metrics, as a tool for understanding hydrometeor variability, can be instrumental for understanding the sources of differences between models used for aerosol-cloud-climate interaction studies.
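
    A sensitivity-weighted attribution metric of this general kind can be sketched as the share of output variance explained by each input, (dN/dx_i)^2 Var(x_i) / Var(N), with the derivatives supplied by adjoint sensitivities. The toy activation model and parameter values below are assumptions used only to show the bookkeeping.

```python
# Hedged sketch of a sensitivity-weighted attribution of hydrometeor number
# variance to its inputs; the linear "activation" model is a stand-in only.
import numpy as np

rng = np.random.default_rng(4)

# Synthetic time series of inputs: updraft velocity w and aerosol number na
w = rng.normal(0.5, 0.2, 10_000)
na = rng.normal(500.0, 50.0, 10_000)

def droplet_number(w, na):
    return 100.0 * w + 0.05 * na     # linear stand-in for the activation scheme

N = droplet_number(w, na)
sens = {"w": 100.0, "na": 0.05}      # adjoint-style sensitivities (exact for this toy model)
var = {"w": w.var(), "na": na.var()}

for name in sens:
    share = sens[name] ** 2 * var[name] / N.var()
    print(f"fraction of N variance attributed to {name}: {share:.2f}")
```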

  6. The Challenge of Simulating the Regional Climate over Florida

    NASA Astrophysics Data System (ADS)

    Misra, V.; Mishra, A. K.

    2015-12-01

    In this study we show that the unique geography of peninsular Florida, with close proximity to strong mesoscale surface ocean currents among other factors, warrants the use of relatively high resolution climate models to project Florida's hydroclimate. In the absence of such high resolution climate models, we highlight the deficiencies of two relatively coarse spatial resolution CMIP5 models with respect to the warm western boundary current of the Gulf Stream. As a consequence, the misrepresented current affects the coastal SST and the land-ocean contrast, which in turn affects the rainy summer seasonal precipitation accumulation over peninsular Florida. We also show this through two sensitivity studies conducted with a regional coupled ocean-atmosphere model with different bathymetries that dislocate and modulate the strength of the Gulf Stream, which locally affects the SST in the two simulations. These studies show that a stronger and more easterly displaced Gulf Stream produces warmer coastal SSTs along the Atlantic coast of Florida, which enhances the precipitation over peninsular Florida relative to the other regional climate model simulation. However, the regional model simulations indicate that the variability of wet season rainfall in peninsular Florida becomes less dependent on the land-ocean contrast with a stronger Gulf Stream current.

  7. Sources and Impacts of Modeled and Observed Low-Frequency Climate Variability

    NASA Astrophysics Data System (ADS)

    Parsons, Luke Alexander

    Here we analyze climate variability using instrumental, paleoclimate (proxy), and the latest climate model data to understand more about the sources and impacts of low-frequency climate variability. Understanding the drivers of climate variability at interannual to century timescales is important for studies of climate change, including analyses of detection and attribution of climate change impacts. Additionally, correctly modeling the sources and impacts of variability is key to the simulation of abrupt change (Alley et al., 2003) and extended drought (Seager et al., 2005; Pelletier and Turcotte, 1997; Ault et al., 2014). In Appendix A, we employ an Earth system model (GFDL-ESM2M) simulation to study the impacts of a weakening of the Atlantic meridional overturning circulation (AMOC) on the climate of the American Tropics. The AMOC drives some degree of local and global internal low-frequency climate variability (Manabe and Stouffer, 1995; Thornalley et al., 2009) and helps control the position of the tropical rainfall belt (Zhang and Delworth, 2005). We find that a major weakening of the AMOC can cause large-scale temperature, precipitation, and carbon storage changes in Central and South America. Our results suggest that possible future changes in AMOC strength alone will not be sufficient to drive a large-scale dieback of the Amazonian forest, but this key natural ecosystem is sensitive to dry-season length and timing of rainfall (Parsons et al., 2014). In Appendix B, we compare a paleoclimate record of precipitation variability in the Peruvian Amazon to climate model precipitation variability. The paleoclimate (Lake Limon) record indicates that precipitation variability in western Amazonia is 'red' (i.e., increasing variability with timescale). By contrast, most state-of-the-art climate models indicate precipitation variability in this region is nearly 'white' (i.e., roughly equal variability across timescales). This paleo-model disagreement in the overall structure of the variance spectrum has important consequences for the probability of multi-year drought. Our lake record suggests there is a significant background threat of multi-year, and even decade-length, drought in western Amazonia, whereas climate model simulations indicate most droughts likely last no longer than one to three years. These findings suggest climate models may underestimate the future risk of extended drought in this important region. In Appendix C, we expand our analysis of climate variability beyond South America. We use observations, well-constrained tropical paleoclimate, and Earth system model data to examine the overall shape of the climate spectrum across interannual to century frequencies. We find a general agreement among observations and models that temperature variability increases with timescale across most of the globe outside the tropics. However, as compared to paleoclimate records, climate models generate too little low-frequency variability in the tropics (e.g., Laepple and Huybers, 2014). When we compare the shape of the simulated climate spectrum to the spectrum of a simple autoregressive process, we find much of the modeled surface temperature variability in the tropics could be explained by ocean smoothing of weather noise. Importantly, modeled precipitation tends to be similar to white noise across much of the globe.
By contrast, paleoclimate records of various types from around the globe indicate that both temperature and precipitation variability should experience much more low-frequency variability than a simple autoregressive or white-noise process. In summary, state-of-the-art climate models generate some degree of dynamically driven low-frequency climate variability, especially at high latitudes. However, the latest climate models, observations, and paleoclimate data provide us with drastically different pictures of the background climate system and its associated risks. This research has important consequences for improving how we simulate climate extremes as we enter a warmer (and often drier) world in the coming centuries; if climate models underestimate low-frequency variability, we will underestimate the risk of future abrupt change and extreme events, such as megadroughts.

  8. Variable selection in discrete survival models including heterogeneity.

    PubMed

    Groll, Andreas; Tutz, Gerhard

    2017-04-01

    Several variable selection procedures are available for continuous time-to-event data. However, if time is measured in a discrete way and therefore many ties occur, models for continuous time are inadequate. We propose penalized likelihood methods that perform efficient variable selection in discrete survival modeling with explicit modeling of the heterogeneity in the population. The method is based on a combination of ridge and lasso type penalties that are tailored to the case of discrete survival. The performance is studied in simulation studies and in an application to the birth of the first child.
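
    A rough sketch of the general idea (not the authors' estimator, and omitting the heterogeneity/frailty term) is to expand the data to person-period format, fit the discrete-time hazard as a binary regression, and apply a combined ridge/lasso (elastic net) penalty for variable selection. The simulated data and penalty weights below are illustrative assumptions.

```python
# Hedged sketch: elastic-net penalized discrete-time survival via person-period
# logistic regression. Data, hazard model, and tuning values are placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n, p, t_max = 400, 8, 6
X = rng.standard_normal((n, p))
true_beta = np.array([1.0, -1.0] + [0.0] * (p - 2))   # only two informative covariates

def simulate_time(x):
    """Draw a discrete event time from a logistic hazard (placeholder model)."""
    for t in range(1, t_max + 1):
        hazard = 1.0 / (1.0 + np.exp(-(-2.0 + x @ true_beta)))
        if rng.uniform() < hazard:
            return t, 1
    return t_max, 0                                    # censored at t_max

times, events = zip(*(simulate_time(X[i]) for i in range(n)))

# Expand to person-period format: one binary row per subject per period at risk
rows, y = [], []
for i in range(n):
    for t in range(1, times[i] + 1):
        rows.append(np.concatenate(([float(t)], X[i])))
        y.append(int(t == times[i] and events[i] == 1))
rows, y = np.array(rows), np.array(y)

fit = LogisticRegression(penalty="elasticnet", solver="saga",
                         l1_ratio=0.5, C=0.5, max_iter=5000).fit(rows, y)
print("covariates kept by the penalty:", np.nonzero(np.abs(fit.coef_[0][1:]) > 1e-2)[0])
```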

  9. Physical fitness predicts technical-tactical and time-motion profile in simulated Judo and Brazilian Jiu-Jitsu matches

    PubMed Central

    Gentil, Paulo; Bueno, João C.A.; Follmer, Bruno; Marques, Vitor A.; Del Vecchio, Fabrício B.

    2018-01-01

    Background Among combat sports, Judo and Brazilian Jiu-Jitsu (BJJ) present elevated physical fitness demands from the high-intensity intermittent efforts. However, information regarding how metabolic and neuromuscular physical fitness is associated with technical-tactical performance in Judo and BJJ fights is not available. This study aimed to relate indicators of physical fitness with combat performance variables in Judo and BJJ. Methods The sample consisted of Judo (n = 16) and BJJ (n = 24) male athletes. At the first meeting, the physical tests were applied and, in the second, simulated fights were performed for later notational analysis. Results The main findings indicate: (i) high reproducibility of the proposed instrument and protocol used for notational analysis in a mobile device; (ii) differences in the technical-tactical and time-motion patterns between modalities; (iii) performance-related variables are different in Judo and BJJ; and (iv) regression models based on metabolic fitness variables may account for up to 53% of the variances in technical-tactical and/or time-motion variables in Judo and up to 31% in BJJ, whereas neuromuscular fitness models can reach values up to 44 and 73% of prediction in Judo and BJJ, respectively. When all components are combined, they can explain up to 90% of high intensity actions in Judo. Discussion In conclusion, performance prediction models in simulated combat indicate that anaerobic, aerobic and neuromuscular fitness variables contribute to explain time-motion variables associated with high intensity and technical-tactical variables in Judo and BJJ fights. PMID:29844991

  10. Investigation on the Nonlinear Control System of High-Pressure Common Rail (HPCR) System in a Diesel Engine

    NASA Astrophysics Data System (ADS)

    Cai, Le; Mao, Xiaobing; Ma, Zhexuan

    2018-02-01

    This study first constructed a nonlinear mathematical model of the high-pressure common rail (HPCR) system of a diesel engine. A nonlinear state transformation was then performed using the flow calculation, and the standard state-space equation was obtained. Based on sliding-mode variable structure control (SMVSC) theory, a sliding-mode controller for nonlinear systems was designed to control the common rail pressure and the diesel engine's rotational speed. Finally, the designed nonlinear HPCR system was simulated in MATLAB. The simulation results demonstrate that the sliding-mode variable structure control algorithm shows favorable control performance and overcomes the shortcomings of traditional PID control in overshoot, parameter tuning, steady-state precision, settling time, and rise time.
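
    A minimal sketch of a sliding-mode pressure controller on a toy first-order rail model, dp/dt = -b*p + a*u + d(t), is given below; the plant coefficients, disturbance, and gains are illustrative assumptions rather than the paper's HPCR model.

```python
# Hedged sketch of a sliding-mode controller for a toy rail-pressure model.
import numpy as np

a, b = 4.0, 0.5            # toy input gain and leakage coefficient
K, phi = 6.0, 0.5          # switching gain (> disturbance bound) and boundary layer
p_ref = 100.0              # target common-rail pressure (arbitrary units)

def controller(p):
    s = p - p_ref                              # sliding surface
    u_eq = b * p / a                           # equivalent control holding ds/dt = 0
    return u_eq - (K / a) * np.tanh(s / phi)   # smoothed switching term (reduces chattering)

dt, p = 1e-3, 60.0
for k in range(20_000):
    d = 2.0 * np.sin(0.01 * k)                 # bounded disturbance, |d| < K
    u = controller(p)
    p += dt * (-b * p + a * u + d)             # Euler step of the plant
print("final pressure:", p)                    # settles close to p_ref
```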

  11. Low-Level Jets and Their Effects on the South American Summer Climate as Simulated by the NCEP Eta Model.

    NASA Astrophysics Data System (ADS)

    Vernekar, Anandu D.; Kirtman, Ben P.; Fennessy, Michael J.

    2003-01-01

    The National Centers for Environmental Prediction (NCEP) Eta Model (80 km, 38L) is used to simulate the tropical South American summer (January-March) climate for 1983, 1985, 1987, 1989, and 1991 using lateral boundary conditions from the NCEP-National Center for Atmospheric Research (NCAR) reanalysis. Simulations of the lower tropospheric circulation and precipitation are analyzed to study the variability on diurnal, intraseasonal, and interannual timescales. The results are compared with observations and previous studies. The Eta Model produces better regional circulation details, such as low-level jets (LLJs), than does the reanalysis because of its higher resolution, more realistic topography and coastal geometry, and because of its ability to realistically simulate the effects of mesoscale circulation on the time-mean flow. The model detects not only the LLJ east of the Andes Mountains and the LLJ west of the northern Cordillera Occidental, which have been reported in previous studies, but it also detects three distinct LLJs just north of the equator embedded in the strong northeasterly trade winds over Colombia, Venezuela, and Guiana. All the LLJs show strong diurnal variability with a nocturnal maximum. The LLJ east of the Andes Mountains brings warm moist air from the Amazon basin to the Gran Chaco region where the jet exits. The moisture convergence in the jet exit region creates favorable conditions for precipitation. Hence, the precipitation over the region also shows strong diurnal variability with a nocturnal maximum. The LLJs just north of the equator bring moisture from the tropical Atlantic Ocean, the western Caribbean Sea, and the Gulf of Panama to their exit regions over the northern Amazon basin and west coasts of Colombia and Ecuador. The precipitation over these regions also has diurnal variability with a nocturnal maximum. The diurnal variability of precipitation over most of the Tropics has an afternoon rainfall maximum except for regions influenced by LLJs, which have a nocturnal rainfall maximum. The intraseasonal variability of the LLJs is episodic with an approximate period of 20 days. The interannual variability of the LLJs is dominated by the ENSO cycle. The LLJ east of the Andes Mountains is stronger in the warm phase of ENSO than in the cold phase. However, the model has some difficulty simulating the observed relationship between the strength of the LLJ and precipitation, but the model succeeds in the case of LLJs just north of the equator. For example, these LLJs are weaker in the warm phase of ENSO than in the cold phase. Hence, during the warm (cold) phase of ENSO, dry (wet) conditions normally occur over the northern part of the Amazon basin, which is the exit region of these LLJs.

  12. High-resolution regional climate model evaluation using variable-resolution CESM over California

    NASA Astrophysics Data System (ADS)

    Huang, X.; Rhoades, A.; Ullrich, P. A.; Zarzycki, C. M.

    2015-12-01

    Understanding the effect of climate change at regional scales remains a topic of intensive research. Though computational constraints remain a problem, high horizontal resolution is needed to represent topographic forcing, which is a significant driver of local climate variability. Although regional climate models (RCMs) have traditionally been used at these scales, variable-resolution global climate models (VRGCMs) have recently arisen as an alternative for studying regional weather and climate, allowing two-way interaction between these domains without the need for nudging. In this study, the recently developed variable-resolution option within the Community Earth System Model (CESM) is assessed for long-term regional climate modeling over California. Our variable-resolution simulations focus on relatively high resolutions for climate assessment, namely 28 km and 14 km regional resolution, which are much more typical for dynamically downscaled studies. For comparison with the more widely used RCM method, the Weather Research and Forecasting (WRF) model is used for simulations at 27 km and 9 km. All simulations use the AMIP (Atmospheric Model Intercomparison Project) protocols. The time period is from 1979-01-01 to 2005-12-31 (UTC), and year 1979 was discarded as spin-up time. The mean climatology across California's diverse climate zones, including temperature and precipitation, is analyzed and contrasted with the WRF model (as a traditional RCM), regional reanalysis, gridded observational datasets, and uniform high-resolution CESM at 0.25 degree with the finite volume (FV) dynamical core. The results show that variable-resolution CESM is competitive in representing regional climatology on both annual and seasonal time scales. This assessment adds value to the use of VRGCMs for projecting climate change over the coming century and improves our understanding of both past and future regional climate related to fine-scale processes. This assessment is also relevant for addressing the scale limitation of current RCMs or VRGCMs when next-generation model resolution increases to ~10 km and beyond.

  13. Sequential Gaussian co-simulation of rate decline parameters of longwall gob gas ventholes.

    PubMed

    Karacan, C Özgen; Olea, Ricardo A

    2013-04-01

    Gob gas ventholes (GGVs) are used to control methane inflows into a longwall mining operation by capturing the gas within the overlying fractured strata before it enters the work environment. Using geostatistical co-simulation techniques, this paper maps the parameters of their rate decline behaviors across the study area, a longwall mine in the Northern Appalachian basin. Geostatistical gas-in-place (GIP) simulations were performed, using data from 64 exploration boreholes, and GIP data were mapped within the fractured zone of the study area. In addition, methane flowrates monitored from 10 GGVs were analyzed using decline curve analyses (DCA) techniques to determine parameters of decline rates. Surface elevation showed the most influence on methane production from GGVs and thus was used to investigate its relation with DCA parameters using correlation techniques on normal-scored data. Geostatistical analysis was pursued using sequential Gaussian co-simulation with surface elevation as the secondary variable and with DCA parameters as the primary variables. The primary DCA variables were effective percentage decline rate, rate at production start, rate at the beginning of forecast period, and production end duration. Co-simulation results were presented to visualize decline parameters at an area-wide scale. Wells located at lower elevations, i.e., at the bottom of valleys, tend to perform better in terms of their rate declines compared to those at higher elevations. These results were used to calculate drainage radii of GGVs using GIP realizations. The calculated drainage radii are close to ones predicted by pressure transient tests.
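
    The decline-curve step can be sketched generically as fitting an Arps exponential decline q(t) = q_i * exp(-D*t) to monitored flowrates and reporting the initial rate and effective decline; the synthetic flowrate series and noise level below are illustrative assumptions, not the monitored GGV data.

```python
# Hedged sketch of decline curve analysis with an exponential Arps model.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(6)
t = np.arange(0.0, 24.0)                        # months on production
true_qi, true_D = 300.0, 0.08                   # Mcf/d and 1/month (illustrative)
q = true_qi * np.exp(-true_D * t) * (1 + 0.05 * rng.standard_normal(t.size))

def arps_exponential(t, qi, D):
    return qi * np.exp(-D * t)

(qi_hat, D_hat), _ = curve_fit(arps_exponential, t, q, p0=(200.0, 0.05))
effective_annual_decline = 1.0 - np.exp(-12.0 * D_hat)
print(f"initial rate ~{qi_hat:.0f}, effective annual decline ~{100 * effective_annual_decline:.0f}%")
```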

  14. Investigating the dependence of SCM simulated precipitation and clouds on the spatial scale of large-scale forcing at SGP [Investigating the scale dependence of SCM simulated precipitation and cloud by using gridded forcing data at SGP]

    DOE PAGES

    Tang, Shuaiqi; Zhang, Minghua; Xie, Shaocheng

    2017-08-05

    Large-scale forcing data, such as vertical velocity and advective tendencies, are required to drive single-column models (SCMs), cloud-resolving models, and large-eddy simulations. Previous studies suggest that some errors of these model simulations could be attributed to the lack of spatial variability in the specified domain-mean large-scale forcing. This study investigates the spatial variability of the forcing and explores its impact on SCM simulated precipitation and clouds. A gridded large-scale forcing data set from the March 2000 Cloud Intensive Operational Period at the Atmospheric Radiation Measurement program's Southern Great Plains site is used for analysis and to drive the single-column version of the Community Atmospheric Model Version 5 (SCAM5). When the gridded forcing data show large spatial variability, such as during a frontal passage, SCAM5 with the domain-mean forcing is not able to capture the convective systems that are partly located in the domain or that only occupy part of the domain. This problem has been largely reduced by using the gridded forcing data, which allows running SCAM5 in each subcolumn and then averaging the results within the domain. This is because the subcolumns have a better chance to capture the timing of the frontal propagation and the small-scale systems. As a result, other potential uses of the gridded forcing data, such as understanding and testing scale-aware parameterizations, are also discussed.

  15. Sequential Gaussian co-simulation of rate decline parameters of longwall gob gas ventholes

    USGS Publications Warehouse

    Karacan, C. Özgen; Olea, Ricardo A.

    2013-01-01

    Gob gas ventholes (GGVs) are used to control methane inflows into a longwall mining operation by capturing the gas within the overlying fractured strata before it enters the work environment. Using geostatistical co-simulation techniques, this paper maps the parameters of their rate decline behaviors across the study area, a longwall mine in the Northern Appalachian basin. Geostatistical gas-in-place (GIP) simulations were performed, using data from 64 exploration boreholes, and GIP data were mapped within the fractured zone of the study area. In addition, methane flowrates monitored from 10 GGVs were analyzed using decline curve analyses (DCA) techniques to determine parameters of decline rates. Surface elevation showed the most influence on methane production from GGVs and thus was used to investigate its relation with DCA parameters using correlation techniques on normal-scored data. Geostatistical analysis was pursued using sequential Gaussian co-simulation with surface elevation as the secondary variable and with DCA parameters as the primary variables. The primary DCA variables were effective percentage decline rate, rate at production start, rate at the beginning of forecast period, and production end duration. Co-simulation results were presented to visualize decline parameters at an area-wide scale. Wells located at lower elevations, i.e., at the bottom of valleys, tend to perform better in terms of their rate declines compared to those at higher elevations. These results were used to calculate drainage radii of GGVs using GIP realizations. The calculated drainage radii are close to ones predicted by pressure transient tests.

  16. Sequential Gaussian co-simulation of rate decline parameters of longwall gob gas ventholes

    PubMed Central

    Karacan, C.Özgen; Olea, Ricardo A.

    2015-01-01

    Gob gas ventholes (GGVs) are used to control methane inflows into a longwall mining operation by capturing the gas within the overlying fractured strata before it enters the work environment. Using geostatistical co-simulation techniques, this paper maps the parameters of their rate decline behaviors across the study area, a longwall mine in the Northern Appalachian basin. Geostatistical gas-in-place (GIP) simulations were performed, using data from 64 exploration boreholes, and GIP data were mapped within the fractured zone of the study area. In addition, methane flowrates monitored from 10 GGVs were analyzed using decline curve analyses (DCA) techniques to determine parameters of decline rates. Surface elevation showed the most influence on methane production from GGVs and thus was used to investigate its relation with DCA parameters using correlation techniques on normal-scored data. Geostatistical analysis was pursued using sequential Gaussian co-simulation with surface elevation as the secondary variable and with DCA parameters as the primary variables. The primary DCA variables were effective percentage decline rate, rate at production start, rate at the beginning of forecast period, and production end duration. Co-simulation results were presented to visualize decline parameters at an area-wide scale. Wells located at lower elevations, i.e., at the bottom of valleys, tend to perform better in terms of their rate declines compared to those at higher elevations. These results were used to calculate drainage radii of GGVs using GIP realizations. The calculated drainage radii are close to ones predicted by pressure transient tests. PMID:26190930

  17. Reply to Comment by Laprise on 'the Added Value to Global Model Projections of Climate Change by Dynamical Downscaling: a Case Study over the Continental U.S. Using the GISS-ModelE2 and WRF Models'

    NASA Technical Reports Server (NTRS)

    Shindell, Drew Todd; Racherla, Pavan; Milly, George Peter

    2014-01-01

    In his comment, Laprise raises several points that we agree merit consideration. His primary critique is that our study [Racherla et al., 2012] tested the ability of the WRF regional climate model to reproduce historical temperature and precipitation change relative to the driving global climate model (GCM) using only a single simulation rather than an ensemble. He asserts that the observed changes are smaller than the internal variability in the climate system (i.e., not statistically significant) and that thus a single simulation should not necessarily be able to capture the observations. Laprise points out that the statistical signal is reduced for a multi-decadal trend such as the one we analyzed in comparison with mean climatology and cites two studies showing that for particular climate parameters it can take many years for a signal to be discerned over internal variability. He states that "The results of the experiment as designed were strongly influenced by the presence of internal variability and sampling errors, which masked the rather small climate changes that may have occurred as a consequence of changes in forcing during the period considered." While Laprise discusses statistics in general terms at some length, for the actual climate trends examined in our study he offers no evidence that the forced signal was small compared with internal variability. The two studies he cites [de Elía et al., 2013; Maraun, 2013] do not provide convincing evidence, as they concern climate variables averaged over different times and areas. One in fact examines extreme precipitation events, which by definition are rare and thus have a lower significance level. We accept the general point that it is important to consider internal variability, and as noted in our paper we agree that an ensemble of simulations is in principle an optimal, though computationally expensive, approach. While we did not present the statistical significance of the observations in our original paper, we have now evaluated it for the regional temperature trends used in our study to assess the added value of WRF, and we can thus examine the magnitude of the trends with respect to internal variability.

  18. Impact of dynamical regionalization on precipitation biases and teleconnections over West Africa

    NASA Astrophysics Data System (ADS)

    Gómara, Iñigo; Mohino, Elsa; Losada, Teresa; Domínguez, Marta; Suárez-Moreno, Roberto; Rodríguez-Fonseca, Belén

    2018-06-01

    West African societies are highly dependent on the West African Monsoon (WAM). Thus, a correct representation of the WAM in climate models is of paramount importance. In this article, the ability of 8 CMIP5 historical General Circulation Models (GCMs) and 4 CORDEX-Africa Regional Climate Models (RCMs) to characterize the WAM dynamics and variability is assessed for the period July-August-September 1979-2004. Simulations are compared with observations. Uncertainties in RCM performance and lateral boundary conditions are assessed individually. Results show that both GCMs and RCMs have difficulty simulating the northward migration of the Intertropical Convergence Zone in boreal summer. The greatest bias improvements are obtained after regionalization of the most inaccurate GCM simulations. To assess WAM variability, a Maximum Covariance Analysis is performed between Sea Surface Temperature and precipitation anomalies in observations, GCM and RCM simulations. The assessed variability patterns are: El Niño-Southern Oscillation (ENSO); the eastern Mediterranean (MED); and the Atlantic Equatorial Mode (EM). Evidence is given that regionalization of the ENSO-WAM teleconnection does not provide any added value. Unlike GCMs, RCMs are unable to precisely represent the ENSO impact on air subsidence over West Africa. In contrast, the simulation of the MED-WAM teleconnection is improved after regionalization. Humidity advection and convergence over the Sahel area are better simulated by RCMs. Finally, no robust conclusions can be drawn for the EM-WAM teleconnection, which cannot be isolated for the 1979-2004 period. The novel results in this article will help to select the most appropriate RCM simulations for studying WAM teleconnections.
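
    The Maximum Covariance Analysis used above couples two anomaly fields through the singular value decomposition of their cross-covariance matrix. The sketch below illustrates that construction with random placeholder arrays standing in for SST and precipitation anomalies; array sizes and data are assumptions, not the study's fields.

    ```python
    # Hedged sketch of a Maximum Covariance Analysis (MCA) between two anomaly fields.
    import numpy as np

    nt, n_sst, n_pr = 26, 500, 300          # years (JAS 1979-2004) x grid points (placeholders)
    sst = np.random.randn(nt, n_sst)        # stand-in for SST anomalies
    pr = np.random.randn(nt, n_pr)          # stand-in for precipitation anomalies

    sst -= sst.mean(axis=0)                 # remove the time mean
    pr -= pr.mean(axis=0)

    C = sst.T @ pr / (nt - 1)               # cross-covariance matrix (n_sst x n_pr)
    U, s, Vt = np.linalg.svd(C, full_matrices=False)

    scf = s**2 / np.sum(s**2)               # squared covariance fraction per mode
    ec_sst = sst @ U[:, 0]                  # expansion coefficients of the leading mode
    ec_pr = pr @ Vt[0, :]
    print("Leading-mode SCF: %.2f, EC correlation: %.2f"
          % (scf[0], np.corrcoef(ec_sst, ec_pr)[0, 1]))
    ```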

  19. A Probabilistic Approach to Quantify the Impact of Uncertainty Propagation in Musculoskeletal Simulations

    PubMed Central

    Myers, Casey A.; Laz, Peter J.; Shelburne, Kevin B.; Davidson, Bradley S.

    2015-01-01

    Uncertainty that arises from measurement error and parameter estimation can significantly affect the interpretation of musculoskeletal simulations; however, these effects are rarely addressed. The objective of this study was to develop an open-source probabilistic musculoskeletal modeling framework to assess how measurement error and parameter uncertainty propagate through a gait simulation. A baseline gait simulation was performed for a male subject using OpenSim for three stages: inverse kinematics, inverse dynamics, and muscle force prediction. A series of Monte Carlo simulations was performed that considered intrarater variability in marker placement, movement artifacts in each phase of gait, variability in body segment parameters, and variability in muscle parameters calculated from cadaveric investigations. Propagation of uncertainty was performed by using the output distributions from one stage as the input distributions to subsequent stages. Confidence bounds (5–95%) and the sensitivity of outputs to model input parameters were calculated throughout the gait cycle. The combined impact of uncertainty resulted in mean bounds that ranged from 2.7° to 6.4° in joint kinematics, 2.7 to 8.1 N m in joint moments, and 35.8 to 130.8 N in muscle forces. The impact of movement artifact was 1.8 times larger than any other propagated source. Sensitivities to specific body segment parameters and muscle parameters were linked to where in the gait cycle they were calculated. We anticipate that through the increased use of probabilistic tools, researchers will better understand the strengths and limitations of their musculoskeletal simulations and more effectively use simulations to evaluate hypotheses and inform clinical decisions. PMID:25404535
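
    The Monte Carlo idea described above, perturbing inputs with assumed error distributions and reporting 5-95% bounds on the output, can be illustrated with a toy stand-in model. Everything in the sketch (the toy moment function, the standard deviations) is an assumption for illustration; it is not OpenSim or the study's framework.

    ```python
    # Hedged sketch of Monte Carlo uncertainty propagation with 5-95% confidence bounds.
    import numpy as np

    rng = np.random.default_rng(0)
    n_trials = 2000

    def toy_joint_moment(marker_offset_deg, segment_mass_kg):
        """Stand-in for an inverse-dynamics step returning a joint moment (N m)."""
        return 45.0 + 1.5 * marker_offset_deg + 0.8 * (segment_mass_kg - 10.0)

    marker_offset = rng.normal(0.0, 1.5, n_trials)   # deg, assumed marker-placement error
    segment_mass = rng.normal(10.0, 0.6, n_trials)   # kg, assumed body-segment uncertainty

    moments = toy_joint_moment(marker_offset, segment_mass)
    lo, hi = np.percentile(moments, [5, 95])
    print(f"5-95% bounds on the toy joint moment: {lo:.1f} to {hi:.1f} N m")
    ```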

  20. Using Empirical Orthogonal Teleconnections to Analyze Interannual Precipitation Variability in China

    NASA Astrophysics Data System (ADS)

    Stephan, C.; Klingaman, N. P.; Vidale, P. L.; Turner, A. G.; Demory, M. E.; Guo, L.

    2017-12-01

    Interannual rainfall variability in China affects agriculture, infrastructure and water resource management. A consistent and objective method, Empirical Orthogonal Teleconnection (EOT) analysis, is applied to precipitation observations over China in all seasons. Instead of maximizing the explained space-time variance, the method identifies regions in China that best explain the temporal variability in domain-averaged rainfall. It produces known teleconnections, including high positive correlations with ENSO in eastern China in winter, along the Yangtze River in summer, and in southeast China during spring. New findings include that variability along the southeast coast in winter, in the Yangtze valley in spring, and in eastern China in autumn is associated with extratropical Rossby wave trains. The same analysis is applied to six climate simulations of the Met Office Unified Model with and without air-sea coupling and at various horizontal resolutions of 40, 90 and 200 km. All simulations reproduce the observed patterns of interannual rainfall variability in winter, spring and autumn; the leading pattern in summer is present in all but one simulation. However, only in two simulations are all patterns associated with the observed physical mechanism. Coupled simulations capture more observed patterns of variability and associate more of them with the correct physical mechanism, compared to atmosphere-only simulations at the same resolution. Finer resolution does not improve the fidelity of these patterns or their associated mechanisms. Evaluating climate models by the geographical distribution of mean precipitation and its interannual variance alone is insufficient; attention must be paid to the associated mechanisms.
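
    The EOT procedure described above can be sketched as an iterative search: find the grid point whose series best explains the domain-averaged rainfall, regress that series out of every point, and repeat. The sketch below is a minimal reconstruction of that idea with random placeholder data; it is not the study's implementation.

    ```python
    # Hedged sketch of an Empirical Orthogonal Teleconnection (EOT) decomposition.
    import numpy as np

    def eot(field, n_modes=3):
        """field: (time, points) anomaly array. Returns a list of (base-point index, base series)."""
        X = field - field.mean(axis=0)
        modes, used = [], set()
        for _ in range(n_modes):
            target = X.mean(axis=1)                       # domain-averaged series
            r2 = np.full(X.shape[1], -1.0)
            for j in range(X.shape[1]):
                if j not in used and X[:, j].std() > 0:
                    r2[j] = np.corrcoef(X[:, j], target)[0, 1] ** 2
            j_best = int(np.argmax(r2))                   # base point of this EOT mode
            base = X[:, j_best].copy()
            modes.append((j_best, base))
            used.add(j_best)
            slopes = X.T @ base / (base @ base)           # regress the base series out everywhere
            X = X - np.outer(base, slopes)
        return modes

    rainfall = np.random.randn(37, 400)                   # e.g., 37 seasons x 400 grid cells (placeholder)
    for k, (idx, series) in enumerate(eot(rainfall), 1):
        print(f"EOT {k}: base point {idx}")
    ```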

  1. Forward modeling of tree-ring data: a case study with a global network

    NASA Astrophysics Data System (ADS)

    Breitenmoser, P. D.; Frank, D.; Brönnimann, S.

    2012-04-01

    Information derived from tree-rings is one of the most powerful tools presently available for studying past climatic variability as well as identifying fundamental relationships between tree growth and climate. Climate reconstructions are typically performed by extending linear relationships, established during the overlapping period of instrumental and climate proxy archives, into the past. Such analyses, however, are limited by methodological assumptions, including stationarity and linearity of the climate-proxy relationship. We investigate climate and tree-ring data using the Vaganov-Shashkin-Lite (VS-Lite) forward model of tree-ring width formation to examine the relationship between actual tree growth and climate (as inferred from the simulated chronologies) to reconstruct past climate variability. The VS-Lite model has been shown to produce skill comparable to that achieved using classical dendrochronological statistical modeling techniques when applied to simulate a network of North American tree-ring chronologies. Although detailed mechanistic processes such as photosynthesis, storage, or cell processes are not modeled directly, the net effect of the dominant nonlinear climatic controls on tree growth is implemented in the model through the principle of limiting factors and threshold growth response functions. The VS-Lite model requires as inputs only latitude, monthly mean temperature and monthly accumulated precipitation. Hence, this simple, process-based model enables ring-width simulation at any location where monthly climate records exist. In this study, we analyse the growth response of simulated tree-rings to monthly climate conditions obtained from the 20th century reanalysis project back to 1871. These simulated tree-ring chronologies are compared to the climate-driven variability in worldwide observed tree-ring chronologies from the International Tree Ring Database. Results point toward the suitability of the relationship between actual tree growth and climate (as inferred from the simulated chronologies) for use in global palaeoclimate reconstructions.
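
    The principle of limiting factors and threshold growth response functions mentioned above can be illustrated with a minimal VS-Lite-style calculation: ramp responses to monthly temperature and moisture, with the smaller of the two limiting growth each month. The thresholds and the one-year forcing below are invented placeholders, not the calibrated VS-Lite parameters.

    ```python
    # Hedged sketch of a limiting-factor, threshold growth-response calculation.
    import numpy as np

    def ramp(x, x1, x2):
        """Threshold response: 0 below x1, 1 above x2, linear in between."""
        return np.clip((x - x1) / (x2 - x1), 0.0, 1.0)

    def vs_lite_like(temp_c, moisture, t1=4.0, t2=17.0, m1=0.02, m2=0.2):
        gT = ramp(temp_c, t1, t2)              # monthly temperature response
        gM = ramp(moisture, m1, m2)            # monthly soil-moisture response
        g = np.minimum(gT, gM)                 # principle of limiting factors
        return g.sum()                         # proxy for an annual ring-width index

    # one hypothetical year of monthly mean temperature (deg C) and soil moisture (v/v)
    temp = np.array([-2, 0, 4, 9, 14, 18, 20, 19, 15, 9, 3, -1], dtype=float)
    moist = np.array([0.25, 0.24, 0.22, 0.18, 0.12, 0.08, 0.05, 0.06, 0.10, 0.15, 0.20, 0.23])
    print("relative growth:", round(float(vs_lite_like(temp, moist)), 2))
    ```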

  2. Simulating the Snow Water Equivalent and its changing pattern over Nepal

    NASA Astrophysics Data System (ADS)

    Niroula, S.; Joseph, J.; Ghosh, S.

    2016-12-01

    Snowfall in the Himalayan region is one of the primary sources of fresh water and accounts for around 10% of Nepal's total precipitation. Snow water is a difficult variable to estimate at global and regional scales, with its complexity compounded by the spatial variability associated with rugged topography. The study is primarily focused on the simulation of Snow Water Equivalent (SWE) using a macroscale hydrologic model, the Variable Infiltration Capacity (VIC) model. As the whole of Nepal, including its Himalayas, lies within the catchment of the Ganga River in India and contributes at least 40% of the annual discharge of the Ganges, the model was run over the entire watershed, which also covers parts of Tibet and Bangladesh. Meteorological inputs for 29 years (1979-2007) are drawn from the ERA-Interim and APHRODITE datasets at a horizontal resolution of 0.25 degrees. The analysis was performed to study the temporal variability of SWE in the Himalayan region of Nepal. The model was calibrated against observed stream flows of the tributaries of the Gandaki River in Nepal, which ultimately feeds the Ganga. Further, the simulated SWE is used to estimate stream flow in this river basin. Since Nepal has greater snow cover accumulation at high altitudes in the monsoon season than in winter, seasonal fluctuations in SWE are known to affect the stream flows. The model provided fair estimates of SWE and stream flow according to statistical analysis. Stream flows are sensitive to changes in snow water, which can negatively affect power generation in a country with huge hydroelectric potential. In addition, our results on simulated SWE in the second largest snow-fed catchment of the country will be helpful for reservoir management, flood forecasting and other water resource management issues. Keywords: Hydrology, Snow Water Equivalent, Variable Infiltration Capacity, Gandaki River Basin, Stream Flow

  3. Trans-Pacific transport and evolution of aerosols: Evaluation of quasi-global WRF-Chem simulation with multiple observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Zhiyuan; Zhao, Chun; Huang, Jianping

    A fully coupled meteorology-chemistry model (WRF-Chem, the Weather Research and Forecasting model coupled with chemistry) has been configured to conduct quasi-global simulation for 5 years (2010–2014) and evaluated with multiple observation data sets for the first time. The evaluation focuses on the simulation over the trans-Pacific transport region using various reanalysis and observational data sets for meteorological fields and aerosol properties. The simulation generally captures the overall spatial and seasonal variability of satellite retrieved aerosol optical depth (AOD) and absorbing AOD (AAOD) over the Pacific that is determined by the outflow of pollutants and dust and the emissions of marine aerosols. The assessment of simulated extinction Ångström exponent (EAE) indicates that the model generally reproduces the variability of aerosol size distributions as seen by satellites. In addition, the vertical profile of aerosol extinction and its seasonality over the Pacific are also well simulated. The difference between the simulation and satellite retrievals can be mainly attributed to model biases in estimating marine aerosol emissions as well as the satellite sampling and retrieval uncertainties. Compared with the surface measurements over the western USA, the model reasonably simulates the observed magnitude and seasonality of dust, sulfate, and nitrate surface concentrations, but significantly underestimates the peak surface concentrations of carbonaceous aerosol likely due to model biases in the spatial and temporal variability of biomass burning emissions and secondary organic aerosol (SOA) production. A sensitivity simulation shows that the trans-Pacific transported dust, sulfate, and nitrate can make significant contribution to surface concentrations over the rural areas of the western USA, while the peaks of carbonaceous aerosol surface concentrations are dominated by the North American emissions. Both the retrievals and simulation show small interannual variability of aerosol characteristics for 2010–2014 averaged over three Pacific sub-regions. Furthermore, the evaluation in this study demonstrates that the WRF-Chem quasi-global simulation can be used for investigating trans-Pacific transport of aerosols and providing reasonable inflow chemical boundaries for the western USA, allowing one to further understand the impact of transported pollutants on the regional air quality and climate with high-resolution nested regional modeling.
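
    The extinction Ångström exponent (EAE) diagnostic mentioned above relates AOD at two wavelengths to particle size. The sketch below shows the standard two-wavelength formula; the wavelength pair and AOD values are illustrative assumptions, not the satellite products or model output used in the evaluation.

    ```python
    # Hedged sketch: extinction Angstrom exponent from AOD at two wavelengths.
    import numpy as np

    def angstrom_exponent(aod1, aod2, wl1, wl2):
        """EAE = -ln(AOD1/AOD2) / ln(wl1/wl2); larger values indicate smaller particles."""
        return -np.log(aod1 / aod2) / np.log(wl1 / wl2)

    aod_440, aod_870 = 0.32, 0.12        # hypothetical column AODs in a Pacific outflow scene
    print("EAE(440/870 nm) = %.2f" % angstrom_exponent(aod_440, aod_870, 440.0, 870.0))
    ```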

  5. Simulated effects of recruitment variability, exploitation, and reduced habitat area on the muskellunge population in Shoepack Lake, Voyageurs National Park, Minnesota

    USGS Publications Warehouse

    Frohnauer, N.K.; Pierce, C.L.; Kallemeyn, L.W.

    2007-01-01

    The genetically unique population of muskellunge Esox masquinongy inhabiting Shoepack Lake in Voyageurs National Park, Minnesota, is potentially at risk for loss of genetic variability and long-term viability. Shoepack Lake has been subject to dramatic surface area changes from the construction of an outlet dam by beavers Castor canadensis and its subsequent failure. We simulated the long-term dynamics of this population in response to recruitment variation, increased exploitation, and reduced habitat area. We then estimated the effective population size of the simulated population and evaluated potential threats to long-term viability, based on which we recommend management actions to help preserve the long-term viability of the population. Simulations based on the population size and habitat area at the beginning of a companion study resulted in an effective population size that was generally above the threshold level for risk of loss of genetic variability, except when fishing mortality was increased. Simulations based on the reduced habitat area after the beaver dam failure and our assumption of a proportional reduction in population size resulted in an effective population size that was generally below the threshold level for risk of loss of genetic variability. Our results identified two potential threats to the long-term viability of the Shoepack Lake muskellunge population, reduction in habitat area and exploitation. Increased exploitation can be prevented through traditional fishery management approaches such as the adoption of no-kill, barbless hook, and limited entry regulations. Maintenance of the greatest possible habitat area and prevention of future habitat area reductions will require maintenance of the outlet dam built by beavers. Our study should enhance the long-term viability of the Shoepack Lake muskellunge population and illustrates a useful approach for other unique populations. © Copyright by the American Fisheries Society 2007.

  6. Simulation in Canadian postgraduate emergency medicine training - a national survey.

    PubMed

    Russell, Evan; Hall, Andrew Koch; Hagel, Carly; Petrosoniak, Andrew; Dagnone, Jeffrey Damon; Howes, Daniel

    2018-01-01

    Simulation-based education (SBE) is an important training strategy in emergency medicine (EM) postgraduate programs. This study sought to characterize the use of simulation in FRCPC-EM residency programs across Canada. A national survey was administered to residents and knowledgeable program representatives (PRs) at all Canadian FRCPC-EM programs. Survey question themes included simulation program characteristics, the frequency of resident participation, the location and administration of SBE, institutional barriers, interprofessional involvement, content, assessment strategies, and attitudes about SBE. Resident and PR response rates were 63% (203/321) and 100% (16/16), respectively. Residents reported a median of 20 (range 0-150) hours of annual simulation training, with 52% of residents indicating that the time dedicated to simulation training met their needs. PRs reported the frequency of SBE sessions ranging from weekly to every 6 months, with 15 (94%) programs having an established simulation curriculum. Two (13%) of the programs used simulation for resident assessment, although 15 (94%) of PRs indicated that they would be comfortable with simulation-based assessment. The most common PR-identified barriers to administering simulation were a lack of protected faculty time (75%) and a lack of faculty experience with simulation (56%). Interprofessional involvement in simulation was strongly valued by both residents and PRs. SBE is frequently used by Canadian FRCPC-EM residency programs. However, there exists considerable variability in the structure, frequency, and timing of simulation-based activities. As programs transition to competency-based medical education, national organizations and collaborations should consider the variability in how SBE is administered.

  7. Biochemical Network Stochastic Simulator (BioNetS): software for stochastic modeling of biochemical networks.

    PubMed

    Adalsteinsson, David; McMillen, David; Elston, Timothy C

    2004-03-08

    Intrinsic fluctuations due to the stochastic nature of biochemical reactions can have large effects on the response of biochemical networks. This is particularly true for pathways that involve transcriptional regulation, where generally there are two copies of each gene and the number of messenger RNA (mRNA) molecules can be small. Therefore, there is a need for computational tools for developing and investigating stochastic models of biochemical networks. We have developed the software package Biochemical Network Stochastic Simulator (BioNetS) for efficiently and accurately simulating stochastic models of biochemical networks. BioNetS has a graphical user interface that allows models to be entered in a straightforward manner, and allows the user to specify the type of random variable (discrete or continuous) for each chemical species in the network. The discrete variables are simulated using an efficient implementation of the Gillespie algorithm. For the continuous random variables, BioNetS constructs and numerically solves the appropriate chemical Langevin equations. The software package has been developed to scale efficiently with network size, thereby allowing large systems to be studied. BioNetS runs as a BioSpice agent and can be downloaded from http://www.biospice.org. BioNetS also can be run as a stand alone package. All the required files are accessible from http://x.amath.unc.edu/BioNetS. We have developed BioNetS to be a reliable tool for studying the stochastic dynamics of large biochemical networks. Important features of BioNetS are its ability to handle hybrid models that consist of both continuous and discrete random variables and its ability to model cell growth and division. We have verified the accuracy and efficiency of the numerical methods by considering several test systems.
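
    For the discrete species that BioNetS simulates with the Gillespie algorithm, the core loop can be illustrated on a minimal transcription/degradation model. The rate constants below are illustrative placeholders, not BioNetS defaults, and this sketch does not use the BioNetS software itself.

    ```python
    # Hedged sketch of the Gillespie stochastic simulation algorithm for mRNA copy number.
    import numpy as np

    rng = np.random.default_rng(1)
    k_tx, k_deg = 2.0, 0.1          # transcription (molecules/min) and degradation (1/min), assumed
    t, t_end, m = 0.0, 500.0, 0     # time, horizon, mRNA copy number
    times, counts = [t], [m]

    while t < t_end:
        a = np.array([k_tx, k_deg * m])        # reaction propensities
        a0 = a.sum()
        if a0 == 0:
            break
        t += rng.exponential(1.0 / a0)         # waiting time to the next reaction
        if rng.random() < a[0] / a0:           # choose which reaction fires
            m += 1                             # transcription event
        else:
            m -= 1                             # degradation event
        times.append(t)
        counts.append(m)

    print("final copy number:", counts[-1], "| trajectory mean:", round(float(np.mean(counts)), 1))
    ```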

  8. Parametric Sensitivity Analysis for the Asian Summer Monsoon Precipitation Simulation in the Beijing Climate Center AGCM Version 2.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Ben; Zhang, Yaocun; Qian, Yun

    In this study, we apply an efficient sampling approach and conduct a large number of simulations to explore the sensitivity of the simulated Asian summer monsoon (ASM) precipitation, including the climatological state and interannual variability, to eight parameters related to the cloud and precipitation processes in the Beijing Climate Center AGCM version 2.1 (BCC_AGCM2.1). Our results show that BCC_AGCM2.1 has large biases in simulating the ASM precipitation. The precipitation efficiency and evaporation coefficient for deep convection are the most sensitive parameters in simulating the ASM precipitation. With optimal parameter values, the simulated precipitation climatology could be remarkably improved, e.g., increased precipitation over the equatorial Indian Ocean, suppressed precipitation over the Philippine Sea, and a more realistic Meiyu distribution over Eastern China. The ASM precipitation interannual variability is further analyzed, with a focus on the ENSO impacts. It shows that the simulations with better ASM precipitation climatology can also produce more realistic precipitation anomalies during the El Niño decaying summer. In the low-skill experiments for precipitation climatology, the ENSO-induced precipitation anomalies are most significant over continents (vs. over ocean in observation) in the South Asian monsoon region. More realistic results are derived from the higher-skill experiments, with stronger anomalies over the Indian Ocean and weaker anomalies over India and the western Pacific, favoring more evident easterly anomalies forced by the tropical Indian Ocean warming and a stronger Indian Ocean-western Pacific teleconnection, as observed. Our model results reveal a strong connection between the simulated ASM precipitation climatological state and interannual variability in BCC_AGCM2.1 when key parameters are perturbed.
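
    One common way to build the kind of efficient perturbed-parameter design described above is Latin hypercube sampling over the parameter ranges. The sketch below shows such a design for eight parameters; the parameter names, ranges, and ensemble size are invented placeholders, and Latin hypercube sampling is only one possible choice of "efficient sampling approach".

    ```python
    # Hedged sketch: Latin hypercube design over eight perturbed model parameters.
    import numpy as np
    from scipy.stats import qmc

    param_names = ["precip_efficiency_deep", "evap_coeff_deep", "p3", "p4",
                   "p5", "p6", "p7", "p8"]                     # hypothetical labels
    lower = np.array([0.5, 0.05, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1])
    upper = np.array([1.0, 0.50, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0])

    sampler = qmc.LatinHypercube(d=len(param_names), seed=42)
    design = qmc.scale(sampler.random(n=128), lower, upper)    # 128 hypothetical AGCM runs
    print(design.shape, design[0].round(3))
    ```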

  9. New variable selection methods for zero-inflated count data with applications to the substance abuse field

    PubMed Central

    Buu, Anne; Johnson, Norman J.; Li, Runze; Tan, Xianming

    2011-01-01

    Zero-inflated count data are very common in health surveys. This study develops new variable selection methods for the zero-inflated Poisson regression model. Our simulations demonstrate the negative consequences that arise from ignoring zero-inflation. Among the competing methods, the one-step SCAD method is recommended because it has the highest specificity, sensitivity, and exact fit, and the lowest estimation error. The design of the simulations is based on the special features of two large national databases commonly used in the alcoholism and substance abuse field so that our findings can be easily generalized to real settings. Applications of the methodology are demonstrated by empirical analyses on data from a well-known alcohol study. PMID:21563207
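
    The zero-inflated Poisson (ZIP) model underlying these selection methods can be fitted on simulated zero-inflated counts as sketched below. The SCAD-penalized selection recommended in the paper is not implemented here; this only illustrates the ZIP model itself, and the data-generating values are arbitrary.

    ```python
    # Hedged sketch: fitting a zero-inflated Poisson regression to simulated counts.
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.discrete.count_model import ZeroInflatedPoisson

    rng = np.random.default_rng(0)
    n = 2000
    X = sm.add_constant(rng.normal(size=(n, 3)))
    mu = np.exp(X @ np.array([0.3, 0.8, 0.0, -0.5]))     # only the 1st and 3rd covariates matter
    is_zero = rng.random(n) < 0.3                        # 30% structural zeros
    y = np.where(is_zero, 0, rng.poisson(mu))

    model = ZeroInflatedPoisson(y, X, exog_infl=np.ones((n, 1)), inflation='logit')
    result = model.fit(maxiter=200, disp=False)
    print(result.params.round(2))
    ```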

  10. Synthetic ALSPAC longitudinal datasets for the Big Data VR project.

    PubMed

    Avraam, Demetris; Wilson, Rebecca C; Burton, Paul

    2017-01-01

    Three synthetic datasets - of 15,000, 155,000 and 1,555,000 participants, respectively - were created by simulating eleven cardiac and anthropometric variables from nine collection ages of the ALSPAC birth cohort study. The synthetic datasets retain similar data properties to the ALSPAC study data they are simulated from (covariance matrices, as well as the mean and variance values of the variables) without including the original data itself or disclosing participant information. In this instance, the three synthetic datasets have been utilised in an academia-industry collaboration to build a prototype virtual reality data analysis software, but they could have a broader use in method and software development projects where sensitive data cannot be freely shared.
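
    The general idea of generating synthetic records that preserve a study's means and covariance matrix (but contain no real participants) can be sketched with multivariate normal sampling. The variable names, means and covariance below are invented placeholders, not ALSPAC values, and the study's actual synthesis procedure may differ.

    ```python
    # Hedged sketch: synthetic records matching an assumed mean vector and covariance matrix.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(7)
    variables = ["height_cm", "weight_kg", "sbp_mmHg"]          # hypothetical subset of variables
    means = np.array([165.0, 62.0, 118.0])
    cov = np.array([[80.0, 45.0, 10.0],
                    [45.0, 90.0, 12.0],
                    [10.0, 12.0, 95.0]])

    synthetic = pd.DataFrame(rng.multivariate_normal(means, cov, size=15000),
                             columns=variables)
    print(synthetic.mean().round(1))
    print(synthetic.cov().round(1))
    ```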

  11. Effects of ice shelf basal melt variability on evolution of Thwaites Glacier

    NASA Astrophysics Data System (ADS)

    Hoffman, M. J.; Fyke, J. G.; Price, S. F.; Asay-Davis, X.; Perego, M.

    2017-12-01

    Theory, modeling, and observations indicate that marine ice sheets on a retrograde bed, including Thwaites Glacier, Antarctica, are only conditionally stable. Previous modeling studies have shown that rapid, unstable retreat can occur when steady ice-shelf basal melting causes the grounding line to retreat past restraining bedrock bumps. Here we explore the initiation and evolution of unstable retreat of Thwaites Glacier when the ice-shelf basal melt forcing includes temporal variability mimicking realistic climate variability. We use the three-dimensional, higher-order Model for Prediction Across Scales-Land Ice (MPASLI) model forced with an ice shelf basal melt parameterization derived from previous coupled ice sheet/ocean simulations. We add sinusoidal temporal variability to the melt parameterization that represents shoaling and deepening of Circumpolar Deep Water. We perform an ensemble of 250 year duration simulations with different values for the amplitude, period, and phase of the variability. Preliminary results suggest that, overall, variability leads to slower grounding line retreat and less mass loss than steady simulations. Short period (2 yr) variability leads to similar results as steady forcing, whereas decadal variability can result in up to one-third less mass loss. Differences in phase lead to a large range in mass loss/grounding line retreat, but it is always less than the steady forcing. The timing of ungrounding from each restraining bedrock bump, which is strongly affected by the melt variability, is the rate limiting factor, and variability-driven delays in ungrounding at each bump accumulate. Grounding line retreat in the regions between bedrock bumps is relatively unaffected by ice shelf melt variability. While the results are sensitive to the form of the melt parameterization and its variability, we conclude that decadal period ice shelf melt variability could potentially delay marine ice sheet instability by up to many decades. However, it does not alter the eventual mass loss and sea level rise at centennial scales. The potential differences are significant enough to highlight the need for further observations to constrain the amplitude and period of the modes of climate and ocean variability relevant to Antarctic ice shelf melting.
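
    The sinusoidal melt variability described above can be sketched as a steady basal-melt rate modulated by a sinusoid whose amplitude, period, and phase vary across an ensemble. The baseline rate and the specific values below are illustrative assumptions, not the coupled-model-derived parameterization used in the study.

    ```python
    # Hedged sketch: sinusoidal modulation of a steady ice-shelf basal-melt forcing.
    import numpy as np

    years = np.arange(0.0, 250.0, 0.1)          # 250-year simulation time axis
    baseline_melt = 20.0                        # m/yr, stand-in for the steady forcing

    def melt_forcing(t, amplitude, period_yr, phase):
        """Steady melt modulated by a sinusoid mimicking CDW shoaling and deepening."""
        return baseline_melt * (1.0 + amplitude * np.sin(2 * np.pi * t / period_yr + phase))

    ensemble = [melt_forcing(years, a, p, ph)
                for a in (0.25, 0.5) for p in (2.0, 20.0) for ph in (0.0, np.pi / 2)]
    print(len(ensemble), "forcing members; mean melt of one decadal member =",
          round(float(ensemble[2].mean()), 1), "m/yr")
    ```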

  12. Communication cost of simulating Bell correlations.

    PubMed

    Toner, B F; Bacon, D

    2003-10-31

    What classical resources are required to simulate quantum correlations? For the simplest and most important case of local projective measurements on an entangled Bell pair state, we show that exact simulation is possible using local hidden variables augmented by just one bit of classical communication. Certain quantum teleportation experiments, which teleport a single qubit, therefore admit a local hidden variables model.
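
    A Monte Carlo check of the kind of one-bit protocol proposed in this paper can be sketched as below, following the construction as it is commonly stated (shared random unit vectors plus a single communicated bit reproducing the singlet correlation -a·b). The sign conventions here are our reconstruction for illustration, not a verbatim transcription of the paper's protocol.

    ```python
    # Hedged sketch: local hidden variables plus one classical bit vs. the singlet correlation.
    import numpy as np

    rng = np.random.default_rng(3)

    def random_unit_vectors(n):
        v = rng.normal(size=(n, 3))
        return v / np.linalg.norm(v, axis=1, keepdims=True)

    def simulate(a_hat, b_hat, n=200_000):
        lam1, lam2 = random_unit_vectors(n), random_unit_vectors(n)   # shared randomness
        s1, s2 = np.sign(lam1 @ a_hat), np.sign(lam2 @ a_hat)
        alice = -s1                                                   # Alice's +/-1 outcome
        c = s1 * s2                                                   # the single communicated bit
        bob = np.sign((lam1 + c[:, None] * lam2) @ b_hat)             # Bob's +/-1 outcome
        return np.mean(alice * bob)

    a_hat = np.array([0.0, 0.0, 1.0])
    b_hat = np.array([np.sin(0.7), 0.0, np.cos(0.7)])
    print("simulated:", round(simulate(a_hat, b_hat), 3),
          "| quantum singlet:", round(float(-a_hat @ b_hat), 3))
    ```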

  13. Intercomparison of 3D pore-scale flow and solute transport simulation methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Xiaofan; Mehmani, Yashar; Perkins, William A.

    2016-09-01

    Multiple numerical approaches have been developed to simulate porous media fluid flow and solute transport at the pore scale. These include 1) methods that explicitly model the three-dimensional geometry of pore spaces and 2) methods that conceptualize the pore space as a topologically consistent set of stylized pore bodies and pore throats. In previous work we validated a model of the first type, using computational fluid dynamics (CFD) codes employing a standard finite volume method (FVM), against magnetic resonance velocimetry (MRV) measurements of pore-scale velocities. Here we expand that validation to include additional models of the first type based on the lattice Boltzmann method (LBM) and smoothed particle hydrodynamics (SPH), as well as a model of the second type, a pore-network model (PNM). The PNM approach used in the current study was recently improved and demonstrated to accurately simulate solute transport in a two-dimensional experiment. While the PNM approach is computationally much less demanding than direct numerical simulation methods, the effect of conceptualizing complex three-dimensional pore geometries on solute transport in the manner of PNMs has not been fully determined. We apply all four approaches (FVM-based CFD, LBM, SPH and PNM) to simulate pore-scale velocity distributions and (for capable codes) nonreactive solute transport, and intercompare the model results. Comparisons are drawn both in terms of macroscopic variables (e.g., permeability, solute breakthrough curves) and microscopic variables (e.g., local velocities and concentrations). Generally good agreement was achieved among the various approaches, but some differences were observed depending on the model context. The intercomparison work was challenging because of variable capabilities of the codes, and inspired some code enhancements to allow consistent comparison of flow and transport simulations across the full suite of methods. This study provides support for confidence in a variety of pore-scale modeling methods and motivates further development and application of pore-scale simulation methods.
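
    One of the macroscopic comparison variables mentioned above, permeability, can be recovered from a simulated pore-scale flow via Darcy's law, k = Q μ L / (A ΔP). The numbers in the sketch below are invented, not values from the intercomparison.

    ```python
    # Hedged sketch: Darcy permeability from a simulated pore-scale flow.
    mu = 1.0e-3            # Pa s, water viscosity
    L = 5.0e-3             # m, sample length in the flow direction
    A = (2.0e-3) ** 2      # m^2, cross-sectional area
    dP = 10.0              # Pa, applied pressure drop
    Q = 1.2e-11            # m^3/s, flux integrated from the simulated velocity field (hypothetical)

    k = Q * mu * L / (A * dP)
    print(f"permeability ~ {k:.2e} m^2 ({k / 9.869e-13:.2f} darcy)")
    ```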

  14. Mixture Factor Analysis for Approximating a Nonnormally Distributed Continuous Latent Factor with Continuous and Dichotomous Observed Variables

    ERIC Educational Resources Information Center

    Wall, Melanie M.; Guo, Jia; Amemiya, Yasuo

    2012-01-01

    Mixture factor analysis is examined as a means of flexibly estimating nonnormally distributed continuous latent factors in the presence of both continuous and dichotomous observed variables. A simulation study compares mixture factor analysis with normal maximum likelihood (ML) latent factor modeling. Different results emerge for continuous versus…

  15. Testing for Two-Way Interactions in the Multigroup Common Factor Model

    ERIC Educational Resources Information Center

    van Smeden, Maarten; Hessen, David J.

    2013-01-01

    In this article, a 2-way multigroup common factor model (MG-CFM) is presented. The MG-CFM can be used to estimate interaction effects between 2 grouping variables on 1 or more hypothesized latent variables. For testing the significance of such interactions, a likelihood ratio test is presented. In a simulation study, the robustness of the…

  16. Collinear Latent Variables in Multilevel Confirmatory Factor Analysis: A Comparison of Maximum Likelihood and Bayesian Estimations

    ERIC Educational Resources Information Center

    Can, Seda; van de Schoot, Rens; Hox, Joop

    2015-01-01

    Because variables may be correlated in the social and behavioral sciences, multicollinearity might be problematic. This study investigates the effect of collinearity manipulated in within and between levels of a two-level confirmatory factor analysis by Monte Carlo simulation. Furthermore, the influence of the size of the intraclass correlation…

  17. Sensitivity analysis of tracer transport in variably saturated soils at USDA-ARS OPE3 field site

    USDA-ARS?s Scientific Manuscript database

    The objective of this study was to assess the effects of uncertainties in hydrologic and geochemical parameters on the results of simulations of the tracer transport in variably saturated soils at the USDA-ARS OPE3 field site. A tracer experiment with a pulse of KCL solution applied to an irrigatio...

  18. Selecting climate change scenarios using impact-relevant sensitivities

    Treesearch

    Julie A. Vano; John B. Kim; David E. Rupp; Philip W. Mote

    2015-01-01

    Climate impact studies often require the selection of a small number of climate scenarios. Ideally, a subset would have simulations that both (1) appropriately represent the range of possible futures for the variable/s most important to the impact under investigation and (2) come from global climate models (GCMs) that provide plausible results for future climate in the...

  19. Seasonal and interannual variability in wetland methane emissions simulated by CLM4Me' and CAM-chem and comparisons to observations of concentrations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meng, L.; Paudel, R.; Hess, P. G. M.

    Understanding the temporal and spatial variation of wetland methane emissions is essential to the estimation of the global methane budget. Our goal for this study is three-fold: (i) to evaluate the wetland methane fluxes simulated in two versions of the Community Land Model, the Carbon-Nitrogen (CN; i.e., CLM4.0) and the Biogeochemistry (BGC; i.e., CLM4.5) versions using the methane emission model CLM4Me' so as to determine the sensitivity of the emissions to the underlying carbon model; (ii) to compare the simulated atmospheric methane concentrations to observations, including latitudinal gradients and interannual variability so as to determine the extent to which the atmospheric observations constrain the emissions; (iii) to understand the drivers of seasonal and interannual variability in atmospheric methane concentrations. Simulations of the transport and removal of methane use the Community Atmosphere Model with chemistry (CAM-chem) model in conjunction with CLM4Me' methane emissions from both CN and BGC simulations and other methane emission sources from literature. In each case we compare model-simulated atmospheric methane concentration with observations. In addition, we simulate the atmospheric concentrations based on the TransCom wetland and rice paddy emissions derived from a different terrestrial ecosystem model, Vegetation Integrative Simulator for Trace gases (VISIT). Our analysis indicates CN wetland methane emissions are higher in the tropics and lower at high latitudes than emissions from BGC. In CN, methane emissions decrease from 1993 to 2004 while this trend does not appear in the BGC version. In the CN version, methane emission variations follow satellite-derived inundation wetlands closely. However, they are dissimilar in BGC due to its different carbon cycle. CAM-chem simulations with CLM4Me' methane emissions suggest that both prescribed anthropogenic and predicted wetlands methane emissions contribute substantially to seasonal and interannual variability in atmospheric methane concentration. Simulated atmospheric CH4 concentrations in CAM-chem are highly correlated with observations at most of the 14 measurement stations evaluated with an average correlation between 0.71 and 0.80 depending on the simulation (for the period of 1993–2004 for most stations based on data availability). Our results suggest that different spatial patterns of wetland emissions can have significant impacts on Northern and Southern hemisphere (N–S) atmospheric CH4 concentration gradients and growth rates. In conclusion, this study suggests that both anthropogenic and wetland emissions have significant contributions to seasonal and interannual variations in atmospheric CH4 concentrations. However, our analysis also indicates the existence of large uncertainties in terms of spatial patterns and magnitude of global wetland methane budgets, and that substantial uncertainty comes from the carbon model underlying the methane flux modules.

  1. Modular, high power, variable R dynamic electrical load simulator

    NASA Technical Reports Server (NTRS)

    Joncas, K. P.

    1974-01-01

    The design of a previously developed basic variable R load simulator was extended to increase its power dissipation and transient handling capabilities. The delivered units satisfy all design requirements and provide a high-power, modular simulation capability uniquely suited to the simulation of complex load responses. In addition to presenting conclusions, recommendations, and pertinent background information, the report covers program accomplishments; describes the simulator basic circuits, transfer characteristic, protective features, assembly, and specifications; indicates the results of simulator evaluation, including burn-in and acceptance testing; provides acceptance test data; and summarizes the monthly progress reports.

  2. Impacts of correcting the inter-variable correlation of climate model outputs on hydrological modeling

    NASA Astrophysics Data System (ADS)

    Chen, Jie; Li, Chao; Brissette, François P.; Chen, Hua; Wang, Mingna; Essou, Gilles R. C.

    2018-05-01

    Bias correction is usually implemented prior to using climate model outputs for impact studies. However, bias correction methods that are commonly used treat climate variables independently and often ignore inter-variable dependencies. The effects of ignoring such dependencies on impact studies need to be investigated. This study aims to assess the impacts of correcting the inter-variable correlation of climate model outputs on hydrological modeling. To this end, a joint bias correction (JBC) method which corrects the joint distribution of two variables as a whole is compared with an independent bias correction (IBC) method; this is considered in terms of correcting simulations of precipitation and temperature from 26 climate models for hydrological modeling over 12 watersheds located in various climate regimes. The results show that the simulated precipitation and temperature are considerably biased not only in the individual distributions, but also in their correlations, which in turn result in biased hydrological simulations. In addition to reducing the biases of the individual characteristics of precipitation and temperature, the JBC method can also reduce the bias in precipitation-temperature (P-T) correlations. In terms of hydrological modeling, the JBC method performs significantly better than the IBC method for 11 out of the 12 watersheds over the calibration period. For the validation period, the advantages of the JBC method are greatly reduced as the performance becomes dependent on the watershed, GCM and hydrological metric considered. For arid/tropical and snowfall-rainfall-mixed watersheds, JBC performs better than IBC. For snowfall- or rainfall-dominated watersheds, however, the two methods behave similarly, with IBC performing somewhat better than JBC. Overall, the results emphasize the advantages of correcting the P-T correlation when using climate model-simulated precipitation and temperature to assess the impact of climate change on watershed hydrology. However, a thorough validation and a comparison with other methods are recommended before using the JBC method, since it may perform worse than the IBC method for some cases due to bias nonstationarity of climate model outputs.
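
    As a point of reference for the bias-correction discussion above, the sketch below illustrates a univariate empirical quantile mapping of the kind an independent bias correction (IBC) applies to each variable separately; a joint method (JBC) additionally corrects the precipitation-temperature dependence, which this sketch does not attempt. The gamma-distributed "observed" and "model" series are invented placeholders.

    ```python
    # Hedged sketch: univariate empirical quantile-mapping bias correction.
    import numpy as np

    rng = np.random.default_rng(11)
    obs_pr = rng.gamma(shape=0.8, scale=6.0, size=3000)      # stand-in observed daily precipitation
    gcm_pr = rng.gamma(shape=0.6, scale=9.0, size=3000)      # stand-in biased model precipitation

    def quantile_map(raw, model_ref, obs_ref):
        """Map each raw model value to the observed value at the same empirical quantile."""
        q = np.searchsorted(np.sort(model_ref), raw) / len(model_ref)
        return np.quantile(obs_ref, np.clip(q, 0.0, 1.0))

    corrected = quantile_map(gcm_pr, gcm_pr, obs_pr)
    print("means  obs / model / corrected:", round(obs_pr.mean(), 2),
          round(gcm_pr.mean(), 2), round(corrected.mean(), 2))
    ```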

  3. Validation of the Carotid Intima-Media Thickness Variability: Can Manual Segmentations Be Trusted as Ground Truth?

    PubMed

    Meiburger, Kristen M; Molinari, Filippo; Wong, Justin; Aguilar, Luis; Gallo, Diego; Steinman, David A; Morbiducci, Umberto

    2016-07-01

    The common carotid artery intima-media thickness (IMT) is widely accepted and used as an indicator of atherosclerosis. Recent studies, however, have found that the irregularity of the IMT along the carotid artery wall has a stronger correlation with atherosclerosis than the IMT itself. We set out to validate IMT variability (IMTV), a parameter defined to assess IMT irregularities along the wall. In particular, we analyzed whether or not manual segmentations of the lumen-intima and media-adventitia can be considered reliable in calculation of the IMTV parameter. To do this, we used a total of 60 simulated ultrasound images with a priori IMT and IMTV values. The images, simulated using the Fast And Mechanistic Ultrasound Simulation software, presented five different morphologies, four nominal IMT values and three different levels of variability along the carotid artery wall (no variability, small variability and large variability). Three experts traced the lumen-intima (LI) and media-adventitia (MA) profiles, and two automated algorithms were employed to obtain the LI and MA profiles. One expert also re-traced the LI and MA profiles to test intra-reader variability. The average IMTV measurements of the profiles used to simulate the longitudinal B-mode images were 0.002 ± 0.002, 0.149 ± 0.035 and 0.286 ± 0.068 mm for the cases of no variability, small variability and large variability, respectively. The IMTV measurements of one of the automated algorithms were statistically similar (p > 0.05, Wilcoxon signed rank) when considering small and large variability, but significantly different when considering no variability (p < 0.05, Wilcoxon signed rank). The second automated algorithm resulted in statistically similar values in the small variability case. Two readers' manual tracings, however, produced IMTV measurements with a statistically significant difference considering all three variability levels, whereas the third reader found a statistically significant difference in both the no variability and large variability cases. Moreover, the error range between the reader and automatic IMTV values was approximately 0.15 mm, which is of the same order as the small IMTV values, indicating that manual and automatic IMTV readings should not be used interchangeably in clinical practice. On the basis of our findings, we conclude that expert manual tracings should not be considered reliable in IMTV measurement and, therefore, should not be trusted as ground truth. On the other hand, our automated algorithm was found to be more reliable, indicating that automated techniques could foster analysis of the carotid artery intima-media thickness irregularity. Copyright © 2016 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
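
    The sketch below illustrates how IMT and an IMT-variability metric can be derived from lumen-intima (LI) and media-adventitia (MA) profiles along the wall. The paper's formal IMTV definition is not reproduced here; the standard deviation of IMT along the wall is used only as an illustrative stand-in, and the profiles are synthetic.

    ```python
    # Hedged sketch: IMT profile and an illustrative along-wall variability measure.
    import numpy as np

    x = np.linspace(0.0, 30.0, 300)                       # mm along the carotid wall
    li = 0.05 * np.sin(2 * np.pi * x / 10.0)              # synthetic lumen-intima profile (mm)
    ma = li + 0.7 + 0.15 * np.sin(2 * np.pi * x / 6.0)    # synthetic media-adventitia profile (mm)

    imt_profile = ma - li                                 # IMT at each longitudinal position
    imt_mean = imt_profile.mean()
    imtv = imt_profile.std(ddof=1)                        # illustrative variability metric
    print(f"IMT = {imt_mean:.3f} mm, IMTV (std along wall) = {imtv:.3f} mm")
    ```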

  4. Inter-annual variability of the Mediterranean thermohaline circulation in Med-CORDEX simulations

    NASA Astrophysics Data System (ADS)

    Vittoria Struglia, Maria; Adani, Mario; Carillo, Adriana; Pisacane, Giovanna; Sannino, Gianmaria; Beuvier, Jonathan; Lovato, Tomas; Sevault, Florence; Vervatis, Vassilios

    2016-04-01

    Recent atmospheric reanalysis products, such as ERA40 and ERA-Interim, and their regional dynamical downscaling prompted the HyMeX/Med-CORDEX community to perform hind-cast simulations of the Mediterranean Sea, giving the opportunity to evaluate the response of different ocean models to a realistic inter-annual atmospheric forcing. Ocean numerical modeling studies have been steadily improving over the last decade through hind-cast processing, and are complementary to observations in studying the relative importance of the mechanisms playing a role in ocean variability, either external forcing or internal ocean variability. This work presents a review and an inter-comparison of the most recent hind-cast simulations of the Mediterranean Sea circulation, produced in the framework of the Med-CORDEX initiative, at resolutions spanning from 1/8° to 1/16°. The richness of the simulations available for this study is exploited to address the effects of increasing resolution, both of the models and of the forcing, of the initialization procedure, and of the prescription of the atmospheric boundary conditions, which are particularly relevant for modeling a realistic THC, in the perspective of fully coupled regional ocean-atmosphere models. The mean circulation is well reproduced by all the simulations. However, the horizontal resolution of both the atmospheric forcing and the ocean model plays a fundamental role in the reproduction of some specific features of both sub-basins, and important differences can be observed between low- and high-resolution atmospheric forcing. We analyze the mean circulation on both the long-term and decadal time scales, and the represented inter-annual variability of intermediate and deep water mass formation processes in both the Eastern and Western sub-basins, finding that models agree with observations for specific events, such as the 1992-1993 Eastern Mediterranean Transient and the 2005-2006 event in the Gulf of Lion. Long-term trends of the hydrological properties have been investigated at sub-basin scale and interpreted in terms of the response to forcing and boundary conditions, with detectable differences mainly due either to the different initialization and spin-up procedures or to the different prescription of Atlantic boundary conditions.

  5. Nonlinear vs. linear biasing in Trp-cage folding simulations

    NASA Astrophysics Data System (ADS)

    Spiwok, Vojtěch; Oborský, Pavel; Pazúriková, Jana; Křenek, Aleš; Králová, Blanka

    2015-03-01

    Biased simulations have great potential for the study of slow processes, including protein folding. Atomic motions in molecules are nonlinear, which suggests that simulations with enhanced sampling of collective motions traced by nonlinear dimensionality reduction methods may perform better than linear ones. In this study, we compare an unbiased folding simulation of the Trp-cage miniprotein with metadynamics simulations using both linear (principal component analysis) and nonlinear (Isomap) low dimensional embeddings as collective variables. Folding of the mini-protein was successfully simulated in 200 ns simulations with both linear and nonlinear motion biasing. The folded state was correctly predicted as the free energy minimum in both simulations. We found that the advantage of linear motion biasing is that it can sample a larger conformational space, whereas the advantage of nonlinear motion biasing lies in slightly better resolution of the resulting free energy surface. In terms of sampling efficiency, both methods are comparable.
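
    The linear collective-variable construction mentioned above amounts to a principal component analysis of aligned trajectory coordinates, with the leading components used as biasing variables. In the sketch below a random array stands in for a real, aligned Trp-cage trajectory; no MD or metadynamics engine is invoked.

    ```python
    # Hedged sketch: PCA-based collective variables from trajectory coordinates.
    import numpy as np

    n_frames, n_atoms = 5000, 144                      # hypothetical trajectory dimensions
    coords = np.random.randn(n_frames, n_atoms * 3)    # stand-in for aligned Cartesian coordinates

    X = coords - coords.mean(axis=0)                   # remove the mean structure
    cov = X.T @ X / (n_frames - 1)
    evals, evecs = np.linalg.eigh(cov)                 # eigenvalues in ascending order
    order = np.argsort(evals)[::-1]

    cv = X @ evecs[:, order[:2]]                       # first two PCs as collective variables
    explained = evals[order[:2]].sum() / evals.sum()
    print(f"CV array shape {cv.shape}, variance explained by 2 PCs: {explained:.2f}")
    ```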

  7. Unbiased split variable selection for random survival forests using maximally selected rank statistics.

    PubMed

    Wright, Marvin N; Dankowski, Theresa; Ziegler, Andreas

    2017-04-15

    The most popular approach for analyzing survival data is the Cox regression model. The Cox model may, however, be misspecified, and its proportionality assumption may not always be fulfilled. An alternative approach for survival prediction is random forests for survival outcomes. The standard split criterion for random survival forests is the log-rank test statistic, which favors splitting variables with many possible split points. Conditional inference forests avoid this split variable selection bias. However, linear rank statistics are utilized by default in conditional inference forests to select the optimal splitting variable, which cannot detect non-linear effects in the independent variables. An alternative is to use maximally selected rank statistics for the split point selection. As in conditional inference forests, splitting variables are compared on the p-value scale. However, instead of the conditional Monte-Carlo approach used in conditional inference forests, p-value approximations are employed. We describe several p-value approximations and the implementation of the proposed random forest approach. A simulation study demonstrates that unbiased split variable selection is possible. However, there is a trade-off between unbiased split variable selection and runtime. In benchmark studies of prediction performance on simulated and real datasets, the new method performs better than random survival forests if informative dichotomous variables are combined with uninformative variables with more categories and better than conditional inference forests if non-linear covariate effects are included. In a runtime comparison, the method proves to be computationally faster than both alternatives, if a simple p-value approximation is used. Copyright © 2017 John Wiley & Sons, Ltd.

  8. Problems in using p-curve analysis and text-mining to detect rate of p-hacking and evidential value

    PubMed Central

    Thompson, Paul A.

    2016-01-01

    Background. The p-curve is a plot of the distribution of p-values reported in a set of scientific studies. Comparisons between ranges of p-values have been used to evaluate fields of research in terms of the extent to which studies have genuine evidential value, and the extent to which they suffer from bias in the selection of variables and analyses for publication, p-hacking. Methods. p-hacking can take various forms. Here we used R code to simulate the use of ghost variables, where an experimenter gathers data on several dependent variables but reports only those with statistically significant effects. We also examined a text-mined dataset used by Head et al. (2015) and assessed its suitability for investigating p-hacking. Results. We show that when there is ghost p-hacking, the shape of the p-curve depends on whether dependent variables are intercorrelated. For uncorrelated variables, simulated p-hacked data do not give the “p-hacking bump” just below .05 that is regarded as evidence of p-hacking, though there is a negative skew when simulated variables are inter-correlated. The way p-curves vary according to features of underlying data poses problems when automated text mining is used to detect p-values in heterogeneous sets of published papers. Conclusions. The absence of a bump in the p-curve is not indicative of lack of p-hacking. Furthermore, while studies with evidential value will usually generate a right-skewed p-curve, we cannot treat a right-skewed p-curve as an indicator of the extent of evidential value, unless we have a model specific to the type of p-values entered into the analysis. We conclude that it is not feasible to use the p-curve to estimate the extent of p-hacking and evidential value unless there is considerable control over the type of data entered into the analysis. In particular, p-hacking with ghost variables is likely to be missed. PMID:26925335
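
    The "ghost variable" simulation described above (originally in R) can be re-sketched in Python: each simulated experiment measures several uncorrelated dependent variables under a true null, reports only the smallest p-value, and the significant reported p-values are binned into a p-curve. The sample sizes and counts are arbitrary, and this is a reconstruction of the idea, not the authors' code.

    ```python
    # Hedged sketch: ghost-variable p-hacking under the null and the resulting p-curve.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    n_experiments, n_dvs, n_per_group = 5000, 5, 20
    reported = []

    for _ in range(n_experiments):
        pvals = [stats.ttest_ind(rng.normal(size=n_per_group),
                                 rng.normal(size=n_per_group)).pvalue
                 for _ in range(n_dvs)]                 # several DVs, no true effect
        reported.append(min(pvals))                     # only the "best" result is reported

    sig = np.array([p for p in reported if p < 0.05])
    bins = np.histogram(sig, bins=np.arange(0.0, 0.051, 0.01))[0]
    print("p-curve counts in (0,.01], (.01,.02], ..., (.04,.05]:", bins.tolist())
    ```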

  9. Impact of Urbanization on Spatial Variability of Rainfall-A case study of Mumbai city with WRF Model

    NASA Astrophysics Data System (ADS)

    Mathew, M.; Paul, S.; Devanand, A.; Ghosh, S.

    2015-12-01

    Urban precipitation enhancement has been identified over many cities in India by previous studies. Anthropogenic effects, such as the change in land cover from hilly forested areas to flat topography covered with concrete infrastructure, affect the local weather in much the same way that greenhouse gases affect climate. Urbanization could alter the large-scale forcings to such an extent that it brings about temporal and spatial changes in urban weather. The present study investigates the physical processes involved in urban forcing, such as the sudden increase in wind velocity as air is channelled between dense arrays of buildings, which gives rise to turbulence and air-mass instability in the urban boundary layer and in turn alters rainfall initiation and distribution. A numerical model study is conducted over the Mumbai metropolitan area, which lies on the west coast of India, to assess the effect of urban morphology on the increase in the number of extreme rainfall events at specific locations. An attempt has been made to simulate twenty extreme rainfall events that occurred during the summer monsoon of 2014 using the high-resolution WRF-ARW (Weather Research and Forecasting-Advanced Research WRF) model to assess the urban land cover mechanisms that influence precipitation variability over this spatially varying urbanized region. The result is tested against simulations with altered land use. The correlation of precipitation with the spatial variability of land use is found using a detailed urban land use classification. The initial and boundary conditions for running the model were obtained from ECMWF (European Centre for Medium-Range Weather Forecasts) reanalysis data with a horizontal resolution of 0.75° x 0.75°. The high-resolution simulations show significant spatial variability in accumulated rainfall within a few kilometres. Understanding the spatial variability of precipitation will help in planning and managing the built environment more efficiently.

  10. Mechanical and optical behavior of a tunable liquid lens using a variable cross section membrane: modeling results

    NASA Astrophysics Data System (ADS)

    Flores-Bustamante, Mario C.; Rosete-Aguilar, Martha; Calixto, Sergio

    2016-03-01

    A lens containing a liquid medium and having at least one elastic membrane as one of its components is known as an elastic membrane lens (EML). The elastic membrane may have a constant or variable thickness. The optical properties of the EML change when the profile of its elastic membrane(s) is modified. EMLs formed of constant-thickness elastic membranes have been studied extensively; however, information on EMLs using elastic membranes of variable thickness is limited. In this work, we present simulation results of the mechanical and optical behavior of two EMLs with variable-thickness (convex-plane) membranes, whose surface profiles were modified by increases in the liquid volume. The convex-plane membranes were modeled, and their mechanical behavior simulated, using Solidworks® software, from which the surface points of the deformed elastic lens were obtained. Experimental stress-strain data, obtained from a simple tensile test of silicone rubber according to the ASTM D638 norm, were used in the simulation. Algebraic expressions for the meridional profiles of the first and second surfaces of the deformed convex-plane membranes (the Schwarzschild formula with up to four deformation coefficients, in a cylindrical coordinate system (r, z)) were obtained using the results from Solidworks® and a program written in Mathematica®. The optical performance of the EML was obtained by simulation using the OSLO® software and the algebraic expressions obtained in Mathematica®.
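
    The profile-fitting step can be sketched as a least-squares fit of an aspheric sag equation to the deformed-surface points (Python/SciPy; the synthetic sample points and the exact sag form, a base conic plus four even deformation terms, are assumptions standing in for the Solidworks output and the Schwarzschild expression used in the paper):

```python
import numpy as np
from scipy.optimize import curve_fit

def sag(r, c, k, a4, a6, a8, a10):
    """Aspheric meridional profile z(r): base conic plus four even deformation terms."""
    # Guard keeps the square root valid for poor trial parameters during fitting
    root = np.sqrt(np.maximum(1 - (1 + k) * c**2 * r**2, 1e-9))
    return c * r**2 / (1 + root) + a4 * r**4 + a6 * r**6 + a8 * r**8 + a10 * r**10

# Synthetic "deformed membrane" surface points standing in for the exported data (mm)
r_pts = np.linspace(0.0, 4.0, 40)
true_z = sag(r_pts, 0.12, -0.8, 1e-4, -2e-6, 3e-8, 0.0)
z_pts = true_z + np.random.default_rng(8).normal(0.0, 2e-4, r_pts.size)

p0 = [0.1, -1.0, 0.0, 0.0, 0.0, 0.0]               # initial guess for the fit
popt, _ = curve_fit(sag, r_pts, z_pts, p0=p0)
print("fitted curvature, conic, and deformation coefficients:", np.round(popt, 6))
```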

  11. False-Positive Rate of AKI Using Consensus Creatinine–Based Criteria

    PubMed Central

    Lin, Jennie; Fernandez, Hilda; Shashaty, Michael G.S.; Negoianu, Dan; Testani, Jeffrey M.; Berns, Jeffrey S.; Parikh, Chirag R.

    2015-01-01

    Background and objectives: Use of small changes in serum creatinine to diagnose AKI allows for earlier detection but may increase diagnostic false-positive rates because of inherent laboratory and biologic variabilities of creatinine. Design, setting, participants, & measurements: We examined serum creatinine measurement characteristics in a prospective observational clinical reference cohort of 2267 adult patients with AKI by Kidney Disease Improving Global Outcomes creatinine criteria and used these data to create a simulation cohort to model AKI false-positive rates. We simulated up to seven successive blood draws on an equal population of hypothetical patients with unchanging true serum creatinine values. Error terms generated from laboratory and biologic variabilities were added to each simulated patient's true serum creatinine value to obtain the simulated measured serum creatinine for each blood draw. We determined the proportion of patients who would be erroneously diagnosed with AKI by Kidney Disease Improving Global Outcomes creatinine criteria. Results: Within the clinical cohort, 75.0% of patients received four serum creatinine draws within at least one 48-hour period during hospitalization. After four simulated creatinine measurements that accounted for laboratory variability calculated from assay characteristics and 4.4% biologic variability determined from the clinical cohort and publicly available data, the overall false-positive rate for AKI diagnosis was 8.0% (interquartile range = 7.9%-8.1%), whereas patients with true serum creatinine ≥1.5 mg/dl (representing 21% of the clinical cohort) had a false-positive AKI diagnosis rate of 30.5% (interquartile range = 30.1%-30.9%) versus 2.0% (interquartile range = 1.9%-2.1%) in patients with true serum creatinine values <1.5 mg/dl (P<0.001). Conclusions: Use of small serum creatinine changes to diagnose AKI is limited by high false-positive rates caused by inherent variability of serum creatinine at higher baseline values, potentially misclassifying patients with CKD in AKI studies. PMID:26336912

  12. False-Positive Rate of AKI Using Consensus Creatinine-Based Criteria.

    PubMed

    Lin, Jennie; Fernandez, Hilda; Shashaty, Michael G S; Negoianu, Dan; Testani, Jeffrey M; Berns, Jeffrey S; Parikh, Chirag R; Wilson, F Perry

    2015-10-07

    Use of small changes in serum creatinine to diagnose AKI allows for earlier detection but may increase diagnostic false-positive rates because of inherent laboratory and biologic variabilities of creatinine. We examined serum creatinine measurement characteristics in a prospective observational clinical reference cohort of 2267 adult patients with AKI by Kidney Disease Improving Global Outcomes creatinine criteria and used these data to create a simulation cohort to model AKI false-positive rates. We simulated up to seven successive blood draws on an equal population of hypothetical patients with unchanging true serum creatinine values. Error terms generated from laboratory and biologic variabilities were added to each simulated patient's true serum creatinine value to obtain the simulated measured serum creatinine for each blood draw. We determined the proportion of patients who would be erroneously diagnosed with AKI by Kidney Disease Improving Global Outcomes creatinine criteria. Within the clinical cohort, 75.0% of patients received four serum creatinine draws within at least one 48-hour period during hospitalization. After four simulated creatinine measurements that accounted for laboratory variability calculated from assay characteristics and 4.4% of biologic variability determined from the clinical cohort and publicly available data, the overall false-positive rate for AKI diagnosis was 8.0% (interquartile range =7.9%-8.1%), whereas patients with true serum creatinine ≥1.5 mg/dl (representing 21% of the clinical cohort) had a false-positive AKI diagnosis rate of 30.5% (interquartile range =30.1%-30.9%) versus 2.0% (interquartile range =1.9%-2.1%) in patients with true serum creatinine values <1.5 mg/dl (P<0.001). Use of small serum creatinine changes to diagnose AKI is limited by high false-positive rates caused by inherent variability of serum creatinine at higher baseline values, potentially misclassifying patients with CKD in AKI studies. Copyright © 2015 by the American Society of Nephrology.
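
    A minimal sketch of the simulation logic (Python; the baseline creatinine distribution and the analytic CV are assumptions, the 4.4% biologic variability is taken from the abstract, and only the KDIGO 0.3 mg/dl absolute-increase criterion within the 48-hour window is checked):

```python
import numpy as np

rng = np.random.default_rng(0)
n_patients, n_draws = 100_000, 4
cv_biologic = 0.044            # 4.4% biologic variability (from the abstract)
cv_lab = 0.027                 # assumed analytic CV; the paper derives this from assay data

# Assumed baseline distribution of true, unchanging serum creatinine (mg/dl)
true_scr = rng.lognormal(mean=np.log(1.0), sigma=0.4, size=n_patients)
cv_total = np.sqrt(cv_biologic**2 + cv_lab**2)

# Simulated measured creatinine: true value plus multiplicative error per draw
measured = true_scr[:, None] * (1 + rng.normal(0.0, cv_total, size=(n_patients, n_draws)))

# KDIGO absolute criterion within 48 h: any draw >= 0.3 mg/dl above the running minimum
running_min = np.minimum.accumulate(measured, axis=1)
false_positive = (measured[:, 1:] - running_min[:, :-1] >= 0.3).any(axis=1)

print(f"overall false-positive rate: {false_positive.mean():.3f}")
for label, mask in [(">=1.5 mg/dl", true_scr >= 1.5), ("<1.5 mg/dl", true_scr < 1.5)]:
    print(f"  true SCr {label}: {false_positive[mask].mean():.3f}")
```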

  13. Evaluation of ozone hindcasts: optimal data sets to use, and results from simulations with the GMI model for 1990-2010.

    NASA Astrophysics Data System (ADS)

    Logan, J. A.; Megretskaia, I.; Liu, J.; Rodriguez, J. M.; Strahan, S. E.; Damon, M.; Steenrod, S. D.

    2012-12-01

    Simulations of atmospheric composition in the recent past (hindcasts) are a valuable tool for determining the causes of interannual variability (IAV) and trends in tropospheric ozone, including factors such as anthropogenic emissions, biomass burning, stratospheric input, and variability in meteorology. We will review the ozone data sets (balloon, satellite, and surface) that are the most reliable for evaluating hindcasts, and demonstrate their application with the GMI model. The GMI model is driven by the GEOS-5/MERRA reanalysis and includes both stratospheric and tropospheric chemistry. Preliminary analysis of a simulation for 1990-2010 using constant fossil fuel emissions is promising. The model reproduces the recent IAV in ozone in the lowermost stratosphere seen in MLS and sonde data, as well as the IAV seen in sonde data in the lower stratosphere since 1995, and captures much of the IAV and short-term trends in surface ozone at remote sites, showing the influence of variability in dynamics. There was considerable IAV in ozone in the lowermost stratosphere in the Aura period, but almost none at European alpine sites in winter/spring, when ozone at 150 hPa has been shown to be correlated with that at 700 hPa in earlier years. The model matches the IAV in alpine ozone in Europe in July-September, including the high values in heat-waves, showing the role of variability in meteorology. A focus on IAV in each season is essential. The model matches IAV in MLS in the upper troposphere, TES tropical ozone, and the tropospheric ozone column (OMI/MLS) best in tropical regions controlled by ENSO-related changes in dynamics. This study, combined with sensitivity simulations with changes to emissions, and simulations with passive tracers (see Abstract by Rodriguez et al., Session A76), lays the foundations for assessment of the mechanisms that have influenced tropospheric ozone in the past two decades.

  14. Effect of year-to-year variability of leaf area index on variable infiltration capacity model performance and simulation of streamflow during drought

    NASA Astrophysics Data System (ADS)

    Tesemma, Z. K.; Wei, Y.; Peel, M. C.; Western, A. W.

    2014-09-01

    This study assessed the effect of using observed monthly leaf area index (LAI) on hydrologic model performance and the simulation of streamflow during drought using the variable infiltration capacity (VIC) hydrological model in the Goulburn-Broken catchment of Australia, which has heterogeneous vegetation, soil and climate zones. VIC was calibrated with both observed monthly LAI and long-term mean monthly LAI, which were derived from the Global Land Surface Satellite (GLASS) observed monthly LAI dataset covering the period from 1982 to 2012. The model performance under wet and dry climates for the two different LAI inputs was assessed using three criteria, the classical Nash-Sutcliffe efficiency, the logarithm transformed flow Nash-Sutcliffe efficiency and the percentage bias. Finally, the percentage deviation of the simulated monthly streamflow using the observed monthly LAI from simulated streamflow using long-term mean monthly LAI was computed. The VIC model predicted monthly streamflow in the selected sub-catchments with model efficiencies ranging from 61.5 to 95.9% during calibration (1982-1997) and 59 to 92.4% during validation (1998-2012). Our results suggest systematic improvements from 4 to 25% in the Nash-Sutcliffe efficiency in pasture dominated catchments when the VIC model was calibrated with the observed monthly LAI instead of the long-term mean monthly LAI. There was limited systematic improvement in tree dominated catchments. The results also suggest that the model overestimation or underestimation of streamflow during wet and dry periods can be reduced to some extent by including the year-to-year variability of LAI in the model, thus reflecting the responses of vegetation to fluctuations in climate and other factors. Hence, the year-to-year variability in LAI should not be neglected; rather it should be included in model calibration as well as simulation of monthly water balance.

  15. The effect of year-to-year variability of leaf area index on Variable Infiltration Capacity model performance and simulation of runoff

    NASA Astrophysics Data System (ADS)

    Tesemma, Z. K.; Wei, Y.; Peel, M. C.; Western, A. W.

    2015-09-01

    This study assessed the effect of using observed monthly leaf area index (LAI) on hydrological model performance and the simulation of runoff using the Variable Infiltration Capacity (VIC) hydrological model in the Goulburn-Broken catchment of Australia, which has heterogeneous vegetation, soil and climate zones. VIC was calibrated with both observed monthly LAI and long-term mean monthly LAI, which were derived from the Global Land Surface Satellite (GLASS) leaf area index dataset covering the period from 1982 to 2012. The model performance under wet and dry climates for the two different LAI inputs was assessed using three criteria, the classical Nash-Sutcliffe efficiency, the logarithm transformed flow Nash-Sutcliffe efficiency and the percentage bias. Finally, the deviation of the simulated monthly runoff using the observed monthly LAI from simulated runoff using long-term mean monthly LAI was computed. The VIC model predicted monthly runoff in the selected sub-catchments with model efficiencies ranging from 61.5% to 95.9% during calibration (1982-1997) and 59% to 92.4% during validation (1998-2012). Our results suggest systematic improvements, from 4% to 25% in Nash-Sutcliffe efficiency, in sparsely forested sub-catchments when the VIC model was calibrated with observed monthly LAI instead of long-term mean monthly LAI. There was limited systematic improvement in tree dominated sub-catchments. The results also suggest that the model overestimation or underestimation of runoff during wet and dry periods can be reduced to 25 mm and 35 mm respectively by including the year-to-year variability of LAI in the model, thus reflecting the responses of vegetation to fluctuations in climate and other factors. Hence, the year-to-year variability in LAI should not be neglected; rather it should be included in model calibration as well as simulation of monthly water balance.

  16. LES study of microphysical variability bias in shallow cumulus

    NASA Astrophysics Data System (ADS)

    Kogan, Yefim

    2017-05-01

    Subgrid-scale (SGS) variability of cloud microphysical variables at the grid scale of a mesoscale numerical weather prediction (NWP) model has been evaluated by means of joint probability distribution functions (JPDFs). The latter were obtained using a dynamically balanced Large Eddy Simulation (LES) model dataset from a case of marine trade cumulus initialized with soundings from the Rain in Cumulus Over the Ocean (RICO) field project. Bias in autoconversion and accretion rates from different formulations of the JPDFs was analyzed. Approximating the 2-D PDF with a generic (fixed-in-time) but variable-in-height JPDF gives an acceptable level of accuracy, whereas neglecting the SGS variability altogether results in a substantial underestimate of the grid-mean total conversion rate and produces a negative bias in rain water. Nevertheless, the overall effect on rain formation may be uncertain in the long run, because the negative bias in rain water may be counterbalanced by a positive bias in cloud water. Consequently, the overall effect of neglecting SGS variability needs to be investigated in direct simulations with an NWP model.
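
    The bias being quantified can be illustrated with a toy calculation (Python; the lognormal in-cloud distributions and the Khairoutdinov-Kogan-type power law stand in for the LES-derived JPDFs and the microphysics scheme used in the study):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

# Assumed lognormal SGS distributions of cloud water (kg/kg) and droplet number (cm^-3)
qc = rng.lognormal(mean=np.log(3e-4), sigma=0.6, size=n)
nc = rng.lognormal(mean=np.log(70.0), sigma=0.3, size=n)

def autoconversion(qc, nc):
    # Khairoutdinov-Kogan-type power law (illustrative stand-in)
    return 1350.0 * qc**2.47 * nc**-1.79

mean_of_rates = autoconversion(qc, nc).mean()          # rate averaged over the SGS (J)PDF
rate_of_means = autoconversion(qc.mean(), nc.mean())   # rate from grid-mean values only

print(f"PDF-averaged rate : {mean_of_rates:.3e}")
print(f"grid-mean rate    : {rate_of_means:.3e}")
print(f"bias from neglecting SGS variability: {rate_of_means / mean_of_rates - 1:+.1%}")
```
    Because the autoconversion rate is a convex function of cloud water, averaging the rate over the SGS distribution gives a larger grid-mean value than evaluating the rate at the grid-mean state, which is the sign of the bias reported above.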

  17. Monte Carlo method for photon heating using temperature-dependent optical properties.

    PubMed

    Slade, Adam Broadbent; Aguilar, Guillermo

    2015-02-01

    The Monte Carlo method for photon transport is often used to predict the volumetric heating that an optical source will induce inside a tissue or material. This method relies on constant (with respect to temperature) optical properties, specifically the coefficients of scattering and absorption. In reality, optical coefficients are typically temperature-dependent, leading to error in simulation results. The purpose of this study is to develop a method that can incorporate variable properties and accurately simulate systems where the temperature will vary greatly, such as in the case of laser thawing of frozen tissues. A numerical simulation was developed that utilizes the Monte Carlo method for photon transport to simulate the thermal response of a system with temperature-dependent optical and thermal properties. This was done by combining traditional Monte Carlo photon transport with a heat transfer simulation to provide a feedback loop that selects local properties based on current temperatures, for each moment in time. Additionally, photon steps are segmented to accurately obtain path lengths within a homogeneous (but not isothermal) material. Validation of the simulation was done using comparisons to established Monte Carlo simulations with constant properties, and a comparison to the Beer-Lambert law for temperature-variable properties. The simulation is able to accurately predict the thermal response of a system whose properties vary with temperature. The difference in results between the variable-property and constant-property methods for the representative system of laser-heated silicon can become larger than 100 K. This simulation will return more accurate results of optical irradiation absorption in a material which undergoes a large change in temperature. This increased accuracy leads to better thermal predictions in living tissues and can provide enhanced planning and improved experimental and procedural outcomes. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
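
    A stripped-down 1-D illustration of the feedback loop (Python; the purely absorbing medium, the layer geometry, the irradiance, and the linear temperature dependence of the absorption coefficient are all simplifying assumptions, so this is a sketch of the coupling idea rather than the authors' full Monte Carlo):

```python
import numpy as np

rng = np.random.default_rng(3)

n_layers, dz = 50, 1e-4                 # 50 layers of 100 um (assumed geometry)
rho_c = 4.0e6                           # volumetric heat capacity, J m^-3 K^-1 (assumed)
power, dt, n_steps, n_photons = 5.0e4, 0.1, 30, 2000   # W m^-2 irradiance, illustrative

def mu_a(T):
    """Assumed linear temperature dependence of the absorption coefficient (m^-1)."""
    return 1.0e3 * (1.0 + 0.002 * (T - 20.0))

T = np.full(n_layers, 20.0)             # initial temperature, deg C

for _ in range(n_steps):
    absorbed = np.zeros(n_layers)
    mua = mu_a(T)                        # re-evaluate optical properties at current T
    for _ in range(n_photons):           # purely absorbing 1-D Monte Carlo walk
        for i in range(n_layers):
            if rng.random() < 1.0 - np.exp(-mua[i] * dz):
                absorbed[i] += 1.0       # photon absorbed in layer i
                break                    # otherwise it transmits to the next layer
    # Convert absorbed photon fraction to deposited energy per unit area, then heat
    energy = power * dt * absorbed / n_photons          # J m^-2 per layer
    T += energy / (rho_c * dz)                           # temperature rise, K

print(np.round(T[:5], 2))                # near-surface layers warm most, raising mu_a there
```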

  18. Oxygen distribution in tumors: A qualitative analysis and modeling study providing a novel Monte Carlo approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lagerlöf, Jakob H., E-mail: Jakob@radfys.gu.se; Kindblom, Jon; Bernhardt, Peter

    2014-09-15

    Purpose: To construct a Monte Carlo (MC)-based simulation model for analyzing the dependence of tumor oxygen distribution on different variables related to tumor vasculature [blood velocity, vessel-to-vessel proximity (vessel proximity), and inflowing oxygen partial pressure (pO2)]. Methods: A voxel-based tissue model containing parallel capillaries with square cross-sections (sides of 10 μm) was constructed. Green's function was used for diffusion calculations and Michaelis-Menten's kinetics to manage oxygen consumption. The model was tuned to approximately reproduce the oxygenational status of a renal carcinoma; the depth oxygenation curves (DOC) were fitted with an analytical expression to facilitate rapid MC simulations of tumor oxygen distribution. DOCs were simulated with three variables at three settings each (blood velocity, vessel proximity, and inflowing pO2), which resulted in 27 combinations of conditions. To create a model that simulated variable oxygen distributions, the oxygen tension at a specific point was randomly sampled with trilinear interpolation in the dataset from the first simulation. Six correlations between blood velocity, vessel proximity, and inflowing pO2 were hypothesized. Variable models with correlated parameters were compared to each other and to a nonvariable, DOC-based model to evaluate the differences in simulated oxygen distributions and tumor radiosensitivities for different tumor sizes. Results: For tumors with radii ranging from 5 to 30 mm, the nonvariable DOC model tended to generate normal or log-normal oxygen distributions, with a cut-off at zero. The pO2 distributions simulated with the six-variable DOC models were quite different from the distributions generated with the nonvariable DOC model; in the former case the variable models simulated oxygen distributions that were more similar to in vivo results found in the literature. For larger tumors, the oxygen distributions became truncated in the lower end, due to anoxia, but smaller tumors showed undisturbed oxygen distributions. The six different models with correlated parameters generated three classes of oxygen distributions. The first was a hypothetical, negative covariance between vessel proximity and pO2 (VPO-C scenario); the second was a hypothetical positive covariance between vessel proximity and pO2 (VPO+C scenario); and the third was the hypothesis of no correlation between vessel proximity and pO2 (UP scenario). The VPO-C scenario produced a distinctly different oxygen distribution than the two other scenarios. The shape of the VPO-C scenario was similar to that of the nonvariable DOC model, and the larger the tumor, the greater the similarity between the two models. For all simulations, the mean oxygen tension decreased and the hypoxic fraction increased with tumor size. The absorbed dose required for definitive tumor control was highest for the VPO+C scenario, followed by the UP and VPO-C scenarios. Conclusions: A novel MC algorithm was presented which simulated oxygen distributions and radiation response for various biological parameter values. The analysis showed that the VPO-C scenario generated a clearly different oxygen distribution from the VPO+C scenario; the former exhibited a lower hypoxic fraction and higher radiosensitivity. In future studies, this modeling approach might be valuable for qualitative analyses of factors that affect oxygen distribution as well as analyses of specific experimental and clinical situations.

  19. Oxygen distribution in tumors: a qualitative analysis and modeling study providing a novel Monte Carlo approach.

    PubMed

    Lagerlöf, Jakob H; Kindblom, Jon; Bernhardt, Peter

    2014-09-01

    To construct a Monte Carlo (MC)-based simulation model for analyzing the dependence of tumor oxygen distribution on different variables related to tumor vasculature [blood velocity, vessel-to-vessel proximity (vessel proximity), and inflowing oxygen partial pressure (pO2)]. A voxel-based tissue model containing parallel capillaries with square cross-sections (sides of 10 μm) was constructed. Green's function was used for diffusion calculations and Michaelis-Menten's kinetics to manage oxygen consumption. The model was tuned to approximately reproduce the oxygenational status of a renal carcinoma; the depth oxygenation curves (DOC) were fitted with an analytical expression to facilitate rapid MC simulations of tumor oxygen distribution. DOCs were simulated with three variables at three settings each (blood velocity, vessel proximity, and inflowing pO2), which resulted in 27 combinations of conditions. To create a model that simulated variable oxygen distributions, the oxygen tension at a specific point was randomly sampled with trilinear interpolation in the dataset from the first simulation. Six correlations between blood velocity, vessel proximity, and inflowing pO2 were hypothesized. Variable models with correlated parameters were compared to each other and to a nonvariable, DOC-based model to evaluate the differences in simulated oxygen distributions and tumor radiosensitivities for different tumor sizes. For tumors with radii ranging from 5 to 30 mm, the nonvariable DOC model tended to generate normal or log-normal oxygen distributions, with a cut-off at zero. The pO2 distributions simulated with the six-variable DOC models were quite different from the distributions generated with the nonvariable DOC model; in the former case the variable models simulated oxygen distributions that were more similar to in vivo results found in the literature. For larger tumors, the oxygen distributions became truncated in the lower end, due to anoxia, but smaller tumors showed undisturbed oxygen distributions. The six different models with correlated parameters generated three classes of oxygen distributions. The first was a hypothetical, negative covariance between vessel proximity and pO2 (VPO-C scenario); the second was a hypothetical positive covariance between vessel proximity and pO2 (VPO+C scenario); and the third was the hypothesis of no correlation between vessel proximity and pO2 (UP scenario). The VPO-C scenario produced a distinctly different oxygen distribution than the two other scenarios. The shape of the VPO-C scenario was similar to that of the nonvariable DOC model, and the larger the tumor, the greater the similarity between the two models. For all simulations, the mean oxygen tension decreased and the hypoxic fraction increased with tumor size. The absorbed dose required for definitive tumor control was highest for the VPO+C scenario, followed by the UP and VPO-C scenarios. A novel MC algorithm was presented which simulated oxygen distributions and radiation response for various biological parameter values. The analysis showed that the VPO-C scenario generated a clearly different oxygen distribution from the VPO+C scenario; the former exhibited a lower hypoxic fraction and higher radiosensitivity. In future studies, this modeling approach might be valuable for qualitative analyses of factors that affect oxygen distribution as well as analyses of specific experimental and clinical situations.
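
    The variable-oxygenation sampling step can be sketched with a regular-grid trilinear interpolator (Python/SciPy; the 3 x 3 x 3 pO2 lookup table and the correlated parameter sampling are placeholders for the DOC-derived dataset and the hypothesized covariance scenarios):

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

rng = np.random.default_rng(4)

# Three settings per variable, as in the paper (values here are placeholders)
velocity  = np.array([0.5, 1.0, 2.0])      # blood velocity, mm/s
proximity = np.array([80., 120., 160.])    # vessel-to-vessel distance, um
inflow    = np.array([40., 60., 80.])      # inflowing pO2, mmHg

# Placeholder pO2 at each of the 27 combinations (would come from the DOC simulations)
po2_grid = (0.6 * inflow[None, None, :]
            + 5.0 * velocity[:, None, None]
            - 0.1 * proximity[None, :, None])

interp = RegularGridInterpolator((velocity, proximity, inflow), po2_grid)

# Sample correlated parameters (illustrating a negative proximity-pO2 covariance)
# and interpolate a pO2 value for each sampled point (trilinear interpolation)
n = 10_000
v = rng.uniform(velocity[0], velocity[-1], n)
p = rng.uniform(proximity[0], proximity[-1], n)
o = np.clip(inflow[-1] - 0.25 * (p - proximity[0]), inflow[0], inflow[-1])
po2_samples = interp(np.column_stack([v, p, o]))
print(po2_samples.mean(), np.percentile(po2_samples, [5, 50, 95]))
```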

  20. Quantifying Intrinsic Variability of Sagittarius A* Using Closure Phase Measurements of the Event Horizon Telescope

    NASA Astrophysics Data System (ADS)

    Roelofs, Freek; Johnson, Michael D.; Shiokawa, Hotaka; Doeleman, Sheperd S.; Falcke, Heino

    2017-09-01

    General relativistic magnetohydrodynamic (GRMHD) simulations of accretion disks and jets associated with supermassive black holes show variability on a wide range of timescales. On timescales comparable to or longer than the gravitational timescale t_G = GM/c^3, variation may be dominated by orbital dynamics of the inhomogeneous accretion flow. Turbulent evolution within the accretion disk is expected on timescales comparable to the orbital period, typically an order of magnitude larger than t_G. For Sgr A*, t_G is much shorter than the typical duration of a VLBI experiment, enabling us to study this variability within a single observation. Closure phases, the sum of interferometric visibility phases on a triangle of baselines, are particularly useful for studying this variability. In addition to a changing source structure, variations in observed closure phase can also be due to interstellar scattering, thermal noise, and the changing geometry of projected baselines over time due to Earth rotation. We present a metric that is able to distinguish the latter two from intrinsic or scattering variability. This metric is validated using synthetic observations of GRMHD simulations of Sgr A*. When applied to existing multi-epoch EHT data of Sgr A*, this metric shows that the data are most consistent with source models containing intrinsic variability from source dynamics, interstellar scattering, or a combination of those. The effects of black hole inclination, orientation, spin, and morphology (disk or jet) on the expected closure phase variability are also discussed.
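
    Closure phases themselves are simple to form from the complex visibilities on a baseline triangle, since station-based phase errors cancel in the bispectrum; a minimal sketch (Python; the visibility amplitudes, phases, and noise level are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)

def closure_phase(v12, v23, v31):
    """Closure phase (deg) from complex visibilities on a triangle of baselines.
    Station-based phase corruptions cancel in the product of the three visibilities."""
    return np.degrees(np.angle(v12 * v23 * v31))

# Toy time series: a static source plus thermal noise on each baseline
n_t, snr = 200, 8.0
v_true = [0.8 * np.exp(1j * 0.4), 0.5 * np.exp(-1j * 1.1), 0.6 * np.exp(1j * 0.7)]
noise = lambda: rng.normal(0, 1 / snr, n_t) + 1j * rng.normal(0, 1 / snr, n_t)
v12, v23, v31 = (v + noise() for v in v_true)

cp = closure_phase(v12, v23, v31)
print(f"mean closure phase: {cp.mean():.1f} deg, scatter: {cp.std():.1f} deg")
# For a static source the scatter is set by thermal noise alone; excess scatter in real
# data (after accounting for changing baseline geometry) points to intrinsic or
# scattering-induced variability, which is what the metric in the paper isolates.
```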

  1. Simulated driving and brain imaging: combining behavior, brain activity, and virtual reality.

    PubMed

    Carvalho, Kara N; Pearlson, Godfrey D; Astur, Robert S; Calhoun, Vince D

    2006-01-01

    Virtual reality in the form of simulated driving is a useful tool for studying the brain. Various clinical questions can be addressed, including both the role of alcohol as a modulator of brain function and regional brain activation related to elements of driving. We reviewed a study of the neural correlates of alcohol intoxication that used a simulated-driving paradigm, and we demonstrate the utility of recording continuous driving behavior in a new study using a programmable driving simulator developed at our center. Functional magnetic resonance imaging data were collected from subjects while they operated a driving simulator. Independent component analysis (ICA) was used to analyze the data. Specific brain regions modulated by alcohol, and relationships between behavior, brain function, and blood alcohol levels, were examined with aggregate behavioral measures. Fifteen driving epochs from two subjects, during which driving variables were recorded continuously, were analyzed with ICA. Preliminary findings reveal that four independent components correlate with various aspects of behavior. An increase in braking while driving was found to increase activation in motor areas, while cerebellar areas showed signal increases during steering maintenance and signal decreases during steering changes. Additional components and significant findings are further outlined. In summary, continuous behavioral variables conjoined with ICA may offer new insight into the neural correlates of complex human behavior.

  2. Low-order nonlinear dynamic model of IC engine-variable pitch propeller system for general aviation aircraft

    NASA Technical Reports Server (NTRS)

    Richard, Jacques C.

    1995-01-01

    This paper presents a dynamic model of an internal combustion engine coupled to a variable pitch propeller. The low-order, nonlinear time-dependent model is useful for simulating the propulsion system of general aviation single-engine light aircraft. This model is suitable for investigating engine diagnostics and monitoring and for control design and development. Furthermore, the model may be extended to provide a tool for the study of engine emissions, fuel economy, component effects, alternative fuels, alternative engine cycles, flight simulators, sensors, and actuators. Results show that the model provides a reasonable representation of the propulsion system dynamics from zero to 10 Hertz.
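
    The core of such a low-order model is a single rotational-dynamics equation balancing engine torque against propeller torque, J dω/dt = Q_engine − Q_prop with Q_prop = C_Q ρ n² D⁵; a minimal sketch (Python; the torque maps and constants are illustrative placeholders, not those of the NASA model):

```python
import numpy as np

# Illustrative constants (not from the NASA model)
J = 0.6             # combined engine + propeller inertia, kg m^2
rho = 1.225         # air density, kg/m^3
D = 1.9             # propeller diameter, m

def engine_torque(omega, throttle):
    """Placeholder quasi-static engine torque map (N m)."""
    return throttle * (520.0 - 0.16 * omega)

def prop_torque_coeff(advance_ratio, pitch_deg):
    """Placeholder C_Q map: torque coefficient rises with blade pitch."""
    return max(0.0005, 0.0006 * pitch_deg - 0.006 * advance_ratio)

def step(omega, throttle, pitch_deg, airspeed, dt=0.01):
    """One Euler step of J * domega/dt = Q_engine - Q_prop."""
    n = omega / (2 * np.pi)                       # rev/s
    jadv = airspeed / max(n * D, 1e-6)            # advance ratio J = V / (n D)
    q_prop = prop_torque_coeff(jadv, pitch_deg) * rho * n**2 * D**5
    return omega + dt * (engine_torque(omega, throttle) - q_prop) / J

omega = 150.0        # rad/s
for _ in range(2000):                              # 20 s at full throttle, fixed pitch
    omega = step(omega, throttle=1.0, pitch_deg=15.0, airspeed=50.0)
print(f"steady-state propeller speed: {omega * 60 / (2 * np.pi):.0f} rpm")
```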

  3. Analysis of turbojet-engine controls for afterburning starting

    NASA Technical Reports Server (NTRS)

    Phillips, W. E., Jr.

    1956-01-01

    A simulation procedure is developed for studying the effects of an afterburner start on a controlled turbojet engine. The afterburner start is represented by introducing a step decrease in the effective exhaust-nozzle area, after which the control returns the controlled engine variables to their initial values. The degree and speed with which the control acts are a measure of the effectiveness of the particular control system. Data are presented from five systems investigated using an electronic analog computer and the developed simulation procedure. These systems are compared with respect to steady-state errors, speed of response, and transient deviations of the system variables.

  4. SEAWAT: A Computer Program for Simulation of Variable-Density Groundwater Flow and Multi-Species Solute and Heat Transport

    USGS Publications Warehouse

    Langevin, Christian D.

    2009-01-01

    SEAWAT is a MODFLOW-based computer program designed to simulate variable-density groundwater flow coupled with multi-species solute and heat transport. The program has been used for a wide variety of groundwater studies including saltwater intrusion in coastal aquifers, aquifer storage and recovery in brackish limestone aquifers, and brine migration within continental aquifers. SEAWAT is relatively easy to apply because it uses the familiar MODFLOW structure. Thus, most commonly used pre- and post-processors can be used to create datasets and visualize results. SEAWAT is a public domain computer program distributed free of charge by the U.S. Geological Survey.

  5. Modelling and Simulation of the Dynamics of the Antigen-Specific T Cell Response Using Variable Structure Control Theory.

    PubMed

    Anelone, Anet J N; Spurgeon, Sarah K

    2016-01-01

    Experimental and mathematical studies in immunology have revealed that the dynamics of the programmed T cell response to vigorous infection can be conveniently modelled using a sigmoidal or a discontinuous immune response function. This paper hypothesizes strong synergies between this existing work and the dynamical behaviour of engineering systems with a variable structure control (VSC) law. These findings motivate the interpretation of the immune system as a variable structure control system. It is shown that dynamical properties as well as conditions to analytically assess the transition from health to disease can be developed for the specific T cell response from the theory of variable structure control. In particular, it is shown that the robustness properties of the specific T cell response as observed in experiments can be explained analytically using a VSC perspective. Further, the predictive capacity of the VSC framework to determine the T cell help required to overcome chronic Lymphocytic Choriomeningitis Virus (LCMV) infection is demonstrated. The findings demonstrate that studying the immune system using variable structure control theory provides a new framework for evaluating immunological dynamics and experimental observations. A modelling and simulation tool results with predictive capacity to determine how to modify the immune response to achieve healthy outcomes which may have application in drug development and vaccine design.
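
    A minimal sketch of the kind of model being discussed (Python; the growth, killing, and proliferation rates are illustrative and the threshold switch stands in for the discontinuous immune response function analysed in the paper):

```python
# Illustrative parameters (not fitted to LCMV data)
r, K = 2.0, 1e7          # pathogen growth rate (1/day) and carrying capacity
k = 1e-5                 # killing rate per T cell
a, d = 1.5, 0.3          # T cell proliferation and contraction rates (1/day)
V_th = 1e3               # activation threshold: the discontinuous "switch"

def response(V):
    """Discontinuous immune response function, the VSC-like switching term."""
    return 1.0 if V > V_th else 0.0

dt, days = 0.001, 30.0
V, T = 10.0, 50.0
for step in range(int(days / dt)):
    dV = r * V * (1 - V / K) - k * T * V
    dT = a * T * response(V) - d * T * (1 - response(V))   # expand when "on", contract when "off"
    V, T = max(V + dt * dV, 0.0), max(T + dt * dT, 0.0)
    if step % int(5 / dt) == 0:
        print(f"day {step * dt:4.0f}: pathogen {V:12.1f}  T cells {T:12.1f}")
```
    The on/off switching around the activation threshold is what gives the system its variable-structure character: the trajectory is governed by one set of dynamics while the pathogen load is above threshold and by another below it.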

  6. Day-to-day ionospheric variability due to lower atmosphere perturbations

    NASA Astrophysics Data System (ADS)

    Liu, H.; Yudin, V. A.; Roble, R. G.

    2013-12-01

    Ionospheric day-to-day variability is a ubiquitous feature, even in the absence of appreciable geomagnetic activity. Although meteorological perturbations have been recognized as an important source of the variability, they are not well represented in previous modeling studies, and the mechanism is not well understood. This study demonstrates that TIME-GCM (Thermosphere-Ionosphere-Mesosphere-Electrodynamics General Circulation Model), constrained in the stratosphere and mesosphere by hourly Whole Atmosphere Community Climate Model (WACCM) simulations, is capable of reproducing observed features of day-to-day variability in the thermosphere-ionosphere. Realistic weather patterns in the lower atmosphere in WACCM were specified by the Modern-Era Retrospective analysis for Research and Applications (MERRA). The day-to-day variations in mean zonal wind, migrating and non-migrating tides in the thermosphere, vertical and zonal ExB drifts, and ionospheric F2 layer peak electron density (NmF2) are examined. The standard deviations of the drifts and NmF2 display local time and longitudinal dependence that compare favorably with observations. Their magnitudes are 50% or more of those from observations. The day-to-day thermosphere and ionosphere variability in the model is primarily caused by perturbations originating in the lower atmosphere, since the model simulation is under constant solar minimum and low geomagnetic conditions.

  7. The effect of workstation and task variables on forces applied during simulated meat cutting.

    PubMed

    McGorry, Raymond W; Dempsey, Patrick G; O'Brien, Niall V

    2004-12-01

    The purpose of the study was to investigate factors related to force and postural exposure during a simulated meat cutting task. The hypothesis was that workstation, tool and task variables would affect the dependent kinetic variables of gripping force, cutting moment and the dependent kinematic variables of elbow elevation and wrist angular displacement in the flexion/extension and radial/ulnar deviation planes. To evaluate this hypothesis a 3 x 3 x 2 x 2 x 2 (surface orientation by surface height by blade angle by cut complexity by work pace) within-subject factorial design was conducted with 12 participants. The results indicated that the variables can act and interact to modify the kinematics and kinetics of a cutting task. Participants used greater grip force and cutting moment when working at a pace based on productivity. The interactions of the work surface height and orientation indicated that the use of an adjustable workstation could minimize wrist deviation from neutral and improve shoulder posture during cutting operations. Angling the knife blade also interacted with workstation variables to improve wrist and upper extremity posture, but this benefit must be weighed against the potential for small increases in force exposure.

  8. Ideal, nonideal, and no-marker variables: The confirmatory factor analysis (CFA) marker technique works when it matters.

    PubMed

    Williams, Larry J; O'Boyle, Ernest H

    2015-09-01

    A persistent concern in the management and applied psychology literature is the effect of common method variance on observed relations among variables. Recent work (i.e., Richardson, Simmering, & Sturman, 2009) evaluated 3 analytical approaches to controlling for common method variance, including the confirmatory factor analysis (CFA) marker technique. Their findings indicated significant problems with this technique, especially with nonideal marker variables (those with theoretical relations with substantive variables). Based on their simulation results, Richardson et al. concluded that not correcting for method variance provides more accurate estimates than using the CFA marker technique. We reexamined the effects of using marker variables in a simulation study and found the degree of error in estimates of a substantive factor correlation was relatively small in most cases, and much smaller than error associated with making no correction. Further, in instances in which the error was large, the correlations between the marker and substantive scales were higher than that found in organizational research with marker variables. We conclude that in most practical settings, the CFA marker technique yields parameter estimates close to their true values, and the criticisms made by Richardson et al. are overstated. (c) 2015 APA, all rights reserved.
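
    The intuition behind marker-based corrections can be shown on simulated data; the sketch below (Python) uses the simpler partial-correlation marker adjustment, in the spirit of Lindell and Whitney's correlational technique rather than the full CFA marker technique, and the loadings and method-variance share are illustrative:

```python
import numpy as np

rng = np.random.default_rng(6)
n, rho_true = 5000, 0.30

# Latent substantive factors (correlated) plus a shared method factor
f = rng.multivariate_normal([0, 0], [[1, rho_true], [rho_true, 1]], n)
m = rng.normal(size=n)

x      = 0.7 * f[:, 0] + 0.4 * m + 0.5 * rng.normal(size=n)   # substantive scale 1
y      = 0.7 * f[:, 1] + 0.4 * m + 0.5 * rng.normal(size=n)   # substantive scale 2
marker = 0.4 * m + 0.9 * rng.normal(size=n)                    # ideal marker: method variance only

r = np.corrcoef([x, y, marker])
r_xy, r_m = r[0, 1], min(r[0, 2], r[1, 2])        # smallest marker correlation estimates CMV
r_adjusted = (r_xy - r_m) / (1 - r_m)

print(f"observed r(x, y)     = {r_xy:.3f}   (inflated by common method variance)")
print(f"marker-based estimate = {r_m:.3f}")
print(f"adjusted correlation  = {r_adjusted:.3f}")
```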

  9. Individual Colorimetric Observer Model

    PubMed Central

    Asano, Yuta; Fairchild, Mark D.; Blondé, Laurent

    2016-01-01

    This study proposes a vision model for individual colorimetric observers. The proposed model can be beneficial in many color-critical applications such as color grading and soft proofing to assess ranges of color matches instead of a single average match. We extended the CIE 2006 physiological observer by adding eight additional physiological parameters to model individual color-normal observers. These eight parameters control lens pigment density, macular pigment density, optical densities of L-, M-, and S-cone photopigments, and λmax shifts of L-, M-, and S-cone photopigments. By identifying the variability of each physiological parameter, the model can simulate color matching functions among color-normal populations using Monte Carlo simulation. The variabilities of the eight parameters were identified through two steps. In the first step, extensive reviews of past studies were performed for each of the eight physiological parameters. In the second step, the obtained variabilities were scaled to fit a color matching dataset. The model was validated using three different datasets: traditional color matching, applied color matching, and Rayleigh matches. PMID:26862905

  10. Evaluation of the ORCHIDEE ecosystem model over Africa against 25 years of satellite-based water and carbon measurements

    NASA Astrophysics Data System (ADS)

    Traore, Abdoul Khadre; Ciais, Philippe; Vuichard, Nicolas; Poulter, Benjamin; Viovy, Nicolas; Guimberteau, Matthieu; Jung, Martin; Myneni, Ranga; Fisher, Joshua B.

    2014-08-01

    Few studies have evaluated land surface models for African ecosystems. Here we evaluate the Organizing Carbon and Hydrology in Dynamic Ecosystems (ORCHIDEE) process-based model for the interannual variability (IAV) of the fraction of absorbed active radiation, the gross primary productivity (GPP), soil moisture, and evapotranspiration (ET). Two ORCHIDEE versions are tested, which differ by their soil hydrology parameterization: one with a simple two-layer bucket and the other with a more complex 11-layer soil-water diffusion scheme. In addition, we evaluate the sensitivity to climate forcing data, atmospheric CO2, and soil depth. Despite a very generic vegetation parameterization, ORCHIDEE simulates the IAV of GPP and ET rather well (0.5 < r < 0.9 interannual correlation) over Africa, except in forestlands. The ORCHIDEE 11-layer version outperforms the two-layer version in simulating the IAV of soil moisture, whereas both versions have similar performance for GPP and ET. Effects of CO2 trends and of variable soil depth on the IAV of GPP, ET, and soil moisture are small, although these drivers influence the trends of these variables. The meteorological forcing data appear to be quite important for faithfully reproducing the IAV of simulated variables, suggesting that in regions with sparse weather station data, the model uncertainty is strongly related to uncertain meteorological forcing. Simulated variables are positively and strongly correlated with precipitation but negatively and weakly correlated with temperature and solar radiation. Model-derived and observation-based sensitivities are in agreement on the driving role of precipitation. However, the modeled GPP is too sensitive to precipitation, suggesting that processes such as increased water use efficiency during drought need to be incorporated in ORCHIDEE.

  11. MODFLOW/MT3DMS-based simulation of variable-density ground water flow and transport

    USGS Publications Warehouse

    Langevin, C.D.; Guo, W.

    2006-01-01

    This paper presents an approach for coupling MODFLOW and MT3DMS for the simulation of variable-density ground water flow. MODFLOW routines were modified to solve a variable-density form of the ground water flow equation in which the density terms are calculated using an equation of state and the simulated MT3DMS solute concentrations. Changes to the MODFLOW and MT3DMS input files were kept to a minimum, and thus existing data files and data files created with most pre- and postprocessors can be used directly with the SEAWAT code. The approach was tested by simulating the Henry problem and two of the saltpool laboratory experiments (low- and high-density cases). For the Henry problem, the simulated results compared well with the steady-state semianalytic solution and also the transient isochlor movement as simulated by a finite-element model. For the saltpool problem, the simulated breakthrough curves compared better with the laboratory measurements for the low-density case than for the high-density case but showed good agreement with the measured salinity isosurfaces for both cases. Results from the test cases presented here indicate that the MODFLOW/MT3DMS approach provides accurate solutions for problems involving variable-density ground water flow and solute transport. ?? 2006 National Ground Water Association.
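
    The key coupling step is the equation of state that converts the simulated solute concentration into fluid density, together with the equivalent freshwater head used when solving the variable-density flow equation; a minimal sketch (Python; the linear slope and reference values follow a typical seawater setup but are illustrative):

```python
RHO_FRESH = 1000.0     # reference freshwater density, kg/m^3
DRHODC = 0.7143        # d(rho)/d(concentration), per kg/m^3 of TDS (typical seawater slope)

def density(conc):
    """Linear equation of state: fluid density from simulated solute concentration."""
    return RHO_FRESH + DRHODC * conc

def equivalent_freshwater_head(head, conc, z):
    """Convert a point-water head at elevation z to equivalent freshwater head,
    the form commonly used in variable-density codes such as SEAWAT."""
    rho = density(conc)
    return (rho / RHO_FRESH) * head - (rho - RHO_FRESH) / RHO_FRESH * z

# Example: seawater (35 kg/m^3 TDS) at 10 m depth with a measured point-water head of 0.0 m
conc, z, head = 35.0, -10.0, 0.0
print(f"density: {density(conc):.1f} kg/m^3")
print(f"equivalent freshwater head: {equivalent_freshwater_head(head, conc, z):.3f} m")
```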

  12. Climate simulations and projections with a super-parameterized climate model

    DOE PAGES

    Stan, Cristiana; Xu, Li

    2014-07-01

    The mean climate and its variability are analyzed in a suite of numerical experiments with a fully coupled general circulation model in which subgrid-scale moist convection is explicitly represented through embedded 2D cloud-system resolving models. Control simulations forced by the present-day, fixed atmospheric carbon dioxide concentration are conducted using two horizontal resolutions and validated against observations and reanalyses. The mean state simulated by the higher resolution configuration has smaller biases. Climate variability also shows some sensitivity to resolution, but not as uniformly as in the case of the mean state. The interannual and seasonal variability are better represented in the simulation at lower resolution, whereas the subseasonal variability is more accurate in the higher resolution simulation. The equilibrium climate sensitivity of the model is estimated from a simulation forced by an abrupt quadrupling of the atmospheric carbon dioxide concentration. The equilibrium climate sensitivity temperature of the model is 2.77 °C, and this value is slightly smaller than the mean value (3.37 °C) of contemporary models using conventional representations of cloud processes. As a result, the climate change simulation forced by the representative concentration pathway 8.5 scenario projects an increase in the frequency of severe droughts over most of North America.

  13. The reliability and validity of a soccer-specific nonmotorised treadmill simulation (intermittent soccer performance test).

    PubMed

    Aldous, Jeffrey W F; Akubat, Ibrahim; Chrismas, Bryna C R; Watkins, Samuel L; Mauger, Alexis R; Midgley, Adrian W; Abt, Grant; Taylor, Lee

    2014-07-01

    This study investigated the reliability and validity of a novel nonmotorised treadmill (NMT)-based soccer simulation using a novel activity category called a "variable run" to quantify fatigue during high-speed running. Twelve male University soccer players completed 3 familiarization sessions and 1 peak speed assessment before completing the intermittent soccer performance test (iSPT) twice. The 2 iSPTs were separated by 6-10 days. The total distance, sprint distance, and high-speed running distance (HSD) were 8,968 ± 430 m, 980 ± 75 m and 2,122 ± 140 m, respectively. No significant difference (p > 0.05) was found between repeated trials of the iSPT for all physiological and performance variables. Reliability measures between iSPT1 and iSPT2 showed good agreement (coefficient of variation: <4.6%; intraclass correlation coefficient: >0.80). Furthermore, the variable run phase showed HSD significantly decreased (p ≤ 0.05) in the last 15 minutes (89 ± 6 m) compared with the first 15 minutes (85 ± 7 m), quantifying decrements in high-speed exercise compared with the previous literature. This study validates the iSPT as a NMT-based soccer simulation compared with the previous match-play data and is a reliable tool for assessing and monitoring physiological and performance variables in soccer players. The iSPT could be used in a number of ways including player rehabilitation, understanding the efficacy of nutritional interventions, and also the quantification of environmentally mediated decrements on soccer-specific performance.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Wei; Balkovic, Juraj; van der Velde, M.

    Crop models are increasingly used to assess impacts of climate change/variability and management practices on productivity and environmental performance of alternative cropping systems. Calibration is an important procedure to improve the reliability of model simulations, especially for large-area applications. However, global-scale crop model calibration has rarely been exercised due to limited data availability and high computing cost. Here we present a simple approach to calibrate the Environmental Policy Integrated Climate (EPIC) model for a global implementation of rice. We identify four parameters (potential heat unit – PHU, planting density – PD, harvest index – HI, and biomass energy ratio – BER) and calibrate them regionally to capture the spatial pattern of reported rice yield in 2000. Model performance is assessed by comparing simulated outputs with independent FAO national data. The comparison demonstrates that the global calibration scheme performs satisfactorily in reproducing the spatial pattern of rice yield, particularly in the main rice production areas. Spatial agreement increases substantially when more parameters are selected and calibrated, but with varying efficiencies. Among the parameters, PHU and HI exhibit the highest efficiencies in increasing the spatial agreement. Simulations with different calibration strategies generate a pronounced discrepancy of 5-35% in mean yields across latitude bands, and a small to moderate difference in estimated yield variability and yield trend for the period 1981-2000. The present calibration has little effect in improving simulated yield variability and trends at both regional and global levels, suggesting further work is needed to reproduce the temporal variability of reported yields. This study highlights the importance of crop model calibration, and presents the possibility of a transparent and consistent upscaling approach for global crop simulations given the current availability of global databases of weather, soil, crop calendar, fertilizer and irrigation management information, and reported yield.

  15. Force-Sensing Enhanced Simulation Environment (ForSense) for laparoscopic surgery training and assessment.

    PubMed

    Cundy, Thomas P; Thangaraj, Evelyn; Rafii-Tari, Hedyeh; Payne, Christopher J; Azzie, Georges; Sodergren, Mikael H; Yang, Guang-Zhong; Darzi, Ara

    2015-04-01

    Excessive or inappropriate tissue interaction force during laparoscopic surgery is a recognized contributor to surgical error, especially for robotic surgery. Measurement of force at the tool-tissue interface is, therefore, a clinically relevant skill assessment variable that may improve effectiveness of surgical simulation. Popular box trainer simulators lack the necessary technology to measure force. The aim of this study was to develop a force sensing unit that may be integrated easily with existing box trainer simulators and to (1) validate multiple force variables as objective measurements of laparoscopic skill, and (2) determine concurrent validity of a revised scoring metric. A base plate unit sensitized to a force transducer was retrofitted to a box trainer. Participants of 3 different levels of operative experience performed 5 repetitions of a peg transfer and suture task. Multiple outcome variables of force were assessed as well as a revised scoring metric that incorporated a penalty for force error. Mean, maximum, and overall magnitudes of force were significantly different among the 3 levels of experience, as well as force error. Experts were found to exert the least force and fastest task completion times, and vice versa for novices. Overall magnitude of force was the variable most correlated with experience level and task completion time. The revised scoring metric had similar predictive strength for experience level compared with the standard scoring metric. Current box trainer simulators can be adapted for enhanced objective measurements of skill involving force sensing. These outcomes are significantly influenced by level of expertise and are relevant to operative safety in laparoscopic surgery. Conventional proficiency standards that focus predominantly on task completion time may be integrated with force-based outcomes to be more accurately reflective of skill quality. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. How Well Do Global Climate Models Simulate the Variability of Atlantic Tropical Cyclones Associated with ENSO?

    NASA Technical Reports Server (NTRS)

    Wang, Hui; Long, Lindsey; Kumar, Arun; Wang, Wanqiu; Schemm, Jae-Kyung E.; Zhao, Ming; Vecchi, Gabriel A.; LaRow, Timorhy E.; Lim, Young-Kwon; Schubert, Siegfried D.

    2013-01-01

    The variability of Atlantic tropical cyclones (TCs) associated with El Nino-Southern Oscillation (ENSO) in model simulations is assessed and compared with observations. The model experiments are 28-yr simulations forced with the observed sea surface temperature from 1982 to 2009. The simulations were coordinated by the U.S. CLIVAR Hurricane Working Group and conducted with five global climate models (GCMs) with a total of 16 ensemble members. The model performance is evaluated based on both individual model ensemble means and multi-model ensemble mean. The latter has the highest anomaly correlation (0.86) for the interannual variability of TCs. Previous observational studies show a strong association between ENSO and Atlantic TC activity, as well as distinctions in the TC activities during eastern Pacific (EP) and central Pacific (CP) El Nino events. The analysis of track density and TC origin indicates that each model has different mean biases. Overall, the GCMs simulate the variability of Atlantic TCs well with weaker activity during EP El Nino and stronger activity during La Nina. For CP El Nino, there is a slight increase in the number of TCs as compared with EP El Nino. However, the spatial distribution of track density and TC origin is less consistent among the models. Particularly, there is no indication of increasing TC activity over the U.S. southeast coastal region as in observations. The difference between the models and observations is likely due to the bias of vertical wind shear in response to the shift of tropical heating associated with CP El Nino, as well as the model bias in the mean circulation.

  17. ENSO Modulations due to Interannual Variability of Freshwater Forcing and Ocean Biology-induced Heating in the Tropical Pacific

    PubMed Central

    Zhang, Rong-Hua; Gao, Chuan; Kang, Xianbiao; Zhi, Hai; Wang, Zhanggui; Feng, Licheng

    2015-01-01

    Recent studies have identified clear climate feedbacks associated with interannual variations in freshwater forcing (FWF) and ocean biology-induced heating (OBH) in the tropical Pacific. The interrelationships among the related anomaly fields are analyzed using hybrid coupled model (HCM) simulations to illustrate their combined roles in modulating the El Niño-Southern Oscillation (ENSO). The HCM-based supporting experiments are performed to isolate the related feedbacks, with interannually varying FWF and OBH being represented individually or collectively, which allows their effects to be examined in a clear way. It is demonstrated that the interannual freshwater forcing enhances ENSO variability and slightly prolongs the simulated ENSO period, while the interannual OBH reduces ENSO variability and slightly shortens the ENSO period, with their feedback effects tending to counteract each other. PMID:26678931

  18. An experimental study of human pilot's scanning behavior

    NASA Technical Reports Server (NTRS)

    Washizu, K.; Tanaka, K.; Osawa, T.

    1982-01-01

    The scanning behavior and the control behavior of a pilot manually controlling a two-variable system, the most basic of multi-variable systems, are investigated. Two control tasks which simulate actual airplane attitude and airspeed control were set up. In order to simulate changes in the situation in which the pilot is placed, such as changes of flight phase, mission and others, the subject was requested to vary the weightings upon each task as his control strategy. Changes of the human control dynamics and scanning properties caused by the modification of the situation were investigated. Using the experimental results, an optimal model of the control behavior and the scanning behavior of the pilot in the two-variable system is proposed from the standpoint of minimizing the performance index.

  19. Interannual Rainfall Variability in North-East Brazil: Observation and Model Simulation

    NASA Astrophysics Data System (ADS)

    Harzallah, A.; Rocha de Aragão, J. O.; Sadourny, R.

    1996-08-01

    The relationship between the interannual variability of rainfall in north-east Brazil and tropical sea-surface temperature is studied using observations and model simulations. The simulated precipitation is the average of seven independent realizations performed using the Laboratoire de Météorologie Dynamique atmospheric general circulation model forced by the 1970-1988 observed sea-surface temperature. The model reproduces the rainfall anomalies very well (correlation of 0.91 between observed and modelled anomalies). The study confirms that precipitation in north-east Brazil is highly correlated with the sea-surface temperature in the tropical Atlantic and Pacific oceans. Using the singular value decomposition method, we find that Nordeste rainfall is modulated by two independent oscillations, both governed by the Atlantic dipole, but one involving only the Pacific, the other one having a period of about 10 years. Correlations between precipitation in north-east Brazil during February-May and the sea-surface temperature 6 months earlier indicate that both modes are essential to estimate the quality of the rainy season.
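
    The singular value decomposition method referred to here (maximum covariance analysis of two anomaly fields) is straightforward to sketch (Python; random fields stand in for the rainfall and sea-surface temperature anomalies):

```python
import numpy as np

rng = np.random.default_rng(7)
n_time, n_rain, n_sst = 19, 40, 300      # 19 years (1970-1988); grid sizes are illustrative

# Anomaly matrices (time x space); random data stand in for observations here
rain = rng.normal(size=(n_time, n_rain))
sst = rng.normal(size=(n_time, n_sst))
rain -= rain.mean(axis=0)
sst -= sst.mean(axis=0)

# SVD of the cross-covariance matrix isolates paired modes of co-variability
cov = rain.T @ sst / (n_time - 1)
u, s, vt = np.linalg.svd(cov, full_matrices=False)

squared_cov_frac = s**2 / np.sum(s**2)           # importance of each coupled mode
pc_rain = rain @ u[:, 0]                          # expansion coefficients of the leading mode
pc_sst = sst @ vt[0, :]
r = np.corrcoef(pc_rain, pc_sst)[0, 1]
print(f"leading mode explains {squared_cov_frac[0]:.1%} of squared covariance; "
      f"time-series correlation r = {r:.2f}")
```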

  20. Simulation and experimental design of a new advanced variable step size Incremental Conductance MPPT algorithm for PV systems.

    PubMed

    Loukriz, Abdelhamid; Haddadi, Mourad; Messalti, Sabir

    2016-05-01

    Improvement of the efficiency of photovoltaic systems based on new maximum power point tracking (MPPT) algorithms is the most promising solution due to its low cost and easy implementation without equipment updating. Many MPPT methods with fixed step size have been developed. However, when atmospheric conditions change rapidly, the performance of conventional algorithms is reduced. In this paper, a new variable step size Incremental Conductance (IC) MPPT algorithm has been proposed. Modeling and simulation of different operational conditions of the conventional Incremental Conductance (IC) and proposed methods are presented. The proposed method was developed and tested successfully on a photovoltaic system based on a Flyback converter and a control circuit using a dsPIC30F4011. Both the simulation and the experimental design are presented in several aspects. A comparative study between the proposed variable step size and fixed step size IC MPPT methods under similar operating conditions is presented. The obtained results demonstrate the efficiency of the proposed MPPT algorithm in terms of speed of MPP tracking and accuracy. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
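    The variable-step incremental-conductance logic described above can be sketched as follows. This is a generic textbook formulation rather than the authors' dsPIC30F4011 implementation; the scaling factor `n`, the duty-cycle limits and the sign convention (which depends on the converter topology) are illustrative assumptions.

```python
def variable_step_ic_mppt(v, i, v_prev, i_prev, d_prev, n=0.05, d_min=0.1, d_max=0.9):
    """One update of a variable step size incremental conductance MPPT loop.

    v, i            -- present PV voltage and current samples
    v_prev, i_prev  -- previous samples
    d_prev          -- previous converter duty cycle
    n               -- scaling factor of the variable step (hypothetical tuning value)
    """
    dv, di = v - v_prev, i - i_prev
    dp = v * i - v_prev * i_prev
    # Variable step: large far from the MPP (|dP/dV| large), small close to it.
    step = n * abs(dp / dv) if dv != 0 else n * abs(di)

    if dv == 0:
        if di == 0:
            d = d_prev                              # no change: hold the duty cycle
        else:
            d = d_prev - step if di > 0 else d_prev + step
    else:
        g = di / dv                                 # incremental conductance dI/dV
        if g == -i / v:
            d = d_prev                              # dP/dV = 0: maximum power point reached
        elif g > -i / v:
            d = d_prev - step                       # left of the MPP (sign depends on converter topology)
        else:
            d = d_prev + step                       # right of the MPP
    return min(max(d, d_min), d_max)
```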

  1. A new approach to fluid-structure interaction within graphics hardware accelerated smooth particle hydrodynamics considering heterogeneous particle size distribution

    NASA Astrophysics Data System (ADS)

    Eghtesad, Adnan; Knezevic, Marko

    2018-07-01

    A corrective smooth particle method (CSPM) within smooth particle hydrodynamics (SPH) is used to study the deformation of an aircraft structure under high-velocity water-ditching impact load. The CSPM-SPH method features a new approach for the prediction of two-way fluid-structure interaction coupling. Results indicate that the implementation is well suited for modeling the deformation of structures under high-velocity impact into water as evident from the predicted stress and strain localizations in the aircraft structure as well as the integrity of the impacted interfaces, which show no artificial particle penetrations. To reduce the simulation time, a heterogeneous particle size distribution over a complex three-dimensional geometry is used. The variable particle size is achieved from a finite element mesh with variable element size and, as a result, variable nodal (i.e., SPH particle) spacing. To further accelerate the simulations, the SPH code is ported to a graphics processing unit using the OpenACC standard. The implementation and simulation results are described and discussed in this paper.

  2. A new approach to fluid-structure interaction within graphics hardware accelerated smooth particle hydrodynamics considering heterogeneous particle size distribution

    NASA Astrophysics Data System (ADS)

    Eghtesad, Adnan; Knezevic, Marko

    2017-12-01

    A corrective smooth particle method (CSPM) within smooth particle hydrodynamics (SPH) is used to study the deformation of an aircraft structure under high-velocity water-ditching impact load. The CSPM-SPH method features a new approach for the prediction of two-way fluid-structure interaction coupling. Results indicate that the implementation is well suited for modeling the deformation of structures under high-velocity impact into water as evident from the predicted stress and strain localizations in the aircraft structure as well as the integrity of the impacted interfaces, which show no artificial particle penetrations. To reduce the simulation time, a heterogeneous particle size distribution over a complex three-dimensional geometry is used. The variable particle size is achieved from a finite element mesh with variable element size and, as a result, variable nodal (i.e., SPH particle) spacing. To further accelerate the simulations, the SPH code is ported to a graphics processing unit using the OpenACC standard. The implementation and simulation results are described and discussed in this paper.

  3. Sobol' sensitivity analysis for stressor impacts on honeybee ...

    EPA Pesticide Factsheets

    We employ Monte Carlo simulation and nonlinear sensitivity analysis techniques to describe the dynamics of a bee exposure model, VarroaPop. Daily simulations are performed of hive population trajectories, taking into account queen strength, foraging success, mite impacts, weather, colony resources, population structure, and other important variables. This allows us to test the effects of defined pesticide exposure scenarios versus controlled simulations that lack pesticide exposure. The daily resolution of the model also allows us to conditionally identify sensitivity metrics. We use the variance-based global decomposition sensitivity analysis method, Sobol’, to assess first- and second-order parameter sensitivities within VarroaPop, allowing us to determine how variance in the output is attributed to each of the input variables across different exposure scenarios. Simulations with VarroaPop indicate queen strength, forager life span and pesticide toxicity parameters are consistent, critical inputs for colony dynamics. Further analysis also reveals that the relative importance of these parameters fluctuates throughout the simulation period according to the status of other inputs. Our preliminary results show that model variability is conditional and can be attributed to different parameters depending on different timescales. By using sensitivity analysis to assess model output and variability, calibrations of simulation models can be better informed to yield more
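    The Sobol' decomposition itself is straightforward to reproduce with the SALib package; the sketch below uses a toy stand-in for a VarroaPop run (the real model is not called here) and hypothetical parameter bounds, but the sampling and analysis calls are the standard SALib workflow for first- and total-order indices.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Toy stand-in for a VarroaPop run: end-of-season colony size as a function of
# queen strength, forager lifespan and pesticide toxicity (all hypothetical).
def colony_model(x):
    queen, lifespan, toxicity = x
    return queen * lifespan * np.exp(-toxicity)

problem = {
    "num_vars": 3,
    "names": ["queen_strength", "forager_lifespan", "pesticide_toxicity"],
    "bounds": [[1.0, 5.0], [4.0, 16.0], [0.0, 2.0]],
}

X = saltelli.sample(problem, 1024)               # Saltelli sampling for Sobol' indices
Y = np.array([colony_model(x) for x in X])
Si = sobol.analyze(problem, Y)                   # variance-based decomposition

print("first-order indices:", dict(zip(problem["names"], Si["S1"].round(3))))
print("total-order indices:", dict(zip(problem["names"], Si["ST"].round(3))))
```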

  4. Modeling studies on the formation of Hurricane Helene: the impact of GPS dropwindsondes from the NAMMA 2006 field campaign

    NASA Astrophysics Data System (ADS)

    Folmer, Michael J.; Pasken, Robert W.; Chiao, Sen; Dunion, Jason; Halverson, Jeffrey

    2016-12-01

    Numerical simulations, using the weather research and forecasting (WRF) model in concert with GPS dropwindsondes released during the NASA African Monsoon Multidisciplinary Analyses 2006 Field Campaign, were conducted to provide additional insight on SAL-TC interaction. Using NCEP Final analysis datasets to initialize the WRF, a sensitivity test was performed on the assimilated (i.e., observation nudging) GPS dropwindsondes to understand the effects of individual variables (i.e., moisture, temperature, and winds) on the simulation and determine the extent of improvement when compared to available observations. The results suggested that GPS dropwindsonde temperature data provided the most significant difference in the simulated storm organization, storm strength, and synoptic environment, but assimilating all of the variables at the same time gave a more representative mesoscale and synoptic picture.

  5. How Do Microphysical Processes Influence Large-Scale Precipitation Variability and Extremes?

    DOE PAGES

    Hagos, Samson; Ruby Leung, L.; Zhao, Chun; ...

    2018-02-10

    Convection permitting simulations using the Model for Prediction Across Scales-Atmosphere (MPAS-A) are used to examine how microphysical processes affect large-scale precipitation variability and extremes. An episode of the Madden-Julian Oscillation is simulated using MPAS-A with a refined region at 4-km grid spacing over the Indian Ocean. It is shown that cloud microphysical processes regulate the precipitable water (PW) statistics. Because of the non-linear relationship between precipitation and PW, PW exceeding a certain critical value (PWcr) contributes disproportionately to precipitation variability. However, the frequency of PW exceeding PWcr decreases rapidly with PW, so changes in microphysical processes that shift the column PW statistics relative to PWcr even slightly have large impacts on precipitation variability. Furthermore, precipitation variance and extreme precipitation frequency are approximately linearly related to the difference between the mean and critical PW values. Thus observed precipitation statistics could be used to directly constrain model microphysical parameters, as this study demonstrates using radar observations from the DYNAMO field campaign.

  6. Trace gas variability within the Asian monsoon anticyclone on intraseasonal and interannual timescales

    NASA Astrophysics Data System (ADS)

    Nützel, Matthias; Dameris, Martin; Fierli, Federico; Stiller, Gabriele; Garny, Hella; Jöckel, Patrick

    2016-04-01

    The Asian monsoon and the associated monsoon anticyclone have the potential to substantially influence the composition of the UTLS (upper troposphere/lower stratosphere) and hence global climate. Here we study the variability of the Asian summer monsoon anticyclone in the UTLS on intraseasonal and interannual timescales using results from long-term simulations performed with the CCM EMAC (ECHAM5/MESSy Atmospheric Chemistry). In particular, we focus on specified-dynamics simulations (Newtonian relaxation to ERA-Interim data) covering the period 1980-2013, which have been performed within the ESCiMo (Earth System Chemistry integrated Modelling) project (Jöckel et al., GMDD, 2015). Our main focus is the variability of the anticyclone's strength (in terms of potential vorticity, geopotential and circulation) and the variability of trace gas signatures (O3, H2O) within the anticyclone. To support our findings, we also include observations from satellites (MIPAS, MLS). Our work is linked to the EU StratoClim campaign in 2016.

  7. An efficient deterministic-probabilistic approach to modeling regional groundwater flow: 2. Application to Owens Valley, California

    USGS Publications Warehouse

    Guymon, Gary L.; Yen, Chung-Cheng

    1990-01-01

    The applicability of a deterministic-probabilistic model for predicting water tables in southern Owens Valley, California, is evaluated. The model is based on a two-layer deterministic model that is cascaded with a two-point probability model. To reduce the potentially large number of uncertain variables in the deterministic model, lumping of uncertain variables was evaluated by sensitivity analysis to reduce the total number of uncertain variables to three: hydraulic conductivity, storage coefficient or specific yield, and the source-sink function. Results demonstrate that lumping of uncertain parameters reduces computational effort while providing sufficient precision for the case studied. Simulated spatial coefficients of variation for water table temporal position are small in most of the basin, which suggests that deterministic models can predict water tables in these areas with good precision. However, in several important areas where pumping occurs or the geology is complex, the simulated spatial coefficients of variation are overestimated by the two-point probability method.
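    The cascaded two-point probability step can be illustrated with Rosenblueth's two-point estimate method: the deterministic model is run at every ± one-standard-deviation corner of the uncertain inputs (2^3 = 8 runs for three variables) and the output moments are estimated from those runs. The groundwater response below is a hypothetical stand-in, not the Owens Valley model.

```python
import itertools
import numpy as np

def two_point_estimate(model, means, stds):
    """Rosenblueth two-point estimate of the output mean and standard deviation.

    Evaluates the model at every +/- one-sigma corner of the uncertain variables
    (2**n runs), weighting the corners equally (independent, symmetric inputs assumed).
    """
    n = len(means)
    outputs = []
    for signs in itertools.product([-1.0, 1.0], repeat=n):
        x = [m + s * sd for m, s, sd in zip(means, signs, stds)]
        outputs.append(model(*x))
    outputs = np.array(outputs)
    return outputs.mean(), outputs.std()

# Hypothetical stand-in for the water-table response (m) to the three lumped variables.
def water_table(conductivity, specific_yield, source_sink):
    return 10.0 + 2.0 * np.log(conductivity) - 5.0 * specific_yield + 0.1 * source_sink

mean_h, std_h = two_point_estimate(water_table,
                                   means=[20.0, 0.15, 3.0],
                                   stds=[5.0, 0.03, 1.0])
print(f"water table: mean {mean_h:.2f} m, coefficient of variation {std_h / mean_h:.3f}")
```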

  8. An efficient deterministic-probabilistic approach to modeling regional groundwater flow: 2. Application to Owens Valley, California

    NASA Astrophysics Data System (ADS)

    Guymon, Gary L.; Yen, Chung-Cheng

    1990-07-01

    The applicability of a deterministic-probabilistic model for predicting water tables in southern Owens Valley, California, is evaluated. The model is based on a two-layer deterministic model that is cascaded with a two-point probability model. To reduce the potentially large number of uncertain variables in the deterministic model, lumping of uncertain variables was evaluated by sensitivity analysis to reduce the total number of uncertain variables to three: hydraulic conductivity, storage coefficient or specific yield, and the source-sink function. Results demonstrate that lumping of uncertain parameters reduces computational effort while providing sufficient precision for the case studied. Simulated spatial coefficients of variation for water table temporal position are small in most of the basin, which suggests that deterministic models can predict water tables in these areas with good precision. However, in several important areas where pumping occurs or the geology is complex, the simulated spatial coefficients of variation are overestimated by the two-point probability method.

  9. How Do Microphysical Processes Influence Large-Scale Precipitation Variability and Extremes?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagos, Samson; Ruby Leung, L.; Zhao, Chun

    Convection permitting simulations using the Model for Prediction Across Scales-Atmosphere (MPAS-A) are used to examine how microphysical processes affect large-scale precipitation variability and extremes. An episode of the Madden-Julian Oscillation is simulated using MPAS-A with a refined region at 4-km grid spacing over the Indian Ocean. It is shown that cloud microphysical processes regulate the precipitable water (PW) statistics. Because of the non-linear relationship between precipitation and PW, PW exceeding a certain critical value (PWcr) contributes disproportionately to precipitation variability. However, the frequency of PW exceeding PWcr decreases rapidly with PW, so changes in microphysical processes that shift the column PW statistics relative to PWcr even slightly have large impacts on precipitation variability. Furthermore, precipitation variance and extreme precipitation frequency are approximately linearly related to the difference between the mean and critical PW values. Thus observed precipitation statistics could be used to directly constrain model microphysical parameters, as this study demonstrates using radar observations from the DYNAMO field campaign.

  10. The coupled atmosphere-chemistry-ocean model SOCOL-MPIOM

    NASA Astrophysics Data System (ADS)

    Muthers, S.; Anet, J. G.; Stenke, A.; Raible, C. C.; Rozanov, E.; Brönnimann, S.; Peter, T.; Arfeuille, F. X.; Shapiro, A. I.; Beer, J.; Steinhilber, F.; Brugnara, Y.; Schmutz, W.

    2014-05-01

    The newly developed atmosphere-ocean-chemistry-climate model SOCOL-MPIOM is presented by demonstrating the influence of the interactive chemistry module on the climate state and the variability. To this end, we compare pre-industrial control simulations with (CHEM) and without (NOCHEM) interactive chemistry. In general, the influence of the chemistry on the mean state and the variability is small and mainly restricted to the stratosphere and mesosphere. The largest differences are found for the atmospheric dynamics in the polar regions, with slightly stronger northern and southern winter polar vortices in CHEM. The strengthening of the vortex is related to larger stratospheric temperature gradients, which are attributed to a parametrization of the absorption of ozone and oxygen in the Lyman-alpha, Schumann-Runge, Hartley, and Huggins bands. This effect is parametrized in the version with interactive chemistry only. A second reason for the temperature differences between CHEM and NOCHEM is related to diurnal variations in the ozone concentrations in the higher atmosphere, which are missing in NOCHEM. Furthermore, stratospheric water vapour concentrations differ substantially between the two experiments, but their effect on the temperatures is small. In both setups, the simulated intensity and variability of the northern polar vortex are inside the range of present-day observations. Sudden stratospheric warming events are well reproduced in terms of their frequency, but the distribution amongst the winter months is too uniform. Additionally, the performance of SOCOL-MPIOM under changing external forcings is assessed for the period 1600-2000 using an ensemble of simulations driven by a spectral solar forcing reconstruction. The amplitude of the reconstruction is large in comparison to other state-of-the-art reconstructions, providing an upper limit for the importance of the solar signal. In the pre-industrial period (1600-1850) the simulated surface temperature trends are in reasonable agreement with temperature reconstructions, although the multi-decadal variability is more pronounced. This enhanced variability can be attributed to the variability in the solar forcing. The simulated temperature reductions during the Maunder Minimum are in the lowest probability range of the proxy records. During the Dalton Minimum, when volcanic forcing is also an important driver of temperature variations, the agreement is better. In the industrial period from 1850 onward SOCOL-MPIOM overestimates the temperature increase in comparison to observational data sets. Sensitivity simulations show that this overestimation can be attributed to the increasing trend in the solar forcing reconstruction used in this study and an additional warming induced by the simulated ozone changes.

  11. A Computer Simulation Study of Vntr Population Genetics: Constrained Recombination Rules Out the Infinite Alleles Model

    PubMed Central

    Harding, R. M.; Boyce, A. J.; Martinson, J. J.; Flint, J.; Clegg, J. B.

    1993-01-01

    Extensive allelic diversity in variable numbers of tandem repeats (VNTRs) has been discovered in the human genome. For population genetic studies of VNTRs, such as forensic applications, it is important to know whether a neutral mutation-drift balance of VNTR polymorphism can be represented by the infinite alleles model. The assumption of the infinite alleles model that each new mutant is unique is very likely to be violated by unequal sister chromatid exchange (USCE), the primary process believed to generate VNTR mutants. We show that increasing both mutation rates and misalignment constraint for intrachromosomal recombination in a computer simulation model reduces simulated VNTR diversity below the expectations of the infinite alleles model. Maximal constraint, represented as slippage of single repeats, reduces simulated VNTR diversity to levels expected from the stepwise mutation model. Although misalignment rule is the more important variable, mutation rate also has an effect. At moderate rates of USCE, simulated VNTR diversity fluctuates around infinite alleles expectation. However, if rates of USCE are high, as for hypervariable VNTRs, simulated VNTR diversity is consistently lower than predicted by the infinite alleles model. This has been observed for many VNTRs and accounted for by technical problems in distinguishing alleles of neighboring size classes. We use sampling theory to confirm the intrinsically poor fit to the infinite alleles model of both simulated VNTR diversity and observed VNTR polymorphisms sampled from two Papua New Guinean populations. PMID:8293988
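    The core idea, that constrained (stepwise) mutation depresses diversity below the infinite alleles expectation, can be reproduced with a minimal Wright-Fisher sketch. The population size, mutation rate and number of generations below are illustrative, not those of the published simulations, and single-repeat slippage stands in for maximally constrained USCE.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_vntr(n=500, mu=1e-3, generations=20000, max_step=1):
    """Wright-Fisher population of VNTR repeat counts with stepwise mutation.

    max_step = 1 mimics maximal misalignment constraint (single-repeat slippage);
    larger values allow bigger unequal-exchange events. Returns expected heterozygosity.
    """
    alleles = np.full(2 * n, 20)                      # diploid population, 20 repeats per allele
    for _ in range(generations):
        alleles = rng.choice(alleles, size=2 * n)     # genetic drift (resampling with replacement)
        mutants = rng.random(2 * n) < mu
        steps = rng.integers(1, max_step + 1, size=mutants.sum())
        signs = rng.choice([-1, 1], size=mutants.sum())
        alleles[mutants] = np.maximum(alleles[mutants] + signs * steps, 1)
    freqs = np.bincount(alleles) / alleles.size
    return 1.0 - np.sum(freqs**2)

theta = 4 * 500 * 1e-3                                # 4 * N * mu
print("infinite alleles expectation:", round(theta / (1 + theta), 3))
print("simulated heterozygosity (maximal constraint):", round(simulate_vntr(), 3))
```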

  12. A computer simulation study of VNTR population genetics: constrained recombination rules out the infinite alleles model.

    PubMed

    Harding, R M; Boyce, A J; Martinson, J J; Flint, J; Clegg, J B

    1993-11-01

    Extensive allelic diversity in variable numbers of tandem repeats (VNTRs) has been discovered in the human genome. For population genetic studies of VNTRs, such as forensic applications, it is important to know whether a neutral mutation-drift balance of VNTR polymorphism can be represented by the infinite alleles model. The assumption of the infinite alleles model that each new mutant is unique is very likely to be violated by unequal sister chromatid exchange (USCE), the primary process believed to generate VNTR mutants. We show that increasing both mutation rates and misalignment constraint for intrachromosomal recombination in a computer simulation model reduces simulated VNTR diversity below the expectations of the infinite alleles model. Maximal constraint, represented as slippage of single repeats, reduces simulated VNTR diversity to levels expected from the stepwise mutation model. Although misalignment rule is the more important variable, mutation rate also has an effect. At moderate rates of USCE, simulated VNTR diversity fluctuates around infinite alleles expectation. However, if rates of USCE are high, as for hypervariable VNTRs, simulated VNTR diversity is consistently lower than predicted by the infinite alleles model. This has been observed for many VNTRs and accounted for by technical problems in distinguishing alleles of neighboring size classes. We use sampling theory to confirm the intrinsically poor fit to the infinite alleles model of both simulated VNTR diversity and observed VNTR polymorphisms sampled from two Papua New Guinean populations.

  13. A computer simulation study of VNTR population genetics: Constrained recombination rules out the infinite alleles model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harding, R.M.; Martinson, J.J.; Flint, J.

    1993-11-01

    Extensive allelic diversity in variable numbers of tandem repeats (VNTRs) has been discovered in the human genome. For population genetic studies of VNTRs, such as forensic applications, it is important to know whether a neutral mutation-drift balance of VNTR polymorphism can be represented by the infinite alleles model. The assumption of the infinite alleles model that each new mutant is unique is very likely to be violated by unequal sister chromatid exchange (USCE), the primary process believed to generate VNTR mutants. The authors show that increasing both mutation rates and misalignment constraint for intrachromosomal recombination in a computer simulation model reduces simulated VNTR diversity below the expectations of the infinite alleles model. Maximal constraint, represented as slippage of single repeats, reduces simulated VNTR diversity to levels expected from the stepwise mutation model. Although misalignment rule is the more important variable, mutation rate also has an effect. At moderate rates of USCE, simulated VNTR diversity fluctuates around the infinite alleles expectation. However, if rates of USCE are high, as for hypervariable VNTRs, simulated VNTR diversity is consistently lower than predicted by the infinite alleles model. This has been observed for many VNTRs and accounted for by technical problems in distinguishing alleles of neighboring size classes. The authors use sampling theory to confirm the intrinsically poor fit to the infinite alleles model of both simulated VNTR diversity and observed VNTR polymorphisms sampled from two Papua New Guinean populations. 25 refs., 20 figs., 4 tabs.

  14. Older People’s Perceptions of Pedestrian Friendliness and Traffic Safety: An Experiment Using Computer-Simulated Walking Environments

    PubMed Central

    Kahlert, Daniela; Schlicht, Wolfgang

    2015-01-01

    Traffic safety and pedestrian friendliness are considered to be important conditions for older people’s motivation to walk through their environment. This study uses an experimental study design with computer-simulated living environments to investigate the effect of micro-scale environmental factors (parking spaces and green verges with trees) on older people’s perceptions of both motivational antecedents (dependent variables). Seventy-four consecutively recruited older people were randomly assigned to watch one of two scenarios (independent variable) on a computer screen. The scenarios simulated a stroll on a sidewalk that is ‘typical’ of a German city. In version ‘A’, the subjects take a fictitious walk on a sidewalk where a number of cars are parked partially on it. In version ‘B’, cars are in parking spaces separated from the sidewalk by grass verges and trees. Subjects assessed their impressions of both dependent variables. A multivariate analysis of covariance showed that subjects’ ratings of perceived traffic safety and pedestrian friendliness were higher for version ‘B’ than for version ‘A’. Cohen’s d indicates medium (d = 0.73) and large (d = 1.23) effect sizes for traffic safety and pedestrian friendliness, respectively. The study suggests that elements of the built environment might affect motivational antecedents of older people’s walking behavior. PMID:26308026

  15. Does raising type 1 error rate improve power to detect interactions in linear regression models? A simulation study.

    PubMed

    Durand, Casey P

    2013-01-01

    Statistical interactions are a common component of data analysis across a broad range of scientific disciplines. However, the statistical power to detect interactions is often undesirably low. One solution is to elevate the Type 1 error rate so that important interactions are not missed in a low power situation. To date, no study has quantified the effects of this practice on power in a linear regression model. A Monte Carlo simulation study was performed. A continuous dependent variable was specified, along with three types of interactions: continuous variable by continuous variable; continuous by dichotomous; and dichotomous by dichotomous. For each of the three scenarios, the interaction effect sizes, sample sizes, and Type 1 error rate were varied, resulting in a total of 240 unique simulations. In general, power to detect the interaction effect was either so low or so high at α = 0.05 that raising the Type 1 error rate only served to increase the probability of including a spurious interaction in the model. A small number of scenarios were identified in which an elevated Type 1 error rate may be justified. Routinely elevating Type 1 error rate when testing interaction effects is not an advisable practice. Researchers are best served by positing interaction effects a priori and accounting for them when conducting sample size calculations.
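    A stripped-down version of one such scenario (a continuous-by-dichotomous interaction) can be simulated as follows; the effect sizes, sample size and number of replications are illustrative rather than the study's 240 scenarios.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)

def interaction_power(n=200, beta_int=0.2, alpha=0.05, n_sims=2000):
    """Monte Carlo power to detect a continuous-by-dichotomous interaction in OLS."""
    rejections = 0
    for _ in range(n_sims):
        x = rng.normal(size=n)                     # continuous predictor
        g = rng.integers(0, 2, size=n)             # dichotomous predictor
        y = 0.3 * x + 0.3 * g + beta_int * x * g + rng.normal(size=n)
        X = sm.add_constant(np.column_stack([x, g, x * g]))
        p_int = sm.OLS(y, X).fit().pvalues[-1]     # p-value of the interaction term
        rejections += p_int < alpha
    return rejections / n_sims

for a in (0.05, 0.10, 0.20):
    print(f"alpha = {a:.2f}: power = {interaction_power(alpha=a):.2f}")
```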

  16. Variability of simulants used in recreating stab events.

    PubMed

    Carr, D J; Wainwright, A

    2011-07-15

    Forensic investigators commonly use simulants/backing materials to mount fabrics and/or garments on when recreating damage due to stab events. Such work may be conducted in support of an investigation to connect a particular knife to a stabbing event by comparing the severance morphology obtained in the laboratory to that observed in the incident. There does not appear to have been a comparison of the effect of simulant type on the morphology of severances in fabrics and simulants, nor on the variability of simulants. This work investigates three simulants (pork, gelatine, expanded polystyrene (EPS)), two knife blades (carving, bread), and how severances in the simulants and an apparel fabric typically used to manufacture T-shirts (single jersey) were affected by (i) simulant type and (ii) blade type. Severances were formed using a laboratory impact apparatus to ensure a consistent impact velocity and hence impact energy independently of the other variables. The impact velocity was chosen so that the force measured was similar to that measured in human performance trials. Force-time and energy-time curves were analysed and severance morphology (y, z directions) investigated. Simulant type and knife type significantly affected the critical forensic measurements of severance length (y direction) in the fabric and 'skin' (Tuftane). The use of EPS resulted in the lowest variability in data; further, the severances recorded in both the fabric and Tuftane more accurately reflected the dimensions of the impacting knives. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  17. A-SIDE: Video Simulation of Teen Alcohol and Marijuana Use Contexts

    PubMed Central

    Anderson, Kristen G; Brackenbury, Lauren; Quackenbush, Mathias; Buras, Morgan; Brown, Sandra A; Price, Joseph

    2014-01-01

    Objective: This investigation examined the concurrent validity of a new video simulation assessing adolescent alcohol and marijuana decision making in peer contexts (A-SIDE). Method: One hundred eleven youth (60% female; age 14–19 years; 80% White, 12.6% Latino; 24% recruited from treatment centers) completed the A-SIDE simulation, self-report measures of alcohol and marijuana use and disorder symptoms, and measures of alcohol (i.e., drinking motives and expectancies) and marijuana (i.e., expectancies) cognitions in the laboratory. Results: Study findings support concurrent associations between behavioral willingness to use alcohol and marijuana on the simulation and current use variables as well as on drinking motives and marijuana expectancies. Relations with use variables were found even when sample characteristics were controlled. Interestingly, willingness to accept nonalcoholic beverages (e.g., soda) and food offers in the simulation were inversely related to recent alcohol and marijuana use behavior. Conclusions: These findings are consistent with prior work using laboratory simulations with college students and provide preliminary validity evidence for this procedure. Future work is needed to examine the predictive utility of the A-SIDE with larger and more diverse samples of youth. PMID:25343652

  18. Parallel methodology to capture cyclic variability in motored engines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ameen, Muhsin M.; Yang, Xiaofeng; Kuo, Tang-Wei

    2016-07-28

    Numerical prediction of cycle-to-cycle variability (CCV) in SI engines is extremely challenging for two key reasons: (i) high-fidelity methods such as large eddy simulation (LES) are required to accurately capture the in-cylinder turbulent flowfield, and (ii) CCV is experienced over long timescales and hence the simulations need to be performed for hundreds of consecutive cycles. In this study, a new methodology is proposed to dissociate this long time-scale problem into several shorter time-scale problems, which can considerably reduce the computational time without sacrificing the fidelity of the simulations. The strategy is to perform multiple single-cycle simulations in parallel by effectively perturbing the simulation parameters such as the initial and boundary conditions. It is shown that by perturbing the initial velocity field effectively based on the intensity of the in-cylinder turbulence, the mean and variance of the in-cylinder flowfield are captured reasonably well. Adding perturbations in the initial pressure field and the boundary pressure improves the predictions. It is shown that this new approach is able to give accurate predictions of the flowfield statistics in less than one-tenth of the time required for the conventional approach of simulating consecutive engine cycles.

  19. Final Report: Closeout of the Award NO. DE-FG02-98ER62618 (M.S. Fox-Rabinovitz, P.I.)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fox-Rabinovitz, M. S.

    The final report describes a study aimed at exploring the variable-resolution stretched-grid (SG) approach to decadal regional climate modeling using advanced numerical techniques. The obtained results have shown that variable-resolution SG-GCMs, using stretched grids with fine resolution over the area(s) of interest, are a viable, established approach to regional climate modeling. The developed SG-GCMs have been extensively used for regional climate experimentation. The SG-GCM simulations are aimed at studying U.S. regional climate variability, with an emphasis on anomalous summer climate events such as U.S. droughts and floods.

  20. Integrative Bayesian variable selection with gene-based informative priors for genome-wide association studies.

    PubMed

    Zhang, Xiaoshuai; Xue, Fuzhong; Liu, Hong; Zhu, Dianwen; Peng, Bin; Wiemels, Joseph L; Yang, Xiaowei

    2014-12-10

    Genome-wide Association Studies (GWAS) are typically designed to identify phenotype-associated single nucleotide polymorphisms (SNPs) individually using univariate analysis methods. Though providing valuable insights into genetic risks of common diseases, the genetic variants identified by GWAS generally account for only a small proportion of the total heritability for complex diseases. To address this "missing heritability" problem, we implemented a strategy called integrative Bayesian Variable Selection (iBVS), which is based on a hierarchical model that incorporates an informative prior by considering the gene interrelationship as a network. It was applied here to both simulated and real data sets. Simulation studies indicated that the iBVS method was advantageous, with the highest AUC in both variable selection and outcome prediction when compared to Stepwise and LASSO-based strategies. In an analysis of a leprosy case-control study, iBVS selected 94 SNPs as predictors, while LASSO selected 100 SNPs. The Stepwise regression yielded a more parsimonious model with only 3 SNPs. The prediction results demonstrated that the iBVS method had performance comparable with that of LASSO, but better than the Stepwise strategy. The proposed iBVS strategy is a novel and valid method for genome-wide association studies, with the additional advantage that it produces more interpretable posterior probabilities for each variable, unlike LASSO and other penalized regression methods.
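    The iBVS prior itself is not sketched here, but the LASSO comparator is easy to reproduce on simulated SNP data with scikit-learn; the genotype frequencies, effect sizes and sample sizes below are hypothetical and unrelated to the leprosy data set.

```python
import numpy as np
from sklearn.linear_model import LogisticRegressionCV
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)

# Simulated genotypes: 1000 subjects x 500 SNPs coded 0/1/2, with 10 truly causal SNPs.
X = rng.binomial(2, 0.3, size=(1000, 500)).astype(float)
beta = np.zeros(500)
beta[:10] = 0.5
logits = X @ beta
logits -= logits.mean()                              # roughly balanced cases and controls
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logits)))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# L1-penalised (LASSO-type) logistic regression with a cross-validated penalty.
lasso = LogisticRegressionCV(penalty="l1", solver="liblinear", Cs=10, cv=5,
                             max_iter=5000).fit(X_tr, y_tr)
selected = np.flatnonzero(lasso.coef_[0])
print("SNPs selected:", selected.size)
print("test AUC     :", round(roc_auc_score(y_te, lasso.predict_proba(X_te)[:, 1]), 3))
```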

  1. Perspectives for short timescale variability studies with Gaia

    NASA Astrophysics Data System (ADS)

    Roelens, M.; Eyer, L.; Mowlavi, N.; Lecoeur-Taïbi, I.; Rimoldini, L.; Blanco-Cuaresma, S.; Palaversa, L.; Süveges, M.; Charnas, J.; Wevers, T.

    2017-12-01

    We assess the potential of Gaia for detecting and characterizing short timescale variables, i.e. at timescales from a few seconds to a dozen hours, through extensive light-curve simulations for various short timescale variable types, including both periodic and non-periodic variability. We show that the variogram analysis applied to Gaia photometry should enable the detection of such fast variability phenomena, down to amplitudes of a few millimagnitudes, with limited contamination from longer timescale variables or constant sources. This approach also gives valuable information on the typical timescale(s) of the considered variation, which could complement the results of classical period search methods and help prepare ground-based follow-up of the Gaia short timescale candidates.
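    A minimal version of the variogram (first-order structure function) used for this kind of detection is sketched below on a synthetic, irregularly sampled light curve; it is not the Gaia DPAC implementation, and the sampling, amplitude and lag grid are illustrative.

```python
import numpy as np

def variogram(times, mags, lags, tol=0.1):
    """First-order structure function gamma(tau) = 0.5 <(m(t + tau) - m(t))^2>."""
    dt = np.abs(times[:, None] - times[None, :])               # all pairwise time separations
    dm2 = (mags[:, None] - mags[None, :]) ** 2                 # squared magnitude differences
    gamma = []
    for lag in lags:
        mask = np.abs(dt - lag) < tol * lag                    # pairs whose separation matches the lag
        gamma.append(0.5 * dm2[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# Hypothetical sparse sampling of a 2-hour periodic variable with 5 mmag amplitude.
rng = np.random.default_rng(3)
t = np.sort(rng.uniform(0.0, 30.0, 400))                       # epochs in days
m = 0.005 * np.sin(2 * np.pi * t / (2.0 / 24.0)) + rng.normal(0.0, 0.001, t.size)
lags = np.array([0.01, 0.02, 0.04, 0.08, 0.17, 0.35, 0.7])     # trial timescales in days
print(np.round(variogram(t, m, lags), 6))
# The variogram plateaus near half the squared amplitude once the lag exceeds the
# variability timescale, which is what flags short timescale candidates.
```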

  2. Study for urbanization corresponding to socio-economic activities in Savannaket, Laos using satellite remote sensing

    NASA Astrophysics Data System (ADS)

    Kimijiama, S.; Nagai, M.

    2014-06-01

    In the Greater Mekong Sub-region (GMS), economic liberalization and deregulation facilitated by the GMS Regional Economic Cooperation Program (GMS-ECP) have triggered urbanization in the region. However, the urbanization rate and its linkage to socio-economic activities are ambiguous. The objectives of this paper are to: (a) determine the changes in urban area from 1972 to 2013 using remote sensing data, and (b) analyse the relationship between urbanization and socio-economic activities in central Laos. The study employed supervised classification and human visual interpretation to determine changes in the urbanization rate. Regression analysis was used to analyze the correlation between the urbanization rate and socio-economic variables. The result shows that the urban area increased significantly from 1972 to 2013. Socio-economic variables such as school enrollment, labour force, mortality rate, water source and sanitation were highly correlated with the rate of urbanization during the period. The study concluded that identifying the socio-economic variables that are highly correlated with the urbanization rate could enable a further urbanization simulation to be conducted. Such a simulation helps in designing policies for sustainable development.

  3. The Effect of the Interannual Variability of the OH Sink on the Interannual Variability of the Atmospheric Methane Mixing Ratio and Carbon Stable Isotope Composition

    NASA Astrophysics Data System (ADS)

    Guillermo Nuñez Ramirez, Tonatiuh; Houweling, Sander; Marshall, Julia; Williams, Jason; Brailsford, Gordon; Schneising, Oliver; Heimann, Martin

    2013-04-01

    The atmospheric hydroxyl radical concentration (OH) varies due to changes in the incoming UV radiation, in the abundance of atmospheric species involved in the production, recycling and destruction of OH molecules, and due to climate variability. Variability in carbon monoxide emissions from biomass burning induced by the El Niño Southern Oscillation is particularly important. Although the OH sink accounts for the oxidation of approximately 90% of atmospheric CH4, the effect of the variability in the distribution and strength of the OH sink on the interannual variability of the atmospheric methane (CH4) mixing ratio and stable carbon isotope composition (δ13C-CH4) has often been ignored. To show this effect we simulated the atmospheric signals of CH4 in a three-dimensional atmospheric transport model (TM3). ERA-Interim reanalysis data provided the atmospheric transport and temperature variability from 1990 to 2010. We performed simulations using time-dependent OH concentration estimates from an atmospheric chemistry transport model and an atmospheric chemistry climate model. The models assumed different sets of reactions and algorithms, which resulted in very different strengths and distributions of the OH concentration. Methane emissions were based on published bottom-up estimates including inventories, upscaled estimates and modeled fluxes. The simulations also included modeled concentrations of atomic chlorine (Cl) and excited oxygen atoms (O(1D)). The isotopic signals of the sources and the fractionation factors of the sinks were based on literature values; however, the isotopic signal from wetlands and enteric fermentation processes followed a linear relationship with a map of C4 plant fraction. The same set of CH4 emissions and stratospheric reactants was used in all simulations. Two simulations were done per OH field: one in which the CH4 sources were allowed to vary interannually, and a second where the sources were climatological. The simulated mixing ratios and isotopic compositions at global reference stations were used to construct more robust indicators such as global and zonal means and interhemispheric differences. We also compared the modelled CH4 mixing ratio to satellite observations, for the period 2003 to 2004 with SCIAMACHY and from 2009 to 2010 with GOSAT. The interannual variability of the different OH fields imprinted an interannual variation on the atmospheric CH4 mixing ratio with a magnitude of ±10 ppb, which is comparable to the effect of all sources combined. Meanwhile, its effect on the interannual variability of δ13C-CH4 was minor (< 10%). The interannual variability of the mixing ratio interhemispheric difference is dominated by the sources because the OH sink is concentrated in the tropics, and thus its interannual variability affects both hemispheres. Meanwhile, although OH plays an important role in establishing an interhemispheric gradient of δ13C-CH4, the interannual variation of this gradient is negligibly affected by the choice of OH field. Overall, the study showed that the variability of the OH sink plays a significant role in the interannual variability of the atmospheric methane mixing ratio and must be considered to improve our understanding of the recent trends in the global methane budget.

  4. LES/PDF studies of joint statistics of mixture fraction and progress variable in piloted methane jet flames with inhomogeneous inlet flows

    NASA Astrophysics Data System (ADS)

    Zhang, Pei; Barlow, Robert; Masri, Assaad; Wang, Haifeng

    2016-11-01

    The mixture fraction and progress variable are often used as independent variables for describing turbulent premixed and non-premixed flames. There is a growing interest in using these two variables for describing partially premixed flames. The joint statistical distribution of the mixture fraction and progress variable is of great interest in developing models for partially premixed flames. In this work, we conduct predictive studies of the joint statistics of mixture fraction and progress variable in a series of piloted methane jet flames with inhomogeneous inlet flows. The employed models combine large eddy simulations with the Monte Carlo probability density function (PDF) method. The joint PDFs and marginal PDFs are examined in detail by comparing the model predictions and the measurements. Different presumed shapes of the joint PDFs are also evaluated.
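    The joint statistics themselves can be estimated directly from particle (or cell) samples with a two-dimensional histogram, which is how comparisons with presumed PDF shapes are usually made. The sketch below uses synthetic samples of mixture fraction Z and progress variable c, not the LES/PDF data of the piloted flames studied here.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic stand-ins for particle samples of mixture fraction Z and progress variable c.
Z = rng.beta(2.0, 5.0, 50_000)
c = np.clip(rng.beta(4.0, 2.0, 50_000) * (1.0 - 2.0 * np.abs(Z - 0.25)), 0.0, 1.0)

# Joint PDF on a 40 x 40 grid, normalised so that it integrates to one.
pdf, z_edges, c_edges = np.histogram2d(Z, c, bins=40, range=[[0, 1], [0, 1]], density=True)
dz, dc = np.diff(z_edges)[0], np.diff(c_edges)[0]

# Marginal PDFs recovered by integrating the joint PDF over the other variable.
pdf_Z = pdf.sum(axis=1) * dc
pdf_c = pdf.sum(axis=0) * dz
print("integral of joint PDF    :", round((pdf * dz * dc).sum(), 3))   # ~1.0
print("integral of marginal p(Z):", round((pdf_Z * dz).sum(), 3))
print("integral of marginal p(c):", round((pdf_c * dc).sum(), 3))
```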

  5. On the granular fingering instability: controlled triggering in laboratory experiments and numerical simulations

    NASA Astrophysics Data System (ADS)

    Vriend, Nathalie; Tsang, Jonny; Arran, Matthew; Jin, Binbin; Johnsen, Alexander

    2017-11-01

    When a mixture of small, smooth particles and larger, coarse particles is released on a rough inclined plane, the initially uniform front may break up into distinct fingers which elongate over time. This fingering instability is sensitive to the unique arrangement of individual particles and is driven by granular segregation (Pouliquen et al., 1997). Variability in initial conditions creates significant limitations for consistent experimental and numerical validation of newly developed theoretical models (Baker et al., 2016) for finger formation. We present an experimental study using a novel tool that sets the initial finger width of the instability. By changing this trigger width between experiments, we explore the response of the avalanche breakup to perturbations of different widths. Discrete particle simulations (using MercuryDPM, Thornton et al., 2012) are conducted in a similar setting, reproducing the variable finger width and allowing validation between experiments and numerical simulations. A good agreement between simulations and experiments is obtained, and ongoing theoretical work is briefly introduced. NMV acknowledges the Royal Society Dorothy Hodgkin Research Fellowship.

  6. Hybrid General Pattern Search and Simulated Annealing for Industrial Production Planning Problems

    NASA Astrophysics Data System (ADS)

    Vasant, P.; Barsoum, N.

    2010-06-01

    In this paper, a hybridization of the GPS (General Pattern Search) method and SA (Simulated Annealing) is incorporated into the optimization process in order to find the global optimal solution for the fitness function and decision variables, as well as a minimal computational CPU time. The real strength of the SA approach has been tested on this case-study problem of industrial production planning. This is due to the great advantage of SA in easily escaping local minima by accepting uphill moves through a probabilistic procedure in the final stages of the optimization process. Vasant [1], in his Ph.D. thesis, provided 16 different heuristic and meta-heuristic techniques for solving industrial production problems with non-linear cubic objective functions, eight decision variables and 29 constraints. In this paper, fuzzy technological problems have been solved using hybrid techniques of general pattern search and simulated annealing. The simulated and computational results are compared to various other evolutionary techniques.
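    The probabilistic uphill-acceptance rule that lets SA escape local minima is short enough to sketch. The objective below is a generic multimodal stand-in, not the paper's cubic production-planning problem, and the cooling schedule and step size are illustrative.

```python
import math
import random

random.seed(0)

def simulated_annealing(objective, x0, step=0.5, t0=10.0, cooling=0.99, iters=5000):
    """Minimise `objective`, accepting uphill moves with probability exp(-dE / T)."""
    x, fx = list(x0), objective(x0)
    best_x, best_f = list(x), fx
    t = t0
    for _ in range(iters):
        cand = [xi + random.uniform(-step, step) for xi in x]
        fc = objective(cand)
        de = fc - fx
        if de < 0 or random.random() < math.exp(-de / t):   # probabilistic uphill acceptance
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = list(x), fx
        t *= cooling                                          # geometric cooling schedule
    return best_x, best_f

# Generic multimodal stand-in objective with two decision variables.
rastrigin = lambda v: 20 + sum(vi * vi - 10 * math.cos(2 * math.pi * vi) for vi in v)
print(simulated_annealing(rastrigin, [3.0, -3.0]))
```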

  7. Incorporation of Fixed Installation Costs into Optimization of Groundwater Remediation with a New Efficient Surrogate Nonlinear Mixed Integer Optimization Algorithm

    NASA Astrophysics Data System (ADS)

    Shoemaker, Christine; Wan, Ying

    2016-04-01

    Optimization of nonlinear water resources management issues which have a mixture of fixed (e.g. construction cost for a well) and variable (e.g. cost per gallon of water pumped) costs has not been well addressed, because prior algorithms for the resulting nonlinear mixed integer problems have required many groundwater simulations (with different configurations of the decision variables), especially when the solution space is multimodal. In particular, heuristic methods like genetic algorithms have often been used in the water resources area, but they require so many groundwater simulations that only small systems have been solved. Hence there is a need for a method that reduces the number of expensive groundwater simulations. A recently published algorithm for nonlinear mixed integer programming using surrogates was shown in this study to greatly reduce the computational effort for obtaining accurate answers to problems involving fixed costs for well construction as well as variable costs for pumping, because of a substantial reduction in the number of groundwater simulations required. Results are presented for a US EPA hazardous waste site. The nonlinear mixed integer surrogate algorithm is general and can be used on other problems arising in hydrology, with open source codes in Matlab and Python ("pySOT" in Bitbucket).
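    The reason a surrogate cuts the number of expensive simulations can be shown with a generic loop: fit a cheap response-surface model to the runs done so far, search it for a promising candidate, and spend one expensive simulation per iteration. This is not pySOT's API; the "groundwater model" is a cheap stand-in with fixed installation and variable pumping costs, and the integer well decisions are handled by simple rounding.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(11)

def expensive_simulation(x):
    """Stand-in groundwater cost model: x = (on/off for 2 wells, 2 pumping rates)."""
    on = np.round(x[:2])                                 # integer decisions: build the well or not
    rates = x[2:]
    fixed = 100.0 * on.sum()                             # fixed installation costs
    variable = 5.0 * (on * rates).sum()                  # variable pumping costs
    violation = max(0.0, 8.0 - (on * rates).sum())       # remediation-target shortfall
    return fixed + variable + 500.0 * violation

bounds = np.array([[0, 1], [0, 1], [0, 10], [0, 10]], dtype=float)
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(12, 4))        # initial space-filling design
y = np.array([expensive_simulation(x) for x in X])

for _ in range(30):                                              # surrogate-guided iterations
    surrogate = RBFInterpolator(X, y)                            # cheap RBF response surface
    cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(2000, 4))
    best = cand[np.argmin(surrogate(cand))]                      # inexpensive search on the surrogate
    X = np.vstack([X, best])
    y = np.append(y, expensive_simulation(best))                 # one expensive run per iteration

i = np.argmin(y)
print("best cost:", round(y[i], 1), "decisions:", np.round(X[i], 2))
# Real surrogate optimizers also balance exploration against exploitation; this
# pure-exploitation loop is only meant to show where the savings come from.
```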

  8. Climate and atmosphere simulator for experiments on ecological systems in changing environments.

    PubMed

    Verdier, Bruno; Jouanneau, Isabelle; Simonnet, Benoit; Rabin, Christian; Van Dooren, Tom J M; Delpierre, Nicolas; Clobert, Jean; Abbadie, Luc; Ferrière, Régis; Le Galliard, Jean-François

    2014-01-01

    Grand challenges in global change research and environmental science raise the need for replicated experiments on ecosystems subjected to controlled changes in multiple environmental factors. We designed and developed the Ecolab as a variable climate and atmosphere simulator for multifactor experimentation on natural or artificial ecosystems. The Ecolab integrates atmosphere conditioning technology optimized for accuracy and reliability. The centerpiece is a highly contained, 13-m³ chamber to host communities of aquatic and terrestrial species and control climate (temperature, humidity, rainfall, irradiance) and atmosphere conditions (O2 and CO2 concentrations). Temperature in the atmosphere and in the water or soil column can be controlled independently of each other. All climatic and atmospheric variables can be programmed to follow dynamical trajectories and simulate gradual as well as step changes. We demonstrate the Ecolab's capacity to simulate a broad range of atmospheric and climatic conditions, their diurnal and seasonal variations, and to support the growth of a model terrestrial plant in two contrasting climate scenarios. The adaptability of the Ecolab design makes it possible to study interactions between variable climate-atmosphere factors and biotic disturbances. Developed as an open-access, multichamber platform, this equipment is available to the international scientific community for exploring interactions and feedbacks between ecological and climate systems.

  9. The use of sensory perception indicators for improving the characterization and modelling of total petroleum hydrocarbon (TPH) grade in soils.

    PubMed

    Roxo, Sónia; de Almeida, José António; Matias, Filipa Vieira; Mata-Lima, Herlander; Barbosa, Sofia

    2016-03-01

    This paper proposes a multistep approach for creating a 3D stochastic model of total petroleum hydrocarbon (TPH) grade in potentially polluted soils of a deactivated oil storage site by using chemical analysis results as primary or hard data and classes of sensory perception variables as secondary or soft data. First, the statistical relationship between the sensory perception variables (e.g. colour, odour and oil-water reaction) and TPH grade is analysed, after which the sensory perception variable exhibiting the highest correlation is selected (oil-water reaction in this case study). The probabilities of cells belonging to classes of oil-water reaction are then estimated for the entire soil volume using indicator kriging. Next, local histograms of TPH grade for each grid cell are computed, combining the probabilities of belonging to a specific sensory perception indicator class and conditional to the simulated values of TPH grade. Finally, simulated images of TPH grade are generated by using the P-field simulation algorithm, utilising the local histograms of TPH grade for each grid cell. The set of simulated TPH values allows several calculations to be performed, such as average values, local uncertainties and the probability of the TPH grade of the soil exceeding a specific threshold value.

  10. A Simulation Study of Categorizing Continuous Exposure Variables Measured with Error in Autism Research: Small Changes with Large Effects.

    PubMed

    Heavner, Karyn; Burstyn, Igor

    2015-08-24

    Variation in the odds ratio (OR) resulting from selection of cutoffs for categorizing continuous variables is rarely discussed. We present results for the effect of varying cutoffs used to categorize a mismeasured exposure in a simulated population in the context of autism spectrum disorders research. Simulated cohorts were created with three distinct exposure-outcome curves and three measurement error variances for the exposure. ORs were calculated using logistic regression for 61 cutoffs (mean ± 3 standard deviations) used to dichotomize the observed exposure. ORs were calculated for five categories with a wide range for the cutoffs. For each scenario and cutoff, the OR, sensitivity, and specificity were calculated. The three exposure-outcome relationships had distinctly shaped OR (versus cutoff) curves, but increasing measurement error obscured the shape. At extreme cutoffs, there was non-monotonic oscillation in the ORs that cannot be attributed to "small numbers." Exposure misclassification following categorization of the mismeasured exposure was differential, as predicted by theory. Sensitivity was higher among cases and specificity among controls. Cutoffs chosen for categorizing continuous variables can have profound effects on study results. When measurement error is not too great, the shape of the OR curve may provide insight into the true shape of the exposure-disease relationship.
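    A minimal version of this design, one exposure-outcome shape and one measurement-error variance, can be reproduced with a few lines of simulation; the coefficients and cutoff grid below are illustrative, not the published scenarios.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2024)

n = 20_000
true_exposure = rng.normal(0.0, 1.0, n)
observed = true_exposure + rng.normal(0.0, 0.5, n)            # mismeasured exposure
p = 1.0 / (1.0 + np.exp(-(-2.0 + 0.8 * true_exposure)))       # outcome depends on the true exposure
outcome = rng.binomial(1, p)

# Odds ratio from dichotomising the observed exposure at a range of cutoffs.
for cutoff in np.arange(-1.5, 1.6, 0.5):
    exposed = (observed > cutoff).astype(float)
    fit = sm.Logit(outcome, sm.add_constant(exposed)).fit(disp=0)
    print(f"cutoff {cutoff:+.1f}: OR = {np.exp(fit.params[1]):.2f}")
```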

  11. THREE-DIMENSIONAL SIMULATIONS OF LONG DURATION GAMMA-RAY BURST JETS: TIMESCALES FROM VARIABLE ENGINES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    López-Cámara, D.; Lazzati, Davide; Morsony, Brian J., E-mail: diego@astro.unam.mx

    2016-08-01

    Gamma-ray burst (GRB) light curves are characterized by marked variability, each showing unique properties. The origin of this variability, at least for a fraction of long GRBs, may be the result of an unsteady central engine. It is thus important to study the effects that an episodic central engine has on the jet propagation and, eventually, on the prompt emission within the collapsar scenario. Thus, in this study we follow the interaction of pulsed outflows with their progenitor stars with hydrodynamic numerical simulations in both two and three dimensions. We show that the propagation of unsteady jets is affected by the interaction with the progenitor material well after the break-out time, especially for jets with long quiescent times comparable to or larger than a second. We also show that this interaction can lead to an asymmetric behavior in which pulse durations and quiescent periods are systematically different. After the pulsed jets drill through the progenitor and the interstellar medium, we find that, on average, the quiescent epochs last longer than the pulses (even in simulations with symmetrical active and quiescent engine times). This could explain the asymmetry detected in the light curves of long quiescent time GRBs.

  12. Arctic storms simulated in atmospheric general circulation models under uniform high, uniform low, and variable resolutions

    NASA Astrophysics Data System (ADS)

    Roesler, E. L.; Bosler, P. A.; Taylor, M.

    2016-12-01

    The impact of strong extratropical storms on coastal communities is large, and the extent to which storms will change with a warming Arctic is unknown. Understanding storms in reanalysis and in climate models is important for future predictions. We know that the number of detected Arctic storms in reanalysis is sensitive to grid resolution. To understand Arctic storm sensitivity to resolution in climate models, we describe simulations designed to identify and compare Arctic storms at uniform low resolution (1 degree), at uniform high resolution (1/8 degree), and at variable resolution (1 degree to 1/8 degree). High-resolution simulations resolve more fine-scale structure and extremes, such as storms, in the atmosphere than a uniform low-resolution simulation. However, the computational cost of running a globally uniform high-resolution simulation is often prohibitive. The variable resolution tool in atmospheric general circulation models permits regional high-resolution solutions at a fraction of the computational cost. The storms are identified using the open-source search algorithm, Stride Search. The uniform high-resolution simulation has over 50% more storms than the uniform low-resolution and over 25% more storms than the variable resolution simulations. Storm statistics from each of the simulations are presented and compared with reanalysis. We propose variable resolution as a cost-effective means of investigating physics/dynamics coupling in the Arctic environment. Future work will include comparisons with observed storms to investigate tuning parameters for high resolution models. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. SAND2016-7402 A

  13. Accelerating simulation for the multiple-point statistics algorithm using vector quantization

    NASA Astrophysics Data System (ADS)

    Zuo, Chen; Pan, Zhibin; Liang, Hao

    2018-03-01

    Multiple-point statistics (MPS) is a prominent algorithm to simulate categorical variables based on a sequential simulation procedure. Assuming training images (TIs) as prior conceptual models, MPS extracts patterns from TIs using a template and records their occurrences in a database. However, complex patterns increase the size of the database and require considerable time to retrieve the desired elements. In order to speed up simulation and improve simulation quality over state-of-the-art MPS methods, we propose an accelerated simulation method for MPS using vector quantization (VQ), called VQ-MPS. First, a variable representation is presented to make categorical variables applicable for vector quantization. Second, we adopt a tree-structured VQ to compress the database so that stationary simulations are realized. Finally, a transformed template and classified VQ are used to address nonstationarity. A two-dimensional (2D) stationary channelized reservoir image is used to validate the proposed VQ-MPS. In comparison with several existing MPS programs, our method exhibits significantly better performance in terms of computational time, pattern reproductions, and spatial uncertainty. Further demonstrations consist of a 2D four-facies simulation, two 2D nonstationary channel simulations, and a three-dimensional (3D) rock simulation. The results reveal that our proposed method is also capable of handling multifacies, nonstationary, and 3D simulations based on 2D TIs.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, J; Li, X; Ding, X

    Purpose: We investigate the spot characteristics and dose profile properties from a compact gantry proton therapy system. This compact design features a dedicated pencil beam scanning nozzle with the scanning magnet located upstream of the final 60 degree bending magnet. Due to the unique beam line design, uncertainty has been raised in the virtual source-to-axis distance (SAD). We investigate its potential clinical impact through measurements and simulation. Methods: A scintillator camera based detector was used to measure spot characteristics and position accuracy. An ion chamber array device was used to measure planar dose profiles. Dose profile in-air simulation was performed using an in-house MATLAB program based on additional spot parameters taken directly from measurements. Spot characteristics such as position and in-air sigma values were used to generate simulated 2D elliptical Gaussian spots. The virtual SAD changes in the longitudinal direction were also simulated. Planar dose profiles were generated by summation of simulated spots at the isocenter, 15 cm above the isocenter, and 15 cm below the isocenter for evaluation of the potential clinical dosimetric impact. Results: We found that the virtual SAD varies depending on the spot location on the longitudinal axis. Measurements have shown that the variable SAD changes from 7 to 12 meters from one end of the treatment field to the other in the longitudinal direction. The simulation shows that the planar dose profile differences between the fixed SAD and variable SAD are within 3% relative to the isocenter profile and the lateral penumbras are within 1 mm difference. Conclusion: Our measurements and simulations show that there are minimal effects on the spot characteristics and dose profiles for this upstream-scanning compact proton therapy system. A further treatment planning study, with the variable virtual SAD accounted for in the planning system, is needed to demonstrate the minimal dosimetric impact.

  15. Influence of Leaf Area Index Prescriptions on Simulations of Heat, Moisture, and Carbon Fluxes

    NASA Technical Reports Server (NTRS)

    Kala, Jatin; Decker, Mark; Exbrayat, Jean-Francois; Pitman, Andy J.; Carouge, Claire; Evans, Jason P.; Abramowitz, Gab; Mocko, David

    2013-01-01

    Leaf area index (LAI), the total one-sided leaf surface area per unit ground surface area, is a key component of land surface models. We investigate the influence of differing, plausible LAI prescriptions on heat, moisture, and carbon fluxes simulated by the Community Atmosphere Biosphere Land Exchange (CABLEv1.4b) model over the Australian continent. A 15-member ensemble of monthly LAI data sets is generated using the MODIS LAI product and gridded observations of temperature and precipitation. Offline simulations lasting 29 years (1980-2008) are carried out at 25 km resolution with the composite monthly means from the MODIS LAI product (control simulation) and compared with simulations using each of the 15 ensemble monthly-varying LAI data sets. The imposed changes in LAI did not strongly influence the sensible and latent heat fluxes, but the carbon fluxes were more strongly affected. Croplands showed the largest sensitivity in gross primary production, with differences ranging from -90 to 60%. Plant functional types (PFTs) with high absolute LAI and low inter-annual variability, such as evergreen broadleaf trees, showed the least response to the different LAI prescriptions, whilst those with lower absolute LAI and higher inter-annual variability, such as croplands, were more sensitive. We show that reliance on a single LAI prescription may not accurately reflect the uncertainty in the simulation of terrestrial carbon fluxes, especially for PFTs with high inter-annual variability. Our study highlights that the accurate representation of LAI in land surface models is key to the simulation of the terrestrial carbon cycle, and hence critical for quantifying the uncertainty in future changes in primary production.

  16. Can we improve streamflow simulation by using higher resolution rainfall information?

    NASA Astrophysics Data System (ADS)

    Lobligeois, Florent; Andréassian, Vazken; Perrin, Charles

    2013-04-01

    The catchment response to rainfall is the interplay between the space-time variability of precipitation, catchment characteristics, and antecedent hydrological conditions. Precipitation dominates the high-frequency hydrological response, and its simulation therefore depends on the way rainfall is represented. One of the characteristics that distinguishes distributed from lumped models is their ability to represent explicitly the spatial variability of precipitation and catchment characteristics. The sensitivity of runoff hydrographs to the spatial variability of forcing data has been a major concern of researchers over the last three decades. However, although the literature on the relationship between spatial rainfall and runoff response is abundant, results are contrasted and sometimes contradictory. Several studies concluded that including information on the rainfall spatial distribution improves discharge simulation (e.g. Ajami et al., 2004, among others), whereas other studies showed the lack of significant improvement in simulations with better information on the rainfall spatial pattern (e.g. Andréassian et al., 2004, among others). The difficulty in reaching a clear consensus is mainly due to the fact that each modeling study is implemented on only a few catchments, whereas the impact of the spatial distribution of rainfall on runoff is known to depend on catchment and event characteristics. Many studies are virtual experiments and only compare flow simulations, which makes it difficult to reach conclusions transposable to real-life case studies. Moreover, the hydrological rainfall-runoff models differ between studies, and the parameterization strategies sometimes tend to favour the distributed approach (or the lumped one). Recently, Météo-France developed a rainfall reanalysis over the whole French territory at 1-kilometer resolution and an hourly time step over a 10-year period, combining radar data and raingauge measurements: weather radar data were corrected and adjusted with both hourly and daily raingauge data. Based on this new high-resolution product, we propose a framework to evaluate the improvements in streamflow simulation obtained by using higher-resolution rainfall information. Semi-distributed modelling is performed for different spatial resolutions of precipitation forcing, from lumped to semi-distributed simulations. Here we do not work on synthetic (simulated) streamflow but with actual measurements, on a large set of 181 French catchments representing a variety of sizes and climates. The rainfall-runoff model is re-calibrated for each resolution of rainfall spatial distribution over a 5-year sub-period and evaluated on the complementary sub-period in validation mode. The results are analysed by catchment classes based on catchment area and for various types of rainfall events based on the spatial variability of precipitation. References: Ajami, N. K., Gupta, H. V., Wagener, T. & Sorooshian, S. (2004) Calibration of a semi-distributed hydrologic model for streamflow estimation along a river system. Journal of Hydrology 298(1-4), 112-135. Andréassian, V., Oddos, A., Michel, C., Anctil, F., Perrin, C. & Loumagne, C. (2004) Impact of spatial aggregation of inputs and parameters on the efficiency of rainfall-runoff models: A theoretical study using chimera watersheds. Water Resources Research 40(5), 1-9.
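    The sketch below shows the kind of spatial aggregation step implied by moving from fine-resolution to lumped precipitation forcing; it block-averages a toy hourly rainfall grid to coarser resolutions. It is a hedged illustration only (a real application would aggregate over catchment masks, not square blocks).

```python
# Hedged sketch (not the authors' code): block-average a fine-resolution
# rainfall field to coarser resolutions, mimicking the step from
# semi-distributed to lumped precipitation forcing.
import numpy as np

def block_average(field, factor):
    """Aggregate a 2D field by averaging non-overlapping factor x factor blocks."""
    rows, cols = field.shape
    rows -= rows % factor
    cols -= cols % factor
    trimmed = field[:rows, :cols]
    return trimmed.reshape(rows // factor, factor,
                           cols // factor, factor).mean(axis=(1, 3))

rng = np.random.default_rng(1)
rain_1km = rng.gamma(shape=0.3, scale=4.0, size=(128, 128))    # toy hourly field, mm
for factor in (1, 4, 16, 64):
    coarse = rain_1km if factor == 1 else block_average(rain_1km, factor)
    print(f"{factor:>3} km: mean={coarse.mean():.2f} mm, spatial std={coarse.std():.2f} mm")
```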

  17. Rainfall variability and extremes over southern Africa: assessment of a climate model to reproduce daily extremes

    NASA Astrophysics Data System (ADS)

    Williams, C.; Kniveton, D.; Layberry, R.

    2009-04-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. A change in the distribution and magnitude of extreme rainfall events (associated with changing variability), such as droughts or flooding, may have a far greater impact on human and natural systems than a changing mean. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The subcontinent is considered especially vulnerable to and ill-equipped (in terms of adaptation) for extreme events, due to a number of factors including extensive poverty, famine, disease and political instability. Rainfall variability and the identification of rainfall extremes are a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the assessment of the ability of a state-of-the-art climate model to simulate climate at daily timescales is carried out using satellite-derived rainfall data from the Microwave Infra-Red Algorithm (MIRA). This dataset covers the period from 1993 to 2002 and the whole of southern Africa at a spatial resolution of 0.1 degree longitude/latitude. The ability of a climate model to simulate current climate provides some indication of how much confidence can be applied to its future predictions. In this paper, simulations of current climate from the UK Meteorological Office Hadley Centre's climate model, in both regional and global mode, are firstly compared to the MIRA dataset at daily timescales. This concentrates primarily on the ability of the model to simulate the spatial and temporal patterns of rainfall variability over southern Africa. Secondly, the ability of the model to reproduce daily rainfall extremes will be assessed, again by a comparison with extremes from the MIRA dataset.

  18. Improving Seasonal Crop Monitoring and Forecasting for Soybean and Corn in Iowa

    NASA Astrophysics Data System (ADS)

    Togliatti, K.; Archontoulis, S.; Dietzel, R.; VanLoocke, A.

    2016-12-01

    Accurately forecasting crop yield in advance of harvest could greatly benefit farmers; however, few evaluations have been conducted to determine the effectiveness of forecasting methods. We tested one such method, which used short-term weather forecasts from the Weather Research and Forecasting (WRF) model to predict in-season weather variables such as maximum and minimum temperature, precipitation, and radiation at four different forecast lengths (2 weeks, 1 week, 3 days, and 0 days). The forecasted weather data, along with current and historical (previous 35 years) data from the Iowa Environmental Mesonet, were combined to drive Agricultural Production Systems sIMulator (APSIM) simulations forecasting soybean and corn yields in 2015 and 2016. The goal of this study is to find the forecast length that reduces the variability of simulated yield predictions while also increasing their accuracy. APSIM simulations of crop variables were evaluated against bi-weekly field measurements of phenology, biomass, and leaf area index from early- and late-planted soybean plots located at the Agricultural Engineering and Agronomy Research Farm in central Iowa as well as the Northwest Research Farm in northwestern Iowa. WRF model predictions were evaluated against observed weather data collected at the experimental fields. Maximum temperature was the most accurately predicted variable, followed by minimum temperature and radiation; precipitation was least accurate according to RMSE values and the number of days forecasted within 20% error of the observed weather. Our analysis indicated that for the majority of months in the growing season the 3-day forecast performed best, the 1-week forecast came second, and the 2-week forecast was least accurate. Preliminary results for yield indicate that the 2-week forecast is the least variable of the forecast lengths but also the least accurate, while the 3-day and 1-week forecasts are more accurate but more variable.
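    The short sketch below illustrates the two verification scores mentioned in the abstract (RMSE and the share of days forecasted within 20% of the observed value) on assumed toy data; the noise levels per forecast length are invented for illustration and are not the study's results.

```python
# Minimal verification sketch under assumed data (not the study's code):
# score forecast lengths by RMSE and by the share of days within 20% of
# the observed value.
import numpy as np

def rmse(forecast, observed):
    return float(np.sqrt(np.mean((forecast - observed) ** 2)))

def share_within_20pct(forecast, observed):
    ok = np.abs(forecast - observed) <= 0.2 * np.abs(observed)
    return float(ok.mean())

rng = np.random.default_rng(0)
obs_tmax = 25 + 5 * rng.standard_normal(120)             # toy growing-season Tmax, deg C
for name, noise in [("0 days", 0.5), ("3 days", 1.5), ("1 week", 3.0), ("2 weeks", 5.0)]:
    fc = obs_tmax + noise * rng.standard_normal(obs_tmax.size)
    print(f"{name:>7}: RMSE={rmse(fc, obs_tmax):.2f} C, "
          f"within 20%={share_within_20pct(fc, obs_tmax):.0%}")
```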

  19. Impact of Spatial Soil and Climate Input Data Aggregation on Regional Yield Simulations

    PubMed Central

    Hoffmann, Holger; Zhao, Gang; Asseng, Senthold; Bindi, Marco; Biernath, Christian; Constantin, Julie; Coucheney, Elsa; Dechow, Rene; Doro, Luca; Eckersten, Henrik; Gaiser, Thomas; Grosz, Balázs; Heinlein, Florian; Kassie, Belay T.; Kersebaum, Kurt-Christian; Klein, Christian; Kuhnert, Matthias; Lewan, Elisabet; Moriondo, Marco; Nendel, Claas; Priesack, Eckart; Raynal, Helene; Roggero, Pier P.; Rötter, Reimund P.; Siebert, Stefan; Specka, Xenia; Tao, Fulu; Teixeira, Edmar; Trombi, Giacomo; Wallach, Daniel; Weihermüller, Lutz; Yeluripati, Jagadeesh; Ewert, Frank

    2016-01-01

    We show the error in water-limited yields simulated by crop models that is associated with spatially aggregated soil and climate input data. Crop simulations at large scales (regional, national, continental) frequently use input data of low resolution. Therefore, climate and soil data are often generated via averaging and sampling by area majority. This may bias simulated yields at large scales, with the bias varying widely across models. Thus, we evaluated the error associated with spatially aggregated soil and climate data for 14 crop models. Yields of winter wheat and silage maize were simulated under water-limited production conditions. We calculated this error from crop yields simulated at spatial resolutions from 1 to 100 km for the state of North Rhine-Westphalia, Germany. Most models showed yields biased by <15% when aggregating only soil data. The relative mean absolute error (rMAE) of most models using aggregated soil data was within the range of, or larger than, the inter-annual or inter-model variability in yields. This error increased further when both climate and soil data were aggregated. Distinct error patterns indicate that the rMAE may be estimated from a few soil variables. Illustrating the range of these aggregation effects across models, this study is a first step towards an ex-ante assessment of aggregation errors in large-scale simulations. PMID:27055028

  20. Impact of Spatial Soil and Climate Input Data Aggregation on Regional Yield Simulations.

    PubMed

    Hoffmann, Holger; Zhao, Gang; Asseng, Senthold; Bindi, Marco; Biernath, Christian; Constantin, Julie; Coucheney, Elsa; Dechow, Rene; Doro, Luca; Eckersten, Henrik; Gaiser, Thomas; Grosz, Balázs; Heinlein, Florian; Kassie, Belay T; Kersebaum, Kurt-Christian; Klein, Christian; Kuhnert, Matthias; Lewan, Elisabet; Moriondo, Marco; Nendel, Claas; Priesack, Eckart; Raynal, Helene; Roggero, Pier P; Rötter, Reimund P; Siebert, Stefan; Specka, Xenia; Tao, Fulu; Teixeira, Edmar; Trombi, Giacomo; Wallach, Daniel; Weihermüller, Lutz; Yeluripati, Jagadeesh; Ewert, Frank

    2016-01-01

    We show the error in water-limited yields simulated by crop models that is associated with spatially aggregated soil and climate input data. Crop simulations at large scales (regional, national, continental) frequently use input data of low resolution. Therefore, climate and soil data are often generated via averaging and sampling by area majority. This may bias simulated yields at large scales, with the bias varying widely across models. Thus, we evaluated the error associated with spatially aggregated soil and climate data for 14 crop models. Yields of winter wheat and silage maize were simulated under water-limited production conditions. We calculated this error from crop yields simulated at spatial resolutions from 1 to 100 km for the state of North Rhine-Westphalia, Germany. Most models showed yields biased by <15% when aggregating only soil data. The relative mean absolute error (rMAE) of most models using aggregated soil data was within the range of, or larger than, the inter-annual or inter-model variability in yields. This error increased further when both climate and soil data were aggregated. Distinct error patterns indicate that the rMAE may be estimated from a few soil variables. Illustrating the range of these aggregation effects across models, this study is a first step towards an ex-ante assessment of aggregation errors in large-scale simulations.
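    For reference, the relative mean absolute error (rMAE) used in the two records above can be computed as in the hedged sketch below; the yield data and the per-resolution bias and noise levels are toy assumptions, not the ensemble's results.

```python
# Sketch under assumed data (not the crop-model ensemble code): the relative
# mean absolute error (rMAE) of yields simulated with aggregated inputs,
# taken against yields simulated at the finest resolution.
import numpy as np

def rmae(aggregated, reference):
    """Relative mean absolute error of aggregated-input yields vs. reference yields."""
    return float(np.mean(np.abs(aggregated - reference)) / np.mean(reference))

rng = np.random.default_rng(42)
yield_1km = 8.0 + rng.standard_normal(500)             # toy water-limited yields, t/ha
for res_km, extra_bias, extra_noise in [(10, 0.1, 0.2), (25, 0.3, 0.4), (100, 0.6, 0.8)]:
    yield_agg = yield_1km + extra_bias + extra_noise * rng.standard_normal(500)
    print(f"{res_km:>4} km inputs: rMAE = {rmae(yield_agg, yield_1km):.3f}")
```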

  1. Assessment of simulated water balance from Noah, Noah-MP, CLM, and VIC over CONUS using the NLDAS test bed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Xitian; Yang, Zong-Liang; Xia, Youlong

    2014-12-27

    This study assesses the hydrologic performance of four land surface models (LSMs) for the conterminous United States using the North American Land Data Assimilation System (NLDAS) test bed. The four LSMs are the baseline community Noah LSM (Noah, version 2.8), the Variable Infiltration Capacity (VIC, version 4.0.5) model, the substantially augmented Noah LSM with multiparameterization options (hence Noah-MP), and the Community Land Model version 4 (CLM4). All four models are driven by the same NLDAS-2 atmospheric forcing. Modeled terrestrial water storage (TWS), streamflow, evapotranspiration (ET), and soil moisture are compared with each other and evaluated against the identical observations. Relative to Noah, the other three models offer significant improvements in simulating TWS and streamflow and moderate improvements in simulating ET and soil moisture. Noah-MP provides the best performance in simulating soil moisture and is among the best in simulating TWS, CLM4 shows the best performance in simulating ET, and VIC ranks the highest in performing the streamflow simulations. Despite these improvements, CLM4, Noah-MP, and VIC exhibit deficiencies, such as the low variability of soil moisture in CLM4, the fast growth of spring ET in Noah-MP, and the constant overestimation of ET in VIC.

  2. Building Coherent Validation Arguments for the Measurement of Latent Constructs with Unified Statistical Frameworks

    ERIC Educational Resources Information Center

    Rupp, Andre A.

    2012-01-01

    In the focus article of this issue, von Davier, Naemi, and Roberts essentially coupled: (1) a short methodological review of structural similarities of latent variable models with discrete and continuous latent variables; and (2) two short empirical case studies that show how these models can be applied to real, rather than simulated, large-scale…

  3. Historical range of variability in live and dead wood biomass: a regional-scale simulation study

    Treesearch

    Etsuko Nonaka; Thomas A. Spies; Michael C. Wimberly; Janet L. Ohmann

    2007-01-01

    The historical range of variability (HRV) in landscape structure and composition created by natural disturbance can serve as a general guide for evaluating ecological conditions of managed landscapes. HRV approaches to evaluating landscapes have been based on age classes or developmental stages, which may obscure variation in live and dead stand structure. Developing...

  4. Comparative hybrid and digital simulation studies of the behaviour of a wind generator equipped with a static frequency converter

    NASA Astrophysics Data System (ADS)

    Dube, B.; Lefebvre, S.; Perocheau, A.; Nakra, H. L.

    1988-01-01

    This paper describes the comparative results obtained from digital and hybrid simulation studies on a variable speed wind generator interconnected to the utility grid. The wind generator is a vertical-axis Darrieus type coupled to a synchronous machine by a gear-box; the synchronous machine is connected to the AC utility grid through a static frequency converter. Digital simulation results have been obtained using CSMP software; these results are compared with those obtained from a real-time hybrid simulator that in turn uses a part of the IREQ HVDC simulator. The agreement between hybrid and digital simulation results is generally good. The results demonstrate that the digital simulation reproduces the dynamic behavior of the system in a satisfactory manner and thus constitutes a valid tool for the design of the control systems of the wind generator.

  5. Influence of Multidimensionality on Convergence of Sampling in Protein Simulation

    NASA Astrophysics Data System (ADS)

    Metsugi, Shoichi

    2005-06-01

    We study the problem of convergence of sampling in protein simulation originating in the multidimensionality of the protein's conformational space. Since several important physical quantities are given by second moments of dynamical variables, we attempt to determine the simulation time necessary for their sufficient convergence. We perform a molecular dynamics simulation of a protein and the subsequent principal component (PC) analysis as a function of simulation time T. As T increases, PC vectors with smaller amplitudes of variation are identified and their amplitudes equilibrated before vectors with larger amplitudes of variation are identified and equilibrated. This sequential identification and equilibration mechanism makes protein simulation a useful method despite its intrinsically multidimensional nature.
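    The hedged sketch below illustrates the analysis step of the abstract (principal component analysis applied to the first T frames of a trajectory, watching the second moments converge as T grows); the "trajectory" is synthetic toy data, not an actual MD simulation.

```python
# Hedged illustration (toy data, not an MD trajectory): PCA on the first T
# frames of a trajectory; the per-component variances (second moments) are
# printed for increasing T to show their convergence.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_frames, n_dof = 20000, 30
# toy "trajectory": correlated coordinates with a few large-amplitude collective modes
modes = rng.standard_normal((n_dof, n_dof))
traj = rng.standard_normal((n_frames, n_dof)) @ (modes * np.linspace(3, 0.1, n_dof))

for T in (500, 2000, 20000):
    var = PCA(n_components=3).fit(traj[:T]).explained_variance_
    print(f"T={T:>6}: leading PC variances = {np.round(var, 2)}")
```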

  6. Examining solutions to missing data in longitudinal nursing research.

    PubMed

    Roberts, Mary B; Sullivan, Mary C; Winchester, Suzy B

    2017-04-01

    Longitudinal studies are highly valuable in pediatrics because they provide useful data about developmental patterns of child health and behavior over time. When data are missing, the value of the research is impacted. The study's purpose was to (1) introduce a three-step approach to assess and address missing data and (2) illustrate this approach using categorical and continuous-level variables from a longitudinal study of premature infants. A three-step approach with simulations was followed to assess the amount and pattern of missing data and to determine the most appropriate imputation method for the missing data. Patterns of missingness were Missing Completely at Random, Missing at Random, and Not Missing at Random. Missing continuous-level data were imputed using mean replacement, stochastic regression, multiple imputation, and fully conditional specification (FCS). Missing categorical-level data were imputed using last value carried forward, hot-decking, stochastic regression, and FCS. Simulations were used to evaluate these imputation methods under different patterns of missingness at different levels of missing data. The rate of missingness was 16-23% for continuous variables and 1-28% for categorical variables. FCS imputation provided the least difference in mean and standard deviation estimates for continuous measures. FCS imputation was acceptable for categorical measures. Results obtained through simulation reinforced and confirmed these findings. Significant investments are made in the collection of longitudinal data. The prudent handling of missing data can protect these investments and potentially improve the scientific information contained in pediatric longitudinal studies. © 2017 Wiley Periodicals, Inc.
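    As a small, hedged companion to the imputation comparison described above, the sketch below imposes MCAR missingness on simulated data and compares mean replacement with an FCS-style chained-equations imputation (scikit-learn's IterativeImputer); the data, missingness rate, and summary statistic are assumptions, not the study's dataset or code.

```python
# Hedged sketch (simulated data): compare mean replacement with an FCS-style
# chained-equations imputation under ~20% MCAR missingness in one variable.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import SimpleImputer, IterativeImputer

rng = np.random.default_rng(0)
n = 500
x1 = rng.standard_normal(n)
x2 = 0.8 * x1 + 0.6 * rng.standard_normal(n)            # correlated covariate
data = np.column_stack([x1, x2])

missing = data.copy()
mask = rng.random(n) < 0.20                              # MCAR missingness in x2
missing[mask, 1] = np.nan

for name, imputer in [("mean replacement", SimpleImputer(strategy="mean")),
                      ("FCS / chained equations", IterativeImputer(random_state=0))]:
    filled = imputer.fit_transform(missing)
    print(f"{name:>25}: imputed-column SD = {filled[:, 1].std():.3f} "
          f"(complete-data SD = {data[:, 1].std():.3f})")
```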

  7. Effects of Meteorological Variability on the Thermosphere-Ionosphere System during the Moderate Geomagnetic Disturbed January 2013 Period As Simulated By Time-GCM

    NASA Astrophysics Data System (ADS)

    Maute, A. I.; Hagan, M. E.; Richmond, A. D.; Liu, H.; Yudin, V. A.

    2014-12-01

    The ionosphere-thermosphere system is affected by solar and magnetospheric processes and by meteorological variability. Ionospheric observations of total electron content during the current solar cycle have shown that variability associated with meteorological forcing is important during solar minimum, and can have significant ionospheric effects during solar medium to maximum conditions. Numerical models can be used to study the comparative importance of geomagnetic and meteorological forcing. This study focuses on the January 2013 Stratospheric Sudden Warming (SSW) period, which is associated with a very disturbed middle atmosphere as well as with moderately disturbed solar geomagnetic conditions. We employ the NCAR Thermosphere-Ionosphere-Mesosphere-Electrodynamics General Circulation Model (TIME-GCM) with a nudging scheme using Whole-Atmosphere-Community-Climate-Model-Extended (WACCM-X)/Goddard Earth Observing System Model, Version 5 (GEOS5) results to simulate the effects of meteorological and solar wind forcing on the upper atmosphere. The model results are evaluated by comparing with observations, e.g., TEC, NmF2, and ion drifts. We study the effect of the SSW on the wave spectrum and the associated changes in the low-latitude vertical drifts. These changes are compared to the impact of the moderate geomagnetic forcing on the TI system during the January 2013 time period by conducting numerical experiments. We will present select highlights from our study and allude to the comparative importance of the forcing from above and below as simulated by the TIME-GCM.

  8. Remote sensing data with the conditional latin hypercube sampling and geostatistical approach to delineate landscape changes induced by large chronological physical disturbances.

    PubMed

    Lin, Yu-Pin; Chu, Hone-Jay; Wang, Cheng-Long; Yu, Hsiao-Hsuan; Wang, Yung-Chieh

    2009-01-01

    This study applies variogram analyses of normalized difference vegetation index (NDVI) images derived from SPOT HRV images obtained before and after the Chi-Chi earthquake in the Chenyulan watershed, Taiwan, as well as images after four large typhoons, to delineate the spatial patterns, spatial structures and spatial variability of landscapes caused by these large disturbances. The conditional Latin hypercube sampling approach was applied to select samples from multiple NDVI images. Kriging and sequential Gaussian simulation with sufficient samples were then used to generate maps of the NDVI images. The variogram analysis of the NDVI images demonstrates that the spatial patterns of disturbed landscapes were successfully delineated in the study areas. The high-magnitude Chi-Chi earthquake created spatial landscape variations in the study area. After the earthquake, the cumulative impacts of typhoons on landscape patterns depended on the magnitudes and paths of the typhoons, but were not always evident in the spatiotemporal variability of landscapes in the study area. The statistics and spatial structures of multiple NDVI images were captured by 3,000 samples from 62,500 grid cells in the NDVI images. Kriging and sequential Gaussian simulation with the 3,000 samples effectively reproduced the spatial patterns of the NDVI images. Overall, the proposed approach, which integrates the conditional Latin hypercube sampling approach, variogram, kriging and sequential Gaussian simulation in remotely sensed images, efficiently monitors, samples and maps the effects of large chronological disturbances on spatial characteristics of landscape changes, including spatial variability and heterogeneity.
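    The basic building block of the variogram analysis described above is the empirical semivariogram, gamma(h) = 0.5 * mean[(z(x) - z(x+h))^2] binned by separation distance h. The hedged sketch below computes it for a toy set of sampled locations; the coordinates and the NDVI-like variable are invented, and a geostatistical library (e.g., for kriging) would normally be used downstream.

```python
# Minimal empirical-variogram sketch (toy field, not the SPOT HRV data):
# gamma(h) = 0.5 * mean of squared differences, binned by separation distance.
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)              # unique point pairs
    d, sq = d[iu], sq[iu]
    gamma = [sq[(d >= lo) & (d < hi)].mean()
             for lo, hi in zip(bin_edges[:-1], bin_edges[1:])]
    return np.asarray(gamma)

rng = np.random.default_rng(3)
coords = rng.uniform(0, 100, size=(400, 2))             # e.g. 400 cLHS sample locations
values = np.sin(coords[:, 0] / 15) + 0.3 * rng.standard_normal(400)  # toy NDVI-like variable
bins = np.linspace(0, 50, 11)
print(np.round(empirical_variogram(coords, values, bins), 3))
```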

  9. Minimum fuel control of a vehicle with a continuously variable transmission. [control system simulation

    NASA Technical Reports Server (NTRS)

    Burghart, J. H.; Donoghue, J. F.

    1980-01-01

    The design and evaluation of a control system for a sedan with a heat engine and a continuously variable transmission is considered in an effort to minimize fuel consumption and achieve satisfactory dynamic response of vehicle variables as the vehicle is driven over a standard driving cycle. Even though the vehicle system was highly nonlinear, attention was restricted to linear control algorithms which could be easily understood and implemented, as demonstrated by simulation. Simulation results also revealed that the vehicle could exhibit unexpected dynamic behavior which must be taken into account in any control system design.

  10. Multimodel comparison of the ionosphere variability during the 2009 sudden stratosphere warming

    NASA Astrophysics Data System (ADS)

    Pedatella, N. M.; Fang, T.-W.; Jin, H.; Sassi, F.; Schmidt, H.; Chau, J. L.; Siddiqui, T. A.; Goncharenko, L.

    2016-07-01

    A comparison of different model simulations of the ionosphere variability during the 2009 sudden stratosphere warming (SSW) is presented. The focus is on the equatorial and low-latitude ionosphere simulated by the Ground-to-topside model of the Atmosphere and Ionosphere for Aeronomy (GAIA), Whole Atmosphere Model plus Global Ionosphere Plasmasphere (WAM+GIP), and Whole Atmosphere Community Climate Model eXtended version plus Thermosphere-Ionosphere-Mesosphere-Electrodynamics General Circulation Model (WACCMX+TIMEGCM). The simulations are compared with observations of the equatorial vertical plasma drift in the American and Indian longitude sectors, zonal mean F region peak density (NmF2) from the Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) satellites, and ground-based Global Positioning System (GPS) total electron content (TEC) at 75°W. The model simulations all reproduce the observed morning enhancement and afternoon decrease in the vertical plasma drift, as well as the progression of the anomalies toward later local times over the course of several days. However, notable discrepancies among the simulations are seen in terms of the magnitude of the drift perturbations, and rate of the local time shift. Comparison of the electron densities further reveals that although many of the broad features of the ionosphere variability are captured by the simulations, there are significant differences among the different model simulations, as well as between the simulations and observations. Additional simulations are performed where the neutral atmospheres from four different whole atmosphere models (GAIA, HAMMONIA (Hamburg Model of the Neutral and Ionized Atmosphere), WAM, and WACCMX) provide the lower atmospheric forcing in the TIME-GCM. These simulations demonstrate that different neutral atmospheres, in particular, differences in the solar migrating semidiurnal tide, are partly responsible for the differences in the simulated ionosphere variability in GAIA, WAM+GIP, and WACCMX+TIMEGCM.

  11. The effects of workload on respiratory variables in simulated flight: a preliminary study.

    PubMed

    Karavidas, Maria Katsamanis; Lehrer, Paul M; Lu, Shou-En; Vaschillo, Evgeny; Vaschillo, Bronya; Cheng, Andrew

    2010-04-01

    In this pilot study, we investigated respiratory activity and end-tidal carbon dioxide (PetCO2) during exposure to varying levels of workload in a simulated flight environment. Seven pilots (age: 34-60) participated in a one-session test on the Boeing 737-800 simulator. Physiological data were collected while pilots wore an ambulatory multi-channel recording device. Respiratory variables, including inductance plethysmography (respiratory pattern) and pressure of end-tidal carbon dioxide (PetCO2), were collected, demonstrating changes in CO2 levels proportional to changes in flight task workload. Pilots performed a set of simulation flight tasks. Pilot performance was rated for each task by a test pilot, and self-reports of workload were taken using the NASA-TLX scale. Mixed model analysis revealed that respiration rate and minute ventilation are significantly associated with workload levels and evaluator scores, controlling for the "vanilla baseline" condition. Hypocapnia exclusively occurred in tasks where pilots performed more poorly. This study was designed as a preliminary investigation in order to develop a psychophysiological assessment methodology, rather than to offer conclusive findings. The results show that the respiratory system is very reactive to high workload conditions in aviation and suggest that hypocapnia may pose a flight safety risk under some circumstances. Copyright © 2010 Elsevier B.V. All rights reserved.

  12. Checking distributional assumptions for pharmacokinetic summary statistics based on simulations with compartmental models.

    PubMed

    Shen, Meiyu; Russek-Cohen, Estelle; Slud, Eric V

    2016-08-12

    Bioequivalence (BE) studies are an essential part of the evaluation of generic drugs. The most common in vivo BE study design is the two-period two-treatment crossover design. AUC (area under the concentration-time curve) and Cmax (maximum concentration) are obtained from the observed concentration-time profiles for each subject from each treatment under each sequence. In the BE evaluation of pharmacokinetic crossover studies, the normality of the univariate response variable, e.g. log(AUC) or log(Cmax), is often assumed in the literature without much evidence. Therefore, we investigate the distributional assumption of the normality of response variables, log(AUC) and log(Cmax), by simulating concentration-time profiles from two-stage pharmacokinetic models (commonly used in pharmacokinetic research) for a wide range of pharmacokinetic parameters and measurement error structures. Our simulations show that, under reasonable distributional assumptions on the pharmacokinetic parameters, log(AUC) has heavy tails and log(Cmax) is skewed. Sensitivity analyses are conducted to investigate how the distribution of the standardized log(AUC) (or the standardized log(Cmax)) for a large number of simulated subjects deviates from normality if distributions of errors in the pharmacokinetic model for plasma concentrations deviate from normality and if the plasma concentration can be described by different compartmental models.
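    The hedged sketch below reproduces the general simulation idea with a generic one-compartment oral-absorption model (not the paper's two-stage models or parameter values): concentration-time profiles are simulated per subject with lognormal parameters and proportional measurement error, AUC and Cmax are derived, and the shape of log(AUC) and log(Cmax) is summarized.

```python
# Hedged sketch (a generic one-compartment oral model with assumed parameters):
# simulate concentration-time profiles, derive AUC (trapezoidal) and Cmax,
# and summarize the shape of log(AUC) and log(Cmax).
import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(0)
t = np.linspace(0, 48, 97)                                 # h, sampling times
n = 2000
ka = rng.lognormal(mean=np.log(1.0), sigma=0.4, size=n)    # absorption rate, 1/h
ke = rng.lognormal(mean=np.log(0.15), sigma=0.3, size=n)   # elimination rate, 1/h
V = rng.lognormal(mean=np.log(30.0), sigma=0.25, size=n)   # volume of distribution, L
dose = 100.0                                               # mg

conc = (dose / V[:, None]) * (ka / (ka - ke))[:, None] \
       * (np.exp(-ke[:, None] * t) - np.exp(-ka[:, None] * t))
conc *= np.exp(0.1 * rng.standard_normal(conc.shape))      # proportional measurement error

auc = trapezoid(conc, t, axis=1)
cmax = conc.max(axis=1)
for name, x in [("log(AUC)", np.log(auc)), ("log(Cmax)", np.log(cmax))]:
    print(f"{name}: skewness={skew(x):+.2f}, excess kurtosis={kurtosis(x):+.2f}")
```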

  13. Estimating social carrying capacity through computer simulation modeling: an application to Arches National Park, Utah

    Treesearch

    Benjamin Wang; Robert E. Manning; Steven R. Lawson; William A. Valliere

    2001-01-01

    Recent research and management experience has led to several frameworks for defining and managing carrying capacity of national parks and related areas. These frameworks rely on monitoring indicator variables to ensure that standards of quality are maintained. The objective of this study was to develop a computer simulation model to estimate the relationships between...

  14. Simulating historical variability in the amount of old forests in the Oregon Coast Range.

    Treesearch

    M.C. Wimberly; T.M. Spies; C.J. Long; C. Whitlock

    2000-01-01

    We developed the landscape age-class demographics simulator (LADS) to model historical variability in the amount of old-growth and late-successional forest in the Oregon Coast Range over the past 3,000 years. The model simulated temporal and spatial patterns of forest fires along with the resulting fluctuations in the distribution of forest age classes across the...

  15. The role of SST variability in the simulation of the MJO

    NASA Astrophysics Data System (ADS)

    Stan, Cristiana

    2017-12-01

    The sensitivity of the Madden-Julian Oscillation to high-frequency variability (period 1-5 days) of sea surface temperature (SST) is investigated using numerical experiments with the super-parameterized Community Climate System Model. The findings of this study emphasize the importance of air-sea interactions in the simulation of the MJO, and stress the necessity of an accurate representation of ocean variability on short time scales. Eliminating 1-5-day variability of surface boundary forcing reduces the intraseasonal variability (ISV) of the tropics during the boreal winter. The ISV spectrum becomes close to the red noise background spectrum. The variability of atmospheric circulation shifts to longer time scales. In the absence of high-frequency variability of SST the MJO power gets confined to wavenumbers 1-2 and the magnitude of westward power associated with Rossby waves increases. The MJO convective activity propagating eastward from the Indian Ocean does not cross the Maritime Continent, and convection in the western Pacific Ocean is locally generated. In the Indian Ocean convection tends to follow the meridional propagation of SST anomalies. The response of the MJO to 1-5-day variability in the SST is through the charging and discharging mechanisms contributing to the atmospheric column moist static energy before and after peak MJO convection. Horizontal advection and surface fluxes show the largest sensitivity to SST perturbations.

  16. The nature and use of prediction skills in a biological computer simulation

    NASA Astrophysics Data System (ADS)

    Lavoie, Derrick R.; Good, Ron

    The primary goal of this study was to examine the science process skill of prediction using qualitative research methodology. The think-aloud interview, modeled after Ericsson and Simon (1984), led to the identification of 63 program exploration and prediction behaviors. The performance of seven formal and seven concrete operational high-school biology students was videotaped during a three-phase learning sequence on water pollution. Subjects explored the effects of five independent variables on two dependent variables over time using a computer-simulation program. Predictions were made concerning the effect of the independent variables upon the dependent variables over time. Subjects were identified according to initial knowledge of the subject matter and success at solving three selected prediction problems. Successful predictors generally had high initial knowledge of the subject matter and were formal operational. Unsuccessful predictors generally had low initial knowledge and were concrete operational. High initial knowledge seemed to be more important to predictive success than stage of Piagetian cognitive development. Successful prediction behaviors involved systematic manipulation of the independent variables, note taking, identification and use of appropriate independent-dependent variable relationships, high interest and motivation, and in general, higher-level thinking skills. Behaviors characteristic of unsuccessful predictors were nonsystematic manipulation of independent variables, lack of motivation and persistence, misconceptions, and the identification and use of inappropriate independent-dependent variable relationships.

  17. A laboratory investigation of the variability of cloud reflected radiance fields

    NASA Technical Reports Server (NTRS)

    Mckee, T. B.; Cox, S. K.

    1986-01-01

    A method to determine the radiative properties of complex cloud fields was developed. A cloud field optical simulator (CFOS) was constructed to simulate the interaction of cloud fields with visible radiation. The CFOS was verified by comparing experimental results from it with calculations performed with a Monte Carlo radiative transfer model. A software library was developed to process, reduce, and display CFOS data. The CFOS was utilized to study the reflected radiance patterns from simulated cloud fields.

  18. Stochastic Time Models of Syllable Structure

    PubMed Central

    Shaw, Jason A.; Gafos, Adamantios I.

    2015-01-01

    Drawing on phonology research within the generative linguistics tradition, stochastic methods, and notions from complex systems, we develop a modelling paradigm linking phonological structure, expressed in terms of syllables, to speech movement data acquired with 3D electromagnetic articulography and X-ray microbeam methods. The essential variable in the models is syllable structure. When mapped to discrete coordination topologies, syllabic organization imposes systematic patterns of variability on the temporal dynamics of speech articulation. We simulated these dynamics under different syllabic parses and evaluated simulations against experimental data from Arabic and English, two languages claimed to parse similar strings of segments into different syllabic structures. Model simulations replicated several key experimental results, including the fallibility of past phonetic heuristics for syllable structure, and exposed the range of conditions under which such heuristics remain valid. More importantly, the modelling approach consistently diagnosed syllable structure proving resilient to multiple sources of variability in experimental data including measurement variability, speaker variability, and contextual variability. Prospects for extensions of our modelling paradigm to acoustic data are also discussed. PMID:25996153

  19. Preliminary report of the Hepatic Encephalopathy Assessment Driving Simulator (HEADS) score.

    PubMed

    Baskin-Bey, Edwina S; Stewart, Charmaine A; Mitchell, Mary M; Bida, John P; Rosenthal, Theodore J; Nyberg, Scott L

    2008-01-01

    Audiovisual simulations of real-life driving (ie, driving simulators) have been used to assess neurologic dysfunction in a variety of medical applications. However, the use of simulated driving to assess neurologic impairment in the setting of liver disease (ie, hepatic encephalopathy) is limited. The aim of this analysis was to develop a scoring system based on simulated driving performance to assess mild cognitive impairment in cirrhotic patients with hepatic encephalopathy. This preliminary analysis was conducted as part of the Hepatic Encephalopathy Assessment Driving Simulator (HEADS) pilot study. Cirrhotic volunteers initially underwent a battery of neuropsychological tests to identify those cirrhotic patients with mild cognitive impairment. Performance during an audiovisually simulated course of on-road driving was then compared between mildly impaired cirrhotic patients and healthy volunteers. A scoring system was developed to quantify the likelihood of cognitive impairment on the basis of data from the simulated on-road driving. Mildly impaired cirrhotic patients performed below the level of healthy volunteers on the driving simulator. Univariate logistic regression and correlation models indicated that several driving simulator variables were significant predictors of cognitive impairment. Five variables (run time, total map performance, number of collisions, visual divided attention response, and average lane position) were incorporated into a quantitative model, the HEADS scoring system. The HEADS score (0-9 points) showed a strong correlation with cognitive impairment as measured by the area under the receiver operating characteristic curve (0.89). The HEADS system appears to be a promising new tool for the assessment of mild hepatic encephalopathy.
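    The toy sketch below shows, in outline only, how a logistic model over a handful of driving-simulator variables can be turned into a discriminative score evaluated by AUC; the data are simulated and the weights are invented, not the actual HEADS variables' coefficients or point values.

```python
# Illustrative only (simulated data; not the actual HEADS weights): fit a
# logistic model on five driving-simulator variables and report the area
# under the ROC curve of the resulting risk score.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 200
# columns: run time, map performance, collisions, divided-attention response, lane position
X = rng.standard_normal((n, 5))
impaired = (X @ np.array([0.8, 0.6, 1.0, 0.7, 0.5]) + rng.standard_normal(n) > 0).astype(int)

model = LogisticRegression().fit(X, impaired)
risk = model.predict_proba(X)[:, 1]
print("in-sample AUC: %.2f" % roc_auc_score(impaired, risk))   # cf. the 0.89 reported for HEADS
```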

  20. Validation of the mean radiant temperature simulated by the RayMan software in urban environments.

    PubMed

    Lee, Hyunjung; Mayer, Helmut

    2016-11-01

    The RayMan software is worldwide applied in investigations on different issues in human-biometeorology. However, only the simulated mean radiant temperature (Tmrt) has been validated so far in a few case studies. They are based on Tmrt values, which were experimentally determined in urban environments by use of a globe thermometer or applying the six-directional method. This study analyses previous Tmrt validations in a comparative manner. Their results are extended by a recent validation of Tmrt in an urban micro-environment in Freiburg (southwest Germany), which can be regarded as relatively heterogeneous due to different shading intensities by tree crowns. In addition, a validation of the physiologically equivalent temperature (PET) simulated by RayMan is conducted for the first time. The validations are based on experimentally determined Tmrt and PET values, which were calculated from measured meteorological variables in the daytime of a clear-sky summer day. In total, the validation results show that RayMan is capable of simulating Tmrt satisfactorily under relatively homogeneous site conditions. However, the inaccuracy of simulated Tmrt is increasing with lower sun elevation and growing heterogeneity of the simulation site. As Tmrt represents the meteorological variable that mostly governs PET in the daytime of clear-sky summer days, the accuracy of simulated Tmrt is mainly responsible for the accuracy of simulated PET. The Tmrt validations result in some recommendations, which concern an update of physical principles applied in the RayMan software to simulate the short- and long-wave radiant flux densities, especially from vertical building walls and tree crowns.

  1. On the upscaling of process-based models in deltaic applications

    NASA Astrophysics Data System (ADS)

    Li, L.; Storms, J. E. A.; Walstra, D. J. R.

    2018-03-01

    Process-based numerical models are increasingly used to study the evolution of marine and terrestrial depositional environments. Whilst a detailed description of small-scale processes provides an accurate representation of reality, application on geological timescales is restrained by the associated increase in computational time. In order to reduce the computational time, a number of acceleration methods are combined and evaluated for a schematic supply-driven delta (static base level) and an accommodation-driven delta (variable base level). The performance of the combined acceleration methods is evaluated by comparing the morphological indicators such as distributary channel networking and delta volumes derived from the model predictions for various levels of acceleration. The results of the accelerated models are compared to the outcomes from a series of simulations to capture autogenic variability. Autogenic variability is quantified by re-running identical models on an initial bathymetry with 1 cm added noise. The overall results show that the variability of the accelerated models fall within the autogenic variability range, suggesting that the application of acceleration methods does not significantly affect the simulated delta evolution. The Time-scale compression method (the acceleration method introduced in this paper) results in an increased computational efficiency of 75% without adversely affecting the simulated delta evolution compared to a base case. The combination of the Time-scale compression method with the existing acceleration methods has the potential to extend the application range of process-based models towards geologic timescales.

  2. Global MHD Simulations of the Earth's Bow Shock Shape and Motion Under Variable Solar Wind Conditions

    NASA Astrophysics Data System (ADS)

    Mejnertsen, L.; Eastwood, J. P.; Hietala, H.; Schwartz, S. J.; Chittenden, J. P.

    2018-01-01

    Empirical models of the Earth's bow shock are often used to place in situ measurements in context and to understand the global behavior of the foreshock/bow shock system. They are derived statistically from spacecraft bow shock crossings and typically treat the shock surface as a conic section parameterized according to a uniform solar wind ram pressure, although more complex models exist. Here a global magnetohydrodynamic simulation is used to analyze the variability of the Earth's bow shock under real solar wind conditions. The shape and location of the bow shock is found as a function of time, and this is used to calculate the shock velocity over the shock surface. The results are compared to existing empirical models. Good agreement is found in the variability of the subsolar shock location. However, empirical models fail to reproduce the two-dimensional shape of the shock in the simulation. This is because significant solar wind variability occurs on timescales less than the transit time of a single solar wind phase front over the curved shock surface. Empirical models must therefore be used with care when interpreting spacecraft data, especially when observations are made far from the Sun-Earth line. Further analysis reveals a bias to higher shock speeds when measured by virtual spacecraft. This is attributed to the fact that the spacecraft only observes the shock when it is in motion. This must be accounted for when studying bow shock motion and variability with spacecraft data.

  3. Synchronous parallel system for emulation and discrete event simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    1992-01-01

    A synchronous parallel system for emulation and discrete event simulation having parallel nodes responds to received messages at each node by generating event objects having individual time stamps, stores only the changes to state variables of the simulation object attributable to the event object, and produces corresponding messages. The system refrains from transmitting the messages and changing the state variables while it determines whether the changes are superseded, and then stores the unchanged state variables in the event object for later restoral to the simulation object if called for. This determination preferably includes sensing the time stamp of each new event object and determining which new event object has the earliest time stamp as the local event horizon, determining the earliest local event horizon of the nodes as the global event horizon, and ignoring the events whose time stamps are less than the global event horizon. Host processing between the system and external terminals enables such a terminal to query, monitor, command or participate with a simulation object during the simulation process.

  4. Synchronous Parallel System for Emulation and Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    2001-01-01

    A synchronous parallel system for emulation and discrete event simulation having parallel nodes responds to received messages at each node by generating event objects having individual time stamps, stores only the changes to the state variables of the simulation object attributable to the event object and produces corresponding messages. The system refrains from transmitting the messages and changing the state variables while it determines whether the changes are superseded, and then stores the unchanged state variables in the event object for later restoral to the simulation object if called for. This determination preferably includes sensing the time stamp of each new event object and determining which new event object has the earliest time stamp as the local event horizon, determining the earliest local event horizon of the nodes as the global event horizon, and ignoring events whose time stamps are less than the global event horizon. Host processing between the system and external terminals enables such a terminal to query, monitor, command or participate with a simulation object during the simulation process.
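    The hedged toy sketch below illustrates only the event-horizon bookkeeping described in the two patent records above (per-node time-stamped event queues, a local event horizon per node, and a global event horizon as the minimum over nodes); it is not the patented system, and the node names and event payloads are invented.

```python
# Hedged toy sketch (not the patented system): time-stamped event queues per
# node, local event horizons, and the global event horizon across nodes.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Event:
    timestamp: float
    payload: dict = field(compare=False, default_factory=dict)   # e.g. saved state changes

nodes = {
    "node0": [Event(5.0), Event(2.5), Event(9.0)],
    "node1": [Event(4.0), Event(7.5)],
}

queues = {name: events[:] for name, events in nodes.items()}
for q in queues.values():
    heapq.heapify(q)                                  # earliest time stamp first

local_horizons = {name: q[0].timestamp for name, q in queues.items()}
global_horizon = min(local_horizons.values())
print("local horizons:", local_horizons)              # {'node0': 2.5, 'node1': 4.0}
print("global event horizon:", global_horizon)
# State-variable changes stored in each Event would allow later restoral to the
# simulation object if an event turns out to be superseded.
```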

  5. Searching for the right scale in catchment hydrology: the effect of soil spatial variability in simulated states and fluxes

    NASA Astrophysics Data System (ADS)

    Baroni, Gabriele; Zink, Matthias; Kumar, Rohini; Samaniego, Luis; Attinger, Sabine

    2017-04-01

    The advances in computer science and the availability of new detailed data sets have led to a growing number of distributed hydrological models applied to finer and finer grid resolutions for larger and larger catchment areas. It has been argued, however, that this trend does not necessarily guarantee a better understanding of the hydrological processes, nor is it necessarily required for specific modelling applications. In the present study, this topic is further discussed in relation to soil spatial heterogeneity and its effect on simulated hydrological states and fluxes. To this end, three methods are developed and used for the characterization of soil heterogeneity at different spatial scales. The methods are applied to the soil map of the upper Neckar catchment (Germany) as an example. The different soil realizations are assessed regarding their impact on simulated states and fluxes using the distributed hydrological model mHM. The results are analysed by aggregating the model outputs at different spatial scales based on the Representative Elementary Scale (RES) concept proposed by Refsgaard et al. (2016). The analysis is further extended in the present study by aggregating the model output also at different temporal scales. The results show that small-scale soil variability is not relevant when integrated hydrological responses are considered, e.g., simulated streamflow or average soil moisture over sub-catchments. On the contrary, this small-scale variability strongly affects locally simulated states and fluxes, i.e., soil moisture and evapotranspiration simulated at the grid resolution. A clear trade-off is also detected by aggregating the model output at spatial and temporal scales. Although the scale at which soil variability is (or is not) relevant is not universal, the RES concept provides a simple and effective framework to quantify the predictive capability of distributed models and to identify the need for further model improvements, e.g., finer-resolution input. For this reason, the integration into this analysis of all the relevant input factors (e.g., precipitation, vegetation, geology) could provide strong support for the definition of the right scale for each specific model application. In this context, however, the main challenge for a proper model assessment will be the correct characterization of the spatio-temporal variability of each input factor. Refsgaard, J.C., Højberg, A.L., He, X., Hansen, A.L., Rasmussen, S.H., Stisen, S., 2016. Where are the limits of model predictive capabilities?: Representative Elementary Scale - RES. Hydrol. Process. doi:10.1002/hyp.11029

  6. Multivariate bias adjustment of high-dimensional climate simulations: the Rank Resampling for Distributions and Dependences (R2D2) bias correction

    NASA Astrophysics Data System (ADS)

    Vrac, Mathieu

    2018-06-01

    Climate simulations often suffer from statistical biases with respect to observations or reanalyses. It is therefore common to correct (or adjust) those simulations before using them as inputs into impact models. However, most bias correction (BC) methods are univariate and so do not account for the statistical dependences linking the different locations and/or physical variables of interest. In addition, they are often deterministic, and stochasticity is frequently needed to investigate climate uncertainty and to add constrained randomness to climate simulations that do not possess a realistic variability. This study presents a multivariate method of rank resampling for distributions and dependences (R2D2) bias correction allowing one to adjust not only the univariate distributions but also their inter-variable and inter-site dependence structures. Moreover, the proposed R2D2 method provides some stochasticity since it can generate as many multivariate corrected outputs as the number of statistical dimensions (i.e., number of grid cells × number of climate variables) of the simulations to be corrected. It is based on an assumption of stability in time of the dependence structure - making it possible to deal with a high number of statistical dimensions - that lets the climate model drive the temporal properties and their changes in time. R2D2 is applied to temperature and precipitation reanalysis time series with respect to high-resolution reference data over the southeast of France (1506 grid cells). Bivariate, 1506-dimensional and 3012-dimensional versions of R2D2 are tested over a historical period and compared to a univariate BC. How the different BC methods behave in a climate change context is also illustrated with an application to regional climate simulations over the 2071-2100 period. The results indicate that the 1d-BC basically reproduces the climate model multivariate properties, 2d-R2D2 is only satisfying in the inter-variable context, 1506d-R2D2 strongly improves inter-site properties and 3012d-R2D2 is able to account for both. Applications of the proposed R2D2 method to various climate datasets are relevant for many impact studies. The perspectives of improvements are numerous, such as introducing stochasticity in the dependence itself, questioning its stability assumption, and accounting for temporal properties adjustment while including more physics in the adjustment procedures.
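    The hedged sketch below demonstrates only the core rank-resampling idea (a Schaake-shuffle-like reordering), not the full R2D2 algorithm: each simulated variable is given the temporal rank sequence of the reference, which restores the reference's inter-variable rank dependence while keeping the simulated marginal distributions; a univariate BC would separately adjust the marginals. The toy variables and distributions are assumptions.

```python
# Hedged sketch of rank resampling (Schaake-shuffle-like reordering), not the
# full R2D2 method: reorder each simulated variable by the reference's temporal
# ranks so that the reference rank dependence is restored.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
temp_ref = 15 + 8 * rng.standard_normal(n)
precip_ref = np.exp(0.1 * temp_ref + 0.5 * rng.standard_normal(n))   # dependent on temperature
ref = np.column_stack([temp_ref, precip_ref])
sim = np.column_stack([18 + 6 * rng.standard_normal(n),              # biased, independent model
                       rng.gamma(1.5, 3.0, n)])

shuffled = np.empty_like(sim)
for j in range(sim.shape[1]):
    ref_ranks = np.argsort(np.argsort(ref[:, j]))    # temporal rank of each reference value
    shuffled[:, j] = np.sort(sim[:, j])[ref_ranks]   # simulated values, reference rank order

def spearman(a, b):
    ra = np.argsort(np.argsort(a)); rb = np.argsort(np.argsort(b))
    return np.corrcoef(ra, rb)[0, 1]

print("rank correlation  ref: %.2f  raw sim: %.2f  shuffled: %.2f"
      % (spearman(ref[:, 0], ref[:, 1]),
         spearman(sim[:, 0], sim[:, 1]),
         spearman(shuffled[:, 0], shuffled[:, 1])))
```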

  7. Variability in the Use of Simulation for Procedural Training in Radiology Residency: Opportunities for Improvement.

    PubMed

    Matalon, Shanna A; Chikarmane, Sona A; Yeh, Eren D; Smith, Stacy E; Mayo-Smith, William W; Giess, Catherine S

    2018-03-19

    Increased attention to quality and safety has led to a re-evaluation of the classic apprenticeship model for procedural training. Many have proposed simulation as a supplementary teaching tool. The purpose of this study was to assess radiology resident exposure to procedural training and procedural simulation. An IRB-exempt online survey was distributed to current radiology residents in the United States by e-mail. Survey results were summarized using frequency and percentages. Chi-square tests were used for statistical analysis where appropriate. A total of 353 current residents completed the survey. 37% (n = 129/353) of respondents had never used procedure simulation. Of the residents who had used simulation, most did not do so until after having already performed procedures on patients (59%, n = 132/223). The presence of a dedicated simulation center was reported by over half of residents (56%, n = 196/353) and was associated with prior simulation experience (P = 0.007). Residents who had not had procedural simulation were somewhat likely or highly likely (3 and 4 on a 4-point Likert-scale) to participate if it were available (81%, n = 104/129). Simulation training was associated with higher comfort levels in performing procedures (P < 0.001). Although procedural simulation training is associated with higher comfort levels when performing procedures, there is variable use in radiology resident training and its use is not currently optimized. Given the increased emphasis on patient safety, these results suggest the need to increase procedural simulation use during residency, including an earlier introduction to simulation before patient exposure. Copyright © 2018 Elsevier Inc. All rights reserved.

  8. Inter-model Diversity of ENSO simulation and its relation to basic states

    NASA Astrophysics Data System (ADS)

    Kug, J. S.; Ham, Y. G.

    2016-12-01

    In this study, a new methodology is developed to improve the climate simulation of state-of-the-art coupled global climate models (GCMs) by postprocessing based on intermodel diversity. Based on the close connection between the interannual variability and climatological states, a distinctive relation between the intermodel diversity of the interannual variability and that of the basic state is found. Based on this relation, the simulated interannual variabilities can be improved by correcting their climatological bias. To test this methodology, the dominant intermodel difference in precipitation responses during El Niño-Southern Oscillation (ENSO) is investigated, together with its relationship with the climatological state. It is found that the dominant intermodel diversity of the ENSO precipitation in phase 5 of the Coupled Model Intercomparison Project (CMIP5) is associated with the zonal shift of the positive precipitation center during El Niño. This dominant intermodel difference is significantly correlated with the basic states. The models with wetter (drier) climatology than the climatology of the multimodel ensemble (MME) over the central Pacific tend to shift positive ENSO precipitation anomalies to the east (west). Based on the models' systematic errors in the atmospheric ENSO response and bias, the models with a better climatological state tend to simulate more realistic atmospheric ENSO responses. Therefore, the statistical method to correct the ENSO response mostly improves the ENSO response. After the statistical correction, the quality of the simulated MME ENSO precipitation is distinctly improved. These results suggest that the present methodology can also be applied to improving climate projection and seasonal climate prediction.

  9. Evaluation of terrestrial carbon cycle models with atmospheric CO2 measurements: Results from transient simulations considering increasing CO2, climate, and land-use effects

    USGS Publications Warehouse

    Dargaville, R.J.; Heimann, Martin; McGuire, A.D.; Prentice, I.C.; Kicklighter, D.W.; Joos, F.; Clein, Joy S.; Esser, G.; Foley, J.; Kaplan, J.; Meier, R.A.; Melillo, J.M.; Moore, B.; Ramankutty, N.; Reichenau, T.; Schloss, A.; Sitch, S.; Tian, H.; Williams, L.J.; Wittenberg, U.

    2002-01-01

    An atmospheric transport model and observations of atmospheric CO2 are used to evaluate the performance of four Terrestrial Carbon Models (TCMs) in simulating the seasonal dynamics and interannual variability of atmospheric CO2 between 1980 and 1991. The TCMs were forced with time-varying atmospheric CO2 concentrations, climate, and land use to simulate the net exchange of carbon between the terrestrial biosphere and the atmosphere. The monthly surface CO2 fluxes from the TCMs were used to drive the Model of Atmospheric Transport and Chemistry, and the simulated seasonal cycles and concentration anomalies are compared with observations from several stations in the CMDL network. The TCMs underestimate the amplitude of the seasonal cycle and tend to simulate the onset of springtime CO2 uptake approximately one to two months too early. The model fluxes show an increase in amplitude as a result of land-use change, but that pattern is not so evident in the simulated atmospheric amplitudes, and the different models suggest different causes for the amplitude increase (i.e., CO2 fertilization, climate variability, or land-use change). The comparison of the modeled concentration anomalies with the observed anomalies indicates that either the TCMs underestimate interannual variability in the exchange of CO2 between the terrestrial biosphere and the atmosphere, or that variability in the ocean fluxes or in atmospheric transport may be a key factor in the atmospheric interannual variability.

  10. Factoring vs linear modeling in rate estimation: a simulation study of relative accuracy.

    PubMed

    Maldonado, G; Greenland, S

    1998-07-01

    A common strategy for modeling dose-response in epidemiology is to transform ordered exposures and covariates into sets of dichotomous indicator variables (that is, to factor the variables). Factoring tends to increase estimation variance, but it also tends to decrease bias and thus may increase or decrease total accuracy. We conducted a simulation study to examine the impact of factoring on the accuracy of rate estimation. Factored and unfactored Poisson regression models were fit to follow-up study datasets that were randomly generated from 37,500 population model forms that ranged from subadditive to supramultiplicative. In the situations we examined, factoring sometimes substantially improved accuracy relative to fitting the corresponding unfactored model, sometimes substantially decreased accuracy, and sometimes made little difference. The difference in accuracy between factored and unfactored models depended in a complicated fashion on the difference between the true and fitted model forms, the strength of exposure and covariate effects in the population, and the study size. It may be difficult in practice to predict when factoring is increasing or decreasing accuracy. We recommend, therefore, that the strategy of factoring variables be supplemented with other strategies for modeling dose-response.
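    A hedged sketch (not the authors' simulation code) of the two modeling strategies being compared, fitting a factored and an unfactored Poisson rate model with statsmodels on made-up follow-up data:

```python
# Hedged sketch: factored vs. unfactored (linear trend) Poisson rate models.
# The follow-up data below are invented for illustration only.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "dose": [0, 1, 2, 3] * 50,                                   # ordered exposure levels
    "persontime": np.random.default_rng(1).uniform(50, 150, 200),
})
true_rate = 0.01 * np.exp(0.4 * df["dose"])
df["cases"] = np.random.default_rng(2).poisson(true_rate * df["persontime"])

offset = np.log(df["persontime"])
linear = smf.glm("cases ~ dose", data=df,
                 family=sm.families.Poisson(), offset=offset).fit()
factored = smf.glm("cases ~ C(dose)", data=df,                   # indicator variable per level
                   family=sm.families.Poisson(), offset=offset).fit()

print(linear.params, factored.params, sep="\n")
```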

  11. A simulation-based study on different control strategies for variable speed pump in distributed ground source heat pump systems

    DOE PAGES

    Liu, Xiaobing; Zheng, O'Neill; Niu, Fuxin

    2016-01-01

    Most commercial ground source heat pump (GSHP) systems in the United States are in a distributed configuration. These systems circulate water or an anti-freeze solution through multiple heat pump units via a central pumping system, which usually uses variable speed pump(s). Variable speed pumps have the potential to significantly reduce pumping energy use; however, the energy savings achieved in practice can fall far short of this potential due to improper pumping system design and controls. In this paper, a simplified hydronic pumping system was simulated with dynamic Modelica models to evaluate three different pumping control strategies. These include two conventional control strategies, which maintain a constant differential pressure either across the supply and return mains or across the most hydraulically remote heat pump, and an innovative control strategy, which adjusts the system flow rate based on the demand of each heat pump. The simulation results indicate that a significant overflow occurs at part load conditions when the variable speed pump is controlled to maintain a constant differential pressure across the supply and return mains of the piping system. On the other hand, an underflow occurs at part load conditions when the variable speed pump is controlled to maintain a constant differential pressure across the furthest heat pump. The flow-demand-based control can provide the needed flow rate to each heat pump at any given time, with less pumping energy use than the two conventional controls. Finally, a typical distributed GSHP system was studied to evaluate the energy saving potential of applying the flow-demand-based pumping control strategy. This case study shows that the annual pumping energy consumption can be reduced by 62% using the flow-demand-based control compared with the conventional pressure-based control that maintains a constant differential pressure across the supply and return mains.

  12. Evaluation of graphic cardiovascular display in a high-fidelity simulator.

    PubMed

    Agutter, James; Drews, Frank; Syroid, Noah; Westneskow, Dwayne; Albert, Rob; Strayer, David; Bermudez, Julio; Weinger, Matthew B

    2003-11-01

    "Human error" in anesthesia can be attributed to misleading information from patient monitors or to the physician's failure to recognize a pattern. A graphic representation of monitored data may provide better support for detection, diagnosis, and treatment. We designed a graphic display to show hemodynamic variables. Twenty anesthesiologists were asked to assume care of a simulated patient. Half the participants used the graphic cardiovascular display; the other half used a Datex As/3 monitor. One scenario was a total hip replacement with a transfusion reaction to mismatched blood. The second scenario was a radical prostatectomy with 1.5 L of blood loss and myocardial ischemia. Subjects who used the graphic display detected myocardial ischemia 2 min sooner than those who did not use the display. Treatment was initiated sooner (2.5 versus 4.9 min). There were no significant differences between groups in the hip replacement scenario. Systolic blood pressure deviated less from baseline, central venous pressure was closer to its baseline, and arterial oxygen saturation was higher at the end of the case when the graphic display was used. The study lends some support for the hypothesis that providing clinical information graphically in a display designed with emergent features and functional relationships can improve clinicians' ability to detect, diagnose, manage, and treat critical cardiovascular events in a simulated environment. A graphic representation of monitored data may provide better support for detection, diagnosis, and treatment. A user-centered design process led to a novel object-oriented graphic display of hemodynamic variables containing emergent features and functional relationships. In a simulated environment, this display appeared to support clinicians' ability to diagnose, manage, and treat a critical cardiovascular event in a simulated environment. We designed a graphic display to show hemodynamic variables. The study provides some support for the hypothesis that providing clinical information graphically in a display designed with emergent features and functional relationships can improve clinicians' ability to detect, diagnosis, mange, and treat critical cardiovascular events in a simulated environment.

  13. An assessment of the ability of Bartlett-Lewis type of rainfall models to reproduce drought statistics

    NASA Astrophysics Data System (ADS)

    Pham, M. T.; Vanhaute, W. J.; Vandenberghe, S.; De Baets, B.; Verhoest, N. E. C.

    2013-12-01

    Of all natural disasters, the economic and environmental consequences of droughts are among the highest because of their longevity and widespread spatial extent. Because of their extreme behaviour, studying droughts generally requires long time series of historical climate data. Rainfall is a very important variable for calculating drought statistics, for quantifying historical droughts or for assessing the impact on other hydrological (e.g. water stage in rivers) or agricultural (e.g. irrigation requirements) variables. Unfortunately, time series of historical observations are often too short for such assessments. To circumvent this, one may rely on the synthetic rainfall time series from stochastic point process rainfall models, such as Bartlett-Lewis models. The present study investigates whether drought statistics are preserved when simulating rainfall with Bartlett-Lewis models. Therefore, a 105 yr 10 min rainfall time series obtained at Uccle, Belgium is used as a test case. First, drought events were identified on the basis of the Effective Drought Index (EDI), and each event was characterized by two variables, i.e. drought duration (D) and drought severity (S). As both parameters are interdependent, a multivariate distribution function, which makes use of a copula, was fitted. Based on the copula, four types of drought return periods are calculated for observed as well as simulated droughts and are used to evaluate the ability of the rainfall models to simulate drought events with the appropriate characteristics. Overall, all Bartlett-Lewis model types studied fail to preserve extreme drought statistics, which is attributed to the model structure and to the model stationarity caused by maintaining the same parameter set during the whole simulation period.
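    For reference, two of the standard copula-based joint return periods are written out below; the record mentions four types without listing them, so these common "OR" and "AND" definitions are given only as plausible examples.

```latex
% Standard copula-based joint return periods (assumed forms; the record does not
% list its four definitions). \mu is the mean inter-arrival time of drought
% events, F_D and F_S the marginal CDFs of duration and severity, C the copula.
\begin{align}
  T^{\mathrm{or}}(d,s)  &= \frac{\mu}{P(D > d \ \text{or}\ S > s)}
                         = \frac{\mu}{1 - C\bigl(F_D(d),\,F_S(s)\bigr)},\\[4pt]
  T^{\mathrm{and}}(d,s) &= \frac{\mu}{P(D > d,\ S > s)}
                         = \frac{\mu}{1 - F_D(d) - F_S(s) + C\bigl(F_D(d),\,F_S(s)\bigr)}.
\end{align}
```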

  14. A copula-based assessment of Bartlett-Lewis type of rainfall models for preserving drought statistics

    NASA Astrophysics Data System (ADS)

    Pham, M. T.; Vanhaute, W. J.; Vandenberghe, S.; De Baets, B.; Verhoest, N. E. C.

    2013-06-01

    Of all natural disasters, the economic and environmental consequences of droughts are among the highest because of their longevity and widespread spatial extent. Because of their extreme behaviour, studying droughts generally requires long time series of historical climate data. Rainfall is a very important variable for calculating drought statistics, for quantifying historical droughts or for assessing the impact on other hydrological (e.g. water stage in rivers) or agricultural (e.g. irrigation requirements) variables. Unfortunately, time series of historical observations are often too short for such assessments. To circumvent this, one may rely on the synthetic rainfall time series from stochastic point process rainfall models, such as Bartlett-Lewis models. The present study investigates whether drought statistics are preserved when simulating rainfall with Bartlett-Lewis models. Therefore, a 105 yr 10 min rainfall time series obtained at Uccle, Belgium is used as a test case. First, drought events were identified on the basis of the Effective Drought Index (EDI), and each event was characterized by two variables, i.e. drought duration (D) and drought severity (S). As both parameters are interdependent, a multivariate distribution function, which makes use of a copula, was fitted. Based on the copula, four types of drought return periods are calculated for observed as well as simulated droughts and are used to evaluate the ability of the rainfall models to simulate drought events with the appropriate characteristics. Overall, all Bartlett-Lewis type models studied fail to preserve extreme drought statistics, which is attributed to the model structure and to the model stationarity caused by maintaining the same parameter set during the whole simulation period.

  15. Application of all relevant feature selection for failure analysis of parameter-induced simulation crashes in climate models

    NASA Astrophysics Data System (ADS)

    Paja, W.; Wrzesień, M.; Niemiec, R.; Rudnicki, W. R.

    2015-07-01

    Climate models are extremely complex pieces of software. Although they reflect the best available knowledge of the physical components of the climate, they contain several parameters that are only weakly constrained by observations and can potentially lead to simulation crashes. A recent study by Lucas et al. (2013) showed that machine learning methods can be used to predict which combinations of parameters can lead to a simulation crash, and hence which processes described by these parameters need refined analysis. In the current study we reanalyse the dataset used in that research with a different methodology. We confirm the main conclusion of the original study concerning the suitability of machine learning for predicting crashes. We show that only three of the eight parameters indicated as relevant in the original study are indeed strongly relevant, three others are relevant but redundant, and two are not relevant at all. We also show that the variance due to the split of data between training and validation sets strongly influences both the accuracy of predictions and the relative importance of variables; hence, only a cross-validated approach can deliver robust estimates of performance and variable relevance.
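    A hedged sketch of the cross-validation point made above, using a random forest in place of the paper's all-relevant (Boruta-like) selection; the crash data are random placeholders, and the spread of importances across folds illustrates the split-dependence the record describes.

```python
# Hedged sketch: cross-validated feature relevance with a random forest.
# The 8 "parameters" and the crash labels below are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold

rng = np.random.default_rng(0)
X = rng.uniform(size=(540, 8))                               # 8 model parameters
y = (X[:, 0] + X[:, 1] - X[:, 2] + 0.3 * rng.normal(size=540) > 0.6).astype(int)

importances = []
for train, test in StratifiedKFold(n_splits=5, shuffle=True, random_state=0).split(X, y):
    clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X[train], y[train])
    importances.append(clf.feature_importances_)

importances = np.array(importances)
print("mean importance per parameter:", importances.mean(axis=0).round(3))
print("spread across folds (std):    ", importances.std(axis=0).round(3))
```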

  16. Multivariate normal maximum likelihood with both ordinal and continuous variables, and data missing at random.

    PubMed

    Pritikin, Joshua N; Brick, Timothy R; Neale, Michael C

    2018-04-01

    A novel method for the maximum likelihood estimation of structural equation models (SEM) with both ordinal and continuous indicators is introduced using a flexible multivariate probit model for the ordinal indicators. A full information approach ensures unbiased estimates for data missing at random. Exceeding the capability of prior methods, up to 13 ordinal variables can be included before integration time increases beyond 1 s per row. The method relies on the axiom of conditional probability to split apart the distribution of continuous and ordinal variables. Due to the symmetry of the axiom, two similar methods are available. A simulation study provides evidence that the two similar approaches offer equal accuracy. A further simulation is used to develop a heuristic to automatically select the most computationally efficient approach. Joint ordinal continuous SEM is implemented in OpenMx, free and open-source software.
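    A minimal sketch, not the OpenMx implementation, of the conditional-probability split described above: the likelihood of one row with a continuous indicator and two ordinal indicators under a joint multivariate normal, with illustrative thresholds and covariances.

```python
# Hedged sketch (not OpenMx): row likelihood of one continuous and two ordinal
# indicators, split as p(continuous) * P(ordinal thresholds | continuous).
import numpy as np
from scipy.stats import norm, multivariate_normal

mu = np.array([0.0, 0.0, 0.0])               # [continuous, latent ordinal 1, latent ordinal 2]
Sigma = np.array([[1.0, 0.5, 0.3],
                  [0.5, 1.0, 0.4],
                  [0.3, 0.4, 1.0]])

y_c = 0.7                                    # observed continuous value
lo = np.array([-0.5, 0.0])                   # lower thresholds of the observed categories
hi = np.array([0.8, np.inf])                 # upper thresholds (illustrative cut points)

# p(y_c): marginal density of the continuous block
p_cont = norm.pdf(y_c, loc=mu[0], scale=np.sqrt(Sigma[0, 0]))

# p(ordinal | y_c): conditional MVN of the latent ordinal block given y_c
s_oc = Sigma[1:, 0]
mu_cond = mu[1:] + s_oc / Sigma[0, 0] * (y_c - mu[0])
S_cond = Sigma[1:, 1:] - np.outer(s_oc, s_oc) / Sigma[0, 0]

def rect_prob(low, high, mean, cov):
    """P(low < X <= high) for a bivariate normal via inclusion-exclusion of CDFs."""
    cdf = lambda x: multivariate_normal.cdf(x, mean=mean, cov=cov)
    big = 1e6                                # stand-in for +/- infinity
    high = np.minimum(high, big); low = np.maximum(low, -big)
    return cdf(high) - cdf([low[0], high[1]]) - cdf([high[0], low[1]]) + cdf(low)

likelihood = p_cont * rect_prob(lo, hi, mu_cond, S_cond)
print(f"row likelihood: {likelihood:.4f}")
```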

  17. Simulation of Medical Imaging Systems: Emission and Transmission Tomography

    NASA Astrophysics Data System (ADS)

    Harrison, Robert L.

    Simulation is an important tool in medical imaging research. In patient scans the true underlying anatomy and physiology are unknown. We have no way of knowing in a given scan how various factors are confounding the data: statistical noise, biological variability, patient motion, scattered radiation, dead time, and other data contaminants. Simulation allows us to isolate a single factor of interest, for instance when researchers perform multiple simulations of the same imaging situation to determine the effect of statistical noise or biological variability. Simulations are also increasingly used as a design optimization tool for tomographic scanners. This article gives an overview of the mechanics of emission and transmission tomography simulation, reviews some of the publicly available simulation tools, and discusses trade-offs between the accuracy and efficiency of simulations.
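    A toy sketch of the basic mechanics described in this record: a parallel-beam forward projection of an activity map with Poisson counting noise; attenuation, scatter and dead time are deliberately ignored, and the phantom is invented.

```python
# Hedged sketch: rotate-and-sum forward projection of an emission phantom with
# Poisson counting noise. Attenuation, scatter and dead time are ignored.
import numpy as np
from scipy.ndimage import rotate

rng = np.random.default_rng(0)

activity = np.zeros((128, 128))
activity[40:90, 50:80] = 5.0                 # a simple "hot" object
activity[60:70, 60:70] = 15.0                # a hotter insert

angles = np.linspace(0.0, 180.0, 60, endpoint=False)
sinogram = np.stack([rotate(activity, a, reshape=False, order=1).sum(axis=0)
                     for a in angles])       # ideal line integrals

counts = rng.poisson(sinogram)               # statistical (Poisson) noise of emission data
print("sinogram shape:", counts.shape, " total simulated counts:", counts.sum())
```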

  18. A system for automatic evaluation of simulation software

    NASA Technical Reports Server (NTRS)

    Ryan, J. P.; Hodges, B. C.

    1976-01-01

    Within the field of computer software, simulation and verification are complementary processes. Simulation methods can be used to verify software by performing variable range analysis. More general verification procedures, such as those described in this paper, can be implicitly viewed as attempts at modeling the end-product software. From the software requirements methodology, each component of the verification system has some element of simulation to it. Conversely, general verification procedures can be used to analyze simulation software. A dynamic analyzer is described which can be used to obtain properly scaled variables for an analog simulation, which is first digitally simulated. In a similar way, it is thought that the other system components, and indeed the whole system itself, have the potential to be used effectively in a simulation environment.

  19. Narrowing the agronomic yield gap with improved nitrogen use efficiency: a modeling approach.

    PubMed

    Ahrens, T D; Lobell, D B; Ortiz-Monasterio, J I; Li, Y; Matson, P A

    2010-01-01

    Improving nitrogen use efficiency (NUE) in the major cereals is critical for more sustainable nitrogen use in high-input agriculture, but our understanding of the potential for NUE improvement is limited by a paucity of reliable on-farm measurements. Limited on-farm data suggest that agronomic NUE (AE(N)) is lower and more variable than data from trials conducted at research stations, on which much of our understanding of AE(N) has been built. The purpose of this study was to determine the magnitude and causes of variability in AE(N) across an agricultural region, which we refer to as the achievement distribution of AE(N). The distribution of simulated AE(N) in 80 farmers' fields in an irrigated wheat system in the Yaqui Valley, Mexico, was compared with trials at a local research center (International Wheat and Maize Improvement Center; CIMMYT). An agroecosystem simulation model WNMM was used to understand factors controlling yield, AE(N), gaseous N emissions, and nitrate leaching in the region. Simulated AE(N) in the Yaqui Valley was highly variable, and mean on-farm AE(N) was 44% lower than trials with similar fertilization rates at CIMMYT. Variability in residual N supply was the most important factor determining simulated AE(N). Better split applications of N fertilizer led to almost a doubling of AE(N), increased profit, and reduced N pollution, and even larger improvements were possible with technologies that allow for direct measurement of soil N supply and plant N demand, such as site-specific nitrogen management.
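    For reference, agronomic N use efficiency is conventionally defined as the extra grain yield per unit of fertilizer N applied; the record uses AE(N) in this sense but does not write the formula out.

```latex
% Conventional definition of agronomic N use efficiency (assumed, not quoted
% from the record): extra grain yield per unit of fertilizer N applied.
\begin{equation}
  \mathrm{AE_N} \;=\; \frac{Y_{+N} - Y_{0}}{N_{\mathrm{applied}}}
  \qquad \left[\frac{\text{kg grain}}{\text{kg N}}\right],
\end{equation}
% where $Y_{+N}$ is yield with fertilizer, $Y_{0}$ is yield in an unfertilized
% control, and $N_{\mathrm{applied}}$ is the fertilizer N rate.
```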

  20. Simulations of the Boreal Winter Upper Mesosphere and Lower Thermosphere With Meteorological Specifications in SD-WACCM-X

    NASA Astrophysics Data System (ADS)

    Sassi, Fabrizio; Siskind, David E.; Tate, Jennifer L.; Liu, Han-Li; Randall, Cora E.

    2018-04-01

    We investigate the benefit of high-altitude nudging in simulations of the structure and short-term variability of the upper mesosphere and lower thermosphere (UMLT) dynamical meteorology during boreal winter, specifically around the time of the January 2009 sudden stratospheric warming. We compare simulations using the Specified Dynamics, Whole Atmosphere Community Climate Model, extended version, nudged using atmospheric specifications generated by the Navy Operational Global Atmospheric Prediction System, Advanced Level Physics High Altitude. Two sets of simulations are carried out: one uses nudging over a vertical domain from 0 to 90 km; the other uses nudging over a vertical domain from 0 to 50 km. The dynamical behavior is diagnosed from ensemble mean and standard deviation of winds, temperature, and zonal accelerations due to resolved and parameterized waves. We show that the dynamical behavior of the UMLT is quite different in the two experiments, with prominent differences in the structure and variability of constituent transport. We compare the results of our numerical experiments to observations of carbon monoxide by the Atmospheric Chemistry Experiment-Fourier Transform Spectrometer to show that the high-altitude nudging is capable of reproducing with high fidelity the observed variability, and traveling planetary waves are a crucial component of the dynamics. The results of this study indicate that to capture the key physical processes that affect short-term variability (defined as the atmospheric behavior within about 10 days of a stratospheric warming) in the UMLT, specification of the atmospheric state in the stratosphere alone is not sufficient, and upper atmospheric specifications are needed.

  1. Effects of mass on aircraft sidearm controller characteristics

    NASA Technical Reports Server (NTRS)

    Wagner, Charles A.

    1994-01-01

    When designing a flight simulator, providing a set of low mass variable-characteristic pilot controls can be very difficult. Thus, a strong incentive exists to identify the highest possible mass that will not degrade the validity of a simulation. The NASA Dryden Flight Research Center has conducted a brief flight program to determine the maximum acceptable mass (system inertia) of an aircraft sidearm controller as a function of force gradient. This information is useful for control system design in aircraft as well as development of suitable flight simulator controls. A modified Learjet with a variable-characteristic sidearm controller was used to obtain data. A boundary was defined between mass considered acceptable and mass considered unacceptable to the pilot. This boundary is defined as a function of force gradient over a range of natural frequencies. This investigation is limited to a study of mass-frequency characteristics only. Results of this investigation are presented in this paper.

  2. Simulation of gaseous diffusion in partially saturated porous media under variable gravity with lattice Boltzmann methods

    NASA Technical Reports Server (NTRS)

    Chau, Jessica Furrer; Or, Dani; Sukop, Michael C.; Steinberg, S. L. (Principal Investigator)

    2005-01-01

    Liquid distributions in unsaturated porous media under different gravitational accelerations and corresponding macroscopic gaseous diffusion coefficients were investigated to enhance understanding of plant growth conditions in microgravity. We used a single-component, multiphase lattice Boltzmann code to simulate liquid configurations in two-dimensional porous media at varying water contents for different gravity conditions and measured gas diffusion through the media using a multicomponent lattice Boltzmann code. The relative diffusion coefficients (D rel) for simulations with and without gravity as functions of air-filled porosity were in good agreement with measured data and established models. We found significant differences in liquid configuration in porous media, leading to reductions in D rel of up to 25% under zero gravity. The study highlights potential applications of the lattice Boltzmann method for rapid and cost-effective evaluation of alternative plant growth media designs under variable gravity.
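    As a point of reference for the "established models" mentioned above, the sketch below evaluates the classical Millington-Quirk relation for the relative diffusion coefficient; the porosity values are illustrative, not the simulated media.

```python
# Hedged sketch: relative gas diffusion coefficient D_rel = D_eff / D_0 from the
# classical Millington-Quirk (1961) relation, for illustrative porosity values.
import numpy as np

total_porosity = 0.45
air_filled = np.linspace(0.05, total_porosity, 9)             # air-filled porosity values

d_rel_mq = air_filled ** (10.0 / 3.0) / total_porosity ** 2   # Millington-Quirk model

for eps, d in zip(air_filled, d_rel_mq):
    print(f"air-filled porosity {eps:.2f} -> D_rel ~ {d:.3f}")
```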

  3. Attribution of Observed Streamflow Changes in Key British Columbia Drainage Basins

    NASA Astrophysics Data System (ADS)

    Najafi, Mohammad Reza; Zwiers, Francis W.; Gillett, Nathan P.

    2017-11-01

    We study the observed decline in summer streamflow in four key river basins in British Columbia (BC), Canada, using a formal detection and attribution (D&A) analysis procedure. Reconstructed and simulated streamflow is generated using the semidistributed variable infiltration capacity hydrologic model, which is driven by 1/16° gridded observations and downscaled climate model data from the Coupled Model Intercomparison Project phase 5 (CMIP5), respectively. The internal variability of the regional hydrologic components was characterized using 5100 years of streamflow simulated from CMIP5 preindustrial control runs. Results show that the observed changes in summer streamflow are inconsistent with simulations representing the responses to natural forcing factors alone, while the response to anthropogenic and natural forcing factors combined is detected in these changes. A two-signal D&A analysis indicates that the effects of anthropogenic (ANT) forcing factors are discernable from natural forcing in BC, albeit with large uncertainties.

  4. Case studies in configuration control for redundant robots

    NASA Technical Reports Server (NTRS)

    Seraji, H.; Lee, T.; Colbaugh, R.; Glass, K.

    1989-01-01

    A simple approach to configuration control of redundant robots is presented. The redundancy is utilized to control the robot configuration directly in task space, where the task will be performed. A number of task-related kinematic functions are defined and combined with the end-effector coordinates to form a set of configuration variables. An adaptive control scheme is then utilized to ensure that the configuration variables track the desired reference trajectories as closely as possible. Simulation results are presented to illustrate the control scheme. The scheme has also been implemented for direct online control of a PUMA industrial robot, and experimental results are presented. The simulation and experimental results validate the configuration control scheme for performing various realistic tasks.

  5. Interannual Variability of Ammonia Concentrations over the United States: Sources and Implications for Inorganic Particulate Matter

    NASA Astrophysics Data System (ADS)

    Schiferl, L. D.; Heald, C. L.; Van Damme, M.; Pierre-Francois, C.; Clerbaux, C.

    2015-12-01

    Modern agricultural practices have greatly increased the emission of ammonia (NH3) to the atmosphere. Recent controls to reduce the emissions of sulfur and nitrogen oxides (SOX and NOX) have increased the importance of understanding the role ammonia plays in the formation of surface fine inorganic particulate matter (PM2.5) in the United States. In this study, we identify the interannual variability in ammonia concentration, explore the sources of this variability, and determine their contribution to the variability in surface PM2.5 concentration. Over the summers of 2008-2012, measurements from the Ammonia Monitoring Network (AMoN) and the Infrared Atmospheric Sounding Interferometer (IASI) satellite instrument show considerable variability in both surface and column ammonia concentrations (±29% and ±28% of the mean, respectively). This observed variability is larger than that simulated by the GEOS-Chem chemical transport model, in which meteorology dominates the variability in ammonia and PM2.5 concentrations compared to the changes caused by SOX and NOX reductions. Our initial simulation does not include year-to-year changes in agricultural ammonia emissions. We use county-wide information on fertilizer sales and livestock populations, as well as meteorological variations, to account for the interannual variability in agricultural activity and ammonia volatilization. These sources of ammonia emission variability are important for replicating observed variations in ammonia and PM2.5, highlighting how accurate characterization of ammonia emissions is central to PM air quality prediction.

  6. A Study of Effects of MultiCollinearity in the Multivariable Analysis

    PubMed Central

    Yoo, Wonsuk; Mayberry, Robert; Bae, Sejong; Singh, Karan; (Peter) He, Qinghua; Lillard, James W.

    2015-01-01

    A multivariable analysis is the most popular approach when investigating associations between risk factors and disease. However, the efficiency of a multivariable analysis depends highly on the correlation structure among the predictive variables. When the covariates in the model are not independent of one another, collinearity/multicollinearity problems arise in the analysis, which leads to biased estimation. This work performs a simulation study with various scenarios of different collinearity structures to investigate the effects of collinearity under various correlation structures amongst predictive and explanatory variables, and to compare these results with existing guidelines for deciding when collinearity is harmful. Three correlation scenarios among predictor variables are considered: (1) a bivariate collinear structure as the simplest collinearity case, (2) a multivariate collinear structure in which an explanatory variable is correlated with two other covariates, and (3) a more realistic scenario in which an independent variable can be expressed as various functions of the other variables. PMID:25664257
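    A hedged sketch of how collinear predictors can be generated and screened (the paper's own simulation design is more elaborate), using the variance inflation factor from statsmodels:

```python
# Hedged sketch: generate correlated covariates (scenario 1, bivariate collinear
# structure) and diagnose them with the variance inflation factor (VIF).
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=n)       # strongly collinear with x1
x3 = rng.normal(size=n)                        # independent covariate

X = sm.add_constant(np.column_stack([x1, x2, x3]))
vifs = [variance_inflation_factor(X, i) for i in range(1, X.shape[1])]
print("VIF for x1, x2, x3:", np.round(vifs, 1))   # x1 and x2 should show inflated values
```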

  7. A Study of Effects of MultiCollinearity in the Multivariable Analysis.

    PubMed

    Yoo, Wonsuk; Mayberry, Robert; Bae, Sejong; Singh, Karan; Peter He, Qinghua; Lillard, James W

    2014-10-01

    A multivariable analysis is the most popular approach when investigating associations between risk factors and disease. However, the efficiency of a multivariable analysis depends highly on the correlation structure among the predictive variables. When the covariates in the model are not independent of one another, collinearity/multicollinearity problems arise in the analysis, which leads to biased estimation. This work performs a simulation study with various scenarios of different collinearity structures to investigate the effects of collinearity under various correlation structures amongst predictive and explanatory variables, and to compare these results with existing guidelines for deciding when collinearity is harmful. Three correlation scenarios among predictor variables are considered: (1) a bivariate collinear structure as the simplest collinearity case, (2) a multivariate collinear structure in which an explanatory variable is correlated with two other covariates, and (3) a more realistic scenario in which an independent variable can be expressed as various functions of the other variables.

  8. European temperature records of the past five centuries based on documentary information compared to climate simulations

    NASA Astrophysics Data System (ADS)

    Zorita, E.

    2009-09-01

    Two European temperature records for the past half-millennium, January-to-April air temperature for Stockholm (Sweden) and seasonal temperature for a Central European region, both derived from the analysis of documentary sources combined with long instrumental records, are compared with the output of forced (solar, volcanic, greenhouse gases) climate simulations with the model ECHO-G. The analysis is complemented with the long (early)-instrumental record of Central England Temperature (CET). Both approaches to study past climates (simulations and reconstructions) are burdened with uncertainties. The main objective of this comparative analysis is to identify robust features and weaknesses that may help to improve models and reconstruction methods. The results indicate a general agreement between simulations and the reconstructed Stockholm and CET records regarding the long-term temperature trend over the recent centuries, suggesting a reasonable choice of the amplitude of the solar forcing in the simulations and sensitivity of the model to the external forcing. However, the Stockholm reconstruction and the CET record also show a long and clear multi-decadal warm episode peaking around 1730, which is absent in the simulations. The uncertainties associated with the reconstruction method or with the simulated internal climate variability cannot easily explain this difference. Regarding the interannual variability, the Stockholm series displays in some periods higher amplitudes than the simulations but these differences are within the statistical uncertainty and further decrease if output from a regional model driven by the global model is used. The long-term trends in the simulations and reconstructions of the Central European temperature agree less well. The reconstructed temperature displays, for all seasons, a smaller difference between the present climate and past centuries than the simulations. Possible reasons for these differences may be related to a limitation of the traditional technique for converting documentary evidence to temperature values to capture long-term climate changes, because the documents often reflect temperatures relative to the contemporary authors' own perception of what constituted 'normal' conditions. By contrast, the simulated and reconstructed inter-annual variability is in rather good agreement.

  9. A planar comparison of actuators for vibration control of flexible structures

    NASA Technical Reports Server (NTRS)

    Clark, William W.; Robertshaw, Harry H.; Warrington, Thomas J.

    1989-01-01

    The methods and results of an analytical study comparing the effectiveness of four actuators in damping the vibrations of a planar clamped-free beam are presented. The actuators studied are two inertia-type actuators, the proof mass and reaction wheel, and two variable geometry trusses, the planar truss and the planar truss proof mass (a combination variable geometry truss/inertia-type actuator). Actuator parameters used in the models were chosen based on the results of a parametric study. A full-state, LQR optimal feedback control law was used for control in each system. Numerical simulations of each beam/actuator system were performed in response to initial condition inputs. These simulations provided information such as time response of the closed-loop system and damping provided to the beam. This information can be used to determine the 'best' actuator for a given purpose.
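    A minimal sketch of the full-state LQR feedback mentioned above; the two-state system matrices are placeholders, not the beam/actuator models of the paper.

```python
# Hedged sketch: full-state LQR feedback for a toy lightly damped mode.
# A, B, Q, R are placeholders, not the paper's beam/actuator models.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0],
              [-4.0, -0.1]])
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])                   # state weighting
R = np.array([[0.1]])                      # control effort weighting

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)            # optimal gain: u = -K x

closed_loop_eigs = np.linalg.eigvals(A - B @ K)
print("LQR gain K:", K.round(3), " closed-loop eigenvalues:", closed_loop_eigs.round(3))
```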

  10. Simulation of multivariate stationary stochastic processes using dimension-reduction representation methods

    NASA Astrophysics Data System (ADS)

    Liu, Zhangjun; Liu, Zenghui; Peng, Yongbo

    2018-03-01

    In view of the Fourier-Stieltjes integral formula for multivariate stationary stochastic processes, a unified formulation accommodating the spectral representation method (SRM) and proper orthogonal decomposition (POD) is deduced. By introducing random functions as constraints correlating the orthogonal random variables involved in the unified formulation, the dimension-reduction spectral representation method (DR-SRM) and the dimension-reduction proper orthogonal decomposition (DR-POD) are addressed. The proposed schemes are capable of representing a multivariate stationary stochastic process with a few elementary random variables, bypassing the challenges of the high-dimensional random variables inherent in conventional Monte Carlo methods. In order to accelerate the numerical simulation, the Fast Fourier Transform (FFT) technique is integrated with the proposed schemes. For illustrative purposes, the simulation of the horizontal wind velocity field along the deck of a large-span bridge is carried out using the proposed methods with 2 and 3 elementary random variables. The numerical simulation reveals the usefulness of the dimension-reduction representation methods.
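    For orientation, the classical (non-reduced) spectral representation that the unified formulation generalizes can be sketched as below; the one-sided spectrum is illustrative, not the bridge wind spectrum of the paper.

```python
# Hedged sketch: classical spectral representation of a single stationary process,
# x(t) = sum_k sqrt(2 S(w_k) dw) cos(w_k t + phi_k). The spectrum is illustrative.
import numpy as np

rng = np.random.default_rng(0)
N = 512                                   # number of frequency components
dw = 0.05                                 # frequency increment (rad/s)
w = (np.arange(N) + 0.5) * dw             # discretized frequencies
S = 1.0 / (1.0 + w**4)                    # an illustrative one-sided spectrum

t = np.linspace(0.0, 60.0, 2000)
phi = rng.uniform(0.0, 2.0 * np.pi, N)    # independent uniform random phases
amp = np.sqrt(2.0 * S * dw)
x = (amp[:, None] * np.cos(np.outer(w, t) + phi[:, None])).sum(axis=0)

print("sample variance:", x.var().round(3), " target variance:", (S * dw).sum().round(3))
```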

  11. New technique for simulation of microgravity and variable gravity conditions

    NASA Astrophysics Data System (ADS)

    de la Rosa, R.; Alonso, A.; Abasolo, D. E.; Hornero, R.; Abasolo, D. E.

    2005-08-01

    This paper proposes a simulator of microgravity or variable gravity conditions based on a Neuromuscular Control System (NCS) working as a man-machine interface. The subject under training lies on an active platform that counteracts his weight, and a Virtual Reality (VR) system displays a simulated environment in which the subject can interact with a number of settings: extravehicular activity (EVA), walking on the Moon, or training limb responses in variable acceleration scenes. Results related to real-time voluntary control have been achieved with neuromuscular interfaces at the Bioengineering Group of the University of Valladolid, where a custom real-time system has been employed to train arm movements. This paper outlines a more complex design that can complement other training facilities, such as the buoyancy pool, in the task of microgravity simulation.

  12. Behavioral Studies in Communication, A Selected Bibliography, 1973.

    ERIC Educational Resources Information Center

    Steinfatt, Thomas M.

    The studies in communication behavior listed in this selected bibliography were published during 1973, unless otherwise specified. Entries are listed under 12 categories: cross cultural communication; diffusion; games, simulations, and conflict; general communication variables; group and organizational communication; interpersonal communication;…

  13. Adaptations of a physical-based hydrological model for alpine catchments. Application to the upper Durance catchment.

    NASA Astrophysics Data System (ADS)

    Lafaysse, Matthieu; Hingray, Benoit

    2010-05-01

    The impact of global change on water resources is expected to be especially pronounced in mountainous areas. Future hydrological scenarios required for impact studies are classically simulated with hydrological models driven by future meteorological scenarios based on GCM outputs. Future hydrological regimes of French rivers were estimated following this methodology by Boé et al. (2009) with the physically based hydrological model SAFRAN-ISBA-MODCOU (SIM), developed by Météo-France. Scenarios obtained for the Alps, however, seem not very reliable due to the poor performance achieved by the model for the present climate over this region. This work presents possible improvements of SIM for a more relevant simulation of the hydrological behavior of alpine catchments. Results obtained for the upper Durance catchment (3580 km2) are given for illustration. This catchment is located in the Southern French Alps. Its outlet is the Serre-Ponçon lake, a large dam operated for hydropower production, with a key role for water supply in southeastern France. With altitudes ranging from 700 to 4100 meters, the catchment presents highly seasonal flows: minimum and maximum discharges are observed in winter and spring respectively due to snow accumulation and melt, low flows are sustained by glacier melt in late summer (39 km2 are covered by glaciers), and major floods can be observed in fall due to large liquid precipitation amounts. Two main limitations of SIM were identified for this catchment. First, the 8 km grid discretization gives a poor representation of the spatial variability of hydrological processes induced by elevation and orientation. Second, low flows are not well represented because the model includes neither deep storage in aquifers nor ice melt from glaciers. We modified SIM accordingly. For the first point, we applied a discretization based on topography: we divided the catchment into 9 sub-catchments and further into 300 m elevation bands. The vertical variability of meteorological inputs and vegetation cover could thus be better accounted for. Each elevation band was then divided into 7 exposure classes in order to represent the influence of the spatial variability of solar radiation on snow cover. This discretization results in 539 Hydrological Units where hydrological processes are assumed to be homogeneous. For the second point, we first included the possibility of glacier melt in the previous discretization. We next added a conceptual non-linear underground reservoir in order to simulate water retention by aquifers. These adaptations lead to a clear improvement of the simulations for all the hydrometric stations. Daily simulated discharges fit well with measurements (Nash score = 0.8). The model has a good ability to simulate interannual variability and is robust over a long simulation period (1959-2006). This encourages us to use it in a modified climate context. We studied the effect of each model improvement with a set of sensitivity tests. Accounting for elevation bands allows simulating more persistent snow cover at high altitudes, contributing later to river flows. Adding underground storage delays the transfer of snowmelt runoff to the river. The influence of exposure is less significant for discharge simulation, but it gives a more accurate description of the spatial variability of snow cover. Although glacier-covered areas are very small compared with the total basin area, a better simulation of summer low flows is obtained by including a glacier melt module. Despite these improvements, winter low flows are still slightly underestimated. As suggested by a simple sensitivity analysis, this could be partly due to the fact that the model does not correctly simulate basal snowmelt driven by ground heat flow.
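    A hedged sketch of the two conceptual additions described above (not the Météo-France code): a degree-day snowmelt term feeding a non-linear underground reservoir, Q = k*S**alpha, stepped daily with invented parameters and forcing.

```python
# Hedged sketch: degree-day melt plus a conceptual non-linear reservoir.
# Forcing and parameters are invented for illustration only.
import numpy as np

rng = np.random.default_rng(0)
days = 365
temp = 5.0 + 12.0 * np.sin(2 * np.pi * (np.arange(days) - 90) / 365) + rng.normal(0, 3, days)
rain = rng.gamma(0.6, 4.0, days)                  # daily precipitation (mm)

ddf, k, alpha = 3.0, 0.05, 1.5                    # degree-day factor, reservoir constants
snow, store = 100.0, 20.0                         # initial snowpack and storage (mm)
discharge = np.zeros(days)

for d in range(days):
    if temp[d] < 0.0:
        snow += rain[d]                           # precipitation falls as snow
        melt, liquid = 0.0, 0.0
    else:
        melt = min(snow, ddf * temp[d])           # degree-day snowmelt
        snow -= melt
        liquid = rain[d]
    store += melt + liquid                        # recharge of the conceptual reservoir
    outflow = min(k * store ** alpha, store)      # non-linear reservoir: Q = k * S**alpha
    store -= outflow
    discharge[d] = outflow

print("winter vs spring mean discharge:",
      discharge[:60].mean().round(2), discharge[90:150].mean().round(2))
```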

  14. When Can Categorical Variables Be Treated as Continuous? A Comparison of Robust Continuous and Categorical SEM Estimation Methods under Suboptimal Conditions

    ERIC Educational Resources Information Center

    Rhemtulla, Mijke; Brosseau-Liard, Patricia E.; Savalei, Victoria

    2012-01-01

    A simulation study compared the performance of robust normal theory maximum likelihood (ML) and robust categorical least squares (cat-LS) methodology for estimating confirmatory factor analysis models with ordinal variables. Data were generated from 2 models with 2-7 categories, 4 sample sizes, 2 latent distributions, and 5 patterns of category…

  15. Eddy energy sources and mesoscale eddies in the Sea of Okhotsk

    NASA Astrophysics Data System (ADS)

    Stepanov, Dmitry V.; Diansky, Nikolay A.; Fomin, Vladimir V.

    2018-05-01

    Based on eddy-permitting ocean circulation model outputs, the mesoscale variability is studied in the Sea of Okhotsk. We confirmed that the simulated circulation reproduces the main features of the general circulation in the Sea of Okhotsk. In particular, it reproduced a complex structure of the East-Sakhalin current and the pronounced seasonal variability of this current. We established that the maximum of mean kinetic energy was associated with the East-Sakhalin Current. In order to uncover causes and mechanisms of the mesoscale variability, we studied the budget of eddy kinetic energy (EKE) in the Sea of Okhotsk. Spatial distribution of the EKE showed that intensive mesoscale variability occurs along the western boundary of the Sea of Okhotsk, where the East-Sakhalin Current extends. We revealed a pronounced seasonal variability of EKE with its maximum intensity in winter and its minimum intensity in summer. Analysis of EKE sources and rates of energy conversion revealed a leading role of time-varying (turbulent) wind stress in the generation of mesoscale variability along the western boundary of the Sea of Okhotsk in winter and spring. We established that a contribution of baroclinic instability predominates over that of barotropic instability in the generation of mesoscale variability along the western boundary of the Sea of Okhotsk. To demonstrate the mechanism of baroclinic instability, the simulated circulation was considered along the western boundary of the Sea of Okhotsk from January to April 2005. In April, the mesoscale anticyclonic eddies are observed along the western boundary of the Sea of Okhotsk. The role of the sea ice cover in the intensification of the mesoscale variability in the Sea of Okhotsk was discussed.
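    The eddy kinetic energy diagnostic used in this record follows the usual mean/eddy decomposition; a minimal sketch with placeholder velocity fields:

```python
# Hedged sketch: mean and eddy kinetic energy from a (time, y, x) velocity field.
# The velocities are random placeholders standing in for model output.
import numpy as np

rng = np.random.default_rng(0)
u = rng.normal(0.1, 0.05, size=(120, 40, 60))   # monthly zonal velocity (m/s)
v = rng.normal(0.0, 0.05, size=(120, 40, 60))   # monthly meridional velocity (m/s)

u_prime = u - u.mean(axis=0)                    # deviations from the time mean
v_prime = v - v.mean(axis=0)

mke = 0.5 * (u.mean(axis=0) ** 2 + v.mean(axis=0) ** 2)       # mean kinetic energy
eke = 0.5 * (u_prime ** 2 + v_prime ** 2).mean(axis=0)        # eddy kinetic energy

print("domain-mean MKE:", mke.mean().round(5), " domain-mean EKE:", eke.mean().round(5))
```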

  16. Reduction of variable-truncation artifacts from beam occlusion during in situ x-ray tomography

    NASA Astrophysics Data System (ADS)

    Borg, Leise; Jørgensen, Jakob S.; Frikel, Jürgen; Sporring, Jon

    2017-12-01

    Many in situ x-ray tomography studies require experimental rigs which may partially occlude the beam and cause parts of the projection data to be missing. In a study of fluid flow in porous chalk using a percolation cell with four metal bars, drastic streak artifacts arise in the filtered backprojection (FBP) reconstruction at certain orientations. Projections with non-trivial variable truncation caused by the metal bars are the source of these variable-truncation artifacts. To understand the artifacts, a mathematical model of variable-truncation data as a function of metal bar radius and distance to sample is derived and verified numerically and with experimental data. The model accurately describes the arising variable-truncation artifacts across simulated variations of the experimental setup. Three variable-truncation artifact-reduction methods are proposed, all aimed at addressing the sinogram discontinuities that are shown to be the source of the streaks. The 'reduction to limited angle' (RLA) method simply keeps only non-truncated projections; the 'detector-directed smoothing' (DDS) method smooths the discontinuities; while the 'reflexive boundary condition' (RBC) method enforces a zero derivative at the discontinuities. Experimental results using both simulated and real data show that the proposed methods effectively reduce variable-truncation artifacts. The RBC method is found to provide the best artifact reduction and preservation of image features using both visual and quantitative assessment. The analysis and artifact-reduction methods are designed in the context of FBP reconstruction, motivated by the computational efficiency that is practical for large, real synchrotron data. While a specific variable-truncation case is considered, the proposed methods can be applied to general data cut-offs arising in different in situ x-ray tomography experiments.
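    One plausible numpy realization (an interpretation, not the authors' code) of the 'reflexive boundary condition' idea: mirror the values just inside the cut-off into the occluded detector bins so that the derivative at the discontinuity is approximately zero before FBP.

```python
# Hedged sketch: reflexively fill occluded detector bins of a truncated sinogram
# row by mirroring interior values across the cut-off. Assumes the occluded
# margins are narrower than the valid region; this is an interpretation only.
import numpy as np

def reflexive_fill(row, valid):
    """row: 1-D projection; valid: boolean mask of un-occluded detector bins."""
    filled = row.copy()
    idx = np.flatnonzero(valid)
    left, right = idx[0], idx[-1]
    n_left = left
    n_right = len(row) - 1 - right
    if n_left > 0:                                   # mirror inward values across the left cut-off
        filled[:left] = row[left + n_left:left:-1][:n_left]
    if n_right > 0:                                  # mirror inward values across the right cut-off
        filled[right + 1:] = row[right - 1:right - 1 - n_right:-1][:n_right]
    return filled

row = np.array([0., 0., 3., 4., 5., 4., 2., 0., 0.])
valid = row > 0                                      # demo mask of un-occluded bins
print(reflexive_fill(row, valid))
```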

  17. Further influence of the eastern boundary on the seasonal variability of the Atlantic Meridional Overturning Circulation at 26N

    NASA Astrophysics Data System (ADS)

    Baehr, Johanna; Schmidt, Christian

    2016-04-01

    The seasonal cycle of the Atlantic Meridional Overturning Circulation (AMOC) at 26.5 N has been shown to arise predominantly from sub-surface density variations at the eastern boundary. Here, we suggest that these sub-surface density variations have their origin in the seasonal variability of the Canary Current system, in particular the Poleward Undercurrent (PUC). We use a high-resolution ocean model (STORM) for which we show that the seasonal variability resembles observations for both sub-surface density variability and meridional transports. In particular, the density variations at the eastern boundary in the STORM simulation show seasonal variations reaching down to well over 1000 m, a pattern that most model simulations systematically underestimate. We find that positive wind stress curl anomalies in late summer, already within one degree off the eastern boundary, result, through water column stretching, in strong transport anomalies in the PUC in fall, coherent down to 1000 m depth. Simultaneously with a westward propagation of these transport anomalies, we find in winter a weak PUC between 200 m and 500 m, and southward transports between 600 m and 1300 m. This variability is in agreement with the observationally based suggestion of a seasonal reversal of the meridional transports at intermediate depths. Our findings extend earlier studies which suggested that the seasonal variability of the meridional transports across 26N is created by changes in the basin-wide thermocline through wind-driven upwelling at the eastern boundary, based on analysis of wind stress curl anomalies 2 degrees off the eastern boundary. Our results suggest that the investigation of AMOC variability, and in particular of its seasonal cycle modulations, requires the analysis of boundary wind stress curl and the upper ocean transports within 1 degree off the eastern boundary. These findings also imply that without high-resolution coverage of the eastern boundary, coarser model simulations might not fully represent the AMOC's seasonal variability.

  18. The contributions of local and remote atmospheric moisture fluxes to East Asian precipitation and its variability

    NASA Astrophysics Data System (ADS)

    Guo, Liang; Klingaman, Nicholas P.; Demory, Marie-Estelle; Vidale, Pier Luigi; Turner, Andrew G.; Stephan, Claudia C.

    2018-01-01

    We investigate the contribution of the local and remote atmospheric moisture fluxes to East Asia (EA) precipitation and its interannual variability during 1979-2012. We use and expand the Brubaker et al. (J Clim 6:1077-1089,1993) method, which connects the area-mean precipitation to area-mean evaporation and the horizontal moisture flux into the region. Due to its large landmass and hydrological heterogeneity, EA is divided into five sub-regions: Southeast (SE), Tibetan Plateau (TP), Central East (CE), Northwest (NW) and Northeast (NE). For each region, we first separate the contributions to precipitation of local evaporation from those of the horizontal moisture flux by calculating the precipitation recycling ratio: the fraction of precipitation over a region that originates as evaporation from the same region. Then, we separate the horizontal moisture flux across the region's boundaries by direction. We estimate the contributions of the horizontal moisture fluxes from each direction, as well as the local evaporation, to the mean precipitation and its interannual variability. We find that the major contributors to the mean precipitation are not necessarily those that contribute most to the precipitation interannual variability. Over SE, the moisture flux via the southern boundary dominates the mean precipitation and its interannual variability. Over TP, in winter and spring, the moisture flux via the western boundary dominates the mean precipitation; however, variations in local evaporation dominate the precipitation interannual variability. The western moisture flux is the dominant contributor to the mean precipitation over CE, NW and NE. However, the southern or northern moisture flux or the local evaporation dominates the precipitation interannual variability over these regions, depending on the season. Potential mechanisms associated with interannual variability in the moisture flux are identified for each region. The methods and results presented in this study can be readily applied to model simulations, to identify simulation biases in precipitation that relate to the simulated moisture supplies and transport.

  19. Simulation of Anomalous Regional Climate Events with a Variable Resolution Stretched Grid GCM

    NASA Technical Reports Server (NTRS)

    Fox-Rabinovitz, Michael S.

    1999-01-01

    The stretched-grid approach provides efficient downscaling and consistent interactions between global and regional scales through the use of one variable-resolution model for the integrations. It is a workable alternative to the widely used nested-grid approach introduced over a decade ago as a pioneering step in regional climate modeling. A variable-resolution General Circulation Model (GCM) employing a stretched grid, with enhanced resolution over the US as the area of interest, is used for simulating two anomalous regional climate events, the US summer drought of 1988 and the flood of 1993. A special mode of integration using a stretched-grid GCM and data assimilation system is developed that allows the nested-grid framework to be imitated. The mode is useful for inter-comparison purposes and for underlining the differences between these two approaches. The 1988 and 1993 integrations are performed for the two-month period starting from mid May. The regional resolution used in most of the experiments is 60 km. The major goal and result of the study is efficient downscaling over the area of interest. The monthly mean prognostic regional fields for the stretched-grid integrations are remarkably close to those of the verifying analyses. Simulated precipitation patterns are successfully verified against gauge precipitation observations. The impact of a finer 40 km regional resolution is investigated for the 1993 integration, and an example of recovering subregional precipitation is presented. The obtained results show that the global variable-resolution stretched-grid approach is a viable candidate for regional and subregional climate studies and applications.

  20. Modeling of carbonate reservoir variable secondary pore space based on CT images

    NASA Astrophysics Data System (ADS)

    Nie, X.; Nie, S.; Zhang, J.; Zhang, C.; Zhang, Z.

    2017-12-01

    Digital core technology has brought great convenience, and X-ray CT scanning is one of the most common ways to obtain 3D digital cores. However, it can only provide the original information of the particular samples being scanned, and the porosity of the scanned cores cannot be modified. For numerical rock physics simulations, a series of cores with variable porosities is needed to determine the relationship between the physical properties and porosity. In carbonate rocks, the secondary pore space, including dissolution pores, caves and natural fractures, is the key reservoir space, which makes the study of carbonate secondary porosity very important. To achieve a variation of porosities within one rock sample, several mathematical methods are chosen, based on CT-scanned digital cores and according to the physical and chemical properties of carbonate rocks, to simulate the variation of the secondary pore space. We use the erosion and dilation operations of the mathematical morphology method to simulate the pore space changes of dissolution pores and caves. We also use the Fractional Brownian Motion model to generate natural fractures with different widths and angles in digital cores to simulate fractured carbonate rocks. The morphological opening and closing operations of the mathematical morphology method are used to simulate the distribution of fluid in the pore space. The established 3D digital core models with different secondary porosities and water saturation states can be used in numerical simulations of the physical properties of carbonate reservoir rocks.
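    A hedged sketch of the morphological pore-space modification described above, applied to a synthetic 2-D binary pore image with scipy.ndimage; the CT volumes and fracture generation of the record are not reproduced.

```python
# Hedged sketch: vary the porosity of a synthetic binary pore map (True = pore)
# with morphological dilation and erosion, as a stand-in for the CT-based workflow.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
pores = ndimage.binary_opening(rng.random((200, 200)) > 0.62)   # synthetic pore map

dilated = ndimage.binary_dilation(pores)          # "dissolution": enlarge secondary pores
eroded = ndimage.binary_erosion(pores)            # "cementation": shrink pore space

print(f"original porosity: {pores.mean():.3f}")
print(f"after dilation:    {dilated.mean():.3f}")
print(f"after erosion:     {eroded.mean():.3f}")
```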
