Sample records for assimilation sequential monte

  1. DUAL STATE-PARAMETER UPDATING SCHEME ON A CONCEPTUAL HYDROLOGIC MODEL USING SEQUENTIAL MONTE CARLO FILTERS

    NASA Astrophysics Data System (ADS)

    Noh, Seong Jin; Tachikawa, Yasuto; Shiiba, Michiharu; Kim, Sunmin

    Data assimilation techniques have been widely applied to improve the predictability of hydrologic models. Among the various data assimilation techniques, sequential Monte Carlo (SMC) filters, known as "particle filters," provide the capability to handle non-linear and non-Gaussian state-space models. This paper proposes a dual state-parameter updating scheme (DUS) based on SMC methods to estimate both state and parameter variables of a hydrologic model. We introduce a kernel smoothing method for the robust estimation of uncertain model parameters in the DUS. The applicability of the dual updating scheme is illustrated using the implementation of the storage function model on a middle-sized Japanese catchment. We also compare the performance of DUS combined with various SMC methods, such as SIR, ASIR and RPF.
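
    The kernel smoothing step this abstract describes can be sketched in a few lines. The following is a minimal illustration in the Liu-West shrinkage style, not the authors' implementation; the function name, the shrinkage factor a, and the NumPy ensemble layout are our assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def kernel_smooth_parameters(theta, weights, a=0.98):
        """Shrink parameter particles toward their weighted mean, then add
        Gaussian kernel noise so parameter diversity is maintained without
        inflating the overall ensemble variance (Liu-West style)."""
        n, d = theta.shape
        mean = np.average(theta, weights=weights, axis=0)
        cov = np.atleast_2d(np.cov(theta.T, aweights=weights))
        h2 = 1.0 - a**2                        # kernel variance factor
        shrunk = a * theta + (1.0 - a) * mean  # shrink toward the mean
        jitter = rng.multivariate_normal(np.zeros(d), h2 * cov, size=n)
        return shrunk + jitter
    ```

    Applied after each resampling step, this keeps the parameter particles from collapsing onto a few values, which is the robustness issue the DUS targets.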

  2. How to improve an un-alterable model forecast? A sequential data assimilation based error updating approach

    NASA Astrophysics Data System (ADS)

    Gragne, A. S.; Sharma, A.; Mehrotra, R.; Alfredsen, K. T.

    2012-12-01

    Accuracy of reservoir inflow forecasts is instrumental for maximizing the value of water resources and significantly influences the operation of hydropower reservoirs. Improving hourly reservoir inflow forecasts over a 24-hour lead-time is considered with the day-ahead (Elspot) market of the Nordic exchange in perspective. The procedure presented comprises an error model added on top of an un-alterable constant-parameter conceptual model, and a sequential data assimilation routine. The structure of the error model was investigated using freely available software for detecting mathematical relationships in a given dataset (EUREQA) and adopted so as to contain minimum complexity for computational reasons. As new streamflow data become available, the extra information manifested in the discrepancies between measurements and conceptual model outputs is extracted and assimilated into the forecasting system recursively using a sequential Monte Carlo technique. Besides improving forecast skill significantly, the probabilistic inflow forecasts provided by the present approach contain suitable information for reducing uncertainty in decision-making processes related to hydropower system operation. The potential of the current procedure for improving the accuracy of inflow forecasts at lead-times up to 24 hours, and its reliability in different seasons of the year, will be illustrated and discussed thoroughly.

  3. Building occupancy simulation and data assimilation using a graph-based agent-oriented model

    NASA Astrophysics Data System (ADS)

    Rai, Sanish; Hu, Xiaolin

    2018-07-01

    Building occupancy simulation and estimation simulates the dynamics of occupants and estimates their real-time spatial distribution in a building. It requires a simulation model and an algorithm for data assimilation that assimilates real-time sensor data into the simulation model. Existing building occupancy simulation models include agent-based models and graph-based models. The agent-based models suffer from high computational cost when simulating large numbers of occupants, and graph-based models overlook the heterogeneity and detailed behaviors of individuals. Recognizing the limitations of existing models, this paper presents a new graph-based agent-oriented model which can efficiently simulate large numbers of occupants in various kinds of building structures. To support real-time occupancy dynamics estimation, a data assimilation framework based on sequential Monte Carlo methods is also developed and applied to the graph-based agent-oriented model to assimilate real-time sensor data. Experimental results show the effectiveness of the developed model and the data assimilation framework. The major contributions of this work are to provide an efficient model for building occupancy simulation that can accommodate large numbers of occupants and an effective data assimilation framework that can provide real-time estimates of building occupancy from sensor data.

  4. Bayesian Treatment of Uncertainty in Environmental Modeling: Optimization, Sampling and Data Assimilation Using the DREAM Software Package

    NASA Astrophysics Data System (ADS)

    Vrugt, J. A.

    2012-12-01

    In the past decade much progress has been made in the treatment of uncertainty in earth systems modeling. Whereas initial approaches focused mostly on the quantification of parameter and predictive uncertainty, recent methods attempt to disentangle the effects of parameter, forcing (input) data, model structural and calibration data errors. In this talk I will highlight some of our recent work involving theory, concepts and applications of Bayesian parameter and/or state estimation. In particular, new methods for sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC) simulation will be presented, with emphasis on massively parallel distributed computing and quantification of model structural errors. The theoretical and numerical developments will be illustrated using model-data synthesis problems in hydrology, hydrogeology and geophysics.

  5. A Particle Smoother with Sequential Importance Resampling for soil hydraulic parameter estimation: A lysimeter experiment

    NASA Astrophysics Data System (ADS)

    Montzka, Carsten; Hendricks Franssen, Harrie-Jan; Moradkhani, Hamid; Pütz, Thomas; Han, Xujun; Vereecken, Harry

    2013-04-01

    An adequate description of soil hydraulic properties is essential for good performance of hydrological forecasts. Several studies have shown that data assimilation can reduce parameter uncertainty by considering soil moisture observations. However, these observations, and also the model forcings, are recorded with specific measurement errors. It is therefore a logical step to base state updating and parameter estimation on observations made at multiple time steps, in order to reduce the influence of outliers at single time steps given measurement errors and unknown model forcings. Such outliers could result in erroneous state estimation as well as inadequate parameters. This has been one of the reasons to use a smoothing technique as implemented for Bayesian data assimilation methods such as the Ensemble Kalman Filter (i.e. the Ensemble Kalman Smoother). Recently, an ensemble-based smoother has been developed for state updating with a SIR particle filter. However, this method has not been used for dual state-parameter estimation. In this contribution we present a Particle Smoother with sequential smoothing of particle weights for state and parameter resampling within a time window, as opposed to the single time step data assimilation used in filtering techniques. This can be seen as an intermediate variant between a parameter estimation technique using global optimization, with estimation of single parameter sets valid for the whole period, and sequential Monte Carlo techniques, with estimation of parameter sets evolving from one time step to another. The aims are i) to improve the forecast of evaporation and groundwater recharge by estimating hydraulic parameters, and ii) to reduce the impact of single erroneous model inputs/observations by a smoothing method. In order to validate the performance of the proposed method in a real-world application, the experiment is conducted in a lysimeter environment.
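
    To make the windowed smoothing idea concrete, here is a hedged sketch of a single reweighting-and-resampling step over a time window; the (N, L) layout of per-step log-likelihoods and the function name are illustrative assumptions, not the authors' code.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def windowed_resample(particles, window_loglik):
        """Resample once per assimilation window: each particle's weight is
        the product of its observation likelihoods over every time step in
        the window (summed in log space), not a single step as in filtering."""
        logw = window_loglik.sum(axis=1)        # (N, L) -> (N,)
        w = np.exp(logw - logw.max())           # stabilised normalisation
        w /= w.sum()
        idx = rng.choice(len(particles), size=len(particles), p=w)
        return particles[idx]
    ```

    Because the weights aggregate several observations, a single outlying measurement has less leverage over which particles survive, which is the motivation stated in the abstract.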

  6. Joint Data Assimilation and Parameter Calibration in on-line groundwater modelling using Sequential Monte Carlo techniques

    NASA Astrophysics Data System (ADS)

    Ramgraber, M.; Schirmer, M.

    2017-12-01

    As computational power grows and wireless sensor networks find their way into common practice, it becomes increasingly feasible to pursue on-line numerical groundwater modelling. The reconciliation of model predictions with sensor measurements often necessitates the application of Sequential Monte Carlo (SMC) techniques, most prominently represented by the Ensemble Kalman Filter. In the pursuit of on-line predictions it seems advantageous to transcend the scope of pure data assimilation and incorporate on-line parameter calibration as well. Unfortunately, the interplay between shifting model parameters and transient states is non-trivial. Several recent publications (e.g. Chopin et al., 2013; Kantas et al., 2015) in the field of statistics discuss potential algorithms addressing this issue. However, most of these are computationally intractable for on-line application. In this study, we investigate to what extent compromises between mathematical rigour and computational restrictions can be made within the framework of on-line numerical modelling of groundwater. Preliminary studies are conducted in a synthetic setting, with the goal of transferring the conclusions drawn into application in a real-world setting. To this end, a wireless sensor network has been established in the valley aquifer around Fehraltorf, characterized by a highly dynamic groundwater system and located about 20 km east of Zürich, Switzerland. By providing continuous probabilistic estimates of the state and parameter distribution, a steady base for branched-off predictive scenario modelling could be established, providing water authorities with advanced tools for assessing the impact of groundwater management practices. Chopin, N., Jacob, P.E. and Papaspiliopoulos, O. (2013): SMC2: an efficient algorithm for sequential analysis of state space models. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 75 (3), p. 397-426. Kantas, N., Doucet, A., Singh, S.S., Maciejowski, J., and Chopin, N. (2015): On Particle Methods for Parameter Estimation in State-Space Models. Statistical Science, 30 (3), p. 328-351.

  7. Sequential biases on subjective judgments: Evidence from face attractiveness and ringtone agreeableness judgment.

    PubMed

    Huang, Jianrui; He, Xianyou; Ma, Xiaojin; Ren, Yian; Zhao, Tingting; Zeng, Xin; Li, Han; Chen, Yiheng

    2018-01-01

    When people make decisions about sequentially presented items in psychophysical experiments, their decisions are always biased by their preceding decisions and the preceding items, either by assimilation (shift towards the decision or item) or contrast (shift away from the decision or item). Such sequential biases also occur in naturalistic and real-world judgments such as facial attractiveness judgments. In this article, we aimed to cast light on the causes of these sequential biases. We first found significant assimilative and contrastive effects in a visual face attractiveness judgment task and an auditory ringtone agreeableness judgment task, indicating that sequential effects are not limited to the visual modality. We then found that the provision of trial-by-trial feedback of the preceding stimulus value eliminated the contrastive effect, but only weakened the assimilative effect. When participants orally reported their judgments rather than indicated them via a keyboard button press, we found a significant diminished assimilative effect, suggesting that motor response repetition strengthened the assimilation bias. Finally, we found that when visual and auditory stimuli were alternated, there was no longer a contrastive effect from the immediately previous trial, but there was an assimilative effect both from the previous trial (cross-modal) and the 2-back trial (same stimulus modality). These findings suggested that the contrastive effect results from perceptual processing, while the assimilative effect results from anchoring of the previous judgment and is strengthened by response repetition and numerical priming.

  8. Multivariate Error Covariance Estimates by Monte-Carlo Simulation for Assimilation Studies in the Pacific Ocean

    NASA Technical Reports Server (NTRS)

    Borovikov, Anna; Rienecker, Michele M.; Keppenne, Christian; Johnson, Gregory C.

    2004-01-01

    One of the most difficult aspects of ocean state estimation is the prescription of the model forecast error covariances. The paucity of ocean observations limits our ability to estimate the covariance structures from model-observation differences. In most practical applications, simple covariances are usually prescribed. Rarely are cross-covariances between different model variables used. Here a comparison is made between a univariate Optimal Interpolation (UOI) scheme and a multivariate OI algorithm (MvOI) in the assimilation of ocean temperature. In the UOI case, only temperature is updated using a Gaussian covariance function; in the MvOI, salinity and zonal and meridional velocities, as well as temperature, are updated using an empirically estimated multivariate covariance matrix. Earlier studies have shown that a univariate OI has a detrimental effect on the salinity and velocity fields of the model. Apparently, in a sequential framework it is important to analyze temperature and salinity together. For the MvOI an estimation of the model error statistics is made by Monte-Carlo techniques from an ensemble of model integrations. An important advantage of using an ensemble of ocean states is that it provides a natural way to estimate cross-covariances between the fields of different physical variables constituting the model state vector, at the same time incorporating the model's dynamical and thermodynamical constraints as well as the effects of physical boundaries. Only temperature observations from the Tropical Atmosphere-Ocean array have been assimilated in this study. In order to investigate the efficacy of the multivariate scheme, two data assimilation experiments are validated with a large independent set of recently published subsurface observations of salinity, zonal velocity and temperature. For reference, a third control run with no data assimilation is used to check how the data assimilation affects systematic model errors. While the performance of the UOI and MvOI is similar with respect to the temperature field, the salinity and velocity fields are greatly improved when multivariate correction is used, as evident from the analyses of the rms differences of these fields and independent observations. The MvOI assimilation is found to improve upon the control run in generating the water masses with properties close to those observed, while the UOI failed to maintain the temperature and salinity structure.
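
    A minimal sketch of the ensemble covariance estimation step follows, assuming model states are stacked as columns of a NumPy array (rows holding, e.g., temperature, salinity and velocity grid points); the function name and layout are illustrative, not the authors' implementation.

    ```python
    import numpy as np

    def ensemble_cross_covariance(ensemble):
        """Estimate the multivariate forecast-error covariance from an
        ensemble of model states (one member per column). The off-diagonal
        blocks are the temperature-salinity and temperature-velocity
        cross-covariances that a univariate OI scheme ignores."""
        anomalies = ensemble - ensemble.mean(axis=1, keepdims=True)
        return anomalies @ anomalies.T / (ensemble.shape[1] - 1)
    ```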

  9. Development of a copula-based particle filter (CopPF) approach for hydrologic data assimilation under consideration of parameter interdependence

    NASA Astrophysics Data System (ADS)

    Fan, Y. R.; Huang, G. H.; Baetz, B. W.; Li, Y. P.; Huang, K.

    2017-06-01

    In this study, a copula-based particle filter (CopPF) approach was developed for sequential hydrological data assimilation by considering parameter correlation structures. In CopPF, multivariate copulas are proposed to reflect parameter interdependence before the resampling procedure with new particles then being sampled from the obtained copulas. Such a process can overcome both particle degeneration and sample impoverishment. The applicability of CopPF is illustrated with three case studies using a two-parameter simplified model and two conceptual hydrologic models. The results for the simplified model indicate that model parameters are highly correlated in the data assimilation process, suggesting a demand for full description of their dependence structure. Synthetic experiments on hydrologic data assimilation indicate that CopPF can rejuvenate particle evolution in large spaces and thus achieve good performances with low sample size scenarios. The applicability of CopPF is further illustrated through two real-case studies. It is shown that, compared with traditional particle filter (PF) and particle Markov chain Monte Carlo (PMCMC) approaches, the proposed method can provide more accurate results for both deterministic and probabilistic prediction with a sample size of 100. Furthermore, the sample size would not significantly influence the performance of CopPF. Also, the copula resampling approach dominates parameter evolution in CopPF, with more than 50% of particles sampled by copulas in most sample size scenarios.
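
    The copula resampling idea can be sketched as follows, assuming a Gaussian copula fitted via normal scores; the function name, the rank-based margin inversion, and the SciPy usage are our assumptions rather than the published CopPF code.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)

    def copula_resample(theta, n_new):
        """Draw fresh parameter particles from a Gaussian copula fitted to
        the surviving particles: rank correlations carry the parameter
        interdependence, empirical quantiles carry the margins."""
        n, d = theta.shape
        u = stats.rankdata(theta, axis=0) / (n + 1.0)  # empirical CDF values
        z = stats.norm.ppf(u)                          # normal scores
        corr = np.atleast_2d(np.corrcoef(z.T))         # copula correlation
        z_new = rng.multivariate_normal(np.zeros(d), corr, size=n_new)
        u_new = stats.norm.cdf(z_new)
        # invert each empirical margin at the simulated quantiles
        return np.column_stack(
            [np.quantile(theta[:, j], u_new[:, j]) for j in range(d)])
    ```

    Sampling new particles from the fitted copula, rather than duplicating existing ones, is what lets the method counter both degeneracy and sample impoverishment at small ensemble sizes.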

  10. Olympic Medals as Fruits of Comparison? Assimilation and Contrast in Sequential Performance Judgments

    ERIC Educational Resources Information Center

    Damisch, Lysann; Mussweiler, Thomas; Plessner, Henning

    2006-01-01

    The authors investigated the evaluative consequences of sequential performance judgments. Recent social comparison research has suggested that performance judgments may be influenced by judgments about a preceding performance. Specifically, performance judgments may be assimilated to judgments of the preceding performance if judges focus on…

  11. Sequential effects in judgements of attractiveness: the influences of face race and sex.

    PubMed

    Kramer, Robin S S; Jones, Alex L; Sharma, Dinkar

    2013-01-01

    In perceptual decision-making, a person's response on a given trial is influenced by their response on the immediately preceding trial. This sequential effect was initially demonstrated in psychophysical tasks, but has now been found in more complex, real-world judgements. The similarity of the current and previous stimuli determines the nature of the effect, with more similar items producing assimilation in judgements, while less similarity can cause a contrast effect. Previous research found assimilation in ratings of facial attractiveness, and here, we investigated whether this effect is influenced by the social categories of the faces presented. Over three experiments, participants rated the attractiveness of own- (White) and other-race (Chinese) faces of both sexes that appeared successively. Through blocking trials by race (Experiment 1), sex (Experiment 2), or both dimensions (Experiment 3), we could examine how sequential judgements were altered by the salience of different social categories in face sequences. For sequences that varied in sex alone, own-race faces showed significantly less opposite-sex assimilation (male and female faces perceived as dissimilar), while other-race faces showed equal assimilation for opposite- and same-sex sequences (male and female faces were not differentiated). For sequences that varied in race alone, categorisation by race resulted in no opposite-race assimilation for either sex of face (White and Chinese faces perceived as dissimilar). For sequences that varied in both race and sex, same-category assimilation was significantly greater than opposite-category. Our results suggest that the race of a face represents a superordinate category relative to sex. These findings demonstrate the importance of social categories when considering sequential judgements of faces, and also highlight a novel approach for investigating how multiple social dimensions interact during decision-making.

  12. Sequential Effects in Judgements of Attractiveness: The Influences of Face Race and Sex

    PubMed Central

    Kramer, Robin S. S.; Jones, Alex L.; Sharma, Dinkar

    2013-01-01

    In perceptual decision-making, a person’s response on a given trial is influenced by their response on the immediately preceding trial. This sequential effect was initially demonstrated in psychophysical tasks, but has now been found in more complex, real-world judgements. The similarity of the current and previous stimuli determines the nature of the effect, with more similar items producing assimilation in judgements, while less similarity can cause a contrast effect. Previous research found assimilation in ratings of facial attractiveness, and here, we investigated whether this effect is influenced by the social categories of the faces presented. Over three experiments, participants rated the attractiveness of own- (White) and other-race (Chinese) faces of both sexes that appeared successively. Through blocking trials by race (Experiment 1), sex (Experiment 2), or both dimensions (Experiment 3), we could examine how sequential judgements were altered by the salience of different social categories in face sequences. For sequences that varied in sex alone, own-race faces showed significantly less opposite-sex assimilation (male and female faces perceived as dissimilar), while other-race faces showed equal assimilation for opposite- and same-sex sequences (male and female faces were not differentiated). For sequences that varied in race alone, categorisation by race resulted in no opposite-race assimilation for either sex of face (White and Chinese faces perceived as dissimilar). For sequences that varied in both race and sex, same-category assimilation was significantly greater than opposite-category. Our results suggest that the race of a face represents a superordinate category relative to sex. These findings demonstrate the importance of social categories when considering sequential judgements of faces, and also highlight a novel approach for investigating how multiple social dimensions interact during decision-making. PMID:24349226

  13. Sequential estimation and satellite data assimilation in meteorology and oceanography

    NASA Technical Reports Server (NTRS)

    Ghil, M.

    1986-01-01

    The role of dynamics in estimating the state of the atmosphere and ocean from incomplete and noisy data is discussed and the classical applications of four-dimensional data assimilation to large-scale atmospheric dynamics are presented. It is concluded that sequential updating of a forecast model with continuously incoming conventional and remote-sensing data is the most natural way of extracting the maximum amount of information from the imperfectly known dynamics, on the one hand, and the inaccurate and incomplete observations, on the other.

  14. Sequential estimation and satellite data assimilation in meteorology and oceanography

    NASA Technical Reports Server (NTRS)

    Ghil, M.

    1986-01-01

    The central theme of this review article is the role that dynamics plays in estimating the state of the atmosphere and of the ocean from incomplete and noisy data. Objective analysis and inverse methods represent an attempt at relying mostly on the data and minimizing the role of dynamics in the estimation. Four-dimensional data assimilation tries to balance properly the roles of dynamical and observational information. Sequential estimation is presented as the proper framework for understanding this balance, and the Kalman filter as the ideal, optimal procedure for data assimilation. The optimal filter computes forecast error covariances of a given atmospheric or oceanic model exactly, and hence data assimilation should be closely connected with predictability studies. This connection is described, and consequences drawn for currently active areas of the atmospheric and oceanic sciences, namely, mesoscale meteorology, medium and long-range forecasting, and upper-ocean dynamics.
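
    For reference, the Kalman filter analysis step that this review presents as the ideal sequential procedure can be written compactly; this is the textbook linear-Gaussian update, with variable names chosen for illustration.

    ```python
    import numpy as np

    def kalman_update(x_f, P_f, y, H, R):
        """One sequential analysis step: blend the forecast x_f (error
        covariance P_f) with observations y (error covariance R) seen
        through the linear observation operator H."""
        S = H @ P_f @ H.T + R                     # innovation covariance
        K = P_f @ H.T @ np.linalg.inv(S)          # Kalman gain
        x_a = x_f + K @ (y - H @ x_f)             # analysis state
        P_a = (np.eye(len(x_f)) - K @ H) @ P_f    # analysis covariance
        return x_a, P_a
    ```

    Iterating this update as observations arrive, with the model propagating x and P between analyses, is exactly the sequential updating of a forecast model the article advocates.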

  15. Monte Carlo Bayesian Inference on a Statistical Model of Sub-gridcolumn Moisture Variability Using High-resolution Cloud Observations. Part II: Sensitivity Tests and Results

    NASA Technical Reports Server (NTRS)

    da Silva, Arlindo M.; Norris, Peter M.

    2013-01-01

    Part I presented a Monte Carlo Bayesian method for constraining a complex statistical model of GCM sub-gridcolumn moisture variability using high-resolution MODIS cloud data, thereby permitting large-scale model parameter estimation and cloud data assimilation. This part performs some basic testing of this new approach, verifying that it does indeed significantly reduce mean and standard deviation biases with respect to the assimilated MODIS cloud optical depth, brightness temperature and cloud top pressure, and that it also improves the simulated rotational-Raman scattering cloud optical centroid pressure (OCP) against independent (non-assimilated) retrievals from the OMI instrument. Of particular interest, the Monte Carlo method does show skill in the especially difficult case where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach allows finite jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast where the background state has a clear swath. This paper also examines a number of algorithmic and physical sensitivities of the new method and provides guidance for its cost-effective implementation. One obvious difficulty for the method, and other cloud data assimilation methods as well, is the lack of information content in the cloud observables on cloud vertical structure, beyond cloud top pressure and optical thickness, thus necessitating strong dependence on the background vertical moisture structure. It is found that a simple flow-dependent correlation modification due to Riishojgaard (1998) provides some help in this respect, by better honoring inversion structures in the background state.

  16. Investigating the role of background and observation error correlations in improving a model forecast of forest carbon balance using four dimensional variational data assimilation.

    NASA Astrophysics Data System (ADS)

    Pinnington, Ewan; Casella, Eric; Dance, Sarah; Lawless, Amos; Morison, James; Nichols, Nancy; Wilkinson, Matthew; Quaife, Tristan

    2016-04-01

    Forest ecosystems play an important role in sequestering human-emitted carbon dioxide from the atmosphere and therefore greatly reduce the effect of anthropogenically induced climate change. For that reason, understanding their response to climate change is of great importance. Efforts to implement variational data assimilation routines with functional ecology models and land surface models have been limited, with sequential and Markov chain Monte Carlo data assimilation methods being prevalent. When data assimilation has been used with models of carbon balance, background "prior" errors and observation errors have largely been treated as independent and uncorrelated. Correlations between background errors have long been known to be a key aspect of data assimilation in numerical weather prediction. More recently, it has been shown that accounting for correlated observation errors in the assimilation algorithm can considerably improve data assimilation results and forecasts. In this paper we implement a 4D-Var scheme with a simple model of forest carbon balance, for joint parameter and state estimation, and assimilate daily observations of Net Ecosystem CO2 Exchange (NEE) taken at the Alice Holt forest CO2 flux site in Hampshire, UK. We then investigate the effect of specifying correlations between parameter and state variables in background error statistics and the effect of specifying correlations in time between observation error statistics. The idea of including these correlations in time is new and has not been previously explored in carbon balance model data assimilation. In data assimilation, background and observation error statistics are often described by the background error covariance matrix and the observation error covariance matrix. We outline novel methods for creating correlated versions of these matrices, using a set of previously postulated dynamical constraints to include correlations in the background error statistics and a Gaussian correlation function to include time correlations in the observation error statistics. The methods used in this paper will allow the inclusion of time correlations between many different observation types in the assimilation algorithm, meaning that previously neglected information can be accounted for. In our experiments we compared the results using our new correlated background and observation error covariance matrices and those using diagonal covariance matrices. We found that using the new correlated matrices reduced the root mean square error in the 14-year forecast of daily NEE by 44%, decreasing from 4.22 g C m-2 day-1 to 2.38 g C m-2 day-1.
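
    A small sketch of how an observation-error covariance with a Gaussian correlation function in time might be constructed; the function name, the single error scale sigma, and the uniform observation spacing dt are simplifying assumptions, not the authors' exact construction.

    ```python
    import numpy as np

    def gaussian_time_correlated_R(n_obs, sigma, length_scale, dt=1.0):
        """Observation-error covariance with Gaussian (squared-exponential)
        correlations between observation times, replacing the usual
        diagonal matrix that treats the errors as independent."""
        t = np.arange(n_obs) * dt
        corr = np.exp(-0.5 * ((t[:, None] - t[None, :]) / length_scale) ** 2)
        return sigma ** 2 * corr
    ```

    As length_scale tends to zero the matrix reverts to the diagonal case, so the diagonal experiment in the paper is the limiting configuration of this construction.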

  17. Development of a software framework for data assimilation and its applications for streamflow forecasting in Japan

    NASA Astrophysics Data System (ADS)

    Noh, S. J.; Tachikawa, Y.; Shiiba, M.; Yorozu, K.; Kim, S.

    2012-04-01

    Data assimilation methods have received increased attention for accomplishing uncertainty assessment and enhancing forecasting capability in various areas. Despite their potential, software frameworks applicable to probabilistic approaches and data assimilation are still limited because most hydrologic modeling software is based on a deterministic approach. In this study, we developed a hydrological modeling framework for sequential data assimilation, called MPI-OHyMoS. MPI-OHyMoS allows users to develop their own element models and to easily build a total simulation system model for hydrological simulations. Unlike process-based modeling frameworks, this software framework benefits from its object-oriented design to flexibly represent hydrological processes without any change to the main library. Sequential data assimilation based on particle filters is available for any hydrologic model built on MPI-OHyMoS, considering various sources of uncertainty originating from input forcing, parameters and observations. The particle filters are a Bayesian learning process in which the propagation of all uncertainties is carried out by a suitable selection of randomly generated particles without any assumptions about the nature of the distributions. In MPI-OHyMoS, ensemble simulations are parallelized, which can take advantage of high performance computing (HPC) systems. We applied this software framework to short-term streamflow forecasting of several catchments in Japan using a distributed hydrologic model. Uncertainty in model parameters and in remotely sensed rainfall data such as X-band or C-band radar is estimated and mitigated in the sequential data assimilation.

  18. Data assimilation using a GPU accelerated path integral Monte Carlo approach

    NASA Astrophysics Data System (ADS)

    Quinn, John C.; Abarbanel, Henry D. I.

    2011-09-01

    The answers to data assimilation questions can be expressed as path integrals over all possible state and parameter histories. We show how these path integrals can be evaluated numerically using a Markov Chain Monte Carlo method designed to run in parallel on a graphics processing unit (GPU). We demonstrate the application of the method to an example with a transmembrane voltage time series of a simulated neuron as an input, and using a Hodgkin-Huxley neuron model. By taking advantage of GPU computing, we gain a parallel speedup factor of up to about 300, compared to an equivalent serial computation on a CPU, with performance increasing as the length of the observation time used for data assimilation increases.
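
    A CPU-only sketch of the core idea, assuming a scalar state observed directly at every time step; path_action, the noise scales, and the random-walk proposal are illustrative assumptions, and the paper's GPU parallelisation is not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def path_action(x, y, f, sig_model, sig_obs):
        """Discretised 'action': negative log posterior of a state path x
        given direct observations y, a one-step model map f, and Gaussian
        model and observation errors."""
        model_term = np.sum((x[1:] - f(x[:-1])) ** 2) / (2 * sig_model ** 2)
        obs_term = np.sum((y - x) ** 2) / (2 * sig_obs ** 2)
        return model_term + obs_term

    def metropolis_over_paths(y, f, n_iter=5000, step=0.1,
                              sig_model=0.5, sig_obs=0.5):
        """Metropolis sampling of whole state histories: perturb the path,
        then accept or reject by the change in action."""
        x = y.copy()                                # initialise at the data
        a = path_action(x, y, f, sig_model, sig_obs)
        samples = []
        for _ in range(n_iter):
            prop = x + step * rng.standard_normal(len(x))
            a_prop = path_action(prop, y, f, sig_model, sig_obs)
            if np.log(rng.random()) < a - a_prop:   # accept w.p. e^(a - a')
                x, a = prop, a_prop
            samples.append(x.copy())
        return np.array(samples)
    ```

    Averages over the returned paths approximate the path-integral expectations; the per-proposal arithmetic is what the paper offloads to the GPU.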

  19. Implicit Particle Filter for Power System State Estimation with Large Scale Renewable Power Integration.

    NASA Astrophysics Data System (ADS)

    Uzunoglu, B.; Hussaini, Y.

    2017-12-01

    The Implicit Particle Filter is a sequential Monte Carlo method for data assimilation that guides the particles to high-probability regions by an implicit step. It optimizes a nonlinear cost function which can be inherited from legacy assimilation routines. Dynamic state estimation for near real-time applications in power systems is becoming increasingly important with the integration of variable wind and solar power generation. New state estimation tools that will replace the older generation of state estimators should be able to integrate legacy software within a general mathematical framework, accommodating the power industry's need for cautious, evolutionary change rather than a revolutionary one, while addressing nonlinearity and non-normal behaviour. This work implements the implicit particle filter as a state estimation tool for power systems and presents the first application study of the implicit particle filter to power system state estimation. The implicit particle filter is introduced into power systems and simulations are presented for a three-node benchmark power system. The performance of the filter on the presented problem is analyzed and the results are presented.

  20. Propagating probability distributions of stand variables using sequential Monte Carlo methods

    Treesearch

    Jeffrey H. Gove

    2009-01-01

    A general probabilistic approach to stand yield estimation is developed based on sequential Monte Carlo filters, also known as particle filters. The essential steps in the development of the sampling importance resampling (SIR) particle filter are presented. The SIR filter is then applied to simulated and observed data showing how the 'predictor-corrector'...
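
    A minimal sketch of one SIR 'predictor-corrector' cycle follows, with resampling triggered by the effective sample size; propagate and loglik stand in for the user's state transition and observation likelihood, and the 0.5 N resampling threshold is a common convention, not necessarily the author's.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def sir_step(particles, weights, propagate, loglik, y):
        """One predictor-corrector cycle of the SIR particle filter:
        predict by sampling the state transition, correct by reweighting
        with the observation likelihood, and resample when degenerate."""
        particles = propagate(particles)                # predictor
        logw = np.log(weights) + loglik(y, particles)   # corrector
        w = np.exp(logw - logw.max())
        w /= w.sum()
        if 1.0 / np.sum(w ** 2) < 0.5 * len(w):         # low effective size
            idx = rng.choice(len(w), size=len(w), p=w)
            particles, w = particles[idx], np.full(len(w), 1.0 / len(w))
        return particles, w
    ```

    Looping sir_step over successive measurements propagates the full probability distribution of the stand variables, which is the point of the approach described above.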

  1. Estimation of parameters and basic reproduction ratio for Japanese encephalitis transmission in the Philippines using sequential Monte Carlo filter

    USDA-ARS's Scientific Manuscript database

    We developed a sequential Monte Carlo filter to estimate the states and the parameters in a stochastic model of Japanese Encephalitis (JE) spread in the Philippines. This method is particularly important for its adaptability to the availability of new incidence data. This method can also capture the...

  2. Subsurface characterization with localized ensemble Kalman filter employing adaptive thresholding

    NASA Astrophysics Data System (ADS)

    Delijani, Ebrahim Biniaz; Pishvaie, Mahmoud Reza; Boozarjomehry, Ramin Bozorgmehry

    2014-07-01

    The ensemble Kalman filter (EnKF), a Monte Carlo sequential data assimilation method, has emerged as a promising tool for subsurface media characterization during the past decade. Due to the high computational cost of large ensembles, EnKF is limited to small ensemble sizes in practice. This results in the appearance of spurious correlations in the covariance structure, leading to incorrect updates or divergence of the updated realizations. In this paper, a universal/adaptive thresholding method is presented to remove and/or mitigate the spurious correlation problem in the forecast covariance matrix. This method is then extended to regularize the Kalman gain directly. Four different thresholding functions have been considered to threshold the forecast covariance and gain matrices. These include hard, soft, lasso and Smoothly Clipped Absolute Deviation (SCAD) functions. Three benchmarks are used to evaluate the performance of these methods. These benchmarks include a small 1D linear model and two 2D water flooding (in petroleum reservoirs) cases whose levels of heterogeneity/nonlinearity are different. It should be noted that besides the adaptive thresholding, the standard distance-dependent localization and bootstrap Kalman gain are also implemented for comparison purposes. We assessed each setup with different ensemble sets to investigate the sensitivity of each method to ensemble size. The results indicate that thresholding of the forecast covariance yields more reliable performance than thresholding of the Kalman gain. Among the thresholding functions, SCAD is more robust for both covariance and gain estimation. Our analyses emphasize that not all assimilation cycles require thresholding and that it should be performed wisely during the early assimilation cycles. The proposed scheme of adaptive thresholding outperforms the other methods for subsurface characterization of the underlying benchmarks.
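
    For illustration, soft thresholding of a forecast covariance might look like the following sketch; the function name and the uniform threshold tau are assumptions, and the paper's hard, lasso and SCAD rules would replace the shrinkage line.

    ```python
    import numpy as np

    def soft_threshold_covariance(P, tau):
        """Soft-threshold the off-diagonal entries of a forecast covariance
        estimated from a small ensemble, suppressing weak (likely spurious)
        long-range correlations while leaving the variances on the
        diagonal untouched."""
        d = np.diag(np.diag(P))
        off = P - d
        shrunk = np.sign(off) * np.maximum(np.abs(off) - tau, 0.0)
        return shrunk + d
    ```

    Unlike distance-dependent localization, this rule needs no physical distance between state elements, which is what makes it adaptive to the data at hand.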

  3. Real-time projections of cholera outbreaks through data assimilation and rainfall forecasting

    NASA Astrophysics Data System (ADS)

    Pasetto, Damiano; Finger, Flavio; Rinaldo, Andrea; Bertuzzo, Enrico

    2017-10-01

    Although treatment for cholera is well-known and cheap, outbreaks in epidemic regions still exact high death tolls mostly due to the unpreparedness of health care infrastructures to face unforeseen emergencies. In this context, mathematical models for the prediction of the evolution of an ongoing outbreak are of paramount importance. Here, we test a real-time forecasting framework that readily integrates new information as soon as available and periodically issues an updated forecast. The spread of cholera is modeled by a spatially-explicit scheme that accounts for the dynamics of susceptible, infected and recovered individuals hosted in different local communities connected through hydrologic and human mobility networks. The framework presents two major innovations for cholera modeling: the use of a data assimilation technique, specifically an ensemble Kalman filter, to update both state variables and parameters based on the observations, and the use of rainfall forecasts to force the model. The exercise of simulating the state of the system and the predictive capabilities of the novel tools, set at the initial phase of the 2010 Haitian cholera outbreak using only information that was available at that time, serves as a benchmark. Our results suggest that the assimilation procedure with the sequential update of the parameters outperforms calibration schemes based on Markov chain Monte Carlo. Moreover, in a forecasting mode the model usefully predicts the spatial incidence of cholera at least one month ahead. The performance decreases for longer time horizons yet allowing sufficient time to plan for deployment of medical supplies and staff, and to evaluate alternative strategies of emergency management.

  4. Monte Carlo Bayesian inference on a statistical model of sub-gridcolumn moisture variability using high-resolution cloud observations. Part 2: Sensitivity tests and results

    PubMed Central

    Norris, Peter M.; da Silva, Arlindo M.

    2018-01-01

    Part 1 of this series presented a Monte Carlo Bayesian method for constraining a complex statistical model of global circulation model (GCM) sub-gridcolumn moisture variability using high-resolution Moderate Resolution Imaging Spectroradiometer (MODIS) cloud data, thereby permitting parameter estimation and cloud data assimilation for large-scale models. This article performs some basic testing of this new approach, verifying that it does indeed reduce mean and standard deviation biases significantly with respect to the assimilated MODIS cloud optical depth, brightness temperature and cloud-top pressure and that it also improves the simulated rotational–Raman scattering cloud optical centroid pressure (OCP) against independent (non-assimilated) retrievals from the Ozone Monitoring Instrument (OMI). Of particular interest, the Monte Carlo method does show skill in the especially difficult case where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach allows non-gradient-based jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast, where the background state has a clear swath. This article also examines a number of algorithmic and physical sensitivities of the new method and provides guidance for its cost-effective implementation. One obvious difficulty for the method, and other cloud data assimilation methods as well, is the lack of information content in passive-radiometer-retrieved cloud observables on cloud vertical structure, beyond cloud-top pressure and optical thickness, thus necessitating strong dependence on the background vertical moisture structure. It is found that a simple flow-dependent correlation modification from Riishojgaard provides some help in this respect, by better honouring inversion structures in the background state. PMID:29618848

  5. Monte Carlo Bayesian Inference on a Statistical Model of Sub-Gridcolumn Moisture Variability Using High-Resolution Cloud Observations. Part 2: Sensitivity Tests and Results

    NASA Technical Reports Server (NTRS)

    Norris, Peter M.; da Silva, Arlindo M.

    2016-01-01

    Part 1 of this series presented a Monte Carlo Bayesian method for constraining a complex statistical model of global circulation model (GCM) sub-gridcolumn moisture variability using high-resolution Moderate Resolution Imaging Spectroradiometer (MODIS) cloud data, thereby permitting parameter estimation and cloud data assimilation for large-scale models. This article performs some basic testing of this new approach, verifying that it does indeed reduce mean and standard deviation biases significantly with respect to the assimilated MODIS cloud optical depth, brightness temperature and cloud-top pressure and that it also improves the simulated rotational-Raman scattering cloud optical centroid pressure (OCP) against independent (non-assimilated) retrievals from the Ozone Monitoring Instrument (OMI). Of particular interest, the Monte Carlo method does show skill in the especially difficult case where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach allows non-gradient-based jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast, where the background state has a clear swath. This article also examines a number of algorithmic and physical sensitivities of the new method and provides guidance for its cost-effective implementation. One obvious difficulty for the method, and other cloud data assimilation methods as well, is the lack of information content in passive-radiometer-retrieved cloud observables on cloud vertical structure, beyond cloud-top pressure and optical thickness, thus necessitating strong dependence on the background vertical moisture structure. It is found that a simple flow-dependent correlation modification from Riishojgaard provides some help in this respect, by better honouring inversion structures in the background state.

  6. Monte Carlo Bayesian inference on a statistical model of sub-gridcolumn moisture variability using high-resolution cloud observations. Part 2: Sensitivity tests and results.

    PubMed

    Norris, Peter M; da Silva, Arlindo M

    2016-07-01

    Part 1 of this series presented a Monte Carlo Bayesian method for constraining a complex statistical model of global circulation model (GCM) sub-gridcolumn moisture variability using high-resolution Moderate Resolution Imaging Spectroradiometer (MODIS) cloud data, thereby permitting parameter estimation and cloud data assimilation for large-scale models. This article performs some basic testing of this new approach, verifying that it does indeed reduce mean and standard deviation biases significantly with respect to the assimilated MODIS cloud optical depth, brightness temperature and cloud-top pressure and that it also improves the simulated rotational-Raman scattering cloud optical centroid pressure (OCP) against independent (non-assimilated) retrievals from the Ozone Monitoring Instrument (OMI). Of particular interest, the Monte Carlo method does show skill in the especially difficult case where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach allows non-gradient-based jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast, where the background state has a clear swath. This article also examines a number of algorithmic and physical sensitivities of the new method and provides guidance for its cost-effective implementation. One obvious difficulty for the method, and other cloud data assimilation methods as well, is the lack of information content in passive-radiometer-retrieved cloud observables on cloud vertical structure, beyond cloud-top pressure and optical thickness, thus necessitating strong dependence on the background vertical moisture structure. It is found that a simple flow-dependent correlation modification from Riishojgaard provides some help in this respect, by better honouring inversion structures in the background state.

  7. Non-Saccharomyces Yeasts Nitrogen Source Preferences: Impact on Sequential Fermentation and Wine Volatile Compounds Profile

    PubMed Central

    Gobert, Antoine; Tourdot-Maréchal, Raphaëlle; Morge, Christophe; Sparrow, Céline; Liu, Youzhong; Quintanilla-Casas, Beatriz; Vichi, Stefania; Alexandre, Hervé

    2017-01-01

    Nitrogen sources in the must are important for yeast metabolism, growth, and performance, and wine volatile compounds profile. Yeast assimilable nitrogen (YAN) deficiencies in grape must are one of the main causes of stuck and sluggish fermentation. The nitrogen requirement of Saccharomyces cerevisiae metabolism has been described in detail. However, the YAN preferences of non-Saccharomyces yeasts remain unknown despite their increasingly widespread use in winemaking. Furthermore, the impact of nitrogen consumption by non-Saccharomyces yeasts on YAN availability, alcoholic performance and volatile compounds production by S. cerevisiae in sequential fermentation has been little studied. With a view to improving the use of non-Saccharomyces yeasts in winemaking, we studied the use of amino acids and ammonium by three strains of non-Saccharomyces yeasts (Starmerella bacillaris, Metschnikowia pulcherrima, and Pichia membranifaciens) in grape juice. We first determined which nitrogen sources were preferentially used by these yeasts in pure cultures at 28 and 20°C (because few data are available). We then carried out sequential fermentations at 20°C with S. cerevisiae, to assess the impact of the non-Saccharomyces yeasts on the availability of assimilable nitrogen for S. cerevisiae. Finally, 22 volatile compounds were quantified in sequential fermentation and their levels compared with those in pure cultures of S. cerevisiae. We report here, for the first time, that non-Saccharomyces yeasts have specific amino-acid consumption profiles. Histidine, methionine, threonine, and tyrosine were not consumed by S. bacillaris, aspartic acid was assimilated very slowly by M. pulcherrima, and glutamine was not assimilated by P. membranifaciens. By contrast, cysteine appeared to be a preferred nitrogen source for all non-Saccharomyces yeasts. In sequential fermentation, these specific profiles of amino-acid consumption by non-Saccharomyces yeasts may account for some of the interactions observed here, such as poorer performances of S. cerevisiae and volatile profile changes. PMID:29163451

  8. Non-Saccharomyces Yeasts Nitrogen Source Preferences: Impact on Sequential Fermentation and Wine Volatile Compounds Profile.

    PubMed

    Gobert, Antoine; Tourdot-Maréchal, Raphaëlle; Morge, Christophe; Sparrow, Céline; Liu, Youzhong; Quintanilla-Casas, Beatriz; Vichi, Stefania; Alexandre, Hervé

    2017-01-01

    Nitrogen sources in the must are important for yeast metabolism, growth, and performance, and wine volatile compounds profile. Yeast assimilable nitrogen (YAN) deficiencies in grape must are one of the main causes of stuck and sluggish fermentation. The nitrogen requirement of Saccharomyces cerevisiae metabolism has been described in detail. However, the YAN preferences of non-Saccharomyces yeasts remain unknown despite their increasingly widespread use in winemaking. Furthermore, the impact of nitrogen consumption by non-Saccharomyces yeasts on YAN availability, alcoholic performance and volatile compounds production by S. cerevisiae in sequential fermentation has been little studied. With a view to improving the use of non-Saccharomyces yeasts in winemaking, we studied the use of amino acids and ammonium by three strains of non-Saccharomyces yeasts (Starmerella bacillaris, Metschnikowia pulcherrima, and Pichia membranifaciens) in grape juice. We first determined which nitrogen sources were preferentially used by these yeasts in pure cultures at 28 and 20°C (because few data are available). We then carried out sequential fermentations at 20°C with S. cerevisiae, to assess the impact of the non-Saccharomyces yeasts on the availability of assimilable nitrogen for S. cerevisiae. Finally, 22 volatile compounds were quantified in sequential fermentation and their levels compared with those in pure cultures of S. cerevisiae. We report here, for the first time, that non-Saccharomyces yeasts have specific amino-acid consumption profiles. Histidine, methionine, threonine, and tyrosine were not consumed by S. bacillaris, aspartic acid was assimilated very slowly by M. pulcherrima, and glutamine was not assimilated by P. membranifaciens. By contrast, cysteine appeared to be a preferred nitrogen source for all non-Saccharomyces yeasts. In sequential fermentation, these specific profiles of amino-acid consumption by non-Saccharomyces yeasts may account for some of the interactions observed here, such as poorer performances of S. cerevisiae and volatile profile changes.

  9. Hydrologic and geochemical data assimilation at the Hanford 300 Area

    NASA Astrophysics Data System (ADS)

    Chen, X.; Hammond, G. E.; Murray, C. J.; Zachara, J. M.

    2012-12-01

    In modeling the uranium migration within the Integrated Field Research Challenge (IFRC) site at the Hanford 300 Area, uncertainties arise from both hydrologic and geochemical sources. The hydrologic uncertainty includes the transient flow boundary conditions induced by dynamic variations in Columbia River stage and the underlying heterogeneous hydraulic conductivity field, while the geochemical uncertainty is a result of limited knowledge of the geochemical reaction processes and parameters, as well as heterogeneity in uranium source terms. In this work, multiple types of data, including the results from constant-injection tests, borehole flowmeter profiling, and conservative tracer tests, are sequentially assimilated across scales within a Bayesian framework to reduce the hydrologic uncertainty. The hydrologic data assimilation is then followed by geochemical data assimilation, where the goal is to infer the heterogeneous distribution of uranium sources using uranium breakthrough curves from a desorption test that took place at high spring water table. We demonstrate in our study that Ensemble-based data assimilation techniques (Ensemble Kalman filter and smoother) are efficient in integrating multiple types of data sequentially for uncertainty reduction. The computational demand is managed by using the multi-realization capability within the parallel PFLOTRAN simulator.

  10. On a comparison of two schemes in sequential data assimilation

    NASA Astrophysics Data System (ADS)

    Grishina, Anastasiia A.; Penenko, Alexey V.

    2017-11-01

    This paper focuses on variational data assimilation as an approach to mathematical modeling. Realization of the approach requires a sequence of connected inverse problems with different sets of observational data to be solved. Two variational data assimilation schemes, "implicit" and "explicit", are considered in the article. Their equivalence is shown and numerical results are given on the basis of the non-linear Robertson system. To avoid the "inverse problem crime", different schemes were used to produce the synthetic measurements and to solve the data assimilation problem.

  11. Comparison of Sequential and Variational Data Assimilation

    NASA Astrophysics Data System (ADS)

    Alvarado Montero, Rodolfo; Schwanenberg, Dirk; Weerts, Albrecht

    2017-04-01

    Data assimilation is a valuable tool to improve model state estimates by combining measured observations with model simulations. It has recently gained significant attention due to its potential for using remote sensing products to improve operational hydrological forecasts and for reanalysis purposes. This has been supported by the application of sequential techniques such as the Ensemble Kalman Filter, which require no additional features within the modeling process, i.e. they can use arbitrary black-box models. Alternatively, variational techniques rely on optimization algorithms to minimize a pre-defined objective function. This function describes the trade-off between the amount of noise introduced into the system and the mismatch between simulated and observed variables. While sequential techniques have been commonly applied to hydrological processes, variational techniques are seldom used. In our view, this is mainly attributable to the required computation of first-order sensitivities by algorithmic differentiation techniques and related model enhancements, but also to the lack of comparison between the two techniques. We contribute to filling this gap and present the results from the assimilation of streamflow data in two basins located in Germany and Canada. The assimilation introduces noise to precipitation and temperature to produce better initial estimates for an HBV model. The results are computed for a hindcast period and assessed using lead-time performance metrics. The study concludes with a discussion of the main features of each technique and their advantages/disadvantages in hydrological applications.
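
    As a point of reference for the sequential side of the comparison, here is a minimal stochastic EnKF analysis step; the column-per-member layout, the perturbed observations, and the linear operator H are textbook conventions, not the study's specific setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def enkf_analysis(X, y, H, R):
        """Stochastic EnKF analysis step: each ensemble member (a column
        of X) is updated toward its own perturbed copy of the observations,
        with the gain built entirely from ensemble anomalies."""
        m = X.shape[1]                                  # ensemble size
        A = X - X.mean(axis=1, keepdims=True)           # state anomalies
        HX = H @ X
        HA = HX - HX.mean(axis=1, keepdims=True)        # obs-space anomalies
        K = (A @ HA.T) @ np.linalg.inv(HA @ HA.T + (m - 1) * R)
        Y = y[:, None] + rng.multivariate_normal(
            np.zeros(len(y)), R, size=m).T              # perturbed obs
        return X + K @ (Y - HX)
    ```

    Because the gain is built from the ensemble itself, no adjoint or algorithmic differentiation is needed, which is precisely the black-box convenience the abstract contrasts with variational schemes.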

  12. An integrated error estimation and lag-aware data assimilation scheme for real-time flood forecasting

    USDA-ARS's Scientific Manuscript database

    The performance of conventional filtering methods can be degraded by ignoring the time lag between soil moisture and discharge response when discharge observations are assimilated into streamflow modelling. This has led to the ongoing development of more optimal ways to implement sequential data ass...

  13. Monte Carlo Bayesian Inference on a Statistical Model of Sub-Gridcolumn Moisture Variability using High-Resolution Cloud Observations

    NASA Astrophysics Data System (ADS)

    Norris, P. M.; da Silva, A. M., Jr.

    2016-12-01

    Norris and da Silva recently published a method to constrain a statistical model of sub-gridcolumn moisture variability using high-resolution satellite cloud data. The method can be used for large-scale model parameter estimation or cloud data assimilation (CDA). The gridcolumn model includes assumed-PDF intra-layer horizontal variability and a copula-based inter-layer correlation model. The observables used are MODIS cloud-top pressure, brightness temperature and cloud optical thickness, but the method should be extensible to direct cloudy radiance assimilation for a small number of channels. The algorithm is a form of Bayesian inference with a Markov chain Monte Carlo (MCMC) approach to characterizing the posterior distribution. This approach is especially useful in cases where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach is not gradient-based and allows jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast where the background state has a clear swath. The new approach not only significantly reduces mean and standard deviation biases with respect to the assimilated observables, but also improves the simulated rotational-Raman scattering cloud optical centroid pressure against independent (non-assimilated) retrievals from the OMI instrument. One obvious difficulty for the method, and other CDA methods, is the lack of information content in passive cloud observables on cloud vertical structure, beyond cloud-top and thickness, thus necessitating strong dependence on the background vertical moisture structure. It is found that a simple flow-dependent correlation modification due to Riishojgaard is helpful, better honoring inversion structures in the background state.

  14. Sequential assimilation of volcanic monitoring data to quantify eruption potential: Application to Kerinci volcano

    NASA Astrophysics Data System (ADS)

    Zhan, Yan; Gregg, Patricia M.; Chaussard, Estelle; Aoki, Yosuke

    2017-12-01

    Quantifying the eruption potential of a restless volcano requires the ability to model parameters such as overpressure and to calculate the host rock stress state as the system evolves. A critical challenge is developing a model-data fusion framework to take advantage of observational data and provide updates of the volcanic system through time. The Ensemble Kalman Filter (EnKF) uses a Monte Carlo approach to assimilate volcanic monitoring data and update models of volcanic unrest, providing time-varying estimates of overpressure and stress. Although the EnKF has been proven effective for forecasting volcanic deformation using synthetic InSAR and GPS data, until now it has not been applied to assimilate data from an active volcanic system. In this investigation, the EnKF is used to provide a "hindcast" of the 2009 explosive eruption of Kerinci volcano, Indonesia. A two-source analytical model is used to simulate the surface deformation of Kerinci volcano observed in InSAR time-series data and to predict the system evolution. A deep, deflating dike-like source reproduces the subsiding signal on the flanks of the volcano, and a shallow spherical McTigue source reproduces the central uplift. EnKF-predicted parameters are used in finite element models to calculate the host-rock stress state prior to the 2009 eruption. Mohr-Coulomb failure models reveal that the shallow magma reservoir was trending towards tensile failure prior to 2009, which may have been the catalyst of the 2009 eruption. Our results illustrate that the EnKF shows significant promise for future applications to forecasting the eruption potential of restless volcanoes and to hindcasting the triggering mechanisms of observed eruptions.

  15. Monte Carlo Simulation of Sudden Death Bearing Testing

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.

    2003-01-01

    Monte Carlo simulations combined with sudden death testing were used to compare resultant bearing lives to the calculated bearing life, and the cumulative test time and calendar time, relative to sequential and censored sequential testing. A total of 30 960 virtual 50-mm bore deep-groove ball bearings were evaluated in 33 different sudden death test configurations comprising 36, 72, and 144 bearings each. Variations in both life and Weibull slope were a function of the number of bearings failed, independent of the test method used, and not of the total number of bearings tested. Variations in L10 life as a function of the number of bearings failed were similar to variations in life obtained from sequentially failed real bearings and from Monte Carlo (virtual) testing of entire populations. Reductions of up to 40 percent in bearing test time and calendar time can be achieved by testing to failure or the L50 life and terminating all testing when the last of the predetermined bearing failures has occurred. Sudden death testing is not a more efficient method to reduce bearing test time or calendar time when compared to censored sequential testing.
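
    A sketch of the virtual sudden-death procedure described above, under assumed Weibull parameters (illustrative values, not the paper's): lives are drawn for each test group and every group's test ends at its first failure, so survivors accumulate test time only up to that point.

```python
import numpy as np

rng = np.random.default_rng(2)
beta, eta = 1.5, 100.0                 # assumed Weibull slope and scale (Mrevs)
n_groups, group_size = 12, 6           # one hypothetical sudden-death layout
lives = eta * rng.weibull(beta, size=(n_groups, group_size))
first_failures = lives.min(axis=1)     # each group stops at its first failure
test_time = group_size * first_failures.sum()   # every member runs to that point
print(f"median first-failure life: {np.median(first_failures):.1f} Mrevs")
print(f"accumulated test time: {test_time:.0f} Mrevs")
```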

  16. A new GNSS-enabled floating device as a means for retrieving river bathymetry by assimilation into a hydrodynamic model

    NASA Astrophysics Data System (ADS)

    Hostache, R.; Matgen, P.; Giustarini, L.

    2012-04-01

    Hydrodynamic models form an important component in flood forecasting systems. Model predictions with reduced uncertainty critically depend on the availability of detailed information about floodplain topography and riverbed bathymetry. While digital elevation models with varying spatial resolutions and accuracy levels are readily available at a global scale and can be used to infer floodplain geometry, bathymetric data are often not available, and ground surveys are time- and resource-intensive. In this general context, our study aims at evaluating the hydrometric value of the Global Navigation Satellite System (GNSS) for bathymetry retrieval. Integrated with satellite telecommunication systems, drifting or anchored floaters equipped with navigation systems such as GPS and Galileo enable the quasi-continuous measurement and near real-time transmission of water levels and flow velocities, virtually from any point in the world. The present study investigates the potential of assimilating GNSS-derived water level measurements into a hydraulic model in order to estimate river bathymetry. First, an ensemble of possible bathymetries and roughness parameters was randomly generated using a Monte Carlo sampling approach. Next, water level measurements provided by a drifting GNSS-equipped buoy were assimilated into a hydrodynamic model using as input a recorded discharge hydrograph and as geometry the generated bathymetry ensemble. Synthetic experiments were carried out with a one-dimensional hydraulic model implemented over a 19 km reach of the Alzette River. A Particle Filter was used as the assimilation algorithm for integrating observation data into the hydraulic model. The synthetic observation, simulating the data obtained from GNSS measurements, was generated using a perturbed forward run of the hydrodynamic model with the true bathymetry (ground survey). The scenario adopted in the data assimilation experiment assumed that, during a flood event, a buoy was launched into the water every ten hours. This frequency was considered plausible, as the time needed for the buoy to drift from the upstream to the downstream end of the study area is estimated to be less than 6 h. Consequently, a time window of 10 h would allow an operator to launch the buoy at the upstream end, recover it at the downstream end, and finally drive back to the upstream end and launch it again into the river channel. This synthetic observation was then assimilated into the hydraulic model. The first results are promising: sequentially assimilating the water level values provided by the synthetic GNSS-equipped buoy allowed wrong bathymetries to be gradually rejected and the ensemble to converge toward bathymetries consistent with the ground-surveyed one.
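
    A schematic of the particle-filter rejection mechanism described above: each particle is one candidate bathymetry, weighted by how well its simulated water level matches the buoy observation, then resampled so implausible bathymetries are progressively discarded. The one-line "hydraulic model" and all constants are invented stand-ins for the 1-D Alzette model.

```python
import numpy as np

rng = np.random.default_rng(3)
n_part = 500
bathy = rng.uniform(1.0, 5.0, n_part)        # candidate bed elevations (m)
true_bathy, sigma_obs = 3.0, 0.05

def water_level(b, q):
    return 10.0 - b + 0.01 * q               # placeholder hydraulic model

for q in [80.0, 120.0, 160.0]:               # successive buoy launches
    y_obs = water_level(true_bathy, q) + rng.normal(0.0, sigma_obs)
    w = np.exp(-0.5 * ((water_level(bathy, q) - y_obs) / sigma_obs) ** 2)
    w /= w.sum()                             # normalized importance weights
    idx = rng.choice(n_part, size=n_part, p=w)            # resample
    bathy = bathy[idx] + rng.normal(0.0, 0.02, n_part)    # jitter keeps diversity
print(f"posterior mean bathymetry: {bathy.mean():.2f} m")  # converges near 3.0
```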

  17. Sequential Bayesian Geostatistical Inversion and Evaluation of Combined Data Worth for Aquifer Characterization at the Hanford 300 Area

    NASA Astrophysics Data System (ADS)

    Murakami, H.; Chen, X.; Hahn, M. S.; Over, M. W.; Rockhold, M. L.; Vermeul, V.; Hammond, G. E.; Zachara, J. M.; Rubin, Y.

    2010-12-01

    Subsurface characterization for predicting groundwater flow and contaminant transport requires us to integrate large and diverse datasets in a consistent manner and quantify the associated uncertainty. In this study, we sequentially assimilated multiple types of datasets for characterizing a three-dimensional heterogeneous hydraulic conductivity field at the Hanford 300 Area. The datasets included constant-rate injection tests, electromagnetic borehole flowmeter tests, lithology profiles and tracer tests. We used the method of anchored distributions (MAD), which is a modular-structured Bayesian geostatistical inversion method. MAD has two major advantages over other inversion methods. First, it can directly infer a joint distribution of parameters, which can be used as an input in stochastic simulations for prediction. In MAD, in addition to typical geostatistical structural parameters, the parameter vector includes multiple point values of the heterogeneous field, called anchors, which capture local trends and reduce uncertainty in the prediction. Second, MAD allows us to integrate the datasets sequentially in a Bayesian framework such that it updates the posterior distribution as each new dataset is included. The sequential assimilation can decrease the computational burden significantly. We applied MAD to assimilate different combinations of the datasets and then compared the inversion results. For the injection and tracer test assimilation, we calculated temporal moments of pressure build-up and breakthrough curves, respectively, to reduce the data dimension. The massively parallel flow and transport code PFLOTRAN was used for simulating the tracer test. For comparison, we used different metrics based on the breakthrough curves not used in the inversion, such as mean arrival time, peak concentration and early arrival time. This comparison is intended to yield the combined data worth, i.e., which combination of the datasets is most effective for a given metric, which will be useful for guiding further characterization effort at the site as well as future characterization projects at other sites.

  18. Monte Carlo Bayesian inference on a statistical model of sub-gridcolumn moisture variability using high-resolution cloud observations. Part 1: Method.

    PubMed

    Norris, Peter M; da Silva, Arlindo M

    2016-07-01

    A method is presented to constrain a statistical model of sub-gridcolumn moisture variability using high-resolution satellite cloud data. The method can be used for large-scale model parameter estimation or cloud data assimilation. The gridcolumn model includes assumed probability density function (PDF) intra-layer horizontal variability and a copula-based inter-layer correlation model. The observables used in the current study are Moderate Resolution Imaging Spectroradiometer (MODIS) cloud-top pressure, brightness temperature and cloud optical thickness, but the method should be extensible to direct cloudy radiance assimilation for a small number of channels. The algorithm is a form of Bayesian inference with a Markov chain Monte Carlo (MCMC) approach to characterizing the posterior distribution. This approach is especially useful in cases where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach is not gradient-based and allows jumps into regions of non-zero cloud probability. The current study uses a skewed-triangle distribution for layer moisture. The article also includes a discussion of the Metropolis and multiple-try Metropolis versions of MCMC.

  19. Monte Carlo Bayesian Inference on a Statistical Model of Sub-Gridcolumn Moisture Variability Using High-Resolution Cloud Observations. Part 1: Method

    NASA Technical Reports Server (NTRS)

    Norris, Peter M.; Da Silva, Arlindo M.

    2016-01-01

    A method is presented to constrain a statistical model of sub-gridcolumn moisture variability using high-resolution satellite cloud data. The method can be used for large-scale model parameter estimation or cloud data assimilation. The gridcolumn model includes assumed probability density function (PDF) intra-layer horizontal variability and a copula-based inter-layer correlation model. The observables used in the current study are Moderate Resolution Imaging Spectroradiometer (MODIS) cloud-top pressure, brightness temperature and cloud optical thickness, but the method should be extensible to direct cloudy radiance assimilation for a small number of channels. The algorithm is a form of Bayesian inference with a Markov chain Monte Carlo (MCMC) approach to characterizing the posterior distribution. This approach is especially useful in cases where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach is not gradient-based and allows jumps into regions of non-zero cloud probability. The current study uses a skewed-triangle distribution for layer moisture. The article also includes a discussion of the Metropolis and multiple-try Metropolis versions of MCMC.

  20. Monte Carlo Bayesian inference on a statistical model of sub-gridcolumn moisture variability using high-resolution cloud observations. Part 1: Method

    PubMed Central

    Norris, Peter M.; da Silva, Arlindo M.

    2018-01-01

    A method is presented to constrain a statistical model of sub-gridcolumn moisture variability using high-resolution satellite cloud data. The method can be used for large-scale model parameter estimation or cloud data assimilation. The gridcolumn model includes assumed probability density function (PDF) intra-layer horizontal variability and a copula-based inter-layer correlation model. The observables used in the current study are Moderate Resolution Imaging Spectroradiometer (MODIS) cloud-top pressure, brightness temperature and cloud optical thickness, but the method should be extensible to direct cloudy radiance assimilation for a small number of channels. The algorithm is a form of Bayesian inference with a Markov chain Monte Carlo (MCMC) approach to characterizing the posterior distribution. This approach is especially useful in cases where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach is not gradient-based and allows jumps into regions of non-zero cloud probability. The current study uses a skewed-triangle distribution for layer moisture. The article also includes a discussion of the Metropolis and multiple-try Metropolis versions of MCMC. PMID:29618847

  1. Methods of sequential estimation for determining initial data in numerical weather prediction. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Cohn, S. E.

    1982-01-01

    Numerical weather prediction (NWP) is an initial-value problem for a system of nonlinear differential equations, in which initial values are known incompletely and inaccurately. Observational data available at the initial time must therefore be supplemented by data available prior to the initial time, a problem known as meteorological data assimilation. A further complication in NWP is that solutions of the governing equations evolve on two different time scales, a fast one and a slow one, whereas fast-scale motions in the atmosphere are not reliably observed. This leads to the so-called initialization problem: initial values must be constrained to result in a slowly evolving forecast. The theory of estimation of stochastic dynamic systems provides a natural approach to such problems. For linear stochastic dynamic models, the Kalman-Bucy (KB) sequential filter is the optimal data assimilation method; likewise for linear models, the optimal combined data assimilation-initialization method is a modified version of the KB filter.
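
    For reference, the standard discrete-time Kalman filter recursion (the discrete analogue of the Kalman-Bucy filter referred to above), written in generic notation rather than the thesis's own:

```latex
% forecast, gain, and analysis steps of the discrete Kalman filter
\begin{align*}
  \text{forecast:} \quad & x^f_k = M_k\, x^a_{k-1}, \qquad
                           P^f_k = M_k P^a_{k-1} M_k^{\mathsf T} + Q_k \\
  \text{gain:}     \quad & K_k = P^f_k H_k^{\mathsf T}
                           \bigl( H_k P^f_k H_k^{\mathsf T} + R_k \bigr)^{-1} \\
  \text{analysis:} \quad & x^a_k = x^f_k + K_k \bigl( y_k - H_k x^f_k \bigr), \qquad
                           P^a_k = (I - K_k H_k)\, P^f_k
\end{align*}
```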

  2. Assessing sequential data assimilation techniques for integrating GRACE data into a hydrological model

    NASA Astrophysics Data System (ADS)

    Khaki, M.; Hoteit, I.; Kuhn, M.; Awange, J.; Forootan, E.; van Dijk, A. I. J. M.; Schumacher, M.; Pattiaratchi, C.

    2017-09-01

    The time-variable terrestrial water storage (TWS) products from the Gravity Recovery And Climate Experiment (GRACE) have been increasingly used in recent years to improve the simulation of hydrological models by applying data assimilation techniques. In this study, for the first time, we assess the performance of the most popular sequential data assimilation techniques for integrating GRACE TWS into the World-Wide Water Resources Assessment (W3RA) model. We implement and test stochastic and deterministic ensemble-based Kalman filters (EnKF), as well as Particle filters (PF) using two different resampling approaches, Multinomial Resampling and Systematic Resampling. These choices provide various opportunities for weighting observations and model simulations during the assimilation and also for accounting for error distributions. In particular, the deterministic EnKF is tested to avoid perturbing observations before assimilation (as is the case in an ordinary EnKF). Gaussian-based random updates in the EnKF approaches likely do not fully represent the statistical properties of the model simulations and TWS observations. Therefore, the fully non-Gaussian PF is also applied to estimate more realistic updates. Monthly GRACE TWS are assimilated into W3RA covering all of Australia. To evaluate the filters' performance and analyze their impact on model simulations, their estimates are validated by independent in-situ measurements. Our results indicate that all implemented filters improve the estimation of water storage simulations of W3RA. The best results are obtained using two versions of the deterministic EnKF, the Square Root Analysis (SQRA) scheme and the Ensemble Square Root Filter (EnSRF), which reduce the model's groundwater estimation errors by 34% and 31%, respectively, compared to a model run without assimilation. Applying the PF along with Systematic Resampling successfully decreases the model estimation error by 23%.
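
    The two resampling schemes compared in the study, written out under their usual textbook definitions (a sketch, not the authors' code); systematic resampling uses a single random offset and therefore has lower resampling variance than independent multinomial draws.

```python
import numpy as np

def multinomial_resample(w, rng):
    """Each ancestor drawn independently with probability w."""
    return rng.choice(len(w), size=len(w), p=w)

def systematic_resample(w, rng):
    """One uniform offset, then a regular grid through the weight CDF."""
    n = len(w)
    positions = (rng.uniform() + np.arange(n)) / n
    cs = np.cumsum(w)
    cs[-1] = 1.0                       # guard against floating-point round-off
    return np.searchsorted(cs, positions)

rng = np.random.default_rng(4)
w = rng.dirichlet(np.ones(10))         # a random normalized weight vector
print(np.bincount(multinomial_resample(w, rng), minlength=10))
print(np.bincount(systematic_resample(w, rng), minlength=10))
```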

  3. Towards the operational estimation of a radiological plume using data assimilation after a radiological accidental atmospheric release

    NASA Astrophysics Data System (ADS)

    Winiarek, Victor; Vira, Julius; Bocquet, Marc; Sofiev, Mikhail; Saunier, Olivier

    2011-06-01

    In the event of an accidental atmospheric release of radionuclides from a nuclear power plant, accurate real-time forecasting of the activity concentrations of radionuclides is required by decision makers for the preparation of adequate countermeasures. The accuracy of the forecast plume is highly dependent on the source term estimation. On several academic test cases, including ones with real data, inverse modelling and data assimilation techniques have been proven to help in the assessment of the source term. In this paper, a semi-automatic method is proposed for the sequential reconstruction of the plume, by implementing a sequential data assimilation algorithm based on inverse modelling, with care taken to develop realistic methods for operational risk agencies. The performance of the assimilation scheme has been assessed through an intercomparison between French and Finnish frameworks. Two dispersion models have been used: Polair3D and Silam, developed in two different research centres. Different release locations, as well as different meteorological situations, are tested. The existing and newly planned surveillance networks are used, and realistically large multiplicative observational errors are assumed. The inverse modelling scheme accounts for the strong error bias encountered with such errors. The efficiency of the data assimilation system is tested via statistical indicators. For France and Finland, the average performance of the data assimilation system is strong. However, there are outlying situations where the inversion fails because observability is too poor. In addition, for the case where the power plant responsible for the accidental release is not known, robust statistical tools are developed and tested to discriminate between candidate release sites.

  4. The Role of Scale and Model Bias in ADAPT's Photospheric Estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Godinez Vazquez, Humberto C.; Hickmann, Kyle Scott; Arge, Charles Nicholas

    2015-05-20

    The Air Force Assimilative Photospheric flux Transport (ADAPT) model is a magnetic flux propagation model based on the Worden-Harvey (WH) model. ADAPT is used to provide global maps of the Sun's photospheric magnetic field. A data assimilation method based on the Ensemble Kalman Filter (EnKF), a Monte Carlo approximation to Kalman filtering, is used in calculating the ADAPT models.

  5. Ensemble Kalman filter for the reconstruction of the Earth's mantle circulation

    NASA Astrophysics Data System (ADS)

    Bocher, Marie; Fournier, Alexandre; Coltice, Nicolas

    2018-02-01

    Recent advances in mantle convection modeling have led to the release of a new generation of convection codes, able to self-consistently generate plate-like tectonics at their surface. These models physically link mantle dynamics to surface tectonics. Combined with plate tectonic reconstructions, they have the potential to produce a new generation of mantle circulation models that use data assimilation methods and in which uncertainties in plate tectonic reconstructions are taken into account. We provided a proof of this concept by applying a suboptimal Kalman filter to the reconstruction of mantle circulation (Bocher et al., 2016). Here, we propose to go one step further and apply the ensemble Kalman filter (EnKF) to this problem. The EnKF is a sequential Monte Carlo method particularly well adapted to solving high-dimensional data assimilation problems with nonlinear dynamics. We tested the EnKF using synthetic observations consisting of surface velocity and heat flow measurements on a 2-D spherical annulus model and compared it with the method developed previously. The EnKF performs better on average and is more stable than the former method. Fewer than 300 ensemble members are sufficient to reconstruct an evolution. We use adaptive covariance inflation and localization to correct for sampling errors. We show that the EnKF results are robust over a wide range of covariance localization parameters. The reconstruction is associated with an estimation of the error, and provides valuable information on where the reconstruction is to be trusted or not.

  6. One-sided truncated sequential t-test: application to natural resource sampling

    Treesearch

    Gary W. Fowler; William G. O'Regan

    1974-01-01

    A new procedure for constructing one-sided truncated sequential t-tests and its application to natural resource sampling are described. Monte Carlo procedures were used to develop a series of one-sided truncated sequential t-tests and the associated approximations to the operating characteristic and average sample number functions. Different truncation points and...

  7. Online sequential Monte Carlo smoother for partially observed diffusion processes

    NASA Astrophysics Data System (ADS)

    Gloaguen, Pierre; Étienne, Marie-Pierre; Le Corff, Sylvain

    2018-12-01

    This paper introduces a new algorithm to approximate smoothed additive functionals of partially observed diffusion processes. This method relies on a new sequential Monte Carlo method which allows such approximations to be computed online, i.e., as the observations are received, with a computational complexity growing linearly with the number of Monte Carlo samples. The original algorithm cannot be used in the case of partially observed stochastic differential equations, since the transition density of the latent data is usually unknown. We prove that it may be extended to partially observed continuous processes by replacing this unknown quantity with an unbiased estimator obtained, for instance, using general Poisson estimators. This estimator is proved to be consistent, and its performance is illustrated using data from two models.

  8. Usefulness of Wave Data Assimilation to the WAVE WATCH III Modeling System

    NASA Astrophysics Data System (ADS)

    Choi, J. K.; Dykes, J. D.; Yaremchuk, M.; Wittmann, P.

    2017-12-01

    In-situ and remote-sensed wave data are more abundant currently than in years past, with excellent accuracy at global scales. Forecast skill of the WAVE WATCH III model is improved by assimilation of these measurements, and they are also useful for model validation and calibration. It has been known that the impact of assimilation in wind-sea conditions is not large, but when spectra that result in large swell with long-term propagation are identified and assimilated, the improved accuracy of the initial conditions improves the long-term forecasts. The Navy's assimilation method started with the simple Optimal Interpolation (OI) method. Operationally, Fleet Numerical Meteorology and Oceanography Center uses the sequential 2DVar scheme, but a new approach has been tested based on an adjoint-free method for variational assimilation in WAVE WATCH III. We will present the status of wave data assimilation into the WAVE WATCH III numerical model and the upcoming development of this new adjoint-free variational approach.

  9. A sequential data assimilation approach for the joint reconstruction of mantle convection and surface tectonics

    NASA Astrophysics Data System (ADS)

    Bocher, M.; Coltice, N.; Fournier, A.; Tackley, P. J.

    2016-01-01

    With the progress of mantle convection modelling over the last decade, it has become possible to solve for the dynamics of the interior flow and the surface tectonics to first order. We show here that tectonic data (like surface kinematics and seafloor age distribution) and mantle convection models with plate-like behaviour can in principle be combined to reconstruct mantle convection. We present a sequential data assimilation method, based on suboptimal schemes derived from the Kalman filter, where surface velocities and seafloor age maps are not used as boundary conditions for the flow, but as data to assimilate. Two stages (a forecast followed by an analysis) are repeated sequentially to take into account data observed at different times. Whenever observations are available, an analysis infers the most probable state of the mantle at that time, considering a prior guess (supplied by the forecast) and the new observations at hand, using the classical best linear unbiased estimate. Between two observation times, the evolution of the mantle is governed by the forward model of mantle convection. This method is applied to synthetic 2-D spherical annulus mantle cases to evaluate its efficiency. We compare the reference evolutions to the estimations obtained by data assimilation. Two parameters control the behaviour of the scheme: the time between two analyses, and the amplitude of noise in the synthetic observations. Our technique proves to be efficient in retrieving temperature field evolutions provided the time between two analyses is ≲10 Myr. If the amplitude of the a priori error on the observations is large (30 per cent), our method provides a better estimate of surface tectonics than the observations themselves, taking advantage of the information within the physics of convection.

  10. Handling the unknown soil hydraulic parameters in data assimilation for unsaturated flow problems

    NASA Astrophysics Data System (ADS)

    Lange, Natascha; Erdal, Daniel; Neuweiler, Insa

    2017-04-01

    Model predictions of flow in the unsaturated zone require the soil hydraulic parameters. However, these parameters cannot be determined easily in applications, in particular if observations are indirect and cover only a small range of possible states. Correlation of parameters, or their correlation in the range of states that are observed, is a problem, as different parameter combinations may reproduce approximately the same measured water content. In field campaigns this problem can be alleviated by adding more measurement devices. Often, observation networks are designed to feed models for long-term prediction purposes (e.g., for weather forecasting). A popular way of making predictions with such observations is data assimilation methods, like the ensemble Kalman filter (Evensen, 1994). These methods can be used for parameter estimation if the unknown parameters are included in the state vector and updated along with the model states. Given the difficulties related to estimation of the soil hydraulic parameters in general, it is questionable, though, whether these methods can really be used for parameter estimation under natural conditions. Therefore, we investigate the ability of the ensemble Kalman filter to estimate the soil hydraulic parameters. We use synthetic identical-twin experiments to guarantee full knowledge of the model and the true parameters. We use the van Genuchten model to describe the soil water retention and relative permeability functions. This model is unfortunately prone to the above-mentioned pseudo-correlations of parameters. Therefore, we also test the simpler Russo-Gardner model, which is less affected by that problem, in our experiments. The total number of unknown parameters is varied by considering different layers of soil. In addition, we study the influence of the parameter updates on the water content predictions. We test different iterative filter approaches and compare different observation strategies for parameter identification. Considering heterogeneous soils, we discuss the representativeness of different observation types to be used for the assimilation. Reference: Evensen, G. (1994). Sequential data assimilation with a nonlinear quasi-geostrophic model using Monte Carlo methods to forecast error statistics. Journal of Geophysical Research: Oceans, 99(C5), 10143-10162.
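
    A minimal sketch of the state-augmentation idea described above: the unknown parameter is appended to the state vector and updated by the same ensemble Kalman regression as the state. A scalar linear toy model stands in for the unsaturated-flow model; every symbol and constant is illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
n_ens, k_true, sigma_obs = 100, 0.7, 0.05
x_true = 1.0
state = rng.normal(1.0, 0.5, n_ens)     # state ensemble (e.g. water content)
param = rng.uniform(0.2, 1.2, n_ens)    # augmented unknown-parameter ensemble

for _ in range(30):
    x_true = k_true * x_true + 1.0                              # synthetic truth
    state = param * state + 1.0 + rng.normal(0.0, 0.01, n_ens)  # ensemble forecast
    y = x_true + rng.normal(0.0, sigma_obs)                     # observation
    # Joint update: regress state and parameter on the same innovation.
    var_s = np.var(state, ddof=1)
    cov_ps = np.cov(param, state)[0, 1]
    innov = y + rng.normal(0.0, sigma_obs, n_ens) - state       # perturbed obs
    denom = var_s + sigma_obs ** 2
    state = state + (var_s / denom) * innov
    param = param + (cov_ps / denom) * innov
print(f"estimated parameter: {param.mean():.2f}")   # typically close to k_true
```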

  11. Evaluating Snow Data Assimilation Framework for Streamflow Forecasting Applications Using Hindcast Verification

    NASA Astrophysics Data System (ADS)

    Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.

    2012-12-01

    Snow water equivalent (SWE) estimation is a key factor in producing reliable streamflow simulations and forecasts in snow-dominated areas. However, measuring or predicting SWE carries significant uncertainty. Sequential data assimilation, which updates states using both observed and modeled data based on error estimation, has been shown to reduce streamflow simulation errors but has had limited testing for forecasting applications. In the current study, a snow data assimilation framework integrated with the National Weather Service River Forecast System (NWSRFS) is evaluated for use in ensemble streamflow prediction (ESP). Seasonal water supply ESP hindcasts are generated for the North Fork of the American River Basin (NFARB) in northern California. Parameter sets from the California Nevada River Forecast Center (CNRFC), the Differential Evolution Adaptive Metropolis (DREAM) algorithm and the Multistep Automated Calibration Scheme (MACS) are tested both with and without sequential data assimilation. The traditional ESP method considers uncertainty in future climate conditions by using historical temperature and precipitation time series to generate future streamflow scenarios conditioned on the current basin state. We include data uncertainty analysis in the forecasting framework through the DREAM-based parameter set, which is part of a recently developed Integrated Uncertainty and Ensemble-based data Assimilation framework (ICEA). Extensive verification of all tested approaches is undertaken using traditional forecast verification measures, including root mean square error (RMSE), Nash-Sutcliffe efficiency coefficient (NSE), volumetric bias, joint distribution, rank probability score (RPS), and discrimination and reliability plots. In comparison to the RFC parameters, the DREAM and MACS sets show significant improvement in volumetric bias in flow. Use of assimilation improves hindcasts of higher flows but does not significantly improve performance in the mid-flow and low-flow categories.

  12. Multilevel Sequential Monte Carlo Samplers for Normalizing Constants

    DOE PAGES

    Moral, Pierre Del; Jasra, Ajay; Law, Kody J. H.; ...

    2017-08-24

    This article considers the sequential Monte Carlo (SMC) approximation of ratios of normalizing constants associated to posterior distributions which in principle rely on continuum models. Therefore, the Monte Carlo estimation error and the discrete approximation error must be balanced. A multilevel strategy is utilized to substantially reduce the cost to obtain a given error level in the approximation as compared to standard estimators. Two estimators are considered and relative variance bounds are given. The theoretical results are numerically illustrated for two Bayesian inverse problems arising from elliptic partial differential equations (PDEs). The examples involve the inversion of observations of the solution of (i) a 1-dimensional Poisson equation to infer the diffusion coefficient, and (ii) a 2-dimensional Poisson equation to infer the external forcing.

  13. A Complementary Note to 'A Lag-1 Smoother Approach to System-Error Estimation': The Intrinsic Limitations of Residual Diagnostics

    NASA Technical Reports Server (NTRS)

    Todling, Ricardo

    2015-01-01

    Recently, this author studied an approach to the estimation of system error based on combining observation residuals derived from a sequential filter and a fixed lag-1 smoother. While extending the methodology to a variational formulation, experimenting with simple models, and verifying consistency between the sequential and variational formulations, the limitations of the residual-based approach came clearly to the surface. This note uses the sequential assimilation application to simple nonlinear dynamics to highlight the issue. Only when some of the underlying error statistics are assumed known is it possible to estimate the unknown component. In general, when considerable uncertainties exist in the underlying statistics as a whole, attempts to obtain separate estimates of the various error covariances are bound to lead to misrepresentation of errors. The conclusions are particularly relevant to present-day attempts to estimate observation-error correlations from observation residual statistics. A brief illustration of the issue is also provided by comparing estimates of error correlations derived from a quasi-operational assimilation system and a corresponding Observing System Simulation Experiments framework.

  14. Particle filters, a quasi-Monte-Carlo-solution for segmentation of coronaries.

    PubMed

    Florin, Charles; Paragios, Nikos; Williams, Jim

    2005-01-01

    In this paper we propose a Particle Filter-based approach for the segmentation of coronary arteries. To this end, successive planes of the vessel are modeled as unknown states of a sequential process. Such states consist of the orientation, position, shape model and appearance (in statistical terms) of the vessel, which are recovered in an incremental fashion using a sequential Bayesian filter (Particle Filter). In order to account for bifurcations and branchings, we consider a Monte Carlo sampling rule that propagates multiple hypotheses in parallel. Promising results on the segmentation of coronary arteries demonstrate the potential of the proposed approach.

  15. Mantle-circulation models with sequential data assimilation: inferring present-day mantle structure from plate-motion histories.

    PubMed

    Bunge, Hans-Peter; Richards, M A; Baumgardner, J R

    2002-11-15

    Data assimilation is an approach to studying geodynamic models consistent simultaneously with observables and the governing equations of mantle flow. Such an approach is essential in mantle circulation models, where we seek to constrain an unknown initial condition some time in the past, and thus cannot hope to use first-principles convection calculations to infer the flow history of the mantle. One of the most important observables for mantle-flow history comes from models of Mesozoic and Cenozoic plate motion that provide constraints not only on the surface velocity of the mantle but also on the evolution of internal mantle-buoyancy forces due to subducted oceanic slabs. Here we present five mantle circulation models with an assimilated plate-motion history spanning the past 120 Myr, a time period for which reliable plate-motion reconstructions are available. All models agree well with upper- and mid-mantle heterogeneity imaged by seismic tomography. A simple standard model of whole-mantle convection, including a factor 40 viscosity increase from the upper to the lower mantle and predominantly internal heat generation, reveals downwellings related to Farallon and Tethys subduction. Adding 35% bottom heating from the core has the predictable effect of producing prominent high-temperature anomalies and a strong thermal boundary layer at the base of the mantle. Significantly delaying mantle flow through the transition zone either by modelling the dynamic effects of an endothermic phase reaction or by including a steep, factor 100, viscosity rise from the upper to the lower mantle results in substantial transition-zone heterogeneity, enhanced by the effects of trench migration implicit in the assimilated plate-motion history. An expected result is the failure to account for heterogeneity structure in the deepest mantle below 1500 km, which is influenced by Jurassic plate motions and thus cannot be modelled from sequential assimilation of plate motion histories limited in age to the Cretaceous. This result implies that sequential assimilation of past plate-motion models is ineffective in studying the temporal evolution of core-mantle-boundary heterogeneity, and that a method for extrapolating present-day information backwards in time is required. For short time periods (of the order of perhaps a few tens of Myr) such a method exists in the form of crude 'backward' convection calculations. For longer time periods (of the order of a mantle overturn), a rigorous approach to extrapolating information back in time exists in the form of iterative nonlinear optimization methods that carry assimilated information into the past through the use of an adjoint mantle convection model.

  16. Assessing an ensemble Kalman filter inference of Manning's n coefficient of an idealized tidal inlet against a polynomial chaos-based MCMC

    NASA Astrophysics Data System (ADS)

    Siripatana, Adil; Mayo, Talea; Sraj, Ihab; Knio, Omar; Dawson, Clint; Le Maitre, Olivier; Hoteit, Ibrahim

    2017-08-01

    Bayesian estimation/inversion is commonly used to quantify and reduce modeling uncertainties in coastal ocean models, especially in the framework of parameter estimation. Based on Bayes' rule, the posterior probability distribution function (pdf) of the estimated quantities is obtained conditioned on available data. It can be computed either directly, using a Markov chain Monte Carlo (MCMC) approach, or by sequentially processing the data following a data assimilation approach, which is heavily exploited in large-dimensional state estimation problems. The advantage of data assimilation schemes over MCMC-type methods arises from the ability to algorithmically accommodate a large number of uncertain quantities without a significant increase in computational requirements. However, only approximate estimates are generally obtained by this approach, due to the restrictive Gaussian prior and noise assumptions that are generally imposed in these methods. This contribution aims at evaluating the effectiveness of utilizing an ensemble Kalman-based data assimilation method for parameter estimation of a coastal ocean model against an MCMC polynomial chaos (PC)-based scheme. We focus on quantifying the uncertainties of a coastal ocean ADvanced CIRCulation (ADCIRC) model with respect to the Manning's n coefficients. Based on a realistic framework of observation system simulation experiments (OSSEs), we apply an ensemble Kalman filter and the MCMC method employing a surrogate of ADCIRC constructed by a non-intrusive PC expansion for evaluating the likelihood, and test both approaches under identical scenarios. We study the sensitivity of the estimated posteriors with respect to the parameters of the inference methods, including ensemble size, inflation factor, and PC order. A full analysis of both methods, in the context of coastal ocean modeling, suggests that an ensemble Kalman filter with appropriate ensemble size and well-tuned inflation provides reliable mean estimates and uncertainties of Manning's n coefficients compared to the full posterior distributions inferred by MCMC.

  17. Data Assimilation by Ensemble Kalman Filter during One-Dimensional Nonlinear Consolidation in Randomly Heterogeneous Highly Compressible Aquitards

    NASA Astrophysics Data System (ADS)

    Zapata Norberto, B.; Morales-Casique, E.; Herrera, G. S.

    2017-12-01

    Severe land subsidence due to groundwater extraction may occur in multiaquifer systems where highly compressible aquitards are present. The highly compressible nature of the aquitards leads to nonlinear consolidation where the groundwater flow parameters are stress-dependent. The case is further complicated by the heterogeneity of the hydrogeologic and geotechnical properties of the aquitards. We explore the effect of realistic vertical heterogeneity of hydrogeologic and geotechnical parameters on the consolidation of highly compressible aquitards by means of 1-D Monte Carlo numerical simulations. A total of 2000 realizations are generated for each of the following parameters: hydraulic conductivity (K), compression index (Cc) and void ratio (e). The correlation structure, the mean and the variance for each parameter were obtained from a literature review of field studies in the lacustrine sediments of Mexico City. The results indicate that among the parameters considered, random K has the largest effect on the ensemble average behavior of the system. Random K leads to the largest variance (and therefore largest uncertainty) of total settlement, groundwater flux and time to reach steady-state conditions. We further propose a data assimilation scheme by means of an ensemble Kalman filter to estimate the ensemble mean distribution of K, pore pressure and total settlement. We consider the case where pore-pressure measurements are available at given time intervals. We test our approach by generating a 1-D realization of K with exponential spatial correlation and solving the nonlinear flow and consolidation problem. These results are taken as our "true" solution. We take pore-pressure "measurements" at different times from this "true" solution. The ensemble Kalman filter method is then employed to estimate the ensemble mean distribution of K, pore pressure and total settlement based on the sequential assimilation of these pore-pressure measurements. The ensemble-mean estimates from this procedure closely approximate those from the "true" solution. This procedure can be easily extended to other random variables such as compression index and void ratio.

  18. Improving Robustness of Hydrologic Ensemble Predictions Through Probabilistic Pre- and Post-Processing in Sequential Data Assimilation

    NASA Astrophysics Data System (ADS)

    Wang, S.; Ancell, B. C.; Huang, G. H.; Baetz, B. W.

    2018-03-01

    Data assimilation using the ensemble Kalman filter (EnKF) has been increasingly recognized as a promising tool for probabilistic hydrologic predictions. However, little effort has been made to conduct the pre- and post-processing of assimilation experiments, posing a significant challenge in achieving the best performance of hydrologic predictions. This paper presents a unified data assimilation framework for improving the robustness of hydrologic ensemble predictions. Statistical pre-processing of assimilation experiments is conducted through the factorial design and analysis to identify the best EnKF settings with maximized performance. After the data assimilation operation, statistical post-processing analysis is also performed through the factorial polynomial chaos expansion to efficiently address uncertainties in hydrologic predictions, as well as to explicitly reveal potential interactions among model parameters and their contributions to the predictive accuracy. In addition, the Gaussian anamorphosis is used to establish a seamless bridge between data assimilation and uncertainty quantification of hydrologic predictions. Both synthetic and real data assimilation experiments are carried out to demonstrate feasibility and applicability of the proposed methodology in the Guadalupe River basin, Texas. Results suggest that statistical pre- and post-processing of data assimilation experiments provide meaningful insights into the dynamic behavior of hydrologic systems and enhance robustness of hydrologic ensemble predictions.
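
    A sketch of the Gaussian anamorphosis step mentioned above, implemented as a generic rank-based normal-score transform (one common variant, assumed here; the paper's implementation may differ): a skewed ensemble is mapped to Gaussian scores before the EnKF update, and analysed scores are mapped back through the empirical quantiles.

```python
import numpy as np
from scipy.stats import norm

def to_gaussian_scores(x):
    """Rank-based normal-score (anamorphosis) transform of a 1-D ensemble."""
    ranks = np.argsort(np.argsort(x))
    p = (ranks + 0.5) / len(x)                  # empirical CDF values in (0, 1)
    return norm.ppf(p)

rng = np.random.default_rng(6)
flow = rng.lognormal(mean=2.0, sigma=0.8, size=200)   # skewed ensemble
z = to_gaussian_scores(flow)                    # ~N(0,1), suitable for the EnKF
z_analysed = z - 0.3                            # pretend the update shifted the scores
flow_analysed = np.quantile(flow, norm.cdf(z_analysed))   # back-transform
print(flow_analysed.mean() < flow.mean())       # the shift carries back to flow space
```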

  19. Sequential Monte Carlo for inference of latent ARMA time-series with innovations correlated in time

    NASA Astrophysics Data System (ADS)

    Urteaga, Iñigo; Bugallo, Mónica F.; Djurić, Petar M.

    2017-12-01

    We consider the problem of sequential inference of latent time-series with innovations correlated in time and observed via nonlinear functions. We accommodate time-varying phenomena with diverse properties by means of a flexible mathematical representation of the data. We statistically characterize such time-series through a Bayesian analysis of their densities. The density that describes the transition of the state from time t to the next time instant t+1 is used to implement novel sequential Monte Carlo (SMC) methods. We present a set of SMC methods for inference of latent ARMA time-series with innovations correlated in time, for different assumptions about knowledge of the parameters. The methods operate in a unified and consistent manner for data with diverse memory properties. We show the validity of the proposed approach by comprehensive simulations of the challenging stochastic volatility model.

  20. Learning of state-space models with highly informative observations: A tempered sequential Monte Carlo solution

    NASA Astrophysics Data System (ADS)

    Svensson, Andreas; Schön, Thomas B.; Lindsten, Fredrik

    2018-05-01

    Probabilistic (or Bayesian) modeling and learning offers interesting possibilities for the systematic representation of uncertainty using probability theory. However, probabilistic learning often leads to computationally challenging problems. Some problems of this type that were previously intractable can now be solved on standard personal computers thanks to recent advances in Monte Carlo methods. In particular, for learning of unknown parameters in nonlinear state-space models, methods based on the particle filter (a Monte Carlo method) have proven very useful. A notoriously challenging problem, however, still occurs when the observations in the state-space model are highly informative, i.e., when there is very little or no measurement noise present relative to the amount of process noise. The particle filter will then struggle to estimate one of the basic components of probabilistic learning, namely the likelihood p(data | parameters). To this end we suggest an algorithm which initially assumes that there is a substantial amount of artificial measurement noise present. The variance of this noise is sequentially decreased in an adaptive fashion such that we, in the end, recover the original problem or a very close approximation of it. The main component of our algorithm is a sequential Monte Carlo (SMC) sampler, which gives our proposed method a clear resemblance to the SMC2 method. A natural link is also made to the ideas underlying approximate Bayesian computation (ABC). We illustrate the method with numerical examples, and in particular show promising results for a challenging Wiener-Hammerstein benchmark problem.

  1. Multi-Scale Three-Dimensional Variational Data Assimilation System for Coastal Ocean Prediction

    NASA Technical Reports Server (NTRS)

    Li, Zhijin; Chao, Yi; Li, P. Peggy

    2012-01-01

    A multi-scale three-dimensional variational data assimilation system (MS-3DVAR) has been formulated, and the associated software system has been developed, for improving high-resolution coastal ocean prediction. This system helps improve coastal ocean prediction skill and has been used in support of operational coastal ocean forecasting systems and field experiments. The system was developed to improve the capability of data assimilation to assimilate, simultaneously and effectively, sparse vertical profiles and high-resolution remote sensing surface measurements into coastal ocean models, as well as to constrain model biases. In this system, the cost function is decomposed into two separate units for the large- and small-scale components, respectively. As such, data assimilation is implemented sequentially from large to small scales, the background error covariance is constructed to be scale-dependent, and a scale-dependent dynamic balance is incorporated. This scheme allows large scales and model bias to be effectively constrained by assimilating sparse vertical profiles, and small scales by assimilating high-resolution surface measurements. The MS-3DVAR enhances the capability of traditional 3DVAR for assimilating highly heterogeneously distributed observations, such as along-track satellite altimetry data, and in particular maximizes the extraction of information from limited numbers of vertical profile observations.

  2. Multilevel sequential Monte Carlo samplers

    DOE PAGES

    Beskos, Alexandros; Jasra, Ajay; Law, Kody; ...

    2016-08-24

    Here, we study the approximation of expectations w.r.t. probability distributions associated to the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods, leading to a discretisation bias, with step-size level h_L. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the introduction of the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated to a Monte Carlo approximation of a sequence of probability distributions with discretisation levels ∞ > h_0 > h_1 > ... > h_L. In many practical problems of interest, one cannot achieve an i.i.d. sampling of the associated sequence of probability distributions. A sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. In conclusion, it is shown that under appropriate assumptions, the attractive property of a reduction of the amount of computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context.

  3. Probabilistic learning of nonlinear dynamical systems using sequential Monte Carlo

    NASA Astrophysics Data System (ADS)

    Schön, Thomas B.; Svensson, Andreas; Murray, Lawrence; Lindsten, Fredrik

    2018-05-01

    Probabilistic modeling provides the capability to represent and manipulate uncertainty in data, models, predictions and decisions. We are concerned with the problem of learning probabilistic models of dynamical systems from measured data. Specifically, we consider learning of probabilistic nonlinear state-space models. There is no closed-form solution available for this problem, implying that we are forced to use approximations. In this tutorial we provide a self-contained introduction to one of the state-of-the-art methods, the particle Metropolis-Hastings algorithm, which has proven to offer a practical approximation. This is a Monte Carlo based method, where the particle filter is used to guide a Markov chain Monte Carlo method through the parameter space. One of the key merits of the particle Metropolis-Hastings algorithm is that it is guaranteed to converge to the "true solution" under mild assumptions, despite being based on a particle filter with only a finite number of particles. We also provide a motivating numerical example illustrating the method using a modeling language tailored for sequential Monte Carlo methods. The intention of modeling languages of this kind is to open up the power of sophisticated Monte Carlo methods, including particle Metropolis-Hastings, to a large group of users without requiring them to know all the underlying mathematical details.
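
    A compact sketch of the particle Metropolis-Hastings construction the tutorial describes: a bootstrap particle filter supplies an unbiased likelihood estimate, which drives an ordinary Metropolis-Hastings chain over the parameter. The linear-Gaussian toy model and all settings are assumptions for illustration, not the tutorial's example.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate(theta, T=50):
    """Generate data from x_t = theta*x_{t-1} + N(0,1), y_t = x_t + N(0,0.25)."""
    x, ys = 0.0, []
    for _ in range(T):
        x = theta * x + rng.normal(0.0, 1.0)
        ys.append(x + rng.normal(0.0, 0.5))
    return np.array(ys)

def pf_loglik(theta, ys, n_part=200):
    """Bootstrap particle filter estimate of log p(ys | theta)."""
    x, ll = np.zeros(n_part), 0.0
    for y in ys:
        x = theta * x + rng.normal(0.0, 1.0, n_part)      # propagate
        logw = -0.5 * ((y - x) / 0.5) ** 2                # Gaussian obs log-weight
        m = logw.max()
        ll += m + np.log(np.mean(np.exp(logw - m))) - 0.5 * np.log(2 * np.pi * 0.25)
        w = np.exp(logw - m)
        x = x[rng.choice(n_part, n_part, p=w / w.sum())]  # resample
    return ll

ys = simulate(0.8)
theta, ll = 0.5, pf_loglik(0.5, ys)
for _ in range(200):                    # PMH chain over theta, uniform(-1,1) prior
    prop = theta + rng.normal(0.0, 0.1)
    if abs(prop) < 1.0:
        ll_prop = pf_loglik(prop, ys)
        if np.log(rng.uniform()) < ll_prop - ll:
            theta, ll = prop, ll_prop   # accept using the *estimated* likelihood
print(f"final theta: {theta:.2f}")      # typically near the true value 0.8
```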

  4. Data Assimilation in the Presence of Forecast Bias: The GEOS Moisture Analysis

    NASA Technical Reports Server (NTRS)

    Dee, Dick P.; Todling, Ricardo

    1999-01-01

    We describe the application of the unbiased sequential analysis algorithm developed by Dee and da Silva (1998) to the GEOS DAS moisture analysis. The algorithm estimates the persistent component of model error using rawinsonde observations and adjusts the first-guess moisture field accordingly. Results of two seasonal data assimilation cycles show that moisture analysis bias is almost completely eliminated in all observed regions. The improved analyses cause a sizable reduction in the 6h-forecast bias and a marginal improvement in the error standard deviations.

  5. Effective Online Bayesian Phylogenetics via Sequential Monte Carlo with Guided Proposals

    PubMed Central

    Fourment, Mathieu; Claywell, Brian C; Dinh, Vu; McCoy, Connor; Matsen IV, Frederick A; Darling, Aaron E

    2018-01-01

    Modern infectious disease outbreak surveillance produces continuous streams of sequence data which require phylogenetic analysis as data arrives. Current software packages for Bayesian phylogenetic inference are unable to quickly incorporate new sequences as they become available, making them less useful for dynamically unfolding evolutionary stories. This limitation can be addressed by applying a class of Bayesian statistical inference algorithms called sequential Monte Carlo (SMC) to conduct online inference, wherein new data can be continuously incorporated to update the estimate of the posterior probability distribution. In this article, we describe and evaluate several different online phylogenetic sequential Monte Carlo (OPSMC) algorithms. We show that proposing new phylogenies with a density similar to the Bayesian prior suffers from poor performance, and we develop “guided” proposals that better match the proposal density to the posterior. Furthermore, we show that the simplest guided proposals can exhibit pathological behavior in some situations, leading to poor results, and that the situation can be resolved by heating the proposal density. The results demonstrate that, relative to the widely used MCMC-based algorithm implemented in MrBayes, the total time required to compute a series of phylogenetic posteriors as sequences arrive can be significantly reduced by the use of OPSMC, without incurring a significant loss in accuracy. PMID:29186587

  6. Sequential Monte Carlo tracking of the marginal artery by multiple cue fusion and random forest regression.

    PubMed

    Cherry, Kevin M; Peplinski, Brandon; Kim, Lauren; Wang, Shijun; Lu, Le; Zhang, Weidong; Liu, Jianfei; Wei, Zhuoshi; Summers, Ronald M

    2015-01-01

    Given the potential importance of marginal artery localization in automated registration in computed tomography colonography (CTC), we have devised a semi-automated method of marginal vessel detection employing sequential Monte Carlo tracking (also known as particle filter tracking) by multiple cue fusion based on intensity, vesselness, organ detection, and minimum spanning tree information for poorly enhanced vessel segments. We then employed a random forest algorithm for intelligent cue fusion and decision making, which achieved high sensitivity and robustness. After applying a vessel pruning procedure to the tracking results, we achieved statistically significantly improved precision compared to a baseline Hessian detection method (75.2% versus 2.7%, p<0.001). This method also showed statistically significantly improved recall compared to a 2-cue baseline method using fewer vessel cues (67.7% versus 30.7%, p<0.001). These results demonstrate that marginal artery localization on CTC is feasible by combining a discriminative classifier (i.e., random forest) with a sequential Monte Carlo tracking mechanism. In so doing, we present the effective application of an anatomical probability map to vessel pruning as well as a supplementary spatial coordinate system for colonic segmentation and registration when this task has been confounded by colon lumen collapse.

  7. A new moving strategy for the sequential Monte Carlo approach in optimizing the hydrological model parameters

    NASA Astrophysics Data System (ADS)

    Zhu, Gaofeng; Li, Xin; Ma, Jinzhu; Wang, Yunquan; Liu, Shaomin; Huang, Chunlin; Zhang, Kun; Hu, Xiaoli

    2018-04-01

    Sequential Monte Carlo (SMC) samplers have become increasingly popular for estimating the posterior parameter distribution with the non-linear dependency structures and multiple modes often present in hydrological models. However, the explorative capability and efficiency of the sampler depend strongly on the efficiency of the move step of the SMC sampler. In this paper we present a new SMC sampler, called the Particle Evolution Metropolis Sequential Monte Carlo (PEM-SMC) algorithm, which is well suited to handling the unknown static parameters of hydrologic models. The PEM-SMC sampler is inspired by the work of Liang and Wong (2001) and operates by incorporating the strengths of the genetic algorithm, the differential evolution algorithm and the Metropolis-Hastings algorithm into the framework of SMC. We also prove that the sampler admits the target distribution as a stationary distribution. Two case studies, a multi-dimensional bimodal normal distribution and a conceptual rainfall-runoff hydrologic model (first considering parameter uncertainty only, and then considering parameter and input uncertainty simultaneously), show that the PEM-SMC sampler is generally superior to other popular SMC algorithms in handling high-dimensional problems. The study also indicates that it may be important to account for model structural uncertainty by using multiple different hydrological models in the SMC framework in future studies.
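
    A sketch of the kind of differential-evolution move the PEM-SMC sampler builds into its Metropolis step: a proposal formed from the scaled difference of two other particles, accepted or rejected so that the target remains invariant. The target density, scaling constant and jitter are common DE-MC defaults used for illustration, not necessarily the paper's choices.

```python
import numpy as np

def de_metropolis_move(particles, log_target, rng):
    n, d = particles.shape
    gamma = 2.38 / np.sqrt(2 * d)          # standard DE-MC scaling factor
    out = particles.copy()
    for i in range(n):
        r1, r2 = rng.choice(np.delete(np.arange(n), i), 2, replace=False)
        prop = out[i] + gamma * (out[r1] - out[r2]) + rng.normal(0.0, 1e-3, d)
        if np.log(rng.uniform()) < log_target(prop) - log_target(out[i]):
            out[i] = prop                  # Metropolis acceptance keeps the target
    return out

rng = np.random.default_rng(8)
log_target = lambda x: -0.5 * np.sum(x ** 2)     # stand-in posterior: N(0, I)
parts = rng.normal(3.0, 1.0, size=(100, 2))      # particles start far from the mode
for _ in range(50):
    parts = de_metropolis_move(parts, log_target, rng)
print(np.round(parts.mean(axis=0), 2))           # drifts toward [0, 0]
```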

  8. Assimilation of all-weather GMI and ATMS observations into HWRF

    NASA Astrophysics Data System (ADS)

    Moradi, I.; Evans, F.; McCarty, W.; Marks, F.; Eriksson, P.

    2017-12-01

    We propose a novel Bayesian Monte Carlo Integration (BMCI) technique to retrieve profiles of temperature, water vapor, and cloud liquid/ice water content from cloudy microwave measurements in the presence of tropical cyclones (TCs). These retrievals can then either be used directly by meteorologists to analyze the structure of TCs or be assimilated to provide accurate initial conditions for NWP models. The technique is applied to data from the Advanced Technology Microwave Sounder (ATMS) onboard the Suomi National Polar-orbiting Partnership (NPP) satellite and the Global Precipitation Measurement (GPM) Microwave Imager (GMI).

  9. Development of a satellite SAR image spectra and altimeter wave height data assimilation system for ERS-1

    NASA Technical Reports Server (NTRS)

    Hasselmann, Klaus; Hasselmann, Susanne; Bauer, Eva; Bruening, Claus; Lehner, Susanne; Graber, Hans; Lionello, Piero

    1988-01-01

    The applicability of ERS-1 wind and wave data for wave models was studied using the WAM third-generation wave model and SEASAT altimeter, scatterometer and SAR data. A series of global wave hindcasts was made for the surface stress and surface wind fields by assimilation of scatterometer data for the full 96-day SEASAT period, and also for two wind field analyses for shorter periods obtained by assimilation with the higher-resolution ECMWF T63 model and by subjective analysis methods. It is found that wave models respond very sensitively to inconsistencies in wind field analyses and therefore provide a valuable data validation tool. Comparisons between SEASAT SAR image spectra and theoretical SAR spectra derived from the hindcast wave spectra by Monte Carlo simulations yield good overall agreement for 32 cases representing a wide variety of wave conditions. It is concluded that SAR wave imaging is sufficiently well understood to apply SAR image spectra with confidence for wave studies if supported by realistic wave models and theoretical computations of the strongly nonlinear mapping of the wave spectrum into the SAR image spectrum. A closed nonlinear integral expression for this spectral mapping relation is derived which avoids the inherent statistical errors of Monte Carlo computations and may prove to be more efficient numerically.

  10. Development of the Nonstationary Incremental Analysis Update Algorithm for Sequential Data Assimilation System

    NASA Astrophysics Data System (ADS)

    Ham, Yoo-Geun; Song, Hyo-Jong; Jung, Jaehee; Lim, Gyu-Ho

    2017-04-01

    This study introduces an altered version of the incremental analysis updates (IAU), called the nonstationary IAU (NIAU) method, to enhance the assimilation accuracy of the IAU while retaining the continuity of the analysis. Analogous to the IAU, the NIAU is designed to add analysis increments at every model time step to improve the continuity of intermittent data assimilation. Unlike the IAU, however, the NIAU method applies time-evolved forcing, constructed with the forward operator, as corrections to the model. In terms of the accuracy of the analysis field, the NIAU solution is better than that of the IAU, whose analysis is performed at the start of the time window over which the IAU forcing is added: in linear systems, the NIAU solution equals that of an intermittent data assimilation method at the end of the assimilation interval. To give the NIAU a filtering property, the forward operator that propagates the increment is reconstructed from only the dominant singular vectors. An illustration of these advantages of the NIAU is given using the simple 40-variable Lorenz model.
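
    For reference, the baseline IAU that the NIAU modifies can be written in a few lines: the analysis increment is spread in equal parts over every model step of the window rather than added at once. The scalar toy model below is an assumption for illustration only; the NIAU would additionally propagate the increment with a (truncated) forward operator before adding it at each step.

        import numpy as np

        def model_step(x, dt=0.1, lam=-0.5):
            # Toy linear model dx/dt = lam * x, advanced one step
            return x + dt * lam * x

        def intermittent_update(x, increment, n_steps=10):
            # Classical intermittent assimilation: add the full increment
            # at the start of the window, then integrate
            x = x + increment
            for _ in range(n_steps):
                x = model_step(x)
            return x

        def iau_window(x, increment, n_steps=10):
            # IAU: spread the same increment evenly over every model step,
            # which removes the shock of a one-time insertion
            for _ in range(n_steps):
                x = model_step(x) + increment / n_steps
            return x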

  11. Moisture Forecast Bias Correction in GEOS DAS

    NASA Technical Reports Server (NTRS)

    Dee, D.

    1999-01-01

    Data assimilation methods rely on numerous assumptions about the errors involved in measuring and forecasting atmospheric fields. One of the more disturbing of these is that short-term model forecasts are assumed to be unbiased. In the case of atmospheric moisture, for example, observational evidence shows that the systematic component of errors in forecasts and analyses is often of the same order of magnitude as the random component. We have implemented a sequential algorithm for estimating forecast moisture bias from rawinsonde data in the Goddard Earth Observing System Data Assimilation System (GEOS DAS). The algorithm is designed to remove the systematic component of analysis errors and can be easily incorporated in an existing statistical data assimilation system. We will present results of initial experiments that show a significant reduction of bias in the GEOS DAS moisture analyses.
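
    A schematic of the kind of sequential bias estimator described, assuming a simple scalar recursion driven by innovations; the operational GEOS DAS scheme estimates full bias fields within the statistical analysis, so treat this purely as a sketch.

        def update_bias(bias, innovation, gamma=0.1):
            # Recursive bias estimate from the innovation d = y_obs - x_forecast.
            # A persistently negative d means the forecast is biased high, so
            # the bias estimate b (forecast minus truth) is nudged upward:
            #     b <- (1 - gamma) * b - gamma * d
            return (1.0 - gamma) * bias - gamma * innovation

        # The debiased forecast (x_forecast - bias) then enters the analysis.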

  12. Probabilistic In Situ Stress Estimation and Forecasting using Sequential Data Assimilation

    NASA Astrophysics Data System (ADS)

    Fichtner, A.; van Dinther, Y.; Kuensch, H. R.

    2017-12-01

    Our physical understanding and forecasting ability of earthquakes, and other solid Earth dynamic processes, is significantly hampered by limited indications of the evolving state of stress and strength on faults. Integrating observations and physics-based numerical modeling to quantitatively estimate this evolution of a fault's state is crucial. However, systematic attempts are limited and tenuous, especially in light of the scarcity and uncertainty of natural data and the difficulty of modelling the physics governing earthquakes. We adopt the statistical framework of sequential data assimilation - extensively developed for weather forecasting - to efficiently integrate observations and prior knowledge in a forward model, while acknowledging errors in both. To prove this concept, we perform a perfect model test in a simplified subduction zone setup, where we assimilate noisy synthetic data on velocities and stresses from a single location. Using an Ensemble Kalman Filter, these data and their errors are assimilated to update 150 ensemble members from a Partial Differential Equation-driven seismic cycle model. Probabilistic estimates of fault stress and dynamic strength evolution capture the truth exceptionally well. This is possible because the sampled error covariance matrix contains prior information from the physics that relates velocities, stresses and pressure at the surface to those at the fault. During the analysis step, stress and strength distributions are thus reconstructed such that fault coupling can be updated to either inhibit or trigger events. In the subsequent forecast step, the physical equations are solved to propagate the updated states forward in time and thus provide probabilistic information on the occurrence of the next event. Over subsequent assimilation steps, the system's forecasting ability turns out to be significantly better than that of a periodic recurrence model (requiring an alarm 17% vs. 68% of the time). This provides distinct added value with respect to using observations or numerical models separately. Although several challenges for applications to a natural setting remain, these first results indicate the large potential of data assimilation techniques for probabilistic seismic hazard assessment and other challenges in dynamic solid Earth systems.
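
    The analysis step underlying this kind of experiment is commonly the stochastic (perturbed-observation) EnKF update; a generic sketch follows, with the paper's PDE state vector (velocities, stresses, pressure) replaced by plain arrays. One common variant only; the study's exact filter configuration may differ.

        import numpy as np

        def enkf_update(X, y, H, R, rng):
            # Stochastic (perturbed-observation) EnKF analysis step.
            # X: n_state x n_ens ensemble, y: observation vector,
            # H: observation operator (matrix), R: observation error covariance.
            n_ens = X.shape[1]
            A = X - X.mean(axis=1, keepdims=True)        # state anomalies
            HX = H @ X
            HA = HX - HX.mean(axis=1, keepdims=True)     # obs-space anomalies
            P_hh = HA @ HA.T / (n_ens - 1) + R           # innovation covariance
            P_xh = A @ HA.T / (n_ens - 1)                # cross covariance
            K = P_xh @ np.linalg.inv(P_hh)               # Kalman gain
            # Perturbed observations keep the analysis ensemble spread correct
            Y = y[:, None] + np.linalg.cholesky(R) @ rng.normal(size=(len(y), n_ens))
            return X + K @ (Y - HX)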

  13. Parameter Estimation and Model Validation of Nonlinear Dynamical Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abarbanel, Henry; Gill, Philip

    In the performance period of this work under a DOE contract, the co-PIs, Philip Gill and Henry Abarbanel, developed new methods for statistical data assimilation for problems of DOE interest, including geophysical and biological problems. This included numerical optimization algorithms for variational principles and new parallel-processing Monte Carlo routines for performing the path integrals of statistical data assimilation. These results are summarized in the monograph “Predicting the Future: Completing Models of Observed Complex Systems” by Henry Abarbanel, published by Springer-Verlag in June 2013. Additional results and details have appeared in the peer-reviewed literature.

  14. An ocean data assimilation system and reanalysis of the World Ocean hydrophysical fields

    NASA Astrophysics Data System (ADS)

    Zelenko, A. A.; Vil'fand, R. M.; Resnyanskii, Yu. D.; Strukov, B. S.; Tsyrulnikov, M. D.; Svirenko, P. I.

    2016-07-01

    A new version of the ocean data assimilation system (ODAS) developed at the Hydrometcentre of Russia is presented. The assimilation is performed following the sequential scheme analysis-forecast-analysis. The main components of the ODAS are procedures for operational observation data processing, a variational analysis scheme, and an ocean general circulation model used to estimate the first guess fields involved in the analysis. In situ observations of temperature and salinity in the upper 1400-m ocean layer obtained from various observational platforms are used as input data. In the new ODAS version, the horizontal resolution of the assimilating model and of the output products is increased, the previous 2D-Var analysis scheme is replaced by a more general 3D-Var scheme, and a more flexible incremental analysis updating procedure is introduced to correct the model calculations. A reanalysis of the main World Ocean hydrophysical fields over the 2005-2015 period has been performed using the updated ODAS. The reanalysis results are compared with data from independent sources.

  15. Variational Assimilation of GOME Total-Column Ozone Satellite Data in a 2D Latitude-Longitude Tracer-Transport Model.

    NASA Astrophysics Data System (ADS)

    Eskes, H. J.; Piters, A. J. M.; Levelt, P. F.; Allaart, M. A. F.; Kelder, H. M.

    1999-10-01

    A four-dimensional data-assimilation method is described to derive synoptic ozone fields from total-column ozone satellite measurements. The ozone columns are advected by a 2D tracer-transport model, using ECMWF wind fields at a single pressure level. Special attention is paid to the modeling of the forecast error covariance and quality control. The temporal and spatial dependence of the forecast error is taken into account, resulting in a global error field at any instant in time that provides a local estimate of the accuracy of the assimilated field. The authors discuss the advantages of the 4D-variational (4D-Var) approach over sequential assimilation schemes. One of the attractive features of the 4D-Var technique is its ability to incorporate measurements at later times t > t0 in the analysis at time t0, in a way consistent with the time evolution as described by the model. This significantly improves the offline analyzed ozone fields.

  16. Analysis of the hydrological response of a distributed physically-based model using post-assimilation (EnKF) diagnostics of streamflow and in situ soil moisture observations

    NASA Astrophysics Data System (ADS)

    Trudel, Mélanie; Leconte, Robert; Paniconi, Claudio

    2014-06-01

    Data assimilation techniques not only enhance model simulations and forecasts, they also provide the opportunity to obtain a diagnostic of both the model and the observations used in the assimilation process. In this research, an ensemble Kalman filter was used to assimilate streamflow observations at a basin outlet and at interior locations, as well as soil moisture at two different depths (15 and 45 cm). The simulation model is the distributed physically-based hydrological model CATHY (CATchment HYdrology) and the study site is the Des Anglais watershed, a 690 km² river basin located in southern Quebec, Canada. Use of Latin hypercube sampling instead of a conventional Monte Carlo method to generate the ensemble reduced the size of the ensemble, and therefore the calculation time. Different post-assimilation diagnostics, based on innovations (observation minus background), analysis residuals (observation minus analysis), and analysis increments (analysis minus background), were used to evaluate assimilation optimality. An important issue in data assimilation is the estimation of error covariance matrices. These diagnostics were also used in a calibration exercise to determine the standard deviations of model parameters, forcing data, and observations that led to optimal assimilations. The analysis of innovations showed a lag between the model forecast and the observations during rainfall events. Assimilation of streamflow observations corrected this discrepancy. Assimilation of outlet streamflow observations improved the Nash-Sutcliffe efficiencies (NSE) between the model forecast (one day) and the observations at both the outlet and interior point locations, owing to the structure of the state vector used. However, assimilation of streamflow observations systematically increased the simulated soil moisture values.
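
    The three diagnostic quantities are simple to compute once the background and analysis are mapped into observation space; a hedged numpy sketch (all names hypothetical) follows.

        import numpy as np

        def assimilation_diagnostics(y, Hxf, Hxa):
            # y: observations; Hxf / Hxa: background / analysis mapped to
            # observation space (1-D arrays over time or sites; names are
            # illustrative). Returns the three standard diagnostic series.
            innovations = y - Hxf        # observation minus background
            residuals = y - Hxa          # observation minus analysis
            increments = Hxa - Hxf       # analysis minus background
            # For a near-optimal filter the innovations are zero-mean and
            # serially uncorrelated; lag-1 autocorrelation flags problems.
            d = innovations - innovations.mean()
            lag1 = (d[1:] @ d[:-1]) / (d @ d)
            return innovations, residuals, increments, lag1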

  17. Sequential data assimilation for a distributed hydrologic model considering different time scale of internal processes

    NASA Astrophysics Data System (ADS)

    Noh, S.; Tachikawa, Y.; Shiiba, M.; Kim, S.

    2011-12-01

    Applications of sequential data assimilation methods have been increasing in hydrology to reduce uncertainty in model predictions. In a distributed hydrologic model there are many types of state variables, and the variables interact with each other on different time scales. However, a framework to deal with the delayed response that originates from the different time scales of hydrologic processes has not been thoroughly addressed in hydrologic data assimilation. In this study, we propose a lagged filtering scheme to account for the lagged response of internal states in a distributed hydrologic model, using two filtering schemes: particle filtering (PF) and ensemble Kalman filtering (EnKF). The EnKF is one of the widely used sub-optimal filters, enabling efficient computation with a limited number of ensemble members, but it is still based on a Gaussian approximation. The PF can be an alternative, in which the propagation of all uncertainties is carried out by a suitable selection of randomly generated particles without any assumptions about the nature of the distributions involved. For the PF, an advanced particle regularization scheme is implemented to preserve the diversity of the particle system. For the EnKF, the ensemble square root filter (EnSRF) is implemented. Each filtering method is parallelized and implemented on a high-performance computing system. A distributed hydrologic model, the water and energy transfer processes (WEP) model, is applied to the Katsura River catchment, Japan, to demonstrate the applicability of the proposed approaches. Forecast results from the PF and EnKF are compared and analyzed in terms of prediction accuracy and probabilistic adequacy. Discussions focus on the prospects and limitations of each data assimilation method.
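
    For contrast with the EnKF update sketched earlier, one cycle of the basic SIR particle filter that such schemes start from can be written as follows. The regularization (jittering) step and the lagged-state handling of the study are omitted; the toy transition and likelihood are assumptions.

        import numpy as np

        rng = np.random.default_rng(2)

        def sir_step(particles, weights, y, propagate, loglik):
            # One SIR cycle: propagate, reweight, resample when the
            # effective sample size (ESS) degenerates.
            particles = propagate(particles)
            logw = np.log(weights) + loglik(particles, y)
            w = np.exp(logw - logw.max()); w /= w.sum()
            if 1.0 / np.sum(w**2) < 0.5 * len(w):        # ESS threshold
                idx = rng.choice(len(w), len(w), p=w)
                particles = particles[idx]
                w = np.full(len(w), 1.0 / len(w))
            return particles, w

        # Toy transition and likelihood (illustrative stand-ins for the
        # distributed model's state update and observation operator):
        propagate = lambda x: x + rng.normal(0.0, 0.3, size=x.shape)
        loglik = lambda x, y: -0.5 * ((y - x) / 0.5)**2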

  18. Simultaneous state-parameter estimation supports the evaluation of data assimilation performance and measurement design for soil-water-atmosphere-plant system

    NASA Astrophysics Data System (ADS)

    Hu, Shun; Shi, Liangsheng; Zha, Yuanyuan; Williams, Mathew; Lin, Lin

    2017-12-01

    Improvements to agricultural water and crop management require detailed information on crop and soil states and their evolution. Data assimilation provides an attractive way of obtaining this information by integrating measurements with a model in a sequential manner. However, data assimilation for the soil-water-atmosphere-plant (SWAP) system still lacks comprehensive exploration, owing to the large number of variables and parameters in the system. In this study, simultaneous state-parameter estimation using the ensemble Kalman filter (EnKF) was employed to evaluate data assimilation performance and provide advice on measurement design for the SWAP system. The results demonstrated that a proper selection of the state vector is critical to effective data assimilation. In particular, updating the development stage was able to avoid the negative effect of "phenological shift", which was caused by the contrasting phenological stages in different ensemble members. The simultaneous state-parameter estimation (SSPE) assimilation strategy outperformed the updating-state-only (USO) assimilation strategy because of its ability to alleviate the inconsistency between model variables and parameters. However, the performance of the SSPE assimilation strategy could deteriorate with an increasing number of uncertain parameters as a result of soil stratification and limited knowledge of crop parameters. In addition to the most easily available surface soil moisture (SSM) and leaf area index (LAI) measurements, deep soil moisture, grain yield or other auxiliary data were required to provide sufficient constraints on parameter estimation and to assure data assimilation performance. This study provides insight into the response of soil moisture and grain yield to data assimilation in the SWAP system and is helpful for soil moisture and crop growth modeling and for measurement design in practice.

  19. A Hybrid Approach to Data Assimilation for Reconstructing the Evolution of Mantle Dynamics

    NASA Astrophysics Data System (ADS)

    Zhou, Quan; Liu, Lijun

    2017-11-01

    Quantifying past mantle dynamic processes represents a major challenge in understanding the temporal evolution of the solid earth. Mantle convection modeling with data assimilation is one of the most powerful tools to investigate the dynamics of plate subduction and mantle convection. Although various data assimilation methods, both forward and inverse, have been created, these methods all have limitations in their capabilities to represent the real earth. Pure forward models tend to miss important mantle structures due to the incorrect initial condition and thus may lead to incorrect mantle evolution. In contrast, pure tomography-based models cannot effectively resolve the fine slab structure and would fail to predict important subduction-zone dynamic processes. Here we propose a hybrid data assimilation approach that combines the unique power of the sequential and adjoint algorithms, which can properly capture the detailed evolution of the downgoing slab and the tomographically constrained mantle structures, respectively. We apply this new method to reconstructing mantle dynamics below the western U.S. while considering large lateral viscosity variations. By comparing this result with those from several existing data assimilation methods, we demonstrate that the hybrid modeling approach recovers the realistic 4-D mantle dynamics the best.

  20. A Hybrid Forward-Adjoint Data Assimilation Method for Reconstructing the Temporal Evolution of Mantle Dynamics

    NASA Astrophysics Data System (ADS)

    Zhou, Q.; Liu, L.

    2017-12-01

    Quantifying past mantle dynamic processes represents a major challenge in understanding the temporal evolution of the solid earth. Mantle convection modeling with data assimilation is one of the most powerful tools to investigate the dynamics of plate subduction and mantle convection. Although various data assimilation methods, both forward and inverse, have been created, these methods all have limitations in their capabilities to represent the real earth. Pure forward models tend to miss important mantle structures due to the incorrect initial condition and thus may lead to incorrect mantle evolution. In contrast, pure tomography-based models cannot effectively resolve the fine slab structure and would fail to predict important subduction-zone dynamic processes. Here we propose a hybrid data assimilation method that combines the unique power of the sequential and adjoint algorithms, which can properly capture the detailed evolution of the downgoing slab and the tomographically constrained mantle structures, respectively. We apply this new method to reconstructing mantle dynamics below the western U.S. while considering large lateral viscosity variations. By comparing this result with those from several existing data assimilation methods, we demonstrate that the hybrid modeling approach recovers the realistic 4-D mantle dynamics best.

  1. Integrating remotely sensed land cover observations and a biogeochemical model for estimating forest ecosystem carbon dynamics

    USGS Publications Warehouse

    Liu, J.; Liu, S.; Loveland, Thomas R.; Tieszen, L.L.

    2008-01-01

    Land cover change is one of the key driving forces of ecosystem carbon (C) dynamics. We present an approach for using sequential remotely sensed land cover observations and a biogeochemical model to estimate contemporary and future ecosystem carbon trends. We applied the General Ensemble Biogeochemical Modelling System (GEMS) to the Laurentian Plains and Hills ecoregion in the northeastern United States for the period 1975-2025. The land cover changes, especially forest stand-replacing events, were detected on 30 randomly located 10-km by 10-km sample blocks, and were assimilated by GEMS for biogeochemical simulations. In GEMS, each unique combination of major controlling variables (including land cover change history) forms a geo-referenced simulation unit. For a forest simulation unit, a Monte Carlo process is used to determine forest type, forest age, forest biomass, and soil C, based on the Forest Inventory and Analysis (FIA) data and the U.S. General Soil Map (STATSGO) data. Ensemble simulations are performed for each simulation unit to incorporate input data uncertainty. Results show that on average the forests of the Laurentian Plains and Hills ecoregion have been sequestering 4.2 Tg C (1 teragram = 10¹² grams) per year, including 1.9 Tg C removed from the ecosystem as a consequence of land cover change. © 2008 Elsevier B.V.

  2. Regional Ocean Data Assimilation

    NASA Astrophysics Data System (ADS)

    Edwards, Christopher A.; Moore, Andrew M.; Hoteit, Ibrahim; Cornuelle, Bruce D.

    2015-01-01

    This article reviews the past 15 years of developments in regional ocean data assimilation. A variety of scientific, management, and safety-related objectives motivate marine scientists to characterize many ocean environments, including coastal regions. As in weather prediction, the accurate representation of physical, chemical, and/or biological properties in the ocean is challenging. Models and observations alone provide imperfect representations of the ocean state, but together they can offer improved estimates. Variational and sequential methods are among the most widely used in regional ocean systems, and there have been exciting recent advances in ensemble and four-dimensional variational approaches. These techniques are increasingly being tested and adapted for biogeochemical applications.

  3. Regional ocean data assimilation.

    PubMed

    Edwards, Christopher A; Moore, Andrew M; Hoteit, Ibrahim; Cornuelle, Bruce D

    2015-01-01

    This article reviews the past 15 years of developments in regional ocean data assimilation. A variety of scientific, management, and safety-related objectives motivate marine scientists to characterize many ocean environments, including coastal regions. As in weather prediction, the accurate representation of physical, chemical, and/or biological properties in the ocean is challenging. Models and observations alone provide imperfect representations of the ocean state, but together they can offer improved estimates. Variational and sequential methods are among the most widely used in regional ocean systems, and there have been exciting recent advances in ensemble and four-dimensional variational approaches. These techniques are increasingly being tested and adapted for biogeochemical applications.

  4. Structured filtering

    NASA Astrophysics Data System (ADS)

    Granade, Christopher; Wiebe, Nathan

    2017-08-01

    A major challenge facing existing sequential Monte Carlo methods for parameter estimation in physics stems from their inability to robustly deal with experiments in which different mechanisms yield the same results with equivalent probability. We address this problem here by proposing a form of particle filtering that clusters the particles comprising the sequential Monte Carlo approximation to the posterior before applying a resampler. Through a new graphical approach to thinking about such models, we are able to devise an artificial-intelligence-based strategy that automatically learns the shape and number of the clusters in the support of the posterior. We demonstrate the power of our approach by applying it to randomized gap estimation and a form of low-circuit-depth phase estimation where existing methods from the physics literature either exhibit much worse performance or even fail completely.
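
    A simplified rendering of the clustering-before-resampling idea follows, with a fixed number of KMeans clusters standing in for the paper's automatically learned cluster structure. It assumes particles form an (n, d) array and that no cluster ends up empty; everything here is an illustrative sketch, not the authors' algorithm.

        import numpy as np
        from sklearn.cluster import KMeans

        def clustered_resample(particles, weights, n_clusters=2, rng=None):
            # Cluster the (n, d) particle cloud, then resample inside each
            # cluster with that cluster's share of the total weight, so
            # minority modes are not starved by a global resampler.
            rng = rng or np.random.default_rng()
            labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(particles)
            out, pos = np.empty_like(particles), 0
            for c in range(n_clusters):
                idx = np.flatnonzero(labels == c)
                wc = weights[idx]
                n_c = min(max(1, int(round(wc.sum() * len(particles)))),
                          len(particles) - pos)
                draw = rng.choice(idx, n_c, p=wc / wc.sum())
                out[pos:pos + n_c] = particles[draw]
                pos += n_c
            return out[:pos], np.full(pos, 1.0 / pos)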

  5. Multilevel sequential Monte Carlo: Mean square error bounds under verifiable conditions

    DOE PAGES

    Del Moral, Pierre; Jasra, Ajay; Law, Kody J. H.

    2017-01-09

    We consider the multilevel sequential Monte Carlo (MLSMC) method of Beskos et al. (Stoch. Proc. Appl. [to appear]). This technique is designed to approximate expectations w.r.t. probability laws associated to a discretization. For instance, in the context of inverse problems, where one discretizes the solution of a partial differential equation. The MLSMC approach is especially useful when independent, coupled sampling is not possible. Beskos et al. show that for MLSMC the computational effort to achieve a given error can be less than independent sampling. In this article we significantly weaken the assumptions of Beskos et al., extending the proofs to non-compact state-spaces. The assumptions are based upon multiplicative drift conditions as in Kontoyiannis and Meyn (Electron. J. Probab. 10 [2005]: 61–123). The assumptions are verified for an example.

  6. Multilevel sequential Monte Carlo: Mean square error bounds under verifiable conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Del Moral, Pierre; Jasra, Ajay; Law, Kody J. H.

    We consider the multilevel sequential Monte Carlo (MLSMC) method of Beskos et al. (Stoch. Proc. Appl. [to appear]). This technique is designed to approximate expectations w.r.t. probability laws associated to a discretization. For instance, in the context of inverse problems, where one discretizes the solution of a partial differential equation. The MLSMC approach is especially useful when independent, coupled sampling is not possible. Beskos et al. show that for MLSMC the computational effort to achieve a given error can be less than independent sampling. In this article we significantly weaken the assumptions of Beskos et al., extending the proofs to non-compact state-spaces. The assumptions are based upon multiplicative drift conditions as in Kontoyiannis and Meyn (Electron. J. Probab. 10 [2005]: 61–123). The assumptions are verified for an example.

  7. Assimilating Remote Sensing Observations of Leaf Area Index and Soil Moisture for Wheat Yield Estimates: An Observing System Simulation Experiment

    NASA Technical Reports Server (NTRS)

    Nearing, Grey S.; Crow, Wade T.; Thorp, Kelly R.; Moran, Mary S.; Reichle, Rolf H.; Gupta, Hoshin V.

    2012-01-01

    Observing system simulation experiments were used to investigate ensemble Bayesian state-updating data assimilation of observations of leaf area index (LAI) and soil moisture (theta) for the purpose of improving single-season wheat yield estimates with the Decision Support System for Agrotechnology Transfer (DSSAT) CropSim-Ceres model. Assimilation was conducted in an energy-limited environment and a water-limited environment. Modeling uncertainty was prescribed to weather inputs, soil parameters and initial conditions, and cultivar parameters, and through perturbations to the model state transition equations. The ensemble Kalman filter and the sequential importance resampling filter were tested for their ability to attenuate the effects of these types of uncertainty on yield estimates. LAI and theta observations were synthesized according to the characteristics of existing remote sensing data, and the effects of observation error were tested. Results indicate that the potential for assimilation to improve end-of-season yield estimates is low. Limitations are due to a lack of root zone soil moisture information, error in LAI observations, and a lack of correlation between leaf and grain growth.

  8. Sequential Data Assimilation for Seismicity: a Proof of Concept

    NASA Astrophysics Data System (ADS)

    van Dinther, Y.; Fichtner, A.; Kuensch, H. R.

    2015-12-01

    Our physical understanding and probabilistic forecasting ability of earthquakes is significantly hampered by limited indications of the state of stress and strength on faults and their governing parameters. Using the sequential data assimilation framework developed in meteorology and oceanography (e.g., Evensen, JGR, 1994) and a seismic cycle forward model based on Navier-Stokes Partial Differential Equations (van Dinther et al., JGR, 2013), we show that such information, with its uncertainties, is within reach, at least for laboratory setups. We aim to provide the first thorough proof of concept for seismicity-related PDE applications via a perfect model test of seismic cycles in a simplified wedge-like subduction setup. By evaluating the performance with respect to known numerical input and output, we aim to answer whether there is any probabilistic forecast value for this laboratory-like setup, which and how many parameters can be constrained, and how much data in both space and time would be needed to do so. Thus far, our implementation of an Ensemble Kalman Filter has demonstrated that probabilistic estimates of both the state of stress and strength on a megathrust fault can be obtained and utilized, even when assimilating surface velocity data at a single point in time and space. An ensemble-based error covariance matrix containing velocities, stresses and pressure links surface velocity observations to fault stresses and strengths well enough to update fault coupling accordingly. Depending on what the synthetic data show, coseismic events can then be triggered or inhibited.

  9. Accounting for spatial correlation errors in the assimilation of GRACE into hydrological models through localization

    NASA Astrophysics Data System (ADS)

    Khaki, M.; Schumacher, M.; Forootan, E.; Kuhn, M.; Awange, J. L.; van Dijk, A. I. J. M.

    2017-10-01

    Assimilation of terrestrial water storage (TWS) information from the Gravity Recovery And Climate Experiment (GRACE) satellite mission can provide significant improvements in hydrological modelling. However, the rather coarse spatial resolution of GRACE TWS and its spatially correlated errors pose considerable challenges for achieving realistic assimilation results. Consequently, successful data assimilation depends on rigorous modelling of the full error covariance matrix of the GRACE TWS estimates, as well as realistic error behavior for hydrological model simulations. In this study, we assess the application of local analysis (LA) to maximize the contribution of GRACE TWS in hydrological data assimilation. For this, we assimilate GRACE TWS into the World-Wide Water Resources Assessment system (W3RA) over the Australian continent while applying LA and accounting for existing spatial correlations using the full error covariance matrix. GRACE TWS data are applied at different spatial resolutions, including 1° to 5° grids as well as basin averages. The ensemble-based sequential filtering technique of the Square Root Analysis (SQRA) is applied to assimilate the TWS data into W3RA. For each spatial scale, the performance of the data assimilation is assessed through comparison with independent in-situ groundwater and soil moisture observations. Overall, the results demonstrate that LA is able to stabilize the inversion process (within the implementation of the SQRA filter), leading to smaller errors at all spatial scales considered, with an average RMSE improvement of 54% (e.g., 52.23 mm down to 26.80 mm) with respect to groundwater in-situ measurements. Validating the assimilated results against groundwater observations indicates that LA leads to 13% better assimilation results (in terms of RMSE) compared to cases with Gaussian error assumptions. This highlights the great potential of LA and of using the full error covariance matrix of GRACE TWS estimates for improved data assimilation results.
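
    Localization in its simplest covariance-tapering form can be sketched in a few lines. Note that the study applies local analysis (LA) with the full GRACE error covariance rather than this elementary Schur-product taper, which is shown only to fix the idea; the Gaussian taper and 1-D coordinates are assumptions.

        import numpy as np

        def localize(P, coords, L=500.0):
            # Schur (element-wise) product of an ensemble covariance with a
            # Gaussian taper of length scale L (same units as coords), which
            # damps spurious long-range sample correlations.
            d = np.abs(coords[:, None] - coords[None, :])   # pairwise distances
            return P * np.exp(-0.5 * (d / L)**2)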

  10. Theoretical Advances in Sequential Data Assimilation for the Atmosphere and Oceans

    NASA Astrophysics Data System (ADS)

    Ghil, M.

    2007-05-01

    We concentrate here on two aspects of advanced Kalman-filter-related methods: (i) the stability of the forecast-assimilation cycle, and (ii) parameter estimation for the coupled ocean-atmosphere system. The nonlinear stability of a prediction-assimilation system guarantees the uniqueness of the sequentially estimated solutions in the presence of partial and inaccurate observations, distributed in space and time; this stability is shown to be a necessary condition for the convergence of the state estimates to the true evolution of the turbulent flow. The stability properties of the governing nonlinear equations and of several data assimilation systems are studied by computing the spectrum of the associated Lyapunov exponents. These ideas are applied to a simple and an intermediate model of atmospheric variability and we show that the degree of stabilization depends on the type and distribution of the observations, as well as on the data assimilation method. These results represent joint work with A. Carrassi, A. Trevisan and F. Uboldi. Much is known by now about the main physical mechanisms that give rise to and modulate the El Niño/Southern Oscillation (ENSO), but the values of several parameters that enter these mechanisms are an important unknown. We apply Extended Kalman Filtering (EKF) for both model state and parameter estimation in an intermediate, nonlinear, coupled ocean-atmosphere model of ENSO. Model behavior is very sensitive to two key parameters: (a) "mu", the ocean-atmosphere coupling coefficient between the sea-surface temperature (SST) and wind stress anomalies; and (b) "delta-s", the surface-layer coefficient. Previous work has shown that "delta-s" determines the period of the model's self-sustained oscillation, while "mu" measures the degree of nonlinearity. Depending on the values of these parameters, the spatio-temporal pattern of model solutions is either that of a delayed oscillator or of a westward propagating mode. Assimilation of SST data from the NCEP-NCAR Reanalysis-2 shows that the parameters can vary on fairly short time scales and switch between values that approximate the two distinct modes of ENSO behavior. Rapid adjustments of these parameters occur, in particular, during strong ENSO events. Ways to apply EKF parameter estimation efficiently to state-of-the-art coupled ocean-atmosphere GCMs will be discussed. These results arise from joint work with D. Kondrashov and C.-j. Sun.

  11. Dream Team or Odd Couple? Examining the Combined Use of Lectures and Podcasting in Higher Education

    ERIC Educational Resources Information Center

    Jiménez-Castillo, David; Sánchez-Fernández, Raquel; Marín-Carrillo, Gema M.

    2017-01-01

    This study explores the effectiveness of the sequential use of lectures and video podcasting in higher education. Drawing together several theories, this paper examines the influence of student's perceived prior knowledge gained from lectures and technology acceptance model-related variables on student's self-reported assimilation of new material…

  12. astroABC : An Approximate Bayesian Computation Sequential Monte Carlo sampler for cosmological parameter estimation

    NASA Astrophysics Data System (ADS)

    Jennings, E.; Madigan, M.

    2017-04-01

    Given the complexity of modern cosmological parameter inference where we are faced with non-Gaussian data and noise, correlated systematics and multi-probe correlated datasets, the Approximate Bayesian Computation (ABC) method is a promising alternative to traditional Markov Chain Monte Carlo approaches in the case where the Likelihood is intractable or unknown. The ABC method is called "Likelihood free" as it avoids explicit evaluation of the Likelihood by using a forward model simulation of the data which can include systematics. We introduce astroABC, an open source ABC Sequential Monte Carlo (SMC) sampler for parameter estimation. A key challenge in astrophysics is the efficient use of large multi-probe datasets to constrain high dimensional, possibly correlated parameter spaces. With this in mind astroABC allows for massive parallelization using MPI, a framework that handles spawning of processes across multiple nodes. A key new feature of astroABC is the ability to create MPI groups with different communicators, one for the sampler and several others for the forward model simulation, which speeds up sampling time considerably. For smaller jobs the Python multiprocessing option is also available. Other key features of this new sampler include: a Sequential Monte Carlo sampler; a method for iteratively adapting tolerance levels; local covariance estimate using scikit-learn's KDTree; modules for specifying optimal covariance matrix for a component-wise or multivariate normal perturbation kernel and a weighted covariance metric; restart files output frequently so an interrupted sampling run can be resumed at any iteration; output and restart files are backed up at every iteration; user defined distance metric and simulation methods; a module for specifying heterogeneous parameter priors including non-standard prior PDFs; a module for specifying a constant, linear, log or exponential tolerance level; well-documented examples and sample scripts. This code is hosted online at https://github.com/EliseJ/astroABC.
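
    The core ABC idea that astroABC wraps in an SMC sampler with adaptive tolerances can be reduced to a few lines of rejection sampling. The sketch below is generic and deliberately does not use astroABC's actual API; all names and the toy inference problem are assumptions.

        import numpy as np

        rng = np.random.default_rng(3)

        def abc_rejection(observed, simulate, prior_draw, distance, eps, n=500):
            # Accept a prior draw whenever its simulated data lands within
            # eps of the observed data (in the chosen distance).
            accepted = []
            while len(accepted) < n:
                theta = prior_draw()
                if distance(simulate(theta), observed) < eps:
                    accepted.append(theta)
            return np.array(accepted)

        # Toy usage: infer a Gaussian mean from the observed sample mean
        posterior = abc_rejection(
            observed=1.3,
            simulate=lambda th: rng.normal(th, 1.0, 100).mean(),
            prior_draw=lambda: rng.uniform(-5.0, 5.0),
            distance=lambda a, b: abs(a - b),
            eps=0.1)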

  13. Exploiting neurovascular coupling: a Bayesian sequential Monte Carlo approach applied to simulated EEG fNIRS data

    NASA Astrophysics Data System (ADS)

    Croce, Pierpaolo; Zappasodi, Filippo; Merla, Arcangelo; Chiarelli, Antonio Maria

    2017-08-01

    Objective. Electrical and hemodynamic brain activity are linked through the neurovascular coupling process and they can be simultaneously measured through integration of electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS). Thanks to the lack of electro-optical interference, the two procedures can be easily combined and, whereas EEG provides electrophysiological information, fNIRS can provide measurements of two hemodynamic variables, such as oxygenated and deoxygenated hemoglobin. A Bayesian sequential Monte Carlo approach (particle filter, PF) was applied to simulated recordings of electrical and neurovascular mediated hemodynamic activity, and the advantages of a unified framework were shown. Approach. Multiple neural activities and hemodynamic responses were simulated in the primary motor cortex of a subject brain. EEG and fNIRS recordings were obtained by means of forward models of volume conduction and light propagation through the head. A state space model of combined EEG and fNIRS data was built and its dynamic evolution was estimated through a Bayesian sequential Monte Carlo approach (PF). Main results. We showed the feasibility of the procedure and the improvements in both electrical and hemodynamic brain activity reconstruction when using the PF on combined EEG and fNIRS measurements. Significance. The investigated procedure allows one to combine the information provided by the two methodologies, and, by taking advantage of a physical model of the coupling between electrical and hemodynamic response, to obtain a better estimate of brain activity evolution. Despite the high computational demand, application of such an approach to in vivo recordings could fully exploit the advantages of this combined brain imaging technology.

  14. Ensemble Smoother implemented in parallel for groundwater problems applications

    NASA Astrophysics Data System (ADS)

    Leyva, E.; Herrera, G. S.; de la Cruz, L. M.

    2013-05-01

    Data assimilation is a process that links forecasting models and measurements, drawing on the benefits of both sources. The Ensemble Kalman Filter (EnKF) is a sequential data-assimilation method that was designed to address two of the main problems related to the use of the Extended Kalman Filter (EKF) with nonlinear models in large state spaces, i.e., the need for a closure approximation and the massive computational requirements associated with the storage and subsequent integration of the error covariance matrix. The EnKF has gained popularity because of its simple conceptual formulation and relative ease of implementation. It has been used successfully in various applications in meteorology and oceanography and, more recently, in petroleum engineering and hydrogeology. The Ensemble Smoother (ES) is a method similar to the EnKF; it was proposed by Van Leeuwen and Evensen (1996). Herrera (1998) proposed a version of the ES, which we call the Ensemble Smoother of Herrera (ESH) to distinguish it from the former. It was introduced for space-time optimization of groundwater monitoring networks. In recent years, this method has been used for data assimilation and parameter estimation in groundwater flow and transport models. The ES method uses Monte Carlo simulation, which consists of generating repeated realizations of the random variable considered, using a flow and transport model. However, a large number of model runs is often required for the moments of the variable to converge. Therefore, depending on the complexity of the problem, a serial computer may require many hours of continuous use to apply the ES. For this reason, the process must be parallelized to run in a reasonable time. In this work we present the results of a parallelization strategy to reduce the execution time of a large number of realizations. The software GWQMonitor by Herrera (1998) implements all the algorithms required for the ESH in Fortran 90. We developed a script in Python using mpi4py to execute GWQMonitor in parallel with the MPI library. Our approach is to calculate the initial inputs for each realization and run groups of these realizations on separate processors. The only modification to GWQMonitor was the final calculation of the covariance matrix. This strategy was applied to the study of a simplified aquifer in a rectangular domain with a single layer. We show the speedup and efficiency for different numbers of processors.
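
    A minimal version of that parallelization strategy with mpi4py might look like the sketch below, with a hypothetical stand-in function replacing the actual GWQMonitor executable; the realization-to-rank striding and the gather on rank 0 mirror the described approach only schematically.

        # Run with: mpiexec -n 4 python es_parallel.py
        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        N_REAL = 1000                      # total Monte Carlo realizations

        def run_realization(seed):
            # Stand-in for one flow-and-transport model run (hypothetical;
            # the real script would invoke GWQMonitor here)
            rng = np.random.default_rng(seed)
            return rng.normal(size=5)      # e.g., heads at 5 monitoring wells

        # Each rank handles its own strided slice of the realizations
        local = np.array([run_realization(s) for s in range(rank, N_REAL, size)])

        # Gather everything on rank 0 and compute the ensemble statistics
        gathered = comm.gather(local, root=0)
        if rank == 0:
            ens = np.vstack(gathered)
            print("ensemble mean:", ens.mean(axis=0))
            print("covariance shape:", np.cov(ens.T).shape)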

  15. Remarks on a financial inverse problem by means of Monte Carlo Methods

    NASA Astrophysics Data System (ADS)

    Cuomo, Salvatore; Di Somma, Vittorio; Sica, Federica

    2017-10-01

    Estimating the price of a barrier option is a typical inverse problem. In this paper we present a numerical and statistical framework for a market with a risk-free interest rate and a risky asset, described by a Geometric Brownian Motion (GBM). After approximating the risky asset with a numerical method, we find the final option price by following an approach based on sequential Monte Carlo methods. All theoretical results are applied to the case of an option whose underlying is a real stock.
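
    As a baseline for comparison, the plain (non-sequential) Monte Carlo price of a discretely monitored up-and-out call under GBM takes only a few lines; parameter values are arbitrary assumptions, and the paper's sequential approach would instead reweight paths as information arrives.

        import numpy as np

        rng = np.random.default_rng(4)

        def barrier_call_mc(S0=100.0, K=100.0, B=120.0, r=0.02, sigma=0.2,
                            T=1.0, n_steps=252, n_paths=100_000):
            # Discretely monitored up-and-out call under GBM:
            # a path pays max(S_T - K, 0) only if it never touches B.
            dt = T / n_steps
            S = np.full(n_paths, S0)
            alive = np.ones(n_paths, dtype=bool)
            for _ in range(n_steps):
                z = rng.normal(size=n_paths)
                S *= np.exp((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z)
                alive &= S < B               # knocked out once S touches B
            payoff = np.where(alive, np.maximum(S - K, 0.0), 0.0)
            return np.exp(-r * T) * payoff.mean()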

  16. On sequential data assimilation for scalar macroscopic traffic flow models

    NASA Astrophysics Data System (ADS)

    Blandin, Sébastien; Couque, Adrien; Bayen, Alexandre; Work, Daniel

    2012-09-01

    We consider the problem of sequential data assimilation for transportation networks using optimal filtering with a scalar macroscopic traffic flow model. Properties of the distribution of the uncertainty on the true state related to the specific nonlinearity and non-differentiability inherent to macroscopic traffic flow models are investigated, derived analytically and analyzed. We show that nonlinear dynamics, by creating discontinuities in the traffic state, affect the performance of classical filters, and in particular that the distribution of the uncertainty on the traffic state at shock waves is a mixture distribution. The non-differentiability of traffic dynamics around stationary shock waves is also proved, and the resulting optimality loss of the estimates is quantified numerically. The properties of the estimates are explicitly studied for the Godunov scheme (and thus the Cell-Transmission Model), leading to specific conclusions about their use in the context of filtering, which is a significant contribution of this article. Analytical proofs and numerical tests are introduced to support the results presented. A Java implementation of the classical filters used in this work is available on-line at http://traffic.berkeley.edu to facilitate further efforts on this topic and foster reproducible research.

  17. Advancing Data assimilation for Baltic Monitoring and Forecasting Center: implementation and evaluation of the HBM-PDAF system

    NASA Astrophysics Data System (ADS)

    Korabel, Vasily; She, Jun; Huess, Vibeke; Woge Nielsen, Jacob; Murawsky, Jens; Nerger, Lars

    2017-04-01

    The potential of an efficient data assimilation (DA) scheme to improve model forecast skill has been successfully demonstrated by many operational centres around the world. The Baltic-North Sea region is one of the most heavily monitored seas: ferryboxes, buoys, ADCP moorings, shallow-water Argo floats, and research vessels are providing more and more near-real-time observations. Coastal altimetry is now providing an increasing amount of high-resolution sea level observations, which will be significantly expanded by the launch of the SWOT satellite in the coming years. This will turn operational DA into a valuable tool for improving forecast quality in the region. This motivated us to focus on advancing DA for the Baltic Monitoring and Forecasting Centre (BAL MFC) in order to create a common framework for operational data assimilation in the Baltic Sea. We have implemented the HBM-PDAF system, based on the Parallel Data Assimilation Framework (PDAF), a highly versatile and optimised parallel suite with a choice of sequential schemes originally developed at AWI, and the hydrodynamic HIROMB-BOOS Model (HBM). In the initial phase, only satellite Sea Surface Temperature (SST) Level 3 data have been assimilated. Several related aspects are discussed, including improvements in forecast quality for both surface and subsurface fields, the estimation of ensemble-based forecast error covariance, and possibilities for assimilating new types of observations, such as in-situ salinity and temperature profiles, coastal altimetry, and ice concentration.

  18. Enhancing Data Assimilation by Evolutionary Particle Filter and Markov Chain Monte Carlo

    NASA Astrophysics Data System (ADS)

    Moradkhani, H.; Abbaszadeh, P.; Yan, H.

    2016-12-01

    Particle Filters (PFs) have received increasing attention from researchers across the hydro-geosciences as an effective method to improve model predictions in nonlinear and non-Gaussian dynamical systems. The implementation of dual state and parameter estimation by means of data assimilation in hydrology and geoscience has evolved since 2005 from the SIR-PF to the PF-MCMC, and now to a more effective and robust framework, the evolutionary PF approach based on the Genetic Algorithm (GA) and Markov Chain Monte Carlo (MCMC), the so-called EPF-MCMC. In this framework, the posterior distribution undergoes an evolutionary process to update an ensemble of prior states so that it more closely resembles the realistic posterior probability distribution. The premise of this approach is that the particles move toward optimal positions using GA optimization coupled with MCMC, increasing the number of effective particles; particle degeneracy is thus avoided while particle diversity is improved. The proposed algorithm is applied to a conceptual, highly nonlinear hydrologic model, and the effectiveness, robustness and reliability of the method in jointly estimating states and parameters while reducing uncertainty is demonstrated for a few river basins across the United States.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Serin, E.; Codel, G.; Mabhouti, H.

    Purpose: In small-field geometries, electronic equilibrium can be lost, making it challenging for the dose-calculation algorithm to accurately predict the dose, especially in the presence of tissue heterogeneities. In this study, the dosimetric accuracy of the Monte Carlo (MC) advanced dose calculation and sequential algorithms of the Multiplan treatment planning system was investigated for small radiation fields incident on homogeneous and heterogeneous geometries. Methods: Small open fields of the fixed cones of a CyberKnife M6 unit, 100 to 500 mm², were used for this study. The fields were incident on an in-house phantom containing lung, air, and bone inhomogeneities, and also on a homogeneous phantom. Using the same film batch, the net OD-to-dose calibration curve was obtained using the CyberKnife with the 60 mm fixed cone by delivering 0-800 cGy. Films were scanned 48 hours after irradiation using an Epson 1000XL flatbed scanner. The dosimetric accuracy of the MC and sequential algorithms in the presence of the inhomogeneities was compared against EBT3 film dosimetry. Results: Open-field tests in a homogeneous phantom showed good agreement between the two algorithms and the film measurements. For the MC algorithm, the minimum gamma analysis passing rates between measured and calculated dose distributions were 99.7% and 98.3% for homogeneous and inhomogeneous fields, respectively, in the case of lung and bone. For the sequential algorithm, the minimum gamma analysis passing rates were 98.9% and 92.5% for homogeneous and inhomogeneous fields, respectively, for all cone sizes used. In the case of the air heterogeneity, the differences were larger for both calculation algorithms. Overall, when compared to measurement, the MC had better agreement than the sequential algorithm. Conclusion: The Monte Carlo calculation algorithm in the Multiplan treatment planning system is an improvement over the existing sequential algorithm. Dose discrepancies were observed in the presence of air inhomogeneities.

  20. Data Assimilation - Advances and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Brian J.

    2014-07-30

    This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.

  1. Evaluation on surface current observing network of high frequency ground wave radars in the Gulf of Thailand

    NASA Astrophysics Data System (ADS)

    Yin, Xunqiang; Shi, Junqiang; Qiao, Fangli

    2018-05-01

    Due to the high cost of ocean observing systems, the scientific design of observation networks is very important. The current network of the high frequency radar system in the Gulf of Thailand has been studied using a three-dimensional coastal ocean model. First, observations from the current radars were assimilated into this coastal model, and the forecasts improved due to the data assimilation. The results also showed, however, that further optimization of the observing network is necessary. Then, a series of experiments was carried out to assess the performance of the existing high frequency ground wave radar surface current observation system. The simulated surface current data in three regions were assimilated sequentially using an efficient ensemble Kalman filter data assimilation scheme. The experimental results showed that the coastal surface current observation system plays a positive role in improving the numerical simulation of the currents. Compared with the control experiment without assimilation, the simulation accuracy of surface and subsurface currents improved after assimilating the surface currents observed by the radar network. However, the improvement differed considerably among the three observing regions, indicating that the current observing network in the Gulf of Thailand is not fully effective and that further optimization is required. Based on these evaluations, a manual scheme was designed by discarding the redundant and inefficient locations and adding new stations where the performance after data assimilation was still low. For comparison, an objective scheme based on the idea of data assimilation was also obtained. Results show that both schemes outperform the original network, and that the assimilation-based optimal scheme is superior to the manual scheme derived from the evaluation of the original observing network in the Gulf of Thailand. The distribution of the optimal radar network could provide useful guidance for the future design of observing systems in this region.

  2. Ozone data assimilation with GEOS-Chem: a comparison between 3-D-Var, 4-D-Var, and suboptimal Kalman filter approaches

    NASA Astrophysics Data System (ADS)

    Singh, K.; Sandu, A.; Bowman, K. W.; Parrington, M.; Jones, D. B. A.; Lee, M.

    2011-08-01

    Chemistry transport models determine the evolving chemical state of the atmosphere by solving the fundamental equations that govern physical and chemical transformations, subject to initial conditions of the atmospheric state and surface boundary conditions, e.g., surface emissions. Data assimilation techniques synthesize model predictions with measurements in a rigorous mathematical framework that provides observational constraints on these conditions. Two families of data assimilation methods are currently widely used: variational and Kalman filter (KF). The variational approach is based on control theory and formulates data assimilation as a minimization problem for a cost functional that measures the model-observation mismatch. The Kalman filter approach is rooted in statistical estimation theory and provides the analysis covariance together with the best state estimate. Suboptimal Kalman filters employ different approximations of the covariances in order to make the computations feasible with large models. Each family of methods has both merits and drawbacks. This paper compares several data assimilation methods used for global chemical data assimilation. Specifically, we evaluate data assimilation approaches for improving estimates of the summertime global tropospheric ozone distribution in August 2006, based on ozone observations from the NASA Tropospheric Emission Spectrometer and the GEOS-Chem chemistry transport model. The resulting analyses are compared against independent ozonesonde measurements to assess the effectiveness of each assimilation method. All assimilation methods provide notable improvements over the free model simulations, which differ from the ozonesonde measurements by about 20% (below 200 hPa). Four-dimensional variational data assimilation with window lengths between five days and two weeks is the most accurate method, with mean differences between analysis profiles and ozonesonde measurements of 1-5%. Two sequential assimilation approaches (three-dimensional variational and suboptimal KF), although derived from different theoretical considerations, provide similar ozone estimates, with relative differences of 5-10% between the analyses and ozonesonde measurements. Adjoint sensitivity analysis techniques are used to explore the role of uncertainties in ozone precursors and their emissions on the distribution of tropospheric ozone. A novel technique is introduced that projects 3-D-Var increments back to an equivalent initial condition, which facilitates comparison with 4-D-Var techniques.
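
    For reference, the cost functional minimized by the variational approaches takes the standard textbook form (generic notation, not GEOS-Chem-specific):

        J(x_0) = \frac{1}{2} (x_0 - x_b)^T B^{-1} (x_0 - x_b)
                 + \frac{1}{2} \sum_{k=0}^{N} \big( y_k - H_k M_{0 \to k}(x_0) \big)^T R_k^{-1} \big( y_k - H_k M_{0 \to k}(x_0) \big),

    where x_b is the background state, B and R_k are the background and observation error covariances, H_k is the observation operator, and M_{0->k} is the model propagator from t_0 to t_k; 3-D-Var is the special case N = 0 (no model propagation within the window), while suboptimal Kalman filters replace the static B by an approximate flow-dependent covariance.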

  3. A Comparison of Sequential Assimilation Schemes for Ocean Prediction with the HYbrid Coordinate Ocean Model (HYCOM): Twin Experiments with Static Forecast Error Covariances

    DTIC Science & Technology

    2011-01-01


  4. Sequential Data Assimilation for Seismicity: a Proof of Concept

    NASA Astrophysics Data System (ADS)

    van Dinther, Ylona; Fichtner, Andreas; Kuensch, Hansruedi

    2016-04-01

    Our probabilistic forecasting ability and physical understanding of earthquakes is significantly hampered by limited indications of the current and evolving state of stress and strength on faults. This information is typically thought to be beyond our resolution capabilities based on surface data. We show that the state of stress and strength is actually obtainable for settings with one dominant fault. State variables and their uncertainties are obtained using Ensemble Kalman Filtering, a sequential data assimilation technique extensively developed for weather forecasting purposes. Through the least-squares solution of Bayes' theorem, noisy data are for the first time assimilated to update a Partial Differential Equation-driven seismic cycle model. This visco-elasto-plastic continuum forward model solves the Navier-Stokes equations with a rate-dependent friction coefficient (van Dinther et al., JGR, 2013). To prove the concept of this weather-earthquake forecasting bridge we perform a perfect model test. Synthetic numerical data from a single analogue borehole are assimilated into 20 ensemble models over 14 cycles of analogue earthquakes. Since we know the true state of the numerical data model, a quantitative and qualitative evaluation shows that meaningful information on the stress and strength of the unobserved fault is typically already available once data from a single, shallow borehole are assimilated over part of a seismic cycle. This is possible because the sampled error covariance matrix contains prior information on the physics that relates velocities, stresses, and pressures at the surface to those at the fault. During the analysis step, stress and strength distributions are thus reconstructed in such a way that fault coupling can be updated to either inhibit or trigger events. In the subsequent forward propagation step, the physical equations are solved to propagate the updated states forward in time and thus provide probabilistic information on the occurrence of the next analogue earthquake. With continued assimilation, the system's forecasting ability turns out to be beyond expectations: five analogue events are forecasted approximately accurately, five had indications slightly earlier, three were identified only during propagation, and one was missed. Otherwise, quiet interseismic periods were predominantly forecasted, except on three occasions where smaller events triggered prolonged probabilities until a larger event arrived slightly later. Besides temporal forecasting, we also observe some magnitude forecasting skill for 59% of the events, while the other event sizes were underestimated. This new framework thus has the potential, in the long term, to assist in improving probabilistic hazard assessment.

  5. Particle rejuvenation of Rao-Blackwellized sequential Monte Carlo smoothers for conditionally linear and Gaussian models

    NASA Astrophysics Data System (ADS)

    Nguyen, Ngoc Minh; Corff, Sylvain Le; Moulines, Éric

    2017-12-01

    This paper focuses on sequential Monte Carlo approximations of smoothing distributions in conditionally linear and Gaussian state spaces. To reduce the Monte Carlo variance of smoothers, it is typical in these models to use Rao-Blackwellization: a particle approximation is used to sample sequences of hidden regimes, while the Gaussian states are explicitly integrated conditional on the sequence of regimes and observations, using variants of the Kalman filter/smoother. The first successful attempt to use Rao-Blackwellization for smoothing extends the Bryson-Frazier smoother for Gaussian linear state space models using the generalized two-filter formula together with Kalman filters/smoothers. More recently, a forward-backward decomposition of smoothing distributions mimicking the Rauch-Tung-Striebel smoother for the regimes, combined with backward Kalman updates, has been introduced. This paper investigates the benefit of introducing additional rejuvenation steps in all these algorithms to sample, at each time instant, new regimes conditional on the forward and backward particles. This defines particle-based approximations of the smoothing distributions whose support is not restricted to the set of particles sampled in the forward or backward filter. These procedures are applied to commodity markets, which are described using a two-factor model based on the spot price and a convenience yield for crude oil data.
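
    To make the Rao-Blackwellization idea concrete, the sketch below implements a minimal Rao-Blackwellized particle filter for a conditionally linear and Gaussian model: the discrete regime is sampled with particles, while the Gaussian state is integrated exactly by a Kalman filter conditional on each particle's regime. The scalar model, transition matrix and all numbers are illustrative assumptions, not the paper's two-factor commodity model:

```python
import numpy as np
rng = np.random.default_rng(1)

# Conditionally linear-Gaussian toy model: two regimes, different dynamics.
A = [np.array([[0.9]]), np.array([[0.5]])]   # regime-dependent transition
Q = [np.array([[0.1]]), np.array([[0.5]])]   # regime-dependent state noise
C = np.array([[1.0]]); Rv = np.array([[0.2]])
P_trans = np.array([[0.95, 0.05], [0.10, 0.90]])  # regime Markov chain

def rbpf_step(regimes, means, covs, logw, y):
    N = len(regimes)
    for i in range(N):
        r = rng.choice(2, p=P_trans[regimes[i]])      # sample new regime
        m_pred = A[r] @ means[i]                      # Kalman predict ...
        P_pred = A[r] @ covs[i] @ A[r].T + Q[r]       # ... given the regime
        S = C @ P_pred @ C.T + Rv                     # innovation covariance
        K = P_pred @ C.T @ np.linalg.inv(S)
        innov = y - C @ m_pred
        means[i] = m_pred + K @ innov                 # exact Kalman update
        covs[i] = (np.eye(1) - K @ C) @ P_pred
        # Weight by the marginal likelihood of y given the regime path.
        logw[i] += -0.5 * (innov @ np.linalg.inv(S) @ innov
                           + np.log(np.linalg.det(2 * np.pi * S))).item()
        regimes[i] = r
    w = np.exp(logw - logw.max()); w /= w.sum()
    idx = rng.choice(N, N, p=w)                       # multinomial resampling
    return (regimes[idx], [means[i] for i in idx],
            [covs[i] for i in idx], np.zeros(N))

N = 200
regimes = rng.choice(2, N)
means = [np.zeros(1) for _ in range(N)]
covs = [np.eye(1) for _ in range(N)]
logw = np.zeros(N)
for y in [np.array([0.3]), np.array([0.1]), np.array([-0.4])]:
    regimes, means, covs, logw = rbpf_step(regimes, means, covs, logw, y)
```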

  6. Application and Evaluation of a Snowmelt Runoff Model in the Tamor River Basin, Eastern Himalaya Using a Markov Chain Monte Carlo (MCMC) Data Assimilation Approach

    NASA Technical Reports Server (NTRS)

    Panday, Prajjwal K.; Williams, Christopher A.; Frey, Karen E.; Brown, Molly E.

    2013-01-01

    Previous studies have drawn attention to substantial hydrological changes taking place in mountainous watersheds where hydrology is dominated by cryospheric processes. Modelling is an important tool for understanding these changes but is particularly challenging in mountainous terrain owing to the scarcity of ground observations and the uncertainty of model parameters across space and time. This study utilizes a Markov Chain Monte Carlo data assimilation approach to examine and evaluate the performance of a conceptual, degree-day snowmelt runoff model applied in the Tamor River basin in the eastern Nepalese Himalaya. The snowmelt runoff model is calibrated using daily streamflow from 2002 to 2006 with fairly high accuracy (average Nash-Sutcliffe metric approx. 0.84, annual volume bias <3%). The Markov Chain Monte Carlo approach constrains the parameters to which the model is most sensitive (e.g. lapse rate and recession coefficient) and maximizes model fit and performance. The average snowmelt contribution to total runoff in the Tamor River basin for the 2002-2006 period is estimated to be 29.7+/-2.9% (which includes 4.2+/-0.9% from snowfall that promptly melts), whereas 70.3+/-2.6% is attributed to contributions from rainfall. On average, the elevation zone in the 4000-5500 m range contributes the most to basin runoff, averaging 56.9+/-3.6% of all snowmelt input and 28.9+/-1.1% of all rainfall input to runoff. Model simulated streamflow using an interpolated precipitation data set decreases the fractional contribution from rainfall versus snowmelt compared with simulations using observed station precipitation. Model experiments indicate that the hydrograph itself does not constrain estimates of snowmelt versus rainfall contributions to total outflow, but that this derives from the degree-day melting model. Lastly, we demonstrate that the data assimilation approach is useful for quantifying and reducing uncertainty related to model parameters and thus provides uncertainty bounds on snowmelt and rainfall contributions in such mountainous watersheds.
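
    The calibration approach described above can be illustrated with a generic random-walk Metropolis sampler. The sketch below calibrates two parameters, suggestively named after the lapse rate and recession coefficient mentioned in the abstract, of a toy degree-day-style model against synthetic streamflow; the model, likelihood and priors are our own simplifications, not the study's:

```python
import numpy as np
rng = np.random.default_rng(2)

def log_posterior(theta, q_obs, forward):
    """Gaussian log-likelihood plus a flat prior inside bounds (toy choice)."""
    lapse, k_rec = theta
    if not (0.0 < lapse < 12.0 and 0.0 < k_rec < 1.0):
        return -np.inf
    q_sim = forward(lapse, k_rec)
    return -0.5 * np.sum((q_obs - q_sim) ** 2 / 0.5 ** 2)

def metropolis(q_obs, forward, theta0, n_iter=5000, step=(0.2, 0.02)):
    """Random-walk Metropolis sampling of the parameter posterior."""
    chain = np.empty((n_iter, 2))
    theta, lp = np.array(theta0), log_posterior(theta0, q_obs, forward)
    for i in range(n_iter):
        prop = theta + rng.normal(0, step)            # symmetric proposal
        lp_prop = log_posterior(prop, q_obs, forward)
        if np.log(rng.uniform()) < lp_prop - lp:      # accept/reject
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

# Toy "degree-day" forward model: melt = temperature excess over a threshold,
# routed through a linear reservoir with recession coefficient k_rec.
T = 5 + 3 * np.sin(np.linspace(0, 6, 100))            # synthetic temperatures
def forward(lapse, k_rec):
    melt = np.maximum(T - lapse, 0.0)
    q = np.zeros_like(melt)
    for t in range(1, len(melt)):
        q[t] = k_rec * q[t - 1] + (1 - k_rec) * melt[t]
    return q

q_obs = forward(4.0, 0.8) + rng.normal(0, 0.1, 100)   # synthetic observations
chain = metropolis(q_obs, forward, theta0=(6.0, 0.5))
print(chain[2500:].mean(axis=0))                      # posterior means
```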

  7. Suppressing correlations in massively parallel simulations of lattice models

    NASA Astrophysics Data System (ADS)

    Kelling, Jeffrey; Ódor, Géza; Gemming, Sibylle

    2017-11-01

    For lattice Monte Carlo simulations, parallelization is crucial to make studies of large systems and long simulation times feasible, while sequential simulations remain the gold standard for correlation-free dynamics. Here, various domain decomposition schemes are compared, concluding with one which delivers virtually correlation-free simulations on GPUs. Extensive simulations of the octahedron model for 2+1 dimensional Kardar-Parisi-Zhang surface growth, which is very sensitive to correlations in the site-selection dynamics, were performed to show self-consistency of the parallel runs and agreement with the sequential algorithm. We present a GPU implementation providing a speedup of about 30× over a parallel CPU implementation on a single socket and at least 180× with respect to the sequential reference.
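
    A common way to parallelize lattice Monte Carlo updates without conflicts is a checkerboard domain decomposition: sites of one sublattice have no nearest neighbours of the same colour, so they can all be updated simultaneously. The vectorized 2D Ising sketch below illustrates the generic concept only (the paper studies the octahedron KPZ model and compares several different schemes):

```python
import numpy as np
rng = np.random.default_rng(3)

L, beta = 64, 0.4
spins = rng.choice([-1, 1], size=(L, L))

# Checkerboard masks: same-colour sites share no nearest neighbours, so the
# whole sublattice can be updated at once (one GPU thread per site, in spirit).
ii, jj = np.indices((L, L))
masks = [(ii + jj) % 2 == c for c in (0, 1)]

def sweep(spins):
    for mask in masks:                       # two half-sweeps per full sweep
        nn = (np.roll(spins, 1, 0) + np.roll(spins, -1, 0) +
              np.roll(spins, 1, 1) + np.roll(spins, -1, 1))
        dE = 2 * spins * nn                  # energy cost of flipping each site
        accept = rng.random((L, L)) < np.exp(-beta * dE)
        spins = np.where(mask & accept, -spins, spins)
    return spins

for _ in range(100):
    spins = sweep(spins)
print("magnetization:", spins.mean())
```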

  8. Growth of vertically aligned nanowires in metal-oxide nanocomposites: kinetic Monte-Carlo modeling versus experiments.

    PubMed

    Hennes, M; Schuler, V; Weng, X; Buchwald, J; Demaille, D; Zheng, Y; Vidal, F

    2018-04-26

    We employ kinetic Monte-Carlo simulations to study the growth process of metal-oxide nanocomposites obtained via sequential pulsed laser deposition. Using Ni-SrTiO3 (Ni-STO) as a model system, we reduce the complexity of the computational problem by choosing a coarse-grained approach mapping Sr, Ti and O atoms onto a single effective STO pseudo-atom species. With this ansatz, we scrutinize the kinetics of the sequential synthesis process, governed by alternating deposition and relaxation steps, and analyze the self-organization propensity of Ni atoms into straight vertically aligned nanowires embedded in the surrounding STO matrix. We finally compare the predictions of our binary toy model with experiments and demonstrate that our computational approach captures fundamental aspects of self-assembled nanowire synthesis. Despite its simplicity, our modeling strategy successfully describes the impact of relevant parameters like the concentration or laser frequency on the final nanoarchitecture of metal-oxide thin films grown via pulsed laser deposition.
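
    A minimal kinetic Monte Carlo loop alternates Gillespie-type event selection with time increments drawn from the total rate. The 1D deposition-plus-hopping sketch below shows the mechanics; the lattice, species and rates are illustrative stand-ins, not the Ni-STO parameterization used in the paper:

```python
import numpy as np
rng = np.random.default_rng(4)

L, n_steps = 50, 2000
height = np.zeros(L, int)          # columnar film profile
nu_dep, nu_hop = 1.0, 5.0          # deposition and hop attempt rates (toy)

t = 0.0
for _ in range(n_steps):
    # Event catalogue: one deposition per column, plus downhill hops only.
    rates = [(nu_dep, ("dep", i)) for i in range(L)]
    for i in range(L):
        for j in (i - 1, (i + 1) % L):       # periodic neighbours
            if height[i] > height[j]:        # downhill hop lowers energy
                rates.append((nu_hop, ("hop", i, j)))
    total = sum(r for r, _ in rates)
    t += rng.exponential(1.0 / total)        # Gillespie time increment
    # Select one event with probability proportional to its rate.
    pick = rng.uniform(0, total)
    acc = 0.0
    for r, ev in rates:
        acc += r
        if pick <= acc:
            break
    if ev[0] == "dep":
        height[ev[1]] += 1
    else:
        height[ev[1]] -= 1
        height[ev[2]] += 1

print("mean height:", height.mean(), "roughness:", height.std())
```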

  9. Prognostics of slurry pumps based on a moving-average wear degradation index and a general sequential Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Tse, Peter W.

    2015-05-01

    Slurry pumps are commonly used in oil-sand mining for pumping mixtures of abrasive liquids and solids. These operations cause constant wear of slurry pump impellers, which results in the breakdown of the slurry pumps. This paper develops a prognostic method for estimating remaining useful life of slurry pump impellers. First, a moving-average wear degradation index is proposed to assess the performance degradation of the slurry pump impeller. Secondly, the state space model of the proposed health index is constructed. A general sequential Monte Carlo method is employed to derive the parameters of the state space model. The remaining useful life of the slurry pump impeller is estimated by extrapolating the established state space model to a specified alert threshold. Data collected from an industrial oil sand pump were used to validate the developed method. The results show that the accuracy of the developed method improves as more data become available.
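
    The general shape of such a sequential Monte Carlo prognostic can be sketched in a few lines: a bootstrap particle filter tracks the hidden wear level from a noisy degradation index, and the remaining useful life follows from extrapolating the particles to an alert threshold. The state-space model and all numbers below are toy assumptions, not the paper's slurry-pump model:

```python
import numpy as np
rng = np.random.default_rng(5)

# Toy degradation model: the hidden wear level drifts upward with noise; the
# measured degradation index observes it through additive Gaussian noise.
drift, q_std, r_std, threshold = 0.05, 0.02, 0.1, 5.0

def smc_filter(observations, n_particles=1000):
    x = np.zeros(n_particles)                       # initial wear particles
    for z in observations:
        x = x + drift + rng.normal(0, q_std, n_particles)   # propagate
        w = np.exp(-0.5 * ((z - x) / r_std) ** 2)           # likelihood
        w /= w.sum()
        x = x[rng.choice(n_particles, n_particles, p=w)]    # resample
    return x

def remaining_useful_life(x, horizon=500):
    """Extrapolate each particle until it crosses the alert threshold."""
    rul = np.full(len(x), np.inf)
    for k in range(1, horizon):
        x = x + drift + rng.normal(0, q_std, len(x))
        rul = np.where(np.isinf(rul) & (x >= threshold), k, rul)
    return rul

true_wear = 0.05 * np.arange(60)
obs = true_wear + rng.normal(0, r_std, 60)
particles = smc_filter(obs)
rul = remaining_useful_life(particles)
print("median RUL (steps):", np.median(rul[np.isfinite(rul)]))
```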

  10. Does Ocean Color Data Assimilation Improve Estimates of Global Ocean Inorganic Carbon?

    NASA Technical Reports Server (NTRS)

    Gregg, Watson

    2012-01-01

    Ocean color data assimilation has been shown to dramatically improve chlorophyll abundances and distributions globally and regionally in the oceans. Chlorophyll is a proxy for phytoplankton biomass (which is explicitly defined in a model), and is related to the inorganic carbon cycle through the interactions of the organic carbon (particulate and dissolved) and through primary production, where inorganic carbon is directly taken out of the system. Does ocean color data assimilation, whose effects on estimates of chlorophyll are demonstrable, trickle through the simulated ocean carbon system to produce improved estimates of inorganic carbon? Our emphasis here is dissolved inorganic carbon, pCO2, and the air-sea flux. We use a sequential data assimilation method that assimilates chlorophyll directly and indirectly changes nutrient concentrations in a multi-variate approach. The results are decidedly mixed. Dissolved inorganic carbon estimates from the assimilation model are not meaningfully different from free-run, or unassimilated, results, and comparisons with in situ data are similar. pCO2 estimates are generally worse after data assimilation, with global estimates diverging 6.4% from in situ data, while free-run estimates are only 4.7% higher. Basin correlations are, however, slightly improved: r increases from 0.78 to 0.79, with slope closer to unity at 0.94 compared to 0.86. In contrast, the air-sea flux of CO2 is noticeably improved after data assimilation. Global differences decline from -0.635 mol/m2/y (stronger model sink from the atmosphere) to -0.202 mol/m2/y. Basin correlations are slightly improved from r=0.77 to r=0.78, with slope closer to unity (from 0.93 to 0.99). The Equatorial Atlantic appears as a slight sink in the free-run, but is correctly represented as a moderate source in the assimilation model. However, the assimilation model shows the Antarctic to be a source rather than a modest sink, and the North Indian basin is represented incorrectly as a sink rather than the source indicated by the free-run model and data estimates.

  11. A Stabilized Sparse-Matrix U-D Square-Root Implementation of a Large-State Extended Kalman Filter

    NASA Technical Reports Server (NTRS)

    Boggs, D.; Ghil, M.; Keppenne, C.

    1995-01-01

    The full nonlinear Kalman filter sequential algorithm is, in theory, well-suited to the four-dimensional data assimilation problem in large-scale atmospheric and oceanic applications. However, it was later discovered that this algorithm can be very sensitive to computer roundoff, and that results may cease to be meaningful as time advances. Implementations of a modified Kalman filter are given.
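
    The roundoff sensitivity mentioned above is the classical motivation for square-root and U-D filter implementations. A simple way to see the issue is to compare the cheap covariance update P = (I - KH)P with the numerically stabilized Joseph form on an ill-conditioned problem; the sketch below does this (an illustration of the motivation only, not the paper's sparse U-D algorithm):

```python
import numpy as np

def kalman_updates(P, H, R):
    """Compare the naive and Joseph-stabilized covariance updates."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    I = np.eye(P.shape[0])
    P_naive = (I - K @ H) @ P           # cheap; can lose symmetry/PSD in
                                        # finite precision
    P_joseph = (I - K @ H) @ P @ (I - K @ H).T + K @ R @ K.T  # stabilized
    return P_naive, P_joseph

# An ill-conditioned covariance makes the difference visible.
n = 6
P = np.diag(10.0 ** np.arange(0, -12, -2))      # condition number ~1e10
H = np.ones((1, n))
R = np.array([[1e-12]])
P_naive, P_joseph = kalman_updates(P, H, R)
print("min eigenvalue (naive): ", np.linalg.eigvalsh(P_naive).min())
print("min eigenvalue (joseph):", np.linalg.eigvalsh(P_joseph).min())
```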

  12. Discharge data assimilation in a distributed hydrologic model for flood forecasting purposes

    NASA Astrophysics Data System (ADS)

    Ercolani, G.; Castelli, F.

    2017-12-01

    Flood early warning systems benefit from accurate river flow forecasts, and data assimilation may improve their reliability. However, the actual enhancement that can be obtained in operational practice should be investigated in detail and quantified. In this work we assess the benefits that the simultaneous assimilation of discharge observations at multiple locations can bring to flow forecasting through a distributed hydrologic model. The distributed model, MOBIDIC, is part of the operational flood forecasting chain of the Tuscany Region in Central Italy. The assimilation system adopts a mixed variational-Monte Carlo approach to efficiently update initial river flow, soil moisture, and a parameter related to runoff production. The evaluation of the system is based on numerous hindcast experiments of real events. The events are characterized by significant rainfall that resulted in both high and relatively low flow in the river network. The area of study is the main basin of the Tuscany Region, i.e. the Arno river basin, which extends over about 8300 km2 and whose mean annual precipitation is around 800 mm. The Arno's mainstream, with its nearly 240 km length, passes through major Tuscan cities, such as Florence and Pisa, that are vulnerable to floods (e.g. the flood of November 1966). The assimilation tests follow the usage of the model in the forecasting chain, employing the operational resolution in both space and time (500 m and 15 minutes, respectively) and releasing new flow forecasts every 6 hours. The assimilation strategy is evaluated with respect to open-loop simulations, i.e. runs that do not exploit discharge observations through data assimilation. We compare hydrographs in their entirety, as well as classical performance indexes, such as error on peak flow and Nash-Sutcliffe efficiency. The dependence of performance on lead time and location is assessed. Results indicate that the operational forecasting chain can benefit from the developed assimilation system, although with a significant variability due to the specific characteristics of each single event, and with downstream locations more sensitive to observations than upstream sites.

  13. Exact Tests for the Rasch Model via Sequential Importance Sampling

    ERIC Educational Resources Information Center

    Chen, Yuguo; Small, Dylan

    2005-01-01

    Rasch proposed an exact conditional inference approach to testing his model but never implemented it because it involves the calculation of a complicated probability. This paper furthers Rasch's approach by (1) providing an efficient Monte Carlo methodology for accurately approximating the required probability and (2) illustrating the usefulness…

  14. Multi-Scale Hydrometeorological Modeling, Land Data Assimilation and Parameter Estimation with the Land Information System

    NASA Technical Reports Server (NTRS)

    Peters-Lidard, Christa D.

    2011-01-01

    The Land Information System (LIS; http://lis.gsfc.nasa.gov) is a flexible land surface modeling framework that has been developed with the goal of integrating satellite- and ground-based observational data products and advanced land surface modeling techniques to produce optimal fields of land surface states and fluxes. As such, LIS represents a step towards the next generation land component of an integrated Earth system model. In recognition of LIS object-oriented software design, use and impact in the land surface and hydrometeorological modeling community, the LIS software was selected as a co-winner of NASA's 2005 Software of the Year award. LIS facilitates the integration of observations from Earth-observing systems and predictions and forecasts from Earth System and Earth science models into the decision-making processes of partnering agency and national organizations. Due to its flexible software design, LIS can serve both as a Problem Solving Environment (PSE) for hydrologic research to enable accurate global water and energy cycle predictions, and as a Decision Support System (DSS) to generate useful information for application areas including disaster management, water resources management, agricultural management, numerical weather prediction, air quality and military mobility assessment. LIS has evolved from two earlier efforts -- the North American Land Data Assimilation System (NLDAS) and the Global Land Data Assimilation System (GLDAS) -- that focused primarily on improving numerical weather prediction skills by improving the characterization of the land surface conditions. Both GLDAS and NLDAS now use specific configurations of the LIS software in their current implementations. In addition, LIS was recently transitioned into operations at the US Air Force Weather Agency (AFWA) to ultimately replace their Agricultural Meteorology (AGRMET) system, and is also used routinely by NOAA's National Centers for Environmental Prediction (NCEP)/Environmental Modeling Center (EMC) for their land data assimilation systems to support weather and climate modeling. LIS not only consolidates the capabilities of these two systems, but also enables a much larger variety of configurations with respect to horizontal spatial resolution, input datasets and choice of land surface model through "plugins". LIS has been coupled to the Weather Research and Forecasting (WRF) model to support studies of land-atmosphere coupling by enabling ensembles of land surface states to be tested against multiple representations of the atmospheric boundary layer. LIS has also been demonstrated for parameter estimation, where it was shown that sequential remotely sensed soil moisture products can be used to derive soil hydraulic and texture properties, given a sufficient dynamic range in the soil moisture retrievals and accurate precipitation inputs. LIS has also recently been demonstrated for multi-model data assimilation using an Ensemble Kalman Filter for sequential assimilation of soil moisture, snow, and temperature. Ongoing work has demonstrated the value of bias correction as part of the filter, and also that of joint calibration and assimilation. Examples and case studies demonstrating the capabilities and impacts of LIS for hydrometeorological modeling, assimilation and parameter estimation will be presented as advancements towards the next generation of integrated observation and modeling systems.

  15. A Probabilistic Collocation Based Iterative Kalman Filter for Landfill Data Assimilation

    NASA Astrophysics Data System (ADS)

    Qiang, Z.; Zeng, L.; Wu, L.

    2016-12-01

    Due to the strong spatial heterogeneity of landfills, uncertainty is ubiquitous in the gas transport process. To accurately characterize landfill properties, the ensemble Kalman filter (EnKF) has been employed to assimilate measurements, e.g., the gas pressure. As a Monte Carlo (MC) based method, the EnKF usually requires a large ensemble size, which poses a high computational cost for large-scale problems. In this work, we propose a probabilistic collocation based iterative Kalman filter (PCIKF) to estimate permeability in a liquid-gas coupling model. This method employs polynomial chaos expansion (PCE) to represent and propagate the uncertainties of model parameters and states, and an iterative form of the Kalman filter to assimilate the current gas pressure data. To further reduce the computational cost, a functional ANOVA (analysis of variance) decomposition is conducted, and only the first-order ANOVA components are retained in the PCE. As illustrated with numerical case studies, the proposed method shows significant superiority in computational efficiency compared with the traditional MC based iterative EnKF. The developed method has promising potential for reliable prediction and management of landfill gas production.
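
    The surrogate idea behind the PCIKF can be sketched independently of the filter: fit a first-order (main-effects) polynomial chaos expansion to a handful of forward-model runs, then evaluate the cheap surrogate in place of the model. Everything below, including the stand-in forward model, is an illustrative assumption rather than the PCIKF itself:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval
rng = np.random.default_rng(6)

def model(xi):
    """Stand-in forward model (e.g., a gas-pressure response)."""
    return np.sin(xi[..., 0]) + 0.3 * xi[..., 1] ** 2

def design_matrix(xi, deg):
    """Constant term plus univariate probabilists' Hermite polynomials
    He_p in each input: a first-order (main-effects) ANOVA basis."""
    cols = [np.ones(len(xi))]
    for d in range(xi.shape[1]):
        for p in range(1, deg + 1):
            c = np.zeros(p + 1); c[p] = 1.0
            cols.append(hermeval(xi[:, d], c))
    return np.column_stack(cols)

deg, dim, n_train = 3, 2, 200
xi = rng.standard_normal((n_train, dim))            # standard-normal inputs
coef, *_ = np.linalg.lstsq(design_matrix(xi, deg), model(xi), rcond=None)

# Cheap surrogate evaluations replace expensive forward-model runs.
xi_test = rng.standard_normal((5, dim))
pred = design_matrix(xi_test, deg) @ coef
print("max surrogate error:", np.abs(pred - model(xi_test)).max())
```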

  16. Numerical study on the sequential Bayesian approach for radioactive materials detection

    NASA Astrophysics Data System (ADS)

    Qingpei, Xiang; Dongfeng, Tian; Jianyu, Zhu; Fanhua, Hao; Ge, Ding; Jun, Zeng

    2013-01-01

    A new detection method, based on the sequential Bayesian approach proposed by Candy et al., offers new horizons for research on radioactive detection. Compared with commonly adopted detection methods incorporating statistical theory, the sequential Bayesian approach offers the advantage of shorter verification times during the analysis of spectra that contain low total counts, especially for complex radionuclide compositions. In this paper, a simulation experiment platform implementing the methodology of the sequential Bayesian approach was developed. Event sequences of γ-rays associated with the true parameters of a LaBr3(Ce) detector were obtained from an event sequence generator using Monte Carlo sampling theory to study the performance of the sequential Bayesian approach. The numerical experimental results are in accordance with those of Candy. Moreover, the relationship between the detection model and the event generator, respectively represented by the expected detection rate (Am) and the tested detection rate (Gm) parameters, is investigated. To achieve optimal performance for this processor, the interval of the tested detection rate as a function of the expected detection rate is also presented.
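
    The flavour of such sequential Bayesian processing can be conveyed with a much-simplified example: each detected event updates the posterior odds that a source is present on top of background, using the likelihood of the observed inter-arrival time under each hypothesis. The rates and thresholds below are invented for illustration and this is not Candy's full formulation:

```python
import numpy as np
rng = np.random.default_rng(7)

# Sequential Bayesian test: is a source present on top of background?
# Inter-arrival times are exponential with the total count rate, so each
# detected event updates the posterior odds of the two hypotheses.
b, s = 5.0, 3.0                   # background and source count rates (cps)
rate = {False: b, True: b + s}

def sequential_posterior(dts, prior=0.5):
    """Posterior P(source | events) after each inter-arrival time."""
    log_odds = np.log(prior / (1 - prior))
    history = []
    for dt in dts:
        # Log-likelihood ratio of one exponential inter-arrival observation.
        log_odds += (np.log(rate[True]) - rate[True] * dt) \
                  - (np.log(rate[False]) - rate[False] * dt)
        history.append(1 / (1 + np.exp(-log_odds)))
    return np.array(history)

dts = rng.exponential(1 / rate[True], size=200)   # simulate a real source
post = sequential_posterior(dts)
decided = np.argmax(post > 0.99) if (post > 0.99).any() else None
print("events needed to declare a source:", decided)
```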

  17. Sequential processing of GNSS-R delay-Doppler maps (DDM's) for ocean wind retrieval

    NASA Astrophysics Data System (ADS)

    Garrison, J. L.; Rodriguez-Alvarez, N.; Hoffman, R.; Annane, B.; Leidner, M.; Kaitie, S.

    2016-12-01

    The delay-Doppler map (DDM) is the fundamental data product of GNSS-Reflectometry (GNSS-R), generated by cross-correlating the scattered signal with a local signal model over a range of delays and Doppler frequencies. Delay and Doppler form a set of coordinates on the ocean surface, and the shape of the DDM is related to the distribution of ocean slopes. Wind speed can thus be estimated by fitting a scattering model to the shape of the observed DDM or by defining an observable (e.g. average power or leading edge slope) which characterizes the change in DDM shape. For spaceborne measurements, the DDM is composed of signals scattered from a glistening zone, which can extend for 100 km or more. Setting a reasonable resolution requirement (25 km or less) limits the usable portion of the DDM at each observation to only a small region near the specular point. Cyclone-GNSS (CYGNSS) is a NASA mission to study developing tropical cyclones using GNSS-R. CYGNSS science requirements call for wind retrieval with an accuracy of 10 percent above 20 m/s within a 25 km resolution. This requirement can be met using an observable defined for DDM samples between +/- 0.25 chips in delay and +/- 1 kHz in Doppler, with some filtering of the observations using a minimum threshold for range corrected gain (RCG). An improved approach, to be reviewed in this presentation, sequentially processes multiple DDMs to combine observations generated from different "looks" at the same points on the surface. Applying this sequential process to synthetic data indicates a significant improvement in wind retrieval accuracy over a 10 km grid covering a region around the specular point. The attached figure illustrates this improvement, using simulated CYGNSS DDMs generated from the wind fields of hurricanes Earl and Danielle (left). The middle plots show wind retrievals using only an observable defined within the 25 km resolution cell. The plots on the right side show the retrievals from sequential processing of multiple DDMs. Recently, the assimilation of GNSS-R retrievals into weather forecast models has been studied. The authors have begun to investigate the direct assimilation of other data products, such as the DDM itself, or the results of sequential processing.

  18. Data assimilation of citizen collected information for real-time flood hazard mapping

    NASA Astrophysics Data System (ADS)

    Sayama, T.; Takara, K. T.

    2017-12-01

    Many studies in data assimilation in hydrology have focused on the integration of satellite remote sensing and in-situ monitoring data into hydrologic or land surface models. For flood predictions, too, recent studies have demonstrated the assimilation of remotely sensed inundation information into flood inundation models. In actual flood disaster situations, citizen-collected information, including local reports by residents and rescue teams and, more recently, tweets via social media, also contains valuable information. The main interest of this study is how to effectively use such citizen-collected information for real-time flood hazard mapping. Here we propose a new data assimilation technique that is based on pre-conducted ensemble inundation simulations and sequentially updates inundation depth distributions as local data become available. The proposed method is composed of the following two steps. The first step is a weighted average of the preliminary ensemble simulations, whose weights are updated by a Bayesian approach. The second step is an optimal interpolation, where the covariance matrix is calculated from the ensemble simulations. The proposed method was applied to case studies including an actual flood event. Two situations are considered: a more idealized one, which assumes that continuous flood inundation depth information is available at multiple locations, and a more realistic one for a severe flood disaster, in which only uncertain and non-continuous information is available to be assimilated. The results show that, in the first, idealized situation, the large-scale inundation during the flooding was estimated reasonably, with an average RMSE < 0.4 m. For the second, more realistic situation, the error becomes larger (RMSE 0.5 m) and the impact of the optimal interpolation becomes comparatively less effective. Nevertheless, the applications of the proposed data assimilation method demonstrated a high potential of this method for assimilating citizen-collected information for real-time flood hazard mapping in the future.
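
    A minimal sketch of the two steps described above might look as follows, with toy ensembles and invented report locations (the real method operates on pre-computed inundation maps; this only mirrors its structure):

```python
import numpy as np
rng = np.random.default_rng(8)

# Step 1: Bayesian re-weighting of pre-computed ensemble inundation maps.
# ens: (n_members, n_cells) pre-simulated inundation depths (toy numbers).
n_members, n_cells = 30, 100
ens = np.abs(rng.normal(1.0, 0.5, (n_members, n_cells)))

def bayesian_weights(ens, obs_cells, obs_depths, sigma=0.3):
    """Weight each member by its likelihood at the reported locations."""
    logw = np.zeros(len(ens))
    for c, d in zip(obs_cells, obs_depths):
        logw += -0.5 * ((ens[:, c] - d) / sigma) ** 2
    w = np.exp(logw - logw.max())
    return w / w.sum()

# Step 2: optimal interpolation around the weighted-mean field, with the
# background covariance estimated from the (weighted) ensemble.
def optimal_interpolation(xb, B, obs_cells, obs_depths, r=0.09):
    H = np.zeros((len(obs_cells), len(xb)))
    H[np.arange(len(obs_cells)), obs_cells] = 1.0
    R = r * np.eye(len(obs_cells))
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (np.asarray(obs_depths) - H @ xb)

obs_cells, obs_depths = [10, 50, 80], [1.2, 0.8, 1.5]   # citizen reports
w = bayesian_weights(ens, obs_cells, obs_depths)
xb = w @ ens                                            # weighted mean map
anom = ens - xb
B = (anom * w[:, None]).T @ anom                        # weighted covariance
xa = optimal_interpolation(xb, B, obs_cells, obs_depths)
```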

  19. Sequential Markov chain Monte Carlo filter with simultaneous model selection for electrocardiogram signal modeling.

    PubMed

    Edla, Shwetha; Kovvali, Narayan; Papandreou-Suppappola, Antonia

    2012-01-01

    Constructing statistical models of electrocardiogram (ECG) signals, whose parameters can be used for automated disease classification, is of great importance in precluding manual annotation and providing prompt diagnosis of cardiac diseases. ECG signals consist of several segments with different morphologies (namely the P wave, QRS complex and the T wave) in a single heart beat, which can vary across individuals and diseases. Also, existing statistical ECG models exhibit a reliance upon obtaining a priori information from the ECG data by using preprocessing algorithms to initialize the filter parameters, or to define the user-specified model parameters. In this paper, we propose an ECG modeling technique using the sequential Markov chain Monte Carlo (SMCMC) filter that can perform simultaneous model selection, by adaptively choosing from different representations depending upon the nature of the data. Our results demonstrate the ability of the algorithm to track various types of ECG morphologies, including intermittently occurring ECG beats. In addition, we use the estimated model parameters as the feature set to classify between ECG signals with normal sinus rhythm and four different types of arrhythmia.

  20. Sequential Monte Carlo filter for state estimation of LiFePO4 batteries based on an online updated model

    NASA Astrophysics Data System (ADS)

    Li, Jiahao; Klee Barillas, Joaquin; Guenther, Clemens; Danzer, Michael A.

    2014-02-01

    Battery state monitoring is one of the key techniques in battery management systems, e.g. in electric vehicles. An accurate estimation can help to improve the system performance and to prolong the battery's remaining useful life. The main challenges for state estimation in LiFePO4 batteries are the flat characteristic of the open-circuit voltage over the battery state of charge (SOC) and the existence of hysteresis phenomena. Classical estimation approaches like Kalman filtering show limitations in handling nonlinear and non-Gaussian error distribution problems. In addition, uncertainties in the battery model parameters must be taken into account to describe battery degradation. In this paper, a novel model-based method combining a Sequential Monte Carlo filter with adaptive control to determine the cell SOC and its electric impedance is presented. The applicability of this dual estimator is verified using measurement data acquired from a commercial LiFePO4 cell. Due to a better handling of the hysteresis problem, results show the benefits of the proposed method against estimation with an Extended Kalman filter.

  1. Forward and inverse uncertainty quantification using multilevel Monte Carlo algorithms for an elliptic non-local equation

    DOE PAGES

    Jasra, Ajay; Law, Kody J. H.; Zhou, Yan

    2016-01-01

    Our paper considers uncertainty quantification for an elliptic nonlocal equation. In particular, it is assumed that the parameters which define the kernel in the nonlocal operator are uncertain and a priori distributed according to a probability measure. It is shown that the induced probability measure on some quantities of interest arising from functionals of the solution to the equation with random inputs is well-defined, as is the posterior distribution on parameters given observations. As the elliptic nonlocal equation cannot be solved exactly, approximate posteriors are constructed. The multilevel Monte Carlo (MLMC) and multilevel sequential Monte Carlo (MLSMC) sampling algorithms are used for a priori and a posteriori estimation, respectively, of quantities of interest. Furthermore, these algorithms reduce the amount of work needed to estimate posterior expectations, for a given level of error, relative to Monte Carlo and i.i.d. sampling from the posterior at a given level of approximation of the solution of the elliptic nonlocal equation.

  3. On the modeling of the 2010 Gulf of Mexico Oil Spill

    NASA Astrophysics Data System (ADS)

    Mariano, A. J.; Kourafalou, V. H.; Srinivasan, A.; Kang, H.; Halliwell, G. R.; Ryan, E. H.; Roffer, M.

    2011-09-01

    Two oil particle trajectory forecasting systems were developed and applied to the 2010 Deepwater Horizon Oil Spill in the Gulf of Mexico. Both systems use ocean current fields from high-resolution numerical ocean circulation model simulations, Lagrangian stochastic models to represent the unresolved sub-grid scale variability that advects oil particles, and Monte Carlo-based schemes for representing uncertain biochemical and physical processes. The first system assumes two-dimensional particle motion at the ocean surface, oil in a single state, and particle removal modeled as a Monte Carlo process parameterized by a single removal rate. Oil particles are seeded using both initial conditions based on observations and particles released at the location of the Macondo well. The initial conditions (ICs) of oil particle location for the two-dimensional surface oil trajectory forecasts are based on a fusion of all available information, including satellite-based analyses. The resulting oil map is digitized into a shape file, within which a polygon-filling software generates longitudes and latitudes with variable particle density depending on the amount of oil present in the observations for the IC. The more complex system assumes three states (light, medium, heavy) for the oil, each with a different removal rate in the Monte Carlo process, three-dimensional particle motion, and a particle size-dependent oil mixing model. Simulations from the two-dimensional forecast system produced results that qualitatively agreed with the uncertain "truth" fields. These simulations validated the use of our Monte Carlo scheme for representing oil removal by evaporation and other weathering processes. Eulerian velocity fields from data-assimilative models produced better particle trajectory distributions than a free running model with no data assimilation. Monte Carlo simulations of the three-dimensional oil particle trajectories were performed, with ensembles generated by perturbing the size of the oil particles and the fraction in a given size range released at depth, the two largest unknowns in this problem. Thirty-six realizations of the model were run with only subsurface oil releases. Averaging these results indicates that after three months about 25% of the oil remains in the water column and that most of the oil is below 800 m.
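
    The first system's ingredients, Lagrangian advection with a stochastic sub-grid term and Monte Carlo removal, can be sketched generically. The current field, diffusivity, and removal rate below are invented placeholders, not values from the forecasting systems described:

```python
import numpy as np
rng = np.random.default_rng(9)

# 2-D surface oil particle trajectories: advection by a model current field
# plus a random-walk term for unresolved sub-grid variability, with removal
# (weathering/evaporation) as a Monte Carlo process at a fixed rate.
n, dt, n_steps = 5000, 3600.0, 24 * 30        # particles, 1 h step, 30 days
removal_rate = 1.0 / (10 * 86400.0)           # e-folding removal: 10 days
K_h = 10.0                                    # horizontal diffusivity (m^2/s)

def current(x, y):
    """Stand-in for interpolated ocean-model velocities (m/s)."""
    return 0.2 * np.ones_like(x), 0.05 * np.sin(x / 5e4)

x = rng.normal(0.0, 2e3, n)                   # initial slick around the well
y = rng.normal(0.0, 2e3, n)
active = np.ones(n, bool)

for _ in range(n_steps):
    u, v = current(x, y)
    sigma = np.sqrt(2 * K_h * dt)             # random-walk step size
    x += u * dt + rng.normal(0, sigma, n)
    y += v * dt + rng.normal(0, sigma, n)
    # Monte Carlo removal: each particle survives the step with prob e^(-r dt).
    active &= rng.random(n) < np.exp(-removal_rate * dt)

print("fraction remaining:", active.mean())
```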

  4. Life detection systems.

    NASA Technical Reports Server (NTRS)

    Mitz, M. A.

    1972-01-01

    Some promising newer approaches for detecting microorganisms are discussed, giving particular attention to the integration of different methods into a single instrument. Life detection methods may be divided into biological, chemical, and cytological methods. Biological methods are based on the biological properties of assimilation, metabolism, and growth. Devices for the detection of organic materials are considered, taking into account an instrument which volatilizes, separates, and analyzes a sample sequentially. Other instrumental systems described make use of a microscope and the cytochemical staining principle.

  5. A reduced order model based on Kalman filtering for sequential data assimilation of turbulent flows

    NASA Astrophysics Data System (ADS)

    Meldi, M.; Poux, A.

    2017-10-01

    A Kalman filter based sequential estimator is presented in this work. The estimator is integrated into the structure of segregated solvers for the analysis of incompressible flows. This technique provides an augmented flow state integrating available observations into the CFD model, naturally preserving a zero-divergence condition for the velocity field. Because of the prohibitive costs associated with a complete Kalman filter application, two model reduction strategies have been proposed and assessed. These strategies dramatically reduce the increase in computational cost of the model, which can be quantified as an increase of 10%-15% with respect to the classical numerical simulation. In addition, an extended analysis of the behavior of the numerical model covariance Q has been performed. Optimized values are strongly linked to the truncation error of the discretization procedure. The estimator has been applied to the analysis of a number of test cases exhibiting increasing complexity, including turbulent flow configurations. The results show that the augmented flow successfully improves the prediction of the physical quantities investigated, even when the observation is provided in a limited region of the physical domain. In addition, the present work suggests that these data assimilation techniques, which are at an embryonic stage of development in CFD, may have the potential to be pushed even further, using the augmented prediction as a powerful tool for the optimization of the free parameters in the numerical simulation.

  6. Ensemble Kalman Filter versus Ensemble Smoother for Data Assimilation in Groundwater Modeling

    NASA Astrophysics Data System (ADS)

    Li, L.; Cao, Z.; Zhou, H.

    2017-12-01

    Groundwater modeling calls for an effective and robust integration method to fill the gap between the model and the data. The Ensemble Kalman Filter (EnKF), a real-time data assimilation method, has been increasingly applied in multiple disciplines such as petroleum engineering and hydrogeology. In this approach, the groundwater models are sequentially updated using measured data such as hydraulic head and concentration data. As an alternative to the EnKF, the Ensemble Smoother (ES) was proposed, which updates the models using all the data together and therefore has a much lower computational cost. To further improve the performance of the ES, an iterative ES was proposed that repeatedly updates the models by assimilating all measurements together. In this work, we compare the performance of the EnKF, the ES and the iterative ES using a synthetic example in groundwater modeling. The hydraulic head data, modeled on the basis of the reference conductivity field, are utilized to inversely estimate conductivities at unsampled locations. Results are evaluated in terms of the characterization of conductivity and of groundwater flow and solute transport predictions. It is concluded that: (1) the iterative ES achieves results comparable to those of the EnKF at a lower computational cost; and (2) the iterative ES outperforms the ES thanks to its iterative updating. These findings suggest that the iterative ES deserves greater attention for data assimilation in groundwater modeling.

  7. Kullback-Leibler information function and the sequential selection of experiments to discriminate among several linear models

    NASA Technical Reports Server (NTRS)

    Sidik, S. M.

    1972-01-01

    The error variance of the process, the prior multivariate normal distributions of the parameters of the models, and the prior probabilities of the models being correct are assumed to be specified. A rule for termination of sampling is proposed. Upon termination, the model with the largest posterior probability is chosen as correct. If sampling is not terminated, posterior probabilities of the models and posterior distributions of the parameters are computed, and the next experiment is chosen to maximize the expected Kullback-Leibler information function. Monte Carlo simulation experiments were performed to investigate the large and small sample behavior of the sequential adaptive procedure.

  8. The Ensemble Kalman Filter for Groundwater Plume Characterization: A Case Study.

    PubMed

    Ross, James L; Andersen, Peter F

    2018-04-17

    The Kalman filter is an efficient data assimilation tool to refine an estimate of a state variable using measured data and the variable's correlations in space and/or time. The ensemble Kalman filter (EnKF) (Evensen 2004, 2009) is a Kalman filter variant that employs Monte Carlo analysis to define the correlations that help to refine the updated state. While use of EnKF in hydrology is somewhat limited, it has been successfully applied in other fields of engineering (e.g., oil reservoir modeling, weather forecasting). Here, EnKF is used to refine a simulated groundwater tetrachloroethylene (TCE) plume that underlies the Tooele Army Depot-North (TEAD-N) in Utah, based on observations of TCE in the aquifer. The resulting EnKF-based assimilated plume is simulated forward in time to predict future plume migration. The correlations that underpin EnKF updating implicitly contain information about how the plume developed over time under the influence of complex site hydrology and variable source history, as they are predicated on multiple realizations of a well-calibrated numerical groundwater flow and transport model. The EnKF methodology is compared to an ordinary kriging-based assimilation method with respect to the accurate representation of plume concentrations in order to determine the relative efficacy of EnKF for water quality data assimilation. © 2018, National Ground Water Association.

  9. Small-Noise Analysis and Symmetrization of Implicit Monte Carlo Samplers

    DOE PAGES

    Goodman, Jonathan; Lin, Kevin K.; Morzfeld, Matthias

    2015-07-06

    Implicit samplers are algorithms for producing independent, weighted samples from multivariate probability distributions. These are often applied in Bayesian data assimilation algorithms. We use Laplace asymptotic expansions to analyze two implicit samplers in the small noise regime. Our analysis suggests a symmetrization of the algorithms that leads to improved implicit sampling schemes at a relatively small additional cost. Here, computational experiments confirm the theory and show that symmetrization is effective for small noise sampling problems.

  10. Aerosol Observability and Predictability: From Research to Operations for Chemical Weather Forecasting. Lagrangian Displacement Ensembles for Aerosol Data Assimilation

    NASA Technical Reports Server (NTRS)

    da Silva, Arlindo

    2010-01-01

    A challenge common to many constituent data assimilation applications is the fact that one observes a much smaller fraction of the phase space than one wishes to estimate. For example, remotely sensed estimates of column average concentrations are available, while one is faced with the problem of estimating 3D concentrations for initializing a prognostic model. This problem is exacerbated in the case of aerosols because the observable Aerosol Optical Depth (AOD) is not only a column integrated quantity, but also sums over a large number of species (dust, sea salt, carbonaceous and sulfate aerosols). An aerosol transport model, when driven by a high-resolution, state-of-the-art analysis of meteorological fields and realistic emissions, can produce skillful forecasts even when no aerosol data are assimilated. The main task of aerosol data assimilation is to address the bias arising from inaccurate emissions, and the Lagrangian misplacement of plumes induced by errors in the driving meteorological fields. As long as one decouples the meteorological and aerosol assimilation as we do here, the classic baroclinic growth of error is no longer the main order of business. We will describe an aerosol data assimilation scheme in which the analysis update step is conducted in observation space, using an adaptive maximum-likelihood scheme for estimating background errors in AOD space. This scheme includes explicit sequential bias estimation as in Dee and da Silva. Unlike existing aerosol data assimilation schemes, we do not obtain analysis increments of the 3D concentrations by scaling the background profiles. Instead, we exploit the Lagrangian characteristics of the problem to generate local displacement ensembles. These high-resolution, state-dependent ensembles are then used to parameterize the background errors and generate 3D aerosol increments. The algorithm is computationally efficient, running at a resolution of 1/4 degree, globally. We will present the results of assimilating AOD retrievals from MODIS (on both the Aqua and Terra satellites), using AERONET observations for validation. The impact on GEOS-5 aerosol forecasting will be fully documented.

  11. Multi-Scale Hydrometeorological Modeling, Land Data Assimilation and Parameter Estimation with the Land Information System

    NASA Technical Reports Server (NTRS)

    Peters-Lidard, Christa D.; Kumar, Sujay V.; Santanello, Joseph A., Jr.; Reichle, Rolf H.

    2009-01-01

    The Land Information System (LIS; http://lis.gsfc.nasa.gov; Kumar et al., 2006; Peters-Lidard et al., 2007) is a flexible land surface modeling framework that has been developed with the goal of integrating satellite- and ground-based observational data products and advanced land surface modeling techniques to produce optimal fields of land surface states and fluxes. As such, LIS represents a step towards the next generation land component of an integrated Earth system model. In recognition of LIS object-oriented software design, use and impact in the land surface and hydrometeorological modeling community, the LIS software was selected as a co-winner of NASA's 2005 Software of the Year award. LIS facilitates the integration of observations from Earth-observing systems and predictions and forecasts from Earth System and Earth science models into the decision-making processes of partnering agency and national organizations. Due to its flexible software design, LIS can serve both as a Problem Solving Environment (PSE) for hydrologic research to enable accurate global water and energy cycle predictions, and as a Decision Support System (DSS) to generate useful information for application areas including disaster management, water resources management, agricultural management, numerical weather prediction, air quality and military mobility assessment. LIS has evolved from two earlier efforts -- the North American Land Data Assimilation System (NLDAS; Mitchell et al. 2004) and the Global Land Data Assimilation System (GLDAS; Rodell et al. 2004) -- that focused primarily on improving numerical weather prediction skills by improving the characterization of the land surface conditions. Both GLDAS and NLDAS now use specific configurations of the LIS software in their current implementations. In addition, LIS was recently transitioned into operations at the US Air Force Weather Agency (AFWA) to ultimately replace their Agricultural Meteorology (AGRMET) system, and is also used routinely by NOAA's National Centers for Environmental Prediction (NCEP)/Environmental Modeling Center (EMC) for their land data assimilation systems to support weather and climate modeling. LIS not only consolidates the capabilities of these two systems, but also enables a much larger variety of configurations with respect to horizontal spatial resolution, input datasets and choice of land surface model through "plugins". As described in Kumar et al., 2007, and demonstrated in Case et al., 2008, and Santanello et al., 2009, LIS has been coupled to the Weather Research and Forecasting (WRF) model to support studies of land-atmosphere coupling by enabling ensembles of land surface states to be tested against multiple representations of the atmospheric boundary layer. LIS has also been demonstrated for parameter estimation, as described in Peters-Lidard et al. (2008) and Santanello et al. (2007), who showed that sequential remotely sensed soil moisture products can be used to derive soil hydraulic and texture properties, given a sufficient dynamic range in the soil moisture retrievals and accurate precipitation inputs. LIS has also recently been demonstrated for multi-model data assimilation (Kumar et al., 2008) using an Ensemble Kalman Filter for sequential assimilation of soil moisture, snow, and temperature. Ongoing work has demonstrated the value of bias correction as part of the filter, and also that of joint calibration and assimilation. Examples and case studies demonstrating the capabilities and impacts of LIS for hydrometeorological modeling, assimilation and parameter estimation will be presented as advancements towards the next generation of integrated observation and modeling systems.

  12. Data Assimilation using Artificial Neural Networks for the global FSU atmospheric model

    NASA Astrophysics Data System (ADS)

    Cintra, Rosangela; Cocke, Steven; Campos Velho, Haroldo

    2015-04-01

    Data assimilation is the process by which measurements and model predictions are combined to obtain an accurate representation of the state of the modeled system. Uncertainty is characteristic of the atmosphere and, coupled with inevitable inadequacies in observations and computer models, increases the errors in weather forecasts. Data assimilation is a technique for generating the initial condition of a weather or climate forecast. This paper shows the results of a data assimilation technique using artificial neural networks (ANN) to obtain the initial condition for the atmospheric general circulation model (AGCM) of Florida State University in the USA. The Local Ensemble Transform Kalman Filter (LETKF) is implemented with the Florida State University Global Spectral Model (FSUGSM). The ANN data assimilation is designed to emulate the initial condition from the LETKF used to run the FSUGSM. LETKF is a version of the Kalman filter that uses Monte Carlo ensembles of short-term forecasts to solve the data assimilation problem. The FSUGSM is a multilevel (27 vertical levels) spectral primitive equation model with a vertical sigma coordinate. All variables are expanded horizontally in a truncated series of spherical harmonic functions (at resolution T63), and a transform technique is applied to calculate the physical processes in real space. The LETKF data assimilation experiments are based on synthetic observations (surface pressure, absolute temperature, zonal wind component, meridional wind component and humidity). For the ANN data assimilation scheme, we use a Multilayer Perceptron (MLP-DA) with a supervised training algorithm, where the ANN receives input vectors with their corresponding response, or target, output from the LETKF scheme. An automatic tool that finds the optimal representation for these ANNs configures the MLP-DA in this experiment. After the training process, the MLP-DA scheme acts as a data assimilation function whose inputs are observations and a short-range forecast at each model grid point. The ANNs were trained with data from each month of 2001, 2002, 2003, and 2004. A hind-casting experiment of the data assimilation cycle using the MLP-DA was performed with synthetic observations for January 2005. The numerical results demonstrate the effectiveness of the ANN technique for atmospheric data assimilation, since the analyses (initial conditions) have quality similar to the LETKF analyses. The major advantage of using the MLP-DA is its computational performance, which is faster than the LETKF. The reduced computational cost allows the inclusion of a greater number of observations and new data sources and the use of higher model resolutions, which ensures the accuracy of the analysis and of its weather prediction.
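
    The emulation strategy, training a neural network on pairs of (background, observations) inputs and analysis targets produced by a reference assimilation scheme, can be sketched with off-the-shelf tools. Here the "reference analyses" are a simple synthetic blend standing in for LETKF output, and all sizes are toy choices:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
rng = np.random.default_rng(10)

# Emulate an assimilation scheme: learn the mapping
# (background, observations) -> analysis from past assimilation cycles.
n_cycles, n_grid = 2000, 8                          # toy problem size
xb = rng.normal(size=(n_cycles, n_grid))            # short-range forecasts
obs = xb + rng.normal(0, 0.5, (n_cycles, n_grid))   # synthetic observations
# "Reference" analyses: a simple weighted blend standing in for the
# ensemble-filter analyses an MLP would actually be trained to reproduce.
xa = 0.7 * xb + 0.3 * obs

mlp = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
mlp.fit(np.hstack([xb, obs]), xa)                   # supervised training

# At forecast time the trained network replaces the expensive filter.
xb_new = rng.normal(size=(1, n_grid))
obs_new = xb_new + rng.normal(0, 0.5, (1, n_grid))
analysis = mlp.predict(np.hstack([xb_new, obs_new]))
print(np.abs(analysis - (0.7 * xb_new + 0.3 * obs_new)).max())
```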

  13. Spatial interpolation of forest conditions using co-conditional geostatistical simulation

    Treesearch

    H. Todd Mowrer

    2000-01-01

    In recent work the author used the geostatistical Monte Carlo technique of sequential Gaussian simulation (s.G.s.) to investigate uncertainty in a GIS analysis of potential old-growth forest areas. The current study compares this earlier technique to that of co-conditional simulation, wherein the spatial cross-correlations between variables are included. As in the...

  14. Elucidating the biosynthesis of 2-carboxyarabinitol 1-phosphate through reduced expression of chloroplastic fructose 1,6-bisphosphate phosphatase and radiotracer studies with 14CO2

    PubMed Central

    Andralojc, P. John; Keys, Alfred J.; Kossmann, Jens; Parry, Martin A. J.

    2002-01-01

    2-Carboxyarabinitol 1-phosphate limits photosynthetic CO2 assimilation at low light because it is a potent, naturally occurring inhibitor of ribulose 1,5-bisphosphate carboxylase/oxygenase. Evidence is presented that this inhibitor is derived from chloroplastic fructose 1,6-bisphosphate. First, transgenic plants containing decreased amounts of chloroplastic fructose 1,6-bisphosphate phosphatase contained increased amounts of fructose 1,6-bisphosphate and 2-carboxyarabinitol 1-phosphate and greatly increased amounts of the putative intermediates hamamelose and 2-carboxyarabinitol, which in some cases were as abundant as sucrose. Second, French bean leaves in the light were shown to incorporate 14C from 14CO2 sequentially into fructose 1,6-bisphosphate, hamamelose bisphosphate, hamamelose monophosphate, hamamelose, and 2-carboxyarabinitol. As shown previously, 14C assimilated by photosynthesis was also incorporated into 2-carboxyarabinitol 1-phosphate during subsequent darkness. PMID:11917127

  15. On the utility of graphics cards to perform massively parallel simulation of advanced Monte Carlo methods

    PubMed Central

    Lee, Anthony; Yau, Christopher; Giles, Michael B.; Doucet, Arnaud; Holmes, Christopher C.

    2011-01-01

    We present a case study on the utility of graphics cards to perform massively parallel simulation of advanced Monte Carlo methods. Graphics cards, containing multiple Graphics Processing Units (GPUs), are self-contained parallel computational devices that can be housed in conventional desktop and laptop computers and can be thought of as prototypes of the next generation of many-core processors. For certain classes of population-based Monte Carlo algorithms they offer massively parallel simulation, with the added advantage over conventional distributed multi-core processors that they are cheap, easily accessible, easy to maintain, easy to code, dedicated local devices with low power consumption. On a canonical set of stochastic simulation examples including population-based Markov chain Monte Carlo methods and Sequential Monte Carlo methods, we find speedups from 35- to 500-fold over conventional single-threaded computer code. Our findings suggest that GPUs have the potential to facilitate the growth of statistical modelling into complex data-rich domains through the availability of cheap and accessible many-core computation. We believe the speedups we observe should motivate wider use of parallelizable simulation methods and greater methodological attention to their design. PMID:22003276

  16. Sequential updating of multimodal hydrogeologic parameter fields using localization and clustering techniques

    NASA Astrophysics Data System (ADS)

    Sun, Alexander Y.; Morris, Alan P.; Mohanty, Sitakanta

    2009-07-01

    Estimated parameter distributions in groundwater models may contain significant uncertainties because of data insufficiency. Therefore, adaptive uncertainty reduction strategies are needed to continuously improve model accuracy by fusing new observations. In recent years, various ensemble Kalman filters have been introduced as viable tools for updating high-dimensional model parameters. However, their usefulness is largely limited by the inherent assumption of Gaussian error statistics. Hydraulic conductivity distributions in alluvial aquifers, for example, are usually non-Gaussian as a result of complex depositional and diagenetic processes. In this study, we combine an ensemble Kalman filter with grid-based localization and Gaussian mixture model (GMM) clustering techniques for updating high-dimensional, multimodal parameter distributions via dynamic data assimilation. We introduce innovative strategies (e.g., block updating and dimension reduction) to effectively reduce the computational costs associated with these modified ensemble Kalman filter schemes. The developed data assimilation schemes are demonstrated numerically for identifying the multimodal heterogeneous hydraulic conductivity distributions in a binary facies alluvial aquifer. Our results show that localization and GMM clustering are very promising techniques for assimilating high-dimensional, multimodal parameter distributions, and they outperform the corresponding global ensemble Kalman filter analysis scheme in all scenarios considered.
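
    Grid-based localization commonly relies on tapering the sample covariance with a compactly supported correlation function. The sketch below applies the standard Gaspari-Cohn taper to a noisy small-ensemble covariance (a generic illustration; the paper's block-updating and GMM machinery is not reproduced):

```python
import numpy as np
rng = np.random.default_rng(11)

def gaspari_cohn(r):
    """Gaspari-Cohn 5th-order compactly supported correlation function.
    r is distance normalized by the localization half-width c (support 2c)."""
    r = np.abs(r)
    f = np.zeros_like(r)
    m1 = r <= 1.0
    m2 = (r > 1.0) & (r < 2.0)
    x = r[m1]
    f[m1] = (-0.25 * x**5 + 0.5 * x**4 + 0.625 * x**3
             - (5.0 / 3.0) * x**2 + 1.0)
    x = r[m2]
    f[m2] = (x**5 / 12.0 - 0.5 * x**4 + 0.625 * x**3
             + (5.0 / 3.0) * x**2 - 5.0 * x + 4.0 - (2.0 / 3.0) / x)
    return f

# A sample covariance from a small ensemble is noisy at long range; the Schur
# (elementwise) product with the taper suppresses spurious correlations.
n, n_ens, c = 40, 15, 5.0
truth_corr = np.exp(-np.abs(np.subtract.outer(np.arange(n), np.arange(n))) / 3.0)
L = np.linalg.cholesky(truth_corr + 1e-10 * np.eye(n))
ens = L @ rng.standard_normal((n, n_ens))
P_sample = np.cov(ens)
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
P_localized = P_sample * gaspari_cohn(dist / c)     # Schur product
print("far-field cov (raw):      ", np.abs(P_sample[0, -1]))
print("far-field cov (localized):", P_localized[0, -1])
```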

  17. PARTICLE FILTERING WITH SEQUENTIAL PARAMETER LEARNING FOR NONLINEAR BOLD fMRI SIGNALS.

    PubMed

    Xia, Jing; Wang, Michelle Yongmei

    Analyzing the blood oxygenation level dependent (BOLD) effect in functional magnetic resonance imaging (fMRI) is typically based on recent ground-breaking time series analysis techniques. This work represents a significant improvement over existing approaches to system identification using nonlinear hemodynamic models. It is important for three reasons. First, instead of using linearized approximations of the dynamics, we present a nonlinear filtering based on the sequential Monte Carlo method to capture the inherent nonlinearities in the physiological system. Second, we simultaneously estimate the hidden physiological states and the system parameters through particle filtering with sequential parameter learning to fully take advantage of the dynamic information of the BOLD signals. Third, during the learning of the unknown static parameters, we employ low-dimensional sufficient statistics for efficiency and to avoid potential degeneration of the parameters. The performance of the proposed method is validated using both simulated data and real BOLD fMRI data.
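
    A minimal sketch of the joint state-parameter idea on an invented linear-Gaussian toy model: each particle carries both a state and a parameter value, and both are resampled together against each observation. A simple kernel jitter on the parameter stands in here for the paper's sufficient-statistics learning step.

      import numpy as np

      rng = np.random.default_rng(2)
      N = 5000
      x = rng.normal(size=N)                    # hidden state particles
      theta = rng.uniform(0.5, 1.0, size=N)     # unknown decay parameter particles

      def assimilate(y_obs, x, theta, sig_obs=0.3):
          x = theta * x + rng.normal(0.0, 0.2, size=N)   # propagate each particle
          logw = -0.5 * ((y_obs - x) / sig_obs) ** 2     # Gaussian log-likelihood
          w = np.exp(logw - logw.max())
          w /= w.sum()
          idx = rng.choice(N, size=N, p=w)               # resample state and
          x, theta = x[idx], theta[idx]                  # parameter jointly
          theta = theta + rng.normal(0.0, 0.01, size=N)  # kernel jitter keeps the
          return x, theta                                # parameter cloud alive

      for y in [0.4, 0.3, 0.35, 0.25]:                   # toy observation stream
          x, theta = assimilate(y, x, theta)
      print(theta.mean())                                # posterior parameter estimate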

  18. A probabilistic drought forecasting framework: A combined dynamical and statistical approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Hongxiang; Moradkhani, Hamid; Zarekarizi, Mahkameh

    In order to improve drought forecasting skill, this study develops a probabilistic drought forecasting framework comprising dynamical and statistical modeling components. The novelty of this study is the use of data assimilation to quantify initial condition uncertainty with Monte Carlo ensemble members, rather than relying entirely on the hydrologic model or land surface model to generate a single deterministic initial condition, as currently implemented in operational drought forecasting systems. Next, the initial condition uncertainty is quantified through data assimilation and coupled with a newly developed probabilistic drought forecasting model using a copula function. The initial conditions at each forecast start date are sampled from the data assimilation ensembles for forecast initialization. Finally, seasonal drought forecasting products are generated with the updated initial conditions. This study introduces the theory behind the proposed drought forecasting system, with an application in the Columbia River Basin, Pacific Northwest, United States. Results from both synthetic and real case studies suggest that the proposed drought forecasting system significantly improves seasonal drought forecasting skill and can facilitate state drought preparation and declaration, at least three months before the official state drought declaration.

  19. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data.

    PubMed

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-12

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.

  1. Practical implementation of a particle filter data assimilation approach to estimate initial hydrologic conditions and initialize medium-range streamflow forecasts

    NASA Astrophysics Data System (ADS)

    Clark, Elizabeth; Wood, Andy; Nijssen, Bart; Mendoza, Pablo; Newman, Andy; Nowak, Kenneth; Arnold, Jeffrey

    2017-04-01

    In an automated forecast system, hydrologic data assimilation (DA) performs the valuable function of correcting raw simulated watershed model states to better represent external observations, including measurements of streamflow, snow, soil moisture, and the like. Yet the incorporation of automated DA into operational forecasting systems has been a long-standing challenge due to the complexities of the hydrologic system, which include numerous lags between state and output variations. To help demonstrate that such methods can succeed in operational automated implementations, we present results from the real-time application of an ensemble particle filter (PF) for short-range (7-day lead) ensemble flow forecasts in western US river basins. We use the System for Hydromet Applications, Research and Prediction (SHARP), developed by the National Center for Atmospheric Research (NCAR) in collaboration with the University of Washington, U.S. Army Corps of Engineers, and U.S. Bureau of Reclamation. SHARP is a fully automated platform for short-term to seasonal hydrologic forecasting applications, incorporating uncertainty in initial hydrologic conditions (IHCs) and in hydrometeorological predictions through ensemble methods. In this implementation, IHC uncertainty is estimated by propagating an ensemble of 100 temperature and precipitation time series through conceptual and physically-oriented models. The resulting ensemble of derived IHCs exhibits a broad range of possible soil moisture and snow water equivalent (SWE) states. The PF selects and/or weights and resamples the IHCs that are most consistent with external streamflow observations, and uses the particles to initialize a streamflow forecast ensemble driven by ensemble precipitation and temperature forecasts downscaled from the Global Ensemble Forecast System (GEFS). We apply this method in real-time for several basins in the western US that are important for water resources management, and perform a hindcast experiment to evaluate the utility of PF-based data assimilation on streamflow forecast skill. This presentation describes findings, including a comparison of sequential and non-sequential particle weighting methods.
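
    The central PF operation described here, weighting the initial-condition ensemble by its consistency with observed streamflow and resampling when the weights degenerate, can be sketched in a few lines. The Gaussian likelihood, flow values, and resampling threshold below are illustrative assumptions, not SHARP's configuration.

      import numpy as np

      rng = np.random.default_rng(3)
      n_part = 100
      sim_flow = rng.gamma(2.0, 10.0, size=n_part)  # flow simulated from each IHC particle
      w = np.full(n_part, 1.0 / n_part)             # running particle weights

      def update_weights(w, sim_flow, obs_flow, sigma=5.0):
          # Gaussian likelihood of the observed flow under each particle
          like = np.exp(-0.5 * ((obs_flow - sim_flow) / sigma) ** 2)
          w = w * like
          return w / w.sum()

      w = update_weights(w, sim_flow, obs_flow=22.0)
      ess = 1.0 / np.sum(w ** 2)                    # effective sample size
      if ess < n_part / 2:                          # resample only on weight collapse
          idx = rng.choice(n_part, size=n_part, p=w)
          sim_flow = sim_flow[idx]
          w = np.full(n_part, 1.0 / n_part)
      print(round(ess, 1))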

  2. A Localized Ensemble Kalman Smoother

    NASA Technical Reports Server (NTRS)

    Butala, Mark D.

    2012-01-01

    Numerous geophysical inverse problems prove difficult because the available measurements are only indirectly related to the underlying unknown dynamic state, and the physics governing the system may involve imperfect models or unobserved parameters. Data assimilation addresses these difficulties by combining the measurements and physical knowledge. The main challenge in such problems usually involves their high dimensionality, and standard statistical methods prove computationally intractable. This paper develops and addresses the theoretical convergence of a new high-dimensional Monte Carlo approach called the localized ensemble Kalman smoother.

  3. An Ensemble-Based Smoother with Retrospectively Updated Weights for Highly Nonlinear Systems

    NASA Technical Reports Server (NTRS)

    Chin, T. M.; Turmon, M. J.; Jewell, J. B.; Ghil, M.

    2006-01-01

    Monte Carlo computational methods have been introduced into data assimilation for nonlinear systems in order to alleviate the computational burden of updating and propagating the full probability distribution. By propagating an ensemble of representative states, algorithms like the ensemble Kalman filter (EnKF) and the resampled particle filter (RPF) rely on the existing modeling infrastructure to approximate the distribution based on the evolution of this ensemble. This work presents an ensemble-based smoother that is applicable to the Monte Carlo filtering schemes like EnKF and RPF. At the minor cost of retrospectively updating a set of weights for ensemble members, this smoother has demonstrated superior capabilities in state tracking for two highly nonlinear problems: the double-well potential and trivariate Lorenz systems. The algorithm does not require retrospective adaptation of the ensemble members themselves, and it is thus suited to a streaming operational mode. The accuracy of the proposed backward-update scheme in estimating non-Gaussian distributions is evaluated by comparison to the more accurate estimates provided by a Markov chain Monte Carlo algorithm.

  4. Reduction of the uncertainties in the water level-discharge relation of a 1D hydraulic model in the context of operational flood forecasting

    NASA Astrophysics Data System (ADS)

    Habert, J.; Ricci, S.; Le Pape, E.; Thual, O.; Piacentini, A.; Goutal, N.; Jonville, G.; Rochoux, M.

    2016-01-01

    This paper presents a data-driven hydrodynamic simulator based on a 1-D hydraulic solver dedicated to flood forecasting with lead times of one hour up to 24 h. The goal of the study is to reduce uncertainties in the hydraulic model and thus provide more reliable simulations and forecasts in real time for operational use by the national hydrometeorological flood forecasting center in France. Previous studies have shown that sequential assimilation of water level or discharge data makes it possible to adjust the inflows to the hydraulic network, resulting in a significant improvement of the discharge while leaving the water level state imperfect. Two strategies are proposed here to improve the water level-discharge relation in the model. First, a modeling strategy improves the description of the river bed geometry using topographic and bathymetric measurements. Second, an inverse modeling strategy locally corrects friction coefficients in the river bed and the flood plain through the assimilation of in situ water level measurements. This approach is based on an Extended Kalman filter algorithm that sequentially assimilates data to infer first the upstream and lateral inflows and then the friction coefficients. It provides a time-varying correction of the hydrological boundary conditions and hydraulic parameters. The merits of both strategies are demonstrated on the Marne catchment in France for eight validation flood events, and the January 2004 flood event is used as an illustrative example throughout the paper. The Nash-Sutcliffe criterion for water level is improved from 0.135 to 0.832 for a 12-h forecast lead time with the data assimilation strategy. These developments have been implemented at the SAMA SPC (local flood forecasting service in the Haute-Marne French department) and used for operational forecasting since 2013. They were shown to provide an efficient tool for evaluating flood risk and to improve the flood early warning system. Complementary to the deterministic forecast of the hydraulic state, an uncertainty range is estimated using off-line and on-line diagnostics. The possibilities of further extending the control vector while limiting the computational cost and the equifinality problem are finally discussed.
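
    A scalar caricature of the friction-correction step: the EKF linearizes the observation operator by finite differences and nudges the friction coefficient toward the observed water level. The stage-friction relation h_model below is a hypothetical stand-in for the 1-D hydraulic solver, and all values are invented.

      import numpy as np

      def h_model(ks):                     # toy: water level falls as Strickler ks rises
          return 4.0 / np.sqrt(ks)

      ks, P = 30.0, 25.0                   # prior friction estimate and its variance
      r = 0.05 ** 2                        # water-level observation error variance
      z_obs = 0.78                         # observed water level (m)

      eps = 1e-3 * ks                      # finite-difference Jacobian dH/dks
      H = (h_model(ks + eps) - h_model(ks)) / eps

      K = P * H / (H * P * H + r)          # scalar Kalman gain
      ks = ks + K * (z_obs - h_model(ks))  # updated friction coefficient
      P = (1.0 - K * H) * P                # updated variance
      print(ks, P)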

  5. Sensitivity analysis of a data assimilation technique for hindcasting and forecasting hydrodynamics of a complex coastal water body

    NASA Astrophysics Data System (ADS)

    Ren, Lei; Hartnett, Michael

    2017-02-01

    Accurate forecasting of coastal surface currents has become of great economic importance over the past twenty years due to marine activities such as marine renewable energy and fish farms in coastal regions. Advanced oceanographic observation systems such as satellites and radars can provide many parameters of interest, such as surface currents and waves, with fine spatial resolution in near real time. To enhance modelling capability, data assimilation (DA) techniques, which combine the available measurements with hydrodynamic models, have been used in oceanography since the 1990s. Assimilating measurements into hydrodynamic models pulls the model background states toward the observation trajectory, which then provides more accurate forecasting information. Galway Bay is an open, wind-dominated water body on which two coastal radars are deployed. An efficient and easy-to-implement sequential DA algorithm named Optimal Interpolation (OI) was used to blend radar surface current data into a three-dimensional Environmental Fluid Dynamics Code (EFDC) model. Two empirical parameters, the horizontal correlation length and the DA cycle length (CL), are inherent within OI. No guidance has previously been published regarding selection of appropriate values of these parameters or how sensitive OI DA is to variations in their values. Detailed sensitivity analysis has been performed on both of these parameters and results are presented. An appropriate value of the DA CL was determined by minimizing the Root-Mean-Square Error (RMSE) between radar data and model background states. Analysis was performed to evaluate the assimilation index (AI) of using an OI DA algorithm in the model. The AI of the half-day forecast mean vectors' directions was over 50% in the best assimilation model. The ability of OI to improve model forecasts was also assessed and is reported upon.
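
    A minimal one-dimensional sketch of an OI analysis showing where the horizontal correlation length enters; the Gaussian correlation model, grid, and error variances are illustrative assumptions rather than the EFDC configuration.

      import numpy as np

      x_grid = np.linspace(0.0, 50.0, 101)       # model grid (km)
      x_obs = np.array([10.0, 25.0, 40.0])       # radar observation locations (km)
      L = 8.0                                    # horizontal correlation length (km)
      sig_b2, sig_o2 = 0.04, 0.01                # background / obs error variances

      def corr(a, b):                            # Gaussian correlation function
          return np.exp(-0.5 * ((a[:, None] - b[None, :]) / L) ** 2)

      xb = 0.2 * np.ones_like(x_grid)            # background surface current (m/s)
      yo = np.array([0.35, 0.15, 0.28])          # radar-observed currents (m/s)
      Hxb = np.interp(x_obs, x_grid, xb)         # background mapped to obs points

      B_go = sig_b2 * corr(x_grid, x_obs)        # grid-to-obs background covariance
      B_oo = sig_b2 * corr(x_obs, x_obs)         # obs-to-obs background covariance
      W = B_go @ np.linalg.inv(B_oo + sig_o2 * np.eye(len(x_obs)))
      xa = xb + W @ (yo - Hxb)                   # OI analysis on the full grid
      print(xa.max())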

  6. Predicting geomagnetic reversals via data assimilation: a feasibility study

    NASA Astrophysics Data System (ADS)

    Morzfeld, Matthias; Fournier, Alexandre; Hulot, Gauthier

    2014-05-01

    The system of three ordinary differential equations (ODE) presented by Gissinger in [1] was shown to exhibit chaotic reversals whose statistics compared well with those from the paleomagnetic record. We explore the geophysical relevance of this low-dimensional model via data assimilation, i.e. we update the solution of the ODE with information from data of the dipole variable. The data set we use is 'SINT' (Valet et al. [2]), and it provides the signed virtual axial dipole moment over the past 2 million years. We can obtain an accurate reconstruction of these dipole data using implicit sampling (a fully nonlinear Monte Carlo sampling strategy) and assimilating 5 kyr of data per sweep. We confirm our calibration of the model using the PADM2M dipole data set of Ziegler et al. [3]. The Monte Carlo sampling strategy provides us with quantitative information about the uncertainty of our estimates, and, in principle, we can use this information for making (robust) predictions under uncertainty. We perform synthetic data experiments to explore the predictive capability of the ODE model updated by data assimilation. For each experiment, we produce 2 Myr of synthetic data (with error levels similar to the ones found in the SINT data), calibrate the model to this record, and then check if this calibrated model can reliably predict a reversal within the next 5 kyr. By performing a large number of such experiments, we can estimate the statistics that describe how reliably our calibrated model can predict a reversal of the geomagnetic field. It is found that the 1 kyr-ahead predictions of reversals produced by the model appear to be accurate and reliable. These encouraging results prompted us to also test predictions of the five reversals of the SINT (and PADM2M) data set, using a similarly calibrated model. Results will be presented and discussed. References: Gissinger, C., 2012, A new deterministic model for chaotic reversals, European Physical Journal B, 85:137. Valet, J.P., Maynadier, L. and Guyodo, Y., 2005, Geomagnetic field strength and reversal rate over the past 2 million years, Nature, 435, 802-805. Ziegler, L.B., Constable, C.G., Johnson, C.L. and Tauxe, L., 2011, PADM2M: a penalized maximum likelihood model of the 0-2 Ma paleomagnetic axial dipole moment, Geophysical Journal International, 184, 1069-1089.

  7. Sequential Geoacoustic Filtering and Geoacoustic Inversion

    DTIC Science & Technology

    2015-09-30

    This report compares compressive sensing (CS) with conventional beamforming (CBF) and the high-resolution MVDR and MUSIC methods for sequential geoacoustic filtering and inversion. CS obtains higher resolution than MVDR, even in scenarios that favor classical high-resolution methods; performance is assessed against exhaustive search, CBF, MVDR, and MUSIC using histograms from 100 Monte Carlo simulations and comparisons of localization performance versus SNR.

  8. Multilevel Mixture Kalman Filter

    NASA Astrophysics Data System (ADS)

    Guo, Dong; Wang, Xiaodong; Chen, Rong

    2004-12-01

    The mixture Kalman filter is a general sequential Monte Carlo technique for conditional linear dynamic systems. It generates samples of some indicator variables recursively based on sequential importance sampling (SIS) and integrates out the linear and Gaussian state variables conditioned on these indicators. Due to the marginalization process, the complexity of the mixture Kalman filter is quite high if the dimension of the indicator sampling space is high. In this paper, we address this difficulty by developing a new Monte Carlo sampling scheme, namely, the multilevel mixture Kalman filter. The basic idea is to make use of the multilevel or hierarchical structure of the space from which the indicator variables take values. That is, we draw samples in a multilevel fashion, beginning with sampling from the highest-level sampling space and then drawing samples from the associated subspace of the newly drawn samples in a lower-level sampling space, until reaching the desired sampling space. Such a multilevel sampling scheme can be used in conjunction with delayed estimation methods, such as the delayed-sample method, resulting in a delayed multilevel mixture Kalman filter. Examples in wireless communication, specifically coherent and noncoherent 16-QAM over flat-fading channels, are provided to demonstrate the performance of the proposed multilevel mixture Kalman filter.

  9. A Multigrid NLS-4DVar Data Assimilation Scheme with Advanced Research WRF (ARW)

    NASA Astrophysics Data System (ADS)

    Zhang, H.; Tian, X.

    2017-12-01

    The motions of the atmosphere have multiscale properties in space and/or time, and the background error covariance matrix (B) should thus contain error information at different correlation scales. To obtain an optimal analysis, the multigrid three-dimensional variational data assimilation scheme is widely used to sequentially correct errors from large to small scales. However, introducing the multigrid technique into four-dimensional variational data assimilation is not easy, due to its strong dependence on the adjoint model, which carries extremely high computational costs in coding, maintenance, and updating. In this study, the multigrid technique was introduced into the nonlinear least-squares four-dimensional variational assimilation (NLS-4DVar) method, which is an advanced four-dimensional ensemble-variational method that can be applied without invoking the adjoint models. The multigrid NLS-4DVar (MG-NLS-4DVar) scheme uses the number of grid points to control the scale, doubling this number when moving from a coarse to a finer grid. Furthermore, the MG-NLS-4DVar scheme not only retains the advantages of NLS-4DVar, but also sufficiently corrects multiscale errors to achieve a highly accurate analysis. The effectiveness and efficiency of the proposed MG-NLS-4DVar scheme were evaluated by several groups of observing system simulation experiments using the Advanced Research Weather Research and Forecasting Model. MG-NLS-4DVar outperformed NLS-4DVar, with a lower computational cost.

  10. The COsmic-ray Soil Moisture Interaction Code (COSMIC) for use in data assimilation

    NASA Astrophysics Data System (ADS)

    Shuttleworth, J.; Rosolem, R.; Zreda, M.; Franz, T.

    2013-08-01

    Soil moisture status in land surface models (LSMs) can be updated by assimilating cosmic-ray neutron intensity measured in air above the surface. This requires a fast and accurate model to calculate the neutron intensity from the profiles of soil moisture modeled by the LSM. The existing Monte Carlo N-Particle eXtended (MCNPX) model is sufficiently accurate but too slow to be practical in the context of data assimilation. Consequently an alternative and efficient model is needed which can be calibrated accurately to reproduce the calculations made by MCNPX and used to substitute for MCNPX during data assimilation. This paper describes the construction and calibration of such a model, COsmic-ray Soil Moisture Interaction Code (COSMIC), which is simple, physically based and analytic, and which, because it runs at least 50 000 times faster than MCNPX, is appropriate in data assimilation applications. The model includes simple descriptions of (a) degradation of the incoming high-energy neutron flux with soil depth, (b) creation of fast neutrons at each depth in the soil, and (c) scattering of the resulting fast neutrons before they reach the soil surface, all of which processes may have parameterized dependency on the chemistry and moisture content of the soil. The site-to-site variability in the parameters used in COSMIC is explored for 42 sample sites in the COsmic-ray Soil Moisture Observing System (COSMOS), and the comparative performance of COSMIC relative to MCNPX when applied to represent interactions between cosmic-ray neutrons and moist soil is explored. At an example site in Arizona, fast-neutron counts calculated by COSMIC from the average soil moisture profile given by an independent network of point measurements in the COSMOS probe footprint are similar to the fast-neutron intensity measured by the COSMOS probe. It was demonstrated that, when used within a data assimilation framework to assimilate COSMOS probe counts into the Noah land surface model at the Santa Rita Experimental Range field site, the calibrated COSMIC model provided an effective mechanism for translating model-calculated soil moisture profiles into aboveground fast-neutron count when applied with two radically different approaches used to remove the bias between data and model.
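
    COSMIC itself is calibrated against MCNPX, but its general structure, a depth-weighted moisture average that attenuates the aboveground count, can be caricatured in a few lines. All constants below are invented for illustration and are not COSMIC's calibrated parameters.

      import numpy as np

      z = np.linspace(0.025, 0.5, 10)            # layer mid-depths (m)
      theta = np.array([0.32, 0.30, 0.28, 0.26, 0.25,
                        0.24, 0.24, 0.23, 0.23, 0.22])  # volumetric soil moisture

      def neutron_count(theta, z, n0=500.0, a=8.0, b=4.0):
          depth_w = np.exp(-b * z)               # deeper layers contribute less
          depth_w /= depth_w.sum()
          theta_eff = np.sum(depth_w * theta)    # sensor-weighted moisture
          return n0 / (1.0 + a * theta_eff)      # wetter soil -> fewer counts

      print(neutron_count(theta, z))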

  11. Towards robust quantification and reduction of uncertainty in hydrologic predictions: Integration of particle Markov chain Monte Carlo and factorial polynomial chaos expansion

    NASA Astrophysics Data System (ADS)

    Wang, S.; Huang, G. H.; Baetz, B. W.; Ancell, B. C.

    2017-05-01

    Particle filtering techniques have been receiving increasing attention from the hydrologic community due to their ability to properly estimate model parameters and states of nonlinear and non-Gaussian systems. To facilitate a robust quantification of uncertainty in hydrologic predictions, it is necessary to explicitly examine the forward propagation and evolution of parameter uncertainties and their interactions that affect the predictive performance. This paper presents a unified probabilistic framework that merges the strengths of particle Markov chain Monte Carlo (PMCMC) and factorial polynomial chaos expansion (FPCE) algorithms to robustly quantify and reduce uncertainties in hydrologic predictions. A Gaussian anamorphosis technique is used to establish a seamless bridge between the data assimilation using the PMCMC and the uncertainty propagation using the FPCE through a straightforward transformation of posterior distributions of model parameters. The unified probabilistic framework is applied to the Xiangxi River watershed of the Three Gorges Reservoir (TGR) region in China to demonstrate its validity and applicability. Results reveal that the degree of spatial variability of soil moisture capacity is the most identifiable model parameter with the fastest convergence through the streamflow assimilation process. The potential interaction between the spatial variability in soil moisture conditions and the maximum soil moisture capacity has the most significant effect on the performance of streamflow predictions. In addition, parameter sensitivities and interactions vary in magnitude and direction over time due to temporal and spatial dynamics of hydrologic processes.

  12. Comparison of Learning Styles of Pharmacy Students and Faculty Members

    PubMed Central

    Crawford, Stephanie Y.; Alhreish, Suhail K.

    2012-01-01

    Objectives. To compare dominant learning styles of pharmacy students and faculty members and between faculty members in different tracks. Methods. The Gregorc Style Delineator (GSD) and Zubin’s Pharmacists’ Inventory of Learning Styles (PILS) were administered to students and faculty members at an urban, Midwestern college of pharmacy. Results. Based on responses from 299 students (classes of 2008, 2009, and 2010) and 59 faculty members, GSD styles were concrete sequential (48%), abstract sequential (18%), abstract random (13%), concrete random (13%), and multimodal (8%). With the PILS, dominant styles were assimilator (47%) and converger (30%). There were no significant differences between faculty member and student learning styles, nor across pharmacy student class years (p>0.05). Learning styles differed between men and women across both instruments (p<0.01), and between faculty members in tenure and clinical tracks for the GSD styles (p=0.01). Conclusion. Learning styles differed among respondents based on gender and faculty track. PMID:23275657

  13. Parallelization and implementation of approximate root isolation for nonlinear system by Monte Carlo

    NASA Astrophysics Data System (ADS)

    Khosravi, Ebrahim

    1998-12-01

    This dissertation solves the fundamental problem of isolating the real roots of nonlinear systems of equations by Monte Carlo, using an algorithm published by Bush Jones. This algorithm requires only function values and can be applied readily to complicated systems of transcendental functions. The implementation of this sequential algorithm provides scientists with the means to utilize function analysis in mathematics or other fields of science. The algorithm, however, is so computationally intensive that it is limited to a very small set of variables, making it infeasible for large systems of equations. A computational technique was also needed to investigate a methodology for preventing the algorithm from converging to the same root along different paths of computation. The research provides techniques for improving the efficiency and correctness of the algorithm. The sequential algorithm for this technique was corrected and a parallel algorithm is presented. This parallel method has been formally analyzed and is compared with other known methods of root isolation. The effectiveness, efficiency, and enhanced overall performance of the parallel processing in comparison to sequential processing are discussed. The message passing model was used for this parallel processing, and it is presented and implemented on an Intel/860 MIMD architecture. The parallel processing proposed in this research has been implemented in an ongoing high energy physics experiment: this algorithm has been used to track neutrinos in a Super-K detector. This experiment is located in Japan, and data can be processed on-line or off-line, locally or remotely.

  14. astroABC: an open source ABC Sequential Monte Carlo sampler for parameter estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jennings, E.; Madigan, M.

    Given the complexity of modern cosmological parameter inference, where we are faced with non-Gaussian data and noise, correlated systematics and multi-probe correlated data sets, the Approximate Bayesian Computation (ABC) method is a promising alternative to traditional Markov Chain Monte Carlo approaches in the case where the likelihood is intractable or unknown. The ABC method is called "likelihood free" as it avoids explicit evaluation of the likelihood by using a forward model simulation of the data which can include systematics. We introduce astroABC, an open source ABC Sequential Monte Carlo (SMC) sampler for parameter estimation. A key challenge in astrophysics is the efficient use of large multi-probe datasets to constrain high-dimensional, possibly correlated parameter spaces. With this in mind astroABC allows for massive parallelization using MPI, a framework that handles spawning of jobs across multiple nodes. A key new feature of astroABC is the ability to create MPI groups with different communicators, one for the sampler and several others for the forward model simulation, which speeds up sampling time considerably. For smaller jobs the Python multiprocessing option is also available. Other key features include: a Sequential Monte Carlo sampler, a method for iteratively adapting tolerance levels, local covariance estimates using scikit-learn's KDTree, modules for specifying an optimal covariance matrix for a component-wise or multivariate normal perturbation kernel, output and restart files that are backed up every iteration, user-defined metric and simulation methods, a module for specifying heterogeneous parameter priors including non-standard prior PDFs, a module for specifying a constant, linear, log or exponential tolerance level, and well-documented examples and sample scripts. This code is hosted online at https://github.com/EliseJ/astroABC
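
    The ABC SMC idea that astroABC implements in full (importance weights, adaptive tolerances, perturbation kernels, MPI) reduces, in its simplest form, to rejection against a shrinking tolerance schedule. A toy sketch with an invented Gaussian forward model, not astroABC's API:

      import numpy as np

      rng = np.random.default_rng(4)
      data = rng.normal(1.5, 1.0, size=200)        # "observed" data, true mu = 1.5

      def simulate(mu, n=200):                     # forward model (likelihood-free)
          return rng.normal(mu, 1.0, size=n)

      def distance(sim, obs):                      # summary-statistic distance
          return abs(sim.mean() - obs.mean())

      particles = rng.uniform(-5.0, 5.0, size=2000)        # draws from the prior
      for eps in [1.0, 0.5, 0.2, 0.1]:                     # tolerance schedule
          survivors = np.array([mu for mu in particles
                                if distance(simulate(mu), data) < eps])
          # refresh the population around survivors (a crude stand-in for an
          # SMC perturbation kernel with importance weights)
          particles = rng.choice(survivors, size=2000) + rng.normal(0, 0.1, 2000)
      print(np.mean(particles))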

  15. Snow water equivalent monitoring retrieved by assimilating passive microwave observations in a coupled snowpack evolution and microwave emission models over North-Eastern Canada

    NASA Astrophysics Data System (ADS)

    Royer, A.; Larue, F.; De Sève, D.; Roy, A.; Vionnet, V.; Picard, G.; Cosme, E.

    2017-12-01

    Over northern snow-dominated basins, the snow water equivalent (SWE) is of primary interest for spring streamflow forecasting. SWE retrievals from satellite data are still not well resolved, in particular from microwave (MW) measurements, the only type of data sensitive to snow mass. Also, the use of snowpack models is challenging due to the large uncertainties in meteorological input forcings. This project aims to improve SWE prediction by assimilation of satellite brightness temperature (TB), without any ground-based observations. The proposed approach is the coupling of a detailed multilayer snowpack model (Crocus) with a MW snow emission model (DMRT-ML). The assimilation scheme is a Sequential Importance Resampling particle filter operating on ensembles of meteorological forcings perturbed according to their respective uncertainties. Crocus simulations driven by operational meteorological forecasts from the Canadian Global Environmental Multiscale model at 10 km spatial resolution were compared to continuous daily SWE measurements over Québec, North-Eastern Canada (56° - 45°N). The results show that the maximum SWE is overestimated by 16% on average, with variations of up to +32%. This large variability could have dramatic consequences for spring flood forecasts. Results of the Crocus-DMRT-ML coupling compared to surface-based TB measurements (at 11, 19 and 37 GHz) show that the Crocus snowpack microstructure described by sticky hard spheres within DMRT has to be scaled by a snow stickiness of 0.18, significantly reducing the overall RMSE of simulated TBs. The ability of assimilation of daily TBs to correct the simulated SWE is first presented through twin experiments with synthetic data, and then with AMSR-2 satellite time series of TBs over the winter, taking into account atmospheric and forest canopy interferences (absorption and emission). The differences between TBs at 19-37 GHz and at 11-19 GHz, in vertical polarization, were assimilated. The assimilation test with synthetic data gives a SWE RMSE reduced by a factor of 2 after assimilation. Assimilation of AMSR-2 TBs shows improvement in SWE retrievals compared to continuous in-situ SWE measurements. The accuracy is discussed as a function of boreal forest density and LAI (MODIS-based data), which have significant effects.

  16. Sequential assimilation of satellite-derived vegetation and soil moisture products using SURFEX_v8.0: LDAS-Monde assessment over the Euro-Mediterranean area

    NASA Astrophysics Data System (ADS)

    Albergel, Clément; Munier, Simon; Leroux, Delphine Jennifer; Dewaele, Hélène; Fairbairn, David; Lavinia Barbu, Alina; Gelati, Emiliano; Dorigo, Wouter; Faroux, Stéphanie; Meurey, Catherine; Le Moigne, Patrick; Decharme, Bertrand; Mahfouf, Jean-Francois; Calvet, Jean-Christophe

    2017-10-01

    In this study, a global land data assimilation system (LDAS-Monde) is applied over Europe and the Mediterranean basin to increase monitoring accuracy for land surface variables. LDAS-Monde is able to ingest information from satellite-derived surface soil moisture (SSM) and leaf area index (LAI) observations to constrain the ISBA (Interactions between Soil, Biosphere and Atmosphere) land surface model (LSM) coupled with the CNRM (Centre National de Recherches Météorologiques) version of the Total Runoff Integrating Pathways (ISBA-CTRIP) continental hydrological system. It makes use of the CO2-responsive version of ISBA, which models leaf-scale physiological processes and plant growth. Transfer of water and heat in the soil relies on a multilayer diffusion scheme. SSM and LAI observations are assimilated using a simplified extended Kalman filter (SEKF), which uses finite differences from perturbed simulations to generate flow dependence between the observations and the model control variables. The latter include LAI and seven layers of soil (from 1 to 100 cm depth). A sensitivity test of the Jacobians over 2000-2012 exhibits effects related to both depth and season. It also suggests that observations of both LAI and SSM have an impact on the different control variables. From the assimilation of SSM, the LDAS is more effective in modifying soil moisture (SM) in the top layers of soil, as model sensitivity to SSM decreases with depth and has almost no impact from 60 cm downwards. From the assimilation of LAI, a strong impact on LAI itself is found. The LAI assimilation impact is more pronounced in SM layers that contain the highest fraction of roots (from 10 to 60 cm). The assimilation is more efficient in summer and autumn than in winter and spring. Results show that the LDAS works well in constraining the model to the observations and that stronger corrections are applied to LAI than to SM. A comprehensive evaluation of the assimilation impact is conducted using (i) agricultural statistics over France, (ii) river discharge observations, (iii) satellite-derived estimates of land evapotranspiration from the Global Land Evaporation Amsterdam Model (GLEAM) project and (iv) spatially gridded observation-based estimates of upscaled gross primary production and evapotranspiration from the FLUXNET network. Comparisons with these four datasets highlight neutral to highly positive improvements.

  17. Benchmarking a Soil Moisture Data Assimilation System for Agricultural Drought Monitoring

    NASA Technical Reports Server (NTRS)

    Hun, Eunjin; Crow, Wade T.; Holmes, Thomas; Bolten, John

    2014-01-01

    Despite considerable interest in the application of land surface data assimilation systems (LDAS) for agricultural drought applications, relatively little is known about the large-scale performance of such systems and, thus, the optimal methodological approach for implementing them. To address this need, this paper evaluates an LDAS for agricultural drought monitoring by benchmarking individual components of the system (i.e., a satellite soil moisture retrieval algorithm, a soil water balance model and a sequential data assimilation filter) against a series of linear models which perform the same function (i.e., have the same basic input-output structure) as the full system component. Benchmarking is based on the calculation of the lagged rank cross-correlation between the normalized difference vegetation index (NDVI) and soil moisture estimates acquired for various components of the system. Lagged soil moisture-NDVI correlations obtained using individual LDAS components versus their linear analogs reveal the degree to which non-linearities and/or complexities contained within each component actually contribute to the performance of the LDAS system as a whole. Here, a particular system based on surface soil moisture retrievals from the Land Parameter Retrieval Model (LPRM), a two-layer Palmer soil water balance model and an Ensemble Kalman filter (EnKF) is benchmarked. Results suggest significant room for improvement in each component of the system.

  18. A cross-sectional study of learning styles among continuing medical education participants.

    PubMed

    Collins, C Scott; Nanda, Sanjeev; Palmer, Brian A; Mohabbat, Arya B; Schleck, Cathy D; Mandrekar, Jayawant N; Mahapatra, Saswati; Beckman, Thomas J; Wittich, Christopher M

    2018-04-27

    Experiential learning has been suggested as a framework for planning continuing medical education (CME). We aimed to (1) determine participants' learning styles at traditional CME courses and (2) explore associations between learning styles and participant characteristics. Cross-sectional study of all participants (n = 393) at two Mayo Clinic CME courses who completed the Kolb Learning Style Inventory and provided demographic data. A total of 393 participants returned 241 surveys (response rate, 61.3%). Among the 143 participants (36.4%) who supplied complete demographic and Kolb data, Kolb learning styles included diverging (45; 31.5%), assimilating (56; 39.2%), converging (8; 5.6%), and accommodating (34; 23.8%). Associations existed between learning style and gender (p = 0.02). For most men, learning styles were diverging (23 of 63; 36.5%) and assimilating (30 of 63; 47.6%); for most women, diverging (22 of 80; 27.5%), assimilating (26 of 80; 32.5%), and accommodating (26 of 80; 32.5%). Internal medicine and psychiatry CME participants had diverse learning styles. Female participants had more variation in their learning styles than men. Teaching techniques must vary to appeal to all learners. The experiential learning theory sequentially moves a learner from Why? to What? to How? to If? to accommodate learning styles.

  19. Data assimilation for unsaturated flow models with restart adaptive probabilistic collocation based Kalman filter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Man, Jun; Li, Weixuan; Zeng, Lingzao

    2016-06-01

    The ensemble Kalman filter (EnKF) has gained popularity in hydrological data assimilation problems. As a Monte Carlo based method, a relatively large ensemble size is usually required to guarantee accuracy. As an alternative approach, the probabilistic collocation based Kalman filter (PCKF) employs polynomial chaos to approximate the original system. In this way, the sampling error can be reduced. However, PCKF suffers from the so-called "curse of dimensionality". When the system nonlinearity is strong and the number of parameters is large, PCKF can be even more computationally expensive than EnKF. Motivated by recent developments in uncertainty quantification, we propose a restart adaptive probabilistic collocation based Kalman filter (RAPCKF) for data assimilation in unsaturated flow problems. During the implementation of RAPCKF, the important parameters are identified and active PCE basis functions are adaptively selected. The "restart" technique is used to eliminate the inconsistency between model parameters and states. The performance of RAPCKF is tested with numerical cases of unsaturated flow models. It is shown that RAPCKF is more efficient than EnKF at the same computational cost. Compared with the traditional PCKF, RAPCKF is more applicable to strongly nonlinear and high-dimensional problems.

  20. Introduction of Parallel GPGPU Acceleration Algorithms for the Solution of Radiative Transfer

    NASA Technical Reports Server (NTRS)

    Godoy, William F.; Liu, Xu

    2011-01-01

    General-purpose computing on graphics processing units (GPGPU) is a recent technique that allows the parallel graphics processing unit (GPU) to accelerate calculations performed sequentially by the central processing unit (CPU). To introduce GPGPU to radiative transfer, the Gauss-Seidel solution of the well-known expressions for 1-D and 3-D homogeneous, isotropic media is selected as a test case. Different algorithms are introduced to balance memory and GPU-CPU communication, critical aspects of GPGPU. Results show that speed-ups of one to two orders of magnitude are obtained when compared to sequential solutions. The underlying value of GPGPU is its potential extension in radiative solvers (e.g., Monte Carlo, discrete ordinates) at a minimal learning curve.

  1. Performance evaluation of an asynchronous multisensor track fusion filter

    NASA Astrophysics Data System (ADS)

    Alouani, Ali T.; Gray, John E.; McCabe, D. H.

    2003-08-01

    Recently the authors developed a new filter that uses data generated by asynchronous sensors to produce a state estimate that is optimal in the minimum mean square sense. The solution accounts for communication delays between the sensor platforms and the fusion center. It also deals with out-of-sequence data as well as latent data by processing the information in a batch-like manner. This paper compares, using simulated targets and Monte Carlo simulations, the performance of the filter to the optimal sequential processing approach. It was found that the performance of the new asynchronous multisensor track fusion filter (AMSTFF) is identical to that of the extended sequential Kalman filter (SEKF), while the new filter updates its track at a much lower rate than the SEKF.

  2. A GPU-based large-scale Monte Carlo simulation method for systems with long-range interactions

    NASA Astrophysics Data System (ADS)

    Liang, Yihao; Xing, Xiangjun; Li, Yaohang

    2017-06-01

    In this work we present an efficient implementation of Canonical Monte Carlo simulation for Coulomb many body systems on graphics processing units (GPU). Our method takes advantage of the GPU Single Instruction, Multiple Data (SIMD) architectures, and adopts the sequential updating scheme of Metropolis algorithm. It makes no approximation in the computation of energy, and reaches a remarkable 440-fold speedup, compared with the serial implementation on CPU. We further use this method to simulate primitive model electrolytes, and measure very precisely all ion-ion pair correlation functions at high concentrations. From these data, we extract the renormalized Debye length, renormalized valences of constituent ions, and renormalized dielectric constants. These results demonstrate unequivocally physics beyond the classical Poisson-Boltzmann theory.

  3. Balanced Atmospheric Data Assimilation

    NASA Astrophysics Data System (ADS)

    Hastermann, Gottfried; Reinhardt, Maria; Klein, Rupert; Reich, Sebastian

    2017-04-01

    The atmosphere's multi-scale structure poses several major challenges in numerical weather prediction. One of these arises in the context of data assimilation. The large-scale dynamics of the atmosphere are balanced in the sense that acoustic or rapid internal wave oscillations generally come with negligibly small amplitudes. If triggered artificially, however, through inappropriate initialization or by data assimilation, such oscillations can have a detrimental effect on forecast quality as they interact with the moist aerothermodynamics of the atmosphere. In the setting of sequential Bayesian data assimilation, we therefore investigate two different strategies to reduce these artificial oscillations induced by the analysis step. On the one hand, we develop a new modification for a local ensemble transform Kalman filter, which penalizes imbalances via a minimization problem. On the other hand, we modify the first steps of the subsequent forecast to push the ensemble members back to the slow evolution. We therefore propose the use of certain asymptotically consistent integrators that can blend between the balanced and the unbalanced evolution model seamlessly. In our work, we furthermore present numerical results and performance of the proposed methods for two nonlinear ordinary differential equation models, where we can identify the different scales clearly. The first one is a Lorenz 96 model coupled with a wave equation. In this case the balance relation is linear and the imbalances are caused only by the localization of the filter. The second one is the elastic double pendulum where the balance relation itself is already highly nonlinear. In both cases the methods perform very well and could significantly reduce the imbalances and therefore increase the forecast quality of the slow variables.

  4. Impact of state updating and multi-parametric ensemble for streamflow hindcasting in European river basins

    NASA Astrophysics Data System (ADS)

    Noh, S. J.; Rakovec, O.; Kumar, R.; Samaniego, L. E.

    2015-12-01

    Accurate and reliable streamflow prediction is essential to mitigate social and economic damage coming from water-related disasters such as flood and drought. Sequential data assimilation (DA) may facilitate improved streamflow prediction using real-time observations to correct internal model states. In conventional DA methods such as state updating, parametric uncertainty is often ignored mainly due to practical limitations of methodology to specify modeling uncertainty with limited ensemble members. However, if parametric uncertainty related with routing and runoff components is not incorporated properly, predictive uncertainty by model ensemble may be insufficient to capture dynamics of observations, which may deteriorate predictability. Recently, a multi-scale parameter regionalization (MPR) method was proposed to make hydrologic predictions at different scales using a same set of model parameters without losing much of the model performance. The MPR method incorporated within the mesoscale hydrologic model (mHM, http://www.ufz.de/mhm) could effectively represent and control uncertainty of high-dimensional parameters in a distributed model using global parameters. In this study, we evaluate impacts of streamflow data assimilation over European river basins. Especially, a multi-parametric ensemble approach is tested to consider the effects of parametric uncertainty in DA. Because augmentation of parameters is not required within an assimilation window, the approach could be more stable with limited ensemble members and have potential for operational uses. To consider the response times and non-Gaussian characteristics of internal hydrologic processes, lagged particle filtering is utilized. The presentation will be focused on gains and limitations of streamflow data assimilation and multi-parametric ensemble method over large-scale basins.

  5. Pattern Recognition for a Flight Dynamics Monte Carlo Simulation

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; Hurtado, John E.

    2011-01-01

    The design, analysis, and verification and validation of a spacecraft rely heavily on Monte Carlo simulations. Modern computational techniques can generate large amounts of Monte Carlo data, but flight dynamics engineers lack the time and resources to analyze it all. The growing amount of data combined with the diminished available time of engineers motivates the need to automate the analysis process. Pattern recognition algorithms are an innovative way of analyzing flight dynamics data efficiently. They can search large data sets for specific patterns and highlight critical variables so analysts can focus their analysis efforts. This work combines a few tractable pattern recognition algorithms with basic flight dynamics concepts to build a practical analysis tool for Monte Carlo simulations. Current results show that this tool can quickly and automatically identify individual design parameters, and most importantly, specific combinations of parameters that should be avoided in order to prevent specific system failures. The current version uses a kernel density estimation algorithm and a sequential feature selection algorithm combined with a k-nearest neighbor classifier to find and rank important design parameters. This provides an increased level of confidence in the analysis and saves a significant amount of time.
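
    The ranking step described above can be sketched as greedy sequential forward selection wrapped around a k-nearest-neighbor classifier. The scikit-learn calls below are one possible implementation, and the data are random placeholders for dispersed design parameters and pass/fail outcomes, not the paper's tool.

      import numpy as np
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(5)
      X = rng.normal(size=(500, 8))                   # 8 dispersed design parameters
      y = (X[:, 2] + 0.5 * X[:, 5] > 0).astype(int)   # failures driven by params 2, 5

      selected, remaining = [], list(range(8))
      for _ in range(3):                              # pick 3 most informative params
          scores = [cross_val_score(KNeighborsClassifier(5),
                                    X[:, selected + [j]], y, cv=5).mean()
                    for j in remaining]
          best = remaining[int(np.argmax(scores))]    # greedily keep the best add-on
          selected.append(best)
          remaining.remove(best)
      print(selected)                                 # should typically rank 2 and 5 first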

  6. Adaptive Sequential Monte Carlo for Multiple Changepoint Analysis

    DOE PAGES

    Heard, Nicholas A.; Turcotte, Melissa J. M.

    2016-05-21

    Process monitoring and control requires detection of structural changes in a data stream in real time. This paper introduces an efficient sequential Monte Carlo algorithm designed for learning unknown changepoints in continuous time. The method is intuitively simple: new changepoints for the latest window of data are proposed by conditioning only on data observed since the most recent estimated changepoint, as these observations carry most of the information about the current state of the process. The proposed method shows improved performance over the current state of the art. Another advantage of the proposed algorithm is that it can be made adaptive, varying the number of particles according to the apparent local complexity of the target changepoint probability distribution. This saves valuable computing time when changes in the changepoint distribution are negligible, and enables re-balancing of the importance weights of existing particles when a significant change in the target distribution is encountered. The plain and adaptive versions of the method are illustrated using the canonical continuous time changepoint problem of inferring the intensity of an inhomogeneous Poisson process, although the method is generally applicable to any changepoint problem. Performance is demonstrated using both conjugate and non-conjugate Bayesian models for the intensity. Lastly, appendices to the article are available online, illustrating the method on other models and applications.

  7. A Sequential Monte Carlo Approach for Streamflow Forecasting

    NASA Astrophysics Data System (ADS)

    Hsu, K.; Sorooshian, S.

    2008-12-01

    As alternatives to traditional physically-based models, Artificial Neural Network (ANN) models offer some advantages with respect to the flexibility of not requiring the precise quantitative mechanism of the process and the ability to train themselves from the data directly. In this study, an ANN model was used to generate one-day-ahead streamflow forecasts from the precipitation input over a catchment. Meanwhile, the ANN model parameters were trained using a Sequential Monte Carlo (SMC) approach, namely the Regularized Particle Filter (RPF). The SMC approaches are known for their capabilities in tracking the states and parameters of a nonlinear dynamic process based on Bayes' rule and effective sampling and resampling strategies. In this study, five years of daily rainfall and streamflow measurements were used for model training. Variable sample sizes of the RPF, from 200 to 2000, were tested. The results show that, with 1000 or more RPF samples, the simulation statistics, in terms of correlation coefficient, root mean square error, and bias, were stabilized. It is also shown that the forecasted daily flows fit the observations very well, with a correlation coefficient higher than 0.95. The results of RPF simulations were also compared with those from the popular back-propagation ANN training approach. The pros and cons of using the SMC approach and the traditional back-propagation approach will be discussed.

  8. Impact of Diurnal Variations of Precursors on the Prediction of Ozone

    NASA Astrophysics Data System (ADS)

    Hamer, P. D.; Bowman, K. W.; Henze, D. K.; Singh, K.

    2009-12-01

    Using a photochemical box model and its adjoint, constructed using the Kinetic Pre-Processor, we investigate the impacts of changing observational capacity, observation frequency and quality upon the ability to both understand and predict the nature of peak ozone events within a variety of polluted environments. The model consists of a chemical mechanism based on the Master Chemical Mechanism utilising 171 chemical species and 524 chemical reactions interacting with emissions, dry deposition and mixing schemes. The model was run under a variety of conditions designed to simulate a range of summertime polluted environments spanning a range of NOx and volatile organic compound regimes (VOCs). Using the forward model we were able to generate simulated atmospheric conditions representative of a particular polluted environment, which could in turn be used to generate a set of pseudo observations of key photochemical constituents. The model was then run under somewhat less polluted conditions to generate a background and then perturbed back towards the polluted trajectory using sequential data assimilation and the pseudo observations. Using a combination of the adjoint sensitivity analysis and the sequential data assimilation described here we assess the optimal time of observation and the diversity of observed chemical species required to provide acceptable forecast estimates of ozone concentrations. As the photochemical regime changes depending on NOx and VOC concentrations different observing strategies become favourable. The impact of using remote sensing based observations of the free tropospheric photochemical state are investigated to demonstrate the advantage of gaining knowledge of atmospheric trace gases away from the immediate photochemical environment.

  9. Initializing carbon cycle predictions from the Community Land Model by assimilating global biomass observations

    NASA Astrophysics Data System (ADS)

    Fox, A. M.; Hoar, T. J.; Smith, W. K.; Moore, D. J.

    2017-12-01

    The locations and longevity of terrestrial carbon sinks remain uncertain; however, it is clear that in order to predict long-term climate changes the role of the biosphere in surface energy and carbon balance must be understood and incorporated into earth system models (ESMs). Aboveground biomass, the amount of carbon stored in vegetation, is a key component of the terrestrial carbon cycle, representing the balance of uptake through gross primary productivity (GPP) and losses from respiration, senescence and mortality over hundreds of years. The best predictions of current and future land-atmosphere fluxes are likely to come from the integration of process-based knowledge contained in models with information from observations of changes in carbon stocks using data assimilation (DA). By exploiting long time series, it is possible to accurately detect variability and change in carbon cycle dynamics through monitoring ecosystem states, for example biomass derived from vegetation optical depth (VOD), and to use this information to initialize models before making predictions. To make maximum use of information about the current state of global ecosystems when using models, we have developed a system that combines the Community Land Model (CLM) with the Data Assimilation Research Testbed (DART), a community tool for ensemble DA. This DA system is highly innovative in its complexity, completeness and capabilities. Here we describe a series of activities, using both Observation System Simulation Experiments (OSSEs) and real observations, that have allowed us to quantify the potential impact of assimilating VOD data into CLM-DART on future land-atmosphere fluxes. VOD data are particularly suitable for this activity due to their long temporal coverage and appropriate scale when combined with CLM, but their absolute values rely on many assumptions. Therefore, we have had to assess the implications of the VOD retrieval algorithms, with an emphasis on detecting uncertainty due to assumptions and inputs in the algorithms that are incompatible with those encoded within CLM. It is probable that VOD describes changes in biomass more accurately than absolute values, so in addition to sequential assimilation of observations, we have tested alternative filter algorithms and assimilating VOD anomalies.

  10. Analysis and Assessment of Operation Risk for Hybrid AC/DC Power System based on the Monte Carlo Method

    NASA Astrophysics Data System (ADS)

    Hu, Xiaojing; Li, Qiang; Zhang, Hao; Guo, Ziming; Zhao, Kun; Li, Xinpeng

    2018-06-01

    Based on the Monte Carlo method, an improved risk assessment method for hybrid AC/DC power systems with VSC stations is proposed, considering the operation status of generators, converter stations, AC lines and DC lines. According to the sequential AC/DC power flow algorithm, node voltages and line active powers are solved, and the operation risk indices for node voltage over-limit and line active power over-limit are then calculated. Finally, an improved two-area IEEE RTS-96 system is taken as a case to analyze and assess its operation risk. The results show that the proposed model and method can intuitively and directly identify the weak nodes and weak lines of the system, which can provide a reference for the dispatching department.
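
    The Monte Carlo loop itself is simple; the following toy version (our own three-line "system" with a placeholder power flow, not the IEEE RTS-96 case or a real sequential AC/DC solver) shows how sampled outage states accumulate into an over-limit risk index:

        import numpy as np

        rng = np.random.default_rng(2)
        n_trials = 20000
        outage_rate = np.array([0.02, 0.04, 0.01])    # assumed forced-outage rates

        def toy_power_flow(up):
            # Placeholder for the sequential AC/DC power flow: surviving lines
            # share a 300 MW transfer equally; each line is rated at 140 MW.
            n_up = max(int(up.sum()), 1)
            return np.where(up, 300.0 / n_up, 0.0)

        over_limit = 0
        for _ in range(n_trials):
            up = rng.random(3) > outage_rate          # sample line up/down states
            if (toy_power_flow(up) > 140.0).any():    # line active power over-limit?
                over_limit += 1
        print("line over-limit risk index:", over_limit / n_trials)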

  11. Population Annealing Monte Carlo for Frustrated Systems

    NASA Astrophysics Data System (ADS)

    Amey, Christopher; Machta, Jonathan

    Population annealing is a sequential Monte Carlo algorithm that efficiently simulates equilibrium systems with rough free energy landscapes such as spin glasses and glassy fluids. A large population of configurations is initially thermalized at high temperature and then cooled to low temperature according to an annealing schedule. The population is kept in thermal equilibrium at every annealing step via resampling configurations according to their Boltzmann weights. Population annealing is comparable to parallel tempering in terms of efficiency, but has several distinct and useful features. In this talk I will give an introduction to population annealing and present recent progress in understanding its equilibration properties and optimizing it for spin glasses. Results from large-scale population annealing simulations for the Ising spin glass in 3D and 4D will be presented. NSF Grant DMR-1507506.
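
    In code, the essential reweight-and-resample step is compact. A minimal population-annealing sketch for a toy one-dimensional Ising chain (our example; the talk concerns 3D and 4D spin glasses) looks like this:

        import numpy as np

        rng = np.random.default_rng(3)
        N, R = 32, 1000                       # spins per replica, population size

        def energy(pop):
            # Nearest-neighbour ferromagnetic coupling, open boundaries
            return -np.sum(pop[:, :-1] * pop[:, 1:], axis=1)

        pop = rng.choice([-1, 1], size=(R, N))        # "thermalized" at beta = 0
        betas = np.linspace(0.0, 1.0, 21)             # annealing schedule
        rows = np.arange(R)
        for b0, b1 in zip(betas[:-1], betas[1:]):
            w = np.exp(-(b1 - b0) * energy(pop))      # Boltzmann reweighting factors
            pop = pop[rng.choice(R, size=R, p=w / w.sum())]   # resample population
            for _ in range(5 * N):                    # Metropolis equilibration at b1
                i = rng.integers(N, size=R)           # one proposed flip per replica
                left = np.where(i > 0, pop[rows, np.maximum(i - 1, 0)], 0)
                right = np.where(i < N - 1, pop[rows, np.minimum(i + 1, N - 1)], 0)
                dE = 2.0 * pop[rows, i] * (left + right)
                accept = rng.random(R) < np.exp(-b1 * dE)
                pop[rows[accept], i[accept]] *= -1
        print("mean energy per spin:", energy(pop).mean() / N)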

  12. Improved Assimilation of Streamflow and Satellite Soil Moisture with the Evolutionary Particle Filter and Geostatistical Modeling

    NASA Astrophysics Data System (ADS)

    Yan, Hongxiang; Moradkhani, Hamid; Abbaszadeh, Peyman

    2017-04-01

    Assimilation of satellite soil moisture and streamflow data into hydrologic models has received increasing attention over the past few years. Currently, these observations are increasingly used to improve model streamflow and soil moisture predictions. However, the performance of this land data assimilation (DA) system still suffers from two limitations: 1) satellite data scarcity and quality; and 2) particle weight degeneration. In order to overcome these two limitations, we propose two possible solutions in this study. First, a general Gaussian geostatistical approach is proposed to overcome the limitation in the space/time resolution of satellite soil moisture products, thus improving their accuracy at uncovered/biased grid cells. Second, an evolutionary PF approach based on Genetic Algorithms (GA) and Markov Chain Monte Carlo (MCMC), the so-called EPF-MCMC, is developed to further reduce weight degeneration and improve the robustness of the land DA system. This study provides a detailed analysis of the joint and separate assimilation of streamflow and satellite soil moisture into a distributed Sacramento Soil Moisture Accounting (SAC-SMA) model, with the use of the recently developed EPF-MCMC and the general Gaussian geostatistical approach. Performance is assessed over several basins in the USA selected from the Model Parameter Estimation Experiment (MOPEX) and located in different climate regions. The results indicate that: 1) the general Gaussian approach can predict the soil moisture at uncovered grid cells within the expected satellite data quality threshold; 2) assimilation of satellite soil moisture inferred from the general Gaussian model can significantly improve the soil moisture predictions; and 3) in terms of both deterministic and probabilistic measures, the EPF-MCMC can achieve better streamflow predictions. These results suggest that the geostatistical model is a helpful tool to aid the remote sensing technique and that the EPF-MCMC is a reliable and effective DA approach for hydrologic applications.
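
    A hedged sketch of the "particle filter with moves" idea (our toy scalar state-space model; the authors' EPF-MCMC additionally uses GA operators, which we omit): after SIR resampling, each particle is jittered by a Metropolis-Hastings step targeting the current filtering density, which counteracts weight degeneration:

        import numpy as np

        rng = np.random.default_rng(4)
        n_p, sig_q, sig_r = 500, 0.5, 0.8     # particles, process and obs noise

        def log_target(x, x_prev, y):
            # log p(y | x) + log p(x | x_prev) for x_t = 0.9 x_{t-1} + noise
            return -0.5 * ((y - x) / sig_r) ** 2 - 0.5 * ((x - 0.9 * x_prev) / sig_q) ** 2

        x = rng.normal(0.0, 1.0, n_p)
        truth = 0.0
        for t in range(50):
            truth = 0.9 * truth + rng.normal(0.0, sig_q)     # synthetic truth
            y = truth + rng.normal(0.0, sig_r)               # observation
            x_prev = x.copy()
            x = 0.9 * x_prev + rng.normal(0.0, sig_q, n_p)   # propagate particles
            w = np.exp(-0.5 * ((y - x) / sig_r) ** 2)        # importance weights
            idx = rng.choice(n_p, n_p, p=w / w.sum())        # SIR resampling
            x, x_prev = x[idx], x_prev[idx]
            prop = x + rng.normal(0.0, 0.2, n_p)             # MCMC move proposal
            accept = np.log(rng.random(n_p)) < log_target(prop, x_prev, y) - log_target(x, x_prev, y)
            x[accept] = prop[accept]                         # diversify particles
        print(f"truth {truth:.2f}, filter mean {x.mean():.2f}")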

  13. Improving Computational Efficiency of Prediction in Model-based Prognostics Using the Unscented Transform

    DTIC Science & Technology

    2010-10-01

    bodies becomes greater as surface asperities wear down (Hutchings, 1992). We characterize friction damage by a change in the friction coefficient...points are such a set, and satisfy an additional constraint in which the skew (third moment) is minimized, which reduces the average error for a...On sequential Monte Carlo sampling methods for Bayesian filtering. Statistics and Computing, 10, 197–208. Hutchings, I. M. (1992). Tribology: friction
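
    The record above is a truncated DTIC snippet, but the unscented transform it refers to is easy to illustrate. A standard symmetric sigma-point construction (our own toy nonlinearity; kappa is the usual scaling parameter) propagates a mean and covariance through a nonlinear function without Monte Carlo sampling:

        import numpy as np

        def unscented_transform(mean, cov, f, kappa=1.0):
            n = len(mean)
            L = np.linalg.cholesky((n + kappa) * cov)          # matrix square root
            sigma = np.vstack([mean, mean + L.T, mean - L.T])  # 2n+1 sigma points
            w = np.full(2 * n + 1, 0.5 / (n + kappa))          # symmetric weights
            w[0] = kappa / (n + kappa)
            ys = np.array([f(s) for s in sigma])               # propagate each point
            y_mean = w @ ys
            y_cov = (ys - y_mean).T @ np.diag(w) @ (ys - y_mean)
            return y_mean, y_cov

        m, P = np.array([1.0, 0.5]), np.diag([0.1, 0.2])
        ym, yc = unscented_transform(m, P, lambda x: np.array([x[0] * x[1], x[0] ** 2]))
        print(ym, yc)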

  14. Respiratory Nitrate Ammonification by Shewanella oneidensis MR-1

    PubMed Central

    Cruz-García, Claribel; Murray, Alison E.; Klappenbach, Joel A.; Stewart, Valley; Tiedje, James M.

    2007-01-01

    Anaerobic cultures of Shewanella oneidensis MR-1 grown with nitrate as the sole electron acceptor exhibited sequential reduction of nitrate to nitrite and then to ammonium. Little dinitrogen and nitrous oxide were detected, and no growth occurred on nitrous oxide. A mutant with the napA gene encoding periplasmic nitrate reductase deleted could not respire or assimilate nitrate and did not express nitrate reductase activity, confirming that the NapA enzyme is the sole nitrate reductase. Hence, S. oneidensis MR-1 conducts respiratory nitrate ammonification, also termed dissimilatory nitrate reduction to ammonium, but not respiratory denitrification. PMID:17098906

  15. Lead Isotope Compositions of Acid Residues from Olivine-Phyric Shergottite Tissint: Implications for Heterogeneous Shergottite Source Reservoirs

    NASA Technical Reports Server (NTRS)

    Moriwaki, R.; Usui, T.; Yokoyama, T.; Simon, J. I.; Jones, J. H.

    2015-01-01

    Geochemical studies of shergottites suggest that their parental magmas reflect mixtures between at least two distinct geochemical source reservoirs, producing correlations between radiogenic isotope compositions and trace element abundances. These correlations have been interpreted as indicating the presence of a reduced, incompatible-element-depleted reservoir and an oxidized, incompatible-element-enriched reservoir. The former is clearly a depleted mantle source, but there is ongoing debate regarding the origin of the enriched reservoir. Two contrasting models have been proposed regarding the location and mixing process of the two geochemical source reservoirs: (1) assimilation of oxidized crust by mantle-derived, reduced magmas, or (2) mixing of two distinct mantle reservoirs during melting. The former requires the ancient Martian crust to be the enriched source (crustal assimilation), whereas the latter requires isolation of a long-lived enriched mantle domain that probably originated from residual melts formed during solidification of a magma ocean (heterogeneous mantle model). This study conducts Pb isotope and trace element concentration analyses of sequential acid-leaching fractions (leachates and the final residues) from the geochemically depleted olivine-phyric shergottite Tissint. The results suggest that the Tissint magma is not isotopically uniform and sampled at least two geochemical source reservoirs, implying that either crustal assimilation or magma mixing would have played a role in the Tissint petrogenesis.

  16. Impact of Data Assimilation on Cost-Accuracy Tradeoff in Multi-Fidelity Models at the Example of an Infiltration Problem

    NASA Astrophysics Data System (ADS)

    Sinsbeck, Michael; Tartakovsky, Daniel

    2015-04-01

    Infiltration into top soil can be described by alternative models with different degrees of fidelity: the Richards equation and the Green-Ampt model. These models typically contain uncertain parameters and forcings, rendering predictions of the state variables uncertain as well. Within the probabilistic framework, solutions of these models are given in terms of their probability density functions (PDFs) that, in the presence of data, can be treated as prior distributions. The assimilation of soil moisture data into model predictions, e.g., via a Bayesian updating of solution PDFs, poses a question of model selection: given a significant difference in computational cost, is a lower-fidelity model preferable to its higher-fidelity counterpart? We investigate this question in the context of heterogeneous porous media, whose hydraulic properties are uncertain. While low-fidelity (reduced-complexity) models introduce a model error, their moderate computational cost makes it possible to generate more realizations, which reduces the (e.g., Monte Carlo) sampling or stochastic error. The ratio between these two errors determines the model with the smallest total error. We find that assimilation of measurements of a quantity of interest (the soil moisture content, in our example) decreases the model error, increasing the probability that the predictive accuracy of a reduced-complexity model does not fall below that of its higher-fidelity counterpart.
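
    The cost-accuracy argument can be made concrete with back-of-envelope numbers (ours, not the paper's): under a fixed CPU budget, the cheaper model affords more Monte Carlo realizations, so it wins whenever its extra model error is smaller than the sampling error it saves:

        import numpy as np

        budget = 1000.0                                      # CPU seconds available
        cost = {"richards": 10.0, "green_ampt": 0.1}         # assumed cost per realization
        model_error = {"richards": 0.0, "green_ampt": 0.05}  # assumed structural bias
        sigma = 0.4                                          # spread of the uncertain output

        for name in cost:
            n = int(budget / cost[name])                     # affordable realizations
            sampling_error = sigma / np.sqrt(n)              # Monte Carlo standard error
            total = np.hypot(model_error[name], sampling_error)
            print(f"{name:10s} n = {n:5d}, total error ~ {total:.4f}")

    With these invented numbers the Green-Ampt total error (~0.050) still exceeds that of Richards (~0.040); shrinking the model error, as data assimilation does in the paper, flips the comparison.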

  17. Skill (or lack thereof) of data-model fusion techniques to provide an early warning signal for an approaching tipping point.

    PubMed

    Singh, Riddhi; Quinn, Julianne D; Reed, Patrick M; Keller, Klaus

    2018-01-01

    Many coupled human-natural systems have the potential to exhibit a highly nonlinear threshold response to external forcings resulting in fast transitions to undesirable states (such as eutrophication in a lake). Often, there are considerable uncertainties that make identifying the threshold challenging. Thus, rapid learning is critical for guiding management actions to avoid abrupt transitions. Here, we adopt the shallow lake problem as a test case to compare the performance of four common data assimilation schemes to predict an approaching transition. In order to demonstrate the complex interactions between management strategies and the ability of the data assimilation schemes to predict eutrophication, we also analyze our results across two different management strategies governing phosphorus emissions into the shallow lake. The compared data assimilation schemes are: ensemble Kalman filtering (EnKF), particle filtering (PF), pre-calibration (PC), and Markov Chain Monte Carlo (MCMC) estimation. While differing in their core assumptions, each data assimilation scheme is based on Bayes' theorem and updates prior beliefs about a system based on new information. For large computational investments, EnKF, PF and MCMC show similar skill in capturing the observed phosphorus in the lake (measured as expected root mean squared prediction error). EnKF, followed by PF, displays the highest learning rates at low computational cost, thus providing a more reliable signal of an impending transition. MCMC approaches the true probability of eutrophication only after a strong signal of an impending transition emerges from the observations. Overall, we find that learning rates are greatest near regions of abrupt transitions, posing a challenge to early learning and preemptive management of systems with such abrupt transitions.

  19. Geochemical evolution of Jurassic diorites from the Bristol Lake region, California, USA, and the role of assimilation

    USGS Publications Warehouse

    Young, E.D.; Wooden, J.L.; Shieh, Y.-N.; Farber, D.

    1992-01-01

    Late Jurassic dioritic plutons from the Bristol Lake region of the eastern Mojave Desert share several geochemical attributes with high-alumina basalts, continental hawaiite basalts, and high-K arc andesites, including: high K2O concentrations; high Al2O3 (16-19 weight %); elevated Zr/TiO2; LREE (light-rare-earth-element) enrichment ((La/Yb)CN = 6.3-13.3); and high Nb. Pearce element ratio analysis supported by petrographic relations demonstrates that P, Hf, and Zr were conserved during differentiation. Abundances of conserved elements suggest that dioritic plutons from neighboring ranges were derived from similar parental melts. In the most voluminous suite, correlated variations in elemental concentrations and (87Sr/86Sr)i indicate differentiation by fractional crystallization of hornblende and plagioclase combined with assimilation of a component characterized by abundant radiogenic Sr. Levenberg-Marquardt and Monte Carlo techniques were used to obtain optimal solutions to non-linear inverse models for fractional crystallization-assimilation processes. Results show that the assimilated material was chemically analogous to lower crustal mafic granulites and that the mass ratio of contaminant to parental magma was on the order of 0.1. Lack of enrichment in 18O with differentiation is consistent with the model results. Elemental concentrations and O, Sr, and Nd isotopic data point to a hydrous REE-enriched subcontinental lithospheric source similar to that which produced some Cenozoic continental hawaiites from the southern Cordillera. Isotopic compositions of associated granitoids suggest that partial melting of this subcontinental lithosphere may have been an important process in the development of the Late Jurassic plutonic arc of the eastern Mojave Desert. © 1992 Springer-Verlag.

  20. Simulation of the Transport and Dispersion of Perfluorocarbon Tracers Released in Texas Using multiple Assimilated Meteorological Wind Fields

    NASA Astrophysics Data System (ADS)

    Schichtel, B.; Barna, M.; Gebhart, K.; Green, M.

    2002-12-01

    The Big Bend Regional Aerosol and Visibility Observational Study (BRAVO) was designed to determine the causes of visibility impairment at Big Bend National Park, located in southwestern Texas. As part of BRAVO, an intensive field study was conducted during July-October 1999. Among the features of this study was the release of unique perfluorocarbon tracers from four sites within Texas, representative of industrial/urban locations. These tracers were monitored at 21 sites throughout Texas. Other measurements collected during the field study included upper-level winds using radar profilers, and speciated fine-particulate mass concentrations. MM5 was used to simulate the regional meteorology during BRAVO, and was run in non-hydrostatic mode using a continental-scale 36 km domain with nested 12 km and 4 km domains. MM5 employed observational nudging by incorporating the available measured wind data from the National Weather Service and data from the radar wind profilers. Meteorological data from the National Weather Service's Eta Data Assimilation System (EDAS), archived at 80 km grid spacing, were also available. Several models are being used to evaluate airmass transport to Big Bend, including CMAQ, REMSAD, HYSPLIT and the CAPITA Monte Carlo Model. This combination of tracer data, meteorological data and deployment of four models provides a unique opportunity to assess the ability of the model/wind field combinations to properly simulate the regional-scale atmospheric transport and dispersion of trace gases over distances of 100 to 800 km. This paper will present the tracer simulations from REMSAD using the 36 and 12 km MM5 wind fields, and results from HYSPLIT and the Monte Carlo model driven by the 36 km MM5 and 80 km EDAS wind fields. Preliminary results from HYSPLIT and the Monte Carlo model driven by the EDAS wind fields show that these models are able to account for the primary features of tracer concentration patterns in the Big Bend area. However, at times the simulated concentration peaks preceded or followed the actual measured concentrations by about a day, and the duration of the simulated tracer impacts was shorter than that measured in the Big Bend area.

  1. A comparison of Monte Carlo-based Bayesian parameter estimation methods for stochastic models of genetic networks

    PubMed Central

    Zaikin, Alexey; Míguez, Joaquín

    2017-01-01

    We compare three state-of-the-art Bayesian inference methods for the estimation of the unknown parameters in a stochastic model of a genetic network. In particular, we introduce a stochastic version of the paradigmatic synthetic multicellular clock model proposed by Ullner et al., 2007. By introducing dynamical noise in the model and assuming that the partial observations of the system are contaminated by additive noise, we enable a principled mechanism to represent experimental uncertainties in the synthesis of the multicellular system and pave the way for the design of probabilistic methods for the estimation of any unknowns in the model. Within this setup, we tackle the Bayesian estimation of a subset of the model parameters. Specifically, we compare three Monte Carlo-based numerical methods for the approximation of the posterior probability density function of the unknown parameters given a set of partial and noisy observations of the system. The schemes we assess are the particle Metropolis-Hastings (PMH) algorithm, the nonlinear population Monte Carlo (NPMC) method and the approximate Bayesian computation sequential Monte Carlo (ABC-SMC) scheme. We present an extensive numerical simulation study, which shows that while the three techniques can effectively solve the problem, there are significant differences both in estimation accuracy and computational efficiency. PMID:28797087
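
    Of the three schemes, ABC-SMC is the simplest to sketch. The toy below (a scalar Gaussian model of our own, not the multicellular clock model of the paper) pushes a population of parameter particles through a decreasing tolerance schedule with kernel perturbations between rounds; a full ABC-SMC implementation would also carry importance weights correcting for the perturbation kernel:

        import numpy as np

        rng = np.random.default_rng(5)
        data = rng.normal(2.0, 1.0, 20)         # observations; true parameter is 2.0

        def simulate(theta):
            return rng.normal(theta, 1.0, 20)   # stochastic forward model

        def distance(x):
            return abs(x.mean() - data.mean())  # summary-statistic mismatch

        n_p = 300
        theta = rng.uniform(-5.0, 5.0, n_p)     # draws from a flat prior (round 0)
        for eps in [1.0, 0.5, 0.2, 0.1]:        # shrinking ABC tolerances
            accepted = []
            while len(accepted) < n_p:
                t = rng.choice(theta) + rng.normal(0.0, 0.3)   # perturb a particle
                if distance(simulate(t)) < eps:                # ABC acceptance test
                    accepted.append(t)
            theta = np.array(accepted)
        print("approximate posterior mean:", theta.mean())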

  2. Sequential Monte Carlo for Maximum Weight Subgraphs with Application to Solving Image Jigsaw Puzzles.

    PubMed

    Adluru, Nagesh; Yang, Xingwei; Latecki, Longin Jan

    2015-05-01

    We consider a problem of finding maximum weight subgraphs (MWS) that satisfy hard constraints in a weighted graph. The constraints specify the graph nodes that must belong to the solution as well as mutual exclusions of graph nodes, i.e., pairs of nodes that cannot belong to the same solution. Our main contribution is a novel inference approach for solving this problem in a sequential Monte Carlo (SMC) sampling framework. Usually in an SMC framework there is a natural ordering of the states of the samples. The order typically depends on observations about the states or on the annealing setup used. In many applications (e.g., image jigsaw puzzle problems), all observations (e.g., puzzle pieces) are given at once and it is hard to define a natural ordering. Therefore, we relax the assumption of having ordered observations about states and propose a novel SMC algorithm for obtaining maximum a posteriori estimate of a high-dimensional posterior distribution. This is achieved by exploring different orders of states and selecting the most informative permutations in each step of the sampling. Our experimental results demonstrate that the proposed inference framework significantly outperforms loopy belief propagation in solving the image jigsaw puzzle problem. In particular, our inference quadruples the accuracy of the puzzle assembly compared to that of loopy belief propagation.

  4. Bulk rock composition and geochemistry of olivine-hosted melt inclusions in the Grey Porri Tuff and selected lavas of the Monte dei Porri volcano, Salina, Aeolian Islands, southern Italy

    USGS Publications Warehouse

    Doherty, Angela L.; Bodnar, Robert J.; De Vivo, Benedetto; Bohrson, Wendy A.; Belkin, Harvey E.; Messina, Antonia; Tracy, Robert J.

    2012-01-01

    The Aeolian Islands are an arcuate chain of submarine seamounts and volcanic islands, lying just north of Sicily in southern Italy. The second largest of the islands, Salina, exhibits a wide range of compositional variation in its erupted products, from basaltic lavas to rhyolitic pumice. The Monte dei Porri eruptions occurred between 60 ka and 30 ka, following a period of approximately 60,000 years of repose. The bulk rock composition of the Monte dei Porri products range from basaltic-andesite scoria to andesitic pumice in the Grey Porri Tuff (GPT), with the Monte dei Porri lavas having basaltic-andesite compositions. The typical mineral assemblage of the GPT is calcic plagioclase, clinopyroxene (augite), olivine (Fo72−84) and orthopyroxene (enstatite) ± amphibole and Ti-Fe oxides. The lava units show a similar mineral assemblage, but contain lower Fo olivines (Fo57−78). The lava units also contain numerous glomerocrysts, including an unusual variety that contains quartz, K-feldspar and mica. Melt inclusions (MI) are ubiquitous in all mineral phases from all units of the Monte dei Porri eruptions; however, only data from olivine-hosted MI in the GPT are reported here. Compositions of MI in the GPT are typically basaltic (average SiO2 of 49.8 wt %) in the pumices and basaltic-andesite (average SiO2 of 55.6 wt %) in the scoriae and show a bimodal distribution in most compositional discrimination plots. The compositions of most of the MI in the scoriae overlap with bulk rock compositions of the lavas. Petrological and geochemical evidence suggest that mixing of one or more magmas and/or crustal assimilation played a role in the evolution of the Monte dei Porri magmatic system, especially the GPT. Analyses of the more evolved mineral phases are required to better constrain the evolution of the magma.

  5. Data Assimilation and Propagation of Uncertainty in Multiscale Cardiovascular Simulation

    NASA Astrophysics Data System (ADS)

    Schiavazzi, Daniele; Marsden, Alison

    2015-11-01

    Cardiovascular modeling is the application of computational tools to predict hemodynamics. State-of-the-art techniques couple a 3D incompressible Navier-Stokes solver with a boundary circulation model and can predict local and peripheral hemodynamics, analyze the post-operative performance of surgical designs and complement clinical data collection, minimizing invasive and risky measurement practices. The ability of these tools to make useful predictions is directly related to their accuracy in representing measured physiologies. Tuning of model parameters is therefore a topic of paramount importance and should include clinical data uncertainty, revealing how this uncertainty will affect the predictions. We propose a fully Bayesian, multi-level approach to data assimilation of uncertain clinical data in multiscale circulation models. To reduce the computational cost, we use a stable, condensed approximation of the 3D model built by linear sparse regression of the pressure/flow rate relationship at the outlets. Finally, we consider the problem of non-invasively propagating the uncertainty in model parameters to the resulting hemodynamics and compare Monte Carlo simulation with Stochastic Collocation approaches based on Polynomial or Multi-resolution Chaos expansions.

  6. Intra-individual diagnostic image quality and organ-specific-radiation dose comparison between spiral cCT with iterative image reconstruction and z-axis automated tube current modulation and sequential cCT.

    PubMed

    Wenz, Holger; Maros, Máté E; Meyer, Mathias; Gawlitza, Joshua; Förster, Alex; Haubenreisser, Holger; Kurth, Stefan; Schoenberg, Stefan O; Groden, Christoph; Henzler, Thomas

    2016-01-01

    To prospectively evaluate image quality and organ-specific radiation dose of spiral cranial CT (cCT) combined with automated tube current modulation (ATCM) and iterative image reconstruction (IR) in comparison to sequential tilted cCT reconstructed with filtered back projection (FBP) without ATCM. 31 patients with a previously performed tilted non-contrast-enhanced sequential cCT acquisition on a 4-slice CT system with only FBP reconstruction and no ATCM were prospectively enrolled in this study for a clinically indicated cCT scan. All spiral cCT examinations were performed on a 3rd generation dual-source CT system using ATCM in the z-axis direction. Images were reconstructed using both FBP and IR (levels 1-5). A Monte-Carlo-simulation-based analysis was used to compare organ-specific radiation dose. Subjective image quality for various anatomic structures was evaluated using a 4-point Likert scale and objective image quality was evaluated by comparing signal-to-noise ratios (SNR). Spiral cCT led to a significantly lower (p < 0.05) organ-specific radiation dose in all targets including the eye lens. Subjective image quality of spiral cCT datasets with an IR reconstruction level of 5 was rated significantly higher compared to the sequential cCT acquisitions (p < 0.0001). Consecutive mean SNR was significantly higher in all spiral datasets (FBP, IR 1-5) when compared to sequential cCT, with a mean SNR improvement of 44.77% (p < 0.0001). Spiral cCT combined with ATCM and IR allows for significant radiation dose reduction, including a reduced eye lens organ dose, when compared to a tilted sequential cCT while improving subjective and objective image quality.

  7. Sequential Data Assimilation for Seismicity: a Proof of Concept

    NASA Astrophysics Data System (ADS)

    van Dinther, Ylona; Kuensch, Hans Rudolf; Fichtner, Andreas

    2017-04-01

    Integrating geological and geophysical observations, laboratory results and physics-based numerical modeling is crucial to improve our understanding of the occurrence of large subduction earthquakes. How to do this integration is less obvious, especially in light of the scarcity and uncertainty of natural and laboratory data and the difficulty of modeling the physics governing earthquakes. One way to efficiently combine information from these sources in order to estimate states and/or parameters is data assimilation, a mathematically sound framework extensively developed for weather forecasting purposes. We demonstrate the potential of using data assimilation by applying an Ensemble Kalman Filter to recover the current, and forecast the future, state of stress and strength on the megathrust based on data from a single borehole. Data and their errors are for the first time assimilated - using the least-squares solution of Bayes' theorem - to update a Partial Differential Equation-driven seismic cycle model. This visco-elasto-plastic continuum forward model solves the Navier-Stokes equations with a rate-dependent friction coefficient. To prove this concept, we perform a perfect model test in an analogue subduction zone setting. Synthetic numerical data from a single analogue borehole are assimilated into 150 ensemble models. Since we know the true state of the numerical data model, a quantitative and qualitative evaluation shows that meaningful information on the stress and strength is available, even when only data from a single borehole are assimilated over only a part of a seismic cycle. This is possible, since the sampled error covariance matrix contains prior information on the physics that relates velocities, stresses, and pressures at the surface to those at the fault. During the analysis step, stress and strength distributions are thus reconstructed in such a way that fault coupling can be updated to either inhibit or trigger events. In the subsequent forward propagation step, the physical equations are solved to propagate the updated states forward in time and thus provide probabilistic information on the occurrence of the next analogue earthquake. At the next assimilation step(s), the system's forecasting ability turns out to be distinctly better than using a periodic model to forecast this simple, quasi-periodic sequence. Combining our knowledge of physical laws with observations thus seems to be a useful tool that could be used to improve probabilistic seismic hazard assessment and increase our physical understanding of the spatiotemporal occurrence of earthquakes, subduction zones, and other Solid Earth systems.
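
    For reference, the analysis step of a stochastic Ensemble Kalman Filter with a single observed element can be written down in a few lines (a generic sketch with invented dimensions, not the seismic cycle code; the observation operator simply picks out the "borehole" component of the state):

        import numpy as np

        rng = np.random.default_rng(6)
        n_e, n_x = 150, 10                      # ensemble size (as above), toy state size
        X = rng.normal(0.0, 1.0, (n_x, n_e))    # prior ensemble of stress/strength states
        obs_index, r = 3, 0.1**2                # observed element and its error variance
        y = 1.5                                 # the single borehole observation

        A = X - X.mean(axis=1, keepdims=True)           # ensemble anomalies
        cross = A @ A[obs_index] / (n_e - 1)            # sampled covariance P H^T
        gain = cross / (cross[obs_index] + r)           # Kalman gain (vector)
        y_pert = y + rng.normal(0.0, np.sqrt(r), n_e)   # perturbed observations
        X += np.outer(gain, y_pert - X[obs_index])      # analysis ensemble
        print("analysis mean at borehole:", X[obs_index].mean())

    The off-diagonal entries of the sampled covariance are what spread the single borehole's information to unobserved stress and strength elements, exactly the mechanism the abstract appeals to.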

  8. Carbon cycling of European croplands: A framework for the assimilation of optical and microwave Earth observation data

    NASA Astrophysics Data System (ADS)

    Revill, Andrew; Sus, Oliver; Williams, Mathew

    2013-04-01

    Croplands are traditionally managed to maximise the production of food, feed, fibre and bioenergy. Advancements in agricultural technologies, together with land-use change, have approximately doubled world grain harvests over the past 50 years. Cropland ecosystems also play a significant role in the global carbon (C) cycle and, through changes to C storage in response to management activities, they can provide opportunities for climate change mitigation. However, quantifying and understanding the cropland C cycle is complex, due to variable environmental drivers, varied management practices and often highly heterogeneous landscapes. Efforts to upscale processes using simulation models must resolve these challenges. Here we show how data assimilation (DA) approaches can link C cycle modelling to Earth observation (EO) and reduce uncertainty in upscaling. We evaluate a framework for the assimilation of leaf area index (LAI) time series, empirically derived from EO optical and radar sensors, for state-updating a model of crop development and C fluxes. Sensors are selected with fine spatial resolutions (20-50 m) to resolve variability across field sizes typically used in European agriculture. Sequential DA is used to improve the canopy development simulation, which is validated by comparing time-series LAI and net ecosystem exchange (NEE) predictions to independent ground measurements and eddy covariance observations at multiple European cereal crop sites. Significant empirical relationships were established between the LAI ground measurements and the optical reflectance and radar backscatter, which allowed single LAI calibrations to be valid for all the cropland sites for each sensor. The DA of all EO LAI estimates indicated clear adjustments in LAI and an enhanced representation of daily CO2 exchanges, particularly around the time of peak C uptake. Compared to the simulation without DA, the assimilation of all EO LAI estimates improved the predicted at-harvest cumulative NEE for all cropland sites by an average of 69%. The use of radar sensors, being relatively unaffected by cloud cover and sensitive to the structural properties of the crop, significantly improves the analyses when compared to the combined, and individual, use of the optical LAI estimates. When assimilating the radar-derived LAI only, the estimated at-harvest cumulative NEE was improved by 79% when compared to the simulation without DA. Future developments would include the spatial upscaling of the existing model framework and the assimilation of additional state variables, such as soil moisture.

  9. A parallel computational model for GATE simulations.

    PubMed

    Rannou, F R; Vega-Acevedo, N; El Bitar, Z

    2013-12-01

    GATE/Geant4 Monte Carlo simulations are computationally demanding applications, requiring thousands of processor hours to produce realistic results. The classical strategy of distributing the simulation of individual events does not apply efficiently to Positron Emission Tomography (PET) experiments, because it requires centralized coincidence processing and incurs large communication overheads. We propose a parallel computational model for GATE that handles event generation and coincidence processing in a simple and efficient way by decentralizing event generation and processing but maintaining a centralized event and time coordinator. The model is implemented with the inclusion of a new set of factory classes that can run the same executable in sequential or parallel mode. A Mann-Whitney test shows that the output produced by this parallel model in terms of number of tallies is equivalent (but not equal) to its sequential counterpart. Computational performance evaluation shows that the software is scalable and well balanced. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  10. Testing particle filters on convective scale dynamics

    NASA Astrophysics Data System (ADS)

    Haslehner, Mylene; Craig, George. C.; Janjic, Tijana

    2014-05-01

    Particle filters have been developed in recent years to deal with the highly nonlinear dynamics and non-Gaussian error statistics that also characterize data assimilation on convective scales. In this work we explore the use of the efficient particle filter (van Leeuwen, 2011) for convective-scale data assimilation applications. The method is tested in an idealized setting, on two stochastic models. The models were designed to reproduce some of the properties of convection, for example the rapid development and decay of convective clouds. The first model is a simple one-dimensional, discrete-state birth-death model of clouds (Craig and Würsch, 2012). For this model, the efficient particle filter that includes nudging the variables shows significant improvement compared to the Ensemble Kalman Filter and the Sequential Importance Resampling (SIR) particle filter. The success of the combination of nudging and resampling, measured as RMS error with respect to the 'true state', is proportional to the nudging intensity. Significantly, even a very weak nudging intensity brings notable improvement over SIR. The second model is a modified version of a stochastic shallow water model (Würsch and Craig, 2013), which contains more realistic dynamical characteristics of convective-scale phenomena. Using the efficient particle filter and different combinations of observations of the three field variables (wind, water 'height' and rain) allows the particle filter to be evaluated in comparison to a regime where only nudging is used. Sensitivity to the properties of the model error covariance is also considered. Finally, criteria are identified under which the efficient particle filter outperforms nudging alone. References: Craig, G. C. and M. Würsch, 2012: The impact of localization and observation averaging for convective-scale data assimilation in a simple stochastic model. Q. J. R. Meteorol. Soc., 139, 515-523. Van Leeuwen, P. J., 2011: Efficient non-linear data assimilation in geophysical fluid dynamics. Computers and Fluids, doi:10.1016/j.compfluid.2010.11.011. Würsch, M. and G. C. Craig, 2013: A simple dynamical model of cumulus convection for data assimilation research, submitted to Met. Zeitschrift.

  11. Assessing the benefit of snow data assimilation for runoff modeling in Alpine catchments

    NASA Astrophysics Data System (ADS)

    Griessinger, Nena; Seibert, Jan; Magnusson, Jan; Jonas, Tobias

    2016-09-01

    In Alpine catchments, snowmelt is often a major contribution to runoff. Therefore, modeling snow processes is important when concerned with flood or drought forecasting, reservoir operation and inland waterway management. In this study, we address the question of how sensitive hydrological models are to the representation of snow cover dynamics and whether the performance of a hydrological model can be enhanced by integrating data from a dedicated external snow monitoring system. As a framework for our tests we have used the hydrological model HBV (Hydrologiska Byråns Vattenbalansavdelning) in the version HBV-light, which has been applied in many hydrological studies and is also in use for operational purposes. While HBV originally follows a temperature-index approach with time-invariant calibrated degree-day factors to represent snowmelt, in this study the HBV model was modified to use snowmelt time series from an external and spatially distributed snow model as model input. The external snow model integrates three-dimensional sequential assimilation of snow monitoring data with a snowmelt model, which is also based on the temperature-index approach but uses a time-variant degree-day factor. The following three variations of this external snow model were applied: (a) the full model with assimilation of observational snow data from a dense monitoring network, (b) the same snow model but with data assimilation switched off and (c) a downgraded version of the same snow model representing snowmelt with a time-invariant degree-day factor. Model runs were conducted for 20 catchments at different elevations within Switzerland for 15 years. Our results show that at low and mid-elevations the performance of the runoff simulations did not vary considerably with the snow model version chosen. At higher elevations, however, best performance in terms of simulated runoff was obtained when using the snowmelt time series from the snow model, which utilized data assimilation. This was especially true for snow-rich years. These findings suggest that with increasing elevation and the correspondingly increased contribution of snowmelt to runoff, the accurate estimation of snow water equivalent (SWE) and snowmelt rates has gained importance.
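
    The difference between the two degree-day schemes is small in code but, per the results above, matters at high elevation. A sketch with invented forcing and parameter values (not the HBV or external snow-model source):

        import numpy as np

        days = np.arange(365)
        temp = 8.0 * np.sin(2 * np.pi * (days - 110) / 365) + 2.0   # toy daily temps (C)
        positive = np.clip(temp, 0.0, None)                  # degree-days above 0 C

        ddf_const = 3.0                                      # mm C-1 day-1, calibrated once
        ddf_seasonal = 2.0 + 2.0 * np.clip(np.sin(2 * np.pi * (days - 80) / 365), 0.0, None)

        melt_const = ddf_const * positive                    # HBV-style internal melt
        melt_var = ddf_seasonal * positive                   # external model's melt input
        # In the study, series like melt_var (further corrected by assimilating
        # snow observations) replace HBV's internal snow routine as model input.
        print(melt_const.sum(), melt_var.sum())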

  12. Simultaneous perceptual and response biases on sequential face attractiveness judgments

    PubMed Central

    Pegors, Teresa K.; Mattar, Marcelo G.; Bryan, Peter B.; Epstein, Russell A.

    2015-01-01

    Face attractiveness is a social characteristic that we often use to make first-pass judgments about the people around us. However, these judgments are highly influenced by our surrounding social world, and researchers still understand little about the mechanisms underlying these influences. In a series of three experiments, we used a novel sequential rating paradigm that enabled us to measure biases on attractiveness judgments from the previous face and the previous rating. Our results revealed two simultaneous and opposing influences on face attractiveness judgments that arise from our past experience of faces: a response bias in which attractiveness ratings shift towards a previously given rating, and a stimulus bias in which attractiveness ratings shift away from the mean attractiveness of the previous face. Furthermore, we provide evidence that the contrastive stimulus bias (but not the assimilative response bias) is strengthened by increasing the duration of the previous stimulus, suggesting an underlying perceptual mechanism. These results demonstrate that judgments of face attractiveness are influenced by information from our evaluative and perceptual history and that these influences have measurable behavioral effects over the course of just a few seconds. PMID:25867223

  13. Reliability-based trajectory optimization using nonintrusive polynomial chaos for Mars entry mission

    NASA Astrophysics Data System (ADS)

    Huang, Yuechen; Li, Haiyang

    2018-06-01

    This paper presents a reliability-based sequential optimization (RBSO) method to solve the trajectory optimization problem with parametric uncertainties in entry dynamics for a Mars entry mission. First, the deterministic entry trajectory optimization model is reviewed, and then the reliability-based optimization model is formulated. In addition, a modified sequential optimization method, in which the nonintrusive polynomial chaos expansion (PCE) method and the most probable point (MPP) searching method are employed, is proposed to solve the reliability-based optimization problem efficiently. The nonintrusive PCE method contributes to the transformation between the stochastic optimization (SO) and the deterministic optimization (DO) and to the efficient approximation of the trajectory solution. The MPP method, which assesses the reliability of constraint satisfaction only up to the necessary level, is employed to further improve the computational efficiency. The cycle comprising SO, reliability assessment and constraint updating is repeated in the RBSO until the reliability requirements of constraint satisfaction are met. Finally, the RBSO is compared with the traditional DO and the traditional sequential optimization based on Monte Carlo (MC) simulation in a specific Mars entry mission to demonstrate the effectiveness and the efficiency of the proposed method.

  14. Application of Multi-Hypothesis Sequential Monte Carlo for Breakup Analysis

    NASA Astrophysics Data System (ADS)

    Faber, W. R.; Zaidi, W.; Hussein, I. I.; Roscoe, C. W. T.; Wilkins, M. P.; Schumacher, P. W., Jr.

    As more objects are launched into space, the potential for breakup events and space object collisions is ever increasing. These events create large clouds of debris that are extremely hazardous to space operations. Providing timely, accurate, and statistically meaningful Space Situational Awareness (SSA) data is crucial in order to protect assets and operations in space. The space object tracking problem, in general, is nonlinear in both state dynamics and observations, making it ill-suited to linear filtering techniques such as the Kalman filter. Additionally, given the multi-object, multi-scenario nature of the problem, space situational awareness requires multi-hypothesis tracking and management that is combinatorially challenging in nature. In practice, it is often seen that assumptions of underlying linearity and/or Gaussianity are used to provide tractable solutions to the multiple space object tracking problem. However, these assumptions are, at times, detrimental to tracking data and provide statistically inconsistent solutions. This paper details a tractable solution to the multiple space object tracking problem applicable to space object breakup events. Within this solution, simplifying assumptions of the underlying probability density function are relaxed and heuristic methods for hypothesis management are avoided. This is done by implementing Sequential Monte Carlo (SMC) methods for both nonlinear filtering and hypothesis management. The goal of this paper is to detail the solution and use it as a platform to discuss computational limitations that hinder proper analysis of large breakup events.

  15. Multilevel Sequential² Monte Carlo for Bayesian inverse problems

    NASA Astrophysics Data System (ADS)

    Latz, Jonas; Papaioannou, Iason; Ullmann, Elisabeth

    2018-09-01

    The identification of parameters in mathematical models using noisy observations is a common task in uncertainty quantification. We employ the framework of Bayesian inversion: we combine monitoring and observational data with prior information to estimate the posterior distribution of a parameter. Specifically, we are interested in the distribution of a diffusion coefficient of an elliptic PDE. In this setting, the sample space is high-dimensional, and each sample of the PDE solution is expensive. To address these issues we propose and analyse a novel Sequential Monte Carlo (SMC) sampler for the approximation of the posterior distribution. Classical, single-level SMC constructs a sequence of measures, starting with the prior distribution, and finishing with the posterior distribution. The intermediate measures arise from a tempering of the likelihood, or, equivalently, a rescaling of the noise. The resolution of the PDE discretisation is fixed. In contrast, our estimator employs a hierarchy of PDE discretisations to decrease the computational cost. We construct a sequence of intermediate measures by decreasing the temperature or by increasing the discretisation level at the same time. This idea builds on and generalises the multi-resolution sampler proposed in P.S. Koutsourelakis (2009) [33] where a bridging scheme is used to transfer samples from coarse to fine discretisation levels. Importantly, our choice between tempering and bridging is fully adaptive. We present numerical experiments in 2D space, comparing our estimator to single-level SMC and the multi-resolution sampler.
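
    A single-level tempered SMC sampler, the baseline that the paper generalises, fits in a short script (a toy one-dimensional Gaussian inverse problem of our own, not the elliptic PDE setting): the intermediate measures come from raising the likelihood to an increasing power beta, with resampling and an MCMC move at each temperature:

        import numpy as np

        rng = np.random.default_rng(7)
        y, sig = 1.2, 0.3                           # datum and noise level

        def loglik(theta):
            return -0.5 * ((y - theta) / sig) ** 2  # Gaussian log-likelihood

        n_p = 2000
        theta = rng.normal(0.0, 1.0, n_p)           # samples from the N(0,1) prior
        betas = np.linspace(0.0, 1.0, 11)           # tempering schedule
        for b0, b1 in zip(betas[:-1], betas[1:]):
            w = np.exp((b1 - b0) * loglik(theta))   # incremental importance weights
            theta = theta[rng.choice(n_p, n_p, p=w / w.sum())]   # resample
            prop = theta + rng.normal(0.0, 0.2, n_p)             # random-walk proposal
            log_ratio = b1 * (loglik(prop) - loglik(theta)) - 0.5 * (prop**2 - theta**2)
            accept = np.log(rng.random(n_p)) < log_ratio
            theta[accept] = prop[accept]            # MCMC move at temperature b1
        print("posterior mean ~", theta.mean())     # analytic value: y / (1 + sig^2) = 1.10

    The multilevel estimator of the paper additionally lets each step either raise beta or refine the PDE discretisation, choosing adaptively between tempering and bridging.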

  16. Hydrologic Remote Sensing and Land Surface Data Assimilation.

    PubMed

    Moradkhani, Hamid

    2008-05-06

    Accurate, reliable and skillful forecasting of key environmental variables such as soil moisture and snow is of paramount importance due to their strong influence on many water resources applications, including flood control, agricultural production and effective water resources management, which collectively control the behavior of the climate system. Soil moisture is a key state variable in land surface-atmosphere interactions, affecting surface energy fluxes, runoff and the radiation balance. Snow processes also have a large influence on land-atmosphere energy exchanges due to snow's high albedo, low thermal conductivity and considerable spatial and temporal variability, resulting in dramatic changes in surface and ground temperature. Measurement of these two variables is possible through a variety of ground-based and remote sensing methods. Remote sensing, however, holds great promise for soil moisture and snow measurements, which have considerable spatial and temporal variability. Merging these measurements with hydrologic model outputs in a systematic and effective way results in an improvement of land surface model prediction. Data assimilation provides a mechanism to combine these two sources of estimation. Much success has been attained in recent years in using data from passive microwave sensors and assimilating them into the models. This paper provides an overview of the remote sensing measurement techniques for soil moisture and snow data and describes the advances in data assimilation techniques through ensemble filtering, mainly the Ensemble Kalman filter (EnKF) and the Particle filter (PF), for improving model prediction and reducing the uncertainties involved in the prediction process. It is believed that the PF provides a complete representation of the probability distribution of the state variables of interest (according to the sequential Bayes law) and could be a strong alternative to the EnKF, which is subject to some limitations, including the linear updating rule and the assumption of jointly normal distribution of errors in state variables and observations.

  17. Catchment Tomography - Joint Estimation of Surface Roughness and Hydraulic Conductivity with the EnKF

    NASA Astrophysics Data System (ADS)

    Baatz, D.; Kurtz, W.; Hendricks Franssen, H. J.; Vereecken, H.; Kollet, S. J.

    2017-12-01

    Parameter estimation for physically based, distributed hydrological models becomes increasingly challenging with increasing model complexity. The number of parameters is usually large and the number of observations relatively small, which results in large uncertainties. Catchment tomography presents a moving transmitter-receiver concept for estimating spatially distributed hydrological parameters. In this concept, precipitation, highly variable in time and space, serves as a moving transmitter. As the response to precipitation, runoff and stream discharge are generated along different paths and time scales, depending on surface and subsurface flow properties. Stream water levels are thus an integrated signal of upstream parameters, measured by stream gauges which serve as the receivers. These stream water level observations are assimilated into a distributed hydrological model, which is forced with high-resolution, radar-based precipitation estimates. Applying a joint state-parameter update with the Ensemble Kalman Filter, the spatially distributed Manning's roughness coefficient and saturated hydraulic conductivity are estimated jointly. The sequential data assimilation continuously integrates new information into the parameter estimation problem, especially during precipitation events; every precipitation event constrains the possible parameter space. In this approach, forward simulations are performed with ParFlow, a variably saturated subsurface and overland flow model. ParFlow is coupled to the Parallel Data Assimilation Framework for the data assimilation and the joint state-parameter update. In synthetic, 3-dimensional experiments including surface and subsurface flow, hydraulic conductivity and the Manning's coefficient are efficiently estimated with the catchment tomography approach. A joint update of the Manning's coefficient and hydraulic conductivity tends to improve the parameter estimation compared to a single-parameter update, especially in cases of biased initial parameter ensembles. The computational experiments additionally show up to what degree of spatial heterogeneity, and up to what degree of uncertainty in subsurface flow parameters, the Manning's coefficient and hydraulic conductivity can be estimated efficiently.
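
    The joint state-parameter update works by augmenting the state vector, so a single EnKF update corrects both. A toy scalar version (hypothetical numbers and a placeholder "catchment"; the study uses ParFlow) is sketched below, updating a water level and a log-transformed Manning's coefficient from one gauge observation:

        import numpy as np

        rng = np.random.default_rng(8)
        n_e = 100
        log_n = rng.normal(np.log(0.05), 0.3, n_e)   # ensemble of Manning's n (log space)
        h = rng.normal(1.0, 0.2, n_e)                # simulated stream water levels (m)

        def forward(h, n, rain=5.0):
            # Placeholder for ParFlow: rougher channels (larger n) drain more slowly
            return h + 0.02 * rain - 0.05 * h / (n / 0.05)

        h = forward(h, np.exp(log_n))
        Z = np.vstack([h, log_n])                    # augmented state [level; parameter]
        y_obs, r = 1.15, 0.05**2                     # gauge observation and error variance

        A = Z - Z.mean(axis=1, keepdims=True)        # anomalies of the augmented state
        cross = A @ A[0] / (n_e - 1)                 # covariance with the observed level
        gain = cross / (cross[0] + r)                # one gain for state AND parameter
        y_pert = y_obs + rng.normal(0.0, np.sqrt(r), n_e)
        Z += np.outer(gain, y_pert - Z[0])           # one update corrects both entries
        print("updated Manning's n ~", np.exp(Z[1]).mean())

    The parameter is only corrected insofar as the ensemble exhibits correlation between roughness and water level, which is exactly what the precipitation events excite.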

  18. Performance Evaluation of EnKF-based Hydrogeological Site Characterization using Color Coherent Vectors

    NASA Astrophysics Data System (ADS)

    Moslehi, M.; de Barros, F.

    2017-12-01

    Complexity of hydrogeological systems arises from their multi-scale heterogeneity and insufficient measurements of their underlying parameters, such as hydraulic conductivity and porosity. An inadequate characterization of hydrogeological properties can significantly decrease the trustworthiness of numerical models that predict groundwater flow and solute transport. Therefore, a variety of data assimilation methods have been proposed in order to estimate hydrogeological parameters from spatially scarce data by incorporating the governing physical models. In this work, we propose a novel framework for evaluating the performance of these estimation methods. We focus on the Ensemble Kalman Filter (EnKF) approach, a widely used data assimilation technique that reconciles multiple sources of measurements to sequentially estimate model parameters such as the hydraulic conductivity. Several methods have been used in the literature to quantify the accuracy of the estimations obtained by the EnKF, including Rank Histograms, RMSE and Ensemble Spread. However, these commonly used methods disregard the spatial information and variability of geological formations. This can cause hydraulic conductivity fields with very different spatial structures to have similar histograms or RMSE. We propose a vision-based approach that quantifies the accuracy of estimations by considering the spatial structure embedded in the estimated fields. Our approach adapts a metric, Color Coherent Vectors (CCV), to evaluate the accuracy of the fields estimated by the EnKF. CCV is a histogram-based technique for comparing images that incorporates spatial information. We represent estimated fields as digital three-channel images and use CCV to compare and quantify the accuracy of estimations. The sensitivity of CCV to spatial information makes it a suitable metric for assessing the performance of spatial data assimilation techniques. Under various factors of data assimilation methods, such as the number, layout, and type of measurements, we compare the performance of CCV with other metrics such as RMSE. By simulating hydrogeological processes using estimated and true fields, we observe that CCV outperforms the other existing evaluation metrics.

  19. How Hydroclimate Influences the Effectiveness of Particle Filter Data Assimilation of Streamflow in Initializing Short- to Medium-range Streamflow Forecasts

    NASA Astrophysics Data System (ADS)

    Clark, E.; Wood, A.; Nijssen, B.; Clark, M. P.

    2017-12-01

    Short- to medium-range (1- to 7-day) streamflow forecasts are important for flood control operations and for issuing potentially life-saving flood warnings. In the U.S., the National Weather Service River Forecast Centers (RFCs) issue such forecasts in real time, depending heavily on a manual data assimilation (DA) approach: forecasters adjust model inputs, states, parameters and outputs based on experience and consideration of a range of supporting real-time information. Achieving high-quality forecasts from new automated, centralized forecast systems will depend critically on the adequacy of automated DA approaches to make analogous corrections to the forecasting system. Such approaches would further enable systematic evaluation of real-time flood forecasting methods and strategies. Toward this goal, we have implemented a real-time Sequential Importance Resampling particle filter (SIR-PF) approach to assimilate observed streamflow into simulated initial hydrologic conditions (states) for initializing ensemble flood forecasts. Assimilating streamflow alone in SIR-PF improves simulated streamflow and soil moisture during the model spin-up period prior to a forecast, with consequent benefits for forecasts. Nevertheless, it only consistently limits error in simulated snow water equivalent during the snowmelt season and in basins where precipitation falls primarily as snow. We examine how the simulated initial conditions with and without SIR-PF propagate into 1- to 7-day ensemble streamflow forecasts. Forecasts are evaluated in terms of reliability and skill over a 10-year period from 2005-2015. The focus of this analysis is on how interactions between hydroclimate and SIR-PF performance impact forecast skill. To this end, we examine forecasts for 5 hydroclimatically diverse basins in the western U.S. Some of these basins receive most of their precipitation as snow, others as rain. Some freeze throughout the mid-winter while others experience significant mid-winter melt events. We describe the methodology and present seasonal and inter-basin variations in DA-enhanced forecast skill.
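
    A toy version of the SIR-PF initialization step (a hypothetical two-state "hydrologic model" with soil moisture and SWE; not the operational forecasting code): particles are weighted by how well their simulated streamflow matches the gauge, and whole state vectors are then resampled so the surviving states are mutually consistent:

        import numpy as np

        rng = np.random.default_rng(9)
        n_p = 1000
        soil = rng.uniform(0.1, 0.4, n_p)            # soil moisture states (-)
        swe = rng.uniform(0.0, 300.0, n_p)           # snow water equivalent (mm)

        def simulated_flow(soil, swe, temp=4.0):
            melt = 3.0 * max(temp, 0.0)              # degree-day melt (mm/day)
            return 50.0 * soil + 0.2 * np.minimum(swe, melt)   # toy rating

        q_obs, sig_q = 18.0, 2.0                     # observed flow and its error
        resid = q_obs - simulated_flow(soil, swe)
        w = np.exp(-0.5 * (resid / sig_q) ** 2)      # Gaussian likelihood weights
        idx = rng.choice(n_p, n_p, p=w / w.sum())    # sequential importance resampling
        soil, swe = soil[idx], swe[idx]              # states consistent with the gauge
        print(soil.mean(), swe.mean())               # initial conditions for the forecast

    Note the abstract's caveat in miniature: the gauge constrains soil moisture strongly, but SWE only through the small melt term, so SWE errors are corrected mainly when melt is active.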

  20. Bayesian data assimilation provides rapid decision support for vector-borne diseases

    PubMed Central

    Jewell, Chris P.; Brown, Richard G.

    2015-01-01

    Predicting the spread of vector-borne diseases in response to incursions requires knowledge of both host and vector demographics in advance of an outbreak. Although host population data are typically available, for novel disease introductions there is a high chance of the pathogen using a vector for which data are unavailable. This presents a barrier to estimating the parameters of dynamical models representing host–vector–pathogen interaction, and hence limits their ability to provide quantitative risk forecasts. The Theileria orientalis (Ikeda) outbreak in New Zealand cattle demonstrates this problem: even though the vector has received extensive laboratory study, a high degree of uncertainty persists over its national demographic distribution. Addressing this, we develop a Bayesian data assimilation approach whereby indirect observations of vector activity inform a seasonal spatio-temporal risk surface within a stochastic epidemic model. We provide quantitative predictions for the future spread of the epidemic, quantifying uncertainty in the model parameters, case infection times and the disease status of undetected infections. Importantly, we demonstrate how our model learns sequentially as the epidemic unfolds and provide evidence for changing epidemic dynamics through time. Our approach therefore provides a significant advance in rapid decision support for novel vector-borne disease outbreaks. PMID:26136225

  1. Application of advanced data assimilation techniques to the study of cloud and precipitation feedbacks in the tropical climate system

    NASA Astrophysics Data System (ADS)

    Posselt, Derek J.

    The research documented in this study centers on two topics: evaluation of the response of precipitating cloud systems to changes in the tropical climate system, and assimilation of cloud and precipitation information from remote-sensing platforms. The motivation for this work proceeds from the following outstanding problems: (1) Use of models to study the response of clouds to perturbations in the climate system is hampered by uncertainties in cloud microphysical parameterizations. (2) Though there is an ever-growing set of available observations, cloud and precipitation assimilation remains a difficult problem, particularly in the tropics. (3) Though it is widely acknowledged that cloud and precipitation processes play a key role in regulating the Earth's response to surface warming, the response of the tropical hydrologic cycle to climate perturbations remains largely unknown. The above issues are addressed in the following manner. First, Markov chain Monte Carlo (MCMC) methods are used to quantify the sensitivity of the NASA Goddard Cumulus Ensemble (GCE) cloud resolving model (CRM) to changes in its cloud microphysical parameters. TRMM retrievals of precipitation rate, cloud properties, and radiative fluxes and heating rates over the South China Sea are then assimilated into the GCE model to constrain cloud microphysical parameters to values characteristic of convection in the tropics, and the resulting observation-constrained model is used to assess the response of the tropical hydrologic cycle to surface warming. The major findings of this study are the following: (1) MCMC provides an effective tool with which to evaluate both model parameterizations and the assumption of Gaussian statistics used in optimal estimation procedures. (2) Statistics of the tropical radiation budget and hydrologic cycle can be used to effectively constrain CRM cloud microphysical parameters. (3) For 2D CRM simulations run with and without shear, the precipitation efficiency of cloud systems increases with increasing sea surface temperature, while the high cloud fraction and outgoing shortwave radiation decrease.
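
    As background for the MCMC methodology named above, a random-walk Metropolis sampler is sketched below. The two-parameter Gaussian log-posterior and the proposal step size are toy placeholders for an actual model-versus-observation misfit.

      # Random-walk Metropolis sampler of a toy two-parameter posterior.
      import numpy as np

      def log_posterior(theta):
          # Stand-in for a real misfit between model output and retrievals
          return -0.5 * np.sum(((theta - np.array([1.0, -2.0])) / 0.5) ** 2)

      rng = np.random.default_rng(2)
      theta = np.zeros(2)
      logp = log_posterior(theta)
      chain = []
      for _ in range(5000):
          prop = theta + rng.normal(0, 0.3, 2)         # random-walk proposal
          logp_prop = log_posterior(prop)
          if np.log(rng.random()) < logp_prop - logp:  # Metropolis accept/reject
              theta, logp = prop, logp_prop
          chain.append(theta)
      print(np.mean(chain, axis=0))  # posterior-mean estimate, close to [1, -2]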

  2. Integrating remotely sensed surface water extent into continental scale hydrology

    NASA Astrophysics Data System (ADS)

    Revilla-Romero, Beatriz; Wanders, Niko; Burek, Peter; Salamon, Peter; de Roo, Ad

    2016-12-01

    In hydrological forecasting, data assimilation techniques are employed to improve estimates of initial conditions by updating incorrect model states with observational data. However, the limited availability of continuous and up-to-date ground streamflow data is one of the main constraints for large-scale flood forecasting models. This is the first study that assesses the impact of assimilating daily remotely sensed surface water extent, at a 0.1° × 0.1° spatial resolution and derived from the Global Flood Detection System (GFDS), into a global rainfall-runoff model, including large ungauged areas, at the continental scale in Africa and South America. Surface water extent is observed using a range of passive microwave remote sensors. The methodology uses brightness temperature, since water bodies have a lower emissivity than land. In a time series, the satellite signal is expected to vary with changes in water surface, and anomalies can be correlated with flood events. The Ensemble Kalman Filter (EnKF), a Monte Carlo implementation of data assimilation, is used here by applying random perturbations to the precipitation inputs to account for their uncertainty, yielding ensemble streamflow simulations from the LISFLOOD model. Results of the updated streamflow simulation are compared to baseline simulations without assimilation of the satellite-derived surface water extent. Validation is performed at over 100 in situ river gauges using daily streamflow observations across the African and South American continents over a one-year period. Some of the more commonly used metrics in hydrology were calculated: KGE', NSE, PBIAS%, R2, RMSE, and VE. Results show that, for example, the NSE score improved at 61 out of 101 stations, with significant improvements in both the timing and volume of the flow peaks, whereas validation at gauges located in lowland jungle showed the poorest performance, mainly due to the influence of closed forest on the satellite signal retrieval. The conclusion is that remotely sensed surface water extent holds potential for improving rainfall-runoff streamflow simulations, potentially leading to a better forecast of the peak flow.
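
    The EnKF analysis step described above is compact enough to sketch. Below is a minimal stochastic (perturbed-observation) EnKF update for one scalar observation; the state size, ensemble size, observation operator H and error variance are illustrative assumptions, not LISFLOOD's configuration.

      # Stochastic EnKF analysis step: each member is nudged toward a perturbed
      # observation via the ensemble-estimated Kalman gain.
      import numpy as np

      rng = np.random.default_rng(3)
      n_state, n_ens, r_obs = 4, 50, 0.5           # state size, ensemble size, obs variance
      X = rng.normal(10, 2, (n_state, n_ens))      # forecast ensemble (e.g. storages, flow)
      H = np.array([[0.0, 0.0, 0.0, 1.0]])         # observe the last state variable
      y = 12.0                                     # the streamflow observation

      A = X - X.mean(axis=1, keepdims=True)        # ensemble anomalies
      S = H @ A                                    # anomalies in observation space
      K = (A @ S.T) / (n_ens - 1) @ np.linalg.inv(S @ S.T / (n_ens - 1) + r_obs * np.eye(1))
      Y = y + rng.normal(0, np.sqrt(r_obs), (1, n_ens))  # perturbed observations
      Xa = X + K @ (Y - H @ X)                     # analysis ensemble
      print(Xa.mean(axis=1))                       # analysis mean pulled toward y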

  3. Adaptive probabilistic collocation based Kalman filter for unsaturated flow problem

    NASA Astrophysics Data System (ADS)

    Man, J.; Li, W.; Zeng, L.; Wu, L.

    2015-12-01

    The ensemble Kalman filter (EnKF) has gained popularity in hydrological data assimilation problems. As a Monte Carlo based method, a relatively large ensemble size is usually required to guarantee accuracy. As an alternative approach, the probabilistic collocation based Kalman filter (PCKF) employs polynomial chaos expansion (PCE) to approximate the original system, so that the sampling error can be reduced. However, PCKF suffers from the so-called "curse of dimensionality": when the system nonlinearity is strong and the number of parameters is large, PCKF is even more computationally expensive than EnKF. Motivated by recent developments in uncertainty quantification, we propose a restart adaptive probabilistic collocation based Kalman filter (RAPCKF) for data assimilation in unsaturated flow problems. During the implementation of RAPCKF, the important parameters are identified and active PCE basis functions are adaptively selected. The "restart" technique is used to alleviate the inconsistency between model parameters and states. The performance of RAPCKF is tested on numerical unsaturated flow cases. It is shown that RAPCKF is more accurate than EnKF at the same computational cost. Compared with the traditional PCKF, RAPCKF is more applicable in strongly nonlinear and high-dimensional problems.
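
    The polynomial chaos idea behind PCKF can be sketched for a single standard-normal parameter: fit probabilists' Hermite polynomial coefficients by regression and read the mean and variance directly off the coefficients. The toy model, polynomial degree, and sample count below are illustrative assumptions.

      # Regression-based polynomial chaos expansion (PCE) of a nonlinear model
      # of a standard-normal input xi, using probabilists' Hermite polynomials.
      import numpy as np
      from math import factorial
      from numpy.polynomial import hermite_e as He

      rng = np.random.default_rng(4)
      model = lambda xi: np.exp(0.4 * xi)    # stand-in for an expensive flow model
      deg = 4

      xi = rng.standard_normal(200)           # collocation samples of the input
      V = He.hermevander(xi, deg)             # He_0..He_deg evaluated at the samples
      coef, *_ = np.linalg.lstsq(V, model(xi), rcond=None)   # PCE coefficients

      norms = np.array([factorial(k) for k in range(deg + 1)])  # E[He_k^2] = k!
      mean_pce = coef[0]                      # PCE mean is the 0th coefficient
      var_pce = np.sum(coef[1:] ** 2 * norms[1:])
      print(mean_pce, var_pce)                # exact: exp(0.08) ~ 1.083, var ~ 0.204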

  4. Bayesian approach to inverse statistical mechanics.

    PubMed

    Habeck, Michael

    2014-05-01

    Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.

  5. Bayesian approach to inverse statistical mechanics

    NASA Astrophysics Data System (ADS)

    Habeck, Michael

    2014-05-01

    Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.

  6. Error in telemetry studies: Effects of animal movement on triangulation

    USGS Publications Warehouse

    Schmutz, Joel A.; White, Gary C.

    1990-01-01

    We used Monte Carlo simulations to investigate the effects of animal movement on error of estimated animal locations derived from radio-telemetry triangulation of sequentially obtained bearings. Simulated movements of 0-534 m resulted in up to 10-fold increases in average location error but <10% decreases in location precision when observer-to-animal distances were <1,000 m. Location error and precision were minimally affected by censorship of poor locations with Chi-square goodness-of-fit tests. Location error caused by animal movement can only be eliminated by taking simultaneous bearings.
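
    The study's simulations are straightforward to reproduce in miniature: move the animal between two sequentially taken bearings, intersect the noisy bearing lines, and record the location error. The station geometry, bearing error (2.5 degree SD), and movement distances below are illustrative assumptions, not the paper's exact values.

      # Monte Carlo sketch of how movement between sequential bearings inflates
      # triangulation error in radio telemetry.
      import numpy as np

      rng = np.random.default_rng(5)
      stations = np.array([[0.0, 0.0], [800.0, 0.0]])
      animal0 = np.array([400.0, 600.0])
      sd_bearing = np.radians(2.5)

      def intersect(p1, a1, p2, a2):
          """Intersection of two bearing lines given station points and angles."""
          d1 = np.array([np.cos(a1), np.sin(a1)])
          d2 = np.array([np.cos(a2), np.sin(a2)])
          t = np.linalg.solve(np.column_stack([d1, -d2]), p2 - p1)
          return p1 + t[0] * d1

      for move in [0.0, 100.0, 500.0]:
          errors = []
          for _ in range(2000):
              step = rng.normal(0, 1, 2)
              animal1 = animal0 + move * step / np.linalg.norm(step)  # moves before 2nd bearing
              a1 = np.arctan2(*(animal0 - stations[0])[::-1]) + rng.normal(0, sd_bearing)
              a2 = np.arctan2(*(animal1 - stations[1])[::-1]) + rng.normal(0, sd_bearing)
              errors.append(np.linalg.norm(intersect(stations[0], a1, stations[1], a2) - animal0))
          print(f"movement {move:5.0f} m -> mean location error {np.mean(errors):7.1f} m")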

  7. A study of Bangladesh's sub-surface water storages using satellite products and data assimilation scheme.

    PubMed

    Khaki, M; Forootan, E; Kuhn, M; Awange, J; Papa, F; Shum, C K

    2018-06-01

    Climate change can significantly influence terrestrial water changes around the world, particularly in places that have proven to be more vulnerable, such as Bangladesh. In the past few decades, climate impacts, together with those of excessive human water use, have changed the country's water availability structure. In this study, we use multi-mission remotely sensed measurements along with a hydrological model to separately analyze groundwater and soil moisture variations for the period 2003-2013, and their interactions with rainfall in Bangladesh. To improve the model's estimates of water storages, terrestrial water storage (TWS) data obtained from the Gravity Recovery And Climate Experiment (GRACE) satellite mission are assimilated into the World-Wide Water Resources Assessment (W3RA) model using the ensemble-based sequential technique of the Square Root Analysis (SQRA) filter. We investigate the capability of the data assimilation approach to use a non-regional hydrological model for a regional case study. Based on these estimates, we investigate relationships between the model-derived sub-surface water storage changes and remotely sensed precipitation, as well as altimetry-derived river level variations in Bangladesh, by applying the empirical mode decomposition (EMD) method. A larger correlation is found between river level heights and rainfall (78% on average) than between groundwater storage variations and rainfall (57% on average). The results indicate a significant decline in groundwater storage (∼32% reduction) for Bangladesh between 2003 and 2013, which is equivalent to an average rate of 8.73 ± 2.45 mm/year. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Smoothing-based compressed state Kalman filter for joint state-parameter estimation: Applications in reservoir characterization and CO2 storage monitoring

    NASA Astrophysics Data System (ADS)

    Li, Y. J.; Kokkinaki, Amalia; Darve, Eric F.; Kitanidis, Peter K.

    2017-08-01

    The operation of most engineered hydrogeological systems relies on simulating physical processes using numerical models with uncertain parameters and initial conditions. Predictions by such uncertain models can be greatly improved by Kalman-filter techniques that sequentially assimilate monitoring data. Each assimilation constitutes a nonlinear optimization, which is solved by linearizing an objective function about the model prediction and applying a linear correction to this prediction. However, if model parameters and initial conditions are uncertain, the optimization problem becomes strongly nonlinear and a linear correction may yield unphysical results. In this paper, we investigate the utility of one-step-ahead smoothing, a variant of the traditional filtering process, to eliminate nonphysical results and reduce estimation artifacts caused by nonlinearities. We present the smoothing-based compressed state Kalman filter (sCSKF), an algorithm that combines one-step-ahead smoothing, in which current observations are used to correct the state and parameters one step back in time, with a non-ensemble covariance compression scheme that reduces the computational cost by efficiently exploring the high-dimensional state and parameter space. Numerical experiments show that when model parameters are uncertain and the states exhibit hyperbolic behavior with sharp fronts, as in CO2 storage applications, one-step-ahead smoothing reduces overshooting errors and, by design, gives physically consistent state and parameter estimates. We compared sCSKF with commonly used data assimilation methods and showed that, for the same computational cost, combining one-step-ahead smoothing and non-ensemble compression is advantageous for real-time characterization and monitoring of large-scale hydrogeological systems with sharp moving fronts.

  9. Preliminary Report on U-Th-Pb Isotope Systematics of the Olivine-Phyric Shergottite Tissint

    NASA Technical Reports Server (NTRS)

    Moriwaki, R.; Usui, T.; Yokoyama, T.; Simon, J. I.; Jones, J. H.

    2014-01-01

    Geochemical studies of shergottites suggest that their parental magmas reflect mixtures between at least two distinct geochemical source reservoirs, producing correlations between radiogenic isotope compositions and trace element abundances. These correlations have been interpreted as indicating the presence of a reduced, incompatible-element-depleted reservoir and an oxidized, incompatible-element-rich reservoir. The former is clearly a depleted mantle source, but there has been a long debate regarding the origin of the enriched reservoir. Two contrasting models have been proposed regarding the location and mixing process of the two geochemical source reservoirs: (1) assimilation of oxidized crust by mantle-derived, reduced magmas, or (2) mixing of two distinct mantle reservoirs during melting. The former clearly requires the ancient martian crust to be the enriched source (crustal assimilation), whereas the latter requires a long-lived enriched mantle domain that probably originated from residual melts formed during solidification of a magma ocean (heterogeneous mantle model). This study conducts Pb isotope and U-Th-Pb concentration analyses of the olivine-phyric shergottite Tissint because U-Th-Pb isotope systematics have been intensively used as a powerful radiogenic tracer to characterize old crust/sediment components in mantle-derived, terrestrial oceanic island basalts. The U-Th-Pb analyses are applied to sequential acid leaching fractions obtained from Tissint whole-rock powder in order to search for Pb isotopic source components in the Tissint magma. Here we report preliminary results of the U-Th-Pb analyses of acid leachates and a residue, and propose that Tissint may have experienced minor assimilation of old martian crust.

  10. Influences of NOM composition and bacteriological characteristics on biological stability in a full-scale drinking water treatment plant.

    PubMed

    Park, Ji Won; Kim, Hyun-Chul; Meyer, Anne S; Kim, Sungpyo; Maeng, Sung Kyu

    2016-10-01

    The influences of natural organic matter (NOM) and bacteriological characteristics on the biological stability of water were investigated in a full-scale drinking water treatment plant. We found that prechlorination decreased the hydrophobicity of the organic matter and significantly increased the high-molecular-weight (MW) dissolved organic matter, such as biopolymers and humic substances. High-MW organic matter and structurally complex compounds are known to be relatively slowly biodegradable; however, because of the prechlorination step, the indigenous bacteria could readily utilise these fractions as assimilable organic carbon. Sequential coagulation and sedimentation resulted in the substantial removal of biopolymer (74%), humic substance (33%), bacterial cells (79%), and assimilable organic carbon (67%). Rapid sand and granular activated carbon filtration induced an increase in the low-nucleic-acid content bacteria; however, these bacteria were biologically less active in relation to enzymatic activity and ATP. The granular activated carbon step was essential to securing biological stability (the ability to prevent bacterial growth) by removing the residual assimilable organic carbon that had formed during the ozone treatment. The growth potential of Escherichia coli and indigenous bacteria were found to differ in respect to NOM characteristics. In comparison with E. coli, the indigenous bacteria utilised a broader range of NOM as a carbon source. Principal component analysis demonstrated that the measured biological stability of water could differ, depending on the NOM characteristics, as well as on the bacterial inoculum selected for the analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. A growth analysis of waterlogging damage in mung bean (Phaseolus aureus)

    NASA Technical Reports Server (NTRS)

    Musgrave, M. E.; Vanhoy, M. A.

    1989-01-01

    Mung beans (Phaseolus aureus Roxb.) were grown for 2 weeks in gravel-vermiculite soilless mix in a growth chamber and subjected to a 1-week waterlogging period followed by a 1-week recovery period. Sequential harvests were made to determine the time course of effects of waterlogging and subsequent recovery on growth parameters by techniques of growth analysis. Root dry matter was the first to be affected, along with an increase in leaf dry matter and specific leaf weight. After a 1-week waterlogging period, specific leaf weight had more than doubled in the stressed plants. Leaf area declined in relation to the control plants as did the ratio of root dry matter to shoot dry matter. During the recovery period there was an increase in the dry matter allocation to the roots relative to the shoot. Specific leaf weight fell to control levels although the rate of leaf area elaboration did not increase during this time, suggesting a redistribution of stored assimilates from the leaves. Net assimilation rate increased during the waterlogging period, probably due to a restriction in root metabolism and reduced translocation out of the leaf rather than to an increase in photosynthesis. Net assimilation rate of waterlogged plants was severely reduced compared with control plants during the recovery period. Both relative growth rate and leaf area duration declined during the waterlogging period and declined further subsequent to the waterlogging treatment. The results illustrate the interrelationships between root and shoot carbon budgets in mung bean during response to the stress of waterlogging.

  12. Ascertainment correction for Markov chain Monte Carlo segregation and linkage analysis of a quantitative trait.

    PubMed

    Ma, Jianzhong; Amos, Christopher I; Warwick Daw, E

    2007-09-01

    Although extended pedigrees are often sampled through probands with extreme levels of a quantitative trait, Markov chain Monte Carlo (MCMC) methods for segregation and linkage analysis have not been able to perform ascertainment corrections. Further, the extent to which ascertainment of pedigrees leads to biases in the estimation of segregation and linkage parameters has not been previously studied for MCMC procedures. In this paper, we studied these issues with a Bayesian MCMC approach for joint segregation and linkage analysis, as implemented in the package Loki. We first simulated pedigrees ascertained through individuals with extreme values of a quantitative trait in the spirit of the sequential sampling theory of Cannings and Thompson [Cannings and Thompson [1977] Clin. Genet. 12:208-212]. Using our simulated data, we detected no bias in estimates of the trait locus location. However, in addition to allele frequencies, when the ascertainment threshold was higher than or close to the true value of the highest genotypic mean, bias was also found in the estimation of this parameter. When there were multiple trait loci, this bias destroyed the additivity of the effects of the trait loci, and caused biases in the estimation of all genotypic means when a purely additive model was used for analyzing the data. To account for pedigree ascertainment with sequential sampling, we developed a Bayesian ascertainment approach and implemented Metropolis-Hastings updates in the MCMC samplers used in Loki. Ascertainment correction greatly reduced biases in parameter estimates. Our method is designed for multiple, but a fixed number of, trait loci. Copyright (c) 2007 Wiley-Liss, Inc.

  13. Characterization of As-polluted soils by laboratory X-ray-based techniques coupled with sequential extractions and electron microscopy: the case of Crocette gold mine in the Monte Rosa mining district (Italy).

    PubMed

    Allegretta, Ignazio; Porfido, Carlo; Martin, Maria; Barberis, Elisabetta; Terzano, Roberto; Spagnuolo, Matteo

    2018-06-24

    Arsenic concentration and distribution were studied by combining laboratory X-ray-based techniques (wavelength dispersive X-ray fluorescence (WDXRF), micro X-ray fluorescence (μXRF), and X-ray powder diffraction (XRPD)), field emission scanning electron microscopy equipped with microanalysis (FE-SEM-EDX), and a sequential extraction procedure (SEP) coupled to total reflection X-ray fluorescence (TXRF) analysis. This approach was applied to three contaminated soils and one mine tailing collected near the gold extraction plant at the Crocette gold mine (Macugnaga, VB) in the Monte Rosa mining district (Piedmont, Italy). Arsenic (As) concentration, measured with WDXRF, ranged from 145 to 40,200 mg/kg. XRPD analysis evidenced the presence of jarosite and the absence of any As-bearing mineral, suggesting a high weathering grade and strongly oxidative conditions. However, small domains of Fe arsenate were identified by combining μXRF with FE-SEM-EDX. SEP results revealed that As was mainly associated with amorphous Fe oxides/hydroxides or hydroxysulfates (50-80%), and the combination of XRPD and FE-SEM-EDX suggested that this phase could be attributed to schwertmannite. On the basis of the reported results, As is scarcely mobile, even if a considerable As fraction (1-3 g As/kg of soil) is still potentially mobilizable. In general, the proposed combination of laboratory X-ray techniques could be successfully employed to unravel environmental issues related to metal(loid) pollution in soil and sediments.

  14. Feature aided Monte Carlo probabilistic data association filter for ballistic missile tracking

    NASA Astrophysics Data System (ADS)

    Ozdemir, Onur; Niu, Ruixin; Varshney, Pramod K.; Drozd, Andrew L.; Loe, Richard

    2011-05-01

    The problem of ballistic missile tracking in the presence of clutter is investigated. The probabilistic data association filter (PDAF) is utilized as the basic filtering algorithm. We propose to use sequential Monte Carlo methods, i.e., particle filters, aided with amplitude information (AI) in order to improve the tracking performance of a single target in clutter when severe nonlinearities exist in the system. We call this approach the "Monte Carlo probabilistic data association filter with amplitude information (MCPDAF-AI)." Furthermore, we formulate a realistic problem in the sense that we use simulated radar cross section (RCS) data for a missile warhead and a cylinder chaff, generated using Lucernhammer, a state-of-the-art electromagnetic signature prediction code, to model target and clutter amplitude returns as additional amplitude features which help to improve data association and tracking performance. A performance comparison is carried out between the extended Kalman filter (EKF) and the particle filter under various scenarios using single and multiple sensors. The results show that, when only one sensor is used, the MCPDAF performs significantly better than the EKF in terms of tracking accuracy under severe nonlinear conditions for ballistic missile tracking applications. However, when the number of sensors is increased, even under severe nonlinear conditions, the EKF performs as well as the MCPDAF.

  15. Radiative transport produced by oblique illumination of turbid media with collimated beams

    NASA Astrophysics Data System (ADS)

    Gardner, Adam R.; Kim, Arnold D.; Venugopalan, Vasan

    2013-06-01

    We examine the general problem of light transport initiated by oblique illumination of a turbid medium with a collimated beam. This situation has direct relevance to the analysis of cloudy atmospheres, terrestrial surfaces, soft condensed matter, and biological tissues. We introduce a solution approach to the equation of radiative transfer that governs this problem, and develop a comprehensive spherical harmonics expansion method utilizing Fourier decomposition (SHEFN). The SHEFN approach enables the solution of problems lacking azimuthal symmetry and provides both the spatial and directional dependence of the radiance. We also introduce the method of sequential-order smoothing that enables the calculation of accurate solutions from the results of two sequential low-order approximations. We apply the SHEFN approach to determine the spatial and angular dependence of both internal and boundary radiances from strongly and weakly scattering turbid media. These solutions are validated using more costly Monte Carlo simulations and reveal important insights regarding the evolution of the radiant field generated by oblique collimated beams spanning ballistic and diffusely scattering regimes.

  16. Free energy computations by minimization of Kullback-Leibler divergence: An efficient adaptive biasing potential method for sparse representations

    NASA Astrophysics Data System (ADS)

    Bilionis, I.; Koutsourelakis, P. S.

    2012-05-01

    The present paper proposes an adaptive biasing potential technique for the computation of free energy landscapes. It is motivated by statistical learning arguments and unifies the tasks of biasing the molecular dynamics to escape free energy wells and estimating the free energy function, under the same objective of minimizing the Kullback-Leibler divergence between appropriately selected densities. It offers rigorous convergence diagnostics even though history-dependent, non-Markovian dynamics are employed. It makes use of a greedy optimization scheme in order to obtain sparse representations of the free energy function, which can be particularly useful in multidimensional cases. It employs embarrassingly parallelizable sampling schemes that are based on adaptive Sequential Monte Carlo and can be readily coupled with legacy molecular dynamics simulators. The sequential nature of the learning and sampling scheme enables the efficient calculation of free energy functions parametrized by the temperature. The characteristics and capabilities of the proposed method are demonstrated in three numerical examples.
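
    For reference, the divergence such schemes minimize is the standard Kullback-Leibler functional; writing p for the target density and q_V for the density sampled under a biasing potential V (the paper's specific choices of p and the parametrization of V may differ), it reads

        D_{\mathrm{KL}}(p \,\|\, q_V) = \int p(x) \, \ln\frac{p(x)}{q_V(x)} \, dx \;\ge\; 0,

    which vanishes exactly when q_V = p, so driving it toward zero simultaneously flattens the sampled landscape and encodes the free energy estimate in V.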

  17. Footprints of electron correlation in strong-field double ionization of Kr close to the sequential-ionization regime

    NASA Astrophysics Data System (ADS)

    Li, Xiaokai; Wang, Chuncheng; Yuan, Zongqiang; Ye, Difa; Ma, Pan; Hu, Wenhui; Luo, Sizuo; Fu, Libin; Ding, Dajun

    2017-09-01

    By combining kinematically complete measurements and a semiclassical Monte Carlo simulation we study the correlated-electron dynamics in the strong-field double ionization of Kr. Interestingly, we find that, as we step into the sequential-ionization regime, there are still signatures of correlation in the two-electron joint momentum spectrum and, more intriguingly, the scaling law of the high-energy tail is completely different from early predictions on the low-Z atom (He). These experimental observations are well reproduced by our generalized semiclassical model adapting a Green-Sellin-Zachor potential. It is revealed that the competition between the screening effect of inner-shell electrons and the Coulomb focusing of nuclei leads to a non-inverse-square central force, which twists the returned electron trajectory at the vicinity of the parent core and thus significantly increases the probability of hard recollisions between two electrons. Our results might have promising applications ranging from accurately retrieving atomic structures to simulating celestial phenomena in the laboratory.

  18. Advanced treatment of residual nitrogen from biologically treated coke effluent by a microalga-mediated process using volatile fatty acids (VFAs) under stepwise mixotrophic conditions.

    PubMed

    Ryu, Byung-Gon; Kim, Woong; Heo, Sung-Woon; Kim, Donghyun; Choi, Gang-Guk; Yang, Ji-Won

    2015-09-01

    This work describes the development of a microalga-mediated process for the simultaneous removal of residual ammonium nitrogen (NH4(+)-N) and production of lipids from biologically treated coke effluent. Four species of green algae were tested using a sequential mixotrophic process. In the first phase (CO2-supplied mixotrophic condition), all microalgae assimilated NH4(+)-N with no evident inhibition. In the second phase (volatile fatty acid (VFA)-supplied mixotrophic condition), the removal rate of NH4(+)-N and the biomass significantly increased. Among the microalgae used, Arctic Chlorella sp. ArM0029B had the highest rates of NH4(+)-N removal (0.97 mg/L/h) and fatty acid production (24.9 mg/L/d), which were 3.6- and 2.1-fold higher, respectively, than those observed under the CO2-supplied mixotrophic condition. Redundancy analysis (RDA) indicated that acetate and butyrate were decisive factors for increasing NH4(+)-N removal and fatty acid production. These results demonstrate that microalgae can be used in a sequential process for the treatment of residual nitrogen after initial treatment by activated sludge. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Determination of yeast assimilable nitrogen content in wine fermentations by sequential injection analysis with spectrophotometric detection.

    PubMed

    Muik, Barbara; Edelmann, Andrea; Lendl, Bernhard; Ayora-Cañada, María José

    2002-09-01

    An automated method for measuring the primary amino acid concentration in wine fermentations by sequential injection analysis with spectrophotometric detection was developed. Isoindole derivatives of the primary amino acids were formed by reaction with o-phthaldialdehyde and N-acetyl-L-cysteine and measured at 334 nm with respect to a baseline point at 700 nm to compensate for the observed Schlieren effect. As the reaction kinetics were strongly matrix-dependent, the analytical readout was evaluated at the final reaction equilibrium. Therefore, four parallel reaction coils were included in the flow system so that four samples could be processed simultaneously. Using isoleucine as the representative primary amino acid in wine fermentations, a linear calibration curve from 2 to 10 mM isoleucine, corresponding to 28 to 140 mg nitrogen/L (N/L), was obtained. The coefficient of variation of the method was 1.5% at a throughput of 12 samples per hour. The developed method was successfully used to monitor two wine fermentations. The results were in agreement with an external reference method based on high performance liquid chromatography; a mean t-test showed no significant differences between the two methods at a confidence level of 95%.
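
    As a quick consistency check on the stated calibration range: isoleucine carries one nitrogen atom per molecule, and nitrogen's molar mass is about 14 g/mol, so 2 mM corresponds to 2 mmol/L × 14 mg/mmol = 28 mg N/L, and 10 mM to 140 mg N/L, exactly the 28-140 mg N/L span quoted above.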

  20. Simultaneous perceptual and response biases on sequential face attractiveness judgments.

    PubMed

    Pegors, Teresa K; Mattar, Marcelo G; Bryan, Peter B; Epstein, Russell A

    2015-06-01

    Face attractiveness is a social characteristic that we often use to make first-pass judgments about the people around us. However, these judgments are highly influenced by our surrounding social world, and researchers still understand little about the mechanisms underlying these influences. In a series of 3 experiments, we use a novel sequential rating paradigm that enables us to measure biases in attractiveness judgments from the previous face and the previous rating. Our results reveal 2 simultaneous and opposing influences on face attractiveness judgments that arise from past experience of faces: a response bias in which attractiveness ratings shift toward a previously given rating, and a stimulus bias in which attractiveness ratings shift away from the mean attractiveness of the previous face. Further, we provide evidence that the contrastive stimulus bias (but not the assimilative response bias) is strengthened by increasing the duration of the previous stimulus, suggesting an underlying perceptual mechanism. These results demonstrate that judgments of face attractiveness are influenced by information from our evaluative and perceptual history, and that these influences have measurable behavioral effects over the course of just a few seconds. (c) 2015 APA, all rights reserved.

  1. Adaptive measurement selection for progressive damage estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Wenfan; Kovvali, Narayan; Papandreou-Suppappola, Antonia; Chattopadhyay, Aditi; Peralta, Pedro

    2011-04-01

    Noise and interference in sensor measurements degrade the quality of data and have a negative impact on the performance of structural damage diagnosis systems. In this paper, a novel adaptive measurement screening approach is presented to automatically select the most informative measurements and use them intelligently for structural damage estimation. The method is implemented efficiently in a sequential Monte Carlo (SMC) setting using particle filtering. The noise suppression and improved damage estimation capability of the proposed method is demonstrated by an application to the problem of estimating progressive fatigue damage in an aluminum compact-tension (CT) sample using noisy PZT sensor measurements.

  2. Inverse Regional Modeling with Adjoint-Free Technique

    NASA Astrophysics Data System (ADS)

    Yaremchuk, M.; Martin, P.; Panteleev, G.; Beattie, C.

    2016-02-01

    The ongoing parallelization trend in computer technologies facilitates the use of ensemble methods in geophysical data assimilation. Of particular interest are ensemble techniques which do not require the development of tangent linear numerical models and their adjoints for optimization. These "adjoint-free" methods minimize the cost function within a sequence of subspaces spanned by carefully chosen sets of perturbations of the control variables. In this presentation, an adjoint-free variational technique (a4dVar) is demonstrated in an application estimating the initial conditions of two numerical models: the Navy Coastal Ocean Model (NCOM) and the surface wave model (WAM). With the NCOM, the performance of the adjoint and adjoint-free 4dVar data assimilation techniques is compared in application to the hydrographic surveys and velocity observations collected in the Adriatic Sea in 2006. Numerical experiments have shown that a4dVar is capable of providing forecast skill similar to that of conventional 4dVar at comparable computational expense, while being less susceptible to excitation of ageostrophic modes that are not supported by observations. The adjoint-free technique constrained by the WAM model is tested in a series of data assimilation experiments with synthetic observations in the southern Chukchi Sea. The types of observations considered are directional spectra estimated from point measurements by stationary buoys, significant wave height (SWH) observations by coastal high-frequency radars, and along-track SWH observations by satellite altimeters. The a4dVar forecast skill is shown to be 30-40% better than the skill of the sequential assimilation method based on optimal interpolation that is currently used in operations. Prospects for further development of a4dVar methods in regional applications are discussed.

  3. Geotechnical parameter spatial distribution stochastic analysis based on multi-precision information assimilation

    NASA Astrophysics Data System (ADS)

    Wang, C.; Rubin, Y.

    2014-12-01

    The spatial distribution of the compression modulus Es, an important geotechnical parameter, contributes considerably to the understanding of the underlying geological processes and to the adequate assessment of the mechanical effects of Es on the differential settlement of large continuous structure foundations. Such analyses should be derived using an assimilating approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve this, the Es distribution of a silty clay stratum in region A of the China Expo Center (Shanghai) is studied using the Bayesian maximum entropy method. This method rigorously and efficiently integrates the multiple precisions of different geotechnical investigations and sources of uncertainty. Individual CPT soundings were modeled as rational probability density curves by maximum entropy theory. The spatial prior multivariate probability density function (PDF) and the likelihood PDF of the CPT positions were built from the borehole experiments and the potential value of the prediction point; then, after numerical integration over the CPT probability density curves, the posterior probability density curve of the prediction point was calculated within the Bayesian inverse interpolation framework. The results were compared between Gaussian sequential stochastic simulation and the Bayesian method. The differences between single CPT soundings under a normal distribution assumption and the simulated probability density curves based on maximum entropy theory were also discussed. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, and that more informative estimates are generated by considering CPT uncertainty at the estimation points. The calculations illustrate the significance of stochastic Es characterization of a stratum and identify limitations associated with inadequate geostatistical interpolation techniques. The characterization results will provide a multi-precision information assimilation method for other geotechnical parameters.

  4. Geomagnetic inverse problem and data assimilation: a progress report

    NASA Astrophysics Data System (ADS)

    Aubert, Julien; Fournier, Alexandre

    2013-04-01

    In this presentation I will describe two studies recently undertaken by our group in an effort to bring the benefits of data assimilation to the study of Earth's magnetic field and the dynamics of its liquid iron core, where the geodynamo operates. In the first part I will focus on the geomagnetic inverse problem, which attempts to recover the fluid flow in the core from the temporal variation of the magnetic field (known as the secular variation). Geomagnetic data can be downward-continued from the surface of the Earth to the core-mantle boundary, but not further below, since the core is an electrical conductor. Historically, solutions to the geomagnetic inverse problem in such a sparsely observed system were thus found only for the flow immediately below the core-mantle boundary. We have recently shown that combining a numerical model of the geodynamo with magnetic observations, through the use of Kalman filtering, now allows solutions for the flow throughout the core. In the second part, I will present synthetic tests of sequential geomagnetic data assimilation aimed at evaluating the range over which the future of the geodynamo can be predicted, and our corresponding prospects for refining current geomagnetic predictions. References: Fournier, Aubert, Thébault: Inference on core surface flow from observations and 3-D dynamo modelling, Geophys. J. Int. 186, 118-136, 2011, doi:10.1111/j.1365-246X.2011.05037.x; Aubert, Fournier: Inferring internal properties of Earth's core dynamics and their evolution from surface observations and a numerical geodynamo model, Nonlinear Proc. Geoph. 18, 657-674, 2011, doi:10.5194/npg-18-657-2011; Aubert: Flow throughout the Earth's core inverted from geomagnetic observations and numerical dynamo models, Geophys. J. Int., 2012, doi:10.1093/gji/ggs051

  5. Three-Dimensional Bayesian Geostatistical Aquifer Characterization at the Hanford 300 Area using Tracer Test Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Xingyuan; Murakami, Haruko; Hahn, Melanie S.

    2012-06-01

    Tracer testing under natural or forced gradient flow holds the potential to provide useful information for characterizing subsurface properties, through monitoring, modeling and interpretation of the tracer plume migration in an aquifer. Non-reactive tracer experiments were conducted at the Hanford 300 Area, along with constant-rate injection tests and electromagnetic borehole flowmeter (EBF) profiling. A Bayesian data assimilation technique, the method of anchored distributions (MAD) [Rubin et al., 2010], was applied to assimilate the experimental tracer test data with the other types of data and to infer the three-dimensional heterogeneous structure of the hydraulic conductivity in the saturated zone of the Hanford formation. In this study, the Bayesian prior information on the underlying random hydraulic conductivity field was obtained from previous field characterization efforts using the constant-rate injection tests and the EBF data. The posterior distribution of the conductivity field was obtained by further conditioning the field on the temporal moments of tracer breakthrough curves at various observation wells. MAD was implemented with the massively parallel three-dimensional flow and transport code PFLOTRAN to cope with the highly transient flow boundary conditions at the site and to meet the computational demands of MAD. A synthetic study proved that the proposed method could effectively invert tracer test data to capture the essential spatial heterogeneity of the three-dimensional hydraulic conductivity field. Application of MAD to actual field data shows that the hydrogeological model, when conditioned on the tracer test data, can reproduce the tracer transport behavior better than the field characterized without the tracer test data. This study successfully demonstrates that MAD can sequentially assimilate multi-scale multi-type field data through a consistent Bayesian framework.

  6. Bayesian data assimilation provides rapid decision support for vector-borne diseases.

    PubMed

    Jewell, Chris P; Brown, Richard G

    2015-07-06

    Predicting the spread of vector-borne diseases in response to incursions requires knowledge of both host and vector demographics in advance of an outbreak. Although host population data are typically available, for novel disease introductions there is a high chance of the pathogen using a vector for which data are unavailable. This presents a barrier to estimating the parameters of dynamical models representing host-vector-pathogen interaction, and hence limits their ability to provide quantitative risk forecasts. The Theileria orientalis (Ikeda) outbreak in New Zealand cattle demonstrates this problem: even though the vector has received extensive laboratory study, a high degree of uncertainty persists over its national demographic distribution. Addressing this, we develop a Bayesian data assimilation approach whereby indirect observations of vector activity inform a seasonal spatio-temporal risk surface within a stochastic epidemic model. We provide quantitative predictions for the future spread of the epidemic, quantifying uncertainty in the model parameters, case infection times and the disease status of undetected infections. Importantly, we demonstrate how our model learns sequentially as the epidemic unfolds and provide evidence for changing epidemic dynamics through time. Our approach therefore provides a significant advance in rapid decision support for novel vector-borne disease outbreaks. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  7. Kinetic mechanism of the dimeric ATP sulfurylase from plants

    PubMed Central

    Ravilious, Geoffrey E.; Herrmann, Jonathan; Goo Lee, Soon; Westfall, Corey S.; Jez, Joseph M.

    2013-01-01

    In plants, sulfur must be obtained from the environment and assimilated into usable forms for metabolism. ATP sulfurylase catalyses the thermodynamically unfavourable formation of a mixed phosphosulfate anhydride in APS (adenosine 5′-phosphosulfate) from ATP and sulfate as the first committed step of sulfur assimilation in plants. In contrast to the multi-functional, allosterically regulated ATP sulfurylases from bacteria, fungi and mammals, the plant enzyme functions as a mono-functional, non-allosteric homodimer. Owing to these differences, here we examine the kinetic mechanism of soybean ATP sulfurylase [GmATPS1 (Glycine max (soybean) ATP sulfurylase isoform 1)]. For the forward reaction (APS synthesis), initial velocity methods indicate a single-displacement mechanism. Dead-end inhibition studies with chlorate showed competitive inhibition versus sulfate and non-competitive inhibition versus APS. Initial velocity studies of the reverse reaction (ATP synthesis) demonstrate a sequential mechanism with global fitting analysis suggesting an ordered binding of substrates. ITC (isothermal titration calorimetry) showed tight binding of APS to GmATPS1. In contrast, binding of PPi (pyrophosphate) to GmATPS1 was not detected, although titration of the E•APS complex with PPi in the absence of magnesium displayed ternary complex formation. These results suggest a kinetic mechanism in which ATP and APS are the first substrates bound in the forward and reverse reactions, respectively. PMID:23789618

  8. On the use of Bayesian Monte-Carlo in evaluation of nuclear data

    NASA Astrophysics Data System (ADS)

    De Saint Jean, Cyrille; Archier, Pascal; Privas, Edwin; Noguere, Gilles

    2017-09-01

    As model parameters, necessary ingredients of theoretical models, are not always predicted by theory, a formal mathematical framework associated with the evaluation work is needed to obtain the best set of parameters (resonance parameters, optical models, fission barriers, average widths, multigroup cross sections) by Bayesian statistical inference, comparing theory to experiment. The formal rule of this methodology is to estimate the posterior probability density function of a set of parameters by solving an equation of the following type: pdf(posterior) ∝ pdf(prior) × likelihood. A fitting procedure can thus be seen as the estimation of the posterior probability density of a set of parameters (referred to as x) given prior information on these parameters and a likelihood that gives the probability density of observing a data set given x. To solve this problem, two major paths can be taken: add approximations and hypotheses and obtain an equation to be solved numerically (minimization of a cost function, or the Generalized Least Squares method, referred to as GLS), or use Monte Carlo sampling of all prior distributions and estimate the final posterior distribution. Monte Carlo methods are a natural solution for Bayesian inference problems. They avoid the approximations present in traditional adjustment procedures based on chi-square minimization and offer alternatives in the choice of probability density distributions for priors and likelihoods. This paper proposes the use of what we call Bayesian Monte Carlo (referred to as BMC in the rest of the manuscript) over the whole energy range, from the thermal and resonance ranges to the continuum, for all nuclear reaction models at these energies. Algorithms are presented based on Monte Carlo sampling and Markov chains. The objectives of BMC are to propose a reference calculation for validating the GLS calculations and approximations, to test the effects of probability density distributions, and to provide a framework for finding the global minimum when several local minima exist. Applications to resolved resonance, unresolved resonance and continuum evaluation, as well as multigroup cross section data assimilation, are presented.
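
    The "sample the priors, weight by the likelihood" procedure that the text contrasts with GLS can be sketched as plain importance sampling for one toy parameter; the Gaussian prior, single datum, and likelihood width below are illustrative assumptions, not an actual nuclear-data evaluation.

      # Prior sampling + likelihood weighting (importance sampling) for a
      # single toy parameter, checked against the conjugate Gaussian result.
      import numpy as np

      rng = np.random.default_rng(6)
      prior_samples = rng.normal(2.0, 0.5, 100_000)   # prior on a model parameter
      data, sigma = 2.6, 0.2                           # one "measurement" of it

      logw = -0.5 * ((data - prior_samples) / sigma) ** 2   # Gaussian log-likelihood
      w = np.exp(logw - logw.max())
      w /= w.sum()

      post_mean = np.sum(w * prior_samples)
      post_sd = np.sqrt(np.sum(w * (prior_samples - post_mean) ** 2))
      print(post_mean, post_sd)  # conjugate result: mean ~2.517, sd ~0.186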

  9. Volcano Deformation and Eruption Forecasting using Data Assimilation: Building the Strategy

    NASA Astrophysics Data System (ADS)

    Bato, M. G.; Pinel, V.; Yan, Y.

    2016-12-01

    In monitoring active volcanoes, the magma overpressure is one of the key parameters used in forecasting volcanic eruptions. It can be inferred from the ground displacements measured at the Earth's surface by applying inversion techniques. However, during the inversion we lose the temporal character of the signal, along with a large amount of information about the behaviour of the volcano. Our work focuses on developing a strategy to better forecast the magma overpressure using data assimilation. Data assimilation is a sequential, time-forward process that optimally combines models and observations, sometimes with a priori information based on error statistics, to predict the state of a dynamical system. It has gained popularity in various fields of geoscience (e.g. ocean-weather forecasting, geomagnetism and natural resources exploration), but remains a new and emerging technique in volcanology. With the increasing amount of geodetic data (i.e. InSAR and GPS) recorded on volcanoes nowadays, and the wide availability of dynamical models that provide a better understanding of volcano plumbing systems, developing a forecasting framework that can efficiently combine them is crucial. Here, we build our strategy on the basis of the Ensemble Kalman Filter (EnKF) [1]. We predict the temporal behaviour of the magma overpressure and surface deformation by adopting the two-magma-chamber model proposed by Reverso et al., 2014 [2] and by using synthetic GPS and/or InSAR data. Several tests are performed in order to: 1) assess the efficiency of EnKF in forecasting volcanic unrest, 2) constrain unknown parameters of the model, 3) properly use GPS and/or InSAR during assimilation, and 4) compare EnKF with classical inversion using the same dynamical model. Results show that EnKF works well in the synthetic cases, and there is great potential in utilising the method for real-time monitoring of volcanic unrest. [1] Evensen, G., The Ensemble Kalman Filter: theoretical formulation and practical implementation. Ocean Dyn., 53, 343-367, 2003. [2] Reverso, T., Vandemeulebrouck, J., Jouanne, F., Pinel, V., Villemin, T., Sturkell, E., A two-magma chamber model as a source of deformation at Grímsvötn volcano, Iceland, JGR, 2014.

  10. Geochemical influences on assimilation of sediment-bound metals in clams and mussels

    USGS Publications Warehouse

    Griscom, S.B.; Fisher, N.S.; Luoma, S.N.

    2000-01-01

    A series of experiments was performed to evaluate the extent to which Cd, Co, Ag, Se, Cr, and Zn bound to sediments with different geochemical properties could be assimilated by the mussel Mytilus edulis and the clam Macoma balthica. Oxidized and reduced radiolabeled sediments were fed to suspension-feeding animals, the depuration patterns of the individuals were followed by gamma-spectrometry, and the assimilation efficiencies (AEs) of ingested metals were determined. AEs from geochemically diverse sediments typically varied less than 2-fold and ranged from 1% for Cr to 42% for Zn. Metals were assimilated from anoxic sediment by both animals; Ag, Cd, and Co AEs in M. balthica were 9-16%, 2-fold lower than from oxic sediment, but in M. edulis AEs were about two times greater from anoxic sediment for all metals but Ag. For oxic sediment, Cd and Co AEs in M. edulis decreased 3-4-fold with increased sediment exposure time to the metals, with smaller but significant effects also noted for Zn and Se but not Ag. A less pronounced decrease in AE for M. balthica was evident only after 6 months exposure time. Sequential extractions of the oxidized sediments showed a transfer of metals into more resistant sediment components over time, but the rate did not correlate with the decrease in metal AEs. Comparing the two bivalves, TOC concentrations had an inconsistent effect on metal AEs. AEs of metals from bacteria-coated glass beads were slightly higher than from humic acid-coated beads, which were comparable with whole-sediment AEs. There was correspondence of AE with desorption of Ag, Cd, Co, and Se (but not Zn) from sediments into pH 5 seawater, measured to simulate the gut pH of these bivalves. The results imply that metals associated with sulfides and anoxic sediments are bioavailable, that the bioavailability of metals from sediments decreases over exposure time, that organic carbon content generally has a small effect on AEs, and that AEs of sediment-bound metals differ among species.

  11. Multiple point statistical simulation using uncertain (soft) conditional data

    NASA Astrophysics Data System (ADS)

    Hansen, Thomas Mejer; Vu, Le Thanh; Mosegaard, Klaus; Cordua, Knud Skou

    2018-05-01

    Geostatistical simulation methods have been used to quantify the spatial variability of reservoir models since the 80s. In the last two decades, state-of-the-art simulation methods have changed from being based on covariance-based two-point statistics to multiple-point statistics (MPS), which allow simulation of more realistic Earth structures. In addition, increasing amounts of geo-information (geophysical, geological, etc.) from multiple sources are being collected. This poses the problem of integrating these different sources of information, such that decisions related to reservoir models can be taken on as informed a basis as possible. In principle, though difficult in practice, this can be achieved using computationally expensive Monte Carlo methods. Here we investigate the use of sequential-simulation-based MPS methods conditional to uncertain (soft) data as a computationally efficient alternative. First, it is demonstrated that current implementations of sequential simulation based on MPS (e.g. SNESIM, ENESIM and Direct Sampling) do not account properly for uncertain conditional information, due to a combination of using only co-located information and a random simulation path. Then, we suggest two approaches that better account for the available uncertain information. The first makes use of a preferential simulation path, where more informed model parameters are visited preferentially to less informed ones. The second involves using non-co-located uncertain information. For different types of available data, these approaches are demonstrated to produce simulation results similar to those obtained by the general Monte Carlo based approach. These methods allow MPS simulation to condition properly to uncertain (soft) data, and hence provide a computationally attractive approach for integrating information about a reservoir model.

  12. Implicit particle filtering for equations with partial noise and application to geomagnetic data assimilation

    NASA Astrophysics Data System (ADS)

    Morzfeld, M.; Atkins, E.; Chorin, A. J.

    2011-12-01

    The task in data assimilation is to identify the state of a system from an uncertain model supplemented by a stream of incomplete and noisy data. The model is typically given in the form of a discretization of an Ito stochastic differential equation (SDE), x(n+1) = R(x(n)) + G W(n), where x is an m-dimensional vector and n = 0, 1, 2, .... The m-dimensional vector function R and the m x m matrix G depend on the SDE as well as on the discretization scheme, and W is an m-dimensional vector whose elements are independent standard normal variates. The data are y(n) = h(x(n)) + Q V(n), where h is a k-dimensional vector function, Q is a k x k matrix and V is a vector whose components are independent standard normal variates. One can use statistics of the conditional probability density (pdf) of the state given the observations, p(n+1) = p(x(n+1)|y(1), ..., y(n+1)), to identify the state x(n+1). Particle filters approximate p(n+1) by sequential Monte Carlo and rely on the recursive formulation of the target pdf, p(n+1) ∝ p(x(n+1)|x(n)) p(y(n+1)|x(n+1)). The pdf p(x(n+1)|x(n)) can be read off the model equations to be a Gaussian with mean R(x(n)) and covariance matrix Σ = GG^T, where the superscript T denotes the transpose; the pdf p(y(n+1)|x(n+1)) is a Gaussian with mean h(x(n+1)) and covariance QQ^T. In a sampling-importance-resampling (SIR) filter one samples new values for the particles from a prior pdf and then weights these samples with weights determined by the observations, to yield an approximation to p(n+1). Such weighting schemes often yield small weights for many of the particles. Implicit particle filtering overcomes this problem by using the observations to generate the particles, thus focusing attention on regions of large probability. A suitable algebraic equation that depends on the model and the observations is constructed for each particle, and its solution yields high-probability samples of p(n+1). In the current formulation of the implicit particle filter, the state covariance matrix Σ is assumed to be non-singular. In the present work we consider the case where the covariance Σ is singular. This happens in particular when the noise is spatially smooth and can be represented by a small number of Fourier coefficients, as is often the case in geophysical applications. We derive an implicit filter for this problem and show that it is very efficient, because the filter operates in a space whose dimension is the rank of Σ, rather than the full model dimension. We compare the implicit filter to SIR, to the Ensemble Kalman Filter and to variational methods, and also study how information from data is propagated from observed to unobserved variables. We illustrate the theory on two coupled nonlinear PDEs in one space dimension that have been used as a test-bed for geomagnetic data assimilation. We observe that the implicit filter gives good results with few (2-10) particles, while SIR requires thousands of particles for similar accuracy. We also find lower limits to the accuracy of the filter's reconstruction as a function of data availability.

  13. Integrating remotely sensed surface water extent into continental scale hydrology.

    PubMed

    Revilla-Romero, Beatriz; Wanders, Niko; Burek, Peter; Salamon, Peter; de Roo, Ad

    2016-12-01

    In hydrological forecasting, data assimilation techniques are employed to improve estimates of initial conditions by updating incorrect model states with observational data. However, the limited availability of continuous and up-to-date ground streamflow data is one of the main constraints for large-scale flood forecasting models. This is the first study that assesses the impact of assimilating daily remotely sensed surface water extent at 0.1° × 0.1° spatial resolution, derived from the Global Flood Detection System (GFDS), into a global rainfall-runoff model including large ungauged areas at the continental scale in Africa and South America. Surface water extent is observed using a range of passive microwave remote sensors. The methodology uses brightness temperature, as water bodies have a lower emissivity than the surrounding land. In a time series, the satellite signal is expected to vary with changes in water surface, and anomalies can be correlated with flood events. The Ensemble Kalman Filter (EnKF), a Monte Carlo implementation of data assimilation, is used here by applying random sampling perturbations to the precipitation inputs to account for uncertainty, obtaining ensemble streamflow simulations from the LISFLOOD model. Results of the updated streamflow simulation are compared to baseline simulations without assimilation of the satellite-derived surface water extent. Validation is performed at over 100 in situ river gauges using daily streamflow observations in the African and South American continents over a one-year period. Some of the more commonly used metrics in hydrology were calculated: KGE', NSE, PBIAS%, R2, RMSE, and VE. Results show that, for example, the NSE score improved at 61 out of 101 stations, with significant improvements in both the timing and volume of the flow peaks, whereas validation at gauges located in lowland jungle showed the poorest performance, mainly due to the influence of closed forest on the satellite signal retrieval. The conclusion is that remotely sensed surface water extent holds potential for improving rainfall-runoff streamflow simulations, potentially leading to a better forecast of the peak flow.
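
    The flavor of the ensemble generation can be sketched with a toy linear-reservoir model standing in for LISFLOOD; the gamma rainfall series and the lognormal perturbation scale are assumptions made only for this example.

        # Ensemble generation by random perturbation of precipitation inputs,
        # run through a toy linear-reservoir rainfall-runoff model (not LISFLOOD).
        import numpy as np

        rng = np.random.default_rng(2)
        n_ens, n_days = 32, 365
        precip = rng.gamma(0.8, 4.0, n_days)                 # synthetic daily rainfall

        # Multiplicative lognormal perturbations to represent input uncertainty.
        perturbed = precip * rng.lognormal(mean=0.0, sigma=0.3, size=(n_ens, n_days))

        k, storage = 0.05, np.zeros(n_ens)
        streamflow = np.zeros((n_ens, n_days))
        for t in range(n_days):
            storage += perturbed[:, t]
            streamflow[:, t] = k * storage                   # linear-reservoir outflow
            storage -= streamflow[:, t]

        # The ensemble spread at each day is what the EnKF update acts on
        # when an observation arrives.
        print(streamflow.mean(axis=0)[:5], streamflow.std(axis=0)[:5])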

  14. Nonstandard convergence to jamming in random sequential adsorption: The case of patterned one-dimensional substrates

    NASA Astrophysics Data System (ADS)

    Verma, Arjun; Privman, Vladimir

    2018-02-01

    We study the approach to the large-time jammed state of the deposited particles in the model of random sequential adsorption. The convergence laws are usually derived from the argument of Pomeau, which assumes that, at large enough times, the process is dominated by small landing regions into each of which only a single particle can be deposited without overlapping earlier deposited particles, and that, after a certain time, such regions are no longer created by depositions in larger gaps. The second assumption has been that the size distribution of gaps open for particle-center landing in this large-time small-gaps regime is finite in the limit of zero gap size. We report numerical Monte Carlo studies of a recently introduced model of random sequential adsorption on patterned one-dimensional substrates that suggest that the second assumption must be generalized. We argue that a region exists in the parameter space of the studied model in which the gap-size distribution in the Pomeau large-time regime actually vanishes linearly at zero gap size. In another region, the distribution develops a threshold property, i.e., there are no small gaps below a certain gap size. We discuss the implications of these findings for new asymptotic power-law and exponential-modified-by-a-power-law convergences to jamming in irreversible one-dimensional deposition.
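
    For orientation, a bare-bones Monte Carlo realization of classic (unpatterned) one-dimensional continuum RSA shows how the jamming coverage and the gap-size statistics are sampled; the patterned substrate of the paper is not modeled here.

        # Continuum random sequential adsorption of unit-length segments on [0, L):
        # attempts land uniformly; a deposition is accepted only without overlap.
        import bisect
        import numpy as np

        rng = np.random.default_rng(3)
        L, attempts = 1000.0, 500_000
        centers = []                       # sorted list of accepted segment centers

        for _ in range(attempts):
            x = rng.uniform(0.5, L - 0.5)
            i = bisect.bisect_left(centers, x)
            ok_left = i == 0 or x - centers[i - 1] >= 1.0
            ok_right = i == len(centers) or centers[i] - x >= 1.0
            if ok_left and ok_right:
                bisect.insort(centers, x)

        gaps = np.diff(centers) - 1.0      # empty space between adjacent segments
        print("coverage:", len(centers) / L)
        print("fraction of gaps too small to ever fill:", np.mean(gaps < 1.0))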

  15. Sequential Inverse Problems Bayesian Principles and the Logistic Map Example

    NASA Astrophysics Data System (ADS)

    Duan, Lian; Farmer, Chris L.; Moroz, Irene M.

    2010-09-01

    Bayesian statistics provides a general framework for solving inverse problems, but is not without interpretation and implementation problems. This paper discusses difficulties arising from the fact that forward models are always in error to some extent. Using a simple example based on the one-dimensional logistic map, we argue that, when implementation problems are minimal, the Bayesian framework is quite adequate. In this paper the Bayesian Filter is shown to be able to recover excellent state estimates in the perfect model scenario (PMS) and to distinguish the PMS from the imperfect model scenario (IMS). Through a quantitative comparison of the way in which the observations are assimilated in both the PMS and the IMS scenarios, we suggest that one can, sometimes, measure the degree of imperfection.
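
    The experiment's flavor can be reproduced in a few lines: a particle-filter state estimate for the logistic map in the perfect model scenario, with an arbitrarily chosen map parameter and noise level rather than the authors' exact setup.

        # Particle filtering of the logistic map x -> a*x*(1-x) from noisy observations;
        # perfect model scenario: the filter uses the same map that generated the data.
        import numpy as np

        rng = np.random.default_rng(4)
        a, obs_std, n_particles, n_steps = 3.9, 0.05, 2000, 100

        x_true = 0.3
        particles = rng.uniform(0.0, 1.0, n_particles)
        errors = []

        for _ in range(n_steps):
            x_true = a * x_true * (1.0 - x_true)
            y = x_true + obs_std * rng.standard_normal()
            particles = a * particles * (1.0 - particles)
            w = np.exp(-0.5 * ((y - particles) / obs_std) ** 2)
            w /= w.sum()
            particles = particles[rng.choice(n_particles, n_particles, p=w)]
            # Small jitter keeps the ensemble from collapsing to identical copies,
            # since the dynamics here are deterministic.
            particles = np.clip(particles + 1e-4 * rng.standard_normal(n_particles), 0.0, 1.0)
            errors.append(abs(particles.mean() - x_true))

        print("mean absolute state error:", np.mean(errors))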

  16. The PMHT: solutions for some of its problems

    NASA Astrophysics Data System (ADS)

    Wieneke, Monika; Koch, Wolfgang

    2007-09-01

    Tracking multiple targets in a cluttered environment is a challenging task. Probabilistic Multiple Hypothesis Tracking (PMHT) is an efficient approach for dealing with it. Essentially, PMHT is based on the method of Expectation-Maximization for handling association conflicts. Linearity in the number of targets and measurements is the main motivation for further development and extension of this methodology. Unfortunately, compared with the Probabilistic Data Association Filter (PDAF), PMHT has not yet shown its superiority in terms of track-loss statistics. Furthermore, the problem of track extraction and deletion is apparently not yet satisfactorily solved within this framework. Four properties of PMHT are responsible for its problems in track maintenance: Non-Adaptivity, Hospitality, Narcissism and Local Maxima [1, 2]. In this work we present a solution for each of them and derive an improved PMHT by integrating the solutions into the PMHT formalism. The new PMHT is evaluated by Monte Carlo simulations. A sequential likelihood-ratio (LR) test for track extraction has been developed and already integrated into the framework of traditional Bayesian Multiple Hypothesis Tracking [3]. As a multi-scan approach, the PMHT methodology also has the potential for track extraction. In this paper an analogous integration of a sequential LR test into the PMHT framework is proposed. We present an LR formula for track extraction and deletion using the PMHT update formulae. As PMHT provides all the ingredients required for a sequential LR calculation, the LR is a by-product of the PMHT iteration process. The resulting update formula for the sequential LR test therefore enables the development of track-before-detect algorithms for PMHT. The approach is illustrated by a simple example.

  17. Spatio-Temporal Mining of PolSAR Satellite Image Time Series

    NASA Astrophysics Data System (ADS)

    Julea, A.; Meger, N.; Trouve, E.; Bolon, Ph.; Rigotti, C.; Fallourd, R.; Nicolas, J.-M.; Vasile, G.; Gay, M.; Harant, O.; Ferro-Famil, L.

    2010-12-01

    This paper presents an original data mining approach for describing Satellite Image Time Series (SITS) spatially and temporally. It relies on the extraction of pixel-based evolutions and sub-evolutions. These evolutions, namely the frequent grouped sequential patterns, are required to cover a minimum surface and to affect pixels that are sufficiently connected. These spatial constraints are actively used to cope with large data volumes and to select evolutions that are meaningful to end-users. In this paper, a specific application to fully polarimetric SAR image time series is presented. Preliminary experiments performed on a RADARSAT-2 SITS covering the Chamonix Mont-Blanc test site are used to illustrate the proposed approach.

  18. Derivative-Free Estimation of the Score Vector and Observed Information Matrix with Application to State-Space Models

    DTIC Science & Technology

    2015-07-14

    [Indexed snippet only; the record's abstract is garbled OCR fragments of a proof. Recoverable citation: (2008). Sequential Monte Carlo smoothing with application to parameter estimation in non-linear state space models. Bernoulli, 14, 155-179.]

  19. Incremental Bayesian Category Learning From Natural Language.

    PubMed

    Frermann, Lea; Lapata, Mirella

    2016-08-01

    Models of category learning have been extensively studied in cognitive science and primarily tested on perceptual abstractions or artificial stimuli. In this paper, we focus on categories acquired from natural language stimuli, that is, words (e.g., chair is a member of the furniture category). We present a Bayesian model that, unlike previous work, learns both categories and their features in a single process. We model category induction as two interrelated subproblems: (a) the acquisition of features that discriminate among categories, and (b) the grouping of concepts into categories based on those features. Our model learns categories incrementally using particle filters, a sequential Monte Carlo method commonly used for approximate probabilistic inference that sequentially integrates newly observed data and can be viewed as a plausible mechanism for human learning. Experimental results show that our incremental learner obtains meaningful categories which yield a closer fit to behavioral data compared to related models, while at the same time acquiring features which characterize the learned categories. (An earlier version of this work was published in Frermann and Lapata.) Copyright © 2015 Cognitive Science Society, Inc.

  20. Assimilation of TOPEX/POSEIDON altimeter data into a circulation model of the North Atlantic

    NASA Astrophysics Data System (ADS)

    Blayo, E.; Verron, J.; Molines, J. M.

    1994-12-01

    Assimilation experiments were conducted using the first 12 months of TOPEX/POSEIDON (T/P) altimeter measurements in a multilayered quasi-geostrophic model of the North Atlantic between 20°N and 60°N. These experiments demonstrate the feasibility of using T/P data to control a basin-scale circulation model by means of an assimilation procedure. Moreover, they allow us to recreate the four-dimensional behavior of the North Atlantic Ocean during the year October 1992-September 1993 and to improve our knowledge and understanding of such circulation patterns. For this study we used a four-layer quasi-geostrophic model of high horizontal resolution (1/6° in latitude and longitude). The assimilation procedure used is an along-track, sequential nudging technique. The evolution of the model general circulation is described and analyzed from deterministic and statistical points of view, with special emphasis on the Gulf Stream area. The gross features of the North Atlantic circulation in terms of mean transport and circulation are reproduced, such as the path, penetration and recirculation of the Gulf Stream, and its meandering throughout the eastern basin. The North Atlantic Drift is, however, noticeably underestimated. A northern meander of the north wall of the Gulf Stream above the New England Seamount Chain is present for most of the year, while, just downstream, the southern part of the jet is subject to a 100-km southeastward deflection. The Azores Current is shown to remain stable and to shift southward with time from the beginning of December 1992 to the end of April 1993, the amplitude of the shift being about 2°. The computation of the mean latitude of the Gulf Stream as a function of time shows an abrupt shift from a northern position to a southern position in January, and a reverse shift, from a southern position to a northern position, in July. Finally, some issues are addressed concerning the comparison of assimilation experiments using T/P data and Geosat data. The first results show that the T/P simulations are more energetic than the Geosat simulations, especially east of the Mid-Atlantic Ridge, for every wavelength from 50 km to 500 km. This property is also verified in the deep ocean. The predicted abyssal circulation is indeed more energetic in the T/P case, which is more in accordance with what we know of the real ocean. Moreover, the good T/P altimeter coverage near the coasts greatly improves the model eddy kinetic energy levels in these areas, especially east of 25°W.
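
    Schematically, nudging amounts to adding a relaxation term toward the observations wherever, and only when, they are available; the scalar toy model below illustrates the mechanism, not the quasi-geostrophic system itself, and all numbers are assumptions for the example.

        # Schematic nudging: relax a model trajectory toward intermittent observations.
        import numpy as np

        rng = np.random.default_rng(5)
        n_steps, dt, tau = 500, 0.1, 2.0            # tau: nudging relaxation time scale

        truth = model = 1.0
        for n in range(n_steps):
            forcing = np.sin(0.05 * n)
            truth = truth + dt * (-0.1 * truth + forcing)
            # Observations only every 10th step, mimicking along-track sampling.
            obs = truth + 0.1 * rng.standard_normal() if n % 10 == 0 else None
            tendency = -0.1 * model + forcing
            if obs is not None:
                tendency += (obs - model) / tau      # nudging term, active at obs times only
            model = model + dt * tendency

        print("final misfit:", abs(model - truth))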

  1. Vertical drying of a suspension of sticks: Monte Carlo simulation for continuous two-dimensional problem

    NASA Astrophysics Data System (ADS)

    Lebovka, Nikolai I.; Tarasevich, Yuri Yu.; Vygornitskii, Nikolai V.

    2018-02-01

    The vertical drying of a two-dimensional colloidal film containing zero-thickness sticks (lines) was studied by means of kinetic Monte Carlo (MC) simulations. The continuous two-dimensional problem for both the positions and orientations was considered. The initial state before drying was produced using a model of random sequential adsorption with isotropic orientations of the sticks. During the evaporation, an upper interface falls with a linear velocity in the vertical direction, and the sticks undergo translational and rotational Brownian motions. The MC simulations were run at different initial number concentrations (the numbers of sticks per unit area), p_i, and solvent evaporation rates, u. For completely dried films, the spatial distributions of the sticks, the order parameters, and the electrical conductivities of the films in both the horizontal, x, and vertical, y, directions were examined. Significant evaporation-driven self-assembly and stratification of the sticks in the vertical direction was observed. The extent of stratification increased with increasing values of u. The anisotropy of the electrical conductivity of the film can be finely regulated by changes in the values of p_i and u.

  2. Solvent effects on the absorption spectrum and first hyperpolarizability of keto-enol tautomeric forms of anil derivatives: A Monte Carlo/quantum mechanics study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adriano Junior, L.; Fonseca, T. L.; Castro, M. A.

    2016-06-21

    Theoretical results for the absorption spectrum and electric properties of the enol and keto tautomeric forms of anil derivatives in the gas phase and in solution are presented. The electronic properties in chloroform, acetonitrile, methanol, and water were determined by carrying out sequential Monte Carlo simulations and quantum mechanics calculations based on time-dependent density functional theory and on the second-order Møller-Plesset perturbation theory method. The results illustrate the role played by electrostatic interactions in the electronic properties of anil derivatives in a liquid environment. There is a significant increase of the dipole moment in solution (20%-100%) relative to the gas-phase value. Solvent effects are mild for the absorption spectrum and linear polarizability, but they can be particularly important for the first hyperpolarizability. A large first hyperpolarizability contrast between the enol and keto forms is observed when the absorption spectra present intense lowest-energy absorption bands. Dynamic results for the first hyperpolarizability are in qualitative agreement with the available experimental results.

  3. A hybrid parallel framework for the cellular Potts model simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Yi; He, Kejing; Dong, Shoubin

    2009-01-01

    The Cellular Potts Model (CPM) has been widely used for biological simulations. However, most current implementations are either sequential or approximated, and cannot be used for large-scale, complex 3D simulations. In this paper we present a hybrid parallel framework for CPM simulations. The time-consuming PDE solving, cell division, and cell reaction operations are distributed to clusters using the Message Passing Interface (MPI). The Monte Carlo lattice update is parallelized on shared-memory SMP systems using OpenMP. Because the Monte Carlo lattice update is much faster than the PDE solving and SMP systems are more and more common, this hybrid approach achieves good performance and high accuracy at the same time. Based on the parallel Cellular Potts Model, we studied avascular tumor growth using a multiscale model. The application and performance analysis show that the hybrid parallel framework is quite efficient. The hybrid parallel CPM can be used for the large-scale simulation (~10^8 sites) of the complex collective behavior of numerous cells (~10^6).
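
    At its core, the lattice update that the paper parallelizes with OpenMP is a Metropolis label-copy attempt; the serial sketch below uses a plain Potts mismatch energy in place of the full CPM Hamiltonian (volume and adhesion terms omitted), with made-up lattice size and temperature.

        # Metropolis label-copy sweeps of a cellular-Potts-style lattice update
        # (nearest-neighbor mismatch energy only; periodic boundary conditions).
        import numpy as np

        rng = np.random.default_rng(6)
        L, T = 64, 1.0
        lattice = rng.integers(0, 5, size=(L, L))    # five cell labels on a 64x64 grid

        def mismatch_energy(lat, i, j, spin):
            # Interaction energy: number of unlike nearest-neighbor bonds.
            nbrs = (lat[(i + 1) % L, j], lat[(i - 1) % L, j],
                    lat[i, (j + 1) % L], lat[i, (j - 1) % L])
            return sum(int(spin != s) for s in nbrs)

        moves = ((0, 1), (0, -1), (1, 0), (-1, 0))
        for _ in range(20000):
            i, j = rng.integers(0, L), rng.integers(0, L)
            di, dj = moves[rng.integers(4)]
            new_spin = lattice[(i + di) % L, (j + dj) % L]   # copy a random neighbor's label
            dE = (mismatch_energy(lattice, i, j, new_spin)
                  - mismatch_energy(lattice, i, j, lattice[i, j]))
            if dE <= 0 or rng.random() < np.exp(-dE / T):    # Metropolis acceptance
                lattice[i, j] = new_spin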

  4. The role of short-term memory impairment in nonword repetition, real word repetition, and nonword decoding: A case study.

    PubMed

    Peter, Beate

    2018-01-01

    In a companion study, adults with dyslexia and adults with a probable history of childhood apraxia of speech showed evidence of difficulty with processing sequential information during nonword repetition, multisyllabic real word repetition and nonword decoding. Results suggested that some errors arose in visual encoding during nonword reading; in all levels of processing, but especially short-term memory storage/retrieval, during nonword repetition; and in motor planning and programming during complex real word repetition. To further investigate the role of short-term memory, a participant with short-term memory impairment (MI) was recruited. MI was confirmed by poor performance on a sentence repetition task and three nonword repetition tasks, all of which have a high short-term memory load, whereas typical performance was observed on tests of reading, spelling, and static verbal knowledge, all with low short-term memory loads. Experimental results show error-free performance during multisyllabic real word repetition but high counts of sequence errors, especially migrations and assimilations, during nonword repetition, supporting short-term memory as a locus of the sequential processing deficit during nonword repetition. Results are also consistent with the hypothesis that during complex real word repetition, short-term memory is bypassed as the word is recognized and retrieved from long-term memory prior to producing the word.

  5. Central Metabolic Responses to Ozone and Herbivory Affect Photosynthesis and Stomatal Closure

    PubMed Central

    Khaling, Eliezer; Lassueur, Steve

    2016-01-01

    Plants have evolved adaptive mechanisms that allow them to tolerate a continuous range of abiotic and biotic stressors. Tropospheric ozone (O3), a global anthropogenic pollutant, directly affects living organisms and ecosystems, including plant-herbivore interactions. In this study, we investigate the stress responses of Brassica nigra (wild black mustard) exposed consecutively to O3 and the specialist herbivore Pieris brassicae. Transcriptomics and metabolomics data were evaluated using multivariate, correlation, and network analyses for the O3 and herbivory responses. O3 stress symptoms resembled those of senescence and phosphate starvation, while a sequential shift from O3 to herbivory induced characteristic plant defense responses, including a decrease in central metabolism, induction of the jasmonic acid/ethylene pathways, and emission of volatiles. Omics network and pathway analyses predicted a link between glycerol and central energy metabolism that influences the osmotic stress response and stomatal closure. Further physiological measurements confirmed that while O3 stress inhibited photosynthesis and carbon assimilation, sequential herbivory counteracted the initial responses induced by O3, resulting in a phenotype similar to that observed after herbivory alone. This study clarifies the consequences of multiple stress interactions on a plant metabolic system and also illustrates how omics data can be integrated to generate new hypotheses in ecology and plant physiology. PMID:27758847

  6. Enhancing hydrologic data assimilation by evolutionary Particle Filter and Markov Chain Monte Carlo

    NASA Astrophysics Data System (ADS)

    Abbaszadeh, Peyman; Moradkhani, Hamid; Yan, Hongxiang

    2018-01-01

    Particle Filters (PFs) have received increasing attention from researchers in different disciplines, including the hydro-geosciences, as an effective tool to improve model predictions in nonlinear and non-Gaussian dynamical systems. The implementation of dual state and parameter estimation using PFs in hydrology has evolved since 2005 from PF-SIR (sampling importance resampling) to PF-MCMC (Markov Chain Monte Carlo), and now to the most effective and robust framework to date, an evolutionary PF approach based on a Genetic Algorithm (GA) and MCMC, the so-called EPFM. In this framework, the prior distribution undergoes an evolutionary process based on the designed mutation and crossover operators of the GA. The merit of this approach is that the particles move to appropriate positions by means of the GA optimization, and the number of effective particles is then increased by means of MCMC, whereby particle degeneracy is avoided and particle diversity is improved. In this study, the usefulness and effectiveness of the proposed EPFM are investigated by applying the technique to a conceptual and highly nonlinear hydrologic model over four river basins located in different climatic and geographical regions of the United States. Both synthetic and real case studies demonstrate that the EPFM improves both state and parameter estimation more effectively and reliably than PF-MCMC.
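
    The GA step can be pictured as crossover and mutation operators applied to the particle population before weighting; the sketch below shows the two operators only, with made-up rates and dimensions, and is not the published EPFM.

        # Schematic GA operators on a particle population: crossover blends pairs of
        # particles, mutation adds small random kicks; weighting follows as usual.
        import numpy as np

        rng = np.random.default_rng(7)
        n_particles, dim = 100, 3
        particles = rng.normal(0.0, 1.0, (n_particles, dim))

        def crossover(pop, rate=0.5):
            partners = pop[rng.permutation(len(pop))]
            alpha = rng.uniform(0.0, 1.0, (len(pop), 1))
            mask = rng.random((len(pop), 1)) < rate
            return np.where(mask, alpha * pop + (1.0 - alpha) * partners, pop)

        def mutate(pop, rate=0.1, scale=0.05):
            kicks = scale * rng.standard_normal(pop.shape)
            mask = rng.random(pop.shape) < rate
            return pop + mask * kicks

        particles = mutate(crossover(particles))
        # Importance weighting against the observation and an MCMC move step
        # (as in PF-MCMC) would follow here to restore particle diversity.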

  7. Phase segregation and spontaneous symmetry breaking in a bidirectional two-channel non-conserving model with narrow entrances

    NASA Astrophysics Data System (ADS)

    Sharma, Natasha; Gupta, A. K.

    2017-04-01

    Motivated by connections between the inputs and outputs of several transport mechanisms and multi-species functionalities, we studied an open system of a two-species totally asymmetric simple exclusion process with narrow entrances, in which the particles interact with the surrounding environment through Langmuir kinetics (LK). We analyzed the model within the framework of mean-field theory, and examined complex phenomena such as boundary-induced phase transitions and spontaneous symmetry breaking under various conditions on the attachment and detachment rates. Based on the theoretical investigations we obtained the phase boundaries for the various symmetric and asymmetric phases. Our findings display rich behavior, highlighting the significant effect of the LK rates on symmetry breaking. It is found that for lower orders of the LK rates the number of symmetric and asymmetric phases increases notably, while for higher orders symmetry breaking disappears, revealing that the presence of bulk non-conserving processes can restore or break the uniformity between the two species. The critical value of the LK rates beyond which the asymmetric phases disappear is identified. The theoretical findings are corroborated by extensive Monte Carlo simulations. The effects of system size and of symmetry-breaking events on the Monte Carlo simulation results have also been examined using particle density histograms.

  8. Improving the Algae Bloom Prediction through the Assimilation of the Remotely Sensed Chlorophyll-A Data in a Generic Ecological Model in the North Sea

    NASA Astrophysics Data System (ADS)

    El Serafy, Ghada

    2010-05-01

    Harmful algae can cause damage to co-existing organisms, tourism and farming. Accurate predictions of future algal composition and abundance, as well as of when and where algal blooms may occur, could support early warning and mitigation. The Generic Ecological Model, GEM [Blauw et al., 2009], is an instrument that can be applied to any water system (fresh, transitional or coastal) to calculate primary production, chlorophyll-a concentration and phytoplankton species composition. It consists of physical, chemical and ecological model components which are coupled together to build one generic and flexible modeling tool. For the North Sea, the model has been analyzed to assess the sensitivity of the simulated chlorophyll-a concentration to a subset of ecologically significant factors. The research led to the identification of the set of parameters most significant for the algal bloom process in the North Sea [Salacinska et al., 2009]. In order to improve the predictions of the model, this set of parameters and the chlorophyll-a concentration can be further estimated through the use of data assimilation. In this research, the Ensemble Kalman Filter (EnKF) data assimilation technique is used to assimilate chlorophyll-a data of the North Sea, retrieved from MEdium Resolution Imaging Spectrometer (MERIS) data [Peters et al., 2005], into the GEM model. The chlorophyll-a data include concentrations and error information that enable their use in data assimilation. For the same purpose, the uncertainty of the generic ecological model GEM has been quantified by means of a Monte Carlo approach. Through a study covering the year 2003, the research demonstrates that both data and model are sufficiently robust for a successful assimilation. The results show that through the assimilation of the satellite data a better description of the algal bloom has been achieved, and an improvement of the capability of the model to predict the algal bloom for the North Sea has been confirmed.
    References: Blauw, A.N., Los, F.J., Bokhorst, M., Erftemeijer, P.L.A. (2009). GEM: a Generic Ecological Model for estuaries and coastal waters. Hydrobiologia, 618(1), 175-198. Peters, S.W.M., Eleveld, M., Pasterkamp, R., van der Woerd, H., Devolder, M., Jans, S., Park, Y., Ruddick, K., Block, T., Brockmann, C., Doerffer, R., Krasemann, H., Röttgers, R., Schönfeld, W., Jørgensen, P.V., Tilstone, G., Martinez-Vicente, V., Moore, G., Sørensen, K., Høkedal, J., Johnsen, T.M., Lømsland, E.R., Aas, E. (2005). Atlas of Chlorophyll-a concentration for the North Sea based on MERIS imagery of 2003. IVM report, Vrije Universiteit Amsterdam, 117 pp. ISBN 90-5192-026-1. Salacinska, K., El Serafy, G.Y., Blauw, A., Los, F.J. (2009). Sensitivity analysis of the two dimensional application of the Generic Ecological Model (GEM) to algal bloom prediction in the North Sea. Ecological Modelling, 221(7), 178-190. DOI: 10.1016/j.ecolmodel.2009.10.001.

  9. The Mediterranean Sea 1985-2007 re-analysis: validation results

    NASA Astrophysics Data System (ADS)

    Adani, Mario; Dobricic, Srdjan; Pinardi, Nadia

    2010-05-01

    Re-analyses differ from analyses in that they are consistent over the whole period, since the oceanic state estimates are produced without changes in the modelling assumptions, and they are usually produced with systems that are more advanced than those available at the time the observations were collected. A fundamental part of a re-analysis system is the data assimilation scheme, which minimizes a cost function penalizing the time-space misfits between the data and the numerical solutions, with the constraint of the model equations and their parameters. In this work we compare ocean circulation estimates provided by a pure simulation, by a system whose assimilation scheme is based on a sequential Optimal Interpolation (OI) algorithm, and by a three-dimensional variational scheme (3DVar). The OGCM used in this work is based on the OPA 8.1 code (Madec et al., 1998), which has been implemented in the Mediterranean Sea by Tonani et al. (2008). The model has 1/16° horizontal resolution and 71 unevenly spaced vertical levels. The present model formulation uses a realistic water flux with river runoffs, which improves the realism of the simulation. One re-analysis is produced with the Reduced Order Optimal Interpolation (ROOI) (De Mey and Benkiran, 2002) and the other with OceanVar (Dobricic and Pinardi, 2008). The observational data sets assimilated in both re-analyses are:
    • the historical data archive of MedATLAS (Maillard et al., 2003), which contains vertical in situ profiles of temperature and salinity from bottles, XBT, MBT and CTD sensors;
    • temperature and salinity profiles collected in the framework of the MFSPP and MFSTEP projects;
    • CLS along-track satellite sea level anomaly data from the ERS1, ERS2, Envisat, Topex/Poseidon and Jason1 satellites (Pujol and Larnicol, 2005).
    Reanalyzed daily mean fields of Sea Surface Temperature (SST) from Medspiration (Marullo et al., 2007) and the delayed-time operational product of CNR-ISAC have been used to relax the model SST. The Mean Dynamic Topography of Dobricic (2005) has been used for both experiments. The model is forced with a combined dataset of ECMWF analyses, when available, and ERA-15. The precipitations are the monthly mean climatology of the NCEP re-analysis (Kistler et al., 2001); the river runoff data are monthly mean climatologies from the Global Runoff Data Centre (GRDC) and from Raicic (1996) for the minor Adriatic Sea rivers. The assimilation schemes help to reduce the spin-up time of the model by acting as a forcing inside the water column. Both re-analyses show significantly better results than the simulation, reducing both bias and root mean square error, even though the structure of the error remains almost the same as in the simulation: the largest error for tracers is confined to the thermocline, especially in summer, highlighting a problem in the mixing parameterization, and the major errors for SLA are confined to the most dynamically active areas. Satellite altimetry observations prove to be a fundamental dataset for constraining the model solution, and since their sampling is homogeneous they permit a consistent assessment of the model behaviour over the years, which is not possible from in situ observations, whose sampling is extremely inhomogeneous in both time and space. This study describes the development of modelling and data assimilation tools for the production of a re-analysis for the entire Mediterranean Sea. In order to carry out the re-analysis, two major steps were undertaken in this work.
    In the first, the general circulation model was upgraded to have correct air-sea water fluxes. In the second, two assimilation schemes, one new and the other consolidated, were compared to show their impact on the quality of the re-analysis. The general circulation model used in this study is shown to be capable of reproducing quite accurately the ocean dynamics of the Mediterranean Sea. The results show that the model solution is in agreement with the data and observations, even though some parameterizations of the model should be improved (i.e. heat flux and mixing processes). The new implementation of a realistic water flux, proposed in this study, has improved the model solution to the point that a re-analysis is possible. The study of the re-analyses produced shows that both products are sufficiently accurate for appropriate climate studies. Both assimilation schemes show good capabilities in correcting the solutions provided by the dynamical model. Moreover, both systems have been shown to retain this information and project it into the future. Eventually, even for very complex non-linear systems with millions of prognostic variables, the equivalence between the sequential Kalman filter approach and the variational one has been demonstrated.

  10. Concurrent Monte Carlo transport and fluence optimization with fluence adjusting scalable transport Monte Carlo

    PubMed Central

    Svatos, M.; Zankowski, C.; Bednarz, B.

    2016-01-01

    Purpose: The future of radiation therapy will require advanced inverse planning solutions to support single-arc, multiple-arc, and “4π” delivery modes, which present unique challenges in finding an optimal treatment plan over a vast search space, while still preserving dosimetric accuracy. The successful clinical implementation of such methods would benefit from Monte Carlo (MC) based dose calculation methods, which can offer improvements in dosimetric accuracy when compared to deterministic methods. The standard method for MC based treatment planning optimization leverages the accuracy of the MC dose calculation and the efficiency of well-developed optimization methods, by precalculating the fluence-to-dose relationship within a patient with MC methods and subsequently optimizing the fluence weights. However, the sequential nature of this implementation is computationally time consuming and memory intensive. Methods to reduce the overhead of the MC precalculation have been explored in the past, demonstrating promising reductions of the computational time overhead, but with limited impact on the memory overhead due to the sequential nature of the dose calculation and fluence optimization. The authors propose an entirely new form of “concurrent” Monte Carlo treatment plan optimization: a platform which optimizes the fluence during the dose calculation, reduces the computation time wasted on beamlets that contribute weakly to the final dose distribution, and requires only a low memory footprint to function. In this initial investigation, the authors explore the key theoretical and practical considerations of optimizing fluence in such a manner. Methods: The authors present a novel derivation and implementation of a gradient descent algorithm that allows for optimization during MC particle transport, based on highly stochastic information generated through particle transport of very few histories. A gradient rescaling and renormalization algorithm, and the concept of momentum from stochastic gradient descent, were used to address obstacles unique to performing gradient descent fluence optimization during MC particle transport. The authors have applied their method to two simple geometrical phantoms and one clinical patient geometry to examine the capability of this platform to generate conformal plans, as well as to assess its computational scaling and efficiency, respectively. Results: The authors obtain a reduction of at least 50% in total histories transported in their investigation compared to a theoretical unweighted beamlet calculation and subsequent fluence optimization method, and observe a roughly fixed optimization time overhead consisting of ∼10% of the total computation time in all cases. Finally, the authors demonstrate a negligible increase in memory overhead of ∼7–8 MB to allow for optimization of a clinical patient geometry surrounded by 36 beams using their platform. Conclusions: This study demonstrates a fluence optimization approach which could significantly improve the development of next generation radiation therapy solutions while incurring minimal additional computational overhead. PMID:27277051
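
    The core numerical ingredient, gradient descent with momentum on the fluence weights driven by a noisy few-history dose estimate, might be sketched as follows; the dose-influence matrix, the noise level and the quadratic objective are assumptions made only for illustration.

        # Stochastic gradient descent with momentum on fluence weights, using a
        # noisy dose estimate standing in for transport of very few histories.
        import numpy as np

        rng = np.random.default_rng(8)
        n_beamlets, n_voxels = 36, 500
        D = np.abs(rng.normal(1.0, 0.3, (n_voxels, n_beamlets)))  # hypothetical dose-influence matrix
        target = np.full(n_voxels, 1.5)                           # hypothetical prescription

        w = np.full(n_beamlets, 1.0 / n_beamlets)                 # fluence weights
        velocity, lr, momentum = np.zeros(n_beamlets), 1e-5, 0.9
        print("initial objective:", float(np.sum((D @ w - target) ** 2)))

        for step in range(2000):
            dose = D @ w
            noisy = dose * (1.0 + 0.2 * rng.standard_normal(n_voxels))  # few-history MC noise
            grad = 2.0 * D.T @ (noisy - target)          # gradient of sum((dose - target)^2)
            velocity = momentum * velocity - lr * grad   # momentum damps the stochastic gradient
            w = np.maximum(w + velocity, 0.0)            # fluence must stay non-negative

        print("final objective:", float(np.sum((D @ w - target) ** 2)))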

  11. An inverse method to estimate emission rates based on nonlinear least-squares-based ensemble four-dimensional variational data assimilation with local air concentration measurements.

    PubMed

    Geng, Xiaobing; Xie, Zhenghui; Zhang, Lijun; Xu, Mei; Jia, Binghao

    2018-03-01

    An inverse source estimation method is proposed to reconstruct emission rates using local air concentration sampling data. It involves the nonlinear least-squares-based ensemble four-dimensional variational data assimilation (NLS-4DVar) algorithm and a transfer coefficient matrix (TCM) created using FLEXPART, a Lagrangian atmospheric dispersion model. The method was tested by twin experiments and by experiments with actual Cs-137 concentrations measured around the Fukushima Daiichi Nuclear Power Plant (FDNPP). Emission rates can be reconstructed sequentially as a nuclear accident progresses, which is important in the response to a nuclear emergency. With pseudo-observations generated continuously, most of the emission rates were estimated accurately, except when the wind blew off land toward the sea and at extremely slow wind speeds near the FDNPP. Because of the long duration of accidents and the variability of meteorological fields, monitoring networks composed of land stations only in a local area are unable to provide enough information to support an emergency response. The errors in the estimation against the real observations from the FDNPP accident stemmed from a shortage of observations, a lack of data quality control, and an atmospheric dispersion model lacking refinement and appropriate meteorological data. The proposed method should be developed further to meet the requirements of a nuclear emergency response. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. The Mercator Océan operational and reanalysis systems: overview of recent improvements and scientific key issues

    NASA Astrophysics Data System (ADS)

    Hernandez, F.; Benkiran, M.; Bourdalle-Badie, R.; Bricaud, C.; Cailleau, S.; Chanut, J.; Desportes, C.; Dombrowsky, E.; drevillon, M.; Drillet, Y.; Elmoussaoui, A.; Ferry, N.; Garric, G.; Greiner, E.; Le Galloudec, O.; Lellouche, J.; Levier, B.; Parent, L.; Perruche, C.; Reffray, G.; Regnier, C.; Rémy, E.; Testut, C.; Tranchant, B.

    2011-12-01

    In the framework of the European project GMES/MyOcean, Mercator Océan has designed a hierarchy of ocean analysis, forecasting and reanalysis systems based on numerical models of the ocean and sea ice, data assimilation methods and a biogeochemistry model. Operational weekly analyses provide initial conditions for daily predictions. All ocean model configurations are based on NEMO. The 1/4° global and 1/12° Atlantic and Mediterranean configurations are improved with (i) the use of high-frequency (3 h) atmospheric forcings including the diurnal cycle, (ii) the use of the CORE bulk formulation, (iii) the use of a new TKE vertical mixing scheme, and (iv) the use of the LIM2-EVP ice model. These changes lead to a better representation of the diurnal cycle, the stratification in the upper layers, the interannual sea-ice extent, and, in the 1/12° configurations, mesoscale features and western boundary currents. The 1/36° IBI regional configuration adds a non-linear free surface, atmospheric pressure and tidal forcing, barotropic/baroclinic time splitting, and specific boundary conditions for operational nesting with the global system. At regional scale, a number of improvements (bathymetry and bottom friction) make it suitable for coastal modelling, validated against state-of-the-art coastal ocean models. Data assimilation is based on a reduced-order Kalman filter using a 3D multivariate modal decomposition of the forecast error. It jointly assimilates satellite altimetry, SST and in situ observations (temperature and salinity profiles, including ARGO data). Among the difficulties, the assimilation has to be operated in real time, with a limited and less accurate set of observations. Recent improvements in the global systems include (v) the insertion of the zonal/meridional velocity components into the control vector, (vi) the use of the IAU procedure, (vii) the insertion of new observational operators, (viii) the use of a new MDT, (ix) the introduction of pseudo-observations, and (x) the use of a bias correction method based on a variational approach to estimate large-scale biases. These improvements limit the noise introduced by sequential assimilation, for example in the vertical dynamics. Water masses on the shelves or near major run-offs, like the Amazon discharge, are preserved. A biogeochemistry prediction system has been added, based on the PISCES model; it uses a spatially degraded version of the real-time physics provided by the 1/4° global system. A reanalysis covering the "altimetric era" (1992-2009) has been carried out in collaboration with the Drakkar community. The GLORYS reanalyses describe the evolution of the ocean and sea-ice states; they are based on the Mercator operational 1/4° global system, but with a 75-level vertical grid, ERA-Interim atmospheric forcing fields (corrected with satellite-based fluxes) and the assimilation of delayed-time reprocessed and quality-controlled observations. First studies show the usefulness of the reanalysis for the interannual assessment of the mesoscale dynamics, but also of thermohaline circulation changes such as the MOC.

  13. A computationally efficient Bayesian sequential simulation approach for the assimilation of vast and diverse hydrogeophysical datasets

    NASA Astrophysics Data System (ADS)

    Nussbaumer, Raphaël; Gloaguen, Erwan; Mariéthoz, Grégoire; Holliger, Klaus

    2016-04-01

    Bayesian sequential simulation (BSS) is a powerful geostatistical technique, which has notably shown significant potential for the assimilation of datasets that are diverse with regard to their spatial resolution and mutual relationships. However, these types of BSS applications require a large number of realizations to adequately explore the solution space and to assess the corresponding uncertainties. Moreover, such simulations generally need to be performed on very fine grids in order to adequately exploit the technique's potential for characterizing heterogeneous environments. Correspondingly, the computational cost of BSS algorithms in their classical form is very high, which has so far limited an effective application of this method to large models and/or vast datasets. In this context, it is also important to note that the inherent assumption regarding the independence of the considered datasets is generally regarded as being too strong in the context of sequential simulation. To alleviate these problems, we have revisited the classical implementation of BSS and incorporated two key features to increase the computational efficiency. The first feature is a combined quadrant-spiral and superblock search, which targets run-time savings on large grids and adds flexibility with regard to the selection of neighboring points, using equal directional sampling and treating hard data and previously simulated points separately. The second feature is a constant simulation path, which enhances the efficiency for multiple realizations. We have also modified the aggregation operator to be more flexible with regard to the assumption of independence of the considered datasets. This is achieved through log-linear pooling, which essentially allows for attributing weights to the various data components. Finally, a multi-grid simulation path was created to enforce large-scale variance and to allow parameters, such as the log-linear weights or the type of simulation path, to be adapted at various scales. The newly implemented search method for kriging reduces the computational cost from an exponential dependence on the grid size in the original algorithm to a linear relationship, as each neighborhood search becomes independent of the grid size. For the considered examples, our results show a sevenfold reduction in run time for each additional realization when a constant simulation path is used. The traditional criticism that constant-path techniques introduce a bias into the simulations was explored, and our findings do indeed reveal a minor reduction in the diversity of the simulations. This bias can, however, be largely eliminated by changing the path type at different scales through the use of the multi-grid approach. Finally, we show that adapting the aggregation weight at each scale considered in our multi-grid approach allows for reproducing both the variogram and histogram, as well as the spatial trend of the underlying data.
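
    For a discrete variable, log-linear pooling reduces to a weighted geometric mean of the source distributions followed by renormalization; a small self-contained illustration, with made-up facies probabilities standing in for the data components.

        # Log-linear pooling of probability distributions over discrete categories:
        # a weighted geometric mean, p(x) ∝ prod_k p_k(x)^w_k, then renormalization.
        import numpy as np

        def log_linear_pool(pdfs, weights):
            pdfs, weights = np.asarray(pdfs, float), np.asarray(weights, float)
            log_pooled = weights @ np.log(pdfs)        # sum_k w_k * log p_k(x)
            pooled = np.exp(log_pooled - log_pooled.max())
            return pooled / pooled.sum()

        p_geophysics = [0.6, 0.3, 0.1]   # e.g. facies probabilities from one data source
        p_geology    = [0.2, 0.5, 0.3]   # ... and from another

        print(log_linear_pool([p_geophysics, p_geology], [0.7, 0.3]))
        # A weight pair of (1.0, 0.0) recovers the first source exactly.
        print(log_linear_pool([p_geophysics, p_geology], [1.0, 0.0]))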

  14. Exploring first-order phase transitions with population annealing

    NASA Astrophysics Data System (ADS)

    Barash, Lev Yu.; Weigel, Martin; Shchur, Lev N.; Janke, Wolfhard

    2017-03-01

    Population annealing is a hybrid of sequential and Markov chain Monte Carlo methods geared towards the efficient parallel simulation of systems with complex free-energy landscapes. Systems with first-order phase transitions are among the problems in computational physics that are difficult to tackle with standard methods such as local-update simulations in the canonical ensemble, for example with the Metropolis algorithm. It is hence interesting to see whether such transitions can be more easily studied using population annealing. We report here our preliminary observations from population annealing runs for the two-dimensional Potts model with q > 4, where it undergoes a first-order transition.
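
    In outline, population annealing carries a population of configurations down a temperature schedule, reweighting and resampling at each step with a few MCMC sweeps in between; the toy below anneals samples in a one-dimensional double well rather than the 2-D Potts model, with made-up schedule and population size.

        # Toy population annealing on the 1-D double-well energy E(x) = (x^2 - 1)^2.
        import numpy as np

        rng = np.random.default_rng(9)
        energy = lambda x: (x ** 2 - 1.0) ** 2
        n_pop = 2000
        betas = np.linspace(0.1, 10.0, 40)           # inverse-temperature schedule

        x = rng.normal(0.0, 2.0, n_pop)
        for prev_beta, beta in zip(betas[:-1], betas[1:]):
            # Reweight for the temperature step, then resample the population.
            w = np.exp(-(beta - prev_beta) * energy(x))
            w /= w.sum()
            x = x[rng.choice(n_pop, n_pop, p=w)]
            # A few Metropolis sweeps at the new temperature restore diversity.
            for _ in range(5):
                prop = x + 0.3 * rng.standard_normal(n_pop)
                accept = np.log(rng.random(n_pop)) < -beta * (energy(prop) - energy(x))
                x = np.where(accept, prop, x)

        print("well occupation (x > 0):", np.mean(x > 0))   # ~0.5 if both wells stay populated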

  15. Assimilating AmeriFlux Site Data into the Community Land Model with Carbon-Nitrogen Coupling via the Ensemble Kalman Filter

    NASA Astrophysics Data System (ADS)

    Pettijohn, J. C.; Law, B. E.; Williams, M. D.; Stoeckli, R.; Thornton, P. E.; Hudiburg, T. M.; Thomas, C. K.; Martin, J.; Hill, T. C.

    2009-12-01

    The assimilation of terrestrial carbon, water and nutrient cycle measurements into land surface models of these processes is fundamental to improving our ability to predict how these ecosystems may respond to climate change. A combination of measurements and models, each with their own systematic biases, must be considered when constraining the nonlinear behavior of these coupled dynamics. As such, we use the sequential Ensemble Kalman Filter (EnKF) to assimilate eddy covariance (EC) and other site-level AmeriFlux measurements into the NCAR Community Land Model with Carbon-Nitrogen coupling (CLM-CN v3.5), run in single-column mode at a 30-minute time step, to improve estimates of relatively unconstrained model state variables and parameters. Specifically, we focus on a semi-arid ponderosa pine site (US-ME2) in the Pacific Northwest to identify the mechanisms by which this ecosystem responds to severe late summer drought. Our EnKF analysis includes water, carbon, energy and nitrogen state variables (e.g., 10 volumetric soil moisture levels (0-3.43 m), ponderosa pine and shrub evapotranspiration and net ecosystem exchange of carbon dioxide stocks and flux components, snow depth, etc.) and associated parameters (e.g., PFT-level rooting distribution parameters, maximum subsurface runoff coefficient, soil hydraulic conductivity decay factor, snow aging parameters, maximum canopy conductance, C:N ratios, etc.). The effectiveness of the EnKF in constraining state variables and associated parameters is sensitive to their relative frequencies, in that C-N state variables and parameters with long time constants require similarly long time series in the analysis. We apply the EnKF kernel perturbation routine to disrupt preliminary convergence of covariances, which has been found in recent studies to be a problem more characteristic of low frequency vegetation state variables and parameters than high frequency ones more heavily coupled with highly varying climate (e.g., shallow soil moisture, snow depth). Preliminary results demonstrate that the assimilation of EC and other available AmeriFlux site physical, chemical and biological data significantly helps quantify and reduce CLM-CN model uncertainties and helps to constrain ‘hidden’ states and parameters that are essential in the coupled water, carbon, energy and nutrient dynamics of these sites. Such site-level calibration of CLM-CN is an initial step in identifying model deficiencies and in forecasts of future ecosystem responses to climate change.

  16. Group sequential monitoring based on the weighted log-rank test statistic with the Fleming-Harrington class of weights in cancer vaccine studies.

    PubMed

    Hasegawa, Takahiro

    2016-09-01

    In recent years, immunological science has evolved, and cancer vaccines are now approved and available for treating existing cancers. Because cancer vaccines require time to elicit an immune response, a delayed treatment effect is expected and is actually observed in drug approval studies. Accordingly, we propose the evaluation of survival endpoints by weighted log-rank tests with the Fleming-Harrington class of weights. We consider group sequential monitoring, which allows early efficacy stopping, and determine a semiparametric information fraction for the Fleming-Harrington family of weights, which is necessary for the error spending function. Moreover, we give a flexible survival model in cancer vaccine studies that considers not only the delayed treatment effect but also the long-term survivors. In a Monte Carlo simulation study, we illustrate that when the primary analysis is a weighted log-rank test emphasizing the late differences, the proposed information fraction can be a useful alternative to the surrogate information fraction, which is proportional to the number of events. Copyright © 2016 John Wiley & Sons, Ltd.
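
    The Fleming-Harrington class weights each ordered time by S(t-)^ρ (1 - S(t-))^γ, so ρ = 0, γ > 0 emphasizes late differences; below is a minimal computation of these weights from a pooled Kaplan-Meier estimate (a sketch that ignores ties and the group sequential machinery, with made-up data).

        # Fleming-Harrington weights w(t) = S(t-)^rho * (1 - S(t-))^gamma along the
        # ordered times, with S(t-) the left-continuous pooled Kaplan-Meier estimate.
        import numpy as np

        def fh_weights(times, events, rho=0.0, gamma=1.0):
            order = np.argsort(times)
            times, events = np.asarray(times)[order], np.asarray(events)[order]
            n = len(times)
            at_risk = n - np.arange(n)
            km = np.cumprod(1.0 - events / at_risk)     # KM step after each time
            s_minus = np.concatenate(([1.0], km[:-1]))  # left-continuous version
            return s_minus ** rho * (1.0 - s_minus) ** gamma

        t = [2, 3, 5, 8, 13, 21]
        d = [1, 1, 0, 1, 1, 1]        # 1 = event, 0 = censored
        print(fh_weights(t, d, rho=0.0, gamma=1.0))     # near-zero early, larger late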

  17. Efficient Mean Field Variational Algorithm for Data Assimilation (Invited)

    NASA Astrophysics Data System (ADS)

    Vrettas, M. D.; Cornford, D.; Opper, M.

    2013-12-01

    Data assimilation algorithms combine available observations of physical systems with the assumed model dynamics in a systematic manner, to produce better estimates of initial conditions for prediction. Broadly, they can be categorized into three main approaches: (a) sequential algorithms, (b) sampling methods, and (c) variational algorithms, which transform the density estimation problem into an optimization problem. However, given finite computational resources, only a handful of ensemble Kalman filters and 4DVar algorithms have been applied operationally to very high dimensional geophysical applications, such as weather forecasting. In this paper we present a recent extension to our variational Bayesian algorithm which seeks the 'optimal' posterior distribution over the continuous-time states, within a family of non-stationary Gaussian processes. Our initial work on variational Bayesian approaches to data assimilation, unlike the well-known 4DVar method which seeks only the most probable solution, computes the best time-varying Gaussian process approximation to the posterior smoothing distribution for dynamical systems that can be represented by stochastic differential equations. This approach was based on minimising the Kullback-Leibler divergence, over paths, between the true posterior and our Gaussian process approximation. While the observations were informative enough to keep the posterior smoothing density close to Gaussian, the algorithm proved very effective on low dimensional systems (e.g. O(10)D). However, for higher dimensional systems, the high computational demands make the algorithm prohibitively expensive. To overcome the difficulties presented in the original framework and make our approach more efficient in higher dimensional systems, we have been developing a new mean field version of the algorithm which treats the state variables at any given time as being independent in the posterior approximation, while still accounting for their relationships in the mean solution arising from the original system dynamics. Here we present this new mean field approach, illustrating its performance on a range of benchmark data assimilation problems whose dimensionality varies from O(10) to O(10^3)D. We emphasise that the variational Bayesian approach we adopt, unlike other variational approaches, provides a natural bound on the marginal likelihood of the observations given the model parameters, which also allows for inference of (hyper-)parameters such as observational errors, parameters in the dynamical model, and the model error representation. We also stress that since our approach is intrinsically parallel it can be implemented very efficiently to address very long data assimilation time windows. Moreover, like most traditional variational approaches, our Bayesian variational method has the benefit of being posed as an optimisation problem, and therefore its complexity can be tuned to the available computational resources. We finish with a sketch of possible future directions.

  18. Conditions for successful data assimilation

    NASA Astrophysics Data System (ADS)

    Morzfeld, M.; Chorin, A. J.

    2013-12-01

    Many applications in science and engineering require that the predictions of uncertain models be updated by information from a stream of noisy data. The model and the data jointly define a conditional probability density function (pdf), which contains all the information one has about the process of interest and various numerical methods can be used to study and approximate this pdf, e.g. the Kalman filter, variational methods or particle filters. Given a model and data, each of these algorithms will produce a result. We are interested in the conditions under which this result is reasonable, i.e. consistent with the real-life situation one is modeling. In particular, we show, using idealized models, that numerical data assimilation is feasible in principle only if a suitably defined effective dimension of the problem is not excessive. This effective dimension depends on the noise in the model and the data, and in physically reasonable problems it can be moderate even when the number of variables is huge. In particular, we find that the effective dimension being moderate induces a balance condition between the noises in the model and the data; this balance condition is often satisfied in realistic applications or else the noise levels are excessive and drown the underlying signal. We also study the effects of the effective dimension on particle filters in two instances, one in which the importance function is based on the model alone, and one in which it is based on both the model and the data. We have three main conclusions: (1) the stability (i.e., non-collapse of weights) in particle filtering depends on the effective dimension of the problem. Particle filters can work well if the effective dimension is moderate even if the true dimension is large (which we expect to happen often in practice). (2) A suitable choice of importance function is essential, or else particle filtering fails even when data assimilation is feasible in principle with a sequential algorithm. (3) There is a parameter range in which the model noise and the observation noise are roughly comparable, and in which even the optimal particle filter collapses, even under ideal circumstances. We further study the role of the effective dimension in variational data assimilation and particle smoothing, for both the weak and strong constraint problem. It was found that these methods too require a moderate effective dimension or else no accurate predictions can be expected. Moreover, variational data assimilation or particle smoothing may be applicable in the parameter range where particle filtering fails, because the use of more than one consecutive data set helps reduce the variance which is responsible for the collapse of the filters.

  19. Accelerated event-by-event Monte Carlo microdosimetric calculations of electrons and protons tracks on a multi-core CPU and a CUDA-enabled GPU.

    PubMed

    Kalantzis, Georgios; Tachibana, Hidenobu

    2014-01-01

    For microdosimetric calculations, event-by-event Monte Carlo (MC) methods are considered the most accurate. The main shortcoming of those methods is the extensive requirement for computational time. In this work we present an event-by-event MC code of low projectile energy electron and proton tracks for accelerated microdosimetric MC simulations on a graphics processing unit (GPU). Additionally, a hybrid implementation scheme was realized by employing OpenMP and CUDA in such a way that both the GPU and the multi-core CPU were utilized simultaneously. The two implementation schemes have been tested and compared with the sequential single-threaded MC code on the CPU. The performance comparison was established on the speed-up for a set of benchmarking cases of electron and proton tracks. A maximum speedup of 67.2 was achieved for the GPU-based MC code, while a further improvement of up to 20% in the speedup was achieved with the hybrid approach. The results indicate the capability of our CPU-GPU implementation for accelerated MC microdosimetric calculations of both electron and proton tracks without loss of accuracy. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  20. On uncertainty quantification in hydrogeology and hydrogeophysics

    NASA Astrophysics Data System (ADS)

    Linde, Niklas; Ginsbourger, David; Irving, James; Nobile, Fabio; Doucet, Arnaud

    2017-12-01

    Recent advances in sensor technologies, field methodologies, numerical modeling, and inversion approaches have contributed to unprecedented imaging of hydrogeological properties and detailed predictions at multiple temporal and spatial scales. Nevertheless, imaging results and predictions will always remain imprecise, which calls for appropriate uncertainty quantification (UQ). In this paper, we outline selected methodological developments together with pioneering UQ applications in hydrogeology and hydrogeophysics. The applied mathematics and statistics literature is not easy to penetrate and this review aims at helping hydrogeologists and hydrogeophysicists to identify suitable approaches for UQ that can be applied and further developed to their specific needs. To bypass the tremendous computational costs associated with forward UQ based on full-physics simulations, we discuss proxy-modeling strategies and multi-resolution (Multi-level Monte Carlo) methods. We consider Bayesian inversion for non-linear and non-Gaussian state-space problems and discuss how Sequential Monte Carlo may become a practical alternative. We also describe strategies to account for forward modeling errors in Bayesian inversion. Finally, we consider hydrogeophysical inversion, where petrophysical uncertainty is often ignored leading to overconfident parameter estimation. The high parameter and data dimensions encountered in hydrogeological and geophysical problems make UQ a complicated and important challenge that has only been partially addressed to date.

  1. Parameter estimation in physically-based integrated hydrological models with the ensemble Kalman filter: a practical application.

    NASA Astrophysics Data System (ADS)

    Botto, Anna; Camporese, Matteo

    2017-04-01

    Hydrological models allow scientists to predict the response of water systems under varying forcing conditions. In particular, many physically-based integrated models were recently developed in order to understand the fundamental hydrological processes occurring at the catchment scale. However, the use of this class of hydrological models is still relatively limited, as their prediction skills heavily depend on reliable parameter estimation, an operation that is never trivial, being normally affected by large uncertainty and requiring huge computational effort. The objective of this work is to test the potential of data assimilation to be used as an inverse modeling procedure for the broad class of integrated hydrological models. To pursue this goal, a Bayesian data assimilation (DA) algorithm based on a Monte Carlo approach, namely the ensemble Kalman filter (EnKF), is combined with the CATchment HYdrology (CATHY) model. In this approach, input variables (atmospheric forcing, soil parameters, initial conditions) are statistically perturbed, providing an ensemble of realizations aimed at taking into account the uncertainty involved in the process. Each realization is propagated forward by the CATHY hydrological model within a parallel R framework, developed to reduce the computational effort. When measurements are available, the EnKF is used to update both the system state and soil parameters. In particular, four different assimilation scenarios are applied to test the capability of the modeling framework: first, only pressure head or water content is assimilated; then, the combination of both; and finally, both pressure head and water content together with the subsurface outflow. To demonstrate the effectiveness of the approach in a real-world scenario, an artificial hillslope was designed and built to provide real measurements for the DA analyses. The experimental facility, located in the Department of Civil, Environmental and Architectural Engineering of the University of Padova (Italy), consists of a reinforced concrete box containing a soil prism with maximum height of 3.5 m, length of 6 m and width of 2 m. The hillslope is equipped with six pairs of tensiometers and water content reflectometers, to monitor the pressure head and soil moisture content, respectively. Moreover, two tipping bucket flow gages were used to measure the surface and subsurface discharges at the outlet. A 12-day long experiment was carried out, during which a series of four rainfall events with constant rainfall rate were generated, interspersed with phases of drainage. During the experiment, measurements were collected at a relatively high resolution of 0.5 Hz. We report here on the capability of the data assimilation framework to estimate sets of plausible parameters that are consistent with the experimental setup.
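
    As a rough illustration of the analysis step used in such studies, the sketch below updates an augmented ensemble in which each member stacks model states and uncertain soil parameters, so that assimilating an observation corrects both at once. This is a generic stochastic (perturbed-observation) EnKF in Python, not the CATHY/EnKF code; the dimensions, observation operator and all names are illustrative assumptions:

    ```python
    import numpy as np

    def enkf_update(ensemble, obs, obs_std, H, rng):
        """Stochastic EnKF analysis on an augmented ensemble: each column
        stacks model states and uncertain parameters."""
        n_obs, n_ens = len(obs), ensemble.shape[1]
        X = ensemble - ensemble.mean(axis=1, keepdims=True)   # state anomalies
        Y = H @ X                                             # observed anomalies
        P_yy = (Y @ Y.T) / (n_ens - 1) + obs_std ** 2 * np.eye(n_obs)
        P_xy = (X @ Y.T) / (n_ens - 1)                        # cross-covariance
        K = np.linalg.solve(P_yy, P_xy.T).T                   # gain K = P_xy P_yy^-1
        perturbed = obs[:, None] + rng.normal(scale=obs_std, size=(n_obs, n_ens))
        return ensemble + K @ (perturbed - H @ ensemble)

    rng = np.random.default_rng(3)
    # Toy setup: 3 state variables plus 1 parameter, 50 members, state 0 observed.
    ens = rng.normal(size=(4, 50))
    H = np.array([[1.0, 0.0, 0.0, 0.0]])
    ens = enkf_update(ens, obs=np.array([0.8]), obs_std=0.1, H=H, rng=rng)
    ```

    The parameter row is updated purely through its sampled cross-covariance with the observed state, which is what lets the EnKF act as an inverse modeling procedure.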

  2. Bayesian calibration of terrestrial ecosystem models: a study of advanced Markov chain Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Lu, Dan; Ricciuto, Daniel; Walker, Anthony; Safta, Cosmin; Munger, William

    2017-09-01

    Calibration of terrestrial ecosystem models is important but challenging. Bayesian inference implemented by Markov chain Monte Carlo (MCMC) sampling provides a comprehensive framework to estimate model parameters and associated uncertainties using their posterior distributions. The effectiveness and efficiency of the method strongly depend on the MCMC algorithm used. In this work, a differential evolution adaptive Metropolis (DREAM) algorithm is used to estimate posterior distributions of 21 parameters for the data assimilation linked ecosystem carbon (DALEC) model using 14 years of daily net ecosystem exchange data collected at the Harvard Forest Environmental Measurement Site eddy-flux tower. The calibration of DREAM results in a better model fit and predictive performance compared to the popular adaptive Metropolis (AM) scheme. Moreover, DREAM indicates that two parameters controlling autumn phenology have multiple modes in their posterior distributions while AM only identifies one mode. The application suggests that DREAM is very suitable to calibrate complex terrestrial ecosystem models, where the uncertain parameter size is usually large and existence of local optima is always a concern. In addition, this effort justifies the assumptions of the error model used in Bayesian calibration according to the residual analysis. The result indicates that a heteroscedastic, correlated, Gaussian error model is appropriate for the problem, and the consequent constructed likelihood function can alleviate the underestimation of parameter uncertainty that is usually caused by using uncorrelated error models.
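
    The chain-movement mechanism that distinguishes DREAM-type samplers from single-chain Metropolis schemes can be sketched with a stripped-down differential-evolution Metropolis step: each chain proposes a jump along the difference of two other randomly chosen chains, which is what allows hops between well-separated posterior modes. The following Python toy uses a bimodal stand-in posterior and omits DREAM's crossover and subspace sampling, so it is an illustration of the core move, not the full algorithm:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def log_post(theta):
        """Toy bimodal posterior; DALEC's actual likelihood would go here."""
        return np.logaddexp(-0.5 * np.sum((theta - 2) ** 2),
                            -0.5 * np.sum((theta + 2) ** 2))

    def de_mc_step(chains, gamma=None):
        """One sweep of a differential-evolution Metropolis move: each chain
        jumps along the difference of two other chains."""
        n_chains, d = chains.shape
        gamma = gamma or 2.38 / np.sqrt(2 * d)   # standard DE-MC scaling
        for i in range(n_chains):
            r1, r2 = rng.choice([j for j in range(n_chains) if j != i],
                                2, replace=False)
            prop = (chains[i] + gamma * (chains[r1] - chains[r2])
                    + rng.normal(scale=1e-6, size=d))         # small jitter
            if np.log(rng.uniform()) < log_post(prop) - log_post(chains[i]):
                chains[i] = prop
        return chains

    chains = rng.normal(size=(10, 2))
    for _ in range(2000):
        chains = de_mc_step(chains)   # chains visit both modes at +/- 2
    ```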

  3. Bayesian calibration of terrestrial ecosystem models: A study of advanced Markov chain Monte Carlo methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Dan; Ricciuto, Daniel; Walker, Anthony

    Calibration of terrestrial ecosystem models is important but challenging. Bayesian inference implemented by Markov chain Monte Carlo (MCMC) sampling provides a comprehensive framework to estimate model parameters and associated uncertainties using their posterior distributions. The effectiveness and efficiency of the method strongly depend on the MCMC algorithm used. In this study, a Differential Evolution Adaptive Metropolis (DREAM) algorithm was used to estimate posterior distributions of 21 parameters for the data assimilation linked ecosystem carbon (DALEC) model using 14 years of daily net ecosystem exchange data collected at the Harvard Forest Environmental Measurement Site eddy-flux tower. DREAM is a multi-chain method that uses a differential evolution technique for chain movement, allowing it to be efficiently applied to high-dimensional problems; it can reliably estimate heavy-tailed and multimodal distributions that are difficult for single-chain schemes using a Gaussian proposal distribution. The results were evaluated against the popular Adaptive Metropolis (AM) scheme. DREAM indicated that two parameters controlling autumn phenology have multiple modes in their posterior distributions while AM only identified one mode. The calibration of DREAM resulted in a better model fit and predictive performance compared to the AM. DREAM provides means for a good exploration of the posterior distributions of model parameters. Lastly, it reduces the risk of false convergence to a local optimum and potentially improves the predictive performance of the calibrated model.

  4. Kalman filter parameter estimation for a nonlinear diffusion model of epithelial cell migration using stochastic collocation and the Karhunen-Loeve expansion.

    PubMed

    Barber, Jared; Tanase, Roxana; Yotov, Ivan

    2016-06-01

    Several Kalman filter algorithms are presented for data assimilation and parameter estimation for a nonlinear diffusion model of epithelial cell migration. These include the ensemble Kalman filter with Monte Carlo sampling and a stochastic collocation (SC) Kalman filter with structured sampling. Further, two types of noise are considered: uncorrelated noise, resulting in one stochastic dimension for each element of the spatial grid, and correlated noise, parameterized by the Karhunen-Loeve (KL) expansion, resulting in one stochastic dimension for each KL term. The efficiency and accuracy of the four methods are investigated for two cases with synthetic data with and without noise, as well as data from a laboratory experiment. While it is observed that all algorithms perform reasonably well in matching the target solution and estimating the diffusion coefficient and the growth rate, it is illustrated that the algorithms that employ SC and KL expansion are computationally more efficient, as they require fewer ensemble members for comparable accuracy. In the case of SC methods, this is due to improved approximation in stochastic space compared to Monte Carlo sampling. In the case of KL methods, the parameterization of the noise results in a stochastic space of smaller dimension. The most efficient method is the one combining SC and KL expansion. Copyright © 2016 Elsevier Inc. All rights reserved.
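
    The dimension reduction attributed to the KL parameterization above comes from truncating an eigendecomposition of the noise covariance, so that a correlated field on a 100-point grid is driven by only a handful of standard normal coefficients. A minimal sketch, assuming a squared-exponential covariance on a 1-D grid (the kernel, correlation length and truncation level are illustrative choices, not the paper's):

    ```python
    import numpy as np

    def kl_modes(grid, corr_len, n_terms):
        """Karhunen-Loeve modes of a squared-exponential covariance on a 1-D
        grid: eigendecompose the covariance matrix and keep the leading terms,
        so each retained eigenpair becomes one stochastic dimension."""
        C = np.exp(-0.5 * (grid[:, None] - grid[None, :]) ** 2 / corr_len ** 2)
        vals, vecs = np.linalg.eigh(C)
        idx = np.argsort(vals)[::-1][:n_terms]      # largest eigenvalues first
        return vals[idx], vecs[:, idx]

    def sample_correlated_noise(vals, vecs, rng):
        """Draw one correlated-noise field from n_terms standard normals."""
        xi = rng.normal(size=len(vals))             # KL coefficients
        return vecs @ (np.sqrt(vals) * xi)

    grid = np.linspace(0.0, 1.0, 100)
    vals, vecs = kl_modes(grid, corr_len=0.2, n_terms=8)   # 8 dims, not 100
    field = sample_correlated_noise(vals, vecs, np.random.default_rng(5))
    ```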

  5. Bayesian calibration of terrestrial ecosystem models: A study of advanced Markov chain Monte Carlo methods

    DOE PAGES

    Lu, Dan; Ricciuto, Daniel; Walker, Anthony; ...

    2017-02-22

    Calibration of terrestrial ecosystem models is important but challenging. Bayesian inference implemented by Markov chain Monte Carlo (MCMC) sampling provides a comprehensive framework to estimate model parameters and associated uncertainties using their posterior distributions. The effectiveness and efficiency of the method strongly depend on the MCMC algorithm used. In this study, a Differential Evolution Adaptive Metropolis (DREAM) algorithm was used to estimate posterior distributions of 21 parameters for the data assimilation linked ecosystem carbon (DALEC) model using 14 years of daily net ecosystem exchange data collected at the Harvard Forest Environmental Measurement Site eddy-flux tower. The DREAM is a multi-chainmore » method and uses differential evolution technique for chain movement, allowing it to be efficiently applied to high-dimensional problems, and can reliably estimate heavy-tailed and multimodal distributions that are difficult for single-chain schemes using a Gaussian proposal distribution. The results were evaluated against the popular Adaptive Metropolis (AM) scheme. DREAM indicated that two parameters controlling autumn phenology have multiple modes in their posterior distributions while AM only identified one mode. The calibration of DREAM resulted in a better model fit and predictive performance compared to the AM. DREAM provides means for a good exploration of the posterior distributions of model parameters. Lastly, it reduces the risk of false convergence to a local optimum and potentially improves the predictive performance of the calibrated model.« less

  6. Impact of the initialisation on the predictability of the Southern Ocean sea ice at interannual to multi-decadal timescales

    NASA Astrophysics Data System (ADS)

    Zunz, Violette; Goosse, Hugues; Dubinkina, Svetlana

    2014-05-01

    In this study, we assess systematically the impact of different initialisation procedures on the predictability of the sea ice in the Southern Ocean. These initialisation strategies are based on three data assimilation methods: nudging, the particle filter with sequential resampling and the nudging proposal particle filter. An Earth-system model of intermediate complexity has been used to perform hindcast simulations in a perfect model approach. The predictability of the Southern Ocean sea ice is estimated through two aspects: the spread of the hindcast ensemble, indicating the uncertainty of the ensemble, and the correlation between the ensemble mean and the pseudo-observations, used to assess the accuracy of the prediction. Our results show that, at decadal timescales, more sophisticated data assimilation methods, as well as denser pseudo-observations used to initialise the hindcasts, decrease the spread of the ensemble but only slightly improve the accuracy of the prediction of the sea ice in the Southern Ocean. Overall, the predictability at interannual timescales is limited, at most, to three years ahead. At multi-decadal timescales, there is a clear improvement in the correlation of the trend in sea ice extent between the hindcasts and the pseudo-observations if the initialisation takes the pseudo-observations into account. The correlation reaches values larger than 0.5 and is due to the inertia of the ocean, showing the importance of the quality of the initialisation below the sea ice.

  7. The evolution of a complex type B Allende inclusion - An ion microprobe trace element study

    NASA Technical Reports Server (NTRS)

    Macpherson, Glenn J.; Crozaz, Ghislaine; Lundberg, Laura L.

    1989-01-01

    Results are presented of detailed trace-element and isotopic analyses of the constituent phases in each of the major textural parts (mantle, core, and islands) of a Type B refractory inclusion, the USNM 5241 inclusion from Allende, first described by El Goresy et al. (1985). The REE data on 5241 were found to be largely consistent with a model in which the mantle and the core of 5241 formed sequentially out of a single melt by fractional crystallization. The numerical models of REE evolution in the 5241 melt, especially that of Eu, require that a significant mass of spinel-free island material was assimilated into the evolving melt during the last half of the solidification history of 5241. The trace element results obtained thus strongly support the interpretation of El Goresy et al. (1985) that the spinel-free islands in 5241 are trapped xenoliths.

  8. Microalgae-mediated simultaneous treatment of toxic thiocyanate and production of biodiesel.

    PubMed

    Ryu, Byung-Gon; Kim, Jungmin; Yoo, Gursong; Lim, Jun-Taek; Kim, Woong; Han, Jong-In; Yang, Ji-Won

    2014-04-01

    In this work, a method for simultaneously degrading the toxic pollutant thiocyanate and producing microalgal lipids using mixed microbial communities was developed. Aerobic activated sludge was used as the seed culture and thiocyanate was used as the sole nitrogen source. Two cultivation methods were sequentially employed: a lithoautotrophic mode and a photoautotrophic mode. Thiocyanate hydrolysis and nitrification were found to occur under the first (lithoautotrophic) condition, while the oxidized forms of nitrogen were assimilated by the photoautotrophic consortium and lipids were produced under the second condition. The final culture exhibited good settling efficiency (∼70% settling over 10 min), which can benefit downstream processing. The highest CO2 fixation rate and lipid productivity were observed with 2.5% and 5% CO2, respectively. The proposed integrated algal-bacterial system appears to be a feasible and even beneficial option for thiocyanate treatment and the production of microbial lipids. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Behavior of Filters and Smoothers for Strongly Nonlinear Dynamics

    NASA Technical Reports Server (NTRS)

    Zhu, Yanqui; Cohn, Stephen E.; Todling, Ricardo

    1999-01-01

    The Kalman filter is the optimal filter in the presence of known Gaussian error statistics and linear dynamics. Filter extension to nonlinear dynamics is nontrivial in the sense of appropriately representing high order moments of the statistics. Monte Carlo, ensemble-based methods have been advocated as the methodology for representing high order moments without any questionable closure assumptions. Investigation along these lines has been conducted for highly idealized dynamics such as the strongly nonlinear Lorenz model as well as more realistic models of the ocean and atmosphere. A few relevant issues in this context are related to the necessary number of ensemble members to properly represent the error statistics and the necessary modifications in the usual filter procedures to allow for correct updates of the ensemble members. The ensemble technique has also been applied to the problem of smoothing, for which similar questions apply. Ensemble smoother examples, however, seem to be quite puzzling in that the resulting state estimates are worse than those of their filter analogues. In this study, we use concepts in probability theory to revisit the ensemble methodology for filtering and smoothing in data assimilation. We use the Lorenz model to test and compare the behavior of a variety of implementations of ensemble filters. We also implement ensemble smoothers that are able to perform better than their filter counterparts. A discussion of the feasibility of these techniques for large data assimilation problems will be given at the time of the conference.
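
    A common minimal test bed for such comparisons is an ensemble filter run on the Lorenz model. The sketch below uses forward-Euler stepping, a scalar observation of the x-component and a perturbed-observation update; all settings are assumptions for illustration rather than the study's configuration:

    ```python
    import numpy as np

    def lorenz63(x, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        """One forward-Euler step of the Lorenz-63 system (RK4 in earnest)."""
        dx = np.array([sigma * (x[1] - x[0]),
                       x[0] * (rho - x[2]) - x[1],
                       x[0] * x[1] - beta * x[2]])
        return x + dt * dx

    rng = np.random.default_rng(6)
    n_ens, obs_std = 30, 1.0
    ens = rng.normal(size=(n_ens, 3)) + np.array([1.0, 1.0, 25.0])
    truth = np.array([1.5, 1.5, 25.0])

    for step in range(500):
        truth = lorenz63(truth)
        ens = np.array([lorenz63(m) for m in ens])
        if step % 25 == 0:                       # observe x occasionally
            y = truth[0] + rng.normal(scale=obs_std)
            A = ens - ens.mean(axis=0)           # ensemble anomalies
            var_y = A[:, 0] @ A[:, 0] / (n_ens - 1) + obs_std ** 2
            K = (A.T @ A[:, 0]) / (n_ens - 1) / var_y   # gain for scalar obs
            for i in range(n_ens):               # perturbed-obs update
                ens[i] += K * (y + rng.normal(scale=obs_std) - ens[i, 0])
    ```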

  10. Influence of ore processing activity on Hg, As and Sb contamination and fractionation in soils in a former mining site of Monte Amiata ore district (Italy).

    PubMed

    Protano, Giuseppe; Nannoni, Francesco

    2018-05-01

    A geochemical study was carried out at the former Abbadia San Salvatore (ASS) mining site of the Monte Amiata ore district (Italy). Total contents and fractionation of Hg, As and Sb were determined in soil and mining waste samples using a sequential extraction procedure. Ore processing activities contributed differently to Hg contamination and to Hg concentrations in the soil fractions, influencing its behaviour in terms of volatility and availability. Soils of the roasting zone showed the highest Hg contamination levels, mainly due to the deposition of Hg released as Hg(0) by furnaces during cinnabar roasting. High Hg contents were also measured in waste from the lower part of the mining dump due to the presence of cinnabar. The fractionation pattern suggested that Hg was largely present as volatile species in both uncontaminated and contaminated soils and mining waste, and concentrations of these Hg species increased as contamination increased. These findings are in agreement with the fact that the ASS mining site is characterized by high Hg concentrations in the air and the presence of liquid Hg(0) droplets in soil. Volatile Hg species were also prevalent in uncontaminated soils, likely because the Monte Amiata region is an area characterized by anomalous fluxes of gaseous Hg from natural and anthropogenic inputs. At the ASS mining site, soils were also contaminated by Sb, while As contents were comparable with its local background in soil. In all soil and waste samples, Sb and As were preferentially in the residual fraction. Copyright © 2018 Elsevier Ltd. All rights reserved.

  11. Agricultural Decision Support Through Robust Assimilation of Satellite Derived Soil Moisture Estimates

    NASA Astrophysics Data System (ADS)

    Mishra, V.; Cruise, J.; Mecikalski, J. R.

    2012-12-01

    Soil moisture is a key component of the hydrological process; it affects surface and boundary layer energy fluxes and is a driving factor in agricultural production. Multiple in situ soil moisture measuring instruments, such as time-domain reflectometry (TDR) and nuclear probes, are in use, along with remote sensing methods like active and passive microwave (PM) sensors. In situ measurements, despite being more accurate, can only be obtained at discrete points over small spatial scales. Remote sensing estimates, on the other hand, can be obtained over larger spatial domains with varying spatial and temporal resolutions. Soil moisture profiles derived from satellite based thermal infrared (TIR) imagery can overcome many of the problems associated with laborious in-situ observations over large spatial domains. An area where soil moisture observation and assimilation is receiving increasing attention is agricultural crop modeling. This study revolves around the use of the Decision Support System for Agrotechnology Transfer (DSSAT) crop model to simulate corn yields under various forcing scenarios. First, the model was run and calibrated using observed precipitation and model generated soil moisture dynamics. Next, the modeled soil moisture was updated using estimates derived from satellite based TIR imagery and the Atmospheric Land Exchange Inverse (ALEXI) model. We selected three climatically different locations to test the concept: Bell Mina, Alabama, in the southeastern United States, representing a humid subtropical climate; Nabb, Indiana, in the midwestern United States, representing a humid continental climate; and Lubbock, Texas, in the southern United States, representing a semiarid steppe climate. A temporal (2000-2009) correlation analysis of the soil moisture values from both DSSAT and ALEXI was performed and validated against the Land Information System (LIS) soil moisture dataset. The results clearly show strong correlation (R = 73%) between ALEXI and DSSAT at Bell Mina. At Nabb and Lubbock the correlation was 50-60%. Further, multiple experiments were conducted for each location: a) a DSSAT rain-fed 10 year sequential run forced with Daymet precipitation; b) a DSSAT sequential run with no precipitation data; and c) a DSSAT run forced with ALEXI soil moisture estimates alone. The preliminary results of all the experiments are quantified through soil moisture correlations and yield comparisons. In general, the preliminary results strongly suggest that DSSAT forced with ALEXI can provide significant information, especially at locations where no significant precipitation data exist.

  12. Coupled assimilation for an intermediate coupled ENSO prediction model

    NASA Astrophysics Data System (ADS)

    Zheng, Fei; Zhu, Jiang

    2010-10-01

    The value of coupled assimilation is discussed using an intermediate coupled model in which the wind stress is the only atmospheric state variable, and it is slaved to the model sea surface temperature (SST). In the coupled assimilation analysis, based on the coupled wind-ocean state covariance calculated from the coupled state ensemble, the ocean state is adjusted by assimilating wind data using the ensemble Kalman filter. As revealed by a series of assimilation experiments using simulated observations, the coupled assimilation of wind observations yields better results than the assimilation of SST observations. Specifically, the coupled assimilation of wind observations can help to improve the accuracy of the surface and subsurface currents because the correlation between the wind and ocean currents is stronger than that between SST and ocean currents in the equatorial Pacific. Thus, the coupled assimilation of wind data can decrease the initial condition errors in the surface/subsurface currents that can significantly contribute to SST forecast errors. The value of the coupled assimilation of wind observations is further demonstrated by comparing the prediction skills of three 12-year (1997-2008) hindcast experiments initialized by the ocean-only assimilation scheme that assimilates SST observations, the coupled assimilation scheme that assimilates wind observations, and a nudging scheme that nudges the observed wind stress data, respectively. The prediction skills of the two assimilation schemes are significantly better than those of the nudging scheme. The prediction skills of assimilating wind observations are better than those of assimilating SST observations. Assimilating wind observations for the 2007/2008 La Niña event triggers better predictions, while assimilating SST observations fails to provide an early warning for that event.

  13. Monte Carlo simulation methodology for the reliability of aircraft structures under damage tolerance considerations

    NASA Astrophysics Data System (ADS)

    Rambalakos, Andreas

    Current federal aviation regulations in the United States and around the world mandate the need for aircraft structures to meet damage tolerance requirements throughout the service life. These requirements imply that the damaged aircraft structure must maintain adequate residual strength in order to sustain its integrity, which is ensured by a continuous inspection program. The multifold objective of this research is to develop a methodology, based on a direct Monte Carlo simulation process, for assessing the reliability of aircraft structures. Initially, the structure is modeled as a parallel system with active redundancy, comprised of elements with uncorrelated (statistically independent) strengths and subjected to an equal load distribution. Closed-form expressions for the system capacity cumulative distribution function (CDF) are developed by expanding the current expression for the capacity CDF of a parallel system comprised of three elements to a parallel system comprised of up to six elements. These newly developed expressions are used to check the accuracy of the implementation of a Monte Carlo simulation algorithm that determines the probability of failure of a parallel system comprised of an arbitrary number of statistically independent elements. The second objective of this work is to compute the probability of failure of a fuselage skin lap joint under static load conditions through a Monte Carlo simulation scheme, by utilizing the residual strength of the fasteners subjected to various initial load distributions and then to a new unequal load distribution resulting from subsequent sequential fastener failures. The final and main objective of this thesis is to present a methodology for computing the resulting gradual deterioration of the reliability of an aircraft structural component by employing a direct Monte Carlo simulation approach. The uncertainties associated with the time to crack initiation, the probability of crack detection, the exponent in the crack propagation rate (Paris equation) and the yield strength of the elements are considered in the analytical model. The structural component is assumed to consist of a prescribed number of elements. This Monte Carlo simulation methodology is used to determine the required non-periodic inspections so that the reliability of the structural component will not fall below a prescribed minimum level. A sensitivity analysis is conducted to determine the effect of three key parameters on the specification of the non-periodic inspection intervals: namely, a parameter associated with the time to crack initiation, the applied nominal stress fluctuation and the minimum acceptable reliability level.
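
    The core of such a direct Monte Carlo reliability estimate for a parallel system with active redundancy and equal load sharing can be sketched in a few lines: sample member strengths, redistribute the load as members fail, and count the trials that cascade to total collapse. The strength distribution and load level below are illustrative assumptions, not values from the thesis:

    ```python
    import numpy as np

    def parallel_system_failure_prob(n_elements, load, n_trials=200_000, seed=7):
        """Direct MC estimate of system failure probability: when a member
        fails, its load share is redistributed over the survivors, so
        failures can cascade."""
        rng = np.random.default_rng(seed)
        failures = 0
        for _ in range(n_trials):
            strengths = rng.lognormal(mean=0.0, sigma=0.25, size=n_elements)
            survivors = np.ones(n_elements, dtype=bool)
            while True:
                share = load / survivors.sum()            # equal load sharing
                newly = survivors & (strengths < share)   # members failing now
                if not newly.any():
                    break                                 # system holds the load
                survivors &= ~newly
                if not survivors.any():
                    failures += 1                         # total collapse
                    break
        return failures / n_trials

    print(parallel_system_failure_prob(n_elements=6, load=5.0))
    ```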

  14. A Sequential Ensemble Prediction System at Convection Permitting Scales

    NASA Astrophysics Data System (ADS)

    Milan, M.; Simmer, C.

    2012-04-01

    A Sequential Assimilation Method (SAM) following some aspects of particle filtering with resampling, also called SIR (Sequential Importance Resampling), is introduced and applied in the framework of an Ensemble Prediction System (EPS) for weather forecasting on convection-permitting scales, with a focus on precipitation forecasts. At this scale and beyond, the atmosphere increasingly exhibits chaotic behaviour and non-linear state space evolution due to convectively driven processes. One way to take full account of non-linear state developments is particle filter methods, whose basic idea is the representation of the model probability density function by a number of ensemble members weighted by their likelihood with respect to the observations. In particular, particle filtering with resampling abandons ensemble members (particles) with low weights and restores the original number of particles by adding multiple copies of the members with high weights. In our SIR-like implementation we replace the likelihood-based definition of the weights with a metric that quantifies the "distance" between the observed atmospheric state and the states simulated by the ensemble members. We also introduce a methodology to counteract filter degeneracy, i.e. the collapse of the simulated state space. To this end we propose a combination of resampling that takes account of simulated state space clustering, and nudging. By keeping cluster representatives during resampling and filtering, the method maintains the potential for non-linear system state development. We assume that a particle cluster with initially low likelihood may evolve into a state space region with higher likelihood at a subsequent filter time, thus mimicking non-linear system state developments (e.g. sudden convection initiation) and remedying timing errors for convection due to model errors and/or imperfect initial conditions. We apply a simplified version of the resampling: the particles with the highest weights in each cluster are duplicated; during the model evolution, one particle of each pair evolves freely using the forward model, while the second is nudged towards the radar and satellite observations.
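
    A simplified version of the cluster-preserving resampling idea can be written down directly: always retain the best-weighted particle of every cluster, then fill the remaining slots by ordinary systematic resampling, so that low-weight clusters survive and can later evolve into high-likelihood regions. A Python sketch under those assumptions, with the clustering itself stubbed out by a crude split (none of this is the SAM code):

    ```python
    import numpy as np

    def resample_keep_clusters(particles, weights, labels, rng):
        """Keep one representative (the best-weighted member) per cluster,
        then fill the remaining slots by systematic resampling."""
        n = len(particles)
        keep = [np.flatnonzero(labels == c)[np.argmax(weights[labels == c])]
                for c in np.unique(labels)]               # one per cluster
        n_fill = n - len(keep)
        positions = (rng.uniform() + np.arange(n_fill)) / n_fill
        fill = np.minimum(np.searchsorted(np.cumsum(weights), positions),
                          n - 1)                          # systematic draw
        idx = np.concatenate([keep, fill])
        return particles[idx], np.full(n, 1.0 / n)        # reset to equal weights

    rng = np.random.default_rng(8)
    parts = rng.normal(size=(100, 2))
    w = rng.uniform(size=100); w /= w.sum()
    labels = (parts[:, 0] > 0).astype(int)                # two crude "clusters"
    parts, w = resample_keep_clusters(parts, w, labels, rng)
    ```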

  15. Assessing of distribution, mobility and bioavailability of exogenous Pb in agricultural soils using isotopic labeling method coupled with BCR approach.

    PubMed

    Huang, Zhi-Yong; Xie, Hong; Cao, Ying-Lan; Cai, Chao; Zhang, Zhi

    2014-02-15

    The contamination of Pb in agricultural soils is one of the most important ecological problems, as it poses a serious risk to human health through the food chain. Hence, the fate of exogenous Pb in agricultural soils needs to be explored in depth. By spiking soils with the stable enriched isotope (206)Pb, the contamination of three agricultural soils, sampled from the estuary areas of the Jiulong River, China, with exogenous Pb(2+) ions was simulated in the present study, and the distribution, mobility and bioavailability of exogenous Pb in the soils were investigated using the isotopic labeling method coupled with a four-stage BCR (European Community Bureau of Reference) sequential extraction procedure. Results showed that about 60-85% of the exogenous Pb was distributed in the reducible fractions, while the exogenous Pb in the acid-extractable fractions was less than 1.0%. After planting, the amounts of exogenous Pb present in the acid-extractable, reducible and oxidizable fractions of rhizospheric soils decreased by 60-66%; part of the exogenous Pb was assimilated by plants, while most of the metal may have migrated downward due to daily watering and fertilizer application. The results show that the isotopic labeling technique coupled with sequential extraction procedures enables us to explore the distribution, mobility and bioavailability of exogenous Pb in contaminated soils, which may be useful for further soil remediation. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. A hybrid variational ensemble data assimilation for the HIgh Resolution Limited Area Model (HIRLAM)

    NASA Astrophysics Data System (ADS)

    Gustafsson, N.; Bojarova, J.; Vignes, O.

    2014-02-01

    A hybrid variational ensemble data assimilation has been developed on top of the HIRLAM variational data assimilation. It provides the possibility of applying a flow-dependent background error covariance model during the data assimilation at the same time as full rank characteristics of the variational data assimilation are preserved. The hybrid formulation is based on an augmentation of the assimilation control variable with localised weights to be assigned to a set of ensemble member perturbations (deviations from the ensemble mean). The flow-dependency of the hybrid assimilation is demonstrated in single simulated observation impact studies and the improved performance of the hybrid assimilation in comparison with pure 3-dimensional variational as well as pure ensemble assimilation is also proven in real observation assimilation experiments. The performance of the hybrid assimilation is comparable to the performance of the 4-dimensional variational data assimilation. The sensitivity to various parameters of the hybrid assimilation scheme and the sensitivity to the applied ensemble generation techniques are also examined. In particular, the inclusion of ensemble perturbations with a lagged validity time has been examined with encouraging results.

  17. Astrocytic tracer dynamics estimated from [1-¹¹C]-acetate PET measurements.

    PubMed

    Arnold, Andrea; Calvetti, Daniela; Gjedde, Albert; Iversen, Peter; Somersalo, Erkki

    2015-12-01

    We address the problem of estimating the unknown parameters of a model of tracer kinetics from sequences of positron emission tomography (PET) scan data using a statistical sequential algorithm for the inference of magnitudes of dynamic parameters. The method, based on Bayesian statistical inference, is a modification of a recently proposed particle filtering and sequential Monte Carlo algorithm, where instead of preassigning the accuracy in the propagation of each particle, we fix the time step and account for the numerical errors in the innovation term. We apply the algorithm to PET images of [1-¹¹C]-acetate-derived tracer accumulation, estimating the transport rates in a three-compartment model of astrocytic uptake and metabolism of the tracer for a cohort of 18 volunteers from 3 groups, corresponding to healthy control individuals, cirrhotic liver and hepatic encephalopathy patients. The distribution of the parameters for the individuals and for the groups presented within the Bayesian framework support the hypothesis that the parameters for the hepatic encephalopathy group follow a significantly different distribution than the other two groups. The biological implications of the findings are also discussed. © The Authors 2014. Published by Oxford University Press on behalf of the Institute of Mathematics and its Applications. All rights reserved.

  18. Controlled pattern imputation for sensitivity analysis of longitudinal binary and ordinal outcomes with nonignorable dropout.

    PubMed

    Tang, Yongqiang

    2018-04-30

    The controlled imputation method refers to a class of pattern mixture models that have been commonly used as sensitivity analyses of longitudinal clinical trials with nonignorable dropout in recent years. These pattern mixture models assume that participants in the experimental arm after dropout have similar response profiles to the control participants or have worse outcomes than otherwise similar participants who remain on the experimental treatment. In spite of its popularity, the controlled imputation has not been formally developed for longitudinal binary and ordinal outcomes partially due to the lack of a natural multivariate distribution for such endpoints. In this paper, we propose 2 approaches for implementing the controlled imputation for binary and ordinal data based respectively on the sequential logistic regression and the multivariate probit model. Efficient Markov chain Monte Carlo algorithms are developed for missing data imputation by using the monotone data augmentation technique for the sequential logistic regression and a parameter-expanded monotone data augmentation scheme for the multivariate probit model. We assess the performance of the proposed procedures by simulation and the analysis of a schizophrenia clinical trial and compare them with the fully conditional specification, last observation carried forward, and baseline observation carried forward imputation methods. Copyright © 2018 John Wiley & Sons, Ltd.

  19. Eruption history of the Tharsis shield volcanoes, Mars

    NASA Technical Reports Server (NTRS)

    Plescia, J. B.

    1993-01-01

    The Tharsis Montes volcanoes and Olympus Mons are giant shield volcanoes. Although estimates of their average surface age have been made using crater counts, the length of time required to build the shields has not been considered. Crater counts for the volcanoes indicate the constructs are young; average ages are Amazonian to Hesperian. In relative terms, Arsia Mons is the oldest, Pavonis Mons intermediate, and Ascraeus Mons the youngest of the Tharsis Montes shields; Olympus Mons is the youngest of the group. Depending upon the calibration, absolute ages range from 730 Ma to 3100 Ma for Arsia Mons and 25 Ma to 100 Ma for Olympus Mons. These absolute chronologies are highly model dependent, and indicate only the time surficial volcanism ceased, not the time over which the volcano was built. The problem of estimating the time necessary to build the volcanoes can be attacked in two ways. First, eruption rates from terrestrial and extraterrestrial examples can be used to calculate the period of time required to build the shields. Second, some relation of eruptive activity among the volcanoes can be assumed (for example, that they all began at a specific time, or that they were active sequentially) and the eruptive rate calculated. Volumes of the shield volcanoes were derived from topographic/volume data.

  20. PENGEOM-A general-purpose geometry package for Monte Carlo simulation of radiation transport in material systems defined by quadric surfaces

    NASA Astrophysics Data System (ADS)

    Almansa, Julio; Salvat-Pujol, Francesc; Díaz-Londoño, Gloria; Carnicer, Artur; Lallena, Antonio M.; Salvat, Francesc

    2016-02-01

    The Fortran subroutine package PENGEOM provides a complete set of tools to handle quadric geometries in Monte Carlo simulations of radiation transport. The material structure where radiation propagates is assumed to consist of homogeneous bodies limited by quadric surfaces. The PENGEOM subroutines (a subset of the PENELOPE code) track particles through the material structure, independently of the details of the physics models adopted to describe the interactions. Although these subroutines are designed for detailed simulations of photon and electron transport, where all individual interactions are simulated sequentially, they can also be used in mixed (class II) schemes for simulating the transport of high-energy charged particles, where the effect of soft interactions is described by the random-hinge method. The definition of the geometry and the details of the tracking algorithm are tailored to optimize simulation speed. The use of fuzzy quadric surfaces minimizes the impact of round-off errors. The provided software includes a Java graphical user interface for editing and debugging the geometry definition file and for visualizing the material structure. Images of the structure are generated by using the tracking subroutines and, hence, they describe the geometry actually passed to the simulation code.
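
    The elementary operation behind quadric-based tracking is computing the distance along a particle's flight direction to the next quadric surface, i.e. solving a quadratic in the path length t. Below is a self-contained sketch using the homogeneous 4x4 representation x^T Q x = 0; this is the general surface family that PENGEOM handles, but the code is an independent illustration in Python, not the PENELOPE implementation:

    ```python
    import numpy as np

    def ray_quadric_distance(p, d, Q):
        """Distance along the ray p + t*d to the quadric x^T Q x = 0,
        with Q a 4x4 symmetric matrix in homogeneous coordinates."""
        ph = np.append(p, 1.0)                 # homogeneous point
        dh = np.append(d, 0.0)                 # homogeneous direction
        a = dh @ Q @ dh                        # quadratic coefficient in t
        b = 2.0 * ph @ Q @ dh
        c = ph @ Q @ ph
        if abs(a) < 1e-12:                     # surface is linear along this ray
            if abs(b) < 1e-12:
                return np.inf
            t = -c / b
            return t if t > 1e-9 else np.inf
        disc = b * b - 4.0 * a * c
        if disc < 0.0:
            return np.inf                      # no real crossing
        roots = sorted([(-b - np.sqrt(disc)) / (2 * a),
                        (-b + np.sqrt(disc)) / (2 * a)])
        return next((t for t in roots if t > 1e-9), np.inf)  # nearest ahead

    # Unit sphere x^2 + y^2 + z^2 - 1 = 0 in homogeneous form:
    Q = np.diag([1.0, 1.0, 1.0, -1.0])
    print(ray_quadric_distance(np.array([0.0, 0.0, -3.0]),
                               np.array([0.0, 0.0, 1.0]), Q))   # -> 2.0
    ```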

  1. Can we reliably estimate managed forest carbon dynamics using remotely sensed data?

    NASA Astrophysics Data System (ADS)

    Smallman, Thomas Luke; Exbrayat, Jean-Francois; Bloom, A. Anthony; Williams, Mathew

    2015-04-01

    Forests are an important part of the global carbon cycle, serving as both a large store of carbon and currently as a net sink of CO2. Forest biomass varies significantly in time and space, linked to climate, soils, natural disturbance and human impacts. This variation means that the global distribution of forest biomass and its dynamics are poorly quantified. Terrestrial ecosystem models (TEMs) are rarely evaluated for their predictions of forest carbon stocks and dynamics, due to a lack of knowledge of site-specific factors such as disturbance dates and/or management interventions. In this regard, managed forests present a valuable opportunity for model calibration and improvement. Spatially explicit datasets of planting dates, species and yield classification, in combination with remote sensing data and an appropriate data assimilation (DA) framework, can reduce prediction uncertainty and error. We use a Bayesian approach to calibrate the data assimilation linked ecosystem carbon (DALEC) model using a Metropolis Hastings-Markov Chain Monte Carlo (MH-MCMC) framework. Forest management information is incorporated into the data assimilation framework as part of ecological and dynamic constraints (EDCs). The key advantage here is that DALEC simulates a full carbon balance, not just the living biomass, and that both parameter and prediction uncertainties are estimated as part of the DA analysis. DALEC has been calibrated at two managed forests, in the USA (Pinus taeda; Duke Forest) and UK (Picea sitchensis; Griffin Forest). At each site DALEC is calibrated twice (exp1 & exp2). Both calibrations assimilated MODIS LAI and HWSD estimates of soil carbon stored in soil organic matter, in addition to common management information and prior knowledge included in parameter priors and the EDCs. Calibration exp1 also utilises multiple site-level estimates of carbon storage in multiple pools. By comparing the simulations we determine the impact of site-level observations on the uncertainty and error of predictions, and which observations are key to constraining ecosystem processes. Preliminary simulations indicate that DALEC calibration exp1 accurately simulated the assimilated observations for forest and soil carbon stock estimates including, critically for forestry, standing wood stocks (R2 = 0.92, bias = -4.46 MgC ha-1, RMSE = 5.80 MgC ha-1). The results from exp1 indicate the model is able to find parameters that are consistent with both the EDCs and the observations. In the absence of site-level stock observations (exp2), DALEC accurately estimates the foliage and fine root pools, while the median estimates of above-ground litter and wood stocks (R2 = 0.92, bias = -48.30 MgC ha-1, RMSE = 50.30 MgC ha-1) are over- and underestimated, respectively, although the site-level observations remain within model uncertainty. These results indicate that we can estimate managed forest dynamics using remotely sensed data, particularly as remotely sensed above-ground biomass maps become available to provide constraints to correct biases in woody accumulation.

  2. Bayesian calibration of terrestrial ecosystem models: a study of advanced Markov chain Monte Carlo methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Dan; Ricciuto, Daniel M.; Walker, Anthony P.

    Calibration of terrestrial ecosystem models is important but challenging. Bayesian inference implemented by Markov chain Monte Carlo (MCMC) sampling provides a comprehensive framework to estimate model parameters and associated uncertainties using their posterior distributions. The effectiveness and efficiency of the method strongly depend on the MCMC algorithm used. In this work, a differential evolution adaptive Metropolis (DREAM) algorithm is used to estimate posterior distributions of 21 parameters for the data assimilation linked ecosystem carbon (DALEC) model using 14 years of daily net ecosystem exchange data collected at the Harvard Forest Environmental Measurement Site eddy-flux tower. The calibration of DREAM results in a better model fit and predictive performance compared to the popular adaptive Metropolis (AM) scheme. Moreover, DREAM indicates that two parameters controlling autumn phenology have multiple modes in their posterior distributions while AM only identifies one mode. The application suggests that DREAM is very suitable to calibrate complex terrestrial ecosystem models, where the uncertain parameter size is usually large and existence of local optima is always a concern. In addition, this effort justifies the assumptions of the error model used in Bayesian calibration according to the residual analysis. Here, the result indicates that a heteroscedastic, correlated, Gaussian error model is appropriate for the problem, and the consequent constructed likelihood function can alleviate the underestimation of parameter uncertainty that is usually caused by using uncorrelated error models.

  3. Bayesian calibration of terrestrial ecosystem models: a study of advanced Markov chain Monte Carlo methods

    DOE PAGES

    Lu, Dan; Ricciuto, Daniel M.; Walker, Anthony P.; ...

    2017-09-27

    Calibration of terrestrial ecosystem models is important but challenging. Bayesian inference implemented by Markov chain Monte Carlo (MCMC) sampling provides a comprehensive framework to estimate model parameters and associated uncertainties using their posterior distributions. The effectiveness and efficiency of the method strongly depend on the MCMC algorithm used. In this work, a differential evolution adaptive Metropolis (DREAM) algorithm is used to estimate posterior distributions of 21 parameters for the data assimilation linked ecosystem carbon (DALEC) model using 14 years of daily net ecosystem exchange data collected at the Harvard Forest Environmental Measurement Site eddy-flux tower. The calibration of DREAM results in a better model fit and predictive performance compared to the popular adaptive Metropolis (AM) scheme. Moreover, DREAM indicates that two parameters controlling autumn phenology have multiple modes in their posterior distributions while AM only identifies one mode. The application suggests that DREAM is very suitable to calibrate complex terrestrial ecosystem models, where the uncertain parameter size is usually large and existence of local optima is always a concern. In addition, this effort justifies the assumptions of the error model used in Bayesian calibration according to the residual analysis. Here, the result indicates that a heteroscedastic, correlated, Gaussian error model is appropriate for the problem, and the consequent constructed likelihood function can alleviate the underestimation of parameter uncertainty that is usually caused by using uncorrelated error models.

  4. Sequential evaporation of water molecules from protonated water clusters: measurement of the velocity distributions of the evaporated molecules and statistical analysis.

    PubMed

    Berthias, F; Feketeová, L; Abdoul-Carime, H; Calvo, F; Farizon, B; Farizon, M; Märk, T D

    2018-06-22

    Velocity distributions of neutral water molecules evaporated after collision-induced dissociation of protonated water clusters H+(H2O)n≤10 were measured using the combined correlated ion and neutral fragment time-of-flight (COINTOF) and velocity map imaging (VMI) techniques. As observed previously, all measured velocity distributions exhibit two contributions, with a low velocity part identified by statistical molecular dynamics (SMD) simulations as events obeying Maxwell-Boltzmann statistics and a high velocity contribution corresponding to non-ergodic events in which energy redistribution is incomplete. In contrast to earlier studies, where the evaporation of a single molecule was probed, the present study is concerned with events involving the evaporation of up to five water molecules. In particular, we discuss here in detail the cases of two and three evaporated molecules. Evaporation of several water molecules after CID can in general be interpreted as a sequential evaporation process. In addition to the SMD calculations, a Monte Carlo (MC) based simulation was developed allowing the reconstruction of the velocity distribution produced by the evaporation of m molecules from H+(H2O)n≤10 cluster ions, using the measured velocity distributions for singly evaporated molecules as the input. The observed broadening of the low-velocity part of the distributions for the evaporation of two and three molecules, as compared to the width for the evaporation of a single molecule, results from the cumulative recoil velocity of the successive ion residues as well as from the intrinsically broader distributions of decreasingly smaller parent clusters. Further MC simulations were carried out assuming that a certain proportion of non-ergodic events is responsible for the first evaporation in such a sequential evaporation series, thereby allowing the entire velocity distribution to be modelled.
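
    The Monte Carlo reconstruction described above can be mimicked with a toy composition of sequential evaporations: each emitted molecule is drawn from a single-evaporation velocity distribution, and the residual ion picks up a recoil kick, so later emissions ride on the accumulated recoil and the low-velocity peak broadens with m. Everything below (a Maxwell-Boltzmann stand-in for the measured input distribution, the temperature, the mass ratio) is an assumption for illustration, not the authors' setup:

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    def sample_emission_velocity(T=150.0, m_kg=2.99e-26):
        """Thermal 3-D velocity (m/s) of one evaporated water molecule;
        stands in for the measured single-evaporation distribution."""
        kB = 1.380649e-23
        return rng.normal(scale=np.sqrt(kB * T / m_kg), size=3)

    def sequential_evaporation_speeds(n_events, m_evap, mass_ratio=0.1):
        """Compose m sequential evaporations per event: each emission adds a
        momentum-conservation recoil to the residual ion, which shifts the
        lab-frame velocity of the next emitted molecule."""
        speeds = []
        for _ in range(n_events):
            recoil = np.zeros(3)
            for _ in range(m_evap):
                v = sample_emission_velocity()
                speeds.append(np.linalg.norm(v + recoil))  # lab-frame speed
                recoil -= mass_ratio * v                   # recoil kick
        return np.array(speeds)

    one = sequential_evaporation_speeds(20000, 1)
    three = sequential_evaporation_speeds(20000, 3)
    print(one.std(), three.std())   # the m=3 distribution comes out broader
    ```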

  5. Towards predictive data-driven simulations of wildfire spread - Part I: Reduced-cost Ensemble Kalman Filter based on a Polynomial Chaos surrogate model for parameter estimation

    NASA Astrophysics Data System (ADS)

    Rochoux, M. C.; Ricci, S.; Lucor, D.; Cuenot, B.; Trouvé, A.

    2014-05-01

    This paper is the first part in a series of two articles and presents a data-driven wildfire simulator for forecasting wildfire spread scenarios, at a reduced computational cost that is consistent with operational systems. The prototype simulator features the following components: a level-set-based fire propagation solver FIREFLY that adopts a regional-scale modeling viewpoint, treats wildfires as surface propagating fronts, and uses a description of the local rate of fire spread (ROS) as a function of environmental conditions based on Rothermel's model; a series of airborne-like observations of the fire front positions; and a data assimilation algorithm based on an ensemble Kalman filter (EnKF) for parameter estimation. This stochastic algorithm partly accounts for the non-linearities between the input parameters of the semi-empirical ROS model and the fire front position, and is sequentially applied to provide a spatially uniform correction to wind and biomass fuel parameters as observations become available. A wildfire spread simulator combined with an ensemble-based data assimilation algorithm is therefore a promising approach to reduce uncertainties in the forecast position of the fire front and to introduce a paradigm shift in the wildfire emergency response. In order to reduce the computational cost of the EnKF algorithm, a surrogate model based on a polynomial chaos (PC) expansion is used in place of the forward model FIREFLY in the resulting hybrid PC-EnKF algorithm. The performance of EnKF and PC-EnKF is assessed on synthetically generated simple configurations of fire spread, as well as on a controlled grassland fire experiment, to provide valuable information and insight on the benefits of the PC-EnKF approach. The results indicate that the proposed PC-EnKF algorithm features similar performance to the standard EnKF algorithm, but at a much reduced computational cost. In particular, the re-analysis and forecast skills of data assimilation strongly relate to the spatial and temporal variability of the errors in the ROS model parameters.
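
    The surrogate idea is to replace the expensive forward model inside the EnKF loop with a polynomial chaos expansion fitted once by regression on prior samples. A hedged one-parameter sketch using probabilists' Hermite polynomials follows; the stand-in forward model, expansion degree and sample count are assumptions, not the FIREFLY setup:

    ```python
    import numpy as np
    from numpy.polynomial.hermite_e import hermevander

    rng = np.random.default_rng(10)

    def forward_model(theta):
        """Stand-in for an expensive parameter-to-observation map."""
        return np.exp(0.3 * theta) + 0.1 * theta ** 2

    # Fit a degree-4 polynomial chaos surrogate by regression on prior samples.
    theta_train = rng.normal(size=200)               # prior is standard normal
    Psi = hermevander(theta_train, 4)                # Hermite basis matrix
    coeffs, *_ = np.linalg.lstsq(Psi, forward_model(theta_train), rcond=None)

    def surrogate(theta):
        """Cheap replacement for forward_model inside the EnKF loop."""
        return hermevander(theta, 4) @ coeffs

    theta_test = rng.normal(size=5)
    print(forward_model(theta_test))
    print(surrogate(theta_test))     # agrees closely on the prior's support
    ```

    Once fitted, the surrogate is evaluated for every ensemble member at every assimilation cycle at negligible cost, which is where the reported savings come from.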

  6. Inverse modelling for real-time estimation of radiological consequences in the early stage of an accidental radioactivity release.

    PubMed

    Pecha, Petr; Šmídl, Václav

    2016-11-01

    A stepwise sequential assimilation algorithm is proposed based on an optimisation approach for recursive parameter estimation and tracking of radioactive plume propagation in the early stage of a radiation accident. Predictions of the radiological situation in each time step of the plume propagation are driven by an existing short-term meteorological forecast and the assimilation procedure manipulates the model parameters to match the observations incoming concurrently from the terrain. Mathematically, the task is a typical ill-posed inverse problem of estimating the parameters of the release. The proposed method is designated as a stepwise re-estimation of the source term release dynamics and an improvement of several input model parameters. It results in a more precise determination of the adversely affected areas in the terrain. The nonlinear least-squares regression methodology is applied for estimation of the unknowns. The fast and adequately accurate segmented Gaussian plume model (SGPM) is used in the first stage of direct (forward) modelling. The subsequent inverse procedure infers (re-estimates) the values of important model parameters from the actual observations. Accuracy and sensitivity of the proposed method for real-time forecasting of the accident propagation is studied. First, a twin experiment generating noiseless simulated "artificial" observations is studied to verify the minimisation algorithm. Second, the impact of the measurement noise on the re-estimated source release rate is examined. In addition, the presented method can be used as a proposal for more advanced statistical techniques using, e.g., importance sampling. Copyright © 2016 Elsevier Ltd. All rights reserved.
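
    The inverse step of such a scheme amounts to nonlinear least-squares fitting of the release rate so that a fast plume model reproduces the incoming measurements. A much-reduced sketch, in which a single-source ground-level Gaussian plume with crude dispersion coefficients stands in for the SGPM and all numbers are invented for illustration:

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def gaussian_plume(x, y, q, u=3.0, h=50.0):
        """Ground-level concentration of a Gaussian plume with release rate q,
        wind speed u along x, stack height h; sigmas grow with distance."""
        sy = 0.08 * x                       # crude lateral dispersion growth
        sz = 0.06 * x                       # crude vertical dispersion growth
        return (q / (2 * np.pi * u * sy * sz)
                * np.exp(-0.5 * (y / sy) ** 2)
                * 2.0 * np.exp(-0.5 * (h / sz) ** 2))   # ground reflection

    rng = np.random.default_rng(11)
    xs = np.array([500.0, 800.0, 1200.0, 2000.0])       # receptor positions (m)
    ys = np.array([-30.0, 10.0, 60.0, -80.0])
    q_true = 1.0e9
    obs = gaussian_plume(xs, ys, q_true) * (1 + 0.05 * rng.normal(size=4))

    def residuals(log_q):
        """Fit in log(q) so the estimated release rate stays positive."""
        return gaussian_plume(xs, ys, np.exp(log_q[0])) - obs

    fit = least_squares(residuals, x0=[np.log(1.0e8)])
    print(np.exp(fit.x[0]))   # recovers q within the 5% measurement noise
    ```

    In the full scheme this fit would be repeated at every assimilation step as new terrain measurements arrive, re-estimating the source term recursively.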

  7. Preparing for the Future Nankai Trough Tsunami: A Data Assimilation and Inversion Analysis From Various Observational Systems

    NASA Astrophysics Data System (ADS)

    Mulia, Iyan E.; Inazu, Daisuke; Waseda, Takuji; Gusman, Aditya Riadi

    2017-10-01

    The future Nankai Trough tsunami is one of the imminent threats to the Japanese coastal communities that could potentially cause a catastrophic event. As a part of the countermeasure efforts for such an occurrence, this study analyzes the efficacy of combining tsunami data assimilation (DA) and waveform inversion (WI). The DA is used to continuously refine a wavefield model whereas the WI is used to estimate the tsunami source. We consider a future scenario of the Nankai Trough tsunami recorded at various observational systems, including ocean bottom pressure (OBP) gauges, global positioning system (GPS) buoys, and ship height positioning data. Since most of the OBP gauges are located inside the source region, the recorded tsunami signals exhibit significant offsets from surface measurements due to coseismic seafloor deformation effects. Such biased data are not applicable to the standard DA, but can be taken into account in the WI. On the other hand, the use of WI for the ship data may not be practical because a considerably large precomputed tsunami database is needed to cope with the spontaneous ship locations. The DA is more suitable for such an observational system as it can be executed sequentially in time and does not require precomputed scenarios. Therefore, the combined approach of DA and WI allows us to concurrently make use of all observational resources. Additionally, we introduce a bias correction scheme for the OBP data to improve the accuracy, and an adaptive thinning of observations to determine the efficient number of observations.

  8. Parameter optimisation for a better representation of drought by LSMs: inverse modelling vs. sequential data assimilation

    NASA Astrophysics Data System (ADS)

    Dewaele, Hélène; Munier, Simon; Albergel, Clément; Planque, Carole; Laanaia, Nabil; Carrer, Dominique; Calvet, Jean-Christophe

    2017-09-01

    Soil maximum available water content (MaxAWC) is a key parameter in land surface models (LSMs). However, being difficult to measure, this parameter is usually uncertain. This study assesses the feasibility of using a 15-year (1999-2013) time series of satellite-derived low-resolution observations of leaf area index (LAI) to estimate MaxAWC for rainfed croplands over France. LAI interannual variability is simulated using the CO2-responsive version of the Interactions between Soil, Biosphere and Atmosphere (ISBA) LSM for various values of MaxAWC. The optimal value is then selected by using (1) a simple inverse modelling technique comparing simulated and observed LAI and (2) a more complex method consisting of integrating observed LAI into ISBA through a land data assimilation system (LDAS) and minimising the LAI analysis increments. The evaluation of the MaxAWC estimates from both methods is done using simulated annual maximum above-ground biomass (Bag) and straw cereal grain yield (GY) values from the Agreste French agricultural statistics portal, for 45 administrative units presenting a high proportion of straw cereals. Significant correlations (p value < 0.01) between Bag and GY are found for up to 36 and 53 % of the administrative units for the inverse modelling and LDAS tuning methods, respectively. It is found that the LDAS tuning experiment gives more realistic values of MaxAWC and maximum Bag than the inverse modelling experiment. Using undisaggregated LAI observations leads to an underestimation of MaxAWC and maximum Bag in both experiments. Median annual maximum values of the disaggregated LAI observations are found to correlate very well with MaxAWC.

  9. Time-series resolution of gradual nitrogen starvation and its impact on photosynthesis in the cyanobacterium Synechocystis PCC 6803.

    PubMed

    Krasikov, Vladimir; Aguirre von Wobeser, Eneas; Dekker, Henk L; Huisman, Jef; Matthijs, Hans C P

    2012-07-01

    Sequential adaptation to nitrogen deprivation and ultimately to full starvation requires coordinated adjustment of cellular functions. We investigated changes in gene expression and cell physiology of the cyanobacterium Synechocystis PCC 6803 during 96 h of nitrogen starvation. During the first 6 h, the transcriptome showed activation of nitrogen uptake and assimilation systems and of the core nitrogen and carbon assimilation regulators. However, the nitrogen-deprived cells still grew at the same rate as the control and even showed transiently increased expression of phycobilisome genes. After 12 h, cell growth decreased and chlorosis started with degradation of the nitrogen-rich phycobilisomes. During this phase, the transcriptome showed suppression of genes for phycobilisomes, for carbon fixation and for de novo protein synthesis. Interestingly, photosynthetic activity of both photosystem I (PSI) and photosystem II was retained quite well. Excess electrons were quenched by the induction of terminal oxidase and hydrogenase genes, compensating for the diminished carbon fixation and nitrate reduction activity. After 48 h, the cells ceased most activities. A marked exception was the retained PSI gene transcription; possibly this supports the viability of Synechocystis cells and enables rapid recovery after relief from nitrogen starvation. During early recovery, many genes changed expression, supporting the resumed cellular activity. In total, our results distinguished three phases during gradual nitrogen depletion: (1) an immediate response, (2) short-term acclimation and (3) long-term survival. This shows that cyanobacteria respond to nitrogen starvation with a cascade of physiological adaptations, reflected by numerous changes in the transcriptome unfolding at different timescales. Copyright © Physiologia Plantarum 2012.

  10. Multiple model cardinalized probability hypothesis density filter

    NASA Astrophysics Data System (ADS)

    Georgescu, Ramona; Willett, Peter

    2011-09-01

    The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution, while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented with either a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD, particularized for the case of two target motion models, and proceed to show that in the case of a single model the new MMCPHD equations reduce to the original CPHD equations.
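    The multiple-model machinery underlying such SMC implementations can be sketched in a few lines: each particle carries a motion-model index that evolves by a Markov chain before the state is propagated under the selected model. This is a generic sketch with assumed switching probabilities and toy motion models, not the MMCPHD recursion itself.

```python
# Generic multiple-model particle prediction step (toy models, assumed values).
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.95, 0.05],   # assumed model-switching probabilities
              [0.10, 0.90]])

def mm_predict(states, models, dt=1.0):
    """states: (N, 2) array of [position, velocity]; models: (N,) in {0, 1}."""
    new_models = np.array([rng.choice(2, p=P[m]) for m in models])
    out = states.copy()
    out[:, 0] += states[:, 1] * dt                         # both models: move
    accel = np.where(new_models == 1, rng.normal(0, 1.0, len(states)), 0.0)
    out[:, 1] += accel * dt                                # model 1: manoeuvring
    return out, new_models

states = np.zeros((1000, 2)); states[:, 1] = 1.0
models = np.zeros(1000, dtype=int)
states, models = mm_predict(states, models)
print("fraction of particles in manoeuvre model:", models.mean())
```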

  11. A framework for optimizing micro-CT in dual-modality micro-CT/XFCT small-animal imaging system

    NASA Astrophysics Data System (ADS)

    Vedantham, Srinivasan; Shrestha, Suman; Karellas, Andrew; Cho, Sang Hyun

    2017-09-01

    Dual-modality Computed Tomography (CT)/X-ray Fluorescence Computed Tomography (XFCT) can be a valuable tool for imaging and quantifying the organ and tissue distribution of small concentrations of high-atomic-number materials in small-animal systems. In this work, a framework for optimizing the micro-CT imaging component of the dual-modality system is described, either when the micro-CT images are acquired concurrently with XFCT under the x-ray spectral conditions used for XFCT, or when the micro-CT images are acquired sequentially and independently of XFCT. This framework utilizes cascaded systems analysis for task-specific determination of the detectability index using numerical observer models at a given radiation dose, where the radiation dose is determined using Monte Carlo simulations.

  12. Influence of probe pressure on diffuse reflectance spectra of human skin measured in vivo

    NASA Astrophysics Data System (ADS)

    Popov, Alexey P.; Bykov, Alexander V.; Meglinski, Igor V.

    2017-11-01

    Mechanical pressure applied superficially on the human skin surface by a fiber-optic probe influences the spatial distribution of blood within the cutaneous tissues. Upon gradual loading of weight on the probe, a stepwise increase in the skin reflectance spectra is observed. Decreasing the load follows a similar inverse, staircase-like tendency. The observed stepwise changes in the reflectance spectra are due to the sequential extrusion of blood from the topical cutaneous vascular beds during loading and their refilling afterward. The obtained results are confirmed by Monte Carlo modeling. This implies that pressure-induced influence during in vivo measurements of human skin diffuse reflectance spectra should be taken into consideration, in particular in the rapidly developing area of wearable gadgets for real-time monitoring of various human body parameters.

  13. Model-Based Localization and Tracking Using Bluetooth Low-Energy Beacons

    PubMed Central

    Daniş, F Serhan; Cemgil, Ali Taylan

    2017-01-01

    We introduce a high precision localization and tracking method that makes use of cheap Bluetooth low-energy (BLE) beacons only. We track the position of a moving sensor by integrating highly unreliable and noisy BLE observations streaming from multiple locations. A novel aspect of our approach is the development of an observation model, specifically tailored for received signal strength indicator (RSSI) fingerprints: a combination based on the optimal transport model of Wasserstein distance. The tracking results of the entire system are compared with alternative baseline estimation methods, such as nearest neighboring fingerprints and an artificial neural network. Our results show that highly accurate estimation from noisy Bluetooth data is practically feasible with an observation model based on Wasserstein distance interpolation combined with the sequential Monte Carlo (SMC) method for tracking. PMID:29109375
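    A minimal sketch of the fingerprint-scoring idea follows, assuming fingerprints are stored as raw RSSI samples per grid cell; the exponential mapping from distance to likelihood and the scale parameter are illustrative choices, not necessarily those of the paper.

```python
# Score a window of live RSSI readings against a stored fingerprint using the
# Wasserstein-1 distance between the two empirical distributions, turned into
# an (unnormalised) likelihood suitable for SMC weighting.
import numpy as np
from scipy.stats import wasserstein_distance

def fingerprint_likelihood(rssi_window, fingerprint, scale=2.0):
    d = wasserstein_distance(rssi_window, fingerprint)
    return np.exp(-d / scale)

stored = np.array([-71.0, -70.0, -69.5, -72.0, -70.5])  # fingerprint at one cell
live = np.array([-70.2, -71.1, -69.8])                  # live observation window
print(fingerprint_likelihood(live, stored))
```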

  14. Model-Based Localization and Tracking Using Bluetooth Low-Energy Beacons.

    PubMed

    Daniş, F Serhan; Cemgil, Ali Taylan

    2017-10-29

    We introduce a high precision localization and tracking method that makes use of cheap Bluetooth low-energy (BLE) beacons only. We track the position of a moving sensor by integrating highly unreliable and noisy BLE observations streaming from multiple locations. A novel aspect of our approach is the development of an observation model, specifically tailored for received signal strength indicator (RSSI) fingerprints: a combination based on the optimal transport model of Wasserstein distance. The tracking results of the entire system are compared with alternative baseline estimation methods, such as nearest neighboring fingerprints and an artificial neural network. Our results show that highly accurate estimation from noisy Bluetooth data is practically feasible with an observation model based on Wasserstein distance interpolation combined with the sequential Monte Carlo (SMC) method for tracking.

  15. Analysis and optimization of population annealing

    NASA Astrophysics Data System (ADS)

    Amey, Christopher; Machta, Jonathan

    2018-03-01

    Population annealing is an easily parallelizable sequential Monte Carlo algorithm that is well suited for simulating the equilibrium properties of systems with rough free-energy landscapes. In this work we seek to understand and improve the performance of population annealing. We derive several useful relations between quantities that describe the performance of population annealing and use these relations to suggest methods to optimize the algorithm. These optimization methods were tested by performing large-scale simulations of the three-dimensional (3D) Edwards-Anderson (Ising) spin glass and measuring several observables. The optimization methods were found to substantially decrease the amount of computational work necessary as compared to previously used, unoptimized versions of population annealing. We also obtain more accurate values of several important observables for the 3D Edwards-Anderson model.
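    The anneal-reweight-resample-equilibrate loop at the core of population annealing can be sketched compactly; the toy version below uses a small 2D Ising model with illustrative population size and temperature schedule, without the optimizations studied in the paper.

```python
# Bare-bones population annealing on a small 2D Ising model (toy parameters).
import numpy as np

rng = np.random.default_rng(1)
L, R = 8, 200                           # lattice size, population size
spins = rng.choice([-1, 1], size=(R, L, L))

def energy(s):
    return -(np.sum(s * np.roll(s, 1, axis=0)) + np.sum(s * np.roll(s, 1, axis=1)))

def metropolis_sweep(s, beta):
    for _ in range(s.size):
        i, j = rng.integers(L, size=2)
        dE = 2 * s[i, j] * (s[(i+1) % L, j] + s[(i-1) % L, j]
                            + s[i, (j+1) % L] + s[i, (j-1) % L])
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            s[i, j] *= -1

betas = np.linspace(0.0, 0.6, 31)
for b_old, b_new in zip(betas[:-1], betas[1:]):
    E = np.array([energy(s) for s in spins])
    w = np.exp(-(b_new - b_old) * E)        # reweight to the new temperature
    w /= w.sum()
    spins = spins[rng.choice(R, size=R, p=w)].copy()  # resample the population
    for s in spins:                         # equilibrate with MCMC sweeps
        metropolis_sweep(s, b_new)
print("mean |m| at beta=0.6:", np.mean(np.abs(spins.mean(axis=(1, 2)))))
```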

  16. Effects of oligotrophication on primary production in peri-alpine lakes

    NASA Astrophysics Data System (ADS)

    Finger, David; Wüest, Alfred; Bossard, Peter

    2013-08-01

    During the second half of the 20th century, untreated sewage released from housing and industry into natural waters led to the degradation of many freshwater lakes and reservoirs worldwide. In order to mitigate eutrophication, wastewater treatment plants, including Fe-induced phosphorus precipitation, were implemented throughout the industrialized world, leading to reoligotrophication in many freshwater lakes. To understand and assess the effects of reoligotrophication on primary productivity, we analyzed 28 years of 14C assimilation rates, as well as other biotic and abiotic parameters, such as global radiation, nutrient concentrations and plankton densities, in peri-alpine Lake Lucerne, Switzerland. Using a simple productivity-light relationship, we estimated continuous primary production and discussed the relation between productivity and observed limnological parameters. Furthermore, we assessed the uncertainty of our modeling approach, based on monthly 14C assimilation measurements, using Monte Carlo simulations. Results confirm that monthly sampling of productivity is sufficient for identifying long-term trends in productivity and that conservation management has successfully improved water quality during the past three decades by reducing nutrients and primary production in the lake. However, even though nutrient concentrations have remained constant in recent years, annual primary production varies significantly from year to year. Despite the fact that nutrient concentrations have decreased by more than an order of magnitude, primary production has decreased only slightly. These results suggest that primary production correlates well with nutrient availability, but that meteorological conditions lead to interannual variability regardless of the trophic status of the lake. Accordingly, in oligotrophic freshwaters meteorological forcing may reduce productivity, impacting the entire food chain of the ecosystem.

  17. The Advantages of Hybrid 4DEnVar in the Context of the Forecast Sensitivity to Initial Conditions

    NASA Astrophysics Data System (ADS)

    Song, Hyo-Jong; Shin, Seoleun; Ha, Ji-Hyun; Lim, Sujeong

    2017-11-01

    Hybrid four-dimensional ensemble variational data assimilation (hybrid 4DEnVar) is a prospective successor to three-dimensional variational data assimilation (3DVar) in operational weather prediction centers that are currently developing a new weather prediction model and in those that do not operate adjoint models. In experiments using real observations, hybrid 4DEnVar improved Northern Hemisphere (NH; 20°N-90°N) 500 hPa geopotential height forecasts up to 5 days in a NH summer month compared to 3DVar, with statistical significance. This result is verified against ERA-Interim through a Monte Carlo test. A regression analysis associates the 5-day forecast skill with the quality of the initial condition. The increased analysis skill for midtropospheric midlatitude temperature and subtropical moisture has the most apparent effect on forecast skill in the NH, including a typhoon prediction case. By attributing the analysis improvements of hybrid 4DEnVar separately to the ensemble background error covariance (BEC), its four-dimensional (4-D) extension, and the climatological BEC, it is revealed that the ensemble BEC contributes to the subtropical moisture analysis, whereas the 4-D extension contributes to the midtropospheric midlatitude temperature analysis. This result implies that hourly wind-mass correlation within the 6 h analysis window is required to fully exploit the potential of hybrid 4DEnVar for the midlatitude temperature analysis. However, the temporal ensemble correlation between moisture and other variables is invalid at the hourly time scale, so it does not improve the hybrid 4DEnVar analysis.
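    The hybrid BEC at the heart of such schemes is a weighted blend of a static climatological covariance and a (localized) ensemble covariance. Below is a toy sketch; the weights, localization length, and covariance shapes are assumptions for illustration only.

```python
# Toy hybrid background-error covariance: climatological + localized ensemble.
import numpy as np

n, N = 40, 20
rng = np.random.default_rng(2)
ens = rng.standard_normal((N, n)).cumsum(axis=1)       # toy ensemble of states
perts = ens - ens.mean(axis=0)
B_ens = perts.T @ perts / (N - 1)                      # flow-dependent part

i = np.arange(n)
loc = np.exp(-0.5 * ((i[:, None] - i[None, :]) / 5.0) ** 2)  # Gaussian localization
B_clim = np.exp(-np.abs(i[:, None] - i[None, :]) / 8.0)      # static part

beta_c, beta_e = 0.5, 0.5                              # assumed hybrid weights
B_hybrid = beta_c * B_clim + beta_e * (loc * B_ens)    # Schur-localized blend
print(B_hybrid[0, :5])
```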

  18. The Behavior of Filters and Smoothers for Strongly Nonlinear Dynamics

    NASA Technical Reports Server (NTRS)

    Zhu, Yanqiu; Cohn, Stephen E.; Todling, Ricardo

    1999-01-01

    The Kalman filter is the optimal filter in the presence of known Gaussian error statistics and linear dynamics. Extending the filter to nonlinear dynamics is nontrivial in the sense of appropriately representing high-order moments of the statistics. Monte Carlo, ensemble-based, methods have been advocated as the methodology for representing high-order moments without any questionable closure assumptions (e.g., Miller 1994). Investigation along these lines has been conducted for highly idealized dynamics such as the strongly nonlinear Lorenz (1963) model as well as for more realistic models of the oceans (Evensen and van Leeuwen 1996) and atmosphere (Houtekamer and Mitchell 1998). A few relevant issues in this context are the number of ensemble members necessary to properly represent the error statistics and the modifications to the usual filter equations necessary to allow for a correct update of the ensemble members (Burgers 1998). The ensemble technique has also been applied to the problem of smoothing, for which similar questions apply. Ensemble smoother examples, however, seem quite puzzling in that the state estimates are worse than those of their filter analogues (Evensen 1997). In this study, we use concepts from probability theory to revisit the ensemble methodology for filtering and smoothing in data assimilation. We use the Lorenz (1963) model to test and compare the behavior of a variety of implementations of ensemble filters. We also implement ensemble smoothers that are able to perform better than their filter counterparts. A discussion of the feasibility of applying these techniques to large data assimilation problems will be given at the conference.
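    The ensemble-update modification attributed above to Burgers (1998) — perturbing the observations so that the analysis ensemble has the correct spread — can be sketched as follows; dimensions and error statistics are toy values.

```python
# Compact stochastic EnKF analysis step with perturbed observations.
import numpy as np

rng = np.random.default_rng(3)

def enkf_update(X, y, H, R):
    """X: (N, n) ensemble; y: (m,) observation; H: (m, n); R: (m, m)."""
    N = X.shape[0]
    Xp = X - X.mean(axis=0)
    B = Xp.T @ Xp / (N - 1)                        # sample background covariance
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # Kalman gain
    Y = y + rng.multivariate_normal(np.zeros(len(y)), R, size=N)  # perturbed obs
    return X + (Y - X @ H.T) @ K.T

X = rng.standard_normal((50, 3)) + np.array([1.0, 0.0, -1.0])
H = np.array([[1.0, 0.0, 0.0]])                    # observe first component only
Xa = enkf_update(X, np.array([2.0]), H, np.eye(1) * 0.5)
print("analysis ensemble mean:", Xa.mean(axis=0))
```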

  19. The use of spatio-temporal correlation to forecast critical transitions

    NASA Astrophysics Data System (ADS)

    Karssenberg, Derek; Bierkens, Marc F. P.

    2010-05-01

    Complex dynamical systems may have critical thresholds at which the system shifts abruptly from one state to another. Such critical transitions have been observed in systems ranging from the human body to financial markets and the Earth system. Forecasting the timing of critical transitions before they are reached is of paramount importance because critical transitions are associated with a large shift in the dynamical regime of the system under consideration. However, it is hard to forecast critical transitions, because the state of the system shows relatively little change before the threshold is reached. Recently, it was shown that increased spatio-temporal autocorrelation and variance can serve as alternative early warning signals for critical transitions. However, thus far these second-order statistics have not been used for forecasting in a data assimilation framework. Here we show that the use of spatio-temporal autocorrelation and variance in the state of the system reduces the uncertainty in the predicted timing of critical transitions compared to classical approaches that use the value of the system state only. This is shown by assimilating observed spatio-temporal autocorrelation and variance into a dynamical system model using a Particle Filter. We adapt a well-studied distributed model of a logistically growing resource with a fixed grazing rate. The model describes the transition from an underexploited system with high resource biomass to overexploitation as grazing pressure crosses the critical threshold, which is a fold bifurcation. To represent limited prior information, we use a large variance in the prior probability distributions of model parameters and the system driver (grazing rate). First, we show that the rate of increase in spatio-temporal autocorrelation and variance prior to reaching the critical threshold is relatively consistent across the uncertainty range of the driver and parameter values used. This indicates that increases in spatio-temporal autocorrelation and variance are consistent predictors of a critical transition, even under the condition of a poorly defined system. Second, we perform data assimilation experiments using an artificial exhaustive data set generated by one realization of the model. To mimic real-world sampling, an observational data set is created from this exhaustive data set. This is done by sampling on a regular spatio-temporal grid, supplemented by sampling locations at a short distance. Spatial and temporal autocorrelation in this observational data set is calculated for different spatial and temporal separation (lag) distances. To assign appropriate weights to observations (here, autocorrelation values and variance) in the Particle Filter, the covariance matrix of the error in these observations is required. This covariance matrix is estimated using Monte Carlo sampling, selecting a different random position of the sampling network relative to the exhaustive data set for each realization. At each update moment in the Particle Filter, observed autocorrelation values are assimilated into the model and the state of the model is updated. Using this approach, it is shown that the use of autocorrelation reduces the uncertainty in the forecasted timing of a critical transition compared to runs without data assimilation. The performance of spatial autocorrelation versus temporal autocorrelation depends on the timing and number of observational data. This study is restricted to a single model only.
However, it is becoming increasingly clear that spatio-temporal autocorrelation and variance can be used as early warning signals for a large number of systems. Thus, it is expected that spatio-temporal autocorrelation and variance are valuable in data assimilation frameworks in a large number of dynamical systems.
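    A minimal sketch of the assimilation idea follows: particles carry the uncertain driver, and an observed early-warning statistic (here, spatial variance) is used to weight and resample them. The variance-versus-driver relationship below is a toy stand-in for the distributed resource model, and all parameter values are illustrative.

```python
# Minimal SIR particle filter assimilating an early-warning statistic.
import numpy as np

rng = np.random.default_rng(4)
n_part = 500
grazing = rng.uniform(1.0, 3.0, n_part)          # uncertain driver (prior)

def predicted_variance(c):
    """Toy stand-in: spatial variance grows as the fold bifurcation nears."""
    return 0.05 / np.maximum(2.6 - c, 0.05)

for var_obs in [0.03, 0.05, 0.09]:               # "observed" variance sequence
    grazing += rng.normal(0.0, 0.01, n_part)     # driver evolves slowly
    w = np.exp(-0.5 * ((predicted_variance(grazing) - var_obs) / 0.01) ** 2)
    w /= w.sum()
    grazing = grazing[rng.choice(n_part, n_part, p=w)]  # resample (SIR)
print("posterior grazing-rate mean:", grazing.mean())
```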

  20. Lifetime Segmented Assimilation Trajectories and Health Outcomes in Latino and Other Community Residents

    PubMed Central

    Marsiglia, Flavio F.; Kulis, Stephen; Kellison, Joshua G.

    2010-01-01

    Objectives. Under an ecodevelopmental framework, we examined lifetime segmented assimilation trajectories (diverging assimilation pathways influenced by prior life conditions) and related them to quality-of-life indicators in a diverse sample of 258 men in the Phoenix, AZ, metropolitan area. Methods. We used growth mixture model analysis of lifetime changes in socioeconomic status and acculturation to identify distinct lifetime segmented assimilation trajectory groups, which we compared on life satisfaction, exercise, and dietary behaviors. We hypothesized that lifetime assimilation change toward mainstream American culture (upward assimilation) would be associated with favorable health outcomes, and downward assimilation change with unfavorable health outcomes. Results. A growth mixture model latent class analysis identified 4 distinct assimilation trajectory groups. In partial support of the study hypotheses, the extreme upward assimilation trajectory group (the most successful of the assimilation pathways) exhibited the highest life satisfaction and the lowest frequency of unhealthy food consumption. Conclusions. Upward segmented assimilation is associated in adulthood with certain positive health outcomes. This may be the first study to model upward and downward lifetime segmented assimilation trajectories, and to associate these with life satisfaction, exercise, and dietary behaviors. PMID:20167890

  1. Optimal control of a coupled partial and ordinary differential equations system for the assimilation of polarimetry Stokes vector measurements in tokamak free-boundary equilibrium reconstruction with application to ITER

    NASA Astrophysics Data System (ADS)

    Faugeras, Blaise; Blum, Jacques; Heumann, Holger; Boulbe, Cédric

    2017-08-01

    The modelling of polarimetry Faraday rotation measurements commonly used in tokamak plasma equilibrium reconstruction codes is an approximation to the Stokes model. This approximation is not valid for the foreseen ITER scenarios, where high-current, high-electron-density plasma regimes are expected. In this work a method enabling the consistent resolution of the inverse equilibrium reconstruction problem in the framework of non-linear free-boundary equilibrium coupled to the Stokes model equation for polarimetry is provided. Using optimal control theory we derive the optimality system for this inverse problem. A sequential quadratic programming (SQP) method is proposed for its numerical solution. Numerical experiments with noisy synthetic measurements in the ITER tokamak configuration for two test cases, the second of which is an H-mode plasma, show that the method is efficient and that the accuracy of the identification of the unknown profile functions is improved compared to the use of classical Faraday measurements.
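    As a schematic of the SQP step on a drastically simplified problem, the sketch below fits two unknown profile parameters to noisy synthetic data with SciPy's SLSQP solver; the forward operator, bounds, and constraint are illustrative, not the free-boundary equilibrium operators.

```python
# Toy constrained least-squares identification solved with an SQP-type method.
import numpy as np
from scipy.optimize import minimize

A = np.array([[1.0, 0.5], [0.3, 1.2], [0.8, 0.1]])   # toy forward operator
y_obs = A @ np.array([0.7, 0.2]) + 0.01              # noisy synthetic data

def misfit(p):
    return 0.5 * np.sum((A @ p - y_obs) ** 2)

res = minimize(misfit, x0=[0.0, 0.0], method="SLSQP",
               bounds=[(0.0, 1.0), (0.0, 1.0)],      # e.g. profile positivity
               constraints=[{"type": "ineq", "fun": lambda p: 1.0 - p.sum()}])
print("identified parameters:", res.x)
```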

  2. Human factors in equipment development for the Space Shuttle - A study of the general purpose work station

    NASA Technical Reports Server (NTRS)

    Junge, M. K.; Giacomi, M. J.

    1981-01-01

    The results of a human factors test to assay the suitability of a prototype general purpose work station (GPWS) for biosciences experiments on the fourth Spacelab mission are reported. The evaluation was performed to verify that users of the GPWS would optimally interact with the GPWS configuration and instrumentation. Six male subjects sat on stools positioned to allow assimilation of the zero-g body posture. Trials were run concerning the operator viewing angles facing the console, the console color, procedures for injecting rates with dye, a rat blood cell count, mouse dissection, squirrel monkey transfer, and plant fixation. The trials were run for several days in order to gage improvement or poor performance conditions. Better access to the work surface was found necessary, together with more distinct and better located LEDs, better access window latches, clearer sequences on control buttons, color-coded sequential buttons, and provisions made for an intercom system when operators of the GPWS work in tandem.

  3. Comparison of DNA testing strategies in monitoring human papillomavirus infection prevalence through simulation.

    PubMed

    Lin, Carol Y; Li, Ling

    2016-11-07

    HPV DNA diagnostic tests for epidemiology monitoring (research purpose) or cervical cancer screening (clinical purpose) have often been considered separately. Women with positive Linear Array (LA) polymerase chain reaction (PCR) research test results typically are neither informed nor referred for colposcopy. Recently, sequential testing using the Hybrid Capture 2 (HC2) HPV clinical test as a triage before genotyping by LA has been adopted for monitoring HPV infections. Also, HC2 has been reported as a more feasible screening approach for cervical cancer in low-resource countries. Thus, knowing the performance of testing strategies incorporating an HPV clinical test (i.e., HC2-only, or HC2 as a triage before genotyping by LA) compared with LA-only testing in measuring HPV prevalence will be informative for public health practice. We conducted a Monte Carlo simulation study. Data were generated using mathematical algorithms. We designated the reported HPV infection prevalence in the U.S. and Latin America as the "true" underlying type-specific HPV prevalence. The analytical sensitivity of HC2 for detecting 14 high-risk (oncogenic) types was considered to be less than that of LA. Estimated-to-true prevalence ratios and percentage reductions were calculated. When the "true" HPV prevalence was designated as the reported prevalence in the U.S., with LA genotyping sensitivity and specificity of (0.95, 0.95), estimated-to-true prevalence ratios of 14 high-risk types were 2.132, 1.056, and 0.958 for LA-only, HC2-only, and sequential testing, respectively. Estimated-to-true prevalence ratios of two vaccine-associated high-risk types were 2.359 and 1.063 for LA-only and sequential testing, respectively. When designated type-specific prevalences of HPV16 and 18 were reduced by 50%, using either LA-only or sequential testing, prevalence estimates were reduced by 18%. Estimated-to-true HPV infection prevalence ratios using the LA-only testing strategy are generally higher than those using HC2-only or HC2 as a triage before genotyping by LA. HPV clinical testing can be incorporated to monitor HPV prevalence or vaccine effectiveness. Caution is needed when comparing apparent prevalence from different testing strategies.
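    The estimated-to-true prevalence ratio for a single imperfect test can be reproduced with a few lines of Monte Carlo; the sensitivity, specificity, and prevalence below are illustrative, not the study's designated values.

```python
# Toy Monte Carlo of the estimated-to-true prevalence ratio under an imperfect
# test: false positives among the uninfected inflate the apparent prevalence.
import numpy as np

rng = np.random.default_rng(5)
n, true_prev = 100_000, 0.10
sens, spec = 0.95, 0.95

infected = rng.random(n) < true_prev
positive = np.where(infected, rng.random(n) < sens, rng.random(n) > spec)
est_prev = positive.mean()
print("estimated-to-true ratio:", est_prev / true_prev)  # ~1.4 for these values
```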

  4. Atlas Assimilation Patterns in Different Types of Adult Craniocervical Junction Malformations.

    PubMed

    Ferreira, Edson Dener Zandonadi; Botelho, Ricardo Vieira

    2015-11-01

    This is a cross-sectional analysis of magnetic resonance images of 111 subjects: patients with craniocervical malformations and normal controls. The aim was to test the hypothesis that atlas assimilation is associated with basilar invagination (BI) and that assimilation of the anterior arch of the atlas is associated with craniocervical instability and type I BI. Atlas assimilation is the most common malformation of the craniocervical junction. This condition has been associated with craniocervical instability and BI in isolated cases. We evaluated midline magnetic resonance images (and/or CT scans) from patients with craniocervical junction malformation and from normal subjects. The patients were separated into 3 groups: Chiari type I malformation, type I BI, and type II BI. The atlas assimilations were classified according to their embryological origins as follows: posterior arch, anterior arch, and both-arches assimilation. We studied the craniometric values of 111 subjects, 78 with craniocervical junction malformation and 33 without malformations. Of the 78 malformations, 51 patients had Chiari type I and 27 had BI, of whom 10 presented with type I and 17 with type II BI. In the Chiari group, 41 showed no assimilation of the atlas. In the type I BI group, all patients presented with anterior arch assimilation, either in isolation or associated with assimilation of the posterior arch. 63% of the patients with type II BI presented with posterior arch assimilation, either in isolation or associated with anterior arch assimilation. In the control group, no patients had atlas assimilation. Anterior atlas assimilation leads to type I BI. Posterior atlas assimilation more frequently leads to type II BI. Separation in terms of anterior versus posterior atlas assimilation reflects a more accurate understanding of the clinical and embryological differences among craniocervical junction malformations.

  5. A variational ensemble scheme for noisy image data assimilation

    NASA Astrophysics Data System (ADS)

    Yang, Yin; Robinson, Cordelia; Heitz, Dominique; Mémin, Etienne

    2014-05-01

    Data assimilation techniques aim at recovering the trajectory of a system's state variables, denoted X, over time from partially observed noisy measurements of the system, denoted Y. These procedures, which couple dynamics and noisy measurements of the system, fulfill a twofold objective. On the one hand, they provide a denoising - or reconstruction - procedure for the data through a given model framework, and on the other hand, they provide estimation procedures for unknown parameters of the dynamics. A standard variational data assimilation problem can be formulated as the minimization of the following objective function with respect to the initial discrepancy, η, from the background initial guess:

    J(η(x)) = (1/2) ‖X_b(x) − X(t_0, x)‖²_B + (1/2) ∫_{t_0}^{t_f} ‖H(X(t, x)) − Y(t, x)‖²_R dt,   (1)

    where the observation operator H links the state variable and the measurements. The cost function can be interpreted as the log-likelihood function associated with the a posteriori distribution of the state given the past history of measurements and the background. In this work, we study ensemble-based optimal control strategies for data assimilation. Such a formulation nicely combines the ingredients of ensemble Kalman filters and variational data assimilation (4DVar). It is also formulated as the minimization of the objective function (1), but, similarly to ensemble filters, it introduces in its objective function an empirical ensemble-based background-error covariance defined as:

    B ≡ ⟨(X_b − ⟨X_b⟩)(X_b − ⟨X_b⟩)^T⟩.   (2)

    Thus, it works in an off-line smoothing mode rather than on the fly like sequential filters. The resulting ensemble variational data assimilation technique belongs to a relatively new family of methods [1,2,3]. It presents two main advantages: first, it no longer requires constructing the adjoint of the dynamics' tangent linear operator, which is a considerable advantage with respect to the method's implementation; and second, it enables the handling of a flow-dependent background-error covariance matrix that can be consistently adjusted to the background error. These advantages come, however, at the cost of a reduced-rank modeling of the solution space. The B matrix is at most of rank N − 1 (N is the size of the ensemble), which is considerably lower than the dimension of the state space. This rank deficiency may introduce spurious correlation errors, which particularly impact the quality of results associated with a high-resolution computing grid. The common strategy for suppressing these distant correlations in ensemble Kalman techniques is localization procedures. In this paper we present key theoretical properties associated with different choices of methods involved in this setup and experimentally compare the performance of several variations of an ensemble technique of interest with an incremental 4DVar method. The comparisons have been carried out on the basis of a Shallow Water model, both with synthetic data and real observations. We particularly address the potential pitfalls and advantages of the different methods. The results indicate an advantage in favor of the ensemble technique, both in quality and computational cost, when dealing with incomplete observations. We highlight, as a premise of using ensemble variational assimilation, that the initial perturbation used to build the initial ensemble has to fit the physics of the observed phenomenon.
    We also apply the method to a stochastic shallow-water model which incorporates an expression of the uncertainty in the subgrid stress tensor related to the ensemble spread. References: [1] A. C. Lorenc, The potential of the ensemble Kalman filter for NWP - a comparison with 4D-Var, Quart. J. Roy. Meteor. Soc., Vol. 129, pp. 3183-3203, 2003. [2] C. Liu, Q. Xiao, and B. Wang, An Ensemble-Based Four-Dimensional Variational Data Assimilation Scheme. Part I: Technical Formulation and Preliminary Test, Mon. Wea. Rev., Vol. 136(9), pp. 3363-3373, 2008. [3] M. Buehner, Ensemble-derived stationary and flow-dependent background-error covariances: Evaluation in a quasi-operational NWP setting, Quart. J. Roy. Meteor. Soc., Vol. 131(607), pp. 1013-1043, April 2005.
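    A schematic of the ensemble-variational idea: the analysis increment is sought in the span of the ensemble perturbations, so that minimizing a cost of the form (1) with the ensemble B of (2) needs no adjoint model. The toy problem below (random ensemble, pointwise observations) is purely illustrative.

```python
# Minimize the variational cost in the ensemble subspace: dx = Xp @ w, so the
# background term reduces to 0.5 * w'w when B = Xp Xp^T.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
n, N = 100, 10
x_truth = np.sin(np.linspace(0.0, 3.0, n))
Xb = x_truth + 0.3 * rng.standard_normal((N, n))      # background ensemble
xb = Xb.mean(axis=0)
Xp = (Xb - xb).T / np.sqrt(N - 1)                     # (n, N) perturbations

obs_idx = np.arange(0, n, 5)                          # observe every 5th point
sigma_o = 0.1
y = x_truth[obs_idx] + sigma_o * rng.standard_normal(obs_idx.size)

def J(w):
    dx = Xp @ w                                       # increment in ensemble span
    innov = y - (xb + dx)[obs_idx]
    return 0.5 * w @ w + 0.5 * (innov @ innov) / sigma_o**2

w_opt = minimize(J, np.zeros(N)).x                    # cf. Eq. (1) with B = Xp Xp^T
xa = xb + Xp @ w_opt
print("background RMSE:", np.sqrt(np.mean((xb - x_truth) ** 2)))
print("analysis RMSE:  ", np.sqrt(np.mean((xa - x_truth) ** 2)))
```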

  6. Triple collocation-based estimation of spatially correlated observation error covariance in remote sensing soil moisture data assimilation

    NASA Astrophysics Data System (ADS)

    Wu, Kai; Shu, Hong; Nie, Lei; Jiao, Zhenhang

    2018-01-01

    Spatially correlated errors are typically ignored in data assimilation, thus degenerating the observation error covariance R to a diagonal matrix. We argue that a nondiagonal R carries more observation information, making assimilation results more accurate. A method, denoted TC_Cov, was proposed for soil moisture data assimilation to estimate spatially correlated observation error covariance based on triple collocation (TC). Assimilation experiments were carried out to test the performance of TC_Cov. AMSR-E soil moisture was assimilated both with a diagonal R matrix computed using TC and with a nondiagonal R matrix estimated by the proposed TC_Cov. The ensemble Kalman filter was used as the assimilation method. Our assimilation results were validated against climate change initiative data and ground-based soil moisture measurements using the Pearson correlation coefficient and unbiased root mean square difference (ubRMSD) metrics. These experiments confirmed that diagonal R assimilation results deteriorate when the model simulation is more accurate than the observation data. Furthermore, nondiagonal R achieved higher correlation coefficients and lower ubRMSD values than diagonal R in the experiments, demonstrating the effectiveness of TC_Cov for estimating a richly structured R in data assimilation. In sum, compared with diagonal R, nondiagonal R may relieve the detrimental effects of assimilation when simulated model results outperform observation data.
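    The diagonal part of the TC error model can be sketched quickly: given three collocated products with mutually independent errors, classic triple collocation recovers each product's error variance from pairwise covariances. The synthetic data below are illustrative; the proposed TC_Cov additionally estimates spatial cross-correlations, which this sketch omits.

```python
# Classic triple collocation error-variance estimates from three collocated,
# independently erred products (toy synthetic data).
import numpy as np

rng = np.random.default_rng(7)
truth = rng.standard_normal(5000)
x = truth + 0.30 * rng.standard_normal(5000)   # e.g. satellite retrieval
y = truth + 0.20 * rng.standard_normal(5000)   # e.g. model simulation
z = truth + 0.40 * rng.standard_normal(5000)   # e.g. second product

C = np.cov(np.vstack([x, y, z]))
err_var_x = C[0, 0] - C[0, 1] * C[0, 2] / C[1, 2]
err_var_y = C[1, 1] - C[0, 1] * C[1, 2] / C[0, 2]
err_var_z = C[2, 2] - C[0, 2] * C[1, 2] / C[0, 1]
print(err_var_x, err_var_y, err_var_z)         # expected ~0.09, 0.04, 0.16
```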

  7. Exploring New Pathways in Precipitation Assimilation

    NASA Technical Reports Server (NTRS)

    Hou, Arthur; Zhang, Sara Q.

    2004-01-01

    Precipitation assimilation poses a special challenge in that the forward model for rain in a global forecast system is based on parameterized physics, which can have large systematic errors that must be rectified to use precipitation data effectively within a standard statistical analysis framework. We examine some key issues in precipitation assimilation and describe several exploratory studies in assimilating rainfall and latent heating information in NASA's global data assimilation systems using the forecast model as a weak constraint. We present results from two research activities. The first is the assimilation of surface rainfall data using a time-continuous variational assimilation based on a column model of the full moist physics. The second is the assimilation of convective and stratiform latent heating retrievals from microwave sensors using a variational technique with physical parameters in the moist physics schemes as a control variable. We will show the impact of assimilating these data on analyses and forecasts. Among the lessons learned are (1) that the time-continuous application of moisture/temperature tendency corrections to mitigate model deficiencies offers an effective strategy for assimilating precipitation information, and (2) that the model prognostic variables must be allowed to directly respond to an improved rain and latent heating field within an analysis cycle to reap the full benefit of assimilating precipitation information. Looking to the future, we discuss new research directions, including the assimilation of microwave radiances versus retrieval information in raining areas, and initial efforts in developing ensemble techniques such as the Kalman filter/smoother for precipitation assimilation.

  8. Assimilative and non-assimilative color spreading in the watercolor configuration.

    PubMed

    Kimura, Eiji; Kuroki, Mikako

    2014-01-01

    A colored line flanking a darker contour will appear to spread its color onto an area enclosed by the line (watercolor effect). The watercolor effect has been characterized as an assimilative effect, but non-assimilative color spreading has also been demonstrated in the same spatial configuration; e.g., when a black inner contour (IC) is paired with a blue outer contour (OC), yellow color spreading can be observed. To elucidate visual mechanisms underlying these different color spreading effects, this study investigated the effects of luminance ratio between the double contours on the induced color by systematically manipulating the IC and the OC luminance (Experiment 1) as well as the background luminance (Experiment 2). The results showed that the luminance conditions suitable for assimilative and non-assimilative color spreading were nearly opposite. When the Weber contrast of the IC to the background luminance (IC contrast) was smaller in size than that of the OC (OC contrast), the induced color became similar to the IC color (assimilative spreading). In contrast, when the OC contrast was smaller than or equal to the IC contrast, the induced color became yellow (non-assimilative spreading). Extending these findings, Experiment 3 showed that bilateral color spreading, i.e., assimilative spreading on one side and non-assimilative spreading on the other side, can also be observed in the watercolor configuration. These results suggest that the assimilative and the non-assimilative spreading were mediated by different visual mechanisms. The properties of the assimilative spreading are consistent with the model proposed to account for neon color spreading (Grossberg and Mingolla, 1985) and extended for the watercolor effect (Pinna and Grossberg, 2005). However, the present results suggest that additional mechanisms are needed to account for the non-assimilative color spreading.

  9. Jamming and percolation in random sequential adsorption of straight rigid rods on a two-dimensional triangular lattice

    NASA Astrophysics Data System (ADS)

    Perino, E. J.; Matoz-Fernandez, D. A.; Pasinetti, P. M.; Ramirez-Pastor, A. J.

    2017-07-01

    Monte Carlo simulations and finite-size scaling analysis have been performed to study the jamming and percolation behavior of linear k-mers (also known as rods or needles) on a two-dimensional triangular lattice of linear dimension L, considering an isotropic random sequential adsorption (RSA) process and periodic boundary conditions. Extensive numerical work has been done to extend previous studies to larger system sizes and longer k-mers, which enables the confirmation of a nonmonotonic size dependence of the percolation threshold and the estimation of a maximum value of k beyond which percolation no longer occurs. Finally, a complete analysis of critical exponents and universality has been carried out, showing that the percolation phase transition involved in the system is not affected, belonging to the same universality class as ordinary random percolation.
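    The RSA deposition process itself is easy to sketch; the version below uses a square lattice for brevity (the paper's lattice is triangular) and a crude consecutive-failure criterion to detect jamming, with illustrative sizes.

```python
# Random sequential adsorption of k-mers on a square lattice with periodic
# boundaries; estimates the jamming coverage by counting consecutive failures.
import numpy as np

rng = np.random.default_rng(8)
L, k = 32, 4
lattice = np.zeros((L, L), dtype=bool)

def try_deposit():
    i, j = rng.integers(L, size=2)
    if rng.random() < 0.5:                      # horizontal placement
        cols = (j + np.arange(k)) % L
        if not lattice[i, cols].any():
            lattice[i, cols] = True
            return True
    else:                                       # vertical placement
        rows = (i + np.arange(k)) % L
        if not lattice[rows, j].any():
            lattice[rows, j] = True
            return True
    return False

failures = 0
while failures < 50_000:                        # crude jamming criterion
    failures = 0 if try_deposit() else failures + 1
print("jamming coverage:", lattice.mean())
```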

  10. Image-Based Multi-Target Tracking through Multi-Bernoulli Filtering with Interactive Likelihoods.

    PubMed

    Hoak, Anthony; Medeiros, Henry; Povinelli, Richard J

    2017-03-03

    We develop an interactive likelihood (ILH) for sequential Monte Carlo (SMC) methods for image-based multiple target tracking applications. The purpose of the ILH is to improve tracking accuracy by reducing the need for data association. In addition, we integrate a recently developed deep neural network for pedestrian detection along with the ILH with a multi-Bernoulli filter. We evaluate the performance of the multi-Bernoulli filter with the ILH and the pedestrian detector in a number of publicly available datasets (2003 PETS INMOVE, Australian Rules Football League (AFL) and TUD-Stadtmitte) using standard, well-known multi-target tracking metrics (optimal sub-pattern assignment (OSPA) and classification of events, activities and relationships for multi-object trackers (CLEAR MOT)). In all datasets, the ILH term increases the tracking accuracy of the multi-Bernoulli filter.

  11. Image-Based Multi-Target Tracking through Multi-Bernoulli Filtering with Interactive Likelihoods

    PubMed Central

    Hoak, Anthony; Medeiros, Henry; Povinelli, Richard J.

    2017-01-01

    We develop an interactive likelihood (ILH) for sequential Monte Carlo (SMC) methods for image-based multiple target tracking applications. The purpose of the ILH is to improve tracking accuracy by reducing the need for data association. In addition, we integrate a recently developed deep neural network for pedestrian detection along with the ILH with a multi-Bernoulli filter. We evaluate the performance of the multi-Bernoulli filter with the ILH and the pedestrian detector in a number of publicly available datasets (2003 PETS INMOVE, Australian Rules Football League (AFL) and TUD-Stadtmitte) using standard, well-known multi-target tracking metrics (optimal sub-pattern assignment (OSPA) and classification of events, activities and relationships for multi-object trackers (CLEAR MOT)). In all datasets, the ILH term increases the tracking accuracy of the multi-Bernoulli filter. PMID:28273796

  12. Assimilation of SeaWiFS Ocean Chlorophyll Data into a Three-Dimensional Global Ocean Model

    NASA Technical Reports Server (NTRS)

    Gregg, Watson W.

    2005-01-01

    Assimilation of satellite ocean color data is a relatively new phenomenon in ocean sciences. However, with routine observations from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), launched in late 1997, and now with new data from the Moderate Resolution Imaging Spectroradiometer (MODIS) Aqua, there is increasing interest in ocean color data assimilation. Here, SeaWiFS chlorophyll data were assimilated with an established three-dimensional global ocean model. The assimilation improved estimates of chlorophyll and primary production relative to a free-run (no assimilation) model. This represents the first attempt at ocean color data assimilation using NASA satellites in a global model. The results suggest the potential of assimilating satellite ocean chlorophyll data for improving models.

  13. Development of an eddy-resolving reanalysis using the 1/12° global HYbrid Coordinate Ocean Model and the Navy Coupled Ocean Data Assimilation Scheme

    NASA Astrophysics Data System (ADS)

    Allard, Richard; Metzger, E. Joseph; Broome, Robert; Franklin, Deborah; Smedstad, Ole Martin; Wallcraft, Alan

    2013-04-01

    Multiple international agencies have performed atmospheric reanalyses using static dynamical models and assimilation schemes while ingesting all available quality controlled observational data. Some are clearly aimed at climate time scales while others focus on the more recent time period in which assimilated satellite data are used to constrain the system. Typically these are performed at horizontal and vertical resolutions that are coarser than the existing operational atmospheric prediction system. Multiple agencies have also performed ocean reanalyses using some of the atmospheric forcing products described above. However, only a few are eddy-permitting and none are capable of resolving oceanic mesoscale features (eddies and current meanders) across the entire globe. To fill this void, the Naval Research Laboratory is performing an eddy-resolving 1993-2010 ocean reanalysis using the 1/12° global HYbrid Coordinate Ocean Model (HYCOM) that employs the Navy Coupled Ocean Data Assimilation (NCODA) scheme. A 1/12° global HYCOM/NCODA prediction system has been running in real-time at the Naval Oceanographic Office (NAVOCEANO) since 22 December 2006. It has undergone operational testing and will become an operational product by early 2013. It is capable of nowcasting and forecasting the oceanic "weather" which includes the 3D ocean temperature, salinity and current structure, the surface mixed layer, and the location of mesoscale features such as eddies, meandering currents and fronts. The system has a mid-latitude resolution of ~7 km and employs 32 hybrid vertical coordinate surfaces. Compared to traditional isopycnal coordinate models, the hybrid vertical coordinate extends the geographic range of applicability toward shallow coastal seas and the unstratified parts of the world ocean. HYCOM contains a built-in thermodynamic ice model, where ice grows and melts due to heat flux and sea surface temperature (SST) changes, but it does not contain advanced rheological physics. The ice edge is constrained by satellite ice concentration. Once per day, NCODA performs a 3D ocean analysis using all available observational data and the 1-day HYCOM forecast as the first guess in a sequential incremental update cycle. Observational data include surface observations from satellites, including sea surface height (SSH) anomalies, SST, and sea ice concentrations, plus in-situ SST observations from ships and buoys as well as temperature and salinity profiles from XBTs, CTDs and Argo profiling floats. Surface information is projected downward using synthetic profiles from the Modular Ocean Data Assimilation System (MODAS) at those locations with a predefined SSH anomaly. Unlike previous reanalyses, this ocean reanalysis will be integrated at the same horizontal and vertical resolution as the operational system running at NAVOCEANO. The system is forced with atmospheric output from the National Centers for Environmental Prediction (NCEP) Climate Forecast System Reanalysis (CFSR) and the observations listed above. The reanalysis began in 1993 because of the advent of satellite altimeter data that will constrain the oceanic mesoscale. Significant effort has been put into obtaining and quality controlling all input observational data, with special emphasis on the profile data. The computational resources are obtained through the High Performance Computing Modernization Office.

  14. 4DVAR data Assimilation with the Regional Ocean Modeling System (ROMS): Impact on the Water Mass Distributions in the Yellow Sea

    NASA Astrophysics Data System (ADS)

    Lee, Joon-Ho; Kim, Taekyun; Pang, Ig-Chan; Moon, Jae-Hong

    2018-04-01

    In this study, we evaluate the performance of the recently developed incremental strong-constraint 4-dimensional variational (4DVAR) data assimilation applied to the Yellow Sea (YS) using the Regional Ocean Modeling System (ROMS). Two assimilation experiments are compared: one assimilating remotely sensed sea surface temperature (SST) only, and one assimilating both SST and in-situ profiles measured by shipboard CTD casts, into a regional ocean model from January to December 2011. By comparing the two assimilation experiments against a free-run without data assimilation, we investigate how the assimilation affects the hydrographic structures in the YS. Results indicate that the SST assimilation notably improves the model behavior at the surface when compared to the nonassimilative free-run. The SST assimilation also has an impact on the subsurface water structure in the eastern YS; however, the improvement is seasonally dependent, that is, the correction becomes more effective in winter than in summer. This is due to the strong stratification in summer that prevents the assimilation of SST from affecting the subsurface temperature. A significant improvement to the subsurface temperature is made when the in-situ profiles of temperature and salinity are assimilated, forming a tongue-shaped YS bottom cold water extending from the YS toward the southwestern seas of Jeju Island.

  15. Assimilation of GRACE Terrestrial Water Storage Data into a Land Surface Model

    NASA Technical Reports Server (NTRS)

    Reichle, Rolf H.; Zaitchik, Benjamin F.; Rodell, Matt

    2008-01-01

    The NASA Gravity Recovery and Climate Experiment (GRACE) system of satellites provides observations of large-scale, monthly terrestrial water storage (TWS) changes. In this presentation we describe a land data assimilation system that ingests GRACE observations and show that the assimilation improves estimates of water storage and fluxes, as evaluated against independent measurements. The ensemble-based land data assimilation system uses a Kalman smoother approach along with the NASA Catchment Land Surface Model (CLSM). We assimilated GRACE-derived TWS anomalies for each of the four major sub-basins of the Mississippi into CLSM. Compared with the open-loop (no assimilation) CLSM simulation, assimilation estimates of groundwater variability exhibited enhanced skill with respect to measured groundwater. Assimilation also significantly increased the correlation between simulated TWS and gauged river flow for all four sub-basins and for the Mississippi River basin itself. In addition, model performance was evaluated for watersheds smaller than the scale of GRACE observations; in the majority of cases, GRACE assimilation led to increased correlation between TWS estimates and gauged river flow, indicating that data assimilation has considerable potential to downscale GRACE data for hydrological applications. We will also describe how the output from the GRACE land data assimilation system is now being prepared for use in the North American Drought Monitor.

  16. Assimilation of Quality Controlled AIRS Temperature Profiles using the NCEP GFS

    NASA Technical Reports Server (NTRS)

    Susskind, Joel; Reale, Oreste; Iredell, Lena; Rosenberg, Robert

    2013-01-01

    We have previously conducted a number of data assimilation experiments using AIRS Version-5 quality controlled temperature profiles as a step toward finding an optimum balance of spatial coverage and sounding accuracy with regard to improving forecast skill. The data assimilation and forecast system we used was the Goddard Earth Observing System Model, Version-5 (GEOS-5) Data Assimilation System (DAS), which combines the NASA GEOS-5 forecast model with the National Centers for Environmental Prediction (NCEP) operational Grid Point Statistical Interpolation (GSI) global analysis scheme. All analyses and forecasts were run at a 0.5° x 0.625° spatial resolution. Data assimilation experiments were conducted in four different seasons, each in a different year. Three different sets of data assimilation experiments were run during each time period: Control, AIRS T(p), and AIRS Radiance. In the Control analysis, all the data used operationally by NCEP were assimilated, but no AIRS data were assimilated. Radiances from the Aqua AMSU-A instrument were also assimilated operationally by NCEP and are included in the Control. The AIRS Radiance assimilation adds AIRS observed radiance observations for a select set of channels to the data set being assimilated, as done operationally by NCEP. In the AIRS T(p) assimilation, all information used in the Control was assimilated as well as quality controlled AIRS Version-5 temperature profiles, i.e., AIRS T(p) information was substituted for AIRS radiance information. The AIRS Version-5 temperature profiles were presented to the GSI analysis as rawinsonde profiles, assimilated down to a case-by-case appropriate pressure level p_best determined using the quality control procedure. Version-5 also determines case-by-case, level-by-level error estimates for the temperature profiles, which were used as the uncertainty of each temperature measurement. These experiments using GEOS-5 have shown that forecasts resulting from analyses using the AIRS T(p) assimilation system were superior to those from the Radiance assimilation system, both with regard to global 7-day forecast skill and the ability to predict storm tracks and intensity.

  17. Impact of the initialisation on the predictability of the Southern Ocean sea ice at interannual to multi-decadal timescales

    NASA Astrophysics Data System (ADS)

    Zunz, Violette; Goosse, Hugues; Dubinkina, Svetlana

    2015-04-01

    In this study, we systematically assess the impact of different initialisation procedures on the predictability of the sea ice in the Southern Ocean. These initialisation strategies are based on three data assimilation methods: nudging, the particle filter with sequential importance resampling, and the nudging proposal particle filter. An Earth system model of intermediate complexity is used to perform hindcast simulations in a perfect-model approach. The predictability of the Antarctic sea ice at interannual to multi-decadal timescales is estimated through two aspects: the spread of the hindcast ensemble, indicating the uncertainty of the ensemble, and the correlation between the ensemble mean and the pseudo-observations, used to assess the accuracy of the prediction. Our results show that at decadal timescales more sophisticated data assimilation methods, as well as denser pseudo-observations used to initialise the hindcasts, decrease the spread of the ensemble. However, our experiments did not clearly demonstrate that any one of the initialisation methods systematically provides a more accurate prediction of the sea ice in the Southern Ocean than the others. Overall, the predictability at interannual timescales is limited to 3 years ahead at most. At multi-decadal timescales, the trends in sea ice extent computed over the time period just after the initialisation are clearly better correlated between the hindcasts and the pseudo-observations if the initialisation takes the pseudo-observations into account. The correlation reaches values larger than 0.5 in winter. This high correlation likely has its origin in the slow evolution of the ocean ensured by its strong thermal inertia, showing the importance of the quality of the initialisation below the sea ice.

  18. Gastrointestinal assimilation of Cu during digestion of a single meal in the freshwater rainbow trout (Oncorhynchus mykiss).

    PubMed

    Nadella, Sunita R; Bucking, Carol; Grosell, Martin; Wood, Chris M

    2006-08-01

    Gastrointestinal processing and assimilation of Cu in vivo were investigated by sequential chyme analysis over a 72 h period following ingestion of a single satiation meal (3% body weight) of commercial trout food (Cu content = 0.42 micromol g(-1)) by adult rainbow trout. Leaded glass ballotini beads incorporated into the food and detected by X-ray radiography were employed as an inert marker in order to quantify net Cu absorption or secretion in various parts of the tract. Cu concentrations in the supernatant (fluid phase) fell from about 0.06 micromol mL(-1) (63 microM) in the stomach at 2 h to about 0.003 micromol mL(-1) (3 microM) in the posterior intestine at 72 h. Cu concentrations in the solid phase were 10- to 30-fold higher than in the fluid phase, and increased about 4-fold from the stomach at 2 h to the posterior intestine at 72 h. By reference to the inert marker, overall net Cu absorption from the ingested food by 72 h was about 50%. The mid-intestine and posterior intestine emerged as important sites of net Cu and water absorption, and a potential role for the stomach in this process was also indicated. The anterior intestine was a site of large net Cu addition to the chyme, probably due to large net additions of Cu-containing fluids in the form of bile and other secretions in this segment. The results provide valuable information about sites of Cu absorption and realistic concentrations of Cu in chyme fluid for future in vitro mechanistic studies of Cu transport in the trout gastrointestinal tract.

  19. Evaluating integrated strategies for robust treatment of high saline piggery wastewater.

    PubMed

    Kim, Hyun-Chul; Choi, Wook Jin; Chae, A Na; Park, Joonhong; Kim, Hyung Joo; Song, Kyung Guen

    2016-02-01

    In this study, we integrated physicochemical and biological strategies for the robust treatment of piggery effluent in which high levels of organic constituents, inorganic nutrients, color, and salts remained. Piggery effluent that was stabilized in an anaerobic digester was sequentially coagulated, micro-filtered, and air-stripped prior to biological treatment with mixotrophic algal species that showed tolerance to high salinity (up to 4.8% as Cl(-)). The algae treatment was conducted with continuous O2 supplementation instead of the combination of high lighting and CO2 injection. The microalga Scenedesmus quadricauda employed as a bio-agent was capable of assimilating both nitrogen (222 mg N g cell(-1) d(-1)) and phosphorus (9.3 mg P g cell(-1) d(-1)) and of utilizing dissolved organics (2053 mg COD g cell(-1) d(-1)) as a carbon source in a single treatment process under heterotrophic growth conditions. The heterotrophic growth of S. quadricauda proceeded rapidly by directly incorporating organic substrate in the oxidative assimilation process, which coincided with a high productivity of algal biomass, reaching 2.4 g cell L(-1) d(-1). The algae-treated wastewater was subsequently ozonated to comply with discharge permits that limit color in the effluent, which also improved the biodegradability of residual organics. The integrated treatment scheme proposed in this study achieved 89% removal of COD, 88% removal of TN, and 60% removal of TP. The advantages of this hybrid configuration suggest that it would be a promising strategy for full-scale treatment facilities for piggery effluent.

  20. Monte Carlo simulation of the neutron monitor yield function

    NASA Astrophysics Data System (ADS)

    Mangeard, P.-S.; Ruffolo, D.; Sáiz, A.; Madlee, S.; Nutaro, T.

    2016-08-01

    Neutron monitors (NMs) are ground-based detectors that measure variations of the Galactic cosmic ray flux at GV-range rigidities. Differences in configuration, electronics, surroundings, and location induce systematic effects on the calculation of the yield functions of NMs worldwide; different estimates of NM yield functions can differ by a factor of 2 or more. In this work, we present new Monte Carlo simulations to calculate NM yield functions and perform an absolute (not relative) comparison with the count rate of the Princess Sirindhorn Neutron Monitor (PSNM) at Doi Inthanon, Thailand, both for the entire monitor and for individual counter tubes. We model the atmosphere using profiles from the Global Data Assimilation System database and the Naval Research Laboratory Mass Spectrometer, Incoherent Scatter Radar Extended model. Using FLUKA software and the detailed geometry of PSNM, we calculated the PSNM yield functions for protons and alpha particles. An agreement better than 9% was achieved between the PSNM observations and the simulated count rate during the solar minimum of December 2009. The systematic effect of the electronic dead time was studied as a function of primary cosmic ray rigidity at the top of the atmosphere up to 1 TV. We show that the effect is not negligible and can reach 35% at high rigidity for a dead time >1 ms. We analyzed the response function of each counter tube at PSNM using its actual dead time, and we provide normalization coefficients between count rates for various tube configurations in the standard NM64 design that are valid to within ~1% for such stations worldwide.
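    Conceptually, a yield function S(R) folds the primary spectrum J(R) above the local geomagnetic cutoff into a count rate, N = ∫ J(R) S(R) dR. The sketch below uses toy power-law forms for J and S and an approximate cutoff rigidity for Doi Inthanon; none of these are the paper's fitted values.

```python
# Fold a toy primary spectrum through a toy yield function above the cutoff.
import numpy as np

R = np.linspace(1.0, 1000.0, 200_000)            # rigidity grid (GV)
dR = R[1] - R[0]
J = 1.8e4 * R ** -2.7                            # toy primary proton spectrum
S = 50.0 * (1.0 - np.exp(-0.5 * R)) * R ** 0.3   # toy yield function shape
R_cut = 16.8                                     # approx. cutoff at Doi Inthanon

mask = R >= R_cut
count_rate = np.sum(J[mask] * S[mask]) * dR      # N = integral of J*S above cutoff
print(f"simulated count rate (arbitrary units): {count_rate:.1f}")
```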

  1. Monte Carlo simulation of evaporation-driven self-assembly in suspensions of colloidal rods

    NASA Astrophysics Data System (ADS)

    Lebovka, Nikolai I.; Vygornitskii, Nikolai V.; Gigiberiya, Volodymyr A.; Tarasevich, Yuri Yu.

    2016-12-01

    The vertical drying of a colloidal film containing rodlike particles was studied by means of kinetic Monte Carlo (MC) simulation. The problem was approached using a two-dimensional square lattice, and the rods were represented as linear k-mers (i.e., particles occupying k adjacent sites). The initial state before drying was produced using a model of random sequential adsorption (RSA) with isotropic orientations of the k-mers (orientations of the k-mers along the horizontal x and vertical y directions are equiprobable). In the RSA model, overlapping of the k-mers is forbidden. During the evaporation, an upper interface falls with a linear velocity u in the vertical direction and the k-mers undergo translational Brownian motion. The MC simulations were run at different initial concentrations p_i (p_i ∈ [0, p_j], where p_j is the jamming concentration), lengths of k-mers (k ∈ [1, 12]), and solvent evaporation rates u. For completely dried films, the spatial distributions of k-mers and their electrical conductivities in both the x and y directions were examined. Significant evaporation-driven self-assembly and orientation stratification of the k-mers oriented along the x and y directions were observed. The extent of stratification increased with increasing k. The anisotropy of the electrical conductivity of the film can be finely regulated by changes in the values of p_i, k, and u.

  2. Numerical and experimental analyses of the radiant heat flux produced by quartz heating systems

    NASA Technical Reports Server (NTRS)

    Turner, Travis L.; Ash, Robert L.

    1994-01-01

    A method is developed for predicting the radiant heat flux distribution produced by tungsten filament, tubular fused-quartz envelope heating systems with reflectors. The method is an application of Monte Carlo simulation, which takes the form of a random walk or ray tracing scheme. The method is applied to four systems of increasing complexity, including a single lamp without a reflector, a single lamp with a flat reflector, a single lamp with a parabolic reflector, and up to six lamps in a six-lamp contoured-reflector heating unit. The application of the Monte Carlo method to the simulation of the thermal radiation generated by these systems is discussed. The procedures for numerical implementation are also presented. Experiments were conducted to study these quartz heating systems and to acquire measurements of the corresponding empirical heat flux distributions for correlation with analysis. The experiments were conducted such that several complicating factors could be isolated and studied sequentially. Comparisons of the experimental results with analysis are presented and discussed. Good agreement between the experimental and simulated results was obtained in all cases. This study shows that this method can be used to analyze very complicated quartz heating systems and can account for factors such as spectral properties, specular reflection from curved surfaces, source enhancement due to reflectors and/or adjacent sources, and interaction with a participating medium in a straightforward manner.
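
    A minimal 2-D version of the random-walk (ray-tracing) idea can convey the mechanics: rays leave a line source isotropically, rays heading downward are specularly mirrored by an ideal flat reflector (handled via an image source), and arrivals are tallied on a target plane. The geometry, ray count, and the omission of spectral properties and absorption are simplifying assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_rays = 200_000
h, d = 0.10, 0.02                 # target plane height, reflector depth below lamp [m]

theta = rng.uniform(0.0, 2.0 * np.pi, n_rays)   # isotropic emission angles
dx, dy = np.cos(theta), np.sin(theta)

# Specular reflection off the flat mirror y = -d is equivalent to a straight
# ray from the image source at (0, -2d) travelling upward.
y0 = np.where(dy > 0, 0.0, -2.0 * d)
dy = np.abs(dy)

valid = dy > 1e-9                               # near-horizontal rays never arrive
t = (h - y0[valid]) / dy[valid]                 # parameter where ray crosses y = h
x_hit = t * dx[valid]

flux, edges = np.histogram(x_hit, bins=100, range=(-0.5, 0.5))
peak = flux.argmax()
print("peak bin count:", flux[peak], "at x ≈", 0.5 * (edges[peak] + edges[peak + 1]))
```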

  3. Automated crack detection in conductive smart-concrete structures using a resistor mesh model

    NASA Astrophysics Data System (ADS)

    Downey, Austin; D'Alessandro, Antonella; Ubertini, Filippo; Laflamme, Simon

    2018-03-01

    Various nondestructive evaluation techniques are currently used to automatically detect and monitor cracks in concrete infrastructure. However, these methods often lack scalability and cost-effectiveness over large geometries. A solution is the use of self-sensing carbon-doped cementitious materials. These self-sensing materials are capable of providing a measurable change in electrical output that can be related to their damage state. Previous work by the authors showed that a resistor mesh model could be used to track damage in structural components fabricated from electrically conductive concrete, where damage was located through the identification of high-resistance resistors in the mesh. In this work, an automated damage detection strategy is introduced that places high-value resistors into the previously developed resistor mesh model using a sequential Monte Carlo method. Here, high-value resistors are used to mimic the internal condition of damaged cementitious specimens. The proposed automated damage detection method is experimentally validated using a 500 × 500 × 50 mm3 reinforced cement paste plate doped with multi-walled carbon nanotubes and exposed to 100 identical impact tests. Results demonstrate that the proposed Monte Carlo method is capable of detecting and localizing the most prominent damage in a structure, demonstrating that automated damage detection in smart-concrete structures is a promising strategy for real-time structural health monitoring of civil infrastructure.
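
    A hedged sketch of the sequential Monte Carlo idea: particles are candidate damage locations, each is scored against boundary measurements, and the cloud is resampled after every impact test. A toy distance-decay surrogate stands in for the actual resistor-mesh solve, and the probe layout and noise levels are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
grid = 10
probes = np.array([[0, 5], [5, 0], [9, 5], [5, 9]], float)  # boundary probes

def forward(loc):
    """Surrogate model: resistance rise at each probe decays with distance."""
    d = np.linalg.norm(probes - loc, axis=1)
    return 1.0 / (1.0 + d)

true_loc = np.array([7.0, 3.0])
particles = rng.uniform(0, grid - 1, size=(2000, 2))        # candidate locations

for _ in range(20):                                         # 20 impact tests
    y = forward(true_loc) + rng.normal(0, 0.02, 4)          # noisy measurement
    resid = y - np.array([forward(p) for p in particles])
    logw = -0.5 * np.sum(resid**2, axis=1) / 0.02**2        # Gaussian log-likelihood
    w = np.exp(logw - logw.max())
    w /= w.sum()
    idx = rng.choice(len(particles), len(particles), p=w)   # resample
    particles = particles[idx] + rng.normal(0, 0.1, particles.shape)  # jitter

print("estimated damage location:", particles.mean(axis=0).round(2))
```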

  4. Landslide Failure Likelihoods Estimated Through Analysis of Suspended Sediment and Streamflow Time Series Data

    NASA Astrophysics Data System (ADS)

    Stark, C. P.; Rudd, S.; Lall, U.; Hovius, N.; Dadson, S.; Chen, M.-C.

    Off-axis DOAS measurements with scattered natural light, based upon the well-established DOAS technique, allow the sensitivity of the technique to be optimized for the trace gas profile in question by strongly increasing the light's path through the relevant atmospheric layers. Multi-Axis (MAX) DOAS instruments probe several directions simultaneously or sequentially to increase the spatial resolution. Several devices (ground-based, airborne and ship-borne) are operated by our group in the framework of the SCIAMACHY validation. Radiative transfer models are an essential requirement for the interpretation of these measurements and their conversion into detailed profile data. Apart from some existing Monte Carlo models, most codes use analytical algorithms to solve the radiative transfer equation for given atmospheric conditions. For specific circumstances, e.g. photon scattering within clouds, these approaches are not efficient enough to provide sufficient accuracy. Horizontal gradients in atmospheric parameters also have to be taken into account. To meet the needs of measurement situations for all kinds of scattered-light DOAS platforms, a three-dimensional fully spherical Monte Carlo model was devised. Here we present air mass factors (AMF) used to calculate vertical column densities (VCD) from measured slant column densities (SCD). Sensitivity studies on the influence of the wavelength and telescope direction used, of the altitude of profile layers, albedo, refraction and basic aerosols are shown. Modelled intensity series are also compared with radiometer data.

  5. OCO-2 Column Carbon Dioxide and Biometric Data Jointly Constrain Parameterization and Projection of a Global Land Model

    NASA Astrophysics Data System (ADS)

    Shi, Z.; Crowell, S.; Luo, Y.; Rayner, P. J.; Moore, B., III

    2015-12-01

    Uncertainty in the predicted carbon-climate feedback largely stems from poor parameterization of global land models. However, calibration of global land models with observations has been extremely challenging for at least two reasons. First, we lack global data products from systematic measurements of land surface processes. Second, the computational demand of estimating model parameters is insurmountable due to the complexity of global land models. In this project, we will use OCO-2 retrievals of dry air mole fraction XCO2 and solar-induced fluorescence (SIF) to independently constrain estimation of net ecosystem exchange (NEE) and gross primary production (GPP). The constrained NEE and GPP will be combined with data products of global standing biomass, soil organic carbon and soil respiration to improve the Community Land Model version 4.5 (CLM4.5). Specifically, we will first develop a high-fidelity emulator of CLM4.5 according to the matrix representation of the terrestrial carbon cycle. It has been shown that the emulator fully represents the original model and can be effectively used for data assimilation to constrain parameter estimation. We will focus on calibrating key model parameters for the carbon cycle (e.g., maximum carboxylation rate, turnover time and transfer coefficients of soil carbon pools, and temperature sensitivity of respiration). The Bayesian Markov chain Monte Carlo (MCMC) method will be used to assimilate the global databases into the high-fidelity emulator to constrain the model parameters, which will be incorporated back into the original CLM4.5. The calibrated CLM4.5 will be used to make scenario-based projections. In addition, we will conduct observing system simulation experiments (OSSEs) to evaluate how sampling frequency and record length affect model constraint and prediction.
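
    The calibration step lends itself to a compact sketch: a random-walk Metropolis sampler draws the turnover time of a one-pool carbon model from its posterior given noisy stock observations. The one-pool model, prior bounds, and noise level below are stand-ins for the matrix emulator and the global data products, not the CLM4.5 configuration.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate(tau, gpp=2.0, alloc=0.5, c0=10.0, years=50):
    """One-pool carbon stock: dC/dt = GPP * alloc - C / tau (annual steps)."""
    c, c_t = np.empty(years), c0
    for t in range(years):
        c_t += gpp * alloc - c_t / tau
        c[t] = c_t
    return c

obs = simulate(tau=12.0) + rng.normal(0, 0.5, 50)   # synthetic observations

def log_post(tau):
    if not (1.0 < tau < 100.0):                     # flat prior bounds (assumed)
        return -np.inf
    return -0.5 * np.sum((obs - simulate(tau))**2) / 0.5**2

tau, lp, chain = 30.0, log_post(30.0), []
for _ in range(5000):                               # random-walk Metropolis
    prop = tau + rng.normal(0, 1.0)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:        # accept/reject
        tau, lp = prop, lp_prop
    chain.append(tau)

print("posterior mean turnover time:", round(np.mean(chain[1000:]), 2), "yr")
```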

  6. Incorporating Land-Use Mapping Uncertainty in Remote Sensing Based Calibration of Land-Use Change Models

    NASA Astrophysics Data System (ADS)

    Cockx, K.; Van de Voorde, T.; Canters, F.; Poelmans, L.; Uljee, I.; Engelen, G.; de Jong, K.; Karssenberg, D.; van der Kwast, J.

    2013-05-01

    Building urban growth models typically involves a process of historic calibration based on historic time series of land-use maps, usually obtained from satellite imagery. Both the remote sensing data analysis to infer land use and the subsequent modelling of land-use change are subject to uncertainties, which may have an impact on the accuracy of future land-use predictions. Our research aims to quantify and reduce these uncertainties by means of a particle filter data assimilation approach that incorporates uncertainty in land-use mapping and land-use model parameter assessment into the calibration process. This paper focuses on part of this work, in particular the modelling of uncertainties associated with the impervious surface cover estimation and urban land-use classification adopted in the land-use mapping approach. Both stages are subjected to a Monte Carlo simulation to assess their relative contribution to, and their combined impact on, the uncertainty in the derived land-use maps. The approach was applied to the central part of the Flanders region (Belgium), using a time series of Landsat/SPOT-HRV data covering the years 1987, 1996, 2005 and 2012. Although the most likely land-use map obtained from the simulation is very similar to the original classification, it is shown that the errors related to the impervious surface sub-pixel fraction estimation have a strong impact on the land-use map's uncertainty. Hence, incorporating uncertainty in the land-use change model calibration through particle filter data assimilation is proposed to address the uncertainty observed in the derived land-use maps and to reduce uncertainty in future land-use predictions.

  7. Advanced data assimilation in strongly nonlinear dynamical systems

    NASA Technical Reports Server (NTRS)

    Miller, Robert N.; Ghil, Michael; Gauthiez, Francois

    1994-01-01

    Advanced data assimilation methods are applied to simple but highly nonlinear problems. The dynamical systems studied here are the stochastically forced double well and the Lorenz model. In both systems, linear approximation of the dynamics about the critical points near which regime transitions occur is not always sufficient to track their occurrence or nonoccurrence. Straightforward application of the extended Kalman filter yields mixed results. The ability of the extended Kalman filter to track transitions of the double-well system from one stable critical point to the other depends on the frequency and accuracy of the observations relative to the mean-square amplitude of the stochastic forcing. The ability of the filter to track the chaotic trajectories of the Lorenz model is limited to short times, as is the ability of strong-constraint variational methods. Examples are given to illustrate the difficulties involved, and qualitative explanations for these difficulties are provided. Three generalizations of the extended Kalman filter are described. The first is based on inspection of the innovation sequence, that is, the successive differences between observations and forecasts; it works very well for the double-well problem. The second, an extension to fourth-order moments, yields excellent results for the Lorenz model but will be unwieldy when applied to models with high-dimensional state spaces. A third, more practical method--based on an empirical statistical model derived from a Monte Carlo simulation--is formulated, and shown to work very well. Weak-constraint methods can be made to perform satisfactorily in the context of these simple models, but such methods do not seem to generalize easily to practical models of the atmosphere and ocean. In particular, it is shown that the equations derived in the weak variational formulation are difficult to solve conveniently for large systems.
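
    For the double-well case discussed above, a scalar extended Kalman filter is easy to write down: the drift f(x) = x - x^3 has stable points at ±1, the filter linearizes f about the current estimate, and sparse noisy observations correct the forecast. The time step, noise levels, and observation frequency below are illustrative choices, not the paper's experiment settings.

```python
import numpy as np

rng = np.random.default_rng(3)
dt, q, r = 0.05, 0.3, 0.2               # time step, process and obs noise std
f = lambda x: x + dt * (x - x**3)       # Euler map of the double-well drift
F = lambda x: 1 + dt * (1 - 3 * x**2)   # Jacobian of f (linearization)

x_true, x_est, P = -1.0, -1.0, 0.1
for k in range(400):
    x_true = f(x_true) + np.sqrt(dt) * q * rng.normal()   # stochastic forcing
    x_est, P = f(x_est), F(x_est)**2 * P + dt * q**2      # EKF forecast step
    if k % 10 == 0:                                       # sparse observations
        y = x_true + r * rng.normal()
        K = P / (P + r**2)                                # scalar Kalman gain
        x_est += K * (y - x_est)
        P *= (1 - K)

print(f"truth {x_true:+.2f}, EKF estimate {x_est:+.2f}")
```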

  8. Ensemble-Based Data Assimilation With a Martian GCM

    NASA Astrophysics Data System (ADS)

    Lawson, W.; Richardson, M. I.; McCleese, D. J.; Anderson, J. L.; Chen, Y.; Snyder, C.

    2007-12-01

    Quantitative study of Mars weather and climate will ultimately stem from analysis of its dynamic and thermodynamic fields. Of all the observations of Mars available to date, such fields are most easily derived from mapping data (radiances) of the martian atmosphere as measured by orbiting infrared spectrometers and radiometers (e.g., MGS/TES and MRO/MCS). Such data-derived products are the solutions to inverse problems, and while individual profile retrievals have been the popular data-derived products in the planetary sciences, the terrestrial meteorological community has gained much ground over the last decade by employing techniques of data assimilation (DA) to analyze radiances. Ancillary information is required to close an inverse problem (i.e., to disambiguate the family of possibilities that are consistent with the observations), and DA practitioners inevitably rely on numerical models for this information (e.g., general circulation models (GCMs)). Data assimilation elicits maximal information content from available observations, and, by way of the physics encoded in the numerical model, spreads this information spatially, temporally, and across variables, thus allowing global extrapolation of limited and non-simultaneous observations. If the model is skillful, then a given, specific model integration can be corrected by the information-spreading abilities of DA, and the resulting time sequence of "analysis" states is brought into agreement with the observations. These analysis states are complete, gridded estimates of all the fields one might wish to diagnose for scientific study of the martian atmosphere. Though a numerical model has been used to obtain these estimates, their fidelity rests in their simultaneous consistency with both the observations (to within their stated uncertainties) and the physics contained in the model. In this fashion, radiance observations can, say, be used to deduce the wind field. A new class of DA approaches based on Monte Carlo approximations, "ensemble-based methods," has matured enough to be both appropriate for planetary problems and practically within the reach of planetary scientists. Capitalizing on this new class of methods, the National Center for Atmospheric Research (NCAR) has developed a framework for ensemble-based DA that is flexible and modular in its use of various forecast models and data sets. The framework is called DART, the Data Assimilation Research Testbed, and it is freely available on-line. We have begun to take advantage of this rich software infrastructure, and are on our way toward performing state-of-the-art DA in the martian atmosphere using Caltech's martian general circulation model, PlanetWRF. We have begun by testing and validating the model within DART under idealized scenarios, and we hope to address actual, available infrared remote sensing datasets from Mars orbiters in the coming year. We shall present the details of this approach and our progress to date.

  9. Assimilation of satellite color observations in a coupled ocean GCM-ecosystem model

    NASA Technical Reports Server (NTRS)

    Sarmiento, Jorge L.

    1992-01-01

    Monthly average coastal zone color scanner (CZCS) estimates of chlorophyll concentration were assimilated into an ocean global circulation model(GCM) containing a simple model of the pelagic ecosystem. The assimilation was performed in the simplest possible manner, to allow the assessment of whether there were major problems with the ecosystem model or with the assimilation procedure. The current ecosystem model performed well in some regions, but failed in others to assimilate chlorophyll estimates without disrupting important ecosystem properties. This experiment gave insight into those properties of the ecosystem model that must be changed to allow data assimilation to be generally successful, while raising other important issues about the assimilation procedure.

  10. Health Information Exchange (HIE): A literature review, assimilation pattern and a proposed classification for a new policy approach.

    PubMed

    Esmaeilzadeh, Pouyan; Sambasivan, Murali

    2016-12-01

    The literature shows the existence of barriers to the Health Information Exchange (HIE) assimilation process. A number of studies have considered assimilation of HIE as a whole phenomenon without regard to its multifaceted nature. Thus, the pattern of HIE assimilation in healthcare providers has not been clearly studied due to the effects of contingency factors on different assimilation phases. This study is aimed at defining HIE assimilation phases, recognizing the assimilation pattern, and proposing a classification to highlight unique issues associated with HIE assimilation. A literature review of existing studies related to HIE efforts published since 2005 was undertaken. Four electronic research databases (PubMed, Web of Science, CINAHL, and Academic Search Premiere) were searched for articles addressing different phases of the HIE assimilation process. Two hundred and fifty-four articles were initially selected. Of these, 44 studies met the inclusion criteria and were reviewed. The assimilation of HIE is a complicated, multi-stage process. Our findings indicated that the HIE assimilation process consists of four main phases: initiation, organizational adoption decision, implementation and institutionalization. The data helped us recognize the assimilation pattern of HIE in healthcare organizations. The results provide useful theoretical implications for research by defining the HIE assimilation pattern. The findings of the study also have practical implications for policy makers. The findings show the importance of raising national awareness of HIE's potential benefits, financial incentive programs, use of standard guidelines, implementation of certified technology, technical assistance, training programs and trust between healthcare providers. The study highlights deficiencies in the current policy using the literature and identifies the "pattern" as an indication for a new policy approach. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Data Assimilation Experiments using Quality Controlled AIRS Version 5 Temperature Soundings

    NASA Technical Reports Server (NTRS)

    Susskind, Joel

    2008-01-01

    The AIRS Science Team Version 5 retrieval algorithm has been finalized and is now operational at the Goddard DAAC in the processing (and reprocessing) of all AIRS data. Version 5 contains accurate case-by-case error estimates for most derived products, which are also used for quality control. We have conducted forecast impact experiments assimilating AIRS quality controlled temperature profiles using the NASA GEOS-5 data assimilation system, consisting of the NCEP GSI analysis coupled with the NASA FVGCM. Assimilation of quality controlled temperature profiles resulted in significantly improved forecast skill in both the Northern Hemisphere and Southern Hemisphere extra-tropics, compared to that obtained from analyses in which all data used operationally by NCEP, except AIRS data, are assimilated. Experiments using different quality control thresholds for assimilation of AIRS temperature retrievals showed that a medium threshold performed better than a tighter threshold, which provided better overall sounding accuracy, or a looser threshold, which provided better spatial coverage of accepted soundings. We are conducting more experiments to further optimize this balance of spatial coverage and sounding accuracy from the data assimilation perspective. In all cases, temperature soundings were assimilated well below cloud level in partially cloudy cases. The positive impact of assimilating AIRS-derived atmospheric temperatures all but vanished when only AIRS stratospheric temperatures were assimilated. Forecast skill resulting from assimilation of AIRS radiances uncontaminated by clouds, instead of AIRS temperature soundings, was only slightly better than that resulting from assimilation of only stratospheric AIRS temperatures. This reduction in forecast skill is most likely the result of a significant loss of tropospheric information when only AIRS radiances unaffected by clouds are used in the data assimilation process.

  12. Assimilating MODIS-based albedo and snow cover fraction into the Common Land Model to improve snow depth simulation with direct insertion and deterministic ensemble Kalman filter methods

    NASA Astrophysics Data System (ADS)

    Xu, Jianhui; Shu, Hong

    2014-09-01

    This study assesses the analysis performance of assimilating the Moderate Resolution Imaging Spectroradiometer (MODIS)-based albedo and snow cover fraction (SCF) separately or jointly into the physically based Common Land Model (CoLM). A direct insertion method (DI) is proposed to assimilate the black-sky and white-sky albedos into the CoLM. The MODIS-based albedo is calculated from the MODIS bidirectional reflectance distribution function (BRDF) model parameters product (MCD43B1) and the solar zenith angle as estimated in the CoLM for each time step. Meanwhile, the MODIS SCF (MOD10A1) is assimilated into the CoLM using the deterministic ensemble Kalman filter (DEnKF) method. A new DEnKF-albedo assimilation scheme integrating the DI and DEnKF assimilation schemes is proposed. Our assimilation results are validated against in situ snow depth observations from November 2008 to March 2009 at five sites in the Altay region of China. The experimental results show that all three data assimilation schemes can improve snow depth simulations. Overall, the DEnKF-albedo assimilation shows the best analysis performance as it significantly reduces the bias and root-mean-square error (RMSE) during the snow accumulation and ablation periods at all sites except the Fuyun site. The SCF assimilation via DEnKF produces better results than the albedo assimilation via DI, implying that the albedo assimilation, which only indirectly updates the snow depth state variable, is less efficient than the direct SCF assimilation. For the Fuyun site, the DEnKF-albedo scheme tends to overestimate the snow depth accumulation, with the maximum bias and RMSE values, because of the large positive innovation (observation minus forecast).
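
    The deterministic EnKF (DEnKF) analysis step has a compact generic form, in which the ensemble mean is corrected with the full Kalman gain while the perturbations are corrected with half the gain, avoiding perturbed observations (following Sakov and Oke's formulation). The state dimension, observation operator, and covariances below are toy values, not the CoLM snow setup.

```python
import numpy as np

rng = np.random.default_rng(4)
n, m, N = 8, 2, 20                          # state dim, obs dim, ensemble size
H = np.zeros((m, n)); H[0, 0] = H[1, 4] = 1.0   # observe two state components
R = 0.1 * np.eye(m)                         # observation error covariance

ens = rng.normal(1.0, 0.5, size=(n, N))     # forecast ensemble (columns)
y = np.array([1.4, 0.6])                    # observation vector

xb = ens.mean(axis=1, keepdims=True)
A = ens - xb                                # forecast perturbations
Pf = A @ A.T / (N - 1)                      # ensemble background covariance
K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)

xa = xb + K @ (y.reshape(-1, 1) - H @ xb)   # mean update: full gain
Aa = A - 0.5 * K @ (H @ A)                  # perturbation update: half gain
ens_a = xa + Aa                             # analysis ensemble

print("analysis mean:", ens_a.mean(axis=1).round(3))
```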

  13. Assimilation of remote sensing observations into a sediment transport model of China's largest freshwater lake: spatial and temporal effects.

    PubMed

    Zhang, Peng; Chen, Xiaoling; Lu, Jianzhong; Zhang, Wei

    2015-12-01

    Numerical models are important tools that are used in studies of sediment dynamics in inland and coastal waters, and these models can now benefit from the use of integrated remote sensing observations. This study explores a scheme for assimilating remotely sensed suspended sediment (from charge-coupled device (CCD) images obtained from the Huanjing (HJ) satellite) into a two-dimensional sediment transport model of Poyang Lake, the largest freshwater lake in China. Optimal interpolation is used as the assimilation method, and model predictions are obtained by combining four remote sensing images. The parameters for optimal interpolation are determined through a series of assimilation experiments evaluating the sediment predictions based on field measurements. The model with assimilation of remotely sensed sediment reduces the root-mean-square error of the predicted sediment concentrations by 39.4% relative to the model without assimilation, demonstrating the effectiveness of the assimilation scheme. The spatial effect of assimilation is explored by comparing model predictions with remotely sensed sediment, revealing that the model with assimilation generates reasonable spatial distribution patterns of suspended sediment. The temporal effect of assimilation on the model's predictive capabilities varies spatially, with an average temporal effect of approximately 10.8 days. The current velocities, which dominate the rate and direction of sediment transport, most likely account for the spatial differences in the temporal effect of assimilation on model predictions.
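
    Optimal interpolation itself reduces to one gain computation and one update. The sketch below blends a 1-D background field with three point observations using a Gaussian-correlated background error covariance; the grid, length scale, and numbers are illustrative, not the Poyang Lake configuration.

```python
import numpy as np

n = 50
x_grid = np.arange(n, dtype=float)
L = 5.0                                       # correlation length scale (assumed)
B = 0.3 * np.exp(-0.5 * ((x_grid[:, None] - x_grid[None, :]) / L) ** 2)

obs_idx = [10, 25, 40]                        # observation locations
H = np.zeros((len(obs_idx), n))
H[np.arange(len(obs_idx)), obs_idx] = 1.0     # point observation operator
R = 0.05 * np.eye(len(obs_idx))               # observation error covariance

xb = np.full(n, 20.0)                         # background sediment field
y = np.array([26.0, 18.0, 23.0])              # remotely sensed values (toy)

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)  # optimal-interpolation gain
xa = xb + K @ (y - H @ xb)                    # analysis field
print("analysis at obs points:", xa[obs_idx].round(2))
```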

  14. Comparison between assimilated and non-assimilated experiments of the MACC-II global reanalysis near-surface ozone

    NASA Astrophysics Data System (ADS)

    Tsikerdekis, Athanasios; Katragou, Eleni; Zanis, Prodromos; Melas, Dimitrios; Eskes, Henk; Flemming, Johannes; Huijnen, Vincent; Inness, Antje; Kapsomenakis, Ioannis; Schultz, Martin; Stein, Olaf; Zerefos, Christos

    2014-05-01

    In this work we evaluate near-surface ozone concentrations from the MACC-II global reanalysis using measurements from the EMEP and AIRBASE databases. The eight-year-long reanalysis of atmospheric composition data covering the period 2003-2010 was constructed as part of the FP7-funded Monitoring Atmospheric Composition and Climate project by assimilating satellite data into a global model and data assimilation system (Inness et al., 2013). The study mainly focuses on the differences between the assimilated and the non-assimilated experiments and aims to identify and quantify any improvements achieved by adding data assimilation to the system. Results are analyzed in eight European sub-regions, and region-specific Taylor plots illustrate the evaluation and the overall predictive skill of each experiment. The diurnal and annual cycles of near-surface ozone are evaluated for both experiments. Furthermore, ozone exposure indices for crop growth (AOT40) and human health (SOMO35), and the number of days on which 8-hour ozone averages exceeded 60 ppb and 90 ppb, have been calculated for each station based on both observed and simulated data. Results indicate that the assimilated experiment mostly improves upon the non-assimilated one with respect to high near-surface ozone concentrations, the diurnal cycle and range, and the bias. The limitations of the comparison between assimilated and non-assimilated experiments for near-surface ozone are also discussed.

  15. Ocean Data Assimilation Systems for GODAE

    DTIC Science & Technology

    2009-09-01

    We describe some of the ocean data assimilation systems that have been developed within the Global Ocean Data Assimilation Experiment (GODAE), as well as prospects for assimilation systems in the post-GODAE time period beyond 2008.

  16. Variational assimilation of streamflow into operational distributed hydrologic models: effect of spatiotemporal adjustment scale

    NASA Astrophysics Data System (ADS)

    Lee, H.; Seo, D.-J.; Liu, Y.; Koren, V.; McKee, P.; Corby, R.

    2012-01-01

    State updating of distributed rainfall-runoff models via streamflow assimilation is subject to overfitting because large dimensionality of the state space of the model may render the assimilation problem seriously under-determined. To examine the issue in the context of operational hydrology, we carry out a set of real-world experiments in which streamflow data is assimilated into gridded Sacramento Soil Moisture Accounting (SAC-SMA) and kinematic-wave routing models of the US National Weather Service (NWS) Research Distributed Hydrologic Model (RDHM) with the variational data assimilation technique. Study basins include four basins in Oklahoma and five basins in Texas. To assess the sensitivity of data assimilation performance to dimensionality reduction in the control vector, we used nine different spatiotemporal adjustment scales, where state variables are adjusted in a lumped, semi-distributed, or distributed fashion and biases in precipitation and potential evaporation (PE) are adjusted hourly, 6-hourly, or kept time-invariant. For each adjustment scale, three different streamflow assimilation scenarios are explored, where streamflow observations at basin interior points, at the basin outlet, or at both interior points and the outlet are assimilated. The streamflow assimilation experiments with nine different basins show that the optimum spatiotemporal adjustment scale varies from one basin to another and may be different for streamflow analysis and prediction in all of the three streamflow assimilation scenarios. The most preferred adjustment scale for seven out of nine basins is found to be the distributed, hourly scale, despite the fact that several independent validation results at this adjustment scale indicated the occurrence of overfitting. Basins with highly correlated interior and outlet flows tend to be less sensitive to the adjustment scale and could benefit more from streamflow assimilation. In comparison to outlet flow assimilation, interior flow assimilation at any adjustment scale produces streamflow predictions with a spatial correlation structure more consistent with that of streamflow observations. We also describe diagnosing the complexity of the assimilation problem using the spatial correlation information associated with the streamflow process, and discuss the effect of timing errors in a simulated hydrograph on the performance of the data assimilation procedure.
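
    The variational idea of adjusting a small control vector against observed streamflow can be sketched with a toy linear reservoir: a least-squares cost with a background penalty is minimized over an initial storage and a multiplicative precipitation bias. The reservoir model, window length, and weights below are assumptions for illustration only, far simpler than the gridded SAC-SMA/RDHM configuration.

```python
import numpy as np
from scipy.optimize import minimize

P = np.array([0, 2, 8, 5, 1, 0, 0, 0, 0, 0], float)   # precipitation [mm/h]
k = 0.2                                                # reservoir constant

def simulate(s0, beta):
    """Linear reservoir: storage gains beta-scaled precip, drains at rate k."""
    s, q = s0, []
    for p in P:
        s += beta * p - k * s
        q.append(k * s)
    return np.array(q)

rng = np.random.default_rng(9)
q_obs = simulate(10.0, 0.8) + rng.normal(0, 0.05, 10)  # synthetic observations

# Strong-constraint cost: misfit to streamflow plus a weak background
# penalty pulling the precipitation bias toward 1 (no adjustment).
cost = lambda z: np.sum((simulate(*z) - q_obs) ** 2) + 0.1 * (z[1] - 1.0) ** 2

res = minimize(cost, np.array([5.0, 1.0]))             # background first guess
print("analysed initial storage and precip bias:", res.x.round(3))
```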

  17. Assimilation of MLS and OMI Ozone Data

    NASA Technical Reports Server (NTRS)

    Stajner, I.; Wargan, K.; Chang, L.-P.; Hayashi, H.; Pawson, S.; Froidevaux, L.; Livesey, N.

    2005-01-01

    Ozone data from Aura Microwave Limb Sounder (MLS) and Ozone Monitoring Instrument (OMI) were assimilated into the ozone model at NASA's Global Modeling and Assimilation Office (GMAO). This assimilation produces ozone fields that are superior to those from the operational GMAO assimilation of Solar Backscatter Ultraviolet (SBUV/2) instrument data. Assimilation of Aura data improves the representation of the "ozone hole" and the agreement with independent Stratospheric Aerosol and Gas Experiment (SAGE) III and ozone sonde data. Ozone in the lower stratosphere is captured better: the mean state, vertical gradients, and spatial and temporal variability are all improved. Inclusion of OMI and MLS data together, or separately, in the assimilation system provides a way of checking how consistent OMI and MLS data are with each other, and with the ozone model. We found that differences between OMI total ozone column data and model forecasts decrease after MLS data are assimilated. This indicates that MLS stratospheric ozone profiles are consistent with OMI total ozone columns. The evaluation of the error characteristics of OMI and MLS ozone will continue as data from newer versions of the retrievals become available. We report on the initial step in obtaining global assimilated ozone fields that combine measurements from different Aura instruments, the ozone model at the GMAO, and their respective error characteristics. We plan to use assimilated ozone fields in the estimation of tropospheric ozone. We also plan to investigate the impacts of assimilated ozone fields on numerical weather prediction through their use in radiative models and in the assimilation of infrared nadir radiance data from NASA's Atmospheric Infrared Sounder (AIRS).

  18. Assimilation of total lightning data using the three-dimensional variational method at convection-allowing resolution

    NASA Astrophysics Data System (ADS)

    Zhang, Rong; Zhang, Yijun; Xu, Liangtao; Zheng, Dong; Yao, Wen

    2017-08-01

    A large number of observational analyses have shown that lightning data can be used to indicate areas of deep convection. It is important to assimilate observed lightning data into numerical models, so that more small-scale information can be incorporated to improve the quality of the initial condition and the subsequent forecasts. In this study, the empirical relationship between flash rate, water vapor mixing ratio, and graupel mixing ratio was used to adjust the model relative humidity, which was then assimilated by using the three-dimensional variational data assimilation system of the Weather Research and Forecasting model in cycling mode at 10-min intervals. To find the appropriate assimilation time-window length that yielded significant improvement in both the initial conditions and subsequent forecasts, four experiments with different assimilation time-window lengths were conducted for a squall line case that occurred on 10 July 2007 in North China. It was found that 60 min was the appropriate assimilation time-window length for this case, and longer assimilation window length was unnecessary since no further improvement was present. Forecasts of 1-h accumulated precipitation during the assimilation period and the subsequent 3-h accumulated precipitation were significantly improved compared with the control experiment without lightning data assimilation. The simulated reflectivity was optimal after 30 min of the forecast, it remained optimal during the following 42 min, and the positive effect from lightning data assimilation began to diminish after 72 min of the forecast. Overall, the improvement from lightning data assimilation can be maintained for about 3 h.

  19. Morphodynamic data assimilation used to understand changing coasts

    USGS Publications Warehouse

    Plant, Nathaniel G.; Long, Joseph W.

    2015-01-01

    Morphodynamic data assimilation blends observations with model predictions and comes in many forms, including linear regression, Kalman filter, brute-force parameter estimation, variational assimilation, and Bayesian analysis. Importantly, data assimilation can be used to identify sources of prediction errors that lead to improved fundamental understanding. Overall, models incorporating data assimilation yield better information to the people who must make decisions impacting safety and wellbeing in coastal regions that experience hazards due to storms, sea-level rise, and erosion. We present examples of data assimilation associated with morphologic change. We conclude that enough morphodynamic predictive capability is available now to be useful to people, and that we will increase our understanding and the level of detail of our predictions through assimilation of observations and numerical-statistical models.

  20. A short note on the assimilation of collocated and concurrent GPS and ionosonde data into the Electron Density Assimilative Model

    NASA Astrophysics Data System (ADS)

    Angling, M. J.; Jackson-Booth, N. K.

    2011-12-01

    The Electron Density Assimilative Model (EDAM) has been developed to provide real-time characterizations of the ionosphere by assimilating diverse data sets into a background model. Techniques have been developed to assimilate virtual height ionogram traces rather than relying on true height inversions. A test assimilation has been conducted using both GPS and ionosonde data as input. Postassimilation analysis shows that foF2 residuals can be degraded when only GPS data are assimilated. It has also been demonstrated that by using both data types it is possible to have low total electron content and foF2 residuals and that this is achieved by modifying the ionospheric slab thickness.

  1. Efficient Methods to Assimilate Satellite Retrievals Based on Information Content. Part 2; Suboptimal Retrieval Assimilation

    NASA Technical Reports Server (NTRS)

    Joiner, J.; Dee, D. P.

    1998-01-01

    One of the outstanding problems in data assimilation has been and continues to be how best to utilize satellite data while balancing the tradeoff between accuracy and computational cost. A number of weather prediction centers have recently achieved remarkable success in improving their forecast skill by changing the method by which satellite data are assimilated into the forecast model from the traditional approach of assimilating retrievals to the direct assimilation of radiances in a variational framework. The operational implementation of such a substantial change in methodology involves a great number of technical details, e.g., pertaining to quality control procedures, systematic error correction techniques, and tuning of the statistical parameters in the analysis algorithm. Although there are clear theoretical advantages to the direct radiance assimilation approach, it is not obvious at all to what extent the improvements that have been obtained so far can be attributed to the change in methodology, or to various technical aspects of the implementation. The issue is of interest because retrieval assimilation retains many practical and logistical advantages which may become even more significant in the near future when increasingly high-volume data sources become available. The central question we address here is: how much improvement can we expect from assimilating radiances rather than retrievals, all other things being equal? We compare the two approaches in a simplified one-dimensional theoretical framework, in which problems related to quality control and systematic error correction are conveniently absent. By assuming a perfect radiative transfer model and perfect knowledge of radiance and background error covariances, we are able to formulate a nonlinear local error analysis for each assimilation method. Direct radiance assimilation is optimal in this idealized context, while the traditional method of assimilating retrievals is suboptimal because it ignores the cross-covariances between background errors and retrieval errors. We show that interactive retrieval assimilation (where the same background used for assimilation is also used in the retrieval step) is equivalent to direct assimilation of radiances with suboptimal analysis weights. We illustrate and extend these theoretical arguments with several one-dimensional assimilation experiments, where we estimate vertical atmospheric profiles using simulated data from both the High-resolution InfraRed Sounder 2 (HIRS2) and the future Atmospheric InfraRed Sounder (AIRS).
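
    The paper's central point admits a short numerical check in the scalar case: an interactive retrieval is itself an optimal analysis, so re-assimilating it while treating its error as independent of the background error must degrade the result relative to direct radiance assimilation. All variances and the Jacobian below are arbitrary toy values, not the HIRS2/AIRS setup.

```python
import numpy as np

rng = np.random.default_rng(5)
Pb, r, h = 1.0, 0.5, 1.0            # background var, radiance var, Jacobian
n = 200_000                         # Monte Carlo sample size

x = rng.normal(0, 1, n)                        # truth
xb = x + rng.normal(0, np.sqrt(Pb), n)         # background
y = h * x + rng.normal(0, np.sqrt(r), n)       # radiance observation

# Direct radiance assimilation: the optimal scalar analysis.
K = Pb * h / (h**2 * Pb + r)
xa_opt = xb + K * (y - h * xb)

# Interactive retrieval: the same update produces the retrieval...
xr = xb + K * (y - h * xb)
# ...then a naive second analysis treats the retrieval error as independent
# of the background error and re-blends the two fields.
Pr = (1 - K * h) ** 2 * Pb + K**2 * r          # nominal retrieval error var
w = Pb / (Pb + Pr)
xa_sub = xb + w * (xr - xb)

print("optimal analysis error var:   ", np.var(xa_opt - x).round(4))
print("suboptimal analysis error var:", np.var(xa_sub - x).round(4))
```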

  2. Simultaneous assimilation of ozone profiles from multiple UV-VIS satellite instruments

    NASA Astrophysics Data System (ADS)

    van Peet, Jacob C. A.; van der A, Ronald J.; Kelder, Hennie M.; Levelt, Pieternel F.

    2018-02-01

    A three-dimensional global ozone distribution has been derived from assimilation of ozone profiles observed by satellites. By simultaneously assimilating ozone profiles retrieved from the nadir-looking satellite instruments Global Ozone Monitoring Experiment 2 (GOME-2) and Ozone Monitoring Instrument (OMI), which measure the atmosphere at different times of the day, the quality of the derived atmospheric ozone field has been improved. The assimilation uses an extended Kalman filter in which the chemical transport model TM5 provides the forecast. The combined assimilation of GOME-2 and OMI improves upon the assimilation results of a single sensor. The new assimilation system has been demonstrated by processing 4 years of data from 2008 to 2011. Validation of the assimilation output by comparison with sondes shows that biases vary between -5 and +10 % between the surface and 100 hPa. The biases for the combined assimilation vary between -3 and +3 % in the region between 100 and 10 hPa, where GOME-2 and OMI are most sensitive. This is a strong improvement compared to direct retrievals of ozone profiles from satellite observations.

  3. Improving Assimilated Global Climate Data Using TRMM and SSM/I Rainfall and Moisture Data

    NASA Technical Reports Server (NTRS)

    Hou, Arthur Y.; Zhang, Sara Q.; daSilva, Arlindo M.; Olson, William S.

    1999-01-01

    Current global analyses contain significant errors in primary hydrological fields such as precipitation, evaporation, and related cloud and moisture in the tropics. Work has been underway at NASA's Data Assimilation Office to explore the use of TRMM and SSM/I-derived rainfall and total precipitable water (TPW) data in global data assimilation to directly constrain these hydrological parameters. We found that assimilating these data types improves not only the precipitation and moisture estimates but also key climate parameters directly linked to convection such as the outgoing longwave radiation, clouds, and the large-scale circulation in the tropics. We will present results showing that assimilating TRMM and SSM/I 6-hour averaged rain rates and TPW estimates significantly reduces the state-dependent systematic errors in assimilated products. Specifically, rainfall assimilation improves cloud and latent heating distributions, which, in turn, improves the cloudy-sky radiation and the large-scale circulation, while TPW assimilation reduces moisture biases to improve radiation in clear-sky regions. Rainfall and TPW assimilation also improves tropical forecasts beyond 1 day.

  4. Exploring coupled 4D-Var data assimilation using an idealised atmosphere-ocean model

    NASA Astrophysics Data System (ADS)

    Smith, Polly; Fowler, Alison; Lawless, Amos; Haines, Keith

    2014-05-01

    The successful application of data assimilation techniques to operational numerical weather prediction and ocean forecasting systems has led to an increased interest in their use for the initialisation of coupled atmosphere-ocean models in prediction on seasonal to decadal timescales. Coupled data assimilation presents a significant challenge but offers a long list of potential benefits, including improved use of near-surface observations, reduction of initialisation shocks in coupled forecasts, and generation of a consistent system state for the initialisation of coupled forecasts across all timescales. In this work we explore some of the fundamental questions in the design of coupled data assimilation systems within the context of an idealised one-dimensional coupled atmosphere-ocean model. The system is based on the European Centre for Medium-Range Weather Forecasts (ECMWF) Integrated Forecast System (IFS) atmosphere model and a K-Profile Parameterisation (KPP) mixed layer ocean model developed by the National Centre for Atmospheric Science (NCAS) climate group at the University of Reading. It employs a strong-constraint incremental 4D-Var scheme and is designed to enable the effective exploration of various approaches to performing coupled model data assimilation whilst avoiding many of the issues associated with more complex models. Working with this simple framework enables a greater range and quantity of experiments to be performed. Here, we will describe the development of our simplified single-column coupled atmosphere-ocean 4D-Var assimilation system and present preliminary results from a series of identical twin experiments devised to investigate and compare the behaviour and sensitivities of different coupled data assimilation methodologies. This includes comparing fully and weakly coupled assimilations with uncoupled assimilation, investigating whether coupled assimilation can eliminate or lessen initialisation shock in coupled model forecasts, and exploring the effect of the assimilation window length in coupled assimilations. These experiments will facilitate a greater theoretical understanding of the coupled atmosphere-ocean data assimilation problem and thus help guide the design and implementation of different coupling strategies within operational systems. This research is funded by the European Space Agency (ESA) and the UK Natural Environment Research Council (NERC). The ESA-funded component is part of the Data Assimilation Projects - Coupled Model Data Assimilation initiative, whose goal is to advance data assimilation techniques in fully coupled atmosphere-ocean models (see http://www.esa-da.org/). It is being conducted in parallel to the development of prototype weakly coupled data assimilation systems at both the UK Met Office and ECMWF.

  5. Impact of Assimilation on Heavy Rainfall Simulations Using WRF Model: Sensitivity of Assimilation Results to Background Error Statistics

    NASA Astrophysics Data System (ADS)

    Rakesh, V.; Kantharao, B.

    2017-03-01

    Data assimilation is considered one of the effective tools for improving the forecast skill of mesoscale models. However, for optimum utilization and effective assimilation of observations, many factors need to be taken into account while designing a data assimilation methodology. One of the critical components that determines the amount and propagation of observation information into the analysis is the model background error statistics (BES). The objective of this study is to quantify how BES in data assimilation impacts the simulation of heavy rainfall events over the southern Indian state of Karnataka. Simulations of 40 heavy rainfall events were carried out using the Weather Research and Forecasting Model with and without data assimilation. The assimilation experiments were conducted using global and regional BES, while the experiment with no assimilation was used as the baseline for assessing the impact of data assimilation. The simulated rainfall is verified against high-resolution rain-gauge observations over Karnataka. Statistical evaluation using several accuracy and skill measures shows that data assimilation has improved the heavy rainfall simulation. Our results showed that the experiment using regional BES outperformed the one which used global BES. Critical thermodynamic variables conducive to heavy rainfall, such as convective available potential energy, are simulated more realistically using regional BES than global BES. These results have important practical implications for the design of forecast platforms supporting decision-making during extreme weather events.

  6. A Global Data Assimilation System for Atmospheric Aerosol

    NASA Technical Reports Server (NTRS)

    daSilva, Arlindo

    1999-01-01

    We will give an overview of an aerosol data assimilation system which combines advances in remote sensing of atmospheric aerosols, aerosol modeling and data assimilation methodology to produce high spatial and temporal resolution 3D aerosol fields. Initially, the Goddard Aerosol Assimilation System (GAAS) will assimilate TOMS, AVHRR and AERONET observations; later we will include MODIS and MISR. This data assimilation capability will allow us to integrate complementary aerosol observations from these platforms, enabling the development of an assimilated aerosol climatology as well as a global aerosol forecasting system in support of field campaigns. Furthermore, this system provides an interactive retrieval framework for each aerosol-observing satellite, in particular TOMS and AVHRR. The Goddard Aerosol Assimilation System (GAAS) takes advantage of recent advances in constituent data assimilation at the DAO, including flow-dependent parameterizations of error covariances and the proper consideration of model bias. For its prognostic transport model, GAAS will utilize the Goddard Ozone, Chemistry, Aerosol, Radiation and Transport (GOCART) model developed at NASA/GSFC Codes 916 and 910.3. GOCART includes the Lin-Rood flux-form, semi-Lagrangian transport model with parameterized aerosol chemistry and physical processes for absorbing (dust and black carbon) and non-absorbing aerosols (sulfate and organic carbon). Observations and model fields are combined using a constituent version of DAO's Physical-space Statistical Analysis System (PSAS), including its adaptive quality control system. In this talk we describe the main components of this assimilation system and present preliminary results obtained by assimilating TOMS data.

  7. Continuing Themes in Assimilation Through Education.

    ERIC Educational Resources Information Center

    Strouse, Joan

    1987-01-01

    Discusses assimilation and adaptation of immigrants in the United States. Summarizes major sociological theories on assimilation. Focuses on schools as instruments of assimilation that attempt to force Anglo-conformity upon students. The refugee student's perceptions of his or her problems and opportunities are discussed. (PS)

  8. An alternative approach to probabilistic seismic hazard analysis in the Aegean region using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Weatherill, Graeme; Burton, Paul W.

    2010-09-01

    The Aegean is the most seismically active and tectonically complex region in Europe. Damaging earthquakes have occurred here throughout recorded history, often resulting in considerable loss of life. The Monte Carlo method of probabilistic seismic hazard analysis (PSHA) is used to determine the level of ground motion likely to be exceeded in a given time period. Multiple random simulations of seismicity are generated to calculate, directly, the ground motion for a given site. Within the seismic hazard analysis we explore the impact of different seismic source models, incorporating both uniform zones and distributed seismicity. A new, simplified, seismic source model, derived from seismotectonic interpretation, is presented for the Aegean region. This is combined into the epistemic uncertainty analysis alongside existing source models for the region, and models derived by a K-means cluster analysis approach. Seismic source models derived using the K-means approach offer a degree of objectivity and reproducibility into the otherwise subjective approach of delineating seismic sources using expert judgment. Similar review and analysis is undertaken for the selection of peak ground acceleration (PGA) attenuation models, incorporating into the epistemic analysis Greek-specific models, European models and a Next Generation Attenuation model. Hazard maps for PGA on a "rock" site with a 10% probability of being exceeded in 50 years are produced and different source and attenuation models are compared. These indicate that Greek-specific attenuation models, with their smaller aleatory variability terms, produce lower PGA hazard, whilst recent European models and the Next Generation Attenuation (NGA) model produce similar results. The Monte Carlo method is extended further to assimilate epistemic uncertainty into the hazard calculation, thus integrating across several appropriate source and PGA attenuation models. Site condition and fault type are also integrated into the hazard mapping calculations. These hazard maps are in general agreement with previous maps for the Aegean, recognising the highest hazard in the Ionian Islands, Gulf of Corinth and Hellenic Arc. Peak ground accelerations for some sites in these regions reach as high as 500-600 cm s^-2 using European/NGA attenuation models, and 400-500 cm s^-2 using Greek attenuation models.
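
    The Monte Carlo PSHA recipe sketched below follows the generic pattern described above: simulate many years of seismicity from a truncated Gutenberg-Richter law, attenuate each event to the site with a lognormal ground-motion model, and read the design PGA off the empirical distribution of annual maxima. The source geometry, rates, and attenuation coefficients are placeholders, not the Aegean source model.

```python
import numpy as np

rng = np.random.default_rng(6)
years, lam, b, m_min, m_max = 100_000, 5.0, 1.0, 4.5, 7.5  # rate/yr, G-R b-value

n_eq = rng.poisson(lam * years)                 # total simulated events
u = rng.uniform(size=n_eq)
beta = b * np.log(10.0)                         # truncated G-R inverse CDF:
M = m_min - np.log(1 - u * (1 - np.exp(-beta * (m_max - m_min)))) / beta
R = rng.uniform(5.0, 100.0, n_eq)               # epicentral distance [km]

# Toy attenuation: ln PGA [cm/s^2] = a + c*M - d*ln(R) + aleatory scatter.
ln_pga = -1.0 + 1.2 * M - 1.1 * np.log(R) + rng.normal(0, 0.6, n_eq)
pga = np.exp(ln_pga)

pga_annual_max = np.zeros(years)
yr = rng.integers(0, years, n_eq)               # assign events to years
np.maximum.at(pga_annual_max, yr, pga)          # annual maximum PGA per year

target = 1 - (1 - 0.10) ** (1 / 50)             # annual prob. for 10% in 50 yr
pga_475 = np.quantile(pga_annual_max, 1 - target)
print(f"PGA with ~475-yr return period: {pga_475:.0f} cm/s^2 (toy model)")
```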

  9. Towards Data-Driven Simulations of Wildfire Spread using Ensemble-based Data Assimilation

    NASA Astrophysics Data System (ADS)

    Rochoux, M. C.; Bart, J.; Ricci, S. M.; Cuenot, B.; Trouvé, A.; Duchaine, F.; Morel, T.

    2012-12-01

    Real-time predictions of a propagating wildfire remain a challenging task because the problem involves both multiple physics and multiple scales. The propagation speed of wildfires, also called the rate of spread (ROS), is determined by complex interactions between pyrolysis, combustion, flow dynamics and atmospheric dynamics occurring at vegetation, topographical and meteorological scales. Current operational fire spread models are mainly based on a semi-empirical parameterization of the ROS in terms of vegetation, topographical and meteorological properties. For the fire spread simulation to be predictive and compatible with operational applications, the uncertainty in the ROS model should be reduced. As recent progress in remote sensing technology provides new ways to monitor the fire front position, a promising approach to overcoming the difficulties found in wildfire spread simulations is to integrate fire modeling and fire sensing technologies using data assimilation (DA). For this purpose we have developed a prototype data-driven wildfire spread simulator in order to provide optimal estimates of poorly known model parameters [*]. The data-driven simulation capability is adapted for more realistic wildfire spread: it considers a regional-scale fire spread model that is informed by observations of the fire front location. An Ensemble Kalman Filter (EnKF) algorithm based on a parallel computing platform (OpenPALM) was implemented in order to perform a multi-parameter sequential estimation in which wind magnitude and direction are estimated in addition to vegetation properties. The EnKF algorithm shows a good ability to track a small-scale grassland fire experiment and accounts well for the sensitivity of the simulation outcomes to the control parameters. In conclusion, data assimilation is a promising approach to more accurately forecasting time-varying wildfire spread conditions as new airborne-like observations of the fire front location become available. [*] Rochoux, M.C., Delmotte, B., Cuenot, B., Ricci, S., and Trouvé, A. (2012) "Regional-scale simulations of wildland fire spread informed by real-time flame front observations", Proc. Combust. Inst., 34, in press http://dx.doi.org/10.1016/j.proci.2012.06.090 [Figure: EnKF-based tracking of a small-scale grassland fire experiment, with estimation of wind and fuel parameters.]
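
    State augmentation makes the parameter-estimation step concrete: each ensemble member carries a front position plus an uncertain spread-rate parameter, and observing the front corrects both through the ensemble cross-covariance. The 1-D front, linear ROS model, and noise values below are assumptions, far simpler than the regional-scale simulator described above.

```python
import numpy as np

rng = np.random.default_rng(7)
N, dt, r = 100, 60.0, 5.0                 # members, step [s], obs noise std [m]
ros_true, x_true = 0.5, 0.0               # true rate of spread [m/s], front [m]

ros = rng.normal(0.3, 0.15, N)            # uncertain ROS parameter ensemble
x = np.zeros(N)                           # front position ensemble

for step in range(30):
    x_true += ros_true * dt
    x += ros * dt                          # forecast each member
    y = x_true + rng.normal(0, r)          # airborne-like front observation
    yp = y + rng.normal(0, r, N)           # perturbed observations (stochastic EnKF)

    # Gains from ensemble (cross-)covariances of the augmented state [x, ros].
    cov_xx = np.var(x, ddof=1)
    cov_rx = np.cov(ros, x, ddof=1)[0, 1]
    kx = cov_xx / (cov_xx + r**2)
    kr = cov_rx / (cov_xx + r**2)

    innov = yp - x
    x += kx * innov                        # update front positions
    ros += kr * innov                      # update spread-rate parameter

print(f"estimated ROS: {ros.mean():.3f} m/s (truth {ros_true})")
```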

  10. Evaluation of an operational real-time irrigation scheduling scheme for drip irrigated citrus fields in Picassent, Spain

    NASA Astrophysics Data System (ADS)

    Li, Dazhi; Hendricks-Franssen, Harrie-Jan; Han, Xujun; Jiménez Bello, Miguel Angel; Martínez Alzamora, Fernando; Vereecken, Harry

    2017-04-01

    Irrigated agriculture accounts worldwide for 40% of food production and 70% of fresh water withdrawals. Irrigation scheduling aims to minimize water use while maintaining agricultural production. In this study we were concerned with the real-time automatic control of irrigation, which calculates the daily water allocation by combining information from soil moisture sensors and a land surface model. The combination of soil moisture measurements and predictions by the Community Land Model (CLM) using sequential data assimilation (DA) is a promising alternative for improving the estimate of soil and plant water status. The LETKF (Local Ensemble Transform Kalman Filter) was chosen to assimilate soil water content measured by FDR (Frequency Domain Reflectometry) into CLM and improve the initial (soil moisture) conditions for the next model run. In addition, predictions by the GFS (Global Forecast System) atmospheric simulation model were used as atmospheric input data for CLM to predict an ensemble of possible soil moisture evolutions for the next days. The difference between predicted and target soil water content is defined as the water deficit, and the irrigation amount was calculated by integrating the water deficit over the root zone. The corresponding irrigation time to apply the required water was introduced in SCADA (supervisory control and data acquisition system) for each citrus field. In total, 6 fields were irrigated according to our optimization approach including data assimilation (CLM-DA), and there were also 2 fields following the FAO (Food and Agriculture Organization) water balance method and 4 fields controlled by farmers as references. During the real-time irrigation campaigns in Valencia from July to October 2015 and June to October 2016, the applied irrigation amount, stem water potential and soil moisture content were recorded. The data indicated that 5%-20% less irrigation water was needed for the CLM-DA scheduled fields than for the other fields following the FAO or farmers' methods. Stem water potential data indicated that the CLM-DA fields were not suffering from water stress during most of the irrigation period. Even though the CLM-DA fields received the least irrigation water, orange production was not reduced. Our results show the water-saving potential of the CLM-DA method compared to traditional irrigation methods.
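
    The deficit computation itself is a short integral over the root zone, as sketched below; the layer discretization, target water content, and "forecast" profile are illustrative stand-ins for the CLM-DA ensemble output.

```python
import numpy as np

layer_thickness = np.array([0.05, 0.10, 0.15, 0.20, 0.30])  # root-zone layers [m]
theta_target = 0.28                                         # target water content [m3/m3]
theta_pred = np.array([0.22, 0.24, 0.26, 0.27, 0.28])       # forecast ensemble mean

deficit = np.clip(theta_target - theta_pred, 0.0, None)     # shortfall per layer
irrigation_mm = 1000.0 * np.sum(deficit * layer_thickness)  # depth of water [mm]
print(f"daily irrigation requirement: {irrigation_mm:.1f} mm")
```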

  11. Ensemble-Based Assimilation of Aerosol Observations in GEOS-5

    NASA Technical Reports Server (NTRS)

    Buchard, V.; Da Silva, A.

    2016-01-01

    MERRA-2 is the latest Aerosol Reanalysis produced at NASA's Global Modeling Assimilation Office (GMAO) from 1979 to present. This reanalysis is based on a version of the GEOS-5 model radiatively coupled to GOCART aerosols and includes assimilation of bias corrected Aerosol Optical Depth (AOD) from AVHRR over ocean, MODIS sensors on both Terra and Aqua satellites, MISR over bright surfaces and AERONET data. In order to assimilate lidar profiles of aerosols, we are updating the aerosol component of our assimilation system to an Ensemble Kalman Filter (EnKF) type of scheme using ensembles generated routinely by the meteorological assimilation. Following the work performed with the first NASA's aerosol reanalysis (MERRAero), we first validate the vertical structure of MERRA-2 aerosol assimilated fields using CALIOP data over regions of particular interest during 2008.

  12. One-dimensional soil temperature assimilation experiment based on unscented particle filter and Common Land Model

    NASA Astrophysics Data System (ADS)

    Fu, Xiao Lei; Jin, Bao Ming; Jiang, Xiao Lei; Chen, Cheng

    2018-06-01

    Data assimilation is an efficient way to improve simulation/prediction accuracy in many fields of the geosciences, especially in meteorological and hydrological applications. This study takes the unscented particle filter (UPF) as an example and tests its performance under two probability distributions for the observation error (Gaussian and uniform) and two assimilation frequencies: (1) assimilating hourly in situ soil surface temperature, and (2) assimilating the original Moderate Resolution Imaging Spectroradiometer (MODIS) Land Surface Temperature (LST) once per day. The numerical experiment results show that the filter performs better when the assimilation frequency is increased. In addition, the UPF is efficient for improving the simulation/prediction accuracy of soil variables (e.g., soil temperature), though it is not sensitive to the probability distribution of the observation error in soil temperature assimilation.

  13. Comparison of different assimilation schemes in an operational assimilation system with Ensemble Kalman Filter

    NASA Astrophysics Data System (ADS)

    Yan, Yajing; Barth, Alexander; Beckers, Jean-Marie; Candille, Guillem; Brankart, Jean-Michel; Brasseur, Pierre

    2016-04-01

    In this paper, four assimilation schemes, including an intermittent assimilation scheme (INT) and three incremental assimilation schemes (IAU 0, IAU 50 and IAU 100), are compared in the same assimilation experiments with a realistic eddy-permitting primitive equation model of the North Atlantic Ocean using the Ensemble Kalman Filter. The three IAU schemes differ from each other in the position of the increment update window, which has the same size as the assimilation window; 0, 50 and 100 refer to the degree of superposition of the increment update window on the current assimilation window. Sea surface height, sea surface temperature, and temperature profiles at depth collected between January and December 2005 are assimilated. Sixty ensemble members are generated by adding realistic noise to the forcing parameters related to the temperature. The ensemble is diagnosed and validated by comparison between the ensemble spread and the model/observation difference, as well as by rank histograms, before the assimilation experiments. The relevance of each assimilation scheme is evaluated through analyses of thermohaline variables and the current velocities. The results of the assimilation are assessed according to both deterministic and probabilistic metrics with independent/semi-independent observations. For deterministic validation, the ensemble means, together with the ensemble spreads, are compared to the observations in order to diagnose the ensemble distribution properties in a deterministic way. For probabilistic validation, the continuous ranked probability score (CRPS) is used to evaluate the ensemble forecast system according to reliability and resolution. The reliability is further decomposed into bias and dispersion by the reduced centered random variable (RCRV) score in order to investigate the reliability properties of the ensemble forecast system.
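
    For readers unfamiliar with the intermittent-versus-incremental distinction, the toy Python sketch below contrasts the two on a scalar model. It is illustrative only: the forcing term spreads the same analysis increment over the update window, which is the essential idea behind the IAU variants compared here; the damping dynamics and all numbers are invented.

    ```python
    import numpy as np

    def integrate(x, forcing, n_steps, dt, tendency):
        """Step a toy model dx/dt = tendency(x) + forcing with forward Euler."""
        for _ in range(n_steps):
            x = x + dt * (tendency(x) + forcing)
        return x

    tendency = lambda x: -0.1 * x       # toy damping dynamics
    x0, increment = 1.0, 0.5            # background state and analysis increment
    n_steps, dt = 10, 0.1

    # Intermittent (INT): apply the full increment at once, then integrate freely.
    x_int = integrate(x0 + increment, 0.0, n_steps, dt, tendency)

    # IAU: spread the same increment as a constant forcing over the window,
    # so the model absorbs it gradually and initialization shocks are damped.
    iau_forcing = increment / (n_steps * dt)
    x_iau = integrate(x0, iau_forcing, n_steps, dt, tendency)

    print(f"INT: {x_int:.4f}   IAU: {x_iau:.4f}")
    ```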

  14. Assimilation of Stratospheric Meteorological and Constituent Observations: A Review

    NASA Technical Reports Server (NTRS)

    Rood, Richard B.; Pawson, Steven

    2004-01-01

    This talk reviews the assimilation of meteorological and constituent observations of the stratosphere. The first efforts to assimilate observations into stratospheric models were during the early 1980s, and a number of research studies followed during the next decade. Since the launch of the Upper Atmospheric Research Satellite (UARS) in 1991, model-assimilated data sets of the stratospheric meteorological state have been routinely available. These assimilated data sets were critical in bringing together observations from the different instruments on UARS as well as linking UARS observations to measurements from other platforms. Using trajectory-mapping techniques, meteorological assimilation analyses are now widely used in the analysis of constituent observations and have increased the level of quantitative study of stratospheric chemistry and transport. During the 1990s the use of winds and temperatures from assimilated data sets became standard for offline chemistry and transport modeling. Assimilated data sets provide accurate analyses of synoptic- and planetary-scale variability in middle latitudes. The transport experiments, however, reveal a set of shortcomings that become obvious as systematic errors are integrated over time. Generally, the tropics are not well represented, mixing between the tropics and middle latitudes is overestimated, and the residual circulation is not accurate. These shortcomings reveal underlying fundamental challenges related to bias and noise. Current studies using model simulation and data assimilation in controlled experimentation are highlighting the issues that must be addressed if assimilated data sets are to be convincingly used to study interannual variability and decadal change. At the same time, stratospheric assimilation is evolving to include constituent observations. The primary focus has been on stratospheric ozone, but there are efforts that investigate a suite of reactive chemical constituents. Recent progress in ozone assimilation shows the potential of assimilation to contribute to the validation of ozone observations and, ultimately, the retrieval of ozone profiles from space-based radiance measurements.

  15. DasPy – Open Source Multivariate Land Data Assimilation Framework with High Performance Computing

    NASA Astrophysics Data System (ADS)

    Han, Xujun; Li, Xin; Montzka, Carsten; Kollet, Stefan; Vereecken, Harry; Hendricks Franssen, Harrie-Jan

    2015-04-01

    Data assimilation has become a popular method to integrate observations from multiple sources with land surface models to improve predictions of the water and energy cycles of the soil-vegetation-atmosphere continuum. In recent years, several land data assimilation systems have been developed at different research agencies. Because of limited software availability or adaptability, these systems are not easy to apply for multivariate land data assimilation research. Multivariate data assimilation refers to the simultaneous assimilation of observation data for multiple model state variables into a simulation model. Our main motivation was therefore to develop an open-source multivariate land data assimilation framework (DasPy), implemented in the Python scripting language mixed with C++ and Fortran. This system has been evaluated in several soil moisture, L-band brightness temperature and land surface temperature assimilation studies. The implementation also allows parameter estimation (soil properties and/or leaf area index) on the basis of the joint state and parameter estimation approach. LETKF (Local Ensemble Transform Kalman Filter) is implemented as the main data assimilation algorithm, and uncertainties in the data assimilation can be represented by perturbed atmospheric forcings, perturbed soil and vegetation properties, and perturbed model initial conditions. CLM4.5 (Community Land Model) was integrated as the model operator. CMEM (Community Microwave Emission Modelling Platform), COSMIC (COsmic-ray Soil Moisture Interaction Code) and the two-source formulation were integrated as observation operators for assimilation of L-band passive microwave, cosmic-ray soil moisture probe and land surface temperature measurements, respectively. DasPy is parallelized using hybrid MPI (Message Passing Interface) and OpenMP (Open Multi-Processing) techniques. All input and output data flow is organized efficiently using the widely used NetCDF file format. Online 1D and 2D visualization of data assimilation results is also implemented to facilitate post-simulation analysis. In summary, DasPy is a ready-to-use, open-source, parallel multivariate land data assimilation framework.
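
    Joint state-parameter estimation of the kind DasPy performs is commonly based on augmenting the state vector with the uncertain parameters so that the filter updates both at once. The following sketch shows a generic ETKF analysis step on such an augmented ensemble (the Hunt et al. 2007 formulation that LETKF localizes, shown here without localization); it is not DasPy code, and the soil moisture example at the bottom is hypothetical.

    ```python
    import numpy as np
    from scipy.linalg import sqrtm

    def etkf_update(Z, y, obs_operator, obs_err_var):
        """ETKF analysis of an augmented [state; parameters] ensemble.

        Z : (n_aug, n_ens) ensemble, one augmented column per member
        y : (n_obs,) observation vector
        obs_operator : maps one augmented column to observation space
        obs_err_var : scalar observation-error variance (R = var * I)
        """
        n_aug, n_ens = Z.shape
        zbar = Z.mean(axis=1, keepdims=True)
        A = Z - zbar                                    # ensemble anomalies
        HZ = np.column_stack([obs_operator(Z[:, i]) for i in range(n_ens)])
        hbar = HZ.mean(axis=1, keepdims=True)
        S = (HZ - hbar) / np.sqrt(obs_err_var)          # normalized obs anomalies
        d = (y - hbar.ravel()) / np.sqrt(obs_err_var)   # normalized innovation
        Pa = np.linalg.inv((n_ens - 1) * np.eye(n_ens) + S.T @ S)
        wbar = Pa @ S.T @ d                             # mean weight vector
        W = np.real(sqrtm((n_ens - 1) * Pa))            # anomaly transform
        return zbar + A @ (wbar[:, None] + W)

    # Hypothetical example: 3 soil moisture layers augmented with 1 parameter;
    # only the top layer is observed, so any parameter update comes purely
    # through its sampled ensemble correlation with the observed state.
    rng = np.random.default_rng(1)
    Z = np.vstack([0.25 + 0.05 * rng.standard_normal((3, 30)),
                   1.0 + 0.2 * rng.standard_normal((1, 30))])
    Za = etkf_update(Z, np.array([0.30]), lambda z: z[:1], 0.02 ** 2)
    print("analyzed parameter mean:", Za[3].mean())
    ```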

  16. Improving GEOS-5 seven day forecast skill by assimilation of quality controlled AIRS temperature profiles

    NASA Astrophysics Data System (ADS)

    Susskind, J.; Rosenberg, R. I.

    2016-12-01

    The GEOS-5 Data Assimilation System (DAS) generates a global analysis every six hours by combining the previous six-hour forecast for that time period with contemporaneous observations. These observations include in-situ observations as well as those taken by satellite-borne instruments, such as AIRS/AMSU on EOS Aqua and CrIS/ATMS on S-NPP. Operational data assimilation methodology assimilates observed channel radiances Ri for IR sounding instruments such as AIRS and CrIS, but only for those channels i in a given scene whose radiances are thought to be unaffected by clouds. A limitation of this approach is that radiances in most tropospheric sounding channels are affected by clouds under partial cloud cover, which occurs most of the time. The AIRS Science Team Version-6 retrieval algorithm generates cloud-cleared radiances (CCRs) for each channel in a given scene, which represent the radiances AIRS would have observed if the scene were cloud free, and then uses them to determine quality-controlled (QC'd) temperature profiles T(p) under all cloud conditions. There are potential advantages to assimilating either AIRS QC'd CCRs or QC'd T(p) instead of Ri, in that the spatial coverage of observations is greater under partial cloud cover. We tested these two alternate data assimilation approaches by running three parallel data assimilation experiments over different time periods using GEOS-5: Experiment 1 assimilated all observations as done operationally, Experiment 2 assimilated QC'd values of AIRS CCRs in place of AIRS radiances, and Experiment 3 assimilated QC'd values of T(p) in place of observed radiances. Assimilation of QC'd AIRS T(p) resulted in significantly improved seven-day forecast skill compared to assimilation of CCRs or of observed radiances, especially in the Southern Hemisphere extra-tropics.

  17. 1D-Var multilayer assimilation of X-band SAR data into a detailed snowpack model

    NASA Astrophysics Data System (ADS)

    Phan, X. V.; Ferro-Famil, L.; Gay, M.; Durand, Y.; Dumont, M.; Morin, S.; Allain, S.; D'Urso, G.; Girard, A.

    2014-10-01

    The structure and physical properties of a snowpack and their temporal evolution may be simulated using meteorological data and a snow metamorphism model. Such an approach may face limitations related to potential divergences and accumulated errors, to limited spatial resolution, to wind- or topography-induced local modulations of the physical properties of the snow cover, etc. Exogenous data are then required in order to constrain the simulator and improve its performance over time. Synthetic-aperture radars (SARs), and recent sensors in particular, provide reflectivity maps of snow-covered environments with high temporal and spatial resolution. The radiometric properties of a snowpack measured at sufficiently high carrier frequencies are known to be tightly related to some of its main physical parameters, such as depth, snow grain size and density. SAR acquisitions may then be used, together with an electromagnetic backscattering model (EBM) able to simulate the reflectivity of a snowpack from a set of physical descriptors, to constrain a physical snowpack model. In this study, we introduce a variational data assimilation scheme coupling TerraSAR-X radiometric data with the snowpack evolution model Crocus. The physical properties of a snowpack, such as snow density and optical diameter of each layer, are simulated by Crocus, fed by the local reanalysis of meteorological data (SAFRAN) at a French Alpine location. These snowpack properties are used as inputs of an EBM based on dense media radiative transfer (DMRT) theory, which simulates the total backscattering coefficient of a dry snow medium at X and higher frequency bands. After evaluating the sensitivity of the EBM to snowpack parameters, a 1D-Var data assimilation scheme is implemented to minimize the discrepancies between EBM simulations and observations obtained from TerraSAR-X acquisitions by modifying the physical parameters of the Crocus-simulated snowpack. The algorithm then re-initializes Crocus with the modified snowpack physical parameters, allowing it to continue the simulation of snowpack evolution, with adjustments based on remote sensing information. This method is evaluated using multi-temporal TerraSAR-X images acquired over the Argentière glacier (Mont-Blanc massif, French Alps) to constrain the evolution of Crocus. Results indicate that X-band SAR data can be taken into account to modify the evolution of the snowpack simulated by Crocus.
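
    The 1D-Var step minimizes the usual background-plus-observation cost function J(x) = (x - xb)^T B^-1 (x - xb) + (y - H(x))^T R^-1 (y - H(x)). A minimal Python sketch follows; the toy observation operator stands in for the DMRT-based EBM, and all numbers (background snow density and optical diameter, error covariances, backscatter observation) are invented for illustration.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def one_dvar(xb, B, y, R, H):
        """Minimize J(x) = (x-xb)^T B^-1 (x-xb) + (y-H(x))^T R^-1 (y-H(x))."""
        Binv, Rinv = np.linalg.inv(B), np.linalg.inv(R)
        def J(x):
            dx, dy = x - xb, y - H(x)
            return dx @ Binv @ dx + dy @ Rinv @ dy
        return minimize(J, xb, method="Nelder-Mead").x

    # Toy stand-in for the electromagnetic backscatter model (the real EBM is DMRT-based)
    H = lambda x: np.array([0.5 * x[0] * x[1]])
    xb = np.array([250.0, 1.0])      # background: density [kg/m3], optical diameter [mm]
    B = np.diag([50.0 ** 2, 0.3 ** 2])
    y = np.array([130.0])            # assumed SAR-derived backscatter observation
    R = np.array([[10.0 ** 2]])
    print("analysis:", one_dvar(xb, B, y, R, H))
    ```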

  18. Impact of assimilating GOES imager clear-sky radiance with a rapid refresh assimilation system for convection-permitting forecast over Mexico

    NASA Astrophysics Data System (ADS)

    Yang, Chun; Liu, Zhiquan; Gao, Feng; Childs, Peter P.; Min, Jinzhong

    2017-05-01

    The Geostationary Operational Environmental Satellite (GOES) imager provides a continuous picture of the evolution of severe weather phenomena with its high spatial and temporal resolution. The capability to assimilate GOES imager radiances has been developed within the Weather Research and Forecasting model's data assimilation system. Compared to a benchmark experiment with no GOES imager data, the impact of assimilating GOES imager radiances on the analysis and forecast of a convective process over Mexico during 7-10 March 2016 was assessed through analysis/forecast cycling experiments using a rapid refresh assimilation system with a hybrid-3DEnVar scheme. With GOES imager radiance assimilation, better analyses were obtained in terms of the humidity, temperature, and simulated water vapor channel brightness temperature distribution. Positive forecast impacts from assimilating GOES imager radiances were seen when verified against Tropospheric Airborne Meteorological Data Reporting observations, GOES imager observations, and Mexico station precipitation data.

  19. Improvement of aerosol optical properties modeling over Eastern Asia with MODIS AOD assimilation in a global non-hydrostatic icosahedral aerosol transport model.

    PubMed

    Dai, Tie; Schutgens, Nick A J; Goto, Daisuke; Shi, Guangyu; Nakajima, Teruyuki

    2014-12-01

    A new global aerosol assimilation system adopting a more complex icosahedral grid configuration is developed. Sensitivity tests for the assimilation system are performed utilizing satellite-retrieved aerosol optical depth (AOD) from the Moderate Resolution Imaging Spectroradiometer (MODIS), and the results over Eastern Asia are analyzed. The assimilated results are validated against independent Aerosol Robotic Network (AERONET) observations. Our results reveal that the ensemble and local patch sizes have little effect on the assimilation performance, whereas the ensemble perturbation method has the largest effect. Assimilation has a significantly positive effect on the simulated AOD field, improving agreement with all 12 AERONET sites over Eastern Asia in terms of both the correlation coefficient and the root mean square difference (assimilation efficiency). Meanwhile, better agreement of the Ångström Exponent (AE) field is achieved at 8 of the 12 sites, even though only AOD is assimilated.

  20. Chromatic assimilation unaffected by perceived depth of inducing light.

    PubMed

    Shevell, Steven K; Cao, Dingcai

    2004-01-01

    Chromatic assimilation is a shift toward the color of nearby light. Several studies conclude that a neural process contributes to assimilation but the neural locus remains in question. Some studies posit a peripheral process, such as retinal receptive-field organization, while others claim the neural mechanism follows depth perception, figure/ground segregation, or perceptual grouping. The experiments here tested whether assimilation depends on a neural process that follows stereoscopic depth perception. By introducing binocular disparity, the test field judged in color was made to appear in a different depth plane than the light that induced assimilation. The chromaticity and spatial frequency of the inducing light, and the chromaticity of the test light, were varied. Chromatic assimilation was found with all inducing-light sizes and chromaticities, but the magnitude of assimilation did not depend on the perceived relative depth planes of the test and inducing fields. We found no evidence to support the view that chromatic assimilation depends on a neural process that follows binocular combination of the two eyes' signals.

  1. Joint penalized-likelihood reconstruction of time-activity curves and regions-of-interest from projection data in brain PET

    NASA Astrophysics Data System (ADS)

    Krestyannikov, E.; Tohka, J.; Ruotsalainen, U.

    2008-06-01

    This paper presents a novel statistical approach for joint estimation of regions-of-interest (ROIs) and the corresponding time-activity curves (TACs) from dynamic positron emission tomography (PET) brain projection data. It is based on optimizing the joint objective function that consists of a data log-likelihood term and two penalty terms reflecting the available a priori information about the human brain anatomy. The developed local optimization strategy iteratively updates both the ROI and TAC parameters and is guaranteed to monotonically increase the objective function. The quantitative evaluation of the algorithm is performed with numerically and Monte Carlo-simulated dynamic PET brain data of the 11C-Raclopride and 18F-FDG tracers. The results demonstrate that the method outperforms the existing sequential ROI quantification approaches in terms of accuracy, and can noticeably reduce the errors in TACs arising due to the finite spatial resolution and ROI delineation.

  2. A Looping-Based Model for Quenching Repression

    PubMed Central

    Pollak, Yaroslav; Goldberg, Sarah; Amit, Roee

    2017-01-01

    We model the regulatory role of proteins bound to looped DNA using a simulation in which dsDNA is represented as a self-avoiding chain, and proteins as spherical protrusions. We simulate long self-avoiding chains using a sequential importance sampling Monte-Carlo algorithm, and compute the probabilities for chain looping with and without a protrusion. We find that a protrusion near one of the chain’s termini reduces the probability of looping, even for chains much longer than the protrusion–chain-terminus distance. This effect increases with protrusion size, and decreases with protrusion-terminus distance. The reduced probability of looping can be explained via an eclipse-like model, which provides a novel inhibitory mechanism. We test the eclipse model on two possible transcription-factor occupancy states of the D. melanogaster eve 3/7 enhancer, and show that it provides a possible explanation for the experimentally-observed eve stripe 3 and 7 expression patterns. PMID:28085884
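
    Sequential importance sampling of self-avoiding chains is classically done with Rosenbluth weighting: the chain is grown one step at a time among unoccupied neighbor sites, and a weight corrects for the growth bias. The 2D lattice sketch below (a simplification, not the authors' 3D chain-with-protrusion model) estimates a looping probability as the weighted fraction of chains ending adjacent to the origin.

    ```python
    import numpy as np

    def rosenbluth_saw(n_steps, n_chains=2000, rng=None):
        """Grow 2D self-avoiding walks by sequential importance sampling.

        Each chain is grown step by step; the Rosenbluth weight corrects
        for the bias of only proposing unoccupied neighbor sites.
        Returns end positions and weights (trapped chains get weight 0).
        """
        rng = rng or np.random.default_rng()
        moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
        ends, weights = [], []
        for _ in range(n_chains):
            pos, occupied, w = (0, 0), {(0, 0)}, 1.0
            for _ in range(n_steps):
                free = [(pos[0] + dx, pos[1] + dy) for dx, dy in moves
                        if (pos[0] + dx, pos[1] + dy) not in occupied]
                if not free:             # chain trapped: zero weight
                    w = 0.0
                    break
                w *= len(free) / 4.0     # importance weight for this step
                pos = free[rng.integers(len(free))]
                occupied.add(pos)
            ends.append(pos)
            weights.append(w)
        return np.array(ends), np.array(weights)

    # Weighted estimate of the looping probability: end site adjacent to the
    # origin (odd chain length so that adjacency is reachable on the lattice).
    ends, w = rosenbluth_saw(21)
    near = np.abs(ends).sum(axis=1) == 1
    print("P(loop) ~", (w * near).sum() / w.sum())
    ```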

  3. Neglected chaos in international stock markets: Bayesian analysis of the joint return-volatility dynamical system

    NASA Astrophysics Data System (ADS)

    Tsionas, Mike G.; Michaelides, Panayotis G.

    2017-09-01

    We use a novel Bayesian inference procedure for the Lyapunov exponent in the dynamical system of returns and their unobserved volatility. In this dynamical system, computation of the largest Lyapunov exponent by traditional methods is impossible, as the stochastic nature of the system has to be taken into account explicitly due to the unobserved volatility. We apply the new techniques to daily stock return data for a group of six countries, namely the USA, UK, Switzerland, the Netherlands, Germany and France, from 2003 to 2014, using Sequential Monte Carlo for Bayesian inference. The evidence points in the direction that there is indeed noisy chaos both before and after the recent financial crisis. However, when a much simpler model is examined, in which the interaction between returns and volatility is not considered jointly, the hypothesis of chaotic dynamics does not receive much support from the data ("neglected chaos").

  4. Method and system for detecting polygon boundaries of structures in images as particle tracks through fields of corners and pixel gradients

    DOEpatents

    Paglieroni, David W [Pleasanton, CA; Manay, Siddharth [Livermore, CA

    2011-12-20

    A stochastic method and system for detecting polygon structures in images, by detecting a set of best matching corners of predetermined acuteness α of a polygon model from a set of similarity scores based on GDM features of corners, and tracking polygon boundaries as particle tracks using a sequential Monte Carlo approach. The tracking involves initializing polygon boundary tracking by selecting pairs of corners from the set of best matching corners to define a first side of a corresponding polygon boundary; tracking all intermediate sides of the polygon boundaries using a particle filter; and terminating polygon boundary tracking by determining the last side of the tracked polygon boundaries to close the polygon boundaries. The particle tracks are then blended to determine polygon matches, which may be made available, such as to a user, for ranking and inspection.

  5. Weighted Markov chains for forecasting and analysis in Incidence of infectious diseases in Jiangsu Province, China

    PubMed Central

    Peng, Zhihang; Bao, Changjun; Zhao, Yang; Yi, Honggang; Xia, Letian; Yu, Hao; Shen, Hongbing; Chen, Feng

    2010-01-01

    This paper first applies the sequential cluster method to set up a classification standard for infectious disease incidence states, based on the fact that the course of incidence has many uncertain characteristics. The paper then presents a weighted Markov chain, a method used to predict the future incidence state. This method takes the standardized self-correlation coefficients as weights, based on the special characteristic that infectious disease incidence is a dependent stochastic variable. It also analyzes the characteristics of infectious disease incidence via the Markov chain Monte Carlo method to optimize long-term decision making. Our method is successfully validated using existing incidence data for infectious diseases in Jiangsu Province. In summary, this paper proposes ways to improve the accuracy of the weighted Markov chain, specifically in the field of infection epidemiology. PMID:23554632

  6. Weighted Markov chains for forecasting and analysis in Incidence of infectious diseases in Jiangsu Province, China.

    PubMed

    Peng, Zhihang; Bao, Changjun; Zhao, Yang; Yi, Honggang; Xia, Letian; Yu, Hao; Shen, Hongbing; Chen, Feng

    2010-05-01

    This paper first applies the sequential cluster method to set up a classification standard for infectious disease incidence states, based on the fact that the course of incidence has many uncertain characteristics. The paper then presents a weighted Markov chain, a method used to predict the future incidence state. This method takes the standardized self-correlation coefficients as weights, based on the special characteristic that infectious disease incidence is a dependent stochastic variable. It also analyzes the characteristics of infectious disease incidence via the Markov chain Monte Carlo method to optimize long-term decision making. Our method is successfully validated using existing incidence data for infectious diseases in Jiangsu Province. In summary, this paper proposes ways to improve the accuracy of the weighted Markov chain, specifically in the field of infection epidemiology.
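
    A compact illustration of the weighted-Markov-chain idea from the two records above: lag-k transition matrices are estimated from the state series, each lag's prediction is weighted by the normalized absolute autocorrelation at that lag, and the weighted rows are combined into a forecast distribution. The sketch below is a generic reconstruction from the abstract, not the authors' code, and the toy state series is invented.

    ```python
    import numpy as np

    def weighted_markov_forecast(series, n_states, max_lag):
        """Forecast the next state as a weight-combined Markov prediction.

        series : sequence of past incidence states (integers 0..n_states-1)
        Weights are normalized absolute autocorrelation coefficients.
        """
        series = np.asarray(series)
        x = series - series.mean()
        acf = np.array([abs(np.sum(x[:-k] * x[k:]) / np.sum(x * x))
                        for k in range(1, max_lag + 1)])
        weights = acf / acf.sum()
        prob = np.zeros(n_states)
        for k, w in enumerate(weights, start=1):
            # k-step transition frequency matrix estimated from the series
            P = np.zeros((n_states, n_states))
            for i, j in zip(series[:-k], series[k:]):
                P[i, j] += 1
            P = P / np.clip(P.sum(axis=1, keepdims=True), 1, None)
            prob += w * P[series[-k]]   # condition on the state k steps back
        return prob

    states = [0, 1, 1, 2, 1, 0, 1, 2, 2, 1, 0, 1, 1, 2, 1]
    print(weighted_markov_forecast(states, n_states=3, max_lag=3))
    ```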

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mosher, J.C.; Leahy, R.M.

    A new method for source localization is described that is based on a modification of the well known multiple signal classification (MUSIC) algorithm. In classical MUSIC, the array manifold vector is projected onto an estimate of the signal subspace, but errors in the estimate can make location of multiple sources difficult. Recursively applied and projected (RAP) MUSIC uses each successively located source to form an intermediate array gain matrix, and projects both the array manifold and the signal subspace estimate into its orthogonal complement. The MUSIC projection is then performed in this reduced subspace. Using the metric of principal angles, the authors describe a general form of the RAP-MUSIC algorithm for the case of diversely polarized sources. Through a uniform linear array simulation, the authors demonstrate the improved Monte Carlo performance of RAP-MUSIC relative to MUSIC and two other sequential subspace methods, S- and IES-MUSIC.
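
    The projection at the heart of MUSIC (which RAP-MUSIC applies recursively in successively reduced subspaces) can be sketched briefly: the pseudospectrum peaks where the array steering vector is nearly orthogonal to the noise subspace. The Python example below is standard textbook MUSIC for a uniform linear array, not the patented RAP-MUSIC tracker; the two simulated source directions are arbitrary.

    ```python
    import numpy as np

    def music_spectrum(R, n_sources, angles_deg, d=0.5):
        """MUSIC pseudospectrum for a uniform linear array (spacing d in
        wavelengths); the noise subspace is spanned by the eigenvectors of
        the sample covariance R beyond the n_sources largest eigenvalues."""
        n_sensors = R.shape[0]
        _, vecs = np.linalg.eigh(R)                   # eigenvalues ascending
        En = vecs[:, :n_sensors - n_sources]          # noise-subspace basis
        out = []
        for theta in np.deg2rad(angles_deg):
            a = np.exp(2j * np.pi * d * np.arange(n_sensors) * np.sin(theta))
            out.append(1.0 / np.real(a.conj() @ (En @ En.conj().T) @ a))
        return np.array(out)

    # Two uncorrelated plane waves at -20 and 30 degrees, 8 sensors, 200 snapshots
    rng = np.random.default_rng(2)
    m, n_snap = 8, 200
    A = np.exp(2j * np.pi * 0.5 * np.outer(np.arange(m),
                                           np.sin(np.deg2rad([-20.0, 30.0]))))
    S = rng.standard_normal((2, n_snap)) + 1j * rng.standard_normal((2, n_snap))
    N = 0.1 * (rng.standard_normal((m, n_snap)) + 1j * rng.standard_normal((m, n_snap)))
    X = A @ S + N
    R = X @ X.conj().T / n_snap
    grid = np.arange(-90.0, 90.5, 0.5)
    spec = music_spectrum(R, 2, grid)
    is_peak = (spec[1:-1] > spec[:-2]) & (spec[1:-1] > spec[2:])
    peaks = grid[1:-1][is_peak]
    print("estimated DOAs:", np.sort(peaks[np.argsort(spec[1:-1][is_peak])[-2:]]))
    ```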

  8. Variational assimilation of VAS data into the MASS model

    NASA Technical Reports Server (NTRS)

    Cram, J. M.; Kaplan, M. L.

    1984-01-01

    Experiments are reported in which VAS data at 1200, 1500, and 1800 GMT 20 July 1981 were assimilated using both the adiabatic and full physics version of the Mesoscale Atmospheric Simulation System (MASS). A nonassimilation forecast is compared with forecasts assimilating temperature gradients only and forecasts assimilating both temperature and humidity gradients. The effects of successive vs single assimilations are also examined. It is noted that the greatest improvements to the forecast resulted when the VAS data resolved the mesoscale structure of the temperature and relative humidity fields. When this structure was assimilated into MASS, the ensuing simulations more clearly defined a mesoscale structure in the developing instabilities.

  9. Application of data assimilation methods for analysis and integration of observed and modeled Arctic Sea ice motions

    NASA Astrophysics Data System (ADS)

    Meier, Walter Neil

    This thesis demonstrates the applicability of data assimilation methods to improve observed and modeled ice motion fields and examines the effects of assimilated motion on Arctic processes important to the global climate and of practical concern to human activities. Ice motions derived from 85 GHz and 37 GHz SSM/I imagery and estimated from two-dimensional dynamic-thermodynamic sea ice models are compared to buoy observations. Mean error, error standard deviation, and correlation with buoys are computed for the model domain. SSM/I motions generally have a lower bias, but higher error standard deviations and lower correlation with buoys, than model motions. There are notable variations in the statistics depending on the region of the Arctic, the season, and the ice characteristics. Assimilation methods are investigated and blending and optimal interpolation strategies are implemented. Blending assimilation improves error statistics slightly, but its effect is reduced by noise in the SSM/I motions, so blending is not an effective method to improve ice motion estimates. Optimal interpolation assimilation, however, reduces motion errors by 25-30% relative to modeled motions and 40-45% relative to SSM/I motions. Optimal interpolation assimilation is beneficial in all regions, seasons and ice conditions, and is particularly effective in regimes where modeled and SSM/I errors are high. Assimilation alters annual average motion fields. Modeled ice products of ice thickness, ice divergence, Fram Strait ice volume export, transport across the Arctic and interannual basin averages are also influenced by assimilated motions. Assimilation improves estimates of pollutant transport and corrects synoptic-scale errors in the motion fields caused by incorrect forcings or errors in model physics. The portability of the optimal interpolation assimilation method is demonstrated by implementing the strategy in an ice thickness distribution (ITD) model. This research presents an innovative method of combining a new data set of SSM/I-derived ice motions with three different sea ice models via two data assimilation methods. The work described here is the first example of assimilating remotely sensed data within high-resolution, detailed dynamic-thermodynamic sea ice models. The results demonstrate that assimilation is a valuable resource for determining accurate ice motion in the Arctic.
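
    The optimal interpolation update used here follows the standard analysis equation xa = xb + K(y - Hxb) with gain K = BH^T(HBH^T + R)^-1. A minimal sketch on a toy 1-D motion field follows; the error covariances, observation locations and values are invented and bear no relation to the actual SSM/I and buoy data.

    ```python
    import numpy as np

    def oi_update(xb, y, H, B, R):
        """Optimal interpolation analysis: xa = xb + K (y - H xb),
        with gain K = B H^T (H B H^T + R)^-1."""
        K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
        return xb + K @ (y - H @ xb)

    # Toy 1-D ice-motion field: 5 grid points, observations at points 1 and 3
    n = 5
    dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    B = 0.04 * np.exp(-dist / 2.0)      # spatially correlated background errors
    H = np.zeros((2, n)); H[0, 1] = H[1, 3] = 1.0
    R = 0.01 * np.eye(2)
    xb = np.zeros(n)                    # background motion anomaly
    y = np.array([0.3, -0.2])           # invented observed anomalies
    print(oi_update(xb, y, H, B, R))    # correction spreads to unobserved points
    ```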

  10. An energy function for dynamics simulations of polypeptides in torsion angle space

    NASA Astrophysics Data System (ADS)

    Sartori, F.; Melchers, B.; Böttcher, H.; Knapp, E. W.

    1998-05-01

    Conventional simulation techniques to model the dynamics of proteins in atomic detail are restricted to short time scales. A simplified molecular description, in which high-frequency motions with small amplitudes are ignored, can overcome this problem. In this protein model only the backbone dihedrals φ and ψ and the χi of the side chains serve as degrees of freedom. Bond angles and lengths are fixed at ideal geometry values provided by the standard molecular dynamics (MD) energy function CHARMM. In this work a Monte Carlo (MC) algorithm is used whose elementary moves employ cooperative rotations in a small window of consecutive amide planes, leaving the polypeptide conformation outside this window invariant. A single such window MC move generates only local conformational changes, but the application of many such moves at different parts of the polypeptide backbone leads to global conformational changes. To account for the lack of flexibility in the protein model employed, the energy function used to evaluate conformational energies is split into sequentially neighbored and sequentially distant contributions. The sequentially neighbored part is represented by an effective (φ,ψ)-torsion potential. It is derived from MD simulations of a flexible model dipeptide using a conventional MD energy function. To avoid exaggeration of hydrogen bonding strengths, the electrostatic interactions involving hydrogen atoms are scaled down at short distances. With these adjustments of the energy function, the rigid polypeptide model exhibits the same equilibrium distributions as obtained by conventional MD simulation with a fully flexible molecular model. Also, the same temperature dependence of the stability and build-up of α helices of 18-alanine as found in MD simulations is observed using the adapted energy function for MC simulations. Analyses of transition frequencies demonstrate that dynamical aspects of MD trajectories are also faithfully reproduced. Finally, it is demonstrated that even for high-temperature unfolded polypeptides the MC simulation is more efficient than conventional MD simulations by a factor of 10.
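
    The local-move idea can be caricatured with a generic Metropolis sampler in torsion-angle space, shown below. Note the important simplification: the paper's cooperative window rotations leave the chain outside the window exactly invariant, which requires closure constraints not implemented here; the toy potential favoring one dihedral value is likewise invented.

    ```python
    import numpy as np

    def metropolis_torsion(angles, energy, beta, window, n_sweeps, rng):
        """Generic Metropolis MC in torsion-angle space using local moves:
        a few consecutive dihedrals are perturbed per trial move, so each
        accepted move changes the conformation only locally."""
        E = energy(angles)
        for _ in range(n_sweeps):
            i = rng.integers(0, len(angles) - window)
            trial = angles.copy()
            trial[i:i + window] += rng.normal(0.0, 0.2, window)
            dE = energy(trial) - E
            if dE < 0 or rng.random() < np.exp(-beta * dE):
                angles, E = trial, E + dE
        return angles, E

    # Invented toy potential favoring a helix-like dihedral value of -1 rad
    energy = lambda a: np.sum(1.0 - np.cos(a + 1.0))
    rng = np.random.default_rng(7)
    angles = rng.uniform(-np.pi, np.pi, 36)    # e.g. 18 residues x (phi, psi)
    angles, E = metropolis_torsion(angles, energy, beta=5.0, window=4,
                                   n_sweeps=20000, rng=rng)
    print("final energy per angle:", round(E / len(angles), 3))
    ```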

  11. A Module for Assimilating Hyperspectral Infrared Retrieved Profiles into the Gridpoint Statistical Interpolation System for Unique Forecasting Applications

    NASA Technical Reports Server (NTRS)

    Berndt, Emily; Zavodsky, Bradley; Srikishen, Jayanthi; Blankenship, Clay

    2015-01-01

    Hyperspectral infrared sounder radiance data are assimilated into operational modeling systems; however, the process is computationally expensive, and only approximately 1% of the available data are assimilated due to data thinning and the restriction of radiances to cloud-free fields of view. In contrast, a much larger number of hyperspectral infrared profiles can be assimilated, since the retrieved profiles are available in some partly cloudy scenes thanks to the coupling of other data, such as microwave observations or neural networks, as first guesses in the retrieval process. As the operational data assimilation community attempts to assimilate cloud-affected radiances, the use of retrieved profiles might offer an alternative methodology that is less complex and more computationally efficient. The NASA Short-term Prediction Research and Transition (SPoRT) Center has assimilated hyperspectral infrared retrieved profiles into Weather Research and Forecasting Model (WRF) simulations using the Gridpoint Statistical Interpolation (GSI) system. Early research at SPoRT demonstrated improved initial conditions when assimilating Atmospheric Infrared Sounder (AIRS) thermodynamic profiles into WRF (using WRF-Var and assigning more appropriate error weighting to the profiles) to improve regional analyses and heavy precipitation forecasts. This early work has led to more recent research utilizing WRF and GSI for applications including the assimilation of AIRS profiles to improve WRF forecasts of atmospheric rivers, and the assimilation of AIRS, Cross-track Infrared and Microwave Sounding Suite (CrIMSS), and Infrared Atmospheric Sounding Interferometer (IASI) profiles to improve model representation of tropopause folds and associated non-convective wind events. Although more hyperspectral infrared retrieved profiles can be assimilated into model forecasts, one disadvantage is that the retrieved profiles have traditionally been assigned the same error values as rawinsonde observations when assimilated with GSI. Satellite-derived profile errors are typically larger and more difficult to quantify than those of traditional rawinsonde observations (especially in the boundary layer), so it is important to assign appropriate observation errors within GSI to eliminate the spurious innovations and analysis increments that can arise when using retrieved profiles. The goal of this study is to describe modifications to the GSI source code to more appropriately assimilate hyperspectral infrared retrieved profiles, and to outline preliminary results showing the differences between a model simulation that assimilated the profiles as rawinsonde observations and one that assimilated the profiles in a dedicated module with the appropriate error values.

  12. Satellite-Scale Snow Water Equivalent Assimilation into a High-Resolution Land Surface Model

    NASA Technical Reports Server (NTRS)

    De Lannoy, Gabrielle J.M.; Reichle, Rolf H.; Houser, Paul R.; Arsenault, Kristi R.; Verhoest, Niko E.C.; Paulwels, Valentijn R.N.

    2009-01-01

    An ensemble Kalman filter (EnKF) is used in a suite of synthetic experiments to assimilate coarse-scale (25 km) snow water equivalent (SWE) observations (typical of satellite retrievals) into fine-scale (1 km) model simulations. Coarse-scale observations are assimilated directly using an observation operator that maps between the coarse and fine scales or, alternatively, after disaggregation (re-gridding) to the fine-scale model resolution prior to data assimilation. In either case, observations are assimilated either simultaneously or independently for each location. Results indicate that assimilating disaggregated fine-scale observations independently (method 1D-F1) is less efficient than assimilating a collection of neighboring disaggregated observations (method 3D-Fm). Direct assimilation of coarse-scale observations is superior to a priori disaggregation. Independent assimilation of individual coarse-scale observations (method 3D-C1) can bring the overall mean analyzed field close to the truth, but does not necessarily improve estimates of the fine-scale structure. There is a clear benefit to simultaneously assimilating multiple coarse-scale observations (method 3D-Cm) even when the entire domain is observed, indicating that underlying spatial error correlations can be exploited to improve SWE estimates. Method 3D-Cm avoids artificial transitions at the coarse observation pixel boundaries and, in this study, reduces the RMSE by 60% compared to the open loop.
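
    Direct assimilation of coarse observations hinges on an observation operator that maps the fine-scale model state to the coarse observation scale, typically by block averaging. A minimal sketch of such an operator follows, with invented dimensions standing in for the 1-km model grid and 25-km retrievals.

    ```python
    import numpy as np

    def coarse_obs_operator(swe_fine, block):
        """Observation operator mapping a fine-scale SWE field to coarse
        observation space by block averaging, so coarse retrievals can be
        assimilated directly instead of being disaggregated a priori."""
        ny, nx = swe_fine.shape
        return (swe_fine.reshape(ny // block, block, nx // block, block)
                .mean(axis=(1, 3)))

    swe = np.random.default_rng(3).gamma(2.0, 25.0, (100, 100))  # "1-km" SWE [mm]
    print(coarse_obs_operator(swe, 25).shape)                    # 4x4 "25-km" pixels
    ```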

  13. Assimilation of SBUV Version 8 Radiances into the GEOS Ozone DAS

    NASA Technical Reports Server (NTRS)

    Mueller, Martin D.; Stajner, Ivanka; Bhartia, Pawan K.

    2004-01-01

    In operational weather forecasting, the assimilation of brightness temperatures from satellite sounders, instead of the assimilation of 1D retrievals, has become increasingly common practice over the last two decades. Compared to these systems, the assimilation of trace gases is still at a relatively early stage of development, and efforts to directly assimilate radiances instead of retrieved products began only a few years ago, partially because doing so requires much more computational power due to the use of a radiative transfer forward model (FM). This paper focuses on a method to assimilate SBUV/2 radiances (albedos) into the Global Earth Observation System Ozone Data Assimilation Scheme (GEOS-O3DAS). While SBUV-type instruments cannot compete with newer sensors in terms of spectral and horizontal resolution, they feature a continuous data record back to 1978, which makes them very valuable for trend studies. Assimilation can help spread their ground coverage over the whole globe, as has previously been demonstrated with the GEOS-O3DAS using SBUV Version 6 ozone profiles. Now, the DAS has been updated to use the newly released SBUV Version 8 data. We will compare preliminary results of SBUV radiance assimilation with the assimilation of retrieved ozone profiles, discuss methods to deal with the increased computational load, and try to assess the error characteristics and future potential of the new approach.

  14. Demonstration of Both a Photosynthetic and a Nonphotosynthetic CO(2) Requirement for NH(4) Assimilation in the Green Alga Selenastrum minutum.

    PubMed

    Amory, A M; Vanlerberghe, G C; Turpin, D H

    1991-01-01

    Nitrogen-limited and nitrogen-sufficient cell cultures of Selenastrum minutum (Naeg.) Collins (Chlorophyta) were used to investigate the dependence of NH(4)(+) assimilation on exogenous CO(2). N-sufficient cells were only able to assimilate NH(4)(+) maximally in the presence of CO(2) and light. Inhibition of photosynthesis with 3-(3,4-dichlorophenyl)-1,1-dimethylurea (diuron) also inhibited NH(4)(+) assimilation. These results indicate that NH(4)(+) assimilation by N-sufficient cells exhibited a strict requirement for photosynthetic CO(2) fixation. N-limited cells assimilated NH(4)(+) both in the dark and in the light in the presence of 3-(3,4-dichlorophenyl)-1,1-dimethylurea (diuron), indicating that photosynthetic CO(2) fixation was not required for NH(4)(+) assimilation. Using CO(2) removal techniques reported previously in the literature, we were unable to demonstrate CO(2)-dependent NH(4)(+) assimilation in N-limited cells. However, employing more stringent CO(2) removal techniques we were able to show a CO(2) dependence of NH(4)(+) assimilation in both the light and dark, which was independent of photosynthesis. The results indicate two independent CO(2) requirements for NH(4)(+) assimilation. The first is as a substrate for photosynthetic CO(2) fixation, whereas the second is a nonphotosynthetic requirement, presumably as a substrate for the anaplerotic reaction catalyzed by phosphoenolpyruvate carboxylase.

  15. Demonstration of Both a Photosynthetic and a Nonphotosynthetic CO2 Requirement for NH4+ Assimilation in the Green Alga Selenastrum minutum1

    PubMed Central

    Amory, Alan M.; Vanlerberghe, Greg C.; Turpin, David H.

    1991-01-01

    Nitrogen-limited and nitrogen-sufficient cell cultures of Selenastrum minutum (Naeg.) Collins (Chlorophyta) were used to investigate the dependence of NH4+ assimilation on exogenous CO2. N-sufficient cells were only able to assimilate NH4+ maximally in the presence of CO2 and light. Inhibition of photosynthesis with 3-(3,4-dichlorophenyl)-1,1-dimethylurea (diuron) also inhibited NH4+ assimilation. These results indicate that NH4+ assimilation by N-sufficient cells exhibited a strict requirement for photosynthetic CO2 fixation. N-limited cells assimilated NH4+ both in the dark and in the light in the presence of 3-(3,4-dichlorophenyl)-1,1-dimethylurea (diuron), indicating that photosynthetic CO2 fixation was not required for NH4+ assimilation. Using CO2 removal techniques reported previously in the literature, we were unable to demonstrate CO2-dependent NH4+ assimilation in N-limited cells. However, employing more stringent CO2 removal techniques we were able to show a CO2 dependence of NH4+ assimilation in both the light and dark, which was independent of photosynthesis. The results indicate two independent CO2 requirements for NH4+ assimilation. The first is as a substrate for photosynthetic CO2 fixation, whereas the second is a nonphotosynthetic requirement, presumably as a substrate for the anaplerotic reaction catalyzed by phosphoenolpyruvate carboxylase. PMID:16667950

  16. Changes in symptom intensity and emotion valence during the process of assimilation of a problematic experience: A quantitative study of a good outcome case of cognitive-behavioral therapy.

    PubMed

    Basto, Isabel; Pinheiro, Patrícia; Stiles, William B; Rijo, Daniel; Salgado, João

    2017-07-01

    The assimilation model describes the change process in psychotherapy. In this study we analyzed the relation of assimilation with changes in symptom intensity, measured session by session, and changes in emotional valence, measured for each emotional episode, in the case of a 33-year-old woman treated for depression with cognitive-behavioral therapy. Results showed the theoretically expected negative relation between assimilation of the client's main concerns and symptom intensity, and the relation between assimilation levels and emotional valence corresponded closely to the assimilation model's theoretical feelings curve. The results show how emotions work as markers of the client's current assimilation level, which could help the therapist adjust the intervention, moment by moment, to the client's needs.

  17. A Global Carbon Assimilation System using a modified EnKF assimilation method

    NASA Astrophysics Data System (ADS)

    Zhang, S.; Zheng, X.; Chen, Z.; Dan, B.; Chen, J. M.; Yi, X.; Wang, L.; Wu, G.

    2014-10-01

    A Global Carbon Assimilation System based on the ensemble Kalman filter (GCAS-EK) is developed for assimilating atmospheric CO2 abundance data into an ecosystem model to simultaneously estimate surface carbon fluxes and the atmospheric CO2 distribution. This assimilation approach is based on the ensemble Kalman filter (EnKF), but with several new developments, including the use of analysis states to iteratively estimate ensemble forecast errors, and maximum likelihood estimation of the inflation factors of the forecast and observation errors. The proposed approach is tested in observing system simulation experiments and then used to estimate terrestrial ecosystem carbon fluxes and atmospheric CO2 distributions from 2002 to 2008. The results show that this assimilation approach can effectively reduce the biases and uncertainties of the carbon fluxes simulated by the ecosystem model.
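
    The inflation estimation mentioned above can be illustrated with a simple moment-based variant: the innovation statistics satisfy E[d^T d] = lambda tr(HP^fH^T) + tr(R), which can be solved for the inflation factor lambda. The sketch below is a simplified stand-in for the maximum likelihood estimator described in the abstract; all numbers are synthetic.

    ```python
    import numpy as np

    def estimate_inflation(innovations, HPfHt, R):
        """Moment-based estimate of a scalar inflation factor, solving
        E[d^T d] = lambda * tr(H Pf H^T) + tr(R) for lambda.

        innovations : (n_obs,) y - H(xb_mean)
        HPfHt : (n_obs, n_obs) forecast-error covariance in observation space
        """
        d2 = innovations @ innovations
        lam = (d2 - np.trace(R)) / np.trace(HPfHt)
        return max(lam, 1.0)     # never deflate below 1

    rng = np.random.default_rng(4)
    n = 50
    HPfHt = 0.5 * np.eye(n)      # under-dispersive ensemble spread
    R = 0.2 * np.eye(n)
    d = rng.normal(scale=np.sqrt(1.0 + 0.2), size=n)   # true forecast var is 1.0
    print("estimated inflation:", estimate_inflation(d, HPfHt, R))  # ~2.0 expected
    ```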

  18. The Global Structure of UTLS Ozone in GEOS-5: A Multi-Year Assimilation of EOS Aura Data

    NASA Technical Reports Server (NTRS)

    Wargan, Krzysztof; Pawson, Steven; Olsen, Mark A.; Witte, Jacquelyn C.; Douglass, Anne R.; Ziemke, Jerald R.; Strahan, Susan E.; Nielsen, J. Eric

    2015-01-01

    Eight years of ozone measurements retrieved from the Ozone Monitoring Instrument (OMI) and the Microwave Limb Sounder, both on the EOS Aura satellite, have been assimilated into the Goddard Earth Observing System version 5 (GEOS-5) data assimilation system. This study thoroughly evaluates this assimilated product, highlighting its potential for science. The impact of observations on the GEOS-5 system is explored by examining the spatial distribution of the observation-minus-forecast statistics. Independent data are used for product validation. The correlation coefficient of the lower-stratospheric ozone column with ozonesondes is 0.99 and the bias is 0.5%, indicating the success of the assimilation in reproducing the ozone variability in that layer. The upper-tropospheric assimilated ozone column is about 10% lower than the ozonesonde column but the correlation is still high (0.87). The assimilation is shown to realistically capture the sharp cross-tropopause gradient in ozone mixing ratio. Occurrence of transport-driven low ozone laminae in the assimilation system is similar to that obtained from the High Resolution Dynamics Limb Sounder (HIRDLS) above the 400 K potential temperature surface but the assimilation produces fewer laminae than seen by HIRDLS below that surface. Although the assimilation produces 5 - 8 fewer occurrences per day (up to approximately 20%) during the three years of HIRDLS data, the interannual variability is captured correctly. This data-driven assimilated product is complementary to ozone fields generated from chemistry and transport models. Applications include study of the radiative forcing by ozone and tracer transport near the tropopause.

  19. Culture Training: Validation Evidence for the Culture Assimilator.

    ERIC Educational Resources Information Center

    Mitchell, Terence R.; And Others

    The culture assimilator, a programed self-instructional approach to culture training, is described and a series of laboratory experiments and field studies validating the culture assimilator are reviewed. These studies show that the culture assimilator is an effective method of decreasing some of the stress experienced when one works with people…

  20. Improving 7-Day Forecast Skill by Assimilation of Retrieved AIRS Temperature Profiles

    NASA Technical Reports Server (NTRS)

    Susskind, Joel; Rosenberg, Bob

    2016-01-01

    We conducted a new set of data assimilation experiments covering the period January 1 to February 29, 2016 using the GEOS-5 DAS. Our experiments assimilate all data used operationally by GMAO (Control), with some modifications. Significant improvement in Global and Southern Hemisphere extra-tropical 7-day forecast skill was obtained when we assimilated AIRS quality-controlled temperature profiles in place of observed AIRS radiances, and also did not assimilate CrIS/ATMS radiances, radiosonde temperature profiles, or aircraft temperatures. This new methodology neither improved nor degraded 7-day Northern Hemisphere extra-tropical forecast skill. We are conducting experiments aimed at further improving Northern Hemisphere extra-tropical forecast skill.

  1. An iterative particle filter approach for coupled hydro-geophysical inversion of a controlled infiltration experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manoli, Gabriele, E-mail: manoli@dmsa.unipd.it; Nicholas School of the Environment, Duke University, Durham, NC 27708; Rossi, Matteo

    The modeling of unsaturated groundwater flow is affected by a high degree of uncertainty related to both measurement and model errors. Geophysical methods such as Electrical Resistivity Tomography (ERT) can provide useful indirect information on the hydrological processes occurring in the vadose zone. In this paper, we propose and test an iterated particle filter method to solve the coupled hydrogeophysical inverse problem. We focus on an infiltration test monitored by time-lapse ERT and modeled using Richards equation. The goal is to identify hydrological model parameters from ERT electrical potential measurements. Traditional uncoupled inversion relies on the solution of two sequential inverse problems, the first one applied to the ERT measurements, the second one to Richards equation. This approach does not ensure an accurate quantitative description of the physical state, typically violating mass balance. To avoid one of these two inversions and incorporate in the process more physical simulation constraints, we cast the problem within the framework of a SIR (Sequential Importance Resampling) data assimilation approach that uses a Richards equation solver to model the hydrological dynamics and a forward ERT simulator combined with Archie's law to serve as measurement model. ERT observations are then used to update the state of the system as well as to estimate the model parameters and their posterior distribution. The limitations of the traditional sequential Bayesian approach are investigated and an innovative iterative approach is proposed to estimate the model parameters with high accuracy. The numerical properties of the developed algorithm are verified on both homogeneous and heterogeneous synthetic test cases based on a real-world field experiment.
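
    The SIR core of the approach is easy to sketch: propagate particles through the forward model, reweight by the observation likelihood, and resample. The toy Python example below estimates a single dynamics parameter from indirect linear observations; it is a schematic analogue of the Richards/ERT coupling, with the forward and measurement models reduced to one-liners and all values invented.

    ```python
    import numpy as np

    def sir_step(particles, weights, y_obs, forward_model, obs_model, obs_std, rng):
        """One SIR cycle: propagate -> reweight by the observation
        likelihood -> systematically resample to avoid weight degeneracy."""
        particles = np.array([forward_model(p, rng) for p in particles])
        lik = np.exp(-0.5 * ((y_obs - np.array([obs_model(p) for p in particles]))
                             / obs_std) ** 2)
        weights = weights * lik
        weights /= weights.sum()
        n = len(particles)
        cum = np.cumsum(weights)
        u = (rng.random() + np.arange(n)) / n
        idx = np.searchsorted(cum, u)
        return particles[idx], np.full(n, 1.0 / n)

    # Toy analogue of the coupled problem: estimate a parameter K controlling
    # the dynamics x_{t+1} = x_t + K (plus noise); "ERT-like" obs y = 2 * x.
    rng = np.random.default_rng(5)
    true_K = 0.3
    particles = np.column_stack([np.zeros(500), rng.uniform(0, 1, 500)])  # [x, K]
    weights = np.full(500, 1 / 500)
    fwd = lambda p, r: np.array([p[0] + p[1] + 0.01 * r.standard_normal(), p[1]])
    for t in range(1, 11):
        y = 2.0 * (true_K * t) + 0.05 * rng.standard_normal()
        particles, weights = sir_step(particles, weights, y, fwd,
                                      lambda p: 2.0 * p[0], 0.05, rng)
    print("posterior K mean:", particles[:, 1].mean())   # should approach 0.3
    ```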

  2. Investigation of flow and transport processes at the MADE site using ensemble Kalman filter

    USGS Publications Warehouse

    Liu, Gaisheng; Chen, Y.; Zhang, Dongxiao

    2008-01-01

    In this work the ensemble Kalman filter (EnKF) is applied to investigate the flow and transport processes at the macro-dispersion experiment (MADE) site in Columbus, MS. The EnKF is a sequential data assimilation approach that adjusts the unknown model parameter values over time based on the observed data. The classic advection-dispersion (AD) and dual-domain mass transfer (DDMT) models are employed to analyze the tritium plume during the second MADE tracer experiment. The hydraulic conductivity (K) and longitudinal dispersivity in the AD model, and the mass transfer rate coefficient and mobile porosity ratio in the DDMT model, are estimated in this investigation. Because of its sequential nature, the EnKF allows for the temporal scaling of transport parameters during the tritium concentration analysis. Inverse simulation results indicate that for the AD model to reproduce the extensive spatial spreading of the tritium observed in the field, the K in the downgradient area needs to be increased significantly: the estimated K in the AD model becomes an order of magnitude higher than the in situ flowmeter measurements over a large portion of the medium. On the other hand, the DDMT model gives an estimate of K that is much more comparable with the flowmeter values. In addition, the concentrations simulated by the DDMT model show better agreement with the observed values. The root mean square (RMS) difference between the observed and simulated tritium plumes at 328 days is 0.77 for the AD model and 0.45 for the DDMT model. Unlike the AD model, which gives inconsistent K estimates at different times, the DDMT model is able to invert K values that consistently reproduce the observed tritium concentrations through all times.

  3. Pushing the Frontier of Data-Oriented Geodynamic Modeling: from Qualitative to Quantitative to Predictive

    NASA Astrophysics Data System (ADS)

    Liu, L.; Hu, J.; Zhou, Q.

    2016-12-01

    The rapid accumulation of geophysical and geological data sets poses an increasing demand for the development of geodynamic models to better understand the evolution of the solid Earth. Consequently, the earlier qualitative physical models are no longer satisfactory. Recent efforts focus on more quantitative simulations and more efficient numerical algorithms. Among these, a particular line of research is the implementation of data-oriented geodynamic modeling, with the purpose of building an observationally consistent and physically correct geodynamic framework. Such models can often catalyze new insights into the functioning mechanisms of various aspects of plate tectonics, and their predictive nature can also guide future research in a deterministic fashion. Over the years, we have been constructing large-scale geodynamic models with both sequential and variational data assimilation techniques. These models act as a bridge between different observational records, and the superposition of the constraining power of different data sets helps reveal unknown processes and mechanisms of the dynamics of the mantle and lithosphere. We simulate the post-Cretaceous subduction history in South America using a forward (sequential) approach. The model is constrained using past subduction history, seafloor age evolution, the tectonic architecture of continents, and present-day geophysical observations. Our results quantify the various driving forces shaping the present South American flat slabs, which we find are all internally torn. The 3-D geometry of these torn slabs further explains the abnormal seismicity pattern and enigmatic volcanic history. An inverse (variational) model simulating late Cenozoic western U.S. mantle dynamics with similar constraints reveals a mechanism for the formation of Yellowstone-related volcanism that differs from the traditional understanding. Furthermore, important insights into mantle density and viscosity structures also emerge from these models.

  4. Overview of Chinese GRAPES Data Assimilation System

    NASA Astrophysics Data System (ADS)

    Liu, Yan

    2017-04-01

    The development of the data assimilation system of the Global and Regional Assimilation and Prediction System (GRAPES), China's new-generation operational numerical weather prediction system completed in recent years, is reviewed in this paper, including its design and main characteristics. GRAPES adopts the variational approach, with emphasis on the application of various remote sensing observations. Its development path runs from three-dimensional to four-dimensional assimilation, and it may be implemented in limited-area or global configurations. The three-dimensional variational data assimilation systems have been operational in the national and a few regional meteorological centers. The global four-dimensional assimilation system is in pre-operational experiments and will be upgraded. After a brief introduction to the GRAPES data assimilation system, results of a series of validations of GRAPES analyses against observations and against analyses from other operational NWP centers are presented to assess its performance.

  5. Benefits and Pitfalls of GRACE Terrestrial Water Storage Data Assimilation

    NASA Technical Reports Server (NTRS)

    Girotto, Manuela

    2018-01-01

    Satellite observations of terrestrial water storage (TWS) from the Gravity Recovery and Climate Experiment (GRACE) mission have a coarse resolution in time (monthly) and space (roughly 150,000 sq km at midlatitudes) and vertically integrate all water storage components over land, including soil moisture and groundwater. Nonetheless, data assimilation can be used to horizontally downscale and vertically partition GRACE-TWS observations. This presentation illustrates some of the benefits and drawbacks of assimilating TWS observations from GRACE into a land surface model over the continental United States and India. The assimilation scheme yields improved skill metrics for groundwater compared to the no-assimilation simulations. A smaller impact is seen for surface and root-zone soil moisture. Further, GRACE observes TWS depletion associated with anthropogenic groundwater extraction. Results from the assimilation emphasize the importance of representing anthropogenic processes in land surface modeling and data assimilation systems.

  6. The coupling of high-speed high resolution experimental data and LES through data assimilation techniques

    NASA Astrophysics Data System (ADS)

    Harris, S.; Labahn, J. W.; Frank, J. H.; Ihme, M.

    2017-11-01

    Data assimilation techniques can be integrated with time-resolved numerical simulations to improve predictions of transient phenomena. In this study, optimal interpolation and nudging are employed to assimilate high-speed, high-resolution measurements of an inert jet into high-fidelity large-eddy simulations. This experimental data set was chosen because it provides both high spatial and high temporal resolution for the three-component velocity field in the shear layer of the jet. Our first objective is to investigate the impact that data assimilation has on the resulting flow field for this inert jet. This is accomplished by determining the region influenced by the data assimilation and the corresponding effect on the instantaneous flow structures. The second objective is to determine optimal weightings for the two data assimilation techniques. The third objective is to investigate how the frequency at which the data are assimilated affects the overall predictions.
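
    Nudging in this context adds a relaxation term proportional to the observation-minus-model misfit to the model tendency, dx/dt = f(x) + G H^T (y - Hx), where the gain G plays the role of the weighting being tuned. A toy sketch follows, with invented dynamics, gain, and observations.

    ```python
    import numpy as np

    def nudge(x, obs, H, G, dt, tendency):
        """One nudged time step: dx/dt = f(x) + G * H^T (obs - H x).
        G controls how strongly the state is relaxed toward the data."""
        return x + dt * (tendency(x) + G * H.T @ (obs - H @ x))

    n = 10
    H = np.eye(3, n)                  # observe the first 3 velocity points
    tendency = lambda x: -0.5 * x     # toy decaying dynamics
    x = np.ones(n)
    for _ in range(100):
        x = nudge(x, np.array([2.0, 2.0, 2.0]), H, G=5.0, dt=0.01,
                  tendency=tendency)
    print(x[:4])                      # observed points are pulled toward the data
    ```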

  7. Assimilation of NUCAPS Retrieved Profiles in GSI for Unique Forecasting Applications

    NASA Technical Reports Server (NTRS)

    Berndt, Emily Beth; Zavodsky, Bradley; Srikishen, Jayanthi; Blankenship, Clay

    2015-01-01

    Hyperspectral IR profiles can be assimilated in GSI as an observation type separate from radiosondes, with only changes to tables in the fix directory. Assimilation of the profiles does change the analysis fields, as evidenced by: innovations larger than +/-2.0 K, which mark where individual profiles impact the final temperature analysis; an updated temperature analysis that is colder behind the cold front and warmer in the warm sector; and an updated moisture analysis that is modified mostly at low levels and tends to be drier than the original model background. Analysis of model output shows that differences relative to 13-km RAP analyses are smaller when profiles are assimilated with NUCAPS errors, but CAPE is under-forecast when assimilating NUCAPS profiles, which could be problematic for severe weather forecasting. Refining the assimilation technique to incorporate an error covariance matrix and creating a separate GSI module to assimilate satellite profiles may improve results.

  8. Accelerating assimilation development for new observing systems using EFSO

    NASA Astrophysics Data System (ADS)

    Lien, Guo-Yuan; Hotta, Daisuke; Kalnay, Eugenia; Miyoshi, Takemasa; Chen, Tse-Chun

    2018-03-01

    To successfully assimilate data from a new observing system, it is necessary to develop appropriate data selection strategies, assimilating only the generally useful data. This development work is usually done by trial and error using observing system experiments (OSEs), which are very time and resource consuming. This study proposes a new, efficient methodology to accelerate the development using ensemble forecast sensitivity to observations (EFSO). First, non-cycled assimilation of the new observation data is conducted to compute EFSO diagnostics for each observation within a large sample. Second, the average EFSO conditionally sampled in terms of various factors is computed. Third, potential data selection criteria are designed based on the non-cycled EFSO statistics, and tested in cycled OSEs to verify the actual assimilation impact. The usefulness of this method is demonstrated with the assimilation of satellite precipitation data. It is shown that the EFSO-based method can efficiently suggest data selection criteria that significantly improve the assimilation results.
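
    For reference, EFSO estimates each observation's contribution to the change in forecast error between the forecast initialized from the analysis and the one initialized from the prior background; schematically, in the ensemble formulation of Kalnay et al. (2012), with K members, innovation δy, analysis perturbations in observation space Y^a, forecast perturbations X^f, error norm C, and mean forecast errors ē valid at time t:

      \Delta e^2 \approx \frac{1}{K-1}\, \delta y^{\mathsf T} R^{-1} Y^{a}
        \big(X^{f}\big)^{\mathsf T} C \,\big(\bar e_{t|0} + \bar e_{t|-6}\big)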

  9. Thermospheric Data Assimilation

    DTIC Science & Technology

    2016-05-05

    forecasting longer than 3 days. Furthermore, validation of assimilation analyses with independent CHAMP mass density observations confirms that the ... approach developed in this project. Subject terms: data assimilation, ensemble forecasting, thermosphere-ionosphere coupled data assimilation, neutral mass density specification and forecasting.

  10. On the role of perception in shaping phonological assimilation rules.

    PubMed

    Hura, S L; Lindblom, B; Diehl, R L

    1992-01-01

    Assimilation of nasals to the place of articulation of following consonants is a common and natural process among the world's languages. Recent phonological theory attributes this naturalness to the postulated geometry of articulatory features and the notion of spreading (McCarthy, 1988). Others view assimilation as a result of perception (Ohala, 1990), or as perceptually tolerated articulatory simplification (Kohler, 1990). Kohler notes that certain consonant classes (such as nasals and stops) are more likely than other classes (such as fricatives) to undergo place assimilation to a following consonant. To explain this pattern, he proposes that assimilation tends not to occur when the members of a consonant class are relatively distinctive perceptually, such that their articulatory reduction would be particularly salient. This explanation, of course, presupposes that the stops and nasals which undergo place assimilation are less distinctive than fricatives, which tend not to assimilate. We report experimental results that confirm Kohler's perceptual assumption: In the context of a following word initial stop, fricatives were less confusable than nasals or unreleased stops. We conclude, in agreement with Ohala and Kohler, that perceptual factors are likely to shape phonological assimilation rules.

  11. Skill Assessment in Ocean Biological Data Assimilation

    NASA Technical Reports Server (NTRS)

    Gregg, Watson W.; Friedrichs, Marjorie A. M.; Robinson, Allan R.; Rose, Kenneth A.; Schlitzer, Reiner; Thompson, Keith R.; Doney, Scott C.

    2008-01-01

    There is growing recognition that rigorous skill assessment is required to understand the ability of ocean biological models to represent ocean processes and distributions. Statistical analysis of model results with observations represents the most quantitative form of skill assessment, and this principle serves as well for data assimilation models. However, skill assessment for data assimilation requires special consideration. This is because there are three sets of information: the free-run model, the data, and the assimilation model, which uses information from both the free-run model and the data. Intercomparison of results among the three sets of information is important and useful for assessment, but is not conclusive since the three information sets are intertwined. An independent data set is necessary for an objective determination. Other useful measures of ocean biological data assimilation assessment include responses of unassimilated variables to the data assimilation, performance outside the prescribed region/time of interest, forecasting, and trend analysis. Examples of each approach from the literature are provided. A comprehensive list of ocean biological data assimilation efforts and their applications of skill assessment, in both ecosystem/biogeochemical and fisheries work, is summarized.
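
    As a concrete illustration of the statistical approach advocated here, the basic metrics can be computed for the free-run and assimilation models alike against an independent (unassimilated) data set; a minimal sketch in Python with made-up numbers:

      import numpy as np

      def skill(model, obs):
          """Basic univariate skill metrics of model output against observations."""
          bias = np.mean(model - obs)
          rmse = np.sqrt(np.mean((model - obs) ** 2))
          corr = np.corrcoef(model, obs)[0, 1]
          return bias, rmse, corr

      obs_independent = np.array([0.8, 1.1, 1.4, 0.9, 1.3])   # withheld observations
      free_run        = np.array([0.5, 1.0, 1.8, 0.7, 1.6])
      assim_run       = np.array([0.7, 1.1, 1.5, 0.9, 1.4])

      for name, run in [("free-run", free_run), ("assimilation", assim_run)]:
          print(name, "bias=%.3f rmse=%.3f r=%.3f" % skill(run, obs_independent))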

  12. Assimilation for skin SST in the NASA GEOS atmospheric data assimilation system.

    PubMed

    Akella, Santha; Todling, Ricardo; Suarez, Max

    2017-01-01

    The present article describes the sea surface temperature (SST) developments implemented in the Goddard Earth Observing System, Version 5 (GEOS-5) Atmospheric Data Assimilation System (ADAS). These are enhancements that contribute to the development of an atmosphere-ocean coupled data assimilation system using GEOS. In the current quasi-operational GEOS-ADAS, the SST is a boundary condition prescribed based on the OSTIA product; therefore SST and skin SST (Ts) are identical. This work modifies the GEOS-ADAS Ts by modeling and assimilating near sea surface sensitive satellite infrared (IR) observations. The atmosphere-ocean interface layer of the GEOS atmospheric general circulation model (AGCM) is updated to include near surface diurnal warming and cool-skin effects. The GEOS analysis system is also updated to directly assimilate SST-relevant Advanced Very High Resolution Radiometer (AVHRR) radiance observations. Data assimilation experiments designed to evaluate the Ts modification in GEOS-ADAS show improvements in the assimilation of radiance observations that extend beyond the thermal IR bands of AVHRR. In particular, many channels of hyperspectral sensors, such as those of the Atmospheric Infrared Sounder (AIRS) and the Infrared Atmospheric Sounding Interferometer (IASI), are also better assimilated. We also obtained an improved fit to withheld, in-situ buoy measurements of near-surface SST. Evaluation of forecast skill scores shows marginal to neutral benefit from the modified Ts.

  13. Satellite radiance data assimilation for binary tropical cyclone cases over the western North Pacific

    NASA Astrophysics Data System (ADS)

    Choi, Yonghan; Cha, Dong-Hyun; Lee, Myong-In; Kim, Joowan; Jin, Chun-Sil; Park, Sang-Hun; Joh, Min-Su

    2017-06-01

    A total of three binary tropical cyclone (TC) cases over the western North Pacific are selected to investigate the effects of satellite radiance data assimilation on analyses and forecasts of binary TCs. Two parallel cycling experiments with a 6 h interval are performed for each binary TC case; the difference between the two experiments is whether satellite radiance observations are assimilated. Satellite radiance observations are assimilated using the Weather Research and Forecasting Data Assimilation (WRFDA) three-dimensional variational (3D-Var) system, which includes the observation operator, quality control procedures, and bias correction algorithm for radiance observations. On average, radiance assimilation results in slight improvements of environmental fields and track forecasts of the binary TC cases, but the detailed effects vary from case to case. When there is no direct interaction between the binary TCs, radiance assimilation leads to better depictions of environmental fields and in turn to improved track forecasts. However, the positive effects of radiance assimilation on track forecasts can be reduced when a direct interaction exists between the binary TCs and their intensities/structures are not well represented. An initialization method (e.g., dynamic initialization) combined with radiance assimilation and/or more advanced DA techniques (e.g., a hybrid method) can be considered to overcome these limitations.
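
    The 3D-Var analysis used here minimizes the standard cost function, weighing departures from the background state x_b against departures from the observations y by the background and observation error covariances B and R, with H the (nonlinear) observation operator that for radiance data includes the radiative transfer model:

      J(x) = \tfrac{1}{2}\,(x - x_b)^{\mathsf T} B^{-1} (x - x_b)
           + \tfrac{1}{2}\,\big(y - H(x)\big)^{\mathsf T} R^{-1} \big(y - H(x)\big)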

  14. Air Quality Modeling Using the NASA GEOS-5 Multispecies Data Assimilation System

    NASA Technical Reports Server (NTRS)

    Keller, Christoph A.; Pawson, Steven; Wargan, Krzysztof; Weir, Brad

    2018-01-01

    The NASA Goddard Earth Observing System (GEOS) data assimilation system (DAS) has been expanded to include chemically reactive tropospheric trace gases including ozone (O3), nitrogen dioxide (NO2), and carbon monoxide (CO). This system combines model analyses from the GEOS-5 model with detailed atmospheric chemistry and observations from MLS (O3), OMI (O3 and NO2), and MOPITT (CO). We show results from a variety of assimilation test experiments, highlighting improvements in the representation of model species concentrations by up to 50% compared to an assimilation-free control experiment. Taking into account the rapid chemical cycling of NO2 when applying the assimilation increments greatly improves assimilation skill for NO2 and provides large benefits for model concentrations near the surface. Analysis of the geospatial distribution of the assimilation increments suggests that the free-running model overestimates biomass burning emissions but underestimates lightning NOx emissions by 5-20%. We discuss the capability of the chemical data assimilation system to improve atmospheric composition forecasts through improved initial value and boundary condition inputs, particularly during air pollution events. We find that the current assimilation system meaningfully improves short-term forecasts (1-3 days). For longer-term forecasts, more emphasis on updating the emissions instead of initial concentration fields is needed.

  15. Assimilation for Skin SST in the NASA GEOS Atmospheric Data Assimilation System

    NASA Technical Reports Server (NTRS)

    Akella, Santha; Todling, Ricardo; Suarez, Max

    2017-01-01

    The present article describes the sea surface temperature (SST) developments implemented in the Goddard Earth Observing System, Version 5 (GEOS) Atmospheric Data Assimilation System (ADAS). These are enhancements that contribute to the development of an atmosphere-ocean coupled data assimilation system using GEOS. In the current quasi-operational GEOS-ADAS, the SST is a boundary condition prescribed based on the OSTIA product; therefore SST and skin SST (Ts) are identical. This work modifies the GEOS-ADAS Ts by modelling and assimilating near sea surface sensitive satellite infrared (IR) observations. The atmosphere-ocean interface layer of the GEOS atmospheric general circulation model (AGCM) is updated to include near-surface diurnal warming and cool-skin effects. The GEOS analysis system is also updated to directly assimilate SST-relevant Advanced Very High Resolution Radiometer (AVHRR) radiance observations. Data assimilation experiments designed to evaluate the Ts modification in GEOS-ADAS show improvements in the assimilation of radiance observations that extend beyond the thermal infrared bands of AVHRR. In particular, many channels of hyperspectral sensors, such as those of the Atmospheric Infrared Sounder (AIRS) and the Infrared Atmospheric Sounding Interferometer (IASI), are also better assimilated. We also obtained an improved fit to withheld in-situ buoy measurements of near-surface SST. Evaluation of forecast skill scores shows neutral to marginal benefit from the modified Ts.

  16. Blending geological observations and convection models to reconstruct mantle dynamics

    NASA Astrophysics Data System (ADS)

    Coltice, Nicolas; Bocher, Marie; Fournier, Alexandre; Tackley, Paul

    2015-04-01

    Knowledge of the state of the Earth's mantle and its temporal evolution is fundamental to a variety of disciplines in the Earth Sciences, from internal dynamics to its many expressions in the geological record (postglacial rebound, sea level change, ore deposits, tectonics or geomagnetic reversals). Mantle convection theory is the centerpiece for unraveling the present and past state of the mantle. For the past 40 years considerable efforts have been made to improve the quality of numerical models of mantle convection. However, they are still sparsely used to estimate the convective history of the solid Earth, in comparison to ocean or atmospheric models for weather and climate prediction. The main shortcoming is their inability to produce Earth-like seafloor spreading and continental drift self-consistently. Recent convection models have begun to predict these processes successfully. Such a breakthrough opens the opportunity to retrieve the recent dynamics of the Earth's mantle by blending convection models with advanced geological datasets. A proof of concept will be presented, consisting of a synthetic test based on a sequential data assimilation methodology.

  17. Biomass characteristics and simultaneous nitrification-denitrification under long sludge retention time in an integrated reactor treating rural domestic sewage.

    PubMed

    Gong, Lingxiao; Jun, Li; Yang, Qing; Wang, Shuying; Ma, Bin; Peng, Yongzhen

    2012-09-01

    In this work, a novel integrated reactor incorporating an anoxic fixed-bed biofilm reactor (FBBR), an oxic moving-bed biofilm reactor (MBBR) and a settler in sequence was proposed for nitrogen removal from rural domestic sewage. To achieve high efficiency, low costs and easy maintenance, biomass characteristics and simultaneous nitrification-denitrification (SND) were investigated under long sludge retention time during a 149-day period. The results showed that enhanced SND, with proportions of 37.7-42.2%, tapped the reactor's potential for both efficiency and economy, despite a C/N ratio of 2.5-4.0 in the influent. TN removal averaged at least 69.3%, even with an internal recycling ratio of 200% and a small proportion of biomass assimilation (<3%). Consequently, a lower internal recycle and intermittent waste sludge discharge were feasible to save costs, together with elimination of sludge return and anoxic stirring. Furthermore, biomass with low observed heterotrophic yields (0.053 ± 0.035 g VSS/g COD) and a low VSS/TSS ratio (<0.55) in the MBBR simplified waste sludge disposal.

  18. Assimilation of Global Radar Backscatter and Radiometer Brightness Temperature Observations to Improve Soil Moisture and Land Evaporation Estimates

    NASA Technical Reports Server (NTRS)

    Lievens, H.; Martens, B.; Verhoest, N. E. C.; Hahn, S.; Reichle, R. H.; Miralles, D. G.

    2017-01-01

    Active radar backscatter (σ°) observations from the Advanced Scatterometer (ASCAT) and passive radiometer brightness temperature (TB) observations from the Soil Moisture Ocean Salinity (SMOS) mission are assimilated either individually or jointly into the Global Land Evaporation Amsterdam Model (GLEAM) to improve its simulations of soil moisture and land evaporation. To enable σ° and TB assimilation, GLEAM is coupled to the Water Cloud Model and the L-band Microwave Emission from the Biosphere (L-MEB) model. The innovations, i.e. differences between observations and simulations, are mapped onto the model soil moisture states through an Ensemble Kalman Filter. The validation of surface (0-10 cm) soil moisture simulations over the period 2010-2014 against in situ measurements from the International Soil Moisture Network (ISMN) shows that assimilating σ° or TB alone improves the average correlation of seasonal anomalies (Ran) from 0.514 to 0.547 and 0.548, respectively. The joint assimilation further improves Ran to 0.559. Associated enhancements in daily evaporative flux simulations by GLEAM are validated based on measurements from 22 FLUXNET stations. Again, the singular assimilation improves Ran from 0.502 to 0.536 and 0.533, respectively for σ° and TB, whereas the best performance is observed for the joint assimilation (Ran = 0.546). These results demonstrate the complementary value of assimilating radar backscatter observations together with brightness temperatures for improving estimates of hydrological variables, as their joint assimilation outperforms the assimilation of each observation type separately.
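
    The innovation mapping referred to above is the standard ensemble Kalman filter update, with the background error covariance P^b estimated from the ensemble and h the observation operator (here the coupled Water Cloud / L-MEB forward models):

      x^a = x^b + K\,\big(y - h(x^b)\big), \qquad
      K = P^b H^{\mathsf T}\,\big(H P^b H^{\mathsf T} + R\big)^{-1}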

  19. Assimilation of granite by basaltic magma at Burnt Lava flow, Medicine Lake volcano, northern California: Decoupling of heat and mass transfer

    USGS Publications Warehouse

    Grove, T.L.; Kinzler, R.J.; Baker, M.B.; Donnelly-Nolan, J. M.; Lesher, C.E.

    1988-01-01

    At Medicine Lake volcano, California, andesite of the Holocene Burnt Lava flow has been produced by fractional crystallization of parental high alumina basalt (HAB) accompanied by assimilation of granitic crustal material. Burnt Lava contains inclusions of quenched HAB liquid, a potential parent magma of the andesite, highly melted granitic crustal xenoliths, and xenocryst assemblages which provide a record of the fractional crystallization and crustal assimilation process. Samples of granitic crustal material occur as xenoliths in other Holocene and Pleistocene lavas, and these xenoliths are used to constrain geochemical models of the assimilation process. A large amount of assimilation accompanied fractional crystallization to produce the contaminated Burnt Lava andesites. Models which assume that assimilation and fractionation occurred simultaneously estimate the ratio of assimilation to fractional crystallization (R) to be >1, and the best fit to all geochemical data is at an R value of 1.35 at F=0.68. Petrologic evidence, however, indicates that the assimilation process did not involve continuous addition of granitic crust as fractionation occurred. Instead, heat and mass transfer were separated in space and time. During the assimilation process, HAB magma underwent large amounts of fractional crystallization which was not accompanied by significant amounts of assimilation. This fractionation process supplied heat to melt granitic crust. The models proposed to explain the contamination process involve fractionation, replenishment by parental HAB, and mixing of evolved and parental magmas with melted granitic crust. © 1988 Springer-Verlag.

  20. Towards a Comprehensive Dynamic-chemistry Assimilation for Eos-Chem: Plans and Status in NASA's Data Assimilation Office

    NASA Technical Reports Server (NTRS)

    Pawson, Steven; Lin, Shian-Jiann; Rood, Richard B.; Stajner, Ivanka; Nebuda, Sharon; Nielsen, J. Eric; Douglass, Anne R.

    2000-01-01

    In order to support the EOS-Chem project, a comprehensive assimilation package for the coupled chemical-dynamical system is being developed by the Data Assimilation Office at NASA GSFC. This involves development of a coupled chemistry/meteorology model and of data assimilation techniques for trace species and meteorology. The model is being developed using the flux-form semi-Lagrangian dynamical core of Lin and Rood, the physical parameterizations from the NCAR Community Climate Model, and atmospheric chemistry modules from the Atmospheric Chemistry and Dynamics branch at NASA GSFC. To date the following results have been obtained: (i) multi-annual simulations with the dynamics-radiation model show the credibility of the package for atmospheric simulations; (ii) initial simulations including a limited number of middle atmospheric trace gases reveal the realistic nature of transport mechanisms, although there is still a need for some improvements. Samples of these results will be shown. A meteorological assimilation system is currently being constructed using the model; this will form the basis for the proposed meteorological/chemical assimilation package. The latter part of the presentation will focus on areas targeted for development in the near and far terms, with the objective of providing a comprehensive assimilation package for the EOS-Chem science experiment. The first stage will target ozone assimilation. The plans also encompass a reanalysis (ReSTS) for the 1991-1995 period, which includes the Mt. Pinatubo eruption and the time when a large number of UARS observations were available. One of the most challenging aspects of future developments will be to couple theoretical advances in tracer assimilation with the practical considerations of a real environment and eventually a near-real-time assimilation system.

  1. Preliminary Results from an Assimilation of TOMS Aerosol Observations Into the GOCART Model

    NASA Technical Reports Server (NTRS)

    daSilva, Arlindo; Weaver, Clark J.; Ginoux, Paul; Torres, Omar; Einaudi, Franco (Technical Monitor)

    2000-01-01

    At NASA Goddard we are developing a global aerosol data assimilation system that combines advances in remote sensing and modeling of atmospheric aerosols. The goal is to provide high resolution, 3-D aerosol distributions to the research community. Our first step is to develop a simple assimilation system for Saharan mineral aerosol. The Goddard Chemistry and Aerosol Radiation model (GOCART) provides accurate 3-D mineral aerosol size distributions that compare well with TOMS satellite observations. Surface mobilization, wet and dry deposition, and convective and long-range transport are all driven by assimilated fields from the Goddard Earth Observing System Data Assimilation System, GEOS-DAS. Our version of GOCART transports sizes from 0.08-10 microns and only simulates Saharan dust. TOMS radiance observations in the ultraviolet provide information on the mineral and carbonaceous aerosol fields. We use two main observables in this study: the TOMS aerosol index (AI), which is directly related to the ratio of the 340 and 380 radiances, and the 380 radiance. These are sensitive to the aerosol optical thickness, the single scattering albedo and the height of the aerosol layer. The Goddard Aerosol Assimilation System (GAAS) uses the Data Assimilation Office's Physical-space Statistical Analysis System (PSAS) to combine TOMS observations and GOCART model first guess fields. At this initial phase we only assimilate observations into the GOCART model over regions of Africa and the Atlantic where mineral aerosols dominate and carbonaceous aerosols are minimal. Our preliminary results during summer show that the assimilation of TOMS data modifies both the aerosol mass loading and the single scattering albedo. Assimilated aerosol fields will be compared with aerosol fields from GOCART and with AERONET observations over Cape Verde.

  2. Variational Assimilation of Global Microwave Rainfall Retrievals: Physical and Dynamical Impact on GEOS Analyses and Forecasts

    NASA Technical Reports Server (NTRS)

    Lin, Xin; Zhang, Sara Q.; Hou, Arthur Y.

    2006-01-01

    Global microwave rainfall retrievals from a 5-satellite constellation, including TMI from TRMM, SSM/I from DMSP F13, F14 and F15, and AMSR-E from EOS-AQUA, are assimilated into the NASA Goddard Earth Observing System (GEOS) Data Assimilation System (DAS) using a 1-D variational continuous assimilation (VCA) algorithm. The physical and dynamical impact of rainfall assimilation on GEOS analyses and forecasts is examined at various temporal and spatial scales. This study demonstrates that the 1-D VCA algorithm, which was originally developed and evaluated for rainfall assimilation over tropical oceans, can effectively assimilate satellite microwave rainfall retrievals and improve GEOS analyses over both the Tropics and the extratropics, where the atmospheric processes are dominated by different large-scale dynamics and moist physics, and also over land, where rainfall estimates from passive microwave radiometers are believed to be less accurate. Results show that rainfall assimilation renders the GEOS analysis physically and dynamically more consistent with the observed precipitation at the monthly-mean and 6-hour time scales. Over regions where the model precipitation tends to misbehave in distinctly different rainy regimes, the 1-D VCA algorithm, by compensating for errors in the model's moist time-tendency in a 6-h analysis window, is able to bring the rainfall analysis closer to the observed. The radiation and cloud fields also tend to be in better agreement with independent satellite observations in the rainfall-assimilation run, especially over regions where rainfall analyses indicate large improvements. Assimilation experiments with and without rainfall data for a midlatitude frontal system clearly indicate that the GEOS analysis is improved through changes in the thermodynamic and dynamic fields that respond to the rainfall assimilation. The synoptic structures of temperature, moisture, winds, divergence, and vertical motion, as well as vorticity, are more realistically captured across the front. Short-term forecasts using initial conditions assimilated with rainfall data also show slight improvements.

  3. A multi-source data assimilation framework for flood forecasting: Accounting for runoff routing lags

    NASA Astrophysics Data System (ADS)

    Meng, S.; Xie, X.

    2015-12-01

    In flood forecasting practice, model performance is usually degraded by various sources of uncertainty, including uncertainties from input data, model parameters, model structures and output observations. Data assimilation is a useful methodology for reducing these uncertainties. For short-term flood forecasting, an accurate estimate of the initial soil moisture condition improves forecasting performance, and the time delay of runoff routing is another important consideration. Moreover, observations of hydrological variables (including ground observations and satellite observations) are becoming readily available, so the reliability of short-term flood forecasting can be improved by assimilating multi-source data. The objective of this study is to develop a multi-source data assimilation framework for real-time flood forecasting. In this framework, the first step assimilates up-layer soil moisture observations to update the model state and generated runoff using the ensemble Kalman filter (EnKF), and the second step assimilates discharge observations to update the model state and runoff within a fixed time window using the ensemble Kalman smoother (EnKS). This smoothing technique is adopted to account for the runoff routing lag. Assimilating both soil moisture and discharge observations in this way is expected to improve the flood forecast. To isolate the effectiveness of this dual-step assimilation framework, we designed a dual-EnKF algorithm in which the observed soil moisture and discharge are assimilated separately without accounting for the runoff routing lag. The results show that the multi-source data assimilation framework can effectively improve flood forecasting, especially when the runoff routing has a distinct time lag. This new data assimilation framework thus holds great potential in operational flood forecasting by merging observations from ground measurements and remote sensing retrievals.
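
    A minimal sketch of the dual-step idea in Python (member counts, state layout and error values are illustrative, not the authors' configuration): step one is a standard perturbed-observation EnKF update of the current state against a soil moisture observation; step two applies the same update, driven by the discharge innovation, to the states stored over a lag window, which is how the routing delay is accounted for.

      import numpy as np

      rng = np.random.default_rng(1)
      K, N, LAG = 64, 10, 4            # members, state size, routing lag (steps)

      def enkf_update(X, hx, y, r):
          """Perturbed-observation EnKF update of ensemble X for a scalar obs y."""
          hx = hx.copy()
          d = hx - hx.mean()
          cov = (X - X.mean(axis=0)).T @ d / (K - 1)   # state-obs covariance
          gain = cov / (d @ d / (K - 1) + r)
          for k in range(K):
              X[k] += gain * (y + rng.normal(0, np.sqrt(r)) - hx[k])
          return X

      # Step 1 (EnKF): assimilate an up-layer soil moisture observation
      X = rng.normal(0.3, 0.05, size=(K, N))
      X = enkf_update(X, X[:, 0], y=0.35, r=0.02**2)

      # Step 2 (EnKS): the discharge innovation updates states saved over the
      # lag window, via the covariance of past states with current discharge
      history = [X + rng.normal(0, 0.01, size=(K, N)) for _ in range(LAG)]
      q_sim = X @ rng.uniform(0, 1, N)  # stand-in routed discharge per member
      for X_past in history:
          enkf_update(X_past, q_sim, y=1.2, r=0.1**2)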

  4. Evaluating Model Performance of an Ensemble-based Chemical Data Assimilation System During INTEX-B Field Mission

    NASA Technical Reports Server (NTRS)

    Arellano, A. F., Jr.; Raeder, K.; Anderson, J. L.; Hess, P. G.; Emmons, L. K.; Edwards, D. P.; Pfister, G. G.; Campos, T. L.; Sachse, G. W.

    2007-01-01

    We present a global chemical data assimilation system using a global atmosphere model, the Community Atmosphere Model (CAM3) with simplified chemistry, and the Data Assimilation Research Testbed (DART) assimilation package. DART is a community software facility for assimilation studies using the ensemble Kalman filter approach. Here, we apply the assimilation system to constrain global tropospheric carbon monoxide (CO) by assimilating meteorological observations of temperature and horizontal wind velocity and satellite CO retrievals from the Measurement of Pollution in the Troposphere (MOPITT) satellite instrument. We verify the system performance using independent CO observations taken on board the NSF/NCAR C-130 and NASA DC-8 aircraft during the April 2006 part of the Intercontinental Chemical Transport Experiment (INTEX-B). Our evaluations show that MOPITT data assimilation provides significant improvements in terms of capturing the observed CO variability relative to no MOPITT assimilation (i.e. the correlation improves from 0.62 to 0.71, significant at 99% confidence). The assimilation provides evidence of a median CO loading of about 150 ppbv at 700 hPa over the NE Pacific during April 2006. This is marginally higher than the modeled CO with no MOPITT assimilation (~140 ppbv). Our ensemble-based estimates of model uncertainty also show model overprediction over the source region (i.e. China) and underprediction over the NE Pacific, suggesting model errors that cannot be readily explained by emissions alone. These results have important implications for improving regional chemical forecasts and for inverse modeling of CO sources, and further demonstrate the utility of the assimilation system in comparing non-coincident measurements, e.g. comparing satellite retrievals of CO with in-situ aircraft measurements. The work described above also brought to light several shortcomings of the data assimilation approach for CO profiles. Because of the limited vertical resolution of the measurement, the retrievals at different altitudes are correlated, which can lead to problems with numerical error and overall efficiency. This has resulted in a manuscript that is about to be submitted to JGR.

  5. DasPy 1.0 - the Open Source Multivariate Land Data Assimilation Framework in combination with the Community Land Model 4.5

    NASA Astrophysics Data System (ADS)

    Han, X.; Li, X.; He, G.; Kumbhar, P.; Montzka, C.; Kollet, S.; Miyoshi, T.; Rosolem, R.; Zhang, Y.; Vereecken, H.; Franssen, H.-J. H.

    2015-08-01

    Data assimilation has become a popular method to integrate observations from multiple sources with land surface models to improve predictions of the water and energy cycles of the soil-vegetation-atmosphere continuum. Multivariate data assimilation refers to the simultaneous assimilation of observation data for multiple model state variables into a simulation model. In recent years, several land data assimilation systems have been developed at different research agencies. Because of limited software availability or adaptability, these systems are not easy to apply for the purpose of multivariate land data assimilation research. We developed an open source multivariate land data assimilation framework (DasPy), implemented in the Python scripting language mixed with the C++ and Fortran programming languages. The Local Ensemble Transform Kalman Filter (LETKF) is implemented as the main data assimilation algorithm; uncertainties in the data assimilation are introduced through perturbed atmospheric forcing data and represented by perturbed soil and vegetation parameters and model initial conditions. The Community Land Model (CLM) was integrated as the model operator. The implementation also allows parameter estimation (soil properties and/or leaf area index) on the basis of the joint state and parameter estimation approach. The Community Microwave Emission Modelling platform (CMEM), the COsmic-ray Soil Moisture Interaction Code (COSMIC) and the Two-Source Formulation (TSF) were integrated as observation operators for the assimilation of L-band passive microwave, cosmic-ray soil moisture probe and land surface temperature measurements, respectively. DasPy has been evaluated in several assimilation studies of neutron count intensity (soil moisture), L-band brightness temperature and land surface temperature. DasPy is parallelized using hybrid Message Passing Interface and Open Multi-Processing techniques. All input and output data flows are organized efficiently using the commonly used NetCDF file format. Online 1-D and 2-D visualization of data assimilation results is also implemented to facilitate post-simulation analysis. In summary, DasPy is a ready-to-use, open source, parallel multivariate land data assimilation framework.
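
    For orientation, the core LETKF analysis that such a framework builds on can be written in a few lines of linear algebra per local region; the following is a generic textbook sketch after Hunt et al. (2007) in NumPy, not code taken from DasPy:

      import numpy as np

      def letkf_analysis(Xb, Yb, y, R_diag, rho=1.1):
          """One local LETKF update.
          Xb: (n, k) background ensemble of states
          Yb: (m, k) background ensemble mapped to observation space
          y:  (m,) local observations; R_diag: (m,) obs error variances
          rho: multiplicative covariance inflation factor
          """
          k = Xb.shape[1]
          xb_mean = Xb.mean(axis=1, keepdims=True)
          yb_mean = Yb.mean(axis=1, keepdims=True)
          Xp, Yp = Xb - xb_mean, Yb - yb_mean

          C = Yp.T / R_diag                          # (k, m) = Yp^T R^-1
          Pa = np.linalg.inv((k - 1) / rho * np.eye(k) + C @ Yp)
          w_mean = Pa @ C @ (y - yb_mean.ravel())    # mean update weights
          evals, evecs = np.linalg.eigh((k - 1) * Pa)
          W = evecs @ np.diag(np.sqrt(evals)) @ evecs.T   # symmetric sqrt
          return xb_mean + Xp @ (w_mean[:, None] + W)     # analysis ensemble

      rng = np.random.default_rng(2)
      Xb = rng.normal(0.25, 0.04, size=(5, 20))      # e.g. 5 soil layers, 20 members
      Yb = Xb[:2]                                    # identity obs operator, 2 layers
      Xa = letkf_analysis(Xb, Yb, y=np.array([0.30, 0.28]),
                          R_diag=np.array([1e-3, 1e-3]))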

  6. Merging parallel tempering with sequential geostatistical resampling for improved posterior exploration of high-dimensional subsurface categorical fields

    NASA Astrophysics Data System (ADS)

    Laloy, Eric; Linde, Niklas; Jacques, Diederik; Mariethoz, Grégoire

    2016-04-01

    The sequential geostatistical resampling (SGR) algorithm is a Markov chain Monte Carlo (MCMC) scheme for sampling from possibly non-Gaussian, complex spatially-distributed prior models such as geologic facies or categorical fields. In this work, we highlight the limits of standard SGR for posterior inference of high-dimensional categorical fields with realistically complex likelihood landscapes and benchmark a parallel tempering implementation (PT-SGR). Our proposed PT-SGR approach is demonstrated using synthetic (error corrupted) data from steady-state flow and transport experiments in categorical 7575- and 10,000-dimensional 2D conductivity fields. In both case studies, every SGR trial gets trapped in a local optimum while PT-SGR maintains a higher diversity in the sampled model states. The advantage of PT-SGR is most apparent in an inverse transport problem where the posterior distribution is made bimodal by construction. PT-SGR then converges towards the appropriate data misfit much faster than SGR and partly recovers the two modes. In contrast, for the same computational resources SGR does not fit the data to the appropriate error level and merely produces a locally optimal solution that looks visually similar to one of the two reference modes. Although PT-SGR clearly surpasses SGR in performance, our results also indicate that using a small number (16-24) of temperatures (and thus parallel cores) may not permit complete sampling of the posterior distribution by PT-SGR within a reasonable computational time (less than 1-2 weeks).
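
    The tempering ingredient added to SGR is compact: chains are run at different inverse temperatures beta (the likelihood is raised to the power beta, flattening the misfit landscape), and neighbouring chains periodically propose to exchange states with the classical Metropolis swap acceptance. A generic sketch in Python (the log-likelihood is a placeholder; in PT-SGR the within-chain moves would be SGR proposals):

      import numpy as np

      rng = np.random.default_rng(3)

      def log_like(x):
          """Placeholder data-misfit log-likelihood."""
          return -0.5 * np.sum(x ** 2)

      betas = np.array([1.0, 0.7, 0.5, 0.35])   # chain 0 targets the posterior
      chains = [rng.normal(size=4) for _ in betas]

      def try_swap(i, j):
          """Metropolis acceptance for exchanging the states of chains i and j."""
          dlog = (betas[i] - betas[j]) * (log_like(chains[j]) - log_like(chains[i]))
          if np.log(rng.uniform()) < min(0.0, dlog):
              chains[i], chains[j] = chains[j], chains[i]

      for i in range(len(betas) - 1):           # sweep over neighbouring pairs
          try_swap(i, i + 1)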

  7. Multi-target Detection, Tracking, and Data Association on Road Networks Using Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Barkley, Brett E.

    A cooperative detection and tracking algorithm for multiple targets constrained to a road network is presented for fixed-wing Unmanned Air Vehicles (UAVs) with a finite field of view. Road networks of interest are formed into graphs with nodes that carry the target likelihood ratio (before detection) and position probability (after detection). A Bayesian likelihood ratio tracker recursively assimilates target observations until the cumulative observations at a particular location pass a detection criterion. At that point, a target is considered detected and a position probability is generated for the target on the graph. Data association is subsequently used to route future measurements either to update the likelihood ratio tracker (for undetected targets) or to update a position probability (for previously detected targets). Three strategies for motion planning of UAVs are proposed to balance searching for new targets against tracking known targets in a variety of scenarios. Performance was tested in Monte Carlo simulations for a variety of mission parameters, including tracking on road networks with varying complexity and using UAVs at various altitudes.
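
    The recursive likelihood-ratio logic has a compact form: each road-network node keeps a log-likelihood ratio that each new observation increments by the log ratio of the measurement likelihoods under the target-present and target-absent hypotheses, and a detection is declared once a threshold is crossed. A schematic in Python (the sensor probabilities and threshold are illustrative):

      import math

      # Log-likelihood ratio per road-network node; 0 means even odds
      llr = {node: 0.0 for node in ["a", "b", "c"]}

      P_D, P_FA = 0.8, 0.1         # assumed detection / false-alarm probabilities
      THRESHOLD = math.log(99.0)   # declare detection at roughly 99:1 odds

      def update(node, measurement_hit):
          """Recursively assimilate one UAV observation of a node."""
          if measurement_hit:
              llr[node] += math.log(P_D / P_FA)
          else:
              llr[node] += math.log((1 - P_D) / (1 - P_FA))
          return llr[node] > THRESHOLD   # True once the criterion is passed

      for hit in [True, True, False, True, True]:
          if update("b", hit):
              print("target declared at node b")
              break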

  8. Combined Monte Carlo and quantum mechanics study of the solvatochromism of phenol in water. The origin of the blue shift of the lowest pi-pi* transition.

    PubMed

    Barreto, Rafael C; Coutinho, Kaline; Georg, Herbert C; Canuto, Sylvio

    2009-03-07

    A combined and sequential use of Monte Carlo simulations and quantum mechanical calculations is made to analyze the spectral shift of the lowest pi-pi* transition of phenol in water. The solute polarization is included using electrostatic embedded calculations at the MP2/aug-cc-pVDZ level, giving a dipole moment of 2.25 D, corresponding to an increase of 76% compared to the calculated gas-phase value. Using statistically uncorrelated configurations sampled from the MC simulation, first-principles size-extensive calculations are performed to obtain the solvatochromic shift. Analysis is then made of the origin of the blue shift. Results both at the optimized geometry and in room-temperature liquid water show that hydrogen bonds of water with phenol promote a red shift when phenol is the proton donor and a blue shift when phenol is the proton acceptor. In the case of the optimized clusters the calculated shifts are in very good agreement with results obtained from mass-selected free jet expansion experiments. In the liquid case the contribution of the solute-solvent hydrogen bonds partially cancels and the total shift obtained is dominated by the contribution of the outer solvent water molecules. Our best result, including both inner and outer water molecules, is 570 +/- 35 cm(-1), in very good agreement with the small experimental shift of 460 cm(-1) for the absorption maximum.

  9. Soybean Physiology Calibration in the Community Land Model

    NASA Astrophysics Data System (ADS)

    Drewniak, B. A.; Bilionis, I.; Constantinescu, E. M.

    2014-12-01

    With the large influence of agricultural land use on biophysical and biogeochemical cycles, integrating cultivation into Earth System Models (ESMs) is increasingly important. The Community Land Model (CLM) was augmented with a CLM-Crop extension that simulates the development of three crop types: maize, soybean, and spring wheat. The CLM-Crop model is a complex system that relies on a suite of parametric inputs governing plant growth under a given atmospheric forcing and available resources. However, the strong nonlinearity of ESMs makes parameter fitting a difficult task. In this study, our goal is to calibrate ten of the CLM-Crop parameters for one crop type, soybean, in order to improve model projections of plant development and carbon fluxes. We used measurements of gross primary productivity, net ecosystem exchange, and plant biomass from AmeriFlux sites to choose parameter values that optimize crop productivity in the model. Calibration is performed in a Bayesian framework using a scalable and adaptive scheme based on sequential Monte Carlo (SMC). Our scheme can perform model calibration using very few evaluations and, by exploiting parallelism, at a fraction of the time required by plain-vanilla Markov chain Monte Carlo (MCMC). We present results from a twin experiment (self-validation), and calibration and validation results using real observations from an AmeriFlux tower site in the Midwestern United States for the soybean crop type. The improved model will help researchers understand how climate affects crop production and the resulting carbon fluxes, and additionally, how cultivation impacts climate.
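
    The appeal of the SMC scheme described here is that each particle (a candidate parameter set) can be evaluated independently, so the model runs parallelize naturally: particles are weighted by how well their simulations match the flux observations, resampled in proportion to those weights, and jittered to restore diversity. A stripped-down sketch of one such iteration in Python (the model function, error scale and particle count are stand-ins, not the CLM-Crop setup):

      import numpy as np

      rng = np.random.default_rng(4)
      N_PART = 100

      def run_model(theta):
          """Stand-in for a CLM-Crop evaluation returning, e.g., simulated GPP."""
          return 10.0 * theta[0] + theta[1]

      obs, obs_err = 12.0, 0.5
      particles = rng.uniform([0.5, 0.0], [1.5, 5.0], size=(N_PART, 2))

      # Weight: Gaussian likelihood of the observation given each particle's run
      # (in practice the model evaluations are distributed over parallel workers)
      sim = np.array([run_model(p) for p in particles])
      logw = -0.5 * ((sim - obs) / obs_err) ** 2
      w = np.exp(logw - logw.max())
      w /= w.sum()

      # Resample proportionally to weight, then jitter to restore diversity
      idx = rng.choice(N_PART, size=N_PART, p=w)
      particles = particles[idx] + rng.normal(0, 0.02, size=(N_PART, 2))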

  10. Aerosol EnKF at GMAO

    NASA Technical Reports Server (NTRS)

    Buchard, Virginie; Da Silva, Arlindo; Todling, Ricardo

    2017-01-01

    In the GEOS near-real-time system, as well as in MERRA-2, the latest reanalysis produced at NASA's Global Modeling and Assimilation Office (GMAO), the assimilation of aerosol observations is performed by means of a so-called analysis splitting method. In line with the transition of the GEOS meteorological data assimilation system to a hybrid ensemble-variational formulation, we are updating the aerosol component of our assimilation system to an ensemble square root filter (EnSRF; Whitaker and Hamill, 2002) type of scheme. We present a summary of our preliminary results of the assimilation of column-integrated aerosol observations (Aerosol Optical Depth; AOD) using an EnSRF scheme and the ensemble members produced routinely by the meteorological assimilation.

  11. Soil moisture assimilation using a modified ensemble transform Kalman filter with water balance constraint

    NASA Astrophysics Data System (ADS)

    Wu, Guocan; Zheng, Xiaogu; Dan, Bo

    2016-04-01

    Shallow soil moisture observations are assimilated into the Common Land Model (CoLM) to estimate the soil moisture in different layers. The forecast error is inflated to improve the accuracy of the analysis state, and a water balance constraint is adopted to reduce the water budget residual in the assimilation procedure. The experimental results illustrate that adaptive forecast error inflation reduces the analysis error, and that the proper layer to inflate can be selected based on the -2 log-likelihood of the innovation statistic. The water balance constraint substantially reduces the water budget residual at a small cost in assimilation accuracy. The scheme can potentially be applied to assimilate remote sensing data.
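
    The inflation diagnostic mentioned here is typically the negative log-likelihood of the innovation d = y - H x^f, viewed as a function of the inflation factor lambda; a standard form following Dee (1995):

      -2 \ln L(\lambda) = \ln \det\!\big(\lambda\, H P^{f} H^{\mathsf T} + R\big)
        + d^{\mathsf T} \big(\lambda\, H P^{f} H^{\mathsf T} + R\big)^{-1} d + \mathrm{const}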

  12. Anaerobic Metabolism in the N-Limited Green Alga Selenastrum minutum: III. Alanine Is the Product of Anaerobic Ammonium Assimilation.

    PubMed

    Vanlerberghe, G C; Joy, K W; Turpin, D H

    1991-02-01

    We have determined the flow of (15)N into free amino acids of the N-limited green alga Selenastrum minutum (Naeg.) Collins after addition of (15)NH(4) (+) to aerobic or anaerobic cells. Under aerobic conditions, only a small proportion of the N assimilated was retained in the free amino acid pool. However, under anaerobic conditions almost all assimilated NH(4) (+) accumulates in alanine. This is a unique feature of anaerobic NH(4) (+) assimilation. The pathway of carbon flow to alanine results in the production of ATP and reductant which matches exactly the requirements of NH(4) (+) assimilation. Alanine synthesis is therefore an excellent strategy to maintain energy and redox balance during anaerobic NH(4) (+) assimilation.

  13. Improved water balance component estimates through joint assimilation of GRACE water storage and SMOS soil moisture retrievals

    NASA Astrophysics Data System (ADS)

    Tian, Siyuan; Tregoning, Paul; Renzullo, Luigi J.; van Dijk, Albert I. J. M.; Walker, Jeffrey P.; Pauwels, Valentijn R. N.; Allgeyer, Sébastien

    2017-03-01

    The accuracy of global water balance estimates is limited by the lack of observations at large scale and the uncertainties of model simulations. Global retrievals of terrestrial water storage (TWS) change and soil moisture (SM) from satellites provide an opportunity to improve model estimates through data assimilation. However, combining these two data sets is challenging due to the disparity in temporal and spatial resolution at both vertical and horizontal scale. For the first time, TWS observations from the Gravity Recovery and Climate Experiment (GRACE) and near-surface SM observations from the Soil Moisture and Ocean Salinity (SMOS) were jointly assimilated into a water balance model using the Ensemble Kalman Smoother from January 2010 to December 2013 for the Australian continent. The performance of joint assimilation was assessed against open-loop model simulations and the assimilation of either GRACE TWS anomalies or SMOS SM alone. The SMOS-only assimilation improved SM estimates but reduced the accuracy of groundwater and TWS estimates. The GRACE-only assimilation improved groundwater estimates but did not always produce accurate estimates of SM. The joint assimilation typically led to more accurate water storage profile estimates with improved surface SM, root-zone SM, and groundwater estimates against in situ observations. The assimilation successfully downscaled GRACE-derived integrated water storage horizontally and vertically into individual water stores at the same spatial scale as the model and SMOS, and partitioned monthly averaged TWS into daily estimates. These results demonstrate that satellite TWS and SM measurements can be jointly assimilated to produce improved water balance component estimates.

  14. A balanced Kalman filter ocean data assimilation system with application to the South Australian Sea

    NASA Astrophysics Data System (ADS)

    Li, Yi; Toumi, Ralf

    2017-08-01

    In this paper, an Ensemble Kalman Filter (EnKF) based regional ocean data assimilation system has been developed and applied to the South Australian Sea. This system consists of the data assimilation algorithm provided by the NCAR Data Assimilation Research Testbed (DART) and the Regional Ocean Modelling System (ROMS). We describe the first implementation in DART of the physical balance operator (temperature-salinity, hydrostatic and geostrophic balance), which reduces the spurious waves that may be introduced during the data assimilation process. The effect of the balance operator is validated in both an idealised shallow water model and a real-case ROMS study. In the shallow water model, the geostrophic balance operator eliminates spurious ageostrophic waves and produces a better sea surface height (SSH) and velocity analysis and forecast. Its impact increases as the sea surface height and wind stress increase. In the real case, satellite-observed sea surface temperature (SST) and SSH are assimilated in the South Australian Sea with 50 ensemble members using the Ensemble Adjustment Kalman Filter (EAKF). Assimilating SSH and SST enhances the estimation of SSH and SST, respectively, over the entire domain. Assimilation with the balance operator produces a more realistic simulation of surface currents and of the subsurface temperature profile. The best improvement is obtained when only SSH is assimilated with the balance operator. A case study with a storm suggests that the benefit of the balance operator is of particular importance under high wind stress conditions. Implementing the balance operator could be a general benefit to ocean data assimilation systems.
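
    Of the three relations in the balance operator, the geostrophic one illustrates the idea: an SSH increment delta-eta is accompanied by a rotational velocity increment so that no spurious ageostrophic wave is launched (f the Coriolis parameter, g gravity):

      \delta u = -\frac{g}{f}\,\frac{\partial\,\delta\eta}{\partial y}, \qquad
      \delta v = \frac{g}{f}\,\frac{\partial\,\delta\eta}{\partial x}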

  15. Assimilative model for ionospheric dynamics employing delay, Doppler, and direction of arrival measurements from multiple HF channels

    NASA Astrophysics Data System (ADS)

    Fridman, Sergey V.; Nickisch, L. J.; Hausman, Mark; Zunich, George

    2016-03-01

    We describe the development of new HF data assimilation capabilities for our ionospheric inversion algorithm called GPSII (GPS Ionospheric Inversion). Previously existing capabilities of this algorithm included assimilation of GPS total electron content data as well as assimilation of backscatter ionograms. In the present effort we concentrated on developing assimilation tools for data related to HF propagation channels. Measurements of propagation delay, angle of arrival, and the ionosphere-induced Doppler from any number of known propagation links can now be utilized by GPSII. The resulting ionospheric model is consistent with all assimilated measurements. This means that ray tracing simulations of the assimilated propagation links are guaranteed to be in agreement with measured data within the errors of measurement. The key theoretical element for assimilating HF data is the raypath response operator (RPRO) which describes response of raypath parameters to infinitesimal variations of electron density in the ionosphere. We construct the RPRO out of the fundamental solution of linearized ray tracing equations for a dynamic magnetoactive plasma. We demonstrate performance and internal consistency of the algorithm using propagation delay data from multiple oblique ionograms (courtesy of Defence Science and Technology Organisation, Australia) as well as with time series of near-vertical incidence sky wave data (courtesy of the Intelligence Advanced Research Projects Activity HFGeo Program Government team). In all cases GPSII produces electron density distributions which are smooth in space and in time. We simulate the assimilated propagation links by performing ray tracing through GPSII-produced ionosphere and observe that simulated data are indeed in agreement with assimilated measurements.

  16. Torque Balances on the Taylor Cylinders in the Geomagnetic Data Assimilation

    NASA Technical Reports Server (NTRS)

    Kuang, Weijia; Tangborn, Andrew

    2004-01-01

    In this presentation we report on our continuing effort in geomagnetic data assimilation, aiming at understanding and predicting geomagnetic secular variation on decadal time scales. In particular, we focus on the effect of the torque balances on the cylindrical surfaces in the core co-axial with the Earth's rotation axis (the Taylor cylinders) on the time evolution of assimilated solutions. We use our MoSST core dynamics model and the observed geomagnetic field at the Earth's surface derived via the Comprehensive Field Model (CFM) for the geomagnetic data assimilation. In our earlier studies, a model solution is selected randomly from our numerical database. It is then assimilated with the observations such that the poloidal field possesses the same field tomography on the core-mantle boundary (CMB) continued downward from surface observations. This tomography change is assumed to be effective throughout the outer core. While this approach allows rapid convergence between model solutions and the observations, it also generates severe numerical instabilities: the delicate balance between weak fluid inertia and the magnetic torques on the Taylor cylinders is completely altered. Consequently, the assimilated solution diverges quickly (in approximately 10% of the magnetic free-decay time in the core). To improve the assimilation, we propose a partial penetration of the assimilation from the CMB: the full-scale modification at the CMB decreases linearly and vanishes at an interior radius r_a. We shall examine from our assimilation tests possible relationships between the convergence rate of the model solutions to observations and the cut-off radius r_a. A better assimilation shall serve our nudging tests in the near future.

  17. Torque Balances on the Taylor Cylinders in the Geomagnetic Data Assimilation

    NASA Astrophysics Data System (ADS)

    Kuang, W.; Tangborn, A.

    2004-05-01

    In this presentation we report on our continuing effort in geomagnetic data assimilation, aiming at understanding and predicting geomagnetic secular variation on decadal time scales. In particular, we focus on the effect of the torque balances on the cylindrical surfaces in the core co-axial with the Earth's rotation axis (the Taylor cylinders) on the time evolution of assimilated solutions. We use our MoSST core dynamics model and the observed geomagnetic field at the Earth's surface derived via the Comprehensive Field Model (CFM) for the geomagnetic data assimilation. In our earlier studies, a model solution is selected randomly from our numerical database. It is then assimilated with the observations such that the poloidal field possesses the same field tomography on the core-mantle boundary (CMB) continued downward from surface observations. This tomography change is assumed to be effective throughout the outer core. While this approach allows rapid convergence between model solutions and the observations, it also generates severe numerical instabilities: the delicate balance between weak fluid inertia and the magnetic torques on the Taylor cylinders is completely altered. Consequently, the assimilated solution diverges quickly (in approximately 10% of the magnetic free-decay time in the core). To improve the assimilation, we propose a partial penetration of the assimilation from the CMB: the full-scale modification at the CMB decreases linearly and vanishes at an interior radius r_a. We shall examine from our assimilation tests possible relationships between the convergence rate of the model solutions to observations and the cut-off radius r_a. A better assimilation shall serve our nudging tests in the near future.

  18. Joint Assimilation of SMOS Brightness Temperature and GRACE Terrestrial Water Storage Observations for Improved Soil Moisture Estimation

    NASA Technical Reports Server (NTRS)

    Girotto, Manuela; Reichle, Rolf H.; De Lannoy, Gabrielle J. M.; Rodell, Matthew

    2017-01-01

    Observations from recent soil moisture missions (e.g. SMOS) have been used in innovative data assimilation studies to provide global high spatial (i.e. 40 km) and temporal resolution (i.e. 3-days) soil moisture profile estimates from microwave brightness temperature observations. In contrast with microwave-based satellite missions that are only sensitive to near-surface soil moisture (0 - 5 cm), the Gravity Recovery and Climate Experiment (GRACE) mission provides accurate measurements of the entire vertically integrated terrestrial water storage column, but it is characterized by low spatial (i.e. 150,000 km2) and temporal (i.e. monthly) resolutions. Data assimilation studies have shown that GRACE-TWS primarily affects (in absolute terms) deeper moisture storages (i.e., groundwater). This work hypothesizes that unprecedented soil water profile accuracy can be obtained through the joint assimilation of GRACE terrestrial water storage and SMOS brightness temperature observations. A particular challenge of the joint assimilation is the use of the two different types of measurements that are relevant for hydrologic processes representing different temporal and spatial scales. The performance of the joint assimilation strongly depends on the chosen assimilation methods, measurement and model error spatial structures. The optimization of the assimilation technique constitutes a fundamental step toward a multi-variate multi-resolution integrative assimilation system aiming to improve our understanding of the global terrestrial water cycle.

  19. Application of an Ensemble Smoother to Precipitation Assimilation

    NASA Technical Reports Server (NTRS)

    Zhang, Sara; Zupanski, Dusanka; Hou, Arthur; Zupanski, Milija

    2008-01-01

    Assimilation of precipitation in a global modeling system poses a special challenge in that the observation operators for precipitation processes are highly nonlinear. In the variational approach, substantial development work and model simplifications are required to include precipitation-related physical processes in the tangent linear model and its adjoint. An ensemble based data assimilation algorithm "Maximum Likelihood Ensemble Smoother (MLES)" has been developed to explore the ensemble representation of the precipitation observation operator with nonlinear convection and large-scale moist physics. An ensemble assimilation system based on the NASA GEOS-5 GCM has been constructed to assimilate satellite precipitation data within the MLES framework. The configuration of the smoother takes the time dimension into account for the relationship between state variables and observable rainfall. The full nonlinear forward model ensembles are used to represent components involving the observation operator and its transpose. Several assimilation experiments using satellite precipitation observations have been carried out to investigate the effectiveness of the ensemble representation of the nonlinear observation operator and the data impact of assimilating rain retrievals from the TMI and SSM/I sensors. Preliminary results show that this ensemble assimilation approach is capable of extracting information from nonlinear observations to improve the analysis and forecast if ensemble size is adequate, and a suitable localization scheme is applied. In addition to a dynamically consistent precipitation analysis, the assimilation system produces a statistical estimate of the analysis uncertainty.

  20. Joint assimilation of SMOS brightness temperature and GRACE terrestrial water storage observations for improved soil moisture estimation

    NASA Astrophysics Data System (ADS)

    Girotto, M.; Reichle, R. H.; De Lannoy, G.; Rodell, M.

    2017-12-01

    Observations from recent soil moisture missions (e.g., SMOS) have been used in innovative data assimilation studies to provide global soil moisture profile estimates from microwave brightness temperature observations at high spatial (i.e., 40 km) and temporal (i.e., 3-day) resolution. In contrast with microwave-based satellite missions, which are sensitive only to near-surface soil moisture (0-5 cm), the Gravity Recovery and Climate Experiment (GRACE) mission provides accurate measurements of the entire vertically integrated terrestrial water storage column, but it is characterized by low spatial (i.e., 150,000 km2) and temporal (i.e., monthly) resolution. Data assimilation studies have shown that GRACE-TWS primarily affects (in absolute terms) the deeper moisture storages (i.e., groundwater). This work hypothesizes that unprecedented soil water profile accuracy can be obtained through the joint assimilation of GRACE terrestrial water storage and SMOS brightness temperature observations. A particular challenge of the joint assimilation is that the two measurement types are relevant for hydrologic processes at different temporal and spatial scales. The performance of the joint assimilation depends strongly on the chosen assimilation methods and on the spatial structures of the measurement and model errors. Optimizing the assimilation technique is a fundamental step toward a multi-variate, multi-resolution integrative assimilation system aimed at improving our understanding of the global terrestrial water cycle.

  1. Assimilation for skin SST in the NASA GEOS atmospheric data assimilation system

    PubMed Central

    Akella, Santha; Todling, Ricardo; Suarez, Max

    2018-01-01

    The present article describes the sea surface temperature (SST) developments implemented in the Goddard Earth Observing System, Version 5 (GEOS-5) Atmospheric Data Assimilation System (ADAS). These enhancements contribute to the development of an atmosphere-ocean coupled data assimilation system using GEOS. In the current quasi-operational GEOS-ADAS, the SST is a boundary condition prescribed from the OSTIA product; therefore, SST and skin SST (Ts) are identical. This work modifies the GEOS-ADAS Ts by modeling and assimilating satellite infrared (IR) observations sensitive to the near-surface sea temperature. The atmosphere-ocean interface layer of the GEOS atmospheric general circulation model (AGCM) is updated to include near-surface diurnal warming and cool-skin effects. The GEOS analysis system is also updated to directly assimilate SST-relevant Advanced Very High Resolution Radiometer (AVHRR) radiance observations. Data assimilation experiments designed to evaluate the Ts modification in GEOS-ADAS show improvements in the assimilation of radiance observations that extend beyond the thermal IR bands of AVHRR. In particular, many channels of hyperspectral sensors, such as those of the Atmospheric Infrared Sounder (AIRS) and the Infrared Atmospheric Sounding Interferometer (IASI), are also better assimilated. We also obtained an improved fit to withheld in situ buoy measurements of near-surface SST. Evaluation of forecast skill scores shows marginal to neutral benefit from the modified Ts. PMID:29628531

  2. The Role of Culture Theory in Cross-Cultural Training: A Multimethod Study of Culture-Specific, Culture-General, and Culture Theory-Based Assimilators.

    ERIC Educational Resources Information Center

    Bhawuk, Dharm P. S.

    1998-01-01

    In a multimethod evaluation of cross-cultural training tools involving 102 exchange students at a midwestern university, a theory-based individualism and collectivism assimilator tool had significant advantages over culture-specific and culture-general assimilators and a control condition. Results support theory-based culture assimilators. (SLD)

  3. Therapist activities preceding setbacks in the assimilation process.

    PubMed

    Gabalda, Isabel Caro; Stiles, William B; Pérez Ruiz, Sergio

    2016-11-01

    This study examined the therapist activities immediately preceding assimilation setbacks in the treatment of a good-outcome client treated with linguistic therapy of evaluation (LTE). Setbacks (N = 105) were defined as decreases of one or more assimilation stages from one passage to the next dealing with the same theme. The therapist activities immediately preceding those setbacks were classified using two kinds of codes: (a) therapist interventions and (b) positions the therapist took toward the client's internal voices. Preceding setbacks to early assimilation stages, where the problem was unformulated, the therapist was more often actively listening, and the setbacks were more often attributable to pushing a theme beyond the client's working zone. Preceding setbacks to later assimilation stages, where the problem was at least formulated, the therapist was more likely to be directing the client to consider alternatives, following the LTE agenda, and the setbacks were more often attributable to the client following these directives and shifting attention to less assimilated (but nevertheless formulated) aspects of the problem. At least in this case, setbacks followed systematically different therapist activities depending on the problem's stage of assimilation. Possible implications for the assimilation model's account of setbacks and for practice are discussed.

  4. Effects of 4D-Var data assimilation using remote sensing precipitation products in a WRF over the complex Heihe River Basin

    NASA Astrophysics Data System (ADS)

    Pan, Xiaoduo; Li, Xin; Cheng, Guodong

    2017-04-01

    Traditionally, ground-based in situ observations, remote sensing, and regional climate modeling cannot individually provide the high-quality precipitation data required for hydrological prediction, especially over complex terrain. Data assimilation techniques are often used to assimilate ground observations and remote sensing products into models for dynamic downscaling. In this study, the Weather Research and Forecasting (WRF) model was used to assimilate two satellite precipitation products (TRMM 3B42 and FY-2D) using the 4D-Var data assimilation method. The results show that the assimilation of remote sensing precipitation products can improve the initial WRF fields of humidity and temperature, thereby improving precipitation forecasting and decreasing the spin-up time. Hence, assimilating TRMM and FY-2D remote sensing precipitation products using WRF 4D-Var can be viewed as a positive step toward improving the accuracy and lead time of numerical weather prediction models, particularly for short-term weather forecasting. Future work is proposed to assimilate a suite of remote sensing data, e.g., the combination of precipitation and soil moisture data, into a WRF model to improve 7-8 day forecasts of precipitation and other atmospheric variables.
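
    As background for the 4D-Var method this record relies on, the sketch below minimizes a strong-constraint 4D-Var cost function (a background term plus observation misfits accumulated along the trajectory over the window) for a one-variable toy model. The logistic model, the error variances, and the observation times are hypothetical stand-ins; WRF's actual 4D-Var uses tangent linear and adjoint models rather than the derivative-free minimizer used here.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def model(x0, nsteps):
        """Toy forward model (logistic growth), standing in for the NWP model."""
        traj = [x0]
        for _ in range(nsteps):
            traj.append(traj[-1] + 0.1 * traj[-1] * (1.0 - traj[-1]))
        return np.array(traj)

    rng = np.random.default_rng(1)
    obs_times = [2, 5, 8]                 # observation times within the window
    truth = model(np.array([0.3]), 10)
    y = np.array([truth[t] for t in obs_times]) + 0.01 * rng.normal(size=(3, 1))

    xb = np.array([0.5])                  # background initial condition
    B_inv = 1.0 / 0.1**2                  # inverse background-error variance
    R_inv = 1.0 / 0.01**2                 # inverse observation-error variance

    def cost(x0):
        """Strong-constraint 4D-Var cost: background term plus the summed
        observation misfits along the model trajectory over the window."""
        traj = model(x0, 10)
        Jb = 0.5 * B_inv * np.sum((x0 - xb) ** 2)
        Jo = 0.5 * R_inv * sum(np.sum((traj[t] - y[i]) ** 2)
                               for i, t in enumerate(obs_times))
        return Jb + Jo

    xa = minimize(cost, xb, method="Nelder-Mead").x
    print("background:", xb, " analysis:", xa.round(3), " truth x0: [0.3]")
    ```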

  5. AIRS Impact on Analysis and Forecast of an Extreme Rainfall Event (Indus River Valley 2010) with a Global Data Assimilation and Forecast System

    NASA Technical Reports Server (NTRS)

    Reale, O.; Lau, W. K.; Susskind, J.; Rosenberg, R.

    2011-01-01

    A set of data assimilation and forecast experiments is performed with the NASA global data assimilation and forecast system GEOS-5 to compare the impact of different approaches to the assimilation of Atmospheric Infrared Sounder (AIRS) data on the precipitation analysis and forecast skill. The event chosen is an extreme rainfall episode which occurred in late July 2010 in Pakistan, causing massive floods along the Indus River Valley. Results show that the assimilation of quality-controlled AIRS temperature retrievals obtained under partly cloudy conditions produces better precipitation analyses, and substantially better 7-day forecasts, than assimilation of clear-sky radiances. The improvement of precipitation forecast skill up to 7 days is very significant in the tropics and is caused by an improved representation, attributed to cloudy retrieval assimilation, of two contributing mechanisms: the low-level moisture advection, and the concentration of moisture over the area in the days preceding the precipitation peak.

  6. Impact of data assimilation on ocean current forecasts in the Angola Basin

    NASA Astrophysics Data System (ADS)

    Phillipson, Luke; Toumi, Ralf

    2017-06-01

    The ocean current predictability in the data-limited Angola Basin was investigated using the Regional Ocean Modelling System (ROMS) with four-dimensional variational data assimilation. Six experiments were undertaken, comprising a baseline case of the assimilation of salinity/temperature profiles and satellite sea surface temperature, with the subsequent addition of altimetry, OSCAR (satellite-derived sea surface currents), drifters, altimetry and drifters combined, and OSCAR and drifters combined. The addition of drifters significantly improves Lagrangian predictability in comparison to the baseline case, as well as to the addition of either altimetry or OSCAR. OSCAR assimilation only improves Lagrangian predictability as much as altimetry assimilation does. On average, the assimilation of either altimetry or OSCAR together with drifter velocities does not significantly improve Lagrangian predictability compared to drifter assimilation alone, and even degrades predictability in some cases. When the forecast current speed is large, it is more likely that the combination improves trajectory forecasts. Conversely, when the currents are weaker, it is more likely that the combination degrades the trajectory forecast.

  7. Data assimilation of non-conventional observations using GOES-R flash lightning: 1D+4D-VAR approach vs. assimilation of images (Invited)

    NASA Astrophysics Data System (ADS)

    Navon, M. I.; Stefanescu, R.

    2013-12-01

    Previous assimilation of lightning data used nudging approaches. We develop three approaches, namely 3D-VAR WRFDA and 1D+nD-VAR (n = 3, 4) WRFDA. The present research uses Convective Available Potential Energy (CAPE) as a proxy between lightning data and model variables. To test the performance of the aforementioned schemes, we assess the quality of the resulting analyses and forecasts of precipitation compared to those from a control experiment and verify them against NCEP Stage IV precipitation. Results demonstrate that assimilating lightning observations improves precipitation statistics during the assimilation window and for 3-7 h thereafter. The 1D+4D-VAR approach yielded the best performance, significantly improving precipitation RMSE by 25% and 27.5% compared to the control during the assimilation window for two tornadic test cases. Finally, we propose a new approach to assimilate 2-D images of lightning flashes based on pixel intensity, mitigating dimensionality by a reduced-order method.
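
    The CAPE-proxy idea described above can be pictured as a toy pseudo-observation step: map an observed flash rate to a CAPE value, then relax the model field toward it. This schematic is not the authors' 1D+nD-VAR retrieval; the saturating flash-rate-to-CAPE relationship and every parameter in it are invented for illustration.

    ```python
    import numpy as np

    def cape_pseudo_obs(flash_rate, cape_max=2500.0, k=0.02):
        """Hypothetical proxy: map an observed flash rate (flashes/min) to a
        CAPE pseudo-observation (J/kg) via a saturating relationship."""
        return cape_max * (1.0 - np.exp(-k * flash_rate))

    def nudge_cape(cape_model, flash_rate, weight=0.5):
        """Relax model CAPE toward the lightning-derived pseudo-observation;
        grid cells with no observed lightning are left unchanged here."""
        cape_obs = cape_pseudo_obs(flash_rate)
        lit = flash_rate > 0.0
        out = cape_model.copy()
        out[lit] += weight * (cape_obs[lit] - cape_model[lit])
        return out

    flash = np.array([0.0, 2.0, 10.0, 40.0])        # flashes per minute per cell
    cape = np.array([800.0, 500.0, 300.0, 1500.0])  # model CAPE (J/kg)
    print(nudge_cape(cape, flash))
    ```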

  8. Prediction of geomagnetic reversals using low-dimensional dynamical models and advanced data assimilation: a feasibility study

    NASA Astrophysics Data System (ADS)

    Fournier, A.; Morzfeld, M.; Hulot, G.

    2013-12-01

    For a suitable choice of parameters, the system of three ordinary differential equations (ODE) presented by Gissinger [1] was shown to exhibit chaotic reversals whose statistics compared well with those from the paleomagnetic record. In order to further assess the geophysical relevance of this low-dimensional model, we resort to data assimilation methods to calibrate it using reconstructions of the fluctuation of the virtual axial dipole moment spanning the past 2 million years. Moreover, we test to what extent a properly calibrated model could possibly be used to predict a reversal of the geomagnetic field. We calibrate the ODE model to the geomagnetic field over the past 2 Ma using the SINT data set of Valet et al. [2]. To this end, we consider four data assimilation algorithms: the ensemble Kalman filter (EnKF), a variational method, and two Monte Carlo (MC) schemes, prior importance sampling and implicit sampling. We observe that the EnKF performs poorly and that prior importance sampling is inefficient. We obtain the most accurate reconstructions of the geomagnetic data using implicit sampling with five data points per assimilation sweep (of duration 5 kyr). The variational scheme performs equally well, but it does not provide us with quantitative information about the uncertainty of the estimates, which makes this method difficult to use for robust prediction under uncertainty. A calibration of the model using the PADM2M data set of Ziegler et al. [3] confirms these findings. We study the predictive capability of the ODE model using statistics computed from synthetic data experiments. For each experiment, we produce 2 Myr of synthetic data (with error levels similar to the ones found in real data), calibrate the model to this record, and then check whether this calibrated model can correctly and reliably predict a reversal within the next 10 kyr (say). By performing 100 such experiments, we can assess how reliably our calibrated model can predict a (non-)reversal. It is found that the 5 kyr ahead predictions of reversals produced by the model appear to be accurate and reliable. These encouraging results prompted us to also test predictions of the five reversals of the SINT (and PADM2M) data set, using a similarly calibrated model. Results will be presented and discussed. [1] Gissinger, C., 2012, A new deterministic model for chaotic reversals, European Physical Journal B, 85:137. [2] Valet, J.-P., Meynadier, L., and Guyodo, Y., 2005, Geomagnetic field strength and reversal rate over the past 2 million years, Nature, 435, 802-805. [3] Ziegler, L. B., Constable, C. G., Johnson, C. L., and Tauxe, L., 2011, PADM2M: a penalized maximum likelihood model of the 0-2 Ma paleomagnetic axial dipole moment, Geophysical Journal International, 184, 1069-1089.
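
    To fix ideas about sequentially calibrating a low-dimensional chaotic ODE against a scalar record, the sketch below runs a perturbed-observation EnKF on a three-variable chaotic system, observing only one component (as only the dipole moment is observed above). It uses Lorenz-63 as a stand-in, not the Gissinger equations, and all ensemble sizes, error levels, and step sizes are invented; the abstract in fact reports that implicit sampling outperformed the EnKF on this problem.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def lorenz_step(x, dt=0.01, s=10.0, r=28.0, b=8.0 / 3.0):
        """One Euler step of Lorenz-63, standing in for the 3-ODE dynamo model."""
        dx = np.array([s * (x[1] - x[0]),
                       x[0] * (r - x[2]) - x[1],
                       x[0] * x[1] - b * x[2]])
        return x + dt * dx

    n_ens, obs_var = 30, 1.0
    H = np.array([[1.0, 0.0, 0.0]])     # observe only the "dipole" component
    truth = np.array([1.0, 1.0, 1.0])
    ens = truth + rng.normal(0, 2.0, size=(n_ens, 3))

    for cycle in range(100):
        # Forecast: advance truth and every ensemble member between observations.
        for _ in range(5):
            truth = lorenz_step(truth)
            ens = np.apply_along_axis(lorenz_step, 1, ens)
        y = H @ truth + rng.normal(0, np.sqrt(obs_var))
        # Perturbed-observation EnKF analysis step.
        Xp = ens - ens.mean(0)
        Pf = Xp.T @ Xp / (n_ens - 1)
        K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + obs_var)
        y_pert = y + rng.normal(0, np.sqrt(obs_var), size=(n_ens, 1))
        ens = ens + (y_pert - ens @ H.T) @ K.T

    print("truth        :", truth)
    print("analysis mean:", ens.mean(0))
    ```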

  9. Specific phenotypic traits of Starmerella bacillaris regarding nitrogen source consumption and central carbon metabolites production during wine fermentation.

    PubMed

    Englezos, Vasileios; Cocolin, Luca; Rantsiou, Kalliopi; Ortiz-Julien, Anne; Bloem, Audrey; Dequin, Sylvie; Camarasa, Carole

    2018-06-01

    Over the past few years, the potential of non-Saccharomyces yeasts to improve the sensory quality of wine has been well recognized. In particular, the use of Starmerella bacillaris in mixed fermentations with Saccharomyces cerevisiae was reported as an appropriate way to enhance glycerol formation and reduce ethanol production. However, during sequential fermentation, many factors, such as the inoculation timing, strain combination, and physical and biochemical interactions, can affect yeast growth, the fermentation process, and/or metabolite synthesis. Among them, yeast assimilable nitrogen (YAN) availability, due to its role in the control of growth and fermentation, has been identified as a key parameter. Consequently, a comprehensive understanding of the metabolic specificities and the nitrogen requirements would be valuable to better exploit the potential of Starm. bacillaris during wine fermentation. In this study, marked differences in the consumption of the total and individual nitrogen sources were registered between the two species, while the two Starm. bacillaris strains generally behaved uniformly. Starm. bacillaris strains are differentiated by their preferential uptake of ammonium compared with amino acids, which are poorly assimilated or even produced (alanine). Otherwise, the non-Saccharomyces yeast exhibits low activity through the acetaldehyde pathway, which triggers an important redistribution of fluxes through the central carbon metabolic network. In particular, the formation of metabolites deriving from the two glycolytic intermediates glyceraldehyde-3-phosphate and pyruvate is substantially increased during fermentations by Starm. bacillaris. This knowledge will be useful to better control the fermentation process in mixed fermentations with Starm. bacillaris and S. cerevisiae. IMPORTANCE: Mixed fermentations using a controlled inoculation of Starm. bacillaris and S. cerevisiae starter cultures represent a feasible way to modulate wine composition that takes advantage of both the phenotypic specificities of the non-Saccharomyces strain and the ability of S. cerevisiae to complete wine fermentation. However, depending on the composition of grape juices, the consumption by Starm. bacillaris of nutrients, in particular of nitrogen sources, during the first stages of the process may result in depletions that further limit the growth of S. cerevisiae and lead to stuck or sluggish fermentations. Consequently, understanding the preferences of non-Saccharomyces yeasts for the nitrogen sources available in grape must, together with their phenotypic specificities, is essential for an efficient implementation of sequential wine fermentations with Starm. bacillaris and S. cerevisiae species. The results of our studies demonstrate a clear preference for ammonium compared to amino acids for the non-Saccharomyces species. This finding underlines the importance of nitrogen sources, which modulate the functional characteristics of inoculated yeast strains, for better control of the fermentation process and product quality. Copyright © 2018 Englezos et al.

  10. Direct variational data assimilation algorithm for atmospheric chemistry data with transport and transformation model

    NASA Astrophysics Data System (ADS)

    Penenko, Alexey; Penenko, Vladimir; Nuterman, Roman; Baklanov, Alexander; Mahura, Alexander

    2015-11-01

    Atmospheric chemistry dynamics is studied with a convection-diffusion-reaction model. The numerical data assimilation algorithm presented is based on additive-averaged splitting schemes. It carries out "fine-grained" variational data assimilation on the separate splitting stages with respect to spatial dimensions and processes, i.e., the same measurement data are assimilated into different parts of the split model. This design has an efficient implementation due to direct data assimilation algorithms for the transport process along coordinate lines. Results of numerical experiments with the chemical data assimilation algorithm, using in situ concentration measurements in a real-data scenario, are presented. In order to construct the scenario, meteorological data were taken from EnviroHIRLAM model output, initial conditions from MOZART model output, and measurements from the Airbase database.

  11. Variational fine-grained data assimilation schemes for atmospheric chemistry transport and transformation models

    NASA Astrophysics Data System (ADS)

    Penenko, Alexey; Penenko, Vladimir; Tsvetova, Elena

    2015-04-01

    The paper concerns the data assimilation problem for atmospheric chemistry transport and transformation models. Data assimilation is carried out within a variational approach on a single time step of the approximated model. A control function is introduced into the model source term (emission rate) to provide flexibility to adjust to the data. This function is evaluated as the minimizer of a target functional combining the control function norm with a misfit between measured data and their model-simulated analogs. This provides a flow-dependent and physically plausible structure of the resulting analysis and reduces the need to calculate the model error covariance matrices that are required within the conventional approach to data assimilation. Extending the atmospheric transport model with a chemical transformation module influences the performance of the data assimilation algorithms. This influence is investigated with numerical experiments for different meteorological conditions altering the characteristics of the convection-diffusion processes, namely strong, medium, and low wind conditions. To study the impact of transformation and data assimilation, we compare results for a convection-diffusion model (without data assimilation), convection-diffusion with assimilation, a convection-diffusion-reaction model (without data assimilation), and a convection-diffusion-reaction-assimilation model. Both the high dimensionality of atmospheric chemistry models and the real-time mode of operation demand computational efficiency from the algorithms. Computational issues with complicated models can be addressed by using a splitting technique. As a result, the model is presented as a set of relatively independent simple models equipped with a coupling procedure. With regard to data assimilation, two approaches can be identified. In a fine-grained approach, data assimilation is carried out on the separate splitting stages [1,2] independently, on shared measurement data. The same situation arises when constructing a hybrid model out of two models, each having its own assimilation scheme. In integrated schemes, data assimilation is carried out with respect to the split model as a whole. The first approach is more efficient from a computational point of view, for in some important cases it can be implemented without iterations [2]. Its shortcoming is that the control functions in different parts of the model are adjusted independently and thus have a less evident physical sense. With the aid of numerical experiments, we compare the two approaches. Work has been partially supported by COST Action ES1004 STSM Grants #16817 and #21654, RFBR 14-01-31482 mol a and 14-01-00125, Programmes #4 of the Presidium RAS and #3 MSD RAS, and integration projects SB RAS #8 and #35. References: [1] V. V. Penenko, Variational methods of data assimilation and inverse problems for studying the atmosphere, ocean, and environment, Num. Anal. and Appl., 2009, V 2, No 4, 341-351. [2] A. V. Penenko and V. V. Penenko, Direct data assimilation method for convection-diffusion models based on splitting scheme, Computational Technologies, 19(4):69-83, 2014.
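
    The single-step, source-term formulation described above has a convenient property: with a linear one-step model, the target functional is quadratic in the control, so the minimizer is available in closed form, without iterations. The sketch below illustrates this for a hypothetical 1-D periodic convection-diffusion step with a handful of in situ measurement sites; the grid, operator, weights, and data are all invented, and the real scheme works per splitting stage along coordinate lines.

    ```python
    import numpy as np

    n, dt = 50, 0.1
    c, D, dx = 1.0, 0.05, 1.0

    # One-step linear operator A for periodic 1-D upwind advection-diffusion.
    I = np.eye(n)
    shift_m = np.roll(I, -1, axis=1)   # (shift_m @ x)[i] = x[i-1]
    shift_p = np.roll(I,  1, axis=1)   # (shift_p @ x)[i] = x[i+1]
    A = (I - dt * c / dx * (I - shift_m)
           + dt * D / dx**2 * (shift_m - 2 * I + shift_p))

    H = np.zeros((5, n))
    H[np.arange(5), [5, 15, 25, 35, 45]] = 1.0     # five in situ sites
    x = np.zeros(n)                                # current concentration field
    y = np.array([0.0, 0.3, 1.0, 0.3, 0.0])        # measurements at this step
    alpha = 0.1                                    # weight on the control norm

    # One-step model with emission control u: x_new = A @ x + dt * u.
    # J(u) = alpha * ||u||^2 + ||H @ (A @ x + dt * u) - y||^2 is quadratic,
    # so the minimizing control solves a single linear system.
    r = y - H @ (A @ x)
    u = np.linalg.solve(alpha * I + dt**2 * H.T @ H, dt * H.T @ r)
    x_new = A @ x + dt * u
    print("analysis at obs sites:", (H @ x_new).round(3))
    ```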

  12. Assimilation of enterprise technology upgrades: a factor-based study

    NASA Astrophysics Data System (ADS)

    Claybaugh, Craig C.; Ramamurthy, Keshavamurthy; Haseman, William D.

    2017-02-01

    The purpose of this study is to gain a better understanding of the differences in the propensity of firms to initiate and commit to the assimilation of an enterprise technology upgrade. A research model is proposed that examines the influences that four technological and four organisational factors have on predicting assimilation of a technology upgrade. Results show that firms with a greater propensity to assimilate the new enterprise resource planning (ERP) version have a higher assessment of relative advantage, IS technical competence, and the strategic role of IS relative to those firms with a lower propensity to assimilate a new ERP version.

  13. Aerosol data assimilation in the chemical transport model MOCAGE during the TRAQA/ChArMEx campaign: aerosol optical depth

    NASA Astrophysics Data System (ADS)

    Sič, Bojan; El Amraoui, Laaziz; Piacentini, Andrea; Marécal, Virginie; Emili, Emanuele; Cariolle, Daniel; Prather, Michael; Attié, Jean-Luc

    2016-11-01

    In this study, we describe the development of the aerosol optical depth (AOD) assimilation module in the chemistry transport model (CTM) MOCAGE (Modèle de Chimie Atmosphérique à Grande Echelle). Our goal is to assimilate the spatially averaged 2-D column AOD data from the National Aeronautics and Space Administration (NASA) Moderate-resolution Imaging Spectroradiometer (MODIS) instrument, and to estimate improvements in a 3-D CTM assimilation run compared to a direct model run. Our assimilation system uses 3-D-FGAT (first guess at appropriate time) as an assimilation method and the total 3-D aerosol concentration as a control variable. In order to have an extensive validation dataset, we carried out our experiment in the northern summer of 2012 when the pre-ChArMEx (CHemistry and AeRosol MEditerranean EXperiment) field campaign TRAQA (TRAnsport à longue distance et Qualité de l'Air dans le bassin méditerranéen) took place in the western Mediterranean basin. The assimilated model run is evaluated independently against a range of aerosol properties (2-D and 3-D) measured by in situ instruments (the TRAQA size-resolved balloon and aircraft measurements), the satellite Spinning Enhanced Visible and InfraRed Imager (SEVIRI) instrument and ground-based instruments from the Aerosol Robotic Network (AERONET) network. The evaluation demonstrates that the AOD assimilation greatly improves aerosol representation in the model. For example, the comparison of the direct and the assimilated model run with AERONET data shows that the assimilation increased the correlation (from 0.74 to 0.88), and reduced the bias (from 0.050 to 0.006) and the root mean square error in the AOD (from 0.12 to 0.07). When compared to the 3-D concentration data obtained by the in situ aircraft and balloon measurements, the assimilation consistently improves the model output. The best results as expected occur when the shape of the vertical profile is correctly simulated by the direct model. We also examine how the assimilation can influence the modelled aerosol vertical distribution. The results show that a 2-D continuous AOD assimilation can improve the 3-D vertical profile, as a result of differential horizontal transport of aerosols in the model.
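
    The 3-D-FGAT method named in this record computes each innovation against the first guess valid at the observation's own time, while the correction itself is applied as in 3D-Var. A minimal sketch of the innovation step, with a hypothetical column-averaging "AOD-like" operator and invented numbers:

    ```python
    import numpy as np

    def fgat_innovations(trajectory, obs, obs_times, H):
        """First Guess at Appropriate Time: each innovation is computed against
        the background trajectory valid at the observation's own time, even
        though the correction itself is computed as in 3D-Var."""
        return [y - H @ trajectory[t] for y, t in zip(obs, obs_times)]

    rng = np.random.default_rng(3)
    # Hypothetical background trajectory: 6 times x 4 model levels.
    trajectory = 1.0 + np.cumsum(rng.normal(0, 0.1, size=(6, 4)), axis=0)
    H = np.ones((1, 4)) / 4.0            # toy column-integrating "AOD" operator
    obs = [np.array([1.05]), np.array([0.95]), np.array([1.10])]
    innovations = fgat_innovations(trajectory, obs, obs_times=[1, 3, 5], H=H)
    print([d.round(3) for d in innovations])
    ```

    The vertically integrating operator is the interesting part here: in the record above, the control variable is the 3-D aerosol concentration while the observation is a 2-D column quantity, which is how a 2-D AOD assimilation can end up improving the 3-D vertical profile.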

  14. Lotic ecosystem response to chronic metal contamination assessed by the resazurin-resorufin smart tracer with data assimilation by the Markov chain Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Stanaway, D. J.; Flores, A. N.; Haggerty, R.; Benner, S. G.; Feris, K. P.

    2011-12-01

    Concurrent assessment of biogeochemical and solute transport data (i.e., advection, dispersion, transient storage) within lotic systems remains a challenge in eco-hydrological research. Recently, the Resazurin-Resorufin Smart Tracer System (RRST) was proposed as a mechanism to measure microbial activity at the sediment-water interface [Haggerty et al., 2008, 2009], associating metabolic and hydrologic processes and allowing for the reach-scale extrapolation of biotic function in the context of a dynamic physical environment. This study presents a Markov chain Monte Carlo (MCMC) data assimilation technique to solve the inverse model of the Raz-Rru advection-dispersion equation (RRADE). The RRADE is a suite of dependent 1-D reactive ADEs, coupled through the microbially mediated reduction of Raz to Rru (k12); this reduction is proportional to DO consumption (R^2 = 0.928). MCMC is a suite of algorithms that apply Bayes' theorem to condition uncertain model states and parameters on imperfect observations. Here, the RRST is employed to quantify the effect of chronic metal exposure on hyporheic microbial metabolism along a 100+ year old metal contamination gradient in the Clark Fork River (CF). We hypothesized that 1) the energetic cost of metal tolerance limits heterotrophic microbial respiration in communities evolved in chronically metal-contaminated environments, with respiration inhibition directly correlated to the degree of contamination (observational experiment), and 2) when experiencing acute metal stress, respiration rate inhibition of metal-tolerant communities is less than that of naïve communities (manipulative experiment). To test these hypotheses, four replicate columns containing sediment collected from differently contaminated CF reaches and reference sites were fed a solution of RRST, NaCl, and cadmium (manipulative experiment only) within 24 h of collection. Column effluent was collected and measured for Raz, Rru, and EC to determine the Raz and Rru breakthrough curves (BTC), which were subsequently modeled by the RRADE, thereby allowing derivation of in situ rates of metabolism. RRADE parameter values are estimated through Metropolis-Hastings MCMC optimization. Unknown prior parameter distributions (PD) were constrained via a sensitivity analysis, except for the empirically estimated velocity. MCMC simulations were initiated at random points within the PD. Convergence of the target distributions (TD) is achieved when the variance of the mode values of the six RRADE parameters across independent model replications is at least 10^-3 smaller than the mode value. Convergence of k12, the parameter of interest, was more resolved, with the modal variance of replicate simulations ranging from 10^-4 smaller than the modal value to 0. The MCMC algorithm presented here offers a robust approach to solving the inverse RRST model and could be easily adapted to other inverse problems.
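
    The Metropolis-Hastings machinery named in this record is compact enough to sketch end to end. The toy below estimates a single first-order rate k12 from noisy decay data with a random-walk proposal and a flat prior; the real application replaces the exponential forward model with the full RRADE solve, and every number here is invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def log_post(k12, t, conc_obs, sigma=0.05):
        """Log-posterior for a toy first-order Raz->Rru rate k12 with a flat
        prior on (0, 2); the real problem solves a suite of reactive ADEs."""
        if not 0.0 < k12 < 2.0:
            return -np.inf
        conc_model = np.exp(-k12 * t)          # toy forward model
        return -0.5 * np.sum((conc_model - conc_obs) ** 2) / sigma**2

    t = np.linspace(0, 5, 20)
    obs = np.exp(-0.7 * t) + rng.normal(0, 0.05, t.size)  # synthetic data, true k12 = 0.7

    chain, k = [], 0.5                          # random starting point
    for _ in range(5000):
        k_prop = k + rng.normal(0, 0.05)        # random-walk proposal
        if np.log(rng.uniform()) < log_post(k_prop, t, obs) - log_post(k, t, obs):
            k = k_prop                          # Metropolis-Hastings accept step
        chain.append(k)

    burned = np.array(chain[1000:])             # discard burn-in
    print(f"k12 posterior: {burned.mean():.3f} +/- {burned.std():.3f}")
    ```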

  15. Geochemical Evidence for Mantle Enrichment and Lower Crustal Assimilation in Orogenic Volcanics from Monte Arcuentu, Southern Sardinia: Implications for Geodynamics and Evolution of the Western Mediterranean

    NASA Astrophysics Data System (ADS)

    Vero, S.; Kempton, P. D.; Downes, H.

    2016-12-01

    Miocene (ca. 18 Ma) subduction-related basalts and basaltic andesites from Monte Arcuentu (MA), southern Sardinia, show a remarkable correlation between SiO2 and 87Sr/86Sr (up to 0.711) that contrasts with most other orogenic volcanics worldwide. MgO ranges from 13.4 to 2.4 wt%, yet the rocks form a baseline trend at low SiO2 (51-56 wt%) from which other arcs diverge toward high SiO2. In contrast, MA exhibits a steep trend that extends toward the field of lithosphere-derived lamproites from central Italy. New high-precision Pb and Hf isotope data help to constrain the petrogenesis of these rocks. The most primitive MA rocks (MgO > 8.5 wt%) were derived from a mantle wedge metasomatized by melts derived from terrigenous sediment, likely derived from Archean terranes of N Africa. This metasomatized source had high 87Sr/86Sr (0.705-0.709) and 7/4Pb (15.65-15.67) with low ɛHf (-1 to +8) and ɛNd (+1 to -6), but does not account for the full range of isotopic compositions observed. More evolved rocks (MgO < 8.5 wt%) have higher 87Sr/86Sr (0.711) and 7/4Pb (15.68), and lower ɛHf (-8) and ɛNd (-9). However, one group of evolved rocks with low Rb/Ba trends toward low 6/4Pb, whereas another group with high Rb/Ba extends to high 6/4Pb. Mixing calculations suggest that evolved rocks with low Rb/Ba and low 6/4Pb interacted with Hercynian-type lower crust. High Rb/Ba, high 6/4Pb rocks may have interacted with lithospheric mantle similar to that sampled by Italian lamproites, but upper crustal contamination cannot be ruled out. Partial melting of these normally refractory lithologies was facilitated by the rapid extension, and subsequent mantle upwelling, that occurred as Sardinia rifted and rotated away from the European plate during the Miocene (32-15 Ma). High rates of melt accumulation and high melt fractions ponded near the Moho, creating a "hot zone" and enabling mafic crustal melting. Fractional crystallization under these P-T conditions involved olivine + clinopyroxene with little or no plagioclase, such that differentiation proceeded without a significant increase in SiO2. High rates of extension may also have facilitated rapid ascent of magmas to the surface with minimal interaction with the mid- to upper crust. The MA rocks provide insights into lower crustal assimilation processes that may be obscured by upper crustal AFC processes in other suites.

  16. Thermal coefficients of technology assimilation by natural systems

    NASA Technical Reports Server (NTRS)

    Mueller, R. F.

    1971-01-01

    Estimates of the thermal coefficients of the rates of technology assimilation processes were made. Consideration of such processes as vegetation and soil recovery and pollution assimilation indicates that these processes proceed ten to several hundred times more slowly in Earth's cold regions than in temperate regions. It is suggested that these differential assimilation rates are important data in planning for technological expansion in Arctic regions.

  17. A Unified Data Assimilation Strategy for Regional Coupled Atmosphere-Ocean Prediction Systems

    NASA Astrophysics Data System (ADS)

    Xie, Lian; Liu, Bin; Zhang, Fuqing; Weng, Yonghui

    2014-05-01

    Improving tropical cyclone (TC) forecasts is a top priority in weather forecasting. Assimilating various observational data to produce better initial conditions for numerical models using advanced data assimilation techniques has been shown to benefit TC intensity forecasts, whereas assimilating the large-scale environmental circulation into regional models by spectral nudging or Scale-Selective Data Assimilation (SSDA) has been demonstrated to improve TC track forecasts. Meanwhile, taking into account various air-sea interaction processes with high-resolution coupled air-sea modelling systems has also been shown to improve TC intensity forecasts. Despite the advances in data assimilation and air-sea coupled models, large errors in TC intensity and track forecasting remain. For example, Hurricane Nate (2011) brought considerable challenges for the TC operational forecasting community, with very large intensity forecast errors (27, 25, and 40 kt for 48, 72, and 96 h, respectively) in the official forecasts. Considering the slow-moving nature of Hurricane Nate, it is reasonable to hypothesize that air-sea interaction processes played a critical role in the intensity change of the storm, and that accurate representation of the upper-ocean dynamics and thermodynamics is necessary to quantitatively describe the air-sea interaction processes. Currently, data assimilation techniques are generally applied to hurricane forecasting only in stand-alone atmospheric or oceanic models. In fact, most regional hurricane forecasting models only include data assimilation techniques for improving the initial condition of the atmospheric model. In such a situation, the benefit of adjustments in one model (atmospheric or oceanic) from assimilating observational data can be compromised by errors from the other model. Thus, unified data assimilation techniques for coupled air-sea modelling systems, which not only simultaneously assimilate atmospheric and oceanic observations into the coupled air-sea modelling system but also nudge the large-scale environmental flow in the regional model toward global model forecasts, are of increasing necessity. In this presentation, we will outline a strategy for an integrated approach to air-sea coupled data assimilation and discuss its benefits and feasibility using incremental results for select historical hurricane cases.

  18. Toward a multivariate reanalysis of the North Atlantic ocean biogeochemistry during 1998-2006 based on the assimilation of SeaWiFS chlorophyll data

    NASA Astrophysics Data System (ADS)

    Fontana, C.; Brasseur, P.; Brankart, J.-M.

    2012-04-01

    Today, the routine assimilation of satellite data into operational models of the ocean circulation is mature enough to enable the production of global reanalyses describing the ocean circulation variability during the past decades. The expansion of the "reanalysis" concept from ocean physics to biogeochemistry is a timely challenge that motivates the present study. The objective of this paper is to investigate the potential benefits of assimilating satellite-estimated chlorophyll data into a basin-scale three-dimensional coupled physical-biogeochemical model of the North Atlantic. The aim is on the one hand to improve forecasts of ocean biogeochemical properties and on the other hand to define a methodology for producing data-driven climatologies based on coupled physical-biogeochemical modelling. A simplified variant of the Kalman filter is used to assimilate ocean color data during a 9-year period. In this frame, two experiments are carried out, with and without anamorphic transformations of the state vector variables. Data assimilation efficiency is assessed with respect to the assimilated data set, the nitrate World Ocean Atlas database, and a derived climatology. Along the simulation period, the non-linear assimilation scheme clearly improves the surface chlorophyll concentration analyses and forecasts, especially in the North Atlantic bloom region. Nitrate concentration forecasts are also improved thanks to the assimilation of ocean color data, although this improvement is limited to the upper layer of the water column, in agreement with recent related literature. This feature is explained by the weak correlation taken into account by the assimilation between surface phytoplankton and nitrate concentrations deeper than 50 m. The assessment of the non-linear assimilation experiments indicates that the proposed methodology provides the skeleton of an assimilative system suitable for reanalysing the ocean biogeochemistry based on ocean color data.

  19. Toward a multivariate reanalysis of the North Atlantic Ocean biogeochemistry during 1998-2006 based on the assimilation of SeaWiFS chlorophyll data

    NASA Astrophysics Data System (ADS)

    Fontana, C.; Brasseur, P.; Brankart, J.-M.

    2013-01-01

    Today, the routine assimilation of satellite data into operational models of ocean circulation is mature enough to enable the production of global reanalyses describing the ocean circulation variability during the past decades. The expansion of the "reanalysis" concept from ocean physics to biogeochemistry is a timely challenge that motivates the present study. The objective of this paper is to investigate the potential benefits of assimilating satellite-estimated chlorophyll data into a basin-scale three-dimensional coupled physical-biogeochemical model of the North Atlantic. The aim is on the one hand to improve forecasts of ocean biogeochemical properties and on the other hand to define a methodology for producing data-driven climatologies based on coupled physical-biogeochemical modeling. A simplified variant of the Kalman filter is used to assimilate ocean color data during a 9-year period. In this frame, two experiments are carried out, with and without anamorphic transformations of the state vector variables. Data assimilation efficiency is assessed with respect to the assimilated data set, nitrate from the World Ocean Atlas database, and a derived climatology. Along the simulation period, the non-linear assimilation scheme clearly improves the analyzed and forecast surface chlorophyll concentrations, especially in the North Atlantic bloom region. Nitrate concentration forecasts are also improved thanks to the assimilation of ocean color data, although this improvement is limited to the upper layer of the water column, in agreement with recent related literature. This feature is explained by the weak correlation taken into account by the assimilation between surface phytoplankton and nitrate concentrations deeper than 50 meters. The assessment of the non-linear assimilation experiments indicates that the proposed methodology provides the skeleton of an assimilative system suitable for reanalyzing the ocean biogeochemistry based on ocean color data.
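
    Both versions of this record hinge on "anamorphic transformations of the state vector variables." A minimal sketch of an empirical Gaussian anamorphosis follows, assuming a rank-based mapping of a skewed ensemble to standard-normal scores (the operational transform may be built differently); chlorophyll is a natural example because it is positive and roughly lognormal.

    ```python
    import numpy as np
    from scipy.stats import norm

    def anamorphosis(samples):
        """Empirical Gaussian anamorphosis: map a skewed ensemble to
        standard-normal scores, so a Kalman-type update can be applied
        in the transformed (approximately Gaussian) space."""
        ranks = samples.argsort().argsort()
        p = (ranks + 0.5) / samples.size
        return norm.ppf(p)

    def inverse_anamorphosis(z, samples):
        """Map transformed values back by interpolating the empirical quantiles."""
        z_grid = anamorphosis(samples)[samples.argsort()]
        return np.interp(z, z_grid, np.sort(samples))

    rng = np.random.default_rng(5)
    chl = rng.lognormal(mean=-1.0, sigma=0.8, size=200)  # skewed "chlorophyll" ensemble
    z = anamorphosis(chl)

    skew = lambda v: float(((v - v.mean()) ** 3).mean() / v.std() ** 3)
    print(f"skewness before: {skew(chl):.2f}, after: {skew(z):.2f}")
    print("round-trip error:", np.abs(inverse_anamorphosis(z, chl) - chl).max())
    ```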

  20. Temporal evolution of carbon budgets of the Appalachian forests in the U.S. from 1972 to 2000

    USGS Publications Warehouse

    Liu, J.; Liu, S.; Loveland, Thomas R.

    2006-01-01

    Estimating dynamic terrestrial ecosystem carbon (C) sources and sinks over large areas is difficult. The scaling of C sources and sinks from the field level to the regional level has been challenging due to the variations of climate, soil, vegetation, and disturbances. As part of an effort to estimate the spatial, temporal, and sectional dimensions of the United States C sources and sinks (the U.S. Carbon Trends Project), this study estimated the forest ecosystem C sequestration of the Appalachian region (186,000 km2) for the period of 1972–2000 using the General Ensemble Biogeochemical Modeling System (GEMS) that has a strong capability of assimilating land use and land cover change (LUCC) data. On 82 sampling blocks in the Appalachian region, GEMS used sequential 60 m resolution land cover change maps to capture forest stand-replacing events and used forest inventory data to estimate non-stand-replacing changes. GEMS also used Monte Carlo approaches to deal with spatial scaling issues such as initialization of forest age and soil properties. Ensemble simulations were performed to incorporate the uncertainties of input data. Simulated results show that from 1972 to 2000 the net primary productivity (NPP), net ecosystem productivity (NEP), and net biome productivity (NBP) averaged 6.2 Mg C ha−1 y−1 (±1.1), 2.2 Mg C ha−1 y−1 (±0.6), and 1.8 Mg C ha−1 y−1(±0.6), respectively. The inter-annual variability was driven mostly by climate. Detailed C budgets for the year 2000 were also calculated. Within a total 148,000 km2 forested area, average forest ecosystem C density was estimated to be 186 Mg C ha−1 (±20), of which 98 Mg C ha−1 (±12) was in biomass and 88 Mg C ha−1 (±13) was in litter and soil. The total simulated C stock of the Appalachian forests was estimated to be 2751 Tg C (±296), including 1454 Tg C (±178) in living biomass and 1297 Tg C (±192) in litter and soil. The total net C sequestration (i.e. NBP) of the forest ecosystem in 2000 was estimated to be 19.5 Tg C y−1 (±6.8).

  1. Methodological Developments in Geophysical Assimilation Modeling

    NASA Astrophysics Data System (ADS)

    Christakos, George

    2005-06-01

    This work presents recent methodological developments in geophysical assimilation research. We revisit the meaning of the term "solution" of a mathematical model representing a geophysical system, and we examine its operational formulations. We argue that an assimilation solution based on epistemic cognition (which assumes that the model describes incomplete knowledge about nature and focuses on conceptual mechanisms of scientific thinking) could lead to more realistic representations of the geophysical situation than a conventional ontologic assimilation solution (which assumes that the model describes nature as is and focuses on form manipulations). Conceptually, the two approaches are fundamentally different. Unlike the reasoning structure of conventional assimilation modeling that is based mainly on ad hoc technical schemes, the epistemic cognition approach is based on teleologic criteria and stochastic adaptation principles. In this way some key ideas are introduced that could open new areas of geophysical assimilation to detailed understanding in an integrated manner. A knowledge synthesis framework can provide the rational means for assimilating a variety of knowledge bases (general and site specific) that are relevant to the geophysical system of interest. Epistemic cognition-based assimilation techniques can produce a realistic representation of the geophysical system, provide a rigorous assessment of the uncertainty sources, and generate informative predictions across space-time. The mathematics of epistemic assimilation involves a powerful and versatile spatiotemporal random field theory that imposes no restriction on the shape of the probability distributions or the form of the predictors (non-Gaussian distributions, multiple-point statistics, and nonlinear models are automatically incorporated) and accounts rigorously for the uncertainty features of the geophysical system. In the epistemic cognition context the assimilation concept may be used to investigate critical issues related to knowledge reliability, such as uncertainty due to model structure error (conceptual uncertainty).

  2. Multi-Scale 4DVAR Assimilation of Glider Teams on the North Carolina Shelf

    NASA Astrophysics Data System (ADS)

    Osborne, J. J. V.; Carrier, M.; Book, J. W.; Barron, C. N.; Rice, A. E.; Rowley, C. D.; Smedstad, L.; Souopgui, I.; Teague, W. J.

    2017-12-01

    We demonstrate a method to assimilate glider profile data from multiple gliders in close proximity (about 10 km or less). Gliders were deployed in a field experiment from 17 May until 4 June 2017, north of Cape Hatteras and inshore of the Gulf Stream. Gliders were divided into two teams, generally two or three gliders per team. One team was tasked with station keeping and the other with moving and sampling regions of high variability in temperature and salinity. Glider data are assimilated into the Relocatable Navy Coastal Ocean Model (RELO NCOM) with four-dimensional variational assimilation (NCOM-4DVAR). RELO NCOM is used by the US Navy to predict the ocean. RELO NCOM is a baroclinic, Boussinesq, free-surface, hydrostatic ocean model with a flexible sigma-z vertical coordinate. Two domains are used, one focused north and one focused south of Cape Hatteras. The domains overlap near the gliders, thus providing two forecasts near the gliders. Both domains have 1 km horizontal resolution. Data are assimilated with a newly developed multi-scale data-processing and assimilation approach using NCOM-4DVAR. This enables NCOM-4DVAR to use many more observations than the standard NCOM-4DVAR, improving the analysis and forecast. Assimilation experiments use station-keeping glider data, moving glider data, or all glider data. Sea surface temperature (SST) data and satellite altimeter (SSH) data are also assimilated. An additional experiment omits glider data but still assimilates SST and SSH data. Conductivity, temperature, and depth (CTD) profiles from the R/V Savannah are used for validation, including data from an underway CTD (UCTD). Data from glider teams have the potential to significantly improve model forecasts. Missions using teams of gliders can be planned to maximize data assimilation for optimal impact on model predictions.

  3. The role of ensemble-based statistics in variational assimilation of cloud-affected observations from infrared imagers

    NASA Astrophysics Data System (ADS)

    Hacker, Joshua; Vandenberghe, Francois; Jung, Byoung-Jo; Snyder, Chris

    2017-04-01

    Effective assimilation of cloud-affected radiance observations from space-borne imagers, with the aim of improving cloud analysis and forecasting, has proven to be difficult. Large observation biases, nonlinear observation operators, and non-Gaussian innovation statistics present many challenges. Ensemble-variational data assimilation (EnVar) systems offer the benefits of flow-dependent background error statistics from an ensemble, and the ability of variational minimization to handle nonlinearity. The specific benefits of ensemble statistics, relative to the static background errors more commonly used in variational systems, have not been quantified for the problem of assimilating cloudy radiances. A simple experiment framework is constructed with a regional NWP model and an operational variational data assimilation system to provide a basis for understanding the importance of ensemble statistics in cloudy radiance assimilation. Restricting the observations to those corresponding to clouds in the background forecast leads to innovations that are more Gaussian. The number of large innovations is reduced compared to the more general case of all observations, but not eliminated. The Huber norm is investigated to handle the fat tails of the distributions and to allow more observations to be assimilated without the need for strict background checks that would eliminate them. Comparing assimilation using only ensemble background error statistics with assimilation using only static background error statistics elucidates the importance of the ensemble statistics. Although the cost functions in both experiments converge to similar values after sufficient outer-loop iterations, the resulting cloud water, ice, and snow contents are greater in the ensemble-based analysis. The subsequent forecasts from the ensemble-based analysis also retain more condensed water species, indicating that the local environment is more supportive of clouds. In this presentation we provide details that explain the apparent benefit of using ensembles for cloudy radiance assimilation in an EnVar context.
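
    The Huber norm mentioned above replaces the usual quadratic observation term with one that grows only linearly for large normalized innovations. A minimal sketch of that cost, with an invented error scale and the commonly quoted transition point c = 1.345 (the operational tuning may differ):

    ```python
    import numpy as np

    def huber_obs_cost(innov, sigma=1.0, c=1.345):
        """Huber-norm observation term: quadratic for small normalized
        innovations, linear beyond the transition point c, so fat-tailed
        cloudy-radiance innovations are down-weighted rather than rejected
        outright by a strict background check."""
        d = np.abs(innov) / sigma
        return np.where(d <= c, 0.5 * d**2, c * d - 0.5 * c**2)

    innovations = np.array([0.2, 1.0, 3.0, 8.0])
    print("quadratic cost:", (0.5 * innovations**2).round(2))
    print("Huber cost    :", huber_obs_cost(innovations).round(2))
    ```

    Note how the cost of the largest innovation grows linearly instead of quadratically: the observation still contributes to the minimization, just with a bounded gradient, which is exactly what lets more cloudy observations survive quality control.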

  4. Next generation initiation techniques

    NASA Technical Reports Server (NTRS)

    Warner, Tom; Derber, John; Zupanski, Milija; Cohn, Steve; Verlinde, Hans

    1993-01-01

    Four-dimensional data assimilation strategies can generally be classified as either current or next generation, depending upon whether they are used operationally or not. Current-generation data-assimilation techniques are those that are presently used routinely in operational-forecasting or research applications. They can be classified into the following categories: intermittent assimilation, Newtonian relaxation, and physical initialization. It should be noted that these techniques are the subject of continued research, and their improvement will parallel the development of next generation techniques described by the other speakers. Next generation assimilation techniques are those that are under development but are not yet used operationally. Most of these procedures are derived from control theory or variational methods and primarily represent continuous assimilation approaches, in which the data and model dynamics are 'fitted' to each other in an optimal way. Another 'next generation' category is the initialization of convective-scale models. Intermittent assimilation systems use an objective analysis to combine all observations within a time window that is centered on the analysis time. Continuous first-generation assimilation systems are usually based on the Newtonian-relaxation or 'nudging' techniques. Physical initialization procedures generally involve the use of standard or nonstandard data to force some physical process in the model during an assimilation period. Under the topic of next-generation assimilation techniques, variational approaches are currently being actively developed. Variational approaches seek to minimize a cost or penalty function which measures a model's fit to observations, background fields and other imposed constraints. Alternatively, the Kalman filter technique, which is also under investigation as a data assimilation procedure for numerical weather prediction, can yield acceptable initial conditions for mesoscale models. The third kind of next-generation technique involves strategies to initialize convective scale (non-hydrostatic) models.
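
    The Newtonian-relaxation ("nudging") technique named in this overview is simple enough to state in a few lines: a relaxation term proportional to the misfit between the model state and an observation-based analysis is added to the model tendency throughout the pre-forecast period. The sketch below shows the idea for a toy state vector; the relaxation timescale, time step, and values are hypothetical.

    ```python
    import numpy as np

    def nudging_step(x, x_obs, dt, tau=3600.0, model_tendency=None):
        """Newtonian relaxation: add a term -(x - x_obs)/tau to the model
        tendency so the state is drawn continuously toward the analyzed
        observations during the assimilation period."""
        tend = model_tendency(x) if model_tendency else np.zeros_like(x)
        return x + dt * (tend - (x - x_obs) / tau)

    x = np.array([290.0, 295.0, 300.0])      # e.g. a temperature column (K)
    x_obs = np.array([288.0, 296.0, 299.0])  # gridded analysis of observations
    for _ in range(24):                      # 24 steps of 15 min
        x = nudging_step(x, x_obs, dt=900.0)
    print(x.round(2))
    ```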

  5. Empowering Geoscience with Improved Data Assimilation Using the Data Assimilation Research Testbed "Manhattan" Release.

    NASA Astrophysics Data System (ADS)

    Raeder, K.; Hoar, T. J.; Anderson, J. L.; Collins, N.; Hendricks, J.; Kershaw, H.; Ha, S.; Snyder, C.; Skamarock, W. C.; Mizzi, A. P.; Liu, H.; Liu, J.; Pedatella, N. M.; Karspeck, A. R.; Karol, S. I.; Bitz, C. M.; Zhang, Y.

    2017-12-01

    The capabilities of the Data Assimilation Research Testbed (DART) at NCAR have been significantly expanded with the recent "Manhattan" release. DART is an ensemble-Kalman-filter-based suite of tools, which enables researchers to use data assimilation (DA) without first becoming DA experts. Highlights: significant improvement in efficient ensemble DA for very large models on thousands of processors, direct read and write of model state files in parallel, more control of the DA output for finer-grained analysis, new model interfaces which are useful to a variety of geophysical researchers, new observation forward operators, and the ability to use precomputed forward operators from the forecast model. The new model interfaces and example applications include the following: MPAS-A; Model for Prediction Across Scales - Atmosphere is a global, nonhydrostatic, variable-resolution mesh atmospheric model, which facilitates multi-scale analysis and forecasting. The absence of distinct subdomains eliminates problems associated with subdomain boundaries. It demonstrates the ability to consistently produce higher-quality analyses than coarse, uniform meshes do. WRF-Chem; Weather Research and Forecasting + (MOZART) Chemistry model assimilates observations from FRAPPÉ (Front Range Air Pollution and Photochemistry Experiment). WACCM-X; Whole Atmosphere Community Climate Model with thermosphere and ionosphere eXtension assimilates observations of electron density to investigate sudden stratospheric warming. CESM (weakly) coupled assimilation; NCAR's Community Earth System Model is used for assimilation of atmospheric and oceanic observations into their respective components using coupled atmosphere+land+ocean+sea-ice forecasts. CESM2.0; Assimilation in the atmospheric component (CAM, WACCM) of the newly released version is supported. This version contains new and extensively updated components and a new software environment. CICE; the Los Alamos sea ice model (in CESM) is used to assimilate multivariate sea ice concentration observations to constrain the model's ice thickness, concentration, and parameters.

  6. Light-Stimulated Bacterial Production and Amino Acid Assimilation by Cyanobacteria and Other Microbes in the North Atlantic Ocean

    PubMed Central

    Michelou, Vanessa K.; Cottrell, Matthew T.; Kirchman, David L.

    2007-01-01

    We examined the contribution of photoheterotrophic microbes—those capable of light-mediated assimilation of organic compounds—to bacterial production and amino acid assimilation along a transect from Florida to Iceland from 28 May to 9 July 2005. Bacterial production (leucine incorporation at a 20 nM final concentration) was on average 30% higher in light than in dark-incubated samples, but the effect varied greatly (3% to 60%). To further characterize this light effect, we examined the abundance of potential photoheterotrophs and measured their contribution to bacterial production and amino acid assimilation (0.5 nM addition) using flow cytometry. Prochlorococcus and Synechococcus were abundant in surface waters where light-dependent leucine incorporation was observed, whereas aerobic anoxygenic phototrophic bacteria were abundant but did not correlate with the light effect. The per-cell assimilation rates of Prochlorococcus and Synechococcus were comparable to or higher than those of other prokaryotes, especially in the light. Picoeukaryotes also took up leucine (20 nM) and other amino acids (0.5 nM), but rates normalized to biovolume were much lower than those of prokaryotes. Prochlorococcus was responsible for 80% of light-stimulated bacterial production and amino acid assimilation in surface waters south of the Azores, while Synechococcus accounted for on average 12% of total assimilation. However, nearly 40% of the light-stimulated leucine assimilation was not accounted for by these groups, suggesting that assimilation by other microbes is also affected by light. Our results clarify the contribution of cyanobacteria to photoheterotrophy and highlight the potential role of other photoheterotrophs in biomass production and dissolved-organic-matter assimilation. PMID:17630296

  7. Mitochondrial Respiration Can Support NO3- and NO2- Reduction during Photosynthesis: Interactions between Photosynthesis, Respiration, and N Assimilation in the N-Limited Green Alga Selenastrum minutum.

    PubMed

    Weger, H G; Turpin, D H

    1989-02-01

    Mass spectrometric analysis shows that assimilation of inorganic nitrogen (NH4+, NO2-, NO3-) by N-limited cells of Selenastrum minutum (Naeg.) Collins results in a stimulation of tricarboxylic acid cycle (TCA cycle) CO2 release in both the light and dark. In a previous study we have shown that TCA cycle reductant generated during NH4+ assimilation is oxidized via the cytochrome electron transport chain, resulting in an increase in respiratory O2 consumption during photosynthesis (HG Weger, DG Birch, IR Elrifi, DH Turpin [1988] Plant Physiol 86: 688-692). NO3- and NO2- assimilation resulted in a larger stimulation of TCA cycle CO2 release than did NH4+, but a much smaller stimulation of mitochondrial O2 consumption. NH4+ assimilation was the same in the light and dark and insensitive to DCMU, but was 82% inhibited by anaerobiosis in both the light and dark. NO3- and NO2- assimilation rates were maximal in the light, but assimilation could proceed at substantial rates in the light in the presence of DCMU and in the dark. Unlike NH4+ assimilation, NO3- and NO2- assimilation were relatively insensitive to anaerobiosis. These results indicated that operation of the mitochondrial electron transport chain was not required to maintain TCA cycle activity during NO3- and NO2- assimilation, suggesting an alternative sink for TCA cycle-generated reductant. Evaluation of changes in gross O2 consumption during NO3- and NO2- assimilation suggests that TCA cycle reductant was exported to the chloroplast during photosynthesis and used to support NO3- and NO2- reduction.

  8. User's Guide - WRF Lightning Assimilation

    EPA Pesticide Factsheets

    This document describes how to run WRF with the lightning assimilation technique described in Heath et al. (2016). The assimilation method uses gridded lightning data to trigger and suppress sub-grid deep convection in Kain-Fritsch.
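
    As a schematic of the trigger-and-suppress idea (not the actual Heath et al. (2016) implementation or its configuration options), one can picture the per-column control flow as follows; the function and variable names are hypothetical:

    ```python
    import numpy as np

    def adjust_convection(flash_count, kf_active):
        """Schematic trigger/suppress logic: force sub-grid deep convection on
        where lightning was observed in the window, and suppress the scheme
        where it activated without any observed lightning."""
        lit = flash_count > 0
        active = kf_active.copy()
        active[lit] = True        # trigger convection under observed lightning
        active[~lit] = False      # suppress convection where no flashes occurred
        return active

    flash_count = np.array([0, 3, 0, 12])            # gridded flashes in the window
    kf_active = np.array([True, False, True, True])  # scheme's own activation
    print(adjust_convection(flash_count, kf_active))
    ```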

  9. Sampling, feasibility, and priors in data assimilation

    DOE PAGES

    Tu, Xuemin; Morzfeld, Matthias; Miller, Robert N.; ...

    2016-03-01

    Importance sampling algorithms are discussed in detail, with an emphasis on implicit sampling, and applied to data assimilation via particle filters. Implicit sampling makes it possible to use the data to find high-probability samples at relatively low cost, making the assimilation more efficient. A new analysis of the feasibility of data assimilation is presented, showing in detail why feasibility depends on the Frobenius norm of the covariance matrix of the noise and not on the number of variables. A discussion of the convergence of particular particle filters follows. A major open problem in numerical data assimilation is the determination of appropriate priors; a progress report on recent work on this problem is given. The analysis highlights the need for careful attention both to the data and to the physics in data assimilation problems.
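
    As a minimal illustration of the particle-filter machinery discussed here (a generic sequential importance resampling step with a Gaussian observation likelihood, not the implicit-sampling scheme itself; the state variables, noise levels, and function names are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_assimilate(particles, weights, y_obs, obs_std):
    """One particle-filter analysis step: reweight each particle by the
    likelihood of the observation, then resample if the effective sample
    size collapses (weight degeneracy)."""
    weights = weights * np.exp(-0.5 * ((y_obs - particles) / obs_std) ** 2)
    weights /= weights.sum()
    n_eff = 1.0 / np.sum(weights ** 2)  # effective sample size
    if n_eff < 0.5 * particles.size:
        idx = rng.choice(particles.size, size=particles.size, p=weights)
        particles = particles[idx]
        weights = np.full(particles.size, 1.0 / particles.size)
    return particles, weights

# Toy usage: a 1000-particle prior assimilates a single observation
particles = rng.normal(0.0, 1.0, 1000)
weights = np.full(1000, 1.0 / 1000)
particles, weights = pf_assimilate(particles, weights, y_obs=0.8, obs_std=0.5)
posterior_mean = np.sum(weights * particles)
```

    Implicit sampling differs from this plain reweighting in that it constructs the samples themselves near the high-probability region indicated by the data, which is precisely what keeps the weights from degenerating.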

  10. An OSSE Study for Deep Argo Array using the GFDL Ensemble Coupled Data Assimilation System

    NASA Astrophysics Data System (ADS)

    Chang, You-Soon; Zhang, Shaoqing; Rosati, Anthony; Vecchi, Gabriel A.; Yang, Xiaosong

    2018-03-01

    An observing system simulation experiment (OSSE) using an ensemble coupled data assimilation system was designed to investigate the impact of deep-ocean Argo profile assimilation in a biased numerical climate system. Based on the modern Argo observational array and an artificial extension to full depth, "observations" drawn from one coupled general circulation model (CM2.0) were assimilated into another model (CM2.1). Our results showed that coupled data assimilation with simultaneous atmospheric and oceanic constraints plays a significant role in preventing deep-ocean drift. However, the extension of the Argo array to full depth did not significantly improve the quality of the oceanic climate estimation relative to the magnitude of model bias in the twin experiment. Even in the "identical" twin experiment, in which the deep Argo array was drawn from the same model (CM2.1) as the assimilation model, no significant changes were seen in the deep ocean, such as in the Atlantic meridional overturning circulation and the Antarctic bottom water cell. The small ensemble spread, and the correspondingly weak constraint exerted by deep Argo profiles of medium spatial and temporal resolution, may explain why the deep Argo profiles did not improve the deep-ocean features in the assimilation system. Additional studies using different assimilation methods with improved spatial and temporal resolution of the deep Argo array are necessary in order to more thoroughly understand the impact of the deep Argo array on the assimilation system.

  11. Added Value of Assimilating Himawari-8 AHI Water Vapor Radiances on Analyses and Forecasts for "7.19" Severe Storm Over North China

    NASA Astrophysics Data System (ADS)

    Wang, Yuanbing; Liu, Zhiquan; Yang, Sen; Min, Jinzhong; Chen, Liqiang; Chen, Yaodeng; Zhang, Tao

    2018-04-01

    Himawari-8 is the first of the new-generation geostationary meteorological satellites to be launched and become operational. The Advanced Himawari Imager (AHI) on board Himawari-8 provides continuous high-resolution observations of severe weather phenomena in space and time. In this study, the capability to assimilate AHI radiances has been developed within the Weather Research and Forecasting (WRF) model's data assimilation system. As the first attempt to assimilate AHI using WRF data assimilation at convective scales, the added value of hourly AHI clear-sky radiances from three water vapor channels on convection-permitting (3 km) analyses and forecasts of the "7.19" severe rainstorm that occurred over north China during 18-21 July 2016 was investigated. Analyses were produced hourly, and 24-h forecasts were produced every 6 h. The results showed that improved wind and humidity fields were obtained in analyses and forecasts verified against conventional observations after assimilating AHI water vapor radiances, when compared to the control experiment that assimilated only conventional observations. It was also found that the assimilation of AHI water vapor radiances had a clearly positive impact on the rainfall forecast for the first 6-h lead time, especially for heavy rainfall exceeding 100 mm, when verified against the observed rainfall. Furthermore, the horizontal and vertical distributions of features in the moisture fields were improved after assimilating AHI water vapor radiances, eventually contributing to a better forecast of the severe rainstorm.

  12. Variational Continuous Assimilation of TMI and SSM/I Rain Rates: Impact on GEOS-3 Hurricane Analyses and Forecasts

    NASA Technical Reports Server (NTRS)

    Hou, Arthur Y.; Zhang, Sara Q.; Reale, Oreste

    2003-01-01

    We describe a variational continuous assimilation (VCA) algorithm for assimilating tropical rainfall data, using moisture and temperature tendency corrections as the control variable to offset model deficiencies. For rainfall assimilation, model errors are of special concern since model-predicted precipitation is based on parameterized moist physics, which can have substantial systematic errors. This study examines whether a VCA scheme using the forecast model as a weak constraint offers an effective pathway to precipitation assimilation. The particular scheme we examine employs a '1+1'-dimensional precipitation observation operator based on a 6-h integration of a column model of moist physics from the Goddard Earth Observing System (GEOS) global data assimilation system (DAS). In earlier studies, we tested a simplified version of this scheme and obtained improved monthly-mean analyses and better short-range forecast skills. This paper describes the full implementation of the 1+1D VCA scheme using background and observation error statistics, and examines how it may improve GEOS analyses and forecasts of prominent tropical weather systems such as hurricanes. Parallel assimilation experiments with and without rainfall data for Hurricanes Bonnie and Floyd show that assimilating 6-h TMI and SSM/I surface rain rates leads to more realistic storm features in the analysis, which, in turn, provide better initial conditions for 5-day storm track prediction and precipitation forecasts. These results provide evidence that addressing model deficiencies in the moisture tendency may be crucial to making effective use of precipitation information in data assimilation.
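
    Schematically (our notation, not the authors'), a weak-constraint VCA cost of this kind minimizes, over the tendency-correction control variable, a background penalty plus the misfit between model-predicted and observed rain rates:

```latex
\min_{\delta q}\; J(\delta q)
  = \tfrac{1}{2}\,\delta q^{\top} B^{-1}\,\delta q
  + \tfrac{1}{2}\sum_{k}\bigl(r_k(\delta q) - r_k^{\mathrm{obs}}\bigr)^{\top} R^{-1}\bigl(r_k(\delta q) - r_k^{\mathrm{obs}}\bigr)
```

    Here δq is the moisture/temperature tendency correction, r_k(δq) the rain rates produced by the 6-h column moist-physics integration, r_k^obs the TMI/SSM/I rain rates, and B and R the background and observation error covariances mentioned in the abstract.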

  13. Evaluation of radar and automatic weather station data assimilation for a heavy rainfall event in southern China

    NASA Astrophysics Data System (ADS)

    Hou, Tuanjie; Kong, Fanyou; Chen, Xunlai; Lei, Hengchi; Hu, Zhaoxia

    2015-07-01

    To improve the accuracy of short-term (0-12 h) forecasts of severe weather in southern China, a real-time storm-scale forecasting system, the Hourly Assimilation and Prediction System (HAPS), has been implemented in Shenzhen, China. The forecasting system combines the Advanced Research Weather Research and Forecasting (WRF-ARW) model with the Advanced Regional Prediction System (ARPS) three-dimensional variational data assimilation (3DVAR) package. It is capable of assimilating radar reflectivity and radial velocity data from multiple Doppler radars as well as surface automatic weather station (AWS) data. Experiments are designed to evaluate the impacts of data assimilation on quantitative precipitation forecasting (QPF) by studying a heavy rainfall event in southern China. The forecasts from these experiments are verified against radar, surface, and precipitation observations. Comparison of echo structure and accumulated precipitation suggests that radar data assimilation is useful in improving the short-term forecast by capturing the location and orientation of the band of accumulated rainfall. The assimilation of radar data improves the short-term precipitation forecast skill for up to 9 hours by producing more convection. The slight but generally positive impact that surface AWS data have on the forecast of near-surface variables can last up to 6-9 hours. The assimilation of AWS observations alone has some benefit for improving the Fractions Skill Score (FSS) and bias scores; when radar data are assimilated, the additional AWS data may increase the degree of rainfall overprediction.

  14. Understanding the influence of assimilating satellite-derived observations on mesoscale analyses and forecasts of tropical cyclone track and structure

    NASA Astrophysics Data System (ADS)

    Wu, Ting-Chi

    This dissertation research explores the influence of assimilating satellite-derived observations on mesoscale numerical analyses and forecasts of tropical cyclones (TCs). The ultimate goal is to provide more accurate mesoscale analyses of a TC and its surrounding environment for superior TC track and intensity forecasts. High spatial and temporal resolution satellite-derived observations are prepared for two TC cases, Typhoon Sinlaku and Hurricane Ike (both 2008). The Advanced Research version of the Weather Research and Forecasting model (ARW-WRF) is employed, and data are assimilated using the Ensemble Adjustment Kalman Filter (EAKF) implemented in the Data Assimilation Research Testbed. In the first part of this research, the influence of assimilating enhanced atmospheric motion vectors (AMVs) derived from geostationary satellites is examined by comparing three parallel WRF/EnKF experiments. The control experiment assimilates the same AMV dataset used in the NCEP operational analysis, along with conventional observations from radiosondes, aircraft, and advisory TC position data. During Sinlaku and Ike, the Cooperative Institute for Meteorological Satellite Studies (CIMSS) generated hourly AMVs, along with Rapid-Scan (RS) AMVs when the satellite RS mode was activated. With an order of magnitude more AMV data assimilated, the assimilation of the hourly CIMSS AMV dataset yields initial TC position, intensity, and structure estimates superior to the control analyses, as well as better subsequent short-range forecasts. When RS AMVs are processed and assimilated, they offer additional modification of the TC and its environment and lead to Sinlaku's recurvature toward Japan, albeit prematurely. These results demonstrate the promise of assimilating enhanced AMV data into regional TC models. The second part of this research extends the first by conducting parallel data-denial WRF/EnKF experiments that assimilate AMVs subsetted horizontally by their distance to the TC center (interior and exterior) and vertically by their assigned heights (upper, middle, and lower layers). For both Sinlaku and Ike, it is found that: 1) interior AMVs are important for accurate TC intensity; 2) excluding upper-layer AMVs generally results in larger track errors and ensemble spread; 3) excluding interior AMVs has a larger impact on the forecast of TC size than removing AMVs in any particular tropospheric layer; 4) the largest ensemble spreads in track, intensity, and size forecasts occur when interior and upper-layer AMVs are not included; and 5) withholding the middle-layer AMVs can improve the track forecasts. Findings from this study could inform future scenarios involving the targeted acquisition and assimilation of high-density AMV observations in TC events. The last part of the research focuses on the assimilation of hyperspectral temperature and moisture soundings and microwave-based vertically integrated total precipitable water (TPW) products derived from polar-orbiting satellites. A comparison is made between the assimilation of soundings retrieved from the combined use of the Advanced Microwave Sounding Unit and the Atmospheric Infrared Sounder (AMSU-AIRS) and sounding products provided by CIMSS (CIMSS-AIRS).
AMSU-AIRS soundings provide broad spatial coverage, albeit at coarse resolution, whilst CIMSS-AIRS is geared towards mesoscale applications and thus provides higher spatial resolution but restricted coverage, owing to its use of clear-sky radiances only. The assimilation of bias-corrected CIMSS-AIRS soundings provides slightly more accurate TC structure than the control case. The assimilation of AMSU-AIRS improves the track forecasts but produces a weaker and smaller storm. Preliminary results of assimilating the TPW product derived from the Advanced Microwave Scanning Radiometer-EOS indicate improved TC structure over the control case; however, the short-range forecasts exhibit the largest TC track errors. In all, this study demonstrates the influence of assimilating high-resolution satellite data on mesoscale analyses and forecasts of TC track and structure. The results suggest that including and assimilating observations with high temporal resolution, broad spatial coverage, and greater proximity to TCs does indeed improve TC track and structure forecasts. Such findings are beneficial for future decisions on the data collection and retrievals that are essential for TC forecasts.

  15. Processing of phonological variation in children with hearing loss: compensation for English place assimilation in connected speech.

    PubMed

    Skoruppa, Katrin; Rosen, Stuart

    2014-06-01

    In this study, the authors explored phonological processing in connected speech in children with hearing loss. Specifically, the authors investigated these children's sensitivity to English place assimilation, by which alveolar consonants like t and n can adapt to following sounds (e.g., the word ten can be realized as tem in the phrase ten pounds). Twenty-seven 4- to 8-year-old children with moderate to profound hearing impairments, using hearing aids (n = 10) or cochlear implants (n = 17), and 19 children with normal hearing participated. They were asked to choose between pictures of familiar (e.g., pen) and unfamiliar objects (e.g., astrolabe) after hearing t- and n-final words in sentences. Standard pronunciations (Can you find the pen dear?) and assimilated forms in correct (… pem please?) and incorrect contexts (… pem dear?) were presented. As expected, the children with normal hearing chose the familiar object more often for standard forms and correct assimilations than for incorrect assimilations. Thus, they are sensitive to word-final place changes and compensate for assimilation. However, the children with hearing impairment demonstrated reduced sensitivity to word-final place changes, and no compensation for assimilation. Restricted analyses revealed that children with hearing aids who showed good perceptual skills compensated for assimilation in plosives only.

  16. A comparison of linear and non-linear data assimilation methods using the NEMO ocean model

    NASA Astrophysics Data System (ADS)

    Kirchgessner, Paul; Tödter, Julian; Nerger, Lars

    2015-04-01

    The assimilation behavior of the widely used LETKF is compared with that of the Equivalent-Weights Particle Filter (EWPF) in a data assimilation application with an idealized configuration of the NEMO ocean model. The experiments show how the different filter methods behave when applied to a realistic ocean test case. The LETKF is an ensemble-based Kalman filter, which assumes Gaussian error distributions and hence implicitly requires model linearity. In contrast, the EWPF is a fully nonlinear data assimilation method that does not rely on a particular error distribution. The EWPF has been demonstrated to work well in highly nonlinear situations, such as a model solving the barotropic vorticity equation, but it is still unknown how its assimilation performance compares to ensemble Kalman filters in realistic situations. Twin assimilation experiments are performed with a square-basin configuration of the NEMO model. The configuration simulates a double gyre, which exhibits significant nonlinearity. The LETKF and EWPF are both implemented in PDAF (Parallel Data Assimilation Framework, http://pdaf.awi.de), which ensures identical experimental conditions for both filters. To account for the nonlinearity, the assimilation skill of the two methods is assessed using different statistical metrics, such as the continuous ranked probability score (CRPS) and rank histograms.
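
    For concreteness, a compact sketch of the (global, unlocalized) ensemble transform Kalman filter analysis that the LETKF applies within each local region; the dimensions, observation operator, and error levels below are illustrative only:

```python
import numpy as np

def etkf_analysis(X, y, H, R):
    """ETKF analysis step. X: (n, N) state ensemble, y: (p,) observations,
    H: (p, n) linear observation operator, R: (p, p) obs error covariance."""
    n, N = X.shape
    x_mean = X.mean(axis=1, keepdims=True)
    Xp = X - x_mean                          # state perturbations
    Y = H @ X
    y_mean = Y.mean(axis=1, keepdims=True)
    S = Y - y_mean                           # obs-space perturbations
    # Analysis covariance in the N-dimensional ensemble-weight space
    Pa = np.linalg.inv((N - 1) * np.eye(N) + S.T @ np.linalg.solve(R, S))
    w_mean = Pa @ S.T @ np.linalg.solve(R, y.reshape(-1, 1) - y_mean)
    # Symmetric square root of (N-1) Pa defines the deterministic transform
    evals, evecs = np.linalg.eigh((N - 1) * Pa)
    W = evecs @ np.diag(np.sqrt(evals)) @ evecs.T
    return x_mean + Xp @ (w_mean + W)        # analysis ensemble, (n, N)

# Toy usage: 10 state variables, 20 members, 3 observed components
rng = np.random.default_rng(1)
X = rng.normal(size=(10, 20))
H = np.zeros((3, 10)); H[0, 0] = H[1, 4] = H[2, 9] = 1.0
Xa = etkf_analysis(X, rng.normal(size=3), H, 0.25 * np.eye(3))
```

    The Gaussian assumption is visible in the quadratic weight-space algebra; the EWPF avoids exactly this step by steering particles toward the observations so that they end up with nearly equivalent weights.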

  17. Leaf gas exchange of Andropogon gerardii Vitman, Panicum virgatum L., and Sorghastrum nutans (L.) Nash in a tallgrass prairie

    NASA Technical Reports Server (NTRS)

    Polley, H. W.; Norman, J. M.; Arkebauer, T. J.; Walter-Shea, E. A.; Greegor, D. H., Jr.; Bramer, B.

    1992-01-01

    Net CO2 assimilation as a function of internal CO2, and stomatal conductance to water vapor, were measured on blades of the C4 grasses Andropogon gerardii Vitman, Panicum virgatum L., and Sorghastrum nutans (L.) Nash in northeast Kansas over two growing seasons to determine the comparative physiological responses of these dominant grasses of the tallgrass prairie to environmental variables. The response of dark respiration to temperature and of net assimilation to CO2 concentration and absorbed quantum flux differed little among species. A. gerardii had lower potential photosynthetic rates at internal CO2 concentrations below saturation than P. virgatum and S. nutans, but net assimilation under ambient conditions was similar in the three species. Net assimilation, the initial slope of assimilation versus internal CO2 curves, and the maximum potential assimilation rate all decreased as leaf water potential declined in blades of A. gerardii and S. nutans. Changes in assimilation capacity were paralleled by changes in stomatal conductance that were similar in all three species. The strong correlations among the processes regulating leaf CO2 assimilation and transpiration in A. gerardii, P. virgatum, and S. nutans suggest that these processes are tightly and similarly coupled in these grasses over the wide range of environmental conditions encountered in the tallgrass prairie.

  18. Impact of Flow-Dependent Error Correlations and Tropospheric Chemistry on Assimilated Ozone

    NASA Technical Reports Server (NTRS)

    Wargan, K.; Stajner, I.; Hayashi, H.; Pawson, S.; Jones, D. B. A.

    2003-01-01

    The presentation compares different versions of a global three-dimensional ozone data assimilation system developed at NASA's Data Assimilation Office. The Solar Backscatter Ultraviolet/2 (SBUV/2) total and partial ozone column retrievals are the sole data assimilated in all of the experiments presented. We study the impact of changing the forecast error covariance model from a version assuming static correlations to one that captures a short-term Lagrangian evolution of those correlations. This is further combined with a study of the impact of neglecting the tropospheric ozone production, loss, and dry deposition rates, which are obtained from the Harvard GEOS-CHEM model. We compare statistical characteristics of the assimilated data and the results of validation against independent observations from WMO balloon-borne sondes and the Polar Ozone and Aerosol Measurement (POAM) III instrument. Experiments show that allowing forecast error correlations to evolve with the flow has a positive impact on assimilated ozone within regions where data were not assimilated, particularly at high latitudes in both hemispheres. On the other hand, the main sensitivity to tropospheric chemistry is in the Tropics and sub-Tropics. The best agreement between the assimilated ozone and the in situ sonde data is in the experiment using both flow-dependent error covariances and tropospheric chemistry.

  19. Assessment of Two Types of Observations (SATWND and GPSRO) for the Operational Global 4DVAR System

    NASA Astrophysics Data System (ADS)

    Leng, H.

    2017-12-01

    The performance of a data assimilation system depends significantly on the quality and quantity of the observations assimilated. In recent years, more and more satellite observations have been applied in operational assimilation systems. In this paper, an assessment of satellite-derived winds (SATWND) and GPS radio occultation (GPSRO) bending angles is performed using a range of diagnostics. The main positive impacts come from assimilating satellite-derived cloud data (GOES cloud data and MODIS cloud data), whereas little benefit is obtained from GPSRO data in the Operational Global 4DVAR System. In a full system configuration, the assimilation of satellite-derived observations is globally beneficial to the analysis, and the benefit propagates well into the forecast. The assimilation of GPSRO observations has a slightly positive impact in the Tropics but is neutral in the Northern and Southern Hemispheres. To assess the synergies of satellite-derived observations with other observation types, experiments assimilating satellite-derived data together with AMSU-A and AMSU-B observations were run. The results show that the structure of the analysis increments is not modified when AMSU-A and AMSU-B observations are also assimilated, suggesting that the impact of satellite-derived observations is not limited by the large impact of satellite radiance observations.

  20. Bio-Optical Data Assimilation With Observational Error Covariance Derived From an Ensemble of Satellite Images

    NASA Astrophysics Data System (ADS)

    Shulman, Igor; Gould, Richard W.; Frolov, Sergey; McCarthy, Sean; Penta, Brad; Anderson, Stephanie; Sakalaukus, Peter

    2018-03-01

    An ensemble-based approach to specify observational error covariance in the data assimilation of satellite bio-optical properties is proposed. The observational error covariance is derived from statistical properties of a generated ensemble of satellite MODIS-Aqua chlorophyll (Chl) images. The proposed observational error covariance is used in an Optimal Interpolation scheme for the assimilation of MODIS-Aqua Chl observations. The forecast error covariance is specified in the subspace of the multivariate (bio-optical, physical) empirical orthogonal functions (EOFs) estimated from a month-long model run. The assimilation of surface MODIS-Aqua Chl improved surface and subsurface model Chl predictions. Comparisons with surface and subsurface water samples demonstrate that the data assimilation run with the proposed observational error covariance has a higher RMSE than the run with an "optimistic" assumption about observational errors (10% of the ensemble mean), but a smaller or comparable RMSE to the run assuming observational errors equal to 35% of the ensemble mean (the target error for the satellite chlorophyll data product). Also, with the assimilation of the MODIS-Aqua Chl data, the RMSE between observed and model-predicted fractions of diatoms in the total phytoplankton is reduced by a factor of two in comparison to the nonassimilative run.
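
    A minimal sketch of the idea (Optimal Interpolation with an observation error covariance R estimated from the spread of an ensemble of satellite images; here R is diagonal with the per-pixel ensemble variance, and the forecast error covariance is a simple placeholder rather than the paper's multivariate-EOF construction):

```python
import numpy as np

def oi_update(xf, Pf, y, H, R):
    """Optimal Interpolation analysis: xa = xf + K (y - H xf) with
    gain K = Pf H^T (H Pf H^T + R)^(-1)."""
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)
    return xf + K @ (y - H @ xf)

rng = np.random.default_rng(2)
n_pix, n_mem = 50, 8
# Ensemble of chlorophyll images (pixels x members); invented values
chl_ens = rng.lognormal(mean=0.0, sigma=0.3, size=(n_pix, n_mem))
R = np.diag(chl_ens.var(axis=1, ddof=1))   # obs error covariance from ensemble

xf = chl_ens.mean(axis=1)                  # background surface Chl state
Pf = 0.1 * np.eye(n_pix)                   # placeholder forecast covariance
y = chl_ens[:, 0]                          # one "observed" Chl image
xa = oi_update(xf, Pf, y, np.eye(n_pix), R)
```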

  1. Fundamental principles of data assimilation underlying the Verdandi library: applications to biophysical model personalization within euHeart.

    PubMed

    Chapelle, D; Fragu, M; Mallet, V; Moireau, P

    2013-11-01

    We present the fundamental principles of data assimilation underlying the Verdandi library, and how they are articulated with the modular architecture of the library. This translates, in particular, into the definition of standardized interfaces through which the data assimilation library interoperates with the model simulation software and the so-called observation manager. We also survey various examples of data assimilation applied to the personalization of biophysical models, in particular for cardiac modeling applications within the euHeart European project. This illustrates the power of data assimilation concepts in such novel applications, with tremendous potential in clinical diagnosis assistance.

  2. Yeast identification: reassessment of assimilation tests as sole universal identifiers.

    PubMed

    Spencer, J; Rawling, S; Stratford, M; Steels, H; Novodvorska, M; Archer, D B; Chandra, S

    2011-11-01

    To assess whether assimilation tests in isolation remain a valid method of identification of yeasts when applied to a wide range of environmental and spoilage isolates, seventy-one yeast strains were isolated from a soft drinks factory. These were identified using assimilation tests and by D1/D2 rDNA sequencing. When compared to sequencing, assimilation test identifications (MicroLog™) were 18.3% correct, a further 14.1% correct within the genus, and 67.6% were incorrectly identified. The majority of the latter could be attributed to the rise in newly reported yeast species. Assimilation tests alone are unreliable as a universal means of yeast identification, because of numerous new species, variability of strains, and increasing coincidence of assimilation profiles. Assimilation tests still have a useful role in the identification of common species, such as the majority of clinical isolates. It is probable, based on these results, that many yeast identifications reported in older literature are incorrect, which emphasizes the crucial need for accurate identification in present and future publications.

  3. A study on characteristics of retrospective optimal interpolation with WRF testbed

    NASA Astrophysics Data System (ADS)

    Kim, S.; Noh, N.; Lim, G.

    2012-12-01

    This study presents the application of retrospective optimal interpolation (ROI) with the Weather Research and Forecasting (WRF) model. Song et al. (2009) proposed the ROI method, an optimal interpolation (OI) that gradually assimilates observations over the analysis window to obtain a variance-minimum estimate of the atmospheric state at the initial time of that window. Song and Lim (2011) improved the method by incorporating eigen-decomposition and covariance inflation. The ROI method assimilates data at post-analysis times using a perturbation method (Errico and Raeder, 1999), without an adjoint model. Here, the ROI method is applied to the WRF model to validate the algorithm and to investigate its capability. The computational cost of ROI can be reduced through eigen-decomposition of the background error covariance. Using the background error covariance in eigen-space, a single-profile assimilation experiment is performed. The difference between forecast errors with and without assimilation grows with time, indicating that assimilation improves the forecast. The characteristics, strengths, and weaknesses of the ROI method are further investigated by comparing experiments with other data assimilation methods.

  4. All-Sky Microwave Imager Data Assimilation at NASA GMAO

    NASA Technical Reports Server (NTRS)

    Kim, Min-Jeong; Jin, Jianjun; El Akkraoui, Amal; McCarty, Will; Todling, Ricardo; Gu, Wei; Gelaro, Ron

    2017-01-01

    Efforts in all-sky satellite data assimilation at the Global Modeling and Assimilation Office (GMAO) at NASA Goddard Space Flight Center have been focused on the development of GSI configurations to assimilate all-sky data from microwave imagers such as the GPM Microwave Imager (GMI) and Global Change Observation Mission-Water (GCOM-W) Advanced Microwave Scanning Radiometer 2 (AMSR-2). Electromagnetic characteristics associated with their wavelengths allow microwave imager data to be relatively transparent to atmospheric gases and thin ice clouds, and highly sensitive to precipitation. Therefore, GMAO's all-sky data assimilation efforts are primarily focused on utilizing these data in precipitating regions. The all-sky framework being tested at GMAO employs the GSI in a hybrid 4D-EnVar configuration of the Goddard Earth Observing System (GEOS) data assimilation system, which will be included in the next formal update of GEOS. This article provides an overview of the development of all-sky radiance assimilation in GEOS, including some performance metrics. In addition, various projects underway at GMAO designed to enhance the all-sky implementation will be introduced.

  5. Assessment of Data Assimilation with the Prototype High Resolution Rapid Refresh for Alaska (HRRRAK)

    NASA Technical Reports Server (NTRS)

    Harrison, Kayla; Morton, Don; Zavodsky, Brad; Chou, Shih

    2012-01-01

    The Arctic Region Supercomputing Center has been running a quasi-operational prototype of a High Resolution Rapid Refresh for Alaska (HRRRAK) at 3 km resolution, initialized from the 13 km Rapid Refresh (RR). Although the RR assimilates a broad range of observations into its analyses, experiments with the HRRRAK suggest that there may be added value in assimilating observations into the 3 km initial conditions downscaled from the 13 km RR analyses. The NASA Short-term Prediction Research and Transition (SPoRT) group has been using assimilated data from the Atmospheric Infrared Sounder (AIRS) in WRF and WRF-Var simulations since 2004, with promising results. The sounder is aboard NASA's Aqua satellite and provides vertical profiles of temperature and humidity. The Gridpoint Statistical Interpolation (GSI) system is then used to assimilate these vertical profiles into WRF forecasts. In this work, we assess the use of AIRS data in combination with other global data assimilation products on previously non-assimilated HRRRAK case studies. Two separate weather events are examined to qualitatively and quantitatively assess the impacts of AIRS data on HRRRAK forecasts.

  6. Advances in Land Data Assimilation at the NASA Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Reichle, Rolf

    2009-01-01

    Research in land surface data assimilation has grown rapidly over the last decade. In this presentation we provide a brief overview of key research contributions by the NASA Goddard Space Flight Center (GSFC). The GSFC contributions to land assimilation primarily include the continued development and application of the Land Information System (LIS) and the ensemble Kalman filter (EnKF). In particular, we have developed a method to generate perturbation fields that are correlated in space, time, and across variables and that permit the flexible modeling of errors in land surface models and observations, along with an adaptive filtering approach that estimates observation and model error input parameters. A percentile-based scaling method that addresses soil moisture biases in model and observational estimates opened the path to the successful application of land data assimilation to satellite retrievals of surface soil moisture. Assimilation of AMSR-E surface soil moisture retrievals into the NASA Catchment model provided superior surface and root zone assimilation products (when validated against in situ measurements and compared to the model estimates or satellite observations alone). The multi-model capabilities of LIS were used to investigate the role of subsurface physics in the assimilation of surface soil moisture observations. Results indicate that the potential of surface soil moisture assimilation to improve root zone information is higher when the surface-to-root-zone coupling is stronger. Building on this experience, GSFC leads the development of the Level 4 Surface and Root-Zone Soil Moisture (L4_SM) product for the planned NASA Soil Moisture Active Passive (SMAP) mission. A key milestone was the design and execution of an Observing System Simulation Experiment that quantified the contribution of soil moisture retrievals to land data assimilation products as a function of retrieval and land model skill and yielded an estimate of the error budget for the SMAP L4_SM product. Terrestrial water storage observations from the GRACE satellite system were also successfully assimilated into the NASA Catchment model and provided improved estimates of groundwater variability when compared to the model estimates alone. Moreover, satellite-based land surface temperature (LST) observations from the ISCCP archive were assimilated using a bias estimation module that was specifically designed for LST assimilation. As with soil moisture, LST assimilation provides modest yet statistically significant improvements when compared to the model or satellite observations alone. To achieve the improvement, however, the LST assimilation algorithm must be adapted to the specific formulation of LST in the land model. An improved method for the assimilation of snow cover observations was also developed. Finally, the coupling of LIS to the mesoscale Weather Research and Forecasting (WRF) model enabled investigations into how the sensitivity of land-atmosphere interactions to the specific choice of planetary boundary layer scheme and land surface model varies across surface moisture regimes, and how it can be quantified and evaluated against observations. The ongoing development and integration of land assimilation modules into the Land Information System will enable the use of GSFC software with a variety of land models and make it accessible to the research community.
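
    As an illustration of the percentile-based scaling mentioned above (generic CDF matching, mapping each satellite retrieval to the model value at the same climatological percentile; the climatologies below are invented):

```python
import numpy as np

def cdf_match(retrievals, obs_clim, model_clim):
    """Map retrievals to the model climatology by matching percentiles,
    removing systematic differences in mean, variance, and distribution
    shape before assimilation."""
    obs_sorted = np.sort(obs_clim)
    # Climatological percentile of each retrieval ...
    p = np.searchsorted(obs_sorted, retrievals) / obs_sorted.size
    # ... evaluated on the model climatology's quantile function
    return np.quantile(np.sort(model_clim), np.clip(p, 0.0, 1.0))

rng = np.random.default_rng(3)
model_clim = 0.40 * rng.beta(2, 5, 2000)   # model soil moisture climatology
obs_clim = 0.50 * rng.beta(2, 4, 2000)     # biased retrieval climatology
scaled = cdf_match(0.50 * rng.beta(2, 4, 5), obs_clim, model_clim)
```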

  7. Climatology and Archived Data - Naval Oceanography Portal

    Science.gov Websites

    The Global Ocean Data Assimilation Experiment (GODAE) is a practical demonstration of near-real-time, global ocean data assimilation.

  8. Time compression diseconomies in environmental management: the effect of assimilation on environmental performance.

    PubMed

    Lannelongue, Gustavo; Gonzalez-Benito, Javier; Gonzalez-Benito, Oscar; Gonzalez-Zapatero, Carmen

    2015-01-01

    This research addresses the relationship between an organisation's assimilation of its environmental management system (EMS), the experience it gains through it, and its environmental performance. Assimilation here refers to the degree to which the requirements of the management standard are integrated within a plant's daily operations. Drawing on the heterogeneity of organisations, we argue that assimilation and experience both inform environmental performance. Furthermore, we posit that the relationship between assimilation and environmental performance depends on experience. The attempt to achieve greater assimilation in a shorter time leads an organisation to record a poorer environmental outcome, an effect we refer to as time compression diseconomies in environmental management. We provide empirical evidence based on 154 plants pertaining to firms in Spain subject to the European Union's CO2 Emissions Trading System.

  9. MATLAB algorithm to implement soil water data assimilation with the Ensemble Kalman Filter using HYDRUS.

    PubMed

    Valdes-Abellan, Javier; Pachepsky, Yakov; Martinez, Gonzalo

    2018-01-01

    Data assimilation is becoming a promising technique in hydrologic modelling, used not only to update model states but also to infer model parameters, specifically soil hydraulic properties in Richards-equation-based soil water models. The Ensemble Kalman Filter (EnKF) is one of the most widely employed methods among the different data assimilation alternatives. This study presents the complete MATLAB© code used to study soil data assimilation efficiency under different soil and climatic conditions, and shows how data assimilation through the EnKF was implemented. The Richards equation was solved with the Hydrus-1D software, which was run from MATLAB.
    • MATLAB routines are released to be used/modified without restriction by other researchers.
    • Data assimilation code implementing the Ensemble Kalman Filter method.
    • Soil water Richards-equation flow solved by Hydrus-1D.
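
    The released routines are MATLAB (driving Hydrus-1D for the model forecast); purely as a language-neutral sketch, the perturbed-observation EnKF analysis step such a system revolves around can be written as follows, with states and soil hydraulic parameters stacked in one augmented ensemble (all sizes and values illustrative):

```python
import numpy as np

def enkf_analysis(A, y, H, r_var, rng):
    """Stochastic (perturbed-observation) EnKF update of an augmented
    ensemble A of shape (n, N): soil moisture states plus parameters."""
    n, N = A.shape
    Ap = A - A.mean(axis=1, keepdims=True)
    Pf = Ap @ Ap.T / (N - 1)                 # sample forecast covariance
    R = r_var * np.eye(H.shape[0])
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)
    # Each member assimilates its own perturbed copy of the observation
    Y = y[:, None] + rng.normal(0.0, np.sqrt(r_var), size=(H.shape[0], N))
    return A + K @ (Y - H @ A)

# Toy usage: 5 soil moisture layers + 2 hydraulic parameters, 30 members,
# with only the top layer observed
rng = np.random.default_rng(4)
A = rng.normal(0.25, 0.05, size=(7, 30))
H = np.zeros((1, 7)); H[0, 0] = 1.0
A = enkf_analysis(A, np.array([0.30]), H, r_var=1e-4, rng=rng)
```

    Because the parameters sit in the augmented state, the cross-covariances in Pf let the soil moisture observations update them; in the actual package the forecast step between analyses is the Hydrus-1D Richards-equation run.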

  10. Ocean Data Assimilation in Support of Climate Applications: Status and Perspectives.

    PubMed

    Stammer, D; Balmaseda, M; Heimbach, P; Köhl, A; Weaver, A

    2016-01-01

    Ocean data assimilation brings together observations with known dynamics encapsulated in a circulation model to describe the time-varying ocean circulation. Its applications are manifold, ranging from marine and ecosystem forecasting to climate prediction and studies of the carbon cycle. Here, we address only climate applications, which range from improving our understanding of ocean circulation to estimating initial or boundary conditions and model parameters for ocean and climate forecasts. Because of differences in underlying methodologies, data assimilation products must be used judiciously and selected according to the specific purpose, as not all related inferences would be equally reliable. Further advances are expected from improved models and methods for estimating and representing error information in data assimilation systems. Ultimately, data assimilation into coupled climate system components is needed to support ocean and climate services. However, maintaining the infrastructure and expertise for sustained data assimilation remains challenging.

  11. Perceptual belongingness determines the direction of lightness induction depending on grouping stability and intentionality.

    PubMed

    Murgia, Mauro; Prpic, Valter; Santoro, Ilaria; Sors, Fabrizio; Agostini, Tiziano; Galmonte, Alessandra

    2016-09-01

    Contrast and assimilation are two opposite perceptual phenomena deriving from the relationships among perceptual elements in a visual field. In contrast, perceptual differences are enhanced, whereas in assimilation they are decreased. Whether contrast or assimilation occurs depends on various factors. Interestingly, Gestalt scientists explained both phenomena as the result of perceptual belongingness, giving rise to an intriguing paradox: Benary suggested that belongingness determines contrast; conversely, Fuchs suggested that it determines assimilation. This paradox can be related both to grouping stability (stable/multi-stable) and to grouping intentionality (intentional/non-intentional). In the present work we ran four experiments to test whether contrast/assimilation outcomes depend on the above-mentioned variables. We found that intentionality and multi-stability elicit assimilation, while non-intentionality and stability elicit contrast.

  12. Effects of sounding temperature assimilation on weather forecasting - Model dependence studies

    NASA Technical Reports Server (NTRS)

    Ghil, M.; Halem, M.; Atlas, R.

    1979-01-01

    In comparing various methods for the assimilation of remote sounding information into numerical weather prediction (NWP) models, the question of how model-dependent the results are becomes important. The paper investigates two aspects of this model dependence: (1) the effect of increasing horizontal resolution within a given model on the assimilation of sounding data, and (2) the effect of using two entirely different models with the same assimilation method and sounding data. Tentative conclusions are: first, that model improvement, as exemplified by increased resolution, can act in the same direction as judicious 4-D assimilation of remote sounding information to improve 2-3 day numerical weather forecasts; second, that the time-continuous 4-D methods developed at GLAS have similar beneficial effects when used to assimilate remote sounding information into NWP models with very different numerical and physical characteristics.

  13. Response of an eddy-permitting ocean model to the assimilation of sparse in situ data

    NASA Astrophysics Data System (ADS)

    Li, Jian-Guo; Killworth, Peter D.; Smeed, David A.

    2003-04-01

    The response of an eddy-permitting ocean model to changes introduced by data assimilation is studied when the available in situ data are sparse in both space and time (typical for the majority of the ocean). Temperature and salinity (T&S) profiles from the WOCE upper ocean thermal data set were assimilated into a primitive equation ocean model over the North Atlantic, using a simple nudging scheme with a time window of about 2 days and a horizontal spatial radius of about 1°. When data are sparse the model returns to its unassimilated behavior, locally "forgetting" or rejecting the assimilation, on timescales determined by the local advection and diffusion. Increasing the spatial weighting radius effectively reduces both processes and hence lengthens the model restoring time (and with it, the impact of assimilation). Increasing the nudging factor enhances the assimilation effect but has little effect on the model restoring time.
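
    A toy sketch of such a nudging increment (the Gaussian spatial weight and triangular time window below are illustrative choices; the abstract specifies only the roughly 2-day window and 1° radius):

```python
import numpy as np

def nudge_increment(model_T, obs_T, dist_deg, dt_days,
                    gain=0.5, radius_deg=1.0, window_days=2.0):
    """Newtonian relaxation toward an observed T&S profile: the pull is
    strongest at the observation's location and time and decays with
    horizontal distance and temporal offset."""
    w_space = np.exp(-(dist_deg / radius_deg) ** 2)
    w_time = max(0.0, 1.0 - abs(dt_days) / window_days)
    return gain * w_space * w_time * (obs_T - model_T)

# Grid point 0.5 deg from a profile observed one day earlier
dT = nudge_increment(model_T=14.2, obs_T=15.0, dist_deg=0.5, dt_days=1.0)
```

    Enlarging radius_deg spreads the observation's influence over more grid points, consistent with the paper's finding that a wider weighting radius lengthens the model restoring time and hence the impact of assimilation.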

  14. Anaerobic metabolism in the N-limited green alga Selenastrum minutum. 3. Alanine is the product of anaerobic ammonium assimilation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vanlerberghe, G.C.; Turpin, D.H.; Joy, K.W.

    The authors have determined the flow of 15N into free amino acids of the N-limited green alga Selenastrum minutum (Naeg.) Collins after addition of 15NH4+ to aerobic or anaerobic cells. Under aerobic conditions, only a small proportion of the N assimilated was retained in the free amino acid pool. However, under anaerobic conditions almost all assimilated NH4+ accumulates in alanine. This is a unique feature of anaerobic NH4+ assimilation. The pathway of carbon flow to alanine results in the production of ATP and reductant which matches exactly the requirements of NH4+ assimilation. Alanine synthesis is therefore an excellent strategy to maintain energy and redox balance during anaerobic NH4+ assimilation.

  15. Evaluation of Oceanic Surface Observation for Reproducing the Upper Ocean Structure in ECHAM5/MPI-OM

    NASA Astrophysics Data System (ADS)

    Luo, Hao; Zheng, Fei; Zhu, Jiang

    2017-12-01

    Better constraints on initial conditions from data assimilation are necessary for climate simulations and predictions, and they are particularly important for the ocean due to its long climate memory; as such, ocean data assimilation (ODA) is regarded as an effective tool for seasonal to decadal predictions. In this work, an ODA system is established for a coupled climate model (ECHAM5/MPI-OM), which can assimilate all available oceanic observations using an ensemble optimal interpolation approach. To validate and isolate the performance of different surface observations in reproducing air-sea climate variations in the model, a set of observing system simulation experiments (OSSEs) was performed over 150 model years. Generally, assimilating sea surface temperature, sea surface salinity, and sea surface height (SSH) can reasonably reproduce the climate variability and vertical structure of the upper ocean, and assimilating SSH achieves the best agreement with the true states. For the El Niño-Southern Oscillation (ENSO), assimilating different surface observations captures true aspects of ENSO well, but assimilating SSH can further enhance the accuracy of ENSO-related feedback processes in the coupled model, leading to a more reasonable ENSO evolution and air-sea interaction over the tropical Pacific. For ocean heat content, there are still limitations in reproducing the long-timescale variability in the North Atlantic, even when SSH is taken into consideration. These results demonstrate the effectiveness of assimilating surface observations in capturing the interannual signal and, to some extent, the decadal signal, but they also highlight the necessity of assimilating profile data to reproduce specific decadal variability.

  16. Assimilation of Satellite-Derived Skin Temperature Observations into Land Surface Models

    NASA Technical Reports Server (NTRS)

    Reichle, Rolf H.; Kumar, Sujay V.; Mahanama, P. P.; Koster, Randal D.; Liu, Q.

    2010-01-01

    Land surface (or "skin") temperature (LST) lies at the heart of the surface energy balance and is a key variable in weather and climate models. Here we assimilate LST retrievals from the International Satellite Cloud Climatology Project (ISCCP) into the Noah and Catchment (CLSM) land surface models using an ensemble-based, off-line land data assimilation system. LST is described very differently in the two models. A priori scaling and dynamic bias estimation approaches are applied because satellite and model LST typically exhibit different mean values and variability. Performance is measured against 27 months of in situ measurements from the Coordinated Energy and Water Cycle Observations Project at 48 stations. LST estimates from Noah and CLSM without data assimilation ("open loop") are comparable to each other and superior to that of ISCCP retrievals. For LST, RMSE values are 4.9 K (CLSM), 5.6 K (Noah), and 7.6 K (ISCCP), and anomaly correlation coefficients (R) are 0.62 (CLSM), 0.61 (Noah), and 0.52 (ISCCP). Assimilation of ISCCP retrievals provides modest yet statistically significant improvements (over open loop) of up to 0.7 K in RMSE and 0.05 in anomaly R. The skill of surface turbulent flux estimates from the assimilation integrations is essentially identical to the corresponding open loop skill. Noah assimilation estimates of ground heat flux, however, can be significantly worse than open loop estimates. Provided the assimilation system is properly adapted to each land model, the benefits from the assimilation of LST retrievals are comparable for both models.
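
    As a sketch of the a priori scaling step mentioned above (a standard-normal-deviate rescaling that matches the retrievals' climatological mean and variability to the model's; the climatology values are invented):

```python
import numpy as np

def scale_lst(retrievals, obs_clim, model_clim):
    """Rescale LST retrievals so that their climatological mean and standard
    deviation match the model climatology, removing the systematic
    differences in mean and variability noted in the abstract."""
    z = (retrievals - obs_clim.mean()) / obs_clim.std(ddof=1)
    return model_clim.mean() + z * model_clim.std(ddof=1)

rng = np.random.default_rng(5)
obs_clim = 290.0 + 8.0 * rng.standard_normal(1000)    # retrieval climatology (K)
model_clim = 287.0 + 6.0 * rng.standard_normal(1000)  # model climatology (K)
scaled = scale_lst(obs_clim[:5], obs_clim, model_clim)
```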

  17. Soil moisture estimation by assimilating L-band microwave brightness temperature with geostatistics and observation localization.

    PubMed

    Han, Xujun; Li, Xin; Rigon, Riccardo; Jin, Rui; Endrizzi, Stefano

    2015-01-01

    Observations can be used to reduce model uncertainties through data assimilation, but if the observations cannot cover the whole model area, owing to limited spatial availability or instrument capability, how should data assimilation be performed at locations not covered by them? Two commonly used strategies are first described: one is covariance localization (CL); the other is observation localization (OL). Compared with CL, OL is easy to parallelize and more efficient for large-scale analysis. This paper evaluates OL in soil moisture profile characterization, in which a geostatistical semivariogram is used to fit the spatially correlated characteristics of synthetic L-band microwave brightness temperature measurements. The fitted semivariogram model and the local ensemble transform Kalman filter algorithm are combined to weight and assimilate the observations within a local region surrounding each grid cell of the land surface model to be analyzed. Six scenarios were compared: 1_Obs with the one nearest observation assimilated, 5_Obs with no more than the five nearest local observations assimilated, and 9_Obs with no more than the nine nearest local observations assimilated, along with scenarios using no more than 16, 25, and 36 local observations. From the results we can conclude that involving more local observations in the assimilation improves the estimates, with an upper bound of nine observations in this case. This study demonstrates the potential of geostatistical correlation representation in OL to improve data assimilation of catchment-scale soil moisture using synthetic L-band microwave brightness temperatures, which cannot cover the study area fully in space due to vegetation effects.
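
    A sketch of the localization weighting described above (an exponential semivariogram fitted to the brightness-temperature field, converted to a correlation-based weight for each nearby observation; all parameter values are invented):

```python
import numpy as np

def exp_semivariogram(h, nugget, sill, prange):
    """Exponential semivariogram gamma(h) with practical range `prange`."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / prange))

def localization_weights(dists_km, nugget=0.0, sill=1.0, prange=40.0):
    """Weights for the local observations around one grid cell: the spatial
    correlation implied by the fitted semivariogram, so that near
    observations count fully and distant ones are tapered toward zero
    (e.g., by inflating their error variance with 1/weight in the LETKF)."""
    corr = 1.0 - exp_semivariogram(np.asarray(dists_km), nugget, sill, prange) / sill
    return np.clip(corr, 0.0, 1.0)

# Weights for the nine nearest observations of one grid cell (distances in km)
w = localization_weights([5.0, 10.0, 12.0, 20.0, 25.0, 30.0, 33.0, 38.0, 45.0])
```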

  18. Multi-parametric variational data assimilation for hydrological forecasting

    NASA Astrophysics Data System (ADS)

    Alvarado-Montero, R.; Schwanenberg, D.; Krahe, P.; Helmke, P.; Klein, B.

    2017-12-01

    Ensemble forecasting is increasingly applied in flow forecasting systems to provide users with a better understanding of forecast uncertainty and, consequently, to support better-informed decisions. A common practice in probabilistic streamflow forecasting is to force a deterministic hydrological model with an ensemble of numerical weather predictions. This approach aims to represent meteorological uncertainty but neglects the uncertainty of the hydrological model and of its initial conditions. Complementary approaches use probabilistic data assimilation techniques to derive a variety of initial states, or represent model uncertainty by model pools instead of single deterministic models. This paper introduces a novel approach that extends a variational data assimilation based on Moving Horizon Estimation to enable the assimilation of observations into multi-parametric model pools. It results in a probabilistic estimate of initial model states that takes into account the parametric model uncertainty in the data assimilation. The assimilation technique is applied to the uppermost area of the River Main in Germany. We use different parametric pools, each with five parameter sets, to assimilate streamflow data as well as remotely sensed data from the H-SAF project. We assess the impact of the assimilation on lead-time performance for perfect forecasts (i.e., observed data as forcing variables) as well as for deterministic and probabilistic forecasts from ECMWF. The multi-parametric assimilation shows an improvement of up to 23% in CRPS performance and approximately 20% in Brier Skill Scores with respect to the deterministic approach. It also improves the skill of the forecast in terms of the rank histogram and produces a narrower ensemble spread.
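
    As a schematic (our notation, not the paper's), a Moving Horizon Estimation step of this kind solves, for each parameter set in the pool (indexed j), a constrained least-squares problem over the initial state of the horizon:

```latex
\min_{x_0^{(j)}}\; J_j
  = \bigl(x_0^{(j)} - x_b^{(j)}\bigr)^{\!\top} B^{-1} \bigl(x_0^{(j)} - x_b^{(j)}\bigr)
  + \sum_{k=1}^{K} \bigl(y_k - h(x_k^{(j)})\bigr)^{\!\top} R^{-1} \bigl(y_k - h(x_k^{(j)})\bigr),
\qquad x_{k+1}^{(j)} = M\bigl(x_k^{(j)}, \theta_j\bigr),
```

    yielding one optimized initial state per pool member and hence a parameter-aware ensemble of initial conditions for the subsequent forecast.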

  19. Simultaneous Radar and Satellite Data Storm-Scale Assimilation Using an Ensemble Kalman Filter Approach for 24 May 2011

    NASA Technical Reports Server (NTRS)

    Jones, Thomas A.; Stensrud, David; Wicker, Louis; Minnis, Patrick; Palikonda, Rabindra

    2015-01-01

    Assimilating high-resolution radar reflectivity and radial velocity into convection-permitting numerical weather prediction models has proven to be an important tool for improving forecast skill of convection. The use of satellite data for this application is much less well understood, having only recently received significant attention. Since radar and satellite data provide independent information, combining these two sources of data in a robust manner potentially represents the future of high-resolution data assimilation. This research combines Geostationary Operational Environmental Satellite 13 (GOES-13) cloud water path (CWP) retrievals with Weather Surveillance Radar-1988 Doppler (WSR-88D) reflectivity and radial velocity to examine the impacts of assimilating each for a severe weather event that occurred in Oklahoma on 24 May 2011. Data are assimilated into a 3-km model using an ensemble adjustment Kalman filter approach with 36 members over a 2-h assimilation window between 1800 and 2000 UTC. Forecasts are then generated for 90 min at 5-min intervals starting at 1930 and 2000 UTC. Results show that both satellite and radar data are able to initiate convection, but that assimilating both spins up a storm much faster. Assimilating CWP also performs well at suppressing spurious precipitation and cloud cover in the model, as well as capturing the anvil characteristics of developed storms. Radar data are most effective at resolving the 3D characteristics of the core convection. Assimilating both satellite and radar data generally resulted in the best model analysis and the most skillful forecast for this event.

  20. Effect of Data Assimilation Parameters on The Optimized Surface CO2 Flux in Asia

    NASA Astrophysics Data System (ADS)

    Kim, Hyunjung; Kim, Hyun Mee; Kim, Jinwoong; Cho, Chun-Ho

    2018-02-01

    In this study, CarbonTracker, an inverse modeling system based on the ensemble Kalman filter, was used to evaluate the effects of data assimilation parameters (assimilation window length and ensemble size) on the estimation of surface CO2 fluxes in Asia. Several experiments with different parameters were conducted, and the results were verified using CO2 concentration observations. The assimilation window lengths tested were 3, 5, 7, and 10 weeks, and the ensemble sizes were 100, 150, and 300; a total of 12 experiments using combinations of these parameters were therefore conducted. The experimental period was from January 2006 to December 2009. Differences between the optimized surface CO2 fluxes of the experiments were largest in the Eurasian Boreal (EB) area, followed by Eurasian Temperate (ET) and Tropical Asia (TA), and were larger in boreal summer than in boreal winter. The effect of ensemble size on the optimized biosphere flux is larger than the effect of the assimilation window length in Asia as a whole, but their relative importance varies across specific regions. The optimized biosphere flux was more sensitive to the assimilation window length in EB, whereas it was sensitive to the ensemble size as well as the assimilation window length in ET. The larger the ensemble size and the shorter the assimilation window length, the larger the uncertainty (i.e., ensemble spread) of the optimized surface CO2 fluxes. The 10-week assimilation window and the 300-member ensemble were the optimal configuration for CarbonTracker in the Asian region, based on several verifications using CO2 concentration measurements.

  1. Assimilation of Gridded GRACE Terrestrial Water Storage Estimates in the North American Land Data Assimilation System

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.; Zaitchik, Benjamin F.; Peters-Lidard, Christa D.; Rodell, Matthew; Reichle, Rolf; Li, Bailing; Jasinski, Michael; Mocko, David; Getirana, Augusto; De Lannoy, Gabrielle

    2016-01-01

    The objective of the North American Land Data Assimilation System (NLDAS) is to provide the best available estimates of near-surface meteorological conditions and soil hydrological status for the continental United States. To support the ongoing efforts to develop data assimilation (DA) capabilities for NLDAS, the results of Gravity Recovery and Climate Experiment (GRACE) DA implemented in a manner consistent with NLDAS development are presented. Following previous work, GRACE terrestrial water storage (TWS) anomaly estimates are assimilated into the NASA Catchment land surface model using an ensemble smoother. In contrast to many earlier GRACE DA studies, a gridded GRACE TWS product is assimilated, spatially distributed GRACE error estimates are accounted for, and the impact that GRACE scaling factors have on assimilation is evaluated. Comparisons with quality-controlled in situ observations indicate that GRACE DA has a positive impact on the simulation of unconfined groundwater variability across the majority of the eastern United States and on the simulation of surface and root zone soil moisture across the country. Smaller improvements are seen in the simulation of snow depth, and the impact of GRACE DA on simulated river discharge and evapotranspiration is regionally variable. The use of GRACE scaling factors during assimilation improved DA results in the western United States but led to small degradations in the eastern United States. The study also found comparable performance between the use of gridded and basin-averaged GRACE observations in assimilation. Finally, the evaluations presented in the paper indicate that GRACE DA can be helpful in improving the representation of droughts.

  2. On the assimilation of absolute geodetic dynamic topography in a global ocean model: impact on the deep ocean state

    NASA Astrophysics Data System (ADS)

    Androsov, Alexey; Nerger, Lars; Schnur, Reiner; Schröter, Jens; Albertella, Alberta; Rummel, Reiner; Savcenko, Roman; Bosch, Wolfgang; Skachko, Sergey; Danilov, Sergey

    2018-05-01

    General ocean circulation models are not perfect. Forced with observed atmospheric fluxes, they gradually drift away from measured distributions of temperature and salinity. We suggest data assimilation of absolute dynamic ocean topography (DOT) observed from space geodetic missions as an option to reduce these differences. Sea surface information from DOT is transferred into the deep ocean by defining the analysed ocean state as a weighted average of an ensemble of fully consistent model solutions using an error-subspace ensemble Kalman filter technique. Success of the technique is demonstrated by assimilation into a global configuration of the ocean circulation model FESOM over 1 year. The dynamic ocean topography data are obtained from a combination of multi-satellite altimetry and geoid measurements. The assimilation result is assessed using an independent temperature and salinity analysis derived from profiling buoys of the Argo float data set. The largest impact of the assimilation occurs at the first few analysis steps, where both the model ocean topography and the steric height (i.e. temperature and salinity) are improved. The continued data assimilation over 1 year further improves the model state gradually. Deep-ocean fields quickly adjust in a sustained manner: a model forecast initialized from the model state estimated by the data assimilation after only 1 month shows that improvements induced by the data assimilation remain in the model state for a long time. Even after 11 months, the modelled ocean topography and temperature fields show smaller errors than the model forecast without any data assimilation.

  3. Soil Moisture Estimation by Assimilating L-Band Microwave Brightness Temperature with Geostatistics and Observation Localization

    PubMed Central

    Han, Xujun; Li, Xin; Rigon, Riccardo; Jin, Rui; Endrizzi, Stefano

    2015-01-01

    Observations can be used to reduce model uncertainty through data assimilation. But if observations cannot cover the whole model domain, owing to limited spatial availability or instrument capability, how should data assimilation be performed at locations not covered by any observation? Two commonly used strategies are first described: one is covariance localization (CL); the other is observation localization (OL). Compared with CL, OL is easier to parallelize and more efficient for large-scale analysis. This paper evaluates OL for soil moisture profile characterization, using a geostatistical semivariogram to fit the spatially correlated characteristics of synthetic L-band microwave brightness temperature measurements. The fitted semivariogram model and the local ensemble transform Kalman filter algorithm are combined to weight and assimilate the observations within a local region surrounding each grid cell of the land surface model to be analyzed. Six scenarios were compared: 1_Obs with the single nearest observation assimilated, 5_Obs with no more than the five nearest local observations assimilated, 9_Obs with no more than the nine nearest local observations assimilated, and scenarios with no more than 16, 25, and 36 local observations. The results show that involving more local observations in the assimilation improves the estimates, up to a bound of nine observations in this case. This study demonstrates the potential of geostatistical correlation representation in OL to improve data assimilation of catchment-scale soil moisture using synthetic L-band microwave brightness temperature, which cannot cover the study area fully in space due to vegetation effects. PMID:25635771
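
    A minimal sketch of the weighting idea, assuming an exponential semivariogram with invented nugget, sill and range values (the paper fits these to the synthetic brightness temperatures; all names here are hypothetical):

        import numpy as np

        def exp_semivariogram(h, nugget=0.0, sill=1.0, rng_len=20.0):
            """Exponential semivariogram gamma(h); parameters would be fitted to data."""
            return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / rng_len))

        def localization_weights(dists, **kw):
            """Turn semivariogram values into correlation-based observation weights."""
            gamma = exp_semivariogram(np.asarray(dists), **kw)
            corr = 1.0 - gamma / kw.get('sill', 1.0)   # correlogram: c(h) = 1 - gamma(h)/sill
            return np.clip(corr, 0.0, 1.0)

        # Weight the nine nearest brightness-temperature observations of a grid cell
        d = np.array([0.5, 2.0, 4.0, 6.0, 8.0, 10.0, 14.0, 18.0, 25.0])   # distances, km
        w = localization_weights(d, sill=1.0, rng_len=20.0)
        print(np.round(w, 3))    # nearby observations get weight near 1, distant ones near 0

    In an OL scheme, weights like these typically scale the inverse observation-error variances in each grid cell's local analysis, so nearby brightness-temperature observations dominate while distant ones are gradually switched off.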

  4. Mapping Surface Heat Fluxes by Assimilating SMAP Soil Moisture and GOES Land Surface Temperature Data

    NASA Astrophysics Data System (ADS)

    Lu, Yang; Steele-Dunne, Susan C.; Farhadi, Leila; van de Giesen, Nick

    2017-12-01

    Surface heat fluxes play a crucial role in the surface energy and water balance. In situ measurements are costly and difficult, and large-scale flux mapping is hindered by surface heterogeneity. Previous studies have demonstrated that surface heat fluxes can be estimated by assimilating land surface temperature (LST) and soil moisture to determine two key parameters: a neutral bulk heat transfer coefficient (CHN) and an evaporative fraction (EF). Here a methodology is proposed to estimate surface heat fluxes by assimilating Soil Moisture Active Passive (SMAP) soil moisture data and Geostationary Operational Environmental Satellite (GOES) LST data into a dual-source (DS) model using a hybrid particle assimilation strategy. SMAP soil moisture data are assimilated using a particle filter (PF), and GOES LST data are assimilated using an adaptive particle batch smoother (APBS) to account for the large gap in the spatial and temporal resolution. The methodology is implemented in an area in the U.S. Southern Great Plains. Assessment against in situ observations suggests that soil moisture and LST estimates are in better agreement with observations after assimilation. The RMSD for 30 min (daytime) flux estimates is reduced by 6.3% (8.7%) and 31.6% (37%) for H and LE on average. Comparison against a LST-only and a soil moisture-only assimilation case suggests that despite the coarse resolution, assimilating SMAP soil moisture data is not only beneficial but also crucial for successful and robust flux estimation, particularly when the uncertainties in the model estimates are large.
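
    A hedged sketch of the particle-filter half of the hybrid strategy (generic Gaussian likelihood and multinomial resampling; the error level and variable names are assumptions, and the APBS additionally weights whole batches of LST observations within one smoothing window):

        import numpy as np

        rng = np.random.default_rng(2)
        n_part = 500

        theta = rng.uniform(0.05, 0.40, n_part)      # particles: surface soil moisture
        sigma_obs = 0.04                              # assumed observation error (m3/m3)

        def pf_update(theta, y_obs):
            """One particle-filter analysis: weight each particle by the Gaussian
            likelihood of the observation, then resample (multinomial, simplest choice)."""
            w = np.exp(-0.5 * ((y_obs - theta) / sigma_obs) ** 2)
            w /= w.sum()
            return theta[rng.choice(n_part, n_part, p=w)]

        theta = pf_update(theta, y_obs=0.22)          # one SMAP-like overpass
        print("posterior mean/std:", theta.mean().round(3), theta.std().round(3))

    A particle batch smoother applies the same weighting jointly to all observations in a window, i.e. with weights proportional to the product of the per-time likelihoods, which is how a dense half-hourly LST stream can constrain slowly varying parameters such as CHN and EF.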

  5. Impact of glider data assimilation on the Monterey Bay model

    NASA Astrophysics Data System (ADS)

    Shulman, Igor; Rowley, Clark; Anderson, Stephanie; DeRada, Sergio; Kindle, John; Martin, Paul; Doyle, James; Cummings, James; Ramp, Steve; Chavez, Francisco; Fratantoni, David; Davis, Russ

    2009-02-01

    Glider observations were essential components of the observational program in the Autonomous Ocean Sampling Network (AOSN-II) experiment in the Monterey Bay area during the summer of 2003. This paper focuses on the impact of assimilating glider temperature and salinity observations on the Navy Coastal Ocean Model (NCOM) predictions of surface and subsurface properties. The modeling system consists of an implementation of the NCOM model on a curvilinear, orthogonal grid with 1-4 km resolution, finest around the bay. The model receives open boundary conditions from a regional (9 km resolution) NCOM implementation for the California Current System, and surface fluxes from the Coupled Ocean-Atmosphere Mesoscale Prediction System (COAMPS) atmospheric model at 3 km resolution. The data assimilation component of the system is a version of the Navy Coupled Ocean Data Assimilation (NCODA) system, used here to assimilate the glider data into the NCOM model of the Monterey Bay area. NCODA is a fully 3D multivariate optimum interpolation system that produces simultaneous analyses of temperature, salinity, geopotential, and vector velocity. Assimilation of glider data improves the surface temperature at the mooring locations for the NCOM model hindcast and nowcasts, and for the short-range (1-1.5 day) forecasts; it is shown that accurate atmospheric forcing is critical for more extended forecasts. Assimilation of glider data also provided better agreement between model-predicted and independently observed spatial distributions of surface temperature and salinity (for example, aircraft-measured SSTs). Mooring observations of subsurface temperature and salinity show sharp changes in the thermocline and halocline depths during transitions from upwelling to relaxation and vice versa. The non-assimilative run also shows these transitions in subsurface temperature, but they are not as well defined, and for salinity the non-assimilative run differs significantly from the observations. The glider-data-assimilating run, by contrast, reproduces the observed thermocline and halocline depths during upwelling and relaxation events in the Monterey Bay area. It is also shown that during wind relaxation the data-assimilative run exhibits a higher complex correlation between modelled and observed subsurface velocities than the non-assimilative run.
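
    A toy version of the optimum interpolation update at the heart of such a system, for one variable along one section (the Gaussian background-error model, its length scale, and the glider positions are invented for illustration):

        import numpy as np

        rng = np.random.default_rng(3)
        n_state, n_obs = 40, 5

        # Background temperature section and a Gaussian background-error covariance
        xb = 15.0 + rng.normal(0.0, 0.1, n_state)
        i = np.arange(n_state)
        B = 0.5**2 * np.exp(-0.5 * ((i[:, None] - i[None, :]) / 10.0) ** 2)

        obs_idx = np.array([3, 11, 19, 27, 35])      # glider profile positions
        H = np.zeros((n_obs, n_state))
        H[np.arange(n_obs), obs_idx] = 1.0           # direct point observations
        R = 0.2**2 * np.eye(n_obs)
        y = 15.5 + rng.normal(0.0, 0.2, n_obs)       # glider temperatures

        # Optimum interpolation: xa = xb + B H^T (H B H^T + R)^-1 (y - H xb)
        K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
        xa = xb + K @ (y - H @ xb)
        print("max analysis increment:", np.abs(xa - xb).max())

    In the multivariate NCODA case, B also carries cross-covariances between temperature, salinity, geopotential and velocity, so a temperature innovation updates all four analysed fields at once.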

  6. Emulator-assisted data assimilation in complex models

    NASA Astrophysics Data System (ADS)

    Margvelashvili, Nugzar Yu; Herzfeld, Mike; Rizwi, Farhan; Mongin, Mathieu; Baird, Mark E.; Jones, Emlyn; Schaffelke, Britta; King, Edward; Schroeder, Thomas

    2016-09-01

    Emulators are surrogates of complex models that run orders of magnitude faster than the original model. The utility of emulators for data assimilation into ocean models is still not well understood. The high complexity of ocean models translates into high uncertainty of the corresponding emulators, which may undermine the quality of assimilation schemes based on such emulators. Numerical experiments with a chaotic Lorenz-95 model are conducted to illustrate this point and to suggest a strategy that alleviates the problem through localization of the emulation and data assimilation procedures. Insights gained through these experiments are used to design and implement a data assimilation scenario for a 3D fine-resolution sediment transport model of the Great Barrier Reef (GBR), Australia.
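
    For reference, the Lorenz-95 testbed is compact enough to state in full; a localized emulator in the spirit of the paper would then be trained to map a small neighbourhood of each variable at one time step to its value at the next (a standard sketch, not the authors' code):

        import numpy as np

        def lorenz95_rhs(x, forcing=8.0):
            """Tendency of Lorenz-95: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F."""
            return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

        def rk4_step(x, dt=0.05):
            """One fourth-order Runge-Kutta integration step."""
            k1 = lorenz95_rhs(x)
            k2 = lorenz95_rhs(x + 0.5 * dt * k1)
            k3 = lorenz95_rhs(x + 0.5 * dt * k2)
            k4 = lorenz95_rhs(x + dt * k3)
            return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

        # Spin up 40 variables onto the chaotic attractor
        x = 8.0 + 0.01 * np.random.default_rng(4).standard_normal(40)
        for _ in range(1000):
            x = rk4_step(x)
        print(x[:5].round(2))

    Training one small emulator per local neighbourhood, rather than one global surrogate, is the localization idea the paper carries over to the GBR sediment model.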

  7. P161 Improved Impact of Atmospheric Infrared Sounder (AIRS) Radiance Assimilation in Numerical Weather Prediction

    NASA Technical Reports Server (NTRS)

    Zavodsky, Bradley T.; Chou, Shih-Hung; Jedlovec, Gary J.

    2012-01-01

    For over six years, AIRS radiances have been assimilated operationally at national (e.g. the Environmental Modeling Center (EMC)) and international (e.g. the European Centre for Medium-Range Weather Forecasts (ECMWF)) operational centers, and they have been assimilated in the North American Mesoscale (NAM) system since 2008. Due partly to data latency and operational constraints, hyperspectral radiance assimilation has had less impact on the Gridpoint Statistical Interpolation (GSI) system used in the NAM and GFS. The objective of this project is to use AIRS retrieved profiles as a proxy for AIRS radiances in situations where the radiances cannot be assimilated in the current operational system, by evaluating the location and magnitude of analysis increments.

  8. Evaluating the Impact of AIRS Observations on Regional Forecasts at the SPoRT Center

    NASA Technical Reports Server (NTRS)

    Zavodsky, Bradley

    2011-01-01

    The NASA Short-term Prediction Research and Transition (SPoRT) Center collaborates with operational partners of different sizes and operational goals to improve forecasts using targeted projects and data sets. Modeling and DA activities focus on demonstrating the utility of NASA data sets and capabilities within operational systems. SPoRT has successfully assimilated Atmospheric Infrared Sounder (AIRS) radiance and profile data. A collaborative project is underway with the Joint Center for Satellite Data Assimilation (JCSDA) to use AIRS profiles to better understand the impact of AIRS radiances assimilated within the Gridpoint Statistical Interpolation (GSI) system, in hopes of engaging the operational DA community in a reassessment of assimilation methodologies to more effectively assimilate hyperspectral radiances.

  9. A time-parallel approach to strong-constraint four-dimensional variational data assimilation

    NASA Astrophysics Data System (ADS)

    Rao, Vishwas; Sandu, Adrian

    2016-05-01

    A parallel-in-time algorithm based on an augmented Lagrangian approach is proposed to solve four-dimensional variational (4D-Var) data assimilation problems. The assimilation window is divided into multiple sub-intervals, which allows parallelization of the cost function and gradient computations. Continuity of the solution across interval boundaries is imposed through constraints. The augmented Lagrangian approach leads to a different formulation of the variational data assimilation problem than weak-constraint 4D-Var. A combination of serial and parallel 4D-Var to increase performance is also explored. The methodology is illustrated on data assimilation problems involving the Lorenz-96 and shallow water models.
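
    In outline (with generic 4D-Var notation assumed here, not taken verbatim from the paper), the strong-constraint problem over sub-intervals and its augmented Lagrangian read:

        \min_{x_0,\dots,x_K} \; J(x_0,\dots,x_K)
          = \tfrac{1}{2}\,(x_0 - x_b)^{\mathsf T} B^{-1} (x_0 - x_b)
          + \tfrac{1}{2} \sum_{k=0}^{K} (H_k x_k - y_k)^{\mathsf T} R_k^{-1} (H_k x_k - y_k)
        \qquad \text{s.t.} \quad x_{k+1} = M_{k+1}(x_k), \; k = 0,\dots,K-1,

        L_\rho(x, \lambda)
          = J + \sum_{k=0}^{K-1} \lambda_k^{\mathsf T} \big( x_{k+1} - M_{k+1}(x_k) \big)
          + \tfrac{\rho}{2} \sum_{k=0}^{K-1} \big\| x_{k+1} - M_{k+1}(x_k) \big\|^2 .

    Minimizing L_rho alternates sub-interval state updates, each of which can run on its own processor because M_{k+1}(x_k) couples only neighbouring intervals, with multiplier updates lambda_k <- lambda_k + rho (x_{k+1} - M_{k+1}(x_k)).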

  10. Petrology, geochemistry and LA-ICP-MS U-Pb geochronology of Paleoproterozoic basement rocks in Bangladesh: An evaluation of calc-alkaline magmatism and implication for Columbia supercontinent amalgamation

    NASA Astrophysics Data System (ADS)

    Hossain, Ismail; Tsunogae, Toshiaki; Tsutsumi, Yukiyasu; Takahashi, Kazuki

    2018-05-01

    The Paleoproterozoic (1.7 Ga) basement rocks from Maddhapara, Bangladesh show a large range of chemical variation (e.g. SiO2 = 50.7-74.7%) and include diorite, quartz diorite, monzodiorite, quartz monzonite and granite. Overall, the pluton displays a metaluminous, calc-alkaline orogenic character; the rocks are mostly I-type suites formed by subduction-related magmatism. The major elements show general trends consistent with fractional crystallization, and trace element contents likewise indicate fractionation or assimilation, which can explain the entire variation from diorite to monzonite and even granite. The pluton may have acquired its distinctive chemical features through a process that included partial melting of calc-alkaline lithologies and mixing of mantle-derived magmas, followed by fractional crystallization and assimilation of country rocks. It shows evidence of crystal fractionation involving largely plagioclase, amphibole and possibly biotite. Some of the fractionated magmas may have mixed with more potassic melts from distinct parts of the continental lithosphere to produce granites and/or pegmatites. A new geochronological result for granitic pegmatite (1722 ± 10 Ma) is consistent with the diorite and tonalite ages and defines a credible geochronological sequence (diorite - tonalite - granitic pegmatite). The identical Paleoproterozoic age (1.7 Ga) and distinctive magmatism of the Maddhapara basement rocks agree well with the Central Indian Tectonic Zone (CITZ), India. Comparable magmatism in the Transamazonian of South America, the Trans-Hudson orogen of North America, the Bohemian Massif, and the Svecofennian of Poland records the sequential growth of continents through the amalgamation of juvenile terranes, succeeded by major collisional orogeny, and Paleoproterozoic subduction-related orogens in Australia have similar counterparts in Antarctica and other parts of the world. Such Paleoproterozoic magmatism contributed substantially to the assembly, amalgamation and breakup of the enormous Columbia supercontinent.

  11. An adaptive ANOVA-based PCKF for high-dimensional nonlinear inverse modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Weixuan, E-mail: weixuan.li@usc.edu; Lin, Guang, E-mail: guang.lin@pnnl.gov; Zhang, Dongxiao, E-mail: dxz@pku.edu.cn

    2014-02-01

    The probabilistic collocation-based Kalman filter (PCKF) is a recently developed approach for solving inverse problems. It resembles the ensemble Kalman filter (EnKF) in every aspect, except that it represents and propagates model uncertainty by polynomial chaos expansion (PCE) instead of an ensemble of model realizations. Previous studies have shown PCKF to be a more efficient alternative to EnKF for many data assimilation problems. However, the accuracy and efficiency of PCKF depend on an appropriate truncation of the PCE series. Having more polynomial chaos basis functions in the expansion helps to capture uncertainty more accurately but increases computational cost. Selection of basis functions is particularly important for high-dimensional stochastic problems because the number of polynomial chaos basis functions required to represent model uncertainty grows dramatically as the number of input parameters (random dimensions) increases. In classic PCKF algorithms, the PCE basis functions are pre-set based on users' experience. Also, for sequential data assimilation problems, the basis functions kept in the PCE expression remain unchanged across Kalman filter loops, which could limit the accuracy and computational efficiency of classic PCKF algorithms. To address this issue, we present a new algorithm that adaptively selects PCE basis functions for different problems and automatically adjusts the number of basis functions across Kalman filter loops. The algorithm is based on adaptive functional ANOVA (analysis of variance) decomposition, which approximates a high-dimensional function with the summation of a set of low-dimensional functions. Thus, instead of expanding the original model into PCE, we implement the PCE expansion on these low-dimensional functions, which is much less costly. We also propose a new adaptive criterion for ANOVA that is better suited to solving inverse problems. The new algorithm was tested on different examples and demonstrated great effectiveness in comparison with non-adaptive PCKF and EnKF algorithms.
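
    A hedged one-parameter sketch of a PCKF-style update (probabilists' Hermite basis, non-intrusive projection by Gauss-Hermite quadrature; the forward model g, the observation values, and the simplification of omitting the observation-noise PCE term are illustrative choices, not the authors' adaptive algorithm):

        import numpy as np
        from numpy.polynomial.hermite_e import hermegauss, hermeval

        # Probabilists' Hermite PCE of one uncertain parameter u(xi), xi ~ N(0,1)
        a = np.array([1.0, 0.5, 0.0])                  # coefficients for He_0, He_1, He_2
        norms = np.array([1.0, 1.0, 2.0])              # <He_k^2> = k!

        nodes, weights = hermegauss(20)
        weights = weights / np.sqrt(2 * np.pi)         # normalise to the N(0,1) measure

        def g(u):                                      # toy forward model (illustrative)
            return u + 0.3 * u**2

        # Project the model output onto the same PCE basis (non-intrusive collocation)
        d_nodes = g(hermeval(nodes, a))
        d = np.array([np.sum(weights * d_nodes * hermeval(nodes, np.eye(3)[k])) / norms[k]
                      for k in range(3)])

        # Kalman update of the PCE coefficients (obs-noise PCE term omitted for brevity)
        y, R = 2.0, 0.1**2
        cov_ud = np.sum(a[1:] * d[1:] * norms[1:])     # cov(u, d) from the coefficients
        var_d = np.sum(d[1:]**2 * norms[1:])           # var(d) from the coefficients
        K = cov_ud / (var_d + R)
        a_new = a.copy()
        a_new[0] += K * (y - d[0])                     # update the mean
        a_new[1:] -= K * d[1:]                         # shrink the uncertainty modes
        print("prior mean/var    :", a[0], np.sum(a[1:]**2 * norms[1:]))
        print("posterior mean/var:", a_new[0], np.sum(a_new[1:]**2 * norms[1:]))

    The adaptive ANOVA idea in the paper then decides, loop by loop, which of these basis functions are worth carrying, instead of fixing the truncation in advance.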

  12. An Adaptive ANOVA-based PCKF for High-Dimensional Nonlinear Inverse Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LI, Weixuan; Lin, Guang; Zhang, Dongxiao

    2014-02-01

    The probabilistic collocation-based Kalman filter (PCKF) is a recently developed approach for solving inverse problems. It resembles the ensemble Kalman filter (EnKF) in every aspect, except that it represents and propagates model uncertainty by polynomial chaos expansion (PCE) instead of an ensemble of model realizations. Previous studies have shown PCKF to be a more efficient alternative to EnKF for many data assimilation problems. However, the accuracy and efficiency of PCKF depend on an appropriate truncation of the PCE series. Having more polynomial chaos bases in the expansion helps to capture uncertainty more accurately but increases computational cost. Basis selection is particularly important for high-dimensional stochastic problems because the number of polynomial chaos bases required to represent model uncertainty grows dramatically as the number of input parameters (random dimensions) increases. In classic PCKF algorithms, the PCE bases are pre-set based on users' experience. Also, for sequential data assimilation problems, the bases kept in the PCE expression remain unchanged across Kalman filter loops, which could limit the accuracy and computational efficiency of classic PCKF algorithms. To address this issue, we present a new algorithm that adaptively selects PCE bases for different problems and automatically adjusts the number of bases across Kalman filter loops. The algorithm is based on adaptive functional ANOVA (analysis of variance) decomposition, which approximates a high-dimensional function with the summation of a set of low-dimensional functions. Thus, instead of expanding the original model into PCE, we implement the PCE expansion on these low-dimensional functions, which is much less costly. We also propose a new adaptive criterion for ANOVA that is better suited to solving inverse problems. The new algorithm is tested on different examples and demonstrates great effectiveness in comparison with non-adaptive PCKF and EnKF algorithms.

  13. Streamflow hindcasting in European river basins via multi-parametric ensemble of the mesoscale hydrologic model (mHM)

    NASA Astrophysics Data System (ADS)

    Noh, Seong Jin; Rakovec, Oldrich; Kumar, Rohini; Samaniego, Luis

    2016-04-01

    There have been tremendous improvements in distributed hydrologic modeling (DHM), which have made process-based simulation at high spatiotemporal resolution applicable on large spatial scales. Despite increasing information on the heterogeneous properties of catchments, DHM is still subject to uncertainties inherent in model structure, parameters and input forcing. Sequential data assimilation (DA) may facilitate improved streamflow prediction via DHM by using real-time observations to correct internal model states. In conventional DA methods such as state updating, however, parametric uncertainty is often ignored, mainly due to practical limitations of methodology for specifying modeling uncertainty with limited ensemble members. If parametric uncertainty related to routing and runoff components is not incorporated properly, the predictive uncertainty of DHM may be insufficient to capture the dynamics of observations, which may deteriorate predictability. Recently, a multi-scale parameter regionalization (MPR) method was proposed to make hydrologic predictions at different scales using the same set of model parameters without losing much model performance. The MPR method incorporated within the mesoscale hydrologic model (mHM, http://www.ufz.de/mhm) can effectively represent and control the uncertainty of high-dimensional parameters in a distributed model using global parameters. In this study, we present a global multi-parametric ensemble approach that incorporates the parametric uncertainty of DHM into DA to improve streamflow predictions. To represent and control the uncertainty of high-dimensional parameters with a limited number of ensemble members, the MPR method is incorporated into the DA scheme (see the sketch below). Lagged particle filtering is utilized to account for the response times and non-Gaussian characteristics of internal hydrologic processes. Hindcasting experiments are implemented to evaluate the impact of the proposed DA method on streamflow predictions in multiple European river basins with different climate and catchment characteristics. Because augmentation of parameters is not required within an assimilation window, the approach remains stable with limited ensemble members and is viable for practical use.
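
    The sketch referred to above: an MPR-style regionalization in which a handful of global transfer-function coefficients generate a full gridded parameter field, so that an ensemble over those few coefficients spans the parameter uncertainty (the attribute names, transfer-function form, and averaging operator are hypothetical simplifications of what mHM actually uses):

        import numpy as np

        rng = np.random.default_rng(5)

        # Fine-grid catchment attributes (hypothetical): sand and clay fractions
        sand = rng.uniform(0.10, 0.80, (200, 200))
        clay = rng.uniform(0.05, 0.50, (200, 200))

        def regionalize(gamma, sand, clay):
            """MPR-style transfer function: a few global coefficients (gamma) map
            attributes to a fine-grid parameter field (the form is illustrative)."""
            return gamma[0] + gamma[1] * sand + gamma[2] * clay

        def upscale(field, block=10):
            """Upscale to the model grid with a simple averaging operator."""
            h, w = field.shape
            return field.reshape(h // block, block, w // block, block).mean(axis=(1, 3))

        # An ensemble over the *global* coefficients spans parameter uncertainty with
        # very few degrees of freedom, which keeps limited-ensemble DA stable
        ensemble = [upscale(regionalize(g, sand, clay))
                    for g in rng.normal([0.3, 0.5, -0.2], 0.05, size=(32, 3))]
        print(len(ensemble), ensemble[0].shape)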

  14. A Bayesian consistent dual ensemble Kalman filter for state-parameter estimation in subsurface hydrology

    NASA Astrophysics Data System (ADS)

    Ait-El-Fquih, Boujemaa; El Gharamti, Mohamad; Hoteit, Ibrahim

    2016-08-01

    Ensemble Kalman filtering (EnKF) is an efficient approach to addressing uncertainties in subsurface groundwater models. The EnKF sequentially integrates field data into simulation models to obtain a better characterization of the model's state and parameters. These are generally estimated following joint and dual filtering strategies, in which, at each assimilation cycle, a forecast step by the model is followed by an update step with incoming observations. The joint EnKF directly updates the augmented state-parameter vector, whereas the dual EnKF empirically employs two separate filters, first estimating the parameters and then estimating the state based on the updated parameters. To develop a Bayesian consistent dual approach and improve the state-parameter estimates and their consistency, we propose in this paper a one-step-ahead (OSA) smoothing formulation of the state-parameter Bayesian filtering problem, from which we derive a new dual-type EnKF, the dual EnKFOSA. Compared with the standard dual EnKF, it introduces an additional update step for the state, which is shown to enhance the performance of the dual approach with almost no increase in computational cost. Numerical experiments are conducted with a two-dimensional (2-D) synthetic groundwater aquifer model to investigate the performance and robustness of the proposed dual EnKFOSA, and to evaluate its results against those of the joint and dual EnKFs. The proposed scheme is able to successfully recover both the hydraulic head and the aquifer conductivity, providing reliable estimates of their uncertainties. Furthermore, it is found to be more robust to different assimilation settings, such as the spatial and temporal distribution of the observations and the level of noise in the data. Based on our experimental setups, it yields up to 25% more accurate state and parameter estimates than the joint and dual approaches.
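
    A rough sketch of the dual update cycle (a scalar toy model with a stochastic EnKF update; the extra re-forecast between the parameter and state updates conveys the spirit of the OSA formulation, though the paper derives it rigorously from Bayesian smoothing):

        import numpy as np

        rng = np.random.default_rng(6)
        n_ens = 100

        def model(h, k):
            """Toy groundwater forecast: head relaxes toward a level set by conductivity."""
            return h + 0.1 * (5.0 * k - h) + rng.normal(0.0, 0.02, h.shape)

        def enkf_update(ens, hx, y, r):
            """Stochastic EnKF update of ensemble 'ens' given predicted observations 'hx'."""
            y_pert = y + rng.normal(0.0, np.sqrt(r), hx.shape)
            gain = np.cov(ens, hx)[0, 1] / (np.var(hx, ddof=1) + r)
            return ens + gain * (y_pert - hx)

        h = rng.normal(2.0, 0.5, n_ens)          # state: hydraulic head
        k = rng.lognormal(0.0, 0.3, n_ens)       # parameter: conductivity

        for y_obs in [2.6, 2.9, 3.1]:            # sequential head observations
            h_f = model(h, k)
            k = enkf_update(k, h_f, y_obs, r=0.05**2)    # 1) parameter update (smoothing)
            h_f = model(h, k)                             # 2) re-forecast with updated k
            h = enkf_update(h_f, h_f, y_obs, r=0.05**2)   # 3) state update

        print("k mean:", k.mean().round(3), " h mean:", h.mean().round(3))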

  15. Analyzing the uncertainty of suspended sediment load prediction using sequential data assimilation

    NASA Astrophysics Data System (ADS)

    Leisenring, Marc; Moradkhani, Hamid

    2012-10-01

    Summary: A first step in understanding the impacts of sediment and controlling the sources of sediment is to quantify the mass loading. Since mass loading is the product of flow and concentration, the quantification of loads first requires the quantification of runoff volume. Using the National Weather Service's SNOW-17 and the Sacramento Soil Moisture Accounting (SAC-SMA) models, this study employed particle filter based Bayesian data assimilation methods to predict seasonal snow water equivalent (SWE) and runoff within a small watershed in the Lake Tahoe Basin located in California, USA. A procedure was developed to scale the variance multipliers (a.k.a. hyperparameters) for model parameters and predictions based on the accuracy of the mean predictions relative to the ensemble spread. In addition, an online bias correction algorithm based on the lagged average bias was implemented to detect and correct systematic bias in model forecasts prior to updating with the particle filter. Both of these methods significantly improved the performance of the particle filter without requiring excessively wide prediction bounds. The flow ensemble was linked to a non-linear regression model that was used to predict suspended sediment concentrations (SSCs) based on runoff rate and time of year. Runoff volumes and SSCs were then combined to produce an ensemble of suspended sediment load estimates. Annual suspended sediment loads for the 5 years of simulation were finally computed, along with 95% prediction intervals that account for uncertainty in both the SSC regression model and the flow rate estimates. Understanding the uncertainty associated with annual suspended sediment load predictions is critical for making sound watershed management decisions aimed at maintaining the exceptional clarity of Lake Tahoe. The computational methods developed and applied in this research could assist with similar studies where it is important to quantify the predictive uncertainty of pollutant load estimates.
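
    A minimal sketch of the lagged-average bias correction, assuming a fixed lag window and scalar forecasts (the class and method names are invented; the study's version also interacts with the variance-multiplier scaling):

        import numpy as np
        from collections import deque

        rng = np.random.default_rng(9)

        class LaggedBiasCorrector:
            """Online bias correction: subtract the mean of the last `lag` forecast
            errors (forecast minus observation) from each new forecast before the
            particle-filter update."""
            def __init__(self, lag=5):
                self.errors = deque(maxlen=lag)

            def correct(self, forecast):
                return forecast - (np.mean(self.errors) if self.errors else 0.0)

            def record(self, forecast, obs):
                self.errors.append(forecast - obs)

        bc = LaggedBiasCorrector(lag=5)
        truth = 10.0
        for t in range(20):
            raw = truth + 1.5 + rng.normal(0.0, 0.3)   # model runs systematically high
            corrected = bc.correct(raw)                # debiased forecast fed to the filter
            obs = truth + rng.normal(0.0, 0.2)
            bc.record(raw, obs)

        print("residual bias in last forecast:", round(corrected - truth, 2))

    Removing the systematic part of the error before the update is what lets the filter keep its ensemble spread modest without losing coverage of the observations.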

  16. Preparation and characterization of cobalt-substituted anthrax lethal factor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saebel, Crystal E.; Carbone, Ryan; Dabous, John R.

    2011-12-09

    Highlights: • Cobalt-substituted anthrax lethal factor (CoLF) is highly active. • CoLF can be prepared by bio-assimilation and direct exchange. • Lethal factor binds cobalt tightly. • The electronic spectrum of CoLF reveals penta-coordination. • Interaction of CoLF with thioglycolic acid follows a 2-step mechanism. -- Abstract: Anthrax lethal factor (LF) is a zinc-dependent endopeptidase involved in the cleavage of mitogen-activated protein kinase kinases near their N-termini. The current report concerns the preparation of cobalt-substituted LF (CoLF) and its characterization by electronic spectroscopy. Two strategies to produce CoLF were explored, including (i) a bio-assimilation approach involving the cultivation of LF-expressing Bacillus megaterium cells in the presence of CoCl₂, and (ii) direct exchange by treatment of zinc-LF with CoCl₂. Independent of the method employed, the protein was found to contain one Co²⁺ per LF molecule, and was shown to be twice as active as its native zinc counterpart. The electronic spectrum of CoLF suggests the Co²⁺ ion to be five-coordinate, an observation similar to that reported for other Co²⁺-substituted gluzincins, but distinct from that documented for the crystal structure of native LF. Furthermore, spectroscopic studies following the exposure of CoLF to thioglycolic acid (TGA) revealed a sequential mechanism of metal removal from LF, which likely involves the formation of an enzyme:Co²⁺:TGA ternary complex prior to demetallation of the active site. CoLF reported herein constitutes the first spectroscopic probe of LF's active site, which may be utilized in future studies to gain further insight into the enzyme's mechanism and inhibitor interactions.

  17. A New Methodology for the Extension of the Impact of Data Assimilation on Ocean Wave Prediction

    DTIC Science & Technology

    2008-07-01

    Assimilation method: The analysis fields were corrected by an assimilation method developed at the Norwegian Meteorological Institute (Breivik and Reistad 1994). The iterative analysis converges to the solution obtained by optimal interpolation (Bratseth 1986; Breivik and Reistad 1994), and the fields are updated accordingly; a more detailed description of the assimilation method is given in Breivik and Reistad (1994). The report also describes the use of Kolmogorov–Zurbenko filters.

  18. Carbon Source Preference in Chemosynthetic Hot Spring Communities

    PubMed Central

    Urschel, Matthew R.; Kubo, Michael D.; Hoehler, Tori M.; Peters, John W.

    2015-01-01

    Rates of dissolved inorganic carbon (DIC), formate, and acetate mineralization and/or assimilation were determined in 13 high-temperature (>73°C) hot springs in Yellowstone National Park (YNP), Wyoming, in order to evaluate the relative importance of these substrates in supporting microbial metabolism. While 9 of the hot spring communities exhibited rates of DIC assimilation that were greater than those of formate and acetate assimilation, 2 exhibited rates of formate and/or acetate assimilation that exceeded those of DIC assimilation. Overall rates of DIC, formate, and acetate mineralization and assimilation were positively correlated with spring pH but showed little correlation with temperature. Communities sampled from hot springs with similar geochemistries generally exhibited similar rates of substrate transformation, as well as similar community compositions, as revealed by 16S rRNA gene-tagged sequencing. Amendment of microcosms with small (micromolar) amounts of formate suppressed DIC assimilation in short-term (<45-min) incubations, despite the presence of native DIC concentrations that exceeded those of added formate by 2 to 3 orders of magnitude. The concentration of added formate required to suppress DIC assimilation was similar to the affinity constant (Km) for formate transformation, as determined by community kinetic assays. These results suggest that dominant chemoautotrophs in high-temperature communities are facultatively autotrophic or mixotrophic, are adapted to fluctuating nutrient availabilities, and are capable of taking advantage of energy-rich organic substrates when they become available. PMID:25819970

  19. Analyses and forecasts of a tornadic supercell outbreak using a 3DVAR system ensemble

    NASA Astrophysics Data System (ADS)

    Zhuang, Zhaorong; Yussouf, Nusrat; Gao, Jidong

    2016-05-01

    As part of NOAA's "Warn-On-Forecast" initiative, a convective-scale data assimilation and prediction system was developed using the WRF-ARW model and ARPS 3DVAR data assimilation technique. The system was then evaluated using retrospective short-range ensemble analyses and probabilistic forecasts of the tornadic supercell outbreak event that occurred on 24 May 2011 in Oklahoma, USA. A 36-member multi-physics ensemble system provided the initial and boundary conditions for a 3-km convective-scale ensemble system. Radial velocity and reflectivity observations from four WSR-88Ds were assimilated into the ensemble using the ARPS 3DVAR technique. Five data assimilation and forecast experiments were conducted to evaluate the sensitivity of the system to data assimilation frequencies, in-cloud temperature adjustment schemes, and fixed- and mixed-microphysics ensembles. The results indicated that the experiment with 5-min assimilation frequency quickly built up the storm and produced a more accurate analysis compared with the 10-min assimilation frequency experiment. The predicted vertical vorticity from the moist-adiabatic in-cloud temperature adjustment scheme was larger in magnitude than that from the latent heat scheme. Cycled data assimilation yielded good forecasts, where the ensemble probability of high vertical vorticity matched reasonably well with the observed tornado damage path. Overall, the results of the study suggest that the 3DVAR analysis and forecast system can provide reasonable forecasts of tornadic supercell storms.
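
    Each cycled analysis solves, at its core, a 3DVAR minimization. A toy dense-matrix version (invented dimensions and error statistics, with the radar observations reduced to direct point measurements rather than radial-velocity and reflectivity operators):

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(7)
        n, p = 30, 6

        xb = rng.normal(0.0, 1.0, n)                 # background (e.g. ensemble-member) state
        B_inv = np.eye(n)                            # inverse background-error covariance
        obs_idx = rng.choice(n, p, replace=False)
        y = rng.normal(0.5, 1.0, p)                  # radar-derived observations
        R_inv = np.eye(p) / 0.3**2                   # inverse observation-error covariance

        def j3dvar(x):
            """3DVAR cost function: background term plus observation term."""
            db = x - xb
            do = x[obs_idx] - y
            return 0.5 * db @ B_inv @ db + 0.5 * do @ R_inv @ do

        xa = minimize(j3dvar, xb, method='L-BFGS-B').x   # analysis state
        # Cycling: forecast xa with the model to the next analysis time (5 or 10 min
        # in the experiments above), then repeat with the new observations.
        print("cost at xb:", round(j3dvar(xb), 2), " cost at xa:", round(j3dvar(xa), 2))

    Running one such analysis per ensemble member, each with its own physics configuration, is what turns the deterministic 3DVAR into the probabilistic system evaluated in the paper.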

  20. SMOS brightness temperature assimilation into the Community Land Model

    NASA Astrophysics Data System (ADS)

    Rains, Dominik; Han, Xujun; Lievens, Hans; Montzka, Carsten; Verhoest, Niko E. C.

    2017-11-01

    SMOS (Soil Moisture and Ocean Salinity mission) brightness temperatures at a single incident angle are assimilated into the Community Land Model (CLM) across Australia to improve soil moisture simulations. To this end, the data assimilation system DasPy is coupled to the local ensemble transform Kalman filter (LETKF) as well as to the Community Microwave Emission Model (CMEM). Brightness temperature climatologies are precomputed to enable the assimilation of brightness temperature anomalies, making use of 6 years of SMOS data (2010-2015). Mean correlation R with in situ measurements increases moderately from 0.61 to 0.68 (11%) for upper soil layers if the root zone is included in the updates. A smaller improvement of 5% is achieved if the assimilation is restricted to the upper soil layers. Root-zone simulations improve by 7% when updating both the top layers and the root zone, and by 4% when only updating the top layers. Mean increments and increment standard deviations are compared across the experiments. The long-term assimilation impact is analysed by looking at a set of quantiles computed for soil moisture at each grid cell. Within hydrological monitoring systems, extreme dry or wet conditions are often defined via their relative occurrence, which gives assimilation-induced quantile changes great importance. Although such records are still limited today, longer L-band radiometer time series will become available, making model output improved by assimilating them increasingly usable for extreme-event statistics.
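
    A hedged sketch of the anomaly step, assuming a day-of-year climatology over the 2010-2015 record (the variable names and the synthetic series are invented; in the real system the model side uses its own brightness-temperature climatology from CLM+CMEM):

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(8)
        dates = pd.date_range('2010-01-01', '2015-12-31', freq='D')

        # Synthetic single-angle brightness temperature with a seasonal cycle (K)
        tb = pd.Series(250.0 + 15.0 * np.sin(2 * np.pi * dates.dayofyear / 365.25)
                       + rng.normal(0.0, 3.0, len(dates)), index=dates)

        # Multi-year mean per day of year = brightness-temperature climatology
        clim = tb.groupby(tb.index.dayofyear).mean()

        # Anomaly = observation minus its climatology; assimilating anomalies against
        # the model's own climatology removes systematic radiative-transfer biases
        anom = tb - clim.reindex(tb.index.dayofyear).to_numpy()
        print(round(float(anom.abs().mean()), 2), "K mean absolute anomaly")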
