Sample records for multifactor stochastic volatility

  1. Entropy measure of credit risk in highly correlated markets

    NASA Astrophysics Data System (ADS)

    Gottschalk, Sylvia

    2017-07-01

    We compare the single and multi-factor structural models of corporate default by calculating the Jeffreys-Kullback-Leibler divergence between their predicted default probabilities when asset correlations are either high or low. Single-factor structural models assume that the stochastic process driving the value of a firm is independent of that of other companies. A multi-factor structural model, on the contrary, is built on the assumption that a single firm's value follows a stochastic process correlated with that of other companies. Our main results show that the divergence between the two models increases in highly correlated, volatile, and large markets, but that it is closer to zero in small markets, when asset correlations are low and firms are highly leveraged. These findings suggest that during periods of financial instability, when asset volatility and correlations increase, one of the models misreports actual default risk.

  2. On the source of stochastic volatility: Evidence from CAC40 index options during the subprime crisis

    NASA Astrophysics Data System (ADS)

    Slim, Skander

    2016-12-01

    This paper investigates the performance of time-changed Lévy processes with distinct sources of return volatility variation for modeling cross-sectional option prices on the CAC40 index during the subprime crisis. Specifically, we propose a multi-factor stochastic volatility model: one factor captures the diffusion component dynamics and two factors capture positive and negative jump variations. In-sample and out-of-sample tests show that our full-fledged model significantly outperforms nested lower-dimensional specifications. We find that all three sources of return volatility variation, with different persistence, are needed to properly account for market pricing dynamics across moneyness, maturity and volatility level. Moreover, the model estimation reveals negative risk premia for both diffusive volatility and downward jump intensity, whereas a positive risk premium is attributed to upward jump intensity.

  3. Numerical methods on European option second order asymptotic expansions for multiscale stochastic volatility

    NASA Astrophysics Data System (ADS)

    Canhanga, Betuel; Ni, Ying; Rančić, Milica; Malyarenko, Anatoliy; Silvestrov, Sergei

    2017-01-01

    After Black and Scholes proposed their model for pricing European options in 1973, Cox, Ross and Rubinstein in 1979 and Heston in 1993 showed that the constant volatility assumption made by Black-Scholes was one of the main reasons the model was unable to capture some market details. Instead of constant volatilities, they introduced stochastic volatilities into the asset dynamics. In 2009, Christoffersen empirically showed "why multifactor stochastic volatility models work so well". Four years later, Chiarella and Ziveyi solved the model proposed by Christoffersen, considering an underlying asset whose price is governed by two stochastic volatility factors of mean-reverting type. Applying Fourier transforms, Laplace transforms and the method of characteristics, they presented a semi-analytical formula to compute an approximate price for American options. The heavy computations involved in the Chiarella and Ziveyi approach motivated the authors of this paper in 2014 to investigate another methodology for computing European option prices in a Christoffersen-type model. Using the first- and second-order asymptotic expansion method, we presented a closed-form solution for European options and provided experimental and numerical studies investigating the accuracy of the approximation formulae given by the first-order asymptotic expansion. In the present paper we perform experimental and numerical studies for the second-order asymptotic expansion and compare the obtained results with those presented by Chiarella and Ziveyi.

  4. Stochastic volatility of the futures prices of emission allowances: A Bayesian approach

    NASA Astrophysics Data System (ADS)

    Kim, Jungmu; Park, Yuen Jung; Ryu, Doojin

    2017-01-01

    Understanding the stochastic nature of the spot volatility of emission allowances is crucial for risk management in emissions markets. In this study, by adopting a stochastic volatility model with or without jumps to represent the dynamics of European Union Allowances (EUA) futures prices, we estimate the daily volatilities and model parameters by using the Markov Chain Monte Carlo method for stochastic volatility (SV), stochastic volatility with return jumps (SVJ) and stochastic volatility with correlated jumps (SVCJ) models. Our empirical results reveal several important features of emissions markets. First, the data presented herein suggest that EUA futures prices exhibit significant stochastic volatility. Second, the leverage effect is noticeable regardless of whether or not jumps are included. Third, the inclusion of jumps has a significant impact on the estimation of the volatility dynamics. Finally, the market becomes very volatile and large jumps occur at the beginning of a new phase. These findings are important for policy makers and regulators.
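
The jump channels in the SV/SVJ/SVCJ specifications above can be illustrated with a toy simulation. The sketch below (all parameter values are illustrative assumptions, not the paper's estimates) adds compound-Poisson return jumps to a square-root variance process and shows that jumps fatten the tails of simulated returns, which is why their inclusion matters for the volatility estimation.

```python
# Toy comparison of SV vs. SVJ return tails; parameters are assumed, not calibrated.
import math
import random

random.seed(5)

def simulate_returns(n, with_jumps, v0=0.04, kappa=5.0, theta=0.04, xi=0.5,
                     jump_prob=0.02, jump_sd=0.08, dt=1.0 / 252):
    """Daily returns under a square-root (CIR) variance, optionally with jumps."""
    v = v0
    rets = []
    for _ in range(n):
        vp = max(v, 0.0)
        r = math.sqrt(vp * dt) * random.gauss(0, 1)
        if with_jumps and random.random() < jump_prob:
            r += random.gauss(0, jump_sd)       # rare, large return jump
        rets.append(r)
        v += kappa * (theta - vp) * dt + xi * math.sqrt(vp * dt) * random.gauss(0, 1)
    return rets

def kurtosis(xs):
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m4 / (m2 * m2)

sv = simulate_returns(50000, with_jumps=False)
svj = simulate_returns(50000, with_jumps=True)
print("kurtosis SV: %.1f, SVJ: %.1f" % (kurtosis(sv), kurtosis(svj)))
```

The jump-augmented series shows markedly higher sample kurtosis, the stylized fact that drives the SVJ/SVCJ estimates in the record above.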

  5. Stochastic volatility models and Kelvin waves

    NASA Astrophysics Data System (ADS)

    Lipton, Alex; Sepp, Artur

    2008-08-01

    We use stochastic volatility models to describe the evolution of an asset price, its instantaneous volatility and its realized volatility. In particular, we concentrate on the Stein and Stein model (SSM) (1991) for the stochastic asset volatility and the Heston model (HM) (1993) for the stochastic asset variance. By construction, the volatility is not sign definite in SSM and is non-negative in HM. It is well known that both models produce closed-form expressions for the prices of vanilla options via the Lewis-Lipton formula. However, the numerical pricing of exotic options by means of the finite difference and Monte Carlo methods is much more complex for HM than for SSM. Until now, this complexity was considered to be an acceptable price to pay for ensuring that the asset volatility is non-negative. We argue that having negative stochastic volatility is a psychological rather than financial or mathematical problem, and advocate using SSM rather than HM in most applications. We extend SSM by adding volatility jumps and obtain a closed-form expression for the density of the asset price and its realized volatility. We also show that the current method of choice for solving pricing problems with stochastic volatility (via the affine ansatz for the Fourier-transformed density function) can be traced back to the Kelvin method designed in the 19th century for studying wave motion problems arising in fluid dynamics.
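
The sign issue discussed above can be seen numerically. In this sketch (my own Euler-scheme illustration with assumed parameters, not the authors' code), the Stein-Stein OU volatility readily crosses zero, while a Heston-type CIR variance simulated with absorption at zero stays non-negative.

```python
# Euler paths for OU volatility (Stein-Stein) vs. CIR variance (Heston-type).
import math
import random

random.seed(42)

def stein_stein_vol_path(sigma0=0.2, kappa=4.0, theta=0.2, xi=0.8, T=1.0, n=1000):
    """OU dynamics d(sigma) = kappa*(theta - sigma) dt + xi dW: not sign definite."""
    dt = T / n
    sigma, path = sigma0, [sigma0]
    for _ in range(n):
        sigma += kappa * (theta - sigma) * dt + xi * math.sqrt(dt) * random.gauss(0, 1)
        path.append(sigma)
    return path

def heston_var_path(v0=0.04, kappa=4.0, theta=0.04, xi=0.5, T=1.0, n=1000):
    """CIR dynamics with absorption at zero, so the variance stays non-negative."""
    dt = T / n
    v, path = v0, [v0]
    for _ in range(n):
        v = max(0.0, v + kappa * (theta - v) * dt
                + xi * math.sqrt(v * dt) * random.gauss(0, 1))
        path.append(v)
    return path

min_ou = min(min(stein_stein_vol_path()) for _ in range(20))
min_cir = min(min(heston_var_path()) for _ in range(20))
print("min OU volatility over 20 paths: %.3f" % min_ou)
print("min CIR variance over 20 paths:  %.3f" % min_cir)
```

With these illustrative parameters the OU volatility dips below zero on essentially every run, which is exactly the feature the authors argue is harmless.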

  6. Approximation methods of European option pricing in multiscale stochastic volatility model

    NASA Astrophysics Data System (ADS)

    Ni, Ying; Canhanga, Betuel; Malyarenko, Anatoliy; Silvestrov, Sergei

    2017-01-01

    In the classical Black-Scholes model for financial option pricing, the asset price follows a geometric Brownian motion with constant volatility. Empirical findings such as the volatility smile/skew and fat-tailed asset return distributions suggest that the constant volatility assumption might not be realistic. General stochastic volatility models, e.g. the Heston model, the GARCH model and the SABR volatility model, in which the variance/volatility itself typically follows a mean-reverting stochastic process, have been shown to be superior in capturing these empirical facts. However, in order to capture more features of the volatility smile, a two-factor stochastic volatility model of double Heston type is more useful, as shown in Christoffersen, Heston and Jacobs [12]. We consider a modified form of such two-factor volatility models in which the volatility has multiscale mean-reversion rates. Our model contains two mean-reverting volatility processes with a fast and a slow reverting rate respectively. We consider the European option pricing problem under one type of the multiscale stochastic volatility model where the two volatility processes act as independent factors in the asset price process. The novelty in this paper is an approximate analytical solution using the asymptotic expansion method, which extends the authors' earlier research in Canhanga et al. [5, 6]. In addition we propose a numerical approximate solution using Monte Carlo simulation. For completeness and for comparison we also implement the semi-analytical solution of Chiarella and Ziveyi [11] using the method of characteristics and Fourier and bivariate Laplace transforms.
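
As a rough numerical companion to the two-factor setup described above, the following Monte Carlo sketch prices a European call when the total variance is the sum of one fast- and one slow-reverting square-root factor. The Euler/truncation scheme and every parameter value are my illustrative assumptions, not the paper's calibration.

```python
# Monte Carlo European call under a two-factor mean-reverting variance model.
import math
import random

random.seed(1)

def mc_call_two_factor(S0=100.0, K=100.0, r=0.02, T=1.0,
                       kappa1=8.0, theta1=0.02, xi1=0.3, v1=0.02,   # fast factor
                       kappa2=0.5, theta2=0.02, xi2=0.1, v2=0.02,   # slow factor
                       n_steps=100, n_paths=4000):
    """Euler scheme; instantaneous variance = sum of two independent CIR factors."""
    dt = T / n_steps
    payoff_sum = 0.0
    for _ in range(n_paths):
        s, a, b = S0, v1, v2
        for _ in range(n_steps):
            ap, bp = max(a, 0.0), max(b, 0.0)
            var = ap + bp
            s *= math.exp((r - 0.5 * var) * dt
                          + math.sqrt(var * dt) * random.gauss(0, 1))
            a += kappa1 * (theta1 - ap) * dt + xi1 * math.sqrt(ap * dt) * random.gauss(0, 1)
            b += kappa2 * (theta2 - bp) * dt + xi2 * math.sqrt(bp * dt) * random.gauss(0, 1)
        payoff_sum += max(s - K, 0.0)
    return math.exp(-r * T) * payoff_sum / n_paths

price = mc_call_two_factor()
print("two-factor MC call price: %.2f" % price)
```

Such a brute-force simulation is the benchmark against which asymptotic-expansion approximations of the kind discussed above are usually checked.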

  7. Bias correction in the realized stochastic volatility model for daily volatility on the Tokyo Stock Exchange

    NASA Astrophysics Data System (ADS)

    Takaishi, Tetsuya

    2018-06-01

    The realized stochastic volatility model has been introduced to estimate volatility more accurately by using both daily returns and realized volatility. The main advantage of the model is that no special bias-correction factor for the realized volatility is required a priori. Instead, the model introduces a bias-correction parameter responsible for the bias hidden in realized volatility. We empirically investigate the bias-correction parameter for realized volatilities calculated at various sampling frequencies for six stocks on the Tokyo Stock Exchange, and then show that the dynamic behavior of the bias-correction parameter as a function of sampling frequency is qualitatively similar to that of the Hansen-Lunde bias-correction factor, although their values are substantially different. Under the stochastic diffusion assumption of the return dynamics, we investigate the accuracy of the estimated volatilities by examining the standardized returns. We find that while the moments of the standardized returns from low-frequency realized volatilities are consistent with the expectation for Gaussian variables, the deviation from this expectation becomes considerably large at high frequencies. This indicates that the realized stochastic volatility model itself cannot completely remove bias at high frequencies.
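
The frequency-dependent bias corrected for above can be reproduced in a toy setting: when observed prices are an efficient log-price plus i.i.d. microstructure noise, realized variance computed from very fine returns is biased upward, while sparser sampling comes closer to the integrated variance. The simulation parameters below are illustrative assumptions.

```python
# Microstructure-noise bias in realized variance at different sampling frequencies.
import math
import random

random.seed(7)

def simulate_prices(n=23400, sigma=0.2, noise_sd=0.0005):
    """Efficient log-price: Brownian motion with annualized vol sigma over one
    trading day; the observation adds i.i.d. noise (a bid-ask bounce proxy)."""
    dt = 1.0 / (252 * n)
    x, obs = 0.0, []
    for _ in range(n + 1):
        obs.append(x + random.gauss(0, noise_sd))
        x += sigma * math.sqrt(dt) * random.gauss(0, 1)
    return obs

def realized_variance(prices, step):
    """Sum of squared returns sampled every `step` observations."""
    rv = 0.0
    for i in range(0, len(prices) - step, step):
        r = prices[i + step] - prices[i]
        rv += r * r
    return rv

p = simulate_prices()
rv_fine = realized_variance(p, 1)      # ~1-second sampling: noise dominates
rv_sparse = realized_variance(p, 300)  # ~5-minute sampling: closer to truth
true_iv = 0.2 ** 2 / 252               # integrated variance for one day
print("RV fine: %.5f, RV sparse: %.5f, true IV: %.5f" % (rv_fine, rv_sparse, true_iv))
```

The fine-grid realized variance overshoots the integrated variance by an order of magnitude here, which is the kind of bias the model's correction parameter is meant to absorb.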

  8. Range-based volatility, expected stock returns, and the low volatility anomaly.

    PubMed

    Blau, Benjamin M; Whitby, Ryan J

    2017-01-01

    One of the foundations of financial economics is the idea that rational investors will discount stocks with more risk (volatility), which will result in a positive relation between risk and future returns. However, the empirical evidence is mixed when determining how volatility is related to future returns. In this paper, we examine this relation using a range-based measure of volatility, which is shown to be theoretically, numerically, and empirically superior to other measures of volatility. In a variety of tests, we find that range-based volatility is negatively associated with expected stock returns. These results are robust to time-series multifactor models as well as cross-sectional tests. Our findings contribute to the debate about the direction of the relationship between risk and return and confirm the presence of the low volatility anomaly, or the anomalous finding that low volatility stocks outperform high volatility stocks. In other tests, we find that the lower returns associated with range-based volatility are driven by stocks with lottery-like characteristics. PMID:29190652

  10. Portfolio Optimization with Stochastic Dividends and Stochastic Volatility

    ERIC Educational Resources Information Center

    Varga, Katherine Yvonne

    2015-01-01

    We consider an optimal investment-consumption portfolio optimization model in which an investor receives stochastic dividends. As a first problem, we allow the drift of stock price to be a bounded function. Next, we consider a stochastic volatility model. In each problem, we use the dynamic programming method to derive the Hamilton-Jacobi-Bellman…

  11. Effect of five enological practices and of the general phenolic composition on fermentation-related aroma compounds in Mencia young red wines.

    PubMed

    Añón, Ana; López, Jorge F; Hernando, Diego; Orriols, Ignacio; Revilla, Eugenio; Losada, Manuel M

    2014-04-01

    The effects of five technological procedures and of the contents of total anthocyanins and condensed tannins on 19 fermentation-related aroma compounds of young red Mencia wines were studied. Multifactor ANOVA revealed that the levels of those volatiles changed significantly over the length of storage in bottles and, to a lesser extent, due to the other technological factors considered; total anthocyanins and condensed tannins also changed significantly as a result of the five practices assayed. Five aroma compounds possessed an odour activity value >1 in all wines, and another four in some wines. Linear correlation between volatile compounds and the general phenolic composition revealed that total anthocyanins were highly related to 14 different aroma compounds. Multifactor ANOVA, considering the content of total anthocyanins as a sixth random factor, revealed that this parameter significantly affected the contents of ethyl lactate, ethyl isovalerate, 1-pentanol and ethyl octanoate. Thus, the aroma of young red Mencia wines may be affected by levels of total anthocyanins. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Estimation of stochastic volatility by using Ornstein-Uhlenbeck type models

    NASA Astrophysics Data System (ADS)

    Mariani, Maria C.; Bhuiyan, Md Al Masum; Tweneboah, Osei K.

    2018-02-01

    In this study, we develop a technique for estimating the stochastic volatility (SV) of a financial time series by using Ornstein-Uhlenbeck type models. Using the daily closing prices from developed and emergent stock markets, we conclude that the incorporation of stochastic volatility into the time-varying parameter estimation significantly improves the forecasting performance via Maximum Likelihood Estimation. Furthermore, our estimation algorithm is feasible with large data sets and has good convergence properties.
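
A minimal sketch of the OU-type estimation idea above: simulate an Ornstein-Uhlenbeck process via its exact AR(1) discretization and recover the drift parameters by least squares, the regression counterpart of a likelihood fit (all parameter values are illustrative assumptions).

```python
# Simulate an OU process and recover (kappa, theta) from its AR(1) representation.
import math
import random

random.seed(3)

def simulate_ou(x0, kappa, theta, sigma, dt, n):
    """Exact discretization: X[t+dt] = theta + (X[t]-theta)*e^{-kappa dt} + noise."""
    b = math.exp(-kappa * dt)
    sd = sigma * math.sqrt((1 - b * b) / (2 * kappa))
    xs = [x0]
    for _ in range(n):
        xs.append(theta + (xs[-1] - theta) * b + sd * random.gauss(0, 1))
    return xs

def fit_ou(xs, dt):
    """OLS of x[t+1] on x[t]; map slope/intercept back to (kappa, theta)."""
    x, y = xs[:-1], xs[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    kappa = -math.log(slope) / dt
    theta = intercept / (1 - slope)
    return kappa, theta

xs = simulate_ou(x0=0.3, kappa=2.0, theta=0.3, sigma=0.4, dt=1 / 252, n=50000)
k_hat, th_hat = fit_ou(xs, 1 / 252)
print("kappa_hat=%.2f theta_hat=%.3f" % (k_hat, th_hat))
```

With a long sample the regression recovers the true mean-reversion speed and level to within sampling error; the paper's MLE route plays the same role for volatility series.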

  13. A study about the existence of the leverage effect in stochastic volatility models

    NASA Astrophysics Data System (ADS)

    Florescu, Ionuţ; Pãsãricã, Cristian Gabriel

    2009-02-01

    The empirical relationship between the return of an asset and the volatility of the asset has been well documented in the financial literature. Named the leverage effect or sometimes the risk-premium effect, it is observed in real data that, when the return of the asset decreases, the volatility increases and vice versa. Consequently, it is important to demonstrate that any formulated model for the asset price is capable of generating this effect observed in practice. Furthermore, we need to understand the conditions on the parameters present in the model that guarantee the appearance of the leverage effect. In this paper we analyze two general specifications of stochastic volatility models and their capability of generating the perceived leverage effect. We derive conditions for the appearance of the leverage effect in both of these stochastic volatility models. We exemplify using stochastic volatility models used in practice and we explicitly state the conditions for the existence of the leverage effect in these examples.

  14. Pricing foreign equity option under stochastic volatility tempered stable Lévy processes

    NASA Astrophysics Data System (ADS)

    Gong, Xiaoli; Zhuang, Xintian

    2017-10-01

    Considering that financial asset returns exhibit leptokurtosis and asymmetry as well as clustering and heteroskedasticity effects, this paper replaces the log-normal jumps in the Heston stochastic volatility model with the classical tempered stable (CTS) distribution and the normal tempered stable (NTS) distribution to construct stochastic volatility tempered stable Lévy process (TSSV) models. The TSSV model framework permits the infinite-activity jump behavior of return dynamics and the time-varying volatility consistently observed in financial markets by subordinating a tempered stable process to a stochastic volatility process, capturing the leptokurtosis, fat-tailedness and asymmetry features of returns. By employing the analytical characteristic function and the fast Fourier transform (FFT) technique, the formula for the probability density function (PDF) of TSSV returns is derived, making an analytical formula for foreign equity option (FEO) pricing available. High-frequency financial returns data are employed to verify the effectiveness of the proposed models in reflecting the stylized facts of financial markets. Numerical analysis is performed to investigate the relationship between the corresponding parameters and the implied volatility of the foreign equity option.
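
The characteristic-function-to-density step used above can be sketched in a few lines. For brevity this toy uses the standard normal characteristic function and plain trapezoidal quadrature in place of the FFT; in the TSSV setting one would substitute the model's characteristic function. Grid choices are illustrative.

```python
# Recover a PDF by numerically inverting its characteristic function.
import cmath
import math

def inverse_fourier_pdf(phi, x, u_max=40.0, n=4000):
    """f(x) = (1/(2*pi)) * integral of phi(u) * e^{-i u x} du, trapezoidal rule."""
    du = 2 * u_max / n
    total = 0.0 + 0.0j
    for k in range(n + 1):
        u = -u_max + k * du
        w = 0.5 if k in (0, n) else 1.0     # trapezoid endpoint weights
        total += w * phi(u) * cmath.exp(-1j * u * x)
    return (total * du / (2 * math.pi)).real

phi_normal = lambda u: cmath.exp(-0.5 * u * u)   # standard normal char. function
f0 = inverse_fourier_pdf(phi_normal, 0.0)
print("recovered N(0,1) density at 0: %.4f" % f0)  # analytic value is 1/sqrt(2*pi) ~ 0.3989
```

The FFT version of this inversion evaluates the same integral on a whole grid of x values at once, which is what makes the pricing formula above computationally practical.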

  15. A DG approach to the numerical solution of the Stein-Stein stochastic volatility option pricing model

    NASA Astrophysics Data System (ADS)

    Hozman, J.; Tichý, T.

    2017-12-01

    Stochastic volatility models capture the real-world features of options better than the classical Black-Scholes treatment. Here we focus on the pricing of European-style options under the Stein-Stein stochastic volatility model, where the option value depends on the time, on the price of the underlying asset and on the volatility, modeled as a function of a mean-reverting Ornstein-Uhlenbeck process. A standard mathematical approach to this model leads to a non-stationary second-order degenerate partial differential equation in two spatial variables, completed by a system of boundary and terminal conditions. In order to improve the numerical valuation process for such a pricing equation, we propose a numerical technique based on the discontinuous Galerkin method and the Crank-Nicolson scheme. Finally, reference numerical experiments on real market data illustrate comprehensive empirical findings on options with stochastic volatility.

  16. Pricing foreign equity option with stochastic volatility

    NASA Astrophysics Data System (ADS)

    Sun, Qi; Xu, Weidong

    2015-11-01

    In this paper we propose a general foreign equity option pricing framework that unifies the vast foreign equity option pricing literature and incorporates stochastic volatility into foreign equity option pricing. Under our framework, time-changed Lévy processes are used to model the underlying asset price of the foreign equity option, and a closed-form pricing formula is obtained through the use of the characteristic function methodology. Numerical tests indicate that stochastic volatility has a dramatic effect on foreign equity option prices.

  17. Variational formulation for Black-Scholes equations in stochastic volatility models

    NASA Astrophysics Data System (ADS)

    Gyulov, Tihomir B.; Valkov, Radoslav L.

    2012-11-01

    In this note we prove existence and uniqueness of weak solutions to a boundary value problem arising from stochastic volatility models in financial mathematics. Our setting is variational, in weighted Sobolev spaces. Nevertheless, as will become apparent, our variational formulation agrees well with the stochastic part of the problem.

  18. Option pricing, stochastic volatility, singular dynamics and constrained path integrals

    NASA Astrophysics Data System (ADS)

    Contreras, Mauricio; Hojman, Sergio A.

    2014-01-01

    Stochastic volatility models have been widely studied and used in the financial world. The Heston model (Heston, 1993) [7] is one of the best known models to deal with this issue. These stochastic volatility models are characterized by the fact that they explicitly depend on a correlation parameter ρ which relates the two Brownian motions that drive the stochastic dynamics associated with the volatility and the underlying asset. Solutions to the Heston model in the context of option pricing, using a path integral approach, are found in Lemmens et al. (2008) [21], while propagators for different stochastic volatility models are constructed in Baaquie (2007, 1997) [12,13]. In all previous cases, the propagator is not defined for the extreme cases ρ=±1. It is therefore necessary to obtain a solution for these extreme cases and also to understand the origin of the divergence of the propagator. In this paper we study in detail a general class of stochastic volatility models for the extreme values ρ=±1 and show that in these two cases the associated classical dynamics corresponds to a system with second class constraints, which must be dealt with using Dirac’s method for constrained systems (Dirac, 1958, 1967) [22,23] in order to properly obtain the propagator in the form of a Euclidean Hamiltonian path integral (Henneaux and Teitelboim, 1992) [25]. After integrating over momenta, one gets a Euclidean Lagrangian path integral without constraints, which in the case of the Heston model corresponds to a path integral of a repulsive radial harmonic oscillator. In all the cases studied, the price of the underlying asset is completely determined by one of the second class constraints in terms of the volatility and plays no active role in the path integral.

  19. Path integral approach to closed-form option pricing formulas with applications to stochastic volatility and interest rate models

    NASA Astrophysics Data System (ADS)

    Lemmens, D.; Wouters, M.; Tempere, J.; Foulon, S.

    2008-07-01

    We present a path integral method to derive closed-form solutions for option prices in a stochastic volatility model. The method is explained in detail for the pricing of a plain vanilla option. The flexibility of our approach is demonstrated by extending the realm of closed-form option price formulas to the case where both the volatility and interest rates are stochastic. This flexibility is promising for the treatment of exotic options. Our analytical formulas are tested with numerical Monte Carlo simulations.
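
The kind of validation mentioned above (analytical formulas checked against Monte Carlo simulation) can be sketched in the constant-volatility limit, where the closed form is the Black-Scholes price. Parameters are illustrative.

```python
# Closed-form Black-Scholes call price validated against a Monte Carlo estimate.
import math
import random

random.seed(11)

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S0, K, r, sigma, T):
    """Black-Scholes closed-form price of a European call."""
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S0 * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def mc_call(S0, K, r, sigma, T, n_paths=100000):
    """Monte Carlo price: simulate terminal lognormal prices, average the payoff."""
    disc = math.exp(-r * T)
    drift = (r - 0.5 * sigma ** 2) * T
    vol = sigma * math.sqrt(T)
    total = 0.0
    for _ in range(n_paths):
        st = S0 * math.exp(drift + vol * random.gauss(0, 1))
        total += max(st - K, 0.0)
    return disc * total / n_paths

analytic = bs_call(100, 100, 0.02, 0.2, 1.0)
estimate = mc_call(100, 100, 0.02, 0.2, 1.0)
print("closed form %.3f vs Monte Carlo %.3f" % (analytic, estimate))
```

For the stochastic volatility and stochastic interest rate cases treated in the paper, the same check applies with the simulation extended to the extra state variables.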

  20. Asymptotic Behavior of the Stock Price Distribution Density and Implied Volatility in Stochastic Volatility Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gulisashvili, Archil, E-mail: guli@math.ohiou.ed; Stein, Elias M., E-mail: stein@math.princeton.ed

    2010-06-15

    We study the asymptotic behavior of distribution densities arising in stock price models with stochastic volatility. The main objects of our interest in the present paper are the density of time averages of the squared volatility process and the density of the stock price process in the Stein-Stein and the Heston model. We find explicit formulas for leading terms in asymptotic expansions of these densities and give error estimates. As an application of our results, sharp asymptotic formulas for the implied volatility in the Stein-Stein and the Heston model are obtained.

  1. Pricing European option with transaction costs under the fractional long memory stochastic volatility model

    NASA Astrophysics Data System (ADS)

    Wang, Xiao-Tian; Wu, Min; Zhou, Ze-Min; Jing, Wei-Shu

    2012-02-01

    This paper deals with the problem of discrete time option pricing using the fractional long memory stochastic volatility model with transaction costs. Through the 'anchoring and adjustment' argument in a discrete time setting, a European call option pricing formula is obtained.

  2. A Path Integral Approach to Option Pricing with Stochastic Volatility: Some Exact Results

    NASA Astrophysics Data System (ADS)

    Baaquie, Belal E.

    1997-12-01

    The Black-Scholes formula for pricing options on stocks and other securities has been generalized by Merton and Garman to the case when stock volatility is stochastic. The derivation of the price of a security derivative with stochastic volatility is reviewed starting from the first principles of finance. The equation of Merton and Garman is then recast using the path integration technique of theoretical physics. The price of the stock option is shown to be the analogue of the Schrödinger wavefunction of quantum mechanics, and the exact Hamiltonian and Lagrangian of the system are obtained. The results of Hull and White are generalized to the case when stock price and volatility have non-zero correlation. Some exact results for pricing stock options for the general correlated case are derived.

  3. Estimation of stochastic volatility with long memory for index prices of FTSE Bursa Malaysia KLCI

    NASA Astrophysics Data System (ADS)

    Chen, Kho Chia; Bahar, Arifah; Kane, Ibrahim Lawal; Ting, Chee-Ming; Rahman, Haliza Abd

    2015-02-01

    In recent years, the modeling of long memory properties, or fractionally integrated processes, in stochastic volatility has been applied to financial time series. A time series with structural breaks can generate strong persistence in the autocorrelation function, which is an observed behaviour of a long memory process. This paper considers structural breaks in the data in order to identify true long memory time series. Unlike the usual short memory models for log volatility, the fractional Ornstein-Uhlenbeck process is neither a Markovian process nor can it be easily transformed into one. This makes likelihood evaluation and parameter estimation for the long memory stochastic volatility (LMSV) model challenging tasks. The drift and volatility parameters of the fractional Ornstein-Uhlenbeck model are estimated separately using the least squares estimator (LSE) and the quadratic generalized variations (QGV) method, respectively. Finally, the empirical distribution of the unobserved volatility is estimated using particle filtering with the sequential importance sampling-resampling (SIR) method. The mean square error (MSE) between the estimated and empirical volatility indicates that the model performs fairly well for the index prices of FTSE Bursa Malaysia KLCI.

  5. Testing CEV stochastic volatility models using implied volatility index data

    NASA Astrophysics Data System (ADS)

    Kim, Jungmu; Park, Yuen Jung; Ryu, Doojin

    2018-06-01

    We test the goodness-of-fit of stochastic volatility (SV) models using the implied volatility index of the KOSPI200 options (VKOSPI). The likelihood ratio tests reject the Heston and Hull-White SV models, whether or not they include jumps. Our estimation results advocate the unconstrained constant elasticity of variance (CEV) model with return jumps for describing the physical-measure dynamics of the spot index. The sub-period analysis shows that there was a significant increase in the size and frequency of jumps during the crisis period, when compared to those in the normal periods.

  6. Analytical pricing formulas for hybrid variance swaps with regime-switching

    NASA Astrophysics Data System (ADS)

    Roslan, Teh Raihana Nazirah; Cao, Jiling; Zhang, Wenjun

    2017-11-01

    The problem of pricing discretely-sampled variance swaps under stochastic volatility, stochastic interest rate and regime-switching is considered in this paper. The Heston stochastic volatility model structure is extended by adding the Cox-Ingersoll-Ross (CIR) stochastic interest rate model. In addition, the parameters of the model are permitted to have transitions following a continuous, observable Markov chain process. This hybrid model can be used to describe certain macroeconomic conditions, for example the changing phases of business cycles. The outcome of our regime-switching hybrid model is presented in terms of analytical pricing formulas for variance swaps.

  7. Simulation of probabilistic wind loads and building analysis

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Chamis, Christos C.

    1991-01-01

    Probabilistic wind loads likely to occur on a structure during its design life are predicted. Described here is a suitable multifactor interactive equation (MFIE) model and its use in the Composite Load Spectra (CLS) computer program to simulate the wind pressure cumulative distribution functions on four sides of a building. The simulated probabilistic wind pressure load was applied to a building frame, and cumulative distribution functions of sway displacements and reliability against overturning were obtained using NESSUS (Numerical Evaluation of Stochastic Structure Under Stress), a stochastic finite element computer code. The geometry of the building and the properties of building members were also treated as random in the NESSUS analysis. The uncertainties of wind pressure, building geometry, and member section properties were quantified in terms of their respective sensitivities on the structural response.

  8. The Pricing of European Options Under the Constant Elasticity of Variance with Stochastic Volatility

    NASA Astrophysics Data System (ADS)

    Bock, Bounghun; Choi, Sun-Yong; Kim, Jeong-Hoon

    This paper considers a hybrid risky asset price model given by a constant elasticity of variance multiplied by a stochastic volatility factor. A multiscale analysis leads to an asymptotic pricing formula for both European vanilla options and barrier options near the zero elasticity of variance. The accuracy of the approximation is established in a rigorous manner. A numerical experiment for implied volatilities shows that the hybrid model improves on some well-known models in fitting the data for different maturities.

  9. Estimation and prediction under local volatility jump-diffusion model

    NASA Astrophysics Data System (ADS)

    Kim, Namhyoung; Lee, Younhee

    2018-02-01

    Volatility is an important factor in operating a company and managing risk. In portfolio optimization and in risk hedging with options, option values are evaluated using a volatility model. Various attempts have been made to predict option values. Recent studies have shown that stochastic volatility models and jump-diffusion models reflect stock price movements accurately. However, these models have practical limitations. Combining them with the local volatility model, which is widely used among practitioners, may lead to better performance. In this study, we propose a more effective and efficient method of estimating option prices by combining the local volatility model with the jump-diffusion model, and we apply it to both artificial and actual market data to evaluate its performance. The calibration process for estimating the jump parameters and local volatility surfaces is divided into three stages. We apply the local volatility model, the stochastic volatility model, and the local volatility jump-diffusion model estimated by the proposed method to KOSPI 200 index option pricing. The proposed method displays good estimation and prediction performance.

  10. Free float and stochastic volatility: the experience of a small open economy

    NASA Astrophysics Data System (ADS)

    Selçuk, Faruk

    2004-11-01

    Following the dramatic collapse of a fixed-exchange-rate-based inflation stabilization program, Turkey moved to a free floating exchange rate system in February 2001. In this paper, an asymmetric stochastic volatility model of the foreign exchange rate in Turkey is estimated for the floating period. It is shown that there is a positive relation between the exchange rate return and its volatility. In particular, an increase in the return at time t results in an increase in volatility at time t+1. However, the effect is asymmetric: a decrease in the exchange rate return at time t causes a relatively smaller decrease in volatility at time t+1. The results imply that a central bank with a volatility smoothing policy would be biased in viewing shocks to the exchange rate in favor of appreciation. The bias would increase if the bank also follows an inflation targeting policy.

  11. Bayesian estimation of realized stochastic volatility model by Hybrid Monte Carlo algorithm

    NASA Astrophysics Data System (ADS)

    Takaishi, Tetsuya

    2014-03-01

    The hybrid Monte Carlo algorithm (HMCA) is applied for Bayesian parameter estimation of the realized stochastic volatility (RSV) model. Using the 2nd order minimum norm integrator (2MNI) for the molecular dynamics (MD) simulation in the HMCA, we find that the 2MNI is more efficient than the conventional leapfrog integrator. We also find that the autocorrelation time of the volatility variables sampled by the HMCA is very short. Thus it is concluded that the HMCA with the 2MNI is an efficient algorithm for parameter estimation of the RSV model.
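
    A minimal sketch of the hybrid Monte Carlo update may help: it targets a simple standard normal rather than the RSV posterior, and uses the conventional leapfrog integrator (the paper's 2MNI would replace the leapfrog steps); all parameter values are illustrative.

```python
import numpy as np

def hmc_standard_normal(n_samples=5000, eps=0.2, n_leap=10, seed=1):
    """Hybrid Monte Carlo targeting a standard normal, U(x) = x^2/2.
    The standard leapfrog integrator is used; the paper's 2nd order
    minimum norm integrator (2MNI) would be a drop-in replacement."""
    rng = np.random.default_rng(seed)
    x = 0.0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        p = rng.standard_normal()            # fresh momentum each trajectory
        xn, pn = x, p
        pn -= 0.5 * eps * xn                 # half-step momentum (grad U = x)
        for step in range(n_leap):
            xn += eps * pn                   # full-step position
            if step < n_leap - 1:
                pn -= eps * xn               # full-step momentum
            else:
                pn -= 0.5 * eps * xn         # final half-step momentum
        h_old = 0.5 * p * p + 0.5 * x * x
        h_new = 0.5 * pn * pn + 0.5 * xn * xn
        if np.log(rng.random()) < h_old - h_new:   # Metropolis accept/reject
            x = xn
        samples[i] = x
    return samples
```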

  12. GPU Computing in Bayesian Inference of Realized Stochastic Volatility Model

    NASA Astrophysics Data System (ADS)

    Takaishi, Tetsuya

    2015-01-01

    The realized stochastic volatility (RSV) model, which utilizes the realized volatility as additional information, has been proposed to infer the volatility of financial time series. We consider the Bayesian inference of the RSV model by the Hybrid Monte Carlo (HMC) algorithm. The HMC algorithm can be parallelized and thus performed on the GPU for speedup. The GPU code is developed with CUDA Fortran. We compare the computational time of the HMC algorithm on a GPU (GTX 760) and a CPU (Intel i7-4770 3.4GHz) and find that the GPU can be up to 17 times faster than the CPU. We also code the program with OpenACC and find that appropriate coding can achieve a speedup similar to that of CUDA Fortran.

  13. Bayesian analysis of stochastic volatility-in-mean model with leverage and asymmetrically heavy-tailed error using generalized hyperbolic skew Student’s t-distribution

    PubMed Central

    Leão, William L.; Chen, Ming-Hui

    2017-01-01

    A stochastic volatility-in-mean model with correlated errors using the generalized hyperbolic skew Student-t (GHST) distribution provides a robust alternative to parameter estimation for daily stock returns in the absence of normality. An efficient Markov chain Monte Carlo (MCMC) sampling algorithm is developed for parameter estimation. The deviance information, the Bayesian predictive information and the log-predictive score criterion are used to assess the fit of the proposed model. The proposed method is applied to an analysis of the daily stock return data from the Standard & Poor’s 500 index (S&P 500). The empirical results reveal that the stochastic volatility-in-mean model with correlated errors and the GHST distribution leads to a significant improvement in the goodness-of-fit for the S&P 500 index returns dataset over the usual normal model. PMID:29333210

  14. Bayesian analysis of stochastic volatility-in-mean model with leverage and asymmetrically heavy-tailed error using generalized hyperbolic skew Student's t-distribution.

    PubMed

    Leão, William L; Abanto-Valle, Carlos A; Chen, Ming-Hui

    2017-01-01

    A stochastic volatility-in-mean model with correlated errors using the generalized hyperbolic skew Student-t (GHST) distribution provides a robust alternative to parameter estimation for daily stock returns in the absence of normality. An efficient Markov chain Monte Carlo (MCMC) sampling algorithm is developed for parameter estimation. The deviance information, the Bayesian predictive information and the log-predictive score criterion are used to assess the fit of the proposed model. The proposed method is applied to an analysis of the daily stock return data from the Standard & Poor's 500 index (S&P 500). The empirical results reveal that the stochastic volatility-in-mean model with correlated errors and the GHST distribution leads to a significant improvement in the goodness-of-fit for the S&P 500 index returns dataset over the usual normal model.

  15. Mathematics, Pricing, Market Risk Management and Trading Strategies for Financial Derivatives (3/3)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coffey, Brian J.; Lynn, Bryan

    IR and Long Term FX Derivatives - Stochastic Martingales for IR Curves - Implied Volatility Along the IR Curve - IR Libor Bonds - Vanilla IR Options: Caplets, Floorlets - Long Term FX Options: Interaction of Stochastic FX and Stochastic IR - $-Yen Bermudan Power Reverse Duals

  16. Mathematics, Pricing, Market Risk Management and Trading Strategies for Financial Derivatives (3/3)

    ScienceCinema

    Coffey, Brian J.; Lynn, Bryan

    2018-04-26

    IR and Long Term FX Derivatives - Stochastic Martingales for IR Curves - Implied Volatility Along the IR Curve - IR Libor Bonds - Vanilla IR Options: Caplets, Floorlets - Long Term FX Options: Interaction of Stochastic FX and Stochastic IR - $-Yen Bermudan Power Reverse Duals

  17. Application of stochastic differential geometry to the term structure of interest rates in developed markets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taranenko, Y.; Barnes, C.

    1996-12-31

    This paper deals with further developments of a new theory that applies stochastic differential geometry (SDG) to the dynamics of interest rates. We examine mathematical constraints on the evolution of interest rate volatilities that arise from stochastic differential calculus under the assumptions of an arbitrage-free evolution of zero coupon bonds and developed markets (i.e., no single party or factor can drive the whole market). The resulting theory incorporates the Heath-Jarrow-Morton (HJM) model of interest rates and provides new equations for volatilities, which makes the system of equations for interest rates and volatilities complete and self-consistent. It requires a much smaller amount of volatility data to be guessed for the SDG model as compared to the HJM model. Limited analysis of market volatility data suggests that the assumption of a developed market is violated around a maturity of two years. Maturities at which the assumptions of the SDG model are violated are suggested to serve as boundaries at which volatilities should be specified independently of the model. Our numerical example with two boundaries (two years and five years) qualitatively resembles market behavior. Under some conditions, solutions of the SDG model become singular, which may indicate market crashes. More detailed comparison with the data is needed before the theory can be established or refuted.

  18. Static vs stochastic optimization: A case study of FTSE Bursa Malaysia sectorial indices

    NASA Astrophysics Data System (ADS)

    Mamat, Nur Jumaadzan Zaleha; Jaaman, Saiful Hafizah; Ahmad, Rokiah@Rozita

    2014-06-01

    Traditional portfolio optimization methods such as Markowitz's mean-variance model and the semi-variance model utilize static expected return and volatility risk estimates from historical data to generate an optimal portfolio. The optimal portfolio may not truly be optimal in reality because extreme maximum and minimum values in the data may strongly influence the expected return and volatility risk estimates. This paper considers distributions of assets' return and volatility risk to determine a more realistic optimized portfolio. For illustration purposes, the sectorial indices data in FTSE Bursa Malaysia is employed. The results show that stochastic optimization provides a more stable information ratio.

  19. Static vs stochastic optimization: A case study of FTSE Bursa Malaysia sectorial indices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mamat, Nur Jumaadzan Zaleha; Jaaman, Saiful Hafizah; Ahmad, Rokiah Rozita

    2014-06-19

    Traditional portfolio optimization methods such as Markowitz's mean-variance model and the semi-variance model utilize static expected return and volatility risk estimates from historical data to generate an optimal portfolio. The optimal portfolio may not truly be optimal in reality because extreme maximum and minimum values in the data may strongly influence the expected return and volatility risk estimates. This paper considers distributions of assets' return and volatility risk to determine a more realistic optimized portfolio. For illustration purposes, the sectorial indices data in FTSE Bursa Malaysia is employed. The results show that stochastic optimization provides a more stable information ratio.

  20. Artificial neural network model of the hybrid EGARCH volatility of the Taiwan stock index option prices

    NASA Astrophysics Data System (ADS)

    Tseng, Chih-Hsiung; Cheng, Sheng-Tzong; Wang, Yi-Hsien; Peng, Jin-Tang

    2008-05-01

    This investigation integrates a novel hybrid asymmetric volatility approach into an artificial neural network (ANN) option-pricing model to upgrade the forecasting ability of the prices of derivative securities. The new hybrid asymmetric volatility method can simultaneously reduce the stochasticity and nonlinearity of the error term sequence and capture asymmetric volatility. Analytical results of the ANN option-pricing model reveal that Grey-EGARCH volatility provides greater predictability than other volatility approaches.

  1. A discontinuous Galerkin method for numerical pricing of European options under Heston stochastic volatility

    NASA Astrophysics Data System (ADS)

    Hozman, J.; Tichý, T.

    2016-12-01

    The paper is based on results from our recent research on multidimensional option pricing problems. We focus on European option valuation when the price movement of the underlying asset is driven by a stochastic volatility following a square root process proposed by Heston. The stochastic approach incorporates an additional spatial variable into the model and makes it very robust, i.e., it provides a framework to price a variety of options that is closer to reality. The main topic is to present the numerical scheme arising from the concept of discontinuous Galerkin methods and applicable to the Heston option pricing model. The numerical results are presented on artificial benchmarks as well as on reference market data.

  2. The consentaneous model of the financial markets exhibiting spurious nature of long-range memory

    NASA Astrophysics Data System (ADS)

    Gontis, V.; Kononovicius, A.

    2018-09-01

    It is widely accepted that there is strong persistence in the volatility of financial time series. The origin of the observed persistence, or long-range memory, is still an open problem, as the observed phenomenon could be a spurious effect. Earlier we proposed the consentaneous model of the financial markets based on non-linear stochastic differential equations. The consentaneous model successfully reproduces empirical probability and power spectral densities of volatility. This approach is qualitatively different from models built using fractional Brownian motion. In this contribution we investigate burst and inter-burst duration statistics of volatility in the financial markets employing the consentaneous model. Our analysis provides evidence that the empirical statistical properties of burst and inter-burst duration can be explained by non-linear stochastic differential equations driving the volatility in the financial markets. This serves as a strong argument that long-range memory in finance can have a spurious nature.

  3. Volatility in financial markets: stochastic models and empirical results

    NASA Astrophysics Data System (ADS)

    Miccichè, Salvatore; Bonanno, Giovanni; Lillo, Fabrizio; Mantegna, Rosario N.

    2002-11-01

    We investigate the historical volatility of the 100 most capitalized stocks traded in US equity markets. An empirical probability density function (pdf) of volatility is obtained and compared with the theoretical predictions of a lognormal model and of the Hull and White model. The lognormal model describes the pdf well in the region of low volatility values, whereas the Hull and White model better approximates the empirical pdf for large values of volatility. Both models fail to describe the empirical pdf over a moderately large volatility range.
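
    As an illustration of comparing a volatility pdf to a lognormal, the sketch below fits a lognormal by matching the mean and standard deviation of log-volatility on purely synthetic data; the latent volatility distribution, window length and parameter values are all assumptions for the example, not the paper's data.

```python
import numpy as np

def fit_lognormal_vol(returns):
    """Fit a lognormal to the distribution of historical volatility by
    matching mean and std of log-volatility (method of moments in logs).
    `returns` has one row of daily returns per volatility estimate."""
    vol = returns.std(axis=1)
    return np.log(vol).mean(), np.log(vol).std()

# synthetic data: latent lognormal daily volatility, 50-day return windows
rng = np.random.default_rng(42)
true_vol = np.exp(rng.normal(-4.0, 0.3, size=1000))
returns = rng.standard_normal((1000, 50)) * true_vol[:, None]
mu_hat, s_hat = fit_lognormal_vol(returns)
```

    Here mu_hat and s_hat recover roughly the latent parameters (-4.0, 0.3); deviations of an empirical pdf from the fitted lognormal, especially in the upper tail, are what motivate alternatives such as the Hull and White model.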

  4. Virtual volatility

    NASA Astrophysics Data System (ADS)

    Silva, A. Christian; Prange, Richard E.

    2007-03-01

    We introduce the concept of virtual volatility. This simple but new measure shows how to quantify the uncertainty in the forecast of the drift component of a random walk. The virtual volatility is also a useful tool in understanding the stochastic process for a given portfolio. In particular, and as an example, we were able to identify a mean-reversion effect in our portfolio. Finally, we briefly discuss the potential practical effect of the virtual volatility on an investor's asset allocation strategy.

  5. Medical imaging technology shock and volatility of macro economics: Analysis using a three-sector dynamical stochastic general equilibrium REC model.

    PubMed

    Han, Shurong; Huang, Yeqing

    2017-07-07

    The study analysed the medical imaging technology business cycle from 1981 to 2009 and found that the volatility of consumption in the Chinese medical imaging business was higher than that of the developed countries. The volatility of gross domestic product (GDP) and the correlation between consumption and GDP are also higher than those of the developed countries. Prior to the early 1990s, the volatility of consumption was even higher than that of GDP. This fact makes it difficult to explain the volatile market using the standard one-sector real economic cycle (REC) model. Contrary to other domestic studies, this study considers a three-sector dynamical stochastic general equilibrium REC model. In this model there are two consumption sectors, whereby one is labour intensive and the other is capital intensive. The more capital-intensive investment sector only introduces technology shocks in the medical imaging market. Our response functions and Monte Carlo simulation results show that the model can explain 90% of the volatility of consumption relative to GDP, and explain the correlation between consumption and GDP. The results demonstrate a significant link between technological reform in medical imaging, labour market volatility, and Chinese macroeconomic development.

  6. Option pricing for stochastic volatility model with infinite activity Lévy jumps

    NASA Astrophysics Data System (ADS)

    Gong, Xiaoli; Zhuang, Xintian

    2016-08-01

    The purpose of this paper is to apply the stochastic volatility model driven by infinite activity Lévy processes to option pricing, which displays infinite activity jump behavior and time-varying volatility, consistent with the phenomena observed in underlying asset dynamics. We pay special attention to three typical Lévy processes that replace the compound Poisson jumps in the Bates model, aiming to capture the leptokurtic feature in asset returns and the volatility clustering effect in return variance. By utilizing the analytical characteristic function and the fast Fourier transform technique, closed-form option pricing formulas can be derived. The intelligent global optimization search algorithm called Differential Evolution is introduced into these high-dimensional models for parameter calibration so as to improve the calibration quality of the fitted option models. Finally, we perform empirical research using both time series data and options data from financial markets to illustrate the effectiveness and superiority of the proposed method.
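
    The Differential Evolution calibration step can be sketched as follows; Black-Scholes stands in for the time-changed Lévy pricers, and the spot, strikes and "market" quotes are hypothetical:

```python
from math import log, sqrt, exp, erf
from scipy.optimize import differential_evolution

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes call price (a stand-in for the Levy model pricers)."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d1 - sigma * sqrt(T))

# hypothetical market quotes generated from a known sigma = 0.25
strikes = [80.0, 90.0, 100.0, 110.0, 120.0]
market = [bs_call(100.0, K, 1.0, 0.02, 0.25) for K in strikes]

def loss(params):
    """Sum of squared pricing errors, the objective DE minimises."""
    sigma = params[0]
    return sum((bs_call(100.0, K, 1.0, 0.02, sigma) - m) ** 2
               for K, m in zip(strikes, market))

result = differential_evolution(loss, bounds=[(0.01, 1.0)], seed=3, tol=1e-12)
```

    In the paper's setting the parameter vector and bounds would cover the diffusion and jump parameters of the Lévy model, but the calibration pattern is the same: a global DE search over bounded parameters minimising squared pricing errors.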

  7. No-arbitrage, leverage and completeness in a fractional volatility model

    NASA Astrophysics Data System (ADS)

    Vilela Mendes, R.; Oliveira, M. J.; Rodrigues, A. M.

    2015-02-01

    When the volatility process is driven by fractional noise one obtains a model which is consistent with the empirical market data. Depending on whether the stochasticity generators of log-price and volatility are independent or are the same, two versions of the model are obtained with different leverage behaviors. Here, the no-arbitrage and completeness properties of the models are rigorously studied.

  8. Modelling of volatility in monetary transmission mechanism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dobešová, Anna; Klepáč, Václav; Kolman, Pavel

    2015-03-10

    The aim of this paper is to compare different approaches to modeling volatility in the monetary transmission mechanism. For this purpose we built a time-varying parameter VAR (TVP-VAR) model with stochastic volatility and a VAR-DCC-GARCH model with conditional variance. Data from three European countries are included in the analysis: the Czech Republic, Germany and Slovakia. Results show that the VAR-DCC-GARCH system captures higher volatility of the observed variables, but the main trends and detected breaks are generally identical in both approaches.

  9. Forecasting volatility of SSEC in Chinese stock market using multifractal analysis

    NASA Astrophysics Data System (ADS)

    Wei, Yu; Wang, Peng

    2008-03-01

    In this paper, taking about 7 years’ high-frequency data of the Shanghai Stock Exchange Composite Index (SSEC) as an example, we propose a daily volatility measure based on the multifractal spectrum of the high-frequency price variability within a trading day. An ARFIMA model is used to depict the dynamics of this multifractal volatility (MFV) measure. The one-day-ahead volatility forecasting performance of the MFV model and of several existing volatility models, such as the realized volatility model, the stochastic volatility model and GARCH, is evaluated by the superior predictive ability (SPA) test. The empirical results show that under several loss functions, the MFV model obtains the best forecasting accuracy.

  10. Empirical method to measure stochasticity and multifractality in nonlinear time series

    NASA Astrophysics Data System (ADS)

    Lin, Chih-Hao; Chang, Chia-Seng; Li, Sai-Ping

    2013-12-01

    An empirical algorithm is used here to study the stochastic and multifractal nature of nonlinear time series. A parameter can be defined to quantitatively measure the deviation of the time series from a Wiener process so that the stochasticity of different time series can be compared. The local volatility of the time series under study can be constructed using this algorithm, and the multifractal structure of the time series can be analyzed by using this local volatility. As an example, we employ this method to analyze financial time series from different stock markets. The result shows that while developed markets evolve very much like an Ito process, the emergent markets are far from efficient. Differences about the multifractal structures and leverage effects between developed and emergent markets are discussed. The algorithm used here can be applied in a similar fashion to study time series of other complex systems.

  11. Valuation of Capabilities and System Architecture Options to Meet Affordability Requirement

    DTIC Science & Technology

    2014-04-30

    ...is an extension of the historic volatility and trend of the stock using Brownian motion. In finance, the Black-Scholes equation is used to value... the underlying asset whose value is modeled as a stochastic process. In finance, the underlying asset is a tradeable stock and the stochastic process...

  12. Noise-induced volatility of collective dynamics

    NASA Astrophysics Data System (ADS)

    Harras, Georges; Tessone, Claudio J.; Sornette, Didier

    2012-01-01

    Noise-induced volatility refers to a phenomenon of increased level of fluctuations in the collective dynamics of bistable units in the presence of a rapidly varying external signal, and intermediate noise levels. The archetypical signature of this phenomenon is that—beyond the increase in the level of fluctuations—the response of the system becomes uncorrelated with the external driving force, making it different from stochastic resonance. Numerical simulations and an analytical theory of a stochastic dynamical version of the Ising model on regular and random networks demonstrate the ubiquity and robustness of this phenomenon, which is argued to be a possible cause of excess volatility in financial markets, of enhanced effective temperatures in a variety of out-of-equilibrium systems, and of strong selective responses of immune systems of complex biological organisms. Extensive numerical simulations are compared with a mean-field theory for different network topologies.

  13. Escape problem under stochastic volatility: The Heston model

    NASA Astrophysics Data System (ADS)

    Masoliver, Jaume; Perelló, Josep

    2008-11-01

    We solve the escape problem for the Heston random diffusion model from a finite interval of span L. We obtain exact expressions for the survival probability (which amounts to solving the complete escape problem) as well as for the mean exit time. We also average the volatility in order to work out the problem for the return alone, regardless of volatility. We consider these results in terms of the dimensionless normal level of volatility—a ratio of the three parameters that appear in the Heston model—and analyze their form in several asymptotic limits. Thus, for instance, we show that the mean exit time grows quadratically with large spans, while for small spans the growth is systematically slower, depending on the value of the normal level. We compare our results with those of the Wiener process and show that the assumption of stochastic volatility, in an apparently paradoxical way, increases survival and prolongs the escape time. We finally observe that the model is able to describe the main exit-time statistics of the Dow-Jones daily index.

  14. Path integral approach to closed form pricing formulas in the Heston framework

    NASA Astrophysics Data System (ADS)

    Lemmens, Damiaan; Wouters, Michiel; Tempere, Jacques; Foulon, Sven

    2008-03-01

    We present a path integral approach for finding closed-form formulas for option prices in the framework of the Heston model. The first model for determining option prices was the Black-Scholes model, which assumed that the log-return followed a Wiener process with a given drift and constant volatility. To provide a realistic description of the market, the Black-Scholes results must be extended to include stochastic volatility. This is achieved by the Heston model, which assumes that the volatility follows a mean-reverting square root process. Current applications of the Heston model are hampered by the unavailability of fast numerical methods, due to a lack of closed-form formulas. Therefore the search for closed-form solutions is an essential step before the qualitatively better stochastic volatility models will be used in practice. To attain this goal we outline a simplified path integral approach yielding straightforward results for vanilla Heston options with correlation. Extensions to barrier options and other path-dependent options are discussed, and the new derivation is compared to existing results obtained from alternative path-integral approaches (Dragulescu, Kleinert).

  15. On multilevel RBF collocation to solve nonlinear PDEs arising from endogenous stochastic volatility models

    NASA Astrophysics Data System (ADS)

    Bastani, Ali Foroush; Dastgerdi, Maryam Vahid; Mighani, Abolfazl

    2018-06-01

    The main aim of this paper is the analytical and numerical study of a time-dependent second-order nonlinear partial differential equation (PDE) arising from the endogenous stochastic volatility model, introduced in [Bensoussan, A., Crouhy, M. and Galai, D., Stochastic equity volatility related to the leverage effect (I): equity volatility behavior. Applied Mathematical Finance, 1, 63-85, 1994]. As the first step, we derive a consistent set of initial and boundary conditions to complement the PDE, when the firm is financed by equity and debt. In the sequel, we propose a Newton-based iteration scheme for nonlinear parabolic PDEs which is an extension of a method for solving elliptic partial differential equations introduced in [Fasshauer, G. E., Newton iteration with multiquadrics for the solution of nonlinear PDEs. Computers and Mathematics with Applications, 43, 423-438, 2002]. The scheme is based on multilevel collocation using radial basis functions (RBFs) to solve the resulting locally linearized elliptic PDEs obtained at each level of the Newton iteration. We show the effectiveness of the resulting framework by solving a prototypical example from the field and compare the results with those obtained from three different techniques: (1) a finite difference discretization; (2) a naive RBF collocation and (3) a benchmark approximation, introduced for the first time in this paper. The numerical results confirm the robustness, higher convergence rate and good stability properties of the proposed scheme compared to other alternatives. We also comment on some possible research directions in this field.

  16. Stochastic model of financial markets reproducing scaling and memory in volatility return intervals

    NASA Astrophysics Data System (ADS)

    Gontis, V.; Havlin, S.; Kononovicius, A.; Podobnik, B.; Stanley, H. E.

    2016-11-01

    We investigate the volatility return intervals in the NYSE and FOREX markets. We explain previous empirical findings using a model based on the interacting agent hypothesis instead of the widely-used efficient market hypothesis. We derive macroscopic equations based on the microscopic herding interactions of agents and find that they are able to reproduce various stylized facts of different markets and different assets with the same set of model parameters. We show that the power-law properties and the scaling of return intervals and other financial variables have a similar origin and could be a result of a general class of non-linear stochastic differential equations derived from a master equation of an agent system that is coupled by herding interactions. Specifically, we find that this approach enables us to recover the volatility return interval statistics as well as volatility probability and spectral densities for the NYSE and FOREX markets, for different assets, and for different time-scales. We find also that the historical S&P500 monthly series exhibits the same volatility return interval properties recovered by our proposed model. Our statistical results suggest that human herding is so strong that it persists even when other evolving fluctuations perturbate the financial system.

  17. Estimation and analysis of multifactor productivity in truck transportation : 1987 - 2003

    DOT National Transportation Integrated Search

    2009-02-01

    The analysis has three objectives: 1) to estimate multifactor productivity (MFP) in truck transportation during 1987-2003; 2) to examine changes in multifactor productivity in U.S. truck transportation over time, and to compare these changes...

  18. Stochastic Modelling, Analysis, and Simulations of the Solar Cycle Dynamic Process

    NASA Astrophysics Data System (ADS)

    Turner, Douglas C.; Ladde, Gangaram S.

    2018-03-01

    Analytical solutions, discretization schemes and simulation results are presented for the time delay deterministic differential equation model of the solar dynamo presented by Wilmot-Smith et al. In addition, this model is extended under stochastic Gaussian white noise parametric fluctuations. The introduction of stochastic fluctuations incorporates variables affecting the dynamo process in the solar interior, estimation error of parameters, and uncertainty of the α-effect mechanism. Simulation results are presented and analyzed to exhibit the effects of stochastic parametric volatility-dependent perturbations. The results generalize and extend the work of Hazra et al. In fact, some of these results exhibit the oscillatory dynamic behavior generated by the stochastic parametric additive perturbations in the absence of time delay. In addition, the simulation results of the modified stochastic models influence the change in behavior of the very recently developed stochastic model of Hazra et al.

  19. Edgeworth expansions of stochastic trading time

    NASA Astrophysics Data System (ADS)

    Decamps, Marc; De Schepper, Ann

    2010-08-01

    Under most local and stochastic volatility models the underlying forward is assumed to be a positive function of a time-changed Brownian motion. This nicely relates the implied volatility smile to the so-called activity rate in the market. Following Young and DeWitt-Morette (1986) [8], we propose to apply the Duru-Kleinert process-cum-time transformation in path integrals to formulate the transition density of the forward. The method leads to asymptotic expansions of the transition density around a Gaussian kernel corresponding to the average activity in the market conditional on the forward value. The approximation is numerically illustrated for pricing vanilla options under the CEV model and the popular normal SABR model. The asymptotics can also be used for Monte Carlo simulations or backward integration schemes.

  20. Model risk for European-style stock index options.

    PubMed

    Gençay, Ramazan; Gibson, Rajna

    2007-01-01

    In empirical modeling, there have been two strands for pricing in the options literature, namely the parametric and nonparametric models. Often, the support for the nonparametric methods is based on a benchmark such as the Black-Scholes (BS) model with constant volatility. In this paper, we study the stochastic volatility (SV) and stochastic volatility random jump (SVJ) models as parametric benchmarks against feedforward neural network (FNN) models, a class of neural network models. Our choice for FNN models is due to their well-studied universal approximation properties of an unknown function and its partial derivatives. Since the partial derivatives of an option pricing formula are risk pricing tools, an accurate estimation of the unknown option pricing function is essential for pricing and hedging. Our findings indicate that FNN models offer themselves as robust option pricing tools, over their sophisticated parametric counterparts in predictive settings. There are two routes to explain the superiority of FNN models over the parametric models in forecast settings. These are nonnormality of return distributions and adaptive learning.

  1. Multifactor analysis of multiscaling in volatility return intervals.

    PubMed

    Wang, Fengzhong; Yamasaki, Kazuko; Havlin, Shlomo; Stanley, H Eugene

    2009-01-01

    We study the volatility time series of the 1137 most traded stocks in the U.S. stock markets for the two-year period 2001-2002 and analyze their return intervals τ, which are time intervals between volatilities above a given threshold q. We explore the probability density function of τ, P_q(τ), assuming a stretched exponential function, P_q(τ) ~ e^(-τ^γ). We find that the exponent γ depends on the threshold in the range between q=1 and 6 standard deviations of the volatility. This finding supports the multiscaling nature of the return interval distribution. To better understand the origin of this multiscaling, we study how γ depends on four essential factors: capitalization, risk, number of trades, and return. We show that γ depends on the capitalization, risk, and return but almost does not depend on the number of trades. This suggests that γ relates to portfolio selection but not to market activity. To further characterize the multiscaling of individual stocks, we fit the moments of τ, μ_m ≡ ⟨(τ/⟨τ⟩)^m⟩^(1/m), in the range of 10 < ⟨τ⟩ ≤ 100 by a power law, μ_m ~ ⟨τ⟩^δ. The exponent δ is found also to depend on the capitalization, risk, and return but not on the number of trades, and its tendency is opposite to that of γ.

  2. Multifactor analysis of multiscaling in volatility return intervals

    NASA Astrophysics Data System (ADS)

    Wang, Fengzhong; Yamasaki, Kazuko; Havlin, Shlomo; Stanley, H. Eugene

    2009-01-01

    We study the volatility time series of the 1137 most traded stocks in the U.S. stock markets for the two-year period 2001-2002 and analyze their return intervals τ, which are time intervals between volatilities above a given threshold q. We explore the probability density function of τ, P_q(τ), assuming a stretched exponential function, P_q(τ) ~ e^(-τ^γ). We find that the exponent γ depends on the threshold in the range between q=1 and 6 standard deviations of the volatility. This finding supports the multiscaling nature of the return interval distribution. To better understand the origin of this multiscaling, we study how γ depends on four essential factors: capitalization, risk, number of trades, and return. We show that γ depends on the capitalization, risk, and return but almost does not depend on the number of trades. This suggests that γ relates to portfolio selection but not to market activity. To further characterize the multiscaling of individual stocks, we fit the moments of τ, μ_m ≡ ⟨(τ/⟨τ⟩)^m⟩^(1/m), in the range of 10 < ⟨τ⟩ ≤ 100 by a power law, μ_m ~ ⟨τ⟩^δ. The exponent δ is found also to depend on the capitalization, risk, and return but not on the number of trades, and its tendency is opposite to that of γ. Moreover, we show that δ decreases with increasing γ approximately by a linear relation. The return intervals demonstrate the temporal structure of volatilities, and our findings suggest that their multiscaling features may be helpful for portfolio optimization.
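
    The interval statistics in this abstract are straightforward to reproduce. A minimal sketch on synthetic data (a hypothetical standardized volatility series, not the paper's 1137-stock dataset) computes the return intervals τ above a threshold q and the normalized moment μ_m:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical standardized volatility series (not the paper's data):
# absolute Gaussian returns, rescaled so the threshold q is in units of
# standard deviations of the volatility.
vol = np.abs(rng.standard_normal(100_000))
vol = (vol - vol.mean()) / vol.std()

def return_intervals(v, q):
    """Time gaps tau between successive exceedances of threshold q."""
    exceed = np.flatnonzero(v > q)
    return np.diff(exceed)

def normalized_moment(tau, m):
    """mu_m = <(tau/<tau>)^m>^(1/m), the moment ratio fitted in the paper."""
    x = tau / tau.mean()
    return (x ** m).mean() ** (1.0 / m)

tau = return_intervals(vol, q=1.0)
mu2 = normalized_moment(tau, 2)
print(f"{len(tau)} intervals, mean interval = {tau.mean():.1f}, mu_2 = {mu2:.2f}")
```

    By the power-mean inequality μ_m ≥ 1 for m > 1; how μ_m scales with ⟨τ⟩ across stocks is what the fitted exponent δ captures.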

  3. A Guide to the Multifactored Evaluation (MFE).

    ERIC Educational Resources Information Center

    Ohio Coalition for the Education of Children with Disabilities, Marion.

    This guide provides Ohio parents of children with disabilities with information on multifactored evaluations. It begins by discussing the Intervention Assistance Team and what occurs at the assistance team meeting. It also explains that to begin the multifactored evaluation process, the parent must complete a "Request for Parent Consent for…

  4. A complex network for studying the transmission mechanisms in stock market

    NASA Astrophysics Data System (ADS)

    Long, Wen; Guan, Lijing; Shen, Jiangjian; Song, Linqiu; Cui, Lingxiao

    2017-10-01

    This paper introduces a new complex network to describe volatility transmission mechanisms in the stock market. The network can not only endogenize the stock market's volatility but also identify the direction of volatility spillover. In this model, we first use BEKK-GARCH to estimate the volatility spillover effects among 18 Chinese industry sectors. Then, based on the ARCH coefficients and GARCH coefficients, the directional shock networks and variance networks in different stages are constructed separately. We find that the spillover effects and network structures change across stages. The results of the topological stability test demonstrate that the connectivity of the networks is more fragile to selective attacks than to stochastic attacks.
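
    As a sketch of the construction step, the snippet below turns a matrix of spillover coefficients into a directed adjacency structure. The coefficients here are randomly generated and purely illustrative; in the paper they would come from significant BEKK-GARCH ARCH/GARCH cross terms for the 18 sectors:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical spillover coefficients for 5 sectors; in the paper these
# would be significant BEKK-GARCH cross terms for 18 sectors.
n_sectors = 5
spill = rng.uniform(0.0, 1.0, (n_sectors, n_sectors))
np.fill_diagonal(spill, 0.0)  # no self-spillover edges

# Directed edge i -> j whenever sector i's shocks spill over to j
# beyond a chosen significance cutoff.
threshold = 0.6
edges = {i: [j for j in range(n_sectors) if spill[i, j] > threshold]
         for i in range(n_sectors)}
out_degree = {i: len(nbrs) for i, nbrs in edges.items()}
print(edges)
print("out-degrees:", out_degree)
```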

  5. Multi-factor authentication using quantum communication

    DOEpatents

    Hughes, Richard John; Peterson, Charles Glen; Thrasher, James T.; Nordholt, Jane E.; Yard, Jon T.; Newell, Raymond Thorson; Somma, Rolando D.

    2018-02-06

    Multi-factor authentication using quantum communication ("QC") includes stages for enrollment and identification. For example, a user enrolls for multi-factor authentication that uses QC with a trusted authority. The trusted authority transmits device factor information associated with a user device (such as a hash function) and user factor information associated with the user (such as an encrypted version of a user password). The user device receives and stores the device factor information and user factor information. For multi-factor authentication that uses QC, the user device retrieves its stored device factor information and user factor information, then transmits the user factor information to the trusted authority, which also retrieves its stored device factor information. The user device and trusted authority use the device factor information and user factor information (more specifically, information such as a user password that is the basis of the user factor information) in multi-factor authentication that uses QC.
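
    The enrollment and identification bookkeeping can be sketched classically. The snippet below is an illustrative two-factor scheme built on an HMAC; it does not model the quantum-communication layer that is the substance of the patent, and all names and the binding construction are assumptions:

```python
import hashlib
import hmac
import secrets

# Illustrative two-factor bookkeeping only: the quantum-communication (QC)
# layer of the patent is not modeled, and the construction is an assumption.
def enroll(password: str):
    device_key = secrets.token_bytes(16)  # "device factor" material
    # "User factor": the password bound to the device key via an HMAC.
    user_factor = hmac.new(device_key, password.encode(), hashlib.sha256).digest()
    return device_key, user_factor

def authenticate(password: str, device_key: bytes, stored_user_factor: bytes) -> bool:
    # Success requires BOTH factors: the stored device key and the password.
    candidate = hmac.new(device_key, password.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(candidate, stored_user_factor)

device_key, user_factor = enroll("correct horse battery staple")
print(authenticate("correct horse battery staple", device_key, user_factor))  # True
print(authenticate("wrong password", device_key, user_factor))                # False
```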

  6. Baldovin-Stella stochastic volatility process and Wiener process mixtures

    NASA Astrophysics Data System (ADS)

    Peirano, P. P.; Challet, D.

    2012-08-01

    Starting from inhomogeneous time scaling and linear decorrelation between successive price returns, Baldovin and Stella recently proposed a powerful and consistent way to build a model describing the time evolution of a financial index. We first make the model fully explicit by using Student distributions instead of power-law-truncated Lévy distributions, show that its analytic tractability extends to the larger class of symmetric generalized hyperbolic distributions, and provide a full computation of their multivariate characteristic functions; more generally, we show that the stochastic processes arising in this framework are representable as mixtures of Wiener processes. The basic Baldovin and Stella model, while mimicking well volatility relaxation phenomena such as the Omori law, fails to reproduce other stylized facts such as the leverage effect or some time-reversal asymmetries. We discuss how to modify the dynamics of this process in order to reproduce real data more accurately.

  7. Understanding the determinants of volatility clustering in terms of stationary Markovian processes

    NASA Astrophysics Data System (ADS)

    Miccichè, S.

    2016-11-01

    Volatility is a key variable in the modeling of financial markets. The most striking feature of volatility is that it is a long-range correlated stochastic variable, i.e. its autocorrelation function decays like a power law τ^(-β) for large time lags. In the present work we investigate the determinants of this feature, starting from the empirical observation that the exponent β of a given stock's volatility is a linear function of the average correlation of that stock's volatility with all other volatilities. We propose a simple approach consisting in diagonalizing the cross-correlation matrix of volatilities and investigating whether or not the diagonalized volatilities still retain some of the original volatility stylized facts. We find that the diagonalized volatilities share with the original volatilities both the power-law decay of the probability density function and the power-law decay of the autocorrelation function. This indicates that volatility clustering is already present in the diagonalized, uncorrelated volatilities. We therefore present a parsimonious univariate model based on a non-linear Langevin equation that reproduces these two stylized facts of volatility well. The model helps us understand that the main source of volatility clustering, once volatilities have been diagonalized, is that the economic forces driving volatility can be modeled in terms of a Smoluchowski potential with logarithmic tails.
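
    A sketch of the exponent estimation: the code below fits ρ(τ) ~ τ^(-β) in log-log coordinates. The synthetic volatility (the exponential of a persistent AR(1) process) is an assumption for illustration; its true autocorrelation decays exponentially rather than as a power law, so the fitted β only demonstrates the estimator, not genuine long memory:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic volatility proxy: exponential of a persistent AR(1) process.
n, phi = 200_000, 0.98
h = np.zeros(n)
noise = 0.1 * rng.standard_normal(n)
for t in range(1, n):
    h[t] = phi * h[t - 1] + noise[t]
vol = np.exp(h)

def acf(x, lags):
    """Sample autocorrelation function at the given positive lags."""
    x = x - x.mean()
    var = np.dot(x, x) / len(x)
    return np.array([np.dot(x[:-k], x[k:]) / ((len(x) - k) * var) for k in lags])

lags = np.arange(1, 200)
rho = acf(vol, lags)

# Fit rho(tau) ~ tau^(-beta) by least squares in log-log coordinates.
positive = rho > 0
beta = -np.polyfit(np.log(lags[positive]), np.log(rho[positive]), 1)[0]
print(f"fitted beta = {beta:.3f}")
```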

  8. Pricing timer options and variance derivatives with closed-form partial transform under the 3/2 model

    PubMed Central

    Zheng, Wendong; Zeng, Pingping

    2016-01-01

    Most of the empirical studies on stochastic volatility dynamics favour the 3/2 specification over the square-root (CIR) process in the Heston model. In the context of option pricing, the 3/2 stochastic volatility model (SVM) is reported to capture the volatility skew evolution better than the Heston model. In this article, we make a thorough investigation of the analytic tractability of the 3/2 SVM by proposing a closed-form formula for the partial transform of the triple joint transition density of the log asset price, the quadratic variation (continuous realized variance), and the instantaneous variance. Two distinct formulations are provided for deriving the main result. The closed-form partial transform enables us to deduce a variety of marginal partial transforms and characteristic functions, and it plays a crucial role in pricing discretely sampled variance derivatives and exotic options that depend on both the asset price and the quadratic variation. Various applications and numerical examples on pricing moment swaps and timer options with a discrete monitoring feature are given to demonstrate the versatility of the partial transform under the 3/2 model. PMID:28706460
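
    The three jointly transformed quantities (log price, quadratic variation, instantaneous variance) can be simulated together with a plain Euler scheme. The parameters below are illustrative assumptions, and the discretization is a sketch rather than the paper's analytic transform:

```python
import numpy as np

rng = np.random.default_rng(2)

# Euler sketch of the 3/2 model (parameter values are illustrative):
#   dV_t     = kappa * V_t * (theta - V_t) dt + eps * V_t^(3/2) dW_t^v
#   d ln S_t = -V_t / 2 dt + sqrt(V_t) dW_t^s,  corr(dW^v, dW^s) = rho
kappa, theta, eps, rho = 2.0, 0.04, 0.3, -0.7
dt, n_steps = 1.0 / 252, 252

v, log_s, qv = theta, 0.0, 0.0  # variance, log price, quadratic variation
for _ in range(n_steps):
    z1, z2 = rng.standard_normal(2)
    dw_v = np.sqrt(dt) * z1
    dw_s = np.sqrt(dt) * (rho * z1 + np.sqrt(1.0 - rho ** 2) * z2)
    log_s += -0.5 * v * dt + np.sqrt(v) * dw_s
    qv += v * dt  # continuous realized variance, the second transform argument
    v = max(v + kappa * v * (theta - v) * dt + eps * v ** 1.5 * dw_v, 1e-10)

print(f"log price = {log_s:.4f}, realized variance = {qv:.4f}")
```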

  9. Analysis of carbon emission regulations in supply chains with volatile demand.

    DOT National Transportation Integrated Search

    2014-07-01

    This study analyzes an inventory control problem of a company in stochastic demand environment under carbon emissions : regulations. In particular, a continuous review inventory model with multiple suppliers is investigated under carbon taxing and ca...

  10. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components, part 2

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that included fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.

  11. Nonlinear stochastic interacting dynamics and complexity of financial gasket fractal-like lattice percolation

    NASA Astrophysics Data System (ADS)

    Zhang, Wei; Wang, Jun

    2018-05-01

    A novel nonlinear stochastic interacting price dynamics is proposed and investigated via bond percolation on a Sierpinski gasket fractal-like lattice, with the aim of providing a new approach to reproducing and studying the complex dynamics of real security markets. Fractal-like lattices correspond to finite graphs with vertices and edges, which are similar to fractals, and the Sierpinski gasket is a well-known example of a fractal. Fractional ordinal array entropy and fractional ordinal array complexity are introduced to analyze the complexity behaviors of financial signals. To comprehend the fluctuation characteristics of the stochastic price evolution more deeply, complexity analyses of random logarithmic returns and volatility are performed, including power-law distribution, fractional sample entropy and fractional ordinal array complexity. To further verify the rationality and validity of the developed stochastic price evolution, actual security market datasets are also studied with the same statistical methods for comparison. The empirical results show that this stochastic price dynamics can reconstruct complexity behaviors of the actual security markets to some extent.

  12. Robust Bayesian Analysis of Heavy-tailed Stochastic Volatility Models using Scale Mixtures of Normal Distributions

    PubMed Central

    Abanto-Valle, C. A.; Bandyopadhyay, D.; Lachos, V. H.; Enriquez, I.

    2009-01-01

    A Bayesian analysis of stochastic volatility (SV) models using the class of symmetric scale mixtures of normal (SMN) distributions is considered. In the face of non-normality, this provides an appealing robust alternative to the routine use of the normal distribution. Specific distributions examined include the normal, Student-t, slash and variance gamma distributions. Using a Bayesian paradigm, an efficient Markov chain Monte Carlo (MCMC) algorithm is introduced for parameter estimation. Moreover, the mixing parameters obtained as a by-product of the scale mixture representation can be used to identify outliers. The methods developed are applied to analyze daily stock return data on the S&P500 index. Bayesian model selection criteria as well as out-of-sample forecasting results reveal that the SV models based on heavy-tailed SMN distributions provide significant improvement in model fit as well as prediction for the S&P500 index data over the usual normal model. PMID:20730043
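
    A minimal simulation sketch of such a model: log-volatility follows an AR(1), and the Student-t measurement noise is built from its scale-mixture-of-normals representation, whose mixing parameters λ_t flag candidate outliers. All parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative parameters for a Student-t SV model via the SMN representation.
n, phi, sigma_eta, nu = 50_000, 0.95, 0.2, 5.0

# Log-volatility follows a stationary AR(1).
h = np.zeros(n)
eta = sigma_eta * rng.standard_normal(n)
for t in range(1, n):
    h[t] = phi * h[t - 1] + eta[t]

# Student-t noise: eps = z / sqrt(lam), lam ~ Gamma(nu/2, rate nu/2).
lam = rng.gamma(nu / 2.0, 2.0 / nu, n)  # numpy's gamma takes shape and SCALE
eps = rng.standard_normal(n) / np.sqrt(lam)
returns = np.exp(h / 2.0) * eps

# Small mixing parameters lam flag candidate outliers, as in the abstract.
outliers = np.flatnonzero(lam < np.quantile(lam, 0.01))
kurt = ((returns - returns.mean()) ** 4).mean() / returns.var() ** 2
print(f"sample kurtosis = {kurt:.1f} (Gaussian would be 3), flagged = {len(outliers)}")
```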

  13. A primer on multifactor productivity : description, benefits, and uses

    DOT National Transportation Integrated Search

    2008-04-01

    This primer presents a description of multifactor : productivity (MFP) and its calculation. Productivity : is an important measure of the state of the : economy at various levels: firm, industry, sectoral, : and the macroeconomic. The method describe...

  14. The Sign Effect in Emerging Markets: the Inherent Instability of Bad News

    NASA Astrophysics Data System (ADS)

    Tenenbaum, Joel; Podobnik, Boris; Horvatic, Davor; Bajic, Slavica; Pehlivanovic, Beco; Stanley, H. Eugene

    2011-03-01

    In developed economy market indices, the sign of a term in a series influences the volatility in an asymmetric fashion: bad news results in larger subsequent fluctuations, while good news results in smaller fluctuations. We study this phenomenon of volatility asymmetry using a stochastic process, exploring whether this asymmetry manifests in emerging markets and, if so, how such asymmetry changes over time as economies develop, mature, and react to crises such as the present one. We find that while both developed and emerging markets show distinctive behavior with respect to volatility asymmetry during times of economic tumult, they do so in ways that could be viewed either as universal or as qualitatively different, posing interesting questions for further research.

  15. Neglected chaos in international stock markets: Bayesian analysis of the joint return-volatility dynamical system

    NASA Astrophysics Data System (ADS)

    Tsionas, Mike G.; Michaelides, Panayotis G.

    2017-09-01

    We use a novel Bayesian inference procedure for the Lyapunov exponent in the dynamical system of returns and their unobserved volatility. In this dynamical system, computation of the largest Lyapunov exponent by traditional methods is impossible, as the stochastic nature of unobserved volatility has to be taken into account explicitly. We apply the new techniques to daily stock return data for a group of six countries, namely the USA, UK, Switzerland, Netherlands, Germany and France, from 2003 to 2014, by means of Sequential Monte Carlo for Bayesian inference. The evidence indicates that there is indeed noisy chaos both before and after the recent financial crisis. However, when a much simpler model is examined, in which the interaction between returns and volatility is not taken into consideration jointly, the hypothesis of chaotic dynamics does not receive much support from the data ("neglected chaos").

  16. What distinguishes individual stocks from the index?

    NASA Astrophysics Data System (ADS)

    Wagner, F.; Milaković, M.; Alfarano, S.

    2010-01-01

    Stochastic volatility models decompose the time series of financial returns into the product of a volatility factor and an iid noise factor. Assuming a slow dynamic for the volatility factor, we show via nonparametric tests that both the index as well as its individual stocks share a common volatility factor. While the noise component is Gaussian for the index, individual stock returns turn out to require a leptokurtic noise. Thus we propose a two-component model for stocks, given by the sum of Gaussian noise, which reflects market-wide fluctuations, and Laplacian noise, which incorporates firm-specific factors such as firm profitability or growth performance, both of which are known to be Laplacian distributed. In the case of purely Gaussian noise, the chi-squared probability for the density of individual stock returns is typically on the order of 10^(-20), while it increases to values of O(1) by adding the Laplace component.
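
    The two-component noise model is easy to illustrate: adding a Laplace component to Gaussian noise produces the leptokurtosis the authors report for individual stocks. The 0.7 weights below are illustrative guesses, not the paper's fitted values:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Market-wide Gaussian component plus firm-specific Laplace component.
gauss = rng.standard_normal(n)
stock_noise = 0.7 * gauss + 0.7 * rng.laplace(0.0, 1.0, n)

def excess_kurtosis(x):
    """Fourth standardized moment minus 3 (zero for a Gaussian)."""
    x = x - x.mean()
    return (x ** 4).mean() / ((x ** 2).mean() ** 2) - 3.0

print(f"index-like (Gaussian) noise:    {excess_kurtosis(gauss):+.2f}")
print(f"stock-like two-component noise: {excess_kurtosis(stock_noise):+.2f}")
```

    For independent components, the kurtosis of the sum is a variance-weighted combination of the components' kurtoses, so any Laplace admixture lifts the noise above the Gaussian benchmark.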

  17. An analysis of labor and multifactor productivity in air transportation : 1990 - 2001

    DOT National Transportation Integrated Search

    2002-01-01

    The analysis has two main objectives: 1) to examine : labor productivity and multifactor productivity : (MFP) in U.S. air transportation during the 1990 : to 2001 period and to compare these measures to : those of two other transportation subsectors ...

  18. A comparison of linear and nonlinear statistical techniques in performance attribution.

    PubMed

    Chan, N H; Genovese, C R

    2001-01-01

    Performance attribution is usually conducted under the linear framework of multifactor models. Although commonly used by practitioners in finance, linear multifactor models are known to be less than satisfactory in many situations. After a brief survey of nonlinear methods, nonlinear statistical techniques are applied to performance attribution of a portfolio constructed from a fixed universe of stocks, using factors derived from some commonly used cross-sectional linear multifactor models. By rebalancing this portfolio monthly, the cumulative returns for procedures based on a standard linear multifactor model and three nonlinear techniques (model selection, additive models, and neural networks) are calculated and compared. It is found that the first two nonlinear techniques, especially in combination, outperform the standard linear model. The results in the neural-network case are inconclusive because of the great variety of possible models. Although these methods are more complicated and may require some tuning, toolboxes are developed and suggestions on calibration are proposed. This paper demonstrates the usefulness of modern nonlinear statistical techniques in performance attribution.

  19. Fractional Ornstein-Uhlenbeck for index prices of FTSE Bursa Malaysia KLCI

    NASA Astrophysics Data System (ADS)

    Chen, Kho Chia; Bahar, Arifah; Ting, Chee-Ming

    2014-07-01

    This paper studies the Ornstein-Uhlenbeck model incorporating long-memory stochastic volatility, known as the fractional Ornstein-Uhlenbeck model. The existence of long-range dependence in the index prices of the FTSE Bursa Malaysia KLCI is measured by the Hurst exponent. The empirical distribution of unobserved volatility is estimated using the particle filtering method. The performance of the fractional Ornstein-Uhlenbeck model is compared with that of the standard Ornstein-Uhlenbeck process. The mean square errors of the fractional Ornstein-Uhlenbeck model indicate that it describes the index prices better than the standard Ornstein-Uhlenbeck process.
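
    Long-range dependence enters through the Hurst exponent H. A simple aggregated-variance estimator (one of several standard methods; the paper does not specify which it uses) is sketched below on white noise, whose true H is 0.5:

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.standard_normal(2 ** 16)  # white noise: true Hurst exponent H = 0.5

def hurst_aggvar(x, window_sizes):
    """Aggregated-variance estimator: Var(block means) ~ m^(2H - 2)."""
    log_m, log_v = [], []
    for m in window_sizes:
        k = len(x) // m
        block_means = x[: k * m].reshape(k, m).mean(axis=1)
        log_m.append(np.log(m))
        log_v.append(np.log(block_means.var()))
    slope = np.polyfit(log_m, log_v, 1)[0]
    return 1.0 + slope / 2.0

H = hurst_aggvar(x, [4, 8, 16, 32, 64, 128])
print(f"estimated H = {H:.2f}")
```

    Values of H significantly above 0.5 would indicate the long memory that motivates the fractional model.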

  20. Leverage effect in financial markets: the retarded volatility model.

    PubMed

    Bouchaud, J P; Matacz, A; Potters, M

    2001-11-26

    We investigate quantitatively the so-called "leverage effect," which corresponds to a negative correlation between past returns and future volatility. For individual stocks this correlation is moderate and decays over 50 days, while for stock indices it is much stronger but decays faster. For individual stocks the magnitude of this correlation has a universal value that can be rationalized in terms of a new "retarded" model which interpolates between a purely additive and a purely multiplicative stochastic process. For stock indices a specific amplification phenomenon seems to be necessary to account for the observed amplitude of the effect.
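
    The leverage correlation function the authors measure, L(τ) = ⟨r_t r_{t+τ}²⟩ / ⟨r²⟩², can be estimated directly. The toy dynamics below, where volatility reacts negatively to an exponentially weighted sum of past returns, is an illustrative stand-in for the calibrated retarded model, not the paper's specification:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 200_000

# Toy retarded-style dynamics (illustrative, not the paper's calibration):
# volatility responds negatively to an exponentially weighted sum of past returns.
r = np.zeros(n)
level = 0.0
for t in range(n):
    sigma = 0.01 * np.exp(-5.0 * level)  # past losses raise volatility
    r[t] = sigma * rng.standard_normal()
    level = 0.9 * level + r[t]

def leverage(r, tau):
    """L(tau) = <r_t r_{t+tau}^2> / <r^2>^2; negative values = leverage effect."""
    return (r[:-tau] * r[tau:] ** 2).mean() / (r ** 2).mean() ** 2

print(f"L(5) = {leverage(r, 5):.2f}")
```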

  1. Multifactor Screener in the 2000 National Health Interview Survey Cancer Control Supplement: Validation Results

    Cancer.gov

    Risk Factor Assessment Branch (RFAB) staff have assessed the validity of the Multifactor Screener in several studies: NCI's Observing Protein and Energy (OPEN) Study, the Eating at America's Table Study (EATS), and the joint NIH-AARP Diet and Health Study.

  2. Multifactor Screener in the 2000 National Health Interview Survey Cancer Control Supplement: Overview

    Cancer.gov

    The Multifactor Screener may be useful to assess approximate intakes of fruits and vegetables, percentage energy from fat, and fiber. The screener asks respondents to report how frequently they consume foods in 16 categories. The screener also asks one question about the type of milk consumed.

  3. Multifactor Screener in the 2000 National Health Interview Survey Cancer Control Supplement: Uses of Screener Estimates

    Cancer.gov

    Dietary intake estimates derived from the Multifactor Screener are rough estimates of usual intake of fruits and vegetables, fiber, calcium, servings of dairy, and added sugar. These estimates are not as accurate as those from more detailed methods (e.g., 24-hour recalls).

  4. Volatile decision dynamics: experiments, stochastic description, intermittency control and traffic optimization

    NASA Astrophysics Data System (ADS)

    Helbing, Dirk; Schönhof, Martin; Kern, Daniel

    2002-06-01

    The coordinated and efficient distribution of limited resources by individual decisions is a fundamental, unsolved problem. When individuals compete for road capacities, time, space, money, goods, etc., they normally make decisions based on aggregate rather than complete information, such as TV news or stock market indices. In related experiments, we have observed a volatile decision dynamics and far-from-optimal payoff distributions. We have also identified methods of information presentation that can considerably improve the overall performance of the system. In order to determine optimal strategies of decision guidance by means of user-specific recommendations, a stochastic behavioural description is developed. These strategies manage to increase the adaptability to changing conditions and to reduce the deviation from the time-dependent user equilibrium, thereby enhancing the average and individual payoffs. Hence, our guidance strategies can increase the performance of all users by reducing overreaction and stabilizing the decision dynamics. These results are highly significant for predicting decision behaviour, for reaching optimal behavioural distributions by decision support systems and for information service providers. One of the promising fields of application is traffic optimization.

  5. Fearless versus fearful speculative financial bubbles

    NASA Astrophysics Data System (ADS)

    Andersen, J. V.; Sornette, D.

    2004-06-01

    Using a recently introduced rational expectation model of bubbles, based on the interplay between stochasticity and positive feedbacks of prices on returns and volatility, we develop a new methodology to test how this model classifies nine time series that have been previously considered as bubbles ending in crashes. The model predicts the existence of two anomalous behaviors occurring simultaneously: (i) super-exponential price growth and (ii) volatility growth, that we refer to as the “fearful singular bubble” regime. Out of the nine time series, we find that five pass our tests and can be characterized as “fearful singular bubbles”. The four other cases are the information technology Nasdaq bubble and three bubbles of the Hang Seng index ending in crashes in 1987, 1994 and 1997. According to our analysis, these four bubbles have developed with essentially no significant increase of their volatility. This paper thus proposes that speculative bubbles ending in crashes form two groups hitherto unrecognized, namely those accompanied by increasing volatility (reflecting increasing risk perception) and those without change of volatility (reflecting an absence of risk perception).

  6. Variational Solutions and Random Dynamical Systems to SPDEs Perturbed by Fractional Gaussian Noise

    PubMed Central

    Zeng, Caibin; Yang, Qigui; Cao, Junfei

    2014-01-01

    This paper deals with the following type of stochastic partial differential equations (SPDEs) perturbed by an infinite dimensional fractional Brownian motion with a suitable volatility coefficient Φ: dX(t) = A(X(t))dt+Φ(t)dB H(t), where A is a nonlinear operator satisfying some monotonicity conditions. Using the variational approach, we prove the existence and uniqueness of variational solutions to such system. Moreover, we prove that this variational solution generates a random dynamical system. The main results are applied to a general type of nonlinear SPDEs and the stochastic generalized p-Laplacian equation. PMID:24574903

  7. Cautionary Note on Reporting Eta-Squared Values from Multifactor ANOVA Designs

    ERIC Educational Resources Information Center

    Pierce, Charles A.; Block, Richard A.; Aguinis, Herman

    2004-01-01

    The authors provide a cautionary note on reporting accurate eta-squared values from multifactor analysis of variance (ANOVA) designs. They reinforce the distinction between classical and partial eta-squared as measures of strength of association. They provide examples from articles published in premier psychology journals in which the authors…

  8. Forest ecosystems of a Lower Gulf Coastal Plainlandscape: multifactor classification and analysis

    Treesearch

    P. Charles Goebel; Brian J. Palik; L. Katherine Kirkman; Mark B. Drew; Larry West; Dee C. Pederson

    2001-01-01

    The most common forestland classification techniques applied in the southeastern United States are vegetation-based. While not completely ignored, the application of multifactor, hierarchical ecosystem classifications are limited despite their widespread use in other regions of the eastern United States. We present one of the few truly integrated ecosystem...

  9. A Multifactor Ecosystem Assessment of Wetlands Created Using a Novel Dredged Material Placement Technique in the Atchafalaya River, Louisiana: An Engineering With Nature Demonstration Project

    DTIC Science & Technology

    ...functions. The strategic placement of dredged materials in locations that mimic natural processes promoted additional ecological benefits, especially regarding wading bird and infaunal habitat, thus adhering to Engineering With Nature (EWN) processes. The multifactor approach improved the wetland...

  10. In vitro mineral nutrition of Curcuma longa L. affects production of volatile compounds in rhizomes after transfer to the greenhouse.

    PubMed

    El-Hawaz, Rabia F; Grace, Mary H; Janbey, Alan; Lila, Mary Ann; Adelberg, Jeffrey W

    2018-06-18

    Turmeric is a rich source of bioactive compounds useful in both medicine and cuisine. The effects of mineral concentrations (PO₄³⁻, Ca²⁺, Mg²⁺, and KNO₃) during in vitro rhizome development on the ex vitro content of volatile constituents in rhizomes were tested after 6 months in the greenhouse. A response surface method (D-optimal criteria) was repeated in both high- and low-input fertilizer treatments. Control plants were grown on Murashige and Skoog (MS) medium, acclimatized in the greenhouse, and grown in the field. The volatile constituents were investigated by GC-MS. The total content of volatiles was affected by fertilizer treatments and by in vitro treatment with Ca²⁺ and KNO₃, but PO₄³⁻ and Mg²⁺ had no significant effect. The content was higher in the high-input fertilizer treatments (49.7 ± 9 mg/g DM) with 4 mM Ca²⁺, 60 mM KNO₃ and 5 mM NH₄⁺ than in the low-input fertilizer treatments (26.6 ± 9 mg/g DM) and the MS control (15.28 ± 2.7 mg/g DM; 3 mM Ca²⁺, 20 mM K⁺, 39 mM NO₃⁻, 20 mM NH₄⁺, 1.25 mM PO₄³⁻, and 1.5 mM Mg²⁺). The interaction of Ca²⁺ with KNO₃ affected the curcumenol isomer I and II, germacrone, isocurcumenol, and β-elemenone content. Increasing the in vitro phosphate concentration to 6.25 mM increased ex vitro neocurdione and methenolone contents. These results show that minerals in the in vitro bioreactor medium during rhizome development affected the biosynthesis of turmeric volatile components after transfer to the greenhouse six months later. The multi-factor design identified 1) nutrient regulation of specific components within the unique phytochemical profile of Curcuma longa L. clone 35-1 and 2) that the varied phytochemical profiles were maintained with integrity during greenhouse growth under high-fertility conditions.

  11. Eliciting interval beliefs: An experimental study

    PubMed Central

    Peeters, Ronald; Wolk, Leonard

    2017-01-01

    In this paper we study the interval scoring rule as a mechanism to elicit subjective beliefs under varying degrees of uncertainty. In our experiment, subjects forecast the termination time of a time series to be generated from a given but unknown stochastic process. Subjects gradually learn more about the underlying process over time and hence the true distribution over termination times. We conduct two treatments, one with a high and one with a low volatility process. We find that elicited intervals are better when subjects are facing a low volatility process. In this treatment, participants learn to position their intervals almost optimally over the course of the experiment. This is in contrast with the high volatility treatment, where subjects, over the course of the experiment, learn to optimize the location of their intervals but fail to provide the optimal length. PMID:28380020
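
    For concreteness, here is a standard Winkler-type interval score (lower is better); the experiment's exact payoff rule may differ, so treat this as a stand-in:

```python
def interval_score(lower, upper, y, alpha=0.1):
    """Winkler-type interval score (lower is better) for a central
    (1 - alpha) interval; a stand-in for the experiment's payoff rule."""
    penalty = 0.0
    if y < lower:
        penalty = (2.0 / alpha) * (lower - y)
    elif y > upper:
        penalty = (2.0 / alpha) * (y - upper)
    return (upper - lower) + penalty

# Wide intervals are safe but costly; a narrow miss is penalized heavily.
for lower, upper in [(0, 20), (8, 12), (11, 12)]:
    print(f"[{lower}, {upper}] vs y=10: score = {interval_score(lower, upper, 10):.1f}")
```

    The score rewards the narrowest interval that still covers the realized termination time, matching the trade-off subjects face in the experiment.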

  12. Jump spillover between oil prices and exchange rates

    NASA Astrophysics Data System (ADS)

    Li, Xiao-Ping; Zhou, Chun-Yang; Wu, Chong-Feng

    2017-11-01

    In this paper, we investigate the jump spillover effects between oil prices and exchange rates. To identify the latent historical jumps for exchange rates and oil prices, we use a Bayesian MCMC approach to estimate the stochastic volatility model with correlated jumps in both returns and volatilities for each. We examine the simultaneous jump intensities and the conditional jump spillover probabilities between oil prices and exchange rates, finding strong evidence of jump spillover effects. Further analysis shows that the jump spillovers are mainly due to exogenous events such as financial crises and geopolitical events. Thus, the findings have important implications for financial risk management.

  13. On the Interface of Probabilistic and PDE Methods in a Multifactor Term Structure Theory

    ERIC Educational Resources Information Center

    Mamon, Rogemar S.

    2004-01-01

    Within the general framework of a multifactor term structure model, the fundamental partial differential equation (PDE) satisfied by a default-free zero-coupon bond price is derived via a martingale-oriented approach. Using this PDE, a result characterizing a model belonging to an exponential affine class is established using only a system of…

  14. Stochastic Models for Precipitable Water in Convection

    NASA Astrophysics Data System (ADS)

    Leung, Kimberly

    Atmospheric precipitable water vapor (PWV) is the amount of water vapor in the atmosphere within a vertical column of unit cross-sectional area and is a critically important parameter of precipitation processes. However, accurate high-frequency and long-term observations of PWV were impossible until the availability of modern instruments such as radar. The United States Department of Energy (DOE)'s Atmospheric Radiation Measurement (ARM) Program facility has made the first systematic, high-resolution observations of PWV at Darwin, Australia, since 2002. At a resolution of 20 seconds, this time series allowed us to examine the volatility of PWV, including fractal behavior with dimension equal to 1.9, higher than the Brownian motion dimension of 1.5. Such strong fractal behavior calls for stochastic differential equation modeling in an attempt to address some of the difficulties of convective parameterization in various kinds of climate models, ranging from general circulation models (GCM) to Weather Research and Forecasting (WRF) models. These high-resolution observations capture the fractal behavior of PWV and enable stochastic exploration for the next generation of climate models, which consider scales from micrometers to thousands of kilometers. As a first step, this thesis explores a simple stochastic differential equation model of water mass balance for PWV and assesses the accuracy, robustness, and sensitivity of the stochastic model. A 1000-day simulation allows for the determination of the best-fitting 25-day period as compared to data from the TWP-ICE field campaign conducted out of Darwin, Australia in early 2006. The observed data and this portion of the simulation had a correlation coefficient of 0.6513 and followed similar statistics and low-resolution temporal trends.
Building on the point model foundation, a similar algorithm was applied to the National Center for Atmospheric Research (NCAR)'s existing single-column model as a test-of-concept for eventual inclusion in a general circulation model. The stochastic scheme was designed to be coupled with the deterministic single-column simulation by modifying results of the existing convective scheme (Zhang-McFarlane) and was able to produce a 20-second resolution time series that effectively simulated observed PWV, as measured by correlation coefficient (0.5510), fractal dimension (1.9), statistics, and visual examination of temporal trends. Results indicate that simulation of a highly volatile time series of observed PWV is certainly achievable and has potential to improve prediction capabilities in climate modeling. Further, this study demonstrates the feasibility of adding a mathematics- and statistics-based stochastic scheme to an existing deterministic parameterization to simulate observed fractal behavior.
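    The point water-mass balance idea can be sketched as an Euler-Maruyama integration of a toy SDE at the data's 20-second resolution; the source, relaxation, and noise parameters below are invented for illustration, not the thesis values:

```python
import math, random

def simulate_pwv(n_steps, dt=20.0, source=0.01, tau=4000.0,
                 sigma=0.3, w0=40.0, seed=1):
    """Euler-Maruyama integration of a toy point water-mass balance
    dW = (source - W/tau) dt + sigma dB_t, stepped at 20-second
    resolution. The equilibrium level is source * tau."""
    rng = random.Random(seed)
    w, path = w0, [w0]
    for _ in range(n_steps):
        w += (source - w / tau) * dt + sigma * math.sqrt(dt) * rng.gauss(0, 1)
        w = max(w, 0.0)            # precipitable water cannot be negative
        path.append(w)
    return path

path = simulate_pwv(10000)
print(min(path), max(path), sum(path) / len(path))
```

    Comparing such a simulated series against observations (correlation, statistics, temporal trends) is the kind of assessment the thesis performs on its richer model.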

  15. Reverse resonance in stock prices of financial system with periodic information

    NASA Astrophysics Data System (ADS)

    Li, Jiang-Cheng; Mei, Dong-Cheng

    2013-07-01

    We investigate the stochastic resonance of the stock prices in a finance system with the Heston model. The extrinsic and intrinsic periodic information are introduced into the stochastic differential equations of the Heston model for stock price by focusing on the signal power amplification (SPA). We find that for both cases of extrinsic and intrinsic periodic information a phenomenon of reverse resonance emerges in the behaviors of SPA as a function of the system and external driving parameters. Moreover, in both cases, a phenomenon of double reverse resonance is observed in the behavior of SPA versus the amplitude of volatility fluctuations, by increasing the cross correlation between the noise sources in the Heston model.
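    A minimal sketch of Heston dynamics with an extrinsic periodic signal added to the price drift, using a full-truncation Euler scheme. Parameter values are illustrative, and the paper's signal power amplification (SPA) computation is not reproduced:

```python
import math, random

def heston_periodic(n_steps, dt=0.01, mu=0.05, kappa=2.0, theta=0.04,
                    xi=0.3, rho=-0.7, amp=0.1, omega=2 * math.pi,
                    s0=1.0, v0=0.04, seed=3):
    """Full-truncation Euler scheme for the Heston model with an
    extrinsic periodic term amp*cos(omega*t) added to the price
    drift; the variance follows a CIR process driven by a Brownian
    motion correlated (rho) with the price driver."""
    rng = random.Random(seed)
    s, v = s0, v0
    prices = [s]
    for i in range(n_steps):
        t = i * dt
        z1 = rng.gauss(0, 1)
        z2 = rho * z1 + math.sqrt(1 - rho * rho) * rng.gauss(0, 1)
        vp = max(v, 0.0)                       # full truncation
        s *= math.exp((mu + amp * math.cos(omega * t) - 0.5 * vp) * dt
                      + math.sqrt(vp * dt) * z1)
        v += kappa * (theta - vp) * dt + xi * math.sqrt(vp * dt) * z2
        prices.append(s)
    return prices

prices = heston_periodic(1000)
print(prices[-1])
```

    SPA studies then measure how much spectral power the simulated price paths concentrate at the driving frequency omega as the noise parameters vary.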

  16. Measurement Invariance of Second-Order Factor Model of the Multifactor Leadership Questionnaire (MLQ) across K-12 Principal Gender

    ERIC Educational Resources Information Center

    Xu, Lihua; Wubbena, Zane; Stewart, Trae

    2016-01-01

    Purpose: The purpose of this paper is to investigate the factor structure and the measurement invariance of the Multifactor Leadership Questionnaire (MLQ) across gender of K-12 school principals (n=6,317) in the USA. Design/methodology/approach: Nine first-order factor models and four second-order factor models were tested using confirmatory…

  17. Bullying among adolescents in North Cyprus and Turkey: testing a multifactor model.

    PubMed

    Bayraktar, Fatih

    2012-04-01

    Peer bullying has been studied since the 1970s, and a vast literature has accumulated on its various predictors. However, to date no study has combined individual-, peer-, parental-, teacher-, and school-related predictors of bullying within a single model. The main aim of this study was therefore to test a multifactor model of bullying among adolescents in North Cyprus and Turkey. A total of 1,052 adolescents (554 girls, 498 boys) aged between 13 and 18 (M = 14.7, SD = 1.17) were recruited from North Cyprus and Turkey. Before testing the multifactor models, the measurement models were tested according to structural equation modeling propositions. Both models indicated that the psychological climate of the school, teacher attitudes within the classroom, peer relationships, parental acceptance-rejection, and individual social competence factors had significant direct effects on bullying behaviors. Goodness-of-fit indexes indicated that the proposed multifactor model fitted both data sets well. In both samples, the strongest predictor of bullying was the psychological climate of the school, followed by individual social competence factors and teacher attitudes within the classroom. Together, the latent variables explained 44% and 51% of the variance in bullying in North Cyprus and Turkey, respectively.

  18. Potential barriers to the application of multi-factor portfolio analysis in public hospitals: evidence from a pilot study in the Netherlands.

    PubMed

    Pavlova, Milena; Tsiachristas, Apostolos; Vermaeten, Gerhard; Groot, Wim

    2009-01-01

    Portfolio analysis is a business management tool that can assist health care managers to develop new organizational strategies. The application of portfolio analysis to US hospital settings has been frequently reported. In Europe however, the application of this technique has received little attention, especially concerning public hospitals. Therefore, this paper examines the peculiarities of portfolio analysis and its applicability to the strategic management of European public hospitals. The analysis is based on a pilot application of a multi-factor portfolio analysis in a Dutch university hospital. The nature of portfolio analysis and the steps in a multi-factor portfolio analysis are reviewed along with the characteristics of the research setting. Based on these data, a multi-factor portfolio model is developed and operationalized. The portfolio model is applied in a pilot investigation to analyze the market attractiveness and hospital strengths with regard to the provision of three orthopedic services: knee surgery, hip surgery, and arthroscopy. The pilot portfolio analysis is discussed to draw conclusions about potential barriers to the overall adoption of portfolio analysis in the management of a public hospital. Copyright (c) 2008 John Wiley & Sons, Ltd.

  19. On decoupling of volatility smile and term structure in inverse option pricing

    NASA Astrophysics Data System (ADS)

    Egger, Herbert; Hein, Torsten; Hofmann, Bernd

    2006-08-01

    Correct pricing of options and other financial derivatives is of great importance to financial markets and one of the key subjects of mathematical finance. Usually, parameters specifying the underlying stochastic model are not directly observable, but have to be determined indirectly from observable quantities. The identification of local volatility surfaces from market data of European vanilla options is one very important example of this type. As with many other parameter identification problems, the reconstruction of local volatility surfaces is ill-posed, and reasonable results can only be achieved via regularization methods. Moreover, due to the sparsity of data, the local volatility is not uniquely determined, but depends strongly on the kind of regularization norm used and a good a priori guess for the parameter. By assuming a multiplicative structure for the local volatility, which is motivated by the specific data situation, the inverse problem can be decomposed into two separate sub-problems. This removes part of the non-uniqueness and allows us to establish convergence and convergence rates under weak assumptions. Additionally, a numerical solution of the two sub-problems is much cheaper than that of the overall identification problem. The theoretical results are illustrated by numerical tests.
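    The stabilizing effect of regularization can be seen in a two-dimensional toy version of such an ill-posed problem: Tikhonov regularization solves the normal equations (AᵀA + αI)x = Aᵀb, damping the noise amplification an unregularized solve suffers. The matrices and noise level below are invented; the paper's problem is the same idea in function space for a volatility surface:

```python
def tikhonov_2x2(A, b, alpha):
    """Solve min ||Ax - b||^2 + alpha*||x||^2 for a 2x2 system via
    the regularized normal equations (A^T A + alpha*I) x = A^T b,
    using the explicit 2x2 inverse."""
    m11 = A[0][0] ** 2 + A[1][0] ** 2 + alpha
    m12 = A[0][0] * A[0][1] + A[1][0] * A[1][1]
    m22 = A[0][1] ** 2 + A[1][1] ** 2 + alpha
    c1 = A[0][0] * b[0] + A[1][0] * b[1]
    c2 = A[0][1] * b[0] + A[1][1] * b[1]
    det = m11 * m22 - m12 * m12
    return ((m22 * c1 - m12 * c2) / det, (m11 * c2 - m12 * c1) / det)

A = [[1.0, 1.0], [1.0, 1.0001]]      # nearly singular design
b_noisy = [2.0, 2.0003]              # noiseless data would be [2.0, 2.0001]
print(tikhonov_2x2(A, b_noisy, 0.0))     # noise amplified far from (1, 1)
print(tikhonov_2x2(A, b_noisy, 1e-4))    # stays near the true (1, 1)
```

    The tiny data perturbation throws the unregularized solution far from the true (1, 1), while a small α keeps it close, which is the behavior regularization must deliver for sparse option data as well.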

  20. Universal Behavior of Extreme Price Movements in Stock Markets

    PubMed Central

    Fuentes, Miguel A.; Gerig, Austin; Vicente, Javier

    2009-01-01

    Many studies assume stock prices follow a random process known as geometric Brownian motion. Although approximately correct, this model fails to explain the frequent occurrence of extreme price movements, such as stock market crashes. Using a large collection of data from three different stock markets, we present evidence that a modification to the random model—adding a slow, but significant, fluctuation to the standard deviation of the process—accurately explains the probability of different-sized price changes, including the relatively high frequency of extreme movements. Furthermore, we show that this process is similar across stocks so that their price fluctuations can be characterized by a single curve. Because the behavior of price fluctuations is rooted in the characteristics of volatility, we expect our results to bring increased interest to stochastic volatility models, and especially to those that can produce the properties of volatility reported here. PMID:20041178
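    The described effect, a slow fluctuation in the standard deviation producing fat tails, can be illustrated with a toy simulation comparing excess kurtosis with and without stochastic volatility (all parameters invented):

```python
import random, statistics

def daily_returns(n, stochastic_vol, seed=11):
    """Gaussian daily returns; with stochastic_vol=True the
    volatility itself follows a slow mean-reverting random walk, a
    stand-in for the paper's slowly fluctuating standard deviation."""
    rng = random.Random(seed)
    sigma, out = 0.01, []
    for _ in range(n):
        if stochastic_vol:
            sigma = max(0.002, sigma + 0.05 * (0.01 - sigma)
                        + rng.gauss(0, 0.002))
        out.append(rng.gauss(0, sigma))
    return out

def excess_kurtosis(x):
    """Fourth standardized moment minus 3 (zero for a Gaussian)."""
    m = statistics.fmean(x)
    s2 = statistics.fmean([(v - m) ** 2 for v in x])
    return statistics.fmean([(v - m) ** 4 for v in x]) / s2 ** 2 - 3.0

print(excess_kurtosis(daily_returns(20000, False)))  # near 0: thin tails
print(excess_kurtosis(daily_returns(20000, True)))   # positive: fat tails
```

    A mixture of normals with slowly wandering scale has strictly positive excess kurtosis, i.e. extreme moves occur far more often than constant-volatility geometric Brownian motion predicts.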

  1. Probabilistic Multi-Factor Interaction Model for Complex Material Behavior

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2008-01-01

    The Multi-Factor Interaction Model (MFIM) is used to evaluate the divot weight (foam weight ejected) from the launch external tanks. The multi-factor has sufficient degrees of freedom to evaluate a large number of factors that may contribute to the divot ejection. It also accommodates all interactions by its product form. Each factor has an exponent that satisfies only two points, the initial and final points. The exponent describes a monotonic path from the initial condition to the final. The exponent values are selected so that the described path makes sense in the absence of experimental data. In the present investigation the data used was obtained by testing simulated specimens in launching conditions. Results show that the MFIM is an effective method of describing the divot weight ejected under the conditions investigated.
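    The product form of the MFIM can be sketched as follows; the factor names, end-point values, and exponents are purely illustrative, not the foam-divot calibration:

```python
def mfim_ratio(factors):
    """Generic product form of a Multi-Factor Interaction Model:
    each factor contributes [(A_f - A) / (A_f - A_0)] ** e, where
    A_0 and A_f are the factor's initial and final values and the
    single exponent e fixes a monotonic path between the two end
    points. Interactions enter through the product itself."""
    ratio = 1.0
    for current, initial, final, exponent in factors:
        ratio *= ((final - current) / (final - initial)) ** exponent
    return ratio

# Each factor as (current, initial, final, exponent); values invented.
factors = [
    (300.0, 273.0, 400.0, 0.5),   # hypothetical temperature factor
    (80.0, 101.0, 50.0, 1.5),     # hypothetical pressure factor
]
print(mfim_ratio(factors))
```

    At a factor's initial condition its term equals 1, and the term decays monotonically toward 0 as the factor approaches its final value, which is the two-point path the abstract describes.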

  2. Probabilistic Multi-Factor Interaction Model for Complex Material Behavior

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2008-01-01

    The Multi-Factor Interaction Model (MFIM) is used to evaluate the divot weight (foam weight ejected) from the launch external tanks. The multi-factor has sufficient degrees of freedom to evaluate a large number of factors that may contribute to the divot ejection. It also accommodates all interactions by its product form. Each factor has an exponent that satisfies only two points, the initial and final points. The exponent describes a monotonic path from the initial condition to the final. The exponent values are selected so that the described path makes sense in the absence of experimental data. In the present investigation, the data used was obtained by testing simulated specimens in launching conditions. Results show that the MFIM is an effective method of describing the divot weight ejected under the conditions investigated.

  3. Multi-factor authentication

    DOEpatents

    Hamlet, Jason R; Pierson, Lyndon G

    2014-10-21

    Detection and deterrence of spoofing of user authentication may be achieved by including a cryptographic fingerprint unit within a hardware device for authenticating a user of the hardware device. The cryptographic fingerprint unit includes an internal physically unclonable function ("PUF") circuit disposed in or on the hardware device, which generates a PUF value. Combining logic is coupled to receive the PUF value, combines the PUF value with one or more other authentication factors to generate a multi-factor authentication value. A key generator is coupled to generate a private key and a public key based on the multi-factor authentication value while a decryptor is coupled to receive an authentication challenge posed to the hardware device and encrypted with the public key and coupled to output a response to the authentication challenge decrypted with the private key.
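    The combining step can be sketched with HMAC chaining; the PUF response is simulated here, and the patent's derivation of a public/private key pair from the multi-factor value is omitted:

```python
import hashlib, hmac

def multifactor_value(puf_response: bytes, password: str,
                      token: bytes) -> bytes:
    """Sketch of the combining-logic step only: fold a PUF response
    together with other authentication factors into one multi-factor
    authentication value via HMAC chaining. The real scheme would
    feed this value to a key generator for an asymmetric key pair."""
    v = hmac.new(puf_response, password.encode(), hashlib.sha256).digest()
    return hmac.new(v, token, hashlib.sha256).digest()

# The PUF response is simulated -- on real hardware it comes from a
# physically unclonable circuit and never leaves the device.
puf = hashlib.sha256(b"simulated-puf-circuit-output").digest()
mfa = multifactor_value(puf, "correct horse battery", b"\x01\x02")
print(mfa.hex())
```

    Because every factor feeds the chain, changing any one of them (or spoofing the device and thus the PUF response) changes the derived value entirely.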

  4. New approach of financial volatility duration dynamics by stochastic finite-range interacting voter system.

    PubMed

    Wang, Guochao; Wang, Jun

    2017-01-01

    We investigate the fluctuation behaviors of financial volatility duration dynamics. A new concept of volatility two-component range intensity (VTRI) is developed, which combines the maximal variation range of volatility intensity with the shortest passage time of duration and can quantify investment risk in financial markets. To study and describe the nonlinear complex properties of VTRI, a random agent-based financial price model is developed from the finite-range interacting biased voter system. The autocorrelation and power-law scaling behaviors of the return and VTRI series are investigated. The complexity of VTRI series from the real markets and the proposed model is then analyzed by Fuzzy entropy (FuzzyEn) and Lempel-Ziv complexity, and cross-Fuzzy entropy (C-FuzzyEn) is applied to study the asynchrony of pairs of VTRI series. The empirical results reveal that the proposed model exhibits complex behaviors similar to those of the actual markets, indicating that the proposed VTRI series analysis and the financial model are meaningful and feasible to some extent.
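    One of the two complexity measures used, Lempel-Ziv complexity, can be computed for any symbol sequence with a short LZ76 parser (the binarization of VTRI series into symbols is not shown):

```python
def lempel_ziv_complexity(s: str) -> int:
    """Count phrases in the Lempel-Ziv (LZ76) parsing: repeatedly
    take the shortest substring starting at the current position
    that has not occurred in the prefix (overlap allowed). More
    phrases means a less compressible, more complex sequence."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # extend the candidate phrase while it can still be copied
        # from the (overlapping) prefix s[:i+l-1]
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

print(lempel_ziv_complexity("01" * 20))            # periodic: low complexity
print(lempel_ziv_complexity("0001101001000101"))   # Kaspar-Schuster example: 6
```

    A periodic series parses into very few phrases, while market-like series fall between the periodic and fully random extremes.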

  5. New approach of financial volatility duration dynamics by stochastic finite-range interacting voter system

    NASA Astrophysics Data System (ADS)

    Wang, Guochao; Wang, Jun

    2017-01-01

    We investigate the fluctuation behaviors of financial volatility duration dynamics. A new concept of volatility two-component range intensity (VTRI) is developed, which combines the maximal variation range of volatility intensity with the shortest passage time of duration and can quantify investment risk in financial markets. To study and describe the nonlinear complex properties of VTRI, a random agent-based financial price model is developed from the finite-range interacting biased voter system. The autocorrelation and power-law scaling behaviors of the return and VTRI series are investigated. The complexity of VTRI series from the real markets and the proposed model is then analyzed by Fuzzy entropy (FuzzyEn) and Lempel-Ziv complexity, and cross-Fuzzy entropy (C-FuzzyEn) is applied to study the asynchrony of pairs of VTRI series. The empirical results reveal that the proposed model exhibits complex behaviors similar to those of the actual markets, indicating that the proposed VTRI series analysis and the financial model are meaningful and feasible to some extent.

  6. A Unique Computational Algorithm to Simulate Probabilistic Multi-Factor Interaction Model Complex Material Point Behavior

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2010-01-01

    The Multi-Factor Interaction Model (MFIM) is used to evaluate the divot weight (foam weight ejected) from the launch external tanks. The multi-factor has sufficient degrees of freedom to evaluate a large number of factors that may contribute to the divot ejection. It also accommodates all interactions by its product form. Each factor has an exponent that satisfies only two points--the initial and final points. The exponent describes a monotonic path from the initial condition to the final. The exponent values are selected so that the described path makes sense in the absence of experimental data. In the present investigation, the data used was obtained by testing simulated specimens in launching conditions. Results show that the MFIM is an effective method of describing the divot weight ejected under the conditions investigated.

  7. Dynamically Hedging Oil and Currency Futures Using Receding Horizontal Control and Stochastic Programming

    NASA Astrophysics Data System (ADS)

    Cottrell, Paul Edward

    There is a lack of research in the area of hedging future contracts, especially in illiquid or very volatile market conditions. It is important to understand the volatility of the oil and currency markets because reduced fluctuations in these markets could lead to better hedging performance. This study compared different hedging methods by using a hedging error metric, supplementing the Receding Horizontal Control and Stochastic Programming (RHCSP) method by utilizing the London Interbank Offered Rate with the Levy process. The RHCSP hedging method was investigated to determine if improved hedging error was accomplished compared to the Black-Scholes, Leland, and Whalley and Wilmott methods when applied on simulated, oil, and currency futures markets. A modified RHCSP method was also investigated to determine if this method could significantly reduce hedging error under extreme market illiquidity conditions when applied on simulated, oil, and currency futures markets. This quantitative study used chaos theory and emergence for its theoretical foundation. An experimental research method was utilized for this study with a sample size of 506 hedging errors pertaining to historical and simulation data. The historical data were from January 1, 2005 through December 31, 2012. The modified RHCSP method was found to significantly reduce hedging error for the oil and currency market futures by the use of a 2-way ANOVA with a t test and post hoc Tukey test. This study promotes positive social change by identifying better risk controls for investment portfolios and illustrating how to benefit from high volatility in markets. Economists, professional investment managers, and independent investors could benefit from the findings of this study.

  8. Brownian motion model with stochastic parameters for asset prices

    NASA Astrophysics Data System (ADS)

    Ching, Soo Huei; Hin, Pooi Ah

    2013-09-01

    The Brownian motion model may not be a completely realistic model for asset prices because in real asset prices the drift μ and volatility σ may change over time. We consider a model in which the parameter x = (μ, σ) is such that its value x(t + Δt) a short time Δt ahead of the present time t depends on the value of the asset price at time t + Δt as well as the present parameter value x(t) and the m − 1 other parameter values before time t, via a conditional distribution. Malaysian stock prices are used to compare the performance of the Brownian motion model with fixed parameters against that of the model with stochastic parameters.

  9. A stochastic hybrid model for pricing forward-start variance swaps

    NASA Astrophysics Data System (ADS)

    Roslan, Teh Raihana Nazirah

    2017-11-01

    Recently, market players have been exposed to the astounding increase in the trading volume of variance swaps. In this paper, the forward-start nature of a variance swap is being inspected, where hybridizations of equity and interest rate models are used to evaluate the price of discretely-sampled forward-start variance swaps. The Heston stochastic volatility model is being extended to incorporate the dynamics of the Cox-Ingersoll-Ross (CIR) stochastic interest rate model. This is essential since previous studies on variance swaps were mainly focusing on instantaneous-start variance swaps without considering the interest rate effects. This hybrid model produces an efficient semi-closed form pricing formula through the development of forward characteristic functions. The performance of this formula is investigated via simulations to demonstrate how the formula performs for different sampling times and against the real market scenario. Comparison done with the Monte Carlo simulation which was set as our main reference point reveals that our pricing formula gains almost the same precision in a shorter execution time.

  10. The Research of Regression Method for Forecasting Monthly Electricity Sales Considering Coupled Multi-factor

    NASA Astrophysics Data System (ADS)

    Wang, Jiangbo; Liu, Junhui; Li, Tiantian; Yin, Shuo; He, Xinhui

    2018-01-01

    Monthly electricity sales forecasting is fundamental to ensuring the safe operation of the power system. This paper presents a monthly electricity sales forecasting method that comprehensively considers the coupled factors of temperature, economic growth, electric power replacement, and business expansion. The mathematical model is constructed using a regression method. Simulation results show that the proposed method is accurate and effective.
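    A regression of this kind can be sketched with ordinary least squares on invented monthly data, using two stand-in factors rather than the paper's four:

```python
def fit_linear(X, y):
    """Ordinary least squares via the normal equations
    (X^T X) b = X^T y, solved with Gaussian elimination with partial
    pivoting. An intercept column is prepended automatically."""
    rows = [[1.0] + list(x) for x in X]
    k = len(rows[0])
    M = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    c = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    for col in range(k):                      # forward elimination
        p = max(range(col, k), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]
        c[col], c[p] = c[p], c[col]
        for r in range(col + 1, k):
            f = M[r][col] / M[col][col]
            for j in range(col, k):
                M[r][j] -= f * M[col][j]
            c[r] -= f * c[col]
    b = [0.0] * k                             # back substitution
    for r in range(k - 1, -1, -1):
        b[r] = (c[r] - sum(M[r][j] * b[j] for j in range(r + 1, k))) / M[r][r]
    return b

# invented monthly data: (mean temperature, economic index) -> sales
X = [(22, 100), (30, 101), (35, 103), (18, 104), (25, 106), (33, 108)]
y = [3.0 + 0.05 * t + 0.2 * e for t, e in X]   # exactly linear, for the demo
b = fit_linear(X, y)
print([round(v, 4) for v in b])   # recovers the coefficients (3.0, 0.05, 0.2)
```

    Real electricity-sales data would of course carry noise and require validation, but the fitting step is exactly this least-squares solve.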

  11. Biometric Data Safeguarding Technologies Analysis and Best Practices

    DTIC Science & Technology

    2011-12-01

    The “fuzzy vault” scheme proposed by Juels and Sudan was designed to encrypt data such that it could be unlocked by similar but inexact matches… designed transform functions. Multifactor key generation combines a biometric with one or more other inputs, such as a… cooperative, off-angle iris images. Since the commercialized system is designed for images acquired from a specific, paired acquisition system…

  12. Application of GA-SVM method with parameter optimization for landslide development prediction

    NASA Astrophysics Data System (ADS)

    Li, X. Z.; Kong, J. M.

    2013-10-01

    Prediction of the landslide development process is a perennial issue in landslide research, and many methods for predicting landslide displacement series have been proposed. The support vector machine (SVM) has proven to perform well, but its performance depends strongly on the correct selection of the SVM model parameters (C and γ). In this study, we present an application of the GA-SVM method with parameter optimization to landslide displacement rate prediction, taking a typical large-scale landslide in a hydro-electric engineering area of Southwest China as a case. On the basis of the landslide's basic characteristics and monitoring data, single-factor and multi-factor GA-SVM models were built and compared with single-factor and multi-factor SVM models. The results show that all four models achieve high prediction accuracy, but the GA-SVM models are slightly more accurate than the SVM models, and the multi-factor models slightly more accurate than the single-factor models. The multi-factor GA-SVM model is the most accurate, with the smallest RMSE (0.0009) and the largest RI (0.9992).
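    The GA side of GA-SVM can be sketched with a minimal real-coded genetic algorithm searching over (C, γ); a smooth toy fitness surface with a known optimum stands in for cross-validated SVM accuracy, so no SVM library is required:

```python
import random

def ga_search(fitness, bounds, pop_size=30, gens=40, seed=5):
    """Minimal real-coded genetic algorithm: tournament selection,
    blend crossover (parent average plus Gaussian mutation), and
    hard clipping to the parameter bounds."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]

    def tournament():
        a, b = rng.sample(pop, 2)
        return a if fitness(a) > fitness(b) else b

    for _ in range(gens):
        pop = [
            [min(max((x + y) / 2 + rng.gauss(0, 0.02 * (hi - lo)), lo), hi)
             for x, y, (lo, hi) in zip(p, q, bounds)]
            for p, q in ((tournament(), tournament())
                         for _ in range(pop_size))
        ]
    return max(pop, key=fitness)

# Hypothetical "validation accuracy" surface peaked at C=10, gamma=0.5;
# in GA-SVM this would be cross-validated SVM accuracy instead.
toy_fitness = lambda p: -((p[0] - 10) ** 2 / 100 + (p[1] - 0.5) ** 2)
best = ga_search(toy_fitness, bounds=[(0.1, 100.0), (0.001, 2.0)])
print(best)
```

    Swapping the toy surface for an SVM cross-validation score turns this sketch into the parameter-optimization loop the paper describes.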

  13. Function of bacterial community dynamics in the formation of cadaveric semiochemicals during in situ carcass decomposition.

    PubMed

    Pascual, Javier; von Hoermann, Christian; Rottler-Hoermann, Ann-Marie; Nevo, Omer; Geppert, Alicia; Sikorski, Johannes; Huber, Katharina J; Steiger, Sandra; Ayasse, Manfred; Overmann, Jörg

    2017-08-01

    The decomposition of dead mammalian tissue involves a complex temporal succession of epinecrotic bacteria. Microbial activity may release different cadaveric volatile organic compounds which in turn attract other key players of carcass decomposition such as scavenger insects. To elucidate the dynamics and potential functions of epinecrotic bacteria on carcasses, we monitored bacterial communities developing on still-born piglets incubated in different forest ecosystems by combining high-throughput Illumina 16S rRNA sequencing with gas chromatography-mass spectrometry of volatiles. Our results show that the community structure of epinecrotic bacteria and the types of cadaveric volatile compounds released over the time course of decomposition are driven by deterministic rather than stochastic processes. Individual cadaveric volatile organic compounds were correlated with specific taxa during the first stages of decomposition which are dominated by bacteria. Through best-fitting multiple linear regression models, the synthesis of acetic acid, indole and phenol could be linked to the activity of Enterobacteriaceae, Tissierellaceae and Xanthomonadaceae, respectively. These conclusions are also commensurate with the metabolism described for the dominant taxa identified for these families. The predictable nature of in situ synthesis of cadaveric volatile organic compounds by epinecrotic bacteria provides a new basis for future chemical ecology and forensic studies. © 2017 Society for Applied Microbiology and John Wiley & Sons Ltd.

  14. Multi-factor evaluation indicator method for the risk assessment of atmospheric and oceanic hazard group due to the attack of tropical cyclones

    NASA Astrophysics Data System (ADS)

    Qi, Peng; Du, Mei

    2018-06-01

    China's southeast coastal areas frequently suffer from storm surge due to the attack of tropical cyclones (TCs) every year. Hazards induced by TCs are complex, including strong wind, huge waves, storm surge, heavy rain, and floods. These atmospheric and oceanic hazards cause serious disasters and substantial economic losses. This paper, from the perspective of a hazard group, sets up a multi-factor evaluation method for the risk assessment of TC hazards using historical extreme data of the relevant atmospheric and oceanic elements. Based on the natural hazard dynamic process, the multi-factor indicator system is composed of nine natural hazard factors representing intensity and frequency. Contributing to the indicator system, in order of importance, are maximum TC wind speed, attack frequency of TCs, maximum surge height, maximum wave height, frequency of gusts ≥ Scale 8, rainstorm intensity, maximum tidal range, rainstorm frequency, and sea-level rise rate. The first four factors are the most important, with weights exceeding 10% in the indicator system. After normalization, the single-hazard factors are each multiplied by their weights and superposed to generate a composite TC hazard. The multi-factor evaluation indicator method was applied to the risk assessment of the typhoon-induced atmospheric and oceanic hazard group in typhoon-prone southeast coastal cities of China.
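    The normalize-weight-superpose step can be sketched as follows, with three invented factors and hypothetical city values standing in for the paper's nine-factor system:

```python
def hazard_index(regions, weights):
    """Min-max normalize each hazard factor across regions, then
    superpose the normalized factors weighted by their importance
    (weights summing to 1). Returns one composite index per region."""
    names = list(weights)
    lo = {k: min(r[k] for r in regions.values()) for k in names}
    hi = {k: max(r[k] for r in regions.values()) for k in names}
    return {city: sum(weights[k] * (f[k] - lo[k]) / (hi[k] - lo[k])
                      for k in names)
            for city, f in regions.items()}

regions = {   # hypothetical coastal cities and factor values
    "City A": {"max_wind": 55.0, "tc_frequency": 7.0, "max_surge": 3.2},
    "City B": {"max_wind": 40.0, "tc_frequency": 4.0, "max_surge": 2.1},
    "City C": {"max_wind": 48.0, "tc_frequency": 9.0, "max_surge": 2.8},
}
weights = {"max_wind": 0.5, "tc_frequency": 0.3, "max_surge": 0.2}
idx = hazard_index(regions, weights)
print(idx)
```

    The region with the smallest value in every factor scores exactly 0, and scores rank regions by their weighted combination of normalized hazard intensities.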

  15. Security enhanced multi-factor biometric authentication scheme using bio-hash function.

    PubMed

    Choi, Younsung; Lee, Youngsook; Moon, Jongho; Won, Dongho

    2017-01-01

    With the rapid development of personal information and wireless communication technology, user authentication schemes have been crucial to ensure that wireless communications are secure. As such, various authentication schemes with multi-factor authentication have been proposed to improve the security of electronic communications. Multi-factor authentication involves the use of passwords, smart cards, and various biometrics to provide users with the utmost privacy and data protection. Cao and Ge analyzed various authentication schemes and found that Younghwa An's scheme was susceptible to a replay attack where an adversary masquerades as a legal server and a user masquerading attack where user anonymity is not provided, allowing an adversary to execute a password change process by intercepting the user's ID during login. Cao and Ge improved upon Younghwa An's scheme, but various security problems remained. This study demonstrates that Cao and Ge's scheme is susceptible to a biometric recognition error, slow wrong password detection, off-line password attack, user impersonation attack, ID guessing attack, a DoS attack, and that their scheme cannot provide session key agreement. Then, to address all weaknesses identified in Cao and Ge's scheme, this study proposes a security enhanced multi-factor biometric authentication scheme and provides a security analysis and formal analysis using Burrows-Abadi-Needham logic. Finally, the efficiency analysis reveals that the proposed scheme can protect against several possible types of attacks with only a slightly high computational cost.

  16. Stochastic arbitrage return and its implication for option pricing

    NASA Astrophysics Data System (ADS)

    Fedotov, Sergei; Panayides, Stephanos

    2005-01-01

    The purpose of this work is to explore the role that random arbitrage opportunities play in pricing financial derivatives. We use a non-equilibrium model to set up a stochastic portfolio, and for the random arbitrage return, we choose a stationary ergodic random process rapidly varying in time. We exploit the fact that option price and random arbitrage returns change on different time scales which allows us to develop an asymptotic pricing theory involving the central limit theorem for random processes. We restrict ourselves to finding pricing bands for options rather than exact prices. The resulting pricing bands are shown to be independent of the detailed statistical characteristics of the arbitrage return. We find that the volatility “smile” can also be explained in terms of random arbitrage opportunities.

  17. Economic policy optimization based on both one stochastic model and the parametric control theory

    NASA Astrophysics Data System (ADS)

    Ashimov, Abdykappar; Borovskiy, Yuriy; Onalbekov, Mukhit

    2016-06-01

    A nonlinear dynamic stochastic general equilibrium model with financial frictions is developed to describe two interacting national economies in the environment of the rest of the world. Parameters of the nonlinear model are estimated by a Bayesian approach based on its log-linearization. The nonlinear model is verified by retroprognosis, by estimating stability indicators of the mappings specified by the model, and by estimating the degree of coincidence between the effects of internal and external shocks on macroeconomic indicators under the estimated nonlinear model and its log-linearization. On the basis of the nonlinear model, parametric control problems for economic growth and for the volatility of macroeconomic indicators of Kazakhstan are formulated and solved for two exchange rate regimes (free floating and managed floating).

  18. Probabilistic Usage of the Multi-Factor Interaction Model

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A Multi-Factor Interaction Model (MFIM) is used to predict the insulating foam mass expulsion during the ascent of a space vehicle. The exponents in the MFIM are evaluated by an available approach which consists of least squares and an optimization algorithm. These results were subsequently used to probabilistically evaluate the effects of the uncertainties in each participating factor on the mass expulsion. The probabilistic results show that the surface temperature dominates at high probabilities, and the pressure which causes the mass expulsion at low probabilities.

  19. Co-movement measure of information transmission on international equity markets

    NASA Astrophysics Data System (ADS)

    Al Rahahleh, Naseem; Bhatti, M. Ishaq

    2017-03-01

    Recently, Bhatti and Nguyen (2012) used EVT and various stochastic copulas to study cross-country co-movements, diversification, and asset allocation. Weiss (2013) observed that Dynamic Conditional Correlation (DCC) models outperform various copula models. This paper contributes to the literature on multivariate models for capturing forward and backward return co-movement, spillover effects, and volatility linkages, reflecting cross-country forward and backward co-movements among coupled international stock markets relevant to information transmission and price discovery for investment decisions. Given the fat-tailed or skewed distributions of financial data, this paper proposes VECM-DCC and VAR-DCC models to capture dynamic dependences between the Australian and other selected international financial stock markets. We observe that the return co-movement effects between Australia and Asian markets are bidirectional ((AUS ↔ Hong Kong), (AUS ↔ Japan)) with the exception of Taiwan (AUS → Taiwan). We also observe that the volatility spillover between the Australian market and both the UK and the US markets is bidirectional, with a larger volatility spillover from both toward the AUS market. Further, the UK market has a higher volatility spillover to the Australian market than the US market does, and the US market has a higher volatility spillover to the UK market than to the Australian market.

  20. Modeling volatility using state space models.

    PubMed

    Timmer, J; Weigend, A S

    1997-08-01

    In time series problems, noise can be divided into two categories: dynamic noise which drives the process, and observational noise which is added in the measurement process, but does not influence future values of the system. In this framework, we show that empirical volatilities (the squared relative returns of prices) exhibit a significant amount of observational noise. To model and predict their time evolution adequately, we estimate state space models that explicitly include observational noise. We obtain relaxation times for shocks in the logarithm of volatility ranging from three weeks (for foreign exchange) to three to five months (for stock indices). In most cases, a two-dimensional hidden state is required to yield residuals that are consistent with white noise. We compare these results with ordinary autoregressive models (without a hidden state) and find that autoregressive models underestimate the relaxation times by about two orders of magnitude since they do not distinguish between observational and dynamic noise. This new interpretation of the dynamics of volatility in terms of relaxators in a state space model carries over to stochastic volatility models and to GARCH models, and is useful for several problems in finance, including risk management and the pricing of derivative securities. Data sets used: Olsen & Associates high frequency DEM/USD foreign exchange rates (8 years). Nikkei 225 index (40 years). Dow Jones Industrial Average (25 years).
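    The abstract above describes a state space model in which a hidden (log-)volatility state is driven by dynamic noise while the observations carry additional observational noise. A minimal sketch of that idea, using a scalar AR(1) hidden state and a Kalman filter (all parameters are illustrative, not the paper's estimates):

```python
import numpy as np

# Local AR(1) state space model with observational noise (illustrative sketch):
#   h_t = a * h_{t-1} + w_t,  w_t ~ N(0, q)   (dynamic noise, drives the process)
#   y_t = h_t + v_t,          v_t ~ N(0, r)   (observational noise, does not feed back)
rng = np.random.default_rng(0)
a, q, r, T = 0.98, 0.05, 0.5, 500

# Simulate the hidden state and its noisy observations
h = np.zeros(T)
for t in range(1, T):
    h[t] = a * h[t - 1] + rng.normal(0, np.sqrt(q))
y = h + rng.normal(0, np.sqrt(r), T)

# Scalar Kalman filter separating the two noise sources
m, p = 0.0, 1.0                 # filtered mean and variance
m_filt = np.zeros(T)
for t in range(T):
    m_pred, p_pred = a * m, a * a * p + q     # predict
    k = p_pred / (p_pred + r)                 # Kalman gain
    m = m_pred + k * (y[t] - m_pred)          # update with observation
    p = (1 - k) * p_pred
    m_filt[t] = m

# Filtering should recover the hidden state better than the raw observations do
err_raw = np.mean((y - h) ** 2)
err_filt = np.mean((m_filt - h) ** 2)
```

    An ordinary autoregressive fit to `y` would lump both noise sources together, which is the mechanism by which, per the abstract, relaxation times get underestimated; the relaxation time of the hidden state here is roughly -1/ln(a) steps.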

  1. Security enhanced multi-factor biometric authentication scheme using bio-hash function

    PubMed Central

    Lee, Youngsook; Moon, Jongho

    2017-01-01

    With the rapid development of personal information and wireless communication technology, user authentication schemes have been crucial to ensure that wireless communications are secure. As such, various authentication schemes with multi-factor authentication have been proposed to improve the security of electronic communications. Multi-factor authentication involves the use of passwords, smart cards, and various biometrics to provide users with the utmost privacy and data protection. Cao and Ge analyzed various authentication schemes and found that Younghwa An’s scheme was susceptible to a replay attack, where an adversary masquerades as a legitimate server, and a user masquerading attack, where user anonymity is not provided, allowing an adversary to execute a password change process by intercepting the user’s ID during login. Cao and Ge improved upon Younghwa An’s scheme, but various security problems remained. This study demonstrates that Cao and Ge’s scheme is susceptible to biometric recognition errors, slow wrong-password detection, an off-line password attack, a user impersonation attack, an ID guessing attack, and a DoS attack, and that it cannot provide session key agreement. To address all weaknesses identified in Cao and Ge’s scheme, this study proposes a security-enhanced multi-factor biometric authentication scheme and provides a security analysis and a formal analysis using Burrows-Abadi-Needham logic. Finally, the efficiency analysis reveals that the proposed scheme can protect against several possible types of attacks at only a slightly higher computational cost. PMID:28459867

  2. Challenging terrestrial biosphere models with data from the long-term multifactor Prairie Heating and CO2 Enrichment experiment

    NASA Astrophysics Data System (ADS)

    De Kauwe, M. G.; Medlyn, B.; Walker, A.; Zaehle, S.; Pendall, E.; Norby, R. J.

    2017-12-01

    Multifactor experiments are often advocated as important for advancing models, yet to date, such models have only been tested against single-factor experiments. We applied 10 models to the multifactor Prairie Heating and CO2 Enrichment (PHACE) experiment in Wyoming, USA. Our goals were to investigate how multifactor experiments can be used to constrain models and to identify a road map for model improvement. We found models performed poorly in ambient conditions: comparison with data highlighted model failures particularly with respect to carbon allocation, phenology, and the impact of water stress on phenology. Performance against the observations from single-factor treatments was also relatively poor. In addition, similar responses were predicted for different reasons across models: there were large differences among models in sensitivity to water stress and, among the nitrogen cycle models, nitrogen availability during the experiment. Models were also unable to capture observed treatment effects on phenology: they overestimated the effect of warming on leaf onset and did not allow CO2-induced water savings to extend the growing season length. Observed interactive (CO2 × warming) treatment effects were subtle and contingent on water stress, phenology, and species composition. As the models did not correctly represent these processes under ambient and single-factor conditions, little extra information was gained by comparing model predictions against interactive responses. We outline a series of key areas in which this and future experiments could be used to improve model predictions of grassland responses to global change.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siddiqui, Afzal; Marnay, Chris

    This paper examines a California-based microgrid's decision to invest in a distributed generation (DG) unit that operates on natural gas. While the long-term natural gas generation cost is stochastic, we initially assume that the microgrid may purchase electricity at a fixed retail rate from its utility. Using the real options approach, we find natural gas generating cost thresholds that trigger DG investment. Furthermore, the consideration of operational flexibility by the microgrid accelerates DG investment, while the option to disconnect entirely from the utility is not attractive. By allowing the electricity price to be stochastic, we next determine an investment threshold boundary and find that high electricity price volatility relative to that of natural gas generating cost delays investment while simultaneously increasing the value of the investment. We conclude by using this result to find the implicit option value of the DG unit.

  4. Stochastic modeling of stock price process induced from the conjugate heat equation

    NASA Astrophysics Data System (ADS)

    Paeng, Seong-Hun

    2015-02-01

    Currency can be considered a ruler for the values of commodities; the price is then the value measured by this ruler. We can suppose that inflation and variations of the exchange rate are caused by variations of the scale of the ruler. In geometry, variation of the scale means that the metric is time-dependent. The conjugate heat equation is the modified heat equation that satisfies the heat conservation law on a time-dependent metric space. We propose a new model of stock prices using the stochastic process whose transition probability is determined by the kernel of the conjugate heat equation. Our model shows how the volatility term is affected by inflation and the exchange rate, modifying the Black-Scholes equation in light of both.

  5. Stochastic nonlinear time series forecasting using time-delay reservoir computers: performance and universality.

    PubMed

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2014-07-01

    Reservoir computing is a recently introduced machine learning paradigm that has already shown excellent performance in the processing of empirical data. We study a particular kind of reservoir computer called a time-delay reservoir, constructed by sampling the solution of a time-delay differential equation, and show its good performance in forecasting the conditional covariances associated with multivariate discrete-time nonlinear stochastic processes of VEC-GARCH type, as well as in predicting actual daily market realized volatilities computed from intraday quotes, using daily log-return series of moderate size as training input. We tackle some problems associated with the lack of task-universality for individually operating reservoirs and propose a solution based on parallel arrays of time-delay reservoirs. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Modeling spot markets for electricity and pricing electricity derivatives

    NASA Astrophysics Data System (ADS)

    Ning, Yumei

    Spot prices for electricity have been very volatile, with dramatic price spikes occurring in restructured markets. The task of forecasting electricity prices and managing price risk presents a new challenge for market players. The objectives of this dissertation are: (1) to develop a stochastic model of price behavior and predict price spikes; (2) to examine the effect of weather forecasts on forecasted prices; (3) to price electricity options and value generation capacity. The volatile behavior of prices can be represented by a stochastic regime-switching model. In the model, the means of the high-price and low-price regimes and the probabilities of switching from one regime to the other are specified as functions of daily peak load. The probability of switching to the high-price regime is positively related to load, but is still not high enough at the highest loads to predict price spikes accurately. An application of this model shows how the structure of the Pennsylvania-New Jersey-Maryland market changed when market-based offers were allowed, resulting in higher price spikes. An ARIMA model including temperature, seasonal, and weekly effects is estimated to forecast daily peak load. Forecasts of load under different assumptions about weather patterns are used to predict changes in price behavior given the regime-switching model of prices. Results show that the range of temperature forecasts from a normal summer to an extremely warm summer causes relatively small increases in temperature (+1.5%) and load (+3.0%). In contrast, the increases in prices are large (+20%). The conclusion is that the seasonal outlook forecasts provided by NOAA are potentially valuable for predicting prices in electricity markets. Traditional option models, based on geometric Brownian motion, are not appropriate for electricity prices. An option model using the regime-switching framework is developed to value a European call option. The model includes volatility risk and allows changes in prices and volatility to be correlated. The results show that the value of a power plant is much higher using the financial option model than using traditional discounted cash flow.
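    The regime-switching mechanism described above can be sketched as follows. This is a toy simulation with hypothetical parameters of our own choosing (logistic switching function, regime means, noise levels), not the dissertation's estimated model:

```python
import numpy as np

# Two-regime electricity price model: the probability of jumping to the
# high-price regime increases with daily peak load (illustrative parameters).
rng = np.random.default_rng(1)
T = 1000
# Synthetic normalized daily peak load with a seasonal cycle plus noise
load = 0.5 + 0.4 * np.sin(2 * np.pi * np.arange(T) / 365) + rng.normal(0, 0.05, T)

def p_high(l):
    """Probability of switching to the high-price regime; increasing in load."""
    return 1.0 / (1.0 + np.exp(-12.0 * (l - 0.8)))

mu_low, mu_high = 30.0, 200.0       # regime mean prices ($/MWh), hypothetical
regime = np.zeros(T, dtype=int)
price = np.zeros(T)
for t in range(T):
    regime[t] = rng.random() < p_high(load[t])
    mu = mu_high if regime[t] else mu_low
    price[t] = max(0.0, rng.normal(mu, 0.15 * mu))

# Price spikes should cluster in high-load periods
spike_load = load[regime == 1].mean()
calm_load = load[regime == 0].mean()
```

    Even this sketch reproduces the qualitative feature the abstract emphasizes: a modest load increase translates into a disproportionately large price increase, because it raises the probability of entering the spike regime.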

  7. A graph-based approach to inequality assessment

    NASA Astrophysics Data System (ADS)

    Palestini, Arsen; Pignataro, Giuseppe

    2016-08-01

    In a population consisting of heterogeneous types, whose income factors are indicated by nonnegative vectors, policies aggregating different factors can be represented by coalitions in a cooperative game, whose characteristic function is a multi-factor inequality index. When it is not possible to form all coalitions, the feasible ones can be indicated by a graph. We redefine Shapley and Banzhaf values on graph games to deduce some properties involving the degrees of the graph vertices and marginal contributions to overall inequality. An example is finally provided based on a modified multi-factor Atkinson index.

  8. A non-volatile organic electrochemical device as a low-voltage artificial synapse for neuromorphic computing

    DOE PAGES

    van de Burgt, Yoeri; Lubberman, Ewout; Fuller, Elliot J.; ...

    2017-02-20

    The brain is capable of massively parallel information processing while consuming only ~1-100 fJ per synaptic event. Inspired by the efficiency of the brain, CMOS-based neural architectures and memristors are being developed for pattern recognition and machine learning. However, the volatility, design complexity and high supply voltages of CMOS architectures, and the stochastic and energy-costly switching of memristors, complicate the path to achieving the interconnectivity, information density, and energy efficiency of the brain using either approach. Here we describe an electrochemical neuromorphic organic device (ENODe) operating with a fundamentally different mechanism from existing memristors. ENODe switches at low energy (<10 pJ for 10³ μm² devices) and voltage, displays >500 distinct, non-volatile conductance states within a ~1 V range, and achieves high classification accuracy when implemented in neural network simulations. Plastic ENODes are also fabricated on flexible substrates, enabling the integration of neuromorphic functionality in stretchable electronic systems. Mechanical flexibility makes ENODes compatible with 3D architectures, opening a path towards extreme interconnectivity comparable to the human brain.

  9. A non-volatile organic electrochemical device as a low-voltage artificial synapse for neuromorphic computing

    NASA Astrophysics Data System (ADS)

    van de Burgt, Yoeri; Lubberman, Ewout; Fuller, Elliot J.; Keene, Scott T.; Faria, Grégorio C.; Agarwal, Sapan; Marinella, Matthew J.; Alec Talin, A.; Salleo, Alberto

    2017-04-01

    The brain is capable of massively parallel information processing while consuming only ~1-100 fJ per synaptic event. Inspired by the efficiency of the brain, CMOS-based neural architectures and memristors are being developed for pattern recognition and machine learning. However, the volatility, design complexity and high supply voltages for CMOS architectures, and the stochastic and energy-costly switching of memristors complicate the path to achieve the interconnectivity, information density, and energy efficiency of the brain using either approach. Here we describe an electrochemical neuromorphic organic device (ENODe) operating with a fundamentally different mechanism from existing memristors. ENODe switches at low voltage and energy (<10 pJ for 10³ μm² devices), displays >500 distinct, non-volatile conductance states within a ~1 V range, and achieves high classification accuracy when implemented in neural network simulations. Plastic ENODes are also fabricated on flexible substrates enabling the integration of neuromorphic functionality in stretchable electronic systems. Mechanical flexibility makes ENODes compatible with three-dimensional architectures, opening a path towards extreme interconnectivity comparable to the human brain.

  10. LCROSS: Volatiles and Exosphere Associated with a Permanently Shadowed Region in Cabeus

    NASA Technical Reports Server (NTRS)

    Wooden, Diane; Colaprete, Anthony; Heldmann, Jennifer; Ennico, Kimberly; Shirley, Mark; Marshall, William

    2010-01-01

    We discuss the volatile species in the LCROSS data set, in addition to water, that were observed by the LCROSS Shepherding Spacecraft before its own demise in the four minutes following the first impact by the Centaur. The stochastic nature of the temporal variations observed by the nadir-viewing near-infrared spectrometer, combined with the diversity of the volatile species, suggests that these species were in situ in the permanently shadowed crater and were released by a combination of the Centaur impact and the resulting warming of the regolith by the impact and ejecta debris blanket. Adding to this intrigue are the pre-impact observations by the UV-Visual spectrometer, which reveal that the field-of-view into the permanently shadowed crater contains UV emission lines. The UV lines are clearly revealed once the descent of the shepherding spacecraft narrows the field-of-view of the UV-Vis spectrometer so as to exclude any surrounding bright terrain. Our suggestion is that this emission comes from tenuous gases, i.e., there appears to be a potential association between the cold, permanently shadowed region and an exosphere.

  11. A non-volatile organic electrochemical device as a low-voltage artificial synapse for neuromorphic computing.

    PubMed

    van de Burgt, Yoeri; Lubberman, Ewout; Fuller, Elliot J; Keene, Scott T; Faria, Grégorio C; Agarwal, Sapan; Marinella, Matthew J; Alec Talin, A; Salleo, Alberto

    2017-04-01

    The brain is capable of massively parallel information processing while consuming only ∼1-100 fJ per synaptic event. Inspired by the efficiency of the brain, CMOS-based neural architectures and memristors are being developed for pattern recognition and machine learning. However, the volatility, design complexity and high supply voltages for CMOS architectures, and the stochastic and energy-costly switching of memristors complicate the path to achieve the interconnectivity, information density, and energy efficiency of the brain using either approach. Here we describe an electrochemical neuromorphic organic device (ENODe) operating with a fundamentally different mechanism from existing memristors. ENODe switches at low voltage and energy (<10 pJ for 10³ μm² devices), displays >500 distinct, non-volatile conductance states within a ∼1 V range, and achieves high classification accuracy when implemented in neural network simulations. Plastic ENODes are also fabricated on flexible substrates enabling the integration of neuromorphic functionality in stretchable electronic systems. Mechanical flexibility makes ENODes compatible with three-dimensional architectures, opening a path towards extreme interconnectivity comparable to the human brain.

  12. First-passage and risk evaluation under stochastic volatility

    NASA Astrophysics Data System (ADS)

    Masoliver, Jaume; Perelló, Josep

    2009-07-01

    We solve the first-passage problem for the Heston random diffusion model. We obtain exact analytical expressions for the survival and hitting probabilities to a given level of return. We study several asymptotic behaviors and obtain approximate forms of these probabilities which prove, among other interesting properties, the nonexistence of a mean first-passage time. One significant result is the evidence of extreme deviations, implying a high risk of default, when a certain dimensionless parameter, related to the strength of the volatility fluctuations, increases. We confront the model with empirical daily data and observe that it is able to capture a very broad domain of the hitting probability. We believe that this may provide an effective tool for risk control, readily applicable to real markets both for portfolio management and trading strategies.
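    The hitting probability that the paper derives in closed form can also be estimated by brute force, which makes the setup concrete. A Monte Carlo sketch with illustrative Heston parameters (our own choices, not the paper's fitted values), estimating the probability that the zero-mean return hits a -10% level within one year:

```python
import numpy as np

# Euler discretization of the Heston model (illustrative parameters):
#   dx = sqrt(v) dW1           (zero-mean log-return)
#   dv = kappa (theta - v) dt + sigma sqrt(v) dW2,  corr(dW1, dW2) = rho
rng = np.random.default_rng(2)
kappa, theta, sigma, rho = 3.0, 0.04, 0.4, -0.5
T, n_steps, n_paths, L = 1.0, 250, 5000, -0.10   # hitting level: a -10% return
dt = T / n_steps

v = np.full(n_paths, theta)        # variance paths, started at the long-run level
x = np.zeros(n_paths)              # return paths
hit = np.zeros(n_paths, dtype=bool)
for _ in range(n_steps):
    z1 = rng.standard_normal(n_paths)
    z2 = rho * z1 + np.sqrt(1 - rho ** 2) * rng.standard_normal(n_paths)
    x += np.sqrt(np.maximum(v, 0) * dt) * z1
    v += kappa * (theta - v) * dt + sigma * np.sqrt(np.maximum(v, 0) * dt) * z2
    hit |= x <= L                  # record first passage below the level

p_hit = hit.mean()                 # estimated hitting probability within horizon T
```

    With theta = 0.04 (about 20% annualized volatility), a -10% level is hit within a year on a substantial fraction of paths; increasing sigma, the vol-of-vol, fattens the tails of the first-passage distribution, in line with the extreme-deviation result quoted above.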

  13. Volatility smile as relativistic effect

    NASA Astrophysics Data System (ADS)

    Kakushadze, Zura

    2017-06-01

    We give an explicit formula for the probability distribution based on a relativistic extension of Brownian motion. The distribution (1) is properly normalized and (2) obeys the tower law (semigroup property), so we can construct martingales and self-financing hedging strategies and price claims (options). This model is a 1-constant-parameter extension of the Black-Scholes-Merton model. The new parameter is the analog of the speed of light in Special Relativity. However, in the financial context there is no "speed limit" and the new parameter has the meaning of a characteristic diffusion speed at which relativistic effects become important and lead to a much softer asymptotic behavior, i.e., fat tails, giving rise to volatility smiles. We argue that a nonlocal stochastic description of such (Lévy) processes is inadequate and discuss a local description from physics. The presentation is intended to be pedagogical.

  14. Volatility behavior of visibility graph EMD financial time series from Ising interacting system

    NASA Astrophysics Data System (ADS)

    Zhang, Bo; Wang, Jun; Fang, Wen

    2015-08-01

    A financial market dynamics model is developed and investigated using a stochastic Ising system, the Ising model being the most popular ferromagnetic model in statistical physics. Applying two graph-based analyses and the multiscale entropy method, we investigate and compare the statistical volatility behavior of return time series and of the corresponding IMF series derived from the empirical mode decomposition (EMD) method. Real stock market indices are studied alongside the simulation data of the proposed model. Further, we find that the degree distribution of the visibility graph for the simulated series has power-law tails, and that the assortative network exhibits the mixing-pattern property. All these features are in agreement with the real market data; the research thus confirms that the financial model established on the Ising system is reasonable.

  15. Multi-Factor Analysis for Selecting Lunar Exploration Soft Landing Area and the best Cruise Route

    NASA Astrophysics Data System (ADS)

    Mou, N.; Li, J.; Meng, Z.; Zhang, L.; Liu, W.

    2018-04-01

    Selecting the right soft landing area and planning a reasonable cruise route are basic tasks of lunar exploration. In this paper, the Von Karman crater in the South Pole-Aitken basin on the far side of the Moon is used as the study area, and multi-factor analysis is used to evaluate the landing area and cruise route of lunar exploration. The evaluation system mainly includes factors that reflect the significance of scientific exploration, such as the density of craters, the area affected by craters, and the composition of the whole region and of local areas, including vertical structure, rock properties, and (FeO + TiO2) content. The evaluation of scientific exploration value is carried out on the basis of safety and feasibility. On the basis of multi-factor overlay analysis, three landing zones A, B and C are selected, and an appropriate cruise route is derived from the scientific exploration factors. This study provides a scientific basis for lunar probe landing and cruise route planning, and technical support for subsequent lunar exploration.

  16. The Human Performance Envelope: Past Research, Present Activities and Future Directions

    NASA Technical Reports Server (NTRS)

    Edwards, Tamsyn

    2017-01-01

    Air traffic controllers (ATCOs) must maintain a consistently high level of human performance in order to maintain flight safety and efficiency. In current control environments, performance-influencing factors such as workload, fatigue and situation awareness can co-occur, and interact, to affect performance. However, multifactor influences and their association with performance are under-researched. This study utilized a high-fidelity human-in-the-loop en-route air traffic control simulation to investigate the relationship between workload, situation awareness and ATCO performance. The study aimed to replicate and extend Edwards, Sharples, Wilson and Kirwan's (2012) previous study and confirm multifactor interactions with a participant sample of ex-controllers. The study also aimed to extend Edwards et al.'s previous research by comparing multifactor relationships across four automation conditions. Results suggest that workload and SA may interact to produce a cumulative impact on controller performance, although the effect of the interaction may depend on the context and the amount of automation present. Findings have implications for human-automation teaming in air traffic control, and for the potential prediction and support of ATCO performance.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siddiqui, Afzal; Marnay, Chris

    This paper examines a California-based microgrid's decision to invest in a distributed generation (DG) unit fuelled by natural gas. While the long-term natural gas generation cost is stochastic, we initially assume that the microgrid may purchase electricity at a fixed retail rate from its utility. Using the real options approach, we find a natural gas generation cost threshold that triggers DG investment. Furthermore, the consideration of operational flexibility by the microgrid increases DG investment, while the option to disconnect from the utility is not attractive. By allowing the electricity price to be stochastic, we next determine an investment threshold boundary and find that high electricity price volatility relative to that of natural gas generation cost delays investment while simultaneously increasing the value of the investment. We conclude by using this result to find the implicit option value of the DG unit when two sources of uncertainty exist.

  18. Effects of in-sewer processes: a stochastic model approach.

    PubMed

    Vollertsen, J; Nielsen, A H; Yang, W; Hvitved-Jacobsen, T

    2005-01-01

    Transformations of organic matter, nitrogen and sulfur in sewers can be simulated taking into account the relevant transformation and transport processes. One objective of such simulation is the assessment and management of hydrogen sulfide formation and corrosion. Sulfide is formed in the biofilms and sediments of the water phase, but corrosion occurs on the moist surfaces of the sewer gas phase. Consequently, both phases and the transport of volatile substances between these phases must be included. Furthermore, wastewater composition and transformations in sewers are complex and subject to high, natural variability. This paper presents the latest developments of the WATS model concept, allowing integrated aerobic, anoxic and anaerobic simulation of the water phase and of gas phase processes. The resulting model is complex and with high parameter variability. An example applying stochastic modeling shows how this complexity and variability can be taken into account.

  19. Nonlinear GARCH model and 1/f noise

    NASA Astrophysics Data System (ADS)

    Kononovicius, A.; Ruseckas, J.

    2015-06-01

    Auto-regressive conditionally heteroskedastic (ARCH) family models are still used by practitioners in business and economic policy making as conditional volatility forecasting models, and they continue to attract research interest. In this contribution we consider the well-known GARCH(1,1) process and its nonlinear modifications, reminiscent of the NGARCH model. We investigate the possibility of reproducing power-law statistics, namely the probability density function and the power spectral density, using ARCH family models. For this purpose we derive stochastic differential equations from the GARCH processes under consideration. We find the obtained equations to be similar to a general class of stochastic differential equations known to reproduce power-law statistics. We show that the linear GARCH(1,1) process has a power-law distribution, but its power spectral density is Brownian-noise-like. The nonlinear modifications, however, exhibit both a power-law distribution and a power spectral density of the 1/f^β form, including 1/f noise.
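    For reference, the linear GARCH(1,1) recursion discussed above is easy to simulate. A minimal sketch with illustrative parameters (not the paper's), checking the heavy-tailed return distribution that the process is known to generate:

```python
import numpy as np

# Linear GARCH(1,1) with Gaussian innovations (illustrative parameters):
#   sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2
rng = np.random.default_rng(3)
omega, alpha, beta, T = 1e-5, 0.1, 0.85, 20000   # alpha + beta < 1: covariance-stationary

r = np.zeros(T)
var = omega / (1 - alpha - beta)                 # start at the unconditional variance
for t in range(T):
    r[t] = np.sqrt(var) * rng.standard_normal()
    var = omega + alpha * r[t] ** 2 + beta * var

# GARCH returns are heavier-tailed than Gaussian: positive excess kurtosis
kurtosis = np.mean(r ** 4) / np.mean(r ** 2) ** 2 - 3.0
```

    The nonlinear (NGARCH-like) modifications the paper studies change this recursion's response to the sign and size of past returns; the power spectral density of |r| then acquires the 1/f^β shape, whereas the linear process above yields Brownian-like spectra.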

  20. A methodology for stochastic analysis of share prices as Markov chains with finite states.

    PubMed

    Mettle, Felix Okoe; Quaye, Enoch Nii Boi; Laryea, Ravenhill Adjetey

    2014-01-01

    Price volatility makes stock investments risky, leaving investors in a critical position when decisions must be made under uncertainty. To improve investors' confidence in evaluating exchange markets without resorting to time series methodology, we specify equity price changes as a stochastic process assumed to possess Markov dependency, with state transition probability matrices over the identified state space (decrease, stable, or increase). We established that the identified states communicate and that the chains are aperiodic and ergodic, thus possessing limiting distributions. We developed a methodology for determining the expected mean return time for stock price increases, established criteria for improving investment decisions based on the highest transition probabilities, lowest mean return times, and highest limiting distributions, and implemented the methodology as an R algorithm. The methodology is applied to selected equities from Ghana Stock Exchange weekly trading data.
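    The pipeline described above (classify price changes into three states, estimate the transition matrix by counting, derive the limiting distribution and mean return times) can be sketched as follows. This uses synthetic prices and an add-one smoothing choice of our own, not the paper's R code or Ghana Stock Exchange data:

```python
import numpy as np

# Three-state Markov chain fitted to weekly price changes (synthetic data).
rng = np.random.default_rng(4)
prices = 10.0 * np.exp(np.cumsum(rng.normal(0, 0.02, 300)))  # synthetic weekly closes

changes = np.diff(prices)
# State space: 0 = decrease, 1 = stable, 2 = increase (threshold is illustrative)
states = np.where(changes > 0.01, 2, np.where(changes < -0.01, 0, 1))

# Estimate the transition probability matrix by counting transitions;
# Laplace (add-one) smoothing guards against unobserved transitions.
P = np.ones((3, 3))
for a, b in zip(states[:-1], states[1:]):
    P[a, b] += 1
P /= P.sum(axis=1, keepdims=True)

# Limiting distribution: left eigenvector of P for eigenvalue 1
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

mean_return_time = 1.0 / pi   # expected steps until each state recurs
```

    For an ergodic chain the mean return time of state i is 1/pi_i, which is the quantity the paper's investment criteria rank alongside the transition probabilities and limiting distribution.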

  1. The European style arithmetic Asian option pricing with stochastic interest rate based on Black Scholes model

    NASA Astrophysics Data System (ADS)

    Winarti, Yuyun Guna; Noviyanti, Lienda; Setyanto, Gatot R.

    2017-03-01

    Stock investment is a high-risk investment, so derivative securities exist to reduce that risk; one of them is the Asian option. The most fundamental problem for any option is its pricing. The factors that determine the option price include the underlying asset price, strike price, maturity date, volatility, risk-free interest rate, and dividends. Option pricing models usually assume that the risk-free interest rate is constant, while in reality it is a stochastic process. Because the arithmetic average has no known closed-form distribution, the arithmetic Asian option is priced using a modified Black-Scholes model; in this research the modification uses the Curran approximation. This research focuses on pricing the arithmetic Asian option without dividends. The data used are the daily closing stock prices of Telkom from January 1, 2016 to June 30, 2016. Finally, these option prices can be used in an option trading strategy.
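    To make the pricing problem concrete, here is a plain Monte Carlo pricer for an arithmetic-average Asian call under constant-rate Black-Scholes dynamics. All parameters are illustrative; this is a baseline sketch, not the Curran approximation with stochastic interest rate used in the paper:

```python
import numpy as np

# Monte Carlo pricing of an arithmetic-average Asian call (illustrative parameters).
rng = np.random.default_rng(5)
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
n_steps, n_paths = 52, 20000                     # weekly averaging dates
dt = T / n_steps

# Simulate risk-neutral GBM paths at the averaging dates
z = rng.standard_normal((n_paths, n_steps))
log_paths = np.cumsum((r - 0.5 * sigma ** 2) * dt + sigma * np.sqrt(dt) * z, axis=1)
S = S0 * np.exp(log_paths)

avg = S.mean(axis=1)                             # arithmetic average over each path
payoff = np.maximum(avg - K, 0.0)
asian_price = np.exp(-r * T) * payoff.mean()

# For comparison: the European call on S_T from the same paths
euro_payoff = np.maximum(S[:, -1] - K, 0.0)
euro_price = np.exp(-r * T) * euro_payoff.mean()
```

    Averaging reduces the effective volatility of the payoff, so the arithmetic Asian call is cheaper than the otherwise identical European call; approximations such as Curran's exist precisely because the arithmetic average of lognormals has no closed-form distribution.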

  2. The influence of surface roughness on volatile transport on the Moon

    NASA Astrophysics Data System (ADS)

    Prem, P.; Goldstein, D. B.; Varghese, P. L.; Trafton, L. M.

    2018-01-01

    The Moon and other virtually airless bodies provide distinctive environments for the transport and sequestration of water and other volatiles delivered to their surfaces by various sources. In this work, we conduct Monte Carlo simulations of water vapor transport on the Moon to investigate the role of small-scale roughness (unresolved by orbital measurements) in the migration and cold-trapping of volatiles. Observations indicate that surface roughness, combined with the insulating nature of lunar regolith and the absence of significant exospheric heat flow, can cause large variations in temperature over very small scales. Surface temperature has a strong influence on the residence time of migrating water molecules on the lunar surface, which in turn affects the rate and magnitude of volatile transport to permanently shadowed craters (cold traps) near the lunar poles, as well as exospheric structure and the susceptibility of migrating molecules to photodestruction. Here, we develop a stochastic rough surface temperature model suitable for simulations of volatile transport on a global scale, and compare the results of Monte Carlo simulations of volatile transport with and without the surface roughness model. We find that including small-scale temperature variations and shadowing leads to a slight increase in cold-trapping at the lunar poles, accompanied by a slight decrease in photodestruction. Exospheric structure is altered only slightly, primarily at the dawn terminator. We also examine the sensitivity of our results to the temperature of small-scale shadows, and the energetics of water molecule desorption from the lunar regolith - two factors that remain to be definitively constrained by other methods - and find that both these factors affect the rate at which cold trap capture and photodissociation occur, as well as exospheric density and longevity.

  3. Challenging terrestrial biosphere models with data from the long-term multifactor Prairie Heating and CO2 Enrichment experiment.

    PubMed

    De Kauwe, Martin G; Medlyn, Belinda E; Walker, Anthony P; Zaehle, Sönke; Asao, Shinichi; Guenet, Bertrand; Harper, Anna B; Hickler, Thomas; Jain, Atul K; Luo, Yiqi; Lu, Xingjie; Luus, Kristina; Parton, William J; Shu, Shijie; Wang, Ying-Ping; Werner, Christian; Xia, Jianyang; Pendall, Elise; Morgan, Jack A; Ryan, Edmund M; Carrillo, Yolima; Dijkstra, Feike A; Zelikova, Tamara J; Norby, Richard J

    2017-09-01

Multifactor experiments are often advocated as important for advancing terrestrial biosphere models (TBMs), yet to date, such models have only been tested against single-factor experiments. We applied 10 TBMs to the multifactor Prairie Heating and CO 2 Enrichment (PHACE) experiment in Wyoming, USA. Our goals were to investigate how multifactor experiments can be used to constrain models and to identify a road map for model improvement. We found models performed poorly in ambient conditions; there was a wide spread in simulated above-ground net primary productivity (range: 31-390 g C m -2  yr -1 ). Comparison with data highlighted model failures particularly with respect to carbon allocation, phenology, and the impact of water stress on phenology. Performance against the observations from single-factor treatments was also relatively poor. In addition, similar responses were predicted for different reasons across models: there were large differences among models in sensitivity to water stress and, among the N cycle models, N availability during the experiment. Models were also unable to capture observed treatment effects on phenology: they overestimated the effect of warming on leaf onset and did not allow CO 2 -induced water savings to extend the growing season length. Observed interactive (CO 2  × warming) treatment effects were subtle and contingent on water stress, phenology, and species composition. As the models did not correctly represent these processes under ambient and single-factor conditions, little extra information was gained by comparing model predictions against interactive responses. We outline a series of key areas in which this and future experiments could be used to improve model predictions of grassland responses to global change. © 2017 John Wiley & Sons Ltd.

  4. Challenging terrestrial biosphere models with data from the long-term multifactor Prairie Heating and CO 2 enrichment experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Kauwe, Martin G.; Medlyn, Belinda E.; Walker, Anthony P.

Multi-factor experiments are often advocated as important for advancing terrestrial biosphere models (TBMs), yet to date such models have only been tested against single-factor experiments. We applied 10 TBMs to the multi-factor Prairie Heating and CO 2 Enrichment (PHACE) experiment in Wyoming, USA. Our goals were to investigate how multi-factor experiments can be used to constrain models, and to identify a road map for model improvement. We found models performed poorly in ambient conditions; there was a wide spread in simulated above-ground net primary productivity (range: 31-390 g C m -2 yr -1). Comparison with data highlighted model failures particularly with respect to carbon allocation, phenology, and the impact of water stress on phenology. Performance against single-factor treatments was also relatively poor. In addition, similar responses were predicted for different reasons across models: there were large differences among models in sensitivity to water stress and, among the nitrogen cycle models, nitrogen availability during the experiment. Models were also unable to capture observed treatment effects on phenology: they over-estimated the effect of warming on leaf onset and did not allow CO 2-induced water savings to extend growing season length. Observed interactive (CO 2 × warming) treatment effects were subtle and contingent on water stress, phenology and species composition. Since the models did not correctly represent these processes under ambient and single-factor conditions, little extra information was gained by comparing model predictions against interactive responses. Finally, we outline a series of key areas in which this and future experiments could be used to improve model predictions of grassland responses to global change.

  5. Challenging terrestrial biosphere models with data from the long-term multifactor Prairie Heating and CO 2 enrichment experiment

    DOE PAGES

    De Kauwe, Martin G.; Medlyn, Belinda E.; Walker, Anthony P.; ...

    2017-02-01

Multi-factor experiments are often advocated as important for advancing terrestrial biosphere models (TBMs), yet to date such models have only been tested against single-factor experiments. We applied 10 TBMs to the multi-factor Prairie Heating and CO 2 Enrichment (PHACE) experiment in Wyoming, USA. Our goals were to investigate how multi-factor experiments can be used to constrain models, and to identify a road map for model improvement. We found models performed poorly in ambient conditions; there was a wide spread in simulated above-ground net primary productivity (range: 31-390 g C m -2 yr -1). Comparison with data highlighted model failures particularly with respect to carbon allocation, phenology, and the impact of water stress on phenology. Performance against single-factor treatments was also relatively poor. In addition, similar responses were predicted for different reasons across models: there were large differences among models in sensitivity to water stress and, among the nitrogen cycle models, nitrogen availability during the experiment. Models were also unable to capture observed treatment effects on phenology: they over-estimated the effect of warming on leaf onset and did not allow CO 2-induced water savings to extend growing season length. Observed interactive (CO 2 × warming) treatment effects were subtle and contingent on water stress, phenology and species composition. Since the models did not correctly represent these processes under ambient and single-factor conditions, little extra information was gained by comparing model predictions against interactive responses. Finally, we outline a series of key areas in which this and future experiments could be used to improve model predictions of grassland responses to global change.

  6. The prevalence and structure of obsessive-compulsive personality disorder in Hispanic psychiatric outpatients

    PubMed Central

    Ansell, Emily B.; Pinto, Anthony; Crosby, Ross D.; Becker, Daniel F.; Añez, Luis M.; Paris, Manuel; Grilo, Carlos M.

    2010-01-01

    This study sought to confirm a multi-factor model of Obsessive-compulsive personality disorder (OCPD) in a Hispanic outpatient sample and to explore associations of the OCPD factors with aggression, depression, and suicidal thoughts. One hundred and thirty monolingual, Spanish-speaking participants were recruited from a community mental health center and were assessed by bilingual doctoral level clinicians. OCPD was highly prevalent (26%) in this sample. Multi-factor models of OCPD were tested and the two factors - perfectionism and interpersonal rigidity - provided the best model fit. Interpersonal rigidity was associated with aggression and anger while perfectionism was associated with depression and suicidal thoughts. PMID:20227063

  7. A Multifactor Secure Authentication System for Wireless Payment

    NASA Astrophysics Data System (ADS)

    Sanyal, Sugata; Tiwari, Ayu; Sanyal, Sudip

Organizations are deploying wireless-based online payment applications to expand their business globally, which increases the need to meet regulatory requirements for the protection of confidential data, especially in Internet-based financial services. Existing Internet-based authentication systems often use either the Web or the mobile channel individually to confirm the claimed identity of the remote user. The vulnerability is that access is based on single-factor authentication alone, which is not secure enough to protect user data; multifactor authentication is needed. This paper proposes a new protocol based on a multifactor authentication system that is both secure and highly usable. It uses a novel approach based on a Transaction Identification Code and SMS to add another security level to the traditional login/password system. The system provides a highly secure environment that is simple to use and deploy with limited resources, and does not require any change in the infrastructure or underlying protocol of the wireless network. This protocol for wireless payment is extended as a two-way authentication system to satisfy the emerging market need for mutual authentication, and also supports secure B2B communication, which increases the confidence of users and business organizations in wireless financial transactions using mobile devices.
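The general idea of combining a knowledge factor with a possession factor can be sketched in a few lines. This is a generic two-factor check (password hash plus a one-time code delivered out of band), not the paper's specific Transaction Identification Code protocol; all names and parameters are illustrative:

```python
import hashlib
import hmac
import secrets

def hash_password(password: str, salt: bytes) -> bytes:
    # PBKDF2 as a stand-in for a production password KDF.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def issue_sms_code() -> str:
    """Server side: generate a short one-time code to deliver over SMS."""
    return f"{secrets.randbelow(10**6):06d}"

def authenticate(password, salt, stored_hash, submitted_code, issued_code):
    """Both factors must pass: knowledge (password) and possession (SMS code).
    Constant-time comparisons avoid leaking information via timing."""
    factor1 = hmac.compare_digest(hash_password(password, salt), stored_hash)
    factor2 = hmac.compare_digest(submitted_code, issued_code)
    return factor1 and factor2

salt = b"demo-salt"
stored = hash_password("s3cret", salt)
code = issue_sms_code()  # delivered to the user's phone out of band
print(authenticate("s3cret", salt, stored, code, code))    # True
print(authenticate("wrong-pw", salt, stored, code, code))  # False
```

The point of the second channel is that compromising the login/password pair alone is no longer sufficient to authenticate.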

  8. Underlying dynamics of typical fluctuations of an emerging market price index: The Heston model from minutes to months

    NASA Astrophysics Data System (ADS)

    Vicente, Renato; de Toledo, Charles M.; Leite, Vitor B. P.; Caticha, Nestor

    2006-02-01

    We investigate the Heston model with stochastic volatility and exponential tails as a model for the typical price fluctuations of the Brazilian São Paulo Stock Exchange Index (IBOVESPA). Raw prices are first corrected for inflation and a period spanning 15 years characterized by memoryless returns is chosen for the analysis. Model parameters are estimated by observing volatility scaling and correlation properties. We show that the Heston model with at least two time scales for the volatility mean reverting dynamics satisfactorily describes price fluctuations ranging from time scales larger than 20 min to 160 days. At time scales shorter than 20 min we observe autocorrelated returns and power law tails incompatible with the Heston model. Despite major regulatory changes, hyperinflation and currency crises experienced by the Brazilian market in the period studied, the general success of the description provided may be regarded as an evidence for a general underlying dynamics of price fluctuations at intermediate mesoeconomic time scales well approximated by the Heston model. We also notice that the connection between the Heston model and Ehrenfest urn models could be exploited for bringing new insights into the microeconomic market mechanics.
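The Heston dynamics referred to above can be simulated directly. A minimal Euler-Maruyama sketch with uncorrelated Brownian motions and illustrative parameter values (the paper's estimation uses multiple volatility time scales; this shows only the basic single-factor dynamics):

```python
import numpy as np

def simulate_heston(S0=1.0, v0=0.04, kappa=2.0, theta=0.04, xi=0.3,
                    mu=0.0, T=1.0, n_steps=252, seed=0):
    """Euler-Maruyama discretization of the Heston model:
        dS = mu*S dt + sqrt(v)*S dW1,
        dv = kappa*(theta - v) dt + xi*sqrt(v) dW2.
    Full truncation (max(v, 0)) keeps the variance usable when the
    discretized process dips below zero. Parameters are illustrative."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    S = np.empty(n_steps + 1)
    v = np.empty(n_steps + 1)
    S[0], v[0] = S0, v0
    for t in range(n_steps):
        vp = max(v[t], 0.0)
        dW1, dW2 = rng.normal(0.0, np.sqrt(dt), size=2)  # uncorrelated here
        S[t + 1] = S[t] * (1.0 + mu * dt + np.sqrt(vp) * dW1)
        v[t + 1] = v[t] + kappa * (theta - vp) * dt + xi * np.sqrt(vp) * dW2
    return S, v

S, v = simulate_heston()
print(len(S), S[0])
```

The mean-reverting variance `v` produces volatility clustering; adding a second, slower mean-reverting factor is what the abstract's "two time scales" refers to.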

  9. A Simple and Computationally Efficient Approach to Multifactor Dimensionality Reduction Analysis of Gene-Gene Interactions for Quantitative Traits

    PubMed Central

    Gui, Jiang; Moore, Jason H.; Williams, Scott M.; Andrews, Peter; Hillege, Hans L.; van der Harst, Pim; Navis, Gerjan; Van Gilst, Wiek H.; Asselbergs, Folkert W.; Gilbert-Diamond, Diane

    2013-01-01

    We present an extension of the two-class multifactor dimensionality reduction (MDR) algorithm that enables detection and characterization of epistatic SNP-SNP interactions in the context of a quantitative trait. The proposed Quantitative MDR (QMDR) method handles continuous data by modifying MDR’s constructive induction algorithm to use a T-test. QMDR replaces the balanced accuracy metric with a T-test statistic as the score to determine the best interaction model. We used a simulation to identify the empirical distribution of QMDR’s testing score. We then applied QMDR to genetic data from the ongoing prospective Prevention of Renal and Vascular End-Stage Disease (PREVEND) study. PMID:23805232
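The core scoring change described above (a t statistic instead of balanced accuracy) can be sketched as follows. This toy version labels each multilocus genotype cell high or low by comparing its mean trait to the grand mean, pools the samples accordingly, and scores the partition with a Welch t statistic; it is an illustration of the idea, not the published QMDR implementation:

```python
import math

def qmdr_score(genotypes, trait):
    """Toy QMDR-style score: pool samples into 'high' and 'low' groups by
    whether their genotype cell's mean trait exceeds the grand mean, then
    return a Welch t statistic between the two pooled groups."""
    grand = sum(trait) / len(trait)
    cells = {}
    for g, y in zip(genotypes, trait):
        cells.setdefault(g, []).append(y)
    high, low = [], []
    for ys in cells.values():
        (high if sum(ys) / len(ys) > grand else low).extend(ys)

    def mean_var(xs):
        m = sum(xs) / len(xs)
        return m, sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    m1, v1 = mean_var(high)
    m2, v2 = mean_var(low)
    return (m1 - m2) / math.sqrt(v1 / len(high) + v2 / len(low))

# Two-SNP genotype combinations with the trait shifted in cell (1, 1):
genos = [(0, 0)] * 4 + [(1, 1)] * 4
trait = [1.0, 1.2, 0.9, 1.1, 2.0, 2.2, 1.9, 2.1]
print(qmdr_score(genos, trait) > 0)  # True: the cells separate cleanly
```

The interaction model whose partition yields the largest score is the one QMDR selects; the empirical null distribution of that score is what the simulation step estimates.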

  10. Computational analysis of gene-gene interactions using multifactor dimensionality reduction.

    PubMed

    Moore, Jason H

    2004-11-01

    Understanding the relationship between DNA sequence variations and biologic traits is expected to improve the diagnosis, prevention and treatment of common human diseases. Success in characterizing genetic architecture will depend on our ability to address nonlinearities in the genotype-to-phenotype mapping relationship as a result of gene-gene interactions, or epistasis. This review addresses the challenges associated with the detection and characterization of epistasis. A novel strategy known as multifactor dimensionality reduction that was specifically designed for the identification of multilocus genetic effects is presented. Several case studies that demonstrate the detection of gene-gene interactions in common diseases such as atrial fibrillation, Type II diabetes and essential hypertension are also discussed.

  11. The prevalence and structure of obsessive-compulsive personality disorder in Hispanic psychiatric outpatients.

    PubMed

    Ansell, Emily B; Pinto, Anthony; Crosby, Ross D; Becker, Daniel F; Añez, Luis M; Paris, Manuel; Grilo, Carlos M

    2010-09-01

    This study sought to confirm a multi-factor model of Obsessive-compulsive personality disorder (OCPD) in a Hispanic outpatient sample and to explore associations of the OCPD factors with aggression, depression, and suicidal thoughts. One hundred and thirty monolingual, Spanish-speaking participants were recruited from a community mental health center and were assessed by bilingual doctoral-level clinicians. OCPD was highly prevalent (26%) in this sample. Multi-factor models of OCPD were tested and the two factors - perfectionism and interpersonal rigidity - provided the best model fit. Interpersonal rigidity was associated with aggression and anger while perfectionism was associated with depression and suicidal thoughts. (c) 2010 Elsevier Ltd. All rights reserved.

  12. What mental health teams want in their leaders.

    PubMed

    Corrigan, P W; Garman, A N; Lam, C; Leary, M

    1998-11-01

    The authors present the findings of the first phase of a 3-year study developing a skills training curriculum for mental health team leaders. A factor model empirically generated from clinical team members was compared to Bass' (1990) Multifactor Model of Leadership. Members of mental health teams generated individual responses to questions about effective leaders. Results from this survey were subsequently administered to a sample of mental health team members. Analysis of these data yielded six factors: Autocratic Leadership, Clear Roles and Goals, Reluctant Leadership, Vision, Diversity Issues, and Supervision. Additional analyses suggest Bass' Multifactor Model offers a useful paradigm for developing a curriculum specific to the needs of mental health team leaders.

  13. Controlling the phase locking of stochastic magnetic bits for ultra-low power computation

    NASA Astrophysics Data System (ADS)

    Mizrahi, Alice; Locatelli, Nicolas; Lebrun, Romain; Cros, Vincent; Fukushima, Akio; Kubota, Hitoshi; Yuasa, Shinji; Querlioz, Damien; Grollier, Julie

    2016-07-01

    When fabricating magnetic memories, one of the main challenges is to maintain the bit stability while downscaling. Indeed, for magnetic volumes of a few thousand nm3, the energy barrier between magnetic configurations becomes comparable to the thermal energy at room temperature. Then, switches of the magnetization spontaneously occur. These volatile, superparamagnetic nanomagnets are generally considered useless. But what if we could use them as low power computational building blocks? Remarkably, they can oscillate without the need of any external dc drive, and despite their stochastic nature, they can beat in unison with an external periodic signal. Here we show that the phase locking of superparamagnetic tunnel junctions can be induced and suppressed by electrical noise injection. We develop a comprehensive model giving the conditions for synchronization, and predict that it can be achieved with a total energy cost lower than 10-13 J. Our results open the path to ultra-low power computation based on the controlled synchronization of oscillators.
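The basic mechanism, a bistable nanomagnet whose thermally activated switching rate is modulated by a weak periodic signal, can be sketched as a random telegraph process. This is an illustrative toy model with made-up parameters, not a reproduction of the paper's junction physics or its synchronization analysis:

```python
import math
import random

def simulate_bit(f_drive=1.0, eps=0.8, f0=10.0, dt=1e-3, T=50.0, seed=1):
    """Two-state superparamagnetic bit as a random telegraph signal.
    The escape rate from each state follows a Neel-Arrhenius form; a weak
    periodic drive modulates the effective barrier in antiphase for the two
    states, so flips tend to cluster in step with the drive. Returns the
    number of switches observed over the run."""
    random.seed(seed)
    state, flips, t = +1, 0, 0.0
    while t < T:
        # Drive lowers the barrier for one transition, raises the other.
        drive = eps * math.sin(2 * math.pi * f_drive * t)
        rate = f0 * math.exp(-state * drive)  # escape rate from current state
        if random.random() < rate * dt:
            state = -state
            flips += 1
        t += dt
    return flips

print(simulate_bit() > 10)  # True: the bit switches many times with no dc drive
```

Despite being purely noise-driven, the flip times of such a process correlate with the drive phase, which is the stochastic phase locking the abstract exploits.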

  14. Controlling the phase locking of stochastic magnetic bits for ultra-low power computation.

    PubMed

    Mizrahi, Alice; Locatelli, Nicolas; Lebrun, Romain; Cros, Vincent; Fukushima, Akio; Kubota, Hitoshi; Yuasa, Shinji; Querlioz, Damien; Grollier, Julie

    2016-07-26

    When fabricating magnetic memories, one of the main challenges is to maintain the bit stability while downscaling. Indeed, for magnetic volumes of a few thousand nm(3), the energy barrier between magnetic configurations becomes comparable to the thermal energy at room temperature. Then, switches of the magnetization spontaneously occur. These volatile, superparamagnetic nanomagnets are generally considered useless. But what if we could use them as low power computational building blocks? Remarkably, they can oscillate without the need of any external dc drive, and despite their stochastic nature, they can beat in unison with an external periodic signal. Here we show that the phase locking of superparamagnetic tunnel junctions can be induced and suppressed by electrical noise injection. We develop a comprehensive model giving the conditions for synchronization, and predict that it can be achieved with a total energy cost lower than 10(-13) J. Our results open the path to ultra-low power computation based on the controlled synchronization of oscillators.

  15. Minding Impacting Events in a Model of Stochastic Variance

    PubMed Central

    Duarte Queirós, Sílvio M.; Curado, Evaldo M. F.; Nobre, Fernando D.

    2011-01-01

We introduce a generalization of the well-known ARCH process, widely used for generating uncorrelated stochastic time series with long-term non-Gaussian distributions and long-lasting correlations in the (instantaneous) standard deviation exhibiting a clustering profile. Specifically, inspired by the fact that in a variety of systems impacting events are hardly forgotten, we split the process into two different regimes: a first one for regular periods, where the average volatility of the fluctuations within a certain period of time is below a certain threshold, and another one for when the local standard deviation exceeds that threshold. In the former situation we use standard rules for heteroscedastic processes, whereas in the latter case the system starts recalling past values that surpassed the threshold. Our results show that for appropriate parameter values the model is able to produce fat-tailed probability density functions and strong persistence of the instantaneous variance, characterized by large values of the Hurst exponent, which are ubiquitous features in complex systems. PMID:21483864
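The two-regime idea can be illustrated with a toy ARCH(1) variant that switches to a stronger volatility feedback whenever the local standard deviation is above the threshold. This is a crude stand-in for the paper's memory-of-impacting-events regime, with all parameter values illustrative:

```python
import random
import statistics

def two_regime_arch(n=20000, a0=0.1, a_regular=0.3, a_impact=0.9,
                    threshold=0.8, seed=42):
    """Toy two-regime heteroscedastic process:
        sigma_t^2 = a0 + a * x_{t-1}^2,
    where the feedback coefficient a jumps from a_regular to a_impact while
    the previous shock is above the threshold. Both coefficients stay below
    1, so the process remains stationary."""
    random.seed(seed)
    x, out = 0.0, []
    for _ in range(n):
        a = a_impact if abs(x) > threshold else a_regular
        sigma2 = a0 + a * x * x
        x = random.gauss(0.0, sigma2 ** 0.5)
        out.append(x)
    return out

xs = two_regime_arch()
kurt = statistics.mean(v ** 4 for v in xs) / statistics.mean(v ** 2 for v in xs) ** 2
print(kurt > 3.0)  # True: fat tails relative to a Gaussian
```

Even this minimal switch produces the excess kurtosis and volatility clustering the abstract describes; the published model additionally lets the high-volatility regime recall specific past extreme values.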

  16. Research on trading patterns of large users' direct power purchase considering consumption of clean energy

    NASA Astrophysics Data System (ADS)

    Guojun, He; Lin, Guo; Zhicheng, Yu; Xiaojun, Zhu; Lei, Wang; Zhiqiang, Zhao

    2017-03-01

In order to reduce the stochastic volatility of supply and demand, and to maintain the stability of the electric power system after large-scale stochastic renewable energy sources are connected to the grid, their development and consumption should be promoted by market means. The bilateral contract transaction model of large users' direct power purchase conforms to the actual situation of our country. This paper analyzes the trading patterns of large users' direct power purchase, summarizes the characteristics of each type of power generation, and focuses on the centralized matching mode. Through the establishment of a priority evaluation index system for power generation enterprises, and an analysis of their priority based on fuzzy clustering, a method for ranking power generation enterprises' priority in the trading patterns of large users' direct power purchase is put forward. This method offers suggestions for the trading mechanism of large users' direct power purchase, which helps to further promote direct power purchase by large users.

  17. Efficacy and safety of a multifactor intervention to improve therapeutic adherence in patients with chronic obstructive pulmonary disease (COPD): protocol for the ICEPOC study.

    PubMed

    Barnestein-Fonseca, Pilar; Leiva-Fernández, José; Vidal-España, Francisca; García-Ruiz, Antonio; Prados-Torres, Daniel; Leiva-Fernández, Francisca

    2011-02-14

    Low therapeutic adherence to medication is very common. Clinical effectiveness is related to dose rate and route of administration and so poor therapeutic adherence can reduce the clinical benefit of treatment. The therapeutic adherence of patients with chronic obstructive pulmonary disease (COPD) is extremely poor according to most studies. The research about COPD adherence has mainly focussed on quantifying its effect, and few studies have researched factors that affect non-adherence. Our study will evaluate the effectiveness of a multifactor intervention to improve the therapeutic adherence of COPD patients. A randomized controlled clinical trial with 140 COPD diagnosed patients selected by a non-probabilistic method of sampling. Subjects will be randomly allocated into two groups, using the block randomization technique. Every patient in each group will be visited four times during the year of the study. Motivational aspects related to adherence (beliefs and behaviour): group and individual interviews; cognitive aspects: information about illness; skills: inhaled technique training. Reinforcement of the cognitive-emotional aspects and inhaled technique training will be carried out in all visits of the intervention group. Adherence to a prescribed treatment involves a behavioural change. Cognitive, emotional and motivational aspects influence this change and so we consider the best intervention procedure to improve adherence would be a cognitive and emotional strategy which could be applied in daily clinical practice. 
Our hypothesis is that the application of a multifactor intervention (COPD information, dose reminders and reinforcing audiovisual material, motivational aspects, and inhalation technique training) to COPD patients taking inhaled treatment will produce a 25% increase in the number of patients showing therapeutic adherence in this group compared to the control group. We will evaluate the effectiveness of this multifactor intervention on patient adherence to inhaled drugs, considering that it should be appropriate and feasible in the clinical practice context. Current Controlled Trials ISRCTN18841601.

  18. Modeling and complexity of stochastic interacting Lévy type financial price dynamics

    NASA Astrophysics Data System (ADS)

    Wang, Yiduan; Zheng, Shenzhou; Zhang, Wei; Wang, Jun; Wang, Guochao

    2018-06-01

In an attempt to reproduce and investigate the nonlinear dynamics of security markets, a novel nonlinear random interacting price dynamics, considered as a Lévy-type process, is developed and investigated through a combination of lattice-oriented percolation and Potts dynamics, which model the intrinsic random fluctuation and the fluctuation caused by the spread of investors' trading attitudes, respectively. To better understand the complexity properties of the proposed model's fluctuations, complexity analyses of the random logarithmic price return and the corresponding volatility series are performed, including power-law distribution, Lempel-Ziv complexity and fractional sample entropy. In order to verify the rationality of the proposed model, corresponding studies of actual security market datasets are also carried out for comparison. The empirical results reveal that this financial price model can reproduce some important complexity features of actual security markets to some extent. The complexity of returns decreases as the parameters γ1 and β increase; furthermore, the volatility series exhibit lower complexity than the return series.

  19. Nonparametric model validations for hidden Markov models with applications in financial econometrics.

    PubMed

    Zhao, Zhibiao

    2011-06-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.

  20. Binomial tree method for pricing a regime-switching volatility stock loans

    NASA Astrophysics Data System (ADS)

    Putri, Endah R. M.; Zamani, Muhammad S.; Utomo, Daryono B.

    2018-03-01

A binomial model with regime switching may represent the price of a stock loan, which follows a stochastic process. A stock loan is an alternative that appeals to investors seeking liquidity without selling their stock. The stock loan mechanism resembles that of an American call option, in that the holder can exercise at any time during the contract period. Given this resemblance, the price of a stock loan can be derived from the model of an American call option. The simulation results show the behavior of the stock loan price under regime-switching volatility with respect to various interest rates and maturities.
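The American-call analogy can be made concrete with a standard Cox-Ross-Rubinstein binomial tree that checks early exercise at every node. This single-volatility sketch (with illustrative parameters) shows the backward induction the pricing rests on; a regime-switching version would maintain one value lattice per volatility regime and couple them through regime transition probabilities:

```python
import math

def american_call_binomial(S0, K, r, sigma, T, n=200):
    """Cox-Ross-Rubinstein binomial tree for an American call.
    At each node the holder takes the larger of the discounted
    continuation value and the immediate exercise payoff."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
    disc = math.exp(-r * dt)
    # Terminal payoffs after n steps (j up-moves).
    values = [max(S0 * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
    # Backward induction with an early-exercise check at every node.
    for step in range(n - 1, -1, -1):
        values = [
            max(disc * (p * values[j + 1] + (1 - p) * values[j]),  # continue
                S0 * u**j * d**(step - j) - K)                     # exercise
            for j in range(step + 1)
        ]
    return values[0]

price = american_call_binomial(S0=100, K=100, r=0.05, sigma=0.2, T=1.0)
print(round(price, 2))
```

For a non-dividend-paying stock this converges to the Black-Scholes European price as `n` grows, since early exercise of a call is then never optimal; the stock loan's exercise features and the switching volatility are what make the full problem richer.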

  1. Probabilistic lifetime strength of aerospace materials via computational simulation

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Keating, Jerome P.; Lovelace, Thomas B.; Bast, Callie C.

    1991-01-01

    The results of a second year effort of a research program are presented. The research included development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic phenomenological constitutive relationship, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects of primitive variables. These primitive variables often originate in the environment and may include stress from loading, temperature, chemical, or radiation attack. This multifactor interaction constitutive equation is included in the computer program, PROMISS. Also included in the research is the development of methodology to calibrate the constitutive equation using actual experimental materials data together with the multiple linear regression of that data.
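The multifactor interaction equation has a multiplicative form in which each primitive variable contributes a degradation term. A hedged sketch of that product form, with made-up (uncalibrated) values for the primitive variables, is:

```python
def strength_ratio(effects):
    """Multifactor interaction equation of the PROMISS-style form:
        S / S_ref = prod_i ((A_f - A) / (A_f - A_0)) ** a_i,
    where for each primitive variable A is the current value, A_0 a
    reference value, A_f the final (ultimate) value, and a_i an empirical
    exponent. All numbers below are illustrative, not calibrated data."""
    ratio = 1.0
    for current, ref, final, exponent in effects:
        ratio *= ((final - current) / (final - ref)) ** exponent
    return ratio

# One thermal effect and one stress effect, each degrading strength as its
# primitive variable approaches the final value:
effects = [
    (600.0, 300.0, 1200.0, 0.5),  # temperature (K): current, ref, final, a
    (200.0, 0.0, 800.0, 0.5),     # stress (MPa): current, ref, final, a
]
print(0.0 < strength_ratio(effects) <= 1.0)  # True: degraded relative strength
```

In the probabilistic version, each primitive variable and exponent is randomized, so the resulting strength ratio becomes a distribution rather than a single number, which is what the computational simulation samples.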

  2. The Distribution of Ice in Lunar Permanently Shadowed Regions: Science Enabling Exploration (Invited)

    NASA Astrophysics Data System (ADS)

    Hurley, D.; Elphic, R. C.; Bussey, B.; Hibbitts, C.; Lawrence, D. J.

    2013-12-01

Recent prospecting indicates that water ice occurs in enhanced abundances in some lunar PSRs. That water constitutes a resource that enables lunar exploration if it can be harvested for fuel and life support. Future lunar exploration missions will need detailed information about the distribution of volatiles in lunar permanently shadowed regions (PSRs). In addition, the volatiles also offer key insights into the recent and distant past, as they have trapped volatiles delivered to the Moon over ~2 Gyr. This comprises an unparalleled reservoir of past inner solar system volatiles, and future scientific missions are needed to make the measurements that will reveal the composition of those volatiles. These scientific missions will necessarily have to acquire and analyze samples of volatiles from the PSRs. For both exploration and scientific purposes, the precise location of volatiles will need to be known. However, data indicate that ice is distributed heterogeneously on the Moon. It is unlikely that the distribution will be known a priori with enough spatial resolution to guarantee access to volatiles using a single point sample. Some mechanism for laterally or vertically distributed access will increase the likelihood of acquiring a rich sample of volatiles. Trade studies will need to be conducted to anticipate the necessary range and duration of missions to lunar PSRs that will be needed to accomplish the mission objectives. We examine the spatial distribution of volatiles in lunar PSRs reported from data analyses and couple those with models of smaller-scale processes. FUV and laser data from PSRs indicate that the average surface distribution is consistent with low abundances on the extreme surface in most PSRs. Neutron and radar data that probe the distribution at depth show heterogeneity at broad spatial resolution. We consider those data in conjunction with the model to understand the full, 3-D nature of the heterogeneity.
A Monte Carlo technique simulates the stochastic process of impact gardening on a putative ice deposit. The model uses the crater production function as a basis for generating a random selection of impact craters over time. Impacts are implemented by modifying the topography, volatile content, and depth distribution in the simulation volume on a case by case basis. This technique will never be able to reproduce the exact impact history of a particular area. But by conducting multiple runs with the same initial conditions and a different seed to the random number generator, we are able to calculate the probability of situations occurring. Further, by repeating the simulations with varied initial conditions, we calculate the dependence of the expectation values on the inputs. We present findings regarding the heterogeneity of volatiles in PSRs as a function of age, initial ice thickness, and contributions from steady sources.
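The repeated-seed strategy described above, rerunning the same stochastic simulation many times to turn individual outcomes into probabilities, can be sketched with a deliberately simplified gardening model. Everything here (the crater statistics, the ice geometry, the breach criterion) is invented for illustration; only the Monte Carlo structure mirrors the text:

```python
import random

def garden_once(n_impacts=200, ice_depth=1.0, seed=0):
    """One toy run: random impacts occur at random positions with
    power-law-distributed excavation depths (small impacts dominate).
    Returns whether any impact breaches a buried ice layer at a fixed site.
    Purely illustrative of a single stochastic gardening history."""
    rng = random.Random(seed)
    for _ in range(n_impacts):
        x = rng.random()                       # impact position on a unit strip
        depth = rng.paretovariate(2.0) * 0.05  # heavy-tailed excavation depth
        if abs(x - 0.5) < 0.02 and depth > ice_depth:
            return True
    return False

def breach_probability(n_runs=2000):
    """Estimate the breach probability by rerunning the same initial
    conditions with different random seeds, as in the text."""
    hits = sum(garden_once(seed=s) for s in range(n_runs))
    return hits / n_runs

p = breach_probability()
print(0.0 <= p <= 1.0)  # True: a probability, not a single history
```

Varying the fixed inputs (`ice_depth` here; initial ice thickness and age in the study) and repeating the ensemble gives the dependence of the expectation values on the initial conditions.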

  3. Probability distribution of financial returns in a model of multiplicative Brownian motion with stochastic diffusion coefficient

    NASA Astrophysics Data System (ADS)

    Silva, Antonio

    2005-03-01

It is well-known that the mathematical theory of Brownian motion was first developed in the Ph.D. thesis of Louis Bachelier for the French stock market, before Einstein [1]. In Ref. [2] we studied the so-called Heston model, where the stock-price dynamics is governed by multiplicative Brownian motion with stochastic diffusion coefficient. We solved the corresponding Fokker-Planck equation exactly and found an analytic formula for the time-dependent probability distribution of stock price changes (returns). The formula interpolates between the exponential (tent-shaped) distribution for short time lags and the Gaussian (parabolic) distribution for long time lags. The theoretical formula agrees very well with the actual stock-market data ranging from the Dow-Jones index [2] to individual companies [3], such as Microsoft, Intel, etc. [1] Louis Bachelier, "Théorie de la spéculation," Annales Scientifiques de l'École Normale Supérieure, III-17:21-86 (1900). [2] A. A. Dragulescu and V. M. Yakovenko, "Probability distribution of returns in the Heston model with stochastic volatility," Quantitative Finance 2, 443-453 (2002); Erratum 3, C15 (2003). [cond-mat/0203046] [3] A. C. Silva, R. E. Prange, and V. M. Yakovenko, "Exponential distribution of financial returns at mesoscopic time lags: a new stylized fact," Physica A 344, 227-235 (2004). [cond-mat/0401225]

  4. Forecasting East Asian Indices Futures via a Novel Hybrid of Wavelet-PCA Denoising and Artificial Neural Network Models

    PubMed Central

    2016-01-01

    The motivation behind this research is to innovatively combine new methods like wavelet, principal component analysis (PCA), and artificial neural network (ANN) approaches to analyze trade in today’s increasingly difficult and volatile financial futures markets. The main focus of this study is to facilitate forecasting by using an enhanced denoising process on market data, taken as a multivariate signal, in order to deduct the same noise from the open-high-low-close signal of a market. This research offers evidence on the predictive ability and the profitability of abnormal returns of a new hybrid forecasting model using Wavelet-PCA denoising and ANN (named WPCA-NN) on futures contracts of Hong Kong’s Hang Seng futures, Japan’s NIKKEI 225 futures, Singapore’s MSCI futures, South Korea’s KOSPI 200 futures, and Taiwan’s TAIEX futures from 2005 to 2014. Using a host of technical analysis indicators consisting of RSI, MACD, MACD Signal, Stochastic Fast %K, Stochastic Slow %K, Stochastic %D, and Ultimate Oscillator, empirical results show that the annual mean returns of WPCA-NN are more than the threshold buy-and-hold for the validation, test, and evaluation periods; this is inconsistent with the traditional random walk hypothesis, which insists that mechanical rules cannot outperform the threshold buy-and-hold. The findings, however, are consistent with literature that advocates technical analysis. PMID:27248692

  5. Forecasting East Asian Indices Futures via a Novel Hybrid of Wavelet-PCA Denoising and Artificial Neural Network Models.

    PubMed

    Chan Phooi M'ng, Jacinta; Mehralizadeh, Mohammadali

    2016-01-01

    The motivation behind this research is to innovatively combine wavelet, principal component analysis (PCA), and artificial neural network (ANN) methods to analyze trading in today's increasingly difficult and volatile financial futures markets. The main focus of this study is to facilitate forecasting by using an enhanced denoising process on market data, taken as a multivariate signal, in order to subtract the common noise from the open-high-low-close signal of a market. This research offers evidence on the predictive ability and the profitability of abnormal returns of a new hybrid forecasting model using Wavelet-PCA denoising and ANN (named WPCA-NN) on futures contracts of Hong Kong's Hang Seng futures, Japan's NIKKEI 225 futures, Singapore's MSCI futures, South Korea's KOSPI 200 futures, and Taiwan's TAIEX futures from 2005 to 2014. Using a host of technical analysis indicators consisting of RSI, MACD, MACD Signal, Stochastic Fast %K, Stochastic Slow %K, Stochastic %D, and Ultimate Oscillator, empirical results show that the annual mean returns of WPCA-NN are more than the threshold buy-and-hold for the validation, test, and evaluation periods; this is inconsistent with the traditional random walk hypothesis, which insists that mechanical rules cannot outperform the threshold buy-and-hold. The findings, however, are consistent with literature that advocates technical analysis.

  6. A robust multifactor dimensionality reduction method for detecting gene-gene interactions with application to the genetic analysis of bladder cancer susceptibility

    PubMed Central

    Gui, Jiang; Andrew, Angeline S.; Andrews, Peter; Nelson, Heather M.; Kelsey, Karl T.; Karagas, Margaret R.; Moore, Jason H.

    2010-01-01

    A central goal of human genetics is to identify and characterize susceptibility genes for common complex human diseases. An important challenge in this endeavor is the modeling of gene-gene interaction or epistasis that can result in non-additivity of genetic effects. The multifactor dimensionality reduction (MDR) method was developed as a machine learning alternative to parametric logistic regression for detecting interactions in the absence of significant marginal effects. The goal of MDR is to reduce the dimensionality inherent in modeling combinations of polymorphisms using a computational approach called constructive induction. Here, we propose a Robust Multifactor Dimensionality Reduction (RMDR) method that performs constructive induction using Fisher’s exact test rather than a predetermined threshold. The advantage of this approach is that only those genotype combinations that are determined to be statistically significant are considered in the MDR analysis. We use two simulation studies to demonstrate that this approach will increase the success rate of MDR when there are only a few genotype combinations that are significantly associated with case-control status. We show that there is no loss of success rate when this is not the case. We then apply the RMDR method to the detection of gene-gene interactions in genotype data from a population-based study of bladder cancer in New Hampshire. PMID:21091664
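
    The constructive-induction step can be illustrated with a small stdlib sketch (not the authors' code; the 0.05 threshold, helper names, and risk-ratio rule are our assumptions): each genotype-combination cell is labeled high- or low-risk only when Fisher's exact test finds a significant case/control imbalance, and is otherwise left unclassified.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]],
    e.g. cases/controls inside vs. outside a genotype combination."""
    n, row1, col1 = a + b + c + d, a + b, a + c
    denom = comb(n, col1)
    def prob(k):   # hypergeometric probability of k cases in row 1
        return comb(row1, k) * comb(n - row1, col1 - k) / denom
    p_obs = prob(a)
    lo, hi = max(0, col1 - (n - row1)), min(row1, col1)
    # sum the probabilities of all tables at least as extreme as observed
    return sum(prob(k) for k in range(lo, hi + 1)
               if prob(k) <= p_obs * (1.0 + 1e-9))

def rmdr_label(cases_in, controls_in, cases_out, controls_out, alpha=0.05):
    """RMDR-style constructive induction: label a cell only when Fisher's
    exact test is significant, otherwise exclude it from the MDR analysis."""
    p = fisher_exact_two_sided(cases_in, controls_in, cases_out, controls_out)
    if p >= alpha:
        return "unclassified"
    in_ratio = cases_in / max(controls_in, 1)
    out_ratio = cases_out / max(controls_out, 1)
    return "high-risk" if in_ratio > out_ratio else "low-risk"
```

    For example, a balanced table such as [[3, 1], [1, 3]] stays unclassified (p ≈ 0.49), while a strongly imbalanced cell such as [[20, 5], [5, 20]] is labeled high-risk.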

  7. A Combinatorial Approach to Detecting Gene-Gene and Gene-Environment Interactions in Family Studies

    PubMed Central

    Lou, Xiang-Yang; Chen, Guo-Bo; Yan, Lei; Ma, Jennie Z.; Mangold, Jamie E.; Zhu, Jun; Elston, Robert C.; Li, Ming D.

    2008-01-01

    Widespread multifactor interactions present a significant challenge in determining risk factors of complex diseases. Several combinatorial approaches, such as the multifactor dimensionality reduction (MDR) method, have emerged as a promising tool for better detecting gene-gene (G × G) and gene-environment (G × E) interactions. We recently developed a general combinatorial approach, namely the generalized multifactor dimensionality reduction (GMDR) method, which can entertain both qualitative and quantitative phenotypes and allows for both discrete and continuous covariates to detect G × G and G × E interactions in a sample of unrelated individuals. In this article, we report the development of an algorithm that can be used to study G × G and G × E interactions for family-based designs, called pedigree-based GMDR (PGMDR). Compared to the available method, our proposed method has several major improvements, including allowing for covariate adjustments and being applicable to arbitrary phenotypes, arbitrary pedigree structures, and arbitrary patterns of missing marker genotypes. Our Monte Carlo simulations provide evidence that the PGMDR method is superior in performance to identify epistatic loci compared to the MDR-pedigree disequilibrium test (PDT). Finally, we applied our proposed approach to a genetic data set on tobacco dependence and found a significant interaction between two taste receptor genes (i.e., TAS2R16 and TAS2R38) in affecting nicotine dependence. PMID:18834969

  8. Multifactor Screener in OPEN: Scoring Procedures & Results

    Cancer.gov

    Scoring procedures were developed to convert a respondent's screener responses to estimates of individual dietary intake for percentage energy from fat, grams of fiber, and servings of fruits and vegetables.

  9. Nonparametric model validations for hidden Markov models with applications in financial econometrics

    PubMed Central

    Zhao, Zhibiao

    2011-01-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for the transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable to continuous-time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise. PMID:21750601

  10. Modeling and estimating the jump risk of exchange rates: Applications to RMB

    NASA Astrophysics Data System (ADS)

    Wang, Yiming; Tong, Hanfei

    2008-11-01

    In this paper we propose a new type of continuous-time stochastic volatility model, SVDJ, for the spot exchange rate of RMB and other foreign currencies. In the model, we assume that the change of the exchange rate can be decomposed into two components. One is the normally small-scale innovation driven by the diffusive motion; the other is a large drop or rise engendered by the Poisson counting process. Furthermore, we develop an MCMC method to estimate our model. Empirical results indicate the significant existence of jumps in the exchange rate. Jump components explain a large proportion of the exchange rate change.

  11. On the connection between financial processes with stochastic volatility and nonextensive statistical mechanics

    NASA Astrophysics Data System (ADS)

    Queirós, S. M. D.; Tsallis, C.

    2005-11-01

    The GARCH algorithm is the most renowned generalisation of Engle's original proposal for modelling returns, the ARCH process. Both cases are characterised by a time-dependent and correlated variance, or volatility. Besides a memory parameter, b (present in ARCH), and an independent and identically distributed noise, ω, GARCH involves another parameter, c, such that, for c = 0, the standard ARCH process is recovered. In this manuscript we use a generalised noise following a distribution characterised by an index q_n, such that q_n = 1 recovers the Gaussian distribution. Matching low-order statistical moments of the GARCH distribution for returns with a q-Gaussian distribution obtained by maximising the entropy S_q = (1 - Σ_i p_i^q)/(q - 1), the basis of nonextensive statistical mechanics, we obtain a sole analytical connection between q and (b, c, q_n) which turns out to be remarkably good when compared with computational simulations. With this result we also derive an analytical approximation for the stationary distribution of the (squared) volatility. Using a generalised Kullback-Leibler relative entropy form based on S_q, we also analyse the degree of dependence between successive returns, z_t and z_{t+1}, of GARCH(1,1) processes. This degree of dependence is quantified by an entropic index, q_op. Our analysis points to the existence of a unique relation between the three entropic indices q_op, q and q_n of the problem, independent of the value of (b, c).
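
    A minimal sketch of the GARCH(1,1) recursion the record builds on, using Gaussian noise (i.e. the q_n = 1 case); the parameter values are hypothetical and the generalised q-Gaussian noise is not reproduced here:

```python
import math
import random

def garch11(n, a=0.1, b=0.1, c=0.5, seed=1):
    """Simulate a GARCH(1,1) process: x_t = sigma_t * w_t with
    sigma_t^2 = a + b * x_{t-1}^2 + c * sigma_{t-1}^2 and w_t ~ N(0, 1);
    setting c = 0 recovers the ARCH case discussed in the record."""
    rng = random.Random(seed)
    var = a / (1.0 - b - c)      # start from the stationary variance
    x = 0.0
    xs = []
    for _ in range(n):
        var = a + b * x * x + c * var
        x = math.sqrt(var) * rng.gauss(0.0, 1.0)
        xs.append(x)
    return xs

xs = garch11(200_000)
# the sample variance should approach the stationary value a/(1-b-c) = 0.25
sample_var = sum(x * x for x in xs) / len(xs)
```

    The stationary variance a/(1 - b - c) exists only when b + c < 1; with the values above it equals 0.25.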

  12. Leveraging Commercially Issued Multi-Factor Identification Credentials

    NASA Technical Reports Server (NTRS)

    Baldridge, Tim W.

    2010-01-01

    This slide presentation reviews the Identity, Credential and Access Management (ICAM) system. This system is a complete system of identity management, access to desktops and applications, use of smartcards, and building access throughout NASA.

  13. Linking market interaction intensity of 3D Ising type financial model with market volatility

    NASA Astrophysics Data System (ADS)

    Fang, Wen; Ke, Jinchuan; Wang, Jun; Feng, Ling

    2016-11-01

    Microscopic interaction models in physics have been used to investigate the complex phenomena of economic systems. The simple interactions involved can lead to complex behaviors and help the understanding of mechanisms in the financial market at a systemic level. This article aims to develop a financial time series model through a 3D (three-dimensional) Ising dynamic system, which is widely used as an interacting-spins model to explain ferromagnetism in physics. Through Monte Carlo simulations of the financial model and numerical analysis of both the simulated return time series and historical return data of the Hushen 300 (HS300) index in the Chinese stock market, we show that despite its simplicity, this model displays stylized facts similar to those seen in real financial markets. We demonstrate a possible underlying link between volatility fluctuations of the real stock market and changes in the interaction strengths of market participants in the financial model. In particular, the stochastic interaction strength in our model suggests that the real market may be consistently operating near the critical point of the system.
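
    For reference, a standard Metropolis sketch of the underlying 3D Ising dynamics (the article's financial mapping from interaction strength to volatility is not reproduced; lattice size, temperatures, and sweep counts below are our choices):

```python
import math
import random

def ising3d_magnetization(L=6, T=2.0, sweeps=200, seed=0):
    """Metropolis dynamics for the 3D Ising model on an L x L x L periodic
    lattice; returns the final magnetization per spin.  T is in units of
    J/k_B; the 3D critical point is near T_c = 4.51."""
    rng = random.Random(seed)
    s = [[[1] * L for _ in range(L)] for _ in range(L)]   # ordered start
    for _ in range(sweeps * L ** 3):
        i, j, k = rng.randrange(L), rng.randrange(L), rng.randrange(L)
        nn = (s[(i + 1) % L][j][k] + s[(i - 1) % L][j][k]
              + s[i][(j + 1) % L][k] + s[i][(j - 1) % L][k]
              + s[i][j][(k + 1) % L] + s[i][j][(k - 1) % L])
        dE = 2 * s[i][j][k] * nn          # energy cost of flipping the spin
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            s[i][j][k] = -s[i][j][k]
    return sum(x for plane in s for row in plane for x in row) / L ** 3
```

    Below the critical temperature the lattice stays strongly magnetized; well above it the magnetization fluctuates around zero, which is the qualitative behavior the financial mapping exploits.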

  14. Separation of components from a scale mixture of Gaussian white noises

    NASA Astrophysics Data System (ADS)

    Vamoş, Călin; Crăciun, Maria

    2010-05-01

    The time evolution of a physical quantity associated with a thermodynamic system whose equilibrium fluctuations are modulated in amplitude by a slowly varying phenomenon can be modeled as the product of a Gaussian white noise {Zt} and a stochastic process with strictly positive values {Vt} referred to as volatility. The probability density function (pdf) of the process Xt=VtZt is a scale mixture of Gaussian white noises expressed as a time average of Gaussian distributions weighted by the pdf of the volatility. The separation of the two components of {Xt} can be achieved by imposing the condition that the absolute values of the estimated white noise be uncorrelated. We apply this method to the time series of the returns of the daily S&P500 index, which has also been analyzed by means of the superstatistics method, which imposes the condition that the estimated white noise be Gaussian. The advantage of our method is that this financial time series is processed without partitioning or removal of the extreme events, and the estimated white noise becomes almost Gaussian only as a result of the uncorrelation condition.
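
    A hedged sketch of the decomposition X_t = V_t Z_t on synthetic data (not the authors' estimator): the volatility is estimated by a rescaled moving average of |X_t|, and the whitened residuals can then be checked against the paper's uncorrelatedness condition on |Z_t|. The window length and the synthetic volatility profile are our assumptions.

```python
import math
import random

def lag1_autocorr(y):
    """Lag-1 sample autocorrelation."""
    n, m = len(y), sum(y) / len(y)
    num = sum((y[t] - m) * (y[t + 1] - m) for t in range(n - 1))
    return num / sum((v - m) ** 2 for v in y)

def estimate_volatility(x, window=51):
    """Centered moving average of |x|, rescaled by sqrt(pi/2) because a
    Gaussian Z has E|Z| = sqrt(2/pi), giving an approximately unbiased
    scale estimate."""
    h, n = window // 2, len(x)
    v = []
    for t in range(n):
        lo, hi = max(0, t - h), min(n, t + h + 1)
        v.append(math.sqrt(math.pi / 2) * sum(abs(u) for u in x[lo:hi]) / (hi - lo))
    return v

# synthetic data: slowly varying volatility modulating Gaussian white noise
rng = random.Random(0)
true_v = [1.0 + 0.5 * math.sin(2 * math.pi * t / 500) for t in range(4000)]
x = [v * rng.gauss(0.0, 1.0) for v in true_v]

est_v = estimate_volatility(x)
z_hat = [a / b for a, b in zip(x, est_v)]   # estimated white noise
```

    Before whitening, |x| is visibly autocorrelated because of the volatility modulation; after dividing out the estimated volatility, the lag-1 autocorrelation of |z_hat| drops close to zero, which is the condition the paper imposes.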

  15. A Financial Market Model Incorporating Herd Behaviour.

    PubMed

    Wray, Christopher M; Bishop, Steven R

    2016-01-01

    Herd behaviour in financial markets is a recurring phenomenon that exacerbates asset price volatility, and is considered a possible contributor to market fragility. While numerous studies investigate herd behaviour in financial markets, it is often considered without reference to the pricing of financial instruments or other market dynamics. Here, a trader interaction model based upon informational cascades in the presence of information thresholds is used to construct a new model of asset price returns that allows for both quiescent and herd-like regimes. Agent interaction is modelled using a stochastic pulse-coupled network, parametrised by information thresholds and a network coupling probability. Agents may possess either one or two information thresholds that, in each case, determine the number of distinct states an agent may occupy before trading takes place. In the case where agents possess two thresholds (labelled as the finite state-space model, corresponding to agents' accumulating information over a bounded state-space), and where coupling strength is maximal, an asymptotic expression for the cascade-size probability is derived and shown to follow a power law when a critical value of network coupling probability is attained. For a range of model parameters, a mixture of negative binomial distributions is used to approximate the cascade-size distribution. This approximation is subsequently used to express the volatility of model price returns in terms of the model parameter which controls the network coupling probability. In the case where agents possess a single pulse-coupling threshold (labelled as the semi-infinite state-space model corresponding to agents' accumulating information over an unbounded state-space), numerical evidence is presented that demonstrates volatility clustering and long-memory patterns in the volatility of asset returns. 
Finally, output from the model is compared to both the distribution of historical stock returns and the market price of an equity index option.

  16. Tracing the ingredients for a habitable earth from interstellar space through planet formation

    PubMed Central

    Bergin, Edwin A.; Blake, Geoffrey A.; Ciesla, Fred; Hirschmann, Marc M.; Li, Jie

    2015-01-01

    We use the C/N ratio as a monitor of the delivery of key ingredients of life to nascent terrestrial worlds. Total elemental C and N contents, and their ratio, are examined for the interstellar medium, comets, chondritic meteorites, and terrestrial planets; we include an updated estimate for the bulk silicate Earth (C/N = 49.0 ± 9.3). Using a kinetic model of disk chemistry, and the sublimation/condensation temperatures of primitive molecules, we suggest that organic ices and macromolecular (refractory or carbonaceous dust) organic material are the likely initial C and N carriers. Chemical reactions in the disk can produce nebular C/N ratios of ∼1–12, comparable to those of comets and the low end estimated for planetesimals. An increase of the C/N ratio is traced between volatile-rich pristine bodies and larger volatile-depleted objects subjected to thermal/accretional metamorphism. The C/N ratios of the dominant materials accreted to terrestrial planets should therefore be higher than those seen in carbonaceous chondrites or comets. During planetary formation, we explore scenarios leading to further volatile loss and associated C/N variations owing to core formation and atmospheric escape. Key processes include relative enrichment of nitrogen in the atmosphere and preferential sequestration of carbon by the core. The high C/N bulk silicate Earth ratio therefore is best satisfied by accretion of thermally processed objects followed by large-scale atmospheric loss. These two effects must be more profound if volatile sequestration in the core is effective. The stochastic nature of these processes hints that the surface/atmospheric abundances of biosphere-essential materials will likely be variable. PMID:26150527

  17. Tracing the ingredients for a habitable earth from interstellar space through planet formation.

    PubMed

    Bergin, Edwin A; Blake, Geoffrey A; Ciesla, Fred; Hirschmann, Marc M; Li, Jie

    2015-07-21

    We use the C/N ratio as a monitor of the delivery of key ingredients of life to nascent terrestrial worlds. Total elemental C and N contents, and their ratio, are examined for the interstellar medium, comets, chondritic meteorites, and terrestrial planets; we include an updated estimate for the bulk silicate Earth (C/N = 49.0 ± 9.3). Using a kinetic model of disk chemistry, and the sublimation/condensation temperatures of primitive molecules, we suggest that organic ices and macromolecular (refractory or carbonaceous dust) organic material are the likely initial C and N carriers. Chemical reactions in the disk can produce nebular C/N ratios of ∼1-12, comparable to those of comets and the low end estimated for planetesimals. An increase of the C/N ratio is traced between volatile-rich pristine bodies and larger volatile-depleted objects subjected to thermal/accretional metamorphism. The C/N ratios of the dominant materials accreted to terrestrial planets should therefore be higher than those seen in carbonaceous chondrites or comets. During planetary formation, we explore scenarios leading to further volatile loss and associated C/N variations owing to core formation and atmospheric escape. Key processes include relative enrichment of nitrogen in the atmosphere and preferential sequestration of carbon by the core. The high C/N bulk silicate Earth ratio therefore is best satisfied by accretion of thermally processed objects followed by large-scale atmospheric loss. These two effects must be more profound if volatile sequestration in the core is effective. The stochastic nature of these processes hints that the surface/atmospheric abundances of biosphere-essential materials will likely be variable.

  18. Late Veneer collisions and their impact on the evolution of Venus (PS Division Outstanding ECS Award Lecture)

    NASA Astrophysics Data System (ADS)

    Gillmann, Cedric; Golabek, Gregor; Tackley, Paul; Raymond, Sean

    2017-04-01

    During the end of accretion, the so-called Late Veneer phase, while the bulk of the mass of terrestrial planets is already in place, a substantial number of large collisions can still occur. Those impacts are thought to be responsible for the distribution of the highly siderophile elements. They are also likely to have a strong effect on volatile distribution and mantle convection. We study how Late Veneer impacts modify the evolution of Venus and its atmosphere, using a coupled numerical simulation. We focus on volatile exchanges and their effects on surface conditions. Mantle dynamics, volcanism and degassing processes lead to an input of gases into the atmosphere and are modeled using the StagYY mantle convection code. Volatile losses are estimated through atmospheric escape modeling, which involves two different aspects: hydrodynamic escape (0-500 Myr) and non-thermal escape. Hydrodynamic escape is massive but occurs only when the solar energy input is strong. Post-4 Ga escape from non-thermal processes is comparatively low but long-lived. The resulting state of the atmosphere is used to calculate the greenhouse effect and surface temperature, through a one-dimensional gray radiative-convective model. Large impacts are capable of contributing to (i) atmospheric escape, (ii) volatile replenishment and (iii) energy transfer to the mantle. We test various impactor compositions, impact parameters (velocity, location, size, and timing) and eroding power. The scenarios we tested are adapted from numerical stochastic simulations (Raymond et al., 2013). Impactor sizes are dominated by large bodies (R > 500 km). Erosion of the atmosphere by a few large impacts appears limited; swarms of smaller, more mass-effective impactors seem required for this effect to be significant. Large impactors have two main effects on the atmosphere. They can (i) create a large input of volatiles from the melting they cause during the impact and through the volatiles they carry. This leads to an increase in atmospheric density and surface temperatures. However, early impacts can also (ii) deplete the mantle of Venus and (assuming strong early escape) ultimately remove volatiles from the system, leading to lower late degassing and lower surface temperatures. The competition between those effects depends on the time of the impact, which directly governs the strength of atmospheric losses.

  19. Stochastic Models for Precipitable Water in Convection

    NASA Astrophysics Data System (ADS)

    Leung, Kimberly

    Atmospheric precipitable water vapor (PWV) is the amount of water vapor in the atmosphere within a vertical column of unit cross-sectional area and is a critically important parameter of precipitation processes. However, accurate high-frequency and long-term observations of PWV in the sky were impossible until the availability of modern instruments such as radar. The United States Department of Energy (DOE)'s Atmospheric Radiation Measurement (ARM) Program facility has made the first systematic and high-resolution observations of PWV at Darwin, Australia, since 2002. At a resolution of 20 seconds, this time series allowed us to examine the volatility of PWV, including fractal behavior with dimension equal to 1.9, higher than the Brownian motion dimension of 1.5. Such strong fractal behavior calls for stochastic differential equation modeling in an attempt to address some of the difficulties of convective parameterization in various kinds of climate models, ranging from general circulation models (GCM) to Weather Research and Forecasting (WRF) models. These high-resolution observations capture the fractal behavior of PWV and enable stochastic exploration in the next generation of climate models, which consider scales from micrometers to thousands of kilometers. As a first step, this thesis explores a simple stochastic differential equation model of water mass balance for PWV and assesses the accuracy, robustness, and sensitivity of the stochastic model. A 1000-day simulation allows for the determination of the best-fitting 25-day period as compared to data from the TWP-ICE field campaign conducted out of Darwin, Australia in early 2006. The observed data and this portion of the simulation had a correlation coefficient of 0.6513 and followed similar statistics and low-resolution temporal trends.
Building on the point model foundation, a similar algorithm was applied to the National Center for Atmospheric Research (NCAR)'s existing single-column model as a test-of-concept for eventual inclusion in a general circulation model. The stochastic scheme was designed to be coupled with the deterministic single-column simulation by modifying results of the existing convective scheme (Zhang-McFarlane) and was able to produce a 20-second resolution time series that effectively simulated observed PWV, as measured by correlation coefficient (0.5510), fractal dimension (1.9), statistics, and visual examination of temporal trends.

  20. Prediction of passenger ride quality in a multifactor environment

    NASA Technical Reports Server (NTRS)

    Dempsey, T. K.; Leatherwood, J. D.

    1976-01-01

    A model under development permits the understanding and prediction of passenger discomfort in a multifactor environment, with particular emphasis on combined noise and vibration. The model has general applicability to diverse transportation systems and provides a means of developing ride quality design criteria as well as a diagnostic tool for identifying the vibration and/or noise stimuli causing discomfort. Presented are: (1) a review of the basic theoretical and mathematical computations associated with the model; (2) a discussion of methodological and criteria investigations for both the vertical and roll axes of vibration; (3) a description of within-axis masking of discomfort responses for the vertical axis, thereby allowing prediction of the total discomfort due to any random vertical vibration; (4) a discussion of initial data on between-axis masking; and (5) a discussion of a study directed toward extension of the vibration model to the more general case of predicting ride quality in combined noise and vibration environments.

  1. An overview of a multifactor-system theory of personality and individual differences: III. Life span development and the heredity-environment issue.

    PubMed

    Powell, A; Royce, J R

    1981-12-01

    In Part III of this three-part series on multifactor-system theory, multivariate, life-span development is approached from the standpoint of a quantitative and qualitative analysis of the ontogenesis of factors in each of the six systems. The pattern of quantitative development (described via the Gompertz equation and three developmental parameters) involves growth, stability, and decline, and qualitative development involves changes in the organization of factors (e.g., factor differentiation and convergence). Hereditary and environmental sources of variation are analyzed via the factor gene model and the concept of heredity-dominant factors, and the factor-learning model and environment-dominant factors. It is hypothesized that the sensory and motor systems are heredity dominant, that the style and value systems are environment dominant, and that the cognitive and affective systems are partially heredity dominant.

  2. Management of suicidal and self-harming behaviors in prisons: systematic literature review of evidence-based activities.

    PubMed

    Barker, Emma; Kõlves, Kairi; De Leo, Diego

    2014-01-01

    The purpose of this study was to systematically analyze existing literature testing the effectiveness of programs involving the management of suicidal and self-harming behaviors in prisons. For the study, 545 English-language articles published in peer reviewed journals were retrieved using the terms "suicid*," "prevent*," "prison," or "correctional facility" in SCOPUS, MEDLINE, PROQUEST, and Web of Knowledge. In total, 12 articles were relevant, with 6 involving multi-factored suicide prevention programs, and 2 involving peer focused programs. Others included changes to the referral and care of suicidal inmates, staff training, legislation changes, and a suicide prevention program for inmates with Borderline Personality Disorder. Multi-factored suicide prevention programs appear most effective in the prison environment. Using trained inmates to provide social support to suicidal inmates is promising. Staff attitudes toward training programs were generally positive.

  3. Application of the multifactor dimensionality reduction method in evaluation of the roles of multiple genes/enzymes in multidrug-resistant acquisition in Pseudomonas aeruginosa strains.

    PubMed

    Yao, Z; Peng, Y; Bi, J; Xie, C; Chen, X; Li, Y; Ye, X; Zhou, J

    2016-03-01

    Multidrug-resistant Pseudomonas aeruginosa (MDRPA) infections are major threats to healthcare-associated infection control and the intrinsic molecular mechanisms of MDRPA are also unclear. We examined 348 isolates of P. aeruginosa, including 188 MDRPA and 160 non-MDRPA, obtained from five tertiary-care hospitals in Guangzhou, China. Significant correlations were found between gene/enzyme carriage and increased rates of antimicrobial resistance (P < 0·01). gyrA mutation, OprD loss and metallo-β-lactamase (MBL) presence were identified as crucial molecular risk factors for MDRPA acquisition by a combination of univariate logistic regression and a multifactor dimensionality reduction approach. The MDRPA rate was also elevated with the increase in positive numbers of those three determinants (P < 0·001). Thus, gyrA mutation, OprD loss and MBL presence may serve as predictors for early screening of MDRPA infections in clinical settings.

  4. Mining nutrigenetics patterns related to obesity: use of parallel multifactor dimensionality reduction.

    PubMed

    Karayianni, Katerina N; Grimaldi, Keith A; Nikita, Konstantina S; Valavanis, Ioannis K

    2015-01-01

    This paper aims to elucidate the complex etiology of obesity by analysing data from a large nutrigenetics study, in which nutritional and genetic factors associated with obesity were recorded for around two thousand individuals. In our previous work, these data were analysed using artificial neural network methods, which identified optimised subsets of factors to predict one's obesity status. These methods did not, however, reveal how the selected factors interact with each other in the obtained predictive models. For that reason, parallel Multifactor Dimensionality Reduction (pMDR) was used here to further analyse the pre-selected subsets of nutrigenetic factors. Within pMDR, predictive models using up to eight factors were constructed, further reducing the input dimensionality, while rules describing the interactive effects of the selected factors were derived. In this way, it was possible to identify specific genetic variations and their interactive effects with particular nutritional factors, which are now under further study.

  5. The Research on Tunnel Surrounding Rock Classification Based on Geological Radar and Probability Theory

    NASA Astrophysics Data System (ADS)

    Xiao Yong, Zhao; Xin, Ji Yong; Shuang Ying, Zuo

    2018-03-01

    In order to effectively classify the surrounding rock of tunnels, a multi-factor surrounding rock classification method based on GPR and probability theory is proposed. Geological radar was used to identify the geology of the surrounding rock in front of the face and to evaluate the quality of the rock face. Based on previous survey data, the rock's uniaxial compressive strength, integrity index, fissuring and groundwater were selected as classification factors; probability theory combines them into a multi-factor classification method that assigns the surrounding rock to the class with the greatest probability. Applying this method to the surrounding rock of the Ma’anshan tunnel yields rock types that are basically the same as the actual ones, showing that this is a simple, efficient and practical rock classification method that can be used in tunnel construction.

  6. Spectroscopically Enhanced Method and System for Multi-Factor Biometric Authentication

    NASA Astrophysics Data System (ADS)

    Pishva, Davar

    This paper proposes a spectroscopic method and system for preventing the spoofing of biometric authentication. One of its aims is to enhance biometric authentication with a spectroscopic method in a multifactor manner, such that a person's unique ‘spectral signatures’ or ‘spectral factors’ are recorded and compared in addition to a non-spectroscopic biometric signature, to reduce the likelihood of an imposter being authenticated. Using the ‘spectral factors’ extracted from reflectance spectra of real fingers and employing cluster analysis, it shows how the authentic fingerprint image presented by a real finger can be distinguished from an authentic fingerprint image embossed on an artificial finger, or molded on a fingertip cover worn by an imposter. This paper also shows how to augment two widely used biometric systems (fingerprint and iris recognition devices) with spectral biometrics capabilities in a practical manner, without creating much overhead or inconveniencing their users.

  7. When human walking becomes random walking: fractal analysis and modeling of gait rhythm fluctuations

    NASA Astrophysics Data System (ADS)

    Hausdorff, Jeffrey M.; Ashkenazy, Yosef; Peng, Chang-K.; Ivanov, Plamen Ch.; Stanley, H. Eugene; Goldberger, Ary L.

    2001-12-01

    We present a random walk, fractal analysis of the stride-to-stride fluctuations in the human gait rhythm. The gait of healthy young adults is scale-free with long-range correlations extending over hundreds of strides. This fractal scaling changes characteristically with maturation in children and older adults and becomes almost completely uncorrelated with certain neurologic diseases. Stochastic modeling of the gait rhythm dynamics, based on transitions between different “neural centers”, reproduces distinctive statistical properties of the gait pattern. By tuning one model parameter, the hopping (transition) range, the model can describe alterations in gait dynamics from childhood to adulthood - including a decrease in the correlation and volatility exponents with maturation.

  8. Valuation of exotic options in the framework of Levy processes

    NASA Astrophysics Data System (ADS)

    Milev, Mariyan; Georgieva, Svetla; Markovska, Veneta

    2013-12-01

    In this paper we explore a straightforward procedure for pricing derivatives using the Monte Carlo approach when the underlying process is a jump-diffusion. We compare the Black-Scholes model with one of its extensions, the Merton model. The latter is better at capturing market phenomena and is comparable to stochastic volatility models in terms of pricing accuracy. We present simulations of asset paths and the pricing of barrier options for both geometric Brownian motion and exponential Levy processes, the latter in the concrete case of the Merton model. A desired level of accuracy is obtained with simple computer operations in MATLAB within efficient computational time.
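As a rough illustration of the procedure (sketched here in Python rather than the MATLAB used in the paper), the Merton jump-diffusion nests geometric Brownian motion when the jump intensity is zero; all parameter values below are hypothetical:

```python
import numpy as np

def merton_paths(s0, r, sigma, lam, mu_j, sig_j, T, n_steps, n_paths, rng):
    """Simulate exponential-Levy (Merton jump-diffusion) price paths.

    lam is the jump intensity; log-price jumps are N(mu_j, sig_j**2).
    Setting lam = 0 recovers geometric Brownian motion.
    """
    dt = T / n_steps
    kappa = np.exp(mu_j + 0.5 * sig_j ** 2) - 1.0       # jump compensator
    drift = (r - 0.5 * sigma ** 2 - lam * kappa) * dt   # risk-neutral drift
    z = rng.standard_normal((n_paths, n_steps))
    nj = rng.poisson(lam * dt, (n_paths, n_steps))      # jump counts per step
    jumps = mu_j * nj + sig_j * np.sqrt(nj) * rng.standard_normal((n_paths, n_steps))
    return s0 * np.exp(np.cumsum(drift + sigma * np.sqrt(dt) * z + jumps, axis=1))

def down_and_out_call(paths, s0, strike, barrier, r, T):
    """Discounted mean payoff of a down-and-out call, monitored on the grid."""
    path_min = np.minimum(s0, paths.min(axis=1))
    alive = path_min > barrier
    payoff = np.where(alive, np.maximum(paths[:, -1] - strike, 0.0), 0.0)
    return float(np.exp(-r * T) * payoff.mean())

rng = np.random.default_rng(42)
common = dict(s0=100.0, r=0.05, sigma=0.2, T=1.0, n_steps=252, n_paths=20_000, rng=rng)
gbm = merton_paths(lam=0.0, mu_j=0.0, sig_j=0.0, **common)
merton = merton_paths(lam=0.5, mu_j=-0.1, sig_j=0.15, **common)
price_gbm = down_and_out_call(gbm, 100.0, 100.0, 80.0, 0.05, 1.0)
price_merton = down_and_out_call(merton, 100.0, 100.0, 80.0, 0.05, 1.0)
```

With downward-biased jumps the knock-out barrier is hit more often, so the Merton price for the same contract is typically below the corresponding diffusion-only price.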

  9. Asymmetry in power-law magnitude correlations.

    PubMed

    Podobnik, Boris; Horvatić, Davor; Tenenbaum, Joel N; Stanley, H Eugene

    2009-07-01

    Time series of increments can be created in a number of different ways from a variety of physical phenomena. For example, in the phenomenon of volatility clustering, well known in finance, the magnitudes of adjacent increments are correlated. Moreover, in some time series, magnitude correlations display asymmetry with respect to an increment's sign: the magnitude |x_{i}| depends on the sign of the previous increment x_{i-1}. Here we define a model-independent test to measure the statistical significance of any observed asymmetry. We propose a simple stochastic process characterized by an asymmetry parameter lambda and a method for estimating lambda. We illustrate both the test and the process by analyzing physiological data.
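The abstract does not spell out the process, so the following is an illustrative stand-in, not the authors' specification: a series whose increment magnitudes are inflated by a factor (1 + lambda) after a negative increment, plus a simple conditional-magnitude statistic for detecting the asymmetry:

```python
import numpy as np

def asymmetric_series(n, lam, rng):
    """Toy process: each increment's magnitude is inflated by (1 + lam)
    whenever the previous increment was negative (illustrative stand-in)."""
    x = np.empty(n)
    x[0] = rng.standard_normal()
    for i in range(1, n):
        scale = 1.0 + lam if x[i - 1] < 0 else 1.0
        x[i] = scale * rng.standard_normal()
    return x

def asymmetry_stat(x):
    """Normalized difference of mean magnitudes conditioned on the previous sign.

    Near 0 for sign-symmetric magnitude correlations; statistical significance
    could be assessed against a surrogate (shuffled) distribution.
    """
    prev, mag = x[:-1], np.abs(x[1:])
    a_neg = mag[prev < 0].mean()
    a_pos = mag[prev > 0].mean()
    return (a_neg - a_pos) / (a_neg + a_pos)

rng = np.random.default_rng(1)
sym = asymmetry_stat(asymmetric_series(50_000, 0.0, rng))   # symmetric case
asym = asymmetry_stat(asymmetric_series(50_000, 0.5, rng))  # asymmetric case
```

For this toy process with lam = 0.5 the statistic settles near 0.5 / 2.5 = 0.2, while the symmetric case stays near zero.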

  10. Research on unit commitment with large-scale wind power connected power system

    NASA Astrophysics Data System (ADS)

    Jiao, Ran; Zhang, Baoqun; Chi, Zhongjun; Gong, Cheng; Ma, Longfei; Yang, Bing

    2017-01-01

    Large-scale integration of wind power generators into the power grid brings severe challenges to power system economic dispatch due to their stochastic volatility. Unit commitment including wind farms is analyzed in terms of both modeling and solution methods. The structures and characteristics of existing approaches are summarized after classifying them according to their objective functions and constraints. Finally, the issues still to be solved and possible directions of future research and development are discussed, which can adapt to the requirements of the electricity market, energy-saving generation dispatch, and the smart grid, and provide a reference for researchers and practitioners in this field.

  11. Estimation for time-changed self-similar stochastic processes

    NASA Astrophysics Data System (ADS)

    Arroum, W.; Jones, O. D.

    2005-12-01

    We consider processes of the form X(t) = X̃(θ(t)), where X̃ is a self-similar process with stationary increments and θ is a deterministic subordinator with a periodic activity function a = θ′ > 0. Such processes have been proposed as models for high-frequency financial data, such as currency exchange rates, where there are known to be daily and weekly periodic fluctuations in the volatility, captured here by the periodic activity function. We review an existing estimator for the activity function, then propose three new methods for estimating it and present some experimental studies of their performance. We finish with an application to some foreign exchange and FTSE100 futures data.
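A minimal simulation sketch of such a time-changed process, taking X̃ to be standard Brownian motion (self-similar with H = 1/2) and assuming a hypothetical sinusoidal activity function: the realized variance of increments, binned by phase, recovers the shape of a(t). This is only one crude estimator, not one of the paper's methods:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical periodic activity function a(t) > 0 with period 1 (a "daily" cycle).
def activity(t):
    return 1.0 + 0.8 * np.sin(2 * np.pi * t) ** 2

# Deterministic subordinator theta(t) = integral_0^t a(s) ds, via the trapezoid rule.
t = np.linspace(0.0, 10.0, 100_001)
theta = np.concatenate(
    [[0.0], np.cumsum(0.5 * (activity(t[1:]) + activity(t[:-1])) * np.diff(t))]
)

# X(t) = B(theta(t)) with B standard Brownian motion: increments over [t_i, t_{i+1}]
# are N(0, theta(t_{i+1}) - theta(t_i)), so volatility tracks the activity function.
x = np.concatenate(
    [[0.0], np.cumsum(rng.standard_normal(len(t) - 1) * np.sqrt(np.diff(theta)))]
)

# Crude estimator of a(t): realized variance of increments, binned by phase.
phase = t[:-1] % 1.0
bins = np.digitize(phase, np.linspace(0.0, 1.0, 11)) - 1
dt = np.diff(t).mean()
est = np.array([np.var(np.diff(x)[bins == b]) / dt for b in range(10)])
```

A more faithful version would replace B by a self-similar process with H ≠ 1/2, such as fractional Brownian motion.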

  12. Transformational, transactional, and passive-avoidant leadership characteristics of a surgical resident cohort: analysis using the multifactor leadership questionnaire and implications for improving surgical education curriculums.

    PubMed

    Horwitz, Irwin B; Horwitz, Sujin K; Daram, Pallavi; Brandt, Mary L; Brunicardi, F Charles; Awad, Samir S

    2008-07-01

    Leadership training is increasingly recognized as highly important to improving medical care and should be included in surgical resident education curricula. Surgical residents (n = 65) completed the 5X-Short version of the Multifactor Leadership Questionnaire as a means of identifying the leadership areas most in need of training among medical residents. The residents' leadership styles were measured on 12 leadership scales. Comparisons by gender and postgraduate year (PGY), and comparisons to national norms, were conducted. Of the 12 leadership scales, the residents as a whole had significantly higher management-by-exception active and passive scores than the national norm (t = 6.6, P < 0.01 and t = 2.8, P < 0.01, respectively), and significantly lower individualized consideration scores than the norm (t = 2.7, P < 0.01). Only one score, management-by-exception active, was statistically different between genders, being higher among males than females (t = 2.12, P < 0.05). PGY3-5 residents had significantly lower laissez-faire scores than PGY1-2 residents (t = 2.20, P < 0.05). Principal component analysis revealed two leadership factors with eigenvalues over 1.0. Hierarchical regression found evidence of an augmentation effect for transformational leadership. Areas of resident leadership strength and weakness were identified. The Multifactor Leadership Questionnaire was demonstrated to be a valuable tool for identifying the specific areas where leadership training would be most beneficial in the educational curriculum. The future use of this instrument could prove valuable to surgical education training programs.

  13. Development and validation of a multifactor mindfulness scale in youth: The Comprehensive Inventory of Mindfulness Experiences-Adolescents (CHIME-A).

    PubMed

    Johnson, Catherine; Burke, Christine; Brinkman, Sally; Wade, Tracey

    2017-03-01

    Mindfulness-based interventions show consistent benefits in adults for a range of pathologies, but the exploration of these approaches in youth is an emergent field, with limited measures of mindfulness for this population. This study aimed to investigate whether multifactor scales of mindfulness can be used in adolescents. A series of studies is presented assessing the performance of a recently developed adult measure, the Comprehensive Inventory of Mindfulness Experiences (CHIME), in 4 early adolescent samples. Study 1 investigated how well the full adult measure (37 items) was understood by youth (N = 292). Study 2 piloted a revision of the items in child-friendly language with a small group (N = 48). The refined questionnaire for adolescents (CHIME-A) was then tested in Study 3 in a larger sample (N = 461) and subjected to exploratory factor analysis and a range of external validity measures. Study 4 was a confirmatory factor analysis in a new sample (N = 498) with additional external validity measures. Study 5 tested temporal stability (N = 120). Results supported an 8-factor, 25-item measure of mindfulness in adolescents, with excellent model fit indices and sound internal consistency for the 8 subscales. Although the confirmatory factor analysis supported an overarching factor, the internal reliability of a combined total score was poor. The development of a multifactor measure represents a first step toward testing developmental models of mindfulness in young people. This in turn will aid the construction of evidence-based interventions that are not simply downward derivations of adult mindfulness programs. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  14. A detailed view on Model-Based Multifactor Dimensionality Reduction for detecting gene-gene interactions in case-control data in the absence and presence of noise

    PubMed Central

    CATTAERT, TOM; CALLE, M. LUZ; DUDEK, SCOTT M.; MAHACHIE JOHN, JESTINAH M.; VAN LISHOUT, FRANÇOIS; URREA, VICTOR; RITCHIE, MARYLYN D.; VAN STEEN, KRISTEL

    2010-01-01

    Analyzing the combined effects of genes and/or environmental factors on the development of complex diseases is a great challenge from both the statistical and computational perspectives, even with a relatively small number of genetic and non-genetic exposures. Several data mining methods have been proposed for interaction analysis, among them the Multifactor Dimensionality Reduction method (MDR), which has proven its utility in a variety of theoretical and practical settings. Model-Based Multifactor Dimensionality Reduction (MB-MDR), a relatively new MDR-based technique that unifies the best of both the non-parametric and parametric worlds, was developed to address some of the remaining concerns of an MDR analysis. These include the restriction to univariate, dichotomous traits, the absence of flexible ways to adjust for lower-order effects and important confounders, and the difficulty of highlighting epistasis effects when too many multi-locus genotype cells are pooled into two new genotype groups. Whereas the true value of MB-MDR can only reveal itself through extensive applications of the method in a variety of real-life scenarios, here we investigate the empirical power of MB-MDR to detect gene-gene interactions in the absence of any noise and in the presence of genotyping error, missing data, phenocopy, and genetic heterogeneity. For the considered simulation settings, we show that the power is generally higher for MB-MDR than for MDR, in particular in the presence of genetic heterogeneity, phenocopy, or low minor allele frequencies. PMID:21158747

  15. Mixed H2/H∞ pitch control of wind turbine with a Markovian jump model

    NASA Astrophysics Data System (ADS)

    Lin, Zhongwei; Liu, Jizhen; Wu, Qiuwei; Niu, Yuguang

    2018-01-01

    This paper proposes a Markovian jump model and the corresponding H2/H∞ control strategy for a wind turbine driven by stochastically switching wind speed, which can be used to regulate the generator speed in order to harvest the rated power while reducing the fatigue loads on the mechanical side of the wind turbine. By sampling the low-frequency wind speed data into separate intervals, the stochastic characteristic of the steady wind speed can be represented as a Markov process, while the high-frequency wind speed in each interval is regarded as the disturbance input. The traditional operating points of the wind turbine can then be divided into corresponding subregions, within each of which the model parameters and the control mode are fixed. The mixed H2/H∞ control problem is then discussed for this class of Markovian jump wind turbine working above the rated wind speed, guaranteeing both the disturbance-rejection and mechanical-load objectives, which can efficiently reduce the power volatility and the generator torque fluctuation of the whole transmission mechanism. Simulation results for a 2 MW wind turbine show the effectiveness of the proposed method.
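The regime part of such a model can be sketched as a finite-state Markov chain over wind-speed intervals; the states and transition matrix below are made up for illustration and are not taken from the paper:

```python
import numpy as np

# Hypothetical 3-state regime chain for the low-frequency ("steady") wind speed;
# states are interval midpoints in m/s and P is a made-up transition matrix.
states = np.array([13.0, 16.0, 19.0])
P = np.array([[0.90, 0.08, 0.02],
              [0.10, 0.80, 0.10],
              [0.05, 0.15, 0.80]])

def sample_regimes(P, n, rng, s0=0):
    """Sample a state path of the Markov regime process."""
    path = np.empty(n, dtype=int)
    path[0] = s0
    for k in range(1, n):
        path[k] = rng.choice(len(P), p=P[path[k - 1]])
    return path

rng = np.random.default_rng(3)
path = sample_regimes(P, 50_000, rng)
empirical = np.bincount(path, minlength=3) / len(path)

# Stationary distribution: left eigenvector of P for eigenvalue 1, normalized.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi = pi / pi.sum()
```

In the paper's setting, a controller mode would be attached to each regime state, with the high-frequency wind component treated as a disturbance input within the regime.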

  16. Evaluation of Foreign Investment in Power Plants using Real Options

    NASA Astrophysics Data System (ADS)

    Kato, Moritoshi; Zhou, Yicheng

    This paper proposes new methods for evaluating foreign investment in power plants under market uncertainty using a real options approach. We consider a thermal power plant project in a deregulated electricity market. One proposed method is to calculate the cash flow generated by the project in a reference year using actual market data, thereby incorporating the periodic characteristics of energy prices into a yearly cash flow model. We build the stochastic yearly cash flow model with an initial value equal to the cash flow in the reference year and a given trend and volatility, and then calculate the real options value (ROV) of the project, which includes abandonment options, using the yearly cash flow model. Another proposed method is to evaluate foreign/domestic currency exchange rate risk by representing the ROV in the foreign currency as a yearly payoff and converting it to the ROV in the domestic currency using a stochastic exchange rate model. We analyze the effect of the power plant's heat rate and operation and maintenance costs on the ROV, and evaluate exchange rate risk through numerical examples. The proposed methods should be useful for the risk management of foreign investment in power plants.

  17. Kernel methods and flexible inference for complex stochastic dynamics

    NASA Astrophysics Data System (ADS)

    Capobianco, Enrico

    2008-07-01

    Approximation theory suggests that series expansions and projections represent standard tools for random process applications from both numerical and statistical standpoints. Such instruments emphasize the role of both sparsity and smoothness for compression purposes, the decorrelation power achieved in the expansion coefficients space compared to the signal space, and the reproducing kernel property when some special conditions are met. We consider these three aspects central to the discussion in this paper, and attempt to analyze the characteristics of some known approximation instruments employed in a complex application domain such as financial market time series. Volatility models are often built ad hoc, parametrically and through very sophisticated methodologies. But they can hardly deal with stochastic processes with regard to non-Gaussianity, covariance non-stationarity or complex dependence without paying a big price in terms of either model mis-specification or computational efficiency. It is thus a good idea to look at other more flexible inference tools; hence the strategy of combining greedy approximation and space dimensionality reduction techniques, which are less dependent on distributional assumptions and more targeted to achieve computationally efficient performances. Advantages and limitations of their use will be evaluated by looking at algorithmic and model building strategies, and by reporting statistical diagnostics.

  18. Confounding Problems in Multifactor AOV When Using Several Organismic Variables of Limited Reliability

    ERIC Educational Resources Information Center

    Games, Paul A.

    1975-01-01

    A brief introduction is presented on how multiple regression and linear model techniques can handle data analysis situations that most educators and psychologists think of as appropriate for analysis of variance. (Author/BJG)

  19. Scaling symmetry, renormalization, and time series modeling: the case of financial assets dynamics.

    PubMed

    Zamparo, Marco; Baldovin, Fulvio; Caraglio, Michele; Stella, Attilio L

    2013-12-01

    We present and discuss a stochastic model of financial assets dynamics based on the idea of an inverse renormalization group strategy. With this strategy we construct the multivariate distributions of elementary returns based on the scaling with time of the probability density of their aggregates. In its simplest version the model is the product of an endogenous autoregressive component and a random rescaling factor designed to embody also exogenous influences. Mathematical properties like increments' stationarity and ergodicity can be proven. Thanks to the relatively low number of parameters, model calibration can be conveniently based on a method of moments, as exemplified in the case of historical data of the S&P500 index. The calibrated model accounts very well for many stylized facts, like volatility clustering, power-law decay of the volatility autocorrelation function, and multiscaling with time of the aggregated return distribution. In agreement with empirical evidence in finance, the dynamics is not invariant under time reversal, and, with suitable generalizations, skewness of the return distribution and leverage effects can be included. The analytical tractability of the model opens interesting perspectives for applications, for instance, in terms of obtaining closed formulas for derivative pricing. Further important features are the possibility of making contact, in certain limits, with autoregressive models widely used in finance and the possibility of partially resolving the long- and short-memory components of the volatility, with consistent results when applied to historical series.

  1. Connecting to HPC Systems | High-Performance Computing | NREL

    Science.gov Websites

    Connect using one of the following methods, which use multi-factor authentication; you will first need to set up multi-factor authentication. If you just need access to a command line on an HPC system, use one of the following methods.

  2. Outdoor Leaders' Emotional Intelligence and Transformational Leadership

    ERIC Educational Resources Information Center

    Hayashi, Aya; Ewert, Alan

    2006-01-01

    This study explored the concept of outdoor leadership from the perspectives of emotional intelligence and transformational leadership. Levels of emotional intelligence, multifactor leadership, outdoor experience, and social desirability were examined using 46 individuals designated as outdoor leaders. The results revealed a number of unique…

  3. 75 FR 67776 - Comment Request; Review of Productivity Statistics

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-03

    ... DEPARTMENT OF LABOR Bureau of Labor Statistics Comment Request; Review of Productivity Statistics... Statistics (BLS) is responsible for publishing measures of labor productivity and multifactor productivity..., Office of Productivity and Technology, Bureau of Labor Statistics, Room 2150, 2 Massachusetts Avenue, NE...

  4. Bunched black (and grouped grey) swans: Dissipative and non-dissipative models of correlated extreme fluctuations in complex geosystems

    NASA Astrophysics Data System (ADS)

    Watkins, N. W.

    2013-01-01

    I review the hierarchy of approaches to complex systems, focusing particularly on stochastic equations. I discuss how the main models advocated by the late Benoit Mandelbrot fit into this classification, and how they continue to contribute to cross-disciplinary approaches to the increasingly important problems of correlated extreme events and unresolved scales. The ideas have broad importance, with applications ranging across science areas as diverse as the heavy tailed distributions of intense rainfall in hydrology, after which Mandelbrot named the "Noah effect"; the problem of correlated runs of dry summers in climate, after which the "Joseph effect" was named; and the intermittent, bursty, volatility seen in finance and fluid turbulence.

  5. You Have What? Personality! Traits That Predict Leadership Styles for Elementary Administrators

    ERIC Educational Resources Information Center

    Garcia, Melinda

    2013-01-01

    This research explored relationships between followers' perceptions of elementary school principals' Big Five Personality Traits, using the "International Personality Item Pool" (IPIP) (Goldberg, 1999), and principals' Leadership Styles, using the "Multi-factor Leadership Questionnaire" (MLQ) (Bass & Avolio, 2004). A sample…

  6. Understanding the Supplemental Instruction Leader

    ERIC Educational Resources Information Center

    James, Adrian; Moore, Lori

    2018-01-01

    This article explored the learning styles and leadership styles of Supplemental Instruction (SI) leaders at Texas A&M University, and the impact of those preferences on recurring attendance to their sessions. The Learning Style Inventory, the Multifactor Leadership Questionnaire, and a demographic instrument were administered to SI leaders…

  7. Factorial Design: An Eight Factor Experiment Using Paper Helicopters

    NASA Technical Reports Server (NTRS)

    Kozma, Michael

    1996-01-01

    The goal of this paper is to present the analysis of the multi-factor experiment (factorial design) conducted in EG490, Junior Design, at Loyola College in Maryland. The discussion concludes the experimental analysis and ties the individual class papers together.

  8. The Multiple Component Alternative for Gifted Education.

    ERIC Educational Resources Information Center

    Swassing, Ray

    1984-01-01

    The Multiple Component Model (MCM) of gifted education includes instruction which may overlap in literature, history, art, enrichment, languages, science, physics, math, music, and dance. The model rests on multifactored identification and requires systematic development and selection of components with ongoing feedback and evaluation. (CL)

  9. Multilinear Graph Embedding: Representation and Regularization for Images.

    PubMed

    Chen, Yi-Lei; Hsu, Chiou-Ting

    2014-02-01

    Given a set of images, finding a compact and discriminative representation remains a big challenge, especially when multiple latent factors are hidden in the way the data are generated. To represent multifactor images, multilinear models are widely used to parameterize the data, but most methods are based on high-order singular value decomposition (HOSVD), which preserves global statistics but interprets local variations inadequately. To this end, we propose a novel method, called multilinear graph embedding (MGE), as well as its kernelization MKGE, to leverage manifold learning techniques in multilinear models. Our method theoretically links linear, nonlinear, and multilinear dimensionality reduction. We also show that supervised MGE encodes informative image priors for image regularization, provided that an image is represented as a high-order tensor. In our experiments on face and gait recognition, the superior performance demonstrates that MGE represents multifactor images better than classic methods, including HOSVD and its variants. In addition, the significant improvement in image (or tensor) completion validates the potential of MGE for image regularization.

  10. A Simple and Computationally Efficient Sampling Approach to Covariate Adjustment for Multifactor Dimensionality Reduction Analysis of Epistasis

    PubMed Central

    Gui, Jiang; Andrew, Angeline S.; Andrews, Peter; Nelson, Heather M.; Kelsey, Karl T.; Karagas, Margaret R.; Moore, Jason H.

    2010-01-01

    Epistasis or gene-gene interaction is a fundamental component of the genetic architecture of complex traits such as disease susceptibility. Multifactor dimensionality reduction (MDR) was developed as a nonparametric and model-free method to detect epistasis when there are no significant marginal genetic effects. However, in many studies of complex disease, other covariates like age of onset and smoking status could have a strong main effect and may potentially interfere with MDR's ability to achieve its goal. In this paper, we present a simple and computationally efficient sampling method to adjust for covariate effects in MDR. We use simulation to show that after adjustment, MDR has sufficient power to detect true gene-gene interactions. We also compare our method with the state-of-the-art technique for covariate adjustment. The results suggest that our proposed method performs similarly but is more computationally efficient. We then apply this new method to an analysis of a population-based bladder cancer study in New Hampshire. PMID:20924193
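The MDR step that the adjustment builds on can be sketched as follows: collapse each multi-locus genotype cell into high or low risk by its case:control ratio. The toy version below (hypothetical data, no cross-validation or covariate sampling) shows only this constructive-induction core, not the authors' adjustment procedure:

```python
import numpy as np

def mdr_classify(genotypes, case, threshold=1.0):
    """Collapse multi-locus genotype cells into high (1) / low (0) risk.

    A cell is high risk when its case:control ratio is at least `threshold`.
    This is only the constructive-induction core of MDR, without the
    cross-validation, model selection, or covariate adjustment of the
    full method.
    """
    combos = [tuple(g) for g in genotypes]
    counts = {}
    for c, y in zip(combos, case):
        cases, ctrls = counts.get(c, (0, 0))
        counts[c] = (cases + int(y), ctrls + 1 - int(y))
    high = {c for c, (ca, co) in counts.items() if ca >= threshold * max(co, 1)}
    return np.array([1 if c in high else 0 for c in combos])

# Toy data: risk is elevated only when both SNPs carry a minor allele
# (an interaction pattern); all numbers are invented for illustration.
rng = np.random.default_rng(5)
g = rng.integers(0, 3, size=(2_000, 2))          # genotypes coded 0/1/2
p = np.where((g[:, 0] > 0) & (g[:, 1] > 0), 0.8, 0.2)
y = rng.binomial(1, p)
pred = mdr_classify(g, y)
accuracy = float(np.mean(pred == y))
```

Because the high-risk label reproduces the cells with elevated case rates, the toy classifier recovers most of the simulated interaction signal.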

  11. Research on accuracy analysis of laser transmission system based on Zemax and Matlab

    NASA Astrophysics Data System (ADS)

    Chen, Haiping; Liu, Changchun; Ye, Haixian; Xiong, Zhao; Cao, Tingfen

    2017-05-01

    Laser transmission systems are important in high-power solid-state laser facilities; their function is to transfer and focus the light beam in accordance with the physical function of the facility. The system is mainly composed of transmission mirror modules and a wedge lens module. To realize precision alignment, the required alignment accuracy of the system must be decomposed into allowable calibration-error ranges for each module. The traditional method is to analyze the error factors of the modules separately and then combine them linearly to obtain the influence of multiple modules and factors. To analyze the effect of each module's alignment error on the beam center and focus more accurately, this paper combines Monte Carlo random trials with ray tracing to analyze the influence of multiple modules and factors on the beam center, and to evaluate and optimize the results of the accuracy decomposition.

  12. A Comparative Study on Multifactor Dimensionality Reduction Methods for Detecting Gene-Gene Interactions with the Survival Phenotype

    PubMed Central

    Lee, Seungyeoun; Kim, Yongkang; Kwon, Min-Seok; Park, Taesung

    2015-01-01

    Genome-wide association studies (GWAS) have extensively analyzed single-SNP effects on a wide variety of common and complex diseases and found many genetic variants associated with disease. However, a large portion of the genetic variants remains unexplained. This missing-heritability problem may be due to an analytical strategy that limits analyses to single SNPs. One possible approach to the missing-heritability problem is to identify multi-SNP effects or gene-gene interactions. The multifactor dimensionality reduction (MDR) method has been widely used to detect gene-gene interactions; based on constructive induction, it classifies high-dimensional genotype combinations into a one-dimensional variable with two attributes, high risk and low risk, for the case-control study. Many modifications of MDR have been proposed and have also been extended to the survival phenotype. In this study, we propose several extensions of MDR for the survival phenotype and compare the proposed extensions with earlier MDR methods through comprehensive simulation studies. PMID:26339630

  13. Probabilistic Material Strength Degradation Model for Inconel 718 Components Subjected to High Temperature, High-Cycle and Low-Cycle Mechanical Fatigue, Creep and Thermal Fatigue Effects

    NASA Technical Reports Server (NTRS)

    Bast, Callie C.; Boyce, Lola

    1995-01-01

    The development of a methodology for probabilistic material strength degradation is described. The probabilistic model, in the form of a postulated randomized multifactor equation, quantifies the uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. The model is embodied in the computer program PROMISS, which can include up to eighteen different effects. Presently, the model includes five effects that typically reduce lifetime strength: high temperature, high-cycle mechanical fatigue, low-cycle mechanical fatigue, creep, and thermal fatigue. Results, in the form of cumulative distribution functions, illustrate the sensitivity of lifetime strength to the current value of an effect. In addition, verification studies comparing predictions of high-cycle mechanical fatigue and high-temperature effects with experiments are presented. Results from this limited verification study strongly support representing material degradation by randomized multifactor interaction models.
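A commonly cited form of the multifactor interaction equation is S/S0 = prod_i ((Au_i - A_i)/(Au_i - A0_i))^a_i, where each effect has a reference value A0, an ultimate value Au, and an empirical exponent. The sketch below Monte Carlo samples two hypothetical effects to produce a distribution of lifetime strength; every number is made up and is not the PROMISS calibration:

```python
import numpy as np

def strength_ratio(A, A0, Au, a):
    """Multifactor interaction form: S/S0 = prod_i ((Au_i - A_i)/(Au_i - A0_i))**a_i."""
    return np.prod(((Au - A) / (Au - A0)) ** a, axis=-1)

rng = np.random.default_rng(0)
n = 10_000
# Two illustrative effects with randomized current values (hypothetical numbers).
A = np.column_stack([
    rng.normal(800.0, 40.0, n),   # current temperature (deg F), randomized
    rng.normal(1e5, 1e4, n),      # accumulated high-cycle fatigue cycles
])
A0 = np.array([70.0, 0.0])        # reference value of each effect
Au = np.array([2000.0, 1e7])      # ultimate value of each effect
a = np.array([0.5, 0.25])         # empirical exponents

ratios = np.sort(strength_ratio(A, A0, Au, a))  # empirical CDF of S/S0
p50 = ratios[n // 2]                            # median lifetime strength ratio
```

Sorting the sampled ratios yields the cumulative distribution function of lifetime strength, the form in which the paper reports its sensitivity results.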

  14. A Study on the Assessment of Multi-Factors Affecting Urban Floods Using Satellite Image: A Case Study in Nakdong Basin, S. Korea

    NASA Astrophysics Data System (ADS)

    Kwak, Youngjoo; Kondoh, Akihiko

    2010-05-01

    Floods are related to changes in socio-economic conditions and land use. Recently, floods have increased due to rapid urbanization and human activity in lowland areas; integrated management of the total basin system is therefore necessary for a secure society. Typhoon ‘Rusa’ swept through the eastern and southern parts of South Korea in 2002. This painful experience yielded valuable knowledge that can be used to mitigate future flood hazards. The purpose of this study is to construct digital maps of the multiple factors related to urban floods, concerning geomorphologic characteristics, land cover, and surface wetness. The parameters particularly consider the geomorphologic functional unit, geomorphologic parameters derived from a DEM (digital elevation model), and land use. The research area is the Nakdong River Basin in South Korea. As a result of a preliminary analysis of the Pusan area, a vulnerability map and the flood-prone areas can be extracted by applying spatial analysis in a GIS (geographic information system).

  15. A Financial Market Model Incorporating Herd Behaviour

    PubMed Central

    2016-01-01

    Herd behaviour in financial markets is a recurring phenomenon that exacerbates asset price volatility, and is considered a possible contributor to market fragility. While numerous studies investigate herd behaviour in financial markets, it is often considered without reference to the pricing of financial instruments or other market dynamics. Here, a trader interaction model based upon informational cascades in the presence of information thresholds is used to construct a new model of asset price returns that allows for both quiescent and herd-like regimes. Agent interaction is modelled using a stochastic pulse-coupled network, parametrised by information thresholds and a network coupling probability. Agents may possess either one or two information thresholds that, in each case, determine the number of distinct states an agent may occupy before trading takes place. In the case where agents possess two thresholds (labelled as the finite state-space model, corresponding to agents’ accumulating information over a bounded state-space), and where coupling strength is maximal, an asymptotic expression for the cascade-size probability is derived and shown to follow a power law when a critical value of network coupling probability is attained. For a range of model parameters, a mixture of negative binomial distributions is used to approximate the cascade-size distribution. This approximation is subsequently used to express the volatility of model price returns in terms of the model parameter which controls the network coupling probability. In the case where agents possess a single pulse-coupling threshold (labelled as the semi-infinite state-space model corresponding to agents’ accumulating information over an unbounded state-space), numerical evidence is presented that demonstrates volatility clustering and long-memory patterns in the volatility of asset returns. 
Finally, output from the model is compared to both the distribution of historical stock returns and the market price of an equity index option. PMID:27007236
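
    The pulse-coupled cascade mechanism described above can be sketched in a few lines. This is a deliberately simplified toy (fully connected network, a single information threshold, hypothetical parameter values), not the paper's model:

    ```python
    import random

    def simulate_cascades(n_agents=100, threshold=3, p_couple=0.05,
                          n_steps=5000, seed=0):
        """Toy pulse-coupled cascade model (illustrative only)."""
        rng = random.Random(seed)
        state = [0] * n_agents          # accumulated information per agent
        cascade_sizes = []
        for _ in range(n_steps):
            # exogenous information pulse delivered to one random agent
            queue = [rng.randrange(n_agents)]
            fired = set()
            while queue:
                i = queue.pop()
                if i in fired:          # an agent trades at most once per cascade
                    continue
                state[i] += 1
                if state[i] >= threshold:   # threshold reached: agent trades, resets
                    state[i] = 0
                    fired.add(i)
                    # pass pulses to coupled neighbours with probability p_couple
                    for j in range(n_agents):
                        if j != i and rng.random() < p_couple:
                            queue.append(j)
            if fired:
                cascade_sizes.append(len(fired))
        return cascade_sizes
    ```

    Plotting a histogram of the returned cascade sizes illustrates the kind of heavy-tailed cascade-size distribution that the paper approximates with a mixture of negative binomials.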

  16. An ensemble model of competitive multi-factor binding of the genome

    PubMed Central

    Wasson, Todd; Hartemink, Alexander J.

    2009-01-01

    Hundreds of different factors adorn the eukaryotic genome, binding to it in large number. These DNA binding factors (DBFs) include nucleosomes, transcription factors (TFs), and other proteins and protein complexes, such as the origin recognition complex (ORC). DBFs compete with one another for binding along the genome, yet many current models of genome binding do not consider different types of DBFs together simultaneously. Additionally, binding is a stochastic process that results in a continuum of binding probabilities at any position along the genome, but many current models tend to consider positions as being either binding sites or not. Here, we present a model that allows a multitude of DBFs, each at different concentrations, to compete with one another for binding sites along the genome. The result is an “occupancy profile,” a probabilistic description of the DNA occupancy of each factor at each position. We implement our model efficiently as the software package COMPETE. We demonstrate genome-wide and at specific loci how modeling nucleosome binding alters TF binding, and vice versa, and illustrate how factor concentration influences binding occupancy. Binding cooperativity between nearby TFs arises implicitly via mutual competition with nucleosomes. Our method applies not only to TFs, but also recapitulates known occupancy profiles of a well-studied replication origin with and without ORC binding. Importantly, the sequence preferences our model takes as input are derived from in vitro experiments. This ensures that the calculated occupancy profiles are the result of the forces of competition represented explicitly in our model and the inherent sequence affinities of the constituent DBFs. PMID:19720867
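
    The competition idea can be illustrated at a single binding site, where each DBF's statistical weight is its concentration times its affinity and the site may also remain empty. This single-position simplification (with hypothetical factor names and values) is far cruder than COMPETE's genome-wide ensemble, but it shows how raising one factor's concentration depresses the occupancy of its competitors:

    ```python
    def competitive_occupancy(conc_affinity):
        """Equilibrium occupancy of one site contested by several DBFs.

        conc_affinity: dict mapping factor name -> (concentration, affinity).
        The statistical weight of factor k bound is c_k * K_k, and the empty
        site has weight 1; occupancies are weights over the partition function.
        """
        weights = {f: c * k for f, (c, k) in conc_affinity.items()}
        z = 1.0 + sum(weights.values())   # partition function incl. empty state
        return {f: w / z for f, w in weights.items()}
    ```

    With hypothetical inputs such as `{"TF": (1.0, 4.0), "nucleosome": (1.0, 5.0)}`, doubling the nucleosome concentration lowers the TF occupancy, mirroring the mutual competition the abstract describes.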

  17. Usable Multi-factor Authentication and Risk-based Authorization

    DTIC Science & Technology

    2015-06-01

    acceptance. In the previous section we described user studies that explored risks perceived by individuals using online banking and credit card purchases... iTunes purchases. We note that the fingerprint scanners in the current experiment are very different from what would be available in future. However

  18. Exploring the Relationships between Principals' Life Experiences and Transformational Leadership Behaviours

    ERIC Educational Resources Information Center

    Nash, Steve; Bangert, Art

    2014-01-01

    The primary objective of this research study was to explore the relationships between principals' life experiences and their transformational leadership behaviours. Over 212 public school principals completed both the lifetime leadership inventory (LLI) and the multifactor leadership questionnaire (MLQ). Exploratory and confirmatory factor…

  19. Obesity, hypertension and genetic variation in the TIGER Study

    USDA-ARS?s Scientific Manuscript database

    Obesity and hypertension are multifactoral conditions in which the onset and severity of the conditions are influenced by the interplay of genetic and environmental factors. We hypothesize that multiple genes and environmental factors account for a significant amount of variation in BMI and blood pr...

  20. Emotional Intelligence and the Career Choice Process.

    ERIC Educational Resources Information Center

    Emmerling, Robert J.; Cherniss, Cary

    2003-01-01

    Emotional intelligence as conceptualized by Mayer and Salovey consists of perceiving emotions, using emotions to facilitate thoughts, understanding emotions, and managing emotions to enhance personal growth. The Multifactor Emotional Intelligence Scale has proven a valid and reliable measure that can be used to explore the implications of…

  1. Is there a genetic solution to bovine respiratory disease complex?

    USDA-ARS?s Scientific Manuscript database

    Bovine respiratory disease complex (BRDC) is a complex multi-factor disease, which increases costs and reduces revenue from feedlot cattle. Multiple stressors and pathogens (viral and bacterial) have been implicated in the etiology of BRDC, therefore multiple approaches will be needed to evaluate a...

  2. Probabilistic Multi-Factor Interaction Model for Complex Material Behavior

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2010-01-01

    Complex material behavior is represented by a single equation of product form to account for interaction among the various factors. The factors are selected by the physics of the problem and the environment that the model is to represent. For example, different factors will be required to represent temperature, moisture, erosion, corrosion, etc. It is important that the equation accurately represent the physics of the behavior in its entirety. The Multi-Factor Interaction Model (MFIM) is used to evaluate the divot weight (foam weight ejected) from the external launch tanks. The multi-factor equation has sufficient degrees of freedom to evaluate a large number of factors that may contribute to the divot ejection, and it accommodates all interactions by its product form. Each factor has an exponent that satisfies only two points - the initial and final points. The exponent describes a monotonic path from the initial condition to the final one. The exponent values are selected so that the described path makes sense in the absence of experimental data. In the present investigation, the data used were obtained by testing simulated specimens in launching conditions. Results show that the MFIM is an effective method of describing the divot weight ejected under the conditions investigated. The problem lies in how to represent the divot weight with a single equation. A unique solution to this problem is a multi-factor equation of product form. Each factor is of the form (1 - xi/xf)^ei, where xi is the initial value, usually at ambient conditions, xf the final value, and ei the exponent that makes the represented curve monotonic while meeting the initial and final values. The exponents are either evaluated from test data or by technical judgment. A minor disadvantage may be the selection of exponents in the absence of any empirical data. This form has been used successfully in describing the foam ejected in simulated space environmental conditions. 
Seven factors were required to represent the ejected foam. The exponents were evaluated by least squares method from experimental data. The equation is used and it can represent multiple factors in other problems as well; for example, evaluation of fatigue life, creep life, fracture toughness, and structural fracture, as well as optimization functions. The software is rather simplistic. Required inputs are initial value, final value, and an exponent for each factor. The number of factors is open-ended. The value is updated as each factor is evaluated. If a factor goes to zero, the previous value is used in the evaluation.
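
    A minimal sketch of the product-form evaluation, following the factor form quoted in the abstract; the zero-factor fallback is one reading of "the previous value is used":

    ```python
    def mfim(factors):
        """Multi-Factor Interaction Model product form (sketch).

        factors: list of (x_initial, x_final, exponent) tuples, one per factor;
        the model value is the product of (1 - x_i/x_f)**e_i terms.
        """
        value = 1.0
        for x_i, x_f, e in factors:
            term = 1.0 - x_i / x_f
            if term <= 0.0:   # factor has gone to zero: keep the previous value
                continue
            value *= term ** e
        return value
    ```

    The number of factors is open-ended, matching the abstract: each additional tuple simply multiplies in another term.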

  3. Examining Dimensions of Self-Efficacy for Writing

    ERIC Educational Resources Information Center

    Bruning, Roger; Dempsey, Michael; Kauffman, Douglas F.; McKim, Courtney; Zumbrunn, Sharon

    2013-01-01

    A multifactor perspective on writing self-efficacy was examined in 2 studies. Three factors were proposed--self-efficacy for writing ideation, writing conventions, and writing self-regulation--and a scale constructed to reflect these factors. In Study 1, middle school students (N = 697) completed the Self-Efficacy for Writing Scale (SEWS), along…

  4. Do Leadership Styles Influence Organizational Health? A Study in Educational Organizations

    ERIC Educational Resources Information Center

    Toprak, Mustafa; Inandi, Bulent; Colak, Ahmet Levent

    2015-01-01

    This research aims to investigate the effect of leadership styles of school principals on organizational health. Causal-comparative research model was used to analyze the relationships between leadership types and organizational health. For data collection, a Likert type Multifactor Leadership scale questionnaire and Organizational Health scale…

  5. Transformational Leadership and the Leadership Performance of Oregon Secondary School Principals

    ERIC Educational Resources Information Center

    Breaker, Jason Lee

    2009-01-01

    A study of 118 secondary school principals in Oregon was conducted to examine the relationship of transformational leadership to secondary school principals' leadership performance. This study measured the transformational leadership of secondary school principals in Oregon using the "Multifactor Leadership Questionnaire (5X-Short)"…

  6. Appropriate Use Policy | High-Performance Computing | NREL

    Science.gov Websites

    users of the National Renewable Energy Laboratory (NREL) High Performance Computing (HPC) resources government agency, National Laboratory, University, or private entity, the intellectual property terms (if issued a multifactor token which may be a physical token or a virtual token used with one-time password

  7. The Negative Testing Effect and Multifactor Account

    ERIC Educational Resources Information Center

    Peterson, Daniel J.; Mulligan, Neil W.

    2013-01-01

    Across 3 experiments, we investigated the factors that dictate when taking a test improves subsequent memory performance (the "testing effect"). In Experiment 1, participants retrieving a set of targets during a retrieval practice phase ultimately recalled fewer of those targets compared with a group of participants who studied the…

  8. Worker Traits Training Unit. MA Handbook No. 314.

    ERIC Educational Resources Information Center

    Manpower Administration (DOL), Washington, DC.

    This training unit provides persons involved in employment interviewing, vocational counseling, curriculum planning, and other manpower activities with a multifactor approach for obtaining information from an individual and relating the data to job requirements. It is intended to result in the development of the bridge between client potential and…

  9. Organizational Deviance and Multi-Factor Leadership

    ERIC Educational Resources Information Center

    Aksu, Ali

    2016-01-01

    Organizational deviant behaviors can be defined as behaviors that have deviated from standards and uncongenial to organization's expectations. When such behaviors have been thought to damage the organization, it can be said that reducing the deviation behaviors at minimum level is necessary for a healthy organization. The aim of this research is…

  10. A Multifactor Approach to Research in Instructional Technology.

    ERIC Educational Resources Information Center

    Ragan, Tillman J.

    In a field such as instructional design, explanations of educational outcomes must necessarily consider multiple input variables. To adequately understand the contribution made by the independent variables, it is helpful to have a visual conception of how the input variables interrelate. Two variable models are adequately represented by a two…

  11. Teacher Perceptions of Principals' Leadership Qualities: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Hauserman, Cal P.; Ivankova, Nataliya V.; Stick, Sheldon L.

    2013-01-01

    This mixed methods sequential explanatory study utilized the Multi-factor Leadership Questionnaire, responses to open-ended questions, and in-depth interviews to identify transformational leadership qualities that were present among principals in Alberta, Canada. The first quantitative phase consisted of a random sample of 135 schools (with…

  12. Recent trends in hardware security exploiting hybrid CMOS-resistive memory circuits

    NASA Astrophysics Data System (ADS)

    Sahay, Shubham; Suri, Manan

    2017-12-01

    This paper provides a comprehensive review of, and insight into, recent trends in the field of random number generator (RNG) and physically unclonable function (PUF) circuits implemented using different types of emerging resistive non-volatile memory (NVM) devices. We present a detailed review of hybrid RNG/PUF implementations based on the use of (i) Spin-Transfer Torque (STT-MRAM) and (ii) metal-oxide based (OxRAM) NVM devices. Various approaches to hybrid CMOS-NVM RNG/PUF circuits are considered, followed by a discussion of different nanoscale device phenomena. Certain nanoscale device phenomena (variability, stochasticity, etc.), which are otherwise undesirable for reliable memory and storage applications, form the basis for low-power and highly scalable RNG/PUF circuits. A detailed qualitative comparison and benchmarking of all implementations is performed.

  13. Insuring wind energy production

    NASA Astrophysics Data System (ADS)

    D'Amico, Guglielmo; Petroni, Filippo; Prattico, Flavio

    2017-02-01

    This paper presents an insurance contract that the supplier of wind energy may subscribe in order to immunize the production of electricity against the volatility of the wind speed process. The other party of the contract may be any dispatchable energy producer, like gas turbine or hydroelectric generator, which can supply the required energy in case of little or no wind. The adoption of a stochastic wind speed model allows the computation of the fair premium that the wind power supplier has to pay in order to hedge the risk of inadequate output of electricity at any time. Recursive type equations are obtained for the prospective mathematical reserves of the insurance contract and for their higher order moments. The model and the validity of the results are illustrated through a numerical example.
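
    As an illustration of how such a fair premium can be computed, the sketch below uses a crude independent two-state wind model and takes the Monte Carlo expectation of discounted payouts. All parameter values are hypothetical, and the paper itself works with a richer stochastic wind speed model and recursive reserve equations:

    ```python
    import random

    def fair_premium(p_low=0.3, shortfall_cost=100.0, horizon=24,
                     rate=0.0001, n_sims=20000, seed=1):
        """Monte Carlo fair premium for a wind-shortfall insurance contract.

        Illustrative two-state wind model: each hour is 'low wind' with
        probability p_low, in which case the insurer pays shortfall_cost;
        the fair premium is the expected discounted total payout.
        """
        rng = random.Random(seed)
        total = 0.0
        for _ in range(n_sims):
            payout = 0.0
            for t in range(horizon):
                if rng.random() < p_low:
                    payout += shortfall_cost * (1 + rate) ** (-t)
                # hours with sufficient wind cost the insurer nothing
            total += payout
        return total / n_sims
    ```

    With these toy numbers the premium is close to the closed-form expectation p_low x shortfall_cost x (sum of hourly discount factors), which is a quick sanity check on the simulation.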

  14. Regional variations in the diversity and predicted metabolic potential of benthic prokaryotes in coastal northern Zhejiang, East China Sea

    PubMed Central

    Wang, Kai; Ye, Xiansen; Zhang, Huajun; Chen, Heping; Zhang, Demin; Liu, Lian

    2016-01-01

    Knowledge about the drivers of benthic prokaryotic diversity and metabolic potential in interconnected coastal sediments at regional scales is limited. We collected surface sediments across six zones covering ~200 km in coastal northern Zhejiang, East China Sea and combined 16S rRNA gene sequencing, community-level metabolic prediction, and sediment physicochemical measurements to investigate variations in prokaryotic diversity and metabolic gene composition with geographic distance and under local environmental conditions. Geographic distance was the most influential factor in prokaryotic β-diversity compared with major environmental drivers, including temperature, sediment texture, acid-volatile sulfide, and water depth, but a large unexplained variation in community composition suggested the potential effects of unmeasured abiotic/biotic factors and stochastic processes. Moreover, prokaryotic assemblages showed a biogeographic provincialism across the zones. The predicted metabolic gene composition shifted similarly as taxonomic composition did. Acid-volatile sulfide was strongly correlated with variation in metabolic gene composition. Enrichments in the relative abundance of sulfate-reducing bacteria and genes relevant to dissimilatory sulfate reduction were observed and predicted, respectively, in the Yushan area. These results provide insights into the relative importance of geographic distance and environmental condition in driving benthic prokaryotic diversity in coastal areas and predict specific biogeochemically-relevant genes for future studies. PMID:27917954

  15. A Multi-Factor Analysis of Job Satisfaction among School Nurses

    ERIC Educational Resources Information Center

    Foley, Marcia; Lee, Julie; Wilson, Lori; Cureton, Virginia Young; Canham, Daryl

    2004-01-01

    Although job satisfaction has been widely studied among registered nurses working in traditional health care settings, little is known about the job-related values and perceptions of nurses working in school systems. Job satisfaction is linked to lower levels of job-related stress, burnout, and career abandonment among nurses. This study evaluated…

  16. Investigating Teachers' Organizational Socialization Levels and Perceptions about Leadership Styles of Their Principals

    ERIC Educational Resources Information Center

    Kadi, Aysegül

    2015-01-01

    The purpose of this study is to investigate teachers' organizational socialization levels and perceptions about leadership styles of their principals. Research was conducted with 361 teachers. Research design is determined as survey and correlational. Multi-Factor Leadership Scale originally was developed by Bass (1999) and adapted to Turkish…

  17. Authentic Leadership--Is It More than Emotional Intelligence?

    ERIC Educational Resources Information Center

    Duncan, Phyllis; Green, Mark; Gergen, Esther; Ecung, Wenonah

    2017-01-01

    One of the newest theories to gain widespread interest is authentic leadership. Part of the rationale for developing a model and subsequent instrument to measure authentic leadership was a concern that the more popular theory, the full range model of leadership and its instrument, the Multifactor Leadership Questionnaire (MLQ) (Bass & Avolio,…

  18. Emotional Enhancement Effect of Memory: Removing the Influence of Cognitive Factors

    ERIC Educational Resources Information Center

    Sommer, Tobias; Glascher, Jan; Moritz, Steffen; Buchel, Christian

    2008-01-01

    According to the modulation hypothesis, arousal is the crucial factor in the emotional enhancement of memory (EEM). However, the multifactor theory of the EEM recently proposed that cognitive characteristics of emotional stimuli, e.g., relatedness and distinctiveness, also play an important role. The current study aimed to investigate the…

  19. An ecological classification system for the central hardwoods region: The Hoosier National Forest

    Treesearch

    James E. Van Kley; George R. Parker

    1993-01-01

    This study, a multifactor ecological classification system, using vegetation, soil characteristics, and physiography, was developed for the landscape of the Hoosier National Forest in Southern Indiana. Measurements of ground flora, saplings, and canopy trees from selected stands older than 80 years were subjected to TWINSPAN classification and DECORANA ordination....

  20. Multifactor Screener in the 2000 National Health Interview Survey Cancer Control Supplement: Scoring Procedures

    Cancer.gov

    Scoring procedures were developed to convert a respondent's screener responses to estimates of individual dietary intake for percentage energy from fat, grams of fiber, and servings of fruits and vegetables, using USDA's 1994-96 Continuing Survey of Food Intakes of Individuals (CSFII 94-96) dietary recall data.

  1. Technical Notes on the Multifactor Method of Elementary School Closing.

    ERIC Educational Resources Information Center

    Puleo, Vincent T.

    This report provides preliminary technical information on a method for analyzing the factors involved in the closing of elementary schools. Included is a presentation of data and a brief discussion bearing on descriptive statistics, reliability, and validity. An intercorrelation matrix is also examined. The method employs 9 factors that have a…

  2. Motivating Peak Performance: Leadership Behaviors That Stimulate Employee Motivation and Performance

    ERIC Educational Resources Information Center

    Webb, Kerry

    2007-01-01

    The impact of leader behaviors on motivation levels of employees was examined in this study. Two hundred twenty-three vice presidents and chief officers from 104 member colleges and universities in the Council for Christian Colleges and Universities were sampled. Leaders were administered the Multifactor Leadership Questionnaire (MLQ-rater…

  3. Faculty Member Perceptions of Academic Leadership Styles at Private Colleges

    ERIC Educational Resources Information Center

    Gidman, Lori Kathleen

    2013-01-01

    The leadership style of academic leaders was studied through the eyes of faculty members. This empirical study looked at faculty perceptions of academic leadership with the use of a numerical survey as the basis for observation. Faculty members at six private liberal arts institutions completed the Multifactor Leadership Questionnaire (MLQ) in…

  4. The Relationship between School Principals' Leadership Styles and Collective Teacher Efficacy

    ERIC Educational Resources Information Center

    Akan, Durdagi

    2013-01-01

    This study aims to determine the relationship between school administrators' leadership styles and the collective teacher efficacy based on teachers' perceptions. In line with this objective, the multifactor leadership style scale and the collective teacher efficacy scale were applied on 223 teachers who were working in the province of Erzurum.…

  5. Predicting plant species diversity in a longleaf pine landscape

    Treesearch

    L. Katherine Kirkman; P. Charles Goebel; Brian J. Palik; Larry T. West

    2004-01-01

    In this study, we used a hierarchical, multifactor ecological classification system to examine how spatial patterns of biodiversity develop in one of the most species-rich ecosystems in North America, the fire-maintained longleaf pine-wiregrass ecosystem and associated depressional wetlands and riparian forests. Our goal was to determine which landscape features are...

  6. Goal Oriented and Risk Taking Behavior: The Roles of Multiple Systems for Caucasian and Arab-American Adolescents

    ERIC Educational Resources Information Center

    Tynan, Joshua J.; Somers, Cheryl L.; Gleason, Jamie H.; Markman, Barry S.; Yoon, Jina

    2015-01-01

    With Bronfenbrenner's (1977) ecological theory and other multifactor models (e.g. Pianta, 1999; Prinstein, Boergers, & Spirito, 2001) underlying this study design, the purpose was to examine, simultaneously, key variables in multiple life contexts (microsystem, mesosystem, exosystem levels) for their individual and combined roles in predicting…

  7. A Study of Secondary School Principals' Leadership Styles and School Dropout Rates

    ERIC Educational Resources Information Center

    Baggerly-Hinojosa, Barbara

    2012-01-01

    This study examined the relationship between the leadership styles of secondary school principals, measured by the self-report "Multifactor Leadership Questionnaire 5X short" (Bass & Avolio, 2000) and the school's dropout rates, as reported by the Texas Education Agency in the Academic Excellence Indicator System (AEIS) report while…

  8. Can Multifactor Models of Teaching Improve Teacher Effectiveness Measures?

    ERIC Educational Resources Information Center

    Lazarev, Valeriy; Newman, Denis

    2014-01-01

    NCLB waiver requirements have led to development of teacher evaluation systems, in which student growth is a significant component. Recent empirical research has been focusing on metrics of student growth--value-added scores in particular--and their relationship to other metrics. An extensive set of recent teacher-evaluation studies conducted by…

  9. Estimating multi-factor cumulative watershed effects on fish populations with an individual-based model

    Treesearch

    Bret C. Harvey; Steven F. Railsback

    2007-01-01

    While the concept of cumulative effects is prominent in legislation governing environmental management, the ability to estimate cumulative effects remains limited. One reason for this limitation is that important natural resources such as fish populations may exhibit complex responses to changes in environmental conditions, particularly to alteration of multiple...

  10. Bureaucratic Abuse and the False Dichotomy between Intentional and Unintentional Child Injuries.

    ERIC Educational Resources Information Center

    Kotch, Jonathan B.; And Others

    This paper examines the arbitrary distinctions between intentional and unintentional child injuries, noting that a careful review of the literature of both child abuse and unintentional child injury revealed similarities among the risk factors associated with the two outcomes. A single, multifactor model of injury etiology, the ecologic model, is…

  11. Computational simulation of coupled material degradation processes for probabilistic lifetime strength of aerospace materials

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Bast, Callie C.

    1992-01-01

    The research included ongoing development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic material strength degradation model, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects or primitive variables. These primitive variables may include high temperature, fatigue or creep. In most cases, strength is reduced as a result of the action of a variable. This multifactor interaction strength degradation equation has been randomized and is included in the computer program PROMISS. Also included in the research is the development of methodology to calibrate the above-described constitutive equation using actual experimental materials data together with linear regression of that data, thereby predicting values for the empirical material constants for each effect or primitive variable. This regression methodology is included in the computer program PROMISC. Actual experimental materials data were obtained from the open literature for materials typically of interest to those studying aerospace propulsion system components. Material data for Inconel 718 were analyzed using the developed methodology.
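
    The calibration step can be illustrated for a single effect: taking logarithms of the strength ratio turns the exponent into the slope of a no-intercept linear regression. This is a minimal sketch of the idea, not the PROMISC implementation:

    ```python
    import math

    def calibrate_exponent(samples, x_final):
        """Least-squares fit of one multifactor-equation exponent (illustrative).

        Model: S/S0 = (1 - x/x_final)**e, so log(S/S0) = e * log(1 - x/x_final).
        A no-intercept least-squares fit gives e = sum(u*v) / sum(u*u).
        samples: list of (x, strength_ratio) pairs with x < x_final.
        """
        num = den = 0.0
        for x, ratio in samples:
            u = math.log(1.0 - x / x_final)   # regressor
            v = math.log(ratio)               # response
            num += u * v
            den += u * u
        return num / den
    ```

    Fitting one exponent per primitive variable in this way, from experimental strength data, is the essence of the regression methodology described above.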

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rehman, Nasir, E-mail: nasirzainy1@hotmail.com; Shashiashvili, Malkhaz

    The classical Garman-Kohlhagen model for the currency exchange assumes that the domestic and foreign currency risk-free interest rates are constant and the exchange rate follows a log-normal diffusion process. In this paper we consider the general case, when the exchange rate evolves according to an arbitrary one-dimensional diffusion process with local volatility that is a function of time and the current exchange rate, and where the domestic and foreign currency risk-free interest rates may be arbitrary continuous functions of time. The first non-trivial problem we encounter in the time-dependent case is the continuity in the time argument of the value function of the American put option and the regularity properties of the optimal exercise boundary. We establish these properties based on systematic use of the monotonicity in volatility of the value functions of the American as well as European options with convex payoffs, together with the Dynamic Programming Principle, and we obtain a comparison result for the value functions and corresponding exercise boundaries of American puts with different strikes, maturities and volatilities. Starting from the fact that the optimal exercise boundary curve is left continuous with right-hand limits, we give a mathematically rigorous and transparent derivation of the early exercise premium representation for the value function of the American foreign exchange put option as the sum of the European put option value function and the early exercise premium. The proof relies essentially on a particular property of the stochastic integral with respect to an arbitrary continuous semimartingale over the predictable subsets of its zeros. From the latter we derive the nonlinear integral equation for the optimal exercise boundary, which can be studied by numerical methods.
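
    For comparison with the paper's integral-equation approach, a standard Cox-Ross-Rubinstein binomial tree prices the American foreign exchange put under constant-parameter Garman-Kohlhagen dynamics, with the foreign rate acting as a carry (dividend) yield. This is a generic textbook sketch, not the authors' method:

    ```python
    import math

    def binomial_put(s0, strike, r_dom, r_for, sigma, t, steps, american=True):
        """CRR binomial price of an FX put; the foreign risk-free rate
        plays the role of a continuous dividend yield."""
        dt = t / steps
        u = math.exp(sigma * math.sqrt(dt))
        d = 1.0 / u
        disc = math.exp(-r_dom * dt)
        # risk-neutral up probability with the foreign rate as carry
        p = (math.exp((r_dom - r_for) * dt) - d) / (u - d)
        # payoffs at maturity, node j = number of up moves
        values = [max(strike - s0 * u ** j * d ** (steps - j), 0.0)
                  for j in range(steps + 1)]
        # backward induction, checking early exercise at each node
        for n in range(steps - 1, -1, -1):
            for j in range(n + 1):
                cont = disc * (p * values[j + 1] + (1 - p) * values[j])
                if american:
                    spot = s0 * u ** j * d ** (n - j)
                    cont = max(cont, strike - spot)
                values[j] = cont
        return values[0]
    ```

    The difference between the American and European prices from the same tree is a numerical estimate of the early exercise premium discussed above.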

  13. Investigation of market efficiency and Financial Stability between S&P 500 and London Stock Exchange: Monthly and yearly Forecasting of Time Series Stock Returns using ARMA model

    NASA Astrophysics Data System (ADS)

    Rounaghi, Mohammad Mahdi; Nassir Zadeh, Farzaneh

    2016-08-01

    We investigated the presence of, and changes in, long-memory features in the returns and volatility dynamics of the S&P 500 and the London Stock Exchange using an ARMA model. Recently, multifractal analysis has evolved as an important way to explain the complexity of financial markets, which can hardly be described by the linear methods of efficient market theory. In financial markets, the weak form of the efficient market hypothesis implies that price returns are serially uncorrelated sequences. In other words, prices should follow a random walk. The random walk hypothesis is evaluated against alternatives accommodating either unifractality or multifractality. Several studies find that the return volatility of stocks tends to exhibit long-range dependence, heavy tails, and clustering. Because stochastic processes with self-similarity possess long-range dependence and heavy tails, it has been suggested that self-similar processes be employed to capture these characteristics in return volatility modeling. The present study applies monthly and yearly forecasting of time-series stock returns in the S&P 500 and the London Stock Exchange using an ARMA model. The statistical analysis shows that the ARMA model for the S&P 500 outperforms that for the London Stock Exchange and is capable of predicting over medium or long horizons using real known values. The statistical analysis for the London Stock Exchange shows that the ARMA model for monthly stock returns outperforms the yearly one. A comparison between the S&P 500 and the London Stock Exchange shows that both markets are efficient and exhibit financial stability during periods of boom and bust.
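
    To make the serial-dependence idea concrete, the sketch below simulates an AR(1) series (the simplest ARMA special case) and recovers the autoregressive coefficient from the lag-1 autocorrelation. Real ARMA fitting, as in the study, would use maximum likelihood; this is only an illustrative moment estimate:

    ```python
    import random

    def simulate_ar1(phi, n, seed=0):
        """Simulate an AR(1) series x_t = phi * x_{t-1} + eps_t, eps ~ N(0,1)."""
        rng = random.Random(seed)
        x, out = 0.0, []
        for _ in range(n):
            x = phi * x + rng.gauss(0.0, 1.0)
            out.append(x)
        return out

    def estimate_phi(series):
        """Moment estimate of phi: lag-1 autocovariance over the variance."""
        m = sum(series) / len(series)
        c0 = sum((v - m) ** 2 for v in series)
        c1 = sum((a - m) * (b - m) for a, b in zip(series, series[1:]))
        return c1 / c0
    ```

    A serially uncorrelated (random walk increment) series would give an estimate near zero; a clearly nonzero estimate signals the kind of dependence the ARMA model exploits for forecasting.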

  14. Financial risk of the biotech industry versus the pharmaceutical industry.

    PubMed

    Golec, Joseph; Vernon, John A

    2009-01-01

    The biotech industry now accounts for a substantial and growing proportion of total R&D spending on new medicines. However, compared with the pharmaceutical industry, the biotech industry is financially fragile. This article illustrates the financial fragility of the biotech and pharmaceutical industries in the US and the implications of this fragility for the effects that government regulation could have on biotech firms. Graphical analysis and statistical tests were used to show how the biotech industry differs from the pharmaceutical industry. The two industries' characteristics were measured and compared, along with various measures of firms' financial risk and sensitivity to government regulation. Data from firms' financial statements provided accounting-based measures and firms' stock returns applied to a multifactor asset pricing model provided financial market measures. The biotech industry was by far the most research-intensive industry in the US, averaging 38% R&D intensity (ratio of R&D spending to total firm assets) over the past 25 years, compared with an average of 25% for the pharmaceutical industry and 3% for all other industries. Biotech firms exhibited lower and more volatile profits and higher market-related and size-related risk, and they suffered more negative stock returns in response to threatened government price regulation. Biotech firms' financial risks increase their costs of capital and make them more sensitive to government regulations that affect their financial prospects. As biotech products grow to represent a larger share of new medicines, general stock market conditions and government regulations could have a greater impact on the level of innovation of new medicines.

  15. Assessment of beetle diversity, community composition and potential threats to forestry using kairomone-baited traps.

    PubMed

    Olivier-Espejel, S; Hurley, B P; Garnas, J

    2017-02-01

    Traps designed to capture insects during normal movement/dispersal, or via attraction to non-specific (plant) volatile lures, yield by-catch that carries valuable information about patterns of community diversity and composition. In order to identify potential native/introduced pests and detect predictors of colonization of non-native pines, we examined beetle assemblages captured in intercept panel traps baited with kairomone lures used during a national monitoring of the woodwasp, Sirex noctilio, in Southern Africa. We identified 50 families and 436 morphospecies of beetles from nine sites sampled in both 2008 and 2009 and six areas in 2007 (trap catch pooled by region) across a latitudinal and elevational gradient. The most diverse groups were mainly those strongly associated with trees, known to include damaging pests. While native species dominated the samples in terms of richness, the dominant species was the introduced bark beetle Orthotomicus erosus (Curculionidae: Scolytinae) (22 ± 34 individuals/site). Four Scolytinae species without previous records in South Africa, namely Coccotrypes niger, Hypocryphalus robustus (formerly Hypocryphalus mangiferae), Hypothenemus birmanus and Xyleborus perforans, were captured in low abundances. Communities showed temporal stability within sites and strong biogeographic patterns across the landscape. The strongest single predictors of community composition were potential evaporation, latitude and maximum relative humidity, while the strongest multifactor model contained elevation, potential evaporation and maximum relative humidity. Temperature, land use variables and distance to natural areas did not significantly correlate with community composition. Non-phytophagous beetles were also captured and were highly diverse (32 families) perhaps representing important beneficial insects.

  16. Uncertainty analysis of geothermal energy economics

    NASA Astrophysics Data System (ADS)

    Sener, Adil Caner

    This dissertation research endeavors to explore geothermal energy economics by assessing and quantifying the uncertainties associated with the nature of geothermal energy and energy investments overall. The study introduces a stochastic geothermal cost model and a valuation approach for different geothermal power plant development scenarios. The Monte Carlo simulation technique is employed to obtain probability distributions of geothermal energy development costs and project net present values. In the study, a stochastic cost model with an incorporated dependence structure is defined and compared with a model in which random variables are treated as independent inputs. One of the goals of the study is to shed light on the long-standing problem of modeling dependence between random input variables, which is addressed here by employing the method of copulas. The study focuses on four main types of geothermal power generation technologies and introduces a stochastic levelized cost model for each technology. Moreover, we also compare the levelized costs of natural gas combined cycle and coal-fired power plants with geothermal power plants. The input data used in the model rely on cost data recently reported by government agencies and non-profit organizations, such as the Department of Energy, National Laboratories, the California Energy Commission and the Geothermal Energy Association. The second part of the study introduces the stochastic discounted cash flow valuation model for the geothermal technologies analyzed in the first phase. In this phase of the study, the Integrated Planning Model (IPM) software was used to forecast the revenue streams of geothermal assets under different price and regulation scenarios. These results are then combined to create a stochastic revenue forecast for the power plants. 
    The uncertainties in gas prices and environmental regulations are modeled and their potential impacts captured in the valuation model. Finally, the study compares the probability distributions of development cost and project value and discusses the market penetration potential of geothermal power generation. There has recently been worldwide interest in geothermal utilization projects, for several reasons: the increasing volatility of fossil fuel prices, the need for domestic energy sources, approaching carbon emission limits and state renewable energy standards, the increasing need for baseload units, and new technology that makes geothermal energy more attractive for power generation. It is our hope that this study will contribute to this progress by shedding light on the uncertainty of geothermal energy project costs.
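    The combination of Monte Carlo simulation with copula-based dependence described above can be sketched in a few lines. The following minimal Python illustration draws correlated capital-cost and capacity-factor inputs through a bivariate Gaussian copula core and propagates them through a levelized-cost calculation; all figures (capex, discount rate, O&M, correlation) are hypothetical placeholders, not values from the dissertation.

```python
import math
import random
import statistics

def correlated_normals(rho, n, seed=1):
    """Draw n pairs of standard normals with correlation rho
    (the core of a bivariate Gaussian copula)."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        z1 = rng.gauss(0, 1)
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0, 1)
        pairs.append((z1, z2))
    return pairs

def lcoe_samples(n=10000, rho=0.5):
    """Monte Carlo levelized cost of electricity (cents/kWh) with
    correlated capital-cost and capacity-factor inputs."""
    crf = 0.08 * 1.08 ** 30 / (1.08 ** 30 - 1)  # capital recovery factor: 8%, 30 yr
    out = []
    for z1, z2 in correlated_normals(rho, n):
        capex = 4000 * math.exp(0.2 * z1)         # hypothetical lognormal capex, $/kW
        cap_factor = 0.80 + 0.05 * math.tanh(z2)  # hypothetical capacity factor
        annual_kwh = 8760 * cap_factor            # annual energy per kW of capacity
        fixed_om = 30                             # hypothetical fixed O&M, $/kW-yr
        out.append((capex * crf + fixed_om) / annual_kwh * 100)
    return out

samples = lcoe_samples()
print(round(statistics.mean(samples), 2), round(statistics.stdev(samples), 2))
```

    The full distribution of the samples, rather than a point estimate, is what such a stochastic cost model reports.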

  17. Statistical Optics

    NASA Astrophysics Data System (ADS)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume II W. Edwards Deming Sample Design in Business Research

  18. Conference Attendance Patterns of Outdoor Orientation Program Staff at Four-Year Colleges in the United States

    ERIC Educational Resources Information Center

    Bell, Brent J.

    2009-01-01

    One purpose of professional conference attendance is to enhance social support. Intentionally fostering this support is an important political aim that should be developed. Although many multifactor definitions of social support exist (Cobb, 1979; Cohen & Syme, 1985; Kahn, 1979; Shaefer et al., 1981; Weiss, 1974), all distinguish between an…

  19. Revisiting a Cognitive Framework for Test Design: Applications for a Computerized Perceptual Speed Test.

    ERIC Educational Resources Information Center

    Alderton, David L.

    This paper highlights the need for a systematic, content-aware, and theoretically based approach to test design. The cognitive components approach is endorsed and applied to the development of a computerized perceptual speed test. A review of the psychometric literature shows that every major multi-factor theory includes a clerical/perceptual…

  20. Bullying in Adolescent Residential Care: The Influence of the Physical and Social Residential Care Environment

    ERIC Educational Resources Information Center

    Sekol, Ivana

    2016-01-01

    Background: To date, no study examined possible contributions of environmental factors to bullying and victimization in adolescent residential care facilities. Objective: By testing one part of the Multifactor Model of Bullying in Secure Setting (MMBSS; Ireland in "Int J Adolesc Med Health" 24(1):63-68, 2012), this research examined the…

  1. Academic Administrator Leadership Styles and the Impact on Faculty Job Satisfaction

    ERIC Educational Resources Information Center

    Bateh, Justin; Heyliger, Wilton

    2014-01-01

    This article examines the impact of three leadership styles as a predictor of job satisfaction in a state university system. The Multifactor Leadership Questionnaire was used to identify the leadership style of an administrator as perceived by faculty members. Spector's Job Satisfaction Survey was used to assess a faculty member's level of job…

  2. The Impact of Mentor Leadership Styles on First-Year Adult Student Retention

    ERIC Educational Resources Information Center

    Smith Staley, Charlesetta

    2012-01-01

    This quantitative study explored the leadership styles of mentors for retained first-year adult students to analyze whether the prevalent style had a higher impact on first-year adult student retention. The Multifactor Leadership Questionnaire (MLQ) 5x was used to collect data on the mentors' leadership styles from the perspective of retained…

  3. A Preliminary Study for a New Model of Sense of Community

    ERIC Educational Resources Information Center

    Tartaglia, Stefano

    2006-01-01

    Although Sense of Community (SOC) is usually defined as a multidimensional construct, most SOC scales are unidimensional. To reduce the split between theory and empirical research, the present work identifies a multifactor structure for the Italian Sense of Community Scale (ISCS) that has already been validated as a unitary index of SOC. This…

  4. Creativity in the Structure of Professionalism of a Higher School Teacher

    ERIC Educational Resources Information Center

    Gladilina, Irina Petrovna

    2016-01-01

    Owing to the absence of strict and exact criteria for differentiating between creative and non-creative human activities, science still lacks a full definition of the notion of "creativity," even though many scholars have addressed the matter. The multifactor nature of research on creativity allows interpreting the essence of…

  5. Multifactor Screener in the 2000 National Health Interview Survey Cancer Control Supplement: Definition of Acceptable Dietary Data Values

    Cancer.gov

    We used the U.S. Department of Agriculture's (USDA) 1994-96 Continuing Survey of Food Intakes of Individuals (CSFII) data on reported intakes over two days of 24-hour recall to make judgments about reasonable frequencies of consumption that were reported on a per day basis.

  6. Evaluating the metagenome of two sampling locations in the nasal cavity of cattle with bovine respiratory disease complex

    USDA-ARS?s Scientific Manuscript database

    Bovine respiratory disease complex (BRDC) is a multi-factor disease, and disease incidence may be associated with an animal’s commensal microbiota (metagenome). Evaluation of the animal’s resident microbiota in the nasal cavity may help us to understand the impact of the metagenome on incidence of ...

  7. Evaluating the microbiome of two sampling locations in the nasal cavity of cattle with bovine respiratory disease complex (BRDC)

    USDA-ARS?s Scientific Manuscript database

    Bovine respiratory disease complex (BRDC) is a multi-factor disease, and disease incidence may be associated with an animal’s commensal microbiota (metagenome). Evaluation of the animal’s resident microbiota in the nasal cavity may help us to understand the impact of the metagenome on incidence of ...

  8. The Effects of Transformational Leadership and the Sense of Calling on Job Burnout among Special Education Teachers

    ERIC Educational Resources Information Center

    Gong, Tao; Zimmerli, Laurie; Hoffer, Harry E.

    2013-01-01

    This article examines the effects of transformational leadership of supervisors and the sense of calling on job burnout among special education teachers. A total of 256 special education teachers completed the Maslach Burnout Inventory and rated their supervisors on the Multifactor Leadership Questionnaire. The results reveal that transformational…

  9. Identifying the Best Buys in U.S. Higher Education

    ERIC Educational Resources Information Center

    Eff, E. Anthon; Klein, Christopher C.; Kyle, Reuben

    2012-01-01

    Which U.S. institutions of higher education offer the best value to consumers? To answer this question, we evaluate U.S. institutions relative to a data envelopment analysis (DEA) multi-factor frontier based on 2000-2001 data for 1,179 4-year institutions. The resulting DEA "best buy" scores allow the ranking of institutions by a…

  10. Inside The Zone of Proximal Development: Validating A Multifactor Model Of Learning Potential With Gifted Students And Their Peers

    ERIC Educational Resources Information Center

    Kanevsky, Lannie; Geake, John

    2004-01-01

    Kanevsky (1995b) proposed a model of learning potential based on Vygotsky's notions of "good learning" and the zone of proximal development. This study investigated the contributions of general knowledge, information processing efficiency, and metacognition to differences in the learning potential of 5 gifted and nongifted students.…

  11. Scaling and efficiency determine the irreversible evolution of a market

    PubMed Central

    Baldovin, F.; Stella, A. L.

    2007-01-01

    In setting up a stochastic description of the time evolution of a financial index, the challenge consists in devising a model compatible with all stylized facts emerging from the analysis of financial time series and providing a reliable basis for simulating such series. Based on constraints imposed by market efficiency and on an inhomogeneous-time generalization of standard simple scaling, we propose an analytical model which accounts simultaneously for empirical results like the linear decorrelation of successive returns, the power law dependence on time of the volatility autocorrelation function, and the multiscaling associated to this dependence. In addition, our approach gives a justification and a quantitative assessment of the irreversible character of the index dynamics. This irreversibility enters as a key ingredient in a novel simulation strategy of index evolution which demonstrates the predictive potential of the model.

  12. Common scaling behavior in finance and macroeconomics

    NASA Astrophysics Data System (ADS)

    Podobnik, B.; Horvatic, D.; Petersen, A. M.; Njavro, M.; Stanley, H. E.

    2010-08-01

    In order to test whether scaling exists in finance at the world level, we test whether the average growth rates and volatility of market capitalization (MC) depend on the level of MC. We analyze the MC for 54 worldwide stock indices and 48 worldwide bond indices. We find that (i) the average growth rate of the MC and (ii) the standard deviation σ(r) of growth rates r both decrease with MC as power laws, with exponents αw = 0.28 ± 0.09 and βw = 0.12 ± 0.04. We define a stochastic process in order to model the scaling results we find for worldwide stock and bond indices. We establish a power-law relationship between the MC of a country's financial market and the gross domestic product (GDP) of the same country.
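    Power-law exponents like αw and βw are typically estimated by ordinary least squares on log-transformed data. A minimal sketch, using synthetic data generated with a known exponent of 0.12 rather than the actual index data:

```python
import math
import random

def fit_power_law(x, y):
    """OLS fit of log y on log x, so that y ≈ c * x ** slope."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    sxx = sum((a - mx) ** 2 for a in lx)
    sxy = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    slope = sxy / sxx
    return slope, math.exp(my - slope * mx)

rng = random.Random(0)
# Synthetic market capitalizations spanning four decades, and volatilities
# generated as sigma ~ MC**-0.12 with multiplicative lognormal noise.
mc = [10 ** rng.uniform(1, 5) for _ in range(200)]
sigma = [0.5 * m ** -0.12 * math.exp(rng.gauss(0, 0.05)) for m in mc]
slope, c = fit_power_law(mc, sigma)
print(round(-slope, 3), round(c, 3))
```

    The recovered slope is close to the planted exponent of 0.12, illustrating how the scaling exponents above would be read off a log-log regression.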

  13. Interdisciplinary applications of statistical physics to complex systems: Seismic physics, econophysics, and sociophysics

    NASA Astrophysics Data System (ADS)

    Tenenbaum, Joel

    This thesis applies statistical physics concepts and methods to quantitatively analyze complex systems. This thesis is separated into four parts: (i) characteristics of earthquake systems (ii) memory and volatility in data time series (iii) the application of part (ii) to world financial markets, and (iv) statistical observations on the evolution of word usage. In Part I, we observe statistical patterns in the occurrence of earthquakes. We select a 14-year earthquake catalog covering the archipelago of Japan. We find that regions traditionally thought of as being too distant from one another for causal contact display remarkably high correlations, and the networks that result have a tendency to link highly connected areas with other highly connected areas. In Part II, we introduce and apply the concept of "volatility asymmetry", the primary use of which is in financial data. We explain the relation between memory and "volatility asymmetry" in terms of an asymmetry parameter lambda. We define a litmus test for determining whether lambda is statistically significant, propose a stochastic model based on this parameter, and use the model to further explain empirical data. In Part III, we expand on volatility asymmetry. Importing the concepts of time dependence and universality from physics, we explore the aspects of emerging (or "transition") economies in Eastern Europe as they relate to asymmetry. We find that these emerging markets in some instances behave like developed markets and in other instances do not, and that the distinction depends both on country and on time period, with crisis periods showing different asymmetry characteristics than "healthy" periods. In Part IV, we take note of a series of findings in econophysics, showing statistical growth similarities between a variety of different areas that have in common the fact of taking place in environments that are both (i) competitive and (ii) dynamic. 
    We show that this same growth distribution can be reproduced by observing the growth rates of the usage of individual words: just as companies compete for sales in a zero-sum marketing game, so do words compete for usage within a limited number of reader man-hours.

  14. Nonparametric Stochastic Model for Uncertainty Quantification of Short-term Wind Speed Forecasts

    NASA Astrophysics Data System (ADS)

    AL-Shehhi, A. M.; Chaouch, M.; Ouarda, T.

    2014-12-01

    Wind energy is increasing in importance as a renewable energy source due to its potential role in reducing carbon emissions. It is a safe, clean, and inexhaustible source of energy. The amount of wind energy generated by wind turbines is closely related to the wind speed. Wind speed forecasting plays a vital role in the wind energy sector in terms of wind turbine optimal operation, wind energy dispatch and scheduling, efficient energy harvesting, etc. It is also considered during the planning, design, and assessment of any proposed wind project. Therefore, accurate prediction of wind speed carries particular importance and plays a significant role in the wind industry. Many methods have been proposed in the literature for short-term wind speed forecasting. These methods are usually based on modeling historical fixed time intervals of the wind speed data and using them for future prediction. They mainly include statistical models such as ARMA and ARIMA models, physical models such as numerical weather prediction, and artificial intelligence techniques such as support vector machines and neural networks. In this paper, we are interested in estimating hourly wind speed in the United Arab Emirates (UAE). More precisely, we predict hourly wind speed using nonparametric kernel estimation of the regression and volatility functions of a nonlinear autoregressive model with an ARCH component, which includes the unknown nonlinear regression function and volatility function already discussed in the literature. The unknown nonlinear regression function describes the dependence between the value of the wind speed at time t and its historical values at times t-1, t-2, …, t-d. This function plays a key role in predicting the hourly wind speed process. The volatility function, i.e., the conditional variance given the past, measures the risk associated with this prediction. 
    Since the regression and volatility functions are assumed unknown, they are estimated using nonparametric kernel methods. In addition to the pointwise hourly wind speed forecasts, a confidence interval is also provided, which quantifies the uncertainty around the forecasts.
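    The Nadaraya-Watson estimator is one standard nonparametric kernel method for such unknown regression and volatility functions. A minimal sketch on simulated data; the autoregressive and volatility functions below are assumed for illustration, not fitted to the UAE wind series:

```python
import math
import random

def nw_estimate(x0, xs, ys, h):
    """Nadaraya-Watson kernel regression estimate at x0 (Gaussian kernel, bandwidth h)."""
    w = [math.exp(-0.5 * ((x0 - x) / h) ** 2) for x in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

# Simulate a nonlinear autoregressive series with ARCH-type noise:
# v_t = m(v_{t-1}) + s(v_{t-1}) * eps_t, with m and s chosen for illustration.
rng = random.Random(42)
v = [5.0]
for _ in range(2000):
    prev = v[-1]
    m = 2.0 + 0.6 * prev        # assumed regression function
    s = 0.3 + 0.05 * abs(prev)  # assumed volatility function
    v.append(m + s * rng.gauss(0, 1))

xs, ys = v[:-1], v[1:]
h, point = 0.5, 5.0
m_hat = nw_estimate(point, xs, ys, h)                    # conditional mean forecast
m2_hat = nw_estimate(point, xs, [y * y for y in ys], h)  # conditional second moment
var_hat = m2_hat - m_hat ** 2                            # conditional variance (risk)
print(round(m_hat, 2), round(var_hat, 2))
```

    The same smoothed conditional mean and variance are the ingredients of the pointwise forecast and its confidence interval described above.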

  15. Essays on parametric and nonparametric modeling and estimation with applications to energy economics

    NASA Astrophysics Data System (ADS)

    Gao, Weiyu

    My dissertation research is composed of two parts: a theoretical part on semiparametric efficient estimation and an applied part in energy economics under different dynamic settings. The essays are related in terms of their applications as well as the way in which models are constructed and estimated. In the first essay, efficient estimation of the partially linear model is studied. We work out the efficient score functions and efficiency bounds under four stochastic restrictions---independence, conditional symmetry, conditional zero mean, and partially conditional zero mean. A feasible efficient estimation method for the linear part of the model is developed based on the efficient score. A battery of specification tests that allows for choosing among the alternative assumptions is provided. A Monte Carlo simulation is also conducted. The second essay presents a dynamic optimization model for a stylized oilfield resembling the largest developed light oil field in Saudi Arabia, Ghawar. We use data from different sources to estimate the oil production cost function and the revenue function. We pay particular attention to the dynamic aspect of oil production by employing petroleum-engineering software to simulate the interaction between control variables and reservoir state variables. Optimal solutions are studied under different scenarios to account for possible changes in the exogenous variables and the uncertainty about the forecasts. The third essay examines the effect of oil price volatility on the level of innovation displayed by the U.S. economy. A measure of innovation is calculated by decomposing an output-based Malmquist index. We also construct a nonparametric measure of oil price volatility. Technical change and oil price volatility are then placed in a VAR system with the oil price and a variable indicative of monetary policy. The system is estimated and analyzed for significant relationships. 
We find that oil price volatility displays a significant negative effect on innovation. A key point of this analysis lies in the fact that we impose no functional forms for technologies and the methods employed keep technical assumptions to a minimum.

  16. Late Impacts and the Origins of the Atmospheres on the Terrestrial Planets

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, S.; Stewart, S. T.; Lock, S. J.; Parai, R.; Tucker, J. M.

    2014-12-01

    Models for the origin of terrestrial atmospheres typically require an intricate sequence of events, including hydrodynamic escape, outgassing of mantle volatiles and late delivery. Here we discuss the origin of the atmospheres on the terrestrial planets in light of new ideas about the formation of the Moon, giant impact induced atmospheric loss and recent noble gas measurements. Our new measurements indicate that noble gases in the Earth's atmosphere cannot be derived from any combination of fractionation of a nebular-derived atmosphere followed by outgassing of deep or shallow mantle volatiles. While Ne in the mantle retains a nebular component, the present-day atmosphere has no memory of nebular gases. Rather, atmospheric noble gases have a close affinity to chondrites. On the other hand, Venus's atmosphere has 20 and 70 times higher abundance of 20Ne and 36Ar, respectively, and a 20Ne/22Ne ratio closer to the solar value than Earth's atmosphere. While the present atmosphere of Mars is significantly fractionated in the lighter noble gases due to long term atmospheric escape, the Kr isotopic ratios in Martian atmosphere are identical to solar. Thus, while Earth's atmosphere has no memory of accretion of nebular gases, atmospheres on both Venus and Mars preserve at least a component of nebular gases. To explain the above observations, we propose that a common set of processes operated on the terrestrial planets, and that their subsequent evolutionary divergence is simply explained by planetary size and the stochastic nature of giant impacts. We present geochemical observations and simulations of giant impacts to show that most of Earth's mantle was degassed and the outgassed volatiles were largely lost during the final sequence of giant impacts onto Earth. Earth's noble gases were therefore dominantly derived from late-accreting planetesimals. 
In contrast, Venus did not suffer substantial atmospheric loss by a late giant impact and retains a higher abundance of both nebular and chondritic noble gases compared to Earth. Fast-accreting Mars has a noble gas signature inherited from the solar nebula, and its low mass allowed for gravitational escape of the volatile components in late accreting planetesimals due to vaporization upon impact.

  17. Bullying among Adolescents in North Cyprus and Turkey: Testing a Multifactor Model

    ERIC Educational Resources Information Center

    Bayraktar, Fatih

    2012-01-01

    Peer bullying has been studied since the 1970s. Therefore, a vast literature has accumulated about the various predictors of bullying. However, to date there has been no study which has combined individual-, peer-, parental-, teacher-, and school-related predictors of bullying within a model. In this sense, the main aim of this study was to test a…

  18. Electronic Health Records: Applying Diffusion of Innovation Theory to the Relationship between Multifactor Authentication and EHR Adoption

    ERIC Educational Resources Information Center

    Lockett, Daeron C.

    2014-01-01

    Electronic Health Record (EHR) systems are increasingly becoming accepted as the future direction of medical record management systems. Programs such as the American Recovery and Reinvestment Act have provided incentives to hospitals that adopt EHR systems. In spite of these incentives, the perception of EHR adoption is that it has not achieved the…

  19. Cross-Cultural Comparisons of University Students' Science Learning Self-Efficacy: Structural Relationships among Factors within Science Learning Self-Efficacy

    ERIC Educational Resources Information Center

    Wang, Ya-Ling; Liang, Jyh-Chong; Tsai, Chin-Chung

    2018-01-01

    Science learning self-efficacy could be regarded as a multi-factor belief which comprises different aspects such as cognitive skills, practical work, and everyday application. However, few studies have investigated the relationships among these factors that compose science learning self-efficacy. Also, culture may play an important role in…

  20. The Four-Factor Model of Depressive Symptoms in Dementia Caregivers: A Structural Equation Model of Ethnic Differences

    PubMed Central

    Roth, David L.; Ackerman, Michelle L.; Okonkwo, Ozioma C.; Burgio, Louis D.

    2008-01-01

    Previous studies have suggested that 4 latent constructs (depressed affect, well-being, interpersonal problems, somatic symptoms) underlie the item responses on the Center for Epidemiological Studies Depression (CES-D) Scale. This instrument has been widely used in dementia caregiving research, but the fit of this multifactor model and the explanatory contributions of multifactor models have not been sufficiently examined for caregiving samples. The authors subjected CES-D data (N = 1,183) from the initial Resources for Enhancing Alzheimer’s Caregiver Health Study to confirmatory factor analysis methods and found that the 4-factor model provided excellent fit to the observed data. Invariance analyses suggested only minimal item-loading differences across race subgroups and supported the validity of race comparisons on the latent factors. Significant race differences were found on 3 of the 4 latent factors both before and after controlling for demographic covariates. African Americans reported less depressed affect and better well-being than White caregivers, who reported better well-being and fewer interpersonal problems than Hispanic caregivers. These findings clarify and extend previous studies of race differences in depression among diverse samples of dementia caregivers. PMID:18808246

  1. [Risk factors in the living environment of early spontaneous abortion pregnant women].

    PubMed

    Liu, Xin-yan; Bian, Xu-ming; Han, Jing-xiu; Cao, Zhao-jin; Fan, Guang-sheng; Zhang, Chao; Zhang, Wen-li; Zhang, Shu-zhen; Sun, Xiao-guang

    2007-10-01

    To study the relationship between early spontaneous abortion and the living environment, and to explore the risk factors for spontaneous abortion, we analyzed interviews of 200 spontaneous abortion cases and matched controls (age ±2 years) using multifactor logistic regression analysis. In single-factor analysis, the proportions of watching TV ≥10 hours/week, operating a computer ≥45 hours/week, using a copier, microwave oven, or mobile phone, and having electromagnetic equipment near the dwelling or workplace (e.g., a switch room ≤50 m or a transmission tower ≤500 m away) were significantly higher among cases than among controls (all P < 0.05). After adjusting for the other risk factors in multifactor analysis, using a microwave oven and mobile phone, contact with abnormal odors from decoration materials for ≥3 months, emotional stress during the first trimester of pregnancy, and a history of spontaneous abortion remained significantly associated with the risk of spontaneous abortion. The odds ratios of these risk factors were 2.23 and 4.63, respectively. Using a microwave oven and mobile phone, contact with abnormal odors from decoration materials for ≥3 months, emotional stress during the first trimester of pregnancy, and a history of spontaneous abortion are risk factors for early spontaneous abortion.
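    As a reminder of how the odds ratios underlying such a case-control analysis are computed from a 2x2 exposure table (the counts below are hypothetical, not taken from this study):

```python
def odds_ratio(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Odds ratio from a case-control 2x2 table: (a * d) / (b * c)."""
    return (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)

# Hypothetical counts: 80 of 200 cases exposed, 50 of 200 controls exposed.
print(round(odds_ratio(80, 120, 50, 150), 2))  # -> 2.0
```

    Multifactor logistic regression generalizes this: each exponentiated coefficient is an odds ratio adjusted for the other covariates in the model.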

  2. Interactions between MAOA and SYP polymorphisms were associated with symptoms of attention-deficit/hyperactivity disorder in Chinese Han subjects.

    PubMed

    Gao, Qian; Liu, Lu; Li, Hai-Mei; Tang, Yi-Lang; Wu, Zhao-Min; Chen, Yun; Wang, Yu-Feng; Qian, Qiu-Jin

    2015-01-01

    As candidate genes for attention-deficit/hyperactivity disorder (ADHD), monoamine oxidase A (MAOA) and synaptophysin (SYP) are both on the X chromosome and have been suggested to be associated with the predominantly inattentive subtype (ADHD-I). The present study investigates the potential gene-gene interaction (G × G) between rs5905859 of MAOA and rs5906754 of SYP for ADHD in Chinese Han subjects. For the family-based association study, 177 female trios were included. For the case-control study, 1,462 probands and 807 normal controls were recruited. The ADHD Rating Scale-IV (ADHD-RS-IV) was used to evaluate ADHD symptoms. Pedigree-based generalized multifactor dimensionality reduction (PGMDR) for female ADHD trios indicated a significant gene-gene interaction effect of rs5905859 and rs5906754. Generalized multifactor dimensionality reduction (GMDR) indicated potential gene-gene interplay on ADHD-RS-IV scores in female ADHD-I. No associations were observed in male subjects in the case-control analysis. In conclusion, our findings suggest that the interaction of MAOA and SYP may be involved in the genetic mechanism of the ADHD-I subtype and predict ADHD symptoms. © 2014 Wiley Periodicals, Inc.

  3. Fuzzy comprehensive evaluation of multiple environmental factors for swine building assessment and control.

    PubMed

    Xie, Qiuju; Ni, Ji-Qin; Su, Zhongbin

    2017-10-15

    In confined swine buildings, temperature, humidity, and air quality are all important for animal health and productivity. However, current swine building environmental control is based on temperature alone, and evaluation and control methods based on multiple environmental factors are needed. In this paper, fuzzy comprehensive evaluation (FCE) theory was adopted for multi-factor assessment of environmental quality in two commercial swine buildings using real measurement data. An assessment index system and membership functions were established, and predetermined weights were assigned using the analytic hierarchy process (AHP) combined with expert knowledge. The results show that multiple factors such as temperature, humidity, and concentrations of ammonia (NH3), carbon dioxide (CO2), and hydrogen sulfide (H2S) can be successfully integrated in FCE for swine building environment assessment. The FCE method has a high correlation coefficient of 0.737 compared with the method of single-factor evaluation (SFE). The FCE method can significantly increase the sensitivity and perform an effective and integrative assessment. It can be used as part of environmental control and warning systems for swine building environment management to improve swine production and welfare. Copyright © 2017 Elsevier B.V. All rights reserved.
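The composite step of fuzzy comprehensive evaluation can be sketched compactly. The weights, grade set, and membership degrees below are invented placeholders, not the paper's index system or AHP results; the sketch only shows the weighted-combination operator B = w · R:

```python
# Minimal fuzzy comprehensive evaluation (FCE) sketch with invented numbers.
factors = ["temperature", "humidity", "NH3", "CO2", "H2S"]
weights = [0.30, 0.15, 0.25, 0.15, 0.15]  # assumed AHP-style weights, sum to 1

# Membership degrees of each factor in three grades (good, fair, poor),
# as would be produced by membership functions on measured values.
R = [
    [0.7, 0.3, 0.0],   # temperature
    [0.5, 0.4, 0.1],   # humidity
    [0.2, 0.5, 0.3],   # NH3
    [0.6, 0.3, 0.1],   # CO2
    [0.8, 0.2, 0.0],   # H2S
]

# Composite evaluation B = w * R using the weighted-average operator M(., +).
B = [sum(w * row[g] for w, row in zip(weights, R)) for g in range(3)]
grade = max(range(3), key=lambda g: B[g])  # grade with highest composite membership
```

With these numbers the building is classified into grade 0 ("good"); the same vector B can also feed a control or warning rule rather than a single hard label.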

  4. Effects of Sulfate, Chloride, and Bicarbonate on Iron Stability in a PVC-U Drinking Pipe

    PubMed Central

    Wang, Jiaying; Tao, Tao; Yan, Hexiang

    2017-01-01

    To describe iron stability in plastic pipes and help ensure drinking water security, the factors and rules governing iron adsorption and release were studied in the unplasticized poly(vinyl chloride) (PVC-U) drinking pipes employed in this research. In this paper, sulfate, chloride, and bicarbonate, as well as synthesized models, were chosen to investigate iron stability on the inner wall of PVC-U drinking pipes. The presence of the three kinds of anions could significantly affect the process of iron adsorption, and a positive association was found between the anion concentration level and the adsorption rate. However, the scaling formed on the inner surface of the pipes would be released into the water under certain conditions. The Larson Index (LI), used for a synthetic consideration of anion effects on iron stability, was selected to investigate iron release under multi-factor conditions. Moreover, a well-fitted linear model was established to gain a better understanding of iron release under multi-factor conditions. The simulation results demonstrated that the linear model fitted better than the LI model for the prediction of iron release. PMID:28629192
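The abstract applies the Larson Index without stating its formula. A commonly cited definition (the Larson-Skold ratio of aggressive anions to bicarbonate alkalinity, in milli-equivalents per litre, not necessarily the exact variant used in this paper) can be sketched with illustrative concentrations:

```python
# Larson(-Skold) index sketch: ratio of chloride + sulfate to bicarbonate,
# all converted from mg/L to meq/L. Values below are illustrative only.

EQ_WEIGHT = {"Cl": 35.45, "SO4": 96.06 / 2, "HCO3": 61.02}  # grams per equivalent

def larson_index(cl_mg_l, so4_mg_l, hco3_mg_l):
    cl = cl_mg_l / EQ_WEIGHT["Cl"]      # meq/L of chloride
    so4 = so4_mg_l / EQ_WEIGHT["SO4"]   # meq/L of sulfate
    hco3 = hco3_mg_l / EQ_WEIGHT["HCO3"]  # meq/L of bicarbonate
    return (cl + so4) / hco3

# LI well above ~1 is usually read as water that destabilizes iron scales.
li = larson_index(cl_mg_l=50.0, so4_mg_l=80.0, hco3_mg_l=120.0)
```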

  5. Sequential Monte Carlo for inference of latent ARMA time-series with innovations correlated in time

    NASA Astrophysics Data System (ADS)

    Urteaga, Iñigo; Bugallo, Mónica F.; Djurić, Petar M.

    2017-12-01

    We consider the problem of sequential inference of latent time-series with innovations correlated in time and observed via nonlinear functions. We accommodate time-varying phenomena with diverse properties by means of a flexible mathematical representation of the data. We characterize statistically such time-series by a Bayesian analysis of their densities. The density that describes the transition of the state from time t to the next time instant t+1 is used for implementation of novel sequential Monte Carlo (SMC) methods. We present a set of SMC methods for inference of latent ARMA time-series with innovations correlated in time for different assumptions in knowledge of parameters. The methods operate in a unified and consistent manner for data with diverse memory properties. We show the validity of the proposed approach by comprehensive simulations of the challenging stochastic volatility model.
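The stochastic volatility model used as the validation benchmark above is a standard target for SMC. A minimal bootstrap particle filter for the textbook SV model (parameter values invented; this is not the paper's method, which additionally handles innovations correlated in time) might look like:

```python
import math, random

random.seed(1)

# Standard SV model: x_t = a*x_{t-1} + s*v_t,  y_t = b*exp(x_t/2)*e_t,
# with v_t, e_t ~ N(0, 1). Parameters below are illustrative, not estimated.
a, s, b = 0.95, 0.3, 0.6
T, N = 100, 500  # time steps, particles

# Simulate synthetic observations from the model.
x = 0.0
ys = []
for _ in range(T):
    x = a * x + s * random.gauss(0, 1)
    ys.append(b * math.exp(x / 2) * random.gauss(0, 1))

def loglik(y, x):
    """Log of the observation density p(y | x) = N(0, b^2 * exp(x))."""
    var = (b ** 2) * math.exp(x)
    return -0.5 * (math.log(2 * math.pi * var) + y * y / var)

# Bootstrap filter: propagate from the prior, weight, resample.
particles = [random.gauss(0, s / math.sqrt(1 - a * a)) for _ in range(N)]
est = []
for y in ys:
    particles = [a * p + s * random.gauss(0, 1) for p in particles]
    w = [math.exp(loglik(y, p)) for p in particles]
    tot = sum(w)
    w = [wi / tot for wi in w]
    est.append(sum(wi * p for wi, p in zip(w, particles)))  # posterior mean of x_t
    particles = random.choices(particles, weights=w, k=N)   # multinomial resampling
```

`est` tracks the latent log-volatility; the paper's methods replace the i.i.d. innovation assumption in the propagation step with a time-correlated one.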

  6. C^{1,1} regularity for degenerate elliptic obstacle problems

    NASA Astrophysics Data System (ADS)

    Daskalopoulos, Panagiota; Feehan, Paul M. N.

    2016-03-01

    The Heston stochastic volatility process is a degenerate diffusion process where the degeneracy in the diffusion coefficient is proportional to the square root of the distance to the boundary of the half-plane. The generator of this process with killing, called the elliptic Heston operator, is a second-order, degenerate-elliptic partial differential operator, where the degeneracy in the operator symbol is proportional to the distance to the boundary of the half-plane. In mathematical finance, solutions to the obstacle problem for the elliptic Heston operator correspond to value functions for perpetual American-style options on the underlying asset. With the aid of weighted Sobolev spaces and weighted Hölder spaces, we establish the optimal C^{1,1} regularity (up to the boundary of the half-plane) for solutions to obstacle problems for the elliptic Heston operator when the obstacle functions are sufficiently smooth.

  7. Complexity multiscale asynchrony measure and behavior for interacting financial dynamics

    NASA Astrophysics Data System (ADS)

    Yang, Ge; Wang, Jun; Niu, Hongli

    2016-08-01

    A stochastic financial price process is proposed and investigated by the finite-range multitype contact dynamical system, in an attempt to study the nonlinear behaviors of real asset markets. The viruses spreading process in a finite-range multitype system is used to imitate the interacting behaviors of diverse investment attitudes in a financial market, and the empirical research on descriptive statistics and autocorrelation behaviors of return time series is performed for different values of propagation rates. Then the multiscale entropy analysis is adopted to study several different shuffled return series, including the original return series, the corresponding reversal series, the random shuffled series, the volatility shuffled series and the Zipf-type shuffled series. Furthermore, we propose and compare the multiscale cross-sample entropy and its modification algorithm called composite multiscale cross-sample entropy. We apply them to study the asynchrony of pairs of time series under different time scales.
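The multiscale entropy machinery referenced above combines coarse-graining with sample entropy. A minimal sketch of both steps, using the standard definitions rather than the authors' exact implementation:

```python
import math, random

def coarse_grain(series, scale):
    """Non-overlapping window means: the coarse-graining step of multiscale entropy."""
    n = len(series) // scale
    return [sum(series[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def sample_entropy(series, m=2, r=None):
    """SampEn(m, r): -log of the conditional probability that sequences matching
    for m points (within tolerance r, Chebyshev distance) also match for m+1."""
    if r is None:
        mean = sum(series) / len(series)
        sd = math.sqrt(sum((x - mean) ** 2 for x in series) / len(series))
        r = 0.2 * sd  # conventional tolerance: 20% of the standard deviation

    def count_matches(mm):
        templates = [series[i:i + mm] for i in range(len(series) - mm)]
        c = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    c += 1
        return c

    B, A = count_matches(m), count_matches(m + 1)
    return -math.log(A / B) if A > 0 and B > 0 else float("inf")

random.seed(0)
noise = [random.gauss(0, 1) for _ in range(300)]
mse = [sample_entropy(coarse_grain(noise, s)) for s in (1, 2, 3)]
```

The cross-sample entropy variants studied in the paper apply the same template-matching idea across a *pair* of series rather than within one.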

  8. An accurate European option pricing model under Fractional Stable Process based on Feynman Path Integral

    NASA Astrophysics Data System (ADS)

    Ma, Chao; Ma, Qinghua; Yao, Haixiang; Hou, Tiancheng

    2018-03-01

    In this paper, we propose to use the Fractional Stable Process (FSP) for option pricing. The FSP is one of the few candidates that can directly model a number of desired empirical properties of asset price risk-neutral dynamics. However, pricing the vanilla European option under the FSP is difficult and problematic. Building upon Feynman Path Integral inspired techniques, we present a novel computational model for option pricing, the Fractional Stable Process Path Integral (FSPPI) model under a general fractional stable distribution, that tackles this problem. Numerical and empirical experiments show that the proposed pricing model corrects the Black-Scholes pricing errors (overpricing long-term options and underpricing short-term options; overpricing out-of-the-money options and underpricing in-the-money options) without any additional structures such as stochastic volatility or a jump process.

  9. The influences of delay time on the stability of a market model with stochastic volatility

    NASA Astrophysics Data System (ADS)

    Li, Jiang-Cheng; Mei, Dong-Cheng

    2013-02-01

    The effects of delay time on the stability of a market model are investigated using a modified Heston model with a cubic nonlinearity and cross-correlated noise sources. The results indicate that: (i) there is an optimal delay time τo that maximally enhances the stability of the stock price under strong demand elasticity of stock price, and maximally reduces it under weak demand elasticity; (ii) the cross-correlation coefficient of the noises and the delay time play opposite roles in the stability for delay times below τo and the same role for delay times above τo. Moreover, the probability density function of the escape time of stock price returns, the probability density function of the returns, and the correlation function of the returns are compared with results reported in other studies.

  10. The Correlation between Leadership Style and Leader Power

    DTIC Science & Technology

    2016-04-22

    Article; dates covered: 1 February 2015 - 31 October 2015. The Correlation between Leadership Style and Leader Power...Transformational and Transactional leadership style and leader power. Leadership style was measured by the Multifactor Leadership Questionnaire (MLQ...between the factors representing Leadership Style and Leader Power. The CFA results are contrary to the developers' theories of both scales, but are

  11. The Development of a Tactical-Level Full Range Leadership Measurement Instrument

    DTIC Science & Technology

    2010-03-01

    full range leadership theory has become established as the predominant and most widely researched theory on leadership. The most commonly used survey...instrument to assess full range leadership theory is the Multifactor Leadership Questionnaire, originally developed by Bass in 1985. Although much...existing literature to develop a new full range leadership theory measurement instrument that effectively targets low- to mid-level supervisors, or

  12. Stochastic multi-objective model for optimal energy exchange optimization of networked microgrids with presence of renewable generation under risk-based strategies.

    PubMed

    Gazijahani, Farhad Samadi; Ravadanegh, Sajad Najafi; Salehi, Javad

    2018-02-01

    The inherent volatility and unpredictable nature of renewable generation and load demand pose considerable challenges for energy exchange optimization of microgrids (MGs). To address these challenges, this paper proposes a new risk-based multi-objective energy exchange optimization for networked MGs from economic and reliability standpoints under load consumption and renewable power generation uncertainties. In so doing, three risk-based strategies are distinguished using the conditional value at risk (CVaR) approach. The proposed model is formulated with two distinct objective functions. The first function minimizes the operation and maintenance costs, the cost of power transactions between the upstream network and MGs, and the power loss cost, whereas the second function minimizes the energy not supplied (ENS) value. Furthermore, a stochastic scenario-based approach is incorporated to handle the uncertainty, and the Kantorovich distance scenario reduction method is implemented to reduce the computational burden. Finally, the non-dominated sorting genetic algorithm (NSGA-II) is applied to minimize the objective functions simultaneously, and the best solution is extracted by a fuzzy satisfying method with respect to the risk-based strategies. To demonstrate the performance of the proposed model, it is applied to the modified IEEE 33-bus distribution system, and the obtained results show that the presented approach can be considered an efficient tool for optimal energy exchange optimization of MGs. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
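The risk strategies above rest on conditional value at risk. Over a finite set of equiprobable scenarios, CVaR at level alpha is (up to discretization) the mean cost of the worst (1 - alpha) fraction of scenarios; a minimal sketch with invented scenario costs:

```python
# Scenario-based CVaR sketch: with equiprobable scenarios, CVaR_alpha is the
# mean of the worst (1 - alpha) fraction of scenario costs. Costs are invented;
# this simple rounding rule is a discretization, not the LP formulation often
# used inside optimization models.

def cvar(costs, alpha=0.95):
    ordered = sorted(costs, reverse=True)               # worst scenarios first
    k = max(1, int(round((1 - alpha) * len(ordered))))  # tail size, at least 1
    tail = ordered[:k]
    return sum(tail) / len(tail)

scenario_costs = [100, 120, 95, 300, 110, 105, 500, 98, 102, 115,
                  101, 99, 250, 97, 103, 111, 96, 104, 109, 400]
risk = cvar(scenario_costs, alpha=0.90)  # mean of the 2 worst of 20 scenarios
```

Raising alpha makes the measure focus on a thinner, more extreme tail, which is how the paper's three risk strategies can trade expected cost against tail risk.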

  13. The timing and probability of treatment switch under cost uncertainty: an application to patients with gastrointestinal stromal tumor.

    PubMed

    de Mello-Sampayo, Felipa

    2014-03-01

    Cost fluctuations render the outcome of any treatment switch uncertain, so that decision makers might have to wait for more information before optimally switching treatments, especially when the incremental cost per quality-adjusted life year (QALY) gained cannot be fully recovered later on. To analyze the timing of a treatment switch under cost uncertainty, a dynamic stochastic model for the optimal timing of a treatment switch is developed and applied to a problem in medical decision making, i.e. to patients with unresectable gastrointestinal stromal tumour (GIST). The theoretical model suggests that cost uncertainty reduces expected net benefit. In addition, cost volatility discourages switching treatments. The stochastic model also illustrates that as technologies become less cost competitive, the cost uncertainty becomes more dominant. With limited substitutability, higher quality of technologies will increase the demand for those technologies regardless of the cost uncertainty. The results of the empirical application suggest that the first-line treatment may be the better choice when considering lifetime welfare. Under uncertainty and irreversibility, low-risk patients must begin the second-line treatment as soon as possible, which is precisely when the second-line treatment is least valuable. As the costs of reversing current treatment impacts fall, it becomes more feasible to provide the option-preserving treatment to these low-risk individuals later on. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  14. The Multi-factor Predictive Seis &Gis Model of Ecological, Genetical, Population Health Risk and Bio-geodynamic Processes In Geopathogenic Zones

    NASA Astrophysics Data System (ADS)

    Bondarenko, Y.

    I. Goal and Scope. Human birth rate decrease, death-rate growth, and an increased risk of mutagenic deviations take place in geopathogenic and anthropogenic hazard zones. Such zones create unfavourable conditions for the reproductive process of future generations. These negative trends should be considered a protective answer of the complex biosocial system to the appearance of natural and anthropogenic risk factors that are unfavourable for human health. The major goals of the scientific evaluation and reduction of the risk of hazardous processes on the territory of Dnipropetrovsk, along with the creation of the multi-factor predictive Spirit-Energy-Information Space "SEIS" & GIS model of ecological, genetic, and population health risk in connection with dangerous bio-geodynamic processes, were: multi-factor modeling and correlation of natural and anthropogenic environmental changes and those of human health; determination of indicators that show the risk of the appearance of destruction structures at different levels of organization and functioning of the city ecosystem (geophysical and geochemical fields, soil, hydrosphere, atmosphere, biosphere); and analysis of the regularities of natural, anthropogenic, and biological rhythm interactions. II. Methods. Long spatio-temporal research (Y. Bondarenko, 1996, 2000) has shown that ecological, genetic, and epidemiological processes are connected with the development of dangerous bio-geophysical and bio-geodynamic processes. Mathematical processing of space photos and lithogeochemical and geophysical maps using the JEIS and ERDAS computer systems was executed at the first stage of formation of the multi-layer geoinformation model "Dnipropetrovsk ARC View GIS".
The multi-factor nonlinear correlations between solar activity and cosmic ray variations; geophysical, geodynamic, geochemical, atmospheric, technological, biological, and socio-economic processes; and the oncologic case rate frequency and general and primary population sickness cases in Dnipropetrovsk City (1.2 million persons) are described by the multi-factor predictive SEIS & GIS model of geopathogenic zones, which determines the human health risks and hazards. III. Results and Conclusions. We have created the SEIS system and a multi-factor predictive SEIS model for the analysis of phase-metric spatio-temporal nonlinear correlations and variations of the rhythms of human health; ecological, genetic, and epidemiological risks; and demographic, socio-economic, bio-geophysical, and bio-geodynamic processes in geopathogenic hazard zones. Cosmophotomaps "CPM" of the vegetation index and the anthropogenic-landscape and landscape-geophysical human health risk of Dnipropetrovsk City present synthesis-based elements of the multi-layer GIS, which include multispectral SPOT images; maps of different geophysical, geochemical, anthropogenic, and citogenic risk factors; and maps of integral oncologic case rate frequency and general and primary population sickness cases for administrative districts. Results of the multi-layer spatio-temporal correlation of geophysical field parameters and variations of population sickness rate rhythms have enabled us to state grounds for and to develop a medico-biological and bio-geodynamic classification of geopathogenic zones. The bio-geodynamic model has served to define contours of anthropogenic-landscape and landscape-geophysical human health risk in Dnipropetrovsk City. Biorhythmic variations provide a foundation for understanding the physiological mechanisms of an organism's adaptation to extreme helio-geophysical and bio-geodynamic environmental conditions, which are dictated by changes in the Multi-factor Correlation Stress Field "MCSF" with deformation of the 5D SEIS.
Interaction between organism and environment results in continuous superpositioning of external (exogenic) Nuclear-Molecular-Cristallic "NMC" MCSF rhythms on internal (endogenic) Nuclear-Molecular-Cellular "NMCl" MCSF rhythms. Their resonance-wave (energy-information) integration and disintegration are responsible for the structural and functional state of different physiological systems. Herewith, a complex restructurization of defense functions blocks the adaptation process and may turn out to be the primary reason for phase shifting, the hindering of processes and biorhythms, and the appearance of different diseases. The interaction of biorhythms with natural and anthropogenic rhythms specifies the peculiar features of the environmental adaptation of living species. Such interaction results in the correlation of seasonal rhythms in variations of thermo-baro-geodynamic "TBG" parameters of ambient air with toxic concentrations and human health risk in Dnipropetrovsk City. Bio-geodynamic analysis of medical and demographic situations has provided for the search for spatio-temporal correlations between rhythms of general and primary population sickness cases and oncologic case rate frequency, other medico-demographic rhythms, natural processes (helio-geophysical, thermodynamic, geodynamic), and anthropogenic processes (industrial and household waste disposal, toxic emissions and their concentration in ambient air). In 1986, the year of minimum helio-geophysical activity "2G1dG1", maximum anthropogenic processes associated with changes in the sickness and death rates of the population of Earth were synchronized. Taking into account the quantum character of SEIS rhythms, 5 reference levels of desynchronized helio-geophysical and bio-geodynamic processes affecting the population sickness rate have been specified within the bio-geodynamic models. The first reference level of SEIS desynchronization includes rhythms with a period of 22.5 years: ... 1958.2; 1980.7; 2003.2; ....
The second reference level of SEIS desynchronization includes rhythms with a period of 11.25 years: ... 1980.7; 1992; 2003.2; .... The third reference level covers 5.625-year periodic rhythms: ... 1980.7; 1986.3; 1992; 1997.6; 2003.2; .... The fourth quantum reference level includes rhythms with a period of 2.8125 years: ... 1980.7; 1983.5; 1986.3; 1989.1; 1992; 1994.8; 1997.6; 2000.4; 2003.2; .... Rhythms with a 1.40625-year period fall in the fifth reference level of SEIS desynchronization: ... 1980.7; 1982.1; 1983.5; 1984.9; 1986.3; 1987.7; 1989.1; 1990.5; 1992; 1993.3; 1994.8; 1996.2; 1997.6; 1999; 2000.4; 2001.8; 2003.2; .... Analysis of the changing medical and demographic situation in Ukraine (1981-1992) and in Dnipropetrovsk (1988-1995) has allowed us to back up the theoretical model of various-level rhythm quanta, with nonlinear regularities due to phase-metric spatio-temporal deformation being specified. Application of the new technologies of Risk Analysis, Synthesis, and SEIS Modeling to the choice of a burial place for dangerous radioactive wastes in the zone of the Chernobyl nuclear disaster (Shestopalov V., Bondarenko Y...., 1998) has shown their very high efficiency in comparison with GIS analysis. IV. Recommendations and Outlook. Regarding bio-geodynamic modeling of the spatio-temporal structure of areas with common childhood sickness rates, it must be mentioned that the only thing that can favour exact prediction of where and when important catastrophes and epidemics will take place is correct and complex bio-geodynamic modeling. The imperfection of present GIS is the result of the lack of interactive facilities for multi-factor modeling of nonlinear natural and anthropogenic processes. Equation coefficients calculated for some areas are often irrelevant when applied to others.
In this connection there arise a number of problems concerning the practical application and reliability of GIS models that are used to carry out efficient ecological monitoring. References: Bondarenko Y., 1997, Drawing up Cosmophotomaps and Multi-factor Forecasting of Hazard of Development of Dangerous Geodynamic Processes in Dnipropetrovsk, The Technically-Natural Problems of Failures and Catastrophes in Connection with Development of Dangerous Geological Processes, Kiev, Ukraine, 1997. Bondarenko Y., 1997, The Methodology of a State the Value of Quality of the Ground and the House Level them Ecology-Genetic-Toxic of the human health risk based on multi-layer cartographical model, Experience of Application of GIS Technologies for Creating Cadastral Systems, Yalta, Ukraine, 1997, p. 39-40. Shestopalov V., Bondarenko Y., Zayonts I., Rudenko Y., Bohuslavsky A., 1998, Complexation of Structural-Geodynamical and Hydrogeological Methods of Studying Areas to Reveal Geological Structural Perspectives for Deep Isolation of Radioactive Wastes, Field Testing and Associated Modeling of Potential High-Level Nuclear Waste Geologic Disposal Sites, Berkeley, USA, 1998, p. 81-82.

  15. Integrating forest ecosystem services into the farming landscape: A stochastic economic assessment.

    PubMed

    Monge, Juan J; Parker, Warren J; Richardson, James W

    2016-06-01

    The objective of this study was to assess how payments for ecosystem services could assist plantation forestry's integration into pastoral dairy farming in order to improve environmental outcomes and increase business resilience to both price uncertainty and production limits imposed by environmental policies. Stochastic Dominance (SD) criteria and portfolio analysis, accounting for farmers' risk aversion levels, were used to rank different land-use alternatives and landscapes with different levels of plantation forestry integration. The study was focused on a modal 200-ha dairy farm in the Lake Rotorua Catchment of the Central North Island region of New Zealand, where national environmental policies are being implemented to improve water quality and reduce greenhouse gas emissions. Nitrogen and carbon payments would help farmers improve early cash flows for forestry, provide financial leverage to undertake afforestation projects and contribute to improved environmental outcomes for the catchment. The SD criteria demonstrated that although dairy farming generates the highest returns, plantation forestry with nitrogen and carbon payments would be a preferred alternative for landowners with relatively low risk aversion levels who consider return volatility and environmental limits within their land-use change criteria. Using the confidence premium concept, environmental payments to encourage plantation forestry into the landscape were shown to be lower when the majority of landowners are risk averse. The certainty equivalence approach helped to identify the optimal dairy-forestry portfolio arrangements for landowners of different levels of risk aversion, intensities of dairy farming (status quo and intensified) and nitrogen prices. 
At low nitrogen prices, risk-neutral farmers would choose to afforest less than half of the farm and operate at the maximum nitrogen allowance, because dairy farming at both intensities provides the highest return among the different land uses available. However, at relatively low risk aversion levels, farmers would operate below the maximum nitrogen allowance by including plantation forestry to a greater extent than risk-neutral farmers, due to its more certain returns. At a high nitrogen price of $400/kg, plantation forestry would completely subsume dairying, across risk aversion and intensity levels. These results confirm that plantation forestry, as well as being an environmentally sound land-use alternative, reduces uncertainty for landowners who are exposed to volatile international markets for dairy commodities. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Managing Financial Risk to Hydropower in Snow Dominated Systems: A Hetch Hetchy Case Study

    NASA Astrophysics Data System (ADS)

    Hamilton, A. L.; Characklis, G. W.; Reed, P. M.

    2017-12-01

    Hydropower generation in snow dominated systems is vulnerable to severe shortfalls in years with low snowpack. Meanwhile, generators are also vulnerable to variability in electricity demand and wholesale electricity prices, both of which can be impacted by factors such as temperature and natural gas price. Year-to-year variability in these underlying stochastic variables leads to financial volatility and the threat of low revenue periods, which can be highly disruptive for generators with large fixed operating costs and debt service. In this research, the Hetch Hetchy Power system is used to characterize financial risk in a snow dominated hydropower system. Owned and operated by the San Francisco Public Utilities Commission, Hetch Hetchy generates power for its own municipal operations and sells excess power to irrigation districts, as well as on the wholesale market. This investigation considers the effects of variability in snowpack, temperature, and natural gas price on Hetch Hetchy Power's yearly revenues. This information is then used to evaluate the effectiveness of various financial risk management tools for hedging against revenue variability. These tools are designed to mitigate all three potential forms of financial risk (i.e. low hydropower generation, low electricity demand, and low/high electricity prices) and include temperature-based derivative contracts, natural gas price-based derivative contracts, and a novel form of snowpack-based index insurance contract. These are incorporated into a comprehensive risk management portfolio, along with self-insurance in which the utility buffers yearly revenue volatility using a contingency fund. By adaptively managing the portfolio strategy, a utility can efficiently spread yearly risks over a multi-year time horizon.
The Borg Multiobjective Evolutionary Algorithm is used to generate a set of Pareto optimal portfolio strategies, which are used to compare the tradeoffs in objectives such as expected revenues, low revenues, revenue volatility, and portfolio complexity.
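The snowpack-based index insurance mentioned above pays on the observed index rather than on realized losses. A minimal payout-structure sketch, with a hypothetical strike, tick, and cap (not values from the study):

```python
# Sketch of a capped put-style index insurance contract on snowpack
# (snow water equivalent, SWE). Strike, tick, and cap are hypothetical.

def insurance_payout(swe_inches, strike=20.0, tick=100_000.0, cap=2_000_000.0):
    """Pay `tick` dollars per inch the index falls below `strike`, up to `cap`."""
    shortfall = max(0.0, strike - swe_inches)
    return min(cap, tick * shortfall)

# Payouts for a good year, a mildly dry year, and a severe drought year.
payouts = [insurance_payout(swe) for swe in (35.0, 18.5, 4.0)]
```

Because the payout depends only on the index, the contract avoids moral hazard and loss adjustment, at the cost of basis risk (the index may not match the utility's actual revenue shortfall).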

  17. GIANT IMPACT: AN EFFICIENT MECHANISM FOR THE DEVOLATILIZATION OF SUPER-EARTHS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Shang-Fei; Hori, Yasunori; Lin, D. N. C.

    Mini-Neptunes and volatile-poor super-Earths coexist on adjacent orbits in proximity to host stars such as Kepler-36 and Kepler-11. Several post-formation processes have been proposed for explaining the origin of the compositional diversity between neighboring planets: mass loss via stellar XUV irradiation, degassing of accreted material, and in situ accumulation of the disk gas. Close-in planets are also likely to experience giant impacts during the advanced stage of planet formation. This study examines the possibility of transforming volatile-rich super-Earths/mini-Neptunes into volatile-depleted super-Earths through giant impacts. We present the results of three-dimensional hydrodynamic simulations of giant impacts in the accretionary and disruptive regimes. Target planets are modeled with a three-layered structure composed of an iron core, silicate mantle, and hydrogen/helium envelope. In the disruptive case, the giant impact can remove most of the H/He atmosphere immediately and homogenize the refractory material in the planetary interior. In the accretionary case, the planet is able to retain more than half of the original gaseous envelope, while a compositional gradient suppresses efficient heat transfer as the planetary interior undergoes double-diffusive convection. After the giant impact, a hot and inflated planet cools and contracts slowly. The extended atmosphere enhances the mass loss via both a Parker wind induced by thermal pressure and hydrodynamic escape driven by the stellar XUV irradiation. As a result, the entire gaseous envelope is expected to be lost due to the combination of those processes in both cases. Based on our results, we propose that Kepler-36b may have been significantly devolatilized by giant impacts, while a substantial fraction of Kepler-36c’s atmosphere may remain intact.
Furthermore, the stochastic nature of giant impacts may account for the observed large dispersion in the mass–radius relationship of close-in super-Earths and mini-Neptunes (at least to some extent).

  18. Modern methods for the quality management of high-rate melt solidification

    NASA Astrophysics Data System (ADS)

    Vasiliev, V. A.; Odinokov, S. A.; Serov, M. M.

    2016-12-01

    The quality management of high-rate melt solidification requires a combined solution, obtained by methods and approaches adapted to the specific situation. A technological audit is recommended to estimate the capabilities of the process. Statistical methods are proposed, with the choice of key parameters. Numerical methods, which can be used to perform simulation under multifactor technological conditions and to increase the quality of decisions, are of particular importance.

  19. An Evaluation of the Relationship between Supervisory Techniques and Organizational Outcomes among the Supervisors in the Agricultural Extension Service in the Eastern Region Districts of Uganda. Summary of Research 81.

    ERIC Educational Resources Information Center

    Padde, Paul; And Others

    A descriptive study examined the relationship between supervisory techniques and organizational outcomes among supervisors in the agricultural extension service in eight districts in eastern Uganda. Self-rating and rater forms of the Multifactor Leadership Questionnaire were sent to 220 extension agents, 8 field supervisors, and 8 deputy field…

  20. Jackknife for Variance Analysis of Multifactor Experiments.

    DTIC Science & Technology

    1982-05-01

    variance-covariance matrix is generated by a subroutine named CORAN (UNIVAC, 1969). The jackknife variances are then punched on computer cards in the same...

  1. Long-duration effect of multi-factor stresses on the cellular biochemistry, oil-yielding performance and morphology of Nannochloropsis oculata

    PubMed Central

    Wei, Likun; Huang, Xuxiong

    2017-01-01

    Microalga Nannochloropsis oculata is a promising alternative feedstock for biodiesel. Elevating its oil-yielding capacity is conducive to cost-saving biodiesel production. However, the regulatory processes of multi-factor collaborative stresses (MFCS) on the oil-yielding performance of N. oculata are unclear. The duration effects of MFCS (high irradiation, nitrogen deficiency, and elevated iron supplementation) on N. oculata were investigated in an 18-d batch culture. Despite the reduction in cell division, the biomass concentration increased, resulting from the large accumulation of carbon/energy reserves. However, different storage forms were found among the cellular storage compounds, and both the protein content and pigment composition changed swiftly and drastically. Analysis of four biodiesel properties using pertinent empirical equations indicated progressive improvement in lipid classes and fatty acid composition. The variation curve of neutral lipid productivity was monitored with fluorescent Nile red and was closely correlated with the results from conventional methods. In addition, a series of changes in the organelles (e.g., chloroplast, lipid body, and vacuole) and cell shape, dependent on the stress duration, were observed by TEM and LSCM. These changes presumably played an important role in the acclimation of N. oculata to MFCS and accordingly improved its oil-yielding performance. PMID:28346505

  2. Multifactor-Dimensionality Reduction Reveals High-Order Interactions among Estrogen-Metabolism Genes in Sporadic Breast Cancer

    PubMed Central

    Ritchie, Marylyn D.; Hahn, Lance W.; Roodi, Nady; Bailey, L. Renee; Dupont, William D.; Parl, Fritz F.; Moore, Jason H.

    2001-01-01

    One of the greatest challenges facing human geneticists is the identification and characterization of susceptibility genes for common complex multifactorial human diseases. This challenge is partly due to the limitations of parametric-statistical methods for detection of gene effects that are dependent solely or partially on interactions with other genes and with environmental exposures. We introduce multifactor-dimensionality reduction (MDR) as a method for reducing the dimensionality of multilocus information, to improve the identification of polymorphism combinations associated with disease risk. The MDR method is nonparametric (i.e., no hypothesis about the value of a statistical parameter is made), is model-free (i.e., it assumes no particular inheritance model), and is directly applicable to case-control and discordant-sib-pair studies. Using simulated case-control data, we demonstrate that MDR has reasonable power to identify interactions among two or more loci in relatively small samples. When it was applied to a sporadic breast cancer case-control data set, in the absence of any statistically significant independent main effects, MDR identified a statistically significant high-order interaction among four polymorphisms from three different estrogen-metabolism genes. To our knowledge, this is the first report of a four-locus interaction associated with a common complex multifactorial disease. PMID:11404819
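
The core dimensionality-reduction step of MDR can be illustrated compactly: pool the genotype cells of a chosen locus subset and label each cell high- or low-risk by comparing its case:control ratio against a threshold (typically the overall case:control ratio). This is a hedged sketch with toy data, not the authors' implementation:

```python
from collections import defaultdict

def mdr_labels(genotypes, status, loci, threshold=1.0):
    """Collapse a multilocus genotype space into one binary risk attribute."""
    cells = defaultdict(lambda: [0, 0])  # cell -> [n_cases, n_controls]
    for g, s in zip(genotypes, status):
        key = tuple(g[l] for l in loci)
        cells[key][0 if s == 1 else 1] += 1
    # a cell is "high risk" when its cases exceed threshold * controls
    return {cell: cases > threshold * controls
            for cell, (cases, controls) in cells.items()}

# toy data: each row is a genotype vector over 3 loci; status 1 = case
genotypes = [(0, 1, 2), (0, 1, 0), (1, 1, 2), (0, 1, 2), (1, 0, 0)]
status    = [1, 0, 1, 1, 0]
labels = mdr_labels(genotypes, status, loci=(0, 2))
print(labels)
```

In the full MDR procedure, this labeled one-dimensional attribute is then scored by cross-validated misclassification error, and the search repeats over all locus subsets of a given size.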

  3. Multilevel Factorial Experiments for Developing Behavioral Interventions: Power, Sample Size, and Resource Considerations†

    PubMed Central

    Dziak, John J.; Nahum-Shani, Inbal; Collins, Linda M.

    2012-01-01

    Factorial experimental designs have many potential advantages for behavioral scientists. For example, such designs may be useful in building more potent interventions, by helping investigators to screen several candidate intervention components simultaneously and decide which are likely to offer greater benefit before evaluating the intervention as a whole. However, sample size and power considerations may challenge investigators attempting to apply such designs, especially when the population of interest is multilevel (e.g., when students are nested within schools, or employees within organizations). In this article we examine the feasibility of factorial experimental designs with multiple factors in a multilevel, clustered setting (i.e., of multilevel multifactor experiments). We conduct Monte Carlo simulations to demonstrate how design elements such as the number of clusters, the number of lower-level units, and the intraclass correlation affect power. Our results suggest that multilevel, multifactor experiments are feasible for factor-screening purposes, because of the economical properties of complete and fractional factorial experimental designs. We also discuss resources for sample size planning and power estimation for multilevel factorial experiments. These results are discussed from a resource management perspective, in which the goal is to choose a design that maximizes the scientific benefit using the resources available for an investigation. PMID:22309956
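
A minimal Monte Carlo power check of the kind the article describes can be sketched as follows. The design values (40 clusters of 20 units, a single cluster-level factor, analysis on cluster means with a normal approximation) are illustrative assumptions, not the authors' simulation settings:

```python
import math
import random
import statistics

def simulate_power(n_clusters=40, cluster_size=20, effect=0.3, icc=0.1,
                   reps=500, seed=1):
    """Estimate power for one cluster-level factor in a clustered design.

    Total variance is normalized to 1; a cluster random intercept with
    variance `icc` induces the intraclass correlation.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        means_a, means_b = [], []
        for j in range(n_clusters):
            x = j % 2                      # alternate cluster-level condition
            u = rng.gauss(0.0, math.sqrt(icc))
            ys = [effect * x + u + rng.gauss(0.0, math.sqrt(1 - icc))
                  for _ in range(cluster_size)]
            (means_b if x else means_a).append(statistics.fmean(ys))
        # two-sample test on cluster means (normal approximation for brevity)
        se = math.sqrt(statistics.variance(means_a) / len(means_a) +
                       statistics.variance(means_b) / len(means_b))
        z = (statistics.fmean(means_b) - statistics.fmean(means_a)) / se
        hits += abs(z) > 1.96
    return hits / reps

print(simulate_power())
```

Varying `n_clusters`, `cluster_size`, and `icc` in such a loop reproduces the qualitative trade-offs the article examines, e.g. that adding clusters buys more power than enlarging clusters when the ICC is nontrivial.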

  4. Cardiovascular disease prevention and lifestyle interventions: effectiveness and efficacy.

    PubMed

    Haskell, William L

    2003-01-01

    Over the past half century, scientific data have supported the strong relationship between the way a person or population lives and their risk of developing or dying from cardiovascular disease (CVD). While heredity can be a major factor for some people, personal health habits and environmental/cultural exposure are more important factors. CVD is a multifactor process driven by a variety of biological and behavioral characteristics of the person, including a number of well-established and emerging risk factors. Not smoking, being physically active, eating a heart-healthy diet, staying reasonably lean, and avoiding major stress and depression are the major components of an effective CVD prevention program. For people at high risk of CVD, medications frequently need to be added to a healthy lifestyle to minimize their risk of a heart attack or stroke, particularly in persons with conditions such as hypertension, hypercholesterolemia, or hyperglycemia. Maintaining an effective CVD prevention program in technologically advanced societies cannot be achieved by many high-risk persons without effective and sustained support from a well-organized health care system. Nurse-provided or nurse-coordinated care management programs using an integrated or multifactor approach have been highly effective in reducing CVD morbidity and mortality in high-risk persons.

  5. Multifactor leadership styles and new exposure to workplace bullying: a six-month prospective study

    PubMed Central

    TSUNO, Kanami; KAWAKAMI, Norito

    2014-01-01

    This study investigated the prospective association between supervisor leadership styles and workplace bullying. Altogether 404 civil servants from a local government in Japan completed baseline and follow-up surveys. The leadership variables and exposure to bullying were measured by the Multifactor Leadership Questionnaire and the Negative Acts Questionnaire-Revised, respectively. The prevalence of workplace bullying was 14.8% at baseline and 15.1% at follow-up. Among respondents who did not experience bullying at baseline (n=216), those who worked under supervisors rated higher in passive laissez-faire leadership had a 4.3 times higher risk of new exposure to bullying. On the other hand, respondents whose supervisors were highly considerate of the individual had a 70% lower risk of new exposure to bullying. In the entire sample (n=317), passive laissez-faire leadership was significantly and positively associated with new exposure to bullying, while charisma/inspiration, individual consideration, and contingent reward were negatively associated with it, after adjusting for demographic and occupational characteristics at baseline, life events during follow-up, and exposure to workplace bullying at baseline. Results indicated that passive laissez-faire and low individual consideration leadership styles at baseline were strong predictors of new exposure to bullying, and that high individual consideration by supervisors/managers could be a preventive factor against bullying. PMID:25382384

  6. Multifactor leadership styles and new exposure to workplace bullying: a six-month prospective study.

    PubMed

    Tsuno, Kanami; Kawakami, Norito

    2015-01-01

    This study investigated the prospective association between supervisor leadership styles and workplace bullying. Altogether 404 civil servants from a local government in Japan completed baseline and follow-up surveys. The leadership variables and exposure to bullying were measured by the Multifactor Leadership Questionnaire and the Negative Acts Questionnaire-Revised, respectively. The prevalence of workplace bullying was 14.8% at baseline and 15.1% at follow-up. Among respondents who did not experience bullying at baseline (n=216), those who worked under supervisors rated higher in passive laissez-faire leadership had a 4.3 times higher risk of new exposure to bullying. On the other hand, respondents whose supervisors were highly considerate of the individual had a 70% lower risk of new exposure to bullying. In the entire sample (n=317), passive laissez-faire leadership was significantly and positively associated with new exposure to bullying, while charisma/inspiration, individual consideration, and contingent reward were negatively associated with it, after adjusting for demographic and occupational characteristics at baseline, life events during follow-up, and exposure to workplace bullying at baseline. Results indicated that passive laissez-faire and low individual consideration leadership styles at baseline were strong predictors of new exposure to bullying, and that high individual consideration by supervisors/managers could be a preventive factor against bullying.

  7. Sensitivity Analysis of Mechanical Parameters of Different Rock Layers to the Stability of Coal Roadway in Soft Rock Strata

    PubMed Central

    Zhao, Zeng-hui; Wang, Wei-ming; Gao, Xin; Yan, Ji-xing

    2013-01-01

    According to the geological characteristics of the Xinjiang Ili mine in the western area of China, a physical model of interstratified strata composed of soft rock and a hard coal seam was established. Selecting the tunnel position, deformation modulus, and strength parameters of each layer as influencing factors, the sensitivity coefficient of roadway deformation to each parameter was first analyzed based on a Mohr-Coulomb strain softening model and nonlinear elastic-plastic finite element analysis. Then the effects of the influencing factors that showed high sensitivity were further discussed. Finally, a regression model for the relationship between roadway displacements and multifactors was obtained by equivalent linear regression under multiple factors. The results show that the roadway deformation is highly sensitive to the depth of the coal seam under the floor, which should be considered in the layout of the coal roadway; the deformation modulus and strength of the coal seam and floor have a great influence on the global stability of the tunnel; on the contrary, roadway deformation is not sensitive to the mechanical parameters of the soft roof; and roadway deformation under random combinations of multi-factors can be deduced by the regression model. These conclusions provide theoretical guidance for the arrangement and stability maintenance of coal roadways. PMID:24459447

  8. Application of an Instrumental and Computational Approach for Improving the Vibration Behavior of Structural Panels Using a Lightweight Multilayer Composite

    PubMed Central

    Sánchez, Alberto; García, Manuel; Sebastián, Miguel Angel; Camacho, Ana María

    2014-01-01

    This work presents a hybrid (experimental-computational) application for improving the vibration behavior of structural components using a lightweight multilayer composite. The vibration behavior of a flat steel plate has been improved by gluing on a lightweight composite formed by a core of polyurethane foam and two paper mats placed on its faces. This composite enables the natural frequencies to be increased and the modal density of the plate to be reduced, moving the natural frequencies of the plate out of the excitation range and thereby improving the vibration behavior of the plate. A specific experimental model for measuring the Operating Deflection Shape (ODS) has been developed, which enables an evaluation of the goodness of the natural frequencies obtained with the computational model simulated by the finite element method (FEM). The model of composite + flat steel plate determined by FEM was used to conduct a parametric study, and the most influential factors for the 1st, 2nd and 3rd modes were identified using a multifactor analysis of variance (Multifactor-ANOVA). The presented results can be easily particularized for other cases, and may be used in cycles of continuous improvement as well as in product development at the material, piece, and complete-system levels. PMID:24618779

  9. Computational simulation of probabilistic lifetime strength for aerospace materials subjected to high temperature, mechanical fatigue, creep and thermal fatigue

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Bast, Callie C.; Trimble, Greg A.

    1992-01-01

    This report presents the results of a fourth-year effort of a research program, conducted for NASA-LeRC by the University of Texas at San Antonio (UTSA). The research included on-going development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic material strength degradation model, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subject to a number of effects or primitive variables. These primitive variables may include high temperature, fatigue or creep. In most cases, strength is reduced as a result of the action of a variable. This multifactor interaction strength degradation equation has been randomized and is included in the computer program, PROMISS. Also included in the research is the development of methodology to calibrate the above-described constitutive equation using actual experimental materials data together with regression analysis of that data, thereby predicting values for the empirical material constants for each effect or primitive variable. This regression methodology is included in the computer program, PROMISC. Actual experimental materials data were obtained from industry and the open literature for materials typical of applications in aerospace propulsion system components. Material data for Inconel 718 have been analyzed using the developed methodology.
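
For context, the multifactor interaction equation referenced here is commonly written in the PROMISS literature as a product over the primitive variables. The notation below follows that common form and is an assumption for illustration, not a quotation from this report:

```latex
\frac{S}{S_0} \;=\; \prod_{i=1}^{n}\left[\frac{A_{iF}-A_i}{A_{iF}-A_{i0}}\right]^{a_i}
```

Here S_0 is the reference strength, A_i the current value of the i-th primitive variable (e.g., temperature), A_{iF} its ultimate value, A_{i0} a reference value, and a_i the empirical constant estimated by the regression step; randomization treats the a_i (and the variable values) as random quantities.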

  10. Computational simulation of probabilistic lifetime strength for aerospace materials subjected to high temperature, mechanical fatigue, creep, and thermal fatigue

    NASA Technical Reports Server (NTRS)

    Boyce, Lola; Bast, Callie C.; Trimble, Greg A.

    1992-01-01

    The results of a fourth-year effort of a research program conducted for NASA-LeRC by The University of Texas at San Antonio (UTSA) are presented. The research included on-going development of methodology that provides probabilistic lifetime strength of aerospace materials via computational simulation. A probabilistic material strength degradation model, in the form of a randomized multifactor interaction equation, is postulated for strength degradation of structural components of aerospace propulsion systems subjected to a number of effects or primitive variables. These primitive variables may include high temperature, fatigue, or creep. In most cases, strength is reduced as a result of the action of a variable. This multifactor interaction strength degradation equation was randomized and is included in the computer program, PROMISS. Also included in the research is the development of methodology to calibrate the above-described constitutive equation using actual experimental materials data together with regression analysis of that data, thereby predicting values for the empirical material constants for each effect or primitive variable. This regression methodology is included in the computer program, PROMISC. Actual experimental materials data were obtained from industry and the open literature for materials typical of applications in aerospace propulsion system components. Material data for Inconel 718 were analyzed using the developed methodology.

  11. Multilevel factorial experiments for developing behavioral interventions: power, sample size, and resource considerations.

    PubMed

    Dziak, John J; Nahum-Shani, Inbal; Collins, Linda M

    2012-06-01

    Factorial experimental designs have many potential advantages for behavioral scientists. For example, such designs may be useful in building more potent interventions by helping investigators to screen several candidate intervention components simultaneously and to decide which are likely to offer greater benefit before evaluating the intervention as a whole. However, sample size and power considerations may challenge investigators attempting to apply such designs, especially when the population of interest is multilevel (e.g., when students are nested within schools, or when employees are nested within organizations). In this article, we examine the feasibility of factorial experimental designs with multiple factors in a multilevel, clustered setting (i.e., of multilevel, multifactor experiments). We conduct Monte Carlo simulations to demonstrate how design elements-such as the number of clusters, the number of lower-level units, and the intraclass correlation-affect power. Our results suggest that multilevel, multifactor experiments are feasible for factor-screening purposes because of the economical properties of complete and fractional factorial experimental designs. We also discuss resources for sample size planning and power estimation for multilevel factorial experiments. These results are discussed from a resource management perspective, in which the goal is to choose a design that maximizes the scientific benefit using the resources available for an investigation. (c) 2012 APA, all rights reserved

  12. Actor-network Procedures: Modeling Multi-factor Authentication, Device Pairing, Social Interactions

    DTIC Science & Technology

    2011-08-29

    unmodifiable properties of your body; or the capabilities that you cannot convey to others, such as your handwriting. An identity can thus be determined by...network, two principals with the same set of secrets but, say, different computational powers, can be distinguished by timing their responses. Or they...says that configurations are finite sets. Partially ordered multisets, or pomsets, were introduced and extensively studied by Vaughan Pratt and his

  13. A remote sensing-assisted risk rating study to predict oak decline and recovery in the Missouri Ozark Highlands, USA

    Treesearch

    Cuizhen Wang; Hong S. He; John M. Kabrick

    2008-01-01

    Forests in the Ozark Highlands underwent widespread oak decline affected by severe droughts in 1999-2000. In this study, the differential normalized difference water index was calculated to detect crown dieback. A multi-factor risk rating system was built to map risk levels of stands. As a quick response to drought, decline in 2000 mostly occurred in stands at low to...

  14. Reactor performances and microbial communities of biogas reactors: effects of inoculum sources.

    PubMed

    Han, Sheng; Liu, Yafeng; Zhang, Shicheng; Luo, Gang

    2016-01-01

    Anaerobic digestion is a very complex process that is mediated by various microorganisms, and an understanding of the microbial community assembly and its corresponding function is critical in order to better control the anaerobic process. The present study investigated the effect of different inocula on the microbial community assembly in biogas reactors treating cellulose, and three parallel biogas reactors with the same inoculum were also operated in order to reveal the reproducibility of both the microbial communities and functions of the biogas reactors. The results showed that the biogas production, volatile fatty acid (VFA) concentrations, and pH differed among the biogas reactors with different inocula, and different steady-state microbial community patterns were also obtained in the different biogas reactors, as reflected by Bray-Curtis similarity matrices and taxonomic classification. This indicated that the inoculum played an important role in shaping the microbial communities of the biogas reactors in the present study, and that the microbial community assembly in the biogas reactors did not follow niche-based ecology theory. Furthermore, it was found that the microbial communities and reactor performances of the parallel biogas reactors with the same inoculum were different, which could be explained by neutral-based ecology theory; stochastic factors should therefore have played important roles in the microbial community assembly in the biogas reactors. The Bray-Curtis similarity matrix analysis suggested that the inoculum affected the microbial community assembly more than the stochastic factors did, since the samples with different inocula had lower similarity (10-20 %) than the samples from the parallel biogas reactors (30 %).
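
The Bray-Curtis measure behind the similarity matrices mentioned above is simple to compute. A minimal sketch with illustrative taxon counts (similarity is one minus the dissimilarity):

```python
def bray_curtis(a, b):
    """Bray-Curtis dissimilarity between two abundance profiles.

    0 means identical communities; 1 means no shared taxa.
    """
    num = sum(abs(x - y) for x, y in zip(a, b))
    den = sum(a) + sum(b)
    return num / den

sample_1 = [10, 0, 5, 5]   # toy taxon counts for one reactor
sample_2 = [2, 8, 5, 5]
print(bray_curtis(sample_1, sample_2))
```

Pairwise application over all samples yields the dissimilarity matrix that ordination or clustering methods then summarize.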

  15. Applications of statistical physics to the social and economic sciences

    NASA Astrophysics Data System (ADS)

    Petersen, Alexander M.

    2011-12-01

    This thesis applies statistical physics concepts and methods to quantitatively analyze socioeconomic systems. For each system we combine theoretical models and empirical data analysis in order to better understand the real-world system in relation to the complex interactions between the underlying human agents. This thesis is separated into three parts: (i) response dynamics in financial markets, (ii) dynamics of career trajectories, and (iii) a stochastic opinion model with quenched disorder. In Part I we quantify the response of U.S. markets to financial shocks, which perturb markets and trigger "herding behavior" among traders. We use concepts from earthquake physics to quantify the decay of volatility shocks after the "main shock." We also find, surprisingly, that we can make quantitative statements even before the main shock. In order to analyze market behavior before as well as after "anticipated news" we use Federal Reserve interest-rate announcements, which are regular events that are also scheduled in advance. In Part II we analyze the statistical physics of career longevity. We construct a stochastic model for career progress which has two main ingredients: (a) random forward progress in the career and (b) random termination of the career. We incorporate the rich-get-richer (Matthew) effect into ingredient (a), meaning that it is easier to move forward in the career the farther along one is in the career. We verify the model predictions analyzing data on 400,000 scientific careers and 20,000 professional sports careers. Our model highlights the importance of early career development, showing that many careers are stunted by the relative disadvantage associated with inexperience. In Part III we analyze a stochastic two-state spin model which represents a system of voters embedded on a network. We investigate the role in consensus formation of "zealots", which are agents with time-independent opinion. 
    Our main result is the unexpected finding that it is the number and not the density of zealots which determines the steady-state opinion polarization. We compare our findings with results for United States Presidential elections.

  16. The impact of high total cholesterol and high low-density lipoprotein on avascular necrosis of the femoral head in low-energy femoral neck fractures.

    PubMed

    Zeng, Xianshang; Zhan, Ke; Zhang, Lili; Zeng, Dan; Yu, Weiguang; Zhang, Xinchao; Zhao, Mingdong; Lai, Zhicheng; Chen, Runzhen

    2017-02-17

    Avascular necrosis of the femoral head (AVNFH) typically constitutes 5 to 15% of all complications of low-energy femoral neck fractures, and due to an increasingly ageing population and a rising prevalence of femoral neck fractures, the number of patients who develop AVNFH is increasing. However, there is no consensus regarding the relationship between blood lipid abnormalities and postoperative AVNFH. The purpose of this retrospective study was to investigate the relationship between blood lipid abnormalities and AVNFH following the femoral neck fracture operation among an elderly population. A retrospective, comparative study was performed at our institution. Between June 2005 and November 2009, 653 elderly patients (653 hips) with low-energy femoral neck fractures underwent closed reduction and internal fixation with cancellous screws (Smith and Nephew, Memphis, Tennessee). Follow-up occurred at 1, 6, 12, 18, 24, 30, and 36 months after surgery. Logistic multi-factor regression analysis was used to assess the risk factors of AVNFH and to determine the effect of blood lipid levels on AVNFH development. Inclusion and exclusion criteria were predetermined to focus on isolated freshly closed femoral neck fractures in the elderly population. The primary outcome was the blood lipid levels. The secondary outcome was the logistic multi-factor regression analysis. A total of 325 elderly patients with low-energy femoral neck fractures (AVNFH, n = 160; control, n = 165) were assessed. In the AVNFH group, the average TC, TG, LDL, and Apo-B values were 7.11 ± 3.16 mmol/L, 2.15 ± 0.89 mmol/L, 4.49 ± 1.38 mmol/L, and 79.69 ± 17.29 mg/dL, respectively; all of which were significantly higher than the values in the control group. Logistic multi-factor regression analysis showed that both TC and LDL were independent factors influencing postoperative AVNFH in femoral neck fractures.
This evidence indicates that AVNFH was significantly associated with blood lipid abnormalities in elderly patients with low-energy femoral neck fractures. The findings of this pilot trial justify a larger study to determine whether the result is more generally applicable to a broader population.

  17. Probabilistic Geobiological Classification Using Elemental Abundance Distributions and Lossless Image Compression in Recent and Modern Organisms

    NASA Technical Reports Server (NTRS)

    Storrie-Lombardi, Michael C.; Hoover, Richard B.

    2005-01-01

    Last year we presented techniques for the detection of fossils during robotic missions to Mars using both structural and chemical signatures [Storrie-Lombardi and Hoover, 2004]. Analyses included lossless compression of photographic images to estimate the relative complexity of a putative fossil compared to the rock matrix [Corsetti and Storrie-Lombardi, 2003] and elemental abundance distributions to provide mineralogical classification of the rock matrix [Storrie-Lombardi and Fisk, 2004]. We presented a classification strategy employing two exploratory classification algorithms (Principal Component Analysis and Hierarchical Cluster Analysis) and a non-linear stochastic neural network to produce a Bayesian estimate of classification accuracy. We now present an extension of our previous experiments exploring putative fossil forms morphologically resembling cyanobacteria discovered in the Orgueil meteorite. Elemental abundances (C6, N7, O8, Na11, Mg12, Al13, Si14, P15, S16, Cl17, K19, Ca20, Fe26) obtained for both extant cyanobacteria and fossil trilobites produce signatures readily distinguishing them from meteorite targets. When compared to elemental abundance signatures for extant cyanobacteria, Orgueil structures exhibit decreased abundances for C6, N7, Na11, Al13, P15, Cl17, K19, Ca20 and increases in Mg12, S16, Fe26. Diatoms and silicified portions of cyanobacterial sheaths exhibiting high levels of silicon and correspondingly low levels of carbon cluster more closely with terrestrial fossils than with extant cyanobacteria. Compression indices verify that variations in random and redundant textural patterns between perceived forms and the background matrix contribute significantly to morphological visual identification. The results provide a quantitative probabilistic methodology for discriminating putative fossils from the surrounding rock matrix and from extant organisms using both structural and chemical information.
The techniques described appear applicable to the geobiological analysis of meteoritic samples or in situ exploration of the Mars regolith. Keywords: cyanobacteria, microfossils, Mars, elemental abundances, complexity analysis, multifactor analysis, principal component analysis, hierarchical cluster analysis, artificial neural networks, paleo-biosignatures

  18. PREFACE: Anti-counterfeit Image Analysis Methods (A Special Session of ICSXII)

    NASA Astrophysics Data System (ADS)

    Javidi, B.; Fournel, T.

    2007-06-01

    The International Congress for Stereology is dedicated to theoretical and applied aspects of stochastic tools, image analysis and mathematical morphology. A special emphasis on `anti-counterfeit image analysis methods' has been given this year for the XIIth edition (ICSXII). Facing the economic and social threat of counterfeiting, this dedicated session presents recent advances and original solutions in the field. A first group of methods is related to marks located either on the product (physical marks) or on the data (hidden information) to be protected. These methods concern laser fs 3D encoding and source separation for machine-readable identification, moiré and `guilloche' engraving for visual verification and watermarking. Machine-readable travel documents are well-suited examples introducing the second group of methods, which are related to cryptography. Used in passports for data authentication and identification (of people), cryptography provides some powerful tools. Opto-digital processing allows some efficient implementations described in the papers and promising applications. We would like to thank the reviewers who have contributed to a session of high quality, and the authors for their fine and hard work. We would like to address some special thanks to the invited lecturers, namely Professor Roger Hersch and Dr Isaac Amidror for their survey of moiré methods, Prof. Serge Vaudenay for his survey of existing protocols concerning machine-readable travel documents, and Dr Elisabet Pérez-Cabré for her presentation on optical encryption for multifactor authentication. We also thank Professor Dominique Jeulin, President of the International Society for Stereology, Professor Michel Jourlin, President of the organizing committee of ICSXII, for their help and advice, and Mr Graham Douglas, the Publisher of Journal of Physics: Conference Series at IOP Publishing, for his efficiency.
We hope that this collection of papers will be useful as a tool to further develop a very important field. Bahram Javidi University of Connecticut (USA) Thierry Fournel University of Saint-Etienne (France) Chairs of the special session on `Anti-counterfeit image analysis methods', July 2007

  19. Time-varying economic dominance in financial markets: A bistable dynamics approach

    NASA Astrophysics Data System (ADS)

    He, Xue-Zhong; Li, Kai; Wang, Chuncheng

    2018-05-01

    By developing a continuous-time heterogeneous agent financial market model of multi-assets traded by fundamental and momentum investors, we provide a potential mechanism for generating time-varying dominance between fundamental and non-fundamental in financial markets. We show that investment constraints lead to the coexistence of a locally stable fundamental steady state and a locally stable limit cycle around the fundamental, characterized by a Bautin bifurcation. This provides a mechanism for market prices to switch stochastically between the two persistent but very different market states, leading to the coexistence and time-varying dominance of seemingly controversial efficient market and price momentum over different time periods. The model also generates other financial market stylized facts, such as spillover effects in both momentum and volatility, market booms, crashes, and correlation reduction due to cross-sectional momentum trading. Empirical evidence based on the U.S. market supports the main findings. The mechanism developed in this paper can be used to characterize time-varying economic dominance in economics and finance in general.

  20. Quantum finance

    NASA Astrophysics Data System (ADS)

    Schaden, Martin

    2002-12-01

    Quantum theory is used to model secondary financial markets. Contrary to stochastic descriptions, the formalism emphasizes the importance of trading in determining the value of a security. All possible realizations of investors holding securities and cash are taken as the basis of the Hilbert space of market states. The temporal evolution of an isolated market is unitary in this space. Linear operators representing basic financial transactions such as cash transfer and the buying or selling of securities are constructed, and simple model Hamiltonians that generate the temporal evolution due to cash flows and the trading of securities are proposed. The Hamiltonian describing financial transactions becomes local when the profit/loss from trading is small compared to the turnover. This approximation may describe a highly liquid and efficient stock market. The lognormal probability distribution for the price of a stock with a variance that is proportional to the elapsed time is reproduced for an equilibrium market. The asymptotic volatility of a stock in this case is related to the long-term probability that it is traded.
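
The equilibrium result quoted above, a lognormal price whose log-variance grows linearly in elapsed time, is the same distribution a geometric Brownian motion produces; a short simulation makes the correspondence concrete. Parameters here are illustrative, not from the paper:

```python
import math
import random

def terminal_prices(s0, mu, sigma, t, n, seed=7):
    """Sample n terminal prices of a geometric Brownian motion at time t."""
    rng = random.Random(seed)
    drift = (mu - 0.5 * sigma ** 2) * t
    vol = sigma * math.sqrt(t)
    return [s0 * math.exp(drift + vol * rng.gauss(0.0, 1.0)) for _ in range(n)]

prices = terminal_prices(s0=100.0, mu=0.05, sigma=0.2, t=1.0, n=20000)
logs = [math.log(p / 100.0) for p in prices]
mean = sum(logs) / len(logs)
var = sum((r - mean) ** 2 for r in logs) / len(logs)
print(round(var, 3))  # the log-return variance should be close to sigma^2 * t = 0.04
```

Doubling `t` doubles the sampled log-variance, which is the "variance proportional to elapsed time" property the abstract refers to.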

  1. Time-varying economic dominance in financial markets: A bistable dynamics approach.

    PubMed

    He, Xue-Zhong; Li, Kai; Wang, Chuncheng

    2018-05-01

    By developing a continuous-time heterogeneous agent financial market model of multi-assets traded by fundamental and momentum investors, we provide a potential mechanism for generating time-varying dominance between fundamental and non-fundamental in financial markets. We show that investment constraints lead to the coexistence of a locally stable fundamental steady state and a locally stable limit cycle around the fundamental, characterized by a Bautin bifurcation. This provides a mechanism for market prices to switch stochastically between the two persistent but very different market states, leading to the coexistence and time-varying dominance of seemingly controversial efficient market and price momentum over different time periods. The model also generates other financial market stylized facts, such as spillover effects in both momentum and volatility, market booms, crashes, and correlation reduction due to cross-sectional momentum trading. Empirical evidence based on the U.S. market supports the main findings. The mechanism developed in this paper can be used to characterize time-varying economic dominance in economics and finance in general.

  2. Numerical pricing of options using high-order compact finite difference schemes

    NASA Astrophysics Data System (ADS)

    Tangman, D. Y.; Gopaul, A.; Bhuruth, M.

    2008-09-01

    We consider high-order compact (HOC) schemes for quasilinear parabolic partial differential equations to discretise the Black-Scholes PDE for the numerical pricing of European and American options. We show that for the heat equation with smooth initial conditions, the HOC schemes attain clear fourth-order convergence, but fail if non-smooth payoff conditions are used. To restore the fourth-order convergence, we use a grid stretching that concentrates grid nodes at the strike price for European options. For an American option, an efficient procedure is also described to compute the option price, Greeks and the optimal exercise curve. Comparisons with a fourth-order non-compact scheme are also made; however, fourth-order convergence is not observed with this strategy. To improve the convergence rate for American options, we discuss the use of a front-fixing transformation with the HOC scheme. We also show that the HOC scheme with grid stretching along the asset price dimension gives accurate numerical solutions for European options under stochastic volatility.

  3. Scaling and criticality in a stochastic multi-agent model of a financial market

    NASA Astrophysics Data System (ADS)

    Lux, Thomas; Marchesi, Michele

    1999-02-01

    Financial prices have been found to exhibit some universal characteristics that resemble the scaling laws characterizing physical systems in which large numbers of units interact. This raises the question of whether scaling in finance emerges in a similar way - from the interactions of a large ensemble of market participants. However, such an explanation is in contradiction to the prevalent `efficient market hypothesis' in economics, which assumes that the movements of financial prices are an immediate and unbiased reflection of incoming news about future earning prospects. Within this hypothesis, scaling in price changes would simply reflect similar scaling in the `input' signals that influence them. Here we describe a multi-agent model of financial markets which supports the idea that scaling arises from mutual interactions of participants. Although the `news arrival process' in our model lacks both power-law scaling and any temporal dependence in volatility, we find that it generates such behaviour as a result of interactions between agents.
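The interaction mechanism described above can be caricatured in a few lines. The following is a drastically simplified fundamentalist/chartist toy market, only loosely in the spirit of the Lux-Marchesi model: the news process is i.i.d. Gaussian, yet returns are shaped by the shifting balance between the two trader camps. Every parameter value and the switching rule are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_market(T=3000, p_f=100.0, impact=0.05, news_sd=0.001):
    """Toy two-strategy market: not the Lux-Marchesi model itself."""
    price = np.empty(T)
    price[0] = p_f
    frac_c, trend = 0.5, 0.0               # chartist fraction, trend estimate
    for t in range(1, T):
        fund_signal = np.log(p_f / price[t - 1])   # pull toward fundamentals
        news = rng.normal(0.0, news_sd)            # i.i.d. news: no clustering
        r = news + impact * ((1 - frac_c) * fund_signal + frac_c * trend)
        price[t] = price[t - 1] * np.exp(r)
        # agents drift toward whichever camp's signal pointed the right way
        edge = np.tanh(50.0 * (trend - fund_signal) * r)
        frac_c = float(np.clip(frac_c + 0.02 * edge, 0.05, 0.95))
        trend = 0.9 * trend + 0.1 * r              # chartists' trend update
    return price

prices = simulate_market()
returns = np.diff(np.log(prices))
```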

  4. Fatigue failure of materials under broad band random vibrations

    NASA Technical Reports Server (NTRS)

    Huang, T. C.; Lanz, R. W.

    1971-01-01

    The fatigue life of materials under the multifactor influence of broad-band random excitations has been investigated. Parameters affecting the fatigue life are postulated to be peak stress, variance of stress, and the natural frequency of the system. Experimental data were processed by a hybrid computer. Based on the experimental results and regression analysis, a best predicting model has been found. All experimental fatigue lives fall within the 95% confidence intervals of the predicting equation.

  5. An Economical Multifactor within-Subject Design Robust against Trend and Carryover Effects.

    DTIC Science & Technology

    1985-10-17


  6. [Chemical and sensory characterization of tea (Thea sinensis) consumed in Chile].

    PubMed

    Wittig de Penna, Emma; José Zúñiga, María; Fuenzalida, Regina; López-Planes, Reinaldo

    2005-03-01

    By means of descriptive analysis, four varieties of tea (Thea sinensis) were assessed: Argentinean OP (orange pekoe) black tea, Brazilian OP black tea, Ceylon OP black tea and Darjeeling OP green tea. The appearance of the dry tea leaves was characterized qualitatively against dry-leaf standards; the attributes colour, form, regularity of the leaves, fibre and stem cutting were evaluated, and the differences obtained were related to the effect of the fermentation process. Flavour and aroma descriptors of the tea liquor were generated by a trained panel. Colour and astringency were evaluated against qualified standards using unstructured linear scales. To relate the sensory analysis to the chemical composition of the different varieties, the following determinations were made: moisture, dry matter, aqueous extract, tannins and caffeine. Multifactor regression analysis yielded equations relating colour to dry matter, aqueous extract and tannins, and astringency to moisture, dry matter and aqueous extract. Statistical analysis through ANOVA (three sources of variation: samples, judges and replications) showed four significantly different sample groups for astringency and three for colour; no significant differences between judges or replications were found.

  7. A multi-factor designation method for mapping particulate-pollution control zones in China.

    PubMed

    Qin, Y; Xie, S D

    2011-09-01

    A multi-factor designation method for mapping particulate-pollution control zones was developed by jointly considering PM(10) pollution status, PM(10) anthropogenic emissions, fine-particle pollution, long-range transport and economic conditions. According to this method, China was divided into four particulate-pollution control regions: the PM Suspended Control Region, PM(10) Pollution Control Region, PM(2.5) Pollution Control Region, and PM(10) and PM(2.5) Common Control Region, which accounted for 69.55%, 9.66%, 4.67% and 16.13% of China's territory, respectively. The PM(10) and PM(2.5) Common Control Region was mainly distributed in the Bohai Region, Yangtze River Delta, Pearl River Delta, eastern Sichuan province and Chongqing municipality, calling for immediate control of both PM(10) and PM(2.5). Cost-effective control can be achieved by concentrating efforts on the Common Control Region, which accounts for 60.32% of national PM(10) anthropogenic emissions. Air quality in districts belonging to the PM(2.5) Pollution Control Region suggested that the Chinese national ambient air quality standard for PM(10) was not strict enough. The application to China showed that this approach is feasible for mapping pollution control regions in a country with a vast territory, complicated pollution characteristics and limited available monitoring data. Copyright © 2011 Elsevier B.V. All rights reserved.

  8. Can elevated CO2 modify regeneration from seed banks of floating freshwater marshes subjected to rising sea-level?

    USGS Publications Warehouse

    Middleton, Beth A.; McKee, Karen L.

    2012-01-01

    Higher atmospheric concentrations of CO2 can offset the negative effects of flooding or salinity on plant species, but previous studies have focused on mature, rather than regenerating vegetation. This study examined how interacting environments of CO2, water regime, and salinity affect seed germination and seedling biomass of floating freshwater marshes in the Mississippi River Delta, which are dominated by C3 grasses, sedges, and forbs. Germination density and seedling growth of the dominant species depended on multifactor interactions of CO2 (385 and 720 μl l-1) with flooding (drained, +8-cm depth, +8-cm depth-gradual) and salinity (0, 6% seawater) levels. Of the three factors tested, salinity was the most important determinant of seedling response patterns. Species richness (total = 19) was insensitive to CO2. Our findings suggest that for freshwater marsh communities, seedling response to CO2 is species-specific and secondary to salinity and flooding effects. Elevated CO2 did not ameliorate flooding or salinity stress. Consequently, climate-related changes in sea level or human-caused alterations in hydrology may override atmospheric CO2 concentrations in driving shifts in this plant community. The results of this study suggest caution in making extrapolations from species-specific responses to community-level predictions without detailed attention to the nuances of multifactor responses.

  9. An assessment of two-step linear regression and a multifactor probit analysis as alternatives to acute to chronic ratios in the estimation of chronic response from acute toxicity data to derive water quality guidelines.

    PubMed

    Slaughter, Andrew R; Palmer, Carolyn G; Muller, Wilhelmine J

    2007-04-01

    In aquatic ecotoxicology, acute to chronic ratios (ACRs) are often used to predict chronic responses from available acute data to derive water quality guidelines, despite many problems associated with this method. This paper explores the comparative protectiveness and accuracy of predicted guideline values derived from the ACR, linear regression analysis (LRA), and multifactor probit analysis (MPA) extrapolation methods applied to acute toxicity data for aquatic macroinvertebrates. Although the authors of the LRA and MPA methods advocate the use of extrapolated lethal effects in the 0.01% to 10% lethal concentration (LC0.01-LC10) range to predict safe chronic exposure levels to toxicants, the use of an extrapolated LC50 value divided by a safety factor of 5 was additionally explored here because of the higher statistical confidence surrounding the LC50 value. The LRA LC50/5 method was found to compare most favorably with available experimental chronic toxicity data and was therefore most likely to be sufficiently protective, although further validation with additional species is needed. Values derived by the ACR method were the least protective. It is suggested that there is an argument for replacing ACRs with the LRA LC50/5 method in developing water quality guidelines.
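A sketch of the LRA-style time extrapolation with the LC50/5 safety factor, on invented acute LC50 data (the paper's actual regressions use species-specific toxicity datasets; the numbers below are purely illustrative):

```python
import numpy as np

# Hypothetical acute LC50s (mg/L) at 24-96 h exposure for one species.
hours = np.array([24.0, 48.0, 72.0, 96.0])
lc50 = np.array([18.0, 12.5, 9.8, 8.1])

# LRA idea: linear fit of log10(LC50) on log10(exposure time),
# extrapolated out to a chronic exposure window.
slope, intercept = np.polyfit(np.log10(hours), np.log10(lc50), 1)
chronic_h = 30 * 24.0                       # e.g. a 30-day chronic exposure
lc50_chronic = 10 ** (intercept + slope * np.log10(chronic_h))
guideline = lc50_chronic / 5.0              # the LC50/5 safety factor
```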

  10. Soil moisture surpasses elevated CO2 and temperature as a control on soil carbon dynamics in a multi-factor climate change experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garten Jr, Charles T; Classen, Aimee T; Norby, Richard J

    2009-01-01

    Some single-factor experiments suggest that elevated CO2 concentrations can increase soil carbon, but few experiments have examined the effects of interacting environmental factors on soil carbon dynamics. We undertook studies of soil carbon and nitrogen in a multi-factor (CO2 x temperature x soil moisture) climate change experiment on a constructed old-field ecosystem. After four growing seasons, elevated CO2 had no measurable effect on carbon and nitrogen concentrations in whole soil, particulate organic matter (POM), and mineral-associated organic matter (MOM). Analysis of stable carbon isotopes, under elevated CO2, indicated between 14 and 19% new soil carbon under two different watering treatments, with as much as 48% new carbon in POM. Despite significant belowground inputs of new organic matter, soil carbon concentrations and stocks in POM declined over four years under soil moisture conditions that corresponded to prevailing precipitation inputs (1,300 mm yr-1). Changes over time in soil carbon and nitrogen under a drought treatment (approximately 20% lower soil water content) were not statistically significant. Reduced soil moisture lowered soil CO2 efflux and slowed soil carbon cycling in the POM pool. In this experiment, soil moisture (produced by different watering treatments) was more important than elevated CO2 and temperature as a control on soil carbon dynamics.

  11. Multifactor Determinants of Visual Accommodation as a Critical Intervening Variable in the Perception of Size and Distance: Phase I Report

    DTIC Science & Technology

    1997-11-01

    [Available excerpt consists of figure captions (including Figures 12c, 12d and 13): expanded illustrations clarifying the locus of the off-axis end point of retinal stimulation for correct, myopic, and hyperopic accommodation, and a simplified illustration.]

  12. Multi-factor Effects on the Durability of Recycle Aggregate Concrete

    NASA Astrophysics Data System (ADS)

    Ma, Huan; Cui, Yu-Li; Zhu, Wen-Yu; Xie, Xian-Jie

    2016-05-01

    Recycled Aggregate Concrete (RAC) was prepared with recycled aggregate replacement ratios of 0, 30%, 70% and 100%. The durability of the RAC was assessed through freeze-thaw cycling, carbonization and sulfate-attack tests. Results show that the test sequence affects the measured durability of RAC: durability is poorer when the carbonation experiment is carried out first, followed by the other tests, and durability is best when the recycled aggregate replacement ratio is 70%.

  13. Multi-factor Analysis of Pre-control Fracture Simulations about Projectile Material

    NASA Astrophysics Data System (ADS)

    Wan, Ren-Yi; Zhou, Wei

    2016-05-01

    The study of pre-controlled fracture of projectile materials helps improve the effective fragmentation of the projectile metal and the material utilization rate. Fragment muzzle velocity and lethality are affected by the explosive charge and the mode of initiation. Finite element software can simulate the explosive rupture of a projectile with a pre-groove in its shell surface and analyze how typical node velocities change with time, providing a reference for the design and optimization of pre-controlled fragmentation.

  14. PKPass

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adamson, Ryan M.

    Password management solutions exist, but few are designed for enterprise systems administrators sharing on-call rotations. Due to the Multi-Factor Level of Assurance 4 effort, DOE is now distributing PIV cards with cryptographically signed certificate and private key pairs to administrators and other security-significant users. We utilize this public key infrastructure (PKI) to encrypt passwords for other recipients in a secure way. The tool is cross-platform (it works on OS X and Linux systems) and has already been adopted internally by the NCCS systems administration staff to replace their old password-book system.

  15. Dynamically orthogonal field equations for stochastic flows and particle dynamics

    DTIC Science & Technology

    2011-02-01

    where uncertainty ‘lives’ as well as a system of Stochastic Differential Equations that defines how the uncertainty evolves in the time varying stochastic ... stochastic dynamical component that are both time and space dependent, we derive a system of field equations consisting of a Partial Differential Equation...a system of Stochastic Differential Equations that defines how the stochasticity evolves in the time varying stochastic subspace. These new

  16. How multiple factors control evapotranspiration in North America evergreen needleleaf forests.

    PubMed

    Chen, Yueming; Xue, Yueju; Hu, Yueming

    2018-05-01

    Identifying the factors dominating ecosystem water flux is a critical step in predicting evapotranspiration (ET). Here, the fuzzy rough set with binary shuffled frog leaping (BSFL-FRSA) algorithm was used to identify both individual factors and multi-factor combinations that dominate the half-hourly ET variation at evergreen needleleaf forest (ENF) sites across three different climatic zones in North America. Among 21 factors, air temperature (TA), atmospheric CO2 concentration (CCO2), soil temperature (TS), soil water content (SWC) and net radiation (NETRAD) were evaluated as the dominant single factors, contributing to the ET variation averaged over all ENF sites by 48%, 36%, 32%, 18% and 13%, respectively. The importance order varied with climatic zone, however, and TA was assessed as the most influential factor at the single climatic zone level, accounting for contribution rates of 54.7%, 49.9% and 38.6% in the subarctic, warm summer continental, and Mediterranean climatic zones, respectively. Regarding the impact of multi-factor combinations on ET, TA and CCO2 together made a contribution of 71% across the three climatic zones; the combination of TA, CCO2 and NETRAD was evaluated as the most dominant at Mediterranean and subarctic ENF sites, and the combination of TA, CCO2 and TS at warm summer continental sites. Our results suggest that temperature is most critical for ET variation at the warm summer continental ENF. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Quantum stochastic calculus associated with quadratic quantum noises

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ji, Un Cig, E-mail: uncigji@chungbuk.ac.kr; Sinha, Kalyan B., E-mail: kbs-jaya@yahoo.co.in

    2016-02-15

    We first study a class of fundamental quantum stochastic processes induced by the generators of a six-dimensional non-solvable Lie †-algebra consisting of all linear combinations of the generalized Gross Laplacian and its adjoint, the annihilation operator, creation operator, conservation, and time. We then study the quantum stochastic integrals associated with this class of fundamental quantum stochastic processes, and the quantum Itô formula is revisited. The existence and uniqueness of the solution of a quantum stochastic differential equation is proved, and the unitarity conditions for solutions of quantum stochastic differential equations associated with the fundamental processes are examined. The quantum stochastic calculus extends the Hudson-Parthasarathy quantum stochastic calculus.

  18. Stochastic models for inferring genetic regulation from microarray gene expression data.

    PubMed

    Tian, Tianhai

    2010-03-01

    Microarray expression profiles are inherently noisy and many different sources of variation exist in microarray experiments. It is still a significant challenge to develop stochastic models to realize noise in microarray expression profiles, which has profound influence on the reverse engineering of genetic regulation. Using the target genes of the tumour suppressor gene p53 as the test problem, we developed stochastic differential equation models and established the relationship between the noise strength of stochastic models and parameters of an error model for describing the distribution of the microarray measurements. Numerical results indicate that the simulated variance from stochastic models with a stochastic degradation process can be represented by a monomial in terms of the hybridization intensity and the order of the monomial depends on the type of stochastic process. The developed stochastic models with multiple stochastic processes generated simulations whose variance is consistent with the prediction of the error model. This work also established a general method to develop stochastic models from experimental information. 2009 Elsevier Ireland Ltd. All rights reserved.
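A minimal Euler-Maruyama sketch of the class of models the abstract describes: a gene-expression SDE with deterministic production and a stochastic degradation-driven noise term. This is illustrative only, not the authors' fitted p53 model; all rates are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_expression(k=10.0, d=1.0, sigma=0.5, x0=10.0,
                        T=5.0, dt=0.01, n_paths=2000):
    """Euler-Maruyama for dX = (k - d*X) dt + sigma*sqrt(X) dW:
    production at rate k, first-order degradation with intrinsic noise."""
    n_steps = int(T / dt)
    x = np.full(n_paths, x0)
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), n_paths)
        # clamp inside sqrt in case a path dips slightly below zero
        x = x + (k - d * x) * dt + sigma * np.sqrt(np.maximum(x, 0.0)) * dw
    return x

x_T = simulate_expression()
```

Because the drift is linear, the ensemble mean stays at the steady state k/d = 10 while the simulated variance reflects the stochastic degradation term, which is the kind of noise-versus-intensity relationship the paper characterizes.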

  19. On the efficacy of stochastic collocation, stochastic Galerkin, and stochastic reduced order models for solving stochastic problems

    DOE PAGES

    Richard V. Field, Jr.; Emery, John M.; Grigoriu, Mircea Dan

    2015-05-19

    The stochastic collocation (SC) and stochastic Galerkin (SG) methods are two well-established and successful approaches for solving general stochastic problems. A recently developed method based on stochastic reduced order models (SROMs) can also be used. Herein we provide a comparison of the three methods for some numerical examples; our evaluation only holds for the examples considered in the paper. The purpose of the comparisons is not to criticize the SC or SG methods, which have proven very useful for a broad range of applications, nor is it to provide overall ratings of these methods as compared to the SROM method. Rather, our objectives are to present the SROM method as an alternative approach to solving stochastic problems and to provide information on the computational effort required by the implementation of each method, while simultaneously assessing their performance for a collection of specific problems.
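A toy illustration of the stochastic collocation idea against plain Monte Carlo, for a scalar response g(X) = exp(X) with X ~ N(0,1) (not one of the paper's examples): the model is evaluated at a handful of deterministic quadrature nodes instead of at random samples.

```python
import numpy as np

# Stochastic collocation: evaluate the "model" g at probabilists'
# Gauss-Hermite nodes and combine with the quadrature weights.
g = np.exp                                                # toy model response
nodes, weights = np.polynomial.hermite_e.hermegauss(10)   # weight e^{-x^2/2}
mean_sc = weights @ g(nodes) / np.sqrt(2.0 * np.pi)       # E[g(X)], X~N(0,1)

# Plain Monte Carlo estimate of the same statistic, for comparison.
rng = np.random.default_rng(1)
mean_mc = g(rng.standard_normal(200_000)).mean()

exact = np.exp(0.5)                                       # E[exp(X)] = e^{1/2}
```

Ten collocation points already match the exact moment to near machine precision here, while the 200,000-sample Monte Carlo estimate still carries sampling noise; the paper's comparisons probe how this trade-off plays out for harder problems.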

  20. [Clinical genealogy and genetic-mathematical study of families of probands with uterine cancer in the Chernovitsy Region].

    PubMed

    Galina, K P; Peresun'ko, A P; Glushchenko, N N

    2001-01-01

    A complex clinico-genealogical and genetic-mathematical investigation of 482 patients with uterine cancer from the Chernovtsy region was carried out. It showed that uterine cancer in this population is primarily of multifactorial origin; the genetic component of overall disease susceptibility was 11.40 ± 9.40%. The recurrence risk of the malignant tumor in progeny was estimated. The results of the investigation form the basis for developing and implementing prevention of uterine cancer and associated oncopathology in the relatives of probands.

  1. Continuous Cultivation for Apparent Optimization of Defined Media for Cellulomonas sp. and Bacillus cereus

    PubMed Central

    Summers, R. J.; Boudreaux, D. P.; Srinivasan, V. R.

    1979-01-01

    Steady-state continuous culture was used to optimize lean chemically defined media for a Cellulomonas sp. and Bacillus cereus strain T. Both organisms were extremely sensitive to variations in trace-metal concentrations. However, medium optimization by this technique proved rapid, and multifactor screening was easily conducted by using a minimum of instrumentation. The optimized media supported critical dilution rates of 0.571 and 0.467 h−1 for Cellulomonas and Bacillus, respectively. These values approximated maximum growth rate values observed in batch culture. PMID:16345417
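The critical dilution rates quoted above can be read through standard Monod chemostat theory; the following sketch uses invented kinetic parameters, not values fitted to the paper's organisms:

```python
# Monod chemostat steady state: at dilution rate D the residual substrate is
# S* = Ks*D/(mu_max - D), and washout occurs once D exceeds the critical
# dilution rate D_crit = mu_max*S0/(Ks + S0), which approaches mu_max when
# the feed substrate S0 >> Ks.
def monod_d_crit(mu_max, Ks, S0):
    return mu_max * S0 / (Ks + S0)

# Illustrative (invented) values: mu_max in h^-1, Ks and S0 in g/L.
D_crit = monod_d_crit(mu_max=0.6, Ks=0.5, S0=10.0)
```

With S0 twenty times Ks, D_crit sits just below mu_max, mirroring the paper's observation that critical dilution rates approximated the maximum batch growth rates.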

  2. Assessment Methods of Groundwater Overdraft Area and Its Application

    NASA Astrophysics Data System (ADS)

    Dong, Yanan; Xing, Liting; Zhang, Xinhui; Cao, Qianqian; Lan, Xiaoxun

    2018-05-01

    Groundwater is an important source of water, and long-term heavy demand has left it over-exploited. Over-exploitation causes many environmental and geological problems. This paper explores the concept of the over-exploitation area, summarizes its natural and social attributes, and expounds its assessment methods, including single-factor evaluation, multi-factor system analysis and numerical methods. The different methods are also compared and analyzed. Taking northern Weifang as an example, the paper then illustrates the practicality of these assessment methods.

  3. Unification theory of optimal life histories and linear demographic models in internal stochasticity.

    PubMed

    Oizumi, Ryo

    2014-01-01

    The life history of organisms is exposed to uncertainty generated by internal and external stochasticities. Internal stochasticity is generated by randomness in each individual life history, such as randomness in food intake, genetic character and size growth rate, whereas external stochasticity is due to the environment. For instance, it is known that external stochasticity tends to affect population growth rate negatively. A recent theoretical study using a path-integral formulation in structured linear demographic models has shown that internal stochasticity can affect population growth rate positively or negatively. However, internal stochasticity has not been the main subject of research. Taking account of the effect of internal stochasticity on the population growth rate, the fittest organism exercises optimal control of a life history shaped by the stochasticity in its habitat. The study of this control is known as the optimal life schedule problem. In order to analyze optimal control under internal stochasticity, we need to make use of stochastic control theory in the optimal life schedule problem; there is, however, no theory unifying optimal life histories and internal stochasticity. This study extends optimal life schedule problems to unify the control theory of internal stochasticity into linear demographic models. First, we show the relationship between general age-states linear demographic models and stochastic control theory via several mathematical formulations, such as the path integral, integral equation, and transition matrix. Secondly, we apply our theory to a two-resource utilization model for two different breeding systems: semelparity and iteroparity. Finally, we show that, in one case, the diversity of resources is important for species. Our study shows that this unification theory can address risk hedges of life history in general age-states linear demographic models.
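A minimal example of the linear demographic models referenced above: in an age-structured (Leslie matrix) model, the long-run population growth rate is the dominant eigenvalue of the projection matrix. All vital rates below are invented for illustration:

```python
import numpy as np

# Toy Leslie projection matrix for three age classes.
L = np.array([
    [0.0, 1.2, 1.5],   # per-capita fecundity of age classes 2 and 3
    [0.6, 0.0, 0.0],   # survival from age class 1 to 2
    [0.0, 0.5, 0.0],   # survival from age class 2 to 3
])
growth_rate = max(abs(np.linalg.eigvals(L)))   # dominant eigenvalue (lambda)

# Power iteration reaches the same rate: project a population vector
# forward and watch the one-step growth factor converge.
v = np.ones(3)
for _ in range(200):
    v = L @ v
    v /= np.linalg.norm(v)
one_step_factor = np.linalg.norm(L @ v)
```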

  4. Unification Theory of Optimal Life Histories and Linear Demographic Models in Internal Stochasticity

    PubMed Central

    Oizumi, Ryo

    2014-01-01

    The life history of organisms is exposed to uncertainty generated by internal and external stochasticities. Internal stochasticity is generated by randomness in each individual life history, such as randomness in food intake, genetic character and size growth rate, whereas external stochasticity is due to the environment. For instance, it is known that external stochasticity tends to affect population growth rate negatively. A recent theoretical study using a path-integral formulation in structured linear demographic models has shown that internal stochasticity can affect population growth rate positively or negatively. However, internal stochasticity has not been the main subject of research. Taking account of the effect of internal stochasticity on the population growth rate, the fittest organism exercises optimal control of a life history shaped by the stochasticity in its habitat. The study of this control is known as the optimal life schedule problem. In order to analyze optimal control under internal stochasticity, we need to make use of stochastic control theory in the optimal life schedule problem; there is, however, no theory unifying optimal life histories and internal stochasticity. This study extends optimal life schedule problems to unify the control theory of internal stochasticity into linear demographic models. First, we show the relationship between general age-states linear demographic models and stochastic control theory via several mathematical formulations, such as the path integral, integral equation, and transition matrix. Secondly, we apply our theory to a two-resource utilization model for two different breeding systems: semelparity and iteroparity. Finally, we show that, in one case, the diversity of resources is important for species. Our study shows that this unification theory can address risk hedges of life history in general age-states linear demographic models. PMID:24945258

  5. Endogenous Groups and Dynamic Selection in Mechanism Design*

    PubMed Central

    Madeira, Gabriel A.; Townsend, Robert M.

    2010-01-01

    We create a dynamic theory of endogenous risk sharing groups, with good internal information, and their coexistence with relative performance, individualistic regimes, which are informationally more opaque. Inequality and organizational form are determined simultaneously. Numerical techniques and succinct re-formulations of mechanism design problems with suitable choice of promised utilities allow the computation of a stochastic steady state and its transitions. Regions of low inequality and moderate to high wealth (utility promises) produce the relative performance regime, while regions of high inequality and low wealth produce the risk sharing group regime. If there is a cost to prevent coalitions, risk sharing groups emerge at high wealth levels also. Transitions from the relative performance regime to the group regime tend to occur when rewards to observed outputs exacerbate inequality, while transitions from the group regime to the relative performance regime tend to come with a decrease in utility promises. Some regions of inequality and wealth deliver long term persistence of organization form and inequality, while other regions deliver high levels of volatility. JEL Classification Numbers: D23, D71, D85, O17. PMID:20107614

  6. Time-varying coefficient vector autoregressions model based on dynamic correlation with an application to crude oil and stock markets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Fengbin, E-mail: fblu@amss.ac.cn

    This paper proposes a new time-varying coefficient vector autoregressions (VAR) model, in which the coefficient is a linear function of dynamic lagged correlation. The proposed model allows for flexibility in choices of dynamic correlation models (e.g. dynamic conditional correlation generalized autoregressive conditional heteroskedasticity (GARCH) models, Markov-switching GARCH models and multivariate stochastic volatility models), which indicates that it can describe many types of time-varying causal effects. Time-varying causal relations between West Texas Intermediate (WTI) crude oil and the US Standard and Poor’s 500 (S&P 500) stock markets are examined by the proposed model. The empirical results show that their causal relations evolve with time and display complex characters. Both positive and negative causal effects of the WTI on the S&P 500 in the subperiods have been found and confirmed by the traditional VAR models. Similar results have been obtained in the causal effects of S&P 500 on WTI. In addition, the proposed model outperforms the traditional VAR model.

  7. Valuing options in shot noise market

    NASA Astrophysics Data System (ADS)

    Laskin, Nick

    2018-07-01

    A new exactly solvable option pricing model has been introduced and elaborated. It is assumed that a stock price follows a Geometric shot noise process. An arbitrage-free integro-differential option pricing equation has been obtained and solved. The new Greeks have been analytically calculated. It has been shown that in diffusion approximation the developed option pricing model incorporates the well-known Black-Scholes equation and its solution. The stochastic dynamic origin of the Black-Scholes volatility has been uncovered. To model the observed market stock price patterns consisting of high frequency small magnitude and low frequency large magnitude jumps, the superposition of two Geometric shot noises has been implemented. A new generalized option pricing equation has been obtained and its exact solution was found. Merton's jump-diffusion formula for option price was recovered in diffusion approximation. Despite the non-Gaussian nature of probability distributions involved, the new option pricing model has the same degree of analytical tractability as the Black-Scholes model and the Merton jump-diffusion model. This attractive feature allows one to derive exact formulas to value options and option related instruments in the market with jump-like price patterns.
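A Monte-Carlo sketch of valuation under a pure-jump (compound-Poisson) risk-neutral stock, in the spirit of the Geometric shot noise setting but not Laskin's exact specification; all parameters are invented, and the drift is compensated so that the discounted stock is a martingale:

```python
import numpy as np

rng = np.random.default_rng(42)

def shot_noise_paths(S0=100.0, r=0.03, T=1.0, lam=50.0,
                     mu_j=-0.001, sig_j=0.03, n_paths=200_000):
    """Risk-neutral pure-jump stock: log-price driven by a compound Poisson
    process with Gaussian jump sizes; drift compensated by lam*kappa."""
    kappa = np.exp(mu_j + 0.5 * sig_j**2) - 1.0      # mean relative jump size
    n_jumps = rng.poisson(lam * T, n_paths)
    # sum of n Gaussian jumps ~ N(n*mu_j, n*sig_j^2)
    jump_sum = rng.normal(0.0, 1.0, n_paths) * sig_j * np.sqrt(n_jumps) \
               + mu_j * n_jumps
    return S0 * np.exp((r - lam * kappa) * T + jump_sum)

S_T = shot_noise_paths()
disc_mean = np.exp(-0.03) * S_T.mean()               # martingale check: ~ S0
call = np.exp(-0.03) * np.maximum(S_T - 100.0, 0.0).mean()
```

The martingale check (discounted terminal mean equal to the spot) is the basic no-arbitrage sanity test any such jump model must pass before its option prices are meaningful.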

  8. Time-varying coefficient vector autoregressions model based on dynamic correlation with an application to crude oil and stock markets.

    PubMed

    Lu, Fengbin; Qiao, Han; Wang, Shouyang; Lai, Kin Keung; Li, Yuze

    2017-01-01

    This paper proposes a new time-varying coefficient vector autoregressions (VAR) model, in which the coefficient is a linear function of dynamic lagged correlation. The proposed model allows for flexibility in choices of dynamic correlation models (e.g. dynamic conditional correlation generalized autoregressive conditional heteroskedasticity (GARCH) models, Markov-switching GARCH models and multivariate stochastic volatility models), which indicates that it can describe many types of time-varying causal effects. Time-varying causal relations between West Texas Intermediate (WTI) crude oil and the US Standard and Poor's 500 (S&P 500) stock markets are examined by the proposed model. The empirical results show that their causal relations evolve with time and display complex characters. Both positive and negative causal effects of the WTI on the S&P 500 in the subperiods have been found and confirmed by the traditional VAR models. Similar results have been obtained in the causal effects of S&P 500 on WTI. In addition, the proposed model outperforms the traditional VAR model. Copyright © 2016 Elsevier Ltd. All rights reserved.
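
    A minimal sketch of the core construction: a bivariate system in which the oil-to-stock coefficient is a linear function of a lagged rolling correlation. The rolling correlation is a crude stand-in for the DCC-GARCH, Markov-switching GARCH, or stochastic volatility correlation models the paper allows, and all coefficient values are hypothetical.

```python
import math
import random
import statistics

rng = random.Random(7)
T, window = 500, 60
a0, a1 = 0.05, 0.30      # hypothetical: oil -> stock coefficient is a0 + a1 * rho

def corr(xs, ys):
    """Plain Pearson correlation of two equal-length samples."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    vy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (vx * vy)

oil = [rng.gauss(0, 1) for _ in range(T)]       # placeholder WTI returns
stock, rho = [0.0] * T, [0.0] * T
for t in range(1, T):
    if t > window:
        # lagged dynamic correlation (stand-in for a DCC-GARCH estimate)
        rho[t] = corr(oil[t - window:t], stock[t - window:t])
    coef = a0 + a1 * rho[t]                     # time-varying causal coefficient
    stock[t] = coef * oil[t - 1] + 0.5 * rng.gauss(0, 1)

print(min(rho), max(rho))  # correlations stay in [-1, 1]
```

    Estimation would regress stock returns on lagged oil returns interacted with the lagged correlation; the sketch only simulates the data-generating process.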

  9. On a numerical method for solving integro-differential equations with variable coefficients with applications in finance

    NASA Astrophysics Data System (ADS)

    Kudryavtsev, O.; Rodochenko, V.

    2018-03-01

    We propose a new general numerical method aimed at solving integro-differential equations with variable coefficients. The problem under consideration arises in finance, in the context of pricing barrier options in a wide class of stochastic volatility models with jumps. To handle the effect of the correlation between the price and the variance, we use a suitable substitution for the processes. Then we construct a Markov-chain approximation for the variance process on small time intervals and apply a maturity randomization technique. The result is a system of boundary problems for integro-differential equations with constant coefficients on the line at each vertex of the chain. We solve the arising problems using a numerical Wiener-Hopf factorization method. The approximate formulae for the factors are efficiently implemented by means of the Fast Fourier Transform. Finally, we use a recurrent procedure that moves backwards in time on the variance tree. We demonstrate the convergence of the method using Monte Carlo simulations and compare our results with those obtained by the Wiener-Hopf method with closed-form expressions for the factors.

  10. The co-evolutionary dynamics of directed network of spin market agents

    NASA Astrophysics Data System (ADS)

    Horváth, Denis; Kuscsik, Zoltán; Gmitra, Martin

    2006-09-01

    The spin market model [S. Bornholdt, Int. J. Mod. Phys. C 12 (2001) 667] is generalized by employing co-evolutionary principles, where strategies of the interacting and competitive traders are represented by local and global couplings between the nodes of a dynamic directed stochastic network. The co-evolutionary principles are applied in the frame of Bak-Sneppen self-organized dynamics [P. Bak, K. Sneppen, Phys. Rev. Lett. 71 (1993) 4083] that includes the processes of selection and extinction actuated by the local (node) fitness. The local fitness is related to the orientation of a spin agent with respect to the instantaneous magnetization. The stationary regime is formed due to the interplay of self-organization and adaptivity effects. The fat-tailed distributions of log-price returns are identified numerically. A non-trivial consequence of the model is evidence of long-time market memory, indicated by a power-law range of the autocorrelation function of volatility with exponent smaller than one. The simulations yield a network topology with a broad-scale node-degree distribution characterized by exponents in the range 1.3 < γ_in < 3, coinciding with social networks.
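
    A loose, self-contained sketch of a Bornholdt-style spin market on a static 2D lattice. The paper's co-evolving directed network and Bak-Sneppen extinction dynamics are omitted, and the couplings and the strategy-flip rule below are simplified illustrations, not the published dynamics.

```python
import math
import random

rng = random.Random(2)

side, J, alpha, beta = 20, 1.0, 4.0, 1.5   # hypothetical couplings / inverse temperature
n = side * side
spins = [rng.choice((-1, 1)) for _ in range(n)]
strategy = [rng.choice((-1, 1)) for _ in range(n)]
total = sum(spins)

def neighbors(i):
    """Four nearest neighbours on a periodic square lattice."""
    x, y = divmod(i, side)
    return (((x + 1) % side) * side + y, ((x - 1) % side) * side + y,
            x * side + (y + 1) % side, x * side + (y - 1) % side)

returns, m_prev = [], total / n
for sweep in range(200):
    for _ in range(n):
        i = rng.randrange(n)
        m = total / n
        # local field: ferromagnetic neighbours minus a global minority-game term
        h = J * sum(spins[j] for j in neighbors(i)) - alpha * strategy[i] * abs(m)
        p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * h))
        new = 1 if rng.random() < p_up else -1
        total += new - spins[i]
        spins[i] = new
        if spins[i] * m < 0:          # simplified strategy update
            strategy[i] = -strategy[i]
    m_now = total / n
    returns.append(m_now - m_prev)    # magnetization change as a log-return proxy
    m_prev = m_now

print(len(returns), max(abs(r) for r in returns) <= 2.0)
```

    With the minority-game term switched on (alpha > 0), the magnetization exhibits intermittent bursts, which is the mechanism behind the fat-tailed return distributions discussed in the abstract.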

  11. Output Feedback Stabilization for a Class of Multi-Variable Bilinear Stochastic Systems with Stochastic Coupling Attenuation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Qichun; Zhou, Jinglin; Wang, Hong

    In this paper, stochastic coupling attenuation is investigated for a class of multi-variable bilinear stochastic systems, and a novel output-feedback m-block backstepping controller with a linear estimator is designed, where gradient-descent optimization is used to tune the design parameters of the controller. It has been shown that the trajectories of the closed-loop stochastic systems are bounded in the probability sense and that the stochastic coupling of the system outputs can be effectively attenuated by the proposed control algorithm. Moreover, the stability of the stochastic systems is analyzed, and the effectiveness of the proposed method is demonstrated using a simulated example.

  12. Optimal Control for Stochastic Delay Evolution Equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meng, Qingxin, E-mail: mqx@hutc.zj.cn; Shen, Yang, E-mail: skyshen87@gmail.com

    2016-08-15

    In this paper, we investigate a class of infinite-dimensional optimal control problems, where the state equation is given by a stochastic delay evolution equation with random coefficients, and the corresponding adjoint equation is given by an anticipated backward stochastic evolution equation. We first prove the continuous dependence theorems for stochastic delay evolution equations and anticipated backward stochastic evolution equations, and show the existence and uniqueness of solutions to anticipated backward stochastic evolution equations. Then we establish necessary and sufficient conditions for optimality of the control problem in the form of Pontryagin's maximum principles. To illustrate the theoretical results, we apply stochastic maximum principles to study two examples, an infinite-dimensional linear-quadratic control problem with delay and an optimal control of a Dirichlet problem for a stochastic partial differential equation with delay. Further applications of the two examples to a Cauchy problem for a controlled linear stochastic partial differential equation and an optimal harvesting problem are also considered.

  13. Stochastic Community Assembly: Does It Matter in Microbial Ecology?

    PubMed

    Zhou, Jizhong; Ning, Daliang

    2017-12-01

    Understanding the mechanisms controlling community diversity, functions, succession, and biogeography is a central, but poorly understood, topic in ecology, particularly in microbial ecology. Although stochastic processes are believed to play nonnegligible roles in shaping community structure, their importance relative to deterministic processes is hotly debated. The importance of ecological stochasticity in shaping microbial community structure is far less appreciated. Some of the main reasons for such heavy debates are the difficulty in defining stochasticity and the diverse methods used for delineating stochasticity. Here, we provide a critical review and synthesis of data from the most recent studies on stochastic community assembly in microbial ecology. We then describe both stochastic and deterministic components embedded in various ecological processes, including selection, dispersal, diversification, and drift. We also describe different approaches for inferring stochasticity from observational diversity patterns and highlight experimental approaches for delineating ecological stochasticity in microbial communities. In addition, we highlight research challenges, gaps, and future directions for microbial community assembly research. Copyright © 2017 American Society for Microbiology.

  14. Persistence and extinction of a stochastic single-species model under regime switching in a polluted environment II.

    PubMed

    Liu, Meng; Wang, Ke

    2010-12-07

    This is a continuation of our paper [Liu, M., Wang, K., 2010. Persistence and extinction of a stochastic single-species model under regime switching in a polluted environment, J. Theor. Biol. 264, 934-944]. Taking both white noise and colored noise into account, a stochastic single-species model under regime switching in a polluted environment is studied. Sufficient conditions for extinction, stochastic nonpersistence in the mean, stochastic weak persistence and stochastic permanence are established. The threshold between stochastic weak persistence and extinction is obtained. The results show that a different type of noise has a different effect on the survival results. Copyright © 2010 Elsevier Ltd. All rights reserved.

  15. Maximum principle for a stochastic delayed system involving terminal state constraints.

    PubMed

    Wen, Jiaqiang; Shi, Yufeng

    2017-01-01

    We investigate a stochastic optimal control problem where the controlled system is described by a stochastic differential delayed equation and, at the terminal time, the state is constrained to lie in a convex set. We first introduce an equivalent backward delayed system described by a time-delayed backward stochastic differential equation. Then a stochastic maximum principle is obtained by virtue of Ekeland's variational principle. Finally, applications to a state-constrained stochastic delayed linear-quadratic control model and a production-consumption choice problem are studied to illustrate the main result.

  16. Momentum Maps and Stochastic Clebsch Action Principles

    NASA Astrophysics Data System (ADS)

    Cruzeiro, Ana Bela; Holm, Darryl D.; Ratiu, Tudor S.

    2018-01-01

    We derive stochastic differential equations whose solutions follow the flow of a stochastic nonlinear Lie algebra operation on a configuration manifold. For this purpose, we develop a stochastic Clebsch action principle, in which the noise couples to the phase space variables through a momentum map. This special coupling simplifies the structure of the resulting stochastic Hamilton equations for the momentum map. In particular, these stochastic Hamilton equations collectivize for Hamiltonians that depend only on the momentum map variable. The Stratonovich equations are derived from the Clebsch variational principle and then converted into Itô form. In comparing the Stratonovich and Itô forms of the stochastic dynamical equations governing the components of the momentum map, we find that the Itô contraction term turns out to be a double Poisson bracket. Finally, we present the stochastic Hamiltonian formulation of the collectivized momentum map dynamics and derive the corresponding Kolmogorov forward and backward equations.

  17. Dynamics of non-holonomic systems with stochastic transport

    NASA Astrophysics Data System (ADS)

    Holm, D. D.; Putkaradze, V.

    2018-01-01

    This paper formulates a variational approach for treating observational uncertainty and/or computational model errors as stochastic transport in dynamical systems governed by action principles under non-holonomic constraints. For this purpose, we derive, analyse and numerically study the example of an unbalanced spherical ball rolling under gravity along a stochastic path. Our approach uses the Hamilton-Pontryagin variational principle, constrained by a stochastic rolling condition, which we show is equivalent to the corresponding stochastic Lagrange-d'Alembert principle. In the example of the rolling ball, the stochasticity represents uncertainty in the observation and/or error in the computational simulation of the angular velocity of rolling. The influence of the stochasticity on the deterministically conserved quantities is investigated both analytically and numerically. Our approach applies to a wide variety of stochastic, non-holonomically constrained systems, because it preserves the mathematical properties inherited from the variational principle.

  18. Perceived Gender Presentation Among Transgender and Gender Diverse Youth: Approaches to Analysis and Associations with Bullying Victimization and Emotional Distress.

    PubMed

    Gower, Amy L; Rider, G Nicole; Coleman, Eli; Brown, Camille; McMorris, Barbara J; Eisenberg, Marla E

    2018-06-19

    As measures of birth-assigned sex, gender identity, and perceived gender presentation are increasingly included in large-scale research studies, data analysis approaches incorporating such measures are needed. Large samples capable of demonstrating variation within the transgender and gender diverse (TGD) community can inform intervention efforts to improve health equity. A population-based sample of TGD youth was used to examine associations between perceived gender presentation, bullying victimization, and emotional distress using two data analysis approaches. Secondary data analysis of the Minnesota Student Survey included 2168 9th and 11th graders who identified as "transgender, genderqueer, genderfluid, or unsure about their gender identity." Youth reported their biological sex, how others perceived their gender presentation, experiences of four forms of bullying victimization, and four measures of emotional distress. Logistic regression and multifactor analysis of variance (ANOVA) were used to compare and contrast two analysis approaches. Logistic regressions indicated that TGD youth perceived as more gender incongruent had higher odds of bullying victimization and emotional distress relative to those perceived as very congruent with their biological sex. Multifactor ANOVAs demonstrated more variable patterns and allowed for comparisons of each perceived presentation group with all other groups, reflecting nuances that exist within TGD youth. Researchers should adopt data analysis strategies that allow for comparisons of all perceived gender presentation categories rather than assigning a reference group. Those working with TGD youth should be particularly attuned to youth perceived as gender incongruent as they may be more likely to experience bullying victimization and emotional distress.

  19. Exploring the interaction among EPHX1, GSTP1, SERPINE2, and TGFB1 contributing to the quantitative traits of chronic obstructive pulmonary disease in Chinese Han population.

    PubMed

    An, Li; Lin, Yingxiang; Yang, Ting; Hua, Lin

    2016-05-18

    Currently, the majority of genetic association studies on chronic obstructive pulmonary disease (COPD) risk have focused on identifying the individual effects of single nucleotide polymorphisms (SNPs) as well as their interaction effects on the disease. However, conventional genetic studies often use binary disease status as the primary phenotype, whereas for COPD many quantitative traits are potentially correlated with disease status and closely reflect pathological changes. Here, we genotyped 44 SNPs from four genes (EPHX1, GSTP1, SERPINE2, and TGFB1) in 310 patients and 203 controls from the Chinese Han population to test two-way and three-way genetic interactions with COPD-related quantitative traits using the recently developed generalized multifactor dimensionality reduction (GMDR) and quantitative multifactor dimensionality reduction (QMDR) algorithms. Based on the 310 patients and the whole sample of 513 subjects, the best gene-gene interaction models were detected for four lung-function-related quantitative traits. For the forced expiratory volume in 1 s (FEV1), the best interaction was seen for EPHX1, SERPINE2, and GSTP1. For FEV1%pred, the forced vital capacity (FVC), and FEV1/FVC, the best interactions were seen for SERPINE2 and TGFB1. The results of this study provide further evidence for the genotype combinations at risk of developing COPD in the Chinese Han population and improve the understanding of the genetic etiology of COPD and COPD-related quantitative traits.

  20. Information-Theoretic Metrics for Visualizing Gene-Environment Interactions

    PubMed Central

    Chanda, Pritam; Zhang, Aidong; Brazeau, Daniel; Sucheston, Lara; Freudenheim, Jo L.; Ambrosone, Christine; Ramanathan, Murali

    2007-01-01

    The purpose of our work was to develop heuristics for visualizing and interpreting gene-environment interactions (GEIs) and to assess the dependence of candidate visualization metrics on biological and study-design factors. Two information-theoretic metrics, the k-way interaction information (KWII) and the total correlation information (TCI), were investigated. The effectiveness of the KWII and TCI to detect GEIs in a diverse range of simulated data sets and a Crohn disease data set was assessed. The sensitivity of the KWII and TCI spectra to biological and study-design variables was determined. Head-to-head comparisons with the relevance-chain, multifactor dimensionality reduction, and the pedigree disequilibrium test (PDT) methods were obtained. The KWII and TCI spectra, which are graphical summaries of the KWII and TCI for each subset of environmental and genotype variables, were found to detect each known GEI in the simulated data sets. The patterns in the KWII and TCI spectra were informative for factors such as case-control misassignment, locus heterogeneity, allele frequencies, and linkage disequilibrium. The KWII and TCI spectra were found to have excellent sensitivity for identifying the key disease-associated genetic variations in the Crohn disease data set. In head-to-head comparisons with the relevance-chain, multifactor dimensionality reduction, and PDT methods, the results from visual interpretation of the KWII and TCI spectra performed satisfactorily. The KWII and TCI are promising metrics for visualizing GEIs. They are capable of detecting interactions among numerous single-nucleotide polymorphisms and environmental variables for a diverse range of GEI models. PMID:17924337
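
    Both metrics can be computed directly from empirical entropies. The sketch below uses the alternating-sum form of the k-way interaction information (one common sign convention, under which the 2-way KWII equals mutual information) and the standard total correlation; the XOR data set is a synthetic stand-in for a purely synergistic gene-environment interaction, not the Crohn disease data.

```python
from collections import Counter
from itertools import combinations
import math

def entropy(rows, idx):
    """Empirical Shannon entropy (bits) of the variables at positions idx."""
    counts = Counter(tuple(r[i] for i in idx) for r in rows)
    n = len(rows)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def kwii(rows, idx):
    """k-way interaction information via the alternating entropy sum."""
    k = len(idx)
    total = 0.0
    for r in range(1, k + 1):
        for sub in combinations(idx, r):
            total += (-1) ** (k - r) * entropy(rows, sub)
    return 0.0 - total

def tci(rows, idx):
    """Total correlation: sum of marginal entropies minus the joint entropy."""
    return sum(entropy(rows, (i,)) for i in idx) - entropy(rows, idx)

# XOR: the two inputs are pairwise independent of the output, so the
# association is visible only in the 3-way term.
rows = [(a, b, a ^ b) for a in (0, 1) for b in (0, 1)]
print(round(kwii(rows, (0, 1)), 3))     # → 0.0  (inputs are independent)
print(round(kwii(rows, (0, 1, 2)), 3))  # → 1.0  (one bit of pure synergy)
print(round(tci(rows, (0, 1, 2)), 3))   # → 1.0
```

    A KWII/TCI "spectrum" as in the abstract is just these quantities evaluated for every variable subset up to some order.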

  1. Multi-factor challenge/response approach for remote biometric authentication

    NASA Astrophysics Data System (ADS)

    Al-Assam, Hisham; Jassim, Sabah A.

    2011-06-01

    Although biometric authentication is perceived to be more reliable than traditional authentication schemes, it becomes vulnerable to many attacks when it comes to remote authentication over open networks, and it raises serious privacy concerns. This paper proposes a biometric-based challenge-response approach to be used for remote authentication between two parties A and B over open networks. In the proposed approach, a remote authenticator system B (e.g. a bank) challenges its client A, who wants to authenticate himself/herself to the system, by sending a one-time public random challenge. The client A responds by employing the random challenge along with secret information obtained from a password and a token to produce a one-time cancellable representation of a freshly captured biometric sample. The one-time biometric representation, which is based on multiple factors, is then sent back to B for matching. Here, we argue that eavesdropping on the one-time random challenge and/or the resulting one-time biometric representation does not compromise the security of the system, and no information about the original biometric data is leaked. In addition to securing biometric templates, the proposed protocol offers a practical solution to the replay attack on biometric systems. Moreover, we propose a new scheme for generating password-based pseudo-random numbers/permutations to be used as a building block in the proposed approach. The proposed scheme is also designed to provide protection against repudiation. We illustrate the viability and effectiveness of the proposed approach with experimental results based on two biometric modalities: fingerprint and face biometrics.
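
    The password-based pseudo-random permutation building block can be sketched generically: derive a key from the password and the one-time challenge, expand it into a keystream, and drive a Fisher-Yates shuffle with it. This is a generic stand-in, not the paper's scheme, and the password, challenge bytes, and iteration count are all hypothetical.

```python
import hashlib

def keyed_permutation(password: str, challenge: bytes, n: int) -> list:
    """Deterministic permutation of range(n) derived from a password and a
    one-time challenge (generic stand-in for the paper's construction)."""
    seed = hashlib.pbkdf2_hmac("sha256", password.encode(), challenge, 100_000)
    perm = list(range(n))
    counter = 0
    for i in range(n - 1, 0, -1):
        # expand the seed into a keystream block per shuffle step
        block = hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
        counter += 1
        j = int.from_bytes(block, "big") % (i + 1)  # slight modulo bias; fine for a sketch
        perm[i], perm[j] = perm[j], perm[i]
    return perm

p1 = keyed_permutation("hunter2", b"challenge-001", 16)
p2 = keyed_permutation("hunter2", b"challenge-001", 16)
p3 = keyed_permutation("hunter2", b"challenge-002", 16)
print(p1 == p2, sorted(p1) == list(range(16)), p1 != p3)
```

    Because the permutation depends on the one-time challenge, a replayed response built from an old challenge will not match, which is the property the protocol relies on.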

  2. Time-ordered product expansions for computational stochastic system biology.

    PubMed

    Mjolsness, Eric

    2013-06-01

    The time-ordered product framework of quantum field theory can also be used to understand salient phenomena in stochastic biochemical networks. It is used here to derive Gillespie's stochastic simulation algorithm (SSA) for chemical reaction networks; consequently, the SSA can be interpreted in terms of Feynman diagrams. It is also used here to derive other, more general simulation and parameter-learning algorithms including simulation algorithms for networks of stochastic reaction-like processes operating on parameterized objects, and also hybrid stochastic reaction/differential equation models in which systems of ordinary differential equations evolve the parameters of objects that can also undergo stochastic reactions. Thus, the time-ordered product expansion can be used systematically to derive simulation and parameter-fitting algorithms for stochastic systems.
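
    For reference, the algorithm the abstract re-derives can be stated in a few lines. Below is a minimal direct-method SSA on a birth-death network; the reaction set and rates are illustrative, not from the paper.

```python
import random

def gillespie(x, reactions, rates, t_max, rng):
    """Gillespie's direct-method SSA.
    x: dict of species counts; reactions: list of (propensity_fn, state_change)."""
    t, traj = 0.0, [(0.0, dict(x))]
    while t < t_max:
        props = [rate * prop(x) for (prop, _), rate in zip(reactions, rates)]
        total = sum(props)
        if total == 0:
            break                              # no reaction can fire
        t += rng.expovariate(total)            # exponential waiting time
        r, acc = rng.random() * total, 0.0     # pick which reaction fires
        for (prop, change), p in zip(reactions, props):
            acc += p
            if r <= acc:
                for species, delta in change.items():
                    x[species] += delta
                break
        traj.append((t, dict(x)))
    return traj

rng = random.Random(1)
# Birth-death: ∅ -> A at rate k1, A -> ∅ at rate k2*A (stationary mean k1/k2)
reactions = [(lambda s: 1.0, {"A": +1}), (lambda s: s["A"], {"A": -1})]
traj = gillespie({"A": 0}, reactions, rates=[10.0, 1.0], t_max=50.0, rng=rng)
print(len(traj) > 1, traj[-1][1]["A"] >= 0)
```

    In the time-ordered product picture, each simulated event corresponds to one vertex in a Feynman-diagram expansion of the master equation's propagator.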

  3. Variational principles for stochastic fluid dynamics

    PubMed Central

    Holm, Darryl D.

    2015-01-01

    This paper derives stochastic partial differential equations (SPDEs) for fluid dynamics from a stochastic variational principle (SVP). The paper proceeds by taking variations in the SVP to derive stochastic Stratonovich fluid equations; writing their Itô representation; and then investigating the properties of these stochastic fluid models in comparison with each other, and with the corresponding deterministic fluid models. The circulation properties of the stochastic Stratonovich fluid equations are found to closely mimic those of the deterministic ideal fluid models. As with deterministic ideal flows, motion along the stochastic Stratonovich paths also preserves the helicity of the vortex field lines in incompressible stochastic flows. However, these Stratonovich properties are not apparent in the equivalent Itô representation, because they are disguised by the quadratic covariation drift term arising in the Stratonovich to Itô transformation. This term is a geometric generalization of the quadratic covariation drift term already found for scalar densities in Stratonovich's famous 1966 paper. The paper also derives motion equations for two examples of stochastic geophysical fluid dynamics; namely, the Euler–Boussinesq and quasi-geostrophic approximations. PMID:27547083
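
    For reference, the finite-dimensional version of the Stratonovich-to-Itô correction the abstract alludes to can be written as follows (the standard textbook form, not the paper's geometric generalization):

```latex
% Stratonovich SDE with vector fields \xi_i multiplying the noise:
dX_t \;=\; b(X_t)\,dt \;+\; \sum_i \xi_i(X_t)\circ dW_t^i
% Equivalent It\^o form: the drift acquires the quadratic-covariation term
dX_t \;=\; \Big[\, b(X_t) \;+\; \tfrac{1}{2}\sum_i \big(\xi_i\!\cdot\!\nabla\big)\,\xi_i(X_t) \Big]\,dt
\;+\; \sum_i \xi_i(X_t)\, dW_t^i
```

    The extra drift term is what "disguises" the circulation and helicity properties in the Itô representation.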

  4. Universal fuzzy integral sliding-mode controllers for stochastic nonlinear systems.

    PubMed

    Gao, Qing; Liu, Lu; Feng, Gang; Wang, Yong

    2014-12-01

    In this paper, the universal integral sliding-mode controller problem for the general stochastic nonlinear systems modeled by Itô type stochastic differential equations is investigated. One of the main contributions is that a novel dynamic integral sliding mode control (DISMC) scheme is developed for stochastic nonlinear systems based on their stochastic T-S fuzzy approximation models. The key advantage of the proposed DISMC scheme is that two very restrictive assumptions in most existing ISMC approaches to stochastic fuzzy systems have been removed. Based on the stochastic Lyapunov theory, it is shown that the closed-loop control system trajectories are kept on the integral sliding surface almost surely since the initial time, and moreover, the stochastic stability of the sliding motion can be guaranteed in terms of linear matrix inequalities. Another main contribution is that the results of universal fuzzy integral sliding-mode controllers for two classes of stochastic nonlinear systems, along with constructive procedures to obtain the universal fuzzy integral sliding-mode controllers, are provided, respectively. Simulation results from an inverted pendulum example are presented to illustrate the advantages and effectiveness of the proposed approaches.

  5. Stochastic stability

    NASA Technical Reports Server (NTRS)

    Kushner, H. J.

    1972-01-01

    The field of stochastic stability is surveyed, with emphasis on the invariance theorems and their potential application to systems with randomly varying coefficients. Some of the basic ideas are reviewed, which underlie the stochastic Liapunov function approach to stochastic stability. The invariance theorems are discussed in detail.

  6. Persistence and extinction of a stochastic single-species model under regime switching in a polluted environment.

    PubMed

    Liu, Meng; Wang, Ke

    2010-06-07

    A new single-species model disturbed by both white noise and colored noise in a polluted environment is developed and analyzed. Sufficient criteria for extinction, stochastic nonpersistence in the mean, stochastic weak persistence in the mean, stochastic strong persistence in the mean and stochastic permanence of the species are established. The threshold between stochastic weak persistence in the mean and extinction is obtained. The results show that both white and colored environmental noises have significant effects on the survival results. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
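
    A minimal Euler-Maruyama sketch of this class of model: logistic growth dx = x(b(r) - a(r)x)dt + sigma(r)x dB, where the white noise enters through dB and the colored noise is represented by a two-state Markov chain r(t) that switches the parameters. All rates below are hypothetical, not from the paper.

```python
import math
import random

rng = random.Random(3)

# Hypothetical two-regime parameters: (growth rate b, competition a, noise sigma)
regimes = {0: (1.0, 0.5, 0.2), 1: (0.3, 0.5, 0.6)}
q01, q10 = 0.5, 0.5        # switching intensities of the Markov chain (colored noise)
dt, steps = 0.01, 5000

x, r = 0.5, 0
path = [x]
for _ in range(steps):
    # regime switch with probability ~ intensity * dt
    if rng.random() < (q01 if r == 0 else q10) * dt:
        r = 1 - r
    b, a, sigma = regimes[r]
    # Euler-Maruyama step for dx = x(b - a*x) dt + sigma*x dW (white noise)
    dw = rng.gauss(0.0, math.sqrt(dt))
    x += x * (b - a * x) * dt + sigma * x * dw
    x = max(x, 0.0)        # population cannot go negative
    path.append(x)

print(round(path[-1], 3), min(path) >= 0.0)
```

    Long runs of such a simulation are how one would check the analytical persistence/extinction thresholds numerically: under the "good" regime the path fluctuates around b/a, while a sufficiently noisy regime can drive it toward zero.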

  7. Stochastic differential equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sobczyk, K.

    1990-01-01

    This book provides a unified treatment of both regular (or random) and Itô stochastic differential equations. It focuses on solution methods, including some developed only recently. Applications are discussed; in particular, insight is given into both the mathematical structure and the most efficient solution methods (analytical as well as numerical). Starting from basic notions and results of the theory of stochastic processes and stochastic calculus (including Itô's stochastic integral), many principal mathematical problems and results related to stochastic differential equations are expounded here for the first time. Applications treated include those relating to road vehicles, earthquake excitations and offshore structures.

  8. Effect of the heterogeneous neuron and information transmission delay on stochastic resonance of neuronal networks

    NASA Astrophysics Data System (ADS)

    Wang, Qingyun; Zhang, Honghui; Chen, Guanrong

    2012-12-01

    We study the effects of a heterogeneous neuron and information transmission delay on stochastic resonance of scale-free neuronal networks. For this purpose, we introduce the heterogeneity to the specified neuron with the highest degree. It is shown that in the absence of delay, an intermediate noise level can optimally assist spike firings of collective neurons so as to achieve stochastic resonance on scale-free neuronal networks for small and intermediate values of αh, which plays a heterogeneous role. Maxima of the stochastic resonance measure are enhanced as αh increases, which implies that the heterogeneity can improve stochastic resonance. However, when αh is beyond a certain large value, no obvious stochastic resonance can be observed. If the information transmission delay is introduced to the neuronal networks, stochastic resonance is dramatically affected. In particular, a tuned information transmission delay can induce multiple stochastic resonance, manifested as well-expressed maxima in the stochastic resonance measure appearing at every multiple of one half of the subthreshold stimulus period. Furthermore, we observe that stochastic resonance at odd multiples of one half of the subthreshold stimulus period is subharmonic, as opposed to the case of even multiples. More interestingly, multiple stochastic resonance can also be improved by a suitable heterogeneous neuron. The presented results can provide good insights into the understanding of the effects of heterogeneous neurons and information transmission delay on realistic neuronal networks.
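
    The underlying stochastic resonance effect can be shown with a much simpler toy system than the paper's scale-free network: a single threshold unit driven by a subthreshold sinusoid plus noise, whose phase-locked response is largest at an intermediate noise level. All parameters here are hypothetical.

```python
import math
import random

def response_at_drive(noise_sd, rng, steps=20000, dt=0.001, f=5.0):
    """Fourier amplitude of the spike train at the drive frequency for a
    threshold unit (threshold = 1) with a subthreshold sinusoidal input."""
    amp_c = amp_s = 0.0
    for k in range(steps):
        t = k * dt
        signal = 0.8 * math.sin(2 * math.pi * f * t)   # peak 0.8 < threshold 1
        spike = 1.0 if signal + rng.gauss(0.0, noise_sd) > 1.0 else 0.0
        amp_c += spike * math.cos(2 * math.pi * f * t)
        amp_s += spike * math.sin(2 * math.pi * f * t)
    return math.hypot(amp_c, amp_s) / steps

rng = random.Random(0)
levels = [0.05, 0.3, 10.0]                 # too little, intermediate, too much noise
resp = [response_at_drive(sd, rng) for sd in levels]
print([round(r, 4) for r in resp])         # middle value is the largest
```

    Too little noise never crosses the threshold; too much noise fires regardless of the signal phase; an intermediate level fires preferentially near the signal peaks, maximizing the response at the drive frequency.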

  9. Ultimate open pit stochastic optimization

    NASA Astrophysics Data System (ADS)

    Marcotte, Denis; Caron, Josiane

    2013-02-01

    Classical open pit optimization (the maximum closure problem) is performed on block estimates, without directly considering the uncertainty of the block grades. We propose an alternative approach of stochastic optimization. The stochastic optimization is taken as the optimal pit computed on the block expected profits, rather than expected grades, computed from a series of conditional simulations. The stochastic optimization generates, by construction, larger ore and waste tonnages than the classical optimization. Contrary to the classical approach, the stochastic optimization is conditionally unbiased for the realized profit given the predicted profit. A series of simulated deposits with different variograms are used to compare the stochastic approach, the classical approach and the simulated approach that maximizes expected profit among simulated designs. Profits obtained with the stochastic optimization are generally larger than with the classical or simulated pit. The main factor controlling the relative gain of the stochastic optimization over the classical approach and the simulated pit is shown to be the information level, as measured by the borehole spacing/range ratio. The relative gains of the stochastic approach over the classical approach increase with the treatment costs but decrease with mining costs. The relative gains of the stochastic approach over the simulated pit approach increase with both the treatment and mining costs. At early stages of an open pit project, when uncertainty is large, the stochastic optimization approach appears preferable to the classical approach or the simulated pit approach for fair comparison of the values of alternative projects and for the initial design and planning of the open pit.
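
    The key point, expected profit versus profit of the expected grade, can be shown on a single block without solving the maximum closure problem. The ore/waste decision makes profit a convex function of grade, so averaging profits over conditional simulations exceeds the profit evaluated at the average grade. Costs and the grade distribution below are hypothetical.

```python
import random

rng = random.Random(5)
price, treat_cost, mine_cost = 10.0, 4.0, 1.0   # hypothetical economics

def block_profit(grade):
    """Mine the block either as ore (pay treatment) or as waste."""
    return max(grade * price - treat_cost, 0.0) - mine_cost

# Simulated grades stand in for conditional simulations of one block
sims = [max(rng.gauss(0.4, 0.25), 0.0) for _ in range(100000)]

profit_of_mean = block_profit(sum(sims) / len(sims))             # classical: estimate first
mean_of_profit = sum(block_profit(g) for g in sims) / len(sims)  # stochastic: profit first
print(round(profit_of_mean, 3), round(mean_of_profit, 3))
```

    The gap between the two numbers is the option value of deciding ore versus waste after the grade is realized; the stochastic pit optimization exploits exactly this effect block by block.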

  10. Stochastic effects in a seasonally forced epidemic model

    NASA Astrophysics Data System (ADS)

    Rozhnova, G.; Nunes, A.

    2010-10-01

    The interplay of seasonality, the system's nonlinearities and intrinsic stochasticity is studied for a seasonally forced susceptible-exposed-infective-recovered stochastic model. The model is explored in the parameter region that corresponds to childhood infectious diseases such as measles. The power spectrum of the stochastic fluctuations around the attractors of the deterministic system that describes the model in the thermodynamic limit is computed analytically and validated by stochastic simulations for large system sizes. Size effects are studied through additional simulations. Other effects, such as switching between coexisting attractors induced by stochasticity, often mentioned in the literature as playing an important role in the dynamics of childhood infectious diseases, are also investigated. The main conclusion is that stochastic amplification, rather than these effects, is the key ingredient to understand the observed incidence patterns.
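
    A minimal event-driven sketch of a seasonally forced stochastic SEIR model (Gillespie dynamics with a sinusoidally modulated transmission rate). The rates and population size are hypothetical, not the measles parameters used in the paper.

```python
import math
import random

rng = random.Random(11)

# Hypothetical rates: transmission beta0 with seasonal amplitude eps,
# incubation rate sigma, recovery rate gamma (per day)
N, beta0, eps, sigma, gamma = 10000, 1.2, 0.1, 0.2, 0.1
S, E, I, R = N - 20, 10, 10, 0
t, t_max = 0.0, 200.0
peak_I = I

while t < t_max and (E + I) > 0:
    beta = beta0 * (1 + eps * math.cos(2 * math.pi * t / 365))  # seasonal forcing
    a = [beta * S * I / N, sigma * E, gamma * I]   # infection, end of latency, recovery
    a0 = sum(a)
    t += rng.expovariate(a0)
    r = rng.random() * a0
    if r < a[0]:
        S, E = S - 1, E + 1
    elif r < a[0] + a[1]:
        E, I = E - 1, I + 1
    else:
        I, R = I - 1, R + 1
    peak_I = max(peak_I, I)

print(peak_I, S + E + I + R == N)
```

    The power spectrum of many such runs around the deterministic attractor is what the paper computes analytically to demonstrate stochastic amplification.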

  11. The relationship between stochastic and deterministic quasi-steady state approximations.

    PubMed

    Kim, Jae Kyoung; Josić, Krešimir; Bennett, Matthew R

    2015-11-23

    The quasi steady-state approximation (QSSA) is frequently used to reduce deterministic models of biochemical networks. The resulting equations provide a simplified description of the network in terms of non-elementary reaction functions (e.g. Hill functions). Such deterministic reductions are frequently a basis for heuristic stochastic models in which non-elementary reaction functions are used to define reaction propensities. Despite their popularity, it remains unclear when such stochastic reductions are valid. It is frequently assumed that the stochastic reduction can be trusted whenever its deterministic counterpart is accurate. However, a number of recent examples show that this is not necessarily the case. Here we explain the origin of these discrepancies, and demonstrate a clear relationship between the accuracy of the deterministic and the stochastic QSSA for examples widely used in biological systems. With an analysis of a two-state promoter model, and numerical simulations for a variety of other models, we find that the stochastic QSSA is accurate whenever its deterministic counterpart provides an accurate approximation over a range of initial conditions which cover the likely fluctuations from the quasi steady-state (QSS). We conjecture that this relationship provides a simple and computationally inexpensive way to test the accuracy of reduced stochastic models using deterministic simulations. The stochastic QSSA is one of the most popular multi-scale stochastic simulation methods. While the use of QSSA, and the resulting non-elementary functions has been justified in the deterministic case, it is not clear when their stochastic counterparts are accurate. In this study, we show how the accuracy of the stochastic QSSA can be tested using their deterministic counterparts providing a concrete method to test when non-elementary rate functions can be used in stochastic simulations.
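
    The deterministic side of the proposed test can be sketched with the classic enzyme example: integrate the full mass-action system and its Michaelis-Menten QSSA reduction for several initial conditions and compare. Rate constants are hypothetical, chosen so the validity condition E0 << S0 + Km holds.

```python
# Full mass-action enzyme kinetics vs. its Michaelis-Menten QSSA reduction,
# integrated by plain Euler steps (hypothetical rate constants).
k1, km1, k2 = 10.0, 1.0, 1.0
E0 = 1.0                       # total enzyme; validity regime: E0 << S0 + Km

def full_model(s0, t_end=5.0, dt=1e-4):
    s, c, t = s0, 0.0, 0.0     # substrate, enzyme-substrate complex
    while t < t_end:
        e = E0 - c
        ds = -k1 * e * s + km1 * c
        dc = k1 * e * s - (km1 + k2) * c
        s, c, t = s + ds * dt, c + dc * dt, t + dt
    return s

def qssa_model(s0, t_end=5.0, dt=1e-4):
    Km = (km1 + k2) / k1
    s, t = s0, 0.0
    while t < t_end:
        s -= k2 * E0 * s / (Km + s) * dt   # non-elementary Michaelis-Menten rate
        t += dt
    return s

results = {}
for s0 in (5.0, 50.0):         # a range of initial conditions, as the paper requires
    results[s0] = (full_model(s0), qssa_model(s0))
    print(s0, round(results[s0][0], 3), round(results[s0][1], 3))
```

    If the two trajectories agree across the range of initial conditions likely to be visited by fluctuations, the paper's conjecture suggests the Michaelis-Menten propensity can also be used safely in a Gillespie simulation.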

  12. Stochastic Multi-Timescale Power System Operations With Variable Wind Generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Hongyu; Krad, Ibrahim; Florita, Anthony

This paper describes a novel set of stochastic unit commitment and economic dispatch models that consider stochastic loads and variable generation at multiple operational timescales. The stochastic model includes four distinct stages: stochastic day-ahead security-constrained unit commitment (SCUC), stochastic real-time SCUC, stochastic real-time security-constrained economic dispatch (SCED), and deterministic automatic generation control (AGC). These sub-models are integrated such that they are continually updated with decisions passed from one to another. The progressive hedging algorithm (PHA) is applied to solve the stochastic models while maintaining their computational tractability. Comparative case studies against two deterministic approaches, one with perfect forecasts and the other with current state-of-the-art but imperfect deterministic forecasts, are conducted in low- and high-wind-penetration scenarios to highlight the advantages of the proposed methodology. The effectiveness of the proposed method is evaluated with sensitivity tests using both economic and reliability metrics to provide a broader view of its impact.
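The progressive hedging idea itself can be illustrated on a toy problem far simpler than the paper's SCUC/SCED models: one first-stage decision with quadratic scenario costs, so each augmented subproblem has a closed-form minimizer. All numbers below are hypothetical.

```python
# Progressive hedging on a toy two-scenario problem:
# minimize E_s[ a_s * (x - c_s)^2 ] over a single first-stage decision x.
# Each augmented subproblem  min a_s(x-c_s)^2 + w_s*x + (rho/2)(x-xbar)^2
# has the closed-form minimizer x = (2*a_s*c_s - w_s + rho*xbar)/(2*a_s + rho).

scenarios = [   # (probability, a_s, c_s) -- e.g. a low-wind vs a high-wind day
    (0.5, 1.0, 10.0),
    (0.5, 3.0, 2.0),
]
rho = 1.0
w = [0.0 for _ in scenarios]                 # dual (hedging) weights
x = [c for (_, _, c) in scenarios]           # scenario-wise decisions
xbar = sum(p * xi for (p, _, _), xi in zip(scenarios, x))

for _ in range(200):
    # Solve each augmented scenario subproblem in closed form.
    x = [(2 * a * c - ws + rho * xbar) / (2 * a + rho)
         for (_, a, c), ws in zip(scenarios, w)]
    xbar = sum(p * xi for (p, _, _), xi in zip(scenarios, x))
    # Dual update drives the scenario decisions toward consensus.
    w = [ws + rho * (xi - xbar) for ws, xi in zip(w, x)]

# The non-anticipative optimum of E[a_s (x-c_s)^2] is the weighted mean:
x_star = (sum(p * a * c for p, a, c in scenarios)
          / sum(p * a for p, a, _ in scenarios))
print(xbar, x_star)
```

The iteration enforces non-anticipativity (all scenarios agree on the first-stage decision) without ever solving the full extensive-form problem, which is what keeps the paper's much larger models tractable.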

  13. Stochastic computing with biomolecular automata

    PubMed Central

    Adar, Rivka; Benenson, Yaakov; Linshiz, Gregory; Rosner, Amit; Tishby, Naftali; Shapiro, Ehud

    2004-01-01

    Stochastic computing has a broad range of applications, yet electronic computers realize its basic step, stochastic choice between alternative computation paths, in a cumbersome way. Biomolecular computers use a different computational paradigm and hence afford novel designs. We constructed a stochastic molecular automaton in which stochastic choice is realized by means of competition between alternative biochemical pathways, and choice probabilities are programmed by the relative molar concentrations of the software molecules coding for the alternatives. Programmable and autonomous stochastic molecular automata have been shown to perform direct analysis of disease-related molecular indicators in vitro and may have the potential to provide in situ medical diagnosis and cure. PMID:15215499
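The core mechanism, choice probabilities set by relative concentrations of competing "software molecules", can be mimicked in a few lines. The states, rules, and concentrations below are hypothetical illustrations, not the actual biochemistry.

```python
import random

def automaton_step(state, software, rng):
    """One transition of a toy stochastic automaton: competing 'software
    molecules' for the current state are chosen with probability
    proportional to their (hypothetical) molar concentrations."""
    rules = software[state]                  # list of (next_state, concentration)
    total = sum(conc for _, conc in rules)
    r = rng.random() * total
    for next_state, conc in rules:
        r -= conc
        if r < 0:
            return next_state
    return rules[-1][0]                      # guard against float round-off

# Two rules compete out of S0: 3:1 concentration ratio gives a 75/25 split.
software = {"S0": [("yes", 3.0), ("no", 1.0)]}
rng = random.Random(1)
outcomes = [automaton_step("S0", software, rng) for _ in range(10000)]
frac_yes = outcomes.count("yes") / len(outcomes)
print(frac_yes)  # close to 0.75
```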

  14. Stochastic Galerkin methods for the steady-state Navier–Stokes equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sousedík, Bedřich, E-mail: sousedik@umbc.edu; Elman, Howard C., E-mail: elman@cs.umd.edu

    2016-07-01

We study the steady-state Navier–Stokes equations in the context of stochastic finite element discretizations. Specifically, we assume that the viscosity is a random field given in the form of a generalized polynomial chaos expansion. For the resulting stochastic problem, we formulate the model and linearization schemes using Picard and Newton iterations in the framework of the stochastic Galerkin method, and we explore properties of the resulting stochastic solutions. We also propose a preconditioner for solving the linear systems of equations arising at each step of the stochastic (Galerkin) nonlinear iteration and demonstrate its effectiveness for solving a set of benchmark problems.
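The stochastic Galerkin projection is easiest to see on a scalar analogue rather than the full Navier–Stokes system: a random "viscosity" given as a first-order polynomial chaos expansion, projected onto a Legendre basis. The equation and coefficients below are a hypothetical illustration of the method, not the paper's problem.

```python
import numpy as np
from numpy.polynomial.legendre import leggauss, legval

# Stochastic Galerkin for a scalar analogue of a problem with random viscosity:
#   nu(xi) * u(xi) = 1,   nu = nu0 + nu1*xi,   xi ~ Uniform(-1, 1).
p = 5                                  # gPC (Legendre) expansion degree
nodes, weights = leggauss(32)
weights = weights / 2.0                # quadrature for the uniform density on [-1, 1]

def P(j, x):
    c = np.zeros(j + 1)
    c[j] = 1.0
    return legval(x, c)                # Legendre polynomial P_j evaluated at x

nu0, nu1 = 1.0, 0.3                    # hypothetical viscosity expansion
A = np.zeros((p + 1, p + 1))
b = np.zeros(p + 1)
for j in range(p + 1):
    b[j] = np.sum(weights * P(j, nodes))                  # E[1 * P_j]
    for k in range(p + 1):
        A[j, k] = np.sum(weights * (nu0 + nu1 * nodes)
                         * P(j, nodes) * P(k, nodes))     # E[nu * P_j * P_k]
u = np.linalg.solve(A, b)              # Galerkin system for the gPC coefficients

exact_mean = np.log((nu0 + nu1) / (nu0 - nu1)) / (2 * nu1)
print(u[0], exact_mean)  # u[0] is E[u]; it matches the exact mean closely
```

In the paper the unknown is a velocity/pressure field and the Galerkin system is large and block-structured, which is why the proposed preconditioner matters; the scalar case only shows where that coupled system comes from.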

  15. Analysis of a novel stochastic SIRS epidemic model with two different saturated incidence rates

    NASA Astrophysics Data System (ADS)

    Chang, Zhengbo; Meng, Xinzhu; Lu, Xiao

    2017-04-01

This paper presents a stochastic SIRS epidemic model with two different nonlinear incidence rates and a double-epidemic asymmetrical hypothesis, and develops a mathematical method to obtain the thresholds of the stochastic epidemic model. We first investigate the boundedness and extinction of the stochastic system. Furthermore, we use Itô's formula, the comparison theorem and some new inequality techniques for stochastic differential systems to discuss persistence in mean of the two diseases in three cases. The results indicate that stochastic fluctuations can suppress disease outbreaks. Finally, numerical simulations with different noise disturbance coefficients are carried out to illustrate the theoretical results.
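The qualitative claim that noise can suppress an outbreak is easy to reproduce with an Euler–Maruyama simulation of a single-disease analogue (saturated incidence, multiplicative noise on the transmission term). Parameters are illustrative, not the paper's two-disease system.

```python
import math
import random

def sirs_em(beta, a, mu, gamma, delta, sigma, S0, I0, R0, dt, t_end, seed):
    """Euler-Maruyama path of a single-disease stochastic SIRS model with
    saturated incidence beta*S*I/(1 + a*I) and multiplicative noise on the
    transmission term (an illustrative analogue of the paper's system)."""
    rng = random.Random(seed)
    N = S0 + I0 + R0            # recruitment mu*N keeps total population near N
    S, I, R = S0, I0, R0
    t = 0.0
    while t < t_end:
        dB = rng.gauss(0.0, math.sqrt(dt))
        inc = beta * S * I / (1.0 + a * I)
        S += (mu * N - inc - mu * S + delta * R) * dt - sigma * inc * dB
        I += (inc - (mu + gamma) * I) * dt + sigma * inc * dB
        R += (gamma * I - (mu + delta) * R) * dt
        S, I, R = max(S, 0.0), max(I, 0.0), max(R, 0.0)
        t += dt
    return S, I, R

base = dict(beta=0.5, a=0.1, mu=0.1, gamma=0.2, delta=0.1,
            S0=0.9, I0=0.1, R0=0.0, dt=0.001, t_end=400.0, seed=2)
low = sirs_em(sigma=0.05, **base)    # weak noise: the disease persists
high = sirs_em(sigma=2.0, **base)    # strong noise: the outbreak is suppressed
print(low[1], high[1])
```

Here the deterministic reproduction number exceeds one, so the disease is endemic without noise; large noise intensity drives the infected fraction to extinction, mirroring the paper's threshold result.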

  16. Stochastic Galerkin methods for the steady-state Navier–Stokes equations

    DOE PAGES

    Sousedík, Bedřich; Elman, Howard C.

    2016-04-12

We study the steady-state Navier–Stokes equations in the context of stochastic finite element discretizations. Specifically, we assume that the viscosity is a random field given in the form of a generalized polynomial chaos expansion. For the resulting stochastic problem, we formulate the model and linearization schemes using Picard and Newton iterations in the framework of the stochastic Galerkin method, and we explore properties of the resulting stochastic solutions. We also propose a preconditioner for solving the linear systems of equations arising at each step of the stochastic (Galerkin) nonlinear iteration and demonstrate its effectiveness for solving a set of benchmark problems.

  17. Stability analysis of multi-group deterministic and stochastic epidemic models with vaccination rate

    NASA Astrophysics Data System (ADS)

    Wang, Zhi-Gang; Gao, Rui-Mei; Fan, Xiao-Ming; Han, Qi-Xing

    2014-09-01

We discuss in this paper a deterministic multi-group MSIR epidemic model with a vaccination rate. The basic reproduction number ℛ0, a key parameter in epidemiology, is a threshold that determines the persistence or extinction of the disease. Using Lyapunov function techniques, we show that if ℛ0 is greater than 1 and the deterministic model obeys some conditions, then the disease prevails: the infective class persists and the endemic state is asymptotically stable in a feasible region. If ℛ0 is less than or equal to 1, the infective class disappears and the disease dies out. In addition, stochastic noise around the endemic equilibrium is added to the deterministic MSIR model, extending it to a system of stochastic ordinary differential equations. In the stochastic version, we carry out a detailed analysis of the asymptotic behavior of the stochastic model and show that, when the stochastic system obeys some conditions and ℛ0 is greater than 1, the stochastic system is stochastically asymptotically stable. Finally, the deterministic and stochastic model dynamics are illustrated through computer simulations.

  18. [New International Classification of Chronic Pancreatitis (M-ANNHEIM multifactor classification system, 2007): principles, merits, and demerits].

    PubMed

    Tsimmerman, Ia S

    2008-01-01

    The new International Classification of Chronic Pancreatitis (designated as M-ANNHEIM) proposed by a group of German specialists in late 2007 is reviewed. All its sections are subjected to analysis (risk group categories, clinical stages and phases, variants of clinical course, diagnostic criteria for "established" and "suspected" pancreatitis, instrumental methods and functional tests used in the diagnosis, evaluation of the severity of the disease using a scoring system, stages of elimination of pain syndrome). The new classification is compared with the earlier classification proposed by the author. Its merits and demerits are discussed.

  19. [Typology and systematization of residual mental disorders in alcohol dependence].

    PubMed

    Klimenko, T V; Agafonova, S S

    2007-01-01

The study of 85 patients with alcohol dependence referred for forensic psychiatric expert evaluation at the Serbsky research center of social and forensic psychiatry revealed polymorphic psychiatric and behavioral disorders (ICD-10 diagnosis F10.7, residual and late-onset psychotic disorders) after cessation of intoxication, withdrawal and post-withdrawal disorders. Taking into account the multifactor etiology of the psychiatric disorders observed after the direct effect of alcohol has ended, the possibility of including other ICD-10 items to extend their diagnostics, and thus provide more accurate clinical verification of these states, is discussed.

  20. Crop status evaluations and yield predictions

    NASA Technical Reports Server (NTRS)

    Haun, J. R.

    1976-01-01

One phase of the large area crop inventory project is presented. Wheat yield models based on the input of environmental variables potentially obtainable through the use of space remote sensing were developed and demonstrated. Using a unique method for visually qualifying daily plant development, together with subsequent multifactor computer analyses, it was possible to develop practical models for predicting crop development and yield. Development of the wheat yield prediction models was based on the finding that morphological changes in plants can be detected and quantified on a daily basis, and that this change during a portion of the season is proportional to yield.

  1. Characteristics of group networks in the KOSPI and the KOSDAQ

    NASA Astrophysics Data System (ADS)

    Kim, Kyungsik; Ko, Jeung-Su; Yi, Myunggi

    2012-02-01

    We investigate the main feature of group networks in the KOSPI and KOSDAQ of Korean financial markets and analyze daily cross-correlations between price fluctuations for the 5-year time period from 2006 to 2010. We discuss the stabilities by undressing the market-wide effect using the Markowitz multi-factor model and the network-based approach. In particular we ascertain the explicit list of significant firms in the few largest eigenvectors from the undressed correlation matrix. Finally, we show the structure of group correlation by applying a network-based approach. In addition, the relation between market capitalizations and businesses is examined.

  2. Logistic regression trees for initial selection of interesting loci in case-control studies

    PubMed Central

    Nickolov, Radoslav Z; Milanov, Valentin B

    2007-01-01

    Modern genetic epidemiology faces the challenge of dealing with hundreds of thousands of genetic markers. The selection of a small initial subset of interesting markers for further investigation can greatly facilitate genetic studies. In this contribution we suggest the use of a logistic regression tree algorithm known as logistic tree with unbiased selection. Using the simulated data provided for Genetic Analysis Workshop 15, we show how this algorithm, with incorporation of multifactor dimensionality reduction method, can reduce an initial large pool of markers to a small set that includes the interesting markers with high probability. PMID:18466557

  3. Stochasticity in materials structure, properties, and processing—A review

    NASA Astrophysics Data System (ADS)

    Hull, Robert; Keblinski, Pawel; Lewis, Dan; Maniatty, Antoinette; Meunier, Vincent; Oberai, Assad A.; Picu, Catalin R.; Samuel, Johnson; Shephard, Mark S.; Tomozawa, Minoru; Vashishth, Deepak; Zhang, Shengbai

    2018-03-01

    We review the concept of stochasticity—i.e., unpredictable or uncontrolled fluctuations in structure, chemistry, or kinetic processes—in materials. We first define six broad classes of stochasticity: equilibrium (thermodynamic) fluctuations; structural/compositional fluctuations; kinetic fluctuations; frustration and degeneracy; imprecision in measurements; and stochasticity in modeling and simulation. In this review, we focus on the first four classes that are inherent to materials phenomena. We next develop a mathematical framework for describing materials stochasticity and then show how it can be broadly applied to these four materials-related stochastic classes. In subsequent sections, we describe structural and compositional fluctuations at small length scales that modify material properties and behavior at larger length scales; systems with engineered fluctuations, concentrating primarily on composite materials; systems in which stochasticity is developed through nucleation and kinetic phenomena; and configurations in which constraints in a given system prevent it from attaining its ground state and cause it to attain several, equally likely (degenerate) states. We next describe how stochasticity in these processes results in variations in physical properties and how these variations are then accentuated by—or amplify—stochasticity in processing and manufacturing procedures. In summary, the origins of materials stochasticity, the degree to which it can be predicted and/or controlled, and the possibility of using stochastic descriptions of materials structure, properties, and processing as a new degree of freedom in materials design are described.

  4. Building and verifying a severity prediction model of acute pancreatitis (AP) based on BISAP, MEWS and routine test indexes.

    PubMed

    Ye, Jiang-Feng; Zhao, Yu-Xin; Ju, Jian; Wang, Wei

    2017-10-01

To discuss the value of the Bedside Index for Severity in Acute Pancreatitis (BISAP), the Modified Early Warning Score (MEWS), serum Ca2+ and red cell distribution width (RDW) for predicting the severity grade of acute pancreatitis, and to develop and verify a more accurate scoring system to predict the severity of AP. In 302 patients with AP, we calculated BISAP and MEWS scores and conducted single-factor logistic regression analyses of the relationships of BISAP, RDW, MEWS and serum Ca2+ with the severity of AP. The variables with statistical significance in the single-factor logistic regression were entered into a multi-factor logistic regression model; forward stepwise regression was used to screen variables and build a multi-factor prediction model. A receiver operating characteristic (ROC) curve was constructed, and the significance of the multi- and single-factor prediction models in predicting the severity of AP was evaluated using the area under the ROC curve (AUC). The internal validity of the model was verified through bootstrapping. Among the 302 patients with AP, 209 had mild acute pancreatitis (MAP) and 93 had severe acute pancreatitis (SAP). Single-factor logistic regression analysis showed that BISAP, MEWS and serum Ca2+ are predictors of the severity of AP (P<0.001), whereas RDW is not (P>0.05). Multi-factor logistic regression analysis showed that BISAP and serum Ca2+ are independent predictors of AP severity (P<0.001), while MEWS is not (P>0.05); BISAP is negatively correlated with serum Ca2+ (r=-0.330, P<0.001). The constructed model is: ln(p/(1-p)) = 7.306 + 1.151*BISAP - 4.516*serum Ca2+. The predictive ability of each model for SAP follows the order combined BISAP and serum Ca2+ model > serum Ca2+ > BISAP. 
The difference in predictive ability between BISAP and serum Ca2+ is not statistically significant (P>0.05); however, the newly built prediction model differs significantly from both BISAP and serum Ca2+ individually (P<0.01). Verification of the internal validity of the models by bootstrapping is favorable. BISAP and serum Ca2+ have high predictive value for the severity of AP, but the model combining them is remarkably superior to either alone. Furthermore, this model is simple, practical and appropriate for clinical use. Copyright © 2016. Published by Elsevier Masson SAS.
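The reported combined model can be turned into a bedside risk probability. The sketch below assumes the standard logistic (logit) link, with the coefficients quoted in the abstract and hypothetical patient values.

```python
import math

def sap_probability(bisap, serum_ca):
    """Predicted probability of severe acute pancreatitis from the combined
    model reported in the abstract, assuming a standard logit link:
    logit(p) = 7.306 + 1.151*BISAP - 4.516*Ca2+ (Ca2+ in mmol/L)."""
    z = 7.306 + 1.151 * bisap - 4.516 * serum_ca
    return 1.0 / (1.0 + math.exp(-z))

# Higher BISAP and lower serum calcium both push the predicted risk up.
print(sap_probability(bisap=4, serum_ca=1.8))   # high-risk profile
print(sap_probability(bisap=0, serum_ca=2.4))   # low-risk profile
```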

  5. Stochastic P-bifurcation and stochastic resonance in a noisy bistable fractional-order system

    NASA Astrophysics Data System (ADS)

    Yang, J. H.; Sanjuán, Miguel A. F.; Liu, H. G.; Litak, G.; Li, X.

    2016-12-01

We investigate the stochastic response of a noisy bistable fractional-order system when the fractional-order lies in the interval (0, 2]. We focus mainly on the stochastic P-bifurcation and the phenomenon of stochastic resonance. We compare the generalized Euler algorithm and the predictor-corrector approach, which are commonly used for numerical calculations of fractional-order nonlinear equations. Based on the predictor-corrector approach, the stochastic P-bifurcation and the stochastic resonance are investigated. Both the fractional-order value and the noise intensity can induce a stochastic P-bifurcation. The fractional-order may turn the stationary probability density function from a single-peak mode into a double-peak mode, whereas the noise intensity may transform it from a double-peak mode into a single-peak mode. The stochastic resonance is investigated thoroughly, according to the linear and the nonlinear response theory. In the linear response theory, the optimal stochastic resonance may occur when the value of the fractional-order is larger than one. In previous works, the fractional-order is usually limited to the interval (0, 1]. Moreover, the stochastic resonance at the subharmonic frequency and at the superharmonic frequency is investigated separately, by using the nonlinear response theory. When it occurs at the subharmonic frequency, the resonance may be strong and cannot be ignored. When it occurs at the superharmonic frequency, the resonance is weak. We believe that the results in this paper might be useful for the signal processing of nonlinear systems.

  6. The importance of environmental variability and management control error to optimal harvest policies

    USGS Publications Warehouse

    Hunter, C.M.; Runge, M.C.

    2004-01-01

    State-dependent strategies (SDSs) are the most general form of harvest policy because they allow the harvest rate to depend, without constraint, on the state of the system. State-dependent strategies that provide an optimal harvest rate for any system state can be calculated, and stochasticity can be appropriately accommodated in this optimization. Stochasticity poses 2 challenges to harvest policies: (1) the population will never be at the equilibrium state; and (2) stochasticity induces uncertainty about future states. We investigated the effects of 2 types of stochasticity, environmental variability and management control error, on SDS harvest policies for a white-tailed deer (Odocoileus virginianus) model, and contrasted these with a harvest policy based on maximum sustainable yield (MSY). Increasing stochasticity resulted in more conservative SDSs; that is, higher population densities were required to support the same harvest rate, but these effects were generally small. As stochastic effects increased, SDSs performed much better than MSY. Both deterministic and stochastic SDSs maintained maximum mean annual harvest yield (AHY) and optimal equilibrium population size (Neq) in a stochastic environment, whereas an MSY policy could not. We suggest 3 rules of thumb for harvest management of long-lived vertebrates in stochastic systems: (1) an SDS is advantageous over an MSY policy, (2) using an SDS rather than an MSY is more important than whether a deterministic or stochastic SDS is used, and (3) for SDSs, rankings of the variability in management outcomes (e.g., harvest yield) resulting from parameter stochasticity can be predicted by rankings of the deterministic elasticities.

  7. Economic Risk Analysis of Agricultural Tillage Systems Using the SMART Stochastic Efficiency Software Package

    USDA-ARS?s Scientific Manuscript database

    Recently, a variant of stochastic dominance called stochastic efficiency with respect to a function (SERF) has been developed and applied. Unlike traditional stochastic dominance approaches, SERF uses the concept of certainty equivalents (CEs) to rank a set of risk-efficient alternatives instead of...
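The certainty-equivalent idea behind SERF fits in a few lines: compute CEs for each alternative across a range of risk-aversion coefficients and rank them. The sketch below uses a negative-exponential (CARA) utility and hypothetical net-return draws; the actual SMART package and the tillage data are not reproduced here.

```python
import math

def certainty_equivalent(payoffs, r):
    """CE under negative-exponential (CARA) utility with absolute risk
    aversion r; r -> 0 recovers the expected value."""
    if abs(r) < 1e-12:
        return sum(payoffs) / len(payoffs)
    eu = sum(math.exp(-r * x) for x in payoffs) / len(payoffs)
    return -math.log(eu) / r

# Hypothetical net-return draws for two tillage systems (same units):
conventional = [122, 82, 102, 62, 142]   # higher mean, more variable
no_till = [105, 95, 100, 90, 110]        # lower mean, tightly clustered

# SERF ranks alternatives by CE across a whole range of risk aversion:
for r in (0.0, 0.01, 0.05):
    ces = {"conventional": certainty_equivalent(conventional, r),
           "no_till": certainty_equivalent(no_till, r)}
    print(r, max(ces, key=ces.get))  # the preferred system flips as r grows
```

Unlike pairwise stochastic dominance tests, this produces a complete ranking at every risk-aversion level, which is the practical advantage the abstract alludes to.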

  8. Control of stochastic sensitivity in a stabilization problem for gas discharge system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bashkirtseva, Irina

    2015-11-30

We consider a nonlinear dynamic stochastic system with control. The problem of stochastic sensitivity synthesis for the equilibrium is studied, and a mathematical technique for its solution is discussed. This technique is applied to the stabilization of the operating mode of a stochastic gas discharge system. We construct a feedback regulator that reduces the stochastic sensitivity of the equilibrium, suppresses large-amplitude oscillations, and ensures proper operation of this engineering device.

  9. An auto-Bäcklund transformation and exact solutions for Wick-type stochastic generalized KdV equations

    NASA Astrophysics Data System (ADS)

    Xie, Yingchao

    2004-05-01

Wick-type stochastic generalized KdV equations are investigated. Using the homogeneous balance method, an auto-Bäcklund transformation for the Wick-type stochastic generalized KdV equations is derived, and stochastic single-soliton and multi-soliton solutions are obtained by using the Hermite transform. Research supported by the National Natural Science Foundation of China (19971072) and the Natural Science Foundation of Education Committee of Jiangsu Province of China (03KJB110135).

  10. Stochastic analysis of a novel nonautonomous periodic SIRI epidemic system with random disturbances

    NASA Astrophysics Data System (ADS)

    Zhang, Weiwei; Meng, Xinzhu

    2018-02-01

In this paper, a new stochastic nonautonomous SIRI epidemic model is formulated. Given that the incidence rates of diseases may change with the environment, we propose a novel type of transmission function. The main aim of this paper is to obtain the thresholds of the stochastic SIRI epidemic model. To this end, we investigate the dynamics of the stochastic system and establish conditions for extinction and persistence in mean of the disease by constructing suitable Lyapunov functions and using stochastic analysis techniques. Furthermore, we show that the stochastic system has at least one nontrivial positive periodic solution. Finally, numerical simulations are introduced to illustrate our results.

  11. Stochastic foundations of undulatory transport phenomena: generalized Poisson-Kac processes—part III extensions and applications to kinetic theory and transport

    NASA Astrophysics Data System (ADS)

    Giona, Massimiliano; Brasiello, Antonio; Crescitelli, Silvestro

    2017-08-01

This third part extends the theory of Generalized Poisson-Kac (GPK) processes to nonlinear stochastic models and to a continuum of states. Nonlinearity is treated in two ways: (i) as a dependence of the parameters (intensity of the stochastic velocity, transition rates) of the stochastic perturbation on the state variable, similarly to the case of nonlinear Langevin equations, and (ii) as the dependence of the stochastic microdynamic equations of motion on the statistical description of the process itself (nonlinear Fokker-Planck-Kac models). Several numerical and physical examples illustrate the theory. Combining nonlinearity and a continuum of states, GPK theory provides a stochastic derivation of the nonlinear Boltzmann equation, furnishing a positive answer to Kac's program in kinetic theory. The transition from stochastic microdynamics to transport theory within the framework of the GPK paradigm is also addressed.

  12. Stochastic architecture for Hopfield neural nets

    NASA Technical Reports Server (NTRS)

    Pavel, Sandy

    1992-01-01

An expandable stochastic digital architecture for recurrent (Hopfield-like) neural networks is proposed. The main features and basic principles of stochastic processing are presented. The stochastic digital architecture is based on a chip with n fully interconnected neurons and a pipelined, bit-processing structure. For large applications, a flexible way to interconnect many such chips is provided.
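The bit-level stochastic processing such architectures rely on can be illustrated with the classic unipolar-bitstream trick: a value in [0, 1] is encoded as the density of 1s in a random bitstream, and a single AND gate multiplies two independently encoded values. This is a generic illustration of the principle, not the chip's actual circuit.

```python
import random

def encode(p, n, rng):
    # Unipolar stochastic bitstream: each bit is 1 with probability p.
    return [1 if rng.random() < p else 0 for _ in range(n)]

def decode(bits):
    # The encoded value is recovered as the fraction of 1s.
    return sum(bits) / len(bits)

rng = random.Random(7)
a = encode(0.8, 100000, rng)
b = encode(0.5, 100000, rng)
# Bitwise AND of independent streams multiplies the encoded values:
prod = [x & y for x, y in zip(a, b)]
print(decode(prod))  # close to 0.8 * 0.5 = 0.4
```

The appeal for neural hardware is that a multiplier shrinks to one gate, at the cost of precision that grows only as the square root of the stream length.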

  13. Hermite-Hadamard type inequality for φ_h-convex stochastic processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarıkaya, Mehmet Zeki, E-mail: sarikayamz@gmail.com; Kiriş, Mehmet Eyüp, E-mail: kiris@aku.edu.tr; Çelik, Nuri, E-mail: ncelik@bartin.edu.tr

    2016-04-18

The main aim of the present paper is to introduce φ_h-convex stochastic processes and to investigate the main properties of these mappings. Moreover, we prove Hadamard-type inequalities for φ_h-convex stochastic processes. We also give some new general inequalities for φ_h-convex stochastic processes.

  14. RES: Regularized Stochastic BFGS Algorithm

    NASA Astrophysics Data System (ADS)

    Mokhtari, Aryan; Ribeiro, Alejandro

    2014-12-01

RES, a regularized stochastic version of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton method, is proposed to solve convex optimization problems with stochastic objectives. The use of stochastic gradient descent algorithms is widespread, but the number of iterations required to approximate optimal arguments can be prohibitive in high-dimensional problems. Application of second-order methods, on the other hand, is impracticable because computation of objective function Hessian inverses incurs excessive computational cost. BFGS modifies gradient descent by introducing a Hessian approximation matrix computed from finite gradient differences. RES utilizes stochastic gradients in lieu of deterministic gradients both for the determination of descent directions and for the approximation of the objective function's curvature. Since stochastic gradients can be computed at manageable computational cost, RES is realizable and retains the convergence rate advantages of its deterministic counterparts. Convergence results show that lower and upper bounds on the Hessian eigenvalues of the sample functions are sufficient to guarantee convergence to optimal arguments. Numerical experiments showcase reductions in convergence time relative to stochastic gradient descent algorithms and non-regularized stochastic versions of BFGS. An application of RES to the implementation of support vector machines is developed.
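A minimal two-dimensional sketch of the idea follows: curvature pairs are formed from gradient differences of the *same* random sample, and the secant vector is regularized by a constant delta so the curvature stays bounded away from zero. The quadratic objective, noise model, and step sizes are hypothetical; this is a sketch of the mechanism, not the paper's exact algorithm.

```python
import random

def sample_grad(x, theta):
    # Stochastic gradient of f(x) = 0.5*(3*x0^2 + x1^2): each sample draws
    # a perturbed curvature (a, b) and an additive noise term (c, d).
    a, b, c, d = theta
    return [(3.0 + a) * x[0] + c, (1.0 + b) * x[1] + d]

def res_sketch(steps=2000, eta=0.05, delta=0.1, seed=0):
    rng = random.Random(seed)
    x = [5.0, -5.0]
    B = [[1.0, 0.0], [0.0, 1.0]]        # Hessian approximation (2x2)
    for _ in range(steps):
        theta = (rng.uniform(-0.5, 0.5), rng.uniform(-0.5, 0.5),
                 rng.gauss(0, 0.3), rng.gauss(0, 0.3))
        g = sample_grad(x, theta)
        # Quasi-Newton step on the stochastic gradient: s = -eta * B^-1 g.
        det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
        s = [-eta * (B[1][1] * g[0] - B[0][1] * g[1]) / det,
             -eta * (B[0][0] * g[1] - B[1][0] * g[0]) / det]
        x_new = [x[0] + s[0], x[1] + s[1]]
        # Curvature pair from the SAME sample theta, regularized by delta
        # so that s'y stays bounded away from zero (the regularization idea).
        g_new = sample_grad(x_new, theta)
        y = [g_new[0] - g[0] - delta * s[0], g_new[1] - g[1] - delta * s[1]]
        sy = s[0] * y[0] + s[1] * y[1]
        if sy > 1e-10:                  # standard BFGS update of B
            Bs = [B[0][0] * s[0] + B[0][1] * s[1],
                  B[1][0] * s[0] + B[1][1] * s[1]]
            sBs = s[0] * Bs[0] + s[1] * Bs[1]
            for i in range(2):
                for j in range(2):
                    B[i][j] += y[i] * y[j] / sy - Bs[i] * Bs[j] / sBs
        x = x_new
    return x, B

x, B = res_sketch()
print(x)              # close to the optimum at the origin
print(B[0][0], B[1][1])
```

The diagonal of B settles near the true curvatures (roughly 3 and 1, biased down by delta), so steps along the stiff direction are automatically shortened relative to plain stochastic gradient descent.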

  15. Non-linear dynamic characteristics and optimal control of giant magnetostrictive film subjected to in-plane stochastic excitation

    NASA Astrophysics Data System (ADS)

    Zhu, Z. W.; Zhang, W. D.; Xu, J.

    2014-03-01

The non-linear dynamic characteristics and optimal control of a giant magnetostrictive film (GMF) subjected to in-plane stochastic excitation were studied. Non-linear differential terms were introduced to interpret the hysteretic phenomena of the GMF, and the non-linear dynamic model of the GMF subjected to in-plane stochastic excitation was developed. The stochastic stability was analysed, and the probability density function was obtained. The condition for stochastic Hopf bifurcation and noise-induced chaotic response was determined, and the fractal boundary of the system's safe basin was provided. The reliability function was solved from the backward Kolmogorov equation, and an optimal control strategy was proposed using the stochastic dynamic programming method. Numerical simulation shows that the system stability varies with the parameters, and stochastic Hopf bifurcation and chaos appear in the process; the area of the safe basin decreases as the noise intensifies, and the boundary of the safe basin becomes fractal; the system reliability is improved through stochastic optimal control. Finally, the theoretical and numerical results were confirmed by experiments. The results are helpful for engineering applications of GMF.

  16. On square-wave-driven stochastic resonance for energy harvesting in a bistable system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su, Dongxu, E-mail: sudx@iis.u-tokyo.ac.jp; Zheng, Rencheng; Nakano, Kimihiko

Stochastic resonance is a physical phenomenon through which the throughput of energy within an oscillator excited by a stochastic source can be boosted by adding a small modulating excitation. This study investigates the feasibility of implementing square-wave-driven stochastic resonance to enhance energy harvesting. The motivating hypothesis was that such stochastic resonance can be efficiently realized in a bistable mechanism. However, the condition for the occurrence of stochastic resonance is conventionally defined by the Kramers rate. This definition is inadequate because of the necessity and difficulty of estimating the white noise density. A bistable mechanism has been designed using an explicit analytical model, which implies a new approach for achieving stochastic resonance in the paper. Experimental tests confirm that the addition of a small-scale force to the bistable system excited by a random signal leads to a corresponding amplification of the response, which we now term square-wave-driven stochastic resonance. The study therefore indicates that this approach may be a promising way to improve the performance of an energy harvester under certain forms of random excitation.
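The phenomenon can be reproduced numerically with an overdamped double-well (bistable) Langevin model: a square-wave modulation too weak to flip the well on its own synchronizes the noise-induced hops when the noise level is right. All parameters below are illustrative, not the experimental device.

```python
import math
import random

def simulate(amp, sigma, period, dt=0.01, t_end=2000.0, seed=5):
    """Euler-Maruyama path of dx = (x - x^3 + amp*sq(t)) dt + sigma dW,
    where sq(t) = +/-1 is a square wave (illustrative parameters)."""
    rng = random.Random(seed)
    x, t = 1.0, 0.0
    xs, sq = [], []
    while t < t_end:
        s = 1.0 if (t % period) < period / 2 else -1.0
        x += (x - x**3 + amp * s) * dt + sigma * math.sqrt(dt) * rng.gauss(0, 1)
        t += dt
        xs.append(x)
        sq.append(s)
    return xs, sq

def correlation(u, v):
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    su = math.sqrt(sum((a - mu) ** 2 for a in u))
    sv = math.sqrt(sum((b - mv) ** 2 for b in v))
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / (su * sv)

# amp = 0.3 is subthreshold (the deterministic flipping threshold for this
# double well is about 0.385), so the square wave alone cannot switch wells;
# with noise added, the inter-well hops lock to the modulation.
xs, sq = simulate(amp=0.3, sigma=0.55, period=50.0)
print(correlation(xs, sq))  # strongly positive when hops synchronize
```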

  17. Space-time-modulated stochastic processes

    NASA Astrophysics Data System (ADS)

    Giona, Massimiliano

    2017-10-01

    Starting from the physical problem associated with the Lorentzian transformation of a Poisson-Kac process in inertial frames, the concept of space-time-modulated stochastic processes is introduced for processes possessing finite propagation velocity. This class of stochastic processes provides a two-way coupling between the stochastic perturbation acting on a physical observable and the evolution of the physical observable itself, which in turn influences the statistical properties of the stochastic perturbation during its evolution. The definition of space-time-modulated processes requires the introduction of two functions: a nonlinear amplitude modulation, controlling the intensity of the stochastic perturbation, and a time-horizon function, which modulates its statistical properties, providing irreducible feedback between the stochastic perturbation and the physical observable influenced by it. The latter property is the peculiar fingerprint of this class of models that makes them suitable for extension to generic curved-space times. Considering Poisson-Kac processes as prototypical examples of stochastic processes possessing finite propagation velocity, the balance equations for the probability density functions associated with their space-time modulations are derived. Several examples highlighting the peculiarities of space-time-modulated processes are thoroughly analyzed.

  18. Stochastic modelling of microstructure formation in solidification processes

    NASA Astrophysics Data System (ADS)

    Nastac, Laurentiu; Stefanescu, Doru M.

    1997-07-01

    To relax many of the assumptions used in continuum approaches, a general stochastic model has been developed. The stochastic model can be used not only for an accurate description of the fraction of solid evolution, and therefore accurate cooling curves, but also for simulation of microstructure formation in castings. The advantage of using the stochastic approach is to give a time- and space-dependent description of solidification processes. Time- and space-dependent processes can also be described by partial differential equations. Unlike a differential formulation which, in most cases, has to be transformed into a difference equation and solved numerically, the stochastic approach is essentially a direct numerical algorithm. The stochastic model is comprehensive, since the competition between various phases is considered. Furthermore, grain impingement is directly included through the structure of the model. In the present research, all grain morphologies are simulated with this procedure. The relevance of the stochastic approach is that the simulated microstructures can be directly compared with microstructures obtained from experiments. The computer becomes a `dynamic metallographic microscope'. A comparison between deterministic and stochastic approaches has been performed. An important objective of this research was to answer the following general questions: (1) `Would fully deterministic approaches continue to be useful in solidification modelling?' and (2) `Would stochastic algorithms be capable of entirely replacing purely deterministic models?'

  19. Hybrid approaches for multiple-species stochastic reaction–diffusion models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spill, Fabian, E-mail: fspill@bu.edu; Department of Mechanical Engineering, Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, MA 02139; Guerrero, Pilar

    2015-10-15

    Reaction–diffusion models are used to describe systems in fields as diverse as physics, chemistry, ecology and biology. The fundamental quantities in such models are individual entities such as atoms and molecules, bacteria, cells or animals, which move and/or react in a stochastic manner. If the number of entities is large, accounting for each individual is inefficient, and often partial differential equation (PDE) models are used in which the stochastic behaviour of individuals is replaced by a description of the averaged, or mean behaviour of the system. In some situations the number of individuals is large in certain regions and small in others. In such cases, a stochastic model may be inefficient in one region, and a PDE model inaccurate in another. To overcome this problem, we develop a scheme which couples a stochastic reaction–diffusion system in one part of the domain with its mean field analogue, i.e. a discretised PDE model, in the other part of the domain. The interface in between the two domains occupies exactly one lattice site and is chosen such that the mean field description is still accurate there. In this way errors due to the flux between the domains are small. Our scheme can account for multiple dynamic interfaces separating multiple stochastic and deterministic domains, and the coupling between the domains conserves the total number of particles. The method preserves stochastic features such as extinction not observable in the mean field description, and is significantly faster to simulate on a computer than the pure stochastic model. - Highlights: • A novel hybrid stochastic/deterministic reaction–diffusion simulation method is given. • Can massively speed up stochastic simulations while preserving stochastic effects. • Can handle multiple reacting species. • Can handle moving boundaries.
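    The single-site interface coupling can be sketched for pure diffusion in one dimension. The scheme below is an illustrative simplification inspired by the abstract, not the authors' algorithm: integer particle counts on the left sub-domain hop stochastically, the right sub-domain follows an explicit finite-difference PDE, and only realised integer amounts cross the interface, so total mass is conserved exactly.

```python
import numpy as np

def hybrid_step(n, u, p, rng):
    """One hybrid step: `n` integer counts (stochastic sites, left),
    `u` continuous concentrations (deterministic sites, right),
    `p` = D*dt/h**2, the per-neighbour hop probability (p <= 0.25)."""
    K = len(n)
    left = np.zeros(K, dtype=np.int64)
    right = np.zeros(K, dtype=np.int64)
    for i in range(K):                      # each particle hops L/R/stays
        l, r, _ = rng.multinomial(n[i], [p, p, 1.0 - 2.0 * p])
        left[i], right[i] = l, r
    new_n = n - left - right
    new_n[:-1] += left[1:]                  # arrivals from right neighbours
    new_n[1:] += right[:-1]                 # arrivals from left neighbours
    new_n[0] += left[0]                     # reflecting wall, far left
    lap = np.empty_like(u)
    lap[1:-1] = u[:-2] - 2.0 * u[1:-1] + u[2:]
    lap[0] = u[1] - u[0]                    # interface flux handled below
    lap[-1] = u[-2] - u[-1]                 # reflecting wall, far right
    new_u = u + p * lap
    new_u[0] += right[-1]                   # realised hops cross the interface
    a = p * u[0]                            # expected PDE -> stochastic flux
    moved = int(a) + int(rng.random() < a - int(a))  # stochastic rounding
    new_u[0] -= moved
    new_n[-1] += moved
    return new_n, new_u

rng = np.random.default_rng(0)
n = np.full(20, 50, dtype=np.int64)         # stochastic half: 1000 particles
u = np.full(20, 50.0)                       # deterministic half
total0 = n.sum() + u.sum()
for _ in range(200):
    n, u = hybrid_step(n, u, 0.2, rng)
```

    Because the deterministic side gives up only realised integer amounts (via stochastic rounding of the expected flux), the sum of integer counts and continuous mass is conserved to floating-point precision, mirroring the particle-conservation property highlighted in the abstract.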

  20. Genetic Variation in the Nuclear and Organellar Genomes Modulates Stochastic Variation in the Metabolome, Growth, and Defense

    PubMed Central

    Joseph, Bindu; Corwin, Jason A.; Kliebenstein, Daniel J.

    2015-01-01

    Recent studies are starting to show that genetic control over stochastic variation is a key evolutionary solution for single-celled organisms facing unpredictable environments. This has been expanded to show that genetic variation can alter stochastic variation in transcriptional processes within multi-cellular eukaryotes. However, little is known about how genetic diversity can control stochastic variation within more non-cell autonomous phenotypes. Using an Arabidopsis reciprocal RIL population, we showed that there is significant genetic diversity influencing stochastic variation in the plant metabolome, defense chemistry, and growth. This genetic diversity included loci specific for the stochastic variation of each phenotypic class that did not affect the other phenotypic classes or the average phenotype. This suggests that the organism's networks are established so that noise can exist in one phenotypic level like metabolism and not permeate up or down to different phenotypic levels. Further, the genomic variation within the plastid and mitochondria also had significant effects on the stochastic variation of all phenotypic classes. The genetic influence over stochastic variation within the metabolome was highly metabolite specific, with neighboring metabolites in the same metabolic pathway frequently showing different levels of noise. As expected from bet-hedging theory, there was more genetic diversity and a wider range of stochastic variation for defense chemistry than found for primary metabolism. Thus, it is possible to begin dissecting the stochastic variation of whole organismal phenotypes in multi-cellular organisms. Further, there are loci that modulate stochastic variation at different phenotypic levels. Finding the identity of these genes will be key to developing complete models linking genotype to phenotype. PMID:25569687

  1. The influence of stochastic perturbation of geotechnical media on electromagnetic tomography

    NASA Astrophysics Data System (ADS)

    Song, Lei; Yang, Weihao; Huangsonglei, Jiahui; Li, HaiPeng

    2015-04-01

    Electromagnetic tomography (CT) is commonly used in civil engineering to detect structural defects or geological anomalies. CT is generally regarded as a high-precision geophysical method, with expected accuracies of several centimetres or even several millimetres, so high-frequency antennas with short wavelengths are commonly employed. In geotechnical media, stochastic perturbations of the EM parameters inevitably exist at geological, structural and local scales. In such cases, the geometric dimensions of the target body, the EM wavelength and the expected accuracy may all be of the same order. When a high-frequency EM wave propagates in a stochastic geotechnical medium, the GPR signal is reflected not only by the target bodies but also by the stochastic perturbations of the background medium. To detect karst caves in dissolution-fractured rock, one must assess the influence of stochastically distributed dissolution holes and fractures; to detect a void in a concrete structure, one must account for the influence of stochastically distributed stones, and so on. In this paper, on the basis of discrete realizations of stochastic media, the authors quantitatively evaluate the influence of stochastic perturbations of geotechnical media on Radon/inverse Radon transforms through fully combined Monte Carlo numerical simulation. The stochastic noise is found to depend on the transfer angle, perturbation strength, angle interval and autocorrelation length, among other factors. A quantitative formula for the accuracy of electromagnetic tomography is also established, which can aid precision estimation of GPR tomography in stochastically perturbed geotechnical media. Key words: stochastic geotechnical media; electromagnetic tomography; Radon/inverse Radon transform.
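    One common way to build the discrete realizations of stochastic media that such Monte Carlo studies start from is spectral (FFT) filtering of white noise. The sketch below assumes a Gaussian autocorrelation model; the correlation length, perturbation strength and interpretation as a relative-permittivity perturbation are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gaussian_random_medium(n=64, corr_len=6.0, sigma=0.05, rng=None):
    """2-D stochastic medium: unit background plus a correlated Gaussian
    perturbation field with (approximately) Gaussian autocorrelation."""
    rng = rng or np.random.default_rng(0)
    k = np.fft.fftfreq(n)
    kx, ky = np.meshgrid(k, k)
    # Gaussian autocorrelation in space <=> Gaussian power spectrum in k
    spectrum = np.exp(-(kx**2 + ky**2) * (np.pi * corr_len) ** 2)
    noise = np.fft.fft2(rng.standard_normal((n, n)))
    field = np.real(np.fft.ifft2(noise * np.sqrt(spectrum)))
    field *= sigma / field.std()          # set perturbation strength exactly
    return 1.0 + field                    # e.g. relative EM parameter map

medium = gaussian_random_medium()
```

    Projecting many such realizations (e.g. with a Radon transform) and reconstructing them is the Monte Carlo experiment the abstract describes: the scatter of the reconstructions quantifies the noise induced by the background perturbation.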

  2. Genetic variation in the nuclear and organellar genomes modulates stochastic variation in the metabolome, growth, and defense.

    PubMed

    Joseph, Bindu; Corwin, Jason A; Kliebenstein, Daniel J

    2015-01-01

    Recent studies are starting to show that genetic control over stochastic variation is a key evolutionary solution for single-celled organisms facing unpredictable environments. This has been expanded to show that genetic variation can alter stochastic variation in transcriptional processes within multi-cellular eukaryotes. However, little is known about how genetic diversity can control stochastic variation within more non-cell autonomous phenotypes. Using an Arabidopsis reciprocal RIL population, we showed that there is significant genetic diversity influencing stochastic variation in the plant metabolome, defense chemistry, and growth. This genetic diversity included loci specific for the stochastic variation of each phenotypic class that did not affect the other phenotypic classes or the average phenotype. This suggests that the organism's networks are established so that noise can exist in one phenotypic level like metabolism and not permeate up or down to different phenotypic levels. Further, the genomic variation within the plastid and mitochondria also had significant effects on the stochastic variation of all phenotypic classes. The genetic influence over stochastic variation within the metabolome was highly metabolite specific, with neighboring metabolites in the same metabolic pathway frequently showing different levels of noise. As expected from bet-hedging theory, there was more genetic diversity and a wider range of stochastic variation for defense chemistry than found for primary metabolism. Thus, it is possible to begin dissecting the stochastic variation of whole organismal phenotypes in multi-cellular organisms. Further, there are loci that modulate stochastic variation at different phenotypic levels. Finding the identity of these genes will be key to developing complete models linking genotype to phenotype.

  3. Stochastic description of quantum Brownian dynamics

    NASA Astrophysics Data System (ADS)

    Yan, Yun-An; Shao, Jiushu

    2016-08-01

    Classical Brownian motion has been well investigated since the pioneering work of Einstein, which inspired mathematicians to lay the theoretical foundation of stochastic processes. A stochastic formulation for the quantum dynamics of dissipative systems described by the system-plus-bath model has been developed and has found many applications in chemical dynamics, spectroscopy, quantum transport, and other fields. This article provides a tutorial review of the stochastic formulation for quantum dissipative dynamics. The key idea is to decouple the interaction between the system and the bath by virtue of the Hubbard-Stratonovich transformation or Itô calculus, so that the system and the bath are not directly entangled during evolution; rather, they are correlated through the complex white noises introduced. The influence of the bath on the system is thereby defined by an induced stochastic field, which leads to the stochastic Liouville equation for the system. The exact reduced density matrix can be calculated as the stochastic average in the presence of bath-induced fields. In general, the plain implementation of the stochastic formulation is only useful for short-time dynamics, and not efficient for long-time dynamics, as the statistical errors grow rapidly. For linear and other specific systems, the stochastic Liouville equation is a good starting point for deriving the master equation. For general systems with decomposable bath-induced processes, a hierarchical approach in the form of a set of deterministic equations of motion is derived from the stochastic formulation and provides an effective means of simulating the dissipative dynamics. A combination of the stochastic simulation and the hierarchical approach is suggested to solve the zero-temperature dynamics of the spin-boson model. This scheme correctly describes the coherent-incoherent transition (Toulouse limit) at moderate dissipation and predicts rate dynamics in the overdamped regime. Challenging problems, such as the dynamical description of quantum phase transition (localization) and the numerical stability of the trace-conserving, nonlinear stochastic Liouville equation, are outlined.
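    The "reduced density matrix as a stochastic average over bath-induced fields" idea is exactly solvable in the pure-dephasing limit: for Gaussian white noise with correlation 2*D*delta(t-t'), the off-diagonal element is E[exp(i * integral of xi over [0, t])] = exp(-D*t). The sketch below (illustrative parameters, not from the article) checks this by brute-force trajectory averaging, which also exhibits the statistical error that limits long-time use of the plain stochastic formulation.

```python
import numpy as np

rng = np.random.default_rng(42)
D, dt, n_steps, n_traj = 1.0, 0.01, 100, 20_000   # final time t = 1.0

# accumulate the random phase for each trajectory (Euler discretisation)
dphi = np.sqrt(2.0 * D * dt) * rng.standard_normal((n_traj, n_steps))
phase = np.cumsum(dphi, axis=1)

# stochastic average over trajectories vs. the exact answer exp(-D*t)
coherence = np.exp(1j * phase).mean(axis=0)
t = dt * np.arange(1, n_steps + 1)
exact = np.exp(-D * t)
max_err = np.max(np.abs(coherence - exact))
```

    The absolute sampling error stays roughly constant in time while the signal decays as exp(-D*t), so the relative error grows exponentially; this is the fast growth of statistical error that the review cites as the motivation for the hierarchical approach.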

  4. Towards resiliency with micro-grids: Portfolio optimization and investment under uncertainty

    NASA Astrophysics Data System (ADS)

    Gharieh, Kaveh

    Energy security and a sustained supply of power are critical for community welfare and economic growth. In the face of the increased frequency and intensity of extreme weather events that can cause power-grid outages, the value of micro-grids in improving communities' power reliability and resiliency is becoming more important. The capability of micro-grids to operate in islanded mode under stressed conditions dramatically decreases the economic losses of critical infrastructure during power shortages. More widespread participation of micro-grids in the wholesale energy market in the near future makes the development of new investment models necessary; however, short- and long-term market and price risks, along with the impacts of risk factors, must be taken into consideration in developing them. This work proposes a set of models and tools to address different problems associated with micro-grid assets, including optimal portfolio selection, investment and financing, at both the community level and for a sample critical infrastructure (a wastewater treatment plant). The models account for short-term operational volatilities and long-term market uncertainties. A number of analytical methodologies and financial concepts have been adopted to develop these models, as follows. (1) Capital budgeting and portfolio optimization models with Monte Carlo stochastic scenario generation are applied to derive the optimal investment decision for a portfolio of micro-grid assets, considering risk factors and multiple sources of uncertainty. (2) Real option theory, Monte Carlo simulation and stochastic optimization techniques are applied to obtain optimal modularized investment decisions for hydrogen tri-generation systems in wastewater treatment facilities, considering multiple sources of uncertainty. (3) The public-private partnership (PPP) financing concept, coupled with an investment-horizon approach, is applied to estimate the public and private parties' revenue shares from a community-level micro-grid project over the assets' lifetime, considering their optimal operation under uncertainty.
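    The real-option-plus-Monte-Carlo ingredient can be sketched in a few lines: the value of the option to defer an irreversible investment, with the project value following geometric Brownian motion. All numbers below are made up for illustration; the thesis's actual models are far richer.

```python
import numpy as np

def deferral_option_value(v0=100.0, invest=95.0, r=0.03, sigma=0.25,
                          T=2.0, n_paths=100_000, seed=0):
    """Monte Carlo value of the option to defer an investment of `invest`
    in a project worth `v0` today, exercised (if profitable) at time T.
    Project value follows geometric Brownian motion under the rate r."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    vT = v0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    payoff = np.maximum(vT - invest, 0.0)   # invest only if it pays off
    return np.exp(-r * T) * payoff.mean()

opt_val = deferral_option_value()
```

    The option value exceeds the static NPV (here 100 - 95 = 5) precisely because uncertainty plus the freedom to wait has value, which is the rationale for the modularized investment decisions in item (2).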

  5. Existence of time-periodic weak solutions to the stochastic Navier-Stokes equations around a moving body

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Feng, E-mail: chenfengmath@163.com, E-mail: hanyc@jlu.edu.cn; Han, Yuecai, E-mail: chenfengmath@163.com, E-mail: hanyc@jlu.edu.cn

    2013-12-15

    The existence of time-periodic stochastic motions of an incompressible fluid is obtained. Here the fluid is subject to a time-periodic body force and an additional time-periodic stochastic force produced by a rigid body that moves periodically and stochastically, with the same period, in the fluid.

  6. Stochastic Estimation via Polynomial Chaos

    DTIC Science & Technology

    2015-10-01

    AFRL-RW-EG-TR-2015-108, Stochastic Estimation via Polynomial Chaos, by Douglas V. Nance, Air Force Research Laboratory; report period 20-04-2015 to 07-08-2015. This expository report discusses fundamental aspects of the polynomial chaos method for representing the properties of second-order stochastic processes.
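    The polynomial chaos method the report surveys can be illustrated in one dimension: expand a function of a standard normal variable in probabilists' Hermite polynomials He_k, with coefficients c_k = E[f(X) He_k(X)] / k! obtained by Gauss-Hermite quadrature. This is a generic sketch, not the report's implementation.

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial, sqrt, pi

def pce_coeffs(f, order, quad=40):
    """Hermite polynomial chaos coefficients of f(X), X ~ N(0, 1)."""
    x, w = He.hermegauss(quad)       # weight exp(-x^2/2); sum(w) = sqrt(2*pi)
    w = w / sqrt(2.0 * pi)           # normalise to the standard normal pdf
    fx = f(x)
    return np.array([np.dot(w, fx * He.hermeval(x, np.eye(order + 1)[k]))
                     / factorial(k) for k in range(order + 1)])

c = pce_coeffs(lambda x: x**2, 4)
mean = c[0]                                   # E[f(X)] is the 0th coefficient
var = sum(factorial(k) * c[k]**2 for k in range(1, 5))
```

    For f(x) = x**2 the expansion terminates exactly: c_0 = 1 and c_2 = 1 (since He_2 = x**2 - 1), all other coefficients vanish, giving mean 1 and variance 2! * 1**2 = 2, which are the exact moments of X**2 for a standard normal X.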

  7. Dynamics of a stochastic tuberculosis model with constant recruitment and varying total population size

    NASA Astrophysics Data System (ADS)

    Liu, Qun; Jiang, Daqing; Shi, Ningzhong; Hayat, Tasawar; Alsaedi, Ahmed

    2017-03-01

    In this paper, we develop a mathematical model of tuberculosis with constant recruitment and varying total population size by incorporating stochastic perturbations. By constructing suitable stochastic Lyapunov functions, we establish sufficient conditions for the existence of an ergodic stationary distribution of the stochastic system, as well as for extinction of the disease.
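    The kind of stochastically perturbed compartment model studied here can be sketched with Euler-Maruyama. The SIS-type system below, with constant recruitment Lam and multiplicative white-noise perturbations on each compartment, is an illustrative stand-in: the specific model form, parameter values and noise intensities are assumptions, not the paper's exact system.

```python
import numpy as np

rng = np.random.default_rng(7)
Lam, beta, mu, gamma = 1.0, 0.05, 0.1, 0.2   # recruitment, contact, death, recovery
s1, s2 = 0.05, 0.05                          # noise intensities (illustrative)
dt, n_steps = 0.01, 20_000                   # final time t = 200

S, I = 9.0, 1.0                              # susceptible / infected
I_path = np.empty(n_steps)
for k in range(n_steps):
    dB1, dB2 = np.sqrt(dt) * rng.standard_normal(2)
    dS = (Lam - beta * S * I - mu * S + gamma * I) * dt + s1 * S * dB1
    dI = (beta * S * I - (mu + gamma) * I) * dt + s2 * I * dB2
    S = max(S + dS, 0.0)    # Euler-Maruyama can overshoot; clip at zero
    I = max(I + dI, 0.0)
    I_path[k] = I
```

    With these parameters the deterministic skeleton has an endemic equilibrium near I = 4, so the small-noise path fluctuates about it, the persistence regime; increasing the noise intensities pushes the system toward the extinction regime characterised in the paper.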

  8. A stochastic maximum principle for backward control systems with random default time

    NASA Astrophysics Data System (ADS)

    Shen, Yang; Kuen Siu, Tak

    2013-05-01

    This paper establishes a necessary and sufficient stochastic maximum principle for backward systems, where the state processes are governed by jump-diffusion backward stochastic differential equations with random default time. An application of the sufficient stochastic maximum principle to an optimal investment and capital injection problem in the presence of default risk is discussed.

  9. Stochastic Swift-Hohenberg Equation with Degenerate Linear Multiplicative Noise

    NASA Astrophysics Data System (ADS)

    Hernández, Marco; Ong, Kiah Wah

    2018-03-01

    We study the dynamic transition of the Swift-Hohenberg equation (SHE) when linear multiplicative noise acting on a finite set of modes of the dominant linear flow is introduced. Existence of a stochastic flow and a local stochastic invariant manifold for this stochastic form of SHE are both addressed in this work. We show that the approximate reduced system corresponding to the invariant manifold undergoes a stochastic pitchfork bifurcation, and obtain numerical evidence suggesting that this picture is a good approximation for the full system as well.
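    The stochastic pitchfork picture on the reduced system can be sketched with the normal form dx = (lam*x - x**3) dt + sigma*x dW, i.e. linear multiplicative noise, simulated in the Itô sense with Euler-Maruyama. Parameters and the normal-form reduction itself are illustrative assumptions, not the paper's reduced equations.

```python
import numpy as np

def pitchfork_path(lam, sigma=0.2, x0=0.5, dt=1e-3, n_steps=20_000, seed=3):
    """Euler-Maruyama for dx = (lam*x - x**3) dt + sigma*x dW (Ito)."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps + 1)
    x[0] = x0
    dW = np.sqrt(dt) * rng.standard_normal(n_steps)
    for k in range(n_steps):
        x[k + 1] = x[k] + (lam * x[k] - x[k] ** 3) * dt + sigma * x[k] * dW[k]
    return x

sub = pitchfork_path(-0.5)   # below threshold: collapse to the origin
sup = pitchfork_path(+0.5)   # above threshold: fluctuation near +/- sqrt(lam)
```

    Below the threshold the trajectory decays to zero; above it (more precisely, when lam - sigma**2/2 > 0 for this Itô noise) it settles into random fluctuation around the deterministic branch, the qualitative signature of the stochastic pitchfork bifurcation.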

  10. The Two-On-One Stochastic Duel

    DTIC Science & Technology

    1983-12-01

    ACN 67500, TRASANA-TR-43-83, The Two-On-One Stochastic Duel, prepared by A.V. Gafarian and C.J. Ancker, Jr., December 1983; final report. Keywords: stochastic duels, stochastic processes, attrition.

  11. The Sharma-Parthasarathy stochastic two-body problem

    NASA Astrophysics Data System (ADS)

    Cresson, J.; Pierret, F.; Puig, B.

    2015-03-01

    We study the Sharma-Parthasarathy stochastic two-body problem introduced by Sharma and Parthasarathy in ["Dynamics of a stochastically perturbed two-body problem," Proc. R. Soc. A 463, 979-1003 (2007)]. In particular, we focus on the preservation of some fundamental features of the classical two-body problem, such as the Hamiltonian structure and first integrals, in the stochastic case. Numerical simulations are performed which illustrate the dynamical behaviour of osculating elements such as the semi-major axis, the eccentricity, and the pericenter. We also derive a stochastic version of Gauss's equations in the planar case.

  12. Memristor-based neural networks: Synaptic versus neuronal stochasticity

    NASA Astrophysics Data System (ADS)

    Naous, Rawan; AlShedivat, Maruan; Neftci, Emre; Cauwenberghs, Gert; Salama, Khaled Nabil

    2016-11-01

    In neuromorphic circuits, stochasticity in the cortex can be mapped onto the synaptic or neuronal components. The hardware emulation of these stochastic neural networks is currently being extensively studied using resistive memories, or memristors. The ionic process underlying the switching behaviour of memristive elements is considered the main source of stochasticity in their operation. Building on this inherent variability, the memristor is incorporated into abstract models of stochastic neurons and synapses, and two approaches to stochastic neural networks are investigated. Beyond size and area, the main points of comparison are the impact of the two approaches on system performance, in terms of accuracy, recognition rates, and learning, and where the memristor best fits.

  13. Gene-gene interactions among genetic variants from obesity candidate genes for nonobese and obese populations in type 2 diabetes.

    PubMed

    Lin, Eugene; Pei, Dee; Huang, Yi-Jen; Hsieh, Chang-Hsun; Wu, Lawrence Shih-Hsin

    2009-08-01

    Recent studies indicate that obesity may play a key role in modulating genetic predispositions to type 2 diabetes (T2D). This study examines the main effects of both single-locus and multilocus interactions among genetic variants in Taiwanese obese and nonobese individuals to test the hypothesis that obesity-related genes may contribute to the etiology of T2D independently and/or through such complex interactions. We genotyped 11 single nucleotide polymorphisms for 10 obesity candidate genes including adrenergic beta-2-receptor surface, adrenergic beta-3-receptor surface, angiotensinogen, fat mass and obesity associated gene, guanine nucleotide binding protein beta polypeptide 3 (GNB3), interleukin 6 receptor, proprotein convertase subtilisin/kexin type 1 (PCSK1), uncoupling protein 1, uncoupling protein 2, and uncoupling protein 3. There were 389 patients diagnosed with T2D and 186 age- and sex-matched controls. Single-locus analyses showed significant main effects of the GNB3 and PCSK1 genes on the risk of T2D among the nonobese group (p = 0.002 and 0.047, respectively). Further, interactions involving GNB3 and PCSK1 were suggested among the nonobese population using the generalized multifactor dimensionality reduction method (p = 0.001). In addition, interactions among angiotensinogen, fat mass and obesity associated gene, GNB3, and uncoupling protein 3 genes were found in a significant four-locus generalized multifactor dimensionality reduction model among the obese population (p = 0.001). The results suggest that the single nucleotide polymorphisms from the obesity candidate genes may contribute to the risk of T2D independently and/or in an interactive manner according to the presence or absence of obesity.

  14. [Clinical characteristics and prognostic factors of pulmonary tuberculosis with concurrent lung cancer].

    PubMed

    Gu, Yingchun; Song, Yelin; Liu, Yufeng

    2014-09-30

    To explore the clinical characteristics and prognostic factors of pulmonary tuberculosis with concurrent lung cancer, comprehensive analyses were conducted for 58 pulmonary tuberculosis patients with lung cancer. Their clinical symptoms, signs and imaging results were analyzed between January 1998 and January 2005 at Qingdao Chest Hospital. The Kaplan-Meier method was used to calculate survival rates. Nine prognostic characteristics were analyzed; single-factor analysis was performed with the log-rank test and multi-factor analysis with a Cox regression model. The initial symptoms were cough, chest tightness, fever and hemoptysis. Chest radiology showed coexistence of the two diseases in the same lobe in 36 cases and in different lobes in 22 cases, with pulmonary nodules (n = 24), cavities (n = 19), infiltration (n = 8) and atelectasis (n = 7). Pathologically, there were squamous carcinoma (n = 33), adenocarcinoma (n = 17), small cell carcinoma (n = 4) and unidentified (n = 4). The TNM stages were I (n = 13), II (n = 22), III (n = 16) and IV (n = 7). The median survival period was 24 months, and the 1-, 3- and 5-year survival rates were 65.5%, 65.5% and 29.0% respectively. Single-factor analysis showed that lung cancer TNM staging (P = 0.000) and tuberculosis activity (P = 0.024) were significantly associated with patient prognosis, and multi-factor analysis showed that lung cancer TNM staging (RR = 2.629, 95%CI: 1.759-3.928, P = 0.000) and tuberculosis activity (RR = 1.885, 95%CI: 1.023-3.471, P = 0.042) were relatively independent prognostic factors. The clinical and radiological characteristics contribute jointly to early diagnosis and therapy of tuberculosis with concurrent lung cancer, and TNM staging of lung cancer and activity of tuberculosis are major prognostic factors.
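    The Kaplan-Meier product-limit estimator used above is simple to state: at each distinct event time, survival is multiplied by (1 - deaths / number at risk), with censored subjects leaving the risk set without contributing a factor. A minimal sketch (the data in the example are made up, not the study's):

```python
import numpy as np

def kaplan_meier(times, events):
    """Product-limit survival estimate.  events: 1 = death, 0 = censored."""
    t = np.asarray(times, dtype=float)
    e = np.asarray(events, dtype=int)
    s, event_times, surv = 1.0, [], []
    for ti in np.unique(t[e == 1]):              # distinct event times only
        d = int(np.sum((t == ti) & (e == 1)))    # deaths at ti
        n_risk = int(np.sum(t >= ti))            # at risk just before ti
        s *= 1.0 - d / n_risk
        event_times.append(float(ti))
        surv.append(s)
    return event_times, surv

# four subjects, all dying at distinct times, no censoring
km_times, km_surv = kaplan_meier([1, 2, 3, 4], [1, 1, 1, 1])
```

    With no censoring the estimate simply steps down by 1/n at each death: 0.75, 0.5, 0.25, 0. Censoring makes the steps unequal, which is why the product-limit form is needed.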

  15. Interactions among genetic variants in apoptosis pathway genes, reflux symptoms, body mass index, and smoking indicate two distinct etiologic patterns of esophageal adenocarcinoma.

    PubMed

    Zhai, Rihong; Chen, Feng; Liu, Geoffrey; Su, Li; Kulke, Matthew H; Asomaning, Kofi; Lin, Xihong; Heist, Rebecca S; Nishioka, Norman S; Sheu, Chau-Chyun; Wain, John C; Christiani, David C

    2010-05-10

    Apoptosis pathway, gastroesophageal reflux symptoms (reflux), higher body mass index (BMI), and tobacco smoking have been individually associated with esophageal adenocarcinoma (EA) development. However, how multiple factors jointly affect EA risk remains unclear. In total, 305 patients with EA and 339 age- and sex-matched controls were studied. High-order interactions among reflux, BMI, smoking, and functional polymorphisms in five apoptotic genes (FAS, FASL, IL1B, TP53BP, and BAT3) were investigated by entropy-based multifactor dimensionality reduction (MDR), classification and regression tree (CART), and traditional logistic regression (LR) models. In LR analysis, reflux, BMI, and smoking were significantly associated with EA risk, with reflux as the strongest individual factor. No individual single nucleotide polymorphism was associated with EA susceptibility. However, there was a two-way interaction between IL1B + 3954C>T and reflux (P = .008). In both CART and MDR analyses, reflux was also the strongest individual factor for EA risk. In individuals with reflux symptoms, CART analysis indicated that strongest interaction was among variant genotypes of IL1B + 3954C>T and BAT3S625P, higher BMI, and smoking (odds ratio [OR], 5.76; 95% CI, 2.48 to 13.38), a finding independently found using MDR analysis. In contrast, for participants without reflux symptoms, the strongest interaction was found between higher BMI and smoking (OR, 3.27; 95% CI, 1.88 to 5.68), also echoed by entropy-based MDR analysis. Although a history of reflux is an important risk for EA, multifactor interactions also play important roles in EA risk. Gene-environment interaction patterns differ between patients with and without reflux symptoms.

  16. A Computationally Efficient Hypothesis Testing Method for Epistasis Analysis using Multifactor Dimensionality Reduction

    PubMed Central

    Pattin, Kristine A.; White, Bill C.; Barney, Nate; Gui, Jiang; Nelson, Heather H.; Kelsey, Karl R.; Andrew, Angeline S.; Karagas, Margaret R.; Moore, Jason H.

    2008-01-01

    Multifactor dimensionality reduction (MDR) was developed as a nonparametric and model-free data mining method for detecting, characterizing, and interpreting epistasis in the absence of significant main effects in genetic and epidemiologic studies of complex traits such as disease susceptibility. The goal of MDR is to change the representation of the data using a constructive induction algorithm to make nonadditive interactions easier to detect using any classification method such as naïve Bayes or logistic regression. Traditionally, MDR constructed variables have been evaluated with a naïve Bayes classifier that is combined with 10-fold cross validation to obtain an estimate of predictive accuracy or generalizability of epistasis models. Traditionally, we have used permutation testing to statistically evaluate the significance of models obtained through MDR. The advantage of permutation testing is that it controls for false-positives due to multiple testing. The disadvantage is that permutation testing is computationally expensive. This is in an important issue that arises in the context of detecting epistasis on a genome-wide scale. The goal of the present study was to develop and evaluate several alternatives to large-scale permutation testing for assessing the statistical significance of MDR models. Using data simulated from 70 different epistasis models, we compared the power and type I error rate of MDR using a 1000-fold permutation test with hypothesis testing using an extreme value distribution (EVD). We find that this new hypothesis testing method provides a reasonable alternative to the computationally expensive 1000-fold permutation test and is 50 times faster. We then demonstrate this new method by applying it to a genetic epidemiology study of bladder cancer susceptibility that was previously analyzed using MDR and assessed using a 1000-fold permutation test. PMID:18671250
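    The permutation-testing baseline that the EVD method is compared against is easy to sketch: shuffle the case/control labels many times, recompute the model's accuracy on each shuffle, and report the fraction of shuffles that do at least as well as the observed accuracy. The stand-in "classifier" and all parameters below are illustrative, not the MDR procedure itself.

```python
import numpy as np

rng = np.random.default_rng(1)

def permutation_p(pred, y, n_perm=1000, rng=rng):
    """One-sided permutation p-value for classification accuracy."""
    obs = np.mean(pred == y)
    null = np.array([np.mean(pred == rng.permutation(y))
                     for _ in range(n_perm)])
    # add-one smoothing keeps the p-value away from an impossible zero
    return (1 + np.sum(null >= obs)) / (1 + n_perm)

n = 200
x = rng.standard_normal(n)
pred = (x > 0).astype(int)             # stand-in for a fitted MDR model
y_info = pred.copy()
flip = rng.random(n) < 0.10            # informative labels, 10% noise
y_info[flip] = 1 - y_info[flip]
y_rand = rng.integers(0, 2, n)         # labels unrelated to the predictor

p_info = permutation_p(pred, y_info)
p_rand = permutation_p(pred, y_rand)
```

    Each p-value costs n_perm re-evaluations of the model; at genome-wide scale that multiplies into the computational burden the paper targets, which is why a fitted extreme value distribution for the null maximum accuracy is attractive.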

  17. Risk factors associated with sporadic salmonellosis in adults: a case-control study.

    PubMed

    Ziehm, D; Dreesman, J; Campe, A; Kreienbrock, L; Pulz, M

    2013-02-01

    In order to identify and assess recent risk factors for sporadic human infections with Salmonella enterica, we conducted a case-control study in Lower Saxony, Germany. The data collection was based on standardized telephone interviews with 1017 cases and 346 controls aged >14 years. Odds ratios were calculated in single-factor and multi-factor analyses for Salmonella cases and two different control groups, i.e. population controls and controls with rotavirus infection. Multi-factor analysis revealed associations between sporadic Salmonella infections and two exposures for both sets of controls: consumption of raw ground pork [adjusted odds ratio (aOR) 2·38, 95% confidence interval (CI) 1·27-4·44] and foreign travel (aOR 2·12, 95% CI 1·00-4·52). Other exposures included consumption of food items containing eggs (aOR 1·43, 95% CI 0·80-2·54), consumption of chicken meat (aOR 1·77, 95% CI 1·26-2·50), outdoor meals/barbecues (aOR 3·96, 95% CI 1·41-11·12) and taking gastric acidity inhibitors (aOR 2·42, 95% CI 1·19-4·92); each of these was significantly associated with infection with respect to one of the two control groups. The impact of consuming food items containing eggs or chicken meat was lower than expected from the literature. This might be a consequence of Salmonella control programmes as well as increased public awareness of eggs and chicken products being a risk factor for salmonellosis. Efforts to reduce Salmonella infections due to raw pork products should be intensified.
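    The adjusted odds ratios above come from multi-factor logistic models; the crude (unadjusted) counterpart of such an estimate from a 2x2 exposure table, with its Wald 95% confidence interval, is computed as below. The counts in the example are invented for illustration and are not the study's data.

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI for a 2x2 table:
    a = exposed cases,    b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, lo, hi

# hypothetical counts for an exposure such as raw ground pork
or_, lo, hi = odds_ratio_ci(60, 957, 10, 336)
```

    A CI whose lower bound exceeds 1 indicates a significant positive association in the crude analysis; the multi-factor (adjusted) odds ratios in the study additionally control for the other exposures.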

  18. The Multifactor Measure of Performance: Its Development, Norming, and Validation.

    PubMed

    Bar-On, Reuven

    2018-01-01

    This article describes the development and the initial norming and validation of the Multifactor Measure of Performance™ (MMP™), a psychometric instrument designed to study, assess and enhance key predictors of human performance in order to help individuals perform at a higher level. It was created by the author to go beyond existing conceptual and psychometric models, which often focus on relatively few factors purported to assess performance at school, in the workplace and elsewhere. The relative sparsity of multifactorial pre-employment assessment instruments exemplifies, for the author, one of the important reasons for developing the MMP™, which attempts to comprehensively evaluate a wider array of factors thought to contribute to performance. To address this need in test construction, the author sought to develop a multifactorial assessment and development instrument that could concomitantly evaluate a combination of physical, cognitive, intra-personal, inter-personal, and motivational factors that significantly contribute to performance. The specific aim of this article is to show why, how and whether this could be done, and to present and discuss the potential importance of the results obtained to date. The findings presented here will hopefully add to what is known about human performance, contribute to the professional literature, and support the continued development of the MMP™. The impetus for developing the MMP™ is first explained below, followed by a detailed description of the process involved and the findings obtained; their potential application is then discussed, together with the possible limitations of the present research and the need for future studies to address them.

  19. Severe chronic heart failure in patients considered for heart transplantation in Poland.

    PubMed

    Korewicki, Jerzy; Leszek, Przemysław; Zieliński, Tomasz; Rywik, Tomasz; Piotrowski, Walerian; Kurjata, Paweł; Kozar-Kamińska, Katarzyna; Kodziszewska, Katarzyna

    2012-01-01

    Based on the results of clinical trials, the prognosis for patients with severe heart failure (HF) has improved over the last 20 years. However, clinical trials do not reflect 'real life' due to patient selection. Thus, the aim of the POLKARD-HF registry was the analysis of survival of patients with refractory HF referred for orthotopic heart transplantation (OHT). Between 1 November 2003 and 31 October 2007, 983 patients with severe HF, referred for OHT in Poland, were included in the registry. All patients underwent routine clinical and hemodynamic evaluation, with NT-proBNP and hsCRP assessment. Death or an emergency OHT were taken as the endpoints. The average observation period was 601 days. Kaplan-Meier curves with log-rank tests, together with univariate and multifactor Cox regression models using the stepwise variable selection method, were used to determine the predictive value of the analyzed variables. Among the 983 patients, the probability of surviving for one year was approximately 80%, for two years 70%, and for three years 67%. Etiology of the HF did not significantly influence the prognosis. The patients in NYHA class IV had a three-fold higher risk of death or emergency OHT. The univariate/multifactor Cox regression analysis revealed that NYHA class IV (HR 2.578, p < 0.0001), HFSS score (HR 2.572, p < 0.0001) and NT-proBNP plasma level (HR 1.600, p = 0.0200) influenced survival free of death or emergency OHT. Despite optimal treatment, the prognosis for patients with refractory HF remains poor. NYHA class IV, NT-proBNP and HFSS score can help define the highest-risk group. The results are consistent with the prognosis of patients enrolled in the randomized trials.

  20. Surgeons' Leadership Styles and Team Behavior in the Operating Room.

    PubMed

    Hu, Yue-Yung; Parker, Sarah Henrickson; Lipsitz, Stuart R; Arriaga, Alexander F; Peyre, Sarah E; Corso, Katherine A; Roth, Emilie M; Yule, Steven J; Greenberg, Caprice C

    2016-01-01

    The importance of leadership is recognized in surgery, but the specific impact of leadership style on team behavior is not well understood. In other industries, leadership is a well-characterized construct. One dominant theory proposes that transactional (task-focused) leaders achieve minimum standards and transformational (team-oriented) leaders inspire performance beyond expectations. We videorecorded 5 surgeons performing complex operations. Each surgeon was scored on the Multifactor Leadership Questionnaire, a validated method for scoring transformational and transactional leadership style, by an organizational psychologist and a surgeon researcher. Independent coders assessed surgeons' leadership behaviors according to the Surgical Leadership Inventory and team behaviors (information sharing, cooperative, and voice behaviors). All coders were blinded. Leadership style (Multifactor Leadership Questionnaire) was correlated with surgeon behavior (Surgical Leadership Inventory) and team behavior using Poisson regression, controlling for time and the total number of behaviors, respectively. All surgeons scored similarly on transactional leadership (range 2.38 to 2.69), but varied more widely on transformational leadership (range 1.98 to 3.60). Each 1-point increase in transformational score corresponded to 3 times more information-sharing behaviors (p < 0.0001) and 5.4 times more voice behaviors (p = 0.0005) among the team. With each 1-point increase in transformational score, leaders displayed 10 times more supportive behaviors (p < 0.0001) and displayed poor behaviors 12.5 times less frequently (p < 0.0001). Excerpts of representative dialogue are included for illustration. We provide a framework for evaluating surgeons' leadership and its impact on team performance in the operating room. As in other fields, our data suggest that transformational leadership is associated with improved team behavior. 
Surgeon leadership development, therefore, has the potential to improve the efficiency and safety of operative care. Copyright © 2016 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  1. Enhancing leadership and relationships by implementing a peer mentoring program.

    PubMed

    Gafni Lachter, Liat R; Ruland, Judith P

    2018-03-30

    Peer-mentoring is often described as an effective means to promote professional and leadership skills, yet evidence on practical models of such programs for occupational therapy students is sparse. The purpose of this study was to evaluate the outcomes of a peer-mentoring program designed for graduate occupational therapy students. Forty-seven second-year student volunteers were randomly assigned to individually mentor first-year students in a year-long program. Students met biweekly, virtually or in person, to provide mentorship on everyday student issues according to mentees' needs. Faculty-led group activities prior to and during the peer-mentoring program took place to facilitate the mentorship relationships. Program effectiveness was measured using the Multi-factor Leadership Questionnaire (Avolio & Bass, MLQ: Multifactor Leadership Questionnaire, 2004) and an open-ended feedback survey. Results of a multivariate MANOVA for repeated measures indicated significant enhancement in several leadership skills (F(12,46) = 4.0, P = 0.001, η² = 0.579). Qualitative data from the feedback surveys indicated that an opportunity to help, forming relationships, and structure as an enabler were perceived as important participation outcomes. Students expressed high satisfaction and perceived value from their peer-mentoring experience. As we seek ways to promote our profession and the leadership of its members, student peer-mentoring is recommended as a way to empower students to practice and advance essential career skills from the initial stages of professional development. Evidence found in this study demonstrates that peer-mentoring programs can promote leadership development and the establishment of networks in an emerging occupational therapy professional community, at a low cost. The peer-mentoring blueprint and lessons learned are presented with the hope of inspiring others to implement peer-mentoring programs in their settings. © 2018 Occupational Therapy Australia.

  2. Multi-Factor Impact Analysis of Agricultural Production in Bangladesh with Climate Change

    NASA Technical Reports Server (NTRS)

    Ruane, Alex C.; Major, David C.; Yu, Winston H.; Alam, Mozaharul; Hussain, Sk. Ghulam; Khan, Abu Saleh; Hassan, Ahmadul; Al Hossain, Bhuiya Md. Tamim; Goldberg, Richard; Horton, Radley M.

    2012-01-01

    Diverse vulnerabilities of Bangladesh's agricultural sector in 16 sub-regions are assessed using experiments designed to investigate climate impact factors in isolation and in combination. Climate information from a suite of global climate models (GCMs) is used to drive models assessing the agricultural impact of changes in temperature, precipitation, carbon dioxide concentrations, river floods, and sea level rise for the 2040-2069 period in comparison to a historical baseline. Using the multi-factor impacts analysis framework developed in Yu et al. (2010), this study provides new sub-regional vulnerability analyses and quantifies key uncertainties in climate and production. Rice (aman, boro, and aus seasons) and wheat production are simulated in each sub-region using the biophysical Crop Environment REsource Synthesis (CERES) models. These simulations are then combined with the MIKE BASIN hydrologic model for river floods in the Ganges-Brahmaputra-Meghna (GBM) Basins, and the MIKE 21 Two-Dimensional Estuary Model to determine coastal inundation under conditions of higher mean sea level. The impacts of each factor depend on GCM configurations, emissions pathways, sub-regions, and particular seasons and crops. Temperature increases generally reduce production across all scenarios. Precipitation changes can have either a positive or a negative impact, with a high degree of uncertainty across GCMs. Carbon dioxide impacts on crop production are positive and depend on the emissions pathway. Increasing river flood areas reduce production in affected sub-regions. Precipitation uncertainties from different GCMs and emissions scenarios are reduced when integrated across the large GBM Basins' hydrology. Agriculture in Southern Bangladesh is severely affected by sea level rise even when cyclonic surges are not fully considered, with impacts increasing under the higher emissions scenario.

  3. The Multifactor Measure of Performance: Its Development, Norming, and Validation

    PubMed Central

    Bar-On, Reuven

    2018-01-01

    This article describes the development as well as the initial norming and validation of the Multifactor Measure of Performance™ (MMP™), a psychometric instrument designed to study, assess and enhance key predictors of human performance in order to help individuals perform at a higher level. It was created by the author to go beyond existing conceptual and psychometric models, which often focus on relatively few factors purported to assess performance at school, in the workplace and elsewhere. The relative sparsity of multifactorial pre-employment assessment instruments exemplifies, for the author, one of the important reasons for developing the MMP™, which attempts to comprehensively evaluate a wider array of factors thought to contribute to performance. Because this situation creates a need in the area of test construction, the author sought to develop a multifactorial assessment and development instrument that could concomitantly evaluate a combination of physical, cognitive, intra-personal, inter-personal, and motivational factors that significantly contribute to performance. The specific aim of this article is to show why, how and whether this could be done, as well as to present and discuss the potential importance of the results obtained to date. The findings presented here will hopefully add to what is known about human performance and thus contribute to the professional literature, in addition to contributing to the continued development of the MMP™. The impetus for developing the MMP™ is first explained below, followed by a detailed description of the process involved and the findings obtained; their potential application is then discussed, as well as the possible limitations of the present research and the need for future studies to address them. PMID:29515479

  4. Partial ASL extensions for stochastic programming.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gay, David

    2010-03-31

    Partially completed extensions for stochastic programming to the AMPL/solver interface library (ASL), intended for modeling and experimenting with stochastic recourse problems. This software is not primarily for military applications.

  5. Stochastic model simulation using Kronecker product analysis and Zassenhaus formula approximation.

    PubMed

    Caglar, Mehmet Umut; Pal, Ranadip

    2013-01-01

    Probabilistic models are regularly applied in Genetic Regulatory Network modeling to capture the stochastic behavior observed in the generation of biological entities such as mRNA or proteins. Several approaches including Stochastic Master Equations and Probabilistic Boolean Networks have been proposed to model the stochastic behavior in genetic regulatory networks. It is generally accepted that the Stochastic Master Equation is a fundamental model that can describe the system being investigated in fine detail, but the application of this model is computationally enormously expensive. On the other hand, the Probabilistic Boolean Network captures only the coarse-scale stochastic properties of the system without modeling the detailed interactions. We propose a new approximation of the stochastic master equation model that is able to capture the finer details of the modeled system including bistabilities and oscillatory behavior, and yet has a significantly lower computational complexity. In this new method, we represent the system using tensors and derive an identity to exploit the sparse connectivity of regulatory targets for complexity reduction. The algorithm involves an approximation based on the Zassenhaus formula to represent the exponential of a sum of matrices as a product of matrices. We derive upper bounds on the expected error of the proposed model distribution as compared to the stochastic master equation model distribution. Simulation results of the application of the model to four different biological benchmark systems illustrate performance comparable to detailed stochastic master equation models but with considerably lower computational complexity. The results also demonstrate the reduced complexity of the new approach as compared to the commonly used Stochastic Simulation Algorithm for equivalent accuracy.
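    The Zassenhaus factorization underlying the approximation expands exp(t(A+B)) into the ordered product exp(tA) exp(tB) exp(-t²[A,B]/2)···. The sketch below illustrates only this identity, not the paper's tensor algorithm; the 2×2 matrices are random toy inputs and the matrix exponential is a naive Taylor series adequate for small norms.

```python
import numpy as np

def expm(M, terms=30):
    # Taylor-series matrix exponential (fine for small-norm matrices)
    out, term = np.eye(len(M)), np.eye(len(M))
    for k in range(1, terms):
        term = term @ M / k
        out = out + term
    return out

rng = np.random.default_rng(0)
A, B = rng.normal(size=(2, 2)), rng.normal(size=(2, 2))
comm = A @ B - B @ A              # the commutator [A, B]
t = 0.1

exact = expm(t * (A + B))
naive = expm(t * A) @ expm(t * B)             # drops the commutator, O(t^2) error
zass = naive @ expm(-t**2 / 2 * comm)         # first Zassenhaus correction, O(t^3)

err_naive = np.linalg.norm(exact - naive)
err_zass = np.linalg.norm(exact - zass)
print(err_naive, err_zass)
```

    Including the commutator factor reduces the splitting error by roughly an order of t, which is what makes truncated Zassenhaus products useful as cheap approximations.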

  6. Inducing Tropical Cyclones to Undergo Brownian Motion

    NASA Astrophysics Data System (ADS)

    Hodyss, D.; McLay, J.; Moskaitis, J.; Serra, E.

    2014-12-01

    Stochastic parameterization has become commonplace in numerical weather prediction (NWP) models used for probabilistic prediction. Here, a specific stochastic parameterization will be related to the theory of stochastic differential equations and shown to be affected strongly by the choice of stochastic calculus. From an NWP perspective our focus will be on ameliorating a common trait of the ensemble distributions of tropical cyclone (TC) tracks (or position), namely that they generally contain a bias and an underestimate of the variance. With this trait in mind we present a stochastic track variance inflation parameterization. This parameterization makes use of a properly constructed stochastic advection term that follows a TC and induces its position to undergo Brownian motion. A central characteristic of Brownian motion is that its variance increases with time, which allows for an effective inflation of an ensemble's TC track variance. Using this stochastic parameterization we present a comparison of the behavior of TCs from the perspective of the stochastic calculi of Itô and Stratonovich within an operational NWP model. The central difference between these two perspectives as pertains to TCs is shown to be properly predicted by the stochastic calculus and the Itô correction. In the cases presented here these differences will manifest as overly intense TCs, which, depending on the strength of the forcing, could lead to problems with numerical stability and physical realism.
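    The property exploited by this parameterization, that the variance of Brownian motion grows with time, is easy to illustrate with an ensemble of discrete random walks (a schematic stand-in for TC track positions, not the NWP scheme itself; all parameters below are illustrative).

```python
import numpy as np

rng = np.random.default_rng(42)
n_members, n_steps, dt, sigma = 500, 100, 1.0, 1.0

# each ensemble member's "track position" performs a random walk:
# x_{k+1} = x_k + sigma * sqrt(dt) * xi,  with xi ~ N(0, 1)
increments = sigma * np.sqrt(dt) * rng.normal(size=(n_members, n_steps))
paths = np.cumsum(increments, axis=1)

var_early = paths[:, 9].var()    # ensemble variance after 10 steps
var_late = paths[:, 99].var()    # ensemble variance after 100 steps
print(var_early, var_late)
```

    Since Var[x(t)] = sigma²·t for Brownian motion, the late-time ensemble spread is roughly ten times the early one, which is the "track variance inflation" effect described above.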

  7. Tests of oceanic stochastic parameterisation in a seasonal forecast system.

    NASA Astrophysics Data System (ADS)

    Cooper, Fenwick; Andrejczuk, Miroslaw; Juricke, Stephan; Zanna, Laure; Palmer, Tim

    2015-04-01

    Over seasonal time scales, our aim is to compare the relative impact of ocean initial condition and model uncertainty upon the ocean forecast skill and reliability. Over seasonal timescales we compare four oceanic stochastic parameterisation schemes applied in a 1x1 degree ocean model (NEMO) with a fully coupled T159 atmosphere (ECMWF IFS). The relative impacts upon the ocean of the resulting eddy induced activity, wind forcing and typical initial condition perturbations are quantified. Following the historical success of stochastic parameterisation in the atmosphere, two of the parameterisations tested were multiplicative in nature: a stochastic variation of the Gent-McWilliams scheme and a stochastic diffusion scheme. We also consider a surface flux parameterisation (similar to that introduced by Williams, 2012), and stochastic perturbation of the equation of state (similar to that introduced by Brankart, 2013). The amplitude of the stochastic term in the Williams (2012) scheme was set to the physically reasonable amplitude considered in that paper. The amplitude of the stochastic term in each of the other schemes was increased to the limits of model stability. As expected, variability was increased. Up to 1 month after initialisation, ensemble spread induced by stochastic parameterisation is greater than that induced by the atmosphere, whilst being smaller than the initial condition perturbations currently used at ECMWF. After 1 month, the wind forcing becomes the dominant source of model ocean variability, even at depth.

  8. Teaching Tip: When a Matrix and Its Inverse Are Stochastic

    ERIC Educational Resources Information Center

    Ding, J.; Rhee, N. H.

    2013-01-01

    A stochastic matrix is a square matrix with nonnegative entries and row sums 1. The simplest example is a permutation matrix, whose rows permute the rows of an identity matrix. A permutation matrix and its inverse are both stochastic. We prove the converse, that is, if a matrix and its inverse are both stochastic, then it is a permutation matrix.
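    The teaching tip is easy to check numerically; a short sketch (the example matrices are ours, chosen for illustration):

```python
import numpy as np

def is_stochastic(M):
    # nonnegative entries and every row summing to 1
    return bool(np.all(M >= 0) and np.allclose(M.sum(axis=1), 1.0))

# a permutation matrix and its inverse (its transpose) are both stochastic
P = np.eye(4)[[2, 0, 3, 1]]            # permute the rows of the identity
print(is_stochastic(P), is_stochastic(np.linalg.inv(P)))

# a generic stochastic matrix illustrates the converse: its inverse picks up
# negative entries, so it is not stochastic
S = np.array([[0.5, 0.5], [0.25, 0.75]])
print(is_stochastic(S), is_stochastic(np.linalg.inv(S)))
```

    The second check is the content of the result: if a stochastic matrix is not a permutation matrix, its inverse cannot stay entrywise nonnegative.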

  9. Non-linear dynamic characteristics and optimal control of giant magnetostrictive film subjected to in-plane stochastic excitation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Z. W.; Zhang, W. D.

    2014-03-15

    The non-linear dynamic characteristics and optimal control of a giant magnetostrictive film (GMF) subjected to in-plane stochastic excitation were studied. Non-linear differential items were introduced to interpret the hysteretic phenomena of the GMF, and the non-linear dynamic model of the GMF subjected to in-plane stochastic excitation was developed. The stochastic stability was analysed, and the probability density function was obtained. The condition of stochastic Hopf bifurcation and noise-induced chaotic response were determined, and the fractal boundary of the system's safe basin was provided. The reliability function was solved from the backward Kolmogorov equation, and an optimal control strategy was proposed using the stochastic dynamic programming method. Numerical simulation shows that the system stability varies with the parameters, and stochastic Hopf bifurcation and chaos appear in the process; the area of the safe basin decreases when the noise intensifies, and the boundary of the safe basin becomes fractal; the system reliability is improved through stochastic optimal control. Finally, the theoretical and numerical results were verified by experiments. The results are helpful in the engineering applications of GMF.

  10. An efficient computational method for solving nonlinear stochastic Itô integral equations: Application for stochastic problems in physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heydari, M. H.; Hooshmandasl, M. R.

    Because of the nonlinearity, closed-form solutions of many important stochastic functional equations are virtually impossible to obtain. Thus, numerical solutions are a viable alternative. In this paper, a new computational method based on the generalized hat basis functions together with their stochastic operational matrix of Itô-integration is proposed for solving nonlinear stochastic Itô integral equations in large intervals. In the proposed method, a new technique for computing nonlinear terms in such problems is presented. The main advantage of the proposed method is that it transforms problems under consideration into nonlinear systems of algebraic equations which can be simply solved. Error analysis of the proposed method is investigated and also the efficiency of this method is shown on some concrete examples. The obtained results reveal that the proposed method is very accurate and efficient. As two useful applications, the proposed method is applied to obtain approximate solutions of the stochastic population growth models and stochastic pendulum problem.
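    For readers who want a baseline to compare against, a stochastic population-growth model of the kind mentioned above can also be simulated with the standard Euler-Maruyama scheme. This is shown instead of the paper's hat-function operational-matrix method; the SDE dX = r·X dt + s·X dW and all parameters are illustrative.

```python
import numpy as np

def euler_maruyama(x0, r, s, T, n_steps, n_paths, seed=1):
    """Simulate dX = r*X dt + s*X dW (geometric growth with noise)."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    x = np.full(n_paths, x0, dtype=float)
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), size=n_paths)
        x = x + r * x * dt + s * x * dW
    return x

x_T = euler_maruyama(x0=100.0, r=0.05, s=0.2, T=1.0, n_steps=500, n_paths=20000)
# for geometric Brownian motion the exact mean is E[X_T] = x0 * exp(r*T)
print(x_T.mean())
```

    Comparing the sample mean against the known value x0·exp(r·T) is a quick sanity check on any SDE solver before moving to equations without closed-form moments.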

  11. Delay-distribution-dependent H∞ state estimation for delayed neural networks with (x,v)-dependent noises and fading channels.

    PubMed

    Sheng, Li; Wang, Zidong; Tian, Engang; Alsaadi, Fuad E

    2016-12-01

    This paper deals with the H∞ state estimation problem for a class of discrete-time neural networks with stochastic delays subject to state- and disturbance-dependent noises (also called (x,v)-dependent noises) and fading channels. The time-varying stochastic delay takes values on certain intervals with known probability distributions. The system measurement is transmitted through fading channels described by the Rice fading model. The aim of the addressed problem is to design a state estimator such that the estimation performance is guaranteed in the mean-square sense against admissible stochastic time-delays, stochastic noises as well as stochastic fading signals. By employing the stochastic analysis approach combined with the Kronecker product, several delay-distribution-dependent conditions are derived to ensure that the error dynamics of the neuron states is stochastically stable with prescribed H∞ performance. Finally, a numerical example is provided to illustrate the effectiveness of the obtained results. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Integrating stochastic time-dependent travel speed in solution methods for the dynamic dial-a-ride problem.

    PubMed

    Schilde, M; Doerner, K F; Hartl, R F

    2014-10-01

    In urban areas, logistic transportation operations often run into problems because travel speeds change, depending on the current traffic situation. If not accounted for, time-dependent and stochastic travel speeds frequently lead to missed time windows and thus poorer service. Especially in the case of passenger transportation, it often leads to excessive passenger ride times as well. Therefore, time-dependent and stochastic influences on travel speeds are relevant for finding feasible and reliable solutions. This study considers the effect of exploiting statistical information available about historical accidents, using stochastic solution approaches for the dynamic dial-a-ride problem (dynamic DARP). The authors propose two pairs of metaheuristic solution approaches, each consisting of a deterministic method (average time-dependent travel speeds for planning) and its corresponding stochastic version (exploiting stochastic information while planning). The results, using test instances with up to 762 requests based on a real-world road network, show that in certain conditions, exploiting stochastic information about travel speeds leads to significant improvements over deterministic approaches.

  13. Problems of Mathematical Finance by Stochastic Control Methods

    NASA Astrophysics Data System (ADS)

    Stettner, Łukasz

    The purpose of this paper is to present the main ideas of mathematics of finance using stochastic control methods. There is an interplay between stochastic control and mathematics of finance. On the one hand, stochastic control is a powerful tool to study financial problems. On the other hand, financial applications have stimulated development in several research subareas of stochastic control over the last two decades. We start with pricing of financial derivatives and modeling of asset prices, studying the conditions for the absence of arbitrage. Then we consider pricing of defaultable contingent claims. Investments in bonds lead us to term structure modeling problems. Special attention is devoted to the classical static portfolio analysis known as Markowitz theory. We also briefly sketch dynamic portfolio problems using viscosity solutions to the Hamilton-Jacobi-Bellman equation, the martingale-convex analysis method, or the stochastic maximum principle together with backward stochastic differential equations. Finally, long time portfolio analysis for both risk neutral and risk sensitive functionals is introduced.

  14. Variety and volatility in financial markets

    NASA Astrophysics Data System (ADS)

    Lillo, Fabrizio; Mantegna, Rosario N.

    2000-11-01

    We study the price dynamics of stocks traded in a financial market by considering the statistical properties of both a single time series and an ensemble of stocks traded simultaneously. We use the n stocks traded on the New York Stock Exchange to form a statistical ensemble of daily stock returns. For each trading day of our database, we study the ensemble return distribution. We find that a typical ensemble return distribution exists in most of the trading days, with the exception of crash and rally days and of the days following these extreme events. We analyze each ensemble return distribution by extracting its first two central moments. We observe that these moments fluctuate in time and are themselves stochastic processes. We characterize the statistical properties of the ensemble return distribution's central moments by investigating their probability density functions and temporal correlation properties. In general, time-averaged and portfolio-averaged price returns have different statistical properties. We infer from these differences information about the relative strength of correlation between stocks and between different trading days. Lastly, we compare our empirical results with those predicted by the single-index model and conclude that this simple model cannot explain the statistical properties of the second moment of the ensemble return distribution.
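    The ensemble (cross-sectional) moments studied here are simple to compute once returns are arranged as a days-by-stocks matrix. The sketch below uses synthetic single-index-style data of our own making, not the NYSE database; the factor and noise scales are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
n_days, n_stocks = 250, 1000

# synthetic daily returns: a common "market" factor plus idiosyncratic noise,
# in the spirit of the single-index model the authors compare against
market = 0.01 * rng.normal(size=(n_days, 1))
returns = market + 0.02 * rng.normal(size=(n_days, n_stocks))

# ensemble (cross-sectional) moments for each trading day
ensemble_mean = returns.mean(axis=1)   # market-wide daily return
ensemble_std = returns.std(axis=1)     # daily "variety" of the market

print(ensemble_mean.shape, ensemble_std.shape)
```

    The time series `ensemble_mean` and `ensemble_std` are the two stochastic processes whose fluctuations and correlations the paper characterizes.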

  15. Evolution and anti-evolution in a minimal stock market model

    NASA Astrophysics Data System (ADS)

    Rothenstein, R.; Pawelzik, K.

    2003-08-01

    We present a novel microscopic stock market model consisting of a large number of random agents modeling traders in a market. Each agent is characterized by a set of parameters that serve to make iterated predictions of two successive returns. The future price is determined according to the offer and the demand of all agents. The system evolves by redistributing the capital among the agents in each trading cycle. Without noise the dynamics of this system is nearly regular and thereby fails to reproduce the stochastic return fluctuations observed in real markets. However, when in each cycle a small amount of noise is introduced we find the typical features of real financial time series like fat tails of the return distribution and large temporal correlations in the volatility, without significant correlations in the price returns. Introducing the noise by an evolutionary process leads to different scalings of the return distributions that depend on the definition of fitness. Because our realistic model has only very few parameters, and the results appear to be robust with respect to the noise level and the number of agents, we expect that our framework may serve as a new paradigm for modeling self-generated return fluctuations in markets.

  16. The effects of behavioral and structural assumptions in artificial stock market

    NASA Astrophysics Data System (ADS)

    Liu, Xinghua; Gregor, Shirley; Yang, Jianmei

    2008-04-01

    Recent literature has developed the conjecture that important statistical features of stock price series, such as the fat tails phenomenon, may depend mainly on the market microstructure. This conjecture motivated us to investigate the roles of both the market microstructure and agent behavior with respect to high-frequency returns and daily returns. We developed two simple models to investigate this issue. The first one is a stochastic model with a clearing house microstructure and a population of zero-intelligence agents. The second one has more behavioral assumptions based on the Minority Game and also has a clearing house microstructure. With the first model we found that a characteristic of the clearing house microstructure, namely the clearing frequency, can explain the fat-tail, excess-volatility and autocorrelation phenomena of high-frequency returns. However, this feature does not cause the same phenomena in daily returns. So the stylized facts of daily returns depend mainly on the agents' behavior. With the second model we investigated the effects of behavioral assumptions on daily returns. Our study implies that the mechanisms responsible for generating the stylized facts of high-frequency returns and daily returns are different.

  17. Structural model for fluctuations in financial markets

    NASA Astrophysics Data System (ADS)

    Anand, Kartik; Khedair, Jonathan; Kühn, Reimer

    2018-05-01

    In this paper we provide a comprehensive analysis of a structural model for the dynamics of prices of assets traded in a market which takes the form of an interacting generalization of the geometric Brownian motion model. It is formally equivalent to a model describing the stochastic dynamics of a system of analog neurons, which is expected to exhibit glassy properties and thus many metastable states in a large portion of its parameter space. We perform a generating functional analysis, introducing a slow driving of the dynamics to mimic the effect of slowly varying macroeconomic conditions. Distributions of asset returns over various time separations are evaluated analytically and are found to be fat-tailed in a manner broadly in line with empirical observations. Our model also allows us to identify collective, interaction-mediated properties of pricing distributions and it predicts pricing distributions which are significantly broader than their noninteracting counterparts, if interactions between prices in the model contain a ferromagnetic bias. Using simulations, we are able to substantiate one of the main hypotheses underlying the original modeling, viz., that the phenomenon of volatility clustering can be rationalized in terms of an interplay between the dynamics within metastable states and the dynamics of occasional transitions between them.

  18. Endogenous Lunar Volatiles

    NASA Technical Reports Server (NTRS)

    McCubbin, F. M.; Liu, Y.; Barnes, J. J.; Boyce, J. W.; Day, J. M. D.; Elardo, S. M.; Hui, H.; Magna, T.; Ni, P.; Tartese, R.

    2017-01-01

    The chapter will begin with an introduction that defines magmatic volatiles (e.g., H, F, Cl, S) versus geochemical volatiles (e.g., K, Rb, Zn). We will discuss our approach of understanding both types of volatiles in lunar samples and lay the ground work for how we will determine the overall volatile budget of the Moon. We will then discuss the importance of endogenous volatiles in shaping the "Newer Views of the Moon", specifically how endogenous volatiles feed forward into processes such as the origin of the Moon, magmatic differentiation, volcanism, and secondary processes during surface and crustal interactions. After the introduction, we will include a review/synthesis on the current state of 1) apatite compositions (volatile abundances and isotopic compositions); 2) nominally anhydrous mineral phases (moderately to highly volatile); 3) volatile (moderately to highly volatile) abundances in and isotopic compositions of lunar pyroclastic glass beads; 4) volatile (moderately to highly volatile) abundances in and isotopic compositions of lunar basalts; 5) volatile (moderately to highly volatile) abundances in and isotopic compositions of melt inclusions; and finally 6) experimental constraints on mineral-melt partitioning of moderately to highly volatile elements under lunar conditions. We anticipate that each section will summarize results since 2007 and focus on new results published since the 2015 Am Min review paper on lunar volatiles [9]. The next section will discuss how to use sample abundances of volatiles to understand the source region and potential caveats in estimating source abundances of volatiles. The following section will include our best estimates of volatile abundances and isotopic compositions (where permitted by available data) for each volatile element of interest in a number of important lunar reservoirs, including the crust, mantle, KREEP, and bulk Moon. The final section of the chapter will focus upon future work, outstanding questions, and any insights on the types of samples or experimental studies that will be needed to answer these questions.

  19. Numerical Approach to Spatial Deterministic-Stochastic Models Arising in Cell Biology.

    PubMed

    Schaff, James C; Gao, Fei; Li, Ye; Novak, Igor L; Slepchenko, Boris M

    2016-12-01

    Hybrid deterministic-stochastic methods provide an efficient alternative to a fully stochastic treatment of models which include components with disparate levels of stochasticity. However, general-purpose hybrid solvers for spatially resolved simulations of reaction-diffusion systems are not widely available. Here we describe fundamentals of a general-purpose spatial hybrid method. The method generates realizations of a spatially inhomogeneous hybrid system by appropriately integrating capabilities of a deterministic partial differential equation solver with a popular particle-based stochastic simulator, Smoldyn. Rigorous validation of the algorithm is detailed, using a simple model of calcium 'sparks' as a testbed. The solver is then applied to a deterministic-stochastic model of spontaneous emergence of cell polarity. The approach is general enough to be implemented within biologist-friendly software frameworks such as Virtual Cell.

  20. Diffusion approximations to the chemical master equation only have a consistent stochastic thermodynamics at chemical equilibrium

    NASA Astrophysics Data System (ADS)

    Horowitz, Jordan M.

    2015-07-01

    The stochastic thermodynamics of a dilute, well-stirred mixture of chemically reacting species is built on the stochastic trajectories of reaction events obtained from the chemical master equation. However, when the molecular populations are large, the discrete chemical master equation can be approximated with a continuous diffusion process, like the chemical Langevin equation or low noise approximation. In this paper, we investigate to what extent these diffusion approximations inherit the stochastic thermodynamics of the chemical master equation. We find that a stochastic-thermodynamic description is only valid at a detailed-balanced, equilibrium steady state. Away from equilibrium, where there is no consistent stochastic thermodynamics, we show that one can still use the diffusive solutions to approximate the underlying thermodynamics of the chemical master equation.
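The chemical Langevin equation named in this record can be illustrated on a one-species birth-death network, where it takes a simple closed form. A minimal sketch, with illustrative rate constants not taken from the paper:

```python
import numpy as np

def cle_birth_death(k=5.0, g=0.1, x0=50.0, dt=0.01, t_end=2000.0, seed=0):
    """Chemical Langevin approximation of a birth-death process:
    dx = (k - g*x) dt + sqrt(k + g*x) dW; the stationary mean is k/g."""
    rng = np.random.default_rng(seed)
    n = int(t_end / dt)
    x = np.empty(n + 1)
    x[0] = x0
    sqdt = np.sqrt(dt)
    for i in range(n):
        drift = k - g * x[i]
        diffusion = np.sqrt(max(k + g * x[i], 0.0))  # guard the square root
        x[i + 1] = max(x[i] + drift * dt + diffusion * sqdt * rng.standard_normal(), 0.0)
    return x

traj = cle_birth_death()
print(traj[len(traj) // 10:].mean())  # long-run average, close to k/g = 50
```

The diffusion approximation is accurate here because the stationary population (50 molecules) is reasonably large; the abstract's point is that thermodynamic quantities derived from such trajectories are only consistent with the master equation at detailed-balanced steady states.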

  1. Diffusion approximations to the chemical master equation only have a consistent stochastic thermodynamics at chemical equilibrium.

    PubMed

    Horowitz, Jordan M

    2015-07-28

    The stochastic thermodynamics of a dilute, well-stirred mixture of chemically reacting species is built on the stochastic trajectories of reaction events obtained from the chemical master equation. However, when the molecular populations are large, the discrete chemical master equation can be approximated with a continuous diffusion process, like the chemical Langevin equation or low noise approximation. In this paper, we investigate to what extent these diffusion approximations inherit the stochastic thermodynamics of the chemical master equation. We find that a stochastic-thermodynamic description is only valid at a detailed-balanced, equilibrium steady state. Away from equilibrium, where there is no consistent stochastic thermodynamics, we show that one can still use the diffusive solutions to approximate the underlying thermodynamics of the chemical master equation.

  2. Breaking the theoretical scaling limit for predicting quasiparticle energies: the stochastic GW approach.

    PubMed

    Neuhauser, Daniel; Gao, Yi; Arntsen, Christopher; Karshenas, Cyrus; Rabani, Eran; Baer, Roi

    2014-08-15

    We develop a formalism to calculate the quasiparticle energy within the GW many-body perturbation correction to the density functional theory. The occupied and virtual orbitals of the Kohn-Sham Hamiltonian are replaced by stochastic orbitals used to evaluate the Green function G, the polarization potential W, and, thereby, the GW self-energy. The stochastic GW (sGW) formalism relies on novel theoretical concepts such as stochastic time-dependent Hartree propagation, stochastic matrix compression, and spatial or temporal stochastic decoupling techniques. Beyond the theoretical interest, the formalism enables linear scaling GW calculations breaking the theoretical scaling limit for GW as well as circumventing the need for energy cutoff approximations. We illustrate the method for silicon nanocrystals of varying sizes with N_{e}>3000 electrons.

  3. Biochemical simulations: stochastic, approximate stochastic and hybrid approaches.

    PubMed

    Pahle, Jürgen

    2009-01-01

    Computer simulations have become an invaluable tool to study the sometimes counterintuitive temporal dynamics of (bio-)chemical systems. In particular, stochastic simulation methods have attracted increasing interest recently. In contrast to the well-known deterministic approach based on ordinary differential equations, they can capture effects that occur due to the underlying discreteness of the systems and random fluctuations in molecular numbers. Numerous stochastic, approximate stochastic and hybrid simulation methods have been proposed in the literature. In this article, they are systematically reviewed in order to guide the researcher and help her find the appropriate method for a specific problem.
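As a concrete instance of the exact stochastic simulation methods reviewed here, a minimal Gillespie-style simulation of a one-species birth-death network can be sketched as follows (the parameters are illustrative, not from the article):

```python
import numpy as np

def ssa_birth_death(k=5.0, g=0.1, x0=50, t_end=2000.0, seed=1):
    """Exact (Gillespie) simulation: production at rate k, degradation at rate g*x."""
    rng = np.random.default_rng(seed)
    t, x = 0.0, x0
    times, states = [t], [x]
    while t < t_end:
        a_prod, a_deg = k, g * x             # reaction propensities
        a_total = a_prod + a_deg
        t += rng.exponential(1.0 / a_total)  # waiting time to next reaction
        x += 1 if rng.random() * a_total < a_prod else -1
        times.append(t)
        states.append(x)
    return np.array(times), np.array(states)

t, x = ssa_birth_death()
tw_mean = np.sum(x[:-1] * np.diff(t)) / (t[-1] - t[0])
print(tw_mean)  # time-weighted average, close to k/g = 50
```

Because the cost scales with the number of reaction events, this exact method becomes expensive for fast reactions, which is precisely what motivates the approximate and hybrid methods the review surveys.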

  4. Biochemical simulations: stochastic, approximate stochastic and hybrid approaches

    PubMed Central

    2009-01-01

    Computer simulations have become an invaluable tool to study the sometimes counterintuitive temporal dynamics of (bio-)chemical systems. In particular, stochastic simulation methods have attracted increasing interest recently. In contrast to the well-known deterministic approach based on ordinary differential equations, they can capture effects that occur due to the underlying discreteness of the systems and random fluctuations in molecular numbers. Numerous stochastic, approximate stochastic and hybrid simulation methods have been proposed in the literature. In this article, they are systematically reviewed in order to guide the researcher and help her find the appropriate method for a specific problem. PMID:19151097

  5. Towards sub-optimal stochastic control of partially observable stochastic systems

    NASA Technical Reports Server (NTRS)

    Ruzicka, G. J.

    1980-01-01

    A class of multidimensional stochastic control problems with noisy data and bounded controls encountered in aerospace design is examined. The emphasis is on suboptimal design, with optimality taken in the quadratic-mean sense. To that effect the problem is viewed as a stochastic version of the Lurie problem known from nonlinear control theory. The main result is a separation theorem (involving a nonlinear Kalman-like filter) suitable for Lurie-type approximations. The theorem allows for discontinuous characteristics. As a byproduct, the existence of strong solutions to a class of non-Lipschitzian stochastic differential equations is proven.

  6. Modeling and Properties of Nonlinear Stochastic Dynamical System of Continuous Culture

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Feng, Enmin; Ye, Jianxiong; Xiu, Zhilong

    The stochastic counterpart to the deterministic description of continuous fermentation with ordinary differential equations is investigated for the process of glycerol bio-dissimilation to 1,3-propanediol by Klebsiella pneumoniae. We briefly discuss the continuous fermentation process driven by three-dimensional Brownian motion with Lipschitz coefficients, which is suitable for the actual fermentation. Subsequently, we study the existence and uniqueness of solutions for the stochastic system, as well as the boundedness of the second-order moment and the Markov property of the solution. Finally, stochastic simulation is carried out using the stochastic Euler-Maruyama method.
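The Euler-Maruyama scheme named in the abstract can be sketched on a simpler scalar SDE with a known mean, geometric Brownian motion, rather than the paper's three-dimensional fermentation model:

```python
import numpy as np

def euler_maruyama_gbm(x0=1.0, mu=0.05, sigma=0.2, T=1.0, n_steps=100,
                       n_paths=20000, seed=42):
    """Euler-Maruyama paths of dX = mu*X dt + sigma*X dW on [0, T]."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    x = np.full(n_paths, x0)
    for _ in range(n_steps):
        dw = rng.standard_normal(n_paths) * np.sqrt(dt)
        x = x + mu * x * dt + sigma * x * dw  # one Euler-Maruyama step
    return x

x_T = euler_maruyama_gbm()
print(x_T.mean())  # analytic mean is x0 * exp(mu*T) = exp(0.05), about 1.0513
```

The same update rule extends componentwise to vector SDEs such as the fermentation model, with a three-dimensional Brownian increment in place of the scalar one.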

  7. The Sharma-Parthasarathy stochastic two-body problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cresson, J.; SYRTE/Observatoire de Paris, 75014 Paris; Pierret, F.

    2015-03-15

    We study the Sharma-Parthasarathy stochastic two-body problem introduced by Sharma and Parthasarathy in [“Dynamics of a stochastically perturbed two-body problem,” Proc. R. Soc. A 463, 979-1003 (2007)]. In particular, we focus on the preservation of some fundamental features of the classical two-body problem like the Hamiltonian structure and first integrals in the stochastic case. Numerical simulations are performed which illustrate the dynamical behaviour of the osculating elements as the semi-major axis, the eccentricity, and the pericenter. We also derive a stochastic version of Gauss’s equations in the planar case.

  8. 2–stage stochastic Runge–Kutta for stochastic delay differential equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosli, Norhayati; Jusoh Awang, Rahimah; Bahar, Arifah

    2015-05-15

    This paper proposes a newly developed one-step derivative-free method, the 2-stage stochastic Runge-Kutta (SRK2) scheme, to approximate the solution of stochastic delay differential equations (SDDEs) with a constant time lag, r > 0. A general formulation of stochastic Runge-Kutta methods for SDDEs is introduced, and the Stratonovich Taylor series expansion for the numerical solution of SRK2 is presented. The local truncation error of SRK2 is measured by comparing the Stratonovich Taylor expansion of the exact solution with that of the computed solution. A numerical experiment is performed to confirm the validity of the method in simulating the strong solution of SDDEs.
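SRK2 itself is specific to the paper, but the baseline it improves upon, a one-step Euler scheme for an SDDE with a constant lag r > 0, can be sketched as follows (the coefficients and constant history function are illustrative assumptions):

```python
import numpy as np

def euler_sdde(a=-1.0, b=0.1, r=1.0, dt=0.01, t_end=10.0, phi=1.0, seed=3):
    """Euler scheme for dX(t) = a*X(t - r) dt + b dW(t), with X(t) = phi for t <= 0."""
    rng = np.random.default_rng(seed)
    n = int(t_end / dt)
    lag = int(r / dt)          # constant lag expressed in time steps
    x = np.empty(n + 1)
    x[0] = phi
    for i in range(n):
        x_lag = phi if i < lag else x[i - lag]   # delayed state, from history if needed
        x[i + 1] = x[i] + a * x_lag * dt + b * np.sqrt(dt) * rng.standard_normal()
    return x

x = euler_sdde()
print(x[-1])
```

The only structural difference from an ordinary SDE solver is the buffer of past values feeding the drift; higher-order stochastic Runge-Kutta schemes such as SRK2 replace the single drift evaluation with staged evaluations.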

  9. Approximate Dynamic Programming and Aerial Refueling

    DTIC Science & Technology

    2007-06-01

    Aerial refueling was first demonstrated by two Army Air Corps de Havilland DH-4Bs (9). While crude by modern standards, the passing of hoses between planes is effectively the same approach used today. The report compares total cost for stochastically trained simulations versus deterministically trained simulations incorporating stochastic data sets. To create meaningful results when testing stochastic data, the data sets are averaged so that conclusions are not

  10. Artificial Neural Network Metamodels of Stochastic Computer Simulations

    DTIC Science & Technology

    1994-08-10

    Front-matter fragment (OCR): Artificial Neural Network Metamodels of Stochastic Computer Simulations, by Robert Allen Kilmer (B.S. in Education Mathematics, Indiana), with a dedication to the memory of his father, William Ralph Kilmer.

  11. Approximation of Quantum Stochastic Differential Equations for Input-Output Model Reduction

    DTIC Science & Technology

    2016-02-25

    We have completed a short program of theoretical research on dimensional reduction and approximation of models based on quantum stochastic differential equations. Our primary results lie in the area of quantum probability and quantum stochastic differential equations.

  12. Evaporative fractionation of volatile stable isotopes and their bearing on the origin of the Moon

    PubMed Central

    Day, James M. D.; Moynier, Frederic

    2014-01-01

    The Moon is depleted in volatile elements relative to the Earth and Mars. Low abundances of volatile elements, fractionated stable isotope ratios of S, Cl, K and Zn, high μ (238U/204Pb) and long-term Rb/Sr depletion are distinguishing features of the Moon, relative to the Earth. These geochemical characteristics indicate both inheritance of volatile-depleted materials that formed the Moon and planets and subsequent evaporative loss of volatile elements that occurred during lunar formation and differentiation. Models of volatile loss through localized eruptive degassing are not consistent with the available S, Cl, Zn and K isotopes and abundance data for the Moon. The most probable cause of volatile depletion is global-scale evaporation resulting from a giant impact or a magma ocean phase where inefficient volatile loss during magmatic convection led to the present distribution of volatile elements within mantle and crustal reservoirs. Problems exist for models of planetary volatile depletion following giant impact. Most critically, in this model, the volatile loss requires preferential delivery and retention of late-accreted volatiles to the Earth compared with the Moon. Different proportions of late-accreted mass are computed to explain present-day distributions of volatile and moderately volatile elements (e.g. Pb, Zn; 5 to >10%) relative to highly siderophile elements (approx. 0.5%) for the Earth. Models of early magma ocean phases may be more effective in explaining the volatile loss. Basaltic materials (e.g. eucrites and angrites) from highly differentiated airless asteroids are volatile-depleted, like the Moon, whereas the Earth and Mars have proportionally greater volatile contents. Parent-body size and the existence of early atmospheres are therefore likely to represent fundamental controls on planetary volatile retention or loss. PMID:25114311

  13. Doubly stochastic Poisson processes in artificial neural learning.

    PubMed

    Card, H C

    1998-01-01

    This paper investigates neuron activation statistics in artificial neural networks employing stochastic arithmetic. It is shown that a doubly stochastic Poisson process is an appropriate model for the signals in these circuits.
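A doubly stochastic (Cox) Poisson process can be sampled by first drawing a random rate and then drawing Poisson counts conditioned on it; the gamma mixing distribution below is a common illustrative choice, not taken from the paper:

```python
import numpy as np

def cox_counts(alpha=2.0, theta=3.0, T=1.0, n=50000, seed=7):
    """Doubly stochastic Poisson counts: rate Lambda ~ Gamma(alpha, theta),
    then N | Lambda ~ Poisson(Lambda * T)."""
    rng = np.random.default_rng(seed)
    rates = rng.gamma(alpha, theta, size=n)   # one random rate per realization
    return rng.poisson(rates * T)

counts = cox_counts()
# Mean is alpha*theta*T = 6, but the variance exceeds the mean
# (overdispersion), unlike an ordinary Poisson process.
print(counts.mean(), counts.var())
```

The overdispersion is the signature that distinguishes a doubly stochastic model of neural firing from a plain Poisson model with fixed rate.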

  14. Accounting for mudflow genesis in preliminary assessment of the maximum volume of solid mudflow sediments in the North Caucasus

    NASA Astrophysics Data System (ADS)

    Zalikhanov, M. Ch.; Kondratieva, N. V.; Adzhiev, A. Kh.; Razumov, V. V.

    2016-09-01

    The area of investigation was subject to multifactor analysis of the relationship between the maximum amount of mudflow solid sediments ( W) and parameters such as the mudflow basin area ( S), average channel slope (α), and mudflow channel length ( L). They were used to obtain analytical expressions in order to approximate the W( S, L, α) relation based on the mudflow genesis and source height. Statistical data on mudflow manifestations in different basins in the North Caucasus covering more than fifty years were used to obtain the analytical expressions in order to assess the maximum volume of mudflow solid sediments.

  15. Prospect Theory and Interval-Valued Hesitant Set for Safety Evacuation Model

    NASA Astrophysics Data System (ADS)

    Kou, Meng; Lu, Na

    2018-01-01

    The study applies results from prospect theory and multi-attribute decision making theory to the underground mine fire system, which is complex, uncertain, and influenced by many factors. Taking full account of decision makers' psychological behavior, such as emotion and intuition, it establishes an intuitionistic fuzzy multiple attribute decision making method based on prospect theory. The model established by this method can explain decision makers' safety evacuation behavior in the complex system of an underground mine fire under environmental uncertainty, imperfect information, human psychological behavior, and other factors.

  16. Financial model calibration using consistency hints.

    PubMed

    Abu-Mostafa, Y S

    2001-01-01

    We introduce a technique for forcing the calibration of a financial model to produce valid parameters. The technique is based on learning from hints. It converts simple curve fitting into genuine calibration, where broad conclusions can be inferred from parameter values. The technique augments the error function of curve fitting with consistency hint error functions based on the Kullback-Leibler distance. We introduce an efficient EM-type optimization algorithm tailored to this technique. We also introduce other consistency hints, and balance their weights using canonical errors. We calibrate the correlated multifactor Vasicek model of interest rates, and apply it successfully to Japanese Yen swaps market and US dollar yield market.
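In the simplest setting of equal-variance Gaussians, a Kullback-Leibler consistency hint reduces to a quadratic penalty on the parameter, so the hint-augmented objective has a closed-form minimizer. A toy sketch of this idea (not the paper's EM-type algorithm or its Vasicek calibration):

```python
import numpy as np

def calibrate_with_hint(x, mu_hint, lam):
    """Minimize sum((x_i - mu)^2) + lam*(mu - mu_hint)^2 in closed form.

    For equal-variance Gaussians the KL distance between the fitted and
    hint distributions reduces to such a squared difference of means,
    so the consistency hint enters the error function as a quadratic penalty.
    """
    return (x.sum() + lam * mu_hint) / (len(x) + lam)

data = np.array([1.0, 2.0, 3.0, 4.0])        # sample mean 2.5
mu = calibrate_with_hint(data, mu_hint=5.0, lam=4.0)
print(mu)  # 3.75: pulled from the data mean 2.5 toward the hint 5.0
```

Increasing the hint weight lam trades pure curve-fitting fidelity for consistency, which is the balance the paper tunes with canonical errors.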

  17. Leadership Styles of Oxford House Officers

    PubMed Central

    Komer, Anne C; Jason, Leonard A; Harvey, Ronald; Olson, Brad

    2015-01-01

    Oxford House recovery homes are unusual compared to most recovery homes in that they function entirely without the use of staff; instead, members are elected to officer positions. The aim of this study was to perform a preliminary analysis of the leadership styles used by members of Oxford House. Twenty-nine residents of five Oxford Houses were asked to rate their own leadership styles using the Leader Behavior Description Questionnaire and the Multifactor Leadership Questionnaire. Results showed that participants were more likely to use person-oriented behaviors than task-oriented actions. Transformational leadership was associated with higher outcomes than transactional leadership. Implications for future research are discussed. PMID:26380329

  18. Leadership Styles of Oxford House Officers.

    PubMed

    Komer, Anne C; Jason, Leonard A; Harvey, Ronald; Olson, Brad

    Oxford House recovery homes are unusual compared to most recovery homes in that they function entirely without the use of staff; instead, members are elected to officer positions. The aim of this study was to perform a preliminary analysis of the leadership styles used by members of Oxford House. Twenty-nine residents of five Oxford Houses were asked to rate their own leadership styles using the Leader Behavior Description Questionnaire and the Multifactor Leadership Questionnaire. Results showed that participants were more likely to use person-oriented behaviors than task-oriented actions. Transformational leadership was associated with higher outcomes than transactional leadership. Implications for future research are discussed.

  19. Entropic stochastic resonance of a self-propelled Janus particle

    NASA Astrophysics Data System (ADS)

    Liu, Zhenzhen; Du, Luchun; Guo, Wei; Mei, Dong-Cheng

    2016-10-01

    Entropic stochastic resonance is investigated for a self-propelled Janus particle moving in a double-cavity container. Numerical simulation results indicate that the entropic stochastic resonance can survive even if there is no symmetry breaking in any direction. This is the essential distinction between a self-propelled Janus particle and a passive Brownian particle, for which symmetry breaking is necessary for entropic stochastic resonance. As the rotational noise intensity grows at small fixed translational noise intensity, the signal power amplification increases monotonically towards saturation, which can also be regarded as a kind of stochastic resonance effect. In addition, increasing the natural frequency of the periodic driving depresses the degree of stochastic resonance, whereas raising its amplitude first enhances and then suppresses the behavior.

  20. Numerical Approach to Spatial Deterministic-Stochastic Models Arising in Cell Biology

    PubMed Central

    Gao, Fei; Li, Ye; Novak, Igor L.; Slepchenko, Boris M.

    2016-01-01

    Hybrid deterministic-stochastic methods provide an efficient alternative to a fully stochastic treatment of models which include components with disparate levels of stochasticity. However, general-purpose hybrid solvers for spatially resolved simulations of reaction-diffusion systems are not widely available. Here we describe fundamentals of a general-purpose spatial hybrid method. The method generates realizations of a spatially inhomogeneous hybrid system by appropriately integrating capabilities of a deterministic partial differential equation solver with a popular particle-based stochastic simulator, Smoldyn. Rigorous validation of the algorithm is detailed, using a simple model of calcium ‘sparks’ as a testbed. The solver is then applied to a deterministic-stochastic model of spontaneous emergence of cell polarity. The approach is general enough to be implemented within biologist-friendly software frameworks such as Virtual Cell. PMID:27959915

  1. Stochastic receding horizon control: application to an octopedal robot

    NASA Astrophysics Data System (ADS)

    Shah, Shridhar K.; Tanner, Herbert G.

    2013-06-01

    Miniature autonomous systems are being developed under ARL's Micro Autonomous Systems and Technology (MAST) program. These systems can only be fitted with a small-size processor, and their motion behavior is inherently uncertain due to manufacturing variations and platform-ground interactions. One way to capture this uncertainty is through a stochastic model. This paper deals with stochastic motion control design and implementation for MAST-specific eight-legged miniature crawling robots, which have been kinematically modeled as systems exhibiting the behavior of a Dubins car with stochastic noise. The control design takes the form of stochastic receding horizon control, and is implemented on a Gumstix Overo Fire COM with a 720 MHz processor and 512 MB RAM, weighing 5.5 g. The experimental results show the effectiveness of this control law for miniature autonomous systems perturbed by stochastic noise.

  2. Optimality, stochasticity, and variability in motor behavior

    PubMed Central

    Guigon, Emmanuel; Baraduc, Pierre; Desmurget, Michel

    2008-01-01

    Recent theories of motor control have proposed that the nervous system acts as a stochastically optimal controller, i.e. it plans and executes motor behaviors taking into account the nature and statistics of noise. Detrimental effects of noise are converted into a principled way of controlling movements. Attractive aspects of such theories are their ability to explain not only characteristic features of single motor acts, but also statistical properties of repeated actions. Here, we present a critical analysis of stochastic optimality in motor control which reveals several difficulties with this hypothesis. We show that stochastic control may not be necessary to explain the stochastic nature of motor behavior, and we propose an alternative framework, based on the action of a deterministic controller coupled with an optimal state estimator, which relieves drawbacks of stochastic optimality and appropriately explains movement variability. PMID:18202922

  3. Stochastic partial differential fluid equations as a diffusive limit of deterministic Lagrangian multi-time dynamics.

    PubMed

    Cotter, C J; Gottwald, G A; Holm, D D

    2017-09-01

    In Holm (Holm 2015 Proc. R. Soc. A 471 , 20140963. (doi:10.1098/rspa.2014.0963)), stochastic fluid equations were derived by employing a variational principle with an assumed stochastic Lagrangian particle dynamics. Here we show that the same stochastic Lagrangian dynamics naturally arises in a multi-scale decomposition of the deterministic Lagrangian flow map into a slow large-scale mean and a rapidly fluctuating small-scale map. We employ homogenization theory to derive effective slow stochastic particle dynamics for the resolved mean part, thereby obtaining stochastic fluid partial equations in the Eulerian formulation. To justify the application of rigorous homogenization theory, we assume mildly chaotic fast small-scale dynamics, as well as a centring condition. The latter requires that the mean of the fluctuating deviations is small, when pulled back to the mean flow.

  4. Effects of stochastic sodium channels on extracellular excitation of myelinated nerve fibers.

    PubMed

    Mino, Hiroyuki; Grill, Warren M

    2002-06-01

    The effects of the stochastic gating properties of sodium channels on the extracellular excitation properties of mammalian nerve fibers were determined by computer simulation. To reduce computation time, a hybrid multicompartment cable model was developed, including five central nodes of Ranvier containing stochastic sodium channels and 16 flanking nodes containing deterministic membrane dynamics. The excitation properties of the hybrid cable model were comparable with those of a full stochastic cable model including 21 nodes of Ranvier containing stochastic sodium channels, indicating the validity of the hybrid cable model. The hybrid cable model was used to investigate whether the excitation properties of extracellularly activated fibers were influenced by the stochastic gating of sodium channels, including spike latencies, strength-duration (SD), current-distance (IX), and recruitment properties. The stochastic properties of the sodium channels in the hybrid cable model had the greatest impact on the temporal dynamics of nerve fibers, i.e., a large variability in latencies, while they did not influence the SD, IX, or recruitment properties as compared with those of the conventional deterministic cable model. These findings suggest that inclusion of stochastic nodes is not important for model-based design of stimulus waveforms for activation of motor nerve fibers. However, in cases where temporal fine structure is important, for example in sensory neural prostheses in the auditory and visual systems, the stochastic properties of the sodium channels may play a key role in the design of stimulus waveforms.

  5. Accurate hybrid stochastic simulation of a system of coupled chemical or biochemical reactions.

    PubMed

    Salis, Howard; Kaznessis, Yiannis

    2005-02-01

    The dynamical solution of a well-mixed, nonlinear stochastic chemical kinetic system, described by the Master equation, may be exactly computed using the stochastic simulation algorithm. However, because the computational cost scales with the number of reaction occurrences, systems with one or more "fast" reactions become costly to simulate. This paper describes a hybrid stochastic method that partitions the system into subsets of fast and slow reactions, approximates the fast reactions as a continuous Markov process, using a chemical Langevin equation, and accurately describes the slow dynamics using the integral form of the "Next Reaction" variant of the stochastic simulation algorithm. The key innovation of this method is its mechanism of efficiently monitoring the occurrences of slow, discrete events while simultaneously simulating the dynamics of a continuous, stochastic or deterministic process. In addition, by introducing an approximation in which multiple slow reactions may occur within a time step of the numerical integration of the chemical Langevin equation, the hybrid stochastic method performs much faster with only a marginal decrease in accuracy. Multiple examples, including a biological pulse generator and a large-scale system benchmark, are simulated using the exact and proposed hybrid methods as well as, for comparison, a previous hybrid stochastic method. Probability distributions of the solutions are compared and the weak errors of the first two moments are computed. In general, these hybrid methods may be applied to the simulation of the dynamics of a system described by stochastic differential, ordinary differential, and Master equations.

  6. Three Dimensional Time Dependent Stochastic Method for Cosmic-ray Modulation

    NASA Astrophysics Data System (ADS)

    Pei, C.; Bieber, J. W.; Burger, R. A.; Clem, J. M.

    2009-12-01

    A proper understanding of the different behavior of galactic cosmic-ray intensities in different solar cycle phases requires solving the modulation equation with time dependence. We present a detailed description of our newly developed stochastic approach to cosmic-ray modulation, which we believe is the first attempt to solve the time-dependent Parker equation in 3D, evolving from our 3D steady-state stochastic approach, which has been benchmarked extensively against the finite difference method. Our 3D stochastic method differs from other stochastic approaches in the literature (Ball et al. 2005, Miyake et al. 2005, and Florinski 2008) in several ways. For example, we employ spherical coordinates, which makes the code much more efficient by reducing coordinate transformations. Moreover, our stochastic differential equations differ from others because our map from Parker's original equation to the Fokker-Planck equation extends the method used by Jokipii and Levy (1977), although all 3D stochastic methods are essentially based on the Ito formula. An advantage of the stochastic approach is that, besides intensities, it also gives probability information on the travel times and path lengths of cosmic rays. We show that excellent agreement exists between solutions obtained by our steady-state stochastic method and by the traditional finite difference method. We also show time-dependent solutions for an idealized heliosphere with a Parker magnetic field, a planar current sheet, and a simple initial condition.
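The core idea of such stochastic approaches, representing a diffusion equation's solution as an expectation over random paths, can be shown in one dimension via the Feynman-Kac formula (the parameters and initial condition are illustrative, far simpler than the Parker equation):

```python
import numpy as np

def diffusion_mc(f, x, D, t, n=200000, seed=11):
    """Monte Carlo value of u(x, t) solving u_t = D*u_xx with u(x, 0) = f(x):
    u(x, t) = E[ f(x + sqrt(2*D*t) * Z) ] with Z standard normal."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n)
    return f(x + np.sqrt(2.0 * D * t) * z).mean()

u = diffusion_mc(lambda y: y ** 2, x=1.0, D=0.5, t=1.0)
print(u)  # exact value is x^2 + 2*D*t = 2.0
```

Full modulation codes solve a 3D advection-diffusion equation by integrating coupled SDEs for the pseudo-particle position and energy, but the averaging step, and the fact that the simulated paths also carry travel-time and path-length statistics, is the same.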

  7. An overview of plant volatile metabolomics, sample treatment and reporting considerations with emphasis on mechanical damage and biological control of weeds.

    PubMed

    Beck, John J; Smith, Lincoln; Baig, Nausheena

    2014-01-01

    The technology for the collection and analysis of plant-emitted volatiles for understanding chemical cues of plant-plant, plant-insect or plant-microbe interactions has increased over the years. Consequently, the in situ collection, analysis and identification of volatiles are considered integral to elucidation of complex plant communications. Due to the complexity and range of emissions the conditions for consistent emission of volatiles are difficult to standardise. To discuss: evaluation of emitted volatile metabolites as a means of screening potential target- and non-target weeds/plants for insect biological control agents; plant volatile metabolomics to analyse resultant data; importance of considering volatiles from damaged plants; and use of a database for reporting experimental conditions and results. Recent literature relating to plant volatiles and plant volatile metabolomics are summarised to provide a basic understanding of how metabolomics can be applied to the study of plant volatiles. An overview of plant secondary metabolites, plant volatile metabolomics, analysis of plant volatile metabolomics data and the subsequent input into a database, the roles of plant volatiles, volatile emission as a function of treatment, and the application of plant volatile metabolomics to biological control of invasive weeds. It is recommended that in addition to a non-damaged treatment, plants be damaged prior to collecting volatiles to provide the greatest diversity of odours. For the model system provided, optimal volatile emission occurred when the leaf was punctured with a needle. Results stored in a database should include basic environmental conditions or treatments. Copyright © 2013 John Wiley & Sons, Ltd.

  8. Contribution to volatile organic compound exposures from time spent in stores and restaurants and bars.

    PubMed

    Loh, Miranda M; Houseman, E Andres; Levy, Jonathan I; Spengler, John D; Bennett, Deborah H

    2009-11-01

    Many people spend time in stores and restaurants, yet there has been little investigation of the influence of these microenvironments on personal exposure. Relative to the outdoors, transportation, and the home, these microenvironments have high concentrations of several volatile organic compounds (VOCs). We developed a stochastic model to examine the effect of VOC concentrations in these microenvironments on total personal exposure for (1) non-smoking adults working in offices who spend time in stores and restaurants or bars and (2) non-smoking adults who work in these establishments. We also compared the effect of working in a smoking versus non-smoking restaurant or bar. Input concentrations for each microenvironment were developed from the literature whereas time activity inputs were taken from the National Human Activity Patterns Survey. Time-averaged exposures were simulated for 5000 individuals over a weeklong period for each analysis. Mean contributions to personal exposure from non-working time spent in stores and restaurants or bars range from <5% to 20%, depending on the VOC and time-activity patterns. At the 95th percentile of the distribution of the proportion of personal exposure attributable to time spent in stores and restaurants or bars, these microenvironments can be responsible for over half of a person's total exposure to certain VOCs. People working in restaurants or bars where smoking is allowed had the highest fraction of exposure attributable to their workplace. At the median, people who worked in stores or restaurants tended to have 20-60% of their total exposures from time spent at work. These results indicate that stores and restaurants can be large contributors to personal exposure to VOCs for both workers in those establishments and for a subset of people who visit these places, and that incorporation of these non-residential microenvironments can improve models of personal exposure distributions.
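A stochastic time-weighted exposure model of the kind described can be sketched with lognormal microenvironment concentrations; all concentrations and time fractions below are hypothetical placeholders, not the paper's literature-derived inputs:

```python
import numpy as np

# Hypothetical microenvironments: (geometric-mean VOC concentration in ug/m3,
# mean fraction of time spent there). Illustrative values only.
ENVS = {
    "home":       (10.0, 0.90),
    "store":      (30.0, 0.05),
    "restaurant": (50.0, 0.05),
}

def simulate_exposure(n=5000, sigma=0.5, seed=13):
    """Time-weighted average exposure E = sum_i C_i * f_i per simulated person,
    with concentrations drawn independently from lognormal distributions."""
    rng = np.random.default_rng(seed)
    total = np.zeros(n)
    for gm, frac in ENVS.values():
        conc = rng.lognormal(np.log(gm), sigma, size=n)  # person-to-person variation
        total += conc * frac
    return total

exposure = simulate_exposure()
print(exposure.mean())
```

Attributable fractions per microenvironment follow by dividing each term by the total, which is how the study quantifies the contribution of stores and restaurants to personal exposure.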

  9. Making an Iron Planet: The Case for Repeated Hit and Run Collisions

    NASA Astrophysics Data System (ADS)

    Asphaug, E. I.; Reufer, A.

    2014-12-01

Earth, Venus, Mars and some of the largest asteroids have massive silicate mantles surrounding iron cores, and chondritic compositions. Against this backdrop are anomalies like the iron planet Mercury, the Moon with almost no core, and metallic asteroids like Psyche. The Moon can be explained by giant impact, but for Mercury a giant impact (Benz et al., Icarus 1988) is problematic. Mercury must retain substantial volatiles after its obliteration (e.g. Peplowski et al., Science 2011), and must somehow avoid accreting its ejected silicates (Gladman and Coffey, MAPS 2009). SPH simulations have shown (Asphaug and Reufer, Nature Geoscience 2014; Sarid et al., LPSC 2014) that a differentiated chondritic proto-Mercury about 3 times its present mass can be stripped of its mantle in one energetic hit-and-run collision with a larger planet (proto-Venus or proto-Earth). To preserve Mercury's volatiles we also consider the scenario of lower-energy hit-and-run collisions in succession. We show that if 20 Mars-like planets accreted stochastically to form Venus and the Earth, then the statistics of attrition are likely to leave one planet (Mercury) expressing repeated mantle stripping, and another planet (Mars) relatively undisturbed. For iron asteroids the "missing mantle paradox" likewise looms large. Where does the mantle go, and how is so much mantle rock stripped away (in some cases down to a bare iron core; Yang et al., Nature 2007; Moskovitz et al., EPSL 2011) while leaving asteroids like Vesta presumably intact? According to the hit-and-run hypothesis, the sink for all this missing silicate is the larger accreted bodies at the top of the feeding chain, as they win the pairwise dynamical competition for stripped materials. This exotic origin of relics is only relevant to the few pairwise encounters that do not accrete both bodies. So the small survivors are lucky, and how they are lucky, their attrition bias, is manifested as compositional diversity and a preponderance of iron relics.

  10. Stochastic Nature in Cellular Processes

    NASA Astrophysics Data System (ADS)

    Liu, Bo; Liu, Sheng-Jun; Wang, Qi; Yan, Shi-Wei; Geng, Yi-Zhao; Sakata, Fumihiko; Gao, Xing-Fa

    2011-11-01

The importance of stochasticity in cellular processes is increasingly recognized in both theoretical and experimental studies. General features of stochasticity in gene regulation and expression are briefly reviewed in this article, including the main experimental phenomena and the classification, quantification, and regulation of noise. The correlation and transmission of noise in cascade networks are then analyzed, and stochastic simulation methods that can capture the effects of intrinsic and extrinsic noise are described.

  11. Dynamical Epidemic Suppression Using Stochastic Prediction and Control

    DTIC Science & Technology

    2004-10-28

initial probability density function (PDF), p: D ⊂ R² → R, is defined by the stochastic Frobenius-Perron...For deterministic systems, normal methods of...induced chaos. To analyze the qualitative change, we apply the technique of the stochastic Frobenius-Perron operator [L. Billings et al., Phys. Rev. Lett...transition matrix describing the probability of transport from one region of phase space to another, which approximates the stochastic Frobenius-Perron

  12. Optimal Stochastic Modeling and Control of Flexible Structures

    DTIC Science & Technology

    1988-09-01

1.37] and McLane [1.18] considered multivariable systems and derived their optimal control characteristics. Kleinman, Gorman and Zaborsky considered...Leondes [1.72,1.73] studied various aspects of multivariable linear stochastic, discrete-time systems that are partly deterministic, and partly stochastic...June 1966. 1.8. A.V. Balakrishnan, Applied Functional Analysis, 2nd ed., New York, N.Y.: Springer-Verlag, 1981. 1.9. Peter S. Maybeck, Stochastic

  13. Optimal Control of Stochastic Systems Driven by Fractional Brownian Motions

    DTIC Science & Technology

    2014-10-09

problems for stochastic partial differential equations driven by fractional Brownian motions are explicitly solved. For the control of a continuous-time...linear systems with Brownian motion or a discrete-time linear system with a white Gaussian noise and costs...Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211...stochastic optimal control, fractional Brownian motion, stochastic

  14. Stochastic Game Analysis and Latency Awareness for Self-Adaptation

    DTIC Science & Technology

    2014-01-01

this paper, we introduce a formal analysis technique based on model checking of stochastic multiplayer games (SMGs) that enables us to quantify the...Additional Key Words and Phrases: Proactive adaptation, Stochastic multiplayer games, Latency...1. INTRODUCTION When planning how to adapt, self-adaptive...contribution of this paper is twofold: (1) A novel analysis technique based on model checking of stochastic multiplayer games (SMGs) that enables us to

  15. A Rigorous Temperature-Dependent Stochastic Modelling and Testing for MEMS-Based Inertial Sensor Errors.

    PubMed

    El-Diasty, Mohammed; Pagiatakis, Spiros

    2009-01-01

In this paper, we examine the effect of changing the temperature points on MEMS-based inertial sensor random error. We collect static data under different temperature points using a MEMS-based inertial sensor mounted inside a thermal chamber. Rigorous stochastic models, namely Autoregressive-based Gauss-Markov (AR-based GM) models, are developed to describe the random error behaviour. The proposed AR-based GM model is initially applied to short stationary inertial data to develop the stochastic model parameters (correlation times). It is shown that the stochastic model parameters of a MEMS-based inertial unit, namely the ADIS16364, are temperature dependent. In addition, field kinematic test data collected at about 17 °C are used to test the performance of the stochastic models at different temperature points in the filtering stage using an Unscented Kalman Filter (UKF). It is shown that the stochastic model developed at 20 °C provides a more accurate inertial navigation solution than the ones obtained from the stochastic models developed at -40 °C, -20 °C, 0 °C, +40 °C, and +60 °C. The temperature dependence of the stochastic model is significant and should be considered at all times to obtain an optimal navigation solution for MEMS-based INS/GPS integration.
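The first-order Gauss-Markov process underlying the AR-based GM models above can be sketched as follows; the correlation time and stationary standard deviation used here are illustrative placeholders, not the paper's estimates for the ADIS16364.

```python
import numpy as np

def simulate_gm1(tau, sigma, dt, n, seed=0):
    """Simulate a first-order Gauss-Markov process x_{k+1} = a*x_k + w_k,
    with a = exp(-dt/tau) and driving noise scaled so that the process
    has stationary standard deviation sigma."""
    rng = np.random.default_rng(seed)
    a = np.exp(-dt / tau)
    q = sigma * np.sqrt(1.0 - a * a)   # driving-noise standard deviation
    x = np.zeros(n)
    for k in range(1, n):
        x[k] = a * x[k - 1] + q * rng.standard_normal()
    return x

# Hypothetical values: correlation time 50 s, stationary std 0.1 (unit-free)
x = simulate_gm1(tau=50.0, sigma=0.1, dt=1.0, n=20000)
```

In an INS error model, a process of this form would typically drive a sensor-bias state inside the Kalman filter, with `tau` and `sigma` taken from the temperature-matched stochastic model.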

  16. Multivariate moment closure techniques for stochastic kinetic models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lakatos, Eszter, E-mail: e.lakatos13@imperial.ac.uk; Ale, Angelique; Kirk, Paul D. W.

    2015-09-07

Stochastic effects dominate many chemical and biochemical processes. Their analysis, however, can be computationally prohibitively expensive, and a range of approximation schemes have been proposed to lighten the computational burden. These, notably the increasingly popular linear noise approximation and the more general moment expansion methods, perform well for many dynamical regimes, especially linear systems. At higher levels of nonlinearity, an interplay between the nonlinearities and the stochastic dynamics arises, which is much harder to capture correctly by such approximations to the true stochastic processes. Moment-closure approaches promise to address this problem by capturing higher-order terms of the temporally evolving probability distribution. Here, we develop a set of multivariate moment-closures that allows us to describe the stochastic dynamics of nonlinear systems. Multivariate closure captures the way that correlations between different molecular species, induced by the reaction dynamics, interact with stochastic effects. We use multivariate Gaussian, gamma, and lognormal closures and illustrate their use in the context of two models that have proved challenging to previous attempts at approximating stochastic dynamics: oscillations in p53 and Hes1. In addition, we consider a larger system, Erk-mediated mitogen-activated protein kinase signalling, where conventional stochastic simulation approaches incur unacceptably high computational costs.

  17. Does NVIX matter for market volatility? Evidence from Asia-Pacific markets

    NASA Astrophysics Data System (ADS)

    Su, Zhi; Fang, Tong; Yin, Libo

    2018-02-01

Forecasting financial market volatility is an important issue in econophysics, and revealing the determinants of market volatility has drawn much attention from academics. In order to better predict market volatility, we use news-based implied volatility (NVIX) to measure uncertainty, and examine the predictive power of NVIX for stock market volatility over both the long and the short term in Asia-Pacific markets via the GARCH-MIDAS model. We find that NVIX does not explain long-term volatility variation well over the full sample period, but it is positively associated with market volatility in a subsample analysis starting from the Financial Crisis. We also find that NVIX is more efficient in determining short-term than long-term volatility, indicating that the impact of NVIX is short-lived and that information investors care about is quickly reflected in stock market volatility.

  18. Optimal estimation of parameters and states in stochastic time-varying systems with time delay

    NASA Astrophysics Data System (ADS)

    Torkamani, Shahab; Butcher, Eric A.

    2013-08-01

In this study, estimation of parameters and states in stochastic linear and nonlinear delay differential systems with time-varying coefficients and constant delay is explored. The approach consists of first employing a continuous time approximation to approximate the stochastic delay differential equation with a set of stochastic ordinary differential equations. Then the problem of parameter estimation in the resulting stochastic differential system is represented as an optimal filtering problem using a state augmentation technique. By adapting the extended Kalman-Bucy filter to the resulting system, the unknown parameters of the time-delayed system are estimated from noise-corrupted, possibly incomplete measurements of the states.

  19. Use of behavioural stochastic resonance by paddle fish for feeding

    NASA Astrophysics Data System (ADS)

    Russell, David F.; Wilkens, Lon A.; Moss, Frank

    1999-11-01

    Stochastic resonance is the phenomenon whereby the addition of an optimal level of noise to a weak information-carrying input to certain nonlinear systems can enhance the information content at their outputs. Computer analysis of spike trains has been needed to reveal stochastic resonance in the responses of sensory receptors except for one study on human psychophysics. But is an animal aware of, and can it make use of, the enhanced sensory information from stochastic resonance? Here, we show that stochastic resonance enhances the normal feeding behaviour of paddlefish (Polyodon spathula), which use passive electroreceptors to detect electrical signals from planktonic prey. We demonstrate significant broadening of the spatial range for the detection of plankton when a noisy electric field of optimal amplitude is applied in the water. We also show that swarms of Daphnia plankton are a natural source of electrical noise. Our demonstration of stochastic resonance at the level of a vital animal behaviour, feeding, which has probably evolved for functional success, provides evidence that stochastic resonance in sensory nervous systems is an evolutionary adaptation.

  20. Phenomenology of stochastic exponential growth

    NASA Astrophysics Data System (ADS)

    Pirjol, Dan; Jafarpour, Farshid; Iyer-Biswas, Srividya

    2017-06-01

Stochastic exponential growth is observed in a variety of contexts, including molecular autocatalysis, nuclear fission, population growth, inflation of the universe, viral social media posts, and financial markets. Yet the literature on modeling the phenomenology of these stochastic dynamics has predominantly focused on one model, geometric Brownian motion (GBM), which can be described as the solution of a Langevin equation with linear drift and linear multiplicative noise. Using recent experimental results on stochastic exponential growth of individual bacterial cell sizes, we motivate the need for a more general class of phenomenological models of stochastic exponential growth, which are consistent with the observation that the mean-rescaled distributions are approximately stationary at long times. We show that this behavior is not consistent with GBM; instead, it is consistent with power-law multiplicative noise with positive fractional powers. Therefore, we consider this general class of phenomenological models for stochastic exponential growth, provide analytical solutions, and identify the important dimensionless combination of model parameters, which determines the shape of the mean-rescaled distribution. We also provide a prescription for robustly inferring model parameters from experimentally observed stochastic growth trajectories.
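As a point of comparison for the argument above, the following sketch samples GBM exactly in log space and checks that the spread of the mean-rescaled distribution keeps growing with the horizon, so it cannot become stationary; all parameter values are arbitrary.

```python
import numpy as np

def simulate_gbm(x0, mu, sigma, dt, n_steps, n_paths, seed=0):
    """Exact sampling of geometric Brownian motion dX = mu*X dt + sigma*X dW
    via its log-space solution (no discretization error)."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal((n_paths, n_steps))
    increments = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
    return x0 * np.exp(np.cumsum(increments, axis=1))

paths = simulate_gbm(x0=1.0, mu=0.05, sigma=0.2, dt=0.01, n_steps=1000, n_paths=5000)

# For GBM, X_t / E[X_t] is lognormal with log-variance sigma^2 * t, which
# grows without bound: the mean-rescaled distribution never stabilizes.
rescaled = paths / paths.mean(axis=0)
spread = np.log(rescaled[:, -1]).std()   # ~ sigma * sqrt(t); grows with t
```

The observed stationarity of mean-rescaled cell-size distributions is exactly what this `spread` diagnostic would rule GBM out on.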

  1. Integrating stochastic time-dependent travel speed in solution methods for the dynamic dial-a-ride problem

    PubMed Central

    Schilde, M.; Doerner, K.F.; Hartl, R.F.

    2014-01-01

In urban areas, logistic transportation operations often run into problems because travel speeds change, depending on the current traffic situation. If not accounted for, time-dependent and stochastic travel speeds frequently lead to missed time windows and thus poorer service. Especially in the case of passenger transportation, they often lead to excessive passenger ride times as well. Therefore, time-dependent and stochastic influences on travel speeds are relevant for finding feasible and reliable solutions. This study considers the effect of exploiting statistical information available about historical accidents, using stochastic solution approaches for the dynamic dial-a-ride problem (dynamic DARP). The authors propose two pairs of metaheuristic solution approaches, each consisting of a deterministic method (average time-dependent travel speeds for planning) and its corresponding stochastic version (exploiting stochastic information while planning). The results, using test instances with up to 762 requests based on a real-world road network, show that in certain conditions, exploiting stochastic information about travel speeds leads to significant improvements over deterministic approaches. PMID:25844013

  2. Dynamics of a stochastic multi-strain SIS epidemic model driven by Lévy noise

    NASA Astrophysics Data System (ADS)

    Chen, Can; Kang, Yanmei

    2017-01-01

    A stochastic multi-strain SIS epidemic model is formulated by introducing Lévy noise into the disease transmission rate of each strain. First, we prove that the stochastic model admits a unique global positive solution, and, by the comparison theorem, we show that the solution remains within a positively invariant set almost surely. Next we investigate stochastic stability of the disease-free equilibrium, including stability in probability and pth moment asymptotic stability. Then sufficient conditions for persistence in the mean of the disease are established. Finally, based on an Euler scheme for Lévy-driven stochastic differential equations, numerical simulations for a stochastic two-strain model are carried out to verify the theoretical results. Moreover, numerical comparison results of the stochastic two-strain model and the deterministic version are also given. Lévy noise can cause the two strains to become extinct almost surely, even though there is a dominant strain that persists in the deterministic model. It can be concluded that the introduction of Lévy noise reduces the disease extinction threshold, which indicates that Lévy noise may suppress the disease outbreak.

  3. Volatile Metabolites

    PubMed Central

    Rowan, Daryl D.

    2011-01-01

    Volatile organic compounds (volatiles) comprise a chemically diverse class of low molecular weight organic compounds having an appreciable vapor pressure under ambient conditions. Volatiles produced by plants attract pollinators and seed dispersers, and provide defense against pests and pathogens. For insects, volatiles may act as pheromones directing social behavior or as cues for finding hosts or prey. For humans, volatiles are important as flavorants and as possible disease biomarkers. The marine environment is also a major source of halogenated and sulfur-containing volatiles which participate in the global cycling of these elements. While volatile analysis commonly measures a rather restricted set of analytes, the diverse and extreme physical properties of volatiles provide unique analytical challenges. Volatiles constitute only a small proportion of the total number of metabolites produced by living organisms, however, because of their roles as signaling molecules (semiochemicals) both within and between organisms, accurately measuring and determining the roles of these compounds is crucial to an integrated understanding of living systems. This review summarizes recent developments in volatile research from a metabolomics perspective with a focus on the role of recent technical innovation in developing new areas of volatile research and expanding the range of ecological interactions which may be mediated by volatile organic metabolites. PMID:24957243

  4. q-Gaussian distributions and multiplicative stochastic processes for analysis of multiple financial time series

    NASA Astrophysics Data System (ADS)

    Sato, Aki-Hiro

    2010-12-01

This study considers q-Gaussian distributions and stochastic differential equations with both multiplicative and additive noises. In the M-dimensional case a q-Gaussian distribution can be theoretically derived as the stationary probability distribution of a stochastic differential equation with mutually independent multiplicative and additive noises. Using the proposed stochastic differential equation, a method is proposed to evaluate the default probability under a given risk buffer.
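A minimal one-dimensional sketch of such a process, assuming the common form dx = -γx dt + √(αx² + β) dW with hypothetical parameter values: for α > 0 the stationary density is a q-Gaussian (Student-t-like), with heavier-than-Gaussian tails.

```python
import numpy as np

def simulate_mult_add_sde(gamma, alpha, beta, dt, n_steps, n_paths, seed=0):
    """Euler-Maruyama for dx = -gamma*x dt + sqrt(alpha*x^2 + beta) dW.
    alpha = 0 recovers an Ornstein-Uhlenbeck process with a Gaussian
    stationary law; alpha > 0 yields a heavy-tailed q-Gaussian."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n_paths)
    sqdt = np.sqrt(dt)
    for _ in range(n_steps):
        dw = sqdt * rng.standard_normal(n_paths)
        x += -gamma * x * dt + np.sqrt(alpha * x * x + beta) * dw
    return x

# Hypothetical parameters; stationary second moment is beta/(2*gamma - alpha).
samples = simulate_mult_add_sde(gamma=1.0, alpha=0.5, beta=1.0,
                                dt=1e-3, n_steps=20000, n_paths=4000)
# Tail fraction beyond 3 standard deviations; for a Gaussian this is ~0.0027,
# here it is noticeably larger because of the multiplicative noise.
frac_tail = np.mean(np.abs(samples) > 3.0 * samples.std())
```

A default probability under a given risk buffer, as in the abstract's application, could be estimated analogously as the fraction of stationary samples breaching the buffer level.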

  5. Modelling the cancer growth process by Stochastic Differential Equations with the effect of Chondroitin Sulfate (CS) as anticancer therapeutics

    NASA Astrophysics Data System (ADS)

    Syahidatul Ayuni Mazlan, Mazma; Rosli, Norhayati; Jauhari Arief Ichwan, Solachuddin; Suhaity Azmi, Nina

    2017-09-01

A stochastic model is introduced to describe the growth of cancer affected by the anti-cancer therapeutic Chondroitin Sulfate (CS). The parameter values of the stochastic model are estimated via maximum likelihood. The Euler-Maruyama method is employed to solve the model numerically. The efficiency of the stochastic model is measured by comparing the simulated results with the experimental data.
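The abstract does not give the model equations, so the following is only a generic sketch of Euler-Maruyama applied to a stochastic Gompertz growth law dX = X(a - b ln X) dt + σX dW; the functional form and all parameter values are assumptions for illustration.

```python
import numpy as np

def euler_maruyama_gompertz(x0, a, b, sigma, dt, n_steps, seed=0):
    """Euler-Maruyama for a stochastic Gompertz growth model
    dX = X*(a - b*ln X) dt + sigma*X dW (hypothetical parameters)."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        dw = np.sqrt(dt) * rng.standard_normal()
        x[k + 1] = x[k] + x[k] * (a - b * np.log(x[k])) * dt + sigma * x[k] * dw
    return x

path = euler_maruyama_gompertz(x0=1.0, a=1.0, b=0.2, sigma=0.05, dt=0.01, n_steps=2000)
# The deterministic carrying capacity is exp(a/b) = exp(5), about 148;
# with weak noise the path settles in a band around that level.
```

Parameter estimation as in the abstract would then maximize the likelihood of observed tumour sizes under the one-step transition density implied by this discretization.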

  6. Hyperbolic Cross Truncations for Stochastic Fourier Cosine Series

    PubMed Central

    Zhang, Zhihua

    2014-01-01

Based on our decomposition of stochastic processes and our asymptotic representations of Fourier cosine coefficients, we deduce an asymptotic formula for the approximation errors of hyperbolic cross truncations of bivariate stochastic Fourier cosine series. Moreover, we propose a kind of Fourier cosine expansion with polynomial factors such that the corresponding Fourier cosine coefficients decay very fast. Although our research is in the setting of stochastic processes, our results are also new for deterministic functions. PMID:25147842

  7. Stochastic associative memory

    NASA Astrophysics Data System (ADS)

    Baumann, Erwin W.; Williams, David L.

    1993-08-01

Artificial neural networks capable of learning and recalling stochastic associations between non-deterministic quantities have received relatively little attention to date. One potential application of such stochastic associative networks is the generation of sensory 'expectations' based on arbitrary subsets of sensor inputs to support anticipatory and investigative behavior in sensor-based robots. Another application of this type of associative memory is the prediction of how a scene will look in one spectral band, including noise, based upon its appearance in several other wavebands. This paper describes a semi-supervised neural network architecture composed of self-organizing maps associated through stochastic inter-layer connections. This 'Stochastic Associative Memory' (SAM) can learn and recall non-deterministic associations between multi-dimensional probability density functions. The stochastic nature of the network also enables it to represent noise distributions that are inherent in any true sensing process. The SAM architecture, training process, and initial application to sensor image prediction are described. Relationships to Fuzzy Associative Memory (FAM) are discussed.

  8. Random-order fractional bistable system and its stochastic resonance

    NASA Astrophysics Data System (ADS)

    Gao, Shilong; Zhang, Li; Liu, Hui; Kan, Bixia

    2017-01-01

In this paper, the diffusive motion of Brownian particles in a viscous liquid subject to stochastic fluctuations of the external environment is modeled as a random-order fractional bistable equation, and the stochastic resonance phenomena in this system are investigated as a typical nonlinear dynamic behavior. First, the derivation of the random-order fractional bistable system is given. In particular, the random power-law memory is discussed in depth to obtain a physical interpretation of the random-order fractional derivative. Second, the stochastic resonance evoked by the random order and an external periodic force is studied, mainly by numerical simulation. In particular, frequency shifting of the periodic output is observed in the stochastic resonance induced by the random-order excitation. Finally, the stochastic resonance of the system under the double stochastic excitation of the random order and internal colored noise is also investigated.
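For orientation, the classical integer-order bistable benchmark that the random-order fractional system generalizes can be simulated as below; the parameters are illustrative, and the random-order memory kernel itself is not reproduced here.

```python
import numpy as np

def bistable_sr(A, omega, D, dt, n_steps, seed=0):
    """Euler-Maruyama for the classical integer-order bistable system
    dx = (x - x^3) dt + A*cos(omega*t) dt + sqrt(2*D) dW,
    the standard benchmark for stochastic resonance."""
    rng = np.random.default_rng(seed)
    noise = np.sqrt(2.0 * D * dt) * rng.standard_normal(n_steps - 1)
    x = np.empty(n_steps)
    x[0] = -1.0                          # start in the left potential well
    t = np.arange(n_steps) * dt
    for k in range(n_steps - 1):
        drift = x[k] - x[k]**3 + A * np.cos(omega * t[k])
        x[k + 1] = x[k] + drift * dt + noise[k]
    return x

def power_at(x, dt, omega):
    """Squared Fourier amplitude of the trajectory at the driving frequency."""
    t = np.arange(len(x)) * dt
    return abs(np.mean(x * np.exp(-1j * omega * t)))**2

x_low = bistable_sr(A=0.1, omega=0.1, D=0.02, dt=0.01, n_steps=200000)
x_opt = bistable_sr(A=0.1, omega=0.1, D=0.17, dt=0.01, n_steps=200000)
# Near-optimal noise synchronises inter-well hopping with the weak forcing,
# so the response at omega is much larger than at low noise.
```

The random-order fractional version studied in the paper replaces the integer-order derivative here with one whose order fluctuates stochastically, which is what produces the frequency-shifting effect reported in the abstract.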

  9. Variance decomposition in stochastic simulators.

    PubMed

    Le Maître, O P; Knio, O M; Moraes, A

    2015-06-28

    This work aims at the development of a mathematical and computational approach that enables quantification of the inherent sources of stochasticity and of the corresponding sensitivities in stochastic simulations of chemical reaction networks. The approach is based on reformulating the system dynamics as being generated by independent standardized Poisson processes. This reformulation affords a straightforward identification of individual realizations for the stochastic dynamics of each reaction channel, and consequently a quantitative characterization of the inherent sources of stochasticity in the system. By relying on the Sobol-Hoeffding decomposition, the reformulation enables us to perform an orthogonal decomposition of the solution variance. Thus, by judiciously exploiting the inherent stochasticity of the system, one is able to quantify the variance-based sensitivities associated with individual reaction channels, as well as the importance of channel interactions. Implementation of the algorithms is illustrated in light of simulations of simplified systems, including the birth-death, Schlögl, and Michaelis-Menten models.

  10. Cox process representation and inference for stochastic reaction-diffusion processes

    NASA Astrophysics Data System (ADS)

    Schnoerr, David; Grima, Ramon; Sanguinetti, Guido

    2016-05-01

    Complex behaviour in many systems arises from the stochastic interactions of spatially distributed particles or agents. Stochastic reaction-diffusion processes are widely used to model such behaviour in disciplines ranging from biology to the social sciences, yet they are notoriously difficult to simulate and calibrate to observational data. Here we use ideas from statistical physics and machine learning to provide a solution to the inverse problem of learning a stochastic reaction-diffusion process from data. Our solution relies on a non-trivial connection between stochastic reaction-diffusion processes and spatio-temporal Cox processes, a well-studied class of models from computational statistics. This connection leads to an efficient and flexible algorithm for parameter inference and model selection. Our approach shows excellent accuracy on numeric and real data examples from systems biology and epidemiology. Our work provides both insights into spatio-temporal stochastic systems, and a practical solution to a long-standing problem in computational modelling.

  11. Feynman-Kac formula for stochastic hybrid systems.

    PubMed

    Bressloff, Paul C

    2017-01-01

    We derive a Feynman-Kac formula for functionals of a stochastic hybrid system evolving according to a piecewise deterministic Markov process. We first derive a stochastic Liouville equation for the moment generator of the stochastic functional, given a particular realization of the underlying discrete Markov process; the latter generates transitions between different dynamical equations for the continuous process. We then analyze the stochastic Liouville equation using methods recently developed for diffusion processes in randomly switching environments. In particular, we obtain dynamical equations for the moment generating function, averaged with respect to realizations of the discrete Markov process. The resulting Feynman-Kac formula takes the form of a differential Chapman-Kolmogorov equation. We illustrate the theory by calculating the occupation time for a one-dimensional velocity jump process on the infinite or semi-infinite real line. Finally, we present an alternative derivation of the Feynman-Kac formula based on a recent path-integral formulation of stochastic hybrid systems.

  12. Hybrid ODE/SSA methods and the cell cycle model

    NASA Astrophysics Data System (ADS)

    Wang, S.; Chen, M.; Cao, Y.

    2017-07-01

Stochastic effects in cellular systems have been an important topic in systems biology, and stochastic modeling and simulation methods are important tools for studying them. Given the low efficiency of stochastic simulation algorithms, the hybrid method, which combines an ordinary differential equation (ODE) system with a stochastic chemically reacting system, shows unique advantages in the modeling and simulation of biochemical systems. The efficiency of the hybrid method is usually limited by reactions in the stochastic subsystem, which are modeled and simulated using Gillespie's framework and frequently interrupt the integration of the ODE subsystem. In this paper we develop an efficient implementation approach for the hybrid method coupled with traditional ODE solvers. We also compare the efficiency of hybrid methods with three widely used ODE solvers: RADAU5, DASSL, and DLSODAR. Numerical experiments with three biochemical models are presented, and a detailed discussion of the performance of the three ODE solvers is given.
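The stochastic subsystem in such hybrid schemes is advanced with Gillespie's direct method; a self-contained sketch on a hypothetical birth-death model (not one of the paper's three benchmark models) is:

```python
import numpy as np

def gillespie_ssa(x0, stoich, rates, propensity, t_end, seed=0):
    """Gillespie's direct method (SSA).
    stoich: (n_reactions, n_species) state-change matrix;
    propensity(x, rates): vector of reaction propensities at state x."""
    rng = np.random.default_rng(seed)
    t, x = 0.0, np.array(x0, dtype=float)
    times, states = [t], [x.copy()]
    while t < t_end:
        a = propensity(x, rates)
        a0 = a.sum()
        if a0 <= 0:                          # no reaction can fire
            break
        t += rng.exponential(1.0 / a0)       # waiting time to next reaction
        j = rng.choice(len(a), p=a / a0)     # which reaction fires
        x += stoich[j]
        times.append(t)
        states.append(x.copy())
    return np.array(times), np.array(states)

# Hypothetical birth-death example: 0 -> S at rate k1, S -> 0 at rate k2*S
stoich = np.array([[1.0], [-1.0]])
prop = lambda x, k: np.array([k[0], k[1] * x[0]])
times, states = gillespie_ssa([0], stoich, (10.0, 0.1), prop, t_end=200.0)
# The stationary mean copy number is k1/k2 = 100 molecules.
```

In a hybrid ODE/SSA scheme, firings generated this way interrupt the ODE solver, which is exactly the cost the paper's implementation approach aims to reduce.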

  13. Stochastic goal-oriented error estimation with memory

    NASA Astrophysics Data System (ADS)

    Ackmann, Jan; Marotzke, Jochem; Korn, Peter

    2017-11-01

We propose a stochastic dual-weighted error estimator for the viscous shallow-water equation with boundaries. For this purpose, previous work on memory-less stochastic dual-weighted error estimation is extended by incorporating memory effects. The memory is introduced by describing the local truncation error as a sum of time-correlated random variables. The random variables themselves represent the temporal fluctuations in local truncation errors and are estimated from high-resolution information at near-initial times. The resulting error estimator is evaluated experimentally in two classical ocean-type experiments, the Munk gyre and the flow around an island. In these experiments, the stochastic process is adapted locally to the respective dynamical flow regime. Our stochastic dual-weighted error estimator is shown to provide meaningful error bounds for a range of physically relevant goals. We prove, as well as show numerically, that our approach can be interpreted as a linearized stochastic-physics ensemble.

  14. Individualism in plant populations: using stochastic differential equations to model individual neighbourhood-dependent plant growth.

    PubMed

    Lv, Qiming; Schneider, Manuel K; Pitchford, Jonathan W

    2008-08-01

    We study individual plant growth and size hierarchy formation in an experimental population of Arabidopsis thaliana, within an integrated analysis that explicitly accounts for size-dependent growth, size- and space-dependent competition, and environmental stochasticity. It is shown that a Gompertz-type stochastic differential equation (SDE) model, involving asymmetric competition kernels and a stochastic term which decreases with the logarithm of plant weight, efficiently describes individual plant growth, competition, and variability in the studied population. The model is evaluated within a Bayesian framework and compared to its deterministic counterpart, and to several simplified stochastic models, using distributional validation. We show that stochasticity is an important determinant of size hierarchy and that SDE models outperform the deterministic model if and only if structural components of competition (asymmetry; size- and space-dependence) are accounted for. Implications of these results are discussed in the context of plant ecology and in more general modelling situations.

  15. Computational singular perturbation analysis of stochastic chemical systems with stiffness

    NASA Astrophysics Data System (ADS)

    Wang, Lijin; Han, Xiaoying; Cao, Yanzhao; Najm, Habib N.

    2017-04-01

Computational singular perturbation (CSP) is a useful method for the analysis, reduction, and time integration of stiff ordinary differential equation systems. It has found dominant utility, in particular, in chemical reaction systems with a large range of time scales at the continuum and deterministic level. On the other hand, CSP is not directly applicable to chemical reaction systems at the micro- or meso-scale, where stochasticity plays a non-negligible role and thus has to be taken into account. In this work we develop a novel stochastic computational singular perturbation (SCSP) analysis and time integration framework, and an associated algorithm, that can be used not only to construct accurate and efficient numerical solutions to stiff stochastic chemical reaction systems, but also to analyze the dynamics of the reduced stochastic reaction systems. The algorithm is illustrated by an application to a benchmark stochastic differential equation model, and numerical experiments are carried out to demonstrate the effectiveness of the construction.

  16. p-adic stochastic hidden variable model

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrew

    1998-03-01

We propose a stochastic hidden-variable model in which the hidden variables have a p-adic probability distribution ρ(λ), while the conditional probability distributions P(U,λ), U=A,A',B,B', are ordinary probabilities defined on the basis of the Kolmogorov measure-theoretic axiomatics. The frequency definition of p-adic probability is quite similar to the ordinary frequency definition of probability: p-adic frequency probability is defined as the limit of relative frequencies ν_n, but in the p-adic metric. We study a model with p-adic stochastics at the level of the hidden-variable description, while, of course, the responses of macroapparatuses have to be described by ordinary stochastics. Thus our model describes a mixture of p-adic stochastics of the microworld and ordinary stochastics of macroapparatuses. In this model the probabilities of physical observables are ordinary probabilities. At the same time, Bell's inequality is violated.

  17. Nonholonomic relativistic diffusion and exact solutions for stochastic Einstein spaces

    NASA Astrophysics Data System (ADS)

    Vacaru, S. I.

    2012-03-01

    We develop an approach to the theory of nonholonomic relativistic stochastic processes in curved spaces. The Itô and Stratonovich calculi are formulated for spaces with conventional horizontal (holonomic) and vertical (nonholonomic) splitting defined by nonlinear connection structures. Geometric models of relativistic diffusion theory are elaborated for nonholonomic (pseudo-)Riemannian manifolds and phase velocity spaces. Applying the anholonomic deformation method, the field equations of Einstein gravity and various modifications are formally integrated in general form, with generic off-diagonal metrics depending on certain classes of generating and integration functions. Choosing random generating functions, we can construct various classes of stochastic Einstein manifolds. We show how stochastic gravitational interactions with mixed holonomic/nonholonomic and random variables can be modelled in explicit form, and we study their main geometric and stochastic properties. Finally, we analyze the conditions under which non-random classical gravitational processes transform into stochastic ones, and conversely.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Chuchu, E-mail: chenchuchu@lsec.cc.ac.cn; Hong, Jialin, E-mail: hjl@lsec.cc.ac.cn; Zhang, Liying, E-mail: lyzhang@lsec.cc.ac.cn

    Stochastic Maxwell equations with additive noise are intrinsically a system of stochastic Hamiltonian partial differential equations, possessing a stochastic multi-symplectic conservation law. It is shown that the averaged energy increases linearly in time and that the flow of the stochastic Maxwell equations with additive noise preserves the divergence in the sense of expectation. Moreover, we propose three novel stochastic multi-symplectic methods for discretizing the stochastic Maxwell equations, in order to investigate the preservation of these properties numerically. Theoretical discussion and comparison of the three methods show that all of them preserve the corresponding discrete version of the averaged divergence. We also obtain the dissipative property of the discrete averaged energy satisfied by each method. In particular, the evolution rates of the averaged energies for all three methods are derived and are in accordance with the continuous case. Numerical experiments are performed to verify our theoretical results.
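    Linear growth of the averaged energy is generic for Hamiltonian systems driven by additive noise, not specific to the Maxwell case. A finite-dimensional sketch (our own illustration, not one of the paper's three methods): for the stochastic harmonic oscillator dq = p dt, dp = -q dt + σ dW with H = (p² + q²)/2, Itô's formula gives E[H(t)] = H(0) + σ²t/2, and a symplectic-Euler Monte Carlo estimate reproduces this rate.

    ```python
    import math
    import random

    def mean_energy(sigma=0.5, dt=0.01, n_steps=100, n_paths=4000, seed=1):
        # Stochastic symplectic Euler for dq = p dt, dp = -q dt + sigma dW,
        # returning the sample average of H = (p^2 + q^2)/2 at T = n_steps*dt.
        rng = random.Random(seed)
        total = 0.0
        for _ in range(n_paths):
            q, p = 1.0, 0.0  # H(0) = 0.5
            for _ in range(n_steps):
                p = p - q * dt + sigma * rng.gauss(0.0, math.sqrt(dt))
                q = q + p * dt
            total += 0.5 * (p * p + q * q)
        return total / n_paths

    # Theory: E[H(T)] = H(0) + sigma^2 * T / 2 = 0.5 + 0.125 = 0.625 at T = 1.
    estimate = mean_energy()
    ```

    Each noise kick injects σ²·dt/2 of energy per step on average, so the discrete averaged energy grows at the continuous rate σ²/2, mirroring the paper's finding that the discrete evolution rates agree with the continuous case.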

  19. Switching of bound vector solitons for the coupled nonlinear Schrödinger equations with nonhomogeneously stochastic perturbations

    NASA Astrophysics Data System (ADS)

    Sun, Zhi-Yuan; Gao, Yi-Tian; Yu, Xin; Liu, Ying

    2012-12-01

    We investigate the dynamics of bound vector solitons (BVSs) for the coupled nonlinear Schrödinger equations with nonhomogeneously stochastic perturbations added to their dispersion terms. Soliton switching (besides soliton breakup) can be observed between the two components of the BVSs. The maximum switched energy (in absolute value) within a fixed propagation distance (about ten BVS periods) increases, in the statistical sense, as the amplitudes of the stochastic perturbations increase. Additionally, it is revealed that BVSs with enhanced coherence are more robust against perturbations with nonhomogeneous stochasticity. A diagram describing the approximate borders of the splitting and non-splitting regions is also given. Our results might be helpful for studying the dynamics of BVSs under stochastic noise in nonlinear optical fibers, or under stochastic quantum fluctuations in Bose-Einstein condensates.
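    The numerical setup behind such studies can be sketched in a single-component reduction (our own hedged illustration, not the authors' coupled model): a split-step Fourier scheme for i u_z + d(z)/2 u_tt + |u|² u = 0, where the dispersion coefficient d(z) fluctuates randomly about 1 from step to step. Both substeps are unitary (a Fourier-space phase factor and a pointwise phase rotation), so the total power ∫|u|² dt is conserved for every noise realization, while the pulse itself is perturbed.

    ```python
    import numpy as np

    def propagate(u0, t, dz, n_steps, sigma, seed=0):
        # Split-step Fourier for i u_z + d(z)/2 u_tt + |u|^2 u = 0, with a
        # stochastically perturbed dispersion d(z) = 1 + sigma*xi_k/sqrt(dz).
        rng = np.random.default_rng(seed)
        k = 2.0 * np.pi * np.fft.fftfreq(t.size, d=t[1] - t[0])
        u = u0.astype(complex)
        for _ in range(n_steps):
            d = 1.0 + sigma * rng.standard_normal() / np.sqrt(dz)
            # Linear (dispersive) half: a pure phase in Fourier space.
            u = np.fft.ifft(np.exp(-0.5j * d * k**2 * dz) * np.fft.fft(u))
            # Nonlinear half: a pointwise phase rotation, |u| unchanged.
            u = u * np.exp(1j * np.abs(u) ** 2 * dz)
        return u

    t = np.linspace(-20.0, 20.0, 512, endpoint=False)
    u0 = 1.0 / np.cosh(t)  # fundamental soliton of the unperturbed equation
    u = propagate(u0, t, dz=0.01, n_steps=500, sigma=0.05)
    dt = t[1] - t[0]
    power0 = np.sum(np.abs(u0) ** 2) * dt
    power = np.sum(np.abs(u) ** 2) * dt
    ```

    In the coupled (two-component) case the same construction applies per component; the switching statistics reported above then concern how the conserved total power redistributes between the components as the perturbation amplitude grows.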
