Reliability Estimation of Aero-engine Based on Mixed Weibull Distribution Model
NASA Astrophysics Data System (ADS)
Yuan, Zhongda; Deng, Junxiang; Wang, Dawei
2018-02-01
An aero-engine is a complex mechanical-electronic system, and the Weibull distribution model plays an irreplaceable role in the reliability analysis of such systems. To date, only the two-parameter and three-parameter Weibull distribution models have been widely used. Because of the diversity of engine failure modes, a single Weibull distribution model carries a large error. By contrast, a mixed Weibull distribution model can account for a variety of engine failure modes, making it a good statistical analysis model. In addition to the concept of a dynamic weight coefficient, a three-parameter correlation-coefficient optimization method is applied to enhance the Weibull distribution model and make the reliability estimate more accurate, thus greatly improving the precision of the mixed-distribution reliability model. All of this is advantageous for popularizing the Weibull distribution model in engineering applications.
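The mixed-model idea in this abstract can be sketched numerically: the mixed Weibull reliability function is a weighted sum of component survival functions. The weights, shape (β), and scale (η) values below are hypothetical placeholders, not parameters from the paper:

```python
import math

def weibull_reliability(t, beta, eta):
    """Survival function of a two-parameter Weibull: R(t) = exp(-(t/eta)**beta)."""
    return math.exp(-((t / eta) ** beta))

def mixed_weibull_reliability(t, weights, betas, etas):
    """Mixed Weibull reliability: a weighted sum of component survival functions."""
    return sum(w * weibull_reliability(t, b, e)
               for w, b, e in zip(weights, betas, etas))

# Hypothetical failure modes: an infant-mortality mode (beta < 1) mixed with
# a wear-out mode (beta > 1); the weights must sum to one.
weights = [0.3, 0.7]
betas = [0.8, 2.5]
etas = [500.0, 2000.0]    # characteristic lives, e.g. in hours

R_1000 = mixed_weibull_reliability(1000.0, weights, betas, etas)
```

Each component can then capture a different failure mode, which is what a single Weibull cannot do.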
Quang V. Cao; Shanna M. McCarty
2006-01-01
Diameter distributions in a forest stand have been successfully characterized by use of the Weibull function. Of special interest are cases where parameters of a Weibull distribution that models a future stand are predicted, either directly or indirectly, from current stand density and dominant height. This study evaluated four methods of predicting the Weibull...
Fisher information for two gamma frailty bivariate Weibull models.
Bjarnason, H; Hougaard, P
2000-03-01
The asymptotic properties of frailty models for multivariate survival data are not well understood. To study this aspect, the Fisher information is derived in the standard bivariate gamma frailty model, where the survival distribution is of Weibull form conditional on the frailty. For comparison, the Fisher information is also derived in the bivariate gamma frailty model, where the marginal distribution is of Weibull form.
Sequential Testing of Hypotheses Concerning the Reliability of a System Modeled by a Two-Parameter Weibull Distribution
Lussier, Philippe A.
1981-12-01
Thesis AFIT/GOR/MA/81D-8, presented by 2nd Lt Philippe A. Lussier, USAF, to the Faculty of the School of Engineering of the Air Force Institute of Technology. The work develops sequential test procedures for hypotheses concerning the reliability of a system modeled by a two-parameter Weibull distribution; repetitions are used for these test procedures.
Earthquakes: Recurrence and Interoccurrence Times
NASA Astrophysics Data System (ADS)
Abaimov, S. G.; Turcotte, D. L.; Shcherbakov, R.; Rundle, J. B.; Yakovlev, G.; Goltz, C.; Newman, W. I.
2008-04-01
The purpose of this paper is to discuss the statistical distributions of recurrence times of earthquakes. Recurrence times are the time intervals between successive earthquakes at a specified location on a specified fault. Although a number of statistical distributions have been proposed for recurrence times, we argue in favor of the Weibull distribution. The Weibull distribution is the only distribution that has a scale-invariant hazard function. We consider three sets of characteristic earthquakes on the San Andreas fault: (1) The Parkfield earthquakes, (2) the sequence of earthquakes identified by paleoseismic studies at the Wrightwood site, and (3) an example of a sequence of micro-repeating earthquakes at a site near San Juan Bautista. In each case we make a comparison with the applicable Weibull distribution. The number of earthquakes in each of these sequences is too small to make definitive conclusions. To overcome this difficulty we consider a sequence of earthquakes obtained from a one million year “Virtual California” simulation of San Andreas earthquakes. Very good agreement with a Weibull distribution is found. We also obtain recurrence statistics for two other model studies. The first is a modified forest-fire model and the second is a slider-block model. In both cases good agreements with Weibull distributions are obtained. Our conclusion is that the Weibull distribution is the preferred distribution for estimating the risk of future earthquakes on the San Andreas fault and elsewhere.
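The scale-invariant-hazard claim above is easy to verify numerically: the Weibull hazard is a pure power law, so rescaling time multiplies the hazard by a constant factor independent of t. A minimal sketch (the parameter values are hypothetical, not fitted to any catalog):

```python
def weibull_hazard(t, beta, eta):
    """Hazard rate h(t) = (beta/eta) * (t/eta)**(beta - 1): a pure power law,
    so the rescaling t -> c*t multiplies h by the constant c**(beta - 1)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

beta, eta = 1.5, 100.0   # hypothetical recurrence-time parameters (years)
ratio_small = weibull_hazard(20.0, beta, eta) / weibull_hazard(10.0, beta, eta)
ratio_large = weibull_hazard(100.0, beta, eta) / weibull_hazard(50.0, beta, eta)
```

Both ratios equal 2**(beta - 1), whatever the absolute times, which is the scale-invariance property the abstract refers to.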
Steve P. Verrill; James W. Evans; David E. Kretschmann; Cherilyn A. Hatfield
2014-01-01
Two important wood properties are the modulus of elasticity (MOE) and the modulus of rupture (MOR). In the past, the statistical distribution of the MOE has often been modeled as Gaussian, and that of the MOR as lognormal or as a two- or three-parameter Weibull distribution. It is well known that MOE and MOR are positively correlated. To model the simultaneous behavior...
Steve P. Verrill; Frank C. Owens; David E. Kretschmann; Rubin Shmulsky
2017-01-01
It is common practice to assume that a two-parameter Weibull probability distribution is suitable for modeling lumber properties. Verrill and co-workers demonstrated theoretically and empirically that the modulus of rupture (MOR) distribution of visually graded or machine stress rated (MSR) lumber is not distributed as a Weibull. Instead, the tails of the MOR...
The Effect of Roughness Model on Scattering Properties of Ice Crystals.
NASA Technical Reports Server (NTRS)
Geogdzhayev, Igor V.; Van Diedenhoven, Bastiaan
2016-01-01
We compare stochastic models of microscale surface roughness assuming uniform and Weibull distributions of crystal facet tilt angles to calculate scattering by roughened hexagonal ice crystals using the geometric optics (GO) approximation. Both distributions are determined by similar roughness parameters, while the Weibull model depends on an additional shape parameter. Calculations were performed at two visible wavelengths (864 nm and 410 nm) for roughness values between 0.2 and 0.7 and Weibull shape parameters between 0 and 1.0 for crystals with aspect ratios of 0.21, 1, and 4.8. For this range of parameters we find that, for a given roughness level, varying the Weibull shape parameter can change the asymmetry parameter by up to about 0.05. The largest effect of the shape parameter variation on the phase function is found in the backscattering region, while the degree of linear polarization is most affected at the side-scattering angles. For high roughness, scattering properties calculated using the uniform and Weibull models are in relatively close agreement for a given roughness parameter, especially when a Weibull shape parameter of 0.75 is used. For smaller roughness values, a shape parameter close to unity provides better agreement. Notable differences are observed in the phase function over the scattering angle range from 5° to 20°, where the uniform roughness model produces a plateau while the Weibull model does not.
NASA Technical Reports Server (NTRS)
Krantz, Timothy L.
2002-01-01
The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type 1 censoring. The software was verified by reproducing results published by others.
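The likelihood setup described (a two-parameter Weibull with type 1 censoring) can be sketched as follows. This is not the NASA software itself; it is a minimal illustration using `scipy.optimize`, where the simulated gear-fatigue-style data (true β = 2, η = 8, tests suspended at 10 megacycles) are assumptions for the demo:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, times, failed):
    """Negative log-likelihood for a two-parameter Weibull with type 1
    right-censoring: failures contribute log f(t), suspensions log S(t)."""
    beta, eta = params
    if beta <= 0 or eta <= 0:
        return np.inf
    z = times / eta
    log_f = np.log(beta / eta) + (beta - 1.0) * np.log(z) - z ** beta
    log_S = -(z ** beta)
    return -np.sum(np.where(failed, log_f, log_S))

# Simulated data (assumed): true beta = 2, eta = 8, censoring at 10 megacycles.
rng = np.random.default_rng(0)
raw = 8.0 * rng.weibull(2.0, size=200)
failed = raw < 10.0
times = np.minimum(raw, 10.0)

res = minimize(neg_log_lik, x0=[1.0, float(np.mean(times))],
               args=(times, failed), method="Nelder-Mead")
beta_hat, eta_hat = res.x
```

The fitted `beta_hat` and `eta_hat` should land near the generating values, which is the kind of result the verification step in the document checks against published values.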
Two-sided Topp-Leone Weibull distribution
NASA Astrophysics Data System (ADS)
Podeang, Krittaya; Bodhisuwan, Winai
2017-11-01
In this paper, we introduce a general class of lifetime distributions called the two-sided Topp-Leone generated family of distributions. A special case of the new family is the two-sided Topp-Leone Weibull distribution, which uses the two-sided Topp-Leone distribution as a generator for the Weibull distribution. The two-sided Topp-Leone Weibull distribution presents several distribution shapes, such as decreasing, unimodal, and bimodal, which make it more flexible than the Weibull distribution. Its quantile function is presented, and parameter estimation by maximum likelihood is discussed. The proposed distribution is applied to a strength data set, a data set of remission times of bladder cancer patients, and a time-to-failure data set for turbochargers, and is compared with the Topp-Leone generated Weibull distribution. In conclusion, the two-sided Topp-Leone Weibull distribution performs similarly to the Topp-Leone generated Weibull distribution on the first and second data sets, but fits better than the Topp-Leone generated Weibull distribution on the third.
Gomez-Lazaro, Emilio; Bueso, Maria C.; Kessler, Mathieu; ...
2016-02-02
Here, the Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on a single Weibull component can provide poor characterizations of aggregated wind power generation. The present paper therefore focuses on Weibull mixtures to characterize the probability density function (PDF) of aggregated wind power generation. PDFs of wind power data are first classified according to hourly and seasonal patterns. The selection of the number of components in the mixture is analyzed through two well-known criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable for characterizing aggregated wind power data due to the impact of distributed generation, the variety of wind speed values, and wind power curtailment.
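The component-selection step described (comparing Weibull mixtures by information criteria) can be sketched by fitting one- and two-component mixtures via direct likelihood maximization and comparing BIC values. The bimodal sample, starting values, and optimizer settings below are illustrative assumptions, not the paper's procedure:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Simulated bimodal "aggregated wind power"-style sample (assumed parameters).
rng = np.random.default_rng(1)
data = np.concatenate([
    weibull_min.rvs(1.8, scale=3.0, size=400, random_state=rng),
    weibull_min.rvs(6.0, scale=11.0, size=600, random_state=rng),
])

def nll_mix(params, x, k):
    """Negative log-likelihood of a k-component Weibull mixture.
    params = k weights, k shapes, k scales (weights are renormalized)."""
    w = np.abs(params[:k])
    w = w / w.sum()
    shapes = np.abs(params[k:2 * k])
    scales = np.abs(params[2 * k:3 * k])
    pdf = sum(wi * weibull_min.pdf(x, ci, scale=si)
              for wi, ci, si in zip(w, shapes, scales))
    return -np.sum(np.log(pdf + 1e-300))

def fit_bic(x, k, x0):
    """Fit by direct likelihood maximization and return the BIC."""
    res = minimize(nll_mix, x0, args=(x, k), method="Nelder-Mead",
                   options={"maxiter": 20000, "fatol": 1e-8, "xatol": 1e-8})
    n_par = 3 * k - 1                     # weights sum to one
    return 2.0 * res.fun + n_par * np.log(len(x))

bic1 = fit_bic(data, 1, [1.0, 2.0, 6.0])
bic2 = fit_bic(data, 2, [0.5, 0.5, 2.0, 6.0, 3.0, 10.0])
```

For clearly bimodal data the two-component BIC should be lower despite its larger parameter penalty, which is the pattern the abstract reports for aggregated wind power.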
Statistical distribution of mechanical properties for three graphite-epoxy material systems
NASA Technical Reports Server (NTRS)
Reese, C.; Sorem, J., Jr.
1981-01-01
Graphite-epoxy composites are playing an increasing role as viable alternative materials in structural applications, necessitating thorough investigation into the predictability and reproducibility of their material strength properties. This investigation was concerned with tension, compression, and short beam shear coupon testing of large samples from three different material suppliers to determine their statistical strength behavior. Statistical results indicate that a two-parameter Weibull distribution model provides better overall characterization of material behavior for the graphite-epoxy systems tested than does the standard Normal distribution model that is employed for most design work. While either a Weibull or Normal distribution model provides adequate predictions for average strength values, the Weibull model provides better characterization in the lower tail region, where the predictions are of maximum design interest. The two sets of the same material were found to have essentially the same material properties, indicating that repeatability can be achieved.
An EOQ Model with Two-Parameter Weibull Distribution Deterioration and Price-Dependent Demand
ERIC Educational Resources Information Center
Mukhopadhyay, Sushanta; Mukherjee, R. N.; Chaudhuri, K. S.
2005-01-01
An inventory replenishment policy is developed for a deteriorating item and price-dependent demand. The rate of deterioration is taken to be time-proportional and the time to deterioration is assumed to follow a two-parameter Weibull distribution. A power law form of the price dependence of demand is considered. The model is solved analytically…
Aerospace Applications of Weibull and Monte Carlo Simulation with Importance Sampling
NASA Technical Reports Server (NTRS)
Bavuso, Salvatore J.
1998-01-01
Recent developments in reliability modeling and computer technology have made it practical to use the Weibull time to failure distribution to model the system reliability of complex fault-tolerant computer-based systems. These system models are becoming increasingly popular in space systems applications as a result of mounting data that support the decreasing Weibull failure distribution and the expectation of increased system reliability. This presentation introduces the new reliability modeling developments and demonstrates their application to a novel space system application. The application is a proposed guidance, navigation, and control (GN&C) system for use in a long duration manned spacecraft for a possible Mars mission. Comparisons to the constant failure rate model are presented and the ramifications of doing so are discussed.
Distributions of Autocorrelated First-Order Kinetic Outcomes: Illness Severity
Englehardt, James D.
2015-01-01
Many complex systems produce outcomes having recurring, power law-like distributions over wide ranges. However, the form necessarily breaks down at extremes, whereas the Weibull distribution has been demonstrated over the full observed range. Here the Weibull distribution is derived as the asymptotic distribution of generalized first-order kinetic processes, with convergence driven by autocorrelation, and entropy maximization subject to finite positive mean, of the incremental compounding rates. Process increments represent multiplicative causes. In particular, illness severities are modeled as such, occurring in proportion to products of, e.g., chronic toxicant fractions passed by organs along a pathway, or rates of interacting oncogenic mutations. The Weibull form is also argued theoretically and by simulation to be robust to the onset of saturation kinetics. The Weibull exponential parameter is shown to indicate the number and widths of the first-order compounding increments, the extent of rate autocorrelation, and the degree to which process increments are distributed exponential. In contrast with the Gaussian result in linear independent systems, the form is driven not by independence and multiplicity of process increments, but by increment autocorrelation and entropy. In some physical systems the form may be attracting, due to multiplicative evolution of outcome magnitudes towards extreme values potentially much larger and smaller than control mechanisms can contain. The Weibull distribution is demonstrated in preference to the lognormal and Pareto I for illness severities versus (a) toxicokinetic models, (b) biologically-based network models, (c) scholastic and psychological test score data for children with prenatal mercury exposure, and (d) time-to-tumor data of the ED01 study. PMID:26061263
On the Distribution of Earthquake Interevent Times and the Impact of Spatial Scale
NASA Astrophysics Data System (ADS)
Hristopulos, Dionissios
2013-04-01
The distribution of earthquake interevent times is a subject that has attracted much attention in the statistical physics literature [1-3]. A recent paper proposes that the distribution of earthquake interevent times follows from the interplay of the crustal strength distribution and the loading function (stress versus time) of the Earth's crust locally [4]. It was also shown that the Weibull distribution describes earthquake interevent times provided that the crustal strength also follows the Weibull distribution and that the loading function follows a power law during the loading cycle. I will discuss the implications of this work and will present supporting evidence based on the analysis of data from seismic catalogs. I will also discuss the theoretical evidence in support of the Weibull distribution based on models of statistical physics [5]. Since other-than-Weibull interevent time distributions are not excluded in [4], I will illustrate the use of the Kolmogorov-Smirnov test in order to determine which probability distributions are not rejected by the data. Finally, we propose a modification of the Weibull distribution if the size of the system under investigation (i.e., the area over which the earthquake activity occurs) is finite with respect to a critical link size. keywords: hypothesis testing, modified Weibull, hazard rate, finite size References [1] Corral, A., 2004. Long-term clustering, scaling, and universality in the temporal occurrence of earthquakes, Phys. Rev. Lett., 92(10), art. no. 108501. [2] Saichev, A., Sornette, D., 2007. Theory of earthquake recurrence times, J. Geophys. Res., Ser. B 112, B04313/1-26. [3] Touati, S., Naylor, M., Main, I.G., 2009. Origin and nonuniversality of the earthquake interevent time distribution, Phys. Rev. Lett., 102(16), art. no. 168501. [4] Hristopulos, D.T., 2003. Spartan Gibbs random field models for geostatistical applications, SIAM Jour. Sci. Comput., 24, 2125-2162. [5] Eliazar, I., Klafter, J., 2006. Growth-collapse and decay-surge evolutions, and geometric Langevin equations, Physica A, 367, 106-128.
NASA Astrophysics Data System (ADS)
Santi, D. N.; Purnaba, I. G. P.; Mangku, I. W.
2016-01-01
A Bonus-Malus system is said to be optimal if it is financially balanced for the insurance company and fair to policyholders. Previous research on Bonus-Malus systems concerned determination of the risk premium applied to the full severity guaranteed by the insurance company. In practice, not all of the severity claimed by a policyholder is covered by the insurance company: when the company sets a maximum bound on the severity incurred, the severity distribution must be modified into a bounded severity distribution. In this paper, an optimal Bonus-Malus system is discussed whose claim frequency component follows a geometric distribution and whose severity component follows a truncated Weibull distribution. The number of claims is assumed to follow a Poisson distribution whose expected number λ is exponentially distributed, so that the number of claims has a geometric distribution. The severity, given a parameter θ, is assumed to follow a truncated exponential distribution, with θ modelled using the Lévy distribution, so that the severity follows a truncated Weibull distribution.
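The claim-frequency construction cited above (Poisson counts whose mean λ is exponentially distributed yield a geometric distribution) follows from a standard mixing integral; a sketch, writing θ for the rate of the exponential mixing density:

```latex
P(N = n) = \int_0^\infty \frac{\lambda^n e^{-\lambda}}{n!}\,\theta e^{-\theta\lambda}\,d\lambda
         = \frac{\theta}{n!}\int_0^\infty \lambda^n e^{-(1+\theta)\lambda}\,d\lambda
         = \frac{\theta}{n!}\cdot\frac{n!}{(1+\theta)^{n+1}}
         = \frac{\theta}{1+\theta}\left(\frac{1}{1+\theta}\right)^{n},
\qquad n = 0, 1, 2, \ldots
```

i.e., a geometric distribution with success probability θ/(1+θ).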
A Model Based on Environmental Factors for Diameter Distribution in Black Wattle in Brazil
Sanquetta, Carlos Roberto; Behling, Alexandre; Dalla Corte, Ana Paula; Péllico Netto, Sylvio; Rodrigues, Aurelio Lourenço; Simon, Augusto Arlindo
2014-01-01
This article discusses the dynamics of the diameter distribution in stands of black wattle throughout its growth cycle using the Weibull probability density function. Moreover, the parameters of this distribution were related to environmental variables from meteorological data and the surface soil horizon, with the aim of finding a model for the diameter distribution whose coefficients are related to the environmental variables. We found that the diameter distribution of the stand changes only slightly over time and that the estimators of the Weibull function are correlated with various environmental variables, with accumulated rainfall foremost among them. Thus, a model was obtained in which the estimators of the Weibull function are dependent on rainfall. Such a function can have important applications, such as simulating growth potential in regions where historical growth data are lacking, as well as the behavior of the stand under different environmental conditions. The model can also be used to project growth in diameter, based on the rainfall affecting the forest over a certain time period. PMID:24932909
Mixture distributions of wind speed in the UAE
NASA Astrophysics Data System (ADS)
Shin, J.; Ouarda, T.; Lee, T. S.
2013-12-01
Wind speed probability distributions are commonly used to estimate potential wind energy. The 2-parameter Weibull distribution has been most widely used to characterize the distribution of wind speed. However, it is unable to properly model wind speed regimes whose distributions present bimodal and kurtotic shapes, and several studies have concluded that the Weibull distribution should not be used for frequency analysis of wind speed without investigation of the wind speed distribution. Given these mixture distributional characteristics of wind speed data, the application of mixture distributions should be further investigated in the frequency analysis of wind speed. A number of studies have investigated the potential wind energy in different parts of the Arabian Peninsula, and mixture distributional characteristics of wind speed were detected in some of them. Nevertheless, mixture distributions have not been employed for wind speed modeling in the Arabian Peninsula. In order to improve our understanding of wind energy potential in the Arabian Peninsula, mixture distributions should be tested for the frequency analysis of wind speed. The aim of the current study is to assess the suitability of mixture distributions for the frequency analysis of wind speed in the UAE. Hourly mean wind speed data at 10-m height from 7 stations were used in the current study. The Weibull and Kappa distributions were employed as representatives of the conventional non-mixture distributions. Ten mixture distributions were constructed by mixing four probability distributions: the Normal, Gamma, Weibull, and extreme value type-one (EV-1) distributions. Three parameter estimation methods, the Expectation-Maximization algorithm, the Least Squares method, and the Meta-Heuristic Maximum Likelihood (MHML) method, were employed to estimate the parameters of the mixture distributions.
In order to compare the goodness-of-fit of the tested distributions and parameter estimation methods on the sample wind data, the adjusted coefficient of determination, the Bayesian Information Criterion (BIC), and the Chi-squared statistic were computed. Results indicate that MHML presents the best parameter estimation performance for the mixture distributions used. At most of the 7 stations employed, mixture distributions give the best fit. When a wind speed regime shows mixture distributional characteristics, it usually also presents kurtosis; for these stations in particular, applications of mixture distributions show a significant improvement in explaining the whole wind speed regime. In addition, the Weibull-Weibull mixture distribution presents the best fit for the wind speed data in the UAE.
Statistical wind analysis for near-space applications
NASA Astrophysics Data System (ADS)
Roney, Jason A.
2007-09-01
Statistical wind models were developed based on the existing observational wind data for near-space altitudes between 60 000 and 100 000 ft (18-30 km) above ground level (AGL) at two locations, Akron, OH, USA, and White Sands, NM, USA. These two sites are envisioned as playing a crucial role in the first flights of high-altitude airships. The analysis shown in this paper has not been previously applied to this region of the stratosphere for such an application. Standard statistics were compiled for these data, such as mean, median, maximum wind speed, and standard deviation, and the data were modeled with Weibull distributions. These statistics indicated that, on a yearly average, there is a lull or “knee” in the wind between 65 000 and 72 000 ft AGL (20-22 km). From the standard statistics, trends at both locations indicated substantial seasonal variation in the mean wind speed at these heights. The yearly and monthly statistical modeling indicated that Weibull distributions were a reasonable model for the data. Forecasts and hindcasts were done by using a Weibull model based on 2004 data and comparing the model with the 2003 and 2005 data; the 2004 distribution was also a reasonable model for these years. Lastly, the Weibull distribution and cumulative function were used to predict the 50%, 95%, and 99% winds, which are directly related to the expected power requirements of a near-space station-keeping airship. These values indicated that using only the standard deviation of the mean may underestimate the operational conditions.
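The 50%, 95%, and 99% wind predictions described above come directly from the Weibull inverse CDF, v_p = λ(-ln(1-p))^(1/k). A minimal sketch, where the shape and scale values are hypothetical placeholders rather than the paper's fitted parameters:

```python
import math

def weibull_quantile(p, shape, scale):
    """Inverse Weibull CDF: v_p = scale * (-ln(1 - p)) ** (1 / shape)."""
    return scale * (-math.log(1.0 - p)) ** (1.0 / shape)

# Hypothetical near-space wind fit (assumed values): shape k, scale lam in m/s.
k, lam = 1.6, 12.0
v50, v95, v99 = (weibull_quantile(p, k, lam) for p in (0.50, 0.95, 0.99))
```

Note that at p = 1 - exp(-1) the quantile equals the scale parameter exactly, a convenient sanity check on any implementation.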
Scaling in the distribution of intertrade durations of Chinese stocks
NASA Astrophysics Data System (ADS)
Jiang, Zhi-Qiang; Chen, Wei; Zhou, Wei-Xing
2008-10-01
The distribution of intertrade durations, defined as the waiting times between two consecutive transactions, is investigated based upon the limit order book data of 23 liquid Chinese stocks listed on the Shenzhen Stock Exchange in the whole year 2003. A scaling pattern is observed in the distributions of intertrade durations, where the empirical density functions of the normalized intertrade durations of all 23 stocks collapse onto a single curve. The scaling pattern is also observed in the intertrade duration distributions for filled and partially filled trades and in the conditional distributions. The ensemble distributions for all stocks are modeled by the Weibull and the Tsallis q-exponential distributions. Maximum likelihood estimation shows that the Weibull distribution outperforms the q-exponential for not-too-large intertrade durations which account for more than 98.5% of the data. Alternatively, nonlinear least-squares estimation selects the q-exponential as a better model, in which the optimization is conducted on the distance between empirical and theoretical values of the logarithmic probability densities. The distribution of intertrade durations is Weibull followed by a power-law tail with an asymptotic tail exponent close to 3.
On alternative q-Weibull and q-extreme value distributions: Properties and applications
NASA Astrophysics Data System (ADS)
Zhang, Fode; Ng, Hon Keung Tony; Shi, Yimin
2018-01-01
Tsallis statistics and Tsallis distributions have been attracting a significant amount of research work in recent years. Importantly, Tsallis q-distributions have been applied in different disciplines. Yet, a relationship between the existing q-Weibull and q-extreme value distributions that parallels the well-established relationship between the conventional Weibull and extreme value distributions through a logarithmic transformation has not been established. In this paper, we propose an alternative q-Weibull distribution that leads to a q-extreme value distribution via the q-logarithm transformation. Some important properties of the proposed q-Weibull and q-extreme value distributions are studied. Maximum likelihood and least squares estimation methods are used to estimate the parameters of the q-Weibull distribution, and their performance is investigated through a Monte Carlo simulation study. The methodologies and the usefulness of the proposed distributions are illustrated by fitting the 2014 traffic fatalities data from the National Highway Traffic Safety Administration.
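For reference, the classical relationship this paper parallels is that if T follows a Weibull distribution, then ln T follows a (minimum) extreme value distribution; the q-analogue replaces ln with the standard Tsallis q-logarithm (how the paper deploys it beyond this definition is not stated in the abstract):

```latex
\ln_q(x) = \frac{x^{1-q} - 1}{1 - q} \quad (q \neq 1), \qquad \lim_{q \to 1} \ln_q(x) = \ln(x)
```

Taking q → 1 recovers the ordinary logarithm, and with it the conventional Weibull-to-extreme-value transformation.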
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com; Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr
In this study we examined and compared three different probabilistic distribution methods to determine the most suitable model for probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 and magnitudes M ≥ 6.0, and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distribution methods, namely the Weibull distribution, the Frechet distribution, and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of earthquake occurrence for different elapsed times using these three distribution methods. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution method was more suitable than the other distribution methods in this region.
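The model-comparison step can be sketched with `scipy.stats`: fit each candidate family by maximum likelihood and compute the K-S statistic against the fitted CDF. The simulated sample below stands in for the earthquake catalogue, and mapping the Frechet distribution onto `scipy`'s `invweibull` is an assumption of this sketch:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Simulated stand-in for catalogue interevent data (assumed parameters).
sample = stats.weibull_min.rvs(1.4, loc=2.0, scale=10.0, size=300,
                               random_state=rng)

candidates = {
    "Weibull (2-param)": (stats.weibull_min, {"floc": 0.0}),
    "Frechet": (stats.invweibull, {}),
    "Weibull (3-param)": (stats.weibull_min, {}),
}

results = {}
for name, (dist, fit_kwargs) in candidates.items():
    params = dist.fit(sample, **fit_kwargs)          # maximum likelihood fit
    d_stat, _ = stats.kstest(sample, dist.cdf, args=params)
    results[name] = d_stat                           # smaller D = better fit
```

Here the three-parameter Weibull, being the generating family, should produce the smallest K-S statistic; on real catalogue data the ranking is of course an empirical question.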
Weibull mixture regression for marginal inference in zero-heavy continuous outcomes.
Gebregziabher, Mulugeta; Voronca, Delia; Teklehaimanot, Abeba; Santa Ana, Elizabeth J
2017-06-01
Continuous outcomes with a preponderance of zero values are ubiquitous in data that arise from biomedical studies, for example studies of addictive disorders. This is known to lead to violation of standard assumptions in parametric inference and enhances the risk of misleading conclusions unless managed properly. Two-part models are commonly used to deal with this problem. However, standard two-part models have limitations with respect to obtaining parameter estimates that have a marginal interpretation of covariate effects, which is important in many biomedical applications. Recently, marginalized two-part models have been proposed, but their development is limited to log-normal and log-skew-normal distributions. Thus, in this paper, we propose a finite mixture approach, with Weibull mixture regression as a special case, to deal with the problem. We use an extensive simulation study to assess the performance of the proposed model in finite samples and to make comparisons with other families of models via statistical information and mean squared error criteria. We demonstrate its application on real data from a randomized controlled trial of addictive disorders. Our results show that a two-component Weibull mixture model is preferred for modeling zero-heavy continuous data when the non-zero part is generated from a Weibull or similar distribution such as the Gamma or truncated Gaussian.
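A minimal sketch of the two-part idea for zero-heavy data: estimate the zero mass and fit a Weibull to the positive part, then combine them into a marginal summary. The simulated data and the simple plug-in marginal mean are illustrative assumptions, not the paper's marginalized mixture regression:

```python
import numpy as np
from math import gamma
from scipy import stats

rng = np.random.default_rng(3)
# Simulated zero-heavy outcome (assumed): 40% structural zeros,
# positives drawn from Weibull(shape=1.5, scale=4).
n = 1000
is_zero = rng.random(n) < 0.4
positives = stats.weibull_min.rvs(1.5, scale=4.0, size=n, random_state=rng)
y = np.where(is_zero, 0.0, positives)

# Part 1: probability mass at zero.  Part 2: Weibull fit to the positives.
p_zero = float(np.mean(y == 0.0))
shape, _, scale = stats.weibull_min.fit(y[y > 0], floc=0.0)

# Plug-in marginal mean: E[Y] = (1 - p_zero) * scale * Gamma(1 + 1/shape).
marginal_mean = (1.0 - p_zero) * scale * gamma(1.0 + 1.0 / shape)
```

The marginalized models in the paper parameterize this overall mean directly so that covariate effects keep a marginal interpretation; the sketch only shows the two ingredients being combined.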
Use of the Weibull function to predict future diameter distributions from current plot data
Quang V. Cao
2012-01-01
The Weibull function has been widely used to characterize diameter distributions in forest stands. The future diameter distribution of a forest stand can be predicted by use of a Weibull probability density function from current inventory data for that stand. The parameter recovery approach has been used to “recover” the Weibull parameters from diameter moments or...
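One common flavour of the "parameter recovery" approach mentioned above recovers shape and scale from the first two diameter moments, using the fact that the squared coefficient of variation of a Weibull depends on the shape k alone. A sketch (the stand values are hypothetical):

```python
from math import gamma
from scipy.optimize import brentq

def recover_weibull(mean, var):
    """Recover Weibull (shape k, scale lam) from the first two moments:
    CV^2 = Gamma(1 + 2/k) / Gamma(1 + 1/k)**2 - 1 depends on k alone,
    so solve for k first, then back out the scale from the mean."""
    cv2 = var / mean ** 2
    def f(k):
        return gamma(1.0 + 2.0 / k) / gamma(1.0 + 1.0 / k) ** 2 - 1.0 - cv2
    shape = brentq(f, 0.1, 50.0)       # bracket covers realistic stand shapes
    scale = mean / gamma(1.0 + 1.0 / shape)
    return shape, scale

# Round-trip check with hypothetical stand parameters (k = 2.2, lam = 18 cm).
k_true, lam_true = 2.2, 18.0
mu = lam_true * gamma(1.0 + 1.0 / k_true)
var = lam_true ** 2 * (gamma(1.0 + 2.0 / k_true)
                       - gamma(1.0 + 1.0 / k_true) ** 2)
k_hat, lam_hat = recover_weibull(mu, var)
```

In practice the moments of the future stand would be predicted from stand density and dominant height, then the parameters recovered from those predicted moments.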
A Weibull distribution accrual failure detector for cloud computing.
Liu, Jiaxi; Wu, Zhibo; Wu, Jin; Dong, Jian; Zhao, Yao; Wen, Dongxin
2017-01-01
Failure detectors are a fundamental component for building high-availability distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, is proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions of cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared based on public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing.
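The accrual idea can be sketched as scoring suspicion by how improbable the current heartbeat silence is under a fitted Weibull. The -log10-of-survival score and the parameter values below are illustrative assumptions about the detector's design, not the paper's algorithm:

```python
import math

class WeibullAccrualDetector:
    """Sketch of an accrual-style detector: suspicion is -log10 of the
    Weibull survival probability of the time elapsed since the last
    heartbeat.  Parameters would be fit to observed heartbeat gaps."""

    def __init__(self, shape, scale):
        self.shape, self.scale = shape, scale

    def suspicion(self, elapsed):
        survival = math.exp(-((elapsed / self.scale) ** self.shape))
        return -math.log10(max(survival, 1e-300))  # guard against log10(0)

# Hypothetical heartbeat-interval fit: shape 1.2, scale 0.5 s.
det = WeibullAccrualDetector(shape=1.2, scale=0.5)
phi_short = det.suspicion(0.1)   # just after a heartbeat: low suspicion
phi_long = det.suspicion(5.0)    # long silence: high suspicion
```

An application then crashes-suspects a node once the suspicion crosses an application-chosen threshold, which is what makes accrual detectors adaptable to multiple applications at once.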
The distribution of first-passage times and durations in FOREX and future markets
NASA Astrophysics Data System (ADS)
Sazuka, Naoya; Inoue, Jun-ichi; Scalas, Enrico
2009-07-01
Possible distributions are discussed for intertrade durations and first-passage processes in financial markets. The viewpoint of renewal theory is assumed. In order to represent market data with relatively long durations, two types of distributions are used, namely a distribution derived from the Mittag-Leffler survival function and the Weibull distribution. For the Mittag-Leffler type distribution, the average waiting time (residual life time) is strongly dependent on the choice of a cut-off parameter t_max, whereas the results based on the Weibull distribution do not depend on such a cut-off. Therefore, a Weibull distribution is more convenient than a Mittag-Leffler type if one wishes to evaluate relevant statistics, such as the average waiting time, in financial markets with long durations. On the other hand, we find that the Gini index is rather independent of the cut-off parameter. Based on the above considerations, we propose a good candidate for describing the distribution of first-passage times in a market: the Weibull distribution with a power-law tail. This distribution bridges the gap between theoretical and empirical results more effectively than a simple Weibull distribution. It should be stressed that a Weibull distribution with a power-law tail is more flexible than the Mittag-Leffler distribution, which itself can be approximated by a Weibull distribution and a power law. Indeed, the key point is that in the former case there is freedom of choice for the exponent of the power law attached to the Weibull distribution, which can exceed 1 in order to reproduce decays faster than are possible with a Mittag-Leffler distribution. We also give a useful formula to determine an optimal crossover point minimizing the difference between the empirical average waiting time and the one predicted from renewal theory.
Moreover, we discuss the limitations of our distributions by applying them to the analysis of the BTP future and calculating the average waiting time. We find that our distribution is applicable as long as durations follow a Weibull law for short times and do not have too heavy a tail.
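A sketch of the proposed "Weibull with a power-law tail" survival function: Weibull below a crossover point, power law above it, matched to be continuous at the crossover. The parameter values (shape, scale, crossover, exponent) are illustrative assumptions, not fitted values from the paper.

```python
import math

# Weibull body below t_c, power-law tail above it (all values illustrative).
m, a = 0.6, 30.0          # Weibull shape and scale
t_c, alpha = 120.0, 2.5   # crossover point and power-law tail exponent

S_c = math.exp(-((t_c / a) ** m))   # Weibull survival at the crossover

def survival(t):
    if t <= t_c:
        return math.exp(-((t / a) ** m))
    # Power-law tail glued continuously onto the Weibull body at t_c;
    # alpha is free, so decays faster than Mittag-Leffler are reachable.
    return S_c * (t / t_c) ** (-alpha)

print(survival(t_c), survival(2 * t_c))
```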
A log-Weibull spatial scan statistic for time to event data.
Usman, Iram; Rosychuk, Rhonda J
2018-06-13
Spatial scan statistics have been used for the identification of geographic clusters of elevated numbers of cases of a condition, such as disease outbreaks. These statistics, accompanied by the appropriate distribution, can also identify geographic areas with either longer or shorter times to events. Other authors have proposed spatial scan statistics based on the exponential and Weibull distributions. We propose the log-Weibull as an alternative distribution for the spatial scan statistic for time-to-event data and compare and contrast the log-Weibull and Weibull distributions through simulation studies. The effect of type I differential censoring and the power of the test have been investigated through simulated data. Methods are also illustrated on time-to-specialist-visit data for discharged patients presenting to emergency departments for atrial fibrillation and flutter in Alberta during 2010-2011. We found that northern regions of Alberta had longer times to specialist visit than other areas. We propose the spatial scan statistic for the log-Weibull distribution as a new approach for detecting spatial clusters in time-to-event data. The simulation studies suggest that the test performs well for log-Weibull data.
NASA Astrophysics Data System (ADS)
Nadarajah, Saralees; Kotz, Samuel
2007-04-01
Various q-type distributions have appeared in the physics literature in recent years, see e.g. L.C. Malacarne, R.S. Mendes, E.K. Lenzi, q-exponential distribution in urban agglomeration, Phys. Rev. E 65 (2002) 017106; S.M.D. Queiros, On a possible dynamical scenario leading to a generalised Gamma distribution, xxx.lanl.gov-physics/0411111; U.M.S. Costa, V.N. Freire, L.C. Malacarne, R.S. Mendes, S. Picoli Jr., E.A. de Vasconcelos, E.F. da Silva Jr., An improved description of the dielectric breakdown in oxides based on a generalized Weibull distribution, Physica A 361 (2006) 215; S. Picoli Jr., R.S. Mendes, L.C. Malacarne, q-exponential, Weibull, and q-Weibull distributions: an empirical analysis, Physica A 324 (2003) 678-688; A.M.C. de Souza, C. Tsallis, Student's t- and r-distributions: unified derivation from an entropic variational principle, Physica A 236 (1997) 52-57. It is pointed out in the paper that many of these are the same as, or particular cases of, distributions long known in the statistics literature. Several of these statistical distributions are discussed and references provided. We feel that this paper could be of assistance for modeling problems of the type considered in the works cited above and others.
Hirose, H
1997-01-01
This paper proposes a new treatment for electrical insulation degradation. Some types of insulation that have been used under various circumstances are considered to degrade at various rates in accordance with their stress circumstances. The cross-linked polyethylene (XLPE) insulated cables inspected by major Japanese electric companies clearly indicate such phenomena. By assuming that the inspected specimen is sampled from one of the clustered groups, a mixed degradation model can be constructed. Since the degradation of insulation under common circumstances is considered to follow a Weibull distribution, a mixture model and a Weibull power law can be combined; this is called the mixture Weibull power law model. Applying maximum likelihood estimation for the newly proposed model to Japanese 22 and 33 kV insulation class cables, the cables are clustered into a certain number of groups using the AIC and the generalized likelihood ratio test method. The reliability of the cables at specified years is then assessed.
Wang, Mingyu; Han, Lijuan; Liu, Shasha; Zhao, Xuebing; Yang, Jinghua; Loh, Soh Kheang; Sun, Xiaomin; Zhang, Chenxi; Fang, Xu
2015-09-01
Renewable energy from lignocellulosic biomass has been deemed an alternative to depleting fossil fuels. In order to improve this technology, we aim to develop robust mathematical models for the enzymatic lignocellulose degradation process. By analyzing 96 groups of previously published and newly obtained lignocellulose saccharification results and fitting them to the Weibull distribution, we discovered that Weibull statistics can accurately predict lignocellulose saccharification data, regardless of the type of substrate, enzyme and saccharification conditions. A mathematical model for enzymatic lignocellulose degradation was subsequently constructed based on Weibull statistics. Further analysis of the mathematical structure of the model and of experimental saccharification data showed the significance of the two parameters in this model. In particular, the λ value, defined as the characteristic time, represents the overall performance of the saccharification system. This suggestion was further supported by statistical analysis of experimental saccharification data and analysis of the glucose production levels when the λ and n values change. In conclusion, the constructed Weibull statistics-based model can accurately predict lignocellulose hydrolysis behavior, and the λ parameter can be used to assess the overall performance of enzymatic lignocellulose degradation. Advantages and potential applications of the model and the λ value in saccharification performance assessment are discussed. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
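A sketch of the Weibull-statistics form such a saccharification model takes, conversion(t) = y_max * (1 - exp(-(t/λ)^n)); the y_max, λ and n values below are illustrative, not the paper's fits. At t = λ the curve reaches a fixed fraction (1 - 1/e) of its maximum, which is why λ alone summarizes the overall speed of the system.

```python
import math

# Illustrative parameters: max yield, characteristic time (h), shape.
y_max, lam, n = 0.85, 24.0, 0.9

def conversion(t):
    """Weibull-statistics saccharification curve (stretched exponential)."""
    return y_max * (1.0 - math.exp(-((t / lam) ** n)))

# Regardless of n, conversion(lam)/y_max = 1 - 1/e, so lam marks the
# time at which ~63.2% of the achievable yield has been reached.
print(round(conversion(lam) / y_max, 4))  # -> 0.6321
```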
pT spectra in pp and AA collisions at RHIC and LHC energies using the Tsallis-Weibull approach
NASA Astrophysics Data System (ADS)
Dash, Sadhana; Mahapatra, D. P.
2018-04-01
The Tsallis q-statistics have been incorporated into the Weibull model of particle production, in the form of the q-Weibull distribution, to describe the transverse momentum (pT) distribution of charged hadrons at mid-rapidity measured at RHIC and LHC energies. The q-Weibull distribution is found to describe the observed pT distributions over all ranges of measured pT. Below 2.2 GeV/c, while going from peripheral to central collisions, the parameter q is found to decrease systematically towards unity, indicating an evolution from a non-equilibrated system in peripheral collisions towards a more thermalized system in central collisions. However, the trend is reversed in the all-inclusive pT regime. This can be attributed to an increase in the relative contribution of hard pQCD processes in central collisions. The λ parameter is found to be associated with the mean pT or the collective expansion velocity of the produced hadrons, which shows an expected increase with the centrality of collisions. The k parameter is observed to increase with the onset of hard QCD scatterings, initial fluctuations, and other processes leading to non-equilibrium conditions.
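A sketch of the q-deformation behind the q-Weibull form: the Weibull exponential factor is replaced by the Tsallis q-exponential e_q(x) = (1 + (1 - q)x)^(1/(1-q)), which reduces to exp(x) as q → 1 and produces a power-law tail for q > 1. The λ, k and pT values below are illustrative, not fitted spectra parameters.

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential; recovers exp(x) in the q -> 1 limit."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def q_weibull_tail(pT, q, lam, k):
    # Survival-like Weibull factor with exp replaced by e_q;
    # q > 1 lifts the tail into a power law (illustrative values below).
    return q_exp(-((pT / lam) ** k), q)

# q -> 1 recovers the ordinary Weibull factor; q = 1.1 gives a heavier tail.
print(round(q_weibull_tail(2.0, 1.0, 1.5, 1.2), 6),
      round(q_weibull_tail(2.0, 1.1, 1.5, 1.2), 6))
```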
NASA Astrophysics Data System (ADS)
Cianciara, Aleksander
2016-09-01
The paper presents the results of research aimed at verifying the hypothesis that the Weibull distribution is an appropriate statistical model of microseismic emission characteristics, namely the energy of phenomena and the inter-event time. It is understood that the emission under consideration is induced by natural rock mass fracturing. Because the recorded emission contains noise, it is subjected to appropriate filtering. The study has been conducted using the method of statistical verification of the null hypothesis that the Weibull distribution fits the empirical cumulative distribution function. As the model describing the cumulative distribution function is given in analytical form, its verification may be performed using the Kolmogorov-Smirnov goodness-of-fit test. Interpretations by means of probabilistic methods require specifying the correct model describing the statistical distribution of data, because in these methods measurement data are not used directly but through their statistical distributions, e.g., in the method based on hazard analysis or in that which uses maximum value statistics.
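A minimal sketch of the one-sample Kolmogorov-Smirnov statistic against a Weibull CDF: the maximum gap between the empirical and model CDFs. For simplicity the Weibull parameters are taken as known and the sample is simulated from the same model; in the study they would be estimated from filtered emission data.

```python
import math
import random

random.seed(1)

# Assumed-known Weibull model (illustrative) and a sample drawn from it.
shape, scale = 1.8, 3.0
sample = sorted(scale * (-math.log(random.random())) ** (1.0 / shape)
                for _ in range(500))

def weibull_cdf(x):
    return 1.0 - math.exp(-((x / scale) ** shape))

# KS statistic: largest deviation between empirical and model CDFs,
# checked on both sides of each order statistic.
n = len(sample)
d_stat = max(max((i + 1) / n - weibull_cdf(x), weibull_cdf(x) - i / n)
             for i, x in enumerate(sample))
print(round(d_stat, 4))  # small D: the empirical CDF tracks the model
```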
2012-01-01
Background The goals of our study are to determine the most appropriate model for alcohol consumption as an exposure for burden of disease, to analyze the effect of the chosen alcohol consumption distribution on the estimation of the alcohol Population-Attributable Fractions (PAFs), and to characterize the chosen alcohol consumption distribution by exploring whether there is a global relationship within the distribution. Methods To identify the best model, the Log-Normal, Gamma, and Weibull prevalence distributions were examined using data from 41 surveys from Gender, Alcohol and Culture: An International Study (GENACIS) and from the European Comparative Alcohol Study. To assess the effect of these distributions on the estimated alcohol PAFs, we calculated the alcohol PAF for diabetes, breast cancer, and pancreatitis using the three above-named distributions and using the more traditional approach based on categories. The relationship between the mean and the standard deviation of the Gamma distribution was estimated using data from 851 datasets for 66 countries from GENACIS and from the STEPwise approach to Surveillance from the World Health Organization. Results The Log-Normal distribution provided a poor fit for the survey data, with the Gamma and Weibull distributions providing better fits. Additionally, our analyses showed that there were no marked differences for the alcohol PAF estimates based on the Gamma or Weibull distributions compared to PAFs based on categorical alcohol consumption estimates. The standard deviation of the alcohol distribution was highly dependent on the mean, with a one-unit increase in mean consumption associated with an increase in the standard deviation of 1.258 (95% CI: 1.223 to 1.293) (R2 = 0.9207) for women and 1.171 (95% CI: 1.144 to 1.197) (R2 = 0.9474) for men.
Conclusions Although the Gamma distribution and the Weibull distribution provided similar results, the Gamma distribution is recommended to model alcohol consumption from population surveys due to its fit, flexibility, and the ease with which it can be modified. The results showed that a large degree of variance of the standard deviation of the alcohol consumption Gamma distribution was explained by the mean alcohol consumption, allowing for alcohol consumption to be modeled through a Gamma distribution using only average consumption. PMID:22490226
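A sketch of what "modeling alcohol consumption through a Gamma distribution using only average consumption" could look like: assuming the reported coefficient relates mean to standard deviation (SD ≈ 1.258 × mean for women), the Gamma shape and scale follow by the method of moments. The mean value used is illustrative.

```python
# Illustrative average consumption; SD predicted from the mean using the
# reported coefficient for women (an assumption about the relationship).
mean = 8.0            # e.g. grams/day
sd = 1.258 * mean

# Method of moments for the Gamma distribution:
#   shape = (mean / sd)^2,  scale = sd^2 / mean
shape = (mean / sd) ** 2
scale = sd ** 2 / mean
print(round(shape, 3), round(scale, 3))
```

By construction shape * scale reproduces the mean and shape * scale^2 the variance, so a whole consumption distribution is pinned down by the mean alone.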
NASA Astrophysics Data System (ADS)
Kundu, Pradeep; Nath, Tameshwer; Palani, I. A.; Lad, Bhupesh K.
2018-06-01
The present paper tackles an important but unmapped problem: the reliability estimation of smart materials. First, an experimental setup is developed for accelerated life testing of shape memory alloy (SMA) springs. A novel approach based on the generalized log-linear Weibull (GLL-Weibull) distribution is then developed for SMA spring life estimation. Applied stimulus (voltage), elongation and cycles of operation are used as inputs for the life prediction model. The values of the parameter coefficients of the model provide better interpretability compared to artificial intelligence-based life prediction approaches. In addition, the model considers the effect of operating conditions, making it generic for a range of operating conditions. Moreover, a Bayesian framework is used to continuously update the prediction with the actual degradation value of the springs, thereby reducing the uncertainty in the data and improving the prediction accuracy. Finally, the deterioration of the material with the number of cycles is also investigated using thermogravimetric analysis and scanning electron microscopy.
Maximum likelihood estimates, from censored data, for mixed-Weibull distributions
NASA Astrophysics Data System (ADS)
Jiang, Siyuan; Kececioglu, Dimitri
1992-06-01
A new algorithm for estimating the parameters of mixed-Weibull distributions from censored data is presented. The algorithm follows the principle of maximum likelihood estimation (MLE) via the expectation-maximization (EM) algorithm, and it is derived for both postmortem and nonpostmortem time-to-failure data. It is concluded that the concept of the EM algorithm is easy to understand and apply (only elementary statistics and calculus are required). The log-likelihood function cannot decrease after an EM sequence; this important feature was observed in all of the numerical calculations. The MLEs of the nonpostmortem data were obtained successfully for mixed-Weibull distributions with up to 14 parameters in a 5-subpopulation mixed-Weibull distribution. Numerical examples indicate that some of the log-likelihood functions of mixed-Weibull distributions have multiple local maxima; therefore, the algorithm should start from several initial guesses of the parameter set.
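A minimal EM sketch for the simplest case the abstract covers: a two-subpopulation mixed Weibull fitted from complete (non-censored) data. The E-step computes posterior membership probabilities; the M-step solves a weighted version of the Weibull profile-likelihood equation per component. All mixture parameters and sample sizes are illustrative, and censoring terms are omitted for brevity.

```python
import math
import random

random.seed(2)

def rweibull(k, lam):
    return lam * (-math.log(random.random())) ** (1.0 / k)

# Illustrative 2-subpopulation sample: 40% Weibull(0.8, 1), 60% Weibull(3, 6).
data = [rweibull(0.8, 1.0) if random.random() < 0.4 else rweibull(3.0, 6.0)
        for _ in range(600)]

def pdf(y, k, lam):
    return (k / lam) * (y / lam) ** (k - 1.0) * math.exp(-((y / lam) ** k))

def weighted_mle(ys, ws):
    """Weighted Weibull MLE: bisect the profile equation for the shape,
    then recover the scale in closed form."""
    sw = sum(ws)
    mean_log = sum(w * math.log(y) for y, w in zip(ys, ws)) / sw
    def g(k):   # increasing in k
        num = sum(w * y ** k * math.log(y) for y, w in zip(ys, ws))
        den = sum(w * y ** k for y, w in zip(ys, ws))
        return num / den - 1.0 / k - mean_log
    lo, hi = 0.1, 20.0
    for _ in range(40):
        mid = (lo + hi) / 2.0
        lo, hi = (lo, mid) if g(mid) > 0 else (mid, hi)
    k = (lo + hi) / 2.0
    lam = (sum(w * y ** k for y, w in zip(ys, ws)) / sw) ** (1.0 / k)
    return k, lam

p, (k1, l1), (k2, l2) = 0.5, (1.0, 1.0), (1.0, 5.0)  # initial guesses
for _ in range(20):
    # E-step: posterior probability each point came from component 1.
    r = [p * pdf(y, k1, l1) / (p * pdf(y, k1, l1) + (1 - p) * pdf(y, k2, l2))
         for y in data]
    # M-step: mixing weight and weighted Weibull MLEs per component.
    p = sum(r) / len(r)
    k1, l1 = weighted_mle(data, r)
    k2, l2 = weighted_mle(data, [1.0 - x for x in r])
print(round(p, 2), round(k1, 2), round(l1, 2), round(k2, 2), round(l2, 2))
```

As the abstract notes, the log-likelihood is guaranteed not to decrease across these iterations, but multiple local maxima mean the loop should be restarted from several initial guesses in practice.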
Steve P. Verrill; James W. Evans; David E. Kretschmann; Cherilyn A. Hatfield
2012-01-01
Two important wood properties are stiffness (modulus of elasticity or MOE) and bending strength (modulus of rupture or MOR). In the past, MOE has often been modeled as a Gaussian and MOR as a lognormal or a two- or three-parameter Weibull. It is well known that MOE and MOR are positively correlated. To model the simultaneous behavior of MOE and MOR for the purposes of...
Steve P. Verrill; David E. Kretschmann; James W. Evans
2016-01-01
Two important wood properties are stiffness (modulus of elasticity, MOE) and bending strength (modulus of rupture, MOR). In the past, MOE has often been modeled as a Gaussian and MOR as a lognormal or a two- or three-parameter Weibull. It is well known that MOE and MOR are positively correlated. To model the simultaneous behavior of MOE and MOR for the purposes of wood...
Pal, Suvra; Balakrishnan, Narayanaswamy
2018-05-01
In this paper, we develop likelihood inference based on the expectation maximization algorithm for the Box-Cox transformation cure rate model assuming the lifetimes to follow a Weibull distribution. A simulation study is carried out to demonstrate the performance of the proposed estimation method. Through Monte Carlo simulations, we also study the effect of model misspecification on the estimate of cure rate. Finally, we analyze a well-known data on melanoma with the model and the inferential method developed here.
Failure rate and reliability of the KOMATSU hydraulic excavator in surface limestone mine
NASA Astrophysics Data System (ADS)
Harish Kumar N., S.; Choudhary, R. P.; Murthy, Ch. S. N.
2018-04-01
A model with a bathtub-shaped failure rate function is helpful in the reliability analysis of any system, and particularly in reliability-based preventive maintenance. The usual Weibull distribution is, however, not capable of modeling the complete lifecycle of a system with a bathtub-shaped failure rate function. In this paper, failure rate and reliability analysis of the KOMATSU hydraulic excavator/shovel in a surface mine is presented, with the aim of improving the reliability and decreasing the failure rate of each subsystem of the shovel through preventive maintenance. The bathtub-shaped model for the shovel can also be seen as a simplification of the Weibull distribution.
NASA Astrophysics Data System (ADS)
Pasari, S.; Kundu, D.; Dikshit, O.
2012-12-01
Earthquake recurrence interval is one of the important ingredients of probabilistic seismic hazard assessment (PSHA) for any location. Exponential, gamma, Weibull and lognormal distributions are well-established probability models for this recurrence interval estimation. However, they have certain shortcomings too. Thus, it is imperative to search for some alternative sophisticated distributions. In this paper, we introduce a three-parameter (location, scale and shape) exponentiated exponential distribution and investigate the scope of this distribution as an alternative to the afore-mentioned distributions in earthquake recurrence studies. This distribution is a particular member of the exponentiated Weibull family. Despite its complicated form, it is widely accepted in medical and biological applications. Furthermore, it shares many physical properties with the gamma and Weibull families. Unlike the gamma distribution, the hazard function of the generalized exponential distribution can be easily computed even if the shape parameter is not an integer. To assess the plausibility of this model, a complete and homogeneous earthquake catalogue of 20 events (M ≥ 7.0) spanning the period 1846 to 1995 from the North-East Himalayan region (20-32 deg N and 87-100 deg E) has been used. The model parameters are estimated using the maximum likelihood estimator (MLE) and the method of moments estimator (MOME). No geological or geophysical evidence has been considered in this calculation. The estimated conditional probability becomes quite high after about a decade for an elapsed time of 17 years (i.e. 2012). Moreover, this study shows that the generalized exponential distribution fits the above data more closely than the conventional models, and hence it is tentatively concluded that the generalized exponential distribution can be effectively considered in earthquake recurrence studies.
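A sketch of the three-parameter exponentiated (generalized) exponential distribution named above, F(x) = (1 - exp(-(x - mu)/sigma))^alpha for x > mu, including the closed-form hazard the abstract highlights and the kind of conditional probability a recurrence study would compute. All parameter and time values are illustrative, not the paper's estimates.

```python
import math

# Illustrative location (years), scale and shape parameters.
mu, sigma, alpha = 0.0, 12.0, 2.0

def cdf(x):
    return (1.0 - math.exp(-(x - mu) / sigma)) ** alpha if x > mu else 0.0

def pdf(x):
    if x <= mu:
        return 0.0
    z = math.exp(-(x - mu) / sigma)
    return (alpha / sigma) * (1.0 - z) ** (alpha - 1.0) * z

def hazard(x):
    # Closed form even for non-integer shape, unlike the gamma hazard.
    return pdf(x) / (1.0 - cdf(x))

def cond_prob(t1, t2):
    """P(event before t2 | quiescence up to t1) -- the conditional
    probability tracked in recurrence studies."""
    return (cdf(t2) - cdf(t1)) / (1.0 - cdf(t1))

print(round(cond_prob(17.0, 27.0), 3))
```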
An EOQ model for Weibull distribution deterioration with time-dependent cubic demand and backlogging
NASA Astrophysics Data System (ADS)
Santhi, G.; Karthikeyan, K.
2017-11-01
In this article we introduce an economic order quantity model with Weibull deterioration and a time-dependent cubic demand rate, where the holding cost is a linear function of time. Shortages are allowed in the inventory system and are partially or fully backlogged. The objective of this model is to minimize the total inventory cost by choosing the optimal order quantity and cycle length. The proposed model is illustrated by numerical examples, and sensitivity analysis is performed to study the effect of changes in parameters on the optimum solutions.
Reliability Analysis of Uniaxially Ground Brittle Materials
NASA Technical Reports Server (NTRS)
Salem, Jonathan A.; Nemeth, Noel N.; Powers, Lynn M.; Choi, Sung R.
1995-01-01
The fast fracture strength distribution of uniaxially ground, alpha silicon carbide was investigated as a function of grinding angle relative to the principal stress direction in flexure. Both as-ground and ground/annealed surfaces were investigated. The resulting flexural strength distributions were used to verify reliability models and predict the strength distribution of larger plate specimens tested in biaxial flexure. Complete fractography was done on the specimens. Failures occurred from agglomerates, machining cracks, or hybrid flaws that consisted of a machining crack located at a processing agglomerate. Annealing eliminated failures due to machining damage. Reliability analyses were performed using two and three parameter Weibull and Batdorf methodologies. The Weibull size effect was demonstrated for machining flaws. Mixed mode reliability models reasonably predicted the strength distributions of uniaxial flexure and biaxial plate specimens.
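The "Weibull size effect" demonstrated above has a compact form for volume flaws: characteristic strengths of two specimen sizes scale as sigma2/sigma1 = (V1/V2)^(1/m), where m is the Weibull modulus. A quick sketch with illustrative numbers (not the silicon carbide data):

```python
# Weibull size effect for volume flaws (all values illustrative).
m = 10.0              # Weibull modulus
sigma1, V1 = 400.0, 1.0   # characteristic strength (MPa), reference volume
V2 = 8.0                  # larger specimen volume

sigma2 = sigma1 * (V1 / V2) ** (1.0 / m)
print(round(sigma2, 1))   # the larger specimen is predicted to be weaker
```

This is the scaling used to predict plate-specimen strength distributions from smaller flexure bars.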
Rafal Podlaski; Francis Roesch
2014-01-01
In recent years finite-mixture models have been employed to approximate and model empirical diameter at breast height (DBH) distributions. We used two-component mixtures of either the Weibull distribution or the gamma distribution for describing the DBH distributions of mixed-species, two-cohort forest stands, to analyse the relationships between the DBH components,...
Reliability analysis of structural ceramic components using a three-parameter Weibull distribution
NASA Technical Reports Server (NTRS)
Duffy, Stephen F.; Powers, Lynn M.; Starlinger, Alois
1992-01-01
Described here are nonlinear regression estimators for the three-parameter Weibull distribution. Issues relating to the bias and invariance associated with these estimators are examined numerically using Monte Carlo simulation methods. The estimators were used to extract parameters from sintered silicon nitride failure data. A reliability analysis was performed on a turbopump blade utilizing the three-parameter Weibull distribution and the estimates from the sintered silicon nitride data.
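The defining feature of the three-parameter Weibull in reliability work is the threshold (location) parameter: below a threshold stress the predicted failure probability is zero. A minimal sketch with illustrative parameter values, not the sintered silicon nitride estimates:

```python
import math

# Illustrative modulus, scale (MPa) and threshold stress (MPa).
m, sigma0, sigma_u = 8.0, 500.0, 150.0

def reliability(stress):
    """Three-parameter Weibull survival: certain survival below sigma_u."""
    if stress <= sigma_u:
        return 1.0
    return math.exp(-(((stress - sigma_u) / sigma0) ** m))

print(reliability(150.0), round(reliability(650.0), 4))
```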
Development of Testing Methodologies for the Mechanical Properties of MEMS
NASA Technical Reports Server (NTRS)
Ekwaro-Osire, Stephen
2003-01-01
This effort is to investigate and design testing strategies to determine the mechanical properties of MicroElectroMechanical Systems (MEMS) as well as investigate the development of a MEMS Probabilistic Design Methodology (PDM). One item of potential interest is the design of a test for the Weibull size effect in pressure membranes. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. However, the primary area of investigation will most likely be analysis and modeling of material interfaces for strength as well as developing a strategy to handle stress singularities at sharp corners, fillets, and material interfaces. This will be a continuation of the previous year's work. The ultimate objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads.
A practical and systematic review of Weibull statistics for reporting strengths of dental materials
Quinn, George D.; Quinn, Janet B.
2011-01-01
Objectives To review the history, theory and current applications of Weibull analyses sufficient to make informed decisions regarding practical use of the analysis in dental material strength testing. Data References are made to examples in the engineering and dental literature, but this paper also includes illustrative analyses of Weibull plots, fractographic interpretations, and Weibull distribution parameters obtained for a dense alumina, two feldspathic porcelains, and a zirconia. Sources Informational sources include Weibull's original articles, later articles specific to applications and theoretical foundations of Weibull analysis, texts on statistics and fracture mechanics and the international standards literature. Study Selection The chosen Weibull analyses are used to illustrate technique, the importance of flaw size distributions, physical meaning of Weibull parameters and concepts of “equivalent volumes” to compare measured strengths obtained from different test configurations. Conclusions Weibull analysis has a strong theoretical basis and can be of particular value in dental applications, primarily because of test specimen size limitations and the use of different test configurations. Also endemic to dental materials, however, is increased difficulty in satisfying application requirements, such as confirming fracture origin type and diligence in obtaining quality strength data. PMID:19945745
Rafal Podlaski; Francis A. Roesch
2013-01-01
Study assessed the usefulness of various methods for choosing the initial values for the numerical procedures for estimating the parameters of mixture distributions and analysed variety of mixture models to approximate empirical diameter at breast height (dbh) distributions. Two-component mixtures of either the Weibull distribution or the gamma distribution were...
Program for Weibull Analysis of Fatigue Data
NASA Technical Reports Server (NTRS)
Krantz, Timothy L.
2005-01-01
A Fortran computer program has been written for performing statistical analyses of fatigue-test data that are assumed to be adequately represented by a two-parameter Weibull distribution. This program calculates the following: (1) maximum-likelihood estimates of the Weibull-distribution parameters; (2) data for contour plots of relative likelihood for the two parameters; (3) data for contour plots of joint confidence regions; (4) data for the profile likelihood of the Weibull-distribution parameters; (5) data for the profile likelihood of any percentile of the distribution; and (6) likelihood-based confidence intervals for parameters and/or percentiles of the distribution. The program can account for tests that are suspended without failure (the statistical term for such suspension of tests is "censoring"). The analytical approach followed in this program is valid for type-I censoring, which is the removal of unfailed units at pre-specified times. Confidence regions and intervals are calculated by use of the likelihood-ratio method.
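A sketch of the likelihood such a program maximizes under type-I censoring: failures contribute the Weibull log-density, suspended units contribute the log-survival at the suspension time. The fatigue data below are made up for illustration, and a coarse grid search stands in for the program's proper optimizer.

```python
import math

# Illustrative fatigue data: observed failures plus unfailed (suspended)
# units removed at a pre-specified time (type-I censoring).
failures = [41.0, 55.0, 63.0, 70.0, 88.0, 97.0]   # cycles to failure
suspended_at = 100.0
n_suspended = 4

def log_lik(k, lam):
    # Failures: Weibull log-density terms.
    ll = sum(math.log(k / lam) + (k - 1.0) * math.log(t / lam)
             - (t / lam) ** k for t in failures)
    # Suspensions: log survival, -(t_c / lam)^k, once per unfailed unit.
    ll += n_suspended * (-((suspended_at / lam) ** k))
    return ll

# Coarse grid over (shape, scale); a real tool would use a numerical
# optimizer and likelihood-ratio confidence regions.
best = max(((k / 4.0, float(s)) for k in range(2, 40)
            for s in range(40, 300, 2)),
           key=lambda p: log_lik(p[0], p[1]))
print(best)
```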
Life and reliability models for helicopter transmissions
NASA Technical Reports Server (NTRS)
Savage, M.; Knorr, R. J.; Coy, J. J.
1982-01-01
Computer models of life and reliability are presented for planetary gear trains with a fixed ring gear, input applied to the sun gear, and output taken from the planet arm. For this transmission the input and output shafts are coaxial, and the input and output torques are assumed to be coaxial with these shafts. Thrust and side loading are neglected. The reliability model is based on the Weibull distributions of the individual reliabilities of the transmission components. The system model is also a Weibull distribution. The load versus life model for the system is a power relationship, as are the models for the individual components. The load-life exponent and basic dynamic capacity are developed as functions of the component capacities. The models are used to compare three- and four-planet, 150 kW (200 hp), 5:1 reduction transmissions with 1500 rpm input speed to illustrate their use.
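A sketch of the composition step in such system models: with independent components whose lives are Weibull-distributed, the series-system reliability at a given time is the product of the component reliabilities. The component shapes and scales below are illustrative, not the transmission's fitted values.

```python
import math

# Illustrative (shape, scale-in-hours) pairs for three components.
components = [(2.0, 9000.0), (1.5, 12000.0), (2.5, 15000.0)]

def r_component(t, shape, scale):
    return math.exp(-((t / scale) ** shape))

def r_system(t):
    # Series system: every component must survive, so reliabilities multiply.
    prod = 1.0
    for shape, scale in components:
        prod *= r_component(t, shape, scale)
    return prod

t = 5000.0
print(round(r_system(t), 4),
      min(round(r_component(t, s, c), 4) for s, c in components))
```

The system is always less reliable than its weakest component, which is why adding a fourth planet changes both load sharing and the system Weibull.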
NASA Astrophysics Data System (ADS)
Sazuka, Naoya
2007-03-01
We analyze waiting times for price changes in a foreign currency exchange rate. Recent empirical studies of high-frequency financial data support the finding that trades in financial markets do not follow a Poisson process and that the waiting times between trades are not exponentially distributed. Here we show that our data are well approximated by a Weibull distribution rather than an exponential distribution in the non-asymptotic regime. Moreover, we quantitatively evaluate how far the empirical data are from an exponential distribution using a Weibull fit. Finally, we discuss a transition between a Weibull law and a power law in the long-time asymptotic regime.
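A standard way to quantify "how far from exponential" via a Weibull fit: on a Weibull plot, ln(-ln S(t)) = m ln t - m ln a, so a least-squares slope estimates the shape m, and m = 1 corresponds to the exponential case. The sample below is simulated with an illustrative shape of 0.6 (sub-exponential, as reported for FX waiting times).

```python
import math
import random

random.seed(3)

# Simulated waiting times from an illustrative Weibull(0.6, 50).
m_true, a_true = 0.6, 50.0
sample = sorted(a_true * (-math.log(random.random())) ** (1.0 / m_true)
                for _ in range(2000))

# Weibull-plot coordinates from the empirical CDF (last point dropped
# because its empirical CDF value of 1 has no finite transform).
n = len(sample)
xs = [math.log(t) for t in sample[: n - 1]]
ys = [math.log(-math.log(1.0 - (i + 1) / n)) for i in range(n - 1)]

# Ordinary least-squares slope estimates the Weibull shape parameter.
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
print(round(slope, 2))  # well below 1: clearly non-exponential
```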
Landy, Rebecca; Cheung, Li C; Schiffman, Mark; Gage, Julia C; Hyun, Noorie; Wentzensen, Nicolas; Kinney, Walter K; Castle, Philip E; Fetterman, Barbara; Poitras, Nancy E; Lorey, Thomas; Sasieni, Peter D; Katki, Hormuzd A
2018-06-01
Electronic health records (EHR) are increasingly used by epidemiologists studying disease following surveillance testing to provide evidence for screening intervals and referral guidelines. Although cost-effective, undiagnosed prevalent disease and interval censoring (in which asymptomatic disease is only observed at the time of testing) raise substantial analytic issues when estimating risk that cannot be addressed using Kaplan-Meier methods. Based on our experience analysing EHR from cervical cancer screening, we previously proposed the logistic-Weibull model to address these issues. Here we demonstrate how the choice of statistical method can impact risk estimates. We use observed data on 41,067 women in the cervical cancer screening program at Kaiser Permanente Northern California, 2003-2013, as well as simulations, to evaluate the ability of different methods (Kaplan-Meier, Turnbull, Weibull and logistic-Weibull) to accurately estimate risk within a screening program. Cumulative risk estimates from the statistical methods varied considerably, with the largest differences occurring for prevalent disease risk when baseline disease ascertainment was random but incomplete. Kaplan-Meier underestimated risk at earlier times and overestimated risk at later times in the presence of interval censoring or undiagnosed prevalent disease. Turnbull performed well, though it was inefficient and not smooth. The logistic-Weibull model performed well, except when event times did not follow a Weibull distribution. We have demonstrated that methods for right-censored data, such as Kaplan-Meier, result in biased estimates of disease risks when applied to interval-censored data, such as screening programs using EHR data. The logistic-Weibull model is attractive, but the model fit must be checked against Turnbull non-parametric risk estimates. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Statistical Models of Fracture Relevant to Nuclear-Grade Graphite: Review and Recommendations
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Bratton, Robert L.
2011-01-01
The nuclear-grade (low-impurity) graphite needed for the fuel element and moderator material for next-generation (Gen IV) reactors displays large scatter in strength and a nonlinear stress-strain response from damage accumulation. This response can be characterized as quasi-brittle. In this expanded review, relevant statistical failure models for various brittle and quasi-brittle material systems are discussed with regard to strength distribution, size effect, multiaxial strength, and damage accumulation. This includes descriptions of the Weibull, Batdorf, and Burchell models as well as models that describe the strength response of composite materials, which involves distributed damage. Results from lattice simulations are included for a physics-based description of material breakdown. Consideration is given to the predicted transition between brittle and quasi-brittle damage behavior versus the density of damage (level of disorder) within the material system. The literature indicates that weakest-link-based failure modeling approaches appear to be reasonably robust in that they can be applied to materials that display distributed damage, provided that the level of disorder in the material is not too large. The Weibull distribution is argued to be the most appropriate statistical distribution to model the stochastic-strength response of graphite.
NASA Technical Reports Server (NTRS)
Shantaram, S. Pai; Gyekenyesi, John P.
1989-01-01
The calculation of shape and scale parameters of the two-parameter Weibull distribution is described using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. It also shows how to calculate the Batdorf flaw-density constants by using the Weibull distribution statistical parameters. The techniques described were verified with several example problems from the open literature, and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.
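A minimal Python sketch of the core steps this abstract describes (maximum likelihood estimation of the two-parameter Weibull and a Kolmogorov-Smirnov check), not the SCARE code itself; the strength values and parameters are synthetic assumptions.

```python
# Two-parameter Weibull MLE plus a KS goodness-of-fit statistic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_shape, true_scale = 8.0, 400.0            # MPa, assumed values
strengths = true_scale * rng.weibull(true_shape, size=50)

# Maximum-likelihood fit with the location parameter fixed at zero
shape, loc, scale = stats.weibull_min.fit(strengths, floc=0)

# Goodness of fit: KS statistic against the fitted distribution
ks = stats.kstest(strengths, "weibull_min", args=(shape, loc, scale))
print(f"shape={shape:.2f}  scale={scale:.1f}  KS D={ks.statistic:.3f}")
```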
ZERODUR strength modeling with Weibull statistical distributions
NASA Astrophysics Data System (ADS)
Hartmann, Peter
2016-07-01
The decisive influence on the breakage strength of brittle materials such as the low-expansion glass ceramic ZERODUR is the surface condition. For polished or etched surfaces it is essential whether micro cracks are present and how deep they are. Ground surfaces have many micro cracks caused by the generation process, so only the depths of the micro cracks are relevant. In any case, the presence and depth of micro cracks are statistical in nature. The Weibull distribution is the model traditionally used for the representation of such data sets. It is based on the weakest-link ansatz. Whether the two- or three-parameter Weibull distribution should be used for data representation and reliability prediction depends on the underlying crack-generation mechanisms. Before choosing the model for a specific evaluation, some checks should be done. Is there only one mechanism present, or is it to be expected that an additional mechanism might contribute deviating results? For ground surfaces the main mechanism is the diamond grains' action on the surface. However, grains breaking out of their bonding might be moved by the tool across the surface, introducing a slightly deeper crack. These scratches cannot be expected to follow the same statistical distribution as the grinding process; hence, describing them with the same distribution parameters is not adequate, and a dedicated discussion should be performed before including them. If additional information is available that influences the selection of the model, for example the existence of a maximum crack depth, this should also be taken into account. Micro cracks introduced by small diamond grains on tools working with limited forces cannot be arbitrarily deep. For data obtained from such surfaces, the existence of a threshold breakage stress should be part of the hypothesis. This leads to the use of the three-parameter Weibull distribution.
A differentiation based on the data set alone, without preexisting information, is possible but requires a large data set. With only 20 specimens per sample such differentiation is not possible; it requires at least 100 specimens per set, the more the better. The validity of the statistical evaluation methods is discussed with several examples. These considerations are of special importance because of their consequences for the prognosis methods and results. In particular, the use of the two-parameter Weibull distribution for high-strength surfaces has led to unrealistic results. Extrapolation down to a low acceptable probability of failure covers a wide range without existing data points and is mainly influenced by the slope determined by the high-strength specimens. In the past this misconception has prevented the use of brittle materials for stress loads which they could have endured easily.
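The threshold-stress argument above corresponds to freeing the location parameter of the Weibull distribution. A sketch of the comparison, with synthetic data and an assumed threshold; note that three-parameter maximum-likelihood fits can be numerically delicate in practice.

```python
# Three-parameter vs two-parameter Weibull fit: the location parameter
# plays the role of the threshold breakage stress. Synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
threshold = 30.0                               # MPa, assumed minimum strength
data = threshold + 60.0 * rng.weibull(2.0, size=200)

# Free location: scipy estimates shape, loc (threshold) and scale together
shape3, loc3, scale3 = stats.weibull_min.fit(data)

# For comparison, the two-parameter fit forces the threshold to zero
shape2, _, scale2 = stats.weibull_min.fit(data, floc=0)
print(f"3-par: m={shape3:.2f}, threshold={loc3:.1f}  |  2-par: m={shape2:.2f}")
```

With a genuine threshold in the data, the forced two-parameter fit compensates with a distorted shape parameter, which is exactly what makes low-probability extrapolation unreliable.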
A Monte Carlo study of Weibull reliability analysis for space shuttle main engine components
NASA Technical Reports Server (NTRS)
Abernethy, K.
1986-01-01
The incorporation of a number of additional capabilities into an existing Weibull analysis computer program and the results of a Monte Carlo simulation study to evaluate the usefulness of the Weibull methods with samples containing very few failures and extensive censoring are discussed. Since the censoring mechanism inherent in the Space Shuttle Main Engine (SSME) data is hard to analyze, it was decided to use a random censoring model, generating censoring times from a uniform probability distribution. Some of the statistical techniques and computer programs that are used in the SSME Weibull analysis are described. The documented methods were supplemented by adding computer calculations of approximate (using iterative methods) confidence intervals for several parameters of interest. These calculations are based on a likelihood ratio statistic which is asymptotically a chi-squared statistic with one degree of freedom. The assumptions built into the computer simulations are described, as are the simulation program and the techniques used in it. Simulation results are tabulated for various combinations of Weibull shape parameters and numbers of failures in the samples.
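The random-censoring setup described here can be sketched as follows: Weibull failure times are censored at uniformly distributed times, and the censored maximum-likelihood fit uses the density for failures and the survival function for censored units. All parameter values are assumptions for illustration, not SSME values.

```python
# Monte Carlo sketch: Weibull failures with uniform random censoring,
# fitted by censored MLE.
import numpy as np
from scipy import stats
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n, shape, scale = 30, 1.5, 100.0
t_fail = scale * rng.weibull(shape, size=n)
t_cens = rng.uniform(0, 150.0, size=n)          # uniform censoring times
t_obs = np.minimum(t_fail, t_cens)
failed = t_fail <= t_cens                        # True = observed failure

def neg_log_lik(params):
    b, eta = params
    if b <= 0 or eta <= 0:
        return np.inf
    # Density for observed failures, survival function for censored units
    ll = np.sum(stats.weibull_min.logpdf(t_obs[failed], b, scale=eta))
    ll += np.sum(stats.weibull_min.logsf(t_obs[~failed], b, scale=eta))
    return -ll

res = minimize(neg_log_lik, x0=[1.0, np.median(t_obs)], method="Nelder-Mead")
b_hat, eta_hat = res.x
print(f"shape={b_hat:.2f}, scale={eta_hat:.1f}, failures={failed.sum()}/{n}")
```

Repeating this over many synthetic samples, as the study did, shows how wide the estimates scatter when only a handful of failures are observed.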
Application of Weibull analysis to SSME hardware
NASA Technical Reports Server (NTRS)
Gray, L. A. B.
1986-01-01
Generally, it has been documented that the wearing of engine parts forms a failure distribution which can be approximated by a function developed by Weibull. The purpose here is to examine to what extent the Weibull distribution approximates failure data for designated engine parts of the Space Shuttle Main Engine (SSME). The current testing certification requirements will be examined in order to establish confidence levels. An examination of the failure history of SSME parts/assemblies (turbine blades, main combustion chamber, or high pressure fuel pump first stage impellers) which are limited in usage by time or starts will be done by using updated Weibull techniques. Efforts will be made by the investigator to predict failure trends by using Weibull techniques for SSME parts (turbine temperature sensors, chamber pressure transducers, actuators, and controllers) which are not severely limited by time or starts.
NASA Technical Reports Server (NTRS)
Gross, Bernard
1996-01-01
Material characterization parameters obtained from naturally flawed specimens are necessary for reliability evaluation of non-deterministic advanced ceramic structural components. The least squares best fit method is applied to the three parameter uniaxial Weibull model to obtain the material parameters from experimental tests on volume or surface flawed specimens subjected to pure tension, pure bending, four point or three point loading. Several illustrative example problems are provided.
Size Effect on Specific Energy Distribution in Particle Comminution
NASA Astrophysics Data System (ADS)
Xu, Yongfu; Wang, Yidong
A theoretical study is made to derive an energy distribution equation for the size-reduction process from the fractal model of particle comminution. The fractal model is employed as a valid measure of the self-similar size distribution of the comminution daughter products. The tensile strength of particles varies with particle size according to a power-function law. The energy consumption for comminuting a single particle is found to be proportional to the 5(D-3)/3 power of the particle size, D being the fractal dimension of the comminution daughter products. Weibull statistics are applied to describe the relationship between the breakage probability and the specific energy of particle comminution. A simple equation is derived for the breakage probability of particles in view of the dependence of fracture energy on particle size. The calculated exponents and Weibull coefficients are generally in conformity with published data for the fracture of particles.
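The two relations in this abstract can be sketched numerically: the energy scaling E ∝ d^(5(D-3)/3) and a Weibull breakage probability P = 1 - exp(-(E/E0)^m). The values of D, m and E0 below are assumptions for illustration only.

```python
# Size-dependent comminution energy and Weibull breakage probability.
import numpy as np

D = 2.5                         # assumed fractal dimension of the daughters
sizes = np.array([1.0, 2.0, 4.0, 8.0])          # relative particle sizes
energy = sizes ** (5.0 * (D - 3.0) / 3.0)       # relative specific energy

def breakage_probability(e, e0=1.0, m=1.5):
    # Weibull form linking breakage probability to specific energy
    return 1.0 - np.exp(-(e / e0) ** m)

print("relative energies:", np.round(energy, 3))
print("breakage probabilities:", np.round(breakage_probability(energy), 3))
```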
Bayesian Weibull tree models for survival analysis of clinico-genomic data
Clarke, Jennifer; West, Mike
2008-01-01
An important goal of research involving gene expression data for outcome prediction is to establish the ability of genomic data to define clinically relevant risk factors. Recent studies have demonstrated that microarray data can successfully cluster patients into low- and high-risk categories. However, the need exists for models which examine how genomic predictors interact with existing clinical factors and provide personalized outcome predictions. We have developed clinico-genomic tree models for survival outcomes which use recursive partitioning to subdivide the current data set into homogeneous subgroups of patients, each with a specific Weibull survival distribution. These trees can provide personalized predictive distributions of the probability of survival for individuals of interest. Our strategy is to fit multiple models; within each model we adopt a prior on the Weibull scale parameter and update this prior via Empirical Bayes whenever the sample is split at a given node. The decision to split is based on a Bayes factor criterion. The resulting trees are weighted according to their relative likelihood values and predictions are made by averaging over models. In a pilot study of survival in advanced stage ovarian cancer we demonstrate that clinical and genomic data are complementary sources of information relevant to survival, and we use the exploratory nature of the trees to identify potential genomic biomarkers worthy of further study. PMID:18618012
Effect of Individual Component Life Distribution on Engine Life Prediction
NASA Technical Reports Server (NTRS)
Zaretsky, Erwin V.; Hendricks, Robert C.; Soditus, Sherry M.
2003-01-01
The effect of individual engine component life distributions on engine life prediction was determined. A Weibull-based life and reliability analysis of the NASA Energy Efficient Engine was conducted. The engine's life at a 95 and 99.9 percent probability of survival was determined based upon the engine manufacturer's original life calculations and assumed values of each of the component's cumulative life distributions as represented by a Weibull slope. The lives of the high-pressure turbine (HPT) disks and blades were also evaluated individually and as a system in a similar manner. Knowing the statistical cumulative distribution of each engine component with reasonable engineering certainty is a condition precedent to predicting the life and reliability of an entire engine. The life of a system at a given reliability will be less than the lowest-lived component in the system at the same reliability (probability of survival). Where Weibull slopes of all the engine components are equal, the Weibull slope had a minimal effect on engine L(sub 0.1) life prediction. However, at a probability of survival of 95 percent (L(sub 5) life), life decreased with increasing Weibull slope.
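The weakest-link claim above (system life at a given reliability is below that of the lowest-lived component) follows from the series-system survival function being the product of component survival functions. A sketch with assumed, hypothetical component parameters:

```python
# Series-system life from component Weibull lives.
import numpy as np

# (shape beta, characteristic life eta in hours) for hypothetical components
components = [(2.0, 8000.0), (1.5, 12000.0), (3.0, 9000.0)]

def system_survival(t):
    # Series (weakest-link) system: all components must survive
    s = 1.0
    for beta, eta in components:
        s *= np.exp(-(t / eta) ** beta)
    return s

def life_at_reliability(surv, r, t_hi=1e6):
    # Bisection for the time t with surv(t) = r
    lo, hi = 0.0, t_hi
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if surv(mid) > r else (lo, mid)
    return 0.5 * (lo + hi)

l5_system = life_at_reliability(system_survival, 0.95)
l5_components = [
    life_at_reliability(lambda t, b=b, e=e: np.exp(-(t / e) ** b), 0.95)
    for b, e in components
]
print(f"system L5 = {l5_system:.0f} h, min component L5 = {min(l5_components):.0f} h")
```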
Rafal Podlaski; Francis A. Roesch
2014-01-01
Two-component mixtures of either the Weibull distribution or the gamma distribution and the kernel density estimator were used for describing the diameter at breast height (dbh) empirical distributions of two-cohort stands. The data consisted of study plots from the Świętokrzyski National Park (central Poland) and areas close to and including the North Carolina section...
Idealized models of the joint probability distribution of wind speeds
NASA Astrophysics Data System (ADS)
Monahan, Adam H.
2018-05-01
The joint probability distribution of wind speeds at two separate locations in space or points in time completely characterizes the statistical dependence of these two quantities, providing more information than linear measures such as correlation. In this study, we consider two models of the joint distribution of wind speeds obtained from idealized models of the dependence structure of the horizontal wind velocity components. The bivariate Rice distribution follows from assuming that the wind components have Gaussian and isotropic fluctuations. The bivariate Weibull distribution arises from power law transformations of wind speeds corresponding to vector components with Gaussian, isotropic, mean-zero variability. Maximum likelihood estimates of these distributions are compared using wind speed data from the mid-troposphere, from different altitudes at the Cabauw tower in the Netherlands, and from scatterometer observations over the sea surface. While the bivariate Rice distribution is more flexible and can represent a broader class of dependence structures, the bivariate Weibull distribution is mathematically simpler and may be more convenient in many applications. The complexity of the mathematical expressions obtained for the joint distributions suggests that the development of explicit functional forms for multivariate speed distributions from distributions of the components will not be practical for more complicated dependence structure or more than two speed variables.
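The bivariate Weibull construction described above can be sketched by simulation: wind speeds whose vector components are correlated, isotropic, mean-zero Gaussians are Rayleigh distributed, and a power transform turns them into Weibull-distributed speeds that remain statistically dependent. The correlation and shape values below are assumptions.

```python
# Bivariate Weibull speeds via power transform of correlated Rayleigh speeds.
import numpy as np

rng = np.random.default_rng(3)
n, rho = 100_000, 0.7           # assumed correlation between like components

# Correlated (u1,u2) and (v1,v2) pairs: isotropic, mean-zero, unit variance
cov = [[1.0, rho], [rho, 1.0]]
u = rng.multivariate_normal([0, 0], cov, size=n)
v = rng.multivariate_normal([0, 0], cov, size=n)

speed = np.hypot(u, v)          # Rayleigh speeds at both locations
b = 2.0 / 1.6                   # power transform: shape 2 -> shape 1.6
w = speed ** b                  # bivariate Weibull speeds (up to scale)

# Dependence survives the monotone transform
r = np.corrcoef(w[:, 0], w[:, 1])[0, 1]
print(f"speed correlation after transform: {r:.2f}")
```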
NASA Astrophysics Data System (ADS)
Hasan, Md. Fahad; Wang, James; Berndt, Christopher
2015-06-01
The microhardness and elastic modulus of plasma-sprayed hydroxyapatite coatings were evaluated using Knoop indentation on the cross section and on the top surface. The effects of indentation angle, testing direction, measurement location and applied load on the microhardness and elastic modulus were investigated. The variability and distribution of the microhardness and elastic modulus data were statistically analysed using the Weibull modulus distribution. The results indicate that the dependence of microhardness and elastic modulus on the indentation angle exhibits a parabolic shape. Dependence of the microhardness values on the indentation angle follows Pythagoras's theorem. The microhardness, Weibull modulus of microhardness and Weibull modulus of elastic modulus reach their maximum at the central position (175 µm) on the cross section of the coatings. The Weibull modulus of microhardness revealed similar values throughout the thickness, and the Weibull modulus of elastic modulus shows higher values on the top surface compared to the cross section.
Recurrence and interoccurrence behavior of self-organized complex phenomena
NASA Astrophysics Data System (ADS)
Abaimov, S. G.; Turcotte, D. L.; Shcherbakov, R.; Rundle, J. B.
2007-08-01
The sandpile, forest-fire and slider-block models are said to exhibit self-organized criticality. Associated natural phenomena include landslides, wildfires, and earthquakes. In all cases the frequency-size distributions are well approximated by power laws (fractals). Another important aspect of both the models and natural phenomena is the statistics of interval times. These statistics are particularly important for earthquakes. For earthquakes it is important to make a distinction between interoccurrence and recurrence times. Interoccurrence times are the interval times between earthquakes on all faults in a region whereas recurrence times are interval times between earthquakes on a single fault or fault segment. In many, but not all cases, interoccurrence time statistics are exponential (Poissonian) and the events occur randomly. However, the distribution of recurrence times is often Weibull to a good approximation. In this paper we study the interval statistics of slip events using a slider-block model. The behavior of this model is sensitive to the stiffness α of the system, α=kC/kL where kC is the spring constant of the connector springs and kL is the spring constant of the loader plate springs. For a soft system (small α) there are no system-wide events and interoccurrence time statistics of the larger events are Poissonian. For a stiff system (large α), system-wide events dominate the energy dissipation and the statistics of the recurrence times between these system-wide events satisfy the Weibull distribution to a good approximation. We argue that this applicability of the Weibull distribution is due to the power-law (scale invariant) behavior of the hazard function, i.e. the probability that the next event will occur at a time t0 after the last event has a power-law dependence on t0. The Weibull distribution is the only distribution that has a scale invariant hazard function.
We further show that the onset of system-wide events is a well defined critical point. We find that the number of system-wide events NSWE satisfies the scaling relation NSWE ∝(α-αC)δ where αC is the critical value of the stiffness. The system-wide events represent a new phase for the slider-block system.
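The scale-invariance argument above rests on the Weibull hazard function being an exact power law, h(t) = (β/η)(t/η)^(β-1). A quick numerical check of this identity (parameter values assumed for illustration):

```python
# Verify that the Weibull hazard f(t)/S(t) equals the power law
# (beta/eta) * (t/eta)**(beta-1).
import numpy as np
from scipy import stats

beta, eta = 1.8, 2.5            # assumed recurrence-time parameters
t = np.linspace(0.1, 10.0, 50)

hazard = stats.weibull_min.pdf(t, beta, scale=eta) / stats.weibull_min.sf(
    t, beta, scale=eta
)
power_law = (beta / eta) * (t / eta) ** (beta - 1)

print("max abs deviation:", np.max(np.abs(hazard - power_law)))
```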
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Gyekenyesi, John P.
1988-01-01
The calculation of shape and scale parameters of the two-parameter Weibull distribution is described using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. Detailed procedures are given for evaluating 90 percent confidence intervals for maximum likelihood estimates of shape and scale parameters, the unbiased estimates of the shape parameters, and the Weibull mean values and corresponding standard deviations. Furthermore, the necessary steps are described for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull distribution. It also shows how to calculate the Batdorf flaw-density constants by using the Weibull distribution statistical parameters. The techniques described were verified with several example problems from the open literature, and were coded in the Structural Ceramics Analysis and Reliability Evaluation (SCARE) design program.
Parametric regression model for survival data: Weibull regression model as an example
2016-01-01
The Weibull regression model is one of the most popular parametric regression models: it provides an estimate of the baseline hazard function as well as coefficients for covariates. Because of technical difficulties, the Weibull regression model is seldom used in the medical literature compared with the semi-parametric proportional hazards model. To make clinical investigators familiar with the Weibull regression model, this article introduces some basic knowledge of the model and then illustrates how to fit it with R software. The SurvRegCensCov package is useful for converting estimated coefficients to clinically relevant statistics such as the hazard ratio (HR) and event time ratio (ETR). Model adequacy can be assessed by inspecting Kaplan-Meier curves stratified by categorical variables. The eha package provides an alternative method to fit the Weibull regression model, and its check.dist() function helps to assess the goodness of fit of the model. Variable selection is based on the importance of a covariate, which can be tested using the anova() function. Alternatively, backward elimination starting from a full model is an efficient way for model development. Visualizing the Weibull regression model after model development provides another way to report the findings. PMID:28149846
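The coefficient conversion that SurvRegCensCov performs can be sketched in Python: under a Weibull accelerated-failure-time model, a regression coefficient b on the log-time scale corresponds to an event time ratio ETR = exp(b) and a hazard ratio HR = exp(-b × shape). The coefficient and shape values below are assumed, purely for illustration.

```python
# Weibull AFT coefficient -> clinically relevant statistics (HR, ETR).
import math

b = -0.35          # assumed AFT coefficient on the log-time scale
shape = 1.4        # assumed Weibull shape parameter

etr = math.exp(b)              # event times shortened by factor ~0.70
hr = math.exp(-b * shape)      # hazard increased by factor ~1.63

print(f"ETR = {etr:.2f}, HR = {hr:.2f}")
```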
Balakrishnan, Narayanaswamy; Pal, Suvra
2016-08-01
Recently, a flexible cure rate survival model has been developed by assuming the number of competing causes of the event of interest to follow the Conway-Maxwell-Poisson distribution. This model includes some of the well-known cure rate models discussed in the literature as special cases. Data obtained from cancer clinical trials are often right censored and expectation maximization algorithm can be used in this case to efficiently estimate the model parameters based on right censored data. In this paper, we consider the competing cause scenario and assuming the time-to-event to follow the Weibull distribution, we derive the necessary steps of the expectation maximization algorithm for estimating the parameters of different cure rate survival models. The standard errors of the maximum likelihood estimates are obtained by inverting the observed information matrix. The method of inference developed here is examined by means of an extensive Monte Carlo simulation study. Finally, we illustrate the proposed methodology with a real data on cancer recurrence. © The Author(s) 2013.
Wang, Ping; Zhang, Lu; Guo, Lixin; Huang, Feng; Shang, Tao; Wang, Ranran; Yang, Yintang
2014-08-25
The average bit error rate (BER) for binary phase-shift keying (BPSK) modulation in free-space optical (FSO) links over turbulence atmosphere modeled by the exponentiated Weibull (EW) distribution is investigated in detail. The effects of aperture averaging on the average BERs for BPSK modulation under weak-to-strong turbulence conditions are studied. The average BERs of EW distribution are compared with Lognormal (LN) and Gamma-Gamma (GG) distributions in weak and strong turbulence atmosphere, respectively. The outage probability is also obtained for different turbulence strengths and receiver aperture sizes. The analytical results deduced by the generalized Gauss-Laguerre quadrature rule are verified by the Monte Carlo simulation. This work is helpful for the design of receivers for FSO communication systems.
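The averaging step described above can be sketched by numerical quadrature: the conditional BPSK error rate is integrated against an exponentiated Weibull (EW) irradiance pdf. The EW parameters, the SNR value, and the conditional-BER convention Q(sqrt(2·snr)·I) used below are all assumptions for illustration, not values from the paper.

```python
# Average BPSK BER over an exponentiated Weibull irradiance pdf.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

alpha, eta, sigma = 2.0, 1.5, 1.0      # assumed EW shape/shape/scale
snr = 10.0                              # assumed average electrical SNR

def ew_pdf(i):
    # EW pdf: derivative of the CDF [1 - exp(-(i/sigma)**eta)]**alpha
    z = (i / sigma) ** eta
    return (alpha * eta / sigma) * (i / sigma) ** (eta - 1) * \
        np.exp(-z) * (1.0 - np.exp(-z)) ** (alpha - 1)

def conditional_ber(i):
    return norm.sf(np.sqrt(2.0 * snr) * i)      # Gaussian Q-function

ber, _ = quad(lambda i: conditional_ber(i) * ew_pdf(i), 0, np.inf)
print(f"average BER = {ber:.3e}")
```

The paper evaluates this integral with a generalized Gauss-Laguerre rule; direct adaptive quadrature, as here, serves the same purpose for a quick check.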
Tojinbara, Kageaki; Sugiura, K; Yamada, A; Kakitani, I; Kwan, N C L; Sugiura, K
2016-01-01
Data of 98 rabies cases in dogs and cats from the 1948-1954 rabies epidemic in Tokyo were used to estimate the probability distribution of the incubation period. Lognormal, gamma and Weibull distributions were used to model the incubation period. The maximum likelihood estimates of the mean incubation period ranged from 27.30 to 28.56 days according to different distributions. The mean incubation period was shortest with the lognormal distribution (27.30 days), and longest with the Weibull distribution (28.56 days). The best distribution in terms of AIC value was the lognormal distribution with mean value of 27.30 (95% CI: 23.46-31.55) days and standard deviation of 20.20 (15.27-26.31) days. There were no significant differences between the incubation periods for dogs and cats, or between those for male and female dogs. Copyright © 2015 Elsevier B.V. All rights reserved.
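The model-selection procedure used above can be sketched as follows: fit lognormal, gamma and Weibull distributions by maximum likelihood and rank them by AIC. The data here are synthetic stand-ins roughly matching the reported mean and standard deviation, not the Tokyo rabies records.

```python
# Fit three candidate incubation-period distributions and compare by AIC.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Synthetic incubation periods, roughly mean 27 d, sd 20 d
periods = rng.lognormal(mean=3.06, sigma=0.65, size=98)

fits = {
    "lognormal": (stats.lognorm, stats.lognorm.fit(periods, floc=0)),
    "gamma": (stats.gamma, stats.gamma.fit(periods, floc=0)),
    "weibull": (stats.weibull_min, stats.weibull_min.fit(periods, floc=0)),
}

aic = {}
for name, (dist, params) in fits.items():
    ll = np.sum(dist.logpdf(periods, *params))
    k = len(params) - 1            # location was fixed, not estimated
    aic[name] = 2 * k - 2 * ll

best = min(aic, key=aic.get)
print({k: round(v, 1) for k, v in aic.items()}, "best:", best)
```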
Availability Estimation for Facilities in Extreme Geographical Locations
NASA Technical Reports Server (NTRS)
Fischer, Gerd M.; Omotoso, Oluseun; Chen, Guangming; Evans, John W.
2012-01-01
A value-added analysis of the Reliability, Availability and Maintainability of McMurdo Ground Station was developed, which will be a useful tool for system managers in sparing, maintenance planning and determining vital performance metrics needed for readiness assessment of the upgrades to the McMurdo System. Output of this study can also be used as inputs and recommendations for the application of Reliability Centered Maintenance (RCM) for the system. ReliaSoft's BlockSim, a commercial reliability analysis software package, has been used to model the availability of the system upgrade to the National Aeronautics and Space Administration (NASA) Near Earth Network (NEN) Ground Station at McMurdo Station in Antarctica. The logistics challenges due to the closure of access to McMurdo Station during the Antarctic winter were modeled using a weighted composite of four Weibull distributions, one of the possible choices for statistical distributions in the software program and usually used to account for failure rates of components supplied by different manufacturers. The inaccessibility of the antenna site on a hill outside McMurdo Station throughout the year due to severe weather was modeled with a Weibull distribution for the repair-crew availability. The Weibull distribution is based on an analysis of the available weather data for the antenna site for 2007 in combination with the rules for travel restrictions due to severe weather imposed by the administrating agency, the National Science Foundation (NSF). The simulations resulted in an upper bound for the system availability and allowed for identification of components that would improve availability based on a higher on-site spare count than initially planned.
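The weighted Weibull composite mentioned above is a finite mixture: failure times are drawn from one of several Weibull distributions with given weights, as one might model parts from different manufacturers. The weights and parameters below are assumed, not from the McMurdo study.

```python
# Sampling from a four-component Weibull mixture and checking it against
# the analytic mixture survival function.
import numpy as np

rng = np.random.default_rng(5)
weights = np.array([0.4, 0.3, 0.2, 0.1])
shapes = np.array([1.2, 2.0, 0.8, 3.5])
scales = np.array([5000.0, 8000.0, 2000.0, 12000.0])   # hours, assumed

n = 10_000
component = rng.choice(4, size=n, p=weights)           # pick a component
times = scales[component] * rng.weibull(shapes[component])

def mixture_sf(t):
    # Weighted sum of the component survival functions
    return np.sum(weights * np.exp(-(t / scales) ** shapes))

empirical = np.mean(times > 5000.0)
print(f"P(T > 5000 h): empirical {empirical:.3f}, model {mixture_sf(5000.0):.3f}")
```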
NASA Astrophysics Data System (ADS)
Sanford, W. E.
2015-12-01
Age distributions of base flow to streams are important to estimate for predicting the timing of water-quality responses to changes in distributed inputs of nutrients or pollutants at the land surface. Simple models of shallow aquifers will predict exponential age distributions, but more realistic 3-D stream-aquifer geometries will cause deviations from an exponential curve. In addition, in fractured rock terrains the dual nature of the effective and total porosity of the system complicates the age distribution further. In this study shallow groundwater flow and advective transport were simulated in two regions in the Eastern United States—the Delmarva Peninsula and the upper Potomac River basin. The former is underlain by layers of unconsolidated sediment, while the latter consists of folded and fractured sedimentary rocks. Transport of groundwater to streams was simulated using the USGS code MODPATH within 175 and 275 watersheds, respectively. For the fractured rock terrain, calculations were also performed along flow pathlines to account for exchange between mobile and immobile flow zones. Porosities at both sites were calibrated using environmental tracer data (3H, 3He, CFCs and SF6) in wells and springs, and with a 30-year tritium record from the Potomac River. Carbonate and siliciclastic rocks were calibrated to have mobile porosity values of one and six percent, and immobile porosity values of 18 and 12 percent, respectively. The age distributions were fitted to Weibull functions. Whereas an exponential function has one parameter that controls the median age of the distribution, a Weibull function has an extra parameter that controls the slope of the curve. A weighted Weibull function was also developed that potentially allows for four parameters, two that control the median age and two that control the slope, one of each weighted toward early or late arrival times. 
For both systems the two-parameter Weibull function nearly always produced a substantially better fit to the data than the one-parameter exponential function. For the single porosity system it was found that the use of three parameters was often optimal for accurately describing the base-flow age distribution, whereas for the dual porosity system the fourth parameter was often required to fit the more complicated response curves.
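The curve-fitting comparison above (one-parameter exponential versus two-parameter Weibull) can be sketched on a synthetic age distribution; the Weibull's extra shape parameter is what lets it track non-exponential curves. All values here are assumed for illustration.

```python
# Fit exponential and two-parameter Weibull CDFs to a synthetic
# base-flow age distribution and compare residuals.
import numpy as np
from scipy.optimize import curve_fit

ages = np.linspace(1, 100, 50)                    # years
# Synthetic cumulative age distribution with a non-exponential shape
true_cdf = 1 - np.exp(-(ages / 30.0) ** 1.7)

def exp_cdf(t, tau):
    return 1 - np.exp(-t / tau)

def weibull_cdf(t, tau, k):
    return 1 - np.exp(-(t / tau) ** k)

(tau_e,), _ = curve_fit(exp_cdf, ages, true_cdf, p0=[20])
(tau_w, k_w), _ = curve_fit(weibull_cdf, ages, true_cdf, p0=[20, 1])

sse_e = np.sum((exp_cdf(ages, tau_e) - true_cdf) ** 2)
sse_w = np.sum((weibull_cdf(ages, tau_w, k_w) - true_cdf) ** 2)
print(f"SSE exponential {sse_e:.4f} vs Weibull {sse_w:.2e}")
```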
Large-Scale Weibull Analysis of H-451 Nuclear- Grade Graphite Specimen Rupture Data
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Walker, Andrew; Baker, Eric H.; Murthy, Pappu L.; Bratton, Robert L.
2012-01-01
A Weibull analysis was performed of the strength distribution and size effects for 2000 specimens of H-451 nuclear-grade graphite. The data, generated elsewhere, measured the tensile and four-point-flexure room-temperature rupture strength of specimens excised from a single extruded graphite log. Strength variation was compared with specimen location, size, and orientation relative to the parent body. In our study, data were progressively and extensively pooled into larger data sets to discriminate overall trends from local variations and to investigate the strength distribution. The CARES/Life and WeibPar codes were used to investigate issues regarding the size effect, Weibull parameter consistency, and nonlinear stress-strain response. Overall, the Weibull distribution described the behavior of the pooled data very well. However, the issue regarding the smaller-than-expected size effect remained. This exercise illustrated that a conservative approach using a two-parameter Weibull distribution is best for designing graphite components with low probability of failure for the in-core structures in the proposed Generation IV (Gen IV) high-temperature gas-cooled nuclear reactors. This exercise also demonstrated the continuing need to better understand the mechanisms driving stochastic strength response. Extensive appendixes are provided with this report to show all aspects of the rupture data and analytical results.
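The size-effect issue discussed above refers to the classical weakest-link prediction that characteristic strength scales with specimen volume as σ2/σ1 = (V1/V2)^(1/m). A one-line numerical illustration with assumed values (not the H-451 data):

```python
# Classical Weibull size-effect prediction for two specimen volumes.
m = 10.0                      # assumed Weibull modulus (shape)
sigma1, v1 = 20.0, 1.0        # MPa, reference volume (assumed)
v2 = 8.0                      # larger specimen, 8x the reference volume

sigma2 = sigma1 * (v1 / v2) ** (1.0 / m)
print(f"predicted strength at 8x volume: {sigma2:.2f} MPa")
```

A "smaller-than-expected size effect", as reported above, means the measured strength drop between sizes is less than this relation predicts.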
Seuba, Jordi; Deville, Sylvain; Guizard, Christian; Stevenson, Adam J
2016-01-01
Macroporous ceramics exhibit an intrinsic strength variability caused by the random distribution of defects in their structure. However, the precise role of microstructural features, other than pore volume, on reliability is still unknown. Here, we analyze the applicability of the Weibull analysis to unidirectional macroporous yttria-stabilized-zirconia (YSZ) prepared by ice-templating. First, we performed crush tests on samples with controlled microstructural features with the loading direction parallel to the porosity. The compressive strength data were fitted using two different fitting techniques, ordinary least squares and Bayesian Markov Chain Monte Carlo, to evaluate whether Weibull statistics are an adequate descriptor of the strength distribution. The statistical descriptors indicated that the strength data are well described by the Weibull statistical approach, for both fitting methods used. Furthermore, we assess the effect of different microstructural features (volume, size, densification of the walls, and morphology) on Weibull modulus and strength. We found that the key microstructural parameter controlling reliability is wall thickness. In contrast, pore volume is the main parameter controlling the strength. The highest Weibull modulus ([Formula: see text]) and mean strength (198.2 MPa) were obtained for the samples with the smallest and narrowest wall thickness distribution (3.1 [Formula: see text]m) and lower pore volume (54.5%).
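Of the two fitting techniques compared above, the ordinary-least-squares route is commonly implemented as median-rank regression on the linearized Weibull CDF, ln(-ln(1-F)) = m·ln(σ) - m·ln(σ0). A sketch with synthetic strength data (the true parameter values are assumptions loosely echoing the reported magnitudes):

```python
# Median-rank regression (OLS) estimate of Weibull modulus and
# characteristic strength.
import numpy as np

rng = np.random.default_rng(6)
strengths = np.sort(198.0 * rng.weibull(9.0, size=30))   # MPa, synthetic

n = len(strengths)
ranks = np.arange(1, n + 1)
f = (ranks - 0.3) / (n + 0.4)          # Bernard's median-rank approximation

x = np.log(strengths)
y = np.log(-np.log(1 - f))
m, intercept = np.polyfit(x, y, 1)     # slope = Weibull modulus
sigma0 = np.exp(-intercept / m)        # characteristic strength

print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma0:.0f} MPa")
```

A Bayesian MCMC fit, the study's second technique, would instead place priors on m and σ0 and sample the posterior; both approaches can then be compared on the same data.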
NASA Astrophysics Data System (ADS)
Seuba, Jordi; Deville, Sylvain; Guizard, Christian; Stevenson, Adam J.
2016-01-01
Macroporous ceramics exhibit an intrinsic strength variability caused by the random distribution of defects in their structure. However, the precise role of microstructural features, other than pore volume, on reliability is still unknown. Here, we analyze the applicability of the Weibull analysis to unidirectional macroporous yttria-stabilized-zirconia (YSZ) prepared by ice-templating. First, we performed crush tests on samples with controlled microstructural features with the loading direction parallel to the porosity. The compressive strength data were fitted using two different fitting techniques, ordinary least squares and Bayesian Markov Chain Monte Carlo, to evaluate whether Weibull statistics are an adequate descriptor of the strength distribution. The statistical descriptors indicated that the strength data are well described by the Weibull statistical approach, for both fitting methods used. Furthermore, we assess the effect of different microstructural features (volume, size, densification of the walls, and morphology) on Weibull modulus and strength. We found that the key microstructural parameter controlling reliability is wall thickness. In contrast, pore volume is the main parameter controlling the strength. The highest Weibull modulus and mean strength (198.2 MPa) were obtained for the samples with the smallest and narrowest wall thickness distribution (3.1 µm) and lower pore volume (54.5%).
1989-08-01
Random variables for the conditional exponential distribution are generated using the inverse transform method: (1) generate U ~ U(0,1); (2) set s = -A ln U. Random variables from the conditional Weibull distribution, whose survival function takes the form exp{-[(x+s-γ)/η]^β + [(x-γ)/η]^β}, are likewise generated using the inverse transform method, and normal variates are obtained using a standard normal transformation together with the inverse transform method. (Appendix 3: distributions supported by the model.)
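The reconstructed steps above amount to inverse transform sampling. A minimal Python sketch for the (unconditional) three-parameter Weibull case, with illustrative parameter values not taken from the report:

```python
import math
import random

def weibull_inverse_transform(beta, eta, gamma=0.0):
    """One three-parameter Weibull variate via the inverse transform:
    X = gamma + eta * (-ln(1 - U))**(1/beta), with U ~ U(0,1)."""
    u = random.random()
    return gamma + eta * (-math.log(1.0 - u)) ** (1.0 / beta)

random.seed(42)
sample = [weibull_inverse_transform(2.0, 1.0) for _ in range(100000)]
mean = sum(sample) / len(sample)
# E[X] = eta * Gamma(1 + 1/beta) = sqrt(pi)/2 ≈ 0.886 for beta=2, eta=1, gamma=0
```

Conditioning on survival to a given x only changes the quantile expression; the mechanics of the method are identical.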
NASA Astrophysics Data System (ADS)
Shioiri, Tetsu; Asari, Naoki; Sato, Junichi; Sasage, Kosuke; Yokokura, Kunio; Homma, Mitsutaka; Suzuki, Katsumi
To investigate the reliability of vacuum insulation equipment, a study was carried out to clarify breakdown probability distributions in a vacuum gap. Further, a double-break vacuum circuit breaker was investigated for its breakdown probability distribution. The test results show that the breakdown probability distribution of the vacuum gap can be represented by a Weibull distribution using a location parameter, which represents the voltage at which the breakdown probability is zero. The location parameter obtained from the Weibull plot depends on electrode area. The shape parameter obtained from the Weibull plot of the vacuum gap was 10∼14, and is constant irrespective of the non-uniform field factor. The breakdown probability distribution after no-load switching can also be represented by a Weibull distribution using a location parameter. The shape parameter after no-load switching was 6∼8.5, and is constant irrespective of gap length. This indicates that the scatter of breakdown voltage was increased by no-load switching. If the vacuum circuit breaker uses a double break, the breakdown probability at low voltage becomes lower than the single-break probability. Although the potential distribution is a concern in the double-break vacuum circuit breaker, its insulation reliability is better than that of the single-break vacuum interrupter even if the bias of the vacuum interrupters' sharing voltage is taken into account.
HPC simulations of grain-scale spallation to improve thermal spallation drilling
NASA Astrophysics Data System (ADS)
Walsh, S. D.; Lomov, I.; Wideman, T. W.; Potter, J.
2012-12-01
Thermal spallation drilling and related hard-rock hole opening techniques are transformative technologies with the potential to dramatically reduce the costs associated with EGS well drilling and improve the productivity of new and existing wells. In contrast to conventional drilling methods that employ mechanical means to penetrate rock, thermal spallation methods fragment rock into small pieces ("spalls") without contact via the rapid transmission of heat to the rock surface. State-of-the-art constitutive models of thermal spallation employ Weibull statistical failure theory to represent the relationship between rock heterogeneity and its propensity to produce spalls when heat is applied to the rock surface. These models have been successfully used to predict such factors as penetration rate, spall-size distribution and borehole radius from drilling jet velocity and applied heat flux. A properly calibrated Weibull model would permit design optimization of thermal spallation drilling under geothermal field conditions. However, although useful for predicting system response in a given context, Weibull models are by their nature empirically derived. In the past, the parameters used in these models were carefully determined from laboratory tests, and thus model applicability was limited by experimental scope. This becomes problematic, for example, if simulating spall production at depths relevant for geothermal energy production, or modeling thermal spallation drilling in new rock types. Nevertheless, with sufficient computational resources, Weibull models could be validated in the absence of experimental data by explicit small-scale simulations that fully resolve rock grains. 
This presentation will discuss how high-fidelity simulations can be used to inform Weibull models of thermal spallation, and what these simulations reveal about the processes driving spallation at the grain-scale - in particular, the role that inter-grain boundaries and micro-pores play in the onset and extent of spallation. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Rafal Podlaski; Francis .A. Roesch
2013-01-01
The goals of this study are (1) to analyse the accuracy of the approximation of empirical distributions of diameter at breast height (dbh) using two-component mixtures of either the Weibull distribution or the gamma distribution in two-cohort stands, and (2) to discuss the procedure of choosing goodness-of-fit tests. The study plots were...
Stawarczyk, Bogna; Ozcan, Mutlu; Hämmerle, Christoph H F; Roos, Malgorzata
2012-05-01
The aim of this study was to compare the fracture load of veneered anterior zirconia crowns using normal and Weibull distributions of complete and censored data. Standardized zirconia frameworks for maxillary canines were milled using a CAD/CAM system and randomly divided into 3 groups (N=90, n=30 per group). They were veneered with three veneering ceramics, namely GC Initial ZR, Vita VM9, and IPS e.max Ceram, using the layering technique. The crowns were cemented with glass ionomer cement on metal abutments. The specimens were then loaded to fracture (1 mm/min) in a universal testing machine. The data were analyzed using the classical method (normal data distribution (μ, σ); Levene test and one-way ANOVA) and according to Weibull statistics (s, m). In addition, fracture load results were analyzed depending on complete and censored failure types (only chipping vs. total fracture together with chipping). When computed with complete data, significantly higher mean fracture loads (N) were observed for GC Initial ZR (μ=978, σ=157; s=1043, m=7.2) and VITA VM9 (μ=1074, σ=179; s=1139, m=7.8) than for IPS e.max Ceram (μ=798, σ=174; s=859, m=5.8) (p<0.05) by classical and Weibull statistics, respectively. When the data were censored for only total fracture, IPS e.max Ceram presented the lowest fracture load for chipping with both the classical distribution (μ=790, σ=160) and Weibull statistics (s=836, m=6.5). When total fracture with chipping (classical distribution) was considered as failure, IPS e.max Ceram did not show a significantly different fracture load for total fracture (μ=1054, σ=110) compared to the other groups (GC Initial ZR: μ=1039, σ=152; VITA VM9: μ=1170, σ=166). According to the Weibull distributed data, VITA VM9 showed a significantly higher fracture load (s=1228, m=9.4) than those of the other groups. Both the classical distribution and Weibull statistics for complete data yielded similar outcomes.
Censored data analysis of all ceramic systems based on failure types is essential and brings additional information regarding the susceptibility to chipping or total fracture. Copyright © 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
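The Weibull statistics (s, m) quoted above are typically obtained by maximum likelihood. A sketch of the standard profile-likelihood Newton iteration for the complete-data case, run on synthetic fracture loads rather than the paper's raw data:

```python
import math

def weibull_mle(data, tol=1e-10, max_iter=200):
    """Maximum-likelihood estimates of the Weibull shape m and scale s.
    Newton's method solves the profile score equation in m; the data are
    normalized by their maximum for numerical stability (m is scale-free)."""
    top = max(data)
    xs = [x / top for x in data]
    logs = [math.log(x) for x in xs]
    mean_log = sum(logs) / len(xs)
    m = 1.0
    for _ in range(max_iter):
        xm = [x ** m for x in xs]
        s1 = sum(xm)
        s2 = sum(v * l for v, l in zip(xm, logs))
        s3 = sum(v * l * l for v, l in zip(xm, logs))
        f = s2 / s1 - 1.0 / m - mean_log                       # profile score
        fp = (s3 * s1 - s2 * s2) / (s1 * s1) + 1.0 / (m * m)   # its derivative
        step = f / fp
        m -= step
        if abs(step) < tol:
            break
    s = top * (sum(x ** m for x in xs) / len(xs)) ** (1.0 / m)
    return m, s

# illustrative fracture loads in N (synthetic, not the study's measurements)
loads = [978.0, 1043.0, 859.0, 1139.0, 900.0, 1100.0, 1010.0, 950.0, 1070.0, 990.0]
m_hat, s_hat = weibull_mle(loads)
```

The scale s lands at the characteristic (63.2nd-percentile) load, which is why Weibull s values in the abstract sit above the corresponding means μ.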
A New Lifetime Distribution with Bathtub and Unimodal Hazard Function
NASA Astrophysics Data System (ADS)
Barriga, Gladys D. C.; Louzada-Neto, Francisco; Cancho, Vicente G.
2008-11-01
In this paper we propose a new lifetime distribution which accommodates bathtub-shaped, unimodal, increasing and decreasing hazard functions. Some particular cases are derived, including the standard Weibull distribution. Maximum likelihood estimation is considered for estimating the three parameters present in the model. The methodology is illustrated on a real data set of industrial devices on a life test.
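For context on why a generalization is needed: a single Weibull hazard is always monotone, so it cannot produce the bathtub or unimodal shapes the proposed distribution accommodates. A quick illustration:

```python
import math

def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta - 1)."""
    return (beta / eta) * (t / eta) ** (beta - 1.0)

# beta < 1 gives a strictly decreasing hazard, beta > 1 a strictly increasing
# one, and beta = 1 the constant hazard of the exponential distribution;
# no single Weibull is bathtub-shaped.
decreasing = [weibull_hazard(t, 0.5, 1.0) for t in (0.5, 1.0, 2.0)]
increasing = [weibull_hazard(t, 2.0, 1.0) for t in (0.5, 1.0, 2.0)]
constant = [weibull_hazard(t, 1.0, 1.0) for t in (0.5, 1.0, 2.0)]
```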
Experimental investigation of mode I fracture for brittle tube-shaped particles
NASA Astrophysics Data System (ADS)
Stasiak, Marta; Combe, Gaël; Desrues, Jacques; Richefeu, Vincent; Villard, Pascal; Armand, Gilles; Zghondi, Jad
2017-06-01
We focus herein on the mechanical behavior of highly crushable grains. The object of our interest, named shell, is a hollow cylinder grain with ring cross-section, made of baked clay. The objective is to model the fragmentation of such shells, by means of discrete element (DE) approach. To this end, fracture modes I (opening fracture) and II (in-plane shear fracture) have to be investigated experimentally. This paper is essentially dedicated to mode I fracture. Therefore, a campaign of Brazilian-like compression tests, that result in crack opening, has been performed. The distribution of the occurrence of tensile strength is shown to obey a Weibull distribution for the studied shells, and Weibull's modulus was quantified. Finally, an estimate of the numerical/physical parameters required in a DE model (local strength), is proposed on the basis of the energy required to fracture through a given surface in mode I or II.
A Novel Solution-Technique Applied to a Novel WAAS Architecture
NASA Technical Reports Server (NTRS)
Bavuso, J.
1998-01-01
The Federal Aviation Administration has embarked on an historic task of modernizing and significantly improving the national air transportation system. One system that uses the Global Positioning System (GPS) to determine aircraft navigational information is called the Wide Area Augmentation System (WAAS). This paper describes a reliability assessment of one candidate system architecture for the WAAS. A unique aspect of this study concerns the modeling and solution of a candidate system that allows a novel cold sparing scheme. The cold spare is a WAAS communications satellite that is fabricated and launched after a predetermined number of orbiting satellite failures have occurred and after some stochastic fabrication time transpires. Because these satellites are complex systems with redundant components, they exhibit an increasing failure rate with a Weibull time-to-failure distribution. Moreover, the cold spare satellite build time is Weibull distributed, and upon launch the spare is considered to be a good-as-new system, again with an increasing failure rate and a Weibull time-to-failure distribution. The reliability model for this system is non-Markovian because three distinct system clocks are required: the time to failure of the orbiting satellites, the build time for the cold spare, and the time to failure for the launched spare satellite. A powerful dynamic fault tree modeling notation and a Monte Carlo simulation technique with importance sampling are shown to arrive at a reliability prediction for a 10-year mission.
1978-03-01
...for the risk of rupture for a unidirectionally laminated composite subjected to pure bending. This equation can be simplified further by use of... EVALUATION OF THE THREE PARAMETER WEIBULL DISTRIBUTION FUNCTION FOR PREDICTING FRACTURE PROBABILITY IN COMPOSITE MATERIALS. Thesis, AFIT/GAE.
Design of ceramic components with the NASA/CARES computer program
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Manderscheid, Jane M.; Gyekenyesi, John P.
1990-01-01
The ceramics analysis and reliability evaluation of structures (CARES) computer program is described. The primary function of the code is to calculate the fast-fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings, such as those found in heat engine applications. CARES uses results from MSC/NASTRAN or ANSYS finite-element analysis programs to evaluate how inherent surface and/or volume type flaws affect component reliability. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effects of multiaxial stress states on material strength. The principle of independent action (PIA) and the Weibull normal stress averaging models are also included. Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities are estimated from four-point bend bar or uniform uniaxial tensile specimen fracture strength data. Parameter estimation can be performed for single or multiple failure modes by using a least-squares analysis or a maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests, 90 percent confidence intervals on the Weibull parameters, and Kanofsky-Srinivasan 90 percent confidence band values are also provided. Examples are provided to illustrate the various features of CARES.
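The PIA model mentioned above combines tensile principal stresses into a single Weibull failure probability. A minimal sketch for a uniform stress state over unit volume (the actual CARES code integrates the risk of rupture over the finite-element stress field):

```python
import math

def pia_failure_probability(principal_stresses, sigma0, m, volume=1.0):
    """Fast-fracture failure probability under the principle of independent
    action (PIA): each tensile principal stress contributes an independent
    Weibull risk term, P_f = 1 - exp(-V * sum((s_i / sigma0)**m)).
    Compressive (negative) stresses are assumed not to drive fracture."""
    risk = sum((s / sigma0) ** m for s in principal_stresses if s > 0.0)
    return 1.0 - math.exp(-volume * risk)

# uniaxial check: at s = sigma0 over unit volume, the classic 63.2% point
p_uni = pia_failure_probability([100.0], sigma0=100.0, m=10.0)
# adding tensile transverse stresses can only increase the failure probability
p_multi = pia_failure_probability([100.0, 60.0, 20.0], sigma0=100.0, m=10.0)
```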
The effect of mis-specification on mean and selection between the Weibull and lognormal models
NASA Astrophysics Data System (ADS)
Jia, Xiang; Nadarajah, Saralees; Guo, Bo
2018-02-01
The lognormal and Weibull models are commonly used to analyse data. Although selection procedures have been extensively studied, it is possible that the lognormal model could be selected when the true model is Weibull, or vice versa. As the mean is important in applications, we focus on the effect of mis-specification on the mean. The effect on the lognormal mean is first considered when a lognormal sample is wrongly fitted by a Weibull model. The maximum likelihood estimate (MLE) and quasi-MLE (QMLE) of the lognormal mean are obtained based on the lognormal and Weibull models. Then, the impact is evaluated by computing the ratio of biases and the ratio of mean squared errors (MSEs) between the MLE and QMLE. For completeness, the theoretical results are demonstrated by simulation studies. Next, the effect of the reverse mis-specification on the Weibull mean is discussed. It is found that the ratio of biases and the ratio of MSEs are independent of the location and scale parameters of the lognormal and Weibull models. The influence can be ignored if some special conditions hold. Finally, a model selection method is proposed by comparing ratios concerning biases and MSEs. We also present published data to illustrate the study in this paper.
Pal, Suvra; Balakrishnan, N
2017-10-01
In this paper, we consider a competing cause scenario and assume the number of competing causes to follow a Conway-Maxwell Poisson distribution which can capture both over and under dispersion that is usually encountered in discrete data. Assuming the population of interest having a component cure and the form of the data to be interval censored, as opposed to the usually considered right-censored data, the main contribution is in developing the steps of the expectation maximization algorithm for the determination of the maximum likelihood estimates of the model parameters of the flexible Conway-Maxwell Poisson cure rate model with Weibull lifetimes. An extensive Monte Carlo simulation study is carried out to demonstrate the performance of the proposed estimation method. Model discrimination within the Conway-Maxwell Poisson distribution is addressed using the likelihood ratio test and information-based criteria to select a suitable competing cause distribution that provides the best fit to the data. A simulation study is also carried out to demonstrate the loss in efficiency when selecting an improper competing cause distribution which justifies the use of a flexible family of distributions for the number of competing causes. Finally, the proposed methodology and the flexibility of the Conway-Maxwell Poisson distribution are illustrated with two known data sets from the literature: smoking cessation data and breast cosmesis data.
NASA Astrophysics Data System (ADS)
Janković, Bojan
2009-10-01
The decomposition process of sodium bicarbonate (NaHCO3) has been studied by thermogravimetry in isothermal conditions at four different operating temperatures (380 K, 400 K, 420 K, and 440 K). It was found that the experimental integral and differential conversion curves at the different operating temperatures can be successfully described by the isothermal Weibull distribution function with a unique value of the shape parameter (β = 1.07). It was also established that the Weibull distribution parameters (β and η) are independent of the operating temperature. Using the integral and differential (Friedman) isoconversional methods, in the conversion (α) range of 0.20 ≤ α ≤ 0.80, the apparent activation energy (Ea) value was approximately constant (Ea,int = 95.2 kJ mol^-1 and Ea,diff = 96.6 kJ mol^-1, respectively). The values of Ea calculated by both isoconversional methods are in good agreement with the value of Ea evaluated from the Arrhenius equation (94.3 kJ mol^-1), which was expressed through the scale distribution parameter (η). The Málek isothermal procedure was used for estimation of the kinetic model for the investigated decomposition process. It was found that the two-parameter Šesták-Berggren (SB) autocatalytic model best describes the NaHCO3 decomposition process, with the conversion function f(α) = α^0.18 (1-α)^1.19. It was also concluded that the calculated density distribution functions of the apparent activation energies (ddfEa's) do not depend on the operating temperature and exhibit highly symmetrical behavior (shape factor = 1.00). The obtained isothermal decomposition results were compared with the corresponding results of the nonisothermal decomposition process of NaHCO3.
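The two quantities at the heart of this analysis, the isothermal Weibull conversion curve and the Arrhenius estimate of Ea from the scale parameter η, can be sketched as follows (a two-temperature form is shown for brevity; the paper works over four temperatures with isoconversional methods):

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def conversion(t, beta, eta):
    """Isothermal Weibull conversion curve alpha(t) = 1 - exp(-(t/eta)**beta)."""
    return 1.0 - math.exp(-((t / eta) ** beta))

def arrhenius_ea(eta1, T1, eta2, T2):
    """Apparent activation energy from scale parameters at two temperatures,
    assuming the rate constant k = 1/eta is Arrhenius: k = A exp(-Ea/(R*T))."""
    return R * math.log(eta1 / eta2) / (1.0 / T1 - 1.0 / T2)

# synthetic check: scale parameters generated from an assumed Ea are recovered
ea_true = 95200.0  # J mol^-1, the order of magnitude reported above
eta_380 = math.exp(ea_true / (R * 380.0))
eta_440 = math.exp(ea_true / (R * 440.0))
ea_est = arrhenius_ea(eta_380, 380.0, eta_440, 440.0)
```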
Statistical study of air pollutant concentrations via generalized gamma distribution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marani, A.; Lavagnini, I.; Buttazzoni, C.
1986-11-01
This paper deals with modeling observed frequency distributions of air quality data measured in the area of Venice, Italy. The paper discusses the application of the generalized gamma distribution (ggd) which has not been commonly applied to air quality data notwithstanding the fact that it embodies most distribution models used for air quality analyses. The approach yields important simplifications for statistical analyses. A comparison among the ggd and other relevant models (standard gamma, Weibull, lognormal), carried out on daily sulfur dioxide concentrations in the area of Venice underlines the efficiency of ggd models in portraying experimental data.
A Weibull characterization for tensile fracture of multicomponent brittle fibers
NASA Technical Reports Server (NTRS)
Barrows, R. G.
1977-01-01
A statistical characterization for multicomponent brittle fibers is presented. The method, an extension of the usual Weibull distribution procedures, statistically considers the components making up a fiber (e.g., substrate, sheath, and surface) both as separate entities and taken together as a fiber. Tensile data for silicon carbide fiber and for an experimental carbon-boron alloy fiber are evaluated in terms of the proposed multicomponent Weibull characterization.
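The multicomponent idea, treating substrate, sheath, and surface as weakest-link elements in series, can be sketched as a product of component Weibull survival probabilities (the parameter values here are illustrative, not the fiber data of the report):

```python
import math

def multicomponent_survival(stress, components):
    """Weakest-link survival of a fiber whose components (e.g. substrate,
    sheath, surface) each follow their own two-parameter Weibull law:
    the fiber survives only if every component survives."""
    return math.prod(math.exp(-((stress / s0) ** m)) for (s0, m) in components)

# a single component reduces to an ordinary Weibull survival probability
one = multicomponent_survival(2.0, [(3.0, 5.0)])
# adding components can only lower the fiber's survival probability
combo = multicomponent_survival(2.0, [(3.0, 5.0), (4.0, 8.0), (2.5, 3.0)])
```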
NASA Astrophysics Data System (ADS)
Zuo, Weiguang; Liu, Ming; Fan, Tianhui; Wang, Pengtao
2018-06-01
This paper presents the probability distribution of the slamming pressure from an experimental study of regular wave slamming on an elastically supported horizontal deck. The time series of the slamming pressure during the wave impact were first obtained through statistical analyses on experimental data. The exceeding probability distribution of the maximum slamming pressure peak and distribution parameters were analyzed, and the results show that the exceeding probability distribution of the maximum slamming pressure peak accords with the three-parameter Weibull distribution. Furthermore, the range and relationships of the distribution parameters were studied. The sum of the location parameter D and the scale parameter L was approximately equal to 1.0, and the exceeding probability was more than 36.79% when the random peak was equal to the sample average during the wave impact. The variation of the distribution parameters and slamming pressure under different model conditions were comprehensively presented, and the parameter values of the Weibull distribution of wave-slamming pressure peaks were different due to different test models. The parameter values were found to decrease due to the increased stiffness of the elastic support. The damage criterion of the structure model caused by the wave impact was initially discussed, and the structure model was destroyed when the average slamming time was greater than a certain value during the duration of the wave impact. The conclusions of the experimental study were then described.
NASA Astrophysics Data System (ADS)
Huang, D.; Liu, Y.
2014-12-01
The effects of subgrid cloud variability on grid-average microphysical rates and radiative fluxes are examined by use of long-term retrieval products at the Tropical West Pacific (TWP), Southern Great Plains (SGP), and North Slope of Alaska (NSA) sites of the Department of Energy's Atmospheric Radiation Measurement (ARM) Program. Four commonly used distribution functions, the truncated Gaussian, Gamma, lognormal, and Weibull distributions, are constrained to have the same mean and standard deviation as the observed cloud liquid water content. The PDFs are then used to upscale relevant physical processes to obtain grid-average process rates. It is found that the truncated Gaussian representation results in up to 30% mean bias in autoconversion rate, whereas the mean bias for the lognormal representation is about 10%. The Gamma and Weibull distribution functions perform best for the grid-average autoconversion rate, with a mean relative bias of less than 5%. For radiative fluxes, the lognormal and truncated Gaussian representations perform better than the Gamma and Weibull representations. The results show that the optimal choice of subgrid cloud distribution function depends on the nonlinearity of the process of interest, and thus there is no single distribution function that works best for all parameterizations. Examination of the scale (window size) dependence of the mean bias indicates that the bias in grid-average process rates monotonically increases with increasing window size, suggesting the increasing importance of subgrid variability with increasing grid sizes.
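The upscaling procedure described above, averaging a nonlinear local rate over an assumed subgrid PDF, can be sketched with a lognormal PDF constrained to a given mean and standard deviation and an illustrative power-law autoconversion rate (the exponent is an assumption for this sketch, not a value from the study):

```python
import math
import random

def grid_average_rate(samples, exponent):
    """Grid-mean of a nonlinear local process rate r(q) = q**exponent,
    estimated by averaging over subgrid liquid-water-content samples."""
    return sum(q ** exponent for q in samples) / len(samples)

# assumed grid-mean and subgrid standard deviation of LWC (g m^-3)
mu, sigma = 0.3, 0.15
# lognormal subgrid PDF constrained to the same mean and standard deviation
s2 = math.log(1.0 + (sigma / mu) ** 2)
mlog = math.log(mu) - 0.5 * s2
random.seed(0)
q = [random.lognormvariate(mlog, math.sqrt(s2)) for _ in range(200000)]

p = 2.47  # illustrative autoconversion nonlinearity; parameterizations vary
upscaled = grid_average_rate(q, p)
naive = mu ** p  # evaluating the rate at the grid mean ignores subgrid variability
```

Because the rate is convex in q, ignoring subgrid variability systematically underestimates the grid-average rate (Jensen's inequality), which is the bias the four candidate PDFs are competing to remove.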
Testing homogeneity in Weibull-regression models.
Bolfarine, Heleno; Valença, Dione M
2005-10-01
In survival studies with families or geographical units it may be of interest testing whether such groups are homogeneous for given explanatory variables. In this paper we consider score type tests for group homogeneity based on a mixing model in which the group effect is modelled as a random variable. As opposed to hazard-based frailty models, this model presents survival times that conditioned on the random effect, has an accelerated failure time representation. The test statistics requires only estimation of the conventional regression model without the random effect and does not require specifying the distribution of the random effect. The tests are derived for a Weibull regression model and in the uncensored situation, a closed form is obtained for the test statistic. A simulation study is used for comparing the power of the tests. The proposed tests are applied to real data sets with censored data.
Set statistics in conductive bridge random access memory device with Cu/HfO{sub 2}/Pt structure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Meiyun; Long, Shibing, E-mail: longshibing@ime.ac.cn; Wang, Guoming
2014-11-10
The switching parameter variation of resistive switching memory is one of the most important challenges in its application. In this letter, we have studied the set statistics of conductive bridge random access memory with a Cu/HfO{sub 2}/Pt structure. The experimental distributions of the set parameters in several off resistance ranges are shown to nicely fit a Weibull model. The Weibull slopes of the set voltage and current increase and decrease logarithmically with off resistance, respectively. This experimental behavior is perfectly captured by a Monte Carlo simulator based on the cell-based set voltage statistics model and the Quantum Point Contact electron transport model. Our work provides indications for the improvement of the switching uniformity.
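The Weibull slopes discussed above are commonly read off a Weibull probability plot, where ln(-ln(1-F)) is linear in ln(x) with slope equal to the shape parameter. A sketch of the graphical least-squares estimator, applied to synthetic set voltages with an assumed slope:

```python
import math
import random

def weibull_slope(values):
    """Graphical Weibull slope: least-squares slope of ln(-ln(1 - F_i))
    versus ln(x_(i)), with median-rank-style positions F_i = (i - 0.5)/n."""
    xs = sorted(values)
    n = len(xs)
    pts = [(math.log(x), math.log(-math.log(1.0 - (i + 0.5) / n)))
           for i, x in enumerate(xs)]
    mean_x = sum(px for px, _ in pts) / n
    mean_y = sum(py for _, py in pts) / n
    num = sum((px - mean_x) * (py - mean_y) for px, py in pts)
    den = sum((px - mean_x) ** 2 for px, _ in pts)
    return num / den

# synthetic set voltages drawn from a Weibull with an assumed slope of 4
random.seed(7)
v_set = [3.2 * (-math.log(1.0 - random.random())) ** (1.0 / 4.0)
         for _ in range(2000)]
slope = weibull_slope(v_set)
```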
NASA Technical Reports Server (NTRS)
Starlinger, Alois; Duffy, Stephen F.; Palko, Joseph L.
1993-01-01
New methods are presented that utilize the optimization of goodness-of-fit statistics in order to estimate Weibull parameters from failure data. It is assumed that the underlying population is characterized by a three-parameter Weibull distribution. Goodness-of-fit tests are based on the empirical distribution function (EDF). The EDF is a step function, calculated using failure data, and represents an approximation of the cumulative distribution function for the underlying population. Statistics (such as the Kolmogorov-Smirnov statistic and the Anderson-Darling statistic) measure the discrepancy between the EDF and the cumulative distribution function (CDF). These statistics are minimized with respect to the three Weibull parameters. Due to nonlinearities encountered in the minimization process, Powell's numerical optimization procedure is applied to obtain the optimum value of the EDF. Numerical examples show the applicability of these new estimation methods. The results are compared to the estimates obtained with Cooper's nonlinear regression algorithm.
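The EDF statistics being minimized can be written compactly. A sketch of the Kolmogorov-Smirnov objective for a three-parameter Weibull, evaluated on synthetic failure data; the method described above then minimizes this over (shape, scale, location), e.g. with Powell's procedure:

```python
import math
import random

def ks_statistic(data, beta, eta, gamma):
    """Kolmogorov-Smirnov distance between the EDF of the failure data and a
    three-parameter Weibull CDF F(x) = 1 - exp(-(((x - gamma)/eta)**beta))."""
    xs = sorted(data)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = 0.0 if x <= gamma else 1.0 - math.exp(-(((x - gamma) / eta) ** beta))
        # the EDF jumps at each order statistic; check both sides of the step
        d = max(d, abs((i + 1) / n - f), abs(i / n - f))
    return d

# synthetic three-parameter Weibull failures (beta=2.5, eta=100, gamma=50 assumed)
random.seed(3)
fails = [50.0 + 100.0 * (-math.log(1.0 - random.random())) ** (1.0 / 2.5)
         for _ in range(2000)]
d_true = ks_statistic(fails, 2.5, 100.0, 50.0)   # near-correct parameters
d_off = ks_statistic(fails, 5.0, 100.0, 50.0)    # badly wrong shape
```

Minimizing d over the three parameters yields the "minimum distance" estimates; the Anderson-Darling statistic is handled the same way with a tail-weighted objective.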
NASA Astrophysics Data System (ADS)
Abas, Norzaida; Daud, Zalina M.; Yusof, Fadhilah
2014-11-01
A stochastic rainfall model is presented for the generation of hourly rainfall data in an urban area in Malaysia. In view of the high temporal and spatial variability of rainfall within the tropical rain belt, the Spatial-Temporal Neyman-Scott Rectangular Pulse model was used. The model, which is governed by the Neyman-Scott process, employs a reasonable number of parameters to represent the physical attributes of rainfall. A common approach is to attach each attribute to a mathematical distribution. With respect to rain cell intensity, this study proposes the use of a mixed exponential distribution. The performance of the proposed model was compared to a model that employs the Weibull distribution. Hourly and daily rainfall data from four stations in the Damansara River basin in Malaysia were used as input to the models, and simulations of hourly series were performed for an independent site within the basin. The performance of the models was assessed based on how closely the statistical characteristics of the simulated series resembled the statistics of the observed series. The findings obtained based on graphical representation revealed that the statistical characteristics of the simulated series for both models compared reasonably well with the observed series. However, a further assessment using the AIC, BIC and RMSE showed that the proposed model yields better results. The results of this study indicate that for tropical climates, the proposed model, using a mixed exponential distribution, is the best choice for generation of synthetic data for ungauged sites or for sites with insufficient data within the limit of the fitted region.
Lianjun Zhang; Jeffrey H. Gove; Chuangmin Liu; William B. Leak
2001-01-01
The rotated-sigmoid form is a characteristic of old-growth, uneven-aged forest stands caused by past disturbances such as cutting, fire, disease, and insect attacks. The diameter frequency distribution of the rotated-sigmoid form is bimodal with the second rounded peak in the midsized classes, rather than a smooth, steeply descending, monotonic curve. In this study a...
Establishment of a center of excellence for applied mathematical and statistical research
NASA Technical Reports Server (NTRS)
Woodward, W. A.; Gray, H. L.
1983-01-01
The state of the art was assessed with regard to efforts in support of the crop production estimation problem, and alternative generic proportion estimation techniques were investigated. Topics covered include modeling the greenness profile (Badhwar's model), parameter estimation using mixture models such as CLASSY, and minimum distance estimation as an alternative to maximum likelihood estimation. Approaches to the problem of obtaining proportion estimates when the underlying distributions are asymmetric are examined, including the properties of the Weibull distribution.
Assessing a Tornado Climatology from Global Tornado Intensity Distributions.
NASA Astrophysics Data System (ADS)
Feuerstein, Bernold; Dotzek, Nikolai; Grieser, Jürgen
2005-02-01
Recent work demonstrated that the shape of tornado intensity distributions from various regions worldwide is well described by Weibull functions. This statistical modeling revealed a strong correlation between the fit parameters c for shape and b for scale regardless of the data source. In the present work it is shown that the quality of the Weibull fits is optimized if only tornado reports of F1 and higher intensity are used and that the c-b correlation does indeed reflect a universal feature of the observed tornado intensity distributions. For regions with likely supercell tornado dominance, this feature is the number ratio of F4 to F3 tornado reports R(F4/F3) = 0.238. The c-b diagram for the Weibull shape and scale parameters is used as a climatological chart, which allows different types of tornado climatology to be distinguished, presumably arising from supercell versus nonsupercell tornadogenesis. Assuming temporal invariance of the climatology and using a detection efficiency function for tornado observations, a stationary climatological probability distribution from large tornado records (U.S. decadal data 1950-99) is extracted. This can be used for risk assessment, comparative studies on tornado intensity distributions worldwide, and estimates of the degree of underreporting for areas with poor databases. For the 1990s U.S. data, a likely tornado underreporting of the weak events (F0, F1) by a factor of 2 can be diagnosed, as well as asymptotic climatological c,b values of c = 1.79 and b = 2.13, to which a convergence in the 1950-99 U.S. decadal data is verified.
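Using the asymptotic c, b values quoted at the end, per-class Weibull probabilities and number ratios such as R(F4/F3) follow directly from the survival function. A sketch with class boundaries assumed at integer F values; the paper's exact ratio of 0.238 also depends on the precise discretization and detection-efficiency treatment, so the number below is only indicative:

```python
import math

def weibull_class_prob(lo, hi, c, b):
    """Probability that a Weibull(shape=c, scale=b) intensity falls in the
    F-scale class [lo, hi), from the survival function S(x) = exp(-(x/b)**c)."""
    surv = lambda x: math.exp(-((x / b) ** c))
    return surv(lo) - surv(hi)

c, b = 1.79, 2.13  # asymptotic U.S. shape/scale values quoted above
p_f3 = weibull_class_prob(3.0, 4.0, c, b)
p_f4 = weibull_class_prob(4.0, 5.0, c, b)
ratio = p_f4 / p_f3  # analogue of the report-number ratio R(F4/F3)
```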
A nonlinear model of gold production in Malaysia
NASA Astrophysics Data System (ADS)
Ramli, Norashikin; Muda, Nora; Umor, Mohd Rozi
2014-06-01
Malaysia is a country rich in natural resources, and one of them is gold. Gold has already become an important national commodity. This study is conducted to determine a model that fits the gold production in Malaysia well over the years 1995-2010. Five nonlinear models are presented in this study: the Logistic, Gompertz, Richards, Weibull, and Chapman-Richards models. These models are used to fit the cumulative gold production in Malaysia. The best model is then selected based on model performance. The performance of the fitted models is measured by the sum of squared errors, root mean squared error, coefficient of determination, mean relative error, mean absolute error, and mean absolute percentage error. This study found that the Weibull model significantly outperforms the other models. To confirm that Weibull is the best model, the latest data were fitted to the model. Once again, the Weibull model gives the lowest readings on all types of measurement error. We conclude that future gold production in Malaysia can be predicted with the Weibull model, and this could be an important finding for Malaysia in planning its economic activities.
ZERODUR: deterministic approach for strength design
NASA Astrophysics Data System (ADS)
Hartmann, Peter
2012-12-01
There is increasing demand for zero-expansion glass ceramic ZERODUR substrates capable of enduring higher operational static loads or accelerations. The integrity of structures such as optical or mechanical elements for satellites surviving rocket launches, filigree lightweight mirrors, wobbling mirrors, and reticle and wafer stages in microlithography must be guaranteed with low failure probability. Their design requires statistically relevant strength data. The traditional approach using the statistical two-parameter Weibull distribution suffered from two problems: the data sets were too small to obtain distribution parameters with sufficient accuracy, and also too small to decide on the validity of the model. This holds especially for the low failure probability levels that are required for reliable applications. Extrapolation to 0.1% failure probability and below led to design strengths so low that higher-load applications seemed infeasible. New data have been collected with numbers per set large enough to enable tests on the applicability of the three-parameter Weibull distribution. This distribution proved to fit the data much better. Moreover, it delivers a lower threshold value, i.e. a minimum value for the breakage stress, allowing statistical uncertainty to be removed by introducing a deterministic method to calculate design strength. Considerations taken from the theory of fracture mechanics, which have been proven reliable in proof-test qualifications of delicate structures made from brittle materials, enable including fatigue due to stress corrosion in a straightforward way. With the formulae derived, either lifetime can be calculated from a given stress or allowable stress from a minimum required lifetime. The data, distributions, and design strength calculations for several practically relevant surface conditions of ZERODUR are given.
The values obtained are significantly higher than those resulting from the two-parameter Weibull distribution approach and no longer subject to statistical uncertainty.
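The deterministic design-strength argument above rests on the three-parameter Weibull form, whose threshold parameter is a stress below which the failure probability is exactly zero. A minimal sketch of that failure-probability function (the parameter values are hypothetical, not ZERODUR data):

```python
import math

def weibull3_failure_probability(stress, threshold, scale, shape):
    """Failure probability under a three-parameter Weibull model:
    F(s) = 1 - exp(-((s - threshold)/scale)**shape) for s > threshold,
    and exactly 0 otherwise -- stresses below the threshold never cause breakage.
    """
    if stress <= threshold:
        return 0.0
    return 1.0 - math.exp(-(((stress - threshold) / scale) ** shape))

# Illustrative (hypothetical) parameters: threshold 50 MPa, scale 40 MPa, shape 2.
p_below = weibull3_failure_probability(45.0, 50.0, 40.0, 2.0)  # below threshold
p_above = weibull3_failure_probability(90.0, 50.0, 40.0, 2.0)
```

The threshold is what turns a statistical design rule into a deterministic one: any load kept below it has zero predicted failure probability.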
Study on constant-step stress accelerated life tests in white organic light-emitting diodes.
Zhang, J P; Liu, C; Chen, X; Cheng, G L; Zhou, A X
2014-11-01
In order to obtain reliability information for a white organic light-emitting diode (OLED), two constant-stress tests and one step-stress test were conducted, with increased working current as the accelerating stress. The Weibull function was applied to describe the OLED life distribution, and maximum likelihood estimation (MLE) and its iterative flow chart were used to calculate the shape and scale parameters. Furthermore, the accelerated life equation was determined using the least squares method, a Kolmogorov-Smirnov test was performed to assess whether the white OLED life follows a Weibull distribution, and self-developed software was used to predict the average and median lifetimes of the OLED. The numerical results indicate that white OLED life conforms to a Weibull distribution, and that the accelerated life equation completely satisfies the inverse power law. The estimated life of a white OLED may provide significant guidelines for its manufacturers and customers. Copyright © 2014 John Wiley & Sons, Ltd.
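The MLE step for Weibull shape and scale described above can be sketched with standard tools; the lifetimes and "true" parameters below are simulated assumptions, not the paper's OLED data:

```python
import numpy as np
from math import gamma, log
from scipy.stats import weibull_min

# Simulated lifetimes (hours); true parameters are hypothetical, for illustration.
rng = np.random.default_rng(0)
true_shape, true_scale = 2.5, 1000.0
lifetimes = weibull_min.rvs(true_shape, scale=true_scale, size=500, random_state=rng)

# MLE of shape and scale with the location fixed at zero (two-parameter model).
shape_hat, _, scale_hat = weibull_min.fit(lifetimes, floc=0)

# Median and mean lifetimes follow directly from the fitted parameters.
median_life = scale_hat * log(2.0) ** (1.0 / shape_hat)
mean_life = scale_hat * gamma(1.0 + 1.0 / shape_hat)
```

For a Weibull life distribution, both the median and mean life are closed-form functions of the fitted shape and scale, which is how software can report predicted lifetimes once the MLE converges.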
A spatial scan statistic for survival data based on Weibull distribution.
Bhatt, Vijaya; Tiwari, Neeraj
2014-05-20
The spatial scan statistic has been developed as a geographical cluster detection analysis tool for different types of data sets such as Bernoulli, Poisson, ordinal, normal and exponential. We propose a scan statistic for survival data based on Weibull distribution. It may also be used for other survival distributions, such as exponential, gamma, and log normal. The proposed method is applied on the survival data of tuberculosis patients for the years 2004-2005 in Nainital district of Uttarakhand, India. Simulation studies reveal that the proposed method performs well for different survival distribution functions. Copyright © 2013 John Wiley & Sons, Ltd.
Juckett, D A; Rosenberg, B
1992-04-21
The distributions of human disease-specific mortality exhibit two striking characteristics: survivorship curves that intersect near the longevity limit, and clustering of best-fitting Weibull shape parameter values into groups centered on integers. Correspondingly, we have hypothesized that the distribution intersections result from either competitive processes or population partitioning, and that the integral clustering in the shape parameter results from the occurrence of a small number of rare, rate-limiting events in disease progression. In this report we initiate a theoretical examination of these questions by exploring serial chain model dynamics and parametric competing risks theory. The links in our chain models are composed of more than one bond; the number of bonds in a link is termed the link size and is the number of events necessary to break the link and, hence, the chain. We explored chains with all links of the same size or with segments of the chain composed of different-size links (competition). Simulations showed that chain breakage dynamics depended on the weakest-link principle and followed extreme-value kinetics very similar to human mortality kinetics. In particular, failure distributions for simple chains were Weibull-type extreme-value distributions with shape parameter values identifiable with the integral link size in the limit of infinite chain length. Furthermore, for chains composed of several segments of differing link size, the survival distributions for the various segments converged at a point in the S(t) tails indistinguishable from human data. This was also predicted by parametric competing risks theory using Weibull underlying distributions. In both the competitive chain simulations and the parametric competing risks theory, however, the shape values for the intersecting distributions deviated from the integer values typical of human data.
We conclude that rare events can be the source of integral shapes in human mortality, that convergence is a salient feature of multiple endpoints, but that pure competition may not be the best explanation for the exact type of convergence observable in human mortality. Finally, while the chain models were not motivated by any specific biological structures, interesting biological correlates to them may be useful in gerontological research.
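The weakest-link chain dynamics described above are straightforward to reproduce in simulation. In the sketch below (chain and sample sizes are arbitrary assumptions), each link holds k bonds with i.i.d. exponential lifetimes, a link breaks when all of its bonds have failed, and the chain breaks at its earliest-failing link; the fitted Weibull shape parameter then approaches the integral link size k:

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(42)
n_chains, n_links, k_bonds = 2000, 1000, 3

# Bond lifetimes for every bond in every link of every simulated chain.
bond_times = rng.exponential(1.0, size=(n_chains, n_links, k_bonds))
# A link survives until its last bond fails; the chain until its weakest link fails.
link_times = bond_times.max(axis=2)
chain_times = link_times.min(axis=1)

# In the long-chain limit the chain failure times are Weibull distributed with
# shape parameter equal to the integral link size k (here k = 3).
shape_hat, _, _ = weibull_min.fit(chain_times, floc=0)
```

The estimate sits slightly below the integer for finite chains, consistent with the paper's observation that integral shapes emerge only in the infinite-chain limit.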
Modeling the survival of Salmonella spp. in chorizos.
Hajmeer, M; Basheer, I; Hew, C; Cliver, D O
2006-03-01
The survival of Salmonella spp. in chorizos was studied under the effect of storage conditions, namely temperature (T=6, 25, 30 degrees C), air inflow velocity (F=0, 28.4 m/min), and initial water activity (a(w0)=0.85, 0.90, 0.93, 0.95, 0.97). The pH was held at 5.0. A total of 20 survival curves were experimentally obtained at various combinations of operating conditions. The chorizos were stored under four conditions: in the refrigerator (Ref: T=6 degrees C, F=0 m/min), at room temperature (RT: T=25 degrees C, F=0 m/min), in the hood (Hd: T=25 degrees C, F=28.4 m/min), and in the incubator (Inc: T=30 degrees C, F=0 m/min). Semi-logarithmic plots of counts vs. time revealed nonlinear trends for all the survival curves, indicating that the first-order kinetics model (exponential distribution function) was not suitable. The Weibull cumulative distribution function, of which the exponential function is only a special case, was selected and used to model the survival curves. The Weibull model was fitted to the 20 curves and the model parameters (alpha and beta) were determined. The fitted survival curves agreed with the experimental data with R(2)=0.951, 0.969, 0.908, and 0.871 for the Ref, RT, Hd, and Inc curves, respectively. Regression models relating alpha and beta to T, F, and a(w0) resulted in R(2) values of 0.975 for alpha and 0.988 for beta. The alpha and beta models can be used to generate a survival curve for Salmonella in chorizos for a given set of operating conditions. Additionally, alpha and beta can be used to determine the times needed to reduce the count by 1 or 2 logs, t(1D) and t(2D). It is concluded that the Weibull cumulative distribution function offers a powerful model for describing microbial survival data. A comparison with the pathogen modeling program (PMP) revealed that the survival kinetics of Salmonella spp. in chorizos could not be adequately predicted using PMP, which underestimated t(1D) and t(2D).
The mean of the Weibull probability density function correlated strongly with t(1D) and t(2D), and can serve as an alternative to the D-values normally used with first-order kinetic models. Parametric studies were conducted and sensitivity of survival to operating conditions was evaluated and discussed in the paper. The models derived herein provide a means for the development of a reliable risk assessment system for controlling Salmonella spp. in chorizos.
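Under the Weibull survival model used here, the n-log reduction times t(1D) and t(2D) follow in closed form from alpha and beta. A small sketch, assuming the common parameterization log10(N/N0) = -(t/alpha)**beta and hypothetical parameter values (not the paper's fitted ones):

```python
def log_reduction_time(alpha, beta, n_logs):
    """Time for an n-log reduction under the Weibull survival model
    log10(N/N0) = -(t/alpha)**beta, giving t_nD = alpha * n_logs**(1/beta)."""
    return alpha * n_logs ** (1.0 / beta)

# Hypothetical parameters for illustration only (not values from the study).
t1d = log_reduction_time(alpha=12.0, beta=0.8, n_logs=1)  # equals alpha by definition
t2d = log_reduction_time(alpha=12.0, beta=0.8, n_logs=2)
```

Note that for beta < 1 (upward-concave survival curves), t(2D) is more than twice t(1D), which is exactly the behavior a single first-order D-value cannot capture.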
Ceramics Analysis and Reliability Evaluation of Structures (CARES). Users and programmers manual
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Manderscheid, Jane M.; Gyekenyesi, John P.
1990-01-01
This manual describes how to use the Ceramics Analysis and Reliability Evaluation of Structures (CARES) computer program. The primary function of the code is to calculate the fast fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings, such as those found in heat engine applications. The program uses results from MSC/NASTRAN or ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effect of multiaxial stress states on material strength. The principle of independent action (PIA) and the Weibull normal stress averaging models are also included. Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities are estimated from four-point bend bar or uniform uniaxial tensile specimen fracture strength data. Parameter estimation can be performed for single or multiple failure modes by using least-squares analysis or the maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests, ninety percent confidence intervals on the Weibull parameters, and Kanofsky-Srinivasan ninety percent confidence band values are also provided. The probabilistic fast-fracture theories used in CARES, along with the input and output for CARES, are described. Example problems to demonstrate various features of the program are also included. This manual describes the MSC/NASTRAN version of the CARES program.
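CARES's multiaxial Batdorf treatment is beyond a short sketch, but the two-parameter Weibull weakest-link idea underlying it can be illustrated for the simplest case of a uniformly stressed volume (parameters below are hypothetical, and this is a simplification, not the CARES formulation):

```python
import math

def survival_probability(stress, volume, m, sigma0):
    """Weakest-link survival probability of a uniformly stressed volume under a
    two-parameter Weibull strength model: P_s = exp(-volume * (stress/sigma0)**m).
    m is the Weibull modulus; sigma0 a characteristic strength per unit volume."""
    return math.exp(-volume * (stress / sigma0) ** m)

# The ceramic "size effect": at the same stress, a larger volume samples more
# flaws and therefore survives with lower probability.
p_small = survival_probability(100.0, 1.0, 10.0, 200.0)
p_large = survival_probability(100.0, 10.0, 10.0, 200.0)
```

For non-uniform stress fields, codes like CARES replace `volume * (stress/sigma0)**m` with an integral of the stress distribution over the finite element results.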
A probabilistic approach to photovoltaic generator performance prediction
NASA Astrophysics Data System (ADS)
Khallat, M. A.; Rahman, S.
1986-09-01
A method for predicting the performance of a photovoltaic (PV) generator based on long term climatological data and expected cell performance is described. The equations for cell model formulation are provided. Use of the statistical model for characterizing the insolation level is discussed. The insolation data is fitted to appropriate probability distribution functions (Weibull, beta, normal). The probability distribution functions are utilized to evaluate the capacity factors of PV panels or arrays. An example is presented revealing the applicability of the procedure.
NASA Astrophysics Data System (ADS)
Le, Jia-Liang; Bažant, Zdeněk P.; Bazant, Martin Z.
2011-07-01
Engineering structures must be designed for an extremely low failure probability such as 10^-6, which is beyond the means of direct verification by histogram testing. This is not a problem for brittle or ductile materials because the type of probability distribution of structural strength is fixed and known, making it possible to predict the tail probabilities from the mean and variance. It is a problem, though, for quasibrittle materials for which the type of strength distribution transitions from Gaussian to Weibullian as the structure size increases. These are heterogeneous materials with brittle constituents, characterized by material inhomogeneities that are not negligible compared to the structure size. Examples include concrete, fiber composites, coarse-grained or toughened ceramics, rocks, sea ice, rigid foams and bone, as well as many materials used in nano- and microscale devices. This study presents a unified theory of strength and lifetime for such materials, based on activation energy controlled random jumps of the nano-crack front, and on the nano-macro multiscale transition of tail probabilities. Part I of this study deals with the case of monotonic and sustained (or creep) loading, and Part II with fatigue (or cyclic) loading. On the scale of the representative volume element of material, the probability distribution of strength has a Gaussian core onto which a remote Weibull tail is grafted at failure probability of the order of 10^-3. With increasing structure size, the Weibull tail penetrates into the Gaussian core. The probability distribution of static (creep) lifetime is related to the strength distribution by the power law for the static crack growth rate, for which a physical justification is given. The present theory yields a simple relation between the exponent of this law and the Weibull moduli for strength and lifetime.
The benefit is that the lifetime distribution can be predicted from short-time tests of the mean size effect on strength and tests of the power law for the crack growth rate. The theory is shown to match closely numerous test data on strength and static lifetime of ceramics and concrete, and explains why their histograms deviate systematically from the straight line in Weibull scale. Although the present unified theory is built on several previous advances, new contributions are here made to address: (i) a crack in a disordered nano-structure (such as that of hydrated Portland cement), (ii) tail probability of a fiber bundle (or parallel coupling) model with softening elements, (iii) convergence of this model to the Gaussian distribution, (iv) the stress-life curve under constant load, and (v) a detailed random walk analysis of crack front jumps in an atomic lattice. The nonlocal behavior is captured in the present theory through the finiteness of the number of links in the weakest-link model, which explains why the mean size effect coincides with that of the previously formulated nonlocal Weibull theory. Brittle structures correspond to the large-size limit of the present theory. An important practical conclusion is that the safety factors for strength and tolerable minimum lifetime for large quasibrittle structures (e.g., concrete structures and composite airframes or ship hulls, as well as various micro-devices) should be calculated as a function of structure size and geometry.
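The role of the finite number of links in the weakest-link model can be illustrated directly: a small tail probability per representative volume element (RVE) becomes a large structural failure probability once many elements are chained in series. A minimal sketch (the per-RVE probability and element count are illustrative assumptions):

```python
def structure_failure_prob(p_rve, n_rve):
    """Weakest-link model with a finite number of RVEs: the structure fails
    if any single RVE fails, so P_f = 1 - (1 - P_1)**n."""
    return 1.0 - (1.0 - p_rve) ** n_rve

# A tail probability of 1e-3 in one RVE -- the grafting point mentioned above --
# grows to a failure probability of roughly 63% for a structure of ~1000 RVEs.
p_large_structure = structure_failure_prob(1e-3, 1000)
```

This is why, for large quasibrittle structures, the far Weibull tail of the RVE strength distribution, rather than its Gaussian core, controls the design failure probability.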
NASA Technical Reports Server (NTRS)
Salem, Jonathan A.
2002-01-01
A generalized reliability model was developed for use in the design of structural components made from brittle, homogeneous anisotropic materials such as single crystals. The model is based on the Weibull distribution and incorporates a variable strength distribution and any equivalent stress failure criteria. In addition to the reliability model, an energy based failure criterion for elastically anisotropic materials was formulated. The model is different from typical Weibull-based models in that it accounts for strength anisotropy arising from fracture toughness anisotropy and thereby allows for strength and reliability predictions of brittle, anisotropic single crystals subjected to multiaxial stresses. The model is also applicable to elastically isotropic materials exhibiting strength anisotropy due to an anisotropic distribution of flaws. In order to develop and experimentally verify the model, the uniaxial and biaxial strengths of a single crystal nickel aluminide were measured. The uniaxial strengths of the <100> and <110> crystal directions were measured in three and four-point flexure. The biaxial strength was measured by subjecting <100> plates to a uniform pressure in a test apparatus that was developed and experimentally verified. The biaxial strengths of the single crystal plates were estimated by extending and verifying the displacement solution for a circular, anisotropic plate to the case of a variable radius and thickness. The best correlation between the experimental strength data and the model predictions occurred when an anisotropic stress analysis was combined with the normal stress criterion and the strength parameters associated with the <110> crystal direction.
An evaluation of percentile and maximum likelihood estimators of Weibull parameters
Stanley J. Zarnoch; Tommy R. Dell
1985-01-01
Two methods of estimating the three-parameter Weibull distribution were evaluated by computer simulation and field data comparison. Maximum likelihood estimators (MLE) with bias correction were calculated with the computer routine FITTER (Bailey 1974); percentile estimators (PCT) were those proposed by Zanakis (1979). The MLE estimators had smaller bias and...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pallocchia, G.; Laurenza, M.; Consolini, G.
2017-03-10
Some interplanetary shocks are associated with short-term and sharp particle flux enhancements near the shock front. Such intensity enhancements, known as shock-spike events (SSEs), represent a class of relatively energetic phenomena as they may extend to energies of some tens of MeV or even beyond. Here we present an SSE case study in order to shed light on the nature of the particle acceleration involved in this kind of event. Our observations refer to an SSE registered on 2011 October 3 at 22:23 UT, by STEREO B instrumentation when, at a heliocentric distance of 1.08 au, the spacecraft was swept by a perpendicular shock moving away from the Sun. The main finding from the data analysis is that a Weibull distribution represents a good fitting function to the measured particle spectrum over the energy range from 0.1 to 30 MeV. To interpret such an observational result, we provide a theoretical derivation of the Weibull spectrum in the framework of the acceleration by “killed” stochastic processes exhibiting power-law growth in time of the velocity expectation, such as the classical Fermi process. We find an overall coherence between the experimental values of the Weibull spectrum parameters and their physical meaning within the above scenario. Hence, our approach based on the Weibull distribution proves to be useful for understanding SSEs. With regard to the present event, we also provide an alternative explanation of the Weibull spectrum in terms of shock-surfing acceleration.
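A Weibull-shaped spectrum of the kind fitted here can be recovered from flux data by least squares. The sketch below uses a common Weibull parameterization, synthetic noiseless data, and hypothetical parameter values, not the STEREO B fit; fitting in log space keeps the spectrum's large dynamic range manageable:

```python
import numpy as np
from scipy.optimize import curve_fit

def log_weibull_spectrum(E, logA, k, logE0):
    """Log of a Weibull-shaped differential flux,
    J(E) = A * (E/E0)**(k-1) * exp(-(E/E0)**k),
    parameterized with logA and logE0 so all parameter values are admissible."""
    x = E / np.exp(logE0)
    return logA + (k - 1.0) * np.log(x) - x ** k

# Synthetic spectrum over 0.1-30 MeV with assumed parameters (A=e^9.2, k=0.6, E0=0.5).
E = np.logspace(-1, 1.5, 40)
log_flux = log_weibull_spectrum(E, 9.2, 0.6, np.log(0.5))

popt, _ = curve_fit(log_weibull_spectrum, E, log_flux, p0=[8.0, 1.0, 0.0])
```

With real, noisy count data one would weight the residuals by the measurement uncertainties; the noiseless version above just shows the fitting machinery.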
Global sensitivity analysis in wind energy assessment
NASA Astrophysics Data System (ADS)
Tsvetkova, O.; Ouarda, T. B.
2012-12-01
Wind energy is one of the most promising renewable energy sources. Nevertheless, it is not yet a common source of energy, although there is enough wind potential to supply the world's energy demand. One of the most prominent obstacles to employing wind energy is the uncertainty associated with wind energy assessment. Global sensitivity analysis (SA) studies how the variation of input parameters in an abstract model affects the variation of the variable of interest, or output variable. It also provides ways to calculate explicit measures of importance of input variables (first-order and total-effect sensitivity indices) with regard to their influence on the variation of the output variable. Two methods of determining the above-mentioned indices were applied and compared: the brute force method and the best practice estimation procedure. In this study a methodology for conducting global SA of wind energy assessment at the planning stage is proposed. Three sampling strategies, which are part of the SA procedure, were compared: sampling based on Sobol' sequences (SBSS), Latin hypercube sampling (LHS), and pseudo-random sampling (PRS). A case study of Masdar City, a showcase of sustainable living in the UAE, is used to exemplify application of the proposed methodology. Sources of uncertainty in wind energy assessment are very diverse. In the case study the following were identified as uncertain input parameters: the Weibull shape parameter, the Weibull scale parameter, availability of a wind turbine, lifetime of a turbine, air density, electrical losses, blade losses, and ineffective time losses. Ineffective time losses are defined as losses during the time when the actual wind speed is lower than the cut-in speed or higher than the cut-out speed. The output variable in the case study is the lifetime energy production. The most influential factors for lifetime energy production are identified by ranking the total-effect sensitivity indices.
The results of the present research show that the brute force method is best for wind assessment purposes, and that SBSS outperforms the other sampling strategies in the majority of cases. The results indicate that the Weibull scale parameter, turbine lifetime, and Weibull shape parameter are the three most influential variables in the case study setting. The following conclusions can be drawn from these results: 1) SBSS should be recommended for use in Monte Carlo experiments; 2) the brute force method should be recommended for conducting sensitivity analysis in wind resource assessment; and 3) little variation in the Weibull scale causes significant variation in energy production. The presence of the two distribution parameters among the top three influential variables (the Weibull shape and scale) emphasizes the importance of accurately (a) choosing the distribution to model the wind regime at a site and (b) estimating the probability distribution parameters. This can be labeled the most important conclusion of this research because it opens a field for further research that the authors believe could change the wind energy field tremendously.
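The brute force method for a first-order sensitivity index is a double-loop Monte Carlo estimate of S_i = Var(E[Y | X_i]) / Var(Y). A sketch on a toy two-input model (the paper's lifetime-energy model with eight uncertain inputs is far richer; the model and sample sizes below are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

def model(x1, x2):
    # Toy stand-in for the lifetime-energy-production model, chosen so the
    # first-order index of x1 is known analytically.
    return x1 + 2.0 * x2

def first_order_index(n_outer=500, n_inner=500):
    """Brute-force double-loop estimate of S1 = Var(E[Y | X1]) / Var(Y):
    fix X1 in the outer loop, average over X2 in the inner loop."""
    x1 = rng.uniform(size=n_outer)
    conditional_means = np.empty(n_outer)
    for i in range(n_outer):
        conditional_means[i] = model(x1[i], rng.uniform(size=n_inner)).mean()
    y = model(rng.uniform(size=100_000), rng.uniform(size=100_000))
    return conditional_means.var() / y.var()

s1 = first_order_index()
# Analytically S1 = Var(X1) / (Var(X1) + 4 * Var(X2)) = 0.2 for this toy model.
```

The quadratic cost of the double loop is exactly why the paper compares it against cheaper "best practice" estimators and low-discrepancy sampling such as Sobol' sequences.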
A Monte Carlo Risk Analysis of Life Cycle Cost Prediction.
1975-09-01
process which occurs with each FLU failure. With this in mind there is no alternative other than the binomial distribution. With all of... Weibull distribution of failures as selected by user. For each failure of the ith FLU, the model then samples from the binomial distribution to determine... which is sampled from the binomial. Neither of the two conditions for normality is met, i.e., that RTS be close to .5 and the number of samples close
Vector wind profile gust model
NASA Technical Reports Server (NTRS)
Adelfang, S. I.
1981-01-01
To enable development of a vector wind gust model suitable for orbital flight test operations and trade studies, hypotheses concerning the distributions of gust component variables were verified. Methods are presented for verifying the hypotheses that observed gust variables, including gust component magnitude, gust length, u range, and L range, are gamma distributed. The observed gust modulus is drawn from a bivariate gamma distribution that can be approximated with a Weibull distribution. Zonal and meridional gust components are bivariate gamma distributed. An analytical method for testing for bivariate gamma distributed variables is presented. Two distributions for gust modulus are described and the results of extensive hypothesis testing of one of the distributions are presented. The validity of the gamma distribution for representation of gust component variables is established.
Shoari, Niloofar; Dubé, Jean-Sébastien; Chenouri, Shoja'eddin
2015-11-01
In environmental studies, concentration measurements frequently fall below detection limits of measuring instruments, resulting in left-censored data. Some studies employ parametric methods such as the maximum likelihood estimator (MLE), robust regression on order statistic (rROS), and gamma regression on order statistic (GROS), while others suggest a non-parametric approach, the Kaplan-Meier method (KM). Using examples of real data from a soil characterization study in Montreal, we highlight the need for additional investigations that aim at unifying the existing literature. A number of studies have examined this issue; however, those considering data skewness and model misspecification are rare. These aspects are investigated in this paper through simulations. Among other findings, results show that for low skewed data, the performance of different statistical methods is comparable, regardless of the censoring percentage and sample size. For highly skewed data, the performance of the MLE method under lognormal and Weibull distributions is questionable; particularly, when the sample size is small or censoring percentage is high. In such conditions, MLE under gamma distribution, rROS, GROS, and KM are less sensitive to skewness. Related to model misspecification, MLE based on lognormal and Weibull distributions provides poor estimates when the true distribution of data is misspecified. However, the methods of rROS, GROS, and MLE under gamma distribution are generally robust to model misspecifications regardless of skewness, sample size, and censoring percentage. Since the characteristics of environmental data (e.g., type of distribution and skewness) are unknown a priori, we suggest using MLE based on gamma distribution, rROS and GROS. Copyright © 2015 Elsevier Ltd. All rights reserved.
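A censored MLE of the kind compared above can be sketched for the lognormal case: values below the detection limit contribute the log CDF at the limit, detected values contribute the log density. The data, detection limit, and helper name below are simulated assumptions, not the Montreal soil data:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def censored_lognormal_mle(observed, detection_limit):
    """MLE of lognormal (mu, sigma) with left-censoring: observations below
    `detection_limit` are known only to lie below it."""
    detected = observed[observed >= detection_limit]
    n_cens = np.sum(observed < detection_limit)
    log_dl = np.log(detection_limit)

    def negloglik(params):
        mu, log_sigma = params           # optimize log(sigma) to keep sigma > 0
        sigma = np.exp(log_sigma)
        ll = norm.logpdf(np.log(detected), mu, sigma).sum() - np.log(detected).sum()
        ll += n_cens * norm.logcdf(log_dl, mu, sigma)   # censored contribution
        return -ll

    res = minimize(negloglik, x0=[np.log(detected).mean(), 0.0])
    return res.x[0], np.exp(res.x[1])

# Demo: simulate lognormal concentrations, censor about 30% at a limit of 1.0.
rng = np.random.default_rng(3)
data = np.exp(rng.normal(0.5, 1.0, 3000))
mu_hat, sigma_hat = censored_lognormal_mle(data, detection_limit=1.0)
```

Swapping the normal log-density/log-CDF for those of another distribution gives the Weibull or gamma variants discussed in the abstract; the misspecification findings above are about what happens when that choice is wrong.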
Anderson, Carl A; McRae, Allan F; Visscher, Peter M
2006-07-01
Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using simulation we compare this method to both the Cox and Weibull proportional hazards models and a standard linear regression method that ignores censoring. The grouped linear regression method is of equivalent power to both the Cox and Weibull proportional hazards methods and is significantly better than the standard linear regression method when censored observations are present. The method is also robust to the proportion of censored individuals and the underlying distribution of the trait. On the basis of linear regression methodology, the grouped linear regression model is computationally simple and fast and can be implemented readily in freely available statistical software.
Cure modeling in real-time prediction: How much does it help?
Ying, Gui-Shuang; Zhang, Qiang; Lan, Yu; Li, Yimei; Heitjan, Daniel F
2017-08-01
Various parametric and nonparametric modeling approaches exist for real-time prediction in time-to-event clinical trials. Recently, Chen (2016 BMC Biomedical Research Methodology 16) proposed a prediction method based on parametric cure-mixture modeling, intending to cover those situations where it appears that a non-negligible fraction of subjects is cured. In this article we apply a Weibull cure-mixture model to create predictions, demonstrating the approach in RTOG 0129, a randomized trial in head-and-neck cancer. We compare the ultimate realized data in RTOG 0129 to interim predictions from a Weibull cure-mixture model, a standard Weibull model without a cure component, and a nonparametric model based on the Bayesian bootstrap. The standard Weibull model predicted that events would occur earlier than the Weibull cure-mixture model, but the difference was unremarkable until late in the trial when evidence for a cure became clear. Nonparametric predictions often gave undefined predictions or infinite prediction intervals, particularly at early stages of the trial. Simulations suggest that cure modeling can yield better-calibrated prediction intervals when there is a cured component, or the appearance of a cured component, but at a substantial cost in the average width of the intervals. Copyright © 2017 Elsevier Inc. All rights reserved.
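The cure-mixture idea is that the survival function plateaus at the cured fraction instead of decaying to zero, which is why a standard Weibull model predicts events too early once a cured component becomes apparent. A minimal sketch of the Weibull cure-mixture survival function (parameter values are hypothetical):

```python
import math

def cure_mixture_survival(t, cure_fraction, shape, scale):
    """Weibull cure-mixture survival: a fraction pi of subjects is cured and
    never experiences the event, the rest follow a Weibull, so
    S(t) = pi + (1 - pi) * exp(-(t/scale)**shape)."""
    return cure_fraction + (1.0 - cure_fraction) * math.exp(-((t / scale) ** shape))

# With a 30% cured fraction the curve levels off at 0.30 rather than reaching 0.
s_early = cure_mixture_survival(1.0, 0.3, 1.2, 10.0)
s_late = cure_mixture_survival(100.0, 0.3, 1.2, 10.0)
```

Event-time predictions follow by sampling only from the uncured component, which is what widens the prediction intervals relative to the standard Weibull model.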
A Compatible Hardware/Software Reliability Prediction Model.
1981-07-22
machines. In particular, he was interested in the following problem: assume that one has a collection of connected elements computing and transmitting... software reliability prediction model is desirable, the findings about the Weibull distribution are intriguing. After collecting failure data from several... capacitor, some of the added charge carriers are collected by the capacitor. If the added charge is sufficiently large, the information stored is changed
Liu, Jinping; Tang, Zhaohui; Xu, Pengfei; Liu, Wenzhong; Zhang, Jin; Zhu, Jianyong
2016-06-29
The topic of online product quality inspection (OPQI) with smart visual sensors is attracting increasing interest in both the academic and industrial communities on account of the natural connection between the visual appearance of products and their underlying qualities. Visual images captured from granulated products (GPs), e.g., cereal products or fabric textiles, are comprised of a large number of independent particles or stochastically stacked, locally homogeneous fragments, whose analysis and understanding remain challenging. A method of image statistical modeling-based OPQI for GP quality grading and monitoring by a Weibull distribution (WD) model with a semi-supervised learning classifier is presented. WD-model parameters (WD-MPs) of GP images' spatial structures, obtained with omnidirectional Gaussian derivative filtering (OGDF), which were demonstrated theoretically to obey a specific WD model of integral form, were extracted as the visual features. Then, a co-training-style semi-supervised classifier algorithm, named COSC-Boosting, was exploited for semi-supervised GP quality grading, integrating two independent classifiers of complementary nature in the face of scarce labeled samples. The effectiveness of the proposed OPQI method was verified in the field of automated rice quality grading, where it was compared with commonly used methods and showed superior performance, which lays a foundation for the quality control of GPs on assembly lines.
Moody, John A.
2017-01-01
A superslug was deposited in a basin in the Colorado Front Range Mountains as a consequence of an extreme flood following a wildfire disturbance in 1996. The subsequent evolution of this superslug was measured by repeat topographic surveys (31 surveys from 1996 through 2014) of 18 cross sections approximately uniformly spaced over 1500 m immediately above the basin outlet. These surveys allowed the identification within the superslug of chronostratigraphic units deposited and eroded by different geomorphic processes in response to different flow regimes. Over the time period of the study, the superslug went through aggradation, incision, and stabilization phases that were controlled by a shift in geomorphic processes from generally short-duration, episodic, large-magnitude floods that deposited new chronostratigraphic units to long-duration processes that eroded units. These phases were not contemporaneous at each channel cross section, which resulted in a complex response that preserved different chronostratigraphic units at each channel cross section having, in general, two dominant types of alluvial architecture—laminar and fragmented. Age and transit-time distributions for these two alluvial architectures evolved with time since the extreme flood. Because of the complex shape of the distributions they were best modeled by two-parameter Weibull functions. The Weibull scale parameter approximated the median age of the distributions, and the Weibull shape parameter generally had a linear relation that increased with time since the extreme flood. Additional results indicated that deposition of new chronostratigraphic units can be represented by a power-law frequency distribution, and that the erosion of units decreases with depth of burial to a limiting depth.
These relations can be used to model other situations with different flow regimes where vertical aggradation and incision are dominant processes, to predict the residence time of possible contaminated sediment stored in channels or on floodplains, and to provide insight into the interpretation of recent or ancient fluvial deposits.
NASA Astrophysics Data System (ADS)
Moody, John A.
2017-10-01
A superslug was deposited in a basin in the Colorado Front Range Mountains as a consequence of an extreme flood following a wildfire disturbance in 1996. The subsequent evolution of this superslug was measured by repeat topographic surveys (31 surveys from 1996 through 2014) of 18 cross sections approximately uniformly spaced over 1500 m immediately above the basin outlet. These surveys allowed the identification within the superslug of chronostratigraphic units deposited and eroded by different geomorphic processes in response to different flow regimes. Over the time period of the study, the superslug went through aggradation, incision, and stabilization phases that were controlled by a shift in geomorphic processes from generally short-duration, episodic, large-magnitude floods that deposited new chronostratigraphic units to long-duration processes that eroded units. These phases were not contemporaneous at each channel cross section, which resulted in a complex response that preserved different chronostratigraphic units at each channel cross section having, in general, two dominant types of alluvial architecture-laminar and fragmented. Age and transit-time distributions for these two alluvial architectures evolved with time since the extreme flood. Because of the complex shape of the distributions they were best modeled by two-parameter Weibull functions. The Weibull scale parameter approximated the median age of the distributions, and the Weibull shape parameter generally had a linear relation that increased with time since the extreme flood. Additional results indicated that deposition of new chronostratigraphic units can be represented by a power-law frequency distribution, and that the erosion of units decreases with depth of burial to a limiting depth. 
These relations can be used to model other situations with different flow regimes where vertical aggradation and incision are dominant processes, to predict the residence time of possible contaminated sediment stored in channels or on floodplains, and to provide insight into the interpretation of recent or ancient fluvial deposits.
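The observation above that the Weibull scale parameter approximates the median age can be checked directly from the two-parameter Weibull quantile function; a minimal sketch (the scale value 10.0 is arbitrary, not taken from the surveys):

```python
import math

def weibull_median(scale, shape):
    """Median of a two-parameter Weibull distribution: scale * ln(2)**(1/shape)."""
    return scale * math.log(2.0) ** (1.0 / shape)

# As the shape parameter grows, the median approaches the scale
# parameter (the 63.2nd percentile), so for moderate-to-large shape
# values the scale parameter is a reasonable proxy for the median age.
for shape in (1.0, 2.0, 5.0, 20.0):
    print(shape, weibull_median(10.0, shape))
```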
Bozkurt, Hayriye; D'Souza, Doris H; Davidson, P Michael
2014-05-01
Hepatitis A virus (HAV) is a food-borne enteric virus responsible for outbreaks of hepatitis associated with shellfish consumption. The objectives of this study were to determine the thermal inactivation behavior of HAV in blue mussels, to compare the first-order and Weibull models to describe the data, to calculate Arrhenius activation energy for each model, and to evaluate model efficiency by using selected statistical criteria. The times required to reduce the population by 1 log cycle (D-values) calculated from the first-order model (50 to 72°C) ranged from 1.07 to 54.17 min for HAV. Using the Weibull model, the times required to destroy 1 log unit (tD = 1) of HAV at the same temperatures were 1.57 to 37.91 min. At 72°C, the treatment times required to achieve a 6-log reduction were 7.49 min for the first-order model and 8.47 min for the Weibull model. The z-values (changes in temperature required for a 90% change in the log D-values) calculated for HAV were 15.88 ± 3.97°C (R² = 0.94) with the Weibull model and 12.97 ± 0.59°C (R² = 0.93) with the first-order model. The calculated activation energies for the first-order model and the Weibull model were 165 and 153 kJ/mol, respectively. The results revealed that the Weibull model was more appropriate for representing the thermal inactivation behavior of HAV in blue mussels. Correct understanding of the thermal inactivation behavior of HAV could allow precise determination of the thermal process conditions to prevent food-borne viral outbreaks associated with the consumption of contaminated mussels.
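The contrast between the first-order and Weibull predictions above follows directly from the two survival models; a minimal sketch (the example parameters are illustrative, not the fitted HAV values):

```python
def first_order_time(d_value, n_log):
    """Log-linear kinetics: an n-log reduction takes n * D minutes."""
    return d_value * n_log

def weibull_time(delta, p, n_log):
    """Weibull (Mafart-type) kinetics: log10(N/N0) = -(t/delta)**p,
    so an n-log reduction takes delta * n**(1/p) minutes;
    delta is the time to the first log reduction (tD = 1)."""
    return delta * n_log ** (1.0 / p)

# With a shape parameter p < 1 (tailing), each additional log
# reduction takes progressively longer than the log-linear model predicts.
print(first_order_time(1.25, 6), weibull_time(1.5, 0.8, 6))
```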
Bivariate extreme value distributions
NASA Technical Reports Server (NTRS)
Elshamy, M.
1992-01-01
In certain engineering applications, such as those occurring in the analyses of ascent structural loads for the Space Transportation System (STS), some of the load variables have a lower bound of zero. Thus, the need for practical models of bivariate extreme value probability distribution functions with lower limits was identified. We discuss the Gumbel models and present practical forms of bivariate extreme probability distributions of Weibull and Fréchet types with two parameters. Bivariate extreme value probability distribution functions can be expressed in terms of the marginal extremal distributions and a 'dependence' function subject to certain analytical conditions. Properties of such bivariate extreme distributions, sums and differences of paired extremals, as well as the corresponding forms of conditional distributions, are discussed. Practical estimation techniques are also given.
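One standard way to build such a bivariate extreme distribution from its marginals and a 'dependence' function is the logistic (Gumbel-Hougaard) form; a sketch on uniform margins (substituting Weibull or Fréchet marginals for u and v yields the corresponding bivariate types; alpha here is an assumed dependence parameter, not a value from the paper):

```python
import math

def logistic_bev(u, v, alpha):
    """Bivariate extreme-value (logistic / Gumbel-Hougaard) dependence
    structure on uniform margins u, v; 0 < alpha <= 1, with alpha = 1
    reducing to independence (C(u, v) = u * v)."""
    t = ((-math.log(u)) ** (1.0 / alpha)
         + (-math.log(v)) ** (1.0 / alpha)) ** alpha
    return math.exp(-t)
```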
Creep test observation of viscoelastic failure of edible fats
NASA Astrophysics Data System (ADS)
Vithanage, C. R.; Grimson, M. J.; Smith, B. G.; Wills, P. R.
2011-03-01
A rheological creep test was used to investigate the viscoelastic failure of five edible fats. Butter, spreadable blend and spread were selected as edible fats because they belong to three different groups according to the Codex Alimentarius. Creep curves were analysed according to the Burger model. Results were fitted to a Weibull distribution representing the strain-dependent lifetime of putative fibres in the material. The Weibull shape and scale (lifetime) parameters were estimated for each substance. A comparison of the rheometric measurements of edible fats demonstrated a clear difference between the three different groups. Taken together the results indicate that butter has a lower threshold for mechanical failure than spreadable blend and spread. The observed behaviour of edible fats can be interpreted using a model in which there are two types of bonds between fat crystals; primary bonds that are strong and break irreversibly, and secondary bonds, which are weaker but break and reform reversibly.
NASA Technical Reports Server (NTRS)
Gyekenyesi, John P.; Nemeth, Noel N.
1987-01-01
The SCARE (Structural Ceramics Analysis and Reliability Evaluation) computer program for statistical fast-fracture reliability analysis with quadratic elements for volume-distributed imperfections is enhanced to include the use of linear finite elements and the capability of designing against concurrent surface-flaw-induced ceramic component failure. The SCARE code is presently coupled as a postprocessor to the MSC/NASTRAN general-purpose finite element analysis program. The improved version now includes the Weibull and Batdorf statistical failure theories for both surface and volume flaw based reliability analysis. The program uses the two-parameter Weibull fracture strength cumulative failure probability distribution model with the principle of independent action for polyaxial stress states, and Batdorf's shear-sensitive as well as shear-insensitive statistical theories. The shear-sensitive surface crack configurations include the Griffith crack and Griffith notch geometries, using the total critical coplanar strain energy release rate criterion to predict mixed-mode fracture. Weibull material parameters based on both surface and volume flaw induced fracture can also be calculated from modulus-of-rupture bar tests, using the least squares method with known specimen geometry and grouped fracture data. The statistical fast-fracture theories for surface flaw induced failure, along with selected input and output formats and options, are summarized. An example problem demonstrating various features of the program is included.
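The two-parameter Weibull failure probability and the principle of independent action (PIA) described above can be sketched as follows (stress values and parameters are illustrative; the real code integrates the risk of rupture over element volumes and surfaces):

```python
import math

def weibull_failure_prob(sigma, sigma0, m, volume=1.0):
    """Two-parameter Weibull cumulative failure probability:
    Pf = 1 - exp(-V * (sigma / sigma0)**m)."""
    return 1.0 - math.exp(-volume * (sigma / sigma0) ** m)

def pia_failure_prob(principal_stresses, sigma0, m, volume=1.0):
    """Principle of independent action: each tensile principal stress
    contributes independently to the risk of rupture."""
    risk = sum((s / sigma0) ** m for s in principal_stresses if s > 0.0)
    return 1.0 - math.exp(-volume * risk)
```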
Rolling Bearing Life Prediction, Theory, and Application
NASA Technical Reports Server (NTRS)
Zaretsky, Erwin V.
2013-01-01
A tutorial is presented outlining the evolution, theory, and application of rolling-element bearing life prediction from that of A. Palmgren, 1924; W. Weibull, 1939; G. Lundberg and A. Palmgren, 1947 and 1952; E. Ioannides and T. Harris, 1985; and E. Zaretsky, 1987. Comparisons are made between these life models. The Ioannides-Harris model without a fatigue limit is identical to the Lundberg-Palmgren model. The Weibull model is similar to that of Zaretsky if the exponents are chosen to be identical. Both the load-life and Hertz stress-life relations of Weibull, Lundberg and Palmgren, and Ioannides and Harris reflect a strong dependence on the Weibull slope. The Zaretsky model decouples the dependence of the critical shear stress-life relation from the Weibull slope. This results in a nominal variation of the Hertz stress-life exponent. For 9th- and 8th-power Hertz stress-life exponents for ball and roller bearings, respectively, the Lundberg-Palmgren model best predicts life. However, for 12th- and 10th-power relations reflected by modern bearing steels, the Zaretsky model based on the Weibull equation is superior. Under the range of stresses examined, the use of a fatigue limit would suggest that (for most operating conditions under which a rolling-element bearing will operate) the bearing will not fail from classical rolling-element fatigue. Realistically, this is not the case. The use of a fatigue limit will significantly overpredict life over a range of normal operating Hertz stresses. Since the predicted lives of rolling-element bearings are high, the problem can become one of undersizing a bearing for a particular application.
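The load-life and Hertz stress-life relations discussed above have the simple power-law forms L ~ (C/P)^p and L ~ S^(-n); a minimal sketch (the exponents follow the ranges quoted in the abstract, but the ratios are illustrative):

```python
def life_ratio_from_load(load_ratio, p):
    """Load-life relation L ~ (C/P)**p: life ratio for a given
    capacity-to-load ratio."""
    return load_ratio ** p

def life_ratio_from_stress(stress_ratio, n):
    """Hertz stress-life relation L ~ S**(-n): halving the contact
    stress with n = 9 multiplies predicted life by 512."""
    return stress_ratio ** -n
```

The jump from a 9th- to a 12th-power exponent is why the choice between the Lundberg-Palmgren and Zaretsky models matters so much at reduced stresses.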
Bayesian inference based on dual generalized order statistics from the exponentiated Weibull model
NASA Astrophysics Data System (ADS)
Al Sobhi, Mashail M.
2015-02-01
Bayesian estimates of the two parameters and the reliability function of the exponentiated Weibull model are obtained based on dual generalized order statistics (DGOS). Also, Bayesian prediction bounds for future DGOS from the exponentiated Weibull model are obtained. Both symmetric and asymmetric loss functions are considered for the Bayesian computations. Markov chain Monte Carlo (MCMC) methods are used for computing the Bayes estimates and prediction bounds. The results have been specialized to the lower record values. Comparisons are made between Bayesian and maximum likelihood estimators via Monte Carlo simulation.
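For reference, the reliability function of the exponentiated Weibull model being estimated has a simple closed form; a sketch with unit scale (the parameterization shown is one common convention, not necessarily the paper's):

```python
import math

def exp_weibull_reliability(t, alpha, theta):
    """Exponentiated Weibull (unit scale): F(t) = (1 - exp(-t**alpha))**theta,
    so the reliability is R(t) = 1 - F(t); theta = 1 recovers the
    standard Weibull."""
    return 1.0 - (1.0 - math.exp(-t ** alpha)) ** theta
```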
Weibull analysis of fracture test data on bovine cortical bone: influence of orientation.
Khandaker, Morshed; Ekwaro-Osire, Stephen
2013-01-01
The fracture toughness, KIC, of cortical bone has been experimentally determined by several researchers. The variation in KIC values arises from variation in specimen orientation, shape, and size during the experiment. The fracture toughness of a cortical bone is governed by the severest flaw and, hence, may be analyzed using Weibull statistics. To the best of the authors' knowledge, however, no studies of this aspect have been published. The motivation of the study is the evaluation of Weibull parameters in the circumferential-longitudinal (CL) and longitudinal-circumferential (LC) directions. We hypothesized that the Weibull parameters vary depending on the bone microstructure. In the present work, a two-parameter Weibull statistical model was applied to the plane-strain fracture toughness of bovine femoral cortical bone, obtained using specimens extracted in the CL and LC directions of the bone. It was found that the Weibull modulus of fracture toughness was larger for CL specimens than for LC specimens, but the opposite trend was seen for the characteristic fracture toughness. The reason for these trends is the difference in microstructure and extrinsic toughening mechanisms between the CL and LC directions of the bone. The Weibull parameters found in this study can be applied to develop a damage-mechanics model for bone.
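The Weibull modulus and characteristic value referred to above are commonly estimated by median-rank regression on the linearized CDF; a sketch on simulated toughness data (the true parameters are chosen arbitrarily for the check, not taken from the bone data):

```python
import numpy as np

def weibull_params_mrr(sample):
    """Median-rank regression (least squares on the linearized CDF):
    ln(-ln(1 - F_i)) = m * ln(x_i) - m * ln(x0)."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    f = (np.arange(1, n + 1) - 0.3) / (n + 0.4)  # Bernard's median ranks
    y = np.log(-np.log(1.0 - f))
    m, intercept = np.polyfit(np.log(x), y, 1)
    return m, np.exp(-intercept / m)

# Check on simulated toughness data with known parameters
rng = np.random.default_rng(1)
sim = 4.5 * rng.weibull(6.0, 2000)   # characteristic value 4.5, modulus 6
m_est, x0_est = weibull_params_mrr(sim)
```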
Morphological study on the prediction of the site of surface slides
Hiromasa Hiura
1991-01-01
The annual continual occurrence of surface slides in the basin was estimated by modifying the estimation formula of Yoshimatsu. The Weibull Distribution Function proved to be useful for presenting the state and the transition of surface slides in the basin. The three parameters of the Weibull Function are recognized to be linear functions of the area ratio a/A. The...
Automatic Threshold Detector Techniques
1976-07-15
"Averaging CFAR in Non-Stationary Weibull Clutter," L. Novak (1974 IEEE Symposium on Information Theory). 8. "The Weibull Distribution Applied to the...
Stanley J. Zarnoch; Donald P. Feduccia; V. Clark Baldwin; Tommy R. Dell
1991-01-01
A growth and yield model has been developed for slash pine plantations on problem-free cutover sites in the west gulf region. The model was based on the moment-percentile method using the Weibull distribution for tree diameters. This technique was applied to unthinned and thinned stand projections and, subsequently, to the prediction of residual stands immediately...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muñoz-Jaramillo, Andrés; Windmueller, John C.; Amouzou, Ernest C.
2015-02-10
In this work, we take advantage of 11 different sunspot group, sunspot, and active region databases to characterize the area and flux distributions of photospheric magnetic structures. We find that, when taken separately, different databases are better fitted by different distributions (as has been reported previously in the literature). However, we find that all our databases can be reconciled by the simple application of a proportionality constant, and that, in reality, different databases are sampling different parts of a composite distribution. This composite distribution is made up of a linear combination of Weibull and log-normal distributions, where a pure Weibull (log-normal) characterizes the distribution of structures with fluxes below (above) 10^21 Mx (10^22 Mx). Additionally, we demonstrate that the Weibull distribution shows the expected linear behavior of a power-law distribution (when extended to smaller fluxes), making our results compatible with the results of Parnell et al. We propose that this is evidence of two separate mechanisms giving rise to visible structures on the photosphere: one directly connected to the global component of the dynamo (and the generation of bipolar active regions), and the other with the small-scale component of the dynamo (and the fragmentation of magnetic structures due to their interaction with turbulent convection).
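A composite density of the kind described, a weighted sum of a Weibull and a log-normal component, can be written directly; the weight and parameters below are illustrative placeholders, not the fitted values:

```python
import numpy as np
from scipy import stats

def composite_pdf(x, w, wb_shape, wb_scale, ln_sigma, ln_scale):
    """Linear combination of a Weibull and a log-normal density;
    w is the mixing weight of the Weibull component."""
    return (w * stats.weibull_min.pdf(x, wb_shape, scale=wb_scale)
            + (1.0 - w) * stats.lognorm.pdf(x, ln_sigma, scale=ln_scale))

# Sanity check: the mixture still integrates to ~1 (trapezoid rule)
x = np.linspace(1e-9, 400.0, 400001)
pdf = composite_pdf(x, 0.6, 1.5, 2.0, 0.8, 20.0)
mass = float(np.sum(pdf[:-1] + pdf[1:]) * 0.5 * (x[1] - x[0]))
```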
Functional models for colloid retention in porous media at the triple line.
Dathe, Annette; Zevi, Yuniati; Richards, Brian K; Gao, Bin; Parlange, J-Yves; Steenhuis, Tammo S
2014-01-01
Spectral confocal microscope visualizations of microsphere movement in unsaturated porous media showed that attachment at the air-water-solid (AWS) interface was an important retention mechanism. These visualizations can aid in resolving the functional form of retention rates of colloids at the AWS interface. In this study, soil adsorption isotherm equations were adapted by replacing the chemical concentration in the water as the independent variable with the cumulative number of colloids passing by. In order of increasing number of fitted parameters, the functions tested were the Langmuir adsorption isotherm, the logistic distribution, and the Weibull distribution. The functions were fitted to colloid concentrations obtained from time series of images acquired with a spectral confocal microscope in three experiments in which either plain or carboxylated polystyrene latex microspheres were pulsed into a small flow chamber filled with cleaned quartz sand. Both moving and retained colloids were quantified over time. In fitting the models to the data, the agreement improved with increasing number of model parameters. The Weibull distribution gave the best overall fit. The logistic distribution did not fit the initial retention of microspheres well, but otherwise the fit was good. The Langmuir isotherm fitted only the longest time series well. These results can be explained as follows: initially, when colloids are first introduced, the rate of retention is low. Once colloids are at the AWS interface, they act as anchor points for other colloids to attach, thereby increasing the retention rate as clusters form. Once the available attachment sites diminish, the retention rate decreases.
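The three candidate retention functions, expressed as functions of the cumulative colloids passing by (x), can be sketched as follows (parameter names and exact forms are plausible adaptations, not the authors' fitted equations):

```python
import math

def langmuir_retention(x, smax, k):
    """Langmuir-type saturation: retention rises monotonically toward smax."""
    return smax * x / (k + x)

def logistic_retention(x, smax, k, x50):
    """Logistic form: slow start, inflection at x50, saturation at smax."""
    return smax / (1.0 + math.exp(-k * (x - x50)))

def weibull_retention(x, smax, b, c):
    """Weibull CDF form: with c > 1 it captures the initial lag in
    retention (anchor-point clustering) that the Langmuir form cannot."""
    return smax * (1.0 - math.exp(-(x / b) ** c))
```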
Bazant, Zdenĕk P; Pang, Sze-Dai
2006-06-20
In mechanical design as well as protection from various natural hazards, one must ensure an extremely low failure probability such as 10^-6. How to achieve that goal is adequately understood only for the limiting cases of brittle or ductile structures. Here we present a theory to do that for the transitional class of quasibrittle structures, having brittle constituents and characterized by nonnegligible size of material inhomogeneities. We show that the probability distribution of strength of the representative volume element of material is governed by the Maxwell-Boltzmann distribution of atomic energies and the stress dependence of activation energy barriers; that it is statistically modeled by a hierarchy of series and parallel couplings; and that it consists of a broad Gaussian core having a grafted far-left power-law tail with zero threshold and amplitude depending on temperature and load duration. With increasing structure size, the Gaussian core shrinks and Weibull tail expands according to the weakest-link model for a finite chain of representative volume elements. The model captures experimentally observed deviations of the strength distribution from Weibull distribution and of the mean strength scaling law from a power law. These deviations can be exploited for verification and calibration. The proposed theory will increase the safety of concrete structures, composite parts of aircraft or ships, microelectronic components, microelectromechanical systems, prosthetic devices, etc. It also will improve protection against hazards such as landslides, avalanches, ice breaks, and rock or soil failures.
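The weakest-link scaling mentioned above, with the representative volume elements (RVEs) acting as links of a finite chain, can be sketched as follows (p1 and n values illustrative):

```python
def chain_failure_prob(p1, n_rve):
    """Weakest-link model: a chain of n_rve representative volume
    elements fails if any one element fails, so
    P_N = 1 - (1 - P_1)**n_rve."""
    return 1.0 - (1.0 - p1) ** n_rve

# In the far-left power-law tail (p1 << 1), P_N ~ n_rve * p1, which is
# why the tail, not the Gaussian core, controls design at failure
# probabilities such as 10^-6 for large structures.
```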
Roos, Malgorzata; Stawarczyk, Bogna
2012-07-01
This study evaluated and compared Weibull parameters of resin bond strength values using six different general-purpose statistical software packages for the two-parameter Weibull distribution. Two hundred human teeth were randomly divided into 4 groups (n=50), prepared and bonded on dentin according to the manufacturers' instructions using the following resin cements: (i) Variolink (VAN, conventional resin cement), (ii) Panavia21 (PAN, conventional resin cement), (iii) RelyX Unicem (RXU, self-adhesive resin cement) and (iv) G-Cem (GCM, self-adhesive resin cement). Subsequently, all specimens were stored in water for 24h at 37°C. Shear bond strength was measured and the data were analyzed using Anderson-Darling goodness-of-fit (MINITAB 16) and two-parameter Weibull statistics with the following statistical software packages: Excel 2011, SPSS 19, MINITAB 16, R 2.12.1, SAS 9.1.3. and STATA 11.2 (p≤0.05). Additionally, the three-parameter Weibull was fitted using MINITAB 16. Two-parameter Weibull estimates calculated with MINITAB and STATA can be compared using an omnibus test and 95% CIs. In SAS, only 95% CIs were directly obtained from the output. R provided no estimates of 95% CIs. In both SAS and R, the global comparison of the characteristic bond strength among groups is provided by means of Weibull regression. EXCEL and SPSS provided no default information about 95% CIs and no significance test for the comparison of Weibull parameters among the groups. In summary, the conventional resin cement VAN showed the highest Weibull modulus and characteristic bond strength. There are discrepancies in the Weibull statistics depending on the software package and the estimation method, and the information content of the default output differs greatly among the packages.
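As a concrete illustration of a two-parameter Weibull fit of bond-strength data in yet another package, SciPy's `weibull_min.fit` with the location fixed at zero returns the modulus and characteristic strength (the simulated data below stand in for real measurements):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Simulated shear bond strengths (MPa) from a known two-parameter Weibull
true_modulus, true_char_strength = 8.0, 20.0
data = true_char_strength * rng.weibull(true_modulus, 500)

# floc=0 fixes the location parameter, giving the two-parameter fit
m_hat, loc, s0_hat = stats.weibull_min.fit(data, floc=0)
print(m_hat, s0_hat)  # Weibull modulus and characteristic strength
```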
Juliano, Pablo; Knoerzer, Kai; Fryer, Peter J; Versteeg, Cornelis
2009-01-01
High-pressure, high-temperature (HPHT) processing is effective for microbial spore inactivation using mild preheating, followed by rapid volumetric compression heating and cooling on pressure release, enabling much shorter processing times than conventional thermal processing for many food products. A computational thermal fluid dynamic (CTFD) model has been developed to model all processing steps, including the vertical pressure vessel, an internal polymeric carrier, and food packages in an axis-symmetric geometry. Heat transfer and fluid dynamic equations were coupled to four selected kinetic models for the inactivation of C. botulinum: the traditional first-order kinetic model, the Weibull model, an nth-order model, and a combined discrete log-linear nth-order model. The models were solved to compare the resulting microbial inactivation distributions. The initial temperature of the system was set to 90°C and the pressure was selected at 600 MPa, holding for 220 s, with a target temperature of 121°C. A representation of the extent of microbial inactivation throughout all processing steps was obtained for each microbial model. Comparison of the models showed that the conventional thermal processing kinetics (not accounting for pressure) required shorter holding times to achieve a 12D reduction of C. botulinum spores than the other models. The temperature distribution inside the vessel resulted in a more uniform inactivation distribution when using a Weibull or an nth-order kinetics model than when using log-linear kinetics. The CTFD platform could illustrate the inactivation extent and uniformity provided by the microbial models. The platform is expected to be useful to evaluate models fitted to new C. botulinum inactivation data at varying conditions of pressure and temperature, as an aid for regulatory filing of the technology as well as in process and equipment design.
Wan, Xiaomin; Peng, Liubao; Li, Yuanjian
2015-01-01
In general, the individual patient-level data (IPD) collected in clinical trials are not available to independent researchers to conduct economic evaluations; researchers only have access to published survival curves and summary statistics. Thus, methods that use published survival curves and summary statistics to reproduce statistics for economic evaluations are essential. Four methods have been identified: two traditional methods, 1) the least squares method and 2) the graphical method; and two recently proposed methods, by 3) Hoyle and Henley and 4) Guyot et al. The four methods were first individually reviewed and subsequently assessed regarding their abilities to estimate mean survival through a simulation study. A number of different scenarios were developed that comprised combinations of various sample sizes, censoring rates and parametric survival distributions. One thousand simulated survival datasets were generated for each scenario, and all methods were applied to actual IPD. The uncertainty in the estimate of mean survival time was also captured. All methods provided accurate estimates of the mean survival time when the sample size was 500 and a Weibull distribution was used. When the sample size was 100 and the Weibull distribution was used, the Guyot et al. method was almost as accurate as the Hoyle and Henley method; however, greater bias was identified in the traditional methods. When a lognormal distribution was used, the Guyot et al. method generated noticeably less bias and a more accurate uncertainty estimate compared with the Hoyle and Henley method. The traditional methods should not be preferred because of their substantial overestimation. When the Weibull distribution was used for the fitted model, the Guyot et al. method was almost as accurate as the Hoyle and Henley method. However, if the lognormal distribution was used, the Guyot et al. method was less biased compared with the Hoyle and Henley method.
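The quantity all four methods aim to reproduce, the mean survival time implied by a fitted Weibull, has a closed form; a minimal sketch (parameter values arbitrary):

```python
import math

def weibull_mean_survival(shape, scale):
    """Mean of a Weibull survival distribution: scale * Gamma(1 + 1/shape)."""
    return scale * math.gamma(1.0 + 1.0 / shape)
```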
A comparative study of mixture cure models with covariate
NASA Astrophysics Data System (ADS)
Leng, Oh Yit; Khalid, Zarina Mohd
2017-05-01
In survival analysis, the survival time is assumed to follow a non-negative distribution, such as the exponential, Weibull, and log-normal distributions. In some cases, the survival time is influenced by observed factors, and omitting these factors may cause inaccurate estimation of the survival function. Therefore, a survival model which incorporates the influence of observed factors is more appropriate in such cases; these observed factors are included in the survival model as covariates. Besides that, there are cases in which a group of individuals is cured, that is, never experiences the event of interest. Ignoring the cure fraction may lead to overestimation of the survival function. Thus, a mixture cure model is more suitable for modelling survival data with the presence of a cure fraction. In this study, three mixture cure survival models are used to analyse survival data with a covariate and a cure fraction. The first model includes the covariate in the parameterization of the survival function of susceptible individuals, the second model allows the cure fraction to depend on the covariate, and the third model incorporates the covariate in both the cure fraction and the survival function of susceptible individuals. This study aims to compare the performance of these models via a simulation approach. Therefore, survival data with varying sample sizes and cure fractions are simulated, and the survival time is assumed to follow the Weibull distribution. The simulated data are then modelled using the three mixture cure survival models. The results show that the three mixture cure models are more appropriate for modelling survival data in the presence of a cure fraction and an observed factor.
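The three model variants above share the mixture cure structure S(t) = pi + (1 - pi) * Su(t); a sketch with a Weibull susceptible component and a logistic link for the covariate-dependent cure fraction (coefficients and parameters illustrative):

```python
import math

def cure_fraction(beta0, beta1, z):
    """Cure fraction linked to a covariate z via logistic regression
    (one common choice of link; coefficients are illustrative)."""
    return 1.0 / (1.0 + math.exp(-(beta0 + beta1 * z)))

def mixture_cure_survival(t, pi, shape, scale):
    """Population survival: the cured fraction pi never fails; the
    susceptible fraction follows a Weibull survival function."""
    s_u = math.exp(-(t / scale) ** shape)
    return pi + (1.0 - pi) * s_u
```

Note that the population survival plateaus at pi as t grows, which is exactly the feature a model without a cure fraction cannot reproduce.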
Decision Models for Determining the Optimal Life Test Sampling Plans
NASA Astrophysics Data System (ADS)
Nechval, Nicholas A.; Nechval, Konstantin N.; Purgailis, Maris; Berzins, Gundars; Strelchonok, Vladimir F.
2010-11-01
A life test sampling plan is a technique that consists of sampling, inspection, and decision making in determining the acceptance or rejection of a batch of products, by experiments examining the continuous usage time of the products. In life testing studies, the lifetime is usually assumed to be distributed as either a one-parameter exponential distribution or a two-parameter Weibull distribution with the assumption that the shape parameter is known. Such oversimplified assumptions can facilitate the follow-up analyses but may overlook the fact that the lifetime distribution can significantly affect the estimation of the failure rate of a product. Moreover, sampling costs, inspection costs, warranty costs, and rejection costs are all essential and ought to be considered in choosing an appropriate sampling plan. The choice of an appropriate life test sampling plan is a crucial decision problem because a good plan not only helps producers save testing time and reduce testing cost, but also positively affects the image of the product and thus attracts more consumers. This paper develops frequentist (non-Bayesian) decision models for determining the optimal life test sampling plans with an aim of cost minimization, by identifying the appropriate number of product failures in a sample that should be used as a threshold in judging the rejection of a batch. The two-parameter exponential and Weibull distributions with two unknown parameters are assumed to be appropriate for modelling the lifetime of a product. A practical numerical application is employed to demonstrate the proposed approach.
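At the core of any such plan is the probability of accepting a batch given a failure-count threshold; a minimal sketch with independent test items (numbers illustrative; the paper's models additionally weigh sampling, inspection, warranty, and rejection costs):

```python
from math import comb

def acceptance_probability(n, c, p_fail):
    """Probability a batch is accepted when at most c of the n tested
    items fail, each failing independently with probability p_fail."""
    return sum(comb(n, k) * p_fail ** k * (1.0 - p_fail) ** (n - k)
               for k in range(c + 1))
```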
Time-dependent reliability analysis of ceramic engine components
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.
1993-01-01
The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing either the power or Paris law relations. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating proof testing and fatigue parameter estimation are given.
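The power-law relation used for subcritical crack growth has the form v = A * (K/KIc)^n; a minimal sketch (A and n are material fatigue parameters; the values in the check are illustrative):

```python
def scg_velocity(k_ratio, a_coeff, n_exp):
    """Power-law subcritical crack growth: v = A * (K / KIc)**n.
    A large exponent n concentrates growth near K -> KIc, which is why
    fatigue-parameter estimation from rupture data is so sensitive to n."""
    return a_coeff * k_ratio ** n_exp
```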
Statistical properties of world investment networks
NASA Astrophysics Data System (ADS)
Song, Dong-Ming; Jiang, Zhi-Qiang; Zhou, Wei-Xing
2009-06-01
We have performed a detailed investigation on the world investment networks constructed from the Coordinated Portfolio Investment Survey (CPIS) data of the International Monetary Fund, ranging from 2001 to 2006. The distributions of degrees and node strengths are scale-free. The weight distributions can be well modeled by the Weibull distribution. The maximum flow spanning trees of the world investment networks possess two universal allometric scaling relations, independent of time and the investment type. The topological scaling exponent is 1.17±0.02 and the flow scaling exponent is 1.03±0.01.
Virlogeux, Victor; Li, Ming; Tsang, Tim K; Feng, Luzhao; Fang, Vicky J; Jiang, Hui; Wu, Peng; Zheng, Jiandong; Lau, Eric H Y; Cao, Yu; Qin, Ying; Liao, Qiaohong; Yu, Hongjie; Cowling, Benjamin J
2015-10-15
A novel avian influenza virus, influenza A(H7N9), emerged in China in early 2013 and caused severe disease in humans, with infections occurring most frequently after recent exposure to live poultry. The distribution of A(H7N9) incubation periods is of interest to epidemiologists and public health officials, but estimation of the distribution is complicated by interval censoring of exposures. Imputation of the midpoint of intervals was used in some early studies, resulting in estimated mean incubation times of approximately 5 days. In this study, we estimated the incubation period distribution of human influenza A(H7N9) infections using exposure data available for 229 patients with laboratory-confirmed A(H7N9) infection from mainland China. A nonparametric model (Turnbull) and several parametric models accounting for the interval censoring in some exposures were fitted to the data. For the best-fitting parametric model (Weibull), the mean incubation period was 3.4 days (95% confidence interval: 3.0, 3.7) and the variance was 2.9 days²; results were very similar for the nonparametric Turnbull estimate. Under the Weibull model, the 95th percentile of the incubation period distribution was 6.5 days (95% confidence interval: 5.9, 7.1). The midpoint approximation for interval-censored exposures led to overestimation of the mean incubation period. Public health observation of potentially exposed persons for 7 days after exposure would be appropriate.
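The reported 95th percentile can be approximately back-calculated from the published Weibull mean and variance by matching moments; a sketch (this reconstruction assumes the two-parameter Weibull and is only approximate, not the authors' interval-censored fit):

```python
import math
from scipy.optimize import brentq

mean_inc, var_inc = 3.4, 2.9  # reported Weibull mean (days) and variance

def weibull_cv(shape):
    """Coefficient of variation of a Weibull, a function of shape only."""
    g1 = math.gamma(1.0 + 1.0 / shape)
    g2 = math.gamma(1.0 + 2.0 / shape)
    return math.sqrt(g2 - g1 ** 2) / g1

# Match the CV to recover the shape, then the mean to recover the scale
shape = brentq(lambda k: weibull_cv(k) - math.sqrt(var_inc) / mean_inc, 0.5, 10.0)
scale = mean_inc / math.gamma(1.0 + 1.0 / shape)
p95 = scale * (-math.log(0.05)) ** (1.0 / shape)  # Weibull quantile at 0.95
print(round(p95, 1))  # close to the reported 6.5 days
```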
NASA Astrophysics Data System (ADS)
Rotondi, Renata; Varini, Elisa
2016-04-01
The long-term recurrence of strong earthquakes is often modelled by the stationary Poisson process for the sake of simplicity, although renewal and self-correcting point processes (with non-decreasing hazard functions) are more appropriate. Short-term models mainly fit earthquake clusters due to the tendency of an earthquake to trigger other earthquakes; in this case, self-exciting point processes with non-increasing hazard are especially suitable. In order to provide a unified framework for analyzing earthquake catalogs, Schoenberg and Bolt proposed the SELC (Short-term Exciting Long-term Correcting) model (BSSA, 2000) and Varini employed a state-space model for estimating the different phases of a seismic cycle (PhD Thesis, 2005). Both attempts are combinations of long- and short-term models, but results are not completely satisfactory, due to the different scales at which these models appear to operate. In this study, we split a seismic sequence into two groups: the leader events, whose magnitude exceeds a threshold magnitude, and the remaining ones, considered subordinate events. The leader events are assumed to follow a well-known self-correcting point process named the stress release model (Vere-Jones, J. Phys. Earth, 1978; Bebbington & Harte, GJI, 2003; Varini & Rotondi, Env. Ecol. Stat., 2015). In the interval between two subsequent leader events, subordinate events are expected to cluster at the beginning (aftershocks) and at the end (foreshocks) of that interval; hence, they are modeled by a failure process that allows a bathtub-shaped hazard function. In particular, we have examined the generalized Weibull distributions, a large family that contains distributions with different bathtub-shaped hazards as well as the standard Weibull distribution (Lai, Springer, 2014). The model is fitted to a dataset of Italian historical earthquakes and the results of Bayesian inference are shown.
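One member of the generalized-Weibull family with the required bathtub shape is the additive Weibull, whose hazard is the sum of a decreasing and an increasing Weibull hazard; a minimal sketch (parameters illustrative, not the fitted values):

```python
def additive_weibull_hazard(t, a, c, b, d):
    """Additive Weibull hazard h(t) = a*c*t**(c-1) + b*d*t**(d-1);
    with c > 1 and d < 1 the hazard is bathtub-shaped: high early in
    the interval (aftershocks), low mid-interval, rising again toward
    the next leader event (foreshocks)."""
    return a * c * t ** (c - 1.0) + b * d * t ** (d - 1.0)
```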
Relationship between Defect Size and Fatigue Life Distributions in Al-7 Pct Si-Mg Alloy Castings
NASA Astrophysics Data System (ADS)
Tiryakioğlu, Murat
2009-07-01
A new method for predicting the variability in fatigue life of castings was developed by combining the size distribution of the fatigue-initiating defects with a fatigue life model based on the Paris-Erdoğan law for crack propagation. Two datasets for the fatigue-initiating defects in Al-7 pct Si-Mg alloy castings, reported previously in the literature, were used to demonstrate that (1) the sizes of fatigue-initiating defects follow the Gumbel distribution; (2) the crack propagation model developed previously provides respectable fits to experimental data; and (3) the method developed in the present study expresses the variability in both datasets almost as well as the lognormal distribution and better than the Weibull distribution.
Cai, Jing; Tyree, Melvin T
2010-07-01
The objective of this study was to quantify the relationship between vulnerability to cavitation and vessel diameter within a species. We measured vulnerability curves (VCs: percentage loss hydraulic conductivity versus tension) in aspen stems and measured vessel-size distributions. Measurements were done on seed-grown, 4-month-old aspen (Populus tremuloides Michx) grown in a greenhouse. VCs of stem segments were measured using a centrifuge technique and by a staining technique that allowed a VC to be constructed based on vessel diameter size-classes (D). Vessel-based VCs were also fitted to Weibull cumulative distribution functions (CDF), which provided best-fit values of Weibull CDF constants (c and b) and P(50) = the tension causing 50% loss of hydraulic conductivity. We show that P(50) = 6.166D(-0.3134) (R(2) = 0.995) and that b and 1/c are both linear functions of D with R(2) > 0.95. The results are discussed in terms of models of VCs based on vessel D size-classes and in terms of concepts such as the 'pit area hypothesis' and vessel pathway redundancy.
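The Weibull CDF form used for the vessel-based vulnerability curves, loss(T) = 1 − exp(−(T/b)^c), gives P50 in closed form, and the abstract reports a regression linking P50 to vessel diameter. A sketch (the b and c arguments are illustrative; only the regression coefficients are taken from the abstract):

```python
import math

def p50_from_weibull(b, c):
    """Tension at 50% conductivity loss for loss(T) = 1 - exp(-(T/b)**c)."""
    return b * math.log(2.0) ** (1.0 / c)

def p50_from_diameter(d):
    """Regression reported in the abstract: P50 = 6.166 * D**(-0.3134)."""
    return 6.166 * d ** (-0.3134)
```

The regression implies that wider vessels cavitate at lower tensions, which is the diameter dependence the study quantifies.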
Liu, Jinping; Tang, Zhaohui; Xu, Pengfei; Liu, Wenzhong; Zhang, Jin; Zhu, Jianyong
2016-01-01
The topic of online product quality inspection (OPQI) with smart visual sensors is attracting increasing interest in both the academic and industrial communities on account of the natural connection between the visual appearance of products and their underlying qualities. Visual images captured from granulated products (GPs), e.g., cereal products and fabric textiles, are composed of a large number of independent particles or stochastically stacked, locally homogeneous fragments, whose analysis and understanding remain challenging. A method of image-statistical-modeling-based OPQI for GP quality grading and monitoring is presented, using a Weibull distribution (WD) model with a semi-supervised learning classifier. WD-model parameters (WD-MPs) of the spatial structures of GP images, obtained with omnidirectional Gaussian derivative filtering (OGDF) and demonstrated theoretically to obey a specific WD model of integral form, were extracted as the visual features. Then, a co-training-style semi-supervised classifier algorithm, named COSC-Boosting, was exploited for semi-supervised GP quality grading by integrating two independent classifiers of complementary nature in the face of scarce labeled samples. The effectiveness of the proposed OPQI method was verified in the field of automated rice quality grading, where it showed superior performance compared with commonly used methods, laying a foundation for the quality control of GPs on assembly lines. PMID:27367703
Mortality profiles of Rhodnius prolixus (Heteroptera: Reduviidae), vector of Chagas disease.
Chaves, Luis Fernando; Hernandez, Maria-Josefina; Revilla, Tomás A; Rodríguez, Diego J; Rabinovich, Jorge E
2004-10-01
Life table data of Rhodnius prolixus (Heteroptera: Reduviidae) kept at laboratory conditions were analysed in search of mortality patterns. The Gompertz and Weibull mortality models seem adequate to explain the sigmoid shape of the survivorship curve. A significant fit was obtained with both models for females (R(2) = 0.70, P < 0.0005 for the Gompertz model; R(2) = 0.78, P < 0.0005 for the Weibull model) and for males (R(2) = 0.39, P < 0.0005 for the Gompertz model; R(2) = 0.48, P < 0.0005 for the Weibull model). The mortality parameter (b) is higher for females in both the Gompertz and Weibull models, using smoothed and non-smoothed data (P < 0.05), revealing a significant sex mortality differential. Given the particular life history of this insect, the non-linear relationship between the force of mortality and age may have an important impact on the vectorial capacity of R. prolixus as a Chagas disease vector, and its consideration should be included as an important factor in the transmission of Trypanosoma cruzi by triatomines.
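The two competing mortality models differ in the shape of the force of mortality (hazard): exponential in age for Gompertz, power-law in age for Weibull. A minimal sketch of the two hazards (any parameter values passed in would be assumptions, not the paper's estimates):

```python
import math

def gompertz_hazard(age, a, b):
    """Gompertz force of mortality: a * exp(b * age)."""
    return a * math.exp(b * age)

def weibull_hazard(age, shape, scale):
    """Weibull force of mortality: (shape/scale) * (age/scale)**(shape - 1)."""
    return (shape / scale) * (age / scale) ** (shape - 1.0)
```

For shape > 1 the Weibull hazard rises with age (senescence); shape = 1 reduces to a constant hazard, i.e. exponential survival.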
Modelling Limit Order Execution Times from Market Data
NASA Astrophysics Data System (ADS)
Kim, Adlar; Farmer, Doyne; Lo, Andrew
2007-03-01
Although the term ``liquidity'' is widely used in the finance literature, its meaning is loosely defined and there is no quantitative measure for it. Generally, ``liquidity'' means the ability to quickly trade stocks without causing a significant impact on the stock price. From this definition, we identified two facets of liquidity: (1) the execution time of limit orders, and (2) the price impact of market orders. A limit order is an order to transact a prespecified number of shares at a prespecified price, which does not cause an immediate execution. A market order, on the other hand, is an order to transact a prespecified number of shares at the market price, which causes an immediate execution but is subject to price impact. Therefore, when a stock is liquid, market participants will experience quick limit order executions and small market order impacts. As a first step toward understanding market liquidity, we studied the facet of liquidity related to limit order executions -- execution times. In this talk, we propose a novel approach to modeling limit order execution times and show how they are affected by the size and price of orders. We used the q-Weibull distribution, a generalized form of the Weibull distribution whose tail fatness can be controlled, to model limit order execution times.
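The q-Weibull density generalizes the Weibull by replacing the exponential with a Tsallis q-exponential, which fattens the tail for q > 1 and recovers the ordinary Weibull as q → 1. A sketch under the usual parameterization (whether this matches the talk's exact convention is an assumption):

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential; reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

def q_weibull_pdf(t, q, shape, scale):
    """q-Weibull density; the q -> 1 limit is the ordinary Weibull pdf."""
    u = (t / scale) ** shape
    return (2.0 - q) * (shape / scale) * (t / scale) ** (shape - 1.0) * q_exp(-u, q)
```

For 1 < q < 2 the density decays as a power law rather than a stretched exponential, which is what lets it capture heavy-tailed execution times.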
Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMC's
NASA Technical Reports Server (NTRS)
Jadaan, Osama
2003-01-01
This effort is to investigate probabilistic life prediction methodologies for ceramic matrix composites (CMCs) and MicroElectroMechanical Systems (MEMS) and to analyze designs that determine stochastic properties of MEMS. For CMCs this includes a brief literature survey regarding lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin film (bulge test) specimens. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS - similar to the GRC-developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability/suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and, if feasible, to run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.
Weibull models of fracture strengths and fatigue behavior of dental resins in flexure and shear.
Baran, G R; McCool, J I; Paul, D; Boberick, K; Wunder, S
1998-01-01
In estimating lifetimes of dental restorative materials, it is useful to have available data on the fatigue behavior of these materials. Current efforts at estimation include several untested assumptions related to the equivalence of flaw distributions sampled by shear, tensile, and compressive stresses. Environmental influences on material properties are not accounted for, and it is unclear if fatigue limits exist. In this study, the shear and flexural strengths of three resins used as matrices in dental restorative composite materials were characterized by Weibull parameters. It was found that shear strengths were lower than flexural strengths, liquid sorption had a profound effect on characteristic strengths, and the Weibull shape parameter obtained from shear data differed for some materials from that obtained in flexure. In shear and flexural fatigue, a power law relationship applied for up to 250,000 cycles; no fatigue limits were found, and the data thus imply only one flaw population is responsible for failure. Again, liquid sorption adversely affected strength levels in most materials (decreasing shear strengths and flexural strengths by factors of 2-3) and to a greater extent than did the degree of cure or material chemistry.
NASA Astrophysics Data System (ADS)
Zaidman, Paula C.; Morsan, Enrique
2018-05-01
In the development of management measures for sustainable fisheries, estimating the natural mortality rate and recruitment are fundamental. In northern Patagonia, Argentina, the southern geoduck, Panopea abbreviata, a long-lived clam that forms spatially disjunct subpopulations, supports an unregulated fishery. In this study, we estimate natural mortality. We studied the age structure of beds within the northern Patagonia gulfs, San Matías Gulf (SMG) and San Jose Gulf (SJG), and we estimated a time series for back-reconstructed recruitment to explore spatial coherence in relation to local oceanographic conditions and to elucidate its population dynamics. We constructed a cumulative frequency distribution of the age of dead shells collected and used the exponential and Weibull models to model mortality. Live geoducks were sampled from six populations between 2000 and 2006. Age-frequency distributions and mortality models were used to back-calculate the time series of recruitment for each population. The recruitment time series was analysed using continuous wavelet transform. The value of natural mortality estimated by the exponential model was 0.054 years^-1, whereas those estimated by the Weibull model were α = 0.00085 years^-1 and β = 2.1. For the latter, M values for cohorts were 0.01 for 10 years, 0.02 for 20 years, 0.04 for 30 years and 0.05 for 40 years. The Weibull model was observed to be the best fit to the data. The natural mortality rate of P. abbreviata estimated in this study was lower than that estimated in a previous work for populations from SMG. The back-calculated time series for recruitment demonstrated considerable yearly variation, suggesting that local conditions have an important role in recruitment regulation. At a decadal temporal scale, a clear increasing recruitment trend was evident over the last 20 years in all populations. Populations in SMG were settled >60 years ago.
In contrast, no individuals older than 30 years were observed in the populations from SJG. P. abbreviata has several characteristics, such as longevity and low instantaneous natural mortality rate, which require attention in any resource planning. However, this species also has positive characteristics for fishery development, as historical recruitment trends indicate that populations are expanding and are part of a widely distributed metapopulation, suggesting that sustainable exploitation is possible.
Jahid, Iqbal Kabir; Ha, Sang-Do
2014-05-01
The present article focuses on the inactivation kinetics of various disinfectants, including ethanol, sodium hypochlorite, hydrogen peroxide, peracetic acid, and benzalkonium chloride, against Aeromonas hydrophila biofilms and planktonic cells. Efficacy was determined by viable plate count and compared using a modified Weibull model. The removal of the biofilm matrix was determined by the crystal violet assay and was confirmed by field-emission scanning electron microscopy. The results revealed that all the experimental data and calculated Weibull α (scale) and β (shape) parameters had a good fit, as the R(2) values were between 0.88 and 0.99. Biofilms were more resistant to disinfectants than planktonic cells. Ethanol (70%) was the most effective in killing cells in the biofilms and significantly reduced (p<0.05) the biofilm matrix. The Weibull parameter b-value correlated (R(2)=0.6835) with biofilm matrix removal. The present findings indicate that the Weibull model is suitable for determining biofilm matrix reduction as well as the effectiveness of chemical disinfectants on biofilms. The study showed that the Weibull model could successfully be used on food and food contact surfaces to determine the exact contact time for killing biofilm-forming foodborne pathogens.
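One common Weibull parameterization in inactivation kinetics is the Mafart form, log10(N/N0) = −(t/α)^β, with α the time to the first decimal reduction and β the curve shape; whether this is exactly the "modified Weibull model" used in the study is an assumption. A minimal sketch, including the inverse used to read off contact times:

```python
def weibull_log10_survival(t, alpha, beta):
    """Mafart-type Weibull survival curve: log10(N/N0) = -(t/alpha)**beta."""
    return -((t / alpha) ** beta)

def time_for_reductions(n_log10, alpha, beta):
    """Contact time giving an n-log10 reduction: alpha * n**(1/beta)."""
    return alpha * n_log10 ** (1.0 / beta)
```

Beta < 1 produces tailing (concave-up) survivor curves, beta > 1 produces shouldering; beta = 1 recovers classical log-linear kinetics.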
Estimating Tree Height-Diameter Models with the Bayesian Method
Duan, Aiguo; Zhang, Jianguo; Xiang, Congwei
2014-01-01
Six candidate height-diameter models were used to analyze the height-diameter relationships. The common methods for estimating height-diameter models have taken the classical (frequentist) approach based on the frequency interpretation of probability, for example, the nonlinear least squares method (NLS) and the maximum likelihood method (ML). The Bayesian method has a distinctive advantage over the classical method in that the parameters to be estimated are regarded as random variables. In this study, the classical and Bayesian methods were each used to estimate the six height-diameter models. Both the classical and Bayesian methods showed that the Weibull model was the “best” model using data1. In addition, based on the Weibull model, data2 was used to compare the Bayesian method with informative priors against uninformative priors and the classical method. The results showed that the improvement in prediction accuracy with the Bayesian method led to narrower confidence bands of predicted values in comparison with the classical method, and the credible bands of parameters with informative priors were also narrower than those with uninformative priors and the classical method. The estimated posterior distributions for parameters can be set as new priors in estimating the parameters using data2. PMID:24711733
Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models.
Gelfand, Lois A; MacKinnon, David P; DeRubeis, Robert J; Baraldi, Amanda N
2016-01-01
Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare the statistical properties of mediation analyses incorporating PH and AFT approaches (employing the SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome: underestimation in LIFEREG and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained by combining other coefficients, whereas this is not possible with PHREG. When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results.
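Under a Weibull AFT model, covariates multiply survival time, so a small simulation makes the "acceleration" interpretation concrete: the ratio of group medians is exp(b1). A sketch (the two-arm setup and parameter values are illustrative, not the paper's simulation design):

```python
import math
import random

def simulate_aft_weibull(n, b0, b1, shape, seed=1):
    """Simulate survival times from a Weibull AFT model:
    T = exp(b0 + b1*x) * W, with W a standard Weibull(shape) variate
    drawn by inverse-CDF sampling and x a binary treatment indicator."""
    rng = random.Random(seed)
    times, groups = [], []
    for _ in range(n):
        x = rng.randint(0, 1)
        u = rng.random()
        w = (-math.log(1.0 - u)) ** (1.0 / shape)
        times.append(math.exp(b0 + b1 * x) * w)
        groups.append(x)
    return times, groups
```

Because the covariate enters log-time additively, every quantile of the treated group is exp(b1) times the corresponding control quantile, which is the coefficient interpretation the AFT approach offers to mediation models.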
Thermal inactivation kinetics of hepatitis A virus in homogenized clam meat (Mercenaria mercenaria).
Bozkurt, H; D'Souza, D H; Davidson, P M
2015-09-01
Epidemiological evidence suggests that hepatitis A virus (HAV) is the most common pathogen transmitted by bivalve molluscs such as clams, cockles, mussels and oysters. This study aimed to generate thermal inactivation kinetics for HAV as a first step to design adequate thermal processes to control clam-associated HAV outbreaks. Survivor curves and thermal death curves were generated for different treatment times (0-6 min) at different temperatures (50-72°C), and the Weibull and first-order models were compared. D-values for HAV ranged from 47·37 ± 1·23 to 1·55 ± 0·12 min for the first-order model and 64·43 ± 3·47 to 1·25 ± 0·45 min for the Weibull model at temperatures from 50 to 72°C. z-Values for HAV in clams were 12·97 ± 0·59°C and 14·83 ± 0·28°C using the Weibull and first-order models, respectively. The calculated activation energies for the first-order and Weibull models were 145 and 170 kJ mole(-1), respectively. The Weibull model described the thermal inactivation behaviour of HAV better than the first-order model. This study provides novel and precise information on thermal inactivation kinetics of HAV in homogenized clams. This will enable reliable thermal process calculations for HAV inactivation in clams and closely related seafood. © 2015 The Society for Applied Microbiology.
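The z-value is the temperature increase that lowers the D-value tenfold; computing it from the abstract's first-order D-values at 50°C and 72°C reproduces the reported value of about 14.8°C. A minimal sketch:

```python
import math

def z_value(temp1, d1, temp2, d2):
    """Temperature rise producing a tenfold drop in D-value:
    z = (T2 - T1) / (log10(D1) - log10(D2))."""
    return (temp2 - temp1) / (math.log10(d1) - math.log10(d2))

# First-order D-values quoted in the abstract (minutes):
z = z_value(50.0, 47.37, 72.0, 1.55)   # close to the reported 14.83 C
```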
NASA Astrophysics Data System (ADS)
Iskandar, I.
2018-03-01
The exponential distribution is the most widely used distribution in reliability analysis. It is very suitable for representing the lifetimes of many systems and is available in a simple statistical form. The characteristic of this distribution is a constant hazard rate. The exponential distribution is a special case of the Weibull distribution (shape parameter equal to one). In this paper our effort is to introduce the basic notions that constitute an exponential competing risks model in reliability analysis using a Bayesian approach and to present the associated analytic methods. The cases are limited to models with independent causes of failure. A non-informative prior distribution is used in our analysis. We describe the likelihood function, followed by the posterior function and the estimation of the point, interval, hazard function, and reliability. The net probability of failure if only one specific risk is present, the crude probability of failure due to a specific risk in the presence of other causes, and partial crude probabilities are also included.
NASA Astrophysics Data System (ADS)
Iskandar, Ismed; Satria Gondokaryono, Yudi
2016-02-01
In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that the systems are described and explained as simply functioning or failed. In many real situations, the failures may arise from many causes, depending upon the age and the environment of the system and its components. Another problem in reliability theory is that of estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation allows us to combine past knowledge or experience, in the form of an a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample size. The simulation data are analyzed using both Bayesian and maximum likelihood analyses. The simulation results show that changing the true value of one parameter relative to another changes the standard deviation in the opposite direction. Given perfect information on the prior distribution, the Bayesian estimation methods are better than maximum likelihood. The sensitivity analyses show some amount of sensitivity to shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range between the true value and the maximum likelihood estimate.
Stochastic Analysis of Wind Energy for Wind Pump Irrigation in Coastal Andhra Pradesh, India
NASA Astrophysics Data System (ADS)
Raju, M. M.; Kumar, A.; Bisht, D.; Rao, D. B.
2014-09-01
The rapid escalation in the prices of oil and gas, as well as increasing demand for energy, has attracted the attention of scientists and researchers to the possibility of generating and utilizing alternative and renewable wind energy along the long coastal belt of India, which has considerable wind energy resources. A detailed analysis of wind potential is a prerequisite to harvesting wind energy resources efficiently. Keeping this in view, the present study was undertaken to analyze the wind energy potential and assess the feasibility of a wind-pump-operated irrigation system in the coastal region of Andhra Pradesh, India, where high ground water table conditions are available. Wind speed data were tested stochastically for fit to a probability distribution that describes the wind energy potential in the region. The normal and Weibull probability distributions were tested, and on the basis of the chi-square test, the Weibull distribution gave better results. Hence, it was concluded that the Weibull probability distribution may be used to stochastically describe the annual wind speed data of coastal Andhra Pradesh with better accuracy. The size of the complete irrigation system was determined with mass curve analysis to satisfy various daily irrigation demands at different risk levels.
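A quick way to obtain Weibull shape k and scale c from wind speed summary statistics is the empirical moment approximation of Justus; this is a standard sketch of that idea, not necessarily the fitting procedure used in the study:

```python
import math

def weibull_params_from_moments(mean_speed, std_speed):
    """Justus empirical-moment approximation for wind speed data:
    k ~= (sigma/mu)**(-1.086),  c = mu / Gamma(1 + 1/k)."""
    k = (std_speed / mean_speed) ** (-1.086)
    c = mean_speed / math.gamma(1.0 + 1.0 / k)
    return k, c
```

With k and c in hand, goodness of fit can then be checked against the observed speed histogram, e.g. with the chi-square test the study applies.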
Performance of mixed RF/FSO systems in exponentiated Weibull distributed channels
NASA Astrophysics Data System (ADS)
Zhao, Jing; Zhao, Shang-Hong; Zhao, Wei-Hu; Liu, Yun; Li, Xuan
2017-12-01
This paper presents the performance of an asymmetric mixed radio frequency (RF)/free-space optical (FSO) system with the amplify-and-forward relaying scheme. The RF channel undergoes Nakagami-m fading, and the Exponentiated Weibull distribution is adopted for the FSO component. Mathematical formulas for the cumulative distribution function (CDF), probability density function (PDF) and moment generating function (MGF) of the equivalent signal-to-noise ratio (SNR) are derived. From the end-to-end statistical characteristics, new analytical expressions for the outage probability are obtained. Under various modulation techniques, we derive the average bit-error rate (BER) based on the Meijer G-function. Evaluations and simulations of the system performance are provided, and the aperture averaging effect is discussed as well.
Probabilistic thermal-shock strength testing using infrared imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wereszczak, A.A.; Scheidt, R.A.; Ferber, M.K.
1999-12-01
A thermal-shock strength-testing technique has been developed that uses a high-resolution, high-temperature infrared camera to capture a specimen's surface temperature distribution at fracture. Aluminum nitride (AlN) substrates are thermally shocked to fracture to demonstrate the technique. The surface temperature distribution for each test and AlN's thermal expansion are used as input in a finite-element model to determine the thermal-shock strength for each specimen. An uncensored thermal-shock strength Weibull distribution is then determined. The test and analysis algorithm show promise as a means to characterize thermal shock strength of ceramic materials.
We compared two regression models, which are based on the Weibull and probit functions, for the analysis of pesticide toxicity data from laboratory studies on Illinois crop and native plant species. Both mathematical models are continuous, differentiable, strictly positive, and...
NASA Technical Reports Server (NTRS)
Szatmary, Steven A.; Gyekenyesi, John P.; Nemeth, Noel N.
1990-01-01
This manual describes the operation and theory of the PC-CARES (Personal Computer-Ceramic Analysis and Reliability Evaluation of Structures) computer program for the IBM PC and compatibles running the PC-DOS/MS-DOS or IBM/MS-OS/2 (version 1.1 or higher) operating systems. The primary purpose of this code is to estimate Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities. Included in the manual is the description of the calculation of shape and scale parameters of the two-parameter Weibull distribution using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. The methods for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull line, as well as the techniques for calculating the Batdorf flaw-density constants are also described.
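The least-squares route mentioned in the manual is commonly implemented as median-rank regression on the linearized Weibull CDF: plotting ln(−ln(1−F)) against ln(strength) gives a line with slope equal to the shape parameter. A sketch of that idea (the exact plotting position PC-CARES uses is an assumption here):

```python
import math

def weibull_ls_fit(strengths):
    """Least-squares (median-rank) estimate of Weibull shape m and scale s0.
    Regresses y = ln(-ln(1 - F_i)) on x = ln(sigma_i); slope = m,
    intercept = -m * ln(s0).  Uses the (i - 0.3)/(n + 0.4) plotting position."""
    xs = sorted(strengths)
    n = len(xs)
    x = [math.log(v) for v in xs]
    y = [math.log(-math.log(1.0 - (i + 0.7) / (n + 0.4))) for i in range(n)]
    mx, my = sum(x) / n, sum(y) / n
    m = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    s0 = math.exp(mx - my / m)
    return m, s0
```

Maximum likelihood (the manual's other method) generally gives less biased shape estimates for censored samples, which is why both are offered.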
Monolithic ceramic analysis using the SCARE program
NASA Technical Reports Server (NTRS)
Manderscheid, Jane M.
1988-01-01
The Structural Ceramics Analysis and Reliability Evaluation (SCARE) computer program calculates the fast fracture reliability of monolithic ceramic components. The code is a post-processor to the MSC/NASTRAN general purpose finite element program. The SCARE program automatically accepts the MSC/NASTRAN output necessary to compute reliability. This includes element stresses, temperatures, volumes, and areas. The SCARE program computes two-parameter Weibull strength distributions from input fracture data for both volume and surface flaws. The distributions can then be used to calculate the reliability of geometrically complex components subjected to multiaxial stress states. Several fracture criteria and flaw types are available for selection by the user, including out-of-plane crack extension theories. The theoretical basis for the reliability calculations was proposed by Batdorf. These models combine linear elastic fracture mechanics (LEFM) with Weibull statistics to provide a mechanistic failure criterion. Other fracture theories included in SCARE are the normal stress averaging technique and the principle of independent action. The objective of this presentation is to summarize these theories, including their limitations and advantages, and to provide a general description of the SCARE program, along with example problems.
NASA Technical Reports Server (NTRS)
Gyekenyesi, J. P.
1985-01-01
A computer program was developed for calculating the statistical fast fracture reliability and failure probability of ceramic components. The program includes the two-parameter Weibull material fracture strength distribution model, using the principle of independent action for polyaxial stress states and Batdorf's shear-sensitive as well as shear-insensitive crack theories, all for volume distributed flaws in macroscopically isotropic solids. Both penny-shaped cracks and Griffith cracks are included in the Batdorf shear-sensitive crack response calculations, using Griffith's maximum tensile stress or critical coplanar strain energy release rate criteria to predict mixed mode fracture. Weibull material parameters can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and fracture data. The reliability prediction analysis uses MSC/NASTRAN stress, temperature and volume output, obtained from the use of three-dimensional, quadratic, isoparametric, or axisymmetric finite elements. The statistical fast fracture theories employed, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.
Weibull crack density coefficient for polydimensional stress states
NASA Technical Reports Server (NTRS)
Gross, Bernard; Gyekenyesi, John P.
1989-01-01
A structural ceramic analysis and reliability evaluation code has recently been developed encompassing volume and surface flaw induced fracture, modeled by the two-parameter Weibull probability density function. A segment of the software involves computing the Weibull polydimensional stress state crack density coefficient from uniaxial stress experimental fracture data. The relationship of the polydimensional stress coefficient to the uniaxial stress coefficient is derived for a shear-insensitive material with a random surface flaw population.
The probability distribution model of air pollution index and its dominants in Kuala Lumpur
NASA Astrophysics Data System (ADS)
AL-Dhurafi, Nasr Ahmed; Razali, Ahmad Mahir; Masseran, Nurulkamal; Zamzuri, Zamira Hasanah
2016-11-01
This paper focuses on statistical modeling of the distributions of the air pollution index (API) and its sub-index data observed at Kuala Lumpur in Malaysia. Five pollutants or sub-indexes are measured, including carbon monoxide (CO), sulphur dioxide (SO2), nitrogen dioxide (NO2) and particulate matter (PM10). Four probability distributions are considered, namely log-normal, exponential, Gamma and Weibull, in the search for the best-fit distribution to the Malaysian air pollutant data. In order to determine the best distribution for describing the air pollutant data, five goodness-of-fit criteria are applied. This will help in minimizing the uncertainty in pollution resource estimates and improving the assessment phase of planning. The conflict in criterion results for selecting the best distribution was overcome by using the weight-of-ranks method. We found that the Gamma distribution is the best distribution for the majority of air pollutant data in Kuala Lumpur.
Historical floods in flood frequency analysis: Is this game worth the candle?
NASA Astrophysics Data System (ADS)
Strupczewski, Witold G.; Kochanek, Krzysztof; Bogdanowicz, Ewa
2017-11-01
In flood frequency analysis (FFA) the profit from inclusion of historical information on the largest pre-instrumental floods depends primarily on the reliability of that information, i.e. the accuracy of the magnitude and return period of the floods. This study is focused on the possible theoretical maximum gain in accuracy of estimates of upper quantiles that can be obtained by incorporating the largest historical floods of known return periods into the FFA. We assumed a simple case: N years of systematic records of annual maximum flows and either the one largest (XM1) or two largest (XM1 and XM2) flood peak flows in a historical M-year-long period. The problem is explored by Monte Carlo simulations with the maximum likelihood (ML) method. Both correct and false distributional assumptions are considered. In the first case the two-parameter extreme value models (Gumbel, log-Gumbel, Weibull) with various coefficients of variation serve as parent distributions. In the case of an unknown parent distribution, the Weibull distribution was assumed as the estimating model and the truncated Gumbel as the parent distribution. The return periods of XM1 and XM2 are determined from the parent distribution. The results are then compared with the case when the return periods of XM1 and XM2 are defined by their plotting positions. The results are presented in terms of bias, root mean square error and the probability of overestimation of the quantile with a 100-year return period. The results indicate that the maximum profit from inclusion of pre-instrumental floods in the FFA may prove smaller than the cost of reconstructing the historical hydrological information.
Durability evaluation of ceramic components using CARES/LIFE
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.
1994-01-01
The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker equation. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Application of this design methodology is demonstrated using experimental data from alumina bar and disk flexure specimens which exhibit SCG when exposed to water.
Durability evaluation of ceramic components using CARES/LIFE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nemeth, N.N.; Janosik, L.A.; Gyekenyesi, J.P.
1996-01-01
The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker equation. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Application of this design methodology is demonstrated using experimental data from alumina bar and disk flexure specimens, which exhibit SCG when exposed to water.
Lifetime Reliability Evaluation of Structural Ceramic Parts with the CARES/LIFE Computer Program
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.
1993-01-01
The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker equation. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), Weibull's normal stress averaging method (NSA), or Batdorf's theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating cyclic fatigue parameter estimation and component reliability analysis with proof testing are included.
Caraviello, D Z; Weigel, K A; Gianola, D
2004-05-01
Predicted transmitting abilities (PTA) of US Jersey sires for daughter longevity were calculated using a Weibull proportional hazards sire model and compared with predictions from a conventional linear animal model. Culling data from 268,008 Jersey cows with first calving from 1981 to 2000 were used. The proportional hazards model included time-dependent effects of herd-year-season contemporary group and parity by stage of lactation interaction, as well as time-independent effects of sire and age at first calving. Sire variances and parameters of the Weibull distribution were estimated, providing heritability estimates of 4.7% on the log scale and 18.0% on the original scale. The PTA of each sire was expressed as the expected risk of culling relative to daughters of an average sire. Risk ratios (RR) ranged from 0.7 to 1.3, indicating that the risk of culling for daughters of the best sires was 30% lower than for daughters of average sires and nearly 50% lower than for daughters of the poorest sires. Sire PTA from the proportional hazards model were compared with PTA from a linear model similar to that used for routine national genetic evaluation of length of productive life (PL) using cross-validation in independent samples of herds. Models were compared using logistic regression of daughters' stayability to second, third, fourth, or fifth lactation on their sires' PTA values, with alternative approaches for weighting the contribution of each sire. Models were also compared using logistic regression of daughters' stayability to 36, 48, 60, 72, and 84 mo of life. The proportional hazards model generally yielded more accurate predictions according to these criteria, but differences in predictive ability between methods were smaller when using a Kullback-Leibler distance than with other approaches. Results of this study suggest that survival analysis methodology may provide more accurate predictions of genetic merit for longevity than conventional linear models.
Life and reliability modeling of bevel gear reductions
NASA Technical Reports Server (NTRS)
Savage, M.; Brikmanis, C. K.; Lewicki, D. G.; Coy, J. J.
1985-01-01
A reliability model is presented for bevel gear reductions with either a single input pinion or dual input pinions of equal size. The dual pinions may or may not have the same power applied for the analysis. The gears may be straddle mounted or supported in a bearing quill. The reliability model is based on the Weibull distribution. The reduction's basic dynamic capacity is defined as the output torque which may be applied for one million output rotations of the bevel gear with a 90 percent probability of reduction survival.
Weibull-Based Design Methodology for Rotating Aircraft Engine Structures
NASA Technical Reports Server (NTRS)
Zaretsky, Erwin; Hendricks, Robert C.; Soditus, Sherry
2002-01-01
The NASA Energy Efficient Engine (E(sup 3)-Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue (HCF) or low-cycle fatigue (LCF). Knowing the cumulative life distribution of each of the components making up the engine, as represented by a Weibull slope, is a prerequisite to predicting the life and reliability of the entire engine. As the engine Weibull slope increases, the predicted lives decrease. The predicted engine lives L(sub 5) (95% probability of survival) of approximately 17,000 and 32,000 hr do correlate with current engine maintenance practices without and with refurbishment, respectively. The individual high pressure turbine (HPT) blade lives necessary to obtain a blade system life L(sub 0.1) (99.9% probability of survival) of 9000 hr for Weibull slopes of 3, 6 and 9 are 47,391, 20,652 and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9%, the predicted disk system life L(sub 0.1) can vary from 9,408 to 24,911 hr.
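The life-at-reliability arithmetic running through the abstract follows from the Weibull survival function S(t) = exp[-(t/eta)^beta]: the L(sub p) life is the time at which survival probability equals 1 - p/100, and a series system of components combines by multiplying survivals. A minimal sketch, with made-up characteristic lives rather than the E(sup 3)-Engine data:

```python
import math

def weibull_life(eta, beta, reliability):
    """Life at which a single Weibull component retains the given survival probability."""
    return eta * (-math.log(reliability)) ** (1.0 / beta)

def system_life(etas, beta, reliability):
    """Life at which a series system of independent Weibull components sharing
    a common slope beta retains the given survival probability."""
    # Product of survivals = R  =>  sum_i (t / eta_i)**beta = -ln(R)
    s = sum(eta ** -beta for eta in etas)
    return (-math.log(reliability) / s) ** (1.0 / beta)

# e.g. a blade row treated as a series system of identical blades (values made up)
blade_row_l01 = system_life([40000.0] * 90, 3.0, 0.999)
```

The system life is always shorter than any individual component life at the same reliability, which is why the individual blade lives quoted above are several times the 9000-hr blade system life.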
A Mixed Kijima Model Using the Weibull-Based Generalized Renewal Processes
2015-01-01
Generalized Renewal Processes are useful for modeling the rejuvenation of dynamical systems resulting from planned or unplanned interventions. We present new perspectives for Generalized Renewal Processes in general and for Weibull-based Generalized Renewal Processes in particular. Departing from the existing literature, we present a mixed Generalized Renewal Processes approach involving Kijima Type I and II models, allowing one to infer the impact of distinct interventions on the performance of the system under study. The first and second theoretical moments of this model are introduced, as well as its maximum likelihood estimation and random sampling approaches. In order to illustrate the usefulness of the proposed Weibull-based Generalized Renewal Processes model, some real data sets involving improving, stable, and deteriorating systems are used. PMID:26197222
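Random sampling from a Weibull-based Generalized Renewal Process with Kijima virtual ages can be sketched as below. This shows the two classic Kijima types separately rather than the paper's mixed model, and all parameter values are illustrative:

```python
import math
import random

def sample_grp(beta, eta, q, kijima_type, n, seed=1):
    """Sample n successive times-between-failures from a Weibull-based GRP.
    q is the repair-effectiveness parameter; kijima_type is 1 or 2."""
    rng = random.Random(seed)
    v = 0.0  # virtual age just after the last intervention
    times = []
    for _ in range(n):
        u = rng.random()
        # Inverse of the conditional Weibull survival given virtual age v:
        # P(X > x | v) = exp[(v/eta)**beta - ((v+x)/eta)**beta]
        x = eta * ((v / eta) ** beta - math.log(u)) ** (1.0 / beta) - v
        times.append(x)
        # Kijima Type I: v += q*x ; Kijima Type II: v = q*(v + x)
        v = v + q * x if kijima_type == 1 else q * (v + x)
    return times
```

With q = 0 both types reduce to an ordinary renewal process (as-good-as-new repair); with q = 1, Type I reproduces minimal repair.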
Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models
Gelfand, Lois A.; MacKinnon, David P.; DeRubeis, Robert J.; Baraldi, Amanda N.
2016-01-01
Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome—underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results. PMID:27065906
Ferrario, Mariana I; Guerrero, Sandra N
The purpose of this study was to analyze the response of different initial contamination levels of Alicyclobacillus acidoterrestris ATCC 49025 spores in apple juice as affected by pulsed light treatment (PL, batch mode, xenon lamp, 3 pulses/s, 0-71.6 J/cm²). Biphasic and Weibull frequency distribution models were used to characterize the relationship between inoculum size and treatment time and the reductions achieved after PL exposure. Additionally, a second-order polynomial model was computed to relate required PL processing time to inoculum size and requested log reductions. PL treatment caused up to 3.0-3.5 log reductions, depending on the initial inoculum size. Inactivation curves corresponding to PL-treated samples were adequately characterized by both the Weibull and biphasic models (adjusted R² = 94-96%), and revealed that lower initial inoculum sizes were associated with higher inactivation rates. According to the polynomial model, the predicted time for PL treatment increased exponentially with inoculum size. Copyright © 2017 Asociación Argentina de Microbiología. Publicado por Elsevier España, S.L.U. All rights reserved.
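The Weibull primary inactivation model used in studies of this kind has the form log10(N/N0) = -(t/delta)^p, where delta is the time for the first decimal reduction and p the shape parameter. A fitting sketch on synthetic data standing in for the PL survival counts (delta = 5, p = 0.6 are made-up values, not the paper's estimates):

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_log_reduction(t, delta, p):
    """Weibull primary inactivation model: log10(N/N0) after treatment time t."""
    return -(t / delta) ** p

# Synthetic treatment times (s) and noisy log10 reductions
t = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 30.0, 60.0])
logs = weibull_log_reduction(t, 5.0, 0.6) + np.random.default_rng(1).normal(0.0, 0.05, t.size)

# Bounds keep delta and p positive during the least-squares search
(delta_hat, p_hat), _ = curve_fit(weibull_log_reduction, t, logs,
                                  p0=(1.0, 1.0),
                                  bounds=([1e-6, 1e-6], [np.inf, np.inf]))
```

A p below 1, as used here, gives the concave-upward ("tailing") curves typical of resistant subpopulations.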
Cheung, Li C; Pan, Qing; Hyun, Noorie; Schiffman, Mark; Fetterman, Barbara; Castle, Philip E; Lorey, Thomas; Katki, Hormuzd A
2017-09-30
For cost-effectiveness and efficiency, many large-scale general-purpose cohort studies are being assembled within large health-care providers who use electronic health records. Two key features of such data are that incident disease is interval-censored between irregular visits and there can be pre-existing (prevalent) disease. Because prevalent disease is not always immediately diagnosed, some disease diagnosed at later visits is actually undiagnosed prevalent disease. We consider prevalent disease as a point mass at time zero for clinical applications where there is no interest in the time of prevalent disease onset. We demonstrate that the naive Kaplan-Meier cumulative risk estimator underestimates risks at early time points and overestimates later risks. We propose a general family of mixture models for undiagnosed prevalent disease and interval-censored incident disease that we call prevalence-incidence models. Parameters for parametric prevalence-incidence models, such as the logistic regression and Weibull survival (logistic-Weibull) model, are estimated by direct likelihood maximization or by EM algorithm. Non-parametric methods are proposed to calculate cumulative risks for cases without covariates. We compare naive Kaplan-Meier, logistic-Weibull, and non-parametric estimates of cumulative risk in the cervical cancer screening program at Kaiser Permanente Northern California. Kaplan-Meier provided poor estimates while the logistic-Weibull model was a close fit to the non-parametric. Our findings support our use of logistic-Weibull models to develop the risk estimates that underlie current US risk-based cervical cancer screening guidelines. Published 2017. This article has been contributed to by US Government employees and their work is in the public domain in the USA.
Choice of time-scale in Cox's model analysis of epidemiologic cohort data: a simulation study.
Thiébaut, Anne C M; Bénichou, Jacques
2004-12-30
Cox's regression model is widely used for assessing associations between potential risk factors and disease occurrence in epidemiologic cohort studies. Although age is often a strong determinant of disease risk, authors have frequently used time-on-study instead of age as the time-scale, as for clinical trials. Unless the baseline hazard is an exponential function of age, this approach can yield different estimates of relative hazards than using age as the time-scale, even when age is adjusted for. We performed a simulation study in order to investigate the existence and magnitude of bias for different degrees of association between age and the covariate of interest. Age to disease onset was generated from exponential, Weibull or piecewise Weibull distributions, and both fixed and time-dependent dichotomous covariates were considered. We observed no bias upon using age as the time-scale. Upon using time-on-study, we verified the absence of bias for exponentially distributed age to disease onset. For non-exponential distributions, we found that bias could occur even when the covariate of interest was independent from age. It could be severe in case of substantial association with age, especially with time-dependent covariates. These findings were illustrated on data from a cohort of 84,329 French women followed prospectively for breast cancer occurrence. In view of our results, we strongly recommend not using time-on-study as the time-scale for analysing epidemiologic cohort data. 2004 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Rikitake, T.
1999-03-01
In light of newly acquired geophysical information about earthquake generation in the Tokai area, Central Japan, where the occurrence of a great earthquake of magnitude 8 or so has recently been feared, probabilities of earthquake occurrence in the near future are reevaluated. Much of the data used for evaluation here relies on recently developed paleoseismology, tsunami study and GPS geodesy. The new Weibull distribution analysis of the recurrence tendency of great earthquakes in the Tokai-Nankai zone indicates that the mean return period of great earthquakes there is estimated as 109 yr with a standard deviation amounting to 33 yr. These values do not differ much from those of previous studies (Rikitake, 1976, 1986; Utsu, 1984). Taking the newly determined velocities of the motion of the Philippine Sea plate at various portions of the Tokai-Nankai zone into account, the ultimate displacements to rupture at the plate boundary are obtained. A Weibull distribution analysis results in a mean ultimate displacement amounting to 4.70 m with a standard deviation estimated as 0.86 m. A return period amounting to 117 yr is obtained at the Suruga Bay portion by dividing the mean ultimate displacement by the relative plate velocity. With the aid of the fault models as determined from the tsunami studies, the increases in the cumulative seismic slips associated with the great earthquakes are examined at various portions of the zone. It appears that a slip-predictable model can be applied to the occurrence mode of great earthquakes in the zone better than a time-predictable model. The crustal strain accumulating over the Tokai area, as estimated from the newly developed geodetic work including the GPS observations, is compared to the ultimate strain presumed by the above two models. The probabilities for a great earthquake to recur in the Tokai district are then estimated with the aid of the Weibull analysis parameters obtained for the four cases discussed above. All the probabilities evaluated for the four cases take on values ranging from 35 to 45 percent for a ten-year period following the year 2000.
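The probability calculation behind such estimates is the conditional Weibull probability of an event in the next h years given quiescence so far, P = 1 - S(T+h)/S(T). A sketch under stated assumptions: a shape of about 3.7 reproduces the reported coefficient of variation (33/109 yr), the scale is set from the 109-yr mean, and the 146-yr elapsed time (from the 1854 rupture to the year 2000) is our illustrative choice, not a value taken from the paper:

```python
import math

def weibull_conditional_prob(k, eta, elapsed, horizon):
    """P(event within `horizon` yr | quiet for `elapsed` yr) under a Weibull model."""
    surv = lambda t: math.exp(-((t / eta) ** k))
    return 1.0 - surv(elapsed + horizon) / surv(elapsed)

# Shape ~3.7 matches cv = 33/109; scale chosen so the mean return period is 109 yr
k = 3.7
eta = 109.0 / math.gamma(1.0 + 1.0 / k)

# Assumed elapsed time of 146 yr by the year 2000
p10 = weibull_conditional_prob(k, eta, 146.0, 10.0)
```

With these assumptions the ten-year probability lands in the same 35-45 percent band the abstract reports, which is a useful sanity check rather than a reproduction of the paper's four cases.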
Bonded-cell model for particle fracture.
Nguyen, Duc-Hanh; Azéma, Emilien; Sornay, Philippe; Radjai, Farhang
2015-02-01
Particle degradation and fracture play an important role in natural granular flows and in many applications of granular materials. We analyze the fracture properties of two-dimensional disklike particles modeled as aggregates of rigid cells bonded along their sides by a cohesive Mohr-Coulomb law and simulated by the contact dynamics method. We show that the compressive strength scales with tensile strength between cells but depends also on the friction coefficient and a parameter describing cell shape distribution. The statistical scatter of compressive strength is well described by the Weibull distribution function with a shape parameter varying from 6 to 10 depending on cell shape distribution. We show that this distribution may be understood in terms of percolating critical intercellular contacts. We propose a random-walk model of critical contacts that leads to particle size dependence of the compressive strength in good agreement with our simulation data.
NASA Astrophysics Data System (ADS)
Fetisova, Yu. A.; Ermolenko, B. V.; Ermolenko, G. V.; Kiseleva, S. V.
2017-04-01
We studied the information basis for the assessment of wind power potential on the territory of Russia. We described the methodology to determine the parameters of the Weibull function, which reflects the density of distribution of probabilities of wind flow speeds at a defined basic height above the surface of the earth using the available data on the average speed at this height and its repetition by gradations. The application of the least square method for determining these parameters, unlike the use of graphical methods, allows performing a statistical assessment of the results of approximation of empirical histograms by the Weibull formula. On the basis of the computer-aided analysis of the statistical data, it was shown that, at a fixed point where the wind speed changes at different heights, the range of parameter variation of the Weibull distribution curve is relatively small, the sensitivity of the function to parameter changes is quite low, and the influence of changes on the shape of speed distribution curves is negligible. Taking this into consideration, we proposed and mathematically verified the methodology of determining the speed parameters of the Weibull function at other heights using the parameter computations for this function at a basic height, which is known or defined by the average speed of wind flow, or the roughness coefficient of the geological substrate. We gave examples of practical application of the suggested methodology in the development of the Atlas of Renewable Energy Resources in Russia in conditions of deficiency of source meteorological data. 
The proposed methodology, to some extent, may solve the problem related to the lack of information on the vertical profile of repeatability of the wind flow speeds in the presence of a wide assortment of wind turbines with different ranges of wind-wheel axis heights and various performance characteristics in the global market; as a result, this methodology can become a powerful tool for effective selection of equipment in the process of designing a power supply system in a certain location.
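The least-squares determination of the Weibull parameters from a wind-speed histogram can be sketched via the standard linearisation of the Weibull CDF, ln(-ln(1-F)) = k ln v - k ln c. The synthetic histogram and variable names below are our own, not the Atlas data:

```python
import numpy as np

def fit_weibull_lsq(speeds, cum_freq):
    """Least-squares fit of Weibull shape k and scale c to a wind-speed
    histogram, given bin speeds and cumulative frequencies F in (0, 1)."""
    v = np.asarray(speeds, float)
    F = np.asarray(cum_freq, float)
    x = np.log(v)
    y = np.log(-np.log(1.0 - F))        # linearised CDF
    k, intercept = np.polyfit(x, y, 1)  # slope = k, intercept = -k*ln(c)
    c = np.exp(-intercept / k)
    return k, c

# Synthetic check: cumulative frequencies from a known Weibull(k=2, c=7 m/s)
v = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 12.0])
F = 1.0 - np.exp(-((v / 7.0) ** 2.0))
k, c = fit_weibull_lsq(v, F)
```

Unlike graphical fitting, the regression residuals here give the statistical assessment of the approximation quality that the abstract emphasizes.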
Bevilacqua, Antonio; Speranza, Barbara; Sinigaglia, Milena; Corbo, Maria Rosaria
2015-01-01
Predictive Microbiology (PM) deals with the mathematical modeling of microorganisms in foods for different applications (challenge test, evaluation of microbiological shelf life, prediction of the microbiological hazards connected with foods, etc.). An interesting and important part of PM focuses on the use of primary functions to fit data of death kinetics of spoilage, pathogenic, and useful microorganisms following thermal or non-conventional treatments and can also be used to model survivors throughout storage. The main topic of this review is a focus on the most important death models (negative Gompertz, log-linear, shoulder/tail, Weibull, Weibull+tail, re-parameterized Weibull, biphasic approach, etc.) to pinpoint the benefits and the limits of each model; in addition, the last section addresses the most important tools for the use of death kinetics and predictive microbiology in a user-friendly way. PMID:28231222
Application of the weibull distribution function to the molecular weight distribution of cellulose
A. Broido; Hsiukang Yow
1977-01-01
The molecular weight distribution of a linear homologous polymer is usually obtained empirically for any particular sample. Sample-to-sample comparisons are made in terms of the weight- or number-average molecular weights and graphic displays of the distribution curves. Such treatment generally precludes data interpretations in which a distribution can be described in...
Chang, Wen-Ruey; Matz, Simon; Chang, Chien-Chi
2014-05-01
The maximum coefficient of friction that can be supported at the shoe and floor interface without a slip is usually called the available coefficient of friction (ACOF) for human locomotion. The probability of a slip could be estimated using a statistical model by comparing the ACOF with the required coefficient of friction (RCOF), assuming that both coefficients have stochastic distributions. An investigation of the stochastic distributions of the ACOF of five different floor surfaces under dry, water and glycerol conditions is presented in this paper. One hundred friction measurements were performed on each floor surface under each surface condition. The Kolmogorov-Smirnov goodness-of-fit test was used to determine if the distribution of the ACOF was a good fit with the normal, log-normal and Weibull distributions. The results indicated that the ACOF distributions matched the normal and log-normal distributions slightly better than the Weibull in only three of the 15 cases with statistical significance. The results are far more complex than what had heretofore been published, and different scenarios could emerge. Since the ACOF is compared with the RCOF to estimate slip probability, the distribution of the ACOF in seven cases could be considered a constant for this purpose when the ACOF is much lower or higher than the RCOF. A few cases could be represented by a normal distribution for practical reasons, based on their skewness and kurtosis values, without statistical significance. No representation could be found in three of the 15 cases. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
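The Kolmogorov-Smirnov screening step can be sketched as below; the lognormal sample stands in for one floor/condition pair's 100 ACOF measurements and is purely illustrative. Note one caveat the sketch shares with any such analysis: applying the KS test with parameters estimated from the same sample inflates the p-values, so a Lilliefors-type correction would be needed for strict inference:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical stand-in for 100 ACOF measurements on one floor/condition pair
acof = rng.lognormal(mean=np.log(0.4), sigma=0.15, size=100)

results = {}
for name, dist in [("norm", stats.norm), ("lognorm", stats.lognorm),
                   ("weibull_min", stats.weibull_min)]:
    params = dist.fit(acof)                      # ML fit of each candidate family
    d, p = stats.kstest(acof, name, args=params)  # KS statistic and (optimistic) p-value
    results[name] = (d, p)
```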
Failure rate analysis of Goddard Space Flight Center spacecraft performance during orbital life
NASA Technical Reports Server (NTRS)
Norris, H. P.; Timmins, A. R.
1976-01-01
Space life performance data on 57 Goddard Space Flight Center spacecraft are analyzed from the standpoint of determining an appropriate reliability model and the associated reliability parameters. Data from published NASA reports, which cover the space performance of GSFC spacecraft launched in the 1960-1970 decade, form the basis of the analyses. The results of the analyses show that the time distribution of 449 malfunctions, of which 248 were classified as failures (not necessarily catastrophic), follow a reliability growth pattern that can be described with either the Duane model or a Weibull distribution. The advantages of both mathematical models are used in order to: identify space failure rates, observe chronological trends, and compare failure rates with those experienced during the prelaunch environmental tests of the flight model spacecraft.
Categorical Data Analysis Using a Skewed Weibull Regression Model
NASA Astrophysics Data System (ADS)
Caron, Renault; Sinha, Debajyoti; Dey, Dipak; Polpo, Adriano
2018-03-01
In this paper, we present a Weibull link (skewed) model for categorical response data arising from binomial as well as multinomial models. We show that, for such types of categorical data, the most commonly used models (logit, probit and complementary log-log) can be obtained as limiting cases. We further compare the proposed model with some other asymmetrical models. The Bayesian as well as frequentist estimation procedures for binomial and multinomial data responses are presented in detail. The analysis of two data sets to show the efficiency of the proposed model is performed.
Polynomial probability distribution estimation using the method of moments.
Munkhammar, Joakim; Mattsson, Lars; Rydén, Jesper
2017-01-01
We suggest a procedure for estimating Nth degree polynomial approximations to unknown (or known) probability density functions (PDFs) based on N statistical moments from each distribution. The procedure is based on the method of moments and is setup algorithmically to aid applicability and to ensure rigor in use. In order to show applicability, polynomial PDF approximations are obtained for the distribution families Normal, Log-Normal, Weibull as well as for a bimodal Weibull distribution and a data set of anonymized household electricity use. The results are compared with results for traditional PDF series expansion methods of Gram-Charlier type. It is concluded that this procedure is a comparatively simple procedure that could be used when traditional distribution families are not applicable or when polynomial expansions of probability distributions might be considered useful approximations. In particular this approach is practical for calculating convolutions of distributions, since such operations become integrals of polynomial expressions. Finally, in order to show an advanced applicability of the method, it is shown to be useful for approximating solutions to the Smoluchowski equation.
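The core linear-algebra step of the procedure, matching the raw moments of a polynomial density on an interval, can be sketched as follows. The interval endpoints and the uniform-distribution sanity check are our illustrative choices, not the paper's examples:

```python
import numpy as np

def polynomial_pdf(moments, a, b):
    """Method-of-moments polynomial PDF approximation on [a, b].
    moments[k] is the k-th raw moment of the target distribution, with
    moments[0] == 1; returns a numpy Polynomial of degree len(moments)-1."""
    n = len(moments)
    k = np.arange(n)[:, None]
    j = np.arange(n)[None, :]
    # A[k, j] = integral of x**(k+j) dx over [a, b]
    A = (b ** (k + j + 1) - a ** (k + j + 1)) / (k + j + 1)
    # Solve sum_j c_j * A[k, j] = m_k for the polynomial coefficients c_j
    coeffs = np.linalg.solve(A, np.asarray(moments, float))
    return np.polynomial.Polynomial(coeffs)

# Sanity check: Uniform(0,1) has moments 1, 1/2, 1/3, so the approximation
# should recover the constant density 1 on [0, 1]
p = polynomial_pdf([1.0, 0.5, 1.0 / 3.0], 0.0, 1.0)
```

As the abstract notes, a representation like this makes convolutions easy, since products and integrals of polynomials stay polynomial; for high degrees the moment matrix becomes ill-conditioned, which is a practical limit of the approach.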
Jafari-Koshki, Tohid; Mansourian, Marjan; Mokarian, Fariborz
2014-01-01
Breast cancer is a fatal disease and the most frequently diagnosed cancer in women, with an increasing pattern worldwide. The burden is mostly attributed to metastatic cancers, which occur in one-third of patients, and the treatments are palliative. It is of great interest to determine factors affecting the time from cancer diagnosis to secondary metastasis. Cure rate models assume a Poisson distribution for the number of unobservable metastatic-component cells that are completely deleted from the non-metastasis patient body, but some may remain and result in metastasis. Time to metastasis is defined as a function of the number of these cells and the time for each cell to develop a detectable sign of metastasis. Covariates are introduced to the model via the rate of metastatic-component cells. We used non-mixture cure rate models with Weibull and log-logistic distributions in a Bayesian setting to assess the relationship between metastasis-free survival and covariates. The median metastasis-free survival was 76.9 months. Various models showed that, of the covariates in the study, lymph node involvement ratio and being progesterone receptor positive were significant, with an adverse and a beneficial effect on metastasis-free survival, respectively. The estimated fraction of patients cured from metastasis was almost 48%. The Weibull model had a slightly better performance than the log-logistic. Cure rate models are popular in survival studies and outperform other models under certain conditions. We explored the prognostic factors of metastatic breast cancer from a different viewpoint. In this study, metastasis sites were analyzed all together. Conducting similar studies in a larger sample of cancer patients, as well as evaluating the prognostic value of covariates in metastasis to each site separately, is recommended.
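The non-mixture (promotion time) cure structure described above has a simple closed form: with a Poisson(theta) count of latent metastatic-component cells whose activation times follow a distribution F, the population survival is S(t) = exp(-theta * F(t)) and the cure fraction is exp(-theta). A sketch using the study's ~48% cure fraction to fix theta; the Weibull shape and scale below are made-up values:

```python
import math

def promotion_time_survival(t, theta, k, lam):
    """Non-mixture (promotion time) cure model: population survival with a
    Poisson(theta) number of latent metastatic-component cells whose
    activation times are Weibull(shape=k, scale=lam)."""
    F = 1.0 - math.exp(-((t / lam) ** k))  # Weibull CDF of cell activation time
    return math.exp(-theta * F)

# A ~48% cure fraction (the study's estimate) fixes theta = -ln(0.48);
# k = 1.2 and lam = 50 months are illustrative only
theta = -math.log(0.48)
s60 = promotion_time_survival(60.0, theta, 1.2, 50.0)
```

Covariates enter through theta (the cell rate), which is how the lymph node ratio and progesterone receptor status act in the fitted models.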
NASA Astrophysics Data System (ADS)
Witt, Annette; Ehlers, Frithjof; Luther, Stefan
2017-09-01
We have analyzed symbol sequences of heart beat annotations obtained from 24-h electrocardiogram recordings of 184 post-infarction patients (from the Cardiac Arrhythmia Suppression Trial database, CAST). In the symbol sequences, each heart beat was coded as an arrhythmic or as a normal beat. The symbol sequences were analyzed with a model-based approach which relies on a two-parametric peaks-over-threshold (POT) model, interpreting each premature ventricular contraction (PVC) as an extreme event. For the POT model, we explored (i) the Shannon entropy, estimated in terms of the Lempel-Ziv complexity, (ii) the shape parameter of the Weibull distribution that best fits the PVC return times, and (iii) the strength of long-range correlations quantified by detrended fluctuation analysis (DFA), over the two-dimensional parameter space. We have found that in the frame of our model the Lempel-Ziv complexity is functionally related to the shape parameter of the Weibull distribution. Thus, two complementary measures (entropy and strength of long-range correlations) are sufficient to characterize realizations of the two-parametric model. For the CAST data, we have found evidence for an intermediate strength of long-range correlations in the PVC timings, which is correlated with the age of the patient: younger post-infarction patients have a higher strength of long-range correlations than older patients. The normalized Shannon entropy has values in the range 0.5
Nguyen, Harrison H; Fong, Hanson; Paranjpe, Avina; Flake, Natasha M; Johnson, James D; Peters, Ove A
2014-08-01
The purpose of this study was to compare the fracture resistance to cyclic fatigue of ProTaper Next (PTN; Dentsply Tulsa Dental Specialties, Tulsa, OK), ProTaper Universal (PTU, Dentsply Tulsa Dental Specialties), and Vortex Blue (VB, Dentsply Tulsa Dental Specialties) rotary instruments. Twenty instruments each of PTN X1-X5, PTU S1-F5, and VB 20/04-50/04 were rotated until fracture in a simulated canal of 90° and a 5-mm radius using a custom-made testing platform. The number of cycles to fracture (NCF) was calculated. Weibull analysis was used to predict the maximum number of cycles when 99% of the instrument samples survive. VB 20/04-30/04 had significantly higher NCF than PTU S1-F5 and PTN X1-X5. VB 35/04-45/04 had significantly higher NCF than PTU S2-F5 and PTN X2-X5. PTN X1 had higher NCF than PTU S1-F5. PTN X2 had higher NCF than PTU F2-F5. The Weibull distribution predicted the highest number of cycles at which 99% of instruments survive to be 766 cycles for VB 25/04 and the lowest to be 50 cycles for PTU F2. Under the limitations of this study, VB 20/04-45/04 were more resistant to cyclic fatigue than PTN X2-X5 and PTU S2-F5. PTN X1 and X2 were more resistant to cyclic fatigue than PTU F2-F5. The Weibull distribution appears to be a feasible and potentially clinically relevant model to predict resistance to cyclic fatigue. Copyright © 2014 American Association of Endodontists. All rights reserved.
Analysis of Flexural Fatigue Strength of Self Compacting Fibre Reinforced Concrete Beams
NASA Astrophysics Data System (ADS)
Murali, G.; Sudar Celestina, J. P. Arul; Subhashini, N.; Vigneshwari, M.
2017-07-01
This study presents an extensive statistical investigation of variations in the flexural fatigue life of self-compacting fibrous concrete (FC) beams. For this purpose, the experimental data of earlier researchers were examined using the two-parameter Weibull distribution. Two methods, the graphical method and the method of moments, were used to analyse the variations in the experimental data, and the results are presented in the form of probability of survival. The Weibull parameter values obtained from the graphical method and the method of moments are precise. At the 0.7 stress level, the fatigue life is 59,861 cycles for a reliability of 90%.
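The method of moments mentioned above exploits the fact that a two-parameter Weibull's coefficient of variation depends only on the shape parameter, so the shape can be found by a one-dimensional root search and the scale then follows from the mean. A hedged sketch; the bisection bounds are assumptions.

```python
from math import gamma, sqrt

def weibull_moments_fit(mean, std, lo=0.1, hi=50.0, tol=1e-10):
    """Method-of-moments estimates (shape beta, scale eta) for a
    two-parameter Weibull from a sample mean and standard deviation.
    Illustrative sketch; solves the CV equation by bisection."""
    cv_target = std / mean

    def cv(beta):  # coefficient of variation depends on the shape only
        g1, g2 = gamma(1 + 1 / beta), gamma(1 + 2 / beta)
        return sqrt(g2 - g1 * g1) / g1

    while hi - lo > tol:  # cv(beta) is decreasing in beta
        mid = 0.5 * (lo + hi)
        if cv(mid) > cv_target:
            lo = mid
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    eta = mean / gamma(1 + 1 / beta)
    return beta, eta

# The exponential case (beta = 1) has mean = std = eta, a handy check.
beta, eta = weibull_moments_fit(2.0, 2.0)
```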
Transmission overhaul and replacement predictions using Weibull and renewal theory
NASA Technical Reports Server (NTRS)
Savage, M.; Lewicki, D. G.
1989-01-01
A method to estimate the frequency of transmission overhauls is presented. This method is based on the two-parameter Weibull statistical distribution for component life. A second method is presented to estimate the number of replacement components needed to support the transmission overhaul pattern. The second method is based on renewal theory. Confidence statistics are applied with both methods to improve the statistical estimate of sample behavior. A transmission example is also presented to illustrate the use of the methods. Transmission overhaul frequency and component replacement calculations are included in the example.
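A minimal Monte Carlo sketch of the renewal idea (not the paper's analytical method): draw successive Weibull component lives and count how many replacements fall within the support horizon. The parameter values are illustrative.

```python
import random
from math import log

def expected_renewals(beta, eta, horizon, n_sims=20000, seed=1):
    """Monte Carlo estimate of the expected number of component
    replacements (renewals) over a time horizon, assuming i.i.d.
    two-parameter Weibull(beta, eta) component lives. Sketch only."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_sims):
        t, count = 0.0, 0
        while True:
            # inverse-CDF sample: T = eta * (-ln(1 - U))**(1/beta)
            t += eta * (-log(1.0 - rng.random())) ** (1.0 / beta)
            if t > horizon:
                break
            count += 1
        total += count
    return total / n_sims

# beta = 1 reduces to a Poisson process with rate 1/eta,
# so about horizon/eta = 5 renewals are expected here.
m = expected_renewals(beta=1.0, eta=100.0, horizon=500.0)
```

Confidence bounds on the renewal count can be read off the simulated distribution of `count`, echoing the paper's use of confidence statistics.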
EXPERIMENTAL DESIGN STRATEGY FOR THE WEIBULL DOSE RESPONSE MODEL (JOURNAL VERSION)
The objective of the research was to determine optimum design point allocation for estimation of relative yield losses from ozone pollution when the true and fitted yield-ozone dose response relationship follows the Weibull. The optimum design is dependent on the values of the We...
Modeling of Abrasion and Crushing of Unbound Granular Materials During Compaction
NASA Astrophysics Data System (ADS)
Ocampo, Manuel S.; Caicedo, Bernardo
2009-06-01
Unbound compacted granular materials are commonly used in engineering structures as layers in road pavements, railroad beds, highway embankments, and foundations. These structures are generally subjected to dynamic loading by construction operations, traffic and wheel loads. These repeated or cyclic loads cause abrasion and crushing of the granular materials. Abrasion changes a particle's shape, and crushing divides the particle into a mixture of many small particles of varying sizes. Particle breakage is important because the mechanical and hydraulic properties of these materials depend upon their grain size distribution. Therefore, it is important to evaluate the evolution of the grain size distribution of these materials. In this paper an analytical model for unbound granular materials is proposed in order to evaluate particle crushing of gravels and soils subjected to cyclic loads. The model is based on a Markov chain which describes the development of grading changes in the material as a function of stress levels. In the model proposed, each particle size is a state in the system, and the evolution of the material is the movement of particles from one state to another in n steps. Each step is a load cycle, and movement between states is possible with a transition probability. The crushing of particles depends on the mechanical properties of each grain and the packing density of the granular material. The transition probability was calculated using both the survival probability defined by Weibull and the compressible packing model developed by De Larrard. Material mechanical properties are considered using the Weibull probability theory. The size and shape of the grains, as well as the method of processing the packing density are considered using De Larrard's model. Results of the proposed analytical model show a good agreement with the experimental tests carried out using the gyratory compaction test.
Kosmidis, Kosmas; Argyrakis, Panos; Macheras, Panos
2003-07-01
To verify the Higuchi law and study the drug release from cylindrical and spherical matrices by means of Monte Carlo computer simulation. A one-dimensional matrix, based on the theoretical assumptions of the derivation of the Higuchi law, was simulated and its time evolution was monitored. Cylindrical and spherical three-dimensional lattices were simulated with sites at the boundary of the lattice having been denoted as leak sites. Particles were allowed to move inside it using the random walk model. Excluded volume interactions between the particles were assumed. We have monitored the system time evolution for different lattice sizes and different initial particle concentrations. The Higuchi law was verified using the Monte Carlo technique in a one-dimensional lattice. It was found that Fickian drug release from cylindrical matrices can be approximated nicely with the Weibull function. A simple linear relation between the Weibull function parameters and the specific surface of the system was found. Drug release from a matrix, as a result of a diffusion process assuming excluded volume interactions between the drug molecules, can be described using a Weibull function. This model, although approximate and semiempirical, has the benefit of providing a simple physical connection between the model parameters and the system geometry, which was something missing from other semiempirical models.
On the robustness of a Bayes estimate. [in reliability theory
NASA Technical Reports Server (NTRS)
Canavos, G. C.
1974-01-01
This paper examines the robustness of a Bayes estimator with respect to the assigned prior distribution. A Bayesian analysis for a stochastic scale parameter of a Weibull failure model is summarized in which the natural conjugate is assigned as the prior distribution of the random parameter. The sensitivity analysis is carried out by the Monte Carlo method in which, although an inverted gamma is the assigned prior, realizations are generated using distribution functions of varying shape. For several distributional forms and even for some fixed values of the parameter, simulated mean squared errors of Bayes and minimum variance unbiased estimators are determined and compared. Results indicate that the Bayes estimator remains squared-error superior and appears to be largely robust to the form of the assigned prior distribution.
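A sketch of the kind of Monte Carlo comparison the abstract describes: with the shape parameter beta known, y_i = x_i**beta is exponential with scale theta = eta**beta, the inverted gamma IG(a, b) prior is conjugate, and the posterior mean is (b + sum(y)) / (a + n - 1). The prior hyperparameters a, b and all other values below are illustrative assumptions, not the paper's.

```python
import random
from math import log

def simulate_mse(theta=2.0, n=10, a=3.0, b=4.0, reps=20000, seed=7):
    """Monte Carlo mean squared errors of the Bayes (posterior mean,
    inverted-gamma IG(a, b) prior) and minimum-variance-unbiased
    estimators of the exponentialized Weibull scale theta = eta**beta.
    Illustrative sketch with assumed hyperparameters."""
    rng = random.Random(seed)
    se_bayes = se_mvu = 0.0
    for _ in range(reps):
        # sum of n exponential(theta) observations y_i = x_i**beta
        s = sum(-theta * log(1.0 - rng.random()) for _ in range(n))
        bayes = (b + s) / (a + n - 1)   # posterior mean under IG(a, b)
        mvu = s / n                      # sample mean, the MVU estimator
        se_bayes += (bayes - theta) ** 2
        se_mvu += (mvu - theta) ** 2
    return se_bayes / reps, se_mvu / reps

mse_bayes, mse_mvu = simulate_mse()
```

Here the prior mean b/(a-1) happens to equal the true theta, so the Bayes estimator's shrinkage costs no bias and its MSE comes out below the MVU estimator's, consistent with the paper's "squared-error superior" finding.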
Modeling Mass and Thermal Transport in Thin Porous Media of PEM Fuel Cells
NASA Astrophysics Data System (ADS)
Konduru, Vinaykumar
Water transport in the Porous Transport Layer (PTL) plays an important role in the efficient operation of polymer electrolyte membrane fuel cells (PEMFC). Excessive water content as well as dry operating conditions are unfavorable for efficient and reliable operation of the fuel cell. The effects of thermal conductivity and porosity on water management are investigated by simulating two-phase flow in the PTL of the fuel cell using a network model. In the model, the PTL consists of a pore-phase and a solid-phase. Different models of the PTLs are generated using independent Weibull distributions for the pore-phase and the solid-phase. The specific arrangement of the pores and solid elements is varied to obtain different PTL realizations for the same Weibull parameters. The properties of the PTL are varied by changing the porosity and thermal conductivity. The parameters affecting operating conditions include the temperature, relative humidity in the flow channel, voltage, and current density. In addition, a novel high-speed capable Surface Plasmon Resonance (SPR) microscope was built based on the Kretschmann configuration utilizing collimated Kohler illumination. The SPR microscope allows thin-film characterization at thicknesses of approximately 0-200 nm by measuring changes in the refractive index. Various independent experiments were run to measure film thickness during droplet coalescence and condensation.
Pugno, Nicola M
2007-01-01
In this paper we present a statistical analogy between the collapse of solids and living organisms; in particular we deduce a statistical law governing their probability of death. We have derived such a law by coupling the widely used Weibull statistics, developed for describing the distribution of the strength of solids, with a general model for ontogenetic growth recently proposed in the literature. The main idea presented in this paper is that cracks can propagate in solids and cause their failure just as sick cells in living organisms can cause their death. Making a rough analogy, living organisms are found to behave as "growing" mechanical components under cyclic, i.e., fatigue, loadings, composed of a dynamic evolutionary material that, as an ineluctable fate, deteriorates. The implications for biological scaling laws are discussed. As an example, we apply this Dynamic Weibull Statistics to large data collections on human deaths due to cancer of various types recorded in Italy: a significant agreement is observed.
Statistical analysis of lithium iron sulfide status cell cycle life and failure mode
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gay, E.C.; Battles, J.E.; Miller, W.E.
1983-08-01
A statistical model for electrochemical cell life-cycle testing was developed and verified experimentally. The Weibull distribution was selected to predict the end of life for a cell, based on a 20 percent loss of initial stabilized capacity or a decrease to less than 95 percent coulombic efficiency. Groups of 12 or more Li-alloy/FeS cells were cycled to determine the mean time to failure (MTTF) and also to identify the failure modes. The cells were all full-size electric vehicle cells with 150-350 A-hr capacity. The Weibull shape factors were determined and verified in predicting the number of cell failures in two 10-cell modules. The short-circuit failures in the cells with BN-felt and MgO powder separators were found to be caused by the formation of Li-Al protrusions that penetrated the BN-felt separators, and by the extrusion of active material at the edge of the electrodes.
Prediction of Mean and Design Fatigue Lives of Self Compacting Concrete Beams in Flexure
NASA Astrophysics Data System (ADS)
Goel, S.; Singh, S. P.; Singh, P.; Kaushik, S. K.
2012-02-01
In this paper, results of an investigation conducted to study the flexural fatigue characteristics of self-compacting concrete (SCC) beams are presented. An experimental programme was planned in which approximately 60 SCC beam specimens of size 100 × 100 × 500 mm were tested under flexural fatigue loading. Approximately 45 static flexural tests were also conducted to facilitate fatigue testing. The flexural fatigue and static flexural strength tests were conducted on a 100 kN servo-controlled actuator. The fatigue life data thus obtained have been used to establish the probability distributions of fatigue life of SCC using the two-parameter Weibull distribution. The parameters of the Weibull distribution have been obtained by different methods of analysis. Using the distribution parameters, the mean and design fatigue lives of SCC have been estimated and compared with those of normally vibrated concrete (NVC), the data for which have been taken from the literature. It has been observed that SCC exhibits higher mean and design fatigue lives compared to NVC.
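Once the Weibull parameters are fitted, the two quantities estimated above reduce to closed forms: the mean life is eta * Gamma(1 + 1/beta) and the design life at survival probability R is eta * (-ln R)**(1/beta). A small sketch with illustrative parameter values, not the paper's fitted ones:

```python
from math import gamma, log

def weibull_lives(beta, eta, reliability=0.9):
    """Mean life and design life (the life exceeded with probability
    `reliability`) for a two-parameter Weibull(beta, eta).
    Parameter values below are illustrative assumptions."""
    mean_life = eta * gamma(1.0 + 1.0 / beta)
    design_life = eta * (-log(reliability)) ** (1.0 / beta)
    return mean_life, design_life

# e.g. beta = 2, eta = 100000 load cycles, 90% reliability
mean_life, design_life = weibull_lives(beta=2.0, eta=100000.0)
```

Note how the design life at 90% reliability sits well below the mean life, which is why design values rather than means govern fatigue provisions.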
NASA Astrophysics Data System (ADS)
Okeniyi, Joshua Olusegun; Nwadialo, Christopher Chukwuweike; Olu-Steven, Folusho Emmanuel; Ebinne, Samaru Smart; Coker, Taiwo Ebenezer; Okeniyi, Elizabeth Toyin; Ogbiye, Adebanji Samuel; Durotoye, Taiwo Omowunmi; Badmus, Emmanuel Omotunde Oluwasogo
2017-02-01
This paper investigates C3H7NO2S (Cysteine) effect on the inhibition of reinforcing steel corrosion in concrete immersed in 0.5 M H2SO4, for simulating industrial/microbial environment. Different C3H7NO2S concentrations were admixed, in duplicates, in steel-reinforced concrete samples that were partially immersed in the acidic sulphate environment. Electrochemical monitoring techniques of open circuit potential, as per ASTM C876-91 R99, and corrosion rate, by linear polarization resistance, were then employed for studying anticorrosion effect in steel-reinforced concrete samples by the organic hydrocarbon admixture. Analyses of electrochemical test-data followed ASTM G16-95 R04 prescriptions including probability distribution modeling with significant testing by Kolmogorov-Smirnov and student's t-tests statistics. Results established that all datasets of corrosion potential distributed like the Normal, the Gumbel and the Weibull distributions but that only the Weibull model described all the corrosion rate datasets in the study, as per the Kolmogorov-Smirnov test-statistics. Results of the student's t-test showed that differences of corrosion test-data between duplicated samples with the same C3H7NO2S concentrations were not statistically significant. These results indicated that 0.06878 M C3H7NO2S exhibited optimal inhibition efficiency η = 90.52±1.29% on reinforcing steel corrosion in the concrete samples immersed in 0.5 M H2SO4, simulating industrial/microbial service-environment.
NASA Astrophysics Data System (ADS)
Okabe, Shigemitsu; Tsuboi, Toshihiro; Takami, Jun
Power-frequency withstand voltage tests on electric power equipment are regulated in JEC standards, with lifetime reliability evaluated using a Weibull distribution function. The evaluation method is still controversial in terms of how a plural number of faults should be considered, and some alternative methods have been proposed on this subject. The present paper first discusses the physical meanings of the various evaluation methods and then examines their effects on the power-frequency withstand voltage tests. Further, an appropriate method is investigated for an oil-filled transformer and a gas-insulated switchgear, taking note of the dielectric breakdown or partial discharge mechanism under various insulating material and structure conditions; the tentative conclusion is that the conventional method would be most pertinent under the present conditions.
Improved silicon carbide for advanced heat engines
NASA Technical Reports Server (NTRS)
Whalen, Thomas J.
1989-01-01
The development of high strength, high reliability silicon carbide parts with complex shapes suitable for use in advanced heat engines is studied. Injection molding was the forming method selected for the program because it is capable of forming complex parts adaptable for mass production on an economically sound basis. The goals were to reach a Weibull characteristic strength of 550 MPa (80 ksi) and a Weibull modulus of 16 for bars tested in four-point loading. Statistically designed experiments were performed throughout the program and a fluid mixing process employing an attritor mixer was developed. Compositional improvements in the amounts and sources of boron and carbon used and a pressureless sintering cycle were developed which provided samples of about 99 percent of theoretical density. Strengths were found to improve significantly by annealing in air. Strengths in excess of 550 MPa (80 ksi) with Weibull modulus of about 9 were obtained. Further improvements in Weibull modulus to about 16 were realized by proof testing. This is an increase of 86 percent in strength and 100 percent in Weibull modulus over the baseline data generated at the beginning of the program. Molding yields were improved and flaw distributions were observed to follow a Poisson process. Magic angle spinning nuclear magnetic resonance spectra were found to be useful in characterizing the SiC powder and the sintered samples. Turbocharger rotors were molded and examined as an indication of the moldability of the mixes which were developed in this program.
Models, Entropy and Information of Temporal Social Networks
NASA Astrophysics Data System (ADS)
Zhao, Kun; Karsai, Márton; Bianconi, Ginestra
Temporal social networks are characterized by heterogeneous duration of contacts, which can either follow a power-law distribution, such as in face-to-face interactions, or a Weibull distribution, such as in mobile-phone communication. Here we model the dynamics of face-to-face interaction and mobile phone communication by a reinforcement dynamics, which explains the data observed in these different types of social interactions. We quantify the information encoded in the dynamics of these networks by the entropy of temporal networks. Finally, we show evidence that human dynamics is able to modulate the information present in social network dynamics when it follows circadian rhythms and when it is interfacing with a new technology such as the mobile-phone communication technology.
Analysis of Weibull Grading Test for Solid Tantalum Capacitors
NASA Technical Reports Server (NTRS)
Teverovsky, Alexander
2010-01-01
The Weibull grading test is a powerful technique that allows selection and reliability rating of solid tantalum capacitors for military and space applications. However, inaccuracies in the existing method and inadequate acceleration factors can result in significant errors, up to three orders of magnitude, in the calculated failure rate of capacitors. This paper analyzes deficiencies of the existing technique and recommends a more accurate method of calculation. A physical model presenting failures of tantalum capacitors as time-dependent dielectric breakdown is used to determine voltage and temperature acceleration factors and to select adequate Weibull grading test conditions. This model is verified by highly accelerated life testing (HALT) at different temperature and voltage conditions for three types of solid chip tantalum capacitors. It is shown that parameters of the model and acceleration factors can be calculated using a general log-linear relationship for the characteristic life with two stress levels.
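A log-linear model for the characteristic life typically combines a power law in voltage with an Arrhenius term in temperature, giving an acceleration factor between test and use conditions. A hedged sketch; the voltage exponent n, activation energy Ea, and the test/use conditions below are placeholder assumptions, not the paper's fitted values.

```python
from math import exp

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def acceleration_factor(v_test, t_test_c, v_use, t_use_c, n=10.0, ea=1.0):
    """Combined voltage/temperature acceleration factor for a
    time-dependent dielectric breakdown model:
    AF = (v_test/v_use)**n * exp(Ea/k * (1/T_use - 1/T_test)).
    n and Ea (eV) are illustrative placeholders."""
    t_test = t_test_c + 273.15  # convert Celsius to Kelvin
    t_use = t_use_c + 273.15
    return (v_test / v_use) ** n * exp(
        ea / K_BOLTZMANN_EV * (1.0 / t_use - 1.0 / t_test))

# hypothetical HALT condition (63 V, 85 C) vs. use condition (35 V, 25 C)
af = acceleration_factor(v_test=63.0, t_test_c=85.0, v_use=35.0, t_use_c=25.0)
```

Fitting n and Ea from HALT at two stress levels, as the abstract describes, amounts to solving the log-linear relation for these two unknowns.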
NASA Astrophysics Data System (ADS)
Costa, A.; Pioli, L.; Bonadonna, C.
2017-05-01
The authors found a mistake in the formulation of the distribution named Bi-Weibull distribution reported in the equation (A.2) of the Appendix A. The error affects equation (4) (which is the same as eq. (A.2)) and Table 4 in the original manuscript.
NASA Astrophysics Data System (ADS)
Chen, Fan; Huang, Shaoxiong; Ding, Jinjin; Ding, Jinjin; Gao, Bo; Xie, Yuguang; Wang, Xiaoming
2018-01-01
This paper proposes a fast reliability assessment method for a distribution grid with distributed renewable energy generation. First, the Weibull distribution and the Beta distribution are used to describe the probability distribution characteristics of wind speed and solar irradiance, respectively, and models of the wind farm, solar park, and local load are built for reliability assessment. Then, based on power system production cost simulation, probability discretization, and linearized power flow, an optimal power flow problem with the objective of minimizing the cost of conventional generation is solved. A reliability assessment for the distribution grid is thus implemented quickly and accurately. The Loss Of Load Probability (LOLP) and Expected Energy Not Supplied (EENS) are selected as the reliability indices; a simulation of the IEEE RBTS BUS6 system in MATLAB indicates that the fast method calculates the reliability indices much faster than the Monte Carlo method while maintaining accuracy.
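For context, a hedged sketch of how Weibull-distributed wind speeds feed such a model: sample speeds with Python's built-in weibullvariate and push them through a simple piecewise-linear turbine power curve. The cut-in/rated/cut-out speeds, rated power, and Weibull parameters are illustrative assumptions, not the paper's data.

```python
import random

def wind_power(v, v_ci=3.0, v_r=12.0, v_co=25.0, p_r=2.0):
    """Piecewise-linear wind turbine power curve (MW); all speed and
    power parameters are illustrative assumptions."""
    if v < v_ci or v >= v_co:
        return 0.0          # below cut-in or above cut-out: no output
    if v >= v_r:
        return p_r          # at or above rated speed: rated power
    return p_r * (v - v_ci) / (v_r - v_ci)  # linear ramp in between

def mean_wind_output(k=2.0, c=8.0, n=50000, seed=3):
    """Expected turbine output under Weibull(shape k, scale c) wind
    speeds, estimated by sampling. Sketch only."""
    rng = random.Random(seed)
    return sum(wind_power(rng.weibullvariate(c, k)) for _ in range(n)) / n

mean_power = mean_wind_output()
```

The paper's discretization step replaces this sampling with a small set of probability-weighted output levels, which is what makes the assessment fast.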
NASA Astrophysics Data System (ADS)
Sembiring, N.; Ginting, E.; Darnello, T.
2017-12-01
A problem at a company that produces refined sugar is that critical machines on the production floor have not reached the required availability level because they frequently break down. This results in sudden losses of production time and production opportunities. The problem can be addressed with reliability engineering methods, in which a statistical approach to historical failure data is used to identify the distribution pattern. The method can provide values for the reliability, failure rate, and availability of a machine over the scheduled maintenance interval. Distribution tests on the time-between-failures (MTTF) data show that the flexible hose component follows a lognormal distribution, while the teflon cone lifting component follows a Weibull distribution. Distribution tests on the repair-time (MTTR) data show that the flexible hose component follows an exponential distribution, while the teflon cone lifting component follows a Weibull distribution. For the flexible hose component on the actual replacement schedule of every 720 hours, the reliability is 0.2451 and the availability is 0.9960, while for the critical teflon cone lifting component on the actual replacement schedule of every 1944 hours, the reliability is 0.4083 and the availability is 0.9927.
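The reliability and availability figures reported in such studies follow from standard formulas: R(t) = exp(-(t/eta)**beta) for a Weibull component, and inherent availability MTTF/(MTTF + MTTR). A sketch with made-up numbers, since the abstract does not give the fitted distribution parameters:

```python
from math import exp

def weibull_reliability(t, beta, eta):
    """Probability that a Weibull(beta, eta) component survives past t."""
    return exp(-(t / eta) ** beta)

def steady_state_availability(mttf, mttr):
    """Inherent availability = MTTF / (MTTF + MTTR)."""
    return mttf / (mttf + mttr)

# Hypothetical values chosen so the outputs are easy to check;
# they are not the paper's fitted parameters.
r = weibull_reliability(t=720.0, beta=1.5, eta=1000.0)
a = steady_state_availability(mttf=990.0, mttr=10.0)
```

Evaluating R(t) at the replacement interval, as done for the 720-hour and 1944-hour schedules above, is exactly this calculation with the fitted beta and eta.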
NASA Astrophysics Data System (ADS)
Caputo, Riccardo
2010-09-01
It is a commonplace field observation that extension fractures are more abundant than shear fractures. The questions of how much more abundant, and why, are posed in this paper and qualitative estimates of their ratio within a rock volume are made on the basis of field observations and mechanical considerations. A conceptual model is also proposed to explain the common range of ratios between extension and shear fractures, here called the j/ f ratio. The model considers three major genetic stress components originated from overburden, pore-fluid pressure and tectonics and assumes that some of the remote genetic stress components vary with time ( i.e. stress-rates are included). Other important assumptions of the numerical model are that: i) the strength of the sub-volumes is randomly attributed following a Weibull probabilistic distribution, ii) all fractures heal after a given time, thus simulating the cementation process, and therefore iii) both extensional jointing and shear fracturing could be recurrent events within the same sub-volume. As a direct consequence of these assumptions, the stress tensor at any point varies continuously in time and these variations are caused by both remote stresses and local stress drops associated with in-situ and neighbouring fracturing events. The conceptual model is implemented in a computer program to simulate layered carbonate rock bodies undergoing brittle deformation. The numerical results are obtained by varying the principal parameters, like depth ( viz. confining pressure), tensile strength, pore-fluid pressure and shape of the Weibull distribution function, in a wide range of values, therefore simulating a broad spectrum of possible mechanical and lithological conditions. 
The quantitative estimates of the j/ f ratio confirm the general predominance of extensional failure events during brittle deformation in shallow crustal rocks and provide useful insights for better understanding the role played by the different parameters. For example, as a general trend it is observed that the j/ f ratio is inversely proportional to depth ( viz. confining pressure) and directly proportional to pore-fluid pressure, while the stronger is the rock, the wider is the range of depths showing a finite value of the j/ f ratio and in general the deeper are the conditions where extension fractures can form. Moreover, the wider is the strength variability of rocks ( i.e. the lower is the m parameter of the Weibull probabilistic distribution function), the wider is the depth range where both fractures can form providing a finite value of the j/ f ratio. Natural case studies from different geological and tectonic settings are also used to test the conceptual model and the numerical results showing a good agreement between measured and predicted j/ f ratios.
Rolling Bearing Life Prediction, Theory, and Application
NASA Technical Reports Server (NTRS)
Zaretsky, Erwin V.
2016-01-01
A tutorial is presented outlining the evolution, theory, and application of rolling-element bearing life prediction from that of A. Palmgren, 1924; W. Weibull, 1939; G. Lundberg and A. Palmgren, 1947 and 1952; E. Ioannides and T. Harris, 1985; and E. Zaretsky, 1987. Comparisons are made between these life models. The Ioannides-Harris model without a fatigue limit is identical to the Lundberg-Palmgren model. The Weibull model is similar to that of Zaretsky if the exponents are chosen to be identical. Both the load-life and Hertz stress-life relations of Weibull, Lundberg and Palmgren, and Ioannides and Harris reflect a strong dependence on the Weibull slope. The Zaretsky model decouples the dependence of the critical shear stress-life relation from the Weibull slope. This results in a nominal variation of the Hertz stress-life exponent. For 9th- and 8th-power Hertz stress-life exponents for ball and roller bearings, respectively, the Lundberg-Palmgren model best predicts life. However, for 12th- and 10th-power relations reflected by modern bearing steels, the Zaretsky model based on the Weibull equation is superior. Under the range of stresses examined, the use of a fatigue limit would suggest that (for most operating conditions under which a rolling-element bearing will operate) the bearing will not fail from classical rolling-element fatigue. Realistically, this is not the case. The use of a fatigue limit will significantly overpredict life over a range of normal operating Hertz stresses. (The use of ISO 281:2007 with a fatigue limit in these calculations would result in a bearing life approaching infinity.) Since the predicted lives of rolling-element bearings are high, the problem can become one of undersizing a bearing for a particular application. Rules had been developed to distinguish and compare predicted lives with those actually obtained. 
Based upon field and test results of 51 ball and roller bearing sets, 98 percent of these bearing sets had acceptable life results using the Lundberg-Palmgren equations with life adjustment factors to predict bearing life. That is, they had lives equal to or greater than that predicted. The Lundberg-Palmgren model was used to predict the life of a commercial turboprop gearbox. The life prediction was compared with the field lives of 64 gearboxes. From these results, the roller bearing lives exhibited a load-life exponent of 5.2, which correlated with the Zaretsky model. The use of the ANSI/ABMA and ISO standards load-life exponent of 10/3 to predict roller bearing life is not reflective of modern roller bearings and will underpredict bearing lives.
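The load-life relation underlying these comparisons is the power law L10 = (C/P)**p in millions of revolutions, where C is the basic dynamic capacity and P the equivalent load. A sketch contrasting the standards' 10/3 roller-bearing exponent with the 5.2 exponent reported above; the load values are illustrative.

```python
def bearing_l10_life(c_dynamic, p_load, exponent):
    """Basic rating life in millions of revolutions, L10 = (C/P)**p.
    The standards use p = 3 for ball and 10/3 for roller bearings;
    the tutorial reports higher exponents for modern bearing steels."""
    return (c_dynamic / p_load) ** exponent

# illustrative loads: C = 100 kN capacity, P = 20 kN applied load
l10_standard = bearing_l10_life(100.0, 20.0, 10.0 / 3.0)  # ISO roller exponent
l10_zaretsky = bearing_l10_life(100.0, 20.0, 5.2)          # field-correlated exponent
```

The gap between the two results illustrates the abstract's point: at light loads the 10/3 exponent substantially underpredicts the life of modern roller bearings.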
Wang, Tao; Wu, Jinhui; Qi, Jiancheng; Hao, Limei; Yi, Ying; Zhang, Zongxing
2016-05-15
Bacillus subtilis subsp. niger spore and Staphylococcus albus are typical biological indicators for the inactivation of airborne pathogens. The present study characterized and compared the behaviors of B. subtilis subsp. niger spores and S. albus in regard to inactivation by chlorine dioxide (ClO2) gas under different gas concentrations and relative humidity (RH) conditions. The inactivation kinetics under different ClO2 gas concentrations (1 to 5 mg/liter) were determined by first-order and Weibull models. A new model (the Weibull-H model) was established to reveal the inactivation tendency and kinetics for ClO2 gas under different RH conditions (30 to 90%). The results showed that both the gas concentration and RH were significantly (P < 0.05) and positively correlated with the inactivation of the two chosen indicators. There was a rapid improvement in the inactivation efficiency under high RH (>70%). Compared with the first-order model, the Weibull and Weibull-H models demonstrated a better fit for the experimental data, indicating nonlinear inactivation behaviors of the vegetative bacteria and spores following exposure to ClO2 gas. The times to achieve a six-log reduction of B. subtilis subsp. niger spore and S. albus were calculated based on the established models. Clarifying the kinetics of inactivation of B. subtilis subsp. niger spores and S. albus by ClO2 gas will allow the development of ClO2 gas treatments that provide an effective disinfection method. Chlorine dioxide (ClO2) gas is a novel and effective fumigation agent with strong oxidization ability and a broad biocidal spectrum. The antimicrobial efficacy of ClO2 gas has been evaluated in many previous studies. However, there are presently no published models that can be used to describe the kinetics of inactivation of airborne pathogens by ClO2 gas under different gas concentrations and RH conditions. 
The first-order and Weibull (Weibull-H) models established in this study can characterize and compare the behaviors of Bacillus subtilis subsp. niger spores and Staphylococcus albus in regard to inactivation by ClO2 gas, determine the kinetics of inactivation of two chosen strains under different conditions of gas concentration and RH, and provide the calculated time to achieve a six-log reduction. These results will be useful to determine effective conditions for ClO2 gas to inactivate airborne pathogens in contaminated air and other environments and thus prevent outbreaks of airborne illness. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
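The Weibull inactivation model referenced here is commonly written log10(N/N0) = -(t/delta)**p; inverting it gives the time to a six-log reduction directly. A sketch with illustrative delta and p values, not the study's fitted parameters:

```python
def weibull_log_reduction(t, delta, p):
    """Weibull inactivation model: log10(N/N0) = -(t/delta)**p, where
    delta is the time to the first log reduction and p is the shape.
    Parameter values used below are illustrative assumptions."""
    return -(t / delta) ** p

def time_to_six_log(delta, p):
    """Invert the model for a six-log (99.9999%) reduction."""
    return delta * 6.0 ** (1.0 / p)

# hypothetical fit: first log reduction after 2 min, shape p = 0.8
t6 = time_to_six_log(delta=2.0, p=0.8)
```

A shape p < 1 gives the concave-up survival curves (tailing) that make nonlinear models like this outperform first-order kinetics, as the abstract reports.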
Wang, Tao; Wu, Jinhui; Hao, Limei; Yi, Ying; Zhang, Zongxing
2016-01-01
ABSTRACT Bacillus subtilis subsp. niger spore and Staphylococcus albus are typical biological indicators for the inactivation of airborne pathogens. The present study characterized and compared the behaviors of B. subtilis subsp. niger spores and S. albus in regard to inactivation by chlorine dioxide (ClO2) gas under different gas concentrations and relative humidity (RH) conditions. The inactivation kinetics under different ClO2 gas concentrations (1 to 5 mg/liter) were determined by first-order and Weibull models. A new model (the Weibull-H model) was established to reveal the inactivation tendency and kinetics for ClO2 gas under different RH conditions (30 to 90%). The results showed that both the gas concentration and RH were significantly (P < 0.05) and positively correlated with the inactivation of the two chosen indicators. There was a rapid improvement in the inactivation efficiency under high RH (>70%). Compared with the first-order model, the Weibull and Weibull-H models demonstrated a better fit for the experimental data, indicating nonlinear inactivation behaviors of the vegetative bacteria and spores following exposure to ClO2 gas. The times to achieve a six-log reduction of B. subtilis subsp. niger spore and S. albus were calculated based on the established models. Clarifying the kinetics of inactivation of B. subtilis subsp. niger spores and S. albus by ClO2 gas will allow the development of ClO2 gas treatments that provide an effective disinfection method. IMPORTANCE Chlorine dioxide (ClO2) gas is a novel and effective fumigation agent with strong oxidization ability and a broad biocidal spectrum. The antimicrobial efficacy of ClO2 gas has been evaluated in many previous studies. However, there are presently no published models that can be used to describe the kinetics of inactivation of airborne pathogens by ClO2 gas under different gas concentrations and RH conditions. 
The first-order and Weibull (Weibull-H) models established in this study can characterize and compare the behaviors of Bacillus subtilis subsp. niger spores and Staphylococcus albus in regard to inactivation by ClO2 gas, determine the kinetics of inactivation of two chosen strains under different conditions of gas concentration and RH, and provide the calculated time to achieve a six-log reduction. These results will be useful to determine effective conditions for ClO2 gas to inactivate airborne pathogens in contaminated air and other environments and thus prevent outbreaks of airborne illness. PMID:26969707
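The abstract above reports six-log reduction times computed from the fitted models but does not give the fitted parameters. As an illustration only, the common Weibull survival form log10(N0/N) = (t/δ)^p can be inverted for the n-log reduction time; the values of δ and p below are hypothetical, not the study's estimates:

```python
import math

def weibull_log_reduction(t, delta, p):
    """Log10 reduction predicted by the Weibull survival model:
    log10(N0/N) = (t/delta)**p, with scale delta and shape p."""
    return (t / delta) ** p

def time_to_n_log(n, delta, p):
    """Invert the model for the exposure time giving an n-log reduction."""
    return delta * n ** (1.0 / p)

# Purely illustrative parameters: delta = 2.0 min, p = 0.6 (concave-up curve).
t6 = time_to_n_log(6, delta=2.0, p=0.6)
```

A shape parameter p < 1 produces the tailing (slowing) inactivation curves that a first-order model cannot capture, which is consistent with the better fit the authors report for the Weibull forms.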
Cui, Zhiwei; Huang, Yongmin; Liu, Honglai
2017-07-01
In this work, a micromechanical study using the lattice spring model (LSM) was performed to predict the mechanical properties of brittle porous materials (BPMs) by simulating the Brazilian test. Stress-strain curves and Weibull plots were analyzed to determine the fracture strength and the Weibull modulus. The presented model, composed of linear elastic elements, is capable of reproducing the non-linear behavior of BPMs resulting from damage accumulation, and it provides consistent results that agree with experimental measurements. In addition, porosity is found to have a significant impact on fracture strength while pore size dominates the Weibull modulus, which makes it possible to establish how microstructural choices can meet the demands placed on brittle porous materials functioning under various operating conditions. Copyright © 2017 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoon, Ji Hyun; Byun, Thak Sang; Strizak, Joe P
2011-01-01
The mechanical properties of NBG-18 nuclear grade graphite have been characterized using small specimen test techniques and statistical treatment of the test results. New fracture strength and toughness test techniques were developed to use subsize cylindrical specimens with glued heads and to reuse their broken halves. Three sets of subsize cylindrical specimens with diameters of 4 mm, 8 mm, and 12 mm were tested to obtain tensile fracture strength. The longer piece of the broken halves was cracked from the side surfaces and tested under three-point bend loading to obtain fracture toughness. Both the strength and fracture toughness data were analyzed using Weibull distribution models focusing on the size effect. The mean fracture strength decreased from 22.9 MPa to 21.5 MPa as the diameter increased from 4 mm to 12 mm, and the mean strength of the 15.9 mm diameter standard specimen, 20.9 MPa, was on the extended trend line. These fracture strength data indicate that in the given diameter range the size effect is not significant and is much smaller than that predicted by the Weibull statistics-based model. Further, no noticeable size effect existed in the fracture toughness data, whose mean values were in a narrow range of 1.21 to 1.26 MPa. The Weibull moduli measured for the fracture strength and fracture toughness datasets were around 10. It is therefore believed that the small or negligible size effect justifies the use of subsize specimens, and that the new fracture toughness test method, which reuses the broken specimens, helps minimize irradiation space and radioactive waste.
Microstructure and Mechanical Properties of Reaction-Formed Silicon Carbide (RFSC) Ceramics
NASA Technical Reports Server (NTRS)
Singh, M.; Behrendt, D. R.
1994-01-01
The microstructure and mechanical properties of reaction-formed silicon carbide (RFSC) ceramics fabricated by silicon infiltration of porous carbon preforms are discussed. The morphological characterization of the carbon preforms indicates a very narrow pore size distribution. Measurements of the preform density by physical methods and by mercury porosimetry agree very well and indicate that virtually all of the porosity in the preforms is open to infiltrating liquids. The average room temperature flexural strength of the RFSC material with approximately 8 at.% free silicon is 369 +/- 28 MPa (53.5 +/- 4 ksi). The Weibull strength distribution data give a characteristic strength value of 381 MPa (55 ksi) and a Weibull modulus of 14.3. The residual silicon content is lower and the strengths are superior to those of most commercially available reaction-bonded silicon carbide materials.
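The reported Weibull parameters (characteristic strength 381 MPa, modulus 14.3) fully specify a two-parameter strength distribution. A minimal sketch of the implied failure probability, ignoring effective-volume (size) scaling, which the abstract does not address:

```python
import math

def weibull_failure_prob(sigma, sigma0=381.0, m=14.3):
    """Two-parameter Weibull failure probability,
    P_f = 1 - exp(-(sigma/sigma0)**m), using the characteristic strength
    (381 MPa) and Weibull modulus (14.3) reported for the RFSC material.
    Size (effective volume/area) scaling is ignored in this sketch."""
    return 1.0 - math.exp(-((sigma / sigma0) ** m))

# At sigma = sigma0 the failure probability is 1 - 1/e by definition.
p_char = weibull_failure_prob(381.0)

# The implied mean strength, sigma0 * Gamma(1 + 1/m), should land near the
# measured average of about 369 MPa.
mean_strength = 381.0 * math.gamma(1.0 + 1.0 / 14.3)
```

That the mean implied by the fitted parameters falls within the measured 369 +/- 28 MPa band is a quick consistency check on the quoted values.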
On the non-Poissonian repetition pattern of FRB121102
NASA Astrophysics Data System (ADS)
Oppermann, Niels; Yu, Hao-Ran; Pen, Ue-Li
2018-04-01
The Fast Radio Burst FRB121102 has been observed to repeat in an irregular fashion. Using published timing data of the observed bursts, we show that Poissonian statistics are not a good description of this random process. As an alternative, we suggest describing the intervals between bursts with a Weibull distribution with a shape parameter smaller than one, which allows for the clustered nature of the bursts. We quantify the amount of clustering using the parameters of the Weibull distribution and discuss the consequences that it has for the detection probabilities of future observations and for the optimization of observing strategies. Allowing for this generalization, we find a mean repetition rate of r=5.7^{+3.0}_{-2.0} per day and index k=0.34^{+0.06}_{-0.05} for a correlation function ξ(t) = (t/t0)^{k-1}.
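The degree of clustering implied by a Weibull shape parameter can be summarized by the coefficient of variation of the interburst intervals: k = 1 recovers the exponential (Poissonian) case with CV = 1, while k < 1 gives CV > 1 (bursty arrivals). A short check using the fitted k = 0.34 from the abstract:

```python
import math

def weibull_interval_cv(k):
    """Coefficient of variation of Weibull-distributed intervals with shape k.
    CV = sqrt(Gamma(1 + 2/k) / Gamma(1 + 1/k)**2 - 1); k = 1 is the
    exponential (Poisson) case with CV = 1, k < 1 means clustered arrivals."""
    g1 = math.gamma(1.0 + 1.0 / k)
    g2 = math.gamma(1.0 + 2.0 / k)
    return math.sqrt(g2 / g1 ** 2 - 1.0)

cv_frb = weibull_interval_cv(0.34)      # shape parameter from the abstract
cv_poisson = weibull_interval_cv(1.0)   # Poisson reference, CV = 1
```

The CV for k = 0.34 comes out several times larger than one, which is the quantitative sense in which the bursts are clustered relative to a Poisson process.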
Four Theorems on the Psychometric Function
May, Keith A.; Solomon, Joshua A.
2013-01-01
In a 2-alternative forced-choice (2AFC) discrimination task, observers choose which of two stimuli has the higher value. The psychometric function for this task gives the probability of a correct response for a given stimulus difference, . This paper proves four theorems about the psychometric function. Assuming the observer applies a transducer and adds noise, Theorem 1 derives a convenient general expression for the psychometric function. Discrimination data are often fitted with a Weibull function. Theorem 2 proves that the Weibull “slope” parameter, , can be approximated by , where is the of the Weibull function that fits best to the cumulative noise distribution, and depends on the transducer. We derive general expressions for and , from which we derive expressions for specific cases. One case that follows naturally from our general analysis is Pelli's finding that, when , . We also consider two limiting cases. Theorem 3 proves that, as sensitivity improves, 2AFC performance will usually approach that for a linear transducer, whatever the actual transducer; we show that this does not apply at signal levels where the transducer gradient is zero, which explains why it does not apply to contrast detection. Theorem 4 proves that, when the exponent of a power-function transducer approaches zero, 2AFC performance approaches that of a logarithmic transducer. We show that the power-function exponents of 0.4–0.5 fitted to suprathreshold contrast discrimination data are close enough to zero for the fitted psychometric function to be practically indistinguishable from that of a log transducer. Finally, Weibull reflects the shape of the noise distribution, and we used our results to assess the recent claim that internal noise has higher kurtosis than a Gaussian. Our analysis of for contrast discrimination suggests that, if internal noise is stimulus-independent, it has lower kurtosis than a Gaussian. PMID:24124456
Equations for estimating loblolly pine branch and foliage weight and surface area distributions
V. Clark Baldwin; Kelly D. Peterson; Harold E. Burkhatt; Ralph L. Amateis; Phillip M. Dougherty
1996-01-01
Equations to predict foliage weight and surface area, and their vertical and horizontal distributions, within the crowns of unthinned loblolly pine (Pinus taeda L.) trees are presented. A right-truncated Weibull function was used for describing vertical foliage distributions. This function ensures that all of the foliage located between the tree tip and the foliage base...
The Topp-Leone generalized Rayleigh cure rate model and its application
NASA Astrophysics Data System (ADS)
Nanthaprut, Pimwarat; Bodhisuwan, Winai; Patummasut, Mena
2017-11-01
The cure rate model is a survival analysis model that accounts for a proportion of censored subjects who are considered cured. In clinical trials, data representing the time to recurrence of an event or death of patients are used to improve the efficiency of treatments. Each dataset can be separated into two groups: censored and uncensored data. In this work, a new mixture cure rate model is introduced based on the Topp-Leone generalized Rayleigh distribution. The Bayesian approach is employed to estimate its parameters. In addition, a breast cancer dataset is analyzed for model illustration purposes. According to the deviance information criterion, the Topp-Leone generalized Rayleigh cure rate model shows a better result than the Weibull and exponential cure rate models.
Kaur, A; Takhar, P S; Smith, D M; Mann, J E; Brashears, M M
2008-10-01
A fractional differential equations (FDEs)-based theory involving 1- and 2-term equations was developed to predict the nonlinear survival and growth curves of foodborne pathogens. It is interesting to note that the solution of the 1-term FDE leads to the Weibull model. Nonlinear regression (Gauss-Newton method) was performed to calculate the parameters of the 1-term and 2-term FDEs. The experimental inactivation data of a Salmonella cocktail in ground turkey breast, ground turkey thigh, and pork shoulder, and of a cocktail of Salmonella, E. coli, and Listeria monocytogenes in ground beef, exposed to isothermal cooking conditions of 50 to 66 degrees C, were used for validation. To evaluate the performance of the 2-term FDE in predicting growth curves, the growth of Salmonella Typhimurium, Salmonella Enteritidis, and background flora in ground pork and boneless pork chops, and of E. coli O157:H7 in ground beef, in the temperature range of 22.2 to 4.4 degrees C was chosen. A program was written in Matlab to predict the model parameters and the survival and growth curves. The 2-term FDE was more successful in describing the complex shapes of microbial survival and growth curves than the linear and Weibull models. Predicted curves of the 2-term FDE had higher magnitudes of R(2) (0.89 to 0.99) and lower magnitudes of root mean square error (0.0182 to 0.5461) for all experimental cases in comparison to the linear and Weibull models. This model was capable of predicting the tails in survival curves, which was not possible using the Weibull and linear models. The developed model can be used for other foodborne pathogens in a variety of food products to study their destruction and growth behavior.
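The study's Gauss-Newton Matlab code is not reproduced here. As a simpler sketch of fitting the Weibull survival model that the 1-term FDE reduces to, log10(N0/N) = (t/δ)^p can be fitted by linear regression after a ln-ln transform; the time points and true parameters below are made up and noise-free, so the fit recovers them exactly:

```python
import math

def fit_weibull_loglinear(ts, logreds):
    """Fit log10(N0/N) = (t/delta)**p by linear regression in ln-ln space:
    ln y = p*ln t - p*ln delta. A sketch, not the Gauss-Newton procedure
    used in the study (which handles noisy data and the 2-term FDE)."""
    xs = [math.log(t) for t in ts]
    ys = [math.log(y) for y in logreds]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    p = sxy / sxx                      # slope gives the shape p
    delta = math.exp(mx - my / p)      # intercept -p*ln(delta) gives the scale
    return p, delta

# Synthetic, noise-free data generated from p = 0.8, delta = 3.0.
ts = [1.0, 2.0, 4.0, 8.0, 16.0]
logreds = [(t / 3.0) ** 0.8 for t in ts]
p, delta = fit_weibull_loglinear(ts, logreds)
```

With real (noisy) data the ln-ln regression gives only starting values; an iterative nonlinear least-squares step, as in the paper, would then refine them.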
Motion of kinesin in a viscoelastic medium
NASA Astrophysics Data System (ADS)
Knoops, Gert; Vanderzande, Carlo
2018-05-01
Kinesin is a molecular motor that transports cargo along microtubules. The results of many in vitro experiments on kinesin-1 are described by kinetic models in which one transition corresponds to the forward motion and subsequent binding of the tethered motor head. We argue that in a viscoelastic medium like the cytosol of a cell this step is not Markov and has to be described by a nonexponential waiting time distribution. We introduce a semi-Markov kinetic model for kinesin that takes this effect into account. We calculate, for arbitrary waiting time distributions, the moment generating function of the number of steps made, and determine from this the average velocity and the diffusion constant of the motor. We illustrate our results for the case of a waiting time distribution that is Weibull. We find that for realistic parameter values, viscoelasticity decreases the velocity and the diffusion constant, but increases the randomness (or Fano factor).
Experimental Study on Fatigue Performance of Foamed Lightweight Soil
NASA Astrophysics Data System (ADS)
Qiu, Youqiang; Yang, Ping; Li, Yongliang; Zhang, Liujun
2017-12-01
In order to study the fatigue performance of foamed lightweight soil and forecast its fatigue life in the supporting project, beam fatigue tests on foamed lightweight soil were conducted with the UTM-100 test system, building on preliminary tests. Fatigue equations of foamed lightweight soil were obtained by mathematical statistics methods based on the Weibull and lognormal distributions. At the same time, according to the traffic load on the real road surface of the supporting project, the fatigue life of foamed lightweight soil was analyzed and compared with the cumulative equivalent axle loads during the design period of the pavement. The results show that although the fatigue life of foamed lightweight soil is scattered, the linear relationship between logarithmic fatigue life and stress ratio still holds well. In particular, the fatigue life from the Weibull distribution is close to that derived from the lognormal distribution at a 50% guarantee ratio. In addition, the results demonstrate that foamed lightweight soil as a subgrade filler has good anti-fatigue performance, which can be further exploited by other projects in similar research domains.
Time-dependent fiber bundles with local load sharing. II. General Weibull fibers
NASA Astrophysics Data System (ADS)
Phoenix, S. Leigh; Newman, William I.
2009-12-01
Fiber bundle models (FBMs) are useful tools in understanding failure processes in a variety of material systems. While the fibers and load sharing assumptions are easily described, FBM analysis is typically difficult. Monte Carlo methods are also hampered by the severe computational demands of large bundle sizes, which overwhelm just as behavior relevant to real materials starts to emerge. For large size scales, interest continues in idealized FBMs that assume either equal load sharing (ELS) or local load sharing (LLS) among fibers, rules that reflect features of real load redistribution in elastic lattices. The present work focuses on a one-dimensional bundle of N fibers under LLS where life consumption in a fiber follows a power law in its load, with exponent ρ, and integrated over time. This life consumption function is further embodied in a functional form resulting in a Weibull distribution for lifetime under constant fiber stress and with Weibull exponent, β. Thus the failure rate of a fiber depends on its past load history, except for β=1. We develop asymptotic results validated by Monte Carlo simulation using a computational algorithm developed in our previous work [Phys. Rev. E 63, 021507 (2001)] that greatly increases the size, N, of treatable bundles (e.g., 10^6 fibers in 10^3 realizations). In particular, our algorithm is O(N ln N) in contrast with former algorithms which were O(N^2), making this investigation possible. Regimes are found for (β,ρ) pairs that yield contrasting behavior for large N. For ρ>1 and large N, brittle weakest volume behavior emerges in terms of characteristic elements (groupings of fibers) derived from critical cluster formation, and the lifetime eventually goes to zero as N→∞, unlike ELS, which yields a finite limiting mean. 
For 1/2≤ρ≤1 , however, LLS has remarkably similar behavior to ELS (appearing to be virtually identical for ρ=1 ) with an asymptotic Gaussian lifetime distribution and a finite limiting mean for large N . The coefficient of variation follows a power law in increasing N but, except for ρ=1 , the value of the negative exponent is clearly less than 1/2 unlike in ELS bundles where the exponent remains 1/2 for 1/2<ρ≤1 . For sufficiently small values 0<ρ≪1 , a transition occurs, depending on β , whereby LLS bundle lifetimes become dominated by a few long-lived fibers. Thus the bundle lifetime appears to approximately follow an extreme-value distribution for the longest lived of a parallel group of independent elements, which applies exactly to ρ=0 . The lower the value of β , the higher the transition value of ρ , below which such extreme-value behavior occurs. No evidence was found for limiting Gaussian behavior for ρ>1 but with 0<β(ρ+1)<1 , as might be conjectured from quasistatic bundle models where β(ρ+1) mimics the Weibull exponent for fiber strength.
Lifetime Reliability Prediction of Ceramic Structures Under Transient Thermomechanical Loads
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Jadaan, Osama J.; Gyekenyesi, John P.
2005-01-01
An analytical methodology is developed to predict the probability of survival (reliability) of ceramic components subjected to harsh thermomechanical loads that can vary with time (transient reliability analysis). This capability enables more accurate prediction of ceramic component integrity against fracture in situations such as turbine startup and shutdown, operational vibrations, atmospheric reentry, or other rapid heating or cooling situations (thermal shock). The transient reliability analysis methodology developed herein incorporates the following features: fast-fracture transient analysis (reliability analysis without slow crack growth, SCG); transient analysis with SCG (reliability analysis with time-dependent damage due to SCG); a computationally efficient algorithm to compute the reliability for components subjected to repeated transient loading (block loading); cyclic fatigue modeling using a combined SCG and Walker fatigue law; proof testing for transient loads; and Weibull and fatigue parameters that are allowed to vary with temperature or time. Component-to-component variation in strength (stochastic strength response) is accounted for with the Weibull distribution, and either the principle of independent action or the Batdorf theory is used to predict the effect of multiaxial stresses on reliability. The reliability analysis can be performed either as a function of the component surface (for surface-distributed flaws) or component volume (for volume-distributed flaws). The transient reliability analysis capability has been added to the NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. CARES/Life was also updated to interface with commercially available finite element analysis software, such as ANSYS, when used to model the effects of transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
NASA Astrophysics Data System (ADS)
Neumeister, Jonas M.
1993-08-01
The tensile behavior of a brittle matrix composite is studied for post matrix crack saturation conditions. Scatter of fiber strength following the Weibull distribution as well as the influence of the major microstructural variables is considered. The stress in a fiber is assumed to recover linearly around a failure due to a fiber-matrix interface behavior mainly ruled by friction. The constitutive behavior for such a composite is analyzed. Results are given for a simplified and a refined approximate description and compared with an analysis resulting from the exact analytical theory of fiber fragmentation. It is shown that the stress-strain relation for the refined model excellently follows the exact solution and gives the location of the maximum to within 1% in both stress and strain; for most materials the agreement is even better. Also it is shown that all relations can be normalized to depend on only two variables: a stress reference and the Weibull exponent. For systems with low scatter in fiber strength the simplified model is sufficient to determine the stress maximum but not the postcritical behavior. In addition, the simplified model gives explicit analytical expressions for the maximum stress and corresponding strain. None of the models contain any volume dependence or statistical scatter, but the maximum stress given by the stress-strain relation constitutes an upper bound for the ultimate tensile strength of the composite.
Joint modelling of annual maximum drought severity and corresponding duration
NASA Astrophysics Data System (ADS)
Tosunoglu, Fatih; Kisi, Ozgur
2016-12-01
In recent years, the joint distribution properties of drought characteristics (e.g. severity, duration and intensity) have been widely evaluated using copulas. However, history of copulas in modelling drought characteristics obtained from streamflow data is still short, especially in semi-arid regions, such as Turkey. In this study, unlike previous studies, drought events are characterized by annual maximum severity (AMS) and corresponding duration (CD) which are extracted from daily streamflow of the seven gauge stations located in Çoruh Basin, Turkey. On evaluation of the various univariate distributions, the Exponential, Weibull and Logistic distributions are identified as marginal distributions for the AMS and CD series. Archimedean copulas, namely Ali-Mikhail-Haq, Clayton, Frank and Gumbel-Hougaard, are then employed to model joint distribution of the AMS and CD series. With respect to the Anderson Darling and Cramér-von Mises statistical tests and the tail dependence assessment, Gumbel-Hougaard copula is identified as the most suitable model for joint modelling of the AMS and CD series at each station. Furthermore, the developed Gumbel-Hougaard copulas are used to derive the conditional and joint return periods of the AMS and CD series which can be useful for designing and management of reservoirs in the basin.
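The Gumbel-Hougaard copula selected above has the closed form C(u, v) = exp(-[(-ln u)^θ + (-ln v)^θ]^(1/θ}) with θ ≥ 1; θ = 1 gives independence and larger θ gives stronger upper-tail dependence between severity and duration. A minimal sketch (the θ values are illustrative, not the fitted ones):

```python
import math

def gumbel_hougaard(u, v, theta):
    """Gumbel-Hougaard copula:
    C(u, v) = exp(-[(-ln u)**theta + (-ln v)**theta]**(1/theta)),
    theta >= 1. theta = 1 reduces to independence, C = u*v."""
    s = (-math.log(u)) ** theta + (-math.log(v)) ** theta
    return math.exp(-(s ** (1.0 / theta)))

# Illustrative marginal non-exceedance probabilities for AMS and CD.
c_indep = gumbel_hougaard(0.9, 0.8, 1.0)   # factorizes to 0.9 * 0.8
c_dep = gumbel_hougaard(0.9, 0.8, 2.5)     # pulled toward min(u, v) = 0.8
```

From C(u, v) and the mean interarrival time of drought events, joint "and"/"or" return periods of the kind used for reservoir design follow directly (e.g., the "or" return period is proportional to 1/(1 - C(u, v))).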
Application of the Weibull extrapolation to 137Cs geochronology in Tokyo Bay and Ise Bay, Japan.
Lu, Xueqiang
2004-01-01
Considerable doubt surrounds the nature of the processes by which 137Cs is deposited in marine sediments, leading to a situation where 137Cs geochronology cannot always be applied suitably. Based on extrapolation with the Weibull distribution, the maximum concentration of 137Cs derived from asymptotic values of the cumulative specific inventory was used to re-establish the 137Cs geochronology, instead of the original 137Cs profiles. The corresponding dating results for cores in Tokyo Bay and Ise Bay, Japan, obtained by this new method are in much closer agreement with those calculated from the 210Pb method than were those from the previous method.
Strain-controlled fatigue of acrylic bone cement.
Carter, D R; Gates, E I; Harris, W H
1982-09-01
Monotonic tensile tests and tension-compression fatigue tests were conducted on wet acrylic bone cement specimens at 37 degrees C. All testing was conducted in strain control at a strain rate of 0.02/s. Weibull analysis of the tensile tests indicated that monotonic fracture was governed more strongly by strain than by stress. The number of cycles to fatigue failure was also more strongly controlled by strain amplitude than by stress amplitude. Specimen porosity distribution played a major role in determining the tensile and fatigue strengths. The degree of data scatter suggests that Weibull analysis of fatigue data may be useful in developing design criteria for the surgical use of bone cement.
NASA Technical Reports Server (NTRS)
Wheeler, J. T.
1990-01-01
The Weibull process, identified as the inhomogeneous Poisson process with the Weibull intensity function, is used to model the reliability growth assessment of the space shuttle main engine test and flight failure data. Additional tables of percentage-point probabilities for several different values of the confidence coefficient have been generated for setting (1-alpha)100-percent two sided confidence interval estimates on the mean time between failures. The tabled data pertain to two cases: (1) time-terminated testing, and (2) failure-terminated testing. The critical values of the three test statistics, namely Cramer-von Mises, Kolmogorov-Smirnov, and chi-square, were calculated and tabled for use in the goodness of fit tests for the engine reliability data. Numerical results are presented for five different groupings of the engine data that reflect the actual response to the failures.
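The Weibull process named above is the inhomogeneous Poisson process with power-law intensity u(t) = λβt^(β-1), so the expected cumulative failure count is E[N(t)] = λt^β and the instantaneous MTBF is 1/u(t). A minimal sketch with hypothetical parameters (the engine data's fitted values are not given in the abstract):

```python
def expected_failures(t, lam, beta):
    """Expected cumulative failures of a Weibull (power-law) process with
    intensity u(t) = lam*beta*t**(beta - 1): E[N(t)] = lam*t**beta.
    beta < 1 indicates reliability growth (failures slowing down)."""
    return lam * t ** beta

def instantaneous_mtbf(t, lam, beta):
    """Instantaneous mean time between failures: the reciprocal of u(t)."""
    return 1.0 / (lam * beta * t ** (beta - 1.0))

# Hypothetical parameters: lam = 0.5 failures per (time unit)**beta, beta = 0.7.
n100 = expected_failures(100.0, 0.5, 0.7)
mtbf100 = instantaneous_mtbf(100.0, 0.5, 0.7)
```

Setting β = 1 recovers a homogeneous Poisson process with constant MTBF, which is the no-growth baseline against which the goodness-of-fit statistics in the report (Cramer-von Mises, Kolmogorov-Smirnov, chi-square) test the failure data.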
How extreme is extreme hourly precipitation?
NASA Astrophysics Data System (ADS)
Papalexiou, Simon Michael; Dialynas, Yannis G.; Pappas, Christoforos
2016-04-01
Accurate representation of precipitation at fine time scales (e.g., hourly), directly associated with flash flood events, is crucial in hydrological design and prediction. The upper part of a probability distribution, known as the distribution tail, determines the behavior of extreme events. In general, and loosely speaking, tails can be categorized in two families: the subexponential and the hyperexponential family, with the former generating more intense and more frequent extremes compared to the latter. In past studies, the focus has been mainly on daily precipitation, with the Gamma distribution being the most popular model. Here, we investigate the behaviour of tails of hourly precipitation by comparing the upper part of empirical distributions of thousands of records with three general types of tails corresponding to the Pareto, Lognormal, and Weibull distributions. Specifically, we use thousands of hourly rainfall records from all over the USA. The analysis indicates that heavier-tailed distributions describe better the observed hourly rainfall extremes in comparison to lighter tails. Traditional representations of the marginal distribution of hourly rainfall may significantly deviate from observed behaviours of extremes, with direct implications on hydroclimatic variables modelling and engineering design.
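The heavy-versus-light tail contrast described above can be made concrete by comparing survival functions far in the tail: a Pareto (power-law) tail decays polynomially, while a Weibull tail with shape k ≥ 1 decays at least exponentially. A minimal sketch with arbitrary illustrative parameters:

```python
import math

def pareto_sf(x, scale, alpha):
    """Pareto survival function (power-law tail): S(x) = (scale/x)**alpha,
    valid for x >= scale."""
    return (scale / x) ** alpha

def weibull_sf(x, scale, k):
    """Weibull survival function: S(x) = exp(-(x/scale)**k).
    k < 1 is subexponential (heavy tail); k > 1 is hyperexponential (light)."""
    return math.exp(-((x / scale) ** k))

# Far in the tail the power law dominates any exponential-type tail.
x = 50.0
heavy = pareto_sf(x, 1.0, 2.0)
light = weibull_sf(x, 1.0, 1.0)   # k = 1: plain exponential tail, for contrast
```

At x = 50 scale units the Pareto survival probability exceeds the exponential one by many orders of magnitude, which is why tail choice, not just the fitted body of the distribution, drives estimates of extreme hourly rainfall.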
Reproducibility of structural strength and stiffness for graphite-epoxy aircraft spoilers
NASA Technical Reports Server (NTRS)
Howell, W. E.; Reese, C. D.
1978-01-01
Structural strength reproducibility of graphite-epoxy composite spoilers for the Boeing 737 aircraft was evaluated by statically loading fifteen spoilers to failure under conditions simulating aerodynamic loads. Spoiler strength and stiffness data were statistically modeled using a two-parameter Weibull distribution function. Shape parameter values calculated for the composite spoiler strength and stiffness were within the range of corresponding shape parameter values calculated for material property data of composite laminates. This agreement showed that the reproducibility of full-scale component structural properties was within the reproducibility range of data from material property tests.
NASA Technical Reports Server (NTRS)
Chao, Luen-Yuan; Shetty, Dinesh K.
1992-01-01
Statistical analysis and correlation between pore-size distribution and fracture strength distribution using the theory of extreme-value statistics is presented for a sintered silicon nitride. The pore-size distribution on a polished surface of this material was characterized using an automatic optical image analyzer. The distribution measured on the two-dimensional plane surface was transformed to a population (volume) distribution using the Schwartz-Saltykov diameter method. The population pore-size distribution and the distribution of the pore size at the fracture origin were correlated by extreme-value statistics. The fracture strength distribution was then predicted from the extreme-value pore-size distribution using a linear elastic fracture mechanics model of an annular crack around a pore and the fracture toughness of the ceramic. The predicted strength distribution was in good agreement with strength measurements in bending. In particular, the extreme-value statistics analysis explained the nonlinear trend in the linearized Weibull plot of measured strengths without postulating a lower-bound strength.
NASA Astrophysics Data System (ADS)
Shah, Nita H.; Soni, Hardik N.; Gupta, Jyoti
2014-08-01
In a recent paper, Begum et al. (2012, International Journal of Systems Science, 43, 903-910) established a pricing and replenishment policy for an inventory system with a price-sensitive demand rate, a time-proportional deterioration rate that follows a three-parameter Weibull distribution, and no shortages. In their model formulation, it is observed that the retailer's stock level reaches zero before deterioration occurs. Consequently, the model reduces to a traditional inventory model with a price-sensitive demand rate and no shortages. Hence, the main purpose of this note is to modify and present a complete model formulation for Begum et al. (2012). The proposed model is validated by a numerical example, and a sensitivity analysis of the parameters is carried out.
Renewal models and coseismic stress transfer in the Corinth Gulf, Greece, fault system
NASA Astrophysics Data System (ADS)
Console, Rodolfo; Falcone, Giuseppe; Karakostas, Vassilis; Murru, Maura; Papadimitriou, Eleftheria; Rhoades, David
2013-07-01
We model interevent times and Coulomb static stress transfer on the rupture segments along the Corinth Gulf extension zone, a region with a wealth of observations on strong-earthquake recurrence behavior. From the available information on past seismic activity, we have identified eight segments without significant overlapping that are aligned along the southern boundary of the Corinth rift. We aim to test whether strong earthquakes on these segments are characterized by some kind of time-predictable behavior, rather than by complete randomness. The rationale for time-predictable behavior is based on the characteristic earthquake hypothesis, the necessary ingredients of which are a known faulting geometry and slip rate. The tectonic loading rate is characterized by slip of 6 mm/yr on the westernmost fault segment, diminishing to 4 mm/yr on the easternmost segment, based on the most reliable geodetic data. In this study, we employ statistical and physical modeling to account for stress transfer among these fault segments. The statistical modeling is based on the definition of a probability density distribution of the interevent times for each segment. Both the Brownian Passage-Time (BPT) and Weibull distributions are tested. The time-dependent hazard rate thus obtained is then modified by the inclusion of a permanent physical effect due to the Coulomb static stress change caused by failure of neighboring faults since the latest characteristic earthquake on the fault of interest. The validity of the renewal model is assessed retrospectively, using the data of the last 300 years, by comparison with a plain time-independent Poisson model, by means of statistical tools including the Relative Operating Characteristic diagram, the R-score, the probability gain and the log-likelihood ratio.
We treat the uncertainties in the parameters of each examined fault source, such as linear dimensions, depth of the fault center, focal mechanism, recurrence time, coseismic slip, and aperiodicity of the statistical distribution, by a Monte Carlo technique. The Monte Carlo samples for all these parameters are drawn from a uniform distribution within their uncertainty limits. We find that the BPT and the Weibull renewal models yield comparable results, and both of them perform significantly better than the Poisson hypothesis. No clear performance enhancement is achieved by the introduction of the Coulomb static stress change into the renewal model.
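A Weibull renewal model of this kind turns the time elapsed since the last event into a conditional event probability via the survival function. A minimal sketch, with purely illustrative shape and scale values rather than the fitted Corinth parameters:

```python
import math

def weibull_survival(t, shape, scale):
    """Probability that the interevent time exceeds t."""
    return math.exp(-((t / scale) ** shape))

def conditional_prob(t_elapsed, dt, shape, scale):
    """P(event in the next dt years | t_elapsed years since the last event)
    = 1 - S(t_elapsed + dt) / S(t_elapsed) under a renewal model."""
    s_now = weibull_survival(t_elapsed, shape, scale)
    s_later = weibull_survival(t_elapsed + dt, shape, scale)
    return 1.0 - s_later / s_now

# With shape > 1 the hazard grows with elapsed time, so the 30-year
# conditional probability rises as the fault "ages" (values illustrative):
p_young = conditional_prob(10.0, 30.0, shape=2.0, scale=150.0)
p_old = conditional_prob(120.0, 30.0, shape=2.0, scale=150.0)
```

The Coulomb stress step enters such a scheme by shifting the effective elapsed time (or the hazard) before the conditional probability is evaluated.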
Survival Analysis of Patients with End Stage Renal Disease
NASA Astrophysics Data System (ADS)
Urrutia, J. D.; Gayo, W. S.; Bautista, L. A.; Baccay, E. B.
2015-06-01
This paper provides a survival analysis of End Stage Renal Disease (ESRD) using Kaplan-Meier estimates and the Weibull distribution. The data were obtained from the records of V. L. Makabali Memorial Hospital with respect to time t (patient's age), covariates such as developed secondary disease (pulmonary congestion and cardiovascular disease) and gender, and the event of interest: the death of ESRD patients. Survival and hazard rates were estimated using NCSS for the Weibull distribution and SPSS for the Kaplan-Meier estimates. Both lead to the same conclusion: over time, the hazard rate increases and the survival rate decreases for ESRD patients diagnosed with pulmonary congestion, cardiovascular disease, or both. The analysis also shows that female patients have a greater risk of death than males. The probability of risk was given by the equation R = 1 - e^(-H(t)), where e^(-H(t)) is the survival function and H(t) is the cumulative hazard function, obtained using Cox regression.
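The risk equation can be evaluated directly once H(t) is available; the sketch below pairs it with a Weibull cumulative hazard, using illustrative parameter values rather than the NCSS/SPSS fits:

```python
import math

def risk(cumulative_hazard):
    """R = 1 - exp(-H(t)): probability of the event by time t, given the
    cumulative hazard H(t); the complement exp(-H(t)) is the survival."""
    return 1.0 - math.exp(-cumulative_hazard)

def weibull_cumulative_hazard(t, shape, scale):
    """H(t) = (t/scale)**shape for a Weibull hazard model; the shape and
    scale used below are illustrative, not fitted to the paper's data."""
    return (t / scale) ** shape

# Risk rises monotonically with the cumulative hazard:
r1 = risk(weibull_cumulative_hazard(50.0, shape=1.5, scale=80.0))
r2 = risk(weibull_cumulative_hazard(70.0, shape=1.5, scale=80.0))
```

In the paper, H(t) comes from a Cox regression rather than a parametric form, but the R = 1 - e^(-H(t)) transformation is identical.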
Meteorite fractures and the behavior of meteoroids in the atmosphere
NASA Astrophysics Data System (ADS)
Bryson, K.; Ostrowski, D. R.; Sears, D. W. G.
2015-12-01
Arguably the major difficulty in modeling the atmospheric behavior of objects entering the atmosphere is that we know very little about the internal structure of these objects and how they fragment during fall. In a study of over a thousand meteorite fragments (mostly hand-sized, some 40 or 50 cm across) in the collections of the Natural History Museums in Vienna and London, we identified six kinds of fracturing behavior. (1) Chondrites usually showed random fractures with no particular sensitivity to meteorite texture. (2) Coarse irons fractured along kamacite grain boundaries, while (3) fine irons fragmented randomly, cf. chondrites. (4) Fine irons with large crystal boundaries (e.g. Arispe) fragmented along the crystal boundaries. (5) A few chondrites, three in the present study, have a distinct and strong network of fractures forming a brickwork or chicken-wire structure. The Chelyabinsk meteorite has this chicken-wire structure of fractures, which explains the very large number of centimeter-sized fragments that showered the Earth. Finally, (6) previous work on Sutter's Mill showed that water-rich meteorites fracture around clasts. To scale the meteorite fractures to the fragmentation behavior of near-Earth asteroids, it has been suggested that the fracturing behavior follows a statistical prediction made in the 1930s, the Weibull distribution, where fractures are assumed to be randomly distributed through the target and the likelihood of encountering a fracture increases with the size of the body. This results in the relationship σ_l = σ_s (n_s/n_l)^α, where σ_s and σ_l refer to the stress in the small and large object and n_s and n_l refer to the number of cracks per unit volume in the small and large object. The value of α, the Weibull coefficient, is unclear. The Ames meteorite laboratory is working to measure the density and length of the fractures observed in these six types of fracture in order to determine values of the Weibull coefficient for each type of object.
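The scaling relation above can be sketched in a few lines; the stress and crack-density numbers below are placeholders, since the abstract itself notes that α is still unmeasured:

```python
def weibull_size_scaling(sigma_small, n_small, n_large, alpha):
    """sigma_l = sigma_s * (n_s / n_l)**alpha: expected failure stress of the
    large body given the measured small-body stress and the crack densities
    per unit volume. alpha is the (poorly constrained) Weibull coefficient."""
    return sigma_small * (n_small / n_large) ** alpha

# A body with 10x the crack density of a hand sample fails at a lower
# stress, and the drop steepens as alpha grows (all values illustrative):
s1 = weibull_size_scaling(100.0, n_small=1.0, n_large=10.0, alpha=0.25)
s2 = weibull_size_scaling(100.0, n_small=1.0, n_large=10.0, alpha=0.5)
```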
Metocean design parameter estimation for fixed platform based on copula functions
NASA Astrophysics Data System (ADS)
Zhai, Jinjin; Yin, Qilin; Dong, Sheng
2017-08-01
Considering the dependence among wave height, wind speed, and current velocity, we construct novel trivariate joint probability distributions via Archimedean copula functions. Thirty years of wave height, wind speed, and current velocity data for the Bohai Sea are hindcast and sampled for a case study. Four distributions, namely the Gumbel, lognormal, Weibull, and Pearson Type III distributions, are candidate models for the marginal distributions of wave height, wind speed, and current velocity. The Pearson Type III distribution is selected as the optimal model. Bivariate and trivariate probability distributions of these environmental conditions are established based on four bivariate and trivariate Archimedean copulas, namely the Clayton, Frank, Gumbel-Hougaard, and Ali-Mikhail-Haq copulas. These joint probability models make full use of the marginal information and the dependence among the three variables. The design return values of the three variables can be obtained by three methods: univariate probability, conditional probability, and joint probability. The joint return periods of different load combinations are estimated by the proposed models. Platform responses (including base shear, overturning moment, and deck displacement) are further calculated. For the same return period, the design values of wave height, wind speed, and current velocity obtained by the conditional and joint probability models are much smaller than those obtained by univariate probability. By accounting for the dependence among variables, the multivariate probability distributions provide design parameters close to the actual sea state for ocean platform design.
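A sketch of how such a joint model can be sampled, using a generic Clayton copula with Weibull margins drawn by conditional inversion; the dependence parameter and marginal parameters are invented for illustration (the paper actually selects Pearson Type III margins as optimal):

```python
import math
import random

def clayton_sample(theta, rng):
    """Draw one (u, v) pair from a bivariate Clayton copula by conditional
    inversion; theta > 0 controls the (lower-tail) dependence strength."""
    u = rng.random()
    w = rng.random()
    v = ((w ** (-theta / (1.0 + theta)) - 1.0) * u ** (-theta) + 1.0) ** (-1.0 / theta)
    return u, v

def to_weibull(u, shape, scale):
    """Map a uniform margin to a Weibull margin by the inverse CDF."""
    return scale * (-math.log(1.0 - u)) ** (1.0 / shape)

rng = random.Random(0)
pairs = [clayton_sample(2.0, rng) for _ in range(2000)]
# e.g. joint wave height / wind speed with assumed Weibull margins:
hs_ws = [(to_weibull(u, 2.0, 3.0), to_weibull(v, 2.2, 12.0)) for u, v in pairs]
```

Replacing `to_weibull` with any other marginal quantile function leaves the dependence structure (the copula) untouched, which is the appeal of the approach.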
Analysis of survival in breast cancer patients by using different parametric models
NASA Astrophysics Data System (ADS)
Enera Amran, Syahila; Asrul Afendi Abdullah, M.; Kek, Sie Long; Afiqah Muhamad Jamil, Siti
2017-09-01
In biomedical applications and clinical trials, right censoring often arises when studying time-to-event data: some individuals are still alive at the end of the study or are lost to follow-up at a certain time. Handling censored data properly is important to prevent biased analysis. Therefore, this study analyzes right-censored data with three different parametric models: the exponential, Weibull, and log-logistic models. Data on breast cancer patients from Hospital Sultan Ismail, Johor Bahru, from 30 December 2008 until 15 February 2017 were used to illustrate right censoring. The covariates included in this study are the survival time t, the age of each patient X1, and the treatment given to the patient X2. To determine the best parametric model for analyzing the survival of breast cancer patients, the performance of each model was compared based on the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC), and the log-likelihood value using the statistical software R. In the analysis of the breast cancer data, all three distributions showed consistency with the data, the line graph of the cumulative hazard function resembling a straight line through the origin. As a result, the log-logistic model was the best-fitting parametric model compared with the exponential and Weibull models, since it had the smallest AIC and BIC values and the largest log-likelihood.
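The AIC/BIC comparison can be sketched for uncensored data (the study handles right censoring, which this toy omits), using a crude grid-search Weibull MLE against the closed-form exponential MLE; the pseudo-sample below is synthetic, not the hospital data:

```python
import math

def weibull_loglik(times, shape, scale):
    """Log-likelihood of uncensored event times under Weibull(shape, scale)."""
    n = len(times)
    return (n * math.log(shape) - n * shape * math.log(scale)
            + (shape - 1.0) * sum(math.log(t) for t in times)
            - sum((t / scale) ** shape for t in times))

def fit_weibull(times):
    """Crude grid-search MLE: for a fixed shape k the scale MLE is
    (sum(t**k)/n)**(1/k). Illustrative only; real analyses would use R
    (as in the paper) or a dedicated survival library."""
    n = len(times)
    best = None
    for i in range(1, 400):
        k = i * 0.01  # candidate shapes 0.01 .. 3.99
        lam = (sum(t ** k for t in times) / n) ** (1.0 / k)
        ll = weibull_loglik(times, k, lam)
        if best is None or ll > best[2]:
            best = (k, lam, ll)
    return best

def aic(loglik, n_params):
    return 2.0 * n_params - 2.0 * loglik

def bic(loglik, n_params, n):
    return math.log(n) * n_params - 2.0 * loglik

# Deterministic pseudo-sample: quantiles of a Weibull with shape 2.
times = [(-math.log(1.0 - (i + 0.5) / 100.0)) ** 0.5 for i in range(100)]
k_hat, lam_hat, ll_weib = fit_weibull(times)
ll_exp = weibull_loglik(times, 1.0, sum(times) / len(times))  # exponential MLE
aic_weib, aic_exp = aic(ll_weib, 2), aic(ll_exp, 1)
bic_weib, bic_exp = bic(ll_weib, 2, len(times)), bic(ll_exp, 1, len(times))
```

As in the paper, the model with the smallest AIC and BIC (here the Weibull, by construction of the data) would be the one retained.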
Coercivity mechanisms and thermal stability of thin film magnetic recording media
NASA Astrophysics Data System (ADS)
Yang, Cheng
1999-09-01
Coercivity mechanisms and the thermal stability of magnetic recording media were studied. It was found that magnetization reversal mainly occurs by a nucleation mechanism. A correlation was established between the c/a ratio of the Co HCP structure and other process parameters that are thought to be the dominant factors in determining the anisotropy, and therefore the coercivity, of Co-based thin film magnetic recording media. According to the proposed two-energy-level model, the time decay and switching of the magnetization in thin film magnetic recording media depend on the grain size distribution and the easy-axis orientation distribution. Relaxation time is the most fundamental parameter determining the time-decay performance of magnetic recording media. An algorithm was proposed to calculate its distribution directly from experimental data without any presumption. It was found for the first time that the distribution of relaxation time takes the form of a Weibull distribution.
Tensile Strength and Microstructural Characterization of Uncoated and Coated HPZ Ceramic Fibers
NASA Technical Reports Server (NTRS)
Bansal, Narottam P.; Wheeler, Donald R.; Dickerson, Robert M.
1996-01-01
Tensile strengths of as-received HPZ fiber and those surface coated with BN, BN/SiC, and BN/Si3N4 have been determined at room temperature using a two-parameter Weibull distribution. Nominally approx. 0.4 micron BN and 0.2 micron SiC or Si3N4 coatings were deposited on the fibers by chemical vapor deposition using a continuous reactor. The average tensile strength of uncoated HPZ fiber was 2.0 +/- 0.56 GPa (290 +/- 81 ksi) with a Weibull modulus of 4.1. For the BN coated fibers, the average strength and the Weibull modulus increased to 2.39 +/- 0.44 GPa (346 +/- 64 ksi) and 6.5, respectively. The HPZ/BN/SiC fibers showed an average strength of 2.0 +/- 0.32 GPa (290 +/- 47 ksi) and Weibull modulus of 7.3. Average strength of the fibers having a dual BN/Si3N4 surface coating degraded to 1.15 +/- 0.26 GPa (166 +/- 38 ksi) with a Weibull modulus of 5.3. The chemical composition and thickness of the fiber coatings were determined using scanning Auger analysis. Microstructural analysis of the fibers and the coatings was carried out by scanning electron microscopy and transmission electron microscopy. A microporous silica-rich layer approx. 200 nm thick is present on the as-received HPZ fiber surface. The BN coatings on the fibers are amorphous to partly turbostratic and contaminated with carbon and oxygen. Silicon carbide coating was crystalline whereas the silicon nitride coating was amorphous. The silicon carbide and silicon nitride coatings are non-stoichiometric, non-uniform, and granular. Within a fiber tow, the fibers on the outside had thicker and more granular coatings than those on the inside.
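The Weibull modulus quoted for each coating can be estimated from a set of fracture strengths via the linearized Weibull plot. A sketch using synthetic strengths generated to mimic the BN-coated case (m = 6.5, with a characteristic strength of 2.39 GPa chosen purely for illustration):

```python
import math

def weibull_modulus(strengths):
    """Least-squares slope of the linearized Weibull plot
    ln(-ln(1 - F_i)) vs ln(sigma_i), with median-rank plotting positions
    F_i = (i - 0.3)/(n + 0.4). Returns (modulus, characteristic strength)."""
    s = sorted(strengths)
    n = len(s)
    xs = [math.log(v) for v in s]
    ys = [math.log(-math.log(1.0 - (i + 0.7) / (n + 0.4))) for i in range(n)]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    m = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    sigma0 = math.exp(xbar - ybar / m)  # stress where the fit line crosses y=0
    return m, sigma0

# Synthetic 30-specimen sample taken as exact Weibull quantiles
# (m = 6.5, sigma0 = 2.39 GPa); the estimator recovers the inputs:
n = 30
data = [2.39 * (-math.log(1.0 - (i + 0.7) / (n + 0.4))) ** (1.0 / 6.5)
        for i in range(n)]
m_hat, s0_hat = weibull_modulus(data)
```

A steeper slope (larger m) means a narrower strength scatter, which is why the BN/SiC-coated fibers (m = 7.3) are more uniform than the uncoated ones (m = 4.1).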
NASA Astrophysics Data System (ADS)
Linstrom, Elizabeth Jane
A new approach to the nondestructive evaluation of polymer matrix/graphite fiber composites is presented. This technique permits the determination of the top-ply bond strength of a laminate based on the results of ultrasonic testing, and it is designed for the real-time, nondestructive evaluation of composites during tape laying. By separately bonding the top ply of thermoset and thermoplastic polymer composite laminates, a poor ply bond was achieved solely at the interface of the top ply and the rest of the laminate. Using angled incidence, a 5 MHz, 4 μs ultrasonic pulse was induced into the composite samples. This created waves traveling along the surface of the composite samples that were picked up by a receiving transducer. The received signal was cross-correlated with an artificially constructed replica of the input signal, and the maximum amplitude of the cross-correlated signal was recorded. The cross-correlated signal was then converted to a frequency spectrum using a fast Fourier transform, and the maximum amplitude of the frequency spectrum was recorded. These measurements were repeated at 18 to 30 different locations on each composite sample. The resulting collections of maximum amplitudes of the cross-correlated signals and frequency spectra were fit to two-parameter Weibull distributions. The composite samples were then destructively evaluated using a flat-wise tensile test. The B-basis values of the ultrasonic-data Weibull distributions were compared to the B-basis values of the Weibull distribution of the strength data, and a good correlation was found.
Accounting for inherent variability of growth in microbial risk assessment.
Marks, H M; Coleman, M E
2005-04-15
Risk assessments of pathogens need to account for the growth of small numbers of cells under varying conditions. To determine the possible risks that occur when only small numbers of cells are present, stochastic models of growth are needed that capture the distribution of the number of cells over replicate trials of the same scenario or environmental conditions. This paper provides a simple stochastic growth model, accounting only for inherent cell-growth variability and assuming constant growth kinetic parameters, for an initial small number of cells assumed to be transitioning from a stationary to an exponential phase. Two basic sets of microbial assumptions are considered: serial, where cells are assumed to pass through a lag phase before entering the exponential phase of growth; and parallel, where the lag and exponential phases are assumed to develop in parallel. The model is based on first determining the distribution of the time when growth commences, and then modelling the conditional distribution of the number of cells. For the latter, it is found that a Weibull distribution provides a simple approximation to the conditional distribution of the relative growth, so that the model developed in this paper can be easily implemented in risk assessments using commercial software packages.
Modelling the Ozone-Based Treatments for Inactivation of Microorganisms.
Brodowska, Agnieszka Joanna; Nowak, Agnieszka; Kondratiuk-Janyska, Alina; Piątkowski, Marcin; Śmigielski, Krzysztof
2017-10-09
The paper presents the development of a model for ozone treatment in a dynamic bed of different microorganisms (Bacillus subtilis, B. cereus, B. pumilus, Escherichia coli, Pseudomonas fluorescens, Aspergillus niger, Eupenicillium cinnamopurpureum) on a heterogeneous matrix (juniper berries, cardamom seeds) initially treated with a range of ozone doses and contact times. Given the varying susceptibility of microorganisms to ozone, it was of great importance to establish a sufficiently effective ozone dose for preserving food products, based on a microbial model covering different strains. For this purpose, we chose the Weibull model to describe the survival curves of the different microorganisms. Based on the modelled survival of the microorganisms after ozone treatment, we identified the strains least susceptible to ozone as the critical ones. Among the tested strains, those of the genus Bacillus were recognized as the most critical. In particular, B. subtilis and B. pumilus showed the highest resistance to ozone treatment, because the time needed to reach the lowest survival level was the longest (up to 17.04 min and 16.89 min for B. pumilus reduction on the juniper berry and cardamom seed matrix, respectively). Ozone treatment inactivated the microorganisms, achieving lower survival rates with an ozone dose of 20.0 g O₃/m³ O₂ (flow rate 0.4 L/min) and contact times of up to 20 min. The results demonstrated a linear correlation between the parameters p and k of the Weibull distribution, providing an opportunity to calculate a fitted equation for the process.
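Weibull-type survival curves of this kind (the Mafart parameterization) can be sketched as follows; the δ and p values below are invented, not the fitted values for any of the strains:

```python
def log10_survival(t, delta, p):
    """Weibull inactivation model: log10(N/N0) = -(t/delta)**p, where delta
    is the time of the first decimal reduction and p the shape parameter
    (p < 1 gives the concave-up tailing typical of resistant populations)."""
    return -((t / delta) ** p)

def time_for_reduction(log_reductions, delta, p):
    """Invert the model: contact time needed for a given log10 reduction."""
    return delta * log_reductions ** (1.0 / p)

# Illustrative parameters (not the paper's fits): a resistant strain with
# delta = 8 min and p = 0.9 needs on the order of 17 min for 2 log cycles:
t_2log = time_for_reduction(2.0, delta=8.0, p=0.9)
```

Critical-strain screening as described above amounts to computing `time_for_reduction` per strain and taking the maximum.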
CARES/PC - CERAMICS ANALYSIS AND RELIABILITY EVALUATION OF STRUCTURES
NASA Technical Reports Server (NTRS)
Szatmary, S. A.
1994-01-01
The beneficial properties of structural ceramics include their high-temperature strength, light weight, hardness, and corrosion and oxidation resistance. For advanced heat engines, ceramics have demonstrated functional abilities at temperatures well beyond the operational limits of metals. This is offset by the fact that ceramic materials tend to be brittle. When a load is applied, their lack of significant plastic deformation causes the material to crack at microscopic flaws, destroying the component. CARES/PC performs statistical analysis of data obtained from the fracture of simple, uniaxial tensile or flexural specimens and estimates the Weibull and Batdorf material parameters from this data. CARES/PC is a subset of the program CARES (COSMIC program number LEW-15168) which calculates the fast-fracture reliability or failure probability of ceramic components utilizing the Batdorf and Weibull models to describe the effects of multi-axial stress states on material strength. CARES additionally requires that the ceramic structure be modeled by a finite element program such as MSC/NASTRAN or ANSYS. The more limited CARES/PC does not perform fast-fracture reliability estimation of components. CARES/PC estimates ceramic material properties from uniaxial tensile or from three- and four-point bend bar data. In general, the parameters are obtained from the fracture stresses of many specimens (30 or more are recommended) whose geometry and loading configurations are held constant. Parameter estimation can be performed for single or multiple failure modes by using the least-squares analysis or the maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests measure the accuracy of the hypothesis that the fracture data comes from a population with a distribution specified by the estimated Weibull parameters. 
Ninety-percent confidence intervals on the Weibull parameters and the unbiased value of the shape parameter for complete samples are provided when the maximum likelihood technique is used. CARES/PC is written and compiled with the Microsoft FORTRAN v5.0 compiler using the VAX FORTRAN extensions and dynamic array allocation supported by this compiler for the IBM/MS-DOS or OS/2 operating systems. The dynamic array allocation routines allow the user to match the number of fracture sets and test specimens to the memory available. Machine requirements include IBM PC compatibles with optional math coprocessor. Program output is designed to fit 80-column format printers. Executables for both DOS and OS/2 are provided. CARES/PC is distributed on one 5.25 inch 360K MS-DOS format diskette in compressed format. The expansion tool PKUNZIP.EXE is supplied on the diskette. CARES/PC was developed in 1990. IBM PC and OS/2 are trademarks of International Business Machines. MS-DOS and MS OS/2 are trademarks of Microsoft Corporation. VAX is a trademark of Digital Equipment Corporation.
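The Kolmogorov-Smirnov check that CARES/PC applies can be sketched as an empirical-vs-hypothesized CDF distance (critical values, censoring, and the Anderson-Darling variant are omitted, and the stress values below are synthetic):

```python
import math

def weibull_cdf(x, shape, scale):
    return 1.0 - math.exp(-((x / scale) ** shape))

def ks_statistic(data, shape, scale):
    """Kolmogorov-Smirnov distance between the empirical CDF of fracture
    stresses and a hypothesized Weibull CDF: the largest gap on either
    side of each step of the empirical CDF."""
    s = sorted(data)
    n = len(s)
    d = 0.0
    for i, x in enumerate(s):
        f = weibull_cdf(x, shape, scale)
        d = max(d, abs(f - i / n), abs((i + 1) / n - f))
    return d

# 30 stresses taken as exact quantiles of Weibull(shape=10, scale=500 MPa):
data = [500.0 * (-math.log(1.0 - (i + 0.5) / 30.0)) ** 0.1 for i in range(30)]
d_good = ks_statistic(data, 10.0, 500.0)  # correct hypothesis: small D
d_bad = ks_statistic(data, 3.0, 500.0)    # wrong shape: much larger D
```

In CARES/PC the hypothesized parameters are themselves the least-squares or maximum-likelihood estimates, and D is compared against tabulated significance levels.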
Recurrence time statistics of landslide events simulated by a cellular automaton model
NASA Astrophysics Data System (ADS)
Piegari, Ester; Di Maio, Rosa; Avella, Adolfo
2014-05-01
The recurrence time statistics of a cellular automaton modelling landslide events are analyzed by performing a numerical analysis in the parameter space and estimating Fano factor behaviors. The model is an extended version of the OFC model, a paradigm for SOC in non-conserved systems, but it works differently from the original OFC model in that a finite value of the driving rate is applied. By driving the system to instability at different rates, the model exhibits a smooth transition from a correlated to an uncorrelated regime, reflecting a change in the predominant mechanisms that propagate instability. If the rate at which instability is approached is small, chain processes dominate the landslide dynamics, and power laws govern the probability distributions. However, the power-law regime typical of SOC-like systems is found in a range of return intervals that becomes shorter and shorter as the driving rate increases. Indeed, if the rates at which instability is approached are large, domino processes are no longer active in propagating instability, and large events occur simply because a large number of cells reach instability simultaneously. Such a gradual loss of effectiveness of the chain propagation mechanism causes the system to gradually enter an uncorrelated regime where recurrence time distributions are characterized by Weibull behaviors. Simulation results are qualitatively compared with those from a recent analysis performed by Witt et al. (Earth Surf. Process. Landforms, 35, 1138, 2010) for the first complete databases of landslide occurrences over a period as long as fifty years.
From the comparison with the extensive landslide data set, the numerical analysis suggests that statistics of such landslide data seem to be described by a crossover region between a correlated regime and an uncorrelated regime, where recurrence time distributions are characterized by power-law and Weibull behaviors for short and long return times, respectively. Finally, in such a region of the parameter space, clear indications of temporal correlations and clustering by the Fano factor behaviors support, at least in part, the analysis performed by Witt et al. (2010).
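The Fano factor diagnostic used above is simple to compute from a catalog of event times; a sketch with two artificial sequences, one periodic and one clustered (both invented for illustration):

```python
def fano_factor(event_times, window, t_max):
    """Fano factor of event counts: variance/mean of the number of events
    in consecutive windows of fixed length. It is ~1 for a Poisson process,
    < 1 for quasi-periodic sequences, and > 1 for clustered ones."""
    n_windows = int(t_max // window)
    counts = [0] * n_windows
    for t in event_times:
        idx = int(t // window)
        if idx < n_windows:
            counts[idx] += 1
    mean = sum(counts) / n_windows
    var = sum((c - mean) ** 2 for c in counts) / n_windows
    return var / mean

# Perfectly periodic landslides: zero count variance across windows.
periodic = [float(i) for i in range(100)]
# The same 100 events packed into bursts at the start of each decade.
clustered = [10.0 * (i // 10) + 0.01 * (i % 10) for i in range(100)]
f_per = fano_factor(periodic, window=5.0, t_max=100.0)
f_clu = fano_factor(clustered, window=5.0, t_max=100.0)
```

In the study, window lengths are swept over several scales, so that F(window) reveals at which return times the correlated (power-law) regime gives way to the uncorrelated (Weibull) one.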
ZERODUR - bending strength: review of achievements
NASA Astrophysics Data System (ADS)
Hartmann, Peter
2017-08-01
Increased demand for using the glass ceramic ZERODUR® under high mechanical loads has called for strength data based on larger statistical samples. Design calculations for a failure probability target below 1:100 000 cannot be made reliable with parameters derived from 20-specimen samples. The data now available for a variety of surface conditions, ground with different grain sizes and acid etched for full micro-crack removal, allow stresses four to ten times higher than before. The large sample revealed that breakage stresses of ground surfaces follow the three-parameter Weibull distribution rather than the two-parameter version. This is more reasonable considering that the micro cracks of such surfaces have a maximum depth, which is reflected in the existence of a threshold breakage stress below which the breakage probability is zero. This minimum strength allows minimum lifetimes to be calculated. Fatigue under load can be taken into account by using the stress corrosion coefficient for the actual environmental humidity. For fully etched surfaces, Weibull statistics fail: the precondition of the Weibull distribution, the existence of one unique failure mechanism, no longer holds. ZERODUR® with fully etched surfaces free from damage introduced after etching easily endures 100 MPa tensile stress. The possibility of using ZERODUR® for combined high-precision and high-stress applications was confirmed by the successful launch and continuing operation of LISA Pathfinder, the precursor experiment for the gravitational wave antenna satellite array eLISA.
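The three-parameter form with a stress threshold can be sketched directly; the threshold, scale, and shape values below are illustrative, not the ZERODUR® fits:

```python
import math

def failure_probability(stress, threshold, scale, shape):
    """Three-parameter Weibull: P_f = 1 - exp(-((s - s_u)/s_0)**m) for
    s > s_u, and exactly 0 below the threshold s_u. The threshold reflects
    a maximum micro-crack depth; all numbers here are made up."""
    if stress <= threshold:
        return 0.0
    return 1.0 - math.exp(-(((stress - threshold) / scale) ** shape))

# Below the threshold the breakage probability is exactly zero, which is
# what permits a guaranteed minimum strength and minimum lifetime:
p_below = failure_probability(25.0, threshold=30.0, scale=40.0, shape=2.5)
p_above = failure_probability(80.0, threshold=30.0, scale=40.0, shape=2.5)
```

Setting `threshold=0` recovers the two-parameter form, which is why small samples cannot distinguish the two reliably.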
Wang, Ping; Liu, Xiaoxia; Cao, Tian; Fu, Huihua; Wang, Ranran; Guo, Lixin
2016-09-20
The impact of nonzero boresight pointing errors on the system performance of decode-and-forward protocol-based multihop parallel optical wireless communication systems is studied. For the aggregated fading channel, the atmospheric turbulence is simulated by an exponentiated Weibull model, and pointing errors are described by a recently proposed statistical model including both boresight and jitter. The binary phase-shift keying subcarrier intensity modulation-based analytical average bit error rate (ABER) and outage probability expressions are derived for a nonidentically and independently distributed system. The ABER and outage probability are then analyzed for different turbulence strengths, receiving aperture sizes, structure parameters (P and Q), jitter variances, and boresight displacements. The results show that aperture averaging offers almost the same system performance improvement whether or not boresight is included, regardless of the values of P and Q. The performance enhancement owing to an increase in cooperative paths (P) is more evident with nonzero boresight than with zero boresight (jitter only), whereas the performance deterioration due to increasing hops (Q) with nonzero boresight is almost the same as that with zero boresight. Monte Carlo simulation is offered to verify the validity of the ABER and outage probability expressions.
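The exponentiated Weibull fading channel is characterized by its CDF, which also gives the outage probability at a receiver irradiance threshold; the sketch below uses invented parameter values rather than ones tied to a physical turbulence strength or aperture size:

```python
import math

def exp_weibull_cdf(x, alpha, beta, eta):
    """Exponentiated Weibull CDF, F(x) = (1 - exp(-(x/eta)**beta))**alpha.
    In the fading model alpha, beta, eta are set by the turbulence strength
    and aperture; the values used below are purely illustrative."""
    return (1.0 - math.exp(-((x / eta) ** beta))) ** alpha

def outage_probability(threshold, alpha, beta, eta):
    """Outage = P(received irradiance < threshold) = CDF at the threshold."""
    return exp_weibull_cdf(threshold, alpha, beta, eta)

# A thinner lower tail (here, a larger beta, loosely mimicking stronger
# aperture averaging) cuts the outage probability at a deep-fade threshold:
p1 = outage_probability(0.1, alpha=2.0, beta=1.5, eta=1.0)
p2 = outage_probability(0.1, alpha=2.0, beta=3.0, eta=1.0)
```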
Ulusoy, Nuran
2017-01-01
The aim of this study was to evaluate the effects of two endocrown designs and computer-aided design/manufacturing (CAD/CAM) materials on the stress distribution and failure probability of restorations applied to a severely damaged endodontically treated maxillary first premolar tooth (MFP). Two designs, without and with 3 mm intraradicular extensions, endocrown (E) and modified endocrown (ME), were modeled on a 3D finite element (FE) model of the MFP. Vitablocks Mark II (VMII), Vita Enamic (VE), and Lava Ultimate (LU) CAD/CAM materials were used for each design. von Mises and maximum principal stress values were evaluated, and the Weibull function was incorporated into the FE analysis to calculate the long-term failure probability. Regarding the stresses in enamel, for each material the ME design transmitted less stress than the endocrown. During normal occlusal function, the overall failure probability was lowest for ME with VMII. The ME design with VE was the best restorative option for premolar teeth with extensive loss of coronal structure under high occlusal loads. Therefore, the ME design could be a favorable treatment option for MFPs with a missing palatal cusp. Among the CAD/CAM materials tested, VMII and VE were found to be more tooth-friendly than LU. PMID:29119108
NASA Astrophysics Data System (ADS)
Kim, Kyu Rang; Kim, Mijin; Choe, Ho-Seong; Han, Mae Ja; Lee, Hye-Rim; Oh, Jae-Won; Kim, Baek-Jo
2017-02-01
Pollen is an important cause of respiratory allergic reactions. As individual sanitation has improved, allergy risk has increased, and this trend is expected to continue due to climate change. Atmospheric pollen concentration is highly influenced by weather conditions. Regression analysis and modeling of the relationships between airborne pollen concentrations and weather conditions were performed to analyze and forecast pollen conditions. Traditionally, daily pollen concentration has been estimated using regression models that describe the relationships between observed pollen concentrations and weather conditions. These models were able to forecast daily concentrations at the sites of observation, but lacked broader spatial applicability beyond those sites. To overcome this limitation, an integrated modeling scheme was developed that is designed to represent the underlying processes of pollen production and distribution. A maximum potential for airborne pollen is first determined using the Weibull probability density function. Then, daily pollen concentration is estimated using multiple regression models. Daily risk grade levels are determined based on the risk criteria used in Korea. The mean percentages of agreement between the observed and estimated levels were 81.4-88.2 % and 92.5-98.5 % for oak and Japanese hop pollens, respectively. The new models estimated daily pollen risk more accurately than the original statistical models because of the newly integrated biological response curves. Although they overestimated seasonal mean concentration, they did not simulate all of the peak concentrations. This issue would be resolved by adding more variables that affect the prevalence and internal maturity of pollens.
In silico study on the effects of matrix structure in controlled drug release
NASA Astrophysics Data System (ADS)
Villalobos, Rafael; Cordero, Salomón; Maria Vidales, Ana; Domínguez, Armando
2006-07-01
Purpose: To study the effects of drug concentration and of the spatial distribution of the medicament in porous solid dosage forms on the kinetics and total yield of drug release. Methods: Cubic networks are used as models of drug release systems. They were constructed within the dual site-bond model framework, which allows a substrate to have an adequate geometrical and topological distribution of its pore elements. Drug particles move inside the networks following a random walk with excluded-volume interactions between the particles. The time evolution of drug release was monitored for different drug concentrations and different initial spatial distributions of the drug. Results: The numerical results show that in all the studied cases drug release presents anomalous behavior, and the consequences of the matrix structural properties, i.e., drug spatial distribution and drug concentration, for the drug release profile have been quantified. Conclusions: The Weibull function provides a simple connection between the model parameters and the microstructure of the drug release device. Critical modeling of drug release from matrix-type delivery systems is important in order to understand the transport mechanisms involved and to predict the effect of device design parameters on the release rate.
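The Weibull release function referred to in the conclusions can be sketched as below; τ and b are illustrative, and reading b < 1 as the stretched, anomalous profile is a common convention in the release literature rather than this paper's calibration:

```python
import math

def weibull_release(t, tau, b):
    """Fraction of drug released at time t under the Weibull function
    M(t)/M_inf = 1 - exp(-(t/tau)**b); tau sets the time scale, and the
    exponent b carries the link to the matrix microstructure."""
    return 1.0 - math.exp(-((t / tau) ** b))

# An exponent b < 1 gives a fast early release that then slows down
# relative to first-order (b = 1) kinetics (all parameters illustrative):
frac_anomalous = [weibull_release(t, tau=10.0, b=0.6) for t in (1, 5, 10, 50)]
frac_first_order = [weibull_release(t, tau=10.0, b=1.0) for t in (1, 5, 10, 50)]
```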
Lysyk, T J; Danyk, T
2007-09-01
The effect of temperature on survival, oviposition, gonotrophic development, and a life history factor of vectorial capacity was examined in adult Culicoides sonorensis (Wirth & Jones) (Diptera: Ceratopogonidae) that originated from two geographic locations. Flies originating from the United States (Colorado) had slightly reduced survival after a bloodmeal compared with wild flies collected in southern Alberta (AB), Canada. Survival of AB flies declined in a curvilinear manner with temperature, whereas survival of U.S. flies showed a linear response to temperature. The survivorship curve of the AB flies more closely followed a Weibull distribution than an exponential, indicating survival was age-dependent. Survivorship of the U.S. flies followed an exponential distribution. Females from both sources laid similar numbers of eggs throughout their life. The first eggs were laid by females from both sources at 31.9 degree-days above a 9.3°C base (DD9.3). Dissections of blood-fed flies reared at various temperatures indicated that flies from both sources were 90% gravid at 32 DD9.3. Relationships among temperature and life history components of vectorial capacity were similar among flies from the two sources and indicated that vectorial capacity would be approximately 1.8-2.6-fold greater in a southern U.S. climate compared with southwestern Canada due solely to the effects of temperature on the life history of C. sonorensis. Using life history estimates derived from the Weibull model had little effect on estimating vectorial capacity, whereas using estimates derived from the exponential model slightly overestimated vectorial capacity.
Regional And Seasonal Aspects Of Within-The-Hour Tec Statistics
NASA Astrophysics Data System (ADS)
Koroglu, Ozan; Arikan, Feza; Koroglu, Meltem
2015-04-01
The ionosphere is an atmospheric layer with a plasma structure. Several mechanisms originating both from space and from the Earth itself, such as solar radiation and geomagnetic effects, govern this plasma layer. The ionosphere plays an important role in HF and satellite communication and in space-based positioning systems. Therefore, the determination of the statistical behavior of the ionosphere is of utmost importance. The variability of the ionosphere has complex spatio-temporal characteristics, which depend on solar, geomagnetic, gravitational and seismic activities. Total Electron Content (TEC) is one of the major observables for investigating and determining this variability. In this study, the spatio-temporal within-the-hour statistical behavior of TEC is determined for Turkey, which is located at mid-latitude, using TEC estimates from the Turkish National Permanent GPS Network (TNPGN)-Active between the years 2009 and 2012. TEC estimates are obtained as IONOLAB-TEC, which is developed by the IONOLAB group (www.ionolab.org) at Hacettepe University. IONOLAB-TEC for each station in TNPGN-Active is organized in a database and grouped with respect to years, ionospheric seasons, hours and regions of 2 degrees by 3 degrees in latitude and longitude, respectively. The data sets are used to calculate within-the-hour parametric Probability Density Functions (PDF). For every year, every region and every hour, a representative PDF is determined. It is observed that TEC values have a strong hourly, seasonal and positional dependence in the east-west direction, and the growing trend shifts according to sunrise and sunset times. It is observed that the data are distributed predominantly as Lognormal and Weibull. The averages and standard deviations of the chosen distributions follow the trends in the 24 hour diurnal and 11 year solar cycle periods. The regional and seasonal behavior of the PDFs is investigated using a representative GPS station within each region.
Within-the-hour PDF estimates are grouped into ionospheric seasons as Winter, Summer, March equinox and September equinox. In the winter and summer seasons, the Lognormal distribution is observed. During equinox seasons, the Weibull distribution is observed more frequently. Furthermore, all hourly TEC values in the same region are combined in order to improve the reliability and accuracy of the probability density function estimates. It is observed that, being in the mid-latitude region, the ionosphere over Turkey has robust characteristics that are distributed as Lognormal and Weibull. Statistical observations on PDF estimates of TEC of the ionosphere over Turkey will contribute to developing a regional and seasonal random field model, which will further contribute to HF channel characterization. This study is supported by a joint grant of TUBITAK 112E568 and RFBR 13-02-91370-CT_a.
NASA Astrophysics Data System (ADS)
Li, T.; Griffiths, W. D.; Chen, J.
2017-11-01
The Maximum Likelihood method and the Linear Least Squares (LLS) method have been widely used to estimate Weibull parameters for the reliability of brittle and metallic materials. In the last 30 years, many researchers have focused on the bias of Weibull modulus estimation, and some improvements have been achieved, especially in the case of the LLS method. However, these methods have a shortcoming for a specific type of data, where the lower tail deviates dramatically from the well-known linear fit in a classic LLS Weibull analysis. This deviation is commonly found in the measured properties of materials, and previous applications of the LLS method to this kind of dataset produce an unreliable linear regression. This deviation was previously thought to be due to physical flaws (i.e., defects) contained in materials. However, this paper demonstrates that this deviation can also be caused by the linear transformation of the Weibull function that occurs in the traditional LLS method. Accordingly, it may not be appropriate to carry out a Weibull analysis using the linearized Weibull function, and the Non-linear Least Squares method (Non-LS) is instead recommended for the Weibull modulus estimation of casting properties.
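The contrast between the two estimators can be sketched as follows. The snippet below fits a two-parameter Weibull model to synthetic strength data (the modulus, scale, and sample size are assumed illustrative values, not data from the paper) using both the linearized least-squares transform and non-linear least squares on the untransformed CDF:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import weibull_min

# Synthetic strength data (MPa) drawn from a known Weibull law; illustrative only.
true_m, true_s0 = 8.0, 300.0  # assumed modulus and scale
strengths = np.sort(weibull_min.rvs(true_m, scale=true_s0,
                                    size=50, random_state=0))

# Median-rank estimator of the empirical failure probability
n = strengths.size
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)

# 1) Linearized LLS: ln(-ln(1 - F)) = m*ln(sigma) - m*ln(s0)
x = np.log(strengths)
y = np.log(-np.log(1.0 - F))
m_lls, c = np.polyfit(x, y, 1)
s0_lls = np.exp(-c / m_lls)

# 2) Non-linear least squares on the untransformed Weibull CDF
def weibull_cdf(s, m, s0):
    return 1.0 - np.exp(-(s / s0) ** m)

(m_nls, s0_nls), _ = curve_fit(weibull_cdf, strengths, F, p0=(m_lls, s0_lls))

print(f"LLS:    m = {m_lls:.2f}, s0 = {s0_lls:.1f}")
print(f"NonLS:  m = {m_nls:.2f}, s0 = {s0_nls:.1f}")
```

On well-behaved data the two estimates agree closely; the paper's point is that they diverge when the lower tail departs from linearity in the transformed coordinates.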
Migration kinetics of four photo-initiators from paper food packaging to solid food simulants.
Cai, Huimei; Ji, Shuilin; Zhang, Juzhou; Tao, Gushuai; Peng, Chuanyi; Hou, Ruyan; Zhang, Liang; Sun, Yue; Wan, Xiaochun
2017-09-01
The migration behaviour of four photo-initiators (BP, EHA, MBP and Irgacure 907) was studied by 'printing' onto four different food-packaging materials (Kraft paper, white cardboard, Polyethylene (PE)-coated paper and composite paper) and tracking movement into the food simulant: Tenax-TA (porous polymer 2,6-diphenyl furan resin). The results indicated that the migration of the photo-initiators was related to the molecular weight and log Ko/w of each photo-initiator. At different temperatures, the migration rates of the photo-initiators were different in papers with different thicknesses. The amount of each photo-initiator found in the food was closely related to the food matrix. The Weibull model was used to predict the migration load into the food simulants by calculating the parameters τ and β and determining the relationship of the two parameters with temperature and paper thickness. The established Weibull model was then used to predict the migration of each photo-initiator with respect to different foods. A two-parameter Weibull model fitted the actual situation, with some deviation from the actual migration amount.
de Oliveira, Thales Leandro Coutinho; Soares, Rodrigo de Araújo; Piccoli, Roberta Hilsdorf
2013-03-01
The antimicrobial effect of oregano (Origanum vulgare L.) and lemongrass (Cymbopogon citratus (DC.) Stapf.) essential oils (EOs) against Salmonella enterica serotype Enteritidis was evaluated in in vitro experiments and in inoculated ground bovine meat during refrigerated storage (4±2 °C) for 6 days. The Weibull model was tested for fitting the survival/inactivation bacterial curves (estimating the p and δ parameters). The minimum inhibitory concentration (MIC) value for both EOs against S. Enteritidis was 3.90 μl/ml. The EO concentrations applied in the ground beef were 3.90, 7.80 and 15.60 μl/g, based on MIC levels and possible activity reduction by food constituents. Both evaluated EOs, at all tested levels, showed antimicrobial effects, with microbial populations decreasing (p≤0.05) over storage time. Evaluation of the fit-quality parameters (RSS and RSE) showed that Weibull models are able to describe the inactivation curves of EOs against S. Enteritidis. The application of EOs in processed meats can be used to control pathogens during refrigerated shelf-life. Copyright © 2012 Elsevier Ltd. All rights reserved.
The Effect of Sr Modifier Additions on Double Oxide Film Defects in 2L99 Alloy Castings
NASA Astrophysics Data System (ADS)
Chen, Qi; Griffiths, W. D.
2017-11-01
In this paper, Sr modifier (300 ppm) was added to 2L99 alloy sand castings to investigate its effect on bifilm defects in the castings. Two different sand molds were used in this study, with good and bad running system designs, to introduce different amounts of bifilm defects into the castings. The mechanical properties of the modified 2L99 castings were compared to the properties of unmodified castings and showed that with high bifilm defect contents (H) the Sr addition reduced the Weibull modulus of the UTS by 67 pct and the Position Parameter by 5 pct, and introduced a bimodal distribution into the Weibull plot of the pct Elongation. However, for castings with low bifilm defect content (L), the Weibull moduli of both the UTS and pct Elongation were significantly improved (by 78 and 73 pct, respectively) with the addition of Sr. The Position Parameter of the pct Elongation was improved by 135 pct. The results suggested that a desirable modification effect can only be achieved when the bifilm defect content in a casting is low.
Jeffrey H. Gove
2003-01-01
Many of the most popular sampling schemes used in forestry are probability proportional to size methods. These methods are also referred to as size biased because sampling is actually from a weighted form of the underlying population distribution. Length- and area-biased sampling are special cases of size-biased sampling where the probability weighting comes from a...
Pocket Handbook on Reliability
1975-09-01
Exponential distributions, the Weibull distribution, estimating reliability, confidence intervals, reliability growth, OC curves, Bayesian analysis. ...introduction for those not familiar with reliability and a good refresher for those who are currently working in the area. LEWIS NERI, CHIEF...includes one or both of the following objectives: a) prediction of the current system reliability, b) projection of the system reliability for some future
Richard A. Johnson; James W. Evans; David W. Green
2003-01-01
Ratios of strength properties of lumber are commonly used to calculate property values for standards. Although originally proposed in terms of means, ratios are being applied without regard to position in the distribution. It is now known that lumber strength properties are generally not normally distributed. Therefore, nonparametric methods are often used to derive...
C-Sphere Strength-Size Scaling in a Bearing-Grade Silicon Nitride
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wereszczak, Andrew A; Jadaan, Osama M.; Kirkland, Timothy Philip
2008-01-01
A C-sphere specimen geometry was used to determine the failure strength distributions of a commercially available bearing-grade silicon nitride (Si3N4) having ball diameters of 12.7 and 25.4 mm. Strengths for both diameters were determined using the combination of failure load, C-sphere geometry, and finite element analysis, and were fitted using two-parameter Weibull distributions. Effective areas of both diameters were estimated as a function of Weibull modulus and used to explore whether the strength distributions predictably strength-scaled between each size. They did not. That statistical observation suggested that the same flaw type did not limit the strength of both ball diameters, indicating a lack of material homogeneity between the two sizes. Optical fractography confirmed that. It showed there were two distinct strength-limiting flaw types in both ball diameters, that one flaw type was always associated with lower strength specimens, and that a significantly higher fraction of the 25.4-mm-diameter C-sphere specimens failed from it. Predictable strength-size-scaling would therefore not result as a consequence of this because these flaw types were not homogeneously distributed and sampled in both C-sphere geometries.
Four theorems on the psychometric function.
May, Keith A; Solomon, Joshua A
2013-01-01
In a 2-alternative forced-choice (2AFC) discrimination task, observers choose which of two stimuli has the higher value. The psychometric function for this task gives the probability of a correct response for a given stimulus difference, Δx. This paper proves four theorems about the psychometric function. Assuming the observer applies a transducer and adds noise, Theorem 1 derives a convenient general expression for the psychometric function. Discrimination data are often fitted with a Weibull function. Theorem 2 proves that the Weibull "slope" parameter, β, can be approximated by β(Noise) × β(Transducer), where β(Noise) is the β of the Weibull function that fits best to the cumulative noise distribution, and β(Transducer) depends on the transducer. We derive general expressions for β(Noise) and β(Transducer), from which we derive expressions for specific cases. One case that follows naturally from our general analysis is Pelli's finding that, when d′ ∝ (Δx)^b, β ≈ β(Noise) × b. We also consider two limiting cases. Theorem 3 proves that, as sensitivity improves, 2AFC performance will usually approach that for a linear transducer, whatever the actual transducer; we show that this does not apply at signal levels where the transducer gradient is zero, which explains why it does not apply to contrast detection. Theorem 4 proves that, when the exponent of a power-function transducer approaches zero, 2AFC performance approaches that of a logarithmic transducer. We show that the power-function exponents of 0.4-0.5 fitted to suprathreshold contrast discrimination data are close enough to zero for the fitted psychometric function to be practically indistinguishable from that of a log transducer. Finally, Weibull β reflects the shape of the noise distribution, and we used our results to assess the recent claim that internal noise has higher kurtosis than a Gaussian.
Our analysis of β for contrast discrimination suggests that, if internal noise is stimulus-independent, it has lower kurtosis than a Gaussian.
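The Weibull psychometric function discussed in this abstract has a standard closed form for 2AFC (guess rate 0.5); the sketch below evaluates it with assumed illustrative threshold and slope values, not parameters fitted in the paper:

```python
import numpy as np

# Standard Weibull psychometric function for 2AFC (guess rate 0.5):
#   P(correct | dx) = 1 - 0.5 * exp(-(dx / alpha)^beta)
# alpha (threshold) and beta ("slope") are illustrative values.
def weibull_2afc(dx, alpha, beta):
    dx = np.asarray(dx, dtype=float)
    return 1.0 - 0.5 * np.exp(-(dx / alpha) ** beta)

dx = np.array([0.25, 0.5, 1.0, 2.0, 4.0])
p_correct = weibull_2afc(dx, alpha=1.0, beta=2.0)
# At dx = alpha the function passes through 1 - 0.5/e by construction.
print(p_correct)
```

Note that β here controls the steepness of the curve on a log-Δx axis, which is the "slope" parameter the theorems decompose into noise and transducer contributions.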
Weissman-Miller, Deborah
2013-11-02
Point estimation is particularly important in predicting weight loss in individuals or small groups. In this analysis, a new health response function is based on a model of human response over time to estimate long-term health outcomes from a change point in short-term linear regression. This important estimation capability is addressed for small groups and single-subject designs in pilot studies for clinical trials, medical and therapeutic clinical practice. These estimations are based on a change point given by parameters derived from short-term participant data in ordinary least squares (OLS) regression. The development of the change point in initial OLS data and the point estimations are given in a new semiparametric ratio estimator (SPRE) model. The new response function is taken as a ratio of two-parameter Weibull distributions times a prior outcome value that steps estimated outcomes forward in time, where the shape and scale parameters are estimated at the change point. The Weibull distributions used in this ratio are derived from a Kelvin model in mechanics taken here to represent human beings. A distinct feature of the SPRE model in this article is that initial treatment response for a small group or a single subject is reflected in long-term response to treatment. This model is applied to weight loss in obesity in a secondary analysis of data from a classic weight loss study, which has been selected due to the dramatic increase in obesity in the United States over the past 20 years. A very small relative error of estimated to test data is shown for obesity treatment with the weight loss medication phentermine or placebo for the test dataset. An application of SPRE in clinical medicine or occupational therapy is to estimate long-term weight loss for a single subject or a small group near the beginning of treatment.
Goodness of fit of probability distributions for sightings as species approach extinction.
Vogel, Richard M; Hosking, Jonathan R M; Elphick, Chris S; Roberts, David L; Reed, J Michael
2009-04-01
Estimating the probability that a species is extinct and the timing of extinctions is useful in biological fields ranging from paleoecology to conservation biology. Various statistical methods have been introduced to infer the time of extinction and extinction probability from a series of individual sightings. There is little evidence, however, as to which of these models provide adequate fit to actual sighting records. We use L-moment diagrams and probability plot correlation coefficient (PPCC) hypothesis tests to evaluate the goodness of fit of various probabilistic models to sighting data collected for a set of North American and Hawaiian bird populations that have either gone extinct, or are suspected of having gone extinct, during the past 150 years. For our data, the uniform, truncated exponential, and generalized Pareto models performed moderately well, but the Weibull model performed poorly. Of the acceptable models, the uniform distribution performed best based on PPCC goodness of fit comparisons and sequential Bonferroni-type tests. Further analyses using field significance tests suggest that although the uniform distribution is the best of those considered, additional work remains to evaluate the truncated exponential model more fully. The methods we present here provide a framework for evaluating subsequent models.
Normal and Extreme Wind Conditions for Power at Coastal Locations in China
Gao, Meng; Ning, Jicai; Wu, Xiaoqing
2015-01-01
In this paper, the normal and extreme wind conditions for power at 12 coastal locations along China’s coastline were investigated. For this purpose, the daily meteorological data measured at the standard 10-m height above ground for periods of 40–62 years are statistically analyzed. The East Asian Monsoon that affects almost China’s entire coastal region is considered as the leading factor determining wind energy resources. For most stations, the mean wind speed is higher in winter and lower in summer. Meanwhile, the wind direction analysis indicates that the prevalent winds in summer are southerly, while those in winter are northerly. The air densities at different coastal locations differ significantly, resulting in the difference in wind power density. The Weibull and lognormal distributions are applied to fit the yearly wind speeds. The lognormal distribution performs better than the Weibull distribution at 8 coastal stations according to two judgement criteria, the Kolmogorov–Smirnov test and absolute error (AE). Regarding the annual maximum extreme wind speed, the generalized extreme value (GEV) distribution performs better than the commonly-used Gumbel distribution. At these southeastern coastal locations, strong winds usually occur in typhoon season. These 4 coastal provinces, that is, Guangdong, Fujian, Hainan, and Zhejiang, which have abundant wind resources, are also prone to typhoon disasters. PMID:26313256
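The distribution comparison described above can be sketched with SciPy. The wind-speed sample below is synthetic (the station records are not reproduced here), and the shape/scale values are assumptions for illustration; the comparison uses the Kolmogorov-Smirnov statistic as in the study:

```python
import numpy as np
from scipy import stats

# Synthetic daily wind speeds (m/s); shape and scale are assumed values.
speeds = stats.weibull_min.rvs(2.0, scale=6.0, size=1000, random_state=1)

# Fit both candidates with location fixed at zero, as is usual for wind speed.
wb_c, _, wb_scale = stats.weibull_min.fit(speeds, floc=0)
ln_s, _, ln_scale = stats.lognorm.fit(speeds, floc=0)

# Smaller KS statistic indicates the better fit.
ks_wb = stats.kstest(speeds, 'weibull_min', args=(wb_c, 0, wb_scale)).statistic
ks_ln = stats.kstest(speeds, 'lognorm', args=(ln_s, 0, ln_scale)).statistic
better = 'Weibull' if ks_wb < ks_ln else 'lognormal'
print(f"KS(Weibull)={ks_wb:.4f}  KS(lognormal)={ks_ln:.4f}  ->  {better}")
```

With real station data either family can win, which is why the study reports the lognormal outperforming the Weibull at 8 of the 12 locations.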
The impacts of precipitation amount simulation on hydrological modeling in Nordic watersheds
NASA Astrophysics Data System (ADS)
Li, Zhi; Brissette, Fancois; Chen, Jie
2013-04-01
Stochastic modeling of daily precipitation is very important for hydrological modeling, especially when no observed data are available. Precipitation is usually modeled with a two-component model: occurrence generation and amount simulation. For occurrence simulation, the most common method is the first-order two-state Markov chain, due to its simplicity and good performance. However, various probability distributions have been reported for simulating precipitation amount, and spatiotemporal differences exist in the applicability of different distribution models. Therefore, assessing the applicability of different distribution models is necessary in order to provide more accurate precipitation information. Six precipitation probability distributions (exponential, Gamma, Weibull, skewed normal, mixed exponential, and hybrid exponential/Pareto distributions) are directly and indirectly evaluated on their ability to reproduce the original observed time series of precipitation amount. Data from 24 weather stations and two watersheds (Chute-du-Diable and Yamaska watersheds) in the province of Quebec (Canada) are used for this assessment. Various indices and statistics, such as the mean, variance, frequency distribution and extreme values, are used to quantify the performance in simulating precipitation and discharge. Performance in reproducing key statistics of the precipitation time series is well correlated with the number of parameters of the distribution function, and the three-parameter precipitation models outperform the other models, with the mixed exponential distribution being the best at simulating daily precipitation. The advantage of using more complex precipitation distributions is not as clear-cut when the simulated time series are used to drive a hydrological model. While the advantage of using functions with more parameters is not nearly as obvious, the mixed exponential distribution appears nonetheless to be the best candidate for hydrological modeling.
The implications of choosing a distribution function with respect to hydrological modeling and climate change impact studies are also discussed.
A fuzzy set approach for reliability calculation of valve controlling electric actuators
NASA Astrophysics Data System (ADS)
Karmachev, D. P.; Yefremov, A. A.; Luneva, E. E.
2017-02-01
Oil and gas equipment, and electric actuators in particular, frequently perform in various operational modes and under dynamic environmental conditions. These factors affect equipment reliability measures in a vague, uncertain way. To eliminate the ambiguity, reliability model parameters can be defined as fuzzy numbers. We suggest a technique that allows constructing fundamental fuzzy-valued performance reliability measures based on an analysis of electric actuator failure data as a function of the amount of work completed before failure, instead of failure time. This paper also provides a computation example of fuzzy-valued reliability and hazard rate functions, assuming the Kumaraswamy complementary Weibull geometric distribution as a lifetime (reliability) model for electric actuators.
Evaluation of portfolio credit risk based on survival analysis for progressive censored data
NASA Astrophysics Data System (ADS)
Jaber, Jamil J.; Ismail, Noriszura; Ramli, Siti Norafidah Mohd
2017-04-01
In credit risk management, the Basel committee provides a choice of three approaches for financial institutions to calculate the required capital: the standardized approach, the Internal Ratings-Based (IRB) approach, and the Advanced IRB approach. The IRB approach is usually preferred over the standardized approach due to its higher accuracy and lower capital charges. This paper uses several parametric models (Exponential, log-normal, Gamma, Weibull, Log-logistic, Gompertz) to evaluate the credit risk of the corporate portfolio in Jordanian banks based on a monthly sample collected from January 2010 to December 2015. The best model is selected using several goodness-of-fit criteria (MSE, AIC, BIC). The results indicate that the Gompertz distribution is the best parametric model for the data.
Influence of the bracket on bonding and physical behavior of orthodontic resin cements.
Bolaños-Carmona, Victoria; Zein, Bilal; Menéndez-Núñez, Mario; Sánchez-Sánchez, Purificación; Ceballos-García, Laura; González-López, Santiago
2015-01-01
The aim of the study is to determine the influence of the type of bracket on the bond strength, microhardness and conversion degree (CD) of four orthodontic resin cements. A micro-tensile bond strength (µTBS) test between the bracket base and the cement was carried out on hourglass-shaped specimens (n=20). Vickers Hardness Number (VHN) and micro-Raman spectra were recorded in situ under the bracket base. The Weibull distribution, ANOVA and non-parametric tests were applied for data analysis (p<0.05). The highest values of η as well as of the β Weibull parameter were obtained for metallic brackets with Transbond™, while plastic brackets with the self-curing cement showed the worst performance. The CD ranged from 80% to 62.5%.
Kim, Do-Kyun; Kim, Soo-Ji; Kang, Dong-Hyun
2017-01-01
In order to assure the microbial safety of drinking water, UVC-LED treatment has emerged as a possible technology to replace the use of conventional low pressure (LP) mercury vapor UV lamps. In this investigation, inactivation of Human Enteric Virus (HuEV) surrogates with UVC-LEDs was investigated in a water disinfection system, and kinetic model equations were applied to depict the surviving infectivities of the viruses. MS2, Qβ, and ΦX 174 bacteriophages were inoculated into sterile distilled water (DW) and irradiated with UVC-LED printed circuit boards (PCBs) (266 nm and 279 nm) or conventional LP lamps. Infectivities of bacteriophages were effectively reduced by up to 7 log after a 9 mJ/cm² treatment for MS2 and Qβ, and 1 mJ/cm² for ΦX 174. UVC-LEDs showed a superior viral inactivation effect compared to conventional LP lamps at the same dose (1 mJ/cm²). Non-log-linear plot patterns were observed, so Weibull, Biphasic, Log linear-tail, and Weibull-tail model equations were used to fit the virus survival curves. For MS2 and Qβ, the Weibull and Biphasic models fit well, with R² values approximately equal to 0.97-0.99, and the Weibull-tail equation accurately described survival of ΦX 174. The level of UV susceptibility among coliphages, measured by the inactivation rate constant k, was statistically different (ΦX 174 (ssDNA) > MS2, Qβ (ssRNA)), and indicated that sensitivity to UV was attributed to the viral genetic material. Copyright © 2016 Elsevier Ltd. All rights reserved.
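A common parameterization of the Weibull survival-curve fit used in such studies is the Mafart form, log10(N/N0) = -(dose/δ)^p. The sketch below fits it with made-up dose-response values, not the measured bacteriophage data:

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative UV doses (mJ/cm^2) and log10 reductions; not study data.
dose = np.array([1.0, 2.0, 4.0, 6.0, 9.0])
log_reduction = np.array([-0.8, -1.9, -3.8, -5.5, -7.0])

# Weibull (Mafart) inactivation model: log10(N/N0) = -(dose/delta)**p
def weibull_log10(d, delta, p):
    return -(d / delta) ** p

(delta, p), _ = curve_fit(weibull_log10, dose, log_reduction, p0=(1.0, 1.0))
print(f"delta = {delta:.2f} mJ/cm^2, p = {p:.2f}")
```

Here δ is the dose for the first log10 reduction and p describes curve shape: p < 1 gives tailing and p > 1 gives shouldering, which is what motivates the tail-model variants named in the abstract.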
The Age Specific Incidence Anomaly Suggests that Cancers Originate During Development
NASA Astrophysics Data System (ADS)
Brody, James P.
The accumulation of genetic alterations causes cancers. Since this accumulation takes time, the incidence of most cancers is thought to increase exponentially with age. However, careful measurements of the age-specific incidence show that the specific incidence for many forms of cancer rises with age to a maximum, and then decreases. This decrease in the age-specific incidence with age is an anomaly. Understanding this anomaly should lead to a better understanding of how tumors develop and grow. Here we derive the shape of the age-specific incidence, showing that it should follow the shape of a Weibull distribution. Measurements indicate that the age-specific incidence for colon cancer does indeed follow a Weibull distribution. This analysis leads to the interpretation that for colon cancer two subpopulations exist in the general population: a susceptible population and an immune population. Colon tumors will only occur in the susceptible population. This analysis is consistent with the developmental origins of disease hypothesis and generalizable to many other common forms of cancer.
CARES/LIFE Ceramics Analysis and Reliability Evaluation of Structures Life Prediction Program
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.
2003-01-01
This manual describes the Ceramics Analysis and Reliability Evaluation of Structures Life Prediction (CARES/LIFE) computer program. The program calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. CARES/LIFE is an extension of the CARES (Ceramic Analysis and Reliability Evaluation of Structures) computer program. The program uses results from MSC/NASTRAN, ABAQUS, and ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker law. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled by using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. The probabilistic time-dependent theories used in CARES/LIFE, along with the input and output for CARES/LIFE, are described. Example problems to demonstrate various features of the program are also included.
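The two-parameter Weibull cumulative distribution function used above to characterize strength variation has a simple closed form. This is a minimal sketch with assumed illustrative values for the Weibull modulus and characteristic strength (they are not CARES/LIFE inputs):

```python
import math

# Two-parameter Weibull CDF for failure probability at stress sigma:
#   Pf(sigma) = 1 - exp(-(sigma / sigma0)^m)
# m (Weibull modulus) and sigma0 (characteristic strength, MPa) are
# assumed illustrative values.
def failure_probability(sigma, m=10.0, sigma0=500.0):
    return 1.0 - math.exp(-((sigma / sigma0) ** m))

# By definition, Pf at the characteristic strength is 1 - 1/e (~0.632).
print(failure_probability(500.0))
```

A higher modulus m means less scatter in strength: at half the characteristic strength the failure probability here is already below 0.1%.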
Reliability analysis of C-130 turboprop engine components using artificial neural network
NASA Astrophysics Data System (ADS)
Qattan, Nizar A.
In this study, we predict the failure rate of the Lockheed C-130 engine turbine. More than thirty years of local operational field data were used for failure rate prediction and validation. The Weibull regression model and artificial neural network models (feed-forward back-propagation, radial basis function neural network, and multilayer perceptron neural network) are utilized to perform this study. For this purpose, the thesis is divided into five major parts. The first part deals with the Weibull regression model to predict the turbine general failure rate, and the rate of failures that require overhaul maintenance. The second part covers the Artificial Neural Network (ANN) model utilizing the feed-forward back-propagation algorithm as a learning rule. The MATLAB package is used to build and design a code to simulate the given data; the inputs to the neural network are the independent variables, and the outputs are the general failure rate of the turbine and the failures which required overhaul maintenance. In the third part we predict the general failure rate of the turbine and the failures which require overhaul maintenance using the radial basis neural network model in the MATLAB toolbox. In the fourth part we compare the predictions of the feed-forward back-propagation model with those of the Weibull regression model and the radial basis neural network model. The results show that the failure rate predicted by the feed-forward back-propagation artificial neural network model is in closer agreement with the radial basis neural network model and with the actual field data than the failure rate predicted by the Weibull model. By the end of the study, we forecast the general failure rate of the Lockheed C-130 engine turbine, the failures which required overhaul maintenance, and six categorical failures using a multilayer perceptron (MLP) neural network model in the DTREG commercial software.
The results also give an insight into the reliability of the engine turbine under actual operating conditions, which can be used by aircraft operators for assessing system and component failures and customizing the maintenance programs recommended by the manufacturer.
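As a rough illustration of the Weibull-regression part of such a study, the sketch below fits a two-parameter Weibull to synthetic time-to-failure data and evaluates the resulting failure rate. The data, the shape and scale values, and the 4000-hour evaluation point are all invented for illustration, not taken from the C-130 dataset.

```python
import numpy as np
from scipy import stats

# Hypothetical turbine time-to-failure data (flight hours); illustrative only.
rng = np.random.default_rng(0)
t = stats.weibull_min.rvs(1.8, scale=5000.0, size=200, random_state=rng)

# Maximum-likelihood fit of a two-parameter Weibull (location fixed at 0).
shape, loc, scale = stats.weibull_min.fit(t, floc=0)

def hazard(x, k, lam):
    """Weibull failure rate h(x) = (k/lam) * (x/lam)**(k-1)."""
    return (k / lam) * (x / lam) ** (k - 1)

rate_at_4000h = hazard(4000.0, shape, scale)
```

A shape parameter above 1, as here, corresponds to a wear-out regime in which the failure rate increases with accumulated hours.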
NASA Technical Reports Server (NTRS)
Leybold, H. A.
1971-01-01
Random numbers were generated with the aid of a digital computer and transformed such that the probability density function of a discrete random load history composed of these random numbers had one of the following non-Gaussian distributions: Poisson, binomial, log-normal, Weibull, and exponential. The resulting random load histories were analyzed to determine their peak statistics and were compared with cumulative peak maneuver-load distributions for fighter and transport aircraft in flight.
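The transformation described, from computer-generated random numbers to a prescribed non-Gaussian load distribution, can be sketched with the inverse-CDF method. The Weibull case is shown below with arbitrary parameters (k = 2, lam = 1), not those of any aircraft load history.

```python
import numpy as np
from math import gamma

rng = np.random.default_rng(42)
u = rng.random(10000)  # uniform(0,1) "random numbers"

# Inverse-CDF transform: if U ~ Uniform(0,1), then
# T = lam * (-ln(1-U))**(1/k) follows a Weibull(k, lam) distribution.
k, lam = 2.0, 1.0
t = lam * (-np.log1p(-u)) ** (1.0 / k)

# Sanity check against the analytic Weibull mean, lam * Gamma(1 + 1/k).
expected_mean = lam * gamma(1.0 + 1.0 / k)
sample_mean = t.mean()
```

The same recipe works for the exponential (k = 1) and, with the appropriate inverse CDFs, for the log-normal; discrete laws such as Poisson and binomial need a table-lookup inverse instead.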
Solar F10.7 radiation - A short term model for Space Station applications
NASA Technical Reports Server (NTRS)
Vedder, John D.; Tabor, Jill L.
1991-01-01
A new method is described for statistically modeling the F10.7 component of solar radiation for 91-day intervals. The resulting model represents this component of the solar flux as a quasi-exponentially correlated, Weibull distributed random variable, and thereby demonstrates excellent agreement with observed F10.7 data. Values of the F10.7 flux are widely used in models of the earth's upper atmosphere because of its high correlation with density fluctuations due to solar heating effects. Because of the direct relation between atmospheric density and drag, a realistic model of the short term fluctuation of the F10.7 flux is important for the design and operation of Space Station Freedom. The method of modeling this flux described in this report should therefore be useful for a variety of Space Station applications.
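A minimal sketch of one way to realize a quasi-exponentially correlated, Weibull-distributed series is to drive an AR(1) Gaussian process through the probability integral transform. The persistence and Weibull parameters below are placeholders, not fitted F10.7 values, and this is only one construction consistent with the abstract's description.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# AR(1) latent Gaussian process gives quasi-exponential autocorrelation;
# its marginal is then mapped onto a Weibull via the probability
# integral transform. Parameter values are illustrative placeholders.
n, phi = 2000, 0.9
z = np.empty(n)
z[0] = rng.standard_normal()
for i in range(1, n):
    z[i] = phi * z[i - 1] + np.sqrt(1 - phi**2) * rng.standard_normal()

u = stats.norm.cdf(z)                               # uniform marginals
flux = stats.weibull_min.ppf(u, 2.5, scale=150.0)   # Weibull marginals

lag1 = float(np.corrcoef(flux[:-1], flux[1:])[0, 1])
```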
Modelling the Ozone-Based Treatments for Inactivation of Microorganisms
Brodowska, Agnieszka Joanna; Nowak, Agnieszka; Kondratiuk-Janyska, Alina; Piątkowski, Marcin; Śmigielski, Krzysztof
2017-01-01
The paper presents the development of a model for ozone treatment in a dynamic bed of different microorganisms (Bacillus subtilis, B. cereus, B. pumilus, Escherichia coli, Pseudomonas fluorescens, Aspergillus niger, Eupenicillium cinnamopurpureum) on heterogeneous matrices (juniper berries, cardamom seeds) initially treated with numerous ozone doses during various contact times. Because microorganisms vary in their susceptibility to ozone, it was of great importance to establish, on the basis of the microbial model, an ozone dose effective enough to preserve food products contaminated with different strains. For this purpose, we chose the Weibull model to describe the survival curves of the different microorganisms. Based on the results of survival modelling after ozone treatment, the strains least susceptible to ozone were selected as the critical ones. Among the tested strains, those of the genus Bacillus were recognized as the most critical. In particular, B. subtilis and B. pumilus showed the highest resistance to ozone treatment, because the time needed to reach the lowest survival level was the longest (up to 17.04 min and 16.89 min for B. pumilus reduction on the juniper berry and cardamom seed matrices, respectively). Ozone treatment inactivated the microorganisms to low survival rates at the applied ozone dose (20.0 g O3/m3 O2, flow rate 0.4 L/min) and contact times (up to 20 min). The results also demonstrated a linear correlation between the parameters p and k of the Weibull distribution, providing an opportunity to calculate a fitted equation for the process. PMID:28991199
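The Weibull survival model used for such curves is commonly written log10(N/N0) = -(t/delta)^p; the sketch below inverts it to get the treatment time for a target log reduction. The delta and p values are chosen arbitrarily rather than fitted to the paper's strains.

```python
def weibull_log_survival(t, delta, p):
    """Weibull survival model: log10(N/N0) = -(t/delta)**p."""
    return -(t / delta) ** p

def time_for_reduction(log_reduction, delta, p):
    """Treatment time needed for a given log10 reduction:
    t = delta * log_reduction**(1/p)."""
    return delta * log_reduction ** (1.0 / p)

# Illustrative parameters only (not fitted to the paper's data).
delta, p = 5.0, 1.4
t_3log = time_for_reduction(3.0, delta, p)
```

With p > 1 the survival curve has a downward concavity (accelerating kill); p < 1 produces tailing, which is why the least susceptible strains drive the required dose.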
NASA Technical Reports Server (NTRS)
Nemeth, Noel
2013-01-01
Models that predict the failure probability of monolithic glass and ceramic components under multiaxial loading have been developed by authors such as Batdorf, Evans, and Matsuo. These "unit-sphere" failure models assume that the strength-controlling flaws are randomly oriented, noninteracting planar microcracks of specified geometry but of variable size. This report develops a formulation to describe the probability density distribution of the orientation of critical strength-controlling flaws that results from an applied load. This distribution is a function of the multiaxial stress state, the shear sensitivity of the flaws, the Weibull modulus, and the strength anisotropy. Examples are provided showing the predicted response on the unit sphere for various stress states for isotropic and transversely isotropic (anisotropic) materials--including the most probable orientation of critical flaws for offset uniaxial loads with strength anisotropy. The author anticipates that this information could be used to determine anisotropic stiffness degradation or anisotropic damage evolution for individual brittle (or quasi-brittle) composite material constituents within finite element or micromechanics-based software.
NASA Astrophysics Data System (ADS)
Sardet, Laure; Patilea, Valentin
When pricing a specific insurance premium, the actuary needs to evaluate the claims cost distribution for the warranty. Traditional actuarial methods use parametric specifications to model the claims distribution, such as lognormal, Weibull, and Pareto laws. Mixtures of such distributions improve the flexibility of the parametric approach and seem quite well adapted to capture the skewness, the long tails, as well as the unobserved heterogeneity among the claims. In this paper, instead of looking for a finely tuned mixture with many components, we choose a parsimonious mixture modeling, typically a two- or three-component mixture. Next, we use the mixture cumulative distribution function (CDF) to transform data into the unit interval, where we apply a beta-kernel smoothing procedure. A bandwidth rule adapted to our methodology is proposed. Finally, the beta-kernel density estimate is back-transformed to recover an estimate of the original claims density. The beta-kernel smoothing provides an automatic fine-tuning of the parsimonious mixture and thus avoids inference in more complex mixture models with many parameters. We investigate the empirical performance of the new method in the estimation of the quantiles with simulated nonnegative data and the quantiles of the individual claims distribution in a non-life insurance application.
NASA Astrophysics Data System (ADS)
Cohen, D.; Michlmayr, G.; Or, D.
2012-04-01
Shearing of dense granular materials appears in many engineering and Earth science applications. Under a constant strain rate, the shearing stress at steady state oscillates with slow rises followed by rapid drops that are linked to the buildup and failure of force chains. Experiments indicate that these drops display exponential statistics. Measurements of acoustic emissions during shearing indicate that the energy liberated by failure of these force chains has power-law statistics. Representing force chains as fibers, we use a stick-slip fiber bundle model to obtain analytical solutions for the statistical distributions of stress drops and failure energy. In the model, fibers stretch, fail, and regain strength during deformation. Fibers have Weibull-distributed threshold strengths with either quenched or annealed disorder. The shapes of the distributions of drops and energy obtained from the model are similar to those measured during shearing experiments. This simple model may be useful to identify failure events linked to force chain failures. Future generalizations of the model that include different types of fiber failure may also allow identification of different types of granular failures that have distinct statistical acoustic emission signatures.
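A toy equal-load-sharing fiber bundle with Weibull thresholds (quenched disorder) can be simulated in a few lines; this is a generic bundle, not the stick-slip model of the abstract, and the Weibull parameters are arbitrary. The empirical bundle strength is compared with the known analytic value for this case.

```python
import numpy as np

rng = np.random.default_rng(7)

# Equal-load-sharing bundle: n fibers with Weibull(k, lam) thresholds.
n, k, lam = 100_000, 2.0, 1.0
thresholds = lam * rng.weibull(k, size=n)

# Constitutive curve: at stretch x, surviving fibers carry load
# sigma(x) = x * P(threshold > x); the peak of this curve is the
# bundle strength.
x = np.sort(thresholds)
load = x * (1.0 - np.arange(1, n + 1) / n)
bundle_strength = float(load.max())

# Analytic ELS strength for Weibull thresholds:
# sigma_c = lam * k**(-1/k) * exp(-1/k).
analytic = lam * k ** (-1.0 / k) * np.exp(-1.0 / k)
```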
Multiscale modeling of porous ceramics using movable cellular automaton method
NASA Astrophysics Data System (ADS)
Smolin, Alexey Yu.; Smolin, Igor Yu.; Smolina, Irina Yu.
2017-10-01
The paper presents a multiscale model for porous ceramics based on the movable cellular automaton method, a particle method in the novel computational mechanics of solids. The initial scale of the proposed approach corresponds to the characteristic size of the smallest pores in the ceramics. At this scale, we model uniaxial compression of several representative samples with an explicit account of pores of the same size but with unique positions in space. As a result, we get the average values of Young's modulus and strength, as well as the parameters of the Weibull distribution of these properties at the current scale level. These data allow us to describe the material behavior at the next scale level, where only the larger pores are considered explicitly, while the influence of small pores is included via the effective properties determined earlier. If the pore size distribution function of the material has N maxima, we need to perform computations for N-1 levels in order to get the properties step by step from the lowest scale up to the macroscale. The proposed approach was applied to modeling zirconia ceramics with a bimodal pore size distribution. The obtained results show correct behavior of the model sample at the macroscale.
Zero expansion glass ceramic ZERODUR® roadmap for advanced lithography
NASA Astrophysics Data System (ADS)
Westerhoff, Thomas; Jedamzik, Ralf; Hartmann, Peter
2013-04-01
The zero expansion glass ceramic ZERODUR® is a well-established material in microlithography in critical components as wafer- and reticle-stages, mirrors and frames in the stepper positioning and alignment system. The very low coefficient of thermal expansion (CTE) and its extremely high CTE homogeneity are key properties to achieve the tight overlay requirements of advanced lithography processes. SCHOTT is continuously improving critical material properties of ZERODUR® essential for microlithography applications according to a roadmap driven by the ever tighter material specifications broken down from the customer roadmaps. This paper will present the SCHOTT Roadmap for ZERODUR® material property development. In the recent years SCHOTT established a physical model based on structural relaxation to describe the coefficient of thermal expansion's temperature dependence. The model is successfully applied for the new expansion grade ZERODUR® TAILORED introduced to the market in 2012. ZERODUR® TAILORED delivers the lowest thermal expansion of ZERODUR® products at microlithography tool application temperature allowing for higher thermal stability for tighter overlay control in IC production. Data will be reported demonstrating the unique CTE homogeneity of ZERODUR® and its very high reproducibility, a necessary precondition for serial production for microlithography equipment components. New data on the bending strength of ZERODUR® proves its capability to withstand much higher mechanical loads than previously reported. Utilizing a three parameter Weibull distribution it is possible to derive minimum strength values for a given ZERODUR® surface treatment. Consequently the statistical uncertainties of the earlier approach based on a two parameter Weibull distribution have been eliminated. Mechanical fatigue due to stress corrosion was included in a straightforward way. 
The derived formulae allow calculation of the lifetime of ZERODUR® components for a given stress load, or of the maximum allowable stress for a minimum required lifetime.
Scale-dependent measurements of meteorite strength: Implications for asteroid fragmentation
NASA Astrophysics Data System (ADS)
Cotto-Figueroa, Desireé; Asphaug, Erik; Garvie, Laurence A. J.; Rai, Ashwin; Johnston, Joel; Borkowski, Luke; Datta, Siddhant; Chattopadhyay, Aditi; Morris, Melissa A.
2016-10-01
Measuring the strengths of asteroidal materials is important for developing mitigation strategies for potential Earth impactors and for understanding properties of in situ materials on asteroids during human and robotic exploration. Studies of asteroid disruption and fragmentation have typically used the strengths determined from terrestrial analog materials, although questions have been raised regarding the suitability of these materials. The few published measurements of meteorite strength are typically significantly greater than those estimated from the stratospheric breakup of meter-sized meteoroids. Given the paucity of relevant strength data, the scale-varying strength properties of meteoritic and asteroidal materials are poorly constrained. Based on our uniaxial failure studies of centimeter-sized cubes of a carbonaceous and an ordinary chondrite, we develop the first Weibull failure distribution analysis of meteorites. This Weibull distribution, projected to meter scales, overlaps the strengths determined from asteroidal airbursts and can be used to predict properties up to the 100 m scale. In addition, our analysis shows that meter-scale boulders on asteroids are significantly weaker than small pieces of meteorites, while large meteorites surviving on Earth are selected by attrition. Further, the common use of terrestrial analog materials to predict scale-dependent strength properties significantly overestimates the strength of meter-sized asteroidal materials and is therefore unlikely to be well suited for the modeling of asteroid disruption and fragmentation. Given the strength scale-dependence determined for carbonaceous and ordinary chondrite meteorites, our results suggest that boulders of similar composition on asteroids will have compressive strengths significantly less than typical terrestrial rocks.
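Projecting strength across scales with a Weibull failure distribution typically uses the weakest-link relation sigma(V) = sigma_ref * (V_ref/V)^(1/m). The sketch below applies it with a hypothetical Weibull modulus m = 4 and an invented reference strength, not the values fitted in the study.

```python
def weibull_size_scaled_strength(sigma_ref, v_ref, v, m):
    """Weakest-link size scaling: sigma(V) = sigma_ref * (v_ref / v)**(1/m).

    Larger volumes sample more (and larger) flaws, so strength falls
    with size; a low Weibull modulus m means strong size dependence.
    """
    return sigma_ref * (v_ref / v) ** (1.0 / m)

# Illustrative: project a 1 cm^3 laboratory strength of 20 MPa to a
# 1 m^3 boulder with a hypothetical modulus m = 4 (volumes in m^3).
sigma_boulder = weibull_size_scaled_strength(20.0, 1e-6, 1.0, 4.0)
```

Under these invented numbers the meter-scale strength drops by a factor of about 30, which is the qualitative point of the abstract: small meteorite specimens overstate boulder-scale strength.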
Upadhyay, S K; Mukherjee, Bhaswati; Gupta, Ashutosh
2009-09-01
Several models for studies related to the tensile strength of materials have been proposed in the literature, where the size or length component is taken to be an important factor in studying the specimens' failure behaviour. An important model, developed on the basis of the cumulative damage approach, is the three-parameter extension of the Birnbaum-Saunders fatigue model that incorporates the size of the specimen as an additional variable. This model is a strong competitor of the commonly used Weibull model and performs better than the traditional models, which do not incorporate the size effect. The paper considers two such cumulative damage models, checks their compatibility with a real dataset, compares them with some of the recent toolkits, and finally recommends the model that appears most appropriate. Throughout, the study is Bayesian, based on Markov chain Monte Carlo simulation.
NASA Technical Reports Server (NTRS)
Trivedi, K. S.; Geist, R. M.
1981-01-01
The CARE 3 reliability model for aircraft avionics and control systems is described by utilizing a number of examples which frequently use state-of-the-art mathematical modeling techniques as a basis for their exposition. Behavioral decomposition followed by aggregation was used in an attempt to deal with reliability models with a large number of states. A comprehensive set of models of the fault-handling processes in a typical fault-tolerant system was used. These models were semi-Markov in nature, thus removing the usual restriction of exponential holding times within the coverage model. The aggregate model is a non-homogeneous Markov chain, thus allowing the times to failure to possess Weibull-like distributions. Because of the departures from traditional models, the solution method employed is that of Kolmogorov integral equations, which are evaluated numerically.
Evidence of Long Range Dependence and Self-similarity in Urban Traffic Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thakur, Gautam S; Helmy, Ahmed; Hui, Pan
2015-01-01
Transportation simulation technologies should accurately model traffic demand, distribution, and assignment parameters for urban environment simulation. These three parameters significantly impact the transportation engineering benchmarking process and are also critical in realizing realistic traffic modeling situations. In this paper, we model and characterize the traffic density distribution of thousands of locations around the world. The traffic densities are generated from millions of images collected over several years and processed using computer vision techniques. The resulting traffic density distribution time series are then analyzed. Goodness-of-fit tests show that the traffic density distributions follow heavy-tail models such as Log-gamma, Log-logistic, and Weibull in over 90% of the analyzed locations. Moreover, a heavy tail gives rise to long-range dependence and self-similarity, which we studied by estimating the Hurst exponent (H). Our analysis based on seven different Hurst estimators strongly indicates that the traffic distribution patterns are stochastically self-similar (0.5 < H < 1.0). We believe this is an important finding that will influence the design and development of the next generation of traffic simulation techniques and also aid in accurately modeling the traffic engineering of urban systems. In addition, it shall provide a much needed input for the development of smart cities.
Hoseini, Mina; Bahrampour, Abbas; Mirzaee, Moghaddameh
2017-02-16
Breast cancer is the most common cancer after lung cancer and the second leading cause of cancer death. In this study we compared Weibull and lognormal cure models with Cox regression on the survival of breast cancer. This retrospective cohort study was conducted on 140 patients suffering from breast cancer referred to Ali Ibn Abitaleb Hospital, Rafsanjan, southeastern Iran, from 2001 to 2015. We determined and analyzed the factors affecting survival with different models using STATA 14. According to AIC, the lognormal model was more consistent than the Weibull. In the multivariable lognormal model, the effective factors included smoking, second-hand smoking, drinking herbal tea, and the duration of the last breast-feeding period. In addition, using Cox regression, the significant factors were disease grade, tumor size, and metastasis (p-value < 0.05). As Rafsanjan is surrounded by pistachio orchards and pesticides are applied by farmers, people of this city are exposed to agricultural pesticides and their harmful consequences. The effect of pesticides on breast cancer was studied, and the results showed that this effect was not in agreement with the models used in this study. Based on the different methods for survival analysis, researchers can decide how to reach a better conclusion. This comparison indicates that the result of the semi-parametric Cox method is closer to clinical experience and evidence.
Nathenson, Manuel; Donnelly-Nolan, Julie M.; Champion, Duane E.; Lowenstern, Jacob B.
2007-01-01
Medicine Lake volcano has had 4 eruptive episodes in its postglacial history (since 13,000 years ago) comprising 16 eruptions. Time intervals between events within the episodes are relatively short, whereas time intervals between the episodes are much longer. An updated radiocarbon chronology for these eruptions is presented that uses paleomagnetic data to constrain the choice of calibrated ages. This chronology is used with exponential, Weibull, and mixed-exponential probability distributions to model the data for time intervals between eruptions. The mixed exponential distribution is the best match to the data and provides estimates for the conditional probability of a future eruption given the time since the last eruption. The probability of an eruption at Medicine Lake volcano in the next year from today is 0.00028.
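The conditional-probability calculation from a mixed exponential distribution can be sketched as follows. The weights and rates are placeholders, not the Medicine Lake fit, so the computed probability is purely illustrative of the method.

```python
import numpy as np

def mixed_exp_survival(t, w, lam1, lam2):
    """Survival function of a two-component mixed exponential:
    S(t) = w*exp(-lam1*t) + (1-w)*exp(-lam2*t)."""
    return w * np.exp(-lam1 * t) + (1.0 - w) * np.exp(-lam2 * t)

def conditional_prob(t_since, dt, w, lam1, lam2):
    """P(event in (t, t+dt] | no event up to t) = 1 - S(t+dt)/S(t)."""
    s = mixed_exp_survival
    return 1.0 - s(t_since + dt, w, lam1, lam2) / s(t_since, w, lam1, lam2)

# Placeholder parameters (per-year rates), not the paper's fitted values:
# a short-interval component (within episodes) and a long one (between).
p = conditional_prob(t_since=900.0, dt=1.0, w=0.7, lam1=1 / 100, lam2=1 / 3000)
```

Because the mixture hazard decreases with elapsed time (the short-interval component is progressively ruled out), the conditional probability of an event in the next year falls as the repose lengthens, which is the qualitative behavior the abstract exploits.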
ZERODUR: bending strength data for etched surfaces
NASA Astrophysics Data System (ADS)
Hartmann, Peter; Leys, Antoine; Carré, Antoine; Kerz, Franca; Westerhoff, Thomas
2014-07-01
In a continuous effort since 2007, a considerable amount of new data and information has been gathered on the bending strength of the extremely low thermal expansion glass ceramic ZERODUR®. By fitting a three-parameter Weibull distribution to the data, it could be shown that for homogeneously ground surfaces minimum breakage stresses exist that lie much higher than the previously applied design limits. In order to achieve even higher allowable stress values, diamond-grain-ground surfaces have been acid etched, a procedure widely accepted as a strength-increasing measure. If surfaces are etched, taking off layers with thicknesses comparable to the maximum micro-crack depth of the preceding grinding process, they also show statistical distributions compatible with a three-parameter Weibull distribution. SCHOTT has performed additional measurement series with etch solutions of variable composition, testing the applicability of this distribution and the possibility of achieving a further increase of the minimum breakage stress. For long-term loading applications, strength change with time and environmental media is important. The parameter needed for prediction calculations, which combines these influences, is the stress corrosion constant. Results from the past differ significantly from each other. On the basis of new investigations, better information will be provided for choosing the best value for the given application conditions.
Wei, Shaoceng; Kryscio, Richard J.
2015-01-01
Continuous-time multi-state stochastic processes are useful for modeling the flow of subjects from intact cognition to dementia with mild cognitive impairment and global impairment as intervening transient, cognitive states and death as a competing risk (Figure 1). Each subject's cognition is assessed periodically resulting in interval censoring for the cognitive states while death without dementia is not interval censored. Since back transitions among the transient states are possible, Markov chains are often applied to this type of panel data. In this manuscript we apply a Semi-Markov process in which we assume that the waiting times are Weibull distributed except for transitions from the baseline state, which are exponentially distributed and in which we assume no additional changes in cognition occur between two assessments. We implement a quasi-Monte Carlo (QMC) method to calculate the higher order integration needed for likelihood estimation. We apply our model to a real dataset, the Nun Study, a cohort of 461 participants. PMID:24821001
Wei, Shaoceng; Kryscio, Richard J
2016-12-01
Continuous-time multi-state stochastic processes are useful for modeling the flow of subjects from intact cognition to dementia with mild cognitive impairment and global impairment as intervening transient cognitive states and death as a competing risk. Each subject's cognition is assessed periodically resulting in interval censoring for the cognitive states while death without dementia is not interval censored. Since back transitions among the transient states are possible, Markov chains are often applied to this type of panel data. In this manuscript, we apply a semi-Markov process in which we assume that the waiting times are Weibull distributed except for transitions from the baseline state, which are exponentially distributed and in which we assume no additional changes in cognition occur between two assessments. We implement a quasi-Monte Carlo (QMC) method to calculate the higher order integration needed for likelihood estimation. We apply our model to a real dataset, the Nun Study, a cohort of 461 participants. © The Author(s) 2014.
Janković, Bojan
2011-10-01
The non-isothermal pyrolysis kinetics of Acetocell (organosolv) and Lignoboost® (kraft) lignins, in an inert atmosphere, have been studied by thermogravimetric analysis. Using isoconversional analysis, it was concluded that the apparent activation energy for all lignins strongly depends on conversion, showing that the pyrolysis of lignins is not a single chemical process. It was identified that the pyrolysis process of Acetocell and Lignoboost® lignin takes place over three reaction steps, which was confirmed by the appearance of the corresponding isokinetic relationships (IKR). It was found that the major pyrolysis stage of both lignins is characterized by stilbene pyrolysis reactions, which were subsequently followed by decomposition reactions of products derived from the stilbene pyrolytic process. It was concluded that the non-isothermal pyrolysis of Acetocell and Lignoboost® lignins can be best described by nth-order (n > 1) reaction kinetics, using the Weibull mixture model (as a distributed reactivity model) with alternating shape parameters. Copyright © 2011 Elsevier Ltd. All rights reserved.
Cai, Jing; Li, Shan; Zhang, Haixin; Zhang, Shuoxin; Tyree, Melvin T
2014-01-01
Vulnerability curves (VCs) can generally be fitted to the Weibull equation; however, a growing number of VCs appear to be recalcitrant, that is, they deviate from a single Weibull but seem to fit dual Weibull curves. We hypothesize that dual Weibull curves in Hippophae rhamnoides L. are due to different vessel diameter classes, inter-vessel hydraulic connections or vessels versus fibre tracheids. We used dye staining techniques, hydraulic measurements and quantitative anatomy measurements to test these hypotheses. The fibres contribute 1.3% of the total stem conductivity, which eliminates the hypothesis that fibre tracheids account for the second Weibull curve. Nevertheless, the staining pattern of vessels and fibre tracheids suggested that fibres might function as a hydraulic bridge between adjacent vessels. We also argue that fibre bridges are safer than vessel-to-vessel pits and put forward this concept as a new paradigm. Hence, we tentatively propose that the first Weibull curve may be accounted for by vessels connected to each other directly by pit fields, while the second Weibull curve is associated with vessels that are connected almost exclusively by fibre bridges. Further research is needed to test the concept of fibre bridge safety in species that have recalcitrant or normal Weibull curves. © 2013 John Wiley & Sons Ltd.
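A dual-Weibull vulnerability curve can be written as a weighted sum of two Weibull components. The sketch below uses hypothetical parameters for an early-cavitating and a more resistant vessel population, purely to illustrate the functional form, not fitted H. rhamnoides data.

```python
import math

def dual_weibull_plc(p, w, b1, c1, b2, c2):
    """Percent loss of conductivity as a weighted sum of two Weibull
    components:
    PLC(P) = 100 * [w*(1 - exp(-(P/b1)**c1)) + (1-w)*(1 - exp(-(P/b2)**c2))].
    """
    f1 = 1.0 - math.exp(-((p / b1) ** c1))
    f2 = 1.0 - math.exp(-((p / b2) ** c2))
    return 100.0 * (w * f1 + (1.0 - w) * f2)

# Hypothetical populations: fraction w cavitates early (scale b1),
# the rest resists to higher tensions (scale b2); pressures in MPa.
plc_at_2mpa = dual_weibull_plc(2.0, w=0.6, b1=1.5, c1=3.0, b2=4.0, c2=5.0)
```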
Langenbucher, Frieder
2003-01-01
MS Excel is a useful tool to handle in vitro/in vivo correlation (IVIVC) distribution functions, with emphasis on the Weibull and the biexponential distribution, which are most useful for the presentation of cumulative profiles, e.g. release in vitro or urinary excretion in vivo, and differential profiles such as the plasma response in vivo. The discussion includes moments (AUC and mean) as summarizing statistics, and data-fitting algorithms for parameter estimation.
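The Weibull CDF used for cumulative release profiles has the form F(t) = 1 - exp(-((t - tlag)/td)^beta). A minimal Python equivalent of such a spreadsheet formula, with arbitrary illustrative parameters, is:

```python
import math

def weibull_release(t, td, beta, tlag=0.0):
    """Cumulative fraction released at time t:
    F(t) = 1 - exp(-((t - tlag)/td)**beta) for t > tlag, else 0."""
    if t <= tlag:
        return 0.0
    return 1.0 - math.exp(-(((t - tlag) / td) ** beta))

# By construction, 63.2% is released at t = tlag + td.
f = weibull_release(t=2.0, td=2.0, beta=1.5)
```

The beta parameter controls the profile's sigmoidicity (beta > 1 gives an S-shaped cumulative curve), while td sets the time scale, which is why this two- to three-parameter form summarizes in vitro release and urinary excretion data so compactly.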
Numerical simulation of the fracture process in ceramic FPD frameworks caused by oblique loading.
Kou, Wen; Qiao, Jiyan; Chen, Li; Ding, Yansheng; Sjögren, Göran
2015-10-01
Using a newly developed three-dimensional (3D) numerical modeling code, an analysis was performed of the fracture behavior in a three-unit ceramic-based fixed partial denture (FPD) framework subjected to oblique loading. All the materials in the study were treated heterogeneously; Weibull's distribution law was applied to the description of the heterogeneity. The Mohr-Coulomb failure criterion with tensile strength cut-off was utilized in judging whether the material was in an elastic or failed state. The simulated loading area was placed either on the buccal or the lingual cusp of a premolar-shaped pontic with the loading direction at 30°, 45°, 60°, 75° or 90° angles to the occlusal surface. The stress distribution, fracture initiation and propagation in the framework during the loading and fracture process were analyzed. This numerical simulation allowed the cause of the framework fracture to be identified as tensile stress failure. The decisive fracture was initiated in the gingival embrasure of the pontic, regardless of whether the buccal or lingual cusp of the pontic was loaded. The stress distribution and fracture propagation process of the framework could be followed step by step from beginning to end. The bearing capacity and the rigidity of the framework vary with the loading position and direction. The framework loaded with 90° towards the occlusal surface has the highest bearing capacity and the greatest rigidity. The framework loaded with 30° towards the occlusal surface has the least rigidity indicating that oblique loading has a major impact on the fracture of ceramic frameworks. Copyright © 2015 Elsevier Ltd. All rights reserved.
Statistical behavior of the tensile property of heated cotton fiber
USDA-ARS?s Scientific Manuscript database
The temperature dependence of the tensile property of single cotton fiber was studied in the range of 160-300°C using Favimat test, and its statistical behavior was interpreted in terms of structural changes. The tenacity of control cotton fiber was well described by the single Weibull distribution,...
Bozkurt, Hayriye; D'Souza, Doris H; Davidson, P Michael
2014-09-01
Human noroviruses and hepatitis A virus (HAV) are considered epidemiologically significant causes of foodborne disease. Therefore, studies are needed to bridge existing data gaps and determine appropriate parameters for thermal inactivation of human noroviruses and HAV. The objectives of this research were to compare the thermal inactivation kinetics of human norovirus surrogates (murine norovirus (MNV-1) and feline calicivirus (FCV-F9)) and HAV in buffered medium (2-ml vials), compare first-order and Weibull models to describe the data, calculate the Arrhenius activation energy for each model, and evaluate model efficiency using selected statistical criteria. The D-values calculated from the first-order model (50-72 °C) ranged from 0.21-19.75 min for FCV-F9, 0.25-36.28 min for MNV-1, and 0.88-56.22 min for HAV. Using the Weibull model, the tD=1 (time to destroy 1 log) for FCV-F9, MNV-1, and HAV at the same temperatures ranged from 0.10-13.27, 0.09-26.78, and 1.03-39.91 min, respectively. The z-values for FCV-F9, MNV-1, and HAV were 9.66 °C, 9.16 °C, and 14.50 °C, respectively, using the Weibull model. For the first-order model, z-values were 9.36 °C, 9.32 °C, and 12.49 °C for FCV-F9, MNV-1, and HAV, respectively. For the Weibull model, estimated activation energies for FCV-F9, MNV-1, and HAV were 225, 278, and 182 kJ/mol, respectively, while the calculated activation energies for the first-order model were 195, 202, and 171 kJ/mol, respectively. Knowledge of the thermal inactivation kinetics of norovirus surrogates and HAV will allow the development of processes that produce safer food products and improve consumer safety. Copyright © 2014. Published by Elsevier Ltd.
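Under the Weibull model, the time to a d-log10 reduction follows directly from the parameters, and a z-value can be computed from the inactivation times at two temperatures. The sketch below uses rounded illustrative numbers in the spirit of the reported ranges, not the paper's exact fits.

```python
import math

def weibull_td(delta, p, d=1.0):
    """Time to achieve a d-log10 reduction under the Weibull model,
    log10 S(t) = -(t/delta)**p  =>  t_d = delta * d**(1/p)."""
    return delta * d ** (1.0 / p)

def z_value(temp1, time1, temp2, time2):
    """z-value (degC per decade of inactivation time):
    z = (temp2 - temp1) / log10(time1 / time2)."""
    return (temp2 - temp1) / math.log10(time1 / time2)

# Illustrative parameters only (delta in min, p dimensionless).
td_50 = weibull_td(delta=10.0, p=1.2)   # time for 1-log kill at 50 degC
td_72 = weibull_td(delta=0.08, p=1.2)   # time for 1-log kill at 72 degC
z = z_value(50.0, td_50, 72.0, td_72)
```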
Fracture mechanics concepts in reliability analysis of monolithic ceramics
NASA Technical Reports Server (NTRS)
Manderscheid, Jane M.; Gyekenyesi, John P.
1987-01-01
Basic design concepts for high-performance, monolithic ceramic structural components are addressed. The design of brittle ceramics differs from that of ductile metals because of the inability of ceramic materials to redistribute high local stresses caused by inherent flaws. Random flaw size and orientation requires that a probabilistic analysis be performed in order to determine component reliability. The current trend in probabilistic analysis is to combine linear elastic fracture mechanics concepts with the two parameter Weibull distribution function to predict component reliability under multiaxial stress states. Nondestructive evaluation supports this analytical effort by supplying data during verification testing. It can also help to determine statistical parameters which describe the material strength variation, in particular the material threshold strength (the third Weibull parameter), which in the past was often taken as zero for simplicity.
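The three-parameter form mentioned, with a threshold strength below which the failure probability vanishes, can be sketched as follows. The threshold, scale, and Weibull modulus are illustrative values, not material data.

```python
import math

def failure_probability(sigma, sigma_u, sigma_0, m):
    """Three-parameter Weibull failure probability:
    Pf = 1 - exp(-((sigma - sigma_u)/sigma_0)**m) for sigma > sigma_u,
    else 0. sigma_u is the threshold strength (third Weibull parameter);
    setting sigma_u = 0 recovers the two-parameter form.
    """
    if sigma <= sigma_u:
        return 0.0
    return 1.0 - math.exp(-(((sigma - sigma_u) / sigma_0) ** m))

# Illustrative values (MPa): below the threshold, predicted Pf is zero.
pf_below = failure_probability(40.0, 50.0, 300.0, 8.0)
pf_above = failure_probability(350.0, 50.0, 300.0, 8.0)
```

Taking sigma_u = 0 "for simplicity", as the abstract notes was common practice, is conservative: it assigns a nonzero failure probability even to stresses a threshold model would deem safe.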
Effect of bending on the room-temperature tensile strengths of structural ceramics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jenkins, M.G.
1992-01-01
Results for nearly fifty room-temperature tensile tests conducted on two advanced, monolithic silicon nitride ceramics are evaluated for the effects of bending and the application of various Weibull statistical analyses. Two specimen gripping systems (straight collet and tapered collet) were evaluated both for success in producing gage-section failures and for their tendency to minimize bending at failure. Specimen fabrication and grinding technique considerations are briefly reviewed and related to their effects on successful tensile tests. Ultimate tensile strengths are related to the bending measured at specimen failure, and the effects of the gripping system on bending are discussed. Finally, comparisons are made between the use of censored and uncensored data sample sets for determining the maximum likelihood estimates of the Weibull parameters from the tensile strength distributions.
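A sketch of the censored-versus-uncensored point: in the Weibull log-likelihood, failures contribute the density while censored specimens (e.g. non-gage-section failures) contribute only the survival function. The data, censoring fraction, and true parameters below are synthetic.

```python
import numpy as np
from scipy import optimize

rng = np.random.default_rng(3)

# Synthetic strengths (MPa); a random ~20% are treated as right-censored,
# standing in for specimens that broke outside the gage section.
m_true, s0_true = 10.0, 600.0
strengths = s0_true * rng.weibull(m_true, size=50)
censored = rng.random(50) < 0.2

def neg_loglik(params):
    """Negative log-likelihood for a two-parameter Weibull with
    right-censoring: failures use the pdf, censored points the survival."""
    m, s0 = params
    if m <= 0 or s0 <= 0:
        return np.inf
    z = (strengths / s0) ** m
    ll_fail = np.log(m / s0) + (m - 1) * np.log(strengths / s0) - z
    ll_cens = -z
    return -np.sum(np.where(censored, ll_cens, ll_fail))

res = optimize.minimize(neg_loglik, x0=[5.0, 500.0], method="Nelder-Mead")
m_hat, s0_hat = res.x
```

Dropping the censored observations instead of including their survival terms biases the fitted Weibull modulus, which is the comparison the abstract describes.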
Abutment design for implant-supported indirect composite molar crowns: reliability and fractography.
Bonfante, Estevam Augusto; Suzuki, Marcelo; Lubelski, William; Thompson, Van P; de Carvalho, Ricardo Marins; Witek, Lukasz; Coelho, Paulo G
2012-12-01
To investigate the reliability of titanium abutments veneered with indirect composites for implant-supported crowns and the feasibility of tracing the fracture origin by qualitative fractographic analysis. Large base (LB) (6.4-mm diameter base, with a 4-mm high cone in the center for composite retention), small base (SB-4) (5.2-mm base, 4-mm high cone), and small base with cone shortened to 2 mm (SB-2) Ti abutments were used. Each abutment received incremental layers of indirect resin composite until completing the anatomy of a maxillary molar crown. Step-stress accelerated-life fatigue testing (n = 18 each) was performed in water. Weibull curves with use stress of 200 N for 50,000 and 100,000 cycles were calculated. Probability Weibull plots examined the differences between groups. Specimens were inspected in light-polarized and scanning electron microscopes for fractographic analysis. Use level probability Weibull plots showed Beta values of 0.27 for LB, 0.32 for SB-4, and 0.26 for SB-2, indicating that failures were not influenced by fatigue and damage accumulation. The data replotted as Weibull distribution showed no significant difference in the characteristic strengths between LB (794 N) and SB-4 abutments (836 N), which were both significantly higher than SB-2 (601 N). Failure mode was cohesive within the composite for all groups. Fractographic markings showed that failures initiated at the indentation area and propagated toward the margins of cohesively failed composite. Reliability was not influenced by abutment design. Qualitative fractographic analysis of the failed indirect composite was feasible. © 2012 by the American College of Prosthodontists.
Thelen, Kirstin; Coboeken, Katrin; Willmann, Stefan; Dressman, Jennifer B; Lippert, Jörg
2012-03-01
The physiological absorption model presented in part I of this work is now extended to account for dosage-form-dependent gastrointestinal (GI) transit as well as disintegration and dissolution processes of various immediate-release and modified-release dosage forms. Empirical functions of the Weibull type were fitted to experimental in vitro dissolution profiles of solid dosage forms for eight test compounds (aciclovir, caffeine, cimetidine, diclofenac, furosemide, paracetamol, phenobarbital, and theophylline). The Weibull functions were then implemented into the model to predict mean plasma concentration-time profiles of the various dosage forms. On the basis of these dissolution functions, pharmacokinetics (PK) of six model drugs was predicted well. In the case of diclofenac, deviations between predicted and observed plasma concentrations were attributable to the large variability in gastric emptying time of the enteric-coated tablets. Likewise, oral PK of furosemide was found to be predominantly governed by the gastric emptying patterns. It is concluded that the revised model for GI transit and absorption was successfully integrated with dissolution functions of the Weibull type, enabling prediction of in vivo PK profiles from in vitro dissolution data. It facilitates a comparative analysis of the parameters contributing to oral drug absorption and is thus a powerful tool for formulation design. Copyright © 2011 Wiley Periodicals, Inc.
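A common form of the Weibull dissolution function is F(t) = Fmax·(1 − exp(−(t/τ)^b)). The sketch below fits that form to a hypothetical in vitro dissolution profile; the data points are invented for illustration and do not come from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_release(t, fmax, tau, b):
    """Cumulative amount dissolved at time t (Weibull dissolution function)."""
    return fmax * (1.0 - np.exp(-(t / tau) ** b))

# hypothetical in vitro profile: time in minutes, percent dissolved
t_obs = np.array([5, 10, 15, 30, 45, 60, 90, 120], float)
f_obs = np.array([12, 28, 41, 68, 82, 89, 95, 97], float)

popt, _ = curve_fit(weibull_release, t_obs, f_obs, p0=[100.0, 30.0, 1.0])
fmax, tau, b = popt
```

The fitted function can then serve as the dissolution input of an absorption model, which is the role the Weibull functions play in the work described above.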
A Modeling Approach to Fiber Fracture in Melt Impregnation
NASA Astrophysics Data System (ADS)
Ren, Feng; Zhang, Cong; Yu, Yang; Xin, Chunling; Tang, Ke; He, Yadong
2017-02-01
The effect of process variables such as roving pulling speed, melt temperature and number of pins on fiber fracture during the processing of thermoplastic-based composites was investigated in this study. Melt impregnation was used to produce the continuous glass fiber reinforced thermoplastic composites. Previous investigators have suggested a variety of models for melt impregnation, while comparatively little effort has been spent on modeling the fiber fracture caused by the viscous resin. Herein, a mathematical model was developed for the impregnation process to predict the fiber fracture rate and describe the experimental results with the Weibull intensity distribution function. The optimal parameters of this process were obtained by an orthogonal experiment. The results suggest that fiber fracture is caused by viscous shear stress on the fiber bundle in the melt impregnation mold when pulling the fiber bundle.
Quantifying the impact of sub-grid surface wind variability on sea salt and dust emissions in CAM5
NASA Astrophysics Data System (ADS)
Zhang, Kai; Zhao, Chun; Wan, Hui; Qian, Yun; Easter, Richard C.; Ghan, Steven J.; Sakaguchi, Koichi; Liu, Xiaohong
2016-02-01
This paper evaluates the impact of sub-grid variability of surface wind on sea salt and dust emissions in the Community Atmosphere Model version 5 (CAM5). The basic strategy is to calculate emission fluxes multiple times, using different wind speed samples of a Weibull probability distribution derived from model-predicted grid-box mean quantities. In order to derive the Weibull distribution, the sub-grid standard deviation of surface wind speed is estimated by taking into account four mechanisms: turbulence under neutral and stable conditions, dry convective eddies, moist convective eddies over the ocean, and air motions induced by mesoscale systems and fine-scale topography over land. The contributions of turbulence and dry convective eddy are parameterized using schemes from the literature. Wind variabilities caused by moist convective eddies and fine-scale topography are estimated using empirical relationships derived from an operational weather analysis data set at 15 km resolution. The estimated sub-grid standard deviations of surface wind speed agree well with reference results derived from 1 year of global weather analysis at 15 km resolution and from two regional model simulations with 3 km grid spacing. The wind-distribution-based emission calculations are implemented in CAM5. In terms of computational cost, the increase in total simulation time turns out to be less than 3 %. Simulations at 2° resolution indicate that sub-grid wind variability has relatively small impacts (about 7 % increase) on the global annual mean emission of sea salt aerosols, but considerable influence on the emission of dust. Among the considered mechanisms, dry convective eddies and mesoscale flows associated with topography are major causes of dust emission enhancement. With all the four mechanisms included and without additional adjustment of uncertain parameters in the model, the simulated global and annual mean dust emission increases by about 50 % compared to the default model.
By tuning the globally constant dust emission scale factor, the global annual mean dust emission, aerosol optical depth, and top-of-atmosphere radiative fluxes can be adjusted to the level of the default model, but the frequency distribution of dust emission changes, with more contribution from weaker wind events and less contribution from stronger wind events. In Africa and Asia, the overall frequencies of occurrence of dust emissions increase, and the seasonal variations are enhanced, while the geographical patterns of the emission frequency show little change.
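The core sampling step, drawing wind speeds from a Weibull distribution whose parameters come from the grid-box mean and the estimated sub-grid standard deviation, can be sketched as follows. The method-of-moments inversion and the u^3.41 dust-flux exponent are common choices in the literature but are assumptions here, not details taken from the paper.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.special import gamma

def weibull_from_mean_std(mean, std):
    """Method-of-moments Weibull (shape k, scale c) for a wind-speed cell,
    given the grid-box mean speed and sub-grid standard deviation."""
    cov = std / mean

    def f(k):  # coefficient of variation of Weibull(k) minus the target
        return np.sqrt(gamma(1 + 2 / k) / gamma(1 + 1 / k) ** 2 - 1) - cov

    k = brentq(f, 0.2, 20.0)
    c = mean / gamma(1 + 1 / k)
    return k, c

# average a nonlinear emission flux F ∝ u**3.41 (an illustrative dust-flux
# exponent) over the sub-grid wind distribution instead of the mean wind
k, c = weibull_from_mean_std(mean=8.0, std=3.0)
rng = np.random.default_rng(1)
u = c * rng.weibull(k, 100_000)
flux_mean_wind = 8.0 ** 3.41          # flux computed from the mean wind only
flux_distributed = np.mean(u ** 3.41)  # flux averaged over the distribution
```

Because the flux is convex in wind speed, averaging over the distribution yields a larger flux than using the grid-box mean alone, which is the mechanism behind the emission enhancements reported above.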
Statistical damage constitutive model for rocks subjected to cyclic stress and cyclic temperature
NASA Astrophysics Data System (ADS)
Zhou, Shu-Wei; Xia, Cai-Chu; Zhao, Hai-Bin; Mei, Song-Hua; Zhou, Yu
2017-10-01
A constitutive model of rocks subjected to cyclic stress-temperature was proposed. Based on statistical damage theory, the damage constitutive model with Weibull distribution was extended. Influence of model parameters on the stress-strain curve for rock reloading after stress-temperature cycling was then discussed. The proposed model was initially validated by rock tests for cyclic stress-temperature and only cyclic stress. Finally, the total damage evolution induced by stress-temperature cycling and reloading after cycling was explored and discussed. The proposed constitutive model is reasonable and applicable, describing well the stress-strain relationship during stress-temperature cycles and providing a good fit to the test results. Elastic modulus in the reference state and the damage induced by cycling affect the shape of reloading stress-strain curve. Total damage induced by cycling and reloading after cycling exhibits three stages: initial slow increase, mid-term accelerated increase, and final slow increase.
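The Weibull statistical damage formulation that this model family extends is commonly written with a damage variable D(ε) = 1 − exp(−(ε/F0)^m) and nominal stress σ = E(1 − D)ε. The sketch below is that generic base form, not the authors' extended cyclic stress-temperature model, and the parameter values are illustrative.

```python
import numpy as np

def damage_stress_strain(eps, E, m, F0):
    """Weibull statistical damage model: micro-element strengths follow a
    Weibull law in strain, giving damage D and the nominal stress.

    E  : elastic modulus in the reference (undamaged) state
    m  : Weibull shape parameter (heterogeneity of micro-element strength)
    F0 : Weibull scale parameter (characteristic failure strain)
    """
    D = 1.0 - np.exp(-((eps / F0) ** m))
    sigma = E * (1.0 - D) * eps
    return D, sigma

# illustrative stress-strain curve: E in MPa, strain dimensionless
eps = np.linspace(1e-4, 0.02, 400)
D, sigma = damage_stress_strain(eps, E=20e3, m=3.0, F0=0.008)
```

The curve rises to a peak and then softens as accumulated damage outpaces elastic loading, which is the qualitative shape the cyclic model must reproduce on reloading.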
Modeling of pathogen survival during simulated gastric digestion.
Koseki, Shige; Mizuno, Yasuko; Sotome, Itaru
2011-02-01
The objective of the present study was to develop a mathematical model of pathogenic bacterial inactivation kinetics in a gastric environment in order to further understand a part of the infectious dose-response mechanism. The major bacterial pathogens Listeria monocytogenes, Escherichia coli O157:H7, and Salmonella spp. were examined by using simulated gastric fluid adjusted to various pH values. To correspond to the various pHs in a stomach during digestion, a modified logistic differential equation model and the Weibull differential equation model were examined. The specific inactivation rate for each pathogen was successfully described by a square-root model as a function of pH. The square-root models were combined with the modified logistic differential equation to obtain a complete inactivation curve. Both the modified logistic and Weibull models provided a highly accurate fitting of the static pH conditions for every pathogen. However, while the residual plots of the modified logistic model indicated no systematic bias and/or regional prediction problems, the residual plots of the Weibull model showed a systematic bias. The modified logistic model appropriately predicted the pathogen behavior in the simulated gastric digestion process with actual food, including cut lettuce, minced tuna, hamburger, and scrambled egg. Although the developed model enabled us to predict pathogen inactivation during gastric digestion, its results also suggested that the ingested bacteria in the stomach would barely be inactivated in the real digestion process. The results of this study will provide important information on a part of the dose-response mechanism of bacterial pathogens.
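The Weibull inactivation model compared in the study is usually written as log10(N/N0) = −(t/δ)^p, where δ is the time for the first decimal reduction and p controls the curve shape. A minimal sketch with hypothetical parameter values (not fitted values from the study):

```python
import numpy as np

def weibull_survival_log10(t, delta, p):
    """Weibull inactivation model: log10 reduction of survivors at time t.

    delta : time for the first log10 reduction (scale)
    p     : shape; p < 1 gives tailing, p > 1 gives shouldered curves
    """
    return -((t / delta) ** p)

# hypothetical parameters for a low-pH gastric condition, time in minutes
t = np.linspace(0, 120, 25)
log_surv = weibull_survival_log10(t, delta=40.0, p=0.8)
```

A static-pH curve like this is the building block that the differential-equation form generalizes when pH varies during digestion.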
Krueger, Ute; Schimmelpfeng, Katja
2013-03-01
A sufficient staffing level in fire and rescue dispatch centers is crucial for saving lives. Therefore, it is important to estimate the expected workload properly. For this purpose, we analyzed whether a dispatch center can be considered as a call center. Current call center publications very often model call arrivals as a non-homogeneous Poisson process. This rests on the assumption that each caller decides independently whether to call. In case of an emergency, however, there are often calls from more than one person reporting the same incident and thus, these calls are not independent. Therefore, this paper focuses on the dependency of calls in a fire and rescue dispatch center. We analyzed and evaluated several distributions in this setting. Results are illustrated using real-world data collected from a typical German dispatch center in Cottbus ("Leitstelle Lausitz"). We identified the Pólya distribution as being superior to the Poisson distribution in describing the call arrival rate and the Weibull distribution to be more suitable than the exponential distribution for interarrival times and service times. However, the commonly used distributions offer acceptable approximations. This is important for estimating a sufficient staffing level in practice using, e.g., the Erlang-C model.
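The distribution comparison for interarrival times can be reproduced in outline: fit a Weibull and an exponential model to the observed gaps and compare goodness of fit. The sketch below uses synthetic data (not the Cottbus data), and the Kolmogorov-Smirnov statistic as the comparison criterion is an assumption, not the paper's stated method.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# hypothetical interarrival times (minutes) with more short gaps than an
# exponential law allows -- synthesized here from a Weibull with shape < 1
interarrival = stats.weibull_min.rvs(0.8, scale=2.0, size=2000, random_state=rng)

# Weibull MLE with location fixed at zero
shape, loc, scale = stats.weibull_min.fit(interarrival, floc=0)
# exponential MLE: the scale is simply the sample mean
exp_scale = interarrival.mean()

ks_weib = stats.kstest(interarrival, "weibull_min", args=(shape, loc, scale)).statistic
ks_expo = stats.kstest(interarrival, "expon", args=(0, exp_scale)).statistic
```

A fitted shape below 1 indicates burstier arrivals than a Poisson process would produce, consistent with multiple callers reporting the same incident.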
Reliability demonstration test for load-sharing systems with exponential and Weibull components
Xu, Jianyu; Hu, Qingpei; Yu, Dan; Xie, Min
2017-01-01
Conducting a Reliability Demonstration Test (RDT) is a crucial step in production. Products are tested under certain schemes to demonstrate whether their reliability indices reach pre-specified thresholds. Test schemes for RDT have been studied in different situations, e.g., lifetime testing, degradation testing and accelerated testing. Systems designed with several structures are also investigated in many RDT plans. Despite the availability of a range of test plans for different systems, RDT planning for load-sharing systems hasn't yet received the attention it deserves. In this paper, we propose a demonstration method for two specific types of load-sharing systems with components subject to two distributions: exponential and Weibull. Based on the assumptions and interpretations made in several previous works on such load-sharing systems, we set the mean time to failure (MTTF) of the total system as the demonstration target. We represent the MTTF as a summation of mean time between successive component failures. Next, we introduce generalized test statistics for both the underlying distributions. Finally, RDT plans for the two types of systems are established on the basis of these test statistics.
Modelling volatility recurrence intervals in the Chinese commodity futures market
NASA Astrophysics Data System (ADS)
Zhou, Weijie; Wang, Zhengxin; Guo, Haiming
2016-09-01
The laws governing extreme event occurrence attract much research. The volatility recurrence intervals of Chinese commodity futures market prices are studied: the results show that the probability distributions of the scaled volatility recurrence intervals have a uniform scaling curve for different thresholds q. So we can deduce the probability distribution of extreme events from normal events. The tail of the scaling curve is well fitted by a Weibull form, which is significance-tested by KS measures. Both short-term and long-term memories are present in the recurrence intervals for different thresholds q, which indicates that the recurrence intervals can be predicted. In addition, similar to volatility, volatility recurrence intervals also exhibit clustering. Through Monte Carlo simulation, we artificially synthesise ARMA and GARCH-class sequences similar to the original data, and identify the cause of the clustering. The larger the parameter d of the FIGARCH model, the stronger the clustering effect is. Finally, we use the Fractionally Integrated Autoregressive Conditional Duration (FIACD) model to analyse the recurrence interval characteristics. The results indicate that the FIACD model may provide a method to analyse volatility recurrence intervals.
NASA Astrophysics Data System (ADS)
Kalkanis, G.; Rosso, E.
1989-09-01
Results of an accelerated test on the lifetime of a Mylar-polyurethane laminated dc high voltage insulating structure are reported. This structure consists of Mylar ribbons placed side by side in a number of layers, staggered and glued together with a polyurethane adhesive. The lifetime until breakdown as a function of extremely high values of voltage stress is measured and represented by a mathematical model, the inverse power law model with a 2-parameter Weibull lifetime distribution. The statistical treatment of the data (either by graphical or by analytical methods) allowed us to estimate the lifetime distribution and confidence bounds for any required normal voltage stress. The laminated structure under consideration is, according to the analysis, a very reliable dc high-voltage insulating material, with a very good life performance according to the inverse power law model, and with an exponent of voltage stress equal to 6. A large insulator of cylindrical shape with this kind of laminated structure can be constructed by winding helically a Mylar ribbon in a number of layers.
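Under the inverse power law model, the Weibull characteristic life scales as V^−n, so a life measured at an accelerated stress extrapolates to a use stress by a simple ratio. A sketch using the reported exponent n = 6 and hypothetical test numbers (the 100 h characteristic life and 3x voltage are illustrative, not data from the test):

```python
def weibull_scale_at_voltage(eta_test, v_test, v_use, n=6.0):
    """Inverse power law life model: the Weibull scale (characteristic life)
    varies as V**-n, so life at the use stress follows from one test stress.

    n = 6 is the voltage-stress exponent reported for the laminate.
    """
    return eta_test * (v_test / v_use) ** n

# hypothetical accelerated test: characteristic life 100 h at 3x rated voltage
eta_use = weibull_scale_at_voltage(eta_test=100.0, v_test=3.0, v_use=1.0)
```

With n = 6, a threefold overstress shortens life by a factor of 3^6 = 729, which is what makes modest accelerated tests informative about service lifetimes.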
Development of a Fault Monitoring Technique for Wind Turbines Using a Hidden Markov Model.
Shin, Sung-Hwan; Kim, SangRyul; Seo, Yun-Ho
2018-06-02
Regular inspection for the maintenance of wind turbines is difficult because of their remote locations. For this reason, condition monitoring systems (CMSs) are typically installed to monitor their health condition. The purpose of this study is to propose a fault detection algorithm for the mechanical parts of the wind turbine. To this end, long-term vibration data were collected over two years by a CMS installed on a 3 MW wind turbine. The vibration distribution at a specific rotating speed of the main shaft is approximated by a Weibull distribution, and its cumulative distribution function is used to determine the threshold levels that indicate impending failure of mechanical parts. A hidden Markov model (HMM) is employed to build a statistical fault detection algorithm in the time domain, and a method for extracting the HMM input sequence from the threshold levels and the correlation between the signals is also introduced. Finally, it was demonstrated that the proposed HMM algorithm achieved a greater than 95% detection success rate by using the long-term signals.
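The threshold-setting step, inverting the fitted Weibull cumulative distribution function at a chosen probability, can be sketched as follows; the percentile and any parameter values are illustrative assumptions, not values from the study.

```python
import math

def weibull_alarm_threshold(k, c, p=0.99):
    """Vibration level exceeded with probability 1 - p under the fitted
    Weibull distribution: the p-quantile c * (-ln(1 - p))**(1/k).

    Levels above this threshold are treated as candidate fault indications
    feeding the detection algorithm's input sequence.
    """
    return c * (-math.log(1.0 - p)) ** (1.0 / k)
```

For k = 1 the formula reduces to the familiar exponential quantile, a quick sanity check on an implementation.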
Gravitational Effects on Closed-Cellular-Foam Microstructure
NASA Technical Reports Server (NTRS)
Noever, David A.; Cronise, Raymond J.; Wessling, Francis C.; McMannus, Samuel P.; Mathews, John; Patel, Darayas
1996-01-01
Polyurethane foam has been produced in low gravity for the first time. The cause and distribution of different void or pore sizes are elucidated from direct comparison of unit-gravity and low-gravity samples. Low gravity is found to increase the pore roundness by 17% and reduce the void size by 50%. The standard deviation for pores becomes narrower (a more homogeneous foam is produced) in low gravity. Both a Gaussian and a Weibull model fail to describe the statistical distribution of void areas, and hence the governing dynamics do not combine small voids in either a uniform or a dependent fashion to make larger voids. Instead, the void areas follow an exponential law, which effectively randomizes the production of void sizes in a nondependent fashion consistent more with single nucleation than with multiple or combining events.
Pricing of premiums for equity-linked life insurance based on joint mortality models
NASA Astrophysics Data System (ADS)
Riaman; Parmikanti, K.; Irianingsih, I.; Supian, S.
2018-03-01
Equity-linked life insurance is a financial product that offers not only protection but also investment. The calculation of equity-linked life insurance premiums generally uses mortality tables. Because of advances in medical technology and reduced birth rates, the use of mortality tables has become less relevant in premium calculation. To overcome this problem, we use a combined mortality model, determined in this study from the 2011 Indonesian Mortality Table, to obtain the probabilities of death and survival. The combined mortality model is built from the Weibull, inverse-Weibull, and Gompertz mortality models. After determining the combined mortality model, we numerically compute the claim value and the premium. By calculating equity-linked life insurance premiums accurately, it is expected that no party will be disadvantaged by inaccurate calculation results.
NASA Astrophysics Data System (ADS)
Sun, Huarui; Bajo, Miguel Montes; Uren, Michael J.; Kuball, Martin
2015-01-01
Gate leakage degradation of AlGaN/GaN high electron mobility transistors under OFF-state stress is investigated using a combination of electrical, optical, and surface morphology characterizations. The generation of leakage "hot spots" at the edge of the gate is found to be strongly temperature accelerated. The time for the formation of each failure site follows a Weibull distribution with a shape parameter in the range of 0.7-0.9 from room temperature up to 120 °C. The average leakage per failure site is only weakly temperature dependent. The stress-induced structural degradation at the leakage sites exhibits a temperature dependence in the surface morphology, which is consistent with a surface defect generation process involving temperature-associated changes in the breakdown sites.
Probability distribution functions for unit hydrographs with optimization using genetic algorithm
NASA Astrophysics Data System (ADS)
Ghorbani, Mohammad Ali; Singh, Vijay P.; Sivakumar, Bellie; H. Kashani, Mahsa; Atre, Atul Arvind; Asadi, Hakimeh
2017-05-01
A unit hydrograph (UH) of a watershed may be viewed as the unit pulse response function of a linear system. In recent years, the use of probability distribution functions (pdfs) for determining a UH has received much attention. In this study, a nonlinear optimization model is developed to transmute a UH into a pdf. The potential of six popular pdfs, namely two-parameter gamma, two-parameter Gumbel, two-parameter log-normal, two-parameter normal, three-parameter Pearson distribution, and two-parameter Weibull is tested on data from the Lighvan catchment in Iran. The probability distribution parameters are determined using the nonlinear least squares optimization method in two ways: (1) optimization by programming in Mathematica; and (2) optimization by applying genetic algorithm. The results are compared with those obtained by the traditional linear least squares method. The results show comparable capability and performance of two nonlinear methods. The gamma and Pearson distributions are the most successful models in preserving the rising and recession limbs of the unit hydrographs. The log-normal distribution has a high ability in predicting both the peak flow and time to peak of the unit hydrograph. The nonlinear optimization method does not outperform the linear least squares method in determining the UH (especially for excess rainfall of one pulse), but is comparable.
Modelling explicit fracture of nuclear fuel pellets using peridynamics
NASA Astrophysics Data System (ADS)
Mella, R.; Wenman, M. R.
2015-12-01
Three dimensional models of explicit cracking of nuclear fuel pellets for a variety of power ratings have been explored with peridynamics, a non-local, mesh free, fracture mechanics method. These models were implemented in the explicitly integrated molecular dynamics code LAMMPS, which was modified to include thermal strains in solid bodies. The models of fuel fracture, during initial power transients, are shown to correlate with the mean number of cracks observed on the inner and outer edges of the pellet, by experimental post irradiation examination of fuel, for power ratings of 10 and 15 W g-1 UO2. The models of the pellet show the ability to predict expected features such as the mid-height pellet crack, the correct number of radial cracks and initiation and coalescence of radial cracks. This work presents a modelling alternative to empirical fracture data found in many fuel performance codes and requires just one parameter of fracture strain. Weibull distributions of crack numbers were fitted to both numerical and experimental data using maximum likelihood estimation so that statistical comparison could be made. The findings show P-values of less than 0.5% suggesting an excellent agreement between model and experimental distributions.
Does artificial aging affect mechanical properties of CAD/CAM composite materials?
Egilmez, Ferhan; Ergun, Gulfem; Cekic-Nagas, Isil; Vallittu, Pekka K; Lassila, Lippo V J
2018-01-01
The purpose of this study was to determine the flexural strength and Weibull characteristics of different CAD/CAM materials after different in vitro aging conditions. The specimens were randomly assigned to one of six in vitro aging conditions: (1) water storage (37°C, 3 weeks), (2) boiling water (24 h), (3) hydrochloric acid exposure (pH 1.2, 24 h), (4) autoclave treatment (134°C, 200 kPa, 12 h), (5) thermal cycling (5000 times, 5-55°C), (6) cyclic loading (100 N, 50,000 cycles). No treatment was applied to the specimens in the control group. A three-point bending test was used for the calculation of flexural strength. The reliability of the strength was assessed by the Weibull distribution. Surface roughness and topography were examined by coherence scanning interferometry. The evaluated parameters were compared using the Kruskal-Wallis or Mann-Whitney U test. Water storage, autoclave treatment and thermal cycling significantly decreased the flexural strength of all materials (p<0.05), whereas HCl exposure or cyclic loading did not affect the properties (p>0.05). Weibull moduli of Cerasmart™ and Lava™ Ultimate were similar to the control. Vita Enamic® exhibited similar Weibull moduli in all aging groups except the HCl-treated group (p>0.05). Ra values of Cerasmart™ and Lava™ Ultimate were in the range of 0.053-0.088 μm in the aged groups. However, Ra results of Vita Enamic® were larger than 0.2 μm. The flexural strength of newly developed restorative CAD/CAM materials was significantly decreased by artificial aging. Cyclic loading or HCl exposure did not affect the flexural strength and structural reliability of Cerasmart™ and Lava™ Ultimate. Copyright © 2017 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
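Estimating a Weibull modulus from flexural-strength data is commonly done by median-rank regression on the linearized Weibull CDF; a minimal sketch follows. The strength values and parameters are illustrative (generated exactly from an assumed modulus so the regression can be checked), not the study's measurements.

```python
import math

m_true, sigma0 = 10.0, 900.0   # hypothetical Weibull modulus and characteristic strength (MPa)
n = 15
# Median-rank style plotting positions; strengths generated exactly from the
# Weibull model so the fitted slope can be checked against the known modulus.
p = [(i - 0.5) / n for i in range(1, n + 1)]
strengths = [sigma0 * (-math.log(1 - pi)) ** (1 / m_true) for pi in p]

# Weibull plot: ln(-ln(1-F)) versus ln(strength); the slope is the modulus m.
xs = [math.log(s) for s in strengths]
ys = [math.log(-math.log(1 - pi)) for pi in p]
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
scale = math.exp(-(ybar - slope * xbar) / slope)   # characteristic strength
print(slope, scale)
```

A higher slope (modulus) means less scatter in strength, which is how the abstract's Weibull-modulus comparisons across aging groups should be read.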
Husak, Gregory J.; Michaelsen, Joel C.; Funk, Christopher C.
2007-01-01
Evaluating a range of scenarios that accurately reflect precipitation variability is critical for water resource applications. Inputs to these applications can be provided using location- and interval-specific probability distributions. These distributions make it possible to estimate the likelihood of rainfall being within a specified range. In this paper, we demonstrate the feasibility of fitting cell-by-cell probability distributions to grids of monthly interpolated, continent-wide data. Future work will then detail applications of these grids to improved satellite-remote sensing of drought and interpretations of probabilistic climate outlook forum forecasts. The gamma distribution is well suited to these applications because it is fairly familiar to African scientists, and capable of representing a variety of distribution shapes. This study tests the goodness-of-fit using the Kolmogorov–Smirnov (KS) test, and compares these results against another distribution commonly used in rainfall events, the Weibull. The gamma distribution is suitable for roughly 98% of the locations over all months. The techniques and results presented in this study provide a foundation for use of the gamma distribution to generate drivers for various rain-related models. These models are used as decision support tools for the management of water and agricultural resources as well as food reserves by providing decision makers with ways to evaluate the likelihood of various rainfall accumulations and assess different scenarios in Africa.
Analysis and Modeling of Realistic Compound Channels in Transparent Relay Transmissions
Kanjirathumkal, Cibile K.; Mohammed, Sameer S.
2014-01-01
Analytical approaches for the characterisation of compound channels in transparent multihop relay transmissions over independent fading channels are considered in this paper. Compound channels with homogeneous links are considered first. Using the Mellin transform technique, exact expressions are derived for the moments of cascaded Weibull distributions. Subsequently, two performance metrics, namely, the coefficient of variation and the amount of fade, are derived using the computed moments. These metrics quantify the possible variations in the channel gain and signal-to-noise ratio from their respective average values and can be used to characterise the achievable receiver performance. This approach is suitable for analysing more realistic compound channel models for scattering density variations of the environment, experienced in multihop relay transmissions. The performance metrics for such heterogeneous compound channels, having a distinct distribution in each hop, are computed and compared with those having identical constituent component distributions. The moments and the coefficient of variation computed are then used to develop computationally efficient estimators for the distribution parameters and the optimal hop count. The metrics and estimators proposed are complemented with numerical and simulation results to demonstrate the accuracy of the approaches. PMID:24701175
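The Mellin-transform property behind the cascaded moments is that, for independent hops, the moments of the product are the product of the per-hop moments, and each Weibull moment has a closed Gamma-function form. A minimal sketch (hop parameters are illustrative):

```python
import math

def weibull_moment(n, beta, eta):
    # E[X^n] for X ~ Weibull(shape beta, scale eta): eta^n * Gamma(1 + n/beta)
    return eta ** n * math.gamma(1.0 + n / beta)

def cascaded_moments(hops, n):
    # Independent hops multiply: E[(X1*...*XN)^n] = prod_i E[Xi^n]
    m = 1.0
    for beta, eta in hops:
        m *= weibull_moment(n, beta, eta)
    return m

def coefficient_of_variation(hops):
    m1 = cascaded_moments(hops, 1)
    m2 = cascaded_moments(hops, 2)
    return math.sqrt(m2 / m1 ** 2 - 1.0)

# Two-hop heterogeneous cascaded channel (illustrative shape/scale values)
print(coefficient_of_variation([(2.0, 1.0), (1.0, 1.5)]))
```

As a sanity check, a single Weibull hop with shape 1 (exponential) has CV equal to 1, and cascading a second exponential hop raises the CV to sqrt(3), reflecting the deeper fading of compound channels.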
Survivorship analysis when cure is a possibility: a Monte Carlo study.
Goldman, A I
1984-01-01
Parametric survivorship analyses of clinical trials commonly involve the assumption of a hazard function constant with time. When the empirical curve obviously levels off, one can modify the hazard function model by use of a Gompertz or Weibull distribution with hazard decreasing over time. Some cancer treatments are thought to cure some patients within a short time of initiation. Then, instead of all patients having the same hazard, decreasing over time, a biologically more appropriate model assumes that an unknown proportion (1 - pi) have a constant high risk whereas the remaining proportion (pi) have essentially no risk. This paper discusses the maximum likelihood estimation of pi and the power curves of the likelihood ratio test. Monte Carlo studies provide results for a variety of simulated trials; empirical data illustrate the methods.
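A toy version of the cure-fraction likelihood described above can make the model concrete: a fraction pi is cured (no risk) and the remainder has an exponential failure time, so events contribute the susceptible density and censored times contribute the mixture survivor function. The event and censoring times below are made up, and the grid search stands in for proper maximization.

```python
import math

# Event times (failures) and right-censored follow-up times; illustrative only
events = [0.5, 0.7, 1.2, 1.5, 2.0]
censored = [5.0, 5.0, 5.0, 5.0, 5.0]

def loglik(pi, lam):
    # Mixture cure model: P(cured) = pi; susceptibles fail ~ Exponential(lam)
    ll = 0.0
    for t in events:                       # density of susceptibles
        ll += math.log((1.0 - pi) * lam) - lam * t
    for t in censored:                     # survivor function of the mixture
        ll += math.log(pi + (1.0 - pi) * math.exp(-lam * t))
    return ll

best = None
for i in range(20):                        # pi = 0.00 .. 0.95
    for j in range(1, 41):                 # lam = 0.05 .. 2.00
        pi, lam = i * 0.05, j * 0.05
        ll = loglik(pi, lam)
        if best is None or ll > best[0]:
            best = (ll, pi, lam)

best_ll, pi_hat, lam_hat = best
print(pi_hat, lam_hat, best_ll)
```

With half the subjects censored late and the failures clustered early, the fitted cure fraction is clearly positive, which is exactly the plateau behaviour the abstract describes.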
Reliability enhancement through optimal burn-in
NASA Astrophysics Data System (ADS)
Kuo, W.
1984-06-01
A numerical reliability and cost model is defined for production line burn-in tests of electronic components. The necessity of burn-in is governed by upper and lower bounds: burn-in is mandatory for operation-critical or nonrepairable components; no burn-in is needed when failure effects are insignificant or easily repairable. The model considers electronic systems in terms of a series of components connected by a single black box. The infant mortality rate is described with a Weibull distribution. Performance reaches a steady state after burn-in, and the cost of burn-in is a linear function for each component. A minimum cost is calculated among the costs and total time of burn-in, shop repair, and field repair, with attention given to possible losses in future sales from inadequate burn-in testing.
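The trade-off the model formalizes can be sketched numerically: with a Weibull infant-mortality hazard (shape below 1), burn-in trades a known linear test cost against a reduced field-failure cost. All parameter values below are assumptions for illustration, not the paper's.

```python
import math

# Weibull infant-mortality model: reliability R(t) = exp(-(t/eta)**k), k < 1
k, eta = 0.5, 1000.0          # assumed shape and scale (hours)
mission = 5000.0              # field mission time (hours)
c_burn = 0.01                 # burn-in cost per component-hour (assumed)
c_field = 100.0               # cost of one field failure (assumed)

def R(t):
    return math.exp(-((t / eta) ** k))

def total_cost(t_b):
    # Probability a component that survived burn-in t_b fails in the field
    p_field = 1.0 - R(t_b + mission) / R(t_b)
    return c_burn * t_b + c_field * p_field

grid = [50.0 * i for i in range(0, 41)]      # candidate burn-in times, 0..2000 h
costs = [total_cost(t) for t in grid]
best_i = min(range(len(grid)), key=lambda i: costs[i])
t_opt = grid[best_i]
print(t_opt, costs[best_i])
```

Because the hazard is decreasing (k < 1), some positive burn-in time weeds out weak units cheaply, so the cost curve has an interior minimum rather than being minimized at zero.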
A FORTRAN program for multivariate survival analysis on the personal computer.
Mulder, P G
1988-01-01
In this paper a FORTRAN program is presented for multivariate survival or life table regression analysis in a competing-risks situation. The relevant failure rate (for example, a particular disease or mortality rate) is modelled as a log-linear function of a vector of (possibly time-dependent) explanatory variables. The explanatory variables may also include the variable time itself, which is useful for parameterizing piecewise exponential time-to-failure distributions in a Gompertz-like or Weibull-like way as a more efficient alternative to Cox's proportional hazards model. Maximum likelihood estimates of the coefficients of the log-linear relationship are obtained by the iterative Newton-Raphson method. The program runs on a personal computer under DOS; running time is quite acceptable, even for large samples.
NASA Astrophysics Data System (ADS)
Yu, Jonas C. P.; Lin, Yu-Siang; Wang, Kung-Jeng
2013-09-01
This study develops a model for inventory management in a two-echelon supply chain (SC) with profit sharing and deteriorating items. The retailer and the supplier act as the leader and follower, respectively, and the supplier faces a huge setup cost and an economic order quantity ordering strategy. The market demand is affected by the sale price of the product, and the inventory has a deterioration rate following a Weibull distribution. The retailer executes three profit-sharing mechanisms to motivate the supplier to participate in SC optimisation and to extend the life cycle of the product. A search algorithm is developed to determine the solutions when using the profit-sharing mechanisms. The outcomes from numerical experiments demonstrate the profitability of the proposed model.
Koseki, Shigenobu; Nakamura, Nobutaka; Shiina, Takeo
2015-01-01
Bacterial pathogens such as Listeria monocytogenes, Escherichia coli O157:H7, Salmonella enterica, and Cronobacter sakazakii have demonstrated long-term survival in/on dry or low-water activity (aw) foods. However, there have been few comparative studies of the desiccation tolerance of these bacterial pathogens in the same food matrix. In the present study, the survival kinetics of the four bacterial pathogens, separately inoculated onto powdered infant formula as a model low-aw food, were compared during storage at 5, 22, and 35°C. No significant differences in the survival kinetics between E. coli O157:H7 and L. monocytogenes were observed. Salmonella showed significantly higher desiccation tolerance than these pathogens, and C. sakazakii demonstrated significantly higher desiccation tolerance than all three other bacteria studied. Thus, the desiccation tolerance was ranked as C. sakazakii > Salmonella > E. coli O157:H7 = L. monocytogenes. The survival kinetics of each bacterium were mathematically analyzed, and the observed kinetics were successfully described using the Weibull model. To evaluate the variability of the inactivation kinetics of the tested bacterial pathogens, a Monte Carlo simulation was performed using assumed probability distributions of the estimated fitted parameters. The simulation results showed that the storage temperature significantly influenced the survival of each bacterium under the dry environment, with bacterial inactivation becoming faster at higher storage temperatures. Furthermore, the fitted rate and shape parameters of the Weibull model were successfully modelled as a function of temperature. The numerical simulation of bacterial inactivation was realized using the functions of the parameters under arbitrarily fluctuating temperature conditions.
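The Weibull survival model referred to above is commonly written log10(N/N0) = -(t/delta)^p; after a double-log transform it is linear, so the rate (delta) and shape (p) parameters can be recovered by simple regression. The data below are generated from assumed parameters, not the study's observations.

```python
import math

# Weibull microbial survival model: log10(N/N0) = -(t / delta)**p
delta_true, p_true = 3.0, 1.2      # illustrative parameters, not the paper's fits
times = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
log_surv = [-((t / delta_true) ** p_true) for t in times]

# Linear after a double-log transform: ln(-log10 S) = p*ln(t) - p*ln(delta)
xs = [math.log(t) for t in times]
ys = [math.log(-s) for s in log_surv]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
p_hat = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
delta_hat = math.exp(-(ybar - p_hat * xbar) / p_hat)
print(p_hat, delta_hat)
```

In the study's Monte Carlo step, fitted (delta, p) pairs like these would be resampled from assumed parameter distributions and propagated through the model under fluctuating temperatures.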
Design prediction for long term stress rupture service of composite pressure vessels
NASA Technical Reports Server (NTRS)
Robinson, Ernest Y.
1992-01-01
Extensive stress rupture studies on glass composites and Kevlar composites were conducted by the Lawrence Radiation Laboratory beginning in the late 1960s and extending to about 8 years in some cases. Some of the data from these studies published over the years were incomplete or were tainted by spurious failures, such as grip slippage. Updated data sets were defined for both fiberglass and Kevlar composite strand test specimens. These updated data are analyzed in this report by a convenient form of the bivariate Weibull distribution, to establish a consistent set of design prediction charts that may be used as a conservative basis for predicting the stress rupture life of composite pressure vessels. The updated glass composite data exhibit an invariant Weibull modulus with lifetime. The data are analyzed in terms of homologous service load (referenced to the observed median strength). The equations relating life, homologous load, and probability are given, and corresponding design prediction charts are presented. A similar approach is taken for Kevlar composites, where the updated strand data do show a turndown tendency at long life accompanied by a corresponding change (increase) of the Weibull modulus. The turndown characteristic is not present in stress rupture test data of Kevlar pressure vessels. A modification of the stress rupture equations is presented to incorporate a latent, but limited, strength drop, and design prediction charts are presented that incorporate such behavior. The methods presented utilize Cartesian plots of the probability distributions (which are a more natural display for the design engineer), based on median normalized data that are independent of statistical parameters and are readily defined for any set of test data.
Risk estimates for CO exposure in man based on behavioral and physiological responses in rodents
NASA Technical Reports Server (NTRS)
Gross, M. K.
1983-01-01
Animal responses to CO are examined, along with potential models for extrapolating animal test data to humans. The best models for extrapolating the data were found to be the Probit and Weibull models.
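One common form of the Weibull dose-response model mentioned above is P(d) = 1 - exp(-b * d^k), which can also be inverted to find the dose producing a target extra risk. The parameter values below are purely illustrative, not fits to the CO data.

```python
import math

def weibull_risk(dose, b, k):
    # Weibull dose-response model: P(d) = 1 - exp(-b * d**k)
    return 1.0 - math.exp(-b * dose ** k)

def dose_for_risk(p_target, b, k):
    # Invert the model to find the dose giving a target extra risk
    return (-math.log(1.0 - p_target) / b) ** (1.0 / k)

b, k = 0.002, 1.5          # hypothetical parameters, not from the study
d10 = dose_for_risk(0.10, b, k)
print(d10, weibull_risk(d10, b, k))
```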
Distribution analysis of airborne nicotine concentrations in hospitality facilities.
Schorp, Matthias K; Leyden, Donald E
2002-02-01
A number of publications report statistical summaries for environmental tobacco smoke (ETS) concentrations. Despite compelling evidence for the data not being normally distributed, these publications typically report the arithmetic mean and standard deviation of the data, thereby losing important information related to the distribution of values contained in the original data. We were interested in the frequency distributions of reported nicotine concentrations in hospitality environments and subjected available data to distribution analyses. The distribution of experimental indoor airborne nicotine concentration data taken from hospitality facilities worldwide was fit to lognormal, Weibull, exponential, Pearson (Type V), logistic, and loglogistic distribution models. Comparison of goodness of fit (GOF) parameters and indications from the literature verified the selection of a lognormal distribution as the overall best model. When individual data were not reported in the literature, statistical summaries of results were used to model sets of lognormally distributed data that are intended to mimic the original data distribution. Grouping the data into various categories led to 31 frequency distributions that were further interpreted. The median values in nonsmoking environments are about half of the median values in smoking sections. When different continents are compared, Asian, European, and North American median values in restaurants are about a factor of three below levels encountered in other hospitality facilities. On a comparison of nicotine concentrations in North American smoking sections and nonsmoking sections, median values are about one-third of the European levels. The results obtained may be used to address issues related to exposure to ETS in the hospitality sector.
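The model-comparison step can be illustrated by fitting lognormal and exponential distributions by maximum likelihood to a small right-skewed sample and comparing log-likelihoods. The data below are made up (chosen to be strongly skewed, as reported for hospitality environments); the study compared more candidate models with formal goodness-of-fit statistics.

```python
import math

# Hypothetical airborne nicotine concentrations (ug/m3), strongly right-skewed
data = [0.1, 0.2, 0.4, 0.8, 1.6, 3.2, 6.4, 12.8, 25.6, 51.2]
n = len(data)
logs = [math.log(x) for x in data]

# Lognormal MLE: mean and (population) standard deviation of the logs
mu = sum(logs) / n
sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / n)
ll_lognormal = sum(
    -math.log(x) - math.log(sigma) - 0.5 * math.log(2 * math.pi)
    - (math.log(x) - mu) ** 2 / (2 * sigma ** 2)
    for x in data
)

# Exponential MLE for comparison: rate = 1/mean
lam = n / sum(data)
ll_exponential = n * math.log(lam) - lam * sum(data)

print(ll_lognormal, ll_exponential)
```

For multiplicative (geometric-looking) data like these, the lognormal log-likelihood is clearly higher, matching the abstract's conclusion that the lognormal is the best overall model.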
Calling patterns in human communication dynamics
Jiang, Zhi-Qiang; Xie, Wen-Jie; Li, Ming-Xia; Podobnik, Boris; Zhou, Wei-Xing; Stanley, H. Eugene
2013-01-01
Modern technologies not only provide a variety of communication modes (e.g., texting, cell phone conversation, and online instant messaging), but also detailed electronic traces of these communications between individuals. These electronic traces indicate that the interactions occur in temporal bursts. Here, we study intercall duration of communications of the 100,000 most active cell phone users of a Chinese mobile phone operator. We confirm that the intercall durations follow a power-law distribution with an exponential cutoff at the population level but find differences when focusing on individual users. We apply statistical tests at the individual level and find that the intercall durations follow a power-law distribution for only 3,460 individuals (3.46%). The intercall durations for the majority (73.34%) follow a Weibull distribution. We quantify individual users using three measures: out-degree, percentage of outgoing calls, and communication diversity. We find that the cell phone users with a power-law duration distribution fall into three anomalous clusters: robot-based callers, telecom fraud, and telephone sales. This information is of interest to both academics and practitioners, mobile telecom operators in particular. In contrast, the individual users with a Weibull duration distribution form the fourth cluster of ordinary cell phone users. We also discover more information about the calling patterns of these four clusters (e.g., the probability that a user will call the c_r-th most frequent contact and the probability distribution of burst sizes). Our findings may enable a more detailed analysis of the huge body of data contained in the logs of massive users. PMID:23319645
Fitting Cure Rate Model to Breast Cancer Data of Cancer Research Center.
Baghestani, Ahmad Reza; Zayeri, Farid; Akbari, Mohammad Esmaeil; Shojaee, Leyla; Khadembashi, Naghmeh; Shahmirzalou, Parviz
2015-01-01
The Cox PH model is one of the most significant statistical models in studying the survival of patients. But, in the case of patients with long-term survival, it may not be the most appropriate. In such cases, a cure rate model seems more suitable. The purpose of this study was to determine clinical factors associated with the cure rate of patients with breast cancer. In order to find factors affecting the cure rate (response), a non-mixed cure rate model with a negative binomial distribution for the latent variable was used. The variables selected were cancer recurrence, HER2 status, estrogen receptor (ER) and progesterone receptor (PR) status, size of tumor, grade of cancer, stage of cancer, type of surgery, age at the time of diagnosis and number of removed positive lymph nodes. All analyses were performed using PROC MCMC processes in the SAS 9.2 program. The mean (SD) age of patients was 48.9 (11.1) years. For these patients, 1-, 5- and 10-year survival rates were 95, 79 and 50 percent, respectively. All of the mentioned variables had an effect on the cure fraction. The Kaplan-Meier curve supported the use of a cure model. Unlike the other variables, ER and PR positivity increased the probability of cure. In the present study, the Weibull distribution was used for analysing survival times. Assessing model fit with other distributions, such as the log-normal and log-logistic, and with other distributions for the latent variable is recommended.
Investigation of Weibull statistics in fracture analysis of cast aluminum
NASA Technical Reports Server (NTRS)
Holland, Frederic A., Jr.; Zaretsky, Erwin V.
1989-01-01
The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.
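The design methodology of combining elementary stress analysis with Weibull statistics can be sketched for a thin-walled spherical shell: choose the allowable membrane stress from the target reliability and the fitted Weibull strength parameters, then size the wall. All material and loading values below are assumptions for illustration, not the paper's measured A357-T6 data, and size/volume effects are ignored.

```python
import math

# Assumed Weibull strength statistics for the material
m, sigma0 = 10.0, 300.0        # Weibull modulus, characteristic strength (MPa)
p_int, radius = 10.0, 0.5      # internal pressure (MPa) and shell radius (m)
target_R = 0.999               # required survival probability

# Allowable stress for the target reliability:
#   R = exp(-(sigma/sigma0)**m)  =>  sigma = sigma0 * (-ln R)**(1/m)
sigma_allow = sigma0 * (-math.log(target_R)) ** (1.0 / m)

# Thin-walled sphere membrane stress sigma = p*r / (2*t)  =>  wall thickness
t_wall = p_int * radius / (2.0 * sigma_allow)

# Check: reliability implied by the design stress
sigma_design = p_int * radius / (2.0 * t_wall)
R_check = math.exp(-((sigma_design / sigma0) ** m))
print(t_wall, R_check)
```

Note how strongly the allowable stress drops below the characteristic strength at high reliability when the modulus m is modest, which is the central point of Weibull-based design.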
Entropy of dynamical social networks
NASA Astrophysics Data System (ADS)
Zhao, Kun; Karsai, Marton; Bianconi, Ginestra
2012-02-01
Dynamical social networks are evolving rapidly and are highly adaptive. Characterizing the information encoded in social networks is essential to gain insight into their structure, evolution, adaptability and dynamics. Recently entropy measures have been used to quantify the information in email correspondence, static networks and mobility patterns. Nevertheless, we still lack methods to quantify the information encoded in time-varying dynamical social networks. In this talk we present a model to quantify the entropy of dynamical social networks and use this model to analyze data of phone-call communication. We show evidence that the entropy of the phone-call interaction network changes according to circadian rhythms. Moreover we show that social networks are extremely adaptive and are modified by the use of technologies such as mobile phone communication. Indeed the distribution of phone-call durations is described by a Weibull distribution and is significantly different from the distribution of the durations of face-to-face interactions in a conference. Finally we investigate how much the entropy of dynamical social networks changes in realistic models of phone-call or face-to-face interactions, characterizing in this way different types of human social behavior.
Multiscale Simulation of Porous Ceramics Based on Movable Cellular Automaton Method
NASA Astrophysics Data System (ADS)
Smolin, A.; Smolin, I.; Eremina, G.; Smolina, I.
2017-10-01
The paper presents a model for simulating the mechanical behaviour of multiscale porous ceramics based on the movable cellular automaton method, a novel particle method in the computational mechanics of solids. The initial scale of the proposed approach corresponds to the characteristic size of the smallest pores in the ceramics. At this scale, we model uniaxial compression of several representative samples with an explicit account of pores of the same size but with random unique positions in space. As a result, we get the average values of Young’s modulus and strength, as well as the parameters of the Weibull distribution of these properties at the current scale level. These data allow us to describe the material behaviour at the next scale level, where only the larger pores are considered explicitly, while the influence of small pores is included via the effective properties determined at the previous scale level. If the pore size distribution function of the material has N maxima, we need to perform computations for N - 1 levels in order to get the properties from the lowest scale up to the macroscale, step by step. The proposed approach was applied to modelling zirconia ceramics with a bimodal pore size distribution. The obtained results show correct behaviour of the model sample at the macroscale.
The topology of large Open Connectome networks for the human brain.
Gastner, Michael T; Ódor, Géza
2016-06-07
The structural human connectome (i.e. the network of fiber connections in the brain) can be analyzed at ever finer spatial resolution thanks to advances in neuroimaging. Here we analyze several large data sets for the human brain network made available by the Open Connectome Project. We apply statistical model selection to characterize the degree distributions of graphs containing up to nodes and edges. A three-parameter generalized Weibull (also known as a stretched exponential) distribution is a good fit to most of the observed degree distributions. For almost all networks, simple power laws cannot fit the data, but in some cases there is statistical support for power laws with an exponential cutoff. We also calculate the topological (graph) dimension D and the small-world coefficient σ of these networks. While σ suggests a small-world topology, we found that D < 4 showing that long-distance connections provide only a small correction to the topology of the embedding three-dimensional space.
Rice, Stephen B; Chan, Christopher; Brown, Scott C; Eschbach, Peter; Han, Li; Ensor, David S; Stefaniak, Aleksandr B; Bonevich, John; Vladár, András E; Hight Walker, Angela R; Zheng, Jiwen; Starnes, Catherine; Stromberg, Arnold; Ye, Jia; Grulke, Eric A
2015-01-01
This paper reports an interlaboratory comparison that evaluated a protocol for measuring and analysing the particle size distribution of discrete, metallic, spheroidal nanoparticles using transmission electron microscopy (TEM). The study was focused on automated image capture and automated particle analysis. NIST RM8012 gold nanoparticles (30 nm nominal diameter) were measured for area-equivalent diameter distributions by eight laboratories. Statistical analysis was used to (1) assess the data quality without using size distribution reference models, (2) determine reference model parameters for different size distribution reference models and non-linear regression fitting methods and (3) assess the measurement uncertainty of a size distribution parameter by using its coefficient of variation. The interlaboratory area-equivalent diameter mean, 27.6 nm ± 2.4 nm (computed based on a normal distribution), was quite similar to the area-equivalent diameter, 27.6 nm, assigned to NIST RM8012. The lognormal reference model was the preferred choice for these particle size distributions as, for all laboratories, its parameters had lower relative standard errors (RSEs) than the other size distribution reference models tested (normal, Weibull and Rosin–Rammler–Bennett). The RSEs for the fitted standard deviations were two orders of magnitude higher than those for the fitted means, suggesting that most of the parameter estimate errors were associated with estimating the breadth of the distributions. The coefficients of variation for the interlaboratory statistics also confirmed the lognormal reference model as the preferred choice. From quasi-linear plots, the typical range for good fits between the model and cumulative number-based distributions was 1.9 fitted standard deviations less than the mean to 2.3 fitted standard deviations above the mean. 
Automated image capture, automated particle analysis and statistical evaluation of the data and fitting coefficients provide a framework for assessing nanoparticle size distributions using TEM for image acquisition. PMID:26361398
The effects of surface finish and grain size on the strength of sintered silicon carbide
NASA Technical Reports Server (NTRS)
You, Y. H.; Kim, Y. W.; Lee, J. G.; Kim, C. H.
1985-01-01
The effects of surface treatment and microstructure, especially abnormal grain growth, on the strength of sintered SiC were studied. The surfaces of sintered SiC were treated with 400, 800 and 1200 grit diamond wheels. Grain growth was induced by increasing the sintering times at 2050 °C. The beta to alpha transformation occurred during the sintering of beta-phase starting materials and was often accompanied by abnormal grain growth. The overall strength distributions were established using Weibull statistics. The strength of the sintered SiC is limited by extrinsic surface flaws in normally sintered specimens. The finer the surface finish and grain size, the higher the strength. But the strength of abnormally sintered specimens is limited by the abnormally grown large tabular grains. The Weibull modulus increases with decreasing grain size and decreasing grit size of grinding.
Performance analysis for mixed FSO/RF Nakagami-m and Exponentiated Weibull dual-hop airborne systems
NASA Astrophysics Data System (ADS)
Jing, Zhao; Shang-hong, Zhao; Wei-hu, Zhao; Ke-fan, Chen
2017-06-01
In this paper, the performance of mixed free-space optical (FSO)/radio frequency (RF) systems is presented based on decode-and-forward relaying. The Exponentiated Weibull fading channel with pointing error effects is adopted for the atmospheric fluctuation of the FSO channel, and the RF link undergoes Nakagami-m fading. We derive the analytical expression for the cumulative distribution function (CDF) of the equivalent signal-to-noise ratio (SNR). Novel mathematical expressions for the outage probability and average bit-error rate (BER) are developed based on the Meijer G-function. The analytical results accurately match the Monte Carlo simulation results. The outage and BER performance of the mixed system with decode-and-forward relaying is investigated considering atmospheric turbulence and pointing error conditions. The effect of aperture averaging is evaluated in all atmospheric turbulence conditions as well.
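A sketch of the outage computation for a decode-and-forward dual hop: the link is in outage if either hop's SNR falls below the threshold, so P_out = 1 - (1 - F1)(1 - F2). The paper's closed forms use the Meijer G-function; here the two CDFs are simply evaluated directly, with the Exponentiated Weibull (FSO) and integer-m Nakagami (RF) parameters chosen purely for illustration.

```python
import math

def cdf_exp_weibull(snr, alpha, beta, eta):
    # Exponentiated Weibull CDF for the FSO hop SNR (pointing error effects
    # are assumed folded into the three parameters)
    return (1.0 - math.exp(-((snr / eta) ** beta))) ** alpha

def cdf_nakagami_snr(snr, m, avg_snr):
    # SNR of a Nakagami-m RF link is Gamma(m, avg_snr/m); for integer m the
    # CDF has the closed Erlang form 1 - exp(-x) * sum_{i<m} x^i / i!
    x = m * snr / avg_snr
    s = sum(x ** i / math.factorial(i) for i in range(m))
    return 1.0 - math.exp(-x) * s

def outage_df(snr_th, alpha, beta, eta, m, avg_snr):
    # Decode-and-forward: outage if either hop falls below the threshold
    f1 = cdf_exp_weibull(snr_th, alpha, beta, eta)
    f2 = cdf_nakagami_snr(snr_th, m, avg_snr)
    return 1.0 - (1.0 - f1) * (1.0 - f2)

p1 = outage_df(1.0, alpha=2.0, beta=1.5, eta=4.0, m=2, avg_snr=10.0)
p2 = outage_df(5.0, alpha=2.0, beta=1.5, eta=4.0, m=2, avg_snr=10.0)
print(p1, p2)
```

As expected, the outage probability increases monotonically with the SNR threshold.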
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Huarui, E-mail: huarui.sun@bristol.ac.uk; Bajo, Miguel Montes; Uren, Michael J.
2015-01-26
Gate leakage degradation of AlGaN/GaN high electron mobility transistors under OFF-state stress is investigated using a combination of electrical, optical, and surface morphology characterizations. The generation of leakage “hot spots” at the edge of the gate is found to be strongly temperature accelerated. The time for the formation of each failure site follows a Weibull distribution with a shape parameter in the range of 0.7–0.9 from room temperature up to 120 °C. The average leakage per failure site is only weakly temperature dependent. The stress-induced structural degradation at the leakage sites exhibits a temperature dependence in the surface morphology, which is consistent with a surface defect generation process involving temperature-associated changes in the breakdown sites.
Optimal design and use of retry in fault tolerant real-time computer systems
NASA Technical Reports Server (NTRS)
Lee, Y. H.; Shin, K. G.
1983-01-01
A new method to determine an optimal retry policy and to use retry for fault characterization is presented. An optimal retry policy for a given fault characteristic, which determines the maximum allowable retry durations to minimize the total task completion time, was derived. The combined fault characterization and retry decision, in which the characteristics of faults are estimated simultaneously with the determination of the optimal retry policy, was carried out. Two solution approaches were developed, one based on point estimation and the other on the Bayes sequential decision. The maximum likelihood estimators are used for the first approach, and backward induction for testing hypotheses in the second approach. Numerical examples in which all the durations associated with faults have monotone hazard functions (e.g., exponential, Weibull and gamma distributions) are presented. These are standard distributions commonly used for the modeling and analysis of faults.
NASA Astrophysics Data System (ADS)
Mohamed, Refaat; Ismail, Mahmoud H.; Newagy, Fatma; Mourad, Heba M.
2013-03-01
The α-μ fading distribution is one of the most general fading models used in the literature to describe the small-scale fading phenomenon. In this paper, closed-form expressions for the Shannon capacity of the α-μ fading channel operating under four main adaptive transmission strategies are derived assuming integer values for μ. These expressions are derived for the case of no diversity as well as for selection combining diversity with independent and identically distributed branches. The obtained expressions reduce to those previously derived in the literature for the Weibull as well as the Rayleigh fading cases, which are both special cases of the α-μ channel. Numerical results are presented for the capacity under the four adaptive transmission strategies, and the effects of the fading parameters and the number of diversity branches are studied.
Pankaj, S K; Wan, Zifan; Colonna, William; Keener, Kevin M
2017-07-01
High voltage atmospheric cold plasma (HVACP) is a novel, non-thermal technology which has shown potential for degradation of various toxic components in wastewater. In this study, HVACP was used to examine the degradation kinetics of methyl red, crystal violet and fast green FCF dyes. HVACP discharge was found to be a source of reactive nitrogen and oxygen species. High voltage application completely degraded all dyes tested in less than 5 min of treatment time. Plasma from modified gas (~65% O2) further reduced the treatment time by 50% vs. plasma from dry air. First-order and Weibull models were fitted to the degradation data. The Weibull model was found to better explain the degradation kinetics of all the treated dyes.
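A minimal sketch of the model comparison described above, fitting both kinetic forms to hypothetical fraction-remaining data; the concentrations, times, and starting guesses are invented for illustration, not taken from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical dye concentration data (fraction remaining vs. minutes).
t = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 4.0, 5.0])
c = np.array([1.0, 0.55, 0.35, 0.18, 0.10, 0.05, 0.02])

def first_order(t, k):
    # Classical first-order decay: C/C0 = exp(-k t)
    return np.exp(-k * t)

def weibull(t, delta, beta):
    # Weibull survival form: C/C0 = exp(-(t/delta)**beta)
    return np.exp(-(t / delta) ** beta)

(k,), _ = curve_fit(first_order, t, c, p0=[1.0])
(delta, beta), _ = curve_fit(weibull, t, c, p0=[1.0, 1.0])

rss_fo = np.sum((c - first_order(t, k)) ** 2)
rss_wb = np.sum((c - weibull(t, delta, beta)) ** 2)
print(f"first-order RSS={rss_fo:.4f}, Weibull RSS={rss_wb:.4f}")
```

Because the first-order model is the Weibull model with β = 1, the Weibull fit can never be worse in residual terms; a formal comparison would penalize its extra parameter (e.g., via AIC), as kinetic studies of this kind typically do.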
EU-Norsewind Using Envisat ASAR And Other Data For Offshore Wind Atlas
NASA Astrophysics Data System (ADS)
Hasager, Charlotte B.; Mouche, Alexis; Badger, Merete
2010-04-01
The EU project NORSEWIND (Northern Seas Wind Index Database, www.norsewind.eu) aims to produce a state-of-the-art wind atlas for the Baltic, Irish and North Seas using ground-based lidar, meteorological masts, satellite data and mesoscale modelling. So far CLS and Risø DTU have collected Envisat ASAR images for the area of interest, and the first results are presented: maps of wind statistics, including Weibull scale and shape parameters, mean wind speed and energy density. The results will be compared to a distributed network of high-quality in-situ observations and mesoscale model results during 2009-2011 as the in-situ data and model results become available. Wind energy is proportional to the cube of wind speed, thus even small improvements in wind speed mapping are important in this project. One challenge is to arrive at hub-height winds ~100 m above sea level.
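The cubic dependence noted above can be made concrete: given the Weibull scale c and shape k mapped in such an atlas, the mean wind power density follows from the third raw moment of the Weibull distribution, E[v³] = c³Γ(1 + 3/k). The air density and site parameters below are illustrative assumptions, not NORSEWIND results.

```python
from math import gamma

def power_density(c, k, rho=1.225):
    """Mean wind power density (W/m^2) for Weibull scale c (m/s) and shape k.

    Uses E[v^3] = c**3 * Gamma(1 + 3/k); rho is air density (kg/m^3).
    """
    return 0.5 * rho * c ** 3 * gamma(1.0 + 3.0 / k)

# Illustrative offshore-like parameters (assumed, not measured):
pd = power_density(9.0, 2.2)
print(f"mean power density ~ {pd:.0f} W/m^2")
```

Since the density scales with c³, a 10% error in the retrieved scale parameter propagates to roughly a 33% error in energy density, which is why small improvements in wind speed mapping matter.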
Population pharmacokinetic model of transdermal nicotine delivered from a matrix-type patch.
Linakis, Matthew W; Rower, Joseph E; Roberts, Jessica K; Miller, Eleanor I; Wilkins, Diana G; Sherwin, Catherine M T
2017-12-01
Nicotine addiction is an issue faced by millions of individuals worldwide. As a result, nicotine replacement therapies, such as transdermal nicotine patches, have become widely distributed and used. While the pharmacokinetics of transdermal nicotine have been extensively described using noncompartmental methods, there are few data available describing the between-subject variability in transdermal nicotine pharmacokinetics. The aim of this investigation was to use population pharmacokinetic techniques to describe this variability, particularly as it pertains to the absorption of nicotine from the transdermal patch. A population pharmacokinetic parent-metabolite model was developed using plasma concentrations from 25 participants treated with transdermal nicotine. Covariates tested in this model included: body weight, body mass index, body surface area (calculated using the Mosteller equation) and sex. Nicotine pharmacokinetics were best described with a one-compartment model with absorption based on a Weibull distribution and first-order elimination and a single compartment for the major metabolite, cotinine. Body weight was a significant covariate on apparent volume of distribution of nicotine (exponential scaling factor 1.42). After the inclusion of body weight in the model, no other covariates were significant. This is the first population pharmacokinetic model to describe the absorption and disposition of transdermal nicotine and its metabolism to cotinine and the pharmacokinetic variability between individuals who were administered the patch. © 2017 The British Pharmacological Society.
Determination of Turboprop Reduction Gearbox System Fatigue Life and Reliability
NASA Technical Reports Server (NTRS)
Zaretsky, Erwin V.; Lewicki, David G.; Savage, Michael; Vlcek, Brian L.
2007-01-01
Two computational models to determine the fatigue life and reliability of a commercial turboprop gearbox are compared with each other and with field data. These models are (1) Monte Carlo simulation of randomly selected lives of individual bearings and gears comprising the system and (2) two-parameter Weibull distribution function for bearings and gears comprising the system using strict-series system reliability to combine the calculated individual component lives in the gearbox. The Monte Carlo simulation included the virtual testing of 744,450 gearboxes. Two sets of field data were obtained from 64 gearboxes that were first-run to removal for cause, were refurbished and placed back in service, and then were second-run until removal for cause. A series of equations were empirically developed from the Monte Carlo simulation to determine the statistical variation in predicted life and Weibull slope as a function of the number of gearboxes failed. The resultant L(sub 10) life from the field data was 5,627 hr. From strict-series system reliability, the predicted L(sub 10) life was 774 hr. From the Monte Carlo simulation, the median value for the L(sub 10) gearbox lives equaled 757 hr. Half of the gearbox L(sub 10) lives will be less than this value and the other half more. The resultant L(sub 10) life of the second-run (refurbished) gearboxes was 1,334 hr. The apparent load-life exponent p for the roller bearings is 5.2. Were the bearing lives to be recalculated with a load-life exponent p equal to 5.2, the predicted L(sub 10) life of the gearbox would be equal to the actual life obtained in the field. The component failure distribution of the gearbox from the Monte Carlo simulation was nearly identical to that using the strict-series system reliability analysis, proving the compatibility of these methods.
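The strict-series combination used in the second model can be sketched as follows: the system survival probability is the product of the two-parameter Weibull survival probabilities of the components, and the system L10 life is the time at which that product drops to 0.9. The Weibull slopes and characteristic lives below are invented placeholders, not the gearbox values from the study.

```python
import numpy as np
from scipy.optimize import brentq

# Hypothetical components: (Weibull slope b, characteristic life eta in hours).
components = [(1.5, 9000.0), (1.2, 15000.0), (2.0, 20000.0)]

def system_reliability(t):
    # Strict-series: the system survives only if every component survives.
    return np.prod([np.exp(-(t / eta) ** b) for b, eta in components])

# L10 life: time at which 10% of systems have failed (R = 0.9).
l10 = brentq(lambda t: system_reliability(t) - 0.9, 1.0, 50000.0)
print(f"system L10 ~ {l10:.0f} h")
```

As the abstract's numbers illustrate, the series-system L10 life is far shorter than that of any single component, because every component contributes failure probability.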
Reliability of High-Voltage Tantalum Capacitors (Parts 3 and 4)
NASA Technical Reports Server (NTRS)
Teverovsky, Alexander
2010-01-01
The Weibull grading test is a powerful technique that allows selection and reliability rating of solid tantalum capacitors for military and space applications. However, inaccuracies in the existing method and inadequate acceleration factors can result in significant errors, up to three orders of magnitude, in the calculated failure rate of capacitors. This paper analyzes deficiencies of the existing technique and recommends a more accurate calculation method. A physical model presenting failures of tantalum capacitors as time-dependent dielectric breakdown is used to determine voltage and temperature acceleration factors and to select adequate Weibull grading test conditions. This model is verified by highly accelerated life testing (HALT) at different temperature and voltage conditions for three types of solid chip tantalum capacitors. It is shown that parameters of the model and acceleration factors can be calculated using a general log-linear relationship for the characteristic life with two stress levels.
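A sketch of the kind of acceleration-factor calculation involved, using a log-linear model for the Weibull characteristic life with a power-law voltage term and an Arrhenius temperature term. The voltage exponent and activation energy below are placeholder assumptions for illustration, not the values derived in the paper.

```python
from math import exp

K_B = 8.617e-5  # Boltzmann constant, eV/K

def acceleration_factor(v_test, t_test, v_use, t_use, n=12.0, ea=1.0):
    """Acceleration factor between a HALT condition and use conditions.

    Assumes characteristic life eta ~ V**(-n) * exp(Ea / (kB * T)), a
    common time-dependent dielectric breakdown form; n (voltage exponent)
    and ea (activation energy, eV) are illustrative placeholders.
    Temperatures are in kelvin.
    """
    voltage_af = (v_test / v_use) ** n
    thermal_af = exp(ea / K_B * (1.0 / t_use - 1.0 / t_test))
    return voltage_af * thermal_af

# Example: 1.5x rated voltage and 85 C test vs. rated voltage and 55 C use.
af = acceleration_factor(v_test=75.0, t_test=358.0, v_use=50.0, t_use=328.0)
print(f"acceleration factor ~ {af:.0f}")
```

With only two stress levels per variable, as the abstract notes, both model parameters (n and Ea) can be solved from the measured characteristic lives, after which failure rates at use conditions follow by dividing test time scales by the acceleration factor.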
Two-state Markov-chain Poisson nature of individual cellphone call statistics
NASA Astrophysics Data System (ADS)
Jiang, Zhi-Qiang; Xie, Wen-Jie; Li, Ming-Xia; Zhou, Wei-Xing; Sornette, Didier
2016-07-01
Unfolding the burst patterns in human activities and social interactions is a very important issue, especially for understanding the spreading of disease and information and the formation of groups and organizations. Here, we conduct an in-depth study of the temporal patterns of cellphone conversation activities of 73 339 anonymous cellphone users, whose inter-call durations are Weibull distributed. We find that individual call events exhibit a bursty pattern in which high-activity periods alternate with low-activity periods. In both periods, the number of calls is exponentially distributed for individuals, but power-law distributed for the population. Together with the exponential distributions of inter-call durations within bursts and of the intervals between consecutive bursts, we demonstrate that the individual call activities are driven by two independent Poisson processes, which can be combined within a minimal model in terms of a two-state first-order Markov chain, giving significant fits for nearly half of the individuals. By measuring directly the distributions of call rates across the population, which exhibit power-law tails, we support the existence of power-law distributions via the ‘superposition of distributions’ mechanism. Our findings shed light on the origins of bursty patterns in other human activities.
Chakraborty, Snehasis; Rao, Pavuluri Srinivasa; Mishra, Hari Niwas
2015-10-15
High pressure inactivation of natural microbiota viz. aerobic mesophiles (AM), psychrotrophs (PC), yeasts and molds (YM), total coliforms (TC) and lactic acid bacteria (LAB) in pineapple puree was studied within the experimental domain of 0.1-600 MPa and 30-50 °C with a treatment time up to 20 min. A complete destruction of yeasts and molds was obtained at 500 MPa/50 °C/15 min, whereas no counts were detected for TC and LAB at 300 MPa/30 °C/15 min. A maximum of two log cycles of reduction was obtained for YM during pulse pressurization at the severe process intensity of 600 MPa/50 °C/20 min. The Weibull model clearly described the non-linearity of the survival curves during the isobaric period. The tailing effect, as confirmed by the shape parameter (β) of the survival curve, was obtained in the case of YM (β<1), whereas a shouldering effect (β>1) was observed for the other microbial groups. Analogous to thermal death kinetics, the activation energy (Ea, kJ·mol(-1)) and the activation volume (Va, mL·mol(-1)) values were further computed to describe the temperature and pressure dependencies of the scale parameter (δ, min), respectively. A higher δ value was obtained for each microbe at a lower temperature, and it decreased with an increase in pressure. A secondary kinetic model was developed describing the inactivation rate (k, min(-1)) as a function of pressure (P, MPa) and temperature (T, K), including the dependencies of Ea and Va on P and T, respectively. Copyright © 2015 Elsevier B.V. All rights reserved.
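The roles of δ and β can be illustrated directly: under the Weibull survival model log10(N/N0) = -(t/δ)^β, the time to a given decimal reduction has a closed form, t = δ·n^(1/β). The parameter values below are invented for illustration, not the fitted estimates from this study.

```python
def log_reduction_time(delta, beta, n_log):
    """Time for an n_log decimal reduction under the Weibull survival model
    log10(N/N0) = -(t/delta)**beta; delta and the result share time units."""
    return delta * n_log ** (1.0 / beta)

# Illustrative parameters: a tailing curve (beta < 1) versus a shouldering
# curve (beta > 1) with the same scale parameter delta (minutes).
t_tail = log_reduction_time(delta=2.0, beta=0.6, n_log=5)
t_shoulder = log_reduction_time(delta=2.0, beta=1.8, n_log=5)
print(f"5-log time, tailing: {t_tail:.1f} min; shouldering: {t_shoulder:.1f} min")
```

This shows why tailing (β < 1) is the unfavourable case for process design: the surviving subpopulation becomes progressively harder to kill, so reaching a 5-log reduction takes disproportionately long compared with a shouldering curve.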
Use of passive ambient ozone (O3) samplers in vegetation effects assessment
Krupa, S.; Nosal, M.; Peterson, D.L.
2001-01-01
A stochastic Weibull probability model was developed and verified to simulate the underlying frequency distributions of hourly ozone (O3) concentrations (exposure dynamics) using the single, weekly mean values obtained from a passive (sodium nitrite absorbent) sampler. The simulation was based on data derived from a co-located continuous monitor. Although at present the model output may be considered specific to the elevation and location of the study site, the results were extremely good. This approximation of the O3 exposure dynamics can be extended to other sites with similar data sets and to developing a generalized understanding of the stochastic O3 exposure-plant response relationships, conferring measurable benefits to the future use of passive O3 samplers in the absence of continuous monitoring. Copyright © 2000 Elsevier Science Ltd.
MSW Time to Tumor Model and Supporting Documentation
The multistage Weibull (MSW) time-to-tumor model and related documentation were developed principally (but not exclusively) for conducting time-to-tumor analyses to support risk assessments under the IRIS program. These programs and related docum...
NASA Astrophysics Data System (ADS)
Wang, Yue; Wang, Ping; Liu, Xiaoxia; Cao, Tian
2018-03-01
The performance of a decode-and-forward dual-hop mixed radio frequency/free-space optical (RF/FSO) system in an urban area is studied. The RF link is modeled by the Nakagami-m distribution and the FSO link is described by the composite exponentiated Weibull (EW) fading channels with nonzero boresight pointing errors (NBPE). For comparison, the average bit error rate (ABER) results without pointing errors (PE) and those with zero boresight pointing errors (ZBPE) are also provided. The closed-form expression for the ABER in the RF link is derived with the help of the hypergeometric function, and that in the FSO link is obtained by Meijer's G and generalized Gauss-Laguerre quadrature functions. Then, the end-to-end ABERs with binary phase shift keying modulation are obtained on the basis of the computed ABER results of the RF and FSO links. The end-to-end ABER performance is further analyzed with different Nakagami-m parameters, turbulence strengths, receiver aperture sizes and boresight displacements. The result shows that with ZBPE and NBPE considered, the FSO link suffers a severe ABER degradation and becomes the dominant limitation of the mixed RF/FSO system in urban areas. However, aperture averaging can bring significant ABER improvement to this system. Monte Carlo simulation is provided to confirm the validity of the analytical ABER expressions.
Jucherski, Andrzej; Nastawny, Maria; Walczowski, Andrzej; Jóźwiakowski, Krzysztof; Gajewska, Magdalena
2017-06-01
The aim of the present study was to assess the technological reliability of a domestic hybrid wastewater treatment installation consisting of a classic three-chambered (volume 6 m3) septic tank, a vertical flow trickling bed filled with granules of a calcined clay material (KERAMZYT), a special wetland bed constructed on a slope, and a permeable pond used as a receiver. The test treatment plant was located at a mountain eco-tourist farm on the periphery of the spa municipality of Krynica-Zdrój, Poland. The plant's operational reliability in reducing the concentration of organic matter, measured as biochemical oxygen demand (BOD5) and chemical oxygen demand (COD), was 100% when modelled by both the Weibull and the lognormal distributions. The respective reliability values for total nitrogen removal were 76.8% and 77.0%, for total suspended solids 99.5% and 92.6%, and for PO4-P 98.2% and 95.2%, with the differences being negligible. The installation was characterized by a very high level of technological reliability when compared with other solutions of this type. The Weibull method employed for statistical evaluation of technological reliability can also be used for comparison purposes. From the ecological perspective, the facility presented in the study has proven to be an effective tool for protecting local aquifer areas.
Garcés-Vega, Francisco; Marks, Bradley P
2014-08-01
In the last 20 years, the use of microbial reduction models has expanded significantly, including inactivation (linear and nonlinear), survival, and transfer models. However, a major constraint for model development is the impossibility of directly quantifying the number of viable microorganisms below the limit of detection (LOD) for a given study. Different approaches have been used to manage this challenge, including ignoring negative plate counts, using statistical estimations, or applying data transformations. Our objective was to illustrate and quantify the effect of negative plate count data management approaches on parameter estimation for microbial reduction models. Because it is impossible to obtain accurate plate counts below the LOD, we performed simulated experiments to generate synthetic data for both log-linear and Weibull-type microbial reductions. We then applied five different, previously reported data management practices and fit log-linear and Weibull models to the resulting data. The results indicated a significant effect (α = 0.05) of the data management practices on the estimated model parameters and performance indicators. For example, when the negative plate counts were replaced by the LOD for log-linear data sets, the slope of the subsequent log-linear model was, on average, 22% smaller than for the original data, the resulting model underpredicted lethality by up to 2.0 log, and the Weibull model was erroneously selected as the most likely correct model for those data. The results demonstrate that it is important to explicitly report LODs and related data management protocols, which can significantly affect model results, interpretation, and utility. Ultimately, we recommend using only the positive plate counts to estimate model parameters for microbial reduction curves and avoiding any data value substitutions or transformations when managing negative plate counts to yield the most accurate model parameters.
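The effect can be demonstrated with a small simulation in the spirit of the study: synthetic log-linear survivor data are censored at an assumed LOD, and the slope recovered after LOD substitution is compared with the slope fitted from the positive counts alone. All numbers below (true slope, noise level, LOD) are illustrative, not the study's settings.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.tile(np.arange(0.0, 11.0), 5)           # minutes, 5 reps per time point
true_slope = -0.8                               # log10 CFU/mL per min
log_n = 8.0 + true_slope * t + rng.normal(0, 0.3, t.size)
LOD = 1.0                                       # assumed detection limit, log10 CFU/mL

below = log_n < LOD
# Practice 1: replace censored observations with the LOD value.
log_sub = np.where(below, LOD, log_n)
slope_sub = np.polyfit(t, log_sub, 1)[0]
# Practice 2: fit only the quantifiable (positive) counts.
slope_pos = np.polyfit(t[~below], log_n[~below], 1)[0]
print(f"true {true_slope}, LOD-substituted {slope_sub:.2f}, positives-only {slope_pos:.2f}")
```

The substitution flattens the fitted curve (a smaller slope magnitude), which is the direction of the 22% bias and the underpredicted lethality reported above.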
Haque, Md Mazharul; Washington, Simon
2014-01-01
The use of mobile phones while driving is more prevalent among young drivers, a less experienced cohort with elevated crash risk. The objective of this study was to examine and better understand the reaction times of young drivers to a traffic event originating in their peripheral vision whilst engaged in a mobile phone conversation. The CARRS-Q advanced driving simulator was used to test a sample of young drivers on various simulated driving tasks, including an event that originated within the driver's peripheral vision, whereby a pedestrian enters a zebra crossing from a sidewalk. Thirty-two licensed drivers drove the simulator in three phone conditions: baseline (no phone conversation), hands-free and handheld. In addition to driving the simulator each participant completed questionnaires related to driver demographics, driving history, usage of mobile phones while driving, and general mobile phone usage history. The participants were 21-26 years old and split evenly by gender. Drivers' reaction times to a pedestrian in the zebra crossing were modelled using a parametric accelerated failure time (AFT) duration model with a Weibull distribution. Two different model specifications were also tested to account for the structured heterogeneity arising from the repeated measures experimental design. The Weibull AFT model with gamma heterogeneity was found to be the best fitting model and identified four significant variables influencing the reaction times, including phone condition, driver's age, license type (provisional license holder or not), and self-reported frequency of usage of handheld phones while driving. The reaction times of drivers were more than 40% longer in the distracted condition compared to baseline (not distracted). Moreover, the impairment of reaction times due to mobile phone conversations was almost double for provisional compared to open license holders.
A reduction in the ability to detect traffic events in the periphery whilst distracted presents a significant and measurable safety concern that will undoubtedly persist unless mitigated. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Bast, Callie C.; Jurena, Mark T.; Godines, Cody R.; Chamis, Christos C. (Technical Monitor)
2001-01-01
The goal of this project, which included both research and education objectives, was to advance innovative research and education in theoretical and computational probabilistic structural analysis, reliability, and life prediction for improved reliability and safety of structural components of aerospace and aircraft propulsion systems. Research and education partners included Glenn Research Center (GRC) and Southwest Research Institute (SwRI) along with the University of Texas at San Antonio (UTSA). SwRI enhanced the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) code and provided consulting support for NESSUS-related activities at UTSA. NASA funding supported three undergraduate students, two graduate students, a summer course instructor and the Principal Investigator. Matching funds from UTSA provided for the purchase of additional equipment for the enhancement of the Advanced Interactive Computational SGI Lab established during the first year of this Partnership Award to conduct the probabilistic finite element summer courses. The research portion of this report presents the culmination of work performed using the probabilistic finite element program NESSUS and an embedded Material Strength Degradation (MSD) model. Probabilistic structural analysis provided for quantification of uncertainties associated with the design, thus enabling increased system performance and reliability. The structure examined was a Space Shuttle Main Engine (SSME) fuel turbopump blade. The blade material analyzed was Inconel 718, since the MSD model was previously calibrated for this material. Reliability analysis encompassing the effects of high temperature and high cycle fatigue yielded a reliability value of 0.99978 using a fully correlated random field for the blade thickness.
The reliability did not change significantly for a change in distribution type except for a change in distribution from Gaussian to Weibull for the centrifugal load. The sensitivity factors determined to be most dominant were the centrifugal loading and the initial strength of the material. These two sensitivity factors were influenced most by a change in distribution type from Gaussian to Weibull. The education portion of this report describes short-term and long-term educational objectives. Such objectives serve to integrate research and education components of this project resulting in opportunities for ethnic minority students, principally Hispanic. The primary vehicle to facilitate such integration was the teaching of two probabilistic finite element method courses to undergraduate engineering students in the summers of 1998 and 1999.
González-Ferreiro, Eduardo; Arellano-Pérez, Stéfano; Castedo-Dorado, Fernando; Hevia, Andrea; Vega, José Antonio; Vega-Nieva, Daniel; Álvarez-González, Juan Gabriel; Ruiz-González, Ana Daría
2017-01-01
The fuel complex variables canopy bulk density and canopy base height are often used to predict crown fire initiation and spread. Direct measurement of these variables is impractical, and they are usually estimated indirectly by modelling. Recent advances in predicting crown fire behaviour require accurate estimates of the complete vertical distribution of canopy fuels. The objectives of the present study were to model the vertical profile of available canopy fuel in pine stands by using data from the Spanish national forest inventory plus low-density airborne laser scanning (ALS) metrics. In a first step, the vertical distribution of the canopy fuel load was modelled using the Weibull probability density function. In a second step, two different systems of models were fitted to estimate the canopy variables defining the vertical distributions; the first system related these variables to stand variables obtained in a field inventory, and the second system related the canopy variables to airborne laser scanning metrics. The models of each system were fitted simultaneously to compensate the effects of the inherent cross-model correlation between the canopy variables. Heteroscedasticity was also analyzed, but no correction in the fitting process was necessary. The estimated canopy fuel load profiles from field variables explained 84% and 86% of the variation in canopy fuel load for maritime pine and radiata pine respectively; whereas the estimated canopy fuel load profiles from ALS metrics explained 52% and 49% of the variation for the same species. The proposed models can be used to assess the effectiveness of different forest management alternatives for reducing crown fire hazard.
NASA Astrophysics Data System (ADS)
Abaimov, Sergey G.
The concept of self-organized criticality is associated with scale-invariant, fractal behavior; this concept is also applicable to earthquake systems. It is known that the interoccurrent frequency-size distribution of earthquakes in a region is scale-invariant and obeys the Gutenberg-Richter power-law dependence. Also, the interoccurrent time-interval distribution is known to obey Poissonian statistics excluding aftershocks. However, to estimate the hazard risk for a region it is necessary to know also the recurrent behavior of earthquakes at a given point on a fault. This behavior has been investigated in the literature, however, major questions remain unresolved. The reason is the small number of earthquakes in observed sequences. To overcome this difficulty this research utilizes numerical simulations of a slider-block model and a sand-pile model. Also, experimental observations of creep events on the creeping section of the San Andreas fault are processed and sequences up to 100 events are studied. Then the recurrent behavior of earthquakes at a given point on a fault or at a given fault is investigated. It is shown that both the recurrent frequency-size and the time-interval behaviors of earthquakes obey the Weibull distribution.
An Australian stocks and flows model for asbestos.
Donovan, Sally; Pickin, Joe
2016-10-01
All available data on asbestos consumption in Australia were collated in order to determine the most common asbestos-containing materials remaining in the built environment. The proportion of asbestos contained within each material and the types of products these materials are most commonly found in were also determined. The lifetime of these asbestos-containing products was estimated in order to develop a model that projects stocks and flows of asbestos products in Australia through to the year 2100. The model is based on a Weibull distribution and was built in an Excel spreadsheet to make it user-friendly and accessible. The nature of the products under consideration means both their asbestos content and lifetime parameters are highly variable, and so for each of these a high and a low estimate are presented along with the estimate used in the model. The user is able to vary the parameters in the model as better data become available. © The Author(s) 2016.
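A minimal sketch of such a Weibull stocks-and-flows projection: each year's inflow of product is retired over later years according to a Weibull lifetime distribution, and the stock is what remains. The inflow history, shape, and scale below are made-up placeholders, not the report's calibrated Australian asbestos parameters.

```python
import numpy as np
from scipy.stats import weibull_min

years = np.arange(1950, 2101)
inflow = np.where(years <= 1985, 1000.0, 0.0)   # tonnes installed per year (assumed)
life = weibull_min(c=4.0, scale=60.0)           # product lifetime in years (assumed)

outflow = np.zeros_like(years, dtype=float)
for i, y in enumerate(years):
    age = years - y
    mask = age > 0
    # Fraction of the year-y cohort reaching end of life in each later year.
    frac = life.cdf(age[mask]) - life.cdf(age[mask] - 1)
    outflow[i + 1:] += inflow[i] * frac

stock = np.cumsum(inflow) - np.cumsum(outflow)
peak_year = years[np.argmax(outflow)]
print(f"peak outflow year: {peak_year}; stock remaining in 2100: {stock[-1]:.0f} t")
```

The same cohort-survival bookkeeping is what a spreadsheet implementation encodes row by row; varying the shape and scale parameters shifts the timing and spread of the projected waste flows.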
NASA Technical Reports Server (NTRS)
Ricks, Trenton M.; Lacy, Thomas E., Jr.; Bednarcyk, Brett A.; Arnold, Steven M.; Hutchins, John W.
2014-01-01
A multiscale modeling methodology was developed for continuous fiber composites that incorporates a statistical distribution of fiber strengths into coupled multiscale micromechanics/finite element (FE) analyses. A modified two-parameter Weibull cumulative distribution function, which accounts for the effect of fiber length on the probability of failure, was used to characterize the statistical distribution of fiber strengths. A parametric study using the NASA Micromechanics Analysis Code with the Generalized Method of Cells (MAC/GMC) was performed to assess the effect of variable fiber strengths on local composite failure within a repeating unit cell (RUC) and subsequent global failure. The NASA code FEAMAC and the ABAQUS finite element solver were used to analyze the progressive failure of a unidirectional SCS-6/TIMETAL 21S metal matrix composite tensile dogbone specimen at 650 °C. Multiscale progressive failure analyses were performed to quantify the effect of spatially varying fiber strengths on the RUC-averaged and global stress-strain responses and failure. The ultimate composite strengths and distribution of failure locations (predominately within the gage section) reasonably matched the experimentally observed failure behavior. The predicted composite failure behavior suggests that use of macroscale models that exploit global geometric symmetries is inappropriate for cases where the actual distribution of local fiber strengths displays no such symmetries. This issue has not received much attention in the literature. Moreover, the model discretization at a specific length scale can have a profound effect on the computational costs associated with multiscale simulations, motivating models that yield accurate yet tractable results.
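A length-dependent two-parameter Weibull strength distribution of the general kind referred to above can be sketched as follows: the failure probability scales with gauge length because a longer fiber samples more flaws (weakest-link scaling). The Weibull modulus m, reference strength, and reference gauge length below are illustrative placeholders, not the calibrated SCS-6 values.

```python
import numpy as np

def failure_probability(stress, length, m=10.0, sigma0=3000.0, l0=25.4):
    """Weibull CDF with a length term: P_f = 1 - exp(-(L/L0)*(s/s0)**m).

    m (Weibull modulus), sigma0 (MPa), and l0 (mm) are assumed values.
    """
    return 1.0 - np.exp(-(length / l0) * (stress / sigma0) ** m)

def median_strength(length, m=10.0, sigma0=3000.0, l0=25.4):
    # Invert P_f = 0.5 for the given gauge length.
    return sigma0 * (np.log(2.0) * l0 / length) ** (1.0 / m)

print(f"median strength at 25.4 mm: {median_strength(25.4):.0f} MPa, "
      f"at 254 mm: {median_strength(254.0):.0f} MPa")
```

Drawing per-fiber strengths from this distribution is what introduces the spatial strength variability whose consequences for symmetry assumptions the abstract highlights.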
Saucedo-Reyes, Daniela; Carrillo-Salazar, José A; Román-Padilla, Lizbeth; Saucedo-Veloz, Crescenciano; Reyes-Santamaría, María I; Ramírez-Gilly, Mariana; Tecante, Alberto
2018-03-01
High hydrostatic pressure inactivation kinetics of Escherichia coli ATCC 25922 and Salmonella enterica subsp. enterica serovar Typhimurium ATCC 14028 (S. typhimurium) in a low-acid mamey pulp were obtained at four pressure levels (300, 350, 400, and 450 MPa), different exposure times (0-8 min), and a temperature of 25 ± 2 °C. Survival curves showed deviations from linearity in the form of a tail (upward concavity). The primary models tested were the Weibull model, the modified Gompertz equation, and the biphasic model. The Weibull model gave the best goodness of fit (adjusted R2 > 0.956, root mean square error < 0.290) and the lowest Akaike information criterion value. Exponential-logistic and exponential decay models, and Bigelow-type and empirical models for the b'(P) and n(P) parameters, respectively, were tested as alternative secondary models. The process validation considered the two- and one-step nonlinear regressions for making predictions of the survival fraction; both regression types provided an adequate goodness of fit, and the one-step nonlinear regression clearly reduced fitting errors. The best candidate model according to Akaike information theory, with better accuracy and more reliable predictions, was the Weibull model integrated with the exponential-logistic and exponential decay secondary models as a function of time and pressure (two-step procedure) or incorporated as one equation (one-step procedure). Both mathematical expressions were used to determine the td parameter, where the desired 5-log10 reductions (5D, taking d = 5 (t5) as the criterion) in both microorganisms are attainable at 400 MPa for 5.487 ± 0.488 or 5.950 ± 0.329 min, respectively, for the one- or two-step nonlinear procedure.
Predicting a future lifetime through Box-Cox transformation.
Yang, Z
1999-09-01
In predicting a future lifetime based on a sample of past lifetimes, the Box-Cox transformation method provides a simple and unified procedure that is shown in this article to meet or often outperform the corresponding frequentist solution in terms of coverage probability and average length of prediction intervals. Kullback-Leibler information and second-order asymptotic expansion are used to justify the Box-Cox procedure. Extensive Monte Carlo simulations are also performed to evaluate the small sample behavior of the procedure. Certain popular lifetime distributions, such as the Weibull, inverse Gaussian and Birnbaum-Saunders, serve as illustrative examples. One important advantage of the Box-Cox procedure lies in its easy extension to linear model predictions where the exact frequentist solutions are often not available.
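The basic procedure can be sketched with SciPy: estimate the Box-Cox exponent from past lifetimes, form a normal-theory prediction interval on the transformed scale, and map the endpoints back. The Weibull sample parameters are illustrative, and this plug-in sketch ignores the exponent-estimation refinements the article develops.

```python
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

rng = np.random.default_rng(7)
# Past lifetimes drawn from a Weibull distribution (assumed parameters).
lifetimes = stats.weibull_min.rvs(1.5, scale=1000.0, size=200, random_state=rng)

# Transform toward normality, build a normal prediction interval on the
# transformed scale, then back-transform the endpoints.
y, lam = stats.boxcox(lifetimes)
n, m, s = y.size, y.mean(), y.std(ddof=1)
half = stats.t.ppf(0.975, n - 1) * s * np.sqrt(1.0 + 1.0 / n)
x_lo, x_hi = inv_boxcox(m - half, lam), inv_boxcox(m + half, lam)
print(f"approx. 95% prediction interval for a future lifetime: ({x_lo:.0f}, {x_hi:.0f})")
```

Because the back-transformation is monotone, the nominal coverage on the transformed scale carries over directly to the lifetime scale.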
NASA Astrophysics Data System (ADS)
Taravat, A.; Del Frate, F.
2013-09-01
As a major aspect of marine pollution, oil release into the sea has serious biological and environmental impacts. Among remote sensing systems (which offer non-destructive investigation methods), synthetic aperture radar (SAR) can provide valuable synoptic information about the position and size of an oil spill due to its wide area coverage and day/night, all-weather capabilities. In this paper we present a new automated method for oil-spill monitoring. The new approach is based on the combination of the Weibull Multiplicative Model and machine learning techniques to differentiate between dark spots and the background. First, the filter created based on the Weibull Multiplicative Model is applied to each sub-image. Second, the sub-image is segmented by two different neural network techniques (Pulsed Coupled Neural Networks and Multilayer Perceptron Neural Networks). As the last step, a very simple filtering process is used to eliminate the false targets. The proposed approaches were tested on 20 ENVISAT and ERS2 images which contained dark spots. The same parameters were used in all tests. For the overall dataset, average accuracies of 94.05% and 95.20% were obtained for the PCNN and MLP methods, respectively. The average computational time for dark-spot detection with a 256 × 256 image is about 4 s for PCNN segmentation using IDL software, which is the fastest in this field at present. Our experimental results demonstrate that the proposed approach is very fast, robust and effective. The proposed approach can be applied to future spaceborne SAR images.
A. Broido; Hsiukang Yow
1977-01-01
Even before weight loss in the low-temperature pyrolysis of cellulose becomes significant, the average degree of polymerization of the partially pyrolyzed samples drops sharply. The gel permeation chromatograms of nitrated derivatives of the samples can be described in terms of a small number of mixed size populations, each component fitted within reasonable limits by a...
Lima, Robson B DE; Bufalino, Lina; Alves, Francisco T; Silva, José A A DA; Ferreira, Rinaldo L C
2017-01-01
Currently, there is a lack of studies on the correct utilization of continuous distributions for dry tropical forests. Therefore, this work aims to investigate the diameter structure of a Brazilian tropical dry forest and to select suitable continuous distributions by means of statistical tools for the stand and the main species. Two subsets were randomly selected from 40 plots, and diameter at base height was obtained. The following functions were tested: log-normal, gamma, Weibull 2P and Burr. The best fits were selected by Akaike's information criterion. Overall, the diameter distribution of the dry tropical forest was better described by negative exponential curves and positive skewness. The forest studied showed diameter distributions with decreasing probability for larger trees, a behavior observed for both the main species and the stand. The generalization of the function fitted for the main species shows that the development of individual models is needed. The Burr function showed good flexibility in describing the diameter structure of the stand and the behavior of the Mimosa ophthalmocentra and Bauhinia cheilantha species. For Poincianella bracteosa, Aspidosperma pyrifolium and Myracrodum urundeuva, a better fit was obtained with the log-normal function.
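Selection by Akaike's information criterion, as used above, compares AIC = 2k − 2 ln L across candidate fits. A self-contained sketch on synthetic data, using two families with closed-form maximum-likelihood estimates (the exponential and log-normal stand in here for the paper's candidate set; the data are simulated, not forest measurements):

```python
import math
import random

random.seed(42)
# synthetic "diameters": log-normal sample, for illustration only
data = [random.lognormvariate(2.5, 0.5) for _ in range(500)]
n = len(data)

# exponential MLE: rate = 1/mean; log-likelihood = n*ln(rate) - rate*sum(x) = n*ln(1/mean) - n
mean = sum(data) / n
ll_exp = n * math.log(1.0 / mean) - n
aic_exp = 2 * 1 - 2 * ll_exp          # k = 1 parameter

# log-normal MLE: mu, sigma from the log data
logs = [math.log(x) for x in data]
mu = sum(logs) / n
sigma = math.sqrt(sum((l - mu) ** 2 for l in logs) / n)
ll_ln = sum(-math.log(x * sigma * math.sqrt(2 * math.pi))
            - (math.log(x) - mu) ** 2 / (2 * sigma ** 2) for x in data)
aic_ln = 2 * 2 - 2 * ll_ln            # k = 2 parameters

best = "log-normal" if aic_ln < aic_exp else "exponential"
```

Lower AIC wins; since the synthetic data are log-normal, that family should be selected here.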
Statistical theory on the analytical form of cloud particle size distributions
NASA Astrophysics Data System (ADS)
Wu, Wei; McFarquhar, Greg
2017-11-01
Several analytical forms of cloud particle size distributions (PSDs) have been used in numerical modeling and remote sensing retrieval studies of clouds and precipitation, including exponential, gamma, lognormal, and Weibull distributions. However, there is no satisfying physical explanation as to why certain distribution forms preferentially occur instead of others. Theoretically, the analytical form of a PSD can be derived by directly solving the general dynamic equation, but no analytical solutions have been found yet. Instead of using a process-level approach, the use of the principle of maximum entropy (MaxEnt) for determining the analytical form of PSDs from a system perspective is examined here. The issue of variability under coordinate transformations that arises when using the Gibbs/Shannon definition of entropy is identified, and the use of the concept of relative entropy to avoid these problems is discussed. Focusing on cloud physics, the four-parameter generalized gamma distribution is proposed as the analytical form of a PSD using the principle of maximum (relative) entropy, with assumptions on power-law relations between state variables, scale invariance, and a further constraint on the expectation of one state variable (e.g. bulk water mass). DOE ASR.
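The generalized gamma family nests the exponential, gamma, and Weibull forms named above as special cases, which can be checked numerically. A sketch of the normalized kernel of the four-parameter PSD form N(D) = N0 D^(d−1) exp(−(D/a)^p), under one common parameterization (not necessarily the authors' convention):

```python
import math

def gen_gamma_pdf(x, a, d, p):
    # generalized gamma density: f(x) = p / (a^d * Gamma(d/p)) * x^(d-1) * exp(-(x/a)^p)
    return (p / (a ** d * math.gamma(d / p))) * x ** (d - 1) * math.exp(-(x / a) ** p)

def weibull_pdf(x, k, lam):
    return (k / lam) * (x / lam) ** (k - 1) * math.exp(-(x / lam) ** k)

def gamma_pdf(x, shape, scale):
    return x ** (shape - 1) * math.exp(-x / scale) / (math.gamma(shape) * scale ** shape)

# special cases: Weibull when d = p = k; gamma when p = 1
x = 1.3
assert abs(gen_gamma_pdf(x, 1.5, 2.0, 2.0) - weibull_pdf(x, 2.0, 1.5)) < 1e-12
assert abs(gen_gamma_pdf(x, 1.5, 2.0, 1.0) - gamma_pdf(x, 2.0, 1.5)) < 1e-12
```

The exponential is then the further special case d = p = 1, which is why MaxEnt arguments that land on this family cover the whole list of PSD forms in one stroke.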
NASA Astrophysics Data System (ADS)
Platonov, Vladimir S.; Kislov, Alexander V.
2016-11-01
A statistical analysis of extreme weather events over coastal areas of the Russian Arctic based on observational data has revealed many interesting features of wind velocity distributions. It has been shown that the extremes contain data belonging to two different statistical populations, each reliably described by a Weibull distribution. According to the standard terminology, these sets of extremes are named ‘black swans’ and ‘dragons’. The ‘dragons’ are responsible for most extremes, surpassing the ‘black swans’ by 10-30%. Since the data of the global climate model INM-CM4 do not contain ‘dragons’, the wind speed extremes are investigated on the mesoscale using the COSMO-CLM model. The modelling results reveal no differences between the ‘swans’ and ‘dragons’ situations. This may be associated with the limited sample data used. However, according to many case studies and modelling results, we assume that it is caused by a rare superposition of large-scale synoptic factors and many local meso- and microscale factors (surface, coastline configuration, etc.). Further studies of extreme wind speeds in the Arctic, such as ‘black swans’ and ‘dragons’, should focus on non-hydrostatic high-resolution atmospheric modelling using downscaling techniques.
Bozkurt, Hayriye; D'Souza, Doris H; Davidson, P Michael
2015-07-01
Human noroviruses (HNoV) and hepatitis A virus (HAV) have been implicated in outbreaks linked to the consumption of presliced ready-to-eat deli meats. The objectives of this research were to determine the thermal inactivation kinetics of HNoV surrogates (murine norovirus 1 [MNV-1] and feline calicivirus strain F9 [FCV-F9]) and HAV in turkey deli meat, compare first-order and Weibull models to describe the data, and calculate Arrhenius activation energy values for each model. The D (decimal reduction time) values in the temperature range of 50 to 72°C calculated from the first-order model were 0.1 ± 0.0 to 9.9 ± 3.9 min for FCV-F9, 0.2 ± 0.0 to 21.0 ± 0.8 min for MNV-1, and 1.0 ± 0.1 to 42.0 ± 5.6 min for HAV. Using the Weibull model, the tD = 1 (time to destroy 1 log) values for FCV-F9, MNV-1, and HAV at the same temperatures ranged from 0.1 ± 0.0 to 11.9 ± 5.1 min, from 0.3 ± 0.1 to 17.8 ± 1.8 min, and from 0.6 ± 0.3 to 25.9 ± 3.7 min, respectively. The z (thermal resistance) values for FCV-F9, MNV-1, and HAV were 11.3 ± 2.1°C, 11.0 ± 1.6°C, and 13.4 ± 2.6°C, respectively, using the Weibull model. The z values using the first-order model were 11.9 ± 1.0°C, 10.9 ± 1.3°C, and 12.8 ± 1.7°C for FCV-F9, MNV-1, and HAV, respectively. For the Weibull model, estimated activation energies for FCV-F9, MNV-1, and HAV were 214 ± 28, 242 ± 36, and 154 ± 19 kJ/mole, respectively, while the calculated activation energies for the first-order model were 181 ± 16, 196 ± 5, and 167 ± 9 kJ/mole, respectively. Precise information on the thermal inactivation of HNoV surrogates and HAV in turkey deli meat was generated. This provided calculations of parameters for more-reliable thermal processes to inactivate viruses in contaminated presliced ready-to-eat deli meats and thus to reduce the risk of foodborne illness outbreaks. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
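The Weibull survival model behind the tD=1 values above is commonly written log10(N/N0) = −(t/δ)^p, so the time to a d-log reduction is t = δ·d^(1/p) (and tD=1 = δ exactly). A sketch with illustrative parameters, not the paper's fitted values:

```python
def weibull_log_reduction(t, delta, p):
    # Weibull survival curve: log10(N/N0) = -(t/delta)^p
    return -((t / delta) ** p)

def time_for_d_log(d, delta, p):
    # invert for the time to a d-log reduction: t = delta * d^(1/p)
    return delta * d ** (1.0 / p)

# illustrative parameters only (delta in minutes, p dimensionless)
delta, p = 3.0, 0.8
t1 = time_for_d_log(1.0, delta, p)   # tD=1: equals delta when d = 1
t5 = time_for_d_log(5.0, delta, p)   # a 5-log process time
```

With p < 1 the curve has upward concavity (tailing), so each additional log reduction costs proportionally more time than under the first-order model, where t = D·d is linear in d.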
Probabilistic structural analysis of a truss typical for space station
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.
1990-01-01
A three-bay space cantilever truss is probabilistically evaluated using the computer code NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) to identify and quantify the uncertainties and respective sensitivities associated with corresponding uncertainties in the primitive variables (structural, material, and loads parameters) that define the truss. The distribution of each of these primitive variables is described by one of several available distributions, such as the Weibull, exponential, normal, or log-normal. The cumulative distribution functions (CDFs) for the response functions considered, and the sensitivities associated with the primitive variables for a given response, are investigated. These sensitivities help in determining the dominant primitive variables for that response.
Universal behaviour in the stock market: Time dynamics of the electronic orderbook
NASA Astrophysics Data System (ADS)
Kızılersü, Ayşe; Kreer, Markus; Thomas, Anthony W.; Feindt, Michael
2016-07-01
A consequence of the digital revolution is that share trading at the stock exchange takes place via electronic order books which are accessed by traders and investors via the internet. Our empirical findings of the London Stock Exchange demonstrate that once ultra-high frequency manipulation on time scales less than around ten milliseconds is excluded, all relevant changes in the order book happen with time differences that are randomly distributed and well described by a left-truncated Weibull distribution with universal shape parameter (independent of time and same for all stocks). The universal shape parameter corresponds to maximum entropy of the distribution.
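A left-truncated Weibull, as fitted above, conditions the distribution on exceeding a cutoff x0 (here the roughly ten-millisecond scale below which events are excluded). A sketch of inverse-CDF sampling from it, with illustrative parameters rather than the fitted universal shape:

```python
import math
import random

def sample_trunc_weibull(k, lam, x0, rng):
    # inverse-CDF sampling from a Weibull left-truncated at x0:
    # survival S(x) = exp(-((x/lam)^k - (x0/lam)^k)) for x >= x0,
    # so x = lam * ((x0/lam)^k - ln(1 - u))^(1/k) with u ~ Uniform(0, 1)
    u = rng.random()
    return lam * ((x0 / lam) ** k - math.log(1.0 - u)) ** (1.0 / k)

rng = random.Random(1)
# illustrative: shape k = 0.7, scale 2.0 s, truncation at 10 ms
xs = [sample_trunc_weibull(0.7, 2.0, 0.01, rng) for _ in range(10000)]
```

Every sample lies at or above the truncation point by construction, mirroring the exclusion of ultra-high-frequency activity from the empirical order-book data.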
Scaling and Multifractality in Road Accidental Distances
NASA Astrophysics Data System (ADS)
Qiu, Tian; Wan, Chi; Zou, Xiang-Xiang; Wang, Xiao-Fan
Accidental distance dynamics is investigated based on road accident data from Great Britain. The distance distribution of all districts as an ensemble presents a power-law tail, which differs from that of individual districts. A universal distribution, well fitted by the Weibull distribution, is found for different districts by rescaling the distribution functions of individual districts. Male and female drivers behave similarly in the distance distribution. The multifractal characteristics are further studied for individual districts and for all districts as an ensemble, and different behaviors are again revealed between them: the accidental distances of individual districts show weak multifractality, whereas all districts taken as an ensemble present strong multifractality.
Modelling of PM10 concentration for industrialized area in Malaysia: A case study in Shah Alam
NASA Astrophysics Data System (ADS)
N, Norazian Mohamed; Abdullah, M. M. A.; Tan, Cheng-yau; Ramli, N. A.; Yahaya, A. S.; Fitri, N. F. M. Y.
In Malaysia, the predominant air pollutants are suspended particulate matter (SPM) and nitrogen dioxide (NO2). This research focuses on PM10 because it may harm human health as well as the environment. Six distributions, namely Weibull, log-normal, gamma, Rayleigh, Gumbel and Frechet, were chosen to model the PM10 observations at the chosen industrial area, i.e. Shah Alam. One-year periods of hourly average data for 2006 and 2007 were used for this research. For parameter estimation, the method of maximum likelihood estimation (MLE) was selected. Four performance indicators, mean absolute error (MAE), root mean squared error (RMSE), coefficient of determination (R2) and prediction accuracy (PA), were applied to determine the goodness of fit of the distributions. The distribution best fitting the PM10 observations in Shah Alam was found to be the log-normal. The probabilities of exceedance concentrations were calculated, and the return period for the coming year was predicted from the cumulative density function (cdf) of the best-fit distributions. For the 2006 data, Shah Alam was predicted to exceed 150 μg/m3 for 5.9 days in 2007, with a return period of one occurrence per 62 days. For 2007, the studied area does not exceed the MAAQG of 150 μg/m3.
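The exceedance probability and return period reported above follow directly from the fitted cdf: P(X > c) = 1 − F(c), and the return period is its reciprocal in units of the averaging interval. A sketch with a log-normal fit and purely hypothetical parameters (mu and sigma below are not the study's estimates):

```python
import math

def lognorm_cdf(x, mu, sigma):
    # CDF of a log-normal distribution via the error function
    return 0.5 * (1.0 + math.erf((math.log(x) - mu) / (sigma * math.sqrt(2.0))))

mu, sigma = 3.8, 0.45           # hypothetical fitted parameters (log scale)
threshold = 150.0               # ug/m3 guideline value
p_exceed = 1.0 - lognorm_cdf(threshold, mu, sigma)
return_period = 1.0 / p_exceed  # in units of the averaging interval
expected_days = 365.0 * p_exceed  # expected exceedance days per year (daily averages)
```

Scaling the exceedance probability by the number of observation intervals per year gives the expected number of exceedance days, exactly the form of the "5.9 days" prediction above.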
The Dynamics of Conditioning and Extinction
Killeen, Peter R.; Sanabria, Federico; Dolgov, Igor
2009-01-01
Pigeons responded to intermittently reinforced classical conditioning trials with erratic bouts of responding to the CS. Responding depended on whether the prior trial contained a peck, food, or both. A linear-persistence/learning model moved animals into and out of a response state, and a Weibull distribution for number of within-trial responses governed in-state pecking. Variations of trial and inter-trial durations caused correlated changes in rate and probability of responding, and model parameters. A novel prediction—in the protracted absence of food, response rates can plateau above zero—was validated. The model predicted smooth acquisition functions when instantiated with the probability of food, but a more accurate jagged learning curve when instantiated with trial-to-trial records of reinforcement. The Skinnerian parameter was dominant only when food could be accelerated or delayed by pecking. These experiments provide a framework for trial-by-trial accounts of conditioning and extinction that increases the information available from the data, permitting them to comment more definitively on complex contemporary models of momentum and conditioning. PMID:19839699
Optimizing preventive maintenance policy: A data-driven application for a light rail braking system.
Corman, Francesco; Kraijema, Sander; Godjevac, Milinko; Lodewijks, Gabriel
2017-10-01
This article presents a case study determining the optimal preventive maintenance policy for a light rail rolling stock system in terms of reliability, availability, and maintenance costs. The maintenance policy defines one of the three predefined preventive maintenance actions at fixed time-based intervals for each of the subsystems of the braking system. Based on work, maintenance, and failure data, we model the reliability degradation of the system and its subsystems under the current maintenance policy by a Weibull distribution. We then analytically determine the relation between reliability, availability, and maintenance costs. We validate the model against recorded reliability and availability and get further insights by a dedicated sensitivity analysis. The model is then used in a sequential optimization framework determining preventive maintenance intervals to improve on the key performance indicators. We show the potential of data-driven modelling to determine optimal maintenance policy: same system availability and reliability can be achieved with 30% maintenance cost reduction, by prolonging the intervals and re-grouping maintenance actions.
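Under a Weibull degradation model R(t) = exp(−(t/η)^β), the longest time-based interval that still meets a reliability target solves R(t) = target in closed form. A sketch with illustrative parameters (β, η, and the target are assumptions, not the case study's values):

```python
import math

def weibull_reliability(t, beta, eta):
    # probability the subsystem survives without failure up to time t
    return math.exp(-((t / eta) ** beta))

# hypothetical degradation parameters (days) and reliability target
beta, eta, target = 2.1, 400.0, 0.95

# longest fixed interval still meeting the target: solve R(t) = target
t_max = eta * (-math.log(target)) ** (1.0 / beta)
```

Prolonging intervals up to this bound, and grouping actions that share similar bounds, is the same lever the optimization framework above uses to cut maintenance cost without losing availability.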
Theoretical Analysis of Rain Attenuation Probability
NASA Astrophysics Data System (ADS)
Roy, Surendra Kr.; Jha, Santosh Kr.; Jha, Lallan
2007-07-01
Satellite communication technologies are now highly developed, and high-quality, distance-independent services have expanded over a very wide area. The system design of the Hokkaido integrated telecommunications (HIT) network must first overcome outages of satellite links due to rain attenuation in Ka frequency bands. In this paper a theoretical analysis of rain attenuation probability on a slant path has been made. The proposed formula is based on the Weibull distribution and incorporates recent ITU-R recommendations concerning the necessary rain rate and rain height inputs. The error behaviour of the model was tested against the rain attenuation prediction model recommended by ITU-R for a large number of experiments at different probability levels. The novel slant-path rain attenuation prediction model exhibits behaviour similar to the ITU-R model at low time percentages and a better root-mean-square error performance for probability levels above 0.02%. The set of presented models has the advantage of implementation with little complexity and is considered useful for educational and back-of-the-envelope computations.
The AFIS tree growth model for updating annual forest inventories in Minnesota
Margaret R. Holdaway
2000-01-01
As the Forest Service moves towards annual inventories, states may use model predictions of growth to update unmeasured plots. A tree growth model (AFIS) based on the scaled Weibull function and using the average-adjusted model form is presented. Annual diameter growth for four species was modeled using undisturbed plots from Minnesota's Aspen-Birch and Northern...
Universal Recurrence Time Statistics of Characteristic Earthquakes
NASA Astrophysics Data System (ADS)
Goltz, C.; Turcotte, D. L.; Abaimov, S.; Nadeau, R. M.
2006-12-01
Characteristic earthquakes are defined to occur quasi-periodically on major faults. Do recurrence time statistics of such earthquakes follow a particular statistical distribution? If so, which one? The answer is fundamental and has important implications for hazard assessment. The problem cannot be solved by comparing the goodness of statistical fits, as the available sequences are too short. The Parkfield sequence of M ≍ 6 earthquakes, one of the most extensive reliable data sets available, has grown to merely seven events with the last earthquake in 2004, for example. Recently, however, advances in seismological monitoring and improved processing methods have unveiled so-called micro-repeaters, micro-earthquakes which recur in exactly the same location on a fault. It seems plausible to regard these earthquakes as a miniature version of the classic characteristic earthquakes. Micro-repeaters are much more frequent than major earthquakes, leading to longer sequences for analysis. Due to their recent discovery, however, available sequences contain fewer than 20 events at present. In this paper we present results for the analysis of recurrence times for several micro-repeater sequences from Parkfield and adjacent regions. To improve the statistical significance of our findings, we combine several sequences into one by rescaling the individual sets by their respective mean recurrence intervals and Weibull exponents. This novel approach of rescaled combination yields the most extensive data set possible. We find that the resulting statistics can be fitted well by an exponential distribution, confirming the universal applicability of the Weibull distribution to characteristic earthquakes. A similar result is, however, obtained from rescaled combination with regard to the lognormal distribution.
Preparation and Mechanical Behavior of Glass-Ceramics from Feldspathic Frits
NASA Astrophysics Data System (ADS)
da Silva, Fernanda A. N. G.; Barbato, Carla N.; França, Silvia C. A.; Silva, Ana Lúcia N.; de Andrade, Mônica C.
2017-10-01
Glass-ceramics were produced from frits with feldspar (79.09% wt/wt), alumina, sodium carbonate, potassium carbonate, borax and cerium dioxide. Feldspathic frits obtained at 1200 °C were shaped and sintered at various temperatures. Flexural strength results were analyzed using the Weibull statistical distribution. These materials were also characterized by x-ray diffraction and scanning electron microscopy (SEM). At 600 °C, an initial leucite formation occurred as a crystalline phase, but the amorphous phase still prevailed, with low flexural strength. On the other hand, when the temperature increased to 800 °C, flexural strength also increased, to approximately 70 MPa with a Weibull modulus m = 4.4. This behavior was explained by the formation of leucite crystals dispersed within the glassy matrix, which hinders, at a certain concentration, the propagation of cracks. However, for the sintering temperature of 1000 °C, flexural strength decreased, which may be associated with higher levels of leucite crystals, in spite of the higher reliability (m = 6.6).
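A Weibull modulus such as the m = 4.4 and m = 6.6 reported above is commonly estimated by linear regression of ln ln(1/(1−F)) on ln σ using median ranks. A sketch on synthetic strength data (the true modulus below is chosen for illustration, not taken from the paper):

```python
import math
import random

def weibull_modulus(strengths):
    # median-rank linear regression: slope of ln ln(1/(1-F)) vs ln(sigma) is m
    s = sorted(strengths)
    n = len(s)
    xs, ys = [], []
    for i, sigma in enumerate(s, start=1):
        F = (i - 0.3) / (n + 0.4)  # median-rank estimator of failure probability
        xs.append(math.log(sigma))
        ys.append(math.log(-math.log(1.0 - F)))
    xbar, ybar = sum(xs) / n, sum(ys) / n
    return (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
            / sum((x - xbar) ** 2 for x in xs))

rng = random.Random(0)
m_true, s0 = 6.6, 70.0  # illustrative modulus and characteristic strength (MPa)
# inverse-CDF Weibull samples: sigma = s0 * (-ln(1-u))^(1/m)
data = [s0 * (-math.log(1.0 - rng.random())) ** (1.0 / m_true) for _ in range(2000)]
m_est = weibull_modulus(data)
```

A steeper slope (larger m) means less scatter in strength, which is why the 1000 °C material above is called more reliable even though its mean strength dropped.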
Application of a Probabilistic Sizing Methodology for Ceramic Structures
NASA Astrophysics Data System (ADS)
Rancurel, Michael; Behar-Lafenetre, Stephanie; Cornillon, Laurence; Leroy, Francois-Henri; Coe, Graham; Laine, Benoit
2012-07-01
Ceramics are increasingly used in the space industry to take advantage of their stability and high specific stiffness. Their brittle behaviour often leads to sizing them with increased safety factors applied to the maximum stresses, which oversizes the structures. This is inconsistent with mass, the major design driver in space architecture. This paper presents a methodology to size ceramic structures based on their failure probability. From failure tests on samples, the Weibull law which characterizes the strength distribution of the material is obtained. The A-value (Q0.0195%) and B-value (Q0.195%) are then assessed to take into account the limited number of samples. A knocked-down Weibull law that interpolates the A- and B-values is also obtained. From these two laws, a most-likely and a knocked-down prediction of failure probability are computed for complex ceramic structures. The application of this methodology and its validation by test are reported in the paper.
Rolling Bearing Life Prediction-Past, Present, and Future
NASA Technical Reports Server (NTRS)
Zaretsky, E V; Poplawski, J. V.; Miller, C. R.
2000-01-01
Comparisons were made between the life prediction formulas of Lundberg and Palmgren, Ioannides and Harris, and Zaretsky and full-scale ball and roller bearing life data. The effect of Weibull slope on bearing life prediction was determined. Life factors are proposed to adjust the respective life formulas to the normalized statistical life distribution of each bearing type. The Lundberg-Palmgren method resulted in the most conservative life predictions, while the Ioannides-Harris and Zaretsky methods produced statistically similar results. Roller profile can have significant effects on bearing life prediction: roller edge loading can reduce life by as much as 98 percent. The resultant predicted life depends not only on the life equation used but also on the Weibull slope assumed, with the least variation occurring with the Zaretsky equation. The load-life exponent p of 10/3 used in the American National Standards Institute (ANSI)/American Bearing Manufacturers Association (ABMA)/International Organization for Standardization (ISO) standards is inconsistent with the majority of roller bearings designed and used today.
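The load-life exponent discussed above enters the basic rating life relation L10 = (C/P)^p, in millions of revolutions, with p = 3 for ball bearings and p = 10/3 in the ANSI/ABMA/ISO standards for roller bearings. A minimal sketch with illustrative loads:

```python
def l10_life(C, P, p):
    # basic rating life in millions of revolutions: L10 = (C/P)^p
    # C: basic dynamic load rating, P: equivalent dynamic load
    return (C / P) ** p

# illustrative loads (kN); exponents per the standards
life_ball = l10_life(30.0, 10.0, 3.0)            # (30/10)^3 = 27.0
life_roller = l10_life(30.0, 10.0, 10.0 / 3.0)   # (30/10)^(10/3)
```

The small difference between p = 10/3 and the larger exponents observed for modern roller bearings compounds sharply at light loads, which is the practical substance of the inconsistency noted above.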
CRACK GROWTH ANALYSIS OF SOLID OXIDE FUEL CELL ELECTROLYTES
DOE Office of Scientific and Technical Information (OSTI.GOV)
S. Bandopadhyay; N. Nagabhushana
2003-10-01
Defects and flaws control the structural and functional properties of ceramics. In determining the reliability and lifetime of ceramic structures it is very important to quantify the crack growth behavior of the ceramics. In addition, because of the high variability of the strength and the relatively low toughness of ceramics, a statistical design approach is necessary. The statistical nature of the strength of ceramics is currently well recognized, and is usually accounted for by utilizing Weibull or similar statistical distributions. Design tools such as CARES, using a combination of strength measurements, stress analysis, and statistics, are available and reasonably well developed. These design codes also incorporate material data such as elastic constants as well as flaw distributions and time-dependent properties. The fast fracture reliability for ceramics is often different from their time-dependent reliability. Further confounding the design complexity, the time-dependent reliability varies with the environment/temperature/stress combination. Therefore, it becomes important to be able to accurately determine the behavior of ceramics under simulated application conditions to provide a better prediction of the lifetime and reliability for a given component. In the present study, yttria-stabilized zirconia (YSZ) of 9.6 mol% yttria composition was procured in the form of tubes of length 100 mm. The composition is of interest for tubular electrolytes for solid oxide fuel cells. Rings cut from the tubes were characterized for microstructure, phase stability, mechanical strength (Weibull modulus) and fracture mechanisms. The strength at the operating condition of SOFCs (1000 °C) decreased to 95 MPa, compared to a room-temperature strength of 230 MPa; however, the Weibull modulus remained relatively unchanged. The slow crack growth (SCG) parameter, n = 17, evaluated at room temperature in air was representative of well-studied brittle materials. Based on the results, further work was planned to evaluate the strength degradation, modulus and failure in more representative environments of the SOFCs.
Improved silicon carbide for advanced heat engines
NASA Technical Reports Server (NTRS)
Whalen, Thomas J.
1988-01-01
This is the third annual technical report for the program entitled, Improved Silicon Carbide for Advanced Heat Engines, for the period February 16, 1987 to February 15, 1988. The objective of the original program was the development of high strength, high reliability silicon carbide parts with complex shapes suitable for use in advanced heat engines. Injection molding is the forming method selected for the program because it is capable of forming complex parts adaptable for mass production on an economically sound basis. The goals of the revised program are to reach a Weibull characteristic strength of 550 MPa (80 ksi) and a Weibull modulus of 16 for bars tested in 4-point loading. Two tasks are discussed: Task 1 which involves materials and process improvements, and Task 2 which is a MOR bar matrix to improve strength and reliability. Many statistically designed experiments were completed under task 1 which improved the composition of the batches, the mixing of the powders, the sinter and anneal cycles. The best results were obtained by an attritor mixing process which yielded strengths in excess of 550 MPa (80 ksi) and an individual Weibull modulus of 16.8 for a 9-sample group. Strengths measured at 1200 and 1400 C were equal to the room temperature strength. Annealing of machined test bars significantly improved the strength. Molding yields were measured and flaw distributions were observed to follow a Poisson process. The second iteration of the Task 2 matrix experiment is described.
Fluctuations in time intervals of financial data from the view point of the Gini index
NASA Astrophysics Data System (ADS)
Sazuka, Naoya; Inoue, Jun-ichi
2007-09-01
We propose an approach to explain fluctuations in time intervals of financial market data from the viewpoint of the Gini index. We show the explicit form of the Gini index for a Weibull distribution, a good candidate for describing the first passage time of foreign exchange rates. The analytical expression of the Gini index compares well with the value obtained from empirical data.
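For a Weibull distribution with shape k, the Gini index takes the closed form G = 1 − 2^(−1/k), independent of the scale. A sketch that checks this against an empirical Gini estimate on simulated first-passage times (the shape value is illustrative):

```python
import math
import random

def gini_weibull(k):
    # closed-form Gini index of a Weibull distribution with shape k
    return 1.0 - 2.0 ** (-1.0 / k)

def gini_empirical(xs):
    # standard sorted-sample estimator: G = 2*sum(i*x_(i)) / (n*sum(x)) - (n+1)/n
    xs = sorted(xs)
    n = len(xs)
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return 2.0 * cum / (n * sum(xs)) - (n + 1.0) / n

rng = random.Random(7)
k = 1.5  # illustrative shape parameter
# inverse-CDF Weibull samples with unit scale: x = (-ln(1-u))^(1/k)
sample = [(-math.log(1.0 - rng.random())) ** (1.0 / k) for _ in range(20000)]
```

The exponential case k = 1 gives G = 0.5 exactly, and larger shapes (more regular intervals) push the Gini index toward zero, which is what makes it a compact summary of interval fluctuations.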
Rescaled earthquake recurrence time statistics: application to microrepeaters
NASA Astrophysics Data System (ADS)
Goltz, Christian; Turcotte, Donald L.; Abaimov, Sergey G.; Nadeau, Robert M.; Uchida, Naoki; Matsuzawa, Toru
2009-01-01
Slip on major faults primarily occurs during `characteristic' earthquakes. The recurrence statistics of characteristic earthquakes play an important role in seismic hazard assessment. A major problem in determining applicable statistics is the short sequences of characteristic earthquakes that are available worldwide. In this paper, we introduce a rescaling technique in which sequences can be superimposed to establish larger numbers of data points. We consider the Weibull and log-normal distributions, in both cases we rescale the data using means and standard deviations. We test our approach utilizing sequences of microrepeaters, micro-earthquakes which recur in the same location on a fault. It seems plausible to regard these earthquakes as a miniature version of the classic characteristic earthquakes. Microrepeaters are much more frequent than major earthquakes, leading to longer sequences for analysis. In this paper, we present results for the analysis of recurrence times for several microrepeater sequences from Parkfield, CA as well as NE Japan. We find that, once the respective sequence can be considered to be of sufficient stationarity, the statistics can be well fitted by either a Weibull or a log-normal distribution. We clearly demonstrate this fact by our technique of rescaled combination. We conclude that the recurrence statistics of the microrepeater sequences we consider are similar to the recurrence statistics of characteristic earthquakes on major faults.
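The rescaled-combination step described above divides each recurrence-time sequence by its own mean before pooling, so the pooled set has unit mean by construction and sequences of very different recurrence scales become comparable. A minimal sketch (synthetic sequences stand in for the microrepeater data):

```python
import random

def rescale_and_combine(sequences):
    # divide each recurrence-time sequence by its own mean, then pool the results
    pooled = []
    for seq in sequences:
        m = sum(seq) / len(seq)
        pooled.extend(t / m for t in seq)
    return pooled

rng = random.Random(3)
# three synthetic sequences with very different mean recurrence intervals
seqs = [[rng.expovariate(lam) for _ in range(50)] for lam in (0.5, 2.0, 7.0)]
pooled = rescale_and_combine(seqs)
overall = sum(pooled) / len(pooled)
```

The pooled, dimensionless sample is then large enough for a meaningful Weibull or log-normal fit, which is the point of the technique.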
Rodrigo, D; Barbosa-Cánovas, G V; Martínez, A; Rodrigo, M
2003-12-01
The effects of pulsed electric fields (PEFs) on pectin methyl esterase (PME), molds and yeast, and total flora in fresh (nonpasteurized) mixed orange and carrot juice were studied. The PEF effect was more extensive when juices with high levels of initial PME activity were subjected to treatment and when PEF treatment (at 25 kV/cm for 340 μs) was combined with a moderate temperature (63 °C), with the maximum level of PME inactivation being 81.4%. These conditions produced 3.7 decimal reductions in molds and yeast and 2.4 decimal reductions in total flora. Experimental inactivation data for PME, molds and yeast, and total flora were fitted to Bigelow, Hülsheger, and Weibull inactivation models by nonlinear regression. The best fit (lowest mean square error) was obtained with the Weibull model.
Baldi, Pierre
2010-01-01
As repositories of chemical molecules continue to expand and become more open, it becomes increasingly important to develop tools to search them efficiently and assess the statistical significance of chemical similarity scores. Here we develop a general framework for understanding, modeling, predicting, and approximating the distribution of chemical similarity scores and its extreme values in large databases. The framework can be applied to different chemical representations and similarity measures but is demonstrated here using the most common binary fingerprints with the Tanimoto similarity measure. After introducing several probabilistic models of fingerprints, including the Conditional Gaussian Uniform model, we show that the distribution of Tanimoto scores can be approximated by the distribution of the ratio of two correlated Normal random variables associated with the corresponding unions and intersections. This remains true also when the distribution of similarity scores is conditioned on the size of the query molecules in order to derive more fine-grained results and improve chemical retrieval. The corresponding extreme value distributions for the maximum scores are approximated by Weibull distributions. From these various distributions and their analytical forms, Z-scores, E-values, and p-values are derived to assess the significance of similarity scores. In addition, the framework allows one to predict also the value of standard chemical retrieval metrics, such as Sensitivity and Specificity at fixed thresholds, or ROC (Receiver Operating Characteristic) curves at multiple thresholds, and to detect outliers in the form of atypical molecules. Numerous and diverse experiments carried in part with large sets of molecules from the ChemDB show remarkable agreement between theory and empirical results. PMID:20540577
NASA Astrophysics Data System (ADS)
Demiray, Engin; Tulek, Yahya
2017-05-01
Rehydration is a complex process aimed at restoring the properties of the raw material when dried material comes into contact with water. In the present research, studies were conducted to probe the kinetics of rehydration of sun-dried red peppers. The kinetics of rehydrating sun-dried red peppers was studied at three different temperatures (25, 35 and 45 °C). To describe the rehydration kinetics, four different models were considered: Peleg's, Weibull, first-order and exponential association. Among these four models, the Weibull model gave the best fit for all rehydration conditions applied. The effective moisture diffusivity values of red peppers increased as the rehydration water temperature increased, and were in the range 1.37 × 10⁻⁹ to 1.48 × 10⁻⁹ m² s⁻¹. In addition, the activation energy for the rehydration kinetics was calculated using the Arrhenius equation and found to be 3.17 kJ mol⁻¹.
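The Arrhenius calculation mentioned at the end follows from k = A·exp(-Ea/(R·T)): given rate constants at two absolute temperatures, the activation energy falls out of the ratio. A small sketch, with rate constants and temperatures that are illustrative rather than the paper's measured values:

```python
import math

R = 8.314  # universal gas constant, J mol^-1 K^-1

def activation_energy(k1, t1, k2, t2):
    """Arrhenius activation energy (J/mol) from rate constants k1, k2 at
    absolute temperatures t1, t2 (K): k = A*exp(-Ea/(R*T)) implies
    Ea = R * ln(k2/k1) / (1/t1 - 1/t2)."""
    return R * math.log(k2 / k1) / (1.0 / t1 - 1.0 / t2)
```

With three or more temperatures, as in the study, Ea is more robustly obtained as the slope of a linear regression of ln k against 1/T.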
Genet, Martin; Houmard, Manuel; Eslava, Salvador; Saiz, Eduardo; Tomsia, Antoni P.
2012-01-01
This paper introduces our approach to modeling the mechanical behavior of cellular ceramics, through the example of calcium phosphate scaffolds made by robocasting for bone-tissue engineering. The Weibull theory is used to describe the statistical failure of the scaffolds’ constitutive rods, and the Sanchez-Palencia theory of periodic homogenization is used to link the rod and scaffold scales. Uniaxial compression of scaffolds and three-point bending of rods were performed to calibrate and validate the model. While calibration based on rod-scale data leads to over-conservative predictions of the scaffold’s properties (as the rods’ successive failures are not taken into account), we show that, for a given rod diameter, calibration based on scaffold-scale data leads to very satisfactory predictions for a wide range of rod spacing, i.e. of scaffold porosity, as well as for different loading conditions. This work establishes the proposed model as a reliable tool for understanding and optimizing cellular ceramics’ mechanical properties. PMID:23439936
The acquisition of conditioned responding.
Harris, Justin A
2011-04-01
This report analyzes the acquisition of conditioned responses in rats trained in a magazine approach paradigm. Following the suggestion by Gallistel, Fairhurst, and Balsam (2004), Weibull functions were fitted to the trial-by-trial response rates of individual rats. These showed that the emergence of responding was often delayed, after which the response rate would increase relatively gradually across trials. The fit of the Weibull function to the behavioral data of each rat was equaled by that of a cumulative exponential function incorporating a response threshold. Thus, the growth in conditioning strength on each trial can be modeled by the derivative of the exponential--a difference term of the form used in many models of associative learning (e.g., Rescorla & Wagner, 1972). Further analyses, comparing the acquisition of responding with a continuously reinforced stimulus (CRf) and a partially reinforced stimulus (PRf), provided further evidence in support of the difference term. In conclusion, the results are consistent with conventional models that describe learning as the growth of associative strength, incremented on each trial by an error-correction process.
Analysis of an experiment aimed at improving the reliability of transmission centre shafts.
Davis, T P
1995-01-01
Smith (1991) presents a paper proposing the use of Weibull regression models to establish dependence of failure data (usually times) on covariates related to the design of the test specimens and test procedures. In his article Smith made the point that good experimental design was as important in reliability applications as elsewhere, and in view of the current interest in design inspired by Taguchi and others, we pay some attention in this article to that topic. A real case study from the Ford Motor Company is presented. Our main approach is to utilize suggestions in the literature for applying standard least squares techniques of experimental analysis even when there is likely to be nonnormal error, and censoring. This approach lacks theoretical justification, but its appeal is its simplicity and flexibility. For completeness we also include some analysis based on the proportional hazards model, and in an attempt to link back to Smith (1991), look at a Weibull regression model.
Castedo-Dorado, Fernando; Hevia, Andrea; Vega, José Antonio; Vega-Nieva, Daniel; Ruiz-González, Ana Daría
2017-01-01
The fuel complex variables canopy bulk density and canopy base height are often used to predict crown fire initiation and spread. Direct measurement of these variables is impractical, and they are usually estimated indirectly by modelling. Recent advances in predicting crown fire behaviour require accurate estimates of the complete vertical distribution of canopy fuels. The objectives of the present study were to model the vertical profile of available canopy fuel in pine stands by using data from the Spanish national forest inventory plus low-density airborne laser scanning (ALS) metrics. In a first step, the vertical distribution of the canopy fuel load was modelled using the Weibull probability density function. In a second step, two different systems of models were fitted to estimate the canopy variables defining the vertical distributions; the first system related these variables to stand variables obtained in a field inventory, and the second system related the canopy variables to airborne laser scanning metrics. The models of each system were fitted simultaneously to compensate for the effects of the inherent cross-model correlation between the canopy variables. Heteroscedasticity was also analyzed, but no correction in the fitting process was necessary. The estimated canopy fuel load profiles from field variables explained 84% and 86% of the variation in canopy fuel load for maritime pine and radiata pine respectively; whereas the estimated canopy fuel load profiles from ALS metrics explained 52% and 49% of the variation for the same species. The proposed models can be used to assess the effectiveness of different forest management alternatives for reducing crown fire hazard. PMID:28448524
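Modelling the vertical fuel profile with a Weibull density amounts to distributing the total canopy fuel load over height according to a two-parameter Weibull law. A hedged sketch of that idea (the scale/shape values and height variable here are illustrative, and the paper's actual parameterization may differ):

```python
import math

def weibull_pdf(h, scale, shape):
    """Two-parameter Weibull density evaluated at height h above the
    crown base (parameters are hypothetical, for illustration only)."""
    return (shape / scale) * (h / scale) ** (shape - 1.0) * math.exp(
        -((h / scale) ** shape))

def fuel_load_between(h1, h2, total_load, scale, shape):
    """Canopy fuel load (e.g., kg/m^2) between heights h1 and h2, as the
    total load times the Weibull probability mass on [h1, h2]."""
    cdf = lambda h: 1.0 - math.exp(-((h / scale) ** shape))
    return total_load * (cdf(h2) - cdf(h1))
```

Because the Weibull CDF is available in closed form, the load in any height stratum follows directly without numerical integration, which is convenient when linking the profile to stand or ALS predictors.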
Model-free estimation of the psychometric function
Żychaluk, Kamila; Foster, David H.
2009-01-01
A subject's response to the strength of a stimulus is described by the psychometric function, from which summary measures, such as a threshold or slope, may be derived. Traditionally, this function is estimated by fitting a parametric model to the experimental data, usually the proportion of successful trials at each stimulus level. Common models include the Gaussian and Weibull cumulative distribution functions. This approach works well if the model is correct, but it can mislead if not. In practice, the correct model is rarely known. Here, a nonparametric approach based on local linear fitting is advocated. No assumption is made about the true model underlying the data, except that the function is smooth. The critical role of the bandwidth is identified, and its optimum value estimated by a cross-validation procedure. As a demonstration, seven vision and hearing data sets were fitted by the local linear method and by several parametric models. The local linear method frequently performed better and never worse than the parametric ones. Supplemental materials for this article can be downloaded from app.psychonomic-journals.org/content/supplemental. PMID:19633355
The impact of household wealth on child survival in Ghana.
Lartey, Stella T; Khanam, Rasheda; Takahashi, Shingo
2016-11-22
Improving child health is a major policy agenda for most governments, especially in developing countries. These governments have been implementing various strategies, such as improving healthcare financing, improving access to health services, and raising household education and income levels, to improve child health. Despite all these efforts, under-five and infant mortality rates remain high in many developing nations. Some previous studies have examined how economic development or a household's economic condition contributes to child survival in developing countries. In Ghana, the question of the extent to which household economic circumstances reduce infant and child mortality remains largely unanswered. Thus, the purpose of this study is to investigate the extent to which wealth affects the survival of under-five children, using data from the Demographic and Health Survey (DHS) of Ghana. In this study, we use four waves of DHS data for Ghana from 1993 to 2008. The DHS is a detailed data set that provides comprehensive information on households and their demographic characteristics in Ghana. Data were obtained by distributing questionnaires to women (from 6000 households) of reproductive age between 15 and 49 years, which asked, among other things, for their birth history information. The Weibull hazard model with gamma frailty was used to estimate the wealth effect, as well as the trend of the wealth effect on a child's survival probability. We find that household wealth status has a significant effect on child survival in Ghana. A child is more likely to survive when he/she is from a household with high wealth status. Among other factors, birth spacing and parental education were found to be highly significant in increasing a child's survival probability. Our findings offer plausible mechanisms for the association of household wealth and child survival. 
We therefore suggest that the Government of Ghana strengthen and sustain improved livelihood programs, which reduce poverty. It should also take further initiatives to increase adult education and improve health knowledge. To the best of our knowledge, this is the first study in Ghana that combines four cross-sectional data sets from the DHS to study a policy-relevant question. We extend the standard Weibull hazard model to a Weibull hazard model with gamma frailty, which gives us a more accurate estimation. Finally, the findings of this study are of interest not only because they provide insights into the determinants of child health in Ghana and other developing countries, but also because they suggest policies beyond the scope of health.
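The Weibull-with-gamma-frailty extension has a convenient closed form: if the conditional survival is Weibull, S(t|z) = exp(-z·(t/b)^a), and the frailty z is gamma-distributed with mean 1 and variance θ, the marginal survival integrates to (1 + θ·(t/b)^a)^(-1/θ). A minimal sketch with illustrative parameter values (not the study's estimates):

```python
import math

def weibull_gamma_survival(t, shape, scale, theta):
    """Marginal survival under a Weibull baseline with gamma frailty
    (frailty mean 1, variance theta):
        S(t) = (1 + theta*(t/scale)**shape)**(-1/theta).
    theta -> 0 recovers the standard Weibull survival exp(-(t/scale)**shape)."""
    h = (t / scale) ** shape
    if theta == 0.0:
        return math.exp(-h)
    return (1.0 + theta * h) ** (-1.0 / theta)
```

Because ln(1 + x) ≤ x, the frailty model always predicts higher marginal survival at any t > 0 than the no-frailty Weibull with the same baseline, reflecting the selection effect of unobserved heterogeneity.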
Development of a subway operation incident delay model using accelerated failure time approaches.
Weng, Jinxian; Zheng, Yang; Yan, Xuedong; Meng, Qiang
2014-12-01
This study aims to develop a subway operation incident delay model using the parametric accelerated failure time (AFT) approach. Six parametric AFT models, including log-logistic, lognormal and Weibull models with fixed and random parameters, are built based on Hong Kong subway operation incident data from 2005 to 2012. In addition, a Weibull model with gamma heterogeneity is also considered for comparison of model performance. The goodness-of-fit test results show that the log-logistic AFT model with random parameters is most suitable for estimating subway incident delay. The results show that a longer subway operation incident delay is highly correlated with the following factors: power cable failure, signal cable failure, turnout communication disruption and crashes involving a casualty. Vehicle failure has the least impact on the increase in subway operation incident delay. According to these results, several possible measures, such as the use of short-distance wireless communication technology (e.g., WiFi and ZigBee), are suggested to shorten the delay caused by subway operation incidents. Finally, the temporal transferability test results show that the developed log-logistic AFT model with random parameters is stable over time. Copyright © 2014 Elsevier Ltd. All rights reserved.
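The defining property of an AFT model is that covariates rescale time multiplicatively: log T = μ + Σβᵢxᵢ + σW, so the median delay under covariates x is the baseline median times exp(Σβᵢxᵢ). A hedged sketch of that mechanic; the baseline median and coefficients below are invented placeholders, not the paper's estimated values:

```python
import math

def aft_median_delay(baseline_median, betas, covariates):
    """Accelerated failure time sketch: covariates act multiplicatively
    on the time scale, median(x) = median0 * exp(sum(beta_i * x_i)).
    The coefficients here are hypothetical, for illustration only."""
    return baseline_median * math.exp(
        sum(b * x for b, x in zip(betas, covariates)))
```

For example, a positive coefficient on a power-cable-failure indicator stretches the predicted delay distribution by a constant factor at every quantile, which is what distinguishes AFT models from proportional-hazards formulations.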
CARES - CERAMICS ANALYSIS AND RELIABILITY EVALUATION OF STRUCTURES
NASA Technical Reports Server (NTRS)
Nemeth, N. N.
1994-01-01
The beneficial properties of structural ceramics include their high-temperature strength, light weight, hardness, and corrosion and oxidation resistance. For advanced heat engines, ceramics have demonstrated functional abilities at temperatures well beyond the operational limits of metals. This is offset by the fact that ceramic materials tend to be brittle. When a load is applied, their lack of significant plastic deformation causes the material to crack at microscopic flaws, destroying the component. CARES calculates the fast-fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings. The program uses results from a commercial structural analysis program (MSC/NASTRAN or ANSYS) to evaluate component reliability due to inherent surface and/or volume type flaws. A multiple material capability allows the finite element model reliability to be a function of many different ceramic material statistical characterizations. The reliability analysis uses element stress, temperature, area, and volume output, which are obtained from two dimensional shell and three dimensional solid isoparametric or axisymmetric finite elements. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effects of multi-axial stress states on material strength. The shear-sensitive Batdorf model requires a user-selected flaw geometry and a mixed-mode fracture criterion. Flaws intersecting the surface and imperfections embedded in the volume can be modeled. The total strain energy release rate theory is used as a mixed mode fracture criterion for co-planar crack extension. 
Out-of-plane crack extension criteria are approximated by a simple equation with a semi-empirical constant that can model the maximum tangential stress theory, the minimum strain energy density criterion, the maximum strain energy release rate theory, or experimental results. For comparison, Griffith's maximum tensile stress theory, the principle of independent action, and the Weibull normal stress averaging models are also included. Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities are estimated from four-point bend bar or uniform uniaxial tensile specimen fracture strength data. Parameter estimation can be performed for single or multiple failure modes by using the least-squares analysis or the maximum likelihood method. A more limited program, CARES/PC (COSMIC number LEW-15248) runs on a personal computer and estimates ceramic material properties from three-point bend bar data. CARES/PC does not perform fast fracture reliability estimation. CARES is written in FORTRAN 77 and has been implemented on DEC VAX series computers under VMS and on IBM 370 series computers under VM/CMS. On a VAX, CARES requires 10Mb of main memory. Five MSC/NASTRAN example problems and two ANSYS example problems are provided. There are two versions of CARES supplied on the distribution tape, CARES1 and CARES2. CARES2 contains sub-elements and CARES1 does not. CARES is available on a 9-track 1600 BPI VAX FILES-11 format magnetic tape (standard media) or in VAX BACKUP format on a TK50 tape cartridge. The program requires a FORTRAN 77 compiler and about 12Mb memory. CARES was developed in 1990. DEC, VAX and VMS are trademarks of Digital Equipment Corporation. IBM 370 is a trademark of International Business Machines. MSC/NASTRAN is a trademark of MacNeal-Schwendler Corporation. ANSYS is a trademark of Swanson Analysis Systems, Inc.
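CARES itself implements the shear-sensitive Batdorf multiaxial model, but the underlying two-parameter Weibull size effect can be illustrated in the simpler uniaxial, uniformly stressed volume-flaw case. This sketch is an assumption-laden simplification of that form, not the CARES algorithm:

```python
import math

def weibull_failure_probability(stress, sigma0, m, volume=1.0, v0=1.0):
    """Fast-fracture failure probability for a uniformly stressed volume
    under the two-parameter Weibull model:
        Pf = 1 - exp(-(V/V0) * (stress/sigma0)**m),
    where sigma0 is the characteristic strength of the reference volume
    V0 and m is the Weibull modulus. Uniaxial, volume-flaw case only."""
    return 1.0 - math.exp(-(volume / v0) * (stress / sigma0) ** m)
```

The formula captures the brittle size effect the abstract describes: at fixed stress, doubling the stressed volume doubles the exponent's magnitude and so raises the failure probability, which is why component reliability must be integrated over element stresses and volumes.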
Effect of Roller Profile on Cylindrical Roller Bearing Life Prediction
NASA Technical Reports Server (NTRS)
Poplawski, Joseph V.; Zaretsky, Erwin V.; Peters, Steven M.
2000-01-01
Four roller profiles used in cylindrical roller bearing design and manufacture were analyzed using both a closed-form solution and finite element analysis (FEA) for stress and life. The roller profiles analyzed were flat, tapered end, aerospace, and fully crowned, loaded against a flat raceway. Four rolling-element bearing life models were chosen for this analysis and compared: those of Weibull, Lundberg and Palmgren, Ioannides and Harris, and Zaretsky. The flat roller profile without edge loading has the longest predicted life. However, edge loading can reduce life by as much as 98 percent. The end-tapered profile produced the highest lives, though not significantly different from those of the aerospace profile. The fully crowned profile produces the lowest lives. The resultant predicted life at each stress condition depends not only on the life equation used but also on the Weibull slope assumed. For Weibull slopes of 1.5 and 2, both the Lundberg-Palmgren and Ioannides-Harris equations predict lower lives than the ANSI/ABMA/ISO standards. Based upon the Hertz stresses for line contact, the accepted load-life exponent of 10/3 results in a maximum Hertz stress-life exponent equal to 6.6. This value is inconsistent with that experienced in the field.
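The quoted stress-life exponent follows from a scaling argument: for line contact the maximum Hertz stress varies as the square root of load, so a load-life exponent p (L ∝ P⁻ᵖ) implies a stress-life exponent of 2p. A one-line check of that arithmetic, assuming only the line-contact Hertz scaling:

```python
from fractions import Fraction

# For line contact, max Hertz stress S ~ P**(1/2). If life L ~ P**(-p)
# with load-life exponent p, then L ~ S**(-2p), i.e. the stress-life
# exponent is n = 2*p. With the accepted p = 10/3, n = 20/3 ~ 6.6-6.7.
load_life_exponent = Fraction(10, 3)
stress_life_exponent = 2 * load_life_exponent
```

(For point contact the Hertz stress scales as the cube root of load instead, so the same load-life exponent would give a different stress-life exponent; the abstract's 6.6 is specific to line contact.)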
Trudeau, Michaela P; Verma, Harsha; Sampedro, Fernando; Urriola, Pedro E; Shurson, Gerald C; McKelvey, Jessica; Pillai, Suresh D; Goyal, Sagar M
2016-01-01
Infection with porcine epidemic diarrhea virus (PEDV) causes diarrhea, vomiting, and high mortality in suckling pigs. Contaminated feed has been suggested as a vehicle of transmission for PEDV. The objective of this study was to compare thermal and electron beam processing, and the inclusion of feed additives on the inactivation of PEDV in feed. Feed samples were spiked with PEDV and then heated to 120-145°C for up to 30 min or irradiated at 0-50 kGy. Another set of feed samples spiked with PEDV and mixed with Ultracid P (Nutriad), Activate DA (Novus International), KEM-GEST (Kemin Agrifood), Acid Booster (Agri-Nutrition), sugar or salt was incubated at room temperature (~25°C) for up to 21 days. At the end of incubation, the virus titers were determined by inoculation of Vero-81 cells and the virus inactivation kinetics were modeled using the Weibull distribution model. The Weibull kinetic parameter delta represented the time or eBeam dose required to reduce virus concentration by 1 log. For thermal processing, delta values ranged from 16.52 min at 120°C to 1.30 min at 145°C. For eBeam processing, a target dose of 50 kGy reduced PEDV concentration by 3 log. All additives tested were effective in reducing the survival of PEDV when compared with the control sample (delta = 17.23 days). Activate DA (0.81) and KEM-GEST (3.28) produced the fastest inactivation. In conclusion, heating swine feed at temperatures over 130°C or eBeam processing of feed with a dose over 50 kGy are effective processing steps to reduce PEDV survival. Additionally, the inclusion of selected additives can decrease PEDV survivability.
Hazard function analysis for flood planning under nonstationarity
NASA Astrophysics Data System (ADS)
Read, Laura K.; Vogel, Richard M.
2016-05-01
The field of hazard function analysis (HFA) involves a probabilistic assessment of the "time to failure" or "return period," T, of an event of interest. HFA is used in epidemiology, manufacturing, medicine, actuarial statistics, reliability engineering, economics, and elsewhere. While for a stationary process the probability distribution function (pdf) of the return period always follows an exponential distribution, the same is not true for nonstationary processes. When the process of interest, X, exhibits nonstationary behavior, HFA can provide a complementary approach to risk analysis with analytical tools particularly useful for hydrological applications. After a general introduction to HFA, we describe a new mathematical linkage between the magnitude of the flood event, X, and its return period, T, for nonstationary processes. We derive the probabilistic properties of T for a nonstationary one-parameter exponential model of X, and then use both Monte Carlo simulation and HFA to generalize the behavior of T when X arises from a nonstationary two-parameter lognormal distribution. For this case, our findings suggest that a two-parameter Weibull distribution provides a reasonable approximation for the pdf of T. We document how HFA can provide an alternative approach to characterize the probabilistic properties of both nonstationary flood series and the resulting pdf of T.
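The stationary baseline is easy to make concrete: if each year independently exceeds the design flood with probability p, the waiting time to the first exceedance is geometric (the discrete analogue of the exponential), its mean is the return period 1/p, and its hazard is constant at p. A small sketch of that case, against which the nonstationary Weibull-like behavior of T can be contrasted:

```python
def geometric_pmf(t, p):
    """Waiting time (in whole years) to the first exceedance of a design
    flood with constant annual exceedance probability p:
        P(T = t) = p * (1 - p)**(t - 1)."""
    return p * (1.0 - p) ** (t - 1)

def mean_return_period(p):
    """Mean return period of the event, E[T] = 1/p."""
    return 1.0 / p

def hazard(t, p):
    """Hazard at year t given survival to t. Under stationarity this is
    constant (= p), the discrete analogue of the exponential pdf of T."""
    survival = (1.0 - p) ** (t - 1)
    return geometric_pmf(t, p) / survival
```

Under nonstationarity p becomes p(t), the hazard is no longer flat, and the pdf of T departs from the exponential form, which is where the paper's Weibull approximation enters.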
Characterization of intermittency in renewal processes: Application to earthquakes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akimoto, Takuma; Hasumi, Tomohiro; Aizawa, Yoji
2010-03-15
We construct a one-dimensional piecewise linear intermittent map from the interevent time distribution for a given renewal process. Then, we characterize intermittency by the asymptotic behavior near the indifferent fixed point in the piecewise linear intermittent map. Thus, we provide a framework to understand a unified characterization of intermittency and also present the Lyapunov exponent for renewal processes. This method is applied to the occurrence of earthquakes using the Japan Meteorological Agency and the National Earthquake Information Center catalogs. By analyzing the return map of interevent times, we find that interevent times are not independent and identically distributed random variables but that the conditional probability distribution functions in the tail obey the Weibull distribution.
Statistical modeling of optical attenuation measurements in continental fog conditions
NASA Astrophysics Data System (ADS)
Khan, Muhammad Saeed; Amin, Muhammad; Awan, Muhammad Saleem; Minhas, Abid Ali; Saleem, Jawad; Khan, Rahimdad
2017-03-01
Free-space optics is an innovative technology that uses the atmosphere as a propagation medium to provide higher data rates. These links are heavily affected by the atmospheric channel, mainly because of fog and clouds that scatter and even block the modulated beam of light from reaching the receiver end, hence imposing severe attenuation. A comprehensive statistical study of fog effects and a deep physical understanding of the fog phenomena are very important for suggesting improvements (reliability and efficiency) in such communication systems. In this regard, 6 months of real-time measured fog attenuation data are considered and statistically investigated. A detailed statistical analysis of each fog event in that period is presented; the best probability density functions are selected on the basis of the Akaike information criterion, while the estimates of unknown parameters are computed by the maximum likelihood estimation technique. The results show that most fog attenuation events follow a normal mixture distribution and some follow the Weibull distribution.
Bridge element deterioration rates.
DOT National Transportation Integrated Search
2008-10-01
This report describes the development of bridge element deterioration rates using the NYSDOT : bridge inspection database using Markov chains and Weibull-based approaches. It is observed : that Weibull-based approach is more reliable for developing b...
A Reliability Model for Ni-BaTiO3-Based (BME) Ceramic Capacitors
NASA Technical Reports Server (NTRS)
Liu, Donhang
2014-01-01
The evaluation of multilayer ceramic capacitors (MLCCs) with base-metal electrodes (BMEs) for potential NASA space project applications requires an in-depth understanding of their reliability. The reliability of an MLCC is defined as the ability of the dielectric material to retain its insulating properties under stated environmental and operational conditions for a specified period of time t. In this presentation, a general mathematic expression of a reliability model for a BME MLCC is developed and discussed. The reliability model consists of three parts: (1) a statistical distribution that describes the individual variation of properties in a test group of samples (Weibull, log normal, normal, etc.), (2) an acceleration function that describes how a capacitor's reliability responds to external stresses such as applied voltage and temperature (all units in the test group should follow the same acceleration function if they share the same failure mode, independent of individual units), and (3) the effect and contribution of the structural and constructional characteristics of a multilayer capacitor device, such as the number of dielectric layers N, dielectric thickness d, average grain size r, and capacitor chip size S. In general, a two-parameter Weibull statistical distribution model is used in the description of a BME capacitor's reliability as a function of time. The acceleration function that relates a capacitor's reliability to external stresses is dependent on the failure mode. Two failure modes have been identified in BME MLCCs: catastrophic and slow degradation. A catastrophic failure is characterized by a time-accelerating increase in leakage current that is mainly due to existing processing defects (voids, cracks, delamination, etc.), or the extrinsic defects. A slow degradation failure is characterized by a near-linear increase in leakage current against the stress time; this is caused by the electromigration of oxygen vacancies (intrinsic defects). 
The two identified failure modes follow different acceleration functions. Catastrophic failures follow the traditional power-law relationship to the applied voltage. Slow degradation failures fit well to an exponential law relationship to the applied electrical field. Finally, the impact of capacitor structure on the reliability of BME capacitors is discussed with respect to the number of dielectric layers in an MLCC unit, the number of BaTiO3 grains per dielectric layer, and the chip size of the capacitor device.
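The power-law voltage dependence mentioned for catastrophic failures is conventionally written in the Prokopowicz-Vaskas form, where the acceleration factor between use and test conditions combines a voltage power law with an Arrhenius temperature term. A hedged sketch of that standard form; the exponent n and activation energy Ea are material-dependent, and the values in the test below are illustrative, not the presentation's:

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def acceleration_factor(v_use, v_test, t_use, t_test, n, ea):
    """Prokopowicz-Vaskas-style acceleration factor between use and
    accelerated-test conditions:
        AF = (v_test/v_use)**n * exp((ea/k) * (1/T_use - 1/T_test)),
    with voltages in volts, temperatures in kelvin, ea in eV."""
    return (v_test / v_use) ** n * math.exp(
        (ea / K_B) * (1.0 / t_use - 1.0 / t_test))
```

The slow-degradation mode described above would instead use an exponential dependence on electric field in place of the power-law voltage term, so each failure mode carries its own acceleration function, as the abstract notes.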
Transformation and Self-Similarity Properties of Gamma and Weibull Fragment Size Distributions
2015-12-01
spray formed when a fast gas stream blows over a liquid volume.” As a theoretical justification, they showed that Gamma size distributions are... International Journal of Fracture, 140, 243, 2006; P.-K. Wu, G. A. Ruff, and G. M. Faeth, Primary Breakup in Liquid-Gas Mixing Layers, Atomization and Sprays, 1, 421-440.
NASA Astrophysics Data System (ADS)
Pathak, Savita; Mondal, Seema Sarkar
2010-10-01
A multi-objective inventory model of a deteriorating item has been developed with a Weibull rate of decay, time-dependent demand, demand-dependent production, and time-varying holding cost, allowing shortages, in fuzzy environments for non-integrated and integrated businesses. Here the objective is to maximize the profit from different deteriorating items under a space constraint. The impreciseness of inventory parameters and goals for the non-integrated business has been expressed by linear membership functions. The compromised solutions are obtained by different fuzzy optimization methods. To incorporate the relative importance of the objectives, different crisp/fuzzy cardinal weights have been assigned. The models are illustrated with numerical examples, and the results of models with crisp and fuzzy weights are compared. The result for the model assuming an integrated business is obtained by using the Generalized Reduced Gradient (GRG) method. The fuzzy integrated model with imprecise inventory cost is formulated to optimize the possibility/necessity measure of the fuzzy goal of the objective function by using the credibility measure of a fuzzy event via fuzzy expectation. The results of the crisp/fuzzy integrated model are illustrated with numerical examples and the results are compared.
Markov Transition Model to Dementia with Death as a Competing Event.
Wei, Shaoceng; Xu, Liou; Kryscio, Richard J
2014-12-01
This study evaluates the effect of death as a competing event to the development of dementia in a longitudinal study of the cognitive status of elderly subjects. A multi-state Markov model with three transient states: intact cognition, mild cognitive impairment (M.C.I.) and global impairment (G.I.) and one absorbing state: dementia is used to model the cognitive panel data; transitions among states depend on four covariates age, education, prior state (intact cognition, or M.C.I., or G.I.) and the presence/absence of an apolipoprotein E-4 allele (APOE4). A Weibull model and a Cox proportional hazards (Cox PH) model are used to fit the survival from death based on age at entry and the APOE4 status. A shared random effect correlates this survival time with the transition model. Simulation studies determine the sensitivity of the maximum likelihood estimates to the violations of the Weibull and Cox PH model assumptions. Results are illustrated with an application to the Nun Study, a longitudinal cohort of 672 participants 75+ years of age at baseline and followed longitudinally with up to ten cognitive assessments per nun.
Li, Longbiao
2016-01-01
In this paper, the fatigue life of fiber-reinforced ceramic-matrix composites (CMCs) with different fiber preforms, i.e., unidirectional, cross-ply, 2D (two-dimensional), 2.5D and 3D CMCs at room and elevated temperatures in air and oxidative environments, has been predicted using the micromechanics approach. An effective coefficient of the fiber volume fraction along the loading direction (ECFL) was introduced to describe the fiber architecture of the preforms. The statistical matrix multicracking model and fracture mechanics interface debonding criterion were used to determine the matrix crack spacing and interface debonded length. Under cyclic fatigue loading, the fiber broken fraction was determined by combining the interface wear model and fiber statistical failure model at room temperature, and the interface/fiber oxidation model, interface wear model and fiber statistical failure model at elevated temperatures, based on the assumption that the fiber strength follows a two-parameter Weibull distribution and the load carried by broken and intact fibers satisfies the Global Load Sharing (GLS) criterion. When the broken fiber fraction approaches the critical value, the composite undergoes fatigue fracture. PMID:28773332
Lin, Wei-Shao; Ercoli, Carlo; Feng, Changyong; Morton, Dean
2012-07-01
The objective of this study was to compare the effect of veneering porcelain (monolithic or bilayer specimens) and core fabrication technique (heat-pressed or CAD/CAM) on the biaxial flexural strength and Weibull modulus of leucite-reinforced and lithium-disilicate glass ceramics. In addition, the effect of veneering technique (heat-pressed or powder/liquid layering) for zirconia ceramics on the biaxial flexural strength and Weibull modulus was studied. Five ceramic core materials (IPS Empress Esthetic, IPS Empress CAD, IPS e.max Press, IPS e.max CAD, IPS e.max ZirCAD) and three corresponding veneering porcelains (IPS Empress Esthetic Veneer, IPS e.max Ceram, IPS e.max ZirPress) were selected for this study. Each core material group contained three subgroups based on the core material thickness and the presence of corresponding veneering porcelain as follows: 1.5 mm core material only (subgroup 1.5C), 0.8 mm core material only (subgroup 0.8C), and 1.5 mm core/veneer group: 0.8 mm core with 0.7 mm corresponding veneering porcelain with a powder/liquid layering technique (subgroup 0.8C-0.7VL). The ZirCAD group had one additional 1.5 mm core/veneer subgroup with 0.7 mm heat-pressed veneering porcelain (subgroup 0.8C-0.7VP). The biaxial flexural strengths were compared for each subgroup (n = 10) according to ISO standard 6872:2008 with ANOVA and Tukey's post hoc multiple comparison test (p≤ 0.05). The reliability of strength was analyzed with the Weibull distribution. For all core materials, the 1.5 mm core/veneer subgroups (0.8C-0.7VL, 0.8C-0.7VP) had significantly lower mean biaxial flexural strengths (p < 0.0001) than the other two subgroups (subgroups 1.5C and 0.8C). For the ZirCAD group, the 0.8C-0.7VL subgroup had significantly lower flexural strength (p= 0.004) than subgroup 0.8C-0.7VP. Nonetheless, both veneered ZirCAD groups showed greater flexural strength than the monolithic Empress and e.max groups, regardless of core thickness and fabrication techniques. 
Comparing fabrication techniques, the Empress Esthetic/CAD and e.max Press/CAD pairs had similar biaxial flexural strengths (p= 0.28 for the Empress pair; p= 0.87 for the e.max pair); however, the e.max Press/CAD groups had significantly higher flexural strength (p < 0.0001) than the Empress Esthetic/CAD groups. Monolithic core specimens presented higher Weibull moduli for all selected core materials. For the ZirCAD group, although the bilayer 0.8C-0.7VL subgroup exhibited significantly lower flexural strength, it had a higher Weibull modulus than the 0.8C-0.7VP subgroup. The present study suggests that veneering porcelain onto a ceramic core material diminishes the flexural strength and the reliability of the bilayer specimens. Leucite-reinforced glass-ceramic cores have lower flexural strength than lithium-disilicate ones, whereas fabrication technique (heat-pressed or CAD/CAM) and specimen thickness did not affect the flexural strength of the glass ceramics. Compared with the heat-pressed veneering technique, the powder/liquid veneering technique exhibited lower flexural strength but greater reliability, with a higher Weibull modulus, for zirconia bilayer specimens. Zirconia-veneered ceramics exhibited greater flexural strength than monolithic leucite-reinforced and lithium-disilicate ceramics regardless of zirconia veneering technique (heat-pressed or powder/liquid). © 2012 by the American College of Prosthodontists.
Characteristics of traffic flow at a non-signalized intersection in the framework of game theory
NASA Astrophysics Data System (ADS)
Fan, Hongqiang; Jia, Bin; Tian, Junfang; Yun, Lifen
2014-12-01
At a non-signalized intersection, some vehicles violate the traffic rules to pass through the intersection as soon as possible. These behaviors may cause many traffic conflicts and even traffic accidents. In this paper, a simulation model is proposed to study the effects of these behaviors at a non-signalized intersection. Vehicle movement is simulated by a cellular automaton (CA) model, and game theory is introduced to simulate the intersection dynamics. Two types of driver participate in the game: cooperators (C), who obey the traffic rules, and defectors (D), who do not. A transition from cooperation to defection may occur while a cooperator is waiting at the intersection, and the critical value of the waiting time follows a Weibull distribution. One transition regime is found in the phase diagram. The simulation results illustrate the applicability of the proposed model and reveal a number of interesting insights into intersection management, including that the presence of defectors benefits the capacity of the intersection but also reduces its safety.
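The Weibull-distributed critical waiting time can be sketched as a per-driver threshold: a cooperator switches to defection once its accumulated wait exceeds a Weibull-sampled value. This is an illustrative reading of the abstract, not the paper's CA rules, and the shape/scale parameters below are made up:

```python
import random

def make_driver(shape=2.0, scale=30.0, rng=random):
    """Create a driver who starts as a cooperator ('C') with a critical
    waiting time (in simulation steps) drawn from a Weibull distribution,
    as the abstract assumes. Parameters are illustrative, not the paper's."""
    return {"strategy": "C", "waited": 0,
            "threshold": rng.weibullvariate(scale, shape)}

def step_waiting(driver):
    """Advance one waiting step; switch cooperator -> defector once the
    accumulated waiting time exceeds the Weibull threshold."""
    driver["waited"] += 1
    if driver["strategy"] == "C" and driver["waited"] > driver["threshold"]:
        driver["strategy"] = "D"
    return driver["strategy"]
```

In a full CA model this rule would be embedded in the intersection-crossing game; here it only shows how a Weibull threshold generates the cooperator-to-defector transition.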
USDA-ARS?s Scientific Manuscript database
Non-linear regression techniques are used widely to fit weed field emergence patterns to soil microclimatic indices using S-type functions. Artificial neural networks present interesting and alternative features for such modeling purposes. In this work, a univariate hydrothermal-time based Weibull m...
Derived distribution of floods based on the concept of partial area coverage with a climatic appeal
NASA Astrophysics Data System (ADS)
Iacobellis, Vito; Fiorentino, Mauro
2000-02-01
A new rationale for deriving the probability distribution of floods, which also helps in understanding the physical processes underlying the distribution itself, is presented. On this basis, a model incorporating a number of new assumptions is developed. The basic ideas are as follows: (1) The peak direct streamflow Q can always be expressed as the product of two random variates, namely, the average runoff per unit area ua and the peak contributing area a; (2) the distribution of ua conditional on a can be related to that of the rainfall depth occurring in a duration equal to a characteristic response time τa of the contributing part of the basin; and (3) τa is assumed to vary with a according to a power law. Consequently, the probability density function of Q can be found as the integral, over the total basin area A, of the density function of a times the density function of ua given a. It is suggested that ua can be expressed as a fraction of the excess rainfall and that the annual flood distribution can be related to that of Q by the hypothesis that the flood occurrence process is Poissonian. In the proposed model it is assumed, as an exploratory attempt, that a and ua are gamma and Weibull distributed, respectively. The model was applied to the annual flood series of eight gauged basins in Basilicata (southern Italy) with catchment areas ranging from 40 to 1600 km2. The results showed strong physical consistency, as the parameters tended to assume values in good agreement with well-consolidated geomorphologic knowledge, and they suggested a new key to understanding the climatic control of the probability distribution of floods.
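The core idea, Q as the product of a gamma-distributed contributing area and a Weibull-distributed unit runoff, can be illustrated by Monte Carlo sampling. This sketch deliberately simplifies the paper's model: the conditioning of ua on a (via the response time τa) is omitted, and all parameter values are made up:

```python
import random

def sample_peak_flow(n, a_shape=2.0, a_scale=100.0,
                     u_shape=1.5, u_scale=0.5, rng=random):
    """Monte Carlo sketch of the derived-distribution idea: peak direct
    streamflow Q as the product of a gamma-distributed peak contributing
    area a (km^2) and a Weibull-distributed average runoff per unit area
    ua (m^3/s/km^2). The paper's dependence of ua on a is omitted here,
    and parameter values are purely illustrative."""
    return [rng.gammavariate(a_shape, a_scale) *
            rng.weibullvariate(u_scale, u_shape)
            for _ in range(n)]
```

The empirical distribution of such samples approximates the integral over A of the density of a times the density of ua; the annual maximum would then follow by thinning with a Poisson occurrence process.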
Bradbury, Andrew W; Adam, Donald J; Bell, Jocelyn; Forbes, John F; Fowkes, F Gerry R; Gillespie, Ian; Ruckley, Charles Vaughan; Raab, Gillian M
2010-05-01
An intention-to-treat analysis of the Bypass versus Angioplasty in Severe Ischaemia of the Leg (BASIL) trial showed that in patients with severe lower limb ischemia (SLI) due to infrainguinal disease who survived for 2 years after intervention, initial randomization to a bypass surgery (BSX)-first vs balloon angioplasty (BAP)-first revascularization strategy was associated with improvements in subsequent overall survival (OS) and amputation-free survival (AFS) of about 7 and 6 months, respectively. This study explored the value of baseline factors to estimate the likelihood of survival to 2 years for the trial cohort (Cox model) and for individual BASIL trial patients (Weibull model) as an aid to clinical decision making. Of 452 patients presenting to 27 United Kingdom hospitals, 228 were randomly assigned to a BSX-first and 224 to a BAP-first revascularization strategy. Patients were monitored for at least 3 years. Baseline factors affecting the survival of the entire cohort were examined with a multivariate Cox model. The chances of survival at 1 and 2 years for patients with given baseline characteristics were estimated with a Weibull parametric model. At the end of follow-up, 172 patients (38%) were alive without major limb amputation of the trial leg, and 202 (45%) were alive. Baseline factors that were significant in the Cox model were BASIL randomization stratification group, below knee Bollinger angiogram score, body mass index, age, diabetes, creatinine level, and smoking status. Using these factors to define five equally sized groups, we identified patients with 2-year survival rates of 50% to 90%. 
The factors that contributed to the Weibull predictive model were age, presence of tissue loss, serum creatinine, number of detectable ankle pressure measurements, maximum ankle pressure measured, a history of myocardial infarction or angina, a history of stroke or transient ischemic attack, below knee Bollinger angiogram score, body mass index, and smoking status. Patients in the BASIL trial were at high risk of amputation and death regardless of revascularization strategy. However, baseline factors can be used to stratify those risks. Furthermore, within a parametric Weibull model, certain of these factors can be used to help predict outcomes for individuals. It may thus be possible to define the clinical and anatomic (angiographic) characteristics of SLI patients who are likely, and not likely, to live for >2 years after intervention. Used appropriately in the context of the BASIL trial outcomes, this may aid clinical decision making regarding a BSX- or BAP-first revascularization strategy in SLI patients like those randomized in BASIL. Copyright (c) 2010 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.; Phoenix, S. Leigh; Grimes-Ledesma, Lorie
2010-01-01
Stress rupture failure of Carbon Composite Overwrapped Pressure Vessels (COPVs) is of serious concern to the Science Mission and Constellation programs, since a number of COPVs on board space vehicles store gases under high pressure for long durations. It has become customary to establish the reliability of these vessels using so-called classic models, which are based on Weibull statistics fitted to observed stress rupture data. These stochastic models cannot account for any additional damage due to the complex pressure-time histories characteristic of COPVs being supplied for NASA missions. In particular, the classic models cannot accommodate damage due to the proof test, which every flight vessel undergoes, and it is suspected that the effects of proof testing could significantly reduce the stress rupture lifetime of COPVs. The focus of this paper is an analytical appraisal of a model that incorporates damage due to the proof test. The model examined here is based on physical mechanisms, namely micromechanics-based load sharing concepts coupled with creep rupture and Weibull statistics. The paper compares the current model to the classic model with a number of examples. In addition, several applications of the model to current ISS and Constellation program issues are examined.
NASA Astrophysics Data System (ADS)
Bothe, Oliver; Wagner, Sebastian; Zorita, Eduardo
2015-04-01
How did regional precipitation change in past centuries? We have potentially three sources of information to answer this question: there are, especially for Europe, a number of long records of local station precipitation; documentary records and natural archives of past environmental variability serve as proxy records for empirical reconstructions; and simulations with coupled climate models or Earth System Models provide estimates of the spatial structure of precipitation variability. However, instrumental records rarely extend back to the 18th century, reconstructions include large uncertainties, and simulation skill is often still unsatisfactory for precipitation. Thus, we can only seek to answer to what extent the three sources provide a consistent picture of past regional precipitation changes. This presentation describes the (lack of) consistency between the different data sources in describing changes in the distributional properties of seasonal precipitation. We concentrate on England and Wales, since two recent reconstructions and a long observation-based record are available for this domain. The season of interest is an extended spring (March, April, May, June, July; MAMJJ) over the past 350 years. The main simulated data stem from a regional simulation for the European domain with CCLM, driven at its lateral boundaries with conditions provided by an MPI-ESM COSMOS simulation for the last millennium using a high-amplitude solar forcing. A number of simulations for the past 1000 years from the Paleoclimate Modelling Intercomparison Project Phase III provide additional information. We fit a Weibull distribution to the available data sets, following the approach for calculating standardized precipitation indices, over 51-year moving windows to assess the consistency of changes in the distributional properties.
Changes in the percentiles for severe (and extreme) dry or wet conditions and in the Weibull standard deviations of precipitation estimates are generally not consistent among the different data sets. Only a few common signals are evident. Even the relatively strong exogenous forcing history of the late 18th and early 19th century appears to have only small effects on the precipitation distributions. The reconstructions differ systematically from the long instrumental record in displaying much stronger variability over their common period. The distributional properties of the two data sets evolve, to some extent, in opposite directions. The reconstructions do not reliably represent the distributions in specific periods but rather reflect low-frequency changes in the mean plus a certain amount of noise. Moreover, the multi-model simulations also do not agree on the changes over this period. The lack of consistent simulated relations under purely naturally forced and internal variability on multi-decadal time scales therefore calls into question our ability to draw dynamical inferences about regional climate variability from the PMIP3 ensemble and, in turn, from climate simulations in general. The potentially opposite evolution of reconstructions and instrumental data for the chosen domain further hampers reconciling the available information about past regional precipitation variability in England and Wales. However, we find some possibly surprising but encouraging agreement between the observed data and the regional simulation.
Test Population Selection from Weibull-Based, Monte Carlo Simulations of Fatigue Life
NASA Technical Reports Server (NTRS)
Vlcek, Brian L.; Zaretsky, Erwin V.; Hendricks, Robert C.
2008-01-01
Fatigue life is probabilistic and not deterministic. Experimentally establishing the fatigue life of materials, components, and systems is both time consuming and costly. As a result, conclusions regarding fatigue life are often inferred from a statistically insufficient number of physical tests. A proposed methodology for comparing life results as a function of variability due to Weibull parameters, variability between successive trials, and variability due to size of the experimental population is presented. Using Monte Carlo simulation of randomly selected lives from a large Weibull distribution, the variation in the L10 fatigue life of aluminum alloy AL6061 rotating rod fatigue tests was determined as a function of population size. These results were compared to the L10 fatigue lives of small (10 each) populations from AL2024, AL7075 and AL6061. For aluminum alloy AL6061, a simple algebraic relationship was established for the upper and lower L10 fatigue life limits as a function of the number of specimens failed. For most engineering applications where less than 30 percent variability can be tolerated in the maximum and minimum values, at least 30 to 35 test samples are necessary. The variability of test results based on small sample sizes can be greater than actual differences, if any, that exist between materials and can result in erroneous conclusions. The fatigue life of AL2024 is statistically longer than AL6061 and AL7075. However, there is no statistical difference between the fatigue lives of AL6061 and AL7075 even though AL7075 had a fatigue life 30 percent greater than AL6061.
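The Monte Carlo procedure described above can be sketched directly: draw small populations of lives from a Weibull distribution and record the spread of the resulting L10 estimates. This is a generic illustration of the approach, not the paper's code, and the scale/shape values and rank-based L10 estimator below are assumptions:

```python
import math
import random

def l10_from_weibull(scale, shape):
    """Closed-form L10 life of a Weibull population: the life by which
    10 percent of units have failed, scale * (-ln 0.9)**(1/shape)."""
    return scale * (-math.log(0.9)) ** (1.0 / shape)

def l10_scatter(scale, shape, n_specimens, n_trials, rng=random):
    """Sketch of the paper's idea (illustrative parameters, not the
    AL6061 values): repeatedly draw n_specimens lives from a Weibull
    distribution, estimate each trial's L10 as the empirical
    10th-percentile life, and return the min and max estimates."""
    estimates = []
    for _ in range(n_trials):
        lives = sorted(rng.weibullvariate(scale, shape)
                       for _ in range(n_specimens))
        idx = max(0, int(0.10 * n_specimens) - 1)  # rank nearest 10%
        estimates.append(lives[idx])
    return min(estimates), max(estimates)
```

Plotting the min/max band against n_specimens reproduces the qualitative conclusion that small populations yield L10 scatter large enough to mask real material differences.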
NASA Astrophysics Data System (ADS)
Scheu, B.; Fowler, A. C.
2015-12-01
Fragmentation is a ubiquitous phenomenon in many natural and engineering systems. It is the process by which an initially competent medium, solid or liquid, is broken up into a population of constituents. Examples occur in collisions and impacts of asteroids/meteorites, in explosion-driven fragmentation of munitions on a battlefield, in the fragmentation of magma in a volcanic conduit causing explosive volcanic eruptions, and in the break-up of liquid drops. Besides the mechanism of fragmentation, the resulting frequency-size distribution of the generated constituents is of central interest. Initially such distributions were fitted empirically using lognormal, Rosin-Rammler and Weibull distributions (e.g. Brown & Wohletz 1995). The sequential fragmentation theory (Brown 1989, Wohletz et al. 1989, Wohletz & Brown 1995) and the application of fractal theory to fragmentation products (Turcotte 1986, Perfect 1997, Perugini & Kueppers 2012) attempt to overcome the purely empirical nature of such fits by providing a more physical basis for the applied distribution. Both rely on an at least partially scale-invariant, and thus self-similar, random fragmentation process. Here we provide a stochastic model for the evolution of the grain size distribution during the explosion process. Our model is based on laboratory experiments in which volcanic rock samples explode naturally when rapidly depressurized from initial pressures of several MPa to ambient conditions. The physics governing this fragmentation process has been successfully modelled, and the observed fragmentation pattern could be numerically reproduced (Fowler et al. 2010). The fragmentation of these natural rocks leads to grain size distributions which vary depending on the experimental starting conditions. Our model provides a theoretical description of these different grain size distributions.
Our model combines a sequential model of the type outlined by Turcotte (1986), generalized to cater for the explosive process appropriate here, with a recipe for the production of fines in the fracturing events in which the rock fragments, as observed in the experiments. To our knowledge, this implementation of a deterministic fracturing process into a stochastic (sequential) model is unique; furthermore, it provides the model with some forecasting power.
NASA Astrophysics Data System (ADS)
Zhao, Pei; Shao, Ming-an; Horton, Robert
2011-02-01
Soil particle-size distributions (PSD) have been used to estimate soil hydraulic properties. Various parametric PSD models have been proposed to describe the soil PSD from sparse experimental data, and it is important to determine which PSD model best represents specific soils. Fourteen PSD models were examined in order to determine the best model for representing the soils deposited adjacent to dams in the China Loess Plateau: the Skaggs (S-1, S-2, and S-3), fractal (FR), Jaky (J), Lima and Silva (LS), Morgan (M), Gompertz (G), logarithm (L), exponential (E), log-exponential (LE), Weibull (W), van Genuchten type (VG), and Fredlund (F) models. Four hundred and eighty samples were obtained from soils deposited in the Liudaogou catchment. The coefficient of determination (R2), Akaike's information criterion (AIC), and a modified AIC (mAIC) were used as selection criteria. Based upon R2 and AIC, the three- and four-parameter models were both good at describing the PSDs of the deposited soils, while the LE, FR, and E models were the poorest. However, the mAIC, in conjunction with the R2 and AIC results, indicated that the W model was optimal for describing the PSD of the deposited soils once the number of model parameters was taken into account. This analysis approach is also helpful for identifying the most suitable PSD model more generally. Our results are applicable to the China Loess Plateau.
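The model-selection step can be sketched with the standard least-squares form of AIC, which penalizes models with more parameters when their residual errors are comparable. Both the AIC form and the three-parameter Weibull-type PSD curve below are common conventions assumed for illustration; the paper's exact formulations (especially its modified AIC) may differ:

```python
import math

def aic_ls(rss, n_obs, n_params):
    """AIC for a least-squares fit with Gaussian errors:
    AIC = n*ln(RSS/n) + 2k. A standard form, assumed here."""
    return n_obs * math.log(rss / n_obs) + 2 * n_params

def weibull_psd(d, a, b, c):
    """Weibull-type cumulative particle-size model,
    F(d) = c * (1 - exp(-a * d**b)), one common three-parameter
    form (assumed here), with d a particle diameter."""
    return c * (1.0 - math.exp(-a * d ** b))
```

Given fitted residual sums of squares for each of the fourteen candidate models, the model with the lowest AIC (or mAIC) would be selected.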
Determining prescription durations based on the parametric waiting time distribution.
Støvring, Henrik; Pottegård, Anton; Hallas, Jesper
2016-12-01
The purpose of the study is to develop a method to estimate the duration of single prescriptions in pharmacoepidemiological studies when the single prescription duration is not available. We developed an estimation algorithm based on maximum likelihood estimation of a parametric two-component mixture model for the waiting time distribution (WTD). The distribution component for prevalent users estimates the forward recurrence density (FRD), which is related to the distribution of time between subsequent prescription redemptions, the inter-arrival density (IAD), for users in continued treatment. We exploited this to estimate percentiles of the IAD by inversion of the estimated FRD and defined the duration of a prescription as the time within which 80% of current users will have presented themselves again. Statistical properties were examined in simulation studies, and the method was applied to empirical data for four model drugs: non-steroidal anti-inflammatory drugs (NSAIDs), warfarin, bendroflumethiazide, and levothyroxine. Simulation studies found negligible bias when the data-generating model for the IAD coincided with the FRD used in the WTD estimation (Log-Normal). When the IAD consisted of a mixture of two Log-Normal distributions but was analyzed with a single Log-Normal distribution, relative bias did not exceed 9%. Using a Log-Normal FRD, we estimated prescription durations of 117, 91, 137, and 118 days for NSAIDs, warfarin, bendroflumethiazide, and levothyroxine, respectively. Similar results were found with a Weibull FRD. The algorithm allows valid estimation of single prescription durations, especially when the WTD reliably separates current users from incident users, and may replace ad hoc decision rules in automated implementations. Copyright © 2016 John Wiley & Sons, Ltd.
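The final decision rule, defining a prescription's duration as the 80th percentile of the inter-arrival density, can be sketched for an already-fitted Log-Normal IAD. This skips the WTD mixture fitting and FRD inversion that are the paper's actual contribution; the percentile is found by bisection on the Log-Normal CDF, and the parameters in the usage note are made up:

```python
import math

def lognorm_cdf(t, mu, sigma):
    """CDF of a Log-Normal inter-arrival density (IAD) with parameters
    mu, sigma on the log-time (log-day) scale."""
    return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))

def prescription_duration(mu, sigma, coverage=0.8):
    """Sketch of the paper's decision rule under an assumed fitted
    Log-Normal IAD: the prescription duration is the time within which
    `coverage` (default 80%) of current users will have redeemed again,
    i.e. the IAD `coverage` percentile, found here by bisection."""
    lo, hi = 1e-9, 1e6  # bracket in days; CDF is monotone on it
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if lognorm_cdf(mid, mu, sigma) < coverage:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For example, an IAD with median 90 days (mu = ln 90) and sigma = 0.5 gives a duration of roughly 137 days, illustrating how the 80% rule yields durations well above the median redemption interval.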
Height extrapolation of wind data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mikhail, A.S.
1982-11-01
Hourly average data for a period of 1 year from three tall meteorological towers - the Erie tower in Colorado, the Goodnoe Hills tower in Washington, and the WKY-TV tower in Oklahoma - were used to analyze the variability of the wind shear exponent with various parameters such as thermal stability, anemometer-level wind speed, projection height, and surface roughness. Different proposed models for predicting the height variability of short-term average wind speeds were discussed, and other models that predict the height dependence of Weibull distribution parameters were tested. The observed power law exponent for all three towers showed strong dependence on the anemometer-level wind speed and stability (nighttime and daytime). It also exhibited a high degree of dependence on extrapolation height with respect to anemometer height. These dependences became less severe as the anemometer-level wind speeds increased, due to the turbulent mixing of the atmospheric boundary layer. The three models used for Weibull distribution parameter extrapolation were the velocity-dependent power law model (Justus); the velocity, surface roughness, and height-dependent model (Mikhail); and the velocity and surface roughness-dependent model (NASA). The models projected the scale parameter C fairly accurately for the Goodnoe Hills and WKY-TV towers and were less accurate for the Erie tower; however, all models overestimated the C value. The maximum error for the Mikhail model was less than 2% for Goodnoe Hills, 6% for WKY-TV, and 28% for Erie. The error associated with the prediction of the shape factor (K) was similar for the NASA, Mikhail, and Justus models, ranging from 20 to 25%. The effect of misestimation of the hub-height distribution parameters (C and K) on average power output is briefly discussed.
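The velocity-dependent power law (Justus) model mentioned above is often quoted in the following form for extrapolating the Weibull scale C and shape K from anemometer height to hub height. The constants (0.37, 0.088, the 10 m reference) are reproduced from memory of the common formulation and should be treated as assumptions to verify against the original reference:

```python
import math

def justus_extrapolate(c1, k1, z1, z2):
    """One widely quoted form of the Justus velocity-dependent power-law
    model for height extrapolation of Weibull parameters:
      n  = (0.37 - 0.088*ln(c1)) / (1 - 0.088*ln(z1/10))
      c2 = c1 * (z2/z1)**n
      k2 = k1 * (1 - 0.088*ln(z1/10)) / (1 - 0.088*ln(z2/10))
    with c in m/s and heights z in m. Constants are assumptions here."""
    n = (0.37 - 0.088 * math.log(c1)) / (1.0 - 0.088 * math.log(z1 / 10.0))
    c2 = c1 * (z2 / z1) ** n
    k2 = k1 * (1.0 - 0.088 * math.log(z1 / 10.0)) / (1.0 - 0.088 * math.log(z2 / 10.0))
    return c2, k2
```

For typical anemometer-level speeds, the model increases both C and K with height, consistent with the tower observations of reduced shear at higher wind speeds.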
Independent Orbiter Assessment (IOA): Weibull analysis report
NASA Technical Reports Server (NTRS)
Raffaelli, Gary G.
1987-01-01
The Auxiliary Power Unit (APU) and Hydraulic Power Unit (HPU) Space Shuttle subsystems were reviewed as candidates for demonstrating the Weibull analysis methodology. Three hardware components were identified as analysis candidates: the turbine wheel, the gearbox, and the gas generator. A detailed review of subsystem-level wearout and failure history revealed a lack of actual component failure data. In addition, component wearout data were not readily available or would have required a separate data accumulation effort by the vendor. Because adequate component history data were not available, application of the Weibull analysis methodology to the APU and HPU subsystem group was terminated.
ERIC Educational Resources Information Center
Indrayani, Ervina; Dimara, Lisiard; Paiki, Kalvin; Reba, Felix
2018-01-01
The coastal waters of East Yapen is one of the spawning sites and areas of care for marine biota in Papua. Because of its very open location, it is widely used by human activities such as fishing, residential, industrial and cruise lines. This indirectly affects the balance of coastal waters condition of East Yapen that impact on the existence of…
OBSIFRAC: database-supported software for 3D modeling of rock mass fragmentation
NASA Astrophysics Data System (ADS)
Empereur-Mot, Luc; Villemin, Thierry
2003-03-01
Under stress, fractures in rock masses tend to form fully connected networks. The mass can thus be thought of as a 3D series of blocks produced by fragmentation processes. A numerical model has been developed that uses a relational database to describe such a mass. The model, which assumes the fractures to be plane, allows data from natural networks to be used to test theories concerning fragmentation processes. In the model, blocks are bordered by faces that are composed of edges and vertices. A fracture can originate from a seed point, its orientation being controlled by the stress field specified by an orientation matrix; alternatively, it can be generated from a discrete set of given orientations and positions. Both kinds of fracture can occur together in a model. From an original simple block, a given fracture produces two simple polyhedral blocks, and the original block becomes compound. Compound and simple blocks created throughout fragmentation are stored in the database. Several fragmentation processes have been studied. In one scenario, a constant proportion of blocks is fragmented at each step of the process; the resulting distribution appears to be fractal, although seed points are random in each fragmented block. In a second scenario, division affects only one random block at each stage of the process and gives a Weibull volume distribution law. The software can be used for a large number of other applications.
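The second scenario (one random block split per step) is easy to sketch on volumes alone. This strips away OBSIFRAC's geometric machinery: instead of cutting polyhedra with planes, each split divides a block's volume by a uniform random fraction, which is an illustrative assumption rather than the software's actual rule:

```python
import random

def fragment(n_steps, rng=random):
    """Sketch of the article's second scenario: starting from one unit
    block, at each step one randomly chosen block is split into two by
    a uniform random volume fraction. (The uniform split is an
    illustrative simplification; OBSIFRAC cuts blocks with planes.)"""
    volumes = [1.0]
    for _ in range(n_steps):
        i = rng.randrange(len(volumes))
        v = volumes.pop(i)
        f = rng.random()
        volumes.extend([f * v, (1.0 - f) * v])
    return volumes
```

Running many steps and examining the resulting volume histogram is how one would check the claimed Weibull volume distribution, e.g. with the linearized-CDF fit used elsewhere in this collection.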
Pharmaceutical Evaluation of Cefuroxime Axetil Tablets Available in Drug Market of Pakistan
Israr, F.; Mahmood, Z. A.; Hassan, F.; Hasan, S. M. F.
2016-01-01
Cefuroxime is a second-generation cephalosporin antibiotic with broad-spectrum activity against Gram-positive and Gram-negative bacteria. The purpose of this research work was to evaluate the pharmaceutical quality standards of four different brands of cefuroxime axetil 125 mg tablets, with different price ranges, purchased from retail pharmacies in Pakistan. The brands were tested for physicochemical properties and in vitro dissolution in different media: 0.07N HCl, distilled water, 0.1N HCl of pH 1.2, and phosphate buffers of pH 4.5 and pH 6.8. For statistical analysis, model-dependent (zero order, first order, Korsmeyer-Peppas, Hixson-Crowell, Weibull) and model-independent (difference factor f1, similarity factor f2) approaches were applied to the dissolution profiles of all brands. All brands were found to be similar to the reference and met the compendial quality standards. Inter-brand variation was observed in disintegration time and assay, which resulted in significant differences (P<0.05) in the drug release data, and the Weibull model was observed to be the best-fit model. PMID:27168677
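The model-independent factors f1 and f2 have standard definitions, and the Weibull release model has a common parameterization; a minimal sketch of all three (the Weibull form below is one conventional choice, not necessarily the paper's exact parameterization):

```python
import math

def f1_difference(ref, test):
    """Difference factor f1 (percent) between reference and test
    dissolution profiles sampled at the same time points:
    f1 = 100 * sum|R - T| / sum(R)."""
    return 100.0 * sum(abs(r - t) for r, t in zip(ref, test)) / sum(ref)

def f2_similarity(ref, test):
    """Similarity factor f2 = 50*log10(100 / sqrt(1 + mean squared
    difference)); profiles with f2 >= 50 are conventionally similar."""
    n = len(ref)
    msd = sum((r - t) ** 2 for r, t in zip(ref, test)) / n
    return 50.0 * math.log10(100.0 / math.sqrt(1.0 + msd))

def weibull_release(t, td, beta, fmax=100.0):
    """Weibull dissolution model in one common form (assumed here):
    F(t) = fmax * (1 - exp(-(t/td)**beta)), with td the time scale
    and beta the shape parameter."""
    return fmax * (1.0 - math.exp(-(t / td) ** beta))
```

Identical profiles give f1 = 0 and f2 = 100; a uniform 10% offset at every time point drives f2 just below the conventional similarity cutoff of 50.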
Randomized central limit theorems: A unified theory.
Eliazar, Iddo; Klafter, Joseph
2010-08-01
The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles' aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles' extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic-scaling all ensemble components by a common deterministic scale. However, there are "random environment" settings in which the underlying scaling schemes are stochastic-scaling the ensemble components by different random scales. Examples of such settings include Holtsmark's law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs)-in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes-and present "randomized counterparts" to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.
Strength of a Ceramic Sectored Flexure Specimen
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wereszczak, Andrew A; Duffy, Stephen F; Baker, E. H.
2008-01-01
A new test specimen, defined here as the "sectored flexure strength specimen", was developed to measure the strength of ceramic tubes specifically for circumstances when flaws located at the tube's outer diameter are the strength-limiter and subjected to axial tension. The understanding of such strength-limitation is relevant when ceramic tubes are subjected to bending or when the internal temperature is hotter than the tube's exterior (e.g., heat exchangers). The specimen is both economically and statistically attractive because eight specimens (eight in the case of this project - but the user is not necessarily limited to eight) were extracted out of each length of tube. An analytic expression for maximum or failure stress, and relationships portraying effective area and effective volume as a function of Weibull modulus, were developed. Lastly, it was proven from the testing of two ceramics that the sectored flexure specimen was very effective at producing failures caused by strength-limiting flaws located on the tube's original outer diameter. Keywords: ceramics, strength, sectored flexure specimen, effective area, effective volume, finite-element analysis, Weibull distribution, and fractography.
Correction to the Dynamic Tensile Strength of Ice and Ice-Silicate Mixtures (Lange & Ahrens 1983)
NASA Astrophysics Data System (ADS)
Stewart, S. T.; Ahrens, T. J.
1999-03-01
We present a correction to the Weibull parameters for ice and ice-silicate mixtures (Lange & Ahrens 1983). These parameters relate the dynamic tensile strength to the strain rate. These data are useful for continuum fracture models of ice.
Bioprosthetic Aortic Valve Durability: A Meta-Regression of Published Studies.
Wang, Mansen; Furnary, Anthony P; Li, Hsin-Fang; Grunkemeier, Gary L
2017-09-01
To compare structural valve deterioration (SVD) among bioprosthetic aortic valve types, a PubMed search found 54 papers containing SVD-free curves extending to at least 10 years. The curves were digitized and fit to Weibull distributions, and the mean times to valve failure (MTTF) were calculated. Twelve valve models were collapsed into four valve types: Medtronic (Medtronic, Minneapolis, MN) and Edwards (Edwards Lifesciences, Irvine, CA) porcine; and Sorin (Sorin Group [now LivaNova], London, United Kingdom) and Edwards pericardial. Meta-regression found MTTF was associated with the patient's age, publication year, SVD definition, and valve type. Sorin pericardial valves had significantly lower risk-adjusted MTTF (higher SVD risk), and there were no significant differences in MTTF among the other three valve types. Copyright © 2017 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
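The Weibull-based MTTF computation described above has a closed form: a Weibull lifetime with shape k and scale λ has mean λΓ(1 + 1/k). A minimal sketch, using hypothetical parameter values (the paper's fitted curve parameters are not given in the abstract):

```python
import math

def weibull_mttf(scale, shape):
    """Mean time to failure of a Weibull(shape, scale) lifetime:
    MTTF = scale * Gamma(1 + 1/shape)."""
    return scale * math.gamma(1 + 1 / shape)

# Hypothetical illustrative parameters, not the paper's fitted values:
mttf = weibull_mttf(scale=20.0, shape=1.8)  # e.g., years
```

With shape = 1 the Weibull reduces to the exponential distribution, so the MTTF equals the scale parameter.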
Dynamics of Polydisperse Foam-like Emulsion
NASA Astrophysics Data System (ADS)
Hicock, Harry; Feitosa, Klebert
2011-10-01
Foam is a complex fluid whose relaxation properties are associated with the continuous diffusion of gas from small to large bubbles driven by differences in Laplace pressures. We study the dynamics of bubble rearrangements by tracking droplets of a clear, buoyantly neutral emulsion that coarsens like a foam. The droplets are imaged in three dimensions using confocal microscopy. Analysis of the images allows us to measure their positions and radii, and track their evolution in time. We find that the droplet size distribution fits a Weibull distribution characteristic of foam systems. Additionally, we observe that droplets undergo continuous evolution interspersed with occasional large rearrangements, on par with the local relaxation behavior typical of foams.
Comparison of Methods for Estimating Low Flow Characteristics of Streams
Tasker, Gary D.
1987-01-01
Four methods for estimating the 7-day, 10-year and 7-day, 20-year low flows for streams are compared by the bootstrap method. The bootstrap method is a Monte Carlo technique in which random samples are drawn from an unspecified sampling distribution defined from observed data. The nonparametric nature of the bootstrap makes it suitable for comparing methods based on a flow series for which the true distribution is unknown. Results show that the two methods based on hypothesized distributions (Log-Pearson III and Weibull) had lower mean square errors than did the Box-Cox transformation method or the log-Boughton method, which is based on a fit of plotting positions.
A hierarchical fire frequency model to simulate temporal patterns of fire regimes in LANDIS
Jian Yang; Hong S. He; Eric J. Gustafson
2004-01-01
Fire disturbance has important ecological effects in many forest landscapes. Existing statistically based approaches can be used to examine the effects of a fire regime on forest landscape dynamics. Most examples of statistically based fire models divide a fire occurrence into two stages--fire ignition and fire initiation. However, the exponential and Weibull fire-...
Empirical membrane lifetime model for heavy duty fuel cell systems
NASA Astrophysics Data System (ADS)
Macauley, Natalia; Watson, Mark; Lauritzen, Michael; Knights, Shanna; Wang, G. Gary; Kjeang, Erik
2016-12-01
Heavy duty fuel cells used in transportation system applications such as transit buses expose the fuel cell membranes to conditions that can lead to lifetime-limiting membrane failure via combined chemical and mechanical degradation. Highly durable membranes and reliable predictive models are therefore needed in order to achieve the ultimate heavy duty fuel cell lifetime target of 25,000 h. In the present work, an empirical membrane lifetime model was developed based on laboratory data from a suite of accelerated membrane durability tests. The model considers the effects of cell voltage, temperature, oxygen concentration, humidity cycling, humidity level, and platinum in the membrane using inverse power law and exponential relationships within the framework of a general log-linear Weibull life-stress statistical distribution. The obtained model is capable of extrapolating the membrane lifetime from accelerated test conditions to use level conditions during field operation. Based on typical conditions for the Whistler, British Columbia fuel cell transit bus fleet, the model predicts a stack lifetime of 17,500 h and a membrane leak initiation time of 9200 h. Validation performed with the aid of a field operated stack confirmed the initial goal of the model to predict membrane lifetime within 20% of the actual operating time.
Arreyndip, Nkongho Ayuketang; Joseph, Ebobenow; David, Afungchui
2016-11-01
For the future installation of a wind farm in Cameroon, the wind energy potentials of three of Cameroon's coastal cities (Kribi, Douala and Limbe) are assessed using NASA average monthly wind data for 31 years (1983-2013) and compared through Weibull statistics. The Weibull parameters are estimated by the method of maximum likelihood; the mean power densities, the maximum-energy-carrying wind speeds and the most probable wind speeds are also calculated and compared over these three cities. Finally, the cumulative wind speed distributions over the wet and dry seasons are analyzed. The results show that the shape and scale parameters for Kribi, Douala and Limbe are 2.9 and 2.8, 3.9 and 1.8, and 3.08 and 2.58, respectively. The mean power densities through Weibull analysis for Kribi, Douala and Limbe are 33.7 W/m2, 8.0 W/m2 and 25.42 W/m2, respectively. The most probable wind speed and the maximum-energy-carrying wind speed were found to be 2.42 m/s and 3.35 m/s for Kribi, 2.27 m/s and 3.03 m/s for Limbe, and 1.67 m/s and 2.0 m/s for Douala, respectively. Analysis of the wind speed, and hence power, distribution over the wet and dry seasons shows that in the wet season August is the windiest month for Douala and Limbe while September is the windiest month for Kribi; in the dry season, March is the windiest month for Douala and Limbe while February is the windiest month for Kribi. In terms of mean power density, most probable wind speed and maximum-energy-carrying wind speed, Kribi is shown to be the best site for the installation of a wind farm. Generally, the wind speeds at all three locations are quite low; the average wind speeds of all three studied locations fall below 4.0 m/s, which is far below the cut-in wind speed of many modern wind turbines. However, we recommend the use of low cut-in speed wind turbines like the Savonius for stand-alone, low-energy needs.
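The kind of Weibull wind analysis described here (maximum-likelihood shape/scale estimation, most probable wind speed, maximum-energy-carrying wind speed, and mean power density) can be sketched as follows; the sample wind speeds are illustrative stand-ins, not the paper's NASA data:

```python
import math
from scipy.stats import weibull_min

# Illustrative wind-speed sample (m/s); the paper uses 31 years of NASA monthly data.
speeds = [2.1, 2.8, 3.0, 1.9, 2.5, 3.4, 2.2, 2.9, 2.6, 3.1, 2.0, 2.7]

# Maximum-likelihood fit of the 2-parameter Weibull (location fixed at 0).
k, _, c = weibull_min.fit(speeds, floc=0)  # k = shape, c = scale (m/s)

rho = 1.225  # air density at sea level, kg/m^3

# Standard Weibull wind statistics:
v_mp = c * ((k - 1) / k) ** (1 / k)        # most probable wind speed (k > 1)
v_maxE = c * ((k + 2) / k) ** (1 / k)      # speed carrying maximum energy
p_mean = 0.5 * rho * c**3 * math.gamma(1 + 3 / k)  # mean power density, W/m^2
```

The maximum-energy-carrying speed always exceeds the most probable speed, matching the ordering of the values reported for all three cities.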
A nonparametric stochastic method for generating daily climate-adjusted streamflows
NASA Astrophysics Data System (ADS)
Stagge, J. H.; Moglen, G. E.
2013-10-01
A daily stochastic streamflow generation model is presented, which successfully replicates statistics of the historical streamflow record and can produce climate-adjusted daily time series. A monthly climate model relates general circulation model (GCM)-scale climate indicators to discrete climate-streamflow states, which in turn control parameters in a daily streamflow generation model. Daily flow is generated by a two-state (increasing/decreasing) Markov chain, with rising limb increments randomly sampled from a Weibull distribution and the falling limb modeled as exponential recession. When applied to the Potomac River, a 38,000 km2 basin in the Mid-Atlantic United States, the model reproduces the daily, monthly, and annual distribution and dynamics of the historical streamflow record, including extreme low flows. This method can be used as part of water resources planning, vulnerability, and adaptation studies and offers the advantage of a parsimonious model, requiring only a sufficiently long historical streamflow record and large-scale climate data. Simulation of Potomac streamflows subject to the Special Report on Emissions Scenarios (SRES) A1b, A2, and B1 emission scenarios predict a slight increase in mean annual flows over the next century, with the majority of this increase occurring during the winter and early spring. Conversely, mean summer flows are projected to decrease due to climate change, caused by a shift to shorter, more sporadic rain events. Date of the minimum annual flow is projected to shift 2-5 days earlier by the 2070-2099 period.
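The daily generation scheme described (a two-state increasing/decreasing Markov chain with Weibull-distributed rising-limb increments and exponential recession on the falling limb) can be sketched as below; all parameter values are illustrative placeholders, not the fitted Potomac values:

```python
import random

def generate_flow(n_days, q0=100.0, p_rise=0.3, wshape=0.8, wscale=5.0, k_rec=0.95):
    """Two-state (rising/falling) Markov-chain daily flow generator.

    Rising-limb increments are Weibull-distributed; the falling limb follows
    an exponential recession Q[t+1] = k_rec * Q[t]. Parameters illustrative.
    """
    flows = [q0]
    for _ in range(n_days - 1):
        if random.random() < p_rise:  # rising state: add a Weibull increment
            flows.append(flows[-1] + random.weibullvariate(wscale, wshape))
        else:                         # falling state: exponential recession
            flows.append(flows[-1] * k_rec)
    return flows

series = generate_flow(365)
```

In the paper the state transitions and distribution parameters are conditioned on monthly climate-streamflow states; here they are held fixed for brevity.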
Englehardt, James D; Ashbolt, Nicholas J; Loewenstine, Chad; Gadzinski, Erik R; Ayenu-Prah, Albert Y
2012-06-01
Recently pathogen counts in drinking and source waters were shown theoretically to have the discrete Weibull (DW) or closely related discrete growth distribution (DGD). The result was demonstrated versus nine short-term and three simulated long-term water quality datasets. These distributions are highly skewed such that available datasets seldom represent the rare but important high-count events, making estimation of the long-term mean difficult. In the current work the methods, and data record length, required to assess long-term mean microbial count were evaluated by simulation of representative DW and DGD waterborne pathogen count distributions. Also, microbial count data were analyzed spectrally for correlation and cycles. In general, longer data records were required for more highly skewed distributions, conceptually associated with more highly treated water. In particular, 500-1,000 random samples were required for reliable assessment of the population mean ±10%, though 50-100 samples produced an estimate within one log (45%) below. A simple correlated first order model was shown to produce count series with 1/f signal, and such periodicity over many scales was shown in empirical microbial count data, for consideration in sampling. A tiered management strategy is recommended, including a plan for rapid response to unusual levels of routinely-monitored water quality indicators.
Beyond Word Frequency: Bursts, Lulls, and Scaling in the Temporal Distributions of Words
Altmann, Eduardo G.; Pierrehumbert, Janet B.; Motter, Adilson E.
2009-01-01
Background Zipf's discovery that word frequency distributions obey a power law established parallels between biological and physical processes, and language, laying the groundwork for a complex systems perspective on human communication. More recent research has also identified scaling regularities in the dynamics underlying the successive occurrences of events, suggesting the possibility of similar findings for language as well. Methodology/Principal Findings By considering frequent words in USENET discussion groups and in disparate databases where the language has different levels of formality, here we show that the distributions of distances between successive occurrences of the same word display bursty deviations from a Poisson process and are well characterized by a stretched exponential (Weibull) scaling. The extent of this deviation depends strongly on semantic type – a measure of the logicality of each word – and less strongly on frequency. We develop a generative model of this behavior that fully determines the dynamics of word usage. Conclusions/Significance Recurrence patterns of words are well described by a stretched exponential distribution of recurrence times, an empirical scaling that cannot be anticipated from Zipf's law. Because the use of words provides a uniquely precise and powerful lens on human thought and activity, our findings also have implications for other overt manifestations of collective human dynamics. PMID:19907645
A New Goodness-of-Fit Test for the Weibull Distribution Based on Spacings
1993-03-01
The report tabulates values of the Z* test statistic, their skewness, and the power of the test for sample sizes N = 20 and 30 and shape parameters K = 0.5, 1.0, and 1.5, at significance levels 0.20 through 0.01.
Accuracy of Time Phasing Aircraft Development using the Continuous Distribution Function
2015-03-26
Regression diagnostics for the fitted distribution parameters include Breusch-Pagan tests for constant variance (reported p-values of 0.5264, 0.6911, and, for the Weibull scale parameter β, 0.5176, each failing to reject the null hypothesis of constant variance), a Shapiro-Wilk normality test (Prob. < W: 0.9849), and checks of influential data for the Beta shape parameter α.
Evolution of Self-Organization in Adiabatic Shear Bands
NASA Astrophysics Data System (ADS)
Meyers, Marc A.; Xue, Qing; Nesterenko, Vitali F.
2001-06-01
The evolution of multiple adiabatic shear bands was investigated in stainless steel, an Fe-15%Cr-15%Ni alloy, titanium, and Ti-6%Al-4%V alloy through the radial collapse of a thick-walled cylinder under high-strain-rate deformation (~10^4 s^-1). The shear-band initiation, propagation, and spatial distribution were examined under different global strains (varied from 0 to 0.9). The shear-band spacing is compared with one-dimensional theoretical predictions based on perturbation (Ockendon-Wright and Molinari) and momentum diffusion (Grady-Kipp). The experimentally observed spacing reveals the two-dimensional character of self-organization. These aspects are incorporated into a novel analytical description, in which a distribution of embryos (potential initiation sites) is activated as a function of strain (greater than a threshold) according to a Weibull-type distribution. The model incorporates embryo deactivation by stress shielding as well as selective growth of shear bands. The imposed strain rate, embryo distribution, and rates of initiation and propagation determine the evolutionary shear-band configurations. The microstructural parameter investigated for stainless steel was the grain size, which was varied from 30 to 500 µm. The influence of grain size was found to be minor and to act through the flow stress. Titanium and Ti-6%Al-4%V displayed drastically different patterns of shear bands, which are explained in terms of the proposed model. Research supported by US Army Research Office MURI Program (Contract DAAH 04-96-1-0376).
Modeling Rabbit Responses to Single and Multiple Aerosol ...
Journal Article Survival models are developed here to predict response and time-to-response for mortality in rabbits following exposures to single or multiple aerosol doses of Bacillus anthracis spores. Hazard function models were developed for a multiple dose dataset to predict the probability of death through specifying dose-response functions and the time between exposure and the time-to-death (TTD). Among the models developed, the best-fitting survival model (baseline model) has an exponential dose-response model with a Weibull TTD distribution. Alternative models assessed employ different underlying dose-response functions and use the assumption that, in a multiple dose scenario, earlier doses affect the hazard functions of each subsequent dose. In addition, published mechanistic models are analyzed and compared with models developed in this paper. None of the alternative models that were assessed provided a statistically significant improvement in fit over the baseline model. The general approach utilizes simple empirical data analysis to develop parsimonious models with limited reliance on mechanistic assumptions. The baseline model predicts TTDs consistent with reported results from three independent high-dose rabbit datasets. More accurate survival models depend upon future development of dose-response datasets specifically designed to assess potential multiple dose effects on response and time-to-response. The process used in this paper to dev
Paes, P N G; Bastian, F L; Jardim, P M
2017-09-01
This study considers the efficacy of selective glass infiltration etching (SIE) treatment as a procedure to modify the zirconia surface, resulting in higher interfacial fracture toughness. Y-TZP was subjected to 5 different surface treatment conditions: no treatment (G1), SIE followed by hydrofluoric acid treatment (G2), heat treatment at 750°C (G3), hydrofluoric acid treatment (G4), and airborne-particle abrasion with alumina particles (G5). The effect of surface treatment on roughness was evaluated by atomic force microscopy, providing three different parameters: R_a, R_sk, and surface area variation. The ceramic/resin cement interface was analyzed by a fracture mechanics K_I test, with the failure mode determined by fractographic analysis. Weibull analysis was also performed to evaluate the structural integrity of the adhesion zone. G2 and G4 specimens showed very similar and high R_a values but different surface area variation (33% for G2 and 13% for G4), and they presented the highest fracture toughness (K_IC). Weibull analysis showed a tendency for G2 (SIE) to exhibit higher K_IC values than the other groups, but with more data scatter and a higher early failure probability than G4 specimens. Selective glass infiltration etching surface treatment was effective in modifying the zirconia surface roughness, increasing the bonding area and hence the mechanical imbrication at the zirconia/resin cement interface, resulting in higher fracture toughness (K_IC) values; the highest K_IC values among the experimental groups were obtained when failure probabilities above 20% were expected (Weibull distribution). Copyright © 2017 The Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
Hyperchaotic Dynamics for Light Polarization in a Laser Diode
NASA Astrophysics Data System (ADS)
Bonatto, Cristian
2018-04-01
It is shown that a highly randomlike behavior of light polarization states in the output of a free-running laser diode, covering the whole Poincaré sphere, arises as a result from a fully deterministic nonlinear process, which is characterized by a hyperchaotic dynamics of two polarization modes nonlinearly coupled with a semiconductor medium, inside the optical cavity. A number of statistical distributions were found to describe the deterministic data of the low-dimensional nonlinear flow, such as lognormal distribution for the light intensity, Gaussian distributions for the electric field components and electron densities, Rice and Rayleigh distributions, and Weibull and negative exponential distributions, for the modulus and intensity of the orthogonal linear components of the electric field, respectively. The presented results could be relevant for the generation of single units of compact light source devices to be used in low-dimensional optical hyperchaos-based applications.
Computer routines for probability distributions, random numbers, and related functions
Kirby, W.
1983-01-01
Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor F. Other mathematical functions include the Bessel function I_0, gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer-plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)
Computer routines for probability distributions, random numbers, and related functions
Kirby, W.H.
1980-01-01
Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor F tests. Other mathematical functions include the Bessel function I_0, gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)
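The Fortran distribution routines catalogued in these two reports have modern counterparts in SciPy; a brief sketch of equivalent calls:

```python
from scipy import stats

# SciPy counterparts to the report's Fortran distribution routines:
# evaluate the CDF of each distribution at a sample point.
x = 1.5
cdfs = {
    "chi-square (df=3)": stats.chi2.cdf(x, df=3),
    "gamma (a=2)": stats.gamma.cdf(x, a=2),
    "normal": stats.norm.cdf(x),
    "Pearson Type III": stats.pearson3.cdf(x, skew=0.5),
    "Weibull (k=1.5)": stats.weibull_min.cdf(x, c=1.5),
}

# Uniform and normal random numbers, as in the report's generators.
u = stats.uniform.rvs(size=5)
z = stats.norm.rvs(size=5)
```

As the reports note, uniform and normal generators suffice to sample from the other distributions via inverse-CDF or transformation methods.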
Taravat, Alireza; Oppelt, Natascha
2014-01-01
Oil spills represent a major threat to ocean ecosystems and their environmental status. Previous studies have shown that Synthetic Aperture Radar (SAR), as its recording is independent of clouds and weather, can be effectively used for the detection and classification of oil spills. Dark formation detection is the first and critical stage in oil-spill detection procedures. In this paper, a novel approach for automated dark-spot detection in SAR imagery is presented. A new approach combining an adaptive Weibull Multiplicative Model (WMM) and MultiLayer Perceptron (MLP) neural networks is proposed to differentiate between dark spots and the background. The results have been compared with those of a model combining a non-adaptive WMM and pulse coupled neural networks. The presented approach removes the need to hand-tune the non-adaptive WMM filter parameters by developing an adaptive WMM model, a step towards fully automatic dark-spot detection. The proposed approach was tested on 60 ENVISAT and ERS2 images which contained dark spots. For the overall dataset, an average accuracy of 94.65% was obtained. Our experimental results demonstrate that the proposed approach is very robust and effective where the non-adaptive WMM & pulse coupled neural network (PCNN) model generates poor accuracies. PMID:25474376
Micro-mechanics of hydro-mechanical coupled processes during hydraulic fracturing in sandstone
NASA Astrophysics Data System (ADS)
Caulk, R.; Tomac, I.
2017-12-01
This contribution presents a micro-mechanical study of hydraulic fracture initiation and propagation in sandstone. The Discrete Element Method (DEM) Yade software is used as a tool to model the fully coupled hydro-mechanical behavior of saturated sandstone under pressures typical of deep geo-reservoirs. Heterogeneity of the sandstone tensile and shear strength parameters is introduced using a statistical representation of cathodoluminescence (CL) sandstone rock images. A Weibull distribution of parameter values was determined as the best match to the CL scans of sandstone grains and the cement between grains. Results of hydraulic fracturing stimulation from the well bore indicate a significant difference between models with bond strengths informed from CL scans and a uniform homogeneous representation of sandstone parameters. Micro-mechanical insight reveals that the formed hydraulic fracture is typical of mode I, or tensile, cracking in both cases. However, shear micro-cracks are abundant in the CL-informed model while they are absent in the standard model with a uniform strength distribution. Most of the mode II cracks, or shear micro-cracks, are not part of the main hydraulic fracture and occur in the near-tip and near-fracture areas. The position and occurrence of the shear micro-cracks is characterized as a secondary effect which dissipates hydraulic fracturing energy. Additionally, the shear micro-crack locations qualitatively resemble the acoustic emission clouds of shear cracks frequently observed in hydraulic fracturing, sometimes interpreted as re-activation of existing fractures. Clearly, our model does not contain pre-existing cracks and has a continuous nature prior to fracturing. This novel observation is quantified in the paper. The shear particle contact force field reveals significant relaxation compared to the model with a uniform strength distribution.
NASA Astrophysics Data System (ADS)
Starn, J. J.; Belitz, K.; Carlson, C.
2017-12-01
Groundwater residence-time distributions (RTDs) are critical for assessing the susceptibility of water resources to contamination. In this novel approach for estimating regional RTDs, groundwater flow was first simulated using existing regional digital data sets in 13 intermediate-size watersheds (each an average of 7,000 square kilometers) that are representative of a wide range of glacial systems. RTDs were simulated with particle tracking. We refer to these models as "general models" because they are based on regional, as opposed to site-specific, digital data. Parametric RTDs were created from particle RTDs by fitting 1- and 2-component Weibull, gamma, and inverse Gaussian distributions, thus reducing a large number of particle travel times to 3 to 7 parameters (shape, location, and scale for each component, plus a mixing fraction) for each modeled area. The scale parameter of these distributions is related to the mean exponential age; the shape parameter controls departure from the ideal exponential distribution and is partly a function of interaction with bedrock and with drainage density. Given the flexible shape and mathematical similarity of these distributions, any of them is potentially a good fit to particle RTDs. The 1-component gamma distribution provided a good fit to basin-wide particle RTDs. RTDs at monitoring wells and streams often have more complicated shapes than basin-wide RTDs, caused in part by heterogeneity in the model, and generally require 2-component distributions. A machine learning model was trained on the RTD parameters using features derived from regionally available watershed characteristics such as recharge rate, material thickness, and stream density. RTDs appeared to vary systematically across the landscape in relation to watershed features.
This relation was used to produce maps of useful metrics with respect to risk-based thresholds, such as the time to first exceedance, time to maximum concentration, time above the threshold (exposure time), and the time until last exceedance; thus, the parameters of groundwater residence time are measures of the intrinsic susceptibility of groundwater to contamination.
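The parameter-reduction step described above, fitting a parametric distribution to a large set of particle travel times, can be sketched with a 1-component gamma fit (location fixed at zero, as for the basin-wide RTDs); the particle ages here are synthetic stand-ins for tracking output:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic stand-in for particle travel times (years) from particle tracking.
ages = rng.gamma(shape=1.2, scale=25.0, size=5000)

# Reduce the particle RTD to a few parameters by maximum-likelihood fitting
# of a 1-component gamma distribution with location fixed at 0.
shape, _, scale = stats.gamma.fit(ages, floc=0)

# Mean residence time implied by the fitted distribution.
mean_age = shape * scale
```

The same pattern extends to the Weibull and inverse Gaussian fits, and to 2-component mixtures for the more complicated well- and stream-scale RTDs.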
A method for developing design diagrams for ceramic and glass materials using fatigue data
NASA Technical Reports Server (NTRS)
Heslin, T. M.; Magida, M. B.; Forrest, K. A.
1986-01-01
The service lifetime of glass and ceramic materials can be expressed as a plot of time-to-failure versus applied stress whose plot is parametric in percent probability of failure. This type of plot is called a design diagram. Confidence interval estimates for such plots depend on the type of test that is used to generate the data, on assumptions made concerning the statistical distribution of the test results, and on the type of analysis used. This report outlines the development of design diagrams for glass and ceramic materials in engineering terms using static or dynamic fatigue tests, assuming either no particular statistical distribution of test results or a Weibull distribution and using either median value or homologous ratio analysis of the test results.
NASA Astrophysics Data System (ADS)
Kaoga, Dieudonné Kidmo; Bogno, Bachirou; Aillerie, Michel; Raidandi, Danwe; Yamigno, Serge Doka; Hamandjoda, Oumarou; Tibi, Beda
2016-07-01
In this work, 28 years of wind data, measured at 10 m above ground level (AGL) at the Maroua meteorological station, are utilized to assess the potential of wind energy at the exposed ridge tops of mountains surrounding the city of Maroua. The aim of this study is to estimate the cost of wind-generated electricity using six types of wind turbines (50 to 2000 kW). The Weibull distribution function is employed to estimate the Weibull shape and scale parameters using the energy pattern factor method. The wind shear model used to extrapolate the Weibull parameters and wind profiles is the empirical power-law correlation. The results show that hilltops in the range of 150-350 m AGL, in increments of 50 m, fall under Class 3 or greater of the international system of wind classification and are deemed suitable to outstanding for wind turbine applications. The performance of the selected wind turbines is examined, as are the costs of wind-generated electricity at the considered hilltops. The results establish that the lowest costs per kWh are obtained using the YDF-1500-87 (1500 kW) turbine while the highest costs are delivered by the P-25-100 (90 kW). The lowest costs (US$) per kWh of electricity generated are found to vary between a minimum of 0.0294 at hilltops 350 m AGL and a maximum of 0.0366 at hilltops 150 m AGL, with corresponding energy outputs of 6,125 and 4,932 MWh, respectively. Additionally, the matching capacity factor values are 38.05% at hilltops 150 m AGL and 47.26% at hilltops 350 m AGL. Furthermore, the YDF-1500-87, followed by the Enercon E82-2000 (2000 kW), provides the lowest cost of wind-generated electricity, and these turbines are recommended for use by large communities. The medium wind turbine P-15-50 (50 kW), despite showing the best capacity factors (39.29% and 48.85% at hilltops 150 and 350 m AGL, in that order), generates electricity at higher average costs per kWh of US$0.0547 and US$0.0440 at hilltops 150 and 350 m AGL, respectively.
P-15-50 is deemed a more advantageous option for off-grid electrification of small and remote communities.
Tensile strength of ramie yarn (spinning by machine)/HDPE thermoplastic matrix composites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Banowati, Lies, E-mail: liesbano@gmail.com; Hadi, Bambang K., E-mail: bkhadi@ae.itb.ac.id; Suratman, Rochim, E-mail: rochim@material.itb.ac.id
2016-03-29
Technological development should keep pace with environmental sustainability; to prevent a gap between technology and the environment, "green technology" needs to be developed. In this research, green composites were fabricated using natural ramie fiber as reinforcement. The matrix used was HDPE (high-density polyethylene), a thermoplastic polymer that can be recycled and has good formability and flexibility. Ramie yarns and fibers in the unidirectional (0°) direction were each mixed with HDPE powder and processed using hot compression molding. The surface morphology was observed by SEM (scanning electron microscopy). Results showed that the tensile strength of the ramie fiber/HDPE composites increased in comparison with the ramie yarn (spinning by machine)/HDPE composites. However, the ramie yarn (spinning by machine)/HDPE composites have good producibility for wider application. The test results were analyzed using the Weibull distribution as an approach to modeling the reliability of the specimens.
Optimum cooking conditions for shrimp and Atlantic salmon.
Brookmire, Lauren; Mallikarjunan, P; Jahncke, M; Grisso, R
2013-02-01
The quality and safety of a cooked food product depends on many variables, including the cooking method and time-temperature combinations employed. The overall heating profile of the food can be useful in predicting the quality changes and microbial inactivation occurring during cooking. Mathematical modeling can be used to attain the complex heating profile of a food product during cooking. Studies were performed to monitor the product heating profile during the baking and boiling of shrimp and the baking and pan-frying of salmon. Product color, texture, moisture content, mass loss, and pressed juice were evaluated during the cooking processes as the products reached the internal temperature recommended by the FDA. Studies were also performed on the inactivation of Salmonella cocktails in shrimp and salmon. To effectively predict inactivation during cooking, the Bigelow, Fermi distribution, and Weibull distribution models were applied to the Salmonella thermal inactivation data. Minimum cooking temperatures necessary to destroy Salmonella in shrimp and salmon were determined. The heating profiles of the 2 products were modeled using the finite difference method. Temperature data directly from the modeled heating profiles were then used in the kinetic modeling of quality change and Salmonella inactivation during cooking. The optimum cooking times for a 3-log reduction of Salmonella and maintaining 95% of quality attributes are 100, 233, 159, 378, 1132, and 399 s for boiling extra jumbo shrimp, baking extra jumbo shrimp, boiling colossal shrimp, baking colossal shrimp, baking Atlantic salmon, and pan-frying Atlantic salmon, respectively. © 2013 Institute of Food Technologists®
Trudeau, Michaela P.; Verma, Harsha; Sampedro, Fernando; Urriola, Pedro E.; Shurson, Gerald C.; McKelvey, Jessica; Pillai, Suresh D.; Goyal, Sagar M.
2016-01-01
Infection with porcine epidemic diarrhea virus (PEDV) causes diarrhea, vomiting, and high mortality in suckling pigs. Contaminated feed has been suggested as a vehicle of transmission for PEDV. The objective of this study was to compare the effects of thermal and electron beam processing, and the inclusion of feed additives, on the inactivation of PEDV in feed. Feed samples were spiked with PEDV and then heated to 120–145°C for up to 30 min or irradiated at 0–50 kGy. Another set of feed samples spiked with PEDV and mixed with Ultracid P (Nutriad), Activate DA (Novus International), KEM-GEST (Kemin Agrifood), Acid Booster (Agri-Nutrition), sugar, or salt was incubated at room temperature (~25°C) for up to 21 days. At the end of incubation, the virus titers were determined by inoculation of Vero-81 cells and the virus inactivation kinetics were modeled using the Weibull distribution model. The Weibull kinetic parameter delta represents the time or eBeam dose required to reduce the virus concentration by 1 log. For thermal processing, delta values ranged from 16.52 min at 120°C to 1.30 min at 145°C. For eBeam processing, a target dose of 50 kGy reduced the PEDV concentration by 3 log. All additives tested were effective in reducing the survival of PEDV when compared with the control sample (delta = 17.23 days). Activate DA (delta = 0.81 days) and KEM-GEST (delta = 3.28 days) produced the fastest inactivation. In conclusion, heating swine feed at temperatures over 130°C or eBeam processing of feed with a dose over 50 kGy are effective processing steps to reduce PEDV survival. Additionally, the inclusion of selected additives can decrease PEDV survivability. PMID:27341670
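The Weibull inactivation kinetics described above can be sketched numerically. A minimal illustration, assuming the common log-linear form log10(Nt/N0) = -(t/delta)^p; the survival data points below are invented for illustration, not the study's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_log_reduction(t, delta, p):
    """Weibull survival model: log10(N_t / N_0) = -(t / delta)**p."""
    return -(t / delta) ** p

# Hypothetical time (min) vs. log10 titer reduction at one temperature
t_obs = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
log_red = np.array([-0.2, -0.5, -1.1, -2.3, -4.6])

(delta_hat, p_hat), _ = curve_fit(
    weibull_log_reduction, t_obs, log_red,
    p0=(2.0, 1.0), bounds=([0.01, 0.1], [100.0, 10.0]))
print(f"delta = {delta_hat:.2f} min (time to 1-log reduction), p = {p_hat:.2f}")
```

When p is close to 1, the model reduces to classical log-linear (Bigelow-type) inactivation, which is why delta plays the role of a D-value analogue.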
USDA-ARS?s Scientific Manuscript database
This study investigated the compositional characteristics and shelf-life of Njangsa seed oil (NSO). Oil from Njangsa had a high polyunsaturated fatty acid (PUFA) content, of which alpha-eleostearic acid (alpha-ESA), an unusual conjugated fatty acid, was the most prevalent (about 52%). Linoleic acid...
A bivariate model for analyzing recurrent multi-type automobile failures
NASA Astrophysics Data System (ADS)
Sunethra, A. A.; Sooriyarachchi, M. R.
2017-09-01
The failure mechanism in an automobile can be defined as a system of multi-type recurrent failures, where failures can occur due to various multi-type failure modes and these failures are repetitive, such that more than one failure can occur from each failure mode. In analysing such automobile failures, both the time and the type of the failure serve as response variables. However, these two response variables are highly correlated with each other, since the timing of failures has an association with the mode of the failure. When there is more than one correlated response variable, fitting a multivariate model is preferable to separate univariate models. Therefore, a bivariate model of time and type of failure becomes appealing for such automobile failure data. When there are multiple failure observations pertaining to a single automobile, such data cannot be treated as independent, because failure instances of a single automobile are correlated with each other, while failures among different automobiles can be treated as independent. Therefore, this study proposes a bivariate model consisting of time and type of failure as responses, adjusted for correlated data. The proposed model was formulated following the approaches of shared parameter models and random effects models for joining the responses and for representing the correlated data, respectively. The proposed model is applied to a sample of automobile failures with three types of failure modes and up to five failure recurrences. The parametric distributions that were suitable for the two responses of time to failure and type of failure were the Weibull distribution and the multinomial distribution, respectively. The proposed bivariate model was programmed in the SAS procedure PROC NLMIXED by user programming of appropriate likelihood functions.
The performance of the bivariate model was compared with separate univariate models fitted for the two responses and it was identified that better performance is secured by the bivariate model. The proposed model can be used to determine the time and type of failure that would occur in the automobiles considered here.
Crowther, Michael J; Look, Maxime P; Riley, Richard D
2014-09-28
Multilevel mixed effects survival models are used in the analysis of clustered survival data, such as repeated events, multicenter clinical trials, and individual participant data (IPD) meta-analyses, to investigate heterogeneity in baseline risk and covariate effects. In this paper, we extend parametric frailty models including the exponential, Weibull and Gompertz proportional hazards (PH) models and the log logistic, log normal, and generalized gamma accelerated failure time models to allow any number of normally distributed random effects. Furthermore, we extend the flexible parametric survival model of Royston and Parmar, modeled on the log-cumulative hazard scale using restricted cubic splines, to include random effects while also allowing for non-PH (time-dependent effects). Maximum likelihood is used to estimate the models utilizing adaptive or nonadaptive Gauss-Hermite quadrature. The methods are evaluated through simulation studies representing clinically plausible scenarios of a multicenter trial and IPD meta-analysis, showing good performance of the estimation method. The flexible parametric mixed effects model is illustrated using a dataset of patients with kidney disease and repeated times to infection and an IPD meta-analysis of prognostic factor studies in patients with breast cancer. User-friendly Stata software is provided to implement the methods. Copyright © 2014 John Wiley & Sons, Ltd.
Wei, Chenhui; Zhu, Wancheng; Chen, Shikuo; Ranjith, Pathegama Gamage
2016-01-01
This paper proposes a coupled thermal–hydrological–mechanical damage (THMD) model for the failure process of rock, in which coupling effects such as thermally induced rock deformation, water flow-induced thermal convection, and rock deformation-induced water flow are considered. The damage is considered to be the key factor that controls the THM coupling process and the heterogeneity of rock is characterized by the Weibull distribution. Next, numerical simulations on excavation-induced damage zones in Äspö pillar stability experiments (APSE) are carried out and the impact of in situ stress conditions on damage zone distribution is analysed. Then, further numerical simulations of damage evolution at the heating stage in APSE are carried out. The impacts of in situ stress state, swelling pressure and water pressure on damage evolution at the heating stage are simulated and analysed, respectively. The simulation results indicate that (1) the v-shaped notch at the sidewall of the pillar is predominantly controlled by the in situ stress trends and magnitude; (2) at the heating stage, the existence of confining pressure can suppress the occurrence of damage, including shear damage and tensile damage; and (3) the presence of water flow and water pressure can promote the occurrence of damage, especially shear damage. PMID:28774001
The Wind Energy Potential of Kurdistan, Iran
Arefi, Farzad; Moshtagh, Jamal; Moradi, Mohammad
2014-01-01
In the current work, using statistical methods and available software, the wind energy potential of regions suitable for the installation of wind turbines in Qorveh has been investigated. Information was obtained from the weather stations of Baneh, Bijar, Zarina, Saqez, Sanandaj, Qorveh, and Marivan. The monthly average and maximum wind speeds were investigated for the years 2000–2010 and the related curves were drawn. The Golobad curve (direction and percentage of dominant wind and calm wind as a monthly rate) for the years 1997–2000 was analyzed and drawn with plotting software. Ten-minute wind speed (at 10, 30, and 60 m height) and direction (at 37.5 and 10 m height) data were collected from weather stations of the Iranian new energy organization. The wind speed distribution during one year was evaluated using the two-parameter Weibull probability density function, and the Weibull curve histograms were drawn with MATLAB software. According to the average wind speed of the stations and the technical specifications of the available turbine types, a suitable wind turbine was selected for each station. Finally, the Divandareh and Qorveh sites, with favorable potential, were considered for installation of wind turbines and construction of wind farms. PMID:27355042
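The two-parameter Weibull fit described above (done in MATLAB in the study) can be sketched in Python; the wind-speed sample below is synthetic, drawn only to illustrate the fitting step:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic wind speeds: standard Weibull draws (shape k = 2) scaled to c = 6 m/s
wind_speeds = rng.weibull(2.0, size=1000) * 6.0

# Fixing the location at zero yields the two-parameter Weibull form
k, loc, c = stats.weibull_min.fit(wind_speeds, floc=0)
print(f"shape k = {k:.2f}, scale c = {c:.2f} m/s")
```

The fitted shape k and scale c can then be plugged into the Weibull density to overlay on the wind-speed histogram, as in the study's MATLAB plots.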
Effect of BN coating on the strength of a mullite type fiber
NASA Astrophysics Data System (ADS)
Chawla, K. K.; Xu, Z. R.; Ha, J.-S.; Schmücker, M.; Schneider, H.
1997-09-01
Nextel 480 is a polycrystalline, essentially mullite, fiber (70 wt.% Al2O3 + 28 wt.% SiO2 + 2 wt.% B2O3). Different thicknesses of BN were applied as coatings on this fiber. Optical, scanning electron, and transmission electron microscopy were used to characterize the microstructure of the coatings and fibers. The effects of coating and high temperature exposure on the fiber strength were investigated using the two-parameter Weibull distribution. TEM examination showed that the BN coating has a turbostratic structure, with the basal planes lying predominantly parallel to the fiber surface. Such an orientation of coating is desirable for easy crack deflection and subsequent fiber pullout in a composite. For the BN coated Nextel 480 fiber, the Weibull mean strength first increased and then decreased with increasing coating thickness. This was due to the surface flaw healing effect of the coating (up to 0.3 μm), while for the thick BN coating (1 μm) the soft nature of the coating material had a more dominant effect and resulted in a decrease of the fiber strength. High temperature exposure of Nextel 480 resulted in grain growth, which led to a strength loss.
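Two-parameter Weibull analysis of fiber strengths, as above, is commonly done by median-rank regression; a sketch under that assumption, with invented strength values (not Nextel 480 data):

```python
import numpy as np

# Hypothetical single-fiber tensile strengths in GPa, sorted ascending
strengths = np.sort(np.array([1.6, 1.8, 1.9, 2.0, 2.1, 2.2, 2.3, 2.5]))
n = len(strengths)
ranks = np.arange(1, n + 1)
F = (ranks - 0.3) / (n + 0.4)  # Bernard's median-rank approximation

# Linearized Weibull CDF: ln(-ln(1 - F)) = m*ln(sigma) - m*ln(sigma0)
x = np.log(strengths)
y = np.log(-np.log(1.0 - F))
m, intercept = np.polyfit(x, y, 1)   # slope m is the Weibull modulus
sigma0 = np.exp(-intercept / m)      # characteristic strength (63.2% point)
print(f"Weibull modulus m = {m:.1f}, characteristic strength = {sigma0:.2f} GPa")
```

A higher modulus m means a narrower strength scatter, which is how flaw-healing by a thin coating would show up in such an analysis.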
Predicting Numbers of Problems in Development of Software
NASA Technical Reports Server (NTRS)
Simonds, Charles H.
2005-01-01
A method has been formulated to enable prediction of the amount of work that remains to be performed in developing flight software for a spacecraft. The basic concept embodied in the method is that of using an idealized curve (specifically, the Weibull function) to interpolate from (1) the numbers of problems discovered thus far to (2) a goal of discovering no new problems after launch (or six months into the future for software already in use in orbit). The steps of the method can be summarized as follows: 1. Take raw data in the form of problem reports (PRs), including the dates on which they are generated. 2. Remove, from the data collection, PRs that are subsequently withdrawn or to which no response is required. 3. Count the numbers of PRs created in 1-week periods and the running total number of PRs each week. 4. Perform the interpolation by making a least-squares fit of the Weibull function to (a) the cumulative distribution of PRs gathered thus far and (b) the goal of no more PRs after the currently anticipated launch date. The interpolation and the anticipated launch date are subject to iterative re-estimation.
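Step 4 above, the least-squares fit of a Weibull function to the cumulative PR counts, can be sketched as follows; the weekly totals are invented for illustration, and the asymptote K plays the role of the predicted total number of problems:

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_cum(week, K, scale, shape):
    """Cumulative PR count modeled as K times a Weibull CDF."""
    return K * (1.0 - np.exp(-(week / scale) ** shape))

# Hypothetical running total of problem reports over 20 weeks
weeks = np.arange(1, 21)
cum_prs = np.array([2, 5, 10, 17, 26, 36, 47, 57, 66, 74,
                    81, 86, 90, 93, 95, 97, 98, 99, 99, 100])

(K, scale, shape), _ = curve_fit(weibull_cum, weeks, cum_prs, p0=(100, 8, 2))
print(f"predicted total PRs ~ {K:.0f}, scale {scale:.1f} wk, shape {shape:.1f}")
```

The gap between the fitted asymptote K and the PRs found so far estimates the problems still to be discovered, which is the quantity the method uses for re-planning.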
Empirical Tests of Acceptance Sampling Plans
NASA Technical Reports Server (NTRS)
White, K. Preston, Jr.; Johnson, Kenneth L.
2012-01-01
Acceptance sampling is a quality control procedure applied as an alternative to 100% inspection. A random sample of items is drawn from a lot to determine the fraction of items which have a required quality characteristic. Both the number of items to be inspected and the criterion for determining conformance of the lot to the requirement are given by an appropriate sampling plan with specified risks of Type I and Type II sampling errors. In this paper, we present the results of empirical tests of the accuracy of selected sampling plans reported in the literature. These plans are for measurable quality characteristics which are known to have either binomial, exponential, normal, gamma, Weibull, inverse Gaussian, or Poisson distributions. In the main, results support the accepted wisdom that variables acceptance plans are superior to attributes (binomial) acceptance plans, in the sense that these provide comparable protection against risks at reduced sampling cost. For the Gaussian and Weibull plans, however, there are ranges of the shape parameters for which the required sample sizes are in fact larger than the corresponding attributes plans, dramatically so for instances of large skew. Tests further confirm that the published inverse-Gaussian (IG) plan is flawed, as reported by White and Johnson (2011).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hirosaki, Naota; Akimune, Yoshio; Mitomo, Mamoru
1993-07-01
β-Si3N4 powder containing 1 mol% of equimolar Y2O3-Nd2O3 was gas-pressure sintered at 2,000°C for 2 h (SN2), 4 h (SN4), and 8 h (SN8) in 30-MPa nitrogen gas. These materials had a microstructure of "in-situ composites" as a result of exaggerated grain growth of some β-Si3N4 grains during firing. Growth of elongated grains was controlled by the sintering time, so that the desired microstructures were obtained. SN2 had a Weibull modulus as high as 53 because of the uniform size and spatial distribution of its large grains. SN4 had a fracture toughness of 10.3 MPa·m^(1/2) because of toughening provided by the bridging of elongated grains, whereas SN8 showed a lower fracture toughness, possibly caused by extensive microcracking resulting from excessively large grains. Gas-pressure sintering of β-Si3N4 powder was shown to be effective in fostering selective grain growth for obtaining the desired composite microstructure.
Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.
Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R
2012-08-01
Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and the parametric survival model and accelerated failure time-model with log-normal, log-logistic and Weibull distributions were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening with increasing missingness in the proportions. The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. ctekwe@stat.tamu.edu.
NASA Astrophysics Data System (ADS)
Wang, W. L.; Tsui, K. L.; Lo, S. M.; Liu, S. B.
2018-01-01
Crowded transportation hubs such as metro stations are thought to be ideal places for the development and spread of epidemics. However, because of their complex spatial layouts and confined environments with large numbers of highly mobile individuals, it is difficult to quantify human contacts in such settings, and disease spreading dynamics there were less explored in previous studies. Owing to the heterogeneity and dynamic nature of human interactions, a growing number of studies has demonstrated the importance of contact distance and length of contact in transmission probabilities. In this study, we show how detailed information on contact and exposure patterns can be obtained by statistical analyses of microscopic crowd simulation data. Specifically, a pedestrian simulation model, CityFlow, was employed to reproduce individuals' movements in a metro station based on site survey data, and values and distributions of individual contact rate and exposure in different simulation cases were obtained and analyzed. Interestingly, the Weibull distribution fitted the histogram values of individual-based exposure in each case very well. Moreover, we found that both individual contact rate and exposure had a linear relationship with the average crowd density of the environment. The results obtained in this paper can provide a reference for epidemic studies in complex and confined transportation hubs and refine existing disease spreading models.
NASA Astrophysics Data System (ADS)
Prasanna, V.
2018-01-01
This study makes use of temperature and precipitation from CMIP5 climate model output for climate change application studies over the Indian region during the summer monsoon season (JJAS). Bias correction of temperature and precipitation from CMIP5 GCM simulation results with respect to observation is discussed in detail. The non-linear statistical bias correction is a suitable bias correction method for climate change data because it is simple and does not add up artificial uncertainties to the impact assessment of climate change scenarios for climate change application studies (agricultural production changes) in the future. The simple statistical bias correction uses observational constraints on the GCM baseline, and the projected results are scaled with respect to the changing magnitude in future scenarios, varying from one model to the other. Two types of bias correction techniques are shown here: (1) a simple bias correction using a percentile-based quantile-mapping algorithm and (2) a simple but improved bias correction method, a cumulative distribution function (CDF; Weibull distribution function)-based quantile-mapping algorithm. This study shows that the percentile-based quantile mapping method gives results similar to the CDF (Weibull)-based quantile mapping method, and both the methods are comparable. The bias correction is applied on temperature and precipitation variables for present climate and future projected data to make use of it in a simple statistical model to understand the future changes in crop production over the Indian region during the summer monsoon season. In total, 12 CMIP5 models are used for Historical (1901-2005), RCP4.5 (2005-2100), and RCP8.5 (2005-2100) scenarios. The climate index from each CMIP5 model and the observed agricultural yield index over the Indian region are used in a regression model to project the changes in the agricultural yield over India from RCP4.5 and RCP8.5 scenarios. 
The results revealed a better convergence of model projections in the bias-corrected data compared to the uncorrected data. The study can be extended to localized regional domains aimed at understanding changes in agricultural productivity in the future with an agro-economic or a simple statistical model. The statistical model indicated that the total food grain yield is going to increase over the Indian region in the future: the increase is approximately 50 kg/ha for the RCP4.5 scenario from 2001 until the end of 2100, and approximately 90 kg/ha for the RCP8.5 scenario over the same period. Many studies use bias correction techniques, but this study applies the bias correction technique to future climate scenario data from CMIP5 models and then to crop statistics to find future crop yield changes over the Indian region.
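The percentile-based quantile-mapping step described above can be sketched empirically: each model value is replaced by the observed value at the same percentile of the model's baseline distribution. The gamma-distributed "observed" and "model" series below are synthetic stand-ins for the precipitation data:

```python
import numpy as np

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 4.0, size=3000)          # "observed" baseline precipitation
model_hist = rng.gamma(2.0, 5.5, size=3000)   # biased model baseline (too wet)
model_fut = rng.gamma(2.2, 5.5, size=3000)    # biased future projection

def quantile_map(values, model_baseline, observed):
    """Map each value through its percentile in the model baseline onto obs."""
    pct = np.searchsorted(np.sort(model_baseline), values) / len(model_baseline)
    pct = np.clip(pct, 0.0, 1.0)
    return np.quantile(observed, pct)

corrected = quantile_map(model_fut, model_hist, obs)
print(f"raw future mean {model_fut.mean():.1f}, corrected {corrected.mean():.1f}")
```

The CDF-based variant in the abstract would fit a Weibull distribution to each series and map through the fitted CDFs instead of the empirical percentiles; as the study notes, the two give comparable results.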
What are the Shapes of Response Time Distributions in Visual Search?
Palmer, Evan M.; Horowitz, Todd S.; Torralba, Antonio; Wolfe, Jeremy M.
2011-01-01
Many visual search experiments measure reaction time (RT) as their primary dependent variable. Analyses typically focus on mean (or median) RT. However, given enough data, the RT distribution can be a rich source of information. For this paper, we collected about 500 trials per cell per observer for both target-present and target-absent displays in each of three classic search tasks: feature search, with the target defined by color; conjunction search, with the target defined by both color and orientation; and spatial configuration search for a 2 among distractor 5s. This large data set allows us to characterize the RT distributions in detail. We present the raw RT distributions and fit several psychologically motivated functions (ex-Gaussian, ex-Wald, Gamma, and Weibull) to the data. We analyze and interpret parameter trends from these four functions within the context of theories of visual search. PMID:21090905
Time-dependent breakdown of fiber networks: Uncertainty of lifetime
NASA Astrophysics Data System (ADS)
Mattsson, Amanda; Uesaka, Tetsu
2017-05-01
Materials often fail when subjected to stresses over a prolonged period. The time to failure, also called the lifetime, is known to exhibit large variability for many materials, particularly brittle and quasibrittle materials; for example, the coefficient of variation can reach 100% or even more. Its distribution shape is highly skewed toward zero lifetime, implying a large number of premature failures. This behavior contrasts with that of normal strength, which shows a variation of only 4%-10% and a nearly bell-shaped distribution. The fundamental cause of this large and unique variability of lifetime is not well understood because of the complex interplay between stochastic processes taking place on the molecular level and the hierarchical and disordered structure of the material. We have constructed fiber network models, both regular and random, as a paradigm for general material structures. With such networks, we have performed Monte Carlo simulations of creep failure to establish explicit relationships among fiber characteristics, network structures, system size, and lifetime distribution. We found that fiber characteristics have large, sometimes dominating, influences on the lifetime variability of a network. Among the factors investigated, geometrical disorders of the network were found to be essential to explain the large variability and highly skewed shape of the lifetime distribution. With increasing network size, the distribution asymptotically approaches a double-exponential form. The implication of this result is that so-called "infant mortality," which is often predicted by the Weibull approximation of the lifetime distribution, may not exist for a large system.
Edge on Impact Simulations and Experiments
2013-09-01
...silicon carbide (SiC) and aluminum oxynitride (AlON) ceramics are predicted using the Kayenta macroscopic constitutive model. Aspects regarding... damage propagation. 2.1. Silicon Carbide: SiC is an opaque ceramic explored by the armor community. It is perhaps the most extensively characterized... the Weibull modulus for SiC. 4.1. Silicon Carbide: Figures 3 and 4 compare experimental images with model predictions of EOI of SiC targets at respective...
Fire frequency, area burned, and severity: A quantitative approach to defining a normal fire year
Lutz, J.A.; Key, C.H.; Kolden, C.A.; Kane, J.T.; van Wagtendonk, J.W.
2011-01-01
Fire frequency, area burned, and fire severity are important attributes of a fire regime, but few studies have quantified the interrelationships among them in evaluating a fire year. Although area burned is often used to summarize a fire season, burned area may not be well correlated with either the number or ecological effect of fires. Using the Landsat data archive, we examined all 148 wildland fires (prescribed fires and wildfires) >40 ha from 1984 through 2009 for the portion of the Sierra Nevada centered on Yosemite National Park, California, USA. We calculated mean fire frequency and mean annual area burned from a combination of field- and satellite-derived data. We used the continuous probability distribution of the differenced Normalized Burn Ratio (dNBR) values to describe fire severity. For fires >40 ha, fire frequency, annual area burned, and cumulative severity were consistent in only 13 of 26 years (50%), but all pair-wise comparisons among these fire regime attributes were significant. Borrowing from long-established practice in climate science, we defined "fire normals" to be the 26 year means of fire frequency, annual area burned, and the area under the cumulative probability distribution of dNBR. Fire severity normals were significantly lower when they were aggregated by year compared to aggregation by area. Cumulative severity distributions for each year were best modeled with Weibull functions (all 26 years, r² ≥ 0.99; P < 0.001). Explicit modeling of the cumulative severity distributions may allow more comprehensive modeling of climate-severity and area-severity relationships. Together, the three metrics of number of fires, size of fires, and severity of fires provide land managers with a more comprehensive summary of a given fire year than any single metric.
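Fitting a Weibull function to a cumulative severity distribution and checking the goodness of fit, as reported above, might look like the following sketch; the dNBR-like values are synthetic, not the Yosemite data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic positive severity values standing in for one year's dNBR distribution
dnbr = rng.weibull(1.8, size=500) * 300.0

# Two-parameter Weibull fit (location fixed at zero)
shape, loc, scale = stats.weibull_min.fit(dnbr, floc=0)

# Compare fitted CDF against the empirical CDF and compute r^2
x = np.sort(dnbr)
ecdf = np.arange(1, len(x) + 1) / len(x)
fitted = stats.weibull_min.cdf(x, shape, loc, scale)
r2 = 1.0 - np.sum((ecdf - fitted) ** 2) / np.sum((ecdf - ecdf.mean()) ** 2)
print(f"shape {shape:.2f}, scale {scale:.0f}, r^2 = {r2:.3f}")
```

With the distribution parameterized this way, year-to-year comparisons reduce to comparing a small number of fitted parameters rather than whole severity maps.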
Manzoor, Behzad; Suleiman, Mahmood; Palmer, Richard M
2013-01-01
The crestal bone level around a dental implant may influence its strength characteristics by offering protection against mechanical failures. Therefore, the present study investigated the effect of simulated bone loss on modes, loads, and cycles to failure in an in vitro model. Different amounts of bone loss were simulated: 0, 1.5, 3.0, and 4.5 mm from the implant head. Forty narrow-diameter (3.0-mm) implant-abutment assemblies were tested using compressive bending and cyclic fatigue testing. Weibull and accelerated life testing analysis were used to assess reliability and functional life. Statistical analyses were performed using the Fisher-Exact test and the Spearman ranked correlation. Compressive bending tests showed that the level of bone loss influenced the load-bearing capacity of implant-abutment assemblies. Fatigue testing showed that the modes, loads, and cycles to failure had a statistically significant relationship with the level of bone loss. All 16 samples with bone loss of 3.0 mm or more experienced horizontal implant body fractures. In contrast, 14 of 16 samples with 0 and 1.5 mm of bone loss showed abutment and screw fractures. Weibull and accelerated life testing analysis indicated a two-group distribution: the 0- and 1.5-mm bone loss samples had better functional life and reliability than the 3.0- and 4.5-mm samples. Progressive bone loss had a significant effect on modes, loads, and cycles to failure. In addition, bone loss influenced the functional life and reliability of the implant-abutment assemblies. Maintaining crestal bone levels is important in ensuring biomechanical sustainability and predictable long-term function of dental implant assemblies.
1981-12-01
...preventing the generation of negative location estimators. Because of the invariant property of the EDF statistics, this transformation will... likelihood. If the parameter estimation method developed by Harter and Moore is used, care must be taken to prevent the location estimators from being... Appendix E: Computer Programs; Program to Calculate the Cramer-von Mises Critical Values.
A criterion for establishing life limits. [for Space Shuttle Main Engine service
NASA Technical Reports Server (NTRS)
Skopp, G. H.; Porter, A. A.
1990-01-01
The development of a rigorous statistical method that would utilize hardware-demonstrated reliability to evaluate hardware capability and provide ground rules for safe flight margin is discussed. A statistical-based method using the Weibull/Weibayes cumulative distribution function is described. Its advantages and inadequacies are pointed out. Another, more advanced procedure, Single Flight Reliability (SFR), determines a life limit which ensures that the reliability of any single flight is never less than a stipulated value at a stipulated confidence level. Application of the SFR method is illustrated.
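The Weibull/Weibayes approach mentioned above is commonly implemented with an assumed Weibull slope; a sketch under that assumption, with hypothetical firing times rather than SSME data:

```python
import numpy as np
from scipy.stats import chi2

def weibayes_eta(times, failures, beta, conf=0.90):
    """One-sided lower-bound estimate of the Weibull characteristic life eta.

    With r failures, a chi-square factor sets the confidence bound; for zero
    failures this reduces to the classic Weibayes form
    eta = (sum(t**beta) / -ln(1 - conf)) ** (1 / beta).
    """
    t_beta = np.sum(np.asarray(times, dtype=float) ** beta)
    r = max(failures, 1)  # conservative: treat zero failures as one
    denom = chi2.ppf(conf, 2 * r) / 2.0
    return (t_beta / denom) ** (1.0 / beta)

# Ten hypothetical test firings of 500 s each, no failures, assumed slope beta = 2
eta = weibayes_eta([500.0] * 10, failures=0, beta=2.0, conf=0.90)
print(f"demonstrated characteristic life eta ~ {eta:.0f} s")
```

A life limit would then be set as some fraction of the demonstrated eta so that per-flight reliability stays above the stipulated value, in the spirit of the SFR method described in the abstract.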
Probabilistic Analysis for Comparing Fatigue Data Based on Johnson-Weibull Parameters
NASA Technical Reports Server (NTRS)
Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.
2013-01-01
Leonard Johnson published a methodology for establishing the confidence that two populations of data are different. Johnson's methodology is dependent on limited combinations of test parameters (Weibull slope, mean life ratio, and degrees of freedom) and a set of complex mathematical equations. In this report, a simplified algebraic equation for confidence numbers is derived based on the original work of Johnson. The confidence numbers calculated with this equation are compared to those obtained graphically by Johnson. Using the ratios of mean life, the resultant values of confidence numbers at the 99 percent level deviate less than 1 percent from those of Johnson. At a 90 percent confidence level, the calculated values differ between +2 and 4 percent. The simplified equation is used to rank the experimental lives of three aluminum alloys (AL 2024, AL 6061, and AL 7075), each tested at three stress levels in rotating beam fatigue, analyzed using the Johnson-Weibull method, and compared to the ASTM Standard (E739-91) method of comparison. The ASTM Standard did not statistically distinguish between AL 6061 and AL 7075. However, it is possible to rank the fatigue lives of different materials with a reasonable degree of statistical certainty based on combined confidence numbers using the Johnson-Weibull analysis. AL 2024 was found to have the longest fatigue life, followed by AL 7075, and then AL 6061. The ASTM Standard and the Johnson-Weibull analysis result in the same stress-life exponent p for each of the three aluminum alloys at the median, or L50, lives.
Reliability Estimation of Parameters of Helical Wind Turbine with Vertical Axis
Dumitrascu, Adela-Eliza; Lepadatescu, Badea; Dumitrascu, Dorin-Ion; Nedelcu, Anisor; Ciobanu, Doina Valentina
2015-01-01
Due to the prolonged use of wind turbines they must be characterized by high reliability. This can be achieved through a rigorous design, appropriate simulation and testing, and proper construction. The reliability prediction and analysis of these systems will lead to identifying the critical components, increasing the operating time, minimizing failure rate, and minimizing maintenance costs. To estimate the produced energy by the wind turbine, an evaluation approach based on the Monte Carlo simulation model is developed which enables us to estimate the probability of minimum and maximum parameters. In our simulation process we used triangular distributions. The analysis of simulation results has been focused on the interpretation of the relative frequency histograms and cumulative distribution curve (ogive diagram), which indicates the probability of obtaining the daily or annual energy output depending on wind speed. The experimental researches consist in estimation of the reliability and unreliability functions and hazard rate of the helical vertical axis wind turbine designed and patented to climatic conditions for Romanian regions. Also, the variation of power produced for different wind speeds, the Weibull distribution of wind probability, and the power generated were determined. The analysis of experimental results indicates that this type of wind turbine is efficient at low wind speed. PMID:26167524
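The Monte Carlo step described above, with triangular distributions for the uncertain inputs, can be sketched as follows; the min/mode/max values and the simple power relation are illustrative assumptions, not the study's inputs:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

# Triangular(min, mode, max) draws for the uncertain inputs
wind_speed = rng.triangular(2.0, 5.0, 9.0, size=N)     # m/s
efficiency = rng.triangular(0.25, 0.35, 0.45, size=N)  # overall power coefficient
rho, area, hours = 1.225, 3.0, 24.0                    # air density, swept area, h/day

# Kinetic power in the wind, P = 0.5 * rho * A * v**3, scaled by efficiency
energy_kwh = 0.5 * rho * area * wind_speed**3 * efficiency * hours / 1000.0

for q in (0.05, 0.50, 0.95):
    print(f"{q:.0%} quantile of daily energy: {np.quantile(energy_kwh, q):.2f} kWh")
```

Plotting a histogram and the cumulative (ogive) curve of `energy_kwh` reproduces the kind of probability-of-output statements the abstract describes.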
NASA Technical Reports Server (NTRS)
Welch, R. M.; Sengupta, S. K.; Chen, D. W.
1990-01-01
Stratocumulus cloud fields in the FIRE IFO region are analyzed using LANDSAT Thematic Mapper imagery. Structural properties such as cloud cell size distribution, cell horizontal aspect ratio, fractional coverage and fractal dimension are determined. It is found that stratocumulus cloud number densities are represented by a power law. Cell horizontal aspect ratio has a tendency to increase at large cell sizes, and cells are bi-fractal in nature. Using LANDSAT Multispectral Scanner imagery for twelve selected stratocumulus scenes acquired during previous years, similar structural characteristics are obtained. Cloud field spatial organization also is analyzed. Nearest-neighbor spacings are fit with a number of functions, with Weibull and Gamma distributions providing the best fits. Poisson tests show that the spatial separations are not random. Second order statistics are used to examine clustering.
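The spacing-fit step above can be mimicked by fitting candidate distributions to the nearest-neighbor spacings and comparing Kolmogorov-Smirnov statistics. The sketch below uses synthetic Weibull-distributed spacings (an assumption for illustration, not the FIRE data) and contrasts a Weibull fit against the exponential spacing implied by a random (Poisson) arrangement.

```python
import math, random

def ks_stat(data, cdf):
    """Kolmogorov-Smirnov distance between a sample and a reference CDF."""
    xs = sorted(data)
    n = len(xs)
    return max(max((i + 1) / n - cdf(x), cdf(x) - i / n)
               for i, x in enumerate(xs))

def weibull_cdf(x, k, lam):
    return 1.0 - math.exp(-((x / lam) ** k)) if x > 0 else 0.0

# Synthetic spacings from a Weibull (k=1.7, lam=1.0) via inverse transform;
# an assumption for illustration, not the measured cloud-cell spacings.
rng = random.Random(0)
spacings = [(-math.log(1.0 - rng.random())) ** (1.0 / 1.7) for _ in range(500)]
mean_sp = sum(spacings) / len(spacings)

d_weibull = ks_stat(spacings, lambda x: weibull_cdf(x, 1.7, 1.0))
d_poisson = ks_stat(spacings, lambda x: 1.0 - math.exp(-x / mean_sp))
```

A markedly smaller KS distance for the Weibull than for the exponential is the same signature the Poisson tests in the study exploit: the spatial separations are not random.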
NASA Astrophysics Data System (ADS)
Jiang, Zhi-Qiang; Zhou, Wei-Xing; Tan, Qun-Zhao
2009-11-01
Massive multiplayer online role-playing games (MMORPGs) are very popular in China, which provides a potential platform for scientific research. We study the online-offline activities of avatars in an MMORPG to understand their game-playing behavior. The statistical analysis unveils that the active avatars can be classified into three types. The avatars of the first type are owned by game cheaters who go online and offline in preset time intervals with the online duration distributions dominated by pulses. The second type of avatars is characterized by a Weibull distribution in the online durations, which is confirmed by statistical tests. The distributions of online durations of the remaining individual avatars differ from the above two types and cannot be described by a simple form. These findings have potential applications in the game industry.
NASA Astrophysics Data System (ADS)
Mathieu, Jean-Philippe; Inal, Karim; Berveiller, Sophie; Diard, Olivier
2010-11-01
Local approach to brittle fracture for low-alloyed steels is discussed in this paper. A bibliographical introduction highlights general trends and consensual points of the topic and evokes debatable aspects. French RPV steel 16MND5 (equ. ASTM A508 Cl.3) is then used as a model material to study the influence of temperature on brittle fracture. A micromechanical modelling of brittle fracture at the elementary volume scale, already used in previous work, is then recalled. It involves a multiscale modelling of microstructural plasticity which has been tuned on experimental measurements of inter-phase and inter-granular stress heterogeneities. Fracture probability of the elementary volume can then be computed using a randomly attributed defect size distribution based on a realistic carbide repartition. This defect distribution is then deterministically correlated to stress heterogeneities simulated within the microstructure using a weakest-link hypothesis on the elementary volume, which results in a deterministic stress to fracture. Repeating the process allows computing Weibull parameters on the elementary volume. This tool is then used to investigate the physical mechanisms that could explain the experimentally observed temperature dependence of Beremin's parameter for 16MND5 steel. It is shown that, assuming the hypotheses made in this work about cleavage micro-mechanisms are correct, the effective equivalent surface energy (i.e., surface energy plus the energy plastically dissipated when blunting the crack tip) for propagating a crack has to be temperature dependent to explain the temperature evolution of Beremin's parameter.
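The weakest-link step described above is commonly written, in Beremin-type models, as a Weibull stress integrated over the stressed volume. The following is a textbook form of that model, not the authors' exact implementation: the Weibull stress is aggregated from per-element maximum principal stresses and volumes, and the failure probability follows.

```python
import math

def beremin_failure_probability(stresses, volumes, m, sigma_u, v0=1.0):
    """Weakest-link (Beremin-type) sketch: Weibull stress sigma_w from the
    maximum principal stress in each element of the process zone, then
    P_f = 1 - exp(-(sigma_w / sigma_u)^m)."""
    sigma_w = (sum((s ** m) * v
                   for s, v in zip(stresses, volumes)) / v0) ** (1.0 / m)
    return sigma_w, 1.0 - math.exp(-((sigma_w / sigma_u) ** m))
```

With a single element at the reference stress sigma_u and reference volume v0, the failure probability is exactly 1 - 1/e, which is a convenient sanity check on the parameterization.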
NASA Astrophysics Data System (ADS)
Roirand, Q.; Missoum-Benziane, D.; Thionnet, A.; Laiarinandrasana, L.
2017-09-01
Textile composites are composed of 3D complex architecture. To assess the durability of such engineering structures, the failure mechanisms must be highlighted. Examinations of the degradation have been carried out thanks to tomography. The present work addresses a numerical damage model dedicated to the simulation of the crack initiation and propagation at the scale of the warp yarns. For the 3D woven composites under study, loadings in tension and combined tension and bending were considered. Based on an erosion procedure of broken elements, the failure mechanisms have been modelled on 3D periodic cells by finite element calculations. The breakage of one element was determined using a failure criterion at the mesoscopic scale based on the yarn stress at failure. The results were found to be in good agreement with the experimental data for the two kinds of macroscopic loadings. The deterministic approach assumed a homogeneously distributed stress at failure all over the integration points in the meshes of woven composites. A stochastic approach was applied to a simple representative elementary periodic cell. The distribution of the Weibull stress at failure was assigned to the integration points using a Monte Carlo simulation. It was shown that this stochastic approach allowed more realistic failure simulations avoiding the idealised symmetry due to the deterministic modelling. In particular, the stochastic simulations performed have shown several variations of the stress as well as strain at failure and the failure modes of the yarn.
Al-Samman, A. M.; Rahman, T. A.; Azmi, M. H.; Hindia, M. N.; Khan, I.; Hanafi, E.
2016-01-01
This paper presents an experimental characterization of millimeter-wave (mm-wave) channels in the 6.5 GHz, 10.5 GHz, 15 GHz, 19 GHz, 28 GHz and 38 GHz frequency bands in an indoor corridor environment. More than 4,000 power delay profiles were measured across the bands using an omnidirectional transmitter antenna and a highly directional horn receiver antenna for both co- and cross-polarized antenna configurations. This paper develops a new path-loss model to account for frequency attenuation with distance, which we term the frequency attenuation (FA) path-loss model; it introduces a frequency-dependent attenuation factor. The large-scale path loss was characterized based on both new and well-known path-loss models. A general and less complex method is also proposed to estimate the cross-polarization discrimination (XPD) factor of the close-in reference distance with XPD (CIX) and ABG with XPD (ABGX) path-loss models, to avoid the computational complexity of the minimum mean square error (MMSE) approach. Moreover, small-scale parameters such as root mean square (RMS) delay spread, mean excess (MN-EX) delay, dispersion factors and maximum excess (MAX-EX) delay parameters were used to characterize the multipath channel dispersion. Multiple statistical distributions for the RMS delay spread were also investigated. The results show that our proposed models are simpler and more physically based than other well-known models. The path-loss exponents for all studied models are smaller than that of the free-space model by values in the range of 0.1 to 1.4 for all measured frequencies. The RMS delay spread values varied between 0.2 ns and 13.8 ns, and the dispersion factor values were less than 1 for all measured frequencies. The exponential and Weibull probability distribution models best fit the RMS delay spread empirical distribution for all of the measured frequencies in all scenarios. PMID:27654703
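The RMS delay spread mentioned above is the power-weighted standard deviation of the path delays in a power delay profile (and the mean excess delay is the power-weighted mean). A minimal sketch with a hypothetical two-path profile:

```python
import math

def rms_delay_spread(delays_ns, powers_lin):
    """Mean excess delay and RMS delay spread of a power delay profile:
    the power-weighted mean and standard deviation of the path delays."""
    p_tot = sum(powers_lin)
    mean_ex = sum(t * p for t, p in zip(delays_ns, powers_lin)) / p_tot
    second = sum(t * t * p for t, p in zip(delays_ns, powers_lin)) / p_tot
    return mean_ex, math.sqrt(second - mean_ex ** 2)
```

Two equal-power paths 10 ns apart give a mean excess delay of 5 ns and an RMS delay spread of 5 ns; real profiles from the measurements above simply have more taps.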
Robustness of power systems under a democratic-fiber-bundle-like model
NASA Astrophysics Data System (ADS)
Yaǧan, Osman
2015-06-01
We consider a power system with N transmission lines whose initial loads (i.e., power flows) L1, ..., LN are independent and identically distributed with distribution function P_L(x) = P[L ≤ x]. The capacity C_i defines the maximum flow allowed on line i and is assumed to be given by C_i = (1 + α)L_i, with α > 0. We study the robustness of this power system against random attacks (or failures) that target a p fraction of the lines, under a democratic fiber-bundle-like model. Namely, when a line fails, the load it was carrying is redistributed equally among the remaining lines. Our contributions are as follows. (i) We show analytically that the final breakdown of the system always takes place through a first-order transition at the critical attack size p* = 1 − E[L] / max_x {P[L > x](αx + E[L | L > x])}, where E[·] is the expectation operator; (ii) we derive conditions on the distribution P_L(x) for which the first-order breakdown of the system occurs abruptly without any preceding diverging rate of failure; (iii) we provide a detailed analysis of the robustness of the system under three specific load distributions (uniform, Pareto, and Weibull), showing that with the minimum load L_min and mean load E[L] fixed, the Pareto distribution is the worst (in terms of robustness) among the three, whereas the Weibull distribution is the best when its shape parameter is selected relatively large; (iv) we provide numerical results that confirm our mean-field analysis; and (v) we show that p* is maximized when the load distribution is a Dirac delta function centered at E[L], i.e., when all lines carry the same load. This last finding is particularly surprising given that heterogeneity is known to lead to high robustness against random failures in many other systems.
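The critical attack size p* = 1 − E[L] / max_x {P(L > x)(αx + E[L | L > x])} can be evaluated numerically for a concrete load distribution. The sketch below does this for loads uniform on [a, b] (a worked example chosen here; the interval values are arbitrary), using P(L > x) = (b − x)/(b − a) and E[L | L > x] = (x + b)/2.

```python
def critical_attack_size_uniform(alpha, a=1.0, b=3.0, steps=200000):
    """p* = 1 - E[L] / max_x { P(L > x) (alpha*x + E[L | L > x]) },
    evaluated on a grid for loads uniform on [a, b], where
    P(L > x) = (b - x)/(b - a) and E[L | L > x] = (x + b)/2."""
    el = 0.5 * (a + b)  # E[L]
    g_max = max(((b - x) / (b - a)) * (alpha * x + 0.5 * (x + b))
                for x in (a + (b - a) * i / steps for i in range(steps)))
    return 1.0 - el / g_max
```

For alpha = 0.5 on [1, 3] this gives p* = 0.2, below the all-equal-loads (Dirac delta) value alpha/(1 + alpha) = 1/3, consistent with point (v) above.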
Statistical Characterization and Classification of Edge-Localized Plasma Instabilities
NASA Astrophysics Data System (ADS)
Webster, A. J.; Dendy, R. O.
2013-04-01
The statistics of edge-localized plasma instabilities (ELMs) in toroidal magnetically confined fusion plasmas are considered. From first principles, standard experimentally motivated assumptions are shown to determine a specific probability distribution for the waiting times between ELMs: the Weibull distribution. This is confirmed empirically by a statistically rigorous comparison with a large data set from the Joint European Torus. The successful characterization of ELM waiting times enables future work to progress in various ways. Here we present a quantitative classification of ELM types, complementary to phenomenological approaches. It also informs us about the nature of ELM processes, such as whether they are random or deterministic. The methods are extremely general and can be applied to numerous other quasiperiodic intermittent phenomena.
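The random-versus-deterministic distinction drawn above maps onto the Weibull shape parameter: shape k = 1 gives a constant (memoryless) hazard, i.e. random waiting times, while k > 1 gives a hazard that grows the longer the wait, i.e. quasi-deterministic behavior. A minimal sketch (parameter values illustrative, not fitted JET values):

```python
def weibull_hazard(t, k, lam):
    """Weibull hazard rate h(t) = (k/lam) * (t/lam)^(k-1): constant for
    k = 1 (memoryless, i.e. random waiting times), increasing for k > 1
    (quasi-deterministic waiting times)."""
    return (k / lam) * (t / lam) ** (k - 1.0)
```

Classifying ELM types by fitted shape parameter is exactly the kind of quantitative complement to phenomenological classification the abstract describes.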
Prevalence Incidence Mixture Models
The R package and webtool fit Prevalence Incidence Mixture models to the left-censored and irregularly interval-censored time-to-event data commonly found in screening cohorts assembled from electronic health records. Absolute and relative risk can be estimated for simple random sampling and for stratified sampling (both the superpopulation and the finite-population approaches to target populations are supported). Non-parametric (absolute risks only), semi-parametric, weakly-parametric (using B-splines), and some fully parametric (such as the logistic-Weibull) models are supported.
NASA Technical Reports Server (NTRS)
Platt, M. E.; Lewis, E. E.; Boehm, F.
1991-01-01
A Monte Carlo Fortran computer program was developed that uses two variance reduction techniques for computing system reliability, applicable to solving very large, highly reliable, fault-tolerant systems. The program is consistent with the hybrid automated reliability predictor (HARP) code, which employs behavioral decomposition and complex fault-error handling models. This new capability, called MC-HARP, efficiently solves reliability models with non-constant failure rates (Weibull). Common-mode failure modeling is also a specialty.
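MC-HARP's specifics (variance reduction, behavioral decomposition) are beyond the abstract, but the underlying idea of Monte Carlo reliability estimation with Weibull lifetimes can be sketched crudely as follows, here for a hypothetical 2-out-of-3 system and without any variance reduction.

```python
import math, random

def system_reliability(t, k, lam, n_trials=20000, seed=1):
    """Crude Monte Carlo (no variance reduction, unlike MC-HARP itself):
    reliability at time t of a hypothetical 2-out-of-3 system whose
    components have independent Weibull(k, lam) lifetimes, sampled by
    inverse transform."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(n_trials):
        lifetimes = [lam * (-math.log(1.0 - rng.random())) ** (1.0 / k)
                     for _ in range(3)]
        ok += sum(life > t for life in lifetimes) >= 2
    return ok / n_trials
```

The estimator's standard error shrinks only as 1/sqrt(n_trials), which is precisely why variance reduction matters for the highly reliable systems MC-HARP targets: failures are rare, so naive sampling wastes most trials.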
NASA Astrophysics Data System (ADS)
Wable, Pawan S.; Jha, Madan K.
2018-02-01
The effects of rainfall and the El Niño Southern Oscillation (ENSO) on groundwater in a semi-arid basin of India were analyzed using Archimedean copulas considering 17 years of data for monsoon rainfall, post-monsoon groundwater level (PMGL) and ENSO Index. The evaluated dependence among these hydro-climatic variables revealed that PMGL-Rainfall and PMGL-ENSO Index pairs have significant dependence. Hence, these pairs were used for modeling dependence by employing four types of Archimedean copulas: Ali-Mikhail-Haq, Clayton, Gumbel-Hougaard, and Frank. For the copula modeling, the results of probability distributions fitting to these hydro-climatic variables indicated that the PMGL and rainfall time series are best represented by Weibull and lognormal distributions, respectively, while the non-parametric kernel-based normal distribution is the most suitable for the ENSO Index. Further, the PMGL-Rainfall pair is best modeled by the Clayton copula, and the PMGL-ENSO Index pair is best modeled by the Frank copula. The Clayton copula-based conditional probability of PMGL being less than or equal to its average value at a given mean rainfall is above 70% for 33% of the study area. In contrast, the spatial variation of the Frank copula-based probability of PMGL being less than or equal to its average value is 35-40% in 23% of the study area during El Niño phase, while it is below 15% in 35% of the area during the La Niña phase. This copula-based methodology can be applied under data-scarce conditions for exploring the impacts of rainfall and ENSO on groundwater at basin scales.
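The Clayton-copula conditional probability used above (PMGL given rainfall) has a closed form: P(U ≤ u | V = v) is the partial derivative ∂C/∂v of C(u, v) = (u^−θ + v^−θ − 1)^(−1/θ). A minimal sketch; the parameter values in the test are illustrative, not the fitted basin values.

```python
def clayton_conditional(u, v, theta):
    """P(U <= u | V = v) for a Clayton copula with theta > 0:
    dC/dv of C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta)."""
    s = u ** (-theta) + v ** (-theta) - 1.0
    return (v ** (-theta - 1.0)) * s ** (-1.0 / theta - 1.0)
```

In practice u and v are the fitted marginal CDF values (here, the Weibull CDF of PMGL and the lognormal CDF of rainfall) evaluated at the quantities of interest; as theta approaches 0 the copula reduces to independence and the conditional probability reduces to u.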
Leung, Brian T W; Tsoi, James K H; Matinlinna, Jukka P; Pow, Edmond H N
2015-09-01
Fluorophlogopite glass ceramic (FGC) is a biocompatible, etchable, and millable ceramic with fluoride-releasing properties. However, its mechanical properties and reliability compared with other machinable ceramics remain undetermined. The purpose of this in vitro study was to compare the mechanical properties of 3 commercially available millable ceramic materials, IPS e.max CAD, Vitablocs Mark II, and Vita Enamic, with an experimental FGC. Each type of ceramic block was sectioned into beams (n=15) of standard dimensions of 2×2×15 mm. Before mechanical testing, specimens of the IPS e.max CAD group were further fired for final crystallization. Flexural strength was determined by the 3-point bend test with a universal loading machine at a crosshead speed of 1 mm/min. Hardness was determined with a hardness tester with 5 Vickers hardness indentations (n=5) using a 1.96 N load and a dwell time of 15 seconds. Selected surfaces were examined by scanning electron microscopy and energy-dispersive x-ray spectroscopy. Data were analyzed by the 1-way ANOVA test and Weibull analysis (α=.05). Weibull parameters, including the Weibull modulus (m) as well as the characteristic strength at 63.2% (η) and 10.0% (B10), were obtained. A significant difference in flexural strength (P<.001) was found among groups, with IPS e.max CAD (341.88 ±40.25 MPa) > Vita Enamic (145.95 ±12.65 MPa) > Vitablocs Mark II (106.67 ±18.50 MPa) and FGC (117.61 ±7.62 MPa). The Weibull modulus ranged from 6.93 to 18.34, with FGC showing the highest Weibull modulus among the 4 materials. The Weibull plot revealed that IPS e.max CAD > Vita Enamic > FGC > Vitablocs Mark II for the characteristic strength at both 63.2% (η) and 10.0% (B10). A significant difference in Vickers hardness among groups (P<.001) was found, with IPS e.max CAD (731.63 ±30.64 HV) > Vitablocs Mark II (594.74 ±25.22 HV) > Vita Enamic (372.29 ±51.23 HV) > FGC (153.74 ±23.62 HV).
The flexural strength and Vickers hardness of IPS e.max CAD were significantly higher than those of the other 3 materials tested. The FGC's flexural strength was comparable with that of Vitablocs Mark II. The FGC's Weibull modulus was the highest, while its Vickers hardness was the lowest among the materials tested. Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
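The characteristic strength (η) and B10 values reported above follow directly from the Weibull parameters: the strength at failure fraction F is η(−ln(1 − F))^(1/m). A minimal sketch (the parameter values in the test are illustrative, not the measured ceramic values):

```python
import math

def strength_at_failure_fraction(frac, m, eta):
    """Weibull strength at failure probability frac:
    sigma = eta * (-ln(1 - frac))^(1/m); frac = 1 - 1/e (63.2%) recovers
    eta, and frac = 0.10 gives the B10 strength."""
    return eta * (-math.log(1.0 - frac)) ** (1.0 / m)
```

Note how a higher Weibull modulus m pulls B10 closer to η, which is why the high-modulus FGC is "reliable" in the Weibull sense despite its modest mean strength.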
Did Child Restraint Laws Globally Converge? Examining 40 Years of Policy Diffusion.
Nazif-Muñoz, José Ignacio
2015-01-01
The objective of the current study is to determine what factors have been associated with the global adoption of mandatory child restraint laws (ChRLs) since 1975. To determine what factors explained the global adoption of mandatory ChRLs, Weibull models were analyzed. To carry out this analysis, 170 countries were considered, and the time at risk corresponded to 5,146 observations for the period 1957-2013. The dependent variable was time to first adoption of a ChRL. Independent variables representing global factors were the World Health Organization (WHO) and World Bank's (WB) road safety global campaign; the Geneva Convention on Road Traffic; and the United Nations' (UN) 1958 Vehicle Agreement. Independent variables representing regional factors were the creation of the European Transport Safety Council and being a Commonwealth country. Independent variables representing national factors were population; gross domestic product (GDP) per capita; political violence; existence of road safety nongovernmental organizations (NGOs); and existence of road safety agencies. Urbanization served as a control variable. To examine regional dynamics, Weibull models for Africa, Asia, Europe, North America, Latin America, the Caribbean, and the Commonwealth were also estimated. Empirical estimates from the full Weibull models suggest that 2 global factors and 2 national factors are significantly associated with the adoption of this measure. The global factors explaining adoption are the WHO and WB's road safety global campaign implemented after 2004 (P <.01) and the UN's 1958 Vehicle Agreement (P <.001). National factors were GDP (P <.01) and the existence of road safety agencies (P <.05). The time parameter ρ for the full Weibull model was 1.425 (P <.001), suggesting that the likelihood of ChRL adoption increased over the observed period of time, confirming that the diffusion of this policy was global.
Regional analysis showed that the UN's Convention on Road Traffic was significant in Asia, the creation of the European Transport Safety Council was significant in Europe and North America, and the global campaign was significant in Africa. In Commonwealth and European and North American countries, the existence of road safety agencies was also positively associated with ChRL adoption. Results of the world models suggest that the WHO and WB's global road safety campaign was effective in disseminating ChRLs after 2004. Furthermore, regions such as Asia and Europe and North America were early adopters, since specific regional and national characteristics anticipated the introduction of this policy before 2004. In this particular case, the creation of the European Transport Safety Council was fundamental in promoting ChRLs. Thus, in order to introduce conditions to more rapidly diffuse road safety measures across lagging regions, the maintenance of global efforts and the creation of regional road safety organizations should be encouraged. Lastly, the case of ChRL convergence illustrates how mechanisms of global and regional diffusion need to be analytically differentiated in order to better assess the process of policy diffusion.
Modern methodology of designing target reliability into rotating mechanical components
NASA Technical Reports Server (NTRS)
Kececioglu, D. B.; Chester, L. B.
1973-01-01
Experimentally determined distributional cycles-to-failure versus maximum alternating nominal strength (S-N) diagrams, and distributional mean nominal strength versus maximum alternating nominal strength (Goodman) diagrams, are presented. These distributional S-N and Goodman diagrams are for AISI 4340 steel of Rockwell C 35/40 hardness: round, cylindrical specimens 0.735 in. in diameter and 6 in. long, with a circumferential groove of 0.145 in. radius for a theoretical stress concentration of 1.42 and of 0.034 in. radius for a stress concentration of 2.34. The specimens are subjected to reversed bending and steady torque in three specially built complex-fatigue research machines. Based on these results, the effects on the distributional S-N and Goodman diagrams and on service life of superimposing steady torque on reversed bending are established, as well as the effect of the various stress concentrations. In addition, a computer program for determining the three-parameter Weibull distribution representing the cycles-to-failure data, and two methods for calculating the reliability of components subjected to cumulative fatigue loads, are given.
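The three-parameter Weibull mentioned above adds a location (minimum-life) parameter γ to the usual shape β and scale η; the reliability at N cycles is then R(N) = exp(−((N − γ)/η)^β) for N > γ, and 1 otherwise. A minimal sketch with illustrative parameter values, not the fitted values from the test program:

```python
import math

def reliability_3p_weibull(n_cycles, beta, eta, gamma):
    """Survival probability at n_cycles under a three-parameter Weibull:
    R = exp(-((N - gamma)/eta)^beta) for N > gamma, else 1 (no failures
    can occur before the location parameter gamma)."""
    if n_cycles <= gamma:
        return 1.0
    return math.exp(-(((n_cycles - gamma) / eta) ** beta))
```

The location parameter is what distinguishes this fit from the two-parameter form: it asserts a finite minimum life below which the reliability is exactly 1.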